Supervised machine learning and deep learning require a large amount of labeled data, which data scientists obtain through a manual and time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to label next instead of a sequential or random sample. This method is supposed to save annotation effort while maintaining model performance.
However, practitioners face a wide range of AL strategies for different tasks and need an empirical basis to choose between them. Surveys categorize AL strategies into taxonomies without performance indications, and papers presenting novel AL strategies compare their performance only against a small subset of existing strategies. Our contribution addresses this need for an empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows AL strategies to be implemented with low effort and compared fairly in a data-driven way by defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the annotation budget). ALE helps practitioners make more informed decisions, while researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases; with such best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
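A query loop of the kind such a framework evaluates can be sketched as follows. This is a minimal illustration, not the ALE API; the names (`query_strategy`, `oracle`, `budget`, `step_size`) are placeholders for the tracked experiment parameters mentioned above.

```python
import random
from typing import Callable, List, Tuple

# Minimal sketch of a pool-based active learning loop with tracked
# experiment parameters (initial dataset size, step size, budget).
# The strategy interface and all names are illustrative placeholders.

def active_learning_loop(
    pool: List[str],                                    # unlabeled data points
    oracle: Callable[[str], int],                       # human annotator / gold labels
    train: Callable[[List[Tuple[str, int]]], object],   # trains and returns a model
    query_strategy: Callable[[object, List[str], int], List[str]],
    initial_size: int = 100,
    step_size: int = 50,
    budget: int = 1000,
):
    random.shuffle(pool)
    labeled = [(x, oracle(x)) for x in pool[:initial_size]]
    pool = pool[initial_size:]
    model = train(labeled)

    while len(labeled) < budget and pool:
        # The strategy proposes the most promising points to annotate next.
        queried = query_strategy(model, pool, step_size)
        labeled += [(x, oracle(x)) for x in queried]
        pool = [x for x in pool if x not in set(queried)]
        model = train(labeled)                          # retrain after each query step
    return model, labeled
```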
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which is a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of text messages annotated with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the product attributes extracted from user messages into profiles and uses these, together with the products sold, as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-grained price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter are still superior for sequence labeling.
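The clustering step can be illustrated with a minimal sketch: profiles are represented by averaged pretrained word vectors of their extracted product terms and then grouped with k-means. The embedding source, the toy vectors, and the number of clusters are placeholders, not the paper's actual setup.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative only: `word_vectors` stands in for any pretrained embedding
# lookup (e.g., word2vec/fastText); the real pipeline aggregates extracted
# product attributes per vendor profile before clustering.
word_vectors = {
    "iphone":  np.array([0.9, 0.1, 0.0]),
    "charger": np.array([0.8, 0.2, 0.1]),
    "account": np.array([0.1, 0.8, 0.2]),
    "netflix": np.array([0.0, 0.7, 0.3]),
}

def profile_vector(products):
    """Average the word vectors of the product terms extracted for one profile."""
    vecs = [word_vectors[p] for p in products if p in word_vectors]
    return np.mean(vecs, axis=0)

vendor_profiles = {
    "vendor_a": ["iphone", "charger"],
    "vendor_b": ["account", "netflix"],
}

X = np.stack([profile_vector(p) for p in vendor_profiles.values()])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(dict(zip(vendor_profiles, labels)))
```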
Like all preceding transformations of the manufacturing industry, the large-scale usage of production data will reshape the role of humans within the sociotechnical production ecosystem. To ensure that this transformation creates work systems in which employees are empowered, productive, healthy, and motivated, the transformation must be guided by principles of and research on human-centered work design. Specifically, measures must be taken at all levels of work design, ranging from (1) the work tasks to (2) the working conditions to (3) the organizational level and (4) the supra-organizational level. We present selected research across all four levels that showcases the opportunities and requirements that surface when striving for human-centered work design for the Internet of Production (IoP). (1) On the work task level, we illustrate the user-centered design of human-robot collaboration (HRC) and process planning in the composite industry as well as user-centered design factors for cognitive assistance systems. (2) On the working conditions level, we present a newly developed framework for the classification of HRC workplaces. (3) Moving to the organizational level, we show how corporate data can be used to facilitate best practice sharing in production networks, and we discuss the implications of the IoP for new leadership models. Finally, (4) on the supra-organizational level, we examine overarching ethical dimensions, investigating, e.g., how the new work contexts affect our understanding of responsibility and normative values such as autonomy and privacy. Overall, these interdisciplinary research perspectives highlight the importance and necessary scope of considering the human factor in the IoP.
The efficiency concepts of Bahadur and Pitman are used to compare the Wilcoxon tests in paired and independent survey samples. A comparison based on the length of the corresponding confidence intervals is also carried out. Simple conditions characterizing the dominance of one procedure over another are derived. Statistical tests for checking these conditions are suggested and discussed.
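For orientation, a standard textbook statement of the Pitman notion used here (not a result specific to this paper): for two sequences of tests of the same hypothesis, the asymptotic relative efficiency is the limiting inverse ratio of the sample sizes required to attain the same power against the same sequence of local alternatives,

\[
\operatorname{ARE}_{\text{Pitman}}(T_1, T_2) \;=\; \lim_{n \to \infty} \frac{n_2}{n_1},
\]

where \(n_1\) and \(n_2\) are the sample sizes with which \(T_1\) and \(T_2\) reach that power. Under normal shift alternatives, for instance, the Wilcoxon test has \(\operatorname{ARE} = 3/\pi \approx 0.955\) relative to the t-test.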
The paper deals with the asymptotic behaviour of estimators, statistical tests and confidence intervals for L²-distances to uniformity based on the empirical distribution function, the integrated empirical distribution function and the integrated empirical survival function. Approximations of power functions, confidence intervals for the L²-distances and statistical neighbourhood-of-uniformity validation tests are obtained as main applications. The finite sample behaviour of the procedures is illustrated by a simulation study.
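In standard notation (not taken verbatim from the paper), the L²-distance to uniformity and its empirical counterpart can be written as

\[
\Delta^2(F) \;=\; \int_0^1 \bigl(F(t) - t\bigr)^2 \, dt,
\qquad
\widehat{\Delta}^2_n \;=\; \int_0^1 \bigl(F_n(t) - t\bigr)^2 \, dt,
\]

where \(F_n\) denotes the empirical distribution function of a sample on \([0,1]\); \(n\widehat{\Delta}^2_n\) is then the classical Cramér–von Mises statistic for uniformity, and analogous distances can be based on the integrated empirical distribution and survival functions studied in the paper.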
We discuss the problem of testing homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It turns out that a new resampling approach is appropriate for approximating the unknown null distribution. We prove that the resulting test asymptotically attains the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite-sample case. As an illustration, we apply the test to real data sets.
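As a sketch of the distance in question (standard two-sample Cramér–von Mises form; the exact weighting in the paper may differ), let \(F_m\) and \(G_n\) be the empirical distribution functions of the first and second components that remain after removing missing values. The test statistic is of the type

\[
T_{m,n} \;=\; \frac{mn}{m+n} \int_{\mathbb{R}} \bigl(F_m(x) - G_n(x)\bigr)^2 \, dH_{m+n}(x),
\]

where \(H_{m+n}\) is the empirical distribution function of the pooled remaining observations; marginal homogeneity is rejected for large values of \(T_{m,n}\), with critical values obtained by the resampling scheme described above.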
This paper considers a paired-data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded into a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von Mises-type test statistic is applied, and a bootstrap procedure is suggested to obtain critical values and, finally, a consistent test. The desired properties of the bootstrap test, namely asymptotic exactness under the null hypothesis and consistency under alternatives, are derived. Simulations show the quality of the test in the finite-sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data. The approach is demonstrated using historical data for different stock market indices.
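A minimal sketch of such a paired bootstrap test, using a finite-dimensional stand-in for the Hilbert-space setting (the statistic and the centered resampling scheme are simplified illustrations, not the paper's exact procedure):

```python
import numpy as np

# Finite-dimensional stand-in for paired Hilbert-space-valued data:
# test H0 that the two marginal distributions of (X, Y) coincide.

def ecdf(sample, points):
    """Empirical distribution function of `sample` at `points` (componentwise <=)."""
    return np.mean(np.all(sample[None, :, :] <= points[:, None, :], axis=2), axis=1)

def paired_bootstrap_pvalue(X, Y, B=999, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    pooled = np.vstack([X, Y])
    Fx, Fy = ecdf(X, pooled), ecdf(Y, pooled)
    t_obs = n * np.mean((Fx - Fy) ** 2)            # Cramér–von Mises-type distance
    t_boot = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)                # resample pairs to keep the dependence
        Fxb, Fyb = ecdf(X[idx], pooled), ecdf(Y[idx], pooled)
        # Center at the observed difference so the bootstrap mimics the null distribution.
        t_boot[b] = n * np.mean(((Fxb - Fyb) - (Fx - Fy)) ** 2)
    return (1 + np.sum(t_boot >= t_obs)) / (B + 1)

# Example: strongly dependent pairs whose marginals agree (H0 holds).
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
Y = 0.6 * X + 0.8 * rng.normal(size=(50, 3))
print(paired_bootstrap_pvalue(X, Y))
```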
Industrial facilities must be thoroughly designed to withstand seismic actions, as they exhibit an increased loss potential due to the possibly wide-ranging damage consequences and the valuable process engineering equipment. Past earthquakes showed the social and political consequences of seismic damage to industrial facilities and sensitized the population and politicians worldwide to the possible hazard emanating from industrial facilities. However, a holistic approach to the seismic design of industrial facilities can presently be found neither in national nor in international standards. The introduction of EN 1998-4 of the new generation of Eurocode 8 will improve the normative situation with specific seismic design rules for silos, tanks, and pipelines as well as secondary process components. The article presents essential aspects of the seismic design of industrial facilities based on the new generation of Eurocode 8, using the example of tank structures and secondary process components. The interaction effects of the process components with the primary structure are illustrated by means of the experimental results of a shaking table test of a three-story moment-resisting steel frame with different process components. Finally, an integrated approach of digital plant models based on building information modelling (BIM) and structural health monitoring (SHM) is presented, which provides not only a reliable decision-making basis for operation, maintenance, and repair but also an excellent tool for rapid assessment of seismic damage.
Because of customer churn, strong competition, and operational inefficiencies, the telecommunications operator ME Telco (fictitious name due to confidentiality) launched a strategic transformation program that included a Business Process Management (BPM) project. Major problems were silo-oriented process management and missing cross-functional transparency. Process improvements were not consistently planned and aligned with corporate targets. Measurable inefficiencies were observed on an operational level, e.g., high lead times and reassignment rates of the incident management process.
Benefits and framework conditions of information-driven business models of the Internet of Things (2018)
In the context of increasing digitalization, the Internet of Things (IoT) is regarded as a technological driver through which completely new business models can emerge from the interplay of different actors. Identified key actors include traditional industrial companies, municipalities, and telecommunications companies. The latter, by providing connectivity, ensure that small devices with tiny batteries can be connected to the Internet almost anywhere and directly. Many IoT use cases that simplify life for end customers are already on the market, such as Philips Hue Tap. Beyond business models based on connectivity, there is great potential for information-driven business models, which can support and extend existing business models. One example is Deutsche Telekom AG's IoT use case Park and Joy, in which parking spaces are networked with sensors and drivers are informed in real time about available parking spots. Information-driven business models can build on data generated in IoT use cases. For example, a telecommunications company can create added value by deriving decision-relevant information, so-called insights, from data and using them to increase decision-making agility. In addition, insights can be monetized. The monetization of insights can only be sustainable if it is handled carefully and the relevant framework conditions are taken into account. This chapter explains the concept of information-driven business models and illustrates it with the concrete Park and Joy use case. In addition, benefits, risks, and framework conditions are discussed.
In the course of the digital transformation, innovative technology concepts such as the Internet of Things and cloud computing are regarded as drivers of far-reaching changes in organizations and business models. In this context, Robotic Process Automation (RPA) is a novel approach to process automation in which manual activities are learned and then executed automatically by so-called software robots. The software robots emulate the inputs on the existing presentation layer, so no changes to existing application systems are necessary. The innovative idea is the transformation of existing process execution from manual to digital, which distinguishes RPA from traditional Business Process Management (BPM) approaches, where, for example, process-driven adaptations at the business-logic level are necessary. Various RPA solutions are already offered on the market as software products. Good results from RPA have been documented especially for operational processes with repetitive processing steps across different application systems, such as the automation of 35% of back-office processes at Telefonica. Owing to the comparatively low implementation effort combined with a high automation potential, there is strong interest in RPA in practice (e.g., banking, telecommunications, energy supply). This contribution discusses RPA as an innovative approach to process digitalization and gives concrete recommendations for practice. To this end, a distinction is made between model-driven and self-learning approaches. Based on general architectures of RPA systems, application scenarios as well as their automation potential, but also their limitations, are discussed. A structured market overview of selected RPA products follows. Three concrete application examples illustrate the use of RPA in practice.
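The presentation-layer emulation described above can be illustrated with a minimal, purely hypothetical sketch of a model-driven software robot. The library calls come from pyautogui, a common GUI-automation package; the screen coordinates, field names, and the CSV data source are placeholders.

```python
import csv
import pyautogui  # GUI automation: the robot acts on the presentation layer only

# Hypothetical rule-based robot: copy records from a CSV export into a
# legacy application's form fields without changing the backend system.
FIELD_POSITIONS = {              # screen coordinates of the target form (placeholders)
    "customer_id": (400, 300),
    "order_value": (400, 340),
}
SUBMIT_BUTTON = (400, 420)

def enter_record(record: dict) -> None:
    """Emulate the clicks and keystrokes a human clerk would perform."""
    for field, position in FIELD_POSITIONS.items():
        pyautogui.click(*position)                       # focus the input field
        pyautogui.typewrite(str(record[field]), interval=0.05)
    pyautogui.click(*SUBMIT_BUTTON)                      # submit the form

with open("orders_export.csv", newline="") as f:         # placeholder data source
    for record in csv.DictReader(f):
        enter_record(record)
```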
Due to the high number of customer contacts, fault clearances, installations, and product provisioning per year, the automation level of operational processes has a significant impact on financial results, quality, and customer experience. Therefore, the telecommunications operator Deutsche Telekom (DT) has defined a digital strategy with the objectives of zero complexity and zero complaint, one touch, agility in service, and disruptive thinking. In this context, Robotic Process Automation (RPA) was identified as an enabling technology to formulate and realize DT’s digital strategy through automation of rule-based, routine, and predictable tasks in combination with structured and stable data.
Information technologies, such as big data analytics, cloud computing, cyber-physical systems, robotic process automation, and the internet of things, provide a sustainable impetus for the structural development of business sectors as well as the digitalization of markets, enterprises, and processes. Within the consulting industry, the proliferation of these technologies has opened up the new segment of digital transformation, which focuses on setting up, controlling, and implementing projects for enterprises from a broad range of sectors. These recent developments raise the question of which requirements evolve for IT consultants as important success factors of digital transformation projects. This empirical contribution therefore provides indications regarding the qualifications and competences necessary for IT consultants in the era of digital transformation from a labor market perspective. On the one hand, this knowledge base is interesting for the academic education of consultants, since it supports a market-oriented design of adequate training measures. On the other hand, insights into the competence requirements for consultants are relevant for skill and talent management processes in consulting practice. Assuming that consulting companies pursue a strategic human resource management approach, labor market information may also be useful for discovering strategic behavioral patterns.
The subject of this case is Deutsche Telekom Services Europe (DTSE), a service center for administrative processes. Due to the high volume of repetitive tasks (e.g., 100k manual uploads of offer documents into SAP per year), automation was identified as an important strategic target with high management attention and commitment. DTSE has to work with various backend application systems without any possibility of changing those systems. Furthermore, the complexity of the administrative processes differed. When it comes to transferring unstructured data (e.g., offer documents) into structured data (e.g., MS Excel files), further cognitive technologies were needed.
This book reflects the tremendous changes in the telecommunications industry in the course of the past few decades – shorter innovation cycles, stiffer competition and new communication products. It analyzes the transformation of processes, applications and network technologies that are now expected to take place under enormous time pressure. The International Telecommunication Union (ITU) and the TM Forum have provided reference solutions that are broadly recognized and used throughout the value chain of the telecommunications industry, and which can be considered the de facto standard. The book describes how these reference solutions can be used in a practical context: it presents the latest insights into their development, highlights lessons learned from numerous international projects and combines them with well-founded research results in enterprise architecture management and reference modeling. The complete architectural transformation is explained, from the planning and set-up stage to the implementation. Featuring a wealth of examples and illustrations, the book offers a valuable resource for telecommunication professionals, enterprise architects and project managers alike.
Market changes have forced telecommunication companies to transform their business. Increased competition, short innovation cycles, changed usage patterns, increased customer expectations, and cost reduction are the main drivers. Our objective is to analyze to what extent transformation projects have improved the orientation towards end customers. We therefore selected 38 real-life case studies that deal with customer orientation. Our analysis is based on a telecommunication-specific framework that aligns strategy, business processes, and information systems. The result of our analysis shows the following: transformation projects that aim to improve customer orientation are combined with clear goals on costs and revenue of the enterprise. These projects are usually directly linked to the customer touch points, but also to the development and provisioning of products. Furthermore, the analysis shows that customer orientation is not the sole trigger for transformation. There is no one-size-fits-all solution; rather, improved customer orientation needs aligned changes of business processes as well as of information systems related to different parts of the company.
The telecommunications industry is currently going through a major transformation. In this context, the enhanced Telecom Operations Map (eTOM) is a domain-specific process reference model offered by the industry organization TM Forum. In practice, eTOM is well accepted and confirmed as a de facto standard. It provides process definitions and process flows on different levels of detail. This article discusses the reference modeling of eTOM, i.e., the design, the resulting artifact, and its evaluation based on three project cases. The application of eTOM in three projects illustrates the design approach and concrete models on strategic and operational levels. The article follows the Design Science Research (DSR) paradigm. It contributes concrete design artifacts to the transformational needs of the telecommunications industry and offers lessons learned from a general DSR perspective.
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for the planning of the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization pre-requisites stated by the Process Virtualization Theory in the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand on changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
The telecommunications market is undergoing substantial changes. New business models, innovative services, and technologies require reengineering, transformation, and process standardization. With the enhanced Telecom Operations Map (eTOM), the TM Forum offers an internationally recognized de facto reference process framework based on the specific requirements and characteristics of the telecommunications industry. However, this reference framework contains only a hierarchical collection of processes on different levels of abstraction. A control view, understood as a sequential ordering of activities and thus a real process flow, is missing, as is an end-to-end view of the customer. In this article, we extend the eTOM reference model with reference process flows, in which we abstract and generalize knowledge about processes in telecommunications companies. The reference process flows support companies in the structured and transparent (re-)design of their processes. We demonstrate the applicability and usefulness of our reference process flows in two case studies and evaluate them against criteria for the assessment of reference models. The reference process flows were adopted by the TM Forum into the standard and published as part of eTOM version 9. In addition, we discuss the components of our approach that can also be applied outside the telecommunications industry.