Due to the increasing complexity of software projects, software development depends more and more on teams. The quality of this teamwork can vary with team composition, since every team combines different skills and personality types. This paper addresses the questions of how a software development team can be described and what influence the personality of its members has on team dynamics. For this purpose, a systematic literature review (n=48) and a literature search with the AI research assistant Elicit (n=20) were conducted. The result: a person's personality significantly shapes their thinking and actions, which in turn influences their behavior in software development teams. Team performance and satisfaction can be strongly influenced by personality, and both the quality of communication and the likelihood of conflict can also be attributed to it.
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
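As a rough illustration of the melting physics mentioned above, the idealized descent speed of a probe follows from a simple energy balance: all heating power warms the ice ahead of the probe to the melting point and then melts it. This is only a sketch under assumed values (a hypothetical 5 kW probe with 10 cm radius in −170 °C ice); real probes lose a substantial fraction of their power by conduction into the surrounding ice, which is one of the challenges the chapter discusses.

```python
import math

def melt_velocity(power_w, radius_m, ice_temp_c, efficiency=1.0):
    """Idealized melting velocity of a thermal probe (no conduction losses).

    Energy balance: all head power warms the ice column ahead of the probe
    from ice_temp_c to 0 degC and then melts it.
    """
    rho_ice = 917.0      # kg/m^3, density of ice
    c_ice = 2100.0       # J/(kg K), specific heat near 0 degC
    latent_heat = 334e3  # J/kg, heat of fusion
    area = math.pi * radius_m ** 2
    # energy needed to warm and melt one cubic metre of ice
    q = rho_ice * (c_ice * (0.0 - ice_temp_c) + latent_heat)  # J/m^3
    return efficiency * power_w / (area * q)  # m/s

# Hypothetical example: 5 kW probe, 10 cm radius, cryogenic ice (icy moon)
v = melt_velocity(5000.0, 0.10, -170.0)
print(f"{v * 3600:.2f} m/h")  # roughly 0.9 m/h under these assumptions
```

The `efficiency` parameter is a crude stand-in for the conduction losses that a real melting model would resolve.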
Modern driver assistance systems are evolving from pure assistance into independently acting automation systems. These systems still do not cover the full vehicle usage range, also called the operational design domain, and therefore require the human driver as a fall-back mechanism. Transition of control and potential minimum risk manoeuvres are current research topics and will bridge the gap until fully autonomous vehicles are available. The authors showed in a demonstration that transition-of-control mechanisms can be further improved by using communication technology. Receiving the incident type and position via standardised vehicle-to-everything (V2X) messages can improve driver safety and comfort. The connected and automated vehicle's software framework can use this information to plan areas where the driver should take back control by initiating a transition of control, which can be followed by a minimum risk manoeuvre in case of an unresponsive driver. This transition of control was implemented in a test vehicle and presented to the public during IEEE IV2022 (IEEE Intelligent Vehicles Symposium) in Aachen, Germany.
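The planning step described above can be sketched as a toy one-dimensional computation: given an incident position received via V2X, place the take-over request early enough that the driver has time to respond, and keep road margin for a minimum risk manoeuvre if they do not. All names, margins, and the 1-D route model are hypothetical illustrations, not the vehicle framework's actual API.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    position_m: float  # distance along the ego route (hypothetical 1-D model)

def plan_transition(incident, ego_pos_m, speed_mps,
                    driver_takeover_s=10.0, mrm_margin_m=150.0):
    """Toy 1-D planner: where to issue the take-over request (ToR) so the
    driver has driver_takeover_s to respond, keeping mrm_margin_m of road
    for a minimum risk manoeuvre if they stay unresponsive."""
    tor_point = incident.position_m - mrm_margin_m - driver_takeover_s * speed_mps
    if tor_point <= ego_pos_m:
        # incident too close: skip the ToC and go straight to the MRM
        return ("minimum_risk_manoeuvre", ego_pos_m)
    return ("transition_of_control", tor_point)

action, at = plan_transition(Incident(position_m=2000.0), ego_pos_m=0.0, speed_mps=25.0)
print(action, at)  # ToR issued 250 m (driver time) + 150 m (MRM margin) before the incident
```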
In recent years, the development of large pretrained language models such as BERT and GPT has significantly improved information extraction systems on various tasks, including relation classification. State-of-the-art systems are highly accurate on scientific benchmarks, but a lack of explainability currently complicates many real-world applications. Comprehensible systems are necessary to prevent biased, counterintuitive, or harmful decisions.
We introduce semantic extents, a concept for analyzing decision patterns in the relation classification task. Semantic extents are the most influential parts of a text with respect to a classification decision. Our definition allows similar procedures to determine semantic extents for humans and for models, and we provide an annotation tool and a software framework to do so conveniently and reproducibly. Comparing the two reveals that models tend to learn shortcut patterns from data. These patterns are hard to detect with current interpretability methods, such as input reductions. Our approach can help detect and eliminate spurious decision patterns during model development, increasing the reliability and security of natural language processing systems — an essential step toward applications in critical areas like healthcare or finance. Moreover, our work opens new research directions for developing methods to explain deep learning models.
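A greedy occlusion loop illustrates the general idea of locating the most influential input parts. This is a simplified stand-in, not the paper's actual procedure for determining semantic extents; the toy model and the `[MASK]` convention are assumptions for illustration.

```python
def influential_tokens(tokens, predict_proba, label):
    """Occlusion sketch: rank tokens by how much masking each one lowers
    the probability of the predicted relation label; keep those with a
    positive drop."""
    base = predict_proba(tokens)[label]
    drops = []
    for i in range(len(tokens)):
        masked = tokens[:i] + ["[MASK]"] + tokens[i + 1:]
        drops.append((base - predict_proba(masked)[label], tokens[i]))
    return [tok for drop, tok in sorted(drops, reverse=True) if drop > 0]

# Dummy "model": the relation 'founded' fires on the trigger word "founded",
# a shortcut pattern of exactly the kind the comparison to humans can expose
def toy_model(tokens):
    return {"founded": 0.9 if "founded" in tokens else 0.1}

print(influential_tokens("Smith founded Acme in 1999".split(), toy_model, "founded"))
# → ['founded']
```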
Supervised machine learning and deep learning require large amounts of labeled data, which data scientists obtain in a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to annotate next, instead of a sequential or random sample. This method is meant to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis for choosing between them. Surveys categorize AL strategies into taxonomies without performance indications, and presentations of novel AL strategies compare their performance only to a small subset of existing strategies. Our contribution addresses this empirical basis by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows the implementation of AL strategies with low effort and a fair data-driven comparison through defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the budget). ALE helps practitioners to make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases. With best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
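A minimal AL loop with the tracked experiment parameters (seed set size, number of points per query step, budget) might look as follows. This is a sketch on a toy threshold-learning task, not ALE's actual API; all function names are illustrative.

```python
import random

def active_learning_loop(pool, oracle, train, acquire,
                         seed_size=10, query_size=5, budget=50):
    """Minimal AL loop parameterized by the quantities ALE tracks:
    initial dataset size, data points per query step, and budget.
    `acquire` scores unlabeled points, e.g. by model uncertainty."""
    random.seed(0)
    labeled = {x: oracle(x) for x in random.sample(pool, seed_size)}
    unlabeled = [x for x in pool if x not in labeled]
    spent = seed_size
    while spent < budget and unlabeled:
        model = train(labeled)
        # query the most informative points according to the strategy
        batch = sorted(unlabeled, key=lambda x: acquire(model, x), reverse=True)[:query_size]
        for x in batch:
            labeled[x] = oracle(x)
            unlabeled.remove(x)
        spent += len(batch)
    return train(labeled), labeled

# Toy task: learn a class threshold; "uncertainty" = closeness to the boundary
pool = list(range(100))
oracle = lambda x: int(x >= 37)                       # ground-truth labels
train = lambda data: sum(x for x, y in data.items() if y == 1) / max(sum(data.values()), 1)
acquire = lambda model, x: -abs(x - model)            # prefer points near the model
model, labeled = active_learning_loop(pool, oracle, train, acquire)
print(len(labeled))  # exactly the budget of 50 labels was spent
```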
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which is a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of text messages annotated with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the extracted product attributes from user messages into vendor profiles and uses these, together with the products sold, as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-granular price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter remain superior for sequence labeling.
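The averaging of pretrained word vectors into vendor profiles — the input to the unsupervised clustering step — can be sketched as follows. The random embedding table is a stand-in for real pretrained vectors, and the vendor messages are invented examples.

```python
import numpy as np

def profile_vector(messages, embeddings, dim=50):
    """Average the word vectors over all of a vendor's messages to get one
    fixed-size profile vector (the representation later fed to clustering)."""
    vecs = [embeddings[w] for m in messages for w in m.lower().split() if w in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Toy embedding table standing in for pretrained word vectors
rng = np.random.default_rng(0)
vocab = ["cannabis", "gram", "price", "account", "netflix", "streaming"]
embeddings = {w: rng.normal(size=50) for w in vocab}

drugs_vendor = profile_vector(["cannabis 5 gram price"], embeddings)
account_vendor = profile_vector(["netflix account streaming"], embeddings)
# cosine similarity: vendors of different product types should lie apart,
# so a clustering algorithm (e.g. k-means) can separate them
cos = drugs_vendor @ account_vendor / (np.linalg.norm(drugs_vendor) * np.linalg.norm(account_vendor))
print(round(float(cos), 2))
```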
Like all preceding transformations of the manufacturing industry, the large-scale usage of production data will reshape the role of humans within the sociotechnical production ecosystem. To ensure that this transformation creates work systems in which employees are empowered, productive, healthy, and motivated, the transformation must be guided by principles of and research on human-centered work design. Specifically, measures must be taken at all levels of work design, ranging from (1) the work tasks to (2) the working conditions to (3) the organizational level and (4) the supra-organizational level. We present selected research across all four levels that showcase the opportunities and requirements that surface when striving for human-centered work design for the Internet of Production (IoP). (1) On the work task level, we illustrate the user-centered design of human-robot collaboration (HRC) and process planning in the composite industry as well as user-centered design factors for cognitive assistance systems. (2) On the working conditions level, we present a newly developed framework for the classification of HRC workplaces. (3) Moving to the organizational level, we show how corporate data can be used to facilitate best practice sharing in production networks, and we discuss the implications of the IoP for new leadership models. Finally, (4) on the supra-organizational level, we examine overarching ethical dimensions, investigating, e.g., how the new work contexts affect our understanding of responsibility and normative values such as autonomy and privacy. Overall, these interdisciplinary research perspectives highlight the importance and necessary scope of considering the human factor in the IoP.
The paper deals with the asymptotic behaviour of estimators, statistical tests and confidence intervals for L²-distances to uniformity based on the empirical distribution function, the integrated empirical distribution function and the integrated empirical survival function. Approximations of power functions, confidence intervals for the L²-distances and statistical neighbourhood-of-uniformity validation tests are obtained as main applications. The finite sample behaviour of the procedures is illustrated by a simulation study.
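In standard notation (a sketch; the paper's exact definitions may differ in detail), the L²-distance of a distribution function F on [0,1] to uniformity and its empirical counterpart are

```latex
\Delta(F) = \int_0^1 \bigl(F(t) - t\bigr)^2 \, dt,
\qquad
\widehat{\Delta}_n = \int_0^1 \bigl(\mathbb{F}_n(t) - t\bigr)^2 \, dt,
```

where \(\mathbb{F}_n\) is the empirical distribution function of the sample; replacing \(\mathbb{F}_n\) by the integrated empirical distribution or survival function yields the other two variants studied. When the data are in fact uniform, \(n\,\widehat{\Delta}_n\) coincides with the classical Cramér–von Mises statistic.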
We discuss the testing problem of homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It turns out that a new resampling approach is appropriate for approximating the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite sample case. As an illustration, we apply the test to real data sets.
This paper considers a paired-data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded into a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von Mises type test statistic is applied, and a bootstrap procedure is suggested to obtain critical values and, finally, a consistent test. The desired properties of a bootstrap test are derived: asymptotic exactness under the null hypothesis and consistency under alternatives. Simulations show the quality of the test in the finite sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data; the approach is demonstrated using historical data for different stock market indices.
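A generic within-pair swap resampling test built on a two-sample Cramér–von Mises distance illustrates the overall shape of such procedures. This is a simple stand-in: the two papers' actual schemes (handling missing components, and bootstrapping in Hilbert space) are considerably more refined.

```python
import random

def cvm_distance(x, y):
    """Cramér–von Mises type distance between the two marginal ECDFs."""
    pooled = sorted(x + y)
    ecdf = lambda sample, t: sum(v <= t for v in sample) / len(sample)
    return sum((ecdf(x, t) - ecdf(y, t)) ** 2 for t in pooled) / len(pooled)

def swap_resampling_test(pairs, n_resamples=300, seed=1):
    """Sketch of a resampling test for marginal homogeneity of a paired
    sample: under H0 the two components of each pair are swapped at random,
    which preserves the dependence structure while equalizing the margins."""
    random.seed(seed)
    x, y = (list(t) for t in zip(*pairs))
    observed = cvm_distance(x, y)
    hits = 0
    for _ in range(n_resamples):
        swapped = [(b, a) if random.random() < 0.5 else (a, b) for a, b in pairs]
        xs, ys = (list(t) for t in zip(*swapped))
        hits += cvm_distance(xs, ys) >= observed
    return hits / n_resamples  # approximate p-value

random.seed(0)
same = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(40)]
shifted = [(a, b + 2.0) for a, b in same]          # second margin shifted
p_same, p_shifted = swap_resampling_test(same), swap_resampling_test(shifted)
print(p_same, p_shifted)  # shifted margins should yield a much smaller p-value
```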
Industrial facilities must be thoroughly designed to withstand seismic actions, as they exhibit an increased loss potential due to possibly wide-ranging damage consequences and valuable process engineering equipment. Past earthquakes showed the social and political consequences of seismic damage to industrial facilities and sensitized the population and politicians worldwide to the hazard emanating from such facilities. However, a holistic approach for the seismic design of industrial facilities can presently be found neither in national nor in international standards. The introduction of EN 1998-4 of the new generation of Eurocode 8 will improve the normative situation with specific seismic design rules for silos, tanks, pipelines, and secondary process components. The article presents essential aspects of the seismic design of industrial facilities based on the new generation of Eurocode 8, using the example of tank structures and secondary process components. The interaction effects of the process components with the primary structure are illustrated by means of the experimental results of a shaking-table test of a three-story moment-resisting steel frame with different process components. Finally, an integrated approach of digital plant models based on building information modelling (BIM) and structural health monitoring (SHM) is presented, which provides not only a reliable decision-making basis for operation, maintenance, and repair but also an excellent tool for rapid assessment of seismic damage.
Because of customer churn, strong competition, and operational inefficiencies, the telecommunications operator ME Telco (fictitious name due to confidentiality) launched a strategic transformation program that included a Business Process Management (BPM) project. Major problems were silo-oriented process management and missing cross-functional transparency. Process improvements were not consistently planned and aligned with corporate targets. Measurable inefficiencies were observed on an operational level, e.g., high lead times and reassignment rates of the incident management process.
Due to the high number of customer contacts, fault clearances, installations, and product provisioning per year, the automation level of operational processes has a significant impact on financial results, quality, and customer experience. Therefore, the telecommunications operator Deutsche Telekom (DT) has defined a digital strategy with the objectives of zero complexity and zero complaint, one touch, agility in service, and disruptive thinking. In this context, Robotic Process Automation (RPA) was identified as an enabling technology to formulate and realize DT’s digital strategy through automation of rule-based, routine, and predictable tasks in combination with structured and stable data.
Information technologies such as big data analytics, cloud computing, cyber-physical systems, robotic process automation, and the Internet of Things provide a sustainable impetus for the structural development of business sectors as well as the digitalization of markets, enterprises, and processes. Within the consulting industry, the proliferation of these technologies has opened up the new segment of digital transformation, which focuses on setting up, controlling, and implementing projects for enterprises from a broad range of sectors. These recent developments raise the question of which requirements evolve for IT consultants as important success factors of such digital transformation projects. This empirical contribution therefore provides indications regarding the qualifications and competences necessary for IT consultants in the era of digital transformation from a labor-market perspective. On the one hand, this knowledge base is interesting for the academic education of consultants, since it supports a market-oriented design of adequate training measures. On the other hand, insights into the competence requirements for consultants are relevant for skill and talent management processes in consulting practice. Assuming that consulting companies pursue a strategic human resource management approach, labor market information may also be useful for discovering strategic behavioral patterns.
The subject of this case is Deutsche Telekom Services Europe (DTSE), a service center for administrative processes. Due to the high volume of repetitive tasks (e.g., 100k manual uploads of offer documents into SAP per year), automation was identified as an important strategic target with high management attention and commitment. DTSE has to work with various backend application systems without any possibility of changing those systems. Furthermore, the complexity of the administrative processes differed. For the transfer of unstructured data (e.g., offer documents) into structured data (e.g., MS Excel files), additional cognitive technologies were needed.
This book reflects the tremendous changes in the telecommunications industry in the course of the past few decades – shorter innovation cycles, stiffer competition and new communication products. It analyzes the transformation of processes, applications and network technologies that are now expected to take place under enormous time pressure. The International Telecommunication Union (ITU) and the TM Forum have provided reference solutions that are broadly recognized and used throughout the value chain of the telecommunications industry, and which can be considered the de facto standard. The book describes how these reference solutions can be used in a practical context: it presents the latest insights into their development, highlights lessons learned from numerous international projects and combines them with well-founded research results in enterprise architecture management and reference modeling. The complete architectural transformation is explained, from the planning and set-up stage to the implementation. Featuring a wealth of examples and illustrations, the book offers a valuable resource for telecommunication professionals, enterprise architects and project managers alike.
Market changes have forced telecommunication companies to transform their business. Increased competition, short innovation cycles, changed usage patterns, increased customer expectations, and cost reduction are the main drivers. Our objective is to analyze to what extent transformation projects have improved orientation towards end-customers. We therefore selected 38 real-life case studies dealing with customer orientation. Our analysis is based on a telecommunication-specific framework that aligns strategy, business processes, and information systems. It shows the following: transformation projects that aim to improve customer orientation are combined with clear goals on the costs and revenue of the enterprise. These projects are usually directly linked to the customer touch points, but also to the development and provisioning of products. Furthermore, the analysis shows that customer orientation is not the sole trigger for transformation. There is no one-size-fits-all solution; rather, improved customer orientation requires aligned changes to business processes as well as to information systems across different parts of the company.
The telecommunications industry is currently going through a major transformation. In this context, the enhanced Telecom Operations Map (eTOM) is a domain-specific process reference model offered by the industry organization TM Forum. In practice, eTOM is well accepted and confirmed as the de facto standard. It provides process definitions and process flows on different levels of detail. This article discusses the reference modeling of eTOM, i.e., the design, the resulting artifact, and its evaluation, based on three project cases. The application of eTOM in three projects illustrates the design approach and concrete models on strategic and operational levels. The article follows the Design Science Research (DSR) paradigm. It contributes concrete design artifacts to the transformational needs of the telecommunications industry and offers lessons learned from a general DSR perspective.
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for the planning of the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization pre-requisites stated by the Process Virtualization Theory in the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand on changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
This study presents a recently proposed NMR standardization approach based on the 2H integral of the deuterated solvent for quantitative multicomponent analysis of complex mixtures. As a proof of principle, the existing NMR routine for the analysis of Aloe vera products was modified. Instead of using absolute integrals of targeted compounds and an internal standard (nicotinamide) from 1H-NMR spectra, quantification was performed based on the ratio of a particular 1H-NMR compound integral to the 2H-NMR signal of the deuterated solvent D2O. Validation characteristics (linearity, repeatability, accuracy) were evaluated, and the results showed that the method has the same precision as internal standardization in the case of multicomponent screening. Moreover, a dehydration process by freeze drying is no longer necessary for the new routine. Our NMR profiling of A. vera products now needs only limited sample preparation and data processing. The new standardization methodology provides an appealing alternative for multicomponent NMR screening. In general, this novel approach, using standardization by 2H integral, benefits from reduced sample preparation steps and uncertainties, and is recommended for various application areas (purity determination, forensics, pharmaceutical analysis, etc.).
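The underlying arithmetic is generic quantitative NMR: the analyte concentration is proportional to the ratio of its normalized 1H integral to the solvent 2H integral, with an empirically determined standardization factor absorbing the different receptivities of the two nuclei and the solvent's deuterium content. The numbers below are purely illustrative, not values from the study.

```python
def concentration(i_analyte_1h, n_protons, i_solvent_2h, k_cal):
    """Generic qNMR sketch:  c = k_cal * (I_1H / n_H) / I_2H.

    i_analyte_1h : integral of the analyte's 1H signal
    n_protons    : number of protons contributing to that signal
    i_solvent_2h : 2H integral of the deuterated solvent (D2O)
    k_cal        : empirical standardization factor, determined once from a
                   calibration sample of known concentration (hypothetical here)
    """
    return k_cal * (i_analyte_1h / n_protons) / i_solvent_2h

# Hypothetical integrals: a one-proton analyte signal vs. the D2O 2H signal
print(concentration(2.4e5, 1, 8.0e7, k_cal=1.0e4))  # → 30.0 (arbitrary units)
```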
We study the possibility of fabricating an arbitrary phase mask in a one-step laser-writing process inside the volume of an optical glass substrate. We derive the phase mask from a Gerchberg–Saxton-type algorithm as an array and create each individual phase shift using a refractive index modification of variable axial length. We realize the variable axial length by superimposing refractive index modifications induced by an ultra-short pulsed laser at different focusing depths. Each single modification is created by applying 1000 pulses with 15 μJ pulse energy at 100 kHz to a fixed spot of 25 μm diameter; the focus is then shifted axially in steps of 10 μm. With several proof-of-principle examples, we show the feasibility of our method. In particular, we determine the induced refractive index change to be about Δn = 1.5⋅10⁻³. We also quantify our current limitations by calculating the overlap in the form of a scalar product, and we discuss possible future improvements.
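The two computational ingredients — Gerchberg–Saxton phase retrieval and the conversion of a phase shift into an axial modification length via Δφ = 2πΔnL/λ — can be sketched as follows. The wavelength, target pattern, and grid size are assumptions for illustration; only Δn ≈ 1.5·10⁻³ is taken from the text.

```python
import numpy as np

def gerchberg_saxton(target_intensity, n_iter=50):
    """Gerchberg–Saxton retrieval of a phase mask whose far field (FFT)
    approximates target_intensity, assuming uniform input illumination."""
    target_amp = np.sqrt(target_intensity)
    rng = np.random.default_rng(0)
    field = np.exp(1j * 2 * np.pi * rng.random(target_amp.shape))
    for _ in range(n_iter):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))      # impose target amplitude
        field = np.exp(1j * np.angle(np.fft.ifft2(far)))   # keep unit input amplitude
    return np.angle(field)                                 # the phase mask array

def axial_length_um(phase_rad, wavelength_um=1.0, delta_n=1.5e-3):
    """Axial length of index modification for a given phase shift:
    L = phi * lambda / (2 pi * delta_n).  Wavelength is an assumed value."""
    return phase_rad * wavelength_um / (2 * np.pi * delta_n)

target = np.zeros((64, 64))
target[20, 20] = 1.0                 # toy target: a single far-field spot
mask = gerchberg_saxton(target)
print(axial_length_um(np.pi))        # a half-wave step needs ~333 um at 1 um wavelength
```

With Δn of order 10⁻³, full 2π steps require axial lengths of several hundred micrometres, which is consistent with building them up from superimposed modifications spaced 10 μm apart.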
Industrial production systems are facing radical change in multiple dimensions. This change is caused by technological developments and the digital transformation of production, as well as the call for political and social change to facilitate a transformation toward sustainability. These changes affect both the capabilities of production systems and companies and the design of higher education and educational programs. Given the high uncertainty in the likelihood of occurrence and the technical, economic, and societal impacts of these concepts, we conducted a technology foresight study, in the form of a real-time Delphi analysis, to derive reliable future scenarios featuring the next generation of manufacturing systems. This chapter presents the capabilities dimension and describes each projection in detail, offering current case study examples and discussing related research, as well as implications for policy makers and firms. Specifically, we discuss the benefits of capturing expert knowledge and making it accessible to newcomers, especially in highly specialized industries. The experts argue that, in order to cope with the challenges and circumstances of today's world, students must learn to work with AI and other technologies already during their university education. This means that study programs must change and that universities must adapt their structures to meet the needs of their students.
Next Generation Manufacturing promises significant improvements in performance, productivity, and value creation. In addition to the desired and projected improvements regarding the planning, production, and usage cycles of products, this digital transformation will have a huge impact on work, workers, and workplace design. Given the high uncertainty in the likelihood of occurrence and the technical, economic, and societal impacts of these changes, we conducted a technology foresight study, in the form of a real-time Delphi analysis, to derive reliable future scenarios featuring the next generation of manufacturing systems. This chapter presents the organization dimension and describes each projection in detail, offering current case study examples and discussing related research, as well as implications for policy makers and firms. Specifically, we highlight seven areas in which the digital transformation of production will change how we work, how we organize the work within a company, how we evaluate these changes, and how employment and labor rights will be affected across company boundaries. The experts are unsure whether the use of collaborative robots in factories will replace traditional robots by 2030. They believe that the use of hybrid intelligence will supplement human decision-making processes in production environments. Furthermore, they predict that artificial intelligence will lead to changes in management processes, leadership, and the elimination of hierarchies. However, to ensure that social and normative aspects are incorporated into the AI algorithms, restricting measurement of individual performance will be necessary. Additionally, AI-based decision support can significantly contribute toward new, socially accepted modes of leadership. Finally, the experts believe that there will be a reduction in the workforce by the year 2030.
There is a broad international discussion about rethinking engineering education in order to educate engineers to cope with future challenges, and particularly the sustainable development goals. In this context, there is a consensus about the need to shift from a mostly technical paradigm to a more holistic problem-based approach, which can address the social embeddedness of technology in society. Among the strategies suggested to address this social embeddedness, design thinking has been proposed as an essential complement to engineering precisely for this purpose. This chapter describes the requirements for integrating the design thinking approach in engineering education. We exemplify the requirements and challenges by presenting our approach based on our course experiences at RWTH Aachen University. The chapter first describes the development of our approach of integrating design thinking in engineering curricula, how we combine it with the Sustainable Development Goals (SDGs), and the role of sustainability and social responsibility in engineering. Second, we present the course “Expanding Engineering Limits: Culture, Diversity, and Gender” at RWTH Aachen University. We describe the necessity of theoretically embedding the method in its social and cultural context, giving students the opportunity to reflect on cultural, national, or individual “engineering limits” and to overcome them using design thinking as a next step for collaborative project work. The chapter suggests that the successful implementation of design thinking as a method in engineering education needs to be framed and contextualized within Science and Technology Studies (STS).
This study investigated the anaerobic digestion of an algal–bacterial biofilm grown in artificial wastewater in an Algal Turf Scrubber (ATS). The ATS system was located in a greenhouse (50°54′19ʺN, 6°24′55ʺE, Germany) and was exposed to seasonal conditions during the experimental period. The methane (CH4) potential of untreated algal–bacterial biofilm (UAB) and thermally pretreated biofilm (PAB) using different microbial inocula was determined by anaerobic batch fermentation. Methane productivity of UAB differed significantly between microbial inocula of digested wastepaper, a mixture of manure and maize silage, anaerobic sewage sludge, and percolated green waste. UAB using sewage sludge as inoculum showed the highest methane productivity. The share of methane in biogas was dependent on the inoculum. Using PAB, a strong positive impact on methane productivity was identified for the digested wastepaper (116.4%) and the mixture of manure and maize silage (107.4%) inocula. By contrast, the methane yield was significantly reduced for the digested anaerobic sewage sludge (50.6%) and percolated green waste (43.5%) inocula. To further evaluate the potential of algal–bacterial biofilm for biogas production in wastewater treatment and biogas plants in a circular bioeconomy, scale-up calculations were conducted. It was found that a 0.116 km2 ATS would be required for an average municipal wastewater treatment plant, which can be viewed as problematic in terms of space consumption. However, a substantial energy surplus (4.7–12.5 MWh a−1) can be gained through the addition of algal–bacterial biomass to the anaerobic digester of a municipal wastewater treatment plant. Wastewater treatment with subsequent energy production through algae thus proves advantageous compared with conventional technologies.
Introduction
With regard to surgical training, the reproducible simulation of life-like proximal humerus fractures in human cadaveric specimens is desirable. The aim of the present study was to develop a technique that allows the simulation of realistic proximal humerus fractures and to analyse the influence of rotator cuff preload on the generated lesions with regard to fracture configuration.
Materials and methods
Ten cadaveric specimens (6 left, 4 right) were fractured using a custom-made drop-test bench, in two groups. Five specimens were fractured without rotator cuff preload, while the other five were fractured with the tendons of the rotator cuff preloaded with 2 kg each. The humeral shaft and the shortened scapula were potted. The humerus was positioned at 90° of abduction and 10° of internal rotation to simulate a fall on the elevated arm. In two specimens of each group, the emergence of the fractures was documented with high-speed video imaging. Pre-fracture radiographs were taken to evaluate the deltoid-tuberosity index as a measure of bone density. Post-fracture X-rays and CT scans were performed to define the exact fracture configurations. Neer’s classification was used to analyse the fractures.
Results
In all ten cadaveric specimens, life-like proximal humerus fractures were achieved. Two III-part and three IV-part fractures resulted in each group. Preloading of the rotator cuff muscles had no further influence on the fracture configuration. High-speed videos of the fracture simulation revealed identical fracture mechanisms in both groups. We observed a two-step fracture mechanism: initial impaction of the head segment against the glenoid caused fracturing of the head and the tuberosities, and further impaction of the shaft against the acromion then led to separation of the tuberosities.
Conclusion
A high-energy axial impulse can reliably induce realistic proximal humerus fractures in cadaveric specimens. The preload of the rotator cuff muscles had no influence on the initial fracture configuration; fracture simulation in the proximal humerus can therefore be performed without the additional effort of preloading. Using the presented technique, pre-fractured specimens are available for real-life surgical education.
Plant viruses are major contributors to crop losses and induce high economic costs worldwide. For reliable, on-site and early detection of plant viral diseases, portable biosensors are of great interest. In this study, a field-effect SiO2-gate electrolyte-insulator-semiconductor (EIS) sensor was utilized for the label-free electrostatic detection of tobacco mosaic virus (TMV) particles as a model plant pathogen. The capacitive EIS sensor has been characterized regarding its TMV sensitivity by means of the constant-capacitance method. The EIS sensor was able to detect biotinylated TMV particles from a solution with a TMV concentration as low as 0.025 nM. A good correlation between the registered EIS sensor signal and the density of adsorbed TMV particles assessed from scanning electron microscopy images of the SiO2-gate chip surface was observed. Additionally, the isoelectric point of the biotinylated TMV particles was determined via zeta potential measurements, and the influence of the ionic strength of the measurement solution on the TMV-modified EIS sensor signal has been studied.
Vitamin D plays an essential role in calcium and inorganic phosphate (Pi) homeostasis, maintaining their optimal levels to assure adequate bone mineralization. Vitamin D, as calcitriol (1,25(OH)2D), not only increases intestinal calcium and phosphate absorption but also facilitates their renal reabsorption, leading to elevated serum calcium and phosphate levels. The interaction of 1,25(OH)2D with its receptor (VDR) increases the efficiency of intestinal absorption of calcium to 30–40% and phosphate to nearly 80%. Serum phosphate levels can also influence 1,25(OH)2D and fibroblast growth factor 23 (FGF23) levels, i.e., higher phosphate concentrations suppress vitamin D activation and stimulate parathyroid hormone (PTH) release, while a high FGF23 serum level leads to reduced vitamin D synthesis. In the vitamin D-deficient state, the intestinal calcium absorption decreases and the secretion of PTH increases, which in turn causes the stimulation of 1,25(OH)2D production, resulting in excessive urinary phosphate loss. Maintenance of phosphate homeostasis is essential as hyperphosphatemia is a risk factor of cardiovascular calcification, chronic kidney diseases (CKD), and premature aging, while hypophosphatemia is usually associated with rickets and osteomalacia. This chapter elaborates on the possible interactions between vitamin D and phosphate in health and disease.
This study reviews the practice of brake tests in freight railways, which is time-consuming and not suitable to detect certain failure types. Public incident reports are analysed to derive a reasonable brake test hardware and communication architecture, which aims to provide automatic brake tests at lower cost than current solutions. The proposed solution relies exclusively on brake pipe and brake cylinder pressure sensors, a brake release position switch, and radio communication via standard protocols. The approach is embedded in the Wagon 4.0 concept, a holistic approach to a smart freight wagon. The reduction of manual processes provides a strong incentive due to high savings in manual labour and increased productivity.
This study focuses on thermoelectric elements (TEE) as an alternative for room temperature control. TEE are semiconductor devices that can provide heating and cooling via a heat pump effect without direct noise emissions or refrigerant use. An efficiency evaluation of the optimal operating mode is carried out for different numbers of TEE, ambient temperatures, and heating loads. The influence of an additional heat recovery unit on system efficiency and of an unevenly distributed heating demand is examined. The results show that TEE can provide heat at a coefficient of performance (COP) greater than one, especially for small heating demands and high ambient temperatures. The efficiency increases with the number of elements in the system and is subject to economies of scale. The best COP exceeds six at optimal operating conditions. An additional heat recovery unit proves beneficial for low ambient temperatures and systems with few TEE; it makes COPs above one possible at ambient temperatures below 0 °C. The heat recovery unit increases efficiency by at most 0.81 (from 1.90 to 2.71) at an ambient temperature 5 K below room temperature and a heating demand of Q̇h = 100 W, but is subject to diseconomies of scale. Thermoelectric technology is a valuable option for electricity-based heat supply and can provide cooling and ventilation functions. A careful system design as well as an additional heat recovery unit significantly benefit the performance. This makes TEE superior to direct current heating systems and competitive with heat pumps for small-scale applications with a focus on avoiding noise and harmful refrigerants.
Gearboxes are mechanical transmission systems that provide speed and torque conversions from a rotating power source. Being a central element of the drive train, they are relevant for the efficiency and durability of motor vehicles. In this work, we present a new approach for gearbox design: Modeling the design problem as a mixed-integer nonlinear program (MINLP) allows us to create gearbox designs from scratch for arbitrary requirements and—given enough time—to compute provably globally optimal designs for a given objective. We show how different degrees of freedom influence the runtime and present an exemplary solution.
Planning the layout and operation of a technical system is a common task for an engineer. Typically, the workflow is divided into consecutive stages: first, the engineer designs the layout of the system, with the help of experience or heuristic methods; second, a control strategy is found, which is often optimized by simulation. This usually results in good operation of an unquestioned system topology. In contrast, we apply Operations Research (OR) methods to find a cost-optimal solution for both stages simultaneously via mixed-integer linear programming (MILP). Technical Operations Research (TOR) allows one to find a provably globally optimal solution within the model formulation. However, the modeling error due to the abstraction of physical reality remains unknown. We address this ubiquitous problem of OR methods by comparing our computational results with measurements in a test rig. For a practical test case we compute a topology and control strategy via MILP and verify that the objectives are met up to a deviation of 8.7%.
Purpose Vascular risk factors and ocular perfusion are heatedly discussed in the pathogenesis of glaucoma. The retinal vessel analyzer (RVA, IMEDOS Systems, Germany) allows noninvasive measurement of retinal vessel regulation. Significant differences, especially in the veins, between healthy subjects and patients suffering from glaucoma were previously reported. In this pilot study we investigated whether localized vascular regulation is altered in glaucoma patients with altitudinal visual field defect asymmetry. Methods 15 eyes of 12 glaucoma patients with advanced altitudinal visual field defect asymmetry were included. The mean defect was calculated for each hemisphere separately (−20.99 ± 10.49 dB for the more profound hemispheric visual field defect vs. −7.36 ± 3.97 dB for the less profound hemisphere). After pupil dilation, RVA measurements of retinal arteries and veins were conducted using the standard protocol. The superior and inferior retinal vessel reactivity were measured consecutively in each eye. Results Significant differences between the hemispheres were recorded in venous vessel constriction after flicker light stimulation and in the overall amplitude of the reaction (p < 0.04 and p < 0.02, respectively). Vessel reaction was higher in the hemisphere corresponding to the more advanced visual field defect. Arterial diameters reacted similarly but failed to reach statistical significance. Conclusion Localized retinal vessel regulation is significantly altered in glaucoma patients with asymmetric altitudinal visual field defects. Veins supplying the hemisphere concordant to a less profound visual field defect show diminished diameter changes. Vascular dysregulation might be particularly important in early glaucoma stages prior to a significant visual field defect.
The term ocular rigidity is widely used in clinical ophthalmology. Generally, it is understood as the resistance of the whole eyeball to mechanical deformation and relates to the biomechanical properties of the eye and its tissues. Basic principles and formulas for clinical tonometry, tonography and pulsatile ocular blood flow measurements are based on the concept of ocular rigidity. There is evidence for altered ocular rigidity in aging, in several eye diseases and after eye surgery. Unfortunately, there is no consensual view on ocular rigidity: the same term has come to carry quite different meanings for different people. Above all, there is no clear consensus between biomechanical engineers and ophthalmologists on the concept. Moreover, ocular rigidity is occasionally characterized using various parameters with different physical dimensions. In contrast to the engineering approach, the clinical approach to ocular rigidity claims to characterize the total mechanical response of the eyeball to its deformation without any detailed considerations of eye morphology or the material properties of its tissues. Following on from the previous chapter, this section describes the clinical approach to ocular rigidity from the perspective of an engineer, in an attempt to straighten out the concept and to show its advantages, disadvantages and various applications.
Pure analytical or experimental methods can only find a control strategy for technical systems with a fixed setup. In former contributions we presented an approach that simultaneously finds the optimal topology and the optimal open-loop control of a system via Mixed Integer Linear Programming (MILP). In order to extend this approach by a closed-loop control we present a Mixed Integer Program for a time discretized tank level control. This model is the basis for an extension by combinatorial decisions and thus for the variation of the network topology. Furthermore, one is able to appraise feasible solutions using the global optimality gap.
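To make the idea of a time-discretized tank level control model concrete, here is a minimal pump-scheduling MILP sketch. The data (horizon, tank bounds, inflow, demand) are toy values, not the authors' model; only the structure — binary on/off decisions under level constraints — mirrors the abstract.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Toy time-discretized tank level control (illustrative, not the paper's
# model): binary x[t] switches a pump on/off; the tank level must stay
# within [h_min, h_max] while a known demand d[t] drains the tank.
T = 6
h0, h_min, h_max = 2.0, 1.0, 5.0
q = 2.0                                # inflow per step when pump is on
d = np.array([1, 2, 1, 2, 1, 2.0])     # demand per step

# Level after step t: h0 + q * sum(x[:t+1]) - cumsum(d)[t]
A = q * np.tril(np.ones((T, T)))       # cumulative pumping matrix
cum_d = np.cumsum(d)
level = LinearConstraint(A, h_min - h0 + cum_d, h_max - h0 + cum_d)

# Minimize the number of pumping steps (a simple proxy for energy use).
res = milp(c=np.ones(T), integrality=np.ones(T),
           bounds=Bounds(0, 1), constraints=level)
pump_on = np.round(res.x)
print("schedule:", pump_on, "| steps on:", int(pump_on.sum()))
```

Extending such a model with combinatorial topology decisions, as the abstract describes, amounts to adding further binary variables that activate or deactivate network components in the same constraint system.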
The UN sets the goal to ensure access to water and sanitation for all people by 2030. To address this goal, we present a multidisciplinary approach for designing water supply networks for slums in large cities by applying mathematical optimization. The problem is modeled as a mixed-integer linear problem (MILP) aiming to find a network describing the optimal supply infrastructure. To illustrate the approach, we apply it to a small slum cluster in Dhaka, Bangladesh.
The energy efficiency of technical systems can be improved by a systematic design approach. Technical Operations Research (TOR) employs methods known from Operations Research to find a globally optimal layout and operation strategy of technical systems. We show the practical usage of this approach by the systematic design of a decentralized water supply system for skyscrapers. All possible network options and operation strategies are modeled by a Mixed-Integer Nonlinear Program. We present the optimal system found by our approach and highlight the energy savings compared to a conventional system design.
Highly competitive markets paired with tremendous production volumes demand particularly cost efficient products. The usage of common parts and modules across product families can potentially reduce production costs. Yet, increasing commonality typically results in overdesign of individual products. Multi domain virtual prototyping enables designers to evaluate costs and technical feasibility of different single product designs at reasonable computational effort in early design phases. However, savings by platform commonality are hard to quantify and require detailed knowledge of e.g. the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost optimal design of single products.
Around 60% of the paper worldwide is made from recovered paper. Adhesive contaminants in particular, so-called stickies, reduce paper quality. To remove stickies while keeping as many valuable fibers as possible, multi-stage screening systems with several interconnected pressure screens are used. When planning such systems, suitable screens have to be selected, and their interconnection as well as operational parameters have to be defined, considering multiple conflicting objectives. In this contribution, we present a Mixed-Integer Nonlinear Program to optimize system layout, component selection and operation to find a suitable trade-off between output quality and yield.
In product development, numerous design decisions have to be made. Multi-domain virtual prototyping provides a variety of tools to assess technical feasibility of design options, however often requires substantial computational effort for just a single evaluation. A special challenge is therefore the optimal design of product families, which consist of a group of products derived from a common platform. Finding an optimal platform configuration (stating what is shared and what is individually designed for each product) and an optimal design of all products simultaneously leads to a mixed-integer nonlinear black-box optimization model. We present an optimization approach based on metamodels and a metaheuristic. To increase computational efficiency and solution quality, we compare different types of Gaussian process regression metamodels adapted from the domain of machine learning, and combine them with a genetic algorithm. We illustrate our approach on the example of a product family of electrical drives, and investigate the trade-off between solution quality and computational overhead.
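The metamodel idea in the abstract above can be sketched in a few lines: fit a Gaussian process surrogate to a handful of expensive evaluations, then search cheaply on the surrogate instead of the simulation. The objective, kernel length scale, and sample counts below are toy assumptions (plain-NumPy GP, RBF kernel, grid search instead of a genetic algorithm for brevity).

```python
import numpy as np

# Minimal Gaussian process regression surrogate (RBF kernel) for an
# "expensive" black-box objective -- a toy stand-in for the multi-domain
# simulations mentioned in the abstract.
def rbf(a, b, length=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

f = lambda x: (x - 0.3)**2            # pretend each call is costly
x_train = np.linspace(0.0, 1.0, 8)    # a few expensive evaluations
y_train = f(x_train)

K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))  # jitter
alpha = np.linalg.solve(K, y_train)

# Cheap search on the surrogate's posterior mean instead of on f itself.
x_grid = np.linspace(0.0, 1.0, 1001)
mean = rbf(x_grid, x_train) @ alpha
best_x = x_grid[np.argmin(mean)]
print(f"surrogate minimizer: {best_x:.3f} (true minimizer: 0.300)")
```

In the paper's setting the grid search would be replaced by a genetic algorithm over the mixed-integer platform variables, with the GP providing fast fitness estimates.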
In order to maximize the possible travel distance of battery electric vehicles with one battery charge, it is mandatory to adjust all components of the powertrain carefully to each other. While current vehicle designs mostly simplify the powertrain rigorously and use an electric motor in combination with a gearbox with only one fixed transmission ratio, the use of multi-gear systems has great potential. First, a multi-speed system is able to improve the overall energy efficiency. Second, it is able to reduce the maximum momentum and therefore to reduce the maximum current provided by the traction battery, which results in a longer battery lifetime. In this paper, we present a systematic way to generate multi-gear gearbox designs that—combined with a certain electric motor—lead to the most efficient fulfillment of predefined load scenarios and are at the same time robust to uncertainties in the load. Therefore, we model the electric motor and the gearbox within a Mixed-Integer Nonlinear Program, and optimize the efficiency of the mechanical parts of the powertrain. By combining this mathematical optimization program with an unsupervised machine learning algorithm, we are able to derive globally optimal gearbox designs for practically relevant momentum and speed requirements.
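The "unsupervised machine learning algorithm" mentioned above groups load requirements; a plain k-means clustering of (speed, torque) load points is one plausible reading. The data, the choice of k-means, and the deterministic initialization are all illustrative assumptions, not details from the paper.

```python
import numpy as np

# Toy k-means clustering of (speed, torque) load points -- an illustrative
# stand-in for the unsupervised grouping of load scenarios; the paper's
# actual algorithm and data may differ.
def kmeans(points, k, iters=20):
    centroids = points[[0, -1]][:k].astype(float)   # deterministic init
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([points[labels == j].mean(axis=0)
                              for j in range(k)])
    return labels, centroids

rng = np.random.default_rng(0)
low  = rng.normal([1000.0, 20.0], 30.0, size=(25, 2))   # city driving
high = rng.normal([4000.0, 80.0], 30.0, size=(25, 2))   # highway driving
points = np.vstack([low, high])

labels, centroids = kmeans(points, k=2)
print("cluster sizes:", np.bincount(labels))
```

Each resulting cluster centroid can then serve as one representative load scenario inside the MINLP, keeping the optimization model small while covering the relevant operating range.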
The chemical industry is one of the most important industrial sectors in Germany in terms of manufacturing revenue. While thermodynamic boundary conditions often restrict the scope for reducing the energy consumption of core processes, secondary processes such as cooling offer scope for energy optimisation. In this contribution, we therefore model and optimise an existing cooling system. The technical boundary conditions of the model are provided by the operators, the German chemical company BASF SE. In order to systematically evaluate different degrees of freedom in topology and operation, we formulate and solve a Mixed-Integer Nonlinear Program (MINLP), and compare our optimisation results with the existing system.
Component failures within water supply systems can lead to significant performance losses. One way to address these losses is the explicit anticipation of failures within the design process. We consider a water supply system for high-rise buildings, where pump failures are the most likely failure scenarios. We explicitly consider these failures within an early design stage which leads to a more resilient system, i.e., a system which is able to operate under a predefined number of arbitrary pump failures. We use a mathematical optimization approach to compute such a resilient design. This is based on a multi-stage model for topology optimization, which can be described by a system of nonlinear inequalities and integrality constraints. Such a model has to be both computationally tractable and to represent the real-world system accurately. We therefore validate the algorithmic solutions using experiments on a scaled test rig for high-rise buildings. The test rig allows for an arbitrary connection of pumps to reproduce scaled versions of booster station designs for high-rise buildings. We experimentally verify the applicability of the presented optimization model and that the proposed resilience properties are also fulfilled in real systems.
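The resilience property defined above — remaining operable under a predefined number of arbitrary pump failures — can be checked for small booster stations by brute-force scenario enumeration. The pump capacities and demand below are hypothetical; the paper itself uses a multi-stage optimization model rather than enumeration.

```python
from itertools import combinations

# Check whether a booster station design stays feasible under every
# combination of up to k simultaneous pump failures (hypothetical data).
def is_resilient(capacities, demand, k):
    for n_failed in range(k + 1):
        for failed in combinations(range(len(capacities)), n_failed):
            remaining = sum(c for i, c in enumerate(capacities)
                            if i not in failed)
            if remaining < demand:
                return False
    return True

pumps = [50.0, 50.0, 50.0]   # three identical pumps (hypothetical m^3/h)
print(is_resilient(pumps, demand=100.0, k=1))  # covers any 1 failure
print(is_resilient(pumps, demand=100.0, k=2))  # fails for 2 failures
```

For realistic station sizes the number of scenarios grows combinatorially, which is one reason the paper anticipates failures inside the optimization model instead of checking designs afterwards.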
This chapter describes three general strategies to master uncertainty in technical systems: robustness, flexibility and resilience. It builds on the previous chapters about methods to analyse and identify uncertainty and may rely on the availability of technologies for particular systems, such as active components. Robustness aims for the design of technical systems that are insensitive to anticipated uncertainties. Flexibility increases the ability of a system to work under different situations. Resilience extends this characteristic by requiring a given minimal functional performance, even after disturbances or failure of system components, and it may incorporate recovery. The three strategies are described and discussed in turn. Moreover, they are demonstrated on specific technical systems.
The application of mathematical optimization methods for water supply system design and operation provides the capacity to increase the energy efficiency and to lower the investment costs considerably. We present a system approach for the optimal design and operation of pumping systems in real-world high-rise buildings that is based on the usage of mixed-integer nonlinear and mixed-integer linear modeling approaches. In addition, we consider different booster station topologies, i.e. parallel and series-parallel central booster stations as well as decentral booster stations. To confirm the validity of the underlying optimization models with real-world system behavior, we additionally present validation results based on experiments conducted on a modularly constructed pumping test rig. Within the models we consider layout and control decisions for different load scenarios, leading to a Deterministic Equivalent of a two-stage stochastic optimization program. We use a piecewise linearization as well as a piecewise relaxation of the pumps’ characteristics to derive mixed-integer linear models. Besides the solution with off-the-shelf solvers, we present a problem specific exact solving algorithm to improve the computation time. Focusing on the efficient exploration of the solution space, we divide the problem into smaller subproblems, which partly can be cut off in the solution process. Furthermore, we discuss the performance and applicability of the solution approaches for real buildings and analyze the technical aspects of the solutions from an engineer’s point of view, keeping in mind the economically important trade-off between investment and operation costs.
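The piecewise linearization of the pumps' characteristics mentioned above can be illustrated directly on a toy quadratic head curve H(Q) = a − bQ² with hypothetical coefficients; the idea is that replacing the curve by segments between breakpoints makes the pump model MILP-compatible at the cost of a bounded approximation error.

```python
import numpy as np

# Piecewise-linear approximation of a pump's head-flow characteristic
# H(Q) = a - b*Q^2 (hypothetical coefficients), as used to turn the
# nonlinear pump model into MILP-compatible constraints.
a, b = 40.0, 0.5                      # shutoff head [m], curve coefficient
H = lambda q: a - b * q**2

breakpoints = np.linspace(0.0, 8.0, 5)   # flow breakpoints [m^3/h]
H_bp = H(breakpoints)

# Evaluate the interpolant on a fine grid and bound the approximation error.
q = np.linspace(0.0, 8.0, 801)
H_lin = np.interp(q, breakpoints, H_bp)
max_err = np.max(np.abs(H_lin - H(q)))
print(f"max linearization error with 5 breakpoints: {max_err:.3f} m")
```

Doubling the number of segments roughly quarters this worst-case error (it scales with the squared segment width for a quadratic curve), which is the trade-off between model accuracy and MILP size that the solution approach has to balance.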
Water distribution systems are an essential supply infrastructure for cities. Given that climatic and demographic influences will pose further challenges for these infrastructures in the future, the resilience of water supply systems, i.e. their ability to withstand and recover from disruptions, has recently become a subject of research. To assess the resilience of a water distribution system (WDS), different graph-theoretical approaches exist. Next to general metrics characterizing the network topology, hydraulic and technical restrictions also have to be taken into account. In this work, the resilience of an exemplary water distribution network of a major German city is assessed, and a Mixed-Integer Program is presented which allows assessing the impact of capacity adaptations on its resilience.