This paper describes the potential for developing a digital twin of society: a dynamic model that can be used to observe, analyze, and predict the evolution of various societal aspects. Such a digital twin can help governmental agencies and policy makers in interpreting trends, understanding challenges, and making decisions regarding investments or policies necessary to support societal development and ensure future prosperity. The paper reviews related work regarding the digital twin paradigm and its applications. The paper presents a motivating case study: an analysis of opportunities and challenges faced by the German federal employment agency, Bundesagentur für Arbeit (BA), proposes solutions using digital twins, and describes initial proofs of concept for such solutions.
Objective
This study assesses and quantifies impairment of postoperative magnetic resonance imaging (MRI) at 7 Tesla (T) after implantation of titanium cranial fixation plates (CFPs) for neurosurgical bone flap fixation.
Materials and methods
The study group comprised five patients who were intra-individually examined with 3 and 7 T MRI preoperatively and postoperatively (within 72 h/3 months) after implantation of CFPs. Acquired sequences included T₁-weighted magnetization-prepared rapid-acquisition gradient-echo (MPRAGE), T₂-weighted turbo-spin-echo (TSE) imaging, and susceptibility-weighted imaging (SWI). Two experienced neurosurgeons and a neuroradiologist rated image quality and the presence of artifacts in consensus reading.
Results
Minor artifacts occurred around the CFPs in MPRAGE and T₂ TSE at both field strengths, with no significant differences between 3 and 7 T. In SWI, artifacts were accentuated in the early postoperative scans at both field strengths due to intracranial air and hemorrhagic remnants. After resorption, the brain tissue directly adjacent to skull bone could still be assessed. Image quality after 3 months was equal to the preoperative examinations at 3 and 7 T.
Conclusion
Image quality after CFP implantation was not significantly impaired in 7 T MRI, and artifacts were comparable to those in 3 T MRI.
One central challenge for self-driving cars is proper path planning. Once a trajectory has been found, the next challenge is to accurately and safely follow the precalculated path. The model-predictive controller (MPC) is a common approach for the lateral control of autonomous vehicles. The MPC uses a vehicle dynamics model to predict the future states of the vehicle for a given prediction horizon. However, in order to achieve real-time path control, the computational load is usually large, which leads to short prediction horizons. To deal with the computational load, the control algorithm can be parallelized on the graphics processing unit (GPU). In contrast to the widely used stochastic methods, in this paper we propose a deterministic approach based on grid search. Our approach focuses on systematically discovering the search area with different levels of granularity. To achieve this, we split the optimization algorithm into multiple iterations. The best sequence of each iteration is then used as an initial solution for the next iteration. The granularity increases, resulting in smooth and predictable steering angle sequences. We present a novel GPU-based algorithm and show its accuracy and real-time capability in a number of real-world experiments.
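The coarse-to-fine refinement described in the abstract can be sketched in a few lines of sequential code; the cost function, horizon, and all numeric values below are illustrative assumptions, not the paper's vehicle model or GPU implementation:

```python
# Hedged sketch of a deterministic coarse-to-fine grid search over a
# steering-angle sequence. The toy cost function stands in for the MPC
# objective; the real method evaluates a vehicle dynamics model on a GPU.
import itertools

def path_cost(steering_seq, target=0.35):
    # penalize deviation of the accumulated steering angle from a target
    # heading, plus a smoothness term (all values illustrative)
    heading = sum(steering_seq)
    smoothness = sum((a - b) ** 2 for a, b in zip(steering_seq, steering_seq[1:]))
    return (heading - target) ** 2 + 0.1 * smoothness

def grid_search(horizon=3, iterations=4, span=0.5, points=5):
    best = [0.0] * horizon
    for _ in range(iterations):
        # symmetric offsets around the current best sequence
        offsets = [span * (i / (points - 1) - 0.5) for i in range(points)]
        candidates = itertools.product(*[[b + o for o in offsets] for b in best])
        best = min(candidates, key=path_cost)
        span /= points - 1  # refine granularity around the incumbent
    return list(best)

seq = grid_search()
```

Because the zero offset is always among the candidates, each iteration can only improve (or keep) the incumbent, which mirrors the paper's idea of reusing the best sequence as the next iteration's starting point.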
The Android operating system powers the majority of the world's mobile devices and has become increasingly important in day-to-day digital forensics. Therefore, technicians and analysts need reliable methods for extracting and analyzing memory images from live Android systems. This paper takes different existing extraction methods and derives a universal, reproducible, reliably documented method for both extraction and analysis. In addition, the VOLIX II front-end for the Volatility Framework is extended with additional functionality to make the analysis of Android memory images easier for technically non-adept users.
Physical layer specification of the L-band Digital Aeronautical Communications System (L-DACS1)
(2009)
This paper presents the latest prototype of the integrated emitter turn-off thyristor concept, which potentially ranks among thyristor high-power devices like the gate turn-off thyristor and the integrated gate-commutated thyristor (IGCT). Due to modifications of the external driver stage and mechanical press-pack design optimization, this prototype allows for full device characterization. The turn-off capability was increased to 1600 A with an active silicon area of 823 mm². This leads to a transient peak power of 672.1 kW/cm². Within this paper, measurements and a concept assessment are presented, and a comparison to state-of-the-art IGCT devices is provided.
The continuously growing amount of renewable sources is starting to compromise the stability of electrical grids. In contrast to fossil-fuel power plants, energy production from wind and photovoltaic (PV) sources is fluctuating. Although predictions have significantly improved, an outage of multi-MW offshore wind farms poses a challenging problem. One solution could be the integration of storage systems in the grid. After a short overview, this paper focuses on two exemplary battery storage systems, including the required power electronics. The grid integration, as well as the optimal usage of volatile energy reserves, is presented for a 5-kW PV system for home application, as well as for a 100-MW medium-voltage system intended for wind farm usage. The efficiency and cost of topologies are investigated as key parameters for large-scale integration of renewable power at medium and low voltage.
Motivation-based Learning: Teaching Fundamentals of Electrical Engineering with an LED Spinning Top
(2018)
This thesis introduces the Integrated Emitter Turn-Off (IETO) Thyristor as a new high-power device. Known state-of-the-art research activities like the Dual GCT, the ETO thyristor and the ICT were presented and critically reviewed. A comparison with commercialized solutions identifies the pros and cons of each type of device family. Based on this analysis, the IETO structure is proposed, covering most benefits of each device class. In particular, the combination of a MOS-assisted turn-off with a thyristor-based device allows a voltage-controlled MOS switching and the low on-state voltage of the thyristors. The following synthesis of an IETO device stands on a three-dimensional field of optimization spanned by electric, mechanical and thermal aspects. From an electric point of view, the lowest possible parasitic inductance and resistance within the commutation path are optimization criteria. The mechanical construction has to withstand the required contact pressure of multiple kilonewtons. Finally, thermal borders limit the maximum average current of the device. FEM simulations covering these three aspects are performed for several design proposals. An IETO prototype is constructed, and measurements on various test benches attest to its thermal, mechanical and electric performance. A local decoupling of the external driver stage and the press-pack housing is achieved by a cable connection. This separation enables a thermal and mechanical independence, which is advantageous in terms of vibrations and thermal cycles, including increased reliability. The electric pulse performance of the prototype device is a factor of 3.1 above today's solutions. In single-pulse measurements, a current of up to 1600 A was successfully turned off at 115 °C with an active silicon area of 823 mm². One reason for this increased turn-off capability is the extremely low-inductive construction. Additional functionality of the IETO thyristor, like over-current self-protection and a defined short-circuit failure state, is successfully verified.
The Scarab Project
(2015)
Urban Search and Rescue (USAR) is an active research field in the robotics community. Despite recent advances for many open research questions, these kinds of systems are not widely used in real rescue missions. One reason is that such systems are complex and not (yet) very reliable; another is that one has to be a robotics expert to run such a system. Moreover, available rescue robots are very expensive and the benefits of using them are still limited.
In this paper, we present the Scarab robot, an alternative design for a USAR robot. The robot is lightweight, human-packable, and its primary purpose is that of extending the rescuer's capability to sense the disaster site. The idea is that a responder throws the robot to a certain spot. The robot survives the impact with the ground and relays sensor data such as camera images or thermal images to the responder's hand-held control unit, from which the robot can be remotely controlled.
Assessment of RF Safety of Transmit Coils at 7 Tesla by Experimental and Numerical Procedures
(2012)
Information technologies, such as big data analytics, cloud computing, cyber-physical systems, robotic process automation, and the internet of things, provide a sustainable impetus for the structural development of business sectors as well as the digitalization of markets, enterprises, and processes. Within the consulting industry, the proliferation of these technologies opened up the new segment of digital transformation, which focuses on setting up, controlling, and implementing projects for enterprises from a broad range of sectors. These recent developments raise the question of which requirements evolve for IT consultants as important success factors of those digital transformation projects. Therefore, this empirical contribution provides indications regarding the qualifications and competences necessary for IT consultants in the era of digital transformation from a labor market perspective. On the one hand, this knowledge base is interesting for the academic education of consultants, since it supports a market-oriented design of adequate training measures. On the other hand, insights into the competence requirements for consultants are considered relevant for skill and talent management processes in consulting practice. Assuming that consulting companies pursue a strategic human resource management approach, labor market information may also be useful to discover strategic behavioral patterns.
The continuing growth of scientific publications raises the question how research processes can be digitalized and thus realized more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. For the search process various information systems exist. However, the analysis of the published content is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype by using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
The benefits of robotic process automation (RPA) are highly related to the usage of commercial off-the-shelf (COTS) software products that can be easily implemented and customized by business units. But how can one find the best-fitting RPA product for a specific situation, one that creates the expected benefits? This question is related to the general area of software evaluation and selection. In the face of more than 75 RPA products currently on the market, guidance considering those specifics is required. Therefore, this chapter proposes a criteria-based selection method specifically for RPA. The method includes a quantitative evaluation of costs and benefits as well as a qualitative utility analysis based on functional criteria. By using the visualization of financial implications (VOFI) method, an application-oriented structure is provided that opposes the total cost of ownership to the time savings times salary (TSTS). For the utility analysis, a detailed list of functional criteria for RPA is offered. The whole method is based on a multi-vocal review of scientific and non-scholarly literature including publications by business practitioners, consultants, and vendors. The application of the method is illustrated by a concrete RPA example. The illustrated structures, templates, and criteria can be directly utilized by practitioners in their real-life RPA implementations. In addition, a normative decision process for selecting RPA alternatives is proposed before the chapter closes with a discussion and outlook.
After a brief introduction of conventional laboratory structures, this work focuses on an innovative and universal approach for a setup of a training laboratory for electric machines and drive systems. The novel approach employs a central 48 V DC bus, which forms the backbone of the structure. Several sets of DC machine, asynchronous machine and synchronous machine are connected to this bus. The advantages of the novel system structure are manifold, both from a didactic and a technical point of view: Student groups can work on their own performance level in a highly parallelized and at the same time individualized way. Additional training setups (similar or different) can easily be added. Only the total power dissipation has to be provided, i.e. the DC bus balances the power flow between the student groups. Comparative results of course evaluations of several cohorts of students are shown.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through the presentation layer of existing application systems. For this simple emulation of user input and output by software robots, no changes to the systems and architecture are required. However, considering strategic aspects of aligning business and technology on an enterprise level as well as the growing capabilities of RPA driven by artificial intelligence, interrelations between RPA and Enterprise Architecture (EA) become visible and pose new questions. In this paper we discuss the relationship between RPA and EA in terms of perspectives and implications. As work in progress, we focus on identifying new questions and research opportunities related to RPA and EA.
This paper presents an approach for reducing the cognitive load for humans working in quality control (QC) for production processes that adhere to the 6σ methodology. While 100% QC requires every part to be inspected, this task can be reduced when a human-in-the-loop QC process is supported by an anomaly detection system that only presents those parts for manual inspection that have a significant likelihood of being defective. This approach shows good results when applied to image-based QC for metal textile products.
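The gating idea can be reduced to a one-line selection rule; this minimal sketch assumes a detector that outputs anomaly scores in [0, 1], with scores and threshold invented for illustration:

```python
# Hedged sketch: route only high-anomaly-score parts to manual inspection.
# The scores below are made-up stand-ins for a real detector's output.
def select_for_inspection(scores, threshold=0.8):
    """Return indices of parts the detector flags as likely defective."""
    return [i for i, s in enumerate(scores) if s >= threshold]

scores = [0.05, 0.92, 0.10, 0.81, 0.30, 0.07]
flagged = select_for_inspection(scores)
# the human inspects only the flagged subset instead of all six parts
```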
Digital twins enable the modeling and simulation of real-world entities (objects, processes or systems), resulting in improvements in the associated value chains. The emerging field of quantum computing holds tremendous promise for evolving this virtualization towards Quantum (Digital) Twins (QDT) and ultimately Quantum Twins (QT). The quantum (digital) twin concept is not a contradiction in terms, but instead describes a hybrid approach that can be implemented using the technologies available today by combining classical computing and digital twin concepts with quantum processing. This paper presents the status quo of research and practice on quantum (digital) twins. It also discusses their potential to create competitive advantage through real-time simulation of highly complex, interconnected entities that helps companies better address changes in their environment and differentiate their products and services.
Algorithmic design and resilience assessment of energy efficient high-rise water supply systems
(2018)
High-rise water supply systems provide water flow and suitable pressure in all levels of tall buildings. To design such state-of-the-art systems, the consideration of energy efficiency and the anticipation of component failures are mandatory. In this paper, we use Mixed-Integer Nonlinear Programming to compute an optimal placement of pipes and pumps, as well as an optimal control strategy. Moreover, we consider the resilience of the system to pump failures. A resilient system is able to fulfill a predefined minimum functionality even though components fail or are restricted in their normal usage. We present models to measure and optimize the resilience. To demonstrate our approach, we design and analyze an optimal resilient decentralized water supply system inspired by a real-life hotel building.
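The resilience requirement, fulfilling demand even though a component fails, can be illustrated with a toy pump-selection search; pump capacities, costs, and the demand are invented, and the MINLP is replaced by brute-force enumeration:

```python
# Hedged toy version of the resilience constraint: choose the cheapest
# pump set that still covers demand after any single pump failure.
# All data are illustrative, not from the paper's hotel case study.
from itertools import combinations

pumps = {"P1": (30, 5), "P2": (30, 5), "P3": (50, 8), "P4": (50, 8)}  # name: (capacity, cost)
demand = 60

def resilient(selection):
    # remaining pumps must cover demand whichever single pump fails
    return all(
        sum(pumps[p][0] for p in selection if p != failed) >= demand
        for failed in selection
    )

best = min(
    (c for r in range(1, len(pumps) + 1)
       for c in combinations(pumps, r) if resilient(c)),
    key=lambda c: sum(pumps[p][1] for p in c),
)
```

Note how redundancy costs extra: two pumps would cover the nominal demand, but the failure scenarios force a third into the design, which is exactly the trade-off the resilience models quantify.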
On obligations in the development process of resilient systems with algorithmic design methods
(2018)
Advanced computational methods are needed both for the design of large systems and to compute high accuracy solutions. Such methods are efficient in computation, but the validation of results is very complex, and highly skilled auditors are needed to verify them. We investigate legal questions concerning obligations in the development phase, especially for technical systems developed using advanced methods. In particular, we consider methods of resilient and robust optimization. With these techniques, high performance solutions can be found, despite a high variety of input parameters. However, given the novelty of these methods, it is uncertain whether legal obligations are being met. The aim of this paper is to discuss if and how the choice of a specific computational method affects the developer’s product liability. The review of legal obligations in this paper is based on German law and focuses on the requirements that must be met during the design and development process.
Energy-efficient components do not automatically lead to energy-efficient systems. Technical Operations Research (TOR) shifts the focus from the single component to the system as a whole and finds its optimal topology and operating strategy simultaneously. In previous works, we provided a preselected construction kit of suitable components for the algorithm. This approach may give rise to a combinatorial explosion if the preselection cannot be cut down to a reasonable number by human intuition. To reduce the number of discrete decisions, we integrate laws derived from similarity theory into the optimization model. Since the physical characteristics of a production series are similar, it can be described by affinity and scaling laws. Making use of these laws, our construction kit can be modeled more efficiently: Instead of a preselection of components, it now encompasses whole model ranges. This allows us to significantly increase the number of possible set-ups in our model. In this paper, we present how to embed this new formulation into a mixed-integer program and assess the run time via benchmarks. We present our approach on the example of a ventilation system design problem.
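The affinity laws referred to above are standard relations for geometrically similar turbomachines; this sketch applies them to an assumed reference operating point (the numeric values are illustrative, not from the paper):

```python
# Classical affinity/scaling laws for geometrically similar pumps or fans:
# flow Q ~ n * D^3, head H ~ n^2 * D^2, power P ~ n^3 * D^5,
# where n is the speed ratio and D the impeller-diameter ratio.
def scale_pump(flow, head, power, n_ratio, d_ratio):
    """Scale a reference operating point across a model range."""
    return (
        flow * n_ratio * d_ratio**3,      # Q ~ n * D^3
        head * n_ratio**2 * d_ratio**2,   # H ~ n^2 * D^2
        power * n_ratio**3 * d_ratio**5,  # P ~ n^3 * D^5
    )

# half speed at the same diameter: flow halves, power drops to one eighth
q, h, p = scale_pump(flow=100.0, head=40.0, power=11.0, n_ratio=0.5, d_ratio=1.0)
```

Encoding a whole model range through two continuous ratios instead of enumerating individual components is what keeps the number of discrete decisions in the optimization model small.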
Cheap does not imply cost-effective: this is rule number one of zeitgeisty system design. The initial investment accounts for only a small portion of the lifecycle costs of a technical system. In fluid systems, about ninety percent of the total costs are caused by other factors like power consumption and maintenance. With modern optimization methods, it is already possible to plan an optimal technical system considering multiple objectives. In this paper, we focus on an often neglected contribution to the lifecycle costs: downtime costs due to spontaneous failures. Consequently, availability becomes an issue.
Planning the layout and operation of a technical system is a common task for an engineer. Typically, the workflow is divided into consecutive stages: First, the engineer designs the layout of the system, with the help of his experience or of heuristic methods. Secondly, he finds a control strategy, which is often optimized by simulation. This usually results in a good operation of an unquestioned system topology. In contrast, we apply Operations Research (OR) methods to find a cost-optimal solution for both stages simultaneously via mixed-integer programming (MILP). Technical Operations Research (TOR) allows one to find a provably globally optimal solution within the model formulation. However, the modeling error due to the abstraction of physical reality remains unknown. We address this ubiquitous problem of OR methods by comparing our computational results with measurements in a test rig. For a practical test case, we compute a topology and control strategy via MILP and verify that the objectives are met up to a deviation of 8.7%.
Pure analytical or experimental methods can only find a control strategy for technical systems with a fixed setup. In former contributions we presented an approach that simultaneously finds the optimal topology and the optimal open-loop control of a system via Mixed Integer Linear Programming (MILP). In order to extend this approach by a closed-loop control we present a Mixed Integer Program for a time discretized tank level control. This model is the basis for an extension by combinatorial decisions and thus for the variation of the network topology. Furthermore, one is able to appraise feasible solutions using the global optimality gap.
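A toy stand-in for the time-discretized tank level control makes the structure concrete: here the mixed-integer program is replaced by enumeration over on/off pump schedules, and the inflow rate, demand profile, bounds, and horizon are all assumed for illustration:

```python
# Hedged sketch of discretized tank level control: pick the cheapest
# on/off pump schedule that keeps the level within bounds at every step.
# In the paper this is a MIP; here we enumerate, which is only viable
# for tiny horizons.
from itertools import product

demand = [2, 3, 4, 3]       # outflow per time step (assumed)
inflow = 4                  # pump inflow per step when switched on
level0, lo, hi = 5, 2, 10   # initial level and admissible band

def simulate(schedule):
    level, levels = level0, []
    for on, d in zip(schedule, demand):
        level += (inflow if on else 0) - d
        levels.append(level)
    return levels

feasible = [
    s for s in product([0, 1], repeat=len(demand))
    if all(lo <= l <= hi for l in simulate(s))
]
best = min(feasible, key=sum)  # cost proxy: number of pump-on steps
```

Adding combinatorial topology decisions on top of such a time-expanded model is what turns the control problem into the network-variation problem the abstract describes.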
In times of planned obsolescence the demand for sustainability keeps growing. Ideally, a technical system is highly reliable, without failures and down times due to fast wear of single components. At the same time, maintenance should preferably be limited to pre-defined time intervals. Dispersion of load between multiple components can increase a system’s reliability and thus its availability inbetween maintenance points. However, this also results in higher investment costs and additional efforts due to higher complexity. Given a specific load profile and resulting wear of components, it is often unclear which system structure is the optimal one. Technical Operations Research (TOR) finds an optimal structure balancing availability and effort. We present our approach by designing a hydrostatic transmission system.
Gearboxes are mechanical transmission systems that provide speed and torque conversions from a rotating power source. Being a central element of the drive train, they are relevant for the efficiency and durability of motor vehicles. In this work, we present a new approach for gearbox design: Modeling the design problem as a mixed-integer nonlinear program (MINLP) allows us to create gearbox designs from scratch for arbitrary requirements and—given enough time—to compute provably globally optimal designs for a given objective. We show how different degrees of freedom influence the runtime and present an exemplary solution.
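The discrete side of such a design problem can be illustrated with a toy search: choosing tooth counts for a two-stage gearbox so that the overall transmission ratio approximates a target. The tooth-count range and target ratio are invented; the real MINLP model additionally covers geometry, durability and efficiency constraints.

```python
from itertools import product


def best_two_stage(target, teeth=range(17, 40)):
    """Brute-force the tooth counts of two gear stages so the overall
    ratio (z2/z1) * (z4/z3) best approximates the target ratio."""
    best = None
    for z1, z2, z3, z4 in product(teeth, repeat=4):
        ratio = (z2 / z1) * (z4 / z3)
        err = abs(ratio - target)
        if best is None or err < best[0]:
            best = (err, (z1, z2, z3, z4), ratio)
    return best


err, gears, ratio = best_two_stage(4.2)
```

Even this four-variable toy has almost 280,000 combinations, which hints at why the full gearbox design problem needs MINLP machinery rather than enumeration.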
Existing residential buildings have an average lifetime of 100 years. Many of these buildings will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project has already created a low-power design for a combined CO2, temperature and humidity measurement device that can be easily integrated into most home automation systems.
Resilience as a concept has found its way into different disciplines to describe the ability of an individual or system to withstand and adapt to changes in its environment. In this paper, we provide an overview of the concept in different communities and extend it to the area of mechanical engineering. Furthermore, we present metrics to measure resilience in technical systems and illustrate them by applying them to load-carrying structures. By giving application examples from the Collaborative Research Centre (CRC) 805, we show how the concept of resilience can be used to control uncertainty during different stages of product life.
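One simple resilience metric in the spirit of the above, not taken from the CRC 805 work itself, relates delivered to nominal performance over a disruption-and-recovery window.

```python
def resilience(performance, nominal=1.0):
    """Resilience as the ratio of delivered to nominal performance,
    summed over discrete time steps of the observation window."""
    return sum(performance) / (nominal * len(performance))


# A system that drops to 40% capacity after a disruption and
# recovers to full performance over the following steps.
profile = [1.0, 0.4, 0.6, 0.8, 1.0, 1.0]
r = resilience(profile)
```

A value of 1 means the disruption had no effect; the deeper the drop and the slower the recovery, the lower the score, so the metric rewards both robustness and fast recovery.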
According to the state of the art in science and technology, components are optimized with respect to their individual properties, such as service life or energy efficiency. However, even excellent components can lead to inefficient or unstable systems if their interaction is insufficiently taken into account. A system-level view creates greater optimization potential, but this increased potential comes with an increased degree of complexity. The present work was carried out within the Collaborative Research Centre (CRC) 805, whose goal is the control of uncertainty in mechanical engineering systems. Using a real system from the field of hydraulics, the work shows how uncertainty can be controlled during the development phase. What is new here is that the system degradation to be expected from later operation can be anticipated for every possible system proposal. This allows operating and maintenance costs to be predicted and minimized, and the availability of the system to be guaranteed through an optimal operating and maintenance strategy. Essential questions in the optimal design of the considered hydrostatic transmission are its physical modeling, the formulation of the optimization problem as a mixed-integer linear program, and its algorithmic treatment for finding a solution. To this end, heuristics for finding reasonable system topologies more quickly are presented, and the dynamic wear and maintenance behavior of possible system proposals is evaluated by means of mathematical decomposition. The work presents the optimization of technical systems at the interface of mathematics, computer science and engineering in a manner that is thorough as well as clear and comprehensible.
In this paper, research activities developed within the FutureCom project are presented. The project, funded by the European Metrology Programme for Innovation and Research (EMPIR), aims at evaluating and characterizing: (i) active devices, (ii) signal and power integrity of field programmable gate array (FPGA) circuits, (iii) operational performance of electronic circuits in real-world and harsh environments (e.g. below and above ambient temperatures and at different levels of humidity), and (iv) passive intermodulation (PIM) in communication systems at different values of temperature and humidity corresponding to the typical operating conditions experienced in real-world scenarios. An overview of the FutureCom project is provided first, and the research activities are then described.
RGB-D sensors such as the Microsoft Kinect or the Asus Xtion are inexpensive 3D sensors. A depth image is computed by calculating the distortion of a known infrared (IR) light pattern which is projected into the scene. While these sensors are great devices, they have some limitations. The distance they can measure is limited, and they suffer from reflection problems on transparent, shiny, or very matte and absorbing objects. If more than one RGB-D camera is used, the IR patterns interfere with each other, resulting in a massive loss of depth information. In this paper, we present a simple and powerful method to overcome these problems. We propose a stereo RGB-D camera system that combines the advantages of RGB-D cameras with those of stereo camera systems. The idea is to use the IR images of every pair of sensors as a stereo pair to generate a depth map. The IR patterns emitted by the IR projectors are exploited here to enhance the dense stereo matching even if the observed objects or surfaces are texture-less or transparent. The resulting disparity map is then fused with the depth map offered by the RGB-D sensor to fill the regions and holes that appear because of interference, or due to transparent or reflective objects. Our results show that the density of depth information is increased, especially for transparent, shiny or matte objects.
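The fusion step can be sketched on a toy row of pixels: wherever the RGB-D depth is invalid (marked 0 here), the stereo-derived depth fills the hole. The depth values are synthetic; a real implementation operates on full images and may also blend overlapping valid measurements.

```python
def fuse_depth(rgbd, stereo, invalid=0.0):
    """Fill holes (invalid pixels) in the RGB-D depth map with
    stereo-matching depth; keep the RGB-D value wherever valid."""
    return [s if d == invalid else d for d, s in zip(rgbd, stereo)]


# Toy row of pixels: RGB-D lost depth on a transparent object (zeros),
# but IR-pattern-assisted stereo matching recovered it.
rgbd = [1.2, 0.0, 0.0, 2.5]
stereo = [1.3, 1.9, 2.0, 2.6]
fused = fuse_depth(rgbd, stereo)  # → [1.2, 1.9, 2.0, 2.5]
```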
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
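The R-R Interval Correlation idea can be sketched in a few lines: reduce each recording to its R-R intervals and slide one sequence over the other to find the beat offset. The peak positions below are synthetic, and a mean absolute interval difference stands in for the correlation score used in the paper.

```python
def rr_intervals(peaks):
    """R-R intervals: successive differences of R-peak sample indices."""
    return [b - a for a, b in zip(peaks, peaks[1:])]


def best_lag(ref, test, max_lag=5):
    """Slide the test R-R sequence over the reference and return the
    lag with the smallest mean absolute interval difference."""
    best = None
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(ref[i], test[i + lag])
                 for i in range(len(ref))
                 if 0 <= i + lag < len(test)]
        if not pairs:
            continue
        score = sum(abs(r - t) for r, t in pairs) / len(pairs)
        if best is None or score < best[0]:
            best = (score, lag)
    return best[1]


ref_peaks = [10, 95, 178, 265, 352, 440]
# Same heartbeats recorded by a second device that started two beats later.
test_peaks = [178, 265, 352, 440]
lag = best_lag(rr_intervals(ref_peaks), rr_intervals(test_peaks))  # → -2
```

Because only beat-to-beat intervals are compared, the alignment is independent of device clocks and sampling offsets, which is what makes this class of algorithms robust across recording systems.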