Conference Proceeding
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
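The R-R Interval Correlation idea described above can be sketched in a few lines: reduce each recording to its sequence of R-R intervals and search for the beat offset that maximizes their correlation. This is a minimal illustration under our own assumptions, not the authors' implementation; all function names are ours.

```python
import numpy as np

def rr_intervals(r_peak_times):
    """Reduce an ECG recording to its sequence of R-R intervals (seconds)."""
    return np.diff(np.asarray(r_peak_times, dtype=float))

def best_lag(rr_a, rr_b, max_lag=50):
    """Find the beat offset that maximizes correlation between two R-R series."""
    best, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = rr_a[lag:], rr_b[:len(rr_a) - lag]
        else:
            a, b = rr_a[:len(rr_b) + lag], rr_b[-lag:]
        n = min(len(a), len(b))
        if n < 3:
            continue
        c = np.corrcoef(a[:n], b[:n])[0, 1]
        if c > best_corr:
            best, best_corr = lag, c
    return best

# Synthetic example: device B misses the first two beats of device A.
peaks_a = np.cumsum(np.r_[0.0, 0.8 + 0.05 * np.sin(np.arange(40))])
peaks_b = peaks_a[2:]
print(best_lag(rr_intervals(peaks_a), rr_intervals(peaks_b)))  # → 2
```

Because only interval sequences are compared, differing sampling rates and amplitudes of the two devices drop out of the problem, which is what makes this reduction efficient.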
The problem of fair and privacy-preserving ordered set reconciliation arises in a variety of applications like auctions, e-voting, and appointment reconciliation. While several multi-party protocols have been proposed that solve this problem in the semi-honest model, there are no multi-party protocols so far that are secure in the malicious model. In this paper, we close this gap. Our newly proposed protocols are shown to be secure in the malicious model based on a variety of novel non-interactive zero-knowledge proofs. We describe the implementation of our protocols and evaluate their performance in comparison to protocols solving the problem in the semi-honest model.
The RoboCup Logistics League (RCLL) is a robotics competition in a production logistics scenario in the context of a Smart Factory. In the competition, a team of three robots needs to assemble products to fulfill various orders that are requested online during the game. This year, the Carologistics team was able to win the competition with a new approach to multi-agent coordination as well as significant changes to the robot’s perception unit and a pragmatic network setup using the cellular network instead of WiFi. In this paper, we describe the major components of our approach with a focus on the changes compared to the last physical competition in 2019.
Due to the increasing complexity of software projects, software development is becoming more and more dependent on teams. The quality of this teamwork can vary depending on the team composition, as teams are always a combination of different skills and personality types. This paper aims to answer the question of how to describe a software development team and what influence the personality of the team members has on the team dynamics. For this purpose, a systematic literature review (n=48) and a literature search with the AI research assistant Elicit (n=20) were conducted. Result: A person’s personality significantly shapes his or her thinking and actions, which in turn influences his or her behavior in software development teams. It has been shown that team performance and satisfaction can be strongly influenced by personality. The quality of communication and the likelihood of conflict can also be attributed to personality.
This paper presents an approach for reducing the cognitive load for humans working in quality control (QC) for production processes that adhere to the 6σ -methodology. While 100% QC requires every part to be inspected, this task can be reduced when a human-in-the-loop QC process gets supported by an anomaly detection system that only presents those parts for manual inspection that have a significant likelihood of being defective. This approach shows good results when applied to image-based QC for metal textile products.
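The routing logic of such a human-in-the-loop QC process can be sketched as follows. This is a toy illustration with a made-up anomaly score in [0, 1] and a hypothetical threshold, not the detection system used in the paper:

```python
def triage(parts, score, threshold=0.3):
    """Route only suspicious parts to manual inspection.

    `score` maps a part to an anomaly score in [0, 1]; parts scoring
    below `threshold` are auto-passed, the rest go to a human inspector.
    """
    auto_pass, manual = [], []
    for part in parts:
        (manual if score(part) >= threshold else auto_pass).append(part)
    return auto_pass, manual

# Toy scores standing in for an image-based anomaly detector.
scores = {"p1": 0.05, "p2": 0.72, "p3": 0.10, "p4": 0.31}
ok, inspect = triage(scores, scores.get)
print(inspect)  # → ['p2', 'p4']
```

The cognitive-load reduction comes from the size of `inspect` relative to the full part stream; the threshold trades inspection effort against the risk of auto-passing a defective part.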
Digital forensics of smartphones is of utmost importance in many criminal cases. As modern smartphones store chats, photos, videos etc. that can be relevant for investigations and as they can have storage capacities of hundreds of gigabytes, they are a primary target for forensic investigators. However, it is exactly this large amount of data that is causing problems: extracting and examining the data from multiple phones seized in the context of a case is taking more and more time. This bears the risk of wasting a lot of time with irrelevant phones while there is not enough time left to analyze a phone which is worth examination. Forensic triage can help in this case: Such a triage is a preselection step based on a subset of data and is performed before fully extracting all the data from the smartphone. Triage can accelerate subsequent investigations and is especially useful in cases where time is essential. The aim of this paper is to determine which and how much data from an Android smartphone can be made directly accessible to the forensic investigator – without tedious investigations. For this purpose, an app has been developed that can be used with extremely limited storage of data in the handset and which outputs the extracted data immediately to the forensic workstation in a human- and machine-readable format.
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning, or lighting. This paper analyses and evaluates state-of-the-art KNX devices from the manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it is investigated whether publicly known vulnerabilities, such as insecure storage of passwords in software, unencrypted communication, or denial-of-service attacks, can be reproduced in new devices. On the other hand, the security is analyzed in general, leading to the discovery of a previously unknown, high-risk vulnerability related to so-called BCU (authentication) keys.
Nowadays, the most commonly used devices for recording videos or capturing images are undoubtedly smartphones. Our work investigates the application of source camera identification to mobile phones. We present a dataset collected entirely with mobile phones. The dataset contains both still images and videos captured by 67 different smartphones. Part of the images consists of photos of uniform backgrounds, collected specifically for the computation of the RSPN. Identifying the source camera of a video is particularly challenging due to the strong video compression. The experiments reported in this paper show the large variation in performance when testing a highly accurate technique on still images and on videos.
Modern driver assistance systems are evolving from pure driver assistance into independently acting automation systems. Still, these systems do not cover the full vehicle usage range, also called the operational design domain, and therefore require the human driver as a fall-back mechanism. Transitions of control and potential minimum risk manoeuvres are current research topics and will bridge the gap until fully autonomous vehicles are available. The authors showed in a demonstration that transition of control mechanisms can be further improved by the use of communication technology. Receiving incident type and position information via standardised vehicle-to-everything (V2X) messages can improve driver safety and comfort. The connected and automated vehicle's software framework can use this information to plan areas where the driver should take back control, initiating a transition of control which can be followed by a minimum risk manoeuvre in case of an unresponsive driver. This transition of control has been implemented in a test vehicle and was presented to the public during IEEE IV2022 (IEEE Intelligent Vehicles Symposium) in Aachen, Germany.
The work in modern open-pit and underground mines requires the transportation of large amounts of resources between fixed points. The navigation to these fixed points is a repetitive task that can be automated. The challenge in automating the navigation of vehicles commonly used in mines is the systemic properties of such vehicles. Many mining vehicles, such as the one we have used in the research for this paper, use steering systems with an articulated joint bending the vehicle’s drive axis to change its course and a hydraulic drive system to actuate axial drive components or the movements of tippers if available. To address the difficulties of controlling such a vehicle, we present a model-predictive approach for controlling the vehicle. While the control optimisation based on a parallel error minimisation of the predicted state has already been established in the past, we provide insight into the design and implementation of an MPC for an articulated mining vehicle and show the results of real-world experiments in an open-pit mine environment.
This paper addresses the pixel-based recognition of 3D objects with bidirectional associative memories. Computational power and memory requirements for this approach are identified and compared to the performance of current computer architectures by benchmarking different processors. It is shown that the performance of special-purpose hardware, like neurocomputers, is between one and two orders of magnitude higher than the performance of mainstream hardware. On the other hand, the calculation of small neural networks is performed more efficiently on mainstream processors. Based on these results a novel concept is developed, which is tailored for the efficient calculation of bidirectional associative memories. The computational efficiency is further enhanced by the application of algorithms and storage techniques which are matched to the characteristics of the application at hand.
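The core computation being benchmarked can be sketched with a toy bidirectional associative memory. This is our own minimal version, not the paper's special-purpose implementation: bipolar pattern pairs are stored Hebbian-style in one weight matrix, and recall is a single matrix-vector pass.

```python
import numpy as np

def bam_train(pairs):
    """Hebbian weight matrix W = sum_i x_i y_i^T for bipolar pattern pairs."""
    return sum(np.outer(x, y) for x, y in pairs)

def bam_recall(W, x):
    """One forward pass of the BAM: y = sign(x W), ties broken to +1."""
    y = np.sign(x @ W)
    y[y == 0] = 1
    return y

# Two orthogonal input patterns and their associated outputs.
x1, y1 = np.array([1, -1, 1, -1]), np.array([1, 1, -1])
x2, y2 = np.array([-1, -1, 1, 1]), np.array([-1, 1, 1])
W = bam_train([(x1, y1), (x2, y2)])
print(bam_recall(W, x1))  # → [ 1  1 -1]
```

The recall step is one dense matrix-vector product per direction, which is why the workload maps so directly onto the throughput of the benchmarked processors.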
This paper addresses the pixel-based classification of three-dimensional objects from arbitrary views. To perform this task a coding strategy for pixel data, inspired by the biological model of human vision, is described. The coding strategy ensures that the input data is invariant against shift, scale and rotation of the object in the input domain. The image data is used as input to a class of self-organizing neural networks, the Kohonen maps or self-organizing feature maps (SOFM). To verify this approach two test sets have been generated: the first set, consisting of artificially generated images, is used to examine the classification properties of the SOFMs; the second test set examines the clustering capabilities of the SOFM when real-world image data is applied to the network after it has been preprocessed to be invariant against shift, scale and rotation. It is shown that the clustering capability of the SOFM is strongly dependent on the invariance coding of the images.
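The abstract does not spell out the invariance coding; as an illustration of the general idea, the magnitude of the 2D Fourier spectrum is a classic shift-invariant representation of an image (scale and rotation invariance would additionally require something like a log-polar mapping, omitted here). A minimal sketch:

```python
import numpy as np

def shift_invariant_code(img):
    """Magnitude of the 2D DFT: identical for all circular shifts of img,
    since shifting only changes the phase of the spectrum."""
    return np.abs(np.fft.fft2(img))

rng = np.random.default_rng(0)
img = rng.random((16, 16))
shifted = np.roll(img, shift=(3, 5), axis=(0, 1))
print(np.allclose(shift_invariant_code(img), shift_invariant_code(shifted)))  # → True
```

Feeding such a code instead of raw pixels into a SOFM is what makes clusters correspond to object identity rather than object position.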
The aim of the AXON2 project (Adaptive Expert System for Object Recognition using Neural Networks) is the development of an object recognition system (ORS) capable of recognizing isolated 3D objects from arbitrary views. Commonly, classification is based on a single feature extracted from the original image. Here we present an architecture adapted from the Mixtures of Experts algorithm which uses multiple neural networks to integrate different features. During training, each neural network specializes in a subset of objects or object views appropriate to the properties of the corresponding feature space. In recognition mode the system dynamically chooses the most relevant features and combines them with maximum efficiency. The remaining, less relevant features are not computed and therefore do not decelerate the recognition process. Thus, the algorithm is well suited for real-time applications.
In this paper we report on CO2 Meter, a do-it-yourself carbon dioxide measuring device for the classroom. Part of the current measures for dealing with the SARS-CoV-2 pandemic is proper ventilation in indoor settings. This is especially important in schools, with students coming back to the classroom even at high incidence rates. Static ventilation patterns do not consider the individual situation of a particular class. Influencing factors like the type of activity, the physical structure or the room occupancy are not incorporated. Also, existing devices are rather expensive and often provide only limited information, and only locally without any networking. This leaves the potential of analysing the situation across different settings untapped. The carbon dioxide level can be used as an indicator of air quality in general, and of aerosol load in particular. Since, according to the latest findings, SARS-CoV-2 is transmitted primarily in the form of aerosols, carbon dioxide may be used as a proxy for the risk of a virus infection. Hence, schools could improve the indoor air quality and potentially reduce the infection risk if they actually had measuring devices available in the classroom. Our device supports schools in ventilation and allows for collecting data over the Internet to enable detailed data analysis and model generation. First deployments in schools at different levels were received very positively. A pilot installation with a larger data collection and analysis is underway.
Existing residential buildings have an average lifetime of 100 years. Many of these buildings will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project has already created a low-power design for a combined CO2, temperature and humidity measurement device that can easily be integrated into most home automation systems.
This paper describes the potential for developing a digital twin of society: a dynamic model that can be used to observe, analyze, and predict the evolution of various societal aspects. Such a digital twin can help governmental agencies and policy makers in interpreting trends, understanding challenges, and making decisions regarding investments or policies necessary to support societal development and ensure future prosperity. The paper reviews related work regarding the digital twin paradigm and its applications. The paper presents a motivating case study, an analysis of opportunities and challenges faced by the German federal employment agency, Bundesagentur für Arbeit (BA), proposes solutions using digital twins, and describes initial proofs of concept for such solutions.
How does the implementation of a next generation network influence a telecommunication company?
(2009)
As the potential of a Next Generation Network (NGN) is recognized, telecommunication companies consider switching to it. Although the implementation of an NGN seems to be merely a modification of the network infrastructure, it may trigger or require changes in the whole company and even influence the company strategy. To capture the effects of NGN we propose a framework based on concepts of business engineering and technical recommendations for the introduction of NGN technology. The specific design of solutions for the layers "Strategy", "Processes" and "Information Systems" as well as their interdependencies are an essential characteristic of the developed framework. We have performed a case study on NGN implementation and observed that all layers captured by our framework are influenced by the introduction of an NGN.
Market changes have forced telecommunication companies to transform their business. Increased competition, short innovation cycles, changed usage patterns, increased customer expectations and cost reduction are the main drivers. Our objective is to analyze to what extent transformation projects have improved the orientation towards the end-customers. Therefore, we selected 38 real-life case studies that deal with customer orientation. Our analysis is based on a telecommunication-specific framework that aligns strategy, business processes and information systems. The result of our analysis shows the following: transformation projects that aim to improve customer orientation are combined with clear goals on costs and revenue of the enterprise. These projects are usually directly linked to the customer touch points, but also to the development and provisioning of products. Furthermore, the analysis shows that customer orientation is not the sole trigger for transformation. There is no one-size-fits-all solution; rather, improved customer orientation needs aligned changes of business processes as well as information systems related to different parts of the company.
Development of a subject-oriented reference process model for the telecommunications industry
(2016)
Generally, the usage of reference models can be structured top-down or bottom-up. The practical need for agile change and flexible organizational implementation requires a consistent mapping to an operational level. In this context, well-established reference process models are typically structured top-down. Subject-oriented Business Process Management (sBPM) offers a modeling concept that is structured bottom-up and concentrates on the process actors at an operational level. This paper applies sBPM to the enhanced Telecom Operations Map (eTOM), a well-accepted reference process model in the telecommunications industry. The resulting design artifact is a concrete example of a combination of a bottom-up and a top-down developed reference model. The results are evaluated and confirmed in a practical context through the involvement of the industry body TM Forum.
The telecommunications industry is currently going through a major transformation. In this context, the enhanced Telecom Operations Map (eTOM) is a domain-specific process reference model that is offered by the industry organization TM Forum. In practice, eTOM is well accepted and confirmed as de facto standard. It provides process definitions and process flows on different levels of detail. This article discusses the reference modeling of eTOM, i.e., the design, the resulting artifact, and its evaluation based on three project cases. The application of eTOM in three projects illustrates the design approach and concrete models on strategic and operational levels. The article follows the Design Science Research (DSR) paradigm. It contributes with concrete design artifacts to the transformational needs of the telecommunications industry and offers lessons-learned from a general DSR perspective.
The continuing growth of scientific publications raises the question of how research processes can be digitalized and thus realized more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. For the search process various information systems exist. However, the analysis of the published content is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype by using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through the presentation layer of existing application systems. For this simple emulation of user input and output by software robots, no changes to the systems and architecture are required. However, considering strategic aspects of aligning business and technology on an enterprise level as well as the growing capabilities of RPA driven by artificial intelligence, interrelations between RPA and Enterprise Architecture (EA) become visible and pose new questions. In this paper we discuss the relationship between RPA and EA in terms of perspectives and implications. As work-in-progress, we focus on identifying new questions and research opportunities related to RPA and EA.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through a simple emulation of user input and output by software robots. Hence, it can be assumed that no changes to the used software systems and the existing Enterprise Architecture (EA) are required. In this short, practical paper we discuss this assumption based on a real-life implementation project. We show that a successful RPA implementation might require architectural work during analysis, implementation, and migration. As a practical paper, we focus on exemplary lessons learned and new questions related to RPA and EA.
Digital twins enable the modeling and simulation of real-world entities (objects, processes or systems), resulting in improvements in the associated value chains. The emerging field of quantum computing holds tremendous promise for evolving this virtualization towards Quantum (Digital) Twins (QDT) and ultimately Quantum Twins (QT). The quantum (digital) twin concept is not a contradiction in terms, but instead describes a hybrid approach that can be implemented using the technologies available today by combining classical computing and digital twin concepts with quantum processing. This paper presents the status quo of research and practice on quantum (digital) twins. It also discusses their potential to create competitive advantage through real-time simulation of highly complex, interconnected entities that helps companies better address changes in their environment and differentiate their products and services.
In this paper research activities developed within the FutureCom project are presented. The project, funded by the European Metrology Programme for Innovation and Research (EMPIR), aims at evaluating and characterizing: (i) active devices, (ii) signal- and power integrity of field programmable gate array (FPGA) circuits, (iii) operational performance of electronic circuits in real-world and harsh environments (e.g. below and above ambient temperatures and at different levels of humidity), (iv) passive inter-modulation (PIM) in communication systems considering different values of temperature and humidity corresponding to the typical operating conditions that we can experience in real-world scenarios. An overview of the FutureCom project is provided here, then the research activities are described.
Many factors make today's software development more and more complex, such as time pressure, new technologies, and IT security risks. Thus, good preparation of current as well as future software developers, in terms of a good software engineering education, becomes progressively important. As current research shows, Competence Developing Games (CDGs) and Serious Games can offer a potential solution.
This paper identifies the necessary requirements for CDGs to be conducive in principle, but especially in software engineering (SE) education. For this purpose, the current state of research was summarized in the context of a literature review. Afterwards, some of the identified requirements as well as some additional requirements were evaluated by a survey in terms of subjective relevance.
This paper covers the use of the magnetic Wiegand effect to design an innovative incremental encoder. First, a theoretical design is given, followed by an estimation of the achievable accuracy and an optimization in open-loop operation.
Finally, a successful experimental verification is presented. For this purpose, a permanent magnet synchronous machine is controlled in a field-oriented manner, using the angle information of the prototype.
Cybersecurity of Industrial Control Systems (ICS) is an important issue, as ICS incidents may have a direct impact on the safety of people or the environment. At the same time the awareness and knowledge about cybersecurity, particularly in the context of ICS, is alarmingly low. Industrial honeypots offer a cheap and easy-to-implement way to raise cybersecurity awareness and to educate ICS staff about typical attack patterns. When integrated in a productive network, industrial honeypots may not only reveal attackers early but may also distract them from the actually important systems of the network. By implementing multiple honeypots as a honeynet, the systems can be used to emulate or simulate a whole Industrial Control System. This paper describes a network of honeypots emulating HTTP, SNMP, S7 communication and the Modbus protocol using Conpot, IMUNES and SNAP7. The nodes mimic SIMATIC S7 programmable logic controllers (PLCs), which are widely used across the globe. The deployed honeypots' features are compared with the features of real SIMATIC S7 PLCs. Furthermore, the honeynet has been made publicly available for ten days and the occurring cyberattacks have been analyzed.
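The low-interaction honeypot principle described above can be illustrated with a toy stand-in (this is not Conpot and speaks no real S7 or Modbus protocol): a listener that presents a fake device banner and records every connection attempt for later analysis.

```python
import socket
import threading

def run_honeypot(host="127.0.0.1", port=0, banner=b"S7 PLC\n", log=None):
    """Minimal low-interaction honeypot sketch: accept one connection,
    record the peer address, send a fake device banner, and close.
    Port 0 lets the OS pick a free port; the chosen port is returned."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def serve():
        conn, addr = srv.accept()
        if log is not None:
            log.append(addr)      # the "attack log" of this toy honeypot
        conn.sendall(banner)
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]

# Simulate an attacker probing the honeypot.
log = []
port = run_honeypot(log=log)
with socket.create_connection(("127.0.0.1", port)) as c:
    print(c.recv(16))  # → b'S7 PLC\n'
print(len(log))  # → 1
```

A real industrial honeypot additionally emulates protocol semantics and device identity, which is exactly what tools like Conpot provide on top of this basic accept-log-respond loop.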
This paper introduces a Competence Developing Game (CDG) for the purpose of a cybersecurity awareness training for businesses. The target audience will be discussed in detail to understand their requirements. It will be explained why and how a mix of business simulation and serious game meets these stakeholder requirements. It will be shown that a tablet- and touchscreen-based approach is the most suitable solution. In addition, an empirical study will be briefly presented. The study was carried out to examine how an interaction system for a 3D tablet-based CDG has to be designed to be manageable for non-game-experienced employees. Furthermore, it will be explained which serious content is necessary for a cybersecurity awareness training CDG and how this content is wrapped in the game.
Competence Developing Games (CDGs) are a new concept of how to think about games with serious intentions. To address this topic, a new framework has been developed that is basically based on learning and motivation theories. This 'motivational Competence Developing Game Framework' demonstrates how these theories can be used in a CDG development process. The theoretical derivation and use of the framework are explained in this paper.
During the development of a Competence Developing Game's (CDG) story it is indispensable to understand the target audience. Thereby, CDG stories represent more than just the plot: the story comprises the setting, the characters and the plot. As a toolkit to support the development of such a story, this paper introduces the User-Focused Storybuilding (short UFoS) Framework for CDGs. The Framework and its utilization will be explained, followed by a description of its development and derivation, including an empirical study. In addition, to simplify the Framework's use regarding the CDG's target audience, a new concept of Nine Psychographic Player Types will be explained. This concept of Player Types provides an approach to handle the differences between players during the use of the UFoS Framework. Thereby, this article presents a unique approach to the development of target-group-differentiated CDG stories.
In this article we describe an Internet-of-Things sensing device with a wireless interface which is powered by the often-overlooked harvesting method of the Wiegand effect. The sensor can determine position, temperature or other resistively measurable quantities and can transmit the data via an ultra-low-power ultra-wideband (UWB) data transmitter. With this approach we can acquire, process, and wirelessly transmit data in a pulsed, energy-self-sufficient operation. A proof-of-concept system was built to prove the feasibility of the approach. The energy consumption of the system is analyzed, traced back in detail to the individual components, compared to the generated energy, and processed to identify further optimization options. Based on the proof-of-concept, an application demonstrator was developed. Finally, we point out possible use cases.
In this paper we investigate the use of deep neural networks for 3D object detection in uncommon, unstructured environments such as an open-pit mine. While neural nets are frequently used for object detection in regular autonomous driving applications, more unusual driving scenarios aside from street traffic pose additional challenges. For one, the collection of appropriate data sets to train the networks is an issue. For another, testing the performance of trained networks often requires tailored integration with the particular domain as well. While different solutions for these problems exist in regular autonomous driving, there are only very few approaches that work just as well for special domains. We address both of these challenges in this work. First, we discuss two possible ways of acquiring data for training and evaluation: we evaluate a semi-automated annotation of recorded LIDAR data and we examine synthetic data generation. Using these datasets, we train and test different deep neural networks for the task of object detection. Second, we propose a possible integration of a ROS2 detector module for an autonomous driving platform. Finally, we present the performance of three state-of-the-art deep neural networks for 3D object detection on a synthetic dataset and a smaller one containing a characteristic object from an open-pit mine.
Water suppliers are faced with the great challenge of achieving high-quality and, at the same time, low-cost water supply. Since climatic and demographic influences will pose further challenges in the future, the resilience enhancement of water distribution systems (WDS), i.e. the enhancement of their capability to withstand and recover from disturbances, has been in particular focus recently. To assess the resilience of WDS, graph-theoretical metrics have been proposed. In this study, a promising approach is first physically derived analytically and then applied to assess the resilience of the WDS for a district in a major German city. The topology-based resilience index, computed for every consumer node, takes into consideration the resistance of the best supply path as well as alternative supply paths. This resistance of a supply path is derived to be the dimensionless pressure loss in the pipes making up the path. The conducted analysis of an existing WDS provides insight into the process of actively influencing the resilience of WDS locally and globally by adding pipes. The study shows that especially pipes added close to the reservoirs and main branching points in the WDS result in a high resilience enhancement of the overall WDS.
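The graph-theoretic ingredient of such an index, the resistance of the best supply path, can be sketched as a shortest-path search over pipes weighted by their dimensionless pressure loss. The network and the weights below are made up for illustration; the actual index definition is in the paper.

```python
import heapq

def best_path_resistance(edges, source, target):
    """Dijkstra over an undirected pipe network; edge weights are the
    dimensionless pressure losses, so the shortest path is the best
    supply path and its length is that path's resistance."""
    graph = {}
    for u, v, w in edges:
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Toy network: reservoir R, junctions J1/J2, consumer C.
pipes = [("R", "J1", 0.2), ("R", "J2", 0.5), ("J1", "C", 0.4), ("J2", "C", 0.3)]
print(round(best_path_resistance(pipes, "R", "C"), 3))  # → 0.6
```

Running the search once per consumer node yields the per-node values the index builds on; adding a pipe and recomputing shows how local changes propagate into the global resilience picture.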
In times of planned obsolescence the demand for sustainability keeps growing. Ideally, a technical system is highly reliable, without failures and downtimes due to fast wear of single components. At the same time, maintenance should preferably be limited to pre-defined time intervals. Dispersion of load between multiple components can increase a system's reliability and thus its availability between maintenance points. However, this also results in higher investment costs and additional effort due to higher complexity. Given a specific load profile and the resulting wear of components, it is often unclear which system structure is the optimal one. Technical Operations Research (TOR) finds an optimal structure balancing availability and effort. We present our approach by designing a hydrostatic transmission system.
The understanding that optimized components do not automatically lead to energy-efficient systems shifts the attention from the single component to the entire technical system. At TU Darmstadt, a new field of research named Technical Operations Research (TOR) originated. It combines mathematical and technical know-how for the optimal design of technical systems. We illustrate our optimization approach in a case study on the design of a ventilation system, aiming to minimize the energy consumption for a temporal distribution of diverse load demands. By combining scaling laws with our optimization methods, we find the optimal combination of fans and show the advantage of using multiple fans.
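The scaling laws referred to here are, for fans, the classic affinity laws. A minimal sketch of how one point of a fan family scales with speed and impeller diameter; the operating-point numbers are purely illustrative:

```python
def scale_fan(Q0, dp0, P0, n_ratio, d_ratio=1.0):
    """Fan affinity laws for a geometrically similar family:
    flow ~ n * d^3, pressure rise ~ n^2 * d^2, shaft power ~ n^3 * d^5,
    where n is the speed ratio and d the impeller-diameter ratio.
    """
    Q = Q0 * n_ratio * d_ratio**3
    dp = dp0 * n_ratio**2 * d_ratio**2
    P = P0 * n_ratio**3 * d_ratio**5
    return Q, dp, P

# Halving the speed halves the flow but cuts shaft power to one eighth --
# the reason several part-load fans can beat one large fan energetically.
Q, dp, P = scale_fan(Q0=2.0, dp0=500.0, P0=1200.0, n_ratio=0.5)
print(Q, dp, P)  # → 1.0 125.0 150.0
```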
Energy-efficient components do not automatically lead to energy-efficient systems. Technical Operations Research (TOR) shifts the focus from the single component to the system as a whole and finds its optimal topology and operating strategy simultaneously. In previous works, we provided a preselected construction kit of suitable components for the algorithm. This approach may give rise to a combinatorial explosion if the preselection cannot be cut down to a reasonable number by human intuition. To reduce the number of discrete decisions, we integrate laws derived from similarity theory into the optimization model. Since the physical characteristics of a production series are similar, it can be described by affinity and scaling laws. Making use of these laws, our construction kit can be modeled more efficiently: instead of a preselection of components, it now encompasses whole model ranges. This allows us to significantly increase the number of possible set-ups in our model. In this paper, we show how to embed this new formulation into a mixed-integer program and assess the run time via benchmarks. We demonstrate our approach on a ventilation system design problem.
Finding a good system topology with more than a handful of components is a highly non-trivial task. The system needs to be able to fulfil all expected load cases, but at the same time the components should interact in an energy-efficient way. An example of a system design problem is the layout of the drinking water supply of a residential building. It may be reasonable to choose a design of spatially distributed pumps connected by pipes in at least two dimensions. This leads to a large variety of possible system topologies. To solve such problems in a reasonable time frame, the nonlinear technical characteristics must be modelled as simply as possible while still achieving a sufficiently good representation of reality. The aim of this paper is to compare the speed and reliability of a selection of leading mathematical programming solvers on a set of varying model formulations. This gives us empirical evidence on which combinations of model formulation and solver package are the means of choice given the current state of the art.
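One of the modelling trade-offs at stake here, representing a nonlinear characteristic "as simply as possible", is commonly handled by piecewise linearization, which turns an MINLP formulation into a MILP one. A minimal sketch with an assumed quadratic characteristic dp(Q) = a − b·Q² (coefficients and breakpoint count are illustrative):

```python
# Quadratic pump/pipe characteristic (illustrative coefficients).
def dp_exact(Q, a=50.0, b=2.0):
    return a - b * Q * Q

def make_pwl(f, lo, hi, segments):
    """Sample f at equidistant breakpoints and interpolate linearly between them."""
    xs = [lo + (hi - lo) * i / segments for i in range(segments + 1)]
    ys = [f(x) for x in xs]
    def approx(q):
        for i in range(segments):
            if xs[i] <= q <= xs[i + 1]:
                t = (q - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])
        raise ValueError("q outside domain")
    return approx

dp_pwl = make_pwl(dp_exact, 0.0, 4.0, segments=4)
# Worst-case approximation error over the domain, sampled on a fine grid.
err = max(abs(dp_exact(q / 10) - dp_pwl(q / 10)) for q in range(41))
print(round(err, 3))  # → 0.5 (= b * h^2 / 4 for segment width h = 1)
```

Doubling the number of segments quarters this error but adds binary variables to the MILP, which is exactly the formulation/solver trade-off the benchmarks probe.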
The UN sets the goal to ensure access to water and sanitation for all people by 2030. To address this goal, we present a multidisciplinary approach for designing water supply networks for slums in large cities by applying mathematical optimization. The problem is modeled as a mixed-integer linear problem (MILP) aiming to find a network describing the optimal supply infrastructure. To illustrate the approach, we apply it to a small slum cluster in Dhaka, Bangladesh.
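The full MILP requires a solver, but the flavour of the network-design task can be illustrated by a simplified stand-in: a greedy minimum-cost tree (Prim's algorithm) connecting a source to all demand clusters over candidate pipes. The nodes, candidate pipes and costs below are hypothetical:

```python
import heapq

# Hypothetical candidate pipes between a water source and demand clusters,
# weighted by construction cost (illustrative numbers).
cost = {
    ("source", "c1"): 4, ("source", "c2"): 7,
    ("c1", "c2"): 2, ("c1", "c3"): 6, ("c2", "c3"): 3,
}

def adj(node):
    for (u, v), w in cost.items():
        if u == node:
            yield w, v
        elif v == node:
            yield w, u

def prim(start, n_nodes):
    """Greedy minimum-cost tree connecting all nodes (Prim's algorithm)."""
    seen, total = {start}, 0
    pq = list(adj(start))
    heapq.heapify(pq)
    while pq and len(seen) < n_nodes:
        w, v = heapq.heappop(pq)
        if v in seen:
            continue
        seen.add(v)
        total += w
        for e in adj(v):
            heapq.heappush(pq, e)
    return total

print(prim("source", 4))  # → 9 (pipes of cost 4 + 2 + 3)
```

Unlike this sketch, the actual MILP can also weigh pipe capacities, pressure constraints and demand satisfaction against each other, which is what makes a solver necessary.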
The overall energy efficiency of ventilation systems can be improved by considering not only single components but also the interplay between all parts of the system. With the help of the method "TOR" ("Technical Operations Research"), developed at the Chair of Fluid Systems at TU Darmstadt, it is possible to improve the energy efficiency of the whole system by considering all possible design choices programmatically. We demonstrate this systematic design approach using a ventilation system for buildings as an example.
We model the ventilation system as a Mixed-Integer Nonlinear Program (MINLP). Binary variables model the selection of different pipe diameters, and multiple fans are modeled with the help of scaling laws. The whole system is represented by a graph in which the edges represent the pipes and fans, and the nodes represent the air source for cooling and the sinks that have to be cooled. At the beginning, the human designer chooses a construction kit of suitable fans and pipes of different diameters, together with the load cases. These boundary conditions define a variety of possible system topologies, far too many to consider by hand. With the help of state-of-the-art solvers, however, it is possible to solve this MINLP.
In addition, we consider the effects of malfunctions in different components. To this end, we present a first approach to measuring the resilience of the example use case. Furthermore, we compare the conventional approach with designs that are more resilient. These more resilient designs are derived by extending the aforementioned model with further constraints that explicitly consider the resilience of the overall system. We show that this method makes it possible to design resilient systems already in the early design stage, and we compare the energy efficiency and resilience of the different system designs.
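For a toy construction kit, the discrete choice that the binary diameter variables encode can simply be enumerated. The sketch below picks the diameter with the lowest combined investment and energy cost; all coefficients, prices and the quadratic loss model are illustrative assumptions, not data from the study:

```python
# Toy construction kit (all values illustrative): a pipe diameter d [m] maps
# to a loss coefficient k with pressure loss dp = k * Q^2 * length, and to a
# purchase cost per metre.
diameters = {0.2: 900.0, 0.25: 300.0, 0.315: 100.0}   # d -> loss coefficient
pipe_cost = {0.2: 40.0, 0.25: 55.0, 0.315: 80.0}      # d -> cost per metre
Q, eta, length, price_kwh, hours = 0.6, 0.7, 10.0, 0.30, 20000

def lifecycle_cost(d):
    """Investment plus fan energy cost for one diameter choice."""
    dp = diameters[d] * Q * Q * length      # pressure loss [Pa]
    fan_power = dp * Q / eta / 1000.0       # electric power [kW]
    return pipe_cost[d] * length + fan_power * hours * price_kwh

best_d = min(diameters, key=lifecycle_cost)
print(best_d)  # → 0.315 (the largest diameter wins over this lifetime)
```

With several pipes, fans and load cases these choices interact, which is why the abstract's full model needs binary variables and an MINLP solver rather than independent enumeration.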
To supply all floors of tall buildings with water at sufficient pressure, booster stations, usually consisting of several parallel pumps in the basement, are used. In this work, we demonstrate the potential of a decentralized pump topology for energy savings in water supply systems of skyscrapers. We present an approach based on Mixed-Integer Nonlinear Programming that allows choosing an optimal network topology and optimal pumps from a predefined construction kit comprising different pump types. Using domain-specific scaling laws and Latin Hypercube Sampling, we generate different input sets of pump types and compare their impact on the efficiency and cost of the total system design. As a realistic application example, we consider a hotel building with 325 rooms, 12 floors and up to four pressure zones.
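Latin Hypercube Sampling can be sketched without any libraries: each dimension is split into n strata, and every stratum receives exactly one sample. The mapping of the unit samples to pump parameters (head and design-flow ranges) is a hypothetical example, not the construction kit from the paper:

```python
import random

def latin_hypercube(n, dims, seed=0):
    """n samples in [0,1)^dims with exactly one sample per stratum per axis."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)                          # random stratum order
        cols.append([(s + rng.random()) / n for s in strata])
    return list(zip(*cols))

# Map the unit hypercube to hypothetical pump-type parameters:
# head in 10..60 m, design flow in 1..10 l/s (illustrative ranges).
kit = [(10 + 50 * h, 1 + 9 * q) for h, q in latin_hypercube(5, 2)]
for head, flow in kit:
    print(round(head, 1), round(flow, 2))
```

Compared to plain random sampling, this guarantees that the generated pump kits cover the whole head and flow ranges evenly even for small n.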
The paper industry is the industry with the third highest energy consumption in the European Union. Using recycled paper instead of fresh fibers for papermaking is less energy consuming and saves resources. However, adhesive contaminants in recycled paper are particularly problematic since they reduce the quality of the resulting paper product. To remove as many contaminants as possible while retaining as many valuable fibers as possible, fine screening systems consisting of multiple interconnected pressure screens are used. Choosing the best configuration is a non-trivial task: the screens can be interconnected in several ways, and suitable screen designs as well as operational parameters have to be selected. Additionally, one has to face conflicting objectives. In this paper, we present an approach for the multi-criteria optimization of pressure screen systems based on Mixed-Integer Nonlinear Programming. We specifically focus on a clear representation of the trade-off between different objectives.
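Representing the trade-off between conflicting objectives usually amounts to computing a Pareto front: a configuration is only kept if no other configuration is at least as good in both objectives. A minimal sketch with hypothetical (fiber-loss, contaminant-passage) outcomes, both to be minimised:

```python
def pareto_front(points):
    """Keep points that are not dominated in both (minimised) objectives."""
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical screen-system configurations: (fiber loss, contaminant passage).
configs = [(0.10, 0.30), (0.05, 0.50), (0.20, 0.20), (0.12, 0.35)]
print(pareto_front(configs))
# (0.12, 0.35) is dropped: (0.10, 0.30) is better in both objectives.
```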
Water suppliers are faced with the great challenge of achieving high-quality and, at the same time, low-cost water supply. In practice, the focus is set on the most beneficial maintenance measures and/or capacity adaptations of existing water distribution systems (WDS). Since climatic and demographic influences will pose further challenges in the future, the resilience enhancement of WDS, i.e. the enhancement of their capability to withstand and recover from disturbances, has recently come into particular focus. To assess the resilience of WDS, metrics based on graph theory have been proposed. In this study, a promising approach is applied to assess the resilience of the WDS of a district in a major German city. The conducted analysis provides insight into the process of actively influencing the resilience of WDS.
The development of resilient technical systems is a challenging task, as the system should adapt automatically to unknown disturbances and component failures. To evaluate different approaches for deriving resilient technical system designs, we developed a modular test rig based on a pumping system. On the basis of this example system, we present metrics to quantify resilience and an algorithmic approach to improve it. This approach enables the pumping system to react automatically to unknown disturbances and to reduce the impact of component failures. In this case, the system is able to adapt its topology automatically by activating additional valves, which enables it to still reach a minimum performance even in case of failures. Furthermore, time-dependent disturbances are evaluated continuously, and deviations from the original state are automatically detected and anticipated for the future. This makes it possible to reduce the impact of future disturbances and leads to more resilient system behaviour.
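The automatic detection of persistent deviations can be sketched as a simple threshold monitor that triggers a topology change (e.g. opening a backup valve) once a measurement stays outside its tolerance band for several consecutive samples. The nominal value, tolerance and persistence window below are hypothetical, not parameters of the test rig:

```python
# Hypothetical monitoring parameters: nominal pressure [bar], tolerance band,
# and the number of consecutive out-of-band samples before reacting.
NOMINAL, TOL, PERSIST = 2.0, 0.2, 3

def monitor(readings):
    """Return the sample index at which the backup valve would be opened,
    or None if the deviation never persists long enough."""
    run = 0
    for i, p in enumerate(readings):
        run = run + 1 if abs(p - NOMINAL) > TOL else 0
        if run >= PERSIST:
            return i
    return None

print(monitor([2.0, 2.1, 1.7, 1.6, 1.5, 1.6]))  # → 4
```

Requiring persistence before switching avoids reacting to single noisy samples while still bounding the time the system runs degraded.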
The course Physics for Electrical Engineering is part of the curriculum of the bachelor program Electrical Engineering at the University of Applied Sciences Aachen. Before COVID-19, the course was conducted in a rather traditional way, with all parts (lecture, exercise and lab) face-to-face. This teaching approach changed fundamentally within a week when the COVID-19 restrictions forced all courses into distance learning. All parts of the course were transformed to pure distance learning, including synchronous and asynchronous parts for the lecture, live online sessions for the exercises and self-paced labs at home. Using these methods, the course was able to impart the required knowledge and competencies. Taking into account the teacher’s observations of the students’ learning behaviour and engagement, the formal and informal feedback of the students, and the results of the exams, the new methods are evaluated with respect to effectiveness, sustainability and suitability for competence transfer. Based on this analysis, strong and weak points of the concept and countermeasures to address the weak points were identified. The analysis further leads to a sustainable teaching approach combining synchronous and asynchronous parts with self-paced learning times that can be used very flexibly for different learning scenarios: pure online, hybrid (a mixture of online and in-person elements) and pure in-person teaching.
The chemical industry is one of the most important industrial sectors in Germany in terms of manufacturing revenue. While thermodynamic boundary conditions often restrict the scope for reducing the energy consumption of core processes, secondary processes such as cooling offer scope for energy optimisation. In this contribution, we therefore model and optimise an existing cooling system. The technical boundary conditions of the model are provided by the operators, the German chemical company BASF SE. In order to systematically evaluate different degrees of freedom in topology and operation, we formulate and solve a Mixed-Integer Nonlinear Program (MINLP), and compare our optimisation results with the existing system.
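The operational degrees of freedom of such a cooling system resemble a small unit-commitment problem: choose which cooling units to run so the load is met at minimal electric power. For a toy kit this can be brute-forced; the unit data below are illustrative assumptions, not BASF data:

```python
from itertools import combinations

# Hypothetical cooling units: name -> (cooling capacity kW, electric power kW).
units = {"A": (300, 80.0), "B": (500, 120.0), "C": (200, 60.0)}
load = 600  # cooling demand [kW]

# Enumerate all unit subsets that cover the load; keep the cheapest to run.
best_mix = min(
    (combo for r in range(1, len(units) + 1)
     for combo in combinations(units, r)
     if sum(units[u][0] for u in combo) >= load),
    key=lambda combo: sum(units[u][1] for u in combo),
)
print(sorted(best_mix))  # → ['B', 'C'] (700 kW cooling at 180 kW electric)
```

The actual MINLP additionally optimises the network topology and continuous operating points, where brute force is no longer tractable and a solver is required.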