Conference Proceedings

Institute: Fachbereich Elektrotechnik und Informationstechnik (241)
Language: English (241)
Document Type: Conference Proceeding (241)
Keywords
- Enterprise Architecture (5)
- Engineering education (2)
- Engineering optimization (2)
- MINLP (2)
- Machine Learning (2)
- Robotic Process Automation (2)
- Serious Game (2)
- Ventilation System (2)
- Water distribution system (2)
- autonomous driving (2)
The continuing growth of scientific publications raises the question of how research processes can be digitalized and thus realized more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. Various information systems exist to support the search process; the analysis of the published content, however, is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
After a brief introduction of conventional laboratory structures, this work focuses on an innovative and universal approach to setting up a training laboratory for electric machines and drive systems. The novel approach employs a central 48 V DC bus, which forms the backbone of the structure. Several sets of DC machines, asynchronous machines and synchronous machines are connected to this bus. The advantages of the novel system structure are manifold, both from a didactic and a technical point of view: Student groups can work at their own performance level in a highly parallelized and at the same time individualized way. Additional training setups (similar or different) can easily be added. Only the total power dissipation has to be provided, i.e., the DC bus balances the power flow between the student groups. Comparative results of course evaluations of several cohorts of students are shown.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through the presentation layer of existing application systems. For this simple emulation of user input and output by software robots, no changes to the systems or architecture are required. However, considering strategic aspects of aligning business and technology on an enterprise level as well as the growing capabilities of RPA driven by artificial intelligence, interrelations between RPA and Enterprise Architecture (EA) become visible and pose new questions. In this paper, we discuss the relationship between RPA and EA in terms of perspectives and implications. As work in progress, we focus on identifying new questions and research opportunities related to RPA and EA.
This paper presents an approach for reducing the cognitive load of humans working in quality control (QC) for production processes that adhere to the 6σ methodology. While 100% QC requires every part to be inspected, the manual effort can be reduced when a human-in-the-loop QC process is supported by an anomaly detection system that presents for manual inspection only those parts that have a significant likelihood of being defective. This approach shows good results when applied to image-based QC for metal textile products.
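The gating idea described in the abstract can be sketched in a few lines. This is an illustrative example, not the authors' implementation: the anomaly scores, the threshold value and the function name are hypothetical, and a real system would obtain scores from a trained anomaly detection model on part images.

```python
def parts_for_manual_inspection(scores, threshold=0.8):
    """Return the indices of parts whose anomaly score is high enough
    to warrant manual inspection by the human in the loop."""
    return [i for i, s in enumerate(scores) if s >= threshold]

# Hypothetical per-part anomaly scores in [0, 1] from an inspection model.
scores = [0.05, 0.92, 0.10, 0.85, 0.30]
flagged = parts_for_manual_inspection(scores)
print(flagged)  # only these parts are shown to the inspector
```

The human workload thus shrinks from all five parts to the two suspicious ones, while every part still passes through the automated check.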
Digital twins enable the modeling and simulation of real-world entities (objects, processes or systems), resulting in improvements in the associated value chains. The emerging field of quantum computing holds tremendous promise for evolving this virtualization towards Quantum (Digital) Twins (QDT) and ultimately Quantum Twins (QT). The quantum (digital) twin concept is not a contradiction in terms, but instead describes a hybrid approach that can be implemented using the technologies available today by combining classical computing and digital twin concepts with quantum processing. This paper presents the status quo of research and practice on quantum (digital) twins. It also discusses their potential to create competitive advantage through real-time simulation of highly complex, interconnected entities that helps companies better address changes in their environment and differentiate their products and services.
Energy-efficient components do not automatically lead to energy-efficient systems. Technical Operations Research (TOR) shifts the focus from the single component to the system as a whole and finds its optimal topology and operating strategy simultaneously. In previous works, we provided a preselected construction kit of suitable components for the algorithm. This approach may give rise to a combinatorial explosion if the preselection cannot be cut down to a reasonable number by human intuition. To reduce the number of discrete decisions, we integrate laws derived from similarity theory into the optimization model. Since the physical characteristics within a production series are similar, the series can be described by affinity and scaling laws. Making use of these laws, our construction kit can be modeled more efficiently: Instead of a preselection of components, it now encompasses whole model ranges. This allows us to significantly increase the number of possible set-ups in our model. In this paper, we present how to embed this new formulation into a mixed-integer program and assess the run time via benchmarks. We demonstrate our approach using the example of a ventilation system design problem.
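The affinity laws mentioned above can be illustrated for a fan whose rotational speed changes. This is a minimal sketch of the classic speed-scaling relations (flow ∝ n, pressure ∝ n², power ∝ n³), not the paper's MINLP formulation; the function name and example numbers are assumptions for illustration.

```python
def scale_fan(flow, pressure, power, n_ratio):
    """Apply the fan affinity laws for a speed change n2/n1 = n_ratio:
    volume flow scales linearly, pressure quadratically,
    shaft power cubically with speed."""
    return (flow * n_ratio,
            pressure * n_ratio ** 2,
            power * n_ratio ** 3)

# Halving the speed of a fan rated at 1000 m3/h, 500 Pa, 2.0 kW:
q2, p2, w2 = scale_fan(flow=1000.0, pressure=500.0, power=2.0, n_ratio=0.5)
print(q2, p2, w2)  # flow halves, pressure drops to 1/4, power to 1/8
```

Because one characteristic curve thus describes a whole operating range (and, with diameter scaling, a whole model range), the optimization model can replace many discrete component choices with a few continuous scaling variables.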
In times of planned obsolescence, the demand for sustainability keeps growing. Ideally, a technical system is highly reliable, without failures and downtimes due to fast wear of single components. At the same time, maintenance should preferably be limited to pre-defined time intervals. Dispersion of load between multiple components can increase a system’s reliability and thus its availability between maintenance points. However, this also results in higher investment costs and additional effort due to higher complexity. Given a specific load profile and the resulting wear of components, it is often unclear which system structure is the optimal one. Technical Operations Research (TOR) finds an optimal structure balancing availability and effort. We present our approach by designing a hydrostatic transmission system.
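The availability gain from load dispersion across redundant components can be made concrete with the standard parallel-availability formula. This is a textbook illustration of the trade-off the abstract describes, not the paper's TOR model; component availabilities are hypothetical.

```python
from math import prod

def parallel_availability(avails):
    """Availability of a system that works as long as at least one of
    its redundant components works: A = 1 - prod(1 - a_i)."""
    return 1.0 - prod(1.0 - a for a in avails)

# One component with 90% availability vs. two such components in parallel:
single = parallel_availability([0.9])
redundant = parallel_availability([0.9, 0.9])
print(single, redundant)  # redundancy raises availability to 99%
```

The second component buys roughly an order of magnitude less downtime, but at the price of doubled investment, which is exactly the balance the optimization has to strike.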
Existing residential buildings have an average lifetime of 100 years, and many of these buildings will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into the existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project has already created a low-power design for a combined CO2, temperature and humidity measurement device that can be easily integrated into most home automation systems.
In this paper, research activities developed within the FutureCom project are presented. The project, funded by the European Metrology Programme for Innovation and Research (EMPIR), aims at evaluating and characterizing: (i) active devices, (ii) signal and power integrity of field programmable gate array (FPGA) circuits, (iii) operational performance of electronic circuits in real-world and harsh environments (e.g. below and above ambient temperatures and at different levels of humidity), and (iv) passive inter-modulation (PIM) in communication systems at different values of temperature and humidity corresponding to the typical operating conditions encountered in real-world scenarios. An overview of the FutureCom project is provided first; the research activities are then described.
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
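The idea behind the R-R Interval Correlation approach, reducing each ECG to its sequence of beat-to-beat intervals and finding the beat lag that best aligns the two series, can be sketched as follows. This is a simplified illustration under assumed inputs (lists of R-R intervals in seconds), not the paper's implementation; the function name and the error metric (mean absolute interval difference instead of correlation) are choices made for brevity.

```python
def best_rr_lag(rr_a, rr_b, max_lag=5):
    """Find the beat lag that best aligns two R-R interval series by
    minimizing the mean absolute interval difference over the overlap."""
    best_lag, best_err = 0, float("inf")
    for lag in range(-max_lag, max_lag + 1):
        # Pair rr_a[i] with rr_b[i - lag] wherever both indices exist.
        pairs = [(rr_a[i], rr_b[i - lag])
                 for i in range(len(rr_a))
                 if 0 <= i - lag < len(rr_b)]
        if len(pairs) < 3:
            continue  # too little overlap to judge alignment
        err = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag

# The second recording starts two beats later than the first:
rr_ref = [0.80, 0.82, 0.79, 0.85, 0.81, 0.78, 0.83]
print(best_rr_lag(rr_ref, rr_ref[2:]))  # recovers the 2-beat offset
```

Working on intervals rather than raw samples makes the alignment independent of the two devices' sampling rates and amplitudes, which is why such cyclic-feature methods mitigate discrepancies between recording systems.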