This chapter describes three general strategies to master uncertainty in technical systems: robustness, flexibility and resilience. It builds on the previous chapters about methods to analyse and identify uncertainty and may rely on the availability of technologies for particular systems, such as active components. Robustness aims for the design of technical systems that are insensitive to anticipated uncertainties. Flexibility increases the ability of a system to work under different situations. Resilience extends this characteristic by requiring a given minimal functional performance, even after disturbances or failure of system components, and it may incorporate recovery. The three strategies are described and discussed in turn. Moreover, they are demonstrated on specific technical systems.
Solution of plane anisotropic elastostatical boundary value problems by singular integral equations
(1982)
Smart pixel: photonic mixer device (PMD); new system concept of a 3D-imaging camera-on-a-chip
(1998)
The Volatility Framework is a collection of tools for the analysis of computer RAM. The framework offers a multitude of analysis options and is used by many investigators worldwide. Volatility currently comes with a command line interface only, which might be a hindrance for some investigators. In this paper we present a GUI and extensions for the Volatility Framework which, on the one hand, simplify the usage of the tool and, on the other hand, offer additional functionality such as storage of results in a database, shortcuts for long Volatility Framework command sequences, and entirely new commands based on correlation of the data stored in the database.
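The database-backed extensions described above can be illustrated with a minimal sketch. The plugin names below (pslist, psscan, pstree, netscan, connections) are real Volatility 2 plugins, but the shortcut names, table schema, and file names are purely illustrative, not the paper's actual implementation; the real tool would invoke the Volatility command line and capture its output where this sketch inserts a placeholder string.

```python
import sqlite3

# Hypothetical shortcuts: one short name expands to a sequence of
# Volatility plugin invocations (grouping here is illustrative).
SHORTCUTS = {
    "procs": ["pslist", "psscan", "pstree"],
    "net": ["netscan", "connections"],
}

def expand(command):
    """Expand a shortcut into its plugin sequence; pass plain plugins through."""
    return SHORTCUTS.get(command, [command])

def store_result(db, image, plugin, output):
    """Persist one plugin run so later queries can correlate results."""
    db.execute(
        "INSERT INTO results (image, plugin, output) VALUES (?, ?, ?)",
        (image, plugin, output),
    )

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (image TEXT, plugin TEXT, output TEXT)")

# Simulated run: the real GUI would execute each plugin against the
# memory image and capture its stdout here.
for plugin in expand("procs"):
    store_result(db, "memdump.raw", plugin, "<captured output>")

print([row[0] for row in db.execute("SELECT plugin FROM results")])
```

Storing each run in a relational table is what enables the correlation commands mentioned in the abstract: results of different plugins over the same image can then be joined with ordinary SQL.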
Short term effects of magnetic resonance imaging on excitability of the motor cortex at 1.5T and 7T
(2010)
Rationale and Objectives
The increasing spread of high-field and ultra-high-field magnetic resonance imaging (MRI) scanners has encouraged new discussion of the safety aspects of MRI. Few studies have been published on possible cognitive effects of MRI examinations. The aim of this study was to examine whether changes are measurable after MRI examinations at 1.5 and 7 T by means of transcranial magnetic stimulation (TMS).
Materials and Methods
TMS was performed in 12 healthy, right-handed male volunteers. First the individual motor threshold was specified, and then the cortical silent period (SP) was measured. Subsequently, the volunteers were exposed to the 1.5-T MRI scanner for 63 minutes using standard sequences. The MRI examination was immediately followed by another TMS session. Fifteen minutes later, TMS was repeated. Four weeks later, the complete setting was repeated using a 7-T scanner. Control conditions included lying in the 1.5-T scanner for 63 minutes without scanning and lying in a separate room for 63 minutes. TMS was performed in the same way in each case. For statistical analysis, Wilcoxon's rank test was performed.
Results
Immediately after MRI exposure, the SP was highly significantly prolonged in all 12 subjects at 1.5 and 7 T. The motor threshold was significantly increased. Fifteen minutes after the examination, the measured value tended toward normal again. Control conditions revealed no significant differences.
Conclusion
MRI examinations lead to a transient and highly significant alteration in cortical excitability. This effect does not seem to depend on the strength of the static magnetic field.
This article addresses the need for an innovative technique in plasma shaping, utilizing antenna structures, Maxwell’s laws, and boundary conditions within a shielded environment. The motivation lies in exploring a novel approach to efficiently generate high-energy density plasma with potential applications across various fields. Implemented in an E01 circular cavity resonator, the proposed method involves the use of an impedance and field matching device with a coaxial connector and a specially optimized monopole antenna. This setup feeds a low-loss cavity resonator, resulting in a high-energy density air plasma with a surface temperature exceeding 3500 °C, achieved with a minimal power input of 80 W. The argon plasma, resembling the shape of a simple monopole antenna with modeled complex dielectric values, offers a more energy-efficient alternative compared to traditional, power-intensive plasma shaping methods. Simulations using a commercial electromagnetic (EM) solver validate the design’s effectiveness, while experimental validation underscores the method’s feasibility and practical implementation. Analyzing various parameters in an argon atmosphere, including hot S-parameters and plasma beam images, the results demonstrate the successful application of this technique, suggesting its potential in coating, furnace technology, fusion, and spectroscopy applications.
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for the planning of the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization pre-requisites stated by the Process Virtualization Theory in the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand on changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
Existing residential buildings have an average lifetime of 100 years. Many of these buildings will exist for at least another 50 years. To increase the efficiency of these buildings while keeping costs at reasonable rates, they can be retrofitted with sensors that deliver information to central control units for heating, ventilation and electricity. This retrofitting process should happen with minimal intervention into existing infrastructure and requires new approaches for sensor design and data transmission. At FH Aachen University of Applied Sciences, students of different disciplines work together to learn how to design, build, deploy and operate such sensors. The presented teaching project already created a low-power design for a combined CO2, temperature and humidity measurement device that can be easily integrated into most home automation systems.
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning, or lighting. This paper analyses and evaluates state-of-the-art KNX devices from manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it is investigated if publicly known vulnerabilities like insecure storage of passwords in software, unencrypted communication, or denial-of-service attacks, can be reproduced in new devices. On the other hand, the security is analyzed in general, leading to the discovery of a previously unknown and high risk vulnerability related to so-called BCU (authentication) keys.
ICSs (Industrial Control Systems) and their subset, SCADA (Supervisory Control and Data Acquisition) systems, are exposed to a constant stream of new threats. The increasing importance of IT security in ICS requires viable methods to assess the security of an ICS, its individual components, and its protocols. This paper presents a security analysis focused on the communication protocols of a single PLC (Programmable Logic Controller). The PLC, a Beckhoff CX2020, is examined and new vulnerabilities of the system are revealed. Based on these findings, recommendations are made to improve the security of the Beckhoff system and its protocols.
Scattering Parameter Measurements of Microstrip Devices using the Double-LNN Calibration Technique
(1994)
SAR Simulations & Safety
(2017)
Safety of subjects during radiofrequency exposure in ultra-high-field magnetic resonance imaging
(2020)
Magnetic resonance imaging (MRI) is one of the most important medical imaging techniques. Since the introduction of MRI in the mid-1980s, there has been a continuous trend toward higher static magnetic fields to obtain, among other benefits, a higher signal-to-noise ratio. The step toward ultra-high-field (UHF) MRI at 7 Tesla and higher, however, creates several challenges regarding the homogeneity of the spin excitation RF transmit field and the RF exposure of the subject. In UHF MRI systems, the wavelength of the RF field is in the range of the diameter of the human body, which can result in inhomogeneous spin excitation and local SAR hotspots. To optimize the homogeneity in a region of interest, UHF MRI systems use parallel transmit systems with multiple transmit antennas and time-dependent modulation of the RF signal in the individual transmit channels. Furthermore, SAR increases with increasing field strength, while the SAR limits remain unchanged. Two different approaches to generate the RF transmit field in UHF systems using antenna arrays close and remote to the body are investigated in this letter. Achievable imaging performance is evaluated compared to typical clinical RF transmit systems at lower field strength. The evaluation has been performed under consideration of RF exposure based on local SAR and tissue temperature. Furthermore, results for thermal dose as an alternative RF exposure metric are presented.
Robotic process automation (RPA) has attracted increasing attention in research and practice. This chapter positions, structures, and frames the topic as an introduction to this book. RPA is understood as a broad concept that comprises a variety of concrete solutions. From a management perspective RPA offers an innovative approach for realizing automation potentials, whereas from a technical perspective the implementation based on software products and the impact of artificial intelligence (AI) and machine learning (ML) are relevant. RPA is industry-independent and can be used, for example, in finance, telecommunications, and the public sector. With respect to RPA this chapter discusses definitions, related approaches, a structuring framework, a research framework, and an inside as well as outside architectural view. Furthermore, it provides an overview of the book combined with short summaries of each chapter.
A new trend in automation is to deploy so-called cyber-physical systems (CPS) which combine computation with physical processes. The novel RoboCup Logistics League Sponsored by Festo (LLSF) aims at such CPS logistic scenarios in an automation setting. A team of robots has to produce products from a number of semi-finished products which they have to machine during the game. Different production plans are possible and the robots need to recycle scrap byproducts. This way, the LLSF is a very interesting league offering a number of challenging research questions for planning, coordination, or communication in an application-driven scenario. In this paper, we outline the objectives of the LLSF and present steps for developing the league further towards a benchmark for logistics scenarios for CPS. As a major milestone we present the new automated referee system which helps in governing the game play as well as keeping track of the scored points in a very complex factory scenario.
In this paper, research activities developed within the FutureCom project are presented. The project, funded by the European Metrology Programme for Innovation and Research (EMPIR), aims at evaluating and characterizing: (i) active devices, (ii) signal and power integrity of field programmable gate array (FPGA) circuits, (iii) operational performance of electronic circuits in real-world and harsh environments (e.g. below and above ambient temperatures and at different levels of humidity), (iv) passive inter-modulation (PIM) in communication systems considering different values of temperature and humidity corresponding to the typical operating conditions that we can experience in real-world scenarios. An overview of the FutureCom project is provided first; the research activities are then described.
As the field strength and, therefore, the operational frequency in MRI is increased, the wavelength approaches the size of the human head/body, resulting in wave effects, which cause signal decreases and dropouts. Several multichannel approaches have been proposed to try to tackle these problems, including RF shimming, where each element in an array is driven by its own amplifier and modulated with a certain (constant) amplitude and phase relative to the other elements, and Transmit SENSE, where spatially tailored RF pulses are used. In this article, a relatively inexpensive and easy to use imaging scheme for 7 Tesla imaging is proposed to mitigate signal voids due to B1 field inhomogeneity. Two time-interleaved images are acquired using a different excitation mode for each. By forming virtual receive elements, both images are reconstructed together using GRAPPA to achieve a more homogeneous image, with only small SNR and SAR penalty in head and body imaging at 7 Tesla.
Resilience as a concept has found its way into different disciplines to describe the ability of an individual or system to withstand and adapt to changes in its environment. In this paper, we provide an overview of the concept in different communities and extend it to the area of mechanical engineering. Furthermore, we present metrics to measure resilience in technical systems and illustrate them by applying them to load-carrying structures. By giving application examples from the Collaborative Research Centre (CRC) 805, we show how the concept of resilience can be used to control uncertainty during different stages of product life.
Water suppliers are faced with the great challenge of achieving high-quality and, at the same time, low-cost water supply. Since climatic and demographic influences will pose further challenges in the future, the resilience enhancement of water distribution systems (WDS), i.e. the enhancement of their capability to withstand and recover from disturbances, has been in particular focus recently. To assess the resilience of WDS, graph-theoretical metrics have been proposed. In this study, a promising approach is first physically derived analytically and then applied to assess the resilience of the WDS for a district in a major German City. The topology based resilience index computed for every consumer node takes into consideration the resistance of the best supply path as well as alternative supply paths. This resistance of a supply path is derived to be the dimensionless pressure loss in the pipes making up the path. The conducted analysis of a present WDS provides insight into the process of actively influencing the resilience of WDS locally and globally by adding pipes. The study shows that especially pipes added close to the reservoirs and main branching points in the WDS result in a high resilience enhancement of the overall WDS.
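The topology-based index described above can be sketched on a toy network. The graph, pipe losses, and the aggregation rule below are illustrative assumptions, not the study's actual formula: each supply path's resistance is the summed dimensionless pressure loss of its pipes, and the sketch simply sums the reciprocals of all simple-path resistances so that the best path dominates while alternative paths add smaller contributions.

```python
# Toy network: "R" is the reservoir; edge weights are the dimensionless
# pressure loss of each pipe (all values illustrative).
pipes = {
    ("R", "A"): 0.2, ("R", "B"): 0.3,
    ("A", "B"): 0.1, ("A", "C"): 0.4, ("B", "C"): 0.3,
}

def neighbors(node):
    for (u, v), w in pipes.items():
        if u == node:
            yield v, w
        elif v == node:
            yield u, w

def simple_paths(src, dst, path=None, loss=0.0):
    """Enumerate all simple supply paths, yielding each path's summed loss."""
    path = path or [src]
    if src == dst:
        yield loss
        return
    for nxt, w in neighbors(src):
        if nxt not in path:
            yield from simple_paths(nxt, dst, path + [nxt], loss + w)

def resilience_index(consumer, source="R"):
    # Illustrative aggregation: low-loss paths contribute most,
    # higher-loss alternatives add smaller terms.
    return sum(1.0 / loss for loss in simple_paths(source, consumer))

print(round(resilience_index("C"), 2))
```

Adding a pipe near the reservoir shortens many supply paths at once, which is consistent with the study's observation that pipes added close to reservoirs and main branching points yield the largest resilience gains.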
Many factors make today’s software development more and more complex, such as time pressure, new technologies, and IT security risks. Thus, a good preparation of current as well as future software developers in terms of a sound software engineering education becomes progressively important. As current research shows, Competence Developing Games (CDGs) and Serious Games can offer a potential solution.
This paper identifies the necessary requirements for CDGs to be conducive in principle, but especially in software engineering (SE) education. For this purpose, the current state of research was summarized in the context of a literature review. Afterwards, some of the identified requirements as well as some additional requirements were evaluated by a survey in terms of subjective relevance.
Objective
To investigate the feasibility of 7T MR imaging of the kidneys utilising a custom-built 8-channel transmit/receive radiofrequency body coil.
Methods
In vivo unenhanced MR was performed in 8 healthy volunteers on a 7T whole-body MR system. After B0 shimming the following sequences were obtained: 1) 2D and 3D spoiled gradient-echo sequences (FLASH, VIBE), 2) T1-weighted 2D in and opposed phase 3) True-FISP imaging and 4) a T2-weighted turbo spin echo (TSE) sequence. Visual evaluation of the overall image quality was performed by two radiologists.
Results
Renal MRI at 7T was feasible in all eight subjects. Best image quality was found using T1-weighted gradient echo MRI, providing high anatomical details and excellent conspicuity of the non-enhanced vasculature. With successful shimming, B1 signal voids could be effectively reduced and/or shifted out of the region of interest in most sequence types. However, T2-weighted TSE imaging remained challenging and strongly impaired because of signal heterogeneities in three volunteers.
Conclusion
The results demonstrate the feasibility and diagnostic potential of dedicated 7T renal imaging. Further optimisation of imaging sequences and dedicated RF coil concepts are expected to improve the acquisition quality and ultimately provide high clinical diagnostic value.
This book reflects the tremendous changes in the telecommunications industry in the course of the past few decades – shorter innovation cycles, stiffer competition and new communication products. It analyzes the transformation of processes, applications and network technologies that are now expected to take place under enormous time pressure. The International Telecommunication Union (ITU) and the TM Forum have provided reference solutions that are broadly recognized and used throughout the value chain of the telecommunications industry, and which can be considered the de facto standard. The book describes how these reference solutions can be used in a practical context: it presents the latest insights into their development, highlights lessons learned from numerous international projects and combines them with well-founded research results in enterprise architecture management and reference modeling. The complete architectural transformation is explained, from the planning and set-up stage to the implementation. Featuring a wealth of examples and illustrations, the book offers a valuable resource for telecommunication professionals, enterprise architects and project managers alike.
Communication via serial bus systems, like CAN, plays an important role for all kinds of embedded electronic and mechatronic systems. To cope with the requirements for functional safety of safety-critical applications, there is a need to enhance the safety features of the communication systems. One measure to achieve a more robust communication is to add a redundant data transmission path to the application. In general, the communication of real-time embedded systems like automotive applications is tethered, and the redundant data transmission lines are also tethered, increasing the size of the wiring harness and the weight of the system. A radio link is preferred as a redundant transmission line, as it uses a complementary transmission medium compared to the wired solution and in addition reduces wiring harness size and weight. Standard wireless links like Wi-Fi or Bluetooth cannot meet the requirements for real-time capability with regard to bus communication. Using the new dual-mode radio enables a redundant transmission line meeting all requirements with regard to real-time capability, robustness and transparency for the data bus. In addition, it provides a complementary transmission medium with regard to commonly used tethered links. A CAN bus system is used to demonstrate the redundant data transfer via tethered and wireless CAN.
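The core idea of redundant transfer over a wired and a wireless path can be sketched abstractly: every frame is transmitted on both media, and the receiver deduplicates by sequence number so that either path alone suffices. This is a minimal model with simulated channels, not the paper's dual-mode radio implementation; the class and function names are illustrative.

```python
import itertools

class Receiver:
    """Deduplicates frames arriving over the wired and wireless paths."""
    def __init__(self):
        self.seen = set()
        self.delivered = []

    def on_frame(self, seq, payload):
        if seq not in self.seen:   # the second copy of a frame is dropped
            self.seen.add(seq)
            self.delivered.append(payload)

rx = Receiver()
counter = itertools.count()

def send(payload, wired_ok=True, radio_ok=True):
    """Transmit the same frame on both media; either path alone suffices."""
    seq = next(counter)
    if wired_ok:
        rx.on_frame(seq, payload)   # CAN bus path
    if radio_ok:
        rx.on_frame(seq, payload)   # dual-mode radio path

send(b"frame-1")                   # both paths operational
send(b"frame-2", wired_ok=False)   # wired path down, radio delivers
send(b"frame-3", radio_ok=False)   # radio down, wire delivers
print(rx.delivered)
```

Each frame is delivered exactly once regardless of which single path fails, which is the robustness property the redundant link is meant to provide.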
This paper introduces a hardware setup to measure efficiency maps of low-power electric motors and their associated inverters. Here, the power of the device under test (DUT) ranges from some Watts to a few hundred Watts. The torque and speed of the DUT are measured independently of voltage and current in multiple load points. A Matlab-based software approach in combination with an open Texas-Instruments (TI) hardware setup ensures flexibility. Exemplarily, the efficiency field of a Permanent Magnet Synchronous Machine (PMSM) is measured to prove the concept. Brushless-DC (BLDC) motors can be tested as well. The nomenclature in this paper is based on the new European standard DIN EN 50598. Special attention is paid to the calculation of the measurement error.
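The quantity computed at each load point is the ratio of mechanical output power to electrical input power. A minimal sketch of that calculation, with hypothetical measured values (1 Nm at 2000 rpm, 10 A from a 24 V DC link — not figures from the paper):

```python
import math

def mechanical_power(torque_nm, speed_rpm):
    """P_mech = T * omega, with omega converted from rpm to rad/s."""
    return torque_nm * speed_rpm * 2.0 * math.pi / 60.0

def efficiency(torque_nm, speed_rpm, voltage_v, current_a):
    """One point of the efficiency map: mechanical over electrical power."""
    return mechanical_power(torque_nm, speed_rpm) / (voltage_v * current_a)

# Hypothetical measured values at a single load point.
eta = efficiency(1.0, 2000.0, 24.0, 10.0)
print(f"{eta:.3f}")   # about 0.873
```

Repeating this over a torque/speed grid yields the efficiency map; because torque and speed are measured independently of voltage and current, the measurement errors of the four quantities propagate independently into the efficiency estimate.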
Reasoning with Qualitative Positional Information for Domestic Domains in the Situation Calculus
(2011)
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through a simple emulation of user input and output by software robots. Hence, it can be assumed that no changes to the software systems used or to the existing Enterprise Architecture (EA) are required. In this short, practical paper we discuss this assumption based on a real-life implementation project. We show that a successful RPA implementation might require architectural work during analysis, implementation, and migration. As a practical paper, we focus on exemplary lessons learned and new questions related to RPA and EA.
This article describes the functionality of a MATLAB® library that can be used to develop motion-logic applications in MATLAB programming language for industrial drive and control systems using the well known sercos automation bus. Therewith MATLAB's functionality is extended to designing automation applications from single axis machines up to multi-kinematic robots.
Quantitative color measurement in laryngoscopic images. Palm, C; Scholl, I; Lehmann, TM; Spitzer, K.
(1998)
Automated driving is now possible in diverse road and traffic conditions. However, there are still situations that automated vehicles cannot handle safely and efficiently. In this case, a Transition of Control (ToC) is necessary so that the driver takes control of the driving. Executing a ToC requires the driver to get full situation awareness of the driving environment. If the driver fails to get back the control in a limited time, a Minimum Risk Maneuver (MRM) is executed to bring the vehicle into a safe state (e.g., decelerating to full stop). The execution of ToCs requires some time and can cause traffic disruption and safety risks that increase if several vehicles execute ToCs/MRMs at similar times and in the same area. This study proposes to use novel C-ITS traffic management measures where the infrastructure exploits V2X communications to assist Connected and Automated Vehicles (CAVs) in the execution of ToCs. The infrastructure can suggest a spatial distribution of ToCs, and inform vehicles of the locations where they could execute a safe stop in case of MRM. This paper reports the first field operational tests that validate the feasibility and quantify the benefits of the proposed infrastructure-assisted ToC and MRM management. The paper also presents the CAV and roadside infrastructure prototypes implemented and used in the trials. The conducted field trials demonstrate that infrastructure-assisted traffic management solutions can reduce safety risks and traffic disruptions.
Reducing poverty, protecting the planet, and improving life on earth for everyone are the essential goals of the "2030 Agenda for Sustainable Development" committed to by the United Nations (UN). Achieving those goals will require technological innovation as well as its implementation in almost all areas of our business and day-to-day life. This paper proposes a high-level framework that collects and structures different use cases addressing the goals defined by the UN. Hence, it contributes to the discussion by proposing technical innovations that can be used to achieve those goals. As an example, the goal "Climate Action" is discussed in detail by describing use cases related to tackling biodiversity loss in order to conserve ecosystems.
Production and distribution of personalized information services employing mass customization
(2003)
Highly competitive markets paired with tremendous production volumes demand particularly cost efficient products. The usage of common parts and modules across product families can potentially reduce production costs. Yet, increasing commonality typically results in overdesign of individual products. Multi domain virtual prototyping enables designers to evaluate costs and technical feasibility of different single product designs at reasonable computational effort in early design phases. However, savings by platform commonality are hard to quantify and require detailed knowledge of e.g. the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost optimal design of single products.
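The trade-off exploration described above produces a set of non-dominated designs. As a minimal sketch of that idea (not the paper's metamodel-based algorithm), the following filters toy design candidates down to the Pareto front over two objectives, production cost (minimize) and commonality (maximize); all candidate names and values are illustrative.

```python
# Toy design candidates: (production cost, commonality score).
# Lower cost and higher commonality are better; values illustrative.
designs = {
    "D1": (100.0, 0.9),
    "D2": (80.0, 0.5),
    "D3": (90.0, 0.8),
    "D4": (95.0, 0.7),   # dominated by D3: costlier and less common
}

def dominates(a, b):
    """a dominates b if it is no worse in both objectives, better in one."""
    (ca, ka), (cb, kb) = a, b
    return ca <= cb and ka >= kb and (ca < cb or ka > kb)

def pareto_front(designs):
    return {
        name for name, obj in designs.items()
        if not any(dominates(other, obj)
                   for n, other in designs.items() if n != name)
    }

print(sorted(pareto_front(designs)))
```

In the paper's setting, evaluating one candidate means a multi-domain virtual-prototyping run, so a metamodel (surrogate) stands in for the expensive evaluation; the Pareto filter over the surviving candidates is the same in principle.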
Procedures for the Determination of the Scattering Parameters for Network Analyzer Calibration
(1993)
During the Covid-19 pandemic, vocational colleges, universities of applied sciences and technical universities often had to cancel laboratory sessions requiring students’ attendance. These above all are of decisive importance in order to give learners an understanding of theory through practical work. This paper is a contribution to the implementation of distance learning for laboratory work applicable to several upper secondary educational facilities. Its aim is to provide a paradigm for hybrid teaching to analyze and control a non-linear system depicted by a tank model. For this reason, we redesign a full series of laboratory sessions on the basis of various challenges. Thus, it is suitable to serve different reference levels of the European Qualifications Framework (EQF). We present problem-based learning through online platforms to compensate for the lack of a laboratory learning environment. With a task deduced from their future profession, we give students the opportunity to develop their own solutions in self-defined time intervals. A requirements specification provides the framework conditions in terms of time and content for students having to deal with the challenges of the project in a self-organized manner with regard to inhomogeneous previous knowledge. If the concept of Complete Action has been introduced in class beforehand, students will automatically apply it while executing the project. The goal is to combine students’ scientific understanding with procedural knowledge. We suggest a series of remote laboratory sessions that combine a problem formulation from the subject area of Measurement, Control and Automation Technology with a project assignment that is common in industry, by providing extracts from a requirements specification.
The problem of fair and privacy-preserving ordered set reconciliation arises in a variety of applications like auctions, e-voting, and appointment reconciliation. While several multi-party protocols have been proposed that solve this problem in the semi-honest model, there are no multi-party protocols that are secure in the malicious model so far. In this paper, we close this gap. Our newly proposed protocols are shown to be secure in the malicious model based on a variety of novel non-interactive zero-knowledge-proofs. We describe the implementation of our protocols and evaluate their performance in comparison to protocols solving the problem in the semi-honest case.