Refine
Year of publication
- 2020 (245)
Institute
- Fachbereich Medizintechnik und Technomathematik (59)
- Fachbereich Energietechnik (35)
- Fachbereich Wirtschaftswissenschaften (35)
- Fachbereich Luft- und Raumfahrttechnik (33)
- IfB - Institut für Bioengineering (33)
- Fachbereich Elektrotechnik und Informationstechnik (29)
- ECSM European Center for Sustainable Mobility (19)
- Fachbereich Maschinenbau und Mechatronik (18)
- Fachbereich Bauingenieurwesen (14)
- Solar-Institut Jülich (14)
Language
- English (172)
- German (71)
- Multiple languages (1)
- Dutch (1)
Document Type
- Article (135)
- Conference Proceeding (55)
- Part of a Book (20)
- Book (9)
- Review (7)
- Other (4)
- Doctoral Thesis (3)
- Patent (3)
- Administrative publication (3)
- Conference Poster (2)
Keywords
- Amtliche Mitteilung (3)
- MINLP (3)
- Additive manufacturing (2)
- Adjacent buildings (2)
- Experimental validation (2)
- Historical centres (2)
- INODIS (2)
- SIJ (2)
- Shake table test (2)
- Solar-Institut Jülich (2)
Is part of the Bibliography
- no (245)
Integrated voice assistants (IVAs) are receiving more and more attention and are widespread in entertainment use cases such as listening to the radio or web searches. At the same time, the health care sector suffers from inefficient processes and staff shortages, and the use of IVAs has the potential to improve care processes and patient satisfaction. Applying a design science approach based on a qualitative study, we identify IVA requirements, barriers and design guidelines for the health care sector. The results reveal three important IVA functions: setting appointments with care service staff, documenting health history and communicating with service staff. Integration, system stability and volume control are the most important non-functional requirements. Based on the interview results and project experience, six design and implementation guidelines are derived.
This paper analyzes the drag characteristics of several landing gear and turret configurations that are representative of unmanned aircraft tricycle landing gears and sensor turrets. A variety of these components were constructed via 3D printing and analyzed in a wind-tunnel measurement campaign. Both turrets and landing gears were attached to a modular fuselage that supported both isolated components and multiple components at a time. Selected cases were numerically investigated with a Reynolds-averaged Navier-Stokes approach that showed good accuracy when compared to the wind-tunnel data. The drag of main gear struts could be significantly reduced by streamlining their cross-sectional shape while keeping load-carrying capabilities similar. The attachment of wheels introduced interference effects that increased strut drag moderately but increased wheel drag significantly compared to the isolated cases. Very similar behavior was identified for front landing gears. The drag of an electro-optical and infrared sensor turret was found to be much higher than available data for a clean hemisphere-cylinder combination. This turret drag was mainly influenced by geometrical features such as sensor surfaces and the rotational mechanism. The new data of this study are used to develop simple drag estimation recommendations for main and front landing gear struts and wheels as well as sensor turrets. These recommendations take geometrical considerations and interference effects into account.
Reinforced concrete (RC) structures with masonry infills are widely used for several types of buildings all over the world. However, it is well known that traditional masonry infills constructed in rigid contact with the surrounding RC frame performed rather poorly in past earthquakes. Masonry infills showed severe in-plane damage and in many cases failed under out-of-plane seismic loading. As the undesired interactions between frames and infills change the load transfer at the building level, complete collapses of buildings were observed. A possible solution is uncoupling the masonry infills from the frame to reduce the infill contribution activated by the frame deformation under horizontal loading. The paper presents numerical simulations of RC frames equipped with the innovative decoupling system INODIS. The system was developed within the European project INSYSME and allows an effective uncoupling of frame and infill. The simulations are carried out with a micro-modelling approach, which is able to predict the complex nonlinear behaviour resulting from the different materials and their interaction. Each brick is modelled individually and connected taking into account the nonlinearity of the brick-mortar interface. The calibration of the model is based on small-specimen tests, and experimental results for a one-bay, one-storey frame are used for the validation. The validated model is further used for parametric studies on two-storey and two-bay infilled frames. The response and the change of the structural stiffness are analysed and compared to the traditionally infilled frame. The results confirm the effectiveness of the INODIS system, with less damage and a relatively low contribution of the infill at high drift levels. In contrast to the uncoupled system configurations, traditionally infilled frames experienced brittle failure at rather low drift levels.
A further development of the added-mass method allows the combined representation of the effects of both soil-structure interaction and fluid-structure interaction on a liquid-filled tank in one model. This results in a practical method for describing the dynamic fluid pressure on the tank shell during joint movement. The fluid pressure is calculated on the basis of the tank's eigenmode and the earthquake acceleration and is represented by additional masses on the shell. The bearing on compliant ground is represented by replacement springs, which are calculated depending on the local soil composition. The influence of the shear modulus of the compliant soil is clearly visible in the pressure curves and the stress distribution in the shell. The acceleration spectra are also dependent on the soil stiffness. According to Eurocode 8, the acceleration spectra are determined for fixed soil classes instead of calculating the accelerations for each site in direct dependence on the soil composition. This leads to unrealistic sudden changes in the system's response. Therefore, earthquake spectra are calculated for different soil models in direct dependence on the shear modulus. Thus, both the acceleration spectra and the replacement springs match the soil composition. This enables a reasonable and consistent calculation of the system response for the actual conditions at each site.
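The dependence of the replacement springs on the soil described above can be sketched with the classical static stiffness formulas for a rigid circular foundation on an elastic half-space (a textbook approximation with assumed example values for shear modulus, radius and Poisson's ratio; not the specific spring formulation used in the paper):

```python
# Static replacement-spring stiffnesses for a rigid circular foundation on an
# elastic half-space (classical half-space formulas from foundation dynamics);
# they make the scaling with the soil shear modulus G explicit.

def replacement_springs(G, r, nu):
    """G: shear modulus [N/m^2], r: foundation radius [m], nu: Poisson's ratio."""
    k_vertical = 4.0 * G * r / (1.0 - nu)              # vertical translation
    k_horizontal = 8.0 * G * r / (2.0 - nu)            # horizontal translation
    k_rocking = 8.0 * G * r**3 / (3.0 * (1.0 - nu))    # rocking rotation
    return k_vertical, k_horizontal, k_rocking

# Halving the shear modulus (softer soil) halves every replacement spring:
for k_soft, k_stiff in zip(replacement_springs(30e6, 10.0, 0.3),
                           replacement_springs(60e6, 10.0, 0.3)):
    print(k_stiff / k_soft)   # 2.0 for each spring
```

Because every spring scales linearly with G, a site-specific shear modulus directly shifts the whole support stiffness, which is what couples the spectra and springs consistently to the soil composition.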
There is a growing body of evidence for the effects of vitamin D on intestinal host-microbiome interactions related to gut dysbiosis and bowel inflammation. This brief review highlights the potential links between vitamin D and gut health, emphasizing the role of vitamin D in microbiological and immunological mechanisms of inflammatory bowel diseases. A comprehensive literature search was carried out in PubMed and Google Scholar using combinations of the keywords “vitamin D,” “intestines,” “gut microflora,” and “bowel inflammation”. Only articles published in English and related to the study topic were included in the review. We discuss how vitamin D (a) modulates intestinal microbiome function, (b) controls antimicrobial peptide expression, and (c) has a protective effect on epithelial barriers in the gut mucosa. Vitamin D and its nuclear receptor (VDR) regulate intestinal barrier integrity and control innate and adaptive immunity in the gut. Metabolites from the gut microbiota may also regulate expression of VDR, while vitamin D may influence the gut microbiota and exert anti-inflammatory and immune-modulating effects. The underlying mechanism of vitamin D in the pathogenesis of bowel diseases is not fully understood, but maintaining an optimal vitamin D status appears to be beneficial for gut health. Future studies will shed light on the molecular mechanisms through which vitamin D and VDR interactions affect intestinal mucosal immunity, pathogen invasion, symbiont colonization, and antimicrobial peptide expression.
NATO defines cyberspace as the "environment formed by physical and non-physical components for storing, modifying, and exchanging data by means of computer networks" [NATO CCDCOE]. Beyond that, it is a medium of human interaction. IT attacks are hostile, non-cooperative interactions that can be described by means of conflict theory. By applying this framework to the IT security of organisations, a number of possible improvements in companies can be identified.
In many historical centres in Europe, stone masonry buildings form part of building aggregates, which developed as the layout of the city or village densified. In these aggregates, adjacent buildings share structural walls to support floors and roofs, while the façade walls of adjacent buildings are often connected only by dry joints, since the buildings were constructed at different times. Observations after, for example, the recent Central Italy earthquakes showed that the dry joints between the building units were often the first elements to be damaged. As a result, the joints opened up, leading to pounding between the building units and a complicated interaction at floor and roof beam supports. The analysis of such building aggregates is very challenging, and modelling guidelines do not exist. Advances in the development of analysis methods have been impeded by the lack of experimental data on the seismic response of such aggregates. The objective of the project AIMS (Seismic Testing of Adjacent Interacting Masonry Structures), included in the H2020 project SERA, is to provide such experimental data by testing an aggregate of two buildings under two horizontal components of dynamic excitation. The test unit is built at half-scale, with a two-storey building and a one-storey building. The buildings share one common wall, while the façade walls are connected by dry joints. The floors are at different heights, leading to a complex dynamic response of this smallest possible building aggregate. The shake table test is conducted at the LNEC seismic testing facility. The testing sequence comprises four levels of shaking: 25%, 50%, 75% and 100% of nominal shaking table capacity. Extensive instrumentation, including accelerometers, displacement transducers and optical measurement systems, provides detailed information on the building aggregate response. Special attention is paid to the interface opening, the global
Biocomposite Materials Based on Carbonized Rice Husk in Biomedicine and Environmental Applications
(2020)
This chapter describes the prospects for biomedical and environmental engineering applications of heterogeneous materials based on nanostructured carbonized rice husk. Efforts in engineering enzymology are focused on the following directions: development and optimization of immobilization methods leading to novel biotechnological and biomedical applications, and construction of biocomposite materials based on individual enzymes, multi-enzyme complexes and whole cells, targeted at the realization of specific industrial processes. Molecular biological and biochemical studies on cell adhesion focus predominantly on the identification, isolation and structural analysis of attachment-responsible biological molecules and their genetic determinants. The chapter provides a short overview of applications of biocomposite materials based on nanostructured carbonized adsorbents. It emphasizes that further studies and a better understanding of the interactions between CNS and microbial cells are necessary. The future use of living cells as biocatalysts, especially in the environmental field, requires more systematic investigation of the microbial adsorption phenomenon.
The adoption of the digital health transformation is a tremendous paradigm change for health organizations and is not a trivial process in reality. For that reason, this chapter proposes a methodology with the objective of generating a culture of change in healthcare organisations. Such a change culture is essential for the successful implementation of any supporting method such as Interactive Process Mining. It needs to incorporate (mostly) new team-based and evidence-based approaches for solving structural problems in a digital healthcare environment.
Seismic verification of masonry buildings with realistic models and increased behaviour factors
(2020)
Applying the linear verification concept to masonry buildings means that, already today, stability verifications for buildings with typical floor plans can no longer be carried out in regions with moderate seismic actions. This problem will become more severe in Germany with the introduction of continuous probabilistic seismic hazard maps. Because of the increase in seismic actions that results in many places, it is necessary to make the existing, so far unconsidered load-bearing reserves available to building practice through comprehensible verification concepts. This paper presents a concept for the building-specific determination of increased behaviour factors. The behaviour factors are composed of three components that account for load redistribution in the floor plan, deformation capacity and energy dissipation, and overstrength. For the computational determination of these three components, a nonlinear verification concept based on pushover analyses is proposed, in which the interactions of walls and floor slabs are described by a degree of fixity. For determining the degrees of fixity, a nonlinear modelling approach is introduced that can represent the interaction of walls and slabs. The application of the concept with increased building-specific behaviour factors is demonstrated on the example of a multi-family house made of calcium silicate units. For this building, the results of the linear verifications with increased behaviour factors lie much closer to the results of nonlinear verifications, so typical floor plans in seismic regions remain verifiable with the traditional linear calculation approaches.
Reinforced concrete frame structures with masonry infills frequently exhibit severe damage after earthquakes. The reasons are the demands on the infill walls from the imposed in-plane frame deformations and the simultaneously acting inertia forces perpendicular to the wall plane, in combination with the detailing of the infill masonry. The infill is usually built tightly against the frame columns, with the top joint closed with mortar or assembly foam. In an earthquake, this leads to local interactions between infill and frame, which can subsequently cause the failure of individual infill walls or the progressive failure of the entire building. The observed damage was the motivation for developing, within the European research project INSYSME, innovative solutions to improve the seismic behaviour of reinforced concrete frame structures with infills of highly thermally insulating clay block masonry. This paper presents the solutions developed within the project by the German project partners (Universität Kassel, SDA-engineering GmbH) and compares their seismic behaviour with the traditional construction of the infill walls. The comparison is based on quasi-static cyclic wall tests and simulations at the wall level. From the results, recommendations are derived for the earthquake-resistant design of reinforced concrete frame structures with clay masonry infills.
With financial support from the Deutsche Gesellschaft für Mauerwerks- und Wohnungsbau e.V. (DGfM) and the Deutsches Institut für Bautechnik in Berlin (DIBt), two consecutive research projects were carried out to improve the seismic verification of masonry buildings in German earthquake regions. First, the seismic behaviour of three modern unreinforced masonry buildings in the Emilia-Romagna region of Italy during the 2012 earthquake sequence was investigated in detail in cooperation with the University of Pavia. Building on the findings of these investigations, an improved seismic design concept for unreinforced masonry buildings was developed. This paper presents the main results of this research and their incorporation into standardization.
Research on business intelligence and analytics (BI & A) has a long tradition within information systems research, and in each decade the rapid development of technologies opened new room for investigation. Since the early 1950s, the collection and analysis of structured data were the focus of interest, followed by unstructured data since the early 1990s. The third wave of BI & A comprises unstructured and sensor data of mobile devices. The article at hand aims at drawing a comprehensive overview of the status quo of relevant BI & A research in the current decade, focusing on the third wave of BI & A. The paper's contribution is fourfold. First, a systematically developed taxonomy for BI & A 3.0 research, containing seven dimensions and 40 characteristics, is presented. Second, the results of a structured literature review containing 75 full research papers are analyzed by applying the developed taxonomy. The analysis provides an overview of the status quo of BI & A 3.0. Third, the results foster discussions on the predicted and observed developments in BI & A research of the past decade. Fourth, research gaps of the third wave of BI & A research are disclosed and consolidated in a research agenda.
Reinforced concrete (RC) buildings with masonry infill are constructed in many countries around the world. Although masonry infill is regarded as a non-structural element, it significantly changes the dynamic characteristics of RC frame structures during an earthquake. Recently, considerable effort has been devoted to research on isolated infills, which are decoupled from the surrounding frame, usually by leaving a gap between frame and infill. In this case, the frame deformation does not activate the infill, so the infill does not affect the frame behaviour. This paper presents the results of research on the behaviour of RC frame buildings with the INODIS system, which isolates the infill from the surrounding frame. The influence of the isolated infill was first examined on single-storey, single-bay frames. This served as the basis for a parametric analysis of multi-storey and multi-bay frames, as well as of a building example. Changes in stiffness and dynamic characteristics were analysed, as was the response under seismic action. A comparison was made with the bare frame structure as well as with frames infilled in the traditional way. The results show that the behaviour of frames with isolated infill is similar to that of bare frames, while the behaviour of frames with traditional infill is far different and requires complex numerical models. This means that if an adequate structural measure for isolating the infill is applied, the design of RC frame buildings with masonry infill can be significantly simplified.
In traditional microbial biobutanol production, the solvent must be recovered during the fermentation process to achieve a sufficient space-time yield. Thermal separation is not feasible due to the boiling point of n-butanol. As an integrated and selective solid-liquid separation alternative, solvent-impregnated resins (SIRs) were applied. Two polymeric resins were evaluated and an extractant screening was conducted. Vacuum application with vapor collection in a fixed-bed column operated as a bioreactor bypass was successfully implemented as the butanol desorption step. To further improve process economics, fermentation with renewable lignocellulosic substrates was conducted using Clostridium acetobutylicum. Utilization of SIRs was shown to be a potential strategy for solvent removal from fermentation broth, while the bypass column allows for product removal and recovery in a single step.
Bacterial cellulose (BC) is a promising material for biomedical applications due to its unique properties such as high mechanical strength and biocompatibility. This article describes the microbiological synthesis, modification, and characterization of BC nanocomposites originating from the symbiotic consortium Medusomyces gisevii. Two BC modifications were obtained: BC-Ag and BC-calcium phosphate (BC-Ca3(PO4)2). The structure and physicochemical properties of the BC and its modifications were investigated by scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), atomic force microscopy (AFM), and Fourier-transform infrared spectroscopy, as well as by measurements of mechanical and water holding/absorbing capacities. Topographic analysis of the surface revealed multicomponent thick fibrils (150–160 nm in diameter and about 15 µm in length) constituted by 50–60 nm nanofibrils weaved into a left-hand helix. Distinctive features of the Ca-phosphate-modified BC samples were (a) the presence of 500–700 nm entanglements and (b) inclusions of Ca3(PO4)2 crystals. The samples impregnated with Ag nanoparticles exhibited numerous roundish inclusions, about 110 nm in diameter. The boundaries between the organic and inorganic phases were very distinct in both cases. The Ag-modified samples also showed a prominent waving pattern in the packing of nanofibrils. The obtained BC gel films possessed a water-holding capacity of about 62.35 g/g. However, BC films dried to a constant mass later exhibited a low water absorption capacity (3.82 g/g). It was found that decellularized BC samples had a 2.4 times larger Young's modulus and 2.2 times greater tensile strength compared to dehydrated native BC films. We presume that this was caused by molecular compaction of the BC structure.
Three-dimensional (3D) full-field measurements provide a comprehensive and accurate validation of finite element (FE) models. For the validation, the results of the model and the measurements are compared based on two respective point sets, which requires the point sets to be registered in one coordinate system. Point-set registration is a non-convex optimization problem that has widely been solved with the ordinary iterative closest point (ICP) algorithm. However, this approach requires a good initialization, without which it easily returns a local optimum, i.e. an erroneous registration. The globally optimal iterative closest point (Go-ICP) algorithm overcomes this drawback and forms the basis of the presented open-source tool, which can be used for the validation of FE models using 3D full-field measurements. The capability of the tool is demonstrated using an application example from the field of biomechanics. Methodological problems that arise in real-world data and the corresponding implemented solution approaches are discussed.
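The local-optimum pitfall of ordinary ICP mentioned above stems from its alternation of nearest-neighbour matching and least-squares rigid fitting. A minimal self-contained 2D sketch of that iteration follows (a generic textbook version for illustration, not the Go-ICP-based tool itself); it recovers a small misalignment exactly, while large misalignments are where the globally optimal variant is needed:

```python
import numpy as np

def best_rigid_transform(A, B):
    """Least-squares rotation R and translation t mapping point set A onto B (Kabsch/SVD)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, cb - R @ ca

def icp(source, target, iters=20):
    """Ordinary ICP: nearest-neighbour correspondence + rigid fit, repeated."""
    src = source.copy()
    for _ in range(iters):
        # correspondence step: closest target point for every source point
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
    return src

# A regular grid, slightly rotated about its centroid and shifted:
target = np.array([[x, y] for x in range(5) for y in range(5)], dtype=float)
c = target.mean(axis=0)
theta = 0.05
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
source = (target - c) @ rot.T + c + np.array([0.1, -0.05])
aligned = icp(source, target)
print(np.abs(aligned - target).max())   # near zero: the small misalignment is recovered
```

With a larger initial rotation, the nearest-neighbour correspondences are wrong from the start and the iteration can settle on an erroneous registration, which is the failure mode Go-ICP avoids.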
The steel industry in the European Union (EU), important for the economy as a whole, faces various challenges. These include volatile prices for relevant input factors, uncertainties concerning the regulation of CO₂ emissions, and market shocks caused by the recently introduced additional import duties in the US, an important sales market. We examine primary and secondary effects of these challenges on the steel industry in the EU and their impacts at the European and global level. Developing and using a suitable meta-model, we analyze the competitiveness of key steel-producing countries with respect to floor prices depending on selected cost factors, and draw conclusions on the impacts of steel trade on emissions and energy demand, on the involvement of developing countries in the value chain, as well as on the need for innovations to avoid relocations of production. Hence, our study contributes to the assessment of sustainable industrial development, as targeted by the Sustainable Development Goal “Build resilient infrastructure, promote inclusive and sustainable industrialization and foster innovation”. By applying information on country-specific Human Development Indexes (reflecting aspects of life expectancy, education, and per capita income), we show that relocating energy-intensive industries from the EU may not only increase global energy demand and CO₂ emissions, but may also be to the disadvantage of developing countries.
Robust estimators for free surface turbulence characterization: A stepped spillway application
(2020)
Robust estimators are parameters insensitive to the presence of outliers. However, they presume the shape of the variables' probability density function. This study exemplifies the sensitivity of turbulent quantities to the use of classic and robust estimators and to the presence of outliers in turbulent flow depth time series. A wide range of turbulence quantities was analysed based upon a stepped spillway case study, using flow depths sampled with Acoustic Displacement Meters as the flow variable of interest. The studied parameters include the expected free surface level, the expected fluctuation intensity, the depth skewness, the autocorrelation timescales, the vertical velocity fluctuation intensity, the perturbation celerity and the one-dimensional free surface turbulence spectrum. Three levels of filtering were applied prior to computing classic and robust estimators, showing that comparable robustness can be obtained either by using classic estimators together with an intermediate filtering technique or by using robust estimators without any filtering technique.
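The contrast between classic and robust estimators can be illustrated on a synthetic flow-depth series with a single spurious sample (illustrative numbers, not the spillway data): the mean is dragged by the outlier, while the median and the median absolute deviation (MAD) barely move.

```python
import statistics

depths = [0.42, 0.45, 0.43, 0.44, 0.46, 0.41, 0.44, 0.43]   # flow depths [m]
spiked = depths + [3.0]                                      # one spurious echo

def mad(x):
    """Median absolute deviation: a robust scale estimator."""
    m = statistics.median(x)
    return statistics.median(abs(v - m) for v in x)

# Classic estimators are dragged by the single outlier...
print(statistics.mean(depths), statistics.mean(spiked))      # ~0.435 vs ~0.72
# ...while robust estimators barely move.
print(statistics.median(depths), statistics.median(spiked))  # ~0.435 vs 0.44
print(mad(depths), mad(spiked))                              # ~0.01 vs ~0.01
```

This is the trade-off the study quantifies: a robust estimator can stand in for an explicit outlier-filtering step, because the contaminating samples simply do not influence it.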
Humic substances originating from various organic matter can ameliorate soil properties, stimulate plant growth, and improve nutrient uptake. Due to its low calorific value, leonardite is rather unsuitable as a fuel; however, it may serve as a potential source of humic substances. This study aimed at characterizing leonardite-based soil amendments and examining the effect of their application on the soil microbial community, as well as on potato growth and tuber yield. A high yield (71.1%) of humic acid (LHA) from leonardite was demonstrated. Parental leonardite (PL) and LHA were applied to soil prior to potato cultivation. The 16S rRNA sequencing of soil samples revealed distinct relationships between microbial community composition and the application of leonardite-based soil amendments. Potato tubers were planted in pots under greenhouse conditions and harvested at the mature stage for the determination of growth and yield parameters. The results demonstrated that the LHA treatments significantly increased potato growth (54.9%) and tuber yield (66.4%) compared to the control. The findings highlight the importance of leonardite-based humic amendments for maintaining the biogeochemical stability of soils, keeping a healthy microbial community structure, and increasing the agronomic productivity of potato plants.
In this study, the process chain of additive manufacturing by means of powder bed fusion is presented for the material glass. In order to process components additively in a reliable manner, new concepts with different solutions were developed and investigated.
Compared to established metallic materials, the properties of glass materials differ significantly. Therefore, the process control was adapted to the material glass in the investigations. With extensive parameter studies based on various glass powders, such as borosilicate glass and quartz glass, scientifically sound results on powder bed fusion of glass are presented. Based on the determination of the particle properties with different methods, extensive investigations were made into the melting behavior of glass under laser irradiation. Furthermore, the experimental setup was steadily expanded. In addition to the integration of coaxial temperature measurement and control, preheating of the build platform is of major importance. This offers the possibility to perform 3D printing at the transformation temperatures of the glass materials. To improve the components' properties, the influence of a subsequent heat treatment was also investigated.
The experience gained was incorporated into a new experimental system, which allows a much deeper exploration of the 3D printing of glass. Currently, studies are being conducted to improve surface texture, build accuracy, and geometrical capabilities using three-dimensional specimens.
The contribution shows the development of research in the field of 3D printing of glass, gives an insight into the machine and process engineering, and provides an outlook on possibilities and applications.
Design and Development of a Hot S-Parameter Measurement System for Plasma and Magnetron Applications
(2020)
This paper presents the design, development and calibration procedures of a novel hot S-parameter measurement system for plasma and magnetron applications with power levels up to 6 kW. Based on a vector network analyzer, a power amplifier and two directional couplers, the input matching (hot S11) and transmission (hot S21) of the device under test are measured at a 2.45 GHz center frequency with 300 MHz bandwidth while the device is driven by the magnetron. This measurement system opens a new horizon for developing many new industrial applications such as microwave plasma jets, drying systems and so forth. Furthermore, the development, control and monitoring of a 2 kW, 2.45 GHz plasma jet and a dryer system using the measurement system are presented and explained.
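For reference, the measured S-parameters are ratios of coupled wave amplitudes, conventionally reported in decibels; a minimal sketch of that conversion (generic S-parameter arithmetic, not the paper's calibration procedure):

```python
import cmath, math

def s_param_db(b, a):
    """Magnitude of a scattering parameter S = b/a in dB, from complex wave amplitudes."""
    return 20.0 * math.log10(abs(b / a))

# Example: a reflected wave at 10% of the incident amplitude gives
# hot S11 = -20 dB, i.e. 1% of the incident power (|S11|^2) is reflected.
incident = 1.0 + 0.0j
reflected = 0.1 * cmath.exp(1j * math.radians(30.0))   # arbitrary phase
print(round(s_param_db(reflected, incident), 1))       # -20.0
```

The same expression with the transmitted-wave amplitude in place of the reflected one yields hot S21, the transmission through the device under test.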
Implementation of gender and diversity perspectives in transport development plans in Germany
(2020)
As mobility should ensure accessibility to and participation in society, transport planning has to deal with a variety of gender and diversity categories affecting users' mobility needs and patterns. Through the analysis of one instrument of transport development processes, German Transport Development Plans (TDPs), we investigated to what extent diverse target groups and their mobility requirements are implemented in transport strategy papers. The results illustrate a still-prevalent neglect of several relevant gender and diversity categories, while eco-friendly topics are prioritized. But how sustainable can transport be without facing the diversification of life circumstances?
The number of case studies focusing on hybrid-electric aircraft is steadily increasing, since these configurations are thought to lead to lower operating costs and environmental impact than traditional aircraft. However, due to the lack of reference data of actual hybrid-electric aircraft, in most cases, the design tools and results are difficult to validate. In this paper, two independently developed approaches for hybrid-electric conceptual aircraft design are compared. An existing 19-seat commuter aircraft is selected as the conventional baseline, and both design tools are used to size that aircraft. The aircraft is then re-sized under consideration of hybrid-electric propulsion technology. This is performed for parallel, serial, and fully-electric powertrain architectures. Finally, sensitivity studies are conducted to assess the validity of the basic assumptions and approaches regarding the design of hybrid-electric aircraft. Both methods are found to predict the maximum take-off mass (MTOM) of the reference aircraft with less than 4% error. The MTOM and payload-range energy efficiency of various (hybrid-) electric configurations are predicted with a maximum difference of approximately 2% and 5%, respectively. The results of this study confirm a correct formulation and implementation of the two design methods, and the data obtained can be used by researchers to benchmark and validate their design tools.
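The cross-tool checks reported above reduce to simple relative differences; a minimal sketch with made-up masses (the numbers below are assumptions for illustration, not the study's data):

```python
def rel_diff(x, ref):
    """Relative difference of a predicted value x against a reference ref."""
    return abs(x - ref) / ref

# Hypothetical numbers: each tool's baseline MTOM prediction is checked
# against the actual 19-seat aircraft...
mtom_actual = 6_600.0                        # kg, assumed reference
mtom_tool_a, mtom_tool_b = 6_450.0, 6_800.0
assert rel_diff(mtom_tool_a, mtom_actual) < 0.04   # within the <4% error bound
assert rel_diff(mtom_tool_b, mtom_actual) < 0.04
# ...and the two tools' hybrid-electric re-designs are compared to each other.
hybrid_a, hybrid_b = 8_100.0, 8_250.0
assert rel_diff(hybrid_b, hybrid_a) < 0.02         # ~2% MTOM difference
```

Agreement of two independently developed tools on both checks is what supports the claim of a correct formulation and implementation.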
Accountability, data breaches, the end of fax & e-mail: supervisory authorities with contentious positions
(2020)
In many cities, diesel buses are being replaced by electric buses with the aim of reducing local emissions and thus improving air quality. The protection of the environment and the health of the population is the highest priority of our society. For the transport companies that operate these buses, not only ecological issues but also economic issues are of great importance. Due to the high purchase costs of electric buses compared to conventional buses, operators are forced to use electric vehicles in a targeted manner in order to ensure amortization over the service life of the vehicles. A compromise between ecology and economy must be found in order to both protect the environment and ensure economical operation of the buses.
In this study, we present a new methodology for optimizing the vehicles’ charging time as a function of CO₂eq emissions and electricity costs. Based on driving profiles recorded in daily bus operation, the energy demands of conventional and electric buses are calculated for passenger transportation in the city of Aachen in 2017. Different charging scenarios are defined to analyze the influence of the temporal variability of CO₂eq intensity and electricity price on the environmental impact and economy of the bus. For every individual day of a year, the charging periods with the lowest and highest costs and emissions are identified, and recommendations for daily bus operation are made. To enable both the ecological and economical operation of the bus, the electricity price and the CO₂eq intensity are weighted differently, and several charging periods are proposed, taking into account the priorities previously set. A sensitivity analysis is carried out to evaluate the influence of selected parameters and to derive recommendations for improving the ecological and economic balance of the battery-powered electric vehicle.
In all scenarios, the optimization of the charging period results in energy cost savings of up to 13.6% compared to charging at a fixed electricity price. The savings potential for CO₂eq emissions is similar, at up to 14.9%. From an economic point of view, charging between 2 a.m. and 4 a.m. results in the lowest energy costs on average. The CO₂eq intensity is also low in this period, but midday charging leads to the largest savings in CO₂eq emissions. From a life cycle perspective, the electric bus is not economically competitive with the conventional bus. However, from an ecological point of view, the electric bus saves on average 37.5% CO₂eq emissions over its service life compared to the diesel bus. The reduction potential is maximized if the electric vehicle exclusively consumes electricity from solar and wind power.
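The weighted selection of charging periods described above can be sketched as follows. This is an illustrative reconstruction, not the study's implementation: the hourly price and CO₂eq-intensity series, the window length, and the weights are placeholder values.

```python
# Sketch of weighted charging-window selection (illustrative, not the
# study's code): pick the start hour whose charging window minimizes a
# weighted, normalized sum of energy cost and CO2eq emissions.

def best_charging_window(prices, co2, window_h, w_cost=0.5, w_co2=0.5):
    """prices: hourly electricity prices, co2: hourly CO2eq intensities.
    Returns the start hour of the best charging window of window_h hours."""
    n = len(prices)

    def window_sum(series, start):
        return sum(series[(start + i) % n] for i in range(window_h))

    cost = [window_sum(prices, s) for s in range(n)]
    em = [window_sum(co2, s) for s in range(n)]

    # normalize both criteria to [0, 1] so the weights are comparable
    def norm(xs):
        lo, hi = min(xs), max(xs)
        return [(x - lo) / (hi - lo) if hi > lo else 0.0 for x in xs]

    score = [w_cost * c + w_co2 * e for c, e in zip(norm(cost), norm(em))]
    return min(range(n), key=score.__getitem__)
```

Setting the weights shifts the proposed period between the cheapest night-time hours and the low-CO₂eq midday hours, mirroring the trade-off discussed above.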
Large-scale central receiver systems typically deploy from thousands to more than a hundred thousand heliostats. During solar operation, each heliostat is aligned individually in such a way that the overall surface normal bisects the angle between the sun’s position and the aim point coordinate on the receiver. Due to various tracking error sources, achieving an alignment accuracy of ≤1 mrad for all heliostats with respect to the aim points on the receiver without a calibration system can be regarded as unrealistic. Therefore, a calibration system is necessary not only to improve the aiming accuracy for achieving desired flux distributions but also to reduce or eliminate spillage. An overview of current larger-scale central receiver systems (CRS), tracking error sources, and the basic requirements of an ideal calibration system is presented. Leading up to the main topic, a description of general and specific terms on the topics of heliostat calibration and tracking control clarifies the terminology used in this work. Various figures illustrate the signal flows along typical components as well as the corresponding monitoring or measuring devices that indicate or measure along the signal (or effect) chain. The numerous calibration systems are described in detail and classified in groups. Two tables juxtapose the calibration methods for a better comparison. In an assessment, the advantages and disadvantages of the individual calibration methods are presented.
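The ideal tracking geometry described above can be sketched in a few lines: the heliostat normal is the bisector of the unit vector toward the sun and the unit vector toward the aim point. The vectors in the example are illustrative.

```python
# Minimal sketch of ideal heliostat tracking geometry: by the law of
# reflection, the surface normal must bisect the angle between the
# direction to the sun and the direction to the aim point.
import math

def unit(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def heliostat_normal(to_sun, to_aim):
    """Normalized sum of the two unit vectors = the angle bisector."""
    s, a = unit(to_sun), unit(to_aim)
    return unit(tuple(si + ai for si, ai in zip(s, a)))
```

Any error in this computed normal (e.g., from encoder offsets or pedestal tilt) displaces the reflected beam on the receiver, which is what the calibration systems surveyed here are designed to measure and correct.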
For short take-off and landing (STOL) aircraft, a parallel hybrid-electric propulsion system potentially offers superior performance compared to a conventional propulsion system, because the short-take-off power requirement is much higher than the cruise power requirement. This power-matching problem can be solved with a balanced hybrid propulsion system. However, there is a trade-off between wing loading, power loading, the level of hybridization, as well as range and take-off distance. An optimization method can vary design variables in such a way that a minimum of a particular objective is attained. In this paper, a comparison between the optimization results for minimum mass, minimum consumed primary energy, and minimum cost is conducted. A new initial sizing algorithm for general aviation aircraft with hybrid-electric propulsion systems is applied. This initial sizing methodology covers point performance, mission performance analysis, the weight estimation process, and cost estimation. The methodology is applied to the design of a STOL general aviation aircraft, intended for on-demand air mobility operations. The aircraft is sized to carry eight passengers over a distance of 500 km, while able to take off and land from short airstrips. Results indicate that parallel hybrid-electric propulsion systems must be considered for future STOL aircraft.
We propose the chance constrained programming model of stochastic programming theory to analyze limit and shakedown loads of structures under random strength with a lognormal distribution. A dual chance constrained programming algorithm is developed to calculate simultaneously both the upper and lower bounds of the plastic collapse limit and the shakedown limit. The edge-based smoothed finite element method (ES-FEM) is used with three-node linear triangular elements.
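The core idea of a chance constraint on lognormal strength can be illustrated with its deterministic equivalent: a probabilistic requirement on the random strength is replaced by a quantile of its distribution. This sketch shows only that reformulation, not the dual algorithm of the paper; the parameter values are placeholders.

```python
# Illustrative deterministic equivalent of a single chance constraint with
# lognormally distributed strength (mu, sigma are parameters of the
# log-strength; the values used below are placeholders, not from the paper).
from statistics import NormalDist
import math

def strength_quantile(mu, sigma, p):
    """p-quantile of a lognormal strength. The chance constraint
    P(strength >= q) >= 1 - p becomes the deterministic bound:
    design value <= q."""
    z = NormalDist().inv_cdf(p)
    return math.exp(mu + sigma * z)
```

For a reliability level of 95%, for example, the admissible load would be tied to the 5% quantile of the strength distribution, which lies below the median strength exp(mu).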
Electrolyte-insulator-semiconductor (EIS) field-effect sensors belong to a new generation of electronic chips for biochemical sensing, enabling a direct electronic readout. The review gives an overview on recent advances and current trends in the research and development of chemical sensors and biosensors based on the capacitive field-effect EIS structure—the simplest field-effect device, which represents a biochemically sensitive capacitor. Fundamental concepts, physicochemical phenomena underlying the transduction mechanism and application of capacitive EIS sensors for the detection of pH, ion concentrations, and enzymatic reactions, as well as the label-free detection of charged molecules (nucleic acids, proteins, and polyelectrolytes) and nanoparticles, are presented and discussed.
Game-based learning is a promising approach to anti-phishing education, as it fosters motivation and can help reduce the perceived difficulty of the educational material. Over the years, several prototypes for game-based applications have been proposed that follow different approaches to content selection, presentation, and game mechanics. In this paper, a literature and product review of existing learning games is presented. Based on research papers and accessible applications, an in-depth analysis was conducted, encompassing target groups, educational contexts, learning goals based on Bloom’s Revised Taxonomy, and learning content. As a result of this review, we created the publications on games (POG) data set for the domain of anti-phishing education. While there are games that can convey factual and conceptual knowledge, we find that most games are either unavailable, fail to convey procedural knowledge, or lack technical depth. Thus, we identify potential areas of improvement for games suitable for end-users in informal learning contexts.
Impact of Battery Performance on the Initial Sizing of Hybrid-Electric General Aviation Aircraft
(2020)
Studies suggest that hybrid-electric aircraft have the potential to generate fewer emissions and be inherently quieter when compared to conventional aircraft. By operating combustion engines together with an electric propulsion system, synergistic benefits can be obtained. However, the performance of hybrid-electric aircraft is still constrained by a battery’s energy density and discharge rate. In this paper, the influence of battery performance on the gross mass of a four-seat general aviation aircraft with a hybrid-electric propulsion system is analyzed. For this design study, a high-level approach is chosen, using an innovative initial sizing methodology to determine the minimum required aircraft mass for a specific set of requirements and constraints. Only the peak-load shaving operational strategy is analyzed. Both parallel- and serial-hybrid propulsion configurations are considered for two different missions. The specific energy of the battery pack is varied from 200 to 1,000 W⋅h/kg, while the discharge time, and thus the normalized discharge rating (C-rating), is varied between 30 min (2C discharge rate) and 2 min (30C discharge rate). With the peak-load shaving operating strategy, it is desirable for hybrid-electric aircraft to use a light, low-capacity battery system to boost performance. For this case, the battery’s specific power rating proved to be of much higher importance than for fully electric designs, which have high-capacity batteries. Discharge ratings of 20C allow a significant reduction of the aircraft’s take-off mass. The design point moves to higher wing loadings and higher levels of hybridization if batteries with advanced technology are used.
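The normalized discharge ratings quoted above follow directly from the discharge time: a battery emptied in t minutes discharges at 60/t C, so 30 min corresponds to 2C and 2 min to 30C.

```python
# C-rating from discharge time: a battery fully discharged in t minutes
# is discharged at a rate of C = 60 / t (relative to its capacity).

def c_rate(discharge_minutes):
    return 60.0 / discharge_minutes
```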
The Coolplan-AIR project concerns the further development and field validation of a calculation and design tool for the energy-efficient cooling of buildings with air-based systems. In addition to the construction and further development of simulation models, the overall systems are measured on practical installations in the field. One of the systems under consideration works with indirect evaporative cooling. This publication presents the development process and the structure of the simulation model for evaporative cooling in the Matlab-Simulink simulation environment with the CARNOT toolbox. Particular attention is paid to the physical model of the heat exchanger in which the evaporation is implemented. The new modeling approach is based on the assumption of an effective heat capacity derived from an enthalpy analysis. Furthermore, the degree of humidification is regarded as constant, and a standardized increase of the heat transfer of the wet compared to the dry heat exchanger is assumed. The model was validated against literature data. For the dry heat exchanger, the maximum absolute error of the calculated outlet temperature (supply air) is smaller than ±0.1 K, and for the wet heat exchanger (cooling case), assuming a constant degree of evaporation, smaller than ±0.4 K.
Coronavirus disease 2019 (COVID-19) is a novel human infectious disease provoked by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Currently, no specific vaccines or drugs against COVID-19 are available. Therefore, early diagnosis and treatment are essential in order to slow the virus spread and to contain the disease outbreak. Hence, new diagnostic tests and devices for virus detection in clinical samples that are faster, more accurate and reliable, easier, and more cost-efficient than existing ones are needed. Due to their small size, fast response time, label-free operation without the need for expensive and time-consuming labeling steps, the possibility of real-time and multiplexed measurements, robustness, and portability (point-of-care and on-site testing), biosensors based on semiconductor field-effect devices (FEDs) are among the most attractive platforms for an electrical detection of charged biomolecules and bioparticles by their intrinsic charge. In this review, recent advances and key developments in the field of label-free detection of viruses (including plant viruses) with various types of FEDs are presented. In recent years, certain plant viruses have also attracted additional interest for biosensor layouts: their repetitive protein subunits arranged at nanometric spacing can be employed for coupling functional molecules. If used as adapters on sensor chip surfaces, they allow an efficient immobilization of analyte-specific recognition and detector elements such as antibodies and enzymes at highest surface densities. The display on plant viral bionanoparticles may also lead to long-term stabilization of sensor molecules upon repeated use and has the potential to increase sensor performance substantially compared to conventional layouts. This has been demonstrated in different proof-of-concept biosensor devices.
Therefore, richly available plant viral particles, non-pathogenic for animals or humans, might gain novel importance if applied in receptor layers of FEDs. These perspectives are explained and discussed with regard to future detection strategies for COVID-19 and related viral diseases.
Particularly in the business context, workforce diversity is increasingly seen as a critical success factor. In addition to the potential that studies attribute to diverse teams, however, the challenges resulting from human diversity are also addressed and scientifically investigated. Both the potential and the challenges give rise to the need to implement an organization-specific diversity management that equally supports the recruitment of new employees on the one hand and the management of existing diversity on the other. The psychological, social-science, and economic literature offers different definitions of diversity, resulting in different perspectives on how to design and implement a diversity management approach. Particularly against the background of the complexity of the organizational environment and the increasing demands on internal organizational agility, there is a need to reflect more strongly on diversity in organizations and to develop system-specific approaches. This requires taking organization-specific structures and processes into account, as well as reflecting on the change in organizational culture brought about by implementing a diversity management approach that can capture and cope with the given complexity. In addition, the psychological effects of such changes on employees must be considered in order to avoid reactance and to enable a sustainable implementation of diversity management.
In the absence of corresponding approaches for publicly funded, complex research organizations, the aim of this dissertation is to develop and test a research design that links the approaches of diversity and change management with organizational culture by adopting a systems-theoretical perspective. The research design is applied to a complex scientific organization. The basis is the review of the current state of research from an interdisciplinary perspective carried out in Part A and the accompanying comprehensive introduction to the research field. In the course of this, the conceptual definition of diversity is discussed in detail before the psychological concepts in the diversity context form the transition to a differentiated examination of the concept of diversity management. On this basis, the research design and the resulting research phases are derived. Part A thus provides the theoretical foundation for the papers presented in Part B. Each paper examines the different research phases in chronological order. Paper I presents the six-stage research approach and examines the particular framework conditions of the research object from a theoretical perspective. Subsequently, the results of the organizational analysis, which constitutes phases I and II of the research concept, are presented. Building on these findings, Paper II focuses on presenting the results of research phase III, the survey of the management level. The survey addressed the perception of diversity and diversity management at the management level, the link between diversity and innovation, and the reflection on one’s own leadership style.
As a result of the survey, six types were identified that reflect the understanding of leadership in the diversity context and thus represent the starting point for a top-down diversity management strategy. Building on this, research phase IV investigates the employee level. The quantitative survey focused on the prevailing attitudes toward diversity and diversity management, the perception of diversity, and the influence of the management level on the employee level. Paper III presents first results of this investigation. The analysis points to a different weighting of the various diversity categories with regard to their link to innovation and thus to the reflection on the relationship between diversity and innovation. Comparable to the types identified at the management level, the analysis suggests the existence of different degrees of reflection at the employee level. On this basis, Paper IV presents a closer examination of the degree of reflection at the employee level and combines the diversity management approach with elements of change management. As the conclusion of a theoretical analysis, particular attention is paid to organizational culture as a central element in developing and introducing a diversity management approach in a complex research organization in Germany. The analysis shows that the perception of diversity is heterogeneous but initially detached from the individual background (in this analysis, the focus was on the diversity categories gender and origin). A heterogeneous picture also emerges with regard to the appreciation of diversity. Overall, only 17% of employees agree that diversity categories such as gender, origin, or age can represent an added value.
At the same time, this group rates the importance attached to the topic in the CoE as sufficient. In summary, the following findings can be derived from this dissertation and serve as the basis for developing a diversity management approach: (1) Developing a needs-oriented diversity management approach requires a systems-theoretical process that takes both internal and external influencing factors into account. The six-stage research process developed in this project has proven to be a suitable instrument. (2) Within public research institutions, three central factors can be identified: the individual level of reflection, the organizational culture, and externally influenced organizational structures, processes, and systems. (3) Comparable to private-sector companies, the management level in scientific organizations also has a decisive influence on the perception of diversity and thus on the implementation of a diversity management strategy. Therefore, a top-down approach is also required for sustainable implementation in the scientific context, due to the legal framework of the higher education system. (4) Diversity management is closely related to organizational change, which requires reflecting on change processes from a psychological perspective and links diversity and change management. Building on the central findings gained within the developed research concept, an approach is developed that allows theoretical implications as well as implications for management to be derived.
Particularly in light of the specific framework conditions of publicly funded research organizations, political implications aimed at changing structural dimensions are also derived.
A collection of all the regulations relevant to data protection in the churches: the GDPR (DSGVO), KDG, KDR-OG, and DSG-EKD, as well as accompanying ordinances (KDG-DVO, KDR-OG-DVO, and ITSVO-EKD) and procedural laws (KDSGO and KiGG.EKD). This collection contains the provisions governing data protection in the two major churches in Germany: in addition to the General Data Protection Regulation, these are the norms of church law that have been newly enacted or amended. Furthermore, the various implementing ordinances for the church data protection laws and the relevant norms of church procedural law are reproduced. The work is supplemented by references to relevant publications of the state and church data protection supervisory authorities and further materials.
Comparative assessment of parallel-hybrid-electric propulsion systems for four different aircraft
(2020)
Until electric energy storage systems are ready to allow fully electric aircraft, the combination of combustion engine and electric motor as a hybrid-electric propulsion system seems to be a promising intermediate solution. Consequently, the design space for future aircraft is expanded considerably, as serial hybrid-electric, parallel hybrid-electric, fully electric, and conventional propulsion systems must all be considered. While the best propulsion system depends on a multitude of requirements and considerations, trends can be observed for certain types of aircraft and certain types of missions. This paper provides insight into some factors that drive a new design toward either conventional or hybrid propulsion systems. General aviation aircraft, regional transport aircraft, vertical takeoff and landing air taxis, and unmanned aerial vehicles are chosen as case studies. Typical missions for each class are considered, and the aircraft are analyzed regarding their takeoff mass and primary energy consumption. For these case studies, a high-level approach is chosen, using an initial sizing methodology. Only parallel-hybrid-electric powertrains are taken into account. Aeropropulsive interaction effects are neglected. Results indicate that hybrid-electric propulsion systems should be considered if the propulsion system is sized by short-duration power constraints. However, if the propulsion system is sized by a continuous power requirement, hybrid-electric systems offer hardly any benefit.
After the failure of the Privacy Shield, data protection advisor Alexander Golland hopes that the European authorities will soon propose concrete measures for data transfers to third countries. Small companies would also have to master the challenge; otherwise, the ruling would only be grist to the mill of those who complain about data protection anyway.
The qualitative and quantitative detection of target substances in an aqueous sample is of interest for many applications, for example the detection of contaminations in drinking water in crisis situations. Here it is important not only that pathogens can be detected with high sensitivity, but also that the analysis is fast, so that safe drinking water can be made available quickly to those affected in a disaster. Since a functioning laboratory infrastructure cannot be assumed to be nearby in such a scenario, it is important that the measurement can be performed directly on site. In this work, it was investigated whether such rapid analysis is possible using superparamagnetic beads (MBs) and the magnetic frequency mixing technique. The MBs are bound to the target substance via primary antibodies and fixed to the pore surface of a polyethylene filter via secondary antibodies (sandwich immunoassay). In this way, the quantification of the target substance can be reduced to a magnetic measurement of the immobilized MB markers. The magnetic frequency mixing technique is based on exciting the sample with magnetic fields of two different frequencies. The mixing frequencies generated by the nonlinear magnetization of the superparamagnetic MBs are typically analyzed with a two-stage lock-in detection (analog demodulation), which was realized in a magnetic reader as a handheld device. In addition to this technique, the principle of direct digitization of the entire response signal with subsequent Fourier analysis of the generated mixing frequencies was implemented experimentally in order to record the amplitudes and phases of several mixing frequencies simultaneously.
One way of increasing the sensitivity is magnetic preconcentration, in which the MBs are separated from a larger sample volume by means of a magnetic field gradient prior to the magnetic analysis. To characterize various commercial MBs with respect to their magnetic separability, a setup for measuring their magnetophoretic mobilities was realized, and their velocities in the gradient field were measured microscopically. Since a sample is often to be examined not only for a single target substance but simultaneously for several different pathogens, various approaches enabling such a multiparametric magnetic immunoassay were developed and tested. On the one hand, a spatial separation of the binding regions for different target substances was realized, which can be evaluated sequentially. On the other hand, the differentiation of target substances based on the characteristics of the differently functionalized MB types bound to them was investigated. For such a differentiation, first the excitation frequency of the magnetic frequency mixing technique was varied during a measurement. It was shown that different MB types can be distinguished from one another by the phase of their frequency mixing signals. Furthermore, it was shown that the signal curve of a binary mixture of two different MB types results as a gradual transition between the curves of the two pure MB solutions. A further analysis method for a multiparametric immunoassay consists in using an additional adjustable static magnetic offset field. For this purpose, several setups based on permanent magnets and electromagnets were simulated, constructed, and characterized. Simulations showed that a differentiation based on this method is possible for MBs with different magnetic particle moments.
As a direct application of the magnetic reader developed here in combination with digital demodulation, a magnetic assay against the B subunit of cholera toxin in drinking water was demonstrated with a low detection limit of 0.2 ng/ml.
A German–Brazilian research project investigates sugarcane as an energy plant in anaerobic digestion for biogas production. The aim of the project is a continuous, efficient, and stable biogas process with sugarcane as the substrate. Tests are carried out in a fermenter with a volume of 10 l.
In order to optimize the space–time loading and achieve a stable process, a continuous process at laboratory scale has been devised. The daily feed-in quantity and the harvest time of the sugarcane substrate were varied. The digester content was analyzed twice per week to monitor the process: the ratio of volatile organic acids to total inorganic carbonate buffer (VFA/TAC), the concentration of short-chain fatty acids, the organic dry matter, the pH value, and the total nitrogen, phosphate, and ammonium concentrations were monitored. In addition, the gas quality (the percentages of CO₂, CH₄, and H₂) and the quantity of the produced gas were analyzed.
The investigations demonstrated the feasible and economical production of biogas in a continuous process with energy cane as substrate. With a daily feeding rate of 1.68 gᵥₛ/(l·d), the average specific gas formation rate was 0.5 m³/kgᵥₛ. The long-term study demonstrates a surprisingly fast metabolism of short-chain fatty acids. This indicates a stable and less susceptible process compared to other substrates.
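The reported figures can be combined into a back-of-the-envelope check of the daily gas yield of the 10 l laboratory fermenter; the calculation below is our illustration, not part of the study.

```python
# Daily biogas volume from the feeding rate (g_VS per litre of reactor
# volume and day), the reactor volume, and the specific gas formation
# rate (m^3 of gas per kg of volatile solids fed).

def daily_gas_litres(feed_g_vs_per_l_d, reactor_l, spec_rate_m3_per_kg_vs):
    vs_kg_per_day = feed_g_vs_per_l_d * reactor_l / 1000.0  # g -> kg
    return vs_kg_per_day * spec_rate_m3_per_kg_vs * 1000.0  # m^3 -> l

# 1.68 g_VS/(l*d) in a 10 l fermenter at 0.5 m^3/kg_VS
# corresponds to about 8.4 l of biogas per day.
```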
Extracellular acidification is a basic indicator for alterations in two vital metabolic pathways: glycolysis and cellular respiration. Measuring these alterations by monitoring extracellular acidification using cell-based biosensors such as light-addressable potentiometric sensors (LAPS) plays an important role in studying these pathways, whose disorders are associated with numerous diseases including cancer. However, the surface of the biosensors must be specially tailored to ensure high cell compatibility so that cells can exhibit more in vivo-like behavior, which is critical for gaining more realistic in vitro results from the analyses, e.g., drug discovery experiments. In this work, O2 plasma patterning on the LAPS surface is studied to enhance surface features of the sensor chip, e.g., wettability and biofunctionality. The surface treated with O2 plasma for 30 s exhibits enhanced cytocompatibility for adherent CHO–K1 cells, which promotes cell spreading and proliferation. The plasma-modified LAPS chip is then integrated into a microfluidic system, which provides two identical channels to facilitate differential measurements of the extracellular acidification of CHO–K1 cells. To the best of our knowledge, this is the first time that extracellular acidification within microfluidic channels has been quantitatively visualized as differential (bio-)chemical images.
In collaborative research projects, both researchers and practitioners work together solving business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem.
Since machine learning approaches are not deterministic, their output quality may decrease over time. This leads to an overall quality loss of the application that embeds the machine learning models. Hence, software quality in development and in production may differ.
Machine learning models are black boxes. This makes practitioners skeptical and raises the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production enables an early response to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that possible new inputs are integrated into model training as quickly as possible.
In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with an automated software quality monitoring in productive use and a possibility to auto-generate new test data coming from processed documents in production.
Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
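The monitoring idea can be sketched in a few lines. This is our minimal illustration of threshold-based quality monitoring, not the reference implementation from the paper: each production batch of model outputs is scored, and a drop below the requested threshold flags that new inputs should flow back into training.

```python
# Minimal sketch of threshold-based software quality monitoring in
# production (illustrative only): score one batch of model predictions
# and flag whether the quality threshold was violated.

def monitor_batch(y_true, y_pred, threshold):
    """Return (accuracy, needs_retraining) for one production batch."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    accuracy = correct / len(y_true)
    return accuracy, accuracy < threshold
```

In the architecture pattern described above, such a check would run continuously on processed documents, and flagged batches would become auto-generated test and training data.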
Many important situations can be modeled with nonlinear parabolic partial differential equations (PDEs) in several dimensions. In general, these PDEs can be solved by discretizing in the spatial variables, transforming them into huge systems of ordinary differential equations (ODEs), which are typically very stiff. Standard explicit methods therefore require a large number of iterations to solve stiff problems, while implicit schemes are computationally very expensive when solving huge systems of nonlinear ODEs. Several families of Extrapolated Stabilized Explicit Runge-Kutta schemes (ESERK) with different orders of accuracy (3 to 6) are derived and analyzed in this work. They are explicit methods whose stability regions are extended, along the negative real semi-axis, quadratically with respect to the number of stages s; hence they can solve stiff problems much faster than traditional explicit schemes. Additionally, they allow easy step-length adaptation at very small cost.
Two new families of ESERK schemes (ESERK3 and ESERK6) are derived and analyzed in this work. Each family has more than 50 new schemes, with up to 84,000 stages in the case of ESERK6. For the first time, we also parallelized all these variable-step-length, variable-number-of-stages algorithms (ESERK3, ESERK4, ESERK5, and ESERK6). These parallelized strategies decrease computation times significantly, as discussed and demonstrated numerically for two problems. Thus, the new codes compare very well against other well-known ODE solvers. Finally, a new strategy is proposed to increase the efficiency of these schemes, and the idea of combining ESERK families in one code is discussed, because stiff problems typically have different zones, and the optimum order of convergence differs according to these zones and the requested tolerance.
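The quadratic growth of the real stability interval with the stage number s, which underlies all stabilized explicit Runge-Kutta schemes of this type, can be checked numerically with a minimal sketch. Note this uses the classical first-order Chebyshev stability polynomial as a stand-in, not the ESERK schemes themselves:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def max_amplification(s):
    """Stability function of a first-order Chebyshev scheme with s stages:
    R(z) = T_s(1 + z / s^2).  |R(z)| <= 1 holds on the whole real interval
    [-2 s^2, 0], i.e. the stability interval grows quadratically in s."""
    coeffs = np.zeros(s + 1)
    coeffs[s] = 1.0                              # Chebyshev polynomial T_s
    z = np.linspace(-2.0 * s**2, 0.0, 20001)
    return np.max(np.abs(C.chebval(1.0 + z / s**2, coeffs)))

for s in (5, 10, 40):
    # stable along the negative real semi-axis down to z = -2 s^2
    assert max_amplification(s) <= 1.0 + 1e-8
```

Doubling the number of stages thus quadruples the usable step size for a stiff problem, which is the reason these schemes pay off against standard explicit methods.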
The industrial revolution, especially in the IR4.0 era, has driven the introduction of many state-of-the-art technologies.
The automotive industry, like many other key industries, has been greatly influenced as well. The rapid development of the automotive industry in Europe has created a wide industry gap between the European Union (EU) and developing countries, such as those in South East Asia (SEA). To address this situation, FH JOANNEUM (Austria), together with its European partners FH Aachen (Germany) and Politecnico di Torino (Italy), is taking the initiative to close the gap using the Erasmus+ Capacity Building in Higher Education grant from the EU. A consortium was founded to engage in automotive technology transfer, using the European framework, with Malaysian, Indonesian and Thai Higher Education Institutions (HEI) as well as with automotive industries in the respective countries. This is to be achieved by establishing an Engineering Knowledge Transfer Unit (EKTU) at each SEA institution, guided by the industry partners in the respective countries. These EKTUs could offer updated, innovative and high-quality training courses to increase graduates' employability, and strengthen relations between HEIs and the wider economic and social environment by addressing university-industry cooperation, which is the regional priority for Asia. The Capacity Building Initiative is expected to improve the quality of higher education and enhance its relevance for the labor market and society in the SEA partner countries. The outcome of this project would greatly benefit the partners through a strong and complementary partnership targeting the automotive industry and through enhanced larger-scale international cooperation between the European and SEA partners. It would also prepare the SEA HEIs for sustainable partnerships with the automotive industry in the region as a means of income generation in the future.
The Rothman–Woodroofe symmetry test statistic is revisited for independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied to testing symmetry in a meta-analysis random effects model, and the consistency of the procedure in this situation is discussed as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
The Atmospheric Remote-Sensing Infrared Exoplanet Large-survey, ARIEL, has been selected as the next (M4) medium-class space mission in the ESA Cosmic Vision programme. From launch in 2028, and during the following four years of operation, ARIEL will perform precise spectroscopy of the atmospheres of ~1000 known transiting exoplanets using its metre-class telescope. A three-band photometer and three spectrometers cover the 0.5 µm to 7.8 µm region of the electromagnetic spectrum. This paper gives an overview of the mission payload, including the telescope assembly, the Fine Guidance System (FGS), which provides both pointing information to the spacecraft and scientific photometry and low-resolution spectrometer data, the ARIEL InfraRed Spectrometer (AIRS), and other payload infrastructure such as the warm electronics, structures and cryogenic cooling systems.
We discuss the problem of testing homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. A new resampling approach turns out to be appropriate for the approximation of the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite-sample case. As an illustration, we apply the test to real data sets.
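As a minimal illustration of the distance involved, one can apply SciPy's two-sample Cramér–von Mises criterion to the components remaining after MCAR deletion. This is only a sketch: SciPy's test treats the two samples as independent, whereas the paper's resampling approach accounts for the pairing.

```python
import numpy as np
from scipy.stats import cramervonmises_2samp

rng = np.random.default_rng(1)
n = 300
u = rng.normal(size=n)
x, y = u, -u                      # paired components with identical marginals
# simulate components missing completely at random (MCAR): drop ~20% each
x_obs = x[rng.random(n) > 0.2]
y_obs = y[rng.random(n) > 0.2]
# two-sample Cramér-von Mises distance on the remaining data
res = cramervonmises_2samp(x_obs, y_obs)
```

Under homogeneity of the marginals, as constructed here, the statistic stays small; a shift or scale change in one margin would inflate it.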
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
Twee Kanten van één Medaille
(2020)
Elastic transmission eigenvalues and their computation via the method of fundamental solutions
(2020)
A stabilized version of the method of fundamental solutions designed to catch ill-conditioning effects is investigated, with a focus on the computation of complex-valued elastic interior transmission eigenvalues in two dimensions for homogeneous and isotropic media. Its algorithm can be implemented very compactly and adapts to many similar eigenproblems based on partial differential equations, as long as the underlying fundamental solution can be easily generated. We develop a corroborative approximation analysis, which also implies new basic results for transmission eigenfunctions, and present numerical examples which together demonstrate the feasibility of our eigenvalue recovery approach.
In this paper we present SMART-FACTORY, a setup for a research and teaching facility in industrial robotics based on the RoboCup Logistics League. It is driven by the need to develop and apply solutions for digital production. Digitization receives constantly increasing attention in many areas, especially in industry. The common theme is to make things smart by using intelligent computer technology. Particularly in the last decade there have been many attempts to improve existing processes in factories, for example in production logistics, also by deploying cyber-physical systems. An initiative that explores challenges and opportunities for robots in such a setting is the RoboCup Logistics League. Since its foundation in 2012, it has been an international effort for research and education in an intra-warehouse logistics scenario. During seven years of competition, a great deal of knowledge and experience regarding autonomous robots has been gained. This knowledge and experience shall provide the basis for further research into the challenges of future production. The focus of our SMART-FACTORY is to create a stimulating environment for research on logistics robotics, for teaching activities in computer science and electrical engineering programmes, and for industrial users to study and explore the feasibility of future technologies. Building on a very successful history in the RoboCup Logistics League, we aim to provide stakeholders with a dedicated facility oriented to their individual needs.
Stahlbau 2
(2020)
Innovative breeds of sugar cane yield up to 2.5 times as much organic matter as conventional breeds, resulting in a great potential for biogas production. The use of biogas production as a complementary solution to conventional and second-generation ethanol production in Brazil may increase the energy produced per hectare in the sugarcane sector. Herein, it was demonstrated that through ensiling, energy cane can be conserved for six months; the stored cane can then be fed into a continuous biogas process. This approach is necessary to achieve year-round biogas production at an industrial scale. Batch tests revealed specific biogas potentials between 400 and 600 LN/kgVS for both the ensiled and non-ensiled energy cane, and the specific biogas potential of a continuous biogas process fed with ensiled energy cane was in the same range. Peak biogas losses through ensiling of up to 27% after six months were observed. Compared with second-generation ethanol production using energy cane, the results indicated that biogas production from energy cane may lead to higher energy yields per hectare, with an average energy yield of up to 162 MWh/ha. Finally, the Farm²CBG concept is introduced, presenting an approach for decentralized biogas production.
In this article, a concept of implicit methods for scalar conservation laws in one or more spatial dimensions, also allowing for source terms of various types, is presented. This material is a significant extension of previous work of the first author (Breuß, SIAM J. Numer. Anal. 43(3), 970–986, 2005). Implicit notions are developed that are centered around a monotonicity criterion. We demonstrate a connection between a numerical scheme and a discrete entropy inequality, which is based on a classical approach by Crandall and Majda. Additionally, three implicit methods are investigated using the developed notions. Next, we conduct a convergence proof which is not based on a classical compactness argument. Finally, the theoretical results are confirmed by various numerical tests.
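A minimal sketch of one implicit monotone scheme of this kind is implicit upwinding for the Burgers flux f(u) = u²/2 with nonnegative data, solved cell by cell with scalar Newton iterations. The discretization details here are our own illustrative choices, not the specific methods analyzed in the article:

```python
import numpy as np

def implicit_upwind_burgers(u0, lam, steps):
    """Implicit upwind scheme for u_t + (u^2/2)_x = 0 with u >= 0:
        u_i^{n+1} + lam*f(u_i^{n+1}) = u_i^n + lam*f(u_{i-1}^{n+1}),
    lam = dt/dx.  The scheme is monotone for every lam > 0, so it stays
    stable (and TVD) far beyond the explicit CFL restriction."""
    f = lambda v: 0.5 * v * v
    u = u0.astype(float).copy()
    for _ in range(steps):
        un = u.copy()
        for i in range(1, len(u)):               # sweep left to right
            rhs = un[i] + lam * f(u[i - 1])
            v = un[i]
            for _ in range(50):                  # Newton for v + lam*f(v) = rhs
                dv = (v + lam * f(v) - rhs) / (1.0 + lam * v)
                v -= dv
                if abs(dv) < 1e-14:
                    break
            u[i] = v
        # left boundary cell kept fixed (inflow)
    return u

u0 = np.where(np.arange(100) < 50, 1.0, 0.0)         # Riemann data: a shock
u = implicit_upwind_burgers(u0, lam=2.0, steps=20)   # CFL number ~2
```

Because the implicit coupling runs in the upwind direction only, each time step reduces to a sequence of scalar nonlinear equations instead of one large nonlinear system.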
The established Hoeffding–Blum–Kiefer–Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
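The classical criterion itself can be computed directly. The following is a sketch of the textbook i.i.d. statistic; the paper's non-identically-distributed setting and meta-regression application are not reproduced here:

```python
import numpy as np

def hbkr_statistic(x, y):
    """Hoeffding-Blum-Kiefer-Rosenblatt criterion: n times the mean squared
    difference between the joint empirical cdf and the product of the two
    marginal empirical cdfs, evaluated at the sample points."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    le_x = x[None, :] <= x[:, None]       # le_x[i, j]: x_j <= x_i
    le_y = y[None, :] <= y[:, None]
    Fxy = np.mean(le_x & le_y, axis=1)    # joint ecdf at (x_i, y_i)
    Fx = np.mean(le_x, axis=1)            # marginal ecdfs
    Fy = np.mean(le_y, axis=1)
    return n * np.mean((Fxy - Fx * Fy) ** 2)

rng = np.random.default_rng(0)
a = rng.normal(size=200)
b = rng.normal(size=200)                  # independent of a
# dependence (here the extreme case y = x) inflates the criterion far above
# the independent case, which stays near its distribution-free null level
assert hbkr_statistic(a, a) > hbkr_statistic(a, b)
```

The distribution-freeness under the null is what makes the same limiting critical values usable in the paper's extended setting.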
Interior transmission eigenvalue problems for the Helmholtz equation play an important role in inverse wave scattering. Some distribution properties of those eigenvalues in the complex plane are reviewed. Further, a new scattering model for the interior transmission eigenvalue problem with mixed boundary conditions is described and an efficient algorithm for computing the interior transmission eigenvalues is proposed. Finally, extensive numerical results for a variety of two-dimensional scatterers are presented to show the validity of the proposed scheme.
We present new numerical results for shape optimization problems of interior Neumann eigenvalues. This field is not well understood from a theoretical standpoint. The existence of shape maximizers is not proven beyond the first two eigenvalues, so we study the problem numerically. We describe a method to compute the eigenvalues for a given shape that combines the boundary element method with an algorithm for nonlinear eigenvalues. As numerical optimization requires many such evaluations, we put a focus on the efficiency of the method and the implemented routine. The method is well suited for parallelization. Using the resulting fast routines and a specialized parametrization of the shapes, we found improved maxima for several eigenvalues.
This publication presents the current state of research on the rebound effect. First, a systematic literature review is carried out to outline current scientific models and theories. Research Question 1 follows with a mathematical introduction of the rebound effect, which shows the interdependence of consumer behaviour, technological progress, and the effects interwoven with both. The research field is then analysed for gaps and limitations by means of a systematic literature review. To ensure quantitative and qualitative results, a review protocol is used that integrates two different stages and covers all relevant publications released between 2000 and 2019. Accordingly, 392 publications were identified that deal with the rebound effect. These papers were reviewed to obtain relevant information on the two research questions. The literature review shows that research on the rebound effect is not yet comprehensive and focuses mainly on the effect itself rather than on solutions to avoid it. Regarding Research Question 2, the main gap, and thus limitation, is that little research has yet been published on the actual avoidance of the rebound effect. This is a major limitation for practical application by decision-makers and politicians. Therefore, a theoretical analysis was carried out to identify potential theories and ideas for avoiding the rebound effect. The most obvious candidate is the theory of a Steady-State Economy (SSE), which is described and reviewed.
Limited IT resources, missing software interfaces, or an outdated and complex legacy system landscape often slow down the automation of business processes. Robotic Process Automation (RPA) is a promising method for automating business processes at the user-interface level, without major system interventions, and for eliminating media discontinuities. Selecting suitable processes is decisive for the success of RPA projects. This article provides selection criteria for this purpose, derived from a qualitative content analysis of eleven interviews with RPA experts from the insurance sector. The result comprises a weighted list of seven dimensions and 51 process criteria that favor automation with software robots, or whose non-fulfillment complicates or even prevents implementation. The three most important criteria for selecting business processes for automation by means of RPA are relieving the employees involved in the process (employee overload), the executability of the process by means of rules (rule-based process control), and a positive cost-benefit comparison. Practitioners can use these criteria to make a systematic selection of RPA-relevant processes. From a scientific perspective, the results provide a basis for explaining the success and failure of RPA projects.
The construction work on the company's own office and production building, due as part of a business handover, offered ideal conditions for applying a space-creating outer skin. With the prefabricated, free-standing oak facade, a previously largely functional building was redesigned in a way that preserved its substance while making it visually more appealing.
The development of the facade system aimed to maximize the possible durability of wood under direct weathering. At the same time, it is intended to show that long-lasting constructions can be realized through well-thought-out approaches to constructive wood protection and the choice of a suitable wood species.
Räumliche Transformation
(2020)
The successful implementation and continuous development of sustainable corporate-level solutions is a challenge. These are endeavours in which social, environmental, and financial aspects must be weighed against each other; they can prove difficult to handle and, in some cases, almost unrealistic. Concepts such as green controlling, green IT, and green manufacturing look promising and are constantly evolving. This paper aims to achieve a better understanding of the field of corporate sustainability (CS). It evaluates the hypothesis that corporate sustainability thrives by being efficient, increasing performance, and raising the value enterprises derive from the resources they use. On the surface, this could seem to contradict the common understanding that CS encourages reducing the heavy reliance on natural resources and the overall environmental impact and, above all, protecting those resources. To understand how this seemingly contradictory notion of CS came about, this part of the paper places emphasis on providing useful insight in this regard. The first part of the paper summarizes various definitions, organizational theories, and measures used for CS and its derivatives such as green controlling, IT, and manufacturing. Second, a case study is given that combines the aforementioned sustainability models. In addition to evaluating the hypothesis, the overarching objective of this paper is to demonstrate the use of green controlling, IT, and manufacturing in the corporate sector. Furthermore, this paper outlines the current challenges and possible directions for CS in the future.
Rapid development of virtualization and data acquisition technology makes Digital Twin (DT) technology one of the fundamental areas of research, and DT is one of the most promising developments for the achievement of Industry 4.0. 48 percent of organisations implementing the Internet of Things are already using DT or plan to use it in 2020. The global market for DT is expected to grow by 38 percent annually, reaching USD 16 billion by 2023. In addition, the number of organisations using digital twins is expected to triple by 2022. DTs are characterised by the integration of physical and virtual spaces; the driving idea is to develop, test and build devices in a virtual environment. The objective of this paper is to study the impact of DT in the automotive industry on the new marketing logic. The paper outlines the current challenges and possible directions for the future of DT in marketing, and will be helpful for managers in the industry seeking to use the advantages and potentials of DT.
As researchers continue to seek the expansion of the material base for additive manufacturing, there is a need to focus attention on the Ni–Cu group of alloys, which conventionally has wide industrial applications. In this work, the G-NiCu30Nb casting alloy, a variant of the Monel family of alloys with Nb and high Si content, is for the first time processed via the laser powder bed fusion (LPBF) process. As the alloy is novel to LPBF, optimum LPBF parameters were determined, and hardness and tensile tests were performed in the as-built condition and after heat treatment at 1000 °C. Microstructures of the as-cast and the as-built condition were compared. Highly dense samples (99.8% density) were achieved by varying hatch distance (80 µm and 140 µm) and scanning speed (550 mm/s–1500 mm/s). There was no significant difference in microhardness between the print sets with varied hatch distance. The microhardness of the as-built condition (247 HV0.2) exceeded that of the as-cast condition (179 HV0.2). Tensile specimens built in vertical (V) and horizontal (H) orientations revealed a degree of anisotropy and were superior to conventionally reported values. Heat treatment increased ductility from 20% to 31% (V) and from 16% to 25% (H), while ultimate tensile strength (UTS) and yield strength (YS) were considerably reduced.
This paper uses a quantitative analysis to examine the interdependence and impact of resource rents on socio-economic development from 2002 to 2017. Nigeria and Norway have been chosen as reference countries due to their abundance of natural resources and similar economic performance, while their rankings in the Human Development Index differ dramatically. As the Human Development Index provides insight into a country's cultural and socio-economic characteristics and development in addition to economic indicators, it allows a comparison of the two countries. The hypothesis presented and discussed in this paper was researched before: a qualitative research approach was used in the author's master's thesis "The Human Development Index (HDI) as a Reflection of Resource Abundance (using Nigeria and Norway as a case study)" in 2018. The management of scarce resources is an important aspect in the development of modern countries and of those on the threshold of becoming industrialised nations. The effects of mistaken resource management are not only purely economic but also social and socio-economic in nature. From a holistic perspective, this paper finds that resource wealth in itself, when not managed or poorly managed, has a negative impact on socio-economic development and significantly reduces the productivity of a state's citizens.
For the years 2002 to 2017, this is expressed in particular in a negative correlation of GDP per capita and HDI value with the share, and respectively the size, of resources in a country's GDP.
Can we understand sketch sheets that show mixed systems of text and image elements as a spatial and temporal condensation of milieus of reflection? What is the effect of the simultaneous presence of text and image, conditioned by the spatial limits of the sheet, and which interactions unfold? These questions lead to the definition of 'multidimensional worksheets', which are understood as a suitable medium for analysing design thinking processes. Using five examples, it is described how decompositional procedures can make drawing genealogies visible that provide detailed information about design actions.
With the Digital Automatic Coupling, a new chapter of rail freight transport begins, in which assembled wagons make themselves ready for departure automatically within a few minutes, without human intervention. One of the greatest obstacles for environmentally friendly rail will then disappear. What is now needed is a discussion about the scope and system boundaries of the automatic brake test.
Neue Perspektiven für die Bahn in der Produktions- und Distributionslogistik durch Prozessautomation
(2020)
Background: Architectural representation, nurtured by the interaction between design thinking and design action, is inherently multi-layered. However, the representation object cannot always reflect these layers. It is therefore claimed that these reflections and layerings can gain visibility through 'performativity in personal knowledge', which has an essentially performative character. The specific layers of representation produced during this performativity permit insights into the 'personal way of designing' [1]. The question 'how can these layered drawings be decomposed to understand the personal way of designing' can thus be defined as the starting point of the study. Performativity in personal knowledge in architectural design is treated here through the relationship between explicit and tacit knowledge and between representational and non-representational theory. To discuss the practical dimension of these theoretical relations, Zvi Hecker's drawing of the Heinz-Galinski-School is examined as an example. The study aims to understand the relationships between the layers by decomposing a layered drawing analytically in order to exemplify personal ways of designing.
Methods: The study is based on qualitative research methodologies. First, a model has been formed through theoretical readings to discuss the performativity in personal knowledge. This model is used to understand the layered representations and to research the personal way of designing. Thus, one drawing of Hecker’s Heinz-Galinski-School project is chosen. Second, its layers are decomposed to detect and analyze diverse objects, which hint to different types of design tools and their application. Third, Zvi Hecker’s statements of the design process are explained through the interview data [2] and other sources. The obtained data are compared with each other.
Results: By decomposing the drawing, eleven layers are defined. These layers are used to understand the relation between the design idea and its representation. They can also be thought of as a reading system. In other words, a method to discuss Hecker’s performativity in personal knowledge is developed. Furthermore, the layers and their interconnections are described in relation to Zvi Hecker’s personal way of designing.
Conclusions: It can be said that layered representations, which are associated with the multilayered structure of performativity in personal knowledge, form the personal way of designing.
Objective
In local SAR compression algorithms, the overestimation is generally not linearly dependent on actual local SAR. This can lead to large relative overestimation at low actual SAR values, unnecessarily constraining transmit array performance.
Method
Two strategies are proposed to reduce maximum relative overestimation for a given number of VOPs. The first strategy uses an overestimation matrix that roughly approximates actual local SAR; the second strategy uses a small set of pre-calculated VOPs as the overestimation term for the compression.
Result
Comparison with a previous method shows that for a given maximum relative overestimation the number of VOPs can be reduced by around 20% at the cost of a higher absolute overestimation at high actual local SAR values.
Conclusion
The proposed strategies outperform a previously published strategy and can improve the SAR compression where maximum relative overestimation constrains the performance of parallel transmission.
Mechano-pharmacological testing of L-type Ca²⁺ channel modulators via a human vascular CellDrum model
(2020)
Background/Aims: This study aimed to establish a precise and well-defined working model for assessing pharmaceutical effects on vascular smooth muscle cell monolayers in vitro. It describes various analysis techniques to determine the most suitable one for measuring the biomechanical impact of vasoactive agents using the CellDrum technology. Methods: The CellDrum technology was applied to analyse the biomechanical properties of confluent human aortic smooth muscle cells (haSMC) in monolayer. Cell-generated tension deviations in the range of a few N/m² are evaluated by the CellDrum technology. This study focuses on the dilative and contractive effects of L-type Ca²⁺ channel agonists and antagonists, respectively; we analyzed the effects of Bay K8644, nifedipine and verapamil. Three different measurement modes were developed and applied to determine the most appropriate analysis technique for the study purpose. These three operation modes are called "particular time mode" (PTM), "long-term mode" (LTM) and "real-time mode" (RTM). Results: It was possible to quantify the biomechanical response of haSMCs to the addition of vasoactive agents using the CellDrum technology. Upon supplementation of 100 nM Bay K8644, the tension increased by approximately 10.6% from the initial tension maximum, whereas treatment with nifedipine and verapamil caused a significant decrease in cellular tension: 10 nM nifedipine decreased the biomechanical stress by around 6.5% and 50 nM verapamil by 2.8%, compared to the initial tension maximum. Additionally, all tested measurement modes provide similar results while focusing on different analysis parameters. Conclusion: The CellDrum technology allows highly sensitive biomechanical stress measurements of cultured haSMC monolayers. The mechanical stress responses evoked by the application of vasoactive calcium channel modulators were quantified functionally (N/m²).
All tested operation modes yielded equivalent findings, with each mode offering operation-specific data analysis.
We present first results from a newly developed monitoring station for a closed-loop geothermal heat pump test installation at our campus, consisting of helix coils and plate heat exchangers as well as an ice-store system. More than 40 temperature sensors and several soil moisture content sensors are distributed around the system, allowing detailed monitoring under different operating conditions. In view of the modern development of renewable energies, along with the newer concepts known as the Internet of Things and Industry 4.0 (the high-tech strategy of the German government), we created a user-friendly web application that connects the things (sensors) with the open network (WWW). Besides other advantages, this allows continuous remote monitoring of the data from the numerous sensors at an arbitrary sampling rate. Based on the recorded data, we will also present first results from numerical simulations taking into account all relevant heat transport processes. The aim is to improve the understanding of these processes and their influence on the thermal behavior of shallow geothermal systems in the unsaturated zone. This will in turn facilitate the prediction of the performance of these systems and therefore yield an improvement in their dimensioning when designing a specific shallow geothermal installation.
As part of the transnational research project EDITOR, a parabolic trough collector (PTC) system with concrete thermal energy storage (C-TES) was installed and commissioned in Limassol, Cyprus. The system is located on the premises of the beverage manufacturer KEAN Soft Drinks Ltd., and its function is to supply process steam for the factory's pasteurisation process [1]. Depending on the factory's seasonally varying capacity for beverage production, the solar system delivers between 5 and 25% of the total steam demand. In combination with the C-TES, the solar plant can supply process steam on demand before sunrise or after sunset. Furthermore, the C-TES compensates for fluctuations of the PTC during the day under changing weather conditions. The parabolic trough collector as well as the control and oil handling unit were designed and manufactured by Protarget AG, Germany. The C-TES was designed and produced by CADE Soluciones de Ingeniería, S.L., Spain. The focus of this paper is the description of the operational experience with the PTC, C-TES and boiler during the commissioning and operation phase. Additionally, innovative optimisation measures are presented.