Refine
Year of publication
- 2018 (262)
Institute
- Fachbereich Medizintechnik und Technomathematik (68)
- Fachbereich Elektrotechnik und Informationstechnik (44)
- IfB - Institut für Bioengineering (41)
- INB - Institut für Nano- und Biotechnologien (25)
- Fachbereich Luft- und Raumfahrttechnik (24)
- Fachbereich Maschinenbau und Mechatronik (24)
- Fachbereich Chemie und Biotechnologie (22)
- Fachbereich Energietechnik (22)
- Fachbereich Wirtschaftswissenschaften (21)
- Fachbereich Bauingenieurwesen (16)
Document Type
- Article (127)
- Conference Proceeding (78)
- Part of a Book (31)
- Book (12)
- Working Paper (3)
- Conference: Meeting Abstract (2)
- Doctoral Thesis (2)
- Patent (2)
- Part of a Periodical (2)
- Conference Poster (1)
Keywords
- Datenschutz (2)
- Digitale Transformation (2)
- Energy efficiency (2)
- Engineering optimization (2)
- Literaturanalyse (2)
- MINLP (2)
- Pump System (2)
- Serious Game (2)
- Water (2)
- Actors (1)
- Agility (1)
- Antarctica (1)
- Awareness (1)
- Bahadur efficiency (1)
- Bioeconomy (1)
- Bioethanol (1)
- Biorefinery (1)
- Biorefinery definitions (1)
- Bladder (1)
- Booster Stations (1)
- Buffering Capacity (1)
- CDG (1)
- CEO career variety (1)
- Chance Constraint (1)
- Chemical imaging (1)
- Cloud Computing (1)
- Coat protein (1)
- Competence Developing Game (1)
- Conditions (1)
- Conductive boundary condition (1)
- Coverage probability (1)
- Cramér-von-Mises statistic (1)
- Datenschutzgrundverordnung (1)
- Datenschutzrecht (1)
- Design process (1)
- Dry surfaces (1)
- EBSCO Discovery Service (1)
- EU-DS-GVO (1)
- EUDSGVO (1)
- Engineering Application (1)
- Enterprise Architecture (1)
- Enzyme nanocarrier (1)
- Equivalence test (1)
- Field-effect device (1)
- Forschungsprozess (1)
- GOSSAMER-1 (1)
- Geschäftsprozessmanagement (1)
- Global optimization (1)
- Glucose biosensor (1)
- Glucose oxidase (1)
- Goodness-of-fit tests for uniformity (1)
- Growth modelling (1)
- IBM Watson Explorer (1)
- INODIS (1)
- IT-Sicherheit (1)
- Identitätsmanagement (1)
- Informationsgetriebene Geschäftsmodelle (1)
- Integrated empirical distribution (survival) function (1)
- Internet der Dinge (1)
- Introduction (1)
- Inverse scattering (1)
- Jupiter (1)
- Kernel density estimator (1)
- Lab-on-Chip (1)
- Latin Hypercube Sampling (1)
- Length of confidence intervals (1)
- Light-addressable potentiometric sensor (1)
- Lignocellulose feedstock (1)
- Literatur-analyse-prozess (1)
- Literaturdaten (1)
- Literature review (1)
- MASCOT (1)
- Manifestations (1)
- Mars (1)
- Mechanical simulation (1)
- Microbial adhesion (1)
- Minimum dissipation (1)
- Mixed-integer nonlinear problem (1)
- Monetarisierung (1)
- Multi-criteria optimization (1)
- Muscle fibers (1)
- Network (1)
- Numerical inversion of Laplace transforms (1)
- Paper recycling (1)
- Passive stretching (1)
- Pelvic floor dysfunction (1)
- Pelvic muscle (1)
- Pitman efficiency (1)
- Planetary exploration (1)
- Planning process (1)
- Player Types (1)
- Potentiometry (1)
- Process engineering (1)
- Process schemes (1)
- Projektbeispiele (1)
- Prozessautomatisierung (1)
- Qualitative Wertschöpfungsanalyse (1)
- RC frames (1)
- Reconstruction (1)
- Rehabilitation Technology and Prosthetics (1)
- Relative exploration orientation (1)
- Renewable resources (1)
- Research process (1)
- Sampling methods (1)
- Softwareroboter (1)
- Stochastic Programming (1)
- Story (1)
- Structure and Stages (1)
- Surface microorganisms (1)
- Surgical Navigation and Robotics (1)
- Swabbing (1)
- TMT composition (1)
- TMT structure (1)
- Technische Schutzmaßnahmen (1)
- Text Analytics (1)
- Text analytics (1)
- Text mining (1)
- Tobacco mosaic virus (TMV) (1)
- Tools (1)
- Transition (1)
- Transmission eigenvalues (1)
- Turbulence (1)
- Uncertainty (1)
- Ureter (1)
- Video Game (1)
- Water Distribution (1)
- Water Supply Networks (1)
- Wilcoxon tests (1)
- Wissenstransfer (1)
- achilles tendon (1)
- agile (1)
- business simulation (1)
- design of technical systems (1)
- earthquakes (1)
- energy absorption (1)
- energy dissipation (1)
- frequency mixing (1)
- functional data (1)
- habitability (1)
- huge dimensional data (1)
- ice moons (1)
- icy moons (1)
- in-plane and out-of-plane failure (1)
- legal obligations (1)
- life detection (1)
- magnetic beads (1)
- magnetic sensing (1)
- mathematical optimization (1)
- mechanical buffer (1)
- multiple NEA rendezvous (1)
- optimization (1)
- product liability (1)
- remote sensing (1)
- resilience (1)
- separable Hilbert space (1)
- slum classification (1)
- small spacecraft (1)
- solar sail (1)
- space missions (1)
- stiffness (1)
- superparamagnetic nanoparticles (1)
- tablet game (1)
- underwater vehicle (1)
- upper echelons theory (1)
- water supply design (1)
Manufacturing process simulation (MPS) has become increasingly important for the aviation and automotive industries. A highly competitive market requires the use of high-performance metals and composite materials in combination with reduced manufacturing cost and time, as well as a minimal time to market for a new product. However, such materials are expensive and require sophisticated manufacturing processes. An experience-based process and tooling design followed by lengthy trial-and-error optimization is no longer state of the art. Instead, simulation-aided tooling design is used ever more frequently. This paper provides an overview of the capabilities of MPS in the fields of sheet metal forming and prepreg autoclave manufacturing of composite parts, summarizing the resulting benefits for tooling design and manufacturing engineering. The simulation technology is explained briefly in order to present several simplification and optimization techniques for developing industrialized simulation approaches. Small case studies provide examples of efficient application on an industrial scale.
Rare event simulation to optimise maintenance intervals of safety critical redundant subsystems
(2018)
Digitalization denotes the use of large volumes of data, which will lead to a comprehensive interconnection of all areas of the economy and society (BMWi, 2015; similarly Köhler/Wollschläger, 2014: 79). It encompasses the capture of analogue information ("big data" in a narrow sense; e.g., O'Leary, 2013), its storage in a digital system (local storage, or "cloud computing" enabled by the continued evolution of the Internet; e.g., Hashem et al., 2015: 101), its analysis and interpretation, and its transfer to other systems (the "Internet of Things"; e.g., Ashton, 2009).
Multi-analyte biosensors may offer the opportunity to perform cost-effective and rapid analysis with reduced sample volume, as compared to electrochemical biosensing of each analyte individually. This work describes the development of an enzyme-based biosensor system for multi-parametric determination of four different organic acids. The biosensor array comprises five working electrodes for simultaneous sensing of ethanol, formate, d-lactate, and l-lactate, and an integrated counter electrode. Storage stability of the biosensor was evaluated under different conditions (stored at +4 °C in buffer solution and dry at −21 °C, +4 °C, and room temperature) over a period of 140 days. After repeated and regular application, the individual sensing electrodes exhibited the best stability when stored at −21 °C. Furthermore, measurements in silage samples (maize and sugarcane silage) were conducted with the portable biosensor system. Comparison with a conventional photometric technique demonstrated successful employment for rapid monitoring of complex media.
A field-effect biosensor employing tobacco mosaic virus (TMV) particles as scaffolds for enzyme immobilization is presented. Nanotubular TMV scaffolds allow a dense immobilization of precisely positioned enzymes with retained activity. To demonstrate feasibility of this new strategy, a penicillin sensor has been developed by coupling a penicillinase with virus particles as a model system. The developed field-effect penicillin biosensor consists of an Al-p-Si-SiO₂-Ta₂O₅-TMV structure and has been electrochemically characterized in buffer solutions containing different concentrations of penicillin G. In addition, the morphology of the biosensor surface with virus particles was characterized by scanning electron microscopy and atomic force microscopy methods. The sensors possessed a high penicillin sensitivity of ~ 92 mV/dec in a nearly-linear range from 0.1 mM to 10 mM, and a low detection limit of about 50 µM. The long-term stability of the penicillin biosensor was periodically tested over a time period of about one year without any significant loss of sensitivity. The biosensor has also been successfully applied for penicillin detection in bovine milk samples.
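As a worked example of the reported response characteristic, the ~92 mV/dec sensitivity can be read as a logarithmic one-point calibration. This is only an illustrative sketch: the 0.1 mM reference point and the perfectly log-linear behaviour are assumptions, not part of the study.

```python
import math

# One-point logarithmic calibration sketch based on the reported sensitivity
# of ~92 mV per decade over the nearly linear 0.1-10 mM range. The reference
# concentration is an assumption for illustration.
SLOPE_MV_PER_DEC = 92.0     # sensitivity reported above (mV per decade)
C_REF_MM = 0.1              # assumed reference concentration (mM)

def potential_shift_mv(c_mm):
    """Expected potential shift relative to the reference concentration."""
    return SLOPE_MV_PER_DEC * math.log10(c_mm / C_REF_MM)

def concentration_mm(shift_mv):
    """Invert the calibration to estimate a concentration from a reading."""
    return C_REF_MM * 10 ** (shift_mv / SLOPE_MV_PER_DEC)
```

For instance, moving from 0.1 mM to 10 mM (two decades) would shift the output by about 184 mV under this idealization.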
In this paper, the results of a techno-economic analysis of improved and optimized molten salt solar tower plants (MSSTP plants) are presented. The potential improvements analyzed include different receiver designs, different designs of the HTF-system and plant control, increased molten salt temperatures (up to 640 °C), and multi-tower systems. Detailed technological and economic models of the solar field, solar receiver, and high-temperature fluid system (HTF-system) were developed and used to identify potential improvements over a reference plant based on Solar Two technology and up-to-date cost estimates. The annual yield model calculates the annual outputs and the LCOE of all variants. An improved external tubular receiver combined with an improved HTF-system achieves a significant decrease in LCOE compared to the reference. This is caused by lower receiver cost as well as improvements to the HTF-system and plant operation strategy, which significantly reduce the plant's own power consumption. A novel star receiver shows potential for a further cost decrease. The cavity receiver concepts result in higher LCOE due to their high investment cost, despite achieving higher efficiencies. Increased molten salt temperatures appear feasible with an adapted, closed-loop HTF-system and achieve results comparable to the original improved system (at 565 °C) under the given boundary conditions. In this analysis, all multi-tower systems show lower economic viability than single-tower systems, caused by the high additional cost of piping connections and the higher cost of the receivers.
Kinematics and kinetics of handcycling propulsion at increasing workloads in able-bodied subjects
(2018)
In Paralympic sports, biomechanical optimisation of movements and equipment seems to be promising for improving performance. In handcycling, information about the biomechanics of this sport is mainly provided by case studies. The aim of the current study was (1) to examine changes in handcycling propulsion kinematics and kinetics due to increasing workloads and (2) identify parameters that are associated with peak aerobic performance. Twelve non-disabled male competitive triathletes without handcycling experience voluntarily participated in the study. They performed an initial familiarisation protocol and incremental step test until exhaustion in a recumbent racing handcycle that was attached to an ergometer. During the incremental test, tangential crank kinetics, 3D joint kinematics, blood lactate and ratings of perceived exertion (local and global) were identified. As a performance criterion, the maximal power output during the step test (Pmax) was calculated and correlated with biomechanical parameters. For higher workloads, an increase in crank torque was observed that was even more pronounced in the pull phase than in the push phase. Furthermore, participants showed an increase in shoulder internal rotation and abduction and a decrease in elbow flexion and retroversion. These changes were negatively correlated with performance. At high workloads, it seems that power output is more limited by the transition from pull to push phase than at low workloads. It is suggested that successful athletes demonstrate small alterations of their kinematic profile due to increasing workloads. Future studies should replicate and expand the test spectrum (sprint and continuous loads) as well as use methods like surface electromyography (sEMG) with elite handcyclists.
Magnetic detection structure for Lab-on-Chip applications based on the frequency mixing technique
(2018)
A magnetic frequency mixing technique with a set of miniaturized planar coils was investigated for use with a completely integrated Lab-on-Chip (LoC) pathogen sensing system. The system allows the detection and quantification of superparamagnetic beads. Additionally, in terms of magnetic nanoparticle characterization ability, the system can be used for immunoassays using the beads as markers. Analytical calculations and simulations for both excitation and pick-up coils are presented; the goal was to investigate the miniaturization of simple and cost-effective planar spiral coils. Following these calculations, a Printed Circuit Board (PCB) prototype was designed, manufactured, and tested for limit of detection, linear response, and validation of theoretical concepts. Using the magnetic frequency mixing technique, a limit of detection of 15 µg/mL of 20 nm core-sized nanoparticles was achieved without any shielding.
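The nonlinearity that the frequency mixing technique exploits can be illustrated numerically: a saturating, Langevin-like magnetization driven by two tones at f1 and f2 produces odd-order intermodulation components such as f1 + 2·f2, while even-order sums such as f1 + f2 stay absent. This is only a sketch of the principle; all amplitudes and frequencies are invented and the tanh model stands in for the actual particle response.

```python
import math

# Two excitation tones drive a saturating magnetization (modeled with tanh).
# The odd nonlinearity generates a mixing component at f1 + 2*f2; the
# even-order sum f1 + f2 remains absent. All numbers are illustrative.
f1, f2 = 40, 2                   # excitation frequencies (cycles per window)
n = 400                          # samples per window
t = [i / n for i in range(n)]
h = [0.5 * math.sin(2 * math.pi * f1 * ti)
     + 0.5 * math.sin(2 * math.pi * f2 * ti) for ti in t]
m = [math.tanh(hi) for hi in h]  # saturating magnetization response

def dft_mag(x, k):
    """Magnitude of the k-th DFT bin of a real signal x, normalized by N."""
    re = sum(xi * math.cos(2 * math.pi * k * i / len(x)) for i, xi in enumerate(x))
    im = sum(xi * math.sin(2 * math.pi * k * i / len(x)) for i, xi in enumerate(x))
    return math.hypot(re, im) / len(x)

mix = dft_mag(m, f1 + 2 * f2)    # mixing component: clearly present
even = dft_mag(m, f1 + f2)       # even-order sum: absent for an odd nonlinearity
```

A pick-up coil tuned to such mixing frequencies sees a signal only when superparamagnetic material is present, which is what makes the technique selective.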
Slot die coating is applied to deposit thin and homogenous films in roll-to-roll and sheet-to-sheet applications. The critical step in operation is to choose suitable process parameters within the process window. In this work, we investigate an upper limit for stripe coatings. This maximum film thickness is characterized by stripe merging which needs to be avoided in a stable process. It is shown that the upper limit reduces the process window for stripe coatings to a major extent. As a result, stripe coatings at large coating gaps and low viscosities are only possible for relatively thick films. Explaining the upper limit, a theory of balancing the side pressure in the gap region in the cross-web direction has been developed.
Ensuring access to water and sanitation for all is Goal No. 6 of the 17 UN Sustainable Development Goals to transform our world. As one step towards this goal, we present an approach that leverages remote sensing data to plan optimal water supply networks for informal urban settlements. The concept focuses on slums within large urban areas, which are often characterized by the lack of an appropriate water supply. We apply methods of mathematical optimization aiming to find a network describing the optimal supply infrastructure. In doing so, we choose between different decentralized and centralized approaches, combining supply by motorized vehicles with supply by pipe systems. For the purposes of illustration, we apply the approach to two small slum clusters in Dhaka and Dar es Salaam. We show our optimization results, which represent the lowest-cost water supply systems possible. Additionally, we compare the optimal solutions of the two clusters (also for varying input parameters, such as population densities and slum size development over time) and describe how the result of the optimization depends on the remote sensing input data.
The UN has set the goal of ensuring access to water and sanitation for all people by 2030. To address this goal, we present a multidisciplinary approach for designing water supply networks for slums in large cities by applying mathematical optimization. The problem is modeled as a mixed-integer linear problem (MILP) aiming to find a network describing the optimal supply infrastructure. To illustrate the approach, we apply it to a small slum cluster in Dhaka, Bangladesh.
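The combinatorial core of such a network design problem can be sketched in miniature: each demand node is either connected to a central source by pipe or served by motorized vehicles, and the cheapest combination of decisions is selected. All costs, distances, and demands below are invented for illustration, and a brute-force search stands in for a real MILP solver; the actual model additionally handles topology, capacities, and growth scenarios.

```python
from itertools import product

# Toy instance of the pipe-vs-vehicle supply decision. Numbers are invented.
nodes = {          # node: (distance to source in m, demand in m^3/day)
    "A": (120, 40),
    "B": (400, 15),
    "C": (60, 55),
}
PIPE_COST_PER_M = 5.0      # assumed annualized cost per metre of pipe
TRUCK_COST_PER_M3 = 12.0   # assumed cost per m^3 delivered by vehicle

def total_cost(assignment):
    """assignment maps node -> 'pipe' or 'truck'."""
    cost = 0.0
    for node, mode in assignment.items():
        dist, demand = nodes[node]
        cost += PIPE_COST_PER_M * dist if mode == "pipe" else TRUCK_COST_PER_M3 * demand
    return cost

# Enumerate all binary decisions (a real instance would use a MILP solver).
best = min(
    (dict(zip(nodes, modes)) for modes in product(["pipe", "truck"], repeat=len(nodes))),
    key=total_cost,
)
```

In this toy instance, the distant low-demand node is cheaper to serve by truck while the near high-demand node justifies a pipe, which is exactly the kind of trade-off the MILP formalizes.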
Retrofitting of existing parabolic trough collector power plants with molten salt tower systems
(2018)
Under the Renewable Energy Act, Germany plans to increase the share of renewable energy carriers to 60%. One of the main problems is the fluctuating supply of wind and solar energy. Here, biogas plants provide a solution, because a demand-driven supply is possible. Before running such a plant, it is necessary to simulate and optimize the process. This paper provides a new model of a biogas plant that is as accurate as the standard ADM1 model. The advantage over ADM1 is that it is based on only four parameters instead of 28. Applying this model, an optimization scheme was set up that allows demand-driven supply by biogas plants. Finally, the results are confirmed by several experiments and measurements on a real test plant.
Algal polysaccharides (extracellular polysaccharides) and carbon nanotubes (CNTs) were adsorbed on dioctadecyldimethylammonium bromide Langmuir monolayers to serve as a matrix for the incorporation of urease. The physicochemical properties of the supramolecular system as a monolayer at the air–water interface were investigated by surface pressure–area isotherms, surface potential–area isotherms, interfacial shear rheology, vibrational spectroscopy, and Brewster angle microscopy. The floating monolayers were transferred to hydrophilic solid supports, quartz, mica, or capacitive electrolyte–insulator–semiconductor (EIS) devices, through the Langmuir–Blodgett (LB) technique, forming mixed films, which were investigated by quartz crystal microbalance, fluorescence spectroscopy, and field emission gun scanning electron microscopy. The enzyme activity was studied with UV–vis spectroscopy, and the feasibility of the thin film as a urea sensor was essayed in an EIS sensor device. The presence of CNT in the enzyme–lipid LB film not only tuned the catalytic activity of urease but also helped to conserve its enzyme activity. Viability as a urease sensor was demonstrated with capacitance–voltage and constant capacitance measurements, exhibiting regular and distinctive output signals over all concentrations used in this work. These results are related to the synergism between the compounds on the active layer, leading to a surface morphology that allowed fast analyte diffusion owing to an adequate molecular accommodation, which also preserved the urease activity. This work demonstrates the feasibility of employing LB films composed of lipids, CNT, algal polysaccharides, and enzymes as EIS devices for biosensing applications.
The article presents an investigation of the seismic behaviour of a modern URM building located in the municipality of Finale Emilia in the province of Modena, Northern Italy. The building is situated at the centre of the 2012 Northern Italy earthquake series and did not suffer any damage during that series. The observed earthquake resistance of the building is compared with resistances predicted by linear and nonlinear design approaches according to the Eurocode. Furthermore, probabilistic analyses based on nonlinear calculation models, taking into account the scatter of the most relevant input parameters, are carried out to identify their influence on the results and to derive fragility curves.
In most beers, producers strive to minimize haze to maximize visual appeal. Detecting the formation of particulates requires a measurement system for sub-micron particles. Beer haze occurs naturally and is composed of protein or polyphenol particles; in their early stage of growth, these particles are smaller than 2 µm. Microscopy analysis is time- and resource-intensive; backscattering, by contrast, is an inexpensive option for detecting particle sizes of interest.
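A rough sketch of why backscattering is attractive for this task: in the Rayleigh regime (particles much smaller than the optical wavelength), scattered intensity grows with the sixth power of the particle diameter, so early particle growth produces a strong signal change. The 100 nm reference diameter below is an arbitrary choice for illustration.

```python
# Rayleigh-regime scaling sketch: scattered intensity ~ d^6 for particles
# well below the wavelength. The reference diameter is arbitrary.
def relative_rayleigh_intensity(d_nm, d_ref_nm=100.0):
    """Scattered intensity of a particle of diameter d relative to d_ref."""
    return (d_nm / d_ref_nm) ** 6

gain = relative_rayleigh_intensity(200.0)  # doubling the diameter
```

Doubling the diameter boosts the scattered signal 64-fold, which is what makes sub-micron growth detectable long before particles become visible.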
Heavy-duty trucks are one of the main contributors to greenhouse gas emissions in German traffic. Drivetrain electrification is an option to reduce tailpipe emissions by increasing energy conversion efficiency. To evaluate a vehicle's environmental impact, it is necessary to consider the entire life cycle: in addition to daily use, the impact of production and disposal must be included. This study presents a comparative life cycle analysis of a parallel hybrid and a conventional heavy-duty truck in long-haul operation. Assuming a uniform vehicle glider, only the differing parts of the two drivetrains are taken into account when calculating the environmental burdens of production. The use phase is modeled by a backward simulation in MATLAB/Simulink considering a characteristic driving cycle. A break-even analysis is conducted to show at what mileage the larger CO2eq emissions from producing the electric drivetrain are compensated. The effect of parameter variation on the break-even mileage is investigated by a sensitivity analysis. The results show that the difference in CO2eq/t km is negative: over its lifetime, the hybrid vehicle emits 4.34 g CO2eq/t km less than the diesel truck. The break-even analysis also emphasizes the advantages of the electrified drivetrain, which compensates for the larger production emissions after a distance of only 15,800 km (approx. 1.5 months of operation). The break-even point, in terms of distance and CO2eq, strongly depends on the fuel, the emissions from battery production, and the driving profile; nearly all parameter variations lead to an increase in break-even distance.
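The break-even logic can be sketched as follows: the hybrid starts with an emissions "debt" from producing the electric drivetrain and battery, and repays it at the reported rate of 4.34 g CO2eq per tonne-kilometre. The production-debt and payload figures below are assumptions for illustration, not values from the study.

```python
# Break-even sketch for the hybrid truck. Only the 4.34 g CO2eq/t km saving
# is taken from the text; payload and production debt are assumed.
saving_per_tkm_g = 4.34        # g CO2eq saved per tonne-kilometre (reported)
payload_t = 19.3               # assumed average payload in tonnes
extra_production_kg = 1300.0   # assumed extra production emissions, kg CO2eq

saving_per_km_g = saving_per_tkm_g * payload_t            # g CO2eq saved per km
break_even_km = extra_production_kg * 1000.0 / saving_per_km_g
```

With these assumed inputs the debt is repaid after roughly 15,000-16,000 km, the same order of magnitude as the study's reported 15,800 km.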
Monitoring of organic acids (OA) and volatile fatty acids (VFA) is crucial for the control of anaerobic digestion. In the case of unstable process conditions, an accumulation of these intermediates occurs. In the present work, two different enzyme-based biosensor arrays are combined and presented for facile electrochemical determination of several process-relevant analytes. Each biosensor utilizes a platinum sensor chip (14 × 14 mm²) with five individual working electrodes. The OA biosensor enables simultaneous measurement of ethanol, formate, d- and l-lactate, based on a bi-enzymatic detection principle. The second, VFA biosensor provides an amperometric platform for quantification of acetate and propionate, mediated by oxidation of hydrogen peroxide. The cross-sensitivity of both biosensors toward potential interferents, typically present in fermentation samples, was investigated. The potential for practical application in complex media was successfully demonstrated in spiked sludge samples collected from three different biogas plants. The results obtained by both biosensors were in good agreement with the applied reference measurements by photometry and gas chromatography, respectively. The proposed hybrid biosensor system was also used for long-term monitoring of a lab-scale biogas reactor (0.01 m³) over a period of 2 months. In combination with typically monitored parameters, such as gas quality, pH, and FOS/TAC (volatile organic acids/total inorganic carbonate), the amperometric measurements of OA and VFA concentrations could enhance the understanding of ongoing fermentation processes.
The battery is one of the truly central components of the electric vehicle. The series development and production of these batteries and the improvement of their performance will be decisive for the success of electromobility. However, the battery is not the only system specific to electric vehicles that must be newly developed, reconceived, or improved. The development of the new vehicle structure and of the electrified drivetrain are likewise part of this chapter. It also takes a look at the important topic of thermal management.
Urban mobility is changing, and new, innovative business models in particular will contribute a substantial part of the solution to future mobility needs. So-called "shared mobility" is currently regarded, alongside drivetrain electrification and autonomous vehicle technologies, as one of the most important trend topics in the automotive industry. New mobility services increasingly call for new vehicle concepts.
Urban mobility concepts of the future require new forms of enterprise, ideally combining the old economy and the new economy, as well as close ties to socially relevant futures research. For new carsharing vehicle concepts, this means that all cost-driving factors must be recorded and analyzed. FH Aachen, share2drive, and FEV give an outlook on the future vehicle class of personal public vehicles as a "rolling device".
In this work, we report on our attempt to design and implement an early introduction to basic robotics principles for children of kindergarten age. One of the main challenges of this effort is to explain complex robotics content in such a way that pre-school children can follow the basic principles and ideas using examples from their world of experience. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that a quiz at the end of the lesson is conducted using robots as well. The humanoid robot Pepper from Softbank, an excellent platform for human-robot interaction experiments, was used to present a lecture on robotics by reading out the content to the children using its speech synthesis capability. A quiz in Runaround-game-show style after the lecture prompted the children to recap what they had learned about how mobile robots work in principle. In this quiz, two LEGO Mindstorms EV3 robots were used to implement a strongly interactive scenario. Beyond the thrill of being exposed to a mobile robot that reacted to them, the children were very excited and at the same time highly concentrated. We received very positive feedback from the children as well as from their educators. To the best of our knowledge, this is one of only a few attempts to use a robot like Pepper not as a tele-teaching tool but as the teacher itself in order to engage pre-school children with complex robotics content.
If innovative, automated freight wagons are to unlock economic advantages, the migration to the new system must be carried out in sensible incremental steps, taking organizational and operational compatibility into account. A stepwise migration with retrofittability and compatibility can provide the optimal equipment variant for the different operating scenarios as well as an increase in the economic efficiency of the overall system.
Often, research results from collaboration projects are not transferred into productive environments, even though the approaches are proven to work in demonstration prototypes. These demonstration prototypes are usually too fragile and error-prone to be transferred easily into productive environments; a lot of additional work is required. Inspired by the idea of an incremental delivery process, we introduce an architecture pattern which combines the approach of Metrics Driven Research Collaboration with microservices for ease of integration. It enables keeping track of project goals over the course of the collaboration while every party may focus on their expert skills: researchers may focus on complex algorithms, practitioners may focus on their business goals. Through the simplified integration, (intermediate) research results can be introduced into a productive environment, which enables early user feedback and allows for the early evaluation of different approaches. The practitioners' business model benefits throughout the full project duration.
Seismic design of buried pipeline systems for energy and water supply is important not only for plant and operational safety but also for maintaining the supply infrastructure after an earthquake. The present paper discusses particular issues of seismic wave impact on buried pipelines, describes calculation methods, proposes approaches, and gives calculation examples. It considers the effects of transient displacement differences and the resulting stresses within the pipeline due to the wave propagation of the earthquake. The presented model can, however, also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of parameters such as the incoming wave angle, wave velocity, backfill height, and synthetic displacement time histories. The interaction between the pipeline and the surrounding soil is modeled with nonlinear soil springs, and the propagating wave is simulated acting on the pipeline pointwise, independently in time and space. Special attention is given to long-distance heat pipeline systems. Here, expansion bends are arranged at regular intervals to accommodate movements of the pipeline due to high temperatures. Such expansion bends are usually designed with small bending radii, which during an earthquake lead to high bending stresses in the cross-section of the pipeline. Finally, an interpretation of the results and recommendations for the most critical parameters are given.
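The nonlinear soil springs mentioned above are commonly idealized as elastic-perfectly-plastic ("bilinear") springs: force grows linearly with relative pipe-soil displacement until the soil's ultimate resistance is reached. This is a minimal sketch of that idealization; the stiffness and ultimate-resistance values are invented, not taken from the paper.

```python
# Bilinear (elastic-perfectly-plastic) soil spring sketch. Values invented.
K_SOIL = 2.0e6      # assumed initial spring stiffness in N/m
F_ULT = 4.0e4       # assumed ultimate soil resistance in N

def soil_spring_force(u_m):
    """Spring force for a relative pipe-soil displacement u (in metres)."""
    f = K_SOIL * u_m
    return max(-F_ULT, min(F_ULT, f))   # clip at the soil's ultimate resistance
```

In a finite element model, one such spring is attached at each pipe node, so that small displacements load the pipe elastically while large ground motions let the soil yield and slide.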
Nutzen und Rahmenbedingungen informationsgetriebener Geschäftsmodelle des Internets der Dinge
(2018)
In the context of advancing digitalization, the Internet of Things (IoT) is regarded as a technological driver through which completely new business models can emerge from the interplay of different actors. Identified key actors include traditional industrial companies, municipalities, and telecommunications companies. The latter, by providing connectivity, ensure that small devices with tiny batteries can be connected to the Internet almost anywhere and directly. Many IoT use cases that simplify life for end customers are already on the market, such as Philips Hue Tap. Beyond connectivity-based business models, there is great potential for information-driven business models, which can support and further develop existing business models. One example is the IoT use case Park and Joy of Deutsche Telekom AG, in which parking spaces are networked by means of sensors and drivers are informed in real time about available spaces. Information-driven business models can build on data generated in IoT use cases. For example, a telecommunications company can create added value by deriving decision-relevant information, so-called insights, from data and using them to increase decision-making agility. Insights can also be monetized. The monetization of insights can only be sustainable if it is handled carefully and the relevant framework conditions are taken into account. This chapter explains the concept of information-driven business models and illustrates it with the concrete Park and Joy use case. In addition, benefits, risks, and framework conditions are discussed.
This paper presents NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which improves the annotation process significantly by providing a well-designed GUI and a sophisticated way of using input devices. Due to NLPf's properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Sleep scoring is a necessary and time-consuming task in sleep studies. In animal models (such as mice) or in humans, automating this tedious process promises to facilitate long-term studies and to promote sleep biology as a data-driven field. We introduce a deep neural network model that is able to predict different states of consciousness (Wake, Non-REM, REM) in mice from EEG and EMG recordings, with excellent scoring results for out-of-sample data. Predictions are made on epochs of 4 seconds length, and epochs are classified as artifact-free or not. The model architecture draws on recent advances in deep learning and convolutional neural network research. In contrast to previous approaches to automated sleep scoring, our model does not rely on manually defined features of the data but learns predictive features automatically. We expect deep learning models like ours to become widely applied in different fields, automating many repetitive cognitive tasks that were previously difficult to tackle.
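The data flow described above — slicing recordings into 4-second epochs and classifying each epoch into Wake / Non-REM / REM — can be sketched as follows. The tiny, untrained convolution-plus-softmax classifier below is a placeholder to show the pipeline shape only; the paper's actual deep network, its training, and its learned features are not reproduced, and the sampling rate is assumed.

```python
import math
import random

# Epoch slicing plus a toy 1-D conv + softmax classifier over three states.
# The classifier is randomly initialized and untrained: illustration only.
FS = 64                      # assumed sampling rate in Hz
EPOCH_S = 4                  # epoch length used in the paper (seconds)
STATES = ["Wake", "Non-REM", "REM"]

def epochs(signal, fs=FS, seconds=EPOCH_S):
    """Split a 1-D recording into non-overlapping fixed-length epochs."""
    step = fs * seconds
    return [signal[i:i + step] for i in range(0, len(signal) - step + 1, step)]

random.seed(0)
kernel = [random.gauss(0, 1) for _ in range(9)]                 # 1-D conv filter
weights = [[random.gauss(0, 1) for _ in range(3)] for _ in STATES]

def classify(epoch):
    """Conv -> crude pooling -> linear layer -> softmax -> argmax state."""
    conv = [sum(k * epoch[i + j] for j, k in enumerate(kernel))
            for i in range(len(epoch) - len(kernel) + 1)]
    feats = [max(conv), sum(conv) / len(conv), min(conv)]       # pooled features
    logits = [sum(w * f for w, f in zip(row, feats)) for row in weights]
    z = [math.exp(l - max(logits)) for l in logits]
    probs = [zi / sum(z) for zi in z]
    return STATES[probs.index(max(probs))]

signal = [math.sin(i / 5) for i in range(FS * 12)]   # 12 s of synthetic "EEG"
labels = [classify(e) for e in epochs(signal)]        # one label per 4-s epoch
```

A trained model would replace the random parameters with learned ones, but the epoch-in, label-out interface stays the same.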
Enzyme und Biosensorik
(2018)
Enzyme-based biosensors have enjoyed a prospering growth market for more than five decades and are increasingly being used in biotechnological processes as well. Starting from the notion of a sensor and typical performance characteristics of biosensors (Section 18.1), this chapter introduces electrochemical enzyme biosensors and discusses their typical fields of application (Section 18.2). A look beyond the field's own boundaries reveals alternative transducer principles (Section 18.3), and the chapter concludes with an introduction to current research trends (Section 18.4).
Against the background of growing data volumes in everyday life, data processing tools are becoming more powerful to deal with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the increased data absorbs time through collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer it. This paper provides historic and contemporary models to visualize the architectural planning process and introduces means to describe today's situation, consisting of stakeholders, events and instruments. It explains conceptions from the Renaissance in contrast to models used in the second half of the 20th century. Contemporary models are discussed regarding their value against the background of increasing computation in the building process.
Highly competitive markets paired with tremendous production volumes demand particularly cost-efficient products. The use of common parts and modules across product families can potentially reduce production costs. Yet increasing commonality typically results in overdesign of individual products. Multi-domain virtual prototyping enables designers to evaluate the costs and technical feasibility of different single-product designs at reasonable computational effort in early design phases. However, savings through platform commonality are hard to quantify and require detailed knowledge of, e.g., the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost-optimal design of single products.
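At its core, the trade-off exploration mentioned in this abstract amounts to collecting the non-dominated (cost, overdesign) design points. A minimal Pareto filter over synthetic candidate designs might look like the sketch below; the designs and objective values are invented for illustration, and the paper's actual algorithm is metamodel-based and considerably more involved.

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated points when both objectives are minimized."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if another point is <= in both objectives, < in one
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(tuple(p))
    return sorted(keep)

# objective 1: total production cost; objective 2: overdesign penalty
# (less commonality -> less overdesign but higher cost), synthetic values
designs = [(100, 9), (90, 12), (80, 16), (95, 10), (85, 20), (70, 25)]
front = pareto_front(designs)
print(front)
```

The brute-force pairwise check is O(n²) and fine for a handful of candidate designs; for large design spaces one would switch to a sort-based non-dominated filter.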
In industrial applications, the costs for the operation and maintenance of a pump system typically far exceed its purchase price. To find an optimal pump configuration which minimizes not only investment but life-cycle costs, methods like Technical Operations Research, which is based on Mixed-Integer Programming, can be applied. However, during the planning phase the designer is often faced with uncertain input data; e.g., future load demands can only be estimated. In this work, we deal with this uncertainty by developing a chance-constrained two-stage (CCTS) stochastic program. The design and operation of a booster station working under uncertain load demand are optimized to minimize total cost, including purchase price, operation cost incurred by energy consumption, and penalty cost resulting from water shortage. We find optimized system layouts using a sample average approximation (SAA) algorithm and analyze the results for different risk levels of water shortage. By adjusting the risk level, the costs and performance range of the system can be balanced, and thus the system's resilience can be engineered.
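The sample average approximation idea behind this abstract can be sketched in a few lines: draw demand samples, keep only capacities whose empirical shortage probability stays below the chosen risk level, and pick the cheapest feasible one. All numbers below (demand distribution, prices, penalty, risk levels) are made-up illustrations, not data from the study, and the real CCTS program optimizes a full system layout rather than a single capacity.

```python
import numpy as np

rng = np.random.default_rng(42)
demand = rng.normal(100.0, 15.0, size=5000)   # sampled future load (arbitrary units)

PRICE_PER_CAP = 10.0      # purchase cost per unit of capacity (assumed)
PENALTY = 500.0           # penalty per unit of expected shortage (assumed)

def saa_cost(cap, samples):
    """Empirical total cost: purchase price + expected shortage penalty."""
    shortage = np.maximum(samples - cap, 0.0)
    return PRICE_PER_CAP * cap + PENALTY * shortage.mean()

def optimize(samples, risk, caps=np.arange(50.0, 200.0, 1.0)):
    """Cheapest capacity whose empirical shortage probability is <= risk."""
    best = None
    for cap in caps:
        # empirical chance constraint: P(demand > cap) <= risk
        if np.mean(samples > cap) <= risk:
            c = saa_cost(cap, samples)
            if best is None or c < best[1]:
                best = (cap, c)
    return best

for risk in (0.10, 0.05, 0.01):
    cap, cost = optimize(demand, risk)
    print(f"risk {risk:.2f}: capacity {cap:.0f}, cost {cost:.1f}")
```

Tightening the risk level shrinks the feasible set, so the optimized capacity (and usually the cost) grows; this is exactly the cost-versus-resilience dial the abstract describes.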
The Kremer-Grest (KG) bead-spring model is a near-standard in Molecular Dynamics simulations of generic polymer properties. It owes its popularity to its computational efficiency rather than its ability to represent specific polymer species and conditions. Here we investigate how to adapt the model to match the universal properties of a wide range of chemical polymer species. For this purpose, we vary a single parameter originally introduced by Faller and Müller-Plathe: the chain stiffness. Examples include polystyrene, polyethylene, polypropylene, cis-polyisoprene, polydimethylsiloxane, polyethylene oxide and styrene-butadiene rubber. We do this by matching the number of Kuhn segments per chain and the number of Kuhn segments per cubic Kuhn volume for the polymer species and for the Kremer-Grest model. We also derive mapping relations for converting KG model units back to physical units; in particular, we obtain the entanglement time for the KG model as a function of stiffness, allowing for a time mapping. To test these relations, we generate large equilibrated, well-entangled polymer melts and measure the entanglement moduli using a static primitive-path analysis of the entangled melt structure as well as simulations of step-strain deformation of the model melts. The moduli obtained for our model polymer melts are in good agreement with the experimentally expected moduli.
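The two matching targets named in this abstract, Kuhn segments per chain and Kuhn segments per cubic Kuhn volume, follow directly from tabulated chemical parameters. The polystyrene-like numbers below (Kuhn length, Kuhn molar mass, melt density, chain mass) are rough textbook-style values used purely for illustration and are not taken from the paper.

```python
# Map a chemical polymer onto the Kuhn-level descriptors used to
# parameterize the Kremer-Grest model.
N_A = 6.022e23            # Avogadro's number, 1/mol

# Illustrative polystyrene-like parameters (assumed, order-of-magnitude)
b_kuhn = 1.8e-7           # Kuhn length, cm (i.e. 1.8 nm)
M_kuhn = 720.0            # molar mass of one Kuhn segment, g/mol
rho = 1.05                # melt density, g/cm^3
M_chain = 100_000.0       # molar mass of one chain, g/mol

# number of Kuhn segments per chain
N_K = M_chain / M_kuhn

# number density of Kuhn segments (1/cm^3), then count per Kuhn volume b^3
n_density = rho * N_A / M_kuhn
n_K = n_density * b_kuhn**3

print(f"Kuhn segments per chain:       {N_K:.1f}")
print(f"Kuhn segments per Kuhn volume: {n_K:.2f}")
```

Choosing the KG chain length and stiffness so that the bead-spring melt reproduces these two dimensionless numbers is the matching step the abstract describes; the unit- and time-mapping relations then convert KG results back to physical units.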