Benefits and Framework Conditions of Information-Driven Business Models of the Internet of Things
(2018)
In the context of increasing digitalization, the Internet of Things (IoT) is regarded as a technological driver through which completely new business models can emerge from the interplay of different actors. Identified key actors include traditional industrial companies, municipalities, and telecommunications companies. By providing connectivity, the latter ensure that small devices with tiny batteries can be connected directly to the Internet almost anywhere. Many IoT use cases that simplify things for end customers are already on the market, such as Philips Hue Tap. Beyond connectivity-based business models, there is great potential for information-driven business models, which can support and further develop existing business models. One example is Deutsche Telekom AG's IoT use case Park and Joy, in which parking spaces are networked with sensors and drivers are informed in real time about available spots. Information-driven business models can build on the data generated in IoT use cases. For example, a telecommunications company can create added value by deriving decision-relevant information from data — so-called insights — which is used to increase decision agility. Insights can also be monetized. Monetizing insights can only be sustainable, however, if it is done carefully and the relevant framework conditions are taken into account. This chapter explains the concept of information-driven business models and illustrates it with the concrete use case Park and Joy. In addition, benefits, risks, and framework conditions are discussed.
This paper presents the NLP Lean Programming framework (NLPf), a new framework for creating custom natural language processing (NLP) models and pipelines by utilizing common software development build systems. This approach allows developers to train and integrate domain-specific NLP pipelines into their applications seamlessly. Additionally, NLPf provides an annotation tool which significantly improves the annotation process through a well-designed GUI and a sophisticated way of using input devices. Thanks to these properties, developers and domain experts are able to build domain-specific NLP applications more efficiently. NLPf is open-source software and available at https://gitlab.com/schrieveslaach/NLPf.
Sleep scoring is a necessary and time-consuming task in sleep studies. In animal models (such as mice) or in humans, automating this tedious process promises to facilitate long-term studies and to promote sleep biology as a data-driven field. We introduce a deep neural network model that predicts different states of consciousness (Wake, Non-REM, REM) in mice from EEG and EMG recordings, with excellent scoring results for out-of-sample data. Predictions are made on epochs of 4 seconds in length, and each epoch is classified as artifact-free or not. The model architecture draws on recent advances in deep learning and convolutional neural network research. In contrast to previous approaches to automated sleep scoring, our model does not rely on manually defined features of the data but learns predictive features automatically. We expect deep learning models like ours to become widely applied in different fields, automating many repetitive cognitive tasks that were previously difficult to tackle.
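The classification step described above — a convolutional network mapping a 4-second EEG/EMG epoch to one of three vigilance states — can be sketched minimally as follows. This is an illustrative toy, not the authors' architecture: the sampling rate, layer sizes, and (untrained, random) weights are all assumptions chosen only to show the shape of such a pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

SR = 128          # assumed sampling rate in Hz (not from the paper)
EPOCH_S = 4       # 4-second epochs, as in the abstract
N_CH = 2          # one EEG and one EMG channel (assumed)
CLASSES = ["Wake", "Non-REM", "REM"]

def conv1d(x, w, b):
    """Valid-mode 1-D convolution: x (ch, T), w (out, ch, k) -> (out, T-k+1)."""
    out, ch, k = w.shape
    T = x.shape[1] - k + 1
    y = np.empty((out, T))
    for o in range(out):
        acc = np.zeros(T)
        for c in range(ch):
            for i in range(k):
                acc += w[o, c, i] * x[c, i:i + T]
        y[o] = acc + b[o]
    return y

def predict(epoch, w, b, wout, bout):
    h = np.maximum(conv1d(epoch, w, b), 0.0)   # ReLU feature maps
    f = h.mean(axis=1)                         # global average pooling over time
    logits = wout @ f + bout
    e = np.exp(logits - logits.max())
    return e / e.sum()                         # softmax over the 3 states

# Untrained random weights: shapes only. A real model would be fitted to
# manually scored training epochs.
w = rng.normal(size=(8, N_CH, 15)) * 0.1
b = np.zeros(8)
wout = rng.normal(size=(3, 8)) * 0.1
bout = np.zeros(3)

epoch = rng.normal(size=(N_CH, SR * EPOCH_S))  # synthetic 4-s EEG/EMG epoch
p = predict(epoch, w, b, wout, bout)
print(CLASSES[int(np.argmax(p))], p)
```

The key property the abstract emphasizes — learned rather than hand-crafted features — corresponds here to the convolution kernels `w`, which in a trained model would be fitted from data instead of drawn at random.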
Enzymes and Biosensing
(2018)
Enzyme-based biosensors have enjoyed a prospering growth market for more than five decades and are increasingly used in biotechnological processes as well. Starting from the notion of a sensor and the typical performance characteristics of biosensors (Section 18.1), this chapter introduces electrochemical enzyme biosensors and discusses their typical fields of application (Section 18.2). A look beyond the horizon presents alternative transducer principles (Section 18.3) and concludes with an introduction to current research trends (Section 18.4).
Against the background of growing data in everyday life, data processing tools are becoming more powerful in dealing with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan, and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the increased data absorbs time through collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer it. This paper provides historic and contemporary models for visualizing the architectural planning process and introduces means to describe today's situation in terms of stakeholders, events, and instruments. It explains conceptions from the Renaissance in contrast to models used in the second half of the 20th century. Contemporary models are discussed with regard to their value against the background of increasing computation in the building process.
Highly competitive markets paired with tremendous production volumes demand particularly cost-efficient products. The use of common parts and modules across product families can potentially reduce production costs. Yet increasing commonality typically results in overdesign of individual products. Multi-domain virtual prototyping enables designers to evaluate the costs and technical feasibility of different single-product designs at reasonable computational effort in early design phases. However, savings from platform commonality are hard to quantify and require detailed knowledge of, e.g., the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost-optimal design of single products.
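The trade-off between commonality and cost-optimal individual design can be illustrated with a deliberately tiny enumeration over a two-product family. All numbers (required motor sizes, variant catalogue, cost rates, economies-of-scale discount) are invented for illustration and have nothing to do with the paper's metamodel-based algorithm; the sketch only shows why a Pareto front, rather than a single optimum, is the natural output.

```python
from itertools import product

# Hypothetical toy data: two products, each needing a motor of a minimum size.
REQ = [10.0, 14.0]          # required motor sizes per product (assumed)
SIZES = [10.0, 12.0, 14.0]  # available motor variants (assumed)
UNIT_COST = 1.0             # cost per unit of motor size (assumed)
SHARED_DISCOUNT = 0.9       # assumed economies of scale when one variant is shared

def evaluate(design):
    """design: one motor size per product. Returns (cost, commonality) or None."""
    if any(s < r for s, r in zip(design, REQ)):
        return None                      # infeasible: undersized motor
    n_variants = len(set(design))
    discount = SHARED_DISCOUNT if n_variants == 1 else 1.0
    cost = sum(design) * UNIT_COST * discount
    # 1.0 = a single shared variant, 0.0 = every product has its own variant.
    commonality = 1.0 - (n_variants - 1) / (len(design) - 1)
    return cost, commonality

def pareto_front(points):
    """Keep designs not dominated in (minimize cost, maximize commonality)."""
    front = []
    for d, (c, m) in points:
        dominated = any(c2 <= c and m2 >= m and (c2, m2) != (c, m)
                        for _, (c2, m2) in points)
        if not dominated:
            front.append((d, (c, m)))
    return front

evaluated = [(d, evaluate(d)) for d in product(SIZES, repeat=2)]
feasible = [(d, v) for d, v in evaluated if v is not None]
front = pareto_front(feasible)
for d, (c, m) in sorted(front):
    print(d, "cost:", round(c, 1), "commonality:", m)
```

Even in this toy, the front contains both extremes: the individually sized design (cheapest, no commonality) and the fully shared, overdesigned one (full commonality, cheaper only through the shared-variant discount) — exactly the tension the abstract describes.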
In industrial applications, the costs for the operation and maintenance of a pump system typically far exceed its purchase price. For finding an optimal pump configuration which minimizes not only investment but life-cycle costs, methods like Technical Operations Research, which is based on Mixed-Integer Programming, can be applied. However, during the planning phase the designer is often faced with uncertain input data; e.g., future load demands can only be estimated. In this work, we deal with this uncertainty by developing a chance-constrained two-stage (CCTS) stochastic program. The design and operation of a booster station working under uncertain load demand are optimized to minimize total cost, including purchase price, operation cost incurred by energy consumption, and penalty cost resulting from water shortage. We find optimized system layouts using a sample average approximation (SAA) algorithm and analyze the results for different risk levels of water shortage. By adjusting the risk level, the costs and performance range of the system can be balanced, and thus the system's resilience can be engineered.
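The sample average approximation idea — replace the true demand distribution by a finite sample, estimate cost and shortage probability on that sample, and keep only designs whose estimated shortage probability stays below the chosen risk level — can be sketched on a one-pump toy. Every number below (capacities, prices, penalty, demand distribution) is an invented placeholder, not data from the paper, and the exhaustive loop stands in for the paper's actual optimization model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy parameters -- not taken from the paper.
CAPACITIES = [60.0, 80.0, 100.0, 120.0]  # candidate pump capacities
PRICE_PER_CAP = 50.0                     # purchase price per unit capacity
ENERGY_COST = 2.0                        # operating cost per unit delivered
PENALTY = 500.0                          # penalty per unit of water shortage
RISK = 0.05                              # allowed probability of any shortage

# SAA: draw a finite sample of the uncertain future demand.
demand = np.clip(rng.normal(90.0, 15.0, size=2000), 0.0, None)

def saa_cost(cap):
    """Sample-average total cost and empirical shortage probability."""
    delivered = np.minimum(cap, demand)
    shortage = demand - delivered
    cost = PRICE_PER_CAP * cap + np.mean(ENERGY_COST * delivered
                                         + PENALTY * shortage)
    p_short = np.mean(shortage > 0)
    return cost, p_short

best = None
for cap in CAPACITIES:
    cost, p_short = saa_cost(cap)
    feasible = p_short <= RISK           # the chance constraint, on the sample
    if feasible and (best is None or cost < best[1]):
        best = (cap, cost)
    print(f"cap={cap:6.1f}  cost={cost:9.1f}  P(shortage)={p_short:.3f}")
print("chosen capacity:", best[0] if best else "none feasible")
```

Lowering `RISK` forces larger (more expensive, more resilient) capacities into the solution, which is the cost-versus-resilience dial the abstract describes.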
The Kremer-Grest (KG) bead-spring model is a near standard in Molecular Dynamics simulations of generic polymer properties. It owes its popularity to its computational efficiency rather than its ability to represent specific polymer species and conditions. Here we investigate how to adapt the model to match the universal properties of a wide range of chemical polymer species. For this purpose we vary a single parameter originally introduced by Faller and Müller-Plathe: the chain stiffness. Examples include polystyrene, polyethylene, polypropylene, cis-polyisoprene, polydimethylsiloxane, polyethylene oxide, and styrene-butadiene rubber. We do this by matching the number of Kuhn segments per chain and the number of Kuhn segments per cubic Kuhn volume for the polymer species and for the Kremer-Grest model. We also derive mapping relations for converting KG model units back to physical units; in particular, we obtain the entanglement time for the KG model as a function of stiffness, allowing for a time mapping. To test these relations, we generate large equilibrated, well-entangled polymer melts and measure the entanglement moduli using a static primitive-path analysis of the entangled melt structure as well as simulations of step-strain deformation of the model melts. The obtained moduli for our model polymer melts are in good agreement with the experimentally expected moduli.
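The two matching targets named in the abstract — Kuhn segments per chain and Kuhn segments per cubic Kuhn volume — are simple bookkeeping once a polymer's Kuhn molar mass, Kuhn length, melt density, and chain molar mass are known. The computation below uses round illustrative values in the range of a commodity polymer melt, not figures from the paper or for any specific species listed there.

```python
NA = 6.022e23  # Avogadro's number, 1/mol

# Illustrative inputs (assumed, not from the paper):
M_KUHN = 150.0      # molar mass of one Kuhn segment, g/mol
L_KUHN = 1.0e-9     # Kuhn length, m
DENSITY = 900.0     # melt density, kg/m^3
M_CHAIN = 90_000.0  # chain molar mass, g/mol

# Number of Kuhn segments per chain: N = M_chain / M_Kuhn.
n_per_chain = M_CHAIN / M_KUHN

# Kuhn segment number density rho * NA / M_Kuhn (segments per m^3),
# then segments per cubic Kuhn volume: n_K = rho * NA * l_K^3 / M_Kuhn.
rho_g_per_m3 = DENSITY * 1e3
segment_density = rho_g_per_m3 * NA / M_KUHN
n_per_kuhn_volume = segment_density * L_KUHN**3

print(f"Kuhn segments per chain:       {n_per_chain:.0f}")
print(f"Kuhn segments per Kuhn volume: {n_per_kuhn_volume:.2f}")
```

Matching these two dimensionless numbers between a chemical species and the KG model fixes the chain length and stiffness on the model side; the unit mapping back to physical length, mass, and time then follows from the same quantities.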
Hydrogen represents a possible alternative gas turbine fuel for enhancing fuel flexibility in future low-emission power generation, provided the hydrogen is produced using renewable energy sources such as wind energy or biomass. Kawasaki Heavy Industries, Ltd. (KHI) runs research and development projects for a future hydrogen society: production of hydrogen gas, refinement and liquefaction for transportation and storage, and utilization in gas turbines and gas engines for the generation of electricity. In the development of hydrogen gas turbines, a key technology is stable and low-NOx hydrogen combustion, especially Dry Low Emission (DLE) or Dry Low NOx (DLN) hydrogen combustion. Due to the large differences in the physical properties of hydrogen compared to other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to DLE hydrogen combustion. Thus, the development of DLE hydrogen combustion technologies is an essential and challenging task for the future of hydrogen-fueled gas turbines. The DLE Micro-Mix combustion principle for hydrogen fuel has been in development for many years to significantly reduce NOx emissions. This combustion principle is based on cross-flow mixing of air and gaseous hydrogen, which reacts in multiple miniaturized “diffusion-type” flames. The major advantages of this combustion principle are the inherent safety against flashback and the low NOx emissions due to the very short residence time of the reactants in the flame region of the micro-flames.
In the present work, an optical sensor in combination with a spectrally resolved detection device for in-line particle-size monitoring for quality control in beer production is presented. The principle relies on the size- and wavelength-dependent backscatter of growing particles in fluids. Measured interference structures of backscattered light are compared with theoretical values calculated from Mie theory and fitted with a linear least-squares method to obtain particle size distributions. For this purpose, a broadband light source in combination with a process CCD spectrometer (charge-coupled device spectrometer) and process-adapted fiber optics is used. The goal is the development of an easy and flexible measurement device for in-line monitoring of particle size. The presented device can be directly installed in product fill tubes or vessels, supports CIP (cleaning in place), and removes the need for sample taking. A proof of concept and preliminary results from measuring protein precipitation are presented.
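The fitting step — express the measured backscatter spectrum as a linear combination of precomputed theoretical spectra for candidate particle sizes, and solve for the weights by least squares — can be sketched as below. The `model_spectrum` function is a crude oscillatory stand-in for a real Mie calculation, and all sizes, wavelengths, and noise levels are invented; only the linear least-squares structure mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical measurement grid and candidate particle diameters.
wavelengths = np.linspace(400e-9, 700e-9, 200)   # m
sizes = np.array([0.5e-6, 1.0e-6, 2.0e-6])       # candidate diameters, m

def model_spectrum(d):
    """Stand-in for Mie theory: interference-like structure whose
    oscillation rate over wavelength depends on particle size d."""
    return 1.0 + 0.5 * np.cos(2 * np.pi * d / wavelengths)

# Basis matrix: one theoretical spectrum per candidate size.
basis = np.stack([model_spectrum(d) for d in sizes], axis=1)  # (n_wl, n_sizes)

# Synthetic "measurement": a known mixture of sizes plus detector noise.
true_weights = np.array([0.2, 0.7, 0.1])
measured = basis @ true_weights + rng.normal(0.0, 0.01, wavelengths.size)

# Linear least-squares fit of the size-distribution weights.
weights, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print("recovered weights:", np.round(weights, 3))
```

In the real device the basis spectra come from Mie theory for the actual optics, and the recovered weights approximate the particle size distribution; a production fit would typically also constrain the weights to be non-negative.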