Around 60% of paper worldwide is made from recovered paper. Adhesive contaminants in particular, so-called stickies, reduce paper quality. To remove stickies while retaining as many valuable fibers as possible, multi-stage screening systems with several interconnected pressure screens are used. When planning such systems, suitable screens have to be selected, and their interconnection as well as their operational parameters have to be defined considering multiple conflicting objectives. In this contribution, we present a Mixed-Integer Nonlinear Program to optimize system layout, component selection and operation in order to find a suitable trade-off between output quality and yield.
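The layout-and-operation trade-off described above can be hinted at with a toy model: a brute-force sweep over discrete screen choices combined with a coarse grid for the continuous reject rates. All screen parameters (`SCREENS`, `REJECT_RATES`, the gain formulas) are invented for illustration; the paper's actual MINLP would be handled by a dedicated solver rather than enumeration.

```python
import itertools

# Hypothetical screen types: (name, sticky-removal gain, fibre-loss gain).
SCREENS = [("coarse", 0.80, 0.05), ("fine", 0.95, 0.12)]
REJECT_RATES = [0.1, 0.2, 0.3, 0.4, 0.5]   # coarse grid for the continuous variable

def cascade_outcome(choice):
    """Propagate stickies and fibre through a two-stage screening cascade."""
    stickies, fibre = 1.0, 1.0                 # normalised inlet loads
    for (name, gain_s, gain_f), r in choice:
        stickies *= 1.0 - gain_s * r / 0.5     # more reject -> more stickies removed
        fibre    *= 1.0 - gain_f * r / 0.5     # ... but also more fibre lost
    return stickies, fibre

def best_design(weight=1.0):
    """Enumerate discrete screen choices x gridded reject rates (toy stand-in
    for the MINLP); weight balances quality against yield."""
    best = None
    for screens in itertools.product(SCREENS, repeat=2):
        for rates in itertools.product(REJECT_RATES, repeat=2):
            choice = list(zip(screens, rates))
            s, f = cascade_outcome(choice)
            cost = s + weight * (1.0 - f)      # quality vs. yield trade-off
            if best is None or cost < best[0]:
                best = (cost, choice, s, f)
    return best
```

Sweeping `weight` traces out the trade-off: small weights favour sticky removal at the expense of fibre yield, large weights the reverse.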
In industrial applications, the costs of operating and maintaining a pump system typically far exceed its purchase price. To find an optimal pump configuration which minimizes not only the investment but the life-cycle costs, methods like Technical Operations Research, which is based on Mixed-Integer Programming, can be applied. However, during the planning phase the designer is often faced with uncertain input data; e.g., future load demands can only be estimated. In this work, we deal with this uncertainty by developing a chance-constrained two-stage (CCTS) stochastic program. The design and operation of a booster station working under uncertain load demand are optimized to minimize total cost, including purchase price, operation cost incurred by energy consumption, and penalty cost resulting from water shortage. We find optimized system layouts using a sample average approximation (SAA) algorithm and analyze the results for different risk levels of water shortage. By adjusting the risk level, the costs and performance range of the system can be balanced, and thus the system's resilience can be engineered.
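The sample average approximation idea can be sketched in a few lines: draw demand scenarios, then pick the smallest pump count whose empirical shortage probability stays below the chosen risk level. The pump data, demand distribution and cost terms below are all hypothetical stand-ins; the paper's CCTS program is far richer than this first-stage-only toy.

```python
import random

random.seed(1)
PUMP_CAP, PUMP_PRICE, ENERGY_RATE = 40.0, 1000.0, 2.0   # hypothetical unit data

def sample_demands(n_scenarios):
    """Scenario generator for the uncertain future load (a simple stand-in)."""
    return [max(0.0, random.gauss(100.0, 25.0)) for _ in range(n_scenarios)]

def saa_design(demands, risk_level):
    """Smallest parallel-pump count whose empirical shortage probability
    satisfies the chance constraint, plus its estimated total cost."""
    for n_pumps in range(1, 10):
        capacity = n_pumps * PUMP_CAP
        shortage_frac = sum(d > capacity for d in demands) / len(demands)
        if shortage_frac <= risk_level:
            op_cost = ENERGY_RATE * sum(min(d, capacity) for d in demands) / len(demands)
            return n_pumps, n_pumps * PUMP_PRICE + op_cost
    return None
```

Tightening the risk level buys fewer shortage scenarios at the price of additional pumps, which is exactly the cost/resilience lever the abstract describes.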
Booster stations, normally consisting of several parallel pumps in the basement, are used to raise the pressure so that all floors of tall buildings can be supplied with water. In this work, we demonstrate the potential of a decentralized pump topology for energy savings in water supply systems of skyscrapers. We present an approach, based on Mixed-Integer Nonlinear Programming, that makes it possible to choose an optimal network topology and optimal pumps from a predefined construction kit comprising different pump types. Using domain-specific scaling laws and Latin Hypercube Sampling, we generate different input sets of pump types and compare their impact on the efficiency and cost of the total system design. As a realistic application example, we consider a hotel building with 325 rooms, 12 floors and up to four pressure zones.
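The Latin Hypercube Sampling step can be sketched with the standard library: one random point per stratum in each dimension, shuffled so the strata are paired randomly across dimensions. The pump bounds and the affinity-law style power scaling below are invented placeholders, not the paper's construction kit.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """One sample per stratum in every dimension, randomly paired across dimensions."""
    rng = random.Random(seed)
    dims = []
    for lo, hi in bounds:
        # one random point in each of n equally wide strata, then shuffle
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(pts)
        dims.append(pts)
    return list(zip(*dims))

# hypothetical pump construction kit: sample (nominal flow, nominal speed) pairs
# and derive a power figure via an affinity-law style scaling (invented here)
kit = [{"flow": q, "speed": n, "power": q * (n / 3000.0) ** 2}
       for q, n in latin_hypercube(8, [(10.0, 50.0), (1500.0, 3000.0)])]
```

Compared with plain Monte Carlo, each marginal range is guaranteed to be covered evenly, which is why LHS is popular for generating diverse candidate input sets.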
Highly competitive markets paired with tremendous production volumes demand particularly cost-efficient products. The use of common parts and modules across product families can potentially reduce production costs. Yet increasing commonality typically results in overdesign of individual products. Multi-domain virtual prototyping enables designers to evaluate the costs and technical feasibility of different single-product designs at reasonable computational effort in early design phases. However, savings from platform commonality are hard to quantify and require detailed knowledge of, e.g., the production process and the supply chain. Therefore, we present and evaluate a multi-objective metamodel-based optimization algorithm which enables designers to explore the trade-off between high commonality and cost-optimal design of single products.
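A commonality-versus-cost trade-off of this kind is naturally explored as a Pareto front. A minimal non-dominated filter, with invented two-objective design points, could look like this (lower is better in both objectives):

```python
def dominates(a, b):
    """True if design a is no worse than b in both objectives and better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(points):
    """Non-dominated designs; each point is (unit cost, overdesign penalty)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical candidate platform designs: (unit cost, overdesign penalty)
candidates = [(1.0, 5.0), (2.0, 4.0), (3.0, 3.0), (2.0, 6.0), (4.0, 4.0)]
front = pareto_front(candidates)
```

The designer then picks from the front rather than from a single weighted optimum, which is the essence of the multi-objective approach the abstract describes.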
Resilience as a concept has found its way into different disciplines to describe the ability of an individual or system to withstand and adapt to changes in its environment. In this paper, we provide an overview of the concept in different communities and extend it to the area of mechanical engineering. Furthermore, we present metrics to measure resilience in technical systems and illustrate them using load-carrying structures. With application examples from the Collaborative Research Centre (CRC) 805, we show how the concept of resilience can be used to control uncertainty during different stages of the product life cycle.
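One widely used resilience metric, shown here purely as an illustration (it is not necessarily among the metrics the paper proposes), is the area under the normalized performance curve over a disruption-and-recovery horizon:

```python
def resilience(performance, nominal):
    """Area under the performance curve relative to nominal performance
    over the observed horizon (a Bruneau-style ratio in [0, 1])."""
    return sum(performance) / (nominal * len(performance))

# hypothetical load-carrying capacity over time: damage at t=2, then recovery
history = [100, 100, 40, 55, 70, 85, 100, 100]
R = resilience(history, 100.0)
```

A value of 1.0 means performance never dropped; the deeper and longer the dip, the lower the score.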
The overall energy efficiency of ventilation systems can be improved by considering not only single components but also the interplay between all parts of the system. With the help of the method "TOR" ("Technical Operations Research"), which was developed at the Chair of Fluid Systems at TU Darmstadt, it is possible to improve the energy efficiency of the whole system by considering all possible design choices programmatically. We demonstrate this systematic design approach using a ventilation system for buildings as a use-case example.
We model the ventilation system as a Mixed-Integer Nonlinear Program (MINLP). Binary variables model the selection of different pipe diameters, and multiple fans are modeled with the help of scaling laws. The whole system is represented by a graph, in which the edges represent the pipes and fans, and the nodes represent the source of air for cooling and the sinks that have to be cooled. At the beginning, the human designer chooses a construction kit of suitable fans, pipes of different diameters, and load cases. These boundary conditions define a variety of possible system topologies; considering all of them by hand is not possible. With the help of state-of-the-art solvers, however, this MINLP can be solved.
In addition, we consider the effects of malfunctions in different components and present a first approach to measuring the resilience of the example use case. We then compare the conventional approach with designs that are more resilient. These designs are derived by extending the aforementioned model with further constraints that explicitly account for the resilience of the overall system. We show that it is possible to design resilient systems with this method already in the early design stage, and we compare the energy efficiency and resilience of the different system designs.
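A simple way to make such resilience constraints concrete is an "N-1" check: the design must still meet demand after any single component fails. The toy below (fan sizes and the demand figure are invented) contrasts a single-fan design with a redundant one.

```python
def feasible(fans, demand):
    """Toy feasibility test: can the remaining fans still cover the air demand?"""
    return sum(fans) >= demand

def n_minus_one_resilient(fans, demand):
    """The design survives the failure of any single fan ('N-1' criterion)."""
    return all(feasible(fans[:i] + fans[i + 1:], demand) for i in range(len(fans)))

# two candidate designs for the same (hypothetical) 100 m^3/h demand:
cheap  = [100.0]                 # one big fan: efficient but fragile
robust = [60.0, 60.0, 60.0]      # three smaller fans with built-in redundancy
```

In an optimization model this check becomes a family of constraints, one per failure scenario, which is the kind of extension the abstract describes.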
The energy efficiency of technical systems can be improved by a systematic design approach. Technical Operations Research (TOR) employs methods known from Operations Research to find a globally optimal layout and operation strategy of technical systems. We show the practical use of this approach through the systematic design of a decentralized water supply system for skyscrapers. All possible network options and operation strategies are modeled by a Mixed-Integer Nonlinear Program. We present the optimal system found by our approach and highlight the energy savings compared to a conventional system design.
The UN has set the goal of ensuring access to water and sanitation for all people by 2030. To address this goal, we present a multidisciplinary approach for designing water supply networks for slums in large cities by applying mathematical optimization. The problem is modeled as a mixed-integer linear program (MILP) aiming to find a network describing the optimal supply infrastructure. To illustrate the approach, we apply it to a small slum cluster in Dhaka, Bangladesh.
Ensuring access to water and sanitation for all is Goal No. 6 of the 17 UN Sustainable Development Goals to transform our world. As one step towards this goal, we present an approach that leverages remote sensing data to plan optimal water supply networks for informal urban settlements. The concept focuses on slums within large urban areas, which are often characterized by the lack of an appropriate water supply. We apply methods of mathematical optimization to find a network describing the optimal supply infrastructure. In doing so, we choose between different decentralized and centralized approaches, combining supply by motorized vehicles with supply by pipe systems. For the purposes of illustration, we apply the approach to two small slum clusters in Dhaka and Dar es Salaam. We show our optimization results, which represent the lowest-cost water supply systems possible. Additionally, we compare the optimal solutions for the two clusters (also for varying input parameters, such as population densities and slum-size development over time) and describe how the result of the optimization depends on the remote sensing data entered.
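The pipe-network part of such a model can be hinted at with a classic minimum spanning tree: connect a source and all demand clusters with the cheapest set of pipes. This is only a toy stand-in (Kruskal's algorithm on an invented cluster graph); the actual MILP additionally chooses between vehicle-based and pipe-based supply and many other options.

```python
def kruskal(n_nodes, edges):
    """Cheapest set of pipes connecting all nodes (minimum spanning tree).
    edges: list of (cost, u, v) tuples; uses union-find with path halving."""
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    tree, total = [], 0.0
    for cost, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # adding this pipe creates no cycle
            parent[ru] = rv
            tree.append((u, v))
            total += cost
    return tree, total

# hypothetical cluster graph: node 0 is the source, nodes 1-3 are demand clusters
edges = [(1.0, 0, 1), (2.0, 0, 2), (2.0, 1, 2), (3.0, 2, 3), (4.0, 0, 3)]
pipes, cost = kruskal(4, edges)
```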
Depending on the number of gears and the required degree of freedom, a transmission can take on nearly 100,000 conceivable structures with the same function. With the traditional development approach of manually identifying and comparing individual promising system configurations, innovative and, above all, cost-minimal solutions can easily be overlooked. Within a research project, TU Darmstadt applied special optimization methods to reliably find a layout that is optimal for the individual objectives, even in large solution spaces.
On obligations in the development process of resilient systems with algorithmic design methods
(2018)
Advanced computational methods are needed both for the design of large systems and to compute high-accuracy solutions. Such methods are computationally efficient, but the validation of their results is very complex, and highly skilled auditors are needed to verify them. We investigate legal questions concerning obligations in the development phase, especially for technical systems developed using advanced methods. In particular, we consider methods of resilient and robust optimization. With these techniques, high-performance solutions can be found even under a wide variety of input parameters. However, given the novelty of these methods, it is uncertain whether legal obligations are being met. The aim of this paper is to discuss whether and how the choice of a specific computational method affects the developer's product liability. The review of legal obligations in this paper is based on German law and focuses on the requirements that must be met during the design and development process.
A collection of all regulations relevant to data protection in churches: DSGVO, KDG, KDR-OG and DSG-EKD, together with the accompanying ordinances KDO-DVO and ITSVO-EKD. In addition to the central body of rules, the General Data Protection Regulation (DSGVO) in its last corrected version of 19 April 2018, this collection contains the norms of church law newly enacted on the basis of the DSGVO. On the Catholic side, these are the Law on Church Data Protection (KDG) and the Church Data Protection Regulation of religious orders of pontifical right (KDR-OG); in addition, the ordinance implementing the order on church data protection (KDO-DVO) continues to apply accordingly. The Protestant Church revised the Church Law on Data Protection of the Evangelical Church in Germany (DSG-EKD) and retained the ordinance on the security of information technology (ITSVO-EKD). The work is complemented by references to relevant publications of the Article 29 Data Protection Working Party and of the secular and church data protection supervisory authorities. It is thus addressed primarily to church congregations and church-run enterprises and their data protection officers, as well as to private companies, law firms and consultants with church clients.
After an intensive political debate, the General Data Protection Regulation (DSGVO) was adopted last year. On 25 May 2018 the DSGVO replaces the previously applicable Data Protection Directive 95/46/EC of 1995. The reform of data protection law brings numerous new requirements with it. Companies are therefore forced to adapt to the changes, review their data-protection-relevant processes in light of the new requirements, and align them with the DSGVO by May 2018. This article gives a brief overview of the central aspects of the data protection reform and the associated challenges for companies.
The new church data protection law – challenges for private-sector companies
(2018)
The legal concept of joint controllership has long occupied the data protection literature. Determining responsibility in distributed processing operations, which are common above all in today's platform services, is complex: several actors are always involved, and the action of one participant usually triggers several processing steps. The ECJ has now ruled on this in a judgment that is remarkable in several respects.
In its Art. 3, the General Data Protection Regulation (DS-GVO) governs the territorial scope of data protection law, aiming in particular at offerings by non-European service providers. The discussion so far has concentrated primarily on the newly introduced marketplace principle; the largely untouched establishment principle and, above all, the problems arising from its unchanged retention have not been examined. The following article attempts a systematic analysis of a topic that is in part controversial and in part hardly discussed at all.
Although discussed in legal scholarship from the outset, the prohibition of coupling (Kopplungsverbot) led a shadowy existence under the BDSG. With the General Data Protection Regulation (DS-GVO), a change is foreseeable: the new Art. 7(4) DS-GVO clarifies that the provision of a service may not be made conditional on the granting of consent. Yet this apparent novelty of data protection law raises numerous questions. While practitioners in particular reject the application of the prohibition of coupling in numerous constellations, its apologists conjure up the end of all "data-financed" services. This article provides insight into the regulatory depth of a norm that could revolutionize Web 2.0 and proposes a solution that does equal justice to the protection of the data subject's privacy and the economic interests of service providers.
Cloud computing raises novel legal questions in numerous areas of law. The aim of this presentation of the legal framework is to introduce and classify the legal bases from the various fields of law that concern identity management in the cloud, before the sixth chapter presents the resulting obligations in their concrete form.
Data protection and 25 May 2018: both currently seem to hang over us like a sword of Damocles. Everyone knows, or at least should know, that the European DSGVO takes effect on 25 May 2018. Much has been reported about the main innovations in data protection law, not least about increased organizational requirements, documentation duties and impending fines. But what do these changes mean in concrete terms for the practice of the tax advisor? Contrary to what one might assume, the new data protection rules are relevant not only to office organization. In tax administration proceedings, too, the tax advisor faces data protection questions, for instance when the tax authorities violate the DSGVO when processing a client's personal data. The same applies to employee data protection: both the firm's owner and his employer clients must comply with the provisions on employee data protection. The tax advisor thus needs data protection know-how that directly concerns his daily practice; otherwise he risks being left with more questions than answers.
The continuing growth of scientific publications raises the question how research processes can be digitalized and thus realized more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. For the search process various information systems exist. However, the analysis of the published content is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype by using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
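The kind of content analysis such a system automates can be hinted at with a few lines of term-frequency profiling; production systems like IBM Watson Explorer go far beyond this (part-of-speech tagging, correlation and time-series analyses). The stopword list and corpus below are invented for illustration.

```python
import collections
import re

# tiny illustrative stopword list; real systems use much larger ones
STOPWORDS = {"the", "of", "and", "a", "in", "for", "is", "to"}

def top_terms(abstracts, k=3):
    """Naive term-frequency profile across a corpus of abstracts."""
    counts = collections.Counter()
    for text in abstracts:
        words = re.findall(r"[a-z]+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(k)
```

Even this naive profile already surfaces recurring topics across a set of abstracts, which is the exploratory use case the abstract targets.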
As an interdisciplinary research network, the Cluster of Excellence “Integrative Production Technology for High-Wage Countries” (CoE) comprises around 150 researchers, whose scientific backgrounds range from mechanical engineering and computer science to social sciences such as sociology and psychology. In addition to content- and method-based challenges, the CoE’s employees are faced with heterogeneous organizational cultures, different hierarchical levels, an imbalanced gender distribution, and high employee fluctuation. The sub-project Scientific Cooperation Engineering 1 (CSP1) addresses the challenge of interdisciplinary cooperation and organizational learning and aims at fostering interdisciplinarity and its synergies as a source of innovation. The project therefore examines means of achieving organizational development, ranging from temporal structures to a sustainable network in production technology. To this end, a broad range of means has been developed during the last twelve years: in addition to physical measures such as regular network events and trainings, virtual measures such as the Terminology App have been a focus. The app is an algorithmic analysis method for uncovering latent topic structures in the publications of the CoE in order to highlight thematic intersections and synergy potentials; the detection and promotion of such synergies has long been a vital element of knowledge management. Furthermore, CSP1 focuses on project management and has developed evaluation tools to measure and control the success of interdisciplinary cooperation. In addition to the cooperation-fostering measures, CSP1 conducted studies on interdisciplinarity and diversity and their relationship with innovation. The scientific background of these means and the research results of CSP1 are outlined in this paper to offer approaches for successful interdisciplinary cooperation management.
The continuing growth of scientific publications raises the question of how literature analyses within research processes can be digitalized and thus carried out more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. Methods of text analytics, which can automatically prepare and process text data, therefore suggest themselves. Insights arise from analyses of parts of speech and subgroups, as well as from correlation and time-series analyses. This article presents the design and implementation of a prototype with which users can explore bibliographic data from the established literature database EBSCO Discovery Service using text analytics methods. The prototype is based on the analysis system IBM Watson Explorer, which is available to universities free of licence costs. Potential addressees of the prototype are research institutions, consulting firms, and decision-makers in politics and business practice.
In view of the continuing growth of scientific publications, tools are needed to make literature analyses more productive through digitalization. This article presents an approach that explores bibliographic data from the literature database EBSCO Discovery Service using text analytics methods. The solution is based on the text analysis system IBM Watson Explorer and is suitable for exploratory literature analyses, for example to reflect the status quo of emerging technology fields in the literature. The generated results fit into the context of the increasing tool support for the literature search process and can be used for intra- and inter-institutional knowledge transfer in research and consulting contexts.
In the course of the digital transformation, innovative technology concepts such as the Internet of Things and cloud computing are regarded as drivers of far-reaching changes in organizations and business models. In this context, Robotic Process Automation (RPA) is a novel approach to process automation in which manual activities are learned and then executed automatically by so-called software robots. The software robots emulate inputs on the existing presentation layer, so no changes to existing application systems are necessary. The innovative idea is the transformation of existing process execution from manual to digital, which distinguishes RPA from traditional approaches of Business Process Management (BPM), where, for example, process-driven adjustments at the level of the business logic are necessary. Various RPA solutions are already offered on the market as software products. Good results from RPA are documented particularly for operational processes with repetitive processing steps across different application systems, such as the automation of 35% of back-office processes at Telefonica. Owing to the comparatively low implementation effort combined with high automation potential, there is strong practical interest in RPA (e.g., in banking, telecommunications, and energy supply). This article discusses RPA as an innovative approach to process digitalization and gives concrete recommendations for practice. To this end, it distinguishes between model-driven and self-learning approaches. Based on general architectures of RPA systems, application scenarios and their automation potential, but also their limitations, are discussed. A structured market overview of selected RPA products follows. Three concrete application examples illustrate the use of RPA in practice.
Benefits and framework conditions of information-driven business models of the Internet of Things
(2018)
In the context of increasing digitalization, the Internet of Things (IoT) is regarded as a technological driver through which completely new business models can emerge in the interplay of different actors. Identified key actors include traditional industrial companies, municipalities and telecommunications companies. The latter, by providing connectivity, ensure that small devices with tiny batteries can be connected directly to the Internet almost anywhere. Many IoT use cases that simplify life for end customers are already on the market, such as Philips Hue Tap. Besides business models based on connectivity, there is great potential for information-driven business models that can support and evolve existing business models. One example is the IoT use case Park and Joy of Deutsche Telekom AG, in which parking spaces are networked with sensors and drivers are informed in real time about available spaces. Information-driven business models can build on data generated in IoT use cases. For example, a telecommunications company can create added value by deriving decision-relevant information from data, so-called insights, and using it to increase decision agility. Insights can also be monetized. Monetizing insights can only be sustainable if it is done carefully and the framework conditions are taken into account. This chapter explains the concept of information-driven business models and illustrates it with the concrete use case Park and Joy. In addition, benefits, risks and framework conditions are discussed.
Process-oriented measurement of customer experience using the example of the telecommunications industry
(2018)
High competitive intensity and increased customer expectations require telecommunications companies to actively shape the customer experience (CX). An important aspect of this is CX measurement. Traditional satisfaction surveys are often insufficient to fully capture the customer experience in complex processes. This chapter therefore proposes a cross-process reference solution for CX measurement, using the telecommunications industry as an example. The starting point is an industry-specific process model based on the eTOM reference model. It is extended by measuring points that identify weak spots with respect to the CX. For the weak spots identified, possible triggers are derived via a reference matrix and evaluated on the basis of typical business case volumes. This makes it possible to directly assign concrete remedial measures and to measure their success. The reference solution developed in this way was successfully implemented in the K1 project at Deutsche Telekom. Implementation details are presented as case studies.
Because of customer churn, strong competition, and operational inefficiencies, the telecommunications operator ME Telco (fictitious name due to confidentiality) launched a strategic transformation program that included a Business Process Management (BPM) project. Major problems were silo-oriented process management and missing cross-functional transparency. Process improvements were not consistently planned and aligned with corporate targets. Measurable inefficiencies were observed on an operational level, e.g., high lead times and reassignment rates of the incident management process.
Malaria infection remains a significant risk for much of the population of tropical and subtropical areas, particularly in developing countries. It is therefore highly important to develop sensitive, accurate and inexpensive malaria diagnosis tests. Here, we present a novel aptamer-based electrochemical biosensor (aptasensor) for malaria detection by impedance spectroscopy, through the specific recognition between a highly discriminatory DNA aptamer and its target Plasmodium falciparum lactate dehydrogenase (PfLDH). Interestingly, due to the isoelectric point (pI) of PfLDH, the aptasensor response showed an adjustable detection range based on the different protein net charge in variable pH environments. The specific aptamer recognition allows sensitive protein detection with an expanded detection range and a low detection limit, as well as high specificity for PfLDH compared to analogous proteins. The practical feasibility of the aptasensor is further demonstrated by detecting the target PfLDH in human serum. Furthermore, the aptasensor can easily be regenerated and thus used multiple times. The robustness, sensitivity, and reusability of the presented aptasensor make it a promising candidate for point-of-care diagnostic systems.
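A low detection limit such as the one reported is conventionally quantified from a calibration curve as LOD = k·σ_blank / sensitivity. The sketch below fits a linear calibration and applies that textbook formula; the numbers are invented and this is generic chemometrics, not the authors' actual data analysis.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept of a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def detection_limit(blank_sd, sensitivity, k=3.0):
    """Classic LOD estimate: k * sigma_blank / calibration slope."""
    return k * blank_sd / sensitivity
```

The slope of the fitted line is the sensor's sensitivity; a noisier blank or a flatter calibration both push the detection limit up.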
The chemical imaging sensor is a semiconductor-based chemical sensor capable of visualizing pH and ion distributions. The spatial resolution depends on the lateral diffusion of photocarriers generated by illumination of the semiconductor substrate. In this study, two types of optical setups, one based on a bundle of optical fibers and the other based on a binocular tube head, were developed to project a hybrid illumination of a modulated light beam and a ring-shaped constant illumination onto the sensor plate. An improved spatial resolution was realized by the ring-shaped constant illumination, which suppressed lateral diffusion of photocarriers by enhanced recombination due to the increased carrier concentration.
As with most high-velocity free-surface flows, stepped spillway flows become self-aerated when the drop height exceeds a critical value. Due to the step-induced macro-roughness, the flow field becomes more turbulent than on a comparable smooth-invert chute. For this reason, cascades are often used as re-aeration structures in wastewater treatment. For stepped spillways serving as flood release structures downstream of deoxygenated reservoirs, however, gas transfer is also of crucial significance for meeting ecological requirements. Predicting mass transfer velocities becomes challenging, as the flow regime differs from the typical flow conditions studied previously. In this paper, detailed air-water flow measurements are conducted on stepped spillway models of different geometry, with the aim of estimating the specific air-water interface. Re-aeration performance is determined by applying the absorption method. In contrast to earlier studies, the aerated water body is considered a continuous mixture up to the level at which an air concentration of 75% is reached. Above this level, a homogeneous surface wave field is assumed, which is found to significantly affect the total air-water interface available for mass transfer. Geometrical characteristics of these surface waves are obtained from high-speed camera investigations. The results show that both the mean air concentration and the mean flow velocity influence the mass transfer. Finally, an empirical relationship for the mass transfer on stepped spillway models is proposed.
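Re-aeration performance of this kind is commonly summarized by a first-order oxygen-deficit model, E = 1 − exp(−k_L·a·t), where k_L is the mass transfer velocity, a the specific air-water interface and t the residence time. The sketch below evaluates this classic relation with invented parameter values; the paper's own empirical relationship is not reproduced here.

```python
import math

def aeration_efficiency(k_l, specific_area, residence_time):
    """Oxygen transfer efficiency E = 1 - exp(-k_L * a * t)
    from a first-order deficit model (all inputs in consistent SI units)."""
    return 1.0 - math.exp(-k_l * specific_area * residence_time)

def downstream_concentration(c_up, c_sat, efficiency):
    """Dissolved oxygen after the structure, given upstream and saturation values."""
    return c_up + efficiency * (c_sat - c_up)

# invented example: k_L = 1e-4 m/s, a = 200 1/m, t = 30 s
E = aeration_efficiency(1e-4, 200.0, 30.0)
```

The product k_L·a is exactly where the measured specific interface enters, which is why estimating it is the central aim of the measurements described above.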
Synthetic mimics of natural high-performance structural materials have shown great and partly unforeseen opportunities for the design of multifunctional materials. For nacre-mimetic nanocomposites, it has remained extraordinarily challenging to make ductile materials with high stretchability at high fractions of reinforcements, which is however of crucial importance for flexible barrier materials. Here, highly ductile and tough nacre-mimetic nanocomposites are presented, by implementing weak, but many hydrogen bonds in a ternary nacre-mimetic system consisting of two polymers (poly(vinyl amine) and poly(vinyl alcohol)) and natural nanoclay (montmorillonite) to provide efficient energy dissipation and slippage at high nanoclay content (50 wt%). Tailored interactions enable exceptional combinations of ductility (close to 50% strain) and toughness (up to 27.5 MJ m⁻³). Extensive stress whitening, a clear sign of high internal dynamics at high internal cohesion, can be observed during mechanical deformation, and the materials can be folded like paper into origami planes without fracture. Overall, the new levels of ductility and toughness are unprecedented in highly reinforced bioinspired nanocomposites and are of critical importance to future applications, e.g., as barrier materials needed for encapsulation and as a printing substrate for flexible organic electronics.
False spectra formation in the differential two-channel scheme of the laser Doppler flowmeter
(2018)
Noise in the differential two-channel scheme of a classic laser Doppler flowmetry (LDF) instrument was studied. The formation of false spectral components in the output signal due to the beating of electrical signals in the differential amplifier was identified. An improved block diagram of the flowmeter was developed that allows the noise to be reduced.
Background
Culture media containing complex compounds like yeast extract or peptone have numerous disadvantages. The chemical composition of the complex compounds is prone to significant variations from batch to batch, and quality control is difficult. Therefore, the use of chemically defined media is receiving more and more attention in commercial fermentations. This concept results in better reproducibility, simplifies downstream processing of secreted products and enables rapid scale-up. Culturing bacteria with unknown auxotrophies in chemically defined media is challenging and often not possible without an extensive trial-and-error approach. In this study, a respiration activity monitoring system for shake flasks and its recent version for microtiter plates were used to clarify unknown auxotrophic deficiencies in the model organism Bacillus pumilus DSM 18097.
Results
Bacillus pumilus DSM 18097 was unable to grow in a mineral medium without the addition of complex compounds. Therefore, a rich chemically defined minimal medium was tested, containing basically all vitamins, amino acids and nucleobases that are essential ingredients of complex components. The strain was successfully cultivated in this medium. By monitoring the respiration activity, nutrients were supplemented to and omitted from the rich chemically defined medium in a rational way, enabling a systematic and fast determination of the auxotrophic deficiencies. The experiments showed that the investigated strain requires amino acids, especially cysteine or histidine, and the vitamin biotin for growth.
Conclusions
The introduced method allows an efficient and rapid identification of unknown auxotrophic deficiencies and can be used to develop a simple chemically defined tailor-made medium. B. pumilus DSM 18097 was chosen as a model organism to demonstrate the method. However, the method is generally suitable for a wide range of microorganisms. By combining a systematic combinatorial approach based on monitoring the respiration activity with cultivation in microtiter plates, high throughput experiments with high information content can be conducted. This approach facilitates media development, strain characterization and cultivation of fastidious microorganisms in chemically defined minimal media while simultaneously reducing the experimental effort.
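The combinatorial omission strategy described above can be sketched as a simple leave-one-out loop: each component is removed from the rich medium in turn, and a component counts as essential when the (here mocked) respiration-activity readout collapses without it. All component names and the growth model below are illustrative assumptions, not data from the study.

```python
def growth_signal(medium, essential=("biotin", "cysteine")):
    """Mock respiration-activity readout: growth only if all essentials present."""
    return 1.0 if all(e in medium for e in essential) else 0.05

def find_auxotrophies(components, threshold=0.5):
    """Leave-one-out screen: flag components whose omission collapses growth."""
    full = set(components)
    deficiencies = []
    for c in components:
        if growth_signal(full - {c}) < threshold * growth_signal(full):
            deficiencies.append(c)
    return sorted(deficiencies)

rich_medium = ["glucose", "biotin", "thiamine", "cysteine", "histidine"]
print(find_auxotrophies(rich_medium))
```

A real screen additionally groups related components (e.g. all amino acids) to reduce the number of cultivations, which is what the microtiter-plate parallelization makes practical.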
Monitoring of organic acids (OA) and volatile fatty acids (VFA) is crucial for the control of anaerobic digestion. In case of unstable process conditions, an accumulation of these intermediates occurs. In the present work, two different enzyme-based biosensor arrays are combined and presented for facile electrochemical determination of several process-relevant analytes. Each biosensor utilizes a platinum sensor chip (14 × 14 mm²) with five individual working electrodes. The OA biosensor enables simultaneous measurement of ethanol, formate, d- and l-lactate, based on a bi-enzymatic detection principle. The second, VFA biosensor provides an amperometric platform for quantification of acetate and propionate, mediated by oxidation of hydrogen peroxide. The cross-sensitivity of both biosensors toward potential interferents typically present in fermentation samples was investigated. The potential for practical application in complex media was successfully demonstrated in spiked sludge samples collected from three different biogas plants. The results obtained by both biosensors were in good agreement with the applied reference measurements by photometry and gas chromatography, respectively. The proposed hybrid biosensor system was also used for long-term monitoring of a lab-scale biogas reactor (0.01 m³) over a period of 2 months. In combination with typically monitored parameters, such as gas quality, pH and FOS/TAC (volatile organic acids/total inorganic carbonate), the amperometric measurements of OA and VFA concentrations could enhance the understanding of ongoing fermentation processes.
Algal polysaccharides (extracellular polysaccharides) and carbon nanotubes (CNTs) were adsorbed on dioctadecyldimethylammonium bromide Langmuir monolayers to serve as a matrix for the incorporation of urease. The physicochemical properties of the supramolecular system as a monolayer at the air–water interface were investigated by surface pressure–area isotherms, surface potential–area isotherms, interfacial shear rheology, vibrational spectroscopy, and Brewster angle microscopy. The floating monolayers were transferred to hydrophilic solid supports, quartz, mica, or capacitive electrolyte–insulator–semiconductor (EIS) devices, through the Langmuir–Blodgett (LB) technique, forming mixed films, which were investigated by quartz crystal microbalance, fluorescence spectroscopy, and field emission gun scanning electron microscopy. The enzyme activity was studied with UV–vis spectroscopy, and the feasibility of the thin film as a urea sensor was assayed in an EIS sensor device. The presence of CNT in the enzyme–lipid LB film not only tuned the catalytic activity of urease but also helped to conserve its enzyme activity. Viability as a urease sensor was demonstrated with capacitance–voltage and constant capacitance measurements, exhibiting regular and distinctive output signals over all concentrations used in this work. These results are related to the synergism between the compounds in the active layer, leading to a surface morphology that allowed fast analyte diffusion owing to an adequate molecular accommodation, which also preserved the urease activity. This work demonstrates the feasibility of employing LB films composed of lipids, CNT, algal polysaccharides, and enzymes as EIS devices for biosensing applications.
The light-addressable potentiometric sensor (LAPS) and scanning photo-induced impedance microscopy (SPIM) are two closely related methods to visualise the distributions of chemical species and impedance, respectively, at the interface between the sensing surface and the sample solution. They both have the same field-effect structure based on a semiconductor, which allows spatially resolved and label-free measurement of chemical species and impedance in the form of a photocurrent signal generated by a scanning light beam. In this article, the principles and various operation modes of LAPS and SPIM, functionalisation of the sensing surface for measuring various species, LAPS-based chemical imaging and high-resolution sensors based on silicon-on-sapphire substrates are described and discussed, focusing on their technical details and prospective applications.
Background
Impairment of neurovascular coupling (NVC) was recently reported in the context of subarachnoid hemorrhage and may correlate with disease severity and outcome. However, previous techniques to evaluate NVC required invasive procedures. Retinal vessels may represent an alternative option for non-invasive assessment of NVC.
Methods
A prototype of an adapted retinal vessel analyzer was used to assess retinal vessel diameter in mice. Dynamic vessel analysis (DVA) included an application of monochromatic flicker light impulses in predefined frequencies for evaluating NVC. All retinae were harvested after DVA and electroretinograms were performed.
Results
A total of 104 retinal scans were conducted in 21 male mice (90 scans). Quantitative arterial recordings were feasible only in a minority of animals, showing an emphasized reaction to flicker light impulses (8 mice; 14 scans). A characteristic venous response to flicker light, however, could be observed in the majority of animals. Repeated measurements resulted in a significant decrease of baseline venous diameter (7 mice; 7 scans, p < 0.05). Ex-vivo electroretinograms, performed after in-vivo DVA, demonstrated a significant reduction of transretinal signaling in animals with repeated DVA (n = 6, p < 0.001).
Conclusions
To the best of our knowledge, this is the first non-invasive study assessing murine retinal vessel response to flicker light with characteristic changes in NVC. The imaging system can be used for basic research and enables the investigation of retinal vessel dimension and function in control mice and genetically modified animals.
On the flight performance impact of landing gear drag reduction methods for unmanned air vehicles
(2018)
The flight performance impact of three different landing gear configurations on a small, fixed-wing UAV is analyzed with a combination of RANS CFD calculations and an incremental flight performance algorithm. A standard fixed landing gear configuration is taken as a baseline, while the influence of retracting the landing gear or applying streamlined fairings is investigated. Retraction leads to a significant parasite drag reduction, while fairings also promise large savings. The increase in lift-to-drag ratio is reduced at high lift coefficients due to the influence of induced drag. All configurations are tested on three different design missions with an incremental flight performance algorithm. A trade-off study is performed using the weight increase of the retracted or faired landing gear as a variable. The analysis reveals only small mission performance gains, as the aerodynamic improvements are negated by weight penalties. A new workflow for decision-making is presented that allows estimating whether a change in landing gear configuration is beneficial for a small UAV.
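The core of the trade-off above is that a fairing or retraction mechanism improves the lift-to-drag ratio L/D but adds mass. As a hedged first-order sketch (not the paper's incremental algorithm), a common range estimate for a battery-electric UAV is R = (E · η / (m · g)) · (L/D); all numbers below are illustrative assumptions.

```python
G = 9.81  # gravitational acceleration, m/s^2

def range_km(battery_wh, efficiency, mass_kg, lift_to_drag):
    """First-order cruise range estimate for a battery-electric aircraft."""
    energy_j = battery_wh * 3600.0
    return energy_j * efficiency * lift_to_drag / (mass_kg * G) / 1000.0

# Assumed UAV: fixed gear vs. retracted gear (+0.6 kg mechanism, better L/D)
baseline  = range_km(battery_wh=500, efficiency=0.5, mass_kg=12.0, lift_to_drag=10.0)
retracted = range_km(battery_wh=500, efficiency=0.5, mass_kg=12.6, lift_to_drag=11.5)
print(f"baseline  {baseline:6.1f} km")
print(f"retracted {retracted:6.1f} km")
```

With these assumptions the aerodynamic gain outweighs the mass penalty, but the margin shrinks as the mechanism mass grows, which mirrors the small net gains reported in the study.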
Hydrogen is a possible alternative gas turbine fuel for future low-emission power generation and enhanced fuel flexibility, provided it is produced using renewable energy sources such as wind energy or biomass. Kawasaki Heavy Industries, Ltd. (KHI) runs research and development projects for a future hydrogen society: production of hydrogen gas, refinement and liquefaction for transportation and storage, and utilization in gas turbines and gas engines for the generation of electricity. In the development of hydrogen gas turbines, a key technology is stable and low-NOx hydrogen combustion, especially Dry Low Emission (DLE) or Dry Low NOx (DLN) hydrogen combustion. Due to the large differences in physical properties between hydrogen and other fuels such as natural gas, well-established gas turbine combustion systems cannot be directly applied to DLE hydrogen combustion. Thus, the development of DLE hydrogen combustion technologies is an essential and challenging task for the future of hydrogen-fueled gas turbines. The DLE Micro-Mix combustion principle for hydrogen fuel has been in development for many years to significantly reduce NOx emissions. This combustion principle is based on cross-flow mixing of air and gaseous hydrogen, which react in multiple miniaturized “diffusion-type” flames. The major advantages of this combustion principle are its inherent safety against flashback and its low NOx emissions due to the very short residence time of the reactants in the flame region of the micro-flames.
BACKGROUND
Immunosuppression is often considered as an indication for antibiotic prophylaxis to prevent surgical site infections (SSI) while performing skin surgery. However, the data on the risk of developing SSI after dermatologic surgery in immunosuppressed patients are limited.
PATIENTS AND METHODS
All patients of the Department of Dermatology and Allergology at the University Hospital of RWTH Aachen in Aachen, Germany, who underwent hospitalization for a dermatologic surgery between June 2016 and January 2017 (6 months), were followed up after surgery until completion of the wound healing process. The follow-up addressed the occurrence of SSI and the need for systemic antibiotics after the operative procedure. Immunocompromised patients were compared with immunocompetent patients. The investigation was conducted as a retrospective analysis of patient records.
RESULTS
The authors performed 284 dermatologic surgeries in 177 patients. Nineteen percent (54/284) of the skin surgeries were performed on immunocompromised patients. The most common indications for surgical treatment were nonmelanoma skin cancer and malignant melanomas. Surgical site infections occurred in 6.7% (19/284) of the cases. In 95% (18/19), systemic antibiotic treatment was needed. Twenty-one percent of all SSI (4/19) were seen in immunosuppressed patients.
CONCLUSION
According to the authors' data, immunosuppression does not represent a significant risk factor for SSI after dermatologic surgery. However, larger prospective studies are needed to make specific recommendations on the use of antibiotic prophylaxis while performing skin surgery in these patients.
The available data on complications after dermatologic surgery have improved over the past years. In particular, additional risk factors have been identified for surgical site infections (SSI). Purulent surgical sites, older age, involvement of head, neck, and acral regions, and also the involvement of less experienced surgeons have been reported to increase the risk of SSI after dermatologic surgeries.1 In general, the incidence of SSI after skin surgery is considered to be low.1,2 However, antibiotics in dermatologic surgeries, especially in the perioperative setting, seem to be overused,3,4 particularly in view of developing antibiotic resistance and side effects.
Immunosuppression has been recommended to be taken into consideration as an additional indication for antibiotic prophylaxis to prevent SSI after skin surgery in special cases.5,6 However, these recommendations do not specify the exact dermatologic surgeries, and were not specifically developed for dermatologic surgery patients and treatments, but adopted from other surgical fields.6 According to a survey conducted on American College of Mohs Surgery members in 2012, 13% to 29% of the surgeons administered antibiotic prophylaxis to immunocompromised patients to prevent SSI while performing dermatologic surgery on noninfected skin,3 although this was not recommended by the Journal of the American Academy of Dermatology Advisory Statement. Indeed, the data on the risk of developing SSI after dermatologic surgery in immunosuppressed patients are limited. However, it is possible that, due to the insufficient evidence on the risk of SSI occurrence in this patient group, dermatologic surgeons tend to overuse perioperative antibiotic prophylaxis.
To make specific recommendations on the use of antibiotic prophylaxis in immunosuppressed patients in the field of skin surgery, more information about the incidence of SSI after dermatologic surgery in these patients is needed. The aim of this study was to fill this data gap by investigating whether there is an increased risk of SSI after skin surgery in immunocompromised patients compared with immunocompetent patients.
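The proportions reported in the results above can be turned into per-group infection rates with a few lines of arithmetic: 54 of 284 procedures were in immunocompromised patients, and 4 of the 19 SSI occurred in that group, so the two groups' rates come out similar, which is the basis of the conclusion.

```python
# Numbers taken from the reported results (284 surgeries, 19 SSI,
# 54 surgeries in immunocompromised patients, 4 SSI among them).
ssi_immunosuppressed, ops_immunosuppressed = 4, 54
ssi_immunocompetent, ops_immunocompetent = 19 - 4, 284 - 54

rate_is = ssi_immunosuppressed / ops_immunosuppressed
rate_ic = ssi_immunocompetent / ops_immunocompetent
print(f"immunosuppressed: {rate_is:.1%}, immunocompetent: {rate_ic:.1%}")
```

Both rates land around 7%, i.e. no notable excess risk in the immunosuppressed group; a formal significance test would of course need the full patient-level data.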
The quest for life on other planets is closely connected with the search for water in its liquid state. Recent discoveries of deep oceans on icy moons like Europa and Enceladus have spurred an intensive discussion about how these waters can be accessed. The challenge of this endeavor lies in the unforeseeable requirements on instrumental characteristics, both with respect to the scientific and the technical methods. The TRIPLE/nanoAUV initiative aims to develop a mission concept for exploring exo-oceans and to demonstrate the achievements in an earth-analogue context, exploring the ocean under the ice shield of Antarctica and subglacial lakes such as Dome C on the Antarctic continent.
We present and discuss an exploration of the possibilities and properties of 3D printing with a printing space of 1 cubic meter, and how these can be integrated into architectural education through an experimental design and research course with students of architecture. We expand on issues presented at the eCAADe conference 2017 in Rome [Ref 6] by increasing the complexity and size of our prints, printing not a model to scale but a full-scale functional prototype of a usable architectural object: a coffee bar.
Extremely high lightning currents (Extrem hohe Blitzströme)
(2018)
Lightning remains an enormous source of damage, causing personal injury, fires, mechanical destruction and, in particular, overvoltages. This is confirmed not least by current statistics from property insurers. Time and again there are reports of extremely high lightning currents, which can of course also lead to major damage and destruction, with peak values sometimes quoted well above 300 kA. This raises questions, since the “classical” lightning statistics (e.g. according to CIGRE and IEC [8][10]) have so far not included such values. These extreme lightning currents are usually derived from the data of lightning location systems.
The Carologistics team participates in the RoboCup Logistics League (RCLL) for the seventh year. The RCLL requires precise vision, manipulation and path planning, as well as complex high-level decision making and multi-robot coordination. We outline our approach with an emphasis on recent modifications to those components. The team members in 2018 are David Bosen, Christoph Gollok, Mostafa Gomaa, Daniel Habering, Till Hofmann, Nicolas Limpert, Sebastian Schönitz, Morian Sonnet, Carsten Stoffels, and Tarik Viehmann. This paper is based on last year's team description.
Purpose — to compare the chemical elemental composition of vitreous cavity content taken from cadaveric eyes with samples taken from eyes with terminal-stage refractory glaucoma with decompensated intraocular pressure (IOP). Material and methods. The vitreous contents of eyes from 2 groups were studied. The 1st group included 15 cadaveric eyes; the 2nd group included 15 eyes with refractory glaucoma in the terminal stage of the disease with decompensated IOP in patients with hypertensive pain. The vitreal content samples were taken in the course of antiglaucoma surgery aimed at preserving the eye as an organ and involving placement of drainage in the vitreous cavity. The study of the vitreal contents was carried out on an Oxford X-Max 50 energy-dispersive spectrometer integrated into a Zeiss EVO LS10 scanning electron microscope. Results. Increased concentrations of potassium and phosphorus were detected in the vitreous content of cadaveric eyes compared with the vitreal content of eyes with terminal glaucoma with decompensated IOP taken in vivo (K — 0.172/0.093; P — 0.045/0.025 mmol/L). In the vitreous cavity of eyes with end-stage glaucoma with decompensated IOP, the concentration of nitrogen was higher in comparison with human cadaver eyes (2.030/1.424 mmol/L). Conclusion. The increased concentrations of potassium and phosphorus in the vitreous content of cadaveric eyes are associated with postmortem autolytic processes and the release of intracellular content during the destruction of cell membranes. The increased nitrogen concentration in the vitreal contents of eyes with terminal-stage glaucoma with decompensated IOP may be associated with the presence of osmotically active nitrogen-containing compounds in eyes with increased IOP.
Training end users to interact with digital systems is indispensable for strong computer security. 'Competence Developing Game'-based approaches are particularly suitable for this purpose because of their motivation and simulation aspects. In this paper, the Competence Developing Game 'GHOST' for cybersecurity awareness trainings and its underlying patterns are described. Accordingly, requirements for a 'Competence Developing Game'-based training are discussed. Based on these requirements, it is shown how a game can fulfill them. A supplementary game interaction design and a corresponding evaluation study are presented. The combination of training requirements and interaction design is used to create a 'Competence Developing Game'-based training concept. Part of this concept is implemented in a playable prototype that provides around one hour of play and training time. This prototype is used to evaluate the game and training aspects of the awareness training. Thereby, the quality of the game aspect and the effectiveness of the training aspect are shown.
Vehicle tire (Fahrzeugreifen)
(2018)
The invention relates to a vehicle tire with at least one radially outer rubber tread, bead areas for connection to a rim, and sidewall areas based on a rubber compound between the tread and the bead areas. The invention further relates to methods for producing such vehicle tires. In order to reduce the proportion of environmentally harmful substances, particularly in inner-city areas, the outer surface of the sidewall areas carries a photocatalytically active substance for the oxidative degradation of molecules.
Manufacturing process simulation (MPS) has become more and more important for the aviation and automobile industries. A highly competitive market requires the use of high-performance metals and composite materials, combined with reduced manufacturing cost and time as well as a minimized time to market for new products. However, the use of such materials is expensive and requires sophisticated manufacturing processes. An experience-based process and tooling design followed by lengthy trial-and-error optimization is no longer adequate. Instead, tooling design processes aided by simulation are used increasingly often. This paper provides an overview of the capabilities of MPS in the fields of sheet metal forming and prepreg autoclave manufacturing of composite parts, summarizing the resulting benefits for tooling design and manufacturing engineering. The simulation technology is explained briefly in order to show several simplification and optimization techniques for developing industrialized simulation approaches. Small case studies provide examples of efficient application on an industrial scale.
Investigations of the load-bearing capacity and stiffness of a novel wall element in timber construction (Untersuchungen zur Tragfähigkeit und Steifigkeit eines neuartigen Wandelements in Holzbauweisen)
(2018)
The Kremer-Grest (KG) bead-spring model is a near standard in Molecular Dynamics simulations of generic polymer properties. It owes its popularity to its computational efficiency, rather than its ability to represent specific polymer species and conditions. Here we investigate how to adapt the model to match the universal properties of a wide range of chemical polymer species. For this purpose we vary a single parameter originally introduced by Faller and Müller-Plathe, the chain stiffness. Examples include polystyrene, polyethylene, polypropylene, cis-polyisoprene, polydimethylsiloxane, polyethyleneoxide and styrene-butadiene rubber. We do this by matching the number of Kuhn segments per chain and the number of Kuhn segments per cubic Kuhn volume for the polymer species and for the Kremer-Grest model. We also derive mapping relations for converting KG model units back to physical units; in particular, we obtain the entanglement time for the KG model as a function of stiffness, allowing for a time mapping. To test these relations, we generate large equilibrated well-entangled polymer melts, and measure the entanglement moduli using a static primitive-path analysis of the entangled melt structure as well as by simulations of step-strain deformation of the model melts. The obtained moduli for our model polymer melts are in good agreement with the experimentally expected moduli.
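The two dimensionless targets of the mapping described above can be computed directly from tabulated polymer data: the number of Kuhn segments per chain, N_K = M / M_K, and the number of Kuhn segments per cubic Kuhn length, n_K = (ρ · N_A / M_K) · l_K³. A minimal sketch with polystyrene-like input values, which are illustrative assumptions rather than data from the paper:

```python
N_A = 6.022e23  # Avogadro's number, 1/mol

def kuhn_descriptors(chain_mass, kuhn_mass, density, kuhn_length_cm):
    """chain_mass, kuhn_mass in g/mol; density in g/cm^3; Kuhn length in cm."""
    segments_per_chain = chain_mass / kuhn_mass
    segments_per_kuhn_volume = density * N_A / kuhn_mass * kuhn_length_cm**3
    return segments_per_chain, segments_per_kuhn_volume

n_chain, n_volume = kuhn_descriptors(
    chain_mass=100_000.0,   # assumed melt molar mass, g/mol
    kuhn_mass=720.0,        # assumed Kuhn segment molar mass, g/mol
    density=1.05,           # g/cm^3
    kuhn_length_cm=1.8e-7,  # 1.8 nm
)
print(f"Kuhn segments per chain: {n_chain:.0f}")
print(f"Kuhn segments per cubic Kuhn length: {n_volume:.2f}")
```

Matching both numbers simultaneously in the KG model fixes the chain length and the stiffness parameter, which is the essence of the species-specific mapping.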
Algorithmic design and resilience assessment of energy efficient high-rise water supply systems
(2018)
High-rise water supply systems provide water flow and suitable pressure in all levels of tall buildings. To design such state-of-the-art systems, the consideration of energy efficiency and the anticipation of component failures are mandatory. In this paper, we use Mixed-Integer Nonlinear Programming to compute an optimal placement of pipes and pumps, as well as an optimal control strategy. Moreover, we consider the resilience of the system to pump failures. A resilient system is able to fulfill a predefined minimum functionality even though components fail or are restricted in their normal usage. We present models to measure and optimize the resilience. To demonstrate our approach, we design and analyze an optimal resilient decentralized water supply system inspired by a real-life hotel building.
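A toy version of the selection subproblem above can be brute-forced: choose a subset of pumps from a small catalog whose combined head (in series) meets the building demand at minimum purchase-plus-energy cost. The catalog and cost figures are illustrative assumptions; real MINLP formulations additionally model nonlinear pump curves, network topology and control, and require a dedicated solver.

```python
from itertools import combinations

# (name, head_m, purchase_cost, yearly_energy_cost) -- assumed values
CATALOG = [
    ("P1", 40, 1500, 700),
    ("P2", 60, 2200, 1000),
    ("P3", 90, 3500, 1600),
]
DEMAND_HEAD_M = 100
YEARS = 10

def best_configuration(catalog, demand, years):
    """Enumerate pump subsets; return (total cost, names) of the cheapest feasible one."""
    best = None
    for r in range(1, len(catalog) + 1):
        for combo in combinations(catalog, r):
            if sum(p[1] for p in combo) < demand:  # series heads must cover demand
                continue
            cost = sum(p[2] + years * p[3] for p in combo)
            if best is None or cost < best[0]:
                best = (cost, [p[0] for p in combo])
    return best

print(best_configuration(CATALOG, DEMAND_HEAD_M, YEARS))
```

Resilience constraints would shrink the feasible set further, e.g. by requiring the demand to remain coverable after removing any single pump from the chosen subset.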
Purpose
In vivo, a loss of mesh porosity triggers scar tissue formation and restricts functionality. The purpose of this study was to evaluate the properties and configuration changes, such as mesh deformation and mesh shrinkage, of a soft mesh implant compared with a conventional stiff mesh implant in vitro and in a porcine model.
Material and Methods
Tensile tests and digital image correlation were used to determine the textile porosity for both mesh types in vitro. Groups of three pigs each were treated with MRI-visible conventional stiff polyvinylidene fluoride (PVDF) meshes or with soft thermoplastic polyurethane (TPU) meshes (FEG Textiltechnik mbH, Aachen, Germany), respectively. Magnetic resonance imaging (MRI) was performed with a pneumoperitoneum at pressures of 0 and 15 mmHg, which resulted in bulging of the abdomen. The mesh-induced signal voids were semiautomatically segmented and the mesh areas were determined. With the deformations assessed for both mesh types at both pressure conditions, the porosity change of the meshes after 8 weeks of ingrowth was calculated as an indicator of preserved elastic properties. The explanted specimens were examined histologically for the maturity of the scar (collagen I/III ratio).
Results
In TPU, the in vitro porosity increased constantly; in PVDF, a loss of porosity was observed under mild stresses. In vivo, the mean mesh areas of TPU were 206.8 cm² (± 5.7 cm²) at 0 mmHg pneumoperitoneum and 274.6 cm² (± 5.2 cm²) at 15 mmHg; for PVDF the mean areas were 205.5 cm² (± 8.8 cm²) and 221.5 cm² (± 11.8 cm²), respectively. The pneumoperitoneum-induced pressure increase resulted in a calculated porosity increase of 8.4% for TPU and of 1.2% for PVDF. The mean collagen I/III ratio was 8.7 (± 0.5) for TPU and 4.7 (± 0.7) for PVDF.
Conclusion
The elastic properties of TPU mesh implants result in improved tissue integration compared to conventional PVDF meshes, and they adapt more efficiently to the abdominal wall. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 827–833, 2018.
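The relative mesh-area increase under the 15 mmHg pneumoperitoneum can be recomputed directly from the mean areas reported in the results; the larger stretch of the TPU mesh reflects its preserved elasticity. (The 8.4% and 1.2% porosity changes additionally involve the textile porosity model and are not reproduced here.)

```python
def area_increase_percent(a0_cm2, a15_cm2):
    """Relative area increase between 0 and 15 mmHg pneumoperitoneum."""
    return (a15_cm2 - a0_cm2) / a0_cm2 * 100.0

# Mean mesh areas from the reported results
print(f"TPU : {area_increase_percent(206.8, 274.6):.1f} % area increase")
print(f"PVDF: {area_increase_percent(205.5, 221.5):.1f} % area increase")
```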
68Ga-radiopharmaceuticals are common in the field of nuclear medicine for visualizing receptor-mediated processes. In contrast to straightforward labeling procedures for clinical applications, preclinical in vitro and in vivo applications are hampered by constraints such as volume restriction, activity concentration, molar activity and osmolality. We therefore developed a semiautomatic system specifically to overcome these problems. An unexpected difficulty appeared: intrinsic trace metals derived from the eluate (Zn, Fe and Cu) are concentrated as well, in amounts that influence the radiochemical yield and thus lower the molar activity.
Often, research results from collaboration projects are not transferred into productive environments, even though the approaches are proven to work in demonstration prototypes. These demonstration prototypes are usually too fragile and error-prone to be transferred easily into productive environments; a lot of additional work is required. Inspired by the idea of an incremental delivery process, we introduce an architecture pattern that combines the approach of Metrics Driven Research Collaboration with microservices for ease of integration. It enables keeping track of project goals over the course of the collaboration while every party may focus on their expert skills: researchers may focus on complex algorithms, practitioners may focus on their business goals. Through the simplified integration, (intermediate) research results can be introduced into a productive environment, which enables early user feedback and allows for the early evaluation of different approaches. The practitioners’ business model benefits throughout the full project duration.
Building-archaeological research within the multi-year project “Die Hochkorridore von Sabbioneta” (“The Elevated Corridors of Sabbioneta”) largely clarified the course of the elevated passageways and many of their connections to existing buildings. In the area of the Chiesa della SS. Maria Incoronata and the adjacent Servite monastery, the building surveys led to entirely new insights into the former course of the elevated corridor and thus to a reinterpretation of the architectural ensemble and its significance for the ideal city. This historical reinterpretation gave rise to ideas for the conversion of the former Servite monastery and its adjacent open spaces, which were developed further in student design projects at the Chair of Building History and Historic Preservation in cooperation with the Chair of Landscape Architecture at RWTH Aachen. Findings from historically oriented building research with an overarching research question thus proved to be a new source of impetus for urban planning, architecture and monument preservation.
Heavy-duty trucks are one of the main contributors to greenhouse gas emissions in German traffic. Drivetrain electrification is an option to reduce tailpipe emissions by increasing energy conversion efficiency. To evaluate a vehicle's environmental impacts, it is necessary to consider the entire life cycle: in addition to the daily use, the impact of production and disposal must be included. This study presents a comparative life cycle analysis of a parallel hybrid and a conventional heavy-duty truck in long-haul operation. Assuming a uniform vehicle glider, only the differing parts of both drivetrains are taken into account to calculate the environmental burdens of production. The use phase is modeled by a backward simulation in MATLAB/Simulink considering a characteristic driving cycle. A break-even analysis is conducted to show at what mileage the larger CO2eq emissions due to the production of the electric drivetrain are compensated. The effect of parameter variation on the break-even mileage is investigated by a sensitivity analysis. The results show a negative difference in CO2eq/t km, indicating that the hybrid vehicle emits 4.34 g CO2eq/t km less over its lifetime than the diesel truck. The break-even analysis also emphasizes the advantages of the electrified drivetrain, which compensates for the larger production emissions after a distance of only 15,800 km (approx. 1.5 months of operation). The break-even point, in terms of distance and CO2eq, strongly depends on the fuel, the emissions from battery production and the driving profile; nearly all parameter variations lead to an increase in the break-even distance.
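The break-even logic above reduces to simple arithmetic: the extra CO2eq emitted when producing the electric drivetrain is divided by the per-kilometre saving during use. The 4.34 g CO2eq/t km saving is taken from the abstract; the payload and the production penalty below are illustrative assumptions chosen only to land in the reported order of magnitude.

```python
def break_even_km(extra_production_kg_co2eq, saving_g_per_tkm, payload_t):
    """Distance at which use-phase savings offset the production penalty."""
    saving_g_per_km = saving_g_per_tkm * payload_t
    return extra_production_kg_co2eq * 1000.0 / saving_g_per_km

km = break_even_km(
    extra_production_kg_co2eq=1300.0,  # assumed battery/drivetrain penalty
    saving_g_per_tkm=4.34,             # from the study
    payload_t=19.3,                    # assumed payload
)
print(f"break-even after about {km:,.0f} km")
```

The sensitivity analysis in the study corresponds to varying exactly these three inputs; larger battery-production emissions or a smaller per-km saving both push the break-even distance up.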
Sterben und Tod aus wissenschaftlicher Sicht - dying and death from a scientific point of view
(2018)
The porosity of surgical meshes makes them flexible enough for large elastic deformation and establishes the conditions for good tissue ingrowth during healing. The biomechanical modeling of orthotropic and compressible materials requires new material models and a simultaneous fit of the deformation in the load direction as well as transversely to the load. This nonlinear modeling can be achieved by means of an optical deformation measurement. At the same time, the full-field deformation measurement allows the determination of the change of porosity with deformation. The so-called effective porosity, which has been defined to assess the tissue interaction with mesh implants, can also be determined from the global deformation of the surgical meshes.
Prosthetic textile implants of different shapes, sizes and polymers are used to correct apical prolapse after hysterectomy (removal of the uterus). The selection of the implant before or during minimally invasive surgery depends on the patient’s anatomical defect, the intended function after reconstruction and, most importantly, the surgeon’s preference. Weakness or damage of the supporting tissues during childbirth, menopause or previous pelvic surgeries may put females at higher risk of prolapse. Numerical simulations of the reconstructed pelvic floor with weakened tissues and the organ supported by the textile product models DynaMesh®-PRS soft, DynaMesh®-PRP soft and DynaMesh®-CESA from FEG Textiletechnik mbH, Germany, are compared.
Abstract: In orthopaedics, therapeutic ultrasound serves as a means of prevention and of accompanying therapy. It has mechanical, thermal and physico-chemical effects on the human body. To gain more insight into the thermal effects, experiments were carried out on a hydrogel phantom and on test subjects. A significant warming of the tissue occurred, which was measured at the surface in the test-subject experiment and at depth in the hydrogel experiment.
Summary: In orthopaedics, therapeutic ultrasound is a tool for prevention and therapy support. It has mechanical, thermal and physico-chemical effects on the human body. Tests with a hydrogel phantom and with human test subjects have been performed in order to learn more about its thermal effects. Both tests measured temperature increases in the tissue: at the surface in the human-subject test and at depth in the hydrogel phantom test.
The integration of sensors is one of the major tasks in embedded, control and “internet of things” (IoT) applications. For the integration, mainly digital interfaces are used, ranging from the rather simple pulse-width modulation (PWM) interface to more complex interfaces like CAN (Controller Area Network). Even though these interfaces are tethered by definition, a wireless realization is highly welcome in many applications to reduce cable and connector cost, increase flexibility and enable new emerging applications like wireless control systems. Currently used wireless solutions like Bluetooth, WirelessHART or IO-Link Wireless rely on dedicated communication standards and corresponding higher protocol layers to realize the wireless communication. Due to the complexity of the communication and the protocol handling, additional latency and jitter are introduced into the data communication, so that the requirements of many applications can no longer be met. Even though tunnelling of other bus data like CAN data is generally also possible, the latency and jitter prevent the tunnelling from being transparent for the bus system. Therefore, a new basic technology based on dual-mode radio is used to realize a wireless communication on the physical layer only, enabling a reliable and real-time data transfer. As this system operates on the physical layer, it is independent of any higher layers of the OSI (open systems interconnection) model. Hence it can be used for several different communication systems to replace the tethered physical layer. A prototype is developed and tested for real-time wireless PWM, SENT (single-edge nibble transmission) and CAN data transfer with very low latency and jitter.
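Why protocol-based tunnelling struggles to stay transparent for a bus like CAN can be illustrated with a short timing calculation. The overhead figures and the stuffing allowance below are rough textbook approximations, not measurements from the prototype described in the abstract:

```python
# Hedged sketch: approximate on-wire time of a classical CAN frame, to show
# the timescale a transparent wireless replacement has to match.
# Overhead bits and the stuffing factor are rough approximations.

def can_frame_time_us(data_bytes: int, bitrate_bps: int,
                      stuffing: float = 1.1) -> float:
    """Approximate transmission time in microseconds of a classical CAN
    frame with a standard 11-bit identifier: ~47 overhead bits plus
    8 bits per data byte, scaled by a rough worst-case stuffing allowance."""
    bits = (47 + 8 * data_bytes) * stuffing
    return bits / bitrate_bps * 1e6

# At 500 kbit/s, an 8-byte frame occupies the bus for roughly 0.25 ms.
t = can_frame_time_us(8, 500_000)
```

A physical-layer relay only needs to add delay and jitter that are small compared with this frame time, whereas packetising frames into the slot structure of a protocol stack like Bluetooth typically adds latency on the order of milliseconds, which is why such tunnelling is not transparent to the bus.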
In this work, we report on our attempt to design and implement an early introduction to basic robotics principles for children at kindergarten age. One of the main challenges of this effort is to explain complex robotics content in a way that pre-school children can follow the basic principles and ideas, using examples from their world of experience. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that a quiz at the end of the lesson is done using robots as well. The humanoid robot Pepper from SoftBank, which is a great platform for human–robot interaction experiments, was used to present a lecture on robotics by reading out the content to the children, making use of its speech synthesis capability. A quiz in a Runaround-game-show style after the lecture activated the children to recap what they had learned about how mobile robots work in principle. In this quiz, two LEGO Mindstorms EV3 robots were used to implement a strongly interactive scenario. Besides the thrill of being exposed to a mobile robot that would also react to them, the children were very excited and at the same time very concentrated. We got very positive feedback from the children as well as from their educators. To the best of our knowledge, this is one of only a few attempts to use a robot like Pepper not as a tele-teaching tool, but as the teacher itself in order to engage pre-school children with complex robotics content.
Sleep scoring is a necessary and time-consuming task in sleep studies. In animal models (such as mice) or in humans, automating this tedious process promises to facilitate long-term studies and to promote sleep biology as a data-driven field. We introduce a deep neural network model that is able to predict different states of consciousness (Wake, Non-REM, REM) in mice from EEG and EMG recordings with excellent scoring results for out-of-sample data. Predictions are made on epochs of 4 seconds length, and epochs are classified as artifact-free or not. The model architecture draws on recent advances in deep learning and in convolutional neural network research. In contrast to previous approaches towards automated sleep scoring, our model does not rely on manually defined features of the data but learns predictive features automatically. We expect deep learning models like ours to become widely applied in different fields, automating many repetitive cognitive tasks that were previously difficult to tackle.
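The first step of such a pipeline, cutting a continuous EEG/EMG recording into fixed-length 4-second epochs for classification, can be sketched as follows; the sampling rate is an illustrative assumption, and the actual model's preprocessing may differ:

```python
# Hedged sketch: segmenting a continuous 1-D recording into fixed-length
# epochs, as used for per-epoch sleep-stage classification.
# The 128 Hz sampling rate below is an assumption for illustration.

def segment_epochs(signal, fs_hz: int, epoch_s: int = 4):
    """Split a 1-D recording into non-overlapping epochs of epoch_s seconds;
    a trailing partial epoch is dropped, mirroring fixed-length scoring."""
    n = fs_hz * epoch_s
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]

# Illustrative: 10 s of a 128 Hz recording yields two complete 4 s epochs
# of 512 samples each; the trailing 2 s are discarded.
epochs = segment_epochs(list(range(10 * 128)), fs_hz=128)
```

Each epoch would then be fed to the network, which assigns one of the three consciousness states (Wake, Non-REM, REM) and an artifact-free/artifact label per epoch.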
Wireless CAN
(2018)
In modern electronic and mechatronic systems, e.g. in the industrial or automotive domain, embedded control units and sensors frequently communicate via bus systems such as CAN or LIN. The communication usually takes place over wires, so the wiring harness for the communication can become very large. It is therefore an obvious step to save cables and the associated connectors, e.g. for non-safety-critical comfort systems, and to replace them with directional short-range radio links. Components such as ECUs or sensors could thus be integrated into a bus system without cables and connectors. In addition, simple galvanic and mechanical isolation can be achieved. Radio transmission is currently not used in these bus systems, in particular because the real-time capability and the robustness of existing radio systems do not meet the requirements of the applications. Moreover, existing radio systems such as WLAN or Bluetooth are expensive compared to conventional cabling, and there is the risk that they can be eavesdropped on and sensitive data can be stolen. In this work, an alternative to the existing radio systems is presented that can be built from only a few components. A protocol-less, real-time-capable transmission is possible, and thus the transparent integration into a bus system such as CAN.