Against the background of growing data in everyday life, data processing tools are becoming more powerful in order to cope with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the additional data consumes time in collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer it. This paper presents historic and contemporary models for visualizing the architectural planning process and introduces means to describe today’s situation in terms of stakeholders, events and instruments. It contrasts conceptions from the Renaissance with models used in the second half of the 20th century. Contemporary models are discussed with regard to their value against the background of increasing computation in the building process.
The inverse scattering problem for a conductive boundary condition and transmission eigenvalues
(2018)
In this paper, we consider the inverse scattering problem associated with an inhomogeneous medium with a conductive boundary. In particular, we are interested in two problems that arise from this inverse problem: the inverse conductivity problem and the corresponding interior transmission eigenvalue problem. The inverse conductivity problem is to recover the conductive boundary parameter from the measured scattering data. We prove that the measured scattering data uniquely determine the conductivity parameter, and we describe a direct algorithm to recover the conductivity. The interior transmission eigenvalue problem is an eigenvalue problem associated with the inverse scattering of such materials. We investigate the convergence of the eigenvalues as the conductivity parameter tends to zero, and we prove existence and discreteness for the case of an absorbing medium. Lastly, several numerical and analytical results support the theory, and we show that the inside–outside duality method can be used to reconstruct the interior conductive eigenvalues.
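For orientation, a common way to write a scattering problem with a conductive boundary condition in the literature on this problem class is sketched below; this is a standard form, not necessarily the exact normalization used in this paper.

```latex
% Sketch of the standard conductive-boundary scattering problem:
% total field u, wavenumber k, refractive index n(x), and conductivity
% parameter eta supported on the boundary of the scatterer D.
\begin{aligned}
\Delta u + k^2 u &= 0 && \text{in } \mathbb{R}^d \setminus \overline{D},\\
\Delta u + k^2 n(x)\,u &= 0 && \text{in } D,\\
u_+ - u_- &= 0 && \text{on } \partial D,\\
\partial_\nu u_+ - \partial_\nu u_- &= \eta\, u && \text{on } \partial D.
\end{aligned}
```

As $\eta \to 0$, the jump condition on the normal derivative vanishes and the classical transmission conditions are recovered, which is consistent with the abstract's interest in the convergence of the eigenvalues as the conductivity parameter tends to zero.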
During rapid deceleration of the body, tendons buffer part of the elongation of the muscle-tendon unit (MTU), enabling safe energy dissipation via eccentric muscle contraction. Yet, the influence of changes in tendon stiffness within the physiological range upon these lengthening contractions is unknown. This study aimed to examine the effect of training-induced stiffening of the Achilles tendon on triceps surae muscle-tendon behavior during a landing task. Twenty-one male subjects were assigned to either a 10-week resistance-training program consisting of single-leg isometric plantarflexion (n = 11) or to a non-training control group (n = 10). Before and after the training period, plantarflexion force, peak Achilles tendon strain and stiffness were measured during isometric contractions, using a combination of dynamometry, ultrasound and kinematics data. Additionally, testing included a step-landing task, during which joint mechanics and lengths of gastrocnemius and soleus fascicles, Achilles tendon, and MTU were determined using synchronized ultrasound, kinematics and kinetics data collection. After training, plantarflexion strength and Achilles tendon stiffness increased (15% and 18%, respectively), and tendon strain during landing remained similar. Likewise, lengthening and negative work produced by the gastrocnemius MTU did not change detectably. However, in the training group, gastrocnemius fascicle length was offset (8%) to a longer length at touch down and, surprisingly, fascicle lengthening and velocity were reduced by 27% and 21%, respectively. These changes were not observed for soleus fascicles when accounting for variation in task execution between tests. These results indicate that a training-induced increase in tendon stiffness does not noticeably affect the buffering action of the tendon when the MTU is rapidly stretched. Reductions in gastrocnemius fascicle lengthening and lengthening velocity during landing occurred independently from tendon strain. Future studies are required to provide insight into the mechanisms underpinning these observations and their influence on energy dissipation.
Comparison of different training algorithms for the leg extension training with an industrial robot
(2018)
In the past, different training scenarios have been developed and implemented on robotic research platforms, but no systematic analysis and comparison have been done so far. This paper deals with the comparison of an isokinematic (motion with constant velocity) and an isotonic (motion against constant weight) training algorithm. Both algorithms are designed for a robotic research platform consisting of a 3D force plate and a high-payload industrial robot, which allows leg extension training with arbitrary six-dimensional motion trajectories. In the isokinematic as well as the isotonic training algorithm, individual paths are defined in Cartesian space by sufficient support poses. In the isokinematic training scenario, the robot only moves along the trajectory, at constant velocity, as long as the force applied by the user exceeds a minimum threshold. In the isotonic training scenario, however, the robot’s acceleration is a function of the force applied by the user. To validate these findings, a simulative experiment with a simple linear trajectory is performed. For this purpose, the same force path is applied in both training scenarios. The results illustrate that the algorithms differ in the force-dependent trajectory adaption.
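The contrast between the two control laws can be sketched in one dimension. This is a minimal illustration, not the platform's actual controller; all names and constants (`v_iso`, `k_acc`, `f_min`, the force profile) are invented for the example.

```python
# Toy 1-D comparison of the two training control laws described above.
# Isokinematic: constant velocity while user force exceeds a threshold.
# Isotonic: acceleration proportional to the force above the threshold.
# All parameters are illustrative assumptions, not values from the paper.

def isokinematic_step(position, force, dt, v_iso=0.1, f_min=50.0):
    """Advance at constant velocity only while force exceeds the threshold."""
    if force > f_min:
        position += v_iso * dt
    return position

def isotonic_step(position, velocity, force, dt, k_acc=0.002, f_min=50.0):
    """Acceleration is a function of the force applied by the user."""
    accel = k_acc * max(force - f_min, 0.0)
    velocity += accel * dt
    position += velocity * dt
    return position, velocity

# Apply the same force path to both controllers, as in the paper's
# simulative experiment: constant 80 N for 1 s at dt = 0.01 s.
forces = [80.0] * 100
p_kin = 0.0
p_ton, v_ton = 0.0, 0.0
for f in forces:
    p_kin = isokinematic_step(p_kin, f, 0.01)
    p_ton, v_ton = isotonic_step(p_ton, v_ton, f, 0.01)
print(p_kin, p_ton)  # identical force input, different resulting trajectories
```

Even with an identical force path, the isokinematic controller covers distance linearly in time while the isotonic one accelerates from rest, which is the force-dependent trajectory adaption the comparison highlights.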
Magnetic detection structure for Lab-on-Chip applications based on the frequency mixing technique
(2018)
A magnetic frequency mixing technique with a set of miniaturized planar coils was investigated for use with a completely integrated Lab-on-Chip (LoC) pathogen sensing system. The system allows the detection and quantification of superparamagnetic beads. Additionally, in terms of magnetic nanoparticle characterization ability, the system can be used for immunoassays using the beads as markers. Analytical calculations and simulations for both excitation and pick-up coils are presented; the goal was to investigate the miniaturization of simple and cost-effective planar spiral coils. Following these calculations, a Printed Circuit Board (PCB) prototype was designed, manufactured, and tested for limit of detection, linear response, and validation of theoretical concepts. Using the magnetic frequency mixing technique, a limit of detection of 15 µg/mL of 20 nm core-sized nanoparticles was achieved without any shielding.
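The detection principle behind frequency mixing can be illustrated numerically: a linear medium driven at two frequencies responds only at those frequencies, whereas the saturating magnetization of superparamagnetic beads generates intermodulation products such as f1 + 2·f2. The sketch below models the nonlinearity with a tanh curve; all frequencies and amplitudes are illustrative assumptions, not the parameters of the actual sensor.

```python
import numpy as np

# Why frequency mixing detects superparamagnetic beads (toy model).
# A linear response contains only the drive frequencies f1 and f2;
# a saturable (tanh-like) magnetization additionally produces mixing
# components, e.g. at f1 + 2*f2, which indicate the beads' presence.

fs = 100_000                      # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)     # 1 s of data -> 1 Hz frequency bins
f1, f2 = 2_000, 50                # high- and low-frequency excitation (Hz)
drive = 0.3 * np.sin(2 * np.pi * f1 * t) + 1.0 * np.sin(2 * np.pi * f2 * t)

linear_response = drive           # empty coil: no intermodulation
bead_response = np.tanh(drive)    # saturating magnetization: mixing appears

def amplitude_at(signal, freq):
    """Magnitude of the DFT bin at `freq` (1 Hz resolution here)."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    return spectrum[int(freq)]

mix = f1 + 2 * f2                 # 2100 Hz mixing component
print(amplitude_at(linear_response, mix))  # ~0: no beads
print(amplitude_at(bead_response, mix))    # clearly nonzero: beads detected
```

Measuring at a mixing frequency rather than at the excitation frequencies is what makes the technique selective for the nonlinear beads and robust against the (linear) excitation feed-through.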
Within the European joint research project INSYSME, the German partners developed the systems IMES and INODIS to improve the seismic behavior of infilled reinforced concrete frames. The aim of both systems is to decouple the reinforced concrete frame from the infill, instead of increasing the load-bearing capacity with elaborate and costly additional reinforcement. First results of the IMES system for in-plane and out-of-plane loading are presented.
The efficiency concepts of Bahadur and Pitman are used to compare the Wilcoxon tests in paired and independent survey samples. A comparison through the length of corresponding confidence intervals is also done. Simple conditions characterizing the dominance of a procedure are derived. Statistical tests for checking these conditions are suggested and discussed.
The paper deals with the asymptotic behaviour of estimators, statistical tests and confidence intervals for L²-distances to uniformity based on the empirical distribution function, the integrated empirical distribution function and the integrated empirical survival function. Approximations of power functions, confidence intervals for the L²-distances and statistical neighbourhood-of-uniformity validation tests are obtained as main applications. The finite sample behaviour of the procedures is illustrated by a simulation study.
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
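The Hilbert-space test in this abstract generalizes the classical one-sample Cramér-von-Mises test; the univariate (finite-dimensional) case is available in SciPy and serves as a concrete reference point. The sample sizes and the shifted alternative below are illustrative choices, not from the paper.

```python
import numpy as np
from scipy import stats

# Classical scalar Cramér-von-Mises one-sample goodness-of-fit test,
# i.e. the univariate special case of the integral-type test above.

rng = np.random.default_rng(0)

# Data drawn from the hypothesized N(0, 1): the test should not reject.
null_sample = rng.standard_normal(500)
res_null = stats.cramervonmises(null_sample, 'norm')

# Data from a mean-shifted alternative: the test should reject.
alt_sample = rng.standard_normal(500) + 0.5
res_alt = stats.cramervonmises(alt_sample, 'norm')

print(res_null.pvalue)  # large: consistent with N(0, 1)
print(res_alt.pvalue)   # very small: N(0, 1) rejected
```

The statistic integrates the squared difference between the empirical and the hypothesized distribution function; the Hilbert-space version of the paper replaces this with an integral-type statistic suited to high- and infinite-dimensional data.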
The continuing growth of scientific publications raises the question of how literature analyses within research processes can be digitized and thus carried out more productively. Particularly in information technology disciplines, research practice is characterized by a rapidly growing volume of publications. Consequently, text analytics methods, which can automatically prepare and process text data, lend themselves to this task. Insights emerge from analyses of parts of speech and subgroups, as well as from correlation and time series analyses. This contribution presents the design and implementation of a prototype with which users can explore bibliographic data from the established literature database EBSCO Discovery Service using text analytics methods. The prototype is based on the analysis system IBM Watson Explorer, which is available to universities free of license fees. Potential audiences for the prototype are research institutions, consulting firms, and decision-makers in politics and business.
After an intensive political discourse, the General Data Protection Regulation (GDPR) was adopted last year. On 25 May 2018, the GDPR replaces the Data Protection Directive 95/46/EC, which had been in force since 1995. The reform of data protection law brings numerous new requirements with it. Companies are therefore compelled to adapt to the changes, to review their data-protection-relevant processes with regard to the new requirements, and to align them with the GDPR by May 2018. This contribution gives a brief overview of the central aspects of the data protection reform and the associated challenges for companies.
Although long discussed in legal scholarship, the prohibition of tying (Kopplungsverbot) led a shadowy existence under the German Federal Data Protection Act (BDSG). With the General Data Protection Regulation (GDPR), a change is foreseeable: the new Art. 7(4) GDPR makes clear that the performance of a service may not be made conditional on the granting of consent. Yet this apparent novelty of data protection law raises numerous questions. While representatives of business practice in particular reject the application of the tying prohibition in numerous constellations, its apologists conjure up the end of all "data-financed" services. This contribution provides insight into the regulatory depth of a provision that could revolutionize Web 2.0, and proposes a solution that does equal justice to the protection of the data subject's privacy and the economic interests of service providers.
In its Art. 3, the General Data Protection Regulation (GDPR) governs the territorial scope of data protection law and is aimed in particular at services offered by non-European providers. The discussion so far has focused primarily on the newly introduced marketplace principle; the largely untouched establishment principle, and above all the problems arising from its unchanged retention, have not been addressed. The following contribution attempts a systematic analysis of a topic that is in part controversial and in part hardly discussed at all.
The legal concept of joint controllership has long occupied the data protection literature. Determining responsibility in processing operations based on a division of labor, which are common above all in today's platform services, is complex: several actors are always involved, and as a rule the action of one participant triggers several processing steps. The ECJ has now ruled on the matter in a judgment that is remarkable in several respects.
The new church data protection law: challenges for private-sector companies
(2018)
Resilience as a concept has found its way into different disciplines to describe the ability of an individual or system to withstand and adapt to changes in its environment. In this paper, we provide an overview of the concept in different communities and extend it to the area of mechanical engineering. Furthermore, we present metrics to measure resilience in technical systems and illustrate them by applying them to load-carrying structures. By giving application examples from the Collaborative Research Centre (CRC) 805, we show how the concept of resilience can be used to control uncertainty during different stages of product life.
In industrial applications, the costs for the operation and maintenance of a pump system typically far exceed its purchase price. For finding an optimal pump configuration which minimizes not only investment but life-cycle costs, methods like Technical Operations Research, which is based on Mixed-Integer Programming, can be applied. However, during the planning phase, the designer is often faced with uncertain input data; e.g., future load demands can only be estimated. In this work, we deal with this uncertainty by developing a chance-constrained two-stage (CCTS) stochastic program. The design and operation of a booster station working under uncertain load demand are optimized to minimize total cost, including purchase price, operation cost incurred by energy consumption, and penalty cost resulting from water shortage. We find optimized system layouts using a sample average approximation (SAA) algorithm, and analyze the results for different risk levels of water shortage. By adjusting the risk level, the costs and performance range of the system can be balanced, and thus the system's resilience can be engineered.
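The SAA idea can be illustrated with a deliberately small toy model: sample demand scenarios, and among the candidate designs keep the cheapest one whose empirical shortage probability stays below the risk level. All numbers (capacities, costs, the demand model) are invented for illustration and are not from the paper; the actual CCTS program also optimizes second-stage operation, which this sketch omits.

```python
import random

# Toy sample average approximation (SAA) for a chance-constrained design:
# choose the number of identical pumps for a booster station so that the
# sampled demand scenarios are covered with probability >= 1 - eps, while
# minimizing purchase plus average operating and penalty cost.
# All constants below are illustrative assumptions.

random.seed(42)

PUMP_CAPACITY = 100.0     # m^3/h per pump
PURCHASE_COST = 5_000.0   # per pump
ENERGY_COST = 2.0         # per m^3 delivered (stand-in for operation cost)
SHORTAGE_PENALTY = 50.0   # per m^3 of unmet demand

# Sample uncertain future load demands: the scenarios of the SAA.
scenarios = [random.gauss(250.0, 60.0) for _ in range(2000)]

def evaluate(n_pumps, eps=0.05):
    """Return (average total cost, chance-constraint satisfied) for a design."""
    capacity = n_pumps * PUMP_CAPACITY
    shortages = [max(d - capacity, 0.0) for d in scenarios]
    delivered = [min(d, capacity) for d in scenarios]
    shortage_prob = sum(s > 0 for s in shortages) / len(scenarios)
    cost = (n_pumps * PURCHASE_COST
            + ENERGY_COST * sum(delivered) / len(scenarios)
            + SHORTAGE_PENALTY * sum(shortages) / len(scenarios))
    return cost, shortage_prob <= eps

# First stage: enumerate designs, keep the cheapest chance-feasible one.
feasible = [(evaluate(n)[0], n) for n in range(1, 10) if evaluate(n)[1]]
best_cost, best_n = min(feasible)
print(best_n, round(best_cost, 1))
```

Raising `eps` admits cheaper designs at a higher risk of shortage, lowering it forces more redundancy: this is the cost-versus-performance trade-off through which, in the abstract's terms, the system's resilience can be engineered.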