Background:
Additional stabilization of the “comma sign” in anterosuperior rotator cuff repair has been proposed to provide biomechanical benefits regarding stability of the repair.
Purpose:
This in vitro investigation aimed to determine the influence of a comma sign–directed reconstruction technique for anterosuperior rotator cuff tears on the primary stability of the subscapularis tendon repair.
Study Design:
Controlled laboratory study.
Methods:
A total of 18 fresh-frozen cadaveric shoulders were used in this study. Anterosuperior rotator cuff tears (complete full-thickness tears of the supraspinatus and subscapularis tendons) were created, and supraspinatus repair was performed with a standard suture bridge technique. The subscapularis was repaired with either (1) a single-row or (2) a comma sign technique. A high-resolution 3D camera system was used to analyze 3-mm and 5-mm gap formation at the subscapularis tendon-bone interface upon incremental cyclic loading. Moreover, the ultimate failure load of the repair was recorded. A Mann-Whitney test was used to assess significant differences between the 2 groups.
Results:
The comma sign repair withstood significantly more loading cycles than the single-row repair until 3-mm and 5-mm gap formation occurred (P ≤ .047). The ultimate failure load did not reveal any significant differences when the 2 techniques were compared (P = .596).
Conclusion:
The results of this study show that additional stabilization of the comma sign enhanced the primary stability of subscapularis tendon repair in anterosuperior rotator cuff tears. Although this stabilization did not seem to influence the ultimate failure load, it effectively decreased the micromotion at the tendon-bone interface during cyclic loading.
Clinical Relevance:
The proposed technique for stabilization of the comma sign has shown superior biomechanical properties in comparison with a single-row repair and might thus improve tendon healing. Further clinical research will be necessary to determine its influence on the functional outcome.
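The between-group comparisons above use the Mann-Whitney U test. As a sketch of how that statistic works — a stdlib normal-approximation version run on hypothetical cycle counts, not the study's data or its statistics software:

```python
import math
from itertools import chain

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.

    Returns (U, p). Ties get midranks; for samples as small as this
    study's, exact tables would normally be preferred in practice.
    """
    n1, n2 = len(x), len(y)
    combined = sorted(chain(((v, 0) for v in x), ((v, 1) for v in y)))
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(combined):                     # assign midranks to ties
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + j + 1) / 2           # average of ranks i+1..j
        i = j
    r1 = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # no tie correction
    z = (u - mu + 0.5) / sigma                   # continuity correction
    p = 2 * 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return u, min(1.0, p)
```

For clearly separated samples such as `mann_whitney_u([1, 2, 3], [4, 5, 6])`, the U statistic is 0 and the approximate two-sided p is about 0.08.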
For now, the Planetary Defense Conference Exercise 2021's incoming fictitious(!) asteroid, 2021 PDC, seems headed for impact on October 20th, 2021, exactly 6 months after its discovery. Today (April 26th, 2021), the impact probability is 5%, a steep rise from 1 in 2500 upon discovery six days ago. We all know how these things end. Or do we? Unless somebody kicked off another headline-grabbing media scare or wants to keep civil defense very idle very soon, chances are that it will hit (note: this is an exercise!). Taking stock: it is barely 6 months to impact, with a steadily rising likelihood that it will actually happen, and a huge uncertainty in possible impact energies. First estimates range from 1.2 MtTNT to 13 GtTNT, and this is not even the worst-worst case: a 700 m diameter massive NiFe asteroid (covered by a thin veneer of Ryugu-black rubble to match size and brightness) would come in at 70 GtTNT. In down-to-Earth terms, this could be anything between smashing fireworks over some remote area of the globe and a 7.5 km crater downtown somewhere. Considering the deliberate and sedate pace at which interplanetary missions are developed, it seems we can only stand and stare until we know well enough where to tell people to pack up all that can be moved at all and save themselves. But then, it could just as well be a smaller bright rock: the best estimate is 120 m diameter, from optical observation alone assuming a 13% standard albedo. NASA's upcoming DART mission to the binary asteroid (65803) Didymos is designed to hit such a small target, its moonlet Dimorphos. The Deep Impact mission's impactor in 2005 successfully guided itself to the brightest spot on comet 9P/Tempel 1, a relatively small feature on the 6 km nucleus. And 'space' has changed: by the end of this decade, one satellite communication network plans to have launched over 11000 satellites, at a pace of 60 per launch every other week.
This level of series production is comparable in numbers to the most prolific commercial airliners. Launch vehicle production has not simply scaled up correspondingly – the vehicles can be reused, although in a trade for performance. Optical and radio astronomy as well as planetary radar have made great strides in the past decade, and so has the design and production capability for everyday 'high-tech' products. 60 years ago, spaceflight was invented from scratch within two years, and there are recent examples of fast-paced space projects as well as a drive towards 'responsive space'. It seems it is not quite yet time to abandon all hope. We present what could be done and what is too close to call once thinking is shoved out of the box by a clear and present danger, to show where a little more preparedness or routine would come in handy – or become decisive. And if we fail, let's stand and stare safely and well instrumented anywhere on Earth, together, in the greatest adventure of science.
Modern industry and multi-discipline projects require highly trained individuals with resilient science and engineering backgrounds. Graduates must be able to agilely apply excellent theoretical knowledge in their subject matter as well as essential practical “hands-on” knowledge of diverse working processes to solve complex problems. To meet these demands, university education follows the concept of Constructive Alignment and thus increasingly adapts the teaching of necessary practical skills to actual industry requirements and assessment routines. However, a systematic approach to coherently align these three central teaching demands is strangely absent from current university curricula. We demonstrate the feasibility of implementing practical assessments in a regular theory-based examination, thus defining the term “blended assessment”. We assessed a course for natural science and engineering students pursuing a career in biomedical engineering, and evaluated the benefit of blended assessment exams for students and lecturers. Our controlled study assessed the physiological background of electrocardiograms (ECGs), the practical measurement of ECG curves, and the interpretation of basic pathologic alterations. To study long-term effects, students were assessed on the topic twice, with a time lag of 6 months. Our findings suggest a significant improvement in student gain with respect to practical skills and theoretical knowledge. The results of the reassessments support these outcomes. From the lecturers' point of view, blended assessment complements practical training courses while keeping organizational effort manageable. We consider blended assessment a viable, industry-ready education format that improves student gain and should be evaluated and established further to prepare university graduates optimally for their future careers.
An approach to automatically generate a dynamic energy simulation model in Modelica for a single existing building is presented. It aims at collecting data about the status quo, in preparation of energy retrofits, with low effort and costs. The proposed method starts from a polygon model of the outer building envelope obtained from photogrammetrically generated point clouds. The open-source tools TEASER and AixLib are used for data enrichment and model generation. A case study was conducted on a single-family house. The resulting model can accurately reproduce the internal air temperatures during synthetic heat-up and cool-down phases. Modelled and measured whole-building heat transfer coefficients (HTC) agree within a 12% range. A sensitivity analysis emphasises the importance of accurate window characterisations and justifies the use of a very simplified interior geometry. Uncertainties arising from the use of archetype U-values are estimated by comparing different typologies, with best- and worst-case estimates showing differences in pre-retrofit heat demand of about ±20% from the average. However, as the assumptions made are permitted by some national standards, the method is already close to practical applicability and opens up a path to quickly estimate possible financial and energy savings after refurbishment.
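The whole-building heat transfer coefficient compared in this study is, in essence, a sum of U·A transmission terms plus a ventilation contribution. A minimal sketch, with purely illustrative U-values, areas, and airflow rate rather than the case-study building's data:

```python
def heat_transfer_coefficient(elements, ventilation_flow_m3h=0.0):
    """Whole-building HTC in W/K: transmission losses summed over
    envelope elements (U in W/(m2 K), A in m2) plus a simple
    ventilation term using ~0.34 Wh/(m3 K) for air."""
    transmission = sum(u * a for u, a in elements)   # W/K
    ventilation = 0.34 * ventilation_flow_m3h        # W/K
    return transmission + ventilation

# Hypothetical pre-retrofit single-family house:
elements = [
    (1.4, 120.0),  # exterior walls
    (0.8, 90.0),   # roof
    (2.8, 18.0),   # windows -- the sensitivity analysis above flags these
    (1.0, 70.0),   # ground floor
]
htc = heat_transfer_coefficient(elements, ventilation_flow_m3h=100.0)  # 394.4 W/K
```

Swapping the archetype U-values for those of a different typology and comparing the resulting HTC is exactly the kind of best-/worst-case spread the ±20% estimate above refers to.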
Geochemical characterisation of hypersaline waters is difficult, as high concentrations of salts hinder the analysis of constituents at low concentrations, such as trace metals, and samples collected for trace metal analysis in natural waters can easily be contaminated. This is particularly the case if samples are collected by non-conventional techniques, such as those required for aquatic subglacial environments. In this paper we present the first analysis of a subglacial brine from Taylor Valley (~78°S), Antarctica, for the trace metals Ba, Co, Mo, Rb, Sr, V, and U. Samples were collected englacially using an electrothermal melting probe called the IceMole. This probe uses differential heating of a copper head as well as of the probe’s sidewalls, and an ice screw at the melting head, to move through glacier ice. Detailed blanks, meltwater samples, and subglacial brine samples were collected to evaluate the impact of the IceMole and the borehole pump, the melting and collection process, filtration, and storage on the geochemistry of the samples collected by this device. Comparisons of meltwater profiles through the glacier ice and blank analyses with published studies on ice geochemistry suggest the potential for minor contributions of some species (Rb, As, Co, Mn, Ni, NH4+, and NO2−+NO3−) from the IceMole. The ability to conduct detailed chemical analyses of subglacial fluids collected with melting probes is critical for the future exploration of the hundreds of deep subglacial lakes in Antarctica.
In the context of the Solvency II directive, the operation of an internal risk model is one possible way to assess risk and to determine the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary to generate the model output. To be compliant with the directive, validation of the internal risk model is conducted on the basis of this output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite-sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method has more general use for inference on a broad class of law-invariant and coherent risk measures on the basis of a paired sample.
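The core idea of such a test can be sketched as a centered bootstrap on the difference of an empirical risk measure over a paired Monte Carlo sample. The sketch below uses a 99.5% Value-at-Risk as a hypothetical SCR proxy; it illustrates the resampling scheme only, not the paper's actual test statistic or its asymptotic justification:

```python
import random

def var_995(losses):
    """Empirical 99.5% Value-at-Risk (hypothetical SCR proxy; the paper
    covers general law-invariant, coherent risk measures)."""
    s = sorted(losses)
    return s[min(int(0.995 * len(s)), len(s) - 1)]

def bootstrap_change_test(x, y, n_boot=2000, seed=1):
    """Paired-sample bootstrap test for a change in the risk measure.

    x, y: paired Monte Carlo outputs of two model runs.
    Returns the observed difference and a two-sided bootstrap p-value,
    obtained by centering the bootstrap distribution at the null of
    no change.
    """
    rng = random.Random(seed)
    pairs = list(zip(x, y))
    d_obs = var_995(y) - var_995(x)
    extreme = 0
    for _ in range(n_boot):
        # resample pairs with replacement to preserve the pairing
        sample = [pairs[rng.randrange(len(pairs))] for _ in range(len(pairs))]
        bx, by = zip(*sample)
        if abs((var_995(by) - var_995(bx)) - d_obs) >= abs(d_obs):
            extreme += 1
    return d_obs, extreme / n_boot
```

A small p-value indicates a significant change in the modeled capital requirement between the two runs.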
7T MR Safety (2021)
A new method for improved autoclave loading within the restrictive framework of helicopter manufacturing is proposed. It is derived from experimental and numerical studies of the curing process and aims at optimizing tooling positions in the autoclave for fast and homogeneous heat-up. The mold positioning is based on two sets of information: first, the thermal properties of the molds, which can be determined via semi-empirical thermal simulation; second, a previously determined distribution of heat transfer coefficients inside the autoclave. Finally, an experimental proof of concept is performed, showing a cycle time reduction of up to 31% using the proposed methodology.
Magnetic nanoparticle relaxation in biomedical application: focus on simulating nanoparticle heating (2021)
Dual-frequency magnetic excitation of magnetic nanoparticles (MNP) enables enhanced biosensing applications. This was studied from an experimental and theoretical perspective: nonlinear sum-frequency components of MNP exposed to dual-frequency magnetic excitation were measured as a function of the static magnetic offset field. The Langevin model in thermodynamic equilibrium was fitted to the experimental data to derive parameters of the lognormal core size distribution. These parameters were subsequently used as inputs for micromagnetic Monte Carlo (MC) simulations. From the hysteresis loops obtained from the MC simulations, sum-frequency components were numerically demodulated and compared with both the experiment and the Langevin model predictions. From the latter, we derived that approximately 90% of the frequency mixing magnetic response signal is generated by the largest 10% of MNP. We therefore suggest that small particles do not contribute to the frequency mixing signal, which is supported by the MC simulation results. Both theoretical approaches describe the experimental signal shapes well, but with notable differences between experiment and micromagnetic simulations. These deviations could result from Brownian relaxation which, although experimentally inhibited, is included in the MC simulations; from (as yet unconsidered) cluster effects of MNP; or from inaccurately derived inputs for the MC simulations, because the largest particles dominate the experimental signal but concurrently do not fulfill the precondition of thermodynamic equilibrium required by Langevin theory.
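The dominance of the largest cores can be illustrated with the equilibrium Langevin model alone: weighting a lognormal size distribution by particle moment times L(mB/kT) already skews the response strongly toward the biggest particles, and the nonlinear sum-frequency components weight them more strongly still. All parameter values below are illustrative assumptions, not the fitted distribution parameters from the study:

```python
import math
import random

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x."""
    if abs(x) < 1e-6:                  # series limit avoids the 1/x blow-up
        return x / 3
    return 1 / math.tanh(x) - 1 / x

def signal_share_of_largest(median_nm=20.0, sigma=0.3, b_mt=1.0,
                            n=20000, top_fraction=0.1, seed=7):
    """Fraction of the equilibrium response contributed by the largest
    `top_fraction` of cores in a lognormal core-size distribution."""
    rng = random.Random(seed)
    m_s = 4.8e5                        # magnetite saturation magnetization, A/m
    kT = 1.380649e-23 * 300.0          # thermal energy at 300 K, J
    b = b_mt * 1e-3                    # excitation amplitude, T
    diameters = sorted(
        math.exp(rng.gauss(math.log(median_nm), sigma)) for _ in range(n)
    )
    def response(d_nm):
        v = math.pi / 6 * (d_nm * 1e-9) ** 3   # core volume, m^3
        m = m_s * v                            # particle moment, A m^2
        return m * langevin(m * b / kT)        # equilibrium response
    total = sum(response(d) for d in diameters)
    top = sum(response(d) for d in diameters[int(n * (1 - top_fraction)):])
    return top / total
```

Even this equilibrium weighting already concentrates over half of the signal in the top decile of core sizes; the frequency mixing components sharpen that concentration further, toward the ~90% derived above.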
This book provides a compact introduction to the bootstrap method. In addition to classical results on point estimation and test theory, multivariate linear regression models and generalized linear models are covered in detail. Special attention is given to the use of bootstrap procedures to perform goodness-of-fit tests to validate model or distributional assumptions. In some cases, new methods are presented here for the first time.
The text is motivated by practical examples, and the implementations of the corresponding algorithms are always given directly in R in a comprehensible form. Overall, R is given great importance throughout. Each chapter includes a section of exercises and, for the more mathematically inclined readers, concludes with rigorous proofs. The intended audience is graduate students who already have prior knowledge of probability theory and mathematical statistics.
The existence of several mobile operating systems, such as Android and iOS, is a challenge for developers because the individual platforms are not compatible with each other and require separate app developments. For this reason, cross-platform approaches have become popular, but they fall short of cloning the native behavior of the different operating systems. Among the many cross-platform approaches, the progressive web app (PWA) approach is perceived as promising but needs further investigation. Therefore, the paper at hand investigates whether PWAs are a suitable alternative to native apps by developing a PWA clone of an existing app. Two surveys are conducted in which potential users test and evaluate the PWA prototype with regard to its usability. The survey results indicate that PWAs have great potential but cannot be treated as a general alternative to native apps. To guide developers on when and how to use PWAs, four design guidelines for the development of PWA-based apps are derived from the results.