Refine
Year of publication
Document Type
- Article (3226)
- Conference Proceeding (1146)
- Part of a Book (184)
- Book (144)
- Doctoral Thesis (30)
- Patent (25)
- Other (9)
- Report (9)
- Working Paper (6)
- Lecture (5)
- Poster (4)
- Preprint (4)
- Talk (4)
- Master's Thesis (2)
- Bachelor Thesis (1)
- Contribution to a Periodical (1)
- Habilitation (1)
Language
- English (4801)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (12)
- Einspielen <Werkstoff> (10)
- CAD (8)
- civil engineering (8)
- Bauingenieurwesen (7)
- Blitzschutz (6)
- FEM (6)
- Gamification (6)
- Limit analysis (6)
- Shakedown analysis (6)
- avalanche (6)
- shakedown analysis (6)
- Clusterion (5)
- Earthquake (5)
- Enterprise Architecture (5)
- MINLP (5)
- solar sail (5)
- Air purification (4)
- Diversity Management (4)
Institute
- Fachbereich Medizintechnik und Technomathematik (1668)
- Fachbereich Elektrotechnik und Informationstechnik (693)
- IfB - Institut für Bioengineering (620)
- Fachbereich Energietechnik (579)
- INB - Institut für Nano- und Biotechnologien (555)
- Fachbereich Chemie und Biotechnologie (534)
- Fachbereich Luft- und Raumfahrttechnik (477)
- Fachbereich Maschinenbau und Mechatronik (278)
- Fachbereich Wirtschaftswissenschaften (207)
- Solar-Institut Jülich (164)
- Fachbereich Bauingenieurwesen (153)
- ECSM European Center for Sustainable Mobility (79)
- MASKOR Institut für Mobile Autonome Systeme und Kognitive Robotik (67)
- Nowum-Energy (28)
- Fachbereich Gestaltung (25)
- Institut fuer Angewandte Polymerchemie (23)
- Sonstiges (21)
- Fachbereich Architektur (20)
- Freshman Institute (18)
- Kommission für Forschung und Entwicklung (18)
Virgin passive colon biomechanics and a literature review of active contraction constitutive models
(2022)
The objective of this paper is to present our findings on the biomechanical aspects of the virgin passive anisotropic hyperelasticity of the porcine colon based on equibiaxial tensile experiments. Firstly, the characterization of the intestine tissues is discussed for a nearly incompressible hyperelastic fiber-reinforced Holzapfel–Gasser–Ogden constitutive model in virgin passive loading conditions. The stability of the evaluated material parameters is checked for the polyconvexity of the adopted strain energy function using positive eigenvalue constraints of the Hessian matrix with MATLAB. The constitutive material description of the intestine with two collagen fibers in the submucosal and muscular layer each has been implemented in the FORTRAN platform of the commercial finite element software LS-DYNA, and two equibiaxial tensile simulations are presented to validate the results with the optical strain images obtained from the experiments. Furthermore, this paper also reviews the existing models of the active smooth muscle cells, but these models have not been computationally studied here. The review part shows that the constitutive models originally developed for the active contraction of skeletal muscle based on Hill's three-element model, Murphy's four-state cross-bridge chemical kinetic model and Huxley's sliding-filament hypothesis, which are mainly used for arteries, are appropriate for the numerical contraction analysis of the large intestine.
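The polyconvexity check mentioned above, testing that all eigenvalues of the Hessian of the strain energy function are positive, was carried out in MATLAB; the same numerical test can be sketched in Python with NumPy. The matrices below are hypothetical stand-ins for illustration, not the paper's actual Hessians:

```python
import numpy as np

def is_positive_definite(hessian, tol=1e-10):
    """Check local convexity of a strain energy function by testing
    whether all eigenvalues of its (symmetric) Hessian are positive."""
    eigenvalues = np.linalg.eigvalsh(hessian)  # eigvalsh: symmetric matrices
    return bool(np.all(eigenvalues > tol))

# Illustrative 2x2 examples: a convex case and a saddle-point case
H_convex = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 1 and 3
H_saddle = np.array([[1.0, 3.0], [3.0, 1.0]])  # eigenvalues -2 and 4
print(is_positive_definite(H_convex))  # True
print(is_positive_definite(H_saddle))  # False
```

In practice the Hessian would be evaluated at many deformation states over the admissible parameter range, and the constraint imposed during parameter fitting.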
Unsteady shallow meandering flows in rectangular reservoirs: a modal analysis of URANS modelling
(2022)
Shallow flows are common in natural and human-made environments. Even for simple rectangular shallow reservoirs, recent laboratory experiments show that the developing flow fields are particularly complex, involving large-scale turbulent structures. For specific combinations of reservoir size and hydraulic conditions, a meandering jet can be observed. While some aspects of this pseudo-2D flow pattern can be reproduced using a 2D numerical model, new 3D simulations, based on the unsteady Reynolds-Averaged Navier-Stokes equations, show consistent advantages as presented herein. A Proper Orthogonal Decomposition was used to characterize the four most energetic modes of the meandering jet at the free surface level, allowing comparison against experimental data and 2D (depth-averaged) numerical results. Three different isotropic eddy viscosity models (RNG k-ε, k-ε, k-ω) were tested. The 3D models accurately predicted the frequency of the modes, whereas the amplitudes of the modes and associated energy were damped for the friction-dominant cases and augmented for non-frictional ones. The performance of the three turbulence models remained essentially similar, with slightly better predictions by RNG k-ε model in the case with the highest Reynolds number. Finally, the Q-criterion was used to identify vortices and study their dynamics, assisting on the identification of the differences between: i) the three-dimensional phenomenon (here reproduced), ii) its two-dimensional footprint in the free surface (experimental observations) and iii) the depth-averaged case (represented by 2D models).
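The Proper Orthogonal Decomposition used above is, in its snapshot form, a mean-subtracted singular value decomposition of the snapshot matrix. A minimal NumPy sketch, using random data as a stand-in for the actual free-surface flow fields:

```python
import numpy as np

# Hypothetical snapshot matrix: each column is one flow-field snapshot
# (random data standing in for free-surface velocity measurements).
rng = np.random.default_rng(0)
n_points, n_snapshots = 200, 50
snapshots = rng.standard_normal((n_points, n_snapshots))

# Subtract the temporal mean, then take the thin SVD: the columns of U
# are the spatial POD modes, and the singular values rank them by
# energy content.
fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)

energy = s**2 / np.sum(s**2)        # relative energy per mode
four_most_energetic = U[:, :4]      # the four dominant spatial modes
print(energy[:4])
```

The rows of `Vt` carry the temporal coefficients of each mode, from which the mode frequencies compared in the study can be extracted (e.g. via an FFT).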
This study reviews the practice of brake tests in freight railways, which is time-consuming and not suitable to detect certain failure types. Public incident reports are analysed to derive a reasonable brake test hardware and communication architecture, which aims to provide automatic brake tests at lower cost than current solutions. The proposed solution relies exclusively on brake pipe and brake cylinder pressure sensors, a brake release position switch as well as radio communication via standard protocols. The approach is embedded in the Wagon 4.0 concept, which is a holistic approach to a smart freight wagon. The reduction of manual processes provides a strong incentive due to high savings in manual labour and increased productivity.
Sleep spindles are neurophysiological phenomena that appear to be linked to memory formation and other functions of the central nervous system, and that can be observed in electroencephalographic recordings (EEG) during sleep. Manually identified spindle annotations in EEG recordings suffer from substantial intra- and inter-rater variability, even if raters have been highly trained, which reduces the reliability of spindle measures as a research and diagnostic tool. The Massive Online Data Annotation (MODA) project has recently addressed this problem by forming a consensus from multiple such rating experts, thus providing a corpus of spindle annotations of enhanced quality. Based on this dataset, we present a U-Net-type deep neural network model to automatically detect sleep spindles. Our model's performance exceeds that of the state-of-the-art detector and of most experts in the MODA dataset. We observed improved detection accuracy in subjects of all ages, including older individuals whose spindles are particularly challenging to detect reliably. Our results underline the potential of automated methods to perform repetitive, cumbersome tasks with super-human performance.
In this article we describe an Internet-of-Things sensing device with a wireless interface which is powered by the often-overlooked harvesting method of the Wiegand effect. The sensor can determine position, temperature or other resistively measurable quantities and can transmit the data via an ultra-low power ultra-wideband (UWB) data transmitter. With this approach we can acquire, process, and wirelessly transmit data in a pulsed, energy-self-sufficient operation. A proof-of-concept system was built to prove the feasibility of the approach. The energy consumption of the system is analyzed, traced back in detail to the individual components, compared with the generated energy, and used to identify further optimization options. Based on the proof of concept, an application demonstrator was developed. Finally, we point out possible use cases.
Virtual Reality (VR) offers novel possibilities for remote training regardless of the availability of the actual equipment, the presence of specialists, and the training locations. Research shows that training environments that adapt to users' preferences and performance can promote more effective learning. However, the observed results can hardly be traced back to specific adaptive measures but rather to the whole new training approach. This study analyzes the effects of a combined point and leveling VR-based gamification system on assembly training targeting specific training outcomes and users' motivations. The Gamified-VR-Group with 26 subjects received the gamified training, and the Non-Gamified-VR-Group with 27 subjects received the alternative without gamified elements. Both groups conducted their VR training at least three times before assembling the actual structure. The study found that a level system that gradually increases the difficulty and error probability in VR can significantly lower real-world error rates, self-corrections, and support usage. According to our study, a high error occurrence at the highest training level reduced the Gamified-VR-Group's feeling of competence compared to the Non-Gamified-VR-Group, but at the same time also led to lower error probabilities in real life. It is concluded that a level system with a variable task difficulty should be combined with carefully balanced positive and negative feedback messages. This way, better learning results and an improved self-evaluation can be achieved while not causing significant impacts on the participants' feeling of competence.
A Gamified Information System (GIS) implements game concepts and elements, such as affordances and game design principles, to motivate people. Based on the idea of developing a GIS to increase the motivation of software developers to perform software quality tasks, the research work at hand aims at investigating relevant requirements from that target group. Therefore, 14 interviews with software development experts are conducted and analyzed. According to the results, software developers prefer the affordances of points and narrative storytelling in a multiplayer, round-based setting. Furthermore, six design principles for the development of a GIS are derived.
Concentrating solar power
(2022)
The focus of this chapter is the production of power and the use of the heat produced from concentrated solar thermal power (CSP) systems.
The chapter starts with the general theoretical principles of concentrating systems, including the description of the concentration ratio and the energy and mass balance. The main part covers the power conversion systems, addressing both solar-only operation and the increase in operational hours.
Solar-only operation includes the use of steam turbines, gas turbines, organic Rankine cycles and solar dishes. The operational hours can be increased with hybridization and with storage.
Another important topic is cogeneration, where solar cooling, desalination and heat usage are described.
Many examples of commercial CSP power plants as well as research facilities, both from the past and currently installed and in operation, are described in detail.
The chapter closes with economic and environmental aspects and with the future potential of the development of CSP around the world.
Solar thermal concentrated power is an emerging technology that provides clean electricity for the growing energy market. Concentrated solar thermal power plant systems comprise the parabolic trough, the Fresnel collector, the solar dish, and the central receiver system.
For high-concentration solar collector systems, optical and thermal analysis is essential. A number of measurement techniques and systems exist for the optical and thermal characterization of the efficiency of solar thermal concentrated systems.
For each system, the structure, components, and specific characteristics are described. The chapter additionally presents an outline for the calculation of system performance as well as operation and maintenance topics. One main focus is on the models of components and their construction details, as well as the different types on the market. In the later part of this article, different criteria for the choice of technology are analyzed in detail.
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion that predicts failure modes which are unexpected in the common understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion.
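For orientation, the Mohr–Coulomb criterion that the enriched extension strain criterion reproduces can be written as a simple principal-stress check. The sketch below shows the standard textbook form (compression positive), not the enriched criterion itself, whose shear-weighting construction is detailed in the paper; cohesion and friction angle values are illustrative:

```python
import math

def mohr_coulomb_exceeded(sigma1, sigma3, cohesion, friction_angle_deg):
    """Return True if the principal stress state (compression positive,
    sigma1 >= sigma3) violates the Mohr-Coulomb criterion
        sigma1 <= N_phi * sigma3 + 2 * c * sqrt(N_phi),
    with N_phi = (1 + sin(phi)) / (1 - sin(phi))."""
    phi = math.radians(friction_angle_deg)
    n_phi = (1 + math.sin(phi)) / (1 - math.sin(phi))
    strength = n_phi * sigma3 + 2 * cohesion * math.sqrt(n_phi)
    return sigma1 > strength

# Illustrative values: cohesion 10 MPa, friction angle 30 degrees.
# Unconfined (sigma3 = 0) compressive strength is then about 34.6 MPa.
print(mohr_coulomb_exceeded(40.0, 0.0, 10.0, 30.0))  # True
print(mohr_coulomb_exceeded(30.0, 0.0, 10.0, 30.0))  # False
```

The paper's contribution is to recover exactly this failure surface from an extension strain formulation, which additionally predicts the fracture orientation.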
Purpose
In the determination of the measurement uncertainty, the GUM procedure requires the building of a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling, as well as of quantifying the measurement uncertainties, depends on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach
In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings
Based on a success metric, the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model.
Originality/value
For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
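A filter-type feature selection method of the kind tested in the study can be sketched on a synthetic dataset with the same properties the experiments varied (irrelevant and redundant features). The data, correlation score, and threshold below are illustrative assumptions, not one of the 9 methods or 16 datasets from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Synthetic measurement model: y depends on x0 and x1; x2 is an
# irrelevant influence quantity and x3 is redundant (a near-copy of x0).
x0 = rng.standard_normal(n)
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)               # irrelevant
x3 = x0 + 0.01 * rng.standard_normal(n)   # redundant with x0
y = 2.0 * x0 - 1.5 * x1 + 0.1 * rng.standard_normal(n)

X = np.column_stack([x0, x1, x2, x3])
# Filter method: rank features by absolute Pearson correlation with y
scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                   for j in range(X.shape[1])])
relevant = scores > 0.3   # illustrative threshold
print(scores.round(2), relevant)
```

Note that a univariate filter like this flags the redundant feature x3 as relevant; detecting redundancy requires multivariate methods, which is precisely why redundant features were among the dataset properties varied in the study's design of experiments.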
REM sleep without atonia (RSWA) is a key feature for the diagnosis of rapid eye movement (REM) sleep behaviour disorder (RBD). We introduce RBDtector, a novel open-source software to score RSWA according to established SINBAR visual scoring criteria. We assessed muscle activity of the mentalis, flexor digitorum superficialis (FDS), and anterior tibialis (AT) muscles. RSWA was scored manually as tonic, phasic, and any activity by human scorers as well as using RBDtector in 20 subjects. Subsequently, 174 subjects (72 without RBD and 102 with RBD) were analysed with RBDtector to show the algorithm’s applicability. We additionally compared RBDtector estimates to a previously published dataset. RBDtector showed robust conformity with human scorings. The highest congruency was achieved for phasic and any activity of the FDS. Combining mentalis any and FDS any, RBDtector identified RBD subjects with 100% specificity and 96% sensitivity applying a cut-off of 20.6%. Comparable performance was obtained without manual artefact removal. RBD subjects also showed muscle bouts of higher amplitude and longer duration. RBDtector provides estimates of tonic, phasic, and any activity comparable to human scorings. RBDtector, which is freely available, can help identify RBD subjects and provides reliable RSWA metrics.
FEM shakedown analysis of structures under random strength with chance constrained programming
(2022)
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load carrying capacity of structures beyond the elastic range. In practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the shakedown analysis can be formulated as a stochastic programming problem. In this paper, chance constrained programming is presented, an effective method of stochastic programming, to solve shakedown analysis problems under random strength conditions. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed variable.
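For a normally distributed strength, the core step of chance constrained programming, replacing a probabilistic constraint by a deterministic equivalent via the normal quantile, can be sketched as follows; the numerical values are illustrative, not taken from the paper:

```python
from statistics import NormalDist

def deterministic_strength(mean, std, reliability):
    """Deterministic equivalent of the chance constraint
    P(R >= r) >= reliability for a normally distributed strength R:
    the admissible strength value is r = mean - z * std,
    with z = Phi^{-1}(reliability) the standard normal quantile."""
    z = NormalDist().inv_cdf(reliability)
    return mean - z * std

# Example: mean yield strength 235 MPa, std 10 MPa, 95 % reliability
r_design = deterministic_strength(235.0, 10.0, 0.95)
print(round(r_design, 1))  # ≈ 218.6 MPa
```

The shakedown optimization problem can then be solved with this reduced, deterministic strength in place of the random one; the lognormal case works analogously on the logarithm of the strength.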
On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of raindays and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized in the way that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the functional part of this equation by substituting the empirical counterparts of the transform, we obtain an expression; the weighted integral of its square is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by using a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed.
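The compound Poisson exponential model described above is straightforward to simulate, which is also how a parametric bootstrap for the critical values would draw its samples. A minimal NumPy sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_compound_poisson_exponential(lam, scale, size):
    """Draw pairs (S, N): N ~ Poisson(lam) and, given N = n,
    S is the sum of n i.i.d. Exponential(scale) variables
    (S = 0 whenever N = 0)."""
    n = rng.poisson(lam, size)
    s = np.array([rng.exponential(scale, k).sum() for k in n])
    return s, n

# Illustrative parameters: e.g. lam raindays, mean rainfall 'scale' per day
s, n = sample_compound_poisson_exponential(lam=5.0, scale=2.0, size=10_000)
# Sanity check against the model's first moment: E[S] = lam * scale = 10
print(s.mean())
```

A bootstrap test would fit (lam, scale) to the observed data, redraw many samples of the same size from the fitted model, and take the empirical quantile of the statistic over those samples as the critical value.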
The fourth industrial revolution presents a multitude of challenges for industries, one of which is the increased flexibility required of manufacturing lines as a result of increased consumer demand for individualised products. One solution to tackle this challenge is the digital twin, more specifically the standardised model of a digital twin known as the asset administration shell. The standardisation of an industry-wide communications tool is a critical step in enabling inter-company operations. This paper discusses the current state of asset administration shells, the frameworks used to host them and their problems that need to be addressed. To tackle these issues, we propose an event-based server capable of drastically reducing response times between assets and asset administration shells, and a multi-agent system used for the orchestration and deployment of the shells in the field.
The work presented in this report provides scientific support to building renovation policies in the EU by promoting a holistic point of view on the topic. Integrated renovation can be seen as a nexus between European policies on disaster resilience, energy efficiency and circularity in the building sector. An overview of policy measures for the seismic and energy upgrading of buildings across EU Member States identified only a few available measures for combined upgrading. Regulatory framework, financial instruments and digital tools similar to those for energy renovation, together with awareness and training, may promote integrated renovation. A framework for regional prioritisation of building renovation was put forward, considering seismic risk, energy efficiency, and socioeconomic vulnerability independently and in an integrated way. Results indicate that prioritisation of building renovation is a multidimensional problem. Depending on priorities, different integrated indicators should be used to inform policies and accomplish either the highest relative impact or the most widespread impact across different sectors. The framework was further extended to assess the impact of renovation scenarios across the EU with a focus on priority regions. Integrated renovation can provide a risk-proofed, sustainable, and inclusive built environment, presenting an economic benefit in the order of magnitude of the highest benefit among the separate interventions. Furthermore, it presents the unique capability of reducing fatalities and energy consumption at the same time and, depending on the scenario, to a greater extent.
Industrial facilities must be thoroughly designed to withstand seismic actions as they exhibit an increased loss potential due to the possibly wide-ranging damage consequences and the valuable process engineering equipment. Past earthquakes showed the social and political consequences of seismic damage to industrial facilities and sensitized the population and politicians worldwide to the possible hazard emanating from industrial facilities. However, a holistic approach for the seismic design of industrial facilities can presently be found neither in national nor in international standards. The introduction of EN 1998-4 of the new generation of Eurocode 8 will improve the normative situation with specific seismic design rules for silos, tanks and pipelines and secondary process components. The article presents essential aspects of the seismic design of industrial facilities based on the new generation of Eurocode 8 using the example of tank structures and secondary process components. The interaction effects of the process components with the primary structure are illustrated by means of the experimental results of a shaking table test of a three-story moment-resisting steel frame with different process components. Finally, an integrated approach of digital plant models based on building information modelling (BIM) and structural health monitoring (SHM) is presented, which provides not only a reliable decision-making basis for operation, maintenance and repair but also an excellent tool for rapid assessment of seismic damage.
Inference on the basis of high-dimensional and functional data are two topics which are discussed frequently in the current statistical literature. A possibility to include both topics in a single approach is working on a very general space for the underlying observations, such as a separable Hilbert space. We propose a general method for consistent hypothesis testing on the basis of random variables with values in separable Hilbert spaces. We avoid concerns with the curse of dimensionality due to a projection idea. We apply well-known test statistics from nonparametric inference to the projected data and integrate over all projections from a specific set and with respect to suitable probability measures. In contrast to classical methods, which are applicable for real-valued random variables or random vectors of dimensions lower than the sample size, the tests can be applied to random vectors of dimensions larger than the sample size or even to functional and high-dimensional data. In general, resampling procedures such as bootstrap or permutation are suitable to determine critical values. The idea can be extended to the case of incomplete observations. Moreover, we develop an efficient algorithm for implementing the method. Examples are given for testing goodness-of-fit in a one-sample situation in [1] or for testing marginal homogeneity on the basis of a paired sample in [2]. Here, the test statistics in use can be seen as generalizations of the well-known Cramér-von Mises test statistics in the one-sample and two-sample cases. The treatment of other testing problems is possible as well. By using the theory of U-statistics, for instance, asymptotic null distributions of the test statistics are obtained as the sample size tends to infinity. Standard continuity assumptions ensure the asymptotic exactness of the tests under the null hypothesis and that the tests detect any alternative in the limit.
Simulation studies demonstrate size and power of the tests in the finite sample case, confirm the theoretical findings, and are used for the comparison with concurring procedures. A possible application of the general approach is inference for stock market returns, also in high data frequencies. In the field of empirical finance, statistical inference of stock market prices usually takes place on the basis of related log-returns as data. In the classical models for stock prices, i.e., the exponential Lévy model, Black-Scholes model, and Merton model, properties such as independence and stationarity of the increments ensure an independent and identically distributed structure of the data. Specific trends during certain periods of the stock price processes can cause complications in this regard. In fact, our approach can compensate those effects by the treatment of the log-returns as random vectors or even as functional data.
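The projection idea can be sketched as follows: project both samples onto a direction, compute a Cramér-von Mises-type two-sample statistic on the projected (real-valued) data, and calibrate it by permutation. The sketch below uses a single random projection for brevity, whereas the method above integrates over a whole set of projections; all data and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def cvm_statistic(x, y):
    """Cramér-von Mises-type two-sample statistic: mean squared
    difference of the two empirical CDFs at the pooled points."""
    z = np.concatenate([x, y])
    Fx = np.searchsorted(np.sort(x), z, side="right") / len(x)
    Fy = np.searchsorted(np.sort(y), z, side="right") / len(y)
    return np.mean((Fx - Fy) ** 2)

def projection_permutation_test(X, Y, n_perm=500):
    """Project both high-dimensional samples onto one random unit
    direction and calibrate the statistic by permutation."""
    direction = rng.standard_normal(X.shape[1])
    direction /= np.linalg.norm(direction)
    x, y = X @ direction, Y @ direction
    observed = cvm_statistic(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        count += cvm_statistic(perm[:len(x)], perm[len(x):]) >= observed
    return (count + 1) / (n_perm + 1)

# Dimension (50) larger than the sample sizes (30): the regime where
# classical multivariate tests fail but the projected test still applies.
X = rng.standard_normal((30, 50))
Y = rng.standard_normal((30, 50)) + 1.0   # mean-shifted alternative
print(projection_permutation_test(X, Y))
```

A single random projection can miss a given alternative by chance; integrating the statistic over many projections with respect to a suitable measure, as in the proposed method, is what yields consistency against all alternatives.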
Nanoparticles are recognized as highly attractive tunable materials for designing field-effect biosensors with enhanced performance. In this work, we present a theoretical model for electrolyte-insulator-semiconductor capacitors (EISCAPs) decorated with ligand-stabilized charged gold nanoparticles (AuNPs). The charged AuNPs are taken into account as additional, nanometer-sized local gates. The capacitance-voltage (C–V) curves and constant-capacitance (ConCap) signals of the AuNP-decorated EISCAPs have been simulated. The impact of the AuNP coverage on the shift of the C–V curves and the ConCap signals was also studied experimentally on Al–p-Si–SiO₂ EISCAPs decorated with positively charged aminooctanethiol-capped AuNPs. In addition, the surface of the EISCAPs, modified with AuNPs, was characterized by scanning electron microscopy for different immobilization times of the nanoparticles.
Frequency mixing magnetic detection (FMMD) has been explored for its applications in fields of magnetic biosensing, multiplex detection of magnetic nanoparticles (MNP) and the determination of core size distribution of MNP samples. Such applications rely on the application of a static offset magnetic field, which is generated traditionally with an electromagnet. Such a setup requires a current source, as well as passive or active cooling strategies, which directly sets a limitation based on the portability aspect that is desired for point of care (POC) monitoring applications. In this work, a measurement head is introduced that involves the utilization of two ring-shaped permanent magnets to generate a static offset magnetic field. A steel cylinder in the ring bores homogenizes the field. By variation of the distance between the ring magnets and of the thickness of the steel cylinder, the magnitude of the magnetic field at the sample position can be adjusted. Furthermore, the measurement setup is compared to the electromagnet offset module based on measured signals and temperature behavior.