On the basis of independent and identically distributed bivariate random vectors, whose components are a categorical and a continuous variable, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular-array setting. A natural application is hypothesis testing; an independence test and a two-sample test are investigated in detail. The fairly general setting yields limit results under local alternatives and for bootstrap samples. Simulation studies compare the tests with existing tests from the literature, and the empirical results confirm the theoretical findings.
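The testing idea can be illustrated with a much simpler stand-in: the sketch below (not the authors' statistic or resampling scheme) runs a permutation test for independence between a categorical and a continuous component, using a Kolmogorov-Smirnov-type distance between the per-category and pooled empirical distribution functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def independence_stat(cats, vals):
    """KS-type statistic: largest gap between any per-category ECDF of the
    continuous component and the pooled ECDF, evaluated on the pooled sample."""
    grid = np.sort(vals)
    pooled = np.searchsorted(grid, grid, side="right") / len(vals)
    stat = 0.0
    for c in np.unique(cats):
        sub = np.sort(vals[cats == c])
        ecdf = np.searchsorted(sub, grid, side="right") / len(sub)
        stat = max(stat, np.abs(ecdf - pooled).max())
    return stat

def permutation_pvalue(cats, vals, n_perm=500):
    """Approximate p-value by shuffling the category labels under H0."""
    obs = independence_stat(cats, vals)
    count = sum(independence_stat(rng.permutation(cats), vals) >= obs
                for _ in range(n_perm))
    return (1 + count) / (1 + n_perm)

# under independence the p-value should be large ...
cats = rng.integers(0, 3, size=200)
vals = rng.normal(size=200)
p_indep = permutation_pvalue(cats, vals)

# ... and small when the category shifts the continuous component
vals_dep = vals + cats
p_dep = permutation_pvalue(cats, vals_dep)
```

The permutation scheme here is only one generic way to calibrate such a statistic; the paper's asymptotics cover bootstrap samples as well.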
On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions arising from the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of rain days and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized by the fact that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Substituting the empirical counterpart of the transform into this equation yields an expression; the weighted integral of its square is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite-sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed.
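The parametric-bootstrap step can be sketched as follows. The goodness-of-fit statistic below is a toy moment-based one, not the authors' transform-based statistic, and all parameter values are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def rcompound(n, lam, rate):
    """Draw n pairs (S, N): N ~ Poisson(lam), S = sum of N Exp(rate) amounts."""
    N = rng.poisson(lam, size=n)
    S = np.array([rng.exponential(1.0 / rate, size=k).sum() for k in N])
    return S, N

def toy_stat(S, N):
    """Toy GOF statistic: gap between the sample variance of S and the
    model-implied variance lam * 2 / rate^2 under the fitted parameters."""
    rate_hat = N.sum() / S.sum()
    return abs(S.var() - 2.0 * N.mean() / rate_hat**2)

def bootstrap_critical_value(S, N, level=0.05, B=200):
    """Parametric bootstrap under H0: refit, resample, take the upper quantile."""
    lam_hat, rate_hat = N.mean(), N.sum() / S.sum()
    stats = [toy_stat(*rcompound(len(S), lam_hat, rate_hat)) for _ in range(B)]
    return np.quantile(stats, 1.0 - level)

S, N = rcompound(300, lam=3.0, rate=0.5)
crit = bootstrap_critical_value(S, N)
reject = toy_stat(S, N) > crit
```

The structure (fit under H0, resample from the fitted model, compare the observed statistic with a bootstrap quantile) is the same as in the paper; only the statistic differs.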
FEM shakedown analysis of structures under random strength with chance constrained programming
(2022)
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. Direct methods aim to determine the ultimate load-carrying capacity of structures beyond the elastic range. In practical problems, they lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, shakedown analysis can be formulated as a stochastic programming problem. In this paper, chance constrained programming is presented, an effective stochastic-programming technique for solving shakedown analysis problems with random strength. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed random variable.
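For a normally distributed strength, the chance constraint has a well-known deterministic equivalent, which is the key reduction behind chance constrained programming; for a lognormal strength the same reduction applies on the log scale. The sketch below uses made-up stresses and strength parameters, not values from the paper.

```python
from statistics import NormalDist

def deterministic_equivalent_limit(mu_R, sigma_R, reliability):
    """Chance constraint P(stress <= R) >= reliability, with strength
    R ~ N(mu_R, sigma_R), reduces to stress <= mu_R - z * sigma_R,
    where z is the standard-normal quantile of `reliability`."""
    z = NormalDist().inv_cdf(reliability)
    return mu_R - z * sigma_R

# toy checking points: elastic stresses under a unit load (made-up values, MPa)
unit_stress = [120.0, 95.0, 150.0]
mu_R, sigma_R = 240.0, 12.0        # random yield strength (made-up values, MPa)

# shakedown-style load multiplier: largest alpha with alpha*s <= reduced strength
R_eff = deterministic_equivalent_limit(mu_R, sigma_R, 0.999)
alpha = min(R_eff / s for s in unit_stress)
```

In a real shakedown analysis the constraint set is far larger and the optimization nonlinear; the point here is only how randomness in the strength tightens each constraint.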
REM sleep without atonia (RSWA) is a key feature for the diagnosis of rapid eye movement (REM) sleep behaviour disorder (RBD). We introduce RBDtector, a novel open-source software tool that scores RSWA according to the established SINBAR visual scoring criteria. We assessed muscle activity of the mentalis, flexor digitorum superficialis (FDS), and anterior tibialis (AT) muscles. RSWA was scored manually as tonic, phasic, and any activity by human scorers as well as by RBDtector in 20 subjects. Subsequently, 174 subjects (72 without RBD and 102 with RBD) were analysed with RBDtector to demonstrate the algorithm's applicability. We additionally compared RBDtector estimates to a previously published dataset. RBDtector showed robust conformity with the human scorings. The highest congruency was achieved for phasic and any activity of the FDS. Combining mentalis any and FDS any, RBDtector identified RBD subjects with 100% specificity and 96% sensitivity at a cut-off of 20.6%. Comparable performance was obtained without manual artefact removal. RBD subjects also showed muscle bouts of higher amplitude and longer duration. RBDtector provides estimates of tonic, phasic, and any activity comparable to human scorings. RBDtector, which is freely available, can help identify RBD subjects and provides reliable RSWA metrics.
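The final cut-off step is simple to sketch. This is not RBDtector's code; in particular, how the "mentalis any" and "FDS any" percentages are combined is an assumption here (the union over jointly scored mini-epochs is one plausible reading).

```python
def rswa_percentage(flags):
    """Share (%) of scored mini-epochs flagged as containing 'any' EMG activity."""
    return 100.0 * sum(flags) / len(flags)

def classify_rbd(mentalis_any, fds_any, cutoff=20.6):
    """Flag a recording as RBD if the combined mentalis-any + FDS-any index
    exceeds the published 20.6% cut-off (the combination rule is an assumption)."""
    combined = rswa_percentage([m or f for m, f in zip(mentalis_any, fds_any)])
    return combined > cutoff, combined

# toy recording: 100 mini-epochs with activity concentrated at the start
mentalis = [True] * 10 + [False] * 90
fds = [False] * 5 + [True] * 20 + [False] * 75
is_rbd, index = classify_rbd(mentalis, fds)
```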
Purpose
In the determination of measurement uncertainty, the GUM procedure requires building a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling, as well as of quantifying the measurement uncertainties, depends on the number of influencing quantities considered, the aim of this study is to identify the relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach
In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings
Based on a success metric, the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model.
Originality/value
For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
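As a hedged illustration of the task, the sketch below applies one generic FS method, a univariate correlation filter, which may or may not be among the nine methods tested; the synthetic dataset mimics the study's setup with an irrelevant feature and a redundant one.

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic "measurement model": y depends on x0 and x1 only,
# x2 is pure noise, x3 nearly duplicates x0 (a redundant feature)
n = 500
X = rng.normal(size=(n, 3))
X = np.column_stack([X, X[:, 0] + 0.01 * rng.normal(size=n)])
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=n)

def correlation_filter(X, y, threshold=0.2):
    """Rank features by |Pearson r| with the target; keep those above threshold."""
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return np.flatnonzero(np.abs(r) >= threshold)

selected = correlation_filter(X, y)
```

Note that a univariate filter cannot remove the redundant copy x3, since it is just as correlated with y as x0; detecting redundancy is exactly the kind of property the study's test datasets vary.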
The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors such as iron and steel. To prevent the relocation of such industries outside the EU as environmental regulations tighten, two options have been proposed: establishing a climate club jointly with other large emitters, or unilaterally implementing an international cross-border carbon tax mechanism. This article focuses on the latter option, taking the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes employed in Europe against more polluting competition from abroad. Using a floor price model, we assess the competitiveness of different steel production routes in selected countries, and we evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model, which enables us to assess the impact of rising steel production costs on commodities that use steel as an intermediate. Our results raise concerns that a cross-border tax mechanism will not suffice to make hydrogen-based steel production in Europe competitive, because its cost tends to remain higher than the cost of steel production in, e.g., China. Steel is a classic example of a good used mainly as an intermediate for other products. A cross-border tax mechanism for steel will therefore increase the price of EU-produced products that require steel as an input, which can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports.
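The cost pass-through captured by an input-output price model can be shown with a two-sector toy economy; every coefficient below is invented for the example, not taken from the study.

```python
import numpy as np

# two-sector toy economy; column j holds the inputs per unit of sector j's output
A = np.array([[0.05, 0.20],    # steel used by (steel, machinery)
              [0.02, 0.10]])   # machinery used by (steel, machinery)
v = np.array([0.60, 0.55])     # value added / primary cost per unit of output

def leontief_prices(A, v):
    """Leontief price model p' = v' (I - A)^{-1}: unit prices that cover all
    intermediate-input costs plus value added, i.e. p solves (I - A^T) p = v."""
    return np.linalg.solve(np.eye(len(v)) - A.T, v)

p0 = leontief_prices(A, v)

# a carbon border tax raising steel's primary cost propagates to steel users
v_tax = v + np.array([0.10, 0.0])
p1 = leontief_prices(A, v_tax)
```

The model makes the article's point directly: taxing steel raises not only the steel price but also, via the input coefficients, the price of every sector that uses steel as an intermediate, though by a smaller amount.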
Heparin is a natural polysaccharide that plays an essential role in many biological processes. Alterations in its building blocks can modify the biological roles of commercial heparin products, owing to significant changes in the conformation of the polymer chain. The structural variability of heparin makes quality control difficult with many analytical methods, including infrared (IR) spectroscopy. In this paper, molecular modelling of heparin disaccharide subunits was performed using quantum chemistry. The structural and spectral parameters of these disaccharides were calculated at the RHF/6-311G level. In addition, over-sulphated chondroitin sulphate disaccharide was studied as one of the most widespread contaminants of heparin. The calculated IR spectra were analyzed with respect to specific structural parameters. The IR spectroscopic fingerprint was found to be sensitive to the substitution pattern of the disaccharide subunits. Vibrational assignments of the calculated spectra were correlated with experimental IR spectral bands of native heparin. Chemometrics was used to perform multivariate analysis of the simulated spectral data.
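A generic chemometric step of this kind can be sketched as PCA separating two groups of simulated spectra that differ in one band position; the band positions, widths, and noise level below are invented and stand in for the calculated disaccharide spectra.

```python
import numpy as np

rng = np.random.default_rng(7)

def gaussian_band(x, centre, width, height):
    return height * np.exp(-0.5 * ((x - centre) / width) ** 2)

x = np.linspace(800, 1800, 400)          # toy wavenumber axis, cm^-1

def spectrum(shift):
    """Toy IR spectrum: two bands, the second shifted by the 'substitution'."""
    return (gaussian_band(x, 1050, 25, 1.0)
            + gaussian_band(x, 1230 + shift, 20, 0.8)
            + 0.01 * rng.normal(size=x.size))

group_a = np.array([spectrum(0.0) for _ in range(10)])
group_b = np.array([spectrum(40.0) for _ in range(10)])
X = np.vstack([group_a, group_b])

# PCA via SVD of the centred data: PC1 should separate the two patterns
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s
```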
Lignin is a promising renewable biopolymer being investigated worldwide as an environmentally benign substitute for fossil-based aromatic compounds, e.g. for use as an excipient with antioxidant and antimicrobial properties in drug delivery, or even as an active compound. For its successful implementation into process streams, a quick, easy, and reliable method is needed for determining its molecular weight. Here we present a method that uses 1H spectra from benchtop as well as conventional NMR systems, in combination with multivariate data analysis, to determine lignin's molecular weight (Mw and Mn) and polydispersity index (PDI). A set of 36 organosolv lignin samples (from Miscanthus x giganteus, Paulownia tomentosa and Silphium perfoliatum) was used for calibration and cross-validation, and 17 samples were used as an external validation set. Validation errors between 5.6% and 12.9% were achieved for all parameters on all NMR devices (43, 60, 500 and 600 MHz). Surprisingly, no significant difference in the performance of the benchtop and high-field devices was found. This facilitates the application of this method for determining lignin's molecular weight in an industrial environment because of the low maintenance expenditure, small footprint, ruggedness, and low cost of permanent-magnet benchtop NMR systems.
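The regression step of such a method is typically a PLS model. The minimal NIPALS PLS1 below is only a sketch of that step: synthetic "spectra" driven by two latent factors stand in for the real 1H data, so this illustrates the algorithm, not the published calibration.

```python
import numpy as np

rng = np.random.default_rng(3)

def pls1_fit(X, y, ncomp):
    """Minimal NIPALS PLS1; returns coefficients B plus the centring terms."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)
        t = Xk @ w
        tt = t @ t
        p = Xk.T @ t / tt
        q = (yk @ t) / tt
        Xk = Xk - np.outer(t, p)     # deflate X
        yk = yk - q * t              # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))   # coefficients in X space
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# synthetic calibration set: 50-channel "spectra" from two latent factors
T = rng.normal(size=(40, 2))               # latent scores ("composition")
P_load = rng.normal(size=(2, 50))          # spectral loadings
X = T @ P_load
y = T @ np.array([1.0, -2.0])              # property (e.g. a scaled Mw)
B, xm, ym = pls1_fit(X, y, ncomp=2)
rmse = np.sqrt(np.mean((pls1_predict(X, B, xm, ym) - y) ** 2))
```

Because the toy data have exactly two latent factors and no noise, two PLS components reproduce y essentially exactly; real NMR calibrations of course carry the validation errors reported above.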
An NMR standardization approach that uses the 2H integral of the deuterated solvent for quantitative multinuclear analysis of pharmaceuticals is described. As a proof of principle, the existing NMR procedure for the analysis of heparin products according to the US Pharmacopeia monograph is extended to the determination of Na+ and Cl- content in this matrix. Quantification is based on the ratio of a 23Na (35Cl) NMR integral to the 2H NMR signal of the deuterated solvent, D2O, acquired with the specific spectrometer hardware. As an alternative, the possibility of 133Cs standardization by addition of a Cs2CO3 stock solution is shown. Validation characteristics (linearity, repeatability, sensitivity) are evaluated. A holistic NMR profiling of heparin products can now also be used for the quantitative determination of inorganic compounds in a single analytical run on a single sample. In general, the new standardization methodology provides an appealing alternative for the NMR screening of inorganic and organic components in pharmaceutical products.
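Once a response factor has been fixed against a standard, the ratio-based quantification reduces to simple proportionality; all numbers below are invented purely to show the arithmetic, not taken from the monograph.

```python
def content_from_ratio(I_x, I_2h, k_cal):
    """Analyte content from the X-nucleus / 2H-solvent integral ratio.
    k_cal is a hypothetical calibration constant lumping receptivity and
    acquisition differences for one fixed spectrometer configuration."""
    return k_cal * I_x / I_2h

# calibrate once with a standard of known content (here: 12 mg/g at ratio 2.0),
# then quantify a sample measured under the same configuration
k_cal = 12.0 / (3.0 / 1.5)
na_content = content_from_ratio(2.5, 1.5, k_cal)
```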
Although several successful applications of benchtop nuclear magnetic resonance (NMR) spectroscopy in quantitative mixture analysis exist, the possibility of calibration transfer remains mostly unexplored, especially between high- and low-field NMR. This study investigates for the first time the calibration transfer of partial least squares regressions [weight average molecular weight (Mw) of lignin] between high-field (600 MHz) NMR and benchtop NMR devices (43 and 60 MHz). For the transfer, piecewise direct standardization, calibration transfer based on canonical correlation analysis, and transfer via the extreme learning machine auto-encoder method are employed. Despite the immense resolution difference between high-field and low-field NMR instruments, the results demonstrate that the calibration transfer from high- to low-field is feasible in the case of a physical property, namely, the molecular weight, achieving validation errors close to the original calibration (down to only 1.2 times higher root mean square errors). These results introduce new perspectives for applications of benchtop NMR, in which existing calibrations from expensive high-field instruments can be transferred to cheaper benchtop instruments to economize.
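Piecewise direct standardization itself is straightforward to sketch: each channel of the target instrument is regressed on a small window of channels from the source instrument, assembling a banded transfer matrix. In the toy below, instrument B is an exact local mixture of instrument A's channels, so the transfer is recovered almost perfectly; real high-field-to-benchtop spectra would not behave this cleanly.

```python
import numpy as np

rng = np.random.default_rng(5)

def pds_fit(S_from, S_to, window=2):
    """Piecewise direct standardization: regress each channel j of S_to on
    channels j-window..j+window of S_from, building a banded transfer matrix F
    with S_from @ F ~= S_to."""
    n, m = S_from.shape
    F = np.zeros((m, m))
    for j in range(m):
        lo, hi = max(0, j - window), min(m, j + window + 1)
        b, *_ = np.linalg.lstsq(S_from[:, lo:hi], S_to[:, j], rcond=None)
        F[lo:hi, j] = b
    return F

# toy instruments: B's channels are a fixed local mixture of A's channels
kernel = np.array([0.25, 0.5, 0.25])
A_cal = rng.normal(size=(30, 80))
B_cal = np.apply_along_axis(lambda s: np.convolve(s, kernel, mode="same"),
                            1, A_cal)
F = pds_fit(A_cal, B_cal)

# the fitted transfer generalizes to new spectra of the same relationship
A_new = rng.normal(size=(5, 80))
B_new = np.apply_along_axis(lambda s: np.convolve(s, kernel, mode="same"),
                            1, A_new)
err = np.abs(A_new @ F - B_new).max()
```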