Because of their simple construction process, high energy efficiency, good fire resistance and excellent sound insulation, masonry-infilled reinforced concrete (RC) frame structures are widely used around the world, including in seismically active areas. However, many RC frame structures with masonry infills have been seriously damaged during earthquakes, because traditional infills are generally built in direct contact with the RC frame, which produces undesirable infill/frame interaction. This interaction activates an equivalent diagonal strut in the infill panel as the RC frame deforms and, combined with seismically induced loads acting perpendicular to the panel, often causes total collapse of the masonry infills and heavy damage to the RC frames. This has motivated various approaches for improving the behaviour of masonry infills, among which the isolation (decoupling) of the infill from the frame has been studied intensively over the last decade. In-plane isolation of the infill wall reduces infill activation, but creates the need for additional measures to restrain out-of-plane movements; this can be provided by installing steel anchors, as proposed by several researchers. Within the framework of the European research project INSYSME (Innovative Systems for Earthquake Resistant Masonry Enclosures in Reinforced Concrete Buildings), a system based on the use of elastomers for in-plane decoupling and steel anchors for out-of-plane restraint was tested. This constructive solution was investigated in detail in an experimental campaign in which traditional and decoupled masonry-infilled RC frames with anchors were subjected to separate and combined in-plane and out-of-plane loading. Based on a detailed evaluation and comparison of the test results, the performance and effectiveness of the developed system are demonstrated.
Seismic verification of masonry buildings with realistic models and increased behaviour factors
(2021)
Applying the linear verification concept to masonry buildings means that, already today, stability verifications for buildings with typical floor plans can no longer be achieved in regions of moderate seismicity. This problem will become more acute in Germany with the introduction of continuous probabilistic seismic hazard maps. Because of the resulting increase in seismic actions in many locations, it is necessary to make the existing, previously unconsidered load-bearing reserves available to design practice through transparent verification concepts. This paper presents a concept for the building-specific determination of increased behaviour factors. The behaviour factors are composed of three components, accounting for load redistribution in plan, deformation capacity and energy dissipation, and overstrength. For the computational determination of these three components, a nonlinear verification concept based on pushover analyses is proposed, in which the interaction of walls and floor slabs is described by a degree of fixity. To determine the degrees of fixity, a nonlinear modelling approach is introduced that captures the wall–slab interaction. The application of the concept with increased building-specific behaviour factors is demonstrated on the example of a multi-family house made of calcium silicate units. For this building, the results of the linear verifications with increased behaviour factors lie much closer to the results of nonlinear verifications, so that typical floor plans in seismic regions remain verifiable with traditional linear analysis approaches.
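The three contributions to the building-specific behaviour factor described above can be written compactly as a product. The symbols below are notation introduced here for illustration, not taken from the paper or the standard:

```latex
q \;=\; \underbrace{q_R}_{\substack{\text{load redistribution} \\ \text{in plan}}}
   \cdot \underbrace{q_D}_{\substack{\text{deformation capacity and} \\ \text{energy dissipation}}}
   \cdot \underbrace{q_O}_{\text{overstrength}}
```

Each factor is greater than or equal to one, so accounting for all three reserves yields a behaviour factor larger than the traditional code value and correspondingly reduced linear design forces.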
Monte Carlo Tree Search (MCTS) is a search technique that emerged in the last decade as a major breakthrough for Artificial Intelligence applications in board and video games. In 2016, AlphaGo, an MCTS-based software agent, outperformed the human world champion of the board game Go. This game had long been considered almost infeasible for machines, owing to its immense search space and the need for a long-term strategy. Since this historic success, MCTS has been regarded as an effective new approach to many other scientific and technical problems. Interestingly, civil structural engineering, as a discipline, offers many tasks whose solution may benefit from intelligent search and, in particular, from adopting MCTS as a search tool. In this work, we show how MCTS can be adapted to search for suitable solutions to a structural engineering design problem. The problem consists of choosing the load-bearing elements in a reference reinforced concrete structure so as to achieve a set of specific dynamic characteristics. In the paper, we report the results obtained by applying both a plain and a hybrid version of single-agent MCTS. The hybrid approach integrates MCTS with a classic Genetic Algorithm (GA), the latter also serving as a point of comparison for the results. The study's outcomes may open new perspectives for the adoption of MCTS as a design tool for civil engineers.
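As a rough illustration of the search technique itself, the following is a minimal single-agent MCTS (UCT) sketch. The design problem is replaced by a toy stand-in (choose a 0/1 label per element to match a target configuration); the reward function, the parameters and all names are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

# Toy stand-in for the design problem: pick a 0/1 label for each of D
# "elements" so that a score function is maximised (all illustrative).
D = 8
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

def reward(assignment):
    """Fraction of elements matching the target configuration."""
    return sum(a == t for a, t in zip(assignment, TARGET)) / D

class Node:
    def __init__(self, state, parent=None):
        self.state = state            # partial assignment (tuple of 0/1)
        self.parent = parent
        self.children = {}            # action -> Node
        self.visits = 0
        self.value = 0.0              # sum of rollout rewards

    def is_terminal(self):
        return len(self.state) == D

    def untried_actions(self):
        return [a for a in (0, 1) if a not in self.children]

def uct_select(node, c=1.4):
    """Pick the child maximising the UCB1 score."""
    return max(node.children.values(),
               key=lambda ch: ch.value / ch.visits
               + c * math.sqrt(math.log(node.visits) / ch.visits))

def rollout(state):
    """Complete the partial assignment randomly and score it."""
    tail = [random.randint(0, 1) for _ in range(D - len(state))]
    return reward(list(state) + tail)

def mcts(iterations=2000):
    root = Node(())
    for _ in range(iterations):
        node = root
        # 1. Selection: descend while fully expanded
        while not node.is_terminal() and not node.untried_actions():
            node = uct_select(node)
        # 2. Expansion
        if not node.is_terminal():
            a = random.choice(node.untried_actions())
            node.children[a] = Node(node.state + (a,), parent=node)
            node = node.children[a]
        # 3. Simulation
        r = rollout(node.state)
        # 4. Backpropagation
        while node is not None:
            node.visits += 1
            node.value += r
            node = node.parent
    # Greedy extraction of the best found assignment (by visit count)
    node = root
    while not node.is_terminal() and node.children:
        node = max(node.children.values(), key=lambda ch: ch.visits)
    return list(node.state)

best = mcts()
print(best, reward(best))
```

In the paper's setting the assignment would encode the choice of load-bearing elements and the reward would measure closeness to the target dynamic characteristics; the hybrid variant additionally feeds MCTS results into a GA population (not shown here).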
With the introduction of the national annex DIN EN 1998-1/NA, masonry buildings in Germany must be verified on the basis of a new probabilistic seismic hazard map. For successful seismic verification of common masonry building layouts, the future annex provides new computational verification options with which the load-bearing reserves of masonry buildings can be exploited in building practice with manageable effort. The standard calculation method remains the force-based verification, which may now be carried out with higher behaviour factors than under DIN 4149. The higher behaviour factors are based on better utilization of the building-specific deformation capacity and energy dissipation, as well as on the redistribution of shear forces in plan by taking into account frame action resulting from wall-slab interaction. Alternatively, a nonlinear verification based on pushover analyses may be applied. The provisions for masonry buildings are completed by new rules for non-load-bearing interior walls and exterior masonry veneer walls. This article presents the principles and background of the new computational verifications in DIN EN 1998-1/NA and demonstrates their application in a practical example.
Past earthquakes demonstrated the high vulnerability of industrial facilities equipped with complex process technologies, leading to serious damage of process equipment and the multiple, simultaneous release of hazardous substances. Nonetheless, current standards for the seismic design of industrial facilities are considered inadequate to guarantee proper safety against exceptional events entailing loss of containment and related consequences. On these premises, the SPIF project (Seismic Performance of Multi-Component Systems in Special Risk Industrial Facilities) was proposed within the framework of the European H2020 SERA funding scheme. The objective of the SPIF project is the investigation of the seismic behaviour of a representative industrial multi-storey frame structure equipped with complex process components by means of shaking table tests. Along these lines, and in a performance-based design perspective, the issues investigated in depth are the interaction between a primary moment resisting frame (MRF) steel structure and secondary process components that influence the performance of the whole system, and a proper check of floor spectra predictions. The evaluation of the experimental data clearly shows a favourable performance of the MRF structure, some weaknesses in local details due to the interaction between floor crossbeams and process components and, finally, the overconservatism of current design standards with respect to floor spectra predictions.
In a special paired sample case, Hotelling’s T² test based on the differences of the paired random vectors is the likelihood ratio test for testing the hypothesis that the paired random vectors have the same mean; with respect to a special group of affine linear transformations it is the uniformly most powerful invariant test for the general alternative of a difference in mean. We present an elementary straightforward proof of this result. The likelihood ratio test for testing the hypothesis that the covariance structure is of the assumed special form is derived and discussed. Applications to real data are given.
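The statistic in question can be sketched directly: form the differences of the paired vectors and compute Hotelling's T² from their mean and covariance. A minimal numpy version on illustrative data (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

def paired_hotelling_t2(X, Y):
    """Hotelling's T^2 for paired p-variate samples, computed from the
    differences D_i = X_i - Y_i."""
    D = X - Y
    n, p = D.shape
    dbar = D.mean(axis=0)
    S = np.cov(D, rowvar=False)              # unbiased sample covariance
    t2 = n * dbar @ np.linalg.solve(S, dbar)
    # Under normality and H0, (n - p) / (p * (n - 1)) * T^2 ~ F(p, n - p)
    f_stat = (n - p) / (p * (n - 1)) * t2
    return t2, f_stat

# Paired vectors sharing a common dependence structure, equal means (H0 true)
n, p = 50, 3
Z = rng.normal(size=(n, p))
X = Z + rng.normal(scale=0.5, size=(n, p))
Y = Z + rng.normal(scale=0.5, size=(n, p))
t2, f_stat = paired_hotelling_t2(X, Y)
print(t2, f_stat)
```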
Hotelling’s T² tests in paired and independent survey samples are compared using the traditional asymptotic efficiency concepts of Hodges–Lehmann, Bahadur and Pitman, as well as through criteria based on the volumes of corresponding confidence regions. Conditions characterizing the superiority of a procedure are given in terms of population canonical correlation type coefficients. Statistical tests for checking these conditions are developed. Test statistics based on the eigenvalues of a symmetrized sample cross-covariance matrix are suggested, as well as test statistics based on sample canonical correlation type coefficients.
The paper deals with the asymptotic behaviour of estimators, statistical tests and confidence intervals for L²-distances to uniformity based on the empirical distribution function, the integrated empirical distribution function and the integrated empirical survival function. Approximations of power functions, confidence intervals for the L²-distances and statistical neighbourhood-of-uniformity validation tests are obtained as main applications. The finite sample behaviour of the procedures is illustrated by a simulation study.
Suppose we have k samples X₁,₁,…,X₁,ₙ₁,…,Xₖ,₁,…,Xₖ,ₙₖ with different sample sizes n₁,…,nₖ and unknown underlying distribution functions F₁,…,Fₖ as observations. Given, in addition, k families of distribution functions {G₁(⋅,ϑ);ϑ∈Θ},…,{Gₖ(⋅,ϑ);ϑ∈Θ}, each indexed by elements ϑ from the same parameter set Θ, we consider the new goodness-of-fit problem of whether or not (F₁,…,Fₖ) belongs to the parametric family {(G₁(⋅,ϑ),…,Gₖ(⋅,ϑ));ϑ∈Θ}. New test statistics are presented and a parametric bootstrap procedure for the approximation of the unknown null distributions is discussed. Under regularity assumptions, it is proved that the approximation works asymptotically, and the limiting distributions of the test statistics under the null hypothesis are determined. Simulation studies investigate the quality of the new approach for small and moderate sample sizes. Applications to real data sets illustrate how the idea can be used for verifying model assumptions.
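A sketch of the parametric bootstrap idea, using (as an illustrative choice, not the paper's general setting) Gⱼ(·,ϑ) = Exp(ϑ) for all j, a pooled estimate of the common ϑ, and a sum of Cramér-von Mises distances as the statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

def cvm_distance(x, cdf):
    """Cramer-von Mises distance between the edf of x and a given cdf."""
    x = np.sort(x)
    n = len(x)
    u = cdf(x)
    return 1.0 / (12 * n) + np.sum(
        ((2 * np.arange(1, n + 1) - 1) / (2 * n) - u) ** 2)

def k_sample_gof(samples, n_boot=200, rng=rng):
    """Test whether all k samples follow Exp(theta) with one common theta
    (illustrative family choice)."""
    pooled = np.concatenate(samples)
    theta = 1.0 / pooled.mean()                      # pooled MLE for Exp
    cdf = lambda t, th=theta: 1.0 - np.exp(-th * t)
    stat = sum(cvm_distance(s, cdf) for s in samples)

    boot = np.empty(n_boot)
    for b in range(n_boot):                          # parametric bootstrap
        bs = [rng.exponential(1.0 / theta, size=len(s)) for s in samples]
        th_b = 1.0 / np.concatenate(bs).mean()       # re-estimate each time
        cdf_b = lambda t, th=th_b: 1.0 - np.exp(-th * t)
        boot[b] = sum(cvm_distance(s, cdf_b) for s in bs)
    p_value = np.mean(boot >= stat)
    return stat, p_value

samples = [rng.exponential(2.0, size=n) for n in (40, 60, 80)]  # H0 true
stat, p = k_sample_gof(samples)
print(stat, p)
```

The key point mirrored here is that the parameter is re-estimated on every bootstrap sample, which is what the paper's asymptotic justification is about.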
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
The efficiency concepts of Bahadur and Pitman are used to compare the Wilcoxon tests in paired and independent survey samples. A comparison through the length of corresponding confidence intervals is also done. Simple conditions characterizing the dominance of a procedure are derived. Statistical tests for checking these conditions are suggested and discussed.
The paper deals with an asymptotic relative efficiency concept for confidence regions of multidimensional parameters that is based on the expected volumes of the confidence regions. Under standard conditions the asymptotic relative efficiencies of confidence regions are seen to be certain powers of the ratio of the limits of the expected volumes. These limits are explicitly derived for confidence regions associated with certain plugin estimators, likelihood ratio tests and Wald tests. Under regularity conditions, the asymptotic relative efficiency of each of these procedures with respect to each one of its competitors is equal to 1. The results are applied to multivariate normal distributions and multinomial distributions in a fairly general setting.
Let X₁,…,Xₙ be independent and identically distributed random variables with distribution F. Assuming that there are measurable functions f:R²→R and g:R²→R characterizing a family F of distributions on the Borel sets of R in such a way that the random variables f(X₁,X₂) and g(X₁,X₂) are independent if and only if F∈F, we propose to treat the testing problem H:F∈F, K:F∉F by applying a consistent nonparametric independence test to the bivariate sample variables (f(Xᵢ,Xⱼ),g(Xᵢ,Xⱼ)), 1⩽i,j⩽n, i≠j. A parametric bootstrap procedure needed to obtain critical values is shown to work. The consistency of the test is discussed. The power performance of the procedure is compared with that of the classical Kolmogorov–Smirnov and Cramér–von Mises tests in the special cases where F is the family of gamma distributions or the family of inverse Gaussian distributions.
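For the gamma case, a classical characterization of this type (Lukacs' theorem) takes f(x,y) = x + y and g(x,y) = x/(x + y): these are independent if and only if the underlying law is a gamma distribution. A minimal simulation sketch of the pairs the test is built on (illustrative only; the paper's actual procedure applies a consistent nonparametric independence test with parametric bootstrap critical values):

```python
import numpy as np

rng = np.random.default_rng(7)

# Lukacs' characterization: for i.i.d. positive X1, X2, the statistics
# f(X1, X2) = X1 + X2 and g(X1, X2) = X1 / (X1 + X2) are independent
# if and only if the Xs are gamma distributed.
n = 300
x = rng.gamma(shape=2.0, scale=1.5, size=n)

i, j = np.triu_indices(n, k=1)        # all index pairs i < j
f = x[i] + x[j]
g = x[i] / (x[i] + x[j])

# Under the gamma hypothesis the sample correlation should be near zero
corr = np.corrcoef(f, g)[0, 1]
print(corr)
```

Correlation is of course only a crude proxy for independence; it is shown here merely to visualize why the pairs (f(Xᵢ,Xⱼ), g(Xᵢ,Xⱼ)) carry information about membership in the family.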
The Rothman–Woodroofe symmetry test statistic is revisited on the basis of independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied to testing symmetry in a meta-analysis random effects model. The consistency of the procedure in this situation is discussed as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
In the context of the Solvency II directive, the operation of an internal risk model is a possible way to assess risk and to determine the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary to generate the model output. To be compliant with the directive, validation of the internal risk model is conducted on the basis of this output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method has more general use for inference on a broad class of law-invariant and coherent risk measures on the basis of a paired sample.
We discuss the testing problem of homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It is seen that a new resampling approach is appropriate for the approximation of the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite sample case. As an illustration we apply the test to real data sets.
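The building block of the approach, the two-sample Cramér–von Mises distance applied to the remaining components, can be sketched as follows (illustrative data and sizes; note that critical values must come from the paper's resampling scheme, since the two remaining samples are dependent):

```python
import numpy as np

rng = np.random.default_rng(3)

def two_sample_cvm(x, y):
    """Two-sample Cramer-von Mises distance, evaluated over the pooled
    sample: T = nm/(n+m)^2 * sum_k (F_n(z_k) - G_m(z_k))^2."""
    pooled = np.concatenate([x, y])
    n, m = len(x), len(y)
    Fx = np.searchsorted(np.sort(x), pooled, side="right") / n
    Gy = np.searchsorted(np.sort(y), pooled, side="right") / m
    return n * m / (n + m) ** 2 * np.sum((Fx - Gy) ** 2)

# Paired sample (X_i, Y_i) with identical marginals (H0 true); components
# are dropped completely at random, as in the paper's setting.
n = 200
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
keep_x = rng.random(n) > 0.2          # ~20% of first components missing
keep_y = rng.random(n) > 0.2
stat = two_sample_cvm(z[keep_x, 0], z[keep_y, 1])
print(stat)
```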
The established Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
On the basis of independent and identically distributed bivariate random vectors, whose components are categorical and continuous variables, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular array setting. A natural application is hypothesis testing. An independence test and a two-sample test are investigated in detail. The fairly general setting enables limit results under local alternatives and for bootstrap samples. For comparison with existing tests from the literature, simulation studies are conducted. The empirical results obtained confirm the theoretical findings.
This paper considers a paired data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded in a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von Mises type test statistic is applied and a bootstrap procedure is suggested to obtain critical values and, finally, a consistent test. The desired properties of a bootstrap test are derived: asymptotic exactness under the null hypothesis and consistency under alternatives. Simulations show the quality of the test in the finite sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data. The approach is demonstrated on historical data for different stock market indices.
Reinforced concrete (RC) buildings with masonry infill are constructed in many countries around the world. Although masonry infill is regarded as a non-structural element, it significantly changes the dynamic characteristics of RC frame structures during earthquake action. Recently, considerable effort has been devoted to research on isolated infills, which are separated from the surrounding frame, usually by leaving a gap between frame and infill. In this case the frame deformation does not activate the infill, so the infill does not influence the behaviour of the frame. This paper presents the results of an investigation of the behaviour of RC frame buildings with the INODIS system, which isolates the infill from the surrounding frame. The influence of the isolated infill was first examined on single-storey, single-bay frames. This served as the basis for a parametric analysis of multi-storey and multi-bay frames, as well as of a building example. Changes in stiffness and dynamic characteristics were analysed, as well as the response under earthquake action. Comparisons were made with the bare frame structure and with frames infilled in the traditional way. The results show that the behaviour of frames with isolated infill is similar to that of bare frames, whereas the behaviour of frames with traditional infill is far different and requires complex numerical models. This means that, if an adequate constructive measure for infill isolation is applied, the design of RC frame buildings with masonry infill can be considerably simplified.
On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of raindays and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized by the fact that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the functional part of this equation by substituting the empirical counterparts of the transform, we obtain an expression, the weighted integral of whose square is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite sample case. The procedure is applied to rainfall data and to an actuarial data set. A multivariate extension is also discussed.
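A minimal simulation of the compound Poisson exponential model helps fix the setting: N ~ Poisson(λ) raindays (or losses) and S the sum of N i.i.d. Exp(θ) amounts. The parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_cpe(lam, theta, size, rng=rng):
    """Draw (S, N) from the compound Poisson exponential model:
    N ~ Poisson(lam), S = sum of N i.i.d. Exp(theta) amounts."""
    N = rng.poisson(lam, size=size)
    S = np.array([rng.exponential(1.0 / theta, n).sum() for n in N])
    return S, N

S, N = simulate_cpe(lam=3.0, theta=0.5, size=1000)
print(S.mean(), N.mean())   # theoretically E[S] = lam/theta = 6, E[N] = lam = 3
```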
FEM shakedown analysis of structures under random strength with chance constrained programming
(2022)
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-carrying capacity of structures beyond the elastic range. In practical problems, direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the shakedown analysis can be formulated as a stochastic programming problem. In this paper, a method called chance constrained programming is presented, an effective method of stochastic programming for solving shakedown analysis problems under random strength conditions. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed variable.
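As a small illustration of the chance constrained idea, a single chance constraint with normally distributed strength admits a closed-form deterministic equivalent via a quantile of the strength distribution; for the lognormal case the same idea applies on the log scale. The sketch below uses only the Python standard library, and all numbers are illustrative, not from the paper:

```python
from statistics import NormalDist

# Chance constraint P(load_effect <= R) >= 1 - alpha, with random strength
# R ~ N(mu, sigma^2), has the deterministic equivalent
#     load_effect <= mu + sigma * Phi^{-1}(alpha),
# i.e. the design may use at most the alpha-quantile of the strength.
mu, sigma = 235.0, 15.0          # e.g. yield strength in MPa (illustrative)
alpha = 0.05                     # admissible violation probability
r_design = mu + sigma * NormalDist().inv_cdf(alpha)
print(r_design)                  # ~ 210.3 MPa
```

In a full shakedown analysis this substitution turns each probabilistic strength constraint into a deterministic one, so the problem stays a nonlinear convex optimization of the usual form.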
Purpose
In the determination of the measurement uncertainty, the GUM procedure requires building a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling, as well as of quantifying the measurement uncertainties, depends on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach
In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings
Based on a success metric as well as on the stability, universality and complexity of the methods, two FS methods were identified that reliably distinguish relevant from irrelevant influencing quantities for a measurement model.
Originality/value
For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
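To make the filter idea concrete, the sketch below builds a synthetic dataset with the kinds of properties varied in the study (features with low influence, a redundant feature) and ranks candidates with a simple correlation filter. This is a generic illustration, not one of the nine methods actually compared:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "measurement process": the measurand y depends on two of five
# candidate influencing quantities; feature 4 is redundant (a near-copy of
# feature 0) and features 2 and 3 are pure noise. All values illustrative.
n = 500
X = rng.normal(size=(n, 4))
X = np.column_stack([X, X[:, 0] + 0.01 * rng.normal(size=n)])  # redundant
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=n)

# Simple FS filter: rank features by absolute correlation with the measurand
scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print(ranking)    # relevant (and redundant) features come first
```

Note that a pure correlation filter cannot flag the redundant feature; detecting redundancy is exactly why the study compares several FS methods against dataset properties.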
The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors like iron and steel. In order to prevent the relocation of such industries outside the EU in the course of tightening environmental regulations, the establishment of a climate club jointly with other large emitters, or alternatively the unilateral implementation of an international cross-border carbon tax mechanism, has been proposed. This article focuses on the latter option, choosing the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes in Europe against more polluting competition from abroad. Using a floor price model, we assess the competitiveness of different steel production routes in selected countries. We evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model, which enables us to assess the impacts of rising steel production costs on commodities using steel as an intermediate. Our results raise concerns that a cross-border tax mechanism will not suffice to make hydrogen-based steel production in Europe competitive, because its cost tends to remain higher than the cost of steel production in, e.g., China. Steel is a classic example of a good used mainly as an intermediate for other products. Therefore, a cross-border tax mechanism for steel will increase the price of EU products that require steel as an input, which can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports.
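The input-output price mechanism described above can be sketched as the classical Leontief price system p = (I − Aᵀ)⁻¹v, where A holds the technical coefficients and v the value-added (including carbon-cost) components per unit of output. The toy three-sector example below is purely illustrative; none of the coefficients are taken from the article:

```python
import numpy as np

# Leontief input-output price model: producer prices solve
#   p = A.T @ p + v   =>   p = (I - A.T)^{-1} v
# Toy 3-sector economy (steel, machinery, construction), illustrative only.
A = np.array([[0.10, 0.30, 0.20],   # steel inputs per unit of output
              [0.05, 0.10, 0.15],   # machinery inputs
              [0.02, 0.05, 0.10]])  # construction inputs
v = np.array([0.6, 0.5, 0.5])       # baseline value-added per unit

I = np.eye(3)
p0 = np.linalg.solve(I - A.T, v)

# A carbon border tax raising the steel sector's unit cost by 0.2
v_tax = v + np.array([0.2, 0.0, 0.0])
p1 = np.linalg.solve(I - A.T, v_tax)
print(p1 - p0)   # the cost increase propagates to steel-using sectors
```

The propagation of the steel cost shock to downstream prices is exactly the channel the article highlights for EU products that use steel as an input.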
Companies are generally convinced that they put their customers' needs at the centre of attention. But in direct interaction with the customer they often show weaknesses. The following article illustrates how a consistent alignment of value-creation processes with central customer needs can achieve a threefold effect: lasting increases in customer satisfaction, improved efficiency, and differentiation from competitors.
Customer requirements for networks have changed considerably in recent years. With NFV and SDN, companies are technically able to meet them. However, providers face major challenges: in particular, products and processes must be adapted and made more agile in order to turn the strengths of NFV and SDN into customer benefits.
Conducting a systematic literature search is a core competency of academic work and is therefore a fixed component of the training in bachelor's and master's programmes. In the corresponding courses, students are familiarized with the basic tools for searching and managing literature; however, the potential of text-analytical methods and application systems (text mining, text analytics) is usually not covered. Consequently, the data competencies required for the system-supported analysis and exploration of literature data are not sufficiently developed. To address this competency gap, a project-oriented course has been designed and implemented at Hochschule Osnabrück, aimed particularly at students of business programmes. This article documents the subject-matter and technical design of this course and outlines potential for its future development.
On the application of Eurocode 3 Part 1-2 for structural fire design and suggestions for its revision
(2016)
The Eurocodes are being revised by 2020 in the European Committee for Standardization (CEN), Technical Committee TC 250. In preparation for the Eurocode revision, committed engineers within the initiative PraxisRegeln Bau (PRB) examined the parts of Eurocode 3 most frequently used in practice. With the aim of improving the practical usability of Eurocode 3 for structural fire design, the existing standard EN 1993 Part 1-2 was analysed, particularly with regard to user-friendliness, and proposals for the European revision were developed. The analyses show that restructuring and the introduction of tables can considerably improve the comprehensibility and user-friendliness of the rules for structural fire design.
Columns and beams made of steel sections can be embedded in foundations or walls made of reinforced concrete. These connections generally act as fixed supports, which require a sufficient embedment depth. In the following, a generalized calculation method is presented for steel sections embedded in reinforced concrete structures, covering rolled I-sections, welded I-sections, circular hollow sections, rectangular hollow sections and single-cell box sections. For loading due to uniaxial bending about the strong and the weak axis of the section, the section-dependent approach for the concrete compressive stresses in the embedment region and the determination of the embedment depth are treated. Taking the axial force into account, load-bearing capacity verifications for the steel sections are carried out at the governing locations. As a supplement to the calculation formulas, design aids are provided that facilitate the choice of effective widths and embedment depths.
In practice there are many fields of application for travel demand models. They can provide characteristic values of transport supply and travel demand for the present state as well as for future states, thereby supplying the basis for transport planning decisions. The new "Empfehlungen zum Einsatz von Verkehrsnachfragemodellen für den Personenverkehr" (EVNM-PV) (FGSV 2022) use typical planning tasks to illustrate the differentiated requirements that result for model conception and construction. Against the background of the concrete task and its specific planning requirements, the model specification to be derived forms the agreed basis between client and modeller for the concrete content-related and technical design of the transport model.
The newly published "Empfehlungen zum Einsatz von Verkehrsnachfragemodellen für den Personenverkehr" provide, for the first time as a recommendation paper of the Forschungsgesellschaft für Straßen- und Verkehrswesen, a comprehensive overview of the various aspects of modelling and give practitioners concrete assistance in designing demand models. Among other things, the recommendation paper aims to harmonize expectations and the level of aspiration with regard to the appropriateness of the models, the achievable model quality and the level of detail of the model results.
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for planning the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization prerequisites stated by Process Virtualization Theory to the context of customer-oriented processes in the telecommunication industry and assess their importance in this context; (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight the changes they demand at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
As the potential of next generation networks (NGN) is recognised, telecommunication companies consider switching to them. Although the implementation of an NGN seems to be merely a modification of the network infrastructure, it may trigger or require changes in the whole company, because it builds upon the separation of service and transport, a flexible bundling of services into products and the streamlining of the IT infrastructure. We propose a holistic framework, structured into the layers 'strategy', 'processes' and 'information systems', and incorporate into each layer all concepts necessary for the implementation of an NGN, as well as the alignment of these concepts. As a first proof of concept for our framework, we performed a case study on the introduction of an NGN in a large telecommunication company; we show that our framework captures all topics affected by an NGN implementation.
The telecommunication market is undergoing substantial change. New business models, innovative services and technologies require reengineering, transformation and process standardization. With the Enhanced Telecom Operations Map (eTOM), the TM Forum offers an internationally recognized de facto reference process framework based on the specific requirements and characteristics of the telecommunication industry. However, this reference framework contains only a hierarchical collection of processes at different levels of abstraction. A control view, understood as a sequential arrangement of activities resulting in a real process flow, is missing, as is an end-to-end view of the customer. In this article we extend the eTOM reference model with reference process flows, in which we abstract and generalize knowledge about processes in telecommunication companies. The reference process flows support companies in the structured and transparent (re)design of their processes. We demonstrate the applicability and usefulness of our reference process flows in two case studies and evaluate them against criteria for the assessment of reference models. The reference process flows were adopted by the TM Forum and published as part of eTOM version 9. In addition, we discuss the components of our approach that can also be applied outside the telecommunication industry.
The continuing growth of scientific publications raises the question of how literature analyses within research processes can be digitalized and thus carried out more productively. Especially in information technology disciplines, research practice is characterized by a rapidly growing volume of publications. Consequently, the use of text analytics methods, which can automatically prepare and process text data, suggests itself. Insights arise from analyses of parts of speech and subgroups, as well as from correlation and time series analyses. This article presents the design and implementation of a prototype with which users can explore bibliographic data from the established literature database EBSCO Discovery Service using text-analytical methods. The prototype is based on the analysis system IBM Watson Explorer, which is available to universities free of licence costs. Potential addressees of the prototype are research institutions, consulting firms and decision-makers in politics and business practice.
In the context of digitalization, the increasing automation of previously manual process steps is an aspect that will have massive effects on the future world of work. High expectations are attached to the use of software robots for process automation. Among implementation approaches, the current discussion is shaped in particular by robotic process automation (RPA) and chatbots. Both approaches pursue the common goal of one-to-one automation of human actions and thereby the direct replacement of employees by machines. With RPA, processes are learned by software robots and executed automatically. RPA robots emulate inputs on the existing presentation layer, so no changes to existing application systems are necessary. Various RPA solutions are already offered on the market as software products. Chatbots realize the input and output of application systems via natural language. This makes it possible to automate communication outside the company (e.g. with customers) as well as internal assistance tasks. The article discusses the effects of software robots on the world of work using application examples and explains the company-specific decision on the use of software robots in terms of effectiveness and efficiency goals.
Using the telecommunications industry as an example, this paper presents a concrete form of application-oriented research that generates both practical benefits and scientific insights. The research objects are the reference models of the industry body TM Forum, which many telecommunications companies use to transform their structures and systems. The paper describes many years of research on the further development and application of these reference models, following a consistently design-oriented research approach. The interplay of continuous further development in cooperation with an industry body and application in a wide range of practical projects leads to a successful symbiosis of practical benefit and scientific insight. The paper presents the chosen research approach using concrete examples and, on that basis, discusses recommendations and challenges for design- and practice-oriented research.
The molecular weight properties of lignins are one of the key elements that need to be analyzed for a successful industrial application of these promising biopolymers. In this study, the use of 1H NMR as well as diffusion-ordered spectroscopy (DOSY NMR), combined with multivariate regression methods, was investigated for the determination of the molecular weight (Mw and Mn) and the polydispersity of organosolv lignins (n = 53, Miscanthus x giganteus, Paulownia tomentosa, and Silphium perfoliatum). The suitability of the models was demonstrated by cross validation (CV) as well as by an independent validation set of samples from different biomass origins (beech wood and wheat straw). CV errors of ca. 7–9 and 14–16% were achieved for all parameters with the models from the 1H NMR spectra and the DOSY NMR data, respectively. The prediction errors for the validation samples were in a similar range for the partial least squares model from the 1H NMR data and for a multiple linear regression using the DOSY NMR data. The results indicate the usefulness of NMR measurements combined with multivariate regression methods as a potential alternative to more time-consuming methods such as gel permeation chromatography.
In this study, a recently proposed NMR standardization approach using the 2H integral of the deuterated solvent for quantitative multicomponent analysis of complex mixtures is presented. As a proof of principle, the existing NMR routine for the analysis of Aloe vera products was modified. Instead of using absolute integrals of the targeted compounds and an internal standard (nicotinamide) from 1H-NMR spectra, quantification was performed based on the ratio of a particular 1H-NMR compound integral and the 2H-NMR signal of the deuterated solvent, D2O. Validation characteristics (linearity, repeatability, accuracy) were evaluated, and the results showed that the method has the same precision as internal standardization in the case of multicomponent screening. Moreover, a dehydration step by freeze drying is no longer necessary with the new routine. Our NMR profiling of A. vera products now needs only limited sample preparation and data processing. The new standardization methodology provides an appealing alternative for multicomponent NMR screening. In general, this novel approach, using standardization by the 2H integral, benefits from reduced sample preparation steps and uncertainties, and is recommended for different application areas (purity determination, forensics, pharmaceutical analysis, etc.).
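The core of this standardization is the ratio of two integrals. A minimal sketch, where the function name, its arguments, and the calibration factor are all hypothetical illustrations rather than the paper's notation:

```python
def conc_by_2h_standardization(i_1h, n_h, i_2h, k_cal):
    """Analyte concentration from the ratio of its 1H integral (i_1h,
    normalized per contributing proton n_h) to the 2H integral of the
    D2O solvent (i_2h). k_cal is a calibration factor determined once;
    it absorbs the effective deuterium concentration of the solvent.
    All names here are illustrative, not taken from the paper."""
    return k_cal * (i_1h / n_h) / i_2h

# Example with made-up numbers: 50.0 * (12.0 / 3) / 100.0 = 2.0
c = conc_by_2h_standardization(i_1h=12.0, n_h=3, i_2h=100.0, k_cal=50.0)
```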
The possibility of determining various characteristics of powdered heparin (n = 115) was investigated with infrared spectroscopy. The evaluation of heparin samples covered several parameters, such as purity grade, distributing company, animal source, and heparin species (i.e. Na-heparin, Ca-heparin, and heparinoids). Multivariate analysis using principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), and partial least squares discriminant analysis (PLS-DA) was applied to model the spectral data. Different pre-processing methods were applied to the IR spectral data; multiplicative scatter correction (MSC) was chosen as the most suitable.
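The MSC pre-processing step can be sketched in its common textbook form: each spectrum is regressed against the mean spectrum, and the fitted offset and slope are removed. The synthetic spectra below are illustrative only:

```python
import numpy as np

def msc(spectra, reference=None):
    """Multiplicative scatter correction: regress each spectrum against a
    reference (default: the mean spectrum) and remove the fitted additive
    offset and multiplicative slope."""
    ref = spectra.mean(axis=0) if reference is None else reference
    corrected = np.empty_like(spectra, dtype=float)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(ref, s, 1)
        corrected[i] = (s - intercept) / slope
    return corrected

rng = np.random.default_rng(1)
base = np.sin(np.linspace(0.0, 3.0, 100))
# Identical underlying spectrum distorted by per-sample scatter effects
# (multiplicative factor a, additive baseline b).
raw = np.array([a * base + b for a, b in [(1.2, 0.3), (0.8, -0.1), (1.0, 0.0)]])
clean = msc(raw)   # after MSC the three spectra coincide again
```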
The obtained results were confirmed by nuclear magnetic resonance (NMR) spectroscopy. The good predictive ability of this approach demonstrates the potential of IR spectroscopy and chemometrics for screening heparin quality. This approach, however, is designed as a screening tool and is not intended to replace either of the methods required by the USP and the FDA.
Quantitative nuclear magnetic resonance (qNMR) is routinely performed by internal or external standardization. This manuscript describes a simple alternative to these common workflows that uses the NMR signal of another active nucleus of the calibration compound. For example, quantification of any arbitrary compound by NMR can be based on indirect concentration referencing that relies on a solvent having both 1H and 2H signals. To perform high-quality quantification, the deuteration level of the deuterated solvent used has to be estimated.
In this contribution, the new method was applied to the determination of deuteration levels in different deuterated solvents (MeOD, ACN, CDCl3, acetone, benzene, DMSO-d6). Isopropanol-d6, which contains a defined number of deuterons and protons, was used for standardization. Validation characteristics (precision, accuracy, robustness) were calculated, and the results showed that the method can be used in routine practice. The uncertainty budget was also evaluated. In general, this novel approach, using standardization by the 2H integral, benefits from reduced sample preparation steps and uncertainties, and can be applied in different application areas (purity determination, forensics, pharmaceutical analysis, etc.).
Heparin is a natural polysaccharide that plays an essential role in many biological processes. Alterations in its building blocks can modify the biological roles of commercial heparin products, owing to significant changes in the conformation of the polymer chain. The structural variability of heparin makes quality control difficult with various analytical methods, including infrared (IR) spectroscopy. In this paper, molecular modelling of heparin disaccharide subunits was performed using quantum chemistry. The structural and spectral parameters of these disaccharides were calculated at the RHF/6-311G level. In addition, an over-sulphated chondroitin sulphate disaccharide was studied as one of the most widespread contaminants of heparin. The calculated IR spectra were analyzed with respect to specific structural parameters. The IR spectroscopic fingerprint was found to be sensitive to the substitution pattern of the disaccharide subunits. Vibrational assignments of the calculated spectra were correlated with experimental IR spectral bands of native heparin. Chemometrics was used to perform multivariate analysis of the simulated spectral data.
Lignin is a promising renewable biopolymer being investigated worldwide as an environmentally benign substitute for fossil-based aromatic compounds, e.g. for use as an excipient with antioxidant and antimicrobial properties in drug delivery, or even as an active compound. For its successful implementation into process streams, a quick, easy, and reliable method for determining its molecular weight is needed. Here we present a method using 1H spectra of benchtop as well as conventional NMR systems, in combination with multivariate data analysis, to determine lignin’s molecular weight (Mw and Mn) and polydispersity index (PDI). A set of 36 organosolv lignin samples (from Miscanthus x giganteus, Paulownia tomentosa and Silphium perfoliatum) was used for calibration and cross validation, and 17 samples were used as an external validation set. Validation errors between 5.6% and 12.9% were achieved for all parameters on all NMR devices (43, 60, 500 and 600 MHz). Surprisingly, no significant difference in the performance of the benchtop and high-field devices was found. This facilitates the application of this method for determining lignin’s molecular weight in an industrial environment because of the low maintenance expenditure, small footprint, ruggedness, and low cost of permanent-magnet benchtop NMR systems.
An NMR standardization approach that uses the 2H integral of the deuterated solvent for quantitative multinuclear analysis of pharmaceuticals is described. As a proof of principle, the existing NMR procedure for the analysis of heparin products according to the US Pharmacopeia monograph is extended to the determination of Na+ and Cl- content in this matrix. Quantification is performed based on the ratio of a 23Na (35Cl) NMR integral and the 2H NMR signal of the deuterated solvent, D2O, acquired using specific spectrometer hardware. As an alternative, the possibility of 133Cs standardization by addition of a Cs2CO3 stock solution is shown. Validation characteristics (linearity, repeatability, sensitivity) are evaluated. A holistic NMR profiling of heparin products can now also be used for the quantitative determination of inorganic compounds in a single analytical run on a single sample. In general, the new standardization methodology provides an appealing alternative for the NMR screening of inorganic and organic components in pharmaceutical products.
Although several successful applications of benchtop nuclear magnetic resonance (NMR) spectroscopy in quantitative mixture analysis exist, the possibility of calibration transfer remains mostly unexplored, especially between high- and low-field NMR. This study investigates for the first time the calibration transfer of partial least squares regressions [weight average molecular weight (Mw) of lignin] between high-field (600 MHz) NMR and benchtop NMR devices (43 and 60 MHz). For the transfer, piecewise direct standardization, calibration transfer based on canonical correlation analysis, and transfer via the extreme learning machine auto-encoder method are employed. Despite the immense resolution difference between high-field and low-field NMR instruments, the results demonstrate that the calibration transfer from high- to low-field is feasible in the case of a physical property, namely, the molecular weight, achieving validation errors close to the original calibration (down to only 1.2 times higher root mean square errors). These results introduce new perspectives for applications of benchtop NMR, in which existing calibrations from expensive high-field instruments can be transferred to cheaper benchtop instruments to economize.
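Of the transfer methods named, piecewise direct standardization is the simplest to sketch: each master (high-field) variable is modeled by a local window of slave (benchtop) variables, and the local fits are stacked into a transfer matrix. A minimal, unregularized sketch with synthetic spectra; window size and data are assumptions:

```python
import numpy as np

def pds_transfer_matrix(master, slave, window=5):
    """Piecewise direct standardization: for each variable j of the master
    spectra, fit a local least-squares model on a window of slave variables
    around j. Returns F such that master ~= slave @ F. Minimal sketch; a
    real implementation would regularize each local fit (e.g. via PLS)."""
    n_vars = master.shape[1]
    F = np.zeros((slave.shape[1], n_vars))
    half = window // 2
    for j in range(n_vars):
        lo, hi = max(0, j - half), min(slave.shape[1], j + half + 1)
        coefs, *_ = np.linalg.lstsq(slave[:, lo:hi], master[:, j], rcond=None)
        F[lo:hi, j] = coefs
    return F

rng = np.random.default_rng(2)
slave = rng.normal(size=(20, 50))
# Pretend the master spectra are a locally smoothed version of the slave
# spectra, so a windowed linear map can recover them.
master = (0.5 * slave
          + 0.25 * np.roll(slave, 1, axis=1)
          + 0.25 * np.roll(slave, -1, axis=1))
F = pds_transfer_matrix(master, slave, window=5)
master_hat = slave @ F
```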
Retinal vessels are similar to cerebral vessels in their structure and function. Moderately low oscillation frequencies of around 0.1 Hz have been reported as the driving force for paravascular drainage in gray matter in mice and are known as the frequencies of lymphatic vessels in humans. We aimed to elucidate whether retinal vessel oscillations are altered in Alzheimer's disease (AD) at the stage of dementia or mild cognitive impairment (MCI). Seventeen patients with mild-to-moderate dementia due to AD (ADD), 23 patients with MCI due to AD, and 18 cognitively healthy controls (HC) were examined using the Dynamic Retinal Vessel Analyzer. Oscillatory temporal changes of retinal vessel diameters were evaluated using mathematical signal analysis. Especially at moderately low frequencies around 0.1 Hz, arterial oscillations in ADD and MCI significantly exceeded those of HC and correlated with disease severity. The pronounced retinal arterial vasomotion at moderately low frequencies in the ADD and MCI groups is compatible with a compensatory upregulation of paravascular drainage in AD and strengthens the amyloid clearance hypothesis.
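Evaluating vessel-diameter oscillations around 0.1 Hz can be sketched as a band-limited amplitude estimate from the Fourier spectrum of the diameter time series. The band edges and sampling rate below are assumptions for illustration, not the study's exact analysis:

```python
import numpy as np

def band_amplitude(signal, fs, f_lo=0.05, f_hi=0.15):
    """Mean spectral amplitude of a vessel-diameter time series in a
    moderately low frequency band around 0.1 Hz (band edges illustrative)."""
    sig = signal - signal.mean()                 # remove DC component
    spec = np.abs(np.fft.rfft(sig)) / len(sig)   # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].mean()

fs = 25.0                      # assumed sampling rate of the diameter signal
t = np.arange(0.0, 120.0, 1.0 / fs)            # two minutes of recording
osc = np.sin(2 * np.pi * 0.1 * t)              # synthetic 0.1 Hz vasomotion
noise = 0.1 * np.sin(2 * np.pi * 1.0 * t)      # e.g. pulse-related component
a_band = band_amplitude(osc + noise, fs)       # vasomotion present
a_ref = band_amplitude(noise, fs)              # vasomotion absent
```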
Edge-based and face-based smoothed finite element methods (ES-FEM and FS-FEM, respectively) are modified versions of the finite element method that achieve more accurate results and reduced sensitivity to mesh distortion, at least for linear elements. These properties make the two methods very attractive. However, implementing them in a standard finite element code is nontrivial, because it requires heavy and extensive modifications to the code architecture. In this article, we present an element-based formulation of the ES-FEM and FS-FEM methods that allows both to be implemented in a standard finite element code without modifying its architecture. Moreover, the element-based formulation easily accommodates any element type, which matters especially in 3D models where, to the best of the authors' knowledge, only tetrahedral elements are used in the FS-FEM applications found in the literature. Shape functions for non-simplex 3D elements are proposed so that FS-FEM can be applied to any standard finite element.
The mechanical behavior of the large intestine beyond the ultimate stress has never been investigated. Stretching beyond the ultimate stress may drastically impair the tissue microstructure, which consequently weakens its healthy-state functions of absorption, temporary storage, and transportation for defecation. Because the porcine large intestine closely resembles the human one in microstructure and function, biaxial tensile experiments were performed on porcine tissue in this study. In this paper, we report the hyperelastic characterization of the large intestine based on experiments on 102 specimens. We also report a theoretical analysis of the experimental results, including an exponential damage evolution function. The fracture energies and the threshold stresses serve as damage material parameters for the longitudinal muscular, circumferential muscular, and submucosal collagenous layers. A biaxial tensile simulation of a linear brick element was performed to validate the applicability of the estimated material parameters. The model successfully simulates the biomechanical response of the large intestine under physiological and non-physiological loads.
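The abstract names two damage material parameters per layer, a threshold and a fracture energy, feeding an exponential damage evolution function. A generic continuum-damage form with that structure (not necessarily the paper's exact function) looks like:

```python
import numpy as np

def damage(psi, psi0, g_f):
    """Exponential damage evolution, generic textbook form: no damage below
    the threshold strain energy psi0; beyond it, the damage variable grows
    monotonically toward 1 at a rate set by the fracture energy g_f.
    Illustrates the roles of the two parameters named in the abstract;
    the paper's actual function may differ."""
    psi = np.asarray(psi, dtype=float)
    d = 1.0 - np.exp(-(psi - psi0) / g_f)
    return np.where(psi > psi0, d, 0.0)

# Below or at the threshold the tissue is undamaged;
# beyond it, damage saturates toward 1.
d = damage([0.5, 1.0, 5.0], psi0=1.0, g_f=2.0)
```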