The major advantage of labyrinth weirs over linear weirs is hydraulic efficiency. In hydraulic modeling efforts, this strength contrasts with limited pump capacity as well as limited computational power for CFD simulations. For the latter, reducing the number of investigated cycles can significantly reduce necessary computational time. In this study, a labyrinth weir with different cycle numbers was investigated. The simulations were conducted in FLOW-3D HYDRO as a Large Eddy Simulation. With a mean deviation of 1.75 % between simulated discharge coefficients and literature design equations, a reasonable agreement was found. For downstream conditions, overall consistent results were observed as well. However, the orientation of labyrinth weirs with a single cycle should be chosen carefully under consideration of the individual research purpose.
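The reported agreement can be illustrated with a small sketch of how a mean deviation between simulated discharge coefficients and design-equation values is computed (all numbers below are illustrative, not values from the study):

```python
# Hypothetical example: mean absolute relative deviation between simulated
# discharge coefficients and values from a literature design equation.
def mean_deviation_percent(simulated, reference):
    """Mean absolute relative deviation in percent."""
    pairs = list(zip(simulated, reference))
    return 100.0 * sum(abs(s - r) / r for s, r in pairs) / len(pairs)

cd_sim = [0.62, 0.58, 0.55, 0.49]   # simulated discharge coefficients (illustrative)
cd_ref = [0.61, 0.59, 0.54, 0.50]   # design-equation values (illustrative)
print(round(mean_deviation_percent(cd_sim, cd_ref), 2))
```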
Today’s society is undergoing a paradigm shift driven by the megatrend of sustainability. This undeniably affects all areas of Western life. This paper aims to find out how the luxury industry is dealing with this change and what adjustments are made by the companies. For this purpose, interviews were conducted with managers from the luxury industry, in which they were asked about specific measures taken by their companies as well as trends in the industry. In a subsequent evaluation, the trends in the luxury industry were summarized for the areas of ecological, social, and economic sustainability. It was found that the area of environmental sustainability receives significantly more attention than the other sub-areas. Furthermore, the need for a customer survey to validate the industry-based measures was identified.
This thesis aims at the presentation and discussion of well-accepted and new
imaging techniques applied to different types of flow in common hydraulic
engineering environments. All studies are conducted in laboratory conditions and
focus on flow depth and velocity measurements. Investigated flows cover a wide
range of complexity, e.g. propagation of waves, dam-break flows, slightly and fully
aerated spillway flows as well as highly turbulent hydraulic jumps.
New imaging methods are compared to different types of sensors which are frequently
employed in contemporary laboratory studies. This classical instrumentation as well
as the general concept of hydraulic modeling is introduced to give an overview on
experimental methods.
Flow depths are commonly measured by means of ultrasonic sensors, also known as
acoustic displacement sensors. These sensors may provide accurate data with high
sample rates in case of simple flow conditions, e.g. low-turbulent clear water flows.
However, with increasing turbulence, higher uncertainty must be considered.
Moreover, ultrasonic sensors can provide point data only, while the relatively large
acoustic beam footprint may lead to another source of uncertainty in case of
relatively short, highly turbulent surface fluctuations (ripples) or free-surface
air-water flows. Analysis of turbulent length and time scales of surface fluctuations
from point measurements is also difficult. Imaging techniques with different
dimensionality, however, may close this gap. It is shown in this thesis that edge
detection methods (known from computer vision) may be used for two-dimensional
free-surface extraction (i.e. from images taken through transparent sidewalls in
laboratory flumes). Another opportunity in hydraulic laboratory studies comes with
the application of stereo vision. Low-cost RGB-D sensors can be used to gather
instantaneous, three-dimensional free-surface elevations, even in flows with very
high complexity (e.g. aerated hydraulic jumps). It will be shown that the uncertainty
of these methods is of similar order as for classical instruments.
Particle Image Velocimetry (PIV) is a well-accepted and widespread imaging
technique for velocity determination in laboratory conditions. In combination with
high-speed cameras, PIV can give time-resolved velocity fields in 2D/3D or even as
volumetric flow fields. PIV is based on a cross-correlation technique applied to small
subimages of seeded flows. The minimum size of these subimages defines the
maximum spatial resolution of resulting velocity fields. A derivative of PIV for
aerated flows is also available, i.e. the so-called Bubble Image Velocimetry (BIV). This
thesis emphasizes the capacities and limitations of both methods, using relatively
simple setups with halogen and LED illuminations. It will be demonstrated that
PIV/BIV images may also be processed by means of Optical Flow (OF) techniques.
OF is another method originating from the computer vision discipline, based on the
assumption of image brightness conservation within a sequence of images. The
Horn-Schunck approach, which has been first employed to hydraulic engineering
problems in the studies presented herein, yields dense velocity fields, i.e. pixelwise
velocity data. As discussed hereinafter, the accuracy of OF competes well with PIV
for clear-water flows and even improves results (compared to BIV) for aerated flow
conditions. In order to independently benchmark the OF approach, synthetic images
with defined turbulence intensity are used.
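The Horn-Schunck approach described above can be sketched in a few lines. The following is a minimal NumPy illustration of the classical fixed-point iteration, not the implementation used in this thesis; periodic boundaries are assumed for brevity:

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow: dense (pixelwise) velocities u, v."""
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    # Brightness derivatives, averaged over both frames.
    Ix = 0.5 * (np.gradient(im1, axis=1) + np.gradient(im2, axis=1))
    Iy = 0.5 * (np.gradient(im1, axis=0) + np.gradient(im2, axis=0))
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)

    def local_avg(f):
        # 4-neighbour average; np.roll implies periodic boundaries (a simplification).
        return 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                       np.roll(f, 1, 1) + np.roll(f, -1, 1))

    for _ in range(n_iter):
        u_bar, v_bar = local_avg(u), local_avg(v)
        # Fixed-point update from brightness constancy plus smoothness (weight alpha).
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v
```

For identical frames the brightness residual It vanishes and the iteration returns zero flow everywhere, which makes a convenient sanity check.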
Computer vision offers new opportunities that may help to improve the
understanding of fluid mechanics and fluid-structure interactions in laboratory
investigations. In prototype environments, it can be employed for obstacle detection
(e.g. identification of potential fish migration corridors) and recognition (e.g. fish
species for monitoring in a fishway) or surface reconstruction (e.g. inspection of
hydraulic structures). It can thus be expected that applications to hydraulic
engineering problems will develop rapidly in the near future. Current methods have not
been developed for fluids in motion. Systematic future developments are needed to
improve the results in such difficult conditions.
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
The feasibility study presents results of a hydrogen combustor integration for a Medium-Range aircraft engine using the Dry-Low-NOₓ Micromix combustion principle. Based on a simplified Airbus A320-type flight mission, a thermodynamic performance model of a kerosene and a hydrogen-powered V2530-A5 engine is used to derive the thermodynamic combustor boundary conditions. A new combustor design using the Dry-Low-NOₓ Micromix principle is investigated by slice model CFD simulations of a single Micromix injector for design and off-design operation of the engine. Combustion characteristics show typical Micromix flame shapes and good combustion efficiencies for all flight mission operating points. Nitric oxide emissions are significantly below ICAO CAEP/8 limits. For comparison of the Emission Index (EI) for NOₓ emissions between kerosene and hydrogen operation, an energy (kerosene) equivalent Emission Index is used.
A full 15° sector model CFD simulation of the combustion chamber with multiple Micromix injectors including inflow homogenization and dilution and cooling air flows investigates the combustor integration effects, resulting NOₓ emission and radial temperature distributions at the combustor outlet. The results show that the integration of a Micromix hydrogen combustor in actual aircraft engines is feasible and offers, besides CO₂ free combustion, a significant reduction of NOₓ emissions compared to kerosene operation.
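The energy-equivalent Emission Index mentioned above can be sketched as follows. This is our reading of the concept, with standard lower-heating-value approximations that are not taken from the paper:

```python
# Hedged sketch of an energy-equivalent Emission Index: NOx emissions per kg of
# hydrogen are rescaled to the mass of kerosene carrying the same energy.
# LHV values are standard textbook approximations, not values from the paper.
LHV_H2 = 120.0        # MJ/kg, lower heating value of hydrogen (approx.)
LHV_KEROSENE = 43.1   # MJ/kg, lower heating value of kerosene (approx.)

def kerosene_equivalent_ei(ei_h2_g_per_kg):
    """Convert g NOx per kg H2 to g NOx per kerosene-energy-equivalent kg of fuel."""
    return ei_h2_g_per_kg / LHV_H2 * LHV_KEROSENE
```

Because hydrogen carries roughly 2.8 times the energy of kerosene per kilogram, the kerosene-equivalent index is correspondingly smaller than the raw per-kg-H₂ value.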
Like all preceding transformations of the manufacturing industry, the large-scale usage of production data will reshape the role of humans within the sociotechnical production ecosystem. To ensure that this transformation creates work systems in which employees are empowered, productive, healthy, and motivated, the transformation must be guided by principles of and research on human-centered work design. Specifically, measures must be taken at all levels of work design, ranging from (1) the work tasks to (2) the working conditions to (3) the organizational level and (4) the supra-organizational level. We present selected research across all four levels that showcase the opportunities and requirements that surface when striving for human-centered work design for the Internet of Production (IoP). (1) On the work task level, we illustrate the user-centered design of human-robot collaboration (HRC) and process planning in the composite industry as well as user-centered design factors for cognitive assistance systems. (2) On the working conditions level, we present a newly developed framework for the classification of HRC workplaces. (3) Moving to the organizational level, we show how corporate data can be used to facilitate best practice sharing in production networks, and we discuss the implications of the IoP for new leadership models. Finally, (4) on the supra-organizational level, we examine overarching ethical dimensions, investigating, e.g., how the new work contexts affect our understanding of responsibility and normative values such as autonomy and privacy. Overall, these interdisciplinary research perspectives highlight the importance and necessary scope of considering the human factor in the IoP.
The number of electric vehicles increases steadily while the space for extending the charging infrastructure is limited. In particular in urban areas, where parking spaces in attractive locations are scarce, the opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilizations.
Extracting workflow nets from textual descriptions can be used to simplify guidelines or formalize textual descriptions of formal processes like business processes and algorithms. The task of manually extracting processes, however, requires domain expertise and effort. While automatic process model extraction is desirable, annotating texts with formalized process models is expensive. Therefore, there are only a few machine-learning-based extraction approaches. Rule-based approaches, in turn, require domain specificity to work well and can rarely distinguish relevant and irrelevant information in textual descriptions. In this paper, we present GUIDO, a hybrid approach to the process model extraction task that first, classifies sentences regarding their relevance to the process model, using a BERT-based sentence classifier, and second, extracts a process model from the sentences classified as relevant, using dependency parsing. The presented approach achieves significantly better results than a pure rule-based approach. GUIDO achieves an average behavioral similarity score of 0.93. Still, in comparison to purely machine-learning-based approaches, the annotation costs stay low.
The integration of high temperature thermal energy storages into existing conventional power plants can help to reduce the CO2 emissions of those plants and lead to lower capital expenditures for building energy storage systems, due to the use of synergy effects [1]. One possibility to implement this is a molten salt storage system with a powerful power-to-heat unit. This paper presents two possible control concepts for the startup of the charging system of such a facility. The procedures are implemented in a detailed dynamic process model. The performance and safety regarding the film temperatures at heat transmitting surfaces are investigated in the process simulations. To improve the accuracy in predicting the film temperatures, CFD simulations of the electrical heater are carried out and the results are merged with the dynamic model. The results show that both investigated control concepts are safe regarding the temperature limits. The gradient-controlled startup performed better than the temperature-controlled startup. Nevertheless, there are several uncertainties that need to be investigated further.
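A gradient-controlled startup can be illustrated by a rate-limited setpoint ramp. The sketch below is generic, not the control concept from the paper; function and parameter names are our own:

```python
# Illustrative gradient-limited setpoint: the heater setpoint moves toward the
# target temperature, but never faster than a maximum rate (K per second).
def gradient_limited_setpoint(current_c, target_c, max_rate_c_per_s, dt_s):
    """Next setpoint after one time step dt_s, with a bounded temperature gradient."""
    step = max_rate_c_per_s * dt_s
    if target_c > current_c:
        return min(current_c + step, target_c)
    return max(current_c - step, target_c)
```

Called once per simulation step, this clamps the commanded temperature trajectory, which is one simple way to keep film temperatures at heat-transmitting surfaces from rising too quickly.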
Messenger apps like WhatsApp and Telegram are frequently used for everyday communication, but they can also be utilized as a platform for illegal activity. Telegram allows public groups with up to 200,000 participants. Criminals use these public groups for trading illegal commodities and services, which becomes a concern for law enforcement agencies, who manually monitor suspicious activity in these chat rooms. This research demonstrates how natural language processing (NLP) can assist in analyzing these chat rooms, providing an explorative overview of the domain and facilitating purposeful analyses of user behavior. We provide a publicly available corpus of annotated text messages with entities and relations from four self-proclaimed black market chat rooms. Our pipeline approach aggregates the extracted product attributes from user messages to profiles and uses these with their sold products as features for clustering. The extracted structured information is the foundation for further data exploration, such as identifying the top vendors or fine-granular price analyses. Our evaluation shows that pretrained word vectors perform better for unsupervised clustering than state-of-the-art transformer models, while the latter is still superior for sequence labeling.
New European Union (EU) regulations for UAS operations require an operational risk analysis, which includes an estimation of the potential danger of the UAS crashing. A key parameter for the potential ground risk is the kinetic impact energy of the UAS. The kinetic energy depends on the impact velocity of the UAS and, therefore, on the aerodynamic drag and the weight during free fall. Hence, estimating the impact energy of a UAS requires an accurate drag estimation of the UAS in that state. The paper at hand presents the aerodynamic drag estimation of small-scale multirotor UAS. Multirotor UAS of various sizes and configurations were analysed with a fully unsteady Reynolds-averaged Navier–Stokes approach. These simulations included different velocities and various fuselage pitch angles of the UAS. The results were compared against force measurements performed in a subsonic wind tunnel and showed good agreement. Furthermore, the influence of the UAS's fuselage pitch angle as well as the influence of fixed and free spinning propellers on the aerodynamic drag was analysed. Free spinning propellers may increase the drag by up to 110%, depending on the fuselage pitch angle. Increasing the fuselage pitch angle of the UAS lowers the drag by 40% up to 85%, depending on the UAS. The data presented in this paper allow for increased accuracy of ground risk assessments.
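The dependence of impact energy on drag can be made concrete with a back-of-the-envelope sketch. All masses, drag coefficients, and reference areas below are illustrative, not values from the study:

```python
import math

# Drag-limited free fall: at terminal velocity, drag balances weight,
# m * g = 0.5 * rho * cd * A * v^2, which bounds the kinetic impact energy.
def terminal_velocity(mass_kg, cd, area_m2, rho=1.225, g=9.81):
    """Terminal velocity in m/s for given drag coefficient and reference area."""
    return math.sqrt(2.0 * mass_kg * g / (rho * cd * area_m2))

def impact_energy(mass_kg, velocity_ms):
    """Kinetic energy at impact, E = 0.5 * m * v^2, in joules."""
    return 0.5 * mass_kg * velocity_ms**2

# Hypothetical 2 kg multirotor with cd = 1.0 and 0.05 m^2 reference area.
v_t = terminal_velocity(mass_kg=2.0, cd=1.0, area_m2=0.05)
print(round(v_t, 1), "m/s,", round(impact_energy(2.0, v_t), 1), "J")
```

A higher drag estimate lowers the terminal velocity and, quadratically, the impact energy, which is why accurate drag data feed directly into ground risk assessments.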
Melting probes are a proven tool for the exploration of thick ice layers and clean sampling of subglacial water on Earth. Their compact size and ease of operation also make them a key technology for the future exploration of icy moons in our Solar System, most prominently Europa and Enceladus. For both mission planning and hardware engineering, metrics such as efficiency and expected performance in terms of achievable speed, power requirements, and necessary heating power have to be known.
Theoretical studies aim at describing thermal losses on the one hand, while laboratory experiments and field tests allow an empirical investigation of the true performance on the other hand. To investigate the practical value of a performance model for the operational performance in extraterrestrial environments, we first contrast measured data from terrestrial field tests on temperate and polythermal glaciers with results from basic heat loss models and a melt trajectory model. For this purpose, we propose conventions for the determination of two different efficiencies that can be applied to both measured data and models. One definition of efficiency is related to the melting head only, while the other definition considers the melting probe as a whole. We also present methods to combine several sources of heat loss for probes with a circular cross-section, and to translate the geometry of probes with a non-circular cross-section to analyse them in the same way. The models were selected in a way that minimizes the need to make assumptions about unknown parameters of the probe or the ice environment.
The results indicate that currently used models do not yet reliably reproduce the performance of a probe under realistic conditions. Melting velocities and efficiencies are consistently overestimated by 15 to 50 % in the models, but qualitatively agree with the field test data. Hence, losses are observed that are not yet covered and quantified by the available loss models. We find that the deviation increases with decreasing ice temperature. We suspect that this mismatch is mainly due to the too restrictive idealization of the probe model and the fact that the probe was not operated in an efficiency-optimized manner during the field tests. With respect to space mission engineering, we find that performance and efficiency models must be used with caution in unknown ice environments, as various ice parameters have a significant effect on the melting process. Some of these are difficult to estimate from afar.
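One common idealization behind such performance models is an energy balance at the melting head. The sketch below is a textbook upper bound with standard ice properties, not one of the loss models evaluated in the study; the efficiency convention shown is only one possibility:

```python
import math

# Idealized energy balance: all head power warms ice from t_ice_c to 0 degC and
# melts it, yielding the maximum ("loss-free") melting speed. Material
# constants are standard approximations, not values from the study.
RHO_ICE = 917.0      # kg/m^3, density of ice
C_ICE = 2100.0       # J/(kg K), approximate specific heat of ice
L_FUSION = 334000.0  # J/kg, latent heat of fusion

def max_melt_speed(power_w, radius_m, t_ice_c):
    """Upper-bound melting velocity in m/s for a circular cross-section."""
    area = math.pi * radius_m**2
    energy_per_m3 = RHO_ICE * (C_ICE * (0.0 - t_ice_c) + L_FUSION)
    return power_w / (area * energy_per_m3)

def head_efficiency(measured_speed, power_w, radius_m, t_ice_c):
    """Measured speed over the loss-free speed (one possible convention)."""
    return measured_speed / max_melt_speed(power_w, radius_m, t_ice_c)
```

Colder ice raises the energy needed per melted volume, so the loss-free speed drops with ice temperature; measured speeds falling 15 to 50 % below this bound correspond to efficiencies of roughly 0.5 to 0.85 under this convention.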
Immunosorbent turnip vein clearing virus (TVCV) particles displaying the IgG-binding domains D and E of Staphylococcus aureus protein A (PA) on every coat protein (CP) subunit (TVCVPA) were purified from plants via optimized and new protocols. The latter used polyethylene glycol (PEG) raw precipitates, from which virions were selectively re-solubilized in reverse PEG concentration gradients. This procedure improved the integrity of both TVCVPA and the wild-type subgroup 3 tobamovirus. TVCVPA could be loaded with more than 500 IgGs per virion, which mediated the immunocapture of fluorescent dyes, GFP, and active enzymes. Bi-enzyme ensembles of cooperating glucose oxidase and horseradish peroxidase were tethered together on the TVCVPA carriers via a single antibody type, with one enzyme conjugated chemically to its Fc region, and the other one bound as a target, yielding synthetic multi-enzyme complexes. In microtiter plates, the TVCVPA-displayed sugar-sensing system possessed a considerably increased reusability upon repeated testing, compared to the IgG-bound enzyme pair in the absence of the virus. A high coverage of the viral adapters was also achieved on Ta2O5 sensor chip surfaces coated with a polyelectrolyte interlayer, as a prerequisite for durable TVCVPA-assisted electrochemical biosensing via modularly IgG-assembled sensor enzymes.
In recent years, the development of large pretrained language models, such as BERT and GPT, significantly improved information extraction systems on various tasks, including relation classification. State-of-the-art systems are highly accurate on scientific benchmarks. A lack of explainability is currently a complicating factor in many real-world applications. Comprehensible systems are necessary to prevent biased, counterintuitive, or harmful decisions.
We introduce semantic extents, a concept to analyze decision patterns for the relation classification task. Semantic extents are the most influential parts of texts concerning classification decisions. Our definition allows similar procedures to determine semantic extents for humans and models. We provide an annotation tool and a software framework to determine semantic extents for humans and models conveniently and reproducibly. Comparing both reveals that models tend to learn shortcut patterns from data. These patterns are hard to detect with current interpretability methods, such as input reductions. Our approach can help detect and eliminate spurious decision patterns during model development. Semantic extents can increase the reliability and security of natural language processing systems. Semantic extents are an essential step in enabling applications in critical areas like healthcare or finance. Moreover, our work opens new research directions for developing methods to explain deep learning models.
Residential and commercial buildings account for more than one-third of global energy-related greenhouse gas emissions. Integrated multi-energy systems at the district level are a promising way to reduce greenhouse gas emissions by exploiting economies of scale and synergies between energy sources. Planning district energy systems comes with many challenges in an ever-changing environment. Computational modelling established itself as the state-of-the-art method for district energy system planning. Unfortunately, it is still cumbersome to combine standalone models to generate insights that surpass their original purpose. Ideally, planning processes could be solved by using modular tools that easily incorporate the variety of competing and complementing computational models. Our contribution is a vision for a collaborative development and application platform for multi-energy system planning tools at the district level. We present challenges of district energy system planning identified in the literature and evaluate whether this platform can help to overcome these challenges. Further, we propose a toolkit that represents the core technical elements of the platform. Lastly, we discuss community management and its relevance for the success of projects with collaboration and knowledge sharing at their core.
This article describes an Internet of Things (IoT) sensing device with a wireless interface that is powered by energy harvesting based on the Wiegand effect. In contrast to continuous sources such as photovoltaic or thermal harvesters, the Wiegand effect provides small amounts of energy discontinuously, in pulsed mode. To enable energy-self-sufficient operation of the sensing device from this pulsed energy source, the output energy of the Wiegand generator is maximized. This energy is used to power up the system, to acquire and process data such as position, temperature, or other resistively measurable quantities, and to transmit these data via an ultra-low-power ultra-wideband (UWB) data transmitter. A proof-of-concept system was built to demonstrate the feasibility of the approach. The energy consumption of the system during start-up was analysed, traced back in detail to the individual components, compared to the generated energy, and used to identify further optimization options. Based on the proof of concept, an application prototype was developed.
Clinical assessment of newly developed sensors is important for ensuring their validity. Comparing recordings of emerging electrocardiography (ECG) systems to a reference ECG system requires accurate synchronization of data from both devices. Current methods can be inefficient and prone to errors. To address this issue, three algorithms are presented to synchronize two ECG time series from different recording systems: Binned R-peak Correlation, R-R Interval Correlation, and Average R-peak Distance. These algorithms reduce ECG data to their cyclic features, mitigating inefficiencies and minimizing discrepancies between different recording systems. We evaluate the performance of these algorithms using high-quality data and then assess their robustness after manipulating the R-peaks. Our results show that R-R Interval Correlation was the most efficient, whereas the Average R-peak Distance and Binned R-peak Correlation were more robust against noisy data.
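The R-R Interval Correlation idea can be illustrated with a short sketch: reduce each recording to its sequence of R-R intervals and search for the beat lag that maximizes the correlation between the two interval sequences. This is a hypothetical minimal implementation under that reading of the abstract; the function name, parameters, and search strategy are illustrative and not taken from the article.

```python
import numpy as np

def rr_interval_offset(peaks_a, peaks_b, max_lag=50):
    """Estimate the beat offset between two ECG recordings by
    correlating their R-R interval sequences (illustrative sketch).

    peaks_a, peaks_b: R-peak times (seconds) from the two devices.
    Returns the lag (in beats) of recording B relative to A.
    """
    rr_a = np.diff(peaks_a)  # R-R intervals of recording A
    rr_b = np.diff(peaks_b)  # R-R intervals of recording B
    n = min(len(rr_a), len(rr_b)) - max_lag  # common comparison window
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        a = rr_a[max(lag, 0):max(lag, 0) + n]
        b = rr_b[max(-lag, 0):max(-lag, 0) + n]
        corr = np.corrcoef(a, b)[0, 1]  # similarity at this beat lag
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag
```

Because only interval sequences are compared, the method is insensitive to clock offsets and amplitude differences between the two recording systems, which matches the abstract's point about reducing ECG data to cyclic features.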
Digital forensics of smartphones is of utmost importance in many criminal cases. Modern smartphones store chats, photos, videos, and other data that can be relevant for investigations, and with storage capacities of hundreds of gigabytes they are a primary target for forensic investigators. However, it is exactly this large amount of data that causes problems: extracting and examining the data from the multiple phones seized in the context of a case takes more and more time. This bears the risk of wasting time on irrelevant phones while too little time remains to analyze a phone that is actually worth examining. Forensic triage can help in this case: triage is a preselection step, based on a subset of the data, that is performed before fully extracting all the data from the smartphone. It can accelerate subsequent investigations and is especially useful in cases where time is essential. The aim of this paper is to determine which and how much data from an Android smartphone can be made directly accessible to the forensic investigator without tedious manual investigation. For this purpose, an app has been developed that stores almost no data on the handset itself and outputs the extracted data immediately to the forensic workstation in a human- and machine-readable format.
In order to reduce the energy consumption of homes, it is important to make transparent which devices consume how much energy. However, power consumption is often only monitored in aggregate at the house energy meter. Disaggregating this power consumption into the contributions of individual devices can be achieved using machine learning. Our work aims to make state-of-the-art disaggregation algorithms accessible to users of the open-source home automation platform Home Assistant.
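As a toy illustration of the disaggregation idea only (not of the learned models the abstract refers to), consider recovering the on/off state of a single appliance with a known, constant power draw from step changes in the aggregate meter signal. All names, values, and the threshold below are hypothetical:

```python
def disaggregate_on_off(aggregate, device_power, tol=10.0):
    """Toy event-based disaggregation: infer the contribution of one
    device with constant draw `device_power` (watts) from an aggregate
    power series, by matching rising/falling edges of that magnitude.
    Purely illustrative; real NILM approaches are learned from data."""
    state, estimate = False, []
    for prev, cur in zip(aggregate, aggregate[1:]):
        step = cur - prev
        if not state and abs(step - device_power) <= tol:
            state = True   # rising edge matching the device's draw
        elif state and abs(step + device_power) <= tol:
            state = False  # falling edge of the same magnitude
        estimate.append(device_power if state else 0.0)
    return [0.0] + estimate  # no estimate for the very first sample
```

Real household signals contain overlapping devices and variable loads, which is exactly why the abstract points to machine-learning methods rather than simple edge matching.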