To moderate effectively and professionally, you as a moderator should know the appropriate methods.
With the right methods, you can guide discussions, resolve conflicts, motivate participants, and ensure that the goals of the event are achieved. They also help you create a positive atmosphere and keep participants engaged.
In this second article of the multi-part series, you will learn the fundamental methods for conducting successful team sessions, working-group meetings, kick-offs, and other meetings.
The provision of sustainably produced hydrogen as an energy carrier and raw material is a key technology, both as a substitute for fossil fuels and as a product in the context of circular processes. In wastewater treatment, there are several ways to produce hydrogen. Several routes, possible synergies, and their drawbacks are presented.
Team sessions, working-group meetings, kick-offs, and other meetings are all held with the aim of reaching a shared work objective within a given time frame. So that achieving that objective does not depend on chance, even for more complex assignments, it is advisable to entrust a moderator with guiding the process.
This article in a multi-part series describes the mindset a moderator should have, which fundamental methods are helpful, and what particular considerations apply to online moderation.
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017-2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and observe our approach to achieve competitive performance in the GermEval 2023 Shared Task On Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for computational analysis of political discourse and the development of semantic role labeling systems.
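The attribution task described above can be illustrated with a small sketch. The prompt wording, JSON keys, and mocked model output below are our own invention for illustration, not the GermEval task definition or the paper's actual fine-tuning setup: the idea is to cast speaker attribution as an instruction-following task for a fine-tuned causal LM and parse its answer into role spans.

```python
import json

def build_prompt(sentence: str) -> str:
    """Wrap a debate sentence in an instruction asking for attribution roles.

    Hypothetical prompt format, not the one used in the study.
    """
    return (
        "Identify who said what to whom in the following sentence. "
        "Answer as JSON with keys 'speaker', 'cue', 'message'.\n"
        f"Sentence: {sentence}\nAnswer:"
    )

def parse_answer(generated: str) -> dict:
    """Parse the model's JSON answer into role spans; roles may be absent."""
    roles = json.loads(generated)
    return {k: roles.get(k) for k in ("speaker", "cue", "message")}

# A mocked model output stands in for the fine-tuned Llama 2 generation.
mock_output = '{"speaker": "Die Ministerin", "cue": "sagte", "message": "wir handeln"}'
print(parse_answer(mock_output)["speaker"])
```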
Using scenarios is vital in identifying and specifying measures for successfully transforming the energy system. Such transformations can be particularly challenging and require the support of a broader set of stakeholders. Otherwise, there will be opposition in the form of reluctance to adopt the necessary technologies. Usually, processes for considering stakeholders' perspectives are very time-consuming and costly. In particular, there are uncertainties about how to deal with modifications in the scenarios. In principle, new consulting processes will be required. In our study, we show how multi-criteria decision analysis can be used to analyze stakeholders' attitudes toward transition paths. Since stakeholders differ regarding their preferences and time horizons, we employ a multi-criteria decision analysis approach to identify which stakeholders will support or oppose a transition path. We provide a flexible template for analyzing stakeholder preferences toward transition paths. This flexibility comes from the fact that our multi-criteria decision aid-based approach does not involve intensive empirical work with stakeholders. Instead, it involves subjecting assumptions to robustness analysis, which can help identify options to influence stakeholders' attitudes toward transitions.
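A minimal weighted-sum sketch of the multi-criteria idea, with toy criteria, weights, and scores of our own (not the study's data): each stakeholder weights the criteria differently, so the same transition paths can be ranked differently by different stakeholders.

```python
CRITERIA = ("cost", "emissions", "time_horizon")

# Criterion scores per transition path, normalized to [0, 1] (higher = better).
paths = {
    "path_A": {"cost": 0.9, "emissions": 0.4, "time_horizon": 0.7},
    "path_B": {"cost": 0.3, "emissions": 0.9, "time_horizon": 0.5},
}

# Stakeholder-specific criterion weights (each set sums to 1).
stakeholders = {
    "utility":  {"cost": 0.6, "emissions": 0.2, "time_horizon": 0.2},
    "citizens": {"cost": 0.2, "emissions": 0.6, "time_horizon": 0.2},
}

def score(path: str, weights: dict) -> float:
    """Weighted-sum aggregation of one path under one stakeholder's weights."""
    return sum(weights[c] * paths[path][c] for c in CRITERIA)

def preferred_path(weights: dict) -> str:
    """The path a stakeholder would support: the one with the highest score."""
    return max(paths, key=lambda p: score(p, weights))

for name, w in stakeholders.items():
    print(name, preferred_path(w))
```

Varying the toy weights is exactly the kind of robustness analysis the abstract refers to: it shows which preference shifts would flip a stakeholder from opposition to support.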
Subglacial environments on Earth offer important analogs to Ocean World targets in our solar system. These unique microbial ecosystems remain understudied due to the challenges of access through thick glacial ice (tens to hundreds of meters). Additionally, sub-ice collections must be conducted in a clean manner to ensure sample integrity for downstream microbiological and geochemical analyses. We describe the field-based cleaning of a melt probe that was used to collect brine samples from within a glacier conduit at Blood Falls, Antarctica, for geomicrobiological studies. We used a thermoelectric melting probe called the IceMole that was designed to be minimally invasive in that the logistical requirements in support of drilling operations were small and the probe could be cleaned, even in a remote field setting, so as to minimize potential contamination. In our study, the exterior bioburden on the IceMole was reduced to levels measured in most clean rooms, and below that of the ice surrounding our sampling target. Potential microbial contaminants were identified during the cleaning process; however, very few were detected in the final englacial sample collected with the IceMole and were present in extremely low abundances (∼0.063% of 16S rRNA gene amplicon sequences). This cleaning protocol can help minimize contamination when working in remote field locations, support microbiological sampling of terrestrial subglacial environments using melting probes, and help inform planetary protection challenges for Ocean World analog mission concepts.
Muscle function is compromised by gravitational unloading in space, affecting overall musculoskeletal health. Astronauts perform daily exercise programmes to mitigate these effects, but knowing which muscles to target would optimise effectiveness. Accurate inflight assessment to inform exercise programmes is critical due to the lack of technologies suitable for spaceflight. Changes in mechanical properties indicate muscle health status and can be measured rapidly and non-invasively using novel technology. A hand-held MyotonPRO device enabled monitoring of muscle health for the first time in spaceflight (> 180 days). Greater or maintained stiffness indicated that countermeasures were effective. Tissue stiffness was preserved in the majority of muscles (neck, shoulder, back, thigh), but tibialis anterior (foot lever muscle) stiffness decreased inflight vs. preflight (p < 0.0001; mean difference 149 N/m) in all 12 crewmembers. The calf muscles showed opposing effects, with the gastrocnemius increasing in stiffness and the soleus decreasing. Selective stiffness decrements indicate a lack of preservation despite daily inflight countermeasures. This calls for more targeted exercises for lower leg muscles, which play vital roles as ankle joint stabilizers and in gait. Muscle stiffness is a digital biomarker for risk monitoring during future planetary explorations (Moon, Mars) and for healthcare management in challenging environments or clinical disorders in people on Earth, enabling effective tailored exercise programmes.
As one class of molecularly imprinted polymers (MIPs), surface imprinted polymer (SIP)-based biosensors show great potential for direct whole-bacteria detection. Micro-contact imprinting, which involves stamping the template bacteria immobilized on a substrate into a pre-polymerized polymer matrix, is the most straightforward and prominent method for obtaining SIP-based biosensors. However, the major drawbacks of the method arise from the requirement for fresh template bacteria and the often non-reproducible bacteria distribution on the stamp substrate. Herein, we developed a positive master stamp containing photolithographic mimics of the template bacteria (E. coli), enabling reproducible fabrication of biomimetic SIP-based biosensors without the need for the “real” bacteria cells. Using atomic force and scanning electron microscopy imaging, the E. coli-capturing ability of the SIP samples was tested and compared with non-imprinted polymer (NIP)-based samples and control SIP samples in which the cavity geometry does not match E. coli cells. It was revealed that the presence of the biomimetic E. coli imprints with a specifically designed geometry increases the sensor's E. coli-capturing ability by an “imprinting factor” of about 3. These findings show the importance of geometry-guided physical recognition in bacterial detection using SIP-based biosensors. In addition, this imprinting strategy was applied to interdigitated electrodes and QCM (quartz crystal microbalance) chips. The E. coli detection performance of the sensors was demonstrated with electrochemical impedance spectroscopy (EIS) and QCM with dissipation monitoring (QCM-D) measurements.
Many important properties of bacterial cellulose (BC), such as moisture absorption capacity, elasticity and tensile strength, largely depend on its structure. This paper presents a study on the effect of the drying method on BC films produced by Medusomyces gisevii using two different procedures: room temperature drying (RT, (24 ± 2 °C, humidity 65 ± 1%, dried until a constant weight was reached) and freeze-drying (FD, treated at − 75 °C for 48 h). BC was synthesized using one of two different carbon sources—either glucose or sucrose. Structural differences in the obtained BC films were evaluated using atomic force microscopy (AFM), scanning electron microscopy (SEM), and X-ray diffraction. Macroscopically, the RT samples appeared semi-transparent and smooth, whereas the FD group exhibited an opaque white color and sponge-like structure. SEM examination showed denser packing of fibrils in FD samples while RT-samples displayed smaller average fiber diameter, lower surface roughness and less porosity. AFM confirmed the SEM observations and showed that the FD material exhibited a more branched structure and a higher surface roughness. The samples cultivated in a glucose-containing nutrient medium, generally displayed a straight and ordered shape of fibrils compared to the sucrose-derived BC, characterized by a rougher and wavier structure. The BC films dried under different conditions showed distinctly different crystallinity degrees, whereas the carbon source in the culture medium was found to have a relatively small effect on the BC crystallinity.
Electrolyte-insulator-semiconductor capacitors (EISCAP) belong to field-effect sensors having an attractive transducer architecture for constructing various biochemical sensors. In this study, a capacitive model of enzyme-modified EISCAPs has been developed and the impact of the surface coverage of immobilized enzymes on its capacitance-voltage and constant-capacitance characteristics was studied theoretically and experimentally. The used multicell arrangement enables a multiplexed electrochemical characterization of up to sixteen EISCAPs. Different enzyme coverages have been achieved by means of parallel electrical connection of bare and enzyme-covered single EISCAPs in diverse combinations. As predicted by the model, with increasing the enzyme coverage, both the shift of capacitance-voltage curves and the amplitude of the constant-capacitance signal increase, resulting in an enhancement of analyte sensitivity of the EISCAP biosensor. In addition, the capability of the multicell arrangement with multi-enzyme covered EISCAPs for sequentially detecting multianalytes (penicillin and urea) utilizing the enzymes penicillinase and urease has been experimentally demonstrated and discussed.
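The parallel-connection idea can be sketched with toy numbers (ours, not the paper's fitted model): electrically connecting bare and enzyme-covered EISCAPs in parallel behaves like one sensor with fractional enzyme coverage f, because parallel capacitances simply add.

```python
def effective_capacitance(f: float, c_enzyme: float, c_bare: float) -> float:
    """Area-weighted parallel combination for enzyme coverage f in [0, 1].

    Parallel capacitors add, so a coverage fraction f of enzyme-covered
    area contributes f * c_enzyme and the bare rest (1 - f) * c_bare.
    """
    return f * c_enzyme + (1.0 - f) * c_bare

def coverage_from_connection(k: int, n: int) -> float:
    """With n equal-area single EISCAPs, k of them enzyme-covered, f = k / n."""
    return k / n

# Toy values in arbitrary capacitance units: 2 of 4 chips carry enzyme.
c = effective_capacitance(coverage_from_connection(2, 4), c_enzyme=30.0, c_bare=50.0)
print(c)
```

Sweeping f in this toy model reproduces the qualitative trend reported above: the larger the enzyme coverage, the larger the coverage-dependent contribution to the total capacitance and hence the sensor signal.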
In this work, we present a compact, bifunctional chip-based sensor setup that measures the temperature and electrical conductivity of water samples, including specimens from rivers and channels, aquaculture, and the Atlantic Ocean. For conductivity measurements, we utilize the impedance amplitude recorded via interdigitated electrode structures at a single triggering frequency. The results are well in line with data obtained using a calibrated reference instrument. The new setup holds for conductivity values spanning almost two orders of magnitude (river versus ocean water) without the need for equivalent circuit modelling. Temperature measurements were performed in four-point geometry with an on-chip platinum RTD (resistance temperature detector) in the temperature range between 2 °C and 40 °C, showing no hysteresis effects between warming and cooling cycles. Although the meander was not shielded against the liquid, the temperature calibration provided equivalent results to low conductive Milli-Q and highly conductive ocean water. The sensor is therefore suitable for inline and online monitoring purposes in recirculating aquaculture systems.
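The single-frequency readout can be illustrated as follows. The cell constant and impedance values are toy assumptions of ours (in practice the cell constant comes from calibration against a reference instrument): at a fixed trigger frequency the solution resistance dominates the impedance amplitude, so conductivity follows directly from it.

```python
def conductivity_from_impedance(z_abs_ohm: float, cell_constant_per_cm: float = 1.0) -> float:
    """sigma [S/cm] = K [1/cm] / |Z| [ohm], valid when |Z| ~ solution resistance."""
    return cell_constant_per_cm / z_abs_ohm

# River water (~500 uS/cm) vs. ocean water (~50 mS/cm): almost two orders of
# magnitude apart, mirrored directly in the measured impedance amplitude.
print(conductivity_from_impedance(2000.0))  # low-conductivity sample
print(conductivity_from_impedance(20.0))    # high-conductivity sample
```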
Methane is a valuable energy source that helps meet the growing energy demand worldwide. However, as a potent greenhouse gas, it has also gained additional attention due to its environmental impacts. The biological production of methane is performed primarily hydrogenotrophically from H2 and CO2 by methanogenic archaea. Hydrogenotrophic methanogenesis is also of great interest with respect to carbon recycling and H2 storage. The most significant carbon source, extremely rich in complex organic matter for microbial degradation and biogenic methane production, is coal. Although interest in enhanced microbial coalbed methane production is continuously increasing globally, limited knowledge exists regarding the exact origins of coalbed methane and the associated microbial communities, including hydrogenotrophic methanogens. Here, we give an overview of hydrogenotrophic methanogens in coal beds and related environments in terms of their energy production mechanisms, unique metabolic pathways, and associated ecological functions.
This paper investigates the interior transmission problem for homogeneous media via eigenvalue trajectories parameterized by the magnitude of the refractive index. In the case that the scatterer is the unit disk, we prove that there is a one-to-one correspondence between complex-valued interior transmission eigenvalue trajectories and Dirichlet eigenvalues of the Laplacian which turn out to be exactly the trajectorial limit points as the refractive index tends to infinity. For general simply-connected scatterers in two or three dimensions, a corresponding relation is still open, but further theoretical results and numerical studies indicate a similar connection.
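The limit points on the unit disk can be computed concretely: the Dirichlet eigenvalues of the Laplacian on the unit disk are the squared zeros j_{n,k}^2 of the Bessel functions J_n. The pure-Python sketch below (our own numerics, no claim about the paper's methods) finds the first zero of J_0 by bisection on its power series.

```python
def bessel_j0(x: float) -> float:
    """J_0 via its power series; each term is the previous times -(x^2)/(4k^2)."""
    total, term = 1.0, 1.0
    for k in range(1, 40):
        term *= -(x * x) / (4.0 * k * k)
        total += term
    return total

def first_j0_zero(lo: float = 2.0, hi: float = 3.0, tol: float = 1e-12) -> float:
    """Bisection for the first positive zero of J_0 (J_0(2) > 0 > J_0(3))."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bessel_j0(lo) * bessel_j0(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

j01 = first_j0_zero()
print(j01, j01**2)  # ~2.40483; its square is the smallest Dirichlet eigenvalue
```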
We conducted a scoping review for active learning in the domain of natural language processing (NLP), which we summarize in accordance with the PRISMA-ScR guidelines as follows:
Objective: Identify active learning strategies that were proposed for entity recognition and their evaluation environments (datasets, metrics, hardware, execution time).
Design: We used Scopus and ACM as our search engines. We compared the results with two literature surveys to assess the search quality. We included peer-reviewed English publications introducing or comparing active learning strategies for entity recognition.
Results: We analyzed 62 relevant papers and identified 106 active learning strategies. We grouped them into three categories: exploitation-based (60x), exploration-based (14x), and hybrid strategies (32x). We found that all studies used the F1-score as an evaluation metric. Information about hardware (6x) and execution time (13x) was only occasionally included. The 62 papers used 57 different datasets to evaluate their respective strategies. Most datasets contained newspaper articles or biomedical/medical data. Our analysis revealed that 26 out of 57 datasets are publicly accessible.
Conclusion: Numerous active learning strategies have been identified, along with significant open questions that still need to be addressed. Researchers and practitioners face difficulties when making data-driven decisions about which active learning strategy to adopt. Conducting comprehensive empirical comparisons using the evaluation environment proposed in this study could help establish best practices in the domain.
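The largest category above, exploitation-based strategies, is typified by uncertainty sampling. The toy sketch below (mock probabilities of our own, not from any surveyed system) shows the least-confidence criterion: query the unlabeled sample whose most likely label the model is least sure about.

```python
def least_confidence(probabilities: list) -> int:
    """Index of the sample with the lowest top-class probability."""
    return min(range(len(probabilities)), key=lambda i: max(probabilities[i]))

# Mock per-sample class probabilities from an entity-recognition model.
probs = [
    [0.95, 0.03, 0.02],  # confident prediction
    [0.40, 0.35, 0.25],  # uncertain prediction: queried for annotation next
    [0.80, 0.15, 0.05],
]
print(least_confidence(probs))
```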
Due to the transition to renewable energies, electricity markets need to be made fit for purpose. To enable the comparison of different energy market designs, modeling tools covering market actors and their heterogeneous behavior are needed. Agent-based models are ideally suited for this task. Such models can be used to simulate and analyze changes to market design or market mechanisms and their impact on market dynamics. In this paper, we conduct an evaluation and comparison of two actively developed open-source energy market simulation models. The two models, namely AMIRIS and ASSUME, are both designed to simulate future energy markets using an agent-based approach. The assessment encompasses modeling features and techniques and model performance, as well as a comparison of model results, which can serve as a blueprint for future comparative studies of simulation models. The main comparison dataset includes data for Germany in 2019 and simulates the Day-Ahead market and participating actors as individual agents. Both models come comparably close to the benchmark dataset, with a MAE between 5.6 and 6.4 €/MWh, while also modeling the actual dispatch realistically.
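The error metric used in the comparison is straightforward to state in code. The price series below are illustrative toy values, not AMIRIS or ASSUME output: MAE is the mean absolute deviation between simulated and observed Day-Ahead prices in EUR/MWh.

```python
def mean_absolute_error(simulated: list, observed: list) -> float:
    """Mean absolute error between two equal-length price series."""
    assert len(simulated) == len(observed)
    return sum(abs(s - o) for s, o in zip(simulated, observed)) / len(simulated)

# Toy hourly Day-Ahead prices in EUR/MWh.
observed  = [35.0, 42.0, 38.0, 50.0]
simulated = [30.0, 45.0, 40.0, 44.0]
print(mean_absolute_error(simulated, observed))
```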
In the research domain of energy informatics, the importance of open data is rising rapidly, as various new public datasets are being created and published. Unfortunately, in many cases the data is not available under a permissive license corresponding to the FAIR principles, often lacking accessibility or reusability. Furthermore, the source format often differs from the desired data format or does not meet the demands of efficient querying. To solve this on a small scale, a toolbox for ETL processes is provided to create a local energy data server with open-access data from different valuable sources in a structured format. So while the sources themselves do not fully comply with the FAIR principles, the provided unique toolbox allows for efficient processing of the data as if the FAIR principles were met. The energy data server currently includes information on power systems, weather data, network frequency data, European energy and gas data for demand and generation, and more. However, a solution to the core problem - the missing alignment with the FAIR principles - is still needed for the National Research Data Infrastructure.
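An ETL step in the spirit of such a toolbox can be sketched as follows. The table name, column names, and sample values are our own invention, not the actual toolbox schema: extract a semicolon-delimited CSV source, transform it into a uniform shape, and load it into a local SQLite store that can be queried efficiently.

```python
import csv
import io
import sqlite3

# Hypothetical raw source: a semicolon-delimited load time series.
RAW_CSV = """timestamp;load_mw
2019-01-01T00:00;42100
2019-01-01T01:00;40800
"""

def etl(raw_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform types, load into SQLite."""
    conn.execute("CREATE TABLE IF NOT EXISTS demand (ts TEXT PRIMARY KEY, load_mw REAL)")
    n = 0
    for row in csv.DictReader(io.StringIO(raw_text), delimiter=";"):
        # Transform: keep the timestamp as text, cast the load to float.
        conn.execute("INSERT OR REPLACE INTO demand VALUES (?, ?)",
                     (row["timestamp"], float(row["load_mw"])))
        n += 1
    conn.commit()
    return n

conn = sqlite3.connect(":memory:")
print(etl(RAW_CSV, conn))  # number of rows loaded
print(conn.execute("SELECT AVG(load_mw) FROM demand").fetchone()[0])
```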