Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (1699)
- Fachbereich Elektrotechnik und Informationstechnik (722)
- IfB - Institut für Bioengineering (627)
- Fachbereich Energietechnik (590)
- INB - Institut für Nano- und Biotechnologien (557)
- Fachbereich Chemie und Biotechnologie (555)
- Fachbereich Luft- und Raumfahrttechnik (500)
- Fachbereich Maschinenbau und Mechatronik (289)
- Fachbereich Wirtschaftswissenschaften (224)
- Solar-Institut Jülich (165)
Language
- English (4961)
Document Type
- Article (3277)
- Conference Proceeding (1197)
- Part of a Book (197)
- Book (146)
- Conference: Meeting Abstract (34)
- Doctoral Thesis (32)
- Patent (25)
- Other (10)
- Report (10)
- Conference Poster (6)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (12)
- Einspielen <Werkstoff> (10)
- CAD (8)
- civil engineering (8)
- Bauingenieurwesen (7)
- Blitzschutz (6)
- FEM (6)
- Gamification (6)
- Limit analysis (6)
The quest for scientifically advanced and sustainable solutions is driven by growing environmental and economic issues associated with coal mining, processing, and utilization. Consequently, within the coal industry, there is a growing recognition of the potential of microbial applications in fostering innovative technologies. Microbial-based coal solubilization, coal beneficiation, and coal dust suppression are green alternatives to traditional thermochemical and leaching technologies and better meet the need for ecologically sound and economically viable choices. Surfactant-mediated approaches have emerged as powerful tools for modeling, simulation, and optimization of coal-microbial systems and continue to gain prominence in clean coal fuel production, particularly in microbiological co-processing, conversion, and beneficiation. Surfactants (surface-active agents) are amphiphilic compounds that can reduce surface tension and enhance the solubility of hydrophobic molecules. A wide range of surfactant properties can be achieved by either directly influencing microbial growth factors, stimulants, and substrates or indirectly serving as frothers, collectors, and modifiers in the processing and utilization of coal. This review highlights the significant biotechnological potential of surfactants by providing a thorough overview of their involvement in coal biodegradation, bioprocessing, and biobeneficiation, acknowledging their importance as crucial steps in coal consumption.
Instrumental analysis labs usually offer students several unconnected experiments. To give students a more coherent overview of the most common instrumental techniques, a new laboratory experiment was developed. Marketed pain-relief drugs, familiar consumer products with one to three active components, namely acetaminophen (paracetamol), acetylsalicylic acid (ASA), and caffeine, were selected. Common analytical methods were compared regarding their performance in the qualitative and quantitative analysis of unknown tablets: UV–visible (UV–vis), infrared (IR), and nuclear magnetic resonance (NMR) spectroscopies, as well as high-performance liquid chromatography (HPLC). The students successfully uncovered the composition of formulations, which were divided into three difficulty categories, and saw that, in addition to the simple mixtures handled in theoretical classes, the composition of complex drug products can also be determined. By comparing the performance of the different techniques, students deepen their understanding of the efficiency of analytical methods in the context of complex mixtures. The laboratory experiment can be adjusted to graduate level by including extra tasks such as method optimization, validation, and 2D spectroscopic techniques.
Sexism in online media comments is a pervasive challenge that often manifests subtly, complicating moderation efforts, as interpretations of what constitutes sexism can vary among individuals. We study monolingual and multilingual open-source text embeddings to reliably detect sexism and misogyny in German-language online comments from an Austrian newspaper. We observed that classifiers trained on text embeddings closely mimic the individual judgements of human annotators. Our method showed robust performance in the GermEval 2024 GerMS-Detect Subtask 1 challenge, achieving an average macro F1 score of 0.597 (4th place, as reported on Codabench). It also accurately predicted the distribution of human annotations in GerMS-Detect Subtask 2, with an average Jensen-Shannon distance of 0.301 (2nd place). The computational efficiency of our approach suggests potential for scalable applications across various languages and linguistic contexts.
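The two evaluation metrics named above, macro F1 (Subtask 1) and Jensen-Shannon distance between annotation distributions (Subtask 2), can be sketched in plain Python. The labels and distributions below are invented for illustration and are not the paper's data or pipeline.

```python
import math

def macro_f1(y_true, y_pred):
    """Per-class F1, averaged with equal weight over all classes."""
    labels = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)

def js_distance(p, q):
    """Jensen-Shannon distance (base 2): square root of the JS divergence."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    kl = lambda a, b: sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

# Subtask 1 style: hypothetical gold vs. predicted labels per comment
gold = [0, 1, 1, 0, 2, 1]
pred = [0, 1, 0, 0, 2, 1]
score = macro_f1(gold, pred)

# Subtask 2 style: observed vs. predicted annotator label distributions
jsd = js_distance([0.5, 0.3, 0.2], [0.4, 0.4, 0.2])
```

A lower Jensen-Shannon distance means the predicted distribution of annotator judgements is closer to the one actually observed, which is how the Subtask 2 ranking was scored.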
To successfully develop and introduce concrete artificial intelligence (AI) solutions in operational practice, a comprehensive process model is being tested in the WIRKsam joint project. It is based on a methodical approach that integrates human, technical and organisational aspects and involves employees in the process. The chapter focuses on the procedure for identifying the requirements of a work system implementing AI in problem-driven projects and for selecting appropriate AI methods. This means that the use case has already been narrowed down at the beginning of the project and must then be fully defined. First, the existing preliminary work is presented. Based on this, an overview of all procedural steps and methods is given. All methods are presented in detail and good-practice approaches are shown. Finally, the developed procedure is reflected upon on the basis of its application in nine companies.
Effective government services rely on accurate population numbers to allocate resources. In Colombia and globally, census enumeration is challenging in remote regions and where armed conflict is occurring. During census preparations, the Colombian National Administrative Department of Statistics conducted social cartography workshops, where community representatives estimated numbers of dwellings and people throughout their regions. We repurposed this information, combining it with remotely sensed buildings data and other geospatial data. To estimate building counts and population sizes, we developed hierarchical Bayesian models, trained using nearby full-coverage census enumerations and assessed using 10-fold cross-validation. We compared models to assess the relative contributions of community knowledge, remotely sensed buildings, and their combination to model fit. The Community model was unbiased but imprecise; the Satellite model was more precise but biased; and the Combination model was best for overall accuracy. Results reaffirmed the power of remotely sensed buildings data for population estimation and highlighted the value of incorporating local knowledge.
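The assessment scheme described above, 10-fold cross-validation against held-out enumerations, with bias and imprecision judged separately, can be sketched with a deliberately simple stand-in "model" (a mean people-per-building ratio) in place of the authors' hierarchical Bayesian models. All data below are simulated, not the Colombian census data.

```python
import random
import statistics

random.seed(0)
# Simulated enumeration areas: (building count, observed population).
# In the study these would come from census enumerations and remotely
# sensed building footprints.
areas = [(b, round(b * 3.2 + random.gauss(0, 5))) for b in range(20, 120, 5)]

def ten_fold_cv(data, k=10):
    """Hold out each fold in turn, fit on the rest, collect prediction errors."""
    idx = list(range(len(data)))
    random.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for fold in folds:
        train = [data[i] for i in idx if i not in fold]
        test = [data[i] for i in fold]
        # Stand-in model: mean people-per-building ratio from the training areas
        rate = statistics.mean(pop / b for b, pop in train)
        errors += [rate * b - pop for b, pop in test]
    bias = statistics.mean(errors)          # systematic over/underestimation
    imprecision = statistics.stdev(errors)  # spread of the residuals
    return bias, imprecision

bias, imprecision = ten_fold_cv(areas)
```

In the study's terms, a model like the Community model would score low bias but high imprecision here, while the Combination model would do well on both.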
Perennial ryegrass (Lolium perenne) is an underutilized lignocellulosic biomass with several benefits, such as high availability, renewability, and biomass yield. The grass press-juice obtained from mechanical pretreatment can be used for the bio-based production of chemicals. Lactic acid is a platform chemical that has attracted attention due to its broad range of applications. For this reason, more sustainable production of lactic acid is expected to increase. In this work, lactic acid was produced using a complex medium at the bench and bioreactor scale, and the results were compared to those obtained using an optimized press-juice medium. Bench-scale fermentations were carried out under pH control, and lactic acid production reached 21.84 ± 0.95 g/L in the complex medium and 26.61 ± 1.2 g/L in the press-juice medium. In the bioreactor, the production yield was 0.91 ± 0.07 g/g, corresponding to a 1.4-fold increase with respect to the complex medium with fructose. For comparison with the traditional ensiling process, whole grass fractions of different varieties harvested in summer and autumn were ensiled. Ensiling showed variations in lactic acid yields, with yields of up to 15.2% dry mass for the late-harvested samples, surpassing typical silage yields of 6–10% dry mass.
Purpose: Impaired paravascular drainage of β-amyloid (Aβ) has been proposed as a contributing cause of sporadic Alzheimer's disease (AD), as decreased cerebral blood vessel pulsatility, and subsequently reduced propulsion in this pathway, could lead to the accumulation and deposition of Aβ in the brain. We therefore hypothesized that impairment of pulsatility increases across the AD spectrum.
Patients and Methods: Using transcranial color-coded duplex sonography (TCCS), we measured the resistance and pulsatility indices (RI, PI) of the middle cerebral artery (MCA) in healthy controls (HC, n=14) and patients with AD dementia (ADD, n=12). In a second step, we extended the sample by adding patients with mild cognitive impairment (MCI), stratified by the presence (MCI-AD, n=8) or absence (MCI-nonAD, n=8) of biomarkers indicative of underlying AD pathology, and compared RI and PI across the groups. To control for atherosclerosis as a confounder, we measured the arteriolar-venular ratio of retinal vessels.
Results: Left and right RI (p=0.020; p=0.027) and left PI (p=0.034) differed between HC and ADD when controlled for atherosclerosis, with AUCs of 0.776, 0.763, and 0.718, respectively. The RI and PI of MCI-AD patients tended towards those of ADD, and those of MCI-nonAD patients towards HC. RIs and PIs were associated with disease severity (p=0.010, p=0.023).
Conclusion: Our results strengthen the hypothesis that impaired pulsatility could cause impaired amyloid clearance from the brain and thereby might contribute to the development of AD. However, further studies considering other factors possibly influencing amyloid clearance as well as larger sample sizes are needed.
Purpose: A precise determination of the corneal diameter is essential for the diagnosis of various ocular diseases, cataract and refractive surgery as well as for the selection and fitting of contact lenses. The aim of this study was to investigate the agreement between two automatic and one manual method for corneal diameter determination and to evaluate possible diurnal variations in corneal diameter.
Patients and Methods: The horizontal white-to-white corneal diameter of 20 volunteers was measured at three fixed times of day with three methods: the Scheimpflug method (Pentacam HR, Oculus), Placido-based topography (Keratograph 5M, Oculus) and a manual method using image analysis software at a slit lamp (BQ900, Haag-Streit).
Results: A two-factorial analysis of variance showed no significant effect of instrument (p = 0.117), time point (p = 0.506) or the instrument-time interaction (p = 0.182). Very good repeatability (intraclass correlation coefficient, ICC; quartile coefficient of dispersion, QCD) was found for all three devices. However, manual slit-lamp measurements showed a higher QCD than the automatic measurements with the Keratograph 5M and the Pentacam HR at all measurement times.
Conclusion: The manual and automated methods used in the study to determine corneal diameter showed good agreement and repeatability. No significant diurnal variations of corneal diameter were observed during the period of time studied.
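The repeatability measure used above, the quartile coefficient of dispersion, is a short computation: (Q3 - Q1) / (Q3 + Q1). The sketch below uses invented repeated readings, not the study's measurements, to show why a more scattered series gets a higher QCD.

```python
import statistics

def qcd(values):
    """Quartile coefficient of dispersion: (Q3 - Q1) / (Q3 + Q1)."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # exclusive method (default)
    return (q3 - q1) / (q3 + q1)

# Hypothetical repeated white-to-white diameter readings (mm) for one eye
manual_slitlamp = [11.8, 11.9, 12.0, 12.1, 12.2]  # more scatter
keratograph     = [12.0, 12.0, 12.0, 12.0, 12.1]  # tighter readings

more_scatter = qcd(manual_slitlamp) > qcd(keratograph)
```

Because QCD normalizes the interquartile range by its midpoint-like sum, it is a unitless dispersion measure, convenient for comparing devices that all report the same quantity in millimetres.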
Transgenic plants have the potential to produce recombinant proteins on an agricultural scale, with yields of several tons per year. The cost-effectiveness of transgenic plants increases if simple cultivation facilities such as greenhouses can be used for production. In such a setting, we expressed a novel affinity ligand based on the fluorescent protein DsRed, which we used as a carrier for the linear epitope ELDKWA from the HIV-neutralizing antibody 2F5. The DsRed-2F5-epitope (DFE) fusion protein was produced in 12 consecutive batches of transgenic tobacco (Nicotiana tabacum) plants over the course of 2 years and was purified using a combination of blanching and immobilized metal-ion affinity chromatography (IMAC). The average purity after IMAC was 57 ± 26% (n = 24) in terms of total soluble protein, but the average yield of pure DFE (12 mg kg−1) showed substantial variation (± 97 mg kg−1, n = 24) which correlated with seasonal changes. Specifically, we found that temperature peaks (>28°C) and intense illuminance (>45 klx h−1) were associated with lower DFE yields after purification, reflecting the loss of the epitope-containing C-terminus in up to 90% of the product. Whereas the weather factors were of limited use to predict product yields of individual harvests conducted for each batch (spaced by 1 week), the average batch yields were well approximated by simple linear regression models using two independent variables for prediction (illuminance and plant age). Interestingly, accumulation levels determined by fluorescence analysis were not affected by weather conditions but positively correlated with plant age, suggesting that the product was still expressed at high levels, but the extreme conditions affected its stability, albeit still preserving the fluorophore function. 
The efficient production of intact recombinant proteins in plants may therefore require adequate climate control and shading in greenhouses or even cultivation in fully controlled indoor farms.
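The simple linear regression mentioned above, with two independent variables (illuminance and plant age) predicting batch yield, can be reproduced with ordinary least squares via the normal equations. The batch data below are invented and constructed to be exactly linear so the recovered coefficients are easy to check; this is not the study's dataset.

```python
def fit_ols(X, y):
    """Ordinary least squares with intercept: solve (X^T X) beta = X^T y."""
    rows = [[1.0] + list(x) for x in X]          # prepend intercept column
    p = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Gaussian elimination with partial pivoting on the augmented system
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    beta = [0.0] * p
    for i in reversed(range(p)):
        beta[i] = (A[i][p] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta

# Hypothetical batches: (illuminance in klx·h, plant age in weeks) -> yield (mg/kg)
batches = [(40, 5), (45, 6), (50, 7), (42, 8), (48, 9), (44, 10)]
yields = [100 - 0.5 * il + 2.0 * age for il, age in batches]  # synthetic, exactly linear

intercept, b_illuminance, b_age = fit_ols(batches, yields)
```

With real batch data the fit would not be exact; the residuals would then quantify how much of the yield variation the two weather/age predictors leave unexplained.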
Chromatography is the workhorse of biopharmaceutical downstream processing because it can selectively enrich a target product while removing impurities from complex feed streams. This is achieved by exploiting differences in molecular properties, such as size, charge and hydrophobicity (alone or in different combinations). Accordingly, many parameters must be tested during process development in order to maximize product purity and recovery, including resin and ligand types, conductivity, pH, gradient profiles, and the sequence of separation operations. The number of possible experimental conditions quickly becomes unmanageable. Although the range of suitable conditions can be narrowed based on experience, the time and cost of the work remain high even when using high-throughput laboratory automation. In contrast, chromatography modeling using inexpensive, parallelized computer hardware can provide expert knowledge, predicting conditions that achieve high purity and efficient recovery. The prediction of suitable conditions in silico reduces the number of empirical tests required and provides in-depth process understanding, which is recommended by regulatory authorities. In this article, we discuss the benefits and specific challenges of chromatography modeling. We describe the experimental characterization of chromatography devices and settings prior to modeling, such as the determination of column porosity. We also consider the challenges that must be overcome when models are set up and calibrated, including the cross-validation and verification of data-driven and hybrid (combined data-driven and mechanistic) models. This review will therefore support researchers intending to establish a chromatography modeling workflow in their laboratory.
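One of the pre-modeling characterization steps named above is determining column porosity. A common approach is to inject a non-retained tracer and convert its retention volume into a porosity via the textbook relation eps = (t_R · F - V_system) / V_column; the sketch below applies that relation with hypothetical numbers (column dimensions, flow rate, retention time and extra-column volume are all invented).

```python
import math

def column_porosity(t_r_min, flow_ml_min, diameter_cm, length_cm, v_system_ml=0.0):
    """Total porosity from a non-retained tracer pulse:
    eps = (t_R * F - V_system) / V_column."""
    v_column = math.pi * (diameter_cm / 2) ** 2 * length_cm  # geometric volume, mL
    return (t_r_min * flow_ml_min - v_system_ml) / v_column

# Hypothetical tracer pulse on a 1 cm (i.d.) x 10 cm column at 1 mL/min,
# eluting after 6.0 min, with 0.5 mL of extra-column (system) volume
eps_total = column_porosity(t_r_min=6.0, flow_ml_min=1.0,
                            diameter_cm=1.0, length_cm=10.0, v_system_ml=0.5)
```

Whether the result is the total or only the interstitial porosity depends on the tracer: a small molecule that enters the resin pores probes the total porosity, while a large, excluded tracer probes only the interstitial volume.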
Proteins are important ingredients in food and feed, they are the active components of many pharmaceutical products, and they are necessary, in the form of enzymes, for the success of many technical processes. However, production can be challenging, especially when using heterologous host cells such as bacteria to express and assemble recombinant mammalian proteins. The manufacturability of proteins can be hindered by low solubility, a tendency to aggregate, or inefficient purification. Tools such as in silico protein engineering and models that predict separation criteria can overcome these issues but usually require the complex shape and surface properties of proteins to be represented by a small number of quantitative numeric values known as descriptors, as similarly used to capture the features of small molecules. Here, we review the current status of protein descriptors, especially for application in quantitative structure activity relationship (QSAR) models. First, we describe the complexity of proteins and the properties that descriptors must accommodate. Then we introduce descriptors of shape and surface properties that quantify the global and local features of proteins. Finally, we highlight the current limitations of protein descriptors and propose strategies for the derivation of novel protein descriptors that are more informative.
The book covers various numerical field simulation methods, nonlinear circuit technology and its MF-S- and X-parameters, as well as state-of-the-art power amplifier techniques. It also describes newly presented oscillators and the emerging field of GHz plasma technology. Furthermore, it addresses topics such as waveguides, mixers, phase-locked loops, antennas, and propagation effects; together with the bachelor's-level book 'High-Frequency Engineering', it encompasses all aspects of the current state of GHz technology.
Self-metathesis of oleochemicals yields a variety of bifunctional compounds that can be used as monomers for polymer production. Many precursors are available at large scale, such as oleic acid esters (biodiesel), oleyl alcohol (surfactants) and oleyl amines (surfactants, lubricants). We show several ways to produce, separate and purify C18-α,ω-bifunctional compounds using Grubbs second-generation catalysts, starting from technical-grade educts.
The research group focuses on characteristics of the land- and cityscapes of the Drielanden-zone that contribute to generating common identities, as well as on features that trigger the differences and specificities of the adjacent countries and enrich the perception of the zone. In this research, the instruments of cartography and land surveying serve to detect and localize the fragmented appearance of relevant historic elements. These analytic procedures help to develop strategies for infrastructures and processes that gradually initiate local forms of cross-border tourism. The architectural research shows how top-down and bottom-up interventions can be combined in order to guarantee a sustainable use and development of the area under consideration.
In many instances, freight vehicles exchange loads or information with plants that are, or will soon be, Industry 4.0 plants. The Wagon4.0 concept, developed in close cooperation with, e.g., port and mine operations, maximizes railway operational efficiency while already providing strong business cases in the respective plant interactions. The Wagon4.0 consists of the main components power supply, data network, sensors, actuators and an operating system, the so-called WagonOS. The WagonOS is implemented in a granular, self-sufficient manner to allow basic features such as WiFi mesh and train christening in remote areas without network connection. Furthermore, the granularity of the operating system allows the familiar app concept to be extended to freight rail rolling stock, making it possible to use specialised actuators for certain applications, e.g. an electrical parking brake or an auxiliary drive. To facilitate migration to the Wagon4.0 for existing fleets, a migration concept featuring five levels of technical adaptation was developed. The present paper investigates the benefits of Wagon4.0 implementations for the particular challenges of heavy-haul operations by focusing on train christening, ep-assisted braking, autonomous last-mile and traction-boost operation as well as improved maintenance schedules.
In the introduction to their book "What is philosophy?" Gilles Deleuze and Felix Guattari deplore the inflationary and trivialised use of the term concept: "Finally, the most shameful moment came when computer science, marketing, design and advertising, all the disciplines of communication, seized hold of the word concept itself and said: 'This is our concern, we are the creative ones, we are the ideas men! We are the friends of the concept, we put it in our computers.'" This doctoral thesis shares the concern of Gilles Deleuze and Felix Guattari, but it is nonetheless a thesis in architecture and thus situated within the field of the "ideas men". It engages in architectural design theory and refers in particular to the investigation of methodological approaches within the design process. The thesis will therefore not contribute to the philosophical dimension of the term, but intends to overcome its imprecise use within the architectural discourse, in compliance with Eugène Viollet-le-Duc's admonition about vague definitions: "Dans les arts, et dans l'architecture en particulier, les définitions vagues ont causé bien des erreurs, ont laissé germer bien des préjugés, enraciner bien des idées fausses. On met un mot en avant, chacun y attache un sens différent." [In the arts, and in architecture in particular, vague definitions have caused many errors, allowed many prejudices to germinate and false ideas to take root. A word is put forward, and everyone attaches a different meaning to it.] The term concept in architecture is very often used as pure marketing collateral; it serves to sell an idea, a product, a design. Its functional applicability is reduced to a special manner of illustration, produced as one of the various design presentation documents at the end of the design process. In contrast, the original contribution of this thesis aims to give a precise, instrumental dimension to the term concept: the concept is the expression of a specific logic, capable of guiding the decisional sequences of the process and thus of improving the quality of the designed projects.
The motivation to define a specific instrumentality of the concept is closely connected to the issue of interdisciplinarity in the architect's profession. The interdisciplinary character of the architectural field is widely accepted and discussed as such, but the thesis intends to give a more precise definition of the various kinds of competences involved by classifying them into either an internal or an external group. The traditional notion of interdisciplinarity, predominantly seen as collaboration between architects and technical experts, and, most notably, the historical, sometimes contentious, relationship between architects and engineers is described. Referring to recent developments, the transformation of the architect's role within the professional sphere, marked by the increasing importance of diverse influences and linked to a growing risk of marginalisation, is illustrated. The thesis describes different ways to adapt to this specific kind of interdisciplinarity, which generally requires the architect's ability to connect and integrate various contents, different points of view and diverse scales. On the other hand, the great potential implicit in the interdisciplinary field is presented: architects can inform their core competence, design, by extracting contents from different disciplinary competences, whether or not these pertain to their own professional field. They can cross fields of external competences selectively and thereby build up a corpus of knowledge capable of generating and communicating guidelines and systematic methodologies for their designs. Finally, the analysis of these two aspects allows the definition of a more specific professional profile of the architect as a specialist in interdisciplinarity. The thesis is concerned with theories of the design process.
The design process is seen as open to inspection and critical evaluation, with a major focus on the decisional sequences that characterise it. The thesis concentrates on the process's descriptiveness and on the degree of self-conscious approaches applied within it. The importance of regulative, strategic mechanisms is illustrated by testimonies taken from a series of design researches and leads to a functional definition of the figure of the concept as a representation of a coherent set of ideas, as the generator of a project-specific system of rules and as a communicator of decisional strategies. The concept's function is furthermore defined as a communicative interface that generates and transmits the system of rules authoritative for all the disciplinary competences involved in the design process, a communicative interface that constitutes a basis of shared convictions capable of increasing the efficiency of collaboration. Furthermore, the concept's capacity to explore and elaborate the contents of external disciplines is identified as a possible methodological approach to innovative design thinking. The approach to a specific functional definition of the concept is continued by the description of a series of instruments that simultaneously generate and communicate it. It is outlined to what degree the concept itself is already the result of an ideational process, located within the initial phase of the design proceedings, serving as a guideline to them, yet continuously evolving and adapting as they progress. In addition, it is illustrated how all the diverse instruments of the concept are operational media through which the transfer of knowledge between different disciplines can occur. The considerations about the concept as an operational instrument of design are elaborated with regard to a number of examples of didactic applications that are particularly involved in the development and teaching of specific design methods.
These examples illustrate the interrelations between design theory and design education. They are derived from very different schools of architecture and diverse mindsets, but all of them transmit models of conceptual design thinking.
Concept - this is a key term in architectural discourse. However, all too often it is used imprecisely or merely for marketing purposes. What is a concept actually? This publication moves between design theory and design practice and follows the history of the definition of concept in architecture, leading to the formulation of a specifically instrumental and operative definition. It bases concept in architecture on its strategic potential in design decision-making processes. In the changing profession of the designing architect, decisions are increasingly made in multidisciplinary groups. Concept can serve as a dialogic instrument in the process, making it possible to process heterogeneous information from a range of spheres of knowledge. The effective presentation of selected information becomes a relevant interface in the design process, which has a significant influence on the quality of the design.