Heparin is a natural polysaccharide that plays an essential role in many biological processes. Alterations in its building blocks can modify the biological roles of commercial heparin products, owing to significant changes in the conformation of the polymer chain. The structural variability of heparin makes quality control difficult with a range of analytical methods, including infrared (IR) spectroscopy. In this paper, molecular modelling of heparin disaccharide subunits was performed using quantum chemistry. The structural and spectral parameters of these disaccharides were calculated at the RHF/6-311G level. In addition, over-sulphated chondroitin sulphate disaccharide was studied as one of the most widespread contaminants of heparin. The calculated IR spectra were analyzed with respect to specific structural parameters. The IR spectroscopic fingerprint was found to be sensitive to the substitution pattern of the disaccharide subunits. Vibrational assignments of the calculated spectra were correlated with experimental IR spectral bands of native heparin. Chemometrics was used to perform multivariate analysis of the simulated spectral data.
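A minimal sketch of the kind of multivariate analysis of simulated spectra described above, assuming a matrix of absorbance values (rows: disaccharide variants, columns: wavenumbers). The band position, sulphation factors, and data are illustrative stand-ins, not the paper's computed spectra.

```python
# Hypothetical chemometric screening of simulated IR spectra via PCA.
import numpy as np

rng = np.random.default_rng(0)
wavenumbers = np.linspace(800, 1800, 500)          # fingerprint region, cm^-1
base = np.exp(-((wavenumbers - 1230) / 40) ** 2)   # S=O stretch-like band (assumed)
spectra = np.array([base * s + rng.normal(0, 0.01, wavenumbers.size)
                    for s in (0.5, 1.0, 1.5, 2.0)])  # varying sulphation degree

# PCA via SVD on mean-centered spectra, a standard multivariate step
X = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                                      # sample coordinates
explained = S ** 2 / np.sum(S ** 2)
print(f"PC1 explains {explained[0]:.0%} of spectral variance")
```

Here a single sulphation-related factor dominates, so the first principal component separates the variants; with real simulated spectra, several substitution patterns would contribute.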
Lignin is a promising renewable biopolymer being investigated worldwide as an environmentally benign substitute for fossil-based aromatic compounds, e.g., for use as an excipient with antioxidant and antimicrobial properties in drug delivery, or even as an active compound. For its successful implementation into process streams, a quick, easy, and reliable method is needed for its molecular weight determination. Here we present a method using 1H spectra from benchtop as well as conventional NMR systems, in combination with multivariate data analysis, to determine lignin’s molecular weight (Mw and Mn) and polydispersity index (PDI). A set of 36 organosolv lignin samples (from Miscanthus x giganteus, Paulownia tomentosa and Silphium perfoliatum) was used for calibration and cross validation, and 17 samples were used as an external validation set. Validation errors between 5.6% and 12.9% were achieved for all parameters on all NMR devices (43, 60, 500 and 600 MHz). Surprisingly, no significant difference in the performance of the benchtop and high-field devices was found. This facilitates the application of this method for determining lignin’s molecular weight in an industrial environment because of the low maintenance expenditure, small footprint, ruggedness, and low cost of permanent magnet benchtop NMR systems.
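The multivariate step behind such property prediction is typically partial least squares regression. The sketch below shows a compact PLS1 (NIPALS) fit relating spectra to a response such as Mw; the data are synthetic stand-ins, and the paper's actual preprocessing, sample set, and model are not reproduced here.

```python
# Hedged sketch: PLS1 (NIPALS) relating binned spectra X to a property y.
import numpy as np

def pls1_fit(X, y, n_comp):
    """Regression coefficients of a PLS1 model (data are mean-centered inside)."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)                     # weight vector
        t = Xc @ w                                 # scores
        p = Xc.T @ t / (t @ t)                     # X loadings
        q = (yc @ t) / (t @ t)                     # y loading
        Xc, yc = Xc - np.outer(t, p), yc - q * t   # deflation
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)         # b maps centered X to y

rng = np.random.default_rng(1)
X = rng.normal(size=(36, 120))                     # 36 calibration "spectra"
y = X @ rng.normal(size=120) + rng.normal(0, 0.1, 36)  # synthetic Mw-like response
b = pls1_fit(X, y, n_comp=10)
pred = (X - X.mean(0)) @ b + y.mean()
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

In practice the number of latent components would be chosen by cross validation, as the abstract describes, and validation errors reported on the held-out external set.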
An NMR standardization approach that uses the 2H integral of the deuterated solvent for quantitative multinuclear analysis of pharmaceuticals is described. As a proof of principle, the existing NMR procedure for the analysis of heparin products according to the US Pharmacopeia monograph is extended to the determination of Na+ and Cl- content in this matrix. Quantification is based on the ratio of a 23Na (35Cl) NMR integral to the 2H NMR signal of the deuterated solvent, D2O, acquired using the specific spectrometer hardware. As an alternative, the possibility of 133Cs standardization by addition of a Cs2CO3 stock solution is shown. Validation characteristics (linearity, repeatability, sensitivity) are evaluated. A holistic NMR profiling of heparin products can now also be used for the quantitative determination of inorganic compounds in a single analytical run using a single sample. In general, the new standardization methodology provides an appealing alternative for the NMR screening of inorganic and organic components in pharmaceutical products.
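The ratio-based quantification can be illustrated with a toy calculation. The calibration factor `k` below is hypothetical: in the described approach the cross-nucleus response must be established experimentally on the specific spectrometer hardware, and all numbers here are invented.

```python
# Illustrative arithmetic only: quantitation of an analyte against the 2H
# signal of the deuterated solvent. k lumps together nucleus-specific
# receptivity and hardware response (hypothetical form and values).
def conc_from_ratio(i_analyte, i_ref, c_ref, k=1.0):
    """Analyte concentration from the integral ratio to a reference signal:
    c_analyte = k * (i_analyte / i_ref) * c_ref."""
    return k * (i_analyte / i_ref) * c_ref

# Example: a 23Na integral against the D2O 2H integral with an assumed k
na_conc = conc_from_ratio(i_analyte=2.4e5, i_ref=1.2e6, c_ref=110.0, k=0.85)
```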
Although several successful applications of benchtop nuclear magnetic resonance (NMR) spectroscopy in quantitative mixture analysis exist, the possibility of calibration transfer remains mostly unexplored, especially between high- and low-field NMR. This study investigates for the first time the calibration transfer of partial least squares regressions [for the weight average molecular weight (Mw) of lignin] between a high-field (600 MHz) NMR instrument and benchtop NMR devices (43 and 60 MHz). For the transfer, piecewise direct standardization, calibration transfer based on canonical correlation analysis, and transfer via the extreme learning machine auto-encoder method are employed. Despite the immense resolution difference between high-field and low-field NMR instruments, the results demonstrate that calibration transfer from high- to low-field is feasible in the case of a physical property, namely the molecular weight, achieving validation errors close to the original calibration (down to only 1.2 times higher root mean square errors). These results introduce new perspectives for applications of benchtop NMR, in which existing calibrations from expensive high-field instruments can be transferred to cheaper benchtop instruments to reduce cost.
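Of the transfer methods named above, piecewise direct standardization (PDS) is the most easily sketched: each channel of the "master" instrument is regressed on a small window of "slave" channels, yielding a banded transform matrix. The paired spectra below are synthetic; real high-/low-field NMR transfer also needs alignment and preprocessing not shown here.

```python
# Sketch of piecewise direct standardization on synthetic paired spectra.
import numpy as np

def pds_transform(master, slave, half_window=2):
    """Banded transform F such that slave @ F approximates master."""
    n_chan = master.shape[1]
    F = np.zeros((n_chan, n_chan))
    for j in range(n_chan):
        lo, hi = max(0, j - half_window), min(n_chan, j + half_window + 1)
        coef, *_ = np.linalg.lstsq(slave[:, lo:hi], master[:, j], rcond=None)
        F[lo:hi, j] = coef
    return F

rng = np.random.default_rng(2)
master = rng.normal(size=(20, 60))                 # "high-field" spectra
blur = np.eye(60) * 0.8 + np.eye(60, k=1) * 0.2   # simulated instrument change
slave = master @ blur + rng.normal(0, 0.01, (20, 60))
F = pds_transform(master, slave)
err = np.linalg.norm(slave @ F - master) / np.linalg.norm(master)
```

After standardization, the slave spectra can be fed to the calibration built on the master instrument, which is the economizing step the abstract highlights.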
The continuing growth of scientific publications raises the question of how research processes can be digitalized and thus made more productive. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. Various information systems exist for the search process; however, the analysis of the published content is still a largely manual task. We therefore propose a text analytics system that allows a fully digitalized analysis of literature sources. We realized a prototype using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through the presentation layer of existing application systems. For this simple emulation of user input and output by software robots, no changes to the systems and their architecture are required. However, considering strategic aspects of aligning business and technology at the enterprise level, as well as the growing capabilities of RPA driven by artificial intelligence, interrelations between RPA and Enterprise Architecture (EA) become visible and pose new questions. In this paper we discuss the relationship between RPA and EA in terms of perspectives and implications. As work in progress, we focus on identifying new questions and research opportunities related to RPA and EA.
The initial idea of Robotic Process Automation (RPA) is the automation of business processes through a simple emulation of user input and output by software robots. Hence, it can be assumed that no changes to the software systems used or the existing Enterprise Architecture (EA) are required. In this short practical paper we discuss this assumption based on a real-life implementation project. We show that a successful RPA implementation might require architectural work during analysis, implementation, and migration. As a practical paper, we focus on exemplary lessons learned and new questions related to RPA and EA.
Digital twins enable the modeling and simulation of real-world entities (objects, processes or systems), resulting in improvements in the associated value chains. The emerging field of quantum computing holds tremendous promise for evolving this virtualization towards Quantum (Digital) Twins (QDT) and ultimately Quantum Twins (QT). The quantum (digital) twin concept is not a contradiction in terms, but instead describes a hybrid approach that can be implemented using the technologies available today by combining classical computing and digital twin concepts with quantum processing. This paper presents the status quo of research and practice on quantum (digital) twins. It also discusses their potential to create competitive advantage through real-time simulation of highly complex, interconnected entities that helps companies better address changes in their environment and differentiate their products and services.
We study the possibility of fabricating an arbitrary phase mask in a one-step laser-writing process inside the volume of an optical glass substrate. We derive the phase mask from a Gerchberg–Saxton-type algorithm as an array and create each individual phase shift using a refractive index modification of variable axial length. We realize the variable axial length by superimposing refractive index modifications induced by an ultra-short pulsed laser at different focusing depths. Each single modification is created by applying 1000 pulses with 15 μJ pulse energy at 100 kHz to a fixed spot of 25 μm diameter, and the focus is then shifted axially in steps of 10 μm. With several proof-of-principle examples, we show the feasibility of our method. In particular, we determine the induced refractive index change to be about Δn=1.5⋅10−3. We also determine our current limitations by calculating the overlap in the form of a scalar product, and we discuss possible future improvements.
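The core of a Gerchberg–Saxton-type design step can be sketched as below: iterate between the mask plane (where the field must be phase-only) and the far field (an FFT), imposing the target intensity each round. The grid, target, and iteration count are illustrative, and the paper's algorithm details may differ; note that each computed phase value would then be written as an index modification of axial length L via the standard relation φ = 2π·Δn·L/λ.

```python
# Minimal Gerchberg-Saxton-type phase retrieval for a phase-only mask.
import numpy as np

def gerchberg_saxton(target_amp, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))            # propagate to far field
        far = target_amp * np.exp(1j * np.angle(far))    # impose target amplitude
        phase = np.angle(np.fft.ifft2(far))              # keep phase only
    return phase

n = 64
target = np.zeros((n, n))
target[28:36, 28:36] = 1.0                               # flat-top square spot
mask = gerchberg_saxton(target)
out = np.abs(np.fft.fft2(np.exp(1j * mask)))             # resulting far field
```

A useful figure of merit is the fraction of far-field energy landing inside the target region, which the iteration drives upward from the random start.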
Reliable methods for automatic readability assessment have the potential to impact a variety of fields, ranging from machine translation to self-informed learning. Recently, large language models for the German language (such as GBERT and GPT-2-Wechsel) have become available, enabling Deep Learning based approaches that promise to further improve automatic readability assessment. In this contribution, we studied the ability of ensembles of fine-tuned GBERT and GPT-2-Wechsel models to reliably predict the readability of German sentences. We combined these models with linguistic features and investigated the dependence of prediction performance on ensemble size and composition. Mixed ensembles of GBERT and GPT-2-Wechsel performed better than ensembles of the same size consisting of only GBERT or GPT-2-Wechsel models. Our models were evaluated in the GermEval 2022 Shared Task on Text Complexity Assessment on data of German sentences. On out-of-sample data, our best ensemble achieved a root mean squared error of 0.435.
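Why ensembling helps can be shown with a toy numeric illustration: averaging the predictions of several imperfect models cancels part of their independent errors. The "members" below are random perturbations of a ground truth, not actual GBERT or GPT-2-Wechsel outputs.

```python
# Toy demonstration: ensemble averaging reduces RMSE when member errors
# are (partly) independent. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(3)
y_true = rng.uniform(1, 7, 200)                  # e.g., complexity scores
members = [y_true + rng.normal(0, 0.6, 200) for _ in range(8)]

def rmse(pred):
    return float(np.sqrt(np.mean((pred - y_true) ** 2)))

single = rmse(members[0])                        # one model alone
ensemble = rmse(np.mean(members, axis=0))        # average of 8 models
```

With perfectly independent errors the RMSE shrinks roughly by the square root of the ensemble size; the mixed GBERT/GPT-2-Wechsel ensembles in the study likely benefit from the lower error correlation between the two architectures.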
Acknowledging that a diverse workforce can be a source of innovation, the current research deals with the fine details of why diversity management is central to achieving innovation in heterogeneous research groups and how this can be effectively realized in an organization. The types of heterogeneity addressed mainly include gender, qualification, academic discipline and intercultural perspectives. The type of organization dealt with in this work is a complex association of research institutes at a technical university in Germany (RWTH Aachen University), namely a 'Cluster of Excellence', whereby several institutes of the university work collaboratively in different sub-projects. The 'Cluster of Excellence' is part of the 'Excellence Initiative' of the German federal and state governments, administered by the German Research Foundation (DFG) and the German Council of Science and Humanities, with the ultimate aim of promoting cutting-edge research. To support interdisciplinary collaboration and thus the performance of the cluster, the development of a diversity and innovation management concept is presently in the conceptual phase and is described in the frame of this paper. The 3-S-Diversity Model, composed of the three elements skills, structure and strategy, serves as a basis for the development of the concept. The proposed concept consists of six phases; the first two phases lay the groundwork by developing an understanding of the status quo regarding the forms of diversity in the Cluster of Excellence, the type of organizational structure of the member institutes and the varieties of specialist work cultures among them. The third and fourth phases build on this foundation by means of qualitative and quantitative studies.
While the third phase deals with sensitizing the management level to the close connection between diversity and innovation, the need to manage both, and finding tailor-made methods of doing so, the fourth phase shall mainly focus on the mindset of the employees in this regard. The fifth phase shall consolidate the learnings and ideas developed in the course of the first four phases into an implementable strategy. The ultimate phase shall be the implementation of this concept in the Cluster. The first three phases have been accomplished successfully and preliminary results are already available.
This study addresses a proof-of-concept experiment with a biocompatible screen-printed carbon electrode deposited onto a biocompatible and biodegradable substrate, which is made of fibroin, a protein derived from silk of the Bombyx mori silkworm. To demonstrate the sensor performance, the carbon electrode is functionalized as a glucose biosensor with the enzyme glucose oxidase and encapsulated with a silicone rubber to ensure biocompatibility of the contact wires. The carbon electrode is fabricated by means of thick-film technology, including a curing step to solidify the carbon paste. The influence of the curing temperature and curing time on the electrode morphology is analyzed via scanning electron microscopy. The electrochemical characterization of the glucose biosensor is performed by amperometric/voltammetric measurements of different glucose concentrations in phosphate buffer. Herein, systematic studies at potentials from 500 to 1200 mV applied to the carbon working electrode (vs the Ag/AgCl reference electrode) allow determination of the optimal working potential. Additionally, the influence of the curing parameters on the glucose sensitivity is examined over a time period of up to 361 days. The sensor shows a negligible cross-sensitivity toward ascorbic acid, noradrenaline, and adrenaline. The developed biocompatible biosensor is highly promising for future in vivo and epidermal applications.
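The sensitivity figure that such amperometric characterization yields comes from a linear fit of current against glucose concentration. The calibration points below are invented for illustration only, not the paper's measurements.

```python
# Toy amperometric calibration: sensitivity = slope of current vs. concentration.
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])            # glucose, mmol/L
current = np.array([0.02, 0.41, 0.83, 1.62, 2.44, 3.21])   # uA (synthetic)
slope, intercept = np.polyfit(conc, current, 1)
sensitivity = slope                                        # uA per mmol/L
```

In the study, repeating such fits for electrodes cured at different temperatures and times is what reveals the influence of the curing parameters on sensitivity.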
Retinal vessels are similar to cerebral vessels in their structure and function. Moderately low oscillation frequencies of around 0.1 Hz have been reported as the driving force for paravascular drainage in gray matter in mice and are known as the frequencies of lymphatic vessels in humans. We aimed to elucidate whether retinal vessel oscillations are altered in Alzheimer's disease (AD) at the stage of dementia or mild cognitive impairment (MCI). Seventeen patients with mild-to-moderate dementia due to AD (ADD), 23 patients with MCI due to AD, and 18 cognitively healthy controls (HC) were examined using the Dynamic Retinal Vessel Analyzer. Oscillatory temporal changes of retinal vessel diameters were evaluated using mathematical signal analysis. Especially at moderately low frequencies around 0.1 Hz, arterial oscillations in ADD and MCI significantly prevailed over HC oscillations and correlated with disease severity. The pronounced retinal arterial vasomotion at moderately low frequencies in the ADD and MCI groups would be compatible with the view of a compensatory upregulation of paravascular drainage in AD and strengthens the amyloid clearance hypothesis.
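A sketch of the kind of spectral analysis involved: estimating the dominant frequency of vessel-diameter oscillations near 0.1 Hz from a sampled time series. The signal and sampling rate are synthetic stand-ins for the actual Dynamic Retinal Vessel Analyzer recordings.

```python
# Locating a ~0.1 Hz vasomotion component in a synthetic diameter signal.
import numpy as np

fs = 25.0                                          # sampling rate, Hz (assumed)
t = np.arange(0, 120, 1 / fs)                      # 2-minute recording
diameter = (100
            + 2.0 * np.sin(2 * np.pi * 0.1 * t)    # slow vasomotion component
            + 0.5 * np.sin(2 * np.pi * 1.0 * t))   # cardiac-pulse-like component

sig = diameter - diameter.mean()
spec = np.abs(np.fft.rfft(sig)) ** 2               # power spectrum
freqs = np.fft.rfftfreq(sig.size, 1 / fs)
band = (freqs > 0.05) & (freqs < 0.15)             # moderately low frequencies
peak_freq = freqs[band][np.argmax(spec[band])]
```

Comparing such band-limited power between patient groups is the kind of quantity in which the ADD and MCI groups exceeded the healthy controls.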
Edge-based and face-based smoothed finite element methods (ES-FEM and FS-FEM, respectively) are modified versions of the finite element method that achieve more accurate results and reduced sensitivity to mesh distortion, at least for linear elements. These properties make the two methods very attractive. However, their implementation in a standard finite element code is nontrivial because it requires heavy and extensive modifications to the code architecture. In this article, we present an element-based formulation of the ES-FEM and FS-FEM methods that allows the two methods to be implemented in a standard finite element code with no modifications to its architecture. Moreover, the element-based formulation makes it easy to handle any type of element, especially in 3D models where, to the best of the authors' knowledge, only tetrahedral elements are used in the FS-FEM applications found in the literature. Shape functions for non-simplex 3D elements are proposed in order to apply FS-FEM to any standard finite element.
The mechanical behavior of the large intestine beyond the ultimate stress has never been investigated. Stretching beyond the ultimate stress may drastically impair the tissue microstructure, which consequently weakens its healthy-state functions of absorption, temporary storage, and transportation for defecation. Because its microstructure and function closely resemble those of the human organ, biaxial tensile experiments on the porcine large intestine have been performed in this study. In this paper, we report the hyperelastic characterization of the large intestine based on experiments on 102 specimens. We also report the theoretical analysis of the experimental results, including an exponential damage evolution function. The fracture energies and the threshold stresses are set as damage material parameters for the longitudinal muscular, the circumferential muscular and the submucosal collagenous layers. A biaxial tensile simulation of a linear brick element has been performed to validate the applicability of the estimated material parameters. The model successfully simulates the biomechanical response of the large intestine under physiological and non-physiological loads.
Promoting diversity and combatting discrimination in research organizations: a practitioner’s guide
(2022)
The essay is addressed to practitioners in research management and academic leadership. It describes which measures can contribute to creating an inclusive climate in research teams and to preventing and effectively dealing with discrimination. The practical recommendations consider the policy and organizational levels, as well as the individual perspective of research managers. Following a series of basic recommendations, six lessons learned are formulated, derived from the contributions to the edited collection on “Diversity and Discrimination in Research Organizations.”
Diversity management is seen as a decisive factor for ensuring the development of socially responsible innovations (Beacham and Shambaugh, 2011; Sonntag, 2014; López, 2015; Uebernickel et al., 2015). However, many diversity management approaches fail due to a one-sided consideration of diversity (Thomas and Ely, 2019) and a missing link between the prevailing organizational culture and the perception of diversity in the respective organization. Reflecting the importance of diverse perspectives, research institutions have a special responsibility to actively deal with diversity, as they are publicly funded institutions that drive socially relevant development and educate future generations of developers, leaders and decision-makers. Nevertheless, only a few studies have so far dealt with the influence of the special framework conditions of the science system on diversity management. Focusing on the interdependency of organizational culture and diversity management, especially in a university research environment, this chapter aims in a first step to provide a theoretical perspective on the framework conditions of a complex research organization in Germany in order to understand the system-specific factors influencing diversity management. In a second step, an exploratory cluster analysis is presented, investigating the perception of diversity and possible influencing factors moderating this perception in a scientific organization. Combining both steps, the results show specific mechanisms and structures of the university research environment that have an impact on diversity management and reinforce structural barriers preventing an increase in diversity. The quantitative study also points out that the management level takes on a special role-model function in the scientific system and thus has an influence on the perception of diversity. Consequently, when developing diversity management approaches in research organizations, it is necessary to consider the top-down direction of action, the special nature of organizational structures in the university research environment, and the special role of the professorial level as a role model for the scientific staff.