Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (1699)
- Fachbereich Elektrotechnik und Informationstechnik (720)
- IfB - Institut für Bioengineering (627)
- Fachbereich Energietechnik (589)
- INB - Institut für Nano- und Biotechnologien (557)
- Fachbereich Chemie und Biotechnologie (554)
- Fachbereich Luft- und Raumfahrttechnik (498)
- Fachbereich Maschinenbau und Mechatronik (288)
- Fachbereich Wirtschaftswissenschaften (222)
- Solar-Institut Jülich (165)
Language
- English (4951)
Document Type
- Article (3273)
- Conference Proceeding (1191)
- Part of a Book (197)
- Book (146)
- Conference: Meeting Abstract (33)
- Doctoral Thesis (32)
- Patent (25)
- Other (10)
- Report (10)
- Conference Poster (6)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (12)
- Einspielen <Werkstoff> (10)
- CAD (8)
- civil engineering (8)
- Bauingenieurwesen (7)
- Blitzschutz (6)
- FEM (6)
- Gamification (6)
- Limit analysis (6)
Achieving the 17 Sustainable Development Goals (SDGs) set by the United Nations (UN) in 2015 requires global collaboration between different stakeholders. Industry, and in particular engineers who shape industrial developments, have a special role to play as they are confronted with the responsibility to holistically reflect sustainability in industrial processes. This means that, in addition to the technical specifications, engineers must also question the effects of their own actions on an ecological, economic and social level in order to ensure sustainable action and contribute to the achievement of the SDGs. However, this requires competencies that enable engineers to apply all three pillars of sustainability to their own field of activity and to understand the global impact of industrial processes. In this context, it is relevant to understand how industry already reflects sustainability and to identify competences needed for sustainable development.
There is a growing demand for more flexibility in manufacturing to counter the volatility and unpredictability of the markets and provide more individualization for customers. However, the design and implementation of flexibility within manufacturing systems are costly and only economically viable if applicable to actual demand fluctuations. To this end, companies are considering additive manufacturing (AM) to make production more flexible. This paper develops a conceptual model for the impact quantification of AM on volume and mix flexibility within production systems in the early stages of the factory-planning process. Together with the model, an application guideline is presented to help planners with the flexibility quantification and the factory design process. Following the development of the model and guideline, a case study is presented to indicate the potential impact additive technologies can have on manufacturing flexibility. Within the case study, various scenarios with different production system configurations and production programs are analyzed, and the impact of the additive technologies on volume and mix flexibility is calculated. This work will allow factory planners to determine the potential impacts of AM on manufacturing flexibility in an early planning stage and design their production systems accordingly.
A Cooperative Work Environment for Evolutionary Software Development / Kurbel, K., Pietsch, W.
(1990)
An improved and convenient ninhydrin assay for aminoacylase activity measurements was developed using the commercial EZ Nin™ reagent. Alternative reagents from the literature were also evaluated and compared. The addition of DMSO to the reagent enhanced the solubility of Ruhemann's purple (RP). Furthermore, we found that the use of a basic, aqueous buffer enhances the stability of RP. An acidic protocol for the quantification of lysine was developed by the addition of glacial acetic acid. The assay allows for parallel processing in a 96-well format with measurements in microtiter plates.
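Concentration read-out from such a plate-based assay typically reduces to a linear calibration against standards. The following sketch uses invented absorbance values and a hypothetical `lysine_concentration` helper purely for illustration; it is not the protocol from the paper.

```python
import numpy as np

# Hypothetical calibration: absorbance at 570 nm vs. lysine standards (mM).
standards_mm = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
absorbance = np.array([0.02, 0.27, 0.52, 1.01, 2.03])

# Least-squares line through the standards (Beer-Lambert regime assumed).
slope, intercept = np.polyfit(standards_mm, absorbance, 1)

def lysine_concentration(a570: float) -> float:
    """Convert a well's absorbance into lysine concentration (mM)."""
    return (a570 - intercept) / slope

print(round(lysine_concentration(0.52), 2))
```

In a real 96-well workflow the same line would simply be applied to the whole plate of absorbance readings at once.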
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
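For intuition, the classical one-sample Cramér-von-Mises statistic in one dimension (not the Hilbert-space generalization studied in the paper) can be computed directly from the order statistics; a minimal sketch, assuming a fully specified null CDF:

```python
import numpy as np

def cramer_von_mises(sample, cdf):
    """One-sample Cramér-von-Mises statistic T = n*omega^2 against a hypothesised CDF."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    i = np.arange(1, n + 1)
    # Standard computational formula using the order statistics.
    return 1.0 / (12 * n) + np.sum(((2 * i - 1) / (2 * n) - cdf(x)) ** 2)

# Under the null (data really uniform), the statistic stays small.
rng = np.random.default_rng(0)
u = rng.uniform(size=500)
stat = cramer_von_mises(u, lambda t: np.clip(t, 0.0, 1.0))
print(stat)
```

The integral-type test of the paper replaces the empirical CDF comparison with an integral over a separable Hilbert space, but the "distance between empirical and hypothesised distribution" idea is the same.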
Cyberspace is "the environment formed by physical and non-physical components to store, modify, and exchange data using computer networks" (NATO CCDCOE). Beyond that, it is an environment where people interact. IT attacks are hostile, non-cooperative interactions that can be described with conflict theory. Applying conflict theory to IT security leads to different objectives for end-user education, requiring different formats like agency-based competence developing games.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit, and, optionally, with a microarray chip would be ideally suited to probe icy environments like Europa’s ice layer as well as the polar ice layers of Earth and Mars for recent and extinct life.
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods for on-chip fluorescence imaging for integrated biosensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging. It can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relates to side illumination of the fluorescent material placed into microcompartments of the lab-on-chip. Its significance lies in the high utilization of excitation energy for low concentrations of fluorescent material. The utilization of a transparent µLED chip, for the second method, allows the placement of the excitation light sources on the same optical axis as the emission detector, such that the excitation and emission rays propagate in opposite directions. The third method presents a spatial filtering of the excitation background.
Innovative interplanetary deep space missions, like a main belt asteroid sample return mission, require ever larger velocity increments (ΔVs) and thus ever more demanding propulsion capabilities. Providing much larger exhaust velocities than chemical high-thrust systems, electric low-thrust space-propulsion systems can significantly enhance or even enable such high-energy missions. In 1995, a European-Russian Joint Study Group (JSG) presented a study report on “Advanced Interplanetary Missions Using Nuclear-Electric Propulsion” (NEP). One of the investigated reference missions was a sample return (SR) from the main belt asteroid (19) Fortuna. The envisaged nuclear power plant, Topaz-25, however, could not be realized, and the worldwide developments in space reactor hardware stalled. In this paper, we investigate whether such a mission is also feasible using a solar electric propulsion (SEP) system and compare our SEP results to corresponding NEP results.
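The advantage of high exhaust velocity can be made concrete with the Tsiolkovsky rocket equation; a minimal sketch with an assumed illustrative mission ΔV (not the Fortuna mission figures):

```python
import math

def propellant_fraction(delta_v_km_s: float, v_exhaust_km_s: float) -> float:
    """Propellant mass fraction 1 - exp(-dV/v_e) from the Tsiolkovsky rocket equation."""
    return 1.0 - math.exp(-delta_v_km_s / v_exhaust_km_s)

delta_v = 15.0  # illustrative total mission Delta-V in km/s (assumed value)
chem = propellant_fraction(delta_v, 4.4)   # typical LOX/LH2 exhaust velocity
elec = propellant_fraction(delta_v, 30.0)  # typical ion-thruster exhaust velocity
print(f"chemical: {chem:.1%} of launch mass is propellant")
print(f"electric: {elec:.1%} of launch mass is propellant")
```

For a high-energy mission, the chemical stage is almost entirely propellant, while the electric system leaves most of the launch mass for payload and power plant, which is exactly why SEP and NEP are attractive for sample-return missions.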
Finding a good system topology with more than a handful of components is a highly non-trivial task. The system needs to be able to fulfil all expected load cases, but at the same time the components should interact in an energy-efficient way. An example of a system design problem is the layout of the drinking water supply of a residential building. It may be reasonable to choose a design of spatially distributed pumps which are connected by pipes in at least two dimensions. This leads to a large variety of possible system topologies. To solve such problems in a reasonable time frame, the nonlinear technical characteristics must be modelled as simply as possible while still achieving a sufficiently good representation of reality. The aim of this paper is to compare the speed and reliability of a selection of leading mathematical programming solvers on a set of varying model formulations. This gives us empirical evidence on which combinations of model formulations and solver packages are the means of choice given the current state of the art.
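A common way to keep nonlinear component characteristics tractable for mathematical programming solvers is a piecewise-linear surrogate, as used in MILP reformulations. The sketch below uses an assumed quadratic pump curve (coefficients invented for illustration) to show how the error introduced by a coarse breakpoint grid can be checked:

```python
import numpy as np

# Illustrative quadratic pump characteristic H(Q) = a - b*Q^2 (assumed coefficients).
a, b = 40.0, 0.05            # head in m; flow in m^3/h
q = np.linspace(0.0, 20.0, 201)
h_exact = a - b * q ** 2

# Piecewise-linear surrogate on a coarse breakpoint grid.
breakpoints = np.linspace(0.0, 20.0, 5)
h_pwl = np.interp(q, breakpoints, a - b * breakpoints ** 2)

# Worst-case deviation of the surrogate from the exact curve.
max_err = np.max(np.abs(h_exact - h_pwl))
print(f"max linearisation error: {max_err:.4f} m")
```

Refining the breakpoint grid shrinks this error quadratically but grows the MILP, which is precisely the modelling trade-off the solver comparison has to navigate.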
The number of case studies focusing on hybrid-electric aircraft is steadily increasing, since these configurations are thought to lead to lower operating costs and environmental impact than traditional aircraft. However, due to the lack of reference data of actual hybrid-electric aircraft, in most cases, the design tools and results are difficult to validate. In this paper, two independently developed approaches for hybrid-electric conceptual aircraft design are compared. An existing 19-seat commuter aircraft is selected as the conventional baseline, and both design tools are used to size that aircraft. The aircraft is then re-sized under consideration of hybrid-electric propulsion technology. This is performed for parallel, serial, and fully-electric powertrain architectures. Finally, sensitivity studies are conducted to assess the validity of the basic assumptions and approaches regarding the design of hybrid-electric aircraft. Both methods are found to predict the maximum take-off mass (MTOM) of the reference aircraft with less than 4% error. The MTOM and payload-range energy efficiency of various (hybrid-) electric configurations are predicted with a maximum difference of approximately 2% and 5%, respectively. The results of this study confirm a correct formulation and implementation of the two design methods, and the data obtained can be used by researchers to benchmark and validate their design tools.
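The conceptual sizing loop behind such design tools can be sketched as a fixed-point iteration on the maximum take-off mass; the mass fractions below are assumed for illustration and are not taken from the two tools compared in the paper:

```python
def size_mtom(payload_kg: float, empty_frac: float, energy_frac: float) -> float:
    """Fixed-point sizing loop: MTOM = payload + (empty + energy fractions) * MTOM."""
    mtom = payload_kg  # initial guess
    for _ in range(200):
        new = payload_kg + (empty_frac + energy_frac) * mtom
        if abs(new - mtom) < 1e-9:
            break
        mtom = new
    return mtom

# Assumed fractions: a conventional 19-seater vs. a heavier hybrid powertrain.
conventional = size_mtom(1900.0, 0.55, 0.12)
hybrid = size_mtom(1900.0, 0.55, 0.20)
print(f"relative MTOM growth: {hybrid / conventional - 1:.1%}")
```

The snowball effect is visible here: a modest increase in the energy-system mass fraction inflates the converged MTOM disproportionately, which is why small modelling differences between two independent tools can still produce the few-percent MTOM deviations reported above.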
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview on the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strength and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications and discuss future trends and perspectives.
The readout of gamma detectors is considerably simplified when the event intensity is encoded as a pulse width (Pulse Width Modulation, PWM). Time-to-Digital-Converters (TDC) replace the conventional ADCs and multiple TDCs can be realized easily in one PLD chip (Programmable Logic Device). The output of a PWM stage is only one digital signal per channel which is well suited for transport so that further processing can be performed apart from the detector. This is particularly interesting for large systems with high channel density (e.g. high resolution scanners). In this work we present a circuit with a linear transfer function that requires a minimum of components by performing the PWM already in the preamp stage. This allows a very compact and also cost-efficient implementation of the front-end electronics.
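The PWM encode/TDC decode chain can be sketched as follows; the full-scale range and 10-bit TDC depth are assumed values for illustration, not the parameters of the presented circuit:

```python
def encode_pwm(intensity: float, full_scale: float, max_ticks: int) -> int:
    """Linear transfer function: event intensity -> pulse width in TDC clock ticks."""
    width = round(intensity / full_scale * max_ticks)
    return max(0, min(max_ticks, width))  # clamp to the TDC range

def decode_tdc(ticks: int, full_scale: float, max_ticks: int) -> float:
    """TDC readout back to intensity; quantisation is the only loss."""
    return ticks / max_ticks * full_scale

full_scale, max_ticks = 1.0, 1024   # assumed dynamic range and 10-bit TDC depth
ticks = encode_pwm(0.5, full_scale, max_ticks)
print(ticks, decode_tdc(ticks, full_scale, max_ticks))
```

Because only a single digital signal per channel leaves the front end, the decode step can run in a PLD far from the detector, which is what makes the scheme attractive for high channel densities.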
A Classical Reformulation of Finite-Dimensional Quantum Mechanics. Hellwig, K.-E.; Stulpe, W.
(1993)
The predictive control of commercial vehicle energy management systems, such as vehicle thermal management or waste heat recovery (WHR) systems, is discussed on the basis of information sources from the field of environment recognition, combined with the determination of the vehicle system condition.
In this article, a mathematical method for predicting the exhaust gas mass flow and the exhaust gas temperature is presented based on driving data of a heavy-duty vehicle. The prediction refers to the conditions of the exhaust gas at the inlet of the exhaust gas recirculation (EGR) cooler and at the outlet of the exhaust gas aftertreatment system (EAT). The heavy-duty vehicle was operated on the motorway to investigate the characteristic operational profile. In addition to the use of road gradient profile data, an evaluation of the continuously recorded distance signal, which represents the distance between the test vehicle and the road user ahead, is included in the prediction model. Using a Fourier analysis, the trajectory of the vehicle speed is determined for a defined prediction horizon.
To verify the method, a holistic simulation model consisting of several hierarchically structured submodels has been developed. A map-based submodel of a combustion engine is used to determine the EGR and EAT exhaust gas mass flows and exhaust gas temperature profiles. All simulation results are validated on the basis of the recorded vehicle and environmental data. Deviations from the predicted values are analyzed and discussed.
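The Fourier-analysis step for the speed trajectory can be sketched as follows, using a synthetic speed trace rather than the recorded vehicle data; retaining only the dominant harmonics implicitly assumes the speed pattern is roughly periodic over the prediction horizon:

```python
import numpy as np

# Synthetic 1 Hz motorway speed trace (km/h): mean speed, a slow oscillation, noise.
rng = np.random.default_rng(1)
t = np.arange(600)
speed = 85.0 + 5.0 * np.sin(2 * np.pi * t / 120.0) + rng.normal(0.0, 1.0, t.size)

# Fourier analysis: keep only the DC bin and the dominant harmonic.
spectrum = np.fft.rfft(speed)
keep = np.argsort(np.abs(spectrum))[-2:]
freqs = np.fft.rfftfreq(t.size, d=1.0)

def predict(times):
    """Evaluate the retained harmonics at (future) time stamps."""
    out = np.zeros(times.size)
    for k in keep:
        coeff = spectrum[k] / t.size
        if k == 0:
            out += coeff.real                     # mean speed
        else:
            out += 2.0 * (coeff * np.exp(2j * np.pi * freqs[k] * times)).real
    return out

horizon = np.arange(600, 660)  # 60 s prediction horizon beyond the recorded data
print(predict(horizon)[:3].round(1))
```

In the actual method, this speed trajectory would then feed the map-based engine submodel to obtain the predicted exhaust gas mass flow and temperature at the EGR cooler inlet and EAT outlet.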
A notable development in South Africa in recent years is the establishment of a number of research hubs involved in AI activities, ranging from mobile robotics and computational intelligence to knowledge representation and reasoning, and human language technologies. In this survey, we take the reader on a quick tour of the research being conducted at these hubs and touch on an initiative to maintain and extend the current level of interest in AI research in the country.
Manufacturing companies across multiple industries face an increasingly dynamic and unpredictable environment. This development can be seen on both the market and supply side. To respond to these challenges, manufacturing companies must implement smart manufacturing systems and become more flexible and agile. The flexibility in operational planning regarding the scheduling and sequencing of customer orders needs to be increased and new structures must be implemented in manufacturing systems’ fundamental design as they constitute much of the operational flexibility available. To this end, smart and more flexible solutions for production planning and control (PPC) are developed. However, scheduling or sequencing is often only considered isolated in a predefined stable environment. Moreover, their orientation on the fundamental logic of the existing IT solutions and their applicability in a dynamic environment is limited. This paper presents a conceptual model for a task-based description logic that can be applied to factory planning, technology planning, and operational control. By using service-oriented architectures, the goal is to generate smart manufacturing systems. The logic is designed to allow for easy and automated maintenance. It is compatible with the existing resource and process allocation logic across operational and strategic factory and production planning.
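A task-based description logic of this kind can be illustrated with a minimal capability-matching sketch; the task and resource names below are invented, and a real implementation would add scheduling, cost criteria, and the service-oriented communication layer:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    """A production step described by required capabilities, not by a fixed machine."""
    name: str
    required: frozenset

@dataclass(frozen=True)
class Resource:
    """A machine described by the capabilities it offers as services."""
    name: str
    capabilities: frozenset

def allocate(tasks, resources):
    """Match each task to the first resource offering every required capability."""
    plan = {}
    for task in tasks:
        for res in resources:
            if task.required <= res.capabilities:
                plan[task.name] = res.name
                break
    return plan

tasks = [Task("drill_hole", frozenset({"drilling"})),
         Task("print_bracket", frozenset({"additive", "metal"}))]
resources = [Resource("CNC-1", frozenset({"drilling", "milling"})),
             Resource("SLM-1", frozenset({"additive", "metal"}))]
print(allocate(tasks, resources))
```

Decoupling tasks from machines in this way is what lets new resources be added or removed without touching the process descriptions, supporting the automated maintenance the logic is designed for.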
A 3D finite element model of the female pelvic floor for the reconstruction of urinary incontinence
(2014)
Macroporous silicon has been etched from n-type Si, using a vertical etching cell where no rear side contact on the silicon wafer is necessary. The resulting macropores have been characterised by means of Scanning Electron Microscopy (SEM). After etching, SiO₂ was thermally grown on the top of the porous silicon as an insulating layer and Si₃N₄ was deposited by means of Low Pressure Chemical Vapour Deposition (LPCVD) as transducer material to fabricate a capacitive pH sensor. In order to prepare porous biosensors, the enzyme penicillinase has been additionally immobilised inside the porous structure. Electrochemical measurements of the pH sensor and the biosensor with an Electrolyte/Insulator/Semiconductor (EIS) structure have been performed in the Capacitance/Voltage (C/V) and Constant capacitance (ConCap) mode.
Background:
Additional stabilization of the “comma sign” in anterosuperior rotator cuff repair has been proposed to provide biomechanical benefits regarding stability of the repair.
Purpose:
This in vitro investigation aimed to investigate the influence of a comma sign–directed reconstruction technique for anterosuperior rotator cuff tears on the primary stability of the subscapularis tendon repair.
Study Design:
Controlled laboratory study.
Methods:
A total of 18 fresh-frozen cadaveric shoulders were used in this study. Anterosuperior rotator cuff tears (complete full-thickness tear of the supraspinatus and subscapularis tendons) were created, and supraspinatus repair was performed with a standard suture bridge technique. The subscapularis was repaired with either a (1) single-row or (2) comma sign technique. A high-resolution 3D camera system was used to analyze 3-mm and 5-mm gap formation at the subscapularis tendon-bone interface upon incremental cyclic loading. Moreover, the ultimate failure load of the repair was recorded. A Mann-Whitney test was used to assess significant differences between the 2 groups.
Results:
The comma sign repair withstood significantly more loading cycles than the single-row repair until 3-mm and 5-mm gap formation occurred (P ≤ .047). The ultimate failure load did not reveal any significant differences when the 2 techniques were compared (P = .596).
Conclusion:
The results of this study show that additional stabilization of the comma sign enhanced the primary stability of subscapularis tendon repair in anterosuperior rotator cuff tears. Although this stabilization did not seem to influence the ultimate failure load, it effectively decreased the micromotion at the tendon-bone interface during cyclic loading.
Clinical Relevance:
The proposed technique for stabilization of the comma sign has shown superior biomechanical properties in comparison with a single-row repair and might thus improve tendon healing. Further clinical research will be necessary to determine its influence on the functional outcome.
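As a side note on the statistics, the Mann-Whitney U statistic used in the study simply counts favourable pairs between the two groups; a minimal sketch with invented cycle counts (not the study's data):

```python
def mann_whitney_u(a, b):
    """U statistic: number of pairs (x from a, y from b) with x > y; ties count 1/2."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical cycles-to-3-mm-gap counts for two repair groups (n = 9 each).
comma_sign = [420, 460, 500, 380, 540, 470, 510, 430, 490]
single_row = [300, 340, 280, 360, 320, 310, 350, 290, 330]
u = mann_whitney_u(comma_sign, single_row)
print(u, "of", len(comma_sign) * len(single_row), "pairs favour the comma sign repair")
```

When, as here, every value in one group exceeds every value in the other, U equals the total number of pairs, which corresponds to the strongest possible rank separation between the groups.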
[{ReN(PMe2Ph)3}{ReO3N}]2 – Structural Evidence for the Nitridotrioxorhenate(VII) Anion, [ReO3N]2−
(2005)
[Skripte]
(2008)