A notable development in South Africa in recent years is the establishment of a number of research hubs involved in AI activities ranging from mobile robotics and computational intelligence to knowledge representation and reasoning and human language technologies. In this survey we take the reader on a quick tour of the research being conducted at these hubs and touch on an initiative to maintain and extend the current level of interest in AI research in the country.
The predictive control of commercial vehicle energy management systems, such as vehicle thermal management or waste heat recovery (WHR) systems, is discussed on the basis of information sources from the field of environment recognition, combined with the determination of the vehicle system condition.
In this article, a mathematical method for predicting the exhaust gas mass flow and the exhaust gas temperature is presented based on driving data of a heavy-duty vehicle. The prediction refers to the conditions of the exhaust gas at the inlet of the exhaust gas recirculation (EGR) cooler and at the outlet of the exhaust gas aftertreatment system (EAT). The heavy-duty vehicle was operated on the motorway to investigate the characteristic operational profile. In addition to the use of road gradient profile data, an evaluation of the continuously recorded distance signal, which represents the distance between the test vehicle and the road user ahead, is included in the prediction model. Using a Fourier analysis, the trajectory of the vehicle speed is determined for a defined prediction horizon.
To verify the method, a holistic simulation model consisting of several hierarchically structured submodels has been developed. A map-based submodel of a combustion engine is used to determine the EGR and EAT exhaust gas mass flows and exhaust gas temperature profiles. All simulation results are validated on the basis of the recorded vehicle and environmental data. Deviations from the predicted values are analyzed and discussed.
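The Fourier-based extrapolation of the speed trajectory mentioned above can be illustrated with a minimal sketch. This is not the paper's actual model; the function name, harmonic count, and toy speed trace are assumptions made purely for illustration. The idea is to fit the dominant low-frequency harmonics of a recorded speed trace and extend them over the prediction horizon.

```python
import cmath
import math

def fourier_predict(speed, horizon, n_harmonics=3):
    """Illustrative sketch: extrapolate a speed trace over a prediction
    horizon using its lowest Fourier harmonics (naive DFT, fine for
    short traces). Not the paper's actual prediction model."""
    n = len(speed)
    # discrete Fourier coefficients c_k = (1/n) * sum_t x[t] e^{-2*pi*i*k*t/n}
    coeffs = [
        sum(speed[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)) / n
        for k in range(n)
    ]

    def value(t):
        # mean term plus the first n_harmonics harmonics
        # (factor 2 accounts for the conjugate-symmetric coefficients)
        v = coeffs[0].real
        for k in range(1, n_harmonics + 1):
            v += 2 * (coeffs[k] * cmath.exp(2j * math.pi * k * t / n)).real
        return v

    return [value(t) for t in range(n, n + horizon)]

# toy speed trace: 80 km/h with a slow periodic oscillation
trace = [80 + 5 * math.sin(2 * math.pi * t / 50) for t in range(100)]
pred = fourier_predict(trace, horizon=20)
```

For a trace whose oscillation period divides the record length, the periodic extension reproduces the true continuation; real driving data would of course require windowing and a more careful choice of horizon.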
A Classical Reformulation of Finite-Dimensional Quantum Mechanics. Hellwig, K.-E.; Stulpe, W.
(1993)
The readout of gamma detectors is considerably simplified when the event intensity is encoded as a pulse width (Pulse Width Modulation, PWM). Time-to-Digital Converters (TDCs) replace the conventional ADCs, and multiple TDCs can easily be realized in one PLD chip (Programmable Logic Device). The output of a PWM stage is only one digital signal per channel, which is well suited for transport, so that further processing can be performed away from the detector. This is particularly interesting for large systems with high channel density (e.g. high-resolution scanners). In this work we present a circuit with a linear transfer function that requires a minimum of components by performing the PWM already in the preamp stage. This allows a very compact and cost-efficient implementation of the front-end electronics.
Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity and mice engrafted with human hepatocytes are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview on the corresponding genetically humanized and chimeric liver humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strengths and weaknesses of the two different approaches, give guidance for the selection of the appropriate model for various applications and discuss future trends and perspectives.
The number of case studies focusing on hybrid-electric aircraft is steadily increasing, since these configurations are thought to lead to lower operating costs and environmental impact than traditional aircraft. However, due to the lack of reference data of actual hybrid-electric aircraft, in most cases, the design tools and results are difficult to validate. In this paper, two independently developed approaches for hybrid-electric conceptual aircraft design are compared. An existing 19-seat commuter aircraft is selected as the conventional baseline, and both design tools are used to size that aircraft. The aircraft is then re-sized under consideration of hybrid-electric propulsion technology. This is performed for parallel, serial, and fully-electric powertrain architectures. Finally, sensitivity studies are conducted to assess the validity of the basic assumptions and approaches regarding the design of hybrid-electric aircraft. Both methods are found to predict the maximum take-off mass (MTOM) of the reference aircraft with less than 4% error. The MTOM and payload-range energy efficiency of various (hybrid-) electric configurations are predicted with a maximum difference of approximately 2% and 5%, respectively. The results of this study confirm a correct formulation and implementation of the two design methods, and the data obtained can be used by researchers to benchmark and validate their design tools.
Finding a good system topology with more than a handful of components is a highly non-trivial task. The system needs to be able to fulfil all expected load cases, but at the same time the components should interact in an energy-efficient way. An example of such a system design problem is the layout of the drinking water supply of a residential building, where it may be reasonable to choose a design of spatially distributed pumps connected by pipes in at least two dimensions. This leads to a large variety of possible system topologies. To solve such problems in a reasonable time frame, the nonlinear technical characteristics must be modelled as simply as possible, while still achieving a sufficiently good representation of reality. The aim of this paper is to compare the speed and reliability of a selection of leading mathematical programming solvers on a set of varying model formulations. This gives us empirical evidence on which combinations of model formulation and solver package are the means of choice given the current state of the art.
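The flavor of such a topology selection problem can be conveyed with a deliberately tiny sketch. The pump data, the series-connection assumption, and the brute-force enumeration are all illustrative assumptions; the paper's actual models are mixed-integer (non)linear programs handled by dedicated solvers, not enumeration.

```python
from itertools import combinations

# Toy candidate pumps (assumed numbers): delivered head [m] and power draw [kW].
pumps = {
    "P1": {"head": 20, "power": 1.5},
    "P2": {"head": 35, "power": 2.6},
    "P3": {"head": 50, "power": 4.0},
}
# Required head per load case; pumps are assumed to act in series.
load_cases = [30, 45]

# Enumerate all pump subsets and keep the cheapest one that covers every load case.
best = None
for r in range(1, len(pumps) + 1):
    for combo in combinations(pumps, r):
        head = sum(pumps[p]["head"] for p in combo)
        power = sum(pumps[p]["power"] for p in combo)
        if all(head >= lc for lc in load_cases):
            if best is None or power < best[1]:
                best = (combo, power)
```

Even this toy version shows why the full problem explodes: with spatially distributed pumps and two-dimensional pipe routing, the number of feasible topologies grows far too quickly for enumeration, which is what motivates the solver comparison.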
Innovative interplanetary deep space missions, like a main belt asteroid sample return mission, require ever larger velocity increments (ΔVs) and thus ever more demanding propulsion capabilities. Providing much larger exhaust velocities than chemical high-thrust systems, electric low-thrust space-propulsion systems can significantly enhance or even enable such high-energy missions. In 1995, a European-Russian Joint Study Group (JSG) presented a study report on “Advanced Interplanetary Missions Using Nuclear-Electric Propulsion” (NEP). One of the investigated reference missions was a sample return (SR) from the main belt asteroid (19) Fortuna. The envisaged nuclear power plant, Topaz-25, however, could not be realized, and worldwide developments in space reactor hardware stalled. In this paper, we investigate whether such a mission is also feasible using a solar electric propulsion (SEP) system and compare our SEP results to the corresponding NEP results.
A concept for a sensitive micro total analysis system for high throughput fluorescence imaging
(2006)
This paper discusses possible methods for on-chip fluorescence imaging for integrated biosensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging. It can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (µTAS). The first method relates to side illumination of the fluorescent material placed into microcompartments of the lab-on-chip. Its significance lies in the high utilization of excitation energy for low concentrations of fluorescent material. The utilization of a transparent µLED chip, in the second method, allows the placement of the excitation light source on the same optical axis as the emission detector, such that the excitation and emission rays propagate in opposite directions. The third method presents spatial filtering of the excitation background.
A melting probe equipped with an autofluorescence-based detection system combined with a light scattering unit and, optionally, with a microarray chip would be ideally suited to probe icy environments like Europa’s ice layer as well as the polar ice layers of Earth and Mars for recent and extinct life.
Cyberspace is "the environment formed by physical and non-physical components to store, modify, and exchange data using computer networks" (NATO CCDCOE). Beyond that, it is an environment where people interact. IT attacks are hostile, non-cooperative interactions that can be described with conflict theory. Applying conflict theory to IT security leads to different objectives for end-user education, requiring different formats such as agency-based, competence-developing games.
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
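The classical one-dimensional Cramér-von-Mises statistic underlying this construction can be sketched in a few lines. This is only the standard scalar case against a fully specified null distribution; the paper's Hilbert-space version, the treatment of incomplete data, and the asymptotic theory are far more general.

```python
import math
import random

def normal_cdf(x):
    """CDF of the standard normal distribution."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cramer_von_mises(sample, cdf):
    """Cramér-von-Mises statistic W^2 for H0: the sample follows `cdf`.
    Computed via the usual order-statistic formula
    W^2 = 1/(12n) + sum_i (F(x_(i)) - (2i-1)/(2n))^2."""
    n = len(sample)
    xs = sorted(sample)
    w2 = 1.0 / (12 * n)
    for i, x in enumerate(xs, start=1):
        w2 += (cdf(x) - (2 * i - 1) / (2 * n)) ** 2
    return w2

random.seed(1)
data = [random.gauss(0, 1) for _ in range(200)]
print(cramer_von_mises(data, normal_cdf))  # small under H0
```

Shifting the sample away from the null distribution inflates the statistic, which is what makes the integral-type test consistent; the Hilbert-space extension replaces the empirical CDF comparison by an integral over the space.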
An improved and convenient ninhydrin assay for aminoacylase activity measurements was developed using the commercial EZ Nin™ reagent. Alternative reagents from the literature were also evaluated and compared. The addition of DMSO to the reagent enhanced the solubility of Ruhemann's purple (RP). Furthermore, we found that the use of a basic, aqueous buffer enhances the stability of RP. An acidic protocol for the quantification of lysine was developed by addition of glacial acetic acid. The assay allows for parallel processing in a 96-well format with measurements in microtiter plates.
A Cooperative Work Environment for Evolutionary Software Development / Kurbel, K., Pietsch, W.
(1990)
There is a growing demand for more flexibility in manufacturing to counter the volatility and unpredictability of the markets and provide more individualization for customers. However, the design and implementation of flexibility within manufacturing systems are costly and only economically viable if applicable to actual demand fluctuations. To this end, companies are considering additive manufacturing (AM) to make production more flexible. This paper develops a conceptual model for the impact quantification of AM on volume and mix flexibility within production systems in the early stages of the factory-planning process. Together with the model, an application guideline is presented to help planners with the flexibility quantification and the factory design process. Following the development of the model and guideline, a case study is presented to indicate the potential impact additive technologies can have on manufacturing flexibility. Within the case study, various scenarios with different production system configurations and production programs are analyzed, and the impact of the additive technologies on volume and mix flexibility is calculated. This work will allow factory planners to determine the potential impacts of AM on manufacturing flexibility in an early planning stage and design their production systems accordingly.
Achieving the 17 Sustainable Development Goals (SDGs) set by the United Nations (UN) in 2015 requires global collaboration between different stakeholders. Industry, and in particular engineers who shape industrial developments, have a special role to play as they are confronted with the responsibility to holistically reflect sustainability in industrial processes. This means that, in addition to the technical specifications, engineers must also question the effects of their own actions on an ecological, economic and social level in order to ensure sustainable action and contribute to the achievement of the SDGs. However, this requires competencies that enable engineers to apply all three pillars of sustainability to their own field of activity and to understand the global impact of industrial processes. In this context, it is relevant to understand how industry already reflects sustainability and to identify the competencies needed for sustainable development.
Companies often build their businesses based on product information and therefore try to automate the process of information extraction (IE). Since the information source is usually heterogeneous and non-standardized, classic extract, transform, load techniques reach their limits. Hence, companies must implement the newest findings from research to tackle the challenges of process automation. They require a flexible and robust system that is extendable and ensures the optimal processing of the different document types. This paper provides a distributed microservice architecture pattern that enables the automated generation of IE pipelines. Since their optimal design is individual for each input document, the system ensures the ad-hoc generation of pipelines depending on specific document characteristics at runtime. Furthermore, it introduces the automated quality determination of each available pipeline and controls the integration of new microservices based on their impact on the business value. The introduced system enables fast prototyping of the newest approaches from research and supports companies in automating their IE processes. Based on the automated quality determination, it ensures that the generated pipelines always meet defined business requirements when they come into productive use.
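The ad-hoc, document-driven assembly of extraction pipelines can be sketched as a simple dispatch on document characteristics. The step names and routing rules below are hypothetical illustrations, not the paper's actual microservices; in the described architecture each step would be an independently deployed service selected at runtime.

```python
# Hypothetical sketch of ad-hoc pipeline assembly: step names and routing
# rules are illustrative assumptions, not the paper's actual services.
def build_pipeline(doc):
    """Assemble an ordered list of processing steps for one input document,
    based on its characteristics at runtime."""
    steps = []
    if doc["format"] == "pdf":
        steps.append("ocr")  # scanned documents need text recognition first
    # table-heavy documents get layout analysis, plain ones simple segmentation
    steps.append("layout_analysis" if doc["has_tables"] else "text_segmentation")
    steps.append("entity_extraction")  # final information extraction step
    return steps

doc = {"format": "pdf", "has_tables": True}
print(build_pipeline(doc))  # ['ocr', 'layout_analysis', 'entity_extraction']
```

In the full system, the quality of each assembled pipeline would additionally be measured automatically, so that only pipelines meeting the defined business requirements reach productive use.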
This Research Briefing, issued in July 2010, concluded that:
- Small and medium-sized enterprises (SMEs) in Europe have long called for a matching legal form valid across the EU (similar to that of the European company (SE) for large firms)
- The main benefits would be the availability of uniform Europe-wide company structures, significant cost reductions for businesses and further integration of the internal market
- Given the differing national views regarding the concrete features of the new legal form there is currently no sign of an agreement being reached at the European level in the short term; however, it is possible that progress will be made in negotiations during the year
- The key issues being discussed in depth are company formation, transnationality and employee participation rights in the new European private company (SPE).
Manufacturing process simulation enables the evaluation and improvement of autoclave mold concepts early in the design phase. To achieve a high part quality at low cycle times, the thermal behavior of the autoclave mold can be investigated by means of simulations. Most challenging for such a simulation is the generation of necessary boundary conditions. Heat-up and temperature distribution in an autoclave mold are governed by flow phenomena, tooling material and shape, position within the autoclave, and the chosen autoclave cycle. This paper identifies and summarizes the most important factors influencing mold heat-up and how they can be introduced into a thermal simulation. Thermal measurements are used to quantify the impact of the various parameters. Finally, the gained knowledge is applied to develop a semi-empirical approach for boundary condition estimation that enables a simple and fast thermal simulation of the autoclave curing process with reasonably high accuracy for tooling optimization.