Refine
Year of publication
- 2024 (48)
- 2023 (92)
- 2022 (136)
- 2021 (136)
- 2020 (169)
- 2019 (196)
- 2018 (169)
- 2017 (154)
- 2016 (157)
- 2015 (162)
- 2014 (160)
- 2013 (171)
- 2012 (162)
- 2011 (183)
- 2010 (181)
- 2009 (179)
- 2008 (150)
- 2007 (137)
- 2006 (129)
- 2005 (122)
- 2004 (150)
- 2003 (95)
- 2002 (123)
- 2001 (103)
- 2000 (102)
- 1999 (109)
- 1998 (98)
- 1997 (96)
- 1996 (81)
- 1995 (78)
- 1994 (87)
- 1993 (59)
- 1992 (54)
- 1991 (29)
- 1990 (39)
- 1989 (44)
- 1988 (56)
- 1987 (32)
- 1986 (19)
- 1985 (33)
- 1984 (22)
- 1983 (20)
- 1982 (29)
- 1981 (20)
- 1980 (36)
- 1979 (24)
- 1978 (34)
- 1977 (14)
- 1976 (13)
- 1975 (12)
- 1974 (3)
- 1973 (2)
- 1972 (2)
- 1971 (1)
- 1968 (1)
Institute
- Fachbereich Medizintechnik und Technomathematik (1571)
- Fachbereich Elektrotechnik und Informationstechnik (712)
- IfB - Institut für Bioengineering (566)
- Fachbereich Energietechnik (562)
- Fachbereich Chemie und Biotechnologie (539)
- INB - Institut für Nano- und Biotechnologien (533)
- Fachbereich Luft- und Raumfahrttechnik (481)
- Fachbereich Maschinenbau und Mechatronik (267)
- Fachbereich Wirtschaftswissenschaften (207)
- Solar-Institut Jülich (161)
Has Fulltext
- no (4713)
Language
- English (4713)
Document Type
- Article (3205)
- Conference Proceeding (1039)
- Part of a Book (195)
- Book (146)
- Doctoral Thesis (32)
- Conference: Meeting Abstract (29)
- Patent (25)
- Other (10)
- Report (10)
- Conference Poster (5)
Keywords
- Gamification (6)
- avalanche (6)
- Earthquake (5)
- Enterprise Architecture (5)
- MINLP (5)
- solar sail (5)
- Additive manufacturing (4)
- Diversity Management (4)
- Energy storage (4)
- Engineering optimization (4)
This book is based on a multimedia course for biological and chemical engineers, which is designed to trigger students' curiosity and initiative. A solid basic knowledge of thermodynamics and kinetics is necessary for understanding many technical, chemical, and biological processes.
The one-semester basic lecture course was divided into 12 workshops (chapters). Each chapter covers a practically relevant area of physical chemistry and contains the following didactic elements that make this book particularly exciting and understandable:
- Links to Videos at the start of each chapter as preparation for the workshop
- Key terms (in bold) for further research of your own
- Comprehension questions and calculation exercises with solutions as learning checks
- Key illustrations as simple, easy-to-replicate blackboard pictures
Humorous cartoons for each workshop (by Faelis) additionally lighten up the text and serve as mnemonic aids that facilitate the learning process. To round out the book, the appendix includes a summary of the most popular experiments in basic physical chemistry courses, as well as suggestions for designing workshops with exhibits, experiments, and "questions of the day."
Suitable for students minoring in chemistry; chemistry majors are sure to find this slimmed-down, didactically valuable book helpful as well. The book is excellent for self-study.
After a liver tumor intervention, the medical doctor has to compare pre- and postoperative CT acquisitions to ensure that all cancerous cells have been destroyed. A correct assessment of the intervention is of vital importance, since it reduces the probability of tumor recurrence. Some methods have been proposed to support medical doctors during the assessment process; however, all of them focus on secondary tumors. In this paper, a tool is presented that enables outcome validation for both primary and secondary tumors. To this end, a multiphase registration (preoperative arterial and portal phases) is followed by a registration between the pre- and postoperative CT images. The first registration handles the primary tumors, which are only visible in the arterial phase; the secondary tumors are incorporated in the second registration step. Finally, the part of the tumor that was not covered by the necrosis is quantified and visualized. The method has been tested on 9 patients, with an average registration error of 1.41 mm.
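The two-step intensity-based alignment described above can be illustrated with a toy 1D analogue: exhaustive search for the translation that best aligns a "moving" profile to a "reference" profile. This is a minimal sketch for illustration only; the paper's actual method registers 3D CT volumes, and the function names here are hypothetical:

```python
def correlation_score(ref, mov, shift):
    """Sum of products of overlapping samples when `mov` is shifted right by `shift`."""
    return sum(ref[i] * mov[i - shift]
               for i in range(len(ref))
               if 0 <= i - shift < len(mov))

def best_shift(ref, mov, max_shift=3):
    """Exhaustive search for the translation that best aligns `mov` to `ref`."""
    return max(range(-max_shift, max_shift + 1),
               key=lambda s: correlation_score(ref, mov, s))

# A reference intensity profile and a copy translated by one sample,
# standing in for a pair of images to be registered.
reference = [0, 0, 1, 2, 3, 0, 0]
moving    = [0, 1, 2, 3, 0, 0, 0]
shift = best_shift(reference, moving)  # the moving profile aligns after shifting right by 1
```

Real registration pipelines replace the brute-force search with an optimizer over rigid or deformable transforms and a similarity metric such as mutual information, but the structure (transform, metric, search) is the same.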
Quaternary events at the Horn of Africa / Voigt, B., B. Gabriel, B. Lassonczyk and Mumin M. Ghod
(1990)
False spectra formation in the differential two-channel scheme of the laser Doppler flowmeter
(2018)
Noise in the differential two-channel scheme of a classic laser Doppler flowmetry (LDF) instrument was studied. The formation of false spectral components in the output signal, caused by the beating of electrical signals in the differential amplifier, was identified. An improved block diagram of the flowmeter was developed that reduces this noise.
The replacement of existing spillway crests or gates with labyrinth weirs is a proven techno-economic means of increasing discharge capacity when rehabilitating existing structures. However, additional information is needed regarding the energy dissipation of such weirs: the folded weir crest generates a three-dimensional flow field, yielding more complex overflow and energy dissipation processes. In this study, CFD simulations of labyrinth weirs were conducted 1) to analyze the discharge coefficients for different discharges and compare the Cd values to literature data and 2) to analyze and improve energy dissipation downstream of the structure. All tests were performed for a structure at laboratory scale with a height of approx. P = 30.5 cm, a ratio of total crest length to total width of 4.7, a sidewall angle of 10° and a quarter-round weir crest shape. Tested headwater ratios were 0.089 ≤ HT/P ≤ 0.817. For the numerical simulations, FLOW-3D Hydro was employed, solving the RANS equations with the finite-volume method and the RNG k-ε turbulence closure. In terms of discharge capacity, the results were compared to data from physical model tests performed at the Utah Water Research Laboratory (Utah State University), showing higher discharge coefficients from CFD than from the physical model. For upstream heads, a discrepancy in the range of ±1 cm between literature, CFD and physical model tests was identified; the differences are discussed in the manuscript. For downstream energy dissipation, variable tailwater depths were considered to analyze the formation and sweep-out of a hydraulic jump. It was found that even for high discharges, relatively low downstream Froude numbers were obtained due to the high energy dissipation induced by the three-dimensional flow between the sidewalls. The effects of additional energy dissipation devices, e.g. baffle blocks or end sills, were also analyzed. End sills were found to be ineffective; baffle blocks at different locations, however, may improve energy dissipation downstream of labyrinth weirs.
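The discharge coefficient analysis mentioned above is commonly based on the standard head-discharge relation for labyrinth weirs, Q = (2/3)·Cd·√(2g)·Lc·HT^1.5, with Lc the total crest length and HT the total upstream head. A minimal sketch; the numbers below are illustrative, not values from the study:

```python
import math

def discharge_coefficient(Q, Lc, HT, g=9.81):
    """Cd from the labyrinth-weir head-discharge equation
    Q = (2/3) * Cd * sqrt(2g) * Lc * HT**1.5 (SI units)."""
    return Q / ((2.0 / 3.0) * math.sqrt(2.0 * g) * Lc * HT ** 1.5)

# Illustrative example: 0.1 m^3/s over a 3.0 m total crest length
# at 0.1 m total upstream head.
Cd = discharge_coefficient(Q=0.1, Lc=3.0, HT=0.1)
```

Plotting Cd against the headwater ratio HT/P reproduces the characteristic drop of labyrinth-weir efficiency at high heads reported in the literature.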
Future evolution of risk management for structures : Advancement for the future IEC 62305-2 Ed3
(2011)
Residential and commercial buildings account for more than one-third of global energy-related greenhouse gas emissions. Integrated multi-energy systems at the district level are a promising way to reduce greenhouse gas emissions by exploiting economies of scale and synergies between energy sources. Planning district energy systems comes with many challenges in an ever-changing environment. Computational modelling has established itself as the state-of-the-art method for district energy system planning. Unfortunately, it is still cumbersome to combine standalone models to generate insights that surpass their original purpose. Ideally, planning processes could be solved by using modular tools that easily incorporate the variety of competing and complementing computational models. Our contribution is a vision for a collaborative development and application platform for multi-energy system planning tools at the district level. We present challenges of district energy system planning identified in the literature and evaluate whether this platform can help to overcome these challenges. Further, we propose a toolkit that represents the core technical elements of the platform. Lastly, we discuss community management and its relevance for the success of projects with collaboration and knowledge sharing at their core.
Objectives
The aim of this study was to identify characteristics of phosphorus (³¹P) spectra of the human prostate and to investigate changes of individual phospholipid metabolites in prostate cancer through in vivo ³¹P magnetic resonance spectroscopic imaging (MRSI) at 7 T.
Materials and Methods
In this institutional review board–approved study, 15 patients with biopsy-proven prostate cancer underwent T₂-weighted magnetic resonance imaging and 3-dimensional ³¹P MRSI at 7 T. Voxels were selected at the tumor location, in normal-appearing peripheral zone tissue, normal-appearing transition zone tissue, and in the base of the prostate close to the seminal vesicles. Phosphorus metabolite ratios were determined and compared between tissue types.
Results
Signals of phosphoethanolamine (PE) and phosphocholine (PC) were present and well resolved in most ³¹P spectra in the prostate. Glycerophosphocholine signals were observable in 43% of the voxels in malignant tissue, but in only 10% of the voxels in normal-appearing tissue away from the seminal vesicles. In many spectra, independent of tissue type, 2 peaks resonated in the chemical shift range of inorganic phosphate, possibly representing 2 separate pH compartments. The PC/PE ratio in the seminal vesicles was highly elevated compared with the prostate in 5 patients. A considerable overlap of ³¹P metabolite ratios was found between prostate cancer and normal-appearing prostate tissue, preventing direct discrimination of these tissues. The only 2 patients with high Gleason score tumors (≥4+5) presented with high PC and glycerophosphocholine levels in their cancer lesions.
Conclusions
Phosphorus MRSI at 7 T shows distinct features of phospholipid metabolites in the prostate gland and its surrounding structures. In this exploratory study, no differences in ³¹P metabolite ratios were observed between prostate cancer and normal-appearing prostate tissue possibly because of the partial volume effects of small tumor foci in large MRSI voxels.
Purpose
To assess the feasibility of prostate ¹H MR spectroscopic imaging (MRSI) using low-power spectral-spatial (SPSP) pulses at 7T, exploiting accurate spectral selection and spatial selectivity simultaneously.
Methods
A double spin-echo sequence was equipped with SPSP refocusing pulses with a spectral selectivity of 1 ppm. Three-dimensional prostate ¹H-MRSI at 7T was performed with the SPSP-MRSI sequence using an 8-channel transmit array coil and an endorectal receive coil in three patients with prostate cancer and in one healthy subject. No additional water or lipid suppression pulses were used.
Results
Prostate ¹H-MRSI could be obtained well within specific absorption rate (SAR) limits in a clinically feasible time (10 min). Next to the common citrate signals, the prostate spectra exhibited high spermine signals concealing creatine and sometimes also choline. Residual lipid signals were observed at the edges of the prostate because of limitations in spectral and spatial selectivity.
Conclusion
It is possible to perform prostate ¹H-MRSI at 7T with a SPSP-MRSI sequence while using separate transmit and receive coils. This low-SAR MRSI concept provides the opportunity to increase spatial resolution of MRSI within reasonable scan times.
Nobody ever dies! / 1. ed.
(2000)
Therefore Fermat is right
(2014)
It was Fermat's idea to investigate how many numbers would fulfill the equation according to the Pythagorean theorem if the exponent were raised arbitrarily, e.g. to a³ + b³ = c³. His question thus became: are there two whole numbers whose cubes add up to the cube of a third whole number? He posed the same question, of course, for all kinds of higher exponents, so that the equation could be generalized: is there an integral solution to the equation aⁿ + bⁿ = cⁿ if the exponent n is higher than 2? Although in 1993 the English mathematician Andrew Wiles was able to produce an arithmetical proof of Fermat's famous theorem, I will show that there is a simple logical explanation, which is also pragmatic and plausible, and which results from a fundamentally alternative idea of how our world seems to be constructed.
In genetics textbooks it can still be read today that our genetic code is called "degenerate" because it is still believed that 4³ = 64 triplets encode the 20 essential amino acids. Instead, we have to assume the inverse law, which means that 3⁴ = 81 exact code positions are really effective for our genetic code and encode the amino acids that are compiled into proteins. This discovery leads to two completely new, boundary-crossing results: 1) 81 genetic code positions is exactly the same number as there are stable, naturally occurring chemical elements in our universe. This argument should now lead to some alternative as well as new fundamental conclusions about our existence. 2) A genetic code positioning system shows that nature is much smarter than expected: mutations are less dangerous than believed, because they are no longer easily able to cause severe damage in protein synthesis. This should also lead to some alternative views on the evolution of life.
Our world is well ordered in measurement and number : or why natural constants are as they are
(2013)
All the important natural constants can be logically explained with, and derived from, the first four ordinal numbers 1, 2, 3 and 4, their sum of ten, and finally the standard values for obviously maximal feasibility Ω and the optimum in our world, the Golden Section (GS), i.e. the number sequences 273 and 618. Both are the first three digits of irrational results of an arithmetical transformation of simple geometrical relationships that creates multiplicity out of singularity. Both show that the infinite is inherent in the finite and explain in a simple way the smallest deviations and fluctuations between the physical as-is state and the obvious spiritual ideal behind it: wherever we look in this world, and especially in important key positions, we regularly find these sequences. All of the above-mentioned numbers thus seem to be key players in our world, which can be demonstrated by the derivation of natural constants.
Developing a new production host from a blueprint: Bacillus pumilus as an industrial enzyme producer
(2014)
KNX is a protocol for smart building automation, e.g., for automated heating, air conditioning, or lighting. This paper analyses and evaluates state-of-the-art KNX devices from the manufacturers Merten, Gira and Siemens with respect to security. On the one hand, it is investigated whether publicly known vulnerabilities, such as insecure storage of passwords in software, unencrypted communication, or denial-of-service attacks, can be reproduced in new devices. On the other hand, the security is analyzed in general, leading to the discovery of a previously unknown, high-risk vulnerability related to so-called BCU (authentication) keys.
For wide acceptance of E-Mobility, a well-developed charging infrastructure is needed. Conductive charging stations, today's state of the art, are of limited suitability for urbanised areas, since they noticeably alter the townscape. Furthermore, they may be damaged by vandalism. Besides these urbanistic reasons, inductive charging stations are a much more comfortable alternative, especially in urbanised areas. The use of conductive charging stations requires more or less bulky charging cables, and handling these standardised cables, especially in poor weather conditions, can cause inconvenience such as dirty clothing. Wireless charging requires no visible, vandalism-prone charging posts and no wired connection between charging station and vehicle, which enables placement below the surface of parking spaces or other points of interest. Inductive charging thus seems to be the optimal alternative for E-Mobility, as a high power transfer can be realised with manageable technical and financial effort. For a well-accepted, working public charging infrastructure in urbanised areas, it is essential that the infrastructure fits the vehicles' needs. Hence, well-adjusted standardisation of the charging infrastructure is essential; this is carried out by several IEC (International Electrotechnical Commission) and national standardisation committees. To ensure an optimised technical solution for future inductive charging infrastructures, several field tests have been carried out, and more are planned for the near future.
This paper introduces a Competence Developing Game (CDG) for cybersecurity awareness training in businesses. The target audience is discussed in detail to understand its requirements, and it is explained why and how a mix of business simulation and serious game meets these stakeholder requirements. It is shown that a tablet- and touchscreen-based approach is the most suitable solution. In addition, an empirical study is briefly presented that examined how an interaction system for a 3D tablet-based CDG has to be designed to be manageable for employees without gaming experience. Furthermore, it is explained which serious content is necessary for a cybersecurity awareness training CDG and how this content is wrapped into the game.
There are different types of games that try to harness the motivation of a gaming situation in learning contexts. This paper introduces the new term 'Competence Developing Game' (CDG) as an umbrella term for all games with this intention. Based on this terminology, an assessment framework has been developed and validated in the scope of an empirical study. Now, all types of CDGs can be evaluated according to a defined and uniform set of assessment criteria and are thus comparable with respect to their characteristics and effectiveness.
Training end users to interact with digital systems is indispensable for strong computer security. 'Competence Developing Game'-based approaches are particularly suitable for this purpose because of their motivation and simulation aspects. In this paper, the Competence Developing Game 'GHOST' for cybersecurity awareness trainings and its underlying patterns are described. Accordingly, requirements for a 'Competence Developing Game'-based training are discussed, and it is shown how a game can fulfill these requirements. A supplementary game interaction design and a corresponding evaluation study are presented. The combination of training requirements and interaction design is used to create a 'Competence Developing Game'-based training concept. Part of this concept is implemented in a playable prototype that provides around one hour of play and training time. This prototype is used to evaluate the game and training aspects of the awareness training, demonstrating the quality of the game aspect and the effectiveness of the training aspect.
During the development of a Competence Developing Game's (CDG) story, it is indispensable to understand the target audience. CDG stories represent more than just the plot: the story comprises the setting, the characters and the plot. As a toolkit to support the development of such a story, this paper introduces the User-Focused Storybuilding (short: UFoS) Framework for CDGs. The Framework and its utilization are explained, followed by a description of its development and derivation, including an empirical study. In addition, to simplify use of the Framework with regard to the CDG's target audience, a new concept of Nine Psychographic Player Types is explained. This concept of player types provides an approach to handling the differences between players during use of the UFoS Framework. Thereby, this article presents a unique approach to the development of target-group-differentiated CDG stories.
Through a mirror darkly – On the obscurity of teaching goals in game-based learning in IT security
(2021)
Teachers and instructors use very specific language when communicating teaching goals. The most widely used frameworks of common reference are Bloom's Taxonomy and the Revised Bloom's Taxonomy; the latter distinguishes 209 different teaching goals, which are connected to methods. In Competence Developing Games (CDGs, serious games to convey knowledge) and in IT security education, a two- or three-level typology exists, reducing possible learning outcomes to awareness, training, and education. This study explores whether this much simpler framework succeeds in achieving the same range of learning outcomes. Methodologically, a keyword analysis was conducted. The results were threefold: 1. The words used to describe teaching goals in CDGs for IT security education do not reflect the whole range of learning outcomes. 2. The word choice nevertheless differs from common language, indicating an intentional use of language. 3. IT security CDGs use different sets of terms to describe learning outcomes, depending on whether they are awareness, training, or education games. The interpretation of the findings is that the reduction to just three types of CDGs reduces the capacity to communicate and think about learning outcomes and consequently reduces the outcomes that are intentionally achieved.
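A keyword analysis of the kind described can be sketched as counting taxonomy verbs in game descriptions. A minimal illustration; the verb sets and the sample text are hypothetical, not the study's corpus or coding scheme:

```python
from collections import Counter
import re

# Hypothetical verb sets loosely inspired by Bloom's taxonomy levels.
TAXONOMY_VERBS = {
    "remember":   {"list", "recall", "identify"},
    "understand": {"explain", "classify", "summarize"},
    "apply":      {"use", "demonstrate", "solve"},
}

def teaching_goal_profile(text):
    """Count occurrences of taxonomy verbs per level in a description."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for level, verbs in TAXONOMY_VERBS.items():
        counts[level] = sum(1 for w in words if w in verbs)
    return counts

profile = teaching_goal_profile(
    "Players identify phishing mails, explain the risks and demonstrate safe behaviour."
)
```

Comparing such profiles across awareness, training, and education games would reveal whether the three-type typology actually maps to distinct sets of intended learning outcomes.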
Cyberspace is "the environment formed by physical and non-physical components to store, modify, and exchange data using computer networks" (NATO CCDCOE). Beyond that, it is an environment where people interact. IT attacks are hostile, non-cooperative interactions that can be described with conflict theory. Applying conflict theory to IT security leads to different objectives for end-user education, requiring different formats like agency-based competence developing games.
Thickness dependence of the electronic structure of ultrathin, epitaxial Ni(111)/W(110) layers.
(1988)
Series production and testing of a micro motor. Serienfertigung und Prüfung eines Mikromotors
(1998)
The CellDrum technology (the term 'CellDrum technology' covers a couple of slightly different technological setups for measuring lateral mechanical tension in various types of cell monolayers or 3D tissue constructs) was designed to quantify the contraction rate and mechanical tension of self-exciting cardiac myocytes. Cells were grown either within flexible, circular collagen gels or as monolayers on top of 1-µm-thin silicone membranes. Membrane and cells were bulged outwards by air pressure. This biaxial strain distribution is rather similar to that of the beating, blood-filled heart. The setup allows the mechanical residual stress level to be preset externally by adjusting the centre deflection, thus mimicking hypertension in vitro. Tension was measured as an oscillating differential pressure change between chamber and environment. A 0.5-mm-thick collagen-cardiac myocyte tissue construct induced, after 2 days of culturing (initial cell density 2 × 10⁴ cells/ml), a mechanical tension of 1.62 ± 0.17 µN/mm². Mechanical load is an important growth regulator in the developing heart, and the orientation and alignment of cardiomyocytes are stress sensitive. It was therefore necessary to develop the CellDrum technology with its biaxial stress-strain distribution and defined mechanical boundary conditions. Cells were exposed to strain in two directions, radially and circumferentially, which is similar to biaxial loading in real heart tissue. Thus, from a biomechanical point of view, the system is preferable to previous setups based on uniaxial stretching.
All cells generate contractile tension, which is crucial for mechanically controlling cell shape, function and survival. In this study, the CellDrum technology, which quantifies cellular mechanical tension on a pico-scale, was used to investigate the effect of lipopolysaccharide (LPS) on human aortic endothelial cell (HAoEC) tension. During gram-negative sepsis, LPS causes endothelial cell contraction and thereby increases endothelium permeability. The aim was to find out whether recombinant activated protein C (rhAPC) would reverse the endothelial cell response in an in-vitro sepsis model. The established in-vitro sepsis model was confirmed by interleukin 6 (IL-6) levels at the proteomic and genomic levels by ELISA and real-time PCR, and by reactive oxygen species (ROS) activation shown by fluorescence staining. The cellular contraction effect of thrombin on endothelial cells was used as a positive control when the CellDrum technology was applied. Additionally, the Ras homolog gene family member A (RhoA) mRNA expression level was checked by real-time PCR to support the contractile tension results. According to these results, the mechanical predominance of actin stress fibers caused the increased endothelial contractile tension, leading to enhanced endothelium contractility and thus enhanced permeability. These data support, firstly, the basic measurement principles of the CellDrum technology and, secondly, a beneficial effect of rhAPC on sepsis-influenced cellular tension. The technology presented here is promising for future high-throughput cellular tension analysis, which will help identify pathological contractile tension responses of cells and validate further in-vitro cell models.
The rail business is challenged by long product life cycles and a broad spectrum of assembly groups and single parts. When spare part obsolescence occurs, quick solutions are needed. Reproducing obsolete parts is often connected to long waiting times and minimum lot quantities that need to be purchased and stored. Spare part storage is therefore challenged by growing stocks, bound capital and issues of part ageing. A possible solution is a virtual storage of spare parts that are 3D printed through additive manufacturing technologies in case of sudden demand. As the mechanical properties of additively manufactured parts are guaranteed neither by machine manufacturers nor by service providers, the utilization of this relatively young technology is impeded, and research is required to address these issues. This paper presents an examination of the mechanical properties of specimens manufactured from stainless steel through the selective laser melting (SLM) process. The specimens were produced in multiple batches. The paper examines whether the test results follow a normal distribution and whether predictions of mechanical properties can be made. The results are compared against existing threshold values provided as the industrial standard. Furthermore, probability predictions are made in order to examine the potential of the SLM process to maintain state-of-the-art mechanical property requirements.
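The kind of probability prediction described, assuming the batch results are normally distributed, can be sketched with the standard library's `NormalDist`. The strength values and the threshold below are hypothetical, not the paper's measurements:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical ultimate tensile strengths (MPa) from several SLM batches.
samples = [540, 555, 562, 548, 551, 559, 544, 557]

# Fit a normal distribution to the sample and estimate the probability
# that a printed part falls below an assumed industrial threshold.
fitted = NormalDist(mu=mean(samples), sigma=stdev(samples))
threshold = 530.0  # hypothetical minimum requirement, MPa
p_below = fitted.cdf(threshold)
```

In practice the normality assumption itself would first be checked (e.g. with a Shapiro-Wilk or Anderson-Darling test) before such tail probabilities are trusted.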
Energy saving ordinances require that buildings be designed in such a way that the heat transfer surface, including the joints, is permanently air-impermeable. Prefabricated roof and wall panels in lightweight steel constructions are airtight in the area of the steel covering layers; sealing the panel joints contributes to fulfilling the comprehensive requirements for an airtight building envelope. To improve the airtightness of steel sandwich panels, additional sealing tapes can be installed in the panel joint. The influence of these sealing tapes was evaluated through measurements carried out by RWTH Aachen University (Sustainable Metal Building Envelopes). Different installation situations were evaluated by carrying out airtightness tests for different joint distances. In addition, the influence on the heat transfer coefficient was evaluated using the Finite Element Method (FEM). Combining the obtained air volume flow and the transmission losses makes it possible to define an "effective heat transfer coefficient" accounting for both transmission and infiltration. This summarizes both effects in one value and is particularly helpful for approximate calculations of energy efficiency.
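The combination of transmission and infiltration losses into one "effective heat transfer coefficient" can be illustrated by adding an infiltration term q_v·ρ·cp (air volume flow per unit area times the volumetric heat capacity of air) to the transmission U-value. The formula and the numbers are an illustrative reading of the abstract, not the study's actual calculation:

```python
def effective_u_value(u_transmission, air_flow_per_area,
                      rho=1.2, cp=1005.0):
    """Effective heat transfer coefficient in W/(m^2*K):
    transmission U-value plus an infiltration term q_v * rho * cp,
    where q_v is the air volume flow per envelope area in m^3/(s*m^2)."""
    return u_transmission + air_flow_per_area * rho * cp

# Illustrative panel: U = 0.25 W/(m^2*K), leakage of 0.5 m^3/h per m^2.
u_eff = effective_u_value(0.25, 0.5 / 3600.0)
```

The single number u_eff then lets an approximate energy balance treat a leaky, well-insulated joint and a tight, poorly insulated one on the same scale.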
Learning- and memory-related processes are thought to result from dynamic interactions in large-scale brain networks that include lateral and mesial structures of the temporal lobes. We investigate the impact of incidental and intentional learning of verbal episodic material on functional brain networks that we derive from scalp-EEG recorded continuously from 33 subjects during a neuropsychological test schedule. Analyzing the networks' global statistical properties we observe that intentional but not incidental learning leads to a significantly increased clustering coefficient, and the average shortest path length remains unaffected. Moreover, network modifications correlate with subsequent recall performance: the more pronounced the modifications of the clustering coefficient, the higher the recall performance. Our findings provide novel insights into the relationship between topological aspects of functional brain networks and higher cognitive functions.
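The two global network measures used above, the clustering coefficient and the average shortest path length, can be computed for a small undirected graph as follows. This is a minimal pure-Python sketch on a toy graph, not the EEG-derived functional networks of the study:

```python
from collections import deque
from itertools import combinations

# Toy undirected graph as an adjacency dict.
graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {3}}

def average_clustering(g):
    """Mean local clustering coefficient: the fraction of a node's
    neighbour pairs that are themselves connected (0 for degree < 2)."""
    coeffs = []
    for node, nbrs in g.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in g[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def average_shortest_path(g):
    """Mean BFS distance over all node pairs (assumes a connected graph)."""
    total, pairs = 0, 0
    for src in g:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for node, d in dist.items():
            if node != src:  # every pair is counted twice; the ratio is unaffected
                total += d
                pairs += 1
    return total / pairs

C = average_clustering(graph)
L = average_shortest_path(graph)
```

In functional-network studies these measures are computed on graphs whose edges come from thresholded pairwise signal correlations; an increased C at unchanged L is the small-world-like shift the abstract reports for intentional learning.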