In this study, a high-speed chemical imaging system was developed for visualization of the interior of a microfluidic channel. A microfluidic channel was constructed on the sensor surface of the light-addressable potentiometric sensor (LAPS), on which the ion concentrations could be measured in parallel at up to 64 points illuminated by optical fibers. The temporal change of pH distribution inside the microfluidic channel was recorded at a maximum rate of 100 frames per second (fps). The high frame rate allowed visualization of moving interfaces and plugs in the channel even at a flow velocity of 111 mm/s, which suggests the feasibility of plug-based microfluidic devices for flow-injection analysis (FIA).
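A quick arithmetic check of the figures quoted above illustrates why the high frame rate matters: at the reported flow velocity and maximum frame rate, a moving plug advances about one millimetre between consecutive frames.

```python
# Illustrative arithmetic only, using the two figures from the abstract:
# plug velocity of 111 mm/s imaged at the maximum rate of 100 fps.
flow_velocity_mm_s = 111.0   # plug velocity reported in the abstract
frame_rate_fps = 100.0       # maximum imaging rate of the LAPS system

displacement_per_frame_mm = flow_velocity_mm_s / frame_rate_fps
print(f"Displacement per frame: {displacement_per_frame_mm:.2f} mm")
# -> 1.11 mm: the interface moves roughly one millimetre per frame,
#    so a slower camera would blur or miss the moving plugs entirely.
```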
The chemical imaging sensor is a semiconductor-based chemical sensor that can visualize the spatial distribution of specific ions on the sensing surface. The conventional chemical imaging system based on the light-addressable potentiometric sensor (LAPS), however, required a long time to obtain a chemical image, due to the slow mechanical scan of a single light beam. For high-speed imaging, a plurality of light beams modulated at different frequencies can be employed to measure the ion concentrations simultaneously at different locations on the sensor plate by frequency division multiplex (FDM). However, the conventional measurement geometry of back-side illumination limited the bandwidth of the modulation frequency required for FDM measurement, because of the low-pass filtering characteristics of carrier diffusion in the Si substrate. In this study, a high-speed chemical imaging system based on front-side-illuminated LAPS was developed, which achieved high-speed spatiotemporal recording of pH change at a rate of 70 frames per second.
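The frequency-division-multiplex idea described above can be sketched in a few lines: each measurement spot is modulated at its own frequency, the sensor delivers one summed photocurrent, and the per-spot amplitudes are read off the spectrum. This is a minimal illustration, not the authors' implementation; all sampling parameters and amplitudes are assumed.

```python
import numpy as np

# FDM sketch: four spots, four modulation frequencies, one summed signal.
fs = 100_000                          # sampling rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms acquisition window (1000 samples)
freqs = [4_000, 5_000, 6_000, 7_000]  # one modulation frequency per spot
true_amps = [1.0, 0.5, 0.8, 0.3]      # hypothetical local photocurrent amplitudes

# Summed photocurrent as seen at the single sensor output
signal = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(true_amps, freqs))

# Demultiplex: read the amplitude at each modulation frequency.
# The chosen frequencies fall exactly on FFT bins (bin width = fs / N = 100 Hz),
# so the amplitudes are recovered without leakage.
spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(t)
bin_width = fs / len(t)
recovered = [spectrum[round(f / bin_width)] for f in freqs]
print([f"{a:.2f}" for a in recovered])   # close to true_amps
```

In the real device the recoverable bandwidth is what limits the frame rate, which is why moving to front-side illumination (avoiding the low-pass filtering of carrier diffusion through the substrate) pays off directly in frames per second.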
High-spin isomer in ¹³⁷Ce
(1978)
High-spin states in ¹³³La
(1982)
High-spin states in ¹³³La
(1980)
High-spin states in ¹⁸⁰Os
(1979)
Micromachined thermal heater platforms offer low electrical power consumption and high modulation speed, i.e., properties which are advantageous for realizing nondispersive infrared (NDIR) gas- and liquid-monitoring systems. In this paper, we report on investigations of silicon-on-insulator (SOI) based infrared (IR) emitter devices heated by employing different kinds of metallic and semiconductor heater materials. Our results clearly reveal the superior high-temperature performance of semiconductor over metallic heater materials. Long-term stable emitter operation in the vicinity of 1300 K could be attained using heavily antimony-doped tin dioxide (SnO₂:Sb) heater elements.
This paper describes the modeling of a high-temperature storage system for an existing solar tower power plant with open volumetric receiver technology, which uses air as heat transfer fluid (HTF). The storage system model has been developed in the simulation environment Matlab/Simulink®. The storage type under investigation is a packed bed thermal energy storage system which has the characteristics of a regenerator. Thermal energy can be stored and discharged as required via the HTF air. The air mass flow distribution is controlled by valves, and the mass flow is driven by two blowers. The thermal storage operation strategy has a direct and significant impact on the energetic and economic efficiency of solar tower power plants.
HisT/PLIER: A Two-Fold Provenance Approach for Grid-Enabled Scientific Workflows Using WS-VLAM
(2011)
The established Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a non parametric random effects meta-regression model, where the consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
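For readers unfamiliar with the criterion discussed above, a minimal sketch of the classical Blum-Kiefer-Rosenblatt-type Cramér-von-Mises statistic for independence may help. The paper studies this criterion under partly non-identically distributed data; the sketch below is only the standard i.i.d. form, with all sample data simulated for illustration.

```python
import numpy as np

# B_n = sum_i ( H_n(X_i, Y_i) - F_n(X_i) * G_n(Y_i) )^2
# where H_n is the joint empirical CDF and F_n, G_n the marginal ones.
def bkr_statistic(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    # empirical CDFs evaluated at the sample points
    F = np.array([np.mean(x <= xi) for xi in x])
    G = np.array([np.mean(y <= yi) for yi in y])
    H = np.array([np.mean((x <= xi) & (y <= yi)) for xi, yi in zip(x, y)])
    return float(np.sum((H - F * G) ** 2))

rng = np.random.default_rng(0)
x_ind = rng.normal(size=200)
y_ind = rng.normal(size=200)                 # independent of x_ind
y_dep = x_ind + 0.1 * rng.normal(size=200)   # strongly dependent on x_ind
print(bkr_statistic(x_ind, y_ind))   # small under independence
print(bkr_statistic(x_ind, y_dep))   # much larger under dependence
```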
Hollow core fiber delivery of sub-ps pulses from a TruMicro 5000 Femto edition thin disk amplifier
(2015)
Multichannel photomultipliers (PMs), like the R7600-00-M64 or R5900-00-M64 from Hamamatsu, are often chosen as photodetectors in high-resolution positron emission tomography (PET). A major problem of these PMs is the nonuniform channel gain. In order to solve this problem, light-attenuating masks were created. The aim of the masks is a homogenization of the output of all 64 channels using different hole sizes at the channel positions. The hole area, which is defined individually for each channel, is inversely proportional to the channel gain. Measurements with the light-attenuating masks inserted showed that the homogeneity improved to a ratio of 1:1.2.
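The mask design rule described above (hole area inversely proportional to channel gain) can be sketched in a few lines. All gain values and the reference area below are hypothetical, chosen only to illustrate the rule.

```python
# Hole area per channel chosen inversely proportional to that channel's
# gain, so that gain * transmitted light is equalised across channels.
measured_gains = [1.00, 1.20, 0.85, 1.05]  # hypothetical relative channel gains
max_hole_area_mm2 = 4.0                    # assumed area for the weakest channel

weakest = min(measured_gains)
hole_areas = [max_hole_area_mm2 * weakest / g for g in measured_gains]

# After masking, the effective response gain * area is the same everywhere:
effective = [g * a for g, a in zip(measured_gains, hole_areas)]
print([f"{a:.2f}" for a in hole_areas])
print([f"{e:.2f}" for e in effective])   # all channels equalised
```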
Ceramic hot gas filters are widely used in combined cycles based on pressurised fluidised beds. They fulfil most of the demands with respect to cleaning efficiency and long-term durability, but their operation still has to be optimised regarding the consumption of pulse gas and energy. Experimental investigations were carried out to measure the flow field, the pressure and the gas temperature inside the filter candle during pulse jet cleaning. These results are compared with the results of a numerical procedure based on a solution of the two-dimensional conservation equations for momentum and energy. The observed difficulties in simultaneously handling different flow regimes, such as highly turbulent flow and Darcy flow, are discussed.
Hotelling’s T² tests in paired and independent survey samples are compared using the traditional asymptotic efficiency concepts of Hodges–Lehmann, Bahadur and Pitman, as well as through criteria based on the volumes of corresponding confidence regions. Conditions characterizing the superiority of a procedure are given in terms of population canonical correlation type coefficients. Statistical tests for checking these conditions are developed. Test statistics based on the eigenvalues of a symmetrized sample cross-covariance matrix are suggested, as well as test statistics based on sample canonical correlation type coefficients.
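As a minimal illustration of the paired-sample version of Hotelling's T² compared above: for paired p-variate observations, the null hypothesis of zero mean difference is tested via the mean and covariance of the pairwise differences. The data below are simulated and purely illustrative.

```python
import numpy as np

# Paired Hotelling's T^2: with D_i = X_i - Y_i,
#   T^2 = n * dbar' S^{-1} dbar,
# and (n - p) / (p * (n - 1)) * T^2 ~ F(p, n - p) under H0.
def paired_hotelling_t2(X, Y):
    D = np.asarray(X, float) - np.asarray(Y, float)
    n, p = D.shape
    dbar = D.mean(axis=0)
    S = np.cov(D, rowvar=False)          # sample covariance of differences
    t2 = n * dbar @ np.linalg.solve(S, dbar)
    f_stat = (n - p) / (p * (n - 1)) * t2
    return float(t2), float(f_stat)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
Y = X + rng.normal(scale=0.5, size=(50, 3)) + 0.4   # shifted paired sample
t2, f_stat = paired_hotelling_t2(X, Y)
print(f"T^2 = {t2:.2f}, F = {f_stat:.2f}")   # large: the shift is detected
```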
How different diversity factors affect the perception of first-year requirements in higher education
(2021)
In the light of growing university entry rates, higher education institutions not only serve larger numbers of students, but also seek to meet first-year students’ ever more diverse needs. Yet to inform universities how to support the transition to higher education, research only offers limited insights. Current studies tend to either focus on the individual factors that affect student success, or they highlight students’ social background and their educational biography in order to examine the achievement of selected, non-traditional groups of students. Both lines of research appear to lack integration and often fail to take organisational diversity into account, such as different types of higher education institutions or degree programmes. For a more comprehensive understanding of student diversity, the present study includes individual, social and organisational factors. To gain insights into their role for the transition to higher education, we examine how the different factors affect the students’ perception of the formal and informal requirements of the first year as more or less difficult to cope with. As the perceived requirements result from both the characteristics of the students and the institutional context, they allow the investigation of transition at the interface of the micro and the meso level of higher education. Latent profile analyses revealed that there are no profiles with complex patterns of perception of the first-year requirements; rather, the identified groups differ in the overall level of perceived challenges. Moreover, structural equation modelling (SEM) indicates that the differences in perception largely depend on the individual factors self-efficacy and volition.
How does the implementation of a next generation network influence a telecommunication company?
(2009)
As the potential of a Next Generation Network (NGN) is recognized, telecommunication companies consider switching to it. Although the implementation of an NGN seems to be merely a modification of the network infrastructure, it may trigger or require changes in the whole company and even influence the company strategy. To capture the effects of NGN we propose a framework based on concepts of business engineering and technical recommendations for the introduction of NGN technology. The specific design of solutions for the layers "Strategy", "Processes" and "Information Systems" as well as their interdependencies are an essential characteristic of the developed framework. We have performed a case study on NGN implementation and observed that all layers captured by our framework are influenced by the introduction of an NGN.
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
Modern industry and multi-discipline projects require highly trained individuals with resilient science and engineering backgrounds. Graduates must be able to agilely apply excellent theoretical knowledge in their subject matter as well as essential practical “hands-on” knowledge of diverse working processes to solve complex problems. To meet these demands, university education follows the concept of Constructive Alignment and thus increasingly adapts the teaching of necessary practical skills to the actual industry requirements and assessment routines. However, a systematic approach to coherently align these three central teaching demands is strangely absent from current university curricula. We demonstrate the feasibility of implementing practical assessments in a regular theory-based examination, thus defining the term “blended assessment”. We assessed a course for natural science and engineering students pursuing a career in biomedical engineering, and evaluated the benefit of blended assessment exams for students and lecturers. Our controlled study assessed the physiological background of electrocardiograms (ECGs), the practical measurement of ECG curves, and their interpretation of basic pathologic alterations. To study long-term effects, students were assessed on the topic twice, with a time lag of six months. Our findings suggest a significant improvement in student gain with respect to practical skills and theoretical knowledge. The results of the reassessments support these outcomes. From the lecturers’ point of view, blended assessment complements practical training courses while keeping organizational effort manageable. We consider blended assessment a viable tool for providing an improved, industry-ready education format that should be evaluated and established further to prepare university graduates optimally for their future careers.
The 2nd edition of the lightning risk management standard (IEC 62305-2) considers structures which may endanger their environment. In these cases, the loss is not limited to the structure itself, as it is for usual structures. In the past (Edition 1), this danger was simply taken into account by a special hazard factor that multiplied the existing risk for the structure by a number. Now, in Edition 2, a “second risk” due to the losses outside the structure is added to the risk for the structure itself. The losses outside can be treated independently from what occurs inside. This is a major advantage when analyzing the risk for sensitive structures, like chemical plants, nuclear plants, or structures containing explosives. In this paper, the existing procedure given by the European version EN 62305-2 Ed. 2 is further developed and applied to a few structures.
Many companies still conduct the worldwide management of people as if neither the external economic environment nor the internal structure of the firm had changed. The costs of cross-cultural failure, for individuals and their companies, are enormous: personal and family costs; financial, professional and emotional costs; costs to one’s career prospects, to one’s self-esteem, to one’s marriage and family. This scenario sufficiently describes the reason for learning “the art of crossing cultures” (Craig Storti). To this end, this research paper describes an innovative approach to cross-cultural training, following the didactic ideas of Kolb and Fry: so-called ‘experiential learning’.
Cement augmentation is an emerging surgical procedure in which bone cement is used to infiltrate and reinforce osteoporotic vertebrae. Although this infiltration procedure has been widely applied, it is performed empirically and little is known about the flow characteristics of cement during the injection process. We present a theoretical and experimental approach to investigate the intertrabecular bone permeability during the infiltration procedure. The cement permeability was considered to be dependent on time, bone porosity, and cement viscosity in our analysis. In order to determine the time-dependent permeability, ten cancellous bone cores were harvested from osteoporotic vertebrae, infiltrated with acrylic cement at a constant flow rate, and the pressure drop across the cores during the infiltration was measured. The viscosity dependence of the permeability was determined based on published experimental data. The theoretical model for the permeability as a function of bone porosity and time was then fit to the testing data. Our findings suggest that the intertrabecular bone permeability depends strongly on time. For instance, the initial permeability (60.89 mm⁴/(N·s)) reduced to approximately 63% of its original value within 18 seconds. This study is the first to analyze cement flow through osteoporotic bone. The theoretical and experimental models provided in this paper are generic. Thus, they can be used to systematically study and optimize the infiltration process for clinical practice.
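The constant-flow-rate experiment described above lends itself to a short worked example: via Darcy's law for flow through a porous core, an apparent hydraulic permeability follows from the measured pressure drop. All numbers below are hypothetical; only the unit convention (mm⁴/(N·s), which folds the cement viscosity into the coefficient) follows the abstract.

```python
import math

# Darcy's law for a cylindrical core:  Q / A = K * dP / L
#   =>  K = Q * L / (A * dP),  K in mm^4/(N*s)
Q = 20.0    # injection flow rate, mm^3/s (assumed)
L = 10.0    # core length, mm (assumed)
d = 8.0     # core diameter, mm (assumed)
dP = 0.5    # measured pressure drop, N/mm^2 = MPa (assumed)

A = math.pi * (d / 2) ** 2       # cross-sectional area, mm^2
K = Q * L / (A * dP)             # apparent hydraulic permeability
print(f"K = {K:.2f} mm^4/(N*s)")
```

Repeating this calculation at successive time points during the injection is what yields the time-dependent permeability curve the study fits its model to.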
The number of electric vehicles increases steadily, while the space for extending the charging infrastructure is limited. In particular in urban areas, where parking spaces in attractive locations are scarce, opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited, and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilizations.
We introduce a new way to measure the forecast effort that analysts devote to their earnings forecasts by measuring the analyst's general effort for all covered firms. While the commonly applied effort measure is based on analyst behaviour for one firm, our measure considers analyst behaviour for all covered firms. Our general effort measure captures additional information about analyst effort and thus can identify accurate forecasts. We emphasise the importance of investigating analyst behaviour in a larger context and argue that analysts who generally devote substantial forecast effort are also likely to devote substantial effort to a specific firm, even if this effort might not be captured by a firm-specific measure. Empirical results reveal that analysts who devote higher general forecast effort issue more accurate forecasts. Additional investigations show that analysts' career prospects improve with higher general forecast effort. Our measure improves on existing methods as it has higher explanatory power regarding differences in forecast accuracy than the commonly applied effort measure. Additionally, it can address research questions that cannot be examined with a firm-specific measure. It provides a simple but comprehensive way to identify accurate analysts.
Mouse nongenotoxic hepatocarcinogens phenobarbital (PB) and chlordane induce hepatomegaly characterized by hypertrophy and hyperplasia. Increased cell proliferation is implicated in the mechanism of tumor induction. The relevance of these tumors to human health is unclear. The xenoreceptors constitutive androstane receptor (CAR) and pregnane X receptor (PXR) play key roles in these processes. Novel “humanized” and knockout models for both receptors were developed to investigate potential species differences in hepatomegaly. The effects of PB (80 mg/kg/4 days) and chlordane (10 mg/kg/4 days) were investigated in double humanized PXR and CAR (huPXR/huCAR), double knockout PXR and CAR (PXRKO/CARKO), and wild-type (WT) C57BL/6J mice. In WT mice, both compounds caused increased liver weight, hepatocellular hypertrophy, and cell proliferation. Both compounds caused alterations to a number of cell cycle genes consistent with induction of cell proliferation in WT mice. However, these gene expression changes did not occur in PXRKO/CARKO or huPXR/huCAR mice. Liver hypertrophy without hyperplasia was demonstrated in the huPXR/huCAR animals in response to both compounds. Induction of the CAR and PXR target genes, Cyp2b10 and Cyp3a11, was observed in both WT and huPXR/huCAR mouse lines following treatment with PB or chlordane. In the PXRKO/CARKO mice, neither liver growth nor induction of Cyp2b10 and Cyp3a11 was seen following PB or chlordane treatment, indicating that these effects are CAR/PXR dependent. These data suggest that the human receptors are able to support the chemically induced hypertrophic responses but not the hyperplastic (cell proliferation) responses. At this time, we cannot be certain that hCAR and hPXR when expressed in the mouse can function exactly as the genes do when they are expressed in human cells. However, all parameters investigated to date suggest that much of their functionality is maintained.
Digital Shadows as the aggregation, linkage and abstraction of data relating to physical objects are a central vision for the future of production. However, the majority of current research takes a technocentric approach, in which the human actors in production play a minor role. Here, the authors present an alternative anthropocentric perspective that highlights the potential and main challenges of extending the concept of Digital Shadows to humans. Following future research methodology, three prospections that illustrate use cases for Human Digital Shadows across organizational and hierarchical levels are developed: human-robot collaboration for manual work, decision support and work organization, as well as human resource management. Potentials and challenges are identified using separate SWOT analyses for the three prospections and common themes are emphasized in a concluding discussion.
While bringing new opportunities, the Industry 4.0 movement also imposes new challenges to the manufacturing industry and all its stakeholders. In this competitive environment, a skilled and engaged workforce is a key to success. Gamification can generate valuable feedbacks for improving employees’ engagement and performance. Currently, Gamification in workspaces focuses on computer-based assignments and training, while tasks that require manual labor are rarely considered. This research provides an overview of Enterprise Gamification approaches and evaluates the challenges. Based on that, a skill-based Gamification framework for manual tasks is proposed, and a case study in the Industry 4.0 model factory is shown.
Like all preceding transformations of the manufacturing industry, the large-scale usage of production data will reshape the role of humans within the sociotechnical production ecosystem. To ensure that this transformation creates work systems in which employees are empowered, productive, healthy, and motivated, the transformation must be guided by principles of and research on human-centered work design. Specifically, measures must be taken at all levels of work design, ranging from (1) the work tasks to (2) the working conditions to (3) the organizational level and (4) the supra-organizational level. We present selected research across all four levels that showcase the opportunities and requirements that surface when striving for human-centered work design for the Internet of Production (IoP). (1) On the work task level, we illustrate the user-centered design of human-robot collaboration (HRC) and process planning in the composite industry as well as user-centered design factors for cognitive assistance systems. (2) On the working conditions level, we present a newly developed framework for the classification of HRC workplaces. (3) Moving to the organizational level, we show how corporate data can be used to facilitate best practice sharing in production networks, and we discuss the implications of the IoP for new leadership models. Finally, (4) on the supra-organizational level, we examine overarching ethical dimensions, investigating, e.g., how the new work contexts affect our understanding of responsibility and normative values such as autonomy and privacy. Overall, these interdisciplinary research perspectives highlight the importance and necessary scope of considering the human factor in the IoP.
Humanized UGT2 and CYP3A transchromosomic rats for improved prediction of human drug metabolism
(2019)
Hybrid control for autonomous systems — Integrating learning, deliberation and reactive control
(2010)
Graphene oxide (GO) nanoparticles were incorporated in temperature-sensitive Poly(N-isopropylacrylamide) (PNIPAAm) hydrogels. The nanoparticles increase the light absorption and convert light energy into heat efficiently. Thus, the hydrogels with GO can be stimulated spatially resolved by illumination, as was demonstrated by IR thermography. The temporal progression of the temperature maximum was detected for different concentrations of GO within the polymer network. Furthermore, the compatibility of PNIPAAm hydrogels with GO and cell cultures was investigated. For this purpose, culture medium was incubated with hydrogels containing GO, and the viability and morphology of Chinese hamster ovary (CHO) cells was examined after several days of culturing in presence of this medium.
The feasibility study presents results of a hydrogen combustor integration for a Medium-Range aircraft engine using the Dry-Low-NOₓ Micromix combustion principle. Based on a simplified Airbus A320-type flight mission, a thermodynamic performance model of a kerosene and a hydrogen-powered V2530-A5 engine is used to derive the thermodynamic combustor boundary conditions. A new combustor design using the Dry-Low-NOₓ Micromix principle is investigated by slice model CFD simulations of a single Micromix injector for design and off-design operation of the engine. Combustion characteristics show typical Micromix flame shapes and good combustion efficiencies for all flight mission operating points. Nitric oxide emissions are significantly below ICAO CAEP/8 limits. For comparison of the Emission Index (EI) for NOₓ emissions between kerosene and hydrogen operation, an energy (kerosene) equivalent Emission Index is used.
A full 15° sector model CFD simulation of the combustion chamber with multiple Micromix injectors including inflow homogenization and dilution and cooling air flows investigates the combustor integration effects, resulting NOₓ emission and radial temperature distributions at the combustor outlet. The results show that the integration of a Micromix hydrogen combustor in actual aircraft engines is feasible and offers, besides CO₂ free combustion, a significant reduction of NOₓ emissions compared to kerosene operation.
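The energy-equivalent Emission Index mentioned above can be illustrated with a short calculation. Because a kilogram of hydrogen carries roughly 2.8 times the energy of a kilogram of kerosene, a per-kg-fuel EI must be rescaled to a common energy basis before hydrogen and kerosene engines can be compared; the hydrogen EI value below is hypothetical.

```python
# Energy-equivalent EI:  EI_equiv = EI_H2 * (LHV_kerosene / LHV_H2),
# i.e. grams of NOx per kilogram of kerosene-equivalent energy.
LHV_H2 = 120.0        # MJ/kg, lower heating value of hydrogen
LHV_KEROSENE = 43.0   # MJ/kg, lower heating value of kerosene (approx.)
ei_h2 = 5.0           # hypothetical g NOx per kg H2

ei_equiv = ei_h2 * LHV_KEROSENE / LHV_H2
print(f"Energy-equivalent EI: {ei_equiv:.2f} g NOx / kg kerosene-equivalent")
```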
The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors like iron and steel. In order to prevent the relocation of such industries outside the EU in the course of tightening environmental regulations, the establishment of a climate club jointly with other large emitters, and alternatively the unilateral implementation of an international cross-border carbon tax mechanism, are proposed. This article focuses on the latter option, choosing the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes employed in Europe against more polluting competition from abroad. By using a floor price model, we assess the competitiveness of different steel production routes in selected countries. We evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model. It enables us to assess impacts of rising cost of steel production on commodities using steel as intermediates. Our results raise concerns that a cross-border tax mechanism will not suffice to bring about competitiveness of hydrogen-based steel production in Europe, because the cost tends to remain higher than the cost of steel production in, e.g., China. Steel is a classic example of a good used mainly as an intermediate for other products. Therefore, a cross-border tax mechanism for steel will increase the price of products produced in the EU that require steel as an input. This can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports.
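The cost-push mechanism analysed above can be sketched with a toy input-output (Leontief) price model: an exogenous increase in the unit cost of steel propagates to every sector that uses steel as an intermediate. All coefficients below are hypothetical and serve only to show the structure of the calculation.

```python
import numpy as np

# Cost-push price effect:  dp' = dv' (I - A)^{-1},
# where A holds input coefficients and dv the exogenous unit-cost increase.
A = np.array([
    [0.10, 0.25, 0.05],   # steel used per unit of (steel, machinery, other)
    [0.05, 0.10, 0.10],   # machinery inputs
    [0.10, 0.15, 0.20],   # other inputs
])
dv = np.array([0.10, 0.0, 0.0])   # 10% exogenous cost increase in steel only

# Solving (I - A)' dp = dv is equivalent to dp' = dv' (I - A)^{-1}.
dp = np.linalg.solve(np.eye(3) - A.T, dv)
print(np.round(dp, 4))   # steel rises most; steel-using sectors follow
```

Even in this toy setting the pattern matches the article's concern: the downstream sectors inherit part of the steel price increase, which is why export effects need to be considered.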
Hydrostatic propeller drive
(2011)
Ice melting probes
(2023)
The exploration of icy environments in the solar system, such as the poles of Mars and the icy moons (a.k.a. ocean worlds), is a key aspect for understanding their astrobiological potential as well as for extraterrestrial resource inspection. On these worlds, ice melting probes are considered to be well suited for the robotic clean execution of such missions. In this chapter, we describe ice melting probes and their applications, the physics of ice melting and how the melting behavior can be modeled and simulated numerically, the challenges for ice melting, and the required key technologies to deal with those challenges. We also give an overview of existing ice melting probes and report some results and lessons learned from laboratory and field tests.
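The melting physics mentioned above admits a simple back-of-the-envelope model: in the idealised loss-free limit, the heating power delivered at the melting head of cross-section A fixes the melting velocity v through P = ρ·A·v·(c·ΔT + L). All mission parameters below are assumed for illustration; real probes lose a substantial fraction of the power to lateral conduction into the surrounding ice.

```python
import math

RHO_ICE = 917.0       # kg/m^3, density of ice
C_ICE = 2100.0        # J/(kg*K), specific heat of ice
L_FUSION = 334_000.0  # J/kg, latent heat of fusion

P = 2000.0            # heating power, W (assumed)
d = 0.15              # probe diameter, m (assumed)
dT = 20.0             # ice temperature below melting point, K (assumed)

# Loss-free energy balance: power = mass melted per second * energy per kg
A = math.pi * (d / 2) ** 2
v = P / (RHO_ICE * A * (C_ICE * dT + L_FUSION))
print(f"Ideal melting velocity: {v * 3600:.2f} m/h")
```

This upper-bound estimate (on the order of a metre per hour for kilowatt-class probes) is the reason melting probes are attractive for clean access: slow, but with no cuttings to remove and a refreezing channel behind the probe.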
The “IceMole” is a novel maneuverable subsurface ice probe for clean in-situ analysis and sampling of subsurface ice and subglacial water/brine. It is developed and built at FH Aachen University of Applied Sciences’ Astronautical Laboratory. A first prototype was successfully tested on the Swiss Morteratsch glacier in 2010. Clean sampling is achieved with a hollow ice screw (as used in mountaineering) at the tip of the probe. Maneuverability is achieved with a differentially heated melting head. Funded by the German Space Agency (DLR), a consortium led by FH Aachen currently develops a much more advanced IceMole probe, which includes a sophisticated system for obstacle avoidance, target detection, and navigation in the ice. We intend to use this probe for taking clean samples of subglacial brine at the Blood Falls (McMurdo Dry Valleys, East Antarctica) for chemical and microbiological analysis. In our conference contribution, we 1) describe the IceMole design, 2) report the results of the field tests of the first prototype on the Morteratsch glacier, 3) discuss the probe’s potential for the clean in-situ analysis and sampling of subsurface ice and subglacial liquids, and 4) outline the way ahead in the development of this technology.
There is significant interest in sampling subglacial environments for geobiological studies, but they are difficult to access. Existing ice-drilling technologies make it cumbersome to maintain microbiologically clean access for sample acquisition and environmental stewardship of potentially fragile subglacial aquatic ecosystems. The IceMole is a maneuverable subsurface ice probe for clean in situ analysis and sampling of glacial ice and subglacial materials. The design is based on the novel concept of combining melting and mechanical propulsion. It can change melting direction by differential heating of the melting head and optional side-wall heaters. The first two prototypes were successfully tested between 2010 and 2012 on glaciers in Switzerland and Iceland. They demonstrated downward, horizontal and upward melting, as well as curve driving and dirt layer penetration. A more advanced probe is currently under development as part of the Enceladus Explorer (EnEx) project. It offers systems for obstacle avoidance, target detection, and navigation in ice. For the EnEx-IceMole, we will pay particular attention to clean protocols for the sampling of subglacial materials for biogeochemical analysis. We plan to use this probe for clean access into a unique subglacial aquatic environment at Blood Falls, Antarctica, with return of a subglacial brine sample.
We present the novel concept of a combined drilling and melting probe for subsurface ice research. This probe, named “IceMole”, is currently being developed, built, and tested at the FH Aachen University of Applied Sciences’ Astronautical Laboratory. Here, we describe its first prototype design and report the results of its field tests on the Swiss Morteratsch glacier. Although the IceMole design is currently adapted to terrestrial glaciers and ice sheets, it may later be modified for the subsurface in-situ investigation of extraterrestrial ice, e.g., on Mars, Europa, and Enceladus. If life exists on those bodies, it may be present in the ice (as life is also found in the deep ice of Earth).
Dynamic retinal vessel analysis (DVA) provides a non-invasive way to assess microvascular function in patients and potentially to improve predictions of individual cardiovascular (CV) risk. The aim of our study was to use untargeted machine learning on DVA in order to improve CV mortality prediction and identify corresponding response alterations.
The molecular events during nongenotoxic carcinogenesis and their temporal order are poorly understood but thought to include long-lasting perturbations of gene expression. Here, we have investigated the temporal sequence of molecular and pathological perturbations at early stages of phenobarbital (PB) mediated liver tumor promotion in vivo. Molecular profiling (mRNA, microRNA [miRNA], DNA methylation, and proteins) of mouse liver during 13 weeks of PB treatment revealed progressive increases in hepatic expression of long noncoding RNAs and miRNAs originating from the Dlk1-Dio3 imprinted gene cluster, a locus that has recently been associated with stem cell pluripotency in mice and various neoplasms in humans. PB induction of the Dlk1-Dio3 cluster noncoding RNA (ncRNA) Meg3 was localized to glutamine synthetase-positive hypertrophic perivenous hepatocytes, suggesting a role for β-catenin signaling in the dysregulation of Dlk1-Dio3 ncRNAs. The carcinogenic relevance of Dlk1-Dio3 locus ncRNA induction was further supported by in vivo genetic dependence on constitutive androstane receptor and β-catenin pathways. Our data identify Dlk1-Dio3 ncRNAs as novel candidate early biomarkers for mouse liver tumor promotion and provide new opportunities for assessing the carcinogenic potential of novel compounds.
Objectives
To assess the image quality of T2-weighted (T2w) magnetic resonance imaging of the prostate and the visibility of prostate cancer at 7 Tesla (T).
Materials & methods
Seventeen prostate cancer patients underwent T2w imaging at 7T with only an external transmit/receive array coil. Three radiologists independently scored images for image quality, visibility of anatomical structures, and presence of artefacts. Krippendorff’s alpha and weighted kappa statistics were used to assess inter-observer agreement. Visibility of prostate cancer lesions was assessed by directly linking the T2w images to the confirmed location of prostate cancer on histopathology.
Results
T2w imaging at 7T was achievable with ‘satisfactory’ (3/5) to ‘good’ (4/5) quality. Visibility of anatomical structures was predominantly scored as ‘satisfactory’ (3/5) and ‘good’ (4/5). If artefacts were present, they were mostly motion artefacts and, to a lesser extent, aliasing artefacts and noise. Krippendorff’s analysis revealed an α = 0.44 between three readers for the overall image quality scores. Clinically significant cancer lesions in both peripheral zone and transition zone were visible at 7T.
Conclusion
T2w imaging of satisfactory to good quality could be routinely acquired at 7T using only an external transmit/receive body array coil, and cancer lesions were visible in patients with prostate cancer.
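The inter-observer agreement statistics named in the abstract (Krippendorff’s alpha, weighted kappa) can be illustrated with a short sketch. The quadratically weighted Cohen’s kappa below is implemented from its textbook definition; the 1–5 image-quality ratings are invented for illustration, since the study’s actual rating data are not given in the abstract.

```python
# Quadratically weighted Cohen's kappa for two raters on an ordinal 1..n scale,
# one of the agreement measures mentioned in the abstract. Ratings here are
# hypothetical example data, not the study's data.

def weighted_kappa(r1, r2, n_categories=5):
    """Cohen's kappa with quadratic weights for ordinal ratings in 1..n_categories."""
    n = len(r1)
    # Observed joint rating proportions
    obs = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1.0 / n
    # Marginal proportions per rater (chance-expected agreement)
    p1 = [r1.count(c + 1) / n for c in range(n_categories)]
    p2 = [r2.count(c + 1) / n for c in range(n_categories)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    num = den = 0.0
    for i in range(n_categories):
        for j in range(n_categories):
            w = (i - j) ** 2 / (n_categories - 1) ** 2
            num += w * obs[i][j]
            den += w * p1[i] * p2[j]
    return 1.0 - num / den

reader_a = [3, 4, 4, 3, 5, 2, 4, 3, 4, 3]  # hypothetical quality scores
reader_b = [3, 4, 3, 3, 4, 2, 4, 4, 4, 3]
print(round(weighted_kappa(reader_a, reader_b), 3))
```

Quadratic weights penalize large rating disagreements more than adjacent-category ones, which is why weighted kappa is preferred over plain kappa for ordinal quality scores.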
Image reconstruction analysis for positron emission tomography with heterostructured scintillators
(2022)
The concept of structure engineering has been proposed for exploring the next generation of radiation detectors with improved performance. A TOF-PET geometry with heterostructured scintillators with a pixel size of 3.0×3.1×15 mm³ was simulated using Monte Carlo methods. The heterostructures consisted of alternating layers of BGO, a dense material with high stopping power, and plastic (EJ232), a fast light emitter. The detector time resolution was calculated as a function of the energy deposited and shared in both materials on an event-by-event basis. While sensitivity was reduced to 32% for 100 μm thick plastic layers and 52% for 50 μm layers, the CTR distribution improved to 204±49 ps and 220±41 ps, respectively, compared to the 276 ps assumed for bulk BGO. The complex distribution of timing resolutions was accounted for in the reconstruction: we divided the events into three groups based on their CTR and modeled them with different Gaussian TOF kernels. On a NEMA IQ phantom, the heterostructures yielded better contrast recovery in early iterations; bulk BGO, on the other hand, achieved a better contrast-to-noise ratio (CNR) after the 15th iteration due to its higher sensitivity. The developed simulation and reconstruction methods constitute new tools for evaluating different detector designs with complex time responses.
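The width of each Gaussian TOF kernel follows directly from the group’s CTR: a timing spread of Δt (FWHM) localizes the annihilation point along the line of response to Δx = c·Δt/2. The sketch below applies this standard conversion to the CTR values quoted in the abstract; it is a generic illustration, not the authors’ reconstruction code.

```python
# Converting coincidence time resolution (CTR, FWHM in ps) into the spatial
# FWHM and Gaussian sigma of the TOF kernel along a line of response:
# delta_x = c * delta_t / 2. CTR values are taken from the abstract.
import math

C_MM_PER_PS = 0.299792458  # speed of light in mm/ps
FWHM_TO_SIGMA = 1.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))

def tof_kernel_sigma_mm(ctr_ps):
    """Gaussian sigma (mm) of the TOF kernel for a given CTR FWHM in ps."""
    fwhm_mm = C_MM_PER_PS * ctr_ps / 2.0
    return fwhm_mm * FWHM_TO_SIGMA

for label, ctr in [("hetero, 100 um plastic", 204),
                   ("hetero, 50 um plastic", 220),
                   ("bulk BGO", 276)]:
    fwhm_mm = C_MM_PER_PS * ctr / 2.0
    print(f"{label}: CTR {ctr} ps -> TOF FWHM {fwhm_mm:.1f} mm "
          f"(sigma {tof_kernel_sigma_mm(ctr):.1f} mm)")
```

This makes the trade-off in the abstract concrete: the 204 ps events localize annihilations within roughly a 31 mm FWHM window versus about 41 mm for bulk BGO at 276 ps, which is why narrower kernels speed up contrast recovery even at reduced sensitivity.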