In this study, a high-speed chemical imaging system was developed for visualization of the interior of a microfluidic channel. A microfluidic channel was constructed on the sensor surface of the light-addressable potentiometric sensor (LAPS), on which the ion concentrations could be measured in parallel at up to 64 points illuminated by optical fibers. The temporal change of pH distribution inside the microfluidic channel was recorded at a maximum rate of 100 frames per second (fps). The high frame rate allowed visualization of moving interfaces and plugs in the channel even at a flow velocity of 111 mm/s, which suggests the feasibility of plug-based microfluidic devices for flow-injection analysis (FIA).
The chemical imaging sensor is a semiconductor-based chemical sensor that can visualize the spatial distribution of specific ions on the sensing surface. The conventional chemical imaging system based on the light-addressable potentiometric sensor (LAPS), however, required a long time to obtain a chemical image, due to the slow mechanical scan of a single light beam. For high-speed imaging, a plurality of light beams modulated at different frequencies can be employed to measure the ion concentrations simultaneously at different locations on the sensor plate by frequency division multiplex (FDM). However, the conventional measurement geometry of back-side illumination limited the bandwidth of the modulation frequency required for FDM measurement, because of the low-pass filtering characteristics of carrier diffusion in the Si substrate. In this study, a high-speed chemical imaging system based on front-side-illuminated LAPS was developed, which achieved high-speed spatiotemporal recording of pH change at a rate of 70 frames per second.
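The FDM readout described in this abstract can be sketched numerically: each measurement spot is modulated at its own frequency, and the amplitude belonging to each spot is recovered from the summed photocurrent by lock-in demodulation. The following sketch uses purely illustrative frequencies and amplitudes, not values from the paper.

```python
import numpy as np

# Illustrative FDM readout: three measurement spots, each modulated at its
# own frequency; the summed photocurrent is demodulated by lock-in detection.
fs = 100_000                        # sampling rate (Hz), illustrative
t = np.arange(0, 0.05, 1 / fs)      # 50 ms acquisition window
freqs = [1000, 1300, 1700]          # one modulation frequency per light spot (Hz)
amps = [0.8, 0.5, 0.2]              # local photocurrent amplitudes (pH-dependent)

# Summed photocurrent: superposition of all modulated spots plus noise
signal = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(amps, freqs))
signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

# Lock-in demodulation: multiply by each reference, low-pass via the mean
recovered = [2 * np.mean(signal * np.sin(2 * np.pi * f * t)) for f in freqs]
print("recovered amplitudes:", [round(r, 2) for r in recovered])
```

Because each reference frequency fits an integer number of cycles into the window, the cross terms between spots average out and each amplitude is recovered independently; this is what bounds the usable modulation bandwidth mentioned above.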
High-spin isomer in ¹³⁷Ce
(1978)
High-spin states in ¹³³La
(1980)
High-spin states in ¹³³La
(1982)
High-spin states in ¹⁸⁰Os
(1979)
This paper describes the modeling of a high-temperature storage system for an existing solar tower power plant with open volumetric receiver technology, which uses air as heat transfer medium (HTF). The storage system model has been developed in the simulation environment Matlab/Simulink®. The storage type under investigation is a packed bed thermal energy storage system which has the characteristics of a regenerator. Thermal energy can be stored and discharged as required via the HTF air. The air mass flow distribution is controlled by valves, and the mass flow by two blowers. The thermal storage operation strategy has a direct and significant impact on the energetic and economic efficiency of the solar tower power plants.
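The packed-bed regenerator described above can be sketched as a one-dimensional Schumann-type model in which charging air transfers heat cell by cell to the solid filler. All parameter values below are illustrative assumptions, not taken from the plant model in the paper.

```python
import numpy as np

# Illustrative 1D packed-bed (regenerator) charging model: hot air enters
# at one end and transfers heat to the solid filler cell by cell
# (Schumann-type model; air thermal capacity neglected against the solid).
n = 50                      # axial cells
dt = 1.0                    # time step (s)
m_dot_cp = 500.0            # air capacity flow rate (W/K), illustrative
hA = 200.0                  # heat transfer coefficient * area per cell (W/K)
C_solid = 5.0e4             # heat capacity of the solid per cell (J/K)

T_in = 600.0                # charging air inlet temperature (°C)
T_solid = np.full(n, 20.0)  # initial bed temperature (°C)

for _ in range(3600):       # simulate one hour of charging
    T_air = T_in
    for i in range(n):
        # Air exchanges heat with the solid in cell i, then moves on
        q = hA * (T_air - T_solid[i])     # heat flow into the solid (W)
        T_solid[i] += q * dt / C_solid
        T_air -= q / m_dot_cp             # air cools along the bed
print(f"bed inlet {T_solid[0]:.0f} °C, bed outlet {T_solid[-1]:.0f} °C")
```

The sketch reproduces the regenerator behaviour qualitatively: a thermal front travels from the inlet toward the outlet, so the achievable charge, and hence the plant economics, depends on when the operation strategy stops charging.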
HisT/PLIER: A Two-Fold Provenance Approach for Grid-Enabled Scientific Workflows Using WS-VLAM
(2011)
The established Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where consistency is obtained as well. Simulations investigate the size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
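The criterion in question compares the joint empirical CDF with the product of the marginal empirical CDFs. A minimal sketch of the statistic, using a permutation null in place of the limiting distribution; data and sample sizes are illustrative, not from the paper.

```python
import numpy as np

def hbkr_statistic(x, y):
    """Cramér-von-Mises-type statistic n * mean((F_xy - F_x * F_y)^2)
    comparing the joint empirical CDF with the product of the marginals
    (the criterion behind the Hoeffding-Blum-Kiefer-Rosenblatt test)."""
    n = len(x)
    Fxy = np.mean((x[None, :] <= x[:, None]) & (y[None, :] <= y[:, None]), axis=1)
    Fx = np.mean(x[None, :] <= x[:, None], axis=1)
    Fy = np.mean(y[None, :] <= y[:, None], axis=1)
    return n * np.mean((Fxy - Fx * Fy) ** 2)

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y_dep = x + 0.5 * rng.normal(size=200)      # strongly dependent on x
y_ind = rng.normal(size=200)                # independent of x

obs_dep = hbkr_statistic(x, y_dep)
obs_ind = hbkr_statistic(x, y_ind)

# Permutation null: shuffling y destroys any dependence on x; since the
# statistic depends only on ranks, one null sample serves both tests.
perm = np.array([hbkr_statistic(x, rng.permutation(y_dep)) for _ in range(200)])
p_dep = np.mean(perm >= obs_dep)
p_ind = np.mean(perm >= obs_ind)
print(f"p (dependent sample) = {p_dep:.3f}, p (independent sample) = {p_ind:.3f}")
```

The dependent pair yields a tiny p-value while the independent pair does not; the paper's contribution is that the same distribution-free null also applies when the observations are only partly identically distributed.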
Multichannel photomultipliers (PMs), like the R7600-00-M64 or R5900-00-M64 from Hamamatsu, are often chosen as photodetectors in high-resolution positron emission tomography (PET). A major problem of these PMs is the nonuniform channel gain. To solve this problem, light-attenuating masks were created. The aim of the masks is a homogenization of the output of all 64 channels using different hole sizes at the channel positions. The hole area, which is defined individually for each channel, is inversely proportional to the channel gain. Measurements with the light-attenuating masks inserted improved the channel homogeneity to a ratio of 1:1.2.
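The inverse-gain sizing rule can be illustrated with a toy calculation. In this idealized continuous model the product of transmitted light (proportional to hole area) and channel gain becomes identical for all channels; the 1:1.2 ratio reported above reflects the practical discretization of hole sizes. The gain values below are illustrative.

```python
import numpy as np

# Illustrative inverse-gain mask design for a 64-channel PM: the hole
# area of each channel is made inversely proportional to its gain.
rng = np.random.default_rng(42)
gains = rng.uniform(0.4, 1.0, size=64)   # illustrative relative channel gains

# Normalise so the weakest channel receives the largest hole area (= 1.0)
areas = gains.min() / gains

# Effective channel output: transmitted light ∝ hole area, amplified ∝ gain
output = areas * gains
print(f"spread before mask 1:{gains.max() / gains.min():.2f}, "
      f"after mask 1:{output.max() / output.min():.2f}")
```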
Hotelling’s T² tests in paired and independent survey samples are compared using the traditional asymptotic efficiency concepts of Hodges–Lehmann, Bahadur and Pitman, as well as through criteria based on the volumes of corresponding confidence regions. Conditions characterizing the superiority of a procedure are given in terms of population canonical correlation type coefficients. Statistical tests for checking these conditions are developed. Test statistics based on the eigenvalues of a symmetrized sample cross-covariance matrix are suggested, as well as test statistics based on sample canonical correlation type coefficients.
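The two competing statistics can be sketched side by side: the paired version is computed from the differences of matched observations, while the independent-samples version ignores the pairing and pools the covariances. Sample sizes, the shift, and the cross-correlation below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def hotelling_t2_paired(X, Y):
    """Hotelling's T^2 for a paired design: T^2 = n * dbar' S_d^{-1} dbar,
    computed from the paired differences d_i = x_i - y_i."""
    D = X - Y
    n = D.shape[0]
    dbar = D.mean(axis=0)
    S = np.cov(D, rowvar=False)
    return float(n * dbar @ np.linalg.solve(S, dbar))

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
Y = 0.5 * X + 0.3 + rng.normal(size=(n, p))   # correlated with X, shifted mean

t2_paired = hotelling_t2_paired(X, Y)

# Independent-samples version: same mean difference, pooled covariance
dbar = X.mean(axis=0) - Y.mean(axis=0)
S_pooled = (np.cov(X, rowvar=False) + np.cov(Y, rowvar=False)) / 2
t2_indep = float((n / 2) * dbar @ np.linalg.solve(S_pooled, dbar))
print(f"T^2 paired = {t2_paired:.2f}, T^2 independent = {t2_indep:.2f}")
```

Which statistic is more efficient depends on the cross-covariance between the two samples, which is exactly what the canonical-correlation-type conditions in the paper characterize.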
How different diversity factors affect the perception of first-year requirements in higher education
(2021)
In the light of growing university entry rates, higher education institutions not only serve larger numbers of students, but also seek to meet first-year students’ ever more diverse needs. Yet to inform universities how to support the transition to higher education, research only offers limited insights. Current studies tend to either focus on the individual factors that affect student success or they highlight students’ social background and their educational biography in order to examine the achievement of selected, non-traditional groups of students. Both lines of research appear to lack integration and often fail to take organisational diversity into account, such as different types of higher education institutions or degree programmes. For a more comprehensive understanding of student diversity, the present study includes individual, social and organisational factors. To gain insights into their role for the transition to higher education, we examine how the different factors affect the students’ perception of the formal and informal requirements of the first year as more or less difficult to cope with. As the perceived requirements result from both the characteristics of the students and the institutional context, they make it possible to investigate the transition at the interface of the micro and the meso level of higher education. Latent profile analyses revealed that there are no profiles with complex patterns of perception of the first-year requirements; rather, the identified groups differ in the overall level of perceived challenges. Moreover, structural equation modelling (SEM) indicates that the differences in perception largely depend on the individual factors self-efficacy and volition.
Domain experts regularly teach novice students how to perform a task. This often requires them to adjust their behavior to the less knowledgeable audience and, hence, to behave in a more didactic manner. Eye movement modeling examples (EMMEs) are a contemporary educational tool for displaying experts’ (natural or didactic) problem-solving behavior as well as their eye movements to learners. While research on expert-novice communication mainly focused on experts’ changes in explicit, verbal communication behavior, it is as yet unclear whether and how exactly experts adjust their nonverbal behavior. This study first investigated whether and how experts change their eye movements and mouse clicks (that are displayed in EMMEs) when they perform a task naturally versus teach a task didactically. Programming experts and novices initially debugged short computer codes in a natural manner. We first characterized experts’ natural problem-solving behavior by contrasting it with that of novices. Then, we explored the changes in experts’ behavior when being subsequently instructed to model their task solution didactically. Experts became more similar to novices on measures associated with experts’ automatized processes (i.e., shorter fixation durations, fewer transitions between code and output per click on the run button when behaving didactically). This adaptation might make it easier for novices to follow or imitate the expert behavior. In contrast, experts became less similar to novices for measures associated with more strategic behavior (i.e., code reading linearity, clicks on run button) when behaving didactically.
Modern industry and multi-discipline projects require highly trained individuals with resilient science and engineering backgrounds. Graduates must be able to agilely apply excellent theoretical knowledge in their subject matter as well as essential practical “hands-on” knowledge of diverse working processes to solve complex problems. To meet these demands, university education follows the concept of Constructive Alignment and thus increasingly adapts the teaching of necessary practical skills to actual industry requirements and assessment routines. However, a systematic approach to coherently align these three central teaching demands is strangely absent from current university curricula. We demonstrate the feasibility of implementing practical assessments in a regular theory-based examination, thus defining the term “blended assessment”. We assessed a course for natural science and engineering students pursuing a career in biomedical engineering and evaluated the benefit of blended-assessment exams for students and lecturers. Our controlled study assessed the physiological background of electrocardiograms (ECGs), the practical measurement of ECG curves, and the interpretation of basic pathologic alterations. To study long-term effects, students were assessed on the topic twice with a time lag of 6 months. Our findings suggest a significant improvement in student gain with respect to practical skills and theoretical knowledge. The results of the reassessments support these outcomes. From the lecturers’ point of view, blended assessment complements practical training courses while keeping the organizational effort manageable. We consider blended assessment a viable tool for providing improved student gain in an industry-ready education format that should be evaluated and established further to prepare university graduates optimally for their future careers.
Cement augmentation is an emerging surgical procedure in which bone cement is used to infiltrate and reinforce osteoporotic vertebrae. Although this infiltration procedure has been widely applied, it is performed empirically and little is known about the flow characteristics of cement during the injection process. We present a theoretical and experimental approach to investigate the intertrabecular bone permeability during the infiltration procedure. The cement permeability was considered to be dependent on time, bone porosity, and cement viscosity in our analysis. In order to determine the time-dependent permeability, ten cancellous bone cores were harvested from osteoporotic vertebrae, infiltrated with acrylic cement at a constant flow rate, and the pressure drop across the cores during the infiltration was measured. The viscosity dependence of the permeability was determined based on published experimental data. The theoretical model for the permeability as a function of bone porosity and time was then fit to the testing data. Our findings suggest that the intertrabecular bone permeability depends strongly on time. For instance, the initial permeability (60.89 mm⁴/(N·s)) reduced to approximately 63% of its original value within 18 seconds. This study is the first to analyze cement flow through osteoporotic bone. The theoretical and experimental models provided in this paper are generic. Thus, they can be used to systematically study and optimize the infiltration process for clinical practice.
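The effect of a time-dependent permeability can be sketched with Darcy's law. Only the two figures quoted in the abstract are used; the exponential decay model and the core geometry (Q, L, A) are illustrative assumptions, not the paper's fitted model.

```python
import math

# Sketch of Darcy flow through a bone core with time-dependent permeability,
# calibrated to the two figures in the abstract: an initial permeability of
# 60.89 mm^4/(N*s) falling to 63 % of its value within 18 s.
k0 = 60.89                        # initial permeability (mm^4 / (N*s))
tau = 18.0 / math.log(1 / 0.63)   # time constant of the assumed exponential decay

def permeability(t):
    return k0 * math.exp(-t / tau)

def pressure_drop(t, Q=10.0, L=20.0, A=100.0):
    """Darcy's law dP = Q * L / (k(t) * A) for a core of length L (mm) and
    cross-section A (mm^2) at constant flow rate Q (mm^3/s); values illustrative."""
    return Q * L / (permeability(t) * A)

print(f"k(18 s) = {permeability(18):.2f} mm^4/(N*s) "
      f"({permeability(18) / k0:.0%} of k0)")
print(f"dP rises from {pressure_drop(0):.4f} to {pressure_drop(18):.4f} N/mm^2")
```

At constant flow rate, the falling permeability directly translates into a rising injection pressure, which is why the time dependence matters for planning the clinical infiltration.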
The number of electric vehicles increases steadily, while the space for extending the charging infrastructure is limited. Particularly in urban areas, where parking spaces in attractive locations are scarce, opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited, and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilizations.
We introduce a new way to measure the forecast effort that analysts devote to their earnings forecasts by measuring the analyst's general effort for all covered firms. While the commonly applied effort measure is based on analyst behaviour for one firm, our measure considers analyst behaviour for all covered firms. Our general effort measure captures additional information about analyst effort and thus can identify accurate forecasts. We emphasise the importance of investigating analyst behaviour in a larger context and argue that analysts who generally devote substantial forecast effort are also likely to devote substantial effort to a specific firm, even if this effort might not be captured by a firm-specific measure. Empirical results reveal that analysts who devote higher general forecast effort issue more accurate forecasts. Additional investigations show that analysts' career prospects improve with higher general forecast effort. Our measure improves on existing methods as it has higher explanatory power regarding differences in forecast accuracy than the commonly applied effort measure. Additionally, it can address research questions that cannot be examined with a firm-specific measure. It provides a simple but comprehensive way to identify accurate analysts.
Mouse nongenotoxic hepatocarcinogens phenobarbital (PB) and chlordane induce hepatomegaly characterized by hypertrophy and hyperplasia. Increased cell proliferation is implicated in the mechanism of tumor induction. The relevance of these tumors to human health is unclear. The xenoreceptors constitutive androstane receptor (CAR) and pregnane X receptor (PXR) play key roles in these processes. Novel “humanized” and knockout models for both receptors were developed to investigate potential species differences in hepatomegaly. The effects of PB (80 mg/kg/4 days) and chlordane (10 mg/kg/4 days) were investigated in double humanized PXR and CAR (huPXR/huCAR), double knockout PXR and CAR (PXRKO/CARKO), and wild-type (WT) C57BL/6J mice. In WT mice, both compounds caused increased liver weight, hepatocellular hypertrophy, and cell proliferation. Both compounds caused alterations to a number of cell cycle genes consistent with induction of cell proliferation in WT mice. However, these gene expression changes did not occur in PXRKO/CARKO or huPXR/huCAR mice. Liver hypertrophy without hyperplasia was demonstrated in the huPXR/huCAR animals in response to both compounds. Induction of the CAR and PXR target genes, Cyp2b10 and Cyp3a11, was observed in both WT and huPXR/huCAR mouse lines following treatment with PB or chlordane. In the PXRKO/CARKO mice, neither liver growth nor induction of Cyp2b10 and Cyp3a11 was seen following PB or chlordane treatment, indicating that these effects are CAR/PXR dependent. These data suggest that the human receptors are able to support the chemically induced hypertrophic responses but not the hyperplastic (cell proliferation) responses. At this time, we cannot be certain that hCAR and hPXR when expressed in the mouse can function exactly as the genes do when they are expressed in human cells. However, all parameters investigated to date suggest that much of their functionality is maintained.
While bringing new opportunities, the Industry 4.0 movement also imposes new challenges on the manufacturing industry and all its stakeholders. In this competitive environment, a skilled and engaged workforce is a key to success. Gamification can generate valuable feedback for improving employees’ engagement and performance. Currently, gamification in workspaces focuses on computer-based assignments and training, while tasks that require manual labor are rarely considered. This research provides an overview of enterprise gamification approaches and evaluates their challenges. Based on that, a skill-based gamification framework for manual tasks is proposed, and a case study in the Industry 4.0 model factory is presented.
Hybrid control for autonomous systems — Integrating learning, deliberation and reactive control
(2010)
Graphene oxide (GO) nanoparticles were incorporated in temperature-sensitive poly(N-isopropylacrylamide) (PNIPAAm) hydrogels. The nanoparticles increase the light absorption and convert light energy into heat efficiently. Thus, the hydrogels with GO can be stimulated in a spatially resolved manner by illumination, as demonstrated by IR thermography. The temporal progression of the temperature maximum was detected for different concentrations of GO within the polymer network. Furthermore, the compatibility of PNIPAAm hydrogels with GO and cell cultures was investigated. For this purpose, culture medium was incubated with hydrogels containing GO, and the viability and morphology of Chinese hamster ovary (CHO) cells were examined after several days of culturing in the presence of this medium.