The scientific interest in near-Earth asteroids, as well as the interest in potentially hazardous asteroids from the perspective of planetary defense, has led the space community to focus on near-Earth asteroid mission studies. A multiple near-Earth asteroid rendezvous mission with close-up observations of several objects can help to improve the characterization of these asteroids. This work explores the design of a solar-sail spacecraft for such a mission, focusing on the search for possible sequences of encounters and on trajectory optimization. This is done in two sequential steps: a sequence search by means of a simplified trajectory model and a set of heuristic rules based on astrodynamics, and a subsequent optimization phase. A shape-based approach for solar sailing has been developed and is used for the first phase. The effectiveness of the proposed approach is demonstrated through a fully optimized multiple near-Earth asteroid rendezvous mission. The results show that it is possible to visit five near-Earth asteroids within 10 years with near-term solar-sail technology.
Solidification of silver-germanium alloys in an amorphous matrix aboard the space station Mir
(1993)
Solution of plane anisotropic elastostatical boundary value problems by singular integral equations
(1982)
Many important situations can be modeled with nonlinear parabolic partial differential equations (PDEs) in several dimensions. In general, these PDEs can be solved by discretizing in the spatial variables and transforming them into huge systems of ordinary differential equations (ODEs), which are very stiff. Standard explicit methods therefore require a large number of iterations to solve stiff problems, while implicit schemes are computationally very expensive when solving huge systems of nonlinear ODEs. Several families of Extrapolated Stabilized Explicit Runge-Kutta schemes (ESERK) with different orders of accuracy (3 to 6) are derived and analyzed in this work. They are explicit methods whose stability regions extend along the negative real semi-axis quadratically with respect to the number of stages s; hence they can solve stiff problems much faster than traditional explicit schemes. Additionally, they allow the step length to be adapted easily at very small cost.
Two new families of ESERK schemes (ESERK3 and ESERK6) are derived and analyzed in this work. Each family has more than 50 new schemes, with up to 84,000 stages in the case of ESERK6. For the first time, we also parallelized all these new variable-step-length and variable-number-of-stages algorithms (ESERK3, ESERK4, ESERK5, and ESERK6). These parallelized strategies decrease computation times significantly, as discussed and demonstrated numerically on two problems. Thus, the new codes provide very good results compared to other well-known ODE solvers. Finally, a new strategy is proposed to increase the efficiency of these schemes, and the idea of combining ESERK families in one code is discussed: stiff problems typically have different zones, and the optimum order of convergence differs according to these zones and the requested tolerance.
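The stability-extension idea behind such schemes can be illustrated with the classical first-order Runge-Kutta-Chebyshev recurrence, whose stability interval along the negative real axis also grows quadratically with the number of stages s. A minimal sketch (an undamped textbook variant, not the ESERK schemes of this work; internal stage times are omitted, so it is written for autonomous problems):

```python
def rkc1_step(f, t, y, h, s):
    """One step of the undamped first-order Runge-Kutta-Chebyshev scheme.

    For y' = lambda*y the step amplification is T_s(1 + h*lambda/s**2),
    so the stability interval covers h*lambda in [-2*s**2, 0]: it grows
    quadratically with the number of stages s.
    """
    y_prev = y
    y_curr = y + (h / s**2) * f(t, y)
    for _ in range(2, s + 1):
        y_prev, y_curr = y_curr, 2 * y_curr - y_prev + (2 * h / s**2) * f(t, y_curr)
    return y_curr

# Stiff test problem y' = -500*y: explicit Euler would need h <= 0.004,
# but with s = 6 stages a step of h = 0.1 (h*lambda = -50 >= -2*36) is stable.
lam = 500.0
f = lambda t, y: -lam * y
y, h, s = 1.0, 0.1, 6
for n in range(10):
    y = rkc1_step(f, n * h, y, h, s)
print(abs(y))  # bounded and decaying, unlike explicit Euler at this step size
```

With s stages the cost per step grows only linearly, while the stable step size grows like s**2, which is exactly the trade-off that makes stabilized explicit schemes attractive for stiff semi-discretized PDEs.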
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017-2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and observe our approach to achieve competitive performance in the GermEval 2023 Shared Task On Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for computational analysis of political discourse and the development of semantic role labeling systems.
At present, one of the most serious environmental problems of Central Asia and South Kazakhstan is the ongoing large-scale deterioration of principal urban tree populations. Several major centers of massive spread of invasive plant pests have been found in the urban dendroflora of this region. The degree of damage of the seven most widespread native tree species was found to range from 21.4±1.1 to 85.4±1.8%. In particular, the integrity of the native communities of sycamore (Platanus spp.), willow (Salix spp.), poplar (Populus spp.) and elm (Ulmus spp.) is highly endangered. Our taxonomic analysis of the most dangerous tree pests of the region has identified them as neobiontic xylophilous insects such as Cossus cossus L. (Order: Lepidoptera L.), Monochamus urussovi Fisch., Monochamus sutor L., Acanthocinus aedilis L. and Cetonia aurata L. (Order: Coleoptera L.). We relate the origin of this threatening trend to the import of industrial wood in the mid-1990s, which was associated with a high level of construction work in the region. Because efficient natural predators of the pest species are absent, the application of microbiological methods of pest control and limitation is suggested.
In this work, a spore-based biosensor is evaluated to monitor the microbicidal efficacy of sterilization processes applying gaseous hydrogen peroxide (H2O2). The sensor is based on interdigitated electrode structures (IDEs) that have been fabricated by means of thin-film technologies. Impedimetric measurements are applied to study the effect of the sterilization process on spores of Bacillus atrophaeus. This resilient microorganism is commonly used in industry to verify sterilization efficiency. The sensor measurements are accompanied by conventional microbiological challenge tests, as well as morphological characterizations with scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The sensor measurements are correlated with the microbiological test routines. In both methods, namely the sensor-based and the microbiological one, a tailing effect has been observed. The results are evaluated and discussed in a three-dimensional calibration plot demonstrating the sensor's suitability to enable a rapid process decision in terms of a successfully performed sterilization.
Block ramps are ecologically oriented drop structures with adequate energy dissipation and partially moderate flow velocities. A special case is given with crossbar block ramps, where the upstream and downstream level difference is reduced by a series of basins. To prevent the total structure from failing, the stability of single boulders within the crossbars and of the bed material in between must be guaranteed. The present paper addresses the stability of bed material and scour development for various flow regimes. Any bed material erosion may affect the stability of the crossbar boulders, which in turn can result in major damage to the ramp. Therefore, new design approaches are developed to choose an appropriate bed material size and to avoid failures of crossbar block ramp structures.
Objective
To investigate whether functional brain networks of epilepsy patients treated with antiepileptic medication differ from networks of healthy controls even during the seizure-free interval.
Methods
We applied different rules to construct binary and weighted networks from EEG and MEG data recorded under a resting-state eyes-open and eyes-closed condition from 21 epilepsy patients and 23 healthy controls. The average shortest path length and the clustering coefficient served as global statistical network characteristics.
Results
Independent of the behavioral condition, epileptic brains exhibited a more regular functional network structure. Similarly, the eyes-closed condition was characterized by a more regular functional network structure in both groups. The amount of network reorganization due to behavioral state changes was similar in both groups. Consistent findings were achieved for networks derived from EEG but hardly from MEG recordings, and the network construction rules had a rather strong impact on our findings.
Conclusions
Despite the locality of the investigated processes epileptic brain networks differ in their global characteristics from non-epileptic brain networks. Further methodological developments are necessary to improve the characterization of disturbed and normal functional networks.
Significance
An increased regularity and a diminished modulation capability appear characteristic of epileptic brain networks.
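The two global network characteristics used in this study can be computed directly on a binary adjacency matrix. A minimal sketch on a toy four-node network (an illustration of the metrics only, not the EEG/MEG analysis pipeline of the study):

```python
from collections import deque

def avg_shortest_path_length(adj):
    """Average shortest path length of a connected binary graph, via BFS."""
    n = len(adj)
    total = 0
    for src in range(n):
        dist = [-1] * n
        dist[src] = 0
        q = deque([src])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for d in dist if d > 0)
    return total / (n * (n - 1))

def avg_clustering(adj):
    """Mean local clustering coefficient: fraction of closed neighbour pairs."""
    n = len(adj)
    coeffs = []
    for u in range(n):
        nbrs = [v for v in range(n) if adj[u][v]]
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(adj[a][b] for i, a in enumerate(nbrs) for b in nbrs[i + 1:])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / n

# Toy network: a triangle (0-1-2) with a pendant node 3 attached to node 2.
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
print(avg_shortest_path_length(adj))  # ≈ 1.333
print(avg_clustering(adj))            # ≈ 0.583
```

A shorter average path length together with a higher clustering coefficient indicates a more "small-world" organization; a shift toward higher path lengths, as reported for the epileptic networks here, indicates a more regular (lattice-like) structure.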
The rail business is challenged by long product life cycles and a broad spectrum of assembly groups and single parts. When spare-part obsolescence occurs, quick solutions are needed. Reproduction of obsolete parts is often connected to long waiting times and minimum lot quantities that need to be purchased and stored. Spare-part storage is therefore challenged by growing stocks, bound capital and issues of part ageing. A possible solution could be a virtual storage of spare parts, which would be 3D printed through additive manufacturing technologies in case of sudden demand. As the mechanical properties of additively manufactured parts are guaranteed neither by machine manufacturers nor by service providers, the utilization of this relatively young technology is impeded, and research is required to address these issues. This paper presents an examination of the mechanical properties of specimens manufactured from stainless steel through the selective laser melting (SLM) process. The specimens were produced in multiple batches. The paper examines whether the test results follow a normal distribution and whether mechanical property predictions can be made. The results are compared against existing threshold values provided as the industrial standard. Furthermore, probability predictions are made in order to examine the potential of the SLM process to maintain state-of-the-art mechanical property requirements.
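The kind of probability prediction described above can be sketched with a normal model fitted to batch measurements. The sample values and the 600 MPa threshold below are hypothetical placeholders, not results from this study:

```python
import math

def normal_cdf(x, mu, sigma):
    """P(X <= x) under a normal model with mean mu and std. deviation sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical tensile-strength measurements (MPa) pooled over several
# SLM batches; the real values measured in the study differ.
samples = [642.0, 655.0, 648.0, 660.0, 651.0, 646.0, 658.0, 650.0]
mu = sum(samples) / len(samples)
sigma = math.sqrt(sum((x - mu) ** 2 for x in samples) / (len(samples) - 1))

# Probability that a printed part falls below an assumed industrial
# threshold of 600 MPa, under the fitted normal model.
p_below = normal_cdf(600.0, mu, sigma)
print(f"mean={mu:.1f} MPa, s={sigma:.1f} MPa, P(strength < 600 MPa)={p_below:.2e}")
```

Such a prediction is only meaningful if a normality check (e.g. a goodness-of-fit test on the batch data) does not reject the normal model, which is exactly the question the paper investigates first.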
The paper deals with the asymptotic behaviour of estimators, statistical tests and confidence intervals for L²-distances to uniformity based on the empirical distribution function, the integrated empirical distribution function and the integrated empirical survival function. Approximations of power functions, confidence intervals for the L²-distances and statistical neighbourhood-of-uniformity validation tests are obtained as main applications. The finite sample behaviour of the procedures is illustrated by a simulation study.
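For the empirical distribution function, the L²-distance to uniformity can be evaluated exactly by piecewise integration of the squared difference between the empirical step function and the uniform CDF. A minimal sketch for samples already lying in [0, 1] (one of several estimators studied; the paper additionally treats the integrated empirical distribution and survival functions):

```python
def l2_dist_to_uniform(samples):
    """Exact L2 distance  ∫_0^1 (F_n(x) - x)^2 dx  between the empirical CDF
    of samples in [0, 1] and the uniform CDF F(x) = x.

    F_n is constant (= i/n) between consecutive order statistics, so each
    piece integrates in closed form: ∫_a^b (c - x)^2 dx = ((c-a)^3 - (c-b)^3)/3.
    """
    u = sorted(samples)
    n = len(u)
    knots = [0.0] + u + [1.0]
    total = 0.0
    for i in range(n + 1):  # F_n equals i/n on [knots[i], knots[i+1])
        a, b, c = knots[i], knots[i + 1], i / n
        total += ((c - a) ** 3 - (c - b) ** 3) / 3.0
    return total

print(l2_dist_to_uniform([0.5]))  # 1/12 ≈ 0.0833 for a single central sample
```

Multiplying this distance by n gives the classical Cramér-von Mises statistic, which connects the L²-distance estimator to well-known goodness-of-fit theory.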
In this research work, a statistical analysis of the CO2 laser beam welding of dual phase (DP600)/transformation induced plasticity (TRIP700) steel sheets was done using response surface methodology. The analysis considered the effect of laser power (2–2.2 kW), welding speed (40–50 mm/s) and focus position (−1 to 0 mm) on the heat input, the weld bead geometry, uniaxial tensile strength, formability limited dome height and welding operation cost. The experimental design was based on Box–Behnken design using linear and quadratic polynomial equations for predicting the mathematical models. The results indicate that the proposed models predict the responses adequately within the limits of welding parameters being used and the welding speed is the most significant parameter during the welding process.
The optimization of light output and energy resolution of scintillators is of special interest for the development of high-resolution and high-sensitivity PET. The aim of this work is to obtain statistically reliable results concerning the optimal surface treatment of scintillation crystals and the selection of reflector material. For this purpose, raw, mechanically polished and etched LSO crystals (size 2×2×10 mm3) were combined with various reflector materials (Teflon tape, Teflon matrix, BaSO4) and exposed to a 22Na source. In order to ensure the statistical reliability of the results, groups of 10 LSO crystals each were measured for all combinations of surface treatment and reflector material. Using no reflector material, the light output increased up to 551±35% by mechanically polishing the surface, compared to 100±5% for raw crystals. Etching the surface increased the light output to 441±29%. The untreated crystals had an energy resolution of 24.6±4.0%. By mechanically polishing the surface it was possible to achieve an energy resolution of 13.2±0.8%, by etching of 14.8±0.7%. In combination with BaSO4 as reflector material, the maximum increase of light output was found to be 932±57% for mechanically polished and 895±61% for etched crystals. The combination with BaSO4 also caused the best improvement of the energy resolution, up to 11.6±0.2% for mechanically polished and 12.2±0.3% for etched crystals. Regarding the light output, there was no significant statistical difference between the two surface treatments in combination with BaSO4. In contrast, the statistical results for the energy resolution have shown the combination of mechanical polishing and BaSO4 to be the optimum.
The construction of a statistical test is investigated which is based only on “reliability” and “precision” as quality criteria. The reliability of a statistical test is quantified in a straightforward way by the probability that the decision of the test is correct. However, the quantification of the precision of a statistical test is not at all evident. Therefore the paper presents and discusses several approaches. Moreover the distinction of “null hypothesis” and “alternative hypothesis” is not necessary any longer.
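The reliability criterion, the probability that the test's decision is correct, can be illustrated for a simple symmetric two-decision problem. The hypotheses, sample size, cutoff, and equal prior weighting below are illustrative assumptions, not the paper's construction:

```python
import math

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Two-decision problem for a coin: decide "p = 0.7" if at least `cutoff`
# heads are observed in n tosses, otherwise decide "p = 0.5".
# With equal weight on both hypotheses, the reliability is the overall
# probability that this decision is correct.
n, cutoff = 100, 60
correct_h0 = 1.0 - binom_tail(n, cutoff, 0.5)  # correctly decide "p = 0.5"
correct_h1 = binom_tail(n, cutoff, 0.7)        # correctly decide "p = 0.7"
reliability = 0.5 * correct_h0 + 0.5 * correct_h1
print(f"reliability = {reliability:.4f}")
```

Note that both decisions are treated symmetrically here: neither hypothesis plays the privileged role of a "null hypothesis", which mirrors the paper's point that this distinction becomes unnecessary under a pure reliability/precision formulation.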