Institute
- Fachbereich Elektrotechnik und Informationstechnik (38)
- Fachbereich Luft- und Raumfahrttechnik (20)
- Fachbereich Medizintechnik und Technomathematik (13)
- Fachbereich Energietechnik (12)
- ECSM European Center for Sustainable Mobility (10)
- Fachbereich Maschinenbau und Mechatronik (8)
- Solar-Institut Jülich (6)
- INB - Institut für Nano- und Biotechnologien (4)
- IfB - Institut für Bioengineering (4)
- Fachbereich Wirtschaftswissenschaften (3)
Language
- English (102)
Document Type
- Article (62)
- Conference Proceeding (29)
- Part of a Book (5)
- Book (4)
- Conference: Meeting Abstract (2)
Keywords
- Blitzschutz (2)
- Erneuerbare Energien (2)
- Hybridsystem (2)
- Lightning protection (2)
- cyber physical production system (2)
- digital shadow (2)
- humans (2)
- Actuator disk modelling (1)
- Aeroelasticity (1)
- Antarctica (1)
Magnetic nanoparticles (MNPs) are used as therapeutic and diagnostic agents for local delivery of heat and image contrast enhancement in diseased tissue. Besides magnetization, the most important parameter determining their performance in these applications is their magnetic relaxation, which can be affected when MNPs immobilize and agglomerate inside tissues. In this letter, we investigate the magnetic relaxation properties of different MNP agglomeration states under excitation in alternating fields and relate them to heating efficiency and imaging properties. With a focus on magnetic fluid hyperthermia, two different trends in MNP heating efficiency are measured: an increase by up to 23% for agglomerated MNPs in suspension and a decrease by up to 28% for mixed states of agglomerated and immobilized MNPs, which indicates that immobilization is the dominant effect. The same, comparatively moderate effects are obtained for the signal amplitude in magnetic particle spectroscopy.
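For context: heating efficiency in magnetic fluid hyperthermia is commonly quantified by the specific loss power (SLP), estimated calorimetrically from the initial heating rate. This is the standard definition, not a formula quoted from the letter:

\[ \mathrm{SLP} = \frac{C\, m_\mathrm{s}}{m_\mathrm{Fe}} \left.\frac{dT}{dt}\right|_{t=0} \]

where \(C\) is the specific heat capacity of the suspension, \(m_\mathrm{s}\) the sample mass, \(m_\mathrm{Fe}\) the iron mass, and \(\left.dT/dt\right|_{t=0}\) the initial temperature rise rate under alternating-field excitation.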
With the many achievements of Machine Learning in the past years, it is likely that the sub-area of Deep Learning will continue to deliver major technological breakthroughs [1]. In order to achieve the best results, it is important to know the various Deep Learning frameworks and their respective properties. This paper provides a comparative overview of some of the most popular frameworks. First, the comparison methods and criteria are introduced and described with a focus on computer vision applications: Features and Uses are examined by evaluating papers and articles, while Adoption and Popularity are determined by analyzing a data science study. Then, the frameworks TensorFlow, Keras, PyTorch, and Caffe are compared based on the previously described criteria to highlight their properties and differences. Advantages and disadvantages are compared, enabling researchers and developers to choose a framework according to their specific needs.
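To make the API style differences concrete, here is a minimal sketch (not taken from the paper) of the same small classifier declared in two of the compared frameworks, Keras and PyTorch:

```python
# Minimal sketch (not from the paper): the same small image classifier declared
# in Keras and in PyTorch, to illustrate the API style differences compared.

# Keras (TensorFlow): declarative, layer-list style.
import tensorflow as tf

keras_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
keras_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# PyTorch: imperative, module-composition style.
import torch.nn as nn

torch_model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),   # raw logits; softmax is folded into the loss below
)
loss_fn = nn.CrossEntropyLoss()
```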
4CH TX/RX Surface Coil for 7T: Design, Optimization and Application for Cardiac Function Imaging
(2010)
Practical impediments of ultra-high-field cardiovascular MR (CVMR) include exacerbated magnetic field and radio frequency (RF) inhomogeneities, susceptibility and off-resonance effects, conductive and dielectric effects in tissue, and RF power deposition constraints, all of which bear the potential to spoil the benefit of CVMR at 7T. Therefore, a four-element cardiac transceiver surface coil array was developed. Cardiac imaging provided clinically acceptable signal homogeneity with an excellent blood-myocardium contrast. Subtle anatomic structures, such as the pericardium, mitral and tricuspid valves and their apparatus, papillary muscles, and trabeculae, were accurately delineated.
Purpose
To design and evaluate a four-channel cardiac transceiver coil array for functional cardiac imaging at 7T.
Materials and Methods
A four-element cardiac transceiver surface coil array was developed with two rectangular loops mounted on an anterior former and two rectangular loops on a posterior former. Specific absorption rate (SAR) simulations were performed, and a B1+ calibration method was applied before acquiring 2D FLASH CINE (mSENSE, R = 2) images from nine healthy volunteers with a spatial resolution of up to 1 × 1 × 2.5 mm³.
Results
Tuning and matching were found to be better than 10 dB for all subjects. The decoupling (S21) was measured to be >18 dB between neighboring loops, >20 dB for opposite loops, and >30 dB for other loop combinations. SAR values were well within the limits provided by the IEC. Imaging provided clinically acceptable signal homogeneity with an excellent blood-myocardium contrast applying the B1+ calibration approach.
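For orientation (standard dB arithmetic, not a figure from the paper), a decoupling of \(S_{21} = -18\,\mathrm{dB}\) corresponds to a coupled power fraction of

\[ \frac{P_\mathrm{coupled}}{P_\mathrm{in}} = 10^{S_{21}/(10\,\mathrm{dB})} = 10^{-1.8} \approx 1.6\,\% , \]

i.e., less than 2% of the power fed into one loop leaks into its neighbor.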
Conclusion
A four-channel cardiac transceiver coil array for 7T was built, allowing for cardiac imaging with clinically acceptable signal homogeneity and an excellent blood-myocardium contrast. Minor anatomic structures, such as pericardium, mitral, and tricuspid valves and their apparatus, as well as trabeculae, were accurately delineated.
In this chapter, the key technologies and the instrumentation required for the subsurface exploration of ocean worlds are discussed. The focus is placed on Jupiter’s moon Europa and Saturn’s moon Enceladus because they have the highest potential for such missions in the near future. The exploration of their oceans requires landing on the surface, penetrating the thick ice shell with an ice-penetrating probe, and probably diving with an underwater vehicle through dozens of kilometers of water to the ocean floor, to have the chance to find life, if it exists. Technologically, such missions are extremely challenging. The required key technologies include power generation, communications, pressure resistance, radiation hardness, corrosion protection, navigation, miniaturization, autonomy, and sterilization and cleaning. Simpler mission concepts involve impactors and penetrators or – in the case of Enceladus – plume-fly-through missions.
Textile reinforced concrete. Part I: Process model for collaborative research and development
(2003)
A High-Throughput Functional Complementation Assay for Classification of BRCA1 Missense Variants
(2013)
Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid-wide scheduler. This approach has various limitations, particularly if human interactions are to be integrated into workflow execution. To support human interactions while enabling inter-organizational computation and community approaches, this poster paper proposes the idea of a pull-based task distribution strategy. Here, heterogeneous resources, including humans, actively select tasks for execution from a central repository. This leads to special demands regarding security issues such as access control. In the established push-based job execution, the resources are responsible for granting access to workflows and job initiators. In general this is done via access control lists, where users are explicitly mapped to local accounts according to their policies. In the pull-based approach, the resources actively apply for job executions by sending requests to a central task repository. This means that every resource has to be able to authenticate against the repository to be authorized for task execution. In other words, the authorization is relocated from the resources to the repository. The poster paper introduces current work on these security aspects of the pull-based approach within the scope of the project “HiX4AGWS”.
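As a rough illustration of this relocated authorization (all names are hypothetical placeholders, not the HiX4AGWS implementation), a central task repository might look like this:

```python
# Hypothetical sketch of a pull-based task repository. Resources authenticate
# against the central repository and actively pull tasks, instead of a
# scheduler pushing jobs to them. All names and the token check are
# illustrative placeholders.
import queue
from dataclasses import dataclass

@dataclass
class Task:
    task_id: str
    payload: str

class TaskRepository:
    def __init__(self, authorized_tokens: set):
        self._tokens = authorized_tokens  # stands in for real credential checks
        self._tasks = queue.Queue()

    def submit(self, task: Task) -> None:
        """The workflow engine places a task into the central repository."""
        self._tasks.put(task)

    def pull(self, token: str):
        """A resource (machine or human worklist client) requests a task.
        Authorization happens here, at the repository, not at the resource."""
        if token not in self._tokens:
            raise PermissionError("resource not authorized for task execution")
        try:
            return self._tasks.get_nowait()
        except queue.Empty:
            return None

# Usage: a worker polls the repository with its credential.
repo = TaskRepository(authorized_tokens={"resource-42"})
repo.submit(Task("t1", "run simulation step"))
task = repo.pull("resource-42")  # -> Task(task_id='t1', ...)
```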
High aerodynamic efficiency requires propellers with high aspect ratios, while propeller sweep potentially reduces noise. Propeller sweep and high aspect ratios increase elasticity and coupling of structural mechanics and aerodynamics, affecting the propeller performance and noise. Therefore, this paper analyzes the influence of elasticity on forward-swept, backward-swept, and unswept propellers in hover conditions. A reduced-order blade element momentum approach is coupled with a one-dimensional Timoshenko beam theory and Farassat's formulation 1A. The results of the aeroelastic simulation are used as input for the aeroacoustic calculation. The analysis shows that elasticity influences noise radiation because thickness and loading noise respond differently to deformations. In the case of the backward-swept propeller, the location of the maximum sound pressure level shifts forward by 0.5°, while in the case of the forward-swept propeller, it shifts backward by 0.5°. Therefore, aeroacoustic optimization requires the consideration of propeller deformation.
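For reference, the sectional loads in such a reduced-order blade element momentum approach follow the standard form (general textbook notation per blade, not the paper's exact formulation):

\[ dT = \tfrac{1}{2}\rho W^2 c\,(C_l\cos\varphi - C_d\sin\varphi)\,dr, \qquad dQ = \tfrac{1}{2}\rho W^2 c\,(C_l\sin\varphi + C_d\cos\varphi)\,r\,dr, \]

where \(W\) is the local inflow velocity at the blade section, \(\varphi\) the inflow angle, \(c\) the local chord, and \(C_l, C_d\) the 2D airfoil coefficients; elastic twist and bending deformations enter through their effect on \(\varphi\) and the section orientation.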
This paper compares several blade element theory (BET) method-based propeller simulation tools, including an evaluation against static propeller ground tests and high-fidelity Reynolds-Averaged Navier–Stokes (RANS) simulations. Two proprietary propeller geometries for paraglider applications are analysed in static and flight conditions. The RANS simulations are validated with the static test data and used as a reference for comparing the BET tools in flight conditions. The comparison includes the analysis of varying 2D aerodynamic airfoil parameters and different induced velocity calculation methods. The evaluation shows the strength of the BET tools compared to RANS simulations: the RANS simulations underpredict the static experimental data within 10% relative error, while appropriate BET tools overpredict the RANS results by 15–20% relative error. A variation in 2D aerodynamic data shows the need for highly accurate 2D data to obtain accurate BET results. The nonlinear BET coupled with XFOIL for the 2D aerodynamic data matches RANS best in static operation and flight conditions. The novel BET tool PropCODE combines both approaches and offers further correction models for highly accurate static and flight condition results.
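To illustrate what an induced velocity calculation in a BET tool amounts to, here is a minimal, hypothetical fixed-point iteration for the axial induction factor of an advancing propeller section (linear lift polar, drag neglected; not code from PropCODE or any of the compared tools):

```python
# Hypothetical minimal induced-velocity calculation for one blade section of an
# advancing propeller, via fixed-point iteration of the blade-element/momentum
# balance. Linear lift polar, drag neglected.
import math

def axial_induction(r, chord, twist, V_inf, omega,
                    B=2, cl_alpha=2 * math.pi, tol=1e-8, max_iter=500):
    """Axial induction factor a at radius r (requires V_inf > 0)."""
    sigma = B * chord / (2 * math.pi * r)             # local solidity
    a = 0.0
    for _ in range(max_iter):
        phi = math.atan2(V_inf * (1 + a), omega * r)  # inflow angle
        alpha = twist - phi                           # local angle of attack
        cl = cl_alpha * alpha                         # linear lift polar
        # blade-element/momentum balance: a / (1 + a) = k, valid for k < 1
        k = sigma * cl * math.cos(phi) / (4 * math.sin(phi) ** 2)
        a_new = k / (1 - k)
        if abs(a_new - a) < tol:
            return a_new
        a = 0.5 * (a + a_new)                         # relaxation for stability
    return a

# Example: one section at r = 0.3 m of a small paraglider propeller
# (all numbers made up for illustration).
a = axial_induction(r=0.3, chord=0.05, twist=math.radians(15),
                    V_inf=10.0, omega=250.0)
```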
Information technologies, such as big data analytics, cloud computing, cyber-physical systems, robotic process automation, and the internet of things, provide a sustainable impetus for the structural development of business sectors as well as the digitalization of markets, enterprises, and processes. Within the consulting industry, the proliferation of these technologies has opened up the new segment of digital transformation, which focuses on setting up, controlling, and implementing projects for enterprises from a broad range of sectors. These recent developments raise the question of which requirements evolve for IT consultants as important success factors of such digital transformation projects. Therefore, this empirical contribution provides indications regarding the qualifications and competences necessary for IT consultants in the era of digital transformation from a labor market perspective. On the one hand, this knowledge base is interesting for the academic education of consultants, since it supports a market-oriented design of adequate training measures. On the other hand, insights into the competence requirements for consultants are relevant for skill and talent management processes in consulting practice. Assuming that consulting companies pursue a strategic human resource management approach, labor market information may also be useful for discovering strategic behavioral patterns.
The continuing growth of scientific publications raises the question of how research processes can be digitalized and thus realized more productively. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. Various information systems exist to support the search process; however, the analysis of the published content is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype by using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
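The pipeline idea can be sketched generically: retrieve records from a discovery service, then analyze the content automatically. The endpoint URL and response fields below are invented placeholders; this is not the actual EBSCO Discovery Service or IBM Watson Explorer API.

```python
# Hypothetical sketch of the pipeline idea only: retrieve records from a
# discovery service, then analyze the abstracts automatically. Endpoint and
# response schema are invented placeholders.
from collections import Counter
import requests

DISCOVERY_URL = "https://discovery.example.org/search"  # placeholder endpoint

def fetch_abstracts(query: str, limit: int = 100):
    resp = requests.get(DISCOVERY_URL, params={"q": query, "limit": limit})
    resp.raise_for_status()
    return [rec["abstract"] for rec in resp.json()["records"]]  # assumed schema

def term_frequencies(abstracts, top: int = 20):
    """Very simple stand-in for the content analysis step: most frequent
    terms across all retrieved abstracts."""
    counts = Counter()
    for text in abstracts:
        counts.update(w.lower().strip(".,;:()") for w in text.split() if len(w) > 3)
    return counts.most_common(top)
```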
The benefits of robotic process automation (RPA) are highly related to the usage of commercial off-the-shelf (COTS) software products that can be easily implemented and customized by business units. But how can the best-fitting RPA product for a specific situation be found, one that actually creates the expected benefits? This question belongs to the general area of software evaluation and selection. In the face of more than 75 RPA products currently on the market, guidance considering RPA specifics is required. Therefore, this chapter proposes a criteria-based selection method specifically for RPA. The method includes a quantitative evaluation of costs and benefits as well as a qualitative utility analysis based on functional criteria. By using the visualization of financial implications (VOFI) method, an application-oriented structure is provided that contrasts the total cost of ownership with the time savings times salary (TSTS). For the utility analysis, a detailed list of functional criteria for RPA is offered. The whole method is based on a multi-vocal review of scientific and non-scholarly literature, including publications by business practitioners, consultants, and vendors. The application of the method is illustrated by a concrete RPA example. The illustrated structures, templates, and criteria can be directly utilized by practitioners in their real-life RPA implementations. In addition, a normative decision process for selecting RPA alternatives is proposed before the chapter closes with a discussion and outlook.
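A hypothetical worked example of the two evaluation steps described above (TCO versus TSTS, plus a weighted utility analysis) might look as follows; all figures, products, criteria, and weights are invented for illustration:

```python
# Hypothetical worked example: quantitative comparison of total cost of
# ownership (TCO) against time savings times salary (TSTS), and a weighted
# utility analysis over functional criteria. All numbers are invented.

def tsts(hours_saved_per_year: float, hourly_salary: float) -> float:
    """Benefit estimate: time savings times salary."""
    return hours_saved_per_year * hourly_salary

def utility_score(ratings: dict, weights: dict) -> float:
    """Weighted utility analysis over functional criteria (ratings 0-10)."""
    return sum(weights[c] * ratings[c] for c in weights)

tco = {"ProductA": 40_000.0, "ProductB": 55_000.0}             # EUR per year
benefit = {"ProductA": tsts(1200, 45.0), "ProductB": tsts(1500, 45.0)}
weights = {"recorder": 0.40, "orchestration": 0.35, "ocr": 0.25}
ratings = {
    "ProductA": {"recorder": 8, "orchestration": 6, "ocr": 5},
    "ProductB": {"recorder": 7, "orchestration": 9, "ocr": 8},
}
for p in ("ProductA", "ProductB"):
    print(p, "net benefit:", benefit[p] - tco[p],
          "utility:", round(utility_score(ratings[p], weights), 2))
```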