Institute: Fachbereich Medizintechnik und Technomathematik; Document Type: Conference Proceeding (161 entries, no fulltext)
Clearance of blood components and fluid drainage play a crucial role in subarachnoid hemorrhage (SAH) and post-hemorrhagic hydrocephalus (PHH). With the involvement of interstitial fluid (ISF) and cerebrospinal fluid (CSF), two pathways for the clearance of fluid and solutes in the brain have been proposed. Starting at the level of the capillaries, ISF flows along the basement membranes in the walls of cerebral arteries out of the parenchyma to drain into the lymphatics and the CSF [1]–[3]. Conversely, it has been shown that CSF enters the parenchyma between the glial and pial basement membranes of penetrating arteries [4]–[6]. Nevertheless, the structures involved and the contribution of either flow pathway to the fluid balance between the subarachnoid space and the interstitial space remain controversial. Low-frequency oscillations in vascular tone are referred to as vasomotion, and the corresponding vasomotion waves have been modeled as the driving force for the flow of ISF out of the parenchyma [7]. Retinal vessel analysis (RVA) allows non-invasive measurement of retinal vessel vasomotion in terms of diameter changes [8]. Thus, the aim of this study is to investigate vasomotion in RVA signals of SAH and PHH patients.
Hypertension describes the pathological increase of blood pressure, which is most commonly associated with an increase in vascular wall stiffness [1]. According to the "Deutsche Bluthochdruck Liga", this pathology shows a growing trend in our aging society. In order to find novel pharmacological and possibly personalized treatments, we present a functional approach to studying the biomechanical properties of a human aortic vascular model.
In this methods review, we give an overview of recent studies carried out with the CellDrum technology [2] and underline its added value compared to existing standard procedures known from the field of physiology.
The CellDrum technology described herein is a system for measuring the functional mechanical properties of cell monolayers and thin tissue constructs in vitro. Additionally, the CellDrum makes it possible to elucidate the mechanical response of cells to pharmacological drugs, toxins, and vasoactive agents. Due to its highly flexible polymer support, cells can also be mechanically stimulated by steady and cyclic biaxial stretching.
Fields of asymmetric tensors play an important role in many applications, such as medical imaging (diffusion tensor magnetic resonance imaging), physics, and civil engineering (for example, the Cauchy–Green deformation tensor or the strain tensor with local rotations). However, such asymmetric tensors are usually symmetrized before further processing, which results in a loss of information. A new method for processing asymmetric tensor fields is proposed, restricting our attention to second-order tensors given by a 2x2 array or matrix with real entries. The method rests on a transformation that yields Hermitian matrices, which have an eigendecomposition analogous to that of symmetric matrices. With this new idea, numerical results are given for real-world data arising from the deformation of an object by external forces. It is shown that the asymmetric part indeed contains valuable information.
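One natural way to realize such a transformation (an illustrative assumption; the paper's exact construction may differ) splits the tensor T = S + K into its symmetric part S and antisymmetric part K and forms the Hermitian matrix S + iK, whose eigenvalues are real but still reflect the rotational content:

```python
import math

def hermitian_eigenvalues_2x2(t11, t12, t21, t22):
    """Eigenvalues of the Hermitian matrix built from a real 2x2 tensor.

    S + iK is Hermitian ((S + iK)^H = S^T - iK^T = S + iK), so it has
    real eigenvalues while still carrying the antisymmetric (rotational)
    information that plain symmetrization would discard.
    """
    s_off = 0.5 * (t12 + t21)        # symmetrized off-diagonal entry of S
    k = 0.5 * (t12 - t21)            # antisymmetric scalar (local rotation)
    b_sq = s_off ** 2 + k ** 2       # |off-diagonal of S + iK|^2
    mean = 0.5 * (t11 + t22)
    radius = math.sqrt((0.5 * (t11 - t22)) ** 2 + b_sq)
    return mean - radius, mean + radius

# Asymmetric example tensor, e.g. a deformation gradient with rotation.
lo, hi = hermitian_eigenvalues_2x2(2.0, 1.0, 0.5, 3.0)
```

The eigenvalue spread of S + iK is strictly larger than that of the symmetrized tensor whenever k is nonzero, which is one way the asymmetric part remains visible in the result.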
Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios, the dedicated resources execute submitted jobs according to the request of a workflow engine or a Grid-wide scheduler. This approach has various limitations, particularly if human interactions are to be integrated into workflow execution. To support human interactions while enabling inter-organizational computation and community approaches, this poster paper proposes a pull-based task distribution strategy in which heterogeneous resources, including humans, actively select tasks for execution from a central repository. This leads to special demands regarding security issues such as access control. In established push-based job execution, the resources are responsible for granting access to workflows and job initiators. In general, this is done via access control lists, where users are explicitly mapped to local accounts according to their policies. In the pull-based approach, the resources actively apply for job execution by sending requests to a central task repository. This means that every resource has to be able to authenticate against the repository in order to be authorized for task execution; in other words, authorization is relocated from the resources to the repository. The poster paper introduces current work on these security aspects of the pull-based approach within the scope of the project "HiX4AGWS".
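The relocated authorization can be sketched as a minimal task repository where a resource must first obtain a credential before it may pull work (class and method names and the token scheme are illustrative assumptions, not the HiX4AGWS design):

```python
import secrets

class TaskRepository:
    """Central repository: resources pull tasks instead of being pushed jobs."""

    def __init__(self):
        self._tasks = []
        self._tokens = {}                  # token -> resource id

    def register_resource(self, resource_id):
        # Authorization lives at the repository: each resource obtains
        # a credential before it is allowed to pull tasks.
        token = secrets.token_hex(8)
        self._tokens[token] = resource_id
        return token

    def submit(self, task):
        self._tasks.append(task)

    def pull(self, token):
        # Resources actively apply for work; unauthenticated pulls fail.
        if token not in self._tokens:
            raise PermissionError("resource not authorized")
        return self._tasks.pop(0) if self._tasks else None

repo = TaskRepository()
repo.submit("render job 17")
tok = repo.register_resource("worker-a")
task = repo.pull(tok)
```

A human participant fits the same interface: a web front end would simply call `pull` on the person's behalf with that person's credential.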
In the energy economy, forecasts of different time series are rudimentary. In this study, a prediction for the German day-ahead spot market is created with Apache Spark and R. This is just one example among many applications in virtual power plant environments: intraday price processes, load processes of machines or electric vehicles, real-time energy loads of photovoltaic systems, and many more time series need to be analysed and predicted.
This work gives a short introduction to the project in which this study is situated. It briefly describes the time-series methods used in the energy industry for forecasting. Apache Spark, a powerful cluster-computing technology, is utilised as the programming framework. Today, single time series can be predicted; the focus of this work is on developing a method for parallel forecasting, i.e. processing multiple time series simultaneously with R and Apache Spark.
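The parallel-forecasting pattern amounts to fitting one independent model per time series, concurrently. A minimal sketch with Python's standard library stands in for Spark and R here; the naive drift forecast and the series names are illustrative assumptions, not the study's models:

```python
from concurrent.futures import ThreadPoolExecutor

def drift_forecast(series, horizon=3):
    """Naive drift forecast: continue the average step of the series."""
    step = (series[-1] - series[0]) / (len(series) - 1)
    return [series[-1] + step * (h + 1) for h in range(horizon)]

# Each entry is an independent time series from a virtual power plant.
series_by_id = {
    "day_ahead_price": [30.0, 32.0, 31.0, 34.0],
    "pv_load": [5.0, 6.0, 8.0, 9.0],
}

# Forecast all series simultaneously; Spark would distribute the same
# per-series task over cluster nodes instead of local threads.
with ThreadPoolExecutor() as pool:
    forecasts = dict(zip(series_by_id,
                         pool.map(drift_forecast, series_by_id.values())))
```

Because the series are independent, this is embarrassingly parallel, which is exactly why a cluster framework such as Spark scales it well.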
A capacitive electrolyte-insulator-semiconductor (EISCAP) biosensor modified with Tobacco mosaic virus (TMV) particles for the detection of acetoin is presented. The enzyme acetoin reductase (AR) was immobilized on the surface of the EISCAP using TMV particles as nanoscaffolds. The study focused on the optimization of the TMV-assisted AR immobilization on the Ta₂O₅-gate EISCAP surface. The TMV-assisted acetoin EISCAPs were electrochemically characterized by means of leakage-current, capacitance-voltage, and constant-capacitance measurements. The TMV-modified transducer surface was studied via scanning electron microscopy.
When confining pressure is low or absent, extensional fractures are typical, occurring on unloaded planes in the rock. These "paradox" fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock was developed, but it makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes that are unexpected in the common understanding of the failure of cohesive-frictional materials. In the progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD), and as a failure surface at peak (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass than the simple extension strain criterion.
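For reference, the textbook forms that the enriched criterion builds on can be written as follows (standard notation; the specific weight of the added principal shear term is defined in the paper itself and is not reproduced here):

```latex
% Simple extension strain criterion: fracture initiates once the
% largest extensional strain reaches a critical value
\varepsilon \ge \varepsilon_c
% Mohr--Coulomb criterion, with cohesion c and friction angle \varphi,
% which the enriched (shear-weighted) criterion is tuned to reproduce
|\tau| = c + \sigma_n \tan\varphi
```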
The progress in natural language processing (NLP) research over recent years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which often impedes enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes into production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness with respect to the business goal. We introduce STAMP 4 NLP, an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows prototypes to be created efficiently by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never-deployed experiments.
In positron emission tomography, improving the time, energy, and spatial resolutions of detectors and exploiting Compton kinematics make it possible to reconstruct a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single-scattered coincidences alone is of the same order of magnitude as the number of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, i.e. a broken line of response. Compton kinematics collimation reduces the number of solutions, and time-of-flight information additionally helps to localize the position of the annihilation. One question of this investigation is how the energy, spatial, and temporal resolutions help to confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates their influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact: improving the latter from 400 ps to 100 ps shrinks the annihilation volume by around 50%, while improving the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended to multiple scatter coincidences and subsequently validated with a reconstruction algorithm.
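The timing numbers above can be put into perspective with the standard time-of-flight relation Δx = c·Δt/2, which bounds the annihilation position along the line of response (a back-of-envelope illustration of why better timing shrinks the annihilation volume, not the paper's GATE simulation):

```python
C_MM_PER_PS = 0.299792458  # speed of light in mm per picosecond

def tof_position_uncertainty_mm(coincidence_time_resolution_ps):
    """Spatial uncertainty along the line of response: dx = c * dt / 2."""
    return C_MM_PER_PS * coincidence_time_resolution_ps / 2.0

dx_400 = tof_position_uncertainty_mm(400)   # ~60 mm segment of the LOR
dx_100 = tof_position_uncertainty_mm(100)   # ~15 mm segment of the LOR
```

Going from 400 ps to 100 ps thus confines the annihilation point to a four times shorter segment along each line of response, which is consistent with the substantial volume reduction reported above.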