The chemical imaging sensor is a field-effect sensor which is able to visualize both the distribution of ions (in LAPS mode) and the distribution of impedance (in SPIM mode) in the sample. In this study, a novel wound-healing assay is proposed, in which the chemical imaging sensor operated in SPIM mode is applied to monitor the defect of a cell layer brought into proximity of the sensing surface. A reduced impedance inside the defect, which was artificially formed in a cell layer, was successfully visualized in a photocurrent image.
The chemical imaging sensor, which is based on the principle of the light-addressable potentiometric sensor (LAPS), is a powerful tool to visualize the spatial distribution of chemical species on the sensor surface. The spatial resolution of this sensor depends on the diffusion of photocarriers excited by a modulated light. In this study, a novel hybrid fiber-optic illumination was developed to enhance the spatial resolution. It consists of a modulated light probe to generate a photocurrent signal and a ring of constant light, which suppresses the lateral diffusion of minority carriers excited by the modulated light. It is demonstrated that the spatial resolution was improved from 92 μm to 68 μm.
The chemical imaging sensor is a chemical sensor which is capable of visualizing the spatial distribution of chemical species in sample solution. In this study, a novel measurement system based on the chemical imaging sensor was developed to observe the inside of a Y-shaped microfluidic channel while injecting two sample solutions from two branches. From the collected chemical images, it was clearly observed that the injected solutions formed laminar flows in the microfluidic channel. In addition, ion diffusion across the laminar flows was observed. This label-free method can acquire quantitative data of ion distribution and diffusion in microfluidic devices, which can be used to determine the diffusion coefficients, and therefore, the molecular weights of chemical species in the sample solution.
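As a sketch of the final step described above, a diffusion coefficient extracted from such chemical images relates to the hydrodynamic radius (and hence molecular size) via the Stokes-Einstein relation; the numbers below are assumed illustrative values, not measurements from the study:

```python
import math

def hydrodynamic_radius(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein relation: r = k_B * T / (6 * pi * eta * D).

    D   : diffusion coefficient in m^2/s
    T   : temperature in K (default 25 degC)
    eta : dynamic viscosity of water in Pa*s at 25 degC
    """
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (6 * math.pi * eta * D)

# Assumed diffusion coefficient of a small ion, ~1.5e-9 m^2/s
r = hydrodynamic_radius(1.5e-9)  # hydrodynamic radius in metres
```

For a small ion this yields a radius of roughly 0.16 nm, the expected order of magnitude.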
The chemical imaging sensor was applied to in-situ pH imaging of the solution in the vicinity of a corroding surface of stainless steel under potentiostatic polarization. A test piece of polished stainless steel was placed on the sensing surface leaving a narrow gap filled with artificial seawater and the stainless steel was corroded under polarization. The pH images obtained during polarization showed correspondence between the region of lower pH and the site of corrosion. It was also found that the pH value in the gap became as low as 2 by polarization, which triggered corrosion.
Quartz crystal nanobalance (QCN) sensors are considered powerful mass-sensitive sensors for determining materials at the sub-nanogram level. In this study, a single piezoelectric quartz crystal nanobalance modified with polystyrene was employed to detect benzene, toluene, ethylbenzene and xylene (BTEX compounds). The frequency shift of the QCN sensor was found to be linear against the BTEX compound concentrations in the range of about 1-45 mg l⁻¹. The correlation coefficients for benzene, toluene, ethylbenzene, and xylene were 0.991, 0.9977, 0.9946 and 0.9971, respectively. Principal component analysis was also utilized to process the frequency response data of the single piezoelectric crystal at different times, considering the different adsorption-desorption dynamics of the BTEX compounds. Using principal component analysis, it was found that over 90% of the data variance could still be explained by the use of two principal components (PC1 and PC2). Subsequently, the successful identification of benzene and toluene was possible through principal component analysis of the transient responses of the polystyrene-modified QCN sensor. The results showed that the polystyrene-modified QCN had favorable identification and quantification performance for the BTEX compounds.
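The PCA step described above can be sketched on synthetic data; the transient-response matrix below is simulated from two assumed adsorption-desorption patterns, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic transient frequency responses: 40 measurements x 10 time points,
# dominated by two latent adsorption-desorption patterns (assumed shapes).
t = np.linspace(0.0, 1.0, 10)
scores_true = rng.normal(size=(40, 2))
patterns = np.vstack([np.exp(-3.0 * t), 1.0 - np.exp(-5.0 * t)])
X = scores_true @ patterns + 0.01 * rng.normal(size=(40, 10))

# PCA via SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per component
pc_scores = Xc @ Vt[:2].T         # projection onto PC1 and PC2
```

Since the data contain only two latent patterns plus small noise, PC1 and PC2 together explain well over 90% of the variance, mirroring the finding in the abstract.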
Air-pulse corneal applanation signal curve parameters for the characterisation of keratoconus
(2011)
Optical coherence tomography : a potential tool to predict premature rupture of fetal membranes
(2013)
Kyphoplasty of Osteoporotic Fractured Vertebrae: A Finite Element Analysis about Two Types of Cement
(2019)
Transport through Redox-Active Ru-Terpyridine Complexes Integrated in Single Nanoparticle Devices
(2020)
Transition metal complexes are electrofunctional molecules due to their high conductivity and their intrinsic switching ability involving a metal-to-ligand charge transfer. Here, a method is presented to reliably contact a few to single redox-active Ru-terpyridine complexes in a CMOS-compatible nanodevice while preserving their electrical functionality. Using hybrid materials made from 14 nm gold nanoparticles (AuNP) and bis-{4′-[4-(mercaptophenyl)-2,2′:6′,2″-terpyridine]}-ruthenium(II) complexes, a device size of 30² nm², including the nanoelectrodes, is achieved. Moreover, this method offers the opportunity for further downscaling. The Ru-complex AuNP devices show symmetric and asymmetric current-versus-voltage curves with a hysteretic characteristic in two well-separated conductance ranges. By theoretical approximations based on the single-channel Landauer model, the charge transport through the formed double-barrier tunnel junction is thoroughly analyzed and its sensitivity to the molecule/metal contact is revealed. It can be verified that tunneling transport through the HOMO is the main transport mechanism, while decoherent hopping transport is present to a minor extent.
Market abstraction of energy markets and policies - application in an agent-based modeling toolbox
(2023)
In light of emerging challenges in energy systems, markets are prone to changing dynamics and market design. Simulation models are commonly used to understand the changing dynamics of future electricity markets. However, existing market models were often created with specific use cases in mind, which limits their flexibility and usability. This can impose challenges for using a single model to compare different market designs. This paper introduces a new method of defining market designs for energy market simulations. The proposed concept makes it easy to incorporate different market designs into electricity market models by using relevant parameters derived from analyzing existing simulation tools, morphological categorization and ontologies. These parameters are then used to derive a market abstraction and integrate it into an agent-based simulation framework, allowing for a unified analysis of diverse market designs. Furthermore, we showcase the usability of integrating new types of long-term contracts and over-the-counter trading. To validate this approach, two case studies are demonstrated: a pay-as-clear market and a pay-as-bid long-term market. These examples demonstrate the capabilities of the proposed framework.
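The two demonstrated clearing rules can be sketched in a few lines of code; the bid data below are illustrative placeholders, not taken from the case studies:

```python
def clear_pay_as_clear(bids, demand):
    """Merit-order clearing: supply bids are sorted by price and accepted
    until demand is met; the marginal accepted bid sets one uniform
    clearing price paid to all accepted bids."""
    accepted, supplied = [], 0.0
    for price, volume in sorted(bids):
        if supplied >= demand:
            break
        take = min(volume, demand - supplied)
        accepted.append((price, take))
        supplied += take
    clearing_price = accepted[-1][0] if accepted else None
    return clearing_price, accepted

def clear_pay_as_bid(bids, demand):
    """Same merit order, but each accepted bid is paid its own bid price;
    returns the total procurement cost and the accepted bids."""
    _, accepted = clear_pay_as_clear(bids, demand)
    return sum(p * v for p, v in accepted), accepted

bids = [(20.0, 50.0), (35.0, 30.0), (60.0, 40.0)]  # (price EUR/MWh, MW), assumed
price, accepted = clear_pay_as_clear(bids, demand=70.0)  # uniform price 35.0
```

Under pay-as-clear, the 70 MW of demand is served by 50 MW at 20 EUR/MWh and 20 MW at 35 EUR/MWh, all settled at the marginal price of 35 EUR/MWh; under pay-as-bid the same dispatch costs the sum of the individual bid prices.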
Useful market simulations are key to the evaluation of different market designs consisting of multiple market mechanisms or rules. Yet no simulation framework built with a comparison of different market mechanisms in mind was found. The need for an objective view on different sets of market rules while investigating meaningful agent strategies leads to the conclusion that such a simulation framework is needed to advance research on this subject. An overview of different existing market simulation models is given, which also shows the research gap and the missing capabilities of those systems. Finally, a methodology is outlined for how a novel market simulation that can answer the research questions can be developed.
The sorption of LPS toxic shock by nanoparticles on base of carbonized vegetable raw materials
(2008)
Immobilization of lactobacilli on high-temperature carbonized vegetable raw material (rice husk, grape stones) increases their physiological activity and the quantity of antibacterial metabolites, which consequently leads to increased antagonistic activity of the lactobacilli. This implies that the use of nanosorbents for the attachment of probiotic microorganisms is highly promising for solving important problems such as delivering probiotic preparations to the right location and attaching them to the intestinal mucosa, with subsequent detoxification of the gastrointestinal tract and normalization of its microecology. In addition, the carbonized nanoparticles obtained have a peculiar property: the ability to sorb the LPS of toxic shock and, hence, to detoxify LPS.
Biocomposite Materials Based on Carbonized Rice Husk in Biomedicine and Environmental Applications
(2020)
This chapter describes the prospects for biomedical and environmental engineering applications of heterogeneous materials based on nanostructured carbonized rice husk. Efforts in engineering enzymology are focused on the following directions: development and optimization of immobilization methods leading to novel biotechnological and biomedical applications; and construction of biocomposite materials based on individual enzymes, multi-enzyme complexes and whole cells, targeted at the realization of specific industrial processes. Molecular biological and biochemical studies on cell adhesion focus predominantly on identification, isolation and structural analysis of attachment-responsible biological molecules and their genetic determinants. The chapter provides a short overview of applications of biocomposite materials based on nanostructured carbonized adsorbents. It emphasizes that further studies and a better understanding of the interactions between CNS and microbial cells are necessary. The future use of living cells as biocatalysts, especially in the environmental field, needs more systematic investigation of the microbial adsorption phenomenon.
One of the priority trends in carbon nanotechnology is the creation of nanocomposite systems. Such carbon nanostructured composites were produced using raw materials based on products of agricultural waste, such as grape stones, apricot stones and rice husk. These products have a wide spectrum of application and can be obtained in large quantities. The Institute of Combustion Problems has carried out work on the synthesis of nanostructured carbon sorbents for multiple applications, including the field of biomedicine. The article presents data on the synthesis and physico-chemical properties of carbonaceous sorbents studied using physicochemical methods of investigation: separation and purification of biomolecules; isolation of the phytohormone fusicoccin; the adsorbent INGO-1 in the form of an adsorption column for blood detoxification; the oral (entero)sorbent INGO-2; the study of efferent and probiotic properties and sorption activity with regard to lipopolysaccharide (LPS); new biocomposites based on carbonized rice husk (CRH) and cellular microorganisms; and the use of CRH in wound treatment. A new material for blood detoxification (INGO-1) has been obtained. Adsorption of p-cresyl sulfate and indoxyl sulfate has shown that the active carbon adsorbent can remove clinically significant levels of p-cresyl sulfate and indoxyl sulfate from human plasma. The enterosorbent INGO-2 possesses high adsorption activity towards Gram-negative bacteria and their endotoxins. INGO-2 slows down the growth of conditionally pathogenic microorganisms without having a negative effect on bifido- and lactobacteria. The use of the enterosorbent INGO-2 for sorption therapy may provide a solution to a complex problem: detoxification of the digestive tract and normalization of the intestinal microecology. The immobilized probiotic called "Riso-lact" was registered at the Ministry of Health of the Republic of Kazakhstan as a biologically active food additive.
The developed technology is patented and provides production of the medicine in the form of freeze-dried biomass immobilized in vials.
Conventional EEG devices cannot be used in everyday life and hence, over the past decade, research has focused on Ear-EEG for mobile, at-home monitoring in various applications ranging from emotion detection to sleep monitoring. As the area available for electrode contact in the ear is limited, electrode size and location play a vital role in an Ear-EEG system. In this investigation, we present a quantitative study of ear-electrodes with two electrode sizes at different locations in wet and dry configurations. Electrode impedance scales inversely with size and ranges from 450 kΩ to 1.29 MΩ for dry and from 22 kΩ to 42 kΩ for wet contact at 10 Hz. For any size, the location in the ear canal with the lowest impedance is ELE (Left Ear Superior), presumably due to increased contact pressure caused by the outer-ear anatomy. The results can be used to optimize signal pickup and SNR for specific applications. We demonstrate this by recording sleep spindles during sleep onset with high quality (5.27 μVrms).
Wearable EEG has gained popularity in recent years driven by promising uses outside of clinics and research. The ubiquitous application of continuous EEG requires unobtrusive form-factors that are easily acceptable by the end-users. In this progression, wearable EEG systems have been moving from full scalp to forehead and recently to the ear. The aim of this study is to demonstrate that emerging ear-EEG provides similar impedance and signal properties as established forehead EEG. EEG data using eyes-open and closed alpha paradigm were acquired from ten healthy subjects using generic earpieces fitted with three custom-made electrodes and a forehead electrode (at Fpx) after impedance analysis. Inter-subject variability in in-ear electrode impedance ranged from 20 kΩ to 25 kΩ at 10 Hz. Signal quality was comparable with an SNR of 6 for in-ear and 8 for forehead electrodes. Alpha attenuation was significant during the eyes-open condition in all in-ear electrodes, and it followed the structure of power spectral density plots of forehead electrodes, with the Pearson correlation coefficient of 0.92 between in-ear locations ELE (Left Ear Superior) and ERE (Right Ear Superior) and forehead locations, Fp1 and Fp2, respectively. The results indicate that in-ear EEG is an unobtrusive alternative in terms of impedance, signal properties and information content to established forehead EEG.
Introduction: In peripheral percutaneous venoarterial (VA) extracorporeal membrane oxygenation (ECMO) procedures, the femoral artery perfusion route has inherent disadvantages regarding poor upper-body perfusion due to the watershed phenomenon. With the advent of new long flexible cannulas, advancement of the tip up to the ascending aorta has become feasible. To investigate the impact of such long endoluminal cannulas on upper-body perfusion, a computational fluid dynamics (CFD) study was performed considering different support levels and three cannula positions.
Methods: An idealized literature-based and a real-patient proximal aortic geometry, each including an endoluminal cannula, were constructed. The blood flow was considered continuous. Oxygen saturation was set to 80% for the blood coming from the heart and to 100% for the blood leaving the cannula. Venoarterial support levels of 50% and 90% of the total blood flow rate of 6 l/min were investigated for three different positions of the cannula in the aortic arch.
Results: For both geometries, the placement of the cannula in the ascending aorta led to a superior oxygenation of all aortic blood vessels except for the left coronary artery. Cannula placements at the aortic arch and descending aorta could support supra-aortic arteries, but not the coronary arteries. All positions were able to support all branches with saturated blood at 90% flow volume.
Conclusions: In accordance with clinical observations, the CFD analysis reveals that retrograde advancement of a long endoluminal cannula can considerably improve the oxygenation of the upper body and lead to oxygen saturation distributions similar to those of a central cannulation.
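The flow-weighted mixing that underlies the reported saturation distributions can be illustrated with a simple mass balance; this is only a sketch using the abstract's boundary values, while the actual watershed position and mixing are what the CFD resolves:

```python
def mixed_saturation(q_heart, sat_heart, q_cannula, sat_cannula):
    """Flow-weighted average of oxygen saturation when two streams mix
    (simple mass balance; dissolved oxygen neglected)."""
    total = q_heart + q_cannula
    return (q_heart * sat_heart + q_cannula * sat_cannula) / total

# 90% VA support of a 6 l/min total flow: 0.6 l/min from the heart
# at 80% SaO2, 5.4 l/min from the cannula at 100% (values from the abstract)
s = mixed_saturation(0.6, 0.80, 5.4, 1.00)  # fully mixed saturation
```

At 90% support the fully mixed saturation is 98%, which is why all branches can be supplied with well-saturated blood at that flow fraction regardless of cannula position.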
It is well known that the already large dielectric constants of some electrolytes like BaTiO₃ can be enhanced further by adding metallic (e.g. Ni, Cu or Ag) nanoparticles. The enhancement can be quite large, a factor of more than 1000 is possible. The consequences for the properties will be discussed in the present paper applying a brick-layer model (BLM) for calculating dc-resistivities of thin layers and a modified one (PBLM) that includes percolation for calculating dielectric properties of these materials. The PBLM results in an at least qualitative description and understanding of the physical phenomena: This model gives an explanation for the steep increase of the dielectric constant below the percolation threshold and why this increase is connected to a dramatic decrease of the breakdown voltage as well as the ability of storing electrical energy. We conclude that metallic electrolyte composites like BaTiO₃ are not appropriate for energy storage.
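The steep increase below the percolation threshold mentioned here follows the textbook percolation divergence; written schematically (the notation is an assumption for illustration, not quoted from the paper):

```latex
\varepsilon_{\mathrm{eff}}(p) \;\propto\; \varepsilon_{m}\,\bigl(p_{c}-p\bigr)^{-s},
\qquad p < p_{c},
```

where $\varepsilon_{m}$ is the permittivity of the matrix, $p$ the metallic filling fraction, $p_{c}$ the percolation threshold and $s$ a critical exponent of order one. The same proximity to $p_{c}$ that boosts $\varepsilon_{\mathrm{eff}}$ also shortens the insulating gaps between metal clusters, which is consistent with the reported collapse of the breakdown voltage and of the energy-storage capability.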
Based on the European Space Agency (ESA) Science in Space Environment (SciSpacE) community White Paper “Human Physiology – Musculoskeletal system”, this perspective highlights unmet needs and suggests new avenues for future studies in musculoskeletal research to enable crewed exploration missions. The musculoskeletal system is essential for sustaining physical function and energy metabolism, and the maintenance of health during exploration missions, and consequently mission success, will be tightly linked to musculoskeletal function. Data collection from current space missions from pre-, during-, and post-flight periods would provide important information to understand and ultimately offset musculoskeletal alterations during long-term spaceflight. In addition, understanding the kinetics of the different components of the musculoskeletal system in parallel with a detailed description of the molecular mechanisms driving these alterations appears to be the best approach to address potential musculoskeletal problems that future exploratory-mission crew will face. These research efforts should be accompanied by technical advances in molecular and phenotypic monitoring tools to provide in-flight real-time feedback.
A new in vitro tool to investigate cardiac contractility under physiological mechanical conditions
(2019)
Analysis of the long-term effect of the MBST® nuclear magnetic resonance therapy on gonarthrosis
(2016)
Introduction
With regard to surgical training, the reproducible simulation of life-like proximal humerus fractures in human cadaveric specimens is desirable. The aim of the present study was to develop a technique that allows simulation of realistic proximal humerus fractures and to analyse the influence of rotator cuff preload on the generated lesions with regard to fracture configuration.
Materials and methods
Ten cadaveric specimens (6 left, 4 right) were fractured using a custom-made drop-test bench, in two groups. Five specimens were fractured without rotator cuff preload, while the other five were fractured with the tendons of the rotator cuff preloaded with 2 kg each. The humeral shaft and the shortened scapula were potted. The humerus was positioned at 90° of abduction and 10° of internal rotation to simulate a fall on the elevated arm. In two specimens of each group, the emergence of the fractures was documented with high-speed video imaging. Pre-fracture radiographs were taken to evaluate the deltoid-tuberosity index as a measure of bone density. Post-fracture X-rays and CT scans were performed to define the exact fracture configurations. Neer’s classification was used to analyse the fractures.
Results
In all ten cadaveric specimens, life-like proximal humerus fractures were achieved. Two III-part and three IV-part fractures resulted in each group. The preloading of the rotator cuff muscles had no further influence on the fracture configuration. High-speed videos of the fracture simulation revealed identical fracture mechanisms in both groups. We observed a two-step fracture mechanism: initial impaction of the head segment against the glenoid, followed by fracturing of the head and the tuberosities, and then further impaction of the shaft against the acromion, which led to separation of the tuberosities.
Conclusion
A high-energy axial impulse can reliably induce realistic proximal humerus fractures in cadaveric specimens. The preload of the rotator cuff muscles had no influence on the initial fracture configuration; fracture simulation in the proximal humerus is therefore less elaborate. Using the presented technique, pre-fractured specimens are available for real-life surgical education.
It is well known that the degradation environment can strongly influence the biodegradability and kinetics of biodegradation processes of polymers. Therefore, besides the monitoring of the degradation process, it is also necessary to control the medium in which the degradation takes place. In this work, a micromachined multi-parameter sensor chip for the control of the polymer-degradation medium has been developed. The chip combines a capacitive field-effect pH sensor, a four-electrode electrolyte-conductivity sensor and a thin-film Pt-temperature sensor. The results of characterization of individual sensors are presented. In addition, the multi-parameter sensor chip together with an impedimetric polymer-degradation sensor was simultaneously characterized in degradation solutions with different pH and electrolyte conductivity. The obtained results demonstrate the feasibility of the multi-parameter sensor chip for the control of the polymer-degradation medium.
This paper reports the first microbial biosensor for rapid and cost-effective determination of the organophosphorus pesticides fenitrothion and EPN. The biosensor consisted of the recombinant PNP-degrading/oxidizing bacterium Pseudomonas putida JS444 anchoring and displaying organophosphorus hydrolase (OPH) on its cell surface as the biological sensing element and a dissolved oxygen electrode as the transducer. Surface-expressed OPH catalyzed the hydrolysis of fenitrothion and EPN to release 3-methyl-4-nitrophenol and p-nitrophenol, respectively, which were oxidized by the enzymatic machinery of Pseudomonas putida JS444 to carbon dioxide while consuming oxygen, which was measured and correlated to the concentration of organophosphates. Under the optimum operating conditions, the biosensor was able to measure as little as 277 ppb of fenitrothion and 1.6 ppm of EPN without interference from phenolic compounds and other commonly used pesticides such as carbamate pesticides, triazine herbicides and organophosphate pesticides without a nitrophenyl substituent. The applicability of the biosensor to lake water was also demonstrated.
Epilepsy
(2010)
Network theory provides novel concepts that promise an improved characterization of interacting dynamical systems. Within this framework, evolving networks can be considered as being composed of nodes, representing systems, and of time-varying edges, representing interactions between these systems. This approach is highly attractive to further our understanding of the physiological and pathophysiological dynamics in human brain networks. Indeed, there is growing evidence that the epileptic process can be regarded as a large-scale network phenomenon. We here review methodologies for inferring networks from empirical time series and for a characterization of these evolving networks. We summarize recent findings derived from studies that investigate human epileptic brain networks evolving on timescales ranging from few seconds to weeks. We point to possible pitfalls and open issues, and discuss future perspectives.
After a liver tumor intervention, the medical doctor has to compare pre- and postoperative CT acquisitions to ensure that all carcinogenic cells are destroyed. A correct assessment of the intervention is of vital importance, since it reduces the probability of tumor recurrence. Some methods have been proposed to support medical doctors during the assessment process; however, all of them focus on secondary tumors. In this paper, a tool is presented that enables outcome validation for both primary and secondary tumors. To this end, a multiphase registration (preoperative arterial and portal phases) followed by a registration between the pre- and postoperative CT images is carried out. The first registration handles the primary tumors, which are only visible in the arterial phase; the secondary tumors are incorporated in the second registration step. Finally, the part of the tumor that was not covered by the necrosis is quantified and visualized. The method has been tested on 9 patients, with an average registration error of 1.41 mm.
False spectra formation in the differential two-channel scheme of the laser Doppler flowmeter
(2018)
Noise in the differential two-channel scheme of a classic laser Doppler flowmetry (LDF) instrument was studied. The formation of false spectral components in the output signal due to the beating of electrical signals in the differential amplifier was found. An improved block diagram of the flowmeter was developed that allows the noise to be reduced.
Nobody ever dies! / 1. ed.
(2000)
Our world is well ordered in measurement and number : or why natural constants are as they are
(2013)
All the important natural constants can be logically explained with and derived from the first four ordinal numbers, 1, 2, 3 and 4, their addition to ten and finally the standard values for obviously maximal feasibility Ω and the optimum in our world, the Golden Section (GS), i.e. the number sequences 273 and 618. They both are the first three numbers of irrational results of an arithmetical transformation of simple geometrical relationships that creates multiplicity out of singularity. Both of them show that the infinite is inherent in finiteness and explain in a simple way the smallest deviations and fluctuations between the physical as-is state and the obvious spiritual ideal behind it: wherever we look in this world, and especially in important key positions, we regularly find these sequences. All of the above-mentioned numbers thus seem to be key players in our world, which can be demonstrated by the derivation of natural constants.
In many books about genetics it can still be read today that our genetic code is called "degenerate" because it is still believed that 4³ = 64 triplets encode the 20 essential amino acids. Indeed, we have to assume the inverse law, which means that 3⁴ = 81 exact code positions are really effective for our genetic code and encode the amino acids, compiled into proteins. This very important discovery leads to two completely new, boundary-crossing results: 1) 3⁴ (= 81) genetic code positions match exactly the number of stable, naturally existing chemical elements in our universe. This argument should now lead to some alternative, as well as new fundamental, conclusions about our existence. 2) A genetic code positioning system shows that nature is much smarter than expected: mutations are made less dangerous than believed, because they will not be able to cause severe damage in protein synthesis as easily. This should also lead to some alternative views on the evolution of life.
Therefore Fermat is right
(2014)
It was Fermat's idea to investigate how many numbers would fulfil the equation of the Pythagorean theorem if the exponent were increased arbitrarily, e.g. to a³ + b³ = c³. His question therefore became: are there two whole numbers whose cubes add up to the volume of the cube of a third whole number? He posed the same question, of course, for all kinds of higher exponents, so that the equation could be generalized: is there an integral solution of the equation aⁿ + bⁿ = cⁿ if the exponent n is higher than 2? Although in 1993 the English mathematician Andrew Wiles was able to produce an arithmetical proof of Fermat's famous theorem, I will show that there is a simple logical explanation which is also pragmatic and plausible, and which results from a fundamentally alternative idea of how our world seems to be constructed.
An optimization method is developed to describe the mechanical behaviour of the human cancellous bone. The method is based on a mixture theory. A careful observation of the behaviour of the bone material leads to the hypothesis that the bone density is controlled by the principal stress trajectories (Wolff’s law). The basic idea of the developed method is the coupling of a scalar value via an eigenvalue problem to the principal stress trajectories. On the one hand this theory will permit a prediction of the reaction of the biological bone structure after the implantation of a prosthesis, on the other hand it may be useful in engineering optimization problems. An analytical example shows its efficiency.
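The coupling idea can be written schematically (the notation here is assumed for illustration, not quoted from the paper): the principal stresses and their trajectories follow from the eigenvalue problem

```latex
\boldsymbol{\sigma}\,\mathbf{n}_i = \sigma_i\,\mathbf{n}_i ,
\qquad i = 1, 2, 3 ,
```

and the scalar density field is then driven by a stimulus built from the principal stresses, e.g. $\dot{\rho} = c\left(\sum_i \lvert\sigma_i\rvert - \psi_{\mathrm{ref}}\right)$, so that material accumulates along the dominant stress trajectories, as Wolff's law suggests.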
The CellDrum technology (the term 'CellDrum technology' covers a couple of slightly different technological setups for measuring lateral mechanical tension in various types of cell monolayers or 3D tissue constructs) was designed to quantify the contraction rate and mechanical tension of self-exciting cardiac myocytes. Cells were grown either within flexible, circular collagen gels or as monolayers on top of 1-µm-thin silicone membranes. Membrane and cells were bulged outwards by air pressure. This biaxial strain distribution is rather similar to that of the beating, blood-filled heart. The setup allowed the mechanical residual stress level to be preset externally by adjusting the centre deflection, thus mimicking hypertension in vitro. Tension was measured as the oscillating differential pressure change between chamber and environment. A 0.5-mm-thick collagen-cardiac-myocyte tissue construct induced, after 2 days of culturing (initial cell density 2 × 10⁴ cells/ml), a mechanical tension of 1.62 ± 0.17 µN/mm². Mechanical load is an important growth regulator in the developing heart, and the orientation and alignment of cardiomyocytes are stress-sensitive. Therefore, it was necessary to develop the CellDrum technology with its biaxial stress-strain distribution and defined mechanical boundary conditions. Cells were exposed to strain in two directions, radially and circumferentially, which is similar to biaxial loading in real heart tissue. Thus, from a biomechanical point of view, the system is preferable to previous setups based on uniaxial stretching.
All cells generate contractile tension. This tension is crucial for mechanically controlling cell shape, function and survival. In this study, the CellDrum technology, which quantifies cellular mechanical tension on a pico-scale, was used to investigate the effect of lipopolysaccharide (LPS) on human aortic endothelial cell (HAoEC) tension. The effect of LPS on endothelial cells during gram-negative sepsis is cell contraction, causing an increase in endothelium permeability. The aim was to find out whether recombinant activated protein C (rhAPC) would reverse the endothelial cell response in an in-vitro sepsis model. The established in-vitro sepsis model was confirmed by interleukin 6 (IL-6) levels at the proteomic and genomic levels by ELISA and real-time PCR, and by reactive oxygen species (ROS) activation by fluorescence staining. The cellular contraction effect of thrombin on endothelial cells was used as a positive control when the CellDrum technology was applied. Additionally, the Ras homolog gene family member A (RhoA) mRNA expression level was checked by real-time PCR to support the contractile tension results. According to these results, the mechanical predominance of actin stress fibers was the reason for the increased endothelial contractile tension, leading to enhanced endothelium contractility and thus to permeability enhancement. These data support, firstly, the basic measurement principles of the CellDrum technology and, secondly, a beneficial effect of rhAPC on sepsis-influenced cellular tension. The technology presented here is promising for future high-throughput cellular tension analysis that will help identify pathological contractile tension responses of cells and validate further in-vitro cell models.
Learning- and memory-related processes are thought to result from dynamic interactions in large-scale brain networks that include lateral and mesial structures of the temporal lobes. We investigate the impact of incidental and intentional learning of verbal episodic material on functional brain networks derived from scalp EEG recorded continuously from 33 subjects during a neuropsychological test schedule. Analyzing the networks' global statistical properties, we observe that intentional, but not incidental, learning leads to a significantly increased clustering coefficient, while the average shortest path length remains unaffected. Moreover, network modifications correlate with subsequent recall performance: the more pronounced the modifications of the clustering coefficient, the higher the recall performance. Our findings provide novel insights into the relationship between topological aspects of functional brain networks and higher cognitive functions.
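The two graph metrics named in this abstract, the clustering coefficient and the average shortest path length, have simple standard definitions. The following pure-Python sketch computes both on a hypothetical 5-node toy graph (this is just an illustration of the metrics, not the authors' EEG analysis pipeline):

```python
from collections import deque

def clustering_coefficient(adj):
    # Average local clustering coefficient of an undirected graph:
    # fraction of a node's neighbour pairs that are themselves connected.
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # nodes with < 2 neighbours contribute 0
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def average_shortest_path(adj):
    # Mean shortest-path length over all ordered node pairs (BFS per node).
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs

# toy graph: a triangle (0,1,2) with a tail 1-3-4
adj = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1, 4}, 4: {3}}
print(clustering_coefficient(adj))   # 7/15
print(average_shortest_path(adj))    # 1.7
```

An increased clustering coefficient at unchanged path length, as reported here for intentional learning, indicates a shift toward more "small-world"-like local connectivity.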
Purpose: Vascular risk factors and ocular perfusion are intensely debated in the pathogenesis of glaucoma. The Retinal Vessel Analyzer (RVA, IMEDOS Systems, Germany) allows noninvasive measurement of retinal vessel regulation. Significant differences between healthy subjects and glaucoma patients, especially in the veins, have previously been reported. In this pilot study we investigated whether localized vascular regulation is altered in glaucoma patients with altitudinal visual field defect asymmetry. Methods: 15 eyes of 12 glaucoma patients with advanced altitudinal visual field defect asymmetry were included. The mean defect was calculated for each hemisphere separately (-20.99 ± 10.49 dB for the hemisphere with the profound visual field defect vs. -7.36 ± 3.97 dB for the less profound hemisphere). After pupil dilation, RVA measurements of retinal arteries and veins were conducted using the standard protocol. Superior and inferior retinal vessel reactivity were measured consecutively in each eye. Results: Significant differences between the hemispheres were recorded in venous vessel constriction after flicker-light stimulation and in the overall amplitude of the reaction (p < 0.04 and p < 0.02, respectively). The vessel reaction was higher in the hemisphere corresponding to the more advanced visual field defect. Arterial diameters reacted similarly but failed to reach statistical significance. Conclusion: Localized retinal vessel regulation is significantly altered in glaucoma patients with asymmetric altitudinal visual field defects. Veins supplying the hemisphere concordant with the less profound visual field defect show diminished diameter changes. Vascular dysregulation might be particularly important in early glaucoma stages, prior to a significant visual field defect.
Background
Post-COVID-19 syndrome (PCS) is a lingering disease with ongoing symptoms such as fatigue and cognitive impairment that have a high impact on patients' daily lives. Understanding the pathophysiology of PCS is a public health priority, as it still poses a diagnostic and treatment challenge for physicians.
Methods
In this prospective observational cohort study, we analyzed the retinal microcirculation using Retinal Vessel Analysis (RVA) in a cohort of patients with PCS and compared it to an age- and gender-matched healthy cohort (n = 41, matched out of n = 204).
Measurements and main results
PCS patients exhibit persistent endothelial dysfunction (ED), as indicated by significantly lower venular flicker-induced dilation (vFID; 3.42% ± 1.77% vs. 4.64% ± 2.59%; p = 0.02), a narrower central retinal artery equivalent (CRAE; 178.1 [167.5–190.2] vs. 189.1 [179.4–197.2], p = 0.01) and a lower arteriolar-venular ratio (AVR; 0.84 [0.8–0.9] vs. 0.88 [0.8–0.9], p = 0.007). When AVR and vFID were combined, the predicted scores reached a good ability to discriminate between groups (area under the curve: 0.75). Higher PCS severity scores correlated with lower AVR (R = -0.37, p = 0.017). The association of microvascular changes with PCS severity was amplified in PCS patients exhibiting higher levels of inflammatory parameters.
Conclusion
Our results demonstrate that prolonged endothelial dysfunction is a hallmark of PCS, and impairments of the microcirculation seem to explain ongoing symptoms in patients. As potential therapies for PCS emerge, RVA parameters may become relevant as clinical biomarkers for diagnosis and therapy management.
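The discriminative ability reported above (area under the curve 0.75) can be illustrated with the standard rank-based definition of the ROC AUC: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with purely hypothetical scores (the data below are invented for illustration, not taken from the study):

```python
def auc(pos, neg):
    # ROC AUC via the Mann-Whitney statistic: fraction of (positive,
    # negative) pairs where the positive scores higher; ties count 0.5.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical combined AVR/vFID risk scores (higher = more PCS-like)
pcs     = [0.9, 0.8, 0.7, 0.6, 0.4]
healthy = [0.7, 0.5, 0.4, 0.3, 0.2]
print(auc(pcs, healthy))  # 0.84
```

An AUC of 0.5 means chance-level discrimination and 1.0 means perfect separation, so the study's 0.75 indicates moderate but clinically useful separation of PCS patients from matched controls.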
In the energy economy, forecasts of different time series are rudimentary. In this study, a prediction for the German day-ahead spot market is created with Apache Spark and R. This is just one example of many applications in virtual power plant environments. Other use cases, such as intraday price processes, load profiles of machines or electric vehicles, and real-time energy output of photovoltaic systems, involve many more time series that need to be analysed and predicted.
This work gives a short introduction to the project in which this study is situated. It briefly describes the time-series methods used in the energy industry for forecasting. Apache Spark, a powerful cluster-computing framework, is used as the programming platform. Today, single time series can be predicted. The focus of this work is on developing a method for parallel forecasting, i.e. processing multiple time series simultaneously with R and Apache Spark.
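The core idea of parallel forecasting, i.e. applying the same forecasting routine to many independent series at once, can be sketched in a few lines. The following stdlib-Python sketch stands in for the Spark-and-R setup described in the study; the seasonal-naive forecaster, the series names and the synthetic hourly data are all illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def seasonal_naive(series, season=24, horizon=24):
    # Forecast each future hour with the observed value one season earlier;
    # a common baseline for hourly electricity price/load series.
    return [series[-season + (h % season)] for h in range(horizon)]

# hypothetical hourly series, one per market product / asset
series_by_id = {
    "spot_de": [30 + (h % 24) for h in range(168)],            # price-like
    "pv_load": [max(0, 12 - abs(h % 24 - 12)) for h in range(168)],
}

# forecast all series concurrently; Spark would distribute this step
# across a cluster instead of a local thread pool
with ThreadPoolExecutor() as pool:
    forecasts = dict(zip(series_by_id,
                         pool.map(seasonal_naive, series_by_id.values())))

print(forecasts["spot_de"][:3])  # [30, 31, 32]
```

The pattern is embarrassingly parallel because each series is forecast independently, which is exactly what makes a map-style Spark job a natural fit.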
An array of 50 MHz quartz microbalances (QMBs) coated with a dendronized polymer was used to detect small amounts of volatile organic compounds (VOCs) in the gas phase. The results were compared to those obtained with the commonly used 10 MHz QMBs. The 50 MHz QMBs proved to be a powerful tool for the detection of VOCs in the gas phase; therefore, they represent a promising alternative to the much more delicate surface acoustic wave devices (SAWs).
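The advantage of 50 MHz over 10 MHz resonators follows from the Sauerbrey equation, in which the frequency shift for a given adsorbed mass scales with the square of the fundamental frequency, i.e. a factor of 25 here. A minimal sketch (load mass, electrode area and the printed comparison are illustrative; the quartz material constants are standard literature values):

```python
import math

# Sauerbrey equation: df = -2 * f0**2 * dm / (A * sqrt(rho_q * mu_q))
RHO_Q = 2.648e3    # density of quartz, kg/m^3
MU_Q = 2.947e10    # shear modulus of AT-cut quartz, Pa

def sauerbrey_shift(f0_hz, dm_kg, area_m2):
    # frequency shift (Hz) for a rigid mass load dm on area A
    return -2.0 * f0_hz**2 * dm_kg / (area_m2 * math.sqrt(RHO_Q * MU_Q))

dm, area = 1e-12, 1e-4   # hypothetical: 1 ng adsorbed on 1 cm^2
for f0 in (10e6, 50e6):
    print(f"{f0/1e6:.0f} MHz: {sauerbrey_shift(f0, dm, area):.3f} Hz")

# sensitivity scales with f0**2: (50/10)**2 = 25x
print(sauerbrey_shift(50e6, dm, area) / sauerbrey_shift(10e6, dm, area))
```

This quadratic scaling is why the 50 MHz QMBs approach the sensitivity regime otherwise reserved for SAW devices while remaining mechanically more robust.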
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pages 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005. Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based form. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
In: Net-distributed Co-operation: Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004; proceedings / [ed. by Karl Beuke ...]. - Weimar: Bauhaus-Univ. Weimar, 2004. - 1st ed., pages 1-14, ISBN 3-86068-213-X. International Conference on Computing in Civil and Building Engineering <10, 2004, Weimar>. Summary: In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building is more important than a detailed, worked-out construction. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
In: Computer Aided Architectural Design Futures 2005, Part 4, pages 207-216, DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19. The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements than on their functionality. We introduce semantic roomobjects and roomlinks, implemented by way of example in the CAD tool ArchiCAD. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pages 1-12, ASCE (CD-ROM), 2006. Currently, the conceptual design phase is not adequately supported by any CAD tool. No commercial tool provides support while elaborating conceptual sketches, nor an automatic proof of correctness with respect to the applicable restrictions. To enable domain experts to store the common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced: the possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
In: Advanced Engineering Informatics, Vol. 21, Issue 1, 2007, pages 67-83, http://dx.doi.org/10.1016/j.aei.2006.10.001, eds. J.C. Kunz, I.F.C. Smith and T. Tomiyama, Elsevier. Current CAD tools are not able to support the conceptual design phase, and none of them provides a consistency analysis for sketches produced by architects. This phase is fundamental and crucial for the whole design and construction process of a building. To give architects better support, we developed a CAD tool for conceptual design and a knowledge specification tool. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic and domain-specific knowledge ontology, different types of design rules formalize this knowledge in a graph-based form. An expressive visual language provides a user-friendly, human-readable representation. Finally, a consistency analysis tool enables conceptual designs to be checked against this formal conceptual knowledge. In this article, we concentrate on the knowledge specification part. To that end, we introduce the concepts and usage of a novel visual language and describe its semantics. To demonstrate the usability of our approach, two graph-based visual tools for knowledge specification and conceptual design are explained.
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. I. Flood, pages 1-12, ASCE (CD-ROM), Nashville, USA. In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design takes place before constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented bottom-up approach, in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as soon as possible and to support the architect in designing buildings with consideration of conceptual knowledge.
In: Advances in intelligent computing in engineering: proceedings of the 9th International EG-ICE Workshop, Darmstadt, 01-03 August 2002 / Martina Schnellenbach-Held ... (eds.). - Düsseldorf: VDI-Verl., 2002. - Fortschritt-Berichte VDI, Reihe 4, Bauingenieurwesen; 180; pages 1-35. The paper describes a novel way to support conceptual design in civil engineering. The designer uses semantic tools guaranteeing certain internal structures of the design result as well as the fulfillment of various constraints. Two different approaches and corresponding tools are discussed: (a) visually specified tools with automatic code generation to determine a design structure as well as to fix various constraints a design has to obey; these tools are also valuable for design knowledge specialists; (b) extensions of existing CAD tools providing semantic knowledge to be used by an architect. It is sketched how these different tools can be combined in the future. The main part of the paper discusses the concepts and realization of two prototypes following the two approaches above. In particular, the paper discusses how specific graphs and the specification of their structure are useful for both tool realization projects.
Applications of Graph Transformations with Industrial Relevance, Lecture Notes in Computer Science, 2004, Volume 3062/2004, pages 434-439, DOI: http://dx.doi.org/10.1007/978-3-540-25959-6_33. This paper gives a brief overview of the tools we have developed to support conceptual design in civil engineering. Based on the UPGRADE framework, two applications, one for the knowledge engineer and another for the architect, allow domain-specific knowledge to be stored and used during conceptual design. Consistency analyses check the design against the defined knowledge and inform the architect if rules are violated.
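The consistency checking recurring in the abstracts above, i.e. testing a conceptual sketch against formalized design rules over a graph of building elements, can be illustrated with a tiny sketch. Everything below is hypothetical (the room graph, the rule and the function names are invented for illustration and are not the PROGRES/UPGRADE or ArchiCAD implementation):

```python
# hypothetical conceptual sketch as a graph: nodes are rooms with a type,
# undirected edges are roomlinks between them
rooms = {"r1": "office", "r2": "office", "r3": "corridor", "r4": "kitchen"}
links = {("r1", "r3"), ("r2", "r4")}

def neighbours(room, links):
    return ({b for a, b in links if a == room}
            | {a for a, b in links if b == room})

def check_offices_reach_corridor(rooms, links):
    # example design rule: every office must be directly linked to
    # a corridor; returns the list of rule-violating rooms
    violations = []
    for room, kind in rooms.items():
        if kind != "office":
            continue
        if not any(rooms[n] == "corridor" for n in neighbours(room, links)):
            violations.append(room)
    return violations

print(check_offices_reach_corridor(rooms, links))  # ['r2']
```

A checker of this kind can run incrementally after each editing step, which is how the tools described above inform the architect about rule violations as early as possible.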
The Saturnian moon Enceladus, with its extensive water bodies underneath a thick ice sheet, is a potential candidate for extraterrestrial life. Direct exploration of such extraterrestrial aquatic ecosystems requires advanced access and sampling technologies with a high level of autonomy. A new technological approach has been developed as part of the collaborative research project Enceladus Explorer (EnEx). The concept is based upon a minimally invasive melting probe called the IceMole. The force-regulated, heater-controlled IceMole is able to travel along curved trajectories as well as upwards. Hence, it allows maneuvers that may be necessary for obstacle avoidance or target selection. Maneuverability, however, necessitates a sophisticated on-board navigation system capable of autonomous operation. The development of such a navigation system has been the focal part of the EnEx project. The original IceMole has been further developed to include relative positioning based on in-ice attitude determination, acoustic positioning, and ultrasonic obstacle and target detection, integrated through high-level sensor fusion. This paper describes the EnEx technology and discusses implications for an actual extraterrestrial mission concept.