Refine
Year of publication
Institute
- Fachbereich Medizintechnik und Technomathematik (2072)
Document Type
- Article (1591)
- Conference Proceeding (241)
- Book (96)
- Part of a Book (62)
- Doctoral Thesis (27)
- Patent (17)
- Report (15)
- Other (9)
- Habilitation (4)
- Lecture (3)
Keywords
- Biosensor (25)
- Finite-Elemente-Methode (16)
- CAD (15)
- civil engineering (14)
- Bauingenieurwesen (13)
- Einspielen <Werkstoff> (13)
- shakedown analysis (9)
- FEM (6)
- Limit analysis (6)
- Shakedown analysis (6)
ITCE-2003 - 4th Joint Symposium on Information Technology in Civil Engineering, ed. Flood, I., pp. 1-12, ASCE (CD-ROM), Nashville, USA. In this paper we discuss graph-based tools to support architects during the conceptual design phase. Conceptual design precedes constructive design; the concepts used are more abstract. We develop two graph-based approaches: a top-down approach using the graph rewriting system PROGRES, and a more industrially oriented bottom-up approach in which we extend the CAD system ArchiCAD. In both approaches, knowledge can be defined by a knowledge engineer: in the top-down approach in the domain model graph, in the bottom-up approach in an XML file. The defined knowledge is used to incrementally check the sketch and to inform the architect about violations of the defined knowledge. Our goal is to discover design errors as early as possible and to support the architect in designing buildings in accordance with conceptual knowledge.
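The incremental checking described above can be sketched independently of PROGRES or ArchiCAD: conceptual knowledge becomes a set of rules over a room graph, and the checker reports every violation to the architect. A minimal Python sketch, assuming invented room names and a single hypothetical access rule:

```python
# Conceptual sketch as a graph: rooms are nodes, access links are edges.
sketch = {
    "rooms": {"office", "corridor", "lab"},
    "links": {("office", "corridor"), ("lab", "corridor")},
}

# Knowledge base: each rule checks one conceptual constraint and
# returns a list of violation messages (empty list = consistent).
def every_room_reachable_from_corridor(s):
    linked = {room for edge in s["links"] for room in edge}
    return [f"room '{r}' has no access link" for r in s["rooms"] - linked]

def check(sketch, rules):
    """Re-runnable consistency check of a sketch against the knowledge base."""
    return [msg for rule in rules for msg in rule(sketch)]

violations = check(sketch, [every_room_reachable_from_corridor])
```

Rerunning `check` after every editing step gives the incremental feedback loop; in the paper's bottom-up approach the rule set would be loaded from the knowledge engineer's XML file rather than hard-coded.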
In: Forum Bauinformatik 2005 : junge Wissenschaftler forschen / [Lehrstuhl Bauinformatik, Brandenburgische Technische Universität Cottbus. Frank Schley ... (Hrsg.)]. - Cottbus : Techn. Universität 2005. S. 1-10 ISBN 3-934934-11-0
Using an operational approach to semantics definition, a rule system is formalized, taking conceptual building design as an example. To this end, two parts, the rule knowledge on the one hand and a conceptual design plan on the other, are first introduced informally and then described formally. Building on this, the foundation for a consistency check of the conceptual design against the rule knowledge is formally specified.
In: Proceedings of the 39th Annual Hawaii International Conference on System Sciences, 2006. HICSS '06 http://dx.doi.org/10.1109/HICSS.2006.200 The conceptual design phase at the beginning of the building construction process is not adequately supported by any CAD tool. Conceptual design support requires two aspects: first, the architect must be able to develop conceptual sketches that abstract from constructive details. Second, conceptually relevant knowledge should be available to check these conceptual sketches. The paper deals with the knowledge to be formalized for conceptual design. To enable domain experts to formalize knowledge, a graph-based specification is presented that allows the development of a domain ontology and design rules specific to one class of buildings at runtime. The provided tool support illustrates the introduced concepts and demonstrates the consistency analysis between knowledge and conceptual design.
In: Proc. of the 11th Intl. Conf. on Computing in Civil and Building Engineering (ICCCBE-XI), ed. Hugues Rivard, Montreal, Canada, pp. 1-12, ASCE (CD-ROM), 2006. Currently, the conceptual design phase is not adequately supported by any CAD tool. Neither support for elaborating conceptual sketches nor an automatic proof of correctness with respect to effective restrictions is currently provided by any commercial tool. To enable domain experts to store common as well as their personal domain knowledge, we develop a visual language for knowledge formalization. In this paper, a major extension to the already existing concepts is introduced. The possibility to define rule dependencies extends the expressiveness of the knowledge definition language and contributes to the usability of our approach.
In: Computer Aided Architectural Design Futures 2005, Part 4, 207-216, DOI: http://dx.doi.org/10.1007/1-4020-3698-1_19 The conceptual design at the beginning of the building construction process is essential for the success of a building project. Even if some CAD tools allow elaborating conceptual sketches, they rather focus on the shape of the building elements than on their functionality. We introduce semantic roomobjects and roomlinks, by way of example in the CAD tool ArchiCAD. These extensions provide a basis for specifying the organisation and functionality of a building and free architects from being forced to directly produce detailed constructive sketches. Furthermore, we introduce consistency analyses of the conceptual sketch, based on an ontology containing conceptually relevant knowledge specific to one class of buildings.
In: Net-distributed Co-operation : Xth International Conference on Computing in Civil and Building Engineering, Weimar, June 02-04, 2004 ; proceedings / [ed. by Karl Beuke ...]. - Weimar: Bauhaus-Univ. Weimar 2004. - 1st ed., pp. 1-14, ISBN 3-86068-213-X. International Conference on Computing in Civil and Building Engineering <10, 2004, Weimar> Summary In our project, we develop new tools for the conceptual design phase. During conceptual design, the coarse functionality and organization of a building are more important than a construction worked out in detail. We identify two roles: first, the knowledge engineer, who is responsible for knowledge definition and maintenance; second, the architect, who elaborates the conceptual design. The tool for the knowledge engineer is based on graph technology; it is specified using PROGRES and the UPGRADE framework. The tools for the architect are integrated into the industrial CAD tool ArchiCAD. Consistency between knowledge and conceptual design is ensured by the constraint checker, another extension to ArchiCAD.
Proc. of the 2005 ASCE Intl. Conf. on Computing in Civil Engineering (ICCC 2005), eds. L. Soibelman and F. Pena-Mora, pp. 1-14, ASCE (CD-ROM), Cancun, Mexico, 2005. Current CAD tools are not able to support the fundamental conceptual design phase, and none of them provides consistency analyses of the sketches produced by architects. To give architects greater support in the conceptual design phase, we develop a CAD tool for conceptual design and a knowledge specification tool allowing the definition of conceptually relevant knowledge. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic knowledge model, different types of design rules formalize the knowledge in a graph-based realization. An expressive visual language provides a user-friendly, human-readable representation. Finally, consistency analyses enable conceptual designs to be checked against this defined knowledge. In this paper we concentrate on the knowledge specification part of our project.
An array of 50 MHz quartz microbalances (QMBs) coated with a dendronized polymer was used to detect small amounts of volatile organic compounds (VOCs) in the gas phase. The results were compared to those obtained with the commonly used 10 MHz QMBs. The 50 MHz QMBs proved to be a powerful tool for the detection of VOCs in the gas phase; therefore, they represent a promising alternative to the much more delicate surface acoustic wave devices (SAWs).
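The sensitivity advantage of the higher-frequency crystals follows from standard QCM theory (background added here, not stated in the abstract): the Sauerbrey equation relates the frequency shift of a quartz crystal to the adsorbed mass,

```latex
\Delta f = -\frac{2 f_0^{2}}{A \sqrt{\rho_q \mu_q}}\, \Delta m
```

where $f_0$ is the fundamental frequency, $A$ the electrode area, and $\rho_q$, $\mu_q$ the density and shear modulus of quartz. Since the shift scales with $f_0^{2}$, a 50 MHz device is nominally $(50/10)^2 = 25$ times more mass-sensitive than a 10 MHz one.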
In the energy economy, forecasts of different time series are rudimentary. In this study, a prediction for the German day-ahead spot market is created with Apache Spark and R. It is just one example of many possible applications in virtual power plant environments. Other use cases, such as intraday price processes, load processes of machines or electric vehicles, real-time energy loads of photovoltaic systems, and many more time series, need to be analysed and predicted.
This work gives a short introduction to the project in which this study is situated. It briefly describes the time series methods used in the energy industry for forecasting. Apache Spark, a powerful cluster computing technology, is utilised as the programming framework. Today, single time series can be predicted. The focus of this work is on developing a method for parallel forecasting, to process multiple time series simultaneously with R and Apache Spark.
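What makes the problem Spark-friendly is that each series can be forecast independently. A minimal Python sketch of that structure, assuming hypothetical series names and a toy moving-average forecaster in place of the R models (a local thread pool stands in for the Spark cluster):

```python
from concurrent.futures import ThreadPoolExecutor

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    tail = series[-window:]
    return sum(tail) / len(tail)

def forecast_all(named_series, workers=4):
    """Forecast many independent time series in parallel.

    Each series is a self-contained task; on a real deployment Spark would
    distribute these tasks over the cluster instead of a local pool.
    """
    names = list(named_series)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(moving_average_forecast, (named_series[n] for n in names))
        return dict(zip(names, results))

series = {
    "day_ahead_price": [42.0, 45.0, 44.0, 47.0, 46.0],  # EUR/MWh, invented values
    "pv_load":         [10.0, 12.0, 11.0, 13.0, 14.0],  # kW, invented values
}
forecasts = forecast_all(series)
```

Swapping the forecaster for an ARIMA or similar model per series leaves the parallel structure unchanged, which is the point of the study's approach.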
Background
Post-COVID-19 syndrome (PCS) is a lingering disease with ongoing symptoms such as fatigue and cognitive impairment resulting in a high impact on the daily life of patients. Understanding the pathophysiology of PCS is a public health priority, as it still poses a diagnostic and treatment challenge for physicians.
Methods
In this prospective observational cohort study, we analyzed the retinal microcirculation using Retinal Vessel Analysis (RVA) in a cohort of patients with PCS and compared it to an age- and gender-matched healthy cohort (n = 41, matched out of n = 204).
Measurements and main results
PCS patients exhibit persistent endothelial dysfunction (ED), as indicated by significantly lower venular flicker-induced dilation (vFID; 3.42% ± 1.77% vs. 4.64% ± 2.59%; p = 0.02), a narrower central retinal artery equivalent (CRAE; 178.1 [167.5–190.2] vs. 189.1 [179.4–197.2], p = 0.01) and a lower arteriolar-venular ratio (AVR; 0.84 [0.8–0.9] vs. 0.88 [0.8–0.9], p = 0.007). When AVR and vFID were combined, predicted scores reached good ability to discriminate between the groups (area under the curve: 0.75). Higher PCS severity scores correlated with lower AVR (R = −0.37, p = 0.017). The association of microvascular changes with PCS severity was amplified in PCS patients exhibiting higher levels of inflammatory parameters.
Conclusion
Our results demonstrate that prolonged endothelial dysfunction is a hallmark of PCS, and impairments of the microcirculation seem to explain ongoing symptoms in patients. As potential therapies for PCS emerge, RVA parameters may become relevant as clinical biomarkers for diagnosis and therapy management.
Purpose Vascular risk factors and ocular perfusion are hotly debated in the pathogenesis of glaucoma. The Retinal Vessel Analyzer (RVA, IMEDOS Systems, Germany) allows noninvasive measurement of retinal vessel regulation. Significant differences, especially in the veins, between healthy subjects and patients suffering from glaucoma were previously reported. In this pilot study we investigated whether localized vascular regulation is altered in glaucoma patients with altitudinal visual field defect asymmetry. Methods 15 eyes of 12 glaucoma patients with advanced altitudinal visual field defect asymmetry were included. The mean defect was calculated for each hemisphere separately (−20.99 ± 10.49 dB in the hemisphere with the profound visual field defect vs −7.36 ± 3.97 dB in the less profound hemisphere). After pupil dilation, RVA measurements of retinal arteries and veins were conducted using the standard protocol. The superior and inferior retinal vessel reactivity were measured consecutively in each eye. Results Significant differences between the hemispheres were recorded in venous vessel constriction after flicker light stimulation and in the overall amplitude of the reaction (p < 0.04 and p < 0.02, respectively). Vessel reaction was higher in the hemisphere corresponding to the more advanced visual field defect. Arterial diameters reacted similarly, but failed to reach statistical significance. Conclusion Localized retinal vessel regulation is significantly altered in glaucoma patients with asymmetric altitudinal visual field defects. Veins supplying the hemisphere concordant with a less profound visual field defect show diminished diameter changes. Vascular dysregulation might be particularly important in early glaucoma stages, prior to a significant visual field defect.
Learning- and memory-related processes are thought to result from dynamic interactions in large-scale brain networks that include lateral and mesial structures of the temporal lobes. We investigate the impact of incidental and intentional learning of verbal episodic material on functional brain networks that we derive from scalp-EEG recorded continuously from 33 subjects during a neuropsychological test schedule. Analyzing the networks' global statistical properties we observe that intentional but not incidental learning leads to a significantly increased clustering coefficient, and the average shortest path length remains unaffected. Moreover, network modifications correlate with subsequent recall performance: the more pronounced the modifications of the clustering coefficient, the higher the recall performance. Our findings provide novel insights into the relationship between topological aspects of functional brain networks and higher cognitive functions.
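The two network measures named above have simple definitions; a minimal pure-Python sketch on a toy undirected graph (the graph is invented for illustration, not EEG-derived data):

```python
from collections import deque

def clustering_coefficient(adj, node):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def average_shortest_path(adj):
    """Mean BFS distance over all ordered node pairs (assumes a connected graph)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs

# Toy network: a triangle (a, b, c) with one pendant node d.
g = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
```

The study's finding can then be read as: intentional learning raises the graph-wide mean of `clustering_coefficient` while `average_shortest_path` stays flat.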
All cells generate contractile tension, which is crucial for mechanically controlling cell shape, function and survival. In this study, the CellDrum technology, which quantifies cellular mechanical tension on a pico-scale, was used to investigate the effect of lipopolysaccharide (LPS) on human aortic endothelial cell (HAoEC) tension. During gram-negative sepsis, LPS causes endothelial cell contraction, which increases endothelial permeability. The aim was to find out whether recombinant activated protein C (rhAPC) would reverse the endothelial cell response in an in-vitro sepsis model. The established in-vitro sepsis model was confirmed by interleukin 6 (IL-6) levels at the proteomic and genomic levels by ELISA and real-time PCR, and by reactive oxygen species (ROS) activation shown by fluorescence staining. The contractile effect of thrombin on endothelial cells was used as a positive control for the CellDrum technology. Additionally, the Ras homolog gene family member A (RhoA) mRNA expression level was checked by real-time PCR to support the contractile tension results. According to these results, the mechanical predominance of actin stress fibers was a cause of the increased endothelial contractile tension, leading to enhanced endothelium contractility and thus enhanced permeability. These data support, first, the basic measurement principles of the CellDrum technology and, second, a beneficial effect of rhAPC on sepsis-influenced cellular tension. The technology presented here is promising for future high-throughput cellular tension analysis that will help identify pathological contractile tension responses of cells and validate further in-vitro cell models.
The CellDrum technology (the term 'CellDrum technology' covers a couple of slightly different technological setups for measuring lateral mechanical tension in various types of cell monolayers or 3D tissue constructs) was designed to quantify the contraction rate and mechanical tension of self-exciting cardiac myocytes. Cells were grown either within flexible, circular collagen gels or as monolayers on top of 1-µm-thin silicone membranes. Membrane and cells were bulged outwards by air pressure. This biaxial strain distribution is rather similar to the beating, blood-filled heart. The setup allowed presetting the mechanical residual stress level externally by adjusting the centre deflection, thus mimicking hypertension in vitro. Tension was measured as the oscillating differential pressure change between chamber and environment. A 0.5-mm-thick collagen-cardiac myocyte tissue construct induced, after 2 days of culturing (initial cell density 2 × 10⁴ cells/ml), a mechanical tension of 1.62 ± 0.17 µN/mm². Mechanical load is an important growth regulator in the developing heart, and the orientation and alignment of cardiomyocytes are stress sensitive. Therefore, it was necessary to develop the CellDrum technology with its biaxial stress-strain distribution and defined mechanical boundary conditions. Cells were exposed to strain in two directions, radially and circumferentially, which is similar to biaxial loading in real heart tissues. Thus, from a biomechanical point of view, the system is preferable to previous setups based on uniaxial stretching.
An optimization method is developed to describe the mechanical behaviour of the human cancellous bone. The method is based on a mixture theory. A careful observation of the behaviour of the bone material leads to the hypothesis that the bone density is controlled by the principal stress trajectories (Wolff’s law). The basic idea of the developed method is the coupling of a scalar value via an eigenvalue problem to the principal stress trajectories. On the one hand this theory will permit a prediction of the reaction of the biological bone structure after the implantation of a prosthesis, on the other hand it may be useful in engineering optimization problems. An analytical example shows its efficiency.
Unser Schlüssel zur Ewigkeit
(2015)
Der Schlüssel zur Ewigkeit
(1999)
Mit Logik die Welt begreifen
(2005)
Our world is well ordered in measurement and number : or why natural constants are as they are
(2013)
All the important natural constants can be logically explained with and derived from the first four ordinal numbers 1, 2, 3 and 4, their sum of ten, and finally the standard values for obviously maximal feasibility Ω and the optimum in our world, the Golden Section (GS), i.e. the number sequences 273 and 618. Both are the first three digits of irrational results of an arithmetical transformation of simple geometrical relationships, creating multiplicity out of singularity. Both show that the infinite is inherent in the finite and explain in a simple way the smallest deviations and fluctuations between the physical as-is state and the obvious spiritual ideal behind it: wherever we look in this world, and especially in important key positions, we regularly find these sequences. All of the above-mentioned numbers thus seem to be key players in our world, which can be demonstrated by the derivation of natural constants.
Twee Kanten van één Medaille
(2020)
Sterben und Tod aus wissenschaftlicher Sicht - dying and death from a scientific point of view
(2018)
Nobody ever dies! / 1. ed.
(2000)
Therefore Fermat is right
(2014)
It was Fermat's idea to investigate how many numbers would fulfil the equation of the Pythagorean Theorem if the exponent were increased arbitrarily, e.g. to a³ + b³ = c³. His question thus became: are there two whole numbers whose cubes add up to the cube of a third whole number? He posed the same question, of course, for all kinds of higher exponents, so that the equation could be generalized: is there an integral solution of the equation aⁿ + bⁿ = cⁿ if the exponent n is higher than 2? Although in 1993 the English mathematician Andrew Wiles was able to produce an arithmetical proof of Fermat's famous theorem, I will show that there is a simple logical explanation which is also pragmatic and plausible, and which results from a fundamentally alternative idea of how our world seems to be constructed.
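The search Fermat proposed is easy to run by machine; a small exhaustive Python search over a bounded range (illustrative only: finding nothing in a finite range proves nothing about the general theorem):

```python
def fermat_counterexamples(n, limit):
    """Search exhaustively for 1 <= a <= b <= limit with a**n + b**n == c**n."""
    nth_powers = {c ** n: c for c in range(1, 2 * limit + 1)}  # c can exceed a, b
    hits = []
    for a in range(1, limit + 1):
        for b in range(a, limit + 1):
            total = a ** n + b ** n
            if total in nth_powers:
                hits.append((a, b, nth_powers[total]))
    return hits

# For n = 2 the search recovers Pythagorean triples such as (3, 4, 5);
# for n = 3 (or any higher n) it finds nothing, as Fermat claimed.
```

Wiles's proof guarantees the n ≥ 3 search stays empty no matter how far the limit is pushed.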
In many books about genetics it can still be read today that our genetic code is called "degenerate", because it is still believed that 4³ = 64 triplets encode the 20 essential amino acids. Indeed we have to assume the inverse law, which means that 3⁴ = 81 exact code positions are really effective for our genetic code and encode the amino acids compiled into proteins. This very important discovery leads to two completely new results that overlook established limits: 1) 3⁴ (= 81) genetic code positions is exactly the same number as there are stable, naturally existing chemical elements in our universe. This argument should now lead to some alternative as well as new fundamental conclusions about our existence. 2) A genetic code positioning system shows that nature is much smarter than expected: mutations are less dangerous than believed, because they will no longer be able to easily cause severe damage in protein synthesis. This should also lead to some alternative views on the evolution of life.
Experimentelle Untersuchungen über die Wirksamkeit verschiedener Schienbeinschoner im Fußballsport
(1985)
Nah- versus Nachtoderfahrungen
Near-death experiences (NDEs) are a phenomenon in the category of "exceptional experiences of consciousness". They occur in the immediate vicinity of one's own death. Often, but not always, they are experiences of persons who were resuscitated by medical measures and later report on them (NDErs). However, such phenomena also occur in people who suffer a life-threatening crisis during a severe illness but recover from it spontaneously.
Similar to NDEs are so-called after-death experiences, as well as spontaneous experiences that occur in other exceptional stress situations. One speaks of after-death experiences when those affected, on the occasion of the death of beloved relatives or friends, have experiences whose content likewise corresponds, at least in part, to that of NDEs.
After-death experiences and spontaneous NDE-like experiences, however, usually differ from genuine NDEs both quantitatively and qualitatively. A quantitative difference in this context means a generally less pronounced complexity than is characteristic of very many NDEs. But since many NDEs also differ among themselves in their complexity, the main difference is qualitative in nature: genuine NDEs possess a greater depth of content and emotion than the other phenomena mentioned here. They subsequently accompany those affected for the rest of their lives, mostly with positive consequences, but in individual cases also with negative ones, which can extend as far as a later suicide.
The exceptional phenomena of consciousness mentioned above can be found, taken as a whole, in around 5% of the population, and NDEs in particular account for a considerable share of them. Depending on the study, between 18% and 40% of all persons who were resuscitated report having experienced an NDE during resuscitation. That not all of them have such an experience is readily interpreted by critics as meaning that NDEs are of a purely physiological and by no means spiritual nature. However, one should bear in mind that if NDEs had a purely neurophysiological basis, such a phenomenon would be expected in everyone, just as, for example, the symptoms of hypoglycaemia are essentially always the same.
Neurophysiologisch ist das nicht alles zu erklären : Nahtoderfahrungen aus wissenschaftlicher Sicht
(2017)
Within the scope of fatigue analyses it must be demonstrated that thermally induced progressive deformations remain bounded. For this purpose, the delimitation of the shakedown regime from the ratchetting regime (progressive deformation) is of interest. Within an EU-funded research project, experiments were carried out with a four-bar model. The experiment consisted of a water-cooled inner tube and three insulated, heatable outer specimen bars. The system was loaded by alternating axial forces, on which alternating temperatures at the outer bars were superimposed. The test parameters were chosen in part on the basis of preceding shakedown analyses. During the tests, temperatures and strains were measured as functions of time. Accompanying and following the tests, the loads and the resulting stresses were reproduced computationally. In this incremental elasto-plastic analysis with the program ANSYS, different material models were applied. The results of this simulation serve to verify the shakedown analyses performed with the FE method.