In this work, the effects of carbon sources and culture media on the production and structural properties of bacterial cellulose (BC) synthesized by Medusomyces gisevii were studied. The culture medium was composed of different initial concentrations of glucose or sucrose dissolved in 0.4% extract of plain green tea. Parameters of the culture media (titratable acidity, substrate conversion degree, etc.) were monitored daily over 20 days of cultivation. The BC pellicles produced on the different carbon sources were characterized in terms of biomass yield, crystallinity and morphology by field emission scanning electron microscopy (FE-SEM), atomic force microscopy and X-ray diffraction. Our results showed that Medusomyces gisevii gave higher BC yields in media with sugar concentrations close to 10 g L−1 after an 18–20 day incubation period. Glucose generally led to a higher BC yield (173 g L−1) than sucrose (163.5 g L−1). The BC crystallinity degree and surface roughness were higher in the samples synthesized from sucrose. The FE-SEM micrographs show that the BC pellicles synthesized in the sucrose media contained densely packed tangles of cellulose fibrils, whereas the BC produced in the glucose media displayed a rather linear geometry of the fibrils without noticeable aggregates.
The number of electric vehicles increases steadily, while the space for extending the charging infrastructure is limited. Particularly in urban areas, where parking spaces in attractive locations are in high demand, the opportunities to set up new charging stations are very limited. This leads to an overload of some very attractive charging stations and an underutilization of less attractive ones. Against this background, the paper at hand presents the design of an e-vehicle reservation system that aims at distributing the utilization of the charging infrastructure, particularly in urban areas. By applying a design science approach, the requirements for a reservation-based utilization approach are elicited, and a model for a suitable distribution approach and its instantiation are developed. The artefact is evaluated by simulating the distribution effects based on data of real charging station utilization.
The healthcare sector faces rising costs and an increasingly difficult staffing situation. At the same time, modern voice control systems promise to streamline processes in medical practices and hospitals and to speed up workflows. Nevertheless, voice control systems are currently only rarely found in medical practices or hospitals, which is partly due to the particularly strict data protection requirements of the General Data Protection Regulation (GDPR). Moreover, the low adoption rate raises the question of the concrete requirements and their feasibility, which this contribution addresses by evaluating the results of interviews with eight medical domain experts. In addition, the technical feasibility of individual requirements is tested with different cloud providers.
Business process automation is often held back by scarce IT resources, missing software interfaces, or an outdated system landscape that has grown complex over time. Robotic Process Automation (RPA) is a promising method for automating business processes at the user-interface level without major system interventions and for eliminating media discontinuities. Selecting suitable processes is crucial for the success of RPA projects. This contribution provides selection criteria for this purpose, derived from a qualitative content analysis of eleven interviews with RPA experts from the insurance sector. The result comprises a weighted list of seven dimensions and 51 process criteria that favor automation with software robots, or whose non-fulfillment hampers or even prevents implementation. The three most important criteria for selecting business processes for automation with RPA are relieving the employees involved in the process (employee overload), the executability of the process by means of rules (rule-based process control), and a positive cost-benefit comparison. Practitioners can use these criteria to make a systematic selection of RPA-relevant processes. From a scientific perspective, the results provide a basis for explaining the success and failure of RPA projects.
Wind energy represents the dominant share of renewable energies. The rotor blades of a wind turbine are typically made from composite material, which withstands high forces during rotation. The huge dimensions of the rotor blades complicate the inspection processes in manufacturing. The automation of inspection processes has great potential to increase overall productivity and to create a consistent, reliable database for each individual rotor blade. This paper focuses on automating the rotor blade inspection process by utilizing an autonomous mobile manipulator. The main innovations include a novel path planning strategy for zone-based navigation, which enables an intuitive right-hand or left-hand driving behavior in a shared human–robot workspace. In addition, we introduce a new method for surface-orthogonal motion planning in connection with large-scale structures. An overall execution strategy controls the navigation and manipulation processes of the long-running inspection task. The implemented concepts are evaluated in simulation and applied in a real use case including the tip of a rotor blade form.
Artworks, as well as their presentation and mediation, are increasingly supported by digital technologies. Virtual exhibitions, internet projects, and complex data archives place the artwork in a media context that goes far beyond the moment of technical reproducibility. The ubiquitous concept of networking dynamizes art, its recipients, and exhibition venues. The relationships between these fields are defined and visualized with the help of physiological metaphors. Former repositories and archives are drawn into a processual pull in which everything fluctuates, briefly connects, dissolves, and enters into permanent dialogue with its surroundings: in current assessments of its position, the virtual museum comes close to the definition of artificial life.
Subtilisins from microbial sources, especially from the Bacillaceae family, are of particular interest for biotechnological applications and serve the currently growing enzyme market as efficient and novel biocatalysts. Biotechnological applications include use in detergents, cosmetics, leather processing, wastewater treatment and pharmaceuticals. To identify a possible candidate for the enzyme market, we cloned the gene of the subtilisin SPFA from Fictibacillus arsenicus DSM 15822ᵀ (obtained through a data mining-based search) and expressed it in Bacillus subtilis DB104. After production and purification, the protease showed a molecular mass of 27.57 kDa and a pI of 5.8. SPFA displayed hydrolytic activity with a temperature optimum of 80 °C and a very broad pH optimum between 8.5 and 11.5, with high activity up to pH 12.5. SPFA displayed no NaCl dependence but a high NaCl tolerance, with activity decreasing up to concentrations of 5 M NaCl, while stability increased with increasing NaCl concentration. Based on its substrate preference for 10 synthetic peptide 4-nitroanilide substrates with three or four amino acids and its phylogenetic classification, SPFA can be assigned to the subgroup of true subtilisins. Moreover, SPFA exhibited high tolerance to 5% (w/v) SDS and 5% (v/v) H₂O₂. The biochemical properties of SPFA, especially its tolerance of remarkably high pH, SDS and H₂O₂, suggest it has potential for biotechnological applications.
Halophilic and halotolerant microorganisms represent a promising source of salt-tolerant enzymes suitable for various biotechnological applications where high salt concentrations would otherwise limit enzymatic activity. Considering the currently growing enzyme market and the need for more efficient and new biocatalysts, the present study aimed at the characterization of a high-alkaline subtilisin from Alkalihalobacillus okhensis Kh10-101ᵀ. The protease gene was cloned and expressed in Bacillus subtilis DB104. The recombinant protease SPAO, with 269 amino acids, belongs to the subfamily of high-alkaline subtilisins. The biochemical characteristics of purified SPAO were analyzed in comparison with subtilisin Carlsberg, Savinase, and BPN'. SPAO, a monomer with a molecular mass of 27.1 kDa, was active over a wide range of pH 6.0–12.0 and temperature 20–80 °C, optimally at pH 9.0–9.5 and 55 °C. The protease is highly oxidatively stable to hydrogen peroxide: it retained 58% residual activity when incubated at 10 °C with 5% (v/v) H₂O₂ for 1 h and was even stimulated by 1% (v/v) H₂O₂. Furthermore, SPAO was very stable and active at NaCl concentrations of up to 5.0 M. This study demonstrates the potential of SPAO for future biotechnological applications.
The aim of the present study was the characterisation of three true subtilisins and one phylogenetically intermediate subtilisin from halotolerant and halophilic microorganisms. Considering the currently growing enzyme market for efficient and novel biocatalysts, data mining is a promising source of novel, as yet uncharacterised enzymes, especially from halophilic or halotolerant Bacillaceae, which offer great potential to meet industrial needs. The two halophilic bacteria Pontibacillus marinus DSM 16465ᵀ and Alkalibacillus haloalkaliphilus DSM 5271ᵀ and the two halotolerant bacteria Metabacillus indicus DSM 16189 and Litchfieldia alkalitelluris DSM 16976ᵀ served as sources for the four new subtilisins SPPM, SPAH, SPMI and SPLA. The protease genes were cloned and expressed in Bacillus subtilis DB104. Purification to apparent homogeneity was achieved by ethanol precipitation, desalting and ion-exchange chromatography. Enzyme activity could be observed between pH 5.0 and 12.0, with an optimum around pH 9.0 for SPPM, SPMI and SPLA and at pH 10.0 for SPAH. The optimal temperature for SPMI and SPLA was 70 °C, and for SPPM and SPAH 55 °C and 50 °C, respectively. All proteases showed high stability towards 5% (w/v) SDS and were active even at NaCl concentrations of 5 M. The four proteases demonstrate potential for future biotechnological applications.
Several years of statistical studies at FH Aachen have produced, among other things, an entrance test that is used as a diagnostic tool for a successful start to a degree program. It has turned out that a test result of fewer than 25 points (out of a maximum of 56) considerably reduces the chance of a successful start. About half of all first-semester students score fewer than 25 points on the entrance test. Fewer than 20% of this group pass the Mathematics 1 exam within one year. With regard to the gain in knowledge, and thus ultimately to academic success, the two semesters invested are not used efficiently. We therefore set up a semester-long preparatory course for this group in the winter semester 2013/14. The goal of such a course is to enable the students to follow the mathematics lectures without difficulty after two semesters. This article describes the concept of this preparatory course and presents first results and problems of the pilot run.
For short take-off and landing (STOL) aircraft, a parallel hybrid-electric propulsion system potentially offers superior performance compared to a conventional propulsion system, because the short-take-off power requirement is much higher than the cruise power requirement. This power-matching problem can be solved with a balanced hybrid propulsion system. However, there is a trade-off between wing loading, power loading, the level of hybridization, as well as range and take-off distance. An optimization method can vary design variables in such a way that a minimum of a particular objective is attained. In this paper, a comparison between the optimization results for minimum mass, minimum consumed primary energy, and minimum cost is conducted. A new initial sizing algorithm for general aviation aircraft with hybrid-electric propulsion systems is applied. This initial sizing methodology covers point performance, mission performance analysis, the weight estimation process, and cost estimation. The methodology is applied to the design of a STOL general aviation aircraft, intended for on-demand air mobility operations. The aircraft is sized to carry eight passengers over a distance of 500 km, while able to take off and land from short airstrips. Results indicate that parallel hybrid-electric propulsion systems must be considered for future STOL aircraft.
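The level of hybridization mentioned above has a common textbook definition that can be sketched briefly; the numbers below are assumed example values, not the paper's design results, and the paper's exact formulation may differ:

```python
def power_hybridization(p_electric_kw: float, p_combustion_kw: float) -> float:
    """Supplied-power degree of hybridization H_P = P_electric / P_total.

    Common textbook definition for parallel hybrid-electric propulsion;
    the paper's exact formulation may differ in detail.
    """
    return p_electric_kw / (p_electric_kw + p_combustion_kw)

# Assumed example: 120 kW electric motor alongside a 180 kW combustion engine.
# The electric machine can cover the short-lived take-off power peak, so the
# combustion engine can be sized closer to the cruise power requirement.
h_p = power_hybridization(120.0, 180.0)  # 0.4, i.e. 40% of installed power is electric
```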
Table of Contents
Introduction
1. Generative Manufacturing Processes
2. Classification of Generative Manufacturing Processes
3. Application of Generative Processes to the Fabrication of Ceramic Parts
3.1 Extrusion
3.2 3D-Printing
3.3 Sintering – Laser Sintering
3.4 Layer-Laminate Processes
3.5 Stereolithography (sometimes written: Stereo Lithography)
4. Layer Milling
5. Conclusion – Vision
Generative (additive) manufacturing processes have been known in the form of rapid prototyping since about 1987 in the USA and since about 1990 in Europe and Germany. In that time they have evolved from model-making methods once regarded as exotic into efficient tools for accelerating product development. With the further development of the processes, and in particular of the materials, the field of applying rapid technology directly to manufacturing is increasingly being opened up. Rapid technologies are thus becoming the key to new design methodologies and manufacturing strategies.
In the guise of rapid prototyping, the additive manufacturing of plastic parts has had a lasting positive influence on product development and, as rapid manufacturing, is about to revolutionize production. The more widely the special properties of additively manufactured plastic parts become known, the louder the call for metal parts grows. The development of corresponding processes is running at full speed but has so far produced only isolated successes. Yet it is precisely metal parts that, endowed with the special characteristics of additively manufactured workpieces, could trigger a significant development push in many industries. For the potential user it is particularly confusing that a wide variety of approaches are being pursued side by side. The following therefore attempts to present this broad field in a systematic way and to explain possibilities and trends.
When a process called stereolithography and a Stereolithography Apparatus (SLA) were introduced around 1987, the dream of producing arbitrary three-dimensional parts directly from computer data and without part-specific tooling had become reality. An application scenario was supplied along with it: this technology would make it possible to produce the entire spare-parts supply of the American Pacific Fleet on demand, directly on site on an aircraft carrier, using a few of these machines, extensive data sets, and sufficient raw material. Even then, this vision defined direct digital manufacturing, i.e. rapid manufacturing. In reality, the parts produced with this process were made only of plastic, were inaccurate, fragile, and sticky, and could be used only in product development, namely as prototypes. They were quickly available because no tooling was needed to produce them. Consequently, and fashionably, they were called rapid prototypes. Rapid prototyping quickly became a synonym for a new branch of manufacturing technology: additive manufacturing. Further development brought new processes, higher accuracies, improved materials, and new applications. The production of negatives, i.e. tooling, with the same processes was, driven by marketing, called rapid tooling; and when the first parts were used not as prototypes but as end products, this was called rapid manufacturing. The goal was reached. But was the goal really reached? Is it rapid manufacturing when an additively manufactured part meets the desired specification?
What has to happen for the phenomenon of rapid prototyping to become a strategy capable of enabling a paradigm shift from today's manufacturer-driven mass production of mass-market articles to consumer-driven (and consumer-accountable) mass production of individual parts for everyone, and possibly of profoundly influencing the way we work and live? This contribution examines the notion of the (manufacturing) strategy "rapid manufacturing" more closely. It discusses which measures must be taken on the technical and the operational level so that additive manufacturing technology can be implemented in the sense of this strategy. Examples show that this development has already begun and provide impulses for a constructive discussion at RapidTech 2006.
The computation of flows through components lags well behind that of flows around components. This is mainly due to the lack of suitable optically transparent model channels for experimental analysis. This contribution presents a process for producing transparent flow geometries based on additively manufactured master patterns, which allows arbitrarily complex internal flows to be analyzed optically. The process is demonstrated using two examples from medicine: the modeling of the upper airways and of the bronchial tree. The additive build process by 3D printing is described, and the casting in transparent silicone is shown. Finally, the measurement setup and results of the application are presented by way of example. The process provides the basis for the analysis and computation of complex internal flows and thus contributes to the improvement of numerous technical applications.
This study investigated the damping performance of shin guards as used in football. It was carried out with a pendulum impact tester that allowed different impact forces to be applied to the guards. It became clear that shin guards achieve their best performance at maximum forces below 5 kN, but that there is room for improvement at higher loads. A good starting point for this, among other things through the use of new materials, was found in an "adequate interplay of shell and padding" of the guards. The investigation further showed that, at least in part, a clear improvement in the damping performance of shin guards has been achieved in recent years.
A method for detecting and approximating fault lines or surfaces, respectively, or decision curves in two and three dimensions with guaranteed accuracy is presented. Reformulated as a classification problem, our method starts from a set of scattered points along with the corresponding classification algorithm to construct a representation of a decision curve by points with prescribed maximal distance to the true decision curve. In doing so, our algorithm ensures that the representing point set covers the decision curve in its entire extent and features local refinement based on the geometric properties of the decision curve. We demonstrate applications of our method to problems related to the detection of faults, to multi-criteria decision aid and, in combination with Kirsch's factorization method, to solving an inverse acoustic scattering problem. In all applications we considered in this work, our method requires significantly fewer pointwise classifications than previously employed algorithms.
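The core idea of locating a decision boundary to prescribed accuracy using only pointwise classifications can be illustrated in one dimension: between two points classified differently, bisection brackets the boundary to any tolerance. This is a minimal sketch of that principle only, not the authors' full algorithm (which represents curves and surfaces and refines locally); the toy classifier is an assumption:

```python
def locate_boundary(classify, a: float, b: float, tol: float = 1e-6) -> float:
    """Bisect the segment [a, b], whose endpoints are classified differently,
    until the bracketing interval is shorter than tol; return its midpoint,
    which then lies within tol of the true decision boundary."""
    ca = classify(a)
    assert ca != classify(b), "endpoints must lie on different sides"
    while b - a > tol:
        mid = 0.5 * (a + b)
        if classify(mid) == ca:
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)

# Toy example (assumed): the classifier splits the line at x = 0.3
x_boundary = locate_boundary(lambda x: x >= 0.3, 0.0, 1.0)
```

Each bisection step costs exactly one classification, which is why boundary-tracking methods of this kind can use far fewer classifications than sampling on a dense grid.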
Reliable automation of the labor-intensive manual task of scoring animal sleep can facilitate the analysis of long-term sleep studies. In recent years, deep-learning-based systems, which learn optimal features from the data, have increased scoring accuracies for the classical sleep stages of Wake, REM, and Non-REM. Meanwhile, it has been recognized that the statistics of transitional stages such as pre-REM, found between Non-REM and REM, may hold additional insight into the physiology of sleep and are now under active investigation. We propose a classification system based on a simple neural network architecture that scores the classical stages as well as pre-REM sleep in mice. When restricted to the classical stages, the optimized network showed state-of-the-art classification performance with an out-of-sample F1 score of 0.95 in male C57BL/6J mice. When unrestricted, the network showed lower F1 scores on pre-REM (0.5) compared to the classical stages. The result is comparable to previous attempts to score transitional stages in other species, such as transition sleep in rats or N1 sleep in humans. Nevertheless, we observed that the sequence of predictions including pre-REM typically transitioned from Non-REM to REM, reflecting sleep dynamics observed by human scorers. Our findings provide further evidence for the difficulty of scoring transitional sleep stages, likely because such stages of sleep are under-represented in typical data sets or show large inter-scorer variability. We further provide our source code and an online platform to run predictions with our trained network.
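The F1 score used to report classification performance above is the harmonic mean of precision and recall. A minimal illustration of the definition (the epoch counts are assumed example numbers, not the study's data):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = 2 * precision * recall / (precision + recall), computed from
    true positives, false positives, and false negatives for one class."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Assumed example: 95 correctly scored epochs of a stage, 5 false alarms,
# 5 missed epochs -> precision = recall = F1 = 0.95
score = f1_score(95, 5, 5)
```

Because F1 is computed per class, a rare class such as pre-REM can score far lower than the overall accuracy suggests, which is why it is reported separately.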
The paper presents the derivation of a new equivalent skin friction coefficient for estimating the parasitic drag of short-to-medium range fixed-wing unmanned aircraft. The new coefficient is derived from an aerodynamic analysis of ten different unmanned aircraft used for surveillance, reconnaissance, and search and rescue missions. The aircraft are simulated using a validated unsteady Reynolds-averaged Navier–Stokes approach. The UAV's parasitic drag is significantly influenced by the presence of miscellaneous components like fixed landing gears or electro-optical sensor turrets. These components are responsible for almost half of an unmanned aircraft's total parasitic drag. The new equivalent skin friction coefficient accounts for these effects and is significantly higher compared to other aircraft categories. It is used to initially size an unmanned aircraft for a typical reconnaissance mission. The improved parasitic drag estimation yields a much heavier unmanned aircraft when compared to the sizing results using available drag data of manned aircraft.
N-Acyl-amino acids can act as mild biobased surfactants, which are used, e.g., in baby shampoos. However, their chemical synthesis requires acyl chlorides and does not meet sustainability criteria. Thus, the identification of biocatalysts to develop greener synthesis routes is desirable. We describe a novel aminoacylase from Paraburkholderia monticola DSM 100849 (PmAcy) which was identified, cloned, and evaluated for its N-acyl-amino acid synthesis potential. Soluble protein was obtained by expression in lactose autoinduction medium and co-expression of the molecular chaperones GroEL/S. Strep-tag affinity purification enriched the enzyme 16-fold and yielded 15 mg of pure enzyme from 100 mL of culture. Biochemical characterization revealed that PmAcy possesses beneficial traits for industrial application, such as high temperature and pH stability. A heat activation of PmAcy was observed upon incubation at temperatures up to 80 °C. Hydrolytic activity of PmAcy was detected with several N-acyl-amino acids as substrates, with the highest conversion rate of 773 U/mg with N-lauroyl-L-alanine at 75 °C. The enzyme preferred long-chain acyl-amino acids and displayed hardly any activity with acetyl-amino acids. PmAcy was also capable of N-acyl-amino acid synthesis with good conversion rates. The best synthesis results were obtained with the cationic L-amino acids L-arginine and L-lysine as well as with L-leucine and L-phenylalanine. For example, L-phenylalanine was acylated with fatty acids of chain lengths from C8 to C18 with conversion rates of up to 75%. N-Lauroyl-L-phenylalanine was purified by precipitation, and the structure of the reaction product was verified by LC–MS and NMR.
Amino acid-based surfactants are valuable compounds for cosmetic formulations. The chemical synthesis of acyl-amino acids is conventionally performed by the Schotten-Baumann reaction using fatty acyl chlorides, but aminoacylases have also been investigated for use in biocatalytic synthesis with free fatty acids. Aminoacylases and their properties are diverse; they belong to different peptidase families and show differences in substrate specificity and biocatalytic potential. Bacterial aminoacylases capable of synthesis have been isolated from Burkholderia, Mycolicibacterium, and Streptomyces. Although several proteases and peptidases from S. griseus have been described, no aminoacylases from this species have been identified yet. In this study, we investigated two novel enzymes produced by S. griseus DSM 40236ᵀ. We identified and cloned the respective genes and recombinantly expressed an α-aminoacylase (EC 3.5.1.14), designated SgAA, and an ε-lysine acylase (EC 3.5.1.17), designated SgELA, in S. lividans TK23. The purified aminoacylase SgAA was biochemically characterized, focusing on its hydrolytic activity to determine temperature and pH optima and stabilities. The aminoacylase could hydrolyze various acetyl-amino acids at the Nα-position with a broad specificity regarding the sidechain. Substrates with longer acyl chains, like lauroyl-amino acids, were hydrolyzed to a lesser extent. Purified aminoacylase SgELA, specific for the hydrolysis of Nε-acetyl-L-lysine, was unstable and lost its enzymatic activity upon longer storage but could initially be characterized. The pH optimum of SgELA was pH 8.0. While synthesis of acyl-amino acids was not observed with SgELA, SgAA catalyzed the synthesis of lauroyl-methionine.
Background
Aminoacylases are highly promising enzymes for the green synthesis of acyl-amino acids, potentially replacing the environmentally harmful Schotten-Baumann reaction. Long-chain acyl-amino acids can serve as strong surfactants and emulsifiers, with application in cosmetic industries. Heterologous expression of these enzymes, however, is often hampered, limiting their use in industrial processes.
Results
We identified a novel mycobacterial aminoacylase gene from Mycolicibacterium smegmatis MKD 8, cloned and expressed it in Escherichia coli and Vibrio natriegens using the T7 overexpression system. The recombinant enzyme was prone to aggregate as inclusion bodies, and while V. natriegens Vmax™ could produce soluble aminoacylase upon induction with isopropyl β-D-1-thiogalactopyranoside (IPTG), E. coli BL21 (DE3) needed autoinduction with lactose to produce soluble recombinant protein. We successfully conducted a chaperone co-expression study in both organisms to further enhance aminoacylase production and found that overexpression of the chaperones GroEL/S enhanced aminoacylase activity in the cell-free extract 1.8-fold in V. natriegens and E. coli. Eventually, E. coli ArcticExpress™ (DE3), which co-expresses the cold-adapted chaperonins Cpn60/10 from Oleispira antarctica, cultivated at 12 °C, proved to be the most suitable expression system for this aminoacylase and exhibited twice the aminoacylase activity in the cell-free extract compared to E. coli BL21 (DE3) with GroEL/S co-expression at 20 °C. The purified aminoacylase was characterized based on its hydrolytic activities, being most stable and active at pH 7.0, with a maximum activity at 70 °C, and stable at 40 °C and pH 7.0 for 5 days. The aminoacylase strongly prefers short-chain acyl-amino acids with smaller, hydrophobic amino acid residues. Several long-chain acyl-amino acids were fairly well accepted in hydrolysis as well, especially N-lauroyl-L-methionine. To initially evaluate the relevance of this aminoacylase for the synthesis of N-acyl-amino acids, we demonstrated that lauroyl-methionine can be synthesized from lauric acid and methionine in an aqueous system.
Conclusion
Our results suggest that the recombinant enzyme is well suited for synthesis reactions and will thus be further investigated.
This study focuses on thermoelectric elements (TEE) as an alternative for room temperature control. TEE are semiconductor devices that can provide heating and cooling via a heat pump effect, without direct noise emissions and without refrigerants. An efficiency evaluation of the optimal operating mode is carried out for different numbers of TEE, ambient temperatures, and heating loads. The influence of an additional heat recovery unit on system efficiency and of an unevenly distributed heating demand is examined. The results show that TEE can provide heat at a coefficient of performance (COP) greater than one, especially for small heating demands and high ambient temperatures. The efficiency increases with the number of elements in the system and is subject to economies of scale. The best COP exceeds six at optimal operating conditions. An additional heat recovery unit proves beneficial for low ambient temperatures and systems with few TEE; it makes COPs above one possible at ambient temperatures below 0 °C. The effect increases efficiency by at most 0.81 (from 1.90 to 2.71) at an ambient temperature 5 K below room temperature and a heating demand of Q̇ₕ = 100 W, but is subject to diseconomies of scale. Thermoelectric technology is a valuable option for electricity-based heat supply and can provide cooling and ventilation functions. A careful system design as well as an additional heat recovery unit significantly benefits the performance. This makes TEE superior to direct-current heating systems and competitive with heat pumps for small-scale applications with a focus on avoiding noise and harmful refrigerants.
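The COP figures above follow the standard definition of useful heat per unit of electrical input, bounded by the Carnot limit. A hedged sketch of both quantities; the example numbers are assumptions for illustration, not the study's measurements:

```python
def cop(heat_output_w: float, electrical_input_w: float) -> float:
    """Coefficient of performance: useful heat delivered per unit of
    electrical input. COP > 1 means the device outperforms direct
    resistive heating."""
    return heat_output_w / electrical_input_w

def carnot_cop_heating(t_room_k: float, t_ambient_k: float) -> float:
    """Ideal (Carnot) heating COP; real thermoelectric systems reach only a
    fraction of this bound, which shrinks as the ambient gets colder."""
    return t_room_k / (t_room_k - t_ambient_k)

# Assumed example: room at 293 K, ambient 5 K colder
ideal = carnot_cop_heating(293.0, 288.0)  # large because the lift is small
real = cop(100.0, 52.6)                   # ~1.90 for an assumed 52.6 W input
```

The widening gap between the ideal and real values at larger temperature lifts is consistent with the observation that heat recovery pays off mainly at low ambient temperatures.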
New European Union (EU) regulations for UAS operations require an operational risk analysis, which includes an estimation of the potential danger of the UAS crashing. A key parameter for the potential ground risk is the kinetic impact energy of the UAS. The kinetic energy depends on the impact velocity of the UAS and, therefore, on the aerodynamic drag and the weight during free fall. Hence, estimating the impact energy of a UAS requires an accurate drag estimation of the UAS in that state. The paper at hand presents the aerodynamic drag estimation of small-scale multirotor UAS. Multirotor UAS of various sizes and configurations were analysed with a fully unsteady Reynolds-averaged Navier–Stokes approach. These simulations included different velocities and various fuselage pitch angles of the UAS. The results were compared against force measurements performed in a subsonic wind tunnel and showed good consistency. Furthermore, the influence of the UAS's fuselage pitch angle as well as the influence of fixed and free-spinning propellers on the aerodynamic drag was analysed. Free-spinning propellers may increase the drag by up to 110%, depending on the fuselage pitch angle. Increasing the fuselage pitch angle of the UAS lowers the drag by 40% up to 85%, depending on the UAS. The data presented in this paper allow for increased accuracy of ground risk assessments.
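The chain from drag to impact energy described above can be sketched with the standard free-fall relations: at terminal velocity, drag balances weight, and the kinetic energy follows from that velocity. This is a simplified illustration, not the paper's model, and all numeric inputs are assumed example values:

```python
import math

def terminal_velocity(mass_kg: float, drag_coeff: float, ref_area_m2: float,
                      rho: float = 1.225, g: float = 9.81) -> float:
    """Terminal velocity where drag balances weight:
    m*g = 0.5 * rho * v^2 * Cd * A  ->  v = sqrt(2*m*g / (rho*Cd*A))."""
    return math.sqrt(2.0 * mass_kg * g / (rho * drag_coeff * ref_area_m2))

def impact_energy(mass_kg: float, velocity_ms: float) -> float:
    """Kinetic energy E = 0.5 * m * v^2 at impact."""
    return 0.5 * mass_kg * velocity_ms ** 2

# Assumed example multirotor: 2 kg, Cd = 1.0, reference area 0.05 m^2
v_t = terminal_velocity(2.0, 1.0, 0.05)
e_impact = impact_energy(2.0, v_t)
```

An underestimated drag coefficient inflates the terminal velocity and, quadratically, the impact energy, which is why accurate drag data matter for ground risk assessment.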
We consider a binary multivariate regression model where the conditional expectation of a binary variable given a higher-dimensional input variable belongs to a parametric family. Based on this, we introduce a model-based bootstrap (MBB) test for higher-dimensional input variables, which can be used to check whether a sequence of independent and identically distributed observations belongs to such a parametric family. The approach is based on the empirical residual process introduced by Stute (Ann Statist 25:613–641, 1997). In contrast to the approach of Stute & Zhu (Scandinavian J Statist 29:535–545, 2002), a transformation is not required. Thus, any problems associated with non-parametric regression estimation are avoided. As a result, the MBB method is much easier for users to implement. To illustrate the power of the MBB-based tests, a small simulation study is performed. Compared to the approach of Stute & Zhu (2002), the simulations indicate a slightly improved power of the MBB-based method. Finally, both methods are applied to a real data set.
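The model-based bootstrap idea can be sketched for a logistic specification. The statistic below is a simplified one-dimensional cumulative-residual functional in the spirit of Stute's empirical residual process, not the exact test from the paper; all concrete numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, steps=25):
    """Newton-Raphson MLE for a logistic model (intercept included in X)."""
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)
        b += np.linalg.solve((X * W[:, None]).T @ X + 1e-8 * np.eye(X.shape[1]),
                             X.T @ (y - p))
    return b

def marked_residual_stat(X, y, b):
    """Sup of the cumulative residual process, ordered by the linear predictor."""
    eta = X @ b
    r = y - 1 / (1 + np.exp(-eta))
    return np.abs(np.cumsum(r[np.argsort(eta)])).max() / np.sqrt(len(y))

# Data generated under the parametric null hypothesis (logistic model holds).
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ np.array([0.3, 1.0, -0.5]))))

b_hat = fit_logistic(X, y)
t_obs = marked_residual_stat(X, y, b_hat)

# Model-based bootstrap: resample responses from the FITTED model,
# refit, and recompute the statistic to approximate its null distribution.
p_hat = 1 / (1 + np.exp(-X @ b_hat))
t_boot = []
for _ in range(199):
    y_star = rng.binomial(1, p_hat)
    t_boot.append(marked_residual_stat(X, y_star, fit_logistic(X, y_star)))

p_value = (1 + sum(t >= t_obs for t in t_boot)) / (1 + len(t_boot))
```

No transformation of the residual process is needed here: the bootstrap replicates are drawn directly from the fitted parametric model, which is the practical simplification the abstract highlights.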
"INGMEDIA: Entwicklung und Evaluation interaktiver, multimedialer Lernsoftware für technische und physikalische Praktika in Ingenieurstudiengängen". So lautet der Titel des vom bmb+f im Förderprogramm "Neue Medien in der Hochschullehre" unterstützten Verbundprojekts. [...] Im vorliegenden Beitrag wird über das Evaluationskonzept von INGMEDIA berichtet. Es handelt sich hierbei um einen im E-Learning-Bereich bisher kaum vertretenen Ansatz hochschuldidaktischer Aktionsforschung. Der Beitrag betont entsprechend des kevih - Tagungskonzepts (Tübingen 11./12.3.03) die besonderen hochschuldidaktischen Zielrichtungen, fokussiert also klar auf der konzeptionellen Ebene. Die Umsetzung und Evaluationsergebnisse zu INGMEDIA werden nach Projektabschluss an anderer Stelle veröffentlicht.
Laboratory courses offer students particular learning opportunities: they experience interrelations and collaboration with head (cognitive), heart (affective), and hand (motor skills). Multimedia preparation and support of the laboratory courses with INGMEDIA allows these learning advantages to be exploited more intensively. Diverse and differentiated contextual offerings provide individual entry points into learning and activate self-directed learning. Improved prior knowledge and motivation open up new freedom for teachers and learners in designing the face-to-face sessions. Action research in higher-education didactics accompanies the further development of the software and the face-to-face sessions as an ongoing process during regular teaching.
The increasing share of renewable electricity in the grid drives the need for sufficient storage capacity. Especially for seasonal storage, power-to-gas can be a promising approach. Biologically produced methane from hydrogen generated with surplus electricity can substitute natural gas in the existing infrastructure. Current reactor types are not, or only poorly, optimized for flexible methanation. Therefore, this work proposes a new reactor type with a plug flow reactor (PFR) design. Simulations in COMSOL Multiphysics® showed promising properties for operation in laminar flow. An experiment was conducted to support the simulation results and to determine the gas fraction of the novel reactor, which was measured to be 29%. Based on these simulations and experimental results, the reactor was constructed as a 14 m long, 50 mm diameter tube with a meandering orientation. Data processing was established, and a step experiment was performed. In addition, a kLa of 1 h−1 was determined. The results revealed that the experimental outcomes for the type of flow and the gas fractions are in line with the theoretical simulation. The new design shows promising properties for flexible methanation and will be tested further.
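The reported kLa of 1 h−1 can be interpreted with the standard first-order gas–liquid mass-transfer model. This is a sketch with normalized concentrations, not measured values from the study:

```python
import math

def dissolved_gas(t_h, kla_per_h, c_sat, c0=0.0):
    """First-order gas-liquid transfer: dC/dt = kLa * (C_sat - C),
    solved as C(t) = C_sat - (C_sat - C0) * exp(-kLa * t)."""
    return c_sat - (c_sat - c0) * math.exp(-kla_per_h * t_h)

# With kLa = 1 h^-1 (value reported in the abstract) and normalized
# saturation C_sat = 1, the liquid reaches ~63% saturation after one hour.
c = dissolved_gas(1.0, 1.0, 1.0)
```

A kLa of 1 h−1 is thus a slow transfer rate, consistent with the laminar-flow operation of the proposed tube reactor.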
An acetoin biosensor based on a capacitive electrolyte–insulator–semiconductor (EIS) structure modified with the enzyme acetoin reductase, also known as butane-2,3-diol dehydrogenase (Bacillus clausii DSM 8716ᵀ), is applied for acetoin detection in beer, red wine, and fermentation broth samples for the first time. The EIS sensor consists of an Al/p-Si/SiO₂/Ta₂O₅ layer structure with acetoin reductase immobilized on top of the Ta₂O₅ transducer layer by means of crosslinking via glutaraldehyde. The unmodified and enzyme-modified sensors are electrochemically characterized by means of leakage-current, capacitance–voltage, and constant-capacitance measurements.
Plant viruses are major contributors to crop losses and induce high economic costs worldwide. For reliable, on-site and early detection of plant viral diseases, portable biosensors are of great interest. In this study, a field-effect SiO2-gate electrolyte-insulator-semiconductor (EIS) sensor was utilized for the label-free electrostatic detection of tobacco mosaic virus (TMV) particles as a model plant pathogen. The capacitive EIS sensor was characterized regarding its TMV sensitivity by means of the constant-capacitance method. The EIS sensor was able to detect biotinylated TMV particles from a solution with a TMV concentration as low as 0.025 nM. A good correlation was observed between the registered EIS sensor signal and the density of adsorbed TMV particles assessed from scanning electron microscopy images of the SiO2-gate chip surface. Additionally, the isoelectric point of the biotinylated TMV particles was determined via zeta potential measurements, and the influence of the ionic strength of the measurement solution on the TMV-modified EIS sensor signal was studied.
In this work, the bioabsorbable materials fibroin, polylactic acid (PLA), magnesium, and magnesium oxide are investigated for their application as transient, resistive temperature detectors (RTD). For this purpose, a thin-film magnesium-based meander-like electrode is deposited onto a flexible, bioabsorbable substrate (fibroin or PLA) and encapsulated (passivated) by additional magnesium oxide layers on top of and below the magnesium-based electrode. The morphology of the differently layered RTDs is analyzed by scanning electron microscopy. The sensor performance and lifetime of the RTDs are characterized both under ambient atmospheric conditions between 30 °C and 43 °C and under wet, tissue-like conditions at a constant temperature of 37 °C. The latter triggers the degradation process of the magnesium-based layers. The three-layer RTDs on a PLA substrate achieved a lifetime of 8.5 h. These sensors also show the best sensor performance under ambient atmospheric conditions, with a mean sensitivity of 0.48 Ω/°C ± 0.01 Ω/°C.
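The quoted mean sensitivity of 0.48 Ω/°C corresponds to a linear RTD model. The sketch below inverts that model to recover temperature from a resistance reading; the base resistance R0 and the example reading are hypothetical, only the sensitivity and the temperature range come from the abstract:

```python
def rtd_temperature(r_ohm, r0_ohm, sensitivity_ohm_per_c=0.48, t0_c=30.0):
    """Invert the linear RTD model R(T) = R0 + s * (T - T0).

    sensitivity_ohm_per_c = 0.48 is the mean value reported in the abstract;
    R0 at the reference temperature T0 = 30 C is a hypothetical placeholder.
    """
    return t0_c + (r_ohm - r0_ohm) / sensitivity_ohm_per_c

# Hypothetical reading: with R0 = 100 Ohm, a measured 103.36 Ohm maps to 37 C,
# i.e. body temperature inside the 30-43 C characterization range.
t37 = rtd_temperature(103.36, 100.0)
```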
Herein, fibroin, polylactide (PLA), and carbon are investigated for their suitability as biocompatible and biodegradable materials for amperometric biosensors. For this purpose, screen-printed carbon electrodes on the biodegradable substrates fibroin and PLA are modified with a glucose oxidase membrane and then encapsulated with the biocompatible material Ecoflex. The influence of different curing parameters of the carbon electrodes on the resulting biosensor characteristics is studied. The morphology of the electrodes is investigated by scanning electron microscopy, and the biosensor performance is examined by amperometric measurements of glucose (0.5–10 mM) in phosphate buffer solution, pH 7.4, at an applied potential of 1.2 V versus a Ag/AgCl reference electrode. Instead of Ecoflex, fibroin, PLA, and wound adhesive are tested as alternative encapsulation compounds: a series of swelling tests with different fibroin compositions, PLA, and Ecoflex was performed before characterizing the most promising candidates by chronoamperometry. For this, the carbon electrodes are completely covered with the respective encapsulation material. Chronoamperometric measurements with H2O2 concentrations between 0.5 and 10 mM enable studying the leakage current behavior.
The treatment method to deactivate viable microorganisms on objects or products is termed sterilization. There are multiple forms of sterilization, each intended for a specific target, depending on, but not limited to, the thermal, physical, and chemical stability of that target. Herein, an overview of the sterilization processes currently used in the global market is provided. The different sterilization techniques are grouped by method of treatment: radiation (gamma, electron beam, X-ray, and ultraviolet), thermal (dry and moist heat), and chemical (ethylene oxide, ozone, chlorine dioxide, and hydrogen peroxide). For each sterilization process, the typical process parameters as defined by regulations and the mode of antimicrobial activity are summarized. Finally, the recommended microorganisms used as biological indicators to validate sterilization processes in accordance with the rules established by the various regulatory agencies are summarized.
In comparison to single-analyte devices, multiplexed systems for multianalyte detection offer reduced assay time and sample volume, low cost, and high throughput. Herein, a multiplexing platform for the automated, quasi-simultaneous characterization of multiple (up to 16) capacitive field-effect sensors in the capacitance–voltage (C–V) and constant-capacitance (ConCap) modes is presented. The sensors are mounted in a newly designed multicell arrangement with one common reference electrode and are electrically connected to the impedance analyzer via the base station. A Python script executes the user-defined measurement protocol for the automated characterization of the sensors. The developed multiplexing system is tested for pH measurements and the label-free detection of ligand-stabilized, charged gold nanoparticles.
Sleep spindles are neurophysiological phenomena that appear to be linked to memory formation and other functions of the central nervous system, and that can be observed in electroencephalographic (EEG) recordings during sleep. Manually identified spindle annotations in EEG recordings suffer from substantial intra- and inter-rater variability, even if raters have been highly trained, which reduces the reliability of spindle measures as a research and diagnostic tool. The Massive Online Data Annotation (MODA) project has recently addressed this problem by forming a consensus from multiple such rating experts, thus providing a corpus of spindle annotations of enhanced quality. Based on this dataset, we present a U-Net-type deep neural network model to automatically detect sleep spindles. Our model's performance exceeds that of the state-of-the-art detector and of most experts in the MODA dataset. We observed improved detection accuracy in subjects of all ages, including older individuals whose spindles are particularly challenging to detect reliably. Our results underline the potential of automated methods to perform repetitive, cumbersome tasks with superhuman performance.
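Detector performance of this kind is typically scored at the event level, comparing detected spindle intervals against the expert consensus. The sketch below implements a simple interval-matching F1 with an illustrative overlap criterion, not the MODA scoring protocol:

```python
def interval_overlap(a, b):
    """Overlap length of two (start, end) intervals in seconds."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def event_f1(detected, reference, min_overlap=0.2):
    """Event-level F1: a detection matches a reference spindle if their
    overlap covers at least `min_overlap` of the reference duration.
    Each reference event can be matched at most once."""
    matched, tp = set(), 0
    for d in detected:
        for i, r in enumerate(reference):
            if i not in matched and \
                    interval_overlap(d, r) >= min_overlap * (r[1] - r[0]):
                matched.add(i)
                tp += 1
                break
    fp = len(detected) - tp
    fn = len(reference) - tp
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# Toy example: one true positive, one false positive, one missed spindle.
f1 = event_f1(detected=[(1.0, 1.8), (5.0, 5.5)],
              reference=[(1.1, 1.9), (3.0, 3.6)])
```

With one hit, one false alarm, and one miss, the toy example yields F1 = 0.5.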
Forward-looking risk management involves calculating risks. It provides a basis for decisions on limiting these risks and makes transparent which risks are sensibly covered by insurance. For companies that produce goods or deliver services using extensive electronic equipment (which nowadays applies to most of them), the risk posed by lightning strikes must also be given special consideration. It should be noted that the damage caused by the unavailability of the electronic equipment, and thus of production or services, and possibly by the loss of data, often far exceeds the hardware damage to the affected installation.
Lightning protection of renewable-energy installations will become increasingly important in the future. It must be ensured that the protective measures are technically and economically balanced. Builders, owners, or users of off-grid hybrid installations have to decide whether the installation needs protection or not. A risk analysis is a sensible first step towards this decision. It must take into account the damage types relevant to the hybrid installation as well as its specific parameters, values, and boundary conditions. The assistance of a lightning-protection expert is highly advisable here.
In: Advanced Engineering Informatics, Vol. 21, Issue 1, 2007, pp. 67-83. http://dx.doi.org/10.1016/j.aei.2006.10.001. Eds. J.C. Kunz, I.F.C. Smith and T. Tomiyama, Elsevier, pp. 1-22. Current CAD tools are not able to support the conceptual design phase, and none of them provides a consistency analysis for sketches produced by architects. This phase is fundamental and crucial for the whole design and construction process of a building. To give architects better support, we developed a CAD tool for conceptual design and a knowledge specification tool. The knowledge is specific to one class of buildings and can be reused. Based on a dynamic and domain-specific knowledge ontology, different types of design rules formalize this knowledge in a graph-based form. An expressive visual language provides a user-friendly, human-readable representation. Finally, a consistency analysis tool enables conceptual designs to be checked against this formal conceptual knowledge. In this article, we concentrate on the knowledge specification part. For that, we introduce the concepts and usage of a novel visual language and describe its semantics. To demonstrate the usability of our approach, two graph-based visual tools for knowledge specification and conceptual design are explained.
The Virtual Clean Room - a new tool in teaching MST process technologies. University education in high-technology fields like MST is not complete without intensive laboratory sessions. Students cannot fully grasp the complexity and the special problems related to the manufacturing of microsystems without thorough hands-on experience in an MST clean room.
An optimization method is developed to describe the mechanical behaviour of the human cancellous bone. The method is based on a mixture theory. A careful observation of the behaviour of the bone material leads to the hypothesis that the bone density is controlled by the principal stress trajectories (Wolff’s law). The basic idea of the developed method is the coupling of a scalar value via an eigenvalue problem to the principal stress trajectories. On the one hand this theory will permit a prediction of the reaction of the biological bone structure after the implantation of a prosthesis, on the other hand it may be useful in engineering optimization problems. An analytical example shows its efficiency.
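The coupling to principal stress trajectories rests on an eigenvalue problem of the stress tensor: the principal stresses are its eigenvalues and the trajectories follow its eigenvectors. A minimal sketch with a hypothetical plane-stress state:

```python
import numpy as np

# Hypothetical symmetric plane-stress tensor (values in MPa, illustrative only).
sigma = np.array([[120.0, 30.0],
                  [30.0,  80.0]])

# eigh returns eigenvalues in ascending order for a symmetric matrix;
# the eigenvalues are the principal stresses, the eigenvectors give the
# principal stress trajectories along which bone density adapts (Wolff's law).
principal, directions = np.linalg.eigh(sigma)
```

The invariants are preserved: the principal stresses sum to the trace (200 MPa) and multiply to the determinant (8700 MPa²), a quick consistency check on any such decomposition.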
The eVTOL industry is a rapidly growing mass market expected to start in 2024. Owing to their predicted missions, eVTOLs compete with ground-based transportation modes, mainly passenger cars. Therefore, the automotive and the classical aircraft design processes are reviewed and compared to highlight advantages for eVTOL development. A special focus is on ergonomic comfort and safety. The need for further investigation of eVTOL crashworthiness is outlined by, first, specifying the relevance of passive safety via accident statistics and customer perception analysis; second, comparing the current state of regulation and certification; and third, discussing the advantages of integral safety and applying the automotive safety approach to eVTOL development. Integral safety links active and passive safety, while the automotive safety approach means implementing standardized mandatory full-vehicle crash tests for future eVTOLs. Subsequently, possible crash impact conditions are analyzed, and three full-vehicle crash load cases are presented.
Original publication: Orthopädische Praxis, vol. 47 (2011), no. 11, pp. 536-543. With kind permission of the publisher. Abstract: Based on patient questionnaires on pain perception and limitations in activities of daily living, the long-term effect of MBST® nuclear magnetic resonance therapy on gonarthrosis (knee osteoarthritis) is investigated. The study included 39 patients whose therapy dated back up to four years. In addition to an overall analysis, the success is also analyzed as a function of age, sex, and sporting activity. Overall, the study indicates a sustained improvement in health status, in some cases with considerable pain relief even after four years, although with a slight increase in pain towards the end of the four-year observation period. A tendency towards a more positive effect in women, older people, and patients who are not active in sports suggests that the success of the therapy may be influenced by (over-)strain in everyday life. An additional positive effect of the therapy on bone density is also conceivable, but this remains an open question.
Jürgen Lohr, born 1962, works on software development in the project "Interaktive Multimedia" at Telekom AG, development centre Berlin. First published in: Telekom-Praxis, 1996. Table of contents: 1 Introduction 1.1 Overview 1.2 New services and applications 2 Distribution model and architecture 3 Technologies 3.1 Network 3.2 Computer techniques 3.3 Tasks of the servers 4 Planned deployment of the pilot projects 4.1 Telekom pilots 4.2 Show-Case Berlin 5 Server architectures used 5.1 Berlin - SEL/Alcatel 5.2 Hamburg - Philips 5.3 Cologne/Bonn - Digital, FUBA and Nokia 5.4 Nuremberg - Oracle, nCube and Sequent 5.5 Stuttgart - SEL/Alcatel, Hewlett Packard and Bosch 6 Future aspects 6.1 DVB 6.2 DAVIC 6.3 Further aspects 7 Summary 8 References 9 Abbreviations used
First published in Telekom-Praxis, 1997. By Jürgen Lohr, born 1962, working on software development in the project "Interaktive Multimedia" at Deutsche Telekom AG, development centre Berlin. 26 pp. The article deals with the universal communication platform for new, interactive, multimedia services and applications. Starting from the services, a reference model for open communication and the communication platform are briefly presented. Furthermore, the XAPI is described with its basic concepts, the phases of communication, and the status model. The service providers that have been implemented are also explained. Finally, future plans from the standardization projects of the ITU and DAVIC as well as further implementations are outlined.
First published in Telekom-Praxis, 2000. 24 pp. Innovative multimedia services are shaped by the globalization and convergence of the markets as well as by provider strategies. The fundamental fields of innovation are global access, navigation, and intelligent content. The MPEG standards, in particular MPEG-4 and MPEG-7, help to meet these requirements. They also give providers and customers future-proofing and secure the longevity of innovative products. The upward compatibility of the MPEG standards makes it possible to avoid overlaps and to open up new dimensions.
In: Unterrichtsblätter / Deutsche Telekom AG, 53 (2000), no. 7, pp. 326-340 (15 pp.). Data reduction through compression technology makes multimedia services economical enough to allow the broader deployment of broadband services. The services no longer require such large transmission and storage capacities for the various media. In the methods developed, the so-called MPEG (Motion Picture Experts Group) standards, the video and audio signals are transferred to the digital domain and irrelevant signal components are then removed. The resulting data stream requires less bandwidth for transmission to the end customer. The MPEG organization was founded back in 1988 and is a joint body of the two organizations ISO (International Organization for Standardization) and IEC (International Electrotechnical Commission), which deals with the standardization of coding and compression methods for digital image, video, and audio formats. To date, four important standards have emerged: MPEG-1, MPEG-2, and MPEG-4 have been adopted, and MPEG-7 is in preparation. Since the basics of MPEG-1, MPEG-2, and MPEG Audio have already been covered in other articles, only the new and current MPEG standards are presented here.
In: Unterrichtsblätter / Deutsche Telekom AG, 54 (2001), no. 7, pp. 410-420 (11 pp.). In view of the increasing globalization of information and information services, content can be used for several different services and delivered on various end devices. This is where a content management system (CMS) comes in, offering synergies, and thus potential savings, both for the customers and for the providers of the variously distributed services. Moreover, the general definition of performance tools and of value chains will standardize and optimize future product development for the providers of these services. With the development and commercial release of ever more information services operated by different service providers, the need to coordinate developments and investments in content acquisition and content management (CM) has grown considerably. In addition to acquisition, licensing issues, and the administration of the content offered within services, questions concerning the design of content management platforms (CMP) are increasingly coming into focus. The article presents the overall results obtained on this topic in a research and development project commissioned by the central Innovation Management division of Deutsche Telekom. The core modules of a content management platform that meets the requirements for providing diverse content offerings are described. The following topics are covered: definition of terms, content and service portfolio, standard content process, synergetic content platform (sCP), models of the sCP, operational aspects, and the benefits of content management.
In: Unterrichtsblätter / Deutsche Telekom AG, 53 (2000), no. 11, pp. 618-634 (17 pp.). Wherever one looks: turbulence, unpredictability, irregularity; in short, chaos. Is our scientific view wrong to map all processes of the cosmos onto a basis of order? No. Chaos does not mean the absence of any order or completely random disorder, but rather, owing to the interconnectedness of the many interacting elements, the unpredictability of natural processes. Chaos theory makes it possible, through modelling, to link problem areas that lie far apart and then to make the interrelations visible in one model. With the help of chaos theory, social processes are modelled, and new, global strategies are then created by simulation in order to identify critical system points (system elements). The following article describes this modelling using the example of multimedia services and, with a comprehensive glossary, provides an introduction to the terminology of chaos theory. Chaos theory is the mathematical-physical theory for describing systems that are determined by laws but in which small changes in the initial conditions cause an exponential growth of perturbations. The behaviour of such systems leads to the formation of chaotic structures and cannot be predicted in the long term. Chaos theory is applicable, for example, in nonlinear optics, in chemical reactions, and in weather forecasting.
There is a growing demand for more flexibility in manufacturing to counter the volatility and unpredictability of the markets and provide more individualization for customers. However, the design and implementation of flexibility within manufacturing systems are costly and only economically viable if matched to actual demand fluctuations. To this end, companies are considering additive manufacturing (AM) to make production more flexible. This paper develops a conceptual model for quantifying the impact of AM on volume and mix flexibility within production systems in the early stages of the factory-planning process. Together with the model, an application guideline is presented to help planners with the flexibility quantification and the factory design process. Following the development of the model and guideline, a case study is presented to indicate the potential impact additive technologies can have on manufacturing flexibility. Within the case study, various scenarios with different production system configurations and production programs are analyzed, and the impact of the additive technologies on volume and mix flexibility is calculated. This work will allow factory planners to determine the potential impacts of AM on manufacturing flexibility in an early planning stage and design their production systems accordingly.
This study analyses the expected utilization of an urban distribution grid under high penetration of photovoltaics and e-mobility with charging infrastructure at the residential level. The grid utilization and the corresponding power flow are evaluated while varying the control strategies and the installed photovoltaic capacity in different scenarios. Four scenarios are used to analyse the impact of e-mobility. The individual mobility demand is modelled based on the largest German mobility study, "Mobilität in Deutschland", which is carried out every 5 years. To estimate the ramp-up of photovoltaic generation, a potential analysis of the roof surfaces in the supply area is carried out via an evaluation of an open solar potential study. The photovoltaic feed-in time series is derived individually for each installed system at a resolution of 15 min. The residential consumption is estimated using historical smart meter data collected in London between 2012 and 2014. For a realistic charging demand, each residential household decides daily, based on the current state of charge, whether its vehicle needs to be charged. The resulting charging time series depends on the underlying behavior scenario: market prices and mobility demand serve as scenario input parameters for a utility function based on the current state of charge to model individual behavior. The aggregated electricity demand is the starting point of the power flow calculation. The evaluation is carried out for an urban region with approximately 3100 residents. The analysis shows that increased penetration of photovoltaics combined with a flexible and adaptive charging strategy can maximize PV usage and reduce the need for congestion-related intervention by the grid operator: the energy charged from the grid is reduced by 30%, which lowers the average price of a charged kWh by 35%, from 21.8 ct/kWh without PV optimization to 14 ct/kWh.
The resulting grid congestions are managed by implementing an intelligent price or control signal. The analysis took place using data from a real German grid with 10 subgrids. The entire software can be adapted for the analysis of different distribution grids and is publicly available as an open-source software library on GitHub.
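The quoted price reduction follows directly from the two average prices given in the abstract; a quick arithmetic check:

```python
def relative_reduction(old, new):
    """Fractional reduction from `old` to `new`."""
    return (old - new) / old

price_old = 21.8  # ct/kWh without PV optimization (from the abstract)
price_new = 14.0  # ct/kWh with the flexible, PV-aware charging strategy
saving = relative_reduction(price_old, price_new)  # ~0.358, i.e. roughly 35%
```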
Miniaturized electrolyte–insulator–semiconductor capacitors (EISCAPs) with ultrathin gate insulators have been studied in terms of their pH-sensitive sensor characteristics: three different EISCAP systems consisting of Al–p-Si–Ta2O5(5 nm), Al–p-Si–Si3N4(1 or 2 nm)–Ta2O5(5 nm), and Al–p-Si–SiO2(3.6 nm)–Ta2O5(5 nm) layer structures are characterized in buffer solutions with different pH values by means of capacitance–voltage and constant-capacitance methods. The SiO2 and Si3N4 gate insulators are deposited by rapid thermal oxidation and rapid thermal nitridation, respectively, whereas the Ta2O5 film is prepared by atomic layer deposition. All EISCAP systems show a clear pH response, favoring the stacked gate insulators SiO2–Ta2O5 when considering the overall sensor characteristics, while the Si3N4(1 nm)–Ta2O5 stack delivers the largest accumulation capacitance (due to the lower equivalent oxide thickness) and a steeper slope of the capacitance–voltage curve among the studied stacked gate insulator systems.
This study addresses a proof-of-concept experiment with a biocompatible screen-printed carbon electrode deposited onto a biocompatible and biodegradable substrate made of fibroin, a protein derived from the silk of the Bombyx mori silkworm. To demonstrate the sensor performance, the carbon electrode is functionalized as a glucose biosensor with the enzyme glucose oxidase and encapsulated with a silicone rubber to ensure biocompatibility of the contact wires. The carbon electrode is fabricated by means of thick-film technology, including a curing step to solidify the carbon paste. The influence of the curing temperature and curing time on the electrode morphology is analyzed via scanning electron microscopy. The electrochemical characterization of the glucose biosensor is performed by amperometric/voltammetric measurements of different glucose concentrations in phosphate buffer. Herein, systematic studies at potentials from 500 to 1200 mV applied to the carbon working electrode (vs the Ag/AgCl reference electrode) allow the optimal working potential to be determined. Additionally, the influence of the curing parameters on the glucose sensitivity is examined over a time period of up to 361 days. The sensor shows a negligible cross-sensitivity toward ascorbic acid, noradrenaline, and adrenaline. The developed biocompatible biosensor is highly promising for future in vivo and epidermal applications.
Quantitative nuclear magnetic resonance (qNMR) is considered a powerful tool for multicomponent mixture analysis as well as for the purity determination of single compounds. Special attention is currently paid to the training of operators and study directors involved in qNMR testing. To assure that only qualified personnel perform sample preparation at our GxP-accredited laboratory, a weighing test was proposed. Sixteen participants performed six-fold weighing of a binary mixture of butylated hydroxytoluene (BHT) and 1,2,4,5-tetrachloro-3-nitrobenzene (TCNB). To evaluate the quality of data analysis, all spectra were evaluated both manually by a qNMR expert and using an in-house-developed automated routine. The results revealed that the mean values are comparable and both evaluation approaches are free of systematic error. However, automated evaluation resulted in an approximately 20% increase in precision. The same findings emerged for qNMR analysis of 32 compounds used in the pharmaceutical industry. The weighing test by six-fold determination in binary mixtures and the automated qNMR methodology can be recommended as efficient tools for evaluating staff proficiency. The automated qNMR method significantly increases the throughput and precision of qNMR for routine measurements and extends the application scope of qNMR.
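The precision comparison between manual and automated evaluation is typically expressed as a relative standard deviation (RSD) over the six replicate weighings. The values below are hypothetical illustrations of such a comparison, not data from the study:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation in %, a common precision
    measure for replicate determinations."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical six-fold determination results (assay, %) for one participant,
# evaluated manually vs. with an automated routine.
manual = [10.12, 10.05, 9.98, 10.20, 9.91, 10.08]
automated = [10.06, 10.03, 10.09, 10.01, 10.07, 10.04]
```

With comparable means but a tighter spread, the automated evaluation yields the lower RSD, which is the sense in which the abstract reports an approximately 20% precision increase.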
Lifting propellers are of increasing interest for Advanced Air Mobility. All propellers and rotors are initially twisted beams, showing significant extension–twist coupling and centrifugal twisting. Torsional deformations severely impact aerodynamic performance. This paper presents a novel approach to assess different reasons for torsional deformations. A reduced-order model runs large parameter sweeps with algebraic formulations and numerical solution procedures. Generic beams represent three different propeller types for General Aviation, Commercial Aviation, and Advanced Air Mobility. Simulations include solid and hollow cross-sections made of aluminum, steel, and carbon fiber-reinforced polymer. The investigation shows that centrifugal twisting moments depend on both the elastic and initial twist. The determination of the centrifugal twisting moment solely based on the initial twist suffers from errors exceeding 5% in some cases. The nonlinear parts of the torsional rigidity do not significantly impact the overall torsional rigidity for the investigated propeller types. The extension–twist coupling related to the initial and elastic twist in combination with tension forces significantly impacts the net cross-sectional torsional loads. While the increase in torsional stiffness due to initial twist contributes to the overall stiffness for General and Commercial Aviation propellers, its contribution to the lift propeller’s stiffness is limited. The paper closes with the presentation of approximations for each effect identified as significant. Numerical evaluations are necessary to determine each effect for inhomogeneous cross-sections made of anisotropic material.
The aerodynamic performance of propellers strongly depends on their geometry and, consequently, on aeroelastic deformations. Knowledge of the extent of this impact is crucial for overall aircraft performance. An integrated simulation environment for steady aeroelastic propeller simulations is presented and applied to determine the impact of elastic deformations on aerodynamic propeller performance. The aerodynamic module includes a blade element momentum approach to calculate aerodynamic loads. The structural module is based on finite beam elements according to Timoshenko theory, including moderate deflections. Several fixed-pitch propellers with thin-walled cross sections made of both isotropic and non-isotropic materials are investigated. The essential parameters are varied: diameter, disc loading, sweep, material, rotational speed, and flight velocity. The relative change of thrust between rigid and elastic blades quantifies the impact of propeller elasticity. Swept propellers of large diameter or low disc loading can lose thrust significantly; high flight velocities and low material stiffness amplify this tendency. Performance calculations that neglect propeller elasticity can therefore overestimate efficiency. To avoid cost- and time-intensive redesigns, propeller elasticity should be considered for swept planforms and low disc loadings.
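As a rough illustration of the kind of calculation the aerodynamic module performs, the following sketch implements a stripped-down blade element momentum iteration for a rigid propeller in axial flow. It is not the authors' implementation: the geometry, the thin-airfoil lift slope, and the constant profile drag are all hypothetical assumptions.

```python
import math

# Minimal blade-element-momentum (BEM) sketch for a propeller in axial flow,
# assuming a rigid blade, thin-airfoil lift (Cl = 2*pi*alpha) and constant
# profile drag. All geometry below is illustrative, not the paper's propellers.
def bem_thrust(V=30.0, rpm=2400, R=0.9, R_hub=0.15, B=2, chord=0.12,
               alpha_design=math.radians(4.0), Cd=0.012, rho=1.225, n_el=20):
    omega = rpm * 2 * math.pi / 60
    dr = (R - R_hub) / n_el
    thrust = 0.0
    for i in range(n_el):
        r = R_hub + (i + 0.5) * dr
        sigma = B * chord / (2 * math.pi * r)          # local solidity
        # twist chosen so the zero-induction inflow sees alpha_design
        theta = math.atan2(V, omega * r) + alpha_design
        a, ap = 0.0, 0.0                               # induction factors
        for _ in range(100):                           # relaxed fixed point
            phi = math.atan2(V * (1 + a), omega * r * (1 - ap))
            alpha = theta - phi
            Cl = 2 * math.pi * alpha
            cn = Cl * math.cos(phi) - Cd * math.sin(phi)   # normal (thrust) coeff.
            ct = Cl * math.sin(phi) + Cd * math.cos(phi)   # tangential coeff.
            ka = sigma * cn / (4 * math.sin(phi) ** 2)
            kt = sigma * ct / (4 * math.sin(phi) * math.cos(phi))
            a = 0.5 * a + 0.5 * (ka / (1 - ka))
            ap = 0.5 * ap + 0.5 * (kt / (1 + kt))
        W2 = (V * (1 + a)) ** 2 + (omega * r * (1 - ap)) ** 2
        thrust += 0.5 * rho * W2 * B * chord * cn * dr
    return thrust

print(f"thrust = {bem_thrust():.0f} N")
```

In the paper's coupled setting, the structural module would deform the blade (changing theta along the radius) and the BEM loads would be re-evaluated until the aeroelastic equilibrium is reached; the relative thrust change against the rigid result is then the quantity reported above.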
A new functionalization method to modify capacitive electrolyte–insulator–semiconductor (EIS) structures with nanofilms is presented. Layers of polyallylamine hydrochloride (PAH) and graphene oxide (GO) with the compound polyaniline:poly(2-acrylamido-2-methyl-1-propanesulfonic acid) (PANI:PAAMPSA) are deposited onto a p-Si/SiO2 chip using the layer-by-layer (LbL) technique. Two different enzymes (urease and penicillinase) are separately immobilized on top of a five-bilayer stack of the PAH:GO/PANI:PAAMPSA-modified EIS chip, forming biosensors for the detection of urea and penicillin, respectively. Electrochemical characterization is performed by constant capacitance (ConCap) measurements, and the film morphology is characterized by atomic force microscopy (AFM) and scanning electron microscopy (SEM). An increase in the average sensitivity of the modified biosensors (EIS–nanofilm–enzyme) of around 15% is found relative to sensors carrying only the enzyme without the nanofilm (EIS–enzyme). In this sense, the nanofilm acts as a stable bioreceptor layer on the EIS chip, improving the output signal in terms of sensitivity and stability.
The IMechE Railway Challenge is held annually in Stapleford, United Kingdom. As part of the challenge, students develop and build a locomotive and compete in various disciplines, including automated target braking, optimal energy recovery during braking, and minimal noise emissions. In addition to these and other technical competition disciplines, the vehicles and teams also compete in non-technical disciplines such as a business case challenge.
This study reviews the practice of brake tests in freight railways, which is time-consuming and not suited to detecting certain failure types. Public incident reports are analysed to derive a reasonable brake test hardware and communication architecture, which aims to provide automatic brake tests at lower cost than current solutions. The proposed solution relies exclusively on brake pipe and brake cylinder pressure sensors, a brake release position switch, and radio communication via standard protocols. The approach is embedded in the Wagon 4.0 concept, a holistic approach to a smart freight wagon. The reduction of manual processes yields a strong incentive due to high savings in manual labour and increased productivity.
Consideration of No Fault Found in the diagnostics and maintenance system of rail vehicles
(2020)
Intermittent and non-reproducible faults, also referred to as No Fault Found, occur in practically all fields and cause high costs. They are frequently attributable to imprecise fault descriptions. This article proposes adaptations to the development procedure and to the diagnostic system.
In the Microsystems Engineering degree programme at the Zweibrücken campus of the university of applied sciences, two new, modern facilities for the fabrication of microtechnological components are being put into operation: an oxidation furnace for producing thin oxide layers on silicon single crystals and an exposure unit for photolithography. The special feature of these facilities: they exist only virtually, i.e. as animations in a computer world.
Plant physiology and plant stress: plant physiology will become much more important for humankind because the yield and cultivation limits of crops are determined by their resistance to stress. To assess and counteract the various stress factors, plant research is necessary to gain information and results on plant physiology.
Contractile behavior of the gastrocnemius medialis muscle during running in simulated hypogravity
(2021)
Vigorous exercise countermeasures in microgravity can largely attenuate muscular degeneration, although the magnitude of the applied loading is key for the extent of muscle wasting. Running on the International Space Station is usually performed with maximum loads of 70% body weight (0.7 g). However, it has not been investigated how the reduced musculoskeletal loading affects muscle and series elastic element dynamics, and thereby force and power generation. Therefore, this study examined the effects of running on the vertical treadmill facility, a ground-based analog, at simulated 0.7 g on gastrocnemius medialis contractile behavior. The results reveal that fascicle–series elastic element behavior differs between simulated hypogravity and 1 g running. Whilst shorter peak series elastic element lengths at simulated 0.7 g appear to be the result of lower muscular and gravitational forces acting on it, increased fascicle lengths and decreased velocities could not be anticipated, but may inform the development of optimized running training in hypogravity. However, whether the alterations in contractile behavior precipitate musculoskeletal degeneration warrants further study.
The international partnership of space agencies has agreed to proceed forward to the Moon sustainably. Activities on the Lunar surface (0.16 g) will allow crewmembers to advance the exploration skills needed when expanding human presence to Mars (0.38 g). Whilst data from actual hypogravity activities are limited to the Apollo missions, simulation studies have indicated that ground reaction forces, mechanical work, muscle activation, and joint angles decrease with declining gravity level. However, these alterations in locomotion biomechanics do not necessarily scale with the gravity level; the reduction in gastrocnemius medialis activation even appears to level off around 0.2 g, while the muscle activation pattern remains similar. Thus, it is difficult to predict whether gastrocnemius medialis contractile behavior during running on the Moon will basically be the same as on Mars. Therefore, this study investigated lower limb joint kinematics and gastrocnemius medialis behavior during running at 1 g, simulated Martian gravity, and simulated Lunar gravity on the vertical treadmill facility. The results indicate that hypogravity-induced alterations in joint kinematics and contractile behavior still persist between simulated running on the Moon and Mars. This contrasts with the concept of a ceiling effect and should be carefully considered when evaluating exercise prescriptions and the transferability of locomotion practiced in Lunar gravity to Martian gravity.
REM sleep without atonia (RSWA) is a key feature for the diagnosis of rapid eye movement (REM) sleep behaviour disorder (RBD). We introduce RBDtector, a novel open-source software to score RSWA according to established SINBAR visual scoring criteria. We assessed muscle activity of the mentalis, flexor digitorum superficialis (FDS), and anterior tibialis (AT) muscles. RSWA was scored manually as tonic, phasic, and any activity by human scorers as well as using RBDtector in 20 subjects. Subsequently, 174 subjects (72 without RBD and 102 with RBD) were analysed with RBDtector to show the algorithm’s applicability. We additionally compared RBDtector estimates to a previously published dataset. RBDtector showed robust conformity with human scorings. The highest congruency was achieved for phasic and any activity of the FDS. Combining mentalis any and FDS any, RBDtector identified RBD subjects with 100% specificity and 96% sensitivity applying a cut-off of 20.6%. Comparable performance was obtained without manual artefact removal. RBD subjects also showed muscle bouts of higher amplitude and longer duration. RBDtector provides estimates of tonic, phasic, and any activity comparable to human scorings. RBDtector, which is freely available, can help identify RBD subjects and provides reliable RSWA metrics.
We study the possibility to fabricate an arbitrary phase mask in a one-step laser-writing process inside the volume of an optical glass substrate. We derive the phase mask from a Gerchberg–Saxton-type algorithm as an array and create each individual phase shift using a refractive index modification of variable axial length. We realize the variable axial length by superimposing refractive index modifications induced by an ultra-short pulsed laser at different focusing depths. Each single modification is created by applying 1000 pulses with 15 μJ pulse energy at 100 kHz to a fixed spot of 25 μm diameter, and the focus is then shifted axially in steps of 10 μm. With several proof-of-principle examples, we show the feasibility of our method. In particular, we determine the induced refractive index change to be about Δn = 1.5⋅10−3. We also determine our current limitations by calculating the overlap in the form of a scalar product, and we discuss possible future improvements.
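The phase mask derivation mentioned above can be sketched with a textbook Gerchberg–Saxton iteration; the paper's exact variant may differ, and the target pattern below is a hypothetical example:

```python
import numpy as np

# Textbook Gerchberg-Saxton iteration (assumed variant, not necessarily the
# authors' algorithm): find a phase-only mask whose far field approximates a
# target intensity pattern, alternating between the two amplitude constraints.
def gerchberg_saxton(target_amplitude, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))
        # keep the far-field phase, impose the target amplitude
        far = target_amplitude * np.exp(1j * np.angle(far))
        near = np.fft.ifft2(far)
        # keep the near-field phase, impose unit amplitude (phase-only mask)
        phase = np.angle(near)
    return phase

# Hypothetical target: a single bright off-axis spot in the far field
target = np.zeros((64, 64))
target[16, 24] = 1.0
mask = gerchberg_saxton(target)
reconstruction = np.abs(np.fft.fft2(np.exp(1j * mask))) ** 2
```

In the fabrication step described in the abstract, each entry of `mask` would then be converted into a refractive index modification of appropriate axial length, using the measured Δn to translate phase shift into modification depth.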
The demand for replacements for inoperable organs exceeds the number of available organ transplants. Therefore, tissue engineering has developed as a multidisciplinary field of research for autologous in-vitro organs. Such three-dimensional tissue constructs require the use of a bioreactor. The UREPLACE bioreactor is used to grow cells on tubular collagen scaffolds (OPTIMAIX Sponge 1) with a maximum length of 7 cm, in order to culture an adequate ureter replacement in vitro. With a rotating unit, (urothelial) cells can be placed homogeneously on the inner scaffold surface. Furthermore, stimulation is combined with this bioreactor, resulting in an orientation of muscle cells. These culturing methods require precise control of several parameters and actuators. A combination of a LabBox and the suitable software LabVision is used to set and monitor parameters such as rotation angles, velocities, pressures, and other important cell culture values. The bioreactor was successfully tested for water tightness. Furthermore, the temperature control was adjusted to 37 °C and the CO2 concentration regulated to 5%. Additionally, the pH step responses of several substances showed proper functioning of the designed flow chamber. All software used was tested and remained stable for several days.
Companies often build their businesses based on product information and therefore try to automate the process of information extraction (IE). Since the information source is usually heterogeneous and non-standardized, classic extract, transform, load techniques reach their limits. Hence, companies must implement the newest findings from research to tackle the challenges of process automation. They require a flexible and robust system that is extendable and ensures the optimal processing of the different document types. This paper provides a distributed microservice architecture pattern that enables the automated generation of IE pipelines. Since their optimal design is individual for each input document, the system ensures the ad-hoc generation of pipelines depending on specific document characteristics at runtime. Furthermore, it introduces the automated quality determination of each available pipeline and controls the integration of new microservices based on their impact on the business value. The introduced system enables fast prototyping of the newest approaches from research and supports companies in automating their IE processes. Based on the automated quality determination, it ensures that the generated pipelines always meet defined business requirements when they come into productive use.
Limit loads can be calculated with the finite element method (FEM) for any component, defect geometry, and loading. FEM suggests that published long-crack limit formulae for axial defects underestimate the burst pressure for internal surface defects in thick pipes, while limit loads are not conservative for deep cracks and for pressure-loaded crack faces. Very deep cracks have a residual strength, which is modelled by a global collapse load. These observations are combined to derive new analytical local and global collapse loads. The global collapse loads are close to FEM limit analyses for all crack dimensions.
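For orientation, the following sketch evaluates a generic textbook-style limit formula of the kind the abstract refers to: the burst pressure of a pipe with an axial part-through surface crack, estimated from a flow stress with a Folias-type bulging factor. This is a common literature form, not the new local or global collapse loads derived in the paper, and the dimensions are hypothetical.

```python
import math

# Generic flow-stress burst estimate for an axial surface crack (literature-
# style formula, illustrative only; not the paper's collapse loads).
def burst_pressure(sigma_flow, t, R_i, a, c):
    """sigma_flow: flow stress [MPa]; t: wall thickness, R_i: inner radius,
    a: crack depth, c: crack half-length (all in mm). Returns pressure [MPa]."""
    R_m = R_i + t / 2.0                               # mean radius
    M = math.sqrt(1.0 + 1.61 * c**2 / (R_m * t))      # Folias-type bulging factor
    ligament = (1.0 - a / t) / (1.0 - a / (t * M))    # ligament weakening term
    return 2.0 * sigma_flow * t / (2.0 * R_m) * ligament

# Hypothetical thick-walled pipe: half-depth crack vs. defect-free wall
p_flawed = burst_pressure(sigma_flow=450.0, t=20.0, R_i=100.0, a=10.0, c=50.0)
p_plain = burst_pressure(sigma_flow=450.0, t=20.0, R_i=100.0, a=0.0, c=50.0)
print(f"{p_flawed:.1f} MPa flawed vs. {p_plain:.1f} MPa defect-free")
```

The paper's point is that formulae of exactly this family can be unconservative for deep cracks and pressure-loaded crack faces, and too conservative for internal surface defects in thick walls, which motivates the FEM-calibrated local and global collapse loads.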
Improved collapse loads of thick-walled, crack-containing pipes and vessels are suggested. Very deep cracks have a residual strength which is better modelled by a global limit load. In all burst tests, the ductility of the pressure vessel steels was sufficiently high that the burst pressure could be predicted by limit analysis with no need to apply fracture mechanics. The relative prognosis error increases, however, for long and deep defects due to uncertainties in geometry and strength data.
Structural design analyses are conducted with the aim of verifying the exclusion of ratchetting. To this end it is important to make a clear distinction between the shakedown range and the ratchetting range. The performed experiment comprised a hollow tension specimen which was subjected to alternating axial forces, superimposed with constant moments. First, a series of uniaxial tests has been carried out in order to calibrate a bounded kinematic hardening rule. The load parameters have been selected on the basis of previous shakedown analyses with the PERMAS code using a kinematic hardening material model. It is shown that this shakedown analysis gives reasonable agreement between the experimental and the numerical results. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis.
In the new European standard for unfired pressure vessels, EN 13445-3, there are two approaches for carrying out a Design-by-Analysis that cover both the stress categorization method (Annex C) and the direct route method (Annex B) for a check against global plastic deformation and against progressive plastic deformation. This paper presents the direct route in the language of limit and shakedown analysis. This approach leads to an optimization problem. Its solution with Finite Element Analysis is demonstrated for mechanical and thermal actions. One observation from the examples is that the so-called 3f (3Sm) criterion fails to be a reliable check against progressive plastic deformation. Precise conditions are given, which greatly restrict the applicability of the 3f criterion.
Fatigue analyses are conducted with the aim of verifying that thermal ratcheting is limited. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range (continuing deformation). As part of an EU-supported research project, experiments were carried out using a 4-bar model. The experiment comprised a water-cooled internal tube and three insulated, heatable outer test bars. The system was subjected to alternating axial forces, superimposed with alternating temperatures at the outer bars. The test parameters were partly selected on the basis of previous shakedown analyses. During the test, temperatures and strains were measured as a function of time. The loads and the resulting stresses were confirmed on an ongoing basis during performance of the test, and after it. Different material models were applied for the incremental elasto-plastic analysis using the ANSYS program. The results of the simulation are used to verify the FEM-based shakedown analysis.
The structural reliability with respect to plastic collapse or to inadaptation is formulated on the basis of the lower-bound limit and shakedown theorems. A direct definition of the limit state function is achieved, which permits the use of the highly effective first-order reliability method (FORM). The theorems are implemented into a general-purpose FEM program in a way capable of large-scale analysis. The limit state function and its gradient are obtained from a mathematical optimization problem. This direct approach considerably reduces the necessary knowledge of uncertain technological input data, the computing time, and the numerical error, leading to highly effective and precise reliability analyses.
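To make the FORM idea concrete, the sketch below evaluates the simplest possible case: a linear limit state g = R − S with independent normal resistance R and load effect S, for which the Hasofer–Lind reliability index has a closed form. In the paper's approach, g would instead come from the limit/shakedown optimization problem; the numbers here are hypothetical.

```python
import math

# First-order reliability (FORM) for the linear limit state g = R - S with
# independent normal R (resistance) and S (load effect). Illustrative only;
# the paper obtains g and its gradient from a limit/shakedown optimization.
def form_linear(mu_R, sig_R, mu_S, sig_S):
    beta = (mu_R - mu_S) / math.hypot(sig_R, sig_S)   # Hasofer-Lind index
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))       # Pf = Phi(-beta)
    return beta, pf

beta, pf = form_linear(mu_R=100.0, sig_R=10.0, mu_S=60.0, sig_S=15.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

For nonlinear limit state functions, FORM iterates toward the design point using the gradient of g, which is exactly why the direct availability of that gradient from the optimization problem, as described above, is valuable.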
Limit and shakedown analysis are effective methods for assessing the load carrying capacity of a given structure. The elasto-plastic behavior of the structure subjected to loads varying in a given load domain is characterized by the shakedown load factor, defined as the maximum factor which satisfies the sufficient conditions stated in the corresponding static shakedown theorem. The finite element discretization of the problem may lead to very large convex optimization problems. For an effective solution, a basis reduction method has been developed that makes use of the special problem structure for perfectly plastic material. The paper proposes a modified basis reduction method for direct application to the two-surface plasticity model of bounded kinematic hardening material. The considered numerical examples show an enlargement of the load carrying capacity due to bounded hardening.
The load-carrying capacity and the safety against plastic limit states are the central questions in the design of structures and passive components in apparatus engineering. A precise answer is most simply given by limit and shakedown analysis. These methods can be based on static and kinematic theorems for lower- and upper-bound analysis. Both may be formulated as optimization problems for finite element discretizations of structures. The problems of large-scale analysis and the extension towards realistic material modelling will be solved in a European research project. Limit and shakedown analyses are briefly demonstrated with illustrative examples.
Extension fractures are typical for deformation under low or no confining pressure. They can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock has been developed. In this article, it is shown that the simple extension strain criterion makes unrealistic strength predictions in biaxial compression and tension. To overcome this major limitation, a new extension strain criterion is proposed by adding a weighted principal shear component to the simple criterion. The shear weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting extension failure modes, which are unexpected in the classical understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain leading to dilatancy. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD), and as a failure surface at peak stress (CP). Different from compressive loading, tensile loading requires only a limited number of critical cracks to cause failure. Therefore, for tensile stresses, the failure criteria must be modified somehow, possibly by a cut-off corresponding to the CI stress. Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion.
Shock waves, explosions, impacts, or cavitation bubble collapses may generate stress waves in solids, causing cracks or unexpected damage due to focusing, physical nonlinearity, or interaction with existing cracks. There is a growing interest in wave propagation, which poses many novel problems to experimentalists and theorists.
A closer look at the extensive fields of activity for a project manager (Projektsteuerer) described in the publications of the relevant associations and service providers reveals that, beyond the actual project preparation phase with its profitability calculations and securing of financing, there is considerable overlap with the activities of the other planning participants defined in the HOAI, in particular those of the building designer, i.e. the architect. If one assumes that the client does not want to pay twice for these services, the logical consequence of commissioning a project manager to the full extent would be a reduction of the architect's scope of work, combined with a reduction of the architect's fee. On closer examination, the architect thus ultimately loses more than half of his activities and hence of his basis for earning fees. The client must first and foremost define his wishes and determine his budget. He commissions the planning participants and accepts their services. His problem is that he cannot judge these services, neither with respect to their completeness nor with respect to their content. This is where the project manager in the proper sense comes in: he must know what the planning participants have to deliver for their money and how he can enforce these services. Ultimately, however, he also ensures that the architect's services, i.e. the planning and the tender documents, are understood by the client. But why can the architect not himself make his services, and thus the proof of their fulfilment, comprehensible and hence credible to the client? It is therefore ultimately in the hands of the architects whether their field of activity will be further narrowed, or even taken away, by additional project managers and general contractors intervening in planning and design.
The question of who steers and controls the construction process will remain unresolved as long as architects are able and willing to fill this field of architectural activity in construction management only inadequately.
"Smart" charging at publicly accessible charging stations – Part 2: user behaviour and expectations
(2021)
Unmanned Aerial Vehicles (UAVs) constantly gain in versatility. However, more reliable path planning algorithms are required before fully autonomous UAV operation is possible. This work investigates the algorithm 3DVFH* and analyses its dependency on its cost function weights in 2400 environments. The analysis shows that the 3DVFH* can find a suitable path in every environment. However, a particular type of environment requires a specific choice of cost function weights. For a minimal failure probability, interdependencies between the weights of the cost function have to be considered. This dependency reduces the number of control parameters and simplifies the usage of the 3DVFH*. Weights for costs associated with vertical evasion (pitch cost) and vicinity to obstacles (obstacle cost) have the highest influence on the failure probability of the local path planner. Environments with mainly very tall buildings (like large American city centres) require a preference for horizontal avoidance manoeuvres (achieved with high pitch cost weights). In contrast, environments with medium-to-low buildings (like European city centres) benefit from vertical avoidance manoeuvres (achieved with low pitch cost weights). The cost of the vicinity to obstacles also plays an essential role and must be chosen adequately for the environment. An ideal choice of these two weights is sufficient to reduce the failure probability below 10%.
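The role of the pitch and obstacle weights can be illustrated with a toy weighted-cost selection over candidate flight directions. The cost terms and weights below are simplified stand-ins, not the real 3DVFH* cost function:

```python
# Toy illustration of weighted-cost candidate selection (simplified stand-in
# for the 3DVFH* cost function; the real terms and weights differ). Each
# candidate direction is scored by a weighted sum and the cheapest one wins.
# 'w_pitch' penalizes vertical evasion, 'w_obstacle' penalizes proximity.
def best_direction(candidates, goal_dir, w_goal=1.0, w_pitch=25.0, w_obstacle=5.0):
    """candidates: list of (azimuth_deg, elevation_deg, obstacle_distance_m)."""
    def cost(c):
        az, el, d_obs = c
        goal_cost = abs(az - goal_dir[0]) + abs(el - goal_dir[1])
        pitch_cost = abs(el)                    # cost of climbing/descending
        obstacle_cost = 1.0 / max(d_obs, 0.1)   # nearer obstacles cost more
        return w_goal * goal_cost + w_pitch * pitch_cost + w_obstacle * obstacle_cost
    return min(candidates, key=cost)

# With a high pitch weight (tall-building setting), a horizontal detour beats
# climbing straight toward the goal over an obstacle:
cands = [(0.0, 20.0, 30.0),   # climb straight toward the goal
         (30.0, 0.0, 12.0)]   # horizontal evasion
chosen = best_direction(cands, goal_dir=(0.0, 0.0))
print(chosen)
```

Lowering `w_pitch` flips the preference toward the climbing candidate, which mirrors the paper's finding that environments with medium-to-low buildings benefit from low pitch cost weights.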
Obstacle avoidance is critical for unmanned aerial vehicles (UAVs) operating autonomously. Obstacle avoidance algorithms either rely on global environment data or local sensor data. Local path planners react to unforeseen objects and plan purely on local sensor information. Similarly, animals need to find feasible paths based on local information about their surroundings. Therefore, their behavior is a valuable source of inspiration for path planning. Bumblebees tend to fly vertically over far-away obstacles and horizontally around close ones, implying two zones for different flight strategies depending on the distance to obstacles. This work enhances the local path planner 3DVFH* with this bio-inspired strategy. The algorithm alters the goal-driven function of the 3DVFH* to climb-preferring if obstacles are far away. Prior experiments with bumblebees led to two definitions of flight zone limits depending on the distance to obstacles, leading to two algorithm variants. Both variants reduce the probability of not reaching the goal of a 3DVFH* implementation in Matlab/Simulink. The best variant, 3DVFH*b-b, reduces this probability from 70.7 to 18.6% in city-like worlds using a strong vertical evasion strategy. Energy consumption is higher, and flight paths are longer compared to the algorithm version with pronounced horizontal evasion tendency. A parameter study analyzes the effect of different weighting factors in the cost function. The best parameter combination shows a failure probability of 6.9% in city-like worlds and reduces energy consumption by 28%. Our findings demonstrate the potential of bio-inspired approaches for improving the performance of local path planning algorithms for UAV.
In traditional microbial biobutanol production, the solvent must be recovered during the fermentation process to obtain a sufficient space-time yield. Thermal separation is not feasible due to the boiling point of n-butanol. As an integrated and selective solid-liquid separation alternative, solvent-impregnated resins (SIRs) were applied. Two polymeric resins were evaluated and an extractant screening was conducted. Vacuum application with vapor collection in a fixed-bed column operated as a bioreactor bypass was successfully implemented as the butanol desorption step. To further increase process economics, fermentation with renewable lignocellulosic substrates was conducted using Clostridium acetobutylicum. Utilization of SIRs was shown to be a potential strategy for solvent removal from fermentation broth, while application of a bypass column allows for product removal and recovery at once.
Biomass from various types of organic waste was tested for possible use in hydrogen production. The composition consisted of lignified samples, green waste, and kitchen scraps such as fruit and vegetable peels and leftover food. For this purpose, the enzymatic pretreatment of organic waste with a combination of five different hydrolytic enzymes (cellulase, amylase, glucoamylase, pectinase, and xylanase) was investigated to determine its ability to support the production of hydrogen (H2) from the resulting hydrolyzate. The anaerobic rod-shaped bacterium T. neapolitana was used for H2 production. First, the enzymes were investigated using different substrates in preliminary experiments. Subsequently, hydrolyses were carried out using different types of organic waste. In the hydrolysis carried out here for 48 h, an increase in glucose concentration of 481% was measured for starch-containing waste loads, corresponding to a glucose concentration at the end of hydrolysis of 7.5 g·L−1. In the subsequent batch fermentation in serum bottles, an H2 yield of 1.26 mmol was obtained in the headspace when Terrific Broth medium with glucose and yeast extract (TBGY medium) was used. When hydrolyzed organic waste was used, an H2 yield of 1.37 mmol could even be achieved in the headspace. In addition, a dedicated reactor system for the anaerobic fermentation of T. neapolitana to produce H2 was developed. The bioreactor developed here can operate anaerobically with a very low loss of produced gas. After 24 h, a hydrogen concentration of 83% could be measured in the headspace.
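The two reported figures for the starch-containing loads (a 481% increase ending at 7.5 g/L) imply a starting glucose concentration, which a quick consistency check recovers:

```python
# Consistency check of the reported hydrolysis numbers: a 481 % increase
# ending at 7.5 g/L implies the starting glucose concentration of the
# starch-containing waste load.
final_glucose = 7.5              # g/L after 48 h of hydrolysis
increase = 4.81                  # +481 % relative increase
initial_glucose = final_glucose / (1.0 + increase)
print(f"implied initial concentration: {initial_glucose:.2f} g/L")
```

The implied starting value of roughly 1.3 g/L is consistent: the enzymatic pretreatment raises the fermentable glucose by almost a factor of six.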
This work is an attempt to answer the question of how to use convex programming in shakedown analysis of structures made of materials with temperature-dependent properties. Based on recently established shakedown theorems and formulations, a dual relationship between upper and lower bounds of the shakedown limit load is found, and an algorithm for shakedown analysis is proposed. While the original problem is neither convex nor concave, the algorithm presented here has the advantage of employing convex programming tools.
Photoelectrochemical (PEC) biosensors are a rather novel type of biosensor that utilizes light to provide information about the composition of an analyte, enabling light-controlled multi-analyte measurements. For enzymatic PEC biosensors, amperometric detection principles are already known in the literature. In contrast, there is only little information on H+-ion-sensitive PEC biosensors. In this work, we demonstrate the detection of H+ ions generated by H+-producing enzymes, exemplarily demonstrated with penicillinase as a model enzyme on a titanium dioxide photoanode. First, we describe the pH sensitivity of the sensor and study possible photoelectrocatalytic reactions with penicillin. Second, we show the enzymatic PEC detection of penicillin.
The introduction to the standard DIN EN 62305-3 states clearly and unambiguously: this part of IEC 62305 deals with the protection of structures against physical damage and with the protection of persons against injury caused by touch and step voltages. The lightning protection system (LPS) is regarded as the most essential and most effective means of protecting structures against physical damage.
This article describes an Internet of Things (IoT) sensing device with a wireless interface which is powered by energy harvesting based on the Wiegand effect. The Wiegand effect, in contrast to continuous sources like photovoltaic or thermal harvesters, provides small amounts of energy discontinuously in pulsed mode. To enable energy-self-sufficient operation of the sensing device with this pulsed energy source, the output energy of the Wiegand generator is maximized. This energy is used to power up the system, to acquire and process data such as position, temperature, or other resistively measurable quantities, and to transmit these data via an ultra-low-power ultra-wideband (UWB) data transmitter. A proof-of-concept system was built to prove the feasibility of the approach. The energy consumption of the system during start-up was analysed, traced back in detail to the individual components, compared to the generated energy, and processed to identify further optimization options. Based on the proof of concept, an application prototype was developed.