Institute
- Fachbereich Medizintechnik und Technomathematik (1941)
- Fachbereich Elektrotechnik und Informationstechnik (1153)
- Fachbereich Wirtschaftswissenschaften (1121)
- Fachbereich Energietechnik (1067)
- Fachbereich Chemie und Biotechnologie (899)
- Fachbereich Maschinenbau und Mechatronik (817)
- Fachbereich Luft- und Raumfahrttechnik (770)
- Fachbereich Bauingenieurwesen (665)
- IfB - Institut für Bioengineering (630)
- INB - Institut für Nano- und Biotechnologien (586)
Has Fulltext
- no (9351)
Document Type
- Article (5509)
- Conference Proceeding (1448)
- Book (1064)
- Part of a Book (572)
- Patent (177)
- Bachelor Thesis (170)
- Report (83)
- Conference: Meeting Abstract (82)
- Doctoral Thesis (82)
- Other (67)
- Contribution to a Periodical (20)
- Master's Thesis (18)
- Review (18)
- Working Paper (13)
- Preprint (6)
- Conference Poster (5)
- Habilitation (5)
- Talk (5)
- Diploma Thesis (3)
- Part of a Periodical (2)
- Examination Thesis (1)
- Video (1)
Keywords
- Illustration (10)
- Nachhaltigkeit (10)
- Corporate Design (9)
- Erscheinungsbild (8)
- Gamification (8)
- Redesign (7)
- Animation (6)
- Datenschutz (6)
- Deutschland (6)
- Digitalisierung (6)
The interactive documentary conveys the sensory impact of the HirschGrün community garden and shares the experiences of four gardeners in an auditory and visual exploration within an open-world approach. The project represents a progressive way of engaging with journalistic content. It opens up perspectives on the social and creative potential of a community garden. Interactions as well as visual and auditory stimuli illustrate its atmosphere. Green spaces are not only relevant to the climate; they also offer city dwellers many ways of coming into contact with living things. Community gardens in particular combine community, nature and sustainability. The website unites this explorative form of conveying information with topical issues of our time. The project demonstrates one prospect for designing multimedia journalism.
To successfully develop and introduce concrete artificial intelligence (AI) solutions into operational practice, a comprehensive process model is being tested in the WIRKsam joint project. It is based on a methodical approach that integrates human, technical and organisational aspects and involves employees in the process. The chapter focuses on the procedure for identifying the requirements of a work system in which AI is to be implemented in problem-driven projects, and for selecting appropriate AI methods. This means that the use case has already been narrowed down at the beginning of the project and must be fully defined in what follows. First, the existing preliminary work is presented. Building on this, an overview of all procedural steps and methods is given. All methods are presented in detail and good-practice approaches are shown. Finally, the developed procedure is reflected upon on the basis of its application in nine companies.
Effective government services rely on accurate population numbers to allocate resources. In Colombia and globally, census enumeration is challenging in remote regions and where armed conflict is occurring. During census preparations, the Colombian National Administrative Department of Statistics conducted social cartography workshops, where community representatives estimated numbers of dwellings and people throughout their regions. We repurposed this information, combining it with remotely sensed buildings data and other geospatial data. To estimate building counts and population sizes, we developed hierarchical Bayesian models, trained using nearby full-coverage census enumerations and assessed using 10-fold cross-validation. We compared models to assess the relative contributions of community knowledge, remotely sensed buildings, and their combination to model fit. The Community model was unbiased but imprecise; the Satellite model was more precise but biased; and the Combination model was best for overall accuracy. Results reaffirmed the power of remotely sensed buildings data for population estimation and highlighted the value of incorporating local knowledge.
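The model comparison above distinguishes bias from imprecision. As a minimal illustration of these two error metrics (with made-up numbers, not data from the study), one might compute:

```python
# Illustrative only: bias vs. imprecision of population estimates.
def bias(pred, obs):
    """Mean error; near zero means unbiased."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def imprecision(pred, obs):
    """Root-mean-square error; large values mean imprecise."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

observed  = [100, 200, 300, 400]   # hypothetical census counts
community = [60, 250, 260, 430]    # errors cancel out: unbiased but imprecise
satellite = [110, 210, 310, 410]   # consistent overshoot: biased but precise

print(bias(community, observed))   # 0.0
print(bias(satellite, observed))   # 10.0
print(imprecision(community, observed) > imprecision(satellite, observed))  # True
```

Here the "community" estimates scatter widely but cancel on average, while the "satellite" estimates overshoot consistently, mirroring the unbiased-but-imprecise versus precise-but-biased contrast reported in the abstract.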
The master's project DIE STUHL highlights the invisibility of women designers in chair design. It presents 108 women designers from 40 countries who have created innovative and trend-setting chair designs from the beginning of the 20th century to the present day. The work illuminates the contribution they have made while pointing out that many of them have been overlooked in the international design canon. Ordered chronologically by the year their first chairs appeared, the objects are illustrated in an atlas alongside short biographies of the protagonists. Information graphics embed the chairs in the socio-political events of their time. DIE STUHL honours the work of women designers and raises questions about structural gender inequality. In parallel to the publication, a sculpture was created that functions as a timeline and lists, in abstract form, all the chairs mentioned in the atlas, making visible the unequal ratio of chairs designed by women and by men.
Perennial ryegrass (Lolium perenne) is an underutilized lignocellulosic biomass with several benefits, such as high availability, renewability, and biomass yield. The grass press-juice obtained from mechanical pretreatment can be used for the bio-based production of chemicals. Lactic acid is a platform chemical that has attracted attention due to its broad range of applications. For this reason, more sustainable production of lactic acid is expected to increase. In this work, lactic acid was produced using a complex medium at bench and reactor scale, and the results were compared to those obtained using an optimized press-juice medium. Bench-scale fermentations were carried out in a pH-controlled system, and lactic acid production reached 21.84 ± 0.95 g/L in the complex medium and 26.61 ± 1.2 g/L in the press-juice medium. In the bioreactor, the production yield was 0.91 ± 0.07 g/g, corresponding to a 1.4-fold increase with respect to the complex medium with fructose. As a comparison to the traditional ensiling process, whole grass fractions of different varieties harvested in summer and autumn were ensiled. Ensiling showed variations in lactic acid yields, with a yield of up to 15.2% dry mass for the late-harvested samples, surpassing typical silage yields of 6–10% dry mass.
Purpose: Impaired paravascular drainage of β-amyloid (Aβ) has been proposed as a contributing cause of sporadic Alzheimer's disease (AD), as decreased cerebral blood vessel pulsatility and subsequently reduced propulsion in this pathway could lead to the accumulation and deposition of Aβ in the brain. We therefore hypothesized that pulsatility is increasingly impaired across the AD spectrum.
Patients and Methods: Using transcranial color-coded duplex sonography (TCCS) the resistance and pulsatility index (RI; PI) of the middle cerebral artery (MCA) in healthy controls (HC, n=14) and patients with AD dementia (ADD, n=12) were measured. In a second step, we extended the sample by adding patients with mild cognitive impairment (MCI) stratified by the presence (MCI-AD, n=8) or absence of biomarkers (MCI-nonAD, n=8) indicative for underlying AD pathology, and compared RI and PI across the groups. To control for atherosclerosis as a confounder, we measured the arteriolar-venular-ratio of retinal vessels.
Results: Left and right RI (p=0.020; p=0.027) and left PI (p=0.034) differed between HC and ADD controlled for atherosclerosis with AUCs of 0.776, 0.763, and 0.718, respectively. The RI and PI of MCI-AD tended towards ADD, of MCI-nonAD towards HC, respectively. RIs and PIs were associated with disease severity (p=0.010, p=0.023).
Conclusion: Our results strengthen the hypothesis that impaired pulsatility could cause impaired amyloid clearance from the brain and thereby might contribute to the development of AD. However, further studies considering other factors possibly influencing amyloid clearance as well as larger sample sizes are needed.
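The resistance and pulsatility indices used above have standard Doppler definitions (Pourcelot RI and Gosling PI). A minimal sketch with hypothetical velocity values, not the study's measurements:

```python
# RI = (PSV - EDV) / PSV ; PI = (PSV - EDV) / mean velocity
def resistance_index(psv, edv):
    return (psv - edv) / psv

def pulsatility_index(psv, edv, v_mean):
    return (psv - edv) / v_mean

# hypothetical middle cerebral artery velocities in cm/s
psv, edv, v_mean = 90.0, 30.0, 50.0
print(resistance_index(psv, edv))            # ~0.67
print(pulsatility_index(psv, edv, v_mean))   # 1.2
```

Both indices grow as the gap between systolic and diastolic velocity widens, which is why they serve as surrogate markers of vessel pulsatility.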
Purpose: A precise determination of the corneal diameter is essential for the diagnosis of various ocular diseases, cataract and refractive surgery as well as for the selection and fitting of contact lenses. The aim of this study was to investigate the agreement between two automatic and one manual method for corneal diameter determination and to evaluate possible diurnal variations in corneal diameter.
Patients and Methods: The horizontal white-to-white corneal diameter of 20 volunteers was measured at three fixed times of the day using three methods: the Scheimpflug method (Pentacam HR, Oculus), Placido-based topography (Keratograph 5M, Oculus) and a manual method using image analysis software at a slit lamp (BQ900, Haag-Streit).
Results: A two-factorial analysis of variance showed no significant effect of instrument (p = 0.117), time point (p = 0.506) or the interaction between instrument and time point (p = 0.182). Very good repeatability (intraclass correlation coefficient, ICC; quartile coefficient of dispersion, QCD) was found for all three devices. However, manual slit-lamp measurements showed a higher QCD than the automatic measurements with the Keratograph 5M and the Pentacam HR at all measurement times.
Conclusion: The manual and automated methods used in the study to determine corneal diameter showed good agreement and repeatability. No significant diurnal variations of corneal diameter were observed during the period of time studied.
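The quartile coefficient of dispersion reported above has a simple definition, QCD = (Q3 − Q1)/(Q3 + Q1). A sketch with invented readings (not study data) showing how a more scattered manual series yields a higher QCD:

```python
import statistics

def qcd(values):
    """Quartile coefficient of dispersion: (Q3 - Q1) / (Q3 + Q1)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    return (q3 - q1) / (q3 + q1)

auto   = [11.9, 12.0, 12.0, 12.1]   # tightly clustered automatic readings (mm)
manual = [11.6, 11.9, 12.1, 12.4]   # more dispersed manual readings (mm)

print(qcd(auto) < qcd(manual))      # True: manual shows higher dispersion
```

Because QCD normalizes the interquartile range by its midpoint sum, it allows dispersion to be compared across devices regardless of small offsets in the measured diameter.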
Transgenic plants have the potential to produce recombinant proteins on an agricultural scale, with yields of several tons per year. The cost-effectiveness of transgenic plants increases if simple cultivation facilities such as greenhouses can be used for production. In such a setting, we expressed a novel affinity ligand based on the fluorescent protein DsRed, which we used as a carrier for the linear epitope ELDKWA from the HIV-neutralizing antibody 2F5. The DsRed-2F5-epitope (DFE) fusion protein was produced in 12 consecutive batches of transgenic tobacco (Nicotiana tabacum) plants over the course of 2 years and was purified using a combination of blanching and immobilized metal-ion affinity chromatography (IMAC). The average purity after IMAC was 57 ± 26% (n = 24) in terms of total soluble protein, but the average yield of pure DFE (12 mg kg⁻¹) showed substantial variation (±97 mg kg⁻¹, n = 24) which correlated with seasonal changes. Specifically, we found that temperature peaks (>28°C) and intense illuminance (>45 klx h⁻¹) were associated with lower DFE yields after purification, reflecting the loss of the epitope-containing C-terminus in up to 90% of the product. Whereas the weather factors were of limited use to predict product yields of individual harvests conducted for each batch (spaced by 1 week), the average batch yields were well approximated by simple linear regression models using two independent variables for prediction (illuminance and plant age). Interestingly, accumulation levels determined by fluorescence analysis were not affected by weather conditions but positively correlated with plant age, suggesting that the product was still expressed at high levels, but the extreme conditions affected its stability, albeit still preserving the fluorophore function.
The efficient production of intact recombinant proteins in plants may therefore require adequate climate control and shading in greenhouses or even cultivation in fully controlled indoor farms.
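The batch-yield regression described above uses two predictors (illuminance and plant age). A self-contained sketch of such a two-variable least-squares fit, on synthetic numbers chosen only to make the mechanics visible (the coefficients and signs are not the study's):

```python
# Illustrative two-predictor ordinary least squares via the normal equations.
def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

# rows: (intercept, illuminance in klx·h, plant age in days); yield in mg/kg
X = [(1.0, 30.0, 40.0), (1.0, 50.0, 42.0), (1.0, 40.0, 50.0), (1.0, 60.0, 48.0)]
y = [100.0, 58.0, 70.0, 32.0]   # generated from y = 200 - 2*illum - 1*age

# normal equations: (X^T X) beta = X^T y
xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
b0, b_illum, b_age = solve3(xtx, xty)
print(b0, b_illum, b_age)   # recovers the generating coefficients
```

With real harvest data the fit would of course not be exact; the point is only the model form, a plane over the two predictors.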
Chromatography is the workhorse of biopharmaceutical downstream processing because it can selectively enrich a target product while removing impurities from complex feed streams. This is achieved by exploiting differences in molecular properties, such as size, charge and hydrophobicity (alone or in different combinations). Accordingly, many parameters must be tested during process development in order to maximize product purity and recovery, including resin and ligand types, conductivity, pH, gradient profiles, and the sequence of separation operations. The number of possible experimental conditions quickly becomes unmanageable. Although the range of suitable conditions can be narrowed based on experience, the time and cost of the work remain high even when using high-throughput laboratory automation. In contrast, chromatography modeling using inexpensive, parallelized computer hardware can provide expert knowledge, predicting conditions that achieve high purity and efficient recovery. The prediction of suitable conditions in silico reduces the number of empirical tests required and provides in-depth process understanding, which is recommended by regulatory authorities. In this article, we discuss the benefits and specific challenges of chromatography modeling. We describe the experimental characterization of chromatography devices and settings prior to modeling, such as the determination of column porosity. We also consider the challenges that must be overcome when models are set up and calibrated, including the cross-validation and verification of data-driven and hybrid (combined data-driven and mechanistic) models. This review will therefore support researchers intending to establish a chromatography modeling workflow in their laboratory.
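As one concrete example of the pre-modeling characterization mentioned above, the total column porosity can be estimated from the retention of a non-interacting tracer. The formula is standard, but all numbers below are invented for illustration:

```python
import math

def total_porosity(t_r_min, flow_ml_min, v_extra_ml, d_cm, l_cm):
    """Total porosity from a non-interacting tracer pulse:
    eps_t = (t_R * F - V_extra) / V_column."""
    v_column = math.pi * (d_cm / 2.0) ** 2 * l_cm   # empty column volume (mL)
    v_tracer = t_r_min * flow_ml_min - v_extra_ml   # corrected retention volume
    return v_tracer / v_column

# hypothetical run: 0.7 cm x 5 cm column, 1 mL/min, tracer elutes at 1.5 min
eps = total_porosity(t_r_min=1.5, flow_ml_min=1.0, v_extra_ml=0.1,
                     d_cm=0.7, l_cm=5.0)
print(round(eps, 3))   # ~0.73, a plausible total porosity
```

Subtracting the extra-column volume matters because tubing and detector dead volumes would otherwise inflate the apparent retention volume and hence the porosity estimate.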
Proteins are important ingredients in food and feed, they are the active components of many pharmaceutical products, and they are necessary, in the form of enzymes, for the success of many technical processes. However, production can be challenging, especially when using heterologous host cells such as bacteria to express and assemble recombinant mammalian proteins. The manufacturability of proteins can be hindered by low solubility, a tendency to aggregate, or inefficient purification. Tools such as in silico protein engineering and models that predict separation criteria can overcome these issues but usually require the complex shape and surface properties of proteins to be represented by a small number of quantitative numeric values known as descriptors, as similarly used to capture the features of small molecules. Here, we review the current status of protein descriptors, especially for application in quantitative structure activity relationship (QSAR) models. First, we describe the complexity of proteins and the properties that descriptors must accommodate. Then we introduce descriptors of shape and surface properties that quantify the global and local features of proteins. Finally, we highlight the current limitations of protein descriptors and propose strategies for the derivation of novel protein descriptors that are more informative.
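A minimal example of a single-number descriptor of the kind discussed above is the grand average of hydropathy (GRAVY), computed from the Kyte-Doolittle scale; the peptide sequences are arbitrary toy inputs:

```python
# Kyte-Doolittle hydropathy values per amino acid (one-letter codes)
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def gravy(seq):
    """Grand average of hydropathy: a simple sequence-based global descriptor."""
    return sum(KD[aa] for aa in seq) / len(seq)

print(gravy("ILVA") > 0)   # True: hydrophobic toy peptide
print(gravy("KDER") < 0)   # True: charged toy peptide
```

GRAVY captures only a crude global property; the shape and surface descriptors reviewed in the article aim to go well beyond such sequence averages.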
The book covers various numerical field simulation methods, nonlinear circuit technology and its MF-S- and X-parameters, as well as state-of-the-art power amplifier techniques. It also describes newly presented oscillators and the emerging field of GHz plasma technology. Furthermore, it addresses aspects such as waveguides, mixers, phase-locked loops, antennas, and propagation effects; together with the bachelor's-level book 'High-Frequency Engineering', it encompasses all aspects of the current state of GHz technology.
Metathesis of oleic acid and its derivatives is an attractive route to bifunctional compounds from renewable raw materials. Second-generation Ru catalysts were used, which show high tolerance towards functional groups and impurities. Despite the use of technical-grade educts, conversions with low catalyst loadings (0.001–0.01 mol%) were possible, with yields in line with the literature. Cross-metathesis allowed variable chain lengths and functionalities of the monomers, but product recovery is laborious. Self-metathesis yielded C18-bifunctional compounds that can easily be isolated by distillation or crystallization. In addition to the catalytic conversion, product recovery was also investigated and, for selected products, carried out at larger scale.
Self-metathesis of oleochemicals yields a variety of bifunctional compounds that can be used as monomers for polymer production. Many precursors are available at large scale, such as oleic acid esters (biodiesel), oleyl alcohol (surfactants) and oleyl amines (surfactants, lubricants). We show several ways to produce, separate and purify C18-α,ω-bifunctional compounds using Grubbs second-generation catalysts, starting from technical-grade educts.
The provision of sustainably produced hydrogen as an energy carrier and raw material is a key technology, both as a substitute for fossil fuels and as a product within circular processes. In wastewater treatment there are several ways to produce hydrogen. Several routes, possible synergies, but also their drawbacks are presented.
The invention lies in the field of enzyme technology. It relates to proteases from Metabacillus indicus that can be used in particular in washing and cleaning agents, to all sufficiently similar proteases with a correspondingly similar sequence to SEQ ID NO:1, and to nucleic acids encoding them. The invention further relates to their production, to methods for using these proteases, to their use as such, and to agents containing them, in particular washing and cleaning agents.
The invention lies in the field of enzyme technology. It relates to proteases from Fictibacillus arsenicus that can be used in particular in washing and cleaning agents, to all sufficiently similar proteases with a correspondingly similar sequence to SEQ ID NO:1, and to nucleic acids encoding them. The invention further relates to their production, to methods for using these proteases, to their use as such, and to agents containing them, in particular washing and cleaning agents.
The research group focuses on the characteristics of the land- and cityscapes of the Drielanden zone that contribute to generating common identities, as well as on those features that trigger differences and specificities of the adjacent countries and thereby enrich the perception of the zone. In this research, the instruments of cartography and land surveying serve to detect and localize the fragmented appearance of relevant historic elements. These analytic procedures help to develop strategies for infrastructures and processes that gradually initiate local forms of cross-border tourism. The architectural research shows how top-down and bottom-up interventions can be combined in order to guarantee a sustainable use and development of the area under consideration.
Cento Tavole
(2016)
Martinella
(2010)
With the Digital Automatic Coupling, a new chapter of rail freight transport begins, in which assembled wagons make themselves ready for departure automatically within a few minutes, without human intervention. One of the greatest obstacles facing environmentally friendly rail will then disappear. What is needed now is a discussion about the scope and system boundaries of the automatic brake test.
In many instances, freight vehicles exchange loads or information with plants that are, or will soon be, Industry 4.0 plants. The Wagon4.0 concept, developed in close cooperation with, e.g., port and mine operations, offers maximum railway operational efficiency while already providing strong business cases in the respective plant interactions. The Wagon4.0 consists of several main components: a power supply, a data network, sensors, actuators and an operating system, the so-called WagonOS. The WagonOS is implemented in a granular, self-sufficient manner to allow basic features such as WiFi mesh and train christening in remote areas without network connection. Furthermore, the granularity of the operating system allows the familiar app concept to be extended to freight rolling stock, making it possible to use specialised actuators for certain applications, e.g. an electric parking brake or an auxiliary drive. In order to facilitate migration to the Wagon4.0 for existing fleets, a migration concept featuring five levels of technical adaptation was developed. The present paper investigates the benefits of Wagon4.0 implementations for the particular challenges of heavy-haul operations by focusing on train christening, ep-assisted braking, autonomous last mile and traction-boost operation as well as improved maintenance schedules.
Neue Perspektiven für die Bahn in der Produktions- und Distributionslogistik durch Prozessautomation
(2019)
Germany needs more rail transport in order to reduce CO2 emissions from traffic. Rail must become the backbone of current logistics processes, e.g. for merchant goods and e-commerce. This will not happen without novel operating concepts and a transformation of the freight wagon from a "dumb piece of steel" into a modern logistics tool.
The "Güterwagen 4.0" (Freight Wagon 4.0) is understood as a communicative and cooperative freight wagon that provides the prerequisites for automating all train-preparation processes while otherwise remaining fully compatible with today's mainline operating procedures. Communication between the freight wagon and surrounding intelligent systems, in the sense of an "Internet of Things", enables, among other things, highly efficient private-siding services, which open up new markets for rail freight beyond the classically rail-affine traffic and ultimately promote the transition to sustainable freight mobility.
Thanks to state-of-the-art drive technology, locomotives today are energy-saving and environmentally friendly. Equipment with telematics and assistance functions is standard. On the line, modern technology appears in the form of electronic interlockings and train protection systems, and in shunting and storage yards as locally operated points (EOW). The freight wagon, by contrast, has been completely bypassed by technical progress. Even on the most modern wagon (Fig. 1), the only "automatic" function is the air brake, supplied and actuated centrally via the brake pipe (HL).
Organizzare l’addizione
(2014)
In the introduction to their book "What is Philosophy?", Gilles Deleuze and Félix Guattari deplore the inflationary and trivialised use of the term concept: "Finally, the most shameful moment came when computer science, marketing, design and advertising, all the disciplines of communication, seized hold of the word concept itself and said: 'This is our concern, we are the creative ones, we are the ideas men! We are the friends of the concept, we put it in our computers.'" This doctoral thesis shares the concern of Deleuze and Guattari, but it is nonetheless a thesis in architecture and thus located within the field of the representatives of the "ideas men". It engages in architectural design theory and refers in particular to the investigation of methodological approaches within the design process. The thesis will therefore not contribute to the philosophical dimension of the term, but intends to overcome its imprecise use within architectural discourse, in compliance with Eugène Viollet-le-Duc's admonition regarding vague definitions: "Dans les arts, et dans l'architecture en particulier, les définitions vagues ont causé bien des erreurs, ont laissé germer bien des préjugés, enraciner bien des idées fausses. On met un mot en avant, chacun y attache un sens différent." The term concept in architecture is very often used as pure marketing collateral; it serves to sell an idea, a product, a design. Its functional applicability is reduced to a special manner of illustration, produced as one of the various design presentation documents at the end of the design process. In contrast, the original contribution of this thesis aims to give a precise, instrumental dimension to the term concept: the concept is the expression of a specific logic, capable of guiding the decisional sequences of the process and thus of improving the quality of the designed projects.
The motivation to define a specific instrumentality of the concept is closely connected to the issue of interdisciplinarity in the architect's profession. The interdisciplinary character of the architectural field is widely accepted and discussed as such, but the thesis intends to give a more precise definition of the various kinds of competences involved by classifying them into either an internal or an external group. The traditional notion of interdisciplinarity, predominantly seen as collaboration between architects and technical experts, and, most notably, the historical, sometimes contentious, relationship between architects and engineers is described. Referring to recent developments, the transformation of the architect's role within the professional sphere, marked by the increasing importance of diverse influences and linked to a growing risk of marginalisation, is illustrated. The thesis describes different ways of adapting to this specific kind of interdisciplinarity, which generally requires the architect's ability to connect and integrate various contents, different points of view and diverse scales. On the other hand, the great potential implicit in the interdisciplinary field is set out: architects can inform their core competence, the design, by extracting contents from different disciplinary competences, whether or not these pertain to their own professional field. They can cross fields of external competence selectively and thereby build up a corpus of knowledge capable of generating and communicating guidelines and systematic methodologies for their design. Finally, the analysis of these two aspects allows the definition of a more specific professional profile of the architect as a specialist in interdisciplinarity. The thesis is concerned with theories of the design process.
The design process is seen as open to inspection and critical evaluation, with a major focus on the decisional sequences that characterise it. The thesis concentrates on the process's descriptiveness and the degree of self-conscious approaches applied within it. The importance of regulative, strategic mechanisms is illustrated by testimonies taken from a series of design researches and leads to a functional definition of the figure of the concept: as the representation of a coherent set of ideas, as the generator of a project-specific system of rules, and as the communicator of decisional strategies. The concept's function is furthermore defined as a communicative interface which generates and transmits the system of rules authoritative for all the disciplinary competences involved in the design process, a communicative interface which constitutes a basis of shared convictions capable of increasing the efficiency of collaboration. Furthermore, the concept's capacity to explore and elaborate the contents of external disciplines is identified as a possible methodological approach to innovative design thinking. The approach to a specific functional definition of the concept is continued by the description of a series of instruments that simultaneously generate and communicate it. It is outlined to what degree the concept itself is already the result of an ideational process, located within the initial phase of the design proceedings, serving as a guideline for them, yet continuously evolving and adapting as they progress. In addition, it is illustrated how all the diverse instruments of the concept are operational media through which the transfer of knowledge between different disciplines can occur. The considerations about the concept as an operational instrument of design are elaborated with regard to a number of examples of didactic applications that are particularly involved in the development and teaching of specific design methods.
These examples illustrate the interrelations between design theory and design education. They are derived from very different schools of architecture and diverse mindsets, but all of them transmit models of conceptual design thinking.
Concept - this is a key term in architectural discourse. However, all too often it is used imprecisely or merely for marketing purposes. What is a concept actually? This publication moves between design theory and design practice and follows the history of the definition of concept in architecture, leading to the formulation of a specifically instrumental and operative definition. It bases concept in architecture on its strategic potential in design decision-making processes. In the changing profession of the designing architect, decisions are increasingly made in multidisciplinary groups. Concept can serve as a dialogic instrument in the process, making it possible to process heterogeneous information from a range of spheres of knowledge. The effective presentation of selected information becomes a relevant interface in the design process, which has a significant influence on the quality of the design.
Robustheit
(2017)
Architects and civil engineers work together regularly in their professional practice and are irreplaceable for each other. This cooperation is sometimes made more difficult by the differences in their disciplinary languages and approaches. Architects evaluate structures on the basis of criteria such as spatial impact and usability, while civil engineers analyse them more closely in terms of their load-bearing and deformation properties, as well as constructive aspects. This diversity of assessment criteria and approaches often carries over into how both academic disciplines view structures.
Within the framework of the Exploratory Teaching Space (ETS), a funding program to improve teaching at RWTH Aachen University and to promote new teaching concepts, a project was carried out jointly by the Junior Professorship of Tool-Culture at the Faculty of Architecture and the Institute of Structural Concrete at the Faculty of Civil Engineering. The aim of the project is to present buildings in such a way that the differences in perception between architects and civil engineers are reduced and the common understanding is promoted.
The project develops a database containing a collection of striking buildings from Aachen and the surrounding area. The buildings are categorized according to terms from both disciplinary areas. The collection can be explored freely or traversed along learning trails. The medium of film plays a special role in presenting the buildings. The buildings are assigned to different categories of load-bearing structures, such as linear, planar and spatial structures, and further to different types of material, functional programs and spatial characteristics. Since the buildings are located in the direct vicinity of Aachen, they can be visited by the students, which sensitises them to their environment and encourages intrinsic motivation as well as implicit learning. The paper provides a detailed report of the project, its implementation, the feedback of the students and the plans for further development.
Background: Architectural representation, nurtured by the interaction between design thinking and design action, is inherently multi-layered. However, the representation object cannot always reflect these layers. It is therefore claimed that these reflections and layerings can gain visibility through 'performativity in personal knowledge', which is performative in character. The specific layers of representation produced during this performativity permit insights into the 'personal way of designing' [1]. The question 'how can these layered drawings be decomposed to understand the personal way of designing?' thus marks the starting point of the study. Performativity in personal knowledge in architectural design is treated through the relationship between explicit and tacit knowledge and between representational and non-representational theory. To discuss the practical dimension of these theoretical relations, Zvi Hecker's drawing of the Heinz-Galinski-School is examined as an example. The study aims to understand the relationships between the layers by decomposing a layered drawing analytically in order to exemplify personal ways of designing.
Methods: The study is based on qualitative research methodologies. First, a model has been formed through theoretical readings to discuss the performativity in personal knowledge. This model is used to understand the layered representations and to research the personal way of designing. For this purpose, one drawing of Hecker’s Heinz-Galinski-School project is chosen. Second, its layers are decomposed to detect and analyze diverse objects, which hint at different types of design tools and their application. Third, Zvi Hecker’s statements about the design process are examined through the interview data [2] and other sources. The obtained data are compared with each other.
Results: By decomposing the drawing, eleven layers are defined. These layers are used to understand the relation between the design idea and its representation. They can also be thought of as a reading system. In other words, a method to discuss Hecker’s performativity in personal knowledge is developed. Furthermore, the layers and their interconnections are described in relation to Zvi Hecker’s personal way of designing.
Conclusions: It can be said that layered representations, which are associated with the multilayered structure of performativity in personal knowledge, form the personal way of designing.
Against the background of growing data volumes in everyday life, data processing tools are becoming more powerful in dealing with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the increased volume of data consumes time in collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer it. This paper presents historic and contemporary models for visualizing the architectural planning process and introduces means of describing today’s situation in terms of stakeholders, events and instruments. It contrasts conceptions from the Renaissance with models used in the second half of the 20th century. Contemporary models are discussed with regard to their value against the background of increasing computation in the building process.
Heimat entwerfen?
(2019)
Can sketch sheets that show mixed systems of text and image components be understood as a spatial and temporal condensation of milieus of reflection? What effect does the simultaneous presence of text and image, conditioned by the spatial limits of the sheet, have, and what interactions unfold? These questions lead to the definition of ‘multidimensional worksheets’, which are understood as a suitable medium for analyzing design thinking processes. Using five examples, it is described how decompositional procedures can make drawing genealogies visible that provide rich information about design actions.
In the research domain of energy informatics, the importance of open data is rising rapidly, as various new public datasets are being created and published. Unfortunately, in many cases the data is not available under a permissive license corresponding to the FAIR principles, often lacking accessibility or reusability. Furthermore, the source format often differs from the desired data format or does not meet the demands of efficient querying. To solve this on a small scale, a toolbox for ETL processes is provided to create a local energy data server with open-access data from different valuable sources in a structured format. So while the sources themselves do not fully comply with the FAIR principles, the provided unique toolbox allows for efficient processing of the data as if the FAIR principles were met. The energy data server currently includes information on power systems, weather data, network frequency data, European energy and gas data for demand and generation, and more. However, a solution to the core problem, the missing alignment with the FAIR principles, is still needed for the National Research Data Infrastructure.
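The ETL workflow described above (extract raw open data, transform it into a clean schema, load it into a structured, queryable store) can be sketched as follows. This is a minimal illustration, not code from the described toolbox: the record layout, table name and sample values are all hypothetical.

```python
import sqlite3

# Hypothetical raw records as they might arrive from a public source:
# timestamps as strings, demand values in MW, with missing entries.
raw_rows = [
    ("2019-01-01T00:00", "41250.5"),
    ("2019-01-01T01:00", ""),          # missing value in the source
    ("2019-01-01T02:00", "39870.0"),
]

def transform(rows):
    """Normalize raw rows: drop missing values, convert demand to float."""
    cleaned = []
    for ts, load in rows:
        if load == "":
            continue
        cleaned.append((ts, float(load)))
    return cleaned

def load_into_db(conn, rows):
    """Load cleaned rows into a structured, efficiently queryable table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS demand (ts TEXT PRIMARY KEY, load_mw REAL)"
    )
    conn.executemany("INSERT INTO demand VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")          # local "energy data server"
load_into_db(conn, transform(raw_rows))
summary = conn.execute("SELECT COUNT(*), AVG(load_mw) FROM demand").fetchone()
print(summary)  # (2, 40560.25)
```

Once loaded, downstream analyses can query the local store in a uniform way regardless of how heterogeneous the original source formats were.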
Due to the transition to renewable energies, electricity markets need to be made fit for purpose. To enable the comparison of different energy market designs, modeling tools covering market actors and their heterogeneous behavior are needed. Agent-based models are ideally suited for this task. Such models can be used to simulate and analyze changes to market design or market mechanisms and their impact on market dynamics. In this paper, we conduct an evaluation and comparison of two actively developed open-source energy market simulation models. The two models, namely AMIRIS and ASSUME, are both designed to simulate future energy markets using an agent-based approach. The assessment encompasses modeling features and techniques, model performance, as well as a comparison of model results, and can serve as a blueprint for future comparative studies of simulation models. The main comparison dataset covers Germany in 2019 and simulates the Day-Ahead market with participating actors as individual agents. Both models are comparably close to the benchmark dataset, with a MAE between 5.6 and 6.4 €/MWh, while also modeling the actual dispatch realistically.
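The mean absolute error (MAE) used to compare the simulated Day-Ahead prices against the benchmark is the average of the absolute hourly deviations. A minimal sketch, with entirely made-up price series (the real comparison covers a full year of hourly German prices):

```python
# Hypothetical hourly Day-Ahead prices in EUR/MWh: benchmark vs. one model run.
observed  = [32.0, 35.5, 40.1, 38.7]
simulated = [30.5, 36.0, 45.0, 37.2]

def mean_absolute_error(y_true, y_pred):
    """MAE = (1/n) * sum(|y_true_i - y_pred_i|)."""
    assert len(y_true) == len(y_pred)
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

mae = mean_absolute_error(observed, simulated)
print(f"MAE: {mae:.2f} EUR/MWh")  # MAE: 2.10 EUR/MWh
```

Because MAE weights all hours equally and stays in the unit of the price itself, it gives a directly interpretable figure such as the 5.6 to 6.4 €/MWh reported for the two models.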