Purpose: A precise determination of the corneal diameter is essential for the diagnosis of various ocular diseases, for cataract and refractive surgery, and for the selection and fitting of contact lenses. The aim of this study was to investigate the agreement between two automatic methods and one manual method of corneal diameter determination and to evaluate possible diurnal variations in corneal diameter.
Patients and Methods: The horizontal white-to-white corneal diameter of 20 volunteers was measured at three fixed times of day with three methods: the Scheimpflug method (Pentacam HR, Oculus), Placido-based topography (Keratograph 5M, Oculus), and a manual method using image analysis software at a slit lamp (BQ900, Haag-Streit).
Results: A two-factor analysis of variance showed no significant effect of instrument (p = 0.117), time point (p = 0.506), or the instrument × time point interaction (p = 0.182). Very good repeatability (intraclass correlation coefficient, ICC; quartile coefficient of dispersion, QCD) was found for all three devices. However, the manual slit-lamp measurements showed a higher QCD than the automatic measurements with the Keratograph 5M and the Pentacam HR at all measurement times.
Conclusion: The manual and automated methods used in this study to determine the corneal diameter showed good agreement and repeatability. No significant diurnal variations of the corneal diameter were observed during the period studied.
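To make the repeatability metric concrete, here is a minimal sketch of the quartile coefficient of dispersion (QCD) reported above; the measurement values are invented for illustration, and the study's actual analysis pipeline (ANOVA, ICC) is not reproduced.

```python
# Minimal sketch: quartile coefficient of dispersion (QCD) as a repeatability measure.
# The diameter values below are hypothetical, not the study's data.
import numpy as np

def qcd(values):
    """Quartile coefficient of dispersion: (Q3 - Q1) / (Q3 + Q1)."""
    q1, q3 = np.percentile(values, [25, 75])
    return (q3 - q1) / (q3 + q1)

# Hypothetical repeated white-to-white measurements (mm) for one subject and device
measurements = np.array([11.9, 12.0, 12.1, 11.95, 12.05])
print(f"QCD = {qcd(measurements):.4f}")  # smaller values indicate better repeatability
```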
Transgenic plants have the potential to produce recombinant proteins on an agricultural scale, with yields of several tons per year. The cost-effectiveness of transgenic plants increases if simple cultivation facilities such as greenhouses can be used for production. In such a setting, we expressed a novel affinity ligand based on the fluorescent protein DsRed, which we used as a carrier for the linear epitope ELDKWA from the HIV-neutralizing antibody 2F5. The DsRed-2F5-epitope (DFE) fusion protein was produced in 12 consecutive batches of transgenic tobacco (Nicotiana tabacum) plants over the course of 2 years and was purified using a combination of blanching and immobilized metal-ion affinity chromatography (IMAC). The average purity after IMAC was 57 ± 26% (n = 24) in terms of total soluble protein, but the average yield of pure DFE (12 mg kg⁻¹) showed substantial variation (±97 mg kg⁻¹, n = 24) that correlated with seasonal changes. Specifically, we found that temperature peaks (>28 °C) and intense illumination (>45 klx h⁻¹) were associated with lower DFE yields after purification, reflecting the loss of the epitope-containing C-terminus in up to 90% of the product. Whereas the weather factors were of limited use for predicting the product yields of the individual harvests conducted for each batch (spaced 1 week apart), the average batch yields were well approximated by simple linear regression models using two independent variables (illuminance and plant age) for prediction. Interestingly, accumulation levels determined by fluorescence analysis were not affected by weather conditions but correlated positively with plant age, suggesting that the product was still expressed at high levels but that the extreme conditions affected its stability while preserving fluorophore function. The efficient production of intact recombinant proteins in plants may therefore require adequate climate control and shading in greenhouses, or even cultivation in fully controlled indoor farms.
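As an illustration of the two-predictor regression approach described above, here is a minimal ordinary-least-squares sketch; the batch values and scales are invented for illustration and are not the study's data.

```python
# Minimal sketch: linear regression of average batch yield on illuminance and plant age.
# All numbers are hypothetical placeholders.
import numpy as np

# Hypothetical batches: [illuminance (klx h), plant age (days)] -> yield (mg/kg)
X = np.array([[30.0, 40], [35.0, 45], [42.0, 50], [47.0, 55], [50.0, 60]])
y = np.array([140.0, 120.0, 90.0, 60.0, 55.0])

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, b_illum, b_age = coef
print(f"yield ~ {intercept:.1f} + {b_illum:.2f}*illuminance + {b_age:.2f}*age")
```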
Chromatography is the workhorse of biopharmaceutical downstream processing because it can selectively enrich a target product while removing impurities from complex feed streams. This is achieved by exploiting differences in molecular properties, such as size, charge and hydrophobicity (alone or in different combinations). Accordingly, many parameters must be tested during process development in order to maximize product purity and recovery, including resin and ligand types, conductivity, pH, gradient profiles, and the sequence of separation operations. The number of possible experimental conditions quickly becomes unmanageable. Although the range of suitable conditions can be narrowed based on experience, the time and cost of the work remain high even when using high-throughput laboratory automation. In contrast, chromatography modeling using inexpensive, parallelized computer hardware can provide expert knowledge, predicting conditions that achieve high purity and efficient recovery. The prediction of suitable conditions in silico reduces the number of empirical tests required and provides in-depth process understanding, which is recommended by regulatory authorities. In this article, we discuss the benefits and specific challenges of chromatography modeling. We describe the experimental characterization of chromatography devices and settings prior to modeling, such as the determination of column porosity. We also consider the challenges that must be overcome when models are set up and calibrated, including the cross-validation and verification of data-driven and hybrid (combined data-driven and mechanistic) models. This review will therefore support researchers intending to establish a chromatography modeling workflow in their laboratory.
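As a concrete example of one characterization step mentioned in the review, the following sketch estimates total column porosity from the retention time of a non-interacting tracer; the numbers and the simplified formula (dead volumes neglected) are illustrative assumptions, not a prescribed protocol.

```python
# Minimal sketch: total column porosity from a small, non-interacting tracer pulse.
# epsilon_t = (t_tracer * Q) / V_column, neglecting extra-column dead volumes.

def total_porosity(t_tracer_min, flow_ml_min, column_volume_ml):
    """Estimate total porosity from tracer retention time, flow rate, column volume."""
    return (t_tracer_min * flow_ml_min) / column_volume_ml

# Hypothetical values for illustration
eps = total_porosity(t_tracer_min=4.7, flow_ml_min=1.0, column_volume_ml=5.0)
print(f"total porosity = {eps:.2f}")  # typically ~0.6-0.9 for packed beds
```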
Proteins are important ingredients in food and feed, they are the active components of many pharmaceutical products, and, in the form of enzymes, they are necessary for the success of many technical processes. However, production can be challenging, especially when using heterologous host cells such as bacteria to express and assemble recombinant mammalian proteins. The manufacturability of proteins can be hindered by low solubility, a tendency to aggregate, or inefficient purification. Tools such as in silico protein engineering and models that predict separation criteria can overcome these issues but usually require the complex shape and surface properties of proteins to be represented by a small number of numeric values known as descriptors, similar to those used to capture the features of small molecules. Here, we review the current status of protein descriptors, especially for application in quantitative structure–activity relationship (QSAR) models. First, we describe the complexity of proteins and the properties that descriptors must accommodate. Then we introduce descriptors of shape and surface properties that quantify the global and local features of proteins. Finally, we highlight the current limitations of protein descriptors and propose strategies for the derivation of novel protein descriptors that are more informative.
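To illustrate what simple sequence-level descriptors look like in practice, here is a minimal sketch using Biopython's ProtParam module; the example sequence is hypothetical, and these scalar descriptors are far simpler than the 3D shape and surface descriptors discussed in the review.

```python
# Minimal sketch: simple sequence-based protein descriptors with Biopython.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

sequence = "MKWVTFISLLFLFSSAYS"  # hypothetical example sequence
pa = ProteinAnalysis(sequence)

descriptors = {
    "molecular_weight": pa.molecular_weight(),
    "isoelectric_point": pa.isoelectric_point(),
    "gravy": pa.gravy(),              # grand average of hydropathy
    "aromaticity": pa.aromaticity(),
}
for name, value in descriptors.items():
    print(f"{name}: {value:.3f}")
```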
The book covers various numerical field simulation methods, nonlinear circuit technology with its MF-, S-, and X-parameters, and state-of-the-art power amplifier techniques. It also describes newly presented oscillators and the emerging field of GHz plasma technology. Furthermore, it addresses aspects such as waveguides, mixers, phase-locked loops, antennas, and propagation effects; together with the bachelor's-level textbook 'High-Frequency Engineering', it covers all aspects of the current state of GHz technology.
The metathesis of oleic acid and its derivatives is an attractive route for the synthesis of bifunctional compounds from renewable raw materials. Second-generation Ru catalysts were used, which exhibit a high tolerance toward functional groups and impurities. Despite the use of technical-grade educts, conversions with low catalyst loadings (0.001–0.01 mol%) were possible, with yields consistent with the literature. Cross-metathesis gave access to monomers with variable chain lengths and functionalities, but product recovery is laborious. Self-metathesis yielded C18-bifunctional compounds that can be isolated simply by distillation or crystallization. In addition to the catalytic conversion, product recovery was also investigated and, for selected products, carried out on a larger scale.
Self-metathesis of oleochemicals offers a variety of bifunctional compounds that can be used as monomers for polymer production. Many precursors are available at large scale, such as oleic acid esters (biodiesel), oleyl alcohol (surfactants), and oleyl amines (surfactants, lubricants). We show several ways to produce, separate, and purify C18-α,ω-bifunctional compounds using Grubbs second-generation catalysts, starting from technical-grade educts.
The provision of sustainably produced hydrogen as an energy carrier and raw material is an important key technology, both as a substitute for fossil fuels and as a product in the context of circular processes. In wastewater treatment, there are several ways to produce hydrogen. Several routes, their possible synergies, and also their drawbacks are presented.
The invention lies in the field of enzyme technology. It relates to proteases from Metabacillus indicus that can be used in particular in washing and cleaning agents, to all sufficiently similar proteases with a correspondingly similar sequence to SEQ ID NO:1, and to nucleic acids encoding them. The invention further relates to their production, to methods of using these proteases, to their use as such, and to agents containing them, in particular washing and cleaning agents.
The invention lies in the field of enzyme technology. It relates to proteases from Fictibacillus arsenicus that can be used in particular in washing and cleaning agents, to all sufficiently similar proteases with a correspondingly similar sequence to SEQ ID NO:1, and to nucleic acids encoding them. The invention further relates to their production, to methods of using these proteases, to their use as such, and to agents containing them, in particular washing and cleaning agents.
1. Effects of the European Data Act: an examination of the relationship between data access and data protection
- Olivia Sohn | pp. 4-58
2. Detecting companies' willingness to invest in their sustainable transformation – relevant factors and their evaluation
- Titus Thamm | pp. 59-107
3. Differences in the work-related values of Generations Y and Z? Don't believe the hype
- Lara Heimann | pp. 108-199
4. The virtual members' meeting in registered associations – a model for the future?
- Abdullah Andug | pp. 200-253
5. The review of choice-of-law clauses under standard business terms (AGB) in German and European practice – effective consumer protection or additional legal uncertainty?
- Johannes Stahl | pp. 254-325
6. Data access and usage rights under the EU Data Act, using the automotive industry as an example
- Tim Schultwessel | pp. 326-380
The research group focuses on the characteristics in the land- and cityscapes of the Drielanden zone that contribute to generating common identities, as well as on those features that trigger the differences and specificities of the adjacent countries and enrich the perception of the zone. In this research, the instruments of cartography and land surveying serve to detect and localize the fragmented appearance of relevant historic elements. These analytic procedures help to develop strategies for infrastructures and processes that gradually initiate local forms of cross-border tourism. The architectural research shows how top-down and bottom-up interventions can be combined to guarantee a sustainable use and development of the area under consideration.
Cento Tavole
(2016)
Martinella
(2010)
With the Digital Automatic Coupling, a new chapter of rail freight transport begins, in which assembled wagons make themselves ready for departure automatically within a few minutes, without human intervention. One of the greatest obstacles facing environmentally friendly rail will then disappear. What is needed now is a discussion about the scope and system boundaries of the automatic brake test.
In many instances, freight vehicles exchange loads or information with plants that are, or will soon be, Industry 4.0 plants. The Wagon4.0 concept, developed in close cooperation with, e.g., port and mine operations, maximizes railway operational efficiency while already providing strong business cases in the respective plant interactions. The Wagon4.0 consists of the main components power supply, data network, sensors, actuators, and an operating system, the so-called WagonOS. The WagonOS is implemented in a granular, self-sufficient manner to allow basic features such as WiFi mesh and train christening in remote areas without network connection. Furthermore, the granularity of the operating system makes it possible to extend the familiar app concept to freight rolling stock, allowing specialized actuators to be used for certain applications, e.g., an electric parking brake or an auxiliary drive. To facilitate the migration of existing fleets to the Wagon4.0, a migration concept featuring five levels of technical adaptation was developed. The present paper investigates the benefits of Wagon4.0 implementations for the particular challenges of heavy-haul operations, focusing on train christening, ep-assisted braking, autonomous last-mile and traction-boost operation, as well as improved maintenance schedules.
New prospects for rail in production and distribution logistics through process automation
(2019)
Germany needs more rail in order to reduce CO2 emissions from transport. Rail must become the backbone of current logistics processes, e.g., for general merchandise and e-commerce. This will not happen without novel operational concepts and a transformation of the freight wagon from a "dumb piece of steel" into a modern logistics tool.
A "Güterwagen 4.0" (Freight Wagon 4.0) is understood as a communicative and cooperative freight wagon that provides the prerequisites for automating all train preparation processes while otherwise remaining fully compatible with today's operating procedures on the main leg. Communication between the freight wagon and surrounding intelligent systems, in the sense of an "Internet of Things", enables, among other things, highly efficient private-siding services that open up new markets for rail freight beyond the classic rail-affine traffic and ultimately promote the shift toward sustainable freight mobility.
Thanks to state-of-the-art drive technology, locomotives today are energy-efficient and environmentally friendly. Equipment with telematics and assistance functions is standard. On the line, modern technology appears in the form of electronic interlockings and train protection systems, and in shunting and storage yards as locally operated, electrically set points (EOW). The freight wagon, by contrast, has been completely bypassed by technical progress. Even on the most modern wagon (Fig. 1), the only "automatic" function is the air brake, supplied and actuated centrally via the main brake pipe (HL).
Organizzare l’addizione
(2014)
In the introduction to their book "What is philosophy?", Gilles Deleuze and Félix Guattari deplore the inflationary and trivialized use of the term concept: "Finally, the most shameful moment came when computer science, marketing, design and advertising, all the disciplines of communication, seized hold of the word concept itself and said: 'This is our concern, we are the creative ones, we are the ideas men! We are the friends of the concept, we put it in our computers.'" This doctoral thesis shares the concern of Deleuze and Guattari, but it is nevertheless a thesis in architecture and is thus situated within the field of the representatives of the "ideas men". It engages in architectural design theory and refers in particular to the investigation of methodological approaches within the design process. The thesis will therefore not contribute to the philosophical dimension of the term; rather, it intends to overcome its imprecise use within architectural discourse, in keeping with Eugène Viollet-le-Duc's admonition against vague definitions: "Dans les arts, et dans l'architecture en particulier, les définitions vagues ont causé bien des erreurs, ont laissé germer bien des préjugés, enraciner bien des idées fausses. On met un mot en avant, chacun y attache un sens différent." The term concept in architecture is very often used as pure marketing collateral; it serves to sell an idea, a product, a design. Its functional applicability is reduced to a special manner of illustration, produced as one of the various presentation documents at the end of the design process. In contrast, the original contribution of this thesis aims to give the term concept a precise, instrumental dimension: the concept is the expression of a specific logic, capable of guiding the decisional sequences of the process and thus of improving the quality of the designed projects. The motivation to define a specific instrumentality of the concept is closely connected to the issue of interdisciplinarity in the architectural profession. The interdisciplinary character of the architectural field is widely accepted and discussed as such, but the thesis intends to give a more precise definition of the various kinds of competences involved by classifying them into either an internal or an external group. The traditional notion of interdisciplinarity, predominantly seen as collaboration between architects and technical experts, and most notably the historical, sometimes contentious relationship between architects and engineers, is described. With reference to recent developments, the transformation of the architect's role within the professional sphere, marked by the increasing importance of diverse influences and linked to a growing risk of marginalization, is illustrated. The thesis describes different ways of adapting to this specific kind of interdisciplinarity, which generally requires the architect's ability to connect and integrate various contents, different points of view, and diverse scales. On the other hand, the great potential implicit in the interdisciplinary field is set out: architects can inform their core competence, the design, by extracting contents from different disciplinary competences, whether or not these pertain to their own professional field. They can cross fields of external competence selectively and thereby build up a corpus of knowledge capable of generating and communicating guidelines and systematic methodologies for their design.
In the end, the analysis of these two aspects allows the definition of a more specific professional profile of the architect as a specialist in interdisciplinarity. The thesis is concerned with theories of the design process. The design process is seen as open to inspection and critical evaluation, with a major focus on the decisional sequences that characterize it. It concentrates on the descriptiveness of the process and the degree of self-conscious approaches applied within it. The importance of regulative, strategic mechanisms is illustrated by testimonies taken from a series of design researches and leads to a functional definition of the figure of the concept: as the representation of a coherent set of ideas, as the generator of a project-specific system of rules, and as the communicator of decisional strategies. The concept's function is furthermore defined as a communicative interface that generates and transmits the system of rules authoritative for all the disciplinary competences involved in the design process, a communicative interface that constitutes a basis of shared convictions capable of increasing the efficiency of collaboration. Furthermore, the concept's capacity to explore and elaborate the contents of external disciplines is identified as a possible methodological approach to innovative design thinking. The approach to a specific functional definition of the concept is continued by the description of a series of instruments that simultaneously generate and communicate it. It is outlined to what degree the concept itself is already the result of an ideational process, situated within the initial phase of the design proceedings, serving as a guideline for them, yet continuously evolving and adapting as they progress. In addition, it is illustrated how the diverse instruments of the concept are operational media through which the transfer of knowledge between different disciplines can occur. The considerations about the concept as an operational instrument of design are elaborated with regard to a number of examples of didactic applications that are particularly involved in the development and teaching of specific design methods. These examples illustrate the interrelations between design theory and design education. They are derived from very different schools of architecture and diverse mindsets, but all of them transmit models of conceptual design thinking.
Concept - this is a key term in architectural discourse. However, all too often it is used imprecisely or merely for marketing purposes. What is a concept actually? This publication moves between design theory and design practice and follows the history of the definition of concept in architecture, leading to the formulation of a specifically instrumental and operative definition. It bases concept in architecture on its strategic potential in design decision-making processes. In the changing profession of the designing architect, decisions are increasingly made in multidisciplinary groups. Concept can serve as a dialogic instrument in the process, making it possible to process heterogeneous information from a range of spheres of knowledge. The effective presentation of selected information becomes a relevant interface in the design process, which has a significant influence on the quality of the design.
Robustheit
(2017)
Architects and civil engineers work together regularly in their professional practice and are irreplaceable for each other. This cooperation is sometimes made more difficult by the differences in their disciplinary languages and approaches. Structures are evaluated by architects on the basis of criteria such as spatial impact and usability, while civil engineers analyze them more closely in terms of their load-bearing and deformation properties, as well as constructive aspects. This diversity of assessment criteria and approaches also persists in how the two academic disciplines view structures.
Within the framework of the Exploratory Teaching Space (ETS), a funding program to improve teaching at RWTH Aachen University and to promote new teaching concepts, a project was carried out jointly by the Junior Professorship of Tool-Culture at the Faculty of Architecture and the Institute of Structural Concrete at the Faculty of Civil Engineering. The aim of the project is to present buildings in such a way that the differences in perception between architects and civil engineers are reduced and a common understanding is promoted.
The project develops a database containing a collection of striking buildings from Aachen and the surrounding area. The buildings are categorized according to terms from both disciplines. The collection can be explored freely or traversed through learning trails. The medium of film plays a special role in presenting the buildings. The buildings are assigned to different categories of load-bearing structures (linear, planar, and spatial), and further to different materials, functional programs, and spatial characteristics. Since the buildings are located in the direct vicinity of Aachen, students can visit them, which sensitizes them to their environment. Intrinsic motivation as well as implicit learning is encouraged. The paper provides a detailed report of the project, its implementation, the students' feedback, and the plans for further development.
Background: Architectural representation, nurtured by the interaction between design thinking and design action, is inherently multi-layered. However, the object of representation cannot always reflect these layers. It is therefore claimed that these reflections and layerings can gain visibility through 'performativity in personal knowledge', which is essentially performative in character. The specific layers of representation produced during performativity in personal knowledge permit insights into the 'personal way of designing' [1]. The question of how such layered drawings can be decomposed to understand the personal way of designing therefore forms the starting point of the study. Performativity in personal knowledge in architectural design is examined through the relationship between explicit and tacit knowledge and between representational and non-representational theory. To discuss the practical dimension of these theoretical relations, Zvi Hecker's drawing of the Heinz-Galinski-School is examined as an example. The study aims to understand the relationships between the layers by decomposing a layered drawing analytically in order to exemplify personal ways of designing.
Methods: The study is based on qualitative research methodologies. First, a model was formed through theoretical readings to discuss performativity in personal knowledge. This model is used to understand layered representations and to research the personal way of designing; for this purpose, one drawing of Hecker's Heinz-Galinski-School project was chosen. Second, its layers are decomposed to detect and analyze diverse objects, which hint at different types of design tools and their application. Third, Zvi Hecker's statements about the design process are examined through interview data [2] and other sources. The data obtained are compared with each other.
Results: By decomposing the drawing, eleven layers are defined. These layers are used to understand the relation between the design idea and its representation; they can also be thought of as a reading system. In other words, a method to discuss Hecker's performativity in personal knowledge is developed. Furthermore, the layers and their interconnections are described in relation to Zvi Hecker's personal way of designing.
Conclusions: It can be concluded that layered representations, which are associated with the multilayered structure of performativity in personal knowledge, form the personal way of designing.
Against the background of growing volumes of data in everyday life, data-processing tools are becoming more powerful in order to deal with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan, and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the increased data absorbs time through collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer the process. This paper presents historic and contemporary models for visualizing the architectural planning process and introduces means to describe today's situation in terms of stakeholders, events, and instruments. It explains conceptions from the Renaissance in contrast to models used in the second half of the 20th century. Contemporary models are discussed with regard to their value against the background of increasing computation in the building process.
Heimat entwerfen?
(2019)
Can sketch sheets that show mixed systems of text and image be understood as a spatial and temporal condensation of milieus of reflection? What effect does the simultaneous presence of text and image, imposed by the spatial limits of the sheet, have, and what interactions unfold? These questions lead to the definition of 'multidimensional worksheets', which are understood as a suitable medium for the analysis of design thought processes. Five examples are used to describe how decompositional procedures can make drawing genealogies visible that provide detailed information about design actions.
In the research domain of energy informatics, the importance of open data is rising rapidly, as can be seen from the various new public datasets being created and published. Unfortunately, in many cases the data is not available under a permissive license corresponding to the FAIR principles, often lacking accessibility or reusability. Furthermore, the source format often differs from the desired data format or does not meet the demands of efficient querying. To solve this on a small scale, a toolbox for ETL processes is provided to create a local energy data server with open-access data from different valuable sources in a structured format. So while the sources themselves do not fully comply with the FAIR principles, the provided toolbox allows the data to be processed efficiently as if the FAIR principles were met. The energy data server currently includes information on power systems, weather data, network frequency data, European energy and gas demand and generation data, and more. However, a solution to the core problem, the missing alignment with the FAIR principles, is still needed for the National Research Data Infrastructure.
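As an illustration of the kind of ETL step such a toolbox automates, here is a minimal sketch; the source URL, column names, and SQLite target are assumptions for illustration, not the toolbox's actual sources or schema.

```python
# Minimal ETL sketch: extract an open-data CSV, normalize it, load it locally.
import sqlite3
import pandas as pd

SOURCE_URL = "https://example.org/open-data/load_timeseries.csv"  # hypothetical source

# Extract: read the raw CSV from the open-data source
raw = pd.read_csv(SOURCE_URL)

# Transform: normalize column names and parse timestamps (assumed column names)
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["timestamp"] = pd.to_datetime(raw["timestamp"], utc=True)

# Load: append into a local SQLite table standing in for the "energy data server"
with sqlite3.connect("energy_data.sqlite") as conn:
    raw.to_sql("load_timeseries", conn, if_exists="append", index=False)
```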
Due to the transition to renewable energies, electricity markets need to be made fit for purpose. To enable the comparison of different energy market designs, modeling tools covering market actors and their heterogeneous behavior are needed. Agent-based models are ideally suited for this task: they can be used to simulate and analyze changes to market design or market mechanisms and their impact on market dynamics. In this paper, we conduct an evaluation and comparison of two actively developed open-source energy market simulation models, AMIRIS and ASSUME, both designed to simulate future energy markets using an agent-based approach. The assessment encompasses modeling features and techniques, model performance, and a comparison of model results, and can serve as a blueprint for future comparative studies of simulation models. The main comparison dataset covers Germany in 2019 and simulates the day-ahead market and its participating actors as individual agents. Both models come comparably close to the benchmark dataset, with a MAE between 5.6 and 6.4 €/MWh, while also modeling the actual dispatch realistically.
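For reference, the reported error metric can be computed as follows; the price series below are invented placeholders, not the paper's benchmark data.

```python
# Minimal sketch: mean absolute error between simulated and observed day-ahead prices.
import numpy as np

observed = np.array([38.1, 41.5, 35.2, 29.8, 44.0])   # EUR/MWh, hypothetical
simulated = np.array([36.0, 43.2, 33.9, 31.1, 45.5])  # model output, hypothetical

mae = np.mean(np.abs(simulated - observed))
print(f"MAE = {mae:.2f} EUR/MWh")
```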
The FAYMONVILLE case study describes how the family-owned company Faymonville from eastern Belgium has succeeded in becoming one of the leading manufacturers in its sector. The targeted identification of new markets, the focus on relevant customer needs, and a consistent product policy with a coordinated manufacturing concept lay the foundations for this success. In this case study, students can learn how a company can successfully resolve the fundamental trade-off between cost-efficient and customized production.
We conducted a scoping review of active learning in the domain of natural language processing (NLP), which we summarize in accordance with the PRISMA-ScR guidelines as follows:
Objective: Identify active learning strategies that were proposed for entity recognition and their evaluation environments (datasets, metrics, hardware, execution time).
Design: We used Scopus and ACM as our search engines. We compared the results with two literature surveys to assess the search quality. We included peer-reviewed English publications introducing or comparing active learning strategies for entity recognition.
Results: We analyzed 62 relevant papers and identified 106 active learning strategies. We grouped them into three categories: exploitation-based (60x), exploration-based (14x), and hybrid strategies (32x). We found that all studies used the F1-score as an evaluation metric. Information about hardware (6x) and execution time (13x) was only occasionally included. The 62 papers used 57 different datasets to evaluate their respective strategies. Most datasets contained newspaper articles or biomedical/medical data. Our analysis revealed that 26 out of 57 datasets are publicly accessible.
Conclusion: Numerous active learning strategies have been identified, along with significant open questions that still need to be addressed. Researchers and practitioners face difficulties when making data-driven decisions about which active learning strategy to adopt. Conducting comprehensive empirical comparisons using the evaluation environment proposed in this study could help establish best practices in the domain.
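To make the taxonomy concrete, here is a minimal sketch of one exploitation-based strategy, least-confidence uncertainty sampling, on a generic scikit-learn classifier; the dataset and model are placeholders rather than an entity-recognition setup from any of the reviewed papers.

```python
# Minimal sketch: least-confidence uncertainty sampling (an exploitation-based strategy).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
labeled = list(range(10))                    # small seed set of labeled indices
unlabeled = [i for i in range(500) if i not in labeled]

for _ in range(5):                           # five active learning rounds
    model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[unlabeled])
    # least confidence: query the sample whose top-class probability is lowest
    query = unlabeled[int(np.argmin(proba.max(axis=1)))]
    labeled.append(query)                    # oracle provides the label y[query]
    unlabeled.remove(query)

print(f"labeled pool size after querying: {len(labeled)}")
```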
In recent years, more and more digital startups have been founded, and many of them work remotely using enterprise collaboration systems (ECS). The study investigates the functional affordances of ECS, particularly Slack, and examines their potential as a virtual office environment for cultural development in digital startups. Through a case study, and based on affordance-theoretical considerations, the paper explores how ECS facilitate remote collaboration, communication, and socialization within digital startups. The findings comprise material properties of ECS (synchronous and asynchronous communication), functional affordances (virtual office and culture development affordances), and their realization (through communication practices, openness, and inter-company accessibility), and are conceptualized as a model of ECS affordances in digital startups.
Architecture is a university subject with educational roots in both the technical university and art/specialized architecture schools, yet it lacks a strong research orientation and is focused on professional expertise. This chapter explores the particular role of research within architectural education in general by discussing two different cases for the implementation of undergraduate research in architecture: during the late 1990s and early 2000s at the University of Sheffield, UK, and during the 2010s at RWTH Aachen University, Germany. These examples illustrate the asynchronous beginnings of similar developments, and also contextualize differences in disciplinary habitus and pedagogical approaches between Sheffield, where research impulses stemmed from within the Architectural Humanities, and Aachen with its strong tradition as a technical university.