The detection of pollutants is an important task in environmental analysis. Wastewater and process-water analysis as well as process control are of particularly high importance. Silicon-based thin-film sensors offer a cost-effective way to perform "online" or on-site measurements promptly. In this work, a potentiometric sensor array based on chalcogenide glasses for the detection of heavy metals in aqueous media is presented.
Wing weight estimation methodology for highly non-planar lifting systems during conceptual design
(2013)
The recently proposed NASA and ESA missions to Saturn and Jupiter pose difficult tasks to mission designers because chemical propulsion scenarios are not capable of transferring heavy spacecraft into the outer solar system without the use of gravity assists. Thus our developed mission scenario based on the joint NASA/ESA Titan Saturn System Mission baselines solar electric propulsion to improve mission flexibility and transfer time. For the calculation of near-globally optimal low-thrust trajectories, we have used a method called Evolutionary Neurocontrol, which is implemented in the low-thrust trajectory optimization software InTrance. The studied solar electric propulsion scenario covers trajectory optimization of the interplanetary transfer including variations of the spacecraft's thrust level, the thrust unit's specific impulse and the solar power generator power level. Additionally developed software extensions enabled trajectory optimization with launcher-provided hyperbolic excess energy, a complex solar power generator model and a variable specific impulse ion engine model. For the investigated mission scenario, Evolutionary Neurocontrol yields good optimization results, which also hold valid for the more elaborate spacecraft models. Compared to Cassini/Huygens, the best found solutions have faster transfer times and a higher mission flexibility in general.
Comparison of solar hot water systems in solar settlements - decentralized or centralized systems?
(2004)
Lately there has been increasing concern about uranium toxicity in some districts of Punjab State, located in the north-western part of India, after the publication of a report (Blaurock-Busch et al. 2010) which showed that the concentration of uranium in the hair and urine of children suffering from physical deformities and neurological and mental disorders from the Malwa region (Fig. 1) of Punjab State was manifold higher than the reference ranges. A train which connects the affected region with the nearby city of Bikaner, which has a cancer hospital, has been nicknamed the Cancer Express due to the frenzy generated on account of uranium-related toxicity.
An optical study carried out on the insulating polymers polyethylene terephthalate (PET) and polyvinyl chloride (PVC) is described. The polymers were exposed to different radiation doses using swift heavy ions of carbon (90 MeV), silicon (120 MeV) and nickel (100 MeV), which influence their optical properties. The studies show that, amongst the investigated polymers, PVC and PET have potential for application as dosimeters beyond a threshold dose which is strongly dependent on the nature of the material and the radiation type. The optical micrographs show a distinct change in colour of the sample with increasing radiation dose.
Study of swift heavy ion modified conduction polymer composites for application as gas sensor
(2006)
A polyaniline-based conducting composite was prepared by oxidative polymerisation of aniline in a polyvinyl chloride (PVC) matrix. Coherent free-standing thin films of the composite were prepared by a solution casting method. The polyvinyl chloride-polyaniline composites exposed to 120 MeV silicon ions, with a total ion fluence ranging from 10¹¹ to 10¹³ ions/cm², were observed to be more sensitive towards ammonia gas than the unirradiated composite. The response time of the irradiated composites was observed to be comparably shorter. We report for the first time the application of swift heavy ion modified insulating polymer conducting polymer (IPCP) composites for sensing of ammonia gas.
Limit loads can be calculated with the finite element method (FEM) for any component, defect geometry, and loading. FEM suggests that published long-crack limit formulae for axial defects underestimate the burst pressure for internal surface defects in thick pipes, while limit loads are not conservative for deep cracks and for pressure-loaded crack faces. Very deep cracks have a residual strength, which is modelled by a global collapse load. These observations are combined to derive new analytical local and global collapse loads. The global collapse loads are close to FEM limit analyses for all crack dimensions.
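For orientation, the classical closed-form limit pressure of a defect-free thick-walled cylinder is the textbook reference against which FEM limit analyses of pipes are usually checked. The sketch below evaluates the well-known Tresca and (closed-end) von Mises limit pressures; it is an illustration only and does not reproduce the crack-dependent collapse formulae derived in the paper.

```python
import math

def limit_pressure_tresca(r_i, r_o, sigma_y):
    """Lower-bound limit pressure of a defect-free thick-walled cylinder
    under internal pressure (Tresca yield, perfectly plastic material):
    p_L = sigma_y * ln(r_o / r_i)."""
    return sigma_y * math.log(r_o / r_i)

def limit_pressure_mises(r_i, r_o, sigma_y):
    """von Mises variant for a closed-end cylinder: 2/sqrt(3) times the
    Tresca value."""
    return 2.0 / math.sqrt(3.0) * limit_pressure_tresca(r_i, r_o, sigma_y)

# Illustrative pipe: r_i = 100 mm, r_o = 150 mm, yield stress 300 MPa
p_t = limit_pressure_tresca(100.0, 150.0, 300.0)  # ~121.6 MPa
p_m = limit_pressure_mises(100.0, 150.0, 300.0)   # ~140.5 MPa
```

A surface crack reduces these defect-free values; quantifying that reduction is exactly what the local and global collapse loads above are for.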
Improved collapse loads of thick-walled, crack-containing pipes and vessels are suggested. Very deep cracks have a residual strength which is better modelled by a global limit load. In all burst tests, the ductility of the pressure vessel steels was sufficiently high that the burst pressure could be predicted by limit analysis, with no need to apply fracture mechanics. The relative prognosis error increases, however, for long and deep defects due to uncertainties in geometry and strength data.
This paper presents the direct route to Design by Analysis (DBA) of the new European pressure vessel standard in the language of limit and shakedown analysis (LISA). This approach leads to an optimization problem. Its solution with finite element analysis is demonstrated for some examples from the DBA manual. One observation from the examples is that the optimisation approach gives reliable and close lower-bound solutions, leading to simple and optimised design decisions.
Structural design analyses are conducted with the aim of verifying the exclusion of ratchetting. To this end it is important to make a clear distinction between the shakedown range and the ratchetting range. The performed experiment comprised a hollow tension specimen which was subjected to alternating axial forces, superimposed with constant moments. First, a series of uniaxial tests has been carried out in order to calibrate a bounded kinematic hardening rule. The load parameters have been selected on the basis of previous shakedown analyses with the PERMAS code using a kinematic hardening material model. It is shown that this shakedown analysis gives reasonable agreement between the experimental and the numerical results. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis.
In the new European standard for unfired pressure vessels, EN 13445-3, there are two approaches for carrying out a Design-by-Analysis that cover both the stress categorization method (Annex C) and the direct route method (Annex B) for a check against global plastic deformation and against progressive plastic deformation. This paper presents the direct route in the language of limit and shakedown analysis. This approach leads to an optimization problem. Its solution with Finite Element Analysis is demonstrated for mechanical and thermal actions. One observation from the examples is that the so-called 3f (3Sm) criterion fails to be a reliable check against progressive plastic deformation. Precise conditions are given, which greatly restrict the applicability of the 3f criterion.
Reliability of the Primary Circuit Pressure Boundary of an HTR-Module under Accident Conditions
(1993)
In: Technical feasibility and reliability of passive safety systems for nuclear power plants. Proceedings of an Advisory Group Meeting held in Jülich, 21-24 November 1994. Vienna, 1996, pp. 43-55 (IAEA-TECDOC-920).
It is shown that the difficulty for probabilistic fracture mechanics (PFM) is the general problem of the high reliability of a small population. There is as yet no way around this problem. Therefore, what PFM can contribute to the reliability of steel pressure boundaries is demonstrated with the example of a typical reactor pressure vessel and critically discussed. Although no method is distinguishable that could give exact failure probabilities, PFM offers several additional opportunities. Upper limits for the failure probability may be obtained together with trends for design and operating conditions. Further, PFM can identify the most sensitive parameters, improved control of which would increase reliability. Thus PFM should play a vital role in the analysis of steel pressure boundaries despite all shortcomings.
Fatigue analyses are conducted with the aim of verifying that thermal ratcheting is limited. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range (continuing deformation). As part of an EU-supported research project, experiments were carried out using a 4-bar model. The experiment comprised a water-cooled internal tube and three insulated, heatable outer test bars. The system was subjected to alternating axial forces, superimposed with alternating temperatures at the outer bars. The test parameters were partly selected on the basis of previous shakedown analyses. During the test, temperatures and strains were measured as functions of time. The loads and the resulting stresses were confirmed on an ongoing basis during and after performance of the test. Different material models were applied for the incremental elasto-plastic analysis using the ANSYS program. The results of the simulation are used to verify the FEM-based shakedown analysis.
The structural reliability with respect to plastic collapse or to inadaptation is formulated on the basis of the lower-bound limit and shakedown theorems. A direct definition of the limit state function is achieved which permits the use of the highly effective first-order reliability methods (FORM). The theorems are implemented into a general-purpose FEM program in a way capable of large-scale analysis. The limit state function and its gradient are obtained from a mathematical optimization problem. This direct approach considerably reduces the necessary knowledge of uncertain technological input data, the computing time, and the numerical error, leading to highly effective and precise reliability analyses.
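The workhorse behind FORM is the Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration, which finds the design point of the limit state surface in standard normal space. The following minimal sketch uses a generic, hand-written R-S limit state (not the limit state function obtained from the limit/shakedown optimization problem described above) and a simple numerical gradient:

```python
import math

def hlrf(g, n, tol=1e-8, it_max=100):
    """HL-RF iteration for the design point of g(u) = 0 in standard normal
    space; returns the reliability index beta and the design point u*."""
    u = [0.0] * n
    for _ in range(it_max):
        h = 1e-6
        g0 = g(u)
        grad = []
        for i in range(n):                 # forward-difference gradient
            up = u[:]; up[i] += h
            grad.append((g(up) - g0) / h)
        norm2 = sum(gi * gi for gi in grad)
        # HL-RF update: project onto the linearized limit state surface
        c = (sum(gi * ui for gi, ui in zip(grad, u)) - g0) / norm2
        u_new = [c * gi for gi in grad]
        if max(abs(a - b) for a, b in zip(u, u_new)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))
    return beta, u

# Illustrative resistance-load example: R ~ N(200, 20), S ~ N(100, 30),
# so beta = (200 - 100) / sqrt(20^2 + 30^2) is known analytically.
g = lambda u: (200.0 + 20.0 * u[0]) - (100.0 + 30.0 * u[1])
beta, u_star = hlrf(g, 2)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Pf = Phi(-beta)
```

In the direct approach of the paper, g and its gradient would instead come from the FEM limit/shakedown optimization rather than a closed-form expression.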
Limit and shakedown analysis are effective methods for assessing the load-carrying capacity of a given structure. The elasto-plastic behavior of the structure subjected to loads varying in a given load domain is characterized by the shakedown load factor, defined as the maximum factor which satisfies the sufficient conditions stated in the corresponding static shakedown theorem. The finite element discretization of the problem may lead to very large convex optimization problems. For their effective solution a basis reduction method has been developed that makes use of the special problem structure for perfectly plastic material. The paper proposes a modified basis reduction method for direct application to the two-surface plasticity model of bounded kinematic hardening material. The considered numerical examples show an enlargement of the load-carrying capacity due to bounded hardening.
The load-carrying capacity or the safety against plastic limit states are the central questions in the design of structures and passive components in the apparatus engineering. A precise answer is most simply given by limit and shakedown analysis. These methods can be based on static and kinematic theorems for lower and upper bound analysis. Both may be formulated as optimization problems for finite element discretizations of structures. The problems of large-scale analysis and the extension towards realistic material modelling will be solved in a European research project. Limit and shakedown analyses are briefly demonstrated with illustrative examples.
Sensitivity of and Influences on the Reliability of an HTR-Module Primary Circuit Pressure Boundary
(1993)
Extension fractures are typical for the deformation under low or no confining pressure. They can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock has been developed. In this article, it is shown that the simple extension strain criterion makes unrealistic strength predictions in biaxial compression and tension. To overcome this major limitation, a new extension strain criterion is proposed by adding a weighted principal shear component to the simple criterion. The shear weight is chosen, such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting extension failure modes, which are unexpected in the classical understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain leading to dilatancy. The enriched extension strain criterion is proposed as a threshold surface for crack initiation CI and crack damage CD and as a failure surface at peak stress CP. Different from compressive loading, tensile loading requires only a limited number of critical cracks to cause failure. Therefore, for tensile stresses, the failure criteria must be modified somehow, possibly by a cut-off corresponding to the CI stress. Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion.
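The simple extension strain criterion referred to above can be sketched directly from Hooke's law (compression positive, as usual in rock mechanics): fracture initiates once the smallest principal strain becomes a sufficiently large extension. The numbers and the critical strain below are illustrative assumptions, and the enriched shear-weighted term of the article is not reproduced here.

```python
def extension_strain(sig1, sig2, sig3, E, nu):
    """Smallest principal strain from Hooke's law (compression positive):
    eps3 = (sig3 - nu * (sig1 + sig2)) / E.  Negative values are extension."""
    return (sig3 - nu * (sig1 + sig2)) / E

def simple_criterion_violated(sig1, sig2, sig3, E, nu, eps_c):
    """Simple extension strain criterion: fracture initiation once the
    extension exceeds the critical value eps_c (> 0)."""
    return extension_strain(sig1, sig2, sig3, E, nu) <= -eps_c

# Uniaxial compression of a brittle rock (illustrative values):
# sig1 = 60 MPa, sig2 = sig3 = 0, E = 60 GPa, nu = 0.25, eps_c = 1.5e-4
viol = simple_criterion_violated(60.0, 0.0, 0.0, 60e3, 0.25, 1.5e-4)
# eps3 = (0 - 0.25 * 60) / 60000 = -2.5e-4, so the criterion is met:
# lateral extension can initiate fracture even without any tensile stress.
```

This is exactly the "paradox" behaviour the criterion captures: extension fracture on planes that carry no load.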
Shock waves, explosions, impacts or cavitation bubble collapses may generate stress waves in solids, causing cracks or unexpected damage due to focusing, physical nonlinearity or interaction with existing cracks. There is a growing interest in wave propagation, which poses many novel problems to experimentalists and theorists.
The nonlinear scalar constitutive equations of gases lead to a change in sound speed from point to point, as would be found in linear inhomogeneous (and time-dependent) media. The nonlinear tensor constitutive equations of solids introduce the additional local effect of solution-dependent anisotropy. The speed of a wave passing through a point changes with propagation direction, and its rays are inclined to the front. It is an open question whether the widely used operator splitting techniques achieve a dimensional splitting with physically reasonable results for these multi-dimensional problems. Maybe this is the main reason why the theoretical and numerical investigations of multi-dimensional wave propagation in nonlinear solids lag so far behind gas dynamics. We hope to promote the subject a little by discussing some fundamental aspects of the solution of the equations of nonlinear elastodynamics. We use methods of characteristics because they integrate only mathematically exact equations which have a direct physical interpretation.
Soft Materials in Technology and Biology – Characteristics, Properties, and Parameter Identification
(2008)
Smoothed Finite Element Methods for Nonlinear Solid Mechanics Problems: 2D and 3D Case Studies
(2016)
The Smoothed Finite Element Method (SFEM) is presented as an edge-based and a face-based technique for 2D and 3D boundary value problems, respectively. SFEMs avoid shortcomings of the standard Finite Element Method (FEM) with lower-order elements, such as overly stiff behavior, poor stress solutions, and locking effects. Based on the idea of spatially averaging the standard FEM strain field over so-called smoothing domains, the SFEM calculates the stiffness matrix for the same number of degrees of freedom (DOFs) as the FEM. However, the SFEMs significantly improve accuracy and convergence even for distorted meshes and/or nearly incompressible materials.
Numerical results of the SFEMs for a cardiac tissue membrane (thin plate inflation) and an artery (tension of 3D tube) show clearly their advantageous properties in improving accuracy particularly for the distorted meshes and avoiding shear locking effects.
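The strain-smoothing idea can be illustrated on a single interior edge of a 2D triangle mesh: the smoothing domain collects one third of the area of each adjacent triangle, and the smoothed strain is the area-weighted average of the (constant) element strains. This is a schematic of the edge-based averaging step only, not the full ES-FEM stiffness assembly.

```python
def smoothed_strain(strain_elems, area_elems):
    """Edge-based strain smoothing for linear triangles: the smoothing
    domain of an interior edge takes A_k/3 from each adjacent triangle, so
        eps_smooth = sum(A_k/3 * eps_k) / sum(A_k/3).
    strain_elems: strain vectors [exx, eyy, gxy] of the adjacent elements.
    area_elems:   areas of those elements."""
    w = [a / 3.0 for a in area_elems]       # smoothing-domain contributions
    W = sum(w)
    n = len(strain_elems[0])
    return [sum(wk * eps[i] for wk, eps in zip(w, strain_elems)) / W
            for i in range(n)]

# Two triangles sharing an edge, areas 2.0 and 1.0, constant axial strains:
eps = smoothed_strain([[1.0e-3, 0.0, 0.0], [4.0e-3, 0.0, 0.0]], [2.0, 1.0])
# exx is pulled towards the larger element: (2*1e-3 + 1*4e-3)/3 = 2e-3
```

Because the smoothed strain varies less abruptly than the raw element strains, the resulting stiffness is softer, which is the mechanism behind the reduced locking reported above.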
Limit Analysis of Defects
(2000)
Limit and shakedown theorems are exact theories of classical plasticity for the direct computation of safety factors or of the load-carrying capacity under constant and varying loads. Simple versions of limit and shakedown analysis are the basis of all design codes for pressure vessels and piping. Using finite element methods, more realistic modeling can be used for a more rational design. The methods can be extended to yield optimum plastic design. In this paper we present a first FE implementation of limit and shakedown analyses for perfectly plastic material. Limit and shakedown analyses are carried out for a pipe junction and an interaction diagram is calculated. The results are in good correspondence with the analytic solution we give in the appendix.
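The static (lower-bound) theorem turns limit analysis into an optimization problem: maximise the load factor over all statically admissible stress states that nowhere violate the yield condition. A minimal sketch for the classical symmetric three-bar truss (a stand-in for the pipe-junction FE model above, with hypothetical unit yield force and reference load) shows the structure of such a problem:

```python
import math
import itertools

def limit_factor_three_bar():
    """Static limit theorem for a symmetric three-bar truss (middle bar
    vertical, outer bars at 45 degrees, vertical load lam * P_ref):
    maximise lam subject to the single equilibrium equation
        N1 + sqrt(2) * N2 = lam * P_ref
    and the yield conditions |N1| <= N_y, |N2| <= N_y.
    This tiny LP attains its optimum at a vertex of the yield box, so we
    simply enumerate the corners (a stand-in for a real LP/FEM solver)."""
    N_y, P_ref = 1.0, 1.0
    best = 0.0
    for N1, N2 in itertools.product((-N_y, N_y), repeat=2):
        lam = (N1 + math.sqrt(2.0) * N2) / P_ref
        best = max(best, lam)
    return best

lam_limit = limit_factor_three_bar()
# lam_limit = 1 + sqrt(2): both bar groups are fully plastic at collapse,
# above the elastic limit at which only the stiffer middle bar yields.
```

Real applications replace the corner enumeration with large-scale convex optimization over the FE-discretized stress field, but the lower-bound structure is the same.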
Safety and reliability of structures may be assessed indirectly by stress distributions. Limit and shakedown theorems are simplified but exact methods of plasticity that provide safety factors directly in the loading space. These theorems may be used for a direct definition of the limit state function for failure by plastic collapse or by inadaptation. In a FEM formulation the limit state function is obtained from a nonlinear optimization problem. This direct approach reduces considerably the necessary knowledge of uncertain technological input data, the computing time, and the numerical error. Moreover, the direct way leads to highly effective and precise reliability analyses. The theorems are implemented into a general purpose FEM program in a way capable of large-scale analysis.
Structural design analyses are conducted with the aim of verifying the exclusion of ratcheting. To this end it is important to make a clear distinction between the shakedown range and the ratcheting range. In cyclic plasticity more sophisticated hardening models have been suggested in order to model the strain evolution observed in ratcheting experiments. The hardening models used in shakedown analysis are comparatively simple. It is shown that shakedown analysis can make quite stable predictions of admissible load ranges despite the simplicity of the underlying hardening models. A linear and a nonlinear kinematic hardening model of two-surface plasticity are compared in material shakedown analysis. Both give identical or similar shakedown ranges. Structural shakedown analyses show that the loading may have a more pronounced effect than the hardening model.
Numerical methods for limit and shakedown analysis. Deterministic and probabilistic problems.
(2003)
Biomechanics studies biological soft tissue (growth, remodeling) in vivo. For this objective, detailed information on the material properties must be available to construct reliable constitutive models. In this paper, the bulge test is carried out with elastomers in order to develop a test method. Application of the test to soft tissue is then straightforward because of the similarities between elastomers and soft tissues, as shown in Holzapfel 2005 and Ogden 2009: once the preliminary experiments and parameter identification with rubber materials have been set up, experiments on soft tissue can be carried out in the same way. Elastomers have a complex behavior which strongly depends on the largest previous load cycle. For simplicity we consider only the first loading.
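Parameter identification from such tests typically reduces to a least-squares fit of a hyperelastic model to measured stress-stretch data. As a minimal sketch, assume an incompressible neo-Hookean model in uniaxial tension (the bulge-test kinematics of the paper are not reproduced, and the data below are synthetic):

```python
def neo_hookean_P(stretch, mu):
    """Nominal stress of an incompressible neo-Hookean solid in uniaxial
    tension: P = mu * (lam - lam**-2)."""
    return mu * (stretch - stretch**-2)

def fit_mu(stretches, stresses):
    """One-parameter linear least squares for the shear modulus mu:
    with f(lam) = lam - lam**-2, mu = sum(f_i * P_i) / sum(f_i^2)."""
    f = [l - l**-2 for l in stretches]
    return sum(fi * Pi for fi, Pi in zip(f, stresses)) / sum(fi * fi for fi in f)

# Synthetic "experiment" generated with mu = 0.5 MPa; the fit recovers it.
lams = [1.1, 1.3, 1.6, 2.0]
data = [neo_hookean_P(l, 0.5) for l in lams]
mu_hat = fit_mu(lams, data)
```

For real tissue data one would use an anisotropic model and a nonlinear optimizer, but the identification workflow (forward model, residual, fit) is the same.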
When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These “paradox” fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock has been developed. But this criterion makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen, such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes, which are unexpected in the understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation CI and crack damage CD and as a failure surface at peak P. Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion.
Load bearing capacity of thin shell structures made of elastoplastic material by direct methods
(2008)
7th International Conference on Reliability of Materials and Structures (RELMAS 2008), June 17-20, 2008, Saint Petersburg, Russia, pp. 354-358. Reprint with corrections in red.
Analysis of advanced structures working under extreme heavy loading, such as nuclear power plants and piping systems, should take into account the randomness of loading, geometrical, and material parameters. Existing reliability methods are mostly restricted to the elastic working regime, e.g. allowable local stresses. Development of limit and shakedown reliability-based analysis and design methods, exploiting the potential of the shakedown working regime, is highly needed. In this paper the application of a new algorithm of probabilistic limit and shakedown analysis for shell structures is presented, in which the loading and the strength of the material as well as the thickness of the shell are considered as random variables. The reliability analysis problems may be efficiently solved by using a system combining available FE codes, a deterministic limit and shakedown analysis, and the First- and Second-Order Reliability Methods (FORM/SORM). Non-linear sensitivity analyses are obtained directly from the solution of the deterministic problem without extra computational cost.
Suburethral slings as well as different meshes are widely used to treat stress urinary incontinence and prolapse in women. With the development of MiniSlings and special meshes using less alloplastic material, anchorage systems become more important to keep the devices in place and to put some tension especially on the MiniSlings. To date, there are many different MiniSling systems from different companies on the market, which differ in the structure of the meshes and anchors used. A new objective measurement method to compare different properties of MiniSling systems (mesh and anchor) is presented in this article. Ballistic gelatine acts as a soft tissue surrogate. Significant differences in parameters like the pull-out strength of anchors or the shrinkage of meshes under loading conditions have been determined. The form and size of the anchors as well as the structural stability of the meshes are decisive for proper integration. The tested anchoring systems showed markedly different mechanical function at their respective load-bearing capacity. As the stable fixation of the device in tissue is a prerequisite for a permanent reinforcement, the proposed test system permits further optimisation of anchor and mesh devices to improve the success of the surgical treatment.
Upper and lower bound theorems of limit analyses have been presented in part I of the paper. Part II starts with the finite element discretization of these theorems and demonstrates how both can be combined in a primal–dual optimization problem. This recently proposed numerical method is used to guide the development of a new class of closed-form limit loads for circumferential defects, which show that only large defects contribute to plastic collapse with a rapid loss of strength with increasing crack sizes. The formulae are compared with primal–dual FEM limit analyses and with burst tests. Even closer predictions are obtained with iterative limit load solutions for the von Mises yield function and for the Tresca yield function. Pressure loading of the faces of interior cracks in thick pipes reduces the collapse load of circumferential defects more than for axial flaws. Axial defects have been treated in part I of the paper.
Limit loads of circumferentially flawed pipes and cylindrical vessels under internal pressure
(2006)
Picosecond dynamics in haemoglobin from different species: A quasielastic neutron scattering study
(2014)
Thermodynamic stability, configurational motions and internal forces of haemoglobin (Hb) of three endotherms (platypus, Ornithorhynchus anatinus; domestic chicken, Gallus gallus domesticus; and human, Homo sapiens) and an ectotherm (salt water crocodile, Crocodylus porosus) were investigated using circular dichroism, incoherent elastic neutron scattering and coarse-grained Brownian dynamics simulations. The experimental results from Hb solutions revealed a direct correlation between protein resilience, melting temperature and average body temperature of the different species on the 0.1 ns time scale. Molecular forces appeared to be adapted to permit conformational fluctuations with a root mean square displacement close to 1.2 Å at the corresponding average body temperature of the endotherms. Strong forces within crocodile Hb maintain the amplitudes of motion within a narrow limit over the entire temperature range in which the animal lives. In fully hydrated powder samples of human and chicken Hb, mean square displacements and effective force constants on the 1 ns time scale showed no differences over the whole temperature range from 10 to 300 K, in contrast to the solution case. A complementary result of the study, therefore, is that one hydration layer is not sufficient to activate all conformational fluctuations of Hb on the pico- to nanosecond time scale which might be relevant for biological function. Coarse-grained Brownian dynamics simulations made it possible to explore residue-specific effects. They indicated that temperature sensing of human and chicken Hb occurs mainly at residues lining internal cavities in the β-subunits.
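Effective force constants of the kind mentioned above are commonly extracted from the temperature derivative of the elastic-scattering mean square displacement, via Zaccai's resilience relation <k'> = 2 k_B / (d<u²>/dT). A sketch with synthetic data, assuming <u²> in Å² and T in K (the relation and unit conversion are standard, but the numbers are illustrative, not the paper's data):

```python
def msd_slope(T, u2):
    """Least-squares slope of mean square displacement <u^2> (A^2) vs T (K)."""
    n = len(T)
    mT = sum(T) / n
    mU = sum(u2) / n
    return (sum((t - mT) * (u - mU) for t, u in zip(T, u2))
            / sum((t - mT) ** 2 for t in T))

def effective_force_constant(T, u2):
    """Zaccai-type resilience <k'> = 2 * k_B / (d<u^2>/dT), returned in N/m
    for <u^2> in A^2 (1 A^2 = 1e-20 m^2) and T in K."""
    kB = 1.380649e-23              # Boltzmann constant, J/K
    slope = msd_slope(T, u2)       # A^2 per K
    return 2.0 * kB / (slope * 1.0e-20)

# Synthetic high-temperature branch with slope 0.01 A^2/K:
T = [260.0, 270.0, 280.0, 290.0, 300.0]
u2 = [1.00, 1.10, 1.20, 1.30, 1.40]
k = effective_force_constant(T, u2)   # ~0.276 N/m
```

A steeper MSD-vs-T slope means a softer protein (smaller <k'>), which is how resilience differences between species are quantified.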
The invention relates to a method for the production of single-stranded macronucleotides by amplifying and ligating an extended monomeric single-stranded target nucleic acid sequence (targetss) into a repetitive cluster of double-stranded target nucleic acid sequences (targetds), and subsequently cloning the construct into a vector (aptagene vector). The aptagene vector is transformed into host cells for replication of the aptagene and isolated in order to obtain single-stranded target sequences (targetss). The invention also relates to single-stranded nucleic acids produced by a method of the invention.
Concept - this is a key term in architectural discourse. However, all too often it is used imprecisely or merely for marketing purposes. What is a concept actually? This publication moves between design theory and design practice and follows the history of the definition of concept in architecture, leading to the formulation of a specifically instrumental and operative definition. It bases concept in architecture on its strategic potential in design decision-making processes. In the changing profession of the designing architect, decisions are increasingly made in multidisciplinary groups. Concept can serve as a dialogic instrument in the process, making it possible to process heterogeneous information from a range of spheres of knowledge. The effective presentation of selected information becomes a relevant interface in the design process, which has a significant influence on the quality of the design.
In the introduction to their book "What is philosophy?" Gilles Deleuze and Felix Guattari deplore the inflationary and trivialised use of the term concept: "Finally, the most shameful moment came when computer science, marketing, design and advertising, all the disciplines of communication, seized hold of the word concept itself and said: 'This is our concern, we are the creative ones, we are the ideas men! We are the friends of the concept, we put it in our computers.'" This doctoral thesis shares the concern of Gilles Deleuze and Felix Guattari, but it is nonetheless a thesis in architecture and thus situated within the field of the representatives of the "ideas men". It engages in architectural design theory, and refers in particular to the investigation of methodological approaches within the design process. Therefore, the thesis will not contribute to the philosophical dimension of the term, but intends to overcome its imprecise use within the architectural discourse, in compliance with Eugène Viollet-le-Duc's admonition concerning vague definitions: "In the arts, and in architecture in particular, vague definitions have caused many errors, allowed many prejudices to germinate, and let many false ideas take root. A word is put forward, and everyone attaches a different meaning to it." The term concept in architecture is very often used as pure marketing collateral; it serves to sell an idea, a product, a design. Its functional applicability is reduced to a special manner of illustration, produced as one of the various design presentation documents at the end of the design process. In contrast, the original contribution of this thesis aims to give a precise, instrumental dimension to the term concept: the concept is the expression of a specific logic, capable of guiding the decisional sequences of the process and thus of improving the quality of the designed projects.
The motivation to define a specific instrumentality of the concept is closely connected to the issue of interdisciplinarity in the architect's profession. The interdisciplinary character of the architectural field is widely accepted and discussed as such, but the thesis intends to give a more precise definition of the various kinds of competences involved by classifying them into either the internal or the external group. The traditional notion of interdisciplinarity, predominantly seen as collaboration between architects and technical experts, and, most notably, the historical, sometimes contentious, relationship between architects and engineers is described. Referring to recent developments, the transformation of the architect's role within the professional sphere, marked by an increasing importance of diverse influences and linked to a growing risk of marginalisation, is illustrated. The thesis describes different ways to adapt to this specific kind of interdisciplinarity, which generally requires the architect's ability to connect and to integrate various contents, different points of view and diverse scales. On the other hand, the great potential implicit in the interdisciplinary field is set out: architects can inform their core competence, the design, by extracting content from different disciplinary competences, whether or not these pertain to their own professional field. They have the possibility to cross fields of external competences in a selective way and, by doing so, can build up a corpus of knowledge capable of generating and communicating guidelines and systematic methodologies for their design. In the end, the analysis of these two aspects allows the definition of a more specific professional profile of the architect as a specialist of interdisciplinarity. The thesis is concerned with the theories around the design process.
The design process is seen as open to inspection and critical evaluation, with a major focus on the decisional sequences which characterise it. The thesis concentrates on the process's descriptiveness and on the degree of self-conscious approaches applied within it. The importance of regulative, strategic mechanisms is illustrated by testimonies taken from a series of design research studies and leads to a functional definition of the figure of the concept: as the representation of a coherent set of ideas, as the generator of a project-specific system of rules, and as the communicator of decisional strategies. The concept's function is furthermore defined as a communicative interface which generates and transmits the system of rules authoritative for all the disciplinary competences involved in the design process: a communicative interface that constitutes a basis of shared convictions capable of increasing the efficiency of collaboration. Furthermore, the concept's capacity to explore and elaborate the contents of external disciplines is identified as a possible methodological approach to innovative design thinking. The approach to a specific functional definition of the concept is continued by the description of a series of instruments that simultaneously generate and communicate it. It is outlined to what degree the concept itself is already the result of an ideational process, situated within the initial phase of the design proceedings, serving as a guideline to them, yet continuously evolving and adapting as they progress. In addition, it is illustrated how all the diverse instruments of the concept are operational media through which the transfer of knowledge between different disciplines can occur. The considerations about the concept as an operational instrument of design are elaborated with regard to a number of examples of didactic applications that are particularly involved in the development and teaching of specific design methods.
These examples illustrate the interrelations between design theory and design education. They are derived from very different schools of architecture and diverse mindsets, but all of them transmit models of conceptual design thinking.
Architects and civil engineers work together regularly throughout their professional lives and are indispensable to one another. This cooperation is sometimes hampered by the differences in their disciplinary languages and approaches. Architects evaluate structures on the basis of criteria such as spatial impact and usability, while civil engineers analyze them more closely in terms of their load-bearing and deformation properties, as well as constructive aspects. This divergence of assessment criteria and approaches often persists in the way both academic disciplines view structures.
Within the framework of the Exploratory Teaching Space (ETS), a funding program at RWTH Aachen University to improve teaching and promote new teaching concepts, a project was carried out jointly by the Junior Professorship of Tool-Culture at the Faculty of Architecture and the Institute of Structural Concrete at the Faculty of Civil Engineering. The aim of the project is to present buildings in such a way that the differences in perception between architects and civil engineers are reduced and a common understanding is promoted.
The project develops a database containing a collection of striking buildings from Aachen and its surroundings. The buildings are categorized according to terms drawn from both disciplinary areas. The collection can be explored freely or traversed via learning trails. The medium of film plays a special role in presenting the buildings. The buildings are assigned to different categories of load-bearing structures, such as linear, planar and spatial structures, and further to different types of material, functional programs and spatial characteristics. Since the buildings are located in the immediate vicinity of Aachen, students can visit them in person; this sensitizes them to their built environment and encourages intrinsic motivation as well as implicit learning. The paper provides a detailed report of the project, its implementation, the students' feedback and the plans for further development.
The research group focuses on the characteristics of the land- and cityscapes of the Drielanden zone that contribute to generating common identities, as well as on those features that mark the differences and specificities of the adjacent countries and thereby enrich the perception of the zone. In this research, the instruments of cartography and land surveying serve to detect and localize the fragmented appearance of relevant historic elements. These analytic procedures help to develop strategies for infrastructures and processes that gradually initiate local forms of cross-border tourism. The architectural research shows how top-down and bottom-up interventions can be combined in order to guarantee a sustainable use and development of the area under consideration.
Against the background of the growing amount of data in everyday life, data-processing tools are becoming more powerful in order to cope with the increasing complexity of building design. The architectural planning process is offered a variety of new instruments to design, plan and communicate planning decisions. Ideally, access to information serves to secure and document the quality of the building; in the worst case, the increased data consumes time in collection and processing without any benefit for the building and its users. Process models can illustrate the impact of information on the design and planning process so that architects and planners can steer it. This paper presents historic and contemporary models for visualizing the architectural planning process and introduces means of describing today's situation in terms of stakeholders, events and instruments. It explains conceptions from the Renaissance in contrast to models used in the second half of the 20th century. Contemporary models are discussed with regard to their value against the background of increasing computation in the building process.