Mice that have been genetically humanized for proteins involved in drug metabolism and toxicity, and mice engrafted with human hepatocytes, are emerging and promising in vivo models for an improved prediction of the pharmacokinetic, drug–drug interaction and safety characteristics of compounds in humans. The specific advantages and disadvantages of these models should be carefully considered when using them for studies in drug discovery and development. Here, an overview of the genetically humanized and chimeric liver-humanized mouse models described to date is provided and illustrated with examples of their utility in drug metabolism and toxicity studies. We compare the strengths and weaknesses of the two approaches, give guidance for selecting the appropriate model for various applications and discuss future trends and perspectives.
Cyberspace is "the environment formed by physical and non-physical components to store, modify, and exchange data using computer networks" (NATO CCDCOE). Beyond that, it is an environment where people interact. IT attacks are hostile, non-cooperative interactions that can be described with conflict theory. Applying conflict theory to IT security leads to different objectives for end-user education, requiring different formats such as agency-based, competence-developing games.
A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
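The integral-type statistic referenced in the abstract generalizes the classic one-sample Cramér-von-Mises statistic to Hilbert-space-valued data. As an illustrative sketch only (the scalar building block, not the paper's Hilbert-space construction), the standard computing formula for a sample of size n against a fully specified null CDF F0 is ω² = 1/(12n) + Σᵢ ((2i−1)/(2n) − F0(x₍ᵢ₎))², where x₍ᵢ₎ are the order statistics:

```python
# Illustrative sketch: the classic scalar one-sample Cramér-von-Mises
# statistic against a fully specified null CDF F0. The paper extends this
# integral-type statistic to separable Hilbert spaces; nothing here is the
# authors' implementation.
import math

def std_normal_cdf(x):
    # CDF of N(0, 1) via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cramer_von_mises(sample, cdf=std_normal_cdf):
    """omega^2 = 1/(12n) + sum_i ((2i-1)/(2n) - F0(x_(i)))^2."""
    xs = sorted(sample)
    n = len(xs)
    return 1.0 / (12.0 * n) + sum(
        ((2 * i - 1) / (2.0 * n) - cdf(x)) ** 2
        for i, x in enumerate(xs, start=1)
    )

def std_normal_ppf(p, lo=-10.0, hi=10.0):
    # Inverse normal CDF by bisection (for the demo data only)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if std_normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Demo: data placed exactly at the null quantiles (2i-1)/(2n) make every
# summand vanish, so the statistic reduces to 1/(12n) = 1/120 for n = 10.
probs = [(2 * i - 1) / 20.0 for i in range(1, 11)]
data = [std_normal_ppf(p) for p in probs]
print(round(cramer_von_mises(data), 6))  # 0.008333
```

Under the null hypothesis, large values of ω² indicate a mismatch between the empirical and hypothesized distributions; the paper's contribution is proving the limiting distribution and consistency of the analogous statistic in infinite-dimensional spaces.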
An improved and convenient ninhydrin assay for aminoacylase activity measurements was developed using the commercial EZ Nin™ reagent. Alternative reagents from the literature were also evaluated and compared. The addition of DMSO to the reagent enhanced the solubility of Ruhemann's purple (RP). Furthermore, we found that the use of a basic, aqueous buffer enhances the stability of RP. An acidic protocol for the quantification of lysine was developed by addition of glacial acetic acid. The assay allows for parallel processing in a 96-well format, with measurements in microtiter plates.
A Cooperative Work Environment for Evolutionary Software Development / Kurbel, K., Pietsch, W.
(1990)
There is a growing demand for more flexibility in manufacturing to counter the volatility and unpredictability of the markets and provide more individualization for customers. However, the design and implementation of flexibility within manufacturing systems are costly and only economically viable if applicable to actual demand fluctuations. To this end, companies are considering additive manufacturing (AM) to make production more flexible. This paper develops a conceptual model for quantifying the impact of AM on volume and mix flexibility within production systems in the early stages of the factory-planning process. Together with the model, an application guideline is presented to help planners with the flexibility quantification and the factory design process. Following the development of the model and guideline, a case study is presented to indicate the potential impact additive technologies can have on manufacturing flexibility. Within the case study, various scenarios with different production system configurations and production programs are analyzed, and the impact of the additive technologies on volume and mix flexibility is calculated. This work will allow factory planners to determine the potential impacts of AM on manufacturing flexibility in an early planning stage and design their production systems accordingly.
Companies often build their businesses based on product information and therefore try to automate the process of information extraction (IE). Since the information source is usually heterogeneous and non-standardized, classic extract, transform, load techniques reach their limits. Hence, companies must implement the newest findings from research to tackle the challenges of process automation. They require a flexible and robust system that is extendable and ensures the optimal processing of the different document types. This paper provides a distributed microservice architecture pattern that enables the automated generation of IE pipelines. Since their optimal design is individual for each input document, the system ensures the ad-hoc generation of pipelines depending on specific document characteristics at runtime. Furthermore, it introduces the automated quality determination of each available pipeline and controls the integration of new microservices based on their impact on the business value. The introduced system enables fast prototyping of the newest approaches from research and supports companies in automating their IE processes. Based on the automated quality determination, it ensures that the generated pipelines always meet defined business requirements when they come into productive use.
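The core idea of the abstract, assembling an extraction pipeline at runtime from registered microservices based on document characteristics, with a quality gate before productive use, can be sketched minimally. All names here (the registry keys, service labels, and `QUALITY_THRESHOLD`) are illustrative assumptions, not the authors' architecture:

```python
# Hypothetical sketch of ad-hoc IE pipeline generation: extraction services
# are registered per document characteristic, pipelines are assembled at
# runtime, and a quality score gates whether a pipeline may go productive.
# Service names and the threshold are assumptions for illustration only.

QUALITY_THRESHOLD = 0.8  # assumed minimum business requirement

REGISTRY = {
    "scanned": ["ocr", "layout_analysis", "entity_extraction"],
    "spreadsheet": ["table_parsing", "entity_extraction"],
    "plain_text": ["entity_extraction"],
}

def build_pipeline(doc_type):
    """Assemble a pipeline ad hoc from the document's characteristics."""
    steps = REGISTRY.get(doc_type)
    if steps is None:
        raise ValueError(f"no services registered for {doc_type!r}")
    return list(steps)

def admit(pipeline_quality):
    """Quality gate: only pipelines meeting requirements go productive."""
    return pipeline_quality >= QUALITY_THRESHOLD

print(build_pipeline("scanned"))  # ['ocr', 'layout_analysis', 'entity_extraction']
print(admit(0.9), admit(0.5))     # True False
```

A registry-driven design like this is one way to achieve the extendability the paper describes: adding a new microservice means registering it for the document characteristics it handles, and the quality gate decides whether pipelines using it meet the defined business requirements.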
This Research Briefing, issued in July 2010, concluded that:
- Small and medium-sized enterprises (SMEs) in Europe have long called for a suitable legal form valid across the EU (similar to the European company (SE) for large firms)
- The main benefits would be the availability of uniform Europe-wide company structures, significant cost reductions for businesses and further integration of the internal market
- Given the differing national views regarding the concrete features of the new legal form there is currently no sign of an agreement being reached at the European level in the short term; however, it is possible that progress will be made in negotiations during the year
- The key issues being discussed in depth are company formation, transnationality and employee participation rights in the new European private company (SPE).