The sandfish (Scincus scincus) is a lizard with the remarkable ability to move through desert sand over significant distances. It is well adapted to life in loose sand through a combination of morphological and behavioural specializations. We investigated the body form of the sandfish using 3D laser scanning and explored its locomotion in loose desert sand using fast nuclear magnetic resonance (NMR) imaging. The sandfish exhibits an in-plane meandering motion with a frequency of about 3 Hz and an amplitude of about half its body length, accompanied by swimming-like (or trotting) movements of its limbs. No torsion of the body was observed, a movement that would be required for digging behaviour. Simple calculations based on the Janssen model for granular material, combined with our findings on body form and locomotor behaviour, make a local decompaction of the sand surrounding the moving sandfish very likely. The sand thus locally behaves as a viscous fluid and not as a solid material. In this fluidised sand the sandfish is able to “swim” using its limbs.
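The Janssen argument above can be made concrete with a short numerical sketch. The parameter values below (sand density, wall friction and Janssen coefficients, effective radius) are illustrative assumptions, not values from the study.

```python
import math

def janssen_pressure(depth, density=1500.0, g=9.81, radius=0.05,
                     mu=0.5, k=0.4):
    """Vertical stress (Pa) at a given depth (m) in a granular column
    after Janssen: unlike hydrostatic pressure, the stress saturates
    with depth because wall friction carries part of the weight."""
    saturation = density * g * radius / (2.0 * mu * k)
    return saturation * (1.0 - math.exp(-2.0 * mu * k * depth / radius))

# Near the surface the stress is close to hydrostatic; at depth it
# levels off at the saturation value instead of growing linearly.
print(janssen_pressure(0.02))   # shallow
print(janssen_pressure(10.0))   # deep: near saturation
```

The saturation of stress with depth is what makes a local decompaction (fluidisation) around a moving body plausible: the surrounding sand does not press in with the full weight of the overburden.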
Hypertension is a pathological increase of blood pressure, most commonly associated with increased vascular wall stiffness [1]. According to the “Deutsche Bluthochdruck Liga” (German Hypertension League), this pathology shows a growing trend in our aging society. In order to find novel pharmacological and potentially personalized treatments, we present a functional approach to study the biomechanical properties of a human aortic vascular model.
In this method review we give an overview of recent studies carried out with the CellDrum technology [2] and highlight its added value over standard procedures already established in the field of physiology.
The CellDrum technology described herein is a system for measuring functional mechanical properties of cell monolayers and thin tissue constructs in vitro. Additionally, the CellDrum makes it possible to elucidate the mechanical response of cells to pharmacological drugs, toxins and vasoactive agents. Due to its highly flexible polymer support, cells can also be mechanically stimulated by steady and cyclic biaxial stretching.
Mechano-pharmacological testing of L-type Ca²⁺ channel modulators via human vascular CellDrum model
(2020)
Background/Aims: This study aimed to establish a precise and well-defined working model for assessing pharmaceutical effects on vascular smooth muscle cell monolayers in vitro. It describes various analysis techniques to determine the most suitable one for measuring the biomechanical impact of vasoactive agents using CellDrum technology. Methods: The so-called CellDrum technology was applied to analyze the biomechanical properties of confluent human aortic smooth muscle cell (haSMC) monolayers. Cell-generated tension deviations in the range of a few N/m² are evaluated by the CellDrum technology. This study focuses on the dilative and contractive effects of L-type Ca²⁺ channel agonists and antagonists, respectively. We analyzed the effects of Bay K8644, nifedipine and verapamil. Three different measurement modes were developed and applied to determine the most appropriate analysis technique for the study purpose. These three operation modes are called "particular time mode" (PTM), "long term mode" (LTM) and "real-time mode" (RTM). Results: It was possible to quantify the biomechanical response of haSMCs to the addition of vasoactive agents using CellDrum technology. After supplementation with 100 nM Bay K8644, the tension increased by approximately 10.6% from the initial tension maximum, whereas treatment with nifedipine and verapamil caused a significant decrease in cellular tension: 10 nM nifedipine decreased the biomechanical stress by around 6.5% and 50 nM verapamil by 2.8%, compared to the initial tension maximum. Additionally, all tested measurement modes provide similar results while focusing on different analysis parameters. Conclusion: The CellDrum technology allows highly sensitive biomechanical stress measurements of cultured haSMC monolayers. The mechanical stress responses evoked by the application of vasoactive calcium channel modulators were quantified functionally (N/m²).
All tested operation modes produced equivalent findings, while each mode offers mode-specific data analysis.
A comparative performance analysis of the CFD platforms OpenFOAM and FLOW-3D is presented, focusing on a 3D swirling turbulent flow: a steady hydraulic jump at low Reynolds number. Turbulence is treated using the RNG k-ε RANS approach. A Volume of Fluid (VOF) method is used to track the air–water interface; accordingly, aeration is modeled using an Eulerian–Eulerian approach. Structured meshes of cubic elements are used to discretize the channel geometry. The accuracy of the numerical models is assessed by comparing representative hydraulic jump variables (sequent depth ratio, roller length, mean velocity profiles, velocity decay and free surface profile) with experimental data. The model results are also compared with previous studies to broaden the validation. Both codes reproduced the phenomenon under study in agreement with the experimental data, although special care must be taken when swirling flows occur. Both models can be used to reproduce the hydraulic performance of energy dissipation structures at low Reynolds numbers.
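The sequent depth ratio used above as a validation variable follows from the classical Bélanger momentum relation for a rectangular channel. The sketch below shows this textbook check, independent of either CFD code; the Froude numbers are illustrative.

```python
import math

def sequent_depth_ratio(froude_upstream):
    """Belanger equation: ratio y2/y1 of downstream to upstream depth
    across a hydraulic jump, from momentum conservation in a
    rectangular channel."""
    return 0.5 * (math.sqrt(1.0 + 8.0 * froude_upstream ** 2) - 1.0)

# A jump only forms for supercritical inflow (Fr1 > 1).
print(sequent_depth_ratio(1.0))  # critical flow: ratio 1, no jump
print(sequent_depth_ratio(3.0))  # pronounced jump
```

Comparing the simulated depth ratio against this analytical value (and against measured depths) is a standard first sanity check before examining roller length or velocity profiles.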
BIG KARL and COSY
(1995)
Most drugs are no longer produced by pharmaceutical companies in their home countries, but by contract manufacturers or at manufacturing sites in countries where production is cheaper. This not only makes drugs difficult to trace back but also leaves room for criminal organizations to counterfeit them unnoticed. For these reasons, it is becoming increasingly difficult to determine the exact origin of drugs. The goal of this work was to investigate how precisely this is possible using different spectroscopic methods, such as nuclear magnetic resonance and near- and mid-infrared spectroscopy, in combination with multivariate data analysis. As an example, 56 out of 64 different paracetamol preparations, collected from 19 countries around the world, were chosen to investigate whether it is possible to determine the pharmaceutical company, manufacturing site, or country of origin. By means of suitable pre-processing of the spectra and the different information contained in each method, principal component analysis was able to reveal manufacturing relationships between individual companies and to differentiate between production sites or formulations. Linear discriminant analysis showed different results depending on the spectral method and purpose. For all spectroscopic methods, it was found that classifying the preparations by manufacturer achieves better results than classifying them by pharmaceutical company. The best results were obtained with nuclear magnetic resonance and near-infrared data, with 94.6%/99.6% and 98.7%/100% of the spectra correctly assigned to their pharmaceutical company or manufacturer, respectively.
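The PCA-plus-LDA workflow described above can be sketched with a minimal pipeline. The synthetic "spectra" below stand in for preprocessed NIR data and the two classes for two manufacturers; the class offset, noise level, and component count are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for preprocessed spectra: two "manufacturers"
# whose spectra differ by a small systematic offset plus noise.
rng = np.random.default_rng(0)
wavelengths = 200
base = np.sin(np.linspace(0, 6, wavelengths))
X = np.vstack([base + 0.05 * rng.standard_normal(wavelengths) + shift
               for shift in [0.0] * 40 + [0.3] * 40])
y = np.array([0] * 40 + [1] * 40)

# PCA compresses the high-dimensional spectra into a few scores;
# LDA then finds the direction that best separates the classes.
model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      LinearDiscriminantAnalysis())
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```

In practice the reported assignment rates would come from cross-validation on held-out spectra, not training accuracy as in this toy example.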
We prove characterizations of the existence of perfect ƒ-matchings in uniform Mengerian and perfect hypergraphs. Moreover, we investigate the ƒ-factor problem in balanced hypergraphs. For uniform balanced hypergraphs we prove two existence theorems with purely combinatorial arguments, whereas for non-uniform balanced hypergraphs we show that the ƒ-factor problem is NP-hard.
Given the strong increase in regulatory requirements for business processes, the management of business process compliance is receiving more and more attention in IS research. Several methods have been developed to support compliance checking of conceptual models. However, their focus on distinct modeling languages and mostly linear (i.e., predecessor–successor related) compliance rules may hinder widespread adoption and application in practice. Furthermore, hardly any of them has been evaluated in a real-world setting. We address this issue by applying a generic pattern matching approach for conceptual models to business process compliance checking in the financial sector. It consists of a model query language, a search algorithm and a corresponding modelling tool prototype. It is applicable (1) to all graph-based conceptual modeling languages and (2) to different kinds of compliance rules. Furthermore, based on an applicability check, we (3) evaluate the approach in a financial industry project setting with regard to its relevance for decision support in audit and compliance management tasks.
With a steady increase of regulatory requirements for business processes, automation support for compliance management is a field attracting increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we review and discuss compliance-checking approaches to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Based on the search results, we propose a roadmap for further research in model-based business process compliance checking.
Objective: As high-field cardiac MRI (CMR) becomes more widespread, the propensity of the ECG to interference from electromagnetic fields (EMF) and to magneto-hydrodynamic (MHD) effects increases, and with it the motivation for a CMR triggering alternative. This study explores the suitability of acoustic cardiac triggering (ACT) for left ventricular (LV) function assessment in healthy subjects (n=14). Methods: Quantitative analysis of 2D CINE steady-state free precession (SSFP) images was conducted to compare ACT's performance with vector ECG (VCG). Endocardial border sharpness (EBS) was examined, paralleled by quantitative LV function assessment. Results: Unlike VCG, ACT provided signal traces free of interference from EMF or MHD effects. In the case of correct R-wave recognition, VCG-triggered 2D CINE SSFP was immune to cardiac motion effects, even at 3.0 T. However, VCG-triggered 2D CINE SSFP imaging was prone to cardiac motion effects and EBS degradation if R-wave misregistration occurred. ACT-triggered acquisitions yielded LV parameters (end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV), ejection fraction (EF) and left ventricular mass (LVM)) comparable with those derived from VCG-triggered acquisitions (1.5 T: ESV_VCG = (56±17) ml, EDV_VCG = (151±32) ml, LVM_VCG = (97±27) g, SV_VCG = (94±19) ml, EF_VCG = (63±5)% vs. ESV_ACT = (56±18) ml, EDV_ACT = (147±36) ml, LVM_ACT = (102±29) g, SV_ACT = (91±22) ml, EF_ACT = (62±6)%; 3.0 T: ESV_VCG = (55±21) ml, EDV_VCG = (151±32) ml, LVM_VCG = (101±27) g, SV_VCG = (96±15) ml, EF_VCG = (65±7)% vs. ESV_ACT = (54±20) ml, EDV_ACT = (146±35) ml, LVM_ACT = (101±30) g, SV_ACT = (92±17) ml, EF_ACT = (64±6)%). Conclusions: ACT's intrinsic insensitivity to interference from electromagnetic fields renders it a promising alternative to VCG for CMR triggering.
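The LV parameters above are linked by simple volume arithmetic. The sketch below reproduces the stroke volume and ejection fraction relations using the reported 1.5 T VCG mean values.

```python
def lv_function(edv_ml, esv_ml):
    """Stroke volume (ml) and ejection fraction (%) from
    end-diastolic and end-systolic volumes:
    SV = EDV - ESV, EF = 100 * SV / EDV."""
    sv = edv_ml - esv_ml
    ef = 100.0 * sv / edv_ml
    return sv, ef

# Reported 1.5 T VCG means: EDV = 151 ml, ESV = 56 ml
sv, ef = lv_function(151, 56)
print(sv, round(ef))  # SV = 95 ml, EF ≈ 63 %
```

The result matches the reported SV_VCG = (94±19) ml and EF_VCG = (63±5)% within the stated spread (means are per-subject averages, so derived and reported values need not agree exactly).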
After a brief introduction to conventional laboratory structures, this work focuses on an innovative and universal approach to setting up a training laboratory for electric machines and drive systems. The novel approach employs a central 48 V DC bus, which forms the backbone of the structure. Several sets of DC, asynchronous and synchronous machines are connected to this bus. The advantages of the novel system structure are manifold, both from a didactic and a technical point of view: student groups can work at their own performance level in a highly parallelized and at the same time individualized way. Additional training setups (similar or different) can easily be added. Only the total power dissipation has to be supplied externally, i.e. the DC bus balances the power flow between the student groups. Comparative results of course evaluations of several cohorts of students are shown.
Design and initial performance of PlanTIS: a high-resolution positron emission tomograph for plants
(2010)
Positron emitters such as ¹¹C, ¹³N and ¹⁸F and their labelled compounds are widely used in clinical diagnosis and animal studies, but can also be used to study metabolic and physiological functions in plants dynamically and in vivo. A very particular tracer molecule is ¹¹CO₂ since it can be applied to a leaf as a gas. We have developed a Plant Tomographic Imaging System (PlanTIS), a high-resolution PET scanner for plant studies. Detectors, front-end electronics and data acquisition architecture of the scanner are based on the ClearPET™ system. The detectors consist of LSO and LuYAP crystals in phoswich configuration which are coupled to position-sensitive photomultiplier tubes. Signals are continuously sampled by free-running ADCs, and data are stored in a list mode format. The detectors are arranged in a horizontal plane to allow the plants to be measured in their natural upright position. Two groups of four detector modules stand face-to-face and rotate around the field of view. This special system geometry requires dedicated image reconstruction and normalization procedures. We present the initial performance of the detector system and first phantom and plant measurements.
To better understand what kinds of sports and exercise could be beneficial for the intervertebral disc (IVD), we performed a review to synthesise the literature on IVD adaptation with loading and exercise. The state of the literature did not permit a systematic review; therefore, we performed a narrative review. The majority of the available data come from cell or whole-disc loading models and animal exercise models. However, some studies have examined the impact of specific sports on IVD degeneration in humans and acute exercise on disc size. Based on the data available in the literature, loading types that are likely beneficial to the IVD are dynamic, axial, at slow to moderate movement speeds, and of a magnitude experienced in walking and jogging. Static loading, torsional loading, flexion with compression, rapid loading, high-impact loading and explosive tasks are likely detrimental for the IVD. Reduced physical activity and disuse appear to be detrimental for the IVD. We also consider the impact of genetics and the likelihood of a ‘critical period’ for the effect of exercise in IVD development. The current review summarises the literature to increase awareness amongst exercise, rehabilitation and ergonomic professionals regarding IVD health and provides recommendations on future directions in research.
The benefits of robotic process automation (RPA) are closely tied to the use of commercial off-the-shelf (COTS) software products that can be easily implemented and customized by business units. But how can the best-fitting RPA product be found for a specific situation, one that creates the expected benefits? This question belongs to the general area of software evaluation and selection. With more than 75 RPA products currently on the market, guidance that considers these specifics is required. Therefore, this chapter proposes a criteria-based selection method specifically for RPA. The method includes a quantitative evaluation of costs and benefits as well as a qualitative utility analysis based on functional criteria. Using the visualization of financial implications (VOFI) method, an application-oriented structure is provided that contrasts the total cost of ownership with the time savings times salary (TSTS). For the utility analysis, a detailed list of functional criteria for RPA is offered. The whole method is based on a multi-vocal review of scientific and non-scholarly literature, including publications by business practitioners, consultants, and vendors. The application of the method is illustrated by a concrete RPA example. The illustrated structures, templates, and criteria can be directly utilized by practitioners in their real-life RPA implementations. In addition, a normative decision process for selecting RPA alternatives is proposed before the chapter closes with a discussion and outlook.
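A utility analysis of the kind described can be sketched as a weighted scoring over functional criteria. The criteria, weights, and candidate scores below are invented for illustration and are not taken from the chapter's criteria list.

```python
def utility(scores, weights):
    """Weighted utility value: criterion scores (0-10 scale)
    combined with weights normalized to sum to one."""
    total = sum(weights.values())
    return sum(scores[c] * w / total for c, w in weights.items())

# Hypothetical functional criteria and two candidate RPA products.
weights = {"screen scraping": 3, "scheduling": 2, "audit log": 4,
           "low-code designer": 1}
product_a = {"screen scraping": 8, "scheduling": 6, "audit log": 9,
             "low-code designer": 5}
product_b = {"screen scraping": 9, "scheduling": 7, "audit log": 5,
             "low-code designer": 8}

for name, scores in [("A", product_a), ("B", product_b)]:
    print(name, round(utility(scores, weights), 2))
```

In the method described above, such a utility value would be weighed alongside the quantitative VOFI cost/benefit figures rather than deciding the selection on its own.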
The continuing growth of scientific publications raises the question of how research processes can be digitalized and thus made more productive. Especially in information technology fields, research practice is characterized by a rapidly growing volume of publications. Various information systems exist for the search process. However, the analysis of the published content is still a highly manual task. Therefore, we propose a text analytics system that allows a fully digitalized analysis of literature sources. We have realized a prototype using EBSCO Discovery Service in combination with IBM Watson Explorer and demonstrated the results in real-life research projects. Potential addressees are research institutions, consulting firms, and decision-makers in politics and business practice.
Information technologies such as big data analytics, cloud computing, cyber-physical systems, robotic process automation, and the internet of things provide a sustainable impetus for the structural development of business sectors as well as the digitalization of markets, enterprises, and processes. Within the consulting industry, the proliferation of these technologies has opened up the new segment of digital transformation, which focuses on setting up, controlling, and implementing projects for enterprises from a broad range of sectors. These recent developments raise the question of which requirements evolve for IT consultants as important success factors in such digital transformation projects. Therefore, this empirical contribution provides indications regarding the qualifications and competences necessary for IT consultants in the era of digital transformation from a labor market perspective. On the one hand, this knowledge base is interesting for the academic education of consultants, since it supports a market-oriented design of adequate training measures. On the other hand, insights into the competence requirements for consultants are relevant for skill and talent management processes in consulting practice. Assuming that consulting companies pursue a strategic human resource management approach, labor market information may also be useful for discovering strategic behavioral patterns.
Antibias training is increasingly demanded and practiced in academia and industry to increase employees' sensitivity to discrimination, racism, and diversity. Under the heading of "Diversity Management," antibias trainings are mainly offered as one-off workshops intended to raise awareness of unconscious biases, create a diversity-affirming corporate culture, promote awareness of the potential of diversity, and ultimately enable the reflection of diversity in development processes. However, although the approach originates in childhood education, research and scientific articles on the sustainable effectiveness of antibias work in adulthood, especially in academia, are very scarce. To fill this research gap, this article explores how sustainable the effects of individual antibias trainings on participants' behavior are. To investigate this, participant observation in a qualitative pre–post setting was conducted, analyzing antibias training in an academic context. Two observers actively participated in the training sessions and documented the activities and reflection processes of the participants. Overall, the results question the effectiveness of single antibias trainings and show that a target-group-adaptive approach is mandatory given the approach's background in early childhood education. Antibias work therefore needs to be adapted to the target group's needs and realities of life. Furthermore, the study reveals that single antibias trainings must be embedded in a holistic diversity management approach to stimulate sustainable reflection processes among the target group. This article is one of the first to scientifically evaluate antibias training effectiveness, especially in the engineering sciences and the university context.
The utilization of phase change material (PCM) for latent heat storage and thermal control of spacecraft has been demonstrated in only a few missions so far. One limiting factor was that all concepts developed to date envisioned the PCM as an additional capacitor, encapsulated in its own housing, leading to mass, efficiency and accommodation challenges. Recently, the application of PCM within the scan cavity of a GEOS type satellite has been suggested in order to tackle thermal issues caused by direct sun intrusion (Choi, M., 2014). However, the application of PCM in such complex mechanical structures is extremely challenging. A new concept to tackle this issue is currently under development at FH Aachen University of Applied Sciences. The concept, "Infused Thermal Solutions" (ITS), is based on the idea of 3D printing metallic structures in their regular functional shape, but double-walled with internal lattice support structures, allowing a PCM layer to be infused directly into the voids and eliminating the need for additional parts and interfaces. Together with OHB System, FH Aachen theoretically studied the application of this technology to the Meteosat Third Generation (MTG) Infra-Red Sounder (IRS) instrument. The study focuses on the scan cavity and entrance baffling assembly (EBA) of the IRS. It consists of thermal analyses, a 3D redesign and breadboarding of a scaled, PCM-infused EBA version. In the thermal design of the alternative EBA, PCM was applied directly into the EBA, simulating the worst-case hot sun intrusion of the mission. By applying 4 kg of PCM (to a 60 kg baffle), the EBA temperature excursions during sun intrusion were reduced from 140 K to 30 K, leading to a significant thermo-opto-elastic performance gain. This paper presents the current development status of the ITS concept.
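The order of magnitude of the buffering effect can be checked with a basic latent-heat estimate. The latent heat value below is a typical figure for paraffin-class PCMs, assumed for illustration; the study does not state the material used.

```python
def latent_heat_capacity(mass_kg, latent_heat_j_per_kg=200e3):
    """Energy (J) absorbed by a PCM while melting at (nearly)
    constant temperature: Q = m * L."""
    return mass_kg * latent_heat_j_per_kg

# 4 kg of PCM with an assumed latent heat of ~200 kJ/kg
q = latent_heat_capacity(4.0)
print(q / 1e3, "kJ")  # 800 kJ absorbed with no temperature rise
```

This isothermal absorption during melting is what clips the temperature excursion of the baffle: heat that would otherwise raise the structure's temperature goes into the phase change instead.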
This paper compares several blade element theory (BET) method-based propeller simulation tools, including an evaluation against static propeller ground tests and high-fidelity Reynolds-Averaged Navier–Stokes (RANS) simulations. Two proprietary propeller geometries for paraglider applications are analysed in static and flight conditions. The RANS simulations are validated with the static test data and used as a reference for comparing the BET in flight conditions. The comparison includes the analysis of varying 2D aerodynamic airfoil parameters and different induced velocity calculation methods. The evaluation shows the strengths of the BET tools compared to RANS simulations. The RANS simulations underpredict static experimental data within 10% relative error, while appropriate BET tools overpredict the RANS results by 15–20% relative error. A variation in 2D aerodynamic data shows the need for highly accurate 2D data to obtain accurate BET results. The nonlinear BET coupled with XFOIL for the 2D aerodynamic data matches best with RANS in static operation and flight conditions. The novel BET tool PropCODE combines both approaches and offers further correction models for highly accurate static and flight condition results.
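The core of any BET tool is integrating section forces along the blade. The stripped-down sketch below neglects induced velocity (one of the very effects the compared tools model differently) and uses a thin-airfoil lift slope instead of real 2D polars, so it systematically overestimates thrust; geometry and operating point are illustrative.

```python
import math

def bet_static_thrust(radius, chord, pitch_deg, rpm, blades=2,
                      rho=1.225, cl_alpha=2 * math.pi, n_elements=50):
    """Minimal blade element estimate of static thrust.
    Induced velocity is neglected, so the inflow angle is zero and
    the angle of attack equals the geometric pitch; a thin-airfoil
    lift slope stands in for real 2D airfoil data. Illustration only."""
    omega = rpm * 2.0 * math.pi / 60.0
    alpha = math.radians(pitch_deg)
    cl = cl_alpha * alpha
    dr = radius / n_elements
    thrust = 0.0
    for i in range(n_elements):
        r = (i + 0.5) * dr        # element midpoint
        w = omega * r             # local section speed (no inflow)
        thrust += 0.5 * rho * w * w * chord * cl * blades * dr
    return thrust

print(bet_static_thrust(radius=0.5, chord=0.05, pitch_deg=5, rpm=3000))
```

Real BET tools replace the zero-inflow assumption with momentum-theory or vortex-based induced velocity models and the constant lift slope with tabulated or XFOIL-generated 2D polars, which is exactly where the tools compared in the paper differ.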
High aerodynamic efficiency requires propellers with high aspect ratios, while propeller sweep potentially reduces noise. Propeller sweep and high aspect ratios increase elasticity and coupling of structural mechanics and aerodynamics, affecting the propeller performance and noise. Therefore, this paper analyzes the influence of elasticity on forward-swept, backward-swept, and unswept propellers in hover conditions. A reduced-order blade element momentum approach is coupled with a one-dimensional Timoshenko beam theory and Farassat's formulation 1A. The results of the aeroelastic simulation are used as input for the aeroacoustic calculation. The analysis shows that elasticity influences noise radiation because thickness and loading noise respond differently to deformations. In the case of the backward-swept propeller, the location of the maximum sound pressure level shifts forward by 0.5°, while in the case of the forward-swept propeller, it shifts backward by 0.5°. Therefore, aeroacoustic optimization requires the consideration of propeller deformation.
As a low-input crop, Miscanthus offers numerous advantages that, in addition to agricultural applications, permit its exploitation for energy, fuel, and material production. Depending on the Miscanthus genotype, season, and harvest time, as well as the plant component (leaf versus stem), correlations between structure and properties of the corresponding isolated lignins differ. Here, a comparative study is presented of lignins isolated from M. x giganteus, M. sinensis, M. robustus and M. nagara using a catalyst-free organosolv pulping process. The lignins from different plant constituents are also compared regarding their similarities and differences in monolignol ratio and important linkages. Results showed that the plant genotype has the weakest influence on monolignol content and interunit linkages. In contrast, structural differences are more significant among lignins of different harvest times and/or seasons. Analyses were performed using fast and simple methods such as nuclear magnetic resonance (NMR) spectroscopy. The data were assigned to four different linkages (A: β-O-4 linkage, B: phenylcoumaran, C: resinol, D: β-unsaturated ester). In conclusion, the A content is particularly high in leaf-derived lignins, at just under 70%, and significantly lower in stem and mixture lignins, at around 60% and almost 65%, respectively. The second most common linkage pattern in all isolated lignins is D, the proportion of which also depends strongly on the crop portion. Both stem and mixture lignins have a relatively high share of approximately 20% or more (the maximum is M. sinensis Sin2 with over 30%). In the leaf-derived lignins, the proportions are significantly lower on average. Stem samples should be chosen if the highest possible lignin content is desired, specifically from the M. x giganteus genotype, which revealed lignin contents of up to 27%. Due to its better frost resistance and higher stem stability, M. nagara offers some advantages compared to M. x giganteus.
Miscanthus crops thus prove to be a very attractive lignocellulose feedstock (LCF) for second-generation biorefineries and lignin generation in Europe.
Chromatography is the workhorse of biopharmaceutical downstream processing because it can selectively enrich a target product while removing impurities from complex feed streams. This is achieved by exploiting differences in molecular properties, such as size, charge and hydrophobicity (alone or in different combinations). Accordingly, many parameters must be tested during process development in order to maximize product purity and recovery, including resin and ligand types, conductivity, pH, gradient profiles, and the sequence of separation operations. The number of possible experimental conditions quickly becomes unmanageable. Although the range of suitable conditions can be narrowed based on experience, the time and cost of the work remain high even when using high-throughput laboratory automation. In contrast, chromatography modeling using inexpensive, parallelized computer hardware can provide expert knowledge, predicting conditions that achieve high purity and efficient recovery. The prediction of suitable conditions in silico reduces the number of empirical tests required and provides in-depth process understanding, which is recommended by regulatory authorities. In this article, we discuss the benefits and specific challenges of chromatography modeling. We describe the experimental characterization of chromatography devices and settings prior to modeling, such as the determination of column porosity. We also consider the challenges that must be overcome when models are set up and calibrated, including the cross-validation and verification of data-driven and hybrid (combined data-driven and mechanistic) models. This review will therefore support researchers intending to establish a chromatography modeling workflow in their laboratory.
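Column porosity, named above as one of the quantities characterized before modeling, is commonly estimated from the retention volume of a non-interacting tracer that penetrates the resin pores. The column dimensions and retention volume below are illustrative, not values from the article.

```python
import math

def total_porosity(tracer_retention_ml, column_diameter_cm,
                   column_length_cm):
    """Total column porosity from a non-interacting, pore-penetrating
    tracer: epsilon_t = V_R / V_column, where V_column is the empty
    column volume (1 cm^3 = 1 ml)."""
    v_column = math.pi * (column_diameter_cm / 2.0) ** 2 * column_length_cm
    return tracer_retention_ml / v_column

# Illustrative: 1 cm i.d. x 10 cm bed, tracer elutes at 6.2 ml
print(round(total_porosity(6.2, 1.0, 10.0), 3))
```

Interstitial (bed) porosity is obtained the same way using a large, pore-excluded tracer; the two porosities together parameterize the transport terms of mechanistic column models before any binding parameters are calibrated.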
This Research Briefing, issued in July 2010, concluded that:
- Small and medium-sized enterprises (SMEs) in Europe have long called for a matching legal form valid across the EU (similar to that of the European company (SE) for large firms)
- The main benefits would be the availability of uniform Europe-wide company structures, significant cost reductions for businesses and further integration of the internal market
- Given the differing national views regarding the concrete features of the new legal form there is currently no sign of an agreement being reached at the European level in the short term; however, it is possible that progress will be made in negotiations during the year
- The key issues being discussed in depth are company formation, transnationality and employee participation rights in the new European private company (SPE).
Divided government is often thought of as causing legislative deadlock. I investigate the link between divided government and economic reforms using a novel data set on welfare reforms in US states between 1978 and 2010. Panel data regressions show that, under divided government, a US state is around 25% more likely to adopt a welfare reform than under unified government. Several robustness checks confirm this counter-intuitive finding. Case study evidence suggests an explanation based on policy competition between governor, senate, and house.
Does stiffer electoral competition reduce political shirking? For a micro-analysis of this question, I construct a new data set spanning the years 2005 to 2012 covering biographical and political information about German Members of Parliament (MPs), including their attendance rates in voting sessions. For the parliament elected in 2009, I show that indeed opposition party MPs who expect to face a close race in their district show significantly and relevantly lower absence rates in parliament beforehand. MPs of governing parties seem not to react significantly to electoral competition. These results are confirmed by an analysis of the parliament elected in 2005, by several robustness checks, and also by employing an instrumental variable strategy exploiting convenient peculiarities of the German electoral system. The study also shows how MPs elected via party lists react to different levels of electoral competition.