In the context of the Solvency II directive, the operation of an internal risk model is a possible way to assess risk and to determine the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary to generate a model output. To be compliant with the directive, validation of the internal risk model is conducted on the basis of the model output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method can be used more generally for inference on a broad class of law-invariant and coherent risk measures on the basis of a paired sample.
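As a rough illustration of the Monte Carlo step described above, the solvency capital requirement can be estimated as an empirical quantile of a simulated one-year loss distribution (here the 99.5% value-at-risk, a common Solvency II convention). The loss model and all parameters below are hypothetical, not those of the paper:

```python
import numpy as np

def scr_value_at_risk(losses, level=0.995):
    """Estimate the SCR as the empirical value-at-risk of simulated losses."""
    return np.quantile(losses, level)

rng = np.random.default_rng(0)
# Hypothetical one-year loss model: lognormal losses from a Monte Carlo run.
losses = rng.lognormal(mean=10.0, sigma=0.5, size=100_000)
scr = scr_value_at_risk(losses)
```

A paired sample of two such Monte Carlo outputs would then feed the change test suggested in the paper.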
We discuss the testing problem of homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It is seen that a new resampling approach is appropriate for the approximation of the unknown null distribution. We prove that the resulting test asymptotically attains the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite sample case. As an illustration we apply the test to real data sets.
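A minimal sketch of the idea of applying the two-sample Cramér–von Mises distance to the remaining data: each marginal uses only its observed components. The normalization and the resampling approximation of the null distribution from the paper are omitted; data and missingness rates are hypothetical:

```python
import numpy as np

def cvm_two_sample(x, y):
    """Two-sample Cramér-von Mises distance between the empirical
    distributions of x and y, evaluated over the pooled sample."""
    pooled = np.concatenate([x, y])
    Fx = np.searchsorted(np.sort(x), pooled, side="right") / len(x)
    Fy = np.searchsorted(np.sort(y), pooled, side="right") / len(y)
    return np.mean((Fx - Fy) ** 2)

# Paired sample with missing components encoded as NaN (hypothetical data).
rng = np.random.default_rng(1)
pairs = rng.normal(size=(200, 2))
pairs[rng.random(200) < 0.1, 0] = np.nan   # some first components missing
pairs[rng.random(200) < 0.1, 1] = np.nan   # some second components missing
x = pairs[~np.isnan(pairs[:, 0]), 0]       # remaining first components
y = pairs[~np.isnan(pairs[:, 1]), 1]       # remaining second components
stat = cvm_two_sample(x, y)
```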
The established Hoeffding–Blum–Kiefer–Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, for which consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
On the basis of independent and identically distributed bivariate random vectors, whose components are categorical and continuous variables, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular array setting. A natural application is hypothesis testing. An independence test and a two-sample test are investigated in detail. The fairly general setting enables limit results under local alternatives and bootstrap samples. For the comparison with existing tests from the literature, simulation studies are conducted. The empirical results obtained confirm the theoretical findings.
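The concomitants (induced order statistics) mentioned above are simply the second components rearranged according to the ordering of the first components. A minimal sketch with hypothetical data:

```python
import numpy as np

def concomitants(x, y):
    """Return the concomitants (induced order statistics) of y:
    the y-values rearranged by the ordering of their paired x-values."""
    order = np.argsort(x, kind="stable")
    return y[order]

rng = np.random.default_rng(2)
x = rng.normal(size=10)
y = 2 * x + rng.normal(size=10)   # hypothetical dependent pairs
y_conc = concomitants(x, y)
```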
This paper considers a paired data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded into a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von Mises type test statistic is applied and a bootstrap procedure is suggested to obtain critical values and, finally, a consistent test. The desired properties of a bootstrap test are derived: asymptotic exactness under the null hypothesis and consistency under alternatives. Simulations show the quality of the test in the finite sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data. The approach is demonstrated on historical data for different stock market indices.
Reinforced concrete (RC) buildings with masonry infill are constructed in many countries around the world. Although the masonry infill is regarded as a non-structural element, it significantly changes the dynamic characteristics of RC frame structures during earthquake action. Recently, considerable effort has been devoted to the investigation of isolated infills, which are decoupled from the surrounding frame, usually by leaving a gap between the frame and the infill. In this case the deformation of the frame does not activate the infill, and thus the infill does not influence the behaviour of the frame. This paper presents the results of an investigation of the behaviour of RC frame buildings with the INODIS system, which isolates the infill from the surrounding frame. The influence of the isolated infill was first examined on single-storey, single-bay frames. This served as the basis for a parametric study on multi-storey, multi-bay frames as well as on a building example. Changes in stiffness and dynamic characteristics were analysed, as was the response under earthquake action. Comparisons were made with a bare frame structure and with frames infilled in the traditional way. The results show that the behaviour of frames with isolated infill is similar to that of bare frames, whereas the behaviour of traditionally infilled frames is far different and requires complex numerical models. This means that if adequate structural measures for isolating the infill are applied, the design of frame buildings with masonry infill can be significantly simplified.
On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of raindays and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized by the property that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the functional part of this equation by substituting the empirical counterparts of the transform, we obtain an expression; the weighted integral of its square is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by using a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed.
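The compound Poisson exponential model under test can be simulated directly: draw N from a Poisson distribution and let S be the sum of N independent exponential summands. A minimal sketch with hypothetical parameters (the paper's test statistic and parametric bootstrap are not reproduced here):

```python
import numpy as np

def simulate_compound_poisson_exponential(lam, theta, size, rng):
    """Draw observations of (S, N): N ~ Poisson(lam), and S is the sum
    of N independent Exponential summands with mean theta."""
    n = rng.poisson(lam, size=size)
    s = np.array([rng.exponential(theta, size=k).sum() for k in n])
    return s, n

rng = np.random.default_rng(3)
# Hypothetical parameters, e.g. raindays (N) and rainfall totals (S).
s, n = simulate_compound_poisson_exponential(lam=5.0, theta=2.0, size=1000, rng=rng)
```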
FEM shakedown analysis of structures under random strength with chance constrained programming
(2022)
Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-carrying capacity of structures beyond the elastic range. In practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the shakedown analysis can be formulated as a stochastic programming problem. In this paper, a method called chance constrained programming is presented, an effective stochastic programming technique for solving shakedown analysis problems under random strength conditions. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed variable.
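The chance-constraint idea can be illustrated in one dimension: a constraint P(stress ≤ R) ≥ p with normally distributed strength R has the deterministic equivalent stress ≤ μ_R − z_p·σ_R, where z_p is the standard normal p-quantile. This is only a schematic sketch of the reformulation, not the FEM shakedown formulation of the paper; the strength parameters are hypothetical:

```python
from statistics import NormalDist

def deterministic_strength_bound(mu_r, sigma_r, p):
    """Deterministic equivalent of the chance constraint
    P(stress <= R) >= p for normally distributed strength R:
    the admissible stress bound mu_r - z_p * sigma_r."""
    z_p = NormalDist().inv_cdf(p)
    return mu_r - z_p * sigma_r

# Hypothetical yield strength: mean 235 MPa, std 10 MPa, reliability 0.95.
bound = deterministic_strength_bound(235.0, 10.0, 0.95)
```

The same reasoning, applied constraint-by-constraint, turns the stochastic shakedown problem into a deterministic convex program.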
REM sleep without atonia (RSWA) is a key feature for the diagnosis of rapid eye movement (REM) sleep behaviour disorder (RBD). We introduce RBDtector, a novel open-source software to score RSWA according to established SINBAR visual scoring criteria. We assessed muscle activity of the mentalis, flexor digitorum superficialis (FDS), and anterior tibialis (AT) muscles. RSWA was scored manually as tonic, phasic, and any activity by human scorers as well as using RBDtector in 20 subjects. Subsequently, 174 subjects (72 without RBD and 102 with RBD) were analysed with RBDtector to show the algorithm’s applicability. We additionally compared RBDtector estimates to a previously published dataset. RBDtector showed robust conformity with human scorings. The highest congruency was achieved for phasic and any activity of the FDS. Combining mentalis any and FDS any, RBDtector identified RBD subjects with 100% specificity and 96% sensitivity applying a cut-off of 20.6%. Comparable performance was obtained without manual artefact removal. RBD subjects also showed muscle bouts of higher amplitude and longer duration. RBDtector provides estimates of tonic, phasic, and any activity comparable to human scorings. RBDtector, which is freely available, can help identify RBD subjects and provides reliable RSWA metrics.
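The "any activity" metric can be approximated as the percentage of 3-s mini-epochs in which the EMG signal exceeds a threshold. The sketch below is a simplified stand-in, not the SINBAR criteria or RBDtector's actual algorithm; sampling rate, threshold, and data are hypothetical:

```python
import numpy as np

def any_activity_percentage(emg, fs, threshold, epoch_s=3.0):
    """Percentage of mini-epochs containing EMG samples above threshold --
    a simplified stand-in for 'any' RSWA activity scoring."""
    samples_per_epoch = int(fs * epoch_s)
    n_epochs = len(emg) // samples_per_epoch
    epochs = emg[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)
    active = (np.abs(epochs) > threshold).any(axis=1)
    return 100.0 * active.mean()

rng = np.random.default_rng(4)
emg = rng.normal(0.0, 1.0, size=30_000)   # hypothetical 5 min of EMG at 100 Hz
pct = any_activity_percentage(emg, fs=100, threshold=4.0)
```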
Purpose
In the determination of the measurement uncertainty, the GUM procedure requires building a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling as well as of quantifying the measurement uncertainties depends on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach
In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments.
Findings
Based on a success metric, the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model.
Originality/value
For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
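As a flavour of filter-type feature selection, a minimal correlation filter can be sketched as follows. This is a generic illustration, not one of the nine methods compared in the study; the data are synthetic:

```python
import numpy as np

def correlation_filter(X, y, min_abs_corr=0.3):
    """Filter-type feature selection: keep the indices of features whose
    absolute Pearson correlation with the target exceeds a threshold."""
    corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return np.flatnonzero(np.abs(corrs) >= min_abs_corr)

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 5))
# Hypothetical measurement model: only features 0 and 2 influence the target.
y = 3.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(size=200)
selected = correlation_filter(X, y, min_abs_corr=0.3)
```

A filter like this would flag influencing quantities with negligible effect for removal from the measurement model; more elaborate FS methods additionally handle redundant features.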
The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors like iron and steel. In order to prevent the relocation of such industries outside the EU in the course of tightening environmental regulations, the establishment of a climate club jointly with other large emitters and, alternatively, the unilateral implementation of an international cross-border carbon tax mechanism have been proposed. This article focuses on the latter option, choosing the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes employed in Europe against more polluting competition from abroad. By using a floor price model, we assess the competitiveness of different steel production routes in selected countries. We evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model. It enables us to assess the impacts of rising steel production costs on commodities using steel as intermediates. Our results raise concerns that a cross-border tax mechanism will not suffice to bring about competitiveness of hydrogen-based steel production in Europe because the cost tends to remain higher than the cost of steel production in, e.g., China. Steel is a classic example of a good used mainly as an intermediate for other products. Therefore, a cross-border tax mechanism for steel will increase the price of products produced in the EU that require steel as an input. This can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports.
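The input-output price effect mentioned above follows the standard Leontief price model: a sector-specific cost shock Δv propagates to product prices via Δp = Δv(I − A)⁻¹, with A the input coefficient matrix. A two-sector sketch with hypothetical coefficients (not the study's data):

```python
import numpy as np

# Hypothetical 2-sector input coefficient matrix A[i, j]:
# input of sector i required per unit of output of sector j
# (think: sector 0 = steel, sector 1 = steel-using manufacturing).
A = np.array([[0.1, 0.3],
              [0.2, 0.1]])

# Cost shock per unit of output, e.g. a carbon border charge on steel.
dv = np.array([0.05, 0.0])

# Leontief price model: dp satisfies dp = dp @ A + dv, i.e. dp = dv (I - A)^-1.
dp = dv @ np.linalg.inv(np.eye(2) - A)
```

Even though the shock hits only the steel sector, `dp` shows a positive price effect in the downstream sector as well, which is the competitiveness concern raised in the article.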
Companies are generally convinced that they put their customers' needs first. But in direct interaction with customers, they often show weaknesses. The following article illustrates how consistently aligning value-creation processes with central customer needs can achieve a threefold effect: sustainably increased customer satisfaction, improved efficiency, and differentiation from the competition.
Customer requirements for networks have changed considerably in recent years. With NFV and SDN, companies are technically able to meet them. Providers, however, face major challenges: in particular, products and processes must be adapted and made more agile in order to turn the strengths of NFV and SDN into customer advantages.
Conducting a systematic literature review is a core competence of academic work and is therefore a fixed component of Bachelor's and Master's degree programmes. In the corresponding courses, students are familiarized with the basic tools for searching and managing literature, but the potential of text-analytic methods and systems (text mining, text analytics) is usually not covered. Consequently, the data competencies required for the system-supported analysis and exploration of literature data are not sufficiently developed. To address this competence gap, a course aimed particularly at students of business programmes has been designed and implemented in a project-oriented manner at Hochschule Osnabrück. This article documents the subject-related and technical design of the course and outlines potential for its future development.
On the application of Eurocode 3 Part 1-2 for fire design, and suggestions for its revision
(2016)
The Eurocodes are being revised by 2020 in the European Committee for Standardization (CEN), Technical Committee TC 250. In preparation for the Eurocode revision, committed engineers within the initiative PraxisRegeln Bau (PRB) examined the parts of Eurocode 3 that are frequently used in practice. With the aim of improving the practicality of Eurocode 3 for fire design, the existing standard EN 1993 Part 1-2 was analysed, particularly with regard to user-friendliness, and proposals for the European revision were developed. The analyses show that restructuring and the introduction of tables can considerably improve the comprehensibility and usability of the fire-design rules.
Columns and beams made of steel sections can be embedded in foundations or walls of reinforced concrete. These connections usually act as fixed restraints, which require a sufficient embedment depth. In the following, a generalized calculation method is presented for steel sections restrained in reinforced-concrete structures, covering rolled I-sections, welded I-sections, circular hollow sections, rectangular hollow sections, and single-cell box sections. For loading due to uniaxial bending about the strong and weak section axes, the section-dependent approach for the concrete compressive stresses in the embedment region and the determination of the embedment depth are treated. Taking the normal force into account, load-bearing capacity verifications for the steel sections are carried out at the governing locations. As a supplement to the calculation formulas, design aids are provided that facilitate the choice of the effective widths and embedment depths.
In practice, there are diverse fields of application for travel demand models. They can provide indicators of transport supply and travel demand for the present state as well as for future states, thereby delivering the basis for transport-planning decisions. The new "Empfehlungen zum Einsatz von Verkehrsnachfragemodellen für den Personenverkehr" (EVNM-PV) (FGSV 2022) use typical planning tasks to illustrate the differentiated requirements that result for model conception and construction. Against the background of the concrete task and its specific planning requirements, the model specification to be derived forms the agreed basis between the client and the model builder for the concrete substantive and technical design of the transport model.
The newly published "Empfehlungen zum Einsatz von Verkehrsnachfragemodellen für den Personenverkehr" are the first recommendation paper of the Forschungsgesellschaft für Straßen- und Verkehrswesen to provide a comprehensive overview of the various aspects of modelling, and they give planners concrete guidance for the conception of demand models. Among other things, the recommendation paper aims to harmonize expectations and the level of requirements with regard to the appropriateness of the models, the achievable model quality, and the level of detail of the model results.
The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for planning the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization prerequisites stated by Process Virtualization Theory into the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand for changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.