TY - JOUR A1 - Marinković, Marko A1 - Butenweg, Christoph ED - Ford, Michael C. T1 - Experimental testing of decoupled masonry infills with steel anchors for out-of-plane support under combined in-plane and out-of-plane seismic loading JF - Construction and Building Materials N2 - Because of the simple construction process, high energy efficiency, significant fire resistance and excellent sound insulation, masonry-infilled reinforced concrete (RC) frame structures are very popular in most countries of the world, including seismically active areas. However, many RC frame structures with masonry infills were seriously damaged during earthquake events, as traditional infills are generally constructed in direct contact with the RC frame, which brings undesirable infill/frame interaction. This interaction leads to the activation of the equivalent diagonal strut in the infill panel, due to the RC frame deformation, and combined with seismically induced loads perpendicular to the infill panel often causes total collapse of the masonry infills and heavy damage to the RC frames. This fact was the motivation for developing different approaches for improving the behaviour of masonry infills, among which infill isolation (decoupling) from the frame has been studied more intensively in the last decade. In-plane isolation of the infill wall reduces infill activation, but creates the need for additional measures to restrain out-of-plane movements. This can be provided by installing steel anchors, as proposed by some researchers. Within the framework of the European research project INSYSME (Innovative Systems for Earthquake Resistant Masonry Enclosures in Reinforced Concrete Buildings), a system based on the use of elastomers for in-plane decoupling and steel anchors for out-of-plane restraint was tested. This constructive solution was tested and investigated in depth during an experimental campaign in which traditional and decoupled masonry-infilled RC frames with anchors were subjected to separate and combined in-plane and out-of-plane loading. Based on a detailed evaluation and comparison of the test results, the performance and effectiveness of the developed system are illustrated. KW - Masonry infill KW - Reinforced concrete frame KW - Earthquake KW - INSYSME KW - Decoupling Y1 - 2022 U6 - https://doi.org/10.1016/j.conbuildmat.2021.126041 SN - 1879-0526 SN - 0950-0618 VL - 318 IS - 1 PB - Elsevier CY - Amsterdam ER - TY - JOUR A1 - Rossi, Leonardo A1 - Winands, Mark H. M. A1 - Butenweg, Christoph ED - Zhang, Jessica T1 - Monte Carlo Tree Search as an intelligent search tool in structural design problems JF - Engineering with Computers : An International Journal for Simulation-Based Engineering N2 - Monte Carlo Tree Search (MCTS) is a search technique that in the last decade emerged as a major breakthrough for Artificial Intelligence applications in board and video games. In 2016, AlphaGo, an MCTS-based software agent, outperformed the human world champion of the board game Go. This game was long considered almost infeasible for machines, due to its immense search space and the need for a long-term strategy. Since this historic success, MCTS has been considered an effective new approach for many other scientific and technical problems. Interestingly, civil structural engineering, as a discipline, offers many tasks whose solution may benefit from intelligent search and in particular from adopting MCTS as a search tool.
In this work, we show how MCTS can be adapted to search for suitable solutions of a structural engineering design problem. The problem consists of choosing the load-bearing elements in a reference reinforced concrete structure so as to achieve a set of specific dynamic characteristics. In the paper, we report the results obtained by applying both a plain and a hybrid version of single-agent MCTS. The hybrid approach consists of an integration of MCTS and a classic Genetic Algorithm (GA), the latter also serving as a term of comparison for the results. The study’s outcomes may open new perspectives for the adoption of MCTS as a design tool for civil engineers. KW - Monte Carlo Tree Search KW - Structural design KW - Artificial intelligence KW - Civil engineering KW - Genetic algorithm Y1 - 2022 U6 - https://doi.org/10.1007/s00366-021-01338-2 SN - 1435-5663 SN - 0177-0667 VL - 38 IS - 4 SP - 3219 EP - 3236 PB - Springer Nature CY - Cham ER - TY - JOUR A1 - Gaigall, Daniel A1 - Gerstenberg, Julian A1 - Trinh, Thi Thu Ha T1 - Empirical process of concomitants for partly categorial data and applications in statistics JF - Bernoulli N2 - On the basis of independent and identically distributed bivariate random vectors, where the components are categorial and continuous variables, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular array setting. A natural application is hypothesis testing. An independence test and a two-sample test are investigated in detail. The fairly general setting enables limit results under local alternatives and bootstrap samples. For the comparison with existing tests from the literature, simulation studies are conducted. The empirical results obtained confirm the theoretical findings. KW - Bootstrap KW - Categorial variable KW - Concomitant KW - Empirical process KW - Independence test Y1 - 2022 U6 - https://doi.org/10.3150/21-BEJ1367 SN - 1573-9759 VL - 28 IS - 2 SP - 803 EP - 829 PB - International Statistical Institute CY - Den Haag, NL ER - TY - JOUR A1 - Ditzhaus, Marc A1 - Gaigall, Daniel T1 - Testing marginal homogeneity in Hilbert spaces with applications to stock market returns JF - Test N2 - This paper considers a paired data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded into a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von-Mises type test statistic is applied and a bootstrap procedure is suggested to obtain critical values and finally a consistent test. The desired properties of a bootstrap test can be derived, namely asymptotic exactness under the null hypothesis and consistency under alternatives. Simulations show the quality of the test in the finite sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data. The approach is demonstrated based on historical data for different stock market indices.
Y1 - 2022 U6 - https://doi.org/10.1007/s11749-022-00802-5 SN - 1863-8260 VL - 31 SP - 749 EP - 770 PB - Springer ER - TY - CHAP A1 - Gaigall, Daniel T1 - On Consistent Hypothesis Testing In General Hilbert Spaces T2 - Proceedings of the 4th International Conference on Statistics: Theory and Applications (ICSTA’22) N2 - Inference on the basis of high-dimensional data and inference on the basis of functional data are two topics which are discussed frequently in the current statistical literature. A possibility to include both topics in a single approach is to work on a very general space for the underlying observations, such as a separable Hilbert space. We propose a general method for consistent hypothesis testing on the basis of random variables with values in separable Hilbert spaces. We avoid concerns with the curse of dimensionality by means of a projection idea. We apply well-known test statistics from nonparametric inference to the projected data and integrate over all projections from a specific set and with respect to suitable probability measures. In contrast to classical methods, which are applicable for real-valued random variables or random vectors of dimensions lower than the sample size, the tests can be applied to random vectors of dimensions larger than the sample size or even to functional and high-dimensional data. In general, resampling procedures such as bootstrap or permutation are suitable to determine critical values. The idea can be extended to the case of incomplete observations. Moreover, we develop an efficient algorithm for implementing the method. Examples are given for testing goodness-of-fit in a one-sample situation in [1] or for testing marginal homogeneity on the basis of a paired sample in [2]. Here, the test statistics in use can be seen as generalizations of the well-known Cramér–von-Mises test statistics in the one-sample and two-sample cases. The treatment of other testing problems is possible as well. By using the theory of U-statistics, for instance, asymptotic null distributions of the test statistics are obtained as the sample size tends to infinity. Standard continuity assumptions ensure the asymptotic exactness of the tests under the null hypothesis and that the tests detect any alternative in the limit. Simulation studies demonstrate the size and power of the tests in the finite sample case, confirm the theoretical findings, and are used for the comparison with competing procedures. A possible application of the general approach is inference for stock market returns, also at high data frequencies. In the field of empirical finance, statistical inference of stock market prices usually takes place on the basis of the related log-returns as data. In the classical models for stock prices, i.e., the exponential Lévy model, Black-Scholes model, and Merton model, properties such as independence and stationarity of the increments ensure an independent and identically distributed structure of the data. Specific trends during certain periods of the stock price processes can cause complications in this regard. In fact, our approach can compensate for those effects by treating the log-returns as random vectors or even as functional data. Y1 - 2022 U6 - https://doi.org/10.11159/icsta22.157 N1 - 4th International Conference on Statistics: Theory and Applications (ICSTA’22), Prague, Czech Republic, July 28–30, 2022 SP - Paper No.
157 PB - Avestia Publishing CY - Orléans, Canada ER - TY - CHAP A1 - Staat, Manfred A1 - Tran, Ngoc Trinh T1 - Strain based brittle failure criteria for rocks T2 - Proceedings of (NACOME2022) The 11th National Conference on Mechanics, Vol. 1. Solid Mechanics, Rock Mechanics, Artificial Intelligence, Teaching and Training N2 - When confining pressure is low or absent, extensional fractures are typical, with fractures occurring on unloaded planes in rock. These “paradox” fractures can be explained by a phenomenological extension strain failure criterion. In the past, a simple empirical criterion for fracture initiation in brittle rock has been developed, but this criterion makes unrealistic strength predictions in biaxial compression and tension. A new extension strain criterion overcomes this limitation by adding a weighted principal shear component. The weight is chosen such that the enriched extension strain criterion represents the same failure surface as the Mohr–Coulomb (MC) criterion. Thus, the MC criterion has been derived as an extension strain criterion predicting failure modes, which are unexpected in the understanding of the failure of cohesive-frictional materials. In progressive damage of rock, the most likely fracture direction is orthogonal to the maximum extension strain. The enriched extension strain criterion is proposed as a threshold surface for crack initiation (CI) and crack damage (CD) and as a failure surface at peak (P). Examples show that the enriched extension strain criterion predicts much lower volumes of damaged rock mass compared to the simple extension strain criterion. KW - Extension fracture KW - Extension strain criterion KW - Mohr–Coulomb criterion KW - Evolution of damage Y1 - 2023 SN - 978-604-357-084-7 N1 - 11th National Conference on Mechanics (NACOME 2022), December 2-3, 2022, VNU University of Engineering and Technology, Hanoi, Vietnam SP - 500 EP - 509 PB - Nha xuat ban Khoa hoc tu nhien va Cong nghe (Publishing House for Natural Sciences and Technology) CY - Hanoi ER - TY - CHAP A1 - Butenweg, Christoph ED - Vacareanu, Radu ED - Ionescu, Constantin T1 - Seismic design and evaluation of industrial facilities T2 - The Third European Conference on Earthquake Engineering and Seismology N2 - Industrial facilities must be thoroughly designed to withstand seismic actions, as they exhibit an increased loss potential due to the possibly wide-ranging damage consequences and the valuable process engineering equipment. Past earthquakes showed the social and political consequences of seismic damage to industrial facilities and sensitized the population and politicians worldwide to the possible hazard emanating from industrial facilities. However, a holistic approach for the seismic design of industrial facilities can presently be found neither in national nor in international standards. The introduction of EN 1998-4 of the new generation of Eurocode 8 will improve the normative situation with specific seismic design rules for silos, tanks, pipelines and secondary process components. The article presents essential aspects of the seismic design of industrial facilities based on the new generation of Eurocode 8 using the example of tank structures and secondary process components. The interaction effects of the process components with the primary structure are illustrated by means of the experimental results of a shaking table test of a three-story moment-resisting steel frame with different process components.
Finally, an integrated approach of digital plant models based on building information modelling (BIM) and structural health monitoring (SHM) is presented, which provides not only a reliable decision-making basis for operation, maintenance and repair but also an excellent tool for the rapid assessment of seismic damage. KW - Industrial facilities KW - Seismic design KW - Tanks KW - EN 1998-4 KW - Structural health monitoring Y1 - 2022 SN - 978-3-031-15103-3 SN - 978-3-031-15106-4 SN - 978-3-031-15104-0 U6 - https://doi.org/10.1007/978-3-031-15104-0 SN - 2524-342X SN - 2524-3438 N1 - 3ECEES - Third European Conference on Earthquake Engineering and Seismology, September 4 – September 9, 2022, Bucharest SP - 449 EP - 464 PB - Springer CY - Cham ER - TY - CHAP A1 - Gkatzogias, Konstantinos A1 - Veljkovic, Ana A1 - Pohoryles, Daniel A. A1 - Tsionis, Georgios A1 - Bournas, Dionysios A. A1 - Crowley, Helen A1 - Norlén, Hedvig A1 - Butenweg, Christoph A1 - Gervasio, Helena A1 - Manfredi, Vincenzo A1 - Masi, Angelo A1 - Zaharieva, Roumiana ED - Gkatzogias, Konstantinos ED - Tsionis, Georgios T1 - Policy practice and regional impact assessment for building renovation T2 - REEBUILD Integrated Techniques for the Seismic Strengthening & Energy Efficiency of Existing Buildings N2 - The work presented in this report provides scientific support to building renovation policies in the EU by promoting a holistic point of view on the topic. Integrated renovation can be seen as a nexus between European policies on disaster resilience, energy efficiency and circularity in the building sector. An overview of policy measures for the seismic and energy upgrading of buildings across EU Member States identified only a few available measures for combined upgrading. A regulatory framework, financial instruments and digital tools similar to those for energy renovation, together with awareness and training, may promote integrated renovation. A framework for the regional prioritisation of building renovation was put forward, considering seismic risk, energy efficiency, and socioeconomic vulnerability independently and in an integrated way. Results indicate that the prioritisation of building renovation is a multidimensional problem. Depending on priorities, different integrated indicators should be used to inform policies and accomplish the highest relative or most widespread impact across different sectors. The framework was further extended to assess the impact of renovation scenarios across the EU with a focus on priority regions. Integrated renovation can provide a risk-proofed, sustainable, and inclusive built environment, presenting an economic benefit in the order of magnitude of the highest benefit among the separate interventions. Furthermore, it presents the unique capability of reducing fatalities and energy consumption at the same time and, depending on the scenario, to a greater extent. Y1 - 2022 SN - 978-92-76-60454-9 U6 - https://doi.org/10.2760/883122 SN - 1831-9424 SP - 1 EP - 68 PB - Publications Office of the European Union CY - Luxembourg ER - TY - CHAP A1 - Evans, Benjamin A1 - Braun, Sebastian A1 - Ulmer, Jessica A1 - Wollert, Jörg T1 - AAS implementations - current problems and solutions T2 - 20th International Conference on Mechatronics - Mechatronika (ME) N2 - The fourth industrial revolution presents a multitude of challenges for industries, one of which is the increased flexibility required of manufacturing lines as a result of increased consumer demand for individualised products.
One solution to tackle this challenge is the digital twin, more specifically the standardised model of a digital twin also known as the asset administration shell. The standardisation of an industry-wide communication tool is a critical step in enabling inter-company operations. This paper discusses the current state of asset administration shells, the frameworks used to host them, and the problems that need to be addressed. To tackle these issues, we propose an event-based server capable of drastically reducing response times between assets and asset administration shells, and a multi-agent system used for the orchestration and deployment of the shells in the field. KW - Industry 4.0 KW - Multi-agent Systems KW - Digital Twin KW - Asset Administration Shell Y1 - 2022 SN - 978-1-6654-1040-3 U6 - https://doi.org/10.1109/ME54704.2022.9982933 N1 - 20th International Conference on Mechatronics - Mechatronika (ME), 07-09 December 2022, Pilsen, Czech Republic PB - IEEE CY - New York, NY ER - TY - JOUR A1 - Baringhaus, Ludwig A1 - Gaigall, Daniel T1 - A goodness-of-fit test for the compound Poisson exponential model JF - Journal of Multivariate Analysis N2 - On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of raindays and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized by the property that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the function part of this equation by substituting the empirical counterparts of the transform, we obtain an expression, and the weighted integral of its square is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by using a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed. KW - Bootstrapping KW - Collective risk model Y1 - 2022 U6 - https://doi.org/10.1016/j.jmva.2022.105154 SN - 0047-259X SN - 1095-7243 VL - 195 IS - Article 105154 PB - Elsevier CY - Amsterdam ER - TY - JOUR A1 - Tran, Ngoc Trinh A1 - Trinh, Tu Luc A1 - Dao, Ngoc Tien A1 - Giap, Van Tan A1 - Truong, Manh Khuyen A1 - Dinh, Thuy Ha A1 - Staat, Manfred T1 - FEM shakedown analysis of structures under random strength with chance constrained programming JF - Vietnam Journal of Mechanics N2 - Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-carrying capacity of structures beyond the elastic range. In practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints.
If strength and loading are random quantities, the shakedown analysis can be formulated as a stochastic programming problem. In this paper, a method called chance constrained programming is presented, which is an effective method of stochastic programming to solve shakedown analysis problems under random conditions of strength. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed variable. KW - Limit analysis KW - Shakedown analysis KW - Chance constrained programming KW - Stochastic programming KW - Reliability of structures Y1 - 2022 U6 - https://doi.org/10.15625/0866-7136/17943 SN - 0866-7136 SN - 2815-5882 VL - 44 IS - 4 SP - 459 EP - 473 PB - Vietnam Academy of Science and Technology (VAST) ER - TY - JOUR A1 - Röthenbacher, Annika A1 - Cesari, Matteo A1 - Doppler, Christopher E.J. A1 - Okkels, Niels A1 - Willemsen, Nele A1 - Sembowski, Nora A1 - Seger, Aline A1 - Lindner, Marie A1 - Brune, Corinna A1 - Stefani, Ambra A1 - Högl, Birgit A1 - Bialonski, Stephan A1 - Borghammer, Per A1 - Fink, Gereon R. A1 - Schober, Martin A1 - Sommerauer, Michael T1 - RBDtector: an open-source software to detect REM sleep without atonia according to visual scoring criteria JF - Scientific Reports N2 - REM sleep without atonia (RSWA) is a key feature for the diagnosis of rapid eye movement (REM) sleep behaviour disorder (RBD). We introduce RBDtector, a novel open-source software tool to score RSWA according to the established SINBAR visual scoring criteria. We assessed muscle activity of the mentalis, flexor digitorum superficialis (FDS), and anterior tibialis (AT) muscles. RSWA was scored manually as tonic, phasic, and any activity by human scorers as well as using RBDtector in 20 subjects. Subsequently, 174 subjects (72 without RBD and 102 with RBD) were analysed with RBDtector to show the algorithm’s applicability. We additionally compared RBDtector estimates to a previously published dataset. RBDtector showed robust conformity with human scorings. The highest congruency was achieved for phasic and any activity of the FDS. Combining mentalis any and FDS any activity, RBDtector identified RBD subjects with 100% specificity and 96% sensitivity applying a cut-off of 20.6%. Comparable performance was obtained without manual artefact removal. RBD subjects also showed muscle bouts of higher amplitude and longer duration. RBDtector provides estimates of tonic, phasic, and any activity comparable to human scorings. RBDtector, which is freely available, can help identify RBD subjects and provides reliable RSWA metrics. Y1 - 2022 U6 - https://doi.org/10.1038/s41598-022-25163-9 SN - 2045-2322 VL - 12 IS - Article number: 20886 SP - 1 EP - 14 PB - Springer Nature CY - London ER - TY - JOUR A1 - Mueller, Tobias A1 - Segin, Alexander A1 - Weigand, Christoph A1 - Schmitt, Robert H. T1 - Feature selection for measurement models JF - International Journal of Quality & Reliability Management N2 - Purpose: In the determination of the measurement uncertainty, the GUM procedure requires the building of a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling as well as of quantifying the measurement uncertainties depends on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset.
Design/methodology/approach: In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, nine different FS methods were tested on 16 artificial test datasets, whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments. Findings: Based on a success metric, the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model. Originality/value: For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future. KW - Feature selection KW - Modelling KW - Measurement models KW - Measurement uncertainty Y1 - 2022 U6 - https://doi.org/10.1108/IJQRM-07-2021-0245 SN - 0265-671X IS - Vol. ahead-of-print, No. ahead-of-print PB - Emerald Group Publishing Limited CY - Bingley ER - TY - GEN A1 - Steuer-Dankert, Linda A1 - Bernhard, Sebastian A1 - Langolf, Jessica A1 - Leicht-Scholten, Carmen T1 - Managing change and acceptance of digitalization strategies - Implementing the vision of "Internet of Production" (IoP) in existing corporate structures T2 - Textile Impulse für die Zukunft: Aachen-Dresden-Denkendorf International Textile Conference 2022 N2 - The vision of the Internet of Production is to enable a new level of cross-domain collaboration by providing semantically adequate and context-aware data from production, development and usage in real time. Y1 - 2022 N1 - Textile Impulse für die Zukunft: Aachen-Dresden-Denkendorf International Textile Conference 2022, 1–2 December 2022, Eurogress Aachen SP - 153 EP - 153 ER - TY - JOUR A1 - Rübbelke, Dirk A1 - Vögele, Stefan A1 - Grajewski, Matthias A1 - Zobel, Luzy T1 - Hydrogen-based steel production and global climate protection: An empirical analysis of the potential role of a European cross border adjustment mechanism JF - Journal of Cleaner Production N2 - The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors like iron and steel. In order to prevent the relocation of such industries outside the EU in the course of tightening environmental regulations, the establishment of a climate club jointly with other large emitters and, alternatively, the unilateral implementation of an international cross-border carbon tax mechanism are proposed. This article focuses on the latter option, choosing the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes employed in Europe against more polluting competition from abroad. By using a floor price model, we assess the competitiveness of different steel production routes in selected countries. We evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model. It enables us to assess the impacts of rising costs of steel production on commodities using steel as intermediates.
Our results raise concerns that a cross-border tax mechanism will not suffice to bring about the competitiveness of hydrogen-based steel production in Europe, because the cost tends to remain higher than the cost of steel production in, for example, China. Steel is a classic example of a good used mainly as an intermediate for other products. Therefore, a cross-border tax mechanism for steel will increase the price of products produced in the EU that require steel as an input. This can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports. Y1 - 2022 U6 - https://doi.org/10.1016/j.jclepro.2022.135040 SN - 0959-6526 VL - 380 IS - Part 2, Article number: 135040 PB - Elsevier ER - TY - JOUR A1 - Monakhova, Yulia A1 - Soboleva, Polina M. A1 - Fedotova, Elena S. A1 - Musina, Kristina T. A1 - Burmistrova, Natalia A. T1 - Quantum chemical calculations of IR spectra of heparin disaccharide subunits JF - Computational and Theoretical Chemistry N2 - Heparin is a natural polysaccharide which plays an essential role in many biological processes. Alterations in its building blocks can modify the biological roles of commercial heparin products, due to significant changes in the conformation of the polymer chain. The structural variability of heparin leads to difficulties in quality control using different analytical methods, including infrared (IR) spectroscopy. In this paper, molecular modelling of heparin disaccharide subunits was performed using quantum chemistry. The structural and spectral parameters of these disaccharides have been calculated using RHF/6-311G. In addition, an over-sulphated chondroitin sulphate disaccharide was studied as one of the most widespread contaminants of heparin. The calculated IR spectra were analyzed with respect to specific structure parameters. The IR spectroscopic fingerprint was found to be sensitive to the substitution pattern of the disaccharide subunits. Vibrational assignments of the calculated spectra were correlated with experimental IR spectral bands of native heparin. Chemometrics was used to perform multivariate analysis of the simulated spectral data. KW - IR spectroscopy KW - Chemometrics KW - Quantum chemistry KW - Molecular modelling KW - Quality control Y1 - 2022 SN - 2210-271X U6 - https://doi.org/10.1016/j.comptc.2022.113891 VL - 1217 IS - Article number: 113891 PB - Elsevier CY - New York, NY ER - TY - JOUR A1 - Burger, René A1 - Lindner, Simon A1 - Rumpf, Jessica A1 - Do, Xuan Tung A1 - Diehl, Bernd W.K. A1 - Rehahn, Matthias A1 - Monakhova, Yulia A1 - Schulze, Margit T1 - Benchtop versus high field NMR: Comparable performance found for the molecular weight determination of lignin JF - Journal of Pharmaceutical and Biomedical Analysis N2 - Lignin is a promising renewable biopolymer being investigated worldwide as an environmentally benign substitute for fossil-based aromatic compounds, e.g. for use as an excipient with antioxidant and antimicrobial properties in drug delivery or even as an active compound. For its successful implementation into process streams, a quick, easy, and reliable method is needed for its molecular weight determination. Here we present a method using 1H spectra of benchtop as well as conventional NMR systems in combination with multivariate data analysis to determine lignin’s molecular weight (Mw and Mn) and polydispersity index (PDI).
A set of 36 organosolv lignin samples (from Miscanthus x giganteus, Paulownia tomentosa and Silphium perfoliatum) was used for the calibration and cross validation, and 17 samples were used as an external validation set. Validation errors between 5.6% and 12.9% were achieved for all parameters on all NMR devices (43, 60, 500 and 600 MHz). Surprisingly, no significant difference in the performance of the benchtop and high-field devices was found. This facilitates the application of this method for determining lignin’s molecular weight in an industrial environment because of the low maintenance expenditure, small footprint, ruggedness, and low cost of permanent magnet benchtop NMR systems. KW - NMR KW - PLS-regression KW - Molecular weight determination KW - Chemometrics KW - Biomass Y1 - 2022 SN - 0731-7085 U6 - https://doi.org/10.1016/j.jpba.2022.114649 VL - 212 IS - Article number: 114649 PB - Elsevier CY - New York, NY ER - TY - JOUR A1 - Monakhova, Yulia A1 - Diehl, Bernd W.K. T1 - Multinuclear NMR screening of pharmaceuticals using standardization by 2H integral of a deuterated solvent JF - Journal of Pharmaceutical and Biomedical Analysis N2 - An NMR standardization approach that uses the 2H integral of a deuterated solvent for the quantitative multinuclear analysis of pharmaceuticals is described. As a proof of principle, the existing NMR procedure for the analysis of heparin products according to the US Pharmacopeia monograph is extended to the determination of the Na+ and Cl- content in this matrix. Quantification is performed based on the ratio of a 23Na (35Cl) NMR integral and the 2H NMR signal of the deuterated solvent, D2O, acquired using the specific spectrometer hardware. As an alternative, the possibility of 133Cs standardization using the addition of a Cs2CO3 stock solution is shown. Validation characteristics (linearity, repeatability, sensitivity) are evaluated. Holistic NMR profiling of heparin products can now also be used for the quantitative determination of inorganic compounds in a single analytical run using a single sample. In general, the new standardization methodology provides an appealing alternative for the NMR screening of inorganic and organic components in pharmaceutical products. KW - NMR spectroscopy KW - Inorganic ions KW - Heparin KW - Standardization Y1 - 2022 SN - 0731-7085 U6 - https://doi.org/10.1016/j.jpba.2021.114530 VL - 209 IS - Article number: 114530 PB - Elsevier ER - TY - JOUR A1 - Lindner, Simon A1 - Burger, René A1 - Rutledge, Douglas N. A1 - Do, Xuan Tung A1 - Rumpf, Jessica A1 - Diehl, Bernd W. K. A1 - Schulze, Margit A1 - Monakhova, Yulia T1 - Is the calibration transfer of multivariate calibration models between high- and low-field NMR instruments possible? A case study of lignin molecular weight JF - Analytical Chemistry N2 - Although several successful applications of benchtop nuclear magnetic resonance (NMR) spectroscopy in quantitative mixture analysis exist, the possibility of calibration transfer remains mostly unexplored, especially between high- and low-field NMR. This study investigates for the first time the calibration transfer of partial least squares regressions [weight-average molecular weight (Mw) of lignin] between high-field (600 MHz) NMR and benchtop NMR devices (43 and 60 MHz). For the transfer, piecewise direct standardization, calibration transfer based on canonical correlation analysis, and transfer via the extreme learning machine auto-encoder method are employed.
Despite the immense resolution difference between high-field and low-field NMR instruments, the results demonstrate that the calibration transfer from high- to low-field is feasible in the case of a physical property, namely, the molecular weight, achieving validation errors close to the original calibration (down to only 1.2 times higher root mean square errors). These results introduce new perspectives for applications of benchtop NMR, in which existing calibrations from expensive high-field instruments can be transferred to cheaper benchtop instruments to economize. Y1 - 2022 SN - 1520-6882 U6 - https://doi.org/10.1021/acs.analchem.1c05125 VL - 94 IS - 9 SP - 3997 EP - 4004 PB - ACS Publications CY - Washington, DC ER - TY - CHAP A1 - Amir, Malik A1 - Bauckhage, Christian A1 - Chircu, Alina A1 - Czarnecki, Christian A1 - Knopf, Christian A1 - Piatkowski, Nico A1 - Sultanow, Eldar T1 - What can we expect from quantum (digital) twins? T2 - Wirtschaftsinformatik 2022 Proceedings N2 - Digital twins enable the modeling and simulation of real-world entities (objects, processes or systems), resulting in improvements in the associated value chains. The emerging field of quantum computing holds tremendous promise for evolving this virtualization towards Quantum (Digital) Twins (QDT) and ultimately Quantum Twins (QT). The quantum (digital) twin concept is not a contradiction in terms, but instead describes a hybrid approach that can be implemented using the technologies available today by combining classical computing and digital twin concepts with quantum processing. This paper presents the status quo of research and practice on quantum (digital) twins. It also discusses their potential to create competitive advantage through real-time simulation of highly complex, interconnected entities that helps companies better address changes in their environment and differentiate their products and services. KW - Artificial Intelligence KW - Digital Twin Evolution KW - Machine Learning KW - Quantum Computing KW - Quantum Machine Learning Y1 - 2022 N1 - 17. Internationale Tagung Wirtschaftsinformatik, 21–23 February 2022, Nürnberg (online) SP - 1 EP - 14 PB - AIS Electronic Library (AISeL) ER -