TY - JOUR
A1 - Ditzhaus, Marc
A1 - Gaigall, Daniel
T1 - A consistent goodness-of-fit test for huge dimensional and functional data
JF - Journal of Nonparametric Statistics
N2 - A nonparametric goodness-of-fit test for random variables with values in a separable Hilbert space is investigated. To verify the null hypothesis that the data come from a specific distribution, an integral-type test based on a Cramér-von-Mises statistic is suggested. The convergence in distribution of the test statistic under the null hypothesis is proved and the test's consistency is concluded. Moreover, properties under local alternatives are discussed. Applications are given for data of huge but finite dimension and for functional data in infinite-dimensional spaces. A general approach enables the treatment of incomplete data. In simulation studies the test competes with alternative proposals.
KW - Cramér-von-Mises statistic
KW - separable Hilbert space
KW - huge dimensional data
KW - functional data
Y1 - 2018
U6 - https://doi.org/10.1080/10485252.2018.1486402
SN - 1029-0311
VL - 30
IS - 4
SP - 834
EP - 859
PB - Taylor & Francis
CY - Abingdon
ER -

TY - JOUR
A1 - Baringhaus, Ludwig
A1 - Gaigall, Daniel
T1 - On an asymptotic relative efficiency concept based on expected volumes of confidence regions
JF - Statistics - A Journal of Theoretical and Applied Statistics
N2 - The paper deals with an asymptotic relative efficiency concept for confidence regions of multidimensional parameters that is based on the expected volumes of the confidence regions. Under standard conditions the asymptotic relative efficiencies of confidence regions are seen to be certain powers of the ratio of the limits of the expected volumes. These limits are explicitly derived for confidence regions associated with certain plug-in estimators, likelihood ratio tests and Wald tests. Under regularity conditions, the asymptotic relative efficiency of each of these procedures with respect to each one of its competitors is equal to 1. The results are applied to multivariate normal distributions and multinomial distributions in a fairly general setting.
KW - Volume of confidence regions
KW - asymptotic relative efficiency
KW - likelihood ratio test
KW - multivariate normal distribution
KW - multinomial distribution
Y1 - 2019
U6 - https://doi.org/10.1080/02331888.2019.1683560
SN - 1029-4910
VL - 53
IS - 6
SP - 1396
EP - 1436
PB - Taylor & Francis
CY - London
ER -

TY - JOUR
A1 - Baringhaus, Ludwig
A1 - Gaigall, Daniel
T1 - On an independence test approach to the goodness-of-fit problem
JF - Journal of Multivariate Analysis
N2 - Let X₁,…,Xₙ be independent and identically distributed random variables with distribution F. Assuming that there are measurable functions f:R²→R and g:R²→R characterizing a family F of distributions on the Borel sets of R in the way that the random variables f(X₁,X₂) and g(X₁,X₂) are independent if and only if F∈F, we propose to treat the testing problem H:F∈F, K:F∉F by applying a consistent nonparametric independence test to the bivariate sample variables (f(Xᵢ,Xⱼ),g(Xᵢ,Xⱼ)), 1⩽i,j⩽n, i≠j. A parametric bootstrap procedure needed to get critical values is shown to work. The consistency of the test is discussed. The power performance of the procedure is compared with that of the classical tests of Kolmogorov–Smirnov and Cramér–von Mises in the special cases where F is the family of gamma distributions or the family of inverse Gaussian distributions.
KW - Goodness-of-fit test
KW - Independence test
KW - Parametric bootstrap
KW - Vapnik–Chervonenkis class
KW - Gamma distribution
Y1 - 2015
U6 - https://doi.org/10.1016/j.jmva.2015.05.013
SN - 0047-259X
VL - 140
SP - 193
EP - 208
PB - Elsevier
CY - Amsterdam
ER -

TY - JOUR
A1 - Gaigall, Daniel
T1 - Rothman–Woodroofe symmetry test statistic revisited
JF - Computational Statistics & Data Analysis
N2 - The Rothman–Woodroofe symmetry test statistic is revisited on the basis of independent but not necessarily identically distributed random variables. Distribution-freeness is obtained if the underlying distributions are all symmetric and continuous. The results are applied for testing symmetry in a meta-analysis random effects model. The consistency of the procedure is discussed in this situation as well. A comparison with an alternative proposal from the literature is conducted via simulations. Real data are analyzed to demonstrate how the new approach works in practice.
Y1 - 2020
U6 - https://doi.org/10.1016/j.csda.2019.106837
SN - 0167-9473
VL - 142
SP - Article 106837
PB - Elsevier
CY - Amsterdam
ER -

TY - JOUR
A1 - Gaigall, Daniel
T1 - Test for Changes in the Modeled Solvency Capital Requirement of an Internal Risk Model
JF - ASTIN Bulletin
N2 - In the context of the Solvency II directive, the operation of an internal risk model is a possible way for risk assessment and for the determination of the solvency capital requirement of an insurance company in the European Union. A Monte Carlo procedure is customary to generate a model output. To be compliant with the directive, validation of the internal risk model is conducted on the basis of the model output. For this purpose, we suggest a new test for checking whether there is a significant change in the modeled solvency capital requirement. Asymptotic properties of the test statistic are investigated and a bootstrap approximation is justified. A simulation study investigates the performance of the test in the finite sample case and confirms the theoretical results. The internal risk model and the application of the test are illustrated in a simplified example. The method is more generally applicable to inference for a broad class of law-invariant and coherent risk measures on the basis of a paired sample.
KW - Bootstrap
KW - Empirical process
KW - Functional Delta Method
KW - Hadamard differentiability
KW - Paired sample
Y1 - 2021
U6 - https://doi.org/10.1017/asb.2021.20
SN - 1783-1350
VL - 51
IS - 3
SP - 813
EP - 837
PB - Cambridge Univ. Press
CY - Cambridge
ER -

TY - JOUR
A1 - Gaigall, Daniel
T1 - Testing marginal homogeneity of a continuous bivariate distribution with possibly incomplete paired data
JF - Metrika
N2 - We discuss the testing problem of homogeneity of the marginal distributions of a continuous bivariate distribution based on a paired sample with possibly missing components (missing completely at random). Applying the well-known two-sample Cramér–von-Mises distance to the remaining data, we determine the limiting null distribution of our test statistic in this situation. It is seen that a new resampling approach is appropriate for the approximation of the unknown null distribution. We prove that the resulting test asymptotically reaches the significance level and is consistent. Properties of the test under local alternatives are pointed out as well. Simulations investigate the quality of the approximation and the power of the new approach in the finite sample case. As an illustration, we apply the test to real data sets.
KW - Marginal homogeneity test
KW - Cramér–von-Mises distance
KW - Paired sample
KW - Incomplete data
KW - Resampling test
Y1 - 2019
U6 - https://doi.org/10.1007/s00184-019-00742-5
SN - 1435-926X
VL - 83
SP - 437
EP - 465
PB - Springer
ER -

TY - JOUR
A1 - Gaigall, Daniel
T1 - Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic on partly not identically distributed data
JF - Communications in Statistics - Theory and Methods
N2 - The established Hoeffding-Blum-Kiefer-Rosenblatt independence test statistic is investigated for partly not identically distributed data. Surprisingly, it turns out that the statistic has the well-known distribution-free limiting null distribution of the classical criterion under standard regularity conditions. An application is testing goodness-of-fit for the regression function in a nonparametric random effects meta-regression model, where the consistency is obtained as well. Simulations investigate size and power of the approach for small and moderate sample sizes. A real data example based on clinical trials illustrates how the test can be used in applications.
KW - Brownian Pillow
KW - Hoeffding-Blum-Kiefer-Rosenblatt independence test
KW - not identically distributed
KW - random effects meta-regression model
Y1 - 2020
U6 - https://doi.org/10.1080/03610926.2020.1805767
SN - 1532-415X
VL - 51
IS - 12
SP - 4006
EP - 4028
PB - Taylor & Francis
CY - London
ER -

TY - JOUR
A1 - Gaigall, Daniel
A1 - Gerstenberg, Julian
A1 - Trinh, Thi Thu Ha
T1 - Empirical process of concomitants for partly categorial data and applications in statistics
JF - Bernoulli
N2 - On the basis of independent and identically distributed bivariate random vectors, where the components are categorial and continuous variables, respectively, the related concomitants, also called induced order statistics, are considered. The main theoretical result is a functional central limit theorem for the empirical process of the concomitants in a triangular array setting. A natural application is hypothesis testing. An independence test and a two-sample test are investigated in detail. The fairly general setting enables limit results under local alternatives and bootstrap samples. For the comparison with existing tests from the literature, simulation studies are conducted. The empirical results obtained confirm the theoretical findings.
KW - bootstrap
KW - Categorial variable
KW - Concomitant
KW - Empirical process
KW - Independence test
Y1 - 2022
U6 - https://doi.org/10.3150/21-BEJ1367
SN - 1573-9759
VL - 28
IS - 2
SP - 803
EP - 829
PB - International Statistical Institute
CY - Den Haag, NL
ER -

TY - JOUR
A1 - Ditzhaus, Marc
A1 - Gaigall, Daniel
T1 - Testing marginal homogeneity in Hilbert spaces with applications to stock market returns
JF - Test
N2 - This paper considers a paired data framework and discusses the question of marginal homogeneity of bivariate high-dimensional or functional data. The related testing problem can be embedded into a more general setting for paired random variables taking values in a general Hilbert space. To address this problem, a Cramér–von-Mises type test statistic is applied and a bootstrap procedure is suggested to obtain critical values and finally a consistent test. The desired properties of the bootstrap test, namely asymptotic exactness under the null hypothesis and consistency under alternatives, are derived. Simulations show the quality of the test in the finite sample case. A possible application is the comparison of two possibly dependent stock market returns based on functional data. The approach is demonstrated based on historical data for different stock market indices.
Y1 - 2022
U6 - https://doi.org/10.1007/s11749-022-00802-5
SN - 1863-8260
VL - 31
SP - 749
EP - 770
PB - Springer
ER -

TY - JOUR
A1 - Baringhaus, Ludwig
A1 - Gaigall, Daniel
T1 - A goodness-of-fit test for the compound Poisson exponential model
JF - Journal of Multivariate Analysis
N2 - On the basis of bivariate data, assumed to be observations of independent copies of a random vector (S,N), we consider testing the hypothesis that the distribution of (S,N) belongs to the parametric class of distributions that arise with the compound Poisson exponential model. Typically, this model is used in stochastic hydrology, with N as the number of raindays and S as the total rainfall amount during a certain time period, or in actuarial science, with N as the number of losses and S as the total loss expenditure during a certain time period. The compound Poisson exponential model is characterized in the way that a specific transform associated with the distribution of (S,N) satisfies a certain differential equation. Mimicking the function part of this equation by substituting the empirical counterparts of the transform, we obtain an expression; the weighted integral of the square of this expression is used as the test statistic. We deal with two variants of the latter, one of which is invariant under scale transformations of the S-part by fixed positive constants. Critical values are obtained by using a parametric bootstrap procedure. The asymptotic behavior of the tests is discussed. A simulation study demonstrates the performance of the tests in the finite sample case. The procedure is applied to rainfall data and to an actuarial dataset. A multivariate extension is also discussed.
KW - Bootstrapping
KW - Collective risk model
Y1 - 2022
U6 - https://doi.org/10.1016/j.jmva.2022.105154
SN - 0047-259X
SN - 1095-7243
VL - 195
IS - Article 105154
PB - Elsevier
CY - Amsterdam
ER -

TY - JOUR
A1 - Tran, Ngoc Trinh
A1 - Trinh, Tu Luc
A1 - Dao, Ngoc Tien
A1 - Giap, Van Tan
A1 - Truong, Manh Khuyen
A1 - Dinh, Thuy Ha
A1 - Staat, Manfred
T1 - FEM shakedown analysis of structures under random strength with chance constrained programming
JF - Vietnam Journal of Mechanics
N2 - Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-carrying capacity of structures beyond the elastic range. In practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the shakedown analysis can be formulated as a stochastic programming problem. In this paper, chance constrained programming is presented, an effective method of stochastic programming for solving shakedown analysis problems under random strength conditions. In this study, the loading is deterministic, and the strength is a normally or lognormally distributed variable.
KW - limit analysis
KW - shakedown analysis
KW - chance constrained programming
KW - stochastic programming
KW - reliability of structures
Y1 - 2022
U6 - https://doi.org/10.15625/0866-7136/17943
SN - 0866-7136
SN - 2815-5882
VL - 44
IS - 4
SP - 459
EP - 473
PB - Vietnam Academy of Science and Technology (VAST)
ER -

TY - JOUR
A1 - Mueller, Tobias
A1 - Segin, Alexander
A1 - Weigand, Christoph
A1 - Schmitt, Robert H.
T1 - Feature selection for measurement models
JF - International Journal of Quality & Reliability Management
N2 - Purpose: In the determination of the measurement uncertainty, the GUM procedure requires the building of a measurement model that establishes a functional relationship between the measurand and all influencing quantities. Since the effort of modelling as well as of quantifying the measurement uncertainties depends on the number of influencing quantities considered, the aim of this study is to determine relevant influencing quantities and to remove irrelevant ones from the dataset. Design/methodology/approach: In this work, it was investigated whether the effort of modelling for the determination of measurement uncertainty can be reduced by the use of feature selection (FS) methods. For this purpose, 9 different FS methods were tested on 16 artificial test datasets whose properties (number of data points, number of features, complexity, features with low influence and redundant features) were varied via a design of experiments. Findings: Based on a success metric and on the stability, universality and complexity of the method, two FS methods could be identified that reliably identify relevant and irrelevant influencing quantities for a measurement model. Originality/value: For the first time, FS methods were applied to datasets with properties of classical measurement processes. The simulation-based results serve as a basis for further research in the field of FS for measurement models. The identified algorithms will be applied to real measurement processes in the future.
KW - Feature selection
KW - Modelling
KW - Measurement models
KW - Measurement uncertainty
Y1 - 2022
U6 - https://doi.org/10.1108/IJQRM-07-2021-0245
SN - 0265-671X
IS - Vol. ahead-of-print, No. ahead-of-print.
PB - Emerald Group Publishing Limited
CY - Bingley
ER -

TY - JOUR
A1 - Rübbelke, Dirk
A1 - Vögele, Stefan
A1 - Grajewski, Matthias
A1 - Zobel, Luzy
T1 - Hydrogen-based steel production and global climate protection: An empirical analysis of the potential role of a European cross border adjustment mechanism
JF - Journal of Cleaner Production
N2 - The European Union's aim to become climate neutral by 2050 necessitates ambitious efforts to reduce carbon emissions. Large reductions can be attained particularly in energy-intensive sectors like iron and steel. In order to prevent the relocation of such industries outside the EU in the course of tightening environmental regulations, the establishment of a climate club jointly with other large emitters and, alternatively, the unilateral implementation of an international cross-border carbon tax mechanism are proposed. This article focuses on the latter option, choosing the steel sector as an example. In particular, we investigate the financial conditions under which a European cross-border mechanism is capable of protecting hydrogen-based steel production routes employed in Europe against more polluting competition from abroad. By using a floor price model, we assess the competitiveness of different steel production routes in selected countries. We evaluate the climate friendliness of steel production on the basis of specific GHG emissions. In addition, we utilize an input-output price model, which enables us to assess the impacts of rising steel production costs on commodities that use steel as intermediates. Our results raise concerns that a cross-border tax mechanism will not suffice to bring about competitiveness of hydrogen-based steel production in Europe, because the costs tend to remain higher than those of steel production in, e.g., China. Steel is a classic example of a good used mainly as an intermediate for other products. Therefore, a cross-border tax mechanism for steel will increase the price of products produced in the EU that require steel as an input. This can in turn adversely affect the competitiveness of these sectors. Hence, the effects of higher steel costs on European exports should be borne in mind and could require the cross-border adjustment mechanism to also subsidize exports.
Y1 - 2022
U6 - https://doi.org/10.1016/j.jclepro.2022.135040
SN - 0959-6526
VL - 380
IS - Part 2, Article number: 135040
PB - Elsevier
ER -

TY - JOUR
A1 - Czarnecki, Christian
A1 - Winkelmann, Axel
A1 - Spiliopoulou, Myra
T1 - Services in electronic telecommunication markets: a framework for planning the virtualization of processes
JF - Electronic Markets
N2 - The potential of electronic markets in enabling innovative product bundles through flexible and sustainable partnerships is not yet fully exploited in the telecommunication industry. One reason is that bundling requires seamless de-assembling and re-assembling of business processes, whilst processes in telecommunication companies are often product-dependent and hard to virtualize. We propose a framework for the planning of the virtualization of processes, intended to assist the decision maker in prioritizing the processes to be virtualized: (a) we transfer the virtualization prerequisites stated by the Process Virtualization Theory to the context of customer-oriented processes in the telecommunication industry and assess their importance in this context, and (b) we derive IT-oriented requirements for the removal of virtualization barriers and highlight their demand for changes at different levels of the organization. We present a first evaluation of our approach in a case study and report on lessons learned and further steps to be performed.
KW - Telecommunication
KW - Services
KW - Process virtualization
KW - Product bundling
KW - Transformation
Y1 - 2010
U6 - https://doi.org/10.1007/s12525-010-0045-8
SN - 1422-8890
VL - 20
IS - 3-4
SP - 197
EP - 207
PB - Springer
CY - Berlin
ER -

TY - JOUR
A1 - Czarnecki, Christian
A1 - Spiliopoulou, Myra
T1 - A holistic framework for the implementation of a next generation network
JF - International Journal of Business Information Systems
N2 - As the potential of a next generation network (NGN) is recognised, telecommunication companies consider switching to it. Although the implementation of an NGN seems to be merely a modification of the network infrastructure, it may trigger or require changes in the whole company, because it builds upon the separation between service and transport, a flexible bundling of services to products and the streamlining of the IT infrastructure. We propose a holistic framework, structured into the layers 'strategy', 'processes' and 'information systems', and incorporate into each layer all concepts necessary for the implementation of an NGN, as well as the alignment of these concepts. As a first proof of concept for our framework, we have performed a case study on the introduction of an NGN in a large telecommunication company; we show that our framework captures all topics that are affected by an NGN implementation.
KW - next generation network
KW - telecommunication
KW - NGN
KW - IP-based networks
KW - product bundling
Y1 - 2012
U6 - https://doi.org/10.1504/IJBIS.2012.046291
SN - 1746-0972
VL - 9
IS - 4
SP - 385
EP - 401
PB - Inderscience Enterprises
CY - Olney, Bucks
ER -

TY - JOUR
A1 - Burger, René
A1 - Rumpf, Jessica
A1 - Do, Xuan Tung
A1 - Monakhova, Yulia
A1 - Diehl, Bernd W. K.
A1 - Rehahn, Matthias
A1 - Schulze, Margit
T1 - Is NMR combined with multivariate regression applicable for the molecular weight determination of randomly cross-linked polymers such as lignin?
JF - ACS Omega
N2 - The molecular weight properties of lignins are one of the key elements that need to be analyzed for a successful industrial application of these promising biopolymers. In this study, the use of 1H NMR as well as diffusion-ordered spectroscopy (DOSY NMR), combined with multivariate regression methods, was investigated for the determination of the molecular weight (Mw and Mn) and the polydispersity of organosolv lignins (n = 53; Miscanthus x giganteus, Paulownia tomentosa, and Silphium perfoliatum). The suitability of the models was demonstrated by cross-validation (CV) as well as by an independent validation set of samples from different biomass origins (beech wood and wheat straw). CV errors of ca. 7–9% and 14–16% were achieved for all parameters with the models from the 1H NMR spectra and the DOSY NMR data, respectively. The prediction errors for the validation samples were in a similar range for the partial least squares model from the 1H NMR data and for a multiple linear regression using the DOSY NMR data. The results indicate the usefulness of NMR measurements combined with multivariate regression methods as a potential alternative to more time-consuming methods such as gel permeation chromatography.
Y1 - 2021
U6 - https://doi.org/10.1021/acsomega.1c03574
SN - 2470-1343
VL - 6
IS - 44
SP - 29516
EP - 29524
PB - ACS Publications
CY - Washington, DC
ER -

TY - JOUR
A1 - Monakhova, Yulia
A1 - Diehl, Bernd W. K.
T1 - Simplification of NMR Workflows by Standardization Using 2H Integral of Deuterated Solvent as Applied to Aloe vera Preparations
JF - Applied Magnetic Resonance
N2 - In this study, a recently proposed NMR standardization approach based on the 2H integral of the deuterated solvent for quantitative multicomponent analysis of complex mixtures is presented. As a proof of principle, the existing NMR routine for the analysis of Aloe vera products was modified. Instead of using absolute integrals of the targeted compounds and the internal standard (nicotinamide) from 1H-NMR spectra, quantification was performed based on the ratio of a particular 1H-NMR compound integral and the 2H-NMR signal of the deuterated solvent D2O. Validation characteristics (linearity, repeatability, accuracy) were evaluated and the results showed that the method has the same precision as internal standardization in the case of multicomponent screening. Moreover, a dehydration process by freeze drying is not necessary for the new routine. Now, our NMR profiling of A. vera products needs only limited sample preparation and data processing. The new standardization methodology provides an appealing alternative for multicomponent NMR screening. In general, this novel approach, using standardization by the 2H integral, benefits from reduced sample preparation steps and uncertainties, and is recommended in different application areas (purity determination, forensics, pharmaceutical analysis, etc.).
Y1 - 2021
U6 - https://doi.org/10.1007/s00723-021-01393-4
SN - 1613-7507
VL - 52
IS - 11
SP - 1591
EP - 1600
PB - Springer
CY - Cham
ER -

TY - JOUR
A1 - Burmistrova, Natalia A.
A1 - Soboleva, Polina M.
A1 - Monakhova, Yulia
T1 - Is infrared spectroscopy combined with multivariate analysis a promising tool for heparin authentication?
JF - Journal of Pharmaceutical and Biomedical Analysis
N2 - The possibility of determining various characteristics of powder heparin (n = 115) with infrared spectroscopy was investigated. The evaluation of heparin samples included several parameters such as purity grade, distributing company and animal source, as well as heparin species (i.e. Na-heparin, Ca-heparin, and heparinoids). Multivariate analysis using principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), and partial least squares discriminant analysis (PLS-DA) was applied for the modelling of spectral data. Different pre-processing methods were applied to the IR spectral data; multiplicative scatter correction (MSC) was chosen as the most relevant. The obtained results were confirmed by nuclear magnetic resonance (NMR) spectroscopy. The good predictive ability of this approach demonstrates the potential of IR spectroscopy and chemometrics for the screening of heparin quality. This approach, however, is designed as a screening tool and is not considered a replacement for either of the methods required by USP and FDA.
KW - IR spectroscopy
KW - Heparin
KW - Authenticity
KW - Principal component analysis
KW - Soft independent modeling of class analogy
Y1 - 2021
SN - 0731-7085
U6 - https://doi.org/10.1016/j.jpba.2020.113811
VL - 194
IS - Article number: 113811
PB - Elsevier
CY - Amsterdam
ER -

TY - JOUR
A1 - Monakhova, Yulia
A1 - Diehl, Bernd W.K.
T1 - Novel approach of qNMR workflow by standardization using 2H integral: Application to any intrinsic calibration standard
JF - Talanta
N2 - Quantitative nuclear magnetic resonance (qNMR) is routinely performed by internal or external standardization. The manuscript describes a simple alternative to these common workflows that uses the NMR signal of another active nucleus of the calibration compound. For example, for any arbitrary compound, quantification by NMR can be based on the use of indirect concentration referencing that relies on a solvent having both 1H and 2H signals. To perform high-quality quantification, the deuteration level of the utilized deuterated solvent has to be estimated. In this contribution, the new method was applied to the determination of deuteration levels in different deuterated solvents (MeOD, ACN, CDCl3, acetone, benzene, DMSO-d6). Isopropanol-d6, which contains a defined number of deuterons and protons, was used for standardization. Validation characteristics (precision, accuracy, robustness) were calculated and the results showed that the method can be used in routine practice. The uncertainty budget was also evaluated. In general, this novel approach, using standardization by the 2H integral, benefits from reduced sample preparation steps and uncertainties, and can be applied in different application areas (purity determination, forensics, pharmaceutical analysis, etc.).
KW - qNMR
KW - Deuterium NMR
KW - Deuterated solvents
KW - Standardization
Y1 - 2021
SN - 0039-9140
U6 - https://doi.org/10.1016/j.talanta.2020.121504
VL - 222
IS - Article number: 121504
PB - Elsevier
ER -

TY - JOUR
A1 - Monakhova, Yulia
A1 - Soboleva, Polina M.
A1 - Fedotova, Elena S.
A1 - Musina, Kristina T.
A1 - Burmistrova, Natalia A.
T1 - Quantum chemical calculations of IR spectra of heparin disaccharide subunits
JF - Computational and Theoretical Chemistry
N2 - Heparin is a natural polysaccharide which plays an essential role in many biological processes. Alterations in its building blocks can modify the biological roles of commercial heparin products due to significant changes in the conformation of the polymer chain. The structural variability of heparin leads to difficulties in quality control with different analytical methods, including infrared (IR) spectroscopy. In this paper, molecular modelling of heparin disaccharide subunits was performed using quantum chemistry. The structural and spectral parameters of these disaccharides have been calculated using RHF/6-311G. In addition, over-sulphated chondroitin sulphate disaccharide was studied as one of the most widespread contaminants of heparin. Calculated IR spectra were analyzed with respect to specific structural parameters. The IR spectroscopic fingerprint was found to be sensitive to the substitution pattern of the disaccharide subunits. Vibrational assignments of the calculated spectra were correlated with experimental IR spectral bands of native heparin. Chemometrics was used to perform multivariate analysis of the simulated spectral data.
KW - IR spectroscopy
KW - Chemometrics
KW - Quantum chemistry
KW - Molecular modelling
KW - Quality control
Y1 - 2022
SN - 2210-271X
U6 - https://doi.org/10.1016/j.comptc.2022.113891
VL - 1217
IS - Article number: 113891
PB - Elsevier
CY - New York, NY
ER -