TY - CHAP
A1 - Gaigall, Daniel
T1 - On Consistent Hypothesis Testing In General Hilbert Spaces
T2 - Proceedings of the 4th International Conference on Statistics: Theory and Applications (ICSTA’22)
N2 - Inference on the basis of high-dimensional data and inference on the basis of functional data are two topics that are discussed frequently in the current statistical literature. A possibility to cover both topics in a single approach is to work in a very general space for the underlying observations, such as a separable Hilbert space. We propose a general method for consistent hypothesis testing on the basis of random variables with values in separable Hilbert spaces. We avoid concerns with the curse of dimensionality by means of a projection idea. We apply well-known test statistics from nonparametric inference to the projected data and integrate over all projections from a specific set and with respect to suitable probability measures. In contrast to classical methods, which are applicable to real-valued random variables or random vectors of dimensions lower than the sample size, the tests can be applied to random vectors of dimensions larger than the sample size or even to functional and high-dimensional data. In general, resampling procedures such as the bootstrap or permutation are suitable to determine critical values. The idea can be extended to the case of incomplete observations. Moreover, we develop an efficient algorithm for implementing the method. Examples are given for testing goodness-of-fit in a one-sample situation in [1] and for testing marginal homogeneity on the basis of a paired sample in [2]. Here, the test statistics in use can be seen as generalizations of the well-known Cramér-von Mises test statistics in the one-sample and two-sample cases. The treatment of other testing problems is possible as well. By using the theory of U-statistics, for instance, asymptotic null distributions of the test statistics are obtained as the sample size tends to infinity. Standard continuity assumptions ensure the asymptotic exactness of the tests under the null hypothesis and that the tests detect any alternative in the limit. Simulation studies demonstrate size and power of the tests in the finite sample case, confirm the theoretical findings, and are used for the comparison with competing procedures. A possible application of the general approach is inference for stock market returns, also at high data frequencies. In the field of empirical finance, statistical inference on stock market prices usually takes place on the basis of the related log-returns as data. In the classical models for stock prices, i.e., the exponential Lévy model, the Black-Scholes model, and the Merton model, properties such as independence and stationarity of the increments ensure an independent and identically distributed structure of the data. Specific trends during certain periods of the stock price processes can cause complications in this regard. In fact, our approach can compensate for those effects by treating the log-returns as random vectors or even as functional data.
Y1 - 2022
U6 - https://doi.org/10.11159/icsta22.157
N1 - 4th International Conference on Statistics: Theory and Applications (ICSTA’22), Prague, Czech Republic, July 28-30, 2022
SP - Paper No. 157
PB - Avestia Publishing
CY - Orléans, Canada
ER -
TY - CHAP
A1 - Tran, Ngoc Trinh
A1 - Trinh, Tu Luc
A1 - Dao, Ngoc Tien
A1 - Giap, Van Tan
A1 - Truong, Manh Khuyen
A1 - Dinh, Thuy Ha
A1 - Staat, Manfred
T1 - Limit and shakedown analysis of structures under random strength
T2 - Proceedings of (NACOME2022) The 11th National Conference on Mechanics, Vol. 1. Solid Mechanics, Rock Mechanics, Artificial Intelligence, Teaching and Training
N2 - Direct methods, comprising limit and shakedown analysis, are a branch of computational mechanics. They play a significant role in mechanical and civil engineering design. The concept of direct methods aims to determine the ultimate load-bearing capacity of structures beyond the elastic range. For practical problems, the direct methods lead to nonlinear convex optimization problems with a large number of variables and constraints. If strength and loading are random quantities, the problem of shakedown analysis is considered as stochastic programming. This paper presents a method called chance constrained programming, an effective method of stochastic programming, to solve the shakedown analysis problem under random conditions of strength. In our investigation, the loading is deterministic and the strength is distributed as a normal or lognormal variable.
KW - Reliability of structures
KW - Stochastic programming
KW - Chance constrained programming
KW - Shakedown analysis
KW - Limit analysis
Y1 - 2022
SN - 978-604-357-084-7
N1 - 11th National Conference on Mechanics (NACOME 2022), December 2-3, 2022, VNU University of Engineering and Technology, Hanoi, Vietnam
SP - 510
EP - 518
PB - Nha xuat ban Khoa hoc tu nhien va Cong nghe (Verlag Naturwissenschaft und Technik)
CY - Hanoi
ER -
TY - CHAP
A1 - Maurer, Florian
T1 - Framework to provide a simulative comparison of different energy market designs
T2 - Energy Informatics
N2 - Useful market simulations are key to the evaluation of different market designs consisting of multiple market mechanisms or rules. Yet a simulation framework which has a comparison of different market mechanisms in mind was not found. The need to create an objective view of different sets of market rules while investigating meaningful agent strategies leads to the conclusion that such a simulation framework is needed to advance research on this subject. An overview of different existing market simulation models is given, which also shows the research gap and the missing capabilities of those systems. Finally, a methodology is outlined for how a novel market simulation that can answer the research questions can be developed.
Y1 - 2022
U6 - https://doi.org/10.1186/s42162-022-00215-6
SN - 2520-8942
N1 - 11th DACH+ Conference on Energy Informatics, 15-16 September 2022, Freiburg, Germany
VL - 5
IS - 2, Article number: 12
SP - 18
EP - 20
PB - Springer Nature
ER -
TY - CHAP
A1 - Burgeth, Bernhard
A1 - Kleefeld, Andreas
A1 - Zhang, Eugene
A1 - Zhang, Yue
ED - Baudrier, Étienne
ED - Naegel, Benoît
ED - Krähenbühl, Adrien
ED - Tajine, Mohamed
T1 - Towards Topological Analysis of Non-symmetric Tensor Fields via Complexification
T2 - Discrete Geometry and Mathematical Morphology
N2 - Fields of asymmetric tensors play an important role in many applications such as medical imaging (diffusion tensor magnetic resonance imaging), physics, and civil engineering (for example, the Cauchy-Green deformation tensor, the strain tensor with local rotations, etc.). However, such asymmetric tensors are usually symmetrized and then further processed, and this procedure results in a loss of information. A new method for the processing of asymmetric tensor fields is proposed, restricting our attention to second-order tensors given by a 2x2 array or matrix with real entries. This is achieved by a transformation resulting in Hermitian matrices that have an eigendecomposition similar to symmetric matrices. With this new idea, numerical results for real-world data arising from a deformation of an object by external forces are given. It is shown that the asymmetric part indeed contains valuable information.
Y1 - 2022
SN - 978-3-031-19897-7
U6 - https://doi.org/10.1007/978-3-031-19897-7_5
N1 - Second International Joint Conference, DGMM 2022, Strasbourg, France, October 24–27, 2022
N1 - Corresponding author: Andreas Kleefeld
SP - 48
EP - 59
PB - Springer
CY - Cham
ER -
TY - CHAP
A1 - Tran, Ngoc Trinh
A1 - Staat, Manfred
T1 - FEM shakedown analysis of Kirchhoff-Love plates under uncertainty of strength
T2 - Proceedings of UNCECOMP 2021
N2 - A new formulation to calculate the shakedown limit load of Kirchhoff plates under stochastic conditions of strength is developed. Direct structural reliability design by chance constrained programming is based on prescribed failure probabilities, which is an effective approach of stochastic programming if it can be formulated as an equivalent deterministic optimization problem. We restrict uncertainty to the strength; the loading is still deterministic. A new formulation is derived for the case of random strength with lognormal distribution. Upper bound and lower bound shakedown load factors are calculated simultaneously by a dual algorithm.
Y1 - 2021
SN - 978-618-85072-6-5
U6 - https://doi.org/10.7712/120221.8041.19047
N1 - UNCECOMP 2021, 4th International Conference on Uncertainty Quantification in Computational Sciences and Engineering, streamed from Athens, Greece, 28–30 June 2021
SP - 323
EP - 338
ER -
TY - CHAP
A1 - Olderog, M.
A1 - Mohr, P.
A1 - Beging, Stefan
A1 - Tsoumpas, C.
A1 - Ziemons, Karl
T1 - Simulation study on the role of tissue-scattered events in improving sensitivity for a compact time of flight Compton positron emission tomograph
T2 - 2020 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC)
N2 - In positron emission tomography, improving time, energy and spatial detector resolutions and using Compton kinematics introduces the possibility to reconstruct a radioactivity distribution image from scatter coincidences, thereby enhancing image quality. The number of single scattered coincidences alone is of the same order of magnitude as the number of true coincidences. In this work, a compact Compton camera module based on monolithic scintillation material is investigated as a detector ring module. The detector interactions are simulated with the Monte Carlo package GATE. The scattering angle inside the tissue is derived from the energy of the scattered photon, which results in a set of possible scattering trajectories, or a broken line of response. The Compton kinematics collimation reduces the number of solutions. Additionally, the time of flight information helps localize the position of the annihilation. One of the questions of this investigation is how the energy, spatial and temporal resolutions help confine the possible annihilation volume. A comparison of currently technically feasible detector resolutions (under laboratory conditions) demonstrates the influence on this annihilation volume and shows that energy and coincidence time resolution have a significant impact. An enhancement of the latter from 400 ps to 100 ps reduces the annihilation volume by around 50%, while a change of the energy resolution in the absorber layer from 12% to 4.5% results in a reduction of 60%. The inclusion of single tissue-scattered data has the potential to increase the sensitivity of a scanner by a factor of 2 to 3. The concept can be further optimized and extended for multiple scatter coincidences and subsequently validated by a reconstruction algorithm.
Y1 - 2021
SN - 978-1-7281-7693-2
U6 - https://doi.org/10.1109/NSS/MIC42677.2020.9507901
N1 - 2020 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC), 31 Oct.-7 Nov. 2020, Boston, MA, USA
PB - IEEE
CY - New York, NY
ER -
TY - CHAP
A1 - Mandekar, Swati
A1 - Jentsch, Lina
A1 - Lutz, Kai
A1 - Behbahani, Mehdi
A1 - Melnykowycz, Mark
T1 - Earable design analysis for sleep EEG measurements
T2 - UbiComp '21
N2 - Conventional EEG devices cannot be used in everyday life and hence, research in the past decade has focused on Ear-EEG for mobile, at-home monitoring for various applications ranging from emotion detection to sleep monitoring. As the area available for electrode contact in the ear is limited, the electrode size and location play a vital role for an Ear-EEG system. In this investigation, we present a quantitative study of ear-electrodes with two electrode sizes at different locations in a wet and a dry configuration. Electrode impedance scales inversely with size and ranges from 450 kΩ to 1.29 MΩ for dry and from 22 kΩ to 42 kΩ for wet contact at 10 Hz. For any size, the location in the ear canal with the lowest impedance is ELE (Left Ear Superior), presumably due to increased contact pressure caused by the outer-ear anatomy. The results can be used to optimize signal pickup and SNR for specific applications. We demonstrate this by recording sleep spindles during sleep onset with high quality (5.27 μVrms).
KW - EEG
KW - sensors
KW - Impedance Spectroscopy
KW - Sleep EEG
KW - biopotential electrodes
Y1 - 2021
U6 - https://doi.org/10.1145/3460418.3479328
N1 - UbiComp '21: Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, September 21–26, 2021, Virtual, USA
SP - 171
EP - 175
ER -
TY - CHAP
A1 - Klöser, Lars
A1 - Kohl, Philipp
A1 - Kraft, Bodo
A1 - Zündorf, Albert
T1 - Multi-attribute relation extraction (MARE): simplifying the application of relation extraction
T2 - Proceedings of the 2nd International Conference on Deep Learning Theory and Applications DeLTA - Volume 1
N2 - Natural language understanding’s relation extraction makes innovative and encouraging novel business concepts possible and facilitates new digitized decision-making processes. Current approaches allow the extraction of relations with a fixed number of entities as attributes. Extracting relations with an arbitrary number of attributes requires complex systems and costly relation-trigger annotations to assist these systems. We introduce multi-attribute relation extraction (MARE) as an assumption-less problem formulation with two approaches, facilitating an explicit mapping from business use cases to the data annotations. Avoiding elaborated annotation constraints simplifies the application of relation extraction approaches. The evaluation compares our models to current state-of-the-art event extraction and binary relation extraction methods. Our approaches show improvements compared to these methods on the extraction of general multi-attribute relations.
Y1 - 2021
SN - 978-989-758-526-5
U6 - https://doi.org/10.5220/0010559201480156
N1 - 2nd International Conference on Deep Learning Theory and Applications, DeLTA 2021, July 7-9, 2021
SP - 148
EP - 156
PB - SciTePress
CY - Setúbal
ER -
TY - CHAP
A1 - Kohl, Philipp
A1 - Schmidts, Oliver
A1 - Klöser, Lars
A1 - Werth, Henri
A1 - Kraft, Bodo
A1 - Zündorf, Albert
T1 - STAMP 4 NLP – an agile framework for rapid quality-driven NLP applications development
T2 - Quality of Information and Communications Technology. QUATIC 2021
N2 - The progress in natural language processing (NLP) research over the last years offers novel business opportunities for companies, such as automated user interaction or improved data analysis. Building sophisticated NLP applications requires dealing with modern machine learning (ML) technologies, which impedes enterprises from establishing successful NLP projects. Our experience in applied NLP research projects shows that the continuous integration of research prototypes in production-like environments with quality assurance builds trust in the software and demonstrates its convenience and usefulness regarding the business goal. We introduce STAMP 4 NLP as an iterative and incremental process model for developing NLP applications. With STAMP 4 NLP, we merge software engineering principles with best practices from data science. Instantiating our process model allows efficiently creating prototypes by utilizing templates, conventions, and implementations, enabling developers and data scientists to focus on the business goals. Due to our iterative-incremental approach, businesses can deploy an enhanced version of the prototype to their software environment after every iteration, maximizing potential business value and trust early and avoiding the cost of successful yet never deployed experiments.
KW - Machine learning
KW - Process model
KW - Natural language processing
Y1 - 2021
SN - 978-3-030-85346-4
SN - 978-3-030-85347-1
U6 - https://doi.org/10.1007/978-3-030-85347-1_12
N1 - International Conference on the Quality of Information and Communications Technology, QUATIC 2021, 8-11 September 2021, Algarve, Portugal
SP - 156
EP - 166
PB - Springer
CY - Cham
ER -
TY - CHAP
A1 - Schmidts, Oliver
A1 - Kraft, Bodo
A1 - Winkens, Marvin
A1 - Zündorf, Albert
T1 - Catalog integration of heterogeneous and volatile product data
T2 - DATA 2020: Data Management Technologies and Applications
N2 - The integration of frequently changing, volatile product data from different manufacturers into a single catalog is a significant challenge for small and medium-sized e-commerce companies. They rely on timely integration of product data in order to present it aggregated in an online shop, without knowing the format specifications, the manufacturers' understanding of concepts, or the data quality. Furthermore, format, concepts, and data quality may change at any time. Consequently, integrating product catalogs into a single standardized catalog is often a laborious manual task. Current strategies to streamline or automate catalog integration use techniques based on machine learning, word vectorization, or semantic similarity. However, most approaches struggle with low-quality or real-world data. We propose Attribute Label Ranking (ALR) as a recommendation engine that simplifies for practitioners the integration of previously unknown, proprietary tabular formats into a standardized catalog. We evaluate ALR by focusing on the impact of different neural network architectures, language features, and semantic similarity. Additionally, we consider metrics for industrial application and present the impact of ALR in production and its limitations.
Y1 - 2021
SN - 978-3-030-83013-7
U6 - https://doi.org/10.1007/978-3-030-83014-4_7
N1 - International Conference on Data Management Technologies and Applications, DATA 2020, 7-9 July 2020
SP - 134
EP - 153
PB - Springer
CY - Cham
ER -
TY - CHAP
A1 - Bornheim, Tobias
A1 - Grieger, Niklas
A1 - Bialonski, Stephan
T1 - FHAC at GermEval 2021: Identifying German toxic, engaging, and fact-claiming comments with ensemble learning
T2 - Proceedings of the GermEval 2021 Workshop on the Identification of Toxic, Engaging, and Fact-Claiming Comments : 17th Conference on Natural Language Processing KONVENS 2021
Y1 - 2021
U6 - https://doi.org/10.48415/2021/fhw5-x128
N1 - KONVENS (Konferenz zur Verarbeitung natürlicher Sprache/Conference on Natural Language Processing) 2021, 6-9 September 2021, Düsseldorf
SP - 105
EP - 111
PB - Heinrich Heine University
CY - Düsseldorf
ER -
TY - CHAP
A1 - Sildatke, Michael
A1 - Karwanni, Hendrik
A1 - Kraft, Bodo
A1 - Schmidts, Oliver
A1 - Zündorf, Albert
T1 - Automated Software Quality Monitoring in Research Collaboration Projects
T2 - ICSEW'20: Proceedings of the IEEE/ACM 42nd International Conference on Software Engineering Workshops
N2 - In collaborative research projects, both researchers and practitioners work together to solve business-critical challenges. These projects often deal with ETL processes, in which humans extract information from non-machine-readable documents by hand. AI-based machine learning models can help to solve this problem. Since machine learning approaches are not deterministic, the quality of their output may decrease over time. This fact leads to an overall quality loss of the application which embeds the machine learning models. Hence, the software qualities in development and production may differ. Machine learning models are black boxes. That makes practitioners skeptical and increases the inhibition threshold for early productive use of research prototypes. Continuous monitoring of software quality in production offers an early response capability to quality loss and encourages the use of machine learning approaches. Furthermore, experts have to ensure that they integrate possible new inputs into the model training as quickly as possible. In this paper, we introduce an architecture pattern with a reference implementation that extends the concept of Metrics Driven Research Collaboration with automated software quality monitoring in productive use and a possibility to auto-generate new test data from documents processed in production. Through automated monitoring of the software quality and auto-generated test data, this approach ensures that the software quality meets and keeps requested thresholds in productive use, even during further continuous deployment and changing input data.
Y1 - 2020
U6 - https://doi.org/10.1145/3387940.3391478
N1 - ICSE '20: 42nd International Conference on Software Engineering, Seoul, Republic of Korea, 27 June 2020 - 19 July 2020
SP - 603
EP - 610
PB - IEEE
CY - New York, NY
ER -
TY - CHAP
A1 - Pohle-Fröhlich, Regina
A1 - Dalitz, Christoph
A1 - Richter, Charlotte
A1 - Hahnen, Tobias
A1 - Stäudle, Benjamin
A1 - Albracht, Kirsten
T1 - Estimation of muscle fascicle orientation in ultrasonic images
T2 - Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 5
N2 - We compare four different algorithms for automatically estimating the muscle fascicle angle from ultrasonic images: the vesselness filter, the Radon transform, the projection profile method and the gray-level co-occurrence matrix (GLCM). The algorithm results are compared to ground truth data generated by three different experts on 425 image frames from two videos recorded during different types of motion. The best agreement with the ground truth data was achieved by a combination of pre-processing with a vesselness filter and measuring the angle with the projection profile method. The robustness of the estimation is increased by applying the algorithms to subregions with high gradients and performing a LOESS fit through these estimates.
Y1 - 2020
SN - 978-989-758-402-2
U6 - https://doi.org/10.5220/0008933900790086
N1 - 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISAPP 2020, Valletta, Malta
SP - 79
EP - 86
PB - SciTePress
CY - Setúbal, Portugal
ER -
TY - CHAP
A1 - Schmidts, Oliver
A1 - Kraft, Bodo
A1 - Winkens, Marvin
A1 - Zündorf, Albert
T1 - Catalog integration of low-quality product data by attribute label ranking
T2 - Proceedings of the 9th International Conference on Data Science, Technology and Applications DATA - Volume 1
N2 - The integration of product data from heterogeneous sources and manufacturers into a single catalog is often still a laborious, manual task. Especially small- and medium-sized enterprises face the challenge of timely integrating the data their business relies on to have an up-to-date product catalog, due to format specifications, low quality of data and the requirement of expert knowledge. Additionally, modern approaches to simplify catalog integration demand experience in machine learning, word vectorization, or semantic similarity that such enterprises do not have. Furthermore, most approaches struggle with low-quality data. We propose Attribute Label Ranking (ALR), an easy to understand and simple to adapt learning approach. ALR leverages a model trained on real-world integration data to identify the best possible schema mapping of a previously unknown, proprietary, tabular format into a standardized catalog schema. Our approach predicts multiple labels for every attribute of an input column. The whole column is taken into consideration to rank among these labels. We evaluate ALR regarding the correctness of predictions and compare the results on real-world data to state-of-the-art approaches. Additionally, we report findings from our experiments and limitations of our approach.
Y1 - 2020
SN - 978-989-758-440-4
U6 - https://doi.org/10.5220/0009831000900101
N1 - 9th International Conference on Data Science, Technologies and Applications (DATA 2020), 7-9 July 2020, online
SP - 90
EP - 101
PB - SciTePress
CY - Setúbal, Portugal
ER -
TY - CHAP
A1 - Iomdina, Elena N.
A1 - Kiseleva, Anna A.
A1 - Kotliar, Konstantin
A1 - Luzhnov, Petr V.
T1 - Quantification of Choroidal Blood Flow Using the OCT-A System Based on Voxel Scan Processing
T2 - Proceedings of the International Conference on Biomedical Innovations and Applications - BIA 2020
N2 - The paper presents a method for the quantitative assessment of choroidal blood flow using an OCT-A system. The developed technique for the processing of OCT-A scans is divided into two stages. At the first stage, the boundaries in the selected portion were identified. At the second stage, each pixel mark on the selected layer was represented as a volume unit, a voxel, which characterizes the region of moving blood. Three geometric shapes were considered to represent the voxel. Using the example of one OCT-A scan, this work presents a quantitative assessment of the blood flow index. A possible modification of the two-stage algorithm based on voxel scan processing is presented.
Y1 - 2020
SN - 978-1-7281-7073-2
U6 - https://doi.org/10.1109/BIA50171.2020.9244511
N1 - International Conference on Biomedical Innovations and Applications, Varna, Bulgaria, September 24-27, 2020
SP - 41
EP - 44
PB - IEEE
CY - New York, NY
ER -
TY - CHAP
A1 - Schmidts, Oliver
A1 - Kraft, Bodo
A1 - Siebigteroth, Ines
A1 - Zündorf, Albert
T1 - Schema Matching with Frequent Changes on Semi-Structured Input Files: A Machine Learning Approach on Biological Product Data
T2 - Proceedings of the 21st International Conference on Enterprise Information Systems - Volume 1: ICEIS
Y1 - 2019
SN - 978-989-758-372-8
U6 - https://doi.org/10.5220/0007723602080215
SP - 208
EP - 215
ER -
TY - CHAP
A1 - Eschler, Eric
A1 - Wozniak, Felix
A1 - Richter, Christoph
A1 - Drechsler, Klaus
T1 - Materialanalyse an lokal verstärkten Triaxialgeflechten
T2 - Leichtbau in Forschung und industrieller Anwendung von der Nano- bis zur Makroebene, LLC, Landshuter Leichtbau-Colloquium, 9
Y1 - 2019
SN - 978-3-9818439-2-7
SP - 120
EP - 131
PB - Leichtbau Cluster
CY - Landshut
ER -
TY - CHAP
A1 - Hingley, Peter
A1 - Dikta, Gerhard
T1 - Finding a well performing Box-Jenkins forecasting model for annualised patent filings counts
T2 - International Symposium on Forecasting, Thessaloniki, Greece, June 2019
Y1 - 2019
ER -
TY - CHAP
A1 - Hunker, Jan
A1 - Jung, Alexander
A1 - Goßmann, Matthias
A1 - Linder, Peter
A1 - Staat, Manfred
ED - Staat, Manfred
ED - Erni, Daniel
T1 - Development of a tool to analyze the conduction speed in microelectrode array measurements of cardiac tissue
T2 - 3rd YRA MedTech Symposium 2019 : May 24 / 2019 / FH Aachen
N2 - The discovery of human induced pluripotent stem cells reprogrammed from somatic cells [1] and their ability to differentiate into cardiomyocytes (hiPSC-CMs) has provided a robust platform for drug screening [2]. Drug screenings are essential in the development of new compounds, particularly for evaluating the potential of drugs to induce life-threatening pro-arrhythmias. Between 1988 and 2009, 14 drugs were removed from the market for this reason [3]. The microelectrode array (MEA) technique is a robust tool for drug screening as it detects the field potentials (FPs) for the entire cell culture. Furthermore, the propagation of the field potential can be examined on an electrode basis. To analyze MEA measurements in detail, we have developed an open-source tool.
Y1 - 2019
SN - 978-3-940402-22-6
U6 - https://doi.org/10.17185/duepublico/48750
SP - 7
EP - 8
PB - Universität Duisburg-Essen
CY - Duisburg
ER -
TY - CHAP
A1 - Raman, Aravind Hariharan
A1 - Jung, Alexander
A1 - Horváth, András
A1 - Becker, Nadine
A1 - Staat, Manfred
ED - Staat, Manfred
ED - Erni, Daniel
T1 - Modification of a computer model of human stem cell-derived cardiomyocyte electrophysiology based on Patch-Clamp measurements
T2 - 3rd YRA MedTech Symposium 2019 : May 24 / 2019 / FH Aachen
N2 - Human induced pluripotent stem cells (hiPSCs) have been shown to be promising in disease studies and drug screenings [1]. Cardiomyocytes derived from hiPSCs have been extensively investigated using patch-clamping and optical methods to compare their electromechanical behaviour relative to fully matured adult cells. Mathematical models can be used for translating findings on hiPSC-CMs to adult cells [2] or to better understand the mechanisms of various ion channels when a drug is applied [3,4]. Paci et al. (2013) [3] developed the first model of hiPSC-CMs, which they later refined based on new data [3]. The model is based on iCells® (Fujifilm Cellular Dynamics, Inc. (FCDI), Madison WI, USA), but major differences among several cell lines and even within a single cell line have been found and motivate an approach for creating sample-specific models. We have developed an optimisation algorithm that parameterises the conductances (in S/F = Siemens/Farad) of the latest Paci et al. model (2018) [5] using current-voltage data obtained in individual patch-clamp experiments with an automated patch-clamp system (Patchliner, Nanion Technologies GmbH, Munich).
Y1 - 2019
SN - 978-3-940402-22-6
U6 - https://doi.org/10.17185/duepublico/48750
SP - 10
EP - 11
PB - Universität Duisburg-Essen
CY - Duisburg
ER -