Planar and three-dimensional (3D) interdigitated electrodes (IDE) with electrode digits separated by insulating barriers of different heights were electrochemically characterized and compared in terms of their sensing properties. Due to the impact of the surface resistance, both types of IDE structures display non-linear behavior in low-ionic-strength solutions. The experimental data were fitted to an electrical equivalent circuit and interpreted taking the surface-charge-governed properties into account. The effect of a charged polyelectrolyte layer, electrostatically assembled onto the sensor surface, on the surface resistance was studied in solutions of different KCl concentrations. For the same electrode footprint, 3D-IDEs show a larger cell constant and a higher sensitivity to molecular adsorption than planar IDEs. These results demonstrate the potential of 3D-IDEs as a new transducer structure for direct, label-free sensing of charged molecules.
The high-k perovskite oxide barium strontium titanate (BST) is a very attractive multi-functional transducer material for the development of (bio-)chemical sensors. In this work, a Si-based sensor chip containing Pt interdigitated electrodes covered with a thin (485 nm) BST layer has been developed for multi-parameter chemical sensing. The chip has been applied for the contactless measurement of electrolyte conductivity, the detection of adsorbed charged macromolecules (positively charged polyethylenimine polyelectrolytes), and the measurement of hydrogen peroxide (H2O2) vapor concentration. Experimental results of the functional testing of the individual sensors are presented, and a mechanism for the BST sensitivity to charged polyelectrolytes and H2O2 vapor is proposed and discussed.
Supervised machine learning and deep learning require large amounts of labeled data, which data scientists obtain through a manual, time-consuming annotation process. To mitigate this challenge, Active Learning (AL) proposes promising data points for annotators to label next, instead of a sequential or random sample. This approach is intended to save annotation effort while maintaining model performance.
However, practitioners face many AL strategies for different tasks and need an empirical basis for choosing between them. Surveys categorize AL strategies into taxonomies without performance indications, and presentations of novel AL strategies compare their performance only to a small subset of existing strategies. Our contribution addresses this gap by introducing a reproducible active learning evaluation (ALE) framework for the comparative evaluation of AL strategies in NLP.
The framework allows AL strategies to be implemented with low effort and compared fairly and in a data-driven way by defining and tracking experiment parameters (e.g., initial dataset size, number of data points per query step, and the annotation budget). ALE helps practitioners make more informed decisions, and researchers can focus on developing new, effective AL strategies and deriving best practices for specific use cases; with such best practices, practitioners can lower their annotation costs. We present a case study to illustrate how to use the framework.
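The query loop and experiment parameters described in the abstract can be illustrated with a minimal uncertainty-sampling simulation. This is a hedged sketch, not the ALE framework's actual API: the synthetic one-dimensional pool, the threshold "model", and the `oracle` standing in for the human annotator are all illustrative assumptions.

```python
import random

# Illustrative active-learning loop (NOT the ALE framework's API).
# The experiment parameters mirror those the abstract says ALE tracks:
INITIAL_SIZE = 4   # initial labeled dataset size
QUERY_SIZE = 2     # number of data points per query step
BUDGET = 10        # total labels the annotator may provide

random.seed(0)

# Synthetic 1-D pool; the true label is 1 if x > 0.5, else 0.
# The oracle stands in for the human annotator.
pool = [random.random() for _ in range(100)]
oracle = lambda x: int(x > 0.5)

labeled = {x: oracle(x) for x in pool[:INITIAL_SIZE]}
unlabeled = pool[INITIAL_SIZE:]

def train(labeled):
    """'Model' = decision threshold halfway between the class means."""
    zeros = [x for x, y in labeled.items() if y == 0]
    ones = [x for x, y in labeled.items() if y == 1]
    if not zeros or not ones:
        return 0.5
    return (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2

def uncertainty(x, threshold):
    # Points closest to the decision threshold are the most uncertain.
    return -abs(x - threshold)

# Query loop: propose the most promising (uncertain) points to the
# annotator instead of a sequential or random sample, until the budget
# is exhausted.
while len(labeled) < INITIAL_SIZE + BUDGET and unlabeled:
    threshold = train(labeled)
    unlabeled.sort(key=lambda x: uncertainty(x, threshold), reverse=True)
    for x in unlabeled[:QUERY_SIZE]:
        labeled[x] = oracle(x)
    unlabeled = unlabeled[QUERY_SIZE:]

threshold = train(labeled)
accuracy = sum(int((x > threshold) == bool(oracle(x))) for x in pool) / len(pool)
print(f"labels used: {len(labeled)}, threshold: {threshold:.3f}, accuracy: {accuracy:.2f}")
```

Because the queried points cluster around the current decision boundary, the threshold converges toward the true boundary with far fewer labels than labeling the whole pool; a fair comparison of such strategies is exactly what fixing and tracking these parameters across runs enables.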