Quartz-crystal microbalances (QCMs) are commercially available mass sensors consisting mainly of a quartz resonator that oscillates at a characteristic frequency; this frequency shifts when the mass changes due to surface binding of molecules. In addition to mass changes, the viscosity of gases or liquids in contact with the sensor also shifts the resonance and influences the quality factor (Q-factor). Typical biosensor applications demand operation in liquid environments, where viscous damping strongly lowers the Q-factor. Reliable measurements in liquid environments therefore require excellent resonator control and signal processing, but standard resonator circuits such as the Pierce and Colpitts oscillators fail to establish stable resonances. Here we present a low-cost, compact and robust oscillator circuit comprising state-of-the-art, commercially available surface-mount technology components. It stimulates the QCM's oscillation while also establishing a control loop that regulates the applied voltage. The increased energy dissipation caused by strong viscous damping in liquid solutions can thereby be compensated and the oscillation stabilized. The presented circuit is suitable for compact biosensor systems using custom-made miniaturized QCMs in microfluidic environments. As a proof of concept, we used this circuit in combination with a customized microfabricated QCM in a microfluidic environment to measure the concentration of C-reactive protein (CRP) in buffer (PBS) down to concentrations as low as 5 μg mL⁻¹.
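The mass-to-frequency relation underlying such QCM measurements can be illustrated with the well-known Sauerbrey equation, a minimal sketch shown below. Note that Sauerbrey assumes a rigid mass load and neglects exactly the viscous damping in liquids that the presented circuit is designed to compensate; the numbers are standard quartz material constants, not values from the paper.

```python
import math

# Sauerbrey relation for a rigid mass load on a QCM:
#   delta_f = -2 * f0^2 * delta_m / (A * sqrt(rho_q * mu_q))
RHO_Q = 2648.0    # density of quartz, kg/m^3
MU_Q = 2.947e10   # shear modulus of AT-cut quartz, Pa

def sauerbrey_shift(f0_hz: float, delta_mass_kg: float, area_m2: float) -> float:
    """Resonance frequency shift (Hz) caused by a bound rigid mass."""
    return -2.0 * f0_hz ** 2 * delta_mass_kg / (area_m2 * math.sqrt(RHO_Q * MU_Q))

# Example: 1 ng of protein bound on 1 cm^2 of a 5 MHz resonator
df = sauerbrey_shift(5e6, 1e-12, 1e-4)
print(f"{df:.4f} Hz")  # about -0.057 Hz
```

The quadratic dependence on f0 is why miniaturized, higher-frequency QCMs (as used in the paper) gain mass sensitivity.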
An important part of computed tomography is the calculation of a three-dimensional reconstruction of an object from a series of X-ray images. Unfortunately, some applications do not provide sufficient X-ray images. The reconstructed objects then no longer truly represent the original, and inside the volumes the accuracy seems to vary unpredictably. In this paper, we introduce a novel method to evaluate any reconstruction, voxel by voxel. The evaluation is based on a sophisticated probabilistic handling of the measured X-rays, as well as the inclusion of a priori knowledge about the materials that the examined object consists of. For each voxel, the proposed method outputs a numerical value that represents the probability that a predefined material exists at the position of the voxel at the time of the X-ray examination. Such a probabilistic quality measure has been lacking so far. In our experiments, falsely reconstructed areas are detected by their low probability, while a high probability predominates in accurately reconstructed areas. Receiver Operating Characteristics not only confirm the reliability of our quality measure but also demonstrate that existing methods are less suitable for evaluating a reconstruction.
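How a Receiver Operating Characteristic validates such a per-voxel quality measure can be sketched generically (this is not the paper's method): each voxel carries a probability score and a ground-truth flag for whether it was reconstructed correctly, and the area under the ROC curve summarises how well the scores separate the two groups. The scores and labels below are hypothetical.

```python
def roc_auc(scores, labels):
    """Rank-based AUC: probability that a random positive voxel
    receives a higher score than a random negative voxel."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical per-voxel probabilities and correctness flags
scores = [0.95, 0.80, 0.70, 0.40, 0.20, 0.10]
labels = [1,    1,    1,    0,    1,    0]
print(roc_auc(scores, labels))  # 0.875
```

An AUC near 1.0 would support the paper's claim that low probabilities flag falsely reconstructed areas.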
Toward a service-based workflow for automated information extraction from herbarium specimens
(2018)
Over the past years, herbarium collections worldwide have started to digitize millions of specimens on an industrial scale. Although the imaging costs are steadily falling, capturing the accompanying label information is still predominantly done manually and is becoming the principal cost factor. In order to streamline the process of capturing herbarium specimen metadata, we specified a formal, extensible workflow integrating a wide range of automated specimen image analysis services. We implemented the workflow on the basis of OpenRefine together with a plugin for handling service calls and responses. The evolving system presently covers the generation of optical character recognition (OCR) output from specimen images, the identification of regions of interest in images, and the extraction of meaningful information items from the OCR output. These implementations were developed as part of the Deutsche Forschungsgemeinschaft funded project “A standardised and optimised process for data acquisition from digital images of herbarium specimens” (StanDAP-Herb).
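The final step of the workflow, extracting information items from OCR output, can be illustrated with a toy sketch (this is not the StanDAP-Herb implementation; the field patterns and label text are hypothetical).

```python
import re

def extract_items(ocr_text: str) -> dict:
    """Pull a collection date and collector name out of OCR'd label text."""
    items = {}
    # dates in the common European D.M.YYYY form
    date = re.search(r"\b(\d{1,2}\.\d{1,2}\.\d{4})\b", ocr_text)
    if date:
        items["date"] = date.group(1)
    # collector names conventionally follow the abbreviation "leg."
    coll = re.search(r"leg\.\s*([A-Z][A-Za-z\. ]*[A-Za-z])", ocr_text)
    if coll:
        items["collector"] = coll.group(1)
    return items

label = "Herbarium Berolinense\nCampanula rotundifolia L.\nleg. A. Schmidt 12.7.1931"
print(extract_items(label))  # {'date': '12.7.1931', 'collector': 'A. Schmidt'}
```

Real label parsing is far messier (multiple languages, handwriting, damaged labels), which is why the paper wraps such extraction in dedicated, swappable services.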
The topic of electromagnetic compatibility (EMC) remains an important aspect during the planning, installation and operation of automation systems. Communication networks such as PROFIBUS and PROFINET are known to be robust and reliable transmission systems. Nevertheless, a number of fundamental principles need to be observed to ensure fault-free operation over a long plant lifetime. This paper first describes a number of principles of EMC. On the basis of these principles, six recommendations for action are then developed which are to be observed during the planning of an automation system for use in the manufacturing industry. Finally, an overview is provided of future work for systems in the process industry.
The Ethernet-APL Engineering Process - A brief look at the Ethernet-APL engineering guideline
(2021)
The vision of an “Industrial Ethernet down to the sensors and actuators” has become reality. At the Achema fair in June 2021, Ethernet-APL was introduced. This technology is based on a 2-wire Ethernet that conveys information as well as energy to the sensors and actuators of the automation system. Ethernet-APL builds on the 2-wire Ethernet standard IEEE 802.3cg running at 10 Mbit/s. An additional specification, the Ethernet-APL Port Profile Specification, defines additional parameters for the use of the technology in the process industry, especially in areas with potentially explosive atmospheres. As a next step, potential users need to become familiar with the engineering process of Ethernet-APL networks. For this purpose, the Ethernet-APL project provides the Ethernet-APL Engineering Guideline, which covers the main areas of planning, installation and acceptance testing.
Network convergence is an increasing trend in the automation domain. More and more plant owners strive for a unification of the networks in their plants. This yields a seamless network structure, simplified supervision, and reduced training effort for the personnel, as only one unified network technology needs to be handled. Ethernet-APL is one piece of the puzzle for such a converged network, supporting various real-time protocols such as PROFINET, EtherNet/IP and HART-IP, as well as the middleware protocol OPC UA. This paper gives an overview of the impact of Ethernet-APL field devices on OT security and proposes how to ensure OT security for them.
This paper reflects the content of the presentation “The Next Generation: Ethernet-APL for Safety Systems” at the NAMUR Annual General Meeting 2022. It deals with the use of the Ethernet Advanced Physical Layer (Ethernet-APL) in combination with the PROFINET/PROFIsafe protocol for safety applications. It describes the virtues of digital communication between the field and the safety system. In parallel, the aspect of OT security for this use case is addressed as well. The paper proposes a secure architecture in which safety and non-safety field communications are still separated. Finally, a set of requirements for the development of future APL devices is described.
The PROFINET protocol has been extended in its current version to include security functions. This allows flexible network architectures that take OT security requirements into account to be designed for PROFINET, architectures that were previously not possible because of the required network segmentation. In addition to the manufacturers of the protocol stacks, component manufacturers are also required to provide a secure implementation in their devices. The necessary measures go beyond the use of a secure protocol stack. Using the example of an Ethernet-APL transmitter with PROFINET communication, this article shows which technical and organizational conditions PROFINET device manufacturers will have to consider in the future.
Ability of Black-Box Optimisation to Efficiently Perform Simulation Studies in Power Engineering
(2023)
In this study, the potential of so-called black-box optimisation (BBO) to increase the efficiency of simulation studies in power engineering is evaluated. Three algorithms ("Multilevel Coordinate Search" (MCS) and "Stable Noisy Optimization by Branch and Fit" (SNOBFIT) by Huyer and Neumaier, and "blackbox: A Procedure for Parallel Optimization of Expensive Black-box Functions" (blackbox) by Knysh and Korkolis) are implemented in MATLAB and compared on two use cases: the analysis of the maximum rotational speed of a gas turbine after a load rejection, and the identification of transfer-function parameters from measurements. The first use case has a high computational cost, whereas the second is computationally cheap. For each run of the algorithms, the accuracy of the found solution, the number of simulations or function evaluations needed to determine the optimum, and the overall runtime are used to assess the potential of the algorithms in comparison to currently used methods. All methods provide solutions for potential optima that are at least 99.8% accurate compared to the reference methods. The number of evaluations of the objective functions differs significantly but cannot be compared directly, as only the SNOBFIT algorithm stops when the found solution no longer improves, whereas the other algorithms use a predefined number of function evaluations. Consequently, SNOBFIT has the shortest runtime for both examples. For computationally expensive simulations, it is shown that parallelisation of the function evaluations (SNOBFIT and blackbox) and quantisation of the input variables (SNOBFIT) are essential for algorithmic performance. For the gas turbine overspeed analysis, only SNOBFIT can compete with the reference procedure in terms of runtime.
Further studies will have to investigate whether the quantisation of input variables can be applied to other algorithms and whether the BBO algorithms can outperform the reference methods for problems of higher dimensionality.
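The two ideas the abstract highlights, treating the simulation as an opaque objective and quantising its inputs so repeated evaluations can be cached, can be illustrated with a deliberately simple sketch. This is a generic grid coordinate search, not MCS, SNOBFIT or blackbox, and the quadratic objective stands in for an expensive simulation.

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # quantised points are cached: re-visits cost nothing
def objective(x_q: int, y_q: int) -> float:
    """Stand-in for an expensive simulation, evaluated only at grid points."""
    x, y = x_q * 0.1, y_q * 0.1   # de-quantise (grid step 0.1)
    return (x - 1.2) ** 2 + (y + 0.7) ** 2

def coordinate_search(start=(0, 0), max_steps=200):
    """Move to the best axis-neighbour until no neighbour improves."""
    best = start
    for _ in range(max_steps):
        neighbours = [(best[0] + dx, best[1] + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        cand = min(neighbours + [best], key=lambda p: objective(*p))
        if cand == best:          # grid-local optimum reached: stop early, like SNOBFIT
            break
        best = cand
    return best, objective(*best)

point, value = coordinate_search()
print(point)  # (12, -7), the grid point at the true optimum (1.2, -0.7)
```

The early-stopping criterion mirrors why SNOBFIT needed fewer evaluations than the fixed-budget algorithms, and the cache illustrates the benefit of input quantisation for expensive objectives.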