Logical Observation Identifiers Names and Codes (LOINC) is a common terminology used for standardizing laboratory terms. Within the consortium of the HiGHmed project, LOINC is one of the central terminologies used for health data sharing across all university sites. Linking the LOINC codes to the site-specific tests and measures is therefore a crucial step toward this goal. In this work we report our ongoing efforts in implementing LOINC in our laboratory information system and research infrastructure, as well as our challenges and the lessons learned. In total, 407 local terms could be mapped to 376 LOINC codes, of which 209 are already available for routine laboratory data. In our experience, mapping local terms to LOINC is a largely manual and time-consuming process, owing to language barriers and the expert knowledge of local laboratory procedures that it requires.
The German Corona Consensus (GECCO) established a uniform dataset in FHIR format for exchanging and sharing interoperable COVID-19 patient-specific data between the health information systems (HIS) of universities. To share COVID-19 information with other sites that use openEHR, the data have to be converted into FHIR format. In this paper, we introduce our solution, a web tool named “openEHR-to-FHIR” that converts compositions from an openEHR repository and stores them in their respective GECCO FHIR profiles. The tool provides a REST web service for ad hoc conversion of openEHR compositions to FHIR profiles.
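As an illustration of the kind of mapping such a conversion performs, the sketch below turns a heavily simplified openEHR composition into a FHIR Observation resource. The field names and the code mapping are invented for illustration and do not reflect the actual GECCO profile definitions or the tool's API.

```python
# Hypothetical sketch of an openEHR-composition-to-FHIR conversion step.
# The composition layout and code mapping here are simplified, invented
# examples, not the actual GECCO FHIR profile definitions.

def composition_to_fhir_observation(composition: dict) -> dict:
    """Map a (simplified) openEHR composition to a FHIR Observation."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": composition["loinc_code"],
                "display": composition["name"],
            }]
        },
        "valueQuantity": {
            "value": composition["magnitude"],
            "unit": composition["units"],
        },
        "subject": {"reference": f"Patient/{composition['subject_id']}"},
    }

if __name__ == "__main__":
    comp = {  # toy composition as it might come from an openEHR repository
        "loinc_code": "94500-6",
        "name": "SARS-CoV-2 RNA [Presence]",
        "magnitude": 1.0,
        "units": "1",
        "subject_id": "123",
    }
    obs = composition_to_fhir_observation(comp)
    print(obs["resourceType"], obs["code"]["coding"][0]["code"])
```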
Renewable energy production is one of the fastest-growing markets, and further strong growth can be anticipated due to the desire for increased sustainability in many parts of the world. With the rising adoption of renewable power production, such facilities become increasingly attractive targets for cyber attacks. At the same time, the requirements for reliable production are rising. In this paper we propose a concept that improves the monitoring of renewable power plants by detecting anomalous behavior. The system not only detects an anomaly, it also provides reasoning for it: based on a specific mathematical model of the expected behavior, it gives detailed information about the various influential factors causing the alert. The set of influential factors can be configured in the system before learning normal behavior. The concept is based on multidimensional analysis and has been implemented and successfully evaluated on actual data from different providers of wind power plants.
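The reasoning step can be illustrated with a toy model: if normal behavior is approximated as a weighted combination of the configured influential factors, the per-factor contributions explain a flagged deviation. The model form, weights, factors, and threshold below are invented; the paper's actual mathematical model is more specific.

```python
import numpy as np

# Illustrative sketch (not the paper's actual model): expected power output
# is modeled as a weighted combination of configured influential factors;
# an alert is raised when the residual exceeds a threshold, and the
# per-factor contributions provide the "reasoning" behind the alert.

def explain_anomaly(weights, factors, observed, threshold=5.0):
    """Return (is_anomaly, residual, per-factor contributions)."""
    contributions = weights * factors   # contribution of each factor
    expected = contributions.sum()      # model's expected output
    residual = observed - expected
    return abs(residual) > threshold, residual, contributions

weights = np.array([0.9, 0.05, -0.2])    # learned from normal behavior (invented)
factors = np.array([100.0, 40.0, 10.0])  # e.g. wind speed, temperature, yaw error
flag, residual, contrib = explain_anomaly(weights, factors, observed=70.0)
print(flag, round(residual, 2), contrib)
```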
Purpose: Radiology reports mostly contain free-text, which makes it challenging to obtain structured data. Natural language processing (NLP) techniques transform free-text reports into machine-readable document vectors that are important for creating reliable, scalable methods for data analysis. The aim of this study is to classify unstructured radiograph reports according to fractures of the distal fibula and to find the best text mining method.
Materials & Methods: We established a novel German language report dataset: a designated search engine was used to identify radiographs of the ankle and the reports were manually labeled according to fractures of the distal fibula. This data was used to establish a machine learning pipeline, which implemented the text representation methods bag-of-words (BOW), term frequency-inverse document frequency (TF-IDF), principal component analysis (PCA), non-negative matrix factorization (NMF), latent Dirichlet allocation (LDA), and document embedding (doc2vec). The extracted document vectors were used to train neural networks (NN), support vector machines (SVM), and logistic regression (LR) to recognize distal fibula fractures. The results were compared via cross-tabulations of the accuracy (acc) and area under the curve (AUC).
Results: In total, 3268 radiograph reports were included, of which 1076 described a fracture of the distal fibula. Comparison of the text representation methods showed that BOW achieved the best results (AUC = 0.98; acc = 0.97), followed by TF-IDF (AUC = 0.97; acc = 0.96), NMF (AUC = 0.93; acc = 0.92), PCA (AUC = 0.92; acc = 0.9), LDA (AUC = 0.91; acc = 0.89) and doc2vec (AUC = 0.9; acc = 0.88). When comparing the different classifiers, NN (AUC = 0.91) proved to be superior to SVM (AUC = 0.87) and LR (AUC = 0.85).
Conclusion: An automated classification of unstructured reports of radiographs of the ankle can reliably detect findings of fractures of the distal fibula. A particularly suitable feature extraction method is the BOW model.
Key Points:
- The aim was to classify unstructured radiograph reports according to distal fibula fractures.
- Our automated classification system can reliably detect fractures of the distal fibula.
- A particularly suitable feature extraction method is the BOW model.
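The bag-of-words representation that performed best in this study can be sketched in a few lines; the vocabulary and reports below are invented English stand-ins for the German report corpus, not data from the study.

```python
from collections import Counter

# Toy bag-of-words (BOW) feature extraction: build a vocabulary over all
# reports, then represent each report as a vector of token counts.
# The reports are invented examples.

def build_vocabulary(reports):
    vocab = sorted({tok for report in reports for tok in report.lower().split()})
    return {tok: i for i, tok in enumerate(vocab)}

def bow_vector(report, vocab):
    counts = Counter(report.lower().split())
    return [counts.get(tok, 0) for tok in sorted(vocab, key=vocab.get)]

reports = [
    "fracture of the distal fibula",
    "no fracture regular ankle joint",
]
vocab = build_vocabulary(reports)
vectors = [bow_vector(r, vocab) for r in reports]
print(len(vocab), vectors[0])
```

These count vectors would then be fed to a classifier such as the logistic regression or neural network models compared in the study.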
The Wnt signaling pathway has been associated with many essential cell processes. This study aims to examine the effects of Wnt signaling on the proliferation of cultured HEK293T cells. Cells were incubated with Wnt3a, and the activation of the Wnt pathway was followed by analysis of the level of the β-catenin protein and of the expression levels of the target genes MYC and CCND1. The level of β-catenin protein increased up to fourfold. While the mRNA levels of c-Myc and cyclin D1 increased slightly, the protein levels increased up to a factor of 1.5. Remarkably, MTT and BrdU assays showed different results when measuring the proliferation rate of Wnt3a-stimulated HEK293T cells. In the BrdU assays an increase in the proliferation rate could be detected, which correlated with the applied Wnt3a concentration. In contrast, this correlation could not be shown in the MTT assays. The MTT results, which are based on mitochondrial activity, were confirmed by analysis of the succinate dehydrogenase complex by immunofluorescence and by western blotting. Taken together, our study shows that Wnt3a activates the proliferation of HEK293T cells. These effects can be detected by measuring DNA synthesis rather than by measuring changes in mitochondrial activity.
In industrial production facilities, technical Energy Management Systems are used to measure, monitor, and display energy-consumption-related information. The measurements take place at the field device level of the automation pyramid. The measured values are recorded and processed at the control level. The functionalities to monitor and display energy data are located at the MES level of the automation pyramid. Thus, the energy data from all PLCs have to be aggregated, structured, and provided to higher-level systems. This contribution introduces a concept for an Energy Data Aggregation Layer, which provides the functionality described above. For the implementation of this Energy Data Aggregation Layer, a combination of AutomationML and OPC UA is used.
Integrated Risk and Opportunity Management (IROM) goes far beyond what is found in organizations today. However, it offers the best opportunity not only to keep pace with the VUCA world, but to actually profit from it. Accordingly, the introduction of opportunity-based thinking in addition to risk-based thinking is part of the design specification for ISO 9000 and ISO 9001. The prerequisite for the successful design of an IROM is the individual definition, control and integration of risk and opportunity management processes, considering eight success factors, the "8 C". Top management benefits directly from the result: better, coordinated decision memos enable faster and more appropriate decisions.
Harmonisation of German Health Care Data Using the OMOP Common Data Model – A Practice Report
(2023)
Data harmonization is an important step in large-scale data analysis and in generating evidence from real-world data in healthcare. With the OMOP common data model, a relevant instrument for data harmonization is available that is being promoted by different networks and communities. At the Hannover Medical School (MHH) in Germany, an Enterprise Clinical Research Data Warehouse (ECRDW) has been established, and harmonization of this data source is the focus of this work. We present MHH’s first implementation of the OMOP common data model on top of the ECRDW data source and demonstrate the challenges of mapping German healthcare terminologies to a standardized format.
Autonomous and integrated passenger and freight transport (APFIT) is a promising approach to tackling both traffic-related and last-mile-related issues such as environmental emissions, social and spatial conflicts, and operational inefficiencies. By conducting an agent-based simulation, we shed light on this widely unexplored research topic and provide first indications regarding influential target figures of such a system in the rural area of Sarstedt, Germany. Our results show that larger fleets entail inefficiencies due to suboptimal utilization of monetary and material resources and increase traffic volume, while higher numbers of unused vehicles may exacerbate spatial conflicts. Nevertheless, to meet the given demand within our study area, a comparatively large fleet of about 25 vehicles is necessary to provide reliable service, assuming maximum passenger waiting times of six minutes, at the expense of higher standby times, rebalancing effort, and higher costs for vehicle acquisition and maintenance.
The NOA project collects and stores images from open access publications and makes them findable and reusable. During the project, a focus group workshop was held to determine whether the development was addressing researchers’ needs. It took place before the second half of the project so that the results could inform further development, since addressing users’ needs is a central part of the project. The focus was to find out what content and functionality researchers expect from image repositories.
In a first step, participants were asked to fill out a survey about their image use. Secondly, they tested different use cases on the live system. The first finding is that users have a need to find scholarly images, but it is not a routine task, and they often do not know any image repositories. This is another reason for repositories to become more open and to reach users by integrating with other content providers. The second finding is that users paid attention to image licenses but struggled to find and interpret them, while also being unsure how to cite images. In general, there is a high demand for reusing scholarly images, but the existing infrastructure has room to improve.
Building a well-founded understanding of the concepts, tasks, and limitations of IT in all areas of society is an essential prerequisite for future developments in business and research. This applies in particular to the healthcare sector and medical research, which are affected by the noticeable advances in digitization. In the transfer project “Zukunftslabor Gesundheit” (ZLG), a teaching framework was developed to support the creation of online continuing-education courses in order to teach heterogeneous groups of learners independently of location and prior knowledge. This study describes the development and components of the framework.
Powder bed-based additive manufacturing processes offer extended design freedom and enable the processing of metals, ceramics, and polymers with a high level of relative density. The latter is a prevalent measure of process and component quality, which depends on various input variables. A key point in this context is the condition of the powder bed. To enhance comprehension of its particle-level formation and to facilitate process optimization, simulations based on the Discrete Element Method are increasingly employed in research. To generate qualitatively as well as quantitatively reliable simulation results, an adaptation of the contact model parameterization is necessary. However, current adaptation methods often require the implementation of models that significantly increase computational effort, thereby limiting their applicability. To overcome this obstacle, a formula-based adaptation and evaluation method is presented in this research. The developed method additionally enables accelerated parameter determination with limited experimental effort. It thus represents an integrative component that supports further research based on the Discrete Element Method by significantly reducing the parameterization effort. The universal nature of the method's derivation also allows its adaptation to similar parameterization problems and its implementation in other fields of research.
Pathologists need to identify abnormal changes in tissue. With advancing digitalization, tissue slides are stored digitally, enabling pathologists to annotate regions of interest with the support of software tools. PathoLearn is a web-based learning platform developed explicitly for the teacher-student scenario, in which students learn to identify potentially abnormal changes. Artificial intelligence (AI) and machine learning (ML) have become very important in medicine. Many health sectors already utilize AI and ML, and this will only increase in the future, including in the field of pathology. It is therefore important to teach students the fundamentals and concepts of AI and ML early in their studies. However, creating and training AI models generally requires knowledge of programming and technical details. This thesis evaluates how this barrier can be overcome by comparing existing end-to-end AI platforms and teaching tools for AI. It was shown that a visual programming editor offers a fitting abstraction for creating neural networks without programming. This was extended with real-time collaboration to enable students to work in groups. Additionally, an automatic training feature was implemented, removing the necessity to know technical details about training neural networks.
After kidney transplantation, graft rejection must be prevented. Therefore, a multitude of patient parameters is observed pre- and postoperatively. To support this process, the Screen Reject research project is developing a data warehouse optimized for kidney rejection diagnostics. In the course of this project it was discovered that important information is only available in the form of free text instead of structured data and can therefore not be processed by standard ETL tools, which is necessary to establish a digital expert system for rejection diagnostics. For this reason, data integration has been improved by combining methods from natural language processing with methods from image processing. Based on state-of-the-art data warehousing technologies (Microsoft SSIS), a generic data integration tool has been developed. The tool was evaluated by extracting Banff classifications from 218 pathology reports and extracting HLA mismatches from about 1700 PDF files, both written in German.
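Rule-based extraction of a Banff category from free text can be sketched as below; the regular expression and the sample report are invented for illustration and do not reflect the project's actual extraction rules.

```python
import re

# Hypothetical sketch of rule-based free-text extraction: a regular
# expression pulls a Banff category out of a German pathology report.
# The pattern and report text are invented examples.

BANFF_RE = re.compile(
    r"Banff[- ](?:Kategorie|Klasse)?\s*(\d[A-Za-z]?)", re.IGNORECASE
)

def extract_banff(report_text: str):
    """Return the first Banff category found, or None."""
    match = BANFF_RE.search(report_text)
    return match.group(1) if match else None

report = ("Transplantatbiopsie: akute T-Zell-vermittelte Rejektion, "
          "Banff-Kategorie 4.")
print(extract_banff(report))
```

In a pipeline such as the one described, the extracted value would then be loaded into the data warehouse as a structured field.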
In this poster we present the ongoing development of an integrated free and open source toolchain for semantic annotation of digitised cultural heritage. The toolchain development involves the specification of a common data model that aims to increase interoperability across diverse datasets and to enable new collaborative research approaches.
This paper aims to provide a structured overview of four open, participatory formats that are particularly applicable in inquiry-based teaching and learning contexts: hackathons, book sprints, barcamps, and learning circles. Using examples, mostly from the work and experience context of the Open Science Lab at TIB Hannover, we address concrete processes, working methods, possible outcomes and challenges.
The compilation offers an introduction to the topic and is intended to provide tools for testing in practice.
Techno-economic analyses that allocate costs to the energy flows of energy systems are helpful for understanding the formation of costs within processes and for increasing cost efficiency. For the economic evaluation, the usefulness or quality of the energy is of great importance. In exergy-based methods, this is considered by allocating costs to the exergy instead of the energy. As exergy represents the ability to perform work, it is often called the useful part of energy. In contrast, the anergy, the part of energy that cannot perform work, is often assumed not to be useful.
However, heat flows as used, e.g., in domestic heating are always a mixture of a relatively small portion of exergy and a large portion of anergy. Although of lower quality, the anergy is obviously useful for these applications. The question is whether it makes sense to differentiate between exergy and anergy and to take both properties into account for the economic evaluation.
To answer this question, a new methodical concept based on the definition of an anergy-exergy cost ratio is compared to the commonly applied approaches of considering either energy or exergy as the basis for economic evaluation. These three approaches to the economic analysis of thermal energy systems are applied to an exemplary heating system with thermal storages. It is shown that the results of the techno-economic analysis can be improved by giving anergy an economic value, and that the proposed anergy-exergy cost ratio allows a flexible adaptation of the evaluation depending on the economic constraints of a system.
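The idea of giving anergy an economic value can be made concrete with a small worked example: the exergy share of a heat flow follows from the Carnot factor, and an assumed anergy-exergy cost ratio then splits the specific costs between the two. All temperatures, quantities, and the ratio below are illustrative, not values from the paper.

```python
# Worked sketch: split a heat flow into exergy and anergy via the Carnot
# factor, then assign costs using an assumed anergy-exergy cost ratio
# r = c_anergy / c_exergy. All numbers are invented illustrations.

def exergy_fraction(t_hot_k: float, t_ambient_k: float) -> float:
    """Carnot factor: share of a heat flow at t_hot that is exergy."""
    return 1.0 - t_ambient_k / t_hot_k

def allocate_costs(heat_kwh, t_hot_k, t_amb_k, cost_eur, anergy_exergy_ratio):
    """Return specific costs (EUR/kWh) for exergy and anergy."""
    ex = heat_kwh * exergy_fraction(t_hot_k, t_amb_k)  # exergy content
    an = heat_kwh - ex                                  # anergy content
    # solve c_ex * ex + r * c_ex * an = cost_eur for c_ex
    c_ex = cost_eur / (ex + anergy_exergy_ratio * an)
    return c_ex, anergy_exergy_ratio * c_ex

c_ex, c_an = allocate_costs(heat_kwh=1000.0, t_hot_k=333.15, t_amb_k=283.15,
                            cost_eur=80.0, anergy_exergy_ratio=0.2)
print(round(c_ex, 3), round(c_an, 3))
```

With a ratio of zero this reduces to the pure exergy-based approach; with a ratio of one, to the energy-based approach.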
Parametric study of piezoresistive structures in continuous fiber reinforced additive manufacturing
(2024)
Recent advancements in fiber reinforced additive manufacturing leverage the piezoresistivity of continuous carbon fibers. This effect enables the fabrication of structural components with inherent piezoresistive properties suitable for load measurement or structural monitoring. These are achieved without necessitating additional manufacturing or assembly procedures. However, there remain unexplored variables within the domain of continuous fiber-reinforced additive manufacturing. Crucially, the roles of fiber curvature radii and sensing fiber bundle counts have yet to be comprehensively addressed. Additionally, the compression-sensitive nature of printed carbon fiber-reinforced specimens remains a largely unexplored research area. To address these gaps, this study presents experimental analyses on tensile and three-point flexural specimens incorporating sensing carbon fiber strands. All specimens were fabricated with three distinct curvature radii. For the tensile specimens, the number of layers was also varied. Sensing fiber bundles were embedded on both tensile and compression sides of the flexural specimens. Mechanical testing revealed a linear-elastic behavior in the specimens. It was observed that carbon fibers supported the majority of the load, leading to brittle fractures. The resistance measurements showed a dependence on both the number of sensing layers and the radius of curvature, and exhibited a slight decreasing trend in the cyclic tests. Compared with the sensors subjected to tensile stress, the sensors embedded on the compression side showed a lower gauge factor.
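The sensing behavior reported above is commonly quantified by the gauge factor, the relative resistance change per unit strain; the short sketch below, with invented resistance and strain values, illustrates the calculation.

```python
# Gauge-factor sketch: relates the relative resistance change of an
# embedded sensing carbon fiber strand to mechanical strain.
# GF = (dR / R0) / strain. All values below are invented illustrations.

def gauge_factor(delta_r: float, r0: float, strain: float) -> float:
    """Return the gauge factor for a resistance change delta_r at a given strain."""
    return (delta_r / r0) / strain

# e.g. a 0.6-ohm change on a 100-ohm strand at 0.3 % strain
gf = gauge_factor(delta_r=0.6, r0=100.0, strain=0.003)
print(gf)
```

A lower gauge factor, as observed for the compression-side sensors, means a smaller resistance change for the same strain.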
This research focuses on the fundamental ideas and underlying principles of E-Learning technology, as well as theoretical considerations for an optimal learning environment. This theoretical exploration was then used as a basis for the design and construction of a new, interactive Web-Based ESH-Training. The quality and effectiveness of this new course was then compared with that of the existing analog PDF-Training via a test with a diverse sample of employee learners. Learners were later surveyed to ascertain their views on both trainings in terms of the quality of the content, facilitator, resources, and length. Results clearly showed that regardless of demographic factors, most employee learners preferred the new, Web-Based ESH-Training to the analog PDF-Training.
Compounds that exhibit the spin crossover effect are known to change spin states in response to external stimuli. This reversible switching of spin states is accompanied by a change in the properties of the compound. Complexes that exhibit this behavior at ambient temperature, such as iron(II)-triazole complexes, are often discussed for potential applications. In previous studies we synthesized iron(II)-triazole complexes and implemented them into electrospun nanofibers. In first studies, we used Mössbauer spectroscopy to prove a successful implementation while maintaining the spin crossover properties. Further studies of ours showed that it is possible to use different electrospinning methods to either implement the synthesized solid SCO material into the polymer nanofibers or deposit it onto them. We have now used a solvent in which both the iron(II)-triazole complex [Fe(atrz)3](2 ns)2 and three different polymers (polyacrylonitrile, polymethylmethacrylate, and polyvinylpyrrolidone) are soluble. This should lead to a more homogeneous distribution of the complex along the nanofibers. Mössbauer spectroscopy and other measurements are used to show a successful implementation without any significant changes to the complex.
Complexes like iron(II)-triazoles exhibit spin crossover behavior at ambient temperature and are often considered for possible applications. In previous studies, we implemented complexes of this type into polymer nanofibers and first polymer-based optical waveguide sensor systems. In our current study, we synthesized complexes of this type, implemented them into polymers, and obtained composites through drop casting and doctor blading. We show that a certain combination of polymer and complex can lead to composites with high potential for optical devices. For this purpose, we used two different complexes, [Fe(atrz)3](2 ns)2 and [Fe(atrz)3]Cl1.5(BF4)0.5, with different polymers for each composite. We show through transmission measurements and UV/VIS spectroscopy that the optical properties of these composite materials can change reversibly due to the spin crossover effect.
The increasing variety of combinations of different building technology components offers a high potential for energy and cost savings in today's buildings. However, in most cases, this potential is not yet fully exploited due to the lack of intelligent supervisory control systems that are required to manage the complexity of the resulting overall systems. In this article, we present the implementation of a mixed-integer nonlinear model predictive control approach as a smart real-time building energy management system. The presented methodology is based on a forward-looking optimization of the overall energy costs. It takes into account energy demand forecasts and varying electricity market prices. We achieve real-time capability of the controller by applying a decomposition approach, which approximates the optimal solution of the underlying mixed-integer optimal control problem by convexification and rounding of the relaxed solution. The quality of the suboptimal solution is evaluated by comparison with the globally optimal solution obtained by the dynamic programming method. Based on a real-world scenario, we demonstrate that utilization of the real-time capable mixed-integer nonlinear model predictive control approach in a building control system leads to savings of 16% in the total operating costs and 13% in primary energy compared to the state-of-the-art control strategy without any loss of comfort for the residents.
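The rounding step of such a decomposition can be illustrated with sum-up rounding, a standard scheme for turning a relaxed (continuous) on/off trajectory into a binary one while keeping the accumulated control close to the relaxation. The paper's exact rounding scheme may differ, and the relaxed trajectory below is invented.

```python
# Sketch of "sum-up rounding": given the relaxed on/off values in [0, 1]
# from the convexified optimal control problem, choose binary controls so
# that the running sum of the binary trajectory tracks the running sum of
# the relaxed one. Illustrative only; the paper's scheme may differ.

def sum_up_rounding(relaxed):
    """Round values in [0, 1] to {0, 1}, keeping accumulated sums close."""
    binary, accumulated = [], 0.0
    for value in relaxed:
        accumulated += value
        # switch on when the accumulated deficit reaches half a step
        b = 1 if accumulated - sum(binary) >= 0.5 else 0
        binary.append(b)
    return binary

relaxed = [0.3, 0.6, 0.8, 0.2, 0.1]  # invented relaxed on/off trajectory
print(sum_up_rounding(relaxed))
```

The binary trajectory activates the device during the time steps where the relaxed solution accumulates the most "on" fraction, which is what makes the rounded solution a good approximation of the relaxed optimum.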
On November 30th, 2022, OpenAI released the large language model ChatGPT, an extension of GPT-3. The AI chatbot provides real-time communication in response to users’ requests. The quality of ChatGPT’s natural speaking answers marks a major shift in how we will use AI-generated information in our day-to-day lives. For a software engineering student, the use cases for ChatGPT are manifold: assessment preparation, translation, and creation of specified source code, to name a few. It can even handle more complex aspects of scientific writing, such as summarizing literature and paraphrasing text. Hence, this position paper addresses the need for discussion of potential approaches for integrating ChatGPT into higher education. Therefore, we focus on articles that address the effects of ChatGPT on higher education in the areas of software engineering and scientific writing. As ChatGPT was only recently released, there have been no peer-reviewed articles on the subject. Thus, we performed a structured grey literature review using Google Scholar to identify preprints of primary studies. In total, five out of 55 preprints are used for our analysis. Furthermore, we held informal discussions and talks with other lecturers and researchers and took into account the authors’ test results from using ChatGPT. We present five challenges and three opportunities for the higher education context that emerge from the release of ChatGPT. The main contribution of this paper is a proposal for how to integrate ChatGPT into higher education in four main areas.
The PROFINET protocol has been extended in its current version to include security functions. This allows flexible network architectures that take OT security requirements into account to be designed for PROFINET, which were previously not possible due to the network segmentation that was required. In addition to the manufacturers of the protocol stacks, component manufacturers are also required to provide a secure implementation in their devices. The necessary measures go beyond the use of a secure protocol stack. Using the example of an Ethernet-APL transmitter with PROFINET communication, this article shows which technical and organizational conditions PROFINET device manufacturers will have to consider in the future.
Purpose: The calculation of aggregated composite measures is a widely used strategy to reduce the amount of data on hospital report cards. Therefore, this study aims to elicit and compare preferences of both patients as well as referring physicians regarding publicly available hospital quality information.
Methods: Based on systematic literature reviews as well as qualitative analysis, two discrete choice experiments (DCEs) were applied to elicit patients’ and referring physicians’ preferences. The DCEs were conducted using a fractional factorial design. Statistical data analysis was performed using multinomial logit models.
Results: Apart from five identical attributes, one specific attribute was identified for each study group. Overall, 322 patients (mean age 68.99) and 187 referring physicians (mean age 53.60) were included. Our models displayed significant coefficients for all attributes (p < 0.001 each). Among patients, “Postoperative complication rate” (20.6%; level range of 1.164) was rated highest, followed by “Mobility at hospital discharge” (19.9%; level range of 1.127) and “The number of cases treated” (18.5%; level range of 1.045). In contrast, referring physicians valued the “One-year revision surgery rate” most highly (30.4%; level range of 1.989), followed by “The number of cases treated” (21.0%; level range of 1.372) and “Postoperative complication rate” (17.2%; level range of 1.123).
Conclusion: We determined considerable differences between both study groups when calculating the relative value of publicly available hospital quality information. This may have an impact when calculating aggregated composite measures based on consumer-based weighting.
Chronic kidney disease (CKD) is one of the main causes of mortality worldwide. It affects more than 800 million patients globally, approximately 10% of the general population. The significant burden of the disease prompts healthcare systems to implement adequate preventive and therapeutic measures. This systematic review and meta-analysis aimed to provide a concise summary of the published findings on the influence of mobile health technology on the outcomes of patients with the disease. A comprehensive systematic literature review was conducted from inception until March 1st, 2023. This systematic review and meta-analysis included all clinical trials that compared the efficacy of mobile app-based educational programs with that of more conventional educational treatment. Eleven papers were included in the analysis, representing 759 CKD patients; 381 patients were randomly assigned to use the mobile apps, while 378 were assigned to the control group. The mean systolic blood pressure was considerably lower in the mobile app group (MD -4.86; 95% CI -9.60, -0.13; p=0.04). Meanwhile, the mean level of satisfaction among patients who used the mobile app was considerably greater (MD 0.75; 95% CI 0.03, 1.46; p=0.04). Additionally, the mean self-management scores in the mobile app groups were significantly higher (SMD 0.534; 95% CI 0.201, 0.867; p=0.002). Mobile health applications are potentially valuable interventions for these patients. This technology improved self-management of the disease and reduced mean systolic blood pressure, with a high degree of patient satisfaction.
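Pooled mean differences of the kind reported above are typically obtained by inverse-variance weighting; the sketch below shows fixed-effect pooling with invented study-level inputs, not the review's actual data.

```python
import math

# Fixed-effect inverse-variance pooling of mean differences, the standard
# building block behind pooled estimates such as MD -4.86 (95% CI -9.60, -0.13).
# The per-study mean differences and standard errors below are invented.

def pool_fixed_effect(mds, ses):
    """Return pooled MD and its 95% CI from per-study MDs and standard errors."""
    weights = [1.0 / se**2 for se in ses]               # inverse-variance weights
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

md, lo, hi = pool_fixed_effect(mds=[-6.0, -3.0, -5.0], ses=[2.0, 3.0, 2.5])
print(round(md, 2), round(lo, 2), round(hi, 2))
```

Random-effects models, often used when between-study heterogeneity is present, extend this by widening the weights with a between-study variance term.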
Background: In Germany, hospice and palliative care is well covered through inpatient, outpatient, and home-based care services. It is unknown if, and to what extent, there is a need for additional day care services to meet the specific needs of patients and caregivers.
Methods: Two day hospices and two palliative day care clinics were selected. In the first step, two managers from each facility (n = 8) were interviewed by telephone, using a semi-structured interview guide. In the second step, four focus groups were conducted, each with three to seven representatives of hospice and palliative care from the facilities’ hospice and palliative care networks. Interviews and focus groups were audio recorded, transcribed verbatim and analyzed using qualitative content analysis.
Results: The interviewed experts perceived day care services as providing additional patient and caregiver benefits. Specifically, the services were perceived to meet patient needs for social interaction and bundled treatments, especially for patients who did not fit into inpatient settings (due to, e.g., their young age or a lack of desire for inpatient admission). The services were also perceived to meet caregiver needs for support, providing short-term relief for the home care situation.
Conclusions: The results suggest that inpatient, outpatient, and home-based hospice and palliative care services do not meet the palliative care needs of all patients. Although the population that is most likely to benefit from day care services is assumed to be relatively small, such services may meet the needs of certain patient groups more effectively than other forms of care.
We present an approach towards a data acquisition system for digital twins that uses a 5G network for data transmission and localization. The current hardware setup, which utilizes stereo vision and LiDAR for 3D mapping, is explained together with two recorded point cloud datasets. Furthermore, a resulting digital twin composed of voxelized point cloud data is shown. Ideas for future applications and challenges regarding the system are discussed, and an outlook on further development is given.
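Voxelization of a point cloud, as used for the digital twin representation, can be sketched as snapping coordinates to a regular grid and collapsing duplicates; the points and voxel size below are invented.

```python
import numpy as np

# Minimal voxelization sketch: point coordinates are snapped to a regular
# grid and duplicate cells are collapsed, yielding the occupied voxels of
# a voxelized digital twin. Points and voxel size are invented examples.

def voxelize(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Return unique occupied voxel indices for an (N, 3) point cloud."""
    indices = np.floor(points / voxel_size).astype(int)
    return np.unique(indices, axis=0)

points = np.array([[0.12, 0.01, 0.32],
                   [0.14, 0.02, 0.33],   # falls in the same voxel as above
                   [0.55, 0.40, 0.05]])
voxels = voxelize(points, voxel_size=0.1)
print(len(voxels))
```

Besides compressing the data, this representation makes occupancy queries and map updates cheap, which matters when streaming over a mobile network.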
As a result of a research semester in the summer of 2022, a bibliography on multimodality in technical communication (TC) is presented. Given that TC primarily involves the development of instructional information, this bibliography is relevant for anyone interested in the use of multimodality in the communication of procedural knowledge. The bibliography is publicly accessible as a Zotero group library (https://bit.ly/multimodality_in_tc) and can be used and expanded.
After a description of the objectives and target group, the five disciplines from which the publications in the bibliography originate are presented. This is followed by information on the structure and search options of the Zotero group library, which are intended to support the search for publications on the respective research interest. The article concludes with some suggestions for collaborative efforts aimed at further enhancing and expanding the bibliography.
The author actively maintains the group library. Individuals seeking to contribute publications to the group library will receive the appropriate access rights from the author (claudia.villiger@hs-hannover.de). The author aspires to foster collaboration among researchers from diverse fields through this bibliography.
In this paper we describe methods to approximate functions and differential operators on adaptive sparse (dyadic) grids. We distinguish between several representations of a function on the sparse grid and describe how finite difference (FD) operators can be applied to these representations. For general variable coefficient equations on sparse grids, genuine finite element (FE) discretizations are not feasible, and FD operators allow an easier operator evaluation than the adapted FE operators. However, the structure of the FD operators is complex. With the aim of constructing an efficient multigrid procedure, we analyze the structure of the discrete Laplacian in its hierarchical representation and show the relation between the full and the sparse grid case. The rather complex relations, which are expressed by scaling matrices for each separate coordinate direction, make us doubt the possibility of constructing efficient preconditioners that show spectral equivalence. Hence, we question the possibility of constructing a natural multigrid algorithm with optimal O(N) efficiency. We conjecture that for the efficient solution of a general class of adaptive grid problems it is better to accept an additional condition for the dyadic grids (condition L) and to apply adaptive hp-discretization.
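As background for readers, here is a sketch of the standard hierarchical sparse-grid representation the abstract refers to, in common textbook notation; the paper's adaptive dyadic grids may differ in detail:

```latex
% 1D hierarchical hat function on level l with odd index i
\phi_{l,i}(x) = \max\!\bigl(0,\; 1 - \lvert 2^{l} x - i \rvert\bigr),
\qquad i \in I_l = \{\, i \text{ odd} : 1 \le i \le 2^{l}-1 \,\}

% sparse-grid approximation in d dimensions via hierarchical surpluses \alpha
u(\mathbf{x}) \approx
\sum_{\lvert \mathbf{l} \rvert_1 \le n + d - 1}\;
\sum_{\mathbf{i} \in I_{\mathbf{l}}}
\alpha_{\mathbf{l},\mathbf{i}}\, \phi_{\mathbf{l},\mathbf{i}}(\mathbf{x})
```

The hierarchical surpluses α are exactly the coefficients on which the paper's FD operators and the scaling matrices per coordinate direction act.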
The paper presents a comprehensive model of a banking system that integrates network effects, bankruptcy costs, fire sales, and cross-holdings. For the integrated financial market we prove the existence of a price-payment equilibrium and design an algorithm for the computation of the greatest and the least equilibrium. The number of defaults corresponding to the greatest price-payment equilibrium is analyzed in several comparative case studies. These illustrate the individual and joint impact of interbank liabilities, bankruptcy costs, fire sales and cross-holdings on systemic risk. We study policy implications and regulatory instruments, including central bank guarantees and quantitative easing, the significance of last wills of financial institutions, and capital requirements.
Conventional fluorescent tubes are increasingly being replaced with innovative light-emitting diodes (LEDs) for lighting poultry houses. However, little is known about whether the flicker frequencies of LED luminaires are potential stressors in poultry husbandry. The term “light flicker” describes the fluctuations in the brightness of an electrically operated light source caused by the design and/or control of the light source. In this context, the critical flicker frequency (CFF) characterizes the frequency at which a sequence of light flashes is perceived as continuous light. It is known that the CFF in birds is higher than that in humans and that light flicker can affect behavioral patterns and stress levels in several bird species. As there is a lack of knowledge about the impact of flicker frequency on fattening turkeys, this study aimed to investigate the effects of flicker frequency on the behavior, performance, and stress response in male turkeys. In 3 trials, a total of 1,646 male day-old turkey poults of the strain B.U.T. 6 with intact beaks were reared for 20 wk in 12 barn compartments of 18 m² each. Each barn compartment was illuminated using 2 full-spectrum LED lamps. Flicker frequencies of 165 Hz, 500 Hz, and 16 kHz were set in the luminaires to illuminate the compartments. Analyses of feather corticosterone concentration were performed on fully grown third-generation primaries (P 3) of 5 turkeys from each compartment. No significant differences were found in the development of live weight, feed consumption, or the prevalence of turkeys injured or killed by conspecifics among the above flicker frequencies. The flicker frequencies also did not significantly influence feather corticosterone concentrations in the primaries of the turkeys.
In conclusion, the present results indicate that flicker frequencies of 165 Hz or higher have no detrimental effect on growth performance, injurious pecking, or endocrine stress response in male turkeys and, thus, may be suitable for use as animal-friendly lighting.
Background: Autism Spectrum Disorder (ASD) is characterized by impairments in social communication and language development as well as restricted, repetitive behaviors and patterns of interests or activities. It comprises a group of complex neurodevelopmental syndromes with diverse phenotypes that show considerable etiological and clinical heterogeneity; ASD is also considered one of the most heritable disorders (heritability over 90%). Genetic, epigenetic, and environmental factors play a role in the development of ASD.
Aim: This study was designed to investigate the extent of DNA damage in parents of autistic children by treating peripheral blood mononuclear cells (PBMCs) with bleomycin and hydrogen peroxide (H2O2).
Methods: Peripheral blood mononuclear cells (PBMCs) were isolated by the Ficoll method and treated with a specific concentration of bleomycin and H2O2 for 30 min and 5 min, respectively. Then, the degree of DNA damage was analyzed by the alkaline comet assay or single cell gel electrophoresis (SCGE), an effective way to measure DNA fragmentation in eukaryotic cells.
Results: Our findings revealed that DNA damage was significantly increased in parents of affected children compared to the control group, which may indicate an impaired DNA repair system. Furthermore, our study showed a significant association between fathers’ occupational exposure to environmental factors, as well as intrafamilial marriage, and ASD in the offspring.
Conclusion: Our results suggest that the influence of environmental factors on parents of autistic children may affect the development of autistic disorder in their offspring. Based on these results, further studies are needed to investigate the effect of environmental factors on the extent of DNA damage in parents of affected children.
The miniaturized Mössbauer spectrometer (MIMOS II), originally devised by Göstar Klingelhöfer, is being further developed by the Renz group at the Leibniz University Hanover in cooperation with the Hanover University of Applied Sciences and Arts. A new processing unit with two-dimensional (2D) data acquisition was developed by M. Jahns. The advantage of this data acquisition is that no thresholds need to be set before the measurement. The energy of each photon is determined and stored together with the velocity of the drive. After the measurement, the relevant area can be selected for the Mössbauer spectrum. We have now expanded the evaluation unit with a power supply for a MIMOS drive and a MIMOS PIN detector, giving us a very compact MIMOS transmission measurement setup. With this setup it is possible to process the signals of two detectors serially. Currently we are working on parallel signal processing.
Mixed-integer NMPC for real-time supervisory energy management control in residential buildings
(2023)
In recent years, building energy supply and distribution systems have become more complex, with an increasing number of energy generators, stores, flows, and possible combinations of operating modes. This poses challenges for supervisory control, especially when balancing the conflicting goals of maximizing comfort while minimizing costs and emissions to contribute to global climate protection objectives. Mixed-integer nonlinear model predictive control is a promising approach for intelligent real-time control that is able to properly address the specific characteristics and restrictions of building energy systems. We present a strategy that utilizes a decomposition approach, combining partial outer convexification with the Switch-Cost Aware Rounding procedure to handle switching behavior and operating time constraints of building components in real-time. The efficacy is demonstrated through practical applications in a single-family home with a combined heat and power unit and in a multi-family apartment complex with 18 residential units. Simulation studies show high correspondence to globally optimal solutions with significant cost savings potential of around 19%.
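The decomposition named above first solves a relaxed problem and then rounds the relaxed operating-mode controls to a binary schedule. As a hedged illustration, here is plain sum-up rounding, not the paper's Switch-Cost Aware Rounding, with invented data:

```python
def sum_up_rounding(alpha, dt):
    """Round relaxed controls alpha[k] in [0, 1] to a binary schedule.

    Classic sum-up rounding: switch on whenever the accumulated relaxed
    "on"-time exceeds the realized binary "on"-time by half a time step.
    The actual Switch-Cost Aware Rounding additionally limits switching
    and enforces minimum operating times.
    """
    schedule, deficit = [], 0.0
    for a in alpha:
        deficit += a * dt          # relaxed on-time not yet realized
        if deficit >= 0.5 * dt:
            schedule.append(1)
            deficit -= dt
        else:
            schedule.append(0)
    return schedule

# hypothetical relaxed CHP on/off profile from the convexified problem
print(sum_up_rounding([0.6, 0.2, 0.9, 0.1], dt=1.0))  # -> [1, 0, 1, 0]
```

The rounded schedule tracks the relaxed control's integral, which is what makes the decomposition's bound on the objective loss possible.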
Background:
Many patients with cardiovascular disease also show a high comorbidity of mental disorders, especially anxiety and depression. This is, in turn, associated with a decrease in quality of life. Psychocardiological treatment options are currently limited. Hence, there is a need for novel and accessible psychological help. Recently, we demonstrated that a brief face-to-face intervention based on metacognitive therapy (MCT) is promising in treating anxiety and depression. Here, we aim to translate the face-to-face approach into a digital application and explore the feasibility of this approach.
Methods:
We translated a validated brief psychocardiological intervention into a novel non-blended web app. The data of 18 patients suffering from various cardiac conditions but without diagnosed mental illness were analyzed after they used the web app over a two-week period in a feasibility trial. The aim was to determine whether a non-blended, web-app-based MCT approach is feasible for patients with cardiovascular disease.
Results:
Overall, patients were able to use the web app and rated it as satisfactory and beneficial. In addition, there was a first indication that using the app improved the cardiac patients’ subjectively perceived health and reduced their anxiety. Therefore, the approach seems feasible for a future randomized controlled trial.
Conclusion:
Applying a metacognitive-based brief intervention via a non-blended web app seems to show good acceptance and feasibility in a small target group of patients with CVD. Future studies should further develop, improve and validate digital psychotherapy approaches, especially in patient groups with a lack of access to standard psychotherapeutic care.
In recent years, generative models have gained large public attention due to the high quality of the images they generate. In short, generative models learn a distribution from a finite number of samples and are then able to generate arbitrarily many new samples. This can be applied to image data. In the past, generative models were not able to generate realistic images, but nowadays the results are almost indistinguishable from real images.
This work provides a comparative study of three generative models: Variational Autoencoder (VAE), Generative Adversarial Network (GAN) and Diffusion Models (DM). The goal is not to provide a definitive ranking indicating which one of them is the best, but to qualitatively, and where possible quantitatively, decide which model is good with respect to a given criterion. Such criteria include realism, generalization and diversity, sampling, training difficulty, parameter efficiency, interpolation and inpainting capabilities, semantic editing, as well as implementation difficulty. After a brief introduction of how each model works on the inside, they are compared against each other. The provided images help to see the differences among the models with respect to each criterion.
To give a short outlook on the results of the comparison, DMs generate the most realistic images. They seem to generalize best and show high variation among the generated images. However, they are based on an iterative process, which makes them the slowest of the three models in terms of sample generation time. GANs and VAEs, on the other hand, generate their samples in a single forward pass. The images generated by GANs are comparable to those of DMs, while the images from VAEs are blurry, which makes them less desirable in comparison to GANs or DMs. However, both the VAE and the GAN stand out from the DMs with respect to interpolation and semantic editing, as they have a latent space, which makes latent-space walks possible, and the changes are not as chaotic as with DMs. Furthermore, concept vectors can be found that transform a given image along a given feature while leaving other features and structures mostly unchanged, which is difficult to achieve with DMs.
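Since the comparison highlights latent-space interpolation as a strength of VAEs and GANs, a minimal sketch of one common interpolation scheme may help; the vectors and dimensions below are illustrative, not taken from the study:

```python
import math

def slerp(z0, z1, t):
    """Spherical linear interpolation between two latent vectors.

    Often preferred over straight-line interpolation when walking the
    latent space of a VAE or GAN, since samples from a Gaussian prior
    concentrate near a hypersphere.
    """
    dot = sum(a * b for a, b in zip(z0, z1))
    norm0 = math.sqrt(sum(a * a for a in z0))
    norm1 = math.sqrt(sum(b * b for b in z1))
    omega = math.acos(max(-1.0, min(1.0, dot / (norm0 * norm1))))
    if omega < 1e-8:  # nearly parallel: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(z0, z1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(z0, z1)]
```

Semantic editing with a concept vector v is then a separate latent operation, e.g. decoding z + s * v for some scale s (names here are hypothetical).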
There are many aspects of code quality, some of which are difficult to capture or to measure. Despite the importance of software quality, there is a lack of commonly accepted measures or indicators for code quality that can be linked to quality attributes. We investigate software developers’ perceptions of source code quality and the practices they recommend to achieve these qualities. We analyze data from semi-structured interviews with 34 professional software developers, programming teachers and students from Europe and the U.S. For the interviews, participants were asked to bring code examples to exemplify what they consider good and bad code, respectively. Readability and structure were used most commonly as defining properties for quality code. Together with documentation, they were also suggested as the most common target properties for quality improvement. When discussing actual code, developers focused on structure, comprehensibility and readability as quality properties. When analyzing relationships between properties, the most commonly talked about target property was comprehensibility. Documentation, structure and readability were named most frequently as source properties to achieve good comprehensibility. Some of the most important source code properties contributing to code quality as perceived by developers lack clear definitions and are difficult to capture. More research is therefore necessary to measure the structure, comprehensibility and readability of code in ways that matter for developers and to relate these measures of code structure, comprehensibility and readability to common software quality attributes.
The aim of this cross-sectional study was to investigate factors associated with the severity of clinical mastitis (CM). Milk samples of 249 cases of CM were microbiologically examined, of which 27.2% were mild, 38.5% moderate, and 34.3% severe mastitis. The samples were incubated aerobically and anaerobically to investigate the role of aerobic and anaerobic microorganisms. In addition, pathogen shedding was quantitatively examined, and individual animal data, outside temperature, and relative humidity were collected to determine factors associated with the severity of CM. The most frequently isolated pathogen was Escherichia coli (35.2%), followed by Streptococcus spp. (16.4%). Non-aureus staphylococci (NaS) (15.4%) and other pathogens (e.g., Staphylococcus aureus, coryneforms) (15.4%) were the pathogens isolated most often in mild mastitis. Moderate mastitis was mostly caused by E. coli (38%). E. coli was also the most common pathogen in severe mastitis (50.6%), followed by Streptococcus spp. (16.4%) and Klebsiella spp. (10.3%). Obligate anaerobes (Clostridium spp.) were isolated in one case (0.4%) of moderate mastitis. The mortality rate (deceased or culled due to the mastitis within the following two weeks) was 34.5% for severe mastitis, 21.7% for moderate mastitis, and 4.4% for mild mastitis. The overall mortality rate of CM was 21.1%. Pathogen shedding (back-transformed from logarithms) was highest for severe mastitis (55,000 cfu/mL) and E. coli (91,200 cfu/mL). High pathogen shedding, a low somatic cell count (SCC) before mastitis, high outside temperature, and high humidity were associated with severe courses of mastitis.
Appropriate data models are essential for the systematic collection, aggregation, and integration of health data and for subsequent analysis. However, recommendations for modeling health data are often not publicly available within specific projects. Therefore, the project Zukunftslabor Gesundheit investigates recommendations for modeling. Expert interviews with five experts were conducted and analyzed using qualitative content analysis. Based on the condensed categories “governance”, “modeling” and “standards”, the project team generated eight hypotheses for recommendations on health data modeling. In addition, relevant framework conditions such as different roles, international cooperation, education/training and political influence were identified. Although emerging from interviewing a small convenience sample of experts, the results help to plan more extensive data collections and to create recommendations for health data modeling.
Economic and political/governmental infrastructural factors are major contributors to the economic development and growth of all sectors of a country, including healthcare systems and clinical research as well as the pharmaceutical industry. But how do economic and political/governmental infrastructural factors interact with the development of healthcare systems and, in particular, the performance of the pharmaceutical industry? Information from selected articles of a literature search of PubMed and Google Advanced Search led to the generation of five categories of infrastructural factors, which were filled with data from 41 African countries using the World Health Organization data repository. Median changes over time were reported and tested by the Wilcoxon signed-rank test and the Friedman test, respectively. Analysis of factors related to the availability of healthcare facilities showed that the numbers of physicians and pharmacies increased significantly, while the number of hospital beds decreased insignificantly. Healthcare financing by governments showed notable differences. Private health spending decreased significantly, unlike Gross National Income. Analysis of infrastructural factors showed that a stable supply of electricity and the associated use of the Internet improved significantly. The low level of data on the expansion of paved road networks suggests less developed medical services in remote rural areas. Healthcare systems in African countries improved over the last two decades, but differences between individual countries still prevail, and some countries cannot yet offer an attractive sales market for the products of pharmaceutical companies.
PROFINET Security: A Look on Selected Concepts for Secure Communication in the Automation Domain
(2023)
We provide a brief overview of the cryptographic security extensions for PROFINET, as defined and specified by PROFIBUS & PROFINET International (PI). These come in three hierarchically defined Security Classes, called Security Class 1, 2 and 3. Security Class 1 provides basic security improvements with moderate implementation impact on PROFINET components. Security Classes 2 and 3, in contrast, introduce an integrated cryptographic protection of PROFINET communication. We first highlight and discuss the security features that the PROFINET specification offers for future PROFINET products. Then, as our main focus, we take a closer look at some of the technical challenges that were faced during the conceptualization and design of Security Class 2 and 3 features. In particular, we elaborate on how secure application relations between PROFINET components are established and how a disruption-free availability of a secure communication channel is guaranteed despite the need to refresh cryptographic keys regularly. The authors are members of the PI Working Group CB/PG10 Security.
Context: Higher education is changing at an accelerating pace due to the widespread use of digital teaching and emerging technologies. In particular, AI assistants such as ChatGPT pose significant challenges for higher education institutions because they bring change to several areas, such as learning assessments or learning experiences.
Objective: Our objective is to discuss the impact of AI assistants in the context of higher education, outline possible changes to the context, and present recommendations for adapting to change.
Method: We review related work and develop a conceptual structure that visualizes the role of AI assistants in higher education.
Results: The conceptual structure distinguishes between humans, learning, organization, and disruptor, which guides our discussion regarding the implications of AI assistant usage in higher education. The discussion is based on evidence from related literature.
Conclusion: AI assistants will change the context of higher education in a disruptive manner, and the tipping point for this transformation has already been reached. It is in our hands to shape this transformation.
The aim of this cross-sectional study was to investigate the occurrence of bacteremia in severe mastitis cases of dairy cows. Milk and corresponding blood samples of 77 cases of severe mastitis were bacteriologically examined. All samples (milk and blood) were incubated aerobically and anaerobically to also investigate the role of obligate anaerobic microorganisms in addition to aerobic microorganisms in severe mastitis. Bacteremia occurred if identical bacterial strains were isolated from milk and blood samples of the same case. In addition, pathogen shedding was examined, and animal and weather data were collected to determine factors associated with the occurrence of bacteremia in severe mastitis. If Gram-negative bacteria were detected in milk samples, a Limulus test (detection of endotoxins) was also performed for corresponding blood samples without growth of Gram-negative bacteria. In 74 cases (96.1%), microbial growth was detected in aerobically incubated milk samples. The most frequently isolated bacteria in milk samples were Escherichia (E.) coli (48.9%), Streptococcus (S.) spp. (18.1%), and Klebsiella (K.) spp. (16%). Obligate anaerobic microorganisms were not isolated. In 72 cases (93.5%) of the aerobically examined blood samples, microbial growth was detected. The most frequently isolated pathogens in blood samples were non-aureus staphylococci (NaS) (40.6%) and Bacillus spp. (12.3%). The Limulus test was positive in 60.5% of cases, meaning that endotoxins were detected in most blood samples without growth of Gram-negative bacteria. Bacteremia was confirmed in 12 cases (15.5%) for K. pneumoniae (5/12), E. coli (4/12), S. dysgalactiae (2/12), and S. uberis (1/12). The mortality rate (deceased or culled) was 66.6% for cases with bacteremia and 34.1% for cases without bacteremia. High pathogen shedding and high humidity were associated with the occurrence of bacteremia in severe mastitis.
Monitoring of clinical trials is a fundamental process required by regulatory agencies. It assures the compliance of a center with the required regulations and the trial protocol. Traditionally, monitoring teams relied on extensive on-site visits and source data verification. However, this is costly, and the outcome is limited. Thus, central statistical monitoring (CSM) is an additional approach recently embraced by the International Council for Harmonisation (ICH) to detect problematic or erroneous data by using visualizations and statistical control measures. Existing implementations have primarily focused on detecting inlier and outlier data. Other approaches include principal component analysis and the distribution of the data. Here we focus on comparisons of centers to the Grand mean for different model types and assumptions for common data types, such as binomial, ordinal, and continuous response variables. We implement multiple comparisons of single centers to the Grand mean of all centers. This approach is also available for various non-normal data types that are abundant in clinical trials. Further, using confidence intervals, an assessment of equivalence to the Grand mean can be applied. In a Monte Carlo simulation study, the applied statistical approaches were investigated for their ability to control the type I error and for their respective power in balanced and unbalanced designs, which are common in registry data and clinical trials. Data from the German Multiple Sclerosis Registry (GMSR), including proportions of missing data, adverse events, and disease severity scores, were used to verify the results on real-world data (RWD).
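The comparison-to-the-Grand-mean idea can be sketched for binomial data as follows; this is a minimal illustration using normal-approximation confidence intervals with a Bonferroni adjustment, not the paper's exact models, and the counts are invented:

```python
import math
from statistics import NormalDist

def flag_centers(events, totals, alpha=0.05):
    """Flag centers whose event proportion deviates from the grand mean.

    Sketch for binomial data: each center gets a Bonferroni-adjusted
    normal-approximation confidence interval; a center is flagged when
    that interval excludes the grand mean proportion.
    """
    grand = sum(events) / sum(totals)
    k = len(events)
    z = NormalDist().inv_cdf(1 - alpha / (2 * k))   # adjusted quantile
    flagged = []
    for i, (e, n) in enumerate(zip(events, totals)):
        p = e / n
        half = z * math.sqrt(p * (1 - p) / n)       # CI half-width
        if grand < p - half or grand > p + half:
            flagged.append(i)
    return grand, flagged

# invented adverse-event counts per center: center 0 reports suspiciously few
print(flag_centers([5, 28, 27], [100, 100, 100]))  # -> (0.2, [0])
```

Reversing the logic of the interval check (flagging when the interval lies entirely inside an equivalence margin around the grand mean) gives the equivalence assessment mentioned in the abstract.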
The digital transformation, with its new technologies and customer expectations, has a significant effect on the customer channels in the insurance industry. The objective of this study is the identification of enabling and hindering factors for the adoption of online claim notification services, which are an important part of the customer experience in insurance. For this purpose, we conducted a quantitative cross-sectional survey based on the exemplary scenario of car insurance in Germany and analyzed the data via structural equation modeling (SEM). The findings show that, besides classical technology acceptance factors such as perceived usefulness and ease of use, digital mindset and status quo behavior play a role: acceptance of digital innovations, lacking endurance as well as lacking frustration tolerance with the status quo lead to a higher intention to use. Moreover, the results are strongly moderated by the severity of the damage event, an insurance-specific factor that has rarely been considered so far. The latter discovery implies that customers prefer a communication channel choice based on the individual circumstances of the claim.
Introduction
Atopic dermatitis (AD) is a common inflammatory skin disease. Many patients initiate a systemic therapy if the disease is not adequately controlled with topical treatment alone. Currently, there is little real-world evidence on the AD-related medical care situation in Germany. This study analyzed patient characteristics, treatment patterns, healthcare resource utilization, and costs associated with systemically treated AD for the German healthcare system.
Methods
In this descriptive, retrospective cohort study, aggregated anonymized German health claims data from the InGef research database were used. Within a representative sample of four million insured individuals, patients with AD and systemic drug therapy initiation (SDTI) in the index year 2017 were identified and included into the study cohort. Systemic drug therapy included dupilumab, systemic corticosteroids (SCS) and systemic immunosuppressants (SIS). Patients were observed for one year starting from the date of SDTI in 2017.
Results
9975 patients were included (57.8% female, mean age 39.6 years [SD 25.5]). In the one-year observation period, the most common systemic drug therapy was SCS (> 99.0%). Administrations of dupilumab (0.3%) or dispensations of SIS were rare (cyclosporine: 0.5%, azathioprine: 0.6%, methotrexate: 0.1%). Median treatment duration of SCS, cyclosporine and azathioprine was 27 days, 102 days, and 109 days, respectively. 2.8% of the patients received phototherapy; 41.6% used topical corticosteroids and/or topical calcineurin inhibitor. Average annual costs for medications amounted to € 1237 per patient. Outpatient services were used by 99.6% with associated mean annual costs of € 943; 25.4% had at least one hospitalization (mean annual costs: € 5836). 5.3% of adult patients received sickness benefits with associated mean annual costs of € 5026.
Conclusions
Despite an unfavorable risk–benefit profile, this study demonstrated that treatment with SCS is common, whereas other systemic drug therapy options were rarely used. Furthermore, the results suggest a substantial economic burden for patients with AD and SDTI.
Purpose
This study aims to determine the intention to use hospital report cards (HRCs) for hospital referral purposes in the presence or absence of patient-reported outcomes (PROs) as well as to explore the relevance of publicly available hospital performance information from the perspective of referring physicians.
Methods
We identified the most relevant information for hospital referral purposes based on a literature review and qualitative research. Primary survey data were collected (May–June 2021) on a sample of 591 referring orthopedists in Germany and analyzed using structural equation modeling. Participating orthopedists were recruited using a sequential mixed-mode strategy and randomly allocated to work with HRCs in the presence (intervention) or absence (control) of PROs.
Results
Overall, 420 orthopedists (mean age 53.48, SD 8.04) were included in the analysis. The presence of PROs on HRCs was not associated with an increased intention to use HRCs (p = 0.316). Performance expectancy was shown to be the most important determinant of HRC use (path coefficient: 0.387, p < .001). However, referring physicians have doubts as to whether HRCs can help them. We identified “complication rate” and “number of cases treated” as most important for hospital referral decision making; PROs were rated slightly less important.
Conclusions
This study underpins the purpose of HRCs, namely to support referring physicians in searching for a hospital. Nevertheless, only a minority would support the use of HRCs for the next hospital search in its current form. We showed that presenting relevant information on HRCs did not increase their use intention.
The shift towards RES introduces challenges related to power system stability due to the characteristics of inverter-based resources (IBRs) and the intermittent nature of renewable resources. This paper addresses these challenges by conducting comprehensive time and frequency simulations on the IEEE two-area benchmark power system with detailed type 4 wind turbine generators (WTGs), including turbines, generators, converters, filters, and controllers. The simulations analyse small-signal and transient stability, considering variations in active and reactive power, short-circuit events, and wind variations. Metrics such as rate of change of frequency (RoCoF), frequency nadir, percentage of frequency variation, and probability density function (PDF) are used to evaluate the system performance. The findings emphasise the importance of including detailed models of RES in stability analyses and demonstrate the impact of RES penetration on power system dynamics. This study contributes to a deeper understanding of RES integration challenges and provides insights for ensuring the reliable and secure operation of power systems in the presence of high levels of RES penetration.
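The frequency metrics named above are straightforward to compute from a simulated frequency trace; a small sketch with invented sample data (not from the IEEE two-area study):

```python
def frequency_metrics(t, f, f_nom=50.0):
    """Frequency-stability metrics from a simulated frequency trace.

    Returns the maximum RoCoF (steepest finite difference, in Hz/s), the
    frequency nadir (Hz), and the percentage deviation of the nadir from
    the nominal frequency.
    """
    rocof = max(abs((f[i + 1] - f[i]) / (t[i + 1] - t[i]))
                for i in range(len(f) - 1))
    nadir = min(f)
    pct = 100.0 * (f_nom - nadir) / f_nom
    return rocof, nadir, pct

# invented post-disturbance trace: dip to 49.5 Hz, then recovery
print(frequency_metrics([0.0, 1.0, 2.0, 3.0], [50.0, 49.5, 49.8, 50.0]))
# -> (0.5, 49.5, 1.0)
```

In practice RoCoF is usually evaluated over a defined sliding window rather than a single sample step; the finite difference here only illustrates the definition.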
In this paper, a new rotor position observer for permanent magnet synchronous machines (PMSM) based on an Extended Kalman Filter (EKF) is presented. With this method, a single EKF is sufficient to evaluate the position information from the electromotive force (EMF) and the anisotropy. Thus, the PMSM can be controlled over the entire speed range without a position sensor and without the need to switch or synchronize between different observers. The approach covers online estimation of the permanent magnetic field and the mechanical load. The resulting EKF-based rotor position estimator is embedded in the existing cascaded control concept of the PMSM without the need for additional angle trackers or signal filters. The experimental validation of the position sensorless control shows optimized dynamic behaviour.
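For orientation, the generic EKF recursion underlying such observers can be sketched in scalar form; the PMSM observer uses the vector/matrix form with the machine model, and the functions and values below are purely illustrative:

```python
def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter (scalar form).

    f, h: nonlinear state-transition and measurement functions;
    F, H: their derivatives evaluated at the current estimate;
    Q, R: process and measurement noise variances.
    """
    # predict
    x_pred = f(x, u)
    Fx = F(x, u)
    P_pred = Fx * P * Fx + Q
    # update
    Hx = H(x_pred)
    y = z - h(x_pred)             # innovation
    S = Hx * P_pred * Hx + R      # innovation variance
    K = P_pred * Hx / S           # Kalman gain
    return x_pred + K * y, (1 - K * Hx) * P_pred

# toy model: random-walk state observed directly (not the PMSM model)
f = lambda x, u: x + u
F = lambda x, u: 1.0
h = lambda x: x
H = lambda x: 1.0
x_est, P_est = ekf_step(0.0, 1.0, 0.0, 1.0, f, F, h, H, Q=0.0, R=1.0)
print(x_est, P_est)  # -> 0.5 0.5
```

In the observer described above, the state would include rotor angle and speed, and the measurement model would combine the EMF- and anisotropy-based position information in one filter.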
During the Corona pandemic, information traditionally used for corporate credit risk analysis (e.g., from the analysis of balance sheets and payment behavior) became less valuable because it represents only past circumstances. Therefore, the use of currently published data from social media platforms, which has been shown to contain valuable information regarding the financial stability of companies, should be evaluated. This data can contain, for example, additional information from disappointed employees or customers. In order to analyze to what extent this data can improve the information base for corporate credit risk assessment, Twitter data regarding the ten largest insolvencies of German companies in 2020 and solvent counterparts is analyzed in this paper. The results from t-tests show that sentiment before the insolvencies is significantly worse than in the comparison group, which is in alignment with previously conducted research. Furthermore, companies can be classified as prospectively solvent or insolvent with up to 70% accuracy by applying the k-nearest-neighbor algorithm to monthly aggregated sentiment scores. No significant differences in the number of Tweets for the two groups can be proven, which is in contrast to findings from studies conducted before the Corona pandemic. The results can be utilized by practitioners and scientists in order to improve decision support systems in the domain of corporate credit risk analysis. From a scientific point of view, the results show that the information asymmetry between lenders and borrowers in credit relationships, which are principals and agents according to principal-agent theory, can be reduced based on user-generated content from social media platforms. In future studies, it should be evaluated to what extent the data can be integrated into established processes for credit decision making. Furthermore, additional social media platforms as well as samples of companies should be analyzed.
Lastly, the authenticity of user generated contend should be taken into account in order to ensure, that credit decisions rely on truthful information only.
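The classification step described above can be sketched as follows. All feature vectors and labels are invented for illustration; the study's actual features are monthly aggregated Twitter sentiment scores, but the specific values and the distance metric here are assumptions.

```python
# Hypothetical k-nearest-neighbour (k-NN) sketch: classifying companies as
# solvent or insolvent from monthly aggregated sentiment scores.
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (sentiment_vector, label); query: sentiment vector to classify."""
    def dist(a, b):
        # Euclidean distance between two monthly sentiment profiles.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    # Majority vote among the k closest companies.
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Mean sentiment over three months per company (invented values).
train = [
    ([-0.6, -0.7, -0.8], "insolvent"),
    ([-0.5, -0.6, -0.9], "insolvent"),
    ([0.2, 0.1, 0.3], "solvent"),
    ([0.4, 0.2, 0.1], "solvent"),
]
label = knn_predict(train, [-0.4, -0.5, -0.7])  # persistently negative sentiment
```

A query company with persistently negative monthly sentiment lands among the insolvent neighbours and is classified accordingly.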
The trend towards the use of Ethernet in automation networks is ongoing. Due to its high flexibility, speed, and bandwidth, Ethernet nowadays is not only widely used in homes and offices worldwide but is also finding its way into industrial applications. Especially in automation processes, where many field devices send data in relatively short time spans, the requirements for safe and fast data transfer are high. This makes the use of industrial Ethernet essential. A new hardware layer, specifically tailored for industrial applications, has been introduced in the form of Ethernet-APL ('Advanced Physical Layer'). Ethernet-APL is based on the Ethernet standard and implements two-wire Ethernet-based communication for field devices, providing data and power over a two-wire cable. Operation in areas with potentially explosive atmospheres is also possible. This enables a modular, fast, and transparent Ethernet network structure throughout the entire plant. However, by integrating Ethernet-APL into the field, industrial networks in the future will face the challenge of operating at varying data rates at different locations in the network, resulting in a 'mixed link speed' network. This can lead to limitations in packet throughput and consequently to potential packet loss of system-relevant data, which must be avoided. Therefore, the purpose of this thesis is to investigate the potential of packet loss in 'mixed link speed' networks.
Background
To perform a systematic review of the effect of using clinical pathways on length of stay (LOS), hospital costs and patient outcomes, and to provide a framework for local healthcare organisations considering the effectiveness of clinical pathways as a patient management strategy.
Methods
As participants, we considered hospitalized children and adults of every age and indication whose treatment involved the management strategy "clinical pathways". We included only randomised controlled trials (RCT) and controlled clinical trials (CCT), without restriction by language or country of publication. Single measures of continuous and dichotomous study outcomes were extracted from each study. Separate analyses were performed to compare the effects of clinical pathways on length of stay (LOS), hospital costs and patient outcomes. A random-effects meta-analysis was performed with untransformed and log-transformed outcomes.
Results
In total, 17 trials met the inclusion criteria, representing 4,070 patients. The quality of the included studies was moderate, and the studies reporting economic data had a very limited scope of evaluation. In general, the majority of studies reporting economic data (LOS and hospital costs) showed a positive impact. Of the 16 studies reporting effects on LOS, 12 found a significant shortening. Furthermore, in a subgroup analysis, clinical pathways for invasive procedures showed a stronger LOS reduction (weighted mean difference (WMD) -2.5 days versus -0.8 days).
There was no evidence of differences in readmission to hospitals or in-hospital complications. The overall Odds Ratio (OR) for re-admission was 1.1 (95% CI: 0.57 to 2.08) and for in-hospital complications, the overall OR was 0.7 (95% CI: 0.49 to 1.0). Six studies examined costs, and four showed significantly lower costs for the pathway group. However, heterogeneity between studies reporting on LOS and cost effects was substantial.
Conclusion
As a result of the relatively small number of studies meeting inclusion criteria, this evidence base is not conclusive enough to provide a replicable framework for all pathway strategies. Considering the clinical areas for implementation, clinical pathways seem to be effective especially for invasive care. When implementing clinical pathways, the decision makers need to consider the benefits and costs under different circumstances (e.g. market forces).
In this paper we describe the selection of a modern build automation tool for an industry research partner of ours, namely an insurance company. Build automation has become increasingly important over the years and is today one of the central concepts in topics such as cloud-native development based on microservices and DevOps. Since more and more products for build automation have entered the market and existing tools have changed their functional scope, there is nowadays a large number of tools that differ greatly in functionality. Based on requirements from our partner company, a build server analysis was conducted. This paper presents our analysis requirements, takes a detailed look at one of the examined tools, and summarizes our comparison of all three tools from our final comparison round.
Background:
The increase in food intolerances poses a burgeoning problem in our society. Food intolerances not only lead to physical impairment of the individual patient but also result in a high socio-economic burden due to factors such as the treatment required as well as absenteeism. The present study aimed to explore whether lactose intolerant (LI) patients exhibit more frequent comorbidities than non-LI patients.
Methods:
The study was conducted on a case-control basis and the results were determined using routine data analysis. Routine data from the IMS Disease Analyzer database were used for this purpose. A total of 6,758 data records were processed and analyzed.
Results:
There were significant correlations between LI and the incidence of osteoporosis, changes in mental status, and the presence of additional food intolerances. Comparing 3,379 LI vs. 3,379 non-LI patients, 34.5% vs. 17.7% (P<0.0001) suffered from abdominal pain; 30.6% vs. 17.2% (P<0.0001) from gastrointestinal infections; and 20.9% vs. 16.0% (P=0.0053) from depression. Adjusted odds ratios (OR) were the highest for fructose intolerance (n=229 LI vs. n=7 non-LI; OR 31.06; P<0.0001), irritable bowel syndrome (n=247 LI vs. n=44 non-LI; OR 5.23; P<0.0001), and bloating (n=351 LI vs. n=68 non-LI; OR 4.94; P<0.0001).
Conclusion:
The study confirms that LI should not be regarded as an isolated illness but considered a possible trigger for further diseases. Additional research is necessary to assert more precise statements.
Background:
Hereditary angioedema (HAE) is a rare genetic disease characterized by clinical features such as paroxysmal, recurrent angioedema of the skin, the gastrointestinal tract, and the upper airways. Swelling of the skin occurs primarily in the face, extremities and genitals. Gastrointestinal attacks are accompanied by painful abdominal cramps, vomiting and diarrhea. Due to the low prevalence and the fact that HAE patients often present with rather unspecific symptoms such as abdominal cramps, the final diagnosis is often made after a long delay. The aim of this Germany-wide survey was to characterize the period between the occurrence of first symptoms and the final diagnosis regarding self-perceived health, symptom burden and false diagnoses for patients with HAE.
Results:
Overall, 81 patients with HAE were included and participated in the telephone-based survey. Of those, the majority reported their current health status as “good” (47.5%) or “very good” (13.8%), which was observed to be a clear improvement compared to the year before final diagnosis (“good” (16.3%), “very good” (11.3%)). Edema in the extremities (85.2%) and in the gastrointestinal tract (81.5%) were the most currently reported symptoms and occurred earlier than other reported symptoms (mean age at onset 18.1 and 17.8 years, respectively). Misdiagnoses were observed in 50.6% of participating HAE patients with appendicitis and allergy being the most frequently reported misdiagnoses (40.0 and 30.0% of those with misdiagnosis, respectively). Patients with misdiagnosis often received mistreatment (80.0%) with pharmaceuticals and surgical interventions as the most frequently carried out mistreatments (65.6 and 56.3% of those with mistreatment, respectively). The mean observed diagnostic delay was 18.1 years (median 15.0 years). The diagnostic delay was higher in older patients and index patients.
Conclusions:
This study showed that patients' self-perceived health status is much better once the final correct diagnosis has been made and specific treatment is available. A further challenge will be to increase awareness of HAE, especially in the settings that patients typically approach at the occurrence of first symptoms, to ensure early referral to specialists and thereby increase the likelihood of an early diagnosis.
Aim:
To characterize palliative care patients and to estimate the incidence, prevalence, and 1-year all-cause mortality of patients in Germany who received palliative care treatment.
Subject and methods:
The study analyzed the InGef Research Database, which covers 4 million people insured in German statutory health insurance companies. Specific outpatient and inpatient reimbursement codes were used to capture cases with palliative conditions. The prevalence was ascertained for the year 2015. The incidence was calculated for patients without documented palliative care services in the year before the observation period. The Kaplan–Meier method was used to analyze the 1-year all-cause mortality.
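The Kaplan–Meier method used here estimates survival as the product of stepwise conditional survival probabilities at each event time, which handles censored follow-up naturally. A minimal sketch with invented toy data (not the InGef cohort):

```python
# Illustrative Kaplan-Meier estimator. Each subject is (time, event):
# event=1 means death observed at `time`, event=0 means censored at `time`.
def kaplan_meier(subjects):
    """Return [(event_time, survival_probability)] in time order."""
    times = sorted({t for t, e in subjects if e == 1})
    surv, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for ti, _ in subjects if ti >= t)          # still under observation
        deaths = sum(1 for ti, e in subjects if ti == t and e == 1)
        surv *= 1 - deaths / at_risk                               # conditional survival
        curve.append((t, surv))
    return curve

# Invented toy data: follow-up in months.
data = [(5, 1), (8, 0), (12, 1), (12, 1), (20, 0), (25, 1)]
km = kaplan_meier(data)
```

Censored subjects (e.g. the one leaving at month 8) contribute to the risk set before their censoring time but never count as deaths, which is what distinguishes this estimator from a naive death fraction.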
Results:
The incidence rate of palliative conditions was 41.3 and 34.9 per 10,000 persons in women and men, respectively. The prevalence per 10,000 persons was 61.3 in women and 51.1 in men. The 1-year all-cause mortality among patients receiving their first palliative care treatment was 67.5%. Mortality was lower in patients receiving general outpatient palliative care treatment (AAPV; 60.8%) compared to patients receiving specialized outpatient palliative care treatment (SAPV; 86.1%) or inpatient palliative care treatment (90.6%). Within the first 30 days, mortality was particularly high (~43.0%).
Conclusions:
In Germany, more than 400,000 patients per year receive palliative care treatment, which is lower compared to estimates of the number of persons with a potential need for palliative care. This gap was observed particularly in younger to middle-aged individuals. The findings indicate a demand for methodologically sound studies to investigate the public health burden and to quantify the unmet need for palliative care in Germany.
Influence of persistence and adherence with oral bisphosphonates on fracture rates in osteoporosis
(2009)
Background and Aim:
Oral bisphosphonates have been shown to reduce the risk of fractures in patients with osteoporosis. It can be assumed that the clinical effectiveness of oral bisphosphonates depends on persistence with therapy.
Methods:
The influence of persistence with and adherence to oral bisphosphonates on fracture risk in a real-life setting was investigated. Data from 4451 patients with a defined index prescription of bisphosphonates were included. Fracture rates within 180, 360, and 720 days after index prescription were compared between persistent and non-persistent patients. In an extended Cox regression model applying multiple event analysis, the influence of adherence was analyzed. Persistence was defined as the duration of continuous therapy; adherence was measured in terms of the medication possession ratio (MPR).
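The medication possession ratio mentioned above is the fraction of the observation window covered by dispensed medication. A minimal sketch (the prescription quantities and the 360-day window below are invented for illustration; the study's exact MPR operationalization may differ):

```python
# Sketch of the medication possession ratio (MPR): days of drug supplied
# divided by days in the observation window, capped at 1.0.
def mpr(supply_days, window_days):
    """supply_days: days supplied per prescription; window_days: observation period."""
    return min(sum(supply_days), window_days) / window_days

# Three 90-day bisphosphonate prescriptions over a 360-day window (invented).
ratio = mpr([90, 90, 90], 360)   # 270 covered days / 360 days = 0.75
good_adherence = ratio >= 0.8    # the study's threshold for good adherence
```

With this threshold, a patient covered for 270 of 360 days (MPR 0.75) would fall just below the "good adherence" group whose fracture risk reduction is reported in the Results.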
Results:
In patients with a fracture before index prescription, fracture rates were reduced by 29% (p = 0.025) comparing persistent and non-persistent patients within 180 days after the index prescription and by 45% (p < 0.001) within 360 days. The extended Cox regression model showed that good adherence (MPR ≥ 0.8) reduced fracture risk by about 39% (HR 0.61, 95% CI 0.47–0.78; p < 0.01).
Conclusions:
In patients with osteoporosis-related fractures, good persistence and adherence to oral bisphosphonates reduced fracture risk significantly.
To effectively prevent and control bovine mastitis, farmers and their advisors need to take infection pathways and durations into account. Still, studies exploring both aspects through molecular epidemiology with sampling of entire dairy cow herds over longer periods are scarce. Therefore, quarter foremilk samples were collected at 14-d intervals from all lactating dairy cows (n = 263) over 18 wk in one commercial dairy herd. Quarters were considered infected with Staphylococcus aureus, Streptococcus uberis, or Streptococcus dysgalactiae when ≥100 cfu/mL of the respective pathogen was detected, or with Staphylococcus epidermidis or Staphylococcus haemolyticus when ≥500 cfu/mL of the respective pathogen was detected. All isolates of the mentioned species underwent randomly amplified polymorphic DNA (RAPD)-PCR to explore strain diversity and to distinguish ongoing from new infections. Survival analysis was used to estimate infection durations. Five different strains of Staph. aureus were isolated, and the most prevalent strain caused more than 80% of all Staph. aureus infections (n = 46). In contrast, 46 Staph. epidermidis and 69 Staph. haemolyticus strains were isolated, and none of these caused infections in more than 2 different quarters. The 3 most dominant strains of Strep. dysgalactiae (7 strains) and Strep. uberis (18 strains) caused 81% of 33 and 49% of 37 infections in total, respectively. The estimated median infection duration for Staph. aureus was 80 d, and that for Staph. epidermidis and Staph. haemolyticus was 28 and 22 d, respectively. The probability of remaining infected with Strep. dysgalactiae or Strep. uberis for more than 84 and 70 d was 58.7 and 53.5%, respectively. Staphylococcus epidermidis and Staph. haemolyticus were not transmitted contagiously and the average infection durations were short, which brings into question whether antimicrobial treatment of intramammary infections with these organisms is justified. 
In contrast, infections with the other 3 pathogens lasted longer and largely originated from contagious transmission.
Background:
Huntington’s disease (HD) is a rare, genetic, neurodegenerative and ultimately fatal disease with no cure or progression-delaying treatment currently available. HD is characterized by a triad of cognitive, behavioural and motor symptoms. Evidence on epidemiology and management of HD is limited, especially for Germany. This study aims to estimate the incidence and prevalence of HD and analyze the current routine care based on German claims data.
Methods:
The source of data was a sample of the Institute for Applied Health Research Berlin (InGef) Research Database, comprising data of approximately four million insured persons from approximately 70 German statutory health insurances. The study was conducted in a retrospective cross-sectional design using 2015 and 2016 as a two-year observation period. At least two outpatient or inpatient ICD-10 codes for HD (ICD-10: G10) during the study period were required for case identification. Patients were considered incident if no HD diagnoses in the 4 years prior to the year of case identification were documented. Information on outpatient drug dispensations, medical aids and remedies were considered to describe the current treatment situation of HD patients.
Results:
A 2-year incidence of 1.8 per 100,000 persons (95%-Confidence interval (CI): 1.4–2.4) and a 2-year period prevalence of 9.3 per 100,000 persons (95%-CI: 8.3–10.4) were observed. The prevalence of HD increased with advancing age, peaking at 60–69 years (16.8 per 100,000 persons; 95%-CI: 13.4–21.0) and decreasing afterwards.
The most frequently observed comorbidities and disease-associated symptoms in HD patients were depression (42.9%), dementia (37.7%), urinary incontinence (32.5%), extrapyramidal and movement disorders (30.5%), dysphagia (28.6%) and disorders of the lipoprotein metabolism (28.2%).
The most common medications in HD patients were antipsychotics (66.9%), followed by antidepressants (45.1%). Anticonvulsants (16.6%), opioids (14.6%) and hypnotics (9.7%) were observed less frequently.
Physical therapy was the most often used medical aid in HD patients (46.4%). Nursing services and speech therapy were used by 27.9 and 22.7% of HD patients, respectively, whereas use of psychotherapy was rare (3.2%).
Conclusions:
Based on a representative sample, this study provides new insights into the epidemiology and routine care of HD patients in Germany, and thus, may serve as a starting point for further research.
Incorporation and Deposition of Spin Crossover Materials into and onto Electrospun Nanofibers
(2023)
We synthesized iron(II)-triazole spin crossover compounds of the type [Fe(atrz)3]X2 and incorporated and deposited them on electrospun polymer nanofibers. For this, we used two separate electrospinning methods with the goal of obtaining polymer complex composites with intact switching properties. In view of possible applications, we chose iron(II)-triazole complexes that are known to exhibit spin crossover close to ambient temperature. Therefore, we used the complexes [Fe(atrz)3]Cl2 and [Fe(atrz)3](2ns)2 (2ns = 2-Naphthalenesulfonate), deposited them on fibers of polymethylmethacrylate (PMMA), and incorporated them into core–shell-like PMMA fiber structures. These core–shell structures proved to be inert to external environmental influences: droplets of water purposely cast onto the fiber structure did not rinse away the complex. We analyzed both the complexes and the composites with IR and UV/Vis spectroscopy, Mössbauer spectroscopy, SQUID magnetometry, as well as SEM and EDX imaging. The analysis via UV/Vis spectroscopy, Mössbauer spectroscopy, and temperature-dependent magnetic measurements with the SQUID magnetometer showed that the spin crossover properties were maintained and were not changed by the electrospinning processes.
Background:
Multiple Sclerosis (MS) is a chronic inflammatory, immune mediated disease of the central nervous system, with Relapsing Remitting MS (RRMS) being the most common type. Within the last years, the status of high disease activity (HDA) has become increasingly important for clinical decisions. Nevertheless, little is known about the incidence, the characteristics, and the current treatment of patients with RRMS and HDA in Germany. Therefore, this study aims to estimate the incidence of HDA in a German RRMS patient population, to characterize this population and to describe current drug treatment routines and further healthcare utilization of these patients.
Methods:
A claims data analysis was conducted using a sample of the InGef Research Database that comprises data of approximately four million insured persons from around 70 German statutory health insurances (SHI). The study was conducted in a retrospective cohort design covering the years 2012–2016. The RRMS population was identified based on the ICD-10 code (ICD-10-GM: G35.1). For the identification of HDA, criteria from other studies as well as expert opinions were used. Information on the incidence, characteristics and current treatment of patients with RRMS and HDA was considered.
Results:
The overall HDA incidence within the RRMS population was 8.5% for 2016. It was highest for the age group of 0–19 years (29.4% women, 33.3% men) and lowest for the age group of ≥ 50 years (4.3% women, 5.6% men). Mean age of patients with RRMS and incident HDA was 38.4 years (SD: 11.8) and women accounted for 67.8%.
Analyses of drug utilization showed that 82.4% of patients received at least one disease-modifying drug (DMD) in 2016, and 49.8% of patients received drugs for relapse therapy. In 2016, 55% of RRMS patients with HDA had at least one hospitalization, with a mean length of stay of 13.9 days (SD: 18.3 days). The average number of outpatient physician contacts was 28.1 (SD: 14.0).
Conclusions:
This study based on representative Germany-wide claims data from the SHI showed a high incidence of HDA especially within the young RRMS population. Future research should consider HDA as an important criterion for the quality of care for MS patients.
Background and Objectives:
Drawing causal conclusions from real-world data (RWD) poses methodological challenges and risk of bias. We aimed to systematically assess the type and impact of potential biases that may occur when analyzing RWD using the case of progressive ovarian cancer.
Methods:
We retrospectively compared overall survival with and without second-line chemotherapy (LOT2) using electronic medical records. Potential biases were determined using directed acyclic graphs. We followed a stepwise analytic approach ranging from crude analysis and multivariable-adjusted Cox model up to a full causal analysis using a marginal structural Cox model with replicates emulating a reference randomized controlled trial (RCT). To assess biases, we compared effect estimates (hazard ratios [HRs]) of each approach to the HR of the reference trial.
Results:
The reference trial showed an HR for second-line vs. delayed therapy of 1.01 (95% confidence interval [95% CI]: 0.82–1.25). The corresponding HRs from the RWD analysis ranged from 0.51 for simple baseline adjustments to 1.41 (95% CI: 1.22–1.64) accounting for immortal time bias with time-varying covariates. Causal trial emulation yielded an HR of 1.12 (95% CI: 0.96–1.28).
Conclusion:
Our study, using ovarian cancer as an example, shows the importance of a thorough causal design and analysis if one is expecting RWD to emulate clinical trial results.
Background
Chronic obstructive pulmonary disease (COPD) causes significant morbidity and mortality worldwide. Estimation of incidence, prevalence and disease burden through routine insurance data is challenging because of under-diagnosis and under-treatment, particularly for early stage disease in health care systems where outpatient International Classification of Diseases (ICD) diagnoses are not collected. This poses the question of which criteria are commonly applied to identify COPD patients in claims datasets in the absence of ICD diagnoses, and which information can be used as a substitute. The aim of this systematic review is to summarize previously reported methodological approaches for the identification of COPD patients through routine data and to compile potential criteria for the identification of COPD patients if ICD codes are not available.
Methods
A systematic literature review was performed in Medline via PubMed and Google Scholar from January 2000 through October 2018, followed by a manual review of the included studies by at least two independent raters. Study characteristics and all identifying criteria used in the studies were systematically extracted from the publications, categorized, and compiled in evidence tables.
Results
In total, the systematic search yielded 151 publications. After title and abstract screening, 38 publications were included into the systematic assessment. In these studies, the most frequently used (22/38) criteria set to identify COPD patients included ICD codes, hospitalization, and ambulatory visits. Only four out of 38 studies used methods other than ICD coding. In a significant proportion of studies, the age range of the target population (33/38) and hospitalization (30/38) were provided. Ambulatory data were included in 24, physician claims in 22, and pharmaceutical data in 18 studies. Only five studies used spirometry, two used surgery and one used oxygen therapy.
Conclusions
A variety of different criteria is used for the identification of COPD from routine data. The most promising criteria set in data environments where ambulatory diagnosis codes are lacking is the consideration of additional illness-related information with special attention to pharmacotherapy data. Further health services research should focus on the application of more systematic internal and/or external validation approaches.
To design cost-effective prevention strategies against mastitis in dairy cow farms, knowledge about infection pathways of causative pathogens is necessary. Therefore, we investigated the reservoirs of bacterial strains causing intramammary infections in one dairy cow herd. Quarter foremilk samples (n = 8056) and milking- and housing-related samples (n = 251; from drinking troughs, bedding material, walking areas, cow brushes, fly traps, milking liners, and milker gloves), were collected and examined using culture-based methods. Species were identified with MALDI-TOF MS, and selected Staphylococcus and Streptococcus spp. typed with randomly amplified polymorphic DNA-PCR. Staphylococci were isolated from all and streptococci from most investigated locations. However, only for Staphylococcus aureus, matching strain types (n = 2) were isolated from milk and milking-related samples (milking liners and milker gloves). Staphylococcus epidermidis and Staphylococcus haemolyticus showed a large genetic diversity without any matches of strain types from milk and other samples. Streptococcus uberis was the only Streptococcus spp. isolated from milk and milking- or housing-related samples. However, no matching strains were found. This study underlines the importance of measures preventing the spread of Staphylococcus aureus between quarters during milking.
Aim
Musculoskeletal disorders are a major public health problem in most developed countries. As a main cause of chronic pain, they have resulted in an increasing prescription of opioids worldwide. With regard to the situation in Germany, this study aimed at estimating the prevalence of musculoskeletal diseases such as chronic low back pain (CLBP) and hip/knee osteoarthritis (OA) and at depicting the applied treatment patterns.
Subject and methods
German claims data from the InGef Research Database were analyzed over a 6-year period (2011–2016). The dataset contains over 4 million people, enrolled in German statutory health insurances. Inpatient and outpatient diagnoses were considered for case identification of hip/knee OA and CLBP. The World Health Organization (WHO) analgesic ladder was applied to categorize patients according to their pain management interventions. Information on demographics, comorbidities, and adjuvant medication was collected.
Results
In 2016, n = 2,693,481 individuals (50.5% female, 49.5% male) were assigned to the study population; 62.5% of them were aged 18–60 years. In 2016, n = 146,443 patients (5.4%) with CLBP and n = 307,256 patients (11.4%) with hip/knee OA were identified. Of those with pre-specified pain management interventions (CLBP: 66.3%; hip/knee OA: 65.1%), most patients received WHO I class drugs (CLBP: 73.6%; hip/knee OA: 68.7%) as the highest level.
Conclusion
This study provides indications that CLBP and hip/knee OA are common chronic pain conditions in Germany, which are often subjected to pharmacological pain management. Compared to non-opioid analgesic prescriptions of the WHO I class, the dispensation of WHO class II and III opioids was markedly lower, though present to a considerable extent.
For the introduction of technical nursing care innovations, a usability assessment survey is conducted among nursing staff. The questionnaire is used before and after the introduction of the technical products. This poster contribution shows the latest comparison of pre- and post-surveys for selected products.
Music streaming platforms offer music listeners an overwhelming choice of music. Therefore, users of streaming platforms need the support of music recommendation systems to find music that suits their personal taste. Currently, a new class of recommender systems based on knowledge graph embeddings promises to improve the quality of recommendations, in particular to provide diverse and novel recommendations. This paper investigates how knowledge graph embeddings can improve music recommendations. First, it is shown how a collaborative knowledge graph can be derived from open music data sources. Based on this knowledge graph, the music recommender system EARS (knowledge graph Embedding-based Artist Recommender System) is presented in detail, with particular emphasis on recommendation diversity and explainability. Finally, a comprehensive evaluation with real-world data is conducted, comparing different embeddings and investigating the influence of different types of knowledge.
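One common way knowledge-graph-embedding recommenders score candidates is the translational (TransE-style) idea: a triple (head, relation, tail) is plausible if head + relation lies close to tail in embedding space. The 2-D vectors below are invented toy values, not embeddings learned by EARS, and the relation name is an assumption for illustration:

```python
# TransE-style scoring sketch: lower score = more plausible (user, relation, artist) triple.
def transe_score(head, relation, tail):
    """Euclidean distance between head + relation and tail."""
    return sum((h + r - t) ** 2 for h, r, t in zip(head, relation, tail)) ** 0.5

user = [0.1, 0.2]              # invented user embedding
listens_to = [0.3, 0.1]        # invented embedding for a "listens_to" relation
artists = {
    "A": [0.4, 0.3],           # close to user + listens_to -> strong candidate
    "B": [-0.8, 0.9],          # far away -> weak candidate
}

scores = {name: transe_score(user, listens_to, vec) for name, vec in artists.items()}
best = min(scores, key=scores.get)   # recommend the lowest-scoring artist
```

Because the score decomposes over explicit graph relations, each recommendation can be traced back to the triples that support it, which is one route to the explainability the paper emphasizes.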
Ability of Black-Box Optimisation to Efficiently Perform Simulation Studies in Power Engineering
(2023)
In this study, the potential of so-called black-box optimisation (BBO) to increase the efficiency of simulation studies in power engineering is evaluated. Three algorithms ("Multilevel Coordinate Search" (MCS) and "Stable Noisy Optimization by Branch and Fit" (SNOBFIT) by Huyer and Neumaier, and "blackbox: A Procedure for Parallel Optimization of Expensive Black-box Functions" (blackbox) by Knysh and Korkolis) are implemented in MATLAB and compared on two use cases: the analysis of the maximum rotational speed of a gas turbine after a load rejection, and the identification of transfer function parameters from measurements. The first use case has a high computational cost, whereas the second is computationally cheap. For each run of the algorithms, the accuracy of the found solution, the number of simulations or function evaluations needed to determine the optimum, and the overall runtime are used to assess the potential of the algorithms in comparison to currently used methods. All methods provide solutions for potential optima that are at least 99.8% accurate compared to the reference methods. The number of evaluations of the objective functions differs significantly but cannot be directly compared, as only the SNOBFIT algorithm stops when the found solution no longer improves, whereas the other algorithms use a predefined number of function evaluations. Therefore, SNOBFIT has the shortest runtime for both examples. For computationally expensive simulations, it is shown that parallelisation of the function evaluations (SNOBFIT and blackbox) and quantisation of the input variables (SNOBFIT) are essential for algorithmic performance. For the gas turbine overspeed analysis, only SNOBFIT can compete with the reference procedure concerning runtime.
Further studies will have to investigate whether the quantisation of input variables can be applied to other algorithms and whether the BBO algorithms can outperform the reference methods for problems with a higher dimensionality.
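The defining property of BBO discussed above is that the objective is only ever evaluated, never differentiated. The simplest instance of this is budget-limited random search; the sketch below uses a cheap quadratic as a stand-in for an expensive simulation (MCS, SNOBFIT, and blackbox use far more elaborate sampling and surrogate models, so this illustrates the problem setting, not those algorithms):

```python
# Budget-limited random search as a minimal black-box optimisation sketch.
import random

def black_box_optimise(objective, bounds, budget=2000, seed=42):
    """Minimise `objective` over box `bounds` using only function evaluations."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(budget):                           # fixed evaluation budget
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)                              # the only access to the "simulation"
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy stand-in objective with known optimum at (1, 2).
obj = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
x_opt, f_opt = black_box_optimise(obj, [(-5, 5), (-5, 5)])
```

The fixed evaluation budget mirrors the comparison issue raised above: unlike SNOBFIT's convergence-based stopping, this loop always spends its full budget, so evaluation counts across such methods are not directly comparable.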
Catalogs of competency-based learning objectives (CLO) were introduced and promoted as a prerequisite for high-quality, systematic curriculum development. While this is common in medicine, the consistent use of CLO is not yet well established in epidemiology, biometry, medical informatics, biomedical informatics, and nursing informatics, especially in Germany. This paper aims to identify the underlying obstacles and to give recommendations in order to promote the dissemination of CLO for curricular development in the health data and information sciences. To determine these obstacles and recommendations, a public online expert workshop was organized. This paper summarizes its findings.
Social comparison theories suggest that ingroups are strengthened whenever important outgroups are weakened (e.g., by losing status or power). It follows that ingroups have little reason to help outgroups facing an existential threat. We challenge this notion by showing that ingroups can also be weakened when relevant comparison outgroups are weakened, which can motivate ingroups to strategically offer help to ensure the outgroups' survival as a highly relevant comparison target. In three preregistered studies, we showed that an existential threat to an outgroup with high (vs. low) identity relevance affected strategic outgroup helping via two opposing mechanisms. The potential demise of a highly relevant outgroup increased participants’ perceptions of ingroup identity threat, which was positively related to helping. At the same time, the outgroup’s misery evoked schadenfreude, which was negatively related to helping. Our research exemplifies a group's secret desire for strong outgroups by underlining their importance for identity formation.
To learn a subject, acquiring the associated technical language is important.
Despite this widely accepted importance of learning the technical language, for most technical languages that students are supposed to learn, hardly any published studies describe their characteristics. This is largely due to the absence of specialized text corpora for studying such languages at the lexical, syntactic and textual levels. In the present paper we describe a corpus of German physics texts that can be used to study the language of physics. A large and a small variant are compiled. The small version of the corpus consists of 5.3 million words and is available on request.
The growing importance of renewable generation connected to distribution grids requires increased coordination between transmission system operators (TSOs) and distribution system operators (DSOs) for reactive power management. This work proposes a practical and effective interaction method based on sequential optimizations to evaluate the reactive flexibility potential of distribution networks and to dispatch them along with traditional synchronous generators, keeping the information exchange to a minimum. A modular optimal power flow (OPF) tool featuring multi-objective optimization is developed for this purpose. The proposed method is evaluated on a model of a real German 110 kV grid with 1.6 GW of installed wind power capacity and a reduced-order model of the surrounding transmission system. Simulations show the benefit of involving wind farms in reactive power support, reducing losses at both distribution and transmission levels. Different types of setpoints are investigated, showing that it is feasible for the DSO to also fulfill individual voltage and reactive power targets over multiple connection points. Finally, some suggestions are presented to achieve a fair coordination combining both TSO and DSO requirements.
Antimicrobials are widely used to cure intramammary infections (IMI) in dairy cows during the dry period (DP). Nevertheless, the IMI cure is influenced by many factors and not all quarters benefit from antimicrobial dry cow treatment (DCT). To evaluate the true effect of antibiotic DCT compared to self-cure and the role of causative pathogens in the IMI cure, a retrospective cross-sectional study was performed. The analysis included 2987 quarters infected at dry-off (DO). Information on DCT, causative pathogens, somatic cell count, milk yield, lactation number, Body Condition Score, and season and year of DO was combined into categorical variables. A generalized linear mixed model with random cow, farm and year effects and the binary outcome of bacteriological cure of IMI during the DP was fitted. In the final model, a significant effect (p < 0.05) on DP cure was seen for the DO season and the category of causative pathogens (categories being: Staphylococcus aureus, non-aureus staphylococci, streptococci, coliforms, ‘other Gram-negative bacteria’, ‘other Gram-positive bacteria’, non-bacterial infections and mixed infections), while antibiotic DCT (vs. non-antibiotic DCT) only showed a significant effect in combination with the pathogen categories streptococci and ‘other Gram-positive bacteria’.
The transfer of historically grown monolithic software architectures into modern service-oriented architectures creates many loose coupling points. This can lead to unforeseen system behavior and can significantly impede such continuous modernization processes, since it is not clear where bottlenecks in a system arise. It is therefore necessary to accompany such modernization processes with an adaptive monitoring concept to be able to correctly record and interpret unpredictable system dynamics. This contribution presents a generic QoS measurement framework for service-based systems. The framework consists of an XML-based specification of the measurements to be performed – the Information Model (IM) – and the QoS System, which provides an execution platform for the IM. The framework will be applied to a standard business process of the German insurance industry, and the concepts of the IM and their mapping to artifacts of the QoS System will be presented. Furthermore, the design and implementation of the QoS System’s parser and generator modules and the generated artifacts are explained in detail, e.g., event model, agents, measurement module and analyzer module.
Self-directed learning is an essential basis for lifelong learning and requires constantly changing, target group-specific and personalized prerequisites in order to motivate people to engage with modern learning content, not to overburden them, and yet to adequately convey complex contexts. Current challenges in dealing with digital resources such as information overload, reduction of complexity and focus, motivation to learn, self-control or psychological well-being are taken up in the conception of learning settings within our QpLuS IM project for the study programs Information Management and Information Management extra-occupational (IM) at the University of Applied Sciences and Arts Hannover. We present an interactive video on the functionality of search engines as a practical example of a high-quality, focused media-based self-learning format that has been produced methodically in line with our agile, media-didactic process and stage model of complexity levels.
Digital data on tangible and intangible cultural assets is an essential part of daily life, communication and experience. It has a lasting influence on the perception of cultural identity as well as on the interactions between research, the cultural economy and society. Throughout the last three decades, many cultural heritage institutions have contributed a wealth of digital representations of cultural assets (2D digital reproductions of paintings, sheet music, 3D digital models of sculptures, monuments, rooms, buildings), audio-visual data (music, film, stage performances), and procedural research data such as encoding and annotation formats. The long-term preservation and FAIR availability of research data from the cultural heritage domain is fundamentally important, not only for future academic success in the humanities but also for the cultural identity of individuals and society as a whole. Up to now, no coordinated effort for professional research data management on a national level exists in Germany. NFDI4Culture aims to fill this gap and create a user-centered, research-driven infrastructure that will cover a broad range of research domains from musicology, art history and architecture to performance, theatre, film, and media studies.
The research landscape addressed by the consortium is characterized by strong institutional differentiation. Research units in the consortium's community of interest comprise university institutes, art colleges, academies, galleries, libraries, archives and museums. This diverse landscape is also characterized by an abundance of research objects and methodologies and a great potential for data-driven research. In a unique effort carried out by the applicant and co-applicants of this proposal and ten academic societies, this community is interconnected for the first time through a federated approach that is ideally suited to the needs of the participating researchers. Promoting collaboration within the NFDI, sharing knowledge and technology, and providing extensive support for its users have been the guiding principles of the consortium from the beginning and will be at the heart of all workflows and decision-making processes. Thanks to these principles, NFDI4Culture has gathered strong support ranging from individual researchers to high-level cultural heritage organizations such as UNESCO, the International Council of Museums, the Open Knowledge Foundation and Wikimedia. On this basis, NFDI4Culture will take innovative measures that promote a cultural change towards a more reflective and sustainable handling of research data and at the same time boost qualification and professionalization in data-driven research in the domain of cultural heritage. This will create a long-lasting impact on science, the cultural economy and society as a whole.
Even for the more traditional insurance industry, the Microservices Architecture (MSA) style plays an increasingly important role in provisioning insurance services. However, insurance businesses must operate legacy applications, enterprise software, and service-based applications in parallel for an extended transition period. The ultimate goal of our ongoing research is to design a microservice reference architecture in cooperation with our industry partners from the insurance domain that provides an approach for the integration of applications from different architecture paradigms. In Germany, individual insurance services are classified as part of the critical infrastructure. Therefore, German insurance companies must comply with the Federal Office for Information Security requirements, which the Federal Supervisory Authority enforces. Additionally, insurance companies must comply with relevant laws, regulations, and standards as part of the business’s compliance requirements. Note: since Germany is seen as relatively 'tough' with respect to privacy and security demands, fulfilling those demands might well be suitable (if not even 'over-achieving') for insurers in other countries as well. The question thus arises of how insurance services can be secured in an application landscape shaped by the MSA style so as to comply with the architectural and security requirements depicted above. This article highlights the specific regulations, laws, and standards the insurance industry must comply with. We present initial architectural patterns to address authentication and authorization in an MSA tailored to the requirements of our insurance industry partners.
Cloud computing has become well established in private and public sector projects over the past few years, opening ever new opportunities for research and development, but also for education. One of these opportunities presents itself in the form of dynamically deployable, virtual lab environments, granting educational institutions increased flexibility with the allocation of their computing resources. These fully sandboxed labs provide students with their own, internal network and full access to all machines within, granting them the flexibility necessary to gather hands-on experience with building heterogeneous microservice architectures. The eduDScloud provides a private cloud infrastructure to which labs like the microservice lab outlined in this paper can be flexibly deployed at a moment’s notice.
In this paper, the workflow of the project 'Untersuchungs-, Simulations- und Evaluationstool für Urbane Logistik' (USEfUL) is presented. Aiming to create a web-based decision support tool for urban logistics, the project needed to integrate multiple steps into a single workflow, which in turn needed to be executed multiple times. While a fully service-oriented system could not be created, the principles of service orientation were applied to increase workflow efficiency and flexibility, allowing the workflow to be easily adapted to new concepts or research areas.
In the context of modern mobility, topics such as smart cities, Car2Car communication, extensive vehicle sensor data, e-mobility and charging point management systems have to be considered. These topics are often characterized by complex and extensive data situations. Vehicle position data, sensor data and vehicle communication data must be preprocessed, aggregated and analyzed. In many cases, the data are interdependent. For example, the position data of electric vehicles and of surrounding charging points depend on one another and characterize a competition situation between the vehicles. In the case of Car2Car communication, the positions of the vehicles must likewise be viewed in relation to each other; the data depend on each other and influence the ability to establish a communication link. Such dependencies can produce very complex and large data situations that can no longer be handled efficiently. In this work, a model is presented that maps such typical data situations with strong interdependencies among the data. Microservices can help reduce the resulting complexity.
Microservices form a deeply distributed system. Although this offers significant flexibility for development teams and helps to find solutions for scalability or security questions, it also intensifies the drawbacks of a distributed system. This article offers a decision framework that helps to increase the resiliency of microservices. A metamodel is used to represent services, resiliency patterns, and quality attributes. Furthermore, the general idea for a suggestion procedure is outlined.
Portable micro combined heat and power (CHP) units are a gateway technology bridging conventional vehicles and Battery Electric Vehicles (BEVs). For this new technology, new software has to be created that can be easily adapted to changing requirements. We propose and evaluate three different architectures based on three architectural paradigms. Using a scenario-based evaluation, we conclude that a Service-Oriented Architecture (SOA) using microservices provides a higher-quality solution than a layered or Event-Driven Complex-Event-Processing (ED-CEP) approach. Future work will include implementation and simulation-driven evaluation.
FID Civil Engineering, Architecture and Urbanism digital - A platform for science (BAUdigital)
(2022)
University Library Braunschweig (UB Braunschweig), University and State Library Darmstadt (ULB Darmstadt), TIB – Leibniz Information Centre for Technology and Natural Sciences and the Fraunhofer Information Centre for Planning and Building (Fraunhofer IRB) are jointly establishing a specialised information service (FID, "Fachinformationsdienst") for the disciplines of civil engineering, architecture and urbanism. The FID BAUdigital, which is funded by the German Research Foundation (DFG, "Deutsche Forschungsgemeinschaft"), will provide researchers working on digital design, planning and production methods in construction engineering with a joint information, networking and data exchange platform and support them with innovative services for documentation, archiving and publication in their data-based research.
Operators of production plants are increasingly emphasizing secure communication, including real-time communication such as PROFINET, within their control systems. This trend is further advanced by standards like IEC 62443, which demand the protection of real-time communication in the field. PROFIBUS and PROFINET International (PI) is working on the specification of the security extensions for PROFINET (“PROFINET Security”), which shall fulfill the requirements of secure communication in the field.
This paper discusses the matter in three parts. First, the roles and responsibilities of the plant owner, the system integrator, and the component provider regarding security, and the basics of the IEC 62443 will be described. Second, a conceptual overview of PROFINET Security, as well as a status update about the PI specification work will be given. Third, the article will describe how PROFINET Security can contribute to the defense-in-depth approach, and what the expected operating environment is. We will evaluate how PROFINET Security contributes to fulfilling the IEC 62443-4-2 standard for automation components.
Two of the authors are members of the PI Working Group CB/PG10 Security.
The usage of microservices promises many benefits concerning scalability and maintainability; however, rewriting large monoliths is not always possible. In scientific projects especially, pure microservice architectures are therefore not feasible in every case. We propose utilizing microservice principles for the construction of microsimulations for urban transport. We present a prototypical architecture for connecting MATSim and AnyLogic, two widely used simulation tools in the context of urban transport simulation. The proposed system combines the two tools into a single tool supporting civil engineers in decision making on innovative urban transport concepts.
To avoid the shortcomings of traditional monolithic applications, the Microservices Architecture (MSA) style plays an increasingly important role in providing business services. This is true even for the more conventional insurance industry with its highly heterogeneous application landscape and sophisticated cross-domain business processes. Therefore, the question arises of how workflows can be implemented so as, on the one hand, to grant the required flexibility and agility and, on the other hand, to exploit the potential of the MSA style. In this article, we present two different approaches – orchestration and choreography. Using an application scenario from the insurance domain, both concepts are discussed. We introduce a pattern that outlines the mapping of a workflow to a choreography.
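The two coordination styles contrasted above can be caricatured in a few lines. The sketch below uses a hypothetical claim-handling workflow (service names and steps are invented for illustration, and services are simplified to plain functions): in orchestration a central engine owns the control flow, while in choreography each service merely reacts to events on a shared bus.

```python
# --- Orchestration: a central workflow engine calls each service in turn. ---
def validate(c): return {**c, "valid": True}
def assess(c):   return {**c, "amount": 100}
def pay_out(c):  return {**c, "paid": True}

def orchestrated_claim(claim):
    """The orchestrator owns the control flow end to end."""
    return pay_out(assess(validate(claim)))

# --- Choreography: services react to events; no central owner of the flow. ---
subscribers = {}

def on(event):
    """Register a handler for an event type."""
    def register(handler):
        subscribers.setdefault(event, []).append(handler)
        return handler
    return register

def emit(event, payload):
    for handler in subscribers.get(event, []):
        handler(payload)

@on("claim_submitted")
def _validate(c):
    c["valid"] = True
    emit("claim_validated", c)

@on("claim_validated")
def _assess(c):
    c["amount"] = 100
    emit("claim_assessed", c)

@on("claim_assessed")
def _pay(c):
    c["paid"] = True
```

The trade-off the article discusses is visible even here: the orchestrator makes the overall flow explicit in one place, whereas in the choreography the flow only emerges from the chain of event subscriptions.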
With the use of an energy management system according to ISO 50001 in an industrial company, a step-by-step increase in energy efficiency can be achieved. The realization of energy monitoring and load management functions requires programs on edge devices or PLCs to acquire the data, adapt the data types, or scale the values of the energy information. In addition, the energy information must be mapped to communication interfaces (e.g. based on OPC UA) in order to convey it to the energy management application. The development of these energy management programs is associated with high engineering effort, because the field devices of the heterogeneous field level do not provide the energy information in standardized semantics. To reduce this engineering effort, a universal energy data information model (UEIM) is developed and presented in this paper.
Microservices are by now an established software engineering approach that more and more companies are examining and adopting for their development work. Naturally, reference architectures based on microservices come to mind as a valuable asset. Initial results for such architectures have been published in generic and in domain-specific form. To the best of our knowledge, however, a microservices-based reference architecture that takes the specifics of the insurance industry domain into account is still missing. Jointly with partners from the German insurance industry, we take initial steps to fill this gap in the present article. We thus aim towards a microservices-based reference software architecture for (at least German) insurance companies. As the main results of this article, we provide an initial version of such a reference architecture together with a deeper look into two important parts of it.
Wikidata and Wikibase as complementary research data management services for cultural heritage data
(2022)
The NFDI (German National Research Data Infrastructure) consortia are associations of various institutions within a specific research field, which work together to develop common data infrastructures, guidelines, best practices and tools that conform to the principles of FAIR data. Within the NFDI, a common question is: what is the potential of Wikidata to be used as an application for science and research? In this paper, we address this question by tracing current research use cases and applications for Wikidata, its relation to standalone Wikibase instances, and how the two can function as complementary services to meet a range of research needs. This paper builds on lessons learned through the development of open data projects and software services within the Open Science Lab at TIB, Hannover, in the context of NFDI4Culture – the consortium including participants across the broad spectrum of the digital libraries, archives, and museums field, and the digital humanities.
The transfer of historically grown monolithic software architectures into modern service-oriented architectures creates many loose coupling points. This can lead to unforeseen system behavior and can significantly impede such continuous modernization processes, since it is not clear where bottlenecks in a system arise. It is therefore necessary to accompany such modernization processes with an adaptive monitoring concept in order to be able to correctly record and interpret unpredictable system dynamics. For this purpose, a general measurement methodology and a specific implementation concept are presented in this work.
A new FOSS (free and open source software) toolchain and associated workflow are being developed in the context of NFDI4Culture, a German consortium of research and cultural heritage institutions working towards a shared infrastructure for research data that meets the needs of 21st-century data creators, maintainers and end users across the broad spectrum of the digital libraries and archives field and the digital humanities. This short paper and demo present how the integrated toolchain connects: 1) OpenRefine - for data reconciliation and batch upload; 2) Wikibase - for linked open data (LOD) storage; and 3) Kompakkt - for rendering and annotating 3D models. The presentation is aimed at librarians, digital curators and data managers interested in learning how to manage research datasets containing 3D media, and how to make them available within an open data environment with 3D rendering and collaborative annotation features.
In microservice architectures, data is often held redundantly to create an overall resilient system. Although the synchronization of this data poses a significant challenge, not much research has been done on this topic yet. This paper shows four general approaches for assuring consistency among services and demonstrates how to identify the best solution for a given architecture. For this, a microservice architecture that implements the functionality of a mainframe-based legacy system from the insurance industry serves as an example.
Microservices is an architectural style for complex application systems that promises crucial benefits, e.g. better maintainability, flexible scalability, and fault tolerance. For this reason, microservices have attracted attention in the software development departments of different industry sectors, such as e-commerce and streaming services. On the other hand, businesses face great challenges that hamper the adoption of the architectural style. For instance, data are often persisted redundantly to provide fault tolerance, but the synchronization of those data for the sake of consistency is a major challenge. Our paper presents a case study from the insurance industry which focuses on consistency issues when migrating a monolithic core application towards microservices. Based on the Domain-Driven Design (DDD) methodology, we derive bounded contexts and a set of microservices assigned to these contexts. We discuss four different approaches to ensure consistency and propose a best practice to identify the most appropriate approach for a given scenario. Design and implementation details as well as compliance issues are presented.
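One commonly used family of approaches for keeping redundantly held data consistent is event-based, eventually consistent synchronization. The following hypothetical sketch (class and event names are invented, and the in-process "log" stands in for a real message broker) shows the basic mechanism: the owning service publishes every change as an event, and a second service keeps its redundant copy in sync by consuming those events.

```python
import json

class EventLog:
    """Append-only event log shared by the services (stand-in for a broker)."""
    def __init__(self):
        self.events = []        # persisted, replayable history
        self.consumers = []     # subscribed handler callables

    def publish(self, event):
        self.events.append(json.dumps(event))   # persist the event
        for consumer in self.consumers:         # then notify consumers
            consumer(event)

class CustomerService:
    """Owns the customer master data and publishes every change."""
    def __init__(self, log):
        self.db = {}
        self.log = log

    def update_address(self, cid, address):
        self.db[cid] = address
        self.log.publish({"type": "AddressChanged",
                          "id": cid, "address": address})

class ContractService:
    """Holds a redundant copy, kept eventually consistent via events."""
    def __init__(self, log):
        self.replica = {}
        log.consumers.append(self.apply)

    def apply(self, event):
        if event["type"] == "AddressChanged":
            self.replica[event["id"]] = event["address"]
```

Because the persisted event history can be replayed, a consumer that was temporarily offline can catch up later; the price is a window of time in which the replica lags behind the master data.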
One of the main concerns of this publication is to furnish a more rational basis for discussing bioplastics and to support fact-based arguments in the public discourse. Furthermore, “Biopolymers – facts and statistics” aims to provide specific, qualified answers easily and quickly, in particular for decision-makers from public administration and the industrial sector. Therefore, this publication is structured like a set of rules and standards and largely forgoes textual detail. It offers extensive market-relevant and technical facts presented in graphs and charts, which makes the information much easier to grasp. The reader can expect comparative market figures for various materials, regions, applications, process routes, agricultural land use, water use or resource consumption, production capacities, geographic distribution, etc.
During machine milking, pathogenic microorganisms can be transmitted from cow to cow through liners. Therefore, in Germany, a spray method for the intermediate disinfection of the milking cluster is often used for prevention. This method of cluster disinfection is easy to perform, requires little time and no extra materials, and the disinfection solution is protected from outside contamination in the spray bottle. Since no data from a systematic efficacy trial are available, the aim of this study was to determine the microbial reduction effect of intermediate disinfection. Therefore, laboratory and field trials were conducted. In both trials, two bursts of 0.85 mL of different disinfectant solutions were sprayed into the contaminated liners. For sampling, a quantitative swabbing method using a modified wet–dry swab (WDS) technique based on DIN 10113-1:1997-07 was applied. Thus, the effectiveness of disinfectants based on Peracetic Acid, Hydrogen Peroxide and Plasma-Activated Buffered Solution (PABS) was compared. In the laboratory trial, the inner surfaces of liners were contaminated with pure cultures of Escherichia (E.) coli, Staphylococcus (S.) aureus, Streptococcus (Sc.) uberis and Sc. agalactiae. The disinfection of the contaminated liners with the disinfectants resulted in a significant reduction in bacteria, with values averaging 1 log for E. coli, 0.7 log for S. aureus, 0.7 log for Sc. uberis and 0.8 log for Sc. agalactiae. The highest reduction was obtained for contamination with E. coli (1.3 log) and Sc. uberis (0.8 log) when PABS was applied, and for contamination with S. aureus (1.1 log) and Sc. agalactiae (1 log) when Peracetic Acid Solution (PAS) was used. Treatment with sterile water only led to an average reduction of 0.4 log. In the field trial, after the milking of 575 cows, the liners were disinfected and the total microorganism count on the liner surface was determined. The reduction was measured against an untreated liner within the cluster.
Although a reduction in microorganisms was achieved in the field trial, it was not significant. When using PAS, a log reduction of 0.3 was achieved; when using PABS, a log reduction of 0.2 was obtained. The difference between the two disinfection methods was also not significant. Treatment with sterile water only led to a reduction of 0.1 log. The results show that spray disinfection under these circumstances does result in a reduction in the bacteria on the milking liner surface, but for effective disinfection a higher reduction would be preferred.
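The reductions above are reported as decimal (log10) reduction factors. As a minimal illustration of how the metric is computed (the counts below are invented for illustration, not study data):

```python
import math

def log_reduction(cfu_before, cfu_after):
    """Decimal (log10) reduction factor between pre- and
    post-disinfection colony counts."""
    return math.log10(cfu_before / cfu_after)

# A drop from 10^6 to 10^5 CFU is a 1-log reduction,
# i.e. 90% of the organisms were removed.
```

On this scale, the roughly 1-log laboratory reductions correspond to about a 90% kill, while the 0.2 to 0.3-log field reductions correspond to well under 50%.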
According to the third-person effect or the influence of presumed media influence approach, the presumption that the media has strong effects on other people can affect individuals’ attitudes and behavior. For instance, if people believe in strong media influences on others, they are more likely to increase their communication activities or support demands for restrictions on media. A standardized online survey among German journalists (N = 960) revealed that the stronger the journalists perceive the political online influence on the public to be, the more frequently they contradict unwanted political views in their articles. Moreover, even journalists are more likely to approve of restrictions on the Internet’s political influence, the stronger they believe the effects of online media to be. The data reveal no connections between communication activities and demands for restrictions.