Obesity and excess adiposity account for approximately 20% of all cancer cases; however, biomarkers of risk remain to be elucidated. While fibroblast growth factor-2 (FGF2) is emerging as an attractive candidate biomarker for visceral adipose tissue mass, the role of circulating FGF2 in malignant transformation remains unknown. Moreover, functional assays for biomarker discovery are limited. We sought to determine if human serum could stimulate the 3D growth of a non-tumorigenic cell line. This type of anchorage-independent 3D growth in soft agar is a surrogate marker for the acquired tumorigenicity of cell lines. We found that human serum from cancer-free men and women has the potential to stimulate growth in soft agar of non-tumorigenic epithelial JB6 P+ cells. In a pilot study of n = 33 men and women, we examined the relationship between circulating FGF2 levels and malignant transformation in vitro. Serum FGF2 levels were not associated with colony formation in epithelial cells (r = 0.05, p = 0.80); however, a fibroblast growth factor receptor-1 (FGFR1) selective inhibitor significantly blocked serum-stimulated transformation, suggesting that FGF2 activation of FGFR1 may be necessary, but not sufficient, for the transforming effects of human serum. This pilot study indicates that the FGF2/FGFR1 axis plays a role in JB6 P+ malignant transformation and describes an assay to determine critical serum factors that have the potential to promote tumorigenesis.
Training and evaluating deep learning models on road graphs for traffic prediction using SUMO
(2024)
The escalation of traffic volume in urban areas poses multifaceted challenges, including increased accident risks, congestion, and prolonged travel times. The traditional approach of expanding road infrastructure faces limitations such as space constraints and the potential exacerbation of traffic issues.
Intelligent Transport Systems (ITS) present an alternative strategy to alleviate traffic problems by leveraging data-driven solutions. Central to ITS is traffic prediction, a process vital for applications like Traffic Management and Navigation Systems.
Recent advancements in traffic prediction have attracted a surge of interest, particularly in deep learning methods optimized for graph-based data processing, which are currently considered the most promising avenue.
These methods typically rely on real-life datasets containing traffic sensor data such as METR-LA and PeMS. However, the finite nature of real-life data prompts exploration into augmenting training and testing datasets with simulated traffic data.
This thesis explores the potential of utilizing traffic simulations, employing the microscopic traffic simulator SUMO, to train and test deep learning models for traffic prediction. A framework integrating PyTorch and SUMO is proposed for this purpose, aiming to elucidate the feasibility and effectiveness of using simulated traffic data for enhancing predictive models in traffic management systems.
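To make the data flow concrete, the following minimal sketch (not the thesis's actual framework) shows the kind of road-graph structure such a SUMO-to-PyTorch pipeline would exchange: per-segment mean speeds on an adjacency structure, here smoothed by a naive neighbor-averaging baseline of the sort that learned models are typically compared against. All segment names and values are hypothetical.

```python
# Toy road graph: segments and their direct neighbors, plus current
# mean speeds (km/h) as a SUMO simulation could export them.
adjacency = {
    "A": ["B"],
    "B": ["A", "C"],
    "C": ["B"],
}
speeds = {"A": 50.0, "B": 30.0, "C": 40.0}

def predict_next(speeds, adjacency, alpha=0.5):
    """Naive baseline: predict each segment's next-step speed as a mix
    of its own speed and the mean of its neighbors' speeds -- the same
    graph structure that graph neural networks exploit in learned form."""
    pred = {}
    for node, neighbors in adjacency.items():
        neighbor_mean = sum(speeds[n] for n in neighbors) / len(neighbors)
        pred[node] = alpha * speeds[node] + (1 - alpha) * neighbor_mean
    return pred

print(predict_next(speeds, adjacency))  # → {'A': 40.0, 'B': 37.5, 'C': 35.0}
```

A deep model would replace `predict_next` with a trainable module, but the input representation (node features plus adjacency) stays the same.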
Logical Observation Identifiers Names and Codes (LOINC) is a common terminology used for standardizing laboratory terms. Within the consortium of the HiGHmed project, LOINC is one of the central terminologies used for health data sharing across all university sites. Therefore, linking the LOINC codes to the site-specific tests and measures is a crucial step towards reaching this goal. In this work, we report our ongoing efforts in implementing LOINC in our laboratory information system and research infrastructure, as well as our challenges and the lessons learned. 407 local terms could be mapped to 376 LOINC codes, of which 209 are already available in routine laboratory data. In our experience, mapping local terms to LOINC is a largely manual and time-consuming process, owing to language issues and the expert knowledge of local laboratory procedures it requires.
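The core artifact of such an effort is a mapping table from local terms to LOINC codes. The sketch below illustrates the idea with a few invented German local terms; the LOINC codes shown are real serum-chemistry codes, but the mapping itself is purely illustrative, as is the unmappable entry.

```python
# Hypothetical local-term-to-LOINC mapping table. A None value marks a
# local term for which no suitable LOINC code has been found yet.
local_to_loinc = {
    "Natrium (Serum)": "2951-2",   # Sodium [Moles/volume] in Serum or Plasma
    "Kalium (Serum)":  "2823-3",   # Potassium [Moles/volume] in Serum or Plasma
    "Kreatinin":       "2160-0",   # Creatinine [Mass/volume] in Serum or Plasma
    "Hauslabor-Spezial": None,     # invented in-house test, not yet mapped
}

# Coverage statistics of the kind reported in the abstract
# (407 local terms mapped to 376 LOINC codes).
mapped = {term: code for term, code in local_to_loinc.items() if code is not None}
coverage = len(mapped) / len(local_to_loinc)
print(f"{len(mapped)}/{len(local_to_loinc)} terms mapped ({coverage:.0%})")
```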
The German Corona Consensus (GECCO) established a uniform dataset in FHIR format for exchanging and sharing interoperable COVID-19 patient-specific data between the health information systems (HIS) of universities. To share COVID-19 information with other sites that use openEHR, the data must be converted into FHIR format. In this paper, we introduce our solution, a web tool named “openEHR-to-FHIR”, which converts compositions from an openEHR repository and stores them in their respective GECCO FHIR profiles. The tool provides a REST web service for ad hoc conversion of openEHR compositions to FHIR profiles.
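Schematically, such a conversion flattens an openEHR composition into a FHIR resource. The sketch below is a heavily simplified illustration of that shape, not the tool's actual code: real GECCO profiles carry much more structure, and the field names of the input dictionary are invented.

```python
# Illustrative only: map a (greatly simplified) openEHR-composition-like
# dict onto a FHIR-Observation-like dict. Real conversions address full
# openEHR archetype paths and GECCO profile constraints.
def composition_to_observation(composition):
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": composition["name"]},
        "valueQuantity": {
            "value": composition["magnitude"],
            "unit": composition["units"],
        },
    }

comp = {"name": "Body temperature", "magnitude": 38.2, "units": "Cel"}
print(composition_to_observation(comp))
```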
Renewable energy production is one of the fastest-growing markets, and further strong growth can be anticipated due to the desire for increased sustainability in many parts of the world. With the rising adoption of renewable power production, such facilities become increasingly attractive targets for cyber attacks. At the same time, higher requirements for reliable production arise. In this paper, we propose a concept that improves the monitoring of renewable power plants by detecting anomalous behavior. The system not only detects an anomaly but also provides reasoning for it, based on a specific mathematical model of the expected behavior, by giving detailed information about the various influential factors causing the alert. The set of influential factors can be configured in the system before learning normal behavior. The concept is based on multidimensional analysis and has been implemented and successfully evaluated on real data from different operators of wind power plants.
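The abstract does not give the mathematical model, but the general pattern can be sketched: compare the measured output against an expected-behavior model and, when an alert fires, report the influential factors behind it. The simplified cubic wind-power relation and all thresholds and numbers below are assumptions for illustration only.

```python
# Illustrative sketch, not the paper's model: expected power below rated
# wind speed is assumed proportional to v^3; an anomaly is flagged when
# the relative deviation exceeds a configured tolerance.
def expected_power(wind_speed, coeff=0.5):
    """Assumed expected power output (kW) ~ coeff * v^3."""
    return coeff * wind_speed ** 3

def check(measured_kw, wind_speed, tolerance=0.2):
    """Return an alert record with reasoning context: the modeled
    expectation, the relative deviation, and the influential factors."""
    expected = expected_power(wind_speed)
    deviation = (measured_kw - expected) / expected
    return {
        "anomaly": abs(deviation) > tolerance,
        "expected_kw": expected,
        "deviation": deviation,
        "factors": {"wind_speed_mps": wind_speed},
    }

print(check(measured_kw=20.0, wind_speed=5.0))  # expected 62.5 kW → anomaly
```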
Purpose: Radiology reports mostly contain free-text, which makes it challenging to obtain structured data. Natural language processing (NLP) techniques transform free-text reports into machine-readable document vectors that are important for creating reliable, scalable methods for data analysis. The aim of this study is to classify unstructured radiograph reports according to fractures of the distal fibula and to find the best text mining method.
Materials & Methods: We established a novel German language report dataset: a designated search engine was used to identify radiographs of the ankle and the reports were manually labeled according to fractures of the distal fibula. This data was used to establish a machine learning pipeline, which implemented the text representation methods bag-of-words (BOW), term frequency-inverse document frequency (TF-IDF), principal component analysis (PCA), non-negative matrix factorization (NMF), latent Dirichlet allocation (LDA), and document embedding (doc2vec). The extracted document vectors were used to train neural networks (NN), support vector machines (SVM), and logistic regression (LR) to recognize distal fibula fractures. The results were compared via cross-tabulations of the accuracy (acc) and area under the curve (AUC).
Results: In total, 3268 radiograph reports were included, of which 1076 described a fracture of the distal fibula. Comparison of the text representation methods showed that BOW achieved the best results (AUC = 0.98; acc = 0.97), followed by TF-IDF (AUC = 0.97; acc = 0.96), NMF (AUC = 0.93; acc = 0.92), PCA (AUC = 0.92; acc = 0.90), LDA (AUC = 0.91; acc = 0.89) and doc2vec (AUC = 0.90; acc = 0.88). When comparing the different classifiers, NN (AUC = 0.91) proved to be superior to SVM (AUC = 0.87) and LR (AUC = 0.85).
Conclusion: An automated classification of unstructured reports of radiographs of the ankle can reliably detect findings of fractures of the distal fibula. A particularly suitable feature extraction method is the BOW model.
Key Points:
- The aim was to classify unstructured radiograph reports according to distal fibula fractures.
- Our automated classification system can reliably detect fractures of the distal fibula.
- A particularly suitable feature extraction method is the BOW model.
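The two best-performing representations from the study, bag-of-words and TF-IDF, can be sketched by hand on toy report snippets. This is a minimal illustration of the feature-extraction step only (the study's pipeline worked on full German radiograph reports); the snippets are invented.

```python
import math
from collections import Counter

# Invented miniature "reports" (German, lowercased, pre-tokenized by spaces).
docs = [
    "fraktur der distalen fibula",
    "kein nachweis einer fraktur",
    "distale fibula unauffaellig",
]

vocab = sorted({w for d in docs for w in d.split()})

def bow(doc):
    """Bag-of-words: raw term counts over the shared vocabulary."""
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]

def tfidf(doc):
    """TF-IDF: term count weighted by log inverse document frequency,
    so terms occurring in many documents are down-weighted."""
    counts = Counter(doc.split())
    n = len(docs)
    vec = []
    for w in vocab:
        df = sum(1 for d in docs if w in d.split())
        vec.append(counts[w] * math.log(n / df))
    return vec

print(bow(docs[0]))
print([round(x, 2) for x in tfidf(docs[0])])
```

In the TF-IDF vector, "fraktur" (present in two of three documents) receives a lower weight than the document-specific "der", which is the behavior that distinguishes TF-IDF from plain BOW.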
The Wnt signaling pathway has been associated with many essential cell processes. This study aims to examine the effects of Wnt signaling on the proliferation of cultured HEK293T cells. Cells were incubated with Wnt3a, and the activation of the Wnt pathway was followed by analysis of the level of the β-catenin protein and of the expression levels of the target genes MYC and CCND1. The level of β-catenin protein increased up to fourfold. While the mRNA levels of c-Myc and cyclin D1 increased slightly, the protein levels increased up to a factor of 1.5. Remarkably, MTT and BrdU assays showed different results when measuring the proliferation rate of Wnt3a-stimulated HEK293T cells. In the BrdU assays, an increase in the proliferation rate could be detected, which correlated with the applied Wnt3a concentration. In contrast, this correlation could not be shown in the MTT assays. The MTT results, which are based on mitochondrial activity, were confirmed by analysis of the succinate dehydrogenase complex by immunofluorescence and by western blotting. Taken together, our study shows that Wnt3a activates the proliferation of HEK293T cells. These effects can be detected by measuring DNA synthesis rather than by measuring changes in mitochondrial activity.
In industrial production facilities, technical Energy Management Systems are used to measure, monitor, and display energy-consumption-related information. The measurements take place at the field device level of the automation pyramid. The measured values are recorded and processed at the control level. The functionalities to monitor and display energy data are located at the MES level of the automation pyramid. Thus, the energy data from all PLCs have to be aggregated, structured, and provided to higher-level systems. This contribution introduces a concept for an Energy Data Aggregation Layer, which provides the functionality described above. For the implementation of this Energy Data Aggregation Layer, a combination of AutomationML and OPC UA is used.
Integrated Risk and Opportunity Management (IROM) goes far beyond what is found in organizations today. However, it offers the best opportunity not only to keep pace with the VUCA world, but to actually profit from it. Accordingly, the introduction of opportunity-based thinking in addition to risk-based thinking is part of the design specification for ISO 9000 and ISO 9001. The prerequisite for the successful design of an IROM is the individual definition, control and integration of risk and opportunity management processes, considering eight success factors, the "8 C". Top management benefits directly from the result: better, coordinated decision memos enable faster and more appropriate decisions.
Harmonisation of German Health Care Data Using the OMOP Common Data Model – A Practice Report
(2023)
Data harmonization is an important step in large-scale data analysis and for generating evidence from real-world data in healthcare. With the OMOP common data model, a relevant instrument for data harmonization is available that is being promoted by different networks and communities. At Hannover Medical School (MHH) in Germany, an Enterprise Clinical Research Data Warehouse (ECRDW) has been established, and the harmonization of this data source is the focus of this work. We present MHH’s first implementation of the OMOP common data model on top of the ECRDW data source and demonstrate the challenges concerning the mapping of German healthcare terminologies to a standardized format.
Autonomous and integrated passenger and freight transport (APFIT) is a promising approach to tackle both traffic- and last-mile-related issues such as environmental emissions, social and spatial conflicts, or operational inefficiencies. By conducting an agent-based simulation, we shed light on this widely unexplored research topic and provide first indications regarding influential target figures of such a system in the rural area of Sarstedt, Germany. Our results show that larger fleets entail inefficiencies due to suboptimal utilization of monetary and material resources and increase traffic volume, while higher numbers of unused vehicles may exacerbate spatial conflicts. Nevertheless, to fit the given demand within our study area, a comparatively large fleet of about 25 vehicles is necessary to provide reliable service, assuming maximum passenger waiting times of six minutes, at the expense of higher standby times, rebalancing effort, and higher costs for vehicle acquisition and maintenance.
The NOA project collects and stores images from open access publications and makes them findable and reusable. During the project a focus group workshop was held to determine whether the development is addressing researchers’ needs. This took place before the second half of the project so that the results could be considered for further development since addressing users’ needs is a big part of the project. The focus was to find out what content and functionality they expect from image repositories.
In a first step, participants were asked to fill out a survey about their image use. Secondly, they tested different use cases on the live system. The first finding is that users have a need to find scholarly images, but it is not a routine task, and they often do not know of any image repositories. This is another reason for repositories to become more open and to reach users by integrating with other content providers. The second finding is that users paid attention to image licenses but struggled to find and interpret them, while also being unsure how to cite images. In general, there is a high demand for reusing scholarly images, but the existing infrastructure has room to improve.
Building a well-founded understanding of the concepts, tasks, and limitations of IT in all areas of society is an essential prerequisite for future developments in business and research. This applies in particular to the healthcare sector and medical research, which are affected by the noticeable advances in digitization. In the transfer project “Zukunftslabor Gesundheit” (ZLG), a teaching framework was developed to support the development of further-education online courses in order to teach heterogeneous groups of learners independently of location and prior knowledge. The study at hand describes the development and components of the framework.
Powder bed-based additive manufacturing processes offer extended freedom in design and enable the processing of metals, ceramics, and polymers with a high level of relative density. The latter is a prevalent measure of process and component quality, which depends on various input variables. A key point in this context is the condition of the powder beds. To enhance comprehension of their particle-level formation and facilitate process optimization, simulations based on the Discrete Element Method are increasingly employed in research. To generate qualitatively as well as quantitatively reliable simulation results, an adaptation of the contact model parameterization is necessary. However, current adaptation methods often require the implementation of models that significantly increase computational effort, thereby limiting their applicability. To counteract this obstacle, a sophisticated formula-based adaptation and evaluation method is presented in this research. Additionally, the developed method enables accelerated parameter determination with limited experimental effort. Thus, it represents an integrative component, which supports further research efforts based on the Discrete Element Method by significantly reducing the parameterization effort. The universal nature of the derivation of this method also allows its adaptation to similar parameterization problems and its implementation in other fields of research.
Pathologists need to identify abnormal changes in tissue. With advancing digitalization, tissue slides are increasingly stored digitally. This enables pathologists to annotate regions of interest with the support of software tools. PathoLearn is a web-based learning platform explicitly developed for the teacher-student scenario, in which the goal is that students learn to identify potentially abnormal changes. Artificial intelligence (AI) and machine learning (ML) have become very important in medicine. Many health sectors already utilize AI and ML, and this will only increase in the future, also in the field of pathology. Therefore, it is important to teach students the fundamentals and concepts of AI and ML early in their studies. Additionally, creating and training AI models generally requires knowledge of programming and technical details. This thesis evaluates how this barrier can be overcome by comparing existing end-to-end AI platforms and teaching tools for AI. It was shown that a visual programming editor offers a fitting abstraction for creating neural networks without programming. This was extended with real-time collaboration to enable students to work in groups. Additionally, an automatic training feature was implemented, removing the necessity to know technical details about training neural networks.
After kidney transplantation, graft rejection must be prevented. Therefore, a multitude of patient parameters is observed pre- and postoperatively. To support this process, the Screen Reject research project is developing a data warehouse optimized for kidney rejection diagnostics. In the course of this project, it was discovered that important information is only available in the form of free text instead of structured data and can therefore not be processed by standard ETL tools, which is necessary to establish a digital expert system for rejection diagnostics. For this reason, data integration has been improved by combining methods from natural language processing with methods from image processing. Based on state-of-the-art data warehousing technologies (Microsoft SSIS), a generic data integration tool has been developed. The tool was evaluated by extracting Banff classifications from 218 pathology reports and extracting HLA mismatches from about 1700 PDF files, both written in German.
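The extraction of Banff lesion scores from free text can be sketched with a regular expression over the typical short-code notation (e.g. `g1 i2 t1 v0` for glomerulitis, interstitial inflammation, tubulitis, and intimal arteritis grades). This is a simplified illustration, not the project's actual SSIS-integrated tool, and the sample sentence is invented.

```python
import re

# Match a Banff lesion letter (g, i, t, v) followed by a grade 0-3,
# as it commonly appears in pathology report shorthand.
BANFF = re.compile(r"\b([gitv])\s*([0-3])\b", re.IGNORECASE)

def extract_banff(text):
    """Return {lesion: grade} for all Banff short codes found in the text."""
    return {lesion.lower(): int(grade) for lesion, grade in BANFF.findall(text)}

report = "Befund vereinbar mit akuter Abstossung, Banff: g1 i2 t1 v0."
print(extract_banff(report))  # → {'g': 1, 'i': 2, 't': 1, 'v': 0}
```

The word boundaries (`\b`) keep the pattern from firing inside ordinary German words; a production extractor would additionally handle negations and the remaining Banff categories.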
In this poster we present the ongoing development of an integrated free and open source toolchain for semantic annotation of digitised cultural heritage. The toolchain development involves the specification of a common data model that aims to increase interoperability across diverse datasets and to enable new collaborative research approaches.
This paper aims to provide a structured overview of four open, participatory formats that are particularly applicable in inquiry-based teaching and learning contexts: hackathons, book sprints, barcamps, and learning circles. Using examples, mostly from the work and experience context of the Open Science Lab at TIB Hannover, we address concrete processes, working methods, possible outcomes and challenges.
The compilation offers an introduction to the topic and is intended to provide tools for testing in practice.
Techno-economic analyses that allocate costs to the energy flows of energy systems are helpful for understanding the formation of costs within processes and for increasing cost efficiency. For the economic evaluation, the usefulness or quality of the energy is of great importance. In exergy-based methods, this is considered by allocating costs to the exergy instead of the energy. As exergy represents the ability to perform work, it is often called the useful part of energy. In contrast, the anergy, i.e. the part of energy that cannot perform work, is often assumed to be not useful.
However, heat flows, as used e.g. in domestic heating, are always a mixture of a relatively small portion of exergy and a large portion of anergy. Although of lower quality, the anergy is obviously useful for these applications. The question is whether it makes sense to differentiate between exergy and anergy and to take both properties into account for the economic evaluation.
To answer this question, a new methodical concept based on the definition of an anergy-exergy cost ratio is compared to the commonly applied approaches of considering either energy or exergy as the basis for economic evaluation. These three approaches to the economic analysis of thermal energy systems are applied to an exemplary heating system with thermal storages. It is shown that the results of the techno-economic analysis can be improved by giving anergy an economic value and that the proposed anergy-exergy cost ratio allows a flexible adaptation of the evaluation depending on the economic constraints of a system.
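The exergy/anergy split of a heat flow follows the standard Carnot factor, Ex = Q · (1 − T0/T), with ambient temperature T0 and supply temperature T in kelvin. The sketch below works through this split for an assumed domestic-heating case and then allocates a cost to both parts; the anergy-exergy cost ratio used here is an invented illustrative value, not the one derived in the paper.

```python
def split_heat(q_kwh, t_supply_k, t_ambient_k):
    """Split a heat flow Q into its exergy (Carnot factor) and anergy parts."""
    exergy = q_kwh * (1 - t_ambient_k / t_supply_k)
    anergy = q_kwh - exergy
    return exergy, anergy

def allocate_cost(total_cost, exergy, anergy, anergy_exergy_cost_ratio):
    """Allocate the total cost so that one unit of anergy is valued at
    anergy_exergy_cost_ratio times one unit of exergy (ratio = 0 recovers
    the pure-exergy approach; ratio = 1 the pure-energy approach)."""
    weight = exergy + anergy_exergy_cost_ratio * anergy
    unit_exergy_cost = total_cost / weight
    return (unit_exergy_cost * exergy,
            unit_exergy_cost * anergy_exergy_cost_ratio * anergy)

# 100 kWh of heat at 330 K supply against 283 K ambient:
ex, an = split_heat(q_kwh=100.0, t_supply_k=330.0, t_ambient_k=283.0)
print(round(ex, 1), round(an, 1))  # small exergy share, large anergy share

cost_ex, cost_an = allocate_cost(total_cost=10.0, exergy=ex, anergy=an,
                                 anergy_exergy_cost_ratio=0.1)
print(round(cost_ex, 2), round(cost_an, 2))
```

Note how the two limiting cases of the ratio reproduce the two conventional approaches the paper compares against, which is what makes the ratio a flexible evaluation knob.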
Parametric study of piezoresistive structures in continuous fiber reinforced additive manufacturing
(2024)
Recent advancements in fiber reinforced additive manufacturing leverage the piezoresistivity of continuous carbon fibers. This effect enables the fabrication of structural components with inherent piezoresistive properties suitable for load measurement or structural monitoring. These are achieved without necessitating additional manufacturing or assembly procedures. However, there remain unexplored variables within the domain of continuous fiber-reinforced additive manufacturing. Crucially, the roles of fiber curvature radii and sensing fiber bundle counts have yet to be comprehensively addressed. Additionally, the compression-sensitive nature of printed carbon fiber-reinforced specimens remains a largely unexplored research area. To address these gaps, this study presents experimental analyses on tensile and three-point flexural specimens incorporating sensing carbon fiber strands. All specimens were fabricated with three distinct curvature radii. For the tensile specimens, the number of layers was also varied. Sensing fiber bundles were embedded on both tensile and compression sides of the flexural specimens. Mechanical testing revealed a linear-elastic behavior in the specimens. It was observed that carbon fibers supported the majority of the load, leading to brittle fractures. The resistance measurements showed a dependence on both the number of sensing layers and the radius of curvature, and exhibited a slight decreasing trend in the cyclic tests. Compared with the sensors subjected to tensile stress, the sensors embedded on the compression side showed a lower gauge factor.
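The gauge factor reported for such sensing fibers follows the standard strain-gauge relation GF = (ΔR/R0)/ε. The short sketch below works through this relation; the resistance and strain values are invented for illustration and are not measurements from the study.

```python
def gauge_factor(r0_ohm, r_ohm, strain):
    """Standard strain-gauge relation: relative resistance change
    (R - R0)/R0 per unit mechanical strain (dimensionless)."""
    return ((r_ohm - r0_ohm) / r0_ohm) / strain

# e.g. an assumed fiber strand: 100 Ω at rest, 100.4 Ω at 0.2 % strain
print(gauge_factor(100.0, 100.4, 0.002))
```

A lower gauge factor on the compression side, as reported above, would correspond to a smaller relative resistance change for the same strain magnitude.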