Renewable energy production is one of the fastest-growing markets, and further strong growth can be anticipated due to the desire for increased sustainability in many parts of the world. With the rising adoption of renewable power production, such facilities become increasingly attractive targets for cyber attacks. At the same time, higher demands are placed on reliable production. In this paper we propose a concept that improves the monitoring of renewable power plants by detecting anomalous behavior. The system not only detects an anomaly, it also provides reasoning for the anomaly based on a specific mathematical model of the expected behavior, giving detailed information about the various influential factors causing the alert. The set of influential factors can be configured in the system before the normal behavior is learned. The concept is based on multidimensional analysis and has been implemented and successfully evaluated on actual data from different providers of wind power plants.
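The detection idea can be sketched in a few lines: learn a reference model of normal output per operating point (a simple binning stand-in for the multidimensional analysis mentioned in the abstract), flag large deviations, and report the configured influential factors alongside the alert. All names, the binning scheme, and the threshold below are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict
from statistics import mean

def learn_normal(records, factors):
    """Mean power output per combination of (rounded) factor values."""
    cube = defaultdict(list)
    for r in records:
        key = tuple(round(r[f]) for f in factors)
        cube[key].append(r["power"])
    return {k: mean(v) for k, v in cube.items()}

def explain_anomaly(model, obs, factors, threshold=0.3):
    """Return None if the observation is normal, else a reasoned alert."""
    key = tuple(round(obs[f]) for f in factors)
    expected = model.get(key)
    if expected is None:
        return "no reference data for this operating point"
    deviation = (obs["power"] - expected) / expected
    if abs(deviation) <= threshold:
        return None
    # report each configured factor's value so the operator sees context
    detail = ", ".join(f"{f}={obs[f]}" for f in factors)
    return f"deviation {deviation:+.0%} vs. expected {expected:.1f} kW ({detail})"
```

Training on historical records from a plant and calling `explain_anomaly` per new measurement yields both the flag and the contextual reasoning.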
Integrated Risk and Opportunity Management (IROM) goes far beyond what is found in organizations today. However, it offers the best opportunity not only to keep pace with the VUCA world, but to actually profit from it. Accordingly, the introduction of opportunity-based thinking in addition to risk-based thinking is part of the design specification for ISO 9000 and ISO 9001. The prerequisite for the successful design of an IROM is the individual definition, control and integration of risk and opportunity management processes, considering eight success factors, the "8 C". Top management benefits directly from the result: better, coordinated decision memos enable faster and more appropriate decisions.
Autonomous and integrated passenger and freight transport (APFIT) is a promising approach to tackle both traffic- and last-mile-related issues such as environmental emissions, social and spatial conflicts, or operational inefficiencies. By conducting an agent-based simulation, we shed light on this widely unexplored research topic and provide first indications regarding influential target figures of such a system in the rural area of Sarstedt, Germany. Our results show that larger fleets entail inefficiencies due to suboptimal utilization of monetary and material resources and increase traffic volume, while higher numbers of unused vehicles may exacerbate spatial conflicts. Nevertheless, to meet the given demand within our study area, a comparatively large fleet of about 25 vehicles is necessary to provide reliable service, assuming maximum passenger waiting times of six minutes, at the expense of higher standby times, rebalancing effort, and higher costs for vehicle acquisition and maintenance.
Pathologists need to identify abnormal changes in tissue. With advancing digitalization, tissue slides are stored digitally. This enables pathologists to annotate regions of interest with the support of software tools. PathoLearn is a web-based learning platform developed explicitly for the teacher-student scenario, in which students learn to identify potentially abnormal changes. Artificial intelligence (AI) and machine learning (ML) have become very important in medicine. Many health sectors already utilize AI and ML, and this will only increase in the future, also in the field of pathology. It is therefore important to teach students the fundamentals and concepts of AI and ML early in their studies. However, creating and training AI models generally requires knowledge of programming and technical details. This thesis evaluates how this barrier can be overcome by comparing existing end-to-end AI platforms and teaching tools for AI. It was shown that a visual programming editor offers a fitting abstraction for creating neural networks without programming. This was extended with real-time collaboration to enable students to work in groups. Additionally, an automatic training feature was implemented, removing the need to know technical details about training neural networks.
On November 30th, 2022, OpenAI released the large language model ChatGPT, an extension of GPT-3. The AI chatbot provides real-time communication in response to users’ requests. The quality of ChatGPT’s natural-sounding answers marks a major shift in how we will use AI-generated information in our day-to-day lives. For a software engineering student, the use cases for ChatGPT are manifold: assessment preparation, translation, and creation of specified source code, to name a few. It can even handle more complex aspects of scientific writing, such as summarizing literature and paraphrasing text. Hence, this position paper addresses the need to discuss potential approaches for integrating ChatGPT into higher education. We focus on articles that address the effects of ChatGPT on higher education in the areas of software engineering and scientific writing. As ChatGPT was only recently released, there have been no peer-reviewed articles on the subject. Thus, we performed a structured grey literature review using Google Scholar to identify preprints of primary studies. In total, five out of 55 preprints are used for our analysis. Furthermore, we held informal discussions and talks with other lecturers and researchers and took into account the authors’ test results from using ChatGPT. We present five challenges and three opportunities for the higher education context that emerge from the release of ChatGPT. The main contribution of this paper is a proposal for how to integrate ChatGPT into higher education in four main areas.
We present an approach towards a data acquisition system for digital twins that uses a 5G network for data transmission and localization. The current hardware setup, which utilizes stereo vision and LiDAR for 3D mapping, is explained together with two recorded point cloud data sets. Furthermore, a resulting digital twin composed of voxelized point cloud data is shown. Ideas for future applications and challenges regarding the system are discussed, and an outlook on further development is given.
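Voxelizing a point cloud, as done for the digital twin above, can be illustrated by snapping points to a regular grid and keeping one representative per occupied cell. This is a hedged sketch of the general technique; the actual pipeline is not described at this level of detail in the abstract.

```python
def voxelize(points, voxel_size):
    """Downsample a 3D point cloud to the centres of occupied voxels.

    points: iterable of (x, y, z) tuples; voxel_size: edge length of a cell.
    """
    # map each point to the integer index of its containing voxel
    occupied = {tuple(int(c // voxel_size) for c in p) for p in points}
    # represent every occupied voxel by its centre point
    return [tuple((i + 0.5) * voxel_size for i in idx) for idx in occupied]
```

Points falling into the same cell collapse to a single centre, which bounds the twin's memory footprint regardless of raw scan density.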
In this paper we describe methods to approximate functions and differential operators on adaptive sparse (dyadic) grids. We distinguish between several representations of a function on the sparse grid and we describe how finite difference (FD) operators can be applied to these representations. For general variable coefficient equations on sparse grids, genuine finite element (FE) discretizations are not feasible, and FD operators allow an easier operator evaluation than the adapted FE operators. However, the structure of the FD operators is complex. With the aim of constructing an efficient multigrid procedure, we analyze the structure of the discrete Laplacian in its hierarchical representation and show the relation between the full and the sparse grid case. The rather complex relations, which are expressed by scaling matrices for each separate coordinate direction, make us doubt the possibility of constructing efficient preconditioners that show spectral equivalence. Hence, we question the possibility of constructing a natural multigrid algorithm with optimal O(N) efficiency. We conjecture that for the efficient solution of a general class of adaptive grid problems it is better to accept an additional condition for the dyadic grids (condition L) and to apply adaptive hp-discretization.
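As a textbook illustration of the hierarchical representation discussed above (not the paper's sparse-grid implementation): on a 1D dyadic grid of [0, 1], the hierarchical surplus at each odd node is the difference between the function value there and the linear interpolant of its two hierarchical parents.

```python
def hierarchize_1d(f, max_level):
    """Hierarchical surpluses of f on the dyadic grid of [0, 1].

    Level 0 stores the boundary values f(0) and f(1); on level l >= 1 the
    surplus at an odd node x = i / 2**l is f(x) minus the mean of the two
    hierarchical parent values at x - h and x + h, where h = 1 / 2**l.
    """
    surpluses = {(0, 0): f(0.0), (0, 1): f(1.0)}
    for level in range(1, max_level + 1):
        h = 1.0 / 2 ** level
        for i in range(1, 2 ** level, 2):  # odd indices only
            x = i * h
            parent_mean = (f(x - h) + f(x + h)) / 2.0
            surpluses[(level, i)] = f(x) - parent_mean
    return surpluses
```

For smooth functions the surpluses decay like h squared per level, which is exactly what makes truncating small contributions (the sparse/adaptive grid idea) viable.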
The paper presents a comprehensive model of a banking system that integrates network effects, bankruptcy costs, fire sales, and cross-holdings. For the integrated financial market we prove the existence of a price-payment equilibrium and design an algorithm for the computation of the greatest and the least equilibrium. The number of defaults corresponding to the greatest price-payment equilibrium is analyzed in several comparative case studies. These illustrate the individual and joint impact of interbank liabilities, bankruptcy costs, fire sales and cross-holdings on systemic risk. We study policy implications and regulatory instruments, including central bank guarantees and quantitative easing, the significance of last wills of financial institutions, and capital requirements.
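For intuition, the fixed-point computation of a greatest clearing payment vector can be sketched in the classical Eisenberg-Noe setting, i.e., without the paper's extensions for bankruptcy costs, fire sales, and cross-holdings. Starting from full nominal payments, the iteration converges monotonically downward to the greatest equilibrium.

```python
def greatest_clearing_vector(liabilities, external_assets, tol=1e-10):
    """Greatest clearing payment vector of an interbank network
    (Eisenberg-Noe style sketch, without the paper's extensions).

    liabilities[i][j] = nominal amount bank i owes bank j.
    """
    n = len(external_assets)
    p_bar = [sum(row) for row in liabilities]  # total nominal debt per bank
    # relative liability matrix: share of i's payments going to j
    pi = [[row[j] / p_bar[i] if p_bar[i] else 0.0 for j in range(n)]
          for i, row in enumerate(liabilities)]
    p = p_bar[:]  # start from full payment (greatest candidate)
    while True:
        # bank i's assets under current payments: external + interbank inflows
        assets = [external_assets[i] + sum(pi[k][i] * p[k] for k in range(n))
                  for i in range(n)]
        p_new = [min(p_bar[i], assets[i]) for i in range(n)]
        if max(abs(p_new[i] - p[i]) for i in range(n)) < tol:
            return p_new
        p = p_new
```

Counting the banks with final payment below their nominal debt gives the number of defaults analyzed in the paper's comparative case studies.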
Background:
Many patients with cardiovascular disease also show a high comorbidity of mental disorders, especially anxiety and depression. This is, in turn, associated with a decrease in quality of life. Psychocardiological treatment options are currently limited. Hence, there is a need for novel and accessible psychological help. Recently, we demonstrated that a brief face-to-face intervention based on metacognitive therapy (MCT) is promising in treating anxiety and depression. Here, we aim to translate the face-to-face approach into a digital application and explore the feasibility of this approach.
Methods:
We translated a validated brief psychocardiological intervention into a novel non-blended web app. The data of 18 patients suffering from various cardiac conditions but without a diagnosed mental illness were analyzed after they used the web app over a two-week period in a feasibility trial. The aim was to assess whether a non-blended web-app-based MCT approach is feasible for patients with cardiovascular disease.
Results:
Overall, patients were able to use the web app and rated it as satisfactory and beneficial. In addition, there was a first indication that using the app improved the cardiac patients’ subjectively perceived health and reduced their anxiety. The approach therefore seems feasible for a future randomized controlled trial.
Conclusion:
Applying a metacognitive-based brief intervention via a non-blended web app seems to show good acceptance and feasibility in a small target group of patients with CVD. Future studies should further develop, improve, and validate digital psychotherapy approaches, especially in patient groups with a lack of access to standard psychotherapeutic care.
In recent years, generative models have gained large public attention due to the high quality of their generated images. In short, generative models learn a distribution from a finite number of samples and are then able to generate arbitrarily many further samples. This can be applied to image data. In the past, generative models were not able to generate realistic images, but nowadays the results are almost indistinguishable from real images.
This work provides a comparative study of three generative models: Variational Autoencoder (VAE), Generative Adversarial Network (GAN), and Diffusion Models (DM). The goal is not to provide a definitive ranking indicating which of them is the best, but to decide qualitatively, and where possible quantitatively, which model is good with respect to a given criterion. Such criteria include realism, generalization and diversity, sampling, training difficulty, parameter efficiency, interpolation and inpainting capabilities, semantic editing, as well as implementation difficulty. After a brief introduction of how each model works internally, they are compared against each other. The provided images help to see the differences among the models with respect to each criterion.
To give a short outlook on the results of the comparison: DMs generate the most realistic images. They seem to generalize best and show high variation among the generated images. However, they are based on an iterative process, which makes them the slowest of the three models in terms of sample generation time. GANs and VAEs, on the other hand, generate their samples in a single forward pass. The images generated by GANs are comparable to those of the DM, while the images from VAEs are blurry, which makes them less desirable in comparison to GANs or DMs. However, both the VAE and the GAN stand out from the DMs with respect to interpolation and semantic editing, as they have a latent space, which makes latent-space walks possible, and the changes are not as chaotic as with DMs. Furthermore, concept vectors can be found that transform a given image along a given feature while leaving other features and structures mostly unchanged, which is difficult to achieve with DMs.
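The latent-space operations discussed in the comparison (interpolation and concept vectors) reduce to simple vector arithmetic on latent codes; a decoder then maps each code to an image. The sketch below shows only the latent arithmetic, with plain Python lists standing in for tensors; the function names are illustrative.

```python
def lerp(z1, z2, t):
    """Linear interpolation between two latent codes, t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(z1, z2)]

def apply_concept(z, direction, strength):
    """Shift a latent code along a concept vector (e.g. an assumed
    'add smile' direction found by comparing labeled samples)."""
    return [a + strength * d for a, d in zip(z, direction)]
```

Decoding `lerp(z1, z2, t)` for increasing `t` yields the smooth image transitions that VAEs and GANs support natively, and which are harder to obtain with DMs.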
There are many aspects of code quality, some of which are difficult to capture or to measure. Despite the importance of software quality, there is a lack of commonly accepted measures or indicators for code quality that can be linked to quality attributes. We investigate software developers’ perceptions of source code quality and the practices they recommend to achieve these qualities. We analyze data from semi-structured interviews with 34 professional software developers, programming teachers and students from Europe and the U.S. For the interviews, participants were asked to bring code examples to exemplify what they consider good and bad code, respectively. Readability and structure were used most commonly as defining properties for quality code. Together with documentation, they were also suggested as the most common target properties for quality improvement. When discussing actual code, developers focused on structure, comprehensibility and readability as quality properties. When analyzing relationships between properties, the most commonly talked about target property was comprehensibility. Documentation, structure and readability were named most frequently as source properties to achieve good comprehensibility. Some of the most important source code properties contributing to code quality as perceived by developers lack clear definitions and are difficult to capture. More research is therefore necessary to measure the structure, comprehensibility and readability of code in ways that matter for developers and to relate these measures of code structure, comprehensibility and readability to common software quality attributes.
The digital transformation, with its new technologies and customer expectations, has a significant effect on the customer channels in the insurance industry. The objective of this study is the identification of enabling and hindering factors for the adoption of online claim notification services, which are an important part of the customer experience in insurance. For this purpose, we conducted a quantitative cross-sectional survey based on the exemplary scenario of car insurance in Germany and analyzed the data via structural equation modeling (SEM). The findings show that, besides classical technology acceptance factors such as perceived usefulness and ease of use, digital mindset and status quo behavior play a role: acceptance of digital innovations, lacking endurance as well as lacking frustration tolerance with the status quo lead to a higher intention to use. Moreover, the results are strongly moderated by the severity of the damage event, an insurance-specific factor that has been sparsely considered so far. The latter discovery implies that customers prefer a communication channel choice based on the individual circumstances of the claim.
During the Corona pandemic, information traditionally used for corporate credit risk analysis (e.g., from the analysis of balance sheets and payment behavior) became less valuable because it represents only past circumstances. Therefore, the use of currently published data from social media platforms, which has been shown to contain valuable information regarding the financial stability of companies, should be evaluated. This data may contain, for example, additional information from disappointed employees or customers. In order to analyze to what extent this data can improve the information base for corporate credit risk assessment, Twitter data regarding the ten largest insolvencies of German companies in 2020 and solvent counterparts is analyzed in this paper. The results from t-tests show that sentiment before the insolvencies is significantly worse than in the comparison group, which is in alignment with previously conducted research. Furthermore, companies can be classified as prospectively solvent or insolvent with up to 70% accuracy by applying the k-nearest-neighbor algorithm to monthly aggregated sentiment scores. No significant differences in the number of Tweets for the two groups can be shown, which is in contrast to findings from studies conducted before the Corona pandemic. The results can be utilized by practitioners and scientists to improve decision support systems in the domain of corporate credit risk analysis. From a scientific point of view, the results show that the information asymmetry between lenders and borrowers in credit relationships, which are principals and agents according to principal-agent theory, can be reduced based on user-generated content from social media platforms. In future studies, it should be evaluated to what extent this data can be integrated into established processes for credit decision making. Furthermore, additional social media platforms as well as further samples of companies should be analyzed. Lastly, the authenticity of user-generated content should be taken into account in order to ensure that credit decisions rely on truthful information only.
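The classification step described above can be illustrated with a minimal k-nearest-neighbor classifier over monthly aggregated sentiment-score vectors. The data, distance metric, and k below are illustrative assumptions; the paper's exact setup is not reproduced here.

```python
def knn_predict(train, query, k=3):
    """Classify a company's monthly sentiment-score vector by the
    majority label among its k nearest (Euclidean) neighbours.

    train: list of (score_vector, label) pairs, e.g. label in
    {"solvent", "insolvent"}; query: a score vector of equal length.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)  # majority vote
```

With separable sentiment profiles (consistently negative before insolvency), even this tiny model separates the two groups, mirroring the paper's up-to-70%-accuracy finding in spirit.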
In this paper we describe the selection of a modern build automation tool for an industry research partner of ours, an insurance company. Build automation has become increasingly important over the years; today it is one of the central concepts in topics such as cloud-native development based on microservices and DevOps. Since more and more products for build automation have entered the market and existing tools have changed their functional scope, there is nowadays a large number of tools that differ greatly in functionality. Based on requirements from our partner company, a build server analysis was conducted. This paper presents our analysis requirements, a detailed look at one of the examined tools, and a summary of our comparison of the three tools from our final comparison round.
Music streaming platforms offer music listeners an overwhelming choice of music. Users of streaming platforms therefore need the support of music recommendation systems to find music that suits their personal taste. Currently, a new class of recommender systems based on knowledge graph embeddings promises to improve the quality of recommendations, in particular to provide diverse and novel recommendations. This paper investigates how knowledge graph embeddings can improve music recommendations. First, it is shown how a collaborative knowledge graph can be derived from open music data sources. Based on this knowledge graph, the music recommender system EARS (knowledge graph Embedding-based Artist Recommender System) is presented in detail, with particular emphasis on recommendation diversity and explainability. Finally, a comprehensive evaluation with real-world data is conducted, comparing different embeddings and investigating the influence of different types of knowledge.
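At its core, an embedding-based recommender ranks candidates by similarity in the embedding space. As a hedged stand-in (EARS' actual scoring function is not specified in the abstract), the sketch below ranks artists by cosine similarity of their knowledge-graph embedding vectors.

```python
def cosine(u, v):
    """Cosine similarity of two embedding vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den if den else 0.0

def recommend(embeddings, liked_artist, top_n=3):
    """Rank all other artists by similarity to a liked artist's embedding.

    embeddings: dict mapping artist name -> embedding vector (illustrative).
    """
    anchor = embeddings[liked_artist]
    scored = [(name, cosine(vec, anchor))
              for name, vec in embeddings.items() if name != liked_artist]
    return sorted(scored, key=lambda item: -item[1])[:top_n]
```

Because the embeddings are trained on graph structure (genres, collaborations, influences), nearby vectors can surface novel artists rather than only co-listened ones, which is the diversity argument made above.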
The transfer of historically grown monolithic software architectures into modern service-oriented architectures creates many loose coupling points. This can lead to unforeseen system behavior and can significantly impede such continuous modernization processes, since it is not clear where bottlenecks in a system arise. It is therefore necessary to accompany such modernization processes with an adaptive monitoring concept in order to correctly record and interpret unpredictable system dynamics. This contribution presents a generic QoS measurement framework for service-based systems. The framework consists of an XML-based specification for the measurement to be performed – the Information Model (IM) – and the QoS System, which provides an execution platform for the IM. The framework will be applied to a standard business process of the German insurance industry, and the concepts of the IM and their mapping to artifacts of the QoS System will be presented. Furthermore, the design and implementation of the QoS System's parser and generator modules and the generated artifacts are explained in detail, e.g., the event model, agents, measurement module, and analyzer module.
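The parser/generator pipeline described in this abstract, reading an XML Information Model and deriving measurement artifacts from it, can be illustrated with a toy example. The element names, attributes, and the `ClaimService` example below are invented for illustration; the actual IM schema and the QoS System's artifact model are defined in the paper, not reproduced here.

```python
import xml.etree.ElementTree as ET

# Minimal, hypothetical Information Model fragment (not the real IM schema).
im_xml = """
<informationModel>
  <measurement name="claimProcessLatency">
    <metric type="responseTime" unit="ms"/>
    <target service="ClaimService" operation="submitClaim"/>
    <analyzer threshold="500"/>
  </measurement>
</informationModel>
"""

def parse_im(xml_text):
    """Map each <measurement> element to a spec dict that agent,
    measurement, and analyzer modules could be generated from."""
    root = ET.fromstring(xml_text)
    specs = []
    for m in root.iter("measurement"):
        specs.append({
            "name": m.get("name"),
            "metric": m.find("metric").get("type"),
            "unit": m.find("metric").get("unit"),
            "service": m.find("target").get("service"),
            "operation": m.find("target").get("operation"),
            "threshold_ms": int(m.find("analyzer").get("threshold")),
        })
    return specs
```

Keeping the measurement definition declarative in XML is what makes the monitoring adaptive: changing what is measured means editing the IM, not the generated agents.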
Even for the more traditional insurance industry, the Microservices Architecture (MSA) style plays an increasingly important role in provisioning insurance services. However, insurance businesses must operate legacy applications, enterprise software, and service-based applications in parallel for a more extended transition period. The ultimate goal of our ongoing research is to design a microservice reference architecture in cooperation with our industry partners from the insurance domain that provides an approach for the integration of applications from different architecture paradigms. In Germany, individual insurance services are classified as part of the critical infrastructure. Therefore, German insurance companies must comply with the Federal Office for Information Security requirements, which the Federal Supervisory Authority enforces. Additionally, insurance companies must comply with relevant laws, regulations, and standards as part of the business's compliance requirements. Note: Since Germany is seen as relatively 'tough' with respect to privacy and security demands, fulfilling those demands might well be suitable (if not 'over-achieving') for insurers in other countries as well. The question thus arises of how insurance services can be secured in an application landscape shaped by the MSA style to comply with the architectural and security requirements depicted above. This article highlights the specific regulations, laws, and standards the insurance industry must comply with. We present initial architectural patterns to address authentication and authorization in an MSA tailored to the requirements of our insurance industry partners.
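One common building block of authentication patterns in an MSA is that every service verifies a signed token from a central identity provider before acting on a request. The sketch below shows such a check for an HS256-signed JWT using only standard-library primitives; this is an assumed, generic pattern for illustration, not one of the article's specific patterns, and a shared-secret HMAC scheme is itself a simplification compared to the asymmetric signatures typically used between services.

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(s):
    """Decode base64url with the padding JWTs strip off."""
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def verify_jwt(token, secret):
    """Verify an HS256 JWT signature; return its claims, or None if invalid.

    Illustrative only: a production service would also check `exp`,
    `aud`, and the token's scopes/roles for authorization decisions.
    """
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(secret, f"{header_b64}.{payload_b64}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        return None
    return json.loads(b64url_decode(payload_b64))
```

Centralizing token issuance while verifying at every service boundary is what lets legacy and microservice components enforce the same authentication policy during the transition period the abstract describes.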
Cloud computing has become well established in private and public sector projects over the past few years, opening ever new opportunities for research and development, but also for education. One of these opportunities presents itself in the form of dynamically deployable virtual lab environments, granting educational institutions increased flexibility in the allocation of their computing resources. These fully sandboxed labs provide students with their own internal network and full access to all machines within, giving them the flexibility necessary to gather hands-on experience with building heterogeneous microservice architectures. The eduDScloud provides a private cloud infrastructure to which labs like the microservice lab outlined in this paper can be flexibly deployed at a moment's notice.
In this paper the workflow of the project 'Untersuchungs-, Simulations- und Evaluationstool für Urbane Logistik' (USEfUL) is presented. Aiming to create a web-based decision support tool for urban logistics, the project needed to integrate multiple steps into a single workflow, which in turn needed to be executed multiple times. While a fully service-oriented system could not be created, the principles of service orientation were utilized to increase workflow efficiency and flexibility, allowing the workflow to be easily adapted to new concepts or research areas.