The network security framework VisITMeta allows the visual evaluation and management of security event detection policies. By means of a "what-if" simulation, the sensitivity of policies to specific events can be tested and adjusted. This paper presents the results of a user study testing the usability of the approach, measuring both the correct completion of given tasks and user satisfaction via the System Usability Scale.
Introduction:
Human Immunodeficiency Virus (HIV) infection remains a prevalent co-morbidity among fracture patients. Few studies have investigated the role of exercise interventions in preventing bone demineralization in people who have fractures and HIV. With exposure to exercise, HIV-infected individuals may experience improved bone mineral density (BMD), function, and quality of life (QoL). The study aims to assess the impact of home-based exercise on bone mineral density, functional capacity, QoL, and selected serological markers of health in HIV infection among Nigerians and South Africans.
Methods and design:
The study is an assessor-blinded randomized controlled trial. Patients managed with internal and external fixation for femoral shaft fracture at the study sites will be recruited to participate in the study. The participants will be recruited 2 weeks post-discharge at the follow-up clinic with the orthopaedic surgeon. The study population will consist of all persons with femoral fracture, both HIV-positive (medically confirmed) and HIV-negative, aged 18 to 60 years attending the above-named health facilities. For the HIV-positive participants, a documented positive HIV result as well as a history of follow-up at the HIV treatment and care center will be required. A purpose-developed home-based exercise programme will be implemented in the experimental group, while the control group continues with the usual rehabilitation programme. The primary outcome measures will be function, gait, bone mineral density, physical activity, and QoL.
Discussion:
The proposed trial will compare the effect of a home-based physical exercise-training programme in the management of femoral fracture to the usual physiotherapy management programmes with specific outcomes of bone mineral density, function, and inflammatory markers.
The following study examines the effects of the Corona pandemic and one of its consequences: reduced CO2 emissions in Germany. For a clear comparison, several scenarios are considered of how CO2 emissions in Germany might have developed in 2020 without the pandemic. The baseline for these scenarios is the CO2 trend without the Corona pandemic in 2020, in which Germany would have missed its emissions target of -40 % relative to 1990, achieving a reduction of only 37 %. The second scenario shows how the current Corona pandemic, with its various lockdown phases, affects CO2 emissions. It becomes clear that, despite the current measures, a longer lockdown phase would be needed to reach the -40 % climate target in a lasting way. Scenarios 3 and 4 focus on possible courses of action and behaviour after the pandemic. Scenario 3 considers the consequences of economic stimulus measures after the lockdown and the accompanying rise in CO2 emissions. Scenario 4, "CO2 development considering the 2020 Corona pandemic and possible positive environmental developments from the lockdown", explains how much additional CO2 could be saved through sustainable, climate-oriented behaviour.
Factors such as population growth, changing production factors, and environmental influences were neglected. The study shows that the opportunities created by the economic disruptions and the behavioural changes brought about by the Corona pandemic and its consequences can have a significant influence on the CO2 emissions of the Federal Republic of Germany.
Improving the graphitic structure in carbon nanofibers (CNFs) is important for exploiting their potential in mechanical, electrical and electrochemical applications. Typically, the synthesis of carbon fibers with a highly graphitized structure demands temperatures of almost 2500 °C. Furthermore, to achieve an improved graphitic structure, the stabilization of the precursor fiber has to be assisted by tension in order to enhance molecular orientation. With this in view, we report here on the fabrication of graphene nanoplatelet (GNP)-doped carbon nanofibers by electrospinning followed by oxidative stabilization and carbonization. The effect of GNP doping on the graphitic structure was investigated by carbonizing the fibers at various temperatures (1000 °C, 1200 °C, 1500 °C and 1700 °C). Additionally, stabilization was performed with and without constant creep stress (only shrinkage stress) for both pristine and doped precursor nanofibers, which were eventually carbonized at 1700 °C. Our findings reveal that GNP doping improves the graphitic structure of polyacrylonitrile (PAN). Further, in addition to their templating effect during the nucleation and growth of graphitic crystals, the GNPs encapsulated in the PAN nanofiber matrix act in situ as micro-clamps, anchoring the fibers and preventing the loss of molecular orientation during the stabilization stage when no external tension is applied to the nanofiber mats. The templating effect on the entire graphitization process is reflected in an increased electrical conductivity along the fibers. At the same time, the electrical anisotropy is reduced, i.e., the GNPs provide effective pathways with improved conductivity, acting like bridges between the nanofibers and resulting in improved conductivity across the fiber direction compared to the pristine PAN system.
The reactivity of graphene at its boundary region has been imaged using non-linear spectroscopy to address the controversy over whether the terraces of graphene or its edges are more reactive. Graphene was functionalised with phenyl groups, and we subsequently scanned our vibrational sum-frequency generation setup from the functionalised graphene terraces across the edges. A greater phenyl signal is clearly observed at the edges, showing evidence of increased reactivity in the boundary region. We estimate an upper limit of 1 mm for the width of the CVD graphene boundary region.
Digital data on tangible and intangible cultural assets is an essential part of daily life, communication and experience. It has a lasting influence on the perception of cultural identity as well as on the interactions between research, the cultural economy and society. Throughout the last three decades, many cultural heritage institutions have contributed a wealth of digital representations of cultural assets (2D digital reproductions of paintings, sheet music, 3D digital models of sculptures, monuments, rooms, buildings), audio-visual data (music, film, stage performances), and procedural research data such as encoding and annotation formats. The long-term preservation and FAIR availability of research data from the cultural heritage domain is fundamentally important, not only for future academic success in the humanities but also for the cultural identity of individuals and society as a whole. Up to now, no coordinated effort for professional research data management on a national level exists in Germany. NFDI4Culture aims to fill this gap and create a user-centered, research-driven infrastructure that will cover a broad range of research domains from musicology, art history and architecture to performance, theatre, film, and media studies.
The research landscape addressed by the consortium is characterized by strong institutional differentiation. Research units in the consortium's community of interest comprise university institutes, art colleges, academies, galleries, libraries, archives and museums. This diverse landscape is also characterized by an abundance of research objects and methodologies and a great potential for data-driven research. In a unique effort carried out by the applicant and co-applicants of this proposal and ten academic societies, this community is interconnected for the first time through a federated approach that is ideally suited to the needs of the participating researchers. To promote collaboration within the NFDI, to share knowledge and technology, and to provide extensive support for its users have been the guiding principles of the consortium from the beginning and will be at the heart of all workflows and decision-making processes. Thanks to these principles, NFDI4Culture has gathered strong support ranging from individual researchers to high-level cultural heritage organizations such as UNESCO, the International Council of Museums, the Open Knowledge Foundation and Wikimedia. On this basis, NFDI4Culture will take innovative measures that promote a cultural change towards a more reflective and sustainable handling of research data and at the same time boost qualification and professionalization in data-driven research in the domain of cultural heritage. This will create a long-lasting impact on science, the cultural economy and society as a whole.
Comparison of native app development and cross-platform development (Facebook React Native and Google Flutter)
(2020)
Developing mobile applications for iOS and Android usually involves considerable effort, since different source code has to be written for each platform. Cross-platform frameworks such as Facebook's React Native or Google's Flutter address this problem: they allow apps for both platforms to be developed from a single codebase. A critical point, and a frequently used argument against cross-platform development, is the hardware proximity of native applications, which the frameworks supposedly lack. But what is the state of affairs in 2020? Can cross-platform frameworks now access hardware components easily and with good performance, making laborious native development for iOS and Android obsolete, especially with regard to larger enterprise software?
This thesis pursues this question and examines in general how viable cross-platform development is. After reading this bachelor's thesis, the reader should be able to decide whether cross-platform frameworks are suitable for their particular use case. To answer the research question, two applications each were developed for iOS and Android as part of a case study, in order to assess how helpful the aforementioned frameworks are. The focus of the thesis is thus on the quality and current state of cross-platform development, especially with regard to the use of hardware components and operating-system-specific services (Bluetooth, camera, etc.).
The results of the case study show that the extent to which cross-platform frameworks can be used always depends on the context and complexity of the application to be built. In simple use cases, the frameworks can usually lead to considerable cost and time savings, whereas with more complex applications their limits, and strong dependencies, are reached relatively quickly.
With increasing complexity and scale, sufficient evaluation of Information Systems (IS) becomes a challenging and difficult task. Simulation modeling has proven to be a suitable and efficient methodology for evaluating IS and IS artifacts, provided it meets certain quality demands. However, existing research on simulation modeling quality focuses solely on quality in terms of accuracy and credibility, disregarding the role of additional quality aspects. Therefore, this paper proposes two design artifacts in order to ensure a holistic view of simulation quality. First, associated literature is reviewed in order to extract relevant quality factors in the context of simulation modeling, which can be used to evaluate the overall quality of a simulated solution before, during or after a given project. Second, the deduced quality factors are integrated into a quality assessment framework to provide structural guidance on the quality assessment procedure for simulation. In line with a Design Science Research (DSR) approach, we demonstrate the eligibility of both design artifacts by means of prototyping as well as an example case. Moreover, the assessment framework is evaluated and iteratively adjusted with the help of expert feedback.
The objective of this study is to analyze noise patterns during 599 visceral surgical procedures. Considering work-safety regulations, we identify immanent noise patterns during major visceral surgeries. Increased levels of noise are known to have negative health impacts. Based on very fine-grained data collection over a year, this study introduces a new procedure for the visual representation of intra-surgery noise progression and paves new paths for future research on noise reduction in visceral surgery. Digital decibel sound-level meters were used to record the total noise in three operating theatres in one-second cycles over a year. These data were matched to archival data on surgery characteristics. Because surgeries inherently vary in length, we developed a new procedure to normalize surgery times in order to run cross-surgery comparisons. Based on this procedure, dBA values were adjusted to each normalized time point. Noise-level patterns are presented for surgeries contingent on important surgery characteristics: 16 different surgery types, operation method, day/night time point and operation complexity (complexity levels 1–3). This serves to cover a wide spectrum of day-to-day surgeries. The noise patterns reveal significant sound level differences of about 1 dBA, with the most common noise level spread between 55 and 60 dBA. This indicates a sound situation in many of the surgeries studied that is likely to cause stress in patients and staff. Absolute and relative risks of meeting or exceeding 60 dBA differ considerably across operation types. In conclusion, the study reveals that maximum noise levels of 55 dBA are frequently exceeded during visceral surgical procedures. Complex surgeries in particular show, on average, a higher noise exposure. Our findings warrant active noise management for visceral surgery to reduce potential negative impacts of noise on surgical performance and outcome.
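The time-normalization step described above can be illustrated with a small sketch. This is not the authors' actual procedure, only a plausible reconstruction under stated assumptions: each surgery's per-second dBA series is resampled by linear interpolation onto a common 0–100 % timeline, so that surgeries of different lengths can be compared point by point (the function name and parameters are hypothetical).

```python
def normalize_timeline(dba_values, n_points=101):
    """Resample a per-second dBA series onto a normalized 0-100% timeline
    (n_points samples) so surgeries of different lengths can be compared
    point by point. Uses linear interpolation between one-second samples."""
    if len(dba_values) < 2:
        raise ValueError("need at least two samples")
    result = []
    for i in range(n_points):
        # position of this normalized point in the original series
        pos = i * (len(dba_values) - 1) / (n_points - 1)
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, len(dba_values) - 1)
        # linear interpolation between neighbouring samples
        result.append(dba_values[lo] * (1 - frac) + dba_values[hi] * frac)
    return result

print(normalize_timeline([50.0, 60.0], n_points=5))
# → [50.0, 52.5, 55.0, 57.5, 60.0]
```

A real analysis would apply this to each surgery's recording and then average or plot the resampled profiles across surgeries of the same type.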
The present research study investigated the susceptibility of common mastitis pathogens—obtained from clinical mastitis cases on 58 Northern German dairy farms—to routinely used antimicrobials. The broth microdilution method was used for detecting the Minimal Inhibitory Concentration (MIC) of Streptococcus agalactiae (n = 51), Streptococcus dysgalactiae (n = 54), Streptococcus uberis (n = 50), Staphylococcus aureus (n = 85), non-aureus staphylococci (n = 88), Escherichia coli (n = 54) and Klebsiella species (n = 52). Streptococci and staphylococci were tested against cefquinome, cefoperazone, cephapirin, penicillin, oxacillin, cloxacillin, amoxicillin/clavulanic acid and cefalexin/kanamycin. Besides cefquinome and amoxicillin/clavulanic acid, Gram-negative pathogens were examined for their susceptibility to marbofloxacin and sulfamethoxazole/trimethoprim. The examined S. dysgalactiae isolates exhibited the comparatively lowest MICs. S. uberis and S. agalactiae were inhibited at higher amoxicillin/clavulanic acid and cephapirin concentration levels, whereas S. uberis isolates additionally exhibited elevated cefquinome MICs. Most Gram-positive mastitis pathogens were inhibited at higher cloxacillin than oxacillin concentrations. The MICs of Gram-negative pathogens were higher than previously reported, whereby 7.4%, 5.6% and 11.1% of E. coli isolates had MICs above the highest concentrations tested for cefquinome, marbofloxacin and sulfamethoxazole/trimethoprim, respectively. Individual isolates showed MICs at comparatively higher concentrations, leading to the hypothesis that a certain amount of mastitis pathogens on German dairy farms might be resistant to frequently used antimicrobials.
This paper presents a fundamental investigation of the crack propagation rate (CPR) and stress intensity factor (SIF) for typical fatigue and welded specimens: Compact Tension (CT) and Single Edge Notch Tension (SENT) specimens as well as butt and longitudinal T-joints. Material data for austenitic stainless steel SS316L were used to observe the crack propagation rate for different initial crack lengths and tensile loads in the fracture mechanics investigation. The specimen geometry was modelled using the open-source software CASCA, while Franc 2D was used for post-processing based on the Paris-Erdogan law with different crack increment steps. The analysis of crack propagation using fracture mechanics techniques requires an accurate calculation of the stress intensity factor (SIF), and comparison with the critical strength of the material (KIC) was used to determine the critical crack length of the specimens. It can be concluded that open-source finite element method software can be used for predicting the fatigue life of simplified geometries.
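The Paris-Erdogan law mentioned above relates crack growth per load cycle to the stress intensity factor range, da/dN = C·(ΔK)^m. A minimal numerical sketch of a fatigue-life estimate based on it might look as follows; the material constants in the example are purely illustrative, not measured SS316L values, and the helper name is hypothetical.

```python
import math

def fatigue_life(a0, ac, delta_sigma, C, m, Y=1.0, da=1e-5):
    """Integrate the Paris-Erdogan law da/dN = C * (dK)^m numerically,
    stepping the crack length a from a0 to the critical length ac.
    dK = Y * delta_sigma * sqrt(pi * a), with a in metres and
    delta_sigma in MPa; returns the number of cycles to reach ac."""
    a, cycles = a0, 0.0
    while a < ac:
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)  # dN = da / (C * dK^m)
        a += da
    return cycles

# illustrative constants only (NOT measured SS316L data):
# 1 mm initial crack, 10 mm critical length, 100 MPa stress range
print(fatigue_life(a0=1e-3, ac=1e-2, delta_sigma=100.0, C=1e-12, m=3.0))
```

As expected from the law, a larger initial crack length leaves fewer remaining cycles, which mirrors the paper's variation of initial crack length.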
The concreteness of words has been measured and used in psycholinguistics for decades. Recently, it has also been used in retrieval and NLP tasks. For English, a number of well-known datasets with average values for perceived concreteness have been established.
We give an overview of the available datasets for German and their correlation, and we evaluate prediction algorithms for the concreteness of German words. We show that these algorithms achieve results similar to those for English datasets. Moreover, we show that for all datasets there are no significant differences between a prediction model based on regression using word embeddings as features and a prediction algorithm based on word similarity according to the same embeddings.
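The similarity-based prediction that the authors compare against a regression model can be sketched as follows: the concreteness of an unrated word is estimated as the similarity-weighted average of the ratings of its nearest neighbours in embedding space. The toy 2-d "embeddings" and ratings below are invented for illustration; real experiments would use pretrained German word embeddings and a rated concreteness lexicon.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def predict_concreteness(word_vec, rated, k=2):
    """Similarity-based prediction: average the concreteness ratings of the
    k most similar rated words, weighted by cosine similarity."""
    sims = sorted(((cosine(word_vec, v), c) for v, c in rated), reverse=True)[:k]
    total = sum(s for s, _ in sims)
    return sum(s * c for s, c in sims) / total

# toy 2-d "embeddings" with hypothetical human concreteness ratings (1-5 scale)
rated = [((1.0, 0.1), 4.8),   # a concrete word, e.g. "Stein"
         ((0.9, 0.2), 4.5),   # another concrete word, e.g. "Tisch"
         ((0.1, 1.0), 1.5)]   # an abstract word, e.g. "Idee"

# a vector near the concrete cluster gets a high predicted rating
print(predict_concreteness((0.95, 0.15), rated))
```

The regression alternative would instead fit, e.g., a linear model from embedding dimensions to ratings; the paper's finding is that the two approaches perform comparably.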
Background: Stereotactic radiosurgery (SRS) is an effective treatment for trigeminal neuralgia (TN). Nevertheless, a proportion of patients will experience recurrence and treatment-related sensory disturbances. In order to evaluate the predictors of efficacy and safety of image-guided non-isocentric radiosurgery, we analyzed the impact of trigeminal nerve volume and the nerve dose/volume relationship, together with relevant clinical characteristics.
Methods: Two-hundred and ninety-six procedures were performed on 262 patients at three centers. In 17 patients the TN was secondary to multiple sclerosis (MS). Trigeminal pain and sensory disturbances were classified according to the Barrow Neurological Institute (BNI) scale. Pain-free-intervals were investigated using Kaplan Meier analyses. Univariate and multivariate Cox regression analyses were performed to identify predictors.
Results: The median follow-up period was 38 months, median maximal dose 72.4 Gy, median target nerve volume 25 mm³, and median prescription dose 60 Gy. Pain control rates (BNI I-III) at 6, 12, 24, 36, 48, and 60 months were 96.8, 90.9, 84.2, 81.4, 74.2, and 71.2%, respectively. Overall, 18% of patients developed sensory disturbances. Patients with volume ≥ 30 mm³ were more likely to maintain pain relief (p = 0.031), and low integral dose (< 1.4 mJ) tended to be associated with more pain recurrence than intermediate (1.4–2.7 mJ) or high integral dose (> 2.7 mJ; low vs. intermediate: log-rank test, χ2 = 5.02, p = 0.019; low vs. high: log-rank test, χ2 = 6.026, p = 0.014). MS, integral dose, and mean dose were the factors associated with pain recurrence, while re-irradiation and MS were predictors for sensory disturbance in the multivariate analysis.
Conclusions: The dose to nerve volume ratio is predictive of pain recurrence in TN, and re-irradiation has a major impact on the development of sensory disturbances after non-isocentric SRS. Interestingly, the integral dose may differ significantly in treatments using apparently similar dose and volume constraints.
Studies of careers in business informatics remain relevant if they can help counteract the now long-standing shortage of IT professionals. A study by Hochschule Hannover on business informatics graduates in their first ten years on the job shows their professional goals and the job satisfaction they attain. It becomes clear that women and men perceive the working climate and working conditions very differently and are accordingly differently satisfied. Above all, women criticize aspects best described as a "lack of fairness".
Before submitting a term paper or thesis, a careful revision in the form of a final edit is strongly advised so as not to jeopardize a good grade. This is easy, as the procedure is simple and involves little effort. It would therefore be particularly annoying to leave simple errors unfixed and accept deductions in the grading of the thesis. This document provides a guide for such a final edit.
Genetic programming, with its many possible applications, is a very interesting field, not least because it belongs to the highly topical and widely studied area of machine learning. As in every research area, there are many approaches for improving on the standard procedure; one of these is the use of subroutines. In this context these could also be called methods, functions, or similar: alongside the actual program, the algorithm also evolves reusable sequences of instructions that can be invoked via an identifier at arbitrary points. Several concepts for this already exist; they have achieved very good results in tests and improved on standard genetic programming without subroutines. These tests, however, have always taken place in highly specialized test environments. Particularly interesting are genetic programming systems that can (in theory) solve arbitrary problems, since they can be applied to a wide range of problem settings.
The goal of this thesis is to investigate whether, and to what extent, the use of subroutines is possible and worthwhile in such a general genetic programming system that is theoretically capable of solving arbitrary problems.
In parcel delivery, the “last mile” from the parcel hub to the customer is costly, especially for time-sensitive delivery tasks that have to be completed within hours after arrival. Recently, crowdshipping has attracted increased attention as a new alternative to traditional delivery modes. In crowdshipping, private citizens (“the crowd”) perform short detours in their daily lives to contribute to parcel delivery in exchange for small incentives. However, achieving desirable crowd behavior is challenging as the crowd is highly dynamic and consists of autonomous, self-interested individuals. Leveraging crowdshipping for time-sensitive deliveries remains an open challenge. In this paper, we present an agent-based approach to on-time parcel delivery with crowds. Our system performs data stream processing on the couriers’ smartphone sensor data to predict delivery delays. Whenever a delay is predicted, the system attempts to forge an agreement for transferring the parcel from the current deliverer to a more promising courier nearby. Our experiments show that through accurate delay predictions and purposeful task transfers many delays can be prevented that would occur without our approach.
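The transfer step described above (handing the parcel to "a more promising courier nearby" once a delay is predicted) can be sketched as a simple candidate-selection rule. The function and the tuple layout below are assumptions for illustration, not the paper's actual agent protocol, which involves negotiation between self-interested couriers.

```python
def pick_courier(parcel_deadline, candidates, now=0.0):
    """Among nearby candidate couriers, pick the one whose predicted total
    delivery time meets the deadline with the largest slack; return None
    if no candidate can make it. Each candidate is a tuple
    (name, pickup_eta, delivery_eta), all times in minutes from `now`."""
    best, best_slack = None, 0.0
    for name, pickup_eta, delivery_eta in candidates:
        slack = parcel_deadline - (now + pickup_eta + delivery_eta)
        if slack > best_slack:
            best, best_slack = name, slack
    return best

couriers = [("anna", 5, 30), ("ben", 10, 10), ("cara", 40, 40)]
print(pick_courier(45, couriers))  # → ben
```

In the paper's system, such a selection would only be triggered after the stream-processing component predicts a delay for the current deliverer, and the chosen courier would still have to agree to the transfer.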
Crowdsensing refers to applications in which sensor data are collected collaboratively by a group of volunteers. Crowdsensing can, for example, be used to measure air quality at locations where no permanently installed sensors are available. In crowdsensing systems, the participants must be coordinated and the measurements processed in order to obtain relevant data. In this thesis, a system was designed and prototypically implemented that collects sensor data on a Raspberry Pi (using suitable sensors) and processes them with Complex Event Processing technology.
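A minimal example of the Complex Event Processing style referred to in the thesis: a sliding-window operator over a stream of sensor readings that emits a higher-level alert event when the moving average exceeds a threshold. The window size, threshold, and event format are hypothetical; the actual system would use a CEP engine rather than hand-rolled code.

```python
from collections import deque

def sliding_average(stream, window=3, threshold=50.0):
    """CEP-style pattern: emit an ("ALERT", avg) event whenever the moving
    average over the last `window` readings exceeds `threshold`."""
    buf = deque(maxlen=window)  # keeps only the most recent readings
    events = []
    for reading in stream:
        buf.append(reading)
        if len(buf) == window and sum(buf) / window > threshold:
            events.append(("ALERT", round(sum(buf) / window, 2)))
    return events

print(sliding_average([40, 45, 48, 60, 70, 30]))
# → [('ALERT', 51.0), ('ALERT', 59.33), ('ALERT', 53.33)]
```

Real air-quality readings from the Raspberry Pi's sensors would be fed into such an operator continuously rather than as a finished list.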
Plugins extend the functionality of WordPress and help website operators add new elements or features without having to write complex program code themselves. Such extensions can also ease the maintenance of information and content. This bachelor's thesis describes the design, implementation, and application of such a plugin for comparing events; it is applied on a website about festivals. The festival information is imported into a table and displayed for comparison. Entries can also be made manually, which leads to the conclusion that the plugin, with its import function, is a time-saving and useful extension.
Review of:
Rösch, Hermann et al.: Bibliotheken und Informationsgesellschaft in Deutschland : eine Einführung / Hermann Rösch, Jürgen Seefeldt, Konrad Umlauf ; with the collaboration of Albert Bilo and Eric W. Steinhauer ; co-founded by Engelbert Plassmann. – 3rd, newly conceived and updated edition. – Wiesbaden: Harrassowitz Verlag, 2019. – XIII, 329 pages. – ISBN 9783447066204 : EUR 39.80 (also available as an e-book)