Aim:
The most suitable method for assessing response to peptide receptor radionuclide therapy (PRRT) of neuroendocrine tumors (NET) is still under debate. In this study, we aimed to compare size (RECIST 1.1), density (Choi), Standardized Uptake Value (SUV) and a newly defined combined parameter, ZP, derived from Somatostatin Receptor (SSR) PET/CT for the prediction of both response to PRRT and overall survival (OS).
Material and Methods:
Thirty-four NET patients with progressive disease (F:M 23:11; mean age 61.2 y; SD ± 12) treated with PRRT using either Lu-177 DOTATOC or Lu-177 DOTATATE and imaged with Ga-68 SSR PET/CT approximately 10–12 weeks prior to and after each treatment cycle were retrospectively analyzed. Median duration of follow-up after the first cycle was 63.9 months (range 6.2–86.2). A total of 77 lesions (2–8 per patient) were analyzed. Response assessment was performed according to RECIST 1.1, Choi and modified EORTC (MORE) criteria. In addition, a new parameter named ZP, the product of the Hounsfield unit (HU) and SUVmean of a tumor lesion, was tested. Furthermore, SUV values (max and mean) of the tumor were normalized to the SUV of normal liver parenchyma. Tumor response was defined as CR, PR, or SD. The gold standard for comparing baseline parameters for prediction of the response of individual target lesions to PRRT was the change in lesion size according to RECIST 1.1. For the prediction of overall survival, the response after the first and second PRRT cycles was tested.
Results:
Based on RECIST 1.1, Choi, MORE, and ZP, 85.3%, 64.7%, 61.8%, and 70.6% achieved a response, whereas 14.7%, 35.3%, 38.2%, and 29.4% demonstrated progressive disease (PD), respectively. Baseline ZP and ZPnormalized were found to be the only parameters predictive of lesion progression after three PRRT cycles (AUC ZP 0.753; 95% CI 0.6–0.9, p 0.037; AUC ZPnormalized 0.766; 95% CI 0.6–0.9; p 0.029). Based on a cut-off value of 1201, ZP achieved a sensitivity of 86% and a specificity of 67%, while ZPnormalized reached a sensitivity of 86% and a specificity of 76% at a cut-off value of 198. Median OS in the total cohort was not reached. In univariate analysis among all parameters, only patients with progressive disease according to MORE after the second cycle of PRRT were found to have significantly shorter overall survival (median OS not reached in objective responders, 29.2 months in PD; p 0.015). Patients progressive after two cycles of PRRT according to ZP had shorter OS compared to those responding (median OS not reached for responders, 47.2 months for PD; p 0.066).
Conclusions:
In this explorative study, we showed that Choi, RECIST 1.1, and SUVmax-based response evaluations varied significantly from each other. Only patients showing progressive disease after two PRRT cycles according to MORE criteria had a worse prognosis, while baseline ZP and ZPnormalized performed best in predicting lesion progression after three cycles of PRRT.
After kidney transplantation, graft rejection must be prevented. Therefore, a multitude of patient parameters is monitored pre- and postoperatively. To support this process, the Screen Reject research project is developing a data warehouse optimized for kidney rejection diagnostics. In the course of this project it was discovered that important information is only available in the form of free text instead of structured data and can therefore not be processed by standard ETL tools, which is necessary to establish a digital expert system for rejection diagnostics. For this reason, data integration has been improved by combining methods from natural language processing with methods from image processing. Based on state-of-the-art data warehousing technologies (Microsoft SSIS), a generic data integration tool has been developed. The tool was evaluated by extracting the Banff classification from 218 pathology reports and extracting HLA mismatches from about 1700 PDF files, both written in German.
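In its simplest form, extracting a Banff category from free-text reports can be approached with pattern matching. The following is a minimal sketch in Python, assuming a hypothetical report wording and a hypothetical pattern; the project's actual SSIS-based pipeline is not reproduced here.

```python
import re

# Hypothetical pattern: pull a Banff category (e.g. "Banff Kategorie IIa")
# out of a German free-text pathology report. Both the pattern and the
# report wording are illustrative, not the project's actual rules.
BANFF_RE = re.compile(r"Banff[- ]?(?:Kategorie\s*)?([IVX]+[ab]?)", re.IGNORECASE)

def extract_banff(report: str):
    """Return the first Banff category found in the report, or None."""
    match = BANFF_RE.search(report)
    return match.group(1) if match else None

report = "Nierenbiopsie: akute T-Zell-vermittelte Abstossung, Banff Kategorie IIa."
print(extract_banff(report))  # -> IIa
```

In practice, such a rule would be only one step of a larger ETL flow; negations, report-specific wording and scanned PDF input are what make the combination with image processing necessary.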
Using openEHR Archetypes for Automated Extraction of Numerical Information from Clinical Narratives
(2019)
Up to 80% of medical information is documented as unstructured data such as clinical reports written in natural language. Such data is called unstructured because the information it contains cannot be retrieved automatically as straightforwardly as from structured data. However, we assume that this flexible kind of documentation will remain a substantial part of a patient's medical record, so clinical information systems have to deal appropriately with this type of information description. On the other hand, there are efforts to achieve semantic interoperability between clinical application systems through information modelling concepts like HL7 FHIR or openEHR. Considering this, we propose an approach to transform unstructured documented information into openEHR archetypes. Furthermore, we aim to support the field of clinical text mining by recognizing and publishing the connections between openEHR archetypes and heterogeneous phrasings. We have evaluated our method by extracting the values for three openEHR archetypes from unstructured documents in English and German.
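As an illustration of the kind of numerical extraction described, the sketch below pulls one numeric value out of free text and attaches it to an archetype-like record. The archetype id, the phrasing list and the regular expression are all invented for illustration and are not the paper's actual archetype-phrasing mappings.

```python
import re

# Illustrative sketch of mapping free-text findings to an openEHR-style
# archetype element. The phrasings are a tiny hand-picked list standing in
# for the heterogeneous phrasings the paper connects to archetypes.
PHRASINGS = [r"ejection fraction", r"EF", r"Ejektionsfraktion"]
VALUE_RE = re.compile(
    r"(?:%s)\D{0,20}?(\d+(?:[.,]\d+)?)\s*%%" % "|".join(PHRASINGS),
    re.IGNORECASE,
)

def extract_ef(text: str):
    """Return the ejection fraction as a float (percent), or None."""
    m = VALUE_RE.search(text)
    if not m:
        return None
    return float(m.group(1).replace(",", "."))

record = {
    "archetype": "openEHR-EHR-OBSERVATION.ef.v0",  # hypothetical archetype id
    "value": extract_ef("Die Ejektionsfraktion betraegt ca. 55 %."),
}
print(record["value"])  # -> 55.0
```

A real implementation would validate the extracted value against the constraints defined in the archetype before storing it.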
This paper deals with new job profiles in libraries, mainly systems librarians (German: Systembibliothekare), IT librarians (German: IT-Bibliothekare) and data librarians (German: Datenbibliothekare). It investigates the vacancies and requirements of these positions in the German-speaking countries by analyzing one hundred and fifty job advertisements published on OpenBiblioJobs between 2012 and 2016. In addition, the distribution of positions, institutional sponsors, different job titles as well as time limits, scope of work and remuneration of the positions are evaluated. The analysis of remuneration in the German public sector also provides information on whether a bachelor's or master's degree is demanded.
The average annual increase in job vacancies between 2012 and 2016 is 14.19%, confirming the demand for these professional library profiles.
The higher remuneration of positions in data management, compared to the systems librarian, reflects the prerequisite of a master's degree and thus indicates a desideratum, given the missing or few master's degree courses. Accordingly, the range of bachelor's degree courses (or IT-oriented majors with compulsory elective modules in existing bachelor's degree courses) for systems and IT librarians must be further expanded. An alternative could also be modular education programs for librarians and information scientists with professional experience, as is already the case for music librarians.
Methods for standard meta-analysis of diagnostic test accuracy studies are well established and understood. For the more complex case in which studies report test accuracy across multiple thresholds, several approaches have recently been proposed. These are based on similar ideas, but make different assumptions. In this article, we apply four different approaches to data from a recent systematic review in the area of nephrology and compare the results. The four approaches use: a linear mixed effects model, a Bayesian multinomial random effects model, a time-to-event model and a nonparametric model, respectively. In the case study data, the accuracy of neutrophil gelatinase-associated lipocalin for the diagnosis of acute kidney injury was assessed in different scenarios, with sensitivity and specificity estimates available for three thresholds in each primary study. All approaches led to plausible and mostly similar summary results. However, we found considerable differences in results for some scenarios, for example, differences in the area under the receiver operating characteristic curve (AUC) of up to 0.13. The Bayesian approach tended to lead to the highest values of the AUC, and the nonparametric approach tended to produce the lowest values across the different scenarios. Though we recommend using these approaches, our findings motivate the need for a simulation study to explore optimal choice of method in various scenarios.
Self-directed learning is an essential basis for lifelong learning and requires constantly changing, target-group-specific and personalized prerequisites in order to motivate people to engage with modern learning content, not to overburden them, and yet to adequately convey complex contexts. Current challenges in dealing with digital resources, such as information overload, reduction of complexity and focus, motivation to learn, self-control or psychological well-being, are taken up in the conception of learning settings within our QpLuS IM project for the study programs Information Management and Information Management extra-occupational (IM) at the University of Applied Sciences and Arts Hannover. We present an interactive video on the functionality of search engines as a practical example of a medially high-quality and focused self-learning format that was produced methodically in line with our agile, media-didactic process and stage model of complexity levels.
Within the HiGHmeducation consortium, various online learning modules are to be developed by members of the consortium to address the increasing need for skilled professionals in a networked and digitalized healthcare system. Transferability of these modules to other locations is one main objective of the design of the online learning modules. Thus, a didactical framework for online learning modules was developed. To ensure the feasibility of the framework, the participating universities were analyzed concerning the availability of e-learning support structures and infrastructures, including learning management systems (LMS). The analysis especially focuses on the various LMS learning tools and their suitability for the framework. The framework is the basis for 12 HiGHmeducation online learning modules, some of which were first conducted in winter 2019/20, and leads to a comparable structure of the modules.
Automatic classification of scientific records using the German Subject Heading Authority File (SWD)
(2012)
The following paper deals with an automatic text classification method which does not require training documents. For this method, the German Subject Heading Authority File (SWD), provided by the linked data service of the German National Library, is used. Recently, the SWD was enriched with notations of the Dewey Decimal Classification (DDC). In consequence, it became possible to utilize the subject headings as textual representations for the notations of the DDC. Basically, we derive the classification of a text from the classification of the words in the text given by the thesaurus. The method was tested by classifying 3826 OAI records from 7 different repositories. Mean reciprocal rank and recall were chosen as evaluation measures. A direct comparison to a machine learning method has shown that this method is definitely competitive. Thus we can conclude that the enriched version of the SWD provides high-quality information with broad coverage for the classification of German scientific articles.
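The core idea, deriving a text's classification from the DDC notations attached to its words, can be sketched in a few lines. The tiny word-to-DDC mapping below is invented for illustration; in the paper it comes from the SWD enriched with DDC notations.

```python
from collections import Counter

# Invented miniature lexicon standing in for the SWD-derived word->DDC mapping.
WORD_TO_DDC = {
    "bibliothek": "020",   # library and information science
    "katalog": "020",
    "grammatik": "415",    # grammar
    "syntax": "415",
}

def classify(text: str, top_n: int = 1):
    """Rank DDC notations by how many words of the text map to them."""
    counts = Counter(
        WORD_TO_DDC[w] for w in text.lower().split() if w in WORD_TO_DDC
    )
    return [ddc for ddc, _ in counts.most_common(top_n)]

print(classify("Der Katalog der Bibliothek"))  # -> ['020']
```

Because the "training data" is the thesaurus itself, no labeled documents are needed; evaluation then amounts to ranking the aggregated notations, which is why mean reciprocal rank is a natural measure.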
We present a simple method to find topics in user reviews that accompany ratings for products or services. Standard topic analysis performs suboptimally on such data since the word distributions in the documents are determined not only by the topics but by the sentiment as well. We reduce the influence of the sentiment on the topic selection by adding two explicit topics representing positive and negative sentiment. We evaluate the proposed method on a set of over 15,000 hospital reviews. We show that the proposed method, Latent Semantic Analysis with explicit word features, finds topics with a much smaller bias for sentiments than other similar methods.
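The idea of explicit sentiment topics can be illustrated by extending a document-term matrix with explicit positive/negative count features before the SVD step of LSA, so that sentiment variance is absorbed by dedicated dimensions. All data below are toy examples, not the hospital review corpus, and the construction is a hedged sketch of the approach rather than the paper's exact model.

```python
import numpy as np

# Toy sentiment word lists and documents (invented for illustration).
pos_words, neg_words = {"good", "friendly"}, {"bad", "dirty"}
vocab = ["good", "friendly", "bad", "dirty", "room", "food"]
docs = [
    "good friendly room",
    "bad dirty room",
    "good food",
]

# Document-term count matrix.
X = np.array([[doc.split().count(w) for w in vocab] for doc in docs], float)

# Explicit sentiment features: counts of positive / negative words per doc.
pos = X[:, [vocab.index(w) for w in pos_words & set(vocab)]].sum(axis=1)
neg = X[:, [vocab.index(w) for w in neg_words & set(vocab)]].sum(axis=1)
X_ext = np.hstack([X, pos[:, None], neg[:, None]])

# Standard LSA step (SVD) on the extended matrix.
U, s, Vt = np.linalg.svd(X_ext, full_matrices=False)
print(U.shape, s.shape)  # topic representation of the documents
```

With the sentiment columns present, the leading singular vectors no longer need to encode the positive/negative split, leaving the remaining dimensions freer to capture content topics.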
Regional Innovation Systems describe the relations between actors, structures and infrastructures in a region in order to stimulate innovation and regional development. For these systems the collection and organization of information is crucial. In the present paper we investigate the possibilities to extract information from websites of companies. First we describe regional innovation systems and the information types that are necessary to create them. Then we discuss the possibilities of text mining and keyword extraction techniques to extract this information from company websites. Finally, we describe a small scale experiment in which keywords related to economic sectors and commodities are extracted from the websites of over 200 companies. This experiment shows what the main challenges are for information extraction from websites for regional innovation systems.
Library of Congress Subject Headings (LCSH) are popular for indexing library records. We studied the possibility of assigning LCSH automatically by training classifiers for terms used frequently in a large collection of abstracts of the literature on hand and by extracting headings from those abstracts. The resulting classifiers reach an acceptable level of precision, but fail in terms of recall partly because we could only train classifiers for a small number of LCSH. Extraction, i.e., the matching of headings in the text, produces better recall but extremely low precision. We found that combining both methods leads to a significant improvement of recall and a slight improvement of F1 score with only a small decrease in precision.
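The combination of both methods can be sketched as a simple union of classifier output (high precision) and verbatim matches in the abstract (high recall). The heading list and the classifier predictions below are invented placeholders; the real system trains one classifier per frequent LCSH term.

```python
def extract_headings(text: str, known_headings):
    """Naive extraction: a heading is assigned if it occurs verbatim in the text."""
    low = text.lower()
    return {h for h in known_headings if h.lower() in low}

def combine(classifier_headings, text, known_headings):
    """Union of classifier predictions and extracted headings."""
    return set(classifier_headings) | extract_headings(text, known_headings)

known = ["Information retrieval", "Machine learning", "Libraries"]
abstract = "We study information retrieval methods for libraries."
predicted = {"Machine learning"}  # stand-in for trained classifier output
print(sorted(combine(predicted, abstract, known)))
# -> ['Information retrieval', 'Libraries', 'Machine learning']
```

The union raises recall at a modest precision cost, which matches the reported effect of the combination on the F1 score.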
We compare the effect of different text segmentation strategies on speech based passage retrieval of video. Passage retrieval has mainly been studied to improve document retrieval and to enable question answering. In these domains best results were obtained using passages defined by the paragraph structure of the source documents or by using arbitrary overlapping passages. For the retrieval of relevant passages in a video, using speech transcripts, no author defined segmentation is available. We compare retrieval results from 4 different types of segments based on the speech channel of the video: fixed length segments, a sliding window, semantically coherent segments and prosodic segments. We evaluated the methods on the corpus of the MediaEval 2011 Rich Speech Retrieval task. Our main conclusion is that the retrieval results highly depend on the right choice for the segment length. However, results using the segmentation into semantically coherent parts depend much less on the segment length. Especially, the quality of fixed length and sliding window segmentation drops fast when the segment length increases, while quality of the semantically coherent segments is much more stable. Thus, if coherent segments are defined, longer segments can be used and consequently fewer segments have to be considered at retrieval time.
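Two of the four segmentation strategies compared here, fixed-length segments and a sliding window, can be sketched directly on a word-level transcript. Segment length and step are the free parameters whose choice the abstract discusses; the transcript below is a toy example.

```python
def fixed_length(words, length):
    """Non-overlapping segments of `length` words each."""
    return [words[i:i + length] for i in range(0, len(words), length)]

def sliding_window(words, length, step):
    """Overlapping segments: start a new segment every `step` words."""
    return [words[i:i + length]
            for i in range(0, max(len(words) - length, 0) + 1, step)]

transcript = "the quick brown fox jumps over the lazy dog".split()
print(len(fixed_length(transcript, 4)))       # -> 3
print(len(sliding_window(transcript, 4, 2)))  # -> 3
```

Semantically coherent and prosodic segmentation replace the fixed boundaries above with boundaries derived from the content or the speech signal, which is why their quality degrades more gracefully as segments grow.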
Distributional semantics tries to characterize the meaning of words by the contexts in which they occur. Similarity of words hence can be derived from the similarity of contexts. Contexts of a word are usually vectors of words appearing near that word in a corpus. It was observed in previous research that similarity measures for the context vectors of two words depend on the frequency of these words. In the present paper we investigate this dependency in more detail for one similarity measure, the Jensen-Shannon divergence. We give an empirical model of this dependency and propose the deviation of the observed Jensen-Shannon divergence from the divergence expected on the basis of the frequencies of the words as an alternative similarity measure. We show that this new similarity measure is superior to both the Jensen-Shannon divergence and the cosine similarity in a task in which pairs of words, taken from WordNet, have to be classified as being synonyms or not.
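The proposed measure, the deviation of the observed Jensen-Shannon divergence from its frequency-based expectation, can be sketched as follows. The frequency model here is a placeholder with made-up coefficients; the paper fits an empirical model to corpus data.

```python
import numpy as np

def jsd(p, q):
    """Jensen-Shannon divergence (base 2) of two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def expected_jsd(freq1, freq2, a=1.0, b=-0.05):
    """Placeholder frequency model: observed JSD tends to shrink for frequent words.
    The coefficients a, b are illustrative, not the paper's fitted values."""
    return a + b * (np.log(freq1) + np.log(freq2))

def corrected_similarity(p, q, freq1, freq2):
    # Positive when two words are more similar than their frequencies predict.
    return expected_jsd(freq1, freq2) - jsd(p, q)

p, q = [5, 3, 0, 2], [4, 4, 1, 1]
print(round(jsd(p, q), 3))
```

The correction turns an absolute divergence into a residual, so word pairs are compared against what is typical for words of their frequency rather than against a fixed scale.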
Lemmatization is a central task in many NLP applications. Despite this importance, the number of (freely) available and easy-to-use tools for German is very limited. To fill this gap, we developed a simple lemmatizer that can be trained on any lemmatized corpus. For a full-form word, the tagger tries to find the sequence of morphemes that is most likely to generate that word. From this sequence of tags we can easily derive the stem, the lemma and the part of speech (PoS) of the word. We show (i) that the quality of this approach is comparable to state-of-the-art methods and (ii) that we can improve the results of PoS tagging when we include the morphological analysis of each word.
We compare the effect of different segmentation strategies for passage retrieval of user generated internet video. We consider retrieval of passages for rather abstract and complex queries that go beyond finding a certain object or constellation of objects in the visual channel. Hence the retrieval methods have to rely heavily on the recognized speech. Passage retrieval has mainly been studied to improve document retrieval and to enable question answering. In these domains best results were obtained using passages defined by the paragraph structure of the source documents or by using arbitrary overlapping passages. For the retrieval of relevant passages in a video no author defined paragraph structure is available. We compare retrieval results from 5 different types of segments: segments defined by shot boundaries, prosodic segments, fixed length segments, a sliding window and semantically coherent segments based on speech transcripts. We evaluated the methods on the corpus of the MediaEval 2011 Rich Speech Retrieval task. Our main conclusions are (1) that fixed length and coherent segments are clearly superior to segments based on speaker turns or shot boundaries; (2) that the retrieval results highly depend on the right choice for the segment length; and (3) that results using the segmentation into semantically coherent parts depend much less on the segment length. Especially, the quality of fixed length and sliding window segmentation drops fast when the segment length increases, while quality of the semantically coherent segments is much more stable. Thus, if coherent segments are defined, longer segments can be used and consequently fewer segments have to be considered at retrieval time.
The dependency of word similarity in vector space models on the frequency of words has been noted in a few studies, but has received very little attention. We study the influence of word frequency in a set of 10 000 randomly selected word pairs for a number of different combinations of feature weighting schemes and similarity measures. We find that the similarity of word pairs for all methods, except for the one using singular value decomposition to reduce the dimensionality of the feature space, is determined to a large extent by the frequency of the words. In a binary classification task of pairs of synonyms and unrelated words we find that for all similarity measures the results can be improved when we correct for the frequency bias.
This paper describes the approach of the Hochschule Hannover to the SemEval 2013 task Evaluating Phrasal Semantics. In order to compare a single word with a two-word phrase we compute various distributional similarities, among them a new similarity measure based on the Jensen-Shannon divergence with a correction for frequency effects. The classification is done by a support vector machine that uses all similarities as features. The approach turned out to be the most successful one in the task.
This paper presents a possibility to extend the formalism of linear indexed grammars. The extension is based on the use of tuples of pushdowns instead of one pushdown to store indices during a derivation. If a restriction on the accessibility of the pushdowns is used, it can be shown that the resulting formalisms give rise to a hierarchy of languages that is equivalent to a hierarchy defined by Weir. For this equivalence, which was already known for a slightly different formalism, this paper gives a new proof. Since all languages of Weir's hierarchy are known to be mildly context sensitive, the proposed extensions of LIGs become comparable with extensions of tree adjoining grammars and head grammars.
In this paper we investigate how concreteness and abstractness are represented in word embedding spaces. We use data for English and German, and show that concreteness and abstractness can be determined independently and turn out to be completely opposite directions in the embedding space. Various methods can be used to determine the direction of concreteness, always resulting in roughly the same vector. Though concreteness is a central aspect of the meaning of words and can be detected clearly in embedding spaces, it seems not as easy to subtract or add concreteness to words to obtain other words or word senses, as can be done, e.g., with a semantic property like gender.
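One common way to obtain such a concreteness direction, consistent with the "various methods" mentioned, is the difference between the mean vectors of known concrete and known abstract seed words. The 2-d embeddings and seed lists below are invented toy data, not the English/German embeddings from the study.

```python
import numpy as np

# Invented toy embeddings; real experiments use high-dimensional vectors.
emb = {
    "stone":   np.array([1.0, 0.2]),
    "table":   np.array([0.9, 0.1]),
    "freedom": np.array([-0.8, 0.3]),
    "idea":    np.array([-1.0, 0.2]),
}
concrete, abstract = ["stone", "table"], ["freedom", "idea"]

# Concreteness direction: difference of seed-set means, normalized.
direction = (np.mean([emb[w] for w in concrete], axis=0)
             - np.mean([emb[w] for w in abstract], axis=0))
direction /= np.linalg.norm(direction)

def concreteness(word):
    """Projection onto the concreteness direction (higher = more concrete)."""
    return float(emb[word] @ direction)

print(concreteness("stone") > concreteness("idea"))  # -> True
```

Projecting any word onto this direction yields a concreteness score; the paper's observation is that such directions are stable across methods, even though adding the direction to a word vector rarely yields a sensible neighboring word.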
During the intraoperative radiograph generation process with mobile image intensifier systems (C-arm), most of the radiation exposure for the patient, surgeon and operating room personnel is caused by scattered radiation. The intensity and propagation of scattered radiation depend on different parameters, e.g. the intensity of the primary radiation and the positioning of the mobile image intensifier. Exposure through scattered radiation can be minimized when all these parameters are adjusted correctly. Because radiation is potentially dangerous and cannot be perceived by any human sense, the current education on the correct adjustment of a C-arm is designed very theoretically. This paper presents an approach to scattered radiation calculation and visualization embedded in a computer-based training system for mobile image intensifier systems called virtX. With the help of this extension, the virtX training system should enrich current radiation protection training with visual and practical training aspects.
Background:
Promoting patient and occupational safety are two key challenges for hospitals. When aiming to improve these two outcomes synergistically, psychosocial working conditions, leadership by hospital management and supervisors, and perceptions of patient and occupational safety climate have to be considered. Recent studies have shown that these key topics are interrelated and form a critical foundation for promoting patient and occupational safety in hospitals. So far, these topics have mainly been studied independently from each other. The present study investigated hospital staff's perceptions of four different topics: (1) psychosocial working conditions, (2) leadership, (3) patient safety climate, and (4) occupational safety climate. We present results from a survey in two German university hospitals aiming to detect differences between nurses and physicians.
Methods:
We performed a cross-sectional study using a standardized paper-based questionnaire. The survey was conducted with nurses and physicians to assess the four topics. The instruments mainly consisted of scales of the German version of the COPSOQ (Copenhagen Psychosocial Questionnaire), one scale of the Copenhagen Burnout Inventory (CBI), scales to assess leadership and transformational leadership, scales to assess patient safety climate using the Hospital Survey on Patient Safety Culture (HSPSC), and analogous items to assess occupational safety climate.
Results:
A total of 995 completed questionnaires out of 2512 distributed questionnaires were returned anonymously. The overall response rate was 39.6%. The sample consisted of 381 physicians and 567 nurses. We found various differences with regard to the four topics. In most of the COPSOQ and HSPSC scales, physicians rated psychosocial working conditions and patient safety climate more positively than nurses. With regard to occupational safety, nurses indicated higher occupational risks than physicians.
Conclusions:
The WorkSafeMed study combined the assessment of the four topics psychosocial working conditions, leadership, patient safety climate, and occupational safety climate in hospitals. Looking at the four topics provides an overview of where improvements in hospitals may be needed for nurses and physicians. Based on these results, improvements in working conditions, patient safety climate, and occupational safety climate are required for health care professionals in German university hospitals – especially for nurses.
Correction to: https://doi.org/10.1186/s12913-018-3862-7
In the original publication of this article, the authors missed that reverse coding was necessary for the item “Do you work separate from your colleagues?” before calculating the scale ‘social relations’. So they corrected the analysis accordingly. The results with the revised scale show that there are no longer any significant differences between nurses and physicians with regard to this scale.
Objectives: Injury to major white matter pathways during language-area associated glioma surgery often leads to permanent loss of neurological function. The aim was to establish standardized tractography of language pathways as a predictor of language outcome in clinical neurosurgery.
Methods: We prospectively analyzed 50 surgical cases of patients with left perisylvian, diffuse gliomas. Standardized preoperative Diffusion-Tensor-Imaging (DTI)-based tractography of the 5 main language tracts (Arcuate Fasciculus [AF], Frontal Aslant Tract [FAT], Inferior Fronto-Occipital Fasciculus [IFOF], Inferior Longitudinal Fasciculus [ILF], Uncinate Fasciculus [UF]) and spatial analysis of tumor and tracts was performed. Postoperative imaging and the resulting resection map were analyzed for potential surgical injury of tracts. The language status was assessed preoperatively, postoperatively and after 3 months using the Aachen Aphasia Test and Berlin Aphasia Score. Correlation analyses, two-step cluster analysis and binary logistic regression were used to analyze associations of tractography results with language outcome after surgery.
Results: In 14 out of 50 patients (28%), new aphasic symptoms were detected 3 months after surgery. The preoperative infiltration of the AF was associated with functional worsening (cc = 0.314; p = 0.019). Cluster analysis of tract injury profiles revealed two areas particularly related to aphasia: the temporo-parieto-occipital junction (TPO; temporo-parietal AF, middle IFOF, middle ILF) and the temporal stem/peri-insular white matter (middle IFOF, anterior ILF, temporal UF, temporal AF). Injury to these areas (TPO: OR: 23.04; CI: 4.11 – 129.06; temporal stem: OR: 21.96; CI: 2.93 – 164.41) was associated with a higher risk of persisting aphasia.
Conclusions: Tractography of language pathways can help to determine the individual aphasia risk profile presurgically. The TPO and temporal stem/peri-insular white matter were confirmed as functional nodes particularly sensitive to surgical injuries.
Ever since the 1996 revision of the Declaration of Helsinki, the World Medical Association has attempted to address ethical and scientific concerns of its diverse stakeholders for Articles 33 (use of placebo) and 34 (posttrial provisions), most recently in 2013. Both are inextricably linked to standard of care, an essential element of any comparative, interventional clinical trial. But has this now 20-year-long ethical debate truly been put to rest? The choice of standard of care in clinical trials remains a complex issue, particularly for comparative trials conducted in emerging countries.
A study to assess the knowledge and attitude towards HIV of pharmacy students from Mumbai university
(2020)
Background: India has the biggest HIV epidemic in the world. The role of a pharmacist is pivotal in educating the general masses. The aim of the study was to determine the knowledge and attitude of pharmacy students from the University of Mumbai.
Methods: A cross-sectional study was conducted at the University of Mumbai during February-March 2020. In total, 307 students (214 females and 94 males) participated in the study. The questionnaire was distributed in the classroom and data was collected by means of Google Forms. The data was analysed using IBM SPSS version 23.
Results: The participants demonstrated good knowledge (84%) and attitude (76%) scores. With respect to the knowledge score, no significant difference was observed except for the responses to two questions, on the aim of antiretroviral therapy (ART) and on whether avoidance of sexual intercourse can decrease the risk of HIV. With respect to the attitude score, volunteering to work at an institute for the welfare of HIV patients showed a significant difference.
Conclusion: The current study showed that there were no misconceptions or negative attitudes regarding HIV among the students. However, a study with a greater sample size must be conducted across India for further investigation.
Background
The business of clinical research has changed in the past two decades, shifting from industrialised Western countries to so-called emerging markets such as Eastern Europe, Latin America and Africa. An appraisal of the trends could identify associated factors that may have implications for the local populations and their endemic diseases.
Objectives
To identify potential reasons why emerging countries have become attractive places for international sponsors to conduct their clinical trials.
Methods
Using ClinicalTrials.gov, the Pan African Clinical Trials Registry, the National Health Research Database and the Nigeria Clinical Trials Registry, trend data on clinical research development were determined for two emerging African markets, Nigeria and South Africa (SA), from 2010 to 2018. Also, health data on the two countries from the fact sheets of health statistics of the World Health Organization were compared, as well as regulatory and ethical conditions. Available data were analysed using descriptive statistics and trend analysis.
Results
The impact of globalisation is evident from the upward trend in clinical trials in SA before 2010, and the clear downward trend thereafter. One reason for this change could be the alignment of SA's regulatory and ethical frameworks with the Western world. In contrast, the upward trend is only just beginning in Nigeria, with the introduction of ethical/regulatory frameworks and supportive institutions making the business of clinical research more attractive on an international level. Although the number of international and local sponsors increased in Nigeria from 2010 to 2018, only the latter increased in SA, with the former decreasing over the same period. Overall, there is a mismatch between country-specific diseases and the drugs being tested, to the extent that leprosy, which is endemic in Nigeria, and tuberculosis in SA were not in the list of top 10 study areas in either country.
Conclusions
The globalisation trend is evident in the clinical trials business, but cannot be generalised to all emerging countries. Timing and intensity vary from country to country relative to factors that advance the existing profit-orientated business models of the sponsors. Furthermore, various diseases have been localised, which entails a diversely increasing need for research.
Economic and political/governmental infrastructural factors are major contributors to the economic development and growth of all sectors of a country, such as healthcare systems and clinical research, including the pharmaceutical industry. But what is the interaction between economic and political/governmental infrastructural factors and the development of healthcare systems, especially the performance of the pharmaceutical industry? Information from selected articles of a literature search of PubMed and Google Advanced Search led to the generation of five categories of infrastructural factors, which were filled with data from 41 African countries using the World Health Organization data repository. Median changes over time were calculated and tested by the Wilcoxon signed-rank test and the Friedman test, respectively. Analysis of factors related to the availability of healthcare facilities showed that the numbers of physicians and pharmacies increased significantly, while the number of hospital beds decreased insignificantly. Healthcare financing by the government showed notable differences. Private health spending decreased significantly, unlike Gross National Income. Analysis of infrastructural factors showed that a stable supply of electricity and the associated use of the Internet improved significantly. The low level of data on the expansion of paved road networks suggests less developed medical services in remote rural areas. Healthcare systems in African countries improved over the last two decades, but differences between the individual countries still prevail and some of the countries cannot yet offer an attractive sales market for the products of pharmaceutical companies.
Background: The globalization of clinical research should also benefit the population in developing markets. In this context, the approval of tested medicines and the associated expansion of medical care beyond clinical studies would be desirable as a possible long-term benefit.
Objectives: This study was designed to compare the development of the number of clinical trials with the number of marketing authorizations of medicines on the African continent. To contrast these 2 parameters, the data were analyzed using the model of an ecological study.
Methods: To reflect the broad spectrum of African developing countries with diverse levels of development, the data collection was based on 2 geographically selected sample countries each from Central, North, East, West, and Southern Africa. Based on the ClinicalTrials.gov registry, the first step was to collect trends data on the development of the clinical trials in the 10 selected countries of the country list of the African Region published by the World Health Organization for the period 2015 to 2018. Subsequently, data on the current number of marketing authorizations of medicines in the selected sample countries were identified using the online registries of the national authorities. The data were utilized in comparative analyses.
Results: Eight out of 10 model countries showed an increase in the number of clinical trials; the exceptions were Cameroon and Libya, which showed an overall decline in research activity over the entire period. In direct comparison with drug registrations, the numbers indicate a similar development. The only exception here is Nigeria, a country with a solid performance in clinical research and yet a decrease in medicine registrations since 2015.
Conclusions: The expected increase in the development of clinical research as a result of the globalization trend can broadly be observed in most of the model countries. However, this increase does not guarantee an improvement in the number of medicine registrations. Although this is evident in some of the selected model countries, it cannot be projected to the entire African region. This may be linked to the diverse development of the individual countries due to the different political situations and the varying degrees of clinical research infrastructure.
Objectives:
The aim was to identify theoretically expected as well as actually reported benefits from drug development and the importance of individual patient benefits compared to the collective benefits to society in general.
Background:
Ethical guidelines require that clinical research involving humans offer the potential for benefit. A number of characteristics can be applied to define research benefit. Often benefit is categorized as being either direct or indirect. Indirect benefits can involve collective benefits for society rather than any benefits to the trial patient or subject. The purpose of this review was to examine which potential individual and societal benefits were mentioned as being expected in publications from government experts and which were mentioned in publications describing completed drug development trial results.
Methods:
Literature on research benefit was first identified by searching the PubMed database using several combinations of the key words benefit and clinical research. The search was limited to articles published in English. A Google search with the same combinations of key words but without any language limitation was then performed. Additionally, the reference lists of promising articles were screened for further thematically related articles. Finally, a narrative review was performed of relevant English- and German-language articles published between 1996 and 2016 to identify which of several potential benefits were either theoretically expected or which were mentioned in publications on clinical drug development trial results.
Results:
The principal benefits from drug development discussed included 2 main types of benefit, namely individual benefits for the patients and collective benefits for society. Twenty-one of the 26 articles discussing theoretically expected benefits focused on individual patient benefits, whereas 17 of the 26 mentioned collective benefits to society. In these publications, the most commonly mentioned theoretically expected individual patient benefit was the chance to receive up-to-date care (38.1%). A general increase in knowledge about health care, treatments, or drugs (70.6%) was the most commonly mentioned theoretically expected benefit for society. In contrast, all 13 publications reporting actual benefits of clinical drug development trials focused on personal benefits, and only 1 of them also mentioned a societal benefit. The most commonly mentioned individual benefit was an increased quality of life (53.9%), whereas the only mentioned collective benefit to society was a general gain of knowledge (100.0%).
Conclusions:
Both theoretically expected and actually reported benefits in the majority of the included publications emphasized the importance of individual patient benefits from drug development rather than the collective benefits to society in general. The authors of these publications emphasized the right of each individual patient or subject to look for and expect some personal benefit from participating in a clinical trial rather than considering societal benefit as a top priority. From an ethical point of view, the benefits each individual patient receives from his or her participation in a clinical trial might also be seen as a societal benefit, especially when the drug or device tested, if approved for marketing, would eventually be made available for other similar patients from the country in which the clinical trial was conducted.
From an ethical perspective, clinical research involving humans is only acceptable if it offers the potential for benefit. Various characteristics can be applied to differentiate research benefit. Benefit is often categorized as direct or indirect, whereby indirect benefit might be further differentiated into collective benefit for society, which may exclude or include the trial patient in the long term. Ethical guidelines, such as the latest version of the Declaration of Helsinki, do not explicitly favor a particular type of benefit.
Publication Bias
(2016)
According to the Declaration of Helsinki, as well as the Statement on Public Disclosure of Clinical Trial Results of the World Health Organization, every researcher has the ethical obligation to publish the results of all trials with human participants in a complete and accurate way within 12 months after the end of the trial.1,2 Nevertheless, for several reasons, not all research results are published accurately, if they are published at all. This phenomenon of publication bias may not only create a false impression of the reliability of the clinical research enterprise, but may also undermine the evidence behind clinical conclusions about the best treatments, which are mostly based on published data and results.
Complications may occur after a liver transplantation; therefore, proper monitoring and care in the post-operative phase play a very important role. Monitoring and care for patients from abroad is sometimes difficult for a variety of reasons, e.g., differing care facilities. The objective of our research was to design, implement and evaluate a home monitoring and decision support infrastructure for international children who underwent a liver transplant operation. A point-of-care device and the PedsQL questionnaire were used in the patients’ home environment to measure blood parameters and assess quality of life. Using a tablet PC and specially developed software, the measured results could be transmitted to the healthcare providers via the Internet. So far, the developed infrastructure has been evaluated with four international patients/families, who transmitted 38 blood test records. The evaluation showed that the home monitoring and decision support infrastructure is technically feasible, can give timely alarms in abnormal situations, and may increase parents’ feeling of safety for their children.
This paper summarizes the results of a comprehensive statistical analysis on a corpus of open access articles and contained figures. It gives an insight into quantitative relationships between illustrations or types of illustrations, caption lengths, subjects, publishers, author affiliations, article citations and others.
The NOA project collects and stores images from open access publications and makes them findable and reusable. During the project, a focus group workshop was held to determine whether the development is addressing researchers’ needs. This took place before the second half of the project so that the results could inform further development, since addressing users’ needs is a central part of the project. The focus was to find out what content and functionality researchers expect from image repositories.
In a first step, participants were asked to fill out a survey about their use of images. Secondly, they tested different use cases on the live system. The first finding is that users have a need to find scholarly images, but it is not a routine task, and they often do not know any image repositories. This is another reason for repositories to become more open and to reach users by integrating with other content providers. The second finding is that users paid attention to image licenses but struggled to find and interpret them, while also being unsure how to cite images. In general, there is a high demand for reusing scholarly images, but the existing infrastructure has room to improve.
Introduction: Piper crocatum Ruiz & Pav (P. crocatum) has been reported to accelerate the diabetic wound healing process empirically. Some studies showed the benefits of P. crocatum in treating various diseases but its mechanisms in diabetic wound healing have never been reported. In the present study we investigated the diabetic wound healing activity of the active fraction of P. crocatum on wounded hyperglycemia fibroblasts (wHFs).
Methods: Bioassay-guided fractionation was performed to obtain the most active fraction. The selected active fraction was applied to wHFs over a 72 h incubation. A diabetic condition was mimicked using basal glucose media containing an additional 17 mmol/L D-glucose. A wound was simulated via the scratch assay. Collagen deposition was measured using Picro-Sirius Red staining, and wound closure was measured using the scratch wound assay. Underlying mechanisms involving p53, αSMA, SOD1 and E-cadherin were examined using western blotting.
Results: We found that FIV is the most active fraction of P. crocatum. We confirmed that FIV (7.81 mg/ml, 15.62 mg/ml, 31.25 mg/ml, 62.5 mg/ml, and 125 mg/ml) induced collagen deposition and wound closure in wHFs. Furthermore, FIV treatment (7.81 mg/ml, 15.62 mg/ml, 31.25 mg/ml) down-regulated the protein expression level of p53 and up-regulated the protein expression levels of αSMA, E-cadherin, and SOD1.
Discussion/conclusions: Our findings suggest that ameliorating collagen deposition and wound closure through protein regulation of p53, aSMA, E-cadherin, and SOD1 are some of the mechanisms by which FIV of P. crocatum is involved in diabetic wound healing therapy.
Introduction
Atopic dermatitis (AD) is a common inflammatory skin disease. Many patients initiate a systemic therapy if the disease is not adequately controlled with topical treatment alone. Currently, there is little real-world evidence on the AD-related medical care situation in Germany. This study analyzed patient characteristics, treatment patterns, healthcare resource utilization and costs associated with systemically treated AD for the German healthcare system.
Methods
In this descriptive, retrospective cohort study, aggregated anonymized German health claims data from the InGef research database were used. Within a representative sample of four million insured individuals, patients with AD and systemic drug therapy initiation (SDTI) in the index year 2017 were identified and included into the study cohort. Systemic drug therapy included dupilumab, systemic corticosteroids (SCS) and systemic immunosuppressants (SIS). Patients were observed for one year starting from the date of SDTI in 2017.
Results
9975 patients were included (57.8% female, mean age 39.6 years [SD 25.5]). In the one-year observation period, the most common systemic drug therapy was SCS (> 99.0%). Administrations of dupilumab (0.3%) or dispensations of SIS were rare (cyclosporine: 0.5%, azathioprine: 0.6%, methotrexate: 0.1%). Median treatment duration of SCS, cyclosporine and azathioprine was 27 days, 102 days, and 109 days, respectively. 2.8% of the patients received phototherapy; 41.6% used topical corticosteroids and/or topical calcineurin inhibitor. Average annual costs for medications amounted to € 1237 per patient. Outpatient services were used by 99.6% with associated mean annual costs of € 943; 25.4% had at least one hospitalization (mean annual costs: € 5836). 5.3% of adult patients received sickness benefits with associated mean annual costs of € 5026.
Conclusions
Despite an unfavorable risk–benefit profile, this study demonstrated that treatment with SCS was common, whereas other systemic drug therapy options were rarely used. Furthermore, the results suggest a substantial economic burden for patients with AD and SDTI.
Background:
The increase in food intolerances poses a burgeoning problem in our society. Food intolerances not only lead to physical impairment of the individual patient but also result in a high socio-economic burden due to factors such as the treatment required as well as absenteeism. The present study aimed to explore whether lactose intolerant (LI) patients exhibit more frequent comorbidities than non-LI patients.
Methods:
The study was conducted on a case-control basis and the results were determined using routine data analysis. Routine data from the IMS Disease Analyzer database were used for this purpose. A total of 6,758 data records were processed and analyzed.
Results:
There were significant correlations between LI and the incidence of osteoporosis, changes in mental status, and the presence of additional food intolerances. Comparing 3,379 LI vs. 3,379 non-LI patients, 34.5% vs. 17.7% (P<0.0001) suffered from abdominal pain; 30.6% vs. 17.2% (P<0.0001) from gastrointestinal infections; and 20.9% vs. 16.0% (P=0.0053) from depression. Adjusted odds ratios (OR) were the highest for fructose intolerance (n=229 LI vs. n=7 non-LI; OR 31.06; P<0.0001), irritable bowel syndrome (n=247 LI vs. n=44 non-LI; OR 5.23; P<0.0001), and bloating (n=351 LI vs. n=68 non-LI; OR 4.94; P<0.0001).
Conclusion:
The study confirms that LI should not be regarded as an isolated illness but rather considered a possible trigger for further diseases. Additional research is necessary to allow more precise statements.
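For illustration, a crude (unadjusted) odds ratio can be recomputed from the counts reported above. This is a minimal sketch; note that the study reports adjusted ORs, which additionally control for covariates and therefore differ from this simple 2x2-table estimate.

```python
# Crude odds ratio from a 2x2 table, using the counts given in the abstract.
# The helper function and variable names are illustrative, not from the paper.
def odds_ratio(exposed_cases, exposed_total, control_cases, control_total):
    """Crude odds ratio for a 2x2 table: (a*d) / (b*c)."""
    a = exposed_cases                    # LI patients with the condition
    b = exposed_total - exposed_cases    # LI patients without it
    c = control_cases                    # non-LI patients with the condition
    d = control_total - control_cases    # non-LI patients without it
    return (a * d) / (b * c)

# Fructose intolerance: 229 of 3,379 LI vs. 7 of 3,379 non-LI patients.
crude_or = odds_ratio(229, 3379, 7, 3379)
print(round(crude_or, 2))  # ~35, in the vicinity of the adjusted OR of 31.06
```

The gap between the crude value and the reported adjusted OR reflects the covariate adjustment in the original analysis.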
Background: Epidemiological and experimental studies suggest that exposure to ultrafine particles (UFP) might aggravate the allergic inflammation of the lung in asthmatics.
Methods: We exposed 12 allergic asthmatics in two subgroups in a double-blinded randomized cross-over design, first to freshly generated ultrafine carbon particles (64 μg/m³; 6.1 ± 0.4 × 10⁵ particles/cm³ for 2 h) and then to filtered air, or vice versa, with a 28-day recovery period in between. Eighteen hours after each exposure, grass pollen was instilled into a lung lobe via bronchoscopy. Another 24 hours later, inflammatory cells were collected by means of bronchoalveolar lavage (BAL). (Trial registration: NCT00527462)
Results: For the entire study group, inhalation of UFP by itself had no significant effect on the allergen-induced inflammatory response measured by total cell count as compared to exposure to filtered air (p = 0.188). However, the subgroup of subjects who inhaled UFP during the first exposure exhibited a significant increase in total BAL cells (p = 0.021), eosinophils (p = 0.031) and monocytes (p = 0.013) after filtered air exposure and subsequent allergen challenge 28 days later. Additionally, the potential of BAL cells to generate oxidant radicals was significantly elevated at that time point. The subgroup that was exposed first to filtered air and 28 days later to UFP did not reveal differences between sessions.
Conclusions: Our data demonstrate that pre-allergen exposure to UFP had no acute effect on the allergic inflammation. However, the subgroup analysis leads to the speculation that inhaled UFP might have a long-term effect on the inflammatory course in asthmatic patients. This should be confirmed in further studies with an appropriate design and a sufficient number of subjects.
The objective was to establish and standardise a broth microdilution susceptibility testing method for porcine Bordetella (B.) bronchiseptica. B. bronchiseptica isolates from different geographical regions and farms were genotyped by macrorestriction analysis and subsequent pulsed-field gel electrophoresis. One reference and one type strain plus two field isolates of B. bronchiseptica were chosen to analyse growth curves in four different media: cation-adjusted Mueller-Hinton broth (CAMHB) with and without 2% lysed horse blood, Brain-Heart-Infusion (BHI), and Caso broth. The growth rate of each test strain in each medium was determined by culture enumeration, and the suitability of CAMHB was confirmed by comparative statistical analysis. Thereafter, the reference and type strains and eight epidemiologically unrelated field isolates of B. bronchiseptica were used to test the suitability of a broth microdilution susceptibility testing method following CLSI-approved performance standards given in document VET01-A4. Susceptibility tests, using 20 antimicrobial agents, were performed in five replicates, and data were collected after 20 and 24 hours of incubation and statistically analysed. Due to the low growth rate of B. bronchiseptica, an incubation time of 24 hours resulted in significantly more homogeneous minimum inhibitory concentrations after five replications compared to a 20-hour incubation. An interlaboratory comparison trial including susceptibility testing of 24 antimicrobial agents revealed a high mean level of reproducibility (97.9%) of the modified method. Hence, for harmonized broth microdilution susceptibility testing of B. bronchiseptica, an incubation time of 24 hours in CAMHB at an incubation temperature of 35°C and an inoculum concentration of approximately 5 × 10⁵ cfu/ml was proposed.
Background: Improving the transparency of information about the quality of health care providers is one way to improve health care quality. It is assumed that Internet information steers patients toward better-performing health care providers and will motivate providers to improve quality. However, the effect of public reporting on hospital quality is still small. One of the reasons is that users find it difficult to understand the formats in which information is presented.
Objective: We analyzed the presentation of risk-adjusted mortality rate (RAMR) for coronary angiography in the 10 most commonly used German public report cards to analyze the impact of information presentation features on their comprehensibility. We wanted to determine which information presentation features were utilized, were preferred by users, led to better comprehension, and had similar effects to those reported in evidence-based recommendations described in the literature.
Methods: The study consisted of 5 steps: (1) identification of best-practice evidence about the presentation of information on hospital report cards; (2) selection of a single risk-adjusted quality indicator; (3) selection of a sample of designs adopted by German public report cards; (4) identification of the information presentation elements used in public reporting initiatives in Germany; and (5) an online panel completed an online questionnaire that was conducted to determine if respondents were able to identify the hospital with the lowest RAMR and if respondents’ hospital choices were associated with particular information design elements.
Results: Evidence-based recommendations were made relating to the following information presentation features relevant to report cards: evaluative table with symbols, tables without symbols, bar charts, bar charts without symbols, bar charts with symbols, symbols, evaluative word labels, highlighting, order of providers, high values to indicate good performance, explicit statements of whether high or low values indicate good performance, and incomplete data (“N/A” as a value). When investigating the RAMR in a sample of 10 hospitals’ report cards, 7 of these information presentation features were identified. Of these, 5 information presentation features improved comprehensibility in a manner reported previously in literature.
Conclusions: To our knowledge, this is the first study to systematically analyze the most commonly used public reporting card designs used in Germany. Best-practice evidence identified in international literature was in agreement with 5 findings about German report card designs: (1) avoid tables without symbols, (2) include bar charts with symbols, (3) state explicitly whether high or low values indicate good performance or provide a “good quality” range, (4) avoid incomplete data (N/A given as a value), and (5) rank hospitals by performance. However, these findings are preliminary and should be subject of further evaluation. The implementation of 4 of these recommendations should not present insurmountable obstacles. However, ranking hospitals by performance may present substantial difficulties.
Background: We sought to develop and test an objective scorecard-based system for assessing and categorizing available research sites in Lassa fever-affected countries based on their preparedness and capability to host Lassa fever vaccine clinical trials.
Methods: We mapped available clinical research sites through interrogation of online clinical trial registries and relevant disease-based consortia. A structured online questionnaire was used to assess the capability of clinical trial sites to conduct Lassa fever vaccine clinical trials. We developed a new scoring template by allocating scores to questionnaire parameters based on perceived importance to the conduct of clinical trials as described in the WHO/TDR Global Competency Framework for Clinical Research. Cutoff points of 75% and 50% were used to categorize sites into categories A, B, or C.
Results: This study identified 44 clinical trial sites in 8 Lassa fever-affected countries. Of these, 35 sites were characterized based on their capacity to host Lassa fever vaccine clinical trials. A total of 14 sites in 4 countries were identified as ready to host Lassa fever vaccine trials immediately or with little support.
Conclusion: Based on the outcome of the survey, it is feasible to host Lassa fever vaccine trials in affected countries. However, the findings need to be validated through site visits. This experience with a standardized and objective method of site assessment is encouraging, and the site selection method used can serve as an orientation for sponsors and researchers planning clinical trials in the region.
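The cutoff-based categorization described above can be sketched as follows. This is a hypothetical illustration: the site names, percentage scores, and the readiness labels in the comments are assumptions, not findings from the survey.

```python
# Sketch of the scorecard categorization: sites reaching >= 75% of the
# maximum score fall into category A, >= 50% into B, all others into C.
def categorize(score_percent):
    if score_percent >= 75:
        return "A"   # e.g. ready to host trials immediately (assumed label)
    if score_percent >= 50:
        return "B"   # e.g. ready with little support (assumed label)
    return "C"       # e.g. substantial capacity building needed (assumed)

# Hypothetical site scores as percentages of the maximum achievable score.
sites = {"Site 1": 82.0, "Site 2": 64.5, "Site 3": 41.0}
categories = {name: categorize(pct) for name, pct in sites.items()}
print(categories)
```

Allocating scores to questionnaire parameters and normalizing to a percentage, as the abstract describes, makes the two cutoffs directly comparable across sites.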
The development of Artificial Intelligence (AI) has profound implications for improving human and computational productivity in the future. However, it is also an existential risk to human life because it could exceed human capabilities. As such, information about the technology, the direction of its development and its purpose is important. This can be achieved through openness and transparency of processes. In practice, however, companies hold property rights over AI and monopolies on software, data and experts. As a countermovement to leading AI companies, the “Open AI Movement” has evolved to push open-source AI research and products, to empower users, and to bridge the digital divide through participation and access. In this thesis, the implications of declaring AI a commons were analyzed through interviews with AI experts in the United States. The legal classification of AI is controversial, but it could be seen as a basic human right. Other findings are that this field is very competitive and that the best approach is to collaboratively develop software that adds value at the edge of the commons.
This research focuses on the fundamental ideas and underlying principles of E-Learning technology, as well as theoretical considerations for an optimal learning environment. This theoretical exploration was then used as a basis for the design and construction of a new, interactive Web-Based ESH-Training. The quality and effectiveness of this new course was then compared with that of the existing analog PDF-Training via a test with a diverse sample of employee learners. Learners were later surveyed to ascertain their views on both trainings in terms of the quality of the content, facilitator, resources, and length. Results clearly showed that regardless of demographic factors, most employee learners preferred the new, Web-Based ESH-Training to the analog PDF-Training.
Background
To perform a systematic review about the effect of using clinical pathways on length of stay (LOS), hospital costs and patient outcomes. To provide a framework for local healthcare organisations considering the effectiveness of clinical pathways as a patient management strategy.
Methods
As participants, we considered hospitalized children and adults of every age and indication whose treatment involved the management strategy "clinical pathways". We included only randomised controlled trials (RCTs) and controlled clinical trials (CCTs), without restriction by language or country of publication. Single measures of continuous and dichotomous study outcomes were extracted from each study. Separate analyses were performed to compare the effects of clinical pathways on length of stay (LOS), hospital costs and patient outcomes. A random-effects meta-analysis was performed with untransformed and log-transformed outcomes.
Results
In total, 17 trials met the inclusion criteria, representing 4,070 patients. The quality of the included studies was moderate, and the studies reporting economic data had a very limited scope of evaluation. In general, the majority of studies reporting economic data (LOS and hospital costs) showed a positive impact. Out of 16 studies reporting effects on LOS, 12 found a significant shortening. Furthermore, in a subgroup analysis, clinical pathways for invasive procedures showed a stronger LOS reduction (weighted mean difference (WMD) -2.5 days versus -0.8 days).
There was no evidence of differences in readmission to hospitals or in-hospital complications. The overall Odds Ratio (OR) for re-admission was 1.1 (95% CI: 0.57 to 2.08) and for in-hospital complications, the overall OR was 0.7 (95% CI: 0.49 to 1.0). Six studies examined costs, and four showed significantly lower costs for the pathway group. However, heterogeneity between studies reporting on LOS and cost effects was substantial.
Conclusion
As a result of the relatively small number of studies meeting inclusion criteria, this evidence base is not conclusive enough to provide a replicable framework for all pathway strategies. Considering the clinical areas for implementation, clinical pathways seem to be effective especially for invasive care. When implementing clinical pathways, the decision makers need to consider the benefits and costs under different circumstances (e.g. market forces).
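The random-effects pooling used in reviews like the one above is commonly implemented with the DerSimonian-Laird estimator; the abstract does not name the specific estimator, so this is an assumption. The sketch below pools hypothetical per-study LOS mean differences (in days); all numbers are illustrative, not the review's data.

```python
import math

# Minimal DerSimonian-Laird random-effects meta-analysis (a common choice;
# the review abstract does not specify which estimator was used).
def dersimonian_laird(effects, variances):
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study mean LOS differences (days) and their variances.
effects = [-2.1, -0.9, -3.0, -1.5]
variances = [0.30, 0.25, 0.50, 0.40]
pooled, ci = dersimonian_laird(effects, variances)
print(f"pooled WMD = {pooled:.2f} days, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

The between-study variance tau² widens the confidence interval when the studies are heterogeneous, which matches the substantial heterogeneity noted in the results above.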
In this poster we present the ongoing development of an integrated free and open source toolchain for semantic annotation of digitised cultural heritage. The toolchain development involves the specification of a common data model that aims to increase interoperability across diverse datasets and to enable new collaborative research approaches.
A new FOSS (free and open source software) toolchain and associated workflow is being developed in the context of NFDI4Culture, a German consortium of research and cultural heritage institutions working towards a shared infrastructure for research data that meets the needs of 21st century data creators, maintainers and end users across the broad spectrum of the digital libraries and archives field, and the digital humanities. This short paper and demo present how the integrated toolchain connects: 1) OpenRefine - for data reconciliation and batch upload; 2) Wikibase - for linked open data (LOD) storage; and 3) Kompakkt - for rendering and annotating 3D models. The presentation is aimed at librarians, digital curators and data managers interested in learning how to manage research datasets containing 3D media, and how to make them available within an open data environment with 3D-rendering and collaborative annotation features.
Wikidata and Wikibase as complementary research data management services for cultural heritage data
(2022)
The NFDI (German National Research Data Infrastructure) consortia are associations of various institutions within a specific research field, which work together to develop common data infrastructures, guidelines, best practices and tools that conform to the FAIR data principles. Within the NFDI, a common question is: what is the potential of Wikidata as an application for science and research? In this paper, we address this question by tracing current research use cases and applications for Wikidata, its relation to standalone Wikibase instances, and how the two can function as complementary services to meet a range of research needs. This paper builds on lessons learned through the development of open data projects and software services within the Open Science Lab at TIB, Hannover, in the context of NFDI4Culture – the consortium including participants across the broad spectrum of the digital libraries, archives, and museums field, and the digital humanities.
The Wnt signaling pathway has been associated with many essential cell processes. This study aims to examine the effects of Wnt signaling on proliferation of cultured HEK293T cells. Cells were incubated with Wnt3a, and the activation of the Wnt pathway was followed by analysis of the level of the β-catenin protein and of the expression levels of the target genes MYC and CCND1. The level of β-catenin protein increased up to fourfold. While the mRNA levels of c-Myc and cyclin D1 increased slightly, the protein levels increased up to a factor of 1.5. Remarkably, the MTT and BrdU assays showed different results when measuring the proliferation rate of Wnt3a-stimulated HEK293T cells. In the BrdU assays, an increase in the proliferation rate could be detected, which correlated with the applied Wnt3a concentration. In contrast, this correlation could not be shown in the MTT assays. The MTT results, which are based on mitochondrial activity, were confirmed by analysis of the succinate dehydrogenase complex by immunofluorescence and western blotting. Taken together, our study shows that Wnt3a activates proliferation of HEK293T cells, and that these effects can be detected by measuring DNA synthesis rather than changes in mitochondrial activity.
Background: Preventive measures often do not reach the people they were intended for. Programs for older adults may target men and women, different age groups including the very old, and/or chronically ill patients with specific indications. The defined target groups rarely participate in the conception of programs or in the design of information materials, although this would increase accessibility and participation. In the German “Reaching the Elderly” study (2008–2011), an approach to motivating older adults to participate in a preventive home visit (PHV) program was modified with the participatory involvement of the target groups. The study examines how older men and women would prefer to be addressed by health and prevention programs.
Methods: Four focus groups (N = 42 participants) and 12 personal interviews were conducted (women and men in 2 age groups: 65–75 years and ≥ 76 years). Participants from two districts of a major German city were selected from a stratified random sample (N = 200) based on routine data from a local health insurance fund. The study focused on the participants’ knowledge about health and disease prevention and how they preferred to be approached and addressed. Videos of the focus groups were recorded and analysed using mind mapping techniques. Interviews were digitally recorded, transcribed verbatim and subjected to qualitative content analysis.
Results: A gender-specific approach profile was observed. Men were more likely to favor competitive and exercise-oriented activities, and they associated healthy aging with mobility and physical activity. Women, on the other hand, displayed a broader understanding of healthy aging that included not only physical activity but also a healthy diet, relaxation/wellness, memory training and independent living; they preferred holistic and socially oriented services that were not performance-oriented. The “older seniors” (76+) were ambivalent towards certain wordings referring to aging.
Conclusions: Our results suggest that gender-specific needs must be considered in order to motivate older adults to participate in preventive services. Age-specific characteristics seem to be less relevant. It is more important to pay attention to factors that vary according to the individual state of health and life situation of the potential participants.
Background
The eResearch system “Prospective Monitoring and Management App (PIA)” allows researchers to implement questionnaires on any topic and to manage biosamples. Currently, we use PIA in the longitudinal study ZIFCO (Integrated DZIF Infection Cohort within the German National Cohort) in Hannover, Germany, to investigate, for example, associations between risk factors and infectious diseases. Our aim was to assess user acceptance and compliance in order to determine the suitability of PIA for epidemiological research on transient infectious diseases.
Methods
ZIFCO participants used PIA to answer weekly questionnaires on health status and report spontaneous onset of symptoms. In case of symptoms of a respiratory infection, the app requested participants to self-sample a nasal swab for viral analysis. To assess user acceptance, we implemented the System Usability Scale (SUS) and fitted a linear regression model on the resulting score. For investigation of compliance with submitting the weekly health questionnaires, we used a logistic regression model with binomial response.
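A logistic regression with binomial response, as described for the compliance analysis above, can be sketched on grouped data (weekly questionnaires answered out of weeks enrolled). This is an illustrative reconstruction on invented toy data with a single covariate; it is not the study's actual model, data or covariate set.

```python
import math

def fit_binomial_logit(x, successes, trials, iters=25):
    """Fit log(p/(1-p)) = b0 + b1*x to grouped binomial data
    by Newton-Raphson (equivalent to IRLS for this model)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi, ni in zip(x, successes, trials):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            w = ni * p * (1.0 - p)   # weight (Hessian contribution)
            r = yi - ni * p          # score contribution
            g0 += r
            g1 += r * xi
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Toy data: centered age vs. weekly questionnaires answered out of 50 weeks.
age_centered = [-10, -5, 0, 5, 10]
answered = [21, 24, 27, 31, 33]
weeks = [50, 50, 50, 50, 50]
b0, b1 = fit_binomial_logit(age_centered, answered, weeks)
# In this toy set b1 > 0, i.e. higher (centered) age raises the odds of responding.
```

The fitted coefficients are log-odds; exponentiating `b1` gives the odds ratio per unit of the covariate, which is how effects such as "decreased the odds of responding" are read off such a model.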
Results
We analyzed data of 313 participants (median age 52.5 years, 52.4% women). An average SUS of 72.0 indicates good acceptance of PIA. Participants with a higher technology readiness score at the beginning of study participation also reported higher user acceptance. Overall compliance with submitting the weekly health questionnaires showed a median of 55.7%. Being female, being of younger age and being enrolled for a longer time decreased the odds of responding. However, women over 60 had a higher chance of responding than women under 60, while men under 40 had the highest chance of responding. Compliance with nasal swab self-sampling was 77.2%.
Discussion
Our findings show that PIA is suitable for use in epidemiological studies with regular short questionnaires. Still, we will focus on user engagement and gamification in the further development of PIA to help incentivize regular and long-term participation.
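The SUS score of 72.0 reported above follows the standard System Usability Scale scoring rule: ten Likert items coded 1–5, where odd-numbered (positively worded) items contribute response − 1, even-numbered items contribute 5 − response, and the sum is multiplied by 2.5. A minimal implementation:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten Likert responses
    coded 1 (strongly disagree) to 5 (strongly agree). Odd-numbered
    items are positively worded, even-numbered items negatively."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

print(sus_score([3] * 10))  # all-neutral answers give 50.0
```

A score above roughly 68 is conventionally read as above-average usability, which matches the interpretation of 72.0 as good acceptance.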
Background:
Huntington’s disease (HD) is a rare, genetic, neurodegenerative and ultimately fatal disease with no cure or progression-delaying treatment currently available. HD is characterized by a triad of cognitive, behavioural and motor symptoms. Evidence on epidemiology and management of HD is limited, especially for Germany. This study aims to estimate the incidence and prevalence of HD and analyze the current routine care based on German claims data.
Methods:
The source of data was a sample of the Institute for Applied Health Research Berlin (InGef) Research Database, comprising data of approximately four million insured persons from approximately 70 German statutory health insurances. The study was conducted in a retrospective cross-sectional design using 2015 and 2016 as a two-year observation period. At least two outpatient or inpatient ICD-10 codes for HD (ICD-10: G10) during the study period were required for case identification. Patients were considered incident if no HD diagnoses were documented in the 4 years prior to the year of case identification. Information on outpatient drug dispensations, medical aids and remedies was considered to describe the current treatment situation of HD patients.
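The case-identification rule described above (at least two G10 codes during 2015–2016; incident only if no G10 code appears in the four years before the identification year) can be sketched as follows. The flat record layout is an assumption for illustration, not the InGef database's actual schema.

```python
from collections import defaultdict

OBS_YEARS = {2015, 2016}  # two-year observation period
LOOKBACK = 4              # diagnosis-free years required for incidence

def classify_hd_cases(claims):
    """claims: iterable of (person_id, year, icd10_code) tuples.
    Returns (prevalent_ids, incident_ids) following the rule above."""
    years_with_hd = defaultdict(list)
    for pid, year, code in claims:
        if code.startswith("G10"):
            years_with_hd[pid].append(year)
    prevalent, incident = set(), set()
    for pid, years in years_with_hd.items():
        obs_codes = sorted(y for y in years if y in OBS_YEARS)
        if len(obs_codes) < 2:
            continue  # need at least two HD codes in the study period
        prevalent.add(pid)
        first = obs_codes[0]  # year of case identification
        if not any(first - LOOKBACK <= y < first for y in years):
            incident.add(pid)
    return prevalent, incident
```

In a real claims analysis the inpatient/outpatient setting and diagnosis certainty flags would also be checked; they are omitted here for brevity.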
Results:
A 2-year incidence of 1.8 per 100,000 persons (95% confidence interval (CI): 1.4–2.4) and a 2-year period prevalence of 9.3 per 100,000 persons (95% CI: 8.3–10.4) were observed. The prevalence of HD increased with advancing age, peaking at 60–69 years (16.8 per 100,000 persons; 95% CI: 13.4–21.0) and decreasing thereafter.
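Rates of this kind are point estimates with binomial confidence intervals. The abstract does not state which interval method was used, so the sketch below uses a Wilson score interval as one common choice; with hypothetical counts matching the reported prevalence, its endpoints come out close to, but not exactly, the published ones.

```python
import math

def rate_per_100k_with_ci(cases, population, z=1.96):
    """Point estimate and Wilson score interval, scaled per 100,000."""
    p = cases / population
    z2n = z * z / population
    denom = 1.0 + z2n
    center = (p + z2n / 2.0) / denom
    half = (z / denom) * math.sqrt(p * (1.0 - p) / population
                                   + z * z / (4.0 * population ** 2))
    return ((center - half) * 1e5, p * 1e5, (center + half) * 1e5)

# Hypothetical counts reproducing the reported prevalence of 9.3/100,000:
lo, rate, hi = rate_per_100k_with_ci(372, 4_000_000)
```

Exact (Clopper–Pearson) or Poisson intervals are also common for rare outcomes and would give slightly different, typically wider, bounds.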
The most frequently observed comorbidities and disease-associated symptoms in HD patients were depression (42.9%), dementia (37.7%), urinary incontinence (32.5%), extrapyramidal and movement disorders (30.5%), dysphagia (28.6%) and disorders of the lipoprotein metabolism (28.2%).
The most common medications in HD patients were antipsychotics (66.9%), followed by antidepressants (45.1%). Anticonvulsants (16.6%), opioids (14.6%) and hypnotics (9.7%) were observed less frequently.
Physical therapy was the most frequently used medical aid in HD patients (46.4%). Nursing services and speech therapy were used by 27.9% and 22.7% of HD patients, respectively, whereas use of psychotherapy was rare (3.2%).
Conclusions:
Based on a representative sample, this study provides new insights into the epidemiology and routine care of HD patients in Germany, and thus, may serve as a starting point for further research.
Background:
Multiple Sclerosis (MS) is a chronic inflammatory, immune-mediated disease of the central nervous system, with Relapsing Remitting MS (RRMS) being the most common type. In recent years, the status of high disease activity (HDA) has become increasingly important for clinical decisions. Nevertheless, little is known about the incidence, characteristics and current treatment of patients with RRMS and HDA in Germany. Therefore, this study aims to estimate the incidence of HDA in a German RRMS patient population, to characterize this population, and to describe current drug treatment routines and further healthcare utilization of these patients.
Methods:
A claims data analysis was conducted using a sample of the InGef Research Database, which comprises data of approximately four million insured persons from around 70 German statutory health insurances (SHI). The study was conducted in a retrospective cohort design covering the years 2012–2016. The RRMS population was identified based on the ICD-10 code G35.1 (ICD-10-GM). For the identification of HDA, criteria from other studies as well as expert opinions were used. Information on the incidence, characteristics and current treatment of patients with RRMS and HDA was considered.
Results:
The overall HDA incidence within the RRMS population was 8.5% for 2016. It was highest for the age group of 0–19 years (29.4% women, 33.3% men) and lowest for the age group of ≥ 50 years (4.3% women, 5.6% men). Mean age of patients with RRMS and incident HDA was 38.4 years (SD: 11.8) and women accounted for 67.8%.
Analyses of drug utilization showed that 82.4% of patients received at least one disease-modifying drug (DMD) in 2016, and 49.8% received drugs for relapse therapy. In 2016, 55% of RRMS patients with HDA had at least one hospitalization, with a mean length of stay of 13.9 days (SD: 18.3 days). The average number of outpatient physician contacts was 28.1 (SD: 14.0).
Conclusions:
This study, based on representative Germany-wide claims data from the SHI, showed a high incidence of HDA, especially within the young RRMS population. Future research should consider HDA as an important criterion for the quality of care for MS patients.
Introduction: Renal cell carcinoma (RCC), an immunogenic tumor, is the most common form of kidney cancer worldwide. Immune checkpoint inhibitors (ICIs) play an important role in the treatment of metastatic RCC. Programmed death-ligand 1 (PD-L1) expression has already been proposed as a possible prognosticator of ICI effectiveness. To elucidate the feasible role of ICIs in neoadjuvant settings, we assessed the most common PD-L1 expression modalities [tumor proportion score (TPS), combined positivity score (CPS) and inflammatory cell (IC) score] in primary tumors (PTs) and venous tumor thrombi (VTT) in first-diagnosed, previously untreated RCC patients with accompanying VTT.
Methods: Between January 1999 and December 2016, 71 patients with a first-diagnosed, untreated, locally advanced RCC (aRCC) (≥ pT3a) underwent surgery at Hannover Medical School (MHH). PD-L1 expression was examined separately in PTs and VTT using the CPS, IC score and TPS. We also considered age at the time of the initial surgery and gender as probable influencing factors. Using a cutoff value of 1 (1%), PD-L1 expression levels in PTs and VTT were assessed to enable the determination of any frequency differences.
Results: Positive scores for PTs were shown by 54 (CPS), 53 (IC score) and 34 (TPS) patients, whereas in VTT, positive scores were found for 50 (CPS), 47 (IC score) and 36 (TPS) patients. No statistically significant differences were obtained between the PD-L1 expression immunoscores for PTs and VTT. Neither age at the time of the initial surgery nor gender could be statistically proven to influence the differences in PD-L1 expression between the VTT and PTs.
Conclusion: To the best of our knowledge, with 71 cases this is the largest study to investigate PD-L1 expression in PTs and VTT. It could be relevant for the future development of neoadjuvant immunotherapy options, particularly in aRCC with VTT.
Editorial for the 17th European Networked Knowledge Organization Systems Workshop (NKOS 2017)
(2017)
Knowledge Organization Systems (KOS), in the form of classification systems, thesauri, lexical databases, ontologies, and taxonomies, play a crucial role in digital information management and applications generally. Carrying semantics in a well-controlled and documented way, Knowledge Organization Systems serve a variety of important functions: tools for representation and indexing of information and documents, knowledge-based support to information searchers, semantic road maps to domains and disciplines, communication tools by providing a conceptual framework, and a conceptual basis for knowledge-based systems, e.g. automated classification systems. New networked KOS (NKOS) services and applications are emerging, and we have reached a stage where many KOS standards exist and the integration of linked services is no longer just a future scenario. This editorial describes the workshop outline and overview of presented papers at the 17th European Networked Knowledge Organization Systems Workshop (NKOS 2017), which was held during the TPDL 2017 Conference in Thessaloniki, Greece.
Editorial for the 15th European Networked Knowledge Organization Systems Workshop (NKOS 2016)
(2016)
Knowledge Organization Systems (KOS), in the form of classification systems, thesauri, lexical databases, ontologies, and taxonomies, play a crucial role in digital information management and applications generally. Carrying semantics in a well-controlled and documented way, Knowledge Organisation Systems serve a variety of important functions: tools for representation and indexing of information and documents, knowledge-based support to information searchers, semantic road maps to domains and disciplines, communication tools by providing a conceptual framework, and a conceptual basis for knowledge-based systems, e.g. automated classification systems. New networked KOS (NKOS) services and applications are emerging, and we have reached a stage where many KOS standards exist and the integration of linked services is no longer just a future scenario. This editorial describes the workshop outline and overview of presented papers at the 15th European Networked Knowledge Organization Systems Workshop (NKOS 2016) in Hannover, Germany.
Fall events and their severe consequences represent not only a threatening problem for the affected individual, but also cause a significant burden for health care systems. Our research work aims to elucidate some of the prospects and problems of current sensor-based fall risk assessment approaches. Selected results of a questionnaire-based survey given to experts during topical workshops at international conferences are presented. The majority of domain experts confirmed that fall risk assessment could potentially be valuable for the community and that prediction is deemed possible, though limited. We conclude with a discussion of practical issues concerning adequate outcome parameters for clinical studies and data sharing within the research community. All participants agreed that sensor-based fall risk assessment is a promising and valuable approach, but that more prospective clinical studies with clearly defined outcome measures are necessary.
Background: Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data.
Methods: In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients’ fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models, one based on conventional clinical and assessment data (CONV) and the other on sensor data (SENSOR), are compared.
Results: Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores.
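The performance figures above all derive from a 2×2 confusion matrix; in particular, the positive likelihood ratio is sensitivity/(1 − specificity). A small helper illustrates the arithmetic; the counts in the example are hypothetical, chosen only to reproduce the CONV model's rounded rates.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV and positive likelihood
    ratio (+LR) from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    plr = sens / (1.0 - spec)  # +LR
    return sens, spec, ppv, npv, plr

# Hypothetical counts with sensitivity 68% and specificity 74%:
sens, spec, ppv, npv, plr = diagnostic_metrics(tp=68, fp=26, fn=32, tn=74)
# plr is about 2.6, in line with the +LR of 2.64 reported above
# (the small difference comes from rounding of the published rates).
```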
Conclusions: Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model’s performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach.
Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups
(2012)
Background: Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients’ assessment data and to evaluate their predictive performance (aim#1), and to identify high-risk subgroups from the data (aim#2).
Methods: A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital’s database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances.
Results: The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity.
Conclusions: Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack diagnostic precision. High-risk subgroups may be identified automatically from existing geriatric assessment data, especially when combined with domain knowledge in a hybrid classification model. Further work is necessary to validate our approach in a controlled prospective setting.
Wearable sensors in healthcare and sensor-enhanced health information systems: all our tomorrows?
(2012)
Wearable sensor systems which allow for remote or self-monitoring of health-related parameters are regarded as one means to alleviate the consequences of demographic change. This paper aims to summarize current research in wearable sensors as well as in sensor-enhanced health information systems. Wearable sensor technologies are already advanced in terms of their technical capabilities and are frequently used for cardio-vascular monitoring. Epidemiologic predictions suggest that neuro-psychiatric diseases will have a growing impact on our health systems and thus should be addressed more intensively. Two current project examples demonstrate the benefit of wearable sensor technologies: long-term, objective measurement under daily-life, unsupervised conditions. Finally, up-to-date approaches for the implementation of sensor-enhanced health information systems are outlined. Wearable sensors are an integral part of future pervasive, ubiquitous and person-centered health care delivery. Future challenges include their integration into sensor-enhanced health information systems and sound evaluation studies involving measures of workload reduction and costs.
Background:
Hereditary angioedema (HAE) is a rare genetic disease characterized by clinical features such as paroxysmal, recurrent angioedema of the skin, the gastrointestinal tract and the upper airways. Swelling of the skin occurs primarily in the face, extremities and genitals. Gastrointestinal attacks are accompanied by painful abdominal cramps, vomiting and diarrhea. Due to the low prevalence and the fact that HAE patients often present with rather unspecific symptoms such as abdominal cramps, the final diagnosis is often made only after a long delay. The aim of this Germany-wide survey was to characterize the period between the occurrence of first symptoms and the final diagnosis with regard to self-perceived health, symptom burden and false diagnoses in patients with HAE.
Results:
Overall, 81 patients with HAE were included and participated in the telephone-based survey. Of these, the majority rated their current health status as “good” (47.5%) or “very good” (13.8%), a clear improvement compared with the year before the final diagnosis (“good”: 16.3%; “very good”: 11.3%). Edema of the extremities (85.2%) and of the gastrointestinal tract (81.5%) were the most frequently reported current symptoms and occurred earlier than other reported symptoms (mean age at onset 18.1 and 17.8 years, respectively). Misdiagnoses were reported by 50.6% of participating HAE patients, with appendicitis and allergy being the most frequent (40.0% and 30.0% of those with a misdiagnosis, respectively). Patients with a misdiagnosis often received mistreatment (80.0%), with pharmaceuticals and surgical interventions being the most frequently carried out mistreatments (65.6% and 56.3% of those with mistreatment, respectively). The mean observed diagnostic delay was 18.1 years (median 15.0 years) and was higher in older patients and index patients.
Conclusions:
This study showed that patients’ self-perceived health status is much better once the final correct diagnosis has been made and specific treatment is available. A further challenge will be to increase awareness of HAE, especially in the settings that patients typically approach at the occurrence of first symptoms, to ensure early referral to specialists and thereby increase the likelihood of an early diagnosis.
The number of papers published each year has been increasing for decades. Libraries need to make these resources accessible and available, with classification being an important part of this process. This paper analyzes the prerequisites and possibilities of automatic classification of medical literature. We explain the selection, preprocessing and analysis of data consisting of catalogue datasets from the library of the Hanover Medical School, Lower Saxony, Germany. In the present study, 19,348 documents, represented by notations of library classification systems such as the Dewey Decimal Classification (DDC), were classified into 514 different classes from the National Library of Medicine (NLM) classification system. The algorithm used was k-nearest-neighbours (kNN). A correct classification rate of 55.7% could be achieved. To the best of our knowledge, this is not only the first research conducted towards the use of the NLM classification in automatic classification but also the first approach that exclusively considers already assigned notations from other classification systems for this purpose.
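The kNN approach described above can be illustrated on toy data: each document is represented by the set of notations already assigned from other classification systems, and the NLM class is predicted by majority vote among the k most similar training documents. The notations, classes and Jaccard similarity below are illustrative assumptions; the study's exact document representation and distance measure are not specified here.

```python
from collections import Counter

def jaccard(a, b):
    """Set similarity between two notation sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def knn_predict(train, notations, k=3):
    """train: list of (notation_set, nlm_class) pairs. Returns the
    majority NLM class among the k most similar training documents."""
    ranked = sorted(train, key=lambda doc: jaccard(doc[0], notations),
                    reverse=True)
    votes = Counter(cls for _, cls in ranked[:k])
    return votes.most_common(1)[0][0]

# Invented toy catalogue: DDC notations mapped to NLM classes.
train = [
    ({"DDC 616.8"}, "WL"),                # neurology
    ({"DDC 616.8", "DDC 612.8"}, "WL"),
    ({"DDC 617"}, "WO"),                  # surgery
    ({"DDC 617", "DDC 617.9"}, "WO"),
]
print(knn_predict(train, {"DDC 616.8"}))  # prints WL for this toy set
```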
Immunization is the most cost-effective intervention for infectious diseases, which are the major cause of morbidity and mortality worldwide. Vaccines not only protect the individual who is vaccinated but also reduce the burden of infectious vaccine-preventable diseases for the entire community.1 Adult vaccination is very important given that >25% of mortality is due to infectious diseases.2 There is a scarcity of information on the vaccination status of young adults and the role of socioeconomic conditions in India.
The World Health Organization defines musculoskeletal disorder (MSD) as “a disorder of muscles, tendons, peripheral vascular system not directly resulting from an acute or instantaneous event”.1 Work-related MSDs are one of the most important occupational hazards.1 Among many other occupations, dentistry is a highly demanding profession that requires good visual acuity, hearing, depth perception, psychomotor skills, manual dexterity, and the ability to maintain occupational postures over long periods.
Nanotechnology is emerging as one of the key technologies of the 21st century and is expected to enable developments across a wide range of sectors that can benefit citizens. Nanomedicine is an application of nanotechnology in the areas of healthcare, disease diagnosis, and the treatment and prevention of disease. Nanomedicines pose problems of nanotoxicity related to factors such as size, shape, specific surface area, surface morphology and crystallinity. Currently, nanomedicines are regulated as medicinal products or as medical devices, and there is no specific regulatory framework for nanotechnology-based products in either the EU or the USA. This review presents a scheme for classification and the regulatory approval process for nanotechnology-based medicines.
Medical devices are health care products distinguished from drugs for regulatory purposes in most countries based on their mechanism of action. Unlike drugs, medical devices operate via physical or mechanical means and do not depend on metabolism to accomplish their primary intended effect. Developing new medical devices requires clinical investigations, and the approval process is similar to that for drugs. Medical device approvals in the period 2010 to 2014 were searched on the US FDA website, and disease burden data for the same period on the Centers for Disease Control and Prevention website. The collected data were analyzed to determine the number of approved devices, the top therapy areas and the mechanisms of action of these devices. Out of a total of 200 medical device approvals in the period 2010 to 2014, the maximum number of devices (51; 25.5%) was approved in 2011, and cardiovascular (78; 39%) was the top therapy area. The highest number (180; 90%) of approved medical devices belonged to category III, and the maximum number (73; 36.5%) had a “mechanical” mechanism of action. The top three causes of death in the USA during 2010 to 2014 were heart disease and cancer, followed by respiratory infection. There was a match between the top diseases and the medical device approvals for the top two diseases in the USA, i.e. heart disease and cancer. For respiratory infections and ailments, the third leading cause of death, only one device was approved out of the 200 approvals in total.
Background: Antimicrobial resistance has become a serious global problem. A potential post-antibiotic era is threatening present and future medical advances. In Pakistan, antibiotic usage is unnecessarily high, and owing to overexposure to these drugs, bacteria are developing resistance against them. It is necessary to improve public awareness about the rational use of antibiotics in order to change consumers’ behaviour. Therefore, the present study was undertaken to assess the existing knowledge, attitudes and practices related to antibiotic usage among university students.
Methods: A cross-sectional study was carried out among university students from Karachi, Pakistan, during May–June 2018. 200 students were approached to participate in the study, of whom 159 agreed to participate (males: 70, females: 89). A pretested questionnaire was distributed to the study subjects, and the collected data were analyzed using IBM SPSS version 23.
Results: A substantial number of participants (33% and 50%, respectively) were unaware of the difference between antibiotics and anti-inflammatory drugs and between antibiotics and antipyretics. 29% of the participants thought it is right to stop antibiotics based only on symptomatic improvement. 39% and 83% of participants believed that antibiotics should always be prescribed to treat flu-like symptoms and pneumonia, respectively.
Conclusions: Participants demonstrated average knowledge about antibiotics. Similarly, their attitudes and practices toward antibiotic use were associated with misconceptions. An educational intervention is necessary to make them aware of the rational use of antibiotics.
Background: Oral cancers (OC) are malignant lesions occurring in the oral cavity that include squamous cell carcinomas (SCC), salivary gland and odontogenic neoplasms. Although it is the eighth most common malignancy globally, in Pakistan it is the second most common type of cancer. Lack of awareness of the ill effects of preventable risk factors of oral cancer increases the burden of disease due to the associated high cost of treatment, permanent impairment and high mortality. Hence, awareness can be very helpful in the prevention, control and early diagnosis of oral cancer.
Methods: A cross-sectional study was carried out among university students from Karachi, Pakistan, during April to May 2018. Three hundred students were approached to participate in the study, of whom 277 agreed to participate. A pretested questionnaire was distributed, and the collected data were analysed using IBM SPSS version 23.
Results: There were 125 (45%) males and 152 (55%) females in the study, and the response rate was 94%. Sixty-one percent (154/250) of respondents correctly identified smoking and tobacco chewing as possible causes of oral cancer. Almost three quarters (74%; 184/250) of respondents correctly stated that oral cancer does not spread from person to person through touch or speaking. Sixty-six percent (164/250) of respondents believed that oral cancer is curable. The mean knowledge score was higher in females (61%) than in males (53%). A significantly higher number of female than male participants answered correctly the questions regarding the cause of oral cancer, the spread of the disease and the occurrence of oral cancer in AIDS patients.
Conclusions: Participants showed poor knowledge about oral cancer. Female participants showed better knowledge than their male counterparts. Details about oral cancer should be incorporated into the university curriculum, and periodic awareness programs should be organized for students.
Background: Diabetes is fast gaining the status of a potential epidemic in India, with >62 million individuals currently diagnosed with the disease. India currently faces an uncertain future in relation to the potential burden that diabetes may impose on the country. An estimated US$ 2.2 billion would be needed to sufficiently treat all cases of type 2 diabetes mellitus (T2DM) in India. Many interventions can reduce the burden of this disease. However, health care resources are limited; thus, interventions for diabetes treatment should be prioritized. The present study assesses the cost-effectiveness of antidiabetic drugs in patients with T2DM from Mumbai, India.
Methods: A prospective cross-sectional study was performed to assess the cost-effectiveness of antidiabetic drugs in patients with T2DM. Face-to-face interviews were conducted by using a validated questionnaire in a total of 152 (76 males, 76 females) patients with T2DM from F-North Ward, Mumbai, India. Cost-effectiveness was determined on the basis of cost of antidiabetic drug/s, efficacy, adverse drug reactions, safety of administration, frequency of administration, and bioavailability.
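Cost per unit of effectiveness is an average cost-effectiveness ratio: drug cost divided by a composite effectiveness score. The sketch below uses invented drug names, costs and scores purely to illustrate the ranking; the study's actual composite combined efficacy, adverse drug reactions, safety of administration, dosing frequency and bioavailability.

```python
def cost_per_effect(cost, effectiveness):
    """Average cost-effectiveness ratio: cost per unit of a
    composite effectiveness score (lower is better)."""
    return cost / effectiveness

# Invented example data: drug -> (cost in rupees, composite score).
drugs = {"DrugA": (33.0, 5.0), "DrugB": (16.2, 2.0), "DrugC": (24.5, 1.0)}
ranking = sorted(drugs, key=lambda d: cost_per_effect(*drugs[d]))
# ranking orders DrugA (6.6), then DrugB (8.1), then DrugC (24.5) per unit.
```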
Results: For the treatment of T2DM in non-obese participants, Glimepiride + Pioglitazone cost the least (₹3.7) per unit of effectiveness, followed by Glimepiride (₹6.6), Gliclazide (₹8.1), Repaglinide (₹24.5) and Vildagliptin (₹45.2). For the treatment of T2DM in obese participants, Metformin cost the least (₹6.7) per unit of effectiveness, followed by Glimepiride + Metformin (₹5.9) and Repaglinide (₹24.5).
Conclusions: In the case of non-obese participants, cost-effectiveness and the prescribed treatments did not match, while for obese participants the prescribed treatments were in line with cost-effectiveness.
Background: Pharmacovigilance (PV), also known as drug safety surveillance, is the science of enhancing patient care and patient safety in the use of medicines by collecting, monitoring, assessing, and evaluating information from healthcare providers and patients. Pharmacists are pivotal players in adverse drug event (ADE) monitoring and reporting. However, most pharmacists are unaware of or not knowledgeable about the guidelines used by their respective countries' drug regulatory bodies. There is therefore an urgent need to train pharmacy students in the concept of pharmacovigilance.
Methods: A cross-sectional study was carried out among pharmacy students from Mumbai University, India during May-June 2017. Four hundred students were approached to participate in the study; on the basis of the eligibility criterion, 352 were included (males: 179; females: 173). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23.
Results: Overall pharmacovigilance knowledge (44%) and perception (58%) were low among the participants of the present study. Seventy-four percent of the participants felt that adverse drug reaction (ADR) reporting should be made compulsory for healthcare professionals, and only 21% agreed that the topic of pharmacovigilance is well covered in the pharmacy curriculum.
Conclusions: The Pharmacy Council of India, pharmacy teachers' associations, and individual pharmacy colleges should take the necessary steps to increase knowledge and create awareness of pharmacovigilance and adverse drug reaction reporting among pharmacy students.
Background: Self-medication, practiced globally, is an important public health problem. Research studies have indicated that inappropriate self-medication results in adverse drug reactions, disease masking, antibiotic resistance, and wastage of healthcare resources. The objectives of the study were to explore the prevalence of overall self-medication and antibiotic self-medication among university students in Karachi, Pakistan, along with the probable reasons, indications, and sources of advice for self-medication. Methods: A descriptive, cross-sectional, questionnaire-based study was carried out among students from the University of Karachi, Pakistan during September to November 2016. A pretested questionnaire was distributed to 320 students, and the collected data were analyzed using IBM SPSS version 24. Results: Of the 320 students, 311 (83 male and 228 female) participated in the study, giving a response rate of 97%. The prevalence of self-medication was 66%. Belonging to a higher monthly family income group was associated with a higher likelihood of self-medication. Antibiotic self-medication prevalence was 39%. Lack of time (39%) and an old prescription (35%) were the main reasons for self-medication. Pharmacy shops (75%) were the main source of self-medication. In the case of antibiotics, 44% of students changed the dosage of the antibiotic, and 50% stopped antibiotics after the disappearance of symptoms. Conclusions: Antibiotic self-medication (39%) and self-medication with other drugs among university students in Karachi is a worrisome problem. Our findings highlight the need for interventions to promote the judicious use of general medicines as well as antibiotics.
Background: The discovery of antibiotics has helped to manage devastating diseases. Presently, the antibiotic era is threatened by the emergence of high levels of antibiotic resistance in important pathogens. Misuse of antibiotics poses a serious risk to infectious disease control, and it is necessary to improve public awareness in order to change consumer behavior. Therefore, the present study was undertaken to assess the existing knowledge, attitudes, and practices related to antibiotic usage among university students.
Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Three hundred students were approached to participate in the study, of which 250 agreed to participate (males: 117; females: 133). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23.
Results: A substantial number of participants were unaware of the difference between antibiotics and anti-inflammatory drugs (33%) and between antibiotics and antipyretics (40%). Twenty-eight percent of the participants thought it right to stop antibiotics based on symptom improvement alone. Sixty-eight percent and seventy-nine percent of participants believed that antibiotics should always be prescribed to treat flu-like symptoms and pneumonia, respectively.
Conclusions: Participants demonstrated poor knowledge about antibiotics, and their attitudes and practices toward antibiotic use were marked by misconceptions. An educational intervention could be introduced to raise awareness of rational antibiotic practices.
Knowledge and attitude towards voluntary blood donation among students from Mumbai University
(2018)
Background: Blood is scarce; its demand far outweighs the supply. In addition to limited supply, safety, especially with regard to the risk of transfusion-transmissible infections, is of utmost concern, particularly in developing countries. Blood transfusion services in India have gained special significance in recent years and form a vital part of the national health care system. Voluntary Non-Remunerated Blood Donation (VNRBD) is the safest of all types of blood donation. One of the potential sources that can be tapped for blood donation is the young and physically fit student population of educational institutions across India. Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Two hundred and fifty students were approached to participate in the study, of which 201 agreed to participate (males: 104; females: 97). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23. Results: A high proportion of participants agreed that the general public should be encouraged to donate blood voluntarily (96%; 193/201) and that awareness of voluntary blood donation among the general public is lacking (82%; 164/201). However, not a single participant answered the knowledge part of the questionnaire with 100% accuracy. Almost all participants had correct knowledge about blood groups (98%; 196/201) and the need for blood matching (97%; 195/201). Conclusions: Participants showed a good attitude but demonstrated poor knowledge about voluntary blood donation. Details about blood donation should be incorporated into the undergraduate curriculum, and periodic awareness programs should be organized for students.
Background: Human papillomavirus (HPV) is a common sexually transmitted infection (STI) that may cause cervical cancer and other malignancies, including those of the vulva, anus, vagina, penis, and head and neck. In most Asian countries, including India, cervical cancer is the second most common cancer in women. Awareness of HPV and cervical cancer and the use of vaccines can be very helpful in the prevention, control, and early diagnosis of cervical cancer. Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Two hundred students were approached to participate in the study, of which 142 were selected to participate (males: 54; females: 88). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23. Results: Participants had fair knowledge (61% average) about HPV, whereas knowledge about the symptoms, prevention, and spread of HPV was very poor (18%). Knowledge about the HPV vaccine was 50%, and 78% of participants had a positive attitude toward the HPV vaccine. Conclusions: This study revealed lacunae in the pharmacy curriculum and an urgent need to create awareness of HPV among Bachelor of Pharmacy students from Mumbai University.
A systematic review of the literature on survey questionnaires to assess self-medication practices
(2017)
Self-medication is of great public health importance, as it often bypasses the regulatory mechanisms that assure quality of health care. Nevertheless, there are no established standards for assessing self-medication. We therefore set out to systematically retrieve questionnaires and survey tools used to capture self-medication, with the aim of identifying the scope of information investigated in this context and the commonalities between the tools. We conducted a systematic review of the literature on questionnaires used for self-medication assessment by searching the PubMed and Web of Science databases using combinations of the following keywords: self-medication, self-prescription, non-prescription, questionnaire. Truncation was used to ensure retrieval of all possible variations of the search terms. The search was limited to articles published between 1 January 2000 and 31 December 2015, human studies, and the English language. Duplicate and irrelevant studies were excluded from the final review. A total of 158 studies were included in the review. Studies came from diverse geographical locations; most were from Nigeria (16; 10.1%), followed by India (10; 6.3%) and Iran (8; 5%). Forty-three studies (27.2%) focused on antibiotic self-medication. The majority of studies (106; 67%) were done with adult populations. The components addressed by the questionnaires covered: reasons for self-medication in 147 (93%) studies, purchasing source in 136 (86%) studies, medical conditions to be treated in 153 (96.8%) studies, adverse events in 67 (42.4%) studies, use of prescribing information in 24 (15.1%) studies, and antibiotic resistance awareness in 20 (46.5%) of the antibiotic studies. In 74 (46.8%) studies, the survey questionnaires were self-administered, and most studies (57; 36%) were done at the homes of respondents. Thirty-seven (23.4%) studies did not report any recall period for self-medication practices.
Study response rates varied from 17.9% to 100%; validity of the study questionnaire was reported for 100 (63.3%) studies, whereas only 15 (9.5%) reported a reliability test of the questionnaire. The large variety of questionnaires used to investigate self-medication practices makes comparability and meta-analyses very difficult. A basic set of standardized survey questions on this topic should be made available to future research groups in this field.
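The keyword strategy described above, combinations of the listed search terms with truncation, can be pictured as assembling a boolean query string. The grouping and the placement of the `*` wildcard in this sketch are illustrative assumptions, not the authors' verbatim search string:

```python
# Assemble a boolean search string from the review's keywords.
# Grouping and '*' truncation placement are illustrative assumptions.
terms = ["self-medication", "self-prescription", "non-prescription"]
term_block = " OR ".join(f'"{t}"' for t in terms)
# Truncation ('questionnaire*') retrieves variants such as
# "questionnaires" or "questionnaire-based".
query = f"({term_block}) AND questionnaire*"
```

Such a string could then be pasted into a database search box or sent via a search API, with date and language limits applied through the database's own filters.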
Background: Concerns about the practice of self-medication (SM) across the world are based on associated risks such as adverse reactions, disease masking, increased morbidity, wastage of resources, and antibiotic resistance. SM is likely to differ between rural and urban areas of India, and systematically retrieved evidence on these differences is required in order to design targeted measures for improvement. Methods: We conducted a cross-sectional study among the general population in urban (Matunga) and rural (Tala) areas of Maharashtra, India to explore SM practices and their associated factors. Face-to-face interviews were conducted using the validated study questionnaire. Data were analyzed using descriptive and analytical statistical methods. Results: A total of 1523 inhabitants from 462 households were interviewed between June 2015 and August 2015, 778 (51%) of them in rural and 745 (49%) in urban areas. Overall self-medication prevalence was 29.1% (urban: 51.5%, rural: 7.7%; OR 12.7, CI 9.4-17.2). Participants with a chronic disease (OR: 3.15, CI: 2.07-4.79) and those from urban areas (OR: 15.38, CI: 8.49-27.85) were more likely to self-medicate. Self-medication practices were characterized by having an old prescription (41.6%) as the main reason, fever (39.4%) as the top indication, and NSAIDs (non-steroidal anti-inflammatory drugs) as the most self-medicated category of drugs (40.7%). Conclusions: The present study documented that the prevalence of self-medication is associated with the place of residence and health status of the participants. Self-medication remains a major issue in western Maharashtra, India, and is predominantly an urban phenomenon. The status of implementation of existing regulations should be reconsidered.
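The odds ratio and confidence interval reported above can be reproduced approximately from a 2x2 table. In this sketch the cell counts are reconstructed from the reported percentages (51.5% of 745 urban and 7.7% of 778 rural participants self-medicating) rather than taken from the raw data, and the Woolf (log) method is assumed for the confidence interval:

```python
import math

# Odds ratio with a Woolf 95% confidence interval from a 2x2 table.
# Counts are reconstructed approximately from the reported percentages
# (urban: 51.5% of 745; rural: 7.7% of 778), not the exact raw data.
def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """a, b: exposed with/without outcome; c, d: unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Urban self-medicating vs. not, then rural self-medicating vs. not.
or_urban, ci_lower, ci_upper = odds_ratio_ci(384, 361, 60, 718)
```

With these reconstructed counts the result lands close to the reported OR of 12.7 (CI 9.4-17.2), which suggests the abstract's figure follows this standard computation.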
Background: Immunization is the most cost-effective intervention against infectious diseases, which are a major cause of morbidity and mortality worldwide. There is a scarcity of information on the vaccination status of young adults and the role of socioeconomic conditions in India. Objectives: The present study explored adult vaccination status and the influence of parental income and education on it among university students from Mumbai, India.
Methods: On the basis of the eligibility criterion, 149 students were selected for the present study. A total of 8 vaccines, namely Tdap/DTP, varicella, MMR, influenza, pneumococcal, hepatitis A, hepatitis B, and meningococcal, were included in this study for all respondents. In addition, the human papillomavirus (HPV) vaccine was included for female respondents.
Results: There were a total of 149 respondents (75 males and 74 females) with a mean age of 21.5 years. The top 3 immunizations among respondents were Td/Tdap (97.3%), MMR (66.4%), and hepatitis B (55%). Only 4 (5.5%) female respondents had been immunized against HPV. Conclusions: Td/Tdap (97.3%) and MMR (66.4%) coverage was in line with the recommendations. For all other vaccines, coverage was low, varying from 5.5% to 35.4%. Vaccination coverage was better in respondents whose parents had higher education and higher income. We suggest that patient education, government planning for the implementation of an adult vaccination policy, and the involvement of physicians are essential for better adult vaccination coverage.
Objectives: To assess the relation between the number of clinical trials conducted and respective new drug approvals in India and South Africa.
Design: Construction and analysis of a comprehensive database of completed randomised controlled clinical trials based on clinicaltrials.gov from 1 January 2005 to 31 December 2010 and drug approval data from 2006 until 2013 for India and South Africa.
Setting: USA, the EU, India and South Africa.
Main outcome measures: Percentage of completed randomised clinical trials for an Investigational Medicinal Product (IMP) leading to new drug approval in India and South Africa.
Results: A total of 622 eligible randomised controlled trials were identified as per search criteria for India and South Africa. Clustering them for the same sponsor and the same Investigational New Drug (IND) resulted in 453 eligible trials, that is, 224 for India and 229 for South Africa. The distribution of the market application approvals between the EU/USA as well as India and South Africa revealed that out of clinical trials with the participation of test centres in India and/or South Africa, 39.6% (India) clinical trials and 60.1% (South Africa) clinical trials led to market authorisation in the EU/USA without a New Drug Application (NDA) approval in India or South Africa.
Conclusions: Despite an increase in clinical trial activities, there is a clear gap between the number of trials conducted and market availability of these new drugs in India and South Africa. Drug regulatory authorities, investigators, institutional review boards and patient groups should direct their efforts to ensuring availability of new drugs in the market that have been tested and researched on their population.
Hypertension is a serious global public health problem. It accounts for 10% of all deaths in India and is the leading noncommunicable disease.1 Recent studies have shown that the prevalence of hypertension is 25% in urban and 10% in rural people in India.2 It exerts a substantial public health burden on cardiovascular health status and health care systems in India.3 Antihypertensive treatment effectively reduces hypertension-related morbidity and mortality.1 The cost of medications has always been a barrier to effective treatment.
Background: Patient satisfaction is considered an indicator of healthcare quality. Information on patient satisfaction based on the medical expertise of the physician, interpersonal skills, physician-patient interaction time, and the perceptions and needs of the patient allows policymakers to identify areas for improvement. Primary care services and healthcare structures differ between countries. The present study was done to determine and analyze the determinants associated with patient satisfaction in India, Pakistan, Spain, and the USA.
Methods: This descriptive study was performed from January to August 2019 among students from Mumbai University, India; Dow University of Health Sciences, Karachi, Pakistan; University CEU Cardenal Herrera, Valencia, Spain; and Texas State University, Texas, USA. On the basis of the eligibility criterion (written informed consent and registration as a student of the respective university), 890 students (India: 369, Pakistan: 128, Spain: 195, USA: 99) were selected for the present study.
Results: India had an almost equal ratio of male (49%) to female (51%) participants. For the other three countries (Pakistan, Spain, USA), the percentage of female participants exceeded that of male participants by nearly 20 percentage points or more. Participants' overall satisfaction scores regarding the medical expertise of the doctor were highest in India (71%) and lowest in Spain (43%). Overall satisfaction scores regarding time spent with the doctor were highest in India (64%) and lowest in Spain (41%). Overall satisfaction scores regarding communication with the doctor were highest in the USA (60%) and lowest in Pakistan (53%). Overall satisfaction scores for the medical care given by the doctor were lowest in Pakistan (43%) and highest in the USA (64%). Regarding overall satisfaction with the doctor, the highest proportion of participants satisfied with their medical interactions was in the USA (83%) and the lowest in Pakistan (32%).
Conclusions: These multi-country findings can inform health policy making in India, Pakistan, Spain, and the USA. Although average satisfaction in every country except Pakistan is more than 60%, the results suggest that there is ample room for improvement.
Background: India has the third largest HIV epidemic in the world. The Indian epidemic is characterized by a low prevalence in the general population and concentrated epidemics among high-risk groups. The present study was planned to determine awareness of HIV among students from Mumbai University.
Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Two hundred and fifty students were approached to participate in the study, of which 199 agreed to participate (males: 132; females: 67). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23.
Results: Study participants had high knowledge (86%) and attitude (87%) scores. There was no significant difference between male and female participants in attitude and knowledge, except for one question regarding knowledge of HIV transmission to the child via breastfeeding.
Conclusions: The present study showed no misconceptions or negative attitudes regarding HIV among students. A longitudinal study with a larger sample size across India is recommended for further investigation.
The drugs we use to treat any condition, from an innocuous cough to a life-threatening cancer, are the outcome of painstaking human clinical trials. These trials are the only way to credibly determine the safety and efficacy of drugs. In recent years there has been a clear shift in clinical trial sites from core developed countries, such as the USA and European countries, to developing countries such as India, China, and South American countries. This shift is related to challenges and opportunities such as trial costs, recruitment issues, and regulatory challenges in developed versus developing countries. Developing and developed countries have their own unique disease burden patterns based on parameters such as, but not limited to, age, health care facilities, health insurance, sanitary conditions, environmental issues, education, nutrition, and GDP. Previous studies have reported that many important global diseases are underexplored in clinical trials and that many published clinical trials have little international health relevance. This study aimed to find the correlation between disease burden, the number of clinical trials conducted, and trial success rates. We compared 2005-2010 Global Burden of Disease data for Germany and India with the number of clinical trials from the clinicaltrials.gov database conducted in the same period. Our findings indicated a good correlation between disease burden and clinical trials for Germany in 2005 and 2010. For India, there was a moderate positive correlation in 2005, and the 2010 data showed an improvement in the match between disease burden and clinical trials. However, careful observation of the data still shows a need for more trials on communicable, maternal, neonatal, and nutritional disorders.
Background: Epilepsy is a chronic disorder of the brain that affects people worldwide. The overall prevalence (3.0-11.9/1,000) and incidence (0.2-0.6/1,000) of epilepsy in India are comparable to the rates of high-income countries. The high prevalence of negative attitudes towards epilepsy has been highlighted by several studies. Pharmacy students represent a better-educated section of society regarding drugs and have the potential to create awareness and influence attitudes towards the disease. Thus, it is important that they have appropriate, up-to-date knowledge of and an appropriate attitude towards epilepsy and antiepileptic drugs. The objective of the present study was to determine Mumbai University pharmacy students' awareness of epilepsy, in order to identify the education and awareness strategies applicable to them.
Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Two hundred and fifty students were approached to participate in the study, of which 213 agreed to participate (males: 107; females: 106). A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23.
Results: The response rate for this study was 85.2% (213/250). Ninety-six percent (204/213) of the participants had heard or read about epilepsy. Overall knowledge was poor (40.2%) and attitude was fair (75.3%). None of the participants were aware of recent research regarding the hereditary nature of epilepsy. Only 2 (0.98%) students knew how to perform first aid for epilepsy, and only 6.8% of participants felt that people with epilepsy should participate in sports.
Conclusions: The findings of this study show that, even with an extensive curriculum covering diseases, drugs, and the relevant laws of the land, the knowledge and attitude scores were low. There is a need for focused education and campaigns to improve knowledge of and attitudes towards epilepsy.
Background: Oral cancer is among the top three types of cancer in India. Severe alcoholism and the use of tobacco in the form of cigarettes, smokeless tobacco, and betel nut chewing are the most common risk factors for oral cancer. Individuals with precancerous conditions often notice alterations, such as reduced mouth opening in oral submucous fibrosis (OSMF), but are not aware of the causes and consequences of these changes. Awareness of the causes and features of oral cancers can be very helpful in the prevention, control, and early diagnosis of oral cancer.
Methods: A cross-sectional study was carried out among students from Mumbai University, India during May-June 2017. Five hundred students were approached to participate in the study, of which 400 agreed to participate. A pretested questionnaire was distributed, and the collected data were analyzed using IBM SPSS version 23.
Results: There were 199 (49.8%) males and 201 (50.2%) females in the study, and the response rate was 80% (400/500). Respondents had good knowledge about oral cancer. Seventy-four percent (268/362) of respondents correctly identified smoking and tobacco chewing as possible causes of oral cancer. Almost all (96%; 348/362) respondents correctly responded that oral cancer does not spread from person to person through touch or speech. Seventy-two percent (260/362) of respondents believed that oral cancer is curable. A significantly higher proportion of male (98%) than female participants answered correctly the questions regarding the spread of the disease and the occurrence of oral cancer in AIDS patients.
Conclusions: Participants showed good knowledge about oral cancer, although female participants showed less knowledge than their male counterparts. Details about oral cancer should be incorporated into the undergraduate curriculum, and periodic awareness programs should be organized for students.
The medical devices sector helps save lives by providing innovative health care solutions regarding diagnosis, prevention, monitoring, treatment, and alleviation. Medical devices are classified into 1 of 3 categories in the order of increasing risk: Class I, Class II, and Class III.1 Medical devices are distinguished from drugs for regulatory purposes based on mechanism of action. Unlike drugs, medical devices operate via physical or mechanical means and are not dependent on metabolism to accomplish their primary intended effect.
Background: The mission of the pharmacy profession is to improve public health by ensuring the safe, effective, and appropriate use of medications. Population health management (PHM) is a process wherein opportunities are identified to improve the quality of health care delivered and thereby promote better health outcomes for patients.
Rationale: As the concept of PHM is extremely important in today's context, it is helpful to integrate the data on the role of the pharmacist in population health management practices. The authors therefore conducted a systematic review of the literature on this role. Method: We conducted a systematic review of the literature on the role of the pharmacist in population health management practices by searching the PubMed/MEDLINE database using the following combination of keywords: pharmacist, population health. Truncation was used to ensure retrieval of all possible variations of the search terms. The search was limited to articles published between 1 January 2015 and 31 December 2019, human studies, and the English language.
Results: The initial search resulted in a total of 281 studies; title and abstract review to remove irrelevant studies left 256 studies. The yearly trend showed that the number of publications is decreasing. The highest number of publications was from Europe (47; 18%), and 29 publications (11%) discussed the role of the pharmacist in the population health management of subjects in the age group of 10 to 20 years. Twenty-five publications mentioned that health management was done in community settings. Advice on lifestyle was mentioned in 242 (96%) publications, and 10 (4%) offered advice about drugs during health management. Pharmacists played important roles in population health management, for example as care providers exploring the challenges faced in clinics for the management of type 2 diabetes mellitus, and contributed to increasing patients' quality of life.
Discussion: The population health management concept has evolved steadily over the past few decades and is now contributing to the 'patient care journey' at all stages. There were 24 (9%) publications from India. A specially designed and implemented PharmD program would play a major role in the Indian health care system in the future, giving pharmacists an opportunity to work more prominently within it.
Conclusion: The authors are of the opinion that this is the first review encompassing the topic of the pharmacist and population health management in a global context. There is a clear global trend towards involving pharmacists in healthcare management. This enables pharmacists to assume an expanded role and at the same time necessitates reforms in pharmacy education and practice.
Objective
Cyberknife robotic radiosurgery (RRS) provides single-session high-dose radiotherapy of brain tumors with a steep dose gradient and precise real-time image-guided motion correction. Although RRS appears to cause more radiation necrosis (RN), the radiometabolic changes after RRS have not been fully clarified. 18F-FET-PET/CT is used to differentiate recurrent tumor (RT) from RN after radiosurgery when MRI findings are indecisive. We explored the usefulness of dynamic parameters derived from 18F-FET PET in differentiating RT from RN after Cyberknife treatment in a single-center study population.
Methods
We retrospectively identified brain tumor patients with static and dynamic 18F-FET-PET/CT for suspected RN after Cyberknife. Static (tumor-to-background ratio) and dynamic PET parameters (time-activity curve, time-to-peak) were quantified. Analyses were performed for all lesions taken together (TOTAL) and for brain metastases only (METS). Diagnostic accuracy of PET parameters (using mean tumor-to-background ratio >1.95 and time-to-peak of 20 min for RT as cut-offs) and their respective improvement of diagnostic probability were analyzed.
Results
Fourteen patients with 28 brain tumors were included in the quantitative analysis. Time-activity curves alone provided the highest sensitivities (TOTAL: 95%, METS: 100%) at the cost of specificity (TOTAL: 50%, METS: 57%). The combination of mean tumor-to-background ratio and time-activity curve had the highest specificities (TOTAL: 63%, METS: 71%) and led to the highest increase in diagnostic probability, of up to 16 percentage points, versus 5 percentage points when only static parameters were used.
Conclusions
This preliminary study shows that combined dynamic and static 18F-FET PET/CT parameters can be used in differentiating RT from RN after RRS.
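The decision rule implied by the reported cut-offs (mean tumor-to-background ratio above 1.95 and a time-to-peak under 20 min favoring recurrent tumor) can be sketched as a simple combined classifier. The thresholds follow the abstract, while the lesion values and the exact way the two criteria are combined are assumptions for illustration:

```python
# Combined static/dynamic lesion classifier: recurrent tumor (RT) vs.
# radiation necrosis (RN). Cut-offs follow the abstract (TBRmean > 1.95,
# TTP < 20 min suggesting RT); requiring BOTH criteria is an assumption.
TBR_CUTOFF = 1.95        # mean tumor-to-background ratio
TTP_CUTOFF_MIN = 20.0    # time-to-peak in minutes

def classify_lesion(tbr_mean: float, ttp_min: float) -> str:
    """Label a lesion 'RT' only if both criteria point to recurrence."""
    if tbr_mean > TBR_CUTOFF and ttp_min < TTP_CUTOFF_MIN:
        return "RT"
    return "RN"

label = classify_lesion(tbr_mean=2.4, ttp_min=12.5)  # hypothetical lesion
```

Requiring both criteria trades sensitivity for specificity, which matches the abstract's observation that the combined parameters yielded the highest specificities.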
To learn a subject, the acquisition of the associated technical language is important.
Despite the widely accepted importance of learning the technical language, hardly any published studies describe the characteristics of the technical languages that students are supposed to learn. This might largely be due to the absence of specialized text corpora for studying such languages at the lexical, syntactic, and textual levels. In the present paper we describe a corpus of German physics texts that can be used to study the language of physics. A large and a small variant were compiled; the small version of the corpus consists of 5.3 million words and is available on request.
Malnutrition is the condition in which the body does not get the right amount of proteins, vitamins, or other nutrients.1 The global prevalence of malnutrition was reported as 13% in 2015.2 The subregion of South Asia is known as a particularly critical area for severely wasted children aged <5 years.3 In India, 38.4% of children aged <3 years are stunted, and 46% are underweight.4 Malnutrition can lead to mortality as well as disabilities and long-term consequences such as cognitive disabilities, lower economic productivity, or disease.
The German Corona Consensus (GECCO) established a uniform dataset in FHIR format for exchanging and sharing interoperable COVID-19 patient-specific data between university health information systems (HIS). To share COVID-19 information from locations that use openEHR, the data must be converted to FHIR format. In this paper, we introduce our solution, a web tool named “openEHR-to-FHIR” that converts compositions from an openEHR repository and stores them in their respective GECCO FHIR profiles. The tool provides a REST web service for ad hoc conversion of openEHR compositions to FHIR profiles.
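At its core, such a conversion maps fields of an openEHR composition onto the corresponding FHIR resource of a GECCO profile. The sketch below uses plain dictionaries with invented, heavily simplified field names; real openEHR archetypes and GECCO FHIR profiles are far richer and are not reproduced here:

```python
# Minimal sketch of an openEHR-composition-to-FHIR mapping. Both the input
# composition structure and the output Observation structure are simplified
# stand-ins, not the actual openEHR or GECCO schemas.
def composition_to_fhir_observation(composition: dict) -> dict:
    """Map a simplified openEHR observation composition to a
    FHIR-style Observation dictionary (structure assumed for illustration)."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": composition["archetype_name"]},
        "valueQuantity": {
            "value": composition["magnitude"],
            "unit": composition["units"],
        },
        "subject": {"reference": f"Patient/{composition['subject_id']}"},
    }

composition = {  # simplified openEHR composition payload (invented)
    "archetype_name": "Body temperature",
    "magnitude": 38.2,
    "units": "Cel",
    "subject_id": "123",
}
observation = composition_to_fhir_observation(composition)
```

A REST service like the one described would wrap such a mapping behind an HTTP endpoint: the client posts an openEHR composition and receives the corresponding FHIR resource in the response body.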
Targeted panel sequencing in pediatric primary cardiomyopathy supports a critical role of TNNI3
(2019)
The underlying genetic mechanisms and early pathological events of children with primary cardiomyopathy (CMP) are insufficiently characterized. In this study, we aimed to characterize the mutational spectrum of primary CMP in a large cohort of patients ≤18 years referred to a tertiary center. Eighty unrelated index patients with pediatric primary CMP underwent genetic testing with a panel-based next-generation sequencing approach covering 89 genes. At least one pathogenic or probably pathogenic variant was identified in 30/80 (38%) index patients. Across all CMP subgroups, patients most frequently carried variants of interest in sarcomere genes, suggesting these as a major contributor in pediatric primary CMP. In MYH7, MYBPC3, and TNNI3, we identified 18 pathogenic/probably pathogenic variants (MYH7 n = 7, MYBPC3 n = 6, TNNI3 n = 5), including one homozygous truncating variant (TNNI3 c.24+2T>A). Protein- and transcript-level analyses of heart biopsies from individuals with the homozygous TNNI3 mutation revealed that the TNNI3 protein is absent, accompanied by upregulation of the fetal isoform TNNI1. The present study further supports the clinical importance of sarcomeric mutations not only in adult but also in pediatric primary CMP. TNNI3 is the third most important disease gene in this cohort, and complete loss of TNNI3 leads to severe pediatric CMP.
Background and Objectives:
Drawing causal conclusions from real-world data (RWD) poses methodological challenges and a risk of bias. We aimed to systematically assess the type and impact of potential biases that may occur when analyzing RWD, using the case of progressive ovarian cancer.
Methods:
We retrospectively compared overall survival with and without second-line chemotherapy (LOT2) using electronic medical records. Potential biases were determined using directed acyclic graphs. We followed a stepwise analytic approach, ranging from a crude analysis and a multivariable-adjusted Cox model up to a full causal analysis using a marginal structural Cox model with replicates, emulating a reference randomized controlled trial (RCT). To assess biases, we compared the effect estimates (hazard ratios [HRs]) of each approach to the HR of the reference trial.
Results:
The reference trial showed an HR for second-line vs. delayed therapy of 1.01 (95% confidence interval [95% CI]: 0.82–1.25). The corresponding HRs from the RWD analysis ranged from 0.51 for simple baseline adjustments to 1.41 (95% CI: 1.22–1.64) accounting for immortal time bias with time-varying covariates. Causal trial emulation yielded an HR of 1.12 (95% CI: 0.96–1.28).
Conclusion:
Our study, using ovarian cancer as an example, shows the importance of a thorough causal design and analysis if RWD are expected to emulate clinical trial results.
The practice, attitude, and knowledge of complementary and alternative medicine in Mumbai, India
(2020)
Background: In recent times, there has been a resurgence of interest in the use of complementary and alternative medicine (CAM) in India. The present study was conducted to examine the prevalence of CAM use in Mumbai, knowledge of and attitudes toward the safety and efficacy of CAM, and the reasons for its use.
Methods: A cross-sectional study was conducted among the general population of Mumbai and its adjoining regions during January–July 2020. 205 residents participated in the study and were asked to fill in a pretested questionnaire. The collected data were analyzed using IBM SPSS version 23.
Results: Out of the 205 responses, 163 (79.51%) agreed to have used CAM at least once in their life. Of these, 108 (52.68%) respondents used Ayurveda and 105 (51.21%) used homeopathy. 60 (36.81%) of the respondents practicing CAM used it for common gastrointestinal (GIT) disorders, with a 100% recovery rate, and 125 (76.67%) for infectious diseases, with a 93.6% recovery rate. 99 (60.74%) of the respondents preferred CAM for its safety profile, while 68 (41.72%) believed that CAM is time-tested and thus efficacious. An integrative approach was suggested by 118 (57.56%) of all the respondents.
Conclusions: There is a disparity between the high prevalence of CAM use and knowledge about it. However, the general consensus is that CAM is efficacious and is practiced for various indications.
To accompany the introduction of technical nursing care innovations, a usability assessment survey is conducted among nursing staff. The questionnaire is administered before and after the introduction of the technical products. This poster contribution presents the latest comparison of pre- and post-introduction surveys for selected products.
Public knowledge and awareness towards antibiotics use in Yogyakarta: A cross sectional survey
(2020)
Irrational use of antibiotics is a public health problem. Our study aimed to evaluate knowledge and awareness of antibiotics and to examine their associated factors. We conducted a cross-sectional survey with a questionnaire adapted from the WHO multi-country survey. Adults aged ≥18 years who were receiving prescriptions from eight outpatient clinics and pharmacies in Yogyakarta province completed the survey. The questionnaire consisted of three sections: socio-demographic factors, knowledge of antibiotics, and experiences in using antibiotics. Data were presented descriptively and analyzed using logistic regression to evaluate the influence of these variables on knowledge of antibiotics. Out of 268 respondents, a cumulative 76% had used antibiotics in the last six months. The majority of respondents (58%) had a low level of knowledge of antibiotic use and awareness, and 75% incorrectly assumed that colds and coughs are treatable with antibiotics. Interestingly, 71% of participants agreed that the internet is a major source of information on antibiotics, while only 58% and 45% of respondents consulted pharmacists and medical professionals, respectively. Antibiotics were obtained by prescription in 79% of cases, and 70% of respondents completed the full course of antibiotics prescribed, but only 32% of them became more cautious about antibiotic use. We found the strongest associations of gender, age, and education level with knowledge of antibiotics. The overall level of knowledge and awareness of antibiotic use among residents of Yogyakarta is low. This calls for public health awareness intervention programs on the use of antibiotics.
Lack of knowledge regarding antibiotic use has been widely identified as a main reason for inappropriate antibiotic use, which leads to the phenomenon of antibiotic resistance. This study aimed to evaluate the effects of a pharmacist-initiated educational intervention on promoting appropriate use of antibiotics and reducing self-medication with antibiotics. A pre- and post-intervention study using two validated self-administered questionnaires was performed in Yogyakarta province. A two-hour session of lectures and case discussion was delivered as the intervention. Pharmacy customers attending the Gema Cermat program were conveniently invited to complete both the pre- and post-education questionnaires. Question scores are presented descriptively, with knowledge scores categorized as poor, adequate, and high. Of 268 respondents, 34.22% had a poor level of knowledge before receiving the educational intervention, but this number decreased to 12.21% in the post-intervention phase. The proportion of respondents with an adequate level of knowledge rose from 28.23% before to 38.28% after receiving education about the appropriate use of antibiotics. Before the education, 37.43% of participants had a high level of knowledge about antibiotic use and resistance, whereas afterwards the number was slightly higher (49.25%). A vast majority of respondents (75.24%) became more aware of appropriate antibiotic practice after receiving the educational intervention. Overall, the didactic educational intervention produced higher knowledge and better practice regarding antibiotic use (p < 0.05). This study showed that a didactic educational intervention on antibiotic use and resistance can be an initial strategy leading to substantial improvement in appropriate antibiotic use. Further systematic interventions to educate people should be performed and evaluated in order to promote the appropriate use of antibiotics.
A descriptive cross-sectional study of cholera at Kakuma and Kalobeyei refugee camps, Kenya in 2018
(2020)
Introduction: cholera is a significant public health concern among displaced populations. Oral cholera vaccines are safe and can effectively be used as an adjunct to prevent cholera in settings with limited access to water and sanitation. Results from this study can inform future consideration of cholera vaccination at Kakuma and Kalobeyei.
Methods: a descriptive cross-sectional study of cholera cases at Kakuma refugee camp and Kalobeyei integrated settlement was carried out from May 2017 to May 2018 (one year). Data were extracted from the medical records and line lists at the cholera treatment centres.
Results: we identified 125 clinically suspected and confirmed cholera cases and one related death (CFR 0.8%). The cumulative incidence of all cases was 0.67 (95% CI=0.56-0.80) cases/1000 persons. The incidence of cholera was higher in children under the age of five: 0.94 (95% CI=0.63-1.36) cases/1000 persons. Children aged <5 years showed a 51% increased risk of cholera compared to those aged ≥5 years (RR=1.51; 95% CI=1.00-2.31, p=0.051). Individuals from the Democratic Republic of Congo had a nearly 9-fold risk of reporting cholera (RR=8.62; 95% CI=2.55-37.11, p<0.001), while individuals from South Sudan had a 7-fold risk of cholera compared to those from Somalia (RR=7.39; 95% CI=2.78-27.73, p<0.001).
Conclusion: in addition to improvements in water, sanitation and hygiene (WaSH), vaccination could be implemented as a short- to medium-term measure for preventing cholera outbreaks. Age, country of origin and settlement independently predicted the risk of cholera.
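The relative risks quoted above follow from standard 2x2-table arithmetic: the risk ratio is the ratio of attack rates, and a Wald-type 95% CI is formed on the log scale. A minimal sketch, using illustrative counts rather than the study data:

```python
import math

def relative_risk(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Relative risk (risk ratio) with a Wald-type 95% CI on the log scale.

    cases_exp / n_exp: cases and population in the exposed group;
    cases_unexp / n_unexp: the same for the comparison group.
    """
    rr = (cases_exp / n_exp) / (cases_unexp / n_unexp)
    # Standard error of log(RR) for a 2x2 table
    se = math.sqrt(1 / cases_exp - 1 / n_exp + 1 / cases_unexp - 1 / n_unexp)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Illustrative counts (not the study data): 20/1000 cases among children
# under five vs. 10/1000 among those aged five and over gives RR = 2.0.
rr, lo, hi = relative_risk(20, 1000, 10, 1000)
```

With these toy counts the interval crosses 1, illustrating how a doubled risk can still fail to reach significance when case numbers are small, much like the borderline p=0.051 reported for the under-five comparison above.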
The use of secondary data in health care research has become an important issue over the past few years. Data from the treatment context are used for the evaluation of medical data for external quality assurance, as well as to answer medical questions in the form of registries and research databases. Additionally, the establishment of electronic clinical systems such as data warehouses provides new opportunities for the secondary use of clinical data. Because health data are among the most sensitive information about an individual, they must be safeguarded from disclosure.