
About Kamlesh Khunti, MD, PhD, FRCP, FRCGP, FMedSci: Guest Editor, Improving Outcomes of People With Diabetes Through Overcoming Therapeutic Inertia (Preface)





Improving Outcomes of People With Diabetes Through Overcoming Therapeutic Inertia (Preface)





Mobilising community networks for early identification of tuberculosis and treatment initiation in Cambodia: an evaluation of a seed-and-recruit model

Background and objectives

The effects of active case finding (ACF) models that mobilise community networks for early identification and treatment of tuberculosis (TB) remain unknown. We investigated and compared the effect of community-based ACF using a seed-and-recruit model with one-off roving ACF and passive case finding (PCF) on the time to treatment initiation and identification of bacteriologically confirmed TB.

Methods

In this retrospective cohort study conducted in 12 operational districts in Cambodia, we assessed relationships between ACF models and: 1) the time to treatment initiation using Cox proportional hazards regression; and 2) the identification of bacteriologically confirmed TB using modified Poisson regression with robust sandwich variance.
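The time-to-event comparison underlying a Cox analysis rests on survival curves of the kind estimated by the Kaplan–Meier method, here with "event" meaning treatment initiation. The sketch below is a minimal, generic illustration of that estimator (S(t) is multiplied by 1 − d/n at each event time); the data are hypothetical and this is not the study's code:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  : time to treatment initiation (or censoring) for each patient
    events : 1 if treatment was initiated, 0 if censored
    Returns a list of (time, survival probability) pairs at event times.
    """
    n_at_risk = len(times)
    deaths = Counter(t for t, e in zip(times, events) if e == 1)
    exits = Counter(times)  # everyone leaves the risk set at their time
    surv, curve = 1.0, []
    for t in sorted(exits):
        d = deaths[t]
        if d:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= exits[t]
    return curve

# Hypothetical days to treatment initiation in two ACF groups
seed_recruit = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 1, 1, 0])
roving = kaplan_meier([4, 6, 7, 9, 12], [1, 1, 0, 1, 1])
```

A curve that drops to low survival probabilities sooner corresponds to earlier treatment initiation in that group.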

Results

We included 728 adults with TB, of whom 36% were identified via community-based ACF using a seed-and-recruit model. Community-based ACF using a seed-and-recruit model was associated with a shorter delay to treatment initiation than one-off roving ACF (hazard ratio 0.81, 95% CI 0.68–0.96). Compared to one-off roving ACF and PCF, community-based ACF using a seed-and-recruit model was 45% (prevalence ratio (PR) 1.45, 95% CI 1.19–1.78) and 39% (PR 1.39, 95% CI 0.99–1.94) more likely to detect bacteriologically confirmed TB, respectively.

Conclusion

Mobilising community networks to find TB cases was associated with earlier initiation of TB treatment in Cambodia. This approach was also more likely to find bacteriologically confirmed TB cases, contributing to a reduced risk of transmission within the community.





Management of acute COPD exacerbations in Australia: do we follow the guidelines?

Objective

We aimed to assess adherence to the Australian national guideline (COPD-X) against audited practice, and to document the outcomes of patients hospitalised with an acute exacerbation of chronic obstructive pulmonary disease (COPD) at discharge and 28 days after.

Methods

A prospective clinical audit of COPD hospital admissions was conducted at five tertiary care hospitals in five states of Australia. Post-discharge follow-up was conducted via telephone to assess readmission and health status.

Results

There were 207 admissions for acute exacerbation (171 patients; mean age 70.2 years; 50.3% male). The readmission rate at 28 days was 25.4%, with one (0.6%) death during admission and eight (6.1%) deaths within 28 days of discharge. Concordance with the COPD-X guidance was variable: spirometry was performed in 22.7% of admissions, 81.1% had blood gases collected when forced expiratory volume in 1 s was <1 L, 99.5% had chest radiography performed, 95.1% were prescribed systemic corticosteroids and 95% were prescribed antibiotic therapy. Oxygen therapy was given in 89.1% of admissions overall, and in 92.6% of those with arterial oxygen tension <80 mmHg; 65.6% received ventilatory assistance when pH was <7.35. Only 32.4% were referred to pulmonary rehabilitation, although 76.8% had general practitioner follow-up arranged.

Conclusion

When compared against clinical practice guidelines, we found important gaps in the management of patients admitted with COPD across tertiary care centres in Australia. Strategies to improve guideline uptake are needed to optimise care.





Efficacy and safety of two doses of budesonide/formoterol fumarate metered dose inhaler in COPD

Inhaled corticosteroid/long-acting β2-agonist combination therapy is a recommended treatment option for patients with chronic obstructive pulmonary disease (COPD) and increased exacerbation risk, particularly those with elevated blood eosinophil levels. SOPHOS (NCT02727660) evaluated the efficacy and safety of two doses of budesonide/formoterol fumarate dihydrate metered dose inhaler (BFF MDI) versus formoterol fumarate dihydrate (FF) MDI, each delivered using co-suspension delivery technology, in patients with moderate-to-very severe COPD and a history of exacerbations.

In this phase 3, randomised, double-blind, parallel-group, 12–52-week, variable length study, patients received twice-daily BFF MDI 320/10 µg or 160/10 µg, or FF MDI 10 µg. The primary endpoint was change from baseline in morning pre-dose trough forced expiratory volume in 1 s (FEV1) at week 12. Secondary and other endpoints included assessments of moderate/severe COPD exacerbations and safety.

The primary analysis (modified intent-to-treat) population included 1843 patients (BFF MDI 320/10 µg, n=619; BFF MDI 160/10 µg, n=617; and FF MDI, n=607). BFF MDI 320/10 µg and 160/10 µg improved morning pre-dose trough FEV1 at week 12 versus FF MDI (least squares mean differences 34 mL [p=0.0081] and 32 mL [p=0.0134], respectively), increased time to first exacerbation (hazard ratios 0.827 [p=0.0441] and 0.803 [p=0.0198], respectively) and reduced exacerbation rate (rate ratios 0.67 [p=0.0001] and 0.71 [p=0.0010], respectively). Lung function and exacerbation benefits were driven by patients with blood eosinophil counts ≥150 cells·mm−3. The incidence of adverse events was similar, and pneumonia rates were low (≤2.4%) across treatments.

SOPHOS demonstrated the efficacy and tolerability of BFF MDI 320/10 µg and 160/10 µg in patients with moderate-to-very severe COPD at increased risk of exacerbations.





Survival benefit of lung transplantation compared with medical management and pulmonary rehabilitation for patients with end-stage COPD

Background

COPD patients account for a large proportion of lung transplants, but the survival benefit of lung transplantation for COPD patients is not well established.

Methods

We identified 4521 COPD patients in the United Network for Organ Sharing (UNOS) dataset transplanted from May 2005 to August 2016, and 604 patients assigned to receive pulmonary rehabilitation and medical management in the National Emphysema Treatment Trial (NETT). After trimming the populations for NETT eligibility criteria and data completeness, 1337 UNOS and 596 NETT patients remained. Kaplan–Meier estimates of survival from transplantation (UNOS) and of transplant-free survival from randomisation (NETT) were compared between propensity score-matched UNOS (n=401) and NETT (n=262) patients.
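Propensity score matching of the kind described above is often implemented as greedy 1:1 nearest-neighbour matching on the logit of the propensity score within a caliper. The sketch below illustrates that generic procedure, not the authors' implementation; the scores, IDs and caliper value are hypothetical:

```python
import math

def greedy_match(treated, controls, caliper=0.2):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls : lists of (id, propensity score) pairs, scores in (0, 1)
    caliper           : maximum allowed |logit(p_t) - logit(p_c)| for a match
    Returns a list of (treated_id, control_id) matched pairs.
    """
    logit = lambda p: math.log(p / (1 - p))
    available = dict(controls)  # id -> score, controls not yet matched
    pairs = []
    # match the hardest-to-match (highest-score) treated subjects first
    for tid, tp in sorted(treated, key=lambda x: -x[1]):
        best, best_d = None, caliper
        for cid, cp in available.items():
            d = abs(logit(tp) - logit(cp))
            if d <= best_d:
                best, best_d = cid, d
        if best is not None:
            pairs.append((tid, best))
            del available[best]  # each control is used at most once
    return pairs

# Hypothetical scores: transplant recipients vs medically managed patients
pairs = greedy_match([("t1", 0.80), ("t2", 0.52)],
                     [("c1", 0.78), ("c2", 0.50), ("c3", 0.30)])
```

Treated subjects with no control inside the caliper are left unmatched, which is one reason matched cohorts (here n=401 and n=262) are smaller than the trimmed populations.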

Results

In propensity-matched analyses, transplanted patients had better survival than medically managed patients in NETT (p=0.003). Stratifying on 6 min walk distance (6MWD) and FEV1, UNOS patients with 6MWD <1000 ft (~300 m) or FEV1 <20% of predicted had better survival than their NETT counterparts (median survival 5.0 years UNOS versus 3.4 years NETT; log-rank p<0.0001), while UNOS patients with 6MWD ≥1000 ft (~300 m) and FEV1 ≥20% had survival similar to their NETT counterparts (median survival 5.4 years UNOS versus 4.9 years NETT; log-rank p=0.73); interaction p=0.01.

Conclusions

Overall survival is better for matched lung transplant patients than for those receiving medical management alone. The patients who derive the greatest benefit from transplantation, relative to pulmonary rehabilitation and medical management, are those with 6MWD <1000 ft (~300 m) or FEV1 <20% of predicted.





Epidemiological features and medical care-seeking process of patients with COVID-19 in Wuhan, China

Background

We aimed to investigate the epidemiological and clinical features, and the medical care-seeking process, of patients with coronavirus disease 2019 (COVID-19) in Wuhan, China, to provide useful information for containing COVID-19 in other places with similar outbreaks.

Methods

We collected epidemiological and clinical information on patients with COVID-19 admitted to a makeshift Fangcang hospital between 7 and 26 February 2020. The waiting time at each step of the medical care-seeking process was also analysed.

Results

Of the 205 patients with COVID-19, 31% had presumed transmission from a family member and 10% had hospital-related transmission. The median time from the first medical visit to the COVID-19 nucleic acid test was as long as 6 days, and from the first medical visit to hospital admission 10 days, indicating that early recognition of COVID-19 was not achieved at the early stage of the outbreak, although these delays shortened later. After clinical recovery from COVID-19, which took a mean of 21 days from illness onset, a substantial proportion of patients still had persistent SARS-CoV-2 infection.

Conclusions

The diagnostic evaluation of suspected patients needs to be accelerated at the epicentre of the outbreak, and early isolation of infected patients in a healthcare setting, rather than at home, is urgently required to stop the spread of the virus. Clinical recovery is not an appropriate criterion for releasing isolated patients, and isolation of as long as 4 weeks for patients with COVID-19 is not enough to prevent the spread of the virus.





High cytomegalovirus serology and subsequent COPD-related mortality: a longitudinal study

Background

Positive serology for cytomegalovirus (CMV) has been associated with all-cause mortality risk but its role in COPD mortality is unknown. The objective of the present study was to assess the relationship between CMV serology and COPD mortality.

Methods

We analysed data from 806 participants in the Tucson Epidemiological Study of Airway Obstructive Disease who, at enrolment, were aged 28–70 years and had completed lung function tests. We tested CMV serology in sera from enrolment and defined "high CMV serology" as being in the highest tertile. Vital status, date and cause of death were assessed through death certificates and/or linkage with the National Death Index up to January 2017. The association of CMV serology with all-cause and cause-specific mortality risk was tested in Cox models adjusted for age, sex, level of education, body mass index, smoking status and pack-years.

Results

High CMV serology was marginally associated with all-cause mortality (p=0.071) but the effect was inversely dependent on age, with the association being much stronger among participants <55 years than among participants ≥55 years at enrolment (p-value for CMV-by-age interaction <0.001). Compared with low CMV serology, high CMV serology was associated with mortality from COPD among all subjects (adjusted hazard ratio (HR) 2.38, 95% CI 1.11–5.08; p=0.025) and particularly in subjects <55 years old at enrolment (HR 5.40, 95% CI 1.73–16.9; p=0.004). Consistent with these results, high CMV serology also predicted mortality risk among subjects who already had airflow limitation at enrolment (HR 2.10, 95% CI 1.20–3.68; p=0.009).

Conclusions

We report a strong relationship between CMV serology and the risk of dying from COPD, and thus identify a novel risk factor for COPD mortality.





Low adherence to inhaled corticosteroids/long-acting β2-agonists and biologic treatment in severe asthmatics

Eligibility criteria for a biologic treatment for severe asthma include poor disease control despite a full medication plan according to Global Initiative for Asthma steps 4–5 [1]. Adherence to inhaled therapy should be verified as part of that prescription requirement [2]. In fact, it has been demonstrated that poor adherence is a major cause of uncontrolled asthma, regardless of its severity [3]. Furthermore, biologics do not exert a disease-modifying effect [4]; in contrast to allergen immunotherapy, which is able to permanently modulate the way the immune system reacts to allergens beyond the immunotherapy treatment course [5], biologic therapy withdrawal usually leads to asthma relapse [4]. Thus, a low adherence rate to inhaled treatment in patients undergoing biologic therapy raises some issues related to sustainability.





The Transcriptional Aftermath in Two Independently Formed Hybrids of the Opportunistic Pathogen Candida orthopsilosis

ABSTRACT

Interspecific hybridization can drive evolutionary adaptation to novel environments. The Saccharomycotina clade of budding yeasts includes many hybrid lineages, and hybridization has been proposed as a source for new pathogenic species. Candida orthopsilosis is an emerging opportunistic pathogen for which most clinical isolates are hybrids, each derived from one of at least four independent crosses between the same two parental lineages. To gain insight into the transcriptomic aftermath of hybridization in these pathogens, we analyzed allele-specific gene expression in two independently formed hybrid strains and in a homozygous strain representative of one parental lineage. Our results show that the effect of hybridization on overall gene expression is rather limited, affecting ~4% of the genes studied. However, we identified a larger effect in terms of imbalanced allelic expression, affecting ~9.5% of the heterozygous genes in the hybrids. This effect was larger in the hybrid with more extensive loss of heterozygosity, which may indicate a tendency to avoid loss of heterozygosity in these genes. Consistently, the number of shared genes with allele-specific expression in the two independently formed hybrids was higher than random expectation, suggesting selective retention. Some of the imbalanced genes have functions related to pathogenicity, including zinc transport and superoxide dismutase activities. While it remains unclear whether the observed imbalanced genes play a role in virulence, our results suggest that differences in allele-specific expression may add an additional layer of phenotypic plasticity to traits related to virulence in C. orthopsilosis hybrids.
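Allele-specific expression in hybrids is commonly screened with a binomial test of allele-resolved read counts against the balanced 1:1 expectation. The sketch below illustrates that general approach, not the study's pipeline; the read counts and significance threshold are hypothetical:

```python
from math import comb

def binom_two_sided(k, n, p=0.5):
    """Exact two-sided binomial test p-value for k successes in n trials."""
    pmf = lambda i: comb(n, i) * p**i * (1 - p)**(n - i)
    observed = pmf(k)
    # sum probabilities of all outcomes at most as likely as the observed one
    return sum(pmf(i) for i in range(n + 1) if pmf(i) <= observed + 1e-12)

def imbalanced(ref_reads, alt_reads, alpha=0.05):
    """Flag a heterozygous gene as showing imbalanced allelic expression."""
    return binom_two_sided(ref_reads, ref_reads + alt_reads) < alpha

# Hypothetical RNA-seq read counts over a heterozygous site
print(imbalanced(90, 30))   # skewed towards one parental allele: True
print(imbalanced(52, 48))   # consistent with balanced expression: False
```

In practice such per-gene tests are followed by multiple-testing correction across all heterozygous genes before calling a gene imbalanced.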

IMPORTANCE How new pathogens emerge is an important question that remains largely unanswered. Some emerging yeast pathogens are hybrids originated through the crossing of two different species, but how hybridization contributes to higher virulence is unclear. Here, we show that hybrids selectively retain gene regulation plasticity inherited from the two parents and that this plasticity affects genes involved in virulence.





Genetic Association Reveals Protection against Recurrence of Clostridium difficile Infection with Bezlotoxumab Treatment

ABSTRACT

Bezlotoxumab is a human monoclonal antibody against Clostridium difficile toxin B, indicated to prevent recurrence of C. difficile infection (rCDI) in high-risk adults receiving antibacterial treatment for CDI. An exploratory genome-wide association study investigated whether human genetic variation influences bezlotoxumab response. DNA from 704 participants who achieved initial clinical cure in the phase 3 MODIFY I/II trials was genotyped. Single nucleotide polymorphism (SNP) and human leukocyte antigen (HLA) imputation were performed using IMPUTE2 and HIBAG, respectively. A joint test of genotype and genotype-by-treatment interaction in a logistic regression model was used to screen for genetic variants associated with response to bezlotoxumab. The SNP rs2516513 and the HLA alleles HLA-DRB1*07:01 and HLA-DQA1*02:01, located in the extended major histocompatibility complex on chromosome 6, were associated with a reduction in rCDI among bezlotoxumab-treated participants. Carriage of a minor allele (homozygous or heterozygous) at any of the identified loci was related to a larger difference versus placebo in the proportion of participants experiencing rCDI; the effect was most prominent in the subgroup at high baseline risk for rCDI. Genotypes associated with an improved bezlotoxumab response showed no association with rCDI in the placebo cohort. These data suggest that a host-driven immunological mechanism may influence bezlotoxumab response. Trial registration numbers are as follows: NCT01241552 (MODIFY I) and NCT01513239 (MODIFY II).
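The genotype-by-treatment interaction screened for above can be intuited from a 2×2×2 contingency table: an interaction corresponds to the treatment odds ratio for recurrence differing between minor-allele carriers and non-carriers. A minimal sketch with hypothetical counts (not the MODIFY I/II data):

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Odds ratio of an event in group A versus group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Hypothetical rCDI counts (events, total), bezlotoxumab vs placebo
or_carriers = odds_ratio(10, 100, 40, 100)     # minor-allele carriers
or_noncarriers = odds_ratio(25, 100, 35, 100)  # non-carriers

# A ratio of odds ratios near 1 would suggest no genotype-by-treatment
# interaction; far from 1, a genotype-dependent treatment effect.
interaction = or_carriers / or_noncarriers
```

The formal screen fits this as a logistic regression with genotype, treatment and their product term, testing the genotype and interaction coefficients jointly.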

IMPORTANCE Clostridium difficile infection is associated with significant clinical morbidity and mortality; antibacterial treatments are effective, but recurrence of C. difficile infection is common. In this genome-wide association study, we explored whether host genetic variability affected treatment responses to bezlotoxumab, a human monoclonal antibody that binds C. difficile toxin B and is indicated for the prevention of recurrent C. difficile infection. Using data from the MODIFY I/II phase 3 clinical trials, we identified three genetic variants associated with reduced rates of C. difficile infection recurrence in bezlotoxumab-treated participants. The effects were most pronounced in participants at high risk of C. difficile infection recurrence. All three variants are located in the extended major histocompatibility complex on chromosome 6, suggesting the involvement of a host-driven immunological mechanism in the prevention of C. difficile infection recurrence.





Subtle Variations in Dietary-Fiber Fine Structure Differentially Influence the Composition and Metabolic Function of Gut Microbiota

ABSTRACT

The chemical structures of soluble fiber carbohydrates vary from source to source due to numerous possible linkage configurations among monomers. However, it has not been elucidated whether subtle structural variations might impact soluble fiber fermentation by colonic microbiota. In this study, we tested the hypothesis that subtle structural variations in a soluble polysaccharide govern the community structure and metabolic output of fermenting microbiota. We performed in vitro fecal fermentation studies using arabinoxylans (AXs) from different classes of wheat (hard red spring [AXHRS], hard red winter [AXHRW], and soft red winter [AXSRW]) with identical initial microbiota. Carbohydrate analyses revealed that AXSRW was characterized by a significantly shorter backbone and increased branching compared with those of the hard varieties. Amplicon sequencing demonstrated that fermentation of AXSRW resulted in a distinct community structure of significantly higher richness and evenness than those of hard-AX-fermenting cultures. AXSRW favored OTUs within Bacteroides, whereas AXHRW and AXHRS favored Prevotella. Accordingly, metabolic output varied between hard and soft varieties; higher propionate production was observed with AXSRW and higher butyrate and acetate with AXHRW and AXHRS. This study showed that subtle changes in the structure of a dietary fiber may strongly influence the composition and function of colonic microbiota, further suggesting that physiological functions of dietary fibers are highly structure dependent. Thus, studies focusing on interactions among dietary fiber, gut microbiota, and health outcomes should better characterize the structures of the carbohydrates employed.

IMPORTANCE Diet, especially with respect to consumption of dietary fibers, is well recognized as one of the most important factors shaping the colonic microbiota composition. Accordingly, many studies have been conducted to explore dietary fiber types that could predictably manipulate the colonic microbiota for improved health. However, the majority of these studies underappreciate the vastness of fiber structures in terms of their microbial utilization and omit detailed carbohydrate structural analysis. In some cases, this causes conflicting results to arise between studies using (theoretically) the same fibers. In this investigation, by performing in vitro fecal fermentation studies using bran arabinoxylans obtained from different classes of wheat, we showed that even subtle changes in the structure of a dietary fiber result in divergent microbial communities and metabolic outputs. This underscores the need for much higher structural resolution in studies investigating interactions of dietary fibers with gut microbiota, both in vitro and in vivo.





An Extensive Meta-Metagenomic Search Identifies SARS-CoV-2-Homologous Sequences in Pangolin Lung Viromes

ABSTRACT

In numerous instances, tracking the biological significance of a nucleic acid sequence can be augmented through the identification of environmental niches in which the sequence of interest is present. Many metagenomic data sets are now available, with deep sequencing of samples from diverse biological niches. While any individual metagenomic data set can be readily queried using web-based tools, meta-searches through all such data sets are less accessible. In this brief communication, we demonstrate such a meta-metagenomic approach, examining close matches to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in all high-throughput sequencing data sets in the NCBI Sequence Read Archive accessible with the "virome" keyword. In addition to the homology to bat coronaviruses observed in descriptions of the SARS-CoV-2 sequence (F. Wu, S. Zhao, B. Yu, Y. M. Chen, et al., Nature 579:265–269, 2020, https://doi.org/10.1038/s41586-020-2008-3; P. Zhou, X. L. Yang, X. G. Wang, B. Hu, et al., Nature 579:270–273, 2020, https://doi.org/10.1038/s41586-020-2012-7), we note a strong homology to numerous sequence reads in metavirome data sets generated from the lungs of deceased pangolins reported by Liu et al. (P. Liu, W. Chen, and J. P. Chen, Viruses 11:979, 2019, https://doi.org/10.3390/v11110979). While analysis of these reads indicates the presence of a similar viral sequence in pangolin lung, the similarity is not sufficient to either confirm or rule out a role for pangolins as an intermediate host in the recent emergence of SARS-CoV-2. In addition to the implications for SARS-CoV-2 emergence, this study illustrates the utility and limitations of meta-metagenomic search tools in effective and rapid characterization of potentially significant nucleic acid sequences.

IMPORTANCE Meta-metagenomic searches allow for high-speed, low-cost identification of potentially significant biological niches for sequences of interest.





Molar element ratio analysis of lithogeochemical data: a toolbox for use in mineral exploration and mining

Molar element ratio analysis of element concentrations consists of four basic tools that provide substantial insight into the lithogeochemistry (and mineralogy) of rocks under examination. These tools consist of: (1) conserved element ratio analysis; (2) Pearce element ratio analysis; (3) general element ratio analysis; and (4) lithogeochemical mineral mode analysis. Conserved element ratio analysis is useful in creating a chemostratigraphic model for the host rocks to mineral deposits, whereas Pearce element ratio analysis and general element ratio analysis are primarily used to identify mineralogical and metasomatic controls on rock compositions and to investigate and quantify the extent of the material transfers that formed the host rocks and mineralization. Lithogeochemical mineral mode analysis converts element concentrations into mineral concentrations using a matrix-based change-of-basis operation, allowing lithogeochemical data to be interpreted in terms of mineral modes. It can be used to provide proper names to rocks, an important activity for an exploration geologist because of the implications that rock names have on genetic processes and mineral deposit models.
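The matrix-based change of basis in lithogeochemical mineral mode analysis can be read as solving M·x = b, where each column of M gives the moles of each element contributed by one mole of a mineral, b holds the measured molar element amounts, and x the mineral modes. The following is a deliberately simplified two-mineral sketch (idealized stoichiometries; an illustration of the linear-algebra step, not a full implementation):

```python
def solve_2x2(m, b):
    """Solve a 2x2 linear system m @ x = b by Cramer's rule."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (b[0] * m[1][1] - m[0][1] * b[1]) / det
    x1 = (m[0][0] * b[1] - b[0] * m[1][0]) / det
    return [x0, x1]

# Toy basis: moles of Al and K contributed by 1 mole of each mineral.
# Muscovite KAl2(AlSi3O10)(OH)2 -> 3 Al, 1 K; albite NaAlSi3O8 -> 1 Al, 0 K.
M = [[3.0, 1.0],   # Al row: (muscovite, albite)
     [1.0, 0.0]]   # K  row
b = [5.0, 1.0]     # measured moles of Al and K in the rock

modes = solve_2x2(M, b)   # -> [1.0, 2.0]: 1 mol muscovite, 2 mol albite
```

Real applications use many elements and minerals, so the system is larger and typically solved (or least-squares fitted) with a linear algebra library rather than by hand.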

This paper provides a review of the theoretical foundations of each of these four tools and then illustrates how these techniques have been used in a variety of exploration applications to assist in the search for, evaluation and planning of, and the mining of mineral deposits. Examples include the evaluation of total digestion lithogeochemical datasets from mineral deposits hosted by igneous and sedimentary rocks and formed by hydrothermal and igneous processes. In addition, this paper illustrates a more recent geometallurgical application of these methods, whereby the mineral proportions determined by lithogeochemical mineral mode analysis are used to predict rock properties and obtain the ore body knowledge critical for resource evaluation, mine planning, mining and mine remediation.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





New developments in field-portable geochemical techniques and on-site technologies and their place in mineral exploration

This paper focuses on handheld and top-of-hole techniques that have appeared since 2007 or have undergone major improvements, and discusses their benefits, challenges and pitfalls, why we use them and what to expect from them. There is an ongoing need to be innovative in the way we undertake mineral exploration. Recent technological advances that have been applied to successful mineral exploration include on-site or portable instruments, on-site laboratory technologies, various core scanners, and technologies for fluid analysis. Portable or field technologies such as pXRF, pXRD, pNIR-SWIR, µRaman and LIBS aid in obtaining chemical and mineralogical information. Spectral gamma tools, a well-established technology, have recently benefited from improved ground-based and airborne (drone) instruments, complementing hyperspectral imagery. At mine and exploration sites, top-of-hole sensing technologies, such as Lab-at-Rig® and various core scanners (both spectral- and XRF-based), have become useful tools for analysing metres of core as it is being drilled. Fluid analyses are not as common as analyses of solid materials, but advances are being made in technologies such as anodic stripping voltammetry, polarography and ion-exchange electrodes aimed at the analysis of commodity or environmentally important elements.

Field-portable geochemical techniques and on-site technologies now offer instant response and flexibility for most exploration tasks. By providing relevant data within minutes, they allow safer field decisions and a focus on the most promising finds, while saving valuable resources in sampling grids or drilling. More efficient laboratory analysis programs are supported by on-site sample screening and homogeneity checking. Field analyses are not always as accurate as laboratory ones, but most of the time can be correlated with them, enabling reliable decisions. The level of confidence in field-made decisions needs to be weighed between the later, less numerous laboratory analyses and the less precise but more abundant and immediate field analyses; in many cases, the fit-for-purpose nature of the latter allows a better confidence level. Quality compromises associated with field analyses can be reduced by better sample preparation and quality assurance/quality control (QA/QC) procedures. Most further development of on-site chemical analysis is expected to be based on its integration with laboratory methods and on sound QA/QC practice, allowing a precise evaluation of its confidence level and uncertainties. Mineralogical analyses are constrained by our ability to interpret the data in near-real time but offer promising approaches in both surface and drilling exploration campaigns.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Advances in the use of isotopes in geochemical exploration: instrumentation and applications in understanding geochemical processes

Among the emerging techniques for detecting the real footprint of buried ore deposits is isotope tracing. Novel instruments and automated preparation systems, such as continuous flow isotope ratio mass spectrometry, off-axis integrated cavity output spectroscopy for the isotopic compositions of selected molecules, multi-collector inductively coupled plasma mass spectrometry (ICP-MS), triple quadrupole ICP-MS, laser ablation ICP-MS, and a multitude of inline preparation systems, have facilitated the use of isotopes as tracers in mineral exploration, as the cost of isotope analyses has decreased and the time required for the analyses has shortened. In addition, the isotope systems being used have expanded beyond the traditional light stable and Pb isotopes to include a multitude of elements that behave differently during processes that promote the mobilization of elements during both primary and secondary dispersion. Isotopes are also being used to understand barren areas that lack a critical process needed to form an ore deposit and to reveal precise redox mechanisms. The goal is to be able to use isotopes to reflect a definitive process that occurs in association with the deposit and not in barren systems, and then to relate these to something that is easier to measure, namely elemental concentrations. As new generations of exploration and environmental scientists become more comfortable with the application of isotopes to trace processes in geoscience, and as new technologies for rapid and inexpensive isotope analyses continue to be developed, novel applications of isotope tracing are becoming more mainstream.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Advances in ICP-MS technology and the application of multi-element geochemistry to exploration

There have been several advances in inductively coupled plasma mass spectrometry (ICP-MS) analytical technologies in the last decade. Collision/reaction cell ICP-MS and triple quadrupole ICP-MS techniques can produce lower detection limits for select elements that experience interferences with a standard quadrupole (e.g. Se and As). Triple quadrupole ICP-MS, in particular, can eliminate virtually all polyatomic or isobaric interferences, allowing highly accurate measurements of element isotope systematics of great interest in mineral exploration, namely Pb/Pb. Laser ablation ICP-MS has become more popular as an effective analytical tool for measuring trace elements in mineral grains, which can assist in vectoring towards mineralization or exploration drill targets. The ablation of a spot on a Li-borate fused glass disk, paired with XRF analysis, has also gained popularity as an alternative to total whole-rock characterization packages that employ several separate digestions and analytical methods. While there have been several advances in ICP-MS technologies in exploration geochemistry, they have not been widely accepted or implemented. This slow adoption could be due to the extended recession in the mining industry between 2012 and 2017. It is also possible that standard ICP-MS data (i.e. without a collision/reaction cell) are still fit for purpose. This stands in stark contrast to the implementation of ICP-MS in the previous decade (1997–2007), which was transformational for the industry.

Consideration of all elements from large multi-element ICP-MS analytical suites can be an extremely powerful tool in the mineral exploration toolkit. The discovery of the White Gold District, Yukon, is a prime example of how soil geochemical data, when plotted spatially, can vector towards gold mineralization. The presence of Au + As + Sb soil anomalies was key to delineating mineralization, especially when accompanied by publicly available geological, geographical and geophysical data. Additionally, elements and element ratios not typically considered in Au exploration, including Ni and U, were used to determine the lithological and structural controls on mineralization. The availability of multi-element ICP-MS data was also instrumental in the discovery of the Cascadero Copper Taron caesium deposit: ore-grade Cs was discovered only because Cs was included in the multi-element ICP-MS exploration geochemistry suite. Before the availability of ICP-MS, it is unlikely that this deposit would have been discovered.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Medical Cannabinoid Products in Children and Adolescents





Case 1: Progressive Dysphagia in a Teenager with Down Syndrome





Managing Excipient Supplier Name and Address Changes in the Pharmaceutical Quality System

It is important to identify, assess, and address current barriers to implementing post-approval changes, so that the pharmaceutical manufacturing sector can remain maximally efficient, agile, and flexible while ensuring continued (uninterrupted) operations and driving innovation and continual improvement. Leveraging the International Council for Harmonisation (ICH) quality guideline Q10 provides regulatory relief for changes related to excipients, specifically changes to an excipient supplier's name and address. This approach helps ensure a sustainable, reliable global supply and the availability of high-quality product to patients throughout a product's entire commercial lifecycle, without extensive regulatory oversight.





Advancing Biologics Development Programs with Legacy Cell Lines: Advantages and Limitations of Genetic Testing for Addressing Clonality Concerns Prior to Availability of Late Stage Process and Product Consistency Data

The bioprocessing industry uses recombinant mammalian cell lines to generate therapeutic biologic drugs. To ensure consistent product quality of the therapeutic proteins, it is imperative to have a controlled production process. Regulatory agencies and the biotechnology industry consider cell line "clonal origin" an important aspect of maintaining process control. Demonstration of clonal origin of the cell substrate, or production cell line, has received considerable attention in the past few years, and the industry has improved methods and devised standards to increase the probability and/or assurance of clonal derivation. However, older production cell lines developed before the implementation of these methods, herein referred to as "legacy cell lines," may not meet current regulatory expectations for demonstration of clonal derivation. In this article, the members of the IQ Consortium Working Group on Clonality present our position that the demonstration of process consistency and product comparability of critical quality attributes throughout the development life cycle should be sufficient to approve a license application without additional genetic analysis to support clonal origin, even for legacy cell lines that may not meet current day clonal derivation standards. With this commentary, we discuss advantages and limitations of genetic testing methods to support clonal derivation of legacy cell lines and wish to promote a mutual understanding with the regulatory authorities regarding their optional use during early drug development, subsequent to Investigational New Drug (IND) application and before demonstration of product and process consistency at Biologics License Applications (BLA) submission.





Risk Assessment Approach to Microbiological Controls of Cell Therapies

This technology review, written by a small group of pharmaceutical microbiologists experienced in cell therapies, discusses a risk-based approach to microbiological contamination detection and control during gene and cell therapy production. Topics discussed include a brief overview of cell therapies; a risk analysis related to donor selection, cell collection, and infectious agent testing; cell transformation and expansion; packaging, storage, and administration; and cell therapy microbial contamination testing and release.





Global Organization and Proposed Megataxonomy of the Virus World [Review]

Viruses and mobile genetic elements are molecular parasites or symbionts that coevolve with nearly all forms of cellular life. The route of virus replication and protein expression is determined by the viral genome type. Comparison of these routes led to the classification of viruses into seven "Baltimore classes" (BCs) that define the major features of virus reproduction. However, recent phylogenomic studies identified multiple evolutionary connections among viruses within each of the BCs as well as between different classes. Due to the modular organization of virus genomes, these relationships defy simple representation as lines of descent but rather form complex networks. Phylogenetic analyses of virus hallmark genes combined with analyses of gene-sharing networks show that replication modules of five BCs (three classes of RNA viruses and two classes of reverse-transcribing viruses) evolved from a common ancestor that encoded an RNA-directed RNA polymerase or a reverse transcriptase. Bona fide viruses evolved from this ancestor on multiple, independent occasions via the recruitment of distinct cellular proteins as capsid subunits and other structural components of virions. The single-stranded DNA (ssDNA) viruses are a polyphyletic class, with different groups evolving by recombination between rolling-circle-replicating plasmids, which contributed the replication protein, and positive-sense RNA viruses, which contributed the capsid protein. The double-stranded DNA (dsDNA) viruses are distributed among several large monophyletic groups and arose via the combination of distinct structural modules with equally diverse replication modules. Phylogenomic analyses reveal the finer structure of evolutionary connections among RNA viruses and reverse-transcribing viruses, ssDNA viruses, and large subsets of dsDNA viruses. Taken together, these analyses allow us to outline the global organization of the virus world. 
Here, we describe the key aspects of this organization and propose a comprehensive hierarchical taxonomy of viruses.





Touching the Surface: Diverse Roles for the Flagellar Membrane in Kinetoplastid Parasites [Review]

While flagella have been studied extensively as motility organelles, with a focus on internal structures such as the axoneme, more recent research has illuminated the roles of the flagellar surface in a variety of biological processes. Parasitic protists of the order Kinetoplastida, which include trypanosomes and Leishmania species, provide a paradigm for probing the role of flagella in host-microbe interactions and illustrate that this interface between the flagellar surface and the host is of paramount importance. An increasing body of knowledge indicates that the flagellar membrane serves a multitude of functions at this interface: attachment of parasites to tissues within insect vectors, close interactions with intracellular organelles of vertebrate cells, transactions between flagella from different parasites, junctions between the flagella and the parasite cell body, emergence of nanotubes and exosomes from the parasite directed to either host or microbial targets, immune evasion, and sensing of the extracellular milieu. Recent whole-organelle or genome-wide studies have begun to identify protein components of the flagellar surface that must mediate these diverse host-parasite interactions. The increasing corpus of knowledge on kinetoplastid flagella will likely prove illuminating for other flagellated or ciliated pathogens as well.





Posttranscriptional Regulation of tnaA by Protein-RNA Interaction Mediated by Ribosomal Protein L4 in Escherichia coli [Article]

Escherichia coli ribosomal protein (r-protein) L4 has extraribosomal biological functions. Previously, we described L4 as inhibiting RNase E activity through protein-protein interactions. Here, we report that among the stabilized transcripts regulated by L4-RNase E, mRNA levels of tnaA (encoding tryptophanase from the tnaCAB operon) increased upon ectopic L4 expression, whereas TnaA protein levels decreased. However, at nonpermissive temperatures (to inactivate RNase E), tnaA mRNA and protein levels both increased in an rne temperature-sensitive [rne(Ts)] mutant strain. Thus, L4 protein fine-tunes TnaA protein levels independently of its inhibition of RNase E. We demonstrate that ectopically expressed L4 binds the transcribed spacer RNA between tnaC and tnaA and downregulates TnaA translation. We found that deletion of the 5' or 3' half of the spacer, compared to the wild type, resulted in a similar reduction in TnaA translation in the presence of L4. In vitro binding of L4 to the tnaC-tnaA transcribed spacer RNA results in changes to its secondary structure. We reveal that during early stationary-phase bacterial growth, steady-state levels of tnaA mRNA increased but TnaA protein levels decreased. We further confirm that endogenous L4 binds to tnaC-tnaA transcribed spacer RNA in cells at early stationary phase. Our results reveal a novel function of L4 in fine-tuning TnaA protein levels during cell growth and demonstrate that r-protein L4 acts as a translation regulator outside the ribosome and its own operon.

IMPORTANCE Some ribosomal proteins have extraribosomal functions in addition to ribosome translation function. The extraribosomal functions of several r-proteins control operon expression by binding to own-operon transcripts. Previously, we discovered a posttranscriptional, RNase E-dependent regulatory role for r-protein L4 in the stabilization of stress-responsive transcripts. Here, we found an additional extraribosomal function for L4 in regulating the tna operon by L4-intergenic spacer mRNA interactions. L4 binds to the transcribed spacer RNA between tnaC and tnaA and alters the structural conformation of the spacer RNA, thereby reducing the translation of TnaA. Our study establishes a previously unknown L4-mediated mechanism for regulating gene expression, suggesting that bacterial cells have multiple strategies for controlling levels of tryptophanase in response to varied cell growth conditions.





Ribosome Dimerization Protects the Small Subunit [Article]

When nutrients become scarce, bacteria can enter an extended state of quiescence. A major challenge of this state is how to preserve ribosomes for the return to favorable conditions. Here, we show that the ribosome dimerization protein hibernation-promoting factor (HPF) functions to protect essential ribosomal proteins. Ribosomes isolated from strains lacking HPF (hpf) or encoding a mutant allele of HPF that binds the ribosome but does not mediate dimerization were substantially depleted of the small subunit proteins S2 and S3. Strikingly, these proteins are located directly at the ribosome dimer interface. We used single-particle cryo-electron microscopy (cryo-EM) to further characterize these ribosomes and observed that a high percentage of ribosomes were missing S2, S3, or both. These data support a model in which the ribosome dimerization activity of HPF evolved to protect labile proteins that are essential for ribosome function. HPF is almost universally conserved in bacteria, and HPF deletions in diverse species exhibit decreased viability during starvation. Our data provide mechanistic insight into this phenotype and establish a mechanism for how HPF protects ribosomes during quiescence.

IMPORTANCE The formation of ribosome dimers during periods of dormancy is widespread among bacteria. Dimerization is typically mediated by a single protein, hibernation-promoting factor (HPF). Bacteria lacking HPF exhibit strong defects in viability and pathogenesis and, in some species, extreme loss of rRNA. The mechanistic basis of these phenotypes has not been determined. Here, we report that HPF from the Gram-positive bacterium Bacillus subtilis preserves ribosomes by preventing the loss of essential ribosomal proteins at the dimer interface. This protection may explain phenotypes associated with the loss of HPF, since ribosome protection would aid survival during nutrient limitation and impart a strong selective advantage when the bacterial cell rapidly reinitiates growth in the presence of sufficient nutrients.





Measuring airway clearance outcomes in bronchiectasis: a review

While airway clearance techniques (ACTs) are recommended for individuals with bronchiectasis, many trials have demonstrated inconsistent benefits or failed to reach their primary outcome. This review determined the most common clinical and patient-reported outcome measures used to evaluate the efficacy of ACTs in bronchiectasis. A literature search of five databases using relevant keywords and filtering for studies published in English, up until the end of August 2019, was completed. Studies included randomised controlled trials, using crossover or any other trial design, and abstracts. Studies were included where the control was placebo, no intervention, standard care, usual care or an active comparator. Adults with bronchiectasis not related to cystic fibrosis were included. Extracted data comprised study authors, design, duration, intervention, outcome measures and results. The search identified 27 published studies and one abstract. The most common clinical outcome measures were sputum volume (n=23), lung function (n=17) and pulse oximetry (n=9). The most common patient-reported outcomes were health-related quality of life (measured with St George's Respiratory Questionnaire, n=4), cough-related quality of life (measured with Leicester Cough Questionnaire, n=4) and dyspnoea (measured with Borg/modified Borg scale, n=8). Sputum volume, lung function, dyspnoea and health- and cough-related quality of life appear to be the most common clinical and patient-reported measures of airway clearance treatment efficacy.
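The tallying step described in this review (counting how many studies reported each outcome measure) can be sketched with `collections.Counter`. The study records below are hypothetical illustrations, not the review's extracted data:

```python
from collections import Counter

# Hypothetical extraction records: one entry per study, listing the
# outcome measures it reported (names follow the review's categories).
studies = [
    ["sputum volume", "lung function"],
    ["sputum volume", "dyspnoea (Borg)"],
    ["lung function", "pulse oximetry", "sputum volume"],
    ["cough-related QoL (LCQ)", "sputum volume"],
]

# Count how many studies used each outcome measure.
counts = Counter(m for study in studies for m in study)
for measure, n in counts.most_common():
    print(f"{measure}: n={n}")
```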





Thoracic ultrasound in the modern management of pleural disease

Physician-led thoracic ultrasound (TUS) has substantially changed how respiratory disorders, and in particular pleural diseases, are managed. The use of TUS as a point-of-care test enables the respiratory physician to quickly and accurately diagnose pleural pathology and ensure safe access to the pleural space during thoracentesis or chest drain insertion. Competence in performing TUS is now an obligatory part of respiratory speciality training programmes in different parts of the world. Pleural physicians with higher levels of competence routinely use TUS during the planning and execution of more sophisticated diagnostic and therapeutic interventions, such as core needle pleural biopsies, image-guided drain insertion and medical thoracoscopy. Current research is gauging the potential of TUS in predicting the outcome of different pleural interventions and how it can aid in tailoring the optimum treatment according to different TUS-based parameters.





Exceptionally well-preserved Permocalculus cf. tenellus (Pia) (Gymnocodiaceae) from Upper Permian Khuff Formation limestones, Saudi Arabia

An exceptionally well-preserved specimen of the articulated rhodophyte Permocalculus, compared with P. tenellus sensu Elliott, 1955, is described from fine-grained Upper Permian limestones of the Khuff Formation of Saudi Arabia. Longitudinal medullary and sheaf-like cortical filaments extend through the uniserial series of elongate-globular, concave- and convex-terminating, interlocking segments for which they are interpreted to have functioned in articulation. The filaments tend to splay and branch laterally into the cortex where they terminate at the pores. At the terminal aperture, the filaments extend as bifurcating and possibly trifurcating branches and may serve as the origin of a new segment. Numerous elongate-globular chambers, up to five in each row and intimately involved with the filaments, are developed in the outer medulla and are considered to represent reproductive sporangia. The specimen is considered to have occupied predominantly low-energy, normal to slightly elevated salinity, shallow conditions within the subtidal regime of a lagoon.





Implementation and Scale-Up of the Standard Days Method of Family Planning: A Landscape Analysis

The Standard Days Method (SDM), a modern fertility awareness-based family planning method, has been introduced in 30 countries since its development in 2001. It is still unclear to what extent the SDM was mainstreamed within the family planning method mix, particularly in low- and middle-income country (LMIC) settings, where the SDM had been introduced by donors and implementing partners. This review of implementation science publications on the SDM in LMICs first looked at community pilot studies of the SDM to determine the acceptability of the method; correct use and efficacy rates; demographics of users; and changes to contraceptive prevalence rates and family planning behaviors, especially among men and couples. Then, we examined the status of the SDM in the 16 countries that had attempted to scale up the method within national family planning protocols, training, and service delivery. At the community level, evidence demonstrated a high level of acceptability of the method; efficacy rates comparable to the initial clinical trials; diversity in demographic characteristics of users, including first-time or recently discontinued users of family planning; increased male engagement in family planning; and improved couple's communication. Nationally, few countries had scaled up the SDM due to uneven stakeholder engagement, lackluster political will, and competing resource priorities. Results of this review could help policy makers determine the added value of the SDM in the contraceptive method mix and identify potential barriers to its implementation moving forward.





How Should Home-Based Maternal and Child Health Records Be Implemented? A Global Framework Analysis

Background: A home-based record (HBR) is a health document kept by the patient or their caregivers, rather than by the health care facility. HBRs are used in 163 countries, but they have not been implemented universally or consistently. Effective implementation maximizes both health impacts and cost-effectiveness. We sought to examine this research-to-practice gap and delineate the facilitators and barriers to the effective implementation and use of maternal and child health HBRs, especially in low- and middle-income countries (LMICs).

Methods: Using a framework analysis approach, we created a framework of implementation categories in advance using subject expert inputs. We collected information through 2 streams. First, we screened 69 gray literature documents, of which 18 were included for analysis. Second, we conducted semi-structured interviews with 12 key informants, each of whom had extensive experience with HBR implementation. We abstracted the relevant data from the documents and interviews into an analytic matrix. The matrix was based on the initial framework and adjusted according to emergent categories from the data.

Results: We identified 8 contributors to successful HBR implementation. These include establishing high-level support from the government and ensuring clear communication between all ministries and nongovernmental organizations involved. Choice of appropriate contents within the record was noted as important for alignment with the health system and for end user acceptance, as were the design, its physical durability, and timely redesigns. Logistical considerations, such as covering costs sustainably and arranging printing and distribution, could be potential bottlenecks. Finally, end users' engagement with HBRs depended on how the record was initially introduced to them and how its importance was reinforced over time by those in leadership positions.

Conclusions: This framework analysis is the first study to take a more comprehensive and broad approach to the HBR implementation process in LMICs. The findings provide guidance for policy makers, donors, and health care practitioners regarding best implementation practice and effective HBR use, as well as where further research is required.





Unmet Need for Family Planning and Experience of Unintended Pregnancy Among Female Sex Workers in Urban Cameroon: Results From a National Cross-Sectional Study

Background: Female sex workers (FSWs) in Cameroon commonly have unmet need for contraception, posing a high risk of unintended pregnancy. Unintended pregnancy leads to a range of outcomes, and due to legal restrictions, FSWs often seek unsafe abortions. Aside from the high burden of HIV, little is known about the broader sexual and reproductive health of FSWs in Cameroon.

Methods: From December 2015 to October 2016, we recruited FSWs aged ≥18 years through respondent-driven sampling across 5 Cameroonian cities. Cross-sectional data were collected through a behavioral questionnaire. Modified-robust Poisson regression was used to approximate adjusted prevalence ratios (aPRs) for termination of pregnancy (TOP) and current use of effective nonbarrier contraception.

Results: Among 2,255 FSWs (median age 28 years), 57.6% reported a history of unintended pregnancy and 40.0% reported prior TOP. In multivariable analysis, TOP history was associated with current nonbarrier contraceptive use (aPR=1.23, 95% confidence interval [CI]=1.07, 1.42); ever using emergency contraception (aPR=1.34, 95% CI=1.17, 1.55); >60 clients in the past month (aPR=1.29, 95% CI=1.07, 1.54) compared to ≤30; inconsistent condom use with clients (aPR=1.17, 95% CI=1.00, 1.37); ever experiencing physical violence (aPR=1.24, 95% CI=1.09, 1.42); and older age. Most (76.5%) women used male condoms for contraception, but only 33.2% reported consistent condom use with all partners. Overall, 26.4% of women reported currently using a nonbarrier contraceptive method, and 6.2% reported using a long-acting method. Previous TOP (aPR=1.41, 95% CI=1.16, 1.72) and ever using emergency contraception (aPR=2.70, 95% CI=2.23, 3.26) were associated with higher nonbarrier contraceptive use. Recent receipt of HIV information (aPR=0.72, 95% CI=0.59, 0.89) and membership in an FSW community-based organization (aPR=0.73, 95% CI=0.57, 0.92) were associated with lower nonbarrier contraceptive use.

Conclusions: Experience of unintended pregnancy and TOP is common among FSWs in Cameroon. Given the low use of nonbarrier contraceptive methods and inconsistent condom use, FSWs are at risk of repeat unintended pregnancies. Improved integration of client-centered, voluntary family planning within community-led HIV services may better support the sexual and reproductive health and human rights of FSWs, consistent with the United Nations Declaration of Human Rights.
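The adjusted prevalence ratios above come from modified Poisson regression with robust variance, which requires a full modeling package. The underlying crude prevalence ratio and its log-based (Katz) confidence interval, however, can be sketched directly; the counts below are hypothetical, not taken from the study:

```python
import math

def prevalence_ratio(a, n1, b, n2, z=1.96):
    """Crude prevalence ratio of group 1 vs. group 2 with a Katz 95% CI.

    a/n1: outcome-positive / total in group 1; b/n2: same for group 2.
    """
    pr = (a / n1) / (b / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(pr) - z * se_log)
    hi = math.exp(math.log(pr) + z * se_log)
    return pr, lo, hi

# Hypothetical counts (not from the study): 150/300 exposed vs. 100/300 unexposed.
print(prevalence_ratio(150, 300, 100, 300))
```

Modified Poisson regression generalizes this crude ratio to adjust for covariates while keeping the prevalence-ratio interpretation.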





Designing and Evaluating Scalable Child Marriage Prevention Programs in Burkina Faso and Tanzania: A Quasi-Experiment and Costing Study

Background: A significant number of girls are married as children, which negatively impacts their health, education, and development. Given the sheer numbers of girls at risk of child marriage globally, the challenge to eliminate the practice is daunting. Programs to prevent child marriage are typically small-scale and overlook the costs and scalability of the intervention.

Implementation: This study tested and costed different approaches to preventing child marriage in rural Burkina Faso and Tanzania. The approaches tested were community dialogue, provision of school supplies, provision of a livestock asset, a model including all components, and a control arm. A quasi-experimental design was employed with surveys undertaken at baseline and after 2 years of intervention. We examined the prevalence of child marriage and school attendance, controlling for background characteristics and stratified by age group. Programmatic costs were collected prospectively.

Results: Among those in the community dialogue arm in Burkina Faso, girls aged 15 to 17 years had two-thirds less risk (risk ratio [RR]=0.33; 95% confidence interval [CI]=0.19, 0.60) of being married, and girls aged 12 to 14 years had a greater chance of being in school (RR=1.18; 95% CI=1.07, 1.29) compared to the control site. In Tanzania, girls aged 12 to 14 years residing in the multicomponent arm had two-thirds less risk of being married (RR=0.33; 95% CI=0.11, 0.99), and girls aged 15 to 17 years in the conditional asset location had half the risk (RR=0.52; 95% CI=0.30, 0.91). All the interventions tested in Tanzania were associated with increased risk of girls 12 to 14 years old being in school, and the educational promotion arm was also associated with a 30% increased risk of girls aged 15 to 17 years attending school (RR=1.3; 95% CI=1.01, 1.67). Costs per beneficiary ranged from US$9 to US$117.

Conclusion: The study demonstrates that minimal, low-cost approaches can be effective in delaying child marriage and increasing school attendance. However, community dialogues need to be designed to ensure sufficient quality and intensity of messaging. Program managers should pay attention to the cost, quality, and coverage of interventions, especially considering that child marriage persists in the most hard-to-reach rural areas of many countries.





Two-Way Short Message Service (SMS) Communication May Increase Pre-Exposure Prophylaxis Continuation and Adherence Among Pregnant and Postpartum Women in Kenya

Introduction: We evaluated a 2-way short message service (SMS) communication platform to improve continuation of pre-exposure prophylaxis (PrEP) for HIV prevention among Kenyan women who initiated PrEP within routine maternal child health (MCH) and family planning clinics.

Methods: We adapted an existing SMS platform (Mobile WACh [mWACh]) to send PrEP-tailored, theory-based SMS and allow clients to communicate with a remote nurse. Women who did not have HIV and who were initiating PrEP at 2 MCH/family planning clinics in Kisumu County, Kenya, from February to October 2018, were offered enrollment into the mWACh-PrEP program; SMS communication was free. We evaluated acceptability, satisfaction, and implementation metrics. In a pre/post evaluation, we compared PrEP continuation at 1 month postinitiation among women who initiated PrEP in the period before (n=166) versus after mWACh-PrEP implementation, adjusting for baseline differences.

Results: Of the 334 women screened for enrollment into the mWACh-PrEP program, 193 (58%) were eligible, and of those, 190 (98%) accepted enrollment. Reasons for ineligibility (n=141) included no phone access (29%) and shared SIM cards (25%). Median age was 25 years (interquartile range=22–30), and 91% were MCH clients. Compared to women who initiated PrEP in the month before mWACh-PrEP implementation, women who enrolled in mWACh-PrEP were more likely to return for their first PrEP follow-up visit (40% vs. 53%; adjusted risk ratio [aRR]=1.26; 95% confidence interval [CI]=1.06, 1.50; P=.008) and more likely to continue PrEP (22% vs. 43%; aRR=1.75; 95% CI=1.21, 2.55; P=.003). Among those who returned, 99% reported successful receipt of SMS through the mWACh-PrEP system and 94% reported that mWACh-PrEP helped them understand PrEP better. Concerns about PrEP use, how it works, and side effects accounted for the majority (80%) of issues raised by participants using SMS.

Conclusions: Two-way SMS expanded support for PrEP and opportunities for dialogue beyond the clinic and enabled women to ask and receive answers in real time regarding PrEP, which facilitated its continued use.





Coaching Intensity, Adherence to Essential Birth Practices, and Health Outcomes in the BetterBirth Trial in Uttar Pradesh, India

Background: Coaching can improve the quality of care in primary-level birth facilities and promote birth attendant adherence to essential birth practices (EBPs) that reduce maternal and perinatal mortality. The intensity of coaching needed to promote and sustain behavior change is unknown. We investigated the relationship between coaching intensity, EBP adherence, and maternal and perinatal health outcomes using data from the BetterBirth Trial, which assessed the impact of a complex, coaching-based implementation of the World Health Organization's Safe Childbirth Checklist in Uttar Pradesh, India.

Methods: For each birth, we defined multiple coaching intensity metrics, including coaching frequency (coaching visits per month), cumulative coaching (total coaching visits accrued during the intervention), and scheduling adherence (coaching delivered as scheduled). We considered coaching delivered at both facility and birth attendant levels. We assessed the association between coaching intensity and birth attendant adherence to 18 EBPs and with maternal and perinatal health outcomes using regression models.

Results: Coaching frequency was associated with modestly increased EBP adherence. Delivering 6 coaching visits per month to facilities was associated with adherence to 1.3 additional EBPs (95% confidence interval [CI]=0.6, 1.9). High-frequency coaching delivered with high coverage among birth attendants was associated with greater improvements: providing 70% of birth attendants at a facility with at least 1 visit per month was associated with adherence to 2.0 additional EBPs (95% CI=1.0, 2.9). Neither cumulative coaching nor scheduling adherence was associated with EBP adherence. Coaching was generally not associated with health outcomes, possibly due to the small magnitude of association between coaching and EBP adherence.

Conclusions: Frequent coaching may promote behavior change, especially if delivered with high coverage among birth attendants. However, the effects of coaching were modest and did not persist over time, suggesting that future coaching-based interventions should explore providing frequent coaching for longer periods.





Diagnostic Utility and Impact on Clinical Decision Making of Focused Assessment With Sonography for HIV-Associated Tuberculosis in Malawi: A Prospective Cohort Study

Background: The focused assessment with sonography for HIV-associated tuberculosis (TB) (FASH) ultrasound protocol has been increasingly used to help clinicians diagnose TB. We sought to quantify the diagnostic utility of FASH for TB among individuals with HIV in Malawi.

Methods: Between March 2016 and August 2017, 210 adults with HIV who had 2 or more signs and symptoms concerning for TB (fever, cough, night sweats, weight loss) were enrolled from a public HIV clinic in Lilongwe, Malawi. The treating clinicians conducted a history, physical exam, FASH protocol, and additional TB evaluation (laboratory diagnostics and chest radiography) on all participants. The clinician made a final treatment decision based on all available information. At the 6-month follow-up visit, we categorized participants based on clinical outcomes and diagnostic tests as having probable/confirmed TB or unlikely TB; the association of FASH with probable/confirmed TB was calculated using Fisher's exact tests. The impact of FASH on empiric TB treatment was determined by asking the clinicians prospectively whether they would start treatment at 2 time points in the baseline visit: (1) after the initial history and physical exam; and (2) after the history, physical exam, and FASH protocol.

Results: A total of 181 participants underwent final analysis, of whom 56 were categorized as probable/confirmed TB and 125 as unlikely TB. The FASH protocol was positive in 71% (40/56) of participants with probable/confirmed TB compared to 24% (30/125) of participants with unlikely TB (odds ratio=7.9, 95% confidence interval=3.9, 16.1; P<.001). Among those classified as confirmed/probable TB, FASH increased the likelihood of empiric TB treatment before obtaining any other diagnostic studies from 9% (5/56) to 46% (26/56) at the point of care. For those classified as unlikely TB, FASH increased the likelihood of empiric treatment from 2% to 4%.

Conclusion: In the setting of HIV coinfection in Malawi, FASH can be a helpful tool that augments the clinician's ability to make a timely diagnosis of TB.
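The reported odds ratio can be reproduced from the 2×2 counts given in the abstract (40/56 FASH-positive with probable/confirmed TB vs. 30/125 with unlikely TB). The Woolf log-based interval below is an assumption about the CI method, though it matches the reported 3.9–16.1:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf (log-based) 95% CI.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Counts from the abstract: FASH positive in 40/56 probable/confirmed TB
# participants vs. 30/125 unlikely TB participants.
print(odds_ratio(40, 16, 30, 95))
```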





A Qualitative Assessment of Provider and Client Experiences With 3- and 6-Month Dispensing Intervals of Antiretroviral Therapy in Malawi

Introduction: Multimonth dispensing (MMD) of antiretroviral therapy (ART) is a differentiated model of care that can help overcome health system challenges and reduce the burden of HIV care on clients. Although 3-month dispensing has been the standard of care, interest has increased in extending refill intervals to 6 months. We explored client and provider experiences with MMD in Malawi as part of a cluster randomized trial evaluating 3- versus 6-month ART dispensing.

Methods: Semi-structured in-depth interviews were conducted with 17 ART providers and 62 stable, adult clients with HIV on ART. Clients and providers were evenly divided by arm and were eligible for an interview if they had been participating in the study for 1 year (clients) or 6 months (providers). Questions focused on perceived challenges and benefits of the 3- or 6-month ART dispensing amount. Interviews were transcribed, and data were coded and analyzed using constant comparison.

Results: Both clients and providers reported that the larger medication supply had benefits. Clients reported decreased costs due to less frequent travel to the clinic and increased time for income-generating activities. Clients in the 6-month dispensing arm reported a greater sense of personal freedom and normalcy. Providers felt that the 6-month dispensing interval reduced their workload. They also expressed concern about clients' challenges with ART storage at home, but clients reported no storage problems. Although providers mentioned the potential risk of clients sharing the larger medication supply with family or friends, clients emphasized the value of ART and reported only rare, short-term sharing, mostly with their spouses. Providers also mentioned clients' lack of motivation to seek care for illnesses that might occur between refill appointments.

Conclusions: The 6-month ART dispensing arm was particularly beneficial to clients for decreased costs, increased time for income generation, and a greater sense of normalcy. Providers' concerns about storage, sharing, and return visits to the facility did not emerge in client interviews. Further data are needed on the feasibility of implementing a large-scale program with 6-month dispensing.





Erratum. Ten-Year Outcome of Islet Alone or Islet After Kidney Transplantation in Type 1 Diabetes: A Prospective Parallel-Arm Cohort Study. Diabetes Care 2019;42:2042-2049





Sex Disparities in Cardiovascular Outcome Trials of Populations With Diabetes: A Systematic Review and Meta-analysis

BACKGROUND

Sex differences have been described in diabetes cardiovascular outcome trials (CVOTs).

PURPOSE

We systematically reviewed for baseline sex differences in cardiovascular (CV) risk factors and CV protection therapy in diabetes CVOTs.

DATA SOURCES

Randomized placebo-controlled trials examining the effect of diabetes medications on major adverse cardiovascular events in people ≥18 years of age with type 2 diabetes.

STUDY SELECTION

Included trials reported baseline sex-specific CV risks and use of CV protection therapy.

DATA EXTRACTION

Two reviewers independently abstracted study data.

DATA SYNTHESIS

We included five CVOTs with 46,606 participants. We summarized sex-specific data using mean differences (MDs) and relative risks (RRs) and pooled estimates using random effects meta-analysis. There were fewer women than men in the included trials (28.5–35.8% women). Women more often had stroke (RR 1.28; 95% CI 1.09, 1.50), heart failure (RR 1.30; 95% CI 1.21, 1.40), and chronic kidney disease (RR 1.33; 95% CI 1.17, 1.51). They less often used statins (RR 0.90; 95% CI 0.86, 0.93), aspirin (RR 0.82; 95% CI 0.71, 0.95), and β-blockers (RR 0.93; 95% CI 0.88, 0.97), and they had higher systolic blood pressure (MD 1.66 mmHg; 95% CI 0.90, 2.41), LDL cholesterol (MD 0.34 mmol/L; 95% CI 0.29, 0.39), and hemoglobin A1c (MD 0.11%; 95% CI 0.09, 0.14 [1.2 mmol/mol; 1.0, 1.5]) than men.
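The random-effects pooling used to combine the sex-specific relative risks can be sketched with a DerSimonian-Laird estimator. The per-study log relative risks and variances below are hypothetical, for illustration only:

```python
import math

def pool_random_effects(log_effects, variances):
    """DerSimonian-Laird random-effects pooling of study effects.

    log_effects: per-study effect estimates on the log scale (e.g. log RR).
    variances:   per-study variances of those log effects.
    Returns the pooled effect and 95% CI back-transformed to the ratio
    scale, plus the between-study variance tau^2.
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, log_effects)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird estimate of tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights fold tau^2 into each study's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se),
            tau2)

# Hypothetical per-study log relative risks of stroke (women vs men)
log_rrs = [math.log(1.20), math.log(1.35), math.log(1.28)]
variances = [0.010, 0.015, 0.008]
rr, lo, hi, tau2 = pool_random_effects(log_rrs, variances)
```

With these particular inputs Cochran's Q falls below its degrees of freedom, so tau² is truncated to zero and the pooled estimate coincides with the fixed-effect one.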

LIMITATIONS

We could not carry out subgroup analyses due to the small number of studies. Our findings are not generalizable to low CV risk groups or to patients in routine care.

CONCLUSIONS

There were baseline sex disparities in diabetes CVOTs. We suggest efforts to recruit women into trials and promote CV management across the sexes.





Effects of Continuous Glucose Monitoring on Metrics of Glycemic Control in Diabetes: A Systematic Review With Meta-analysis of Randomized Controlled Trials

BACKGROUND

Continuous glucose monitoring (CGM) provides important information to aid in achieving glycemic targets in people with diabetes.

PURPOSE

We performed a meta-analysis of randomized controlled trials (RCTs) comparing CGM with usual care for parameters of glycemic control in both type 1 and type 2 diabetes.

DATA SOURCES

Several electronic databases were searched for articles published from inception until 30 June 2019.

STUDY SELECTION

We selected RCTs that assessed both changes in HbA1c and time in target range (TIR), together with time below range (TBR), time above range (TAR), and glucose variability expressed as coefficient of variation (CV).

DATA EXTRACTION

Data were extracted from each trial by two investigators.

DATA SYNTHESIS

All results were analyzed by a random effects model to calculate the weighted mean difference (WMD) with the 95% CI. We identified 15 RCTs, lasting 12–36 weeks and involving 2,461 patients. Compared with usual care (overall data), CGM was associated with a modest reduction in HbA1c (WMD –0.17%, 95% CI –0.29 to –0.06, I2 = 96.2%), an increase in TIR (WMD 70.74 min, 95% CI 46.73–94.76, I2 = 66.3%), and lower TAR, TBR, and CV, with heterogeneity between studies. The increase in TIR was significant and robust independently of diabetes type, method of insulin delivery, and reason for CGM use. In preplanned subgroup analyses, real-time CGM led to greater improvement in mean HbA1c (WMD –0.23%, 95% CI –0.36 to –0.10, P < 0.001), TIR (WMD 83.49 min, 95% CI 52.68–114.30, P < 0.001), and TAR, whereas both intermittently scanned CGM and sensor-augmented pumps were associated with a greater decline in TBR.
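The CGM metrics pooled above (TIR, TBR, TAR, CV) are straightforward to compute from raw sensor readings. A minimal sketch using the consensus target range of 3.9–10.0 mmol/L and made-up readings:

```python
import statistics

def cgm_metrics(glucose_mmol):
    """Compute core CGM metrics from a list of glucose readings (mmol/L):
    time in range 3.9-10.0, time below 3.9, and time above 10.0, each as a
    percentage of readings, plus the coefficient of variation (SD/mean * 100).
    Assumes readings are equally spaced, so reading counts proxy for time."""
    n = len(glucose_mmol)
    tir = 100.0 * sum(3.9 <= g <= 10.0 for g in glucose_mmol) / n
    tbr = 100.0 * sum(g < 3.9 for g in glucose_mmol) / n
    tar = 100.0 * sum(g > 10.0 for g in glucose_mmol) / n
    cv = 100.0 * statistics.stdev(glucose_mmol) / statistics.mean(glucose_mmol)
    return {"TIR": tir, "TBR": tbr, "TAR": tar, "CV": cv}

# Hypothetical equally spaced sensor readings over part of a day
readings = [5.6, 7.2, 10.4, 3.5, 8.8, 12.1, 6.0, 9.3, 4.1, 5.9]
m = cgm_metrics(readings)
```

By construction the three time-in-band percentages always sum to 100.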

LIMITATIONS

Heterogeneity was high for most of the study outcomes; all studies were sponsored by industry, had short duration, and used an open-label design.

CONCLUSIONS

CGM improves glycemic control by expanding TIR and decreasing TBR, TAR, and glucose variability in both type 1 and type 2 diabetes.





Evaluation of Factors Related to Glycemic Management in Professional Cyclists With Type 1 Diabetes Over a 7-Day Stage Race

OBJECTIVE

To investigate factors related to glycemic management among members of a professional cycling team with type 1 diabetes over a 7-day Union Cycliste Internationale World Tour stage race.

RESEARCH DESIGN AND METHODS

An observational evaluation of possible factors related to glycemic management and performance in six male professional cyclists with type 1 diabetes (HbA1c 6.4 ± 0.6%) during the 2019 Tour of California.

RESULTS

In-ride time spent in euglycemia (3.9–10.0 mmol/L glucose) was 63 ± 11%, with a low percentage of time spent in level 1 (3.0–3.9 mmol/L; 0 ± 1% of time) and level 2 (<3.0 mmol/L; 0 ± 0% of time) hypoglycemia over the 7-day race. Riders spent 25 ± 9% of time in level 1 (10.1–13.9 mmol/L) and 11 ± 9% in level 2 (>13.9 mmol/L) hyperglycemia during races. Bolus insulin use was uncommon during races, despite high carbohydrate intake (76 ± 23 g ⋅ h–1). Overnight, the riders spent progressively more time in hypoglycemia from day 1 (6 ± 12% in level 1 and 0 ± 0% in level 2) to day 7 (12 ± 12% in level 1 and 2 ± 4% in level 2) (χ2[1] > 4.78, P < 0.05).

CONCLUSIONS

Professional cyclists with type 1 diabetes maintained excellent in-race glycemia but experienced significant overnight hypoglycemia during recovery throughout the 7-day stage race.





Diabetes, Cognitive Decline, and Mild Cognitive Impairment Among Diverse Hispanics/Latinos: Study of Latinos-Investigation of Neurocognitive Aging Results (HCHS/SOL)

OBJECTIVE

Hispanics/Latinos are the largest ethnic/racial group in the U.S., have the highest prevalence of diabetes, and are at increased risk for neurodegenerative disorders. Currently, little is known about the relationship between diabetes and cognitive decline and disorders among diverse Hispanics/Latinos. The purpose of this study is to clarify these relationships in diverse middle-aged and older Hispanics/Latinos.

RESEARCH DESIGN AND METHODS

The Study of Latinos–Investigation of Neurocognitive Aging (SOL-INCA) is an ancillary study of the Hispanic Community Health Study/Study of Latinos (HCHS/SOL). HCHS/SOL is a multisite (Bronx, NY; Chicago, IL; Miami, FL; and San Diego, CA), probability-sampled (i.e., representative of targeted populations), and prospective cohort study. Between 2016 and 2018, SOL-INCA enrolled diverse Hispanics/Latinos aged ≥50 years (n = 6,377). Global cognitive decline and mild cognitive impairment (MCI) were the primary outcomes.

RESULTS

Prevalent diabetes at visit 1, but not incident diabetes at visit 2, was associated with significantly steeper global cognitive decline (βGC = –0.16 [95% CI –0.25, –0.07]; P < 0.001), domain-specific cognitive decline, and higher odds of MCI (odds ratio 1.74 [95% CI 1.34, 2.26]; P < 0.001) compared with no diabetes in age- and sex-adjusted models.

CONCLUSIONS

Diabetes was associated with cognitive decline and increased MCI prevalence among diverse Hispanics/Latinos, primarily among those with prevalent diabetes at visit 1. Our findings suggest that significant cognitive decline and MCI may be considered additional disease complications of diabetes among diverse middle-aged and older Hispanics/Latinos.





The Impact of Medicaid Expansion on Diabetes Management

OBJECTIVE

Diabetes is a chronic health condition contributing to a substantial burden of disease. According to the Robert Wood Johnson Foundation, 10.9 million people were newly insured by Medicaid between 2013 and 2016. Considering this coverage expansion, the Affordable Care Act (ACA) could significantly affect people with diabetes in their management of the disease. This study evaluates the impact of the Medicaid expansion under the ACA on diabetes management.

RESEARCH DESIGN AND METHODS

This study includes 22,335 individuals with diagnosed diabetes from the 2011 to 2016 Behavioral Risk Factor Surveillance System. It uses a difference-in-differences approach to evaluate the impact of the Medicaid expansion on self-reported access to health care, self-reported diabetes management, and self-reported health status. Additionally, it performs a triple-differences analysis to compare the impact between Medicaid expansion and nonexpansion states considering diabetes rates of the states.
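The core difference-in-differences contrast used here is simple arithmetic: the pre-to-post change among expansion states minus the change among non-expansion states. A minimal sketch with hypothetical mean scores (the numbers below are illustrative, not the study's data):

```python
def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """Canonical 2x2 difference-in-differences: the change in the treated
    (Medicaid expansion) group minus the change in the control
    (non-expansion) group, which nets out the shared time trend."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean self-reported diabetes-management scores
effect = did_estimate(treat_pre=60.0, treat_post=63.0,
                      control_pre=59.0, control_post=60.1)
```

A regression version would add covariates and state and year fixed effects, but the estimand in the simple 2x2 case reduces to exactly this double difference.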

RESULTS

Significant improvements in Medicaid expansion states as compared with non–Medicaid expansion states were evident in self-reported access to health care (0.09 score; P = 0.023), diabetes management (1.91 score; P = 0.001), and health status (0.10 score; P = 0.026). Among states with large populations with diabetes, states that expanded Medicaid reported substantial improvements in these areas in comparison with those that did not expand.

CONCLUSIONS

The Medicaid expansion has significant positive effects on self-reported diabetes management. While states with large diabetes populations that expanded Medicaid have experienced substantial improvements in self-reported diabetes management, non–Medicaid expansion states with high diabetes rates may be facing health inequalities. The findings provide policy implications for the diabetes care community and policy makers.





Medication Adherence During Adjunct Therapy With Statins and ACE Inhibitors in Adolescents With Type 1 Diabetes

OBJECTIVE

Suboptimal adherence to insulin treatment is a main issue in adolescents with type 1 diabetes. However, to date, there are no available data on adherence to adjunct noninsulin medications in this population. Our aim was to assess adherence to ACE inhibitors and statins and explore potential determinants in adolescents with type 1 diabetes.

RESEARCH DESIGN AND METHODS

There were 443 adolescents with type 1 diabetes recruited into the Adolescent Type 1 Diabetes Cardio-Renal Intervention Trial (AdDIT) and exposed to treatment with two oral drugs—an ACE inhibitor and a statin—as well as combinations of both or placebo for 2–4 years. Adherence was assessed every 3 months with the Medication Event Monitoring System (MEMS) and pill count.

RESULTS

Median adherence during the trial was 80.2% (interquartile range 63.6–91.8) based on MEMS and 85.7% (72.4–92.9) for pill count. Adherence based on MEMS and pill count dropped from 92.9% and 96.3%, respectively, at the first visit to 76.3% and 79.0% at the end of the trial. The percentage of study participants with adherence ≥75% declined from 84% to 53%. A good correlation was found between adherence based on MEMS and pill count (r = 0.82, P < 0.001). Factors associated with adherence were age, glycemic control, and country.
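The MEMS-versus-pill-count agreement (r = 0.82) is a plain Pearson correlation, computable as below. The paired adherence percentages here are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired adherence percentages (MEMS vs pill count) for
# five participants; both methods should rank participants similarly
mems = [92.9, 85.0, 80.2, 76.3, 63.6]
pills = [96.3, 88.0, 85.7, 79.0, 72.4]
r = pearson_r(mems, pills)
```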

CONCLUSIONS

We report an overall good adherence to ACE inhibitors and statins during a clinical trial, although there was a clear decline in adherence over time. Older age and suboptimal glycemic control at baseline predicted lower adherence during the trial, and, predictably, reduced adherence was more prevalent in subjects who subsequently dropped out.





Trends in Emergency Department Visits and Inpatient Admissions for Hyperglycemic Crises in Adults With Diabetes in the U.S., 2006-2015

OBJECTIVE

To report U.S. national population-based rates and trends in diabetic ketoacidosis (DKA) and hyperglycemic hyperosmolar state (HHS) among adults, in both the emergency department (ED) and inpatient settings.

RESEARCH DESIGN AND METHODS

We analyzed data from 1 January 2006 through 30 September 2015 from the Nationwide Emergency Department Sample and National Inpatient Sample to characterize ED visits and inpatient admissions with DKA and HHS. We used corresponding year cross-sectional survey data from the National Health Interview Survey to estimate the number of adults ≥18 years with diagnosed diabetes to calculate population-based rates for DKA and HHS in both ED and inpatient settings. Linear trends from 2009 to 2015 were assessed using Joinpoint software.
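Joinpoint-style annual percentage change (APC) comes from a log-linear least-squares fit of rates on calendar year: APC = (e^slope − 1) × 100. A sketch with synthetic rates constructed to grow at exactly 13.5% per year (the rate level itself is made up):

```python
import math

def annual_percent_change(years, rates):
    """Annual percent change from a log-linear least-squares fit, as used
    within each segment of a joinpoint trend analysis:
    ln(rate) = a + b * year, APC = (exp(b) - 1) * 100."""
    n = len(years)
    ys = [math.log(r) for r in rates]
    mx, my = sum(years) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, ys))
             / sum((x - mx) ** 2 for x in years))
    return (math.exp(slope) - 1.0) * 100.0

# Synthetic DKA event rates per 1,000 adults with diabetes, 2009-2015,
# built to grow 13.5% per year so the fit should recover that APC
years = list(range(2009, 2016))
rates = [10.0 * 1.135 ** (y - 2009) for y in years]
apc = annual_percent_change(years, rates)
```

Joinpoint software additionally searches for change points where the slope shifts; within any single segment the calculation reduces to this fit.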

RESULTS

In 2014, there were a total of 184,255 and 27,532 events for DKA and HHS, respectively. The majority of DKA events occurred in young adults aged 18–44 years (61.7%) and in adults with type 1 diabetes (70.6%), while HHS events were more prominent in middle-aged adults aged 45–64 years (47.5%) and in adults with type 2 diabetes (88.1%). Approximately 40% of the hyperglycemic events occurred in lower-income populations. Overall, event rates for DKA significantly increased from 2009 to 2015 in both ED (annual percentage change [APC] 13.5%) and inpatient settings (APC 8.3%). A similar trend was seen for HHS (APC 16.5% in ED and 6.3% in inpatient settings). The increase occurred across all age-groups and in both men and women.

CONCLUSIONS

Causes of increased rates of hyperglycemic events are unknown. More detailed data are needed to investigate the etiology and determine prevention strategies.





Early Childhood Antibiotic Treatment for Otitis Media and Other Respiratory Tract Infections Is Associated With Risk of Type 1 Diabetes: A Nationwide Register-Based Study With Sibling Analysis

OBJECTIVE

The effect of early-life antibiotic treatment on the risk of type 1 diabetes is debated. This study assessed this question, applying a register-based design in children up to age 10 years including a large sibling-control analysis.

RESEARCH DESIGN AND METHODS

All singleton children (n = 797,318) born in Sweden between 1 July 2005 and 30 September 2013 were included and monitored to 31 December 2014. Cox proportional hazards models, adjusted for parental and perinatal characteristics, were applied, and stratified models were used to account for unmeasured confounders shared by siblings.

RESULTS

Type 1 diabetes developed in 1,297 children during the follow-up (median 4.0 years [range 0–8.3]). Prescribed antibiotics in the 1st year of life (23.8%) were associated with an increased risk of type 1 diabetes (adjusted hazard ratio [HR] 1.19 [95% CI 1.05–1.36]), with larger effect estimates among children delivered by cesarean section (P for interaction = 0.016). The association was driven by exposure to antibiotics primarily used for acute otitis media and respiratory tract infections. Further, we found an association of antibiotic prescriptions in pregnancy (22.5%) with type 1 diabetes (adjusted HR 1.15 [95% CI 1.00–1.32]). In general, sibling analysis supported these results, albeit often with statistically nonsignificant associations.

CONCLUSIONS

Dispensed prescription of antibiotics, mainly for acute otitis media and respiratory tract infections, in the 1st year of life is associated with an increased risk of type 1 diabetes before age 10 years, most prominently in children delivered by cesarean section.





The Long-term Effects of Metformin on Patients With Type 2 Diabetic Kidney Disease

OBJECTIVE

Metformin is the first pharmacological option for treating type 2 diabetes. However, the use of this drug is not recommended in individuals with impaired kidney function because of the perceived risk of lactic acidosis. We aimed to assess the efficacy and safety of metformin in patients with type 2 diabetic kidney disease (DKD).

RESEARCH DESIGN AND METHODS

We conducted a retrospective observational cohort study of 10,426 patients with type 2 DKD from two tertiary hospitals. The primary outcomes were all-cause mortality and end-stage renal disease (ESRD) progression. The secondary outcome was metformin-associated lactic acidosis. Taking into account the possibility that patients with less severe disease were prescribed metformin, propensity score matching (PSM) was conducted.
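Propensity score matching as described can be illustrated with a minimal greedy 1:1 nearest-neighbor matcher with a caliper. The scores, the caliper value, and the greedy strategy are illustrative assumptions, not the study's actual matching procedure:

```python
def greedy_psm(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on the propensity score.

    treated, control: dicts mapping patient id -> estimated propensity score.
    Each treated patient (processed in score order for determinism) is
    matched to the closest unused control whose score differs by at most
    `caliper`; patients left unmatched are dropped from the analysis.
    """
    available = dict(control)
    pairs = []
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # each control is used at most once
    return pairs

# Hypothetical propensity scores for metformin users (treated) and nonusers
treated = {"T1": 0.30, "T2": 0.52, "T3": 0.90}
control = {"C1": 0.31, "C2": 0.50, "C3": 0.70}
pairs = greedy_psm(treated, control)
```

Here T3 goes unmatched because no control falls within the caliper, which is exactly how matching trades sample size for baseline comparability.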

RESULTS

All-cause mortality and incident ESRD were lower in the metformin group according to the multivariate Cox analysis. Because the two groups had significantly different baseline characteristics, PSM was performed. After matching, metformin usage was still associated with lower all-cause mortality (adjusted hazard ratio [aHR] 0.65; 95% CI 0.57–0.73; P < 0.001) and ESRD progression (aHR 0.67; 95% CI 0.58–0.77; P < 0.001). Only one event of metformin-associated lactic acidosis was recorded. In both the original and PSM groups, metformin usage did not increase the risk of lactic acidosis events from all causes (aHR 0.92; 95% CI 0.668–1.276; P = 0.629).

CONCLUSIONS

In the present retrospective study, metformin usage in advanced chronic kidney disease (CKD) patients, especially those with CKD 3B, decreased the risk of all-cause mortality and incident ESRD. Additionally, metformin did not increase the risk of lactic acidosis. However, considering the remaining biases even after PSM, further randomized controlled trials are needed to change real-world practice.





Optimization of Metformin in the GRADE Cohort: Effect on Glycemia and Body Weight

OBJECTIVE

We evaluated the effect of optimizing metformin dosing on glycemia and body weight in type 2 diabetes.

RESEARCH DESIGN AND METHODS

This was a prespecified analysis of 6,823 participants in the Glycemia Reduction Approaches in Diabetes: A Comparative Effectiveness Study (GRADE) taking metformin as the sole glucose-lowering drug who completed a 4- to 14-week (mean ± SD 7.9 ± 2.4) run-in in which metformin was adjusted to 2,000 mg/day or a maximally tolerated lower dose. Participants had type 2 diabetes for <10 years and an HbA1c ≥6.8% (51 mmol/mol) while taking ≥500 mg of metformin/day. Participants also received diet and exercise counseling. The primary outcome was the change in HbA1c during run-in.

RESULTS

Adjusted for duration of run-in, the mean ± SD change in HbA1c was –0.65 ± 0.02% (–7.1 ± 0.2 mmol/mol) when the dose was increased by ≥1,000 mg/day, –0.48 ± 0.02% (–5.2 ± 0.2 mmol/mol) when the dose was unchanged, and –0.23 ± 0.07% (–2.5 ± 0.8 mmol/mol) when the dose was decreased (n = 2,169, 3,548, and 192, respectively). Higher HbA1c at entry predicted greater reduction in HbA1c (P < 0.001) in univariate and multivariate analyses. Weight loss adjusted for duration of run-in averaged 0.91 ± 0.05 kg in participants who increased metformin by ≥1,000 mg/day (n = 1,894).
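The paired %-and-mmol/mol values quoted above follow from the standard NGSP-to-IFCC conversion, mmol/mol = (% − 2.15) × 10.929; for a change in HbA1c the constant cancels, so a delta converts by the factor alone:

```python
def hba1c_pct_to_mmol_mol(pct):
    """Convert an absolute HbA1c value from NGSP % to IFCC mmol/mol."""
    return (pct - 2.15) * 10.929

def hba1c_delta_pct_to_mmol_mol(delta_pct):
    """Convert a *change* in HbA1c; the 2.15 offset cancels in a difference."""
    return delta_pct * 10.929

entry_threshold = hba1c_pct_to_mmol_mol(6.8)        # the 6.8% entry criterion
largest_drop = hba1c_delta_pct_to_mmol_mol(-0.65)   # run-in change, dose raised
```

This reproduces the abstract's pairings: 6.8% is 51 mmol/mol, and a –0.65% change is –7.1 mmol/mol.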

CONCLUSIONS

Optimizing metformin to 2,000 mg/day or a maximally tolerated lower dose combined with emphasis on medication adherence and lifestyle can improve glycemia in type 2 diabetes and HbA1c values ≥6.8% (51 mmol/mol). These findings may help guide efforts to optimize metformin therapy among persons with type 2 diabetes and suboptimal glycemic control.





Pre-transplant testosterone and outcome of men after allogeneic stem cell transplantation

Testosterone is an important determinant of endothelial function and vascular health in men. As both factors play a role in mortality after allogeneic stem cell transplantation (alloSCT), we retrospectively evaluated the impact of pre-transplant testosterone levels on outcome in male patients undergoing alloSCT. In the discovery cohort (n=346), an impact on outcome was observed only in the subgroup of patients allografted for acute myeloid leukemia (AML) (n=176, hereafter termed ‘training cohort’). In the training cohort, lower pre-transplant testosterone levels were significantly associated with shorter overall survival (OS) [hazard ratio (HR) for a decrease of 100 ng/dL: 1.11, P=0.045]. This was based on a higher hazard of non-relapse mortality (NRM) (cause-specific HR: 1.25, P=0.013), but not relapse (cause-specific HR: 1.06, P=0.277) in the multivariable models. These findings were replicated in a confirmation cohort of 168 male patients allografted for AML in a different center (OS, HR: 1.15, P=0.012 and NRM, cause-specific HR: 1.23; P=0.008). Next, an optimized cut-off point for pre-transplant testosterone was derived from the training set and evaluated in the confirmation cohort. In multivariable models, low pre-transplant testosterone status (<250 ng/dL) was associated with worse OS (hazard ratio 1.95, P=0.021) and increased NRM (cause-specific HR 2.68, P=0.011) but not with relapse (cause-specific HR: 1.28, P=0.551). Our findings may provide a rationale for prospective studies on testosterone/androgen assessment and supplementation in male patients undergoing alloSCT for AML.





Accuracy of the Ottawa score in risk stratification of recurrent venous thromboembolism in patients with cancer-associated venous thromboembolism: a systematic review and meta-analysis

In patients with cancer-associated venous thromboembolism, knowledge of the estimated rate of recurrent events is important for clinical decision-making regarding anticoagulant therapy. The Ottawa score is a clinical prediction rule designed for this purpose, stratifying patients according to their risk of recurrent venous thromboembolism during the first six months of anticoagulation. We conducted a systematic review and meta-analysis of studies validating the Ottawa score in either its original or modified version. Two investigators independently reviewed the relevant articles published from 1st June 2012 to 15th December 2018 and indexed in MEDLINE and EMBASE. Nine eligible studies were identified; these included a total of 14,963 patients. The original score classified 49.3% of the patients as high-risk, with a sensitivity of 0.7 [95% confidence interval (CI): 0.6-0.8] and a 6-month pooled rate of recurrent venous thromboembolism of 18.6% (95% CI: 13.9-23.9). In the low-risk group, the recurrence rate was 7.4% (95% CI: 3.4-12.5). The modified score classified 19.8% of the patients as low-risk, with a sensitivity of 0.9 (95% CI: 0.4-1.0) and a 6-month pooled rate of recurrent venous thromboembolism of 2.2% (95% CI: 1.6-2.9). In the high-risk group, the recurrence rate was 10.2% (95% CI: 6.4-14.6). Limitations of our analysis include heterogeneity in the type and dosing of anticoagulant therapy across studies. We conclude that new therapeutic strategies are needed in patients at high risk of recurrent cancer-associated venous thromboembolism. Low-risk patients, as per the modified score, could be good candidates for oral anticoagulation. (This systematic review was registered with the International Prospective Registry of Systematic Reviews as PROSPERO CRD42018099506.)
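For illustration, the original Ottawa score can be coded from its commonly reported components (female sex +1, lung cancer +1, prior VTE +1, breast cancer −1, TNM stage I −2; a total of 0 or less classifies as low risk). These components are recalled from the wider literature, not stated in this abstract, so treat them as an assumption:

```python
def ottawa_score(female, lung_cancer, breast_cancer, stage_i, prior_vte):
    """Original Ottawa score as commonly reported (components assumed here,
    not taken from the abstract): female sex +1, lung cancer +1, prior
    VTE +1, breast cancer -1, TNM stage I -2. A total score <= 0 is
    classified as low risk of recurrent VTE on anticoagulation."""
    score = 0
    score += 1 if female else 0
    score += 1 if lung_cancer else 0
    score += 1 if prior_vte else 0
    score -= 1 if breast_cancer else 0
    score -= 2 if stage_i else 0
    return score, ("high" if score >= 1 else "low")

# Hypothetical patient: woman with advanced lung cancer, no prior VTE
score, risk = ottawa_score(female=True, lung_cancer=True,
                           breast_cancer=False, stage_i=False,
                           prior_vte=False)
```

Modified versions of the score adjust the components or cut points, which is why the review reports the two variants separately.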