
Detecting and Monitoring Porcine Hemagglutinating Encephalomyelitis Virus, an Underresearched Betacoronavirus

ABSTRACT

Members of family Coronaviridae cause a variety of diseases in birds and mammals. Porcine hemagglutinating encephalomyelitis virus (PHEV), a lesser-researched coronavirus, can infect naive pigs of any age, but clinical disease is observed in pigs ≤4 weeks of age. No commercial PHEV vaccines are available, and neonatal protection from PHEV-associated disease is presumably dependent on lactogenic immunity. Although subclinical PHEV infections are thought to be common, PHEV ecology in commercial swine herds is unknown. To begin to address this gap in knowledge, a serum IgG antibody enzyme-linked immunosorbent assay (ELISA) based on the S1 protein was developed and evaluated on known-status samples and then used to estimate PHEV seroprevalence in U.S. sow herds. Assessment of the diagnostic performance of the PHEV S1 ELISA using serum samples (n = 924) collected from 7-week-old pigs (n = 84; 12 pigs per group) inoculated with PHEV, porcine epidemic diarrhea virus, transmissible gastroenteritis virus, porcine respiratory coronavirus, or porcine deltacoronavirus showed that a sample-to-positive cutoff value of ≥0.6 was both sensitive and specific, i.e., all PHEV-inoculated pigs were seropositive from days postinoculation 10 to 42, and no cross-reactivity was observed in samples from other groups. The PHEV S1 ELISA was then used to estimate PHEV seroprevalence in U.S. sow herds (19 states) using 2,756 serum samples from breeding females (>28 weeks old) on commercial farms (n = 104) with no history of PHEV-associated disease. The overall seroprevalence was 53.35% (confidence interval [CI], ±1.86%), and herd seroprevalence was 96.15% (CI, ±3.70%).
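The abstract does not state how these confidence intervals were computed, but a standard normal-approximation (Wald) binomial interval reproduces both figures exactly; the sketch below assumes that method (the function name `wald_ci_halfwidth` is ours):

```python
import math

def wald_ci_halfwidth(p, n, z=1.96):
    """Half-width of a 95% normal-approximation (Wald) binomial CI."""
    return z * math.sqrt(p * (1 - p) / n)

# Animal-level: 53.35% seropositive among 2,756 breeding-female sera.
animal = wald_ci_halfwidth(0.5335, 2756)

# Herd-level: 96.15% of 104 herds with at least one seropositive sample.
herd = wald_ci_halfwidth(0.9615, 104)

print(round(animal * 100, 2))  # 1.86, matching the reported ±1.86%
print(round(herd * 100, 2))    # 3.7, matching the reported ±3.70%
```

For proportions near 0 or 1 and modest n (e.g. the 96.15% herd figure), a Wilson or exact (Clopper-Pearson) interval would be more robust than the Wald form assumed here.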

IMPORTANCE There is a paucity of information concerning the ecology of porcine hemagglutinating encephalomyelitis virus (PHEV) in commercial swine herds. This study provided evidence that PHEV infection is endemic and highly prevalent in U.S. swine herds. These results raised questions for future studies regarding the impact of endemic PHEV on swine health and the mechanisms by which this virus circulates in endemically infected populations. Regardless, the availability of the validated PHEV S1 enzyme-linked immunosorbent assay (ELISA) provides the means for swine producers to detect and monitor PHEV infections, confirm prior exposure to the virus, and evaluate the immune status of breeding herds.





Molar element ratio analysis of lithogeochemical data: a toolbox for use in mineral exploration and mining

Molar element ratio analysis of lithogeochemical data comprises four basic tools that provide substantial insight into the lithogeochemistry (and mineralogy) of the rocks under examination: (1) conserved element ratio analysis; (2) Pearce element ratio analysis; (3) general element ratio analysis; and (4) lithogeochemical mineral mode analysis. Conserved element ratio analysis is useful in creating a chemostratigraphic model for the host rocks to mineral deposits, whereas Pearce element ratio analysis and general element ratio analysis are primarily used to identify mineralogical and metasomatic controls on rock compositions and to investigate and quantify the extent of the material transfers that formed the host rocks and mineralization. Lithogeochemical mineral mode analysis converts element concentrations into mineral concentrations using a matrix-based change-of-basis operation, allowing lithogeochemical data to be interpreted in terms of mineral modes. It can be used to provide proper names to rocks, an important activity for an exploration geologist because of the implications that rock names have for genetic processes and mineral deposit models.
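The change-of-basis operation described above can be sketched as a small linear inversion. The minerals, stoichiometric matrix and rock composition below are hypothetical illustration values, not data from the paper:

```python
import numpy as np

# Columns: quartz, albite, K-feldspar (hypothetical mineral basis).
# Rows: moles of Si, Al, Na, K per formula unit of each mineral.
A = np.array([
    [1.0, 3.0, 3.0],   # Si
    [0.0, 1.0, 1.0],   # Al
    [0.0, 1.0, 0.0],   # Na
    [0.0, 0.0, 1.0],   # K
])

# Molar element concentrations of one rock sample (same element order).
b = np.array([2.0, 0.5, 0.3, 0.2])

# Change of basis: least-squares solve for molar mineral modes.
modes, *_ = np.linalg.lstsq(A, b, rcond=None)
print(modes)  # ~[0.5, 0.3, 0.2]
```

With more elements than minerals the system is overdetermined, and the least-squares residual gives a check on whether the assumed mineral assemblage actually explains the rock's chemistry.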

This paper provides a review of the theoretical foundations of each of these four tools and then illustrates how these techniques have been used in a variety of exploration applications to assist in the search for, evaluation and planning of, and the mining of mineral deposits. Examples include the evaluation of total digestion lithogeochemical datasets from mineral deposits hosted by igneous and sedimentary rocks and formed by hydrothermal and igneous processes. In addition, this paper illustrates a more recent geometallurgical application of these methods, whereby the mineral proportions determined by lithogeochemical mineral mode analysis are used to predict rock properties and obtain the ore body knowledge critical for resource evaluation, mine planning, mining and mine remediation.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





State-of-the-art analysis of geochemical data for mineral exploration

Multi-element geochemical surveys of rocks, soils, stream/lake/floodplain sediments and regolith are typically carried out at continental, regional and local scales. The chemistry of these materials is defined by their primary mineral assemblages and their subsequent modification by comminution and weathering. Modern geochemical datasets represent a multi-dimensional geochemical space that can be studied using multivariate statistical methods, from which patterns reflecting geochemical/geological processes can be identified (process discovery). These patterns form the basis from which probabilistic predictive maps are created (process validation). Processing geochemical survey data requires a systematic approach to effectively interpret the multi-dimensional data in a meaningful way. Problems typically associated with geochemical data include closure, missing values, censoring, merging, levelling different datasets and adequate spatial sample design. Recent developments in advanced multivariate analytics, geospatial analysis and mapping provide an effective framework to analyse and interpret geochemical datasets. Geochemical and geological processes can often be recognized through the use of data discovery procedures such as the application of principal component analysis. Classification and predictive procedures can be used to confirm lithological variability, alteration and mineralization. Geochemical survey data of lake/till sediments from Canada and of floodplain sediments from Australia show that predictive maps of bedrock and regolith processes can be generated. Upscaling a multivariate statistics-based prospectivity analysis for arc-related Cu–Au mineralization from a regional survey in the southern Thomson Orogen in Australia to the continental scale reveals a number of regions with a similar (or stronger) multivariate response, and hence potentially similar (or higher) mineral potential, throughout Australia.
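As a minimal sketch of the process-discovery step, assuming a closed compositional dataset: a centred log-ratio (clr) transform addresses the closure problem noted above before principal components are extracted. The toy data here are random placeholders, not survey data:

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform to open closed compositional data."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

# Toy survey: 6 samples x 4 elements, each row closed to 100%.
rng = np.random.default_rng(0)
raw = rng.lognormal(size=(6, 4))
X = 100 * raw / raw.sum(axis=1, keepdims=True)

Z = clr(X)
Zc = Z - Z.mean(axis=0)              # centre each element (column)
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
scores = U * s                       # principal component scores
explained = s**2 / np.sum(s**2)      # proportion of variance per PC
```

Mapping the leading component scores spatially is what turns these statistical patterns into the predictive process maps described above.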

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Advances in the use of isotopes in geochemical exploration: instrumentation and applications in understanding geochemical processes

Among the emerging techniques for detecting the real footprint of buried ore deposits is isotope tracing. Novel instrumentation and automated preparation systems, such as continuous flow isotope ratio mass spectrometry, off-axis integrated cavity output spectroscopy for isotopic compositions of selected molecules, multi-collector inductively coupled plasma mass spectrometry (ICP-MS), triple quadrupole ICP-MS, laser ablation ICP-MS, and a multitude of inline preparation systems, have facilitated the use of isotopes as tracers in mineral exploration, as the cost of isotope analyses has decreased and the time required for them has shortened. In addition, the isotope systems being used have expanded beyond the traditional light stable and Pb isotopes to include a multitude of elements that behave differently during processes that promote the mobilization of elements during both primary and secondary dispersion. Isotopes are also being used to understand barren areas that lack a critical process needed to form an ore deposit and to reveal precise redox mechanisms. The goal is to be able to use isotopes to reflect a definitive process that occurs in association with the deposit and not in barren systems, and then to relate these signals to something that is easier to measure, namely elemental concentrations. As new generations of exploration and environmental scientists become more comfortable with the application of isotopes to trace processes in geoscience, and new technologies for rapid and inexpensive isotope analyses are continually being developed, novel applications of isotope tracing are becoming more mainstream.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Parts per trillion (ppt) gold in groundwater: can we believe it, what is anomalous and how do we use it?

There is a pressing need for new exploration tools to target and vector towards mineralization in covered terrains. Groundwater provides a valuable and under-utilized geochemical sampling medium, and represents an important and cost-effective tool to expose covered terrains to systematic exploration. For Au exploration, researchers agree that the best hydrogeochemical pathfinder is dissolved Au itself, with additional potential from other (albeit non-unique) pathfinders such as As, Ag, W and Mo. Despite Au's relatively low solubility, with rigorous field protocols and appropriate analytical methods explorers can measure dissolved Au directly with robust parts per trillion (ppt)-level analyses.

Even with ppt-level analyses, a practical implication of Au's low solubility is that a deposit's dissolved Au signature is generally weaker than seen in other more mobile pathfinders, producing a smaller detectable footprint, which must be considered when designing exploration programmes. Using purpose-drilled groundwater sampling bores, explorers can collect groundwater samples at the density required to respond to dissolved Au where existing borehole coverage is otherwise insufficient. In addition to its use at the regional scale, with even tighter sample density, hydrogeochemistry also shows promise at the project scale, allowing the 3D modelling of pathfinder dispersion.

For hydrogeochemistry to be widely adopted for Au exploration, explorers need confidence in ppt-level dissolved Au analyses, and the context to understand their significance. This paper aims to address these topics and provide a straightforward starting point for Au explorers interested in applying hydrogeochemistry by: (i) summarizing examples of regional sampling programmes and more focused case studies to illustrate how covered Au deposits create measurable dissolved Au footprints distinguishable from background; and (ii) sharing examples of dissolved Au analyses that are being integrated into exploration at the regional and project scales.

As seen in the results, the distributions of dissolved Au in the regional- and project-scale programmes show remarkably similar, easy-to-interpret high-contrast, low-frequency anomalies against relatively low backgrounds. These are desirable attributes of any geochemical pathfinder. When combined with the benefits of hydrogeochemistry v. other geochemical exploration tools (e.g. groundwater can create larger footprints requiring fewer samples to detect, and groundwater can recharge from depth to reflect deeper mineralization), dissolved Au is a powerful pathfinder ideally suited for Au exploration in covered terrains.

While this paper focuses on the use of dissolved Au, additional pathfinders can provide valuable information, including indications of lithological changes, hydrothermal alteration and different styles of mineralization, as well as opportunities to use secondary pathfinders when sample density or local conditions may not result in detectable dissolved Au signatures.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Recent advances in the application of mineral chemistry to exploration for porphyry copper-gold-molybdenum deposits: detecting the geochemical fingerprints and footprints of hypogene mineralization and alteration

In the past decade, significant research efforts have been devoted to mineral chemistry studies to assist porphyry exploration. These activities can be divided into two major fields of research: (1) porphyry indicator minerals (PIMs), which are used to identify the presence of, or potential for, porphyry-style mineralization based on the chemistry of magmatic minerals such as zircon, plagioclase and apatite, or resistate hydrothermal minerals such as magnetite; and (2) porphyry vectoring and fertility tools (PVFTs), which use the chemical compositions of hydrothermal minerals such as epidote, chlorite and alunite to predict the likely direction and distance to mineralized centres, and the potential metal endowment of a mineral district. This new generation of exploration tools has been enabled by advances in and increased access to laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), short-wavelength infrared (SWIR), visible near-infrared (VNIR) and hyperspectral technologies. PIMs and PVFTs show considerable promise for exploration and are starting to be applied to the diversity of environments that host porphyry and epithermal deposits globally. Industry has consistently supported the development of these tools and, in the case of PVFTs, has been encouraged by several successful blind tests in which deposit centres were predicted from distal propylitic settings. Industry adoption is steadily increasing but is restrained by a lack of the necessary analytical equipment and expertise in commercial laboratories, and also by the ongoing reliance on well-established geochemical exploration techniques (e.g. sediment, soil and rock chip sampling) that have aided the discovery of near-surface resources over many decades, but are now proving less effective in the search for deeply buried mineral resources and for those concealed under cover.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Advances in ICP-MS technology and the application of multi-element geochemistry to exploration

There have been several advances in inductively coupled plasma mass spectrometry (ICP-MS) analytical technologies in the last decade. Collision/reaction cell ICP-MS and triple quadrupole ICP-MS techniques can produce lower detection limits for select elements that experience interferences with a standard quadrupole (e.g. Se and As). Triple quadrupole ICP-MS, in particular, can eliminate virtually all polyatomic or isobaric interferences for highly accurate measurements of certain isotope systems of great interest in mineral exploration, namely Pb/Pb. Laser ablation ICP-MS has become more popular as an effective analytical tool to measure mineral grain trace elements, which could assist in vectoring to mineralization or exploration drill targets. The ablation of a spot on a Li-borate fused glass disk paired with XRF analysis has also gained popularity as an alternative to total whole rock characterization packages that employ several separate digestions and analytical methods. While there have been several advancements in ICP-MS technologies in exploration geochemistry, they have not been widely accepted or implemented. This slow adoption could be due to the extended recession in the mining industry between 2012 and 2017. It is also possible that standard ICP-MS data (i.e. no collision/reaction cell) are still fit for purpose. This stands in stark contrast to the implementation of ICP-MS in the previous decade (1997–2007), which was transformational for the industry.

Consideration of all elements from large multi-element ICP-MS analytical suites for mineral exploration can be an extremely powerful tool in the exploration toolkit. The discovery of the White Gold District, Yukon, is a prime example of how the utilization of soil geochemical data, when plotted spatially, can vector to gold mineralization. The presence of Au + As + Sb soil anomalies was key to delineating mineralization, especially when accompanied by publicly available geological, geographical and geophysical data. Additionally, elements and element ratios not typically considered in Au exploration, including Ni and U, were utilized to determine the lithological and structural controls on mineralization. The availability of multi-element ICP-MS data was also useful in the discovery of the Cascadero Copper Taron Caesium deposit. Ore-grade Cs was discovered only because Cs was included in the multi-element ICP-MS exploration geochemistry suite. Before the availability of ICP-MS, it is unlikely that this deposit would have been discovered.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Advances in exploration geochemistry, 2007 to 2017 and beyond

Mineral exploration under relatively young, exotic cover still presents a major challenge to discovery. Advances and future developments can be categorized into four key areas: (1) understanding metal mobility and mechanisms; (2) rapid geochemical analyses; (3) data access, integration and interoperability; and (4) innovation in laboratory-based methods.

Application of ‘regolith-style’ surface mapping in covered terrains outside of conventional lateritic terrains is achieving success in terms of reducing background noise and improving geochemical contrasts. However, process models for anomaly generation are still uncertain and require further research. The interaction between the surface environment, microbes, hydrocarbons and chemistry is receiving greater attention. While significant progress has been achieved in understanding the role of vegetation, interaction with the water table and cycling of metals in the near-surface environment in Australia, other regions of the world, for example, the till-covered terrains in the northern hemisphere and arid colluvium-covered areas of South America, have seen less progress. In addition to vegetation, the influence of bacteria, fungi and invertebrates is not as well studied with respect to metal mobilization in cover. Field-portable XRF has become a standard field technique, though it is more often used in a camp setting. Apart from incremental gains in analytical quality, instruments have probably reached their peak of analytical development, with add-ons such as cameras, beam-limiters, wireless transmission and GPS now the main differences between instrument suppliers. Their future rests in automated application in unconventional configurations, for example, core scanning, and in better integration of analytical data with other information such as spectral analyses. Pattern drilling, which persists in industry, has nevertheless benefited from innovative application of field-portable tools along with rock and mineral chemistry to provide near real-time results and assist in a shift toward more flexible and targeted drilling in greenfields settings.

Innovation in the laboratory continues to progress. More selective geochemical analyses, imaging of fine particle-size fractions and resistate mineral phases, and isotope analysis are faster and more accessible than ever before. The application of genomics (and its data analysis) as a mineral exploration tool is on the horizon. A continuing problem in geoscience, the supply to industry of suitably trained geochemists, persists, although some needs, particularly at junior level, will be met by recent initiatives at various universities at graduate level. Unfortunately, the current economic climate has had a significant impact on R&D and on the retention of geochemistry skills by the industry. Whilst the future is positive, significant investment is required to develop the next generation of geochemical exploration tools and concepts.

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Introduction to the thematic collection: Exploration Geochemistry at Exploration 17, October 21-25, 2017, Toronto

Thematic collection: This article is part of the Exploration 17 collection available at: https://www.lyellcollection.org/cc/exploration-17





Platelet Disorders

After vascular injury and exposure of subendothelial matrix proteins to the intravascular space, mediators of hemostasis are triggered and allow for clot formation and restoration of vascular integrity. Platelets are the mediators of primary hemostasis, creating a platelet plug and allowing for initial cessation of bleeding. Platelet disorders, qualitative and quantitative, may result in bleeding signs and symptoms, particularly mucocutaneous bleeding such as epistaxis, bruising, petechiae, and heavy menstrual bleeding. Increasing evidence suggests that platelets have functional capabilities beyond hemostasis, but this review focuses solely on platelet hemostatic properties. Herein, normal platelet function as well as the effects of abnormal function and thrombocytopenia are reviewed.





Professionalism





Visual Diagnosis: A Case of Stretchy Skin and Vascular Abnormalities





Advancing Biologics Development Programs with Legacy Cell Lines: Advantages and Limitations of Genetic Testing for Addressing Clonality Concerns Prior to Availability of Late Stage Process and Product Consistency Data

The bioprocessing industry uses recombinant mammalian cell lines to generate therapeutic biologic drugs. To ensure consistent product quality of the therapeutic proteins, it is imperative to have a controlled production process. Regulatory agencies and the biotechnology industry consider cell line "clonal origin" an important aspect of maintaining process control. Demonstration of clonal origin of the cell substrate, or production cell line, has received considerable attention in the past few years, and the industry has improved methods and devised standards to increase the probability and/or assurance of clonal derivation. However, older production cell lines developed before the implementation of these methods, herein referred to as "legacy cell lines," may not meet current regulatory expectations for demonstration of clonal derivation. In this article, the members of the IQ Consortium Working Group on Clonality present our position that the demonstration of process consistency and product comparability of critical quality attributes throughout the development life cycle should be sufficient to approve a license application without additional genetic analysis to support clonal origin, even for legacy cell lines that may not meet current day clonal derivation standards. With this commentary, we discuss advantages and limitations of genetic testing methods to support clonal derivation of legacy cell lines and wish to promote a mutual understanding with the regulatory authorities regarding their optional use during early drug development, subsequent to Investigational New Drug (IND) application and before demonstration of product and process consistency at Biologics License Application (BLA) submission.





Disinfectant Efficacy: Understanding the Expectations and How to Design Effective Studies That Include Leveraging Multi-Site Data to Drive an Efficient Program

For manufacturers of both sterile and nonsterile pharmaceuticals, there is an expectation that the manufacturing process is performed in a manner that prevents extraneous contamination so that the products are provided in a safe, integral, pure, and unadulterated form. As part of that process, cleaning and disinfection are an absolute necessity. Although cleaning and disinfection support control of microbial contamination through preventive and corrective action, specific compendia methods do not currently exist. The intent of this paper is to provide a general guidance on how to perform disinfectant efficacy validation and implementation. This includes how to make sure the concepts are understood, how to interpret facility data and utilize it to demonstrate control awareness for your facilities, and how to leverage the data to reduce redundancies in validation or verification. This paper represents the thoughts and best practices of the authoring team and their respective companies and provides an efficient way to qualify disinfectants without impacting the quality of the study. If you choose to follow the recommendations in this paper, you must ensure that the appropriate rationale is sound and the scientific data is documented. It is the belief of the authoring team that only then will this approach meet regulatory requirements.





Risk Assessment Approach to Microbiological Controls of Cell Therapies

This technology review, written by a small group of pharmaceutical microbiologists experienced in cell therapies, discusses a risk-based approach to microbiological contamination detection and control during gene and cell therapy production. Topics discussed include a brief overview of cell therapies; a risk analysis related to donor selection, cell collection and infectious agent testing, cell transformation and expansion, packaging, storage, and administration; and cell therapy microbial contamination testing and release.





Microbiota-Propelled T Helper 17 Cells in Inflammatory Diseases and Cancer [Review]

Technologies allowing genetic sequencing of the human microbiome are opening new realms to discovery. The host microbiota substantially impacts immune responses both in immune-mediated inflammatory diseases (IMIDs) and in tumors affecting tissues beyond skin and mucosae. However, a mechanistic link between host microbiota and cancer or IMIDs has not been well established. Here, we propose T helper 17 (TH17) lymphocytes as the connecting factor between host microbiota and rheumatoid or psoriatic arthritides, multiple sclerosis, breast or ovarian cancer, and multiple myeloma. We theorize that similar mechanisms favor the expansion of gut-borne TH17 cells and their deployment at the site of inflammation in extraborder IMIDs and tumors, where TH17 cells are driving forces. Thus, from a pathogenic standpoint, tumors may share mechanistic routes with IMIDs. A review of similarities and divergences in microbiota-TH17 cell interactions in IMIDs and cancer sheds light on previously ignored pathways in either one of the two groups of pathologies and identifies novel therapeutic avenues.





CRISPR Tools To Control Gene Expression in Bacteria [Review]

CRISPR-Cas systems have been engineered as powerful tools to control gene expression in bacteria. The most common strategy relies on the use of Cas effectors modified to bind target DNA without introducing DNA breaks. These effectors can either block the RNA polymerase or recruit it through activation domains. Here, we discuss the mechanistic details of how Cas effectors can modulate gene expression by blocking transcription initiation or acting as transcription roadblocks. CRISPR-Cas tools can be further engineered to obtain fine-tuned control of gene expression or target multiple genes simultaneously. Several caveats in using these tools have also been revealed, including off-target effects and toxicity, making it important to understand the design rules of engineered CRISPR-Cas effectors in bacteria. Alternatively, some types of CRISPR-Cas systems target RNA and could be used to block gene expression at the posttranscriptional level. Finally, we review applications of these tools in high-throughput screens and the progress and challenges in introducing CRISPR knockdown to other species, including nonmodel bacteria with industrial or clinical relevance. A deep understanding of how CRISPR-Cas systems can be harnessed to control gene expression in bacteria and build powerful tools will certainly open novel research directions.





Articles of Significant Interest in This Issue [Spotlight]





Contributions of a LysR Transcriptional Regulator to Listeria monocytogenes Virulence and Identification of Its Regulons [Article]

The capacity of Listeria monocytogenes to adapt to environmental changes is facilitated by a large number of regulatory proteins encoded by its genome. Among these proteins are the uncharacterized LysR-type transcriptional regulators (LTTRs). LTTRs can work as positive and/or negative transcription regulators at both local and global genetic levels. Previously, our group determined by comparative genome analysis that one member of the LTTRs (NCBI accession no. WP_003734782) was present in pathogenic strains but absent from nonpathogenic strains. The goal of the present study was to assess the importance of this transcription factor in the virulence of L. monocytogenes strain F2365 and to identify its regulons. An L. monocytogenes strain lacking lysR (the F2365lysR strain) displayed significant reductions in cell invasion of and adhesion to Caco-2 cells. In plaque assays, the deletion of lysR resulted in a 42.86% decrease in plaque number and a 13.48% decrease in average plaque size. Furthermore, the deletion of lysR also attenuated the virulence of L. monocytogenes in mice following oral and intraperitoneal inoculation. Transcriptomic analysis revealed that the transcript levels of 139 genes were upregulated, while 113 genes were downregulated in the F2365lysR strain compared to levels in the wild-type bacteria. lysR-repressed genes included ABC transporters and genes important for starch and sucrose metabolism as well as glycerolipid metabolism, flagellar assembly, quorum sensing, and glycolysis/gluconeogenesis. Conversely, lysR activated the expression of genes related to fructose and mannose metabolism, cationic antimicrobial peptide (CAMP) resistance, and beta-lactam resistance. These data suggested that lysR contributed to L. monocytogenes virulence through a broad impact on multiple pathways of gene expression.

IMPORTANCE Listeria monocytogenes is the causative agent of listeriosis, an infectious and fatal disease of animals and humans. In this study, we have shown that lysR contributes to Listeria pathogenesis and replication in cell lines. We also highlight the importance of lysR in regulating the transcription of genes involved in different pathways that might be essential for the growth and persistence of L. monocytogenes in the host or under nutrient limitation. Better understanding L. monocytogenes pathogenesis and the role of various virulence factors is necessary for further development of prevention and control strategies.





The Antiactivator of Type III Secretion, OspD1, Is Transcriptionally Regulated by VirB and H-NS from Remote Sequences in Shigella flexneri [Article]

Shigella species, the causal agents of bacillary dysentery, use a type III secretion system (T3SS) to inject two waves of virulence proteins, known as effectors, into the colonic epithelium to subvert host cell machinery. Prior to host cell contact and secretion of the first wave of T3SS effectors, OspD1, an effector and antiactivator protein, prevents premature production of the second wave of effectors. Despite this important role, regulation of the ospD1 gene is not well understood. While ospD1 belongs to the large regulon of VirB, a transcriptional antisilencing protein that counters silencing mediated by the histone-like nucleoid structuring protein H-NS, it remains unclear if VirB directly or indirectly regulates ospD1. Additionally, it is not known if ospD1 is regulated by H-NS. Here, we identify the primary ospD1 transcription start site (+1) and show that the ospD1 promoter is remotely regulated by both VirB and H-NS. Our findings demonstrate that VirB regulation of ospD1 requires at least one of the two newly identified VirB regulatory sites, centered at –978 and –1270 relative to the ospD1 +1. Intriguingly, one of these sites lies on a 193-bp sequence found in three conserved locations on the large virulence plasmids of Shigella. The region required for H-NS-dependent silencing of ospD1 lies between –1120 and –820 relative to the ospD1 +1. Thus, our study provides further evidence that cis-acting regulatory sequences for transcriptional antisilencers and silencers, such as VirB and H-NS, can lie far upstream of the canonical bacterial promoter region (i.e., –250 to +1).

IMPORTANCE Transcriptional silencing and antisilencing mechanisms regulate virulence gene expression in many important bacterial pathogens. In Shigella species, plasmid-borne virulence genes, such as those encoding the type III secretion system (T3SS), are silenced by the histone-like nucleoid structuring protein H-NS and antisilenced by VirB. Previous work at the plasmid-borne icsP locus revealed that VirB binds to a remotely located cis-acting regulatory site to relieve transcriptional silencing mediated by H-NS. Here, we characterize a second example of remote VirB antisilencing at ospD1, which encodes a T3SS antiactivator and effector. Our study highlights that remote transcriptional silencing and antisilencing occur more frequently in Shigella than previously thought, and it raises the possibility that long-range transcriptional regulation in bacteria is commonplace.





Measuring airway clearance outcomes in bronchiectasis: a review

While airway clearance techniques (ACTs) are recommended for individuals with bronchiectasis, many trials have demonstrated inconsistent benefits or failed to reach their primary outcome. This review determined the most common clinical and patient-reported outcome measures used to evaluate the efficacy of ACTs in bronchiectasis. A literature search of five databases using relevant keywords, filtered for studies published in English up to the end of August 2019, was completed. Studies included randomised controlled trials (crossover or any other trial design) and abstracts. Studies were included where the control was placebo, no intervention, standard care, usual care or an active comparator. Adults with bronchiectasis not related to cystic fibrosis were included. Extracted data comprised study authors, design, duration, intervention, outcome measures and results. The search identified 27 published studies and one abstract. The most common clinical outcome measures were sputum volume (n=23), lung function (n=17) and pulse oximetry (n=9). The most common patient-reported outcomes were health-related quality of life (measured with St George's Respiratory Questionnaire, n=4), cough-related quality of life (measured with Leicester Cough Questionnaire, n=4) and dyspnoea (measured with Borg/modified Borg scale, n=8). Sputum volume, lung function, dyspnoea and health- and cough-related quality of life appear to be the most common clinical and patient-reported measures of airway clearance treatment efficacy.





Chitotriosidase: a marker and modulator of lung disease

Chitotriosidase (CHIT1) is a highly conserved and regulated chitinase secreted by activated macrophages; it is a member of glycoside hydrolase family 18 (GH18). CHIT1, the most prominent chitinase in humans, cleaves chitin and participates in the body's immune response, and it is associated with inflammation, infection, tissue damage and remodelling processes. Recently, CHIT1 has been reported to be involved in the molecular pathogenesis of pulmonary fibrosis, bronchial asthma, COPD and pulmonary infections, shedding new light on the role of these proteins in lung pathophysiology. The potential roles of CHIT1 in lung diseases are reviewed in this article.





Thoracic ultrasound in the modern management of pleural disease

Physician-led thoracic ultrasound (TUS) has substantially changed how respiratory disorders, and in particular pleural diseases, are managed. The use of TUS as a point-of-care test enables the respiratory physician to quickly and accurately diagnose pleural pathology and ensure safe access to the pleural space during thoracentesis or chest drain insertion. Competence in performing TUS is now an obligatory part of respiratory speciality training programmes in different parts of the world. Pleural physicians with higher levels of competence routinely use TUS during the planning and execution of more sophisticated diagnostic and therapeutic interventions, such as core needle pleural biopsies, image-guided drain insertion and medical thoracoscopy. Current research is gauging the potential of TUS in predicting the outcome of different pleural interventions and how it can aid in tailoring the optimum treatment according to different TUS-based parameters.





The supportive care needs of people living with pulmonary fibrosis and their caregivers: a systematic review

Background

People with pulmonary fibrosis often experience a protracted time to diagnosis, high symptom burden and limited disease information. This review aimed to identify the supportive care needs reported by people with pulmonary fibrosis and their caregivers.

Methods

A systematic review was conducted according to PRISMA guidelines. Studies that investigated the supportive care needs of people with pulmonary fibrosis or their caregivers were included. Supportive care needs were extracted and mapped to eight pre-specified domains using a framework synthesis method.

Results

A total of 35 studies were included. The most frequently reported needs were in the domain of information/education, including information on supplemental oxygen, disease progression and prognosis, pharmacological treatments and end-of-life planning. Psychosocial/emotional needs were also frequently reported, including management of anxiety, anger, sadness and fear. An additional domain of "access to care" was identified that had not been specified a priori; this included access to peer support, psychological support, specialist centres and support for families of people with pulmonary fibrosis.

Conclusion

People with pulmonary fibrosis report many unmet needs for supportive care, particularly related to insufficient information and lack of psychosocial support. These data can inform the development of comprehensive care models for people with pulmonary fibrosis and their loved ones.





The unknown planktonic foraminiferal pioneer Henry A. Buckley and his collection at The Natural History Museum, London

The Henry Buckley Collection of Planktonic Foraminifera at the Natural History Museum in London (NHMUK) consists of 1665 single-taxon slides housing 23 897 individuals from 203 sites in all the major ocean basins, as well as a vast research library of Scanning Electron Microscope (SEM) photomicrographs. Buckley picked the material from the NHMUK Ocean-Bottom Deposit Collection and also from fresh tow samples. However, his collection remains largely unused as he was discouraged by his managers in the Mineralogy Department from working on or publicizing the collection. Nevertheless, Buckley published pioneering papers on isotopic interpretation of oceanographic and climatic change and was one of the first workers to investigate foraminiferal wall structure using the SEM technique. Details of the collection and images of each slide are available via the NHMUK Data Portal (http://dx.doi.org/10.5519/0035055). The Buckley Collection and its associated Ocean-Bottom Deposit Collection have great potential for taxon-specific studies as well as geochemical work, and both collections are available on request.





Biostratigraphy and evolution of Miocene Discoaster spp. from IODP Site U1338 in the equatorial Pacific Ocean

Assemblages of upper lower through upper Miocene Discoaster spp. have been quantified from Integrated Ocean Drilling Program (IODP) Site U1338 in the eastern equatorial Pacific Ocean. These assemblages can be grouped into five broad morphological categories: six-rayed with bifurcated ray tips, six-rayed with large central areas, six-rayed with pointed ray tips, five-rayed with bifurcated ray tips and five-rayed with pointed ray tips. Discoaster deflandrei dominates the assemblages prior to 15.8 Ma. The decline in abundance of D. deflandrei close to the early–middle Miocene boundary occurs together with the evolution of the D. variabilis group, including D. signus and D. exilis. Six-rayed discoasters with large central areas become a prominent component of the assemblages for a 400 ka interval in the late middle Miocene. Five- and six-rayed forms with pointed tips become prominent in the early late Miocene and show a strong antiphasing relationship with the D. variabilis group. Discoaster bellus completely dominates the Discoaster assemblages for a 400 ka interval in the middle late Miocene. Abundances of all discoasters, or of discoasters at the species level, show only surprisingly weak correlations to carbonate contents or to oxygen and carbon isotopes of bulk sediment when calculated over the entire sample interval.





Implementation and Scale-Up of the Standard Days Method of Family Planning: A Landscape Analysis

ABSTRACT

The Standard Days Method (SDM), a modern fertility awareness-based family planning method, has been introduced in 30 countries since its development in 2001. It is still unclear to what extent the SDM was mainstreamed within the family planning method mix, particularly in low- and middle-income country (LMIC) settings, where the SDM had been introduced by donors and implementing partners. This review of implementation science publications on the SDM in LMICs first looked at community pilot studies of the SDM to determine the acceptability of the method; correct use and efficacy rates; demographics of users; and changes to contraceptive prevalence rates and family planning behaviors, especially among men and couples. Then, we examined the status of the SDM in the 16 countries that had attempted to scale up the method within national family planning protocols, training, and service delivery. At the community level, evidence demonstrated a high level of acceptability of the method; efficacy rates comparable to the initial clinical trials; diversity in demographic characteristics of users, including first-time or recently discontinued users of family planning; increased male engagement in family planning; and improved couple's communication. Nationally, few countries had scaled up the SDM due to uneven stakeholder engagement, lackluster political will, and competing resource priorities. Results of this review could help policy makers determine the added value of the SDM in the contraceptive method mix and identify potential barriers to its implementation moving forward.





How Should Home-Based Maternal and Child Health Records Be Implemented? A Global Framework Analysis

ABSTRACT

Background

A home-based record (HBR) is a health document kept by the patient or their caregivers, rather than by the health care facility. HBRs are used in 163 countries, but they have not been implemented universally or consistently. Effective implementation maximizes both health impacts and cost-effectiveness. We sought to examine this research-to-practice gap and delineate the facilitators and barriers to the effective implementation and use of maternal and child health HBRs, especially in low- and middle-income countries (LMICs).

Methods

Using a framework analysis approach, we created a framework of implementation categories in advance using subject expert inputs. We collected information through 2 streams. First, we screened 69 gray literature documents, of which 18 were included for analysis. Second, we conducted semi-structured interviews with 12 key informants, each of whom had extensive experience with HBR implementation. We abstracted the relevant data from the documents and interviews into an analytic matrix. The matrix was based on the initial framework and adjusted according to emergent categories from the data.

Results

We identified 8 contributors to successful HBR implementation. These include establishing high-level support from the government and ensuring clear communication between all ministries and nongovernmental organizations involved. Choice of appropriate contents within the record was noted as important for alignment with the health system and for end user acceptance, as were the design, its physical durability, and timely redesigns. Logistical considerations, such as covering costs sustainably and arranging printing and distribution, could be potential bottlenecks. Finally, end users' engagement with HBRs depended on how the record was initially introduced to them and how its importance was reinforced over time by those in leadership positions.

Conclusions

This framework analysis is the first study to take a more comprehensive and broad approach to the HBR implementation process in LMICs. The findings provide guidance for policy makers, donors, and health care practitioners regarding best implementation practice and effective HBR use, as well as where further research is required.





Two-Way Short Message Service (SMS) Communication May Increase Pre-Exposure Prophylaxis Continuation and Adherence Among Pregnant and Postpartum Women in Kenya

ABSTRACT

Introduction

We evaluated a 2-way short message service (SMS) communication platform to improve continuation of pre-exposure prophylaxis (PrEP) for HIV prevention among Kenyan women who initiated PrEP within routine maternal child health (MCH) and family planning clinics.

Methods

We adapted an existing SMS platform (Mobile WACh [mWACh]) to send PrEP-tailored, theory-based SMS and allow clients to communicate with a remote nurse. Women who did not have HIV and who were initiating PrEP at 2 MCH/family planning clinics in Kisumu County, Kenya, from February to October 2018, were offered enrollment into the mWACh-PrEP program; SMS communication was free. We evaluated acceptability, satisfaction, and implementation metrics. In a pre/post evaluation, we compared PrEP continuation at 1 month postinitiation among women who initiated PrEP in the period before (n=166) versus after mWACh-PrEP implementation, adjusting for baseline differences.

Results

Of the 334 women screened for enrollment into the mWACh-PrEP program, 193 (58%) were eligible, and of those, 190 (98%) accepted enrollment. Reasons for ineligibility (n=141) included no phone access (29%) and shared SIM cards (25%). Median age was 25 years (interquartile range=22–30), and 91% were MCH clients. Compared to women who initiated PrEP in the month before mWACh-PrEP implementation, women who enrolled in mWACh-PrEP were more likely to return for their first PrEP follow-up visit (40% vs. 53%; adjusted risk ratio [aRR]=1.26; 95% confidence interval [CI]=1.06, 1.50; P=.008) and more likely to continue PrEP (22% vs. 43%; aRR=1.75; 95% CI=1.21, 2.55; P=.003). Among those who returned, 99% reported successful receipt of SMS through the mWACh-PrEP system, and 94% reported that mWACh-PrEP helped them understand PrEP better. Concerns about PrEP use, how it works, and side effects accounted for the majority (80%) of issues raised by participants using SMS.

Conclusions

Two-way SMS expanded support for PrEP and opportunities for dialogue beyond the clinic and enabled women to ask and receive answers in real time regarding PrEP, which facilitated its continued use.





Diagnostic Utility and Impact on Clinical Decision Making of Focused Assessment With Sonography for HIV-Associated Tuberculosis in Malawi: A Prospective Cohort Study

ABSTRACT

Background

The focused assessment with sonography for HIV-associated tuberculosis (TB) (FASH) ultrasound protocol has been increasingly used to help clinicians diagnose TB. We sought to quantify the diagnostic utility of FASH for TB among individuals with HIV in Malawi.

Methods

Between March 2016 and August 2017, 210 adults with HIV who had 2 or more signs and symptoms concerning for TB (fever, cough, night sweats, weight loss) were enrolled from a public HIV clinic in Lilongwe, Malawi. The treating clinicians conducted a history, physical exam, FASH protocol, and additional TB evaluation (laboratory diagnostics and chest radiography) on all participants. The clinician made a final treatment decision based on all available information. At the 6-month follow-up visit, we categorized participants based on clinical outcomes and diagnostic tests as having probable/confirmed TB or unlikely TB; the association of FASH with probable/confirmed TB was calculated using Fisher's exact test. The impact of FASH on empiric TB treatment was determined by asking the clinicians prospectively whether they would start treatment at 2 time points in the baseline visit: (1) after the initial history and physical exam; and (2) after the history, physical exam, and FASH protocol.

Results

A total of 181 participants underwent final analysis, of whom 56 were categorized as probable/confirmed TB and 125 as unlikely TB. The FASH protocol was positive in 71% (40/56) of participants with probable/confirmed TB compared to 24% (30/125) of participants with unlikely TB (odds ratio=7.9; 95% confidence interval=3.9, 16.1; P<.001). Among those classified as confirmed/probable TB, FASH increased the likelihood of empiric TB treatment before obtaining any other diagnostic studies from 9% (5/56) to 46% (26/56) at the point of care. For those classified as unlikely TB, FASH increased the likelihood of empiric treatment from 2% to 4%.

Conclusion

In the setting of HIV coinfection in Malawi, FASH can be a helpful tool that augments the clinician's ability to make a timely diagnosis of TB.
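The reported odds ratio can be reproduced from the 2×2 table implied by the abstract's counts (FASH result by TB category). A minimal sketch, using a Wald confidence interval on the log odds ratio rather than the exact method named in the abstract:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for the 2x2 table [[a, b], [c, d]].

    a, b: FASH-positive / FASH-negative among probable/confirmed TB
    c, d: FASH-positive / FASH-negative among unlikely TB
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts from the abstract: 40/56 FASH-positive with probable/confirmed TB,
# 30/125 FASH-positive with unlikely TB
or_, lo, hi = odds_ratio_ci(40, 16, 30, 95)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # 7.9 3.9 16.1
```

The Wald interval here matches the reported 3.9–16.1; the abstract's P value comes from Fisher's exact test, which operates on the same table.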





A Qualitative Assessment of Provider and Client Experiences With 3- and 6-Month Dispensing Intervals of Antiretroviral Therapy in Malawi

ABSTRACT

Introduction

Multimonth dispensing (MMD) of antiretroviral therapy (ART) is a differentiated model of care that can help overcome health system challenges and reduce the burden of HIV care on clients. Although 3-month dispensing has been the standard of care, interest has increased in extending refill intervals to 6 months. We explored client and provider experiences with MMD in Malawi as part of a cluster randomized trial evaluating 3- versus 6-month ART dispensing.

Methods

Semi-structured in-depth interviews were conducted with 17 ART providers and 62 stable, adult clients with HIV on ART. Clients and providers were evenly divided by arm and were eligible for an interview if they had been participating in the study for 1 year (clients) or 6 months (providers). Questions focused on perceived challenges and benefits of the 3- or 6-month ART dispensing amount. Interviews were transcribed, and data were coded and analyzed using constant comparison.

Results

Both clients and providers reported that the larger medication supply had benefits. Clients reported decreased costs due to less frequent travel to the clinic and increased time for income-generating activities. Clients in the 6-month dispensing arm reported a greater sense of personal freedom and normalcy. Providers felt that the 6-month dispensing interval reduced their workload. They also expressed concern about clients' challenges with ART storage at home, but clients reported no storage problems. Although providers mentioned the potential risk of clients sharing the larger medication supply with family or friends, clients emphasized the value of ART and reported only rare, short-term sharing, mostly with their spouses. Providers also mentioned clients' lack of motivation to seek care for illnesses that might occur between refill appointments.

Conclusions

The 6-month ART dispensing arm was particularly beneficial to clients for decreased costs, increased time for income generation, and a greater sense of normalcy. Providers' concerns about storage, sharing, and return visits to the facility did not emerge in client interviews. Further data are needed on the feasibility of implementing a large-scale program with 6-month dispensing.





Insights Into Provider Bias in Family Planning from a Novel Shared Decision Making Based Counseling Initiative in Rural, Indigenous Guatemala





National Surgical, Obstetric, and Anesthesia Plans Supporting the Vision of Universal Health Coverage





Issues and Events





Erratum. Ten-Year Outcome of Islet Alone or Islet After Kidney Transplantation in Type 1 Diabetes: A Prospective Parallel-Arm Cohort Study. Diabetes Care 2019;42:2042-2049





Sex Disparities in Cardiovascular Outcome Trials of Populations With Diabetes: A Systematic Review and Meta-analysis

BACKGROUND

Sex differences have been described in diabetes cardiovascular outcome trials (CVOTs).

PURPOSE

We systematically reviewed for baseline sex differences in cardiovascular (CV) risk factors and CV protection therapy in diabetes CVOTs.

DATA SOURCES

Randomized placebo-controlled trials examining the effect of diabetes medications on major adverse cardiovascular events in people ≥18 years of age with type 2 diabetes.

STUDY SELECTION

Included trials reported baseline sex-specific CV risks and use of CV protection therapy.

DATA EXTRACTION

Two reviewers independently abstracted study data.

DATA SYNTHESIS

We included five CVOTs with 46,606 participants. We summarized sex-specific data using mean differences (MDs) and relative risks (RRs) and pooled estimates using random effects meta-analysis. There were fewer women than men in included trials (28.5–35.8% women). Women more often had stroke (RR 1.28; 95% CI 1.09, 1.50), heart failure (RR 1.30; 95% CI 1.21, 1.40), and chronic kidney disease (RR 1.33; 95% CI 1.17, 1.51). They less often used statins (RR 0.90; 95% CI 0.86, 0.93), aspirin (RR 0.82; 95% CI 0.71, 0.95), and β-blockers (RR 0.93; 95% CI 0.88, 0.97) and had a higher systolic blood pressure (MD 1.66 mmHg; 95% CI 0.90, 2.41), LDL cholesterol (MD 0.34 mmol/L; 95% CI 0.29, 0.39), and hemoglobin A1c (MD 0.11%; 95% CI 0.09, 0.14 [1.2 mmol/mol; 1.0, 1.5]) than men.
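The pooled relative risks above come from a random-effects meta-analysis. As an illustration only (the per-trial counts are not given in the abstract, so the inputs below are hypothetical), a DerSimonian–Laird pooling of study-level log relative risks might look like:

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effects
    (e.g. log relative risks) with their within-study variances."""
    k = len(effects)
    w = [1 / v for v in variances]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance estimate
    w_star = [1 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2

# Hypothetical log-RRs (women vs. men) from five trials, with variances
log_rr = [0.25, 0.30, 0.20, 0.28, 0.22]
var = [0.010, 0.015, 0.008, 0.012, 0.020]
pooled, se, tau2 = dersimonian_laird(log_rr, var)
rr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

When the studies are homogeneous, Q falls below k − 1, τ² is truncated to zero, and the random-effects estimate collapses to the fixed-effect one.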

LIMITATIONS

We could not carry out subgroup analyses due to the small number of studies. Our study is not generalizable to low CV risk groups nor to patients in routine care.

CONCLUSIONS

There were baseline sex disparities in diabetes CVOTs. We suggest efforts to recruit women into trials and promote CV management across the sexes.





Effects of Continuous Glucose Monitoring on Metrics of Glycemic Control in Diabetes: A Systematic Review With Meta-analysis of Randomized Controlled Trials

BACKGROUND

Continuous glucose monitoring (CGM) provides important information to aid in achieving glycemic targets in people with diabetes.

PURPOSE

We performed a meta-analysis of randomized controlled trials (RCTs) comparing CGM with usual care for parameters of glycemic control in both type 1 and type 2 diabetes.

DATA SOURCES

Multiple electronic databases were searched for articles published from inception until 30 June 2019.

STUDY SELECTION

We selected RCTs that assessed both changes in HbA1c and time in target range (TIR), together with time below range (TBR), time above range (TAR), and glucose variability expressed as coefficient of variation (CV).

DATA EXTRACTION

Data were extracted from each trial by two investigators.

DATA SYNTHESIS

All results were analyzed by a random effects model to calculate the weighted mean difference (WMD) with the 95% CI. We identified 15 RCTs, lasting 12–36 weeks and involving 2,461 patients. Compared with usual care (overall data), CGM was associated with a modest reduction in HbA1c (WMD –0.17%, 95% CI –0.29 to –0.06, I2 = 96.2%), an increase in TIR (WMD 70.74 min, 95% CI 46.73–94.76, I2 = 66.3%), and lower TAR, TBR, and CV, with heterogeneity between studies. The increase in TIR was significant and robust independently of diabetes type, method of insulin delivery, and reason for CGM use. In preplanned subgroup analyses, real-time CGM led to greater improvements in mean HbA1c (WMD –0.23%, 95% CI –0.36 to –0.10, P < 0.001), TIR (WMD 83.49 min, 95% CI 52.68–114.30, P < 0.001), and TAR, whereas both intermittently scanned CGM and sensor-augmented pumps were associated with a greater decline in TBR.

LIMITATIONS

Heterogeneity was high for most of the study outcomes; all studies were sponsored by industry, had short duration, and used an open-label design.

CONCLUSIONS

CGM improves glycemic control by expanding TIR and decreasing TBR, TAR, and glucose variability in both type 1 and type 2 diabetes.





Evaluation of Factors Related to Glycemic Management in Professional Cyclists With Type 1 Diabetes Over a 7-Day Stage Race

OBJECTIVE

To investigate factors related to glycemic management among members of a professional cycling team with type 1 diabetes over a 7-day Union Cycliste Internationale World Tour stage race.

RESEARCH DESIGN AND METHODS

An observational evaluation of possible factors related to glycemic management and performance in six male professional cyclists with type 1 diabetes (HbA1c 6.4 ± 0.6%) during the 2019 Tour of California.

RESULTS

In-ride time spent in euglycemia (3.9–10.0 mmol/L glucose) was 63 ± 11%, with a low percentage of time spent in level 1 (3.0–3.9 mmol/L; 0 ± 1% of time) and level 2 (<3.0 mmol/L; 0 ± 0% of time) hypoglycemia over the 7-day race. Riders spent 25 ± 9% of time in level 1 (10.1–13.9 mmol/L) and 11 ± 9% in level 2 (>13.9 mmol/L) hyperglycemia during races. Bolus insulin use was uncommon during races, despite high carbohydrate intake (76 ± 23 g ⋅ h–1). Overnight, the riders spent progressively more time in hypoglycemia from day 1 (6 ± 12% in level 1 and 0 ± 0% in level 2) to day 7 (12 ± 12% in level 1 and 2 ± 4% in level 2) (χ2(1) > 4.78, P < 0.05).

CONCLUSIONS

Professional cyclists with type 1 diabetes maintained excellent in-race glycemia but experienced significant hypoglycemia during overnight recovery throughout the 7-day stage race.





Underweight Increases the Risk of End-Stage Renal Diseases for Type 2 Diabetes in Korean Population: Data From the National Health Insurance Service Health Checkups 2009-2017

OBJECTIVE

There is a controversy over the association between obesity and end-stage renal disease (ESRD) in people with or without type 2 diabetes; therefore, we examined the effect of BMI on the risk of ESRD according to glycemic status in the Korean population.

RESEARCH DESIGN AND METHODS

The study monitored 9,969,848 participants who underwent a National Health Insurance Service health checkup in 2009 from baseline to the date of diagnosis of ESRD during a follow-up period of ~8.2 years. Obesity was categorized by World Health Organization recommendations for Asian populations, and glycemic status was categorized into the following five groups: normal, impaired fasting glucose (IFG), newly diagnosed diabetes, diabetes <5 years, and diabetes ≥5 years.

RESULTS

Underweight was associated with a higher risk of ESRD in all participants after adjustment for all covariates. In the groups with IFG, newly diagnosed type 2 diabetes, diabetes duration <5 years, and diabetes ≥5 years, the hazard ratio (HR) of the underweight group increased with worsening glycemic status (HR 1.431 for IFG, 2.114 for newly diagnosed diabetes, 4.351 for diabetes <5 years, and 6.397 for diabetes ≥5 years), using normal weight with normal fasting glucose as a reference. The adjusted HRs for ESRD were also the highest in the sustained underweight group regardless of the presence of type 2 diabetes (HR 1.606 for nondiabetes and 2.14 for diabetes).

CONCLUSIONS

Underweight was associated with progressively higher HRs for ESRD as glycemic status worsened and diabetes duration increased in the Korean population. These associations also persisted in the group with sustained BMI during the study period.





Diabetes, Cognitive Decline, and Mild Cognitive Impairment Among Diverse Hispanics/Latinos: Study of Latinos-Investigation of Neurocognitive Aging Results (HCHS/SOL)

OBJECTIVE

Hispanics/Latinos are the largest ethnic/racial group in the U.S., have the highest prevalence of diabetes, and are at increased risk for neurodegenerative disorders. Currently, little is known about the relationship between diabetes and cognitive decline and disorders among diverse Hispanics/Latinos. The purpose of this study is to clarify these relationships in diverse middle-aged and older Hispanics/Latinos.

RESEARCH DESIGN AND METHODS

The Study of Latinos–Investigation of Neurocognitive Aging (SOL-INCA) is an ancillary study of the Hispanic Community Health Study/Study of Latinos (HCHS/SOL). HCHS/SOL is a multisite (Bronx, NY; Chicago, IL; Miami, FL; and San Diego, CA), probability-sampled (i.e., representative of targeted populations), and prospective cohort study. Between 2016 and 2018, SOL-INCA enrolled diverse Hispanics/Latinos aged ≥50 years (n = 6,377). Global cognitive decline and mild cognitive impairment (MCI) were the primary outcomes.

RESULTS

Prevalent diabetes at visit 1, but not incident diabetes at visit 2, was associated with significantly steeper global cognitive decline (βGC = –0.16 [95% CI –0.25; –0.07]; P < 0.001), domain-specific cognitive decline, and higher odds of MCI (odds ratio 1.74 [95% CI 1.34; 2.26]; P < 0.001) compared with no diabetes in age- and sex-adjusted models.

CONCLUSIONS

Diabetes was associated with cognitive decline and increased MCI prevalence among diverse Hispanics/Latinos, primarily among those with prevalent diabetes at visit 1. Our findings suggest that significant cognitive decline and MCI may be considered additional disease complications of diabetes among diverse middle-aged and older Hispanics/Latinos.





The Prognosis of Patients With Type 2 Diabetes and Nonalbuminuric Diabetic Kidney Disease Is Not Always Poor: Implication of the Effects of Coexisting Macrovascular Complications (JDDM 54)

OBJECTIVE

Nonalbuminuric diabetic kidney disease (DKD) has become the prevailing phenotype in patients with type 2 diabetes. However, it remains unclear whether its prognosis is poorer than that of other DKD phenotypes.

RESEARCH DESIGN AND METHODS

A total of 2,953 Japanese patients with type 2 diabetes and estimated glomerular filtration rate (eGFR) ≥30 mL/min/1.73 m2, enrolled in an observational cohort study in 2004, were followed until 2015. On the basis of albuminuria (>30 mg/g creatinine) and reduced eGFR (<60 mL/min/1.73 m2) at baseline, participants were classified into the four DKD phenotypes—no-DKD, albuminuric DKD without reduced eGFR, nonalbuminuric DKD with reduced eGFR, and albuminuric DKD with reduced eGFR—to assess the risks of mortality, cardiovascular disease (CVD), and renal function decline.

RESULTS

During the mean follow-up of 9.7 years, 113 patients died and 263 developed CVD. In nonalbuminuric DKD, the risks of death or CVD were not higher than those in no-DKD (adjusted hazard ratio 1.02 [95% CI 0.66, 1.60]) and the annual decline in eGFR was slower than in other DKD phenotypes. The risks of death or CVD in nonalbuminuric DKD without prior CVD were similar to those in no-DKD without prior CVD, whereas the risks in nonalbuminuric DKD with prior CVD as well as other DKD phenotypes were higher.

CONCLUSIONS

Nonalbuminuric DKD did not have a higher risk of mortality, CVD events, or renal function decline than the other DKD phenotypes. In nonalbuminuric DKD, the presence of macrovascular complications may be a main determinant of prognosis rather than the renal phenotype.





Dalcetrapib Reduces Risk of New-Onset Diabetes in Patients With Coronary Heart Disease

OBJECTIVE

Incident type 2 diabetes is common among patients with recent acute coronary syndrome and is associated with an adverse prognosis. Some data suggest that cholesteryl ester transfer protein (CETP) inhibitors reduce incident type 2 diabetes. We compared the effect of treatment with the CETP inhibitor dalcetrapib or placebo on incident diabetes in patients with recent acute coronary syndrome.

RESEARCH DESIGN AND METHODS

In the dal-OUTCOMES trial, 15,871 patients were randomly assigned to treatment with dalcetrapib 600 mg daily or placebo, beginning 4–12 weeks after an acute coronary syndrome. Absence of diabetes at baseline was based on medical history, no use of antihyperglycemic medication, and hemoglobin A1c and serum glucose levels below diagnostic thresholds. Among these patients, incident diabetes after randomization was defined by any diabetes-related adverse event, new use of antihyperglycemic medication, hemoglobin A1c ≥6.5%, or a combination of at least two measurements of serum glucose ≥7.0 mmol/L (fasting) or ≥11.1 mmol/L (random).

RESULTS

At baseline, 10,645 patients (67% of the trial cohort) did not have diabetes. During a median follow-up of 30 months, incident diabetes was identified in 403 of 5,326 patients (7.6%) assigned to dalcetrapib and in 516 of 5,319 (9.7%) assigned to placebo, corresponding to an absolute risk reduction of 2.1%, a hazard ratio of 0.77 (95% CI 0.68–0.88; P < 0.001), and a number needed to treat of 40 patients for 3 years to prevent 1 incident case of diabetes. Considering only those with prediabetes at baseline, the number needed to treat for 3 years to prevent 1 incident case of diabetes was 25. Dalcetrapib also decreased the number of patients who progressed from normoglycemia to prediabetes and increased the number who regressed from diabetes to no diabetes.
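As a back-of-envelope check (not part of the trial's survival analysis), the reported 3-year number needed to treat is roughly consistent with scaling the 30-month absolute risk reduction linearly to 36 months:

```python
# Sketch: reconcile the dal-OUTCOMES event counts with the reported
# 3-year number needed to treat (NNT), assuming roughly linear event accrual.
risk_dalcetrapib = 403 / 5326               # ~7.6% incident diabetes, median 30 months
risk_placebo = 516 / 5319                   # ~9.7%
arr_30mo = risk_placebo - risk_dalcetrapib  # ~2.1% absolute risk reduction
arr_36mo = arr_30mo * 36 / 30               # crude scaling to a 3-year horizon
nnt_3yr = 1 / arr_36mo

print(round(risk_dalcetrapib * 100, 1))  # 7.6
print(round(risk_placebo * 100, 1))      # 9.7
print(round(nnt_3yr))                    # ~39, close to the reported 40
```

The trial's NNT of 40 comes from its time-to-event analysis; this crude scaling only shows the reported figures are mutually consistent.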

CONCLUSIONS

In patients with a recent acute coronary syndrome, incident diabetes is common and is reduced substantially by treatment with dalcetrapib.





Increase in Endogenous Glucose Production With SGLT2 Inhibition Is Unchanged by Renal Denervation and Correlates Strongly With the Increase in Urinary Glucose Excretion

OBJECTIVE

Sodium–glucose cotransporter 2 (SGLT2) inhibition causes an increase in endogenous glucose production (EGP). However, the mechanisms are unclear. We studied the effect of SGLT2 inhibition on EGP in kidney transplant recipients with renal denervation, both in subjects with type 2 diabetes (T2D) and in subjects without diabetes (non-DM).

RESEARCH DESIGN AND METHODS

Fourteen subjects who received a renal transplant (six with T2D [A1C 7.2 ± 0.1%] and eight non-DM [A1C 5.6 ± 0.1%]) underwent measurement of EGP with [3-3H]glucose infusion following dapagliflozin (DAPA) 10 mg or placebo. Plasma glucose, insulin, C-peptide, glucagon, and tritiated glucose-specific activity were measured.

RESULTS

Following placebo in T2D, fasting plasma glucose (FPG) (143 ± 14 to 124 ± 10 mg/dL; P = 0.02) and fasting plasma insulin (12 ± 2 to 10 ± 1.1 μU/mL; P < 0.05) decreased; plasma glucagon was unchanged, and EGP declined. After DAPA in T2D, FPG (143 ± 15 to 112 ± 9 mg/dL; P = 0.01) and fasting plasma insulin (14 ± 3 to 11 ± 2 μU/mL; P = 0.02) decreased, and plasma glucagon increased (all P < 0.05 vs. placebo). EGP was unchanged from baseline (2.21 ± 0.19 vs. 1.96 ± 0.14 mg/kg/min) in T2D (P < 0.001 vs. placebo). In non-DM following DAPA, FPG and fasting plasma insulin decreased, and plasma glucagon was unchanged. EGP was unchanged from baseline (1.85 ± 0.10 to 1.78 ± 0.10 mg/kg/min) after DAPA, whereas EGP declined significantly with placebo. When the increase in EGP following DAPA versus placebo was plotted against the difference in urinary glucose excretion (UGE) for all patients, a strong correlation (r = 0.824; P < 0.001) was observed.

CONCLUSIONS

Renal denervation in kidney transplant recipients failed to block the DAPA-mediated stimulation of EGP in both T2D and non-DM subjects. The DAPA-stimulated rise in EGP is strongly related to the increase in UGE, blunting the decline in FPG.





Trends in Emergency Department Visits and Inpatient Admissions for Hyperglycemic Crises in Adults With Diabetes in the U.S., 2006-2015

OBJECTIVE

To report U.S. national population-based rates and trends in diabetic ketoacidosis (DKA) and hyperglycemic hyperosmolar state (HHS) among adults, in both the emergency department (ED) and inpatient settings.

RESEARCH DESIGN AND METHODS

We analyzed data from 1 January 2006 through 30 September 2015 from the Nationwide Emergency Department Sample and National Inpatient Sample to characterize ED visits and inpatient admissions with DKA and HHS. We used corresponding year cross-sectional survey data from the National Health Interview Survey to estimate the number of adults ≥18 years with diagnosed diabetes to calculate population-based rates for DKA and HHS in both ED and inpatient settings. Linear trends from 2009 to 2015 were assessed using Joinpoint software.
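The Joinpoint software referenced above estimates annual percent change (APC) from a log-linear model of rate on calendar year. As an illustration with synthetic rates (none of the study's actual data), the single-segment computation is simply a least-squares fit of log(rate) on year, with APC = 100 × (exp(slope) − 1):

```python
import math

# Illustrative sketch only (synthetic data, not the study's rates):
# a Joinpoint-style annual percent change (APC) for a single segment
# is derived from an OLS fit of ln(rate) on calendar year.
years = list(range(2009, 2016))                         # 2009..2015
rates = [10.0 * 1.135 ** t for t in range(len(years))]  # grows 13.5%/year

n = len(years)
xbar = sum(years) / n
ybar = sum(math.log(r) for r in rates) / n
sxy = sum((x - xbar) * (math.log(y) - ybar) for x, y in zip(years, rates))
sxx = sum((x - xbar) ** 2 for x in years)
slope = sxy / sxx                        # slope of ln(rate) per year
apc = 100 * (math.exp(slope) - 1)        # annual percent change
print(round(apc, 1))                     # 13.5
```

Joinpoint additionally searches for inflection points ("joinpoints") where the trend slope changes; this sketch covers only a single-segment trend.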

RESULTS

In 2014, there were a total of 184,255 and 27,532 events for DKA and HHS, respectively. The majority of DKA events occurred in young adults aged 18–44 years (61.7%) and in adults with type 1 diabetes (70.6%), while HHS events were more prominent in middle-aged adults 45–64 years (47.5%) and in adults with type 2 diabetes (88.1%). Approximately 40% of the hyperglycemic events were in lower-income populations. Overall, event rates for DKA significantly increased from 2009 to 2015 in both ED (annual percentage change [APC] 13.5%) and inpatient settings (APC 8.3%). A similar trend was seen for HHS (APC 16.5% in ED and 6.3% in inpatient). The increase was observed in all age-groups and in both men and women.

CONCLUSIONS

Causes of increased rates of hyperglycemic events are unknown. More detailed data are needed to investigate the etiology and determine prevention strategies.





Differential Health Care Use, Diabetes-Related Complications, and Mortality Among Five Unique Classes of Patients With Type 2 Diabetes in Singapore: A Latent Class Analysis of 71,125 Patients

OBJECTIVE

With rising health care costs and finite health care resources, understanding the population needs of different type 2 diabetes mellitus (T2DM) patient subgroups is important. Sparse data exist for the application of population segmentation on health care needs among Asian T2DM patients. We aimed to segment T2DM patients into distinct classes and evaluate their differential health care use, diabetes-related complications, and mortality patterns.

RESEARCH DESIGN AND METHODS

Latent class analysis was conducted on a retrospective cohort of 71,125 T2DM patients. Latent class indicators included patient’s age, ethnicity, comorbidities, and duration of T2DM. Outcomes evaluated included health care use, diabetes-related complications, and 4-year all-cause mortality. The relationship between class membership and outcomes was evaluated with the appropriate regression models.

RESULTS

Five classes of T2DM patients were identified. The prevalence of depression was high among patients in class 3 (younger females with short-to-moderate T2DM duration and high psychiatric and neurological disease burden) and class 5 (older patients with moderate-to-long T2DM duration and high disease burden with end-organ complications). They were the highest tertiary health care users. Class 5 patients had the highest risk of myocardial infarction (hazard ratio [HR] 12.05, 95% CI 10.82–13.42), end-stage renal disease requiring dialysis initiation (HR 25.81, 95% CI 21.75–30.63), stroke (HR 19.37, 95% CI 16.92–22.17), lower-extremity amputation (HR 12.94, 95% CI 10.90–15.36), and mortality (HR 3.47, 95% CI 3.17–3.80).

CONCLUSIONS

T2DM patients can be segmented into classes with differential health care use and outcomes. Depression screening should be considered for the two identified classes of patients.





Every Fifth Individual With Type 1 Diabetes Suffers From an Additional Autoimmune Disease: A Finnish Nationwide Study

OBJECTIVE

The aim of this study was to quantify the excess risk of autoimmune hypothyroidism and hyperthyroidism, Addison disease, celiac disease, and atrophic gastritis in adults with type 1 diabetes (T1D) compared with nondiabetic individuals in Finland.

RESEARCH DESIGN AND METHODS

The study included 4,758 individuals with T1D from the Finnish Diabetic Nephropathy (FinnDiane) Study and 12,710 nondiabetic control individuals. The autoimmune diseases (ADs) were identified by linking the data with the Finnish nationwide health registries from 1970 to 2015.

RESULTS

The median age of the FinnDiane individuals at the end of follow-up in 2015 was 51.4 (interquartile range 42.6–60.1) years, and the median duration of diabetes was 35.5 (26.5–44.0) years. Of individuals with T1D, 22.8% had at least one additional AD, including 31.6% of women and 14.9% of men. The odds ratios (ORs) for hypothyroidism, hyperthyroidism, celiac disease, Addison disease, and atrophic gastritis were 3.43 (95% CI 3.09–3.81), 2.98 (2.27–3.90), 4.64 (3.71–5.81), 24.13 (5.60–104.03), and 5.08 (3.15–8.18), respectively, in the individuals with T1D compared with the control individuals. The corresponding ORs for women compared with men were 2.96 (2.53–3.47), 2.83 (1.87–4.28), 1.52 (1.15–2.02), 2.22 (0.83–5.91), and 1.36 (0.77–2.39), respectively, in individuals with T1D. Late onset of T1D and aging increased the risk of hypothyroidism, whereas young age at onset of T1D increased the risk of celiac disease.

CONCLUSIONS

This is one of the largest studies quantifying the risk of coexisting AD in adult individuals with T1D in the country with the highest incidence of T1D in the world. The results highlight the importance of continuous screening for other ADs in individuals with T1D.





Risk of Ipsilateral Reamputation Following an Incident Toe Amputation Among U.S. Military Veterans With Diabetes, 2005-2016

OBJECTIVE

To assess whether the risk of subsequent lower-limb amputations and death following an initial toe amputation among individuals with diabetes has changed over time and varies by demographic characteristics and geographic region.

RESEARCH DESIGN AND METHODS

Using Veterans Health Administration (VHA) electronic medical records from 1 October 2004 to 30 September 2016, we determined risk of subsequent ipsilateral minor and major amputation within 1 year after an initial toe/ray amputation among veterans with diabetes. To assess changes in the annual rate of subsequent amputation over time, we estimated age-adjusted incidence of minor and major subsequent ipsilateral amputation for each year, separately for African Americans (AAs) and whites. Geographic variation was assessed across VHA markets (n = 89) using log-linear Poisson regression models adjusting for age and ethnoracial category.

RESULTS

Among 17,786 individuals who had an initial toe amputation, 34% had another amputation on the same limb within 1 year, including 10% who had a major ipsilateral amputation. Median time to subsequent ipsilateral amputation (minor or major) was 36 days. One-year risk of subsequent major amputation decreased over time, but risk of subsequent minor amputation did not. Risk of subsequent major ipsilateral amputation was higher in AAs than whites. After adjusting for age and ethnoracial category, 1-year risk of major subsequent amputation varied fivefold across VHA markets.

CONCLUSIONS

About one-third of individuals require reamputation following an initial toe amputation, although risks of subsequent major ipsilateral amputation have decreased over time. Nevertheless, risks remain particularly high for AAs and vary substantially geographically.





Trends in Bone Mineral Density, Osteoporosis, and Osteopenia Among U.S. Adults With Prediabetes, 2005-2014

OBJECTIVE

We aimed to evaluate trends in bone mineral density (BMD) and the prevalence of osteoporosis/osteopenia in U.S. adults with prediabetes and normal glucose regulation (NGR) and further investigate the association among prediabetes, osteopenia/osteoporosis, and fracture.

RESEARCH DESIGN AND METHODS

We collected and analyzed data from the U.S. National Health and Nutrition Examination Surveys during the period from 2005 to 2014. Femoral neck and lumbar spine BMD data were available for 5,310 adults with prediabetes and 5,162 adults with NGR, all >40 years old.

RESULTS

A shift was observed toward a lower BMD and a higher prevalence of osteopenia/osteoporosis at the femoral neck and lumbar spine in U.S. adults >40 years old with prediabetes since 2005, especially in men <60 and women ≥60 years old. A shift toward a higher prevalence of osteopenia/osteoporosis at the femoral neck was also observed in adults >40 years old with NGR. Moreover, prediabetes was associated with a higher prevalence of hip fracture, although participants with prediabetes had higher BMD and a lower prevalence of osteopenia/osteoporosis at the femoral neck.

CONCLUSIONS

There was a declining trend in BMD from 2005 to 2014 in U.S. adults >40 years old with prediabetes and NGR, and this trend was more pronounced in men <60 years old. Populations with prediabetes may have relatively higher BMD yet a higher prevalence of fracture.





Possible Modifiers of the Association Between Change in Weight Status From Child Through Adult Ages and Later Risk of Type 2 Diabetes

OBJECTIVE

We investigated the association between changes in weight status from childhood through adulthood and subsequent type 2 diabetes risks and whether educational attainment, smoking, and leisure time physical activity (LTPA) modify this association.

RESEARCH DESIGN AND METHODS

Using data from 10 Danish and Finnish cohorts including 25,283 individuals, childhood BMI at 7 and 12 years was categorized as normal or high using age- and sex-specific cutoffs (<85th or ≥85th percentile). Adult BMI (20–71 years) was categorized as nonobese or obese (<30.0 or ≥30.0 kg/m2, respectively). Associations between BMI patterns and type 2 diabetes (989 women and 1,370 men) were analyzed using Cox proportional hazards regressions and meta-analysis techniques.

RESULTS

Compared with individuals with a normal BMI at 7 years and without adult obesity, those with a high BMI at 7 years and adult obesity had higher type 2 diabetes risks (hazard ratio [HR] for girls 5.04 [95% CI 3.92–6.48]; HR for boys 3.78 [95% CI 2.68–5.33]). Individuals with a high BMI at 7 years but without adult obesity did not have a higher risk (HR for girls 0.74 [95% CI 0.52–1.06]; HR for boys 0.93 [95% CI 0.65–1.33]). Education, smoking, and LTPA were associated with diabetes risks but did not modify or confound the associations with BMI changes. Results for 12 years of age were similar.

CONCLUSIONS

A high BMI in childhood was associated with higher type 2 diabetes risks only if individuals also had obesity in adulthood. These associations were not influenced by educational and lifestyle factors, indicating that BMI is similarly related to the risk across all levels of these factors.





Early Childhood Antibiotic Treatment for Otitis Media and Other Respiratory Tract Infections Is Associated With Risk of Type 1 Diabetes: A Nationwide Register-Based Study With Sibling Analysis

OBJECTIVE

The effect of early-life antibiotic treatment on the risk of type 1 diabetes is debated. This study addressed the question using a register-based design in children up to age 10 years, including a large sibling-control analysis.

RESEARCH DESIGN AND METHODS

All singleton children (n = 797,318) born in Sweden between 1 July 2005 and 30 September 2013 were included and monitored to 31 December 2014. Cox proportional hazards models, adjusted for parental and perinatal characteristics, were applied, and stratified models were used to account for unmeasured confounders shared by siblings.

RESULTS

Type 1 diabetes developed in 1,297 children during the follow-up (median 4.0 years [range 0–8.3]). Prescribed antibiotics in the 1st year of life (23.8%) were associated with an increased risk of type 1 diabetes (adjusted hazard ratio [HR] 1.19 [95% CI 1.05–1.36]), with larger effect estimates among children delivered by cesarean section (P for interaction = 0.016). The association was driven by exposure to antibiotics primarily used for acute otitis media and respiratory tract infections. Further, we found an association of antibiotic prescriptions in pregnancy (22.5%) with type 1 diabetes (adjusted HR 1.15 [95% CI 1.00–1.32]). In general, sibling analysis supported these results, albeit often with statistically nonsignificant associations.

CONCLUSIONS

Dispensed prescription of antibiotics, mainly for acute otitis media and respiratory tract infections, in the 1st year of life is associated with an increased risk of type 1 diabetes before age 10 years, most prominently in children delivered by cesarean section.