
Chasing Efficiency: Can Operational Changes Fix European Asylum Systems?

Brussels is searching for bright ideas on how to fix the Common European Asylum System. While recent EU-level legal reforms have stalled, this report examines the many innovative, operations-focused approaches Member States have used since the 2015-16 migration crisis to improve registration and reception systems, asylum case processing, and options for returning failed asylum seekers.





Immigration and U.S. National Security: The State of Play Since 9/11

The U.S. government has made important progress in shoring up weaknesses at the nexus of immigration and national security since September 11, 2001. But as new threats emerge and evolve—including public-health emergencies such as the COVID-19 pandemic—the question is whether the post-9/11 system is up to the task of meeting these challenges, as this report explores.





When Good People Write Bad Sentences, by Robert Harris

All great writing starts with a sentence. But what is it that makes a sentence great? Could it be grammar, syntax, style, word choice, information, meaning, common sense, passion? According to one author, there is only one rule for writing a great sentence, and this is the rule: "whether you're Christian, Jew, Muslim, or a disciple of the church of Penn Jillette, when you sit down to write, the Reader is thy god." The rule is certainly thought-provoking, but one has to wonder whether a single rule is enough for writing a great sentence.

Robert Harris, in his book "When Good People Write Bad Sentences," offers "12 Steps to Verbal Enlightenment" that can cure any "bad writing addict" who is eager to learn. The 12 steps don't just provide solutions to well-known problems in the categories of punctuation, syntax, diction, and style; they also help bad writers understand the emotional foundations and psychological forces behind those problems. Harris argues, not without humor, that only with this deep understanding can permanent change take place. He identifies nine types of ineffective sentences that arise from unexamined emotions and self-destructive needs, and offers an integrated approach that could help writers take a broader and healthier perspective on sentence construction.

When it comes to the malady of writing badly, which he calls "malescribism" -- an uncontrollable urge to write carelessly and unpersuasively -- Harris warns that this malady "is no respecter of status, nor does it take into account social, ethnic, or religious orientation." He also notes that malescribes may be black or white, male or female, believer or nonbeliever, liberal or conservative.

Since we would all like to write better sentences consistently, let's look at the advice offered by Robert Harris, and also share some of the great sentences we have come across in our reading.


 





MPI’s Transatlantic Council on Migration Launches Research Series on Lasting Effects of Mixed Migration Flows

First report examines Canadian challenges & solutions in housing Syrian refugees

WASHINGTON — Four years after the peak of the 2015–16 migration and refugee crisis in Europe and amid swelling arrivals at the U.S.-Mexico border and elsewhere, new evidence sheds light on how well countries have responded to an unprecedented surge in mixed flows of humanitarian, economic and family migrants.





Latinos & Immigrants in Kansas City Metro Area Face Higher Health Insurance Coverage Gaps, Even as They Represent Fast-Growing Share of Workforce

WASHINGTON — Latinos and immigrants are at least twice as likely to lack health insurance coverage as the overall population in three central Kansas City metro counties, a new Migration Policy Institute (MPI) study reveals. In fact, they are four times as likely to be uninsured in Johnson County, Kansas. 





As European policymakers take stock of seasonal worker programmes, MPI Europe brief outlines principles to improve these schemes for all parties

Findings will be discussed during 25 February MPI Europe – SVR webinar





Is a U.S. Immigration System Rebuilt after 9/11 Prepared to Tackle Ever-Evolving Security Threats, Including Pandemics? Report Assesses Successes, Gaps

WASHINGTON — The U.S. immigration system was dramatically reshaped by the terrorist attacks of September 11, 2001, which shone a harsh spotlight on weaknesses in visa and immigration screening processes. From the creation of the Department of Homeland Security (DHS) to expanded national security protections in immigration and tourism policies, countless changes in the immigration arena have unfolded over the past 19 years.





Integrating Refugees and Asylum Seekers into the German Economy and Society: Empirical Evidence and Policy Objectives

As the top destination in Europe for asylum seekers in recent years, Germany has rolled out a number of integration policy changes. Based on an early look at how newcomers’ integration is progressing, the report finds the policies have had ambiguous implications. The report also provides insights into the demographic and socioeconomic characteristics of the asylum seeker and refugee population.





Rebuilding Community after Crisis: Striking a New Social Contract for Diverse Societies (Transatlantic Council Statement)

Addressing the deep-rooted integration challenges unearthed by large-scale migration and rapid social change will require a combination of strategies. Governments in Europe and North America must create a new social contract for increasingly diverse societies that are confronting cycles of disruption. This report sketches a blueprint for an adaptive process oriented by skill needs rather than national origins.






Kansas City Ribs - Lance Rosen

This recipe for Kansas City ribs, courtesy of Lance Rosen, was featured on Foodie Tuesday, a weekly segment with Raf Epstein on Drive, 774 ABC Melbourne, at 3.30pm. Lance's new book is called "Temples of BBQ".





Italian tomato, saffron and mozzarella arancini

1/2 cup chicken stock
1 pinch of saffron
olive oil, for cooking
1 small brown onion, diced
1 tbsp. tomato paste
150 g Carnaroli rice or Arborio rice
Himalayan salt and freshly ground black pepper
100 ml white wine
60 g grated parmesan
finely grated zest of 1 lemon
100 g grated mozzarella
3 eggs
125 ml (1/2 cup) milk
50 g plain flour
160 g panko breadcrumbs (see Note)
vegetable oil, for deep-frying





SLOW-COOKED QUINCES WITH MANDARIN AND ROSEWATER

Once you've peeled and cored the quinces there's really little work to do to make this gorgeous dessert. They just bubble away happily for a few hours while the sugar and long slow cooking cast their spell, transforming the quinces' hard, ivory-coloured flesh until it's meltingly soft and an incredible deep-rose colour.





Nancy's naan bread

Our Indian chef whips these delicious naan breads up for us every day to go with her chicken tikka and vegetarian curries. Brushed with garlic butter as they come out of the stone oven, they are quite a treat (that's why she has to make so many every day). Thanks Nancy!





Nancy's vege curry

One of the most popular dishes on our menu at the moment, this curry has so much flavour! Our Nancy is a spice master... the secret is in the cooking of the onions: she almost cooks them down to a paste.





ROASTED GREEN BEANS WITH LEMON AND ANCHOVIES

I nearly didn't write the word 'anchovies' in the name of this dish, as I know that many people will pretty much automatically bypass any recipe that has anchovies in it. It's such a shame, because (as is the case with many dishes that include them) they add a depth of flavour and an intriguing background note that is perfectly delicious without being at all fishy. If you're one of my mob, and love them, then do give this recipe a try... it's such a simple dish yet full of flavour (I sometimes add a chopped red chilli at the end to give it a bit more oomph, too). The beans are terrific with any sort of grilled meat or fish.






Greek syrup drenched semolina, yoghurt and almond cake somali

1 1/2 cups fine semolina
1 1/2 cups sugar
1/2 cup Greek yogurt
2 tsps. baking soda
1/2 tsp. ground mastic resin (we are substituting ground fennel)
3-4 tbsps. butter, melted
slivered almonds

For the syrup:
2 1/2 cups sugar
1 1/2 cups water
1 lemon, cut in half
1/2 tsp. rose water
Dash of vanilla extract
1 cinnamon stick





French spiced beetroot and apple jam

2 granny smith apples, coarsely grated
2 beetroots
1/4 tsp. ground clove
2 star anise
1/2 nutmeg, finely grated
200ml balsamic vinegar
150g caster sugar





Savoury Lunchbox Muffins

Monique Bowley shared this recipe on Foodie Tuesday, a weekly segment on ABC Radio Melbourne's Drive program at 3.30pm.





French salted caramel ganache tart

An indulgent French chocolate and salted caramel tart. Decorate with fresh raspberries and pistachio nuts for great colour and a sweet zing.





Performance of the ESC 0/1-h and 0/3-h Algorithm for the Rapid Identification of Myocardial Infarction Without ST-Elevation in Patients With Diabetes

OBJECTIVE

Patients with diabetes mellitus (DM) have elevated levels of high-sensitivity cardiac troponin (hs-cTn). We investigated the diagnostic performance of the European Society of Cardiology (ESC) algorithms to rule out or rule in acute myocardial infarction (AMI) without ST-elevation in patients with DM.

RESEARCH DESIGN AND METHODS

We prospectively enrolled 3,681 patients with suspected AMI and stratified them by the presence of DM. The ESC 0/1-h and 0/3-h algorithms were used to calculate negative and positive predictive values (NPV, PPV). In addition, alternative cutoffs were calculated and externally validated in 2,895 patients.
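For reference, the two metrics are standard proportions rather than anything specific to this study: NPV = TN / (TN + FN) among patients the algorithm rules out, and PPV = TP / (TP + FP) among patients it rules in, where TP, FP, TN, and FN are counts of true and false positives and negatives against the adjudicated AMI diagnosis.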

RESULTS

In total, 563 patients (15.3%) had DM, and 137 (24.3%) of these had AMI. When the ESC 0/1-h algorithm was used, the NPV was comparable in patients with and without DM (absolute difference [AD] –1.50 [95% CI –5.95, 2.96]). In contrast, the ESC 0/3-h algorithm resulted in a significantly lower NPV in patients with DM (AD –2.27 [95% CI –4.47, –0.07]). The diagnostic performance for rule-in of AMI (PPV) was comparable in both groups: 0/1-h (AD –6.59 [95% CI –19.53, 6.35]) and 0/3-h (AD 1.03 [95% CI –7.63, 9.70]). Alternative cutoffs increased the PPV in both algorithms significantly, while improvements in NPV were only subtle.

CONCLUSIONS

Application of the ESC 0/1-h algorithm revealed comparable safety to rule out AMI comparing patients with and without DM, while this was not observed with the ESC 0/3-h algorithm. Although alternative cutoffs might be helpful, patients with DM remain a high-risk population in whom identification of AMI is challenging and who require careful clinical evaluation.





Myocardial Ischemic Burden and Differences in Prognosis Among Patients With and Without Diabetes: Results From the Multicenter International REFINE SPECT Registry

OBJECTIVE

Prevalence and prognostic impact of cardiovascular disease differ between patients with or without diabetes. We aimed to explore differences in the prevalence and prognosis of myocardial ischemia by automated quantification of total perfusion deficit (TPD) among patients with and without diabetes.

RESEARCH DESIGN AND METHODS

Of 20,418 individuals who underwent single-photon emission computed tomography myocardial perfusion imaging, 2,951 patients with diabetes were matched to 2,951 patients without diabetes based on risk factors using propensity score. TPD was categorized as TPD = 0%, 0% < TPD < 1%, 1% ≤ TPD < 5%, 5% ≤ TPD ≤ 10%, and TPD >10%. Major adverse cardiovascular events (MACE) were defined as a composite of all-cause mortality, myocardial infarction, unstable angina, or late revascularization.
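As a concrete aside, 1:1 propensity matching of this kind can be sketched in a few lines of Python. This is a minimal illustration only, not the registry's actual pipeline; the data frame and column names are hypothetical, and matching here is with replacement for brevity:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_one_to_one(df, treat_col, covariate_cols):
    # Estimate the propensity score P(treated | covariates) by logistic regression.
    X = df[covariate_cols].to_numpy()
    y = df[treat_col].to_numpy()
    ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    treated = np.flatnonzero(y == 1)
    controls = np.flatnonzero(y == 0)
    # Pair each treated patient with the control whose score is closest.
    nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
    return df.iloc[np.concatenate([treated, controls[idx.ravel()]])]

# Hypothetical usage with a pandas DataFrame:
# matched = match_one_to_one(df, "diabetes", ["age", "bmi", "hypertension"])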

RESULTS

MACE risk was increased in patients with diabetes compared with patients without diabetes at each level of TPD above 0 (P < 0.001 for interaction). In patients with TPD >10%, patients with diabetes had greater than twice the MACE risk compared with patients without diabetes (annualized MACE rate 9.4 [95% CI 6.7–11.6] and 3.9 [95% CI 2.8–5.6], respectively, P < 0.001). Patients with diabetes with even very minimal TPD (0% < TPD < 1%) experienced a higher risk for MACE than those with 0% TPD (hazard ratio 2.05 [95% CI 1.21–3.47], P = 0.007). Patients with diabetes with a TPD of 0.5% had a similar MACE risk as patients without diabetes with a TPD of 8%.

CONCLUSIONS

For every level of TPD >0%, even a very minimal deficit of 0% < TPD < 1%, the MACE risk was higher in the patients with diabetes compared with patients without diabetes. Patients with diabetes with minimal ischemia had comparable MACE risk as patients without diabetes with significant ischemia.





Microvascular and Cardiovascular Outcomes According to Renal Function in Patients Treated With Once-Weekly Exenatide: Insights From the EXSCEL Trial

OBJECTIVE

To evaluate the impact of once-weekly exenatide (EQW) on microvascular and cardiovascular (CV) outcomes by baseline renal function in the Exenatide Study of Cardiovascular Event Lowering (EXSCEL).

RESEARCH DESIGN AND METHODS

Least squares mean difference (LSMD) in estimated glomerular filtration rate (eGFR) from baseline between the EQW and placebo groups was calculated for 13,844 participants. Cox regression models were used to estimate effects by group on incident macroalbuminuria, retinopathy, and major adverse CV events (MACE). Interval-censored time-to-event models estimated effects on renal composite 1 (40% eGFR decline, renal replacement, or renal death) and renal composite 2 (composite 1 variables plus macroalbuminuria).
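For readers unfamiliar with the model: a Cox regression writes the hazard as h(t | x) = h0(t) · exp(βx), so each effect is reported as a hazard ratio HR = exp(β), with HR < 1 favoring EQW. This is the standard textbook form, not anything particular to EXSCEL.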

RESULTS

EQW did not change eGFR significantly (LSMD 0.21 mL/min/1.73 m2 [95% CI –0.27 to 0.70]). Macroalbuminuria occurred in 2.2% of patients in the EQW group and in 2.5% of those in the placebo group (hazard ratio [HR] 0.87 [95% CI 0.70–1.07]). Neither renal composite was reduced with EQW in unadjusted analyses, but renal composite 2 was reduced after adjustment (HR 0.85 [95% CI 0.74–0.98]). Retinopathy rates did not differ by treatment group or in the HbA1c-lowering or prior retinopathy subgroups. CV outcomes in those with eGFR <60 mL/min/1.73 m2 did not differ by group. Those with eGFR ≥60 mL/min/1.73 m2 had nominal risk reductions for MACE, all-cause mortality, and CV death, but interactions by renal function group were significant only for stroke (HR 0.74 [95% CI 0.58–0.93]; P for interaction = 0.035) and CV death (HR 1.08 [95% CI 0.85–1.38]; P for interaction = 0.031).

CONCLUSIONS

EQW had no impact on unadjusted retinopathy or renal outcomes. CV risk was modestly reduced only in those with eGFR ≥60 mL/min/1.73 m2 in analyses unadjusted for multiplicity.





Novel Biomarkers for Change in Renal Function in People With Dysglycemia

OBJECTIVE

Diabetes is a major risk factor for renal function decline and failure. The availability of multiplex panels of biochemical markers provides the opportunity to identify novel biomarkers that can better predict changes in renal function than routinely available clinical markers.

RESEARCH DESIGN AND METHODS

The concentration of 239 biochemical markers was measured in stored serum from participants in the biomarker substudy of the Outcome Reduction With Initial Glargine Intervention (ORIGIN) trial. Repeated-measures mixed-effects models were used to compute the annual change in eGFR (measured in mL/min/1.73 m2/year) for the 7,482 participants with a recorded baseline and follow-up eGFR. Linear regression models using forward selection were used to identify the independent biomarker determinants of the annual change in eGFR after accounting for baseline HbA1c, baseline eGFR, and routinely measured clinical risk factors. The incidence of the composite renal outcome (i.e., renal replacement therapy, renal death, renal failure, albuminuria progression, or doubling of serum creatinine) and death within each fourth of the change in eGFR predicted from these models was also estimated.
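In the usual random-slope form (a generic sketch, not necessarily the authors' exact specification), such a mixed-effects model is eGFR_ij = (β0 + b0i) + (β1 + b1i)·t_ij + ε_ij, where t_ij is years since baseline for participant i at visit j; the participant-specific annual change in eGFR is the slope β1 + b1i.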

RESULTS

During 6.2 years of median follow-up, the median annual change in eGFR was –0.18 mL/min/1.73 m2/year. Fifteen biomarkers independently predicted eGFR decline after accounting for cardiovascular risk factors, as did 12 of these plus 1 additional biomarker after accounting for renal risk factors. Every 0.1 mL/min/1.73 m2 of predicted annual fall in eGFR was associated with a 13% (95% CI 12, 14%) higher mortality.

CONCLUSIONS

Adding up to 16 biomarkers to routinely measured clinical risk factors improves the prediction of annual change in eGFR in people with dysglycemia.





The Prevalence and Determinants of Cognitive Deficits and Traditional Diabetic Complications in the Severely Obese

OBJECTIVE

To determine the prevalence of cognitive deficits and traditional diabetic complications and the association between metabolic factors and these outcomes.

RESEARCH DESIGN AND METHODS

We performed a cross-sectional study in severely obese individuals before bariatric surgery. Lean control subjects were recruited from a research website. Cognitive deficits were defined by the National Institutes of Health (NIH) Toolbox (<5th percentile for lean control subjects). Cardiovascular autonomic neuropathy (CAN) was defined by an expiration-to-inspiration (E-to-I) ratio of <5th percentile for lean control subjects. Retinopathy was based on retinal photographs and nephropathy on the estimated glomerular filtration rate (<60 mL/min/1.73 m2) and/or the albumin-to-creatinine ratio (ACR) (≥30 mg/g). NIH Toolbox, E-to-I ratio, mean deviation on frequency doubling technology testing, and ACR were used as sensitive measures of these outcomes. We used multivariable linear regression to explore associations between metabolic factors and these outcomes.

RESULTS

We recruited 138 severely obese individuals and 46 lean control subjects. The prevalences of cognitive deficits, CAN, retinopathy, and nephropathy were 6.5%, 4.4%, 0%, and 6.5% in lean control subjects; 22.2%, 18.2%, 0%, and 6.1% in obese participants with normoglycemia; 17.7%, 21.4%, 1.9%, and 17.9% in obese participants with prediabetes; and 25.6%, 31.9%, 6.1%, and 16.3% in obese participants with diabetes. Waist circumference was significantly associated with cognitive function (–1.48; 95% CI –2.38, –0.57) and E-to-I ratio (–0.007; 95% CI –0.012, –0.002). Prediabetes was significantly associated with retinal function (–1.78; 95% CI –3.56, –0.002).

CONCLUSIONS

Obesity alone is likely sufficient to cause cognitive deficits but not retinopathy or nephropathy. Central obesity is the key metabolic risk factor.





Reduction in Global Myocardial Glucose Metabolism in Subjects With 1-Hour Postload Hyperglycemia and Impaired Glucose Tolerance

OBJECTIVE

Impaired insulin-stimulated myocardial glucose uptake has been observed in patients with type 2 diabetes with or without coronary artery disease. Whether cardiac insulin resistance is also present in subjects at risk for type 2 diabetes, such as individuals with impaired glucose tolerance (IGT) or those with normal glucose tolerance (NGT) and a 1-h postload glucose ≥155 mg/dL during an oral glucose tolerance test (NGT 1-h high), remains uncertain. This study examined that issue.

RESEARCH DESIGN AND METHODS

The myocardial metabolic rate of glucose (MRGlu) was measured by using dynamic 18F-fluorodeoxyglucose positron emission tomography combined with a euglycemic-hyperinsulinemic clamp in 30 volunteers without coronary artery disease. Three groups were studied: 1) those with 1-h postload glucose <155 mg/dL (NGT 1-h low) (n = 10), 2) those with NGT 1-h high (n = 10), and 3) those with IGT (n = 10).

RESULTS

After adjusting for age, sex, and BMI, both subjects with NGT 1-h high (23.7 ± 6.4 μmol/min/100 g; P = 0.024) and those with IGT (16.4 ± 6.0 μmol/min/100 g; P < 0.0001) exhibited a significant reduction in global myocardial MRGlu compared with subjects with NGT 1-h low, in whom this value was 32.8 ± 9.7 μmol/min/100 g. Univariate correlations showed that MRGlu was positively correlated with insulin-stimulated whole-body glucose disposal (r = 0.441; P = 0.019) and negatively correlated with 1-h (r = –0.422; P = 0.025) and 2-h (r = –0.374; P = 0.05) postload glucose levels, but not with fasting glucose.

CONCLUSIONS

This study shows that myocardial insulin resistance is an early defect that is already detectable in individuals with dysglycemic conditions associated with an increased risk of type 2 diabetes, such as IGT and NGT 1-h high.





Genetic Susceptibility Determines β-Cell Function and Fasting Glycemia Trajectories Throughout Childhood: A 12-Year Cohort Study (EarlyBird 76)

OBJECTIVE

Previous studies suggested that childhood prediabetes may develop prior to obesity and be associated with relative insulin deficiency. We proposed that the insulin-deficient phenotype is genetically determined and tested this hypothesis by longitudinal modeling of insulin and glucose traits with diabetes risk genotypes in the EarlyBird cohort.

RESEARCH DESIGN AND METHODS

EarlyBird is a nonintervention prospective cohort study that recruited 307 healthy U.K. children at 5 years of age and followed them throughout childhood. We genotyped 121 single nucleotide polymorphisms (SNPs) previously associated with diabetes risk in the adult population. Associations of the SNPs with fasting insulin and glucose and with HOMA indices of insulin resistance and β-cell function, available from 5 to 16 years of age, were tested. Association analysis with hormones was performed on selected SNPs.

RESULTS

Several candidate loci influenced the course of glycemic and insulin traits, including rs780094 (GCKR), rs4457053 (ZBED3), rs11257655 (CDC123), rs12779790 (CDC123 and CAMK1D), rs1111875 (HHEX), rs7178572 (HMG20A), rs9787485 (NRG3), and rs1535500 (KCNK16). Some of these SNPs interacted with age, the growth hormone–IGF-1 axis, and adrenal and sex steroid activity.

CONCLUSIONS

The findings that genetic markers influence both elevated and average courses of glycemic traits and β-cell function in children during puberty independently of BMI are a significant step toward early identification of children at risk for diabetes. These findings build on our previous observations that pancreatic β-cell defects predate insulin resistance in the onset of prediabetes. Understanding the mechanisms of interactions among genetic factors, puberty, and weight gain would allow the development of new and earlier disease-management strategies in children.





Keep Your YouTube Subscriptions in Sync With Inoreader

Did you know you can subscribe to YouTube channels and playlists in Inoreader? Simply paste the URL of the channel…





16th Annual Immigration Law and Policy Conference

With immigration a central plank of the Trump administration's policy agenda, the 16th annual Immigration Law and Policy Conference, held in October 2019, featured analysis by top experts in and out of government regarding changing policies implemented at the U.S.-Mexico border, narrowing of asylum, cooperation with migrant-transit countries, and actions that could reduce legal immigration, including revisions to the public-charge rule.





Young Refugee Children: Their Schooling Experiences in the United States and in Countries of First Asylum

In this webinar, the authors of three papers on the experiences of refugee children present their findings, with a focus on how such experiences affect their mental health and education.





Mental Health Risks and Resilience among Somali and Bhutanese Refugee Parents

Somali and Bhutanese refugees are two of the largest groups recently resettled in the United States and Canada. This report examines factors that might promote or undermine the mental health and overall well-being of the children of these refugees, including past exposure to trauma, parental mental health, educational attainment, social support, and discrimination.





A Study of Pregnancy and Birth Outcomes among African-Born Women Living in Utah

Resettled African refugee women may experience particularly acute complications during pregnancy, birth, and the child's early infancy. Yet health-care providers and policymakers may not be aware of the particular challenges that these women and their children face. This report, examining women who gave birth in Utah over a seven-year period, compares the perinatal complications of African-born women with those of a segment of the U.S.-born population.





In the Age of Trump: Populist Backlash and Progressive Resistance Create Divergent State Immigrant Integration Contexts

As long-simmering passions related to federal immigration policies have come to a full boil, less noted but no less important debates are taking place at state and local levels with regard to policies affecting immigrants and their children. As states increasingly diverge in their responses, this report examines how some of the key policies and programs that support long-term integration success are faring in this volatile era.





Health Insurance Test for Green-Card Applicants Could Sharply Cut Future U.S. Legal Immigration

A new Trump administration action requiring intending immigrants to prove they can purchase eligible health insurance within 30 days of arrival has the potential to block fully 65 percent of those who apply for a green card from abroad, MPI estimates.





Health Insurance Coverage of Immigrants and Latinos in the Kansas City Metro Area

Latinos and immigrants are at least twice as likely to lack health insurance coverage as the overall population in the Kansas City metropolitan area. This gap has significant implications for the region, as Latinos and immigrants will form an ever-growing share of the area’s labor force and tax base amid anticipated declines in the native-born, non-Latino population.





Diapression: An Integrated Model for Understanding the Experience of Individuals With Co-Occurring Diabetes and Depression

Paul Ciechanowski
Apr 1, 2011; 29:43-49
Feature Articles





Persistence of Continuous Glucose Monitoring Use in a Community Setting 1 Year After Purchase

James Chamberlain
Jul 1, 2013; 31:106-109
Feature Articles





Interdisciplinary Team Care for Diabetic Patients by Primary Care Physicians, Advanced Practice Nurses, and Clinical Pharmacists

David Willens
Apr 1, 2011; 29:60-68
Feature Articles





Application of Adult-Learning Principles to Patient Instructions: A Usability Study for an Exenatide Once-Weekly Injection Device

Gayle Lorenzi
Sep 1, 2010; 28:157-162
Bridges to Excellence





What's So Tough About Taking Insulin? Addressing the Problem of Psychological Insulin Resistance in Type 2 Diabetes

William H. Polonsky
Jul 1, 2004; 22:147-150
Practical Pointers





Improving Patient Adherence

Alan M. Delamater
Apr 1, 2006; 24:71-77
Feature Articles





Heroism Science: Call for Papers, Special Issue: The Heroism of Whistleblowers

Edited by Ari Kohen, Brian Riches, and Matt Langdon. Whistleblowers speak up with “concerns or information about wrongdoing inside organizations and institutions.” As such, whistleblowing “can be one of the most important and difficult forms of heroism in modern society” (Brown, 2016, p. 1).





Baseball and Linguistic Uncertainty

In my youth I played an inordinate amount of baseball, collected baseball cards, and idolized baseball players. I've outgrown all that but when I'm in the States during baseball season I do enjoy watching a few innings on the TV.

So I was watching a baseball game recently and the commentator was talking about the art of pitching. Throwing a baseball, he said, is like shooting a shotgun. You get a spray. As a pitcher, you have to know your spray. You learn to control it, but you know that it is there. The ball won't always go where you want it. And furthermore, where you want the ball depends on the batter's style and strategy, which vary from pitch to pitch for every batter.

That's baseball talk, but it stuck in my mind. Baseball pitchers must manage uncertainty! And it is not enough to reduce it and hope for the best. Suppose you want to throw a strike. It's not a good strategy to aim directly at, say, the lower outside corner of the strike zone, because of the spray of the ball's path and because the batter's stance can shift. Especially if the spray is skewed down and out, you'll want to move up and in a bit.
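The pitcher's adjustment is easy to see in a toy simulation (entirely my own illustration; the strike-zone dimensions and spray parameters are made up):

import numpy as np

rng = np.random.default_rng(0)

def strike_prob(aim_x, aim_z, n=100_000):
    # Hypothetical strike zone: |x| <= 0.7 ft wide, 1.5 ft <= z <= 3.5 ft high.
    # Spray skewed down and out: errors drift toward negative x and negative z.
    x = aim_x + rng.normal(-0.15, 0.30, n)
    z = aim_z + rng.normal(-0.20, 0.30, n)
    return np.mean((np.abs(x) <= 0.7) & (z >= 1.5) & (z <= 3.5))

print(strike_prob(-0.7, 1.5))  # aim at the low-outside corner: ~0.08
print(strike_prob(-0.4, 2.0))  # aim up and in a bit: ~0.58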

This is all very similar to the ambiguity of human speech when we pitch words at each other. Words don't have precise meanings; meanings spread out like the pitcher's spray. If we want to communicate precisely we need to be aware of this uncertainty, and manage it, taking account of the listener's propensities.

Take the word "liberal" as it is used in political discussion.

For many decades, "liberals" have tended to support high taxes to provide generous welfare, public medical insurance, and low-cost housing. They advocate liberal (meaning magnanimous or abundant) government involvement for the citizens' benefit.

A "liberal" might also be someone who is open-minded and tolerant, who is not strict in applying rules to other people, or even to him or herself. Such a person might be called "liberal" (meaning advocating individual rights) for opposing extensive government involvement in private decisions. For instance, liberals (in this second sense) might oppose high taxes since they reduce individuals' ability to make independent choices. As another example, John Stuart Mill opposed laws which restricted the rights of women to work (at night, for instance), even though these laws were intended to promote the welfare of women. Women, insisted Mill, are intelligent adults and can judge for themselves what is good for them.

Returning to the first meaning of "liberal" mentioned above, people of that strain may support restrictions on trade with countries that ignore the health and safety of workers. The other type of "liberal" might tend to support unrestricted trade.

Sending out words and pitching baseballs are both like shooting a shotgun: meanings (and baseballs) spray out. You must know what meaning you wish to convey, and what other meanings the word can have. The choice of the word, and the crafting of its context, must manage the uncertainty of where the word will land in the listener's mind.


Let's go back to baseball again.

If there were no uncertainty in the pitcher's pitch and the batter's swing, then baseball would be a dreadfully boring game. If the batter knows exactly where and when the ball will arrive, and can completely control the bat, then every swing will be a homer. Or conversely, if the pitcher always knows exactly how the batter will swing, and if each throw is perfectly controlled, then every batter will strike out. But which is it? Whose certainty dominates? The batter's or the pitcher's? It can't be both. There is some deep philosophical problem here. Clearly there cannot be complete certainty in a world which has some element of free will, or surprise, or discovery. This is not just a tautology, a necessary result of what we mean by "uncertainty" and "surprise". It is an implication of limited human knowledge. Uncertainty - which makes baseball and life interesting - is inevitable in the human world.

How does this carry over to human speech?

It is said of the Wright brothers that they thought so synergistically that one brother could finish an idea or sentence begun by the other. If there is no uncertainty in what I am going to say, then you will be bored with my conversation, or at least, you won't learn anything from me. It is because you don't know what I mean by, for instance, "robustness", that my speech on this topic is enlightening (and maybe interesting). And it is because you disagree with me about what robustness means (and you tell me so), that I can perhaps extend my own understanding.

So, uncertainty is inevitable in a world that is rich enough to have surprise or free will. Furthermore, this uncertainty leads to a process - through speech - of discovery and new understanding. Uncertainty, and the use of language, leads to discovery.

Isn't baseball an interesting game?





The End of Science?


Science is the search for and study of patterns and laws in the natural and physical worlds. Could that search become exhausted, like an over-worked coal vein, leaving nothing more to be found? Could science end? After briefly touching on several fairly obvious possible end-games for science, we explore how the vast Unknown could undermine - rather than underlie - the scientific enterprise. The possibility that science could end is linked to the reason that science is possible at all. The path we must climb in this essay is steep, but the (in)sight is worth it.

Science is the process of discovering unknowns, one of which is the extent of Nature's secrets. It is possible that the inventory of Nature's unknowns is finite or conceivably even nearly empty. However, a look at open problems in science, from astronomy to zoology, suggests that Nature's storehouse of surprises is still chock full. So, from this perspective, the answer to the question 'Could science end?' is conceivably 'Yes', but most probably 'No'.

Another possible 'Yes' answer is that science will end by reaching the limit of human cognitive capability. Nature's storehouse of surprises may never empty out, but the rate of our discoveries may gradually fall, reaching zero when scientists have figured out everything that humans are able to understand. Possible, but judging from the last 400 years, it seems that we've only begun to tap our mind's expansive capability.

Or perhaps science - a product of human civilization - will end due to historical or social forces. The simplest such scenario is that we blow ourselves to smithereens. Smithereens can't do science. Another more complicated scenario is Oswald Spengler's theory of cyclical history, whereby an advanced society - such as Western civilization - decays and disappears, science disappearing with it. So again a tentative 'Yes'. But this might only be an interruption of science if later civilizations resume the search.

We now explore the main mechanism by which science could become impossible. This will lead to deeper understanding of the delicate relation between knowledge and the Unknown and to why science is possible at all.

One axiom of science is that there exist stable and discoverable laws of nature. As the philosopher A.N. Whitehead wrote in 1925: "Apart from recurrence, knowledge would be impossible; for nothing could be referred to our past experience. Also, apart from some regularity of recurrence, measurement would be impossible." (Science and the Modern World, p.36). The stability of phenomena is what allows a scientist to repeat, study and build upon the work of other scientists. Without regular recurrence there would be no such thing as a discoverable law of nature.

However, as David Hume explained long ago in An Enquiry Concerning Human Understanding, one can never empirically prove that regular recurrence will hold in the future. By the time one tests the regularity of the future, that future has become the past. The future can never be tested, just as one can never step on the rolled up part of an endless rug unfurling always in front of you.

Suppose the axiom of Natural Law turns out to be wrong, or suppose Nature comes unstuck and its laws start "sliding around", changing. Science would end. If regularity, patterns, and laws no longer exist, then scientific pursuit of them becomes fruitless.

Or maybe not. Couldn't scientists search for the laws by which Nature "slides around"? Quantum mechanics seems to do just that. For instance, when a polarized photon impinges on a polarizing crystal, the photon will either be entirely absorbed or entirely transmitted, as Dirac explained. The photon's fate is not determined by any law of Nature (if you believe quantum mechanics). Nature is indeterminate in this situation. Nonetheless, quantum theory very accurately predicts the probability that the photon will be transmitted, and the probability that it will be absorbed. In other words, quantum mechanics establishes a deterministic law describing Nature's indeterminism.
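Dirac's example can be made quantitative. A photon polarized at angle θ to the crystal's axis is transmitted with probability cos²θ and absorbed with probability sin²θ: a deterministic rule about probabilities, even though each individual photon's fate is undetermined.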

Suppose Nature's indeterminism itself becomes lawless. Is that conceivable? Could Nature become so disorderly, so confused and uncertain, so "out of joint: O, cursed spite", that no law can "set it right"? The answer is conceivably 'Yes', and if this happens then scientists are all out of a job. To understand how this is conceivable, one must appreciate the Unknown at its most rambunctious.

Let's take stock. We can identify attributes of Nature that are necessary for science to be possible. The axiom of Natural Law is one necessary attribute. The successful history of science suggests that the axiom of Natural Law has held firmly in the past. But that does not determine what Nature will be in the future.

In order to understand how Natural Law could come unstuck, we need to understand how Natural Law works (today). When a projectile, say a baseball, is thrown from here to there, its progress at each point along its trajectory is described, scientifically, in terms of its current position, direction of motion, and attributes such as its shape, mass and surrounding medium. The Laws of Nature enable the calculation of the ball's progress by solving a mathematical equation whose starting point is the current state of the ball.

We can roughly describe most Laws of Nature as formulations of problems - e.g. mathematical equations - whose input is the current and past states of the system in question, and whose solution predicts an outcome: the next state of the system. What is law-like about this is that these problems - whose solution describes a progression, like the flight of a baseball - are constant over time. The scientist calculates the baseball's trajectory by solving the same problem over and over again (or all at once with a differential equation). Sometimes the problem is hard to solve, so scientists are good mathematicians, or they have big computers, (or both). But solvable they are.
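To make "the same problem over and over again" concrete, here is a minimal sketch (my own illustration, with made-up constants) that steps a thrown ball forward by applying one fixed update rule at every instant:

import numpy as np

dt = 0.001                   # time step, seconds
g = np.array([0.0, -9.81])   # gravitational acceleration, m/s^2
k = 0.002                    # toy air-drag coefficient per unit mass, 1/m

pos = np.array([0.0, 1.8])   # release point, m
vel = np.array([38.0, 2.0])  # initial velocity, m/s

while pos[1] > 0.0:          # until the ball reaches the ground
    acc = g - k * np.linalg.norm(vel) * vel  # the same rule at every step
    vel = vel + dt * acc
    pos = pos + dt * vel

print(f"the ball lands about {pos[0]:.1f} m away")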

Let's remember that Nature is not a scientist, and Nature does not solve a problem when things happen (like baseballs speeding to home plate). Nature just does it. The scientist's Law is a description of Nature, not Nature itself.

There are other Laws of Nature for which we must modify the previous description. In these cases, the Law of Nature is, as before, the formulation of a problem. Now, however, the solution of the problem not only predicts the next state of the system, but it also re-formulates the problem that must be solved at the next step. There is sort of a feedback: the next state of the system alters the rule by which subsequent progress is made. For instance, when an object falls towards earth from outer space, the law of nature that determines the motion of the object depends on the gravitational attraction. The gravitational attraction, in turn, increases as the object gets closer. Thus the problem to be solved changes as the object moves. Problems like these tend to be more difficult to solve, but that's the scientist's problem (or pleasure).
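In symbols, for the falling object the acceleration is a = GM/r² (Newton's standard form); as the object falls, r shrinks, so the equation the scientist must solve is slightly different at every instant.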

Now we can appreciate how Nature might become lawlessly unstuck. Let's consider the second type of Natural Law, where the problem - the Law itself - gets modified by the evolving event. Let's furthermore suppose that the problem is not simply difficult to solve, but that no solution can be obtained in a finite amount of time (mathematicians have lots of examples of problems like this). As before, Nature itself does not solve a problem; Nature just does it. But the scientist is now in the position that no prediction can be made, no trajectory can be calculated, no model or description of the phenomenon can be obtained. No explicit problem statement embodying a Natural Law exists. This is because the problem to be solved evolves continuously from previous solutions, and none of the sequence of problems can be solved. The scientist's profession will become frustrating, futile and fruitless.

Nature becomes lawlessly unstuck, and science ends, if all Laws of Nature become of the modified second type. The world itself will continue because Nature solves no problems, it just does its thing. But the way it does this is now so raw and unruly that no study of nature can get to first base.

Sound like science fiction (or nightmare)? Maybe. But as far as we know, the only thing between us and this new state of affairs is the axiom of Natural Law. Scientists assume that Laws exist and are stable because past experience, together with our psychological makeup (which itself is evolutionary past experience), very strongly suggests that regular recurrence can be relied upon. But if you think that the scientists can empirically prove that the future will continue to be lawful, like the past, recall that all experience is past experience. Recall the unfurling-rug metaphor (by the time we test the future it becomes the past), and make an appointment to see Mr Hume.

Is science likely to become fruitless or boring? No. Science thrives on an Unknown that is full of surprises. Science - the search for Natural Laws - thrives even though the existence of Natural Law can never be proven. Science thrives precisely because we can never know for sure that science will not someday end. 





The Language of Science and the Tower of Babel


And God said: Behold one people with one language for them all ... and now nothing that they venture will be kept from them. ... [And] there God mixed up the language of all the land. (Genesis, 11:6-9)

"Philosophy is written in this grand book the universe, which stands continually open to our gaze. But the book cannot be understood unless one first learns to comprehend the language and to read the alphabet in which it is composed. It is written in the language of mathematics." Galileo Galilei

Language is power over the unknown. 

Mathematics is the language of science, and computation is the modern voice in which this language is spoken. Scientists and engineers explore the book of nature with computer simulations of swirling galaxies and colliding atoms, crashing cars and wind-swept buildings. The wonders of nature and the powers of technological innovation are displayed on computer screens, "continually open to our gaze." The language of science empowers us to dispel confusion and uncertainty, but only with great effort do we change the babble of sounds and symbols into useful, meaningful and reliable communication. How we do that depends on the type of uncertainty against which the language struggles.

Mathematical equations encode our understanding of nature, and Galileo exhorts us to learn this code. One challenge here is that a single equation represents an infinity of situations. For instance, the equation describing a flowing liquid captures water gushing from a pipe, blood coursing in our veins, and a droplet splashing from a puddle. Gazing at the equation is not at all like gazing at the droplet. Understanding grows by exposure to pictures and examples. Computations provide numerical examples of equations that can be realized as pictures. Computations can simulate nature, allowing us to explore at our leisure.
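The flowing-liquid equation alluded to here is, in one standard incompressible form (conventional notation, my addition rather than the essay's): ρ(∂v/∂t + (v·∇)v) = −∇p + μ∇²v + f, where v is the velocity field, p the pressure, μ the viscosity, and f the body force. The same few symbols cover the gushing pipe, the vein, and the splashing droplet.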

Two questions face the user of computations: Are we calculating the correct equations? Are we calculating the equations correctly? The first question expresses the scientist's ignorance - or at least uncertainty - about how the world works. The second question reflects the programmer's ignorance or uncertainty about the faithfulness of the computer program to the equations. Both questions deal with the fidelity between two entities. However, the entities involved are very different and the uncertainties are very different as well.

The scientist's uncertainty is reduced by the ingenuity of the experimenter. Equations make predictions that can be tested by experiment. For instance, Galileo predicted that small and large balls will fall at the same rate, as he is reported to have tested from the tower of Pisa. Equations are rejected or modified when their predictions don't match the experimenter's observation. The scientist's uncertainty and ignorance are whittled away by testing equations against observation of the real world. Experiments may be extraordinarily subtle or difficult or costly because nature's unknown is so endlessly rich in possibilities. Nonetheless, observation of nature remorselessly cuts false equations from the body of scientific doctrine. God speaks through nature, as it were, and "the Eternal of Israel does not deceive or console." (1 Samuel, 15:29). When this observational cutting and chopping is (temporarily) halted, the remaining equations are said to be "validated" (but they remain on the chopping block for further testing).

The programmer's life is, in one sense, more difficult than the experimenter's. Imagine a huge computer program containing millions of lines of code, the accumulated fruit of thousands of hours of effort by many people. How do we verify that this computation faithfully reflects the equations that have ostensibly been programmed? Of course they've been checked again and again for typos or logical faults or syntactic errors. Very clever methods are available for code verification. Nonetheless, programmers are only human, and some infidelity may slip through. What remorseless knife does the programmer have with which to verify that the equations are correctly calculated? Testing computation against observation does not allow us to distinguish between errors in the equations, errors in the program, and compensatory errors in both.

The experimenter compares an equation's prediction against an observation of nature. Like the experimenter, the programmer compares the computation against something. However, for the programmer, the sharp knife of nature is not available. In special cases the programmer can compare against a known answer. More frequently the programmer must compare against other computations which have already been verified (by some earlier comparison). The verification of a computation - as distinct from the validation of an equation - can only use other high-level human-made results. The programmer's comparisons can only be traced back to other comparisons. It is true that the experimenter's tests are intermediated by human artifacts like calipers or cyclotrons. Nonetheless, bedrock for the experimenter is the "reality out there". The experimenter's tests can be traced back to observations of elementary real events. The programmer does not have that recourse. One might say that God speaks to the experimenter through nature, but the programmer has no such Voice upon which to rely.

The tower built of old would have reached the heavens because of the power of language. That tower was never completed because God turned talk into babble and dispersed the people across the land. Scholars have argued whether the story prescribes a moral norm, or simply describes the way things are, but the power of language has never been disputed.

The tower was never completed, just as science, it seems, has a long way to go. Genius, said Edison, is 1 percent inspiration and 99 percent perspiration. A good part of the sweat comes from getting the language right, whether mathematical equations or computer programs.

Part of the challenge is finding order in nature's bubbling variety. Each equation captures a glimpse of that order, adding one block to the structure of science. Furthermore, equations must be validated, which is only a stop-gap. All blocks crumble eventually, and all equations are fallible and likely to be falsified.

Another challenge in science and engineering is grasping the myriad implications that are distilled into an equation. An equation compresses and summarizes, while computer simulations go the other way, restoring detail and specificity. The fidelity of a simulation to the equation is usually verified by comparing against other simulations. This is like the dictionary paradox: using words to define words.

It is by inventing and exploiting symbols that humans have constructed an orderly world out of the confusing tumult of experience. With symbols, like with blocks in the tower, the sky is the limit.





Jabberwocky. Or: Grand Unified Theory of Uncertainty???


Jabberwocky, Lewis Carroll's whimsical nonsense poem, uses made-up words to create an atmosphere and to tell a story. "Brillig", "frumious", "vorpal" and "uffish" have no lexical meaning, but they could have. The poem demonstrates that the realm of imagination exceeds the bounds of reality just as the set of possible words and meanings exceeds its real lexical counterpart.

Uncertainty thrives in the realm of imagination, incongruity, and contradiction. Uncertainty falls in the realm of science fiction as much as in the realm of science. People have struggled with uncertainty for ages and many theories of uncertainty have appeared over time. How many uncertainty theories do we need? Lots, and forever. Would we say that of physics? No, at least not forever.

Can you think inconsistent, incoherent, or erroneous thoughts? I can. (I do it quite often, usually without noticing.) For those unaccustomed to thinking incongruous thoughts, and who need a bit of help to get started, I can recommend thinking of "two meanings packed into one word like a portmanteau," like 'fuming' and 'furious' to get 'frumious' or 'snake' and 'shark' to get 'snark'.

Portmanteau words are a start. Our task now is portmanteau thoughts. Take for instance the idea of a 'thingk':

When I think a thing I've thought,
I have often felt I ought
To call this thing I think a "Thingk",
Which ought to save a lot of ink.

The participle is written "thingking",
(Which is where we save on inking,)
Because "thingking" says in just one word:
"Thinking of a thought thing." Absurd!

All this shows high-power abstraction.
(That highly touted human contraption.)
Using symbols with subtle feint,
To stand for something which they ain't.

Now that wasn't difficult: two thoughts at once. Now let those thoughts be contradictory. To use a prosaic example: thinking the unthinkable, which I suppose is 'unthingkable'. There! You did it. You are on your way to a rich and full life of thinking incongruities, fallacies and contradictions. We can hold in our minds thoughts of 4-sided triangles, parallel lines that intersect, and endless other seeming impossibilities, from super-girls like Pippi Longstocking to life on Mars (some of which may actually be true, or at least possible).

Scientists, logicians, and saints are in the business of dispelling all such incongruities, errors and contradictions. Banishing inconsistency is possible in science because (or if) there is only one coherent world. Belief in one coherent world and one grand unified theory is the modern secular version of the ancient monotheistic intuition of one universal God (in which saints tend to believe). Uncertainty thrives in the realm in which scientists and saints have not yet completed their tasks (perhaps because they are incompletable). For instance, we must entertain a wide range of conflicting conceptions when we do not yet know how (or whether) quantum mechanics can be reconciled with general relativity, or Pippi's strength reconciled with the limitations of physiology. As Henry Adams wrote:

"Images are not arguments, rarely even lead to proof, but the mind craves them, and, of late more than ever, the keenest experimenters find twenty images better than one, especially if contradictory; since the human mind has already learned to deal in contradictions."

The very idea of a rigorously logical theory of uncertainty is startling and implausible because the realm of the uncertain is inherently incoherent and contradictory. Indeed, the first uncertainty theory - probability - emerged many centuries after the invention of the axiomatic method in mathematics. Today we have many theories of uncertainty: probability, imprecise probability, information theory, generalized information theory, fuzzy logic, Dempster-Shafer theory, info-gap theory, and more (the list is a bit uncertain). Why such a long and diverse list? It seems that in constructing a logically consistent theory of the logically inconsistent domain of uncertainty, one cannot capture the whole beast all at once (though I'm uncertain about this).

A theory, in order to be scientific, must exclude something. A scientific theory makes statements such as "This happens; that doesn't happen." Karl Popper explained that a scientific theory must contain statements that are at risk of being wrong, statements that could be falsified. Deborah Mayo demonstrated how science grows by discovering and recovering from error.

The realm of uncertainty contains contradictions (ostensible or real) such as the pair of statements: "Nine year old girls can lift horses" and "Muscle fiber generates tension through the action of actin and myosin cross-bridge cycling". A logically consistent theory of uncertainty can handle improbabilities, as can scientific theories like quantum mechanics. But a logical theory cannot encompass outright contradictions. Science investigates a domain: the natural and physical worlds. Those worlds, by virtue of their existence, are perhaps coherent in a way that can be reflected in a unified logical theory. Theories of uncertainty are directed at a larger domain: the natural and physical worlds and all imaginable (and unimaginable) other worlds. That larger domain is definitely not coherent, and a unified logical theory would seem to be unattainable. Hence many theories of uncertainty are needed.

Scientific theories are good to have, and we do well to encourage the scientists. But it is a mistake to think that the scientific paradigm is suitable to all domains, in particular, to the study of uncertainty. Logic is a powerful tool and the axiomatic method assures the logical consistency of a theory. For instance, Leonard Savage argued that personal probability is a "code of consistency" for choosing one's behavior. Jim March compares the rigorous logic of mathematical theories of decision to strict religious morality. Consistency between values and actions is commendable says March, but he notes that one sometimes needs to deviate from perfect morality. While "[s]tandard notions of intelligent choice are theories of strict morality ... saints are a luxury to be encouraged only in small numbers." Logical consistency is a merit of any single theory, including a theory of uncertainty. However, insisting that the same logical consistency apply over the entire domain of uncertainty is like asking reality and saintliness to make peace.





We're Just Getting Started: A Glimpse at the History of Uncertainty


We've had our cerebral cortex for several tens of thousands of years. We've lived in more or less sedentary settlements and produced excess food for 7 or 8 thousand years. We've written down our thoughts for roughly 5 thousand years. And Science? The ancient Greeks had some, but science and its systematic application are overwhelmingly a European invention of the past 500 years. We can be proud of our accomplishments (quantum theory, polio vaccine, powered machines), and we should worry about our destructive capabilities (atomic, biological and chemical weapons). But it is quite plausible, as Koestler suggests, that we've only just begun to discover our cerebral capabilities. It is more than just plausible that the mysteries of the universe are still largely hidden from us. As evidence, consider the fact that the main theories of physics - general relativity, quantum mechanics, statistical mechanics, thermodynamics - are still not unified. And it goes without saying that the consilient unity of science is still far from us.

What holds for science in general holds also for the study of uncertainty. The ancient Greeks invented the axiomatic method and used it in the study of mathematics. Some medieval thinkers explored the mathematics of uncertainty, but it wasn't until around 1600 that serious thought was directed to the systematic study of uncertainty, and statistics as a separate and mature discipline emerged only in the 19th century. The 20th century saw a florescence of uncertainty models. Łukasiewicz discovered 3-valued logic in 1917, and in 1965 Zadeh introduced his work on fuzzy logic. In between, Wald formulated a modern version of min-max in 1945. A plethora of other theories, including P-boxes, lower previsions, Dempster-Shafer theory, generalized information theory and info-gap theory, all suggest that the study of uncertainty will continue to grow and diversify.

In short, we have learned many facts and begun to understand our world and its uncertainties, but the disputes and open questions are still rampant and the yet-unformulated questions are endless. This means that innovations, discoveries, inventions, surprises, errors, and misunderstandings are to be expected in the study or management of uncertainty. We are just getting started.