
The Pandemic Can’t Lock Down Nature - Issue 84: Outbreak


Needing to clear my head, I went down to the Penobscot River. There they were, swimming with the mergansers, following an early pulse of river herring to the mouth of Kenduskeag stream: two harbor seals, raising sleek round heads for a few long breaths before rolling under the waves.

Evidently it’s not uncommon for seals to swim the couple dozen miles between Bangor, Maine, and the Atlantic Ocean, but I’d never seen them here before. They were a balm to my buzzing thoughts: What happens next? Will I become a vector of death to my elderly mother? Is the economy going to implode? For a precious few minutes there were only the seals and mergansers and the fish who drew them there, arriving as the Penobscot’s winter icepack broke and flowed to sea, a ritual enacted ever since glaciers retreated from this continental shelf.

In the months ahead we can look to nature for these respites. The nonhuman world is free of charge; sunlight is a disinfectant, physical distance easily maintained, and no pandemic can suspend it. Nature offers not just escape but reassurance.

The nonhuman world is free of charge; sunlight is a disinfectant, and physical distance is easily maintained.

In 1946, in the aftermath of World War II, with the Nazi threat vanquished but the Cold War looming, George Orwell welcomed spring’s arrival in London’s bombed-out heart. “After the sorts of winters we have had to endure recently, the spring does seem miraculous, because it has become gradually harder and harder to believe that it is actually going to happen,” he wrote in “Some Thoughts on the Common Toad.” “Every February since 1940 I have found myself thinking that this time Winter is going to be permanent. But Persephone, like the toads, always rises from the dead at about the same moment.”

So she does. And so the slumbering earth warms to life. Two nights before the seals, two nights before the World Health Organization declared a pandemic, before the NBA shut down with teams on the floor and fans in the seats, before the fright went beyond viral into logarithmic, was the Worm Moon: the full moon named for the imminent stir of earthworms in thawing soil.

In burrows beneath leaf litter, hibernating toads prepare to open what Orwell called “the most beautiful eye of any living creature,” resembling “the golden-colored semi-precious stone which one sometimes sees in signet rings, and which I think is called a chrysoberyl.” Nearly as beautiful are the eyes of painted turtles waiting on pond bottoms here in eastern Maine, the ice above now retreating from shore, mallard couples dabbling in newly open water.

The birds are the surest sign of spring’s imminence. Downtown the house finches are holding daily concerts. Starlings are starting to replace their gold-streaked winter plumes with more iridescent garb. In the street today I saw two male mockingbirds joust above the pavement, their white wing-bars fluttering territorial semaphores, abandoning the contest only when a car nearly ran them down. 

There are many quieter signs, too: pale tips of shrubs poised to grow, a spider rappelling off a low branch, fresh fox scat in the driveway. It’s red from apples preserved under snow and lined with the fur of field mice and meadow voles whose secret winter tunnels are now revealed in the grass. Somewhere soon mother fox will give birth, nursing her blind hairless charges in underground peace.

Eastern comma butterflies will gather on the trunks of those apple trees and sip their rising sap. Not long after, the first orange-belted bumblebee queens will appear, inspecting potential nest sites under fallen leaves and decomposing logs. Warm rainy nights will bring salamanders and newts, just a few spotted, glistening inches long, some of them decades old, out from woodland hidey-holes and down ancient paths to vernal pool bacchanals held amidst a chorus of spring peepers. Woodland ephemerals will bloom in sunshine unfiltered by still-bare treetops. My favorites are trout lilies, colonies of which illuminate forest floors with a sea of bright yellow blossoms, petals falling once the canopy unfurls.

“The atom bombs are piling up in the factories, the police are prowling through the cities, the lies are streaming from the loudspeakers,” Orwell wrote, “but the earth is still going round the sun.”

At this point there’s no end of studies showing how nature is good for our health, how patients recover faster in hospital rooms with windows overlooking trees, how a mindful walk in the woods will lower stress and raise moods. All true, but at this moment something deeper and more urgent is on offer: an affirmation of life.

Will the nightmare scenes out of Italy and Spain and now New York City spread across the land? How long will the pandemic last? Will it completely rend our already tattered social fabric? When can I again play hockey or go to a coffee shop or use a credit card machine without feeling like I’m risking my own and others’ lives? Who will die? Nobody knows for sure, but in a few weeks the swallows will arrive, and tonight above the fields at dusk I heard the cries of woodcock.

Woodcock are secretive, ground-dwelling birds with limpid black eyes and long, slender beaks attuned to the frequencies of earthworm-rustles; their feathers blend perfectly with leaf litter and old grass. They rely on this camouflage, going still rather than fleeing a walker’s approach, taking wing only as a last resort.

When they do, their flight is notable for its slowness and the quavering whistle of their wings. At no other time than in spring do they dare draw attention, much less put on a show: calling out, with an urgent nasal buzz best described as a peent, and flying straight upward before spiraling against a darkening sky.

Brandon Keim is a freelance nature and science journalist. The author of The Eye of the Sandpiper: Stories from the Living World, he’s now writing Meet the Neighbors, forthcoming from W.W. Norton & Company, about what it means to think of wild animals as fellow persons—and what that means for the future of nature.

Lead image: Tim Zurowski / Shutterstock



The Meme as Meme - Issue 84: Outbreak


This article from our 2013 issue, “Fame,” offers a look at the way information—whether it’s true or not—spreads across the Internet.

On April 11, 2012, Zeddie Little appeared on Good Morning America, wearing the radiant, slightly perplexed smile of one enjoying instant fame. About a week earlier, Little had been a normal, if handsome, 25-year-old trying to make it in public relations. Then on March 31, he was photographed amid a crowd of runners in a South Carolina race by a stranger, Will King, who posted the image to a social networking website, Reddit. Little was dubbed “Ridiculously Photogenic Guy,” and his picture circulated on Facebook, Twitter, and Tumblr, accruing likes, comments, and captions (“Picture gets put up as employee of the month/for a company he doesn’t work for”). It spawned spinoffs (Ridiculously Photogenic Dog, Prisoner, and Syrian Rebel) and leapt to the mainstream media. At a high point, ABC Morning News reported that a Google search for “Zeddie Little” yielded 59 million hits.

Why the sudden fame? The truth is that Little hadn’t become famous: His meme had. According to the website Know Your Meme, which documents viral Internet phenomena, a meme is “a piece of content or an idea that’s passed from person to person, changing and evolving along the way.” Ridiculously Photogenic Guy is the kind of Internet meme exemplified by LOL cats: a photograph, video, or cartoon, often overlaid with a snarky message, perfect for incubating in the bored, fertile minds of cubicle workers and college students. In an age where politicians campaign through social media and viral marketers ponder the appeal of sneezing baby pandas, memes are more important than ever—however trivial they may seem.

But trawling the Internet, I found a strange paradox: While memes were everywhere, serious meme theory was almost nowhere. Richard Dawkins, the famous evolutionary biologist who coined the word “meme” in his classic 1976 book, The Selfish Gene, seemed bent on disowning the Internet variety, calling it a “hijacking” of the original term. The peer-reviewed Journal of Memetics folded in 2005. “The term has moved away from its theoretical beginnings, and a lot of people don’t know or care about its theoretical use,” philosopher and meme theorist Daniel Dennett told me. What has happened to the idea of the meme, and what does that evolution reveal about its usefulness as a concept?

In an age where politicians campaign through social media and viral marketers ponder the appeal of sneezing baby pandas, memes are more important than ever—however trivial they may seem.

Memes were originally framed in relation to genes. In The Selfish Gene, Dawkins claimed that humans are “survival machines” for our genes, the replicating molecules that emerged from the primordial soup and that, through mutation and natural selection, evolved to generate beings that were more effective as carriers and propagators of genes. Still, Dawkins explained, genes could not account for all of human behavior, particularly the evolution of cultures. So he identified a second replicator, a “unit of cultural transmission” that he believed was “leaping from brain to brain” through imitation. He named these units “memes,” an adaptation of the Greek mimeme, from a root meaning “to imitate.”

Dawkins’ memes include everything from ideas, songs, and religious ideals to pottery fads. Like genes, memes mutate and evolve, competing for a limited resource—namely, our attention. Memes are, in Dawkins’ view, viruses of the mind—infectious. The successful ones grow exponentially, like a super flu. While memes are sometimes malignant (hellfire and faith, for atheist Dawkins), sometimes benign (catchy songs), and sometimes terrible for our genes (abstinence), memes do not have conscious motives. But still, he claims, memes parasitize us and drive us.

Pinpointing when memes first made the leap to the Internet is tricky. Nowadays, we might think of the dancing baby, also known as Baby Cha-Cha, that grooved into our inboxes in the 1990s. It was a kind of proto-meme, but no one called it that at the time. The first reference I could find to an “Internet meme” appeared in a footnote in a 2003 academic article, describing an important event in the life of Jonah Peretti, co-founder of the hugely successful websites The Huffington Post and BuzzFeed. In 2001, as a procrastinating graduate student at MIT, Peretti decided to order a pair of Nike sneakers customized to read “sweatshop.” Nike refused. Peretti forwarded the email exchange to friends, who sent it on and on, until the story leapt to the mainstream media, where Peretti debated a Nike representative on NBC’s Today Show. Peretti later wrote, “Without really trying, I had released what biologist Richard Dawkins calls a meme.”

Peretti concluded that the email chain had spread exponentially “because it had access to such a wide range of different social networks.” Like Dawkins, he saw that a meme’s success depends on other memes, its ecosystem—and further saw that Internet memes’ ecosystems were online social networks, years before Facebook existed. According to a recent profile in New York Magazine, the Nike experience was formative for Peretti, who created BuzzFeed with the explicit goal of creating viral Internet memes. The company uses a formula called “Big Seed Marketing,” which begins with an equation describing the growth of a virus, the spread of a disease.
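
For readers curious about the arithmetic such a formula builds on, here is a minimal sketch, using the standard reproduction-number logic of epidemiology rather than any proprietary BuzzFeed version: suppose each person who receives a piece of content passes it along to z contacts, and each contact shares it onward with probability β. Every “generation” of sharing then multiplies the audience by R = βz, and seeding the content with N people yields an expected total reach of N + NR + NR^2 + … = N/(1 − R) when R is less than 1. The “big seed” idea is that even content that falls short of truly viral spread (R below 1) can still reach a large audience if the initial seed N is big enough; once R exceeds 1, each generation is larger than the last and growth becomes exponential, at least until the audience is exhausted.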

From the perspective of serious meme theorists, Internet memes have trivialized and distorted the spirit of the idea. Dennett told me that, in a planned workshop to be held in May 2014, he hopes to “rehabilitate the term in a very precise kind of way” for studying cultural evolution.

According to Dawkins, what sets Internet memes apart is how they are created. “Instead of mutating by random chance before spreading by a form of Darwinian selection, Internet memes are altered deliberately by human creativity,” he explained in a recent video released by the advertising agency Saatchi & Saatchi. He seems to think that the fact that Internet memes are engineered to go viral, rather than evolving by way of natural selection, is a salient difference that distinguishes them from other memes—which is arguable, since what catches fire on the Internet can be as much a product of luck as any unexpected mutation.

“I don’t know about you, but I’m not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other peoples’ ideas renew themselves.”

But if the concept of memes can really offer new insight into the intricate web of digital culture and cultural evolution more broadly, why have academics neglected it? Looking for answers, I called Susan Blackmore, a British professor who may be one of the last defenders of memetics as a scientific field. In a 2008 TED talk, Blackmore is an animated speaker, bright-eyed and wiry, her short grey hair dyed with streaks of blue. I reached her at her home in Devon, England, where she is occasionally joined in the garden by Dawkins and Dennett for meetings of the “meme lab.” “It’s only a bit of fun, nothing serious,” Blackmore said. Sometimes, members try experiments, like folding Chinese sailing ships from origami, itself a kind of meme. She remembered a March meeting in which the issue of Internet memes arose, saying, “Richard was upset because he invented the term, which shouldn’t just be about viral Internet memes. It’s a very powerful concept for understanding why humans are the way we are.”

For Blackmore, memetics is a science. An Oxford-educated psychologist, she began her career studying telepathy, which she spent years investigating after an out-of-body experience at the age of 19. She subsequently found no evidence for the existence of paranormal phenomena, but she was no stranger to pushing scientific frontiers. It is perhaps unsurprising that she decided to flesh out memetics. Dawkins wrote that, with memes, he did not intend to “sculpt a grand theory of human culture.” In her 1999 book, The Meme Machine, Blackmore does just that. She argues that everything from the development of language to our big brains was a product of “memetic drive.” This is perhaps her most radical claim: that memes make us do things.

Considering this idea in his book Consciousness Explained, Dennett writes, “I don’t know about you, but I’m not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other peoples’ ideas renew themselves… who’s in charge, according to this vision—we or our memes?” Still, Dennett, too, became a major proponent of meme theory. Speaking on the phone, he used memes to explain the joy we take in our culture and related decisions not to procreate wildly. College, he pointed out, is a great underminer of genetic fitness. Reading Blackmore and Dennett, the idea of meme as mental parasite becomes both more and less convincing: If we are created and driven by our memes, then we are our memes, a duality that Dennett himself seems to recognize.

Perhaps the notion of the meme is evolving in the direction of its own survival.

Yet, the very breadth of the concept makes it difficult to approach memes from the perspective of serious, observation-based science. In the analogy to genes, memes have inevitably disappointed. As Dawkins himself wrote, memes, as entities, are more vague than genes, where alleles compete to hold the same “chromosomal slots.” Unlike genes, memes are not directly observable and have high rates of mutation. Also, no one seems to be sure if memes exist. On the phone, Blackmore told me “the one good reason” memetics might not be a science: “There has been no example of where some scientific discovery has been made using meme theory, that couldn’t have been made any other way.” Still, Blackmore told me that people are doing research on memes—they just don’t call them by that name.

Looking for meme theory at work, I found network theory, an interdisciplinary field that unites computer science, statistics, physics, ecology, and even marketing. “If you want to use memetics to explain ‘everything,’ like how religion spreads, the problem is the data,” said Michele Coscia, a researcher at the Harvard Kennedy School, who recently wrote a paper displaying a statistical “decision tree” that described the success of memes like Ridiculously Photogenic Guy. For Coscia, Internet memes, with their visible mutations and view counts, solved the problem of empirical evidence, allowing him to do work he sees as analogous to genetics experiments.

Perhaps the notion of the meme is evolving in the direction of its own survival. The term “Internet meme” appears to be growing exponentially from year to year, in classical memetic fashion. This is what Bob Scott, a digital humanities librarian at Columbia University, found when he ran various searches on the comprehensive news and wire-service aggregator LexisNexis. He saw that the term “Internet meme” showed up with the new millennium and really took off in 2004, with references roughly doubling each year thereafter. 

Infectious Internet memes are now big business. BuzzFeed now draws 85 million unique visitors a month, compared to The New York Times’ website at 29 million, and was recently valued at $200 million. Its staff trawl the Internet for viral content and curate it, adding news stories, humor pieces, and advertisements, or “sponsored posts.” These categories can be hard to disentangle, even though ads are printed on a taupe background. Scrolling through BuzzFeed, I read: “20 People We Hope to Never See Promoted on OK Cupid” (which was an ad by Virgin Mobile), a news story about poisoned Indian children, and a post about a Republican Congressman who had “live tweeted” Jay-Z’s new album. It turned out that “23 Times When Wal-Mart Didn’t Disappoint” was not an ad, but still, the post made me think about how subversive humor—the kind that made Peretti’s email exchange with Nike so popular—could be used to advertise one of America’s least-subversive mega-chains.

While entertaining bored office workers seems harmless enough, there is something troubling about a multi-million dollar company using our minds as petri dishes in which to grow its ideas. I began to wonder if Dawkins was right—if the term meme is really being hijacked, rather than mindlessly evolving like bacteria. The idea of memes “forces you to recognize that we humans are not entirely the center of the universe where information is concerned—we’re vehicles and not necessarily in charge,” said James Gleick, author of The Information: A History, A Theory, A Flood, when I spoke to him on the phone. “It’s a humbling thing.”

It is more humbling still to think that our minds can be seduced not through the agency of memes, as Blackmore sees it, but through human agency and clever algorithms. Not by religions or quirks of culture, but by a never-ending list of stories that make us laugh. Even if the meme meme is too broad for empirical study, it offers us a powerful metaphor for how we absorb other peoples’ ideas, and how they absorb us. So maybe this is what meme theory can ultimately give us: the insight we need to put LOL cats aside—and get down to work.

Abby Rabinowitz has written for The New York Times and teaches writing at Columbia University.



The Case Against Thinking Outside of the Box - Facts So Romantic


Social, cultural, economic, spiritual, psychological, emotional, intellectual: Everything is outside the box. And this new sheltered-in-place experience won’t fit into old containers. Photo Illustration by Africa Studio / Shutterstock

Many of us are stuck now, sheltered in our messy dwellings. A daily walk lets me appreciate the urban landscaping; but I can’t stop to smell anything because a blue cotton bandana shields my nostrils. Indoors, constant digital dispatches chirp to earn my attention. I click on memes, status updates, and headlines, but everything is more of the same. How many ways can we repackage fear and reframe optimism? I mop the wood-laminate floor of my apartment because I hope “ocean paradise” scented Fabuloso will make my home smell a little less confining. My thoughts waft toward the old cliché: Think outside the box. I’ve always hated when people say that.

To begin with, the directions are ineffectual. You can’t tell someone to think outside the box and expect them to do it. Creativity doesn’t happen on demand. Want proof? Just try to make yourself think a brilliant thought, something original, innovative, or unique. Go ahead. Do it. Right now. You can’t, no matter how hard you try. This is why ancient people believed that inspiration comes from outside. It’s external, bestowed on each of us like a revelation or prophecy—a gift from the Muses. Which means your genius does not belong to you. The word “genius” is the Latin equivalent of the ancient Greek “daemon” (δαίμων)—like a totem animal, or a spirit companion. A genius walks beside us. It mediates between gods and mortals. It crosses over from one realm to the next. It whispers divine truth.

We are paralyzed by the prospect of chaos, uncertainty, and entropy.

In modern times, our mythology moves the daemons away from the heavens and into the human soul. We say, “Meditate and let your spirit guide you.” Now we think genius comes from someplace deep within. The mind? The brain? The heart? Nobody knows for sure. Yet, it seems clear to us that inspiration belongs to us; it’s tangibly contained within our corporeal boundaries. That’s why we celebrate famous artists, poets, physicists, economists, entrepreneurs, and inventors. We call them visionaries. We read their biographies. We do our best to emulate their behaviors. We study the five habits of highly successful people. We practice yoga. We exercise. We brainstorm, doodle, sign up for online personal development workshops. We do whatever we can to cultivate the fertile cognitive soil in which the springtime seeds of inspiration might sprout. But still, even though we believe that a genius is one’s own, we know that we cannot direct it. Therefore, no matter how many people tell me to think outside the box, I won’t do it. I can’t. 

Even if I could, I’m not sure thinking outside the box would be worthwhile. Consider the origins of the phrase. It started with an old brain teaser. Nine dots are presented in a perfect square, lined up three by three. Connect them all, using only four straight lines, without lifting your pencil from the paper. It’s the kind of puzzle you’d find on the back of a box of Lucky Charms breakfast cereal, frivolous but tricky. The solution involves letting the lines expand out onto the empty page, into the negative space. Don’t confine your lines to the square the dots seem to form. You need to recognize, instead, that the field is wider than you’d assume. In other words, don’t interpret the dots as a square, don’t imagine that the space is constricted. Think outside the box!
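
For the curious, here is one standard solution, using an illustrative coordinate labeling of my own rather than anything from the original puzzle: place the dots at (0,0) through (2,2), with (0,0) at the bottom left. Draw the first line from (0,0) rightward through (1,0) and (2,0), continuing past the grid to (3,0); the second line diagonally up and to the left from (3,0) through (2,1) and (1,2) to (0,3); the third line straight down from (0,3) through (0,2) and (0,1) back to (0,0); and the fourth line diagonally up and to the right from (0,0) through (1,1) to (2,2). Four straight lines, the pencil never lifted, and two of the turns happen outside the square.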

For years, pop-psychologists, productivity coaches, and business gurus have all used the nine-dot problem to illustrate the difference between “fixation” and “insight.” They say that we look at markings on a page and immediately try to find a pattern. We fixate on whatever meaning we can ascribe to the image. In this case, we assume that nine dots make a box. And we imagine we’re supposed to stay within its boundaries—contained and confined. We bring habitual assumptions with us even though we’re confronting a unique problem. Why? Because we are paralyzed by the prospect of chaos, uncertainty, and entropy. We cling to the most familiar ways of organizing things in order to mitigate the risk that new patterns might not emerge at all, the possibility that meaning itself could cease to exist. But this knee-jerk reaction limits our capacity for problem-solving. Our customary ways of knowing become like a strip of packing tape that’s accidentally affixed to itself—you can struggle to undo it, but it just tangles up even more. In other words, your loyalty to the easiest, most common interpretations is the sticky confirmation bias that prevents you from arriving at a truly insightful solution. 

At least that’s what the experts used to say. And we all liked to believe it. But our minds don’t really work that way. The box parable appeals because it reinforces our existing fantasies about an individual’s proclivity to innovate and disrupt by thinking in unexpected ways. It’s not true. 

Studies have found that solving the nine-dot problem has nothing to do with the box. Even when test subjects were told that the solution requires going outside the square’s boundaries, most of them still couldn’t solve it. The increase in successful attempts was so tiny that it was considered statistically insignificant, suggesting that the ability to arrive at a solution to the nine-dot problem has nothing to do with fixation or insight. The puzzle is just difficult, no matter which side of the box you’re standing on.

Still, I bet my twelve-year-old son could solve it. Yesterday, we unpacked a set of oil paints, delivered by Amazon. He was admiring the brushes and canvases. He was thinking about his project, trying to be creative, searching for insight. “Think inside the outside of the box,” he said. “What does that mean?” I asked, pushing the branded, smiling A-to-Z packaging aside and looking at him like he was crazy. “Like with cardboard, you know, with all the little holes inside.”

He was talking about the corrugations, those ridges that are pasted between layers of fiberboard. They were originally formed on the same fluted irons used to make the ruffled collars of Elizabethan-era fashion. At first, single-faced corrugated paper—smooth on one side, ridged on the other—was used to wrap fragile glass bottles. Then, around 1890, the double-faced corrugated fiberboard with which we’re familiar was developed. And it transformed the packing and shipping industries. The new paperboard boxes were sturdy enough to replace wooden crates. It doesn’t take an engineering degree to understand how it works: The flutes provide support; the empty space in between makes it lightweight. My son is right; it’s all about what’s inside the outside of the box.

Now I can’t stop saying it to myself, “Think inside the outside of the box.” It’s a perfect little metaphor. In a way, it even sums up the primary cognitive skill I acquired in graduate school. One could argue that a PhD just means you’ve been trained to think inside the outside of boxes. What do I mean by that? Consider how corrugation gives cardboard its structural integrity. The empty space—what’s not there—makes it strong and light enough that it’s a useful and efficient way to carry objects. Similarly, it’s the intellectual frameworks that make our interpretations and analyses of the world hold up. An idea can’t stand on its own; it needs a structure and a foundation. It needs a box. It requires a frame. And by looking at how those frames are assembled, by seeing how they carry a concept through to communication, we’re able to do our best thinking. We look at the empty spaces—the invisible, or tacit assumptions—which lurk within the fluted folds of every intellectual construction. We recognize that our conscious understanding of lived experience is corrugated just like cardboard.

The famous sociologist Erving Goffman said as much in 1974 when he published his essay on “Frame Analysis.” He encouraged his readers to identify the principles of organization which govern our perceptions. This work went on to inspire countless political consultants, pundits, publicists, advertisers, researchers, and marketers. It’s why we now talk often about the ways in which folks “frame the conversation.” But I doubt my son has read Goffman. He just stumbled on a beautifully succinct way to frame the concept of critical thinking. Maybe he was inspired by Dr. Seuss. 

When my kids were little, they asked for the same story every night, “Read Sneetches Daddy!” I could practically recite the whole thing from memory: “Now, the Star-belly Sneetches had bellies with stars. The Plain-belly Sneetches had none upon thars.” It’s an us-versus-them story, a fable about the way a consumption economy encourages people to compete for status, and to alienate the “other.” If you think inside the outside of the box, it’s also a scathing criticism of a culture that’s obsessed with personal and professional transformation—always reinventing and rebranding. 

One day, Sylvester McMonkey McBean shows up on the Sneetches’ beaches with a peculiar box-shaped fix-it-up machine. Sneetches go in with plain-bellies and they come out with stars. Now, anyone can be anything, for a fee. McBean charges them a fortune; he exploits the Sneetches’ insecurities. He builds an urgent market demand for transformational products. He preys on their most familiar—and therefore, cozy and comforting—norms of character assessment. He disrupts their identity politics, makes it so that there’s no clear way to tell who rightfully belongs with which group. And as a result, chaos ensues. Why? Because the Sneetches discover that longstanding divisive labels and pejorative categories no longer provide a meaningful way to organize their immediate experiences. They’ve lost their frames, the structural integrity of their worldview. They feel unhinged, destabilized, unboxed, and confused.

Social, cultural, economic, spiritual, psychological, emotional, intellectual: Everything is outside the box.

It should sound familiar. After all, we’ve been living through an era in history that’s just like the Sneetches’. The patterns and categories we heretofore used to define self and other are being challenged every day—sometimes for good, sometimes for bad. How can we know who belongs where in a digital diaspora, a virtual panacea, where anyone can find “my tribe”? What do identity, allegiance, heredity, and loyalty even mean now that these ideas can be detached from biology and birthplace? Nobody knows for sure. And that’s just the beginning: We’ve got Sylvester-McMonkey-McBean-style disruption everywhere we look. Connected technologies have transformed the ways in which we make sense of our relationships, how we communicate with one another, our definitions of intimacy. 

Even before the novel coronavirus, a new global paradigm forced us to live and work in a world that’s organized according to a geopolitical model we can barely comprehend. Sure, the familiar boundaries of statehood sometimes prohibited migrant foot traffic—but information, microbes, and financial assets still moved swiftly across borders, unimpeded. Similarly, cross-national supply-chains rearranged the rules of the marketplace. High-speed transportation disrupted how we perceive the limits of time and space. Automation upset the criteria through which we understand meritocracy and self-worth. Algorithms and artificial intelligence changed the way we think about labor, employment, and productivity. Data and privacy issues blurred the boundaries of personal sovereignty. And advances in bioengineering shook up the very notion of human nature.

Our boxes were already bursting. And now, cloistered at home in the midst of a pandemic, our most mundane work-a-day routines are dissolved, making it feel like our core values and deeply-held beliefs are about to tumble out all over the place. We can already envision the mess that is to come—in fact, we’re watching it unfurl in slow motion. Soon, the world will look like the intellectual, emotional, and economic equivalent of my 14-year-old’s bedroom. Dirty laundry is strewn across the floor, empty candy wrappers linger on dresser-tops, mud-caked sneakers are tossed in the corner, and the faint yet unmistakable stench of prepubescent body odor is ubiquitous. Nothing is copasetic. Nothing is in its place. Instead, everything is outside the box. 

It’s not creative, inspiring, or insightful. No, it’s disorienting and anxiety-provoking. I want to tidy it up as quickly as possible. I want to put things back in their familiar places. I want to restore order and eliminate chaos. But no matter how hard I try, I can’t do it, because the old boxes are ripped and torn. Their bottoms have fallen out. Now, they’re useless. Social, cultural, economic, spiritual, psychological, emotional, intellectual: Everything is outside the box. And this new sheltered-in-place experience won’t fit into old containers.

Jordan Shapiro, Ph.D., is a senior fellow for the Joan Ganz Cooney Center at Sesame Workshop and Nonresident Fellow in the Center for Universal Education at the Brookings Institution. He teaches at Temple University, and wrote a column for Forbes on global education and digital play from 2012 to 2017. His book, The New Childhood, was released by Little, Brown Spark in December 2018.



The Economic Damage Is Barely Conceivable - Issue 84: Outbreak


Like most of us, Adam Tooze is stuck at home. The British-born economic historian and Columbia University professor of history had been on leave this school year to write a book about climate change. But now he’s studying a different global problem. There are more than 700,000 cases of COVID-19 in the United States and over 2 million infections worldwide. The pandemic has also caused an economic meltdown. More than 18 million Americans have filed for unemployment in recent weeks, and Goldman Sachs analysts predict that U.S. gross domestic product will decline at an annual rate of 34 percent in the second quarter.
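
(For a sense of scale, assuming the standard annualization convention, in which one quarter’s change is compounded over four quarters: a 34 percent annualized decline corresponds to output falling roughly 10 percent from the previous quarter, since 0.90 raised to the fourth power is about 0.66.)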

Tooze is an expert on economic catastrophes. He wrote the book Crashed: How a Decade of Financial Crises Changed the World, about the 2008 economic crisis and its aftermath. But even he didn’t see this one coming. He hadn’t thought much about how pandemics could impact the economy—few economists had. Then he watched as China locked down the city of Wuhan, in a province known for auto manufacturing, on January 23; as northern Italy shut down on February 23; and as the U.S. stock market imploded on March 9. By then, he knew he had another financial crisis to think about. He’s been busy writing ever since. Tooze spoke with Nautilus from his home in New York City.

INEQUALITY FOR ALL: Adam Tooze (above) says a crisis like this one, “where you shut the entire economy down in a matter of weeks,” highlights the “profound inequality” in American society. Wikimedia

What do you make of the fact that, in three weeks, more than 16 million people in the U.S. have filed for unemployment?

The structural element here—and this is quite striking, when you compare Europe, for instance, to the U.S.—is that America has and normally celebrates the flexibility and dynamism of its labor market: The fact that people move between jobs. The fact that employers have the right to hire and fire if they need to. The downside is that in a shock like this, the appropriate response for an employer is simply to let people go. What America wasn’t able to do was to improvise the short-time working systems that the Europeans are trying to use to prevent the immediate loss of employment to so many people.

The disadvantage of the American system that reveals itself in a crisis like this is that hiring and firing is not easily reversible. People who lose jobs don’t necessarily easily get them back. There is a fantasy of a V-shaped recovery. We literally have never done this before, so we don’t know one way or another how this could happen. But it seems likely that many people who have lost employment will not immediately find reemployment over the summer or the fall when business activity resumes something like its previous state. With a lot of people with low qualifications in precarious jobs at low incomes, and with sectors that were already teetering on the edge—the chain stores, which are quite likely closing anyway, and fragile malls, which were on the edge of dying—it’s quite likely that this shock will also induce disproportionately large amounts of scarring.

What role has wealth and income inequality played during this crisis?

The U.S. economic system is bad enough in a regular crisis. In one like this, where you shut the entire economy down in a matter of weeks, the damage is barely conceivable. There are huge disparities, all of which ultimately are rooted in social structures of race and class, and in the different types of jobs that people have. The profound inequality in American society has been brought home for us in everyone’s families, where there is a radical disparity between households that can sustain their children’s education while living comfortably at home and those that cannot. Twenty-five percent of kids in the United States appear not to have a stable WiFi connection. They have smartphones. That seems practically universal. But you can’t teach school on a smartphone. At least, that technology is not there.

Presumably by next year something like normality returns. But forever after we’ll live under the shadow of this having happened.

President Trump wants the economy to reopen by May. Would that stop the economic crisis?

Certainly that is presumably what drives the haste to restart the economy and to lift intense social distancing provisions. There is a sense that we can’t stand this. And that has a lot to do with deep fragilities in the American social system. If all Americans lived comfortably in their own homes, with the safety of a regular paycheck, with substantial savings, with health insurance that wasn’t conditional on precarious employment, and with unemployment benefits that were adequate and that were rolled out to most people in this society if they needed them, then there wouldn’t be such a rush. But that isn’t America as we know it. America is a society in which half of families have virtually no financial cushion; in which the vast majority of small-business owners, so often hailed as the drivers of job creation, live hand-to-mouth; in which the unemployment insurance system really is a mockery; and in which health insurance is directly tied to employment for the vast majority of the people. A society like that really faces huge pressures if the economy is shut down.

How is the pandemic-induced economic collapse we’re facing now different from what we faced in 2008?

This is so much faster. Early this year, America had record-low unemployment numbers. And in the last week or so we probably already broke the record for unemployment in the United States in the period since World War II. This story is moving so fast that our statistical systems of registration can’t keep up. So we think probably de facto unemployment in the U.S. right now is 13, 14, 15 percent. That’s never happened before. 2007 to 2008 was a classic global crisis in the sense that it came out of one particular over-expanded sector, a sector which is very well known for its volatility, which is real estate and construction. It was driven by a credit boom.

What we’re seeing this time around is a deliberate, government-ordered, cliff-edge, sudden shutdown of the entire economy, hitting specifically the face-to-face human-services sectors—retail, entertainment, restaurants—which are, generally speaking, lagging in cyclical terms and are not the kind of sectors that generate boom-bust cycles.

Are we better prepared this time than in 2008?

You’d find it very hard to point to anyone in the policymaking community at the beginning of 2020 who was thinking of pandemic risk. Some people were. Former Treasury Secretary and former Director of the National Economic Council Larry Summers, for example, wrote a paper about pandemic flu several years ago, because of MERS and SARS, previous respiratory illnesses caused by coronaviruses. But it wasn’t top of stack at the beginning of this year. So we weren’t prepared in that sense. But do we know what to do now if we see the convulsions in the credit markets that we saw at the beginning of March? Yes. Have the central banks done it? Yes. Did they use some of the techniques they employed in ’08? Yes. Did they know that you had to go in big and you had to go in heavy and hard and quickly? Yes. And they have done so on an even more gigantic scale than in ’08, which is a lesson learned in ’08, too: There’s no such thing as too big. And furthermore, the banks, which were the fragile bit in ’08, have basically been sidelined.

You’ve written that the response to the 2008 crisis worked to “undermine democracy.” How so, and could we see that again with this crisis?

The urgency that any financial crisis produces forces governments’ hands—it strips the legislature, the ordinary processes of democratic deliberation. When you’re forced to make very dramatic, very rapid decisions—particularly in a country as chronically divided as the U.S. is on so many issues—the risk that you create opportunities for demagogues of various types to take advantage of is huge. We know what the response of the Tea Party was to the ’08, ’09 economic crisis. They created an extraordinarily distorted vision of what had happened and then rode it to extraordinary influence over the Republican party in the years that followed. And there is every reason to think that we might be faced with similar stresses in the American political system in months to come.

The U.S. economic system is bad enough in a regular crisis. In one like this, where you shut the entire economy down in a matter of weeks, the damage is barely conceivable.

How should we be rethinking the economy to buffer against meltdowns like this in the future?

We clearly need to have a far more adequate and substantial medical capacity. There’s no alternative to a comprehensive publicly backstopped or funded health insurance system. Insofar as you haven’t got that, your capacity to guarantee the security in the most basic and elementary sense of your population is not there. When you have a system in which one of the immediate side effects, in a crisis like this, is that large parts of your hospital system go bankrupt—one of the threats to the American medical system right now—that points to something extraordinarily wrong, especially if you’re spending close to 18 percent of GDP on health, more than any other society on the planet.

What about the unemployment insurance system?

America needs to have a comprehensive unemployment insurance system. It can be graded by local wage rates and everything else. But the idea that you have the extraordinary disparities that we have between a Florida and a Georgia at one end, with recipiency rates in the 11 to 15 percent range, and then states which actually operate an insurance system that deserves the name—this shouldn’t be accepted in a country like the U.S. We would need to look at how short-time working models might be a far better way of dealing with shocks of this kind, essentially saying that there is a public interest in the continuity of employment relationships. The employer should be investing in their staff and should not be indifferent as to who shows up for work on any given day.

What does this pandemic teach us about living in a global economy?

There are a series of very hard lessons in the recent history of globalization into which the corona shock fits—about the peculiar inability of American society, American politics, and the American labor market to cushion shocks that come from the outside in a way which moderates the risk and the damage to the most vulnerable people. If you look at the impact of globalization on manufacturing, industry, inequality, the urban fabric in the U.S., it’s far more severe than in other societies, which have basically been subject to the same shock. That really needs to raise questions about how the American labor market and welfare system work, because they are failing tens of millions of people in this society.

You write in Crashed not just about the 2008 crisis, but also about the decade afterward. What is the next decade going to look like, given this meltdown?

I have never felt less certain in even thinking about that kind of question. At this point, can either you or I confidently predict what we’re going to be doing this summer or this autumn? I don’t know whether my university is resuming normal service in the fall. I don’t know whether my daughter goes back to school. I don’t know when my wife’s business in travel and tourism resumes. That is unprecedented. It’s very difficult against that backdrop to think out over a 10-year time horizon. Presumably by next year something like normality returns. But forever after we’ll live under the shadow of this having happened. Every year we’re going to be anxiously worrying about whether flu season is going to be flu season like normal or flu season like this. That is itself something to be reckoned with.

How will anxiety and uncertainty about a future pandemic-like crisis affect the economy?

When we do not know what the future holds to this extent, it makes it very difficult for people to make bold, long-term financial decisions. This previously wasn’t part of the repertoire of what the financial analysts call tail risk. Not seriously. My sister works in the U.K. government, and they compile a list every quarter of the top five things that could blow your departmental business up. Every year pandemics are in the top three. But no one ever acted on it. It’s not like terrorism. In Britain, you have a state apparatus which is geared to address the terrorism risk because it’s very real—it’s struck many times. Now all of a sudden we have to take the possibility of pandemics that seriously. And their consequences are far more drastic. How do we know what our incomes are going to be? A very large part of American society is not going to be able to answer that question for some time to come. And that will shake consumer confidence. It will likely increase the savings rate. It’s quite likely to reduce the desire to invest in a large part of the U.S. economy.

Max Kutner is a journalist in New York City. He has written for Newsweek, The Boston Globe, and Smithsonian. Follow him on Twitter @maxkutner.

Lead image: Straight 8 Photography / Shutterstock



The Ecological Vision That Will Save Us - Issue 84: Outbreak


The marquee on my closed neighborhood movie theater reads, “See you on the other side.” I like reading it every day as I pass by on my walk. It causes me to envision life after the coronavirus pandemic. Which is awfully hard to envision now. But it’s out there. When you have a disease and are in a hospital, alone and afraid, intravenous tubes and sensor wires snaking from your body into digital monitors, all you want is to be normal again. You want nothing more than to have a beer in a dusky bar and read a book in amber light. At least that’s all I wanted last year when I was in a hospital, not from a coronavirus. When, this February, I had that beer in a bar with my book, I was profoundly happy. The worst can pass.

With faith, you can ask how life will be on the other side. Will you be changed personally? Will we be changed collectively? The knowledge we’re gaining now is making us different people. Pain demands relief, demands we don’t repeat what produced it. Will the pain of this pandemic point a new way forward? It hasn’t before, as every war attests. This time may be no different. But the pandemic has slipped a piece of knowledge into the body public that may not be easy to repress. It’s an insight scientists and poets have voiced for centuries. We’re not apart from nature, we are nature. The environment is not outside us, it is us. We either act in concert with the environment that gives us life, or the environment takes life away.

Guess which species is the bully? No animal has had the capacity to modify its niche the way we have.

Nothing could better emphasize our union with nature than the lethal coronavirus. It’s crafted by a molecule that’s been omnipresent on Earth for 4 billion years. Ribonucleic acid may not be the first bridge from geochemical to biochemical life, as some scientists have stated. But it’s a catalyst of biological life. It wrote the book on replication. RNA’s signature molecules, nucleotides, code other molecules, proteins, the building blocks of organisms. When RNA’s more chemically stable kin, DNA, arrived on the scene, it outcompeted its ancestor. Primitive organisms assembled into cells and DNA set up shop in their nucleus. It employed its nucleotides to code proteins to compose every tissue in every multicellular species, including us. A shameless opportunist, RNA made itself indispensable in the cellular factory, shuttling information from DNA to the ribosomes, where proteins are synthesized.

RNA and DNA had other jobs. They could be stripped down to their nucleotides, swirled inside a sticky protein shell. That gave them the ability to infiltrate any and all species, hijack their reproductive machinery, and propagate in ways that make rabbits look celibate. These freeloading parasites have a name: virus. But viruses are not just destroyers. They wear another evolutionary hat: developers. Viruses “may have originated the DNA replication system of all three cellular domains (archaea, bacteria, eukarya),” writes Luis P. Villarreal, founding director of the Center for Virus Research at the University of California, Irvine.1 Their role in nature is so successful that DNA and RNA viruses make up the most abundant biological entities on our planet. More viruses on Earth than stars in the universe, scientists like to say.

Today more RNA than DNA viruses thrive in cells like ours, suggesting how ruthless they’ve remained. RNA viruses generally reproduce faster than DNA viruses, in part because they don’t haul around an extra gene to proofread their molecular merger with others’ DNA. So when the reckless RNA virus finds a new place to dwell, organisms become heartbreak hotels. Once inside a cell, the RNA virus slams the door on the chemical saviors dispatched by cells’ immunity sensors. It hijacks DNA’s replicative powers and fans out by the millions, upending cumulative cellular functions. Like the ability to breathe.

Humans. We love metaphors. They allow us to compare something as complex as viral infection to something as familiar as an Elvis Presley hit. But metaphors for natural processes are seldom accurate. The language is too porous, inviting our anthropomorphic minds to close the gaps. We imagine viruses have an agenda, are driven by an impetus to search and destroy. But nature doesn’t act with intention. It just acts. A virus lives in a cell like a planet revolves around a sun.

Biologists debate whether a virus should be classified as living because it’s a deadbeat on its own; it only comes to life in others. But that assumes an organism is alive apart from its environment. The biochemist and writer Nick Lane points out, “Viruses use their immediate environment to make copies of themselves. But then so do we: We eat other animals or plants, and we breathe in oxygen. Cut us off from our environment, say with a plastic bag over the head, and we die in a few minutes. One could say that we parasitize our environment—like viruses.”2

Our inseparable accord with the environment is why the coronavirus is now in us. Its genomic signature is almost a perfect match with a coronavirus that thrives in bats whose habitats range across the globe. Humans moved into the bats’ territory and the bats’ virus moved into humans. The exchange is just nature doing its thing. “And nature has been doing its thing for 3.75 billion years, when bacteria fought viruses just as we fight them now,” says Shahid Naeem, an upbeat professor of ecology at Columbia University, where he is director of the Earth Institute Center for Environmental Sustainability. If we want to assign blame, it lies with our collectively poor understanding of ecology.

FLYING LESSON: Bats don’t die from the same coronavirus that kills humans because the bat’s anatomy fights the virus to a draw, neutralizing its lethal moves. What’s the deal with the human immune system? We don’t fly. Martin Pelanek / Shutterstock

Organisms evolve with uniquely adaptive traits. Bats play many ecological roles. They are pollinators, seed-spreaders, and pest-controllers. They don’t die from the same coronavirus that kills humans because the bat’s anatomy fights the virus to a draw, neutralizing its lethal moves. What’s the deal with the human immune system? We don’t fly. “Bats are flying mammals, which is very unusual,” says Christine K. Johnson, an epidemiologist at the One Health Institute at the University of California, Davis, who studies virus spillover from animals to humans. “They get very high temperatures when they fly, and have evolved immunological features, which humans haven’t, to accommodate those temperatures.”

A viral invasion can overstimulate the chemical responses from a mammal’s immune system to the point where the response itself causes excessive inflammation in tissues. Small proteins called cytokines, which orchestrate cellular responses to foreign invaders, can get over-excited by an aggressive RNA virus and erupt into a “storm” that destroys normal cellular function—a process physicians have documented in many current coronavirus fatalities. Bats have genetic mechanisms to inhibit that overreaction. Similarly, bat flight requires an increased rate of metabolism. Their wing-flapping action leads to high levels of oxygen free radicals—a natural byproduct of metabolism—that can damage DNA. As a result, states a 2019 study in the journal Viruses, “bats probably evolved mechanisms to suppress activation of immune response due to damaged DNA generated via flight, thereby leading to reduced inflammation.”3

Bats don’t have better immune systems than humans; just different. Our immune systems evolved for many things, just not flying. Humans do well around the cave fungus Pseudogymnoascus destructans, source of the “white-nose syndrome” that has devastated bats worldwide. Trouble begins when we barge into wildlife habitats with no respect for differences. (Trouble for us and other animals. White-nose syndrome spread in part on the shoes and clothing of cavers, who tracked it from one site to the next.) We mine for gold, develop housing tracts, and plow forests into feedlots. We make other animals’ habitats our own.

Our moralistic brain sees retribution. Karma. A viral outbreak is the wrath that nature heaps on us for bulldozing animals out of their homes. Not so. “We didn’t violate any evolutionary or ecological laws because nature doesn’t care what we do,” Naeem says. Making over the world for ourselves is just humans being the animals we are. “Every species, if they had the upper hand, would transform the world into what it wants,” Naeem says. “Birds build nests, bees build hives, beavers build dams. It’s called niche construction. If domestic cats ruled the world, they would make the world in their image. It would be full of litter trays, lots of birds, lots of mice, and lots of fish.”

But nature isn’t an idyllic land of animal villages constructed by evolution. Species’ niche-building ways have always brought them into contact with each other. “Nature is ruled by processes like competition, predation, and mutualism,” Naeem says. “Some of them are positive, some are negative, some are neutral. That goes for our interactions with the microbial world, including viruses, which range from super beneficial to super harmful.”

Nature has been doing its thing for 3.75 billion years, when bacteria fought viruses as we fight them now.

Ultimately, nature works out a truce. “If the flower tries to short the hummingbird on sugar, the hummingbird is not going to provide it with pollination,” Naeem says. “If the hummingbird sucks up all the nectar and doesn’t do pollination well, it’s going to get pinged as well. Through this kind of back and forth, species hammer out an optimal way of getting along in nature. Evolution winds up finding some middle ground.” Naeem pauses. “If you try to beat up everybody, though, it’s not going to work.”

Guess which species is the bully? “There’s never been any species on this planet in its entire history that has had the capacity to modify its niche the way we have,” Naeem says. Our niche—cities, farms, factories—has made the planet into a zoological Manhattan. Living in close proximity with other species, and their viruses, means we are going to rub shoulders with them. Dense living isn’t for everyone. But a global economy is. And with it comes an intercontinental transportation system. A virus doesn’t have a nationality. It can travel as easily from Arkansas to China as the other way around. A pandemic is an inevitable outcome of our modified niche.

Although nature doesn’t do retribution, our clashes with it have mutual consequences. The exact route of transmission of SARS-CoV-2 from bat to humans remains unmapped. Did the virus pass directly into a person who may have handled a bat, or through an intermediate animal? What is clear is the first step: a bat shed the virus in some way. Johnson, the University of California, Davis epidemiologist, explains that bats shed viruses in their urine, feces, and saliva. They might urinate on fruit, or eat a piece of it and then discard it on the ground, where another animal may eat it. The Nipah virus outbreak in 1999 was spurred by a bat that left behind a piece of fruit that came in contact with a domestic pig and then with humans. The Ebola outbreaks in Central Africa in the early 2000s likely began when an ape, which became bushmeat for humans, came in contact with a fruit bat’s leftovers. “The same thing happened with the Hendra virus in Australia in 1994,” says Johnson. “Horses got infected because fruit bats lived in trees near the horse farm. Domesticated species are often an intermediary between bats and humans, and they amplify the outbreak before it gets to humans.”

Transforming bat niches into our own sends bats scattering—right into our backyards. In a study released this month, Johnson and colleagues show the spillover risk of viruses is the highest among animal species, notably bats, that have expanded their range, due to urbanization and crop production, into human-run landscapes.4 “The ways we’ve altered the landscape have brought a lot of great things to people,” Johnson says. “But that has put wildlife at higher pressures to adapt, and some of them have adapted by moving in with us.”

Pressures on bats have another consequence. Studies indicate physiological and environmental stress can increase viral replication in bats and cause them to shed more virus than they normally do. One study showed bats with white-nose syndrome had “60 times more coronavirus in their intestines” than uninfected bats.5 Despite evidence for an increase in viral replication and shedding in stressed bats, “a direct link to spillover has yet to be established,” concludes a 2019 report in Viruses.3 But it’s safe to say that bats being perpetually driven from their caves into our barns is not ideal for either species.

As my questions ran out for Columbia University’s Naeem, I asked him to put this horrible pandemic in a final ecological light for me.

“We think of ourselves as being resilient and robust, but it takes something like this to realize we’re still a biological entity that’s not capable of totally controlling the world around us,” he says. “Our social system has become so disconnected from nature that we no longer understand we still are a part of it. Breathable air, potable water, productive fields, a stable environment—these all come about because we’re part of this elaborate system, the biosphere. Now we’re suffering environmental consequences like climate change and the loss of food security and viral outbreaks because we’ve forgotten how to integrate our endeavors with nature.”

A 2014 study by a host of wildlife ecologists, economists, and evolutionary biologists lays out a plan to stem the tide of emerging infectious diseases, most of which originate in wildlife. Cases of emerging infectious diseases have practically quadrupled since 1940.6 World leaders could get smart. They could pool money for spillover research, which would identify the hundreds of thousands of potentially lethal viruses in animals. They could coordinate pandemic preparation with international health regulations. They could support animal conservation with barriers that developers can’t cross. The scientists give us 27 years to cut the rise of infectious diseases by 50 percent. After that, the study doesn’t say what the world will look like. I imagine it will look like a hospital right now in New York City.

Patients lie on gurneys in corridors, swaddled in sheets, their faces shrouded by respirators. They’re surrounded by doctors and nurses, desperately trying to revive them. In pain, inconsolable, and alone. I know they want nothing more than to see their family and friends on the other side, to be wheeled out of the hospital and feel normal again. Will they? Will others in the future? It will take tremendous political will to avoid the next pandemic. And it must begin with a reckoning with our relationship with nature. That tiny necklace of RNA tearing through patients’ lungs right now is the world we live in. And have always lived in. We can’t be cut off from the environment. When I see the suffering in hospitals, I can only ask, Do we get it now?

Kevin Berger is the editor of Nautilus.

References

1. Villarreal, L.P. The Widespread Evolutionary Significance of Viruses. In Domingo, E., Parrish, C.R., & Holland, J. (Eds.) Origin and Evolution of Viruses. Elsevier, Amsterdam, Netherlands (2008).

2. Lane, N. The Vital Question: Energy, Evolution, and the Origins of Complex Life. W.W. Norton, New York, NY (2015).

3. Subudhi, S., Rapin, N., & Misra, V. Immune system modulation and viral persistence in bats: Understanding viral spillover. Viruses 11, E192 (2019).

4. Johnson, C.K., et al. Global shifts in mammalian population trends reveal key predictors of virus spillover risk. Proceedings of the Royal Society B 287 (2020).

5. Davy, C.M., et al. White-nose syndrome is associated with increased replication of a naturally persisting coronavirus in bats. Scientific Reports 8, 15508 (2018).

6. Pike, J., Bogich, T., Elwood, S., Finnoff, D.C., & Daszak, P. Economic optimization of a global strategy to address the pandemic threat. Proceedings of the National Academy of Sciences 111, 18519-18523 (2014).

Lead image: AP Photo / Mark Lennihan



Don’t Fear the Robot - Issue 84: Outbreak


You probably know my robot. I’ve been inventing autonomous machines for over 30 years and one of them, Roomba from iRobot, is quite popular. During my career, I’ve learned a lot about what makes robots valuable, and formed some strong opinions about what we can expect from them in the future. I can also tell you why, contrary to popular apocalyptic Hollywood images, robots won’t be taking over the world anytime soon. But that’s getting ahead of myself. Let me back up.

My love affair with robots began in the early 1980s when I joined the research staff at MIT’s Artificial Intelligence Lab. Physics was my college major but after a short time at the lab the potential of the developing technology seduced me. I became a roboticist.

Such an exhilarating place to work! A host of brilliant people were researching deep problems and fascinating algorithms. Amazingly clever mechanisms were being developed, and it was all converging into capable mobile robots. The future seemed obvious. So I made a bold prediction and told all my friends, “In three to five years, robots will be everywhere doing all sorts of jobs.”

But I was wrong.

Again and again in those early years, news stories teased: “Big Company X has demonstrated a prototype of Consumer Robot Y. X says Y will be available for sale next year.” But somehow next year didn’t arrive. Through the 1980s and 1990s, robots never managed to find their way out of the laboratory. This was distressing to a committed robot enthusiast. Why hadn’t all the journal papers, clever prototypes, and breathless news stories culminated in a robot I could buy in a store?

Let me answer with the story of the first consumer robot that did achieve marketplace stardom.

RUG WARRIOR: Joe Jones built his “Rug Warrior” (above) in 1989. He calls it “the earliest conceptual ancestor of Roomba.” It included bump sensors and a carpet sweeper mechanism made from a bottle brush. It picked up simulated dirt at a demonstration but, Jones says, “was not robust enough to actually clean my apartment as I had hoped.” Courtesy of Joe Jones

In the summer of 1999, while working at iRobot, a colleague, Paul Sandin, and I wrote a proposal titled “DustPuppy, A Near-Term, Breakthrough Product with High Earnings Potential.” We described an inexpensive little robot, DustPuppy, that would clean consumers’ floors by itself. Management liked the idea and gave us $10,000 and two weeks to build a prototype.

Using a cylindrical brush, switches, sensors, motors, and a commonplace microprocessor, we assembled our vision. At the end of an intense fortnight we had it—a crude version of a robot that conveyed a cleaning mechanism around the floor and—mostly—didn’t get stuck. Management saw the same promise in DustPuppy as Paul and I did.

We called our robot DustPuppy for a reason. This was to be the world’s first significant consumer robot and the team’s first attempt at a consumer product. The risk was that customers might expect too much and that we might deliver too little. We were sure that—like a puppy—our robot would try very hard to please but that also—like a puppy—it might sometimes mess up. Calling it DustPuppy was our way of setting expectations and hoping for patience if our robot wasn’t perfect out of the gate. Alas, iRobot employed a firm to find a more commercial name. Many consumer tests later, DustPuppy became Roomba. The thinking was the robot’s random motion makes it appear to be dancing around the room—doing the Rumba.

Paul and I knew building a robotic floor cleaner entails fierce challenges not apparent to the uninitiated. Familiar solutions that work well for people can prove problematic when applied to a robot.

Your manual vacuum likely draws 1,400 watts or 1.9 horsepower from the wall socket. In a Roomba-sized robot, that sort of mechanism would exhaust the battery in about a minute. Make the robot bigger, to accommodate a larger battery, and the robot won’t fit under the furniture. Also, batteries are expensive—the cost of a big one might scuttle sales. We needed innovation.
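To make the energy problem concrete, here is a back-of-the-envelope calculation, sketched in Python. The battery figures are assumptions chosen only for illustration (a small robot pack on the order of a few dozen watt-hours), not specifications from the Roomba project.

```python
# Rough estimate: how long could a small robot battery sustain a
# full-size vacuum motor? All battery figures are illustrative assumptions.

battery_voltage_v = 14.4       # assumed nominal pack voltage
battery_capacity_ah = 3.0      # assumed capacity in amp-hours
battery_energy_wh = battery_voltage_v * battery_capacity_ah   # about 43 Wh

vacuum_power_w = 1400          # draw of a typical plug-in vacuum

runtime_minutes = battery_energy_wh / vacuum_power_w * 60
print(f"Estimated runtime: {runtime_minutes:.1f} minutes")    # roughly 2 minutes
```

Even with generous assumptions the pack is flat within a couple of minutes, which is why brute-force suction was never an option in a robot this size.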

Melville Bissell, who patented the carpet sweeper in 1876, helped us out. We borrowed from his invention to solve Roomba’s energy problem. A carpet sweeper picks up dirt very efficiently. Although you supply all the power, you won’t work up a sweat pushing one around. (If you supplied the entire 1.9 horsepower a conventional vacuum needs, you’d do a lot of sweating!)

When designers festoon their robots with anthropomorphic features, they are making a promise no robot can keep.

We realized that our energy-efficient carpet sweeper would not clean as quickly or as deeply as a powerful vacuum. But we thought, if the robot spends enough time doing its job, it can clean the surface dirt just as well. And if the robot runs every day, the surface dirt won’t work into the carpet. Roomba matches a human-operated vacuum by doing the task in a different way.

Any robot vacuum must do two things: 1) not get stuck, and 2) visit every part of the floor. The first imperative we satisfied in part by making Roomba round with its drive wheels on the diameter. The huge advantage of this shape is that Roomba can always spin in place to escape from an object. No other shape enables such a simple, reliable strategy. The second imperative, visiting everywhere, requires a less obvious plan.

You move systematically while cleaning, only revisiting a spot if that spot is especially dirty. Conventional wisdom says our robot should do the same—drive in a boustrophedon pattern. (This cool word means writing lines in alternate directions, left to right, right to left, like an ox turns in plowing.) How to accomplish this? We received advice like, “Just program the robot to remember where it’s been and not go there again.”

Such statements reveal a touching faith that software unaided can solve any technical problem. But try this exercise (in a safe place, please!). While standing at a marked starting point, pick another point, say, six feet to your left. Now keep your eyes closed while you walk in a big circle around the central point. How close did you come to returning to your starting point? Just like you, a robot can’t position itself in the world without appropriate sensors. Better solutions are available today, but circa 2000 a position-sensing system would have added over $1,000 to Roomba’s cost. So, boustrophedon paths weren’t an option. We had to make Roomba do its job without knowing where it was.
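The need for position sensing can be illustrated with a toy simulation. The sketch below is a hypothetical Python example, not anything from Roomba’s codebase: it dead-reckons a robot’s position from the commanded motion while the real heading wanders slightly each step, and shows how far the estimate drifts after a single lap of a small square.

```python
import math
import random

# Toy dead-reckoning demo: the robot tries to drive a 2 m x 2 m square.
# Each step carries a tiny heading error; without an external position
# sensor the accumulated error is never corrected.

random.seed(1)

true_x = true_y = est_x = est_y = 0.0
true_heading = est_heading = 0.0
step = 0.01   # meters per update

for side in range(4):
    for _ in range(200):          # 200 steps of 1 cm = 2 m per side
        # The estimate assumes the commanded heading was followed exactly...
        est_x += step * math.cos(est_heading)
        est_y += step * math.sin(est_heading)
        # ...but the real heading drifts a little every step.
        true_heading += random.gauss(0.0, 0.01)   # about 0.6 degrees of noise
        true_x += step * math.cos(true_heading)
        true_y += step * math.sin(true_heading)
    est_heading += math.pi / 2    # commanded 90-degree turn
    true_heading += math.pi / 2   # turn executed perfectly, for simplicity

error = math.hypot(true_x - est_x, true_y - est_y)
print(f"Position error after one lap: {error:.2f} m")
```

Even with this modest, idealized noise the estimated and true positions typically end up a sizable fraction of a meter apart after one lap, and a real robot slipping on carpet does far worse.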

I design robots using a control scheme called behavior-based programming. This approach is robot-appropriate because it’s fast, responsive, and runs on low-cost computer hardware. A behavior-based program structures a robot’s control scheme as a set of simple, understandable behaviors.

Remember that Roomba’s imperative is to apply its cleaning mechanism to all parts of the floor and not get stuck. The program that accomplishes this needs a minimum of two behaviors. Call them Cruise and Escape. Cruise is single-minded. It ignores all sensor inputs and constantly outputs a signal telling the robot’s motors to drive forward.

Escape watches the robot’s front bumper. Whenever the robot collides with something, one or both of the switches attached to the bumper activate. If the left switch closes, Escape knows there’s been a collision on the left, so it tells the motors to spin the robot to the right. A collision on the right means spin left. If both switches close at once, an arbitrary decision is made. When neither switch is closed Escape sends no signal to the motors.

TEST FLOORS: “Roomba needed to function on many floor types and to transition smoothly from one type to another,” says Joe Jones. “We built this test floor to verify that Roomba would work in this way.” The sample floors include wood, various carpets, and tiles. Courtesy of Joe Jones

Occasionally Cruise and Escape try to send commands to the motors at the same time. When this happens, a bit of code called an arbiter decides which behavior succeeds—the highest priority behavior outputting a command wins. In our example, Escape is assigned the higher priority.

Watching the robot, we see a complex behavior emerge from these simple rules. The robot moves across the floor until it bumps into something. Then it stops moving forward and turns in place until the path is clear. It then resumes forward motion. Given time, this random motion lets the robot cover, and clean, the entire floor.
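In code, the scheme reads about as simply as it sounds. The sketch below is a hypothetical Python rendering of the Cruise/Escape/arbiter structure described above, not iRobot’s actual firmware; the motor commands and bumper inputs are stand-ins.

```python
# Minimal behavior-based control in the spirit of the Cruise/Escape example.
# Hypothetical interfaces; not iRobot firmware.

FORWARD, SPIN_LEFT, SPIN_RIGHT = "forward", "spin_left", "spin_right"

def cruise(bump_left, bump_right):
    """Lowest priority: ignore every sensor and drive forward."""
    return FORWARD

def escape(bump_left, bump_right):
    """Higher priority: after a collision, spin away from the obstacle."""
    if bump_left and bump_right:
        return SPIN_RIGHT      # arbitrary choice on a head-on collision
    if bump_left:
        return SPIN_RIGHT
    if bump_right:
        return SPIN_LEFT
    return None                # no opinion; let lower priorities act

# Behaviors listed from highest to lowest priority.
BEHAVIORS = [escape, cruise]

def arbiter(bump_left, bump_right):
    """Send the motors the command of the highest-priority behavior that has one."""
    for behavior in BEHAVIORS:
        command = behavior(bump_left, bump_right)
        if command is not None:
            return command

print(arbiter(bump_left=True, bump_right=False))    # spin_right: Escape wins
print(arbiter(bump_left=False, bump_right=False))   # forward: Cruise by default
```

A real robot would call the arbiter every few milliseconds and translate the winning command into motor speeds, but the priority structure is all there is to it.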

Did you guess so little was going on in the first Roomba’s brain? When observers tell me what Roomba is thinking they invariably imagine great complexity—imbuing the robot with intentions and intricate plans that are neither present nor necessary. Every robot I build is as simple and simple-minded as I can make it. Anything superfluous, even intelligence, works against marketplace success.

The full cleaning task contains some extra subtleties. These require more than just two behaviors for efficient operation. But the principle holds: the robot includes only the minimum components and code required for the task.

A few months from product launch, we demonstrated one of our prototypes to a focus group. The setup was classic: A facilitator presented Roomba to a cross section of potential customers while the engineers watched from a darkened room behind a one-way mirror.

The session was going well: people seemed to like the robot, and it picked up test dirt effectively. Then the facilitator mentioned that Roomba used a carpet sweeper mechanism and did not include a vacuum.

The mood changed. Our test group revised the price they’d be willing to pay for Roomba, cutting in half their estimate from only minutes earlier. We designers were perplexed. We solved our energy problem by eschewing a vacuum in favor of a carpet sweeper—and it worked! Why wasn’t that enough for the focus group?

Did you guess so little was going on in Roomba’s brain? Every robot I build is as simple-minded as I can make it.

Decades of advertising have trained consumers that a vacuum drawing lots of amps means effective cleaning. We wanted customers to judge our new technology using a more appropriate metric. But there was no realistic way to accomplish that. Instead, our project manager declared, “Roomba must have a vacuum, even if it does nothing.”

No one on the team wanted a gratuitous component—even if it solved our marketing problem. We figured we could afford three watts to run a vacuum motor. But a typical vacuum burns 1,400 watts. What could we do with just three?

Using the guts of an old heat gun, some cardboard, and packing tape, I found a way. It turned out that if I made a very narrow inlet, I could achieve the same air-flow velocity as a regular vacuum, but because the volume of air moved was minuscule, it used only a tiny bit of power. We had a vacuum that actually contributed to cleaning.
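A rough way to see why the narrow inlet works, sketched below in Python: at a fixed air velocity, the volumetric flow, and with it the power needed to keep the air moving, scales with the inlet’s cross-sectional area. The dimensions and velocity here are assumptions for illustration, not Roomba’s actual figures.

```python
# Why a narrow inlet slashes power: at a fixed air velocity, volumetric
# flow (and the power needed to sustain it) scales with inlet area.
# All dimensions and the velocity are illustrative assumptions.

air_velocity = 20.0                   # m/s, assumed airflow speed at the inlet

wide_inlet_area = 0.30 * 0.030        # 30 cm x 30 mm opening, upright-vacuum scale
narrow_inlet_area = 0.30 * 0.0005     # 30 cm x 0.5 mm slot

wide_flow = wide_inlet_area * air_velocity      # m^3/s of air to move
narrow_flow = narrow_inlet_area * air_velocity

print(f"Wide-inlet flow:   {wide_flow:.3f} m^3/s")
print(f"Narrow-inlet flow: {narrow_flow:.4f} m^3/s")
print(f"Reduction factor:  {wide_flow / narrow_flow:.0f}x")   # 60x less air to move
```

Same velocity at the opening, a small fraction of the air to move, and therefore a correspondingly small fraction of the power.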

DUST PUPPY: Before the marketers stepped in with the name “Roomba,” Joe Jones and his colleague Paul Sandin called their floor cleaner “DustPuppy.” “Our robot would try very hard to please,” Jones writes. But like a puppy, “it might sometimes mess up.” Above, Sandin examines a prototype with designer Steve Hickey (black shirt) and intern Ben Trueman. Courtesy of Joe Jones

There’s a moment in the manufacturing process called “commit to tooling” when the design must freeze so molds for the plastic can be cut. Fumble that deadline and you may miss your launch date, wreaking havoc on your sales.

About two weeks before “commit,” our project manager said, “Let’s test the latest prototype.” We put some surrogate dirt on the floor and let Roomba run over it. The dirt remained undisturbed.

Panic ensued. Earlier prototypes had seemed to work, and we thought we understood the cleaning mechanism. But maybe not. I returned to the lab and tried to identify the problem. This involved spreading crushed Cheerios on a glass tabletop and looking up from underneath as our cleaning mechanism operated.

Our concept of Mr. Bissell’s carpet sweeper went like this: As the brush turns against the floor, bristle tips pick up dirt particles. The brush rotates inside a conforming shroud carrying the dirt to the back where a toothed structure combs it from the brush. The dirt then falls into the collection bin.

That sedate description couldn’t have been more wrong. In fact, as the brush turns against the floor, a flicking action launches dirt particles into a frenetic, chaotic cloud. Some particles bounce back onto the floor, some bounce deep into the brush, some find the collection bin. The solution was to extend the shroud a little farther around the back side of the brush, redirecting the dirt that bounced out so the brush had a second chance to pick it up. Roomba cleaned again, and we could begin cutting molds with a day or two to spare.

Roomba launched in September 2002. Its success rapidly eclipsed the dreams of all involved.

Did Roomba’s nascent reign end the long robot drought? Was my hordes-of-robots-in-service-to-humanity dream about to come true?

In the years since iRobot released Roomba, many other robot companies have cast their die. Here are a few: Anki, Aria Insights, Blue Workforce, Hease Robotics, Jibo, Keecker, Kuri, Laundroid, Reach Robotics, Rethink Robotics, and Unbounded Robotics. Besides robots and millions of dollars of venture capitalist investment, what do all of these companies have in common? None are in business today.

The commercial failure of robots and robot companies is not a new phenomenon. Before Roomba, the pace was slower, but the failure rate no less disappointing. This dismal situation set me looking for ways around the fatal missteps roboticists seemed determined to make. I settled on three principles that we followed while developing Roomba.

1. Perform a Valuable Task

When a robot does a specific job, say, mowing your lawn or cleaning your grill, its value is clear and long-lasting. But over the years, I’ve seen many cool, cute, engaging robots that promised great, albeit vague, value while performing no discernible task. Often the most embarrassing question I could ask the designer of such a robot was, “What does your robot do?” In this case the blurted answer, “Everything!” is synonymous with “Nothing.” The first principle for a successful robot is: Do something people want done. When a robot’s only attribute is cuteness, value evaporates as novelty fades.

2. Do the Task Today

Many robots emerge from research labs. In the lab, researchers aspire to be first to achieve some impressive result; cost and reliability matter little. But cost and reliability are paramount for real-world products. Bleeding edge technologies are rarely inexpensive, reliable, or timely. Second principle: Use established technology. A research project on the critical path to robot completion can delay delivery indefinitely.

3. Do the Task for Less

People have jobs they want done and states they want achieved—a clean floor, a mowed lawn, fresh folded clothes in the dresser. The result matters, the method doesn’t. If a robot cannot provide the lowest cost, least arduous solution, customers won’t buy it. Third principle: A robotic solution must be cost-competitive with existing solutions. People will not pay more to have a robot do the job.

A few robots have succeeded impressively: Roomba, Kiva Systems (warehouse robots), and Husqvarna’s Automower (lawn mower). But I started this article with the question, why aren’t successful robots everywhere? Maybe the answer is becoming clearer.

Robot success is opportunistic. Not every application has a viable robotic solution. Given the state of the art, only select applications offer all three essentials: a large market, existing technology that supports autonomy, and a robotic approach that outcompetes other solutions.

There’s one more subtle aspect. Robots and people may accomplish the same task in completely different ways. This makes deciding which tasks are robot-appropriate both difficult and, from my perspective, great fun. Every potential task must be reimagined from the ground up.

My latest robot, Tertill, prevents weeds from growing in home gardens. A human gardener pulls weeds up by the roots. Why? Because this optimizes the gardener’s time. Leaving roots behind isn’t a moral failure; it just means weeds will rapidly re-sprout, forcing the gardener to spend more time weeding.

Tertill does not pull weeds but attacks them in two other ways. It cuts the tops off weeds and it uses the scrubbing action of the wheels to kill weeds as they sprout from seeds. These tactics work because the robot, unlike the gardener, lives in the garden. Tertill returns every day to prevent rooted weeds from photosynthesizing so roots eventually die; weed seeds that are constantly disturbed don’t sprout.

Had Tertill copied the human solution, the required root extraction mechanism and visual identification system would have increased development time, added cost, and reduced reliability. Without reimagining the task, there would be no solution.

Robots have a hard enough time doing their jobs at all. Burdening them with unnecessary features and expectations worsens the problem. That’s one reason I’m always vexed when designers festoon their robots with anthropomorphic features—they make a promise no robot can keep. Anthropomorphic features and behaviors hint that the robot has the same sort of inner life as people. But it doesn’t. Instead the robot has a limited bag of human-mimicking tricks. Once the owner has seen all the tricks, the robot’s novelty is exhausted, and along with it the reason for switching the robot on. Only robots that perform useful tasks remain in service after the novelty wears off.

No commercially successful robot I’m aware of has superfluous extras. This includes computation cycles—cycles it might use to contemplate world domination. All of the robot’s resources are devoted to accomplishing the task for which it was designed, or else it wouldn’t be successful. Working robots don’t have time to take over the world.

Robots have been slow to appear because each one requires a rare confluence of market, task, technology, and innovation. (And luck. I only described some of the things that nearly killed Roomba.) But as technology advances and costs decline, the toolbox for robot designers constantly expands. Thus, more types of robots will cross the threshold of economic viability. Still, we can expect one constant: Each new, successful robot will represent a minimum—the simplest, lowest-cost solution to a problem people want solved. The growing set of tools that lets us attack ever more interesting problems makes this an exciting time to practice robotics.

Joe Jones is cofounder and CTO of Franklin Robotics. A graduate of MIT, he holds more than 70 patents.

Lead image: Christa Mrgan / Flickr



Why People Feel Misinformed, Confused, and Terrified About the Pandemic - Facts So Romantic


 

The officials deciding what to open, and when, seldom offer thoughtful rationales. Clearly, risk communication about COVID-19 is failing with potentially dire consequences. Photograph by michael_swan / Flickr

When I worked as a TV reporter covering health and science, I would often be recognized in public places. For the most part, the interactions were brief hellos or compliments. But two periods stand out when significant numbers of those who approached me were seeking detailed information: the earliest days of the pandemic that became HIV/AIDS, and the anthrax attacks shortly after 9/11. Clearly people feared for their own safety and felt their usual sources of information were not offering them satisfaction. Citizens’ motivation to seek advice when they feel they aren’t getting it from official sources is a strong indication that risk communication is doing a substandard job. It’s significant that one of those periods occurred in the pre-Internet era and one after. We can’t blame a public feeling misinformed solely on the noise of the digital age.

America is now opening up from COVID-19 lockdown with different rules in different places. In many parts of the country, people have been demonstrating, even rioting, for restrictions to be lifted sooner. Others are terrified of loosening the restrictions because they see COVID-19 cases and deaths still rising daily. The officials deciding what to open, and when, seldom offer thoughtful rationales. Clearly, risk communication about COVID-19 is failing with potentially dire consequences.

A big part of maintaining credibility is to admit to uncertainty—something politicians are loath to do.

Peter Sandman is one of the foremost experts on risk communication. A former professor at Rutgers University, he was a top consultant with the Centers for Disease Control and Prevention in designing crisis and emergency risk communication, a field of study that combines public health with psychology. Sandman is known for the formula Risk = Hazard + Outrage. His goal is to create better communication about risk, allowing people to assess hazards and not get caught up in outrage at politicians, public health officials, or the media. Today, Sandman is a risk consultant, teamed with his wife, Jody Lanard, a pediatrician and psychiatrist. Lanard wrote the first draft of the World Health Organization’s Outbreak Communications Guidelines. “Jody and Peter are seen as the umpires to judge the gold standard of risk communications,” said Michael Osterholm of the Center for Infectious Disease Research and Policy at the University of Minnesota. Sandman and Lanard have posted a guide for effective COVID-19 communication on the center’s website.

I reached out to Sandman to expand on their advice. We communicated through email.

Sandman began by saying he understood the protests around the country about the lockdown. “It’s very hard to warn people to abide by social-distancing measures when they’re so outraged that they want to kill somebody and trust absolutely nothing people say,” he told me. “COVID-19 outrage taps into preexisting grievances and ideologies. It’s not just about COVID-19 policies. It’s about freedom, equality, too much or too little government. It’s about the arrogance of egghead experts, left versus right, globalism versus nationalism versus federalism. And it’s endlessly, pointlessly about Donald Trump.”

Since the crisis began, Sandman has isolated three categories of grievance. He spelled them out for me, assuming the voices of the outraged:

• “In parts of the country, the response to COVID-19 was delayed and weak; officials unwisely prioritized ‘allaying panic’ instead of allaying the spread of the virus; lockdown then became necessary, not because it was inevitable but because our leaders had screwed up; and now we’re very worried about coming out of lockdown prematurely or chaotically, mishandling the next phase of the pandemic as badly as we handled the first phase.”

• “In parts of the country, the response to COVID-19 was excessive—as if the big cities on the two coasts were the whole country and flyover America didn’t need or didn’t deserve a separate set of policies. There are countless rural counties with zero confirmed cases. Much of the U.S. public-health profession assumes and even asserts without building an evidence-based case that these places, too, needed to be locked down and now need to reopen carefully, cautiously, slowly, and not until they have lots of testing and contact-tracing capacity. How dare they destroy our economy (too) just because of their mishandled outbreak!”

• “Once again the powers-that-be have done more to protect other people’s health than to protect my health. And once again the powers-that-be have done more to protect other people’s economic welfare than to protect my economic welfare!” (These claims can be made with considerable truth by healthcare workers; essential workers in low-income, high-touch occupations; residents of nursing homes; African-Americans; renters who risk eviction; the retired whose savings are threatened; and others.)

In their article for the Center for Infectious Disease Research and Policy, Sandman and Lanard point out that coping with a pandemic requires a thorough plan of communication. This is particularly important as the crisis is likely to enter a second wave of infection, when it could be more devastating. The plan starts with six core principles: 1) Don’t over-reassure, 2) Proclaim uncertainty, 3) Validate emotions—your audience’s and your own, 4) Give people things to do, 5) Admit and apologize for errors, and 6) Share dilemmas. To achieve the first three core principles, officials must immediately share what they know, even if the information may be incomplete. If officials share good news, they must be careful not to make it too hopeful. Over-reassurance is one of the biggest dangers in crisis communication. Sandman and Lanard suggest officials say things like, “Even though the number of new confirmed cases went down yesterday, I don’t want to put too much faith in one day’s good news.” 

Sandman and Lanard say a big part of maintaining credibility is to admit to uncertainty—something politicians are loath to do. They caution against invoking “science” as a sole reason for action, as science in the midst of a crisis is “incremental, fallible, and still in its infancy.” Expressing empathy, provided it’s genuine, is important, Sandman and Lanard say. It makes the bearer more human and believable. A major tool of empathy is to acknowledge the public’s fear as well as your own. There is good reason to be terrified about this virus and its consequences on society. It’s not something to hide.

Sandman and Lanard say current grievances with politicians, health officials, and the media, about how the crisis has been portrayed, have indeed been contradictory. But that makes them no less valid. Denying the contradictions only amplifies divisions in the public and accelerates the outrage, possibly beyond control. They strongly emphasize one piece of advice. “Before we can share the dilemma of how best to manage any loosening of the lockdown, we must decisively—and apologetically—disabuse the public of the myth that, barring a miracle, the COVID-19 pandemic can possibly be nearing its end in the next few months.”

Robert Bazell is an adjunct professor of molecular, cellular, and developmental biology at Yale. For 38 years, he was chief science correspondent for NBC News.

