
President-elect Erdoğan and the Future of Turkey


Event Information

September 4, 2014
3:00 PM - 4:30 PM EDT

Choate Room
Carnegie Endowment for International Peace
1779 Massachusetts Ave. NW
Washington, DC


For the first time in Turkey’s history, the electorate directly cast their votes for president earlier this week, overwhelmingly electing current Prime Minister Recep Tayyip Erdoğan to the position with 52 percent of the vote. After 12 years in power, Erdoğan’s victory was widely expected, even though the two main opposition parties chose Ekmeleddin İhsanoğlu as their common candidate in a rare show of unity, and Selahattin Demirtaş, the leader of the main Kurdish political party in Turkey, tried hard to appeal to an electoral base beyond just Kurds. The impact of the election’s results, however, remains to be seen.

How should the election results be interpreted? Will Erdoğan succeed in transforming Turkey from a parliamentary system to a presidential one? Whom will he choose as prime minister? What will this outcome mean for Turkey’s economic performance and its foreign policy at a time when the neighborhood is sliding deeper into instability, if not chaos? What will happen to Turkey’s European vocation and its transatlantic relations?

On September 4, the Turkey Project of the Center on the United States and Europe at Brookings hosted a panel discussion to consider what President Erdoğan’s new mandate means for the nation, its government and institutions, and the ruling Justice and Development Party. Kemal Kirisci, TÜSİAD senior fellow and Turkey project director, moderated the conversation. Panelists included Robert Wexler of the S. Daniel Abraham Center for Middle East Peace, Kadir Üstün of the SETA Foundation, and Brookings Nonresident Senior Fellow Ömer Taşpınar.

Join the conversation on Twitter using #PresErdogan

Audio

Transcript

Event Materials

     
 
 





Armenians and the legacies of World War I


Event Information

May 13, 2015
9:45 AM - 5:30 PM EDT

Falk Auditorium
Brookings Institution
1775 Massachusetts Avenue NW
Washington, DC 20036


This year marks the centenary of the atrocities perpetrated against the Armenian people of the Ottoman Empire during World War I by the governing Committee of Union and Progress. Most scholars and many governments consider these horrific events––in which more than one million people were systematically massacred or marched to their deaths––to constitute the first modern European genocide. Turkish society has begun to open up and confront the issue over the last decade. Turkish authorities, however, continue to reject the use of the term genocide, contest the number of deaths, and highlight the fact that many other minority groups, Muslims, and Turks were killed in the same period as the war-ravaged empire unraveled. For descendants of the survivors, Turkey’s official refusal to reckon fully with this painful chapter of its past is a source of deep distress and concern and undermines societal efforts toward understanding and reconciliation. Armenians have also raised the question of reparations, further complicating the dispute.

On May 13, the Center on the United States and Europe at Brookings (CUSE), together with the Massachusetts Institute of Technology (MIT) Center for International Studies, the Hrant Dink Memorial Human Rights and Justice Lectureship at MIT, and the Carnegie Endowment for International Peace held a conference with several leading scholars of the Armenian genocide and other international experts. Speakers considered the historical record and circumstances of the genocide amid the disorder of World War I; how Turkey, Armenia, and other key actors have dealt with the legacy of 1915; and how this legacy continues to reverberate in the region today, from protracted conflicts in the Caucasus to the deliberate targeting of religious and ethnic minority groups for expulsion and death amid the upheavals in Iraq, Syria, and other states that emerged from the rubble of the Ottoman Empire.

Join the conversation on Twitter using #Armenia1915

Video

Audio

Transcript

Event Materials

      
 
 





Kurds will be the agent of change in Turkish politics


Real political change in Turkey has been hard to come by in recent years. Establishment parties in Turkey have, time and again, proven unable to change the political system. Now a new hope for reform has emerged in Turkey from an unlikely source: the Kurds.

During most of the Cold War—and particularly during the 1980s and 1990s—Turkey had, for lack of a better word, a Kemalist consensus: The military played a major role behind the scenes, and those outside the consensus, especially the Islamists and the Kurds, were essentially excluded from politics. 

The first wave of democratization in the post-Cold War era in Turkey came from the Islamists—specifically, from the Justice and Development Party (AKP). In 2002, when the AKP came to power, it decided that accession to the European Union should be its main goal and that the effort could serve as a tool to undermine the political power of the Turkish military that still lurked behind the scenes. So, incredibly, an Islamist party, the AKP, decided to bring about a post-Kemalist system by pushing for membership in the EU’s essentially liberal, democratic project. This strategy explains why Turkish liberals supported the AKP and could hope that the Islamists would push the system in a liberal direction.

But then something tragic happened. The AKP became the establishment. After the military was essentially defeated as a political force, the AKP ceased to be an anti-establishment party. Instead, it became a party that used the privileges of power, built its own networks of patronage and clientelism, and fell victim to this entity called the state. The AKP became the state.

Now we're in a situation where the second wave of democratization may also come from an anti-establishment party, this one mostly representing the Kurds. The most democratic, the most liberal, the most progressive narrative that you hear in Turkish politics today is coming from Selahattin Demirtaş of the pro-Kurdish Peoples’ Democratic Party (HDP)—not the main opposition Republican People’s Party (CHP), not the far-right Nationalist Movement Party (MHP), and not the AKP.

There is reason to think that, in Turkey, only anti-establishment parties can actually improve the system. The old AKP was an anti-establishment party. What gives me hope about the HDP is that, even when it enters the parliament—and even if a miracle happens and it enters a coalition government—it will never become the state. 

By definition, the HDP is a Kurdish political party. The Islamists could become the state, because Turkey is 99 percent Muslim, and people could establish a sense of supremacy based on Muslim identity. The Kurds will never be able to represent the majority. They will never be able to become the state. They have a vested and permanent interest in the rule of law—indeed, their very survival depends on it. Their survival depends on minority rights and on checks and balances. This stark fact gives me hope about the HDP and its agenda.

What’s wrong with the rest of the Turkish opposition?

The real puzzle is the failure of establishment political parties to challenge the system. It would have been wonderful for a center-right party or a center-left party to have taken Turkey to the post-Kemalist phase, to a post-military, pro-E.U., pro-progressive phase. But the mainstream political parties have failed. The establishment of Turkey has failed. The Kemalist order in Turkey has failed.

The agent of change was first the Islamists, and now the agent of change has become the Kurds. 

What is it that creates this mental block of establishment political parties? Why did it take so many years for the CHP to understand that it can become an agent of change, too? In the absence of a left-wing movement in Turkey, there will never be balance. We need a progressive left. We need something that can challenge the strong coalition on the right. The HDP alone cannot be there.

One thing that is not being discussed in Turkey is the possibility of a CHP-HDP coalition, yet this is the most natural coalition. If the CHP is truly a progressive political party, it should be able to shed its Kemalist, neo-nationalist baggage and embrace the progressive, liberal, democratic agenda of the HDP.

One reason that the CHP voters and the CHP itself are unable to really embrace the HDP is because the CHP, deep down, is still the party of Atatürk, still the party of Kemalism, still the party of nationalism. And what the Kurds want in Turkey—make no mistake—what the Kurds want in Turkey is autonomy. They want nothing short of autonomy.

The days when you could solve the Kurdish question with some cosmetic cultural reforms are over. The Kurds want democratic decentralization, and to me, that translates into autonomy. This is a very difficult step for the CHP to digest. Add to this the fact that disgruntled CHP voters are defecting to the HDP: people who would normally vote for a center-left progressive party are so disillusioned with the CHP that they are gravitating to the HDP. There is therefore also a tactical obstacle to cooperation between the HDP and the CHP right now.

But down the line, I think the best reconciliation between Turkish nationalism and Kurdish nationalism would come from a CHP-HDP coalition. Turkish nationalism needs to reconcile itself to the fact that the Kurdish genie is out of the bottle. The good old days of assimilating the Kurds are over. The Kurds want autonomy. They will probably get it, hopefully in a bloodless way.

      
 
 





What’s the government done to relieve student loan borrowers of their burden during the corona crisis?

Forty-two million Americans, or one in every eight, have student loans, and they owe a total of $1.6 trillion, the second largest pool of consumer credit after mortgages. According to the Federal Reserve, 20 percent of adult borrowers who borrowed for their own educations were behind on their payments in 2018. Of those who are…

       





Webinar: Reopening the coronavirus-closed economy — Principles and tradeoffs

In an extraordinary response to an extraordinary public health challenge, the U.S. government has forced much of the economy to shut down. We now face the challenge of deciding when and how to reopen it. This is both vital and complicated. Wait too long—maintain the lockdown until we have a vaccine, for instance—and we’ll have another Great Depression. Move too soon, and we…

       





The ABCs of the post-COVID economic recovery

The economic activity of the U.S. has plummeted in the wake of the coronavirus pandemic and unemployment has soared—largely the result of social distancing policies designed to slow the spread of the virus. The depth and speed of the decline will rival that of the Great Depression. But will the aftermath be as painful? Or…

       





Making sense of the monthly jobs report during the COVID-19 pandemic

The monthly jobs report—the unemployment rate from one survey and the change in employer payrolls from another survey—is one of the most closely watched economic indicators, particularly at a time of an economic crisis like today. Here’s a look at how these data are collected and how to interpret them during the COVID-19 pandemic. What…

       





Artificial intelligence, deepfakes, and the uncertain future of truth

Deepfakes are videos that have been constructed to make a person appear to say or do something that they never said or did. With artificial intelligence-based methods for creating deepfakes becoming increasingly sophisticated and accessible, deepfakes are raising a set of challenging policy, technology, and legal issues. Deepfakes can be used in ways that are…

       





Coupled Contagion Dynamics of Fear and Disease: Mathematical and Computational Explorations

Published version of the CSED October 2007 Working Paper

ABSTRACT

Background

In classical mathematical epidemiology, individuals do not adapt their contact behavior during epidemics. They do not endogenously engage, for example, in social distancing based on fear. Yet, adaptive behavior is well-documented in true epidemics. We explore the effect of including such behavior in models of epidemic dynamics.

Methodology/Principal Findings

Using both nonlinear dynamical systems and agent-based computation, we model two interacting contagion processes: one of disease and one of fear of the disease. Individuals can “contract” fear through contact with individuals who are infected with the disease (the sick), infected with fear only (the scared), and infected with both fear and disease (the sick and scared). Scared individuals–whether sick or not–may remove themselves from circulation with some probability, which affects the contact dynamic, and thus the disease epidemic proper. If we allow individuals to recover from fear and return to circulation, the coupled dynamics become quite rich, and can include multiple waves of infection. We also study flight as a behavioral response.

Conclusions/Significance

In a spatially extended setting, even relatively small levels of fear-inspired flight can have a dramatic impact on spatio-temporal epidemic dynamics. Self-isolation and spatial flight are only two of many possible actions that fear-infected individuals may take. Our main point is that behavioral adaptation of some sort must be considered.
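The coupled dynamics described above can be sketched in a few lines of code. The following discrete-time simulation is an illustrative toy, not the authors' published model: the compartment names, parameter values, and simplified self-isolation rule are all assumptions, and for brevity it omits the combined sick-and-scared state.

```python
def simulate(steps=300, beta_d=0.35, beta_f=0.5,
             gamma_d=0.1, gamma_f=0.05, p_hide=0.3):
    """Evolve population fractions for a coupled disease/fear contagion."""
    # S: susceptible; I: sick (circulating); F: scared but healthy (circulating);
    # Q: scared and self-isolated (out of circulation); R: recovered from disease.
    S, I, F, Q, R = 0.99, 0.01, 0.0, 0.0, 0.0
    trajectory = []
    for _ in range(steps):
        circ = S + I + F + R                  # only these groups make contacts
        infect = beta_d * S * I / circ        # disease caught from the sick
        scare = beta_f * S * (I + F) / circ   # fear caught from sick or scared
        hide = p_hide * F                     # scared people leave circulation
        calm = gamma_f * Q                    # fear fades; the hidden return
        recover = gamma_d * I                 # recovery from disease
        S += calm - infect - scare
        I += infect - recover
        F += scare - hide
        Q += hide - calm
        R += recover
        trajectory.append((S, I, F, Q, R))
    return trajectory
```

Raising `p_hide` removes contacts and damps the disease peak, while a nonzero `gamma_f` returns still-susceptible people to circulation, which can re-ignite transmission — the mechanism behind the multiple waves of infection the authors describe.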

View full paper »
View factsheet »

Downloads

Authors

Publication: PLoS One Journal
      
 
 





Africa’s industrialization in the era of the 2030 Agenda: From political declarations to action on the ground

Although African countries enjoyed fast economic growth based on high commodity prices over the past decade, this growth has not translated into the economic transformation the continent needs to eradicate extreme poverty and enjoy economic prosperity. Now, more than ever, the necessity for Africa to industrialize is being stressed at various international forums, ranging from…

      
 
 





Detroit Needs a Selloff, Not a Bailout

Robert Crandall and Clifford Winston discuss a proposal for automakers they think will cost taxpayers less and, in the long run, be more beneficial to labor and the overall economy than either a straight bailout or bankruptcy.

      
 
 





Time to Deregulate the Practice of Law

Clifford Winston and Robert Crandall argue that occupational licensing for lawyers creates a monopoly in the legal field. They write that deregulating the industry would give consumers more responsive service while lowering costs.

      
 
 





The U.S. Should Focus on Asia: All of Asia

President Obama made "pivoting" away from the Middle East and toward Asia the cornerstone of his foreign policy. Vali Nasr explains why Washington's renewed attention to East Asia shouldn't come at the expense of the rest of the continent.

      
 
 





The Dangerous Price of Ignoring Syria

Vali Nasr says that President Obama has resisted American involvement in Syria because it challenges a central aim of his foreign policy: shrinking the U.S. footprint in the Middle East and downplaying the region’s importance to global politics. Nasr examines why doing more on Syria would reverse the U.S. retreat from the region.

      
 
 





In the Wake of BCRA: An Early Report on Campaign Finance in the 2004 Elections

ABSTRACT:

Early experience with federal campaign finance reform suggests that the new law is fulfilling its primary objective of severing links between policymakers and large donors, and thus reducing the potential for corruption in the political process. Instead of languishing or seeking to circumvent the law, the national political parties have responded to the ban on soft money by increasing their hard money resources. While outside groups appear active, particularly on the Democratic side, their soft money financing should remain a small fraction of what candidates and parties will raise and spend in the 2004 Elections.

To read the full article, please visit The Forum's website

Publication: The Forum
     
 
 





Reform in an Age of Networked Campaigns

Executive Summary

The political world has been arguing about campaign finance policy for decades. A once rich conversation has become a stale two-sided battleground. One side sees contribution or spending limits as essential to restraining corruption, the appearance of corruption, or the “undue influence” of wealthy donors. The other resists any such limits in the name of free speech. The time has come to leap over this gulf and, as much as possible, move the disputes from the courts. Preventing corruption and protecting free speech should each be among the key goals of any policy regime, but they should not be the only objectives. This report seeks to change the ongoing conversation. Put simply, instead of focusing on attempts to further restrict the wealthy few, it seeks to focus on activating the many.

This is not a brief for deregulation. The members of this working group support limits on contributions to candidates and political parties. But we also recognize the limits of limits. More importantly, we believe that some of the key objectives can be pursued more effectively by expanding the playing field.

Interactive communications technology potentially can transform the political calculus. But technology alone cannot do the trick. Sound governmental policies will be essential: first, to protect the conditions under which a politically beneficial technology may flourish and, second, to encourage more candidates — particularly those below the top of the national ticket — to reach out to small donors and volunteers.

We focus on participation for two reasons. First, if enough people come into the system at the low end there may be less reason to worry about the top. Second, heightened participation would be healthy for its own sake. A more engaged citizenry would mean a greater share of the public following political events and participating in public life. And the evidence seems to suggest that giving and doing are reciprocal activities: volunteering stimulates giving, while giving small amounts seems to heighten non-financial forms of participation by people who feel more invested in the process.

For these reasons, we aim to promote equality and civic engagement by enlarging the participatory pie instead of shrinking it. The Supreme Court has ruled out pursuing equality or civic engagement by constraining speech. But the Court has never ruled out pursuing these goals through policies that do not constrain speech.

This report will show how to further these ends. The first half surveys current conditions; the second contains detailed recommendations for moving forward.

The report begins with new opportunities. The digital revolution is altering the calculus of participation by reducing the costs of both individual and collective action. Millions of Americans went online in 2008 to access campaign materials, comment on news reports, watch campaign videos and share information. The many can now communicate with the many without the intervention of elite or centralized organizations. This capacity has made new forms of political organizations easier to create, while permitting the traditional organizations — candidates and parties — to achieve unprecedented scales of citizen participation. No example better illustrates this potential than the Obama campaign of 2008, which is discussed at length in the full report.

Downloads

Video

Authors

Publication: The Brookings Institution, American Enterprise Institute, The Campaign Finance Institute
      
 
 





Campaign Finance in the 2012 Elections: The Rise of Super PACs


Event Information

March 1, 2012
9:30 AM - 11:00 AM EST

Saul/Zilkha Rooms
The Brookings Institution
1775 Massachusetts Avenue, NW
Washington, DC 20036

From “American Crossroads” to “Americans for a Better Tomorrow, Tomorrow,” so-called "super PACs" have emerged as the dominant new force in campaign finance. Created in the aftermath of two landmark court decisions and regulatory action and inaction by the Federal Election Commission (FEC), these independent spending-only political action committees are collecting unlimited contributions from individuals, corporations and unions to advocate for or against political candidates. The legal requirements they face—disclosure of donors and non-coordination with the candidates and campaigns they are supporting—have proven embarrassingly porous. Increasingly, super PACs are being formed to boost a single candidate and are often organized and funded by that candidate’s close friends, relatives and former staff members. Their presence is most visible in presidential elections but they are quickly moving to Senate and House elections.

On March 1, on the heels of the FEC’s February filing deadline, the Governance Studies program at Brookings hosted a discussion exploring the role of super PACs in the broader campaign finance landscape this election season. Anthony Corrado, professor of government at Colby College and a leading authority on campaign finance, and Trevor Potter, nonresident senior fellow at the Brookings Institution, a former chairman of the FEC and lawyer to Comedy Central’s Stephen Colbert, presented. 

After the panel discussion, the speakers took audience questions. Participants joined the discussion on Twitter by using the hashtag #BISuperPAC.

Video

Audio

Transcript

Event Materials

      
 
 





@Brookings Podcast: The Influence of Super PACs on the 2012 Elections


Super PACs have already spent tens of millions of dollars in the race for the GOP presidential nomination, with more to come. Expert Anthony Corrado says that the unlimited spending by the PACs, made possible by two Supreme Court decisions, is giving wealthy individuals unprecedented influence in the 2012 elections.

Video

Audio

Image Source: © Jessica Rinaldi / Reuters
      
 
 





In defense of John Allen

This past weekend, retired Chairman of the Joint Chiefs of Staff General Martin Dempsey criticized retired General John Allen for his involvement in this year’s presidential race in support of Hillary Clinton and in strong opposition to Donald Trump. Allen (who is currently on a leave of absence from Brookings) believes the latter could cause a historic crisis […]

      
 
 





The Marketplace of Democracy : Electoral Competition and American Politics


Brookings Institution Press and Cato Institute 2006 312pp.

Since 1998, U.S. House incumbents have won a staggering 98 percent of their reelection races. Electoral competition is also low and in decline in most state and primary elections. The Marketplace of Democracy combines the resources of two eminent research organizations—the Brookings Institution and the Cato Institute—to address the startling lack of competition in our democratic system. The contributors consider the historical development, legal background, and political aspects of a system that is supposed to be responsive and accountable yet for many is becoming stagnant, self-perpetuating, and tone-deaf. How did we get to this point, and what—if anything—should be done about it?

In The Marketplace of Democracy, top-tier political scholars also investigate the perceived lack of competition in arenas only previously speculated on, such as state legislative contests and congressional primaries. Michael McDonald, John Samples, and their colleagues analyze previous reform efforts such as direct primaries and term limits, and the effects they have had on electoral competition. They also examine current reform efforts in redistricting and campaign finance regulation, as well as the impact of third parties. In sum, what does all this tell us about what might be done to increase electoral competition?

Elections are the vehicles through which Americans choose who governs them, and the power of the ballot enables ordinary citizens to keep public officials accountable. This volume considers different policy options for increasing the competition needed to keep American politics vibrant, responsive, and democratic.


Brookings Forum: "The Marketplace of Democracy: A Groundbreaking Survey Explores Voter Attitudes About Electoral Competition and American Politics," October 27, 2006.

Podcast: "The Marketplace of Democracy: Electoral Competition and American Politics," a Capitol Hill briefing featuring Michael McDonald and John Samples, September 22, 2006.


Contributors: Stephen Ansolabehere (Massachusetts Institute of Technology), William D. Berry (Florida State University), Bruce Cain (University of California-Berkeley), Thomas M. Carsey (Florida State University), James G. Gimpel (University of Maryland), Tim Groseclose (University of California-Los Angeles), John Hanley (University of California-Berkeley), John Mark Hansen (University of Chicago), Paul S. Herrnson (University of Maryland), Shigeo Hirano (Columbia University), Gary C. Jacobson (University of California-San Diego), Thad Kousser (University of California-San Diego), Frances E. Lee (University of Maryland), John C. Matsusaka (University of Southern California), Kenneth R. Mayer (University of Wisconsin-Madison), Michael P. McDonald (Brookings Institution and George Mason University), Jeffrey Milyo (University of Missouri-Columbia), Richard G. Niemi (University of Rochester), Nathaniel Persily (University of Pennsylvania Law School), Lynda W. Powell (University of Rochester), David Primo (University of Rochester), John Samples (Cato Institute), James M. Snyder Jr. (Massachusetts Institute of Technology), Timothy Werner (University of Wisconsin-Madison), and Amanda Williams (University of Wisconsin-Madison).

ABOUT THE EDITORS

John Samples
John Samples directs the Center for Representative Government at the Cato Institute and teaches political science at Johns Hopkins University.
Michael P. McDonald

Downloads

Ordering Information:
  • 978-0-8157-5579-1, $24.95
  • 978-0-8157-5580-7, $54.95
     
 
 





The Marketplace of Democracy: A Groundbreaking Survey Explores Voter Attitudes About Electoral Competition and American Politics

Event Information

October 27, 2006
10:00 AM - 12:00 PM EDT

Falk Auditorium
The Brookings Institution
1775 Massachusetts Ave., NW
Washington, DC


Despite the attention on the mid-term races, few elections are competitive. Electoral competition, already low at the national level, is in decline in state and primary elections as well. Reformers, who point to gerrymandering and a host of other targets for change, argue that improving competition will produce voters who are more interested in elections, better-informed on issues, and more likely to turn out to the polls.

On October 27, the Brookings Institution—in conjunction with the Cato Institute and The Pew Research Center—presented a discussion and a groundbreaking survey exploring the attitudes and opinions of voters in competitive and noncompetitive congressional districts. The survey, part of Pew's regular polling on voter attitudes, was conducted through the weekend of October 21. A series of questions explored the public's perceptions, knowledge, and opinions about electoral competitiveness.

The discussion also explored a publication that addresses the startling lack of competition in our democratic system. The Marketplace of Democracy: Electoral Competition and American Politics (Brookings, 2006), considers the historical development, legal background, and political aspects of a system that is supposed to be responsive and accountable, yet for many is becoming stagnant, self-perpetuating, and tone-deaf. Michael McDonald, editor and Brookings visiting fellow, moderated a discussion among co-editor John Samples, director of the Center for Representative Government at the Cato Institute, and Andrew Kohut and Scott Keeter from The Pew Research Center, who also discussed the survey.

Transcript

Event Materials

     
 
 





The Competitive Problem of Voter Turnout

On November 7, millions of Americans will exercise their civic duty to vote. At stake will be control of the House and Senate, not to mention the success of individual candidates running for office. President Bush's "stay the course" agenda will either be enabled over the next two years by a Republican Congress or knocked off kilter by a Democratic one.

With so much at stake, it is not surprising that the Pew Research Center found that 51 percent of registered voters have given a lot of thought to this November's election. This is higher than in any other recent midterm election, including 1994, when the figure was 44 percent and Republicans took control of the House. If interest translates into votes, turnout should better the 1994 rate of 41 percent of eligible voters.

There is good reason to suspect that despite the high interest, turnout will not exceed 1994. The problem is that a national poll is, well, a national poll, and does not measure attitudes of voters within states and districts.

People vote when there is a reason to do so. Republican and Democratic agendas are in stark contrast on important issues, but voters also need to believe that their vote will matter in deciding who will represent them. It is here that the American electoral system is broken for many voters.

Voters have little choice in most elections. In 1994, Congressional Quarterly rated 98 House elections as competitive. Today, it lists 51. To put it another way, we are already fairly confident of the winner in nearly 90 percent of House races. Although there is no similar tracking for state legislative offices, we know that the number of elections won by less than 60 percent of the vote has fallen since 1994.

The real damage to the national turnout rate is in the large states of California and New York, which together account for 17 percent of the country's eligible voters. Neither state has a competitive Senate or governor's election, and few competitive House or state legislative races. Compare that to 1994, when competitive Senate and governor's races pushed California's turnout 5 percentage points above the national rate. The same year, New York's competitive governor's race helped boost turnout a point above the national rate.

Lacking stimulation from two of the largest states, turnout boosts will have to come from elsewhere. Texas has an interesting four-way governor's race that might draw infrequent voters to the polls. Ohio's competitive Senate race and some House races might also draw voters. However, in other large states like Florida, Illinois, Michigan and Pennsylvania, turnout will suffer from largely uncompetitive statewide races.

The national turnout rate will likely fall below the 1994 level, shy of 40 percent. This is not to say that turnout will be poor everywhere. Energized voters in Connecticut get to vote in an interesting Senate race, and three of five Connecticut House seats are up for grabs. The problem is that turnout will be localized in these few areas of competition.

The fault does not lie with the voters: people's lives are busy, and a rational person will abstain when their vote does not matter to the election outcome. The political parties also are sensitive to competition and focus their limited resources where elections are competitive. Television advertising and other mobilizing efforts by campaigns will only be found in competitive races.

The old adage of "build it and they will come" is relevant. All but hardcore sports fans tune out a blowout. Building competitive elections -- and giving voters real choices -- will do much to increase voter turnout in American politics. There are a number of reforms on the table: redistricting to create competitive districts, campaign financing to give candidates equal resources, and even altering the electoral system to fundamentally change how a vote elects representatives. If voters want choice and a government more responsive to their needs, they should consider how these seemingly arcane election procedures have real consequences for motivating them to perform the most fundamental democratic act: voting.

Publication: washingtonpost.com

The Election of the Century

The impending presidential election may be the election of a century. Record primary voting, floods of new registrations, more small campaign donors and highly rated political conventions show that people are intensely interested.

These indicators augur a high turnout. Undoubtedly, more people will vote than the 60 percent who turned out four years ago, which was the highest rate since 1968. The question is, how many more? If participation tops the 1960 level of 64 percent, then we must go all the way back to 1908 — literally a century of American politics — to find the next highest rate: 66 percent.

Lessons from the 1960 and 1908 elections explain why 2008 may be a historic election. Many people recall the 1960 election, which pitted two familiar names, Richard Nixon and John F. Kennedy, against each other. Kennedy won one of the closest presidential elections in American history. As in sports, people are interested when two contestants are evenly matched. Just as in 1960, pre-election polls today show a tight race between Barack Obama and John McCain. People perceive that their vote will help determine big issues of peace and prosperity. Further, an African-American or a woman will be elected, for the first time, to one of the country's highest offices. Contrast this with 1996: people tuned out when pre-election polls showed President Bill Clinton cruising to reelection over Bob Dole.

The 1908 election was not particularly close and did not involve big issues. Republican William Howard Taft won by a landslide over third-time Democratic candidate William Jennings Bryan, whose “free silver” platform had lost its luster. What is notable is that the 1908 election occurred in the twilight of the political machines that dominated American politics throughout the latter half of the 19th century. These machines were built from the bottom up. Local ward bosses, who knew their neighbors intimately, dispensed jobs and favors for votes. (Ward bosses conjure images of big city politics, but rural political machines existed, too.) Political machines even paid supporters’ taxes in states that disenfranchised tax delinquents.

During the machine era, turnout rates routinely exceeded 80 percent. Paying people to vote, however, discomfited many. Progressive Era reforms near the turn of the 20th century rooted out the obvious corruption by creating a civil service to replace patronage jobs and adopting the secret ballot so that political machines could not monitor voting. The 1908 election was among the last where machines could still turn out voters.

There is mounting evidence that political machines had something right: Face-to-face contact is among the most effective means to activate voters. Today’s high-tech campaigns recreate the mobilization capacity of political machines. In place of ward bosses are local volunteers, and in place of bosses’ neighborly knowledge are sophisticated microtargeted voter profiles that reveal which voters are persuadable and which are loyal party supporters. The glue is the Internet, which provides an information infrastructure for campaigns to recruit and communicate with their volunteers.

It is tempting to give Democrats a mobilization edge. Obama’s efforts are highly visible, whereas McCain must rely on the tightlipped Republican National Committee. Obama does not employ the Democratic National Committee for this expensive campaign operation because he opted out of public financing. Indeed, recent presidential candidates — McCain included — usually raise money for voter mobilization through their national parties.

Before Obama is given an edge, we must caution that Republicans are better able to register themselves than are lower-income Democrats. Massive Democratic registration drives create a false impression that Democrats are out-hustling Republicans. In 2004, Democratic-aligned organizations' highly publicized efforts exceeded their voter turnout targets. But these groups underestimated President Bush's 72-hour voter mobilization effort the weekend before the election, which effectively matched them voter for voter.

Still, Obama’s organization should not be discounted. Just four years ago, Democrats were still playing catch-up to Republicans. Now they are just as sophisticated and have recruited a large cadre of volunteers, including typically apathetic youth.

American campaigns have undergone a paradigm shift. They no longer consist primarily of mass appeals through television advertising; grass-roots organizing is now a critical component. If elections stay close and interesting, we will likely observe higher turnouts. No longer will we wonder why turnout is declining; rather, we will wonder why it is climbing. A revitalized ground game will likely emerge as one explanation in the decade to come.

Publication: Politico

The Revenge of the Moderates in U.S. Politics


Alaska Republican Sen. Lisa Murkowski’s write-in candidacy for reelection makes her the latest to join a growing number of prominent politicians who have shed political affiliations in the hopes of winning public office.

Florida Gov. Charlie Crist is running as an independent for the Senate, former Sen. Lincoln Chafee is running as an independent for Rhode Island governor, Mayor Michael Bloomberg became an independent to run New York City, and, of course, Sen. Joe Lieberman lost the 2006 Democratic Senate primary — but won in the general as an independent.

The trend of moderate independent candidates who have forsworn party affiliations is not new to U.S. politics. Since the Civil War, when the modern Republican Party was established to compete against the Democratic Party, minor party or unaffiliated candidates have won election to the House or Senate a total of 697 times. Of these, 89 percent had voting records ideologically between the two major parties.

Despite the recent polarization of U.S. politics, history tells us that moderates make winners. Consider the Wisconsin Progressive Party. Its development has a familiar ring to today’s politics. Extremist elements flourished in the Republican Party during the Great Depression, growing out of our nation’s economic anxieties. GOP moderates responded by creating this Wisconsin group, focused on issues of reform and pragmatic governance.

It started when Wisconsin Gov. Philip La Follette ran for reelection in 1932 as the GOP nominee. He was heckled throughout his speeches by Republican ‘Stalwarts’ on his political right. They “had their Phil” and were angered by his policies of perceived higher taxes to support government spending. La Follette lost the Republican primary to Stalwart-backed Walter Kohler amid then-record turnout. Kohler lost to the Democrat in the general election.

La Follette is a famous political name. Gov. Philip La Follette and Sen. Robert La Follette Jr. were sons of the leading GOP politician, Sen. Robert La Follette Sr. Republican progressives had supported him for the party’s presidential nomination in 1912 and 1916. He eventually ran for president in 1924 — on his own Independent Progressive Party ticket. But while the father’s exploits are well-known, his sons’ reactions to Wisconsin’s political climate are more relevant to today’s politics.

Frustrated by the GOP extremists, the La Follette brothers created the Wisconsin Progressive Party and ran as its candidates when they were successfully elected governor and senator in 1934. Today's independent candidates share a similar frustration with the ideological purists on their right and left. The extremists in the Democratic and Republican primary electorates are rejecting centrist candidates who might be better positioned to win general elections.

Consider the words of Crist when he declared his independent candidacy. “If you want somebody on the right or you want somebody on the left,” Crist said, “you have the former speaker, Rubio, or the congressman, Meek. If you want somebody who has common sense, who puts the will of the people first, who wants to fight for the people first, now you've got Charlie Crist. You have a choice.”

With all the attention paid to the successes of Tea Party activists during the GOP primaries, it is easy to forget that these are not like general elections. Primary voters tend to be more ideologically extreme. So these Republican primary voters may end up denying the party several general election victories.

For example, many political observers agree that Rep. Mike Castle (R-Del.), a moderate, would have been a stronger candidate for Senate than the GOP primary victor, Christine O’Donnell, his tea party-backed opponent. General elections have traditionally been won in the center -- where most voters still reside.

Minor party successes usually arise when the two major political parties become ideologically polarized. Moderates can usually find a seat under a big tent, but when party activists are unable to tolerate dissent, moderates are shut out and left to their own devices. So it isn’t surprising that strong candidates holding moderate positions realize they are electorally viable by abandoning their party and appealing to the center in general elections.

History tells us that conditions now are favorable for moderates like Chafee, Crist, Lieberman, and Murkowski. They step into a political vacuum at the center that the major parties created by moving to the political extremes. With room left for further polarization, this may be just the beginning of the rise of moderate independent candidates.

History also tells us the political party that first figures out how to recapture the middle -- and bring these candidates and their supporters into the fold -- is the one most likely to emerge as dominant.

Publication: POLITICO
Image Source: © Jessica Rinaldi / Reuters

@ Brookings Podcast: The Politics and Process of Congressional Redistricting

Now that the 2010 Census is concluded, states will begin the process of redistricting—redrawing voting district lines to account for population shifts. Nonresident Senior Fellow Michael McDonald says redistricting has been fraught with controversy and corruption since the nation’s early days, when the first “gerrymandered” district was drawn. Two states—Arizona and California—have instituted redistricting commissions intended to insulate the process from political shenanigans, but politicians everywhere will continue to work the system to gain electoral advantage and the best chance of re-election for themselves and their parties.


The Structure of the TANF Block Grant

The 1996 welfare reform legislation replaced the Aid to Families with Dependent Children (AFDC) program with a new Temporary Assistance for Needy Families (TANF) block grant that is very different from its predecessor. In the old AFDC program, funds were used almost entirely to provide and administer cash assistance to low-income—usually single-parent—families. The federal government…

       





Target Compliance: The Final Frontier of Policy Implementation

Abstract: Surprisingly little theoretical attention has been devoted to the final step of the public policy implementation chain: understanding why the targets of public policies do or do not “comply” — that is, behave in ways that are consistent with the objectives of the policy. This paper focuses on why program “targets” frequently fail to…

       





The Collapse of Canada?

America's northern neighbor faces a severe constitutional crisis. Unprecedented levels of public support for sovereignty in the predominantly French-speaking province of Quebec could lead to the breakup of Canada. This crisis was precipitated by two Canadian provinces' failure in 1990 to ratify the Meech Lake Accord, a package of revisions to Canada's constitution that addressed…

       





The Study of the Distributional Outcomes of Innovation: A Book Review


Editor's Note: This post is an extended version of a previous post.

Cozzens, Susan and Dhanaraj Thakur (Eds). 2014. Innovation and Inequality: Emerging technologies in an unequal world. Northampton, Massachusetts: Edward Elgar.

Historically, the debate on innovation has focused on the determinants of the pace of innovation—on the premise that innovation is the driver of long-term economic growth. Analysts and policymakers have taken less interest in how innovation-based growth affects income distribution. Even less attention has been paid to how innovation affects other forms of inequality, such as inequalities in economic opportunity, social mobility, access to education, healthcare, and legal representation, or inequalities in exposure to insalubrious environments, be these physical (through exposure to polluted air, water, or food, or to harmful work conditions) or social (neighborhoods ridden with violence and crime). The relation between innovation, equal political representation, and the right of people to have a say in the collective decisions that affect their lives can also be added to the list of neglected topics.

But neglect has not been universal. A small but growing group of analysts has been working for at least three decades to produce a more careful picture of the relationship between innovation and the economy. A distinguished vanguard of this group has recently published a collection of case studies that illuminates our understanding of innovation and inequality—which is, indeed, the title of the book. The book is edited by Susan Cozzens and Dhanaraj Thakur. Cozzens is a professor in the School of Public Policy and Vice Provost of Academic Affairs at Georgia Tech. She had studied innovation and inequality long before inequality was a hot topic, and she led the group that collaborated on this book. Thakur is a faculty member of the College of Public Service and Urban Affairs at Tennessee State University (while writing the book he taught at the University of the West Indies in Jamaica). He is an original and sensible voice in the study of the social dimensions of communication technologies.

We’d like to highlight here three aspects of the book: the research design, the empirical focus, and the conceptual framework developed from the case studies.

Edited volumes are all too often a collection of disparate papers, but not in this case. This book is patently the product of a research design that probes the evolution of a set of technologies across a wide variety of national settings and, at the same time, examines the different reactions to new technologies within specific countries. The second part of the book devotes five chapters to five emerging technologies—recombinant insulin, genetically modified corn, mobile phones, open-source software, and tissue culture—observing the contrasts and similarities of their evolution in different national environments. In turn, part three considers the experience of eight countries, four of high income—Canada, Germany, Malta, and the U.S.—and four of middle or low income—Argentina, Costa Rica, Jamaica, and Mozambique. The stories in part three tell how these countries assimilated these diverse technologies into their economies and policy environments.

The second aspect to highlight is the deliberate choice of empirical focus. First, the object of inquiry is not all of technology but a discrete set of emerging technologies, a choice that gains a specificity that would otherwise be lost in the unwieldy concept of “technology” broadly construed. At the same time, this choice reveals the policy orientation of the book, because these new entrants have just started to shape the socio-technical spaces they inhabit, while the spaces of older technologies have likely ossified. Second, the study offers ample variance in the jurisdictions under study, i.e., countries of all income levels; a decision that makes theory construction more difficult but the test of general premises more robust.[i] We can add that the book avoids sweeping generalizations. Third, the authors focus on technological projects and their champions, a choice that increases the rigor of the empirical analysis. This choice naturally narrows the space of generality, but the lessons are more precise and the conjectures are presented with corresponding modesty. The combination of a solid design and clear empirical focus allows the reader to obtain a sense of general insight from the cases taken together that could not be derived from any individual case standing alone.

Economic and technology historians have tackled the effects of technological advancement, from the steam engine to the Internet, but those lessons are not easily applicable to the present, because emerging technologies hint at a different kind of reconfiguration of economic and social structures. It is still too early to know the long-term effects of new technologies like genetically modified crops or mobile phone cash transfers, but this book does a good job of providing useful concepts that begin to form an analytical framework. In addition, the mix of country case studies subverts the disciplinary separation between the economics of innovation (devoted mostly to high-income countries) and development studies (interested in middle- and low-income economies). As a consequence of these selections, the reader can draw lessons that are likely to apply to technologies and countries other than the ones discussed in this book.

The third aspect we would like to underscore in this review is the conceptual framework. Cozzens, Thakur and their colleagues have done a service to anyone interested in pursuing the empirical and theoretical analysis of innovation and inequality.

For these authors, income distribution is only one part of the puzzle. They observe that inequalities are also part of social, ethnic, and gender cleavages in society. Frances Stewart, of Oxford University, introduced the notion of horizontal inequalities, or inequalities at the social group level (for instance, across ethnic groups or genders). She developed the concept in contrast to vertical inequalities, or inequalities operating at the individual level (such as household income or wealth). The authors of this book borrow Stewart’s concept, pay attention to horizontal inequalities in the technologies they examine, and observe that new technologies enter marketplaces that are already configured by historical forms of exclusion. A dramatic example is the lack of access to recombinant insulin in the U.S., because it is expensive and minorities are less likely to have health insurance (see Table 3.1 on p. 80).[ii] Another example is how innovation opens opportunities for entrepreneurs but closes them for women in cultures that systematically exclude women from entrepreneurial activities.

Another key concept is that of complementary assets. A poignant example is the failure of recombinant insulin to reach poor patients in Mozambique, who are sent home with old medicine even though insulin is subsidized by the government. The reason doctors deny the poor the new treatment is that these patients lack the literacy and household resources (e.g., a refrigerator, a clock) necessary to preserve the shots, inject themselves periodically, and read blood sugar levels. Technologies aimed at fighting poverty require complementary assets to be already in place; in their absence, they fail to mitigate suffering, let alone ameliorate inequality. Another illustration of the importance of complementary assets is the case of open-source software. This technology has a nominal price of zero; however, only individuals who have computers and the time, disposition, and resources to learn open-source operating systems benefit. Likewise, companies without the internal resources to adapt open-source software will not adopt it and will remain economically tied to proprietary software.

These observations lead to two critical concepts elaborated in the book: distributional boundaries and inequalities across technological transitions. Distributional boundaries refer to the reach of the benefits of new technologies, boundaries that could be geographic (as in urban/suburban or center/periphery) or drawn across social cleavages or income levels. Standard models of technological diffusion assume the entire population will gradually adopt a new technology, but in reality, the authors observe, several factors intervene to limit the scope of diffusion to certain groups. The most insidious factors are monopolies that exercise sufficient control over markets to levy high prices. In these markets, the price becomes an exclusionary barrier to diffusion. This is quite evident in the case of mobile phones (see Table 5.1, p. 128), where monopolies (or oligopolies) have the market power to create and maintain a distributional boundary between post-pay, high-quality service for middle- and high-income clients and pre-pay, low-quality service for poor customers. This boundary renders pre-pay plans doubly regressive, because the per-minute rates are higher than post-pay rates and phone expenses represent a far larger percentage of poor people’s income. Another example of exclusion occurs with GMOs, because in some countries subsistence farmers cannot afford the prices of engineered seeds; a disadvantage that compounds their cost and health problems, as they have to use more and stronger pesticides.

A technological transition, as used here, is an inflection point in the adoption of a technology that reshapes its distributional boundaries. When smart phones were introduced, a new market for second-hand or hand-me-down phones was created in Maputo; people who could not access the top technology got stuck with a sub-par system. Looking at tissue culture, the authors find that “whether it provides benefits to small farmers as well as large ones depends crucially on public interventions in the lower-income countries in our study” (p. 190). In fact, farmers in Costa Rica enjoy much better protections compared to those in Jamaica and Mozambique, because the governmental program created to support banana tissue culture was designed and implemented as an extension program aimed at disseminating know-how among small farmers, not exclusively among large multinational-owned farms. When the same technology was introduced under this different policy environment, the distributional boundaries were made much more extensive in Costa Rica.

This is a book devoted to presenting the complexity of the innovation-inequality link. The authors are generous in their descriptions, punctilious in the analysis of their case studies, and cautious and measured in their conclusions. Readers who seek an overarching theory of inequality, a simple story, or a test of causality are bound to be disappointed. But those readers may find the highest reward in carefully reading all the case studies presented in this book, not only for the edifying richness of their detail but also because they will be invited to rethink the proper way to understand and address the problem of inequality.[iii]
 


[i] These are clearly spelled out: “we assumed that technologies, societies, and inequalities co-evolved; that technological projects are always inherently distributional; and that the distributional aspects of individual projects and portfolios of projects are open to choice.” (p. 6)

[ii] This problem has been somewhat mitigated since the Affordable Care Act entered into effect.

[iii] Kevin Risser contributed to this posting.

 

Image Source: © Akhtar Soomro / Reuters

The politics of federal R&D: A punctuated equilibrium analysis


The fiscal budget has become a casualty of political polarization, and even functions that had enjoyed bipartisan support, like research and development (R&D), are becoming divisive issues on Capitol Hill. As a result, federal R&D is likely to grow only at the pace of inflation or, worse, decline.

With the size of the pie fixed or shrinking, requests for R&D funding increases will trigger an inter-agency zero-sum game that will play out as pointless comparisons of agencies’ merit or, worse, as a contest to attract the favor of Congress or the White House. This insidious politics will be made even more so by the growing tendency to equate public accountability with the measurement of performance. Political polarization, tight budgets, and pressure for quantifiable results threaten to undermine the sustainability of public R&D. The situation raises the question: what can federal agencies do to deal with the changing politics of federal R&D?

In a new paper, Walter D. Valdivia and Benjamin Y. Clark apply punctuated equilibrium theory to examine the last four decades of federal R&D, both at the aggregate and the agency level. Valdivia and Clark observe a general upward trend driven by gradual increases. In turn, budget leaps, or punctuations, are few and far between and do not appear to have lasting effects. As the politics of R&D are stirred up, federal departments and agencies are sure to find that proposing punctuations is becoming more costly and risky. Consequently, agencies would be well advised to secure stable growth in their R&D budgets over the long run rather than push for short-term budget leaps.

While appropriations history suggests that the stability of R&D spending resulted from the character of budget politics, in the future stability will need the stewardship of R&D champions who work to institutionalize gradualism, this time in spite of the politics.


State of the Union’s challenge: How to make tech innovation work for us?


Tuesday night, President Obama presented four critical questions about the future of America, and I would like to comment on the first two:

  1. How to produce equal opportunity, emphasizing economic security for all.
  2. In his words, “how do we make technology work for us, and not against us,” particularly to meet the “urgent challenges” of our days.

The challenges the president wishes to meet by means of technological development are climate change and cancer. Let’s consider cancer first. There are plenty of reasons to be skeptical: this is not the first presidential war on cancer. President Nixon tried that once and, alas, cancer still has the upper hand. It is ironic that Mr. Obama chose this particular “moonshot,” because not only are the technical aspects of cancer more uncertain than those of space travel, but political support for the project is vastly different, and we cannot be sure that even another Democrat in the White House would see this project to fruition. In effect, neither Mr. Obama nor his appointed “mission control,” Vice President Biden, has time in office to see fruits from their efforts on this front.

The second challenge the president wishes to address with technology is problematic beyond technical and economic feasibility (producing renewable energy at competitive prices): curbing carbon emissions has become politically intractable. The president correctly suggested that being a leader in the renewable energy markets of the future makes perfect business sense, even for global warming skeptics. Nevertheless, markets have a political economy, and current energy giants have a material interest in not allowing any changes to the rules that so favor them (including significant federal subsidies). Only when the costs of exploration, extraction, and distribution of fossil fuels rise above those of renewable sources can we expect policy changes enabling an energy transition to become feasible. And when renewables are competitive on a large scale, it is not very likely that their production will be controlled by new industrial players. Such is the political economy of free markets. What’s more, progressives should be wary of standard solutions that would raise the cost of energy (such as a tax on carbon emissions), because low-income families are quite sensitive to energy prices; the cost of electricity, gas, and transportation is a far larger proportion of their income than of their wealthier neighbors’.

It’s odd that the president proposes technological solutions to challenges that call for a political solution. Again, in saying this, I’m allowing the assumption that the technical side is manageable, which is not necessarily a sound assumption to make. The technical and economic complexity of these problems should only compound the political hurdles. While I’m skeptical that technological fixes can curb carbon emissions or cure cancer, I am simply vexed by the president’s answer to the question on economic opportunity and security: expand the safety net. It is not that it wouldn’t work; it worked wonders creating prosperity and enlarging the middle class in the post-World War II period. The problem is that enacting welfare state policies promises to be a hard political battle that, even if won, could result in pyrrhic victories. Mr. Obama’s greatest achievement in expanding the safety net was, of course, the Affordable Care Act. But this policy success came at a very high cost: a majority of voters have questions about the legitimacy of that policy. Even its eponymous name, Obamacare, was coined as a term of derision. It is bizarre that opposition to this reform is often found among people who benefit from it. We can blame the systematic campaign against it in every electoral contest, the legal subterfuges brought up to dismantle it (which the ACA survived, severely bruised), and the AM radio vitriol, but even controlling for the dirty war on healthcare reform, passing such monumental legislation strictly along party lines has made it a lightning rod of distrust in government.

Progressives are free to try to increase economic opportunity following the welfare state textbook. They will meet the same opposition that Mr. Obama encountered. However, where progressives and conservatives could agree is on increasing opportunities for entrepreneurs, and nothing gives an edge to free enterprise more than innovation. Market competition is the selection mechanism by which an elite of enterprises rises from the legion created in any given year; this elite, equipped with a new productive platform, can arm-wrestle markets from the old guard of incumbents. This is not the only way innovation takes place: monopolies and cartels can produce innovation too, but with different outcomes. In competitive markets, innovation is the instrument of product differentiation; therefore, it improves quality and cuts consumer prices. In monopolistic markets, innovation also takes place, but generally as a monopolist’s effort to raise barriers to entry and secure high profits. Innovation can take place while preserving social protections for the employees of the new industries, or it can undermine the job security of its labor force (a concern with the sharing economy). These different modes of innovation are a function of the institutions that govern innovation, including industrial organization and labor and consumer protections.

What the president did not mention is that question two can answer question one: technological development can improve economic opportunity and security, and that is likely to be more politically feasible than addressing the challenges of climate change and cancer. Shaping the institutions that govern innovative activity to favor modes of innovation that benefit a broad base of society is an achievable goal, and it could indeed be a standard by which his and future administrations are measured. This is because these institutions are not the province of the welfare state. They are policy domains that have historically enjoyed bipartisan consensus (such as federal R&D funding and private R&D tax credits) or low contestation (support for small business, tech transfer, loan guarantees).

As Mr. Obama himself suggested, technology can indeed be made to work for us, all of us.

Image Source: © POOL New / Reuters

Why should I buy a new phone? Notes on the governance of innovation


A review essay of “Governance of Socio-technical Systems: Explaining Change”, edited by Susana Borrás and Jakob Edler (Edward Elgar, 2014, 207 pages).

Phasing-out a useful and profitable technology

I own a Nokia 2330; it’s a small brick phone that fits comfortably in the palm of my hand. People have feelings about this: mostly, they marvel at my ability to survive without a smartphone. Concerns go beyond my wellbeing; a friend once protested that I should be aware of the costs I impose on my friends, for instance, by asking them for precise directions to their houses. Another suggested that I cease trying to be smarter than my phone. But my reason is simple: I don’t need a smartphone. Most of the time, I don’t even need a mobile phone. I can take and place calls from my home or my office. And who really needs a phone during their commute? Still, my device will meet an untimely end. My service provider has informed me via text message that it will phase out all 2G service, and it explicitly encouraged me to acquire a 3G or newer model.

There is a correct if simplistic explanation for this announcement: my provider is not making enough money on my account, and should I switch to a newer device, it will be able to sell me a data plan. The more accurate and more complex explanation is that my mobile device is part of a communications system that is integrated with other economic and social systems. As those other systems evolve, my device is becoming incompatible with them; my carrier has determined that I should be integrated.

The systems integration is easy to understand from a business perspective. My carrier may very well be able to make a profit keeping my account as is, along with the accounts of the legion of elderly and low-income customers who use similar devices, and still not find it advantageous in the long run to allow 2G devices on its network. To understand this business strategy, we need to go back no further than the introduction of the iPhone, which, in addition to being the most marketable mobile phone, set a new standard platform for mobile devices. Its introduction accelerated a trend underway in the core business of carriers: the shift from voice communication to data streaming, because smartphones can support layers of overlapping services that depend on fast and reliable data transfer. These services include sophisticated log capabilities, web search, geo-location, connectivity to other devices, and, more recently, bio-monitoring. All those services are part of systems of their own, so it makes perfect business sense for carriers to seamlessly integrate mobile communications with all those other systems. Still, the economic rationale explains only a fraction of the systems integration underway.

The communication system of mobile telephony is also integrated with regulatory, social, and cultural systems. Consider the most mundane examples: it’s hard to imagine anyone who, having shifted from a paper-and-pencil agenda to an electronic one, decided to switch back afterwards. We are increasingly dependent on GPS services; while GPS may once have served tourists who did not wish to learn how to navigate a new city, it is now a necessity for many people who without it are lost in their own home town. Not needing to remember phone numbers, the time of our next appointment, or how to get back to that restaurant we really liked is a clear example of the integration of mobile devices into our value systems.

There are coordination efforts and mutual accommodation taking place: tech designers seek to adapt to changing values and we update our values to the new conveniences of slick gadgets. Government officials are engaged in the same mutual accommodation. They are asking how many phone booths must be left in public places, how to reach more people with public service announcements, and how to provide transit information in real-time when commuters need it. At the same time, tech designers are considering all existing regulations so their devices are compliant. Communication and regulatory systems are constantly being re-integrated.

The will behind systems integration

The integration of technical and social systems that results from innovation demands an enormous amount of planning, effort, and conflict resolution. The people involved in this process come from all quarters of the innovation ecology, including inventors, entrepreneurs, financiers, and government officials. Each of these agents may be unable to contemplate the totality of the systems integration problem, but they more or less understand how their respective system must evolve so as to be compatible with interrelated systems that are themselves evolving. There is a visible willfulness in the integration task that scholars of innovation call the governance of socio-technical systems.

In introducing the term governance, I should emphasize that I do not mean merely the actions of governments or the actions of entrepreneurs. Rather, I mean the effort of all agents involved in the integration and re-integration of systems triggered by innovation; I mean all the coordination and mutual accommodation among agents from interrelated systems. And there is no single vehicle that transports all the relevant information to these agents. A classic representation of markets suggests that prices carry all the relevant information agents need to make optimal decisions. But it is impossible to project this model onto innovation because, as I suggested above, innovation does not adhere exclusively to economic logic; cultural and political values are also at stake. The governance task is therefore fragmented into pieces and assigned to each of the participants in the socio-technical systems involved, and they cannot resolve it as a profit-maximization problem.

Instead, the participants must approach governance as a problem of design, where the goal could be characterized as reflexive adaptation. By adaptation I mean seeking to achieve inter-system compatibility. By reflexive I mean that each actor must realize that their actions trigger adaptation measures in other systems. Thus, they cannot adapt passively; rather, they must anticipate the sequence of accommodations in their interactions with other agents. This is one of the most important aspects of the governance problem, because all too often neither technical nor economic criteria will suffice; quite regularly, coordination must be negotiated, which is to say, innovation entails politics.

The idea of the governance of socio-technical systems is daunting. How do we even begin to understand it? What modes of governance exist? What are the key dimensions for understanding the integration of socio-technical systems? And, perhaps most pressing, who prevails in disputes about coordination and accommodation? Fortunately, Susana Borrás, of the Copenhagen Business School, and Jakob Edler, of the University of Manchester, both distinguished professors of innovation, have collected a set of case studies that shed light on these problems in an edited volume entitled Governance of Socio-technical Systems: Explaining Change. What is more, they offer a very useful conceptual framework of governance that is worth reviewing here. While this volume will be of great interest to scholars of innovation—and it is written in scholarly language—I think it has great value for policymakers, entrepreneurs, and all agents involved in a practical manner in the work of innovation.

Organizing our thinking on the governance of change

The first question that Borrás and Edler tackle is how to characterize different modes of governance. They start out with a heuristic typology built on two central dimensions: what kinds of agents drive innovation, and how the actions of these agents are coordinated. Agents can represent the state or civil society, and actions can be coordinated via dominant or non-dominant hierarchies.

Coordination by dominant hierarchies:

- Change led by state actors: traditional deference to technocratic competence; command and control.

- Change led by societal actors: monopolistic or oligopolistic industrial organization.

Coordination by non-dominant hierarchies:

- Change led by state actors: state agents as primus inter pares.

- Change led by societal actors: more competitive industries with little government oversight.

Source: Adapted from Borrás and Edler (2015), Table 1.2, p. 13.

This typology is very useful for understanding why different innovative industries have different dynamics: they are governed differently. For instance, we can readily understand why consumer software and pharmaceuticals are so at odds over patent law. The strict (and very necessary) regulation of drug production and commercialization, coupled with the oligopolistic structure of that industry, creates the need and opportunity to advocate for patent protection, which is equivalent to a government subsidy. In turn, the highly competitive environment of consumer software development and its low level of regulation foster an environment where patents hinder innovation. Government intervention is neither needed nor wanted; the industry wishes to regulate itself.

This typology is also useful for understanding why open source applications have gained currency much faster in the consumer segment than in the contractor segment of software producers. Examples of the latter are industry-specific software (e.g., software to operate machinery, the stock exchange, or ATMs) and software to support national security agencies. These contractors demand proprietary software and depend on the secrecy of the source code. The software industry is not monolithic, and while it is highly innovative in all its segments, the innovation taking place varies greatly by its mode of governance.

Furthermore, we can understand the inherent conflicts in the governance of science. In principle, scientists are led by curiosity and organize their work in a decentralized and organic fashion. In practice, most of science is driven by mission-oriented governmental agencies and is organized in a rigid hierarchical system. Consider the centrality of prestige in science and how it is awarded by peer review, a system controlled by the top brass of each discipline. There is a nearly irreconcilable contrast between the self-image of science and its actual governance. Using the Borrás-Edler typology, we could say that scientists imagine themselves in the cell of societal-led change coordinated by non-dominant hierarchies, while they really inhabit the cell of state-led change coordinated by dominant hierarchies.

There are practical lessons from applying this typology to current controversies. For instance, no single policy instrument, such as patents, can have the same effect on all innovation sectors, because the effect depends on the mode of governance of each sector. This corollary may sound intuitive, yet it is really at variance with the current terms of the debate on patent protection, where assertions about its effect on innovation, in either direction, are rarely qualified.
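As a toy illustration (my own construction, not from the book; the sector placements follow the examples discussed above and the mode labels paraphrase the typology), the two dimensions can be encoded as a simple lookup:

```python
# Toy sketch of the Borrás-Edler typology: a lookup from
# (who drives change, how actions are coordinated) to a governance mode,
# then a classification of example sectors discussed in the text.

MODES = {
    ("state", "dominant"): "command and control by technocratic competence",
    ("societal", "dominant"): "monopolistic or oligopolistic industrial organization",
    ("state", "non-dominant"): "state agents as primus inter pares",
    ("societal", "non-dominant"): "competitive industry with little government oversight",
}

# Illustrative sector placements, suggested by the discussion above.
SECTORS = {
    "pharmaceuticals": ("societal", "dominant"),
    "consumer software": ("societal", "non-dominant"),
    "mission-oriented science": ("state", "dominant"),
}

def governance_mode(sector: str) -> str:
    """Return the governance mode of a sector placed in the typology."""
    return MODES[SECTORS[sector]]
```

The point of the exercise is the one made above: the same instrument (say, a patent) lands in different cells of this lookup depending on the sector, so its effect on innovation cannot be uniform.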

The second question Borrás and Edler address is that of the key analytical dimensions for examining socio-technical change. To this end, they draw from an ample selection of social theories of change. First, economists and sociologists fruitfully debate the advantages of social inquiry focused on agency versus institutions. Here, the synthesis offered is reminiscent of Herbert Simon’s “bounded rationality”, where the focus turns to agents’ decisions as constrained by institutions. Second, policy scholars as well as sociologists emphasize the engineering of change. Change can be accomplished with discrete instruments such as laws and regulations, or diffuse instruments such as deliberation, political participation, and techniques of conflict resolution. Third, political scientists underscore the centrality of power in the adjudication of the disputes produced by systems’ change and integration. Borrás and Edler have condensed these perspectives into an analytical framework that boils down to three clean questions: who drives change (a focus on agents bounded by institutions)? how is change engineered (a focus on instrumentation)? and why is it accepted by society (a focus on legitimacy)? The case studies contained in this edited volume illustrate the deployment of this framework with empirical research.

Standards, sustainability, incremental innovation

Arthur Daemmrich (Chapter 3) tells the story of how the German chemical company BASF succeeded in marketing the biodegradable polymer Ecoflex. It is worth noting BASF’s dependence on government funding to develop Ecoflex, and on the German Institute for Standardization (DIN), which makes markets by setting standards. With this technology, BASF capitalized on the growing demand in Germany for biodegradables, and through its intense cooperation with DIN it helped establish a standard that differentiated Ecoflex from the competition. By focusing on the enterprise (the innovation agent) and its role in engineering the market for its product by setting standards that would favor it, this story reveals the process of legitimation of a new technology. In effect, DIN’s certification was accepted by agribusinesses that sought to utilize biodegradable products.

If BASF is an example of innovation by standards, Allison Loconto and Marc Barbier (Chapter 4) show the strategies of governing by standards. They take the case of the International Social and Environmental Accreditation and Labelling alliance (ISEAL). ISEAL, an advocate of sustainability, positions itself as a coordinating broker among standard-developing organizations by offering “credibility tools” such as codes of conduct, best practices, impact assessment methods, and assurance codes. The organization advocates what is known as the tripartite standards regime (TSR), a system of checks and balances intended to increase the credibility of producers’ compliance with standards. The TSR assigns standard-setting, certification, and accreditation of the certifiers to separate and independent bodies. The case illustrates how producers, their associations, and broker organizations work to bestow upon standards their most valuable attribute: credibility. The authors are careful not to conflate credibility with legitimacy, but there is no question that credibility is part of the process of legitimizing technical change. In examining the construction of credibility, these authors focus on the third question of the framework (legitimizing innovation), and from that vantage point they illuminate the role of the actors and instruments that will guide innovations in sustainability markets.

While standards are instruments of non-dominant hierarchies, the classical instrument of dominant hierarchies is regulation. David Barberá-Tomás and Jordi Molas-Gallart tell of the tragic consequences of an innovation in hip-replacement prostheses that went terribly wrong. It is estimated that about 30,000 replaced hips failed. The FDA, under the 1976 Medical Device Amendments, allows incremental improvements to medical devices to go to market after only laboratory trials, on the assumption that any substantive innovations have already been tested in regular clinical trials. This policy was designed as an incentive for innovation, a relief from high regulatory costs. However, the authors argue, when a product has been continually improved for a number of years after its original release, any marginal improvement comes at a higher cost or higher risk, a point they refer to as the late stage of the product life-cycle. This tilts the balance in favor of risky improvements, as the hip prosthesis case illustrates. The story speaks to the integration of technical and cultural systems: a policy that encourages incremental innovation may alter the way medical device companies assess the relative risk of their innovations, precisely because they focus on incremental improvements over radical ones. Returning to the analytical framework, the vantage point of regulation (instrumentation) elucidates the particular complexities and biases in agents’ decisions.

Two additional case studies discuss the discontinuation of the incandescent light bulb (ILB) and the emergence of translational research, both in Western Europe. The first, authored by Peter Stegmaier, Stefan Kuhlmann and Vincent R. Visser (Chapter 6), focuses on a relatively smooth transition. There was wide support for replacing ILBs, which translated into political will and a market willing to purchase new energy-efficient bulbs. In effect, the new technical system was relatively easy to re-integrate into a changing social system (public values in Europe had shifted to favor sustainable consumption), and the authors are thus able to emphasize how agents make sense of the transition. Socio-technical change does not have a unique meaning: for citizens it means living in congruence with their values; for policy makers it means accruing political capital; for entrepreneurs it means new business opportunities. The case by Etienne Vignola-Gagné, Peter Biegelbauer and Daniel Lehner (Chapter 7) offers a similar lesson about governance. My reading of their multi-site study of the implementation of translational research (a management movement that seeks to bridge laboratory and clinical work in medical research) is that it reveals how the different agents involved make sense of this organizational innovation. Entrepreneurs see a new market niche, researchers strive to increase the impact of their work, and public officials align their advocacy for translation with the now regular calls for making publicly funded research more productive. Both chapters illuminate a lesson that is as old as it is useful to remember: technological innovation is interpreted in as many ways as the number of agents participating in it.

Innovation for whom?

The framework and illustrations of this book are useful for those of us interested in the governance of systems integration. The typology of modes of governance and the three vantage points from which empirical analysis can be deployed are very useful indeed. Further development of this framework should include the question of how political power is redistributed by innovation and the systems integration and re-integration it triggers. The question is pressing because the outcomes of innovation vary as power structures are reinforced or weakened by the emergence of new technologies, not to mention ongoing destabilizing forces such as social movements. Put another way, the framework should be expanded to explain the circumstances in which innovation exacerbates inequality. The expanded framework should probe whether the mutual accommodation is asymmetric across socio-economic groups, which is the same as asking: are poor people asked to do more of the adapting to new technologies? These questions have great relevance in contemporary debates about economic and political inequality.

I believe that Borrás and Edler and their colleagues have done us a great service in organizing a broad but dispersed literature and offering an intuitive and comprehensive framework for studying the governance of innovation. The conceptual and empirical parts of the book are instructive, and I look forward to the papers that will follow, testing this framework. We need to better understand the governance of socio-technical change and the dynamics of systems integration. Without a unified framework of comparison, the ongoing efforts in various disciplines will not add up to a greater understanding of the big picture.

I also have a selfish reason to like this book: it helps me make sense of my carrier’s push to integrate my value system into its technical system. If I decide to adapt to a newer phone, I could readily do so because I have the time and other resources. But that may not be the case for many customers with 2G devices who have neither the resources nor the inclination to learn to use more complex devices. For that reason alone, I’d argue that this sort of innovation-led systems integration could be done more democratically. Still, I could meet my carrier’s decision with indifference: when the service is disconnected, I could simply try to get by without the darn toy.

Note: Thanks to Joseph Schuman for an engaging discussion of this book with me.

Image Source: © Dominic Ebenbichler / Reuters

The fair compensation problem of geoengineering


The promise of geoengineering is to place average global temperature under human control; it is thus considered a powerful instrument for the international community to deal with global warming. While great energy has been devoted to learning more about the natural systems it would affect, questions of a political nature have received far less consideration. Taking as a given that regional effects will be asymmetric, the nations of the world will consent to deploying this technology only if they can be assured of a fair compensation mechanism, something like an insurance policy. The question of compensation reveals that the politics of geoengineering are far more difficult than the technical aspects.

What is Geoengineering?

In June 1991, Mount Pinatubo erupted, throwing a massive amount of volcanic sulfate aerosols into the high skies. The resulting cloud dispersed over weeks throughout the planet and cooled its average temperature by about 0.5° Celsius over the next two years. If this kind of natural phenomenon could be replicated and controlled, engineering the Earth’s climate would be within reach.

Spraying aerosols in the stratosphere is one method of solar radiation management (SRM), a class of climate engineering that focuses on increasing the albedo, i.e. the reflectivity, of the planet’s atmosphere. Other SRM methods include brightening clouds by increasing their content of sea salt. A second class of geoengineering efforts focuses on removing carbon from the atmosphere and includes carbon sequestration (burying it deep underground) and increasing land or marine vegetation. Of all these methods, SRM is appealing for its effectiveness and low cost; a recent study put the cost at about $5 to $8 billion per year.1

Not only is SRM relatively inexpensive, but we already have the technological pieces that, assembled properly, would inject the skies with particles that reflect sunlight back into space. For instance, a fleet of modified Boeing 747s could deliver the necessary payload. Advocates of geoengineering are not too concerned about developing the technology to effect SRM, but about its likely consequences, not only for slowing global warming but also for regional weather. And there lies the difficult question for geoengineering: the effects of SRM are likely to be unequally distributed across nations.

Here is one example of these asymmetries: Julia Pongratz and colleagues at the Department of Global Ecology of the Carnegie Institution for Science estimated a net increase in yields of wheat, corn, and rice under SRM-modified weather. However, the study also found a redistributive effect, with equatorial countries experiencing lower yields.2 We can therefore expect that equatorial countries will demand fair compensation before signing on to the deployment of SRM, which leads to two problems: how to calculate compensation, and how to agree on a compensation mechanism.

The calculus of compensation

What should be the basis for fair compensation? One view of fairness could be that, every year, all economic gains derived from SRM are pooled together and distributed among the regions or countries that experience economic losses, in proportion to those losses.

If the system pools the gains from SRM and distributes them in proportion to losses, questions about the balance will only be asked in years in which gains and losses are about the same. But if losses are far greater than gains, then this would be a form of insurance that cannot underwrite some of the incidents it intends to cover. People will not buy such an insurance policy, which is to say, some countries will not authorize SRM deployment. Conversely, if the pool has a large balance left after paying out compensations, the winners from SRM will demand lower compensation taxes.
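The pooling scheme just described can be made concrete with a toy sketch (all country names and numbers are invented for illustration): each period, SRM-attributed gains are pooled and paid out to the losers in proportion to their losses.

```python
# Toy sketch of the proportional compensation pool described above.
# Positive values are SRM-attributed gains; negative values are losses.

def settle(outcomes):
    """outcomes: dict of country -> gain (+) or loss (-) for one period.
    Returns each country's payout from the pool of gains."""
    pool = sum(v for v in outcomes.values() if v > 0)
    total_loss = -sum(v for v in outcomes.values() if v < 0)
    if total_loss == 0:
        return {c: 0.0 for c in outcomes}
    # Each loser receives a share of the pool proportional to its share
    # of total losses; the pool may not cover the losses in full.
    return {c: (pool * (-v / total_loss) if v < 0 else 0.0)
            for c, v in outcomes.items()}

# A year in which losses (8) exceed gains (4): the "insurance" covers
# only half of each country's damage.
payouts = settle({"A": 4.0, "B": -6.0, "C": -2.0})
```

In this invented year, country B loses 6 but receives only 3, and C loses 2 but receives only 1, which is exactly the shortfall problem described above: an insurance policy that cannot underwrite the incidents it covers.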

Further complicating the problem is the question of how to separate the gains or losses attributable to SRM from ordinary regional weather fluctuations. Separating the SRM effect could easily become an intractable problem because regional weather patterns are themselves affected by SRM. For instance, in any year in which El Niño is particularly strong, the uncertainty about the net effect of SRM will increase dramatically, because SRM could affect the severity of the oceanic oscillation itself. Science can reduce uncertainty, but only to a degree, because the better we understand nature, the more we understand the contingency of natural systems. We can expect better explanations of natural phenomena from science, but it would be unfair to ask science to reduce that understanding to a hard figure we can plug into our compensation equation.
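A toy Monte Carlo (all magnitudes invented) illustrates why attribution is so hard: when natural year-to-year variability is a few times larger than the true SRM effect, short observational records yield wildly different estimates of that effect, and some estimates even get its sign wrong.

```python
# Toy attribution sketch: a region's observed yield change each year is
# an assumed true SRM effect plus natural variability of larger magnitude.
import random

random.seed(0)
TRUE_SRM_EFFECT = -1.0   # invented: yield change due to SRM, in percent
NATURAL_SD = 3.0         # invented: natural year-to-year variability

def estimated_effect(years):
    """Average observed change over a short record of `years` seasons."""
    obs = [TRUE_SRM_EFFECT + random.gauss(0, NATURAL_SD)
           for _ in range(years)]
    return sum(obs) / len(obs)

# 1,000 hypothetical five-year records give widely divergent estimates.
estimates = [estimated_effect(5) for _ in range(1000)]
spread = max(estimates) - min(estimates)
```

Under these invented numbers, the disagreement among five-year estimates dwarfs the true effect itself, which is the epistemic core of the compensation problem: any hard figure plugged into the equation is contestable.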

Still greater complexity arises when we try to separate SRM effects from policy effects at the local and regional level. Some countries will surely organize better than others to manage this change, and preparation will be a factor in determining the magnitude of gains or losses. Inherent in the problem of estimating gains and losses from SRM is the inescapably subjective element of assessing preparation.

The politics of compensation

Advocates of geoengineering tell us that their advocacy is not about deploying SRM; rather, it is about better understanding the scientific facts before we even consider deployment. It is tempting to believe that the accumulating science on SRM effects would be helpful. But when we consider the factors described above, it is quite possible that more science will also crystallize the uncertainty about exact amounts of compensation. The calculus of gain or loss, that is, the difference between what regions and countries actually experience and a counterfactual, requires certainty, but science yields only irreducible uncertainty about nature.

The epistemic problems of estimating compensation will only be compounded by political contestation of the numbers. Even within the scientific community, different climate models will yield different results, and since economic compensation would be derived from those models’ output, we can expect serious contestation of the objectivity of the science of SRM impact estimation. Who should formulate the equation? Who should feed the numbers into it? A sure way to alienate scientists from the peoples of the world is to ask them to assert their cognitive authority over this calculus.

What’s more, the parts of the compensation equation related to regional efforts to cope with SRM effects are inherently subjective. We should not forget the politics of claiming compensation commensurate with preparation effort; countries that experience low losses may also want compensation for their efforts in preparing for and coping with natural disasters.

Not only would a compensation equation be a sham, it would be unmanageable; its legitimacy would always be in question. The calculus of compensation may seem a way to circumvent the impasses of politics and to define fairness mathematically. Ironically, it is shot through with subjectivity; it is truly a political exercise.

Can we do without compensation?

Technological innovations are similar to legislative acts, observed Langdon Winner.3 Technical choices made at the earliest stages of design quickly “become strongly fixed in material equipment, economic investment, and social habit, [and] the original flexibility vanishes for all practical purposes once the initial commitments are made.” For that reason, he insisted, “the same careful attention one would give to the rules, roles, and relationships of politics must also be given to such things as the building of highways, the creation of television networks, and the tailoring of seemingly insignificant features on new machines.”

If technological change can be thought of as legislative change, we must consider how so momentous a technology as SRM can be deployed in a manner consonant with our democratic values. Engineering the planet’s weather is nothing short of passing an amendment to Planet Earth’s constitution. One pesky clause in that constitutional amendment is a fair compensation scheme. It seems a small clause in comparison with the extent of the intervention, the governance of deployment and its consequences, and the international commitments to be made as a condition for deployment (such as emissions mitigation and adaptation to climate change). But even in the short consideration afforded here, we glimpse the intractable political problem of setting up a compensation scheme. And unless such a clause is approved by a majority of nations, SRM deployment has little hope of being consonant with democratic aspirations.


1 McClellan, Justin, David W. Keith, and Jay Apt. 2012. “Cost analysis of stratospheric albedo modification delivery systems.” Environmental Research Letters 7(3): 1-8.

2 Pongratz, Julia, D. B. Lobell, L. Cao, and K. Caldeira. 2012. “Crop yields in a geoengineered climate.” Nature Climate Change 2: 101-105.

3 Winner, Langdon. 1980. “Do artifacts have politics?” Daedalus 109(1): 121-136.

Image Source: © Antara Photo Agency / Reuters

Alternative perspectives on the Internet of Things


Editor's Note: TechTakes is a new series that collects the diverse perspectives of scholars around the Brookings Institution on technology policy issues. This first post in the series features contributions from Scott Andes, Susan Hennessey, Adie Tomer, Walter Valdivia, Darrell M. West, and Niam Yaraghi on the Internet of Things.

In the coming years, the number of devices around the world connected to the Internet of Things (IoT) will grow rapidly. Sensors located in buildings, vehicles, appliances, and clothing will create enormous quantities of data for consumers, corporations, and governments to analyze. Maximizing the benefits of IoT will require thoughtful policies. Given that IoT policy cuts across many disciplines and levels of government, who should coordinate the development of new IoT platforms? How will we secure billions of connected devices from cyberattacks? Who will have access to the data created by these devices? Below, Brookings scholars contribute their individual perspectives on the policy challenges and opportunities associated with the Internet of Things.

The Internet of Things will be everywhere

Darrell M. West is vice president and director of Governance Studies and founding director of the Center for Technology Innovation.

Humans are lovable creatures, but prone to inefficiency, ineffectiveness, and distraction. They like to do other things while driving, such as listening to music, talking on the phone, texting, or checking email. Judging from the frequency of accidents, though, many individuals believe they are more effective at multitasking than is actually the case.

The reality of these all too human traits is encouraging a movement from communication between computers to communication between machines. Driverless cars soon will appear on the highways in large numbers, and not just as a demonstration project. Remote monitoring devices will transmit vital signs to health providers, who then can let people know if their blood pressure has spiked or heart rhythm has shifted in a dangerous direction. Sensors in appliances will let individuals know when they are running low on milk, bread, or cereal. Thermostats will adjust their energy settings to the times when people actually are in the house, thereby saving substantial amounts of money while also protecting natural resources.

With the coming rise of 5G networks, the Internet of Things will unleash high-speed devices and a fully connected society. Advanced digital devices will enable a wide range of new applications, from energy and transportation to home security and healthcare. They will help humans manage the annoyances of daily life, such as traffic jams, hard-to-find parking, or keeping track of physical fitness. The widespread adoption of smart appliances, smart energy grids, resource management tools, and health sensors will improve how people connect with one another and their electronic devices. But these technologies will also raise serious security, privacy, and policy issues.

Implications for surveillance

Susan Hennessey is a Fellow in National Security in Governance Studies at the Brookings Institution. She is the Managing Editor of the Lawfare blog, which is devoted to sober and serious discussion of “Hard National Security Choices.”

As the debate over encryption and diminished law enforcement access to communications enters the public arena, some posit the growing Internet of Things as a solution to “Going Dark.” A recently released Harvard Berkman Center report, “Don’t Panic,” concludes in part that losses of communication content will be offset by the growth of IoT and networked sensors. It argues IoT provides “prime mechanisms for surveillance: alternative vectors for information-gathering that could more than fill many of the gaps left behind by sources that have gone dark – so much so that they raise troubling questions about how exposed to eavesdropping the general public is poised to become.”

Director of National Intelligence James Clapper agrees that IoT has some surveillance potential. He recently testified before Congress that “[i]n the future, intelligence services might use the IoT for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials.”

But intelligence gathering in the Internet age is fundamentally about finding needles in haystacks – and IoT is poised to add significantly more hay than needles. Law enforcement and the intelligence community will have to develop new methods to isolate and process information at this scale. And Congress and the courts will have to decide how laws should govern this type of access.

For now, the unanswered question remains: How many refrigerators does it take to catch a terrorist?

IoT governance

Scott Andes is a senior policy analyst and associate fellow at the Anne T. and Robert M. Bass Initiative on Innovation and Placemaking, a part of the Centennial Scholar Initiative at the Brookings Institution.

As with many new technology platforms, the Internet of Things is often approached as a revolutionary rather than an evolutionary technology. The refrain is that some scientific Rubicon has been crossed and that the impact of IoT will arrive soon regardless of public policy. On this view, the role of policymakers is simply to ensure the new technology is leveraged within public infrastructure and doesn’t adversely affect national security or aggravate inequality. While these goals are clearly important, they all assume the technological advances of IoT sit firmly within the realm of the private sector and do not justify policy intervention. However, as with almost all new technologies that catch the public’s eye—robotics, clean energy, autonomous cars, and so on—hyperbolic news reporting overstates their market readiness, further lowering the perceived need for policy support.

The problem with this perspective is twofold. First, greater scientific breakthroughs are still needed. The current rate of improvement in processing power, data storage, device miniaturization, and energy-efficient sensors only begins to scratch the surface of IoT’s full potential. Advances in next-generation computational power, autonomous devices, and interoperable systems still require scientific breakthroughs and are nowhere near deployment. Second, even if the necessary technological advances arrive, it’s not clear the U.S. economy will be the prime recipient of their economic value. Nations that lead in advanced manufacturing, like Germany, may already be better poised to export IoT-enabled products. Policymakers in the United States should view technological advancement in IoT as a global economic race that can be won through sound science policies. These should include: accelerating basic engineering research; helping that research reach the market; supporting entrepreneurs’ access to capital; and training a science- and engineering-ready workforce that can scale up new technologies.

IoT will democratize innovation

Walter D. Valdivia is a fellow in the Center for Technology Innovation at Brookings.

The Internet of Things could be a wonderful thing, but not in the way we imagine it.

Today, the debate is dominated by cheerleaders and worrywarts. But their perspectives are merely two sides of the same coin: technical questions about the reliability of communications and operations, and questions about system security. Our public imagination about the future is being narrowly circumscribed by these questions. However, as the Internet of Things starts to become a thing—or multiple things, or a networked plurality—it is likely to intrude so intensely into our daily lives that alternative imaginations will emerge and demand a hearing.

A compelling vision of the future is necessary to organize and coordinate the various market and political agents who will integrate IoT into society. Technological success is usually measured in terms set by the purveyor of that vision. Traditionally, this is a small group with a financial stake in technological development: the innovating industry. However, the intrusiveness and pervasiveness of the Internet of Things will prompt ordinary citizens to augment that vision. Citizen participation will deny any group a monopoly on that vision of the future. Such a development would be a true step in the direction of democratizing innovation. It could make IoT a wonderful thing indeed.

Applications of IoT for infrastructure

Adie Tomer is a fellow at the Brookings Institution Metropolitan Policy Program and a member of the Metropolitan Infrastructure Initiative.

The Internet of Things and the built environment are a natural fit. The built environment is essentially a collection of physical objects—from sidewalks and streets to buildings and water pipes—that all need to be managed in some capacity. Today, we measure our shared use of those objects through antiquated analog or digital systems. Think of the electricity meter on a building, or a person manually counting pedestrians on a busy city street. Digital, Internet-connected sensors promise to modernize measurement, relaying a whole suite of indicators to centralized databases tuned to make sense of such big data.
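The sensor-to-database pipeline described here can be sketched in a few lines. This is a minimal illustration under stated assumptions: the `SensorReading` and `CentralStore` names are hypothetical, not a real municipal API, and a production system would add timestamped queries, deduplication, and access controls.

```python
# Illustrative sketch: city sensors relaying readings to a central store
# for later analysis. All class and field names are hypothetical.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str   # e.g. an electricity meter or a pedestrian counter
    metric: str      # what is being measured
    value: float
    timestamp: int   # Unix time

class CentralStore:
    """Aggregates readings by metric so analysts can query them."""
    def __init__(self):
        self._by_metric = defaultdict(list)

    def ingest(self, reading: SensorReading) -> None:
        self._by_metric[reading.metric].append(reading)

    def total(self, metric: str) -> float:
        return sum(r.value for r in self._by_metric[metric])

store = CentralStore()
store.ingest(SensorReading("meter-17", "kwh", 4.2, 1))
store.ingest(SensorReading("meter-17", "kwh", 3.8, 2))
store.ingest(SensorReading("ped-03", "pedestrians", 112, 1))
```

The point of the sketch is how little of the value lies in the plumbing: collecting readings is easy, while deciding which metrics matter and who acts on them is the governance question raised below.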

But let’s not fool ourselves. Simply outfitting cities and metro areas with more sensors won’t solve any of our pressing urban issues. Without governance frameworks to apply the data towards goals around transportation congestion, more efficient energy use, or reduced water waste, these sensors could be just another public investment that doesn’t lead to public benefit.

The real goal for IoT in the urban space, then, is to ensure our built environment supports broader economic, social, and environmental objectives. And that’s not a technology issue—that’s a question around leadership and agenda-setting.

Applications of IoT for health care

Niam Yaraghi is a fellow in the Brookings Institution's Center for Technology Innovation.

Health care is one of the most exciting application areas for IoT. Imagine that your Fitbit could determine that you have fallen, are seriously hurt, and need to be rushed to a hospital. It automatically pings the closest ambulance and sends a brief summary of your medical status to the EMT personnel so that they can prepare your emergency care even before they reach the scene. On the way, the ambulance does not need to use its sirens to clear a path, since the other autonomous vehicles have already been notified of the approaching ambulance and moved aside while red lights automatically turn green.

IoT will definitely improve the efficiency of health care services by reducing medical redundancies and errors. This dream will come true sooner than you think. However, if we do not appropriately address the privacy and security issues of health care data, IoT can become our next nightmare. What if terrorist organizations (who are becoming increasingly technology savvy) find a way to hack into Fitbit and send wrong information to an EMT? Who owns our medical data? Can we prevent Fitbit from selling our health data to third parties? Given these concerns, I believe we should design a policy framework that encourages accountability and responsibility with regard to health data. The framework should precisely define who owns the data; who can collect, store, mine, and use it; and what penalties will be enforced if entities act outside of it.
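One way to picture the accountability framework sketched in this passage: every access to health data is checked against an explicit grant from the owner and recorded in an audit log. The class and names below are hypothetical illustrations of the idea, not any real regulatory or vendor scheme.

```python
# Hypothetical sketch of the accountability framework described above:
# data has a defined owner, access requires an explicit grant, and every
# attempt (allowed or denied) is logged for later enforcement.

class HealthDataPolicy:
    def __init__(self, owner: str):
        self.owner = owner
        self.grants = set()    # (party, action) pairs the owner approved
        self.audit_log = []    # every access attempt, allowed or not

    def grant(self, party: str, action: str) -> None:
        self.grants.add((party, action))

    def request(self, party: str, action: str) -> bool:
        allowed = party == self.owner or (party, action) in self.grants
        self.audit_log.append((party, action, allowed))
        return allowed

policy = HealthDataPolicy(owner="patient")
policy.grant("emt", "read")   # the patient authorizes EMTs to read data
```

Under this sketch an EMT's read succeeds, a data broker's attempt to sell fails, and both attempts survive in the audit log, which is what would give a penalty regime something to enforce against.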

Authors

  • Jack Karsten

The benefits of a knives-out Democratic debate

Stop whining about Democrats criticizing each other. The idea that Democrats attacking Democrats is a risk that will deliver reelection to Donald Trump is nonsense. Democrats must attack each other, and attack each other aggressively. Vetting presidential candidates, highlighting their weaknesses and the gaps in their records, is essential to building a…


‘Essential’ cannabis businesses: Strategies for regulation in a time of widespread crisis

Most state governors and cannabis regulators were underprepared for the COVID-19 pandemic, a crisis that is affecting every economic sector. But because the legal cannabis industry is relatively new in most places and still evolving everywhere, the challenges are even greater. What’s more, there is no history that could help us understand how the industry will endure the current economic situation. And so, in many…


Let’s resolve to stop assuming the worst of each other in 2016


Even before the eruption of anti-Muslim rhetoric in the past several weeks, I had a privileged position from which to observe the deep current of Islamophobia that ran beneath the crust of mainstream politics in the fourteen years since 9/11. Because I work on Islamist extremism, my dad often forwards me emails about American Muslims that he receives from friends, asking if they are true. I don’t blame him for asking: they’re truly scary. Muslims imposing Sharia law over the objections of their fellow Americans. Muslims infiltrating the U.S. government to subvert it. And so on.

But as with most Internet rumors circulated over email, the vast majority of the scary reports aren’t true. Take a peek at the “25 Hottest Urban Legends” on the rumor-busting website Snopes and you’ll see what I mean. The 11th on the list is about Muslim passengers on an AirTran flight who attempted a dry run to bring down a plane (they didn’t). The 15th is about an American Muslim who oversees all U.S. immigration (she just coordinates special naturalization ceremonies). The underlying message is that American Muslims are not to be trusted because of their religion.

One reason these rumors have currency is that most Americans don’t know many of their Muslim neighbors. For all the worry of a Muslim takeover, there are only around 4 million in this country, a little over 1 percent of the total population. Most of them do not live in Republican strongholds, where they are most feared.

[A]s with most Internet rumors circulated over email, the vast majority of the scary reports aren’t true.

Of course, familiarity does not always lessen fears or tensions. But it does complicate easy stories about an unfamiliar culture and those who identify with it. For example, because I’ve worked on counterterrorism in the U.S. government, I’ve never bought the story that American Muslims are infiltrating the U.S. government to subvert it. I’ve simply met too many Muslims in the government working impossible hours to keep this country and its Constitution safe.

American Muslims have their own easy stories to tell about non-Muslims that could use some complicating. Several of my Muslim friends have been surprised at the number of non-Muslim strangers who’ve come up to them and voiced their support. They’re surprised, presumably, because they assume that most non-Muslims in this country agree with Trump's rhetoric, which they don’t.

Some American Muslims view Islamophobia as a natural outgrowth of white American racism, religious bigotry, and xenophobia. That easy story may account for some Islamophobia, but it ignores something major: deliberate actions by some Muslims to set non-Muslims against them. Jihadist groups like al-Qaida and ISIS carry out attacks in this country to create a popular backlash against Muslims, in hopes of recruiting those who are angered by the backlash.

Even though most Muslims reject the siren call of the jihadists, the backlash still leads some Muslims to expect the worst of nonbelievers and of the American government. Like the anti-Muslim rumor mill, they spread half-truths about Christian vigilante violence and government plots. For example, at least one prominent religious leader in the American Muslim community has insinuated that the San Bernardino attackers were patsies in a government conspiracy against Muslims. 

My hope is that we’ll all try to be a little less suspicious of one another’s motives and a little more suspicious of the easy stories we tell.

Since it’s the holiday season, I shall indulge in a wish for the New Year. My hope is that we’ll all try to be a little less suspicious of one another’s motives and a little more suspicious of the easy stories we tell. I know the wish is fanciful given the current political climate but I’ve been struck by the number of Americans—Muslim and non-Muslim—who have been willing to confront their biases over the past few weeks and see things from the other side. If our enemies succeed by eroding our empathy for one another, we will succeed by reinforcing and expanding it.


Experts Weigh In: What is the future of al-Qaida and the Islamic State?


Will McCants: As we wind down another year in the so-called Long War and begin another, it’s a good time to reflect on where we are in the fight against al-Qaida and its bête noire, the Islamic State. Both organizations have benefited from the chaos unleashed by the Arab Spring uprisings but they have taken different paths. Will those paths converge again or will the two organizations continue to remain at odds? Who has the best strategy at the moment? And what political changes might happen in the coming year that will reconfigure their rivalry for leadership of the global jihad?

To answer these questions, I’ve asked some of the leading experts on the two organizations to weigh in. The first is Barak Mendelsohn, an associate professor of political science at Haverford College and a senior fellow at the Foreign Policy Research Institute (FPRI). He is the author of the new book The al-Qaeda Franchise: The Expansion of al-Qaeda and Its Consequences.


Barak Mendelsohn: Al-Qaida attacked the U.S. homeland on 9/11, unprepared for what would follow. There was a strong disconnect between al-Qaida’s meager capabilities and its strategic objectives of crippling the United States and of bringing about change in the Middle East. To bridge that gap, Osama bin Laden conveniently and unrealistically assumed that the attack on the United States would lead the Muslim masses and all other armed Islamist forces to join his cause. The collapse of the Taliban regime and the decimation of al-Qaida’s ranks quickly proved him wrong.

Yet over fourteen years later al-Qaida is still around. Despite its unrealistic political vision and considerable setbacks—above all the rise of the Islamic State that upstaged al-Qaida and threatened its survival—it has branches in North Africa, the Arabian Peninsula, the Levant, Central Asia, and the Horn of Africa.

Down, but not out

Two factors explain al-Qaida’s resilience: changes in the environment brought about by the Arab revolutions, and the group’s ability to take advantage of new opportunities by learning from past mistakes. The Arab awakening initially undercut al-Qaida’s original claims that change in Muslim countries could not come peacefully or without first weakening the United States. Yet the violence of regimes against their own people in Syria, Libya, and elsewhere created new opportunities for al-Qaida to demonstrate its relevance. Furthermore, engaged citizens determined to shape their own future presented al-Qaida with a new opportunity to recruit.

But favorable conditions would be insufficient to explain al-Qaida’s resilience without changes in the way al-Qaida operates. Learning from its bitter experience in Iraq, al-Qaida opted to act with some moderation. It embedded itself among rebel movements in Syria and Yemen, thus showing it could be a constructive actor, attentive to the needs of the people and willing to cooperate with a wide array of groups. As part of a broader movement, al-Qaida’s affiliates in these countries also gained a measure of protection from external enemies reluctant to alienate the group’s new allies. 

[E]ven after showing some moderation, al-Qaida’s project is still too extreme for the overwhelming majority of Muslims.

At present, the greatest threat to al-Qaida is not the United States or the Arab regimes; it’s the group’s former affiliate in Iraq, the Islamic State. ISIS is pressuring al-Qaida’s affiliates to defect—while it has failed so far to shift their allegiance, it has deepened cracks within the branches and persuaded small groups of al-Qaida members to change sides. Even if al-Qaida manages to survive the Islamic State’s challenge, in the long term it still faces a fundamental problem that is unlikely to change: even after showing some moderation, al-Qaida’s project is still too extreme for the overwhelming majority of Muslims.

Up, but not forever

With the United States seeking retrenchment and Middle Eastern regimes weakening, the Islamic State came to prominence under more convenient conditions and pursued a different strategy. Instead of wasting its energy on fighting the United States first, ISIS opted to establish a caliphate on the ruins of disintegrating Middle Eastern states. It has thrived on the chaos of the Arab rebellions. But in contrast to al-Qaida, it went beyond offering protection to oppressed Sunni Muslims by promoting a positive message of hope and pride. It does not merely empower Muslims to fend off attacks on their lives, property, and honor; the Islamic State offers its enthusiastic followers an historic chance to build a utopian order and restore the early Islamic empire or caliphate.

ISIS opted to establish a caliphate on the ruins of disintegrating Middle Eastern states. It has thrived on the chaos of the Arab rebellions.

The Islamic State’s leaders gambled that their impressive warfighting skills, the weakness of their opponents, and the reluctance of the United States to fight another war in the Middle East would allow the group to conquer and then govern territory. The gamble paid off. Not only did ISIS succeed in controlling vast territory, including the cities of Raqqa and Mosul; the slow response to its rise allowed the Islamic State’s propaganda machine to construct a narrative of invincibility and inevitability, which has, in turn, increased its appeal to new recruits and facilitated further expansion.

And yet, the Islamic State’s prospects of success are low. Its miscalculations are threatening to undo much of its success. It prematurely and unnecessarily provoked an American intervention that, through a combination of bombings from the air and skilled Kurdish proxies on the ground, is limiting the Islamic State’s ability to expand and even reversing some of the group’s gains. 

ISIS could settle for consolidating its caliphate in the territories it currently controls, but its hubris and messianic zeal do not allow for such limited goals. It is committed to pursuing military expansion alongside its state-building project. This rigid commitment to two incompatible objectives is perhaps the Islamic State’s biggest weakness. 

[T]he slow response to its rise allowed the Islamic State’s propaganda machine to construct a narrative of invincibility and inevitability.

Rather than pursue an economic plan that would guarantee the caliphate’s survival, the Islamic State has linked its economic viability to its military expansion. At present, ISIS relies on taxing its population and oil sales to support its flailing economy. But these financial resources cannot sustain a state, particularly one bent on simultaneously fighting multiple enemies on numerous fronts. Ironically, rather than taming its aspirations, the Islamic State sees conquest as the way to promote its state-building goals. Its plan for growing the economy is based on the extraction of resources through military expansion. While this plan worked well at first—when the Islamic State faced weak enemies—it is not a viable solution any longer, as the self-declared caliphate can no longer expand fast enough to meet its needs. Consequently, this strategy is undermining ISIS rather than strengthening it. 

Unfortunately, even if the Islamic State is bound to fail over the long run, it has had enough time to wreak havoc on other states in the neighborhood. And while its ability to govern is likely to continue diminishing, the terror attacks in Paris, Beirut, and Sinai suggest that the Islamic State will remain capable of causing much pain for a long time.


Amid rising fears of ISIS, Obama must reassure


As President Obama prepares to give the final State of the Union address of his presidency tonight, he’s promised to stay away from the technocrat’s laundry list of to-do’s. Instead, he’s expected to deliver a speech that will remind his fellow citizens of their ability to “come together as one American family.” It’s going to be a tough sell, especially when the citizens are terrified of outsiders and suspicious of one another.

Most of the fear and paranoia revolves around the Islamic State group. Although the group poses far less of a threat to the United States than to our allies and friends in Europe and the Middle East, it is the sum of all fears in the minds of many Americans—an immigrant, terrorist, cyber, WMD, genocidal threat rolled into one. Its name alone can be invoked to indict Obama’s national security and immigration policies—substantive criticisms are unnecessary.

[T]he Islamic State group...is the sum of all fears in the minds of many Americans.

Most of those fears are overblown, but the president will want to tackle each of them in his speech if he intends to calm fears and bring people together. He’ll explain why taking in refugees is not just living up to American values but also smart counterterrorism. He’ll showcase evidence that the military campaign against the Islamic State in the Middle East is bearing fruit. He’ll reassure Americans that the Islamic State can’t plant a skilled operative in this country, and remind them that the best way to stop the unskilled lone-wolf shooters inspired by the Islamic State is to close gun loopholes and monitor their behavior online before they act. He’ll demonstrate his commitment to blunting Islamic State recruitment, touting changes to how the government counters the Islamic State’s appeal online and in America’s big cities.

All of that is well and good, but it’s a bureaucrat’s (or think tanker’s) effort at reassuring the public. To truly succeed in mitigating America’s fears and bringing citizens together, our country’s leader has to acknowledge that their fears are real and explain what our enemies hope to gain by engendering them. While Americans’ fears may be overblown, they won’t be deflated by technocratic hot air.


Experts weigh in (part 2): What is the future of al-Qaida and the Islamic State?


Will McCants: As we begin another year in the so-called Long War, it’s a good time to reflect on where we are in the fight against al-Qaida and its bête noire, the Islamic State. Both organizations have benefited from the chaos unleashed by the Arab Spring uprisings but they have taken different paths. Will those paths converge again or will the two organizations continue to remain at odds? Who has the best strategy at the moment? And what political changes might happen in the coming year that will reconfigure their rivalry for leadership of the global jihad?

To answer these questions, I’ve asked some of the leading experts on the two organizations to weigh in. First was Barak Mendelsohn, who contrasts al-Qaida’s resilience and emphasis on Sunni oppression with the Islamic State’s focus on building a utopian order and restoring the caliphate.

Next is Clint Watts, a Fox fellow at the Foreign Policy Research Institute. He offers ways to avoid the flawed assumptions that have led to mistaken counterterrorism forecasts in recent years. 


Clint Watts: Two years ago today, counterterrorism forecasts focused on a “resurgent” al-Qaida. Debates over whether al-Qaida was again winning the war on terror ensued just a week before the Islamic State invaded Mosul. While Washington’s al-Qaida debates steamed away in 2013, Ayman al-Zawahiri’s al-Qaida suffered unprecedented internal setbacks from a disobedient, rogue affiliate formerly known as al-Qaida in Iraq (AQI). With terror predictions two years ago so far off the mark, should we even attempt to anticipate what the next two years of al-Qaida and ISIS will bring?

Rather than prognosticate about how more than a dozen extremist groups operating on four continents might commit violence in the future, analysts might instead examine flawed assumptions that resulted in the strategic surprise known as the Islamic State. Here are insights from last decade’s jihadi shifts we should consider when making forecasts on al-Qaida and the Islamic State’s future in the coming decade. 

Loyalty is fleeting, self-interest is forever. Analysts who missed the Islamic State’s rise assumed that those who pledged allegiance to al-Qaida would remain loyal indefinitely. But loyalties change despite the oaths that bind them. Abu Bakr al-Baghdadi and the Islamic State’s leaders used technicalities to slip their commitments to al-Qaida. Boko Haram has rapidly gone from al-Qaida wannabe to Islamic State devotee.

In short, jihadi pledges of loyalty should not be seen as binding or enduring, but instead temporary. When a group’s fortunes wane or leaders change, allegiance will rapidly shift to whatever strain of jihad proves most advantageous to the group or its leader. Prestige, money, manpower—these drive pledges of allegiance, not ideology. 

Al-Qaida and the Islamic State do not think solely about destroying the United States and its Western allies. Although global jihadi groups always call for attacks on the West, they don’t always deliver. Either they can’t or they have other priorities, like attacking closer to home. So jihadi propaganda alone does not tell us much about how the group is going to behave in the future. 

Zawahiri, for example, has publicly called on al-Qaida’s affiliates to carry out attacks on the West. But privately, he has instructed his affiliate in Syria to hold off. And for most of its history, the Islamic State focused on attacking the near enemy in the Middle East rather than the far enemy overseas, despite repeatedly vowing to hit the United States. Both groups will take advantage of any easy opportunity to strike the United States. However, continuing to frame future forecasts through an America-centric lens will yield analysis that’s off the mark and of questionable utility.

[J]ihadi propaganda alone does not tell us much about how the group is going to behave in the future.

Al-Qaida and the Islamic State don’t control all of the actions of their affiliates. News headlines lead casual readers to believe al-Qaida and the Islamic State command and control vast networks operating under a unified strategic plan. But a year ago, the Charlie Hebdo attack in Paris caught al-Qaida in the Arabian Peninsula (AQAP) completely by surprise—despite one of the attackers attributing the assault to the group. Al-Qaida in the Islamic Maghreb’s (AQIM) recent attacks in Mali and Burkina Faso were likely conducted independently of al-Qaida’s central leadership. While the Islamic State has clearly mobilized its network and inspired others to execute a broad range of international attacks, the group’s central leadership in Iraq and Syria closely manages only a small subset of these plots.

At no time since the birth of al-Qaida have jihadi affiliates and networks operated with such independence. Since Osama bin Laden’s death, al-Qaida affiliates in Yemen, the Sahel, Somalia, and Syria all aggressively sought to form states—a strategy bin Laden advised against. Target selections and the rapid pace of plots by militants in both networks suggest local dynamics rather than a cohesive, global grand strategy drive today’s jihad. Accurately anticipating the competition and cooperation of such a wide array of terrorist affiliates with overlapping allegiances to both groups will require examination by teams of analysts with a range of expertise rather than single pundits. 

At no time since the birth of al-Qaida have jihadi affiliates and networks operated with such independence.

Both groups and their affiliates will be increasingly enticed to align with state sponsors and other non-jihadi, non-state actors. The more money al-Qaida and the Islamic State have, the more leverage they have over their affiliates. But when the money dries up—as it did in al-Qaida’s case and will in the Islamic State’s—the affiliates will look elsewhere to sustain themselves. Distant affiliates will seek new suitors or create new enterprises. 

Inevitably, some of the affiliates will look to states that are willing to fund them in proxy wars against their mutual adversaries. Iran, despite fighting the Islamic State in Syria, might be enticed to support Islamic State terrorism inside Saudi Arabia’s borders. Saudi Arabia could easily use AQAP as an ally against the Iranian-backed Houthis in Yemen. African nations may find it easier to pay off jihadi groups threatening their countries than to face persistent destabilizing attacks in their cities. When money becomes scarce, the affiliates of al-Qaida and the Islamic State will have fewer qualms about taking money from their ideological enemies if they share common short-term interests.

If you want to predict the future direction of the Islamic State and al-Qaida, avoid the flawed assumptions noted above. Instead, I offer these three notes: 

  1. Look to regional terrorism forecasts, which illuminate local nuances routinely overlooked in big global assessments of al-Qaida and the Islamic State. Depending on the region, either the Islamic State or al-Qaida may reign supreme, and their ascendance will be driven more by local than by global forces.
  2. Watch the migration of surviving foreign fighters as the Islamic State declines in Iraq and Syria. Their refuge will be our future trouble spot.
  3. Don’t try to anticipate too far into the future. Since bin Laden’s death, the terrorist landscape has become more diffuse, half a dozen affiliates have risen and fallen, and the Arab Spring has gone from a great hope for democracy to protracted quagmires across the Middle East.

Today’s terrorism picture remains complex, volatile, and muddled. There’s no reason to believe tomorrow’s will be anything different.

Authors





Experts Weigh In (part 3): What is the future of al-Qaida and the Islamic State?


Will McCants: As we continue onwards in the so-called Long War, it’s a good time to reflect on where we are in the fight against al-Qaida and its bête noire, the Islamic State. Both organizations have benefited from the chaos unleashed by the Arab Spring uprisings but they have taken different paths. Will those paths converge again or will the two organizations continue to remain at odds? Who has the best strategy at the moment? And what political changes might happen in the coming year that will reconfigure their rivalry for leadership of the global jihad?

To answer these questions, I’ve asked some of the leading experts on the two organizations to weigh in. First was Barak Mendelsohn, who analyzed the factors that explain the resilience and weaknesses of both groups. Then Clint Watts offered ways to avoid the flawed assumptions that have led to mistaken counterterrorism forecasts in recent years. 

Next up is Charles Lister, a resident fellow at the Middle East Institute, who examines the respective courses the two groups have charted to date and whether those courses are likely to change. 


Charles Lister: The world of international jihad has had a turbulent few years, and only now is the dust beginning to settle. The emergence of the Islamic State as an independent transnational jihadi rival to al-Qaida sparked a competitive dynamic that has heightened the threat of attacks in the West and intensified the need for both movements to demonstrate their value on local battlefields. Despite the trillions of dollars spent pushing back al-Qaida in Afghanistan and Pakistan and al-Qaida in Iraq, the jihadi threat we face today far eclipses that of 2000 and 2001.

As has been the case for some time, al-Qaida is no longer a grand transnational movement, but rather a loose network of semi-independent armed groups dispersed around the world. Although al-Qaida’s central leadership appears to be increasingly cut off from the world, frequently taking many weeks to respond publicly to significant events, its word remains strong within its affiliates. For example, a secret letter from al-Qaida leader Ayman al-Zawahiri to his Syrian affiliate the Nusra Front in early 2015 promptly caused the group to cease plotting attacks abroad.

While the eruption of the Arab Spring in 2010 challenged al-Qaida’s insistence that only violent jihad can secure political change, the subsequent repression and resulting instability provided an opportunity. What followed was a period of extraordinary strategic review. Beginning with Ansar al-Sharia in Yemen (in 2010 and 2011) and then with al-Qaida in the Islamic Maghreb (AQIM), Ansar al-Din, and the Movement for Unity and Jihad in West Africa (MUJAO) in Mali (2012), al-Qaida began developing a new strategy focused on slowly nurturing unstable and vulnerable societies into hosts for an al-Qaida Islamic state. Although a premature imposition of harsh Shariah norms caused projects in Yemen and Mali to fail, al-Qaida’s activities in Syria and Yemen today look to have perfected the new “long game” approach.

In Syria and Yemen, al-Qaida has taken advantage of weak states suffering from acute socio-political instability in order to embed itself within popular revolutionary movements. Through a consciously managed process of “controlled pragmatism,” al-Qaida has successfully integrated its fighters into broader dynamics that, with additional manipulation, look all but intractable. Through a temporary renunciation of Islamic hudud (fixed punishments in the Quran and Hadith) and an overt insistence on multilateral populist action, al-Qaida has begun socializing entire communities into accepting its role within their revolutionary societies. With durable roots in these operational zones—“safe bases,” as Zawahiri calls them—al-Qaida hopes one day to proclaim durable Islamic emirates as individual components of an eventual caliphate.

Breadth versus depth

The Islamic State (or ISIS), on the other hand, has emerged as al-Qaida’s obstreperous and brutally rebellious younger sibling. Seeking rapid and visible results, ISIS worries little about taking the time to win popular acceptance and instead controls territory through force and psychological intimidation. As a militarily capable and administratively accomplished organization, ISIS has acquired a stranglehold over parts of Iraq and Syria—like Raqqa, Deir el-Zour, and Mosul—but its roots elsewhere in both countries are shallow at best. With effective and representative local partners, the U.S.-led coalition can and will eventually take back much of ISIS’s territory, but evidence thus far suggests progress will be slow.

Meanwhile, ISIS has developed invaluable strategic depth elsewhere in the world, through its acquisition of affiliates—or additional “states” for its Caliphate—in Yemen, Libya, Algeria, Egypt, Afghanistan, Pakistan, Nigeria, and Russia. Although it will struggle to expand much beyond its current geographical reach, the growing importance of ISIS in Libya, Egypt, and Afghanistan-Pakistan in particular will allow the movement to survive pressures it faces in Syria and Iraq. 

As that pressure heightens, ISIS will seek to delegate some level of power to its international affiliates, while actively encouraging retaliatory attacks—both centrally directed and more broadly inspired—against high-profile Western targets. Instability breeds opportunity for groups like ISIS, so we should also expect it to exploit the fact that refugee flows from Syria towards Europe in 2016 look set to dramatically eclipse those seen in 2015.

Charting a new course?

That the world now faces threats from two major transnational jihadist movements employing discernibly different strategies makes today’s counterterrorism challenge much more difficult. The dramatic expansion of ISIS and its captivation of the world’s media attention have encouraged a U.S.-led obsession with an organization that has minimal roots in conflict-ridden societies. Meanwhile, the West has become distracted from its long-time enemy al-Qaida, which has now grown deep roots in places like Syria and Yemen. Al-Qaida has not disappeared, and neither has it been defeated. We continue this policy imbalance at our peril.

In recent discussions with Islamist sources in Syria, I’ve heard that al-Qaida may be further adapting its long-game strategy. The Nusra Front has been engaged in six weeks of on-and-off secret talks with at least eight moderate Islamist rebel groups, after proposing a grand merger with any interested party in early January. Although talks briefly came to a close in mid-January over the troublesome issue of the Nusra Front’s allegiance to al-Qaida, the group’s leader, Abu Mohammed al-Jolani, has now placed those ties on the table for negotiation. 

The fact that this sensitive subject is now reportedly open for discussion is a significant indicator of how far the Nusra Front is willing to stretch its jihadist mores for the sake of integration into Syrian revolutionary dynamics. However, Jolani is a long-time al-Qaida loyalist and doesn’t fit the profile of someone willing to break a religious oath purely for the sake of an opportunistic power play. It is therefore interesting that this secret debate inside Syria comes amid whispers within Salafi-jihadi and pro-al-Qaida circles that Zawahiri is considering “releasing” his affiliates from their loyalty pledges in order to transform al-Qaida into an organic network of locally inspired movements, loosely tied together by an overarching strategic idea.

Whether al-Qaida and its affiliates ultimately evolve along this path or not, the threat they pose to local, regional, and international security is clear. When compounded by ISIS’s determination to continue expanding and to conduct more frequent and more deadly attacks abroad, jihadist militancy looks well placed to pose an ever-present danger for many years to come. 

Authors





Global economic and environmental outcomes of the Paris Agreement

The Paris Agreement, adopted by the Parties to the United Nations Framework Convention on Climate Change (UNFCCC) in 2015, has now been signed by 197 countries. It entered into force in 2016. The agreement established a process for moving the world toward stabilizing greenhouse gas (GHG) concentrations at a level that would avoid dangerous climate…






The risk of fiscal collapse in coal-reliant communities

EXECUTIVE SUMMARY If the United States undertakes actions to address the risks of climate change, the use of coal in the power sector will decline rapidly. This presents major risks to the 53,000 US workers employed by the industry and their communities. 26 US counties are classified as “coal-mining dependent,” meaning the coal industry is…






Columbia Energy Exchange: Coal communities face risk of fiscal collapse










Why local governments should prepare for the fiscal effects of a dwindling coal industry






A systematic review of systems dynamics and agent-based obesity models: Evaluating obesity as part of the global syndemic






Simulating the effects of tobacco retail restriction policies

Tobacco use remains the single largest preventable cause of death and disease in the United States, killing more than 480,000 Americans each year and incurring over $300 billion per year in costs for direct medical care and lost productivity. In addition, of all cigarettes sold in the U.S. in 2016, 35% were menthol cigarettes, which…






Development of a computational modeling laboratory for examining tobacco control policies: Tobacco Town






Webinar: Reopening the coronavirus-closed economy — Principles and tradeoffs

In an extraordinary response to an extraordinary public health challenge, the U.S. government has forced much of the economy to shut down. We now face the challenge of deciding when and how to reopen it. This is both vital and complicated. Wait too long—maintain the lockdown until we have a vaccine, for instance—and we’ll have another Great Depression. Move too soon, and we…