Toward a Containment Strategy for Smallpox Bioterror: An Individual-Based Computational Approach

Abstract

An individual-based computational model of smallpox epidemics in a two-town county is presented and used to develop strategies for bioterror containment. A powerful and feasible combination of preemptive and reactive vaccination and isolation strategies is developed which achieves epidemic quenching while minimizing risks of adverse side effects. Calibration of the model to historical data is described. Various model extensions and applications to other public health problems are noted.


Brookings Institution Press, 2004, 55 pp.

In the United States, routine smallpox vaccination ended in 1972. The level of immunity remaining in the U.S. population is uncertain, but is generally assumed to be quite low. Smallpox is a deadly and infectious pathogen with a fatality rate of 30 percent. If smallpox were successfully deployed as an agent of bioterrorism today, the public health and economic consequences could be devastating.

Toward a Containment Strategy for Smallpox Bioterror describes the scientific results and policy implications of a simulation of a smallpox epidemic in a two-town county. The model was developed by an interdisciplinary team from the Johns Hopkins Bloomberg School of Public Health and the Brookings Institution Center on Social and Economic Dynamics, employing agent-based and other advanced computational techniques. Such models are playing a critical role in the crafting of a national strategy for the containment of smallpox by providing public health policymakers with a variety of novel and feasible approaches to vaccination and isolation under different circumstances. The extension of these techniques to the containment of emerging pathogens, such as SARS, is discussed.

About the Authors:
Joshua M. Epstein and Shubha Chakravarty are with the Brookings Institution. Derek A. T. Cummings, Ramesh M. Singa, and Donald S. Burke are with the Johns Hopkins Bloomberg School of Public Health.


Ordering Information:
  • ISBN 978-0-8157-2455-1, $19.95

Louisiana’s prescription drug experiment: A model for the nation?

The high cost of prescription drugs has become an increasingly pressing concern for policymakers, insurers, and families. New drugs—like those now available for hepatitis C— offer tremendous medical benefits, but at a cost that puts them out of reach for many patients. In an effort to address the affordability dilemma, the Louisiana Department of Health…

       





Overcoming barriers: Sustainable development, productive cities, and structural transformation in Africa

Against a background of protracted decline in global commodity prices and renewed focus on the Africa rising narrative, Africa is proving resilient, underpinned by strong economic performance in non-commodity exporting countries. The rise of African cities contains the potential for new engines for the continent’s structural transformation, if harnessed properly. However, the susceptibility of Africa’s…

      
 
 





Trans-Atlantic Scorecard – April 2019

Welcome to the third edition of the Trans-Atlantic Scorecard, a quarterly evaluation of U.S.-European relations produced by Brookings’s Center on the United States and Europe (CUSE), as part of the Brookings – Robert Bosch Foundation Transatlantic Initiative. To produce the Scorecard, we poll Brookings scholars and other experts on the present state of U.S. relations…

       





Time to Deregulate the Practice of Law

Clifford Winston and Robert Crandall argue that occupational licensing for lawyers creates a monopoly in the legal field. They write that deregulating the industry would give consumers more responsive service while lowering costs.

      
 
 





The Dangerous Price of Ignoring Syria

Vali Nasr says that President Obama has resisted American involvement in Syria because it challenges a central aim of his foreign policy: shrinking the U.S. footprint in the Middle East and downplaying the region’s importance to global politics. Nasr examines why doing more on Syria would reverse the U.S. retreat from the region.

      
 
 





Despite Predictions, BCRA Has Not Been a Democratic 'Suicide Bill'

During debates in Congress and in the legal battles testing its constitutionality, critics of the Bipartisan Campaign Reform Act of 2002 imagined a host of unanticipated and debilitating consequences. The law's ban on party soft money and the regulation of electioneering advertising would, they warned, produce a parade of horribles: A decline in political speech protected by the First Amendment, the demise of political parties, and the dominance of interest groups in federal election campaigns.

The forecast that attracted the most believers — among politicians, journalists, political consultants, election-law attorneys and scholars — was the claim that Democrats would be unable to compete against Republicans under the new rules, primarily because the Democrats' relative ability to raise funds would be severely crippled. One year ago, Seth Gitell in The Atlantic Monthly summarized this view and went so far as to call the new law "The Democratic Party Suicide Bill." Gitell quoted a leading Democratic Party attorney, who expressed his private view of the law as "a fascist monstrosity." He continued, "It is grossly offensive ... and on a fundamental level it's horrible public policy, because it emasculates the parties to the benefit of narrow-focus special-interest groups. And it's a disaster for the Democrats. Other than that, it's great."

The core argument was straightforward. Democratic Party committees were more dependent on soft money — unlimited contributions from corporations, unions and individuals — than were the Republicans. While they managed to match Republicans in soft-money contributions, they trailed badly in federally limited hard-money contributions. Hence, the abolition of soft money would put the Democrats at a severe disadvantage in presidential and Congressional elections.

In addition, the argument went, by increasing the amount an individual could give to a candidate from $1,000 to $2,000, the law would provide a big financial boost to President Bush, who would double the $100 million he raised in 2000 and vastly outspend his Democratic challenger. Finally, the ban on soft money would weaken the Democratic Party's get-out-the-vote efforts, particularly in minority communities, while the regulation of "issue ads" would remove a potent electoral weapon from the arsenal of labor unions, the party's most critical supporter.

After 18 months of experience under the law, the fundraising patterns in this year's election suggest that these concerns were greatly exaggerated. Money is flowing freely in the campaign, and many voices are being heard. The political parties have adapted well to an all-hard-money world and have suffered no decline in total revenues. And interest groups are playing a secondary role to that of the candidates and parties.

The financial position of the Democratic party is strikingly improved from what was imagined a year ago. Sen. John Kerry (D-Mass.), who opted out of public funding before the Iowa caucuses, will raise more than $200 million before he accepts his party's nomination in Boston. The unusual unity and energy in Democrats' ranks have fueled an extraordinary flood of small donations to the Kerry campaign, mainly over the Internet. These have been complemented by a series of successful events courting $1,000 and $2,000 donors.

Indeed, since Kerry emerged as the prospective nominee in March, he has raised more than twice as much as Bush and has matched the Bush campaign's unprecedented media buys in battleground states, while also profiting from tens of millions of dollars in broadcast ads run by independent groups that are operating largely outside the strictures of federal election law.

The Democratic national party committees have adjusted to the ban on soft money much more successfully than insiders had thought possible. Instead of relying on large soft-money gifts for half of their funding, Democrats have shown a renewed commitment to small donors and have relied on grassroots supporters to fill their campaign coffers. After the 2000 election, the Democratic National Committee had 400,000 direct-mail donors; today the committee has more than 1.5 million, and hundreds of thousands more who contribute over the Internet.

By the end of June, the three Democratic committees had already raised $230 million in hard money alone, compared to $227 million in hard and soft money combined at this point in the 2000 election cycle. They have demonstrated their ability to replace the soft money they received in previous elections with new contributions from individual donors.

Democrats are also showing financial momentum as the election nears, and thus have been gradually reducing the Republican financial advantage in both receipts and cash on hand. In 2003, Democrats trailed Republicans by a large margin, raising only $95 million, compared to $206 million for the GOP. But in the first quarter of this year, Democrats began to close the gap, raising $50 million, compared to $82 million for Republicans. In the most recent quarter, they narrowed the gap even further, raising $85 million, compared to the Republicans' $96 million.

Democrats are now certain to have ample funds for the fall campaigns. Although they had less than $20 million in the bank (minus debts) at the beginning of this year, they have now banked $92 million. In the past three months, Democrats actually beat Republicans in generating cash — $47 million, compared to $31 million for the GOP.

The party, therefore, has the means to finance a strong coordinated and/or independent-spending campaign on behalf of the presidential ticket, while Congressional committees have the resources they need to play in every competitive Senate and House race, thanks in part to the fundraising support they have received from Members of Congress.

Moreover, FEC reports through June confirm that Democratic candidates in those competitive Senate and House races are more than holding their own in fundraising. They will be aided by a number of Democratic-leaning groups that have committed substantial resources to identify and turn out Democratic voters on Election Day.

Democrats are highly motivated to defeat Bush and regain control of one or both houses of Congress. BCRA has not frustrated these efforts. Democrats are financially competitive with Republicans, which means the outcome will not be determined by a disparity of resources. Put simply, the doomsday scenario conjured up by critics of the new campaign finance law has not come to pass.

Publication: Roll Call
     
 
 





The Competitive Problem of Voter Turnout

On November 7, millions of Americans will exercise their civic duty to vote. At stake will be control of the House and Senate, not to mention the success of individual candidates running for office. President Bush's "stay the course" agenda will either be enabled over the next two years by a Republican Congress or knocked off kilter by a Democratic one.

With so much at stake, it is not surprising that the Pew Research Center found that 51 percent of registered voters have given a lot of thought to this November's election. This is higher than in any other recent midterm election, including 1994, when the figure was 44 percent and Republicans took control of the House. If that interest translates into votes, turnout should exceed the 1994 turnout rate of 41 percent of eligible voters.

There is good reason to suspect that despite the high interest, turnout will not exceed 1994. The problem is that a national poll is, well, a national poll, and does not measure attitudes of voters within states and districts.

People vote when there is a reason to do so. Republican and Democratic agendas are in stark contrast on important issues, but voters also need to believe that their vote will matter in deciding who will represent them. It is here that the American electoral system is broken for many voters.

Voters have little choice in most elections. In 1994, Congressional Quarterly rated 98 House elections as competitive. Today, it lists 51. To put it another way, we are already fairly confident of the winner in nearly 90 percent of House races. Although there is no similar tracking for state legislative offices, we know that the number of elections won with less than 60 percent of the vote has fallen since 1994.

The real damage to the national turnout rate is in the large states of California and New York, which together account for 17 percent of the country's eligible voters. Neither state has a competitive Senate or governor's election, and few competitive House or state legislative races. Compare that to 1994: when Californians participated in competitive Senate and governor's races, the state's turnout was 5 percentage points above the national rate. The same year, New York's competitive governor's race helped boost turnout a point above the national rate.

Lacking stimulation from two of the largest states, turnout boosts will have to come from elsewhere. Texas has an interesting four-way governor's race that might draw infrequent voters to the polls. Ohio's competitive Senate race and some House races might also draw voters. However, in other large states like Florida, Illinois, Michigan and Pennsylvania, turnout will suffer from largely uncompetitive statewide races.

The national turnout rate will likely be lower than in 1994, falling shy of 40 percent. This is not to say that turnout will be poor everywhere. Energized voters in Connecticut get to vote in an interesting Senate race, and three of five Connecticut House seats are up for grabs. The problem is that turnout will be localized in these few areas of competition.

The fault does not lie with the voters; people's lives are busy, and a rational person will abstain when their vote does not matter to the election outcome. The political parties are also sensitive to competition and focus their limited resources where elections are competitive. Television advertising and other mobilizing efforts by campaigns will only be found in competitive races.

The old adage of "build it and they will come" is relevant. All but hardcore sports fans tune out a blowout. Building competitive elections -- and giving voters real choices -- will do much to increase voter turnout in American politics. There are a number of reforms on the table: redistricting to create competitive districts, campaign financing to give candidates equal resources, and even altering the electoral system to fundamentally change how a vote elects representatives. If voters want choice and a government more responsive to their needs, they should consider how these seemingly arcane election procedures have real consequences on motivating them to do the most fundamental democratic action: vote.

Publication: washingtonpost.com

Principles for Transparency and Public Participation in Redistricting


Scholars from the Brookings Institution and the American Enterprise Institute are collaborating to promote transparency in redistricting. In January 2010, an advisory board of experts and representatives of good government groups was convened in order to articulate principles for transparent redistricting and to identify barriers to the public and communities who wish to create redistricting plans. This document summarizes the principles for transparency in redistricting that were identified during that meeting.

Benefits of a Transparent, Participative Redistricting Process

The drawing of electoral districts is among the most easily manipulated and least transparent systems in democratic governance. All too often, redistricting authorities maintain their monopoly by imposing high barriers to transparency and public participation. Increasing transparency and public participation can be a powerful counterbalance by providing the public with information similar to that which is typically only available to official decision makers, which can lead to different outcomes and better representation.

Increasing transparency can empower the public to shape the representation for their communities, promote public commentary and discussion about redistricting, inform legislators and redistricting authorities which district configurations their constituents and the public support, and educate the public about the electoral process.  

Fostering public participation can enable the public to identify their neighborhoods and communities, promote the creation of alternative maps, and facilitate an exploration of a wide range of representational possibilities. The existence of publicly-drawn maps can provide a measuring stick against which an official plan can be compared, and promote the creation of a “market” for plans that support political fairness and community representational goals.

Transparency Principles

All redistricting plans should include sufficient information so the public can verify, reproduce, and evaluate a plan. Transparency thus requires that:

  • Redistricting plans must be available in non-proprietary formats.
  • Redistricting plans must be available in a format allowing them to be easily read and analyzed with commonly-used geographic information software.
  • The criteria used as a basis for creating plans and individual districts must be clearly documented.

Creating and evaluating redistricting plans and community boundaries requires access to demographic, geographic, community, and electoral data. Transparency thus requires that:

  • All data necessary to create legal redistricting plans and define community boundaries must be publicly available, under a license allowing reuse of these data for non-commercial purposes.
  • All data must be accompanied by clear documentation stating the original source, the chain of ownership (provenance), and all modifications made to it.

Software systems used to generate or analyze redistricting plans can be complex, impossible to reproduce, or impossible to correctly understand without documentation. Transparency thus requires that:

  • Software used to automatically create or improve redistricting plans must be either open-source or provide documentation sufficient for the public to replicate the results using independent software.
  • Software used to generate reports that analyze redistricting plans must be accompanied by documentation of data, methods, and procedures sufficient for the reports to be verified by the public.

Services offered to the public to create or evaluate redistricting plans and community boundaries are often opaque and subject to misinterpretation unless adequately documented. Transparency thus requires that:

  • Software necessary to replicate the creation or analysis of redistricting plans and community boundaries produced by the service must be publicly available.
  • The service must provide the public with the ability to make available all published redistricting plans and community boundaries in non-proprietary formats that are easily read and analyzed with commonly-used geographic information software.
  • Services must provide documentation of any organizations providing significant contributions to their operation.

Promoting Public Participation

New technologies provide opportunities to broaden public participation in the redistricting process. These technologies should aim to realize the potential benefits described and be consistent with the articulated transparency principles.

Redistricting is a legally and technically complex process. District creation and analysis software can encourage broad participation by:

  • being widely accessible and easy to use;
  • providing mapping and evaluation tools that help the public create legal redistricting plans, as well as maps identifying local communities;
  • being accompanied by training materials that assist the public in creating and evaluating legal redistricting plans and in defining community boundaries;
  • offering publication capabilities that allow the public to examine maps even without access to the software; and
  • promoting social networking that allows the public to compare, exchange, and comment on both official and community-produced maps.



Official Endorsement from Organizations – Americans for Redistricting Reform, Brennan Center for Justice at New York University, Campaign Legal Center, Center for Governmental Studies, Center for Voting and Democracy, Common Cause, Demos, and the League of Women Voters of the United States.

Attending board members – Nancy Bekavac, Director, Scientists and Engineers for America; Derek Cressman, Western Regional Director of State Operations, Common Cause; Anthony Fairfax, President, Census Channel; Representative Mike Fortner (R), Illinois General Assembly; Karin Mac Donald, Director, Statewide Database, Berkeley Law, University of California, Berkeley; Leah Rush, Executive Director, Midwest Democracy Network; Mary Wilson, President, League of Women Voters.

Editors – Micah Altman, Harvard University and the Brookings Institution; Thomas E. Mann, Brookings Institution; Michael P. McDonald, George Mason University and the Brookings Institution; Norman J. Ornstein, American Enterprise Institute.

This project is funded by a grant from the Sloan Foundation to the Brookings Institution and the American Enterprise Institute.

Publication: The Brookings Institution and The American Enterprise Institute
Image Source: © Lucy Nicholson / Reuters
      
 
 





@ Brookings Podcast: The Politics and Process of Congressional Redistricting

Now that the 2010 Census is concluded, states will begin the process of reapportionment—re-drawing voting district lines to account for population shifts. Nonresident Senior Fellow Michael McDonald says redistricting has been fraught with controversy and corruption since the nation’s early days, when the first “gerrymandered” district was drawn. Two states—Arizona and California—have instituted redistricting commissions intended to insulate the process from political shenanigans, but politicians everywhere will continue to work the system to gain electoral advantage and the best chance of re-election for themselves and their parties.


Using Crowd-Sourced Mapping to Improve Representation and Detect Gerrymanders in Ohio


Analysis of dozens of publicly created redistricting plans shows that map-making technology can improve political representation and detect a gerrymander.  In 2012, President Obama won the vote in Ohio by three percentage points, while Republicans held a 13-to-5 majority in Ohio’s delegation to the U.S. House. After redistricting in 2013, Republicans held 12 of Ohio’s House seats while Democrats held four. As is typical in these races, few were competitive; the average margin of victory was 32 points. Is this simply a result of demography, the need to create a majority-minority district, and the constraints traditional redistricting principles impose on election lines—or did the legislature intend to create a gerrymander?

Crowd-Sourced Redistricting Maps

In the Ohio elections, we have a new source of information that opens a window into the legislature’s choice: Large numbers of publicly created redistricting plans.

During the last round of redistricting, across the country thousands of people in over a dozen states created hundreds of legal redistricting plans. Advances in information technology and the engagement of grassroots reform groups made these changes possible. To promote these efforts we created the DistrictBuilder open redistricting platform and many of these groups used this tool to create their plans.

Over the last several years, we have used the trove of information produced by public redistricting to gain insight into the politics of representation. In previous work that analyzed public redistricting in Virginia[1], and in Florida[2], we discovered that members of the public are capable of creating legal redistricting plans that outperform those maps created by legislatures in a number of ways.

Public redistricting in Ohio shows something new: the likely motives of the legislature. These can be seen by using information visualization methods to show the ways in which redistricting goals can be balanced (or traded off) in Ohio, revealing the particular trade-offs made by the legislature.

The figure below, from our new research paper[3], shows 21 plots—each of which compares legislative and publicly-created plans using a pair of scores—altogether covering seven different traditional and representational criteria. A tiny ‘A’ shows the adopted plan. The top-right corner of each mini-plot shows the best theoretically possible score. When examined by itself, the legislative plan meets a few criteria: it minimizes population deviation, creates an expected majority-minority seat, and creates a substantial majority of districts that would theoretically be competitive in an open-seat race in which the statewide vote was evenly split.

Figure 1: Pairwise Congressional Score Comparisons (Scatterplots) - Standardized Scores

In previous rounds of redistricting, empirical analysis would stop here—unless experts were called in to draw alternative plans in litigation. However, the large number of public plans now available allows us to see other options, plans the legislature could readily have created had it desired to do so. Comparison of the adopted plans and public plans reveal the weakness of the legislature’s choice. Members of the public were able to find plans that soundly beat the legislative plan on almost every pair of criteria, including competitive districts.
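
This kind of pairwise comparison can be made concrete with a simple dominance check. The sketch below is purely illustrative and is not the paper's code: the plan names, criteria, and scores are hypothetical, and scores are assumed to be standardized so that higher is better on every criterion, as in the plots described above.

```python
# Illustrative sketch only (not from the paper): for each pair of criteria,
# check whether any publicly created plan beats the adopted plan on both.
# Plan names, criteria, and scores are hypothetical; scores are assumed
# standardized so that higher is better.
from itertools import combinations

CRITERIA = ["population_equality", "competitiveness", "county_integrity",
            "compactness", "majority_minority"]

def dominates(a, b, criteria):
    """True if plan `a` is at least as good as `b` on every listed criterion
    and strictly better on at least one."""
    return (all(a[c] >= b[c] for c in criteria)
            and any(a[c] > b[c] for c in criteria))

adopted = {"population_equality": 1.0, "competitiveness": 0.6,
           "county_integrity": 0.3, "compactness": 0.4, "majority_minority": 1.0}

public_plans = {
    "public_17": {"population_equality": 0.9, "competitiveness": 0.7,
                  "county_integrity": 0.6, "compactness": 0.5, "majority_minority": 1.0},
    "public_42": {"population_equality": 1.0, "competitiveness": 0.8,
                  "county_integrity": 0.7, "compactness": 0.6, "majority_minority": 1.0},
}

# For every pair of criteria, list the public plans that beat the adopted plan.
for pair in combinations(CRITERIA, 2):
    better = [name for name, plan in public_plans.items()
              if dominates(plan, adopted, pair)]
    if better:
        print(f"{pair}: adopted plan beaten by {', '.join(better)}")
```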

So why was the adopted plan chosen? Information visualization can help here, as well, but we need to add another criterion—partisan advantage:

Pareto Frontier: Standard Criteria vs. Democratic Surplus

When we visualize the number of Democratic seats expected to result from each plan, and compare this to the other scores, we can see that the adopted plan is the best at one thing: producing Republican seats.

Was Ohio gerrymandered? Applying our proposed gerrymandering detection method, the adopted plan stands in high contrast to the public sample of plans: even though the overall competition scoring formula is slightly biased toward the Democrats, the adopted plan is strongly biased toward the Republicans on any measure of partisan fairness. Moreover, analyzing the tradeoffs among redistricting criteria empirically demonstrates what is often suspected but typically impossible to prove: had the legislature desired to improve any good-government criterion, it could have done so simply by sacrificing some partisan advantage. In light of this new body of evidence, the political intent of the legislature is clearly displayed.

However, when politics and technology mix, beware of Kranzberg’s first law: “Technology is neither good nor bad; nor is it neutral.”[4] Indeed, there is an unexpected and hopeful lesson on reform revealed by the public participation that was enabled by new technology. The public plans show that, in Ohio, it is possible to improve expected competitiveness and compliance with traditional districting principles such as county integrity, without threatening majority-minority districts, simply by reducing partisan advantage—a tradeoff we should gladly accept.



[1] Altman M, McDonald MP. A Half-Century of Virginia Redistricting Battles: Shifting from Rural Malapportionment to Voting Rights to Public Participation. Richmond Law Review [Internet]. 2013;43(1):771-831.

[2] Altman M, McDonald M. Paradoxes Of Political Reform: Congressional Redistricting In Florida. In: Jigsaw Puzzle Politics in the Sunshine State. University Press of Florida; 2014.

[3] Altman, Micah and McDonald, Michael P., Redistricting by Formula: An Ohio Reform Experiment (June 3, 2014). Available at SSRN: http://ssrn.com/abstract=2450645

[4] Kranzberg, Melvin (1986) Technology and History: "Kranzberg's Laws", Technology and Culture, Vol. 27, No. 3, pp. 544-560.

Image Source: © Jonathan Ernst / Reuters
      
 
 





But Will It Work?: Implementation Analysis to Improve Government Performance

Executive Summary Problems that arise in the implementation process make it less likely that policy objectives will be achieved in many government programs. Implementation problems may also damage the morale and external reputations of the agencies in charge of implementation. Although many implementation problems occur repeatedly across programs and can be predicted in advance, legislators…

       





The President's 2015 R&D Budget: Livin' with the blues


On March 4, President Obama submitted his 2015 budget request to Congress. In keeping with the spending-cap deal agreed to with Congress last December, the level of federal R&D funding will remain flat; discounted for inflation, it is slightly lower. The requested R&D amount for 2015 is $135.4 billion, only $1.7 billion greater than in 2014. If we discount this 1.2% nominal increase by the expected inflation of 1.7%, we are confronting a roughly 0.5% decline in real terms.
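
As a rough check on that arithmetic, the short sketch below (a back-of-the-envelope illustration, not taken from the budget documents) deflates the nominal increase by the expected inflation rate, using only the figures quoted in the paragraph above.

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).
requested_2015 = 135.4       # billions of dollars, 2015 R&D request
increase = 1.7               # billions of dollars above the 2014 level
expected_inflation = 0.017   # 1.7% expected inflation

level_2014 = requested_2015 - increase
nominal_growth = increase / level_2014                       # ~1.2-1.3%, depending on the base used
real_growth = (1 + nominal_growth) / (1 + expected_inflation) - 1

print(f"2014 level:     ${level_2014:.1f} billion")
print(f"Nominal growth: {nominal_growth:+.1%}")
print(f"Real growth:    {real_growth:+.1%}")                 # negative: a small decline in real terms
```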

Reaction of the Research Community

The litany of complaints has started. The President’s Science and Technology Advisor, John Holdren, told AAAS: “This budget required a lot of tough choices. All of us would have preferred more.” The Association of American Universities, representing 60 top research universities, put out a statement declaring that this budget does “disappointingly little to close the nation’s innovation deficit,” defined as the gap between the appropriate level of R&D investment and current spending.

What’s more, compared to 2014, the budget request has kept funding for scientific research roughly even, but it has reallocated about $250 million from basic to applied research (see Table 1). Advocates of science have voiced their discontent. Take, for instance, the Federation of American Societies for Experimental Biology, which has called the request a “disappointment to the research community” because the President’s budget came $2.5 billion short of its recommendations.

Table 1. The President’s Research and Development Budget, 2015 (Source: OMB Budget 2015)

These complaints are fully expected and even justified: each interest group must defend its share of tax revenues. Sadly, in times of austerity, these protestations are toothless. If they were to have any traction in claiming a bigger piece of the federal discretionary pie, advocates would have to make a comparative case showing which budget lines must go down to make room for more R&D. But that line of argumentation could mean suicide for the scientific community, because it would throw it into direct political contest with other interests, and such contests are rarely decided by the merits of the cause but by the relative political power of interest groups. The science lobby is better off issuing innocuous hortatory pronouncements than picking political fights it cannot win.

Thus, the R&D slice is to remain pegged to the size of the total budget, which is not expected to grow any more than a bonsai in the coming years. The political accident of budget constraints is bound to change the scientific enterprise from within, not only in terms of the articulation of merits—meaning more precise and compelling explanations of the relative importance of disciplines and programs—but also in terms of a shrewd political contest among science factions.

     
 
 




pr

Responsible innovation: A primer for policymakers


Technical change is advancing at a breakneck speed while the institutions that govern innovative activity slog forward trying to keep pace. The lag has created a need for reform in the governance of innovation. Reformers who focus primarily on the social benefits of innovation propose to unmoor the innovative forces of the market. Conversely, those who deal mostly with innovation’s social costs wish to constrain it by introducing regulations in advance of technological developments. In this paper, Walter Valdivia and David Guston argue for a different approach to reform the governance of innovation that they call "Responsible Innovation" because it seeks to imbue in the actors of the innovation system a more robust sense of individual and collective responsibility.

Responsible innovation appreciates the power of free markets in organizing innovation and realizing social expectations, but it is self-conscious about the social costs that markets do not internalize. At the same time, the actions it recommends do not seek to slow down innovation: rather than constraining the set of options for researchers and businesses, they expand it. Responsible innovation is not a doctrine of regulation, much less an instantiation of the precautionary principle. Innovation and society can evolve down several paths, and the path forward is to some extent open to collective choice. The aim of a responsible governance of innovation is to make that choice more consonant with democratic principles.

Valdivia and Guston illustrate how responsible innovation can be implemented with three practical initiatives: 

  1. Industry: Incorporating values and motivations to innovation decisions that go beyond the profit motive could help industry take on a long-view of those decisions and better manage its own costs associated with liability and regulation, while reducing the social cost of negative externalities. Consequently, responsible innovation should be an integral part of corporate social responsibility, considering that the latter has already become part of the language of business, from the classroom to the board room, and that is effectively shaping, in some quarters, corporate policies and decisions.
  2. Universities and National Laboratories: Centers for Responsible Innovation, fashioned after the institutional reform of Internal Review Boards to protect human subjects in research and the Offices of Technology Transfer created to commercialize academic research, could organize existing responsible innovation efforts at university and laboratory campuses. These Centers would formalize the consideration of impacts of research proposals on legal and regulatory frameworks, economic opportunity and inequality, sustainable development and the environment, as well as ethical questions beyond the integrity of research subjects.
  3. Federal Government: First, federal policy should improve its protections and support of scientific research while providing mechanisms of public accountability for research funding agencies and their contractors. Demanding a return on investment for every research grant is a misguided approach that devalues research and undermines trust between Congress and the scientific community. At the same time, scientific institutions and their advocates should improve public engagement and demonstrate their willingness and ability to be responsive to societal concerns and expectations about the public research agenda. Second, if scientific research is a public good, then by definition markets are not effective at commercializing it. New mechanisms to develop practical applications from federal research with little market appeal should be introduced to counterbalance the emphasis the current technology transfer system places on research ready for the market. Third, federal innovation policy needs to be better coordinated with other federal policy, including tax, industrial, and trade policy as well as regulatory regimes. It should also improve coordination with initiatives at the local and state level to improve the outcomes of innovation for each region, state, and metro area.


The fair compensation problem of geoengineering


The promise of geoengineering is to place average global temperature under human control; it is thus considered a powerful instrument for the international community to deal with global warming. While great energy has been devoted to learning more about the natural systems that it would affect, questions of a political nature have received far less consideration. Taking as a given that regional effects will be asymmetric, the nations of the world will only give their consent to deploying this technology if they can be given assurances of a fair compensation mechanism, something like an insurance policy. The question of compensation reveals that the politics of geoengineering are far more difficult than the technical aspects.

What is Geoengineering?

In June 1991, Mount Pinatubo exploded, throwing a massive amount of volcanic sulfate aerosols into the high skies. The resulting cloud dispersed over weeks throughout the planet and cooled its temperature by an average of 0.5° Celsius over the next two years. If this kind of natural phenomenon could be replicated and controlled, then the possibility of engineering the Earth’s climate would be within reach.

Spraying aerosols in the stratosphere is one method of solar radiation management (SRM), a class of climate engineering that focuses on increasing the albedo, i.e. reflectivity, of the planet’s atmosphere. Other SRM methods include brightening clouds by increasing their content of sea salt. A second class of geo-engineering efforts focuses on carbon removal from the atmosphere and includes carbon sequestration (burying it deep underground) and increasing land or marine vegetation. Of all these methods, SRM is appealing for its effectiveness and low costs; a recent study put the cost at about $5 to $8 billion per year.1

Not only is SRM relatively inexpensive, but we already have the technological pieces that, assembled properly, would inject the skies with particles that reflect sunlight back into space. For instance, a fleet of modified Boeing 747s could deliver the necessary payload. Advocates of geoengineering are not too concerned about developing the technology to effect SRM, but about its likely consequences, not only in terms of slowing global warming but also in terms of its effects on regional weather. And therein lies the difficult question for geoengineering: the effects of SRM are likely to be unequally distributed across nations.

Here is one example of these asymmetries: Julia Pongratz and colleagues at the Department of Global Ecology of the Carnegie Institution for Science estimated a net increase in yields of wheat, corn, and rice under SRM-modified weather. However, the study also found a redistributive effect, with equatorial countries experiencing lower yields.2 We can then expect that equatorial countries will demand fair compensation to sign on to the deployment of SRM, which leads to two problems: how to calculate compensation, and how to agree on a compensation mechanism.

The calculus of compensation

What should be the basis for fair compensation? One view of fairness could be that, every year, all economic gains derived from SRM are pooled together and distributed evenly among the regions or countries that experience economic losses.

If the system pools gains from SRM and distributes them in proportion to losses, questions about the balance will only be asked in years in which gains and losses are about the same. But if losses are far greater than the gains, then this would be a form of insurance that cannot underwrite some of the incidents it intends to cover. People will not buy such an insurance policy; which is to say, some countries will not authorize SRM deployment. In the reverse case, if the pool has a large balance left after paying out compensation, then the winners from SRM will demand lower compensation taxes.
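
A stylized numerical sketch makes the underwriting problem concrete. Everything in it is hypothetical (the country labels and dollar figures are invented for illustration); it only shows what "pool the gains and pay losers in proportion to their losses" implies when total losses exceed total gains.

```python
# Stylized sketch of the pooling idea discussed above. Country labels and
# figures are hypothetical; the point is only the arithmetic of the scheme.

def compensate(outcomes):
    """Pool all gains and pay countries with losses in proportion to their losses."""
    pool = sum(v for v in outcomes.values() if v > 0)
    losses = {c: -v for c, v in outcomes.items() if v < 0}
    total_loss = sum(losses.values())
    if total_loss == 0:
        return {}
    return {c: pool * loss / total_loss for c, loss in losses.items()}

# Hypothetical net effects of SRM on annual crop income, in billions of dollars.
outcomes = {"A": +6.0, "B": +2.0, "C": -5.0, "D": -10.0}

for country, payment in compensate(outcomes).items():
    loss = -outcomes[country]
    print(f"{country}: loss {loss:.1f}, compensation {payment:.1f} ({payment / loss:.0%} covered)")

# Total gains (8.0) fall short of total losses (15.0), so each losing country
# is only about 53% compensated: the "insurance" cannot cover what it promises.
```

In the reverse case, with gains larger than losses, the same arithmetic leaves a surplus in the pool, which is precisely the situation in which, as noted above, the winners would press for lower compensation taxes.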

Further complicating the problem is the question of how to separate gains or losses that can be attributed to SRM from regional weather fluctuations. Separating the SRM effect could easily become an intractable problem because regional weather patterns are themselves affected by SRM.  For instance, any year that El Niño is particularly strong, the uncertainty about the net effect of SRM will increase exponentially because it could affect the severity of the oceanic oscillation itself. Science can reduce uncertainty but only to a certain degree, because the better we understand nature, the more we understand the contingency of natural systems. We can expect better explanations of natural phenomena from science, but it would be unfair to ask science to reduce greater understanding to a hard figure that we can plug into our compensation equation.

Still, greater complexity arises when separating SRM effects from policy effects at the local and regional level. Some countries will surely organize better than others to manage this change, and preparation will be a factor in determining the magnitude of gains or losses. Inherent to the problem of estimating gains and losses from SRM is the inescapable subjective element of assessing preparation. 

The politics of compensation

Advocates of geoengineering tell us that their advocacy is not about deploying SRM; rather, it is about better understanding the scientific facts before we even consider deployment. It’s tempting to believe that the accumulating science on SRM effects would be helpful. But when we consider the factors described above, it is quite possible that more science will also crystallize the uncertainty about exact amounts of compensation. The calculus of gain or loss, the difference between reality and a counterfactual of what regions and countries would otherwise have experienced, requires certainty, but science only yields irreducible uncertainty about nature.

The epistemic problems with estimating compensation are only to be compounded by the political contestation of those numbers. Even within the scientific community, different climate models will yield different results, and since economic compensation is derived from those models’ output, we can expect a serious contestation of the objectivity of the science of SRM impact estimation. Who should formulate the equation? Who should feed the numbers into it? A sure way to alienate scientists from the peoples of the world is to ask them to assert their cognitive authority over this calculus. 

What’s more, other parts of the compensation equation related to regional efforts to deal with SRM effect are inherently subjective. We should not forget the politics of asserting compensation commensurate to preparation effort; countries that experience low losses may also want compensation for their efforts preparing and coping with natural disasters.

Not only would a compensation equation be a sham, it would be unmanageable. Its legitimacy would always be in question. The calculus of compensation may seem a way to circumvent the impasses of politics and define fairness mathematically. Ironically, it is shot through with subjectivity; it is truly a political exercise.

Can we do without compensation?

Technological innovations are similar to legislative acts, observed Langdon Winner.3 Technical choices of the earliest stage in technical design quickly “become strongly fixed in material equipment, economic investment, and social habit, [and] the original flexibility vanishes for all practical purposes once the initial commitments are made.” For that reason, he insisted, "the same careful attention one would give to the rules, roles, and relationships of politics must also be given to such things as the building of highways, the creation of television networks, and the tailoring of seeming insignificant features on new machines."

If technological change can be thought of as legislative change, we must consider how such a momentous technology as SRM can be deployed in a manner consonant with our democratic values. Engineering the planet’s weather is nothing short of passing an amendment to Planet Earth’s Constitution. One pesky clause in that constitutional amendment is a fair compensation scheme. It seems so small a clause in comparison to the extent of the intervention, the governance of deployment and consequences, and the international commitments to be made as a condition for deployment (such as emissions mitigation and adaptation to climate change). But in the short consideration afforded here, we get a glimpse of the intractable political problem of setting up a compensation scheme. And yet, if the clause were not approved by a majority of nations, a fair compensation scheme has little hope of being consonant with democratic aspirations.


1McClellan, Justin, David W Keith, Jay Apt. 2012. Cost analysis of stratospheric albedo modification delivery systems. Environmental Research Letters 7(3): 1-8.

2Pongratz, Julia, D. B. Lobell, L. Cao, K. Caldeira. 2012. Nature Climate Change 2, 101–105.

3Winner, Langdon. 1980. Do artifacts have politics? Daedalus (109) 1: 121-136.

Image Source: © Antara Photo Agency / Reuters
      
 
 





Why Bernie Sanders vastly underperformed in the 2020 primary

Senator Bernie Sanders entered the 2020 Democratic primary race with a wind at his back. With a narrow loss to Hillary Clinton in 2016 and a massive political organization, Mr. Sanders set the tone for the policy conversation in the race. Soon after announcing, the Vermont senator began raising record amounts of money, largely online…

       





In administering the COVID-19 stimulus, the president’s role model should be Joe Biden

As America plunges into recession, Congress and President Donald Trump have approved a series of aid packages to assist businesses, the unemployed, and others impacted by COVID-19. The first three aid packages will likely be supplemented by at least a fourth package, as the nation’s leaders better understand the depth and reach of the economic…

       





With Sanders out, what’s next for the Democratic presidential race?

Following the withdrawal of Sen. Bernie Sanders from the 2020 presidential race, the Democrats' presumptive nominee for president will be former Vice President Joe Biden. Senior Fellow John Hudak examines how Sanders and other progressives have shifted mainstream Democratic positions, and the repercussions for the Democratic convention in August. He also looks at the leadership…

       





‘Essential’ cannabis businesses: Strategies for regulation in a time of widespread crisis

Most state governors and cannabis regulators were underprepared for the COVID-19 pandemic, a crisis that is affecting every economic sector. But because the legal cannabis industry is relatively new in most places and still evolving everywhere, the challenges are even greater. What’s more, there is no history that could help us understand how the industry will endure the current economic situation. And so, in many…

       





Policy insights from comparing carbon pricing modeling scenarios

Carbon pricing is an important policy tool for reducing greenhouse gas pollution. The Stanford Energy Modeling Forum exercise 32 convened eleven modeling teams to project emissions, energy, and economic outcomes of an illustrative range of economy-wide carbon price policies. The study compared a coordinated reference scenario involving no new policies with policy scenarios that impose…

       





Leading carbon price proposals: A bipartisan dialogue

Economists overwhelmingly recommend a price on carbon as a way to control the risk of climatic disruption. A fee on carbon dioxide and other greenhouse gas emissions would shift the relative prices of different sources of energy and other goods by an amount that depends on how damaging they are to the earth’s climate. A…

       





Why local governments should prepare for the fiscal effects of a dwindling coal industry

       





Webinar: Reopening the coronavirus-closed economy — Principles and tradeoffs

In an extraordinary response to an extraordinary public health challenge, the U.S. government has forced much of the economy to shut down. We now face the challenge of deciding when and how to reopen it. This is both vital and complicated. Wait too long—maintain the lockdown until we have a vaccine, for instance—and we’ll have another Great Depression. Move too soon, and we…

       





The false promise of ‘pro-American’ autocrats

U.S. efforts to promote democracy in the Middle East have long been paralyzed by a unique “Islamist dilemma”: We want democracy in theory but fear its outcomes in practice. In this case, the outcomes that we fear are Islamist parties either doing well in elections or winning them outright. If we would like to (finally)…

       





Why a Trump presidency could spell big trouble for Taiwan


Presumptive Republican presidential nominee Donald Trump’s idea to withdraw American forces from Asia—letting allies like Japan and South Korea fend for themselves, including possibly by acquiring nuclear weapons—is fundamentally unsound, as I’ve written in a Wall Street Journal op-ed.

Among the many dangers of preemptively pulling American forces out of Japan and South Korea, including an increased risk of war between Japan and China and a serious blow to the Nuclear Non-Proliferation Treaty, such a move would heighten the threat of war between China and Taiwan. The possibility that the United States would dismantle its Asia security framework could unsettle Taiwan enough that it would pursue a nuclear deterrent against China, as it has considered doing in the past—despite China indicating that such an act itself could be a pathway to war. And without bases in Japan, the United States could not as easily deter China from potential military attacks on Taiwan. 

Trump’s proposed Asia policy could take the United States and its partners down a very dangerous road. It’s an experiment best not to run.

      
 
 





President Obama’s role in African security and development


Event Information

July 19, 2016
10:00 AM - 11:30 AM EDT

Falk Auditorium
Brookings Institution
1775 Massachusetts Avenue NW
Washington, DC 20036


Barack Obama’s presidency has witnessed widespread change throughout Africa. His four trips there, spanning seven countries, reflect his belief in the continent’s potential and importance. African countries face many challenges that span issues of trade, investment, and development, as well as security and stability. With President Obama’s second term coming to an end, it is important to begin to reflect on his legacy and how his administration has helped frame the future of Africa.

On July 19, the Center for 21st Century Security and Intelligence at Brookings hosted a discussion on Africa policy. Matthew Carotenuto, professor at St. Lawrence University and author of “Obama and Kenya: Contested Histories and the Politics of Belonging” (Ohio University Press, 2016) discussed his research in the region. He was joined by Sarah Margon, the Washington director of Human Rights Watch. Brookings Senior Fellow Michael O'Hanlon partook in and moderated the discussion.


Podcast | Prachi Singh talks about the impact of air pollution on child health and GDP

       





Podcast: Oil’s not well – How the drastic fall in prices will impact South Asia

       





Can Trump count on Manila to put pressure on North Korea? 3 points to know.

       





Counterterrorism and Preventive Repression: China’s Changing Strategy in Xinjiang

       





Examining Xinjiang: Past, present, and future

In recent months, media reports have described in detail the systematic nature of Chinese government directives to clamp down on ethnic Uighurs in Xinjiang. China’s actions in Xinjiang have generated international criticism from dozens of countries. The Chinese government has defended its policy, saying that it is necessary for ensuring social stability. What are the…

       





Understanding China’s ‘preventive repression’ in Xinjiang

The Chinese Communist Party (CCP) crackdown on Uighur and other Muslim minorities in the Xinjiang Uighur Autonomous Region (XUAR) has attracted intense scrutiny and polarized the international community. At least 1 million people, maybe as many as 1.5 million, have been detained in a large network of recently constructed camps, where they undergo forced reeducation and political indoctrination.…

       





Trans-Atlantic Scorecard – April 2020

Welcome to the seventh edition of the Trans-Atlantic Scorecard, a quarterly evaluation of U.S.-European relations produced by Brookings’s Center on the United States and Europe (CUSE), as part of the Brookings – Robert Bosch Foundation Transatlantic Initiative. To produce the Scorecard, we poll Brookings scholars and other experts on the present state of U.S. relations…

       





What are the prospects for the Cyber Threat Intelligence Integration Center?

Last week we learned that the federal government plans to create a Cyber Threat Intelligence Integration Center (CTIIC). There is some confusion about the purpose of this agency, especially as it relates to the National Cybersecurity and Communications Integration Center (NCCIC) and the United States Computer Emergency Readiness Team (US-CERT). While I am not a…

       





New cybersecurity mantra: “If you can’t protect it, don’t collect it”

In early August I attended my 11th Black Hat USA conference in sunny Las Vegas, Nevada. Black Hat is the somewhat more corporate sibling of the annual DEF CON hacker convention, which follows Black Hat. Since my first visit to both conferences in 2002, I’ve kept tabs on the themes expressed by computer security practitioners.…

       





The World Bank Group’s Mission to End Extreme Poverty: A conversation with President Jim Yong Kim

Ahead of the World Bank Group and International Monetary Fund annual meetings being held in Washington, DC from October 7 to 9, World Bank President Jim Yong Kim set out his vision for ending extreme poverty by 2030 and boosting shared prosperity. He spoke about the links between growth, poverty and inequality, the changing face of […]

      
 
 





The American presidential election and implications for U.S.-R.O.K. relations

My thanks to the hosts and organizers of this conference. Many of you have heard other American speakers talk about our election this morning—Vice President Cheney, Wendy Sherman, and David Rubenstein. As we open our afternoon session, let me offer some historical perspective. American presidential campaigns are, in a sense, like the Olympics: they happen […]

      
 
 





A homage to my Brookings colleague and former professor Hal Sonnenfeldt

Hal Sonnenfeldt was a tough, direct, exceedingly knowledgeable professor whose classes students wanted to attend. But in 1961, it wasn’t easy to get into his Soviet foreign policy class at the Johns Hopkins School of Advanced International Studies (SAIS). Students were first expected to take his earlier course on the domestic Soviet Union, which I…

       





Predicting the impact of college subsidy programs on college enrollment

There is currently a great deal of interest in the potential of college subsidy programs to increase equitable access to higher education and to reduce the financial burden on college attendees. While colleges may be subsidized in a variety of ways, such as through grants to institutions, in our latest Brookings report, we focus on college subsidy programs that directly…

       





Obama in China: Preserving the Rebalance

This November, after focusing on foreign policy concerns around the globe and congressional midterm elections at home, President Barack Obama will travel to Beijing to attend the APEC Economic Leaders’ Meeting in hopes of preserving and enhancing one of his key foreign policy achievements—the rebalance to Asia. Obama’s trip to China will be his first…

       





Power and problem solving top the agenda at Global Parliament of Mayors

When more than 40 mayors from cities around the world gathered in the fjordside city of Stavanger, Norway for the second Global Parliament of Mayors, two topics dominated the discussions: power and problem solving. The agenda included the usual sweep through the most pressing issues cities face today -- refugee resettlement, safety and security, resilience…

       









Trans-Atlantic Scorecard – April 2020

Welcome to the seventh edition of the Trans-Atlantic Scorecard, a quarterly evaluation of U.S.-European relations produced by Brookings’s Center on the United States and Europe (CUSE), as part of the Brookings – Robert Bosch Foundation Transatlantic Initiative. To produce the Scorecard, we poll Brookings scholars and other experts on the present state of U.S. relations…

       





Scaling Up Development Interventions: A Review of UNDP's Country Program in Tajikistan

A key objective of the United Nations Development Programme (UNDP) is to assist its member countries in meeting the Millennium Development Goals (MDGs). UNDP pursues this objective in various ways, including through analysis and advice to governments on progress toward the MDGs (such as support for the preparation and monitoring of Poverty Reduction Strategies, or PRSs, in poor countries), assistance for capacity building, and financial and technical support for the preparation and implementation of development programs.

The challenge of achieving the MDGs remains daunting in many countries, including Tajikistan. To do so will require that all development partners, i.e., the government, civil society, private business and donors, make every effort to scale up successful development interventions. Scaling up refers to “expanding, adapting and sustaining successful policies, programs and projects in different places and over time to reach a greater number of people.” Interventions that are successful as pilots but are not scaled up will create localized benefits for a small number of beneficiaries, but they will fail to contribute significantly to closing the MDG gap.

This paper aims to assess whether and how well UNDP is supporting scaling up in its development programs in Tajikistan. While the principal purpose of this assessment was to assist the UNDP country program director and his team in Tajikistan in their scaling up efforts, it also contributes to the growing body of evidence on the scaling up of development interventions worldwide.

Downloads

     
 
 





Charting a New Course for the World Bank: Three Options for its New President


Since its 50th anniversary in 1994, the World Bank has been led by four presidents: Lewis Preston until his untimely death in 1995; then James Wolfensohn, who gave the institution new energy, purpose and legitimacy; followed by Paul Wolfowitz, whose fractious management tossed the World Bank into deep crisis; and most recently, Robert Zoellick, who will be remembered for having stabilized the bank and provided effective leadership during its remarkably swift and strong response to the global financial crisis.

Throughout these years of ups and downs in the bank’s leadership, standing and lending, the overall trend of its global role was downhill. While it remains one of the world’s largest multilateral development finance institutions, its position relative to other multilateral financing mechanisms is now much less prominent. Other multilateral institutions have taken over key roles. For example, the European Union agencies and the regional development banks have rapidly expanded their portfolios, and new “vertical funds” such as the Global Fund to Fight AIDS, Tuberculosis and Malaria have become major funding vehicles. At the same time, according to a 2011 OECD Development Assistance Committee report, multilateral aid has declined as a share of total aid. Meanwhile, non-governmental aid flows have dramatically increased, including those from major foundations like the Bill and Melinda Gates Foundation, but also from new internet-based channels bundling small individual donations, such as Global Giving. The World Bank, which 20 years ago was still the biggest and most powerful global development agency and hence a ready target for criticism, today is just one of many institutions that offer financing for development to poor and emerging market economies.

Against this backdrop, the World Bank, its members and Dr. Kim face three options for the bank’s long-term trajectory over the next 10 to 20 years: 1) the bank can continue on its current path of gradual decline; 2) it might be radically scaled back and eventually eliminated, as other aid channels take over; or 3) it can dramatically reinvent itself as a global finance institution that bundles resources for growing global needs.

There is no doubt in this author’s mind that the World Bank should remain a key part of the global governance architecture, but that requires that the new president forge an ambitious long-term vision for the bank – something that has been lacking for the last 30 years – and then reform the institution and build the authorizing environment that will make it possible to achieve the vision.

Option 1: “Business as Usual” = Continued Gradual Decline

The first option, reflecting the business-as-usual approach that characterized most of the Zoellick years of leadership, will mean that the bank gradually continues to lose scope, funding and relevance. Its scope will be reduced since the emerging market economies find the institution insufficiently responsive to their needs. They have seen the regional development banks take on increasing importance, as reflected in the substantially greater capital increases in recent years for some of these institutions than for the World Bank in relative terms (and, in the case of the Asian Development Bank, even in absolute terms). And emerging market economies have set up their own thriving regional development banks without participation of the industrial countries, such as the Corporación Andina de Fomento (CAF) in Latin America and the Eurasian Development Bank in the former Soviet Union. This trend will be reinforced by the creation of a “South Bank” or “BRIC Bank,” an initiative that is currently well underway.

At the same time, the World Bank’s soft loan window, the International Development Association (IDA), will face less support from industrial countries going through deep fiscal crises, heightened competition from other concessional funds, and a perception of reduced need, as many of the large and formerly poor developing countries graduate to middle-income status. It is significant that for the last IDA replenishment much of the increase in resources was due to its growing reliance on advance repayments made by some of its members and commitments against future repayments, thus in effect mortgaging its future financial capacity. The World Bank’s status as a knowledge leader in development will also continue to be challenged with the rise of research from developing countries and growing think tank capacity, as well as a proliferation of private and official agencies doling out advice and technical assistance.

As a result, under this option, over the next 10 to 20 years the World Bank will likely become no more than a shadow of the preeminent global institution it once was. It will linger on but will not be able to contribute substantially to addressing any of the major global financial, economic or social challenges of the future.
 
Option 2: “The Perfect Storm” = Breaking Up the World Bank

In 1998, the U.S. Congress established a commission to review and advise on the role of the international financial institutions. In 2000, the commission, led by Professor Allan Meltzer, released its recommendations, which included far-reaching changes for the International Monetary Fund and the World Bank, most of them designed to reduce the scope and financial capacities of these institutions in line with the conservative leanings of the majority of the commission’s members. For the World Bank, the “Meltzer Report” called for much of its loan business and financial assets to be devolved to the regional development banks, in effect ending the life of the institution as we know it. The report garnered some attention when it was first issued, but had little impact on the way the institution was run over the following 10 years.

In 2010, the U.S. Senate Foreign Relations Committee released a report on the international financial institutions, which called on them to aim toward “succeeding in their development and economic missions and thereby putting them out of business.” However, it did not recommend a drastic restructuring of the multilateral development banks, and instead argued strongly against any dilution of the U.S. veto right, its lock on leadership selection, and its voting share at the IMF and World Bank. While not dramatic in their short-term impact, these recommendations were likely a strong factor in the Obama administration’s subsequent decisions to oppose a substantial increase in contributions by emerging markets during the latest round of capital increases at the World Bank and to push for an American to replace Robert Zoellick as World Bank president. These actions reinforced the perception among emerging market countries that the World Bank would not change sufficiently and quickly enough to serve their interests, and thus helped create momentum for setting up a new “South Bank.”

While there seems to be no imminent risk of a break-up of the World Bank along the lines recommended by the Meltzer Report, the combination of fiscal austerity and conservative governments in key industrial countries, compounded by a declining interest of the emerging market countries in sustaining the institution’s future, could create the perfect storm for the bank. Specifically, as governments face constrained fiscal resources, confront the increasing fragmentation of the multilateral aid architecture, and take steps to consolidate their own aid agencies, they might conclude that it would be more efficient and fiscally prudent to rationalize the international development system. There is an obvious overlap on the ground between the day-to-day business of the World Bank and that of the regional development banks. This reality is being fostered by the growing decentralization of the World Bank into regional hubs; in fact, a recent evaluation by the World Bank’s Independent Evaluation Group concluded that “[r]ather than functioning as a global institution, the bank is at risk of evolving into six regional banks.” With the growing financial strength, institutional capacity and dynamism, and the apparently greater legitimacy of regional development banks among their regional members, shareholders might eventually decide that consolidating the World Bank’s operations with those of the regional development banks, in favor of the latter, is the preferred approach.

There are many reasons to think that this drastic step would be difficult to take politically, financially and administratively, and that the inertia common to the international governance architecture will therefore prevail in this case as well. However, the new World Bank president would be well advised to be prepared for the possibility of a “perfect storm” under which the idea of eviscerating the World Bank could gain some traction. The more the bank is seen to fade away, as postulated under Option 1 above, the greater the likelihood that Option 2 would be given serious consideration.

Option 3: “A Different World Bank” = Creating a Stronger Global Institution for the Coming Decades

Despite all the criticism and the decline in its relative role as a development finance institution in recent decades, the World Bank is still one of the strongest and most effective development institutions in the world. According to a recent independent ranking of the principal multilateral and bilateral aid institutions by the Brookings Institution and the Center for Global Development, “IDA consistently ranks among the best aid agencies in each dimension of quality.”

A third option, radically different from the first two, would build on this strength and ensure that 10 to 20 years from now the world has an institution that helps the global community and individual countries respond effectively to the many global challenges the world will undoubtedly face: continued poverty, hunger, conflict and fragility, major infrastructure and energy needs, education and health challenges, and global warming and environmental challenges. On top of this, global financial crises will likely recur and require institutions like the World Bank to help countries provide safety nets and the structural foundations of long-term growth, as the bank has amply demonstrated since 2008. With this as a broad mandate, how could the World Bank respond under a new dynamic?

First, it would change its organizational and operating modalities to take a leaf out of the book of the vertical funds, which have been so successful in tackling major development challenges in a focused and scaled-up manner. This means substantially rebalancing the internal matrix between the regional and country departments on the one hand and the technical departments on the other. According to the same evaluation cited above, the World Bank has tipped too far toward short-term country priorities and has failed to adequately reflect the need for long-term, dedicated sectoral engagement. The World Bank needs to fortify its reputation as an institution that can muster the strongest technical expertise, fielding teams with broad global experience and first-rate regional and country perspectives. This does not imply that the World Bank would abandon its engagement at the country level, but it means that it would systematically support the pursuit of long-term sectoral and sub-sectoral strategies at the country level, linked to regional and global initiatives, and involving public-private partnerships to ensure that development challenges are addressed at scale and in a sustained manner.

Second, recognizing that all countries have unmet needs for which they need long-term finance and best practice in areas such as infrastructure, energy, climate change and environment, the World Bank could become a truly global development institution by opening up its funding windows to all countries, not just an arbitrarily defined subset of developing countries. This would require substantially revising the current graduation rules and possibly the financial instruments. This would mean that the World Bank becomes the global equivalent of the European Investment Bank (EIB) and of the German Kreditanstalt fuer Wiederaufbau (KfW)—development banks that have successfully supported the infrastructure development of the more advanced countries.

Third, the World Bank would focus its own knowledge management activities and support for research and development in developing countries much more on a search for effective and scalable solutions, linked closely to its operational engagement, which would be specifically designed to support the scaling up of tested innovations, along the lines pioneered by the Bill and Melinda Gates Foundation.

Fourth, for those countries with strong project management capacities, the World Bank would dramatically simplify its lending processes, following the example of the EIB. This would make it a much more efficient operational institution and a more attractive partner to its borrowing member countries, especially the emerging market economies.

Fifth, the membership of the World Bank would fix some fundamental problems with its financial structure and governance. It would invite the emerging market economies to make significantly larger contributions to its capital base in line with their much-enhanced economic and financial capacities. It would revamp the bank’s voting and voice rules to reflect the changed global economic weights and financial contributions of emerging markets. The bank would also explore, based on the experience of the vertical funds, tapping the resources of non-official partners, such as foundations and the private sector as part of its capital and contribution base. Of course, this would bring with it further significant changes in the governance of the World Bank. And the bank would move swiftly to a transparent selection of its leadership on the basis of merit without reference to nationality.

Conclusion: The New World Bank President Needs to Work with the G-20 Leaders to Chart a Course Forward
 
The new president will have to choose among these three options. Undoubtedly, the easiest choice is “business-as-usual,” perhaps embellished with some marginal changes that reflect the perspective and new insights that an outsider will bring. There is no doubt that the forces of institutional and political inertia tend to prevent dramatic change. However, it is also possible that Dr. Kim, with his background in a relatively narrow sectoral area, may recognize the need for a more vertical approach in the bank’s organizational and operational model. Therefore, he may be more inclined than others to explore Option 3.

If he pursues Option 3, Dr. Kim will need a lot of help. The best place to look for help might be the G-20 leadership. One could hope that at least some of the leaders of the G-20 understand that Options 1 and 2 are not in the interest of their countries and the international community. Hopefully, they would be willing to push their peers to contemplate some radical changes in the multilateral development architecture. This might involve setting up a high-level commission, as recently recommended by this author, which would review the future of the World Bank as part of a broader approach to rationalizing the multilateral system in the interest of greater efficiency and effectiveness. But in setting up such a commission, the G-20 should state a clear objective, namely that the World Bank, perhaps the strongest existing global development institution, should not be gutted or gradually starved out of existence. Instead, it needs to be remade into a focused, effective and truly global institution. If Dr. Kim embraces this vision and develops actionable ideas for the commission and the G-20 leaders to consider and support, then he may bring the right medicine for an ailing giant.

Image Source: © Issei Kato / Reuters
     
 
 





Whither the G-20: Proposals for a Focused Agenda

Johannes Linn argues that the novelty of the G-20 forum has worn off since leaders first met almost four years ago. With legacy issues from previous summits now crowding the agenda, Linn proposes that the G-20 needs a focused agenda that keeps leaders’ attention on the critical longer-term issues, even as it grapples with the short-term crises of the day.
Publication: The G-20 Los Cabos Summit 2012: Bolstering the World Economy Amid Growing Fears of Recession
     
 
 





Scaling Up Programs for the Rural Poor: IFAD's Experience, Lessons and Prospects (Phase 2)


The challenge of rural poverty and food insecurity in the developing world remains daunting. Recent estimates show that “there are still about 1.2 billion extremely poor people in the world. In addition, about 870 million people are undernourished, and about 2 billion people suffer from micronutrient deficiency. About 70 percent of the world’s poor live in rural areas, and many have some dependency on agriculture” (Cleaver 2012). Addressing this challenge by assisting rural small-holder farmers in developing countries is the mandate of the International Fund for Agricultural Development (IFAD), an international financial institution based in Rome.

The International Fund for Agricultural Development is a relatively small donor in the global aid architecture, accounting for approximately one-half of 1 percent of all aid paid directly to developing countries in 2010. Although more significant in its core area of agricultural and rural development, IFAD still accounts for less than 5 percent of total official development assistance in that sector. Confronted with the gap between its small size and the large scale of the problem it has been mandated to address, IFAD seeks ways to increase its impact for every dollar it invests in agriculture and rural development on behalf of its member states. One indicator of this intention to scale up is that it has set a goal to reach 90 million rural poor between 2012 and 2015 and lift 80 million out of poverty during that time. These numbers are roughly three times the number of poor IFAD has reached previously during a similar time span. More generally, IFAD has declared that scaling up is “mission critical,” and this scaling-up objective is now firmly embedded in its corporate strategy and planning statements. Also, increasingly, IFAD’s operational practices are geared towards helping its clients achieve scaling up on the ground with the support of its loans and grants.

This was not always the case. For many years, IFAD stressed innovation as the key to success, giving little attention to systematically replicating and building on successful innovations. In this regard, IFAD was not alone. In fact, few aid agencies have systematically pursued the scaling up of successful projects. However, in 2009, IFAD management decided to explore how it could increase its focus on scaling up. It gave a grant to the Brookings Institution to review IFAD’s experience with scaling up and to assess its operational strategies, policies and processes with a view to strengthening its approach to scaling up. Based on an extensive review of IFAD documentation, two country case studies and intensive interactions with IFAD staff and managers, the Brookings team prepared a report that it submitted to IFAD management in June 2010 and published as a Brookings Global Working Paper in early 2011 (Linn et al. 2011).

Download the paper (PDF) »

Downloads

Authors

Image Source: © Andrew Biraj / Reuters
     
 
 





It’s time for the multilateral development banks to fix their concessional resource replenishment process


The replenishment process for concessional resources of the multilateral development banks is broken. We have come to this conclusion after a review of the experience with recent replenishments of multilateral development funds. We also base it on first-hand observation, since one of us was responsible for the World Bank’s International Development Association (IDA) replenishment consultations 20 years ago and recently served as the external chair for the last two replenishment consultations of the International Fund for Agricultural Development (IFAD), which closely follow the common multilateral development bank (MDB) practice. As many of the banks and their donors are preparing for midterm reviews as a first step toward the next round of replenishment consultations, this is a good time to take stock and consider what needs to be done to fix the replenishment process.

So what’s the problem?

Most of all, the replenishment process does not serve its key intended function of setting overall operational strategy for the development funds and holding the institutions accountable for effectively implementing the strategy. Instead, the replenishment consultations have turned into a time-consuming and costly process in which donor representatives from their capitals get bogged down in the minutiae of institutional management that are better left to the boards of directors and the managements of the MDBs. There are other problems, including lack of adequate engagement of recipient countries in donors’ deliberations, the lack of full participation of the donors’ representatives on the boards of the institutions in the process, and inflexible governance structures that serve as a disincentive for non-traditional donors (from emerging countries and from private foundations) to contribute.

But let’s focus on the consultation process. What does it look like? Typically, donor representatives from capitals assemble every three years (or four, in the case of the Asian Development Bank) for a year-long consultation round, consisting of four two-day meetings (including the meeting devoted to the midterm review of the ongoing replenishment and to setting the agenda for the next consultation process). For these meetings, MDB staff prepare, per consultation round, some 20 substantive documents that are intended to delve into operational and institutional performance in great detail. Each consultation round produces a long list of specific commitments (around 40 commitments is not uncommon), which management is required to implement and monitor, and report on in the midterm review. In effect, however, this review covers only half the replenishment cycle, which leads to the reporting, monitoring, and accountability being limited to the delivery of committed outputs (e.g., a specific sector strategy) with little attention paid to implementation, let alone outcomes.

The process is eerily reminiscent of the much maligned “Christmas tree” approach of the World Bank’s structural adjustment loans in the 1980s and 1990s, with their detailed matrixes of conditionality; lack of strategic selectivity and country ownership; focus on inputs rather than outcomes; and lack of consideration of the borrowers’ capacity and costs of implementing the Bank-imposed measures. Ironically, the donors successfully pushed the MDBs to give up on such conditionality (without ownership of the recipient countries) in their loans, but they impose the same kind of conditionality (without full ownership of the recipient countries and institutions) on the MDBs themselves—replenishment after replenishment.

Aside from lack of selectivity, strategic focus, and ownership of the commitments, the consultation process is also burdensome and costly in terms of the MDBs’ senior management and staff time as well as time spent by ministerial staff in donor capitals, with literally thousands of management and staff hours spent on producing and reviewing documentation. And the recent innovation of having donor representatives meet between consultation rounds as working groups dealing with long-term strategic issues, while welcome in principle, has imposed further costs on the MDBs and capitals in terms of preparing documentation and meetings.

It doesn’t have to be that way. Twenty years ago the process was much simpler and less costly. Even today, recent MDB capital increases, which mobilized resources for the non-concessional windows of the MDBs, were achieved with much simpler processes, and the replenishment consultations for special purpose funds, such as the Global Fund to Fight AIDS, Tuberculosis and Malaria and the GAVI Alliance, are more streamlined than those of the MDBs.

So what’s to be done?

We recommend the following measures to fix the replenishment consultation process:

  1. Focus on a few strategic issues and reduce the number of commitments with an explicit consideration of the costs and capacity requirements they imply. Shift the balance of monitoring and accountability from delivery of outputs to implementation and outcomes.
  2. Prepare no more than five documents for the consultation process: (i) a midterm review on the implementation of the previous replenishment and key issues for the future; (ii) a corporate strategy or strategy update; (iii) the substantive report on how the replenishment resources will contribute to achieve the strategy; (iv) a financial outlook and strategy document; and (v) the legal document of the replenishment resolution.
  3. Reduce the number of meetings for each replenishment round to no more than three and lengthen the replenishment period from three to four years or more.
  4. Use the newly established working group meetings between replenishment consultation rounds to focus on one or two long-term, strategic issues, including how to fix the replenishment process.

The initiative for such changes lies with the donor representatives in the capitals, and from our interviews with donor representatives we understand that many of them broadly share our concerns. So this is a good time—indeed it is high time!—for them to act.

Authors

      
 
 





Getting millions to learn: What will it take to accelerate progress on meeting the Sustainable Development Goals?


Event Information

April 18-19, 2016

Falk Auditorium
Brookings Institution
1775 Massachusetts Avenue NW
Washington, DC 20036

Register for the Event


In 2015, 193 countries adopted the Sustainable Development Goals (SDGs), a new global agenda that is more ambitious than the preceding Millennium Development Goals and aims to make progress on some of the most pressing issues of our time. Goal 4, "To ensure inclusive and quality education for all, with relevant and effective learning outcomes," challenges the international education community to meet universal access plus learning by 2030. We know that access to primary schooling has scaled up rapidly over previous decades, but what can be learned from places where transformational changes in learning have occurred? What can governments, civil society, and the private sector do to more actively scale up quality learning?

On April 18-19, the Center for Universal Education (CUE) at Brookings launched "Millions Learning: Scaling Up Quality Education in Developing Countries," a comprehensive study that examines where learning has improved around the world and what factors have contributed to that process. This two-day event included two sessions. Monday, April 18 focused on the role of global actors in accelerating progress to meeting the SDGs. The second session on Tuesday, April 19 included a presentation of the Millions Learning report followed by panel discussions on the role of financing and technology in scaling education in developing countries.

 Join the conversation on Twitter #MillionsLearning

Video

Audio

Transcript

Event Materials