How the Small Business Investment Company Program Can Better Support America's Advanced Industries
Published June 26, 2019

On June 26, Brookings Metro Senior Fellow and Policy Director Mark Muro testified to the Senate Committee on Small Business and Entrepreneurship about the need to reauthorize the Small Business Administration (SBA), and in particular to better position the Small Business Investment Company (SBIC) program to support America's advanced industry sector.
Economic Growth and Institutional Innovation: Outlines of a Reform Agenda
Published June 1, 2010

Policy Brief #172

Why Institutions Matter

When experts and pundits are asked what the president and Congress should do to promote economic growth, they typically respond with a list of policies, often mixed with stylistic and political suggestions. Few focus on institutional change, which is too easy to conflate with yawn-inducing "governmental reorganization."

This neglect of institutions is always a mistake, and never more so than in times of crisis. Throughout American history, profound challenges have summoned bursts of institutional creativity, with enduring effects. The dangerous inadequacies of the Articles of Confederation set the stage for a new Constitution. The Civil War resulted in three amendments that resolved—at least in principle—our founding ambivalence between the people and the states as the source of national authority, between the states and the nation as the locus of citizenship, and between slavery and the equality the Declaration of Independence had proclaimed and promised. Similarly, the Federal Reserve Board, the Bretton Woods international economic system, the Department of Defense, the National Security Council, the CIA, the Congressional Budget Office and the Department of Homeland Security all arose through changes occasioned by great challenges to the nation.

Today's economic crisis is reflected in three distinct but linked deficits—the fiscal deficit, the savings deficit and the investment deficit. Meeting these challenges and laying the foundation for sustained economic growth will require institutional as well as policy changes.

RECOMMENDATIONS

Today's economic crisis is characterized by three distinct but linked deficits—the fiscal deficit, the savings deficit and the investment deficit.
Meeting these challenges and laying the foundation for sustained economic growth will require institutional as well as policy changes. The following institution-based recommendations would help the nation meet the current economic crisis and could help prevent future crises of similar destructiveness.

- To promote fiscal sustainability, change long-term budget procedures and create empowered commissions—answerable to Congress but largely insulated from day-to-day politics.
- To boost savings, consider new mandatory individual retirement accounts as a supplement to Social Security.
- To improve public investment, create a National Infrastructure Bank with public seed capital—this entity would mobilize private investment and force proposed projects to pass rigorous cost-benefit analysis as well as a market test.

Today's polarized political system is an obstacle to reform in every area, including the economy. A multi-year collaboration between Brookings and the Hoover Institution produced a series of suggestions, at least two of which are worth adopting:

- Alter redistricting authority so that state legislatures can no longer practice gerrymandering.
- Experiment, in a few willing states, with compulsory voting—to move politicians away from the red-meat politics of appealing only to their bases, which now dominates elections, and toward a more moderate and consensual politics.

Institutional Reform

Promoting fiscal sustainability

Setting the federal budget on a sustainable course is an enormous challenge. If we do nothing, we will add an average of nearly $1 trillion to the national debt every year between now and 2020, raising the debt/GDP ratio to a level not seen since the early 1950s and sending the annual cost of servicing the debt sky-high. Restoring pay-as-you-go budgeting and putting some teeth in it are a start, but not nearly enough. We need radical changes in rules and procedures.
One option, recently proposed by a bipartisan group that includes three former directors of the Congressional Budget Office, would change the giant entitlement programs: Social Security, Medicare and Medicaid. The new rules would require a review every five years to determine whether projected revenues and outlays are in balance. If not, Congress would be required to restore balance through dedicated revenue increases, benefit cuts or a combination of the two. After a financial crisis in the early 1990s, Sweden introduced a variant of this plan, which has worked reasonably well.

A number of Brookings scholars—including Henry Aaron, Gary Burtless, William Gale, Alice Rivlin and Isabel Sawhill—have suggested a value-added tax (VAT) as part of a program of fiscal and tax reform. Burtless offers an intriguing proposal that would link a VAT to health care finance. Revenue from the VAT would be dedicated to—and would cover—the federal share of health care programs. If federal costs rise faster than proceeds from the VAT, Congress would have to either raise the VAT rate or cut programs back to fit the flow of funds. The system would become much more transparent and accountable: because the VAT rate would appear on every purchase, citizens could see for themselves the cost of federal support for health care, and they could tell their representatives what balance they prefer between increased rates and reduced health care funding.

Another option draws on the experience of the Base Realignment and Closure Commission, which enables the military to surmount NIMBY politics and shut down unneeded bases. The basic idea is straightforward: once the independent commission settles on a list of proposed closures, Congress has the option of voting it up or down without amendment. A similar idea undergirds the president's "fast-track" authority to negotiate proposed trade treaties, which Congress can reject but cannot modify.
Suitably adapted, this concept could help break longstanding fiscal logjams. Here is one way it might work. Independent commissions with members from both political parties could submit proposals in designated areas of fiscal policy. To increase bipartisan appeal, each proposal would require a super-majority of the commission. In the House and Senate, both the majority and the minority would have the opportunity to offer only a single amendment. This strategy of "empowered commissions" changes the incentive structure in Congress, reducing the negative logrolling that undermines the prospects of proposals that would otherwise gain majority support.

Empowered commissions represent a broader strategy—using institutional design to insulate certain activities from regular and direct political pressure. For example, the Constitution mandates that federal judges, once confirmed, hold office during "good behavior" and receive salaries that Congress may not reduce during their term of service. (By contrast, many states subject judges to regular election and possible recall.) In another striking example, members of the Board of Governors of the Federal Reserve System are appointed to 14-year non-renewable terms, limiting the ability of the executive branch to change the Board's membership rapidly and removing governors' incentives to trim their policy sails in hopes of reappointment. Moreover, implementing the Fed's decisions requires no action by the president or any other entity in the executive branch, and Fed chairmen have been known to take steps that vex the Oval Office.

This strategy is controversial. Officials with populist leanings often argue that fundamental decisions affecting the economy should be made through transparent democratic processes. The counterargument: experience dating back to the founding of the republic suggests that when interest rates and the money supply are set at the whim of transient majorities, economic growth and stability are at risk.
Boosting savings

An adequate supply of capital is a precondition of long-term economic growth, and household saving is an important source of capital. During the 1960s, U.S. households saved 12 percent of their income; as recently as the 1980s, that figure stood at 8 percent. By 2005–2006, the savings rate had dipped into negative territory, and today it stands at a meager 3 percent. In recent years, funds from abroad—principally Asia—filled the capital gap. But evidence is accumulating that foreign governments have reached the limit of their appetite (or tolerance) for U.S. debt. To avert a capital shortage and soaring interest rates, which would choke off growth, we must boost private savings as we reduce public deficits.

For a long time, tax incentives for saving have been the tool of choice. But as evidence mounts that these incentives are less effective than hoped, policy experts are turning to alternatives. One rests on a key finding of behavioral economics: default settings have a large impact on individual conduct and collective outcomes. If you require people to opt in to a program, such as a 401(k) retirement plan, even a modest inconvenience will deter many of them from participating. But if you reverse the procedure—automatically enrolling them unless they affirmatively opt out—you can boost participation.

To achieve an adequate rate of private saving, we may need to go even further. One option is a mandatory retirement savings program to supplement Social Security. Workers would be required to set aside a fixed percentage of earnings and invest it in generic funds—equities, public debt, private debt, real estate, commodities and cash. For those who fail to designate a percentage allocation for each fund, a default program would take effect. (Participants would always have the option of regaining control.) As workers near retirement age, their holdings would be automatically rebalanced in a more conservative direction.
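The default allocation and automatic rebalancing described above can be illustrated with a simple "glide path" rule. This is a minimal sketch only: the function name, breakpoints and percentages are illustrative assumptions, not details from the proposal.

```python
def default_equity_share(years_to_retirement):
    """Illustrative default glide path: the portfolio holds more
    equities for younger workers and shifts toward conservative
    assets as retirement approaches. All breakpoints are hypothetical."""
    if years_to_retirement >= 30:
        return 0.80          # mostly equities early in a career
    if years_to_retirement <= 5:
        return 0.30          # mostly conservative assets near retirement
    # linear slide from 80% equities (30 years out) to 30% (5 years out)
    return 0.30 + 0.50 * (years_to_retirement - 5) / 25

print(default_equity_share(40))  # 0.8
print(default_equity_share(15))  # 0.5
```

A real default fund would rebalance across all six asset classes the brief lists, but the one-dimensional equity share captures the mechanism: the default does the rebalancing unless the participant opts to take control.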
One version of this proposal calls for "progressive matching," in which low-earning individuals receive a subsidy equal to half their payroll contributions; those earning more would get a smaller match along a sliding scale, and those at the top would receive no match at all.

This strategy requires careful institutional and programmatic design. To ensure maximum benefits to wage earners, the private sector would be allowed to offer only funds with very low costs and fees. To ensure that the program actually boosts net savings, individuals would be prohibited from withdrawing funds from their accounts prior to retirement; except in emergencies, they would not be allowed to borrow against their accounts; and they would be prohibited from using the accounts as collateral. And a clear line would be drawn to prevent government interference in the private sector: while government-administered automatic default investments would be permitted, government officials could not direct the flow of capital to specific firms.

Improving public investment

The investment deficit has a public face as well. Since the early 19th century, government has financed and helped build major infrastructure projects—roads, bridges, ports and canals, among others—that have spurred economic growth and opened new domestic and international markets. Recently, however, public infrastructure investment has fallen well short of national needs and has often been poorly targeted. Americans traveling and working abroad are noticing that U.S. infrastructure is falling behind not only that of advanced countries but that of rapidly developing countries as well.
A study by Emilia Istrate and Robert Puentes of Brookings's Metropolitan Policy Program, presented in a December 2009 report entitled "Investing for Success," documents three key shortcomings of federal infrastructure investment: it lacks long-term planning, fails to provide adequately for maintenance costs, and suffers from a flawed project selection process in which benefits are not weighed rigorously against costs.

Istrate and Puentes explore several strategies for correcting these deficiencies. One of the most promising is a National Infrastructure Bank (NIB), which would require benefit-cost analyses of proposed projects, break down financial barriers between related types of investment (facilitating inter-modal transportation, for example), and improve coordination across jurisdictional lines. The NIB could be funded through a modest initial infusion of federal capital designed to attract private capital. Projects receiving loans from the NIB would have to provide for depreciation and document the sources of funds to repay the face amount of each loan, plus interest. In short, the NIB would be more than a conduit for the flow of federal funds; it would function as a real bank, imposing market discipline on projects and making infrastructure investments attractive to private capital, partly by providing flexible subordinated debt.

Istrate and Puentes identify diverse problems that designers of an NIB would confront. Insulating the selection process from political interference would pose serious difficulties, as would providing federal seed capital without increasing the federal deficit and debt. Requiring the repayment of loans could skew awards away from projects that cannot easily charge user fees—wastewater and environmental infrastructure projects, for example. Despite these challenges, a properly designed bank could increase the quantity of infrastructure investment while improving its effectiveness, reducing bottlenecks and promoting economic efficiency.
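The bank's two-part screen, rigorous benefit-cost analysis plus a market test of loan repayment, can be sketched as a discounted-cash-flow check. This is a hedged illustration: the function names, the 5 percent discount rate and the cash-flow figures are assumptions for demonstration, not anything the report specifies.

```python
def npv(rate, cashflows):
    """Net present value of a series of annual cash flows,
    with cashflows[0] occurring today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def passes_nib_screen(benefits, costs, loan_payments, project_revenues, rate=0.05):
    """Illustrative two-part test: (1) discounted social benefits must
    exceed discounted costs; (2) annual project revenues (user fees,
    etc.) must cover debt service in every year of the loan."""
    benefit_cost_ok = npv(rate, benefits) > npv(rate, costs)
    repayment_ok = all(rev >= pay for rev, pay in zip(project_revenues, loan_payments))
    return benefit_cost_ok and repayment_ok

# Hypothetical toll bridge: $100M cost today, $12M/yr in social benefits
# for 20 years, $8M/yr debt service covered by $9M/yr in toll revenue.
print(passes_nib_screen(
    benefits=[0] + [12] * 20,
    costs=[100],
    loan_payments=[8] * 20,
    project_revenues=[9] * 20,
))  # True
```

The second test is what makes the repayment requirement bite: a wastewater project with identical social benefits but no user-fee revenue stream would fail the screen, which is exactly the skew Istrate and Puentes warn about.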
The potential benefits for long-term growth would be considerable.

Creating the Political Conditions for Reform

The rise of political polarization in recent decades has made effective action much more difficult for the U.S. government. Polarization has impeded efforts to enact even the pro-growth reforms sketched in this paper. A multiyear collaboration between the Brookings and Hoover Institutions—resulting in a two-volume report, Red and Blue Nation?, with Volume One published in 2006 and Volume Two in 2008—has mapped the scope of the phenomenon. This effort has shown that, while political elites are more sharply divided than citizens in general, citizens are more likely now to place themselves at the ends of the ideological spectrum than they were as recently as the 1980s. With a smaller political center to work with, even leaders committed to bipartisan compromise have been stymied. The fate of President Bush's 2005 Social Security proposal illustrates the difficulty of addressing tough issues in these circumstances.

It might seem that the only cure for polarization is a shift of public sentiment back toward moderation. The Brookings-Hoover project found, however, that changes in institutional design could reduce polarization and might, over time, lower the partisan temperature. Here are two ideas, culled from a much longer list.

Congressional redistricting

While population flows account for much of the growth in safe seats dominated by strong partisans, recent studies indicate that gerrymanders account for 10 to 36 percent of the reduction in competitive congressional districts since 1982. This is not a trivial effect. Few Western democracies draw up their parliamentary districts in so patently politicized a fashion as do U.S. state legislatures. Parliamentary electoral commissions, operating independently and charged with making reasonably objective determinations, are the preferred model abroad.
Given the Supreme Court's reluctance to enter the thicket of redistricting controversies, any changes will be up to state governments. In recent years, voter initiatives and referenda in four states—Washington, Idaho, Alaska and Arizona—have established nonpartisan or bipartisan redistricting commissions. These commissions struggle with a complicated riddle: how to enhance competitiveness while respecting other parameters, such as geographic compactness, jurisdictional boundaries, and the desire to consolidate "communities of interest." Iowa's approach, in which a nonpartisan legislative staff has the last word, is often cited as a model but may be hard to export to states with more demographic diversity and complex political cultures. Arizona has managed to fashion some workable, empirically based standards that are yielding more heterogeneous districts and more competitive elections.

Incentives to participate

Another depolarizing reform would promote the participation of less ideologically committed voters in the electoral process. Some observers do not view the asymmetric power of passionate partisans in U.S. elections as a cause for concern: Why shouldn't political decisions be made by the citizens who care most about them? Aren't those who care also better informed? And isn't their intensive involvement an indication that the outcome of the election affects their interests more than it affects the interests of non-voters?

While this argument has surface plausibility, it is not compelling. Although passionate partisanship infuses the system with energy, it erects roadblocks to problem-solving. Many committed partisans prefer gridlock to compromise, and gridlock is no formula for effective governance. To broaden the political participation of less partisan citizens, who tend to be more weakly connected to the political system, several major democracies have made voting mandatory.
Australia, for one, has compulsory voting; it sets small fines for non-voting that escalate with recidivism, and the results are remarkable. The turnout rate in Australia tops 95 percent, and citizens regard voting as a civic obligation.

Near-universal voting raises the possibility that a bulge of casual voters, with little understanding of the issues and candidates, can muddy the waters by voting on non-substantive criteria, such as the order in which candidates' names appear on the ballot. The inevitable presence of some such "donkey voters," as they are called in Australia, does not appear to have badly marred the democratic process in that country. Indeed, the civic benefits of higher turnout appear to outweigh the "donkey" effect. Candidates for the Australian Parliament have gained an added incentive to appeal broadly beyond their partisan bases. One wonders whether members of Congress here in the United States, if subjected to wider suffrage, might also spend less time transfixed by symbolic issues that are primarily objects of partisan fascination, and more time coming to terms with the nation's larger needs. At the least, campaigns that continually toss red meat to the party faithful might become a little less pervasive.

The United States is not Australia, of course. Although both are federal systems, the U.S. Constitution confers on state governments much more extensive control over voting procedures. While it might not be flatly unconstitutional to mandate voting nationwide, doing so would surely chafe against American custom and provoke opposition in many states. American-style federalism also has some unique advantages, including its tradition of using states as "laboratories of democracy" that test reform proposals before they are elevated to consideration at the national level. If a few states experiment with compulsory voting and demonstrate its democracy-enriching potential, they might smooth the path to national consideration.
Conclusion

In challenging times, political leaders undertake institutional reform not because they want to, but because they must. Our own era—a period of profound economic crisis—is no exception. Even in circumstances of deep political polarization, both political parties have accepted the need to restructure our system of financial regulation. As well, recognition is growing that we face three key challenges—a fiscal deficit, a savings deficit and an investment deficit—that have eluded control by existing institutions and, unless checked, will impede long-term economic growth. The question is whether we will be able to adopt the needed changes in an atmosphere of reflection and deliberation, or whether we will delay until a worse crisis compels us to act.

Author: William A. Galston
Hubs of Transformation: Leveraging the Great Lakes Research Complex for Energy Innovation
Published June 2, 2010

Policy Brief #173

America needs to transform its energy system, and the Great Lakes region (including Minnesota, Wisconsin, Iowa, Missouri, Illinois, Indiana, Ohio, Michigan, Kentucky, West Virginia, western Pennsylvania and western New York) possesses many of the needed innovation assets. For that reason, the federal government should leverage this troubled region's research and engineering strengths by launching a region-wide network of collaborative, high-intensity energy research and innovation centers.

Currently, U.S. energy innovation efforts remain insufficient to ensure the development and deployment of clean energy technologies and processes. Such deployment is impeded by multiple market problems that lead private firms to under-invest and to focus on short-term, low-risk research and product development. Federal energy efforts—let alone state and local ones—remain too small and too poorly organized to deliver the needed breakthroughs. A new approach is essential.

RECOMMENDATIONS

The federal government should systematically accelerate national clean energy innovation by launching a series of "themed" research and commercialization centers strategically situated to draw on the Midwest's rich complex of strong public universities, national and corporate research laboratories, and top-flight science and engineering talent. Organized around existing capacities in a hub-spoke structure that links fundamental science with innovation and commercialization, these research centers would engage universities, industries and labs to work on specific issues that would enable rapid deployment of new technologies to the marketplace. Along the way, they might well begin to transform a struggling region's ailing economy.
Roughly six compelling innovation centers could reasonably be organized in the Great Lakes states, with total annual funding between $1 billion and $2 billion. To achieve this broad goal, the federal government should:

- Increase energy research funding overall.
- Adopt more comprehensive approaches to research and development (R&D) that address and link multiple aspects of a specific problem, such as transportation.
- Leverage existing regional research, workforce, entrepreneurial and industrial assets.

America needs to transform its energy system in order to create a more competitive "next economy" that is at once export-oriented, lower-carbon and innovation-driven. Meanwhile, the Great Lakes region possesses what may be the nation's richest complex of innovation strengths—research universities, national and corporate research labs, and top-flight science and engineering talent. Given those realities, a partnership should be forged between the nation's needs and a struggling region's assets.

To that end, we propose that the federal government launch a distributed network of federally funded, commercialization-oriented, sustainable energy research and innovation centers, to be located in the Great Lakes region. These regional centers would combine aspects of the "discovery innovation institutes" proposed by the National Academy of Engineering and the Metropolitan Policy Program (as articulated in "Energy Discovery-Innovation Institutes: A Step toward America's Energy Sustainability"); the "energy innovation hubs" created by the Department of Energy (DOE); and the agricultural experiment station/cooperative extension model of the land-grant universities.

In the spirit of the earlier land-grant paradigm, this network would involve the region's research universities and national labs and engage strong participation by industry, entrepreneurs and investors, as well as by state and local governments.
In response to local needs and capacities, each center could have a different theme, though all would conduct the kinds of focused translational research necessary to move fundamental scientific discoveries toward commercialization and deployment.

The impact could be transformational. If built out, university-industry-government partnerships would emerge at an unprecedented scale. At a minimum, populating auto country with an array of breakthrough-seeking, high-intensity research centers would stage a useful experiment in linking national leadership and local capacities to lead the region—and the nation—toward a more prosperous future.

The Great Lakes Energy System: Predicaments and Possibilities

The Great Lakes region lies at the center of the nation's industrial and energy system trials and possibilities. No region has suffered more from the struggles of America's manufacturing sector and faltering auto and steel industries, as indicated in a new Metropolitan Policy Program report entitled "The Next Economy: Rebuilding Auto Communities and Older Industrial Metros in the Great Lakes Region."

The region also lies at ground zero of the nation's need to "green" U.S. industry to boost national economic competitiveness, tackle climate change and improve energy security. Heavily invested in manufacturing metals, chemicals, glass and automobiles, as well as in petroleum refining, the Great Lakes states account for nearly one-third of all U.S. industrial carbon emissions.

And yet, the Great Lakes region possesses significant assets and capacities that hold promise for regional renewal as the "next economy" comes into view.
The Midwest's manufacturing communities retain the strong educational and medical institutions, advanced manufacturing prowess, skills base and other assets essential to helping the nation move toward and successfully compete in the 21st century's export-oriented, lower-carbon, innovation-fueled economy. Most notably, the region has an impressive array of innovation-related strengths in the one field essential to our nation's future—energy. These include:

- Recognized leadership in R&D. The Great Lakes region accounts for 33 percent of all academic and 30 percent of all industry R&D performed in the United States.
- Strength and specialization in energy, science and engineering. In FY 2006, the Department of Energy sent 26 percent of its federal R&D obligations to the Great Lakes states and is the second largest federal funder of industrial R&D in the region. Also in 2006, the National Science Foundation sent 30 percent of its R&D obligations there.
- Existing clean energy research investments and assets. The University of Illinois is a key research partner in the BP-funded, $500 million Energy Biosciences Institute, which aims to prototype new plants as alternative fuel sources. Toledo already boasts a growing solar industry cluster; Dow Corning's Michigan facilities produce leading silicon and silicone-based technology innovations; and the Solar Energy Laboratory at the University of Wisconsin-Madison, the oldest of its kind in the world, has significant proficiency in developing practical uses for solar energy. Finally, the region is home to the largest U.S. nuclear utility (Exelon), the nation's largest concentration of nuclear plants and some of the country's leading university programs in nuclear engineering.
- Industry potential relevant to clean energy.
Given their existing technological specializations, Midwestern industries have the potential to excel in the research and manufacture of sophisticated components required for clean energy, such as those used in advanced nuclear technologies, precision wind turbines and complex photovoltaics.

- Breadth in energy innovation endeavors and resources. In addition to universities and industry, the region's research laboratories specialize in areas of great relevance to our national energy challenges, including the work on energy storage systems and fuel and engine efficiency taking place at Argonne National Laboratory, research in high-energy physics at the Fermi National Accelerator Laboratory, and the work on bioenergy feedstocks, processing technologies and fuels occurring at the DOE-funded Great Lakes BioEnergy Research Center (GLBRC).
- Regional culture of collaboration. Finally, the universities of the Great Lakes area have a strong history of collaboration, both among themselves and with industry, given their origins in the federal land-grant compact of market and social engagement. GLBRC—one of the nation's three competitively awarded DOE Bioenergy Centers—epitomizes the region's ability to align academia, industry and government around a single mission. Another example is the NSF-supported Blue Waters Project.
This partnership between IBM and the universities and research institutions in the Great Lakes Consortium for Petascale Computation is building the world's fastest computer for scientific work—a critical tool for advancing smart energy grids and transportation systems.

In short, the Great Lakes states and metropolitan areas—economically troubled and carbon-reliant as they are—have capabilities that could contribute to their own transformation and that of the nation, if the right policies and investments were in place.

Remaking America's Energy System within a Federal Policy Framework

America as a whole, meanwhile, needs to overcome the massive sustainability and security challenges that plague the nation's energy production and delivery system. Transformational innovation and commercialization will be required to address these challenges and accelerate the process of reducing the economy's carbon intensity.

Despite the urgency of these challenges, however, a welter of market problems currently impedes decarbonization and limits innovation. First, energy prices have generally remained too low to give companies incentives to commit to clean and efficient energy technologies and processes over the long haul. Second, many of the benefits of long-range innovative activity accrue to parties other than those who make the investments. As a result, individual firms tend to under-invest and to focus on short-term, low-risk research and product development. Third, uncertainty and lack of information about relevant market and policy conditions and the potential benefits of new energy technologies and processes may be further delaying innovation. Fourth, the innovation benefits that derive from geographically clustering related industries (which for many years worked so well for the auto industry) have yet to be fully realized for next-generation energy enterprises. Instead, these innovations often remain isolated in secure laboratories.
Finally, state and local governments—burdened with budgetary pressures—are not likely to fill gaps in energy innovation investment any time soon.

As a result, the research intensity—and so the innovation intensity—of the energy sector remains woefully insufficient, as pointed out in the earlier Metropolitan Policy Program paper on discovery innovation institutes. Currently, the sector devotes no more than 0.3 percent of its revenues to R&D. That figure lags far behind the 2.0 percent of sales committed to federal and large industrial R&D in the health care sector, the 2.4 percent in agriculture, and the 10 percent in the information technology and pharmaceutical industries.

As for the national government's efforts to respond to the nation's energy research shortfalls, these remain equally inadequate. Three major problems loom:

The scale of federal energy research funding is insufficient. To begin with, the current federal appropriation of around $3 billion a year for nondefense energy-related R&D is simply too small. That figure remains well below the $8 billion (in real 2008 dollars) recorded in 1980, and represents less than a quarter of the 1980 level when measured as a share of GDP. If the federal government were to fund next-generation energy at the pace at which it supports advances in health care, national defense or space exploration, the level of investment would be in the neighborhood of $20 billion to $30 billion a year.

Nor do the nation's recent efforts to catalyze energy innovation appear sufficient. To be sure, the American Recovery and Reinvestment Act (ARRA) provided nearly $13 billion for DOE investments in advanced technology research and innovation. To date, Great Lakes states are slated to receive some 42 percent of all ARRA awards from the fossil energy R&D program and 39 percent from the Office of Science (a basic research agency widely regarded as critical for the nation's energy future).
However, ARRA was a one-time injection of monies that cannot sustain adequate federal energy R&D. Relatedly, the Great Lakes region has done well in tapping two other relatively recent DOE programs: the Advanced Research Projects Agency–Energy (ARPA-E) and the Energy Frontier Research Centers (EFRCs). Currently, Great Lakes states account for 44 and 50 percent of ARPA-E and EFRC funding, respectively. Yet, with ARPA-E focused solely on individual signature projects and the EFRCs on basic research, neither initiative has the scope to fully engage all of the region's innovation assets.

The character and format of federal energy R&D remain inadequate. Notwithstanding the question of scale, the character of U.S. energy innovation also remains inadequate. In this respect, the DOE national laboratories—which anchor the nation's present energy research efforts—are poorly utilized resources. Many of these laboratories' activities are fragmented and isolated from the private sector and its market, legal and social realities. This prevents them from successfully developing and deploying cost-competitive, multidisciplinary new energy technologies that can be easily adopted on a large scale.

For example, DOE activities continue to focus on discrete fuel sources (such as coal, oil, gas or nuclear), rather than on the fully integrated end-use approaches needed to realize affordable, reliable, sustainable energy. Siloed approaches simply do not work well when it comes to tackling the complexity of the nation's real-world energy challenges. A perfect example of a complicated energy problem requiring an integrated end-use approach is transportation. Moving the nation's transportation industry toward a clean energy infrastructure will require a multi-pronged, full-systems approach.
It will depend not only upon R&D in such technologies as alternative propulsion (biofuels, hydrogen, electrification) and vehicle design (power trains, robust materials, advanced computer controls) but also on far broader technology development, including that related to primary energy sources, electricity generation and transmission, and the energy-efficient applications that ultimately will determine the economic viability of this important industry.

Federal programming fails to fully realize regional potential. Related to the structural problems of U.S. energy innovation efforts, finally, is a failure to fully tap or leverage critical preexisting assets within regions that could accelerate technology development and deployment. In the Great Lakes, for example, current federal policy does little to tie together the billions of dollars in science and engineering R&D conducted or available annually. This wealth is produced by the region's academic institutions, all of the available private- and public-sector clean energy activities and financing, abundant natural resources in wind and biomass, and robust, pre-existing industrial platforms for research, next-generation manufacturing, and technology adoption and deployment. In this region and elsewhere, federal policy has yet to effectively connect researchers at different organizations, break down stovepipes between research and industry, bridge the commercialization "valley of death," or establish mechanisms to bring federally sponsored R&D to the marketplace quickly and smoothly.

A New Approach to Regional, Federally Supported Energy Research and Innovation

And so the federal government should systematically accelerate clean energy innovation by launching a series of regionally based Great Lakes research centers.
Originally introduced in the Metropolitan Policy Program policy proposal for energy discovery-innovation institutes (or e-DIIs), a nationwide network of regional centers would link universities, research laboratories and industry to conduct translational R&D that addresses national energy sustainability priorities while stimulating regional economies.

In the Great Lakes, specifically, a federal effort to "flood the zone" with a series of roughly six of these high-powered, market-focused energy centers would create a critical mass of innovation through their number, size, variety, linkages and orientation to pre-existing research institutions and industry clusters.

As envisioned here, the Great Lakes network of energy research centers would organize individual centers around themes largely determined by the private market. Based on local industry research priorities, university capabilities and the market and commercialization dynamics of various technologies, each Great Lakes research and innovation center would focus on a different problem, such as renewable energy technologies, biofuels, transportation energy, carbon-free electrical power generation and distribution, and energy efficiency. This network would accomplish several goals at once:

Foster multidisciplinary and collaborative research partnerships. The regional centers or institutes would align the nonlinear flow of knowledge and activity across science and non-science disciplines and among companies, entrepreneurs, commercialization specialists and investors, as well as government agencies (federal, state and local) and research universities. For example, a southeastern Michigan collaboration involving the University of Michigan, Michigan State University, the University of Wisconsin and Ford, General Motors, and Dow Chemical could address the development of sustainable transportation technologies.
A Chicago partnership involving Northwestern and Purdue Universities, the University of Chicago, the University of Illinois, Argonne National Laboratory, Exelon and Boeing could focus on sustainable electricity generation and distribution. A Columbus group including Ohio State University and the Battelle Memorial Institute could address technologies for energy efficiency. Regional industry representatives would be involved from the earliest stages to define needed research, so that technology advances are relevant and any ensuing commercialization process is as successful as possible.

Serve as a distributed "hub-spoke" network linking together campus-based, industry-based and federal laboratory-based scientists and engineers. The central "hubs" would interact with other R&D programs, centers and facilities (the "spokes") through exchanges of participants, meetings and workshops, and advanced information and communications technology. The goals would be to limit unnecessary duplication of effort and cumbersome management bureaucracy and to enhance the coordinated pursuit of larger national goals.

Develop and rapidly deploy highly innovative technologies to the market. Rather than aim for revenue maximization through technology transfer, the regional energy centers would be structured to maximize the volume, speed and positive societal impact of commercialization. As much as possible, the centers would work out patenting and licensing rights and other intellectual property issues in advance.

Stimulate regional economic development.
Like academic medical centers and agricultural experiment stations—both of which combine research, education and professional practice—these energy centers could facilitate cross-sector knowledge spillovers and innovation exchange and propel technology transfer to support clusters of start-up firms, private research organizations, suppliers, and other complementary groups and businesses—the true regional seedbeds of greater economic productivity, competitiveness and job creation.

Build the knowledge base necessary to address the nation's energy challenges. The regional centers would collaborate with K-12 schools, community colleges, regional universities, and workplace training initiatives to educate future scientists, engineers, innovators, and entrepreneurs and to motivate the region's graduating students to contribute to the region's emerging green economy.

Complement efforts at universities and across the DOE innovation infrastructure, while remaining organizationally and managerially separate from both. The regional energy centers would focus heavily on commercialization and deployment, adopting a collaborative translational research paradigm. Within DOE, the centers would occupy a special niche for bottom-up translational research in a suite of new, largely top-down innovation-oriented programs that aim to advance fundamental science (EFRCs), bring energy R&D to scale (Energy Innovation Hubs) and find ways to break the cost barriers of new technology (ARPA-E).

To establish and build out the institute network across the Great Lakes region, the new regional energy initiative would:

Utilize a tiered organization and management structure. Each regional center would have a strong external advisory board representing the participating partners. In some cases, partners might play direct management roles with executive authority.

Adopt a competitive award process with specific selection criteria.
Centers would receive support through a competitive award process, with proposals evaluated by an interagency panel of peer reviewers.

Receive as much federal funding as major DOE labs outside the Great Lakes region. Given the massive responsibilities of the proposed Great Lakes energy research centers, total federal funding for the whole network should be comparable to that of comprehensive DOE labs, such as Los Alamos and Oak Ridge, which have FY2010 budgets between $1 billion and $2 billion. Based on existing industry-university concentrations, one can envision as many as six compelling research centers in the Great Lakes region.

Conclusion

In sum, America's national energy infrastructure—based primarily upon fossil fuels—must be updated and replaced with new technologies. At the same time, no region in the nation is better equipped to deliver the necessary innovations than the Great Lakes area. And so this strong need and this existing capacity should be joined through an aggressive initiative to build a network of regional energy research and innovation centers. Through this intervention, the federal government could catalyze a dynamic new partnership of Midwestern businesses, research universities, federal laboratories, entrepreneurs and state and local governments to transform the nation's carbon-dependent economy, while renewing a flagging regional economy.

Authors: James J. Duderstadt, Mark Muro, Sarah Rahman
Spurring Innovation Through Education: Four Ideas
Published June 3, 2010 | Policy Brief #174

A nation's education system is a pillar of its economic strength and international competitiveness. The National Bureau of Economic Research analyzed data from 146 countries, collected between 1950 and 2010, and found that each year of additional average schooling attained by a population translates into at least a two percent increase in economic output. A 2007 World Bank policy research working paper reported similar results. Based on these findings, if the United States increased the average years of schooling completed by its adult population from the current 12 years to 13 years—that is, added one year of postsecondary education—our gross domestic product would rise by more than $280 billion.

The story also can be told by focusing on the returns to education for individuals. The difference in income between Americans who complete high school and those who drop out after 10th grade exceeds 50 percent. Large income differentials extend throughout the continuum of education attainment, with a particularly large gap between an advanced degree and a four-year college degree.

Although education clearly pays, the education attainment of the nation's youth has largely stagnated, falling substantially behind that of countries with which we compete. In 1960, the United States led the world in the number of students who graduated from high school. Today young adults in many countries, including Estonia and Korea, exceed their U.S. counterparts in education attainment.

RECOMMENDATIONS

America's economic productivity and competitiveness are grounded in education. Our public schools and our higher education institutions alike are falling behind those of other nations.
Four policy proposals offer substantial promise for improving American education, are achievable and have low costs:

- Choose K-12 curriculum based on evidence of effectiveness.
- Evaluate teachers in ways that meaningfully differentiate levels of performance.
- Accredit online education providers so they can compete with traditional schools across district and state lines.
- Provide the public with information that will allow comparison of the labor market outcomes and price of individual postsecondary degree and certificate programs.

The problem of low education attainment is particularly salient among students from low-income and minority backgrounds. The graduation rate for minorities has been declining for 40 years, and majority/minority graduation rate differentials have not converged. Hispanic and black students earn four-year or higher degrees at less than half the rate of white students.

The economic future of the nation and the prospects of many of our citizens depend on returning the United States to the forefront of education attainment. Simply put, many more of our students need to finish high school and graduate from college.

At the same time, graduation standards for high school and college must be raised. Forty percent of college students take at least one remedial course to make up for deficiencies in their high school preparation, and a test of adult literacy recently given to a random sample of graduating seniors from four-year U.S. institutions found less than 40 percent to be proficient on prose and quantitative tasks.

Barriers to Innovation and Reform

Our present education system is structured in a way that discourages the innovation necessary for the United States to regain education leadership. K-12 education is delivered largely through a highly regulated public monopoly.
Outputs such as high school graduation rates and student performance on standardized assessments are carefully measured and publicly available, but the mechanisms that would allow these outputs to drive innovation and reform are missing or blocked. For example, many large urban districts and some states are now able to measure the effectiveness of individual teachers by assessing the annual academic growth of students in their classes. Huge differences in teacher effectiveness are evident, but collective bargaining agreements or state laws prevent most school district administrators from using that information in tenure or salary decisions.

Further complicating K-12 reform is the fact that authority for education policy is broadly dispersed. Unlike countries with strong national ministries that can institute top-down reforms within the public sector, education policy and practice in the United States are set through a chaotic network of laws, relationships and funding streams connecting 16,000 independent school districts to school boards, mayors, and state and federal officials. The lack of central authority allows the worst characteristics of public monopolies to prevail—inefficiency, stasis and catering to the interests of employees—without top-down systems' offsetting advantage of being capable of quick and coordinated action.

The challenges to reforming higher education are different. The 6,000-plus U.S. postsecondary institutions have greater flexibility to innovate than do the public school districts—and a motive to do so, because many compete among themselves for students, faculty and resources.
However, while output is carefully measured and publicly reported for public K-12 schools and districts, we have only the grossest measures of output for postsecondary institutions. Even for something as straightforward as graduation rates, the best data we have at the institutional level are the proportion of full-time, first-time degree-seeking students who graduate within 150 percent of the normal time to degree completion. Data on critical outputs, including labor market returns and student learning, are missing entirely. In the absence of information on the issues that really matter, postsecondary institutions compete and innovate on dimensions that are peripheral to their productivity, such as the winning records of their sports teams, the attractiveness of their grounds and buildings, and their ratio of acceptances to applications. Far more information is available to consumers in the market for a used car than for a college education. This information vacuum undermines productive innovation.

Examining Two Popular Reforms

Many education reformers across the political spectrum agree on two structural and governance reforms: expanding the public charter school sector at the expense of traditional public schools and setting national standards for what students should know. Ironically, the evidence supporting each of these reforms is weak at best.

Charter schools are publicly funded schools outside the traditional public school system that operate with considerable autonomy in staffing, curriculum and practices. The Obama administration has pushed to expand charter schools by excluding states that do not permit charters, or that cap their number, from competition for $4.35 billion in Race to the Top funding. Both President Obama and Education Secretary Arne Duncan have proposed shuttering poorly performing traditional public schools and replacing them with charters.

What does research say about charter schools' effects on academic outcomes?
Large studies that control for student background generally find very small differences in student achievement between the two types of public schools. For example, on the 2005 National Assessment of Educational Progress (the "Nation's Report Card"), white, black and Hispanic fourth-graders in charter schools performed equivalently to fourth-graders with similar racial and ethnic backgrounds in traditional public schools. Positive findings do emerge from recent studies of oversubscribed New York and Boston area charter schools, which use lotteries to determine admission. But these results are obtained from children whose parents push to get them into the most popular charter schools in two urban areas with dynamic and innovative charter entrepreneurs.

What about common standards? Based on the belief that high content standards for what students should know and be able to do are essential elements of reform, and that national standards are superior to individual state standards, the Common Core State Standards Initiative has signed up 48 states and three territories to develop a common core of state standards in English-language arts and mathematics for grades K-12. The administration has praised this joint effort by the National Governors Association and the Council of Chief State School Officers, made participation in it a prerequisite for Race to the Top funding, and set aside $350 million in American Recovery and Reinvestment Act funding to develop ways to assess schools' performance in meeting common core standards.

Does research support this approach? The Brown Center on Education Policy at Brookings examined the relationship between state-level student achievement in mathematics and ratings of the quality of state content standards in math. There was no association.
Some states with strong standards produce high-achieving students, such as Massachusetts, while other states with strong standards, such as California, languish near the bottom in achievement. Some states with weak standards boast high levels of achievement, such as New Jersey, while others with weak standards, such as Tennessee, experience low levels of achievement.

Four Ideas

"For every complex problem there is one solution which is simple, neat, and wrong." — H. L. Mencken

I will avoid Mencken's approbation by proposing four solutions rather than one. Although education has far too many moving parts to be dramatically reformed by any short list of simple actions, we can start with changes that are straightforward, ripe for action and most promising, based on research and past experience.

Link K-12 Curricula to Comparative Effectiveness

Little attention has been paid to the choice of curriculum as a driver of student achievement. Yet the evidence for large curriculum effects is persuasive. Consider a recent study of first-grade math curricula, reported by the National Center for Education Evaluation and Regional Assistance in February 2009. The researchers randomly assigned schools to one of four widely used curricula. Two curricula were clear winners, generating three months' more learning over a nine-month school year than the other two. This is a big effect on achievement, and it is essentially free, because the more effective curricula cost no more than the others.

The federal government should fund many more comparative effectiveness trials of curricula, and schools using federal funds to support the education of disadvantaged students should be required to use evidence of effectiveness in their choice of curriculum materials. The Obama administration supports comparative effectiveness research in health care. It is no less important in education.

Evaluate Teachers Meaningfully

Good education outcomes for students depend on good teachers.
If we have no valid and reliable system in place to identify who is good, we cannot hope to create substantial improvements in the quality of the teacher workforce. A substantial body of high-quality research demonstrates that teachers vary substantially in effectiveness, with dramatic consequences for student learning. To increase academic achievement overall and address racial, ethnic and socioeconomic achievement gaps, we must enhance the quality of the teacher workforce and provide children from poor and minority backgrounds with equitable access to the best teachers.

Despite strong empirical evidence for differences in teacher performance—as well as intuitive appeal, demonstrated when we remember our own best and worst teachers—the vast majority of public school teachers in America face no meaningful evaluation of on-the-job performance. A recent survey of thousands of teachers and administrators, spanning 12 districts in four states, revealed that none of the districts' formal evaluation processes differentiated meaningfully among levels of teaching effectiveness, according to a 2009 report published by The New Teacher Project. In districts using binary ratings, more than 99 percent of teachers were rated satisfactory. In districts using a broader range of ratings, 94 percent of teachers received one of the top two ratings, and less than one percent were rated unsatisfactory. In most school districts, virtually all probationary teachers receive tenure—98 percent in Los Angeles, for example—and very few tenured teachers are ever dismissed for poor performance.

Conditions of employment should be restructured to recruit and select more promising teachers, provide opportunities for them to realize their potential, keep the very best teachers in the profession, and motivate them to serve in locations where students have the highest needs.
The precondition for these changes is a valid system of evaluating teachers. The federal government should require school districts to evaluate teachers meaningfully, as a condition of federal aid. Washington also should provide extra support to districts that pay substantially higher salaries to teachers demonstrating persistently high effectiveness and serving in high-needs schools. But because many technical issues in the evaluation of teachers' on-the-job performance are unresolved, the federal government should refrain, at least for now, from mandating specific evaluation components or designs. The essential element is meaningful differentiation—that is, a substantial spread of performance outcomes.

Accredit Online Education Providers

Traditional forms of schooling are labor-intensive and offer few economies of scale. To the extent that financial resources are critical to education outcomes, the only way to improve the U.S. education system in its current configuration is to spend more. Yet we currently spend more per student on education than any other country in the world, and the appetite for ever-increasing levels of expenditure has been dampened by changing demographics and ballooning government deficits. The monies that can be reasonably anticipated in the next decade or two will hardly be enough to forestall erosion in the quality of the system as currently designed. The game changer for education productivity will have to be technology, which can both cut labor costs and introduce competitive pressures.

Already, at the college level, online education (also termed "virtual education" or "distance learning") is proving competitive with the classroom experience. Nearly 3.5 million students in 2006—about 20 percent of all students in postsecondary schools, and twice the number of five years previously—were taking at least one course online, according to a 2007 report published by the Sloan Consortium.

In K-12, online education is developing much more slowly.
But the case for online K-12 education is strong—and linked to cost control. A survey reported on page one of Education Week (March 18, 2009) found the average per-pupil cost of 20 virtual schools in 14 states to be about half the national average for a traditional public school.

Local and state control of access to virtual schooling impedes the growth of high-quality online education and the competitive pressure it contributes to traditional schooling. Development costs are very high for virtual courseware that takes full advantage of the newest technologies and advances in cognitive science and instruction—much higher than the costs for traditional textbooks and instructional materials. These development costs can be rationalized only if the potential market for the resulting product is large. But states and local school districts now are able to determine whether an online program is acceptable. The bureaucracy that may be most disrupted by the introduction of virtual education acts as gatekeeper.

To overcome this challenge, K-12 virtual public education would benefit from the model of accreditation used in higher education. Colleges and universities are accredited by regional or national bodies recognized by the federal government. Such accrediting bodies as the New England Association of Schools and Colleges and the Accrediting Council for Independent Colleges and Schools are membership organizations that determine their own standards within broad federal guidelines. Once an institution is accredited, students residing anywhere can take its courses, often with the benefit of federal and state student aid.

Federal legislation to apply this accreditation model to online K-12 education could transform public education, especially if the legislation also required school districts to cover the reasonable costs of online courses for students in persistently low-performing schools. This approach would exploit—and enhance—U.S. advantages in information technology.
We are unlikely to regain the international lead in education by investing more in business as usual; but we could leapfrog over other countries by building new, technology-intensive education systems.

Link Postsecondary Programs to Labor Market Outcomes

On a per-student basis, the United States spends two and one-half times the developed countries' average on postsecondary education. Although our elite research universities remain remarkable engines of innovation and are the envy of the world, our postsecondary education system in general is faltering. The United States used to lead the world in higher education attainment but, according to 2009 OECD data, is now ranked 12th among developed countries. We have become a high-cost provider of mediocre outcomes.

Critical to addressing this problem is better information on the performance of our postsecondary institutions. As the U.S. Secretary of Education's Commission on the Future of Higher Education concluded in 2006:

"Our complex, decentralized postsecondary education system has no comprehensive strategy, particularly for undergraduate programs, to provide either adequate internal accountability systems or effective public information. Too many decisions about higher education—from those made by policymakers to those made by students and families—rely heavily on reputation and rankings derived to a large extent from inputs such as financial resources rather than outcomes. Better data about real performance and lifelong working and learning ability is absolutely essential if we are to meet national needs and improve institutional performance."

Ideally, this information would be available in comparable forms for all institutions through a national system of data collection. However, achieving consensus on the desirability of a national database of student records has proved politically contentious. One of the issues is privacy of information.
More powerful is the opposition of some postsecondary institutions that apparently seek to avoid accountability for their performance. The way forward is for Congress to authorize, and fund at the state level, data systems that follow individual students through their postsecondary careers into the labor market. The standards for such state systems could be recommended at the federal level or by national organizations, to maximize comparability and eventual interoperability.

The public face of such a system at the state level would be a website allowing prospective students and parents to compare degree and certificate programs within and across institutions on diverse outcomes, with corresponding information on price. At a minimum, the outcomes would include graduation rates, employment rates and average annual earnings five years after graduation. Outcomes would be reported at the individual program level, such as the B.S. program in chemical engineering at the University of Houston. Price could be reported in three ways: advertised tuition; average tuition for new students for the previous two years; and average tuition for new students for the previous two years net of institutional and state grants for students eligible for federally subsidized student loans. These different forms of price information are necessary because institutions frequently discount their advertised price, particularly for low-income students. Students and families need information about discounts in order to shop on the basis of price.

Many states, such as Washington, already have data that would allow the creation of such college search sites, at least for their public institutions.
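The three price measures described above are simple aggregations over per-student records. The sketch below illustrates how a state data system might compute them; all field names and dollar figures are hypothetical, since the brief does not specify a data format.

```python
# Sketch of the three proposed price measures, computed from illustrative
# per-student records. All names and figures are hypothetical assumptions.

ADVERTISED_TUITION = 12_000  # posted sticker price, dollars per year

# (tuition_paid, grant_aid, loan_eligible) for each new student
# enrolled over the previous two years
new_students = [
    (12_000, 0, False),
    (12_000, 4_000, True),
    (12_000, 6_000, True),
    (12_000, 2_000, False),
]

def average(values):
    return sum(values) / len(values)

# Measure 1: advertised tuition (the sticker price)
advertised = ADVERTISED_TUITION

# Measure 2: average tuition for new students over the previous two years
avg_tuition = average([paid for paid, _, _ in new_students])

# Measure 3: average tuition net of institutional and state grants,
# restricted to students eligible for federally subsidized loans
avg_net = average([paid - grant
                   for paid, grant, eligible in new_students if eligible])

print(advertised, avg_tuition, avg_net)  # 12000 12000.0 7000.0
```

The third measure is the one that reveals discounting: here the advertised price overstates what loan-eligible students actually pay by $5,000.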
The primary impediment to progress is the federal Family Educational Rights and Privacy Act (FERPA), which makes it very difficult for postsecondary institutions to share data on individual students with state agencies, such as the tax division or unemployment insurance office, in order to match students with information on post-graduation employment and wages. Congress should amend FERPA to allow such data exchanges among state agencies while maintaining restrictions on the release of personally identifiable information. To address privacy concerns, Congress also should impose substantial penalties for the public release of personally identifiable information; FERPA currently is toothless.

Creating a higher education marketplace that is vibrant with transparent and valid information on performance and price would be a powerful driver of reform and innovation. Easily addressed concerns about the privacy of student records, and political opposition from institutions that do not want their performance exposed to the public, have stood in the way of this critical reform for too long.

America's economic future depends on returning the United States to the forefront of education attainment. Simply put, many more of our students need to finish high school and graduate from college. Investments in improved data, along with structural reforms and innovation, can help restore our leadership in educational attainment and increase economic growth.

Author: Grover J. "Russ" Whitehurst
va Antimicrobial Resistance: Antibiotics Stewardship and Innovation By webfeeds.brookings.edu Published On :: Thu, 12 Jun 2014 00:00:00 -0400 Antimicrobial resistance is one of the most significant threats to public health globally. It will worsen in the coming decades without concerted efforts to spur the development of new antibiotics, while ensuring the appropriate use of existing antibiotics. Antimicrobial therapy is essential for treating and preventing bacterial infections, some of which can be life-threatening and acquired as a result of critical medical interventions, including surgery, chemotherapy and dialysis. However, the international rise in antimicrobial resistance has weakened our antibiotic armamentarium and multi-resistant bacteria now cause over 150,000 deaths annually in hospitals around the world (WHO, 2013). Unfortunately, the evolution of drug-resistant pathogens is unavoidable due to random genetic changes in the pathogens that can render antibiotics ineffective. While antibiotic therapy can succeed in killing susceptible pathogens, it also inadvertently selects for organisms that are resistant. Because each exposure to antibiotics contributes to this process, efforts to restrict antibiotic usage only slow the development of resistance. Ultimately, innovative antimicrobial drugs with diverse mechanisms of action will be needed to treat emerging resistant pathogens. Combating resistance Inappropriate use of antibiotics contributes significantly to the acceleration of resistance. Needlessly exposing patients to antibiotics (for example, for viral or mild infections likely to resolve on their own), the use of overly broad-spectrum antibiotics and suboptimal doses of appropriate therapy hasten the evolution of resistant pathogens. 
While affordable, rapid and accurate point-of-care diagnostics are essential for determining appropriate therapy for many bacterial diseases, routine clinical use will be limited if the tests are too expensive or not accessible during routine clinical encounters. In the absence of a clear diagnostic result, many health care providers prescribe empiric broad-spectrum therapy without knowing exactly what they are treating. Although inappropriate use is widespread in many parts of the world, where antibiotics are available without a prescription or oversight by a health care provider or stewardship team, overuse abounds even where antibiotic prescribing is more tightly regulated. Studies conducted in the USA indicate that around 258 million courses of antibiotics are dispensed annually for outpatient use (Hicks, 2013) and up to 75 per cent of ambulatory antibiotic prescriptions are for the treatment of common respiratory infections, which may or may not be bacterial in origin (McCaig, 1995). Recent evidence suggests that over half of these prescriptions are not medically indicated. For example, 60 per cent of US adults with a sore throat receive an antibiotic prescription after visiting a primary care practice or emergency department, despite the fact that only ten per cent require treatment with antibiotics. This is particularly troubling given the availability of rapid tests that can detect Group A Streptococcus, the bacteria responsible for the ten per cent of cases that require antibiotic treatment. The overuse of antibiotics has been driven largely by their low cost and clinical effectiveness, which has led many patients to view them as cure-alls with few risks. This perception is reinforced by the fact that antibiotics are curative in nature and used for short durations. However, the clinical effectiveness of these drugs decreases over time, as resistance naturally increases, and this process is accelerated with inappropriate use. 
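The scale of overuse implied by the figures above can be checked with back-of-the-envelope arithmetic. This sketch simply combines the cited estimates; treating the 75 per cent respiratory share and the "over half not indicated" figure as multipliable point estimates is an illustrative assumption, not a result from the studies themselves.

```python
# Rough arithmetic on US outpatient antibiotic overuse, combining the
# estimates cited above (Hicks, 2013; McCaig, 1995) as point values.
courses = 258e6            # outpatient antibiotic courses dispensed per year
respiratory_share = 0.75   # upper-bound share for common respiratory infections
not_indicated = 0.50       # "over half" of those not medically indicated

unnecessary = courses * respiratory_share * not_indicated
print(f"~{unnecessary / 1e6:.0f} million courses/year potentially unnecessary")

# Sore-throat example: 60% of adults receive a prescription,
# yet only ~10% (Group A strep cases) require one.
prescribed, needed = 0.60, 0.10
print(f"Excess sore-throat prescribing rate: {prescribed - needed:.0%}")
```

Even as a rough upper bound, the result is on the order of tens of millions of avoidable courses per year, which is why the text treats outpatient prescribing as the central stewardship target.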
Moreover, there are numerous consequences associated with the use of antibiotics, including over 140,000 emergency department visits yearly in the USA for adverse incidents (mostly allergic reactions; CDC, 2013a). In addition, antibiotics can eliminate protective bacteria in the gut, leaving patients vulnerable to infection with Clostridium difficile, which causes diarrhoeal illness that results in 14,000 deaths every year in the USA (CDC, 2013b). It is estimated that antimicrobial resistance costs the US health care system over US$20 billion annually in excess care and an additional $35 billion in lost productivity (Roberts et al., 2009). The inappropriate use of antimicrobial drugs is particularly concerning because highly resistant pathogens can easily cross national borders and rapidly spread around the globe. In recent years, strains of highly drug-resistant tuberculosis, carbapenem-resistant Enterobacteriaceae and other resistant pathogens have spread outside their countries of origin within several years of their detection. Because resistant bacteria are unlikely to stay isolated, stewardship efforts must be improved globally and international attention is needed to improve surveillance of emerging pathogens and resistance patterns. A major challenge for clinicians and regulators will be to find stewardship interventions that can be scaled up and involve multiple stakeholders, including providers, drug manufacturers, health care purchasers (insurers), governments and patients themselves. Such interventions should include practical and cost-effective educational programmes targeted towards providers and patients that shift expectations for antibiotic prescriptions to a mutual understanding of the benefits and risks of these drugs. Educational programmes alone, however, will not be sufficient to lower prescribing rates to recommended levels. 
Curbing the inappropriate use of antibiotics will also require stronger mechanisms that leverage the critical relationships between the stakeholders. For example, health care purchasers can play an important role by using financial disincentives to align prescribing habits with clinical guidelines that are developed by infectious disease specialists in the private and public sectors. This type of approach has the potential to be effective because it includes multiple stakeholders that share responsibility for the appropriate use of antibiotics and, ultimately, patient care. Key obstacles to antibiotic development The continual natural selection for resistant pathogens despite efforts to limit antibiotic use underscores the need for new antibiotics with novel mechanisms of action. To date, antimicrobial drug innovation and development have not kept pace with resistance. The number of approved new molecular entities (NME) to treat systemic infections has been steadily declining for decades (see Figure 1). Some infections are not susceptible to any antibiotic and in some cases the only effective drugs may cause serious side effects, or be contra-indicated due to a patient's allergies or comorbidities (e.g. renal failure). There is significant unmet medical need for therapies that treat serious and life-threatening bacterial diseases caused by resistant pathogens, as well as some less serious infections where there are few treatment alternatives available (e.g. gonorrhoea). Antibiotic development for these areas of unmet medical need has been sidelined by a number of scientific, regulatory and economic obstacles. While the costs and complexity of any clinical trial necessary for approval by drug regulators can be substantial, in part due to the large study samples needed to demonstrate safety and efficacy, the infectious disease space faces a number of unique clinical challenges. 
Patients with serious drug-resistant infections may be in need of urgent antibiotic therapy, which can preclude efficient consent and timely trial enrolment procedures; prior therapy can also confound treatment effects if the patient is later enrolled in a trial for an experimental drug. In addition, many patients with these pathogens are likely to have a history of long-term exposure to the health care setting and may have significant comorbidities that render them less likely to meet inclusion criteria for clinical trials. Emerging infections for which there are few or no treatment options also tend to be relatively rare. This makes it difficult to conduct adequate and well-controlled trials, which typically enrol large numbers of patients. However, clinical drug development can take many years and waiting until such infections are more common is not feasible. It may also not be possible to conclusively identify the pathogen and its susceptibility at the point of enrolment due to the lack of rapid diagnostic technologies. Ultimately, uncertainty about the aetiology of an infection may necessitate trials with larger numbers of patients in order to achieve sufficient statistical power, further compounding the challenge of enrolling seriously ill infectious disease patients in the first place. The need to conduct large trials involving acutely ill patients that are difficult to identify can make antibiotic development prohibitively expensive for drug developers, especially given that antibiotics are relatively inexpensive and offer limited opportunities to generate returns. Unlike treatments for chronic diseases, antibiotic therapy tends to last no longer than a few weeks, and these drugs lose efficacy over time as resistance develops, leading to diminishing returns. The decline in antimicrobial drug innovation is largely due to these economic obstacles, which have led developers to seek more durable and profitable markets (e.g. 
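The statistical-power problem described above can be made concrete with the standard normal-approximation formula for a two-arm trial comparing cure rates. The cure rates below are illustrative assumptions, not figures from any actual antibiotic trial; the point is only how quickly the required enrolment grows as the detectable difference shrinks.

```python
import math

def n_per_arm(p1, p2):
    """Approximate patients per arm needed to detect a cure-rate
    difference p1 vs p2 (two-sided alpha = 0.05, 80% power),
    using the normal-approximation sample-size formula."""
    z_alpha = 1.96  # z for two-sided alpha = 0.05
    z_beta = 0.84   # z for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Halving the detectable effect roughly quadruples the trial size,
# which is why rare resistant infections are so hard to study:
print(n_per_arm(0.80, 0.60))  # 20-point difference
print(n_per_arm(0.80, 0.70))  # 10-point difference
```

For a pathogen seen only a few hundred times a year worldwide, even the smaller of these enrolment targets may be unreachable, which is the motivation for the limited-population pathways discussed below.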
cancer or chronic disease) in recent decades. There are only a handful of companies currently in the market and the development pipeline is very thin. Changes to research infrastructure, drug reimbursement and regulation are all potentially needed to revitalise antibiotic innovation. Opportunities to streamline innovative antibiotic development In the USA, several proposals have been made to expedite the development and regulatory review of antibiotics while ensuring that safety and efficacy requirements are met. In 2012, the US President’s Council of Advisors on Science and Technology recommended that the US Food and Drug Administration (FDA) create a ‘special medical use’ (SMU) designation for the review of drugs for subpopulations of patients with unmet medical need. Drug sponsors would be required to demonstrate that clinical trials in a larger patient population would need much more time to complete or not be feasible. A drug approved under the SMU designation could be studied in subgroups of patients that are critically ill, as opposed to the broader population, under the condition that the drug’s indication would be limited to the narrow study population. The SMU designation was discussed at an expert workshop convened by the Brookings Institution in August 2013. Many participants at the meeting agreed that there is a pressing need to develop novel antibiotics and that such a limited-use pathway could support the appropriate use of newly approved drugs. The Infectious Diseases Society of America developed a related drug development pathway called the Limited Population Antibacterial Drug (LPAD) approval mechanism. The LPAD approach calls for smaller, faster and less costly clinical trials to study antibiotics that treat resistant bacteria that cause serious infections. 
Both the SMU and LPAD approaches would allow drug developers to demonstrate product safety and efficacy in smaller patient subpopulations and provide regulatory clarity about acceptable benefit–risk profiles for antibiotics that treat serious bacterial diseases. The US House of Representatives is currently considering a bill1 that incorporates these concepts. A recent proposal from the drug manufacturer industry for streamlined antibiotic development is to establish a tiered regulatory framework to assess narrow-spectrum antibiotics (e.g. active against a specific bacterial genus and species or a group of related bacteria) that target resistant pathogens that pose the greatest threat to public health (Rex, 2013: pp. 269–275). This is termed a 'pathogen-focused' approach because the level of clinical evidence required for approval would be correlated with the threat level and feasibility of studying a specific pathogen or group of pathogens. The pathogen-focused approach was also highlighted at a recent workshop at the Brookings Institution (Brookings Institution, 2014). Some experts felt that the approach is promising but emphasised that each pathogen and experimental drug is unique and that it could be challenging to place them in a particular tier of a regulatory framework. Given that pathogen-focused drugs would likely be marketed internationally, it will be important for drug sponsors to have regular interactions and multiple levels of discussion with regulators to find areas of agreement that would facilitate the approval of these drugs. Antibiotics with very narrow indications could potentially support stewardship as well by limiting use to the most seriously ill patients. Safe use of these drugs would likely depend on diagnostics, significant provider education, labelling about the benefits and risks of the product, and the scope of clinical evidence behind its approval. 
Because these antibiotics would be used in a very limited manner, changes would potentially need to be made to how they are priced and reimbursed to ensure that companies are still able to generate returns on their investment. That said, a more focused drug development programme with regulatory clarity could greatly increase their odds of success and, combined with appropriate pricing and safe use provisions, could succeed in incentivising antimicrobial drug development for emerging infections. Endnote 1 H.R. 3742 – Antibiotic Development to Advance Patient Treatment (ADAPT) Act of 2013. References Barnett, M. L. and Linder, J. A., 2014. 'Antibiotic prescribing to adults with sore throat in the United States, 1997–2010'. JAMA Internal Medicine, 174(1), pp. 138–140. Brookings Institution, 2013. Special Medical Use: Limited Use for Drugs Developed in an Expedited Manner to Meet an Unmet Medical Need. Brookings Institution. Available at: www.brookings.edu/events/2013/08/01-special-medical-use Brookings Institution, 2014. Modernizing Antibacterial Drug Development and Promoting Stewardship. Available at: www.brookings.edu/events/2014/02/07-modernizing-antibacterialdrug-development [Accessed 11 March 2014]. CDC, 2013a. Antibiotic resistance threats in the United States, 2013 [PDF] CDC. Available at: www.cdc.gov/drugresistance/threatreport-2013/pdf/ar-threats-2013-508.pdf#page=25 [Accessed 16 January 2014]. CDC, 2013b. Clostridium difficile. Antibiotic resistance threats in the United States, 2013 [PDF] CDC. Available at: www.cdc.gov/drugresistance/threat-report-2013/pdf/ar-threats-2013-508.pdf#page=50 [Accessed 16 January 2014]. Hicks, L. A. et al., 2013. 'US Outpatient Antibiotic Prescribing, 2010'. New England Journal of Medicine, 368(15), pp. 1461–1463. Infectious Disease Society of America, 2012. Limited Population Antibacterial Drug (LPAD) Approval Mechanism. 
Available at: www.idsociety.org/uploadedFiles/IDSA/News_and_Publications/IDSA_News_Releases/2012/LPAD%20one%20pager.pdf [Accessed 5 March 2014]. Kumarasamy, K. K., Toleman, M. A., Walsh, T. R. et al., 2010. 'Emergence of a new antibiotic resistance mechanism in India, Pakistan, and the UK: A molecular, biological, and epidemiological study'. Lancet Infectious Diseases, 10(9), pp. 597–602. McCaig, L. F. and Hughes, J. M., 1995. 'Trends in antimicrobial drug prescribing among office-based physicians in the United States'. Journal of the American Medical Association, 273(3), pp. 214–219. President's Council of Advisors on Science and Technology, 2012. Report to the President on Propelling Innovation in Drug Discovery, Development and Evaluation. Available at: www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-fdafinal.pdf [Accessed 5 March 2014]. Rex, J. H. et al., 2013. 'A comprehensive regulatory framework to address the unmet need for new antibacterial treatments'. Lancet Infectious Diseases, 13(3), pp. 269–275. Roberts, R. R., Hota, B., Ahmad, I. et al., 2009. 'Hospital and societal costs of antimicrobial-resistant infections in a Chicago teaching hospital: Implications for antibiotic stewardship'. Clinical Infectious Diseases, 49(8), pp. 1175–1184. WHO (World Health Organization), 2010. Fact Sheet: Rational Use of Medicines [webpage] WHO. Available at: www.who.int/mediacentre/factsheets/fs338/en [Accessed 28 February 2014]. WHO (World Health Organization), 2013. Antimicrobial Drug Resistance [PDF] WHO. Available at: http://apps.who.int/gb/ebwha/pdf_files/EB134/B134_37-en.pdf [Accessed 6 March 2014]. WHO (World Health Organization), 2013. 
Notified MDR-TB cases (number per 100,000 population), 2005–12. WHO. Available at: https://extranet.who.int/sree/Reports?op=vs&path=/WHO_HQ_Reports/G2/PROD/EXT/MDRTB_Indicators_map [Accessed 28 February 2014]. Downloads Antibiotics Stewardship and Innovation Authors Gregory W. DanielDerek GriffingSophie Mayer Publication: Commonwealth Health Partnerships 2014 Full Article
va Advancing antibiotic development in the age of 'superbugs' By webfeeds.brookings.edu Published On :: Fri, 27 Feb 2015 14:37:00 -0500 While antibiotics are necessary and crucial for treating bacterial infections, their misuse over time has contributed to an alarming rate of antibiotic resistance, including the development of multidrug-resistant bacteria, or "superbugs." Misuse occurs in all corners of public and private life, from the doctor's office, where antibiotics are prescribed to treat viruses, to industrial agriculture, where they are used in abundance to prevent disease in livestock. New data from the World Health Organization (WHO) and U.S. Centers for Disease Control and Prevention (CDC) confirm that rising overuse of antibiotics has already become a major public health threat worldwide. As drug resistance increases, we will see a number of dangerous and far-reaching consequences. First, common infections like STDs, pneumonia, and "staph" infections will become increasingly difficult to treat, and in extreme cases these infections may require hospitalization or treatment with expensive and toxic second-line therapies. In fact, recent estimates suggest that every year more than 23,000 people die due to drug-resistant infections in the U.S., and many more suffer from complications caused by resistant pathogens. Further, infections will be harder to control. Health care providers are increasingly encountering highly resistant infections not only in hospitals – where such infections can easily spread between vulnerable patients – but also in outpatient care settings. Fundamental Approaches to Slowing Resistance Incentivize appropriate use of antibiotics. Many patients and providers underestimate the risks of using antibiotics when they are not warranted, in part because these drugs often have rapid beneficial effects for those who truly need them. 
In many parts of the world the perception that antibiotics carry few risks has been bolstered by their low costs and availability without a prescription or contact with a trained health care provider. Education efforts, stewardship programs, and the development of new clinical guidelines have shown some success in limiting antibiotic use, but these fixes are limited in scope and generally not perceived as cost-effective or sustainable. Broader efforts to incentivize appropriate use, coupled with economic incentives, may be more effective in changing the culture of antibiotic use. These options might include physician or hospital report cards that inform patients' selection of providers, or bonuses based on standardized performance measures that can be used to report on success in promoting appropriate use. While these might create additional costs, they would likely help control rates of drug-resistant infections and outweigh the costs of treating them. Reinvigorate the drug development pipeline with novel antibiotics. There has not been a new class of antibiotics discovered in almost three decades, and companies have largely left the infectious disease space for more stable and lucrative product lines, such as cancer and chronic disease. Antibiotics have historically been inexpensive and are typically used only for short periods of time, creating limited opportunities for return on investment. In addition, unlike cancer or heart disease treatments, antibiotics lose effectiveness over time, making them unattractive for investment. Once they are on the market, the push to limit use of certain antibiotics to the most severe infections can further constrict an already weak market. Late last year, H.R. 3742, the Antibiotic Development to Advance Patient Treatment (ADAPT) Act of 2013, was introduced and referred to the House Energy and Commerce Subcommittee on Health. 
If enacted, the ADAPT Act would create a streamlined development pathway to expedite the approval of antibiotics that treat limited patient populations with serious unmet medical needs. This could potentially reduce costs and development time for companies, thereby encouraging investment in this space. Regulators have indicated that they would also welcome the opportunity to evaluate benefits and risk for a more selective patient subpopulation if they could be confident the product would be used appropriately. The bill has received a great deal of support and would help address a critical public health need (I cover this topic in more detail with my colleagues Kevin Outterson, John Powers, and Mark McClellan in a recent Health Affairs paper). Advance new economic incentives to remedy market failure. Innovative changes to pharmaceutical regulation, research and development (R&D), and reimbursement are necessary to alleviate the market failure for antibacterial drugs. A major challenge, particularly within a fee-for-service or volume-based reimbursement system, is providing economic incentives that promote investment in drug development without encouraging overuse. A number of public and private stakeholders, including the Engelberg Center for Health Care Reform and Chatham House’s Centre on Global Health Security Working Group on Antimicrobial Resistance, are exploring alternative reimbursement mechanisms that “de-link” revenue from the volume of antibiotics sold. Such a mechanism, combined with further measures to stimulate innovation, could create a stable incentive structure to support R&D. Improve tracking and monitoring of resistance in the outpatient setting. There is increasing concern about much less rigorous surveillance capabilities in the outpatient setting, where drug-resistant infections are also on the rise. 
Policymakers should consider new incentives for providers and insurers to encourage a coordinated approach for tracking inpatient and outpatient resistance data. The ADAPT Act, mentioned above, also seeks to enhance monitoring of antibiotic utilization and resistance patterns. Health insurance companies can leverage resistance-related data linked to health care claims, while providers can capture lab results in electronic health records. Ultimately, this data could be linked to health and economic outcomes at the state, federal, and international levels, and provide a more comprehensive population-based understanding of the impact and spread of resistance. Current examples include the Food and Drug Administration’s (FDA) Sentinel Initiative and the Patient-Centered Outcomes Research Institute’s PCORnet initiative. Antibiotic resistance is an urgent and persistent threat. As such, patients and providers will continue to require new antibiotics as older drugs are forced into retirement by resistant pathogens. Stewardship efforts will remain critical in the absence of game-changing therapies that parry resistance mechanisms. Lastly, a coordinated surveillance approach that involves diverse stakeholder groups is needed to understand the health and economic consequences of drug resistance, and to inform antibiotic development and stewardship efforts. Editor's note: This blog was originally posted in May 2014 on Brookings UpFront. Authors Gregory W. Daniel Full Article
va Scaling up social enterprise innovations: Approaches and lessons By webfeeds.brookings.edu Published On :: Thu, 07 Jul 2016 09:53:00 -0400 In 2015 the international community agreed on a set of ambitious sustainable development goals (SDGs) for the global society, to be achieved by 2030. One of the lessons that the implementation of the Millennium Development Goals (MDGs) has highlighted is the importance of a systematic approach to identify and sequence development interventions—policies, programs, and projects—to achieve such goals at a meaningful scale. The Chinese approach to development, which consists of identifying a problem and long-term goal, testing alternative solutions, and then implementing those that are promising in a sustained manner, learning and adapting as one proceeds—Deng Xiaoping's "crossing the river by feeling the stones"—is an approach that holds promise for successful achievement of the SDGs. Having observed the Chinese way, then World Bank Group President James Wolfensohn in 2004, together with the Chinese government, convened a major international conference in Shanghai on scaling up successful development interventions, and in 2005 the World Bank Group (WBG) published the results of the conference, including an assessment of the Chinese approach (Moreno-Dodson 2005). Some ten years later, the WBG once again is addressing the question of how to support scaling up of successful development interventions, at a time when the challenge and opportunity of scaling up have become a widely recognized issue for many development institutions and experts. 
In parallel with the recognition that scaling up matters, the development community is now also focusing on social enterprises (SEs), a new set of actors falling between the traditionally recognized public and private sectors. We adopt here the World Bank's definition of "social enterprises" as a social-mission-led organization that provides sustainable services to Base of the Pyramid (BoP) populations. This is broadly in line with other existing definitions for the sector and reflects the World Bank's primary interest in social enterprises as a mechanism for supporting service delivery for the poor. Although social enterprises can adopt various organizational forms—business, nongovernmental organizations (NGOs), and community-based organizations are all forms commonly adopted by social enterprises—they differ from private providers principally by combining three features: operating with a social purpose, adhering to business principles, and aiming for financial sustainability. Since traditional private and public service providers frequently do not reach the poorest people in developing countries, social enterprises can play an important role in providing key services to those at the "base of the pyramid." (Figure 1) Figure 1. Role of SE sector in public service provision Social enterprises often start at the initiative of a visionary entrepreneur who sees a significant social need, whether in education, health, sanitation, or microfinance, and who responds by developing an innovative way to address the perceived need, usually by setting up an NGO, or a for-profit enterprise. Social enterprises and their innovations generally start small. 
When successful, they face an important challenge: how to expand their operations and innovations to meet the social need at a larger scale. Development partner organizations—donors, for short—have recognized the contribution that social enterprises can make to find and implement innovative ways to meet the social service needs of people at the base of the pyramid, and they have started to explore how they can support social enterprises in responding to these needs at a meaningful scale. The purpose of this paper is to present a menu of approaches for addressing the challenge of scaling up social enterprise innovations, based on a review of the literature on scaling up and on social enterprises. The paper does not aim to offer specific recommendations for entrepreneurs or blueprints and guidelines for the development agencies. The range of settings, problems, and solutions is too wide to permit that. Rather, the paper provides an overview of ways to think about and approach the scaling up of social enterprise innovations. Where possible, the paper also refers to specific tools that can be helpful in implementing the proposed approaches. Note that we talk about scaling up social enterprise innovations, not about social enterprises. This is because it is the innovations and how they are scaled up that matter. An innovation may be scaled up by the social enterprise where it originated, by handoff to a public agency for implementation at a larger scale, or by other private enterprises, small or large. This paper is structured in three parts: Part I presents a general approach to scaling up development interventions. This helps establish basic definitions and concepts. Part II considers approaches for the scaling up of social enterprise innovations. Part III provides a summary of the main conclusions and lessons from experience. A postscript draws out implications for external aid donors. 
Examples from actual practice are used to exemplify the approaches and are summarized in Annex boxes. Downloads Download the full paper (PDF) Authors Natalia AgapitovaJohannes F. Linn Full Article
va School policies and the success of advantaged and disadvantaged students By webfeeds.brookings.edu Published On :: Thu, 02 Aug 2018 09:00:16 +0000 executive summary We make use of matched birth-school administrative data from Florida, coupled with an extensive survey of instructional policies and practices, to observe which policies and practices are associated with improved test performance for relatively advantaged students in a school, for relatively disadvantaged students in a school, for both, and for neither. We consider… Full Article
va How a Detroit developer is using innovative leasing to support the city’s creative economy By webfeeds.brookings.edu Published On :: Mon, 27 Apr 2020 15:14:44 +0000 Inclusive growth is a top priority in today’s uneven economy, as widening income inequities, housing affordability crises, and health disparities leave certain places and people without equitable access to opportunity, health, and well-being. Brookings and others have long argued that inclusive economic growth is essential to mitigate such disparities, yet implementing inclusive growth models and… Full Article
va How ‘innovation districts’ are continuing the fight against COVID-19 By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2020 13:55:33 +0000 Last month, I wrote about innovation districts’ critical efforts to mitigate the impacts of COVID-19. Since the outset of the pandemic, these districts have leveraged their academic research capabilities, innovation infrastructure (e.g., laboratories, advanced technologies, Big Data for modeling), and local and global peer networks to understand and contain the spread of the coronavirus. These… Full Article
va Webinar: How federal job vacancies hinder the government’s response to COVID-19 By webfeeds.brookings.edu Published On :: Mon, 20 Apr 2020 20:52:41 +0000 Vacant positions and high turnover across the federal bureaucracy have been a perpetual problem since President Trump was sworn into office. Upper-level Trump administration officials (“the A Team”) have experienced a turnover rate of 85 percent — much higher than any other administration in the past 40 years. The struggle to recruit and retain qualified… Full Article
va Advancing antibiotic development in the age of 'superbugs' By webfeeds.brookings.edu Published On :: Fri, 27 Feb 2015 14:37:00 -0500 While antibiotics are necessary and crucial for treating bacterial infections, their misuse over time has contributed to an alarming rate of antibiotic resistance, including the development of multidrug-resistant bacteria, or "superbugs." Misuse manifests in all corners of public and private life: in the doctor's office, where antibiotics are prescribed to treat viruses, and in industrial agriculture, where they are used in abundance to prevent disease in livestock. New data from the World Health Organization (WHO) and U.S. Centers for Disease Control and Prevention (CDC) confirm that rising overuse of antibiotics has already become a major public health threat worldwide. As drug resistance increases, we will see a number of dangerous and far-reaching consequences. First, common infections like STDs, pneumonia, and "staph" infections will become increasingly difficult to treat, and in extreme cases these infections may require hospitalization or treatment with expensive and toxic second-line therapies. In fact, recent estimates suggest that every year more than 23,000 people die due to drug-resistant infections in the U.S., and many more suffer from complications caused by resistant pathogens. Further, infections will be harder to control. Health care providers are increasingly encountering highly resistant infections not only in hospitals – where such infections can easily spread between vulnerable patients – but also in outpatient care settings. Fundamental Approaches to Slowing Resistance Incentivize appropriate use of antibiotics. Many patients and providers underestimate the risks of using antibiotics when they are not warranted, in part because these drugs often have rapid beneficial effects for those who truly need them.
In many parts of the world, the perception that antibiotics carry few risks has been bolstered by their low costs and availability without a prescription or contact with a trained health care provider. Education efforts, stewardship programs, and the development of new clinical guidelines have shown some success in limiting antibiotic use, but these fixes are limited in scope and generally not perceived as cost-effective or sustainable. Broader efforts to incentivize appropriate use, coupled with economic incentives, may be more effective in changing the culture of antibiotic use. These options might include physician or hospital report cards that inform patients' choice of provider, or bonuses based on standardized performance measures that track success in promoting appropriate use. While these measures might create additional costs, they would likely help control rates of drug-resistant infections, and the savings would outweigh the costs of treating those infections. Reinvigorate the drug development pipeline with novel antibiotics. There has not been a new class of antibiotics discovered in almost three decades, and companies have largely left the infectious disease space for more stable and lucrative product lines, such as cancer and chronic disease. Antibiotics have historically been inexpensive and are typically used only for short periods of time, creating limited opportunities for return on investment. In addition, unlike cancer or heart disease treatments, antibiotics lose effectiveness over time, making them unattractive for investment. Once they are on the market, the push to limit use of certain antibiotics to the most severe infections can further constrict an already weak market. Late last year, H.R. 3742, the Antibiotic Development to Advance Patient Treatment (ADAPT) Act of 2013, was introduced and referred to the House Energy and Commerce Subcommittee on Health.
If enacted, the ADAPT Act would create a streamlined development pathway to expedite the approval of antibiotics that treat limited patient populations with serious unmet medical needs. This could potentially reduce costs and development time for companies, thereby encouraging investment in this space. Regulators have indicated that they would also welcome the opportunity to evaluate benefits and risks for a more selective patient subpopulation if they could be confident the product would be used appropriately. The bill has received a great deal of support and would help address a critical public health need (I cover this topic in more detail with my colleagues Kevin Outterson, John Powers, and Mark McClellan in a recent Health Affairs paper). Advance new economic incentives to remedy market failure. Innovative changes to pharmaceutical regulation, research and development (R&D), and reimbursement are necessary to alleviate the market failure for antibacterial drugs. A major challenge, particularly within a fee-for-service or volume-based reimbursement system, is providing economic incentives that promote investment in drug development without encouraging overuse. A number of public and private stakeholders, including the Engelberg Center for Health Care Reform and Chatham House’s Centre on Global Health Security Working Group on Antimicrobial Resistance, are exploring alternative reimbursement mechanisms that “de-link” revenue from the volume of antibiotics sold. Such a mechanism, combined with further measures to stimulate innovation, could create a stable incentive structure to support R&D. Improve tracking and monitoring of resistance in the outpatient setting. There is increasing concern that surveillance capabilities are much less rigorous in the outpatient setting, where drug-resistant infections are also on the rise.
Policymakers should consider new incentives for providers and insurers to encourage a coordinated approach for tracking inpatient and outpatient resistance data. The ADAPT Act, mentioned above, also seeks to enhance monitoring of antibiotic utilization and resistance patterns. Health insurance companies can leverage resistance-related data linked to health care claims, while providers can capture lab results in electronic health records. Ultimately, this data could be linked to health and economic outcomes at the state, federal, and international levels, and provide a more comprehensive population-based understanding of the impact and spread of resistance. Current examples include the Food and Drug Administration’s (FDA) Sentinel Initiative and the Patient-Centered Outcomes Research Institute’s PCORnet initiative. Antibiotic resistance is an urgent and persistent threat. As such, patients and providers will continue to require new antibiotics as older drugs are forced into retirement by resistant pathogens. Stewardship efforts will remain critical in the absence of game-changing therapies that parry resistance mechanisms. Lastly, a coordinated surveillance approach that involves diverse stakeholder groups is needed to understand the health and economic consequences of drug resistance, and to inform antibiotic development and stewardship efforts. Editor's note: This blog was originally posted in May 2014 on Brookings UpFront. Authors Gregory W. Daniel Full Article
va State of biomedical innovation conference By webfeeds.brookings.edu Published On :: Fri, 13 Mar 2015 09:00:00 -0400 Event Information March 13, 2015, 9:00 AM - 11:30 AM EDT, Falk Auditorium, Brookings Institution, 1775 Massachusetts Avenue NW, Washington, DC 20036 Register for the Event As policy agendas for 2015 come into sharper focus, much of the national conversation is aimed at tackling challenges in biomedical innovation. The first two months of the year alone have seen landmark proposals from Congress and the Obama Administration, including the House’s 21st Century Cures initiative, a bipartisan Senate working group focused on medical progress, President Obama’s Precision Medicine Initiative, and a number of additional priorities being advanced by federal agencies and other stakeholders. On March 13, the Engelberg Center for Health Care Reform hosted the State of Biomedical Innovation Conference to provide an overview of emerging policy efforts and priorities related to improving the biomedical innovation process. Senior leaders from government, academia, industry, and patient advocacy shared their thoughts on the challenges facing medical product development and promising approaches to overcome them. The discussion also examined the data and analyses that provide the basis for new policies and track their ultimate success. Join the conversation by following @BrookingsMed or #biomed Video State of biomedical innovation conference: Panel 1, State of biomedical innovation conference: Panel 2 Audio State of biomedical innovation conference Transcript Uncorrected Transcript (.pdf) Event Materials 20150313_biomed_transcript, 313 MASTER DECK2 Full Article
va Cost, value and patient outcomes: The growing need for payer engagement By webfeeds.brookings.edu Published On :: Mon, 20 Apr 2015 00:00:00 -0400 Editor's note: This article appears in the April 2015 issue of Global Forum. Click here to view the full publication. Since passage of the Affordable Care Act in 2010, the last several years have seen a groundswell in physician payment and delivery reforms designed to achieve higher value health care through incentivizing higher quality care and lower overall costs. Accountable care models, for example, are achieving marked progress by realigning provider incentives toward greater risk-sharing and increased payments and shared savings with measured improvements in quality and cost containment. Medical homes are introducing greater care coordination and team-based care management, while the use of episode-based or bundled payments is removing perverse incentives that reward volume and intensity. These reforms are coming just as the number of highly targeted, highly priced treatments continues to expand. The U.S. Food and Drug Administration (FDA) approved a decade-high 41 novel new drugs in 2014, many of them targeted therapies approved on the basis of increasingly sophisticated progress in genomics and the understanding of disease progression. In areas like oncology, such targeted treatments have grown as a percentage of global oncology market size from 11% in 2003 to 46% in 2013. New brand specialty drug spending in the U.S. is estimated to have been $7.5 billion in 2013, or 69% of total new drug spending. The growing prevalence of these drugs and their cost to the health system are setting the stage for significant flashpoints between industry, payers, and providers, seen most clearly in the debate over hepatitis C treatment costs that roiled stakeholder interactions for most of the past year. 
More of these targeted treatments are in the development pipeline, and a growing number of public policy efforts taking shape in 2015 are focused on accelerating their availability. The House of Representatives' 21st Century Cures Initiative, for example, has released a slew of legislative proposals aimed at promoting breakthrough innovation by increasing the efficiency of drug development and regulatory review. These efforts have significant downstream implications for the pace at which targeted and specialty therapies will become available, their associated costs, and the growing importance of demonstrating value in the postmarket setting. As payers and providers continue their push toward increased value-based care, more innovative models for connecting such reforms to drug development are needed. Earlier collaboration with industry could enable more efficient identification of unmet need, opportunities to add value through drug development, and clearer input on the value proposition and evidentiary thresholds needed for coverage. Equally important will be unique public-private collaborations that invest in developing a better postmarket data infrastructure that can more effectively identify high value uses of new treatments and support achieving value through new payment reforms. Stronger collaboration could also improve evidence development and the coverage determination process after a targeted treatment has gained regulatory approval. Facilitated drug access programs like those proposed by the Medicare Administrative Contractor Palmetto GBA create access points for patients to receive targeted anti-cancer agents off-label while payers and industry gather important additional outcomes data in patient registries. 
More systematic and efficient use of policies like Medicare's Coverage with Evidence Development (CED), which allows for provisional coverage for promising technologies or treatments while evidence continues to be collected, could enable industry and payers to work together to learn about a medical product's performance in patient populations not typically represented in clinical studies. A CED-type model could be especially useful for certain specialty drugs: data collected as a condition of payment could help payers and providers develop evidence from actual practice to improve treatment algorithms, increase adherence, and improve outcomes. Finally, collaborations that support stronger postmarket data collection can also support novel drug payment models that further reward value. Bundled payments that include physician-administered drugs, for example, could encourage providers to increase quality while also incentivizing manufacturers to help promote evidence-based drug use and lower costs for uses that generate low value. Outcomes-based purchasing contracts that tie price paid to a medical product's performance could be another promising approach for high-expense treatments with clearly defined and feasibly measured outcomes. Many of these ideas are not new, but as manufacturers, payers, providers, and patients move into an increasingly value-focused era of health care, it is clear that they must work together to find new ways to promote development of promising new treatments while also making good on the promise of value-based health care reforms. Authors Gregory W. Daniel, Morgan H. Romine Publication: Global Forum Online Image Source: © Mike Segar / Reuters Full Article
va Faster, more efficient innovation through better evidence on real-world safety and effectiveness By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2015 00:00:00 -0400 Many proposals to accelerate and improve medical product innovation and regulation focus on reforming the product development and regulatory review processes that occur before drugs and devices get to market. While important, such proposals alone do not fully recognize the broader opportunities that exist to learn more about the safety and effectiveness of drugs and devices after approval. As drugs and devices begin to be used in larger and more diverse populations and in more personalized clinical combinations, evidence from real-world use during routine patient care is increasingly important for accelerating innovation and improving regulation. First, further evidence development from medical product use in large populations can allow providers to better target and treat individuals, precisely matching the right drug or device to the right patients. As genomic sequencing and other diagnostic technologies continue to improve, postmarket evidence development is critical to assessing the full range of genomic subtypes, comorbidities, patient characteristics and preferences, and other factors that may significantly affect the safety and effectiveness of drugs and devices. This information is often not available or population sizes are inadequate to characterize such subgroup differences in premarket randomized controlled trials. Second, improved processes for generating postmarket data on medical products are necessary for fully realizing the intended effect of premarket reforms that expedite regulatory approval. 
The absence of a reliable postmarket system to follow up on potential safety or effectiveness issues means that potential signals or concerns must instead be addressed through additional premarket studies or through one-off postmarket evaluations that are more costly, slower, and likely to be less definitive than would be possible through a better-established infrastructure. As a result, the absence of better systems for generating postmarket evidence creates a barrier to more extensive use of premarket reforms to promote innovation. These issues can be addressed through initiatives that combine targeted premarket reforms with postmarket steps to enhance innovation and improve evidence on safety and effectiveness throughout the life cycle of a drug or device. The ability to routinely capture clinically relevant electronic health data within our health care ecosystem is improving, increasingly allowing electronic health records, payer claims data, patient-reported data, and other relevant data to be leveraged for further research and innovation in care. Recent legislative proposals released by the House of Representatives’ 21st Century Cures effort acknowledge and seek to build on this progress in order to improve medical product research, development, and use. The initial Cures discussion draft included provisions for better, more systematic reporting of and access to clinical trials data; for increased access to Medicare claims data for research; and for FDA to promulgate guidance on the sources, analysis, and potential use of so-called Real World Evidence. These are potentially useful proposals that could contribute valuable data and methods to advancing the development of better treatments. What remains a gap in the Cures proposals, however, is a more systematic approach to improving the availability of postmarket evidence. Such a systematic approach is possible now. 
Biomedical researchers, health care plans, and providers are doing more to collect and analyze clinical and outcomes data. Multiple independent efforts – including the U.S. Food and Drug Administration’s Sentinel Initiative for active postmarket drug safety surveillance, the Patient-Centered Outcomes Research Institute’s PCORnet for clinical effectiveness studies, the Medical Device Epidemiology Network (MDEpiNet) for developing better methods and registries for medical device surveillance, and a number of dedicated, product-specific outcomes registries – have demonstrated the potential for large-scale, systematic postmarket data collection. Building on these efforts could provide unprecedented evidence on how medical products perform in the real world and on the course of underlying diseases that they are designed to treat, while still protecting patient privacy and confidentiality. These and other postmarket data systems now hold the potential to contribute to public-private collaboration for improved population-based evidence on medical products on a wider scale. Action in the Cures initiative to unlock this potential will enable the legislation to achieve its intended effect of promoting quicker, more efficient development of effective, personalized treatments and cures. What follows is a set of both short- and long-term proposals that would bolster the current systems for postmarket evidence development, create new mechanisms for generating postmarket data, and enable individual initiatives on evidence development to work together as part of a broad push toward a truly learning health care system. Downloads Download paper Authors Mark B. McClellan, Gregory W. Daniel Full Article
va Why legislative proposals to improve drug and device development must look beyond FDA approvals By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2015 08:00:00 -0400 Legislative proposals to accelerate and improve the development of innovative drugs and medical devices generally focus on reforming the clinical development and regulatory review processes that occur before a product gets to market. Many of these proposals – such as boosting federal funding for basic science, streamlining the clinical trials process, improving incentives for development in areas of unmet medical need, or creating expedited FDA review pathways for promising treatments – are worthy pursuits and justifiably part of ongoing efforts to strengthen biomedical innovation in the United States, such as the 21st Century Cures initiative in the House and a parallel effort taking shape in the Senate. What has largely been missing from these recent policy discussions, however, is an equal and concerted focus on the role that postmarket evidence can play in creating a more robust and efficient innovation process. Data on medical product safety, efficacy, and associated patient outcomes accrued through routine medical practice and through practical research involving a broad range of medical practices could not only bolster our understanding of how well novel treatments are achieving their intended effects, but reinforce many of the premarket reforms currently under consideration. Below and in a new paper, we highlight the importance of postmarket evidence development and present a number of immediately achievable proposals that could help lay the foundation for future cures. Why is postmarket evidence development important? There are a number of reasons why evidence developed after a medical product’s approval should be considered an integral part of legislative efforts to improve biomedical innovation. 
First and foremost, learning from clinical experiences with medical products in large patient populations can allow providers to better target and treat individuals, matching the right drug or device to the right patient based on real-world evidence. Such knowledge can in turn support changes in care that lead to better outcomes and thus higher value realized by any given medical product. Similarly, data developed on outcomes, disease progression, and associated genetic and other characteristics that suggest differences in disease course or response to treatment can form the foundation of future breakthrough medical products. As we continue to move toward an era of increasingly targeted treatments, the importance of this type of real-world data cannot be discounted. Finally, organized efforts to improve postmarket evidence development can further establish infrastructure and robust data sources for ensuring the safety and effectiveness of FDA-approved products, protecting patient lives. This is especially important as Congress, the Administration, and others continue to seek novel policies for further expediting the pre-market regulatory review process for high-priority treatments. Without a reliable postmarket evidence development infrastructure in place, attempts to further shorten the time it takes to move a product from clinical development to FDA approval may run up against the barrier of limited capabilities to gather the postmarket data needed to refine a product’s safety and effectiveness profile. While this is particularly important for medical devices – the “life cycle” of a medical device often involves many important revisions in the device itself and in how and by whom it is used after approval – it is also important for breakthrough drugs, which may increasingly be approved based on biomarkers that predict clinical response and in particular subpopulations of patients. What can be done now?
The last decade has seen progress in the availability of postmarket data and the production of postmarket evidence. Biomedical researchers, product developers, health care plans, and providers are doing more to collect and analyze clinical and outcomes data. Multiple independent efforts – including the U.S. Food and Drug Administration’s Sentinel Initiative for active postmarket drug safety surveillance, the Patient-Centered Outcomes Research Institute’s PCORnet for clinical effectiveness studies, the Medical Device Epidemiology Network (MDEpiNet) for developing better methods and registries for medical device surveillance, and a number of dedicated, product-specific outcomes registries – have demonstrated the powerful effects that rigorous, systematic postmarket data collection can have on our understanding of how medical products perform in the real world and of the course of underlying diseases that they are designed to treat. These and other postmarket data systems now hold the potential to contribute to data analysis and improved population-based evidence development on a wider scale. Federal support for strengthening the processes and tools through which data on important health outcomes can be leveraged to improve evidence on the safety, effectiveness, and value of care; for creating transparent and timely access to such data; and for building on current evidence development activities will help to make the use of postmarket data more robust, routine, and reliable. Toward that end, we put forward a number of targeted proposals that current legislative efforts should consider as the 2015 policy agenda continues to take shape: Evaluate the potential use of postmarket evidence in regulatory decision-making. The initial Cures discussion draft mandated that FDA establish a process by which pharmaceutical manufacturers could submit real-world evidence to support Agency regulatory decisions.
While this is an important part of further establishing methods and mechanisms for harnessing data developed in the postmarket space, the proposed timelines (roughly 12 months to first Guidance for Industry) and wide scope of the program do not allow for a thoughtful, collaboratively developed approach to utilizing real-world evidence. Future proposals should allow FDA to take a longer, multi-stakeholder approach to identify the current sources of real-world data, gaps in such collection activities, standards and methodologies for collection, and priority areas where more work is needed to understand how real-world data could be used. Expand the Sentinel System’s data collection activities to include data on effectiveness. Established by Congress in 2007, Sentinel is a robust surveillance system geared toward monitoring the safety of drugs and biologics. In parallel with the program for evaluating the use of RWE outlined above, FDA could work with stakeholders to identify and pursue targeted extensions of the Sentinel system that begin to pilot collection of such data. Demonstration projects could enable faster and more effective RWE development to characterize treatment utilization patterns, further refine a product’s efficacy profile, or address pressing public health concerns – all by testing strategic linkages to data elements outside of Sentinel’s safety focus. Establish an active postmarket safety surveillance system for medical devices. Congress has already acted once to establish device surveillance, mandating in 2012 that Sentinel be expanded to include safety data on medical devices. To date, however, there has been no additional support for such surveillance or even the capability of individually tracking medical devices in use.
With the recently finalized Unique Device Identifier rule going into effect and the ability to perform such tracking on the horizon, the time is now to adopt recent proposals from FDA’s National Medical Device Postmarket Surveillance System Planning Board. With Congressional authorization for FDA to establish an implementation plan and adequate appropriations, the true foundation for such a system could finally be put into place. These next steps are practical, immediately achievable, and key to fully realizing the intended effect of other policy efforts aimed at both improving the biomedical innovation process and strengthening the move to value-based health care. Authors Mark B. McClellan, Gregory W. Daniel, Morgan Romine Full Article
va Risk evaluation and mitigation strategies (REMS): Building a framework for effective patient counseling on medication risks and benefits By webfeeds.brookings.edu Published On :: Fri, 24 Jul 2015 08:45:00 -0400 Event Information July 24, 2015, 8:45 AM - 4:15 PM EDT, The Brookings Institution, 1775 Massachusetts Ave., NW, Washington, DC Under the Food and Drug Administration Amendments Act (FDAAA) of 2007, the FDA has the authority to require pharmaceutical manufacturers to develop Risk Evaluation and Mitigation Strategies (REMS) for drugs or biologics that carry serious potential or known risks. Since that time, the REMS program has become an important tool in ensuring that riskier drugs are used safely, and it has allowed FDA to facilitate access to a host of drugs that may not otherwise have been approved. However, concerns have arisen regarding the effects of REMS programs on patient access to products, as well as the undue burden that the requirements place on the health care system. In response to these concerns, FDA has initiated reform efforts aimed at improving the standardization, assessment, and integration of REMS within the health care system. As part of this broader initiative, the agency is pursuing four priority projects, one of which focuses on improving provider-patient benefit-risk counseling for drugs that have a REMS attached. Under a cooperative agreement with FDA, the Center for Health Policy at Brookings held an expert workshop on July 24 titled, “Risk Evaluation and Mitigation Strategies (REMS): Building a Framework for Effective Patient Counseling on Medication Risks and Benefits”. This workshop was the first in a series of convening activities that will seek input from stakeholders across academia, industry, health systems, and patient advocacy groups, among others.
Through these activities, Brookings and FDA will further develop and refine an evidence-based framework of best practices and principles that can be used to inform the development and effective use of REMS tools and processes. Event Materials REMS_PBRC_Meeting_Agenda, REMS BR Speaker Bios, REMS BenefitRisk Meeting Summary, REMS BenefitRisk communication white paper Full Article
va Defining and measuring innovation in a changing biomedical landscape By webfeeds.brookings.edu Published On :: Wed, 14 Oct 2015 09:00:00 -0400 Event Information October 14, 2015, 9:00 AM - 2:30 PM EDT, Washington Plaza Hotel, 10 Thomas Circle, NW, Washington, DC 20005 The biomedical innovation ecosystem continues to evolve and enhance the processes by which treatments are developed and delivered to patients. Given this changing biomedical innovation landscape, it is imperative that all stakeholders work to ensure that development programs, regulatory practices, and the policies that enable them are aligned on and achieving a common set of goals. This will require a thorough reexamination of our understanding of biomedical innovation – and the subsequent ways in which we seek to incentivize it – in order to more effectively bridge research and analysis of the process itself with the science and policy underpinning it. Traditional research into the efficiency and effectiveness of drug development programs has tended to focus on the ‘inputs’ and process trends in product development, quantifying innovation as discrete units. At the opposite end of the research spectrum are potential measures that could be categorized as “value” or “outcomes” metrics. Identifying the appropriate measures across this spectrum – from inputs and technological progress through outcomes and value – and how such metrics can be in conversation with each other to improve the innovation process will be the focus of this expert workshop. On October 14, the Center for Health Policy at Brookings, under a cooperative agreement with the U.S. Food and Drug Administration, convened a roundtable discussion that engaged key stakeholders from throughout the innovation ecosystem to explore the factors and characteristics that could improve our understanding of what constitutes modern “innovation” and how best to track its progress.
Event Materials FINAL 1014 BrookingsFDA Agenda, FINAL 1014 BrookingsFDA Participant List Full Article
va How the US embassy in Prague aided Czechoslovakia’s Velvet Revolution By webfeeds.brookings.edu Published On :: Fri, 24 Apr 2020 09:00:09 +0000 In late 1989, popular protests against the communist government in Czechoslovakia brought an end to one-party rule in that country and heralded the coming of democracy. The Velvet Revolution was not met with violent suppression as had happened in Prague in 1968. A new book from the Brookings Institution Press documents the behind the scenes… Full Article
va Global China’s advanced technology ambitions By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2020 09:00:08 +0000 In this special edition of the Brookings Cafeteria Podcast, Lindsey Ford, a David M. Rubenstein Fellow in Foreign Policy, interviews two authors of the most recent release of papers in the Global China series focused on China's aspiration to be a global technology leader. Saif Khan and Remco Zwetsloot are both research fellows at the… Full Article
va In Israel, Benny Gantz decides to join with rival Netanyahu By webfeeds.brookings.edu Published On :: Fri, 27 Mar 2020 21:09:18 +0000 After three national elections, a worldwide pandemic, months of a government operating with no new budget, a prime minister indicted in three criminal cases, and a genuine constitutional crisis between the parliament and the supreme court, Israel has landed bruised and damaged where it could have been a year ago. This week, Israeli opposition leader… Full Article
va Jésus est juif en Amérique: Droite évangélique et lobbies chrétiens pro-Israël By webfeeds.brookings.edu Published On :: Fri, 10 Apr 2020 13:25:20 +0000 The alliance uniting the United States and Israel for over 60 years is commonly attributed to the influence of an all-powerful Jewish lobby thought to pull the strings of American foreign policy in the Middle East. Yet in Jésus est juif en Amérique : Droite évangélique et lobbies chrétiens pro-Israël, visiting fellow in the Center… Full Article
va Appointments, Vacancies and Government IT: Reforming Personnel Data Systems By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 John Hudak argues for reforming personnel data systems – more carefully tracking both appointments and vacancies within government offices – in order to ensure that agency efficacy is not compromised. Hudak recommends several revisions that would immediately recognize vacancies, track government positions and personnel more carefully, and eliminate long-standing vacancies that reduce the efficiency within a department or agency. He asks Congress to stop its cries of “waste” and “inefficiency” and instead push data system improvements that will limit these issues. Full Article
va EU election observation policy: A supranationalist transatlantic bridge? By webfeeds.brookings.edu Published On :: Tue, 23 Feb 2016 15:30:00 -0500 The European Union’s international partners often accuse it of not speaking with a single voice on key global issues. Yet, there are instances when Europe does display a coherent approach to policy-making in international affairs. In this paper for the Center on the United States and Europe, Matteo Garavoglia argues that EU Election Observation Missions (EU EOMs) are a worthy example of such occurrences. Unlike in most other foreign policy domains, EU supranational institutions, rather than national capitals, lead EOMs' policymaking. More specifically, the European External Action Service’s Democracy and Electoral Observation Division, the European Commission’s Foreign Policy Instrument, and the European Parliament’s Directorate for Democracy Support are the key actors behind this policy area. Writing for Brookings’s U.S.-Europe Analysis Series, Matteo Garavoglia investigates why European supranational actors are at the core of EOMs policymaking. Having done so, he analyzes the role that national governments and non-institutional agents play in conceptualizing and operationalizing EOMs. Finally, he explores ways in which Europe’s international partners could build bridges with Brussels in this policy area. Downloads EU election observation policy: A supranationalist transatlantic bridge? Authors Matteo Garavoglia Image Source: © Ali Jarekji / Reuters Full Article
va Advancing financial inclusion in Southeast Asia, Central Asia, and the Middle East By webfeeds.brookings.edu Published On :: Wed, 16 Sep 2015 07:30:00 -0400 Editor’s Note: This blog post is part of a series on the 2015 Financial and Digital Inclusion Project (FDIP) Report and Scorecard, which were launched at a Brookings public event on August 26. Previous posts have highlighted five key findings from the 2015 FDIP Report and explored groundbreaking financial inclusion developments in India. Today’s post will compare financial inclusion outcomes and opportunities for growth across several Asian countries included in the 2015 Report and Scorecard. **** Of the 21 countries ranked in the 2015 Financial and Digital Inclusion Project (FDIP) Report and Scorecard, no countries in Asia placed in the top 5 in the overall ranking. However, all of the FDIP Asian countries have demonstrated progress within at least one of the four dimensions of the 2015 Scorecard: country commitment, mobile capacity, regulatory environment, and adoption of traditional and digital financial services. This blog post will dive into a few of the obstacles and opportunities facing FDIP countries in central Asia, the Middle East, and southeast Asia as they move toward greater access to and usage of financial services among marginalized groups. We explore these countries in order of their overall score: Turkey (74 percent), Indonesia (70 percent), the Philippines (68 percent), Bangladesh (67 percent), Pakistan (65 percent), and Afghanistan (58 percent). You can also read our separate post on financial inclusion in India, available here. Turkey: Clear economic advantages, but opportunities for enabling regulation and greater equity remain Turkey is one of the few upper-middle income countries in the FDIP sample, ranking in the top 5 in terms of gross domestic product (GDP) measured in US dollars. 
Turkey’s fairly robust banking infrastructure contributed to its relatively strong adoption rates: As of 2013, the International Monetary Fund’s Financial Access Survey found that Turkey had about 20 bank branches per 100,000 adults (the 4th highest density rate among the 21 FDIP countries) and about 73 ATMs per 100,000 adults (the 2nd highest density rate among the FDIP countries). According to the World Bank’s Global Financial Inclusion (Global Findex) database, about 57 percent of adults in Turkey had an account with a mobile money provider or formal financial institution as of 2014. Turkey’s performance on the adoption dimension of the 2015 Scorecard contributed to its tie with Colombia and Chile for 6th place on the overall scorecard. With that said, Turkey received lower mobile capacity and regulatory environment scores, ranking 16th and 17th respectively. Although Turkey’s smartphone and mobile penetration levels are quite robust, a limited mobile money provider landscape, combined with a lack of regulatory clarity surrounding branchless banking regulations (particularly agent banking), constrained Turkey’s scores in those categories. Nonetheless, there is promising news for Turkey’s financial inclusion environment. In 2015, Turkey assumed the G20 presidency and has renewed its focus on financial inclusion in association with this transition. Turkey’s 2014 financial inclusion strategy is one example of the country’s commitment to advancing inclusion. To date, financial inclusion growth in Turkey has been limited, as evidenced by the results of the 2011 and 2014 Global Findex. However, if the country’s stated commitment translates into concrete initiatives moving forward, we can expect to see accelerated financial inclusion growth. This will be critical for facilitating access to and usage of quality financial services among the nearly 60 percent of women in Turkey without formal financial accounts. 
Reducing the approximately 25 percentage point gap in account ownership between men and women — one of the highest gender gaps among the 21 FDIP countries — should be a key priority for the country moving forward. Indonesia: High mobile money potential, but enhanced awareness needed to drive adoption Recent changes to Indonesia’s regulatory environment have facilitated a more enabling digital financial services ecosystem, although there is still room for improvement in terms of reducing supply-side barriers. Increasing mobile money awareness could help leverage Indonesia’s strong mobile capacity rates to increase access to and usage of formal financial services. However, moving from a heavily cash-based environment to greater use of digital financial services will take time: A 2014 InterMedia survey in Indonesia found that although 93 percent of bank account holders could access their accounts digitally, 73 percent preferred to access their accounts via an agent at a bank branch. The differing mandates of Indonesia’s new financial services authority, Otoritas Jasa Keuangan (OJK), which focuses on branchless banking (specifically agent banking), and Bank Indonesia, which focuses on electronic money regulation, may have created some confusion regarding the regulatory environment. Solidifying the country’s financial inclusion strategy and clarifying the roles of the various financial inclusion stakeholders could provide opportunities for greater coherence in terms of financial inclusion objectives. OJK’s recent branchless banking regulations have led to several positive changes within the regulatory environment. For example, these regulations enabled financial service providers to appoint individuals and business entities as agents and provided simplified customer due diligence requirements. The 2015 FDIP Report highlights in greater detail some possible improvements to the branchless banking and e-money regulations. 
On the mobile capacity side, Indonesia tied for the second-highest score on the 2015 Scorecard. Indonesia is one of the few countries where mobile money platform interoperability has been implemented, allowing different mobile money services to “talk” to one another in real time. Indonesia also boasted the third-highest 3G network coverage by population among all the FDIP Asian countries, as well as the third-highest unique subscribership rate among these countries. However, only about 3 percent of adults were aware of mobile money as of fall 2014, according to the InterMedia survey. In terms of adoption, the 2014 Global Findex found that women in Indonesia actually had slightly higher rates of account ownership than adults in general, although there is still significant room for growth across all adoption indicators. Given Indonesia’s strong mobile capacity ranking, increasing awareness of mobile money services could drive growth in the digital finance sector. Clarifying existing regulatory frameworks and removing some remaining restrictions regarding agent exclusivity and other agent criteria could further boost financial inclusion. Philippines: Strong commitment, but geographic barriers have inhibited scale The Philippines tied with Bangladesh to garner 15th place for adoption, which contributed to the country’s overall ranking (also 15th place). In both Bangladesh and the Philippines, about 31 percent of adults had an account with a mobile money provider or formal financial institution as of 2014. According to the 2014 Global Findex, the percentage of women with formal financial accounts was about 7 percentage points higher than the overall percentage of adults with accounts — a rarity among the 21 FDIP countries, which generally exhibit a “gender gap” in which women are less likely to have formal financial accounts than men. 
The Philippines’ efforts to foster financial inclusion earned it the second-highest country commitment and regulatory environment rankings among the FDIP Asian countries. The Bangko Sentral ng Pilipinas (BSP), the Philippines’ central bank, has issued a number of circulars providing guidance regarding electronic money and allowing non-bank institutions to become e-money issuers. The BSP also has the distinction of being the first central bank in the world to create an office dedicated to financial inclusion. Most recently, the BSP launched a national financial inclusion strategy in July 2015. On the mobile side, according to the GSMA Intelligence database, as of the end of the first quarter of 2015 the Philippines had the highest unique mobile subscribership rate among the FDIP Asian countries, as well as the second-highest rate of 3G network coverage by population among these countries. In terms of mobile money, the Philippines is home to two of the earliest mobile financial services products, Smart’s Smart Money and Globe’s GCash. It also boasts the second-highest rate of mobile money accounts among adults in all the FDIP Asian countries, according to the 2014 Global Findex. There is still significant room for improvement in adoption of traditional and digital financial services in the Philippines. The country’s geography has posed a challenge with respect to advancing access to financial services among the dispersed population. While the extent of banking infrastructure has improved over time, as of 2013, 610 of 1,634 cities and municipalities did not have a banking office, and financial access points remained concentrated in larger cities. Expanding agent locations and facilitating interoperability could enhance mobile money adoption, mitigating the consequences of these geographic barriers. 
Bangladesh: Rapid growth, but high unregistered use and low adoption overall While Bangladesh performed strongly on the country commitment and mobile capacity dimensions of the 2015 FDIP Scorecard, it received one of the lowest adoption rankings among the FDIP Asian countries. According to the Global Findex, about 31 percent of adults age 15 and older had an account with a formal financial institution or mobile money provider as of 2014. Indicators pertaining to the country’s rates of formal saving, credit card use, and debit card use all received the lowest score. Bangladesh has a robust mobile landscape, with fairly strong unique mobile subscription rates — as of the first quarter of 2015, it was tied with Indonesia for the third-highest unique mobile subscribership rates among the FDIP Asian countries, after the Philippines and Turkey. This mobile coverage is combined with a multiplicity of mobile money providers (although a 2014 InterMedia survey noted that nearly 90 percent of active mobile money customers used the bKash mobile money service). Awareness of mobile money as a service in Bangladesh is very high, although understanding of the concept is less prevalent — in 2014, about 91 percent of respondents in an InterMedia survey were aware of at least one mobile money provider, although only about 36 percent were aware of mobile money as a general concept. Unregistered use of mobile money accounts is high. While about 37 percent of adults had a mobile money account or bank account or both as of 2014, according to the InterMedia survey, only about 5 percent had registered mobile money accounts, while 4 percent had active, registered mobile money accounts (meaning an account that is registered and has been used in the previous 90 days). Transitioning to registered accounts will help enable individuals to connect with more extensive financial services, such as receipt of government payments. 
Overall, adoption of mobile money and the expansion of agent locations have been increasingly rapid in Bangladesh — as of 2014 Bangladesh was one of the fastest growing markets in terms of total accounts globally. Over 60 percent of respondents in a 2013 InterMedia survey stated that they “fully” or “rather” trusted mobile money. Moving forward, increasing financial capability might help individuals feel more at ease registering their accounts and using them independently of an agent. Pakistan: Public and private sector initiatives advance inclusion Pakistan ranked 7th in terms of the percentage of adults with mobile money accounts among the 21 countries, achieving the highest percentage of all of the Asian FDIP countries. Yet there is significant room for growth — as of 2014, only about 6 percent of adults had a mobile money account. The State Bank of Pakistan (SBP) has clearly expressed its commitment to advancing financial inclusion, which earned the country a commitment score of 100 percent. The SBP developed Branchless Banking regulations in 2008, with revisions in 2011. These regulations were explicitly intended to promote financial inclusion. More recently, the country’s National Financial Inclusion Strategy was launched in May 2015. In terms of quantitative assessments of financial inclusion, the SBP tracks supply-side information on branchless banking in its quarterly newsletters. Recent public and private sector initiatives may help advance mobile money adoption. For example, a re-verification initiative for SIM cards was mandated by the government and initiated earlier in 2015. Mobile network operators have been promoting registration of mobile money accounts since the biometric re-verification process is more intensive than the identification requirements needed to register a mobile money account. 
Earlier, in September 2014, the EasyPaisa mobile money service decided to eliminate fees related to money transfers between Easypaisa account customers and cash-out transactions for a set period. As of April 2015, the number of person-to-person money transfers had increased by about 2500 percent. Still, barriers to financial inclusion remain. A 2014 InterMedia survey noted that while distance was less of a barrier to registration than previously, distance did affect the frequency with which users engaged with mobile money services. Therefore, expanding access points could further facilitate use of mobile money. Increasing the number of registered accounts could also provide individuals with more opportunities to engage with financial services beyond basic transfers — the InterMedia survey found that as of 2014, about 8 percent of adults were over-the-counter mobile money users, while 0.3 percent were registered users. Afghanistan: Commitment to improving infrastructure and adoption Instability and systemic corruption in Afghanistan over the past several decades have damaged trust in formal financial services and limited the development of traditional banking infrastructure. In addition to having one of the lowest levels of GDP among the 21 FDIP countries, as of 2013 the Financial Access Survey found Afghanistan had the lowest reported density of commercial banks per 100,000 adults. Even among individuals who can access banks, adoption of formal accounts is constrained by a lack of trust in formal financial services. On the mobile side, Afghanistan has fairly widespread 3G network coverage (over 80 percent of the population, according to the GSMA Intelligence database), which helped boost its mobile capacity ranking to 2nd place. However, Afghanistan received the lowest score possible for each of the 15 adoption indicators. 
According to the 2014 Global Findex, financial account ownership as of 2014 was at about 10 percent of adults, and financial account ownership among women was at only 4 percent. Tracking gender-disaggregated data at the national level could help the government better identify underserved populations and target financial solutions toward their needs. The government has made an effort to promote financial inclusion and digital financial services. For example, Da Afghanistan Bank committed to the Alliance for Financial Inclusion in 2009, and the Republic of Afghanistan is a member of the Better Than Cash Alliance. In 2008, the Money Service Providers Regulation was issued, with amendments instituted a few years later pertaining to e-money. The Afghanistan Payments Systems, which is still being fully operationalized, aims to allow payment service providers such as mobile network operators to connect their mobile money systems. While several mobile money options are available, adoption of these services is low. According to the 2014 Global Findex, about 0.3 percent of adults had a mobile money account. Implementing interoperability across platforms might help increase the utility of mobile money services for consumers, and as in Turkey, developing specific agent banking regulations could provide clarity to the sector and drive innovation. By expanding financial access points, educating consumers about traditional and digital financial services, and monitoring providers to ensure consumer protection, Afghanistan’s regulatory entities and financial service providers may be able to better reach underserved populations and inculcate trust in formal financial services. Authors Robin Lewis, John Villasenor, Darrell M. West Image Source: © Romeo Ranoco / Reuters Full Article
va 20191205 Inter-American Dialogue Vanda Felbab-Brown By webfeeds.brookings.edu Published On :: Thu, 05 Dec 2019 21:13:54 +0000 Full Article
va 20200417 Inter-American Dialogue Vanda Felbab-Brown By webfeeds.brookings.edu Published On :: Fri, 17 Apr 2020 21:29:13 +0000 Full Article
va Civilian Drones, Privacy, and the Federal-State Balance By webfeeds.brookings.edu Published On :: Tue, 30 Sep 2014 00:00:00 -0400 Full Article
va Unmanned aircraft systems: Key considerations regarding safety, innovation, economic impact, and privacy By webfeeds.brookings.edu Published On :: Tue, 24 Mar 2015 14:30:00 -0400 Good afternoon Chair Ayotte, Ranking Member Cantwell, and Members of the Subcommittee. Thank you very much for the opportunity to testify today on the important topic of domestic unmanned aircraft systems (UAS). I am a nonresident senior fellow in Governance Studies and the Center for Technology Innovation at the Brookings Institution. I am also a National Fellow at the Hoover Institution at Stanford, and a professor at UCLA, where I hold appointments in the Electrical Engineering Department and the Department of Public Policy. The views I am expressing here are my own, and do not necessarily represent those of the Brookings Institution, Stanford University or the University of California. Downloads Download the testimony Authors John Villasenor Image Source: © Mike Segar / Reuters Full Article
va Class Notes: Harvard Discrimination, California’s Shelter-in-Place Order, and More By webfeeds.brookings.edu Published On :: Fri, 08 May 2020 19:21:40 +0000 This week in Class Notes: California's shelter-in-place order was effective at mitigating the spread of COVID-19. Asian Americans experience significant discrimination in the Harvard admissions process. The U.S. tax system is biased against labor in favor of capital, which has resulted in inefficiently high levels of automation. Our top chart shows that poor workers are much more likely to keep commuting in… Full Article
va Class Notes: Income Segregation, the Value of Longer Leases, and More By webfeeds.brookings.edu Published On :: Wed, 26 Feb 2020 14:06:26 +0000 This week in Class Notes: Reforming college admissions to boost representation of low and middle-income students could substantially reduce income segregation between institutions and increase intergenerational mobility. The Alaska Permanent Fund Dividend increased fertility and reduced the spacing between births, particularly for females age 20-44. Federal judges are more likely to hire female law clerks after serving on a panel… Full Article
va Funding the development and manufacturing of COVID-19 vaccines: The need for global collective action By webfeeds.brookings.edu Published On :: Fri, 24 Apr 2020 16:14:09 +0000 On February 20, the World Bank and the Coalition for Epidemic Preparedness Innovations (CEPI), which funds development of epidemic vaccines, cohosted a global consultation on funding the development and manufacturing of COVID-19 vaccines. We wrote a working paper to guide the consultation, which we coauthored with World Bank and CEPI colleagues. The consultation led to… Full Article
va Cuba moves backwards: New regulations likely to impede private sector growth By webfeeds.brookings.edu Published On :: Fri, 13 Jul 2018 13:46:32 +0000 In a leap backwards, the Cuban government has published a massive compendium of tough new regulations governing the island’s struggling private enterprises. The new regulations—the first major policy pronouncement during the administration of President Miguel Díaz-Canel—appear more focused on controlling and restricting the emerging private sector than on stimulating investment and job creation, more concerned… Full Article
va Letter from Havana: The sudden civil society awakening By webfeeds.brookings.edu Published On :: Mon, 17 Dec 2018 14:48:34 +0000 As the Castro brothers fade into history, green shoots of civil society are visibly emerging in Cuba. Make no mistake: The Cuban Communist Party retains its authoritarian hegemony. Nevertheless, and largely unnoticed in the U.S. media, various interest groups are flexing their youthful muscles—and with some remarkable albeit very partial policy successes. These unanticipated stirrings… Full Article
va At the Havana Biennial, artists test limits on free expression By webfeeds.brookings.edu Published On :: Wed, 22 May 2019 14:35:43 +0000 Full Article
va Optimal solar subsidy policy design and incentive pass-through evaluation: using US California as an example By webfeeds.brookings.edu Published On :: Mon, 04 Jul 2016 14:30:00 -0400 Renewable energy is an important means of combating climate change, as the latest IPCC report has pointed out. However, due to multiple market failures, such as the negative externalities of fossil fuels and the knowledge spillovers of new technology, government subsidies are still needed to develop renewable energy sources such as solar photovoltaic (PV) cells. In the United States, there have been various forms of subsidies for PV, ranging from the federal level to the state level, and from the city level to the utility level. California, as the pioneer of solar PV development, has put forward the biggest state-level subsidy program for PV, the California Solar Initiative (CSI). The CSI planned to spend around $2.2 billion in 2007–2016 to install roughly 2 GW of PV capacity, with an average subsidy level as high as $1.1/W. How to evaluate the cost-effectiveness and the incentive pass-through of this program are the two major research questions we are pursuing. Our cost-effectiveness analysis is based on a constrained optimization model that we developed, where the objective is to install as much PV capacity as possible under a fixed budget constraint. Both the analytical and computational results suggest that, due to a strong peer effect and the learning-by-doing effect, one can shift subsidies from later periods to earlier periods so that the final installed PV capacity can be increased by 8.1% (or 32 MW). However, if the decision-maker has other policy objectives or constraints in mind, such as maintaining policy certainty, then the optimally calculated subsidy policy would look like the CSI. As to the incentive pass-through question, we took a structural approach and also used the method of regression discontinuity (RD). 
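As a toy illustration of the front-loading intuition above: the snippet below is not the authors’ model, and the budget, cost, and learning parameters are all hypothetical. It only shows how, when early installations lower later costs (learning-by-doing), shifting a fixed subsidy budget toward the early period can raise total installed capacity.

```python
# Hypothetical two-period sketch of the front-loading intuition: capacity
# installed in period 1 lowers the period-2 cost (learning-by-doing), so
# shifting subsidy dollars earlier can raise total installed capacity.
# All parameter values are made up for illustration.

BUDGET = 100.0     # total subsidy budget ($M)
BASE_COST = 4.0    # initial PV cost in $/W (equivalently $M per MW here)
LEARNING = 0.1     # $/W cost reduction per MW installed in period 1

def capacity(spend_t1, budget=BUDGET):
    """Total MW installed over two periods for a given period-1 spend."""
    spend_t2 = budget - spend_t1
    q1 = spend_t1 / BASE_COST                        # MW bought early
    cost_t2 = max(BASE_COST - LEARNING * q1, 1.0)    # learning lowers cost
    q2 = spend_t2 / cost_t2                          # MW bought late
    return q1 + q2

# Under these made-up parameters, front-loading beats an even split:
# capacity(70.0) > capacity(50.0)
```

The gain from front-loading is bounded: spending the entire budget early forgoes the cheaper later watts, so the optimum is interior, which mirrors the kind of trade-off the constrained optimization model would resolve.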
While in general the incentive pass-through rate depends on the curvature of the demand and supply curves and the level of market competition, our two estimates indicate that the incentive pass-through for the CSI program is almost complete. In other words, almost all of the incentive has been enjoyed by the customer; the PV installers did not retain much. Based on the RD design, we observe that PV installers tend to treat the CSI incentive as exogenous to their pricing decisions. The relatively good performance of the CSI on both the cost-effectiveness and the incentive pass-through dimensions is tightly related to its policy design and program management. Internationally speaking, the biggest challenge in designing any PV subsidy program is that the budget runs out quickly, with customers ultimately rushing for the subsidy. Such rushing behavior is a clear indication of higher-than-needed incentive levels. Because of policy rigidity and rapid PV technological change, the PV subsidy policy may lag behind the decline in PV costs; as a result, rational customers could rush for any unnecessarily high subsidy. Given the high uncertainty and unpredictability of future PV costs, the CSI put forward a new design that links changes in the incentive level to the fulfillment of installed capacity goals. Specifically, the CSI designed nine steps to achieve its policy goal; at each step, there is a PV capacity goal that corresponds to an incentive level. Once the capacity goal is met, the incentive level decreases to the next lower level. Furthermore, to maintain policy certainty, the CSI regulated that every step-wise change in the incentive level should be no higher than $0.45/W and no smaller than $0.05/W, together with three other constraints. A good subsidy policy not only requires flexible policy design to respond to a fast-changing environment, but also demands an efficient program management system, digitalized if possible. 
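The capacity-triggered, step-down schedule described above can be sketched as follows. The step goals and incentive levels here are placeholders, not the actual CSI schedule; only the $0.05–$0.45/W bound on step changes comes from the text.

```python
# Sketch of a CSI-style, capacity-triggered step-down incentive schedule.
# The capacity goals and incentive levels below are hypothetical; only the
# constraint that each step-down lie in [$0.05/W, $0.45/W] is from the CSI.

STEPS = [
    # (cumulative capacity goal in MW, incentive in $/W)
    (50, 2.50),
    (100, 2.20),
    (160, 1.90),
    (230, 1.55),
    (310, 1.10),
    (400, 0.65),
]

def current_incentive(installed_mw, steps=STEPS):
    """Incentive ($/W) at the first step whose capacity goal is unmet;
    0 once every step's goal has been reached."""
    for goal_mw, incentive in steps:
        if installed_mw < goal_mw:
            return incentive
    return 0.0  # all capacity goals met; program winds down

def schedule_is_valid(steps=STEPS, min_drop=0.05, max_drop=0.45):
    """Check that each successive step-down in the incentive level
    stays within the allowed [min_drop, max_drop] band."""
    drops = [round(a[1] - b[1], 2) for a, b in zip(steps, steps[1:])]
    return all(min_drop <= d <= max_drop for d in drops)
```

Tying the incentive to realized installations rather than to calendar dates is what lets the subsidy track unpredictable cost declines automatically: if costs fall faster than expected, capacity goals are hit sooner and the incentive steps down sooner.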
For the CSI, the authority contracted a third party to maintain a good database system for the program. Specifically, the database documents in detail every PV system that customers requested. Key data fields include 22 important dates during the PV installation process; customers’ zip code, city, utility, and county information; and various characteristics of the PV system, such as price, system size, incentive, PV module, and installer. All information is publicly available, which to some extent fills the information gap faced by customers and fosters market competition among PV installers. For customers to receive the incentive, their PV systems have to pass the inspection of the local government and be interconnected to the grid. On the supply side, the CSI has also certified PV installers and created a list from which every customer can choose. Although the CSI ended in 2014 due to rapid PV cost reductions starting in 2009, its experience has been transferred to other areas in the United States and in Europe. It is quite possible that other similar new technologies and products (e.g., the electric car and the battery) could adopt the CSI policy design, too. In summary, a good and successful policy may need to be simple, clear, credible, predictable, flexible, able to be phased out, and incentive-compatible. The PV subsidy policy in China still has a long way to go when compared to the CSI. Authors Changgui Dong Full Article
va The value of systemwide, high-quality data in early childhood education By webfeeds.brookings.edu Published On :: Thu, 20 Feb 2020 17:38:04 +0000 High-quality early learning experiences—those filled with stimulating and supportive interactions between children and caregivers—can have long-lasting impacts for children, families, and society. Unfortunately, many families, particularly low-income families, struggle to find any affordable early childhood education (ECE) program, much less programs that offer engaging learning opportunities that are likely to foster long-term benefits. This post… Full Article
va A tale of two trade fairs: Milwaukee’s globally relevant water proposition By webfeeds.brookings.edu Published On :: Wed, 27 Jul 2016 13:47:00 +0000 As we have previously discussed, the decision to prioritize a single primary cluster in a regional economic development plan is challenging. For Milwaukee, this was especially difficult in development of its global trade and investment plan because it has three legitimate clusters: energy, power and controls; food and beverage; and water technologies. The team developing the plan was reluctant to pick a favorite. Full Article
va @ Brookings Podcast: Eye-Tracking Technology and Digital Privacy By webfeeds.brookings.edu Published On :: Fri, 20 Apr 2012 16:39:00 -0400 Eye-tracking technology now makes it possible for computers to gather staggering amounts of information about individuals as they use the Internet, and draw hyper-accurate conclusions about our behavior as consumers. As the technology becomes more practical, Senior Fellow John Villasenor discusses its benefits and risks. Image Source: © Scanpix Sweden / Reuters Full Article
va Evaluating the Evaluators: Some Lessons from a Recent World Bank Self-Evaluation By webfeeds.brookings.edu Published On :: Tue, 21 Feb 2012 14:15:00 -0500 Editor's Note: The World Bank’s Independent Evaluation Group (IEG) recently published a self-evaluation of its activities. Besides representing current thinking among evaluation experts at the World Bank, it also more broadly reflects some of the strengths and gaps in the approaches that evaluators use to assess and learn from the performance of the international institutions with which they work. The old question “Quis custodiet ipsos custodes?” – loosely translated as “Who evaluates the evaluators?” – remains as relevant as ever. Johannes Linn served as an external peer reviewer of the self-evaluation and provides a bird’s-eye view on the lessons learned.
An Overview of the World Bank’s IEG Self-Evaluation Report
In 2011 the World Bank’s Independent Evaluation Group (IEG) carried out and published a self-evaluation of its activities. The self-evaluation team was led by an internal manager, but involved a respected external evaluation expert as the principal author and also an external peer reviewer. The IEG self-evaluation follows best professional practices as codified by the Evaluation Cooperation Group (ECG). This group brings together the evaluation offices of seven major multilateral financial institutions in joint efforts designed to enhance evaluation performance and cooperation among their evaluators. One can therefore infer that the approach and focus of the IEG self-evaluation is representative of a broader set of practices that are currently used by the evaluation community of international financial organizations. At the outset the IEG report states that “IEG is the largest evaluation department among Evaluation Cooperation Group (ECG) members and is held in high regard by the international evaluation community.
Independent assessments of IEG’s role as an independent evaluation function for the Bank and IFC rated it above the evaluation functions in most other ECG members, international nongovernmental organizations, and transnational corporations and found that IEG follows good practice evaluation principles.” The self-evaluation report generally confirms this positive assessment. For four out of six areas of its mandate, IEG gives itself the second highest rating (“good”) out of six possible rating categories. This includes (a) the professional quality of its evaluations, (b) its reports on how the World Bank’s management follows up on IEG recommendations, (c) cooperation with other evaluation offices, and (d) assistance to borrowing countries in improving their own evaluation capacity. In the area of appraising the World Bank’s self-evaluation and risk management practices, the report offers the third highest rating (“satisfactory”), while it gives the third lowest rating (“modest”) for IEG’s impact on the Bank’s policies, strategies and operations. In addition, the self-evaluation concludes that overall the performance of IEG has been “good” and that it operates independently, effectively and efficiently. The report makes a number of recommendations for improvement, which are likely to be helpful but will probably have limited impact on its activities. They cover measures to further enhance the independence of IEG and the consistency of evaluation practices as applied across the World Bank Group’s branches – the World Bank, the International Finance Corporation (IFC), and the Multilateral Investment Guarantee Agency (MIGA); to improve the design of evaluations and the engagement with Bank management upstream for greater impact; and to monitor the impact of recent organizational changes in IEG in terms of results achieved. The report also recommends that more be done to evaluate the Bank’s analytical work and that evaluations draw on comparative evidence.
Assessment
In terms of the parameters of self-evaluation set by the prevailing practice among the evaluators of international financial agencies, the IEG self-evaluation is accurate and helpful. From my own experience as an operational manager in the Bank whose activities were evaluated by IEG in years past, and as a user of IEG evaluations (and of evaluations of other international aid organizations) for my research on aid effectiveness, I concur that IEG is independent and effective in meeting its mandate as defined. Moreover, the self-evaluation produces useful quantitative evidence (including survey results, budget analysis, etc.) to corroborate qualitative judgments. However, the self-evaluation suffers from a number of limitations in approach and gaps in focus, which are broadly representative of the practices prevalent among many of the evaluation offices of international aid agencies.
Approach of the IEG self-evaluation
The core of the self-evaluation report is about the evaluation process followed by IEG, with very little said about the substance of IEG’s evaluations. The following questions could have usefully been raised, but were not: do evaluations cover the right issues with the right intensity, such as growth and poverty; environmental, governance, and gender impacts; regional dimensions versus exclusive country or project focus; effectiveness in addressing the problems of fragile and conflict states; effectiveness in dealing with global public goods; sustainability and scaling up; etc. The report therefore does not deal with the question of whether IEG effectively responds in its evaluations to the many important strategic debates and issues with which the development community is grappling.
Related to this limitation is the fact that the report assessed the quality of IEG’s work mostly in terms of (a) whether its approach and processes meet certain standards established by the Evaluation Cooperation Group; and (b) how it is judged by stakeholders in response to a survey commissioned for this evaluation. Both approaches are useful, but neither is grounded in a professional assessment of the quality of individual products. This is equivalent to IEG evaluating the World Bank’s projects on the quality of its processes (e.g., appraisal and supervision processes) and on the basis of stakeholder surveys, without evaluating individual products and their impacts.
Gaps in the Self-Evaluation and in Evaluation Practice
Careful reading of the report reveals six important gaps in the IEG self-evaluation, in the prevailing evaluation practice in the World Bank, and more generally in the way international financial organizations evaluate their own performance. The first three gaps relate to aspects of the evaluation approach used, and the last three to a lack of focus in the self-evaluation on key internal organizational issues: 1. Impact Evaluations: The report notes that IEG carries out two to three impact evaluations per year, but it sidesteps the debate in the current evaluation literature and practice as to what extent the “gold standard” of randomized impact evaluation should occupy a much more central role. Given the importance of this debate and the divergence of views, it would have been appropriate for the self-evaluation to assess IEG’s current practice of very limited use of randomized evaluations. 2.
Evaluation of Scaling Up: The report does not address to what extent current IEG practice assesses the performance of individual projects not only in terms of their outcomes and sustainability, but also in terms of whether the Bank has systematically built on its experience in specific projects to help scale up their impact, whether through support for expansion or replication in follow-up operations or through effective hand-off to the government or other partners. In fact, IEG currently does not explicitly and systematically consider scaling up in its project and program evaluations. For example, in a recent IEG evaluation of World Bank-funded municipal development projects (MDPs), IEG found that the Bank has supported multiple MDPs in many countries over the years, but the evaluation did not address the obvious question of whether the Bank systematically planned for the project sequence or built on its experience from prior projects in subsequent operations. Like IEG, most other evaluation offices do not consider scaling up, although some (in particular those of the International Fund for Agricultural Development and the United Nations Development Program) have started doing so in recent years. 3. Drawing on the Experience of and Benchmarking Against Other Institutions: The self-evaluation report does a good job of benchmarking IEG performance in a number of respects against that of other multilateral institutions. In the main text of the report it states that “IEG plans to develop guidelines for approach papers to ensure greater quality, in particular in drawing on comparative information from other sources and benchmarking against other institutions.” This is a welcome intention, but it is inadequately motivated in the rest of the report and not reflected in the Executive Summary.
The reality is that IEG, like most multilateral evaluation offices, so far has not systematically drawn on the evaluations and relevant experience of other aid agencies in its evaluations of World Bank performance. This has severely limited the learning impact of the evaluations. 4. Bank Internal Policies, Management Processes and Incentives: IEG evaluations traditionally do not focus on how the Bank’s internal policies, management and incentives affect the quality of Bank engagement in countries. Therefore evaluations cannot offer any insights into whether and how Bank-internal operating modalities contribute to results. Two recent cases are notable exceptions. First, the IEG evaluation of the Bank’s approach to harmonization with other donors and alignment with country priorities assesses the incentives for staff to support harmonization and alignment. The evaluation concludes that there are insufficient incentives, a finding disputed by management. Second is the evaluation of the Bank’s internal matrix management arrangements, which is currently under way. The self-evaluation notes that Bank management tried to quash the matrix evaluation on the grounds that it did not fall under the mandate of IEG. This is an unfortunate argument, since an assessment of the institutional reasons for the Bank’s performance is an essential component of any meaningful evaluation of Bank-supported programs. While making a good case for the specific instance of the matrix evaluation, the self-evaluation report shies away from a more general statement in support of engaging IEG on issues of Bank-internal policies, management processes and incentives. It is notable that IFAD’s Independent Office of Evaluation appears to be more aggressive in this regard: it is currently carrying out a full evaluation of IFAD’s internal efficiency, and previous evaluations (e.g., an evaluation of innovation and scaling up) did not shy away from assessing internal institutional dimensions. 5.
World Bank Governance: The IEG self-evaluation is even more restrictive in how it interprets its mandate regarding the evaluation of the World Bank’s governance structures and processes (including its approach to members’ voice and vote, the functioning of its board of directors, the selection of its senior management, etc.). It considers these topics beyond IEG’s mandate. This is unfortunate, since the way the Bank’s governance evolves will substantially affect its long-term legitimacy, effectiveness and viability as an international financial institution. Since IEG reports to the Bank’s board of directors, and many of the governance issues involve questions of the board’s composition, role and functioning, there is a valid question of how effectively IEG could carry out such an evaluation. However, it is notable that the IMF’s Independent Evaluation Office, which similarly reports to the IMF board of directors, published a full evaluation of the IMF’s governance in 2008, which effectively addressed many of the right questions. 6. Synergies between World Bank, IFC and MIGA: The self-evaluation report points out that the recent internal reorganization of IEG aimed to assure more effective and consistent evaluations across the three member branches of the World Bank Group. This is welcome, but the report does not assess how past evaluations addressed the question of whether the World Bank, IFC and MIGA effectively capitalized on the potential synergies among the three organizations. The recent evaluation of the World Bank Group’s response to the global economic crisis of 2008/9 provided parallel assessments of each agency’s performance, but did not address whether they work together effectively in maximizing their synergies. The reality is that the three organizations have deeply engrained institutional cultures and generally go their own ways rather than closely coordinating their activities on the ground. 
Future evaluations should explicitly consider whether the three organizations cooperate effectively. While the World Bank is unique in the way it has organizationally separated its private sector and guarantee operations, other aid organizations also suffer from a lack of cooperation, coordination and synergy among different units within the agency. The same comment therefore also applies to their evaluation approaches.
Conclusions
Self-evaluations are valuable tools for performance assessment, and IEG is to be congratulated for carrying out and publishing such an evaluation of its own activities. As with all self-evaluations, it should be seen as an input to an independent external evaluation, a decision that, for now, has apparently been postponed by the Bank’s board of directors. IEG’s self-evaluation has many strengths and provides an overall positive assessment of IEG’s work. However, it reflects some important limitations of analysis and certain gaps in approach and coverage, which an independent external review should consider explicitly, and which IEG’s management should address. Since many of these issues likely also apply to the evaluation approaches of other evaluation offices, the lessons have relevance beyond IEG and the World Bank. Key lessons include: An evaluation of evaluations should focus not only on process, but also on the substantive issues that the institution is grappling with. An evaluation of the effectiveness of evaluations should include a professional assessment of the quality of evaluation products.
An evaluation of evaluations should assess:
o How effectively impact evaluations are used;
o How scaling up of successful interventions is treated;
o How the experience of other comparable institutions is utilized;
o Whether and how the internal policies, management practices and incentives of the institution are effectively assessed;
o Whether and how the governance of the institution is evaluated; and
o Whether and how internal coordination, cooperation and synergy among units within the organization are assessed.
Evaluations play an essential role in the accountability and learning of international aid organizations. Hence it is critical that evaluations address the right issues and use appropriate techniques. If the lessons above were reflected in the evaluation practices of the aid institutions, this would represent a significant step forward in the quality, relevance and likely impact of evaluations. Authors Johannes F. Linn Image Source: © Christian Hartmann / Reuters Full Article
va The thing both conservatives and liberals want but aren't talking about By webfeeds.brookings.edu Published On :: Fri, 22 Jul 2016 17:00:00 -0400 Editor's Note: The current U.S. presidential race demonstrates the deep political divisions that exist in our country. But what does it mean to be "liberal" or "conservative," "Republican" or "Democratic"? According to Shadi Hamid, certain values transcend political chasms. This post originally appeared on PBS NewsHour. What does it mean to say that the Republican Party is on the “right”? The GOP, long defined (at least in theory) by its faith in an unbridled free market, the politics of personal responsibility, and a sort of Christian traditionalism, is no longer easily plotted on the traditional left-right spectrum of American politics. Under the stewardship of presidential nominee Donald Trump, the Republican Party appears to be morphing into a European-style ethnonationalist party. With Trump’s open disrespect for minority rights and the Bill of Rights, the GOP can no longer be considered classically “liberal” (not to be confused with capital-L American Liberalism). This is a new kind of party, an explicitly illiberal party. These developments, of course, further constrain Republicans’ appeal to minority voters (I haven’t yet met an American Muslim willing to admit they’re voting for Trump, but they apparently exist). This makes it all the more important to distinguish between conservative values and those of this latest iteration of the Republican Party. There are some aspects of Burkean conservative thought – including aspects of what might be called civic communitarianism – that could plausibly strike a chord in the current cultural landscape across “left” and “right,” categories which, in any case, are no longer as clearly distinguishable as they once were. 
(Take, for example, British Labour leader Jeremy Corbyn’s Euroskepticism and that of his opponents on the right, or the populist anti-elitism and trade protectionism that are now the province of both Republicans and Democrats). Everyone seems angry or distrustful of government institutions, which, even when they provide much needed redistributive fiscal stimulus and services, are still blamed for being incompetent, inefficient, or otherwise encouraging a kind of undignified dependency. After the Brexit debacle, it seemed odd that some of the most Europhobic parts of Britain were the very ones that benefited most from EU subsidies. But this assumes that people are fundamentally motivated by material considerations and that they vote – or should vote – according to their economic interests. If there’s one thing that the rise of Trump and Brexit – and the apparent scrambling of left-right divides – demonstrates, it’s that other things may matter more, and that it’s not a matter of people being too stupid to realize what’s good for them. As Will Davies put it in one of the more astute post-Brexit essays, what many Brexiteers craved was “the dignity of being self-sufficient, not necessarily in a neoliberal sense, but certainly in a communal, familial and fraternal sense.” The communitarian instinct – the recognition that meaning ultimately comes from local communities rather than happiness-maximizing individuals or bloated nanny-states – transcends the Republican-Democratic or the Labour-Conservative chasm. 
In other words, an avowedly redistributive state is fine, at least from the standpoint of the left, but that shouldn’t mean neglecting the importance of local control and autonomy, and finding ways, perhaps through federal incentives, to encourage things like “local investment trusts.” Setting up local investment trusts, expanding the child tax credit, or introducing a progressive consumption tax aren’t exactly a call-to-arms, and various traditionalist and communitarian-minded philosophers have, as might be expected from philosophers, tended to stay at the level of abstraction (authors armed with more policy proposals are more likely to be young conservative reformers like Ross Douthat, Reihan Salam, and Yuval Levin). Douthat and Salam want to use wide-ranging tax reform to alter incentives in the hope of strengthening families and communities. This is a worthy goal, but realizing such policies requires leadership on the federal level from the very legislators who we should presumably become less dependent on. This is the reformer’s dilemma, regardless of whether you’re on the left or right. If your objective is to weaken a centralized, overbearing state and encourage mediating or “middle” institutions, then you first need recourse to that same overbearing state, otherwise the proposed changes are unlikely to have any significant impact on the aggregate, national level. The fact that few people seem interested in talking about any of this in our national debate (we instead seem endlessly intrigued by Melania Trump’s copy-and-paste speechwriting) suggests that we’re likely to be stuck for some time to come. Incidentally, however, the Hillary Clinton campaign slogan of “Stronger Together” has an interesting communitarian tinge to it. I doubt that was the intent, and it’s only in writing this column that I even took a minute to think about what the slogan might actually mean. 
I, as it happens, have been much more interested in talking about – and worrying about – an unusually fascinating and frightening man named Donald Trump. Authors Shadi Hamid Publication: PBS Image Source: © Kevin Lamarque / Reuters Full Article
va Beyond Arithmetic: How Medicare Data Can Drive Innovation By webfeeds.brookings.edu Published On :: Fri, 06 Jun 2014 00:00:00 -0400 Five years ago, my mother needed an orthopedic surgeon for a knee replacement. Unable to find any data, we went with an academic doctor who was recommended to us (she suffered surgical complications). Last month, we were again looking for an orthopedic surgeon – this time hoping that a steroid injection in her spine might obviate the need for invasive back surgery. This time, thanks to a recent data dump from CMS, I was able to analyze some information about Medicare providers in her area and determine the most experienced doctor for the job. Of 453 orthopedic surgeons in Maryland, only a handful had been paid by Medicare for the procedure more than 10 times. The leading surgeon had done 263 – as many as the next 10 combined. We figured he might be the best person to go to, and we were right – the procedure went like clockwork. Had it been a month prior to the CMS data release, I wouldn’t have had the data at my fingertips. And I certainly wouldn’t have found the most experienced hand in less than 10 minutes. It has been a couple of months since the release of Medicare data by the Centers for Medicare and Medicaid Services (CMS) on the volume and cost of services billed by healthcare providers, and despite the whiff of scandal surrounding the highest-paid providers (including the now-famous Florida ophthalmologist who received $21 million), the analyses so far have been somewhat unsurprising. This week, coinciding with the fifth Health DataPalooza, is a good time to take stock of the utility of this data, its limitations, and what the future may hold. The millions of lines of data were exactly as advertised: charges and paid services under traditional Medicare “fee-for-service,” including the billing provider’s ID and the costs to Medicare. The initial headlines touting “Medicare Millionaires” relied on some basic arithmetic and some sorting.
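The "basic arithmetic and sorting" described above amounts to a filter-and-rank over provider records. Here is a minimal sketch of that kind of analysis; the field names, sample figures, and the `min_count` threshold are illustrative assumptions, not the actual CMS file schema.

```python
# Sketch: find the most experienced provider for a procedure in a state.
# Records are shaped loosely like rows of the CMS provider-utilization file;
# field names and sample data here are made up for illustration.
records = [
    {"npi": "100", "name": "Dr. A", "state": "MD",
     "procedure": "spinal injection", "count": 263},
    {"npi": "101", "name": "Dr. B", "state": "MD",
     "procedure": "spinal injection", "count": 41},
    {"npi": "102", "name": "Dr. C", "state": "MD",
     "procedure": "spinal injection", "count": 7},
]

def most_experienced(records, state, procedure, min_count=10):
    """Rank providers in a state by how often Medicare paid them for a procedure,
    dropping low-volume providers below min_count."""
    matches = [r for r in records
               if r["state"] == state
               and r["procedure"] == procedure
               and r["count"] >= min_count]
    return sorted(matches, key=lambda r: r["count"], reverse=True)

ranked = most_experienced(records, "MD", "spinal injection")
print(ranked[0]["name"], ranked[0]["count"])  # → Dr. A 263
```

The `min_count` filter mirrors the "more than 10 times" cutoff in the anecdote: sorting alone would surface a top name, but excluding low-volume providers first avoids ranking on statistically thin records.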
And the cautions piled up: the data could reflect multiple providers billing under a single ID; payments are not the same as a provider’s actual take-home income; the information is incomplete, since it contains nothing about other insurers, or even Medicare Advantage; and so on. But perhaps most damning was how little insight the data seemed to provide on the quality or value of care provided, as opposed to the volume of services. As Lisa Rosenbaum wrote in the New Yorker, “So much of that good isn’t captured by these numbers. You don’t bill for talking to a patient about how he wants to die. There’s no code for providing reassurance rather than ordering a test.”
Where is the value in the data?
Data bear witness to the fundamental flaw of the payment system that generates them. The absence of information on quality, safety, appropriateness, or outcomes appears to have been a genuine revelation to many, but it is in fact exactly the type of output that we should expect from the volume-based system that we have built. This is not a critique of the data release. It is an indictment of our payment system.
Data is revealing important trends in how we pay doctors differently. Not all physician payments are created equal, and the data certainly shows the disparities across specialties, primary care, and others. For example, the average total annual Medicare payment to geriatricians was less than $100,000, while dermatologists and radiation oncologists (who presumably also see non-elderly patients) received on average $200,000 and $360,000 respectively. The important question will be why, and should it continue?
Figure 1: Distribution of Total Medicare Pay by Provider Type, 2012 (Source: Author's calculations based on Medicare data released in April 2014)
Data is revealing important indicators of cost and pricing – a major contributor to rising health care costs.
Why is it that a brief visit with a geriatrician is worth $13; a 45-minute visit with a geriatrician sorting through medications, educating family members, and developing a quality-of-life plan with a terminal cancer patient is worth $79; and a dermatologist treating suspected skin cancer can earn upwards of $600 for a procedure that takes minutes?
Data sheds light on practice patterns. The data is also revealing important variances in utilization of drugs and treatments. For example, a block apart on Park Avenue, two ophthalmologists differ significantly in their use of treatments for macular degeneration. One uses expensive injectable drugs and gets paid over $10,000 per injection, while the other receives less than $500 for the lower-cost equivalent. A CBS News report looked at spinal fusion surgeries – a procedure for which there is almost no evidence demonstrating a net benefit to patients compared to other conservative therapies. They observed that “while the average spine surgeon performed them on 7 percent of patients they saw, some did so on 35 percent.” At the extremes, outlier “practice patterns” begin to raise questions of potential improper billing or outright fraud and abuse. For example, simply looking at the frequency and volume of services provided to individual beneficiaries can identify concerning outliers. One laboratory company billed for 28,954 blood glucose reagent strips in 2012 – for 88 patients. And yes, that’s highly unusual.
Figure 2: "Outlier" Medicare Billing for Blood Glucose Reagent Strips, 2012 (Source: Author's calculations based on Medicare data released in April 2014)
One clinical social worker billed for 1,697 separate days of service on 28 patients (the size of the bubble is proportional to the total amount of reimbursement by Medicare in 2012).
Figure 3: "Outlier" Medicare Billing for Days of Service, 2012 (Source: Author's calculations based on Medicare data released in April 2014)
The most extreme outlier, Dr.
Gary Ordog, was named by NPR and ProPublica in their examination of providers who are outliers in their pattern of coding for the highest-intensity office visits. Ordog had previously lost the right to bill California’s state Medicaid program, and yet continued to charge Medicare for over $500,000 in billing in 2012. It’s important to caution, however, that even with these extreme outliers, statistics alone cannot provide definitive evidence of abuse. There is a need for formal investigation. Medicare and law enforcement officials will need to create new processes for dealing with a potential flood of outlier reports from amateur sleuths like me.
What's Next for Medicare Data?
Data can be trended. Updates of data releases can begin to show us not just snapshots, but moving pictures of our healthcare system as it undergoes rapid changes. The New York Times reported on the increase in charges for certain frequent causes of hospitalization between 2011 and 2012. It will be interesting to see whether the data release itself, and Steven Brill’s landmark Time article on hospital charges, have an impact on reversing these trends.
Data can be “mashed up”. The value of open data is hugely greater than the sum of its parts. As more and more data becomes available, the files can be cross-linked and “mashed up” to answer questions no one database could have. ProPublica combined data on state actions and sanctions against physicians with the Medicare data release to ask why these physicians are still being paid by Medicare. What does the future hold? Correlations with drug prescribing data, meaningful use, and referral patterns are possible today; Sunshine Act disclosures, quality reporting, and much more are soon to come. As analysts get comfortable with the data and move past the basics of arithmetic and sorting, we have an opportunity to make more ‘meaningful use’ of this data.
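The outlier screens described in this piece (services per beneficiary, days of service per patient) boil down to a simple peer-group comparison. A minimal sketch follows; the field names, sample figures, and median-multiple threshold are assumptions chosen for demonstration, not any official CMS screening method.

```python
# Sketch: flag providers whose services-per-beneficiary ratio sits far above
# their peer group's norm. Field names and data are illustrative only.
import statistics

claims = [
    {"npi": "200", "service": "glucose strips", "services": 310,   "beneficiaries": 80},
    {"npi": "201", "service": "glucose strips", "services": 290,   "beneficiaries": 75},
    {"npi": "202", "service": "glucose strips", "services": 28954, "beneficiaries": 88},
    {"npi": "203", "service": "glucose strips", "services": 260,   "beneficiaries": 70},
]

def flag_outliers(claims, multiple=5.0):
    """Return NPIs whose per-beneficiary volume exceeds `multiple` times the
    peer-group median. The median is used rather than the mean because a single
    extreme biller would drag the mean (and hide itself)."""
    ratios = {c["npi"]: c["services"] / c["beneficiaries"] for c in claims}
    cutoff = multiple * statistics.median(ratios.values())
    return sorted(npi for npi, r in ratios.items() if r > cutoff)

print(flag_outliers(claims))  # → ['202']
```

As the text cautions, a screen like this only surfaces candidates for formal investigation; an extreme ratio is a statistical red flag, not evidence of abuse.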
We can begin to identify practice patterns, overuse, variations in geography or demographics, and potentially even fraud and abuse. What will determine the value of the Medicare data release is the creativity of the data scientists, epidemiologists, and health services researchers (amateur as well as professional) who can ask the challenging questions that must be answered. Authors Farzad Mostashari Full Article
va The thing both conservatives and liberals want but aren’t talking about By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 What does it mean to say that the Republican Party is on the "right"? Shadi Hamid distinguishes between conservative values and those of the latest iteration of the Republican Party, while exploring the shared values of both liberals and conservatives. Full Article Uncategorized
va Class Notes: Harvard Discrimination, California’s Shelter-in-Place Order, and More By webfeeds.brookings.edu Published On :: Fri, 08 May 2020 19:21:40 +0000 This week in Class Notes: California's shelter-in-place order was effective at mitigating the spread of COVID-19. Asian Americans experience significant discrimination in the Harvard admissions process. The U.S. tax system is biased against labor in favor of capital, which has resulted in inefficiently high levels of automation. Our top chart shows that poor workers are much more likely to keep commuting in… Full Article
va Multiple Vantage Points on the Seoul G-20 Summit By webfeeds.brookings.edu Published On :: Thu, 25 Nov 2010 11:47:00 -0500 Editor’s Note: The National Perspectives on Global Leadership (NPGL) project reports on public perceptions of national leaders’ performance at important international events. This fifth installment of the NPGL Soundings provides insight on the issues facing leaders at the Seoul G-20 Summit and the coverage they received in their respective national media. Read the other commentary » The fifth G-20 Summit, held in Seoul, seems to show signs of a gradual maturing of the process and the forum as a mechanism for communication among leaders and a means of connecting leaders and finance ministers with their national publics, judging from National Perspectives on Global Leadership (NPGL) country commentaries. These growing strengths — looking from the G-20 capitals toward the Seoul summit contrasted with looking from the summit toward the countries — seemed particularly impressive at this Seoul summit, which was characterized by the most intense policy conflicts yet at a G-20 meeting.
Policy Conflicts and the Trajectory of G-20 Summits
The responses to the first question — “Did coverage seem to threaten or enhance the viability of G-20 summits?” — seemed to indicate that, despite the conflicts over external imbalances and currency policies, these issues did not threaten the viability of the G-20 summits as much as one might have expected.
Given the focus of the NPGL project on national leadership, what is interesting about this positive result is that the coverage in the media was not just of the debate itself, but the portrayal of their national leader at the summit.With the exception of an excellent and balanced article on Saturday, November 13 in The Washington Post by Howard Schneider and Scott Wilson, the coverage in Washington and in the Financial Times would lead readers to conclude that the Seoul G-20 Summit was less successful than anticipated, and did not enhance the viability of G-20 summits as much as the Koreans hoped it would.“Agreements did not have to be worked out,” Andrew Cooper wrote, quoting Canadian Prime Minister Stephen Harper, “this month or next month in order to avert [a] cataclysm…I’m confident we will make progress over time.”Olaf Corry reported from London that UK Prime Minister David Cameron was quoted in The Guardian as saying that rebalancing “is being discussed in a proper multilateral way without resort to tit-for-tat measures and selfish policies.”U.S. 
President Barack Obama said in his press conference that “in each of these successive summits we’ve made real progress.”Lan Xue and Yanbing Zhang wrote that Chinese President Hu Jintao “highlighted the importance of (the) framework (for strong, sustainable and balanced growth) and also pointed out that it should be further improved,” a far cry from a rejection of it.“In contrast to previous summits,” Peter Draper reported from Johannesburg, “President Zuma’s interventions did receive some press coverage at home…To judge from this coverage, he seems to have played his cards reasonably well and to have been visible.”Melisa Deciancio commented from Buenos Aires that ”Cristina Fernandez’s contribution to the G-20 summits has always been substantive…She has also called the members of the (G-20) to work together, cooperate and avoid entering into conflict in relation to the ongoing currency war between China and the U.S.”“Both (German Chancellor Angela) Merkel and (finance minister) Schaeuble spent considerable effort to explain the positive aspects of summit agreements and praised the ‘spirit of cooperation,’” reported Thomas Fues from Germany.In each of the cases above, the leader offered a positive interpretation of the Seoul G-20 Summit and the G-20 summit process even in the context of intense policy disputes, which constrained the practical agreements that could have been reached, especially on the global economic adjustment issues. This optimistic stance indicates a forward movement by G-20 leaders on a metric of global leadership in Seoul that the four previous NPGL “Soundings” had found to be wanting at previous summits.In some countries, the problem continued with the press focusing on the shortcomings and failures of the Seoul G-20 Summit, including the coverage in the influential Financial Times. 
G-20 leaders were, however, more aggressive in pushing against the media’s interpretation of weakness and failures at the G-20, advancing an alternative narrative that focused on the gradual progress being made and stronger relationships developing with each G-20 summit experience. Leaders now need to ensure that the G-20 “framework” and the “mutual assessment process” (MAP) of peer review that goes with it are able to deliver a credible way forward for global economic adjustment by the time of the French G-20 Summit in November 2011.
Global Economic Adjustment as a Visible Theme
With regard to question two — “How was the rebalancing issue dealt with?” — the common thread running through each of the country commentaries is reflected in Olaf Corry’s comment that “explicit mention of the G-20’s formal ‘framework for strong, sustainable and balanced growth’ is very sparse in UK public debate, but the themes it highlights definitely shine through.” The one exception may have been the explicit, detailed understanding of the issue conveyed by Schneider and Wilson in their Washington Post article titled “G-20 nations agree to agree; Pledge to heed common rules; but economic standards have yet to be met.” (See U.S. country commentary.)The G-20 framework and the MAP may not have received much visibility or coverage from the media, but the intensity of the currency wars, the debate about U.S. quantitative easing (QE 2) and the differences over current account targets were all widely covered, and the message communicated to most publics was that global imbalances are a real problem for all countries and a concerted global economic adjustment is essential.
The G-20 leaders will, therefore, have to do far more than simply explain the process to their publics; they need to continue to push each other and their economic officials to reach agreement on a path forward by the time of the French summit in November of 2011.The difficulty of reaching agreement is reflected in a comment by Ryozo Hayashi of Japan who wrote, “Therefore, it sounds wise to let these countries (the U.S. and China) keep their current policy paths with a political commitment to avoid a currency war and for the G-20 to agree to develop economic indicators. It may become urgent or it may become irrelevant as the situation develops. Given the difficulty of establishing agreed economic indicators, the time element would be important.”Leadership at Summits and Its Linkages to Domestic Political Support What emerged more clearly at this summit than in previous G-20 summits was the degree to which the role of individual countries and their leaders (or finance ministers) in G-20 processes had domestic political valence in their home countries.“The amount of attention devoted by the media to this summit was considerably more than previous ones,” wrote Andres Rozental, “partially because the Calderón administration will host the G-20 in 2012 and Mexico is now part of the G-20 ‘troika.’”Thomas Fues commented that “The media also appreciatively noted that Germany had been asked to co-chair the G-20 working group on the international currency system, tasked with formulating policy proposals” for the French G-20 Summit. 
In South Africa, Peter Draper also found that the press paid attention to the fact that it co-chairs the G-20 working group on development with South Korea, and “the importance of this group’s work to the future of the G-20.”“In terms of summit diplomacy,” wrote Andrew Cooper, “Harper’s main success was in gaining the role for Canada as one of the co-chairs (with India, supported by the International Monetary Fund [IMF]) with respect to the process of working out a set of economic indicators that all members of the G-20 could use as guideposts for a stable global economy.”This is all evidence that G-20 activities now generate positive repercussions in domestic public opinion. Other dimensions of linkages between international committee positions assumed at G-20 summits and domestic political capital are beginning to emerge as the G-20 matures.In South Africa, Finance Minister Gordhan’s strong criticism of U.S. QE2 in the international press seems “to have added to his growing reputation at home” commented Peter Draper.German Finance Minister Schaeuble’s criticism of the U.S. Federal Reserve’s move as “clueless,” “forced Merkel to reiterate unswerving support of her key official” at the Seoul summit, Thomas Fues noted.Cristina Fernandez has consistently and adroitly used her substantive policy positions at G-20 summits to buttress her position at home. Argentina is head of the G77, so Argentine support for development increases its status as a leader of the South and her domestic prestige. 
Argentine discontent with the IMF has been legendary since the 1990s; President Fernandez’s support for the G-20 framework and MAP process stems from its being an alternative to the IMF Article IV exercise, which most Argentines oppose, reported Melisa Deciancio.
Conclusion
Despite media attention being riveted on the showdown between the United States, Germany and China on currency manipulation and external imbalances at the Seoul G-20 Summit, leaders defended the G-20 processes for working through these issues over time, rather than emphasizing the failure to reach agreement at Seoul. The leaders and their finance ministers found that taking an aggressive stance on key issues paid dividends in terms of their domestic political support. Explicit efforts by leaders to link international policies to domestic politics are a positive step forward for G-20 summits toward a greater engagement between leaders and their publics. NPGL observers have been watching this dimension of G-20 summitry in London, Pittsburgh, Toronto and now Seoul. (See: www.cigionline.org; Papers; “Soundings”)The challenge going forward will be finding a way to align global economic adjustment policy with domestic political linkages in a consistent and reinforcing manner that allows for policy convergence rather than the divergence manifested at the Seoul G-20 Summit. Authors Colin I. Bradford Publication: NPGL Soundings, November 2010 Full Article
va Governance innovations for implementing the post-2015 Sustainable Development Agenda By webfeeds.brookings.edu Published On :: Mon, 30 Mar 2015 09:00:00 -0400 Event Information March 30, 20159:00 AM - 5:00 PM EDTFalk AuditoriumBrookings Institution1775 Massachusetts Avenue NWWashington, DC 20036 2015 is a crucial year for the international community. For the first time, all nations will converge upon a new set of Sustainable Development Goals applicable to advanced countries, emerging market economies, and developing countries, with the experience of implementing the Millennium Development Goals to build upon. Implementation is the critical component. The Brookings Global Economy and Development program hosted a day-long private conference at the Brookings Institution in Washington, DC on Monday, March 30 to focus on “Governance innovations for implementing the post-2015 Sustainable Development Agenda.” Hosted in collaboration with the Ministry of Foreign Affairs of Finland, this high-level conference drew on experiences from the North-South Helsinki Process on Globalization and Development carried out over the past 15 years. The Helsinki Process presaged many of the prerequisites for achieving accelerated progress by linking goal-setting to goal-implementation and by utilizing multistakeholder processes to mobilize society and financing for social and environmental goals to complement sound economic and financial policies. Download the conference agenda » Download the related report » Download the list of registrants » Download the conference statement » Brookings President Strobe Talbott shakes hands with Finland’s Minister of Foreign Affairs Erkki Tuomioja after welcoming participants to the conference. Former President of Finland Tarja Halonen shares insights in the conference’s opening panel. 
Over 75 conference participants from governments, multilateral institutions, civil society, the private sector, and think tanks participated in a number of roundtable discussions throughout the day. President Halonen and Minister Tuomioja share lessons from the Helsinki process as conference participants consider paths forward for implementing the post-2015 Sustainable Development Goals. Full Article
va Political decisions and institutional innovations required for systemic transformations envisioned in the post-2015 sustainable development agenda By webfeeds.brookings.edu Published On :: Tue, 08 Sep 2015 11:04:00 -0400 2015 is a pivotal year. Three major workstreams among all the world’s nations are going forward this year under the auspices of the United Nations to develop goals, financing, and frameworks for the “post-2015 sustainable development agenda.” First, after two years of wide-ranging consultation, the U.N. General Assembly in New York in September will endorse a new set of global goals for 2030 to follow on from the Millennium Development Goals (MDGs) that culminate this year. Second, to support this effort, a financing for development (FFD) conference took place in July in Addis Ababa, Ethiopia, to identify innovative ways of mobilizing private and public resources for the massive investments necessary to achieve the new goals. And third, in Paris in December the final negotiating session will complete work on a global climate change framework. These three landmark summits will, with luck, provide the broad strategic vision, the specific goals, and the financing modalities for addressing the full range of systemic threats. Most of all, these three summit meetings will mobilize the relevant stakeholders and actors crucial for implementing the post-2015 agenda—governments, international organizations, business, finance, civil society, and parliaments—into a concerted effort to achieve transformational outcomes. Achieving systemic sustainability is a comprehensive, inclusive effort requiring all actors and all countries to be engaged. These three processes represent a potential historic turning point away from “business-as-usual” practices and trends and toward making the systemic transformations that are required to avoid transgressing planetary boundaries and critical tipping points.
Missing from the global discourse so far is a realistic assessment of the political decisions and institutional innovations that would be required to implement the post-2015 sustainable development agenda (P2015). For 2015, it is necessary to make sure that by the end of the year the three workstreams have been welded together as a singular vision for global systemic transformation involving all countries, all domestic actors, and all international institutions. The worst outcome would be that the new Sustainable Development Goals (SDGs) for 2030 are seen as simply an extension of the 2015 MDGs—as only development goals exclusively involving developing countries. This outcome would abort the broader purposes of the P2015 agenda to achieve systemic sustainability and to involve all nations and reduce it to a development agenda for the developing world that by itself would be insufficient to make the transformations required. Systemic risks of financial instability, insufficient job-creating economic growth, increasing inequality, inadequate access to education, health, water and sanitation, and electricity, “breaking points” in planetary limits, and the stubborn prevalence of poverty along with widespread loss of confidence of people in leaders and institutions now require urgent attention and together signal the need for systemic transformation. As a result, several significant structural changes in institutional arrangements and governance are needed as prerequisites for systemic transformation.
These entail (i) political decisions by country leaders and parliaments to ensure societal engagement, (ii) institutional innovations in national government processes to coordinate implementation, (iii) strengthening the existing global system of international institutions to include all actors, (iv) the creation of an international monitoring mechanism to oversee systemic sustainability trajectories, and (v) realizing the benefits that would accrue to the entire P2015 agenda from the engagement of the systemically important countries through fuller utilization of G-20 leaders summits and finance ministers meetings as enhanced global steering mechanisms toward sustainable development. Each of these changes builds on and depends on the others.
I. Each nation makes a domestic commitment to a new trajectory toward 2030
For global goal-setting to be implemented, it is essential that each nation go beyond a formal agreement at the international level to then embark on a national process of deliberation, debate, and decision-making that adapts the global goals to the domestic institutional and cultural context and commits the nation to them as a long-term trajectory around which to organize its own systemic transformation efforts. Such a process would be an explicitly political process involving national leaders, parliaments or rule-making bodies, societal leaders, business executives, and experts to increase public awareness and to guide the public conversation toward an intrinsically national decision which prioritizes the global goals in ways which fit domestic concerns and circumstances. This political process would avoid the “one-size-fits-all” approach and internalize and legitimate each national sustainability trajectory. So far, despite widespread consultation on the SDGs, very little attention has been focused on the follow-up to a formal international agreement on them at the U.N. General Assembly in September 2015.
The first step in implementation of the SDGs and the P2015 agenda more broadly is to generate a national commitment to them through a process in which relevant domestic actors modify, adapt, and adopt a national trajectory that embodies the hopes, concerns and priorities of the people of each country. Without this step, it is unlikely that national systemic sustainability trajectories will diverge significantly enough from business-as-usual trends to make a difference. More attention now needs to be given to this crucial first step. And explicit mention of the need for it should appear in the UNGA decisions in New York in September.
II. A national government institutional innovation for systemic transformation
The key feature of systemic risks is that each risk generates spillover effects that go beyond the confines of the risk itself into other domains. This means that to manage any systemic risk requires broad, inter-disciplinary, multi-sectoral approaches. Most governments have ministries or departments that manage specific sectoral programs in agriculture, industry, energy, health, education, environment, and the like, even though most challenges now are inter-sectoral and hence inter-ministerial. Furthermore, spillover linkages create opportunities in which integrated approaches to problems can capture intrinsic synergies that generate higher-yield outcomes if sectoral strategies are simultaneous and coordinated. The consequence of spillovers and synergies for national governments is that “whole-of-government” coordinating committees are a necessary institutional innovation to manage effective strategies for systemic transformation. South Korea has used inter-ministerial cabinet-level committees that include private business and financial executives as a means of addressing significant interconnected issues or problems requiring multi-sectoral approaches.
The Korea Presidential Committee on Green Growth, which contained more than 20 ministers and agency heads with at least as many private sector leaders, proved to be an extremely effective means of implementing South Korea’s commitment to green growth.
III. A single global system of international institutions
The need for a single mechanism for coordinating the global system of international institutions to implement the P2015 agenda of systemic transformation is clear. However, there are a number of other larger reasons why the forging of such a mechanism is crucial now. The Bretton Woods era is over. It was over even before the initiative by China to establish the Asian Infrastructure Investment Bank (AIIB) in Beijing and the New Development Bank (NDB) in Shanghai. It was over because of the proliferation in recent years of private and official agencies and actors in development cooperation and because of the massive growth in capital flows that dwarf not only official development assistance (concessional foreign aid) but also IMF resources in the global financial system. New donors are not just governments but charities, foundations, NGOs, celebrities, and wealthy individuals. New private sources of financing have mushroomed with new forms of sourcing and new technologies. The dominance of the IMF and the World Bank has declined because of these massive changes in the context. The emergence of China and other emerging market economies requires acknowledgement as a fact of life, not as a marginal change. China in particular deserves to be received into the world community as a constructive participant and have its institutions be part of the global system of international institutions, not apart from it.
Indeed, China’s Premier, Li Keqiang, stated at the World Economic Forum in early 2015 that “the world order established after World War II must be maintained, not overturned.” The economic, social and environmental imperatives of this moment are that the world’s people and the P2015 agenda require that all international institutions of consequence be part of a single coordinated effort over the next 15 years to implement the post-2015 agenda for sustainable development. The geopolitical imperatives of this moment also require that China and China’s new institutions be thoroughly involved as full participants and leaders in the post-2015 era. If nothing else, the scale of global investment and effort to build and rebuild infrastructure requires it. It is also the case that the post-2015 era will require major replenishments in the World Bank and existing regional development banks, and significantly stronger coordination among them to address global infrastructure investment needs in which the AIIB and the NDB must now be fully involved. The American public and the U.S. Congress need to fully grasp the crucial importance for the United States of the IMF quota increase and governance reform. These have been agreed to by most governments but their implementation is stalled in the U.S. Congress. To preserve the IMF’s role in the global financial system and the role of the U.S. in the international community, the IMF quota increase and IMF governance reform must be passed and put into practice. Congressional action becomes all the more necessary as the effort is made to reshape the global system of international institutions to accommodate new powers and new institutions within a single system rather than stumble into a fragmented, fractured, and fractious global order where differences prevail over common interests. The IMF cannot carry out its significant responsibility for global financial stability without more resources.
Other countries cannot add to IMF resources proportionately without U.S. participation in the IMF quota increase. Without the U.S. contribution, IMF members will have to fund the IMF outside the regular IMF quota system, which means de facto going around the United States and reducing dramatically the influence of the U.S. in the leadership of the IMF. This is a self-inflicted wound on the U.S., which will damage U.S. credibility, weaken the IMF, and increase the risk of global financial instability. By blocking the IMF governance reforms agreed to by the G-20 in 2010, the U.S. is single-handedly blocking the implementation of the enlargement of voting shares commensurate with increased emerging market economic weights. This failure to act is now widely acknowledged by American thought leaders to be encouraging divergence rather than convergence in the global system of institutions, damaging U.S. interests.
IV. Toward a single monitoring mechanism for the global system of international institutions
The P2015 agenda requires a big push toward institutionalizing a single mechanism for the coordination of the global system of international institutions. The international coordination arrangement today is the Global Partnership for Effective Development Cooperation created at the Busan High-Level Forum on Aid Effectiveness in 2011. This arrangement, which recognizes the increasingly complex context and the heightened tensions between emerging donor countries and traditional western donors, created a loose network of country platforms, regional arrangements, building blocks and forums to pluralize the architecture to reflect the increasingly complex set of agents and actors. This was an artfully arranged compromise, responding to the contemporary force field four years ago. Now is a different moment. The issues facing the world are both systemic and urgent; they are not confined to the development of developing countries, and still less to foreign aid.
Geopolitical tensions are, if anything, higher now than then. But they also create greater incentives to find areas of cooperation and consensus among major powers who have fundamentally different perspectives on other issues. Maximizing the sweet spots where agreement and common interest can prevail is now of geopolitical importance. Gaining agreement on institutional innovations to guide the global system of international institutions in the P2015 era would be vital for effective outcomes but would also importantly ease geopolitical tensions. Measurement matters; monitoring and evaluation is a strategic necessity for implementing any agenda, and still more so for an agenda of systemic transformation. As a result, the monitoring and evaluation system that accompanies the P2015 SDGs will be crucial to guiding their implementation. The UN, the OECD, the World Bank, and the IMF all have participated in joint data gathering efforts under the IDGs in the 1990s and the MDGs in the 2000s. Each of these institutions has a crucial role to play, but they need to be brought together now under one umbrella to orchestrate their contributions to a comprehensive global data system and to help the G-20 finance ministers coordinate their functional programs. The OECD has established a strong reputation in recent years for standard setting in a variety of dimensions of the global agenda. Given the strong role of the OECD in relation to the G-20 and its broad outreach to “Key Partners” among the emerging market economies, the OECD could be expected to take a strong role in global benchmarking and monitoring and evaluation of the P2015 Agenda. The accession of China to the OECD Development Centre, which now has over fifty member countries, and the presence and public speech of Chinese Premier Li Keqiang at the OECD on July 1st, bolsters the outreach of the OECD and its global profile. But national reporting is the centerpiece and the critical dimension of monitoring and evaluation.
To guide the national reporting systems and evaluate their results, a new institutional arrangement is needed that is based on national leaders with responsibility for implementation of the sustainable development agendas from each country and is undertaken within the parameters of the global SDGs and the P2015 benchmarks.
V. Strengthening global governance and G-20 roles
G-20 leaders could make a significant contribution to providing the impetus toward advancing systemic sustainability by creating a G-20 Global Sustainable Development Council charged with pulling together the national statistical indicators and implementing benchmarks on the SDGs in G-20 countries. The G-20 Global Sustainable Development Council (G-20 GSDC) would consist of the heads of the presidential committees on sustainable development charged with coordinating P2015 implementation in G-20 countries. Representing systemically important countries, they would also be charged with assessing the degree to which national policies and domestic efforts by G-20 countries generate positive or negative spillover effects for the rest of the world. This G-20 GSDC would also contribute to the setting of standards for the global monitoring effort, orchestrated perhaps by the OECD, drawing on national databases from all countries using the capacities of the international institutions to generate understanding of global progress toward systemic sustainability. The UN is not in a position to coordinate the global system of international institutions in their functional roles in global sustainable development efforts. The G-20 itself could take steps through the meetings of G-20 Finance Ministers to guide the global system of international institutions in the implementation phase of the P2015 agenda to begin in 2016. The G-20 already has a track record in coordinating international institutions in the response to the global financial crisis in 2008 and its aftermath.
The G-20 created the Financial Stability Board (FSB), enlarged the resources for the IMF, agreed to reform the IMF’s governance structure, orchestrated relations between the IMF and the FSB, brought the OECD into the mainstream of G-20 responsibilities and has bridged relations with the United Nations by bringing in finance ministers to the financing for development conference in Addis under Turkey’s G-20 leadership. There is a clear need to coordinate the financing efforts of the IMF with the World Bank and the other regional multilateral development banks (RMDBs), with the AIIB and the BRICS NDB, and with other public and private sector funding sources, and to assess the global institutional effort as a whole in relation to the P2015 SDG trajectories. The G-20 Finance Ministers grouping would seem to be uniquely positioned to be an effective and credible means of coordinating these otherwise disparate institutional efforts. The ECOSOC Development Cooperation Forum and the Busan Global Partnership provide open inclusive space for knowledge sharing and consultation but need to be supplemented by smaller bodies capable of making decisions and providing strategic direction. Following the agreements reached in the three U.N. workstreams for 2015, the China G-20 could urge the creation of a formal institutionalized global monitoring and coordinating mechanism at the China G-20 Summit in September 2016. By having the G-20 create a G-20 Global Sustainable Development Council (G-20 GSDC), it could build on the national commitments to SDG trajectories to be made next year by U.N. member countries and on the newly formed national coordinating committees established by governments to implement the P2015 Agenda, giving the G-20 GSDC functional effectiveness, clout and credibility.
Whereas there is a clear need to compensate for the size-biased representation of the G-20 with still more intensive G-20 outreach and inclusion, including perhaps eventually considering shifting to a constituency-based membership, for now the need in this pivotal year is to use the momentum to make political decisions and institutional innovations which will crystallize the P2015 strategic vision toward systemic sustainability into mechanisms and means of implementation. By moving forward on these recommendations, the G-20 Leaders Summits would be strengthened by involving G-20 leaders in the people-centered P2015 Agenda, going beyond finance to issues closer to people’s homes and hearts. Systemically important countries would be seen as leading on systemically important issues. The G-20 Finance Ministers would be seen as playing an appropriate role by serving as the mobilizing and coordinating mechanism for the global system of international institutions for the P2015 Agenda. And the G-20 GSDC would become the effective focal point for assessing systemic sustainability not only within G-20 countries but also in terms of their positive and negative spillover effects on systemic sustainability paths of other countries, contributing to standard setting and benchmarking for global monitoring and evaluation. These global governance innovations could re-energize the G-20 and provide the international community with the leadership, the coordination and the monitoring capabilities that it needs to implement the P2015 Agenda.
Conclusion
As the MDGs culminate this year, as the three U.N. workstreams on SDGs, FFD, and the UNFCCC are completed, the world needs to think ahead to the implementation phase of the P2015 sustainable development agenda. Given the scale and scope of the P2015 agenda, these five governance innovations need to be focused on now so they can be put in place in 2016.
These will ensure: (i) that national political commitment and engagement by all countries are secured through designing, adopting, and implementing their own sustainable development trajectories and action plans; (ii) that national presidential committees are established, composed of key ministers and private sector leaders, to coordinate each country's comprehensive, integrated sustainability strategy; (iii) that all governments and international institutions accept and participate in a single global system of international institutions; (iv) that a G-20 monitoring mechanism is created by the China G-20 in September 2016, comprised as a first step of the super-minister officials heading the national presidential coordinating committees implementing the P2015 agenda domestically in G-20 countries; and (v) that the G-20 summit leaders in Antalya in November 2015 and in China in September 2016 make clear their own commitment to the P2015 agenda and their responsibility for its adaptation, adoption and implementation within their own countries, for assessing G-20 spillover impacts on the rest of the world, and for deploying their G-20 finance ministers to mobilize and coordinate the global system of international institutions toward achieving the P2015 agenda. Without these five structural changes, most countries and actors are more likely to follow current trends than to ratchet up to the transformational trajectories necessary to achieve systemic sustainability nationally and globally by 2030.
References
Ye Yu, Xue Lei and Zha Xiaogang, "The Role of Developing Countries in Global Economic Governance---With a Special Analysis on China's Role", UNDP, Second High-level Policy Forum on Global Governance: Scoping Papers (Beijing: UNDP, October 2014).
Zhang Haibing, "A Critique of the G-20's Role in the UN's Post-2015 Development Agenda", in Catrina Schlager and Chen Dongxiao (eds.), China and the G-20: The Interplay between an Emerging Power and an Emerging Institution (Shanghai: Shanghai Institutes for International Studies [SIIS] and the Friedrich Ebert Stiftung [FES], 2015), 290-208; also in Global Review (Shanghai: SIIS, 2015), 97-105.
Colin I. Bradford, "Global Economic Governance and the Role of International Institutions", UNDP, Second High-level Policy Forum on Global Governance: Scoping Papers (Beijing: UNDP, October 2014).
Colin I. Bradford, "Action Implications of Focusing Now on Implementation of the Post-2015 Agenda" (Washington: The Brookings Institution, Global Economy and Development paper, September 2015).
Colin I. Bradford, "Systemic Sustainability as the Strategic Imperative for the Future" (Washington: The Brookings Institution, Global Economy and Development paper, September 2015).
Wonhyuk Lim and Richard Carey, "Connecting Up Platforms and Processes for Global Development to 2015 and Beyond: What Can the G-20 Do to Improve Coordination and Deliver Development Impact?" (Paris: OECD paper, February 2013).
Xiaoyun Li and Richard Carey, "The BRICS and the International Development System: Challenge and Convergence" (Sussex: Institute of Development Studies, Evidence Report No. 58, March 2014).
Xu Jiajun and Richard Carey, "China's Development Finance: Ambition, Impact and Transparency" (Sussex: Institute of Development Studies, IDS Policy Brief, 2015).
Soogil Young, "Domestic Actions for Implementing Integrated Comprehensive Strategies: Lessons from Korea's Experience with Its Green Growth Strategy" (Washington: paper for the Brookings conference on "Governance Innovations to Implement the Post-2015 Agenda for Sustainable Development", March 30, 2015).
Authors: Colin I. Bradford and Haibing Zhang
Full Article
Iran’s regional rivals aren’t likely to get nuclear weapons—here’s why By webfeeds.brookings.edu Published On :: Mon, 30 Nov -0001 00:00:00 +0000 In last summer’s congressional debate over the Iran nuclear deal, one of the more hotly debated issues was whether the deal would decrease or increase the likelihood that countries in the Middle East would pursue nuclear weapons. Bob Einhorn strongly believes the JCPOA will significantly reduce prospects for proliferation in the Middle East. Full Article
Renovating democracy: Governing in the age of globalization and digital capitalism By webfeeds.brookings.edu Published On :: Wed, 18 Sep 2019 20:13:04 +0000 The rise of populism in the West and the rise of China in the East have stirred a debate about the role of democracy in the international system. The impact of globalization and digital capitalism is forcing worldwide attention to the starker divide between the “haves” and the “have-nots,” challenging how we think about the… Full Article
What must corporate directors do? Maximizing shareholder value versus creating value through team production By webfeeds.brookings.edu Published On :: Mon, 15 Jun 2015 00:00:00 -0400 In our latest 21st Century Capitalism initiative paper, "What must corporate directors do? Maximizing shareholder value versus creating value through team production," author Margaret M. Blair explores how the share-value maximization norm (or the “short-termism” malady) came to dominate, why it is wrong, and why the “team production” approach provides a better basis for governing corporations over the long term. Blair reviews the legal and economic theories behind the share-value maximization norm and then lays out a theory of corporate law building on the economics of team production. Blair demonstrates how the team production theory recognizes that creating wealth for society as a whole requires recognizing the importance of all of the participants in a corporate enterprise, and making sure that all share in the expanding pie so that they continue to collaborate to create wealth. Arguing that the corporate form itself helps solve the team production problem, Blair details five features that distinguish corporations from other organizational forms:
- Legal personality
- Limited liability
- Transferable shares
- Management under a board of directors
- Indefinite existence
Blair concludes that these five characteristics are all problematic from a principal-agent point of view in which shareholders are principals. However, the team production theory makes sense of these arrangements. This theory provides a rationale for the role of corporate directors consistent with the role that boards of directors historically understood themselves to play: balancing competing interests so the whole organization stays productive.
Authors: Margaret M. Blair
Full Article