Lessons Learned From the Investing in Innovation Program

The i3 program’s support for the evaluation and growth of promising innovations should be part of increased investments in educational research and development.

A teacher helps a student who is reviewing for a geometry final in a Greeley, Colorado, high school, December 2016. (Getty/Melanie Stetson Freeman/The Christian Science Monitor)

In 1998, Angela Jerabek, a guidance counselor at a high school outside Minneapolis, grew frustrated watching too many ninth-graders fail courses year after year. Her principal encouraged her to come up with a new strategy to address the problem, knowing that failed classes too often lead students to drop out of high school.1 By the next school year, Jerabek had developed Building Assets, Reducing Risks (BARR)—a research-based program that “empowers educators to analyze real-time data and build upon individual strengths to support academic, social, and emotional success for every student.”2

BARR’s focus on relationships and data was successful at Jerabek’s St. Louis Park High School: Course failure rates dropped from 47 percent to 28 percent in one year and stabilized at less than 20 percent in subsequent years. Yet despite these promising results, the program did not expand beyond that one school for more than a decade. That changed in 2010, when BARR won a grant from the first Investing in Innovation (i3) competition and expanded to a suburban Los Angeles high school and two rural high schools in Maine.3

The American Recovery and Reinvestment Act (ARRA) of 2009 provided nearly $100 billion to the U.S. Department of Education “to stimulate the economy in the short term and invest in education and other essential public services to ensure the long-term economic health of our nation.”4 Nearly $49 billion from this stimulus was delivered to governors to stabilize education funding and other state services during the recession, nearly $25 billion went to school districts through major formula programs, and $17 billion was dedicated to increasing Pell Grant funding.

In addition, a small portion of ARRA’s funding, still an unprecedented sum for such efforts, was reserved for competitive grant programs, including $650 million to create the i3 program, which awards grants to test, implement, and scale educational interventions. i3 awards were divided into three tiers, with grant size tied to the strength of the existing evidence for each program: small “Development grants” went to new or weakly tested interventions; medium-sized “Validation grants” allowed interventions with some evidence to test their efficacy in new contexts or with different populations; and large “Scale-up grants” supported widespread expansion of established interventions with strong prior evidence.5

i3 and its renamed successor, the Education Innovation and Research (EIR) program, have invested approximately $1.7 billion through nearly 250 total grants to date.6,7 But education research remains underfunded, especially in relation to other policy areas.8 In fiscal year 2018, even if the EIR program’s funding had been included, the Education Department’s spending on research and development (R&D) still would have been only slightly more than 1 percent of what the U.S. Department of Health and Human Services spent.9

In order to help identify and scale successful new ideas, the United States can and should invest more in education R&D, and increased support for EIR should be part of that effort. Below are some of the key lessons learned from the early years of the i3 program. Overall, the authors find that an emphasis on evidence and outcomes is possible and that focused research priorities can have advantages. But there are still clear challenges in scaling effective programs and building the demand for high-quality research. In the end, no one program can do everything, and R&D investments should be judged on the impacts of the innovations they support, not just on how many investments succeed.

An emphasis on evidence and outcomes is possible

The Department of Education’s focus on generating and using rigorous evidence began years before i3. The founding of the Institute of Education Sciences (IES) in 2002 placed a focus on rigorous research—particularly randomized controlled trials—and the launch of the What Works Clearinghouse (WWC) allowed the IES to share results from studies that met high standards. Nor was this focus on evidence unique to education when i3 launched: Evidence-based programs were developing across government, including teen pregnancy prevention programs; home visiting programs that connected medical and child development professionals with first-time parents; and the Social Innovation Fund (SIF), which invests in the following “priority areas”: economic opportunity, healthy futures, and youth development.10

Fueled by hundreds of millions of federal dollars and a requirement that grantees secure partial matching funds from philanthropy, the i3 competition was designed to accelerate the Education Department’s evidence-based efforts through grants to school districts and nonprofits that funded high-quality evaluations of grantees’ programs as they expanded. These evaluations built the evidence base even when they did not find positive effects.11

Beth Boulay, a principal associate at Abt Associates who helped lead evaluation technical assistance for i3 grantees, told the authors that one of the most exciting outcomes of i3 is that more than 70 percent of the evaluations meet the rigorous WWC standards, creating a tremendous amount of high-quality research for the field.12 The initial i3 competition attracted nearly 1,700 applications, and thousands more have been submitted in subsequent rounds, even with much smaller regular appropriations after the stimulus, suggesting that the field has embraced this call for evidence-based solutions.13

While controversial, focused research priorities can have advantages

The initial Investing in Innovation competition in 2010 offered significant latitude in the strategies and priorities that grantees could address. Even just among the four grantees in the scale-up grant tier, awards went to a range of topics: Teach For America addressed teacher preparation, the KIPP Foundation expanded its charter network and trained principals, the Success for All Foundation expanded a whole-school turnaround model, and the Reading Recovery Council of North America implemented reading interventions aimed at first-graders reading below grade level.14

However, when subsequent rounds of the competition began to rely on smaller annual appropriations, rather than the one-time stimulus funding from the American Recovery and Reinvestment Act, the Department of Education developed more focused priorities, such as improving the effectiveness of principals and improving science, technology, engineering, and math education.15 Nadya Chinoy Dabby—the former assistant deputy secretary for the Office of Innovation and Improvement, which oversaw the program—told the authors that this strategy had multiple goals.16 Focused priorities allowed the Education Department to recruit peer reviewers with deep expertise in a particular area; to create cohorts of grantees working on similar issues who could learn from one another; to provide more strategic technical assistance; and to increase the odds that the field would learn something useful in a specific area.

But the decision to focus the program’s awards on particular priorities proved controversial over concerns that the “administration started getting too heavy-handed with the [competition’s] priorities.”17 The use of priorities was somewhat reined in when i3 was reauthorized and renamed Education Innovation and Research in 2015’s Every Student Succeeds Act (ESSA), as the new program places “a greater emphasis on the research priorities of the field in setting priorities for grants.”18 But the most recent competition still retained an element of focus by inviting proposals for “Field-Initiated Innovations—General” and the more directed “Field-Initiated Innovations—Science, Technology, Engineering, and Math (STEM).”19 A possible compromise could allow prioritization to be developed outside the Education Department’s politically appointed leadership, either by researchers at the Institute of Education Sciences or through an independent advisory body.

It is difficult to identify effective solutions, and it may be even harder to scale them

Education outcomes are affected by students, families, staff, schools, funding, policies, and an array of environmental factors that can all change. But even when programs such as i3 produce causal evidence that a program or intervention works, spreading those effective solutions across the country’s more than 13,000 public school districts and nearly 100,000 schools is not a given.20

Beyond the demand-side challenge of limited appetite for evidence-backed practices, growing the supply of an effective intervention presents its own difficulties. Especially for scale-up grants, which already have significant evidence of effectiveness, the most important questions may concern which factors drove the program’s success; whether those factors were consistent across different types of students, schools, and communities; how faithful implementation was in new settings; and whether costs for future implementations can be reduced to make them less reliant on grant funding and more sustainable when grants run out.

In an interview with the Social Innovation Research Center (SIRC) for the organization’s report on i3, former Education Department Deputy Secretary Jim Shelton expressed a desire for more investment in implementation studies, rather than impact studies, in order to better understand what drives impacts and why.21 Dabby also stressed that evaluation requirements could better convey that one goal of evaluation should be to understand how to operate programs more efficiently by the end of the grant.22

Building awareness of, and demand for, evidence is still a challenge

According to the authors’ phone interview with Patrick Lester, director of SIRC, i3’s grants had a major impact in the schools implementing effective interventions, but they are not yet having a broad impact on the field as a whole, since diffusion has been slow.23 i3 grantees were required to disseminate findings from their evaluations, but the dispersed network of grantees—ranging from relatively small school districts to large nonprofits—had varying capacity to reach key audiences. In interviews with the SIRC, some i3 grantees expressed a desire for the Education Department to play a bigger role in spreading the word about i3’s evidence, with one saying, “They need to disseminate to decision makers. As it is, each project does it on their own.”24

One of the Department of Education’s key mechanisms for disseminating research findings is the What Works Clearinghouse, but its review process does not prioritize i3 studies, and grantees expressed frustration to the SIRC about a yearslong waitlist for those reviews. Moreover, the findings that do make it into the WWC are not guaranteed to reach educators: In a 2015 survey about research use, a majority of district leaders reported that they “rarely” or “never” searched for or found research using the WWC.25

Even when schools are aware of these evidence-based programs, issues related to funding, internal capacity, and buy-in can derail growth efforts; schools may be drawn to programs by the grant funding that accompanies them rather than by a commitment to implementing new practices. Among some scale-up grantees, attrition rates were higher than expected even during the grant periods, as the availability of i3 funding may have attracted schools with insufficient buy-in to the programs.26

One of the most promising avenues for increasing demand for evidence-backed practices is ESSA’s inclusion of an evidence framework, similar to the i3 and Education Innovation and Research program requirements, that now guides school improvement investments.27 Lester notes that ESSA is encouraging the adoption of more evidence-based practices and that some states are now developing lists of interventions that meet their evidence standards.28

No one program can do everything

One frequent criticism of the i3 program is that it did not really fund new or groundbreaking early-stage innovations.29 While that observation is somewhat true, Dabby emphasized in her interview that i3 should not be thought of as an entire innovation agenda on its own, as it was not designed to develop brand-new “stage zero” programs or interventions.30 Boulay similarly described a “floor,” or minimum requirement, for even the smallest tier of i3 grants: a “fully baked intervention” that was ready to be rigorously evaluated.31

In particular, i3 and EIR’s smallest grant tier presents challenges within the Education Department’s peer review processes, in which grants are awarded based solely on reviewers’ assessments of what is included in applications. Unlike private sector or philanthropic investors, reviewers do not interview applicants’ management teams to probe their program designs and plans for growth, nor do they conduct external due diligence.32 Changes to these processes could help the programs make more informed decisions on the newest and riskiest grants.

A 2018 analysis of the final evaluation reports from the original cohort of i3 grantees showed that only 18 percent of the grantees demonstrated a positive impact on student academic outcomes, with many others showing null results for a range of reasons, including sample size and other evaluation factors.33 While one write-up labeled this a “dirty secret,”34 it actually represents a success rate somewhat better than is typical for education interventions; for example, a 2013 Coalition for Evidence-Based Policy study found that only 12 percent of interventions yielded positive effects.35 Moreover, the success rate of i3 grantees is comparable to that of clinical trials, which often ranges from 10 percent to 15 percent.36

Conclusion

Nine years after receiving its first i3 grant, Building Assets, Reducing Risks is now the first program to have climbed through all three levels of the i3 competition.37 A decade ago, this was a promising program in a single Minneapolis-area high school. Now, as a result of i3, BARR is on track to be used in 250 schools by 2021.38

Innovation is hard work. The i3 grants highlighted challenges building the supply of and demand for evidence-based practices, and the EIR program is still working toward striking a balance between focused priorities and leaving room for field-initiated innovations. However, these grants showed that an emphasis on evidence is both possible and popular, and they provided significant funding to grow many evidence-backed programs.

i3’s and EIR’s successes should be evaluated not solely on how many grants worked and received positive evaluation results but also on how those successful grantees affect the field as they grow, directly or indirectly. The United States needs to increase investments in educational research and development, and the i3/EIR model can continue to play a part in identifying, evaluating, and scaling promising innovations.

Neil Campbell is the director of innovation for K-12 Education Policy at the Center for American Progress. Abby Quirk is a research associate for K-12 Education at the Center.

This issue brief is part of the Moonshot for Kids project, a joint initiative from the Center for American Progress and the Thomas B. Fordham Institute to explore the rationale, potential, and possible design of a sizable new investment—whether by the federal government or large-scale philanthropy—in basic and applied research and development that leads to innovation on behalf of America’s children.39

Endnotes

  1. Elaine Allensworth and John Q. Easton, “The On-Track Indicator as a Predictor of High School Graduation” (Chicago: University of Chicago Consortium on School Research, 2005), available at https://consortium.uchicago.edu/publications/track-indicator-predictor-high-school-graduation.
  2. Kirstyn Flood, “The BARR Story: Celebrating 20 Years of the Model,” BARR Center, February 16, 2018, available at https://barrcenter.org/stories/angies-story/.
  3. Sarah D. Sparks, “In Maine, Intervention Smooths 9th Graders’ Paths,” Education Week, March 22, 2016, available at https://www.edweek.org/ew/articles/2016/03/23/in-maine-intervention-smooths-9th-graders-paths.html.
  4. U.S. Department of Education, “The Recovery and Reinvestment Act of 2009: Saving and Creating Jobs and Reforming Education,” March 7, 2009, available at https://www2.ed.gov/policy/gen/leg/recovery/implementation.html.
  5. U.S. Department of Education, “Investing in Innovation (i3),” available at https://www.ed.gov/open/plan/investing-innovation-i3 (last accessed October 2019).
  6. Insight Education Group, “Education Innovation and Research (EIR),” available at https://www.insighteducationgroup.com/education-innovation-and-research-program-2019-original (last accessed October 2019).
  7. U.S. Department of Education Office of Elementary and Secondary Education, “Funding and Legislation,” available at https://oese.ed.gov/offices/office-of-discretionary-grants-support-services/innovation-early-learning/education-innovation-and-research-eir/funding-and-legislation-5/ (last accessed October 2019).
  8. Kumar Garg, “Education Research Is One Area of Education the Federal Government Does Best. It’s Time for Congress to Boost Funding for Education R&D,” The 74, September 11, 2019, available at https://www.the74million.org/article/garg-research-is-one-area-of-education-the-federal-government-does-best-its-time-for-congress-to-boost-funding-for-education-rd/.
  9. Office of Management and Budget, “President’s Budget: Analytical Perspectives, Research and Development” (Washington: Executive Office of the President, 2019), available at https://www.whitehouse.gov/wp-content/uploads/2019/03/ap_21_research-fy2020.pdf.
  10. Ron Haskins and Greg Margolis, Show Me the Evidence: Obama’s Fight for Rigor and Results in Social Policy (Washington: Brookings Institution Press, 2014), p. 226; Corporation for National and Community Service, “Social Innovation Fund,” available at https://www.nationalservice.gov/programs/social-innovation-fund (last accessed November 2019).
  11. Alyson Klein and Sarah D. Sparks, “Investing in Innovation: An Introduction to i3,” Education Week, March 28, 2016, available at https://www.edweek.org/ew/articles/2016/03/23/investing-in-innovation-an-introduction-to-i3.html.
  12. Beth Boulay, principal associate, Abt Associates, interview with authors via phone, May 15, 2019; Beth Boulay and others, “The Investing in Innovation Fund: Summary of 67 Evaluations: Final Report” (Washington: U.S. Department of Education, 2018), available at https://ies.ed.gov/ncee/pubs/20184013/pdf/20184013.pdf.
  13. Klein and Sparks, “Investing in Innovation: An Introduction to i3.”
  14. Sarah D. Sparks and Lovey Cooper, “i3 Grants: Findings From the First Round,” Education Week, March 24, 2016, available at https://www.edweek.org/ew/section/multimedia/i3-grants-findings-from-the-first-round.html.
  15. Department of Education Office of Innovation and Improvement, “Notice: Applications for New Awards; Investing in Innovation Fund-Development Grants,” Federal Register 80 (2015): 16648–16660, available at https://www.federalregister.gov/documents/2015/03/30/2015-07213/applications-for-new-awards-investing-in-innovation-fund-development-grants.
  16. Nadya Chinoy Dabby, former assistant deputy secretary, U.S. Department of Education, interview with authors via phone May 6, 2019.
  17. Klein and Sparks, “Investing in Innovation: An Introduction to i3.”
  18. Ibid.
  19. Department of Education Office of Elementary and Secondary Education, “Notice: Applications for New Awards; Education Innovation and Research (EIR) Program-Early-Phase Grants,” Federal Register 84 (2019): 1093–1101, available at https://www.federalregister.gov/documents/2019/02/01/2019-00708/applications-for-new-awards-education-innovation-and-research-eir-program-early-phase-grants.
  20. National Center for Education Statistics, “Table 214.10. Number of public school districts and public and private elementary and secondary schools: Selected years, 1869-70 through 2016-17,” Digest of Education Statistics, available at https://nces.ed.gov/programs/digest/d18/tables/dt18_214.10.asp?current=yes (last accessed October 2019).
  21. Patrick Lester, “Investing in Innovation (i3): Strong Start on Evaluating and Scaling Effective Programs, But Greater Focus Needed on Innovation” (Washington: Social Innovation Research Center, 2017), available at http://socialinnovationcenter.org/wp-content/uploads/2017/01/SIRC-i3-report.pdf.
  22. Dabby, interview with authors.
  23. Patrick Lester, director, Social Innovation Research Center, interview with authors via phone, May 14, 2019.
  24. Lester, “Investing in Innovation (i3): Strong Start on Evaluating and Scaling Effective Programs, But Greater Focus Needed on Innovation.”
  25. William R. Penuel and others, “Findings from a National Study on Research Use Among School and District Leaders” (Boulder, CO: National Center for Research in Policy and Practice, 2016), available at http://ncrpp.org/assets/documents/NCRPP_Technical-Report_180302.pdf.
  26. Boulay and others, “The Investing in Innovation Fund: Summary of 67 Evaluations.”
  27. U.S. Department of Education, “Non-Regulatory Guidance: Using Evidence to Strengthen Education Investments” (Washington: 2016), available at https://www2.ed.gov/policy/elsec/leg/essa/guidanceuseseinvestment.pdf.
  28. Personal communication from Patrick Lester, director, Social Innovation Research Center, October 16, 2019; Patrick Lester, “How Using Proven Models and Practices Could Overcome Decades of Failure” (Washington: Social Innovation Research Center, 2018), available at http://socialinnovationcenter.org/wp-content/uploads/2018/03/CSI-turnarounds.pdf.
  29. Klein and Sparks, “Investing in Innovation: An Introduction to i3.”
  30. Personal communication from Nadya Chinoy Dabby, former assistant deputy secretary, U.S. Department of Education, October 22, 2019.
  31. Boulay, interview with authors.
  32. Lester, interview with authors; Lester, “Investing in Innovation (i3): Strong Start on Evaluating and Scaling Effective Programs, But Greater Focus Needed on Innovation.”
  33. Boulay and others, “The Investing in Innovation Fund: Summary of 67 Evaluations.”
  34. Jill Barshay, “The ‘dirty secret’ about educational innovation,” The Hechinger Report, December 17, 2018, available at https://hechingerreport.org/the-dirty-secret-about-educational-innovation/.
  35. Coalition for Evidence-Based Policy, “Randomized Controlled Trials Commissioned by the Institute of Education Sciences Since 2002: How Many Found Positive Versus Weak or No Effects” (Washington: 2013), available at http://coalition4evidence.org/wp-content/uploads/2013/06/IES-Commissioned-RCTs-positive-vs-weak-or-null-findings-7-2013.pdf.
  36. Derek Lowe, “A New Look at Clinical Success Rates,” In the Pipeline, Science Translational Medicine, February 2, 2018, available at https://blogs.sciencemag.org/pipeline/archives/2018/02/02/a-new-look-at-clinical-success-rates.
  37. Tara García Mathewson, “A little-known program has lifted 9th grade performance in virtually every type of school,” The Hechinger Report, August 30, 2018, available at https://hechingerreport.org/a-little-known-program-has-lifted-9th-grade-performance-in-virtually-every-type-of-school/.
  38. Flood, “The BARR Story: Celebrating 20 Years of the Model.”
  39. Michael J. Petrilli, “A Moonshot for Kids,” Thomas B. Fordham Institute, March 27, 2019, available at https://fordhaminstitute.org/national/commentary/moonshot-kids.

The positions of American Progress, and our policy experts, are independent, and the findings and conclusions presented are those of American Progress alone. American Progress would like to acknowledge the many generous supporters who make our work possible.
