ON THIS 25TH ANNIVERSARY of the Education Reform Act of 1993, the education reform effort in Massachusetts is divided, exhausted, and directionless. Given the unrelenting pressure from global competition and the large achievement gaps that remain, our leaders must find a way to rejuvenate improvement efforts. However, given the divisions among us, we should steer clear of any single “big-idea” policy initiative – such as raising the charter cap or creating new teacher evaluations. Rather, we need to recognize that innovation and improvement are best led by those closest to the work: teachers, principals, and local district leaders.

We need to create more opportunities for local change-makers to rally support for their ideas and to get the funding to test them. Moreover, those opportunities need to be provided every year, and not just as a one-time gesture. Here’s a three-part proposal to do so, with safeguards to ensure that state dollars are paying off for children:

The Educators’ Innovation Fund: The state should establish a competitive grant program and invite local practitioners to come forward with their best ideas – new models for teachers’ professional development, new ways to use technology to personalize learning, or strategies for reducing absenteeism. Applicants would specify the student outcomes they hope to improve – higher attendance, high school graduation or college-going rates, or gains in students’ math or reading achievement. To ensure a fair process, the proposals would be reviewed by panels of teachers and other education professionals. The winning proposals would receive up to three years of funding.

Here’s an important caveat: Each pilot would be no larger than necessary to discern impacts on student outcomes (e.g., 100 classrooms receiving the treatment and 100 comparison classrooms). It would be foolish to scale up any proposal more broadly until there is better evidence that it pays off for children.

Shining the Spotlight on Success: The state would contract with a third party to identify a comparison group of schools for each funded proposal. The comparison group would have demographics and recent achievement trends similar to those of the schools chosen for funding. (To save costs and ensure fairness, the same evaluator would assess the impact of all the winning proposals.) The contractor would then track the outcomes of the grant recipients and compare them against those of the comparison schools and classrooms. At the end of each year, the evaluator would provide a brief report to the state on whether the intended student outcomes improved (or not). At the end of the grant, interventions that were shown to improve student outcomes would be placed on a list of “effective innovations.” Each year, the state would invite the grantees to discuss with their peers what they did and what was learned.

The Efficacy Set-Aside: The state would set aside a portion of its per-student allocation to districts to be spent on interventions that have been demonstrated to improve student outcomes in the state. The amount of the set-aside could be determined in several ways: a constant value (e.g., $200 of the per-student allocation), a percentage of the per-student allocation (e.g., 10 percent), or a share of the increase in spending above a starting year (e.g., 100 percent of any increase above the 2016-17 allocation would have to be spent on “effective innovations”).
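For readers who want to see how the three options compare, here is a quick back-of-the-envelope illustration. The per-student dollar figures are hypothetical (only the $200, 10 percent, and 2016-17 baseline parameters come from the examples above):

```python
# Illustrative comparison of the three set-aside options described above.
# The allocation amounts are hypothetical, not actual Massachusetts figures.

def set_aside_constant(per_pupil, amount=200):
    """Option 1: a constant value per student (e.g., $200)."""
    return min(amount, per_pupil)

def set_aside_percentage(per_pupil, pct=0.10):
    """Option 2: a fixed share of the per-student allocation (e.g., 10 percent)."""
    return per_pupil * pct

def set_aside_increase(per_pupil, base_year):
    """Option 3: 100 percent of any increase above a starting-year allocation."""
    return max(0, per_pupil - base_year)

# Hypothetical example: $12,000 per student, up from $11,500 in 2016-17.
per_pupil = 12_000
base_2016_17 = 11_500

print(set_aside_constant(per_pupil))                # 200
print(set_aside_percentage(per_pupil))              # 1200.0
print(set_aside_increase(per_pupil, base_2016_17))  # 500
```

Under these illustrative numbers, the percentage option directs the most money toward proven interventions, while the increase-only option directs the least but shields existing district budgets.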

The above proposal would achieve three goals: First, the state would empower the change-makers working inside local agencies, who currently struggle against the demands of the status quo to get their ideas funded. Second, by assessing the impact of those ideas, the state could provide better evidence to guide decisions in districts around the state. Third, by setting aside a portion of state aid for interventions with demonstrated efficacy, taxpayers would know that their dollars are being used wisely.

This is the same process that has produced sustained progress in our biotech sector. In the pharmaceutical industry, 80 percent of new drugs fail the phase II clinical trials designed to test their efficacy. It is simply too hard to anticipate the many ways in which an intervention can fail. That means that, in order to make sustained progress, we need to test many ideas (not just the favored policy initiative of the day) and allow local practitioners to scale up the few that work.

So, over the next 25 years of educational improvement in Massachusetts, let us nurture the ingenuity of educators in traditional district schools and charter schools alike. At the same time, let’s not abandon what’s been built. Let’s repurpose the data systems that have been used solely for school accountability and compliance, and use them to identify the most promising local ideas. Rather than lurching to the next “big-idea” policy proposal, let’s create a system for soliciting ideas from local change-makers and testing them. That has been the path toward sustained improvement in fields such as health care and biotechnology. Inevitably, it will be the path to sustained improvement in education as well.

Thomas J. Kane is the Walter H. Gale Professor of Education and Economics at the Harvard Graduate School of Education and faculty director of the Center for Education Policy Research. In addition to leading the $60 million Measures of Effective Teaching Project for the Bill & Melinda Gates Foundation, he has studied a range of education policy topics including college financial aid, community colleges, remedial education and school accountability systems.