Tag: Policymakers

Putting evidence into practice: a framework for knowledge mobilization

Marta Pellegrini, University of Cagliari, Italy

Producing evidence about effective educational programs and making it available is not enough to ensure that it is used in practice. Knowledge mobilization should therefore be a key research area for fostering greater equity and responsiveness to educators’ needs. A study by Kaitlyn Fitzgerald and Beth Tipton examined the communication of statistical data, asking how such data should be reported to facilitate evidence-based decision-making at the school and policy levels.
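
One concrete example of this reporting challenge is the translation of standardized effect sizes into metrics practitioners can interpret. The sketch below is a minimal illustration rather than Fitzgerald and Tipton’s own method: it converts a hypothetical effect size into an “improvement index,” the expected percentile gain for an average student, in the spirit of the What Works Clearinghouse metric.

```python
from scipy.stats import norm

def improvement_index(effect_size: float) -> float:
    """Translate a standardized effect size (e.g., Cohen's d) into the
    expected percentile gain for an average control-group student.

    Mirrors the What Works Clearinghouse "improvement index": the
    percentile rank, in the control distribution, of a student who moves
    up by `effect_size` standard deviations, minus the 50th-percentile
    baseline.
    """
    return (norm.cdf(effect_size) - 0.5) * 100

# Hypothetical example: an effect size of 0.25 moves the average student
# from the 50th to roughly the 60th percentile.
print(f"Expected percentile gain: {improvement_index(0.25):.0f} points")
```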

The authors proposed three main considerations. First, invoking the maxim “the message sent may not be the message received,” they highlight that researchers often set reporting norms that practitioners do not understand. To overcome this issue, participatory research methods should be used to set those norms jointly with practitioners.

Second, practitioners and decision-makers differ from one another: they may work in large or small districts with varying resources, have diverse educational backgrounds, and operate in contexts that vary considerably with the communities they serve. Overcoming this issue requires a deep understanding of each community’s needs.

Finally, knowledge mobilization research should draw on other disciplines, such as data visualization, to create effective strategies for disseminating evidence.
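
To make the data-visualization point concrete, the sketch below plots hypothetical program effect sizes with 95% confidence intervals in a forest-plot style, so a practitioner can compare options and their uncertainty at a glance. All program names and numbers are invented.

```python
import matplotlib.pyplot as plt

# Hypothetical effect sizes and 95% confidence-interval half-widths
# (in standard-deviation units) for three invented programs.
programs = ["Program A", "Program B", "Program C"]
effects = [0.25, 0.10, 0.40]
ci_half_widths = [0.08, 0.12, 0.20]

fig, ax = plt.subplots(figsize=(6, 2.5))
ax.errorbar(effects, range(len(programs)), xerr=ci_half_widths,
            fmt="o", capsize=4)
ax.axvline(0, linestyle="--", linewidth=1)  # "no effect" reference line
ax.set_yticks(range(len(programs)))
ax.set_yticklabels(programs)
ax.set_xlabel("Effect size (SD units, 95% CI)")
ax.set_title("Comparing program effects at a glance")
fig.tight_layout()
plt.show()
```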

Evidence and policy

In a review of important 2017 releases, MDRC recently highlighted a memo to policymakers with recommendations for increasing research use and applying evidence to all policy decisions, educational and otherwise.

Recommendations included:

  • Programs and policies should be independently evaluated. To ensure high quality, evaluations should be directly relevant to policy, free of political or other influence, and credible to subjects and consumers.
  • The government should provide incentives for programs to apply evidence results to improve their performance.
  • Use a tiered evidence strategy, such as the one in the Every Student Succeeds Act (ESSA), to set clear evidence standards; a simplified sketch of the ESSA tiers follows this list.
  • Existing funding sources should be applied to generating evidence; a 1% set-aside was recommended.
  • Federal and state agencies should be allowed to access and share their data for evaluation purposes.
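
As one concrete illustration of the tiered strategy mentioned above, the sketch below summarizes the four ESSA evidence tiers as a simple lookup table. This is a simplified summary for illustration, not statutory or legal guidance.

```python
# Minimal sketch of the ESSA tiered-evidence framework: each tier maps to
# the kind of study design needed to qualify. Simplified for illustration.
ESSA_EVIDENCE_TIERS = {
    1: ("Strong evidence",
        "a well-designed and well-implemented experimental study (e.g., an RCT)"),
    2: ("Moderate evidence",
        "a well-designed and well-implemented quasi-experimental study"),
    3: ("Promising evidence",
        "a correlational study with statistical controls for selection bias"),
    4: ("Demonstrates a rationale",
        "a research-informed logic model, with evaluation planned or underway"),
}

for tier, (label, requirement) in ESSA_EVIDENCE_TIERS.items():
    print(f"Tier {tier} ({label}): requires {requirement}")
```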

Leveraging data to improve student outcomes

Results for America and the Council of Chief State School Officers (CCSSO) have released a new report, Leverage Points, that identifies thirteen opportunities for state education agencies (SEAs) to use their ESSA state plans to build and use evidence to improve student outcomes.

For each leverage point, the report includes:

  • a brief explanation of the leverage point;
  • the applicable section of the U.S. Department of Education’s former Consolidated State Plan template and, where relevant, the revised template;
  • a summary of the relevant statutory provision(s);
  • a set of actions states can take to promote evidence and continuous improvement; and
  • questions states should ask when considering which option(s) to pursue.

The authors say the report can be helpful to anyone interested in pursuing more evidence-based approaches, but that it may be particularly useful to staff charged with developing an SEA’s ESSA plan, as well as SEA leads for federal programs, research and evaluation, school improvement, performance management, and Title II/Title IV programs.

What works to increase research use?

A new systematic review from the UK’s EPPI-Centre at the Institute of Education looks at what works to increase research use by decision-makers. The review synthesized 23 existing reviews judged sufficiently relevant and methodologically sound.

There was reliable evidence that the following were effective:

  • Interventions that facilitate access to research evidence (for example, through communication strategies and evidence repositories), provided the intervention is also designed to enhance decision-makers’ opportunity and motivation to use evidence.
  • Interventions that build decision-makers’ skills in accessing and making sense of evidence (such as critical-appraisal training programs), provided the intervention is also designed to enhance both their capability and their motivation to use research evidence.

There was cautious evidence that the following was effective:

  • Interventions that change decision-making structures and processes by formalizing and embedding one or more of the other mechanisms of change within them (such as evidence-on-demand services that integrate push, user-pull, and exchange approaches).

There is reliable evidence that some individual intensive, complex interventions increase evidence use. Overall, though, simpler and more clearly defined interventions appear more likely to succeed.