Evaluation Research of DreamBox Learning Garners Recognition

When I was a classroom math teacher and a K-12 district math coordinator, I spent many days and weeks reviewing, piloting, and evaluating print and digital math curricular resources with my colleagues. What struck me was how few programs had meaningful efficacy research showing how they improve student learning—especially digital math programs. Given how rapidly the ed-tech industry continues to grow, and how often new digital math programs and apps crop up, it’s critical that schools and districts understand whether the curricular solutions they’re investing in are working. This question should be at the top of every educator’s mind: What evidence shows that a math program actually improves student achievement?

DreamBox Learning is committed to being a leader in efficacy research, which is why I’m proud that Digital Promise has awarded DreamBox Learning an Honorable Mention for Evaluation Research in its Research-Based Products Campaign. This recognition is an exciting acknowledgement of our dedication to efficacy research and our participation in independent studies that verify the demonstrable achievement gains students make when they use DreamBox Learning Math.

Recognizing the contributions of pioneers in ed-tech and digital learning, Digital Promise—an independent nonprofit organization committed to spurring innovation in education—launched a series of awards showcasing examples of how companies like DreamBox use and conduct research to build better products and services in three categories: Learning Sciences, User Research, and Evaluation Research. The award criteria for the Evaluation Research category were based on the rigor of the research design and methodology as well as a product’s real-world use. DreamBox was awarded Honorable Mention for a recent third-party study that was independently funded and conducted by the Center for Education Policy Research (CEPR) at Harvard University. This study found that DreamBox Learning Math is associated with improvement in math achievement for students in Grades 3–5. The study examined NWEA MAP, PARCC, and other state test scores of nearly 3,000 students in Howard County Public School System in Maryland and Rocketship Education in California. The study found that students improved nearly 4 percentile points after just 14 hours of DreamBox usage.


I’m also proud that an earlier third-party research study of DreamBox in Grades K–1 has been highlighted as an exemplary study by the Mathematica Center for Improving Research Evidence. To better equip educators to conduct their own research on digital programs, the U.S. Department of Education created the Ed Tech Rapid Cycle Evaluation (RCE) Coach. This digital tool references a Mathematica report entitled “Understanding Types of Evidence: A Guide for Educators” that dissects different types of reports to help educators understand which product claims are actually supported by research and which are merely speculative, potentially misleading, or mainly for marketing purposes. Mathematica’s report points to an SRI International study of DreamBox as a strong example of causal evidence from an independent evaluation. Because that study was a randomized controlled trial with a statistically significant finding of a 5.5 percentile point improvement as measured by the NWEA MAP assessment after 21 hours of DreamBox usage, Mathematica refers to the study as an example of the “gold-standard for establishing causal effects” of a curricular program. This SRI International study of DreamBox was also reviewed and validated by the What Works Clearinghouse; the study met their research standards without reservations.

For further support with research, I’ve created two other tools that help educators evaluate research studies and curricular programs: Finding What Works in Learning: A Rubric for Analyzing Research Studies of Curricular Programs and Best Practices for Evaluating Digital Curricula.

There will always be a need for more research that evaluates the efficacy of ed-tech solutions, and at DreamBox Learning we are proud to be at the forefront of this much-needed advancement in our ever-growing and evolving industry. We are honored to have participated in the research study conducted by CEPR at Harvard University and to be recognized by Digital Promise. We are excited to have independent evaluations that not only speak to DreamBox’s impact on improving student achievement, but are also recognized as strong studies in our field. To learn more about how research plays a key role at DreamBox Learning, read the full Harvard study and the SRI International study. To see the full roster of Digital Promise award winners, you can view the list here.

Tim Hudson

VP of Learning for DreamBox Learning, Inc., Hudson is a learning innovator and education leader who frequently writes and speaks about learning, education, and technology. Prior to joining DreamBox, Hudson spent more than 10 years working in public education, first as a high school mathematics teacher and then as the K–12 Math Curriculum Coordinator for the Parkway School District, a K–12 district of over 17,000 students in suburban St. Louis. While at Parkway, Hudson helped facilitate the district’s long-range strategic planning efforts and was responsible for new teacher induction, curriculum writing, and the evaluation of both print and digital educational resources. Hudson has spoken at national conferences such as ASCD, SXSWedu, and iNACOL.