# Why Enhanced Reporting for More Effective Teacher Practice and Differentiated Digital Lessons Matters

How do you measure and collect evidence of a student’s thinking and understanding in mathematics? And, equally important, as you collect that evidence, how do you meaningfully report the student’s progress and proficiency? These two questions drive our curriculum development and educator experience at DreamBox Learning, and our team is honored that our approach to them has earned another award. As a recipient of Tech & Learning’s Best Upgraded Product Award for Excellence, DreamBox continues celebrating our 10th year of improving student achievement in mathematics with a new accolade to add to our total of more than 40 industry awards. We are extremely proud of the innovations and inspirations that drive us, and I’d like to describe two of the award-winning improvements to you: new natively digital lessons and new reporting for learning guardians.

### How Do You Collect Evidence of Thinking in Mathematics?

At some point many decades ago, it was decided that student achievement in math would largely be measured by standardized multiple-choice tests, often composed of low-level skill problems. Such assessments are extremely limited in their ability to uncover a student’s thinking or ascertain a student’s depth of understanding. Despite their inability to yield accurate evidence of deep student learning, these tests continue to be used because they are relatively easy to administer at scale and extremely efficient to score. Many tests even claim to determine “mastery,” even though “Benjamin Bloom, the founder of modern mastery learning, nowhere defined mastery” (Wiggins, 2013). Most educators would agree that however we might define mastery, it *could not* be assessed via a single-sitting multiple-choice quiz or test.

As Grant Wiggins also noted in 2013 about common misuses of the idea of mastery,

> Rather than designing backward by establishing complex, worthy, and valid tasks on which students must demonstrate high-level ability (Wiggins & McTighe, 2005), schools too often reduce mastery to a high grade on a simplistic and nonvalidated assessment.

Unfortunately, even with the advent of new technologies and more tools to collect evidence of student thinking and understanding in mathematics in ways that aren’t possible *without* a digital environment, many education institutions still use antiquated techniques of measurement. In most cases, these techniques only provide educators with summative data a few times per year, and teachers aren’t even able to view the kinds of questions students were asked on the assessment. Yet teachers are expected to use this outdated, limited information for formative purposes in the classroom.

In DreamBox Learning Math, our virtual manipulatives and lessons make student thinking visible, and therefore collect evidence not only about *what* a student answers but also *how* a student answers and persists. Two of the innovative curriculum updates that Tech & Learning reviewed when choosing awardees were DreamBox’s new geometry construction lessons and polynomial array lessons.

*Geometry Construction Lesson*

*Polynomial Array Lesson*

These lessons continually observe each student’s sense-making and understanding with open-ended digital tools and manipulatives instead of multiple-choice, low-level skill questions. This design enables the teachers and curriculum designers at DreamBox to provide each student with real-time formative feedback that takes into account how that unique student is thinking during the lesson. This personalized learning design ensures DreamBox is not simply collecting evidence of understanding—DreamBox is actually cultivating and causing deeper understanding as well. And with our newly enhanced reporting, teachers can access the lessons to see the types of questions and tasks that were presented to each student.
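To make the distinction between *what* and *how* concrete, here is a minimal sketch of the kind of evidence record a digital lesson could keep about a student’s work. The class, field names, and the `shows_persistence` heuristic are all hypothetical illustrations, not DreamBox’s actual data model:

```python
# Hedged sketch: one way a digital lesson might record *how* a student works,
# not just whether the final answer was correct. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LessonEvidence:
    final_answer_correct: bool
    attempts: int                                      # tries before the final answer
    tool_actions: list = field(default_factory=list)   # moves made with a manipulative

    def shows_persistence(self) -> bool:
        """A crude signal: the student kept working after an initial miss."""
        return self.attempts > 1 and self.final_answer_correct

# A student who missed twice, kept manipulating the model, and then succeeded
# looks very different from one who guessed correctly on the first try.
evidence = LessonEvidence(final_answer_correct=True, attempts=3,
                          tool_actions=["split_array", "regroup", "split_array"])
print(evidence.shows_persistence())  # True
```

A multiple-choice item collapses all of this into a single bit of correctness; an event record like the one above is what allows feedback to respond to the student’s actual process.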

### How Do You Meaningfully Report Progress in Mathematics?

Also decades ago, it was decided that certain conventions would be used as the primary means of reporting student proficiency in mathematics: percentiles, age-based bell curves, percentages of correct answers, and distilling a student’s entire mathematical proficiency down to a single percentage, number, or letter. These conventions persist because they are simple to calculate and compare, not because they meaningfully communicate actual understanding and growth. For example, how does a teacher figure out how to help a student with an ACT math score of 28 achieve a score of 31 on her next ACT test? Or what if a student in Grade 7 Math has an 86%; where does this student need to focus her energy to improve? In a more formative scenario, imagine that two students both correctly answer 7 of 10 problems on an assignment. What does it mean if one student answered the first 7 correctly while the other answered the last 7 correctly? A meaningful report should communicate something more significant about each student’s learning beyond simply that they both received a 70% on the assignment.
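The 7-of-10 scenario above can be sketched in a few lines. The `summarize` helper below is purely illustrative (not any real reporting system’s logic): it shows that two students with identical percentage scores can leave very different formative evidence once you look at *which* items they missed:

```python
# Illustrative sketch: two students both answer 7 of 10 problems correctly,
# but the positions of their errors tell different formative stories.
def summarize(responses):
    """Return the percentage score and the positions of incorrect answers."""
    score = 100 * sum(responses) // len(responses)
    missed = [i + 1 for i, correct in enumerate(responses) if not correct]
    return score, missed

# Student A answers the first 7 correctly; Student B answers the last 7 correctly.
student_a = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
student_b = [0, 0, 0, 1, 1, 1, 1, 1, 1, 1]

print(summarize(student_a))  # (70, [8, 9, 10]) – faltered on the later items
print(summarize(student_b))  # (70, [1, 2, 3])  – struggled early, then recovered
```

A single "70%" erases exactly the information (`missed`) that a teacher would act on: Student A may need support with the harder, later content, while Student B may have worked through an early misconception.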

As Fullan and Donnelly noted in 2013 about technological innovations in assessment,

> … the assessment system should be able to identify features of student behaviour and make observations on it, not in terms of binary correctness, but in the form of useful information on the ways in which the learner has engaged with the activity. Best-in-class innovations cover both formative and summative assessments; and the assessment system should show each stakeholder (student, teacher and parent) an optimal level of detail and an analysis of performance in real time. (p. 17)

As with the task of collecting evidence of understanding described earlier, many education institutions still use antiquated methods of reporting despite the advent of new technologies and tools that can capture and report growth and learning, and they have therefore not reached the standard Fullan and Donnelly describe. In some ways, new technologies have exacerbated this reliance on decades-old reporting conventions: a percentage or diagnostic score can be calculated even more quickly using digitized multiple-choice items that, though they may be “technologically enhanced,” remain rooted in designs for a summative test rather than being designed formatively for students as thinkers.

At DreamBox, our Insights Dashboard for teachers and other learning guardians is continually being improved and upgraded in order to distill complex information about learning and achievement into manageable and understandable reports. A major revamp of our dashboard for teachers is another update that Tech & Learning reviewed when choosing awardees. Rather than distill a student’s growth into a single number or percentage, we’ve leveraged our rich data about each student to provide teachers with:

- classroom-level strategy group support,
- a real-time activity feed that shares whether a student has demonstrated understanding in a lesson,
- the opportunity to experience the tasks and questions students were given in each lesson, and
- the ability to easily assign differentiated lessons that take into account each student’s prior knowledge.

*Real-Time Activity Feed with Assessment Task Access*

*Classroom Strategy Groups & Differentiated Assignments*

As we continue engaging in dialogue with our partner schools and educators, DreamBox will always be improving how well we share an “optimal level of detail” that provides key insights to learning guardians.

Our team at DreamBox is inspired not only by this new recognition from Tech & Learning, but also by the recent analysis conducted by the Center for Education Policy Research at Harvard University that suggests DreamBox drives compelling achievement gains in elementary math. As we continue to enhance students’ lives and achievement, we’re excited and motivated to help teachers and other learning guardians deeply understand how their students are growing as young mathematicians.

### Tim Hudson
