Reader's Guide: Understanding EQAO Results

How to Read This Profile

EQAO measures whether students have met provincial curriculum standards in Reading, Writing, and Mathematics at Grade 3, Grade 6, and Grade 9. The results tell you something real — these skills predict long-term outcomes — but they don't tell you everything a school offers: culture, extracurriculars, social-emotional support, or whether a particular child would thrive there.

The harder interpretive problem is that a school's aggregate score reflects who attends it as much as what happens inside it. Schools serving communities with strong home literacy, stable incomes, and professional-background families tend to post higher numbers. That is not a critique of assessment design; it is a statistical fact about how socioeconomic advantage accumulates. It also means you cannot read a school's score as a direct measure of teaching quality without adjusting for the population being served. A school with lower aggregate scores may be doing exceptional work given the challenges its students face. The Demographics page lets you see each school's student population — shares of English Language Learners, students with IEPs, students from low-income families, and students whose home language differs from the language of instruction.

Most charts on this dashboard use three-year averages rather than single-year figures. In a school with 30 assessed students, a handful of students performing differently in any given cohort can shift the reported percentage by 15–20 points without anything having changed at the school. Pooling three years cuts that noise substantially.
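To make the arithmetic concrete, here is a minimal sketch with hypothetical counts: a few students landing differently moves a 30-student school by many points, while pooling three cohorts damps the swing. The function and numbers are illustrative, not taken from the dashboard.

```python
def level34_pct(at_level, assessed):
    """Percent of assessed students at Level 3/4."""
    return 100 * at_level / assessed

# Hypothetical 30-student cohort: 21 of 30 at Level 3/4.
base = level34_pct(21, 30)           # 70.0
# Five students land differently the next year: 16 of 30.
shifted = level34_pct(16, 30)        # ~53.3
print(round(base - shifted, 1))      # 16.7-point single-year swing

# Pooling three years (90 students) with one such off-year:
pooled_typical = level34_pct(21 * 3, 90)         # 70.0
pooled_one_off = level34_pct(21 + 21 + 16, 90)   # ~64.4
print(round(pooled_typical - pooled_one_off, 1)) # 5.6-point swing
```

The same five-student difference that moves a single-year figure by nearly 17 points moves the three-year pooled figure by under 6.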


Interpreting the Numbers

Sample size

With 25 assessed students, the Level 3/4 percentage can move roughly ±20 points from year to year by chance alone. With 50 students, the range narrows to ±14 points; with 100 students, to ±10. District and provincial results, drawn from thousands of students, are far more stable.

None of this is a flaw in the assessment. It is arithmetic. A single cohort of 25 students represents roughly one class, and one class is not a stable sample for estimating a school's sustained performance. Multiple years of data — or higher enrolment — are what make a school-level result reliable enough to act on.
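The ranges quoted above follow from the binomial standard error. A sketch of the approximation, assuming a worst-case true rate of 50 percent (where the variance is largest) and a conventional 95% confidence multiplier:

```python
import math

def approx_95_margin(n, p=0.5):
    """Approximate 95% margin of error, in percentage points,
    for a proportion estimated from n assessed students.
    p=0.5 is the worst case (largest variance)."""
    return 100 * 1.96 * math.sqrt(p * (1 - p) / n)

for n in (25, 50, 100):
    print(n, round(approx_95_margin(n)))  # ~20, ~14, ~10 points
```

Because the margin shrinks with the square root of n, quadrupling the number of students only halves the noise, which is why district and provincial aggregates are so much more stable than school-level figures.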

Participation rates

Not every registered student completes the assessment. Some are absent on test day; others are exempted, typically students with significant special needs; for some, no result is recorded. When a substantial share of students are missing, the published percentages reflect only those who participated — and those students may not be representative of the cohort.

Across Ontario, students who are exempted or absent tend to have lower achievement than those who participate. A school with low participation will therefore tend to report higher Level 3/4 percentages than it would if all students had been assessed. This pattern does not hold at every individual school — local circumstances vary — but it is the direction you would expect.

G3 and G6 as different populations

In most schools, the Grade 6 cohort is essentially the Grade 3 cohort three years later. But the number of students in each grade can differ substantially, and when it does, the grade-to-grade comparison carries less interpretive weight than it appears to.

Some schools grow sharply from G3 to G6 because specialized programs or feeder schools route students into a larger combined school at higher grades. A school with 30 Grade 3 students and 100 Grade 6 students is not tracking the same children over time. Others shrink: a school that serves only K–3 or K–4 loses its cohort entirely to a separate Grade 4–8 school. Population growth, program placements, and shifts in how newcomer families distribute across grades all contribute to the gap.

When the student count between G3 and G6 differs by more than 20–30 percent, treat grade-to-grade achievement differences as suggestive rather than diagnostic.


About EQAO

The Education Quality and Accountability Office (EQAO) is an independent provincial agency that administers standardized assessments at three key transition points: Reading, Writing, and Mathematics at Grades 3 and 6, and Mathematics at Grade 9.

Because every student in a grade writes the same assessment under the same conditions, results are comparable across schools and boards in a way that classroom grades are not. What the assessment does not do is control for the population being served — student demographics, community characteristics, mobility, and resource constraints all shape outcomes, and any fair reading of a school's results has to account for that.


Technical Methodology

Achievement Level %

The achievement metrics (e.g., % at Level 3/4) are calculated as a percentage of fully participating students.

Achievement L3/4 % = students at Level 3 or 4 ÷ fully participating students

Absent and exempt students are excluded from both numerator and denominator. A student who did not write receives no level and is not counted in the achievement percentage.

Participation Rate

The participation rate reflects the percentage of registered students who completed the assessment.

Participation rate = fully participating ÷ registered

Students fall into mutually exclusive outcomes. Exempted students count against the participation rate: they are registered (in the denominator) but contribute zero to the numerator.
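Putting the two definitions together, here is a minimal sketch of both calculations. The function name, parameter names, and counts are illustrative; the outcome categories mirror the mutually exclusive buckets described above:

```python
def eqao_metrics(level_counts, absent, exempt, no_data):
    """level_counts: dict mapping achievement level -> student count
    (fully participating students only).
    Returns (achievement L3/4 %, participation rate %)."""
    fully_participating = sum(level_counts.values())
    registered = fully_participating + absent + exempt + no_data
    # Absent/exempt/no-data students appear in neither side of the
    # achievement ratio, but all registered students sit in the
    # participation denominator.
    l34 = level_counts.get(3, 0) + level_counts.get(4, 0)
    achievement = 100 * l34 / fully_participating
    participation = 100 * fully_participating / registered
    return achievement, participation

# Hypothetical school: 60 writers (8 at L1, 12 at L2, 28 at L3, 12 at L4),
# plus 5 absent, 3 exempt, 2 with no data -> 70 registered.
ach, part = eqao_metrics({1: 8, 2: 12, 3: 28, 4: 12},
                         absent=5, exempt=3, no_data=2)
print(round(ach, 1), round(part, 1))  # 66.7 85.7
```

Note how the two denominators differ: the achievement percentage is out of 60 writers, while the participation rate is out of all 70 registered students.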

Enrolled vs. Registered: The Gap

Registration is based on a snapshot date in late winter. Two gaps follow from this: students who arrive after the snapshot are enrolled at the school but were never registered for the assessment, and students who leave after the snapshot remain registered even though they are no longer enrolled. Enrolment counts and registration counts therefore rarely match exactly.


Data Sources

All data used in this dashboard is publicly available. The primary source is EQAO Open Data, released annually by the Education Quality and Accountability Office. Board boundary data is sourced from the Ontario GeoHub (Ministry of Education).

Primary source: eqao.com/about-eqao/open-data