Evaluation Partnership — Algebra Studio Sales Partners


Help districts design an evaluation that fits their context.

When to use this

Surface the evaluation partnership for curriculum coordinators, principals, and superintendents — anyone who needs data to justify a purchase. It's most powerful on a follow-up visit or evaluation discussion, after you've already shown the product. Don't lead with it on a first visit; lead with the product experience. Bring this when evidence comes up — either as an objection or as a requirement.

Research Design Options

Option A: Pre/Post with District Assessments
Rigor: Moderate | Timeline: One unit (2–4 weeks)

Administer a brief assessment before and after the lab using prior-year state items or district benchmark questions. Algebra Studio provides a standard pre/post, or we help select items aligned to the specific standards the lab covers.

Measures: Assessment gain scores on targeted standards

When to recommend
The easiest entry point. Suggest this for districts that want evidence but don't have the bandwidth or political will for a controlled study. Works well for a single-school pilot. A curriculum coordinator can run this without district-level approval.
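To make the measure concrete: the analysis behind a pre/post design is just a paired gain score per student on the targeted standards. A minimal sketch (all scores hypothetical, Python standard library only):

```python
from statistics import mean

# Hypothetical per-student scores on the targeted standards —
# the same students assessed before and after the lab.
pre = [42, 55, 38, 61, 47, 50, 33, 58]
post = [51, 60, 49, 66, 55, 57, 41, 63]

# Gain score = post minus pre, computed per student.
gains = [after - before for before, after in zip(pre, post)]

print(f"Mean gain: {mean(gains):.1f} points")
print(f"Students who improved: {sum(g > 0 for g in gains)} of {len(gains)}")
```

The same arithmetic works whether the items come from Algebra Studio's standard pre/post or from the district's own assessment bank.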
Option B: Delayed-Start RCT
Rigor: High | Timeline: One lab cycle + delay

Assess all participating classrooms at baseline. Group A begins the lab; Group B continues as usual. Assess everyone again after Group A completes the project. Then Group B receives the program. Both groups eventually participate — no classroom misses out.

Measures: Between-group comparison on math assessment performance at midpoint

When to recommend
The strongest design. Recommend for superintendents or curriculum directors who need board-quality evidence — a superintendent who can say "we're running a delayed-start RCT on supplemental math enrichment" has a story no other supplement provides. Requires enough classrooms to split into two groups, so this works at the district level, not single-school.
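For illustration, the mechanics of a delayed-start design reduce to a random split of classrooms and a between-group comparison at the midpoint. A toy sketch (classroom names and scores are invented; stdlib only):

```python
import random

random.seed(7)  # reproducible random assignment

# Hypothetical list of participating classrooms.
classrooms = [f"Room-{i}" for i in range(1, 13)]

# Random assignment: half start the lab now (Group A),
# half continue as usual and start after the midpoint (Group B).
random.shuffle(classrooms)
group_a = classrooms[: len(classrooms) // 2]
group_b = classrooms[len(classrooms) // 2:]

# Hypothetical classroom-level midpoint means, with a simulated
# +4-point effect for classrooms that completed the lab first.
midpoint = {room: 60 + random.gauss(0, 5) + (4 if room in group_a else 0)
            for room in classrooms}

mean_a = sum(midpoint[r] for r in group_a) / len(group_a)
mean_b = sum(midpoint[r] for r in group_b) / len(group_b)
print(f"Group A (lab first): {mean_a:.1f}")
print(f"Group B (delayed start): {mean_b:.1f}")
print(f"Estimated effect: {mean_a - mean_b:.1f} points")
```

Because assignment is random, the midpoint difference estimates the program's effect; Group B then starts the lab, so every classroom participates.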
Option C: Matched Comparison
Rigor: Moderate–High | Timeline: One semester or year

Compare participating classrooms to non-participating classrooms with similar demographics and prior achievement. We help identify appropriate comparison groups and metrics using the district's existing data.

Measures: Achievement comparison controlling for prior performance and demographics

When to recommend
Good middle ground when an RCT isn't practical but the district wants more than pre/post. Works well when some schools are adopting and others aren't — the comparison happens naturally. Also useful when a district is already running Math Labs in some buildings and wants to compare outcomes to buildings that haven't started.
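The matching step itself can be as simple as pairing each participating classroom with the closest non-participating classroom on prior achievement. A toy sketch (classroom names and prior-year scores are hypothetical; a real match would also weigh demographics):

```python
# Hypothetical records: (classroom, prior-year mean score).
participating = [("P1", 58.0), ("P2", 71.5), ("P3", 64.0)]
candidates = [("N1", 70.8), ("N2", 59.1), ("N3", 65.2), ("N4", 80.0)]

# Greedy nearest-neighbor matching on prior achievement:
# each participating classroom takes the closest unused comparison classroom.
matches = {}
unused = dict(candidates)
for name, prior in participating:
    best = min(unused, key=lambda c: abs(unused[c] - prior))
    matches[name] = best
    del unused[best]

print(matches)  # → {'P1': 'N2', 'P2': 'N1', 'P3': 'N3'}
```

Outcomes are then compared within matched pairs, which controls for prior performance without requiring random assignment.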
Option D: Implementation + Perception Study
Rigor: Descriptive | Timeline: Ongoing

Document implementation fidelity, student engagement, and teacher perception alongside quantitative measures. Useful for understanding how and why the program works in context, not just whether it works.

Measures: Observation logs, teacher surveys, student engagement data, qualitative themes

When to recommend
Pair this with any of the other three options — it adds the "why" to the "whether." Also stands alone for districts more interested in teacher experience and implementation quality than test score comparisons. Principals often prefer this because it produces the kind of evidence they actually use: teacher quotes, engagement data, walkthrough observations.

The Evidence Talk Track

When they ask: "What's the evidence?"

This is the most common objection from curriculum coordinators and administrators. It's also the biggest opportunity to differentiate. Most reps deflect or oversell pilot data. The evaluation partnership reframes the evidence gap from a weakness into an invitation.

Say this

"We don't have a large-scale RCT yet — we're a young program. But here's what we'd like to do with you: help you design an evaluation that answers the question for your district. We can set up a delayed-start RCT within your schools — half your classrooms start the lab first, you assess everyone, then the other half starts. Or a pre/post with your own assessment items, or a matched comparison using classrooms that aren't participating. We'll help you structure it, and we'll be transparent about the results. We want to know if this works in your context as much as you do."

Not that

"Our pilot data shows significant gains." (Vague, unverifiable, sounds like every other vendor.)
"Research supports hands-on learning." (True but generic — doesn't distinguish your product from any manipulative kit.)

When they push harder: "We need proven programs."

Some buyers have mandates around evidence-based curriculum, particularly in Title I contexts. Acknowledge the standard, then pivot to what you can offer.

Say this

"Understood. Math Labs are supplemental — they're not replacing your core curriculum, and they don't need to meet the same evidence threshold as a core adoption. What they can do is give you structured enrichment time with a built-in evaluation framework, so you're generating evidence specific to your students from day one. Most supplements can't offer that."

The research lineage (for the technical buyer)

If you're talking to someone with a learning sciences or research background — a curriculum coordinator who actually reads studies — you can name the tradition. Algebra Studio's curriculum design draws on research in embodied cognition (Nathan) and productive disciplinary engagement (Engle). Students manipulate physical materials to construct mathematical relationships, then formalize those relationships symbolically. The progression from concrete to abstract is the instructional theory, not just a feature.

You don't need to explain this to every buyer. But for the right audience, naming the research lineage signals that the product was designed by people who read the literature, not just marketed at people who value it.

What Algebra Studio Provides

Assessment instruments

Standard pre/post aligned to lab standards, or guidance selecting items from the district's existing assessment bank.

Study design support

Help structuring the evaluation, identifying comparison groups, and defining outcome measures.

Data collection templates

Observation protocols, teacher surveys, and student engagement rubrics ready to use.

Transparent reporting

Full results shared with the district regardless of outcome. No selective reporting.

Algebra Studio's curriculum design draws on research in embodied cognition (Nathan) and productive disciplinary engagement (Engle). We are a young program building our evidence base, and we welcome rigorous evaluation in partnership with districts.

Evaluation Partnership one-pager — leave-behind for curriculum coordinators, principals, and superintendents. Outlines the four research design options.
