Scenarios — Algebra Studio Sales Partners

Rep Only

What to say when the conversation turns.

These are the objections and questions that come up across products. Each scenario has a reframe (how to think about it), a response (what to say), and guidance on what to show. Product-specific objections are on each product page. See the Buyer Playbook for persona-specific guidance.

“We don't have the budget.”

The reframe

The budget objection usually means one of three things: (1) they genuinely have no discretionary funds, (2) they have funds but don't know which line item covers this, or (3) they're not convinced enough to find the money. For #1, the free games keep the door open. For #2, the funding guide is the answer. For #3, you haven't built enough value yet — back up and address the underlying skepticism before talking money.

The response

Math Labs qualify for multiple federal funding categories. Title I covers supplemental instruction for qualifying schools. Title IV-A covers well-rounded education and STEM enrichment. The PD workshop is a separate line item under Title II-A — professional development delivered by expert facilitators. For middle school labs with engineering contexts, CTE Perkins funds apply. The per-student cost runs $35–$40 for 15–20 hours of instruction — comparable to a consumable workbook. And because this is supplemental, it doesn't require a curriculum adoption budget line. It can come from enrichment, STEM, or afterschool funds.

I can send a budget justification letter your coordinator can use in a purchase request, and the funding guide with specific program details for each source.

Say this

“This qualifies under Title I, IV-A, and CTE Perkins depending on your context. I can send the funding guide with specifics. The PD is a separate Title II-A line item.”

Not that

Don't dismiss the concern. Don't say “it's only $995” — the same number lands very differently depending on who is funding it. Lead with funding pathways, not price justification.

Show them

The Funding Guide page. If meeting with a principal or coordinator, also offer the budget justification letter template — a pre-written paragraph they can paste into a purchase request.

“We don't have time in the schedule.”

The reframe

This objection assumes Math Labs add time to the schedule. They don't — they fill time that already exists. Most schools have enrichment blocks, STEM periods, flex days, or intervention/extension time where structured content is needed. The question isn't whether they have time for Math Labs. The question is what's currently happening in that time and whether it's as structured and rigorous as what Math Labs provide.

The response

Math Labs go in the time that already exists on the schedule — enrichment blocks, STEM periods, flex days, or the project-based time that many schools designate but struggle to fill with structured content. The teaching portal handles the session flow: slides advance the activity, a built-in timer keeps pacing, and Howie Templer's video walkthroughs show exactly what to do at each step. Lesson planning time is zero. Setup is pulling out the pre-sorted bag and distributing materials. The teacher facilitates; the portal runs the lesson.

Say this

“This fills your enrichment block. The teaching portal runs the session — slides, timer, video walkthroughs. Lesson planning time is zero.”

Not that

Don't minimize the concern. Teachers are overwhelmed and they know it. Acknowledge it, then redirect to the specific time slot this fills. Don't say “it's easy.”

Show them

The teaching portal. Click through 3–4 slides. Play one of Howie's walkthroughs. The moment they see the structure — slides, timer, video — the “how much prep?” question answers itself.

“We already do hands-on math.”

The reframe

When someone says they already do hands-on math, they almost always mean single-session manipulative activities — base-ten blocks for place value, fraction strips for equivalence, pattern blocks for geometry. Those are useful tools. Math Labs are a different category: multi-session projects where the hands-on work sustains across weeks and the mathematical thinking builds cumulatively. The distinction is between a manipulative activity and a lab course. Affirm what they're already doing, then draw the distinction.

The response

That's a strong foundation — manipulative work builds concrete understanding of individual concepts. Math Labs build on that by giving students a sustained context where they apply multiple concepts together over 10 or more sessions. In PRISM, students aren't just tiling to understand area — they're tiling areas, calculating material costs with multiplication, comparing perimeters, managing a budget, and presenting their department's design to the class. In Balance Lab, students aren't just using a balance to understand equality — they're solving increasingly complex equations across 14 lessons, progressing from physical balance to equation mat to paper and pencil. The extended timeline and project structure are what develop persistence, mathematical argumentation, and the ability to coordinate multiple skills in a complex context.

Say this

“Manipulatives build concept understanding. Math Labs are where students apply those concepts in extended projects — 10 sessions, multiple skills, one coherent context. It's the difference between a lab activity and a lab course.”

Not that

Don't disparage manipulatives — they have the same ones in the room and they chose them for a reason. Don't say Math Labs are “better.” Describe what's structurally different.

Show them

The session arc on the Explore page. Walk through sessions 1, 5, and 10 to show how the math deepens across the project. The 10-session progression is the clearest demonstration of the difference.

“We already use a digital math platform.”

The reframe

This is a complement, not a competition. Adaptive digital platforms (DreamBox, IXL, Zearn) are getting increasingly effective at individualized skill practice. As those platforms compress the time students need for core concept acquisition, the question becomes: what do students do with the time freed up? Math Labs fill that space with structured, collaborative, hands-on application. Don't argue against digital. Affirm it and position Math Labs as what makes the digital investment more productive.

The response

Digital platforms are strong at adaptive, individualized skill practice — and they're getting better. As they handle more of that core skill work, the time freed up needs structured mathematical content. Math Labs are built for that: collaborative, project-based application of the same standards students are practicing on screen. A student who practices area calculation on IXL and then spends two weeks designing a pet supply store using area measurement is working with the concept at two fundamentally different levels of understanding. The digital handles the practice; the lab handles the application. They're designed to work together.

Say this

“Digital handles the practice. Math Labs handle the application. As your platform compresses core skill time, this fills the time freed up with structured, hands-on projects.”

Not that

Don't argue against screens or imply digital math is shallow. Teachers and administrators chose those tools. Position alongside, not against.

Show them

Classroom photos showing physical materials — wood blocks, rulers, balance beams, graphing boards. The visual contrast with screen-based work makes the point without requiring an argument.

“Where's the evidence?”

The reframe

Be honest. Algebra Studio is a young program. There is no large-scale randomized controlled trial. The program is designed by a PhD in Learning Sciences from Northwestern and grounded in well-established research on embodied cognition (Roth, 2001; Nathan, 2021), collaborative knowledge construction (Engle & Conant, 2002), and the role of physical manipulation in mathematical understanding (Martin & Schwartz, 2005). But the honest answer to “where's the evidence?” is to offer to help generate it — in their district, with their students, using their assessments. That's the evaluation partnership.

The response

The program is grounded in established research — embodied cognition work by Nathan, Roth, and others; productive disciplinary engagement frameworks from Engle and Conant; and research on physical manipulation and mathematical transfer from Martin and Schwartz. But we're a young program and we don't have a large-scale RCT. Rather than hand you a cherry-picked white paper, we'd rather help you evaluate this in your own context. We offer districts structured research design options: a delayed-start randomized trial within your schools, pre/post using your existing benchmark assessments, matched comparison with similar non-participating classrooms, or an implementation and perception study. We'll help structure it and we'll be transparent about results. The Evaluation page has the four designs with details.

Say this

“We're built on established research in learning sciences, and we want to help you measure this in your context. We offer four research design options — I can send the evaluation partnership one-pager.”

Not that

Don't overstate the evidence. Don't claim “research-proven” or “evidence-based” without qualification. Don't cite “research shows” without naming the researchers. Don't get defensive — the honest answer is the strongest answer.

Show them

The Evaluation Partnership page. For coordinators and superintendents, walk through the four research design options. For principals, focus on the pre/post design using their existing benchmark — it's the fastest path to local data.

“Will this improve our test scores?”

The reframe

This is closely related to the evidence objection but specifically asks about assessment impact. Don't promise score gains — that's a claim you can't substantiate yet. Instead, make two points: (1) Math Labs cover the same standards the state assessment covers, so students are spending additional time applying those standards; and (2) the evaluation partnership can measure the impact in their specific context.

The response

Math Labs are aligned to the same standards the state assessment covers — we have session-by-session alignment documentation for CCSS, TEKS, Florida B.E.S.T., and seven other frameworks. Students are spending 15–20 additional hours applying those standards in a context that requires them to coordinate multiple skills, work collaboratively, and explain their reasoning — the kinds of tasks that appear on state assessments as problem-solving and application items.

Rather than make claims about score impact, we offer evaluation partnerships. If you pilot Math Labs, we'll help you set up a real study: pre/post using prior-year state assessment items or your existing benchmark, comparing participating and non-participating classrooms. We'll help structure the design and we'll be transparent about results.

Say this

“We cover the same standards your state assessment measures. Rather than promise score gains, we'll help you set up a study to measure impact in your district.”

Not that

Don't promise test score improvement. Don't say “our students show gains” without citing a specific study. Don't claim “research-proven” results.

Show them

The standards-by-session table on the product page — switch to their state framework. Then the Evaluation Partnership page. The combination of “we cover these standards” + “we'll help you measure the impact” is stronger than any claim about results.

“What about teacher training?”

The reframe

PD comes up in two contexts: (1) they want it — they're already interested in Math Labs and want to know how teachers get trained; or (2) they're worried about it — they're concerned their teachers can't run this without extensive support. The answer is different depending on which it is. If they want it, describe the workshop. If they're worried about it, lead with the teaching portal — show them that the product is designed to be run without PD, and position the workshop as an accelerant, not a requirement.

The response

The teaching portal is designed so any teacher can run Math Labs without prior training — slides advance the session, the timer keeps pacing, and Howie Templer's video walkthroughs demonstrate every activity. But we also offer a half-day PD workshop for schools that want deeper teacher investment. A nationally recognized math educator — a PAEMST-level practitioner with genuine field credibility — leads teachers through a lab session themselves. They do the activity as learners, then unpack the teaching moves with the facilitator: how to launch a session, when to let teams struggle, when to intervene, how to facilitate the debrief. It's professional development on structuring hands-on, collaborative learning — useful regardless of what curriculum the school uses. $3,995, up to 30 teachers, fundable through Title II-A as a separate line item from the kits.

Say this

“The portal is designed so teachers can run it without training. The PD workshop is an option — a half-day with a nationally recognized educator. $3,995 for up to 30 teachers, fundable through Title II-A.”

Not that

Don't say “no training required” — that implies it's trivially simple, which disrespects the complexity of teaching. Say the portal handles the session flow so teachers can focus on facilitation.

Show them

If they're worried: show the teaching portal. Click through slides, play a Howie walkthrough. If they want PD: describe the workshop format and mention it's a separate Title II-A line item — that often matters because it means PD doesn't compete with the kit budget.

Competitive positioning

These are not attack briefs. When a buyer mentions another product or approach, affirm what's working for them and draw the distinction clearly. The framing for all three: Math Labs are not replacing digital platforms or basic manipulatives. They're the structured, extended, collaborative application experience that sits between them.

vs. Digital Math Platforms (DreamBox, IXL, Zearn, etc.)

Adaptive digital platforms individualize skill practice. They track mastery, adjust difficulty, and give teachers real-time data on student progress. That's valuable and it's getting more effective. Math Labs do something structurally different: students work collaboratively on physical projects that require applying multiple skills together in sustained contexts. A student who completes an IXL module on area and a student who designs a pet supply store department using area measurement have practiced the same standard at two different levels of cognitive demand. The digital platform handles individual skill practice. The lab handles collaborative application. They serve different instructional purposes.

The key sentence

“Digital handles the practice. Labs handle the application. As your platform compresses core skill time, Math Labs fill the time freed up.”

vs. Traditional Manipulatives (base-ten blocks, fraction strips, pattern blocks, etc.)

Manipulatives are concrete tools for building understanding of individual concepts — base-ten blocks for place value, fraction strips for equivalence, pattern blocks for geometric relationships. A teacher uses them within a single lesson to make an abstract concept visible and tangible. Math Labs use physical materials differently: the materials serve a sustained project across 10 or more sessions. In PRISM, the wood blocks and rulers aren't illustrating a concept — students use them to design and build a physical structure that requires area, perimeter, and multiplication working together. The manipulative builds understanding of one concept. The lab builds the ability to coordinate multiple concepts in a complex context over time.

The key sentence

“Manipulatives build concept understanding. Math Labs are where students apply those concepts in extended projects. The lab doesn't replace manipulatives — it gives students a sustained reason to use them.”

vs. Other PBL / Hands-On Programs

Most project-based learning frameworks ask the teacher to design the project, structure the mathematical progression, and manage materials acquisition. The quality of the experience depends on that teacher's skill, time, and creativity. Math Labs provide the project design, the mathematical progression, the physical materials, and a session-by-session teaching portal. Same learning principles — collaborative, project-based, hands-on. The difference is that the design work has already been done. A teacher with 15 years of PBL experience and a first-year teacher run the same structured lab. The portal and materials standardize the experience; the teacher's expertise improves it.

The key sentence

“Most PBL asks the teacher to design the project and the math. Math Labs provide both — same collaborative principles, structured implementation, consistent mathematical rigor across classrooms.”