Ep 29. Math fact crisis: strategies for improving numeracy with Brian Poncy
This transcript was created with speech-to-text software. It was reviewed before posting but may contain errors. Credit to Jazmin Boisclair.
You can listen to the episode here: Chalk & Talk Podcast.
Ep 29. Math fact crisis: strategies for improving numeracy with Brian Poncy
[00:00:00] Anna Stokke: Welcome to Chalk and Talk, a podcast about education and math. I'm Anna Stokke, a math professor, and your host. You are listening to episode 29 of Chalk and Talk. This is the first of two episodes featuring Dr. Brian Poncy. I'll be releasing the second episode next week. Brian is a professor of school psychology at Oklahoma State University, specializing in math interventions. His extensive research on basic fact fluency led to the development of a free math program aimed at improving numeracy and computational skills, which we'll discuss in the episode.
Brian stresses that we have a basic fact crisis where many students struggle with basic fact fluency, which hampers overall math proficiency. Throughout these two episodes, we'll explore his research and discuss effective strategies for teaching basic facts and computational skills. In this episode, we begin by discussing the instructional hierarchy.
Please refer to the resource page for an infographic. The instructional hierarchy is helpful for identifying a student's learning stage and selecting appropriate tasks. I asked Brian to define some key terms such as fluency, automaticity and mastery. We talk about his free program, the MIND, and what happened when it was implemented in a low-performing school. We discuss research on decomposition strategies, and he stresses that we need to collect data and use it to inform instruction.
Brian shares some strategies used in the MIND. I think you'll find as I did that Brian is extremely passionate about his research and helping kids learn math. I really love this conversation, and I hope you find it as valuable as I did.
Now, without further ado, let's get started.
I am thrilled to be joined by Dr. Brian Poncy today, and he is joining me from Oklahoma. He is a professor of school psychology in the College of Education at Oklahoma State University. He has a Ph.D. in school psychology. He was the recipient of the 2005 Outstanding Dissertation Award from Division 16, that's school psychology, of the American Psychological Association.
His research focuses on academic interventions and behavioural principles of learning, specifically for mathematics. He is a co-author of the book Effective Math Interventions: A Guide to Improving Whole-Number Knowledge, and he co-developed with Dr. Gary Duhon Measures and Interventions for Numeracy Development, or MIND, which is a set of free, empirically validated resources designed to supplement core math instruction and provide intensive remediation targeting early numeracy and computation skills.
Welcome Brian, welcome to my podcast.
[00:03:24] Brian Poncy: Well, it's great to be here.
[00:03:26] Anna Stokke: So, we've got a lot to talk about today, and we'll talk about your program, the MIND, in detail a little later, but first I thought we'd lay the foundation for the conversation and discuss the instructional hierarchy and how it can inform instruction in general. But before we do that, because this word shows up a lot, can you define the word fluency?
I ask this because I think this word means different things to different people. So, I think it's really important to have a clear definition.
[00:03:57] Brian Poncy: Yeah. So, fluency, you know, is a term probably coined, or used the most, by precision teaching back in the 70s and 80s. And so there's a guy named Carl Binder, and he did some fantastic work back in the 90s. And he literally wrote about how we've got a basic skills crisis in America.
The precision teaching people are all about frequency as a measure, right? And so they're all about a rate of frequency, which would be you know, problems per minute or correct digits per minute. And so fluency is just defined as that. And so it is speed over time. It is a rate measure.
[00:04:40] Anna Stokke: Okay, so fluency is a rate. It's a combination of both being accurate and being able to do things quickly. So, it's beyond percentage correct, say; it's more like the number of basic facts correct within a minute. So, it's measuring accuracy and speed. And I heard you mention digits correct per minute as a possible measure.
So for example, if one of the questions were six times seven, and the student answers 41, that would count as one digit correct. Whereas if they answered 42, that would count as two digits correct. And a little later, we'll talk about five reasons why fluency is really important, but let's talk about the instructional hierarchy.
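To make that digits-correct scoring concrete, here is a minimal sketch in Python (illustrative only; the convention of matching digits by place value from the right is an assumption, not a rule taken from the MIND materials):

```python
def digits_correct(expected: str, given: str) -> int:
    """Count the digits in the student's answer that match the expected
    answer, comparing place by place from the rightmost digit."""
    return sum(1 for e, g in zip(reversed(expected), reversed(given)) if e == g)

def digits_correct_per_minute(total_digits_correct, minutes):
    """Fluency rate: digits correct divided by the length of the timing."""
    return total_digits_correct / minutes

# Six times seven: answering 41 earns one digit correct, 42 earns two.
print(digits_correct("42", "41"))          # 1
print(digits_correct("42", "42"))          # 2
print(digits_correct_per_minute(23, 1.0))  # 23.0 digits correct per minute
```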
And I'm going to post the visual on the resource page. So listeners may want to check that out and follow along. Can you explain the instructional hierarchy and how it can inform teaching?
[00:05:39] Brian Poncy: Yes, the instructional hierarchy is a heuristic that basically matches patterns in student responding with instructional components. It's a stage model with four stages. The first stage is you have to build accuracy. The second stage is you have to build fluency, or speed of responding. The third stage is we promote generalization, and then the fourth and final stage is adaptation.
So I'm going to go through each of those in detail. So, accuracy is the one most focused on in schools, right? We want to make sure that when children are presented with academic stimuli, they can respond. Now, I have taken some liberties; it's in the Codding Effective Math Interventions book. And I talk about declarative and procedural knowledge. And declarative knowledge is a fact. So if I see a two, I just say it's two; I'm not using any strategies or anything like that.
With facts, a kid is using a retrieval strategy. It's a declarative fact. If a child sees two plus eight and they need to use touch points, or they need to use a count-on or a count-up or whatever they may use, then the child still can accurately respond, but they're procedurally dependent.
I'll talk throughout this day about using procedures to build the declarative or using the declarative to build the procedure. And again, that's kind of a cycle, you know, that we think about.
Basically, the folks behind the instructional hierarchy, what they recommended, and this is old, right? It was back in 1978, Haring and Eaton, and they talked about how we need to task analyze, right, what we want kids to do when we assess them. And if they can't accurately respond, then here are the instructional components. And so to build accuracy there were five instructional components: demonstration, model, cue, immediate feedback or performance feedback.
So if they get it correct: yes, good job. If they get it incorrect, you provide corrective feedback: this is the right answer. And so, behaviourists would refer to this as simply a learning trial.
Stimulus, response, feedback; antecedent, behaviour, consequence. People talk about how instruction is so complicated. Instruction really isn't. Curriculum is. But as far as that goes, I don't care if you're a genius or you're intellectually disabled. If you don't know something, this is how it's taught.
Model, demonstration, cue; the child responds; you provide feedback. Now, the gifted child will need fewer repetitions, but they learn the same exact way. And so if we assess for prior knowledge, you know, and we know where to start, you teach in the same way. It's just what you teach and probably the rate, you know, the amount of repetitions.
So once kids can respond accurately, now we move to fluency. And so if a child can go ahead and respond accurately, we'll want to increase the speed with which they respond to the stimulus. And that requires repeated practice and reinforcement because we know this can get somewhat boring and so somewhat dreary.
When I run the studies, I'm walking around, I'm high-fiving them, I'm telling them “Good job.” That’s an important piece. Repeated practice in the absence of reinforcement isn't going to be nearly as effective as that reinforcement component. And so that's important. And so when people are like, “Mindless drills,” well, you're the one as a teacher that controls that. You can make it fun.
Because if a child thinks you care, they're going to care. And so if I go into a classroom and they're just like, “No, it's all about strategies and procedures and this mindless drill,” the kids won't learn because they care what their teacher thinks. And so, but if you sit there and you encourage, you know, it generally works pretty well.
[00:09:38] Anna Stokke: So, we've got declarative skills and procedural skills at the bottom of the instructional hierarchy. So, this is like the foundation. Declarative skills, meaning things that you recall automatically from memory without using a procedure. And procedural skills, those skills that you do use a procedure for.
We have accuracy, so being able to do things accurately, and once the child is accurate, we move on to fluency. So that's increasing the speed or the rate at which you can do something. And of course, we build fluency through a lot of practice. Now, above that on the hierarchy, you have generalization and at the very top of the hierarchy, you've got adaptation.
So, can you talk about those two things?
[00:10:28] Brian Poncy: Generalization is a little more difficult. There are fewer research studies out there. But in generalization, now that we've learned a behaviour and we can perform it fluently, now we want to be able to perform it across time, right? So it maintains. But then, for example, like with kindergarten kids, if you teach them to answer things horizontally, and then you show them a vertical problem, they're going to be like, "What?"
You know, because they hadn't seen that yet. And so you have to give them practice. The same thing is if I use a cloze, right? And even though they know two plus three is five, if it's two plus blank equals five, they're going to go, "What?" Even though they have the response, they've acquired it and can do it, but now they have to generalize it to new kinds of situations.
And then adaptation is the final one, and adaptation is very congruent in my mind with inquiry-based instruction. And this is where we want students that now have, you know, done kind of low-level generalization, right? Like vertical versus horizontal is not that hard, you know, cloze isn't that hard.
But if I want to take, you know, a fact, and now I want you to do it in a word problem, or I want you to do it in a problem-based piece, that would take the adaptation of the skill. So you're using the skill in a brand-new context that you've never seen.
And the instructional components for that are simulations or problem-based learning. Okay, so again, we have the same goals as the NCTMers, the reformists. It's just that we shouldn't introduce it until a child has the skill, and they have the skill accurately, fluently, can kind of generalize it, and then you're ready for prime time for the real, you know, higher-order thinking.
And so that's the instructional hierarchy in a nutshell.
[00:12:21] Anna Stokke: Your instructional hierarchy diagram also includes set size and saliency. So can you say a bit about that?
[00:12:30] Brian Poncy: Set size is how much we teach. And so when you're building accuracy, you're going to have small sets; when you're generalizing or when you're doing adaptation, it's going to be with large sets.
And same with saliency: saliency, or explicit instruction, will be very high at the bottom of the instructional hierarchy. But you fade that as you go. Small sets, highly explicit, repeated practice, high levels of feedback. You begin to fade that and release that systematically to where we're at with the adaptation piece. And when I talk about that, you know, that cutoff of 40, what that means is when we get declarative skills to 40 digits correct per minute, now let's move to the next thing.
You know, it's imperative that we have the science to inform what our set size should be, how many repetitions. Matt Burns has done some of that in the acquisition phase, right? I'm working on stuff along with Codding and VanDerHeyden and them about how best to build fluency.
The cognitive people are doing a lot of cool stuff with generalization. And so I think the instructional hierarchy is really a perfect organizer for this, that we can begin to organize our research findings and hopefully provide some prescriptive recommendations to teachers. For example, even if a child is accurate, but they're slow and they're below that 10-digit cutoff line, you're not going to want to use explicit timing, because of the research that we did.
You know, I think it was 2022 that showed it, and Robin did a study way back in 2007 that showed the exact same thing. We need more research to nail down these parameters. But when Burns talks about skill-by-treatment interactions, what we're trying to do is use the literature to basically lay over the instructional hierarchy and say, "Okay, if a kid's at this performance level, you're going to do these types of things."
[00:14:27] Anna Stokke: Now for building accuracy, you've got flashcard drill and cover, copy, compare. Can you explain how you use those interventions?
[00:14:39] Brian Poncy: And so if you think about building accuracy and demonstration, modelling, cueing, and feedback, that is flashcard drill and cover, copy, compare. So flashcard drill, as we all know, is, you know, you would say three plus six is nine, and then the child would say three plus six is nine, and then the next time through, you don't give it to them, right?
You want them to try it independently, but again, that stimulus-response feedback, then you begin to fade the cue and have the child become more independent, but if you do that without taking into account that set size, right, that small set size, even though it's an empirically validated intervention, you're going to screw it up.
And that's why Burns, again, you know, really enjoys incremental rehearsal, because it's a controlled flow list. Now, when a kid begins to have some things, we can use cover, copy, compare. And cover, copy, compare again is a flashcard, it's just on a piece of paper. And this is really important because flashcard drill takes one-to-one.
I mean, you can do it in small groups, but it's pretty difficult. You've got to pace, you've got to signal, you've got to get choral responding. Cover, copy, compare you can give to everybody in the classroom and they self-manage it. And what's even better is that some kids need sums to five, some sums to 10, some sums to 18. You could fully differentiate that, right?
Which is cool, you know. If they can't do that, or if they can't respond accurately, you could still do timed practice in the classroom: some kids could be doing, you know, just explicit timing, so just answering the problems, and other kids could be doing cover, copy, compare. They're all doing things that are appropriate for their skill level, things that they'll understand how to do.
It's consistent. They know how to do it. When I consult on the MIND with school districts, teachers are always blown away: "Oh, my God. The kids liked it." You're like, yeah, because it's predictable.
[00:16:32] Anna Stokke: One intervention you use for building fluency is taped problems. Can you explain how that works?
[00:16:40] Brian Poncy: Taped problems, I think, is the best intervention pound-for-pound on the market. And taped problems is where the child gets a worksheet in front of them with the math problems, and you can use it when you're building accuracy because it has a feedback component.
You would just want to do it with small sets. But in essence, the tape reads the problems, like "two plus three is…," and then it provides a delay, and then it says the answer, five. And what you want the child to do is to beat the tape, right? And if they can't beat the tape, then it gives them the answer. So it's two plus five is seven, three plus eight is, you know. But think about it: if you do five-minute taped problems, they're going to complete 60 problems in five minutes.
And so when we begin to think about rates of practice, and you know why these interventions are so robust, this is why. So this you could use with a kid that was accurate, you could use it with a kid that was inaccurate, but if a kid knew nothing, I wouldn't use it, right?
But if they're in that 60 to 100 percent accuracy range, it works really well. So does cover, copy, compare. But even to build fluency, it's a pretty good intervention because your pacing is high. So that's your third intervention. And the fourth intervention is simply explicit timing, where you just let kids go and practice.
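As a rough illustration of the taped-problems pacing described above (roughly 60 problems in five minutes, so about five seconds per item), here is a minimal sketch that plays the "tape" as printed prompts; the fact list and the delay length are assumptions for the example, not values taken from the MIND recordings:

```python
import time

# Hypothetical fact set; a real taped-problems list would come from the
# student's instructional range (e.g., sums to 10).
facts = [("2 + 3", 5), ("2 + 5", 7), ("3 + 8", 11)]

def run_taped_problems(facts, delay_seconds=4.0):
    """Read each problem, pause so the student can try to beat the tape,
    then give the answer. At roughly five seconds per item, a five-minute
    session covers about 60 problems."""
    for problem, answer in facts:
        print(f"{problem} is ...")
        time.sleep(delay_seconds)  # window for the student to answer first
        print(answer)

run_taped_problems(facts)
```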
And then also to build procedural fluency, I have something, I call it procedural cover, copy, compare, but it's just worked examples. And so if you go to the MIND materials, there's multi-digit things and you'll see a fully formed problem. Basically, I will model it for the student. And then there's one for the student, they look at it, they cover it, they basically do a think-aloud and tell all the steps because I wouldn't put a kid in this until they were at 40 digits correct per minute.
So they're not messing around counting on fingers doing things. I'm going to focus on the regrouping in the standard algorithm, and I also have cues for, like, here's where you put it. But then by the time you get to the second page, all cues are gone. And also, sometimes you'll regroup, sometimes you won't. And so at the beginning, the kid shows me and rehearses. I do the worked example, they show me they can do it, they have highly structured activities to guide the algorithm, then I fade all of the cues, and then they have to discriminate when or when not to use the correct cue.
And trust me when a kid can do that, if it's three digit by three digit, I can easily be like, “Hey, you know what you can do, you can do this left or right. And you could work on place value.” So again, getting back to that, you task analyze because a lot of people think memorization of facts precludes place value for whatever reason, but if we think about what I just said, now a child can do the standard algorithm, they can do all their basic math facts. Hopefully, you've taught them to read numbers and you've had them identify places.
And so now, if it's 333 plus 222, you could be like, “Okay, go all the way to the left. What problem is that?” And they're like, “Three plus two.” And you'd be like, “Well actually it's not, it's 300 plus 200.” Right? And then they could write that down.
And then I could say, well, if this is three plus two, what is this? And if they say "Three plus two," you'd be like, "Ah, now, was this three plus two over here? No, that's 300 plus 200. Now, this is 30 plus 20," and guess what? And that was really fast because they didn't have to fall back on any counting strategies.
Now you can bring in the next problem. “What's this?” Now they got it. You get my point? And how many times do I need to sit there and do that before the child gets the freaking concept? If they already have the facts. Boom, they got it. If they have the prerequisite skills and they can focus solely on the concept because they have all the other declarative and procedural skills, you know, they're able to focus on how to connect those.
The faster you can do that mentally, that's how you make the connections, and that's how you do adaptation, and that's how the declarative and procedural pieces line up with how, you know, the instructional hierarchy would structure all that.
[00:20:52] Anna Stokke: Yeah. So the instructional hierarchy, by the way, reminds me of what the cognitive people call the expertise reversal effect. So you gradually fade instruction, and it's based on prerequisite knowledge. So the idea is, and we're not against inquiry, it's just that you need to introduce it at the appropriate time. A lot of times what we're seeing is people advocating for using inquiry-based instruction at a time when children aren't really ready for it, or when students in general aren't really ready for it, because you do have to have the prerequisite knowledge to be able to actually work on the problem.
[00:21:35] Brian Poncy: That's exactly right. And the devil's in the details. If we don't have measurable, monitorable ways to task analyze and assess observable behaviours, it means nothing, right?
Because we keep saying things like, "They have to have prerequisite knowledge." Well, what does that mean, right? As far as, well, can they do it accurately? Can they do it fluently? Why I like the instructional hierarchy is because you can only begin to make decisions in the instructional hierarchy after you assess the child on the target behaviour to see where they're at.
And what I'm afraid of is with a lot of the back and forth I see on Twitter with the cognitive stuff is it's, you know, kind of really broad theoretical arguments. And again, I think what they're saying overlays with the instructional hierarchy too, but we have to have measurement tools, skill by treatment interactions, so that we can guide teachers with relatively simple, easy, quick, and curriculum-based assessments.
[00:22:38] Anna Stokke: So again, collecting the data to see where students are at so that you know where they're at in the instructional hierarchy is important.
What about automaticity?
[00:22:49] Brian Poncy: So automaticity came out of the cognitive literature, and they would do a lot of reaction-time type studies, and also automaticity really goes down to an item-by-item level, right? And this will make sense to any of your listeners that have taught kids math. If I say what's three plus three, a kid may go, "Six."
And then I go, well, “What's seven plus nine?” And they may be like, “…16.” And so they knew one problem automatically, thus they had automaticity in that. The other one, they didn't, let's say it took them three or four seconds. When I do a fluency probe, let's say I do sums to 10, I'm going to get a metric, 23 digits correct per minute.
There's going to be some items that the child basically uses a retrieval process. There's going to be others where kids maybe use a decomposition strategy, and there may be yet others that they use a counting strategy. If I'm doing intervention planning with that child, like the things the child already knows, I'll revisit one time.
And if it's things they already have, I may do a paced type intervention. So I'll be like, “I'm going to give you two seconds to do this. If you can't beat me, I'm going to say the answer. Every time you beat me, you get a chip.” If it's something that a child can't respond to within five seconds, I may do flashcard drill, right?
Where I actually say, like, "Eight plus nine is 17. What's eight plus nine?" right? And so I really put in a very salient model and feedback, right? As compared to the paced one, you know, where I'm lessening the saliency of the antecedent but still giving that feedback. And then the other one, I mean, it's just a cue of the problem and you don't even give them feedback because they already have it. But what you would give feedback on is the fluent performance, because you're basically fading the consequence from an item-by-item basis to a group of items.
[00:24:48] Anna Stokke: So when we're talking about getting children to say, memorize their times tables, we're really looking for automaticity, right?
[00:24:57] Brian Poncy: Yes, as anyone that's on Twitter knows, not everybody agrees with this. And so I will give my perspective and I will provide the research for said perspective. And so, for example, I had a kid one time in practicum, and the child was at 18 digits correct per minute on sums to 18.
We went ahead and we did explicit timing for like two and a half weeks and the child got up to 24. Okay, so the child made growth, explicit timing worked, but we weren't going to catch the kid up in time, right? It just wasn't fast enough. And so, you know, my colleague said double the dose and I said, "Well, before we do this, let's assess the child," and we assessed the child on sums to 10.
And we assessed them on sums 11 through 18. The child was at 40 digits correct on sums to 10. The child was about six digits correct per minute on sums 11 through 18. Still 100 percent accurate, but they had cumbersome counting strategies. When we used explicit timing and they were counters, all that did was cue them to count.
Now they could build a little bit of fluency and count faster, right? But really, all you're doing is having them practice a procedure, you know, and again, that temporal window on those larger problems was so great they were never pairing the stimulus and the response. So they were never pairing the nine plus six and 15.
So then what we did was we just had the child do cover, copy, compare, and this is an intervention where it's basically an automated flash card. You have a worksheet in front of the student, and it would have six plus nine is 15 and it would have a box beside it. The child looks at the problem and answer.
They say the problem and answer, cover it, write the problem and answer, and then uncover it to make sure that they got it right. And what that does is it prevents the child from engaging in a procedure. Now, people might be like, "Oh, don't do that! They've already demonstrated they can do the procedure."
Okay, but now we need to transition the kid from being able to do the counting procedure to automatic retrieval. And so what we do, you know, is just give them, so it's six, nine, 15, six plus nine is 15, six, nine, 15, six, nine, 15, six, nine, 15. And so again, people would call that rote, but they already understand one-to-one correspondence, they understand greater than.
They know what 15, six and nine all mean. They know a plus sign and equal sign, right? And so it's time, right? It's time to go ahead and automatize this fact. And so we did that, and that took, I want to say, another week and a half. And then the child was at, like, 45 digits correct per minute, sums to 18.
After we did the cover, copy, compare, then we, again, released, we faded the antecedent and consequent procedures to explicit timing. And then obviously we folded in the 11 to 18, to the sums to 10, which, you know, people will talk about as interleaved practice or whatever, and the child did that a few days and then we had it.
But I mean, literally, what would have taken, even doing explicit timing, probably a month and a half to do, we cut down to two weeks. And by the way, I timed everything. I did timed practice with cover, copy, compare.
We did timed practice with explicit timing, and the child had fun. I was thrilled. You know why? Because they got better. You know why they got better? Because we understood instructional components such as set size, modeling, reinforcement, the things associated with the instructional hierarchy.
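For readers who want to picture the worksheet format, here is a minimal sketch of generating a cover, copy, compare sheet for a small fact set; the fact list and layout are illustrative assumptions, and the actual MIND worksheets are available on the project site:

```python
# Hypothetical target set: facts the student can only reach by counting.
facts = [(6, 9), (9, 6), (7, 8), (8, 7)]

def ccc_rows(facts):
    """Each row pairs the full model (problem and answer) with a blank
    response slot: the student reads the model, covers it, writes the
    problem and answer from memory, then uncovers it to check."""
    rows = []
    for a, b in facts:
        model = f"{a} + {b} = {a + b}"
        rows.append(f"{model:<15}|   ___ + ___ = ___")
    return rows

for row in ccc_rows(facts):
    print(row)
```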
[00:28:35] Anna Stokke: So, when I think of being automatic with times tables or having your times tables memorized, which many of us clearly do, you can just say it automatically, right? So, you know, eight times seven is 56, right? But I discovered at some point that to some people this means something like you can give the answer within three seconds.
But if you're giving the answer within three seconds, and if you take the full three seconds, you're probably using some strategy to get the answer. So to me, that isn't automatic. And, the cognitive people would say, well, that's going to use up working memory when you're trying to figure out other problems, right?
So it really should be instantaneous.
[00:29:18] Brian Poncy: Yes, it should be. And when we see kids that are a little slower, you know, a lot of times it can be because their writing fluency is slow. So we talk about memorization of kind of basic facts, but in the MIND we actually go and we make them count fast, we make them identify numbers fast. I mean, they're just not thinking about it, right?
Because think about doing a cover, copy, compare intervention when you're a first grader and you can't write very well. Your cognitive resources are going to be focused on "Where do I start to write that five?" And now you're not temporally putting the three-term contingency together either.
But again, you see the pattern, right? You increase the declarative to build the procedure. If you task analyze, for example, writing three plus eight is 11, then we know you have to be able to write, and you need to be able to write fast. And so, you know, those are important things, and that's what we have in the MIND. And think about how happy a first-grade teacher is going to be if 80 percent of her kids come in and they can write 60 digits per minute.
And so it really snowballs over time.
[00:30:24] Anna Stokke: So how about mastery? What does that term mean?
[00:30:28] Brian Poncy: Yes, and so with the automaticity piece, it's so interesting. I went and I looked back and, obviously, the NCTMers seem to think that three seconds is fast enough. Other people have said two seconds. I think we need more research on it. And again, if you think about our dependent variables, or how we measure kind of the outcomes of our interventions, fluency is used a lot: digits correct per minute.
That's a dirty measure, right? Because it gives us an average. We've recently developed a computer program that we are going to do some research with over the next few years, and we're super excited about it because we're going to build in machine learning capabilities.
And what's wonderful is, for the first time ever, we're going to be able to look at item-by-item latency scores. And so what I think will be fascinating is I could quantify the number of steps in a procedure. And basically, you're going to see pockets of kids begin to cluster, you know; let's say they're at 0.75 seconds if it's automatic retrieval.
If it's a doubles plus one, which is only, right, two steps to three steps, well, that might be at 1.5 seconds. If they count, it depends on how many numbers they have to count. You should really start to be able to delineate kind of the average latency. And wouldn't that be interesting, because you could go ahead and put a kid on a computer, assess them, and it could say here are the items the child has with automaticity.
Here are the ones where they're using decomposition strategies. And here are the ones where they're straight doing counting on. And like, when I talk to people that are into NCTM, when I say something like that, they're like, "Oh, yeah, that would be kind of cool." I mean, we can meet together, but again, this whole "Are we going to build facts through procedures? Are we going to build procedures through facts?" to me is a sticking point.
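As a rough sketch of that item-by-item latency idea, here is a minimal classifier; the 0.75-second and 1.5-second cutoffs are the hypothetical values mentioned above, not validated thresholds:

```python
def classify_strategy(latency_seconds, correct):
    """Label a single response by its latency. The cutoffs are placeholders
    standing in for values the research program described here would estimate."""
    if not correct:
        return "error"
    if latency_seconds <= 0.75:
        return "retrieval (automatic)"
    if latency_seconds <= 1.5:
        return "decomposition (e.g., doubles plus one)"
    return "counting"

# Item-by-item latencies from a hypothetical timed probe.
responses = [("6 + 6", 0.6, True), ("6 + 7", 1.2, True), ("8 + 9", 3.4, True)]
for problem, latency, correct in responses:
    print(problem, "->", classify_strategy(latency, correct))
```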
It's an empirically testable question, but if we don't have research where we can isolate the dependent variables, we're in trouble. For example, like with this, I can go ahead and take a group of kids, and if I figure out that mastery or retrieval is at 0.75 seconds, I could give them 12 items and I could teach them to 0.75, right? And then I could have a control group that was wherever they were at.
Let's say, well, we'll use three seconds. Well, because you could control it, right? You train them to three seconds, the NCTM standard. Then we could go ahead and we could do a lesson, a conceptual lesson to teach decomposition if they could give us a dependent variable to measure the concept.
And then we could go ahead and do the lesson over a week or two weeks, and we could time it, and we could see how many instructional minutes it took for a child to basically transition from a counting to a decomposition strategy. And then we've settled the problem: you either memorize first, or you do the other; you time it, right, and you see which is more efficient.
Pam Snow who was on your podcast said something that really resonated with me. My advisor was Chris Skinner and he always talked about learning rate. It's not enough to measure how much kids learn. You have to measure learning per instructional minute.
And what Dr. Snow said that I thought was so cool was she talked about how that's the kids' time, right? There's nothing more precious than time, especially when we're trying to remediate and catch kids up. And so it's not about, you know, what us theorists think, or what we want to fit into how we view the world of mathematics. It's about, let's compare these two different ways of doing things and see what works most efficiently for kids.
[00:34:13] Anna Stokke: Absolutely. But we'd have to first agree on what the outcome is. Are we for instantaneous or three seconds? And then we're going to have to agree that we're going to test. Those things are sometimes a problem.
[00:34:26] Brian Poncy: They are sticky. But, I mean, if we were really a good scientific community, you know, we would meet across the hall and I would say, okay, "I don't do mixed methods, right? I just do experiments." But you know what? I would say, "All right, well, let's talk about what we're going to do, and if you want to have mixed methods, let's have mixed methods." And, you know, "Let's go." But let's figure it out. And let's talk about it. And let's agree on some metrics and do some good sound science.
[00:34:53] Anna Stokke: So, you research math interventions, and you developed the MIND, and it's free, it's empirically validated. The MIND consists of three sets of instructional materials, as I understand it, and you can correct me if I get anything wrong. The first is Facts on Fire, which supports Tier 1 instruction, in other words, school-wide, day-to-day instruction used in general classrooms.
The other two parts are for Skill Remediation, or Tier 2, and Intensive Intervention, or Tier 3. So, I'd like to start by talking about Facts on Fire, the part of the program that's designed to be used school-wide. So, can you tell us a bit about that?
[00:35:37] Brian Poncy: Sure. I developed these resources because I worked as a school psychologist in a response-to-intervention model. And this of course was around 2000, right? So it was about 20, 25 years ago. And in that, I was in a progressive non-categorical special education district, or co-op, called Heartland AEA.
And in that, you know, school sites never gave an IQ test. And so, you know, our job and our goal for special education evaluations really centered around what children needed to succeed. And so, at the core of our practice really were empirically validated interventions. We had to be able to, you know, screen kids, get baseline data, figure out why the child was struggling, what level of material they needed to be in.
We had to select or implement interventions, we had to look at growth, compare that to standards of how typical peers would grow, and we basically had to continue to intervene and ratchet up the intensity until we could demonstrate what kids needed to learn commensurate with their peers. If that took a lot of resources, we said, “Well, they obviously need specialized education services.”
And so that was kind of what we did. And as a school psychologist, we had really good assessment materials. We screened well, baselined well, evaluated well, but I scrambled to get intervention materials. And so, I would go to teachers and say, “Well, the child's slow and inaccurate, and so do a flashcard intervention or do cover, copy, compare.”
And then the teacher would look at me and be like, "You want me to create all this?" And I'm like, "Well, you're the teacher." And I realized there is not a repository at all for teachers to get, you know, academic intervention material. And then when I became a professor and computerized services started to take off, virtually everything was something people had to pay for.
And I was like, you know, there's a big need for low-SES districts and just teachers and parents, man, they need things. And so, I'm, you know, supported through taxpayer dollars, and plus this helps me with my research and service and outreach to communities, which is a big focus of land grant universities, which Oklahoma State is, and Dr. Codding asked me to be on her book.
And with her book, I was like, you know, I can't add anything to what she can write, she's brilliant. And so I said, but let me work on this, and students could kind of read the book, and then we'll actually have materials that they could access so they can do the things in the book.
So, all those things kind of came together and I was like, "Let's create this website, and let's give it away for free." Let's try to support and empower and help kids and teachers, you know, and so that's where that came about. Also, in my time as a school psychologist, if a kid was referred, and I worked in a really small district, we wouldn't have like a large norm, and we didn't want to use a national norm because you don't know if that's going to match up to that particular school district.
And so we would have to go in and collect a school-wide norm and class-wide norms. And what I would find oftentimes is a kid would get referred and they were maybe at eight digits correct per minute. Well, geez, if they were in, you know, a grade with two classes, I mean, out of the 40 kids, probably 10 of them were below 10 digits correct per minute.
And then, you know, another, about half of them were below 15. And so, my God, I saw all the time, we have a basic fact crisis; like, kids are not moving past counting. And even, you know, reformists and whatnot, they don't want people to memorize facts, but I'll tell you what, they want people to do the decomposition strategies and things of that nature.
So there's a basic fact crisis. And so, you know, the teachers, the kids were really strategy-dependent on counting strategies. And so obviously the low half, they weren't using more efficient decomposition strategies and they weren't using retrieval.
And so, I couldn't sit there and validate whether I should, you know, assess this person for special education eligibility because it wasn't a kid problem, it was a grade problem and a class problem, and so I figured out real quick it's just not about having intensive remediation stuff.
We better do a kind of a tier one type approach because you know, all students would benefit from tier one instruction.
[00:40:22] Anna Stokke: And so I think part of what I hear you saying is there isn't a focus on making sure that kids have committed basic facts to memory, right? And they're relying on decomposition strategies instead. So maybe thinking of, maybe they know a double, and so they work out an addition fact from that or a multiplication fact from that instead of just knowing automatically what the fact is. Is that correct?
[00:40:50] Brian Poncy: Yeah, that's correct. And, I mean, NCTM has been pretty firm lately, and I think it's, kind of unfortunate really, they'll say we don't want kids to memorize facts, but then they'll kind of also vacillate on that and be, well, if they memorize them, it's okay as long as they do it through a strategy or a procedure.
And so I think you know, teachers you know, follow that model, and so they provide students with procedures to use. And of course, that's counting, counting on, making tens, doubles plus one. You know, they have a scope and sequence there of what they like to do.
[00:41:25] Anna Stokke: About the NCTM, by the way, and just the idea that they're fine with kids committing facts to memory, as long as they get there through a strategy, my guess is that that likely has to do with some vision of what conceptual understanding means.
So there's a fear that students will just memorize and not know what anything means, so that's likely where that's coming from. But then, you know, what ends up happening is so much time is spent on these strategies and then students actually don't commit the facts to memory. Does that ring true to you?
[00:42:02] Brian Poncy: Yeah, when I think about kids committing things to memory, there has to be somewhat of a tight temporal window between the stimulus and the response. And so, if I sit there and I see eight plus nine, even if I say, "Oh, the nine is bigger, so you count on eight," and then you say "17," you don't remember the eight plus nine anymore because you went 9, 10, 11, 12, 13, 14, 15, 16, you know, and then you say the answer.
Now if I say, what's three plus two and a kid goes “three, five.” Right? Three, two, five. Well, that they can memorize.
Memorization is just immediate recall. It's long-term. You don't need any structured strategies or graphic organizers. You just know it, right? And you generally can pull that up fairly quickly. But what you'll see is, in teachers, even reformists will say, "Well, we start with zeros and ones, because those are going to be really easy."
And quite frankly, if it's four plus one, and you're like, four, five, right, you can either count it or retrieve it. And it's really going to be hard to discriminate that. It’s the same way with your twos as well. And then after they do that, then they'll be like, okay, now let’s learn our doubles. They will say memorize, right?
So, let's learn our doubles. And so now you got zeros, ones and twos, and then they have their doubles, and so now if a kid sees six plus seven, and the kid has six plus six memorized, then they know, okay, that's 12, 13. I just wrote a post on the Science of Math website in response to an article about the California math framework banning memorization for multiplication facts.
And one of the things that I just, it annoys me is NCTM is all about memorizing stuff. Just some stuff and in some ways. For example, if you're going to use a decomposition strategy, you have to have your doubles memorized. I mean, what teacher in their right mind would be like, “No, just count on 6, and then plus one, 13, or minus one, 11.”
That would be ridiculous. That actually would be an extra step on just counting up. And so they're cool with memorizing that, and they're cool with memorizing the strategy of doubles plus or minus one, and they're cool with the procedure of, "Okay, do I want to go up one or down one?" And yet people say, "If they memorize stuff, they're just not going to get the concept."
It's like, well, quit talking out of both sides of your mouth. And then when we get to tens, now they memorize all the tens and now we can start making tens, because I know seven plus three and four plus six and all that stuff. And now you get this little set of facts on each side that doubles plus and minus one and making tens don't really apply to, and then kids usually will revert back to good old count-up strategies.
[00:44:54] Anna Stokke: Okay, so I think I hear you saying that the decomposition strategies, so using, say, a double or making 10 to figure out an addition or multiplication fact, maybe doesn't actually work that well and that kids just end up counting on, which is really inefficient, but what about, you know, encouraging decomposition strategies so that kids use them for larger numbers, so mental math strategies. Is there any research on that?
[00:45:29] Brian Poncy: Actually, there was a really cool study done by Marina Vasilyeva with two samples. They looked at American students and Taiwanese students. And what they looked at were first-grade kids; they kind of assessed them individually, and they looked at whether they were using retrieval strategies, a decomposition strategy, or a counting strategy.
And they did this for an American sample and a Taiwanese sample, and they did this across three different problem types. They did it across single-digit, two-by-one-digit, and two-by-two-digit. And what they found in the United States was about half the kids used retrieval strategies, and I think they defined theirs as within two seconds.
And in Taiwan, about 63 percent of students retrieved their facts. Only 13 percent of Taiwanese kids used counting, and it was 32 percent in America. And I just think these percentages are interesting because, you know, it's been my experience that about half the kids, the top half of the kids, they get it.
Like, they're going to learn regardless of kind of how they're taught. But what was interesting is when you went to mixed-digit problems, right, you can't have those kind of memorized. And so you're going to have to use some strategy. American students used counting when it was a two-by-one-digit problem 50 percent of the time, and 42 percent used a decomposition strategy.
In Taiwan, 63 percent of the kids used a decomposition strategy as compared to only 22 percent that used a counting strategy. And when they went to double digits, 75 percent of Taiwanese kids used decomposition, whereas only 50 percent of American kids did. Only 10 percent of Taiwanese kids used counting procedures, versus almost 40 percent of American kids. I know it's a lot of data, but, you know, when I sat back and thought about it: retrieval with basic facts supports decomposition strategy use.
And as more kids memorize those single-digit skills or those problems, it allows them to actually achieve the very goals that the reformists are wanting to do. And so it's interesting, when we talked earlier, I was like, “Well, as long as they memorize things through procedures, they're okay.”
But what we're doing is we're making kids count and use count-on procedures, and the temporal window does not allow them to basically make that a memorized fact. Therefore, that never moves to decomposition use. And so what Taiwan does, and Asian countries are more apt to do, is have them learn and memorize those core facts, and they use the declarative fact to basically inform the procedure.
In America, NCTM wants to focus on the procedure to build the fact. And so, your lower kids get left in the dust. And so, you know, you got this great organization that's championing diversity and all this stuff. But in reality, the most vulnerable populations are the very ones that are getting hurt by the pedagogies they promote.
[00:48:45] Anna Stokke: Okay, so just to get this straight. So your theory on this then is that if the students have memorized the core basic facts, right, so we're talking about single-digit arithmetic facts, right? If they've memorized those core facts, that it's then going to be easier for them to use decomposition strategies for larger numbers, right?
What I would think of as sort of mental math that you can kind of do in your head if you have some basic facts memorized, right?
[00:49:15] Brian Poncy: Well, that's exactly right. In the study I was talking about, it was mental math. And then if we go to the wonderful work of Kirschner and Sweller and Ashman and all these people, I mean, this is what they're saying, right? They're saying, if we look at the complexity, which, as a behaviourist, I would quantify by the number of procedural steps, because, you know, making tens has got quite a few steps in it, two or three, four steps, right?
And so, if we have those core facts memorized, then we can go ahead and we can do that, we can mentally engage in those. And so again, as I'm a behaviourist, but everything I hear them saying, I'm like, “Yep, that's exactly you know what we're seeing out in the field.” And so that's what I love about the science of math movement.
It has me listening, reading and talking to people I never would have before, and I'm finding out that I have a lot of friends in this community. And a lot of us are thinking the same way.
[00:50:07] Anna Stokke: Yeah, absolutely. So, let's talk about some specifics. So, my understanding is that you conducted a longitudinal study that may be of interest to listeners. I understand that it all started when a school contacted Oklahoma State University faculty with a problem. They had the lowest broad math scores in the district.
And you, of course, are on faculty at Oklahoma State, and you have expertise in math interventions, so they went to you, and you proposed implementing the Facts on Fire program. So can you tell us a bit about that? What did that involve?
[00:50:48] Brian Poncy: Yeah, well, it was Dr. Duhon. And so I wish I could take credit for them contacting me. This was way back in 2010, so I had just started, and Dr. Duhon, he's my research brother and we do, you know, math stuff together. He actually graduated with Amanda VanDerHeyden. So, they had very similar training; he also is all about focusing on tier-one-type approaches.
So, they contacted him and he says, “Oh, yeah, we can fix that.” And so I was kind of working on these math materials and plus I do the second year practicum. So, we had a group of students that could kind of copy and make probes and support. And so, yeah, we went in and we met with the school district and we had them kind of define out their scope and sequence and what skills that they wanted to work on.
And we just said, if this is going to work, it has to be every single day and we're going to do it for four minutes a day. We don't want to usurp what you're doing. We're not trying to tear down your curriculum or what you're doing. Just come in every day after the announcements, every student in the building is going to work or at least grades one through, I think it was four or five, are going to work on basic facts.
And again, we didn't do this willy-nilly. We assessed every child. We found what they could do and what they couldn't do. We would never put a kid, and you should never put a kid, in a timed test on something they can't do accurately or that isn't in their instructional range. That's where we get into problems, you know, with timed tests, and I'm sure we'll talk about that later.
But anyway, so we did that. The interesting thing is, when we think about the time commitment over the course of the year (because, you know, we didn't do it at the beginning, like the first few weeks as they got settled in, and then you had testing, and the last week or two of school was kind of a madhouse), we usually did this about 150 days a year. And so when it all comes out, it's about 600 instructional minutes.
Which is interesting, right? That's 10 hours, 10 hours of instruction over a year. We weren't taking much, right? Four minutes a day. And yet our outcome data were state test scores, right? Because, you know, one of the things, and you're just in a little bit of a catch-22 as a fact researcher, is if you use facts as your outcome variable, they'll be like, "Well, who cares, because, you know, you practiced facts, and they got better at facts." And so we're like, "Well, let's not use facts then, let's go ahead and use state test scores." And so the first year, we saw no difference in the state test scores, and that made sense to us, right? Because we had that test, I think, in third grade, and the students that took that test had Facts on Fire for about 100 days that year.
But then the next year, you know, the students in third grade had all of second grade and third grade, and then the next year they had all of first grade, second grade, third grade, right? And so, you know, by the time we were done in our fourth year of implementation, it was tied with the highest school in the district. And this, of course, was a Title I school. The other elementary school was kind of the Richie Rich, high-SES school.
And we had a control group within the district that was another Title I school. And they did not grow during that time. And so again, it's just fascinating, right? We grew on the state test. Let's see, it was a pretest of 711 and it went to 812. The other Title I school went from 730 to 740, and then the non-Title school went from a 765 to an 813, which was a 48-point difference.
And so basically, we doubled the growth of the high-SES school. I probably need to get into the details, right? So we started everybody in grade-level material. They did two two-minute timings each day. On Wednesdays, we would go ahead and score one of the probes and we would see if it met the mastery criterion, which we set at 40 digits correct per minute. And, you know, when kids met that for two or three weeks in a row, then we would go ahead and move them to the next skill, right?
And then they'd practice that until they moved to the next skill. And so, again, I think a pretty minimal time requirement for teachers, especially given our students were doing all the copying and the stuffing of folders and different things. But, I mean, you know, if you could sit there and tell any superintendent, "Hey, I'm going to drastically increase your state test scores with 10 hours of instruction a year," then I think pretty much everybody would be for that.
But again, when we think about what I said earlier, get fluent in your basic facts, it will help the procedures. That's what changed in that school as compared to, "We're going to continue to teach all these different procedures so kids get their facts and have them with conceptual understanding."
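Here is a minimal sketch of the weekly advancement rule just described: score one probe, compare it to the 40 digits-correct-per-minute mastery criterion, and move the student to the next skill once the criterion has been met for a run of consecutive weekly checks. The two-week run length and the data layout are assumptions for illustration; the study used two to three weeks.

```python
MASTERY_DCPM = 40       # mastery criterion used in the study
WEEKS_TO_ADVANCE = 2    # consecutive weekly checks required (study used two to three)

def should_advance(weekly_scores):
    """True if the most recent consecutive weekly probe scores all meet mastery."""
    if len(weekly_scores) < WEEKS_TO_ADVANCE:
        return False
    return all(score >= MASTERY_DCPM for score in weekly_scores[-WEEKS_TO_ADVANCE:])

# Hypothetical Wednesday probe scores (digits correct per minute) on "sums to 10".
scores = [28, 35, 41, 44]
print(should_advance(scores))  # True: the last two weeks are at or above 40
```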
[00:55:54] Anna Stokke: A couple points of clarification just for the listeners. Okay, so what's a Title I school?
[00:56:00] Brian Poncy: Title I school just means there are high levels of low SES. So, basically they get Title I funding.
[00:56:06] Anna Stokke: Next thing I wanted to ask, so you didn't change the curriculum, the core curriculum that was used in the school. So the only thing you did was you had them implement this Facts on Fire program. It's sequenced, there are guidelines for when children move on, and basically it concentrates on making sure that students are fluent with basic facts and computational fluency, right?
You work on things like standard algorithms, computation, and that sort of thing as well?
[00:56:39] Brian Poncy: Yes, in that school we did. If teachers wanted two-by-two digit, we gave them two-by-two digit.
[00:56:45] Anna Stokke: And the measure that you used to determine whether this program was successful was, in fact, the state test.
[00:56:53] Brian Poncy: Yes, Oklahoma state test.
[00:56:55] Anna Stokke: Which is not designed to work with Facts on Fire.
[00:56:58] Brian Poncy: Yeah, and the subtest had nothing to do with computation. It was a broad test.
[00:57:05] Anna Stokke: The state test would be based on what you'd call the state standards, right?
[00:57:11] Brian Poncy: Yes.
[00:57:05] Anna Stokke: Okay, got it. Okay, and this was just really four minutes a day, every day, across the school. And what grades were in the school?
[00:57:22] Brian Poncy: This was elementary. So, I think it was K-5. And we didn't touch kindergarten. We just did first grade and up, and I want to say that first year we didn't even mess with it. We got into first grade about halfway through the year.
[00:57:36] Anna Stokke: This kind of supports your hypothesis, right, that making sure that students really know those basic facts will help them to succeed in other areas of math as well.
[00:57:47] Brian Poncy: It helps the students succeed, and it helps the teachers succeed. Like, one of the things I really, I want listeners to take from this is we didn't change what the teachers were doing. These were good teachers doing good things. They had a difficult population, they were working their tails off like teachers do.
What we did through the Facts on Fire, is we increased students' level of prior knowledge so that the things they were teaching, the things that are in most curricula are going to be focused on procedural and conceptual understanding. What we did was we built the declarative fact knowledge, all right. And again, just like the cognitive people say, with the element interactivity research data, it's like you lift prior knowledge, it's going to help when you have multi-component lessons.
It literally dovetails perfectly with the cognitive literature and the behavioural literature. It's so common sense. And again, if you talk to fourth and fifth grade math teachers, they're pulling their hair out because they're trying to teach so much and kids are writing things out on paper and doing different things. And the other thing I want people to realize is just because you know five plus six is 11 doesn't mean you can't realize that's five plus five plus one or six plus six minus one.
And again, the study that I talked to you about just a little bit earlier, actually what it would say is that memorizing allows you to do that better. And you can do more problems, again, in a temporal window, so you can juxtapose different things. When we talk about conceptual understanding, it's a hard thing to define, but people always talk about interconnectedness.
Well, the more problems you can do in a temporal window, the better you can make those connections. And so, you know, I just do not get the NCTM push for why memorization is somehow precluding kids from conceptual understanding. And I think the learning literature, whether it's behavioural learning theory or cognitive learning theory, they converge to say that it's just not true. It's not the way, it's not the way it happens.
[00:59:56] Anna Stokke: It doesn't seem to be true, but anyway, NCTM does have a lot of influence. Even here in Canada, I mean, it's something we have to keep talking about that some of these things that NCTM are saying are not really evidence-backed.
[01:00:12] Brian Poncy: Yeah. At the end of the day we should listen to researchers, but the final verdict should come in from the children. And when I say that, not from a qualitative journalism place where we ask children; look at their outcome data. Because again, this basically says, "Hey, what you guys were doing in the classroom, focusing on procedures and concepts, was, you know, it was good."
And remember, it probably was going to work for the top half kids anyway, but what we did was we basically brought up the prior knowledge level of the lower kids and allowed teachers probably to teach a little bit more of a homogenous group and allowed them to be the effective professionals that they can be.
[01:00:53] Anna Stokke: Okay, so that school, are they still using the MIND program?
[01:00:58] Brian Poncy: No. And so it was funny, well, I mean, it's not funny, but you know, you've got to laugh, it beats crying. They basically adopted a new curriculum. Now why? We don't know. But schools, come hell or high water, about every four to six years, will just say, "Oh, we're going to do a new curriculum." And the teachers are like, "Well, it'll be too much work to, you know, implement a new curriculum and do the Facts on Fire for four minutes a day, so get out of our school."
And we did, and they are back to being the lowest school in the district. And what was really concerning to me is, I mean, we had the data, we shared the data with the teachers, and why those data did not, you know, reinforce keeping it going, I don't know. You would think they would want to keep doing things that showed success with children.
[01:01:45] Anna Stokke: Oh, wow. It's four minutes a day, people. Four minutes a day. That's all it takes. And on that note, let's take a break and we'll come back and talk about a few other things. I will publish that episode, the continuation of this excellent conversation in one week. And we will talk about things like why fluency is important, conceptual understanding, explicit timing, Brian's research on dosage, that's just how much practice is needed, and how frequently the practice is needed, and more.
As always, we've included a resource page for this episode that has links to articles and books mentioned in the episode.
If you enjoy this podcast, please consider showing your support by leaving a five-star review on Spotify or Apple Podcasts. Chalk and Talk is produced by me, Anna Stokke, transcript and resource page by Jazmin Boisclair, social media images by Nicole Maylem Gutierrez.
Subscribe on your favourite podcast app to get new episodes delivered as they become available. You can follow me on X for notifications or check out my website, annastokke.com, for more information. This podcast received funding through a University of Winnipeg Knowledge Mobilization and Community Impact grant funded through the Anthony Swaity Knowledge Impact Fund.