Teaching With Problems
Direct Instruction of Problem Solving
Let’s say that you (1) believe that direct instruction (i.e., explicit worked examples of a procedure followed by repeated practice) is key to instruction, and (2) hold, as a mathematical goal, that students should be able to solve problems they have never seen before and are likely to get stuck on at some point.
What would that classroom look like?
Well, you’d probably provide some explicit instruction at the beginning of class in how to solve a new problem. Except that, of course, there is no guaranteed procedure for solving new problems, so you’d probably be explicit that you’re modeling good moves that your students would practice using later. A worked example.
And then you’d probably give them a chance to work on a non-routine problem that uses some of the moves or strategies you’ve explicated. Maybe a quick one in semi-whole group, so that students get feedback before trying a problem on their own rather than reinforcing a mislearned move.
And then you’d probably give practice. Except that practice with problem solving means getting stuck and then getting unstuck with some move, strategy or technique.
You’d give students feedback about how well they used the moves that you’ve explicitly taught.
Maybe I’m wrong, but this doesn’t sound so different to me from the classroom imagined by opponents of direct instruction.
13 thoughts on “Direct Instruction of Problem Solving”
It’s not clear to me that proponents of direct instruction believe that classroom instruction should or can prepare students to solve novel problems. That task is often deferred until students achieve a higher level of mastery (undergrad or grad school). It’s sort of like how most music programs don’t have any goals around preparing students to create (or often even interpret or appreciate) music, merely to competently play the music of others. The argument you are making might end up implying that instruction that has as a goal students applying heuristics in novel situations can’t look like direct instruction, at least in the sense of highly chunked, very carefully scaffolded, I-do-we-do-you-do practice preceded by explicit explanations. Also, the timing of explicit explanation in relation to struggle is a difference between some models that seems to make a difference, at least in some research studies.
It’s not clear to me that proponents of direct instruction believe that classroom instruction should or can prepare students to solve novel problems.
Agreed! But isn’t it good to know what we’re really arguing about?
Also, the timing of explicit explanation in relation to struggle is a difference between some models that seems to make a difference, at least in some research studies.
Interesting. I wonder if this has to do with teaching the cultural expectation that we struggle on problems in math. The idea that this is an important element of transfer is an idea I associate with Greeno.
In terms of the struggle, I was thinking more of the work of Manu Kapur, and the “productive failure” research that indicates that struggling to make sense of an ill-defined problem before being shown a tool to efficiently think through the problem leads to better long-term retention of the tool. Example: being asked to rank a bunch of tennis players based on a mass of data, and making sense of that situation using informal methods, leads to better retention of standard deviation when it is taught after going through the messy struggle. (The data is about performance on one aspect of the game; some players are very consistent, some less so; students are already familiar with measures of center but have no tools to quantify spread before the lesson).
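To make that contrast concrete, here’s a minimal sketch with made-up scores (not the study’s actual data): two hypothetical players with identical means but very different spreads — exactly the gap that measures of center can’t capture and standard deviation fills.

```python
import statistics

# Hypothetical per-match scores for two tennis players (illustrative only).
# Both players have the same mean, so a measure of center can't separate them;
# standard deviation quantifies how consistent each player is.
player_a = [70, 71, 69, 70, 70]   # very consistent
player_b = [50, 90, 55, 85, 70]   # same mean, far less consistent

mean_a = statistics.mean(player_a)
mean_b = statistics.mean(player_b)
sd_a = statistics.pstdev(player_a)  # population standard deviation
sd_b = statistics.pstdev(player_b)

print(f"Player A: mean={mean_a}, sd={sd_a:.2f}")
print(f"Player B: mean={mean_b}, sd={sd_b:.2f}")
```

Both means come out to 70, while the standard deviations differ by more than an order of magnitude — the kind of discrepancy students in the productive-failure condition wrestle with informally before the formal tool is introduced.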
What’s the proposed mechanism behind the increased performance?
The discussion in this paper is short and says it better than I could gloss: http://www.manukapur.com/wp40/wp-content/uploads/2015/05/CogSci08_PF_Kapur_etal.pdf but I liked the key word *discern* — productive failure prior to lecture may support students to discern key ideas in a well-structured lecture. They do point out that there’s a lot going on in the productive failure condition including collaboration, unscaffolded solving of carefully crafted problems, and delay of structure, and it’s not clear what each contributes.
Got it. I’ll take a closer look soon.
One of the recent bits of reading that hit me hard was Lesh & Zawojewski’s (2007) note that, in many studies showing strong effects for metacognitive questioning, it was unclear whether newly developed metacognition was what helped learning. Maybe kids were just spending more time thinking about the content? they ask.
This is a new way of being skeptical for me, and I find it troubling, in a good way.
This is very interesting – starting with the title! I think one core conception is that “learning [can? should? ideally does?] look tidy” (which to me connects to the role of right answers in the learning process). Also I think it relates to some conceptions about math as a body of knowledge, and even more as a discipline (that is, the doing of math, more than the corpus of mathematical objects, relationships, and truths). Your description evokes strong memories for me of my own experiences as a solver and learner at Ross/OSU (parallel to PROMYS — hi Justin!) and later (when I became a teacher), at PCMI (hi Bowen!)
The discussion in the comments also reminds me of the research on demonstrations in science classes — apparently without some kind of “activation” stage before the demonstration, watching it can actually REINFORCE misconceptions rather than repairing them (to the degree that students will mis-remember what happened in the demo as being consistent with their prior misconception!) I was just looking for the source on this and I couldn’t find it, but I think I initially found it through the work of Derek Muller (“Veritasium” on YouTube), so if someone reading this doesn’t point us there, maybe I will just email him to ask…
Interesting stuff — I’d definitely be interested in reading about that research if you track it down.
The epically long blog post I just wrote here http://mathforum.org/blogs/max/anyone-want-to-do-some-research-on-problem-solving/ includes links to the Veritasium stuff, and a much longer meditation on guidance and whether problem solving can be taught and what that even means.
Thanks! A great post with lots to dig in on.
My guess (from recent conversations) is that direct instruction proponents would suggest that novel problem solving in [domain x] requires that students already know [domain x]. Strong content knowledge precedes problem solving, in other words.
Any sense on whether [domain x] needs to be something big like “math” or “geometry” or if it can be something small like “1-digit addition” or “slope”?
My impression is it gets down to a pretty atomic level.
When is Direct Instruction Most Effective?
May 5, 2023
Decades of research, much of it included in a comprehensive meta-analysis for the Review of Educational Research, document strong positive results for the Direct Instruction model. That isn’t to say it’s the only successful pedagogical approach. There’s ample evidence that student-centered experimental, inquiry-based, and non-directive learning encourages students to think critically, fosters genuine curiosity, and cultivates characteristics such as self-reliance and tenacity. But when is Direct Instruction most effective, and which lessons lend themselves to this specific style?
When weighing the various teaching approaches, deciding what will work best for your particular students for each lesson can be challenging. Some guidance exists in an Educational Leadership magazine analysis, where college professor and administrator Penelope L. Peterson notes that students did slightly better on achievement tests after receiving direct teaching but marginally worse on tests of abstract thinking. On the other hand, students under an “open teaching” model, which emphasizes student choice and more individual or small-group work than large-group instruction, did somewhat better on creativity and problem-solving but worse on achievement tests.
Because the Direct Instruction approach lends itself to topics requiring a solid factual foundation, mastery of specific skills, or comprehension of step-by-step processes, here are some ways to use Direct Instruction in various subjects.
Direct Instruction can be ideal for teaching math, particularly when students learn new mathematical concepts or procedures. Teachers can provide step-by-step instructions, show examples, and guide students through practice exercises. Teachers can use Direct Instruction for problem-solving by giving examples of problems and guiding students through the problem-solving process. Students can learn graph and data analysis through Direct Instruction by learning how to choose the appropriate graph type, label the axes, and interpret the data. Math generally involves a lot of procedural knowledge, so Direct Instruction can help break down complex mathematical processes into more manageable steps. Many math concepts build upon one another, and it’s essential to have a solid understanding of the basics before moving on and adding more advanced topics. Direct Instruction also incorporates plenty of practice, perfect for implementing new math formulas and equations.
Direct Instruction can be an effective approach to teaching science concepts, particularly when students are first learning basic scientific facts and procedures. Teachers can use hands-on activities to show students how scientific principles work and explain the steps involved in the process. Science teaching will also involve students conducting experiments; it’s much easier to understand how elements react or how light behaves, for example, through experimentation than through Direct Instruction. It’s important to note that many scientific areas that benefit from experiments, including physics, chemistry, and biology, also require a strong understanding of foundational concepts, which are effectively taught through Direct Instruction. Every experiment, from baking soda volcanoes to advanced high school chemistry, requires students to follow precise, step-by-step instructions and specific processes, which are also best taught via a teacher-directed approach.
Direct Instruction can be a helpful approach to teaching specific language arts skills. Many language skills can be broken down into smaller, more manageable parts, helping students incrementally build their skills and abilities as they progress. When using it to teach specific grammar rules, such as subject-verb agreement or punctuation usage, teachers can explain the rule, provide examples, and guide students through practice exercises. For teaching vocabulary, teachers can introduce a word, give a definition, and offer examples of how the term is used in context. Direct Instruction can also help teach students how to analyze literary texts, such as poems or short stories: teachers can guide students through a close reading of the text, asking questions about the plot, characters, and themes while introducing literary techniques the author uses. In writing, Direct Instruction can be used to teach how to write a thesis statement or how to structure an essay. Teachers can provide models of good writing, explain the writing process, and offer feedback on student writing.
Learning new languages is often best accomplished through a structured approach focused on rules and attributes of vocabulary, grammar, and syntax. To teach foreign language vocabulary and grammar rules, teachers can explain the meaning of new words, demonstrate correct pronunciation, and guide students through practice exercises. The Direct Instruction method can also help students develop their speaking and listening skills in a foreign language. Teachers can model correct sentence structure and pronunciation, provide feedback on student speaking, and guide students through practice conversations. To teach students about the culture and customs of countries where the foreign language is spoken, teachers can provide information on cultural norms, such as greetings and social customs, and help students understand how to navigate these norms in a foreign language setting.
Social studies subject areas, such as history and geography, often require students to memorize large amounts of information. When students must comprehend new information and commit it to memory for future recall, Direct Instruction helps break it down into smaller, more manageable chunks while providing students with strategies for organizing, remembering, and applying the information. During read-aloud sessions, teachers can guide students through discussions and provide explanations to help them understand the text. Teachers can show students how to research and select important events and help them understand how events relate on a timeline. Direct Instruction can also help teach students how to read and interpret maps, such as political or topographic maps, and how to identify important geographic features.
Enhance Your Direct Instruction Through Strategies From Avanti
Enhance your teaching toolkit by discovering new strategies for effective Direct Instruction in the Avanti resource library. With a free one-week Avanti trial, you can have full access to the entire video library, recorded livestream discussions, implementation guides, and a collaborative community forum for asking questions or sharing tips, tricks, and successes.
Intervention Study
Published: 05 August 2019
Problem-solving or Explicit Instruction: Which Should Go First When Element Interactivity Is High?
Greg Ashman, Slava Kalyuga & John Sweller
Educational Psychology Review, volume 32, pages 229–247 (2020)
The concept of productive failure posits that a problem-solving phase prior to explicit instruction is more effective than explicit instruction followed by problem-solving. This prediction was tested with Year 5 primary school students learning about light energy efficiency. Two fully randomised, controlled experiments were conducted. In the first experiment (N = 64), explicit instruction followed by problem-solving was found to be superior to the reverse order for performance on problems similar to those used during instruction, with no difference on transfer problems. In the second experiment, where element interactivity was increased (N = 71), explicit instruction followed by problem-solving was found to be superior to the reverse order for performance on both similar and transfer problems. The contradictory predictions and results of a productive failure approach and cognitive load theory are discussed using the concept of element interactivity. Specifically, for learning where element interactivity is high, explicit instruction should precede problem-solving.
Chen, O., Kalyuga, S., & Sweller, J. (2015). The worked example effect, the generation effect, and element interactivity. Journal of Educational Psychology, 107 (3), 689–704.
Chen, O., Kalyuga, S., & Sweller, J. (2016a). Relations between the worked example and generation effects on immediate and delayed tests. Learning and Instruction, 45 , 20–30.
Chen, O., Kalyuga, S., & Sweller, J. (2016b). When instructional guidance is needed. Educational and Developmental Psychologist, 33 (2), 149–162.
Chen, O., Kalyuga, S., & Sweller, J. (2017). The expertise reversal effect is a variant of the more general element interactivity effect. Educational Psychology Review, 29 (2), 393–405. https://doi.org/10.1007/s10648-016-9359-1 .
Chen, O., Castro-Alonso, J. C., Paas, F., & Sweller, J. (2018). Extending cognitive load theory to incorporate working memory resource depletion: evidence from the spacing effect. Educational Psychology Review, 30 (2), 483–501. https://doi.org/10.1007/s10648-017-9426-2 .
Cook, M. A. (2017). A comparison of the effectiveness of worked examples and productive failure in learning procedural and conceptual knowledge related to statistics (Order No. 10666475). Available from ProQuest Dissertations & Theses Global. (1984948629).
Cowan, N. (2001). The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences, 24 (1), 87–114.
Crooks, N. M., & Alibali, M. W. (2014). Defining and measuring conceptual knowledge in mathematics. Developmental Review, 34 (4), 344–377.
DeCaro, M. S., & Rittle-Johnson, B. (2012). Exploring mathematics problems prepares children to learn from instruction. Journal of Experimental Child Psychology, 113 (4), 552–568.
Fyfe, E. R., DeCaro, M. S., & Rittle-Johnson, B. (2014). An alternative time for telling: when conceptual instruction prior to problem solving improves mathematical knowledge. British Journal of Educational Psychology, 84 (3), 502–519.
Geary, D. (2008). An evolutionarily informed education science. Educational Psychologist, 43 (4), 179–195.
Geary, D., & Berch, D. (2016). Evolution and children’s cognitive and academic development. In D. Geary & D. Berch (Eds.), Evolutionary perspectives on child development and education (pp. 217–249). Switzerland: Springer.
Glogger-Frey, I., Fleischer, C., Grüny, L., Kappich, J., & Renkl, A. (2015). Inventing a solution and studying a worked solution prepare differently for learning from direct instruction. Learning and Instruction, 39 , 72–87.
Glogger-Frey, I., Gaus, K., & Renkl, A. (2017). Learning from direct instruction: best prepared by several self-regulated or guided invention activities? Learning and Instruction, 51 , 26–35.
Hirshman, E., & Bjork, R. A. (1988). The generation effect: support for a two-factor theory. Journal of Experimental Psychology: Learning, Memory, and Cognition, 14 (3), 484.
Ho, J., Tumkaya, T., Aryal, S., Choi, H., & Claridge-Chang, A. (2018). Moving beyond P values: everyday data analysis with estimation plots. bioRxiv, 377978.
Hsu, C.-Y., Kalyuga, S., & Sweller, J. (2015). When should guidance be presented in physics instruction? Archives of Scientific Psychology, 3 (1), 37–53.
Hwang, J., Choi, K. M., Bae, Y., Dong, & Shin, H. (2018). Do teachers’ instructional practices moderate equity in mathematical and scientific literacy? An investigation of the PISA 2012 and 2015. International Journal of Science and Mathematics Education . Advance online publication. https://doi.org/10.1007/s10763-018-9909-8 .
Jacobson, M. J., Markauskaite, L., Portolese, A., Kapur, M., Lai, P. K., & Roberts, G. (2017). Designs for learning about climate change as a complex system. Learning and Instruction, 52 , 1–14.
Kalyuga, S., Chandler, P., Tuovinen, J., & Sweller, J. (2001). When problem solving is superior to studying worked examples. Journal of Educational Psychology, 93 (3), 579–588.
Kapur, M. (2012). Productive failure in learning the concept of variance. Instructional Science, 40 (4), 651–672.
Kapur, M. (2014). Productive failure in learning math. Cognitive Science, 38 (5), 1008–1022.
Kapur, M. (2016). Examining productive failure, productive success, unproductive failure, and unproductive success in learning. Educational Psychologist, 51 (2), 289–299.
Kapur, M., & Bielaczyc, K. (2012). Designing for productive failure. Journal of the Learning Sciences, 21 (1), 45–83. https://doi.org/10.1080/10508406.2011.591717 .
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41 (2), 75–86.
Lai, P. K., Portolese, A., & Jacobson, M. J. (2017). Does sequence matter? Productive failure and designing online authentic learning for process engineering. British Journal of Educational Technology, 48 (6), 1217–1227.
Leppink, J., Paas, F., Van Gog, T., Van der Vleuten, C., & Van Merrienboer, J. (2014). Effects of pairs of problems and examples on task performance and different types of cognitive load. Learning and Instruction, 30 , 32–42.
Loibl, K., & Rummel, N. (2014a). The impact of guidance during problem-solving prior to instruction on students’ inventions and learning outcomes. Instructional Science, 42 (3), 305–326.
Loibl, K., & Rummel, N. (2014b). Knowing what you don’t know makes failure productive. Learning and Instruction, 34 , 74–85.
Martin, A. J. (2016). Using load reduction instruction (LRI) to boost motivation and engagement . Leicester: British Psychological Society.
Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59 (1), 14–19.
Reber, A. S. (1989). Implicit learning and tacit knowledge. Journal of Experimental Psychology: General, 118 (3), 219–235.
Rittle-Johnson, B., Fyfe, E. R., & Loehr, A. M. (2016). Improving conceptual and procedural knowledge: the impact of instructional content within a mathematics lesson. British Journal of Educational Psychology, 86 (4), 576–591.
Rosenshine, B. (2009). The empirical support for direct instruction. In S. Tobias & T. Duffy (Eds.), Constructivist instruction: success or failure? (pp. 201–220). New York: Routledge.
Schwartz, D. L., & Bransford, J. D. (1998). A time for telling. Cognition and Instruction, 16 (4), 475–522.
Schwartz, D. L., & Martin, T. (2004). Inventing to prepare for future learning: the hidden efficiency of encouraging original student production in statistics instruction. Cognition and Instruction, 22 (2), 129–184.
Schwartz, D. L., Lindgren, R., & Lewis, S. (2009). Constructivism in an age of non-constructivist assessments. In Constructivist Instruction (pp. 46-73). Routledge.
Schwartz, D. L., Chase, C. C., Oppezzo, M. A., & Chin, D. B. (2011). Practicing versus inventing with contrasting cases: the effects of telling first on learning and transfer. Journal of Educational Psychology, 103 (4), 759–775.
Schwonke, R., Renkl, A., Krieg, C., Wittwer, J., Aleven, V., & Salden, R. (2009). The worked-example effect: not an artefact of lousy control conditions. Computers in Human Behavior, 25 (2), 258–266.
Slamecka, N. J., & Graf, P. (1978). The generation effect: delineation of a phenomenon. Journal of Experimental Psychology: Human Learning and Memory, 4 (6), 592.
Sweller, J. (2010). Element interactivity and intrinsic, extraneous and germane cognitive load. Educational Psychology Review, 22 (2), 123–138.
Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12 (3), 185–233.
Sweller, J., & Paas, F. (2017). Should self-regulated learning be integrated with cognitive load theory? A commentary. Learning and Instruction, 51 , 85–89.
Sweller, J., & Sweller, S. (2006). Natural information processing systems. Evolutionary Psychology, 4 , 434–458.
Sweller, J., Ayres, P., & Kalyuga, S. (2011). Cognitive load theory . New York: Springer.
Sweller, J., van Merriënboer, J., & Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. Educational Psychology Review, 31 (2), 261–292.
Van Gog, T., Kester, L., & Paas, F. (2011). Effects of worked examples, example-problem, and problem-example pairs on novices’ learning. Contemporary Educational Psychology, 36 (3), 212–218. https://doi.org/10.1016/j.cedpsych.2010.10.004 .
Weaver, J. P., Chastain, R. J., DeCaro, D. A., & DeCaro, M. S. (2018). Reverse the routine: problem solving before instruction improves conceptual knowledge in undergraduate physics. Contemporary Educational Psychology, 52 , 36–47.
We would like to acknowledge the students, parents, staff, and leadership team of the Ballarat Clarendon College for their support with this research.
Authors and Affiliations
School of Education, University of New South Wales, Sydney, New South Wales, 2052, Australia
Greg Ashman, Slava Kalyuga & John Sweller
Correspondence to Greg Ashman.
Prior to the study, approval was obtained from the Human Research Ethics Advisory Panel of the lead author’s institution.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Ashman, G., Kalyuga, S., & Sweller, J. Problem-solving or Explicit Instruction: Which Should Go First When Element Interactivity Is High? Educational Psychology Review, 32, 229–247 (2020). https://doi.org/10.1007/s10648-019-09500-5
Issue date: March 2020
- Productive failure
- Cognitive load theory
- Expertise reversal effect
- Element interactivity