Center for Teaching

Teaching Problem Solving


Tips and Techniques

Communicate

  • Have students identify specific problems, difficulties, or confusions. Don’t waste time working through problems that students already understand.
  • If students are unable to articulate their concerns, determine where they are having trouble by asking them to identify the specific concepts or principles associated with the problem.
  • In a one-on-one tutoring session, ask the student to work his/her problem out loud. This slows down the thinking process, making it more accurate and allowing you to assess understanding.
  • When working with larger groups, you can ask students to provide a written “two-column solution”: have students write up their solution to a problem by putting all their calculations in one column and all of their reasoning (in complete sentences) in the other column. This helps them think critically about their own problem solving and helps you more easily identify where they may be having trouble. (Examples: Two-Column Solution (Math), Two-Column Solution (Physics); a minimal sketch follows this list.)
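
As a minimal illustrative sketch (not one of the linked examples), a two-column solution for a simple algebra problem might look like this:

    Calculations          Reasoning
    2x + 6 = 12           The problem asks us to find x.
    2x = 6                Subtract 6 from both sides to isolate the x term.
    x = 3                 Divide both sides by 2 to solve for x.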

Encourage Independence

  • Model the problem-solving process rather than just giving students the answer. As you work through the problem, consider how a novice might struggle with the concepts and make your thinking clear.
  • Have students work through problems on their own. Ask directing questions or give helpful suggestions, but provide only minimal assistance and only when needed to overcome obstacles.
  • Don’t fear group work! Students can frequently help each other, and talking about a problem helps them think more critically about the steps needed to solve it. Additionally, group work helps students realize that problems often have multiple solution strategies, some more effective than others.

Be sensitive

  • Frequently, when working problems, students are unsure of themselves. This lack of confidence may hamper their learning. It is important to recognize this when students come to us for help, and to give each student some feeling of mastery. Do this by providing positive reinforcement to let students know when they have mastered a new concept or skill.

Encourage Thoroughness and Patience

  • Try to communicate that the process is more important than the answer, so that the student learns it is OK not to have an instant solution. This is learned through your acceptance of his/her pace of doing things, through your refusal to let anxiety pressure you into giving the right answer, and through your example of solving problems via a step-by-step process.

Experts (teachers) in a particular field are often so fluent in solving problems from that field that it can be difficult for them to articulate to novices (students) the problem-solving principles and strategies they use, because these principles and strategies are second nature to them. To teach students problem-solving skills, a teacher should be aware of the principles and strategies of good problem solving in his or her discipline.

The mathematician George Polya captured the problem-solving principles and strategies he used in his discipline in the book How to Solve It: A New Aspect of Mathematical Method (Princeton University Press, 1957). The book includes a summary of Polya’s problem-solving heuristic as well as advice on the teaching of problem solving.


Teaching problem solving

Strategies for teaching problem solving apply across disciplines and instructional contexts. First, introduce the problem and explain how people in your discipline generally make sense of the given information. Then, explain how to apply these approaches to solve the problem.

Introducing the problem

Explaining how people in your discipline understand and interpret these types of problems can help students develop the skills they need to understand the problem (and find a solution). After introducing how you would go about solving a problem, you could then ask students to:

  • frame the problem in their own words
  • define key terms and concepts
  • determine statements that accurately represent the givens of a problem
  • identify analogous problems
  • determine what information is needed to solve the problem

Working on solutions

In the solution phase, one develops and then implements a coherent plan for solving the problem. As you help students with this phase, you might ask them to:

  • identify the general model or procedure they have in mind for solving the problem
  • set sub-goals for solving the problem
  • identify necessary operations and steps
  • draw conclusions
  • carry out necessary operations

You can help students tackle a problem effectively by asking them to:

  • systematically explain each step and its rationale
  • explain how they would approach solving the problem
  • help you solve the problem by posing questions at key points in the process
  • work together in small groups (3 to 5 students) to solve the problem and then have the solution presented to the rest of the class (either by you or by a student in the group)

In all cases, the more you get the students to articulate their own understandings of the problem and potential solutions, the more you can help them develop their expertise in approaching problems in your discipline.

Evidence-Based Teaching Guides: Problem Solving

Instruction followed by problem solving.

  • In the PLTL approach, the instruction phase takes place in the traditional classroom, often in the form of lecture, and the problem-solving phase takes place as students work in collaborative groups (typically 6-10 students) facilitated by a trained undergraduate peer leader for 90-120 minutes each week.
  • This approach provides facilitated help to students in their courses, improves students’ problem-solving skills, enhances students’ communication abilities, and provides an active-learning experience for students.
  • The peer leader should not solve the problems with the students in the group, but should guide them to discuss their reasoning by asking probing questions and to participate equally by using different collaborative learning strategies (such as round robin, scribe, and pairs). Students decide as a group whether the answer is correct, which encourages them to consider the problem more deeply.
  • PLTL is used in many STEM undergraduate disciplines, including biology, chemistry, mathematics, physics, psychology, and computer science, and in all types of institutions. It has been used at all levels of undergraduate courses. If done well, PLTL can improve course grades, course and series retention, and standardized and course exam performance, and reduce DWF rates.
  • PLTL can also benefit peer leaders, who self-report greater content learning, improved study skills, improved interpersonal skills, increased leadership skills, and greater confidence.
  • In the constructivist framework, teaching is not the transmission of knowledge from the instructor to the student. The instructor is a facilitator or guide, giving structure to the learning process.
  • Students construct meaning (e.g., develop concepts and models) through active involvement with the material and by making sense of their experiences.
  • Social constructivism assumes that students’ understanding and sense-making are developed jointly in collaboration with other students.
  • The peer leader is considered to be an effective guide because they are in the zone of proximal development (ZPD) of the students in their PLTL group.
  • The PLTL program is integral to the course and integrated with other course components.
Peer leader training typically addresses:

  • how to effectively create a community of practice within their group, such that students will make joint decisions while solving the problems, discuss multiple approaches to solving the problems, and practice professional social and communication skills;
  • practice with questioning strategies to support students in deepening their discussion to include explanations for their ideas and problem-solving processes;
  • learning about how students learn based on psychology and education research, and how to apply this information while facilitating their group.
Problems for PLTL sessions should:

  • require students to work collaboratively to solve problems;
  • encourage students to engage deeply with the content (i.e., include prompts asking them to explain their reasoning or process), disciplinary vocabulary (i.e., include prompts asking them to define terms in their own words), and essential skills;
  • become more complex throughout the problem set while ensuring that the students are always within their Zone of Proximal Development.
  • Organizational arrangements promote active learning via focus on group size, room space, length of session, and low noise level.
  • The institution and department encourage and support innovative teaching.
  • In worked examples, the instruction takes the form of example problems. These problems include a problem statement and a step-by-step procedure for solving the problem, intended to show how an expert might solve this type of problem. After this explicit instruction, students complete practice problems like the worked examples.
  • Worked examples plus practice problems have been found to be beneficial when compared to instruction followed by problem solving alone. This benefit is observed for novices learning to solve complex problems but is lost as learners become more expert in the domain and is not observed for simple problems.
  • Worked examples provide guidance that can help students learn to do analogous problems (near transfer) and may have similar benefits to productive failure and scaffolded guided inquiry for near transfer.
  • Worked examples help students in early-to-intermediate stages of cognitive skill development as they are learning to abstract general principles for solving a given type of problem.
  • Comparing worked examples that focus on different types of problems can also help students identify deep features and abstract general principles that help them know when to use a given problem solving approach.
  • Problem-solving practice that incorporates strategies like retrieval and interleaving becomes more effective as students seek to become faster and more accurate.
Worked examples are more effective when they reduce cognitive load by:

  • Integrating sources of information (e.g., images integrated with explanatory text or auditory explanations),
  • Including visual cues to help students readily follow the explanation,
  • Fostering identification of subgoals within a problem, either by labeling or visually separating chunks of a problem solution corresponding to a subgoal.
  • Lessons should include at least two worked examples for a type of problem.
  • Worked examples should be accompanied by practice problems and should be interspersed throughout a lesson rather than combined in one section of the lesson.
  • Different problem types should use similar cover stories to emphasize deep problem features.
To get the most from worked examples, students can be prompted to:

  • Relate solutions to abstract principles.
  • Compare different examples and self-explain key similarities and differences.

  • If multiple worked examples for a given type of problem are used, it can be beneficial to remove guidance in stages (also known as fading). Backwards fading (leaving blanks later in the problems first, then earlier and earlier) has been found to be more beneficial than forward fading; a sketch follows this list.
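
As a minimal illustrative sketch (the equations here are assumptions, not from the guide), backwards fading across three worked examples of two-step equations might look like this:

    Example 1 (fully worked):          3x + 4 = 19  →  3x = 15  →  x = 5
    Example 2 (last step faded):       2x + 7 = 15  →  2x = 8   →  x = ___
    Example 3 (last two steps faded):  5x + 2 = 17  →  ___      →  ___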


Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking


(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, if it can be specifically taught and, if so, how teachers can do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom.

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she serves as a teacher, instructional coach, and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, the Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than on a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the photo was strikingly similar to a photo that went viral with a young lady standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to critically think about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not mess up and offend anyone, as the one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.


‘Explore-Before-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills as well as the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to focus on the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. What having an explore-before-explain mindset means is that in our planning, we prioritize giving students firsthand experiences with data, allow students to construct evidence-based claims that focus on conceptual understanding, and challenge students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze it, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might see if they completed the investigation again. 4) They can scrutinize outlying data points to decide whether they reflect a true difference that merits further exploration or a misstep in the procedure, measuring device, or measurement. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.


An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decision-making, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students often spend class time filling out worksheets that promote high compliance but little engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks for students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic—policing in America—to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.


Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can, also conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of 5 is a heckuva score. Then, just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they can at least make one of their shots. Nobody is successful in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was akin to student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, for teachers, it is near impossible for them to hit a target that is moving and that they cannot see.

Within the world of education and particularly as educational leaders, we have failed to simplify what student engagement looks like, and it is impossible to define or articulate what student engagement looks like if we cannot clearly articulate what critical thinking is and looks like in a classroom. Because, simply, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must familiarize themselves with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

PLANNING

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts to allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE – Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions forcing higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
ASSESSMENT

  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no single correct answer also means less control. This is a tough ask for some teachers. Explained differently: if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.


Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.


Overview of the Problem-Solving Mental Process

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

Rachel Goldman, PhD, FTOS, is a licensed psychologist, clinical assistant professor, speaker, and wellness expert specializing in eating behaviors, stress management, and health behavior change.

  • Identify the Problem
  • Define the Problem
  • Form a Strategy
  • Organize Information
  • Allocate Resources
  • Monitor Progress
  • Evaluate the Results

Frequently Asked Questions

Problem-solving is a mental process that involves discovering, analyzing, and solving problems. The ultimate goal of problem-solving is to overcome obstacles and find a solution that best resolves the issue.

The best strategy for solving a problem depends largely on the unique situation. In some cases, people are better off learning everything they can about the issue and then using factual knowledge to come up with a solution. In other instances, creativity and insight are the best options.

It is not necessary to follow problem-solving steps sequentially; it is common to skip steps or even go back through steps multiple times until the desired solution is reached.

In order to correctly solve a problem, it is often important to follow a series of steps. Researchers sometimes refer to this as the problem-solving cycle. While this cycle is portrayed sequentially, people rarely follow a rigid series of steps to find a solution.

The following steps, which include developing strategies and organizing knowledge, outline one version of that cycle.

1. Identifying the Problem

While it may seem like an obvious step, identifying the problem is not always as simple as it sounds. In some cases, people might mistakenly identify the wrong source of a problem, which will make attempts to solve it inefficient or even useless.

Some strategies that you might use to figure out the source of a problem include:

  • Asking questions about the problem
  • Breaking the problem down into smaller pieces
  • Looking at the problem from different perspectives
  • Conducting research to figure out what relationships exist between different variables

2. Defining the Problem

After the problem has been identified, it is important to fully define the problem so that it can be solved. You can define a problem by operationally defining each aspect of the problem and setting goals for what aspects of the problem you will address.

At this point, you should focus on figuring out which aspects of the problem are facts and which are opinions. State the problem clearly and identify the scope of the solution.

3. Forming a Strategy

After the problem has been identified, it is time to start brainstorming potential solutions. This step usually involves generating as many ideas as possible without judging their quality. Once several possibilities have been generated, they can be evaluated and narrowed down.

The next step is to develop a strategy to solve the problem. The approach used will vary depending upon the situation and the individual's unique preferences. Common problem-solving strategies include heuristics and algorithms.

  • Heuristics are mental shortcuts that are often based on solutions that have worked in the past. They can work well if the problem is similar to something you have encountered before and are often the best choice if you need a fast solution.
  • Algorithms are step-by-step strategies that are guaranteed to produce a correct result. While this approach is great for accuracy, it can also consume time and resources.

Heuristics are often best used when time is of the essence, while algorithms are a better choice when a decision needs to be as accurate as possible.
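
To make this contrast concrete, here is a minimal Python sketch (the task and all names are illustrative assumptions, not from the article): a fixed number of refinement passes serves as a fast heuristic for estimating a square root, while bisection plays the role of an algorithm that systematically narrows in on the answer to a stated tolerance.

    # Heuristic: refine an initial guess a fixed number of times.
    # Fast, usually close, but accuracy is not guaranteed.
    def heuristic_sqrt(n, guess=1.0, rounds=5):
        for _ in range(rounds):
            guess = (guess + n / guess) / 2  # average the guess with n/guess
        return guess

    # Algorithm: bisection. Slower, but systematically narrows the interval
    # until the answer is within the stated tolerance.
    def algorithmic_sqrt(n, tolerance=1e-12):
        low, high = 0.0, max(n, 1.0)
        while high - low > tolerance:
            mid = (low + high) / 2
            if mid * mid < n:
                low = mid   # true square root lies above mid
            else:
                high = mid  # true square root lies at or below mid
        return (low + high) / 2

    print(heuristic_sqrt(2.0))    # quick estimate of the square root of 2
    print(algorithmic_sqrt(2.0))  # systematic answer, accurate to ~1e-12

The trade-off mirrors the advice above: the heuristic finishes in a handful of operations, while the bisection loop keeps running until its accuracy guarantee is met.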

4. Organizing Information

Before coming up with a solution, you need to first organize the available information. What do you know about the problem? What do you not know? The more information that is available, the better prepared you will be to come up with an accurate solution.

When approaching a problem, it is important to make sure that you have all the data you need. Making a decision without adequate information can lead to biased or inaccurate results.

5. Allocating Resources

Of course, we don't always have unlimited money, time, and other resources to solve a problem. Before you begin to solve a problem, you need to determine how high priority it is.

If it is an important problem, it is probably worth allocating more resources to solving it. If, however, it is a fairly unimportant problem, then you do not want to spend too much of your available resources on coming up with a solution.

At this stage, it is important to consider all of the factors that might affect the problem at hand. This includes looking at the available resources, deadlines that need to be met, and any possible risks involved in each solution. After careful evaluation, a decision can be made about which solution to pursue.

6. Monitoring Progress

After selecting a problem-solving strategy, it is time to put the plan into action and see if it works. This step might involve trying out different solutions to see which one is the most effective.

It is also important to monitor the situation after implementing a solution to ensure that the problem has been solved and that no new problems have arisen as a result of the proposed solution.

Effective problem-solvers tend to monitor their progress as they work towards a solution. If they are not making good progress toward reaching their goal, they will reevaluate their approach or look for new strategies.

7. Evaluating the Results

After a solution has been reached, it is important to evaluate the results to determine if it is the best possible solution to the problem. This evaluation might be immediate, such as checking the results of a math problem to ensure the answer is correct, or it can be delayed, such as evaluating the success of a therapy program after several months of treatment.

Once a problem has been solved, it is important to take some time to reflect on the process that was used and evaluate the results. This will help you to improve your problem-solving skills and become more efficient at solving future problems.

A Word From Verywell

It is important to remember that there are many different problem-solving processes with different steps, and this is just one example. Problem-solving in real-world situations requires a great deal of resourcefulness, flexibility, resilience, and continuous interaction with the environment.


You can become a better problem solver by:

  • Practicing brainstorming and coming up with multiple potential solutions to problems
  • Being open-minded and considering all possible options before making a decision
  • Breaking down problems into smaller, more manageable pieces
  • Asking for help when needed
  • Researching different problem-solving techniques and trying out new ones
  • Learning from mistakes and using them as opportunities to grow

It's important to communicate openly and honestly with your partner about what's going on. Try to see things from their perspective as well as your own. Work together to find a resolution that works for both of you. Be willing to compromise and accept that there may not be a perfect solution.

Take breaks if things are getting too heated, and come back to the problem when you feel calm and collected. Don't try to fix every problem on your own—consider asking a therapist or counselor for help and insight.

If you've tried everything and there doesn't seem to be a way to fix the problem, you may have to learn to accept it. This can be difficult, but try to focus on the positive aspects of your life and remember that every situation is temporary. Don't dwell on what's going wrong—instead, think about what's going right. Find support by talking to friends or family. Seek professional help if you're having trouble coping.


3 Simple Strategies to Improve Students’ Problem-Solving Skills

These strategies are designed to make sure students have a good understanding of problems before attempting to solve them.


Research provides a striking revelation about problem solvers: the best problem solvers approach problems very differently than novices do. For instance, one meta-study showed that when experts evaluate graphs, they tend to spend less time on tasks and answer choices and more time on evaluating the axes’ labels and the relationships of variables within the graphs. In other words, they spend more time up front making sense of the data before moving to addressing the task.

While slower in solving problems, experts use this additional up-front time to solve the problem more efficiently and effectively. In one study, researchers found that experts were much better than novices at “information extraction,” or pulling out the information they needed to solve the problem later in the process. This was because they started the problem-solving process by evaluating specific assumptions within problems, asking predictive questions, and then comparing and contrasting their predictions with results. For example, expert problem solvers look at the problem context and ask a number of questions:

  • What do we know about the context of the problem?
  • What assumptions are underlying the problem? What’s the story here?
  • What qualitative and quantitative information is pertinent?
  • What might the problem context be telling us? What questions arise from the information we are reading or reviewing?
  • What are important trends and patterns?

As such, expert problem solvers don’t jump to the presented problem or rush to solutions. They invest the time necessary to make sense of the problem.

Now, think about your own students: Do they immediately jump to the question, or do they take time to understand the problem context? Do they identify the relevant variables, look for patterns, and then focus on the specific tasks?

If your students are struggling to develop the habit of sense-making in a problem-solving context, this is a perfect time to incorporate a few short and sharp strategies to support them.

3 Ways to Improve Student Problem-Solving

1. Slow reveal graphs: The brilliant strategy crafted by K–8 math specialist Jenna Laib and her colleagues provides teachers with an opportunity to gradually display complex graphical information, building students’ skills in questioning, sense-making, and evaluating predictions.

For instance, in one third-grade class, students are given a bar graph without any labels or identifying information except for bars emerging from a horizontal line on the bottom of the slide. Over time, students learn about the categories on the x-axis (types of animals) and the quantities specified on the y-axis (number of baby teeth).

The graphs and the topics range in complexity from studying the standard deviation of temperatures in Antarctica to the use of scatterplots to compare working hours across OECD (Organization for Economic Cooperation and Development) countries. The website offers a number of graphs on Google Slides and suggests questions that teachers may ask students. Furthermore, this site allows teachers to search by type of graph (e.g., scatterplot) or topic (e.g., social justice).

2. Three reads: The three-reads strategy tasks students with evaluating a word problem in three different ways. First, students encounter a problem without having access to the question—for instance, “There are 20 kangaroos on the grassland. Three hop away.” Students are expected to discuss the context of the problem without emphasizing the quantities. For instance, a student may say, “We know that there are a total amount of kangaroos, and the total shrinks because some kangaroos hop away.”

Next, students discuss the important quantities and what questions may be generated. Finally, students receive and address the actual problem. Here they can both evaluate how close their predicted questions were from the actual questions and solve the actual problem.

To get started, consider using the numberless word problems on educator Brian Bushart’s site. For those teaching high school, consider using your own textbook word problems for this activity. Simply create three slides to present to students that include context (e.g., on the first slide state, “A salesman sold twice as many pears in the afternoon as in the morning”). The second slide would include quantities (e.g., “He sold 360 kilograms of pears”), and the third slide would include the actual question (e.g., “How many kilograms did he sell in the morning and how many in the afternoon?”). One additional suggestion for teams to consider is to have students solve the questions they generated before revealing the actual question.
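
For reference, here is one way the pears problem works out (a sketch of the arithmetic, not part of the original activity):

    morning = x kg, afternoon = 2x kg
    x + 2x = 360, so x = 120
    He sold 120 kg in the morning and 240 kg in the afternoon.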

3. Three-Act Tasks: Originally created by Dan Meyer, three-act tasks follow the three acts of a story. The first act is typically called the “setup,” followed by the “confrontation” and then the “resolution.”

This storyline process can be used in mathematics in which students encounter a contextual problem (e.g., a pool is being filled with soda). Here students work to identify the important aspects of the problem. During the second act, students build knowledge and skill to solve the problem (e.g., they learn how to calculate the volume of particular spaces). Finally, students solve the problem and evaluate their answers (e.g., how close were their calculations to the actual specifications of the pool and the amount of liquid that filled it).

Often, teachers add a fourth act (i.e., “the sequel”), in which students encounter a similar problem but in a different context (e.g., they have to estimate the volume of a lava lamp). There are also a number of elementary examples that have been developed by math teachers, including GFletchy, which offers pre-kindergarten to middle school activities including counting squares, peas in a pod, and shark bait.

Students need to learn how to slow down and think through a problem context. The aforementioned strategies are quick ways teachers can begin to support students in developing the habits needed to effectively and efficiently tackle complex problem-solving.

Schema Instruction

How does this practice align?

High-Leverage Practices

  • HLP14: Teach cognitive and metacognitive strategies to support learning and independence

CCSSM: Standards for Mathematical Practice

  • MP7: Look for and make use of structure.

Another effective strategy for helping students improve their mathematics performance is related to solving word problems. More specifically, it involves teaching students how to identify word problem types based on a given problem’s underlying structure, or schema. Before learning about this strategy, however, it is helpful to understand why many students struggle with word problems in the first place.

Difficulty with Word Problems

Most students, especially those with mathematics difficulties and disabilities, have trouble solving word problems. This is in large part because word problems require students to:

  • Read and understand the text, including mathematics vocabulary
  • Be able to identify and separate relevant information from irrelevant information
  • Represent the problem correctly
  • Choose an appropriate strategy for solving the problem
  • Perform the computational procedures
  • Check the answer to ensure that it makes sense (adapted from Stevens & Powell, 2016; Jitendra et al., 2015; Jitendra et al., 2013)

Students who experience difficulty with any of the steps listed above, such as students who struggle with mathematics, will likely arrive at an incorrect answer.

Research Shows

  • Students with mathematical difficulties and disabilities struggle more than their peers when solving word problems. (Stevens & Powell, 2016; Jitendra et al., 2015; Fuchs et al., 2010)
  • Schema instruction—explicit instruction in identifying word problem types, representing them correctly, and using an effective method for solving them—has been found to be effective among students with mathematical difficulties and disabilities. (Jitendra et al., 2016; Jitendra et al., 2015; Jitendra et al., 2009; Montague & Dietz, 2009; Fuchs et al., 2010)
  • Teaching students how to solve word problems by identifying word problem types is more effective than teaching them only to identify key words (e.g., “altogether,” “difference”). (Jitendra, Griffin, Deatline-Buchman, & Sczesniak, 2007)

Word Problem Structures

To help students become more proficient at solving word problems, teachers can help students recognize the problem schema, which refers to the underlying structure of the problem or the problem type (e.g., adding or combining two or more sets, finding the difference between two sets). This, in turn, leads to an associated strategy for solving that problem type. There are two main types of schemas: additive and multiplicative. Below, we will introduce you to additive schemas before moving on to descriptions and examples of multiplicative schemas.

Additive Schemas

Additive schemas can be used for addition and subtraction problems. These schemas are effective for students in early elementary school through middle school. Below are a few examples of additive schemas used to solve word problems: total, difference, and change.

Total

  • Involves adding or combining two or more distinct sets (each set representing a part) that are put together to form a total.
  • Also known as part-part-whole or combine .
  • Students might solve for any unknown in the equation.
  • Can be used with a variety of types of numbers (e.g., whole, fractions, decimals).

part 1 + part 2 = total

Sam has 2 cookies. Ali has 3 cookies. How many cookies do they have altogether?

2 + 3 = ___

There are 6 students in the classroom and some more students in the hallway. There are 20 students in all. How many students are in the hallway?

6 + ___ = 20

Difference

  • Involves comparing and finding the difference between two sets.
  • Also known as compare.

greater - less = difference

The small dog has 3 spots. The large dog has 7 spots. How many more spots does the large dog have than the small dog?

7 - 3 = ___

Cy has 3 more pencils than Brody. Cy has 7 pencils. How many pencils does Brody have?

7 - ___ = 3

Ava has 9 fewer points than Giovani. Ava has 2 points. How many points does Giovani have?

___ - 2 = 9

Change

  • Involves finding the increase or decrease in the quantity of the same set (i.e., there is one set and something happens to that set).
  • Can involve multiple changes to the same set.
  • Change schemas differ from total and difference schemas in that they involve a change in the set over time.
  • Students might solve for any number in the equation.

start ± change = end

Carly has 3 ribbons. Shay gives her 2 ribbons. How many ribbons does Carly have now?

3 + 2 = ___

Carly has 3 ribbons. She gave Shay 1 ribbon. How many ribbons does Carly have now?

3 - 1 = ___

Misha has 9 suckers. Kaheen gave her some more suckers. Now she has 12 suckers. How many did Kaheen give her?

9 + ___ = 12

Misha has some suckers. Kaheen gave her 4 suckers. Now Misha has 11 suckers. How many suckers did Misha have to begin with?

___ + 4 = 11

(Adapted from Stevens & Powell, 2016; Morales, Shute, & Pellegrino, 1985)

For Your Information

Even when they apply the same schema to solve a word problem, students will likely approach its solution in a variety of ways. An example of this can be found below.

Problem: Emma had nine dollars. Then she earned some more money doing her chores. Now Emma has $12. How much money did she earn?

Two students, A and B, set up the problem using the change schema.

9 + ___ = 12

However, Student A solves the problem by subtracting 12 – 9. Student B solves the problem by counting on from 9. Although one student adds and the other subtracts, both students arrive at the correct solution. This example illustrates that the operation is secondary to the structure of the word problem.
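
To see how a schema separates a problem’s structure from the operation used to solve it, here is a minimal Python sketch (the function and argument names are illustrative assumptions, not from the IRIS module). Any additive schema of the form part 1 + part 2 = total can be solved for whichever value is missing:

    # Additive schema: part1 + part2 = total, with exactly one value unknown.
    def solve_additive(part1=None, part2=None, total=None):
        if total is None:
            return part1 + part2  # both parts known: add to find the total
        if part1 is None:
            return total - part2  # a part is unknown: subtract from the total
        return total - part1

    print(solve_additive(part1=2, part2=3))   # Sam and Ali's cookies: 2 + 3 = 5
    print(solve_additive(part1=6, total=20))  # students in the hallway: 6 + ___ = 20, so 14
    print(solve_additive(part1=9, total=12))  # Emma's chore money: 9 + ___ = 12, so 3

As with Students A and B above, the solver finds a missing part by subtraction even though the schema is written as addition; the operation is secondary to the structure of the word problem.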

Multiplicative Schemas

Multiplicative schemas can be used to solve multiplication and division problems. There are three main types of multiplicative schemas: equal groups, comparison, and ratio/proportion.

Equal Groups

  • Involves multiplying or dividing groups where there is an equal number in each group.
  • Students often encounter these types of word problems on standardized tests during 3rd and 4th grades and on into middle school.

groups times number in each group equals product

Tara has 6 bags of oranges. There are 4 oranges in each bag. How many oranges does Tara have?

six times four equals blank

Matthew has 20 comic books. His bookshelf has 5 shelves. He wants to put an equal number of comic books on each shelf. How many comic books will he put on each shelf?

five times blank equals twenty

Comparison

  • Involves multiplying a set a given number of times.
  • Students often encounter these types of word problems on standardized tests during 4th and 5th grades and into middle school.

set times number of times equals product

Mai has 6 pieces of candy. Kyla has 2 times as many pieces of candy. How many pieces of candy does Kyla have?

six times two equals blank

Pedro has 7 video games. Bronwynn has 21 video games. How many times as many video games does Bronwynn have as Pedro?

seven times blank equals twenty-one

Ratios/Proportions

  • Involves finding the relationship between two numbers.
  • Students often encounter these types of word problems on standardized tests during upper elementary through middle school.

compared divided by base equals ratio

Example: On Saturday, Naoki worked in the hot sun for 10 hours, helping to clean up and revitalize a neighborhood park. To prevent dehydration, she took a 5-minute water break every hour. What proportion of time did Naoki spend working compared to taking breaks?

one hour converted to minutes divided by a base of five minutes equals the ratio

Note: To solve this problem, the student first converted hours to minutes so that he could work with the same unit.

sixty divided by five equals twelve over one

Note: The student determines that the ratio for working to taking breaks is 12 minutes of work to 1 minute on break.

Source: Jitendra, Star, Dupuis, & Rodriguez, 2013
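As with the additive schemas, each multiplicative schema reduces to one equation with a single blank. The short sketch below is our own illustration (not from the source); the worked numbers come from the examples above:

```python
# A minimal sketch (not from the source): each multiplicative schema fills
# the slots of "factor1 * factor2 = product", with one blank to solve for.

# Equal groups: groups * number in each group = product.
# Tara: 6 bags * 4 oranges per bag = __
print(6 * 4)       # 24 oranges

# Comparison: set * number of times = product.
# Pedro has 7 games; Bronwynn has 21: 7 * __ = 21
print(21 // 7)     # 3 times as many

# Ratio/proportion: compared / base = ratio.
# Naoki: 60 minutes of work per 5-minute break: 60 / 5 = __
print(60 / 5)      # 12.0, i.e., a 12:1 work-to-break ratio
```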

Combined Schemas

As students advance in school, they will encounter new kinds of mathematical problems with new underlying structures, or schemas. They will also encounter multi-step mathematics problems. The example below illustrates a percent change problem, which involves the combination of two schemas: a multiplicative and an additive.

plus or minus change divided by original equals percent change

Note: After the student solves for the missing value in the equation above, he enters it along with the provided information in the equation below to solve the problem.

Example: Mark is interested in buying a car. The car costs $3,200. He will receive a 10% discount if he buys the car this weekend. How much will he pay for the car?

Solution equation (to determine the amount of change):

plus or minus change divided by thirty-two hundred dollars equals ten divided by one hundred as percent change

After the student solves for the “change,” which is $320, he will then create another solution equation to find the “new total.”

Solution equation (to determine the “new total”):

thirty-two hundred dollars minus three hundred twenty dollars equals two thousand eight hundred eighty dollars

The student determines that with the 10% discount Mark will pay $2,880.
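To mirror the two solution equations in code, here is a brief sketch (our own, not from the source; the variable names are ours):

```python
# A minimal sketch (not from the source) of the combined percent-change
# schema: a multiplicative step followed by an additive step.

original = 3200        # price of the car in dollars
percent = 10           # size of the discount in percent

# Multiplicative step: change / original = percent / 100
change = original * percent / 100    # 320.0

# Additive step: start minus change equals end (minus, since it is a discount)
new_total = original - change        # 2880.0

print(f"Discount: ${change:.0f}; Mark pays ${new_total:.0f}")
```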

Sarah Powell, who has conducted extensive research on schema instruction, discusses the underlying focus of this strategy (time: 2:40).

Sarah Powell, PhD, Assistant Professor, Special Education, University of Texas at Austin


Transcript: Sarah Powell, PhD

The thing with schemas is that you cannot define word problems by their operation. So you cannot describe a word problem as being a subtraction problem or being a division problem. Instead, you have to describe the word problem at a deeper level and that is describing the word problem by its schema. And sometimes I like to use the word structure . It’s really important to use schemas or structures so that students have consistency with problem solving. If we teach the structure of combined problems in 1st grade and 2nd grade, students continue to see that schema in 3rd grade, 4th grade, and 5th grade. Now the numbers may get greater. So instead of adding three plus nine, they might be adding 133 plus 239. But the structure is the same. And so one of the things that we are trying to do with our math standards that guide most instruction in the United States is to provide consistency in math learning across grade levels, and the schemas really help do that, so you see that combined structure again and again.

Now in middle school grades, you see it in a slightly different way. It might be part of a multi-step problem, but it’s still there, so that every year we don’t have to reteach problem solving. We just help students say, “Oh, now we’re looking at a total schema but with fractions. Now here’s a total schema but with decimals.” And so there’s a lot of consistency that’s provided with the schemas. And right now problem solving is really taught grade-by-grade. So how do I solve 2nd-grade word problems, or how do I solve 5th-grade word problems? And that’s not a good way of thinking about it. It’s better if we focus on the schema and think about this grade-level continuum of problem solving. And that would just make problem solving so much easier for students and also easier for teachers, because then they’re not going back to square one every year and starting over with how do I teach problem solving in 5th grade?

I would argue that problem solving is the most important thing you have to teach, because when we look at high-stakes assessments—and that’s where students show their mathematics competency—for word problems, students have to take the numbers and manipulate the numbers. It’s very difficult. Problem solving should be the primary focus of the mathematics curriculum, and instead of teaching problem solving as supplementary to math instruction, problem solving should really be taught as the way that we learn mathematics. And we need to get students to be thinkers of mathematics, not just doers of mathematics.

Teaching Word Problem Structures

As when teaching any strategy, teachers should use explicit, systematic instruction when introducing schema instruction, sometimes referred to as schema-based instruction (SBI). Although the same process is used to teach any schema, for illustrative purposes, the steps for how to teach the combine schema are outlined in the box below.

(Adapted from Stevens & Powell, 2016)

Teachers should make sure that students have mastered one schema (e.g., combine) before introducing a different problem type (e.g., compare). This reduces the possibility of students confusing one schema type with another during the learning process.


Instruction Then Problem-Solving, or Vice Versa?


by Kristen Mosley

The need and desire to actively involve students in learning experiences is evergreen, yet the timing or ways in which an instructor can do so vary widely. While some instructors prefer instruction-first approaches, others take an inverted route with problem-solving first approaches. Yet the question remains: which design is more impactful for student learning?

Let’s first break down the two approaches:

1. Instruction-first approach: this instructional design is sequenced as it is named. First, students receive instruction on a topic. Next, students engage in opportunities to apply or problem-solve with the topic at hand. The thinking behind this traditional approach is that if students lack prior knowledge on the topic, then receiving direct instruction prior to engaging in problem-solving should cue them to the key features of the problems to be solved. Equipping students with this knowledge prior to asking them to engage in novel problem solving should therefore lessen the burden on their working memory capacity. 1

2. Problem-solving first instruction: this instructional design is sequenced in the opposite fashion of its predecessor. First, students engage in a novel problem involving a yet-to-be-learned topic. Then, students receive instruction on the topic that they just explored. Proponents of this instructional design believe this approach better prepares students for future learning and applying, as they will inevitably find themselves in novel situations and with inadequate prior knowledge. 2 In essence, this approach emphasizes learner agency over attempts to control working memory capacity, as this hurdle is inherent to most problem-solving situations.

Of additional note, one more term is needed to understand these two different approaches and how they have been found to differentially impact student outcomes. Productive failure is a unique component of problem-solving first instruction in which students are engaged in problem-solving that is intentionally designed to result in failure. 3 However, this instructional aspect is not inherent to problem-solving first approaches. That is, while productive failure necessitates the use of problem-solving first instruction, the use of problem-solving first instruction does not require the use of productive failure. Instead, productive failure is a subfeature that can, and perhaps should, 4 be used within problem-solving first instructional designs.

A recent meta-analysis examined 53 studies that compared the impact of instruction-first and problem-solving first approaches on student learning. Roughly one-third of the comparisons in this study (36.7%) examined research with undergraduates, and roughly half of all comparisons in the study examined differences in student outcomes that resulted from quasi-experimental designs within real classroom settings. 4 The findings therefore hold critical implications for undergraduate teaching and learning. 

Amongst the many findings of this study, a significant effect size in favor of problem-solving first approaches (Hedges’ g = 0.36) demonstrated that, when comparing students’ conceptual knowledge and transfer outcomes across the two instructional designs, a problem-solving first approach was more effective than an instruction-first approach. 4 Having found this effect for conceptual knowledge and transfer outcomes, though, it is important to also note where effect sizes were not statistically significant.

The benefits of problem-solving first instruction did not prove significantly greater than those of instruction-first approaches for procedural learning, nor were they as strong when the instruction did not maintain fidelity to productive failure design. That is, the results suggest that problem-solving first instruction that includes a productive failure design outperforms problem-solving first instruction alone. 4
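For readers unfamiliar with the metric, Hedges’ g expresses the difference between two group means in pooled standard deviation units, with a correction for small samples. The sketch below is our own illustration (not from the meta-analysis; the sample numbers are invented):

```python
# A minimal sketch (not from the study): computing Hedges' g for two
# independent groups. The data below are made up for demonstration only.
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference with a small-sample correction."""
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd           # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor J
    return d * correction

# Hypothetical posttest scores: problem-solving-first vs. instruction-first
print(round(hedges_g(mean1=78, mean2=72, sd1=15, sd2=16, n1=40, n2=40), 2))  # ~0.38
```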

In sum, evidence suggests problem-solving first instruction, as compared to instruction-first approaches, is a stronger facilitator of students’ long-term, conceptual learning as well as their ability to later transfer this learning to other domains. The benefits of problem-solving first instruction were also strongest when conducted in tandem with productive failure…which is the topic of our next post! Come back next week to learn more about productive failure design and how this approach can be incorporated into your instruction. 

1. Sweller, J. (2020). Cognitive load theory and educational technology. Educational Technology Research and Development, 68(1), 1–16.

2. Loibl, K., Roll, I., & Rummel, N. (2017). Towards a theory of when and how problem solving followed by instruction supports learning. Educational Psychology Review, 29(4), 693–715. https://doi.org/10.1007/s10648-016-9379-x

3. Kapur, M. (2008). Productive failure. Cognition and Instruction, 26(3), 379–424. https://doi.org/10.1080/07370000802212669

4. Sinha, T., & Kapur, M. (2021). When problem solving followed by instruction works: Evidence for productive failure. Review of Educational Research, 91(5), 761–798. https://doi.org/10.3102/00346543211019105



Cognition and Instruction/Problem Solving, Critical Thinking and Argumentation

We are constantly surrounded by ambiguities, falsehoods, challenges, and situations in our daily lives that require our Critical Thinking, Problem Solving, and Argumentation skills. While these three terms are often used interchangeably, they are notably different. Critical thinking enables us to actively engage with information that we are presented with through all of our senses, and to think deeply about such information. This empowers us to analyse, critique, and apply knowledge, as well as create new ideas. Critical thinking can be considered the overarching cognitive skill underlying problem solving and argumentation. With critical thinking, although there are logical conclusions we can arrive at, there is not necessarily a 'right' idea; what may seem 'right' is often very subjective.

Problem solving is a form of critical thinking that confronts learners with decisions about the best possible solutions to both well-defined and ill-defined problems, often with no single right answer. One method of engaging with problem solving is with tutor systems such as Cognitive Tutor, which can modify problems for individual students as well as track their progress in learning. Particular to problem solving is Project Based Learning, which focuses the learner on solving a driving question, placing the student at the centre of the learning experience by conducting an extensive investigation. Problem Based Learning focuses on real-life problems that motivate the student through experiential learning. Further, Design Thinking uses a specific scaffold system to encourage learners to develop a prototype that solves a real-world problem through a series of steps; empathy, practical design principles, and refinement of prototypes demonstrate critical thought throughout this process.

Likewise, argumentation is a critical thinking process that does not necessarily involve singular answers, hence the requirement for negotiation in argumentative thought. More specifically, argumentation involves using reasoning to support or refute a claim or idea. In comparison, problem solving may lead to a single solution that could be considered empirical.

This chapter provides a theoretical overview of these three key topics: the qualities of each, their relationship to each other, as well as practical classroom applications.

Learning Outcomes:

  • Defining Critical Thought and its interaction with knowledge
  • Defining Problem Solving and how it uses Critical Thought to develop solutions to problems
  • Introduce a Cognitive Tutor as a cognitive learning tool that employs problem solving to enhance learning
  • Explore Project Based Learning as a specific method of Problem Solving
  • Examine Design Thinking as a sub-set of Project Based Learning and its scaffold process for learning
  • Define Argumentation and how it employs a Critical Thought process
  • Examine specific methodologies and instruments of application for argumentation

Critical thinking

Critical thinking and its relationship to other cognitive skills

Critical thinking is an extremely valuable aspect of education. The ability to think critically often increases over the lifespan as knowledge and experience is acquired, but it is crucial to begin the process of this development as early on as possible. Research has indicated that critical thinking skills are correlated with better transfer of knowledge, while a lack of critical thinking skills has been associated with biased reasoning [1] . Before children even begin formal schooling, they develop critical thinking skills at home because of interactions with parents and caregivers [2] . As well, critical thinking appears to improve with explicit instruction [3] . Being able to engage in critical thought is what allows us to make informed decisions in situations like elections, in which candidates present skewed views of themselves and other candidates. Without critical thinking, people would fall prey to fallacious information and biased reasoning. It is therefore important that students are introduced to critical thought and are encouraged to utilize critical thinking skills as they face problems.

Defining critical thinking

In general, critical thinking can be defined as the process of evaluating arguments and evidence to reach a conclusion that is the most appropriate and valid among other possible conclusions. Critical thinking is a dynamic and reflective process, and it is primarily evidence-based [4] . Thinking critically involves being able to criticize information objectively and explore opposing views, eventually leading to a conclusion based on evidence and careful thought. Critical thinkers are skeptical of information given to them, actively seek out evidence, and are not hesitant to take on decision-making and complex problem solving tasks [5] . Asking questions, debating topics, and critiquing the credibility of sources are all activities that involve thinking critically. As outlined by Glaser (1941), critical thinking involves three main components: a disposition for critical thought, knowledge of critical thinking strategies, and some ability to apply the strategies [6] . Having a disposition for critical thought is necessary for applying known strategies.

Critical thinking, which includes cognitive processes such as weighing and evaluating information, leads to more thorough understanding of an issue or problem. As a type of reflection, critical thinking also promotes an awareness of one's own perceptions, intentions, feelings and actions. [7]

Critical thinking as a western construct

Critical thinking is considered to be essential for all democratic citizens

In modern education, critical thinking is taken for granted as something that people universally need and should acquire, especially at a higher educational level [8] [9] . However, critical thinking is a human construct [10] - not a scientific fact - that is tied to Ancient Greek philosophy and beliefs [11] .

The link to Ancient Greece relates both to Ancient Greek priorities of logic over emotion [11] and to its democratic principles. Various authors, including Elder & Paul [12], Moon [8], and Stanlick & Strawser [13], share the view that critical thinking questioning dates back to the time of Socrates. Likewise, Morgan & Saxton (2006) associate critical thinking with a fundamental requirement of all democratic citizens [14].

An additional connection with Ancient Greece involves the Socratic Method. The Socratic Method involves a conversation between two or more people in which they ask and answer questions to challenge each other’s theses using logic and reason [15]. Such debates are subject to the issue of objective/subjective dualism: the purpose of debate rests on the belief that there is a ‘right answer’, yet the very ability to conduct such a debate demonstrates the subjectivity of any thesis [15].

Because of this strong connection to Ancient Greece, critical thinking is generally considered to be a western construct. This is further amplified by another western construct called Bloom’s Taxonomy, which is considered to be the essence of critical thinking in modern education [16].

Since critical thinking is a human construct, notions of what constitutes critical thinking vary considerably from person to person. Moon (2007) lists 21 common notions of critical thinking provided by people from her workshops, and then provides her own 2-page definition of the term [8]. One view of critical thinking is that it involves a set of skills that enables one to reach defensible conclusions and make decisions in a domain or context in which one has some prior knowledge [10]. Another view is that critical thinking involves the use of systematic logic and reasoning, which, while not necessarily producing empirical answers, nevertheless uses a rational and scientific approach [17]. Ultimately, Moon concludes that there is no right or wrong definition [8].

Critical thinking in other parts of the world

Scholars argue that while the critical thinking construct is linked to western, democratic nations, that does not mean that non-western cultures do not possess or use similar constructs that involve critical thinking [18]. Instead, “there are different ways or forms of reasoning” [19]; for example, Asian approaches to debates involve finding connections between conflicting arguments in order for such ideas to coexist [18]. This is due to eastern values regarding face-saving [8]. In contrast, western approaches are often viewed as being competitive: attacking the views of others while defending one's own position. Despite this dichotomous generalisation, eastern and western approaches have more similarities than might first appear. With regard to the diplomatic Asian approach to debating, western approaches also involve compromise and negotiation for the very reason that ideas are often complex and there can be many ‘right’ answers [14]. Similarly, the extent to which other cultures adopt western notions of critical thinking is determined by cultural values. In Muslim cultures, for example, the value of critical thinking is linked to views on the appropriateness of voicing one’s opinions [20].

Disposition and critical thinking

It has been suggested that critical thinking skills alone are not sufficient for the application of critical thinking – a disposition for critical thinking is also necessary [5]. A disposition for critical thought differs from cognitive skills. A disposition is better explained as the ability to consciously choose a skill, rather than just the ability to execute the skill [4]. Having a disposition for critical thinking can include such things as genuine interest and ability in intellectual activities. Perkins et al. (2000) expand on the idea of the necessity of a critical thinking disposition, and indicate three aspects involved: an inclination for engaging in intellectual behaviours; a sensitivity to opportunities in which such behaviours may be engaged; and a general ability for engaging in critical thought [5]. Halpern (1998) suggests that this critical thinking disposition must include a willingness to continue with tasks that seem difficult, open-mindedness, and a habit of planning [5]. In fact, a cognitive skills study conducted by Clifford et al. (2004) discovered that a disposition for critical thinking was associated with better overall critical thinking skills [4].

These are characteristics of one's attitude or personality that facilitate the process of developing CT skills:

  • Inquisitive
  • Truthseeking
  • Open-minded
  • Confidence in reasoning

There are many factors that can influence one's disposition towards CT; the first of these is culture [5]. There are many aspects of culture that can impact the ability of people to think critically. For instance, religion can negatively impact the development of CT [5]. Many religions are founded upon faith, which often requires wholehearted belief without evidence or support. The nature of organized religion counters the very premise of CT, which is to evaluate the validity and credibility of any claim. Growing up in such an environment can be detrimental to the development of CT skills, dampening dispositions that question religious views or examine the validity of religion. Another cultural factor that can be detrimental to a CT disposition is that of authority [5]. When a child is raised under an authoritarian parenting style, it can be detrimental to many aspects of their life, but especially to their CT skills, as they are taught not to question the credibility of authority and often receive punishment if they do. This is also applicable in the classroom [5]. Classroom environments in which teachers do not foster an atmosphere of openness or do not allow students to question what they are taught can impede CT development as well. Classrooms where questions are rejected, or home environments in which there is a high level of parental power and control, can all affect the ability of students to think critically. What is more, such students will have been conditioned not to think this way for their entire lives [5]. However, despite these cultural limitations, there are ways in which a disposition for CT can be fostered in both the home and the classroom.

Classroom structure is a primary way in which CT dispositions can be highlighted. Fostering a classroom structure in which students are a part of the decision-making process about what they are studying can be very helpful in creating CT dispositions [5]. Such structures help students become invested in what they are learning and promote a classroom atmosphere in which students feel free to question the teacher, as well as other students' opinions and beliefs about different subjects. Allowing students the freedom to scrutinize and evaluate information they have been given is an effective way of creating a classroom environment that encourages students to develop CT dispositions. This freedom allows students to remain individuals within the larger classroom context, and gives them the power to evaluate and make decisions on their own. Allowing students to share power in the classroom can be extremely beneficial in helping them stay motivated and analytical of classroom teachings [5]. Teachers can also employ a variety of techniques that help students become autonomous in the classroom. Giving students the opportunity to take on different roles, such as making predictions and contemplating problems, can be effective in creating CT dispositions [5]. Allowing students to engage with the problems that are presented, instead of just teaching them what the teacher or textbook believes to be true, is essential for students to develop their own opinions and individual thought. In addition, gathering data and information on the subject is an important part of developing CT dispositions, as it allows students to find resources that they themselves can analyze and draw conclusions from on their own [5]. Using these aspects of CT, students can most effectively relate their findings to the predictions that were first made and critique the validity of those findings [5].

Self-regulation and critical thinking

In conjunction with instructing CT, teachers also need to keep in mind the self-regulation of their students. Students need to be able to maintain motivation and have a proactive attitude towards their own learning when learning a new skill. Phan (2010) argues that self-regulated students who set better goals take more personal responsibility for their learning, can maintain their motivation, are more cognitively flexible, and hence are more inclined to utilize CT. Since CT skills are highly reflective, they help in self-regulated learning (SRL), and in turn, self-regulatory strategies aid in developing CT skills. These two cognitive practices are assets to students’ growth and development [7].

Self-Regulation provides students with the basic meta-cognitive awareness required for proactive learning. This pro-activity allows students to engage in the cognitive processes of CT, such as evaluation, reflection and inference. Through one’s meta-cognitive ability to assess one’s own thoughts, one develops the capability to become autonomous in one’s learning [7] . Instead of having a supervisor overlook every task, the learner can progress at their own pace while monitoring their performance, thereby engaging in SRL. Part of this process would include periodic reflection upon the strategies that one uses when completing a task. This reflection can facilitate the student’s learning by using CT to evaluate which strategies best suit their own learning based on their cognitive needs.

The complex nature of CT suggests that it requires a long developmental process requiring guidance, practice and reinforcement. To facilitate this process, self-monitoring as a first step to self-regulation can jump-start reflective thought through assessing one’s own educational performance. This assessment promotes self-efficacy through generating motivational beliefs about one’s academic capabilities [7] . From there, through practice, students can extend their CT skills beyond themselves and into their educational contexts. With practice, students use their meta-cognitive strategies as a basis for developing CT in the long run.

Critical thinking strategies


Psychologists and educators have discovered many different strategies for the development of critical thinking. Among these strategies are some that may be very familiar, such as concept maps or Venn diagrams , as well as some that may be less familiar, such as appeal-question stimuli strategies [21] . Concept mapping is particularly useful for illustrating the relationships between ideas and concepts, while Venn diagrams are often used to represent contrasting ideas [21] .

Venn Diagrams

Venn diagrams are used frequently in elementary grade levels and continue to be used as a compare/contrast tool throughout secondary school. An example of a situation in which a Venn diagram activity may be appropriate is during a science class, where instructors may direct students to develop a Venn diagram comparing and contrasting different plants or animals.

Concept maps may be introduced in elementary grades, although they are most often used at the secondary and post-secondary levels. Concept maps are an interactive and versatile way to encourage students to engage with the course material; a key aspect of concept mapping is that it requires students to reflect on previously learned information and make connections. In elementary grades, concept maps can be introduced as a project, while later, possibly in college or university, students may use them as a study strategy. At the elementary level, students can use concept maps to make connections about the characters, settings, or plot in a story they have read. When introducing concept maps, teachers may provide students with a list of words or phrases and instruct the students to illustrate the connections between them in the form of a concept map.

Asking questions can also be a simple and engaging way to develop critical thought. Teachers may begin by asking the students questions about the material, and then encourage students to come up with their own questions. In secondary and post-secondary education, students may use questions as a way to assess the credibility of a source. At the elementary school level, questions can be used to assess students' understanding of the material, while also encouraging them to engage in critical thought by questioning the actions of characters in a story or the validity of an experiment. Appeal-question stimuli, developed by Svobodová, involve a process of students asking questions regarding their reading comprehension [21].

Discussions

Using discussions as a way to develop students’ critical thinking skills can be a particularly valuable strategy for teachers. Peer interactions provide a basis for developing particular critical thinking skills, such as perspective taking and cooperation, which may not be as easily taught through instruction. A large part of discussions, of course, is language. Klooster (2002) suggested that critical thinking begins with asking questions [21] . Similarly, Vygotsky has claimed that language skills can be a crucial precursor for higher level thought processes [2] . As children develop larger vocabularies, they are better able to understand reading material and can then begin to think abstractly about the material and engage in thoughtful discussions with peers about what they understood [2] .

Studies have indicated that cross-age peer discussions may be particularly helpful in facilitating the development of critical thinking. Cross-age peer groups can be effective because of the motivation children tend to have when working with peers of different ages [2] . Younger children often look up to the older children as mentors and valuable sources of knowledge and experience, while older children feel a sense of maturity and a responsibility to share their knowledge and experience with younger students [2] . These cross-age peer discussions also provide students with the challenge of tailoring their use of language to the other group members in order to make their points understandable [2] . An example of cross-age peer groups that is relatively common in Canadian schools is the big buddy programs, where intermediate grade students are assigned a primary grade buddy to help over the course of the school year. Big buddies may help their little buddies with projects, advice, or school events. The big buddy/little buddy programs can be effective as younger students look up to their big buddies, and the big buddies feel a responsibility to help their little buddy. One important factor to be considered with cross-age peer discussions, as noted by Hattie (2006), is that these discussions should be highly structured activities facilitated by a teacher in order to ensure that students understand their group responsibilities [2] .

The classroom environment

Having an environment that is a safe place for students to ask questions and share ideas is extremely valuable for creating a classroom that encourages critical thinking. It has been suggested that students are more likely to develop a disposition for critical thinking when they are able to participate in the organization and planning of their classroom and class activities [5] . In these classrooms, students are legitimately encouraged by their teacher to engage in the decision making process regarding the functioning of the classroom [5] . It is also important for teachers to model the desired types of critical thought, by questioning themselves and other authorities in a respectful and appropriate manner [5] . Studies have indicated higher levels of cognitive engagement among students in classrooms with teachers who are enthusiastic and responsive [22] . Therefore, teachers should be encouraging and inclusive, and allow student engagement in classroom planning processes when possible.

Critical questions

Research is increasingly supporting the idea that critical thinking can be explicitly taught [23] . The use of critical questioning in education is of particular importance, because by teaching critical questioning, educators are actively modelling critical thinking processes. One of the key issues with teaching critical thinking in education is that students merely witness the product of critical thinking on the part of the teacher, i.e. they hear the conclusions that the teacher has reached through critical thinking [9] . Whereas an experienced critical thinker uses critical questions, these questions are implicit and not normally verbalised. However, for students to understand critical questioning and critical thinking strategies, the students must see the process of critical thinking. Modelling the formation and sequencing of critical questions explicitly demonstrates the thought process of how one can reach a logical conclusion.

There are various methods of teaching critical questioning. The frameworks discussed below are among the most famous of these. All have their own strengths and weaknesses in terms of ease of use, complexity, and universality. Each of these methods approaches critical thinking with a specific definition of this human concept. As such, one’s own definition of critical thinking will likely affect one’s receptiveness to a specific critical questioning framework.

 Socrates

Socratic Method

One of the key features of western approaches to critical thinking is the importance of critical questioning, which is linked to the Socratic Method of the Ancient Greek tradition. Whether answering existing questions or creating new questions to be considered, critical thinking involves questions, whether explicit or implicit, conscious or unconscious [13]. Browne & Keeley (2006) base their definition of critical thinking specifically on the involvement of critical questions [24].

Answers to critical questions are not necessarily empirical. They may involve reasoning and be logical, but are nevertheless subject to alternative views from others, thus making all views both subjective and objective at the same time. Elder & Paul (2009) separate such critical questions into three categories [12] :

  • Questions that have a correct answer, which can be determined using knowledge
  • Questions that are open to subjective answers that cannot be judged
  • Questions that produce objective answers that are judged based on the quality of evidence and reasoning used

Books on critical questioning tend to be influenced heavily by the Socratic Method, and they make a distinction between ‘good’ and ‘bad’ questions. Good questions are those that are relevant to the topic at hand and that take a logical, systematic approach [14] [13], while bad questions are those that are not relevant to the topic, are superficial, and are sequenced haphazardly. Elder & Paul (2009) argue that “[i]t is not possible to be a good thinker and a poor questioner.” [25] In other words, if a person cannot think of relevant and logical questions, they will be unable to reach any rational conclusions.

Additionally, as indicated above, critical thinking requires more than just asking the right questions. There is a direct relationship between critical thinking and knowledge [23] . One can possess knowledge, but not know how to apply it. Conversely, one can have good critical questioning skills, but lack the knowledge to judge the merits of an answer.

In terms of teaching critical questioning using the Socratic Method, it is essential to appreciate that there is no set of questions that one can follow, since the type of critical questions needed is based on the actual context. Consequently, the examples presented by different authors vary quite considerably. Nevertheless, there are specific guidelines one can follow [26] :

  • Use critical questions to identify and understand the situation, issues, viewpoints and conclusions
  • Use critical questions to search for assumptions, ambiguity, conflicts, or fallacies
  • Use critical questions to evaluate the effects of the ideas

Part 1 of the Socratic Method is more of an information gathering stage, using questions to find out essential details, to clarify ideas or opinions, and to determine objectives. Part 2 uses the information from Part 1 and then uses questions to probe for underlying details that could provide reasons for critiquing the accuracy of the idea. Part 3 uses questions to reflect upon the consequences of such ideas.

Conklin (2012) separates the above three parts into six parts [27] :

  • Using questions to understand
  • Using questions to determine assumptions
  • Using questions to discover reasons / evidence
  • Using questions to determine perspectives
  • Using questions to determine consequences
  • Using questions to evaluate a given question

Here are some sample questions for each part [28] :

Questions for understanding:

  • Why do you think that?
  • What have you studied about this topic so far?
  • How does this relate to what you are studying now?

Questions that determine assumptions

  • How could you check that assumption?
  • What else could be assumed?
  • What are your views on that? Do you agree or disagree?

Questions that discover reasons / evidence

  • How can you be sure?
  • Why is this happening?
  • What evidence do you have to back up your opinion?

Questions that determine perspectives

  • How could you look at this argument another way?
  • Which perspective is better?

Questions that determine consequences

  • How does it affect you?
  • What impact does that have?

Questions that evaluate a given question

  • Why was I asked this question?
  • Which questions led to the most interesting answers?
  • What other questions should be asked?

Depending on the text, the Socratic Method can be extraordinarily elaborate, making it challenging for educators to apply. Conklin (2012) states that a teacher would need to spend time planning such questions in advance, rather than expect to produce them during a lesson [27] .

Bloom’s Taxonomy

Bloom’s Taxonomy was originally designed in 1956 to determine cognitive educational objectives and assess students’ higher-order thinking skills [29]. Since then, though, it has been adapted and used as a tool for promoting critical thinking skills, particularly through critical questioning [30]. These critical questions involve Bloom’s categories of understanding, applying, analysing, synthesising and evaluating. Such categories can be seen to relate to the Socratic Method promoted by other authors, i.e. the importance of questioning to understand, analyse and evaluate. Moon (2007) believes that “‘evaluation’, ‘reflection’ and ‘understanding’” are key aspects of critical thinking [8], which should therefore appear in any notion of critical thinking. At the same time, Bloom’s Taxonomy generates a natural set of questions that can be adapted to various contexts [31].

In one example, a teacher uses a picture of a New York speakeasy bar. Using Bloom’s Taxonomy, the teacher could ask and model the following critical questions [14] :

  • KNOWLEDGE: What do you see in the picture?
  • COMPREHENSION: What do people do in places like that?
  • ANALYSIS: Why are there so many policemen in the picture?
  • APPLICATION: What similar situations do we see nowadays?
  • SYNTHESIS: What if there were no laws prohibiting such behaviour?
  • EVALUATION: How would you feel if you were one of these people? Why?


Norman Webb’s Depth of Knowledge

Webb’s Depth of Knowledge (DOK) taxonomy was produced in 2002 in response to Bloom’s Taxonomy [32]. In contrast with Bloom’s Taxonomy, Webb’s DOK considers thinking in terms of its complexity rather than its difficulty [32].

Webb’s DOK has four levels:

  • Recall & reproduction
  • Working with skills & concepts
  • Short-term strategic thinking
  • Extended strategic thinking

Level 1 aligns with Bloom’s level of remembering and recalling information. Example critical questions in this level would include:

  • What is the name of the protagonist?
  • What did Oliver Twist ask Fagin?

Level 2 involves various skills, such as classifying, comparing, predicting, gathering, and displaying. Critical questions can be derived from these skill sets, including the following:

  • How do these two ideas compare?
  • How would you categorise these objects?
  • How would you summarize the text?

Level 3 involves analysis and evaluation, once again aligning with Bloom’s Taxonomy.

  • What conclusions can you reach?
  • What theory can you generate to explain this?
  • What is the best answer? Why?

At the same time, Level 3 of DOK shares similarities with the Socratic Method in that the individual must defend their views.

Level 4 is the most elaborate and challenging level. It involves making interdisciplinary connections and the creation of new ideas / solutions.

Since DOK becomes increasingly elaborate with each level and ultimately requires defending one’s position using logic and evidence, there are parallels with the Socratic Method. At the same time, because it is used to develop standards in assessing critical thinking, it shares similarities with Bloom’s Taxonomy.

Williams Model

The KWL method shares some similarities with the ‘wonder’ aspect of the Williams Model

The Williams Model was designed by Frank Williams in the 1970s [27] . Unlike other methods, the Williams Model was designed specifically to promote creative thinking using critical questioning [27] . This model involves the following aspects:

  • Fluency
  • Flexibility
  • Elaboration
  • Originality
  • Curiosity
  • Risk taking
  • Complexity
  • Imagination

Critical questions regarding ‘fluency’ follow a sort of brainstorming approach in that the questions are designed to generate ideas and options [27]. For ‘flexibility’, the questions are designed to produce variations on existing ideas. ‘Elaboration’ questions are about building upon existing ideas and developing the level of detail. As the name suggests, critical questions for ‘originality’ are for promoting the development of new ideas. The ‘curiosity’ aspect of the Williams Model bears a similarity to the ‘Wonder’ stage of the Know Wonder Learn (KWL) system [33]. ‘Risk taking’ questions are designed to provoke experimentation. Although the name ‘complexity’ may sound similar to ‘elaboration’, it is instead about finding order among chaos, making connections, and filling in gaps of information. The final aspect is ‘imagination’, which involves using questions to visualise.


Wiggins & McTighe’s Six Facets of Understanding

Wiggins & McTighe’s ‘Six Facets of Understanding’ are all based on deep understanding aspects of critical thinking [34] . The method is used for teachers to design questions for students to promote critical thinking [34] . The six facets are Explanation, Interpretation, Application, Perspective, Empathy, and Self-Knowledge [35] .

‘Why’ and ‘How’ questions dominate the ‘Explanation’ facet in developing theory and reasoning [36] :

  • How did this happen? Why do you think this?
  • How does this connect to the other theory?

Interpretation questions encourage reading between the lines, creating analogies or metaphors, and creating written or visual scenarios to illustrate the idea. Questions include:

  • How would you explain this idea in other words?
  • Why do you think that there is conflict between the two sides?
  • Why is it important to know this?

Application questions are about getting students to use knowledge. Part of this comes from predicting what will happen based on prior experience. Another aspect involves learning from the past. Critical questions in this facet include:

  • How might we prevent this happening again?
  • What do you think will happen?
  • How does this work?

Perspective questions involve not only looking at ideas from other people’s perspectives, but also determining what people’s points of view are. In comparison with Empathy questions, though, Perspective questions involve more of an analytical and critical examination [35]. Here are some example questions:

  • What are the different points of view concerning this topic?
  • Who is speaking in the poem?
  • Whose point of view is being expressed?
  • How might this look from the other person’s perspective?

Empathy questions involve perspective-taking, including empathy, in order to show an open mind to considering what it would feel like to walk in another person’s shoes.

  • How would you feel in the same situation?
  • What would it be like to live in those conditions?
  • How would you react if someone did that to your family?

Self-knowledge questions are primarily designed to encourage self-reflection and to develop greater self-awareness [35]. In particular, Self-Knowledge questions reveal one’s biases, values, and prejudices and how they influence our judgment of others. Critical questions in this facet include:

  • How has my life shaped my view on this topic?
  • What do I really know about the lives of people in that community?
  • What knowledge or experience do I lack?
  • How do I know what I know? Where did that information / idea come from?

Questions within the Six Facets of Understanding all incorporate the following attributes [36] :

  • They are open ended
  • They require deep thought
  • They require critical thinking
  • They promote transfer of knowledge
  • They are designed to lead to follow-up questions
  • They require answers that are substantiated

For examples of critical questioning in action in a classroom environment, view the External Link section at the bottom of this page.

Problem Solving

In everyday life we are surrounded by a plethora of problems that require our attention and solutions if we are to reach our goals [37]. We may be confronted with problems such as needing to determine the best route to get to work, what to wear for an interview, how to do well on an argumentative essay, or how to find the solution to a quadratic equation. A problem is present in situations where there is a desire to solve it, yet the solution is not obvious to the solver [38]. Problem solving is the process of finding the solutions to these problems [39]. Although they are related, critical thinking differs fundamentally from problem solving. Critical thought is actually a process that can be applied to problem solving. For example, students may find themselves engaging in critical thought when they encounter ill-defined problems that require them to consider many options or possible answers. In essence, those who are able to think critically are able to solve problems effectively [40].


This chapter on problem solving will first differentiate between Well-defined Problems and Ill-defined Problems, then explain the uses of conceptualizing and visually representing problems within the context of problem solving, and finally discuss how mental set may impede successful problem solving.

Well-defined and Ill-defined Problems

Problems can be categorized into two types: ill-defined or well-defined [37]. Well-defined problems have a clearly specified goal and a known procedure for finding the solution to the problem at hand. An example of a well-defined problem is an algebraic problem (e.g., 2x - 29 = 7) where one must find the value of x. Another example may be converting the weight of a turkey from kilograms to pounds. In both instances these represent well-defined problems, as there is one correct solution and a clearly defined way of finding that solution.

In contrast, ill-defined problems represent those we may face in our daily lives: the goals are unclear, and the available information is conflicting, incomplete or inconclusive [41]. An example of an ill-defined problem may be “how do we solve climate change?” or “how should we resolve poverty?”, as there is no one right answer to these problems. These problems admit many different possible solutions, as there isn’t a universally agreed upon strategy for solving them. People approach these problems differently depending on the assumptions, theories or values that inform their approach [42]. Furthermore, each solution to a problem has its own unique strengths and weaknesses [42].
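To make the contrast concrete, the short sketch below (our own illustration, assuming the SymPy library is installed) shows how mechanically a well-defined problem like the algebra example above can be solved; no comparable procedure exists for the ill-defined examples:

```python
# A minimal sketch (not from the source): a well-defined problem has one
# correct solution and a known procedure, so it can be solved mechanically.
from sympy import symbols, Eq, solve

x = symbols('x')
print(solve(Eq(2*x - 29, 7), x))  # [18], the single correct answer

# An ill-defined problem ("how do we solve climate change?") has no such
# mechanical procedure: the goal itself is unclear and solutions are contested.
```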

Table 1 summarizes the differences between well-defined and ill-defined problems.

Differences in Solving Ill-defined and Well-defined Problems

In earlier times, researchers assumed both types of problems were solved in similar ways [44], but more contemporary research highlights some distinct differences in the processes behind finding a solution.

Kitchener (1983) proposed that well-defined problems do not involve assumptions regarding Epistemological Beliefs [37] because they have a clear and definite solution, while ill-defined problems require these beliefs because they lack a clear and particular solution [45]. In support of this idea, Schraw, Dunkle and Bendixen conducted an experiment with 200 participants and found that performance on well-defined problems is not predictive of performance on ill-defined problems, as ill-defined problems activated different beliefs about knowledge [46].

Furthermore, Shin, Jonassen and McGee (2003) [43] found that solving ill-defined problems called on different skills than those found in well-structured problems. In well-structured problems, domain knowledge and justification skills highly predicted problem-solving scores, whereas in an ill-structured astronomy simulation task, argumentation, attitudes and metacognition were the stronger predictors.

Aligned with these findings, Cho and Jonassen (2002) [47] found that groups solving ill-structured problems produced more argumentation and problem solving strategies, owing to the importance of considering a wide variety of solutions and perspectives. In contrast, the same argumentation technique distracted participants when they dealt with well-defined problems. This research highlights the potential differences in the processes behind solving ill-defined and well-defined problems.

Implications Of The Classroom Environment [ edit | edit source ]

The fundamental differences between well-structured and ill-structured problems imply that solving ill-structured problems calls for different skills, strategies, and approaches than solving well-structured problems [43] . Meanwhile, most tasks in educational settings are designed around engaging learners in solving the well-structured problems found at the end of textbook chapters or on standardized tests [48] . Unfortunately, the strategies used for well-defined problems have little application to the ill-defined problems likely to be encountered day to day [49] , as the simplified problem-solving strategies used for well-structured designs have been found to bear almost no similarity to real-life problems [48] .

This demonstrates the need to restructure classrooms in ways that facilitate student solving of ill-structured problems. One way to do this is to ask students questions that exemplify the problems found in everyday life [50] . This approach is called problem-based learning; in this type of classroom structure, students are given the opportunity to address questions by collecting and compiling evidence, data and information from a wide range of sources [51] . In doing so, students learn to analyze that data and information while taking into consideration the many possible interpretations and perspectives, in order to present and explain their findings [51] .

Structure Of The Classroom [ edit | edit source ]

In problem-based learning, students work in small groups where they explore meaningful problems, identify the information needed to solve the given problem, and devise effective approaches to a solution [50] . Students apply these strategies, then analyze and consider their results to devise new strategies until they arrive at an effective solution [50] . The teacher’s role in this classroom structure is to guide the process, facilitate participation and pose questions that elicit reflection and critical thinking about the findings [50] . Teachers may also provide traditional lectures and explanations intended to support student inquiry [50] .

In support of implementing a problem-based approach to problem solving, a meta-analysis conducted by Dochy, Segers, Van den Bossche, & Gijbels (2003) found problem-based learning to be superior to traditional styles of learning in supporting flexible problem solving, application of knowledge, and hypothesis generation. [52] Furthermore, Williams, Hemstreet, Liu, and Smith (1998) found that this approach fostered greater gains in conceptual understanding in science [53] . Lastly, Gallagher, Stepien, & Rosenthal (1992), comparing traditional and problem-based approaches, found that students in problem-based learning demonstrated a stronger ability to define problems. [54] These findings highlight the benefits of problem-based learning for understanding and defining problems in science. Given the positive effects of defining problems, this educational approach may also be applied to our next sub-topic of conceptualizing problems.

Steps to Problem Solving [ edit | edit source ]

Five stages are consistently found within the problem-solving literature: (1) identifying the problem, (2) representing the problem, (3) choosing the appropriate strategy, (4) implementing the strategy, and (5) assessing the solutions [37] . This overview will focus on the first two stages and examine how they influence problem solving.


Conceptualizing Problems [ edit | edit source ]

One of the most tedious and taxing aspects of problem solving is identifying the problem, as it requires one to consider the problem through multiple lenses and perspectives without becoming attached to one particular solution too early in the task [39] . It is also important to spend time clearly identifying the problem because of the association between the time spent "conceptualizing a particular problem and the quality of one's solutions". [37] For example, consider the following problem:

Becka baked a chocolate cake in her oven for twenty-five minutes. How long would it take her to bake three chocolate cakes?

Most people would jump to the conclusion that we should multiply twenty-five by three; however, if we place all three cakes in the oven at the same time, we find it would take the same time to bake three cakes as it would to bake one. This example highlights the need to properly conceptualize the problem and look at it from different viewpoints before rushing to solutions.

Taking this one step further, we can break down the five stages as they would be used to conceptualize the problem:

Stage 1 - Define the Problem

Stage 2 - Brainstorm Solutions

Stage 3 - Pick a Solution

Stage 4 - Implement the Solution

Stage 5 - Review the Result

Research also supports the importance of taking one's time to clearly identify the problem before proceeding to other stages. In support of this argument, Getzels and Csikszentmihalyi found that art students who spent more time identifying the problem when producing their art were rated as having more creative and original pieces than artists who spent less time at this stage [37] . These researchers postulated that by considering a wider scope of options during this initial stage, the artists were able to come up with more original and dynamic solutions.

Furthermore, when the approaches of experienced teachers were compared with those of novice post-secondary students studying to be teachers, it was found that the experienced teachers spent a greater amount of time lesson planning when placed in a hypothetical classroom. [37] These teachers also offered significantly more solutions to both ill-defined and well-defined problems. This suggests that successful problem solving is associated with the time spent finding the right problem and with the consideration of multiple solutions.

Instructional Implications [ edit | edit source ]

One instructional implication we may draw from the literature supporting a direct relationship between time spent conceptualizing a problem and the quality of the solution is that teachers should encourage students to spend ample time at this stage [37] . By sharing this knowledge and by monitoring students’ problem-solving processes to ensure that they “linger” when conceptualizing problems, we may facilitate effective problem solving [37] .

Representing the Problem [ edit | edit source ]

Problem Representation refers to how the known information about a particular problem is organized [37] . In an abstract representation of a problem, we merely think or speak about the problem without representing it externally [37] . Representing a problem tangibly means creating a visual representation of the data on paper, a computer, etc., through graphs, stories, symbols, pictures or equations. These visual representations [37] are helpful because they let us keep track of the solutions and steps to a problem, which can be particularly useful when encountering complex problems.


For example, consider Duncker's Buddhist monk problem [37] :

In the morning a Buddhist monk walks outside at sunrise to climb up the mountain to reach the temple at the peak. He arrives at the temple just prior to sunset. A couple of days later, he departs from the temple at sunrise to climb back down the mountain, travelling more quickly than during his ascent since he is going downhill. Can you show that there is a location along the path that the monk passed on both trips at exactly the same time of day? [37]

Using abstraction alone, this problem seems nearly impossible to solve because of the amount of information, the way it is verbally presented, and the irrelevant information in the question. Using a visual representation, we can picture the ascending and descending journeys overlaid on the same path, see where the two must intersect, and so arrive at a solution [55] .

Research supports the benefits of visual representation when confronting difficult problems. Martin and Schwartz [56] found greater use of external representations when participants faced a difficult task and had only intermittent access to resources, which suggests that these representations serve as a tool when problems are too complex to handle without external aids. While creating the initial visual representation took time, those who created such representations solved tasks with greater efficiency and accuracy.

Another benefit is that visual representations may foster problem-solving abilities by enabling us to overcome our cognitive biases. In a study conducted by Chambers and Reisberg [57] , participants were asked to look at an ambiguous figure, then close their eyes and form a mental image of it. When asked to recall their mental image and consider whether there were alternate possibilities of what the image could be, none of the participants were able to do so. However, when participants were given the external visual representation of the image, they were quickly able to manipulate its orientation and come up with an alternate interpretation. This shows how visual representations may be used in education by learners to counteract mental sets, which will be discussed in the next section.

As shown above, relying on abstraction can overload one’s cognitive resources, since short-term memory is limited to about seven items of information at a time [37] . Many problems surpass these limits, preventing us from holding all the relevant information in working memory [37] . This implies that in posing problems, teachers should represent them in written or visual form in order to reduce the cognitive load. A further implication is that teachers may improve students' problem-solving skills by demonstrating the different types of external representations that can display the relevant information in a problem. These representations may include different types of graphs, charts and imagery, all of which can serve as tools for representing relevant information, reducing cognitive load, and arriving at an effective solution.

Challenges of Problem Solving [ edit | edit source ]

As discussed above, there are many techniques that facilitate the problem-solving process; however, there are also factors that can hinder it. For example, one’s past experiences can impede problem solving by acting as a barrier to seeing novel solutions, approaches or ideas [58] .

Mental Set [ edit | edit source ]

A mind set refers to one's tendency to be influenced by one's past experiences in approaching tasks. [58] A mental set refers to confining ourselves to solutions that have worked in the past rather than seeking out alternative approaches. Mental sets can be functional in certain situations: by using strategies that have worked before, we can come up with solutions quickly. However, they can also rule out other, potentially more effective solutions.


Functional Fixedness [ edit | edit source ]

Functional Fixedness is a type of mental set that refers to our tendency to focus on a specific function of an object (i.e., what we traditionally use it for) while overlooking other potential, novel functions of that object. [37]

A classic example of functional fixedness is the candle problem [59] . Imagine you are at a table with a box full of tacks, one candle, and matches. You are asked to mount the lit candle on a corkboard wall as quickly as possible, making sure that no wax drips onto the table. Due to functional fixedness, you might first be inclined to pin the candle to the wall, since that is what tacks are typically used for, as many participants in the original experiment did. However, this solution is incorrect, as it would cause wax to drip onto the table.

The most effective solution requires you to view the box containing the tacks as a platform for the candle rather than in its traditional role as a receptacle. By emptying the box, we can use it as a platform for the candle and then use the tacks inside to attach the box to the wall. It is difficult to arrive at this solution initially because we fixate on the box's function of holding the tacks and struggle to assign it an alternate function (i.e., a platform as opposed to a receptacle). This experiment demonstrates how prior knowledge can lead to fixation and hinder problem solving.

Techniques to Overcome Functional Fixedness [ edit | edit source ]

As proposed by McCaffrey (2012), [60] one way to overcome functional fixedness is to break the object into parts. In doing so, we ask two fundamental questions: “Can it be broken down further?” and “Does my description of the part imply a use?” To explain this we can use McCaffrey’s steel-ring figure-8 example. In this scenario, the subject is given two steel rings, a candle and a match, and is asked to join the two steel rings into a figure 8. Looking at the tools provided, the subject might decide that the wax from the candle, once heated, could hold the two pieces of steel together. However, the wax would not be strong enough, which leaves a problem: how can the two steel rings be attached to form a figure eight?

If we are left with the wick as a tool and label it as such, we become fixated on the wick's primary function of giving off light, which hinders our ability to come up with a solution for creating a figure 8. To solve the problem effectively, we must break our concept of the wick down further. By seeing the wick as just a waxed piece of string, we can get past functional fixedness and see the string's alternate functions. We may then conclude that the waxed string can be used to tie the two rings together. Demonstrating the effectiveness of this approach, McCaffrey (2012) found that people trained to use this technique solved 67% more problems than a control group [60] .
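The two questions can be read as a simple recursive procedure over an object's parts. The sketch below is a minimal illustration of this decomposition idea, assuming a hand-built parts hierarchy and a hypothetical mapping from use-implying labels to use-free descriptions; it is not McCaffrey's actual materials.

```python
# A minimal sketch of the parts-decomposition technique. The parts
# hierarchy and the re-descriptions below are illustrative assumptions.

# Each entry: object -> its parts (empty list = cannot be broken down further).
PARTS = {
    "candle": ["wax", "wick"],
    "wax": [],
    "wick": [],
}

# Hypothetical re-descriptions for labels that imply a use:
# "wick" implies burning; "waxed string" does not.
USE_FREE = {
    "wick": "waxed string",
}

def generic_parts(obj, depth=0):
    """Recursively list an object's parts, replacing any label that
    implies a use with a use-free description."""
    label = USE_FREE.get(obj, obj)       # Question 2: does the name imply a use?
    print("  " * depth + label)
    for part in PARTS.get(obj, []):      # Question 1: can it be broken down further?
        generic_parts(part, depth + 1)

generic_parts("candle")
# candle
#   wax
#   waxed string  <- seeing "waxed string" instead of "wick" invites new uses,
#                    e.g., tying the two steel rings into a figure eight
```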

Given the effectiveness of this approach, one way we may promote Divergent Thinking is by teaching students to consider "whether the object may be broken down further" [60] and "whether the description of the part implies a use". In doing so, we teach students to break objects down to their purest form and make the obscure features of a problem salient. This connects to the previously discussed idea of conceptualization, whereby problem-solving effectiveness can be increased by spending time defining the problem rather than jumping to conclusions based on our own preconceptions. In the following section we will discuss the strategies experts use when solving problems.

Novice Versus Expert In Problem Solving [ edit | edit source ]

Many researchers view effective problem solving as dependent on two important variables. The first is the amount of experience we have in trying to solve a particular category of problems [61] , which we addressed earlier by showing that practicing problem solving through a problem-based approach can increase problem-solving skills. The second is the amount of domain-specific knowledge we have to draw upon [61] . Experts possess a vast amount of domain knowledge, which allows them to apply their knowledge efficiently to relevant problems. Experts have well-organized knowledge of their domain, which affects what they notice and how they arrange, represent and interpret information; this in turn enables them to recall, reason and solve problems better than novices. [62]

When experts' problem-solving strategies are compared with those of novices, experts organize their knowledge around the deep structure of important ideas or concepts in their domain, such as the kind of solution strategy a problem requires [63] . In contrast, novices group problems by their surface structure, such as the objects that appear in the problem. [63]

Experts also spend more time than novices analyzing and identifying problems at the beginning of the problem-solving process. They take more time thinking and planning before implementing solutions, and use a limited set of optimal strategies that allow them to reach richer and more effective solutions to the given problem. [64]

In addition, experts engage in deeper and more complete problem representation than novices, using external representations such as sketches and diagrams to represent information and solve problems. In doing so they solve problems more quickly and come up with better solutions. [65]

Given the literature above, it is evident that problem solving and expertise overlap: the key strategies that experts use are the same ones identified as effective problem-solving strategies. We may therefore conclude that experts not only have a vast knowledge of their domain but also know and implement the most effective strategies, allowing them to solve problems more efficiently and effectively than novices. [65] In the next section we will discuss cognitive tutors for problem solving.

Cognitive Tutor for Problem Solving [ edit | edit source ]

Cognitive Tutor is a kind of Intelligent Tutoring System. [66] It can assign different problems to students on an individual basis, trace users’ solution steps, provide just-in-time feedback and hints, and implement mastery learning criteria. [67]

According to Anderson and colleagues, [67] students who worked with the LISP tutor completed problems 30% faster and scored 43% higher than peers who were helped by teachers in a mini-course. Likewise, college students who used the ACT Programming Tutor (APT) with immediate feedback finished a set of problems faster and scored 25% better on tests than students who received conventional instruction. [68] In addition, in high school geometry settings, students who used the Geometry Proof Tutor (GPT) for in-class problem solving scored a letter grade higher on a subsequent test than peers who participated in traditional classroom problem-solving activities. [69]

An overview of Cognitive Tutor [ edit | edit source ]

In 1985, Anderson, Boyle, and Reiser brought the discipline of cognitive psychology to Intelligent Tutoring Systems. Since then, intelligent tutoring systems that adopt this approach of constructing cognitive models of how students gain knowledge have been called Cognitive Tutors. [67] The most widely used Cognitive Tutor is Cognitive Tutor® Algebra I. [69] Carnegie Learning, Inc., the trademark owner, is developing a full-scale Cognitive Tutor® line, including Algebra I, II, Bridge to Algebra, Geometry, and Integrated Math I, II, III. Cognitive Tutor® now includes Spanish modules as well.

Cognitive Tutors support the idea of learning by doing, an important part of human tutoring, by providing students opportunities to apply the target skills or concepts and to receive content-related feedback. [69] To monitor students’ performance, Cognitive Tutors adopt two algorithms: model tracing and knowledge tracing. Model tracing provides immediate feedback and gives content-specific advice based on each step of the student's performance trace. [67] Knowledge tracing selects appropriate tasks for each user to achieve mastery learning, based on a running estimate of the user's knowledge. [67] [69]

Cognitive Tutors can be created for and applied to different curricula or domains to help students learn, and can be integrated into classroom learning as adaptive software. These curricula and domains include mathematics in middle school and high school, [66] [68] [70] genetics in post-secondary institutions, [71] and programming. [67] [68] [72] [73]

Cognitive Tutors have had substantial impacts on classrooms, student motivation, and student achievement. [74] Regarding their effectiveness, research evidence supports Cognitive Tutors being more effective than conventional classroom instruction. [67] [75] [76] [68]

The Theoretical Background of Cognitive Tutor [ edit | edit source ]

ACT-R Theory [ edit | edit source ]

The theoretical background of Cognitive Tutors is the ACT-R theory of learning and performance, which distinguishes between procedural knowledge and declarative knowledge. [67] According to ACT-R theory, procedural knowledge cannot be directly absorbed into people’s heads; it can be represented in the notation of if-then production rules, and the only way to acquire it is learning by doing.

Production rules [ edit | edit source ]

Production rules characterize how students, whether beginning or advanced learners, think in a domain or subject. [67] Production rules can represent students' informal or intuitive thinking. [77] These informal or intuitive forms of thinking usually differ from what textbooks teach, and students may acquire such patterns of thinking outside of school. [78] Heuristic methods, such as providing a plan of action for problem solving instead of a particular operation, [79] and non-traditional strategies, such as working with graphics rather than symbols when solving equations, [69] can be represented in production rules as well.
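As a concrete illustration, an if-then production rule can be thought of as a condition paired with an action over the current problem state. The sketch below is a minimal, hypothetical encoding; the state format and rule name are illustrative, not ACT-R's or Carnegie Learning's actual notation.

```python
# A minimal sketch of an if-then production rule, assuming a toy
# dictionary representation of the problem state. Illustrative only.

rule = {
    "name": "divide-both-sides",
    # IF the equation has the form a*(expr) = c ...
    "condition": lambda state: state["form"] == "a*(expr)=c",
    # ... THEN divide both sides by a, producing the subgoal expr = c/a.
    "action": lambda state: {**state, "form": "expr=c/a",
                             "steps": state["steps"] + ["divide both sides by a"]},
}

state = {"form": "a*(expr)=c", "steps": []}
if rule["condition"](state):
    state = rule["action"](state)
print(state["steps"])  # ['divide both sides by a']
```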

Cognitive model and model tracing [ edit | edit source ]


A cognitive model is constructed from both ACT-R theory and empirical studies of learners. [69] All the solutions, and the typical misconceptions, of learners are represented in the production system of the cognitive model.

For example, there are three strategies for solving the algebra equation 2(3+X)=10. Strategy 1 is multiplying 2 across the sum (3+X); Strategy 2 is dividing both sides of the equation by 2; Strategy 3 reflects the misconception of failing to multiply 2 across the whole sum (3+X). Since there are various methods for each task, students can choose their own way of solving problems.

Model tracing is an algorithm that runs forward along each student's solution steps and provides instant, context-specific feedback. If a student takes a correct step, for example using Strategy 1 or Strategy 2 to solve the equation, the Cognitive Tutor® accepts the action and presents the next task. If the student's mistake matches a common misconception, such as using Strategy 3, the Cognitive Tutor highlights the step as incorrect and provides just-in-time feedback, such as "you also need to multiply X by 2." If the student's mistake does not match any production rule in the cognitive model, meaning the student used none of the strategies above, the Cognitive Tutor® flags the step as an error in red italics. Students can ask for advice or a hint at any time while solving problems. According to Corbett, [68] there are three levels of advice: the first level identifies a particular goal to accomplish; the second offers general ideas for achieving the goal; and the third gives detailed advice on how to solve the problem in the current context.
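A minimal sketch of this model-tracing loop for 2(3+X)=10 might look as follows, with the three strategies above encoded as production rules (Strategy 3 as a "buggy" rule capturing the misconception). The state encoding and feedback strings are illustrative assumptions, not the actual Cognitive Tutor® implementation.

```python
# A toy model tracer for 2(3 + x) = 10. Each entry maps a possible next
# step to its status and feedback; the encodings are illustrative only.

RULES = {
    # Strategy 1: multiply 2 across the sum -> 6 + 2x = 10
    "6+2x=10": ("correct", "Good: you multiplied 2 across (3 + x)."),
    # Strategy 2: divide both sides by 2 -> 3 + x = 5
    "3+x=5": ("correct", "Good: you divided both sides by 2."),
    # Strategy 3, a buggy rule: 2 multiplied into 3 but not into x
    "6+x=10": ("misconception", "You also need to multiply x by 2."),
}

def trace_step(student_input):
    """Match a student's step against the cognitive model and return
    model-tracing feedback; unmatched steps are flagged as errors."""
    return RULES.get(student_input,
                     ("error", "This step does not match any known strategy."))

print(trace_step("3+x=5"))   # ('correct', 'Good: you divided both sides by 2.')
print(trace_step("6+x=10"))  # ('misconception', 'You also need to multiply x by 2.')
```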

Knowledge tracing [ edit | edit source ]

Knowledge tracing monitors the student's growing mastery of production rules during the problem-solving process. At each step of a solution, the student applies some production rule, and the Cognitive Tutor calculates an updated estimate of the probability that the student has learned that particular rule. [68] [69] The probability estimates for the rules are integrated into the interface and displayed in the skill-meter. Using these estimates, Cognitive Tutors can select appropriate tasks or problems according to each student's individual needs.
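The update behind this estimate is commonly described as Bayesian knowledge tracing (after Corbett and Anderson). The sketch below shows the standard form of that update; the parameter values are illustrative defaults, not those of any shipped tutor.

```python
# A sketch of the Bayesian knowledge-tracing update used to estimate the
# probability that a student knows a production rule. Parameter values
# here are illustrative, not taken from any deployed Cognitive Tutor.

def update_p_known(p_known, correct,
                   p_transit=0.1,  # chance of learning the rule at this step
                   p_slip=0.1,     # chance of erring despite knowing the rule
                   p_guess=0.2):   # chance of succeeding without knowing it
    """Return the updated probability that the rule is known,
    given one observed step (correct or incorrect)."""
    if correct:
        posterior = p_known * (1 - p_slip) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        posterior = p_known * p_slip / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess))
    # Allow for the chance the student learned the rule at this opportunity.
    return posterior + (1 - posterior) * p_transit

p = 0.3  # prior estimate for a new student
for outcome in [True, True, False, True]:
    p = update_p_known(p, outcome)
print(round(p, 2))  # the value a skill-meter would display after four steps
```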

Effectiveness [ edit | edit source ]

Cognitive Tutor® Geometry [ edit | edit source ]

Aleven and Koedinger conducted two experiments to examine whether a Cognitive Tutor® can scaffold self-explanation effectively in high school geometry classes. [66] The findings suggested that “problem-solving practice with a Cognitive Tutor® is even more effective when the students explain their steps by providing references to problem-solving principles.” [80]

In geometry learning, shallow encoding can occur when students hold over-generalized production rules in their prior knowledge. For instance, a student may choose the correct answer and move to the next step based on the over-generalized rule "if an angle looks equal to another, then it is," rather than on real understanding. According to Aleven & Koedinger, self-explanation can promote more general encoding during problem-solving practice because it pushes students to think more and reflect explicitly on the rules of the geometry domain. [66]

All the geometry classes in the experiments included classroom discussion, small-group activities, lectures, and solving problems with the Cognitive Tutor®. In both experiments, students were required to solve problems with the help of the Cognitive Tutor®. However, the Cognitive Tutor® was provided in two different versions: the new version supported self-explanation, also called guided learning by doing and explaining, [66] while the other did not. The additional features of the new version required students to justify each step by entering geometry principles or by referencing principles in an online glossary of geometry knowledge, with explanations and solutions provided according to each student's choice. The form of explanation in the new version also differed from the speech-based explanations used in another experiment on self-explanation. The researchers found that students who used the new version of the Cognitive Tutor® were not only better able to give accurate explanations but also understood the domain rules more deeply. Thus, these students were better able to transfer the learned rules to new situations, avoiding shallow encoding and learning.

Genetics Cognitive Tutor [ edit | edit source ]

Corbett et al. (2010) conducted two evaluations of the Genetics Cognitive Tutor across seven different kinds of biology courses at 12 universities in America. The findings suggested that the Genetics Cognitive Tutor is effective for genetics problem-solving practice in post-secondary settings. [81]

In the first evaluation, participants used the Genetics Cognitive Tutor with their class activities or homework assignments. The software has 16 modules with about 125 problems across five general genetics topics. The Genetics Cognitive Tutor used a cognitive model of genetics problem-solving knowledge to provide step-by-step help, along with both model tracing and knowledge tracing. With average correctness rising from 43% on the pretest to 61% on the post-test, the average improvement from using the Genetics Cognitive Tutor was 18 percentage points. In the second evaluation, the researchers examined whether knowledge tracing could correctly predict students' knowledge. The findings suggested that the knowledge-tracing algorithm is capable of accurately estimating each student's performance on the paper-and-pencil post-test.

Project Based Learning and Design Thinking [ edit | edit source ]

Theorizing Solutions for Real-World Problems [ edit | edit source ]

Project Based Learning is a concept meant to place the student at the center of learning. The learner is expected to take an active role by responding to a complex challenge or question through an extended period of investigation. Project Based Learning asks students to engage with the curriculum of their class while also drawing on the knowledge they already have to solve the problem challenge. At its root, project-based learning is an activity in which students develop an understanding of a topic based on a real-life problem or issue, and it requires learners to take a degree of responsibility in designing their learning activity [82] . Blumenfeld et al. (1991) state that Project Based Learning makes students responsible for their initial question, their activities, and the nature of their artifacts [83] .

Project Based Learning rests on five criteria [84] .


Challenges are based on authentic, real-world problems that require learners to engage in an inquiry process and demonstrate understanding through active or experiential learning. For example, elementary or secondary students might be asked by their teacher to solve a school problem, such as how to deal with cafeteria compost. Students would be encouraged to work in groups to develop solutions to this problem within specific criteria for research, construction, and demonstration of their idea; because learners are cognitively engaged with the subject matter over an extended period of time, they stay motivated [83] . The result is complex learning whose success is more than the sum of its parts [85] . Project Based Learning aims to have learners coordinate knowledge, collaboration, and a final project presentation. This type of schema construction allows learners to turn concrete training into concrete results. The learner uses previous knowledge to connect with new information and elaborate on their revised perception of a topic [85] . In Project Based Learning this constitutes the process of gathering information and discussing it within a team to decide on a final solution to the assigned problem.

Unlike Problem-Based Learning, Project Based Learning is grounded in experiential learning within a constructivist pedagogy; learners show their knowledge, or lack thereof, by working towards a real solution through trial and error on a specific driving question. The philosophy of experiential education comes from the theories developed by John Dewey in his work Experience and Education. Dewey argues that experience is a continuous process of learning that arouses curiosity, strengthens initiative, and moves the learner towards further knowledge [86] . The experiential aspect of Project Based Learning, working towards solutions for real-world problems, ties learners' solutions to practical constraints. Learners must fill the expected gap in their knowledge through research and by working together in a collaborative group. The experiential learning in Project Based Learning is focused on a driving question, usually presented by the teacher, and students must respond to it with a designed artifact that shows their acquired knowledge.

The constructivist methodology of Project Based Learning is invoked through a guided discovery process set forth by the instructor; unlike pure discovery, which has been criticised for giving students too much freedom [87] , Project Based Learning involves a specific instructor-driven question that focuses the process of investigation. This form of constructivist pedagogy has been shown to promote the kind of cognitive processing that is most effective in this type of learning environment [87] . Project Based Learning provides a platform for learners to find their own solutions to the teacher-driven question while also having a system in which to discover, analyze, and present. Project Based Learning therefore delivers meaningful learning by having students select, organize, and integrate knowledge [87] .

Experience is the Foundation of Learning [ edit | edit source ]

Project Based Learning is a branch of education theory based on the idea of learning through doing. John Dewey indicated that teachers and schools should help learners achieve a deeper correlation between theory and the real world through experiential and constructivist methods. Dewey stated that education should contain an experiential continuum and a democratization of education to promote a better quality of human experience [86] . These two elements are consistent with Project Based Learning: the application of authentic, real-world problems with artifacts produced as solutions, and the learner finding their own solutions through a collaborative effort within a group. Blumenfeld et al. mention that the value in Project Based Learning comes from questions that students can relate to, including personal health and welfare, community concerns, or current events [83] .

Project Based Learning also has a basis in the work of Jean Piaget, who surmised that the learner is best served by learning in a constructivist manner, using previous knowledge as a foundation for new learning and connections. The learner's intelligence progresses through assimilating things in the environment, altering the original schema by accommodating multiple new schemas and assimilating the experienced knowledge [88] . Piaget believed in learners discovering new knowledge for themselves, but held that without collaboration the individual would not be able to coherently organize their solution [87] . Project Based Learning acknowledges Piaget's beliefs about the need for collective communication and its use in assembling new knowledge for the learner.

Self-Motivation Furthers Student Learning [ edit | edit source ]

Project Based Learning is perceived as benefiting learners in various ways, including gained knowledge, communication, and creativity. While engaging with a single challenge, learners obtain a greater depth of knowledge. Moreover, abilities in communication, leadership, and inter-social skills are strengthened by the collaborative nature of Project Based Learning. Students retain content longer and have a better understanding of what they are learning. At least four strands of cognitive research support Project Based Learning [84] : motivation, expertise, contextual factors, and technology.

Students whose motivation is centred on learning and mastery of subject matter are more inclined to sustain engagement with their work [89] . Project Based Learning therefore discourages public competition in favour of cooperative goals, reducing the threat to individual students and increasing the focus on learning and mastery [84] . It is designed to let students reach goals together, without fear of reprisal or individual criticism. For instance, Helle et al. studied information-system design students who were asked to work on a specific assignment over a seven-month timeline; students were given questionnaires about their experience during this assignment to determine their motivation level. Examining the motivation of learners in project groups, Helle et al. found that intrinsic motivation increased by 0.52 standard deviations, showing that Project Based Learning groups relied more on self-motivation to complete assignments. Further, the study implied that intrinsic motivation increased substantially for those who were lowest in self-regulation [90] .

Metacognitive and self-regulation skills are lacking in many students, and these are important to master for student development within domains [84] . In the Project Based Learning system, the relationship between student and teacher allows the instructor to use scaffolding to introduce more advanced forms of inquiry for students to model; middle school students and older are thus very capable of meaningful learning and sophisticated results [91] . Over time, learners become expert in the additional skill sets they have developed on their own within this system.

Contextually, situated cognition is best realized when the material used resembles real life as much as possible [84] ; Project Based Learning therefore builds learners' confidence to succeed at similar tasks outside of school, because they no longer treat school subjects as artificial boundaries to knowledge transfer. Gorges and Goke (2015) investigated, through an online survey, the relationship between students' perception of their abilities in major high school subjects and their application of these skills to real-world problems. Learners showed confidence in their problem-solving skills and in applying their learning to real-life situations; Gorges and Goke [92] report that students who learned in a Project Based Learning style had increased self-efficacy and self-concepts of ability in math (SD .77), history (SD .72), and other subjects [92] . Students are therefore more likely to use domain-specific knowledge outside an academic setting through increased confidence. Further, comparisons between students immediately after finishing a course and from 12 weeks to 2 years later yielded effect sizes showing that Project Based Learning helped students retain much of their knowledge [92] .

Technology use allows learners to have a more authentic experience by providing an environment that includes data, expands interaction and collaboration, and emulates the use of artifacts [84] . In accessing technology, the learner can enhance the benefits of Project Based Learning through greater autonomy in finding knowledge and connecting with group members. Creativity is enhanced as students must find innovative solutions to their authentic problem challenges. For instance, Hung and Hwang [93] report that using digital-storytelling techniques in Project Based Learning to collect data (photos) in an elementary science class, in order to answer a specific project question on global warming, produced a significant increase in test results (SD 0.64). As well, to find answers, learners must access a broad range of knowledge, usually crossing various disciplines. The end result is that projects are resolved by student groups that use their knowledge, and access to additional knowledge (usually through technology), to build a solution to the specific problem.

Educators Find Challenges in Project Based Learning Implementation [ edit | edit source ]

One of the main arguments against this type of learning is that the project can become unfocused and lack the classroom time needed to build solutions. Educators themselves marginalize Project Based Learning because they lack training and background knowledge in its implementation; the financial constraints of providing effective evaluation through technology dissuade teachers as well [94] . Critics argue that the information gained by students could be delivered just as effectively in lecture-style instruction. There is also the danger of learners going off-task in the classroom: if they are not continually focused on the task and the learning content, the project will not be successful. Educators with traditional teaching backgrounds find that Project Based Learning requires instructors to maintain students' connection to content and to manage their time, which is not necessarily a style all teachers can accomplish [94] . Blumenfeld et al. (1998) state that real success with Project Based Learning begins and ends with a focused structure that allows teacher modelling, examples, suggested strategies, distributed guidelines, feedback during the activity, and revision of work [91] .

Learner Need for Authentic Results through Critical Thought [ edit | edit source ]


Project Based Learning is applicable to a number of different disciplines, since it has various applications in learning, and it is specifically relevant to the 21st-century redefinition of education (differentiated, technologically focused, collaborative, cross-curricular). STEM (Science, Technology, Engineering, Mathematics) is one form of 21st-century education that benefits from instructors using Project Based Learning, since it naturally bridges between domains. The focus of STEM is to prepare secondary students for the rigors of post-secondary education and for solving complex problems in teams, as would be expected of these jobs in the real world after graduation. Many occupational areas could benefit from Project Based Learning, including medicine, engineering, computer design, and education. Project Based Learning gives secondary students the opportunity to broaden their knowledge and become successful in high-stakes situations [95] . Moreover, these same students develop depth of knowledge when reflecting upon their strengths and limitations [95] . The result is a learner who has developed critical thinking and has had a chance to apply it to real situations. Further, the construction of a finished product is a realistic expectation for presenting an authentic result of learning. The product demands accountability and learner adherence to instructor expectations, as well as the constraints of the project [95] .

The learner is disciplined to focus on specific outcomes, understand the parameters of the task, and demonstrate a viable artifact. The implication is that students will be ready to meet the challenges of a high-technology, fast-paced work world where innovation, collaboration, and results-driven products are essential for success. Technology is one area where Project Based Learning can be applied to develop skills for real-world application; cognitive tools afforded by new technology will be useful if perceived as essential for the project (as is the case in many real-world applications) [83] . For example, designers of computer systems with prior knowledge may know how to troubleshoot an operating system, yet not really understand how things fit or work together, leaving them with a false sense of security about their skills [96] .

Design-Thinking as a Sub-set of Project-Based Learning [ edit | edit source ]

Using the Process of Practical Design for Real-World Solutions [ edit | edit source ]


Design Thinking is a pedagogical approach to teaching through a constructionist methodology of challenge-based problem solving, branching off from Project Based Learning. It should be understood as a combination of sub-disciplines that have design as the subject of their cognitive interests [97] .

An example of design-thinking would be learners engaged in finding a solution to a real-world problem. Unlike Project Based Learning, however, design-thinking asks the learner to create a practical solution within a scaffolded process (Figure 3), such as finding a method to deliver clean drinking water to a village. Designers would weigh social, economic, and political considerations, but would deliver a final presentation of a working prototype that could be marketable. A water system could thus be produced to deliver water to villagers, but within the limits of the materials, finances, and local policies. Design-thinking designates the core principles of empathize, define, ideate, prototype, and test to meet the challenges of design. Starting with a goal (a solution) in mind, emphasis is placed upon creative and practical decision making through design to achieve an improved future result. It draws upon a kind of thinking that requires investigating the details of a problem to find hidden parameters for a solution-based result. The achieved goal then becomes the launching point for further goal-setting and problem solving. [97]

This approach to education is based on the premise that the modern world is full of artificial constructs, and that our civilization has historically relied upon these artifacts to further our technological progress. Herbert Simon, a founder of design-thinking, states that the world students find themselves in today is far more man-made and artificial than it is natural [98] . The challenge of design-thinking is to foster innovation by enhancing students' creative thinking abilities [99] . Design-thinking is a tool for scaffolding conventional educational projects into project-based thinking. Van Merriënboer (2004) views design-learning as scaffolding for whole-task practice: it decreases intrinsic cognitive load while learners practice on the simplest worked-out examples [87] . Design-thinking is currently becoming popular due to its ability to bridge between what the learner knows and what the learner discovers within the context of 21st-century skills and learning. A further example of this process is the design of a product that children will use to increase their physical activity (see video on Design Thinking), which can be explained using the scaffold of Design Thinking.

Critical Thought on Design in the Artificial World [ edit | edit source ]

Design-thinking can be traced back to specific scholars, including Herbert Simon, Donald Schon, and Nigel Cross. Simon published his findings on the gap he found in professional education in 1969. He observed that, just as the natural sciences strove to reveal the simplicity underlying complex systems in the natural world, the same could be done for the artificial world [100] . This should include not only the processes behind the sciences but the arts and humanities as well, since music, for example, involves formal patterns like mathematics (Simon, 136). Hence, everyone's creative designs rest upon a common language and its application. Schon, as Ford Professor of Urban Planning and Education at MIT, built upon the empathetic characteristics of design-thinking, referring to this process as an artistic and intuitive process for problem solving [101] . Schon realized that part of the design process is also the reflection-in-action that must occur during critical thinking and ideating. Moreover, the solutions to problems do not lie in textbooks but in the designer's ability to frame their own understanding of the situation [100] . Cross fuses these earlier ideas into a pedagogy, stating that design-thinking should be part of the general education of both the sciences and the humanities [97] . He implies that students encouraged to use this style of thinking will improve their cognitive development of non-verbal thought and communication [97] .

Critical Thinking as Disruptive Achievement [ edit | edit source ]

Design-thinking follows a specific flow from the theoretical to the practical. It relies upon guided learning to promote effective learner solutions, going beyond pure inquiry, which has been argued not to work because it exceeds the limits of long-term memory [97] . Design-thinking requires learners to perform a meta-analysis of their own process. Creativity (innovative thought) is evident in design-thinking through studies of defocused and focused attention to stimuli in memory activation [97] . Hu et al. (2010) developed a process of disruptive thinking in elementary students by having them apply logical methods of critical thought to specific design projects, over a four-year period, through specific lesson techniques. The results show that these students had increased thinking ability (SD .78) and that these effects transfer over the long term, increasing student academic achievement [102] . This shows the use of divergent and convergent thinking in the creative process; both of these processes of thought have been noted as important to creativity (Goldschmidt, 2016, p 2) and demonstrate the Higher Order Thinking associated with long-term memory. Design-thinking specifically demonstrates the capability of having learners develop both.

Designers are Not Scientific? [ edit | edit source ]

Critics of design-thinking comment that design is not in itself a science or a cognitive method of learning, being a non-scientific activity due to its use of intuitive processes [97] ; on this view, the learner is not truly engaged in a cognitive practice (a scientific process of reasoning). However, Cross believes that design itself is a science to be studied, and hence can be investigated with systematic and reliable methods [97] . Further, Schon states that there is a connection between theory and practice: in design-thinking, there is a commitment to developing a theoretical idea into a real-world prototype [101] . Design-thinking is a scientific cognitive practice that does constitute technical rationality [101] , and using this practice to understand the limits of a design involves reflective practice and metacognition. Further, this pedagogy addresses the natural gap between theory and practice for most ideas by allowing the learner to step beyond normal instruction and practice to try something new and innovative in coming up with a solution. Design-thinking rejects heuristically derived responses based on client or expert appreciation in order to take on an unforeseen form [101] .

21st Century Learners and the Need for Divergent Thinking [ edit | edit source ]

Design-thinking is exceptionally well positioned for use with 21st-century skills based around technological literacy. Specifically, it is meant to help the learner develop creative and critical skills for the application of technology. Designing is a distinct form of thinking that creates a qualitative relationship to satisfy a purpose [103] . Moreover, in a world that is rapidly becoming technologized, design-thinking develops the ability to make decisions based upon feel, to pay attention to nuances, and to appraise the consequences of one's actions [103] . The designer needs to think outside the perceived acceptable solution and look to current technology. Learners using design-thinking therefore approach all forms of technology as potential applications for a solution. Prototyping might include not just a hardware application but also the use of software; cutting-edge technologies such as Augmented Reality and Virtual Reality would be acceptable forms of solutions for design challenges. Design-thinking is therefore applicable to areas of study that require technological adaptation and innovation. Notably, the new BC K-12 curriculum (2016) has a specific focus on Applied Design, Skills, and Technologies, calling for all students to gain knowledge of design-thinking throughout their entire education career and of its application to the advancement of technology. Design Thinking is thus a relevant and essential component of engaging students' critical thought processes.

Argumentation [ edit | edit source ]

Argumentation is the process of assembling and communicating reasons for or against an idea, that is, the act of making and presenting arguments. CT, in addition to clear communication, makes a good argument. It is the process through which one rationally solves problems, issues and disputes, and resolves questions [104] .

The practice of argumentation consists of two dimensions: dialogue and structure [105] . The dialogue in argumentative discussions focuses on specific speech acts, actions done through language (e.g., accept, reject, refute), that help advance the speaker's position. The structure of an argument helps distinguish the different perspectives in a discussion and highlights the positions for which speakers are arguing [105] .

The Process of Argumentation [ edit | edit source ]

Argumentation Stages [ edit | edit source ]

The psychological process of argumentation allows one to produce, analyze and evaluate arguments [106] . These stages will be discussed in more detail later in this chapter.

The Impact of Argumentation on Learning [ edit | edit source ]

Argumentation not only affects the development of CT (and vice versa); it affects many other aspects of learning as well. For instance, a study conducted in a junior high school science class showed that when students engaged in argumentation, they drew heavily on their prior knowledge and experiences [107] . Argumentation not only enabled the students to use their prior knowledge; it also helped them consolidate knowledge and elaborate on their understanding of the subject at a higher level [107] . These are just a few of the ways in which argumentation can impact aspects of learning beyond the development of CT.

Video: Argumentation in Education: https://www.youtube.com/watch?v=YHm5xUZmCDg

The Relationship between Critical Thinking and Argumentation

Argumentation and CT appear to have a close relationship in instruction, and many studies have shown the impact that each can have on the other. Data suggest that when CT is infused into instruction it impacts students' ability to argue [108], that tasks involving both critical thinking and creative thinking must be of an argumentative nature [109], and that argument analysis and storytelling can improve CT [110]. In other words, it would appear that CT and argumentation each impact the development of the other in students, and that both impact other aspects of learning and cognition.

How Critical Thinking Improves Argumentation

CT facilitates the evaluation of the information necessary to make an argument. It aids in judging the validity of each position, is used to assess the credibility of sources, and helps in approaching an issue from multiple points of view. CT and argumentation share many features. For example, examining the evidence and counter-evidence for a statement, and the information that backs up these claims, is a facet both of creating a sound argument and of thinking critically.

How CT explicitly impacts one's ability to argue and reason will be examined in this section with reference to the aforementioned four CT components. First, there needs to be an examination of the aspects of CT and how they can be impacted by argumentation. The first component, knowledge, as stated by Bruning et al. (2011), actively shapes the way in which one resolves problems [111]. Therefore, it is essential that students have a solid foundation of knowledge about whatever it is they are arguing. The ability to use well-founded information to effectively analyze the credibility of new information is imperative for students who wish to increase their argumentative abilities. The second component of CT that is important for argumentation is inference. As Chesñevar and Simari (2007) discuss in their examination of how we develop arguments, inference and deduction are essential to reaching new conclusions from knowledge that is already known or proven [112].


In other words, the ability to reach conclusions from known information is pivotal in developing and elaborating an argument. The use of induction, another part of the CT process, is also important to argumentation. As Bruning et al. suggest, the ability to draw a general conclusion from known information is an essential part of the CT process [111]. Ontañón and Plaza (2015) argue that induction can be used in argumentation through communication with one another: drawing general conclusions from the pooled information that every member of a group can provide shows how interaction can be helpful through the use of induction in argumentation [113]. Therefore, induction, an important part of CT, can have a significant impact on argumentation and collaboration. The final component of CT, and perhaps the most important in its relationship to argumentation, is evaluation. The components of evaluation indicated by Bruning et al. are analyzing, judging, and weighing; these are three essential aspects of creating a successful argument [111]. Hornikx and Hahn (2012) provide a framework of three key elements of argumentation that are closely tied to these three aspects of CT identified by Bruning et al. [106].

Production, Analysis, and Evaluation

The three aspects of argumentation that Hornikx and Hahn focus on in their research are the production, analysis, and evaluation of arguments [106]. Producing an argument uses the key aspects of CT: there must be evaluation, analysis, judgement, and weighing of the argument on which one wishes to take a stand. Analysis of arguments and analysis in CT go hand in hand; there must be a critical analysis of information and viewpoints in order to create a successful and fully supported argument. Likewise, evaluation is used in argumentation much as it is in CT. Assessing the credibility of sources and information is an essential part of finding articles and papers that can assist someone in making an informed decision. The final aspect of evaluation in critical thinking is metacognition: thinking about thinking, or monitoring one's own thoughts [111]. Monitoring one's own thoughts and taking time to understand the rationality of one's decisions is also a significant part of argumentation. According to Pinto et al.'s research, there is a strong correlation between one's argumentation ability and metacognition [114]. In other words, the ability to think about one's own thoughts and the validity of those thoughts correlates positively with the ability to formulate sound arguments.

The transfer of thoughts into speech and argumentation shows that CT influences argumentation dramatically, though some research suggests the two interact in other ways as well. The research presented makes clear that argumentation is heavily influenced by CT skills such as knowledge, inference, evaluation, and metacognition. There are also strong indications that instruction of CT in a curriculum can bolster argumentation. A study by Bensley et al. (2010) found that students in a course with CT skills directly infused showed significant gains in argument analysis compared to groups that received no CT instruction [115]. Many arguments can be made for specific CT skills impacting argumentation, but this research shows that explicit teaching of CT in general can also increase students' ability to analyze arguments effectively. This suggests that the Skills Programs mentioned later in this chapter should be instituted if teachers wish to foster argumentation as well as CT in the classroom.

How Argumentation Improves Critical Thinking

Argumentation is part of the CT process: it clarifies reasoning and increases one's ability to assess viable information. It is part of metacognition in the sense that one needs to evaluate one's own ideas, and CT skills such as induction and deduction are used to create a structured and clear argument.

Research by Glassner and Schwarz (2007) shows that argumentation lies at the intersection of critical and creative thinking. They argue that reasoning in adolescents, which is both critical and creative, is done through argumentation, and suggest that reasoning is constantly being influenced by other perspectives and information. The ability to think creatively as well as critically about new information is managed through argumentation [116]. The back-and-forth process of accommodating, evaluating, and being open-minded to new information can be seen as critical and creative thinking working together. The way in which one reaches conclusions, however, comes from the ability to weigh this information and then successfully judge the validity of the solution one arrives at. There is also clear evidence that argumentation helps students nurture CT skills.

It is clear that CT can directly impact argumentation, but the relationship is bidirectional, with argumentation instruction developing CT skills. A study by Gold et al. shows that CT skills can be fostered through the use of argument analysis and storytelling in instruction [117]. This research suggests that argumentation and argument analysis are beneficial not only to students but also to older adults: the study was conducted using mature adult managers as participants. The article outlines four skills of CT that can be impacted by the use of argument analysis and storytelling: critique of rhetoric, tradition, authority, and knowledge. These four CT skills are somewhat deeper than many instructed in high schools and extremely important to develop. Argumentation that enables a person to gain a better perspective on their views about these things is essential to developing personal values, as well as to using argumentation and CT to critique those values when presented with new information. The ability of argumentation to help individuals analyze their own traditions and knowledge is important for all students, as it can give them better insight into what they value.

Argumentation benefits both CT skills and creative thinking skills in high school students. Research by Demir and İsleyen (2015) shows that an argumentation-based science learning approach improves both types of thinking in 9th graders [118]. Since, as mentioned earlier, creative and critical thinking use argumentation as a means of reasoning to draw conclusions, it is not surprising that argumentation in instruction also fosters both abilities. In summation, there is a clear link between argumentation and CT, along with many skills in the subset of CT skills; explicit instruction of each seems to foster the growth of the other, and the two can be seen as complementary. The next sections of this chapter examine how these aspects can be beneficial when taught within the curriculum, and how they go hand in hand in fostering sound reasoning as well as skills that will help students throughout their lives.

Instructional Application of Argumentation and Critical Thinking


Teaching Tactics

An effective method for structuring the instruction of CT is to organize thinking skills into clear, sequential steps. The ordering of these steps helps guide students toward internalizing them so that they can be applied in daily life. Taking a deductive approach, starting from broader skills and narrowing down to task-specific skills, helps students begin from what they know and generate, through CT, something they had not known before. In the spirit of CT, students' awareness of their own skills also plays an important role in their learning. In the classroom, they should be encouraged to reflect upon the process through which they completed a goal rather than just the result. Through this encouragement of reflection, students can become more aware of the thinking skills necessary for tasks such as argumentation.

Instructing CT and argumentation requires the instructor to use CT skills first: in designing a plan to teach CT, one must critically evaluate and assess different methods and make an informed decision about which would work best for one's class. There are a variety of approaches to instructing CT. Descriptive models consist of explanations of how "good" thinking occurs; specifically, they focus on thinking strategies, such as heuristics, for assessing information and making decisions. Prescriptive models consist of explanations of what good thinking should be. In a sense, these models give a prototype, a "prescription", of what good thinking is. This approach is comparatively less applicable and sets a high standard of what is expected of higher order thinking. In addition to evaluating which approach would work best, prior to teaching CT instructors need to carefully select the specific CT skills they want students to learn. This involves assessing factors such as the age range, performance level, and cognitive ability of one's class in order to create a program that benefits most, if not all, of the students. A final aspect of instruction to consider as an educator is whether direct or indirect instruction will be used to teach CT. Direct instruction refers to the explicit teaching of CT skills, emphasizing rules and steps for thinking; it is most effective when solutions to problems are limited or when the cognitive task is easy. In contrast, indirect instruction refers to a learner-oriented type of teaching that focuses on students building their own understanding of thinking; it is most effective when problems are ambiguous, unclear, or open to interpretation, such as moral or ethical decisions [111].

One example of indirect CT instruction is the writing of literature reviews. According to Chandler and Dedman, the skills needed to collect, assess, and write literature reviews, as well as to summarize the results of studies, require CT. In a teaching note, they evaluated a BSW (Baccalaureate of Social Work) program that strived to improve CT in undergraduate students. Specifically, they assert that practical writing assignments, such as creating literature reviews, help students combine revision and reflection while expanding their thinking to evaluate multiple perspectives on a topic. They found that upon reframing the assignment as a tool to help students become critical reviewers, students viewed the literature review both as a summation of course material and as an opportunity to improve critical reading and writing skills. Through questioning during discussions, students were guided to analyze the authority and credibility of their articles, and they actively sought more evidence to support articles on their topics. Students successfully created well-synthesized literature reviews at the end of the BSW program [119]. This program used implicit instruction of CT skills through dialogue between instructor and students as well as peer engagement. Instead of explicitly stating specific skills or steps for learning CT, the instructors led the students to practice CT through an assignment. As students worked on the assignment, they needed to use reasoning, analysis, and inferential skills in order to synthesize and draw conclusions from the evidence they found on their topics. Practical application of CT skills through an assignment thus helped students develop CT through indirect instruction.


Argument mapping is a way to visualize argumentation. The following are links to argument mapping software:
  • https://www.rationaleonline.com/
  • http://www.argunet.org/editor/
  • http://debategraph.org/planet
  • https://www.truthmapping.com/map/1021/#s7164
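
To make the structure these tools visualize concrete, here is a minimal sketch of an argument map as a data structure: a claim with premises that support it and objections that attack it, each of which can be argued in turn. This is an illustrative model only, not the data model of any of the tools listed above; all names in it are hypothetical.

    # Minimal sketch of an argument map: each node is a claim, with
    # edges to premises that support it and objections that attack it.
    # Illustrative only; not the data model of any tool linked above.
    from dataclasses import dataclass, field

    @dataclass
    class Claim:
        text: str
        supports: list = field(default_factory=list)  # reasons for the claim
        attacks: list = field(default_factory=list)   # objections against it

        def show(self, indent: int = 0) -> None:
            """Print the map as an indented tree."""
            pad = "  " * indent
            print(pad + self.text)
            for s in self.supports:
                print(pad + "  [+] supported by:")
                s.show(indent + 2)
            for a in self.attacks:
                print(pad + "  [-] objected to by:")
                a.show(indent + 2)

    # Example: mapping a classroom debate topic.
    main = Claim("Homework should be limited to 30 minutes per night")
    main.supports.append(Claim("Students need time for sleep and family"))
    objection = Claim("Less homework means less practice")
    objection.attacks.append(Claim("Quality of practice matters more than quantity"))
    main.attacks.append(objection)
    main.show()

Laying an argument out this way makes each claim's support and opposition explicit, which is exactly the analysis step of CT described above.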

Skills Programs for CT

These programs aid in the formulation of critical thinking skills through alternative methods of instruction such as problem-solving. They are usually targeted towards special populations such as students with learning disabilities or cognitive deficits.

The CoRT Thinking Materials

The CoRT (Cognitive Research Trust) program is based on de Bono's idea that thinking skills should be taught in school as a subject [120]. The Thinking Materials are geared towards the improvement of thinking skills. This skills program takes a Gestalt approach and emphasizes the perceptual factor of problem solving. It usually spans two years and is suitable for a wide age range of children. The lessons strive to develop creative thinking, problem-solving, and interpersonal skills. The materials are split into 6 units covering topics such as planning, analyzing, comparing, selecting, evaluating, and generating alternatives. A typical unit has leaflets covering a single topic, followed by examples using practice items. The leaflets are usually effective in group settings. The focus of these units is to practice thinking skills; therefore, much of the instructional time is spent practicing the topics brought up in the leaflets [111].

Much of the empirical research on this stand-alone program revolves around the development of creative thinking; it is, however, relatively more extensive than the research on the other programs mentioned in this chapter. The CoRT program has been shown to improve creativity in gifted students. Al-Faoury and Khwaileh (2014) assessed the effectiveness of CoRT on gifted students' creative writing abilities. The students were given a pretest that evaluated fluency, flexibility, and originality in writing creative short stories [120]. Students in the experimental group were taught 20 CoRT lessons in total, 10 from CoRT 1 ("Breadth") and 10 from CoRT 4 ("Creativity"), over the course of three months, while the control group received traditional lessons on creative writing. The posttest followed the same parameters as the pretest, and the results were analyzed by comparing pre- and posttest scores. The researchers found a statistically significant effect of CoRT on the experimental group's fluency, flexibility, and originality scores; the mean scores of the experimental group in all three elements were higher than those of the control group [120]. These findings suggest that the CoRT program aids gifted students' creative writing skills, as indicated through the use of rhetorical devices (metaphor, analogy, etc.), the development of characters through dialogue, and the control of complex structures [120]. The flexibility and fluency of writing are also applicable to the practice of argumentation and CT: in developing the ability to articulate and modify ideas, students can transfer these skills from creative writing towards higher-order cognitive processes such as CT and argumentation.

The Feuerstein Instrumental Enrichment Program (FIE)

The FIE is a specialized program of mediated learning experiences that strives to develop critical thinking and problem solving skills. Mediation is learning through interaction between the student and a mediator. Similar to Vygotsky's scaffolding, mediation is student-oriented and hinges upon four parameters: intentionality, reciprocity, transcendence, and meaning. [121] Intentionality emphasizes the difference between mediation and mere interaction: the student and mediator have a common goal in mind. Reciprocity reflects the student-oriented mentality of mediation: the response of the student holds more importance than academic results. Transcendence focuses on the connectivity of the mediation: it encourages the formation of associations and applications that stretch beyond the scope of the immediate material. Lastly, meaning in mediation is where the student and mediator explicitly identify the "why" and "what for", which promotes dialogue between the two during mediation. [121] [122]

The "instruments" used to facilitate instruction are a series of paper and pencil exercises geared towards practicing internalizing higher order thinking strategies. The instruments cover domains such as analytic perception, spatial organization, categorization, comparison and many more. The implementation of this program varies across countries and is also dependent on the targeted population. A typical program contains 14 units with 3-4 sessions for a few hours every week administered by trained IE staff and teachers. [121]

The Productive Thinking Program

The Productive Thinking Program develops planning skills, the generation and checking of hypotheses, and the creation of new ideas. It is designed as a set of 15 lessons to be completed over one semester, targeting upper-level elementary school students. The lessons are administered through narrative booklets, often taking a detective-like approach to problem solving in which the student is the detective solving a mystery. A structured sequence of steps guides the student toward an objective specific to the lesson at hand. [123] Following the booklet or story, supplementary problems are given so that students can apply and practice the learned skills. [111]

The IDEAL Problem Solver

The IDEAL Problem Solver structures problem solving as five steps using the acronym IDEAL. First, (I)dentify the problem: the solver needs to find out what the problem is. Second, (D)efine the problem: form a clear picture of the entire problem before trying to solve it. Third, (E)xplore the alternatives: assess the potential solutions available. Fourth, (A)ct on a plan: apply the solution and do the act of solving. Lastly, (L)ook at the effects: evaluate the consequences of the chosen solution. IDEAL is flexible in that it can be adapted to suit a wide age range and different levels of ability, and it can be applied to different domains such as composition or physics. [111]
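
The five steps lend themselves to a simple checklist that a solver or tutor can walk through for any problem. The sketch below is a hypothetical teaching aid built from the description above; the prompts paraphrase the acronym and are not drawn from the IDEAL materials themselves.

    # Hypothetical checklist walk-through of the IDEAL steps; the prompts
    # paraphrase the acronym and are not from the original materials.
    IDEAL_STEPS = [
        ("Identify", "What exactly is the problem?"),
        ("Define", "Can you picture the whole problem before solving it?"),
        ("Explore", "What alternative solutions are available?"),
        ("Act", "Apply the chosen plan."),
        ("Look", "What were the effects of the chosen solution?"),
    ]

    def ideal_walkthrough(problem: str, notes: dict) -> None:
        """Print each IDEAL step with the solver's notes for a problem."""
        print("Problem: " + problem)
        for step, prompt in IDEAL_STEPS:
            answer = notes.get(step, "(not yet considered)")
            print("  " + step + ": " + prompt)
            print("    -> " + answer)

    # Example use in a physics context:
    ideal_walkthrough(
        "A ball is dropped from 20 m; how long until it lands?",
        {
            "Identify": "Find the fall time of a dropped object.",
            "Define": "Free fall from rest, 20 m height, no air resistance.",
            "Explore": "Kinematics (h = 1/2 g t^2) or energy methods.",
            "Act": "t = sqrt(2h/g) = sqrt(40/9.8), approximately 2.0 s.",
            "Look": "Check units and plausibility; about 2 s is reasonable.",
        },
    )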

Instructing Argumentation

Research on argumentation is a comparatively new field of study in education, but it has been noted to be of significant importance to almost all educational settings. Grade schools, high schools, and colleges now emphasize the use of argumentation in the classroom, as it is seen as the best preparation for communication and debate in both vocational and educational settings around the world. [124] A longitudinal study by Crowell and Kuhn showed that an effective way to help students gain argumentative skills is consistent and dense application of argumentation in the classroom and as homework. [124] During this longitudinal study, students were exposed to a variety of methods through which they gained argumentative abilities. The activities employed, such as peer collaboration, using computers, reflection activities, individual essays, and small group work, all appear valuable in teaching argumentation, although it is not clear which are the most effective. [124] The data also showed that students all rose to a similar level of argumentative ability, regardless of what they scored on argumentative tests before the study began. This shows that even students with seemingly no argumentative skills can be instructed to become as skilled as, or more skilled than, peers who tested higher at the beginning of the study. [124]

Dialogue and Argumentation

Research by Crowell and Kuhn (2011) highlights collaborative dialogical activities as practical interventions in the development of argumentative skills. The researchers implemented a longitudinal argumentative intervention that used topic cycles to structure a middle school philosophy class [125]. The students had class twice a week for 50 minutes per class over a span of three years. The intervention was as follows: first, students were split into small groups on the same side of an argument ("for" and "against" teams) to generate ideas around the topic. Then individuals from either side argued with an opponent through an electronic medium. Finally, the students engaged in a whole-class debate. These three stages were termed Pregame, Game, and Endgame, respectively. After the intervention, students were required to write individual essays on the topic, through which their argumentative skills were assessed [125]. The results showed an increase in the generation of dual-perspective arguments in the intervention group. Such arguments require the arguer to assume the stance opposing one's own and reason through its implications; this type of argument reflects higher-order reasoning that requires critical assessment of multiple perspectives. These results did not begin to appear until year two and were only found statistically significant in year three, suggesting that argumentative skills have a longer developmental trajectory than lower-level cognitive skills [125]. Through this stand-alone intervention, the collaborative aspect of dialogical activities facilitated the development of the intellectual dispositions necessary for good argumentation [125].


Further research suggests that teaching through collaborative discussions and argumentative dialogue is an effective strategy [105]. Through argumentation, students can acquire knowledge of concepts as well as the foundational ideas behind them. In formulating arguments, students need to generate premises that give an argument structure through accepted definitions or claims. Argumentation helps students reveal and clarify misconceptions as well as elaborate on background knowledge. The two aforementioned dimensions of argumentation, dialogue and structure, are often used in assessing and measuring argumentative performance [105]. Specifically, through student-expert dialogue, students can be guided to give certain arguments and counterarguments depending on the expert's dialectical decisions [105]. This scaffolding helps the student engage in more critical evaluations that delve deeper into the topic under discussion.

In a study using content and functional coding schemes of argumentative behavior during peer-peer and peer-expert dialogue pairings, Macagno, Mayweg-Paus and Kuhn (2014) found that through student-expert dialogues, students were later able to formulate arguments dealing with abstract concepts at the root of the issue at hand (i.e. ethical principles, conflicts of values), compared with peer-peer dialogues [105]. The expert used more specific and sophisticated ways of attacking the student's argument, such as suggesting an alternative solution to the problem at hand, which in turn enhanced the performance of the student in later meta-dialogues [105]. The results suggest that the practical application of argumentation through collaborative activities facilitates the development of argumentation skills. As with the development of CT skills, it is implicit instruction through the practice of argumentation in interactive settings, rather than direct teaching, that helps argumentation develop.

Science and Argumentation

Much of the literature surrounding the application of argumentation in the classroom revolves around the scientific domain. Argumentation is often used as a tool in science learning to enhance CT skills, improve class engagement, and activate prior knowledge and beliefs about the subject [105]. Scientists themselves use argumentation to articulate and refine scientific theories and knowledge [104]. Jonassen and Kim (2010) assert that science educators emphasize the role of argumentation more than other disciplines do [126]. Argumentation supports learning how to solve well-structured problems as well as ill-structured ones in science, and by extension, in daily life. Ill-structured problems in particular reflect practical everyday problems, where goals and limitations are unclear and there are multiple solution pathways as well as multiple factors for evaluating possible solutions [104].

Through argumentation, students learn to use sound reasoning and CT to assess and justify their solutions to a problem. For example, a well-structured problem would be one posed in a physics class, where concrete laws and formulas dictate the solution pathway, or the review questions found at the end of textbook chapters, which require the application of a finite set of concepts and theories. An ill-structured problem would be finding the cause of heart disease in an individual: multiple developmental and lifestyle factors contribute to this one problem, in addition to the various forms of heart disease that need to be evaluated, and the problem requires knowledge from other domains such as nutrition, emotional well-being, and genetics. Since ill-structured problems do not have a definite answer, students are provided with an opportunity to formulate arguments that justify their solutions [104]. Through the practice of resolving such problems in science, students can use CT to develop their argumentative ability.

Both one's willingness to argue and one's ability to argue play a significant role in learning science [127]. Science is, after all, extremely argumentative at its core.

If students are able to engage in argumentation at an early age, their knowledge of specific content such as science can grow immensely. The main reason for this is argumentative discourse. Being able to disagree with others is extremely important: adolescents are at an age which is fundamentally social (i.e. junior to senior high), and using this social ability is pivotal, as students at this point may have the confidence to disagree with one another. When a student disagrees with another in argument in a classroom setting, it gives them an opportunity to explain the way in which they think about the material. This verbalization of one's own thoughts and ideas on a subject can help immensely with learning it [127]. It also allows the student to reflect upon and expand their ideas as they present them to the class, which helps with learning, and it provides the opportunity to identify any misconceptions they hold about the subject, as they will more than likely receive rebuttal arguments from others in the class [127]. All these factors are aspects of CT and contribute to learning of the concept and to conceptual change in the student, which is what learning is all about. The nature of adolescent social behaviour could thus provide a window through which argumentation benefits their learning of science in dramatic ways [127].

Argumentation, Problem Solving and Critical Thinking in History Education

History education offers learners abundant opportunity to develop their problem solving and critical thinking skills while broadening their perspective on the human condition. The study of history addresses a knowledge gap: specifically, the difference between our knowledge of the present day and the "infinite, unorganized and unknowable everything that ever happened". [128] It has long been understood that the study of history requires critical thought and analytical problem-solving skills. To become proficient at the study of history, learners must interpret and construct how we come to know about the past and navigate the connection between the past and the body of knowledge we call history. [129] Unfortunately, history education has often been reduced to simply recalling factual information, via the overuse of rote memorization and multiple-choice testing, all of it placed outside the context of the present day. This approach does little to inspire a love of history, nor does it support the learner's ability to construct an understanding of how the past and present are connected.

On the other hand, the study of science and mathematics has for many years been centred on developing skills through problem-solving activities. Students learn basic skills and build upon them through a progression of increasingly complex problems in order to further their understanding of scientific theory and mathematical relationships. Specific to science education, learners are taught to think like scientists and approach problems using the scientific method. If this approach works well for science and math education, why should it not be utilized for the teaching of history? [128] To develop historical thinking skills, it is necessary for instructors to teach the strategies and problem-solving approaches used by professional historians. However, unlike those in science and mathematics, the problems we solve in history are often ill-defined and may be unanswerable in a definitive sense, making it more challenging for students to learn and transfer these skills. The following section addresses these challenges and provides support for teaching historical thinking via The Big Six Historical Thinking Concepts (2013).

Historical Thinking - The Big Six

Based upon years of research and first-hand classroom experience, Seixas and Morton (2013) established a set of six competencies essential to the development of historical thinking skills. Much like the science and mathematics education discussed above, the Big Six approach to history education allows the learner to progress from simple to advanced tasks. Moreover, it is intended to help the learner "move from depending on easily available, commonsense notions of the past to using the culture's most powerful intellectual tools for understanding history" (pg 1). [128] Additionally, the Big Six concepts reveal to the learner the difficulties we encounter while attempting to construct a history of the past. The Big Six competencies are: historical significance, evidence, continuity and change, cause and consequence, historical perspectives, and the ethical dimension.

Historical Significance

To develop a critical view of history, the learner must recognize and define the qualities that make something (e.g., a person, event, or social change) historically significant and why it is worth their time to learn about. Behaviourist approaches to history education, focusing on the textbook as the main source of information, have caused learners to become passive in their approach to learning about the past: the textbook becomes the authority on what they need to know. Moreover, the sole use of textbooks to teach national history may contribute to the creation of a "master narrative" that limits students' access to what is controversial about their country's past. [130] By shifting the focus away from the textbook, learners may be able to further their critical thinking skills by following the steps historians take to study the past and constructing their own "reasoned decisions about historical significance". [128] However, even if learners are provided primary source evidence to construct a narrative of the past, if they are not taught to recognize the subjective side of historical thinking - why these pieces of evidence were selected, why this topic was selected, and why both are historically significant - they may not recognize the impact of human motivation on the construction of historical understanding. Unlike scientific inquiry, which relies on a "positivistic definition of rationality", historical thinking requires learners to acknowledge human motivation: their own motivation in studying the past, their instructor's motivation for selecting certain topics of study, and the motivation of those living in the past. [131]

Seixas & Morton (2013) cite two elements involved in constructing historical significance: “big, compelling concerns that exist in our lives today, such as environmental sustainability, justice, power, [and] welfare” and “particular events, objects, and people whose historical significance is in question” (pg 16) [128] The intersection between these two elements is where historical significance is found. It is useful here to add Freedman’s (2015), definition of critical historical reasoning . Critical historical reasoning requires us to recognize that the study of history is not objective. Historians “frame their investigations through the questions they pose and the theories they advance” and therefore, learners of history must analyze the “integrity of historical narratives and their pattern of emphasis and omission” (pg 360). [131] Critical historical reasoning aims towards “conscious awareness of the frame one has adopted and the affordances and constraints it imposes” (pg 360) [131] . Therefore, both historians and learners of history must recognize that historical significance is assigned and not an inherent feature of the past, and, importantly, is subject to change.

Evidence

The second set of competencies described by Seixas and Morton (2013) is based on using evidence to address an inquiry about the past. In a study of the cognitive processes involved in evaluating source documents, Wineburg (1991) lists three heuristics: corroboration, sourcing, and contextualization. Corroboration refers to comparing one piece of evidence to another; sourcing is identifying the author(s) of the evidence prior to reading or viewing the material; and contextualization refers to situating evidence in a specific time and place (pg 77). [132]

This study utilized an expert/novice design to compare how historians and high school students make sense of historic documents. Wineburg (1991) argues that the historians were more successful in the task not because of the "schema-driven processing" common to science and mathematics, but because they built a model of the historic event through the construction of "context-specific schema tailored to this specific event" (pg 83). [132] Additionally, the historians demonstrated greater appreciation for the sources of the historic documents than the students did, suggesting that the students did not make the connection between a document's author and the reliability of the source. As Wineburg states, the historian understands "that there are no free-floating details, only details tied to witnesses, and if witnesses are suspect, so are their details" (pg 84). [132] This study suggests that historical understanding could be improved by teaching the cognitive strategies historians use to construct history.

Multiple narratives of the past exist because individuals bring their own values and experiences to their interpretations of historical evidence. Recognizing this may push learners beyond accepting historic accounts at face value and pull them towards a more critical approach to history. Inquiry-based guided discovery activities, such as Freedman's (2015) Vietnam War narrative study, suggest that students may gain an awareness of the way they and others "frame" history by exploring primary source documents and comparing their accounts with standardized accounts (i.e. a textbook). [133] By allowing learners to view history as an interpretation of evidence rather than a fixed body of knowledge, we can promote critical thought through the learners' creation of evidence-based inferences and construction of arguments to support those inferences.

Continuity and Change

Developing an understanding of continuity and change requires the learner to recognize that these two elements overlap across the chronology of history: some things change at the same time that other things remain the same. If students are able to recognize continuity and the processes of change in their own lives, they should be able to transfer this understanding to their study of the past. [134] Students should be encouraged to describe and question the rate and depth of historic change, as well as consider whether a change should be viewed as progress or decline. [134] The evaluation of historic change as positive or negative is, of course, dependent on the perspective of the viewer. An example of continuity through history is the development of cultural identity. Carretero and van Alphen (2014) explored this concept in their study of master narratives among Argentinian high school students. They suggest that identity can be useful in facilitating history education, but can also create misconceptions when the learner confounds past with present (presentism), as demonstrated by the use of "we" to discuss people involved in victorious battles or revolutions of the past which gave shape to a nation (pg 308-309). [130] It is useful, then, to teach students to differentiate between periods of history. However, the periodization of history, like everything else in the knowledge domain, is based on interpretation and depends on the questions historians ask. [134]

Educational technology such as interactive timelines, narrative history games, and online discussion groups may help learners make connections between past and present. For example, the Museum of Civilization offers a teaching tool on the history of Canadian medicare ( http://www.museedelhistoire.ca/cmc/exhibitions/hist/medicare/medic01e.shtml ). Interactive timelines allow students to see connections between continuity, change, cause, and consequence by visually representing where these elements occur over historic time. Guiding learners' exploration of interactive timelines with strong inquiry questions may also improve students' understanding and facilitate the development of historical thinking. For example, an investigation into the European Renaissance could be framed by the question: "Did everyone in Europe experience the Renaissance the same way?" Questions such as this are open-ended, so as not to restrict where students take their inquiry, but also suggest a relationship between the changes of the Renaissance and the continuity of European society. Other examples of educational technology that support historical thinking include the "World History for Us All" project ( http://worldhistoryforusall.sdsu.edu/ ). This website offers world history units separated into large-scale and local-scale topics and organized by historic period; the lesson plans and resources may allow the learner to make connections between local issues and the broader, global conditions affecting world history. Finally, a case study by Blackenship (2009) suggests that online discussion groups are useful for developing critical thinking because they allow the teacher to view students' thought processes, thereby facilitating formative assessment and informing the type of instructional interventions required. Blackenship (2009) cites additional research supporting the use of online discussion because it allows learners to collect their thoughts before responding to a discussion prompt; they have more time to access prior knowledge and consider their own ideas. [135]

Cause and Consequence

The historical thinking competencies of cause and consequence require learners to become proficient at identifying direct and indirect causes of historic events as well as their immediate and long-term consequences. Effective understanding of the causes of historic change requires recognizing both the actions of individuals and the prevailing conditions of the time. Historical thinking requires students to go beyond simplistic immediate causes and think of history as a web of "interrelated causes and consequences, each with various influences" (pg 110). [134] In addition to improving understanding of the past, these competencies may help learners better understand present-day conflicts and issues. Shreiner (2014) used the novice/expert format to evaluate how people utilize their knowledge of history to make reasoned conclusions about present events. Similar to the Wineburg (1991) study discussed above, Shreiner (2014) found that the experts were better at contextualizing and sourcing when critically analyzing documents for reliability and utility in establishing a reasoned judgement. Additionally, the study found that while students would use narrative to construct meaning, they typically created schematic narrative templates: general statements about the past lacking specific details and events. [136] Seixas and Morton (2013) caution against overly simplistic timelines of history because they can create the misconception that history is nothing more than a list of isolated events. The study indicates that historical narratives that follow periodization schemes and are characterized by cause-and-effect relationships, as well as change over time, are helpful for understanding contemporary issues. [134] Therefore, it is important that educators work to develop these competencies in students. Much like historic change, the consequences of certain actions in history can be viewed as positive or negative depending on perspective. This is discussed in further detail below.

Historical Perspectives and Ethics

The final two historical thinking competencies proposed by Seixas and Morton are historical perspectives and ethics. Historical perspective refers to analyzing the historical context for conditions that would influence a historic figure to view an event or act in a particular way. These could include religious beliefs, social status, geographic location, time period, prevailing economic and political conditions, and social or cultural conditions. This again requires some interpretation of evidence, as we often do not have evidence that explicitly describes a historic figure's attitudes and reasons for acting. Primary source documents such as letters and journals can provide insight, but still require the historian to use inference to make sense of the documents and connect the information to a wider historical narrative or a biographical sketch of an individual. Additionally, "[h]ard statistics, such as birth and death rates, ages of marriage, literacy rates, and family size... can all help us make inferences about people's experiences, thoughts, and feelings" (pg 143). [134] There are, of course, limits to how much we can infer about the past; however, Seixas and Morton (2013) suggest that acknowledging the limits of what we can know about the past is part of "healthy historical thinking" (pg 143). [134] Learners can develop their understanding of historical perspective by observing the contrast between past and present ways of life and worldviews, identifying universal human traits that transcend time periods (e.g., love for a child), and avoiding presentism and anachronism. [134] A greater understanding of historical perspective will be useful to students when encountering conflicting historical accounts, as they will be able to see where the historical actors are "coming from" and therefore better understand their actions.

Historical perspective and ethics are related. Seixas and Morton (2013) argue that "the ethical dimension of historical thinking helps to imbue the study of history with meaning" (pg 170). [134] To understand the moral reasons for an individual's actions, we need to understand the influence of historical, geographical, and cultural context. Additionally, to understand the ethical consequences of the past we make moral judgments, which require "empathetic understanding[;] an understanding of the differences between our moral universe and theirs" (Seixas and Peck, 2004, pg 113). [137] People with little experience in historical thinking have difficulty separating the moral standards of today's society from those of past societies. Students also tend to judge other cultures more critically than their own, often defending or justifying the actions of their own nations. [138] Therefore, Lopez, Carretero and Rodriguez-Moneo (2014) suggest using national narratives of nations different from the learner's own to more effectively develop critical historical thinking. As learners become proficient at analyzing the ethical decisions of the past, they can translate these skills to analyzing present-day ethical questions.

Role playing is a useful instructional strategy for teaching historical perspective. Traditional, face-to-face classrooms allow for dramatic role play activities, debates, and mock trials in which students take on the role of an individual or social group from history. Additionally, educational games and websites allow for the integration of technology with the role play strategy. Whitworth and Berson (2003) found that, in the 1990s-2000s, technology in the social studies classroom was focused mostly on using the internet as a digital version of material that would otherwise have been presented in the classroom. They suggest that alternative uses of technology, such as inquiry-based webquests, simulations, and collaborative working environments, promote interaction and critical thinking skills. [139] One example of a learning object that promotes critical thinking through role playing is the Musee-McCord's online game collection ( http://www.mccord-museum.qc.ca/en/keys/games/ ). Specifically, the Victorian Period and Roaring Twenties games allow the learner to progress through the time period and make decisions appropriate to its historic context. These games are paired with relevant resources from the museum collections, which can enhance the learner's depth of understanding of the period. The ethical component of history can also be explored through historical narratives, debates on ethical positions regarding historic events, and the evaluation and critique of secondary sources for ethical judgements.

To summarize, introducing professional historians' strategies for studying history is widely regarded as a way to improve historical thinking in students. Professional historians' cognitive processes of corroborating accounts, critically analyzing sources, and establishing historic context are well reflected in Seixas and Morton's Big Six Historical Thinking Concepts (2013). Historical thinking gives students the skills to problem solve within the context of history, to make sense of the past, and to connect it to the present, broadening the learner's perspective, illuminating prevailing social conditions, and influencing how they interact with the world. See the Historical Thinking Project's webpage ( http://historicalthinking.ca/lessons ) for instructional ideas covering all the historical competencies.

Instructing through Academic Controversy

Using the technique of Academic Controversy can be an effective way of teaching both argumentation and CT skills. Academic Controversy involves dividing a cooperative group of four into two pairs of students and assigning them opposing positions on an argument or issue, after which the two pairs each argue for their position. The pairs then switch positions and argue again; finally, the group of four is asked to come up with an all-around solution to the problem [140]. This activity can be effective in instructing aspects of both argumentation and CT, though it may be a bit dated. The activity is argumentative by nature, making students come up with reasons and claims for two sets of arguments. This equilibrium is important to the argumentative process because it provides students with an opportunity to evaluate the key points of their own argument and the opposition's, which is beneficial in any debate. The activity also engages students in several aspects of CT, such as evaluation, since they must assess each side of the argument, and it engages metacognitive processes, as the students must arrive at a synthesized conclusion with their peers about their own arguments, a process which requires them to be both analytical and open-minded. This activity is a good way of increasing both CT skills and argumentation, as it requires students to be open-minded while also engaging in analytical debate.
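
Because the activity follows a fixed rotation, its structure is easy to lay out explicitly. The sketch below is a hypothetical outline of one group's session based on the description above; the names, phase labels, and wording are illustrative only, not part of the original technique's materials.

    # Hypothetical outline of an Academic Controversy session for one
    # cooperative group of four; the phase structure follows the
    # description above, but names and labels are illustrative only.
    def academic_controversy(group: list, issue: str) -> None:
        """Print a phase-by-phase plan for one group of four students."""
        assert len(group) == 4, "Academic Controversy uses groups of four"
        pair_a = " & ".join(group[:2])
        pair_b = " & ".join(group[2:])
        phases = [
            ("initial positions", pair_a + " argue FOR; " + pair_b + " argue AGAINST"),
            ("positions switched", pair_a + " argue AGAINST; " + pair_b + " argue FOR"),
            ("synthesis", "all four students draft one all-around solution"),
        ]
        print("Issue: " + issue)
        for i, (label, action) in enumerate(phases, start=1):
            print("  Phase " + str(i) + " (" + label + "): " + action)

    academic_controversy(["Ana", "Ben", "Chloe", "Dev"],
                         "Should schools assign daily homework?")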

Glossary

Suggested readings

  • Abrami, P.C., Bernard, R.M., Borokhovski, E., Wade, A., Surkes, M.A., Tamim, R., & Zhang, D. (2008). Instructional Interventions Affecting Critical Thinking Skills and Dispositions: A Stage 1 Meta-Analysis. Review of Educational Research, 78(4), 1102-1134. DOI: 10.3102/0034654308326084.
  • Phan, H.P. (2010). Critical thinking as a self-regulatory process component in teaching and learning. Psicothema, 22(2), 284-292.
  • Kozulin, A. & Presseisen, B.Z. (1995). Mediated Learning Experience and Psychological Tools: Vygotsky’s and Feuerstein’s Perspective in a Study of Student Learning. Educational Psychologist, 30(2), 67-75.
  • Crowell, A., & Kuhn, D. (2011). Dialogic Argumentation as a Vehicle for Developing Young Adolescents’ Thinking. Psychological Science, 22(4), 545-552. DOI: 10.1177/0956797611402512.

External links

  • Critical Thinking: How Children Can Start Thinking Deeply, Part 1
  • Critical Thinking for Kids In Action, Part 2
  • Critical Thinking for Kids In Action, Part 3
  • Critical Thinking for Kids In Action, Part 4
  • Critical Thinking Exercises for Kids

References

  • ↑ Heijltjes, A., Van Gog, T., & Paas, F. (2014). Improving Students' Critical Thinking: Empirical Support for Explicit Instructions Combined with Practice. Applied Cognitive Psychology, 28(4), 518-530.
  • ↑ a b c d e f g Murphy, K. P., Rowe, M. L., Ramani, G., & Silverman, R. (2014). Promoting Critical-Analytic Thinking in Children and Adolescents at Home and in School. Educational Psychology Review, 26(4), 561-578.
  • ↑ Gick, M. L. (1986). Problem-Solving Strategies. Educational Psychologist, 21(1/2), 99-121.
  • ↑ a b c Ku, K. Y., Ho, I. T., Hau, K., & Lau, E. C. (2014). Integrating direct and inquiry-based instruction in the teaching of critical thinking: An intervention study. Instructional Science, 42(2), 251-269.
  • ↑ a b c d e f g h i j k l m n o p q Mathews, S. R., & Lowe, K. (2011). Classroom environments that foster a Disposition for Critical Thinking . Learning Environments Research, 14(1), 59-73.
  • ↑ Glaser, E. M. (1941). An Experiment in the Development of Critical Thinking. Columbia University.
  • ↑ a b c d Phan, H.P. (2010). Critical thinking as a self-regulatory process component in teaching and learning. Psicothema, 22(2). 284-292.
  • ↑ a b c d e f Moon, J. (2007). Critical Thinking: An Exploration of Theory and Practice (1st ed.). London ; New York: Routledge.
  • ↑ a b Kurfiss, J. G. (1988). Critical Thinking: Theory, Research, Practice, and Possibilities: ASHE-ERIC/Higher Education Research Report, Volume 17, Number 2, 1988 (1st ed.). Washington, D.C: Jossey-Bass.
  • ↑ a b Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education, & National Research Council. (2011). Assessing 21st Century Skills: Summary of a Workshop. National Academies Press.
  • ↑ a b Mason, M. (2009). Critical Thinking and Learning. John Wiley & Sons.
  • ↑ a b Elder, L., & Paul, R. (2009). The Art of Asking Essential Questions (5th Edition). Dillon Beach, CA: Foundation for Critical Thinking
  • ↑ a b c Paul, R., & Elder, L. (2007). The Thinker’s Guide to The Art of Socratic Questioning. Dillon Beach, CA: The Foundation for Critical Thinking.
  • ↑ a b c d Morgan, N., & Saxton, J. (2006). Asking Better Questions (2nd ed.). Markham, ON: Pembroke Publishers.
  • ↑ a b Cain, R. B. (2007). The Socratic Method: Plato’s Use of Philosophical Drama. A&C Black.
  • ↑ Harmon, D. A., & Jones, T. S. (2005). Elementary Education: A Reference Handbook. ABC-CLIO.
  • ↑ Stanley, T., & Moore, B. (2013). Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom. Routledge.
  • ↑ a b Jung, I., Nishimura, M., & Sasao, T. (2016). Liberal Arts Education and Colleges in East Asia: Possibilities and Challenges in the Global Age. Springer.
  • ↑ Mason, M. (2009). Critical Thinking and Learning. John Wiley & Sons. p. 8.
  • ↑ Davies, M., & Barnett, R. (2015). The Palgrave Handbook of Critical Thinking in Higher Education. Springer.
  • ↑ a b c d Cibáková, D. (2015). Methods of developing critical thinking when working with educative texts. E-Pedagogium, (2), 135-145.
  • ↑ Garcia, T., & Pintrich, P. R. (1992). Critical Thinking and Its Relationship to Motivation, Learning Strategies, and Classroom Experience. 2-30.
  • ↑ a b Halpern, D. F. (2013). Thought and Knowledge: An Introduction to Critical Thinking. Psychology Press.
  • ↑ Browne, M. N., & Keeley, S. M. (2006). Asking the Right Questions: A Guide to Critical Thinking (8th ed.). Upper Saddle River, N.J: Prentice Hall.
  • ↑ Elder, L., & Paul, R. (2009). The Art of Asking Essential Questions (5th Edition). Dillon Beach, CA: Foundation for Critical Thinking. p. 3.
  • ↑ Micarelli, A., Stamper, J., & Panourgia, K. (2016). Intelligent Tutoring Systems: 13th International Conference, ITS 2016, Zagreb, Croatia, June 7-10, 2016. Proceedings. Springer.
  • ↑ a b c d e Conklin, W., & Teacher Created Materials. (2012). Strategies for Developing Higher-Order Thinking Skills, Grades 6 - 12. Shell Education.
  • ↑ http://www.janinesmusicroom.com/socratic-questioning-part-i-the-framework.html
  • ↑ Bloom, B. S., Krathwohl, D. R., & Masia, B. B. (1984). Taxonomy of educational objectives: the classification of educational goals. Longman.
  • ↑ Blosser, P. E. (1991). How to Ask the Right Questions. NSTA Press.
  • ↑ Wang, J.-F., & Lau, R. (2013). Advances in Web-Based Learning -- ICWL 2013: 12th International Conference, Kenting, Taiwan, October 6-9, 2013, Proceedings. Springer.
  • ↑ a b Gregory, G., & Kaufeldt, M. (2015). The Motivated Brain: Improving Student Attention, Engagement, and Perseverance. ASCD.
  • ↑ Carol, K., & Sandi, Z. (2014). Q Tasks, 2nd Edition: How to empower students to ask questions and care about answers. Pembroke Publishers Limited.
  • ↑ a b Doubet, K. J., & Hockett, J. A. (2015). Differentiation in Middle and High School: Strategies to Engage All Learners. ASCD.
  • ↑ a b c http://www.educ.kent.edu/fundedprojects/tspt/units/sixfacets.htm
  • ↑ a b McTighe, J., & Wiggins, G. (2013). Essential Questions: Opening Doors to Student Understanding (1st ed.). Alexandria, Va. USA: Association for Supervision & Curriculum Development.
  • ↑ a b c d e f g h i j k l m n o p q Bruning, G.J. Schraw & M.M. Norby (2011) Cognitive Psychology and Instruction (5th Ed). New York: Pearson.
  • ↑ Anderson, J. R. Cognitive Psychology and Its Implications. New York: Freeman, 1980
  • ↑ a b Mayer, R. E., & Wittrock, R. C. (2006). Problem solving. In P. A. Alexander & P. H. Winne (Eds.), Handbook of educational psychology (2nd ed., pp. 287–304). Mahwah, NJ: Erlbaum.
  • ↑ Snyder, M. J., & Snyder, L. G. (2008). Teaching critical thinking and Problem solving skills. Delta Pi Epsilon Journal, L(2), 90-99.
  • ↑ a b c Voss, J. F. (1988). Problem solving and reasoning in ill-structured domains. In C. Antaki (Ed.), Analyzing everyday explanation: A casebook of methods (pp. 74-93). London: SAGE Publications.
  • ↑ a b Pretz, J. E., Naples, A. J., & Sternberg, R. J. (2003). Recognizing, defining, and representing problems. In J. E. Davidson and R. J. Sternberg (Eds.), The psychology of problem solving (pp. 3–30). Cambridge, UK: Cam- bridge University Press.
  • ↑ a b c d Shin, N., Jonassen, D. H., & McGee, S. (2003). Predictors of Well-Structured and Ill-Structured Problem Solving in an Astronomy Simulation. Journal Of Research In Science Teaching, 40(1), 6-33.8
  • ↑ Simon, D. P. (1978). Information processing theory of human problem solving. In W. K. Estes (Ed.), Handbook of learning and cognitive process. Hillsdale, NJ: Lawrence Erlbau
  • ↑ Kitchener, K.S., Cognition, metacognition, and epistemic cognition. Human Development, 1983. 26: p. 222-232.
  • ↑ Schraw G., Dunkle, M. E., & Bendixen L. D. (1995). Cognitive processes in well-structured and ill-structured problem solving. Applied Cognitive Psychology, 9, 523–538.
  • ↑ Cho, K. L., & Jonassen, D. H. (2002) The effects of argumentation scaffolds on argumentation and problem solving. Educational Technology: Research & Development, 50(3), 5-22.
  • ↑ a b Jonassen, D. H. (2000). Revisiting activity theory as a framework for designing student-centered learning environments. In D. H. Jonassen, S. M. Land, D. H. Jonassen, S. M. Land (Eds.) , Theoretical foundations of learning environments (pp. 89-121). Mahwah, NJ, US: Lawrence Erlbaum Associates Publishers.
  • ↑ Spiro, R. J., Feltovich, P. J., Jacobson, M. J., & Coulson, R. L. (1992). Cognitive flexibility, constructivism, and hypertext: Random access instruction for advanced knowledge acquisition in ill-structured domains. In T. M. Duffy, D. H. Jonassen, T. M. Duffy, D. H. Jonassen (Eds.) , Constructivism and the technology of instruction: A conversation (pp. 57-75). Hillsdale, NJ, England: Lawrence Erlbaum Associates, Inc.
  • ↑ a b c d e Barrows, H. S. (1996). “Problem-based learning in medicine and beyond: A brief overview.” In L. Wilkerson & W. H. Gijselaers (Eds.), Bringing Problem-Based Learning to higher education: Theory and practice (pp. 3-12). San Francisco: Jossey Bass.
  • ↑ a b (Barron, B., & Darling-Hammond, L. (2008). Teaching for meaningful learning: A review of research on inquiry-based and cooperative learning. Powerful Learning: What We Know About Teaching for Understanding (pp. 11-70). San Francisco, CA: Jossey-Bass.)
  • ↑ Dochy, F., Segers, M., Van den Bossche, P., & Gijbels, D. (2003). Effects of problembased learning: A meta-analysis. Learning and Instruction, 13, 533–568.
  • ↑ Williams, D., Hemstreet, S., Liu, M.& Smith, V. (1998). Examining how middle school students use problem-based learning software. Unpublished paper presented at the ED-Media/ED Telecom ‘98 world Conference on Educational Multimedia and Hypermedia & World Conference on Educational Telecommunications, Freiberg, Germany.
  • ↑ Gallagher, S. A., Stepien, W. J., & Rosenthal, H. (1992). The effects of problem based learning on problem solving. Gifted Child Quarterly, 36, 195–200. Gertzman, A., & Kolodner, J. L.
  • ↑ Bruning, G.J. Schraw & M.M. Norby (2011) Cognitive Psychology and Instruction (5th Ed). New York: Pearson.
  • ↑ Martin, L., and D. L. Schwartz. 2009. “Prospective Adaptation in the Use of External Representations.” Cognition and Instruction 27 (4): 370–400. doi:10.1080/
  • ↑ Chambers, D., and D. Reisberg. 1985. “Can Mental Images Be Ambiguous?” Journal of Experimental Psychology: Human A pragmatic perspective on visual representation and creative thinking Perception and Performance 11 (3): 317–328.
  • ↑ a b Öllinger, M., Jones, G., & Knoblich, G. (2008). Investigating the effect of mental set on insight problem solving. Experimental Psychology, 55(4), 269-282. doi:10.1027/1618-3169.55.4.269
  • ↑ Duncker, K. (1945). On Problem Solving. Psychological Monograph, Whole No. 270.
  • ↑ a b c McCaffrey, T. (2012). Innovation relies on the obscure: A key to overcoming the classic problem of functional fixedness. Psychological Science, 23(3), 215-218.
  • ↑ a b Taconis, R., Ferguson-Hessler, M. G. M., & Broekkamp, H. (2002). Teaching science problem solving: An overview of experimental work. Journal of Research in Science Teaching, 38, 442–46
  • ↑ Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academies Press.
  • ↑ a b Novick, L. R., & Bassok, M. (2005). Problem solving. In K. Holyoak & R. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–350). Cambridge, UK: Cambridge University Press.
  • ↑ Fuchs, L. S., Fuchs, D., Stuebing, K., Fletcher, J. M., Hamlett, C. L., & Lambert, W. (2008). Problem solving and computational skills: Are they shared or distinct aspects of mathematical cognition? Journal of Educa- tional Psychology, 100, 30–
  • ↑ a b McNeill, K., & Krajcik, J. (2008). Scientific explanations: Characterizing and evaluating the effects of teachers’ instructional practices on student learning. Journal of Research in Science Teaching, 45, 53–78.
  • ↑ a b c d e Aleven, V. A. W. M. M., & Koedinger, K. R. (2002) An effective meta-cognitive strategy: learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive Science, 26(2), 147–179.
  • ↑ a b c d e f g h i Anderson, J. R., Corbett, A. T., Koedinger, K. R., & Pelletier, R. (1995). Cognitive Tutors: Lessons learned. Journal of the Learning Sciences, 4(2), 167.
  • ↑ a b c d e f Corbett, A. T., & Anderson, J. R. (2001). Locus of feedback control in computer-based tutoring: Impact on learning rate, achievement and attitudes. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 245–252). New York, NY, USA: ACM.
  • ↑ a b c d e f g Koedinger, K. R., & Corbett, A. (2006). Cognitive tutors. The Cambridge handbook of the learning sciences, 61-77.
  • ↑ Koedinger, K. R. (2002). Toward evidence for instructional design principles: Examples from Cognitive Tutor Math 6. In Proceedings of PME-NA XXXIII (The North American Chapter of the International Group for the Psychology of Mathematics Education).
  • ↑ Corbett, A., Kauffman, L., Maclaren, B., Wagner, A., & J,.ones, E. (2010). A Cognitive Tutor for genetics problem solving: Learning gains and student modeling. Journal of Educational Computing Research, 42(2), 219–239.
  • ↑ Corbett, A. T., & Anderson, J. R. (2008). Knowledge decomposition and sub-goal reification in the ACT programming tutor. Department of Psychology, 81.
  • ↑ Corbett, A. T., & Bhatnagar, A. (1997). Student modelling in the ACT programming tutor: Adjusting a procedural learning model with declarative knowledge. In User modelling (pp. 243-254). Springer Vienna.
  • ↑ Corbett, A. (2002). Cognitive tutor algebra I: Adaptive student modelling in widespread classroom use. In Technology and assessment: Thinking ahead. proceedings from a workshop (pp. 50-62).
  • ↑ Koedinger, K. R. & Anderson, J. R. (1993). Effective use of intelligent software in high school math classrooms. In Proceedings of the World Conference on Artificial Intelligence in Education, (pp. 241-248). Charlottesv
  • ↑ Koedinger, K., Anderson, J., Hadley, W., & Mark, M. (1997). Intelligent tutoring goes to school in the big city. Human-Computer Interaction Institute.
  • ↑ Koedinger, K. R., & Corbett, A. (2006). Cognitive tutors. The Cambridge handbook of the learning sciences, 61-77.
  • ↑ Resnick, L. B. (1987). Learning in school and out. Educational Researcher, 16(9), 13-20.
  • ↑ Polya, G. (1957). How to Solve It: A New Aspect of Mathematical Method. (2nd ed.). Princeton, NJ: Princeton University Press.
  • ↑ <Aleven, V. A. W. M. M., & Koedinger, K. R. (2002). An effective metacognitive strategy: learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive Science, 26(2), 147–179. p. 173
  • ↑ Corbett, A., Kauffman, L., Maclaren, B., Wagner, A., & Jones, E. (2010). A Cognitive Tutor for genetics problem solving: Learning gains and student modelling. Journal of Educational Computing Research, 42(2), 219–239.
  • ↑ Morgan, Alistar. (1983). Theoretical Aspects of Project-Based Learning in Higher Education. British Journal of Educational Technology, 14(1), 66-78.
  • ↑ a b c d Blumenfeld, Phyllis C., Elliot Soloway, Ronald W. MArx, Joseph S. Krajick, Mark Guzdial, Annemarie Palincsar.. (1991). Motivating Project-Based Learning: Sustaining the Doing, Supporting the Learning. Educational Psychologist, 26(3&4), 369-398.
  • ↑ a b c d e f Thomas, John W. (2000). A Review of Research on Project-Based Learning. San Rafael: Autodesk Press.
  • ↑ a b van Merriënboer, J. J. G., Clark, R. E., & de Croock, M. B. M. (2002). Blueprints for complex learning: The 4C/ID-model. Educational Technology Research and Development, 50(2), 39–61. doi:10.1007/bf02504993.
  • ↑ a b Dewey, J. (1938). Experience and education. New York: Macmillan.
  • ↑ a b c d e van Merrienboer, J. J. G., Paul A. Kirschner, & Liesbeth Kester. (2004). Taking the Load off a Learner’s Mind: Instructional Design for Complex Learning. Amsterdam: Open University of the Netherlands.
  • ↑ Piaget, Jean; Cook, Margaret (Trans). The construction of reality in the child. New York, NY, US: Basic Books The construction of reality in the child. (1954). xiii 386 pp. http://dx.doi.org/10.1037/11168-000 .
  • ↑ Ames, C. (1992). Classrooms: goals, structures, and student motivation. Journal of Educational Psychology, 84, 261-271.
  • ↑ Helle, Laura, Paivi Tynjara, Erkki Olkinora, Kristi Lonka. (2007). “Aint nothin’ like the real thing”. Motivation and study processes on a work-based project course in information systems design. British Journal of Educational Psychology, 70(2), 397-411.
  • ↑ a b Joseph Krajcik , Phyllis C. Blumenfeld , Ronald W. Marx , Kristin M. Bass , Jennifer Fredricks & Elliot Soloway (1998) Inquiry in Project-Based Science Classrooms: Initial Attempts by Middle School Students, Journal of the Learning Sciences, 7:3-4, 313-350, DOI: 10.1080/10508406.1998.9672057
  • ↑ a b c Gorges, Julia, Thomas Goke. (2015). How do I Know what I can do? Anticipating expectancy of success regarding novel academic tasks. British Journal of Educational Psyschology, 85(1), 75-90.
  • ↑ Hung, C.-M., Hwang, G.-J., & Huang, I. (2012). A Project-based Digital Storytelling Approach for Improving Students' Learning Motivation, Problem-Solving Competence and Learning Achievement. Educational Technology & Society , 15 (4), 368–379. 
  • ↑ a b Efstratia, Douladeli. (2014). Experiential education through project based learning. Procedia – Social and Behavioral Sciences . 152, 1256-1260.
  • ↑ a b c Capraro, R. M., Capraro, M. M., & Morgan, J. (2013). STEM Project-Based Learning: An Integrated Science, Technology, Engineering, and Mathematics (STEM) Approach (2nd Edition). New York, NY: Sense. 
  • ↑ Gary, Kevin. (2013), Project-Based Learning. Computer. (Vol 48:9). Tempe: Arizona State. 
  • ↑ a b c d e f g h Cross, N. (2007). Designerly ways of knowing . Basel, Switzerland: Birkha¨user.
  • ↑ Simon, H. A. (1996). The sciences of the artificial . Cambridge, MA: MIT Press.
  • ↑ Aflatoony, Leila & Ron Wakkary, (2015). Thoughtful Thinkers: Secondary Schooler’s Learning about Design Thinking. Simon Fraser University: Surrey, BC.
  • ↑ a b Koh, Joyce Hwee Ling, Chin Sing Chai, Benjamin Wong, & Huang-Yao Hong. (2015) Design Thinking for Education. Singapore: Springer Science + Business Media.
  • ↑ a b c d Schon, D. A. (1983). The reflective practitioner: How professionals think in action (Vol. 5126). New York, NY: Basic Books.
  • ↑ Hu, Weiping, Philip Adey, Xiaojuan Jia, Jia Liu, Lei Zhang, Jing Li, Xiaomei Dong. (2010). Effects of a “Learn to Think” Intervention programme on primary school students. British Journal of Educational Psychology . 81(4) 537-557.
  • ↑ a b Wells, Alastair. (2013). The importance of design thinking for technological literacy: a phenomenological perspective . International Journal Technol Des Educ. (23:623-636). DOI 10.1007/s10798-012-9207-7.
  • ↑ a b c d Jonassen, D.H., & Kim, B. (2010). Arguing to learn ad learning to argue: design justifications and guidelines. Education Technology & Research Development, 58(4), 439-457. DOI 10.1007/s11423-009-9143-8.
  • ↑ a b c d e f g h Macagno, F., Mayweg-Paus, W., & Kuhn, D. (2014). Argumentation theory in Education Studies: Coding and Improving Students’ Argumentative Strategies. Topoi, 34, 523-537.
  • ↑ a b c Hornikx, J., & Hahn, U. (2012). Reasoning and argumentation: Towards an integrated psychology of argumentation. Thinking & Reasoning, 18(3), 225-243. DOI: 10.1080/13546783.2012.674715.
  • ↑ a b Aufschnaiter, C., Erduran, S., Osborne, J., & Simon, S. (2008). Arguing to learn and learning to argue: Case studies of how students' argumentation relates to their scientific knowledge. Journal of Research in Science Teaching, 45(1), 101-131. doi:10.1002/tea.20213
  • ↑ Bensley, A., Crowe, D., Bernhardt, P., Buckner, C., & Allman, A. (2010). Teaching and assessing CT skills for argument analysis in psychology. Teaching of Psychology, 37(2), 91-96. doi:10.1080/00986281003626656
  • ↑ Glassner, A., & Schwarz, B. B. (2007). What stands and develops between creative and critical thinking? argumentation?. Thinking Skills and Creativity, 2(1), 10-18. doi:10.1016/j.tsc.2006.10.001
  • ↑ Gold J., Holman D., & Thorpe R. (2002). The role of argument analysis and story telling in facilitating critical thinking. Management Learning, 33(3), 371-388. doi:10.1177/1350507602333005
  • ↑ a b c d e f g h Bruning, R. H., Schraw, G. J., & Norby, M. M. (2011). Cognitive psychology and instruction (5th ed.) Pearson.
  • ↑ Chesñevar, I., & Simari, G. (2007). Modelling inference in argumentation through labelled deduction: Formalization and logical properties. Logica Universalis, 2007, Volume 1, Number 1, Page 93, 1(1), 93-124. doi:10.1007/s11787-006-0005-4
  • ↑ Ontañón, S., & Plaza, E. (2015). Coordinated inductive learning using argumentation-based communication. Autonomous Agents and Multi-Agent Systems, 29(2), 266-304. doi:10.1007/s10458-014-9256-2
  • ↑ Pinto, M., Iliceto, P., & Melagno, S. (2012). Argumentative abilities in metacognition and in metalinguistics: A study on university students. European Journal of Psychology of Education, 27(1), 35-58. doi:10.1007/s10212-011-0064-7
  • ↑ Bensley, A., Crowe, D., Bernhardt, P., Buckner, C., & Allman, A. (2010). Teaching and assessing critical thinking skills for argument analysis in psychology. Teaching of Psychology, 37(2), 91-96. doi:10.1080/00986281003626656
  • ↑ Demir, B., & İsleyen, T. (2015). The effects of argumentation based science learning approach on creative thinking skills of students. Educational Research Quarterly, 39(1), 49-82.
  • ↑ Chandler, S. & Dedman, D.E. (2012). Writing a Literature Review: An Essential Component of Critical Thinking. The Journal of Baccalaureate Social Work, 17. 160-165.
  • ↑ a b c d Al-Faoury, O.H., & Khwaileh, F. (2014). The Effect of Teaching CoRT Program No. (4) Entitles “Creativity” on the Gifted Learners’ Writing in Ein El-Basha Center for Gifted Students. Theory and Practice in Language Studies, 4(11), 2249-2257. doi:10.4304/tpls.4.11.2249-2257.
  • ↑ a b c Kozulin, A. & Presseisen, B.Z. (1995). Mediated Learning Experience and Psychological Tools: Vygotsky’s and Feuerstein’s Perspective in a Study of Student Learning. Educational Psychologist, 30(2), 67-75.
  • ↑ Presseisen, B.Z. & Kozulin, A. (1992). Mediated Learning – The Contributions of Vygotsky and Feuerstein in Theory and Practice.
  • ↑ Schuler, G. (1974). The Effectiveness of the Productive Thinking Program. Paper presented at the Annual Meeting of the American Educational Research Association. Retrieved from: http://www.eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=ED103479 .
  • ↑ a b c d Crowell, A., & Kuhn, D. (2014). Developing dialogic argumentation skills: A 3-year intervention study. Journal of Cognition and Development, 15(2), 363-381. doi:10.1080/15248372.2012.725187
  • ↑ a b c d Crowell, A., & Kuhn, D. (2011). Dialogic Argumentation as a Vehicle for Developing Young Adolescents’ Thinking. Psychological Science, 22(4), 545-552. DOI: 10.1177/0956797611402512.
  • ↑ Jonassen, D.H., & Kim, B. (2010). Arguing to learn ad learning to argue: design justifications and guidelines. Education Technology & Research Development, 58(4), 439-457. DOI 10.1007/s11423-009-9143-8.
  • ↑ a b c d Bathgate, M., Crowell, A., Schunn, C., Cannady, M., & Dorph, R. (2015). The learning benefits of being willing and able to engage in scientific argumentation. International Journal of Science Education, 37(10), 1590-1612. doi:10.1080/09500693.2015.1045958
  • ↑ a b c d e Seixas, P., Morton, T., Colyer, J., & Fornazzari, S. (2013). The Big Six: Historical thinking Concepts. Toronto: Nelson Education.
  • ↑ Osborne, K. (2013). Forward. Seixas, P., Morton, T., Colyer, J., & Fornazzari, S. The Big Six: Historical thinking Concepts. Toronto: Nelson Education.
  • ↑ a b Carretero, M., & van Alphen, F. (2014). Do Master Narratives Change Among High School Students? A Characterization of How National History Is Represented. Cognition and Instruction, 32(3), 290–312. http://doi.org/10.1080/07370008.2014.919298
  • ↑ a b c Freedman, E. B. (2015). “What Happened Needs to Be Told”: Fostering Critical Historical Reasoning in the Classroom. Cognition and Instruction, 33(4), 357–398. http://doi.org/10.1080/07370008.2015.1101465
  • ↑ a b c Wineburg, S. S. (1991). Historical problem solving: A study of the cognitive processes used in the evaluation of documentary and pictorial evidence. Journal of Educational Psychology, 83(1), 73–87. http://doi.org/10.1037/0022-0663.83.1.73
  • ↑ Freedman, E. B. (2015). “What Happened Needs to Be Told”: Fostering Critical Historical Reasoning in the Classroom. Cognition and Instruction, 33(4), 357–398. http://doi.org/10.1080/07370008.2015.1101465
  • ↑ a b c d e f g h i Seixas, P., Morton, T., Colyer, J., & Fornazzari, S. (2013). The Big Six: Historical thinking Concepts. Toronto: Nelson Education.
  • ↑ Blackenship, W. (2009). Making connections: Using online discussion forums to engage students in historical inquiry. Social Education, 73(3), 127-130.
  • ↑ Shreiner, T. L. (2014). Using Historical Knowledge to Reason About Contemporary Political Issues: An Expert–Novice Study. Cognition and Instruction, 32(4), 313–352. http://doi.org/10.1080/07370008.2014.948680
  • ↑ Seixas, P., & Peck, C. (2004). Teaching Historical Thinking. Challenges and Prospects for Canadian Social Studies, 109–117.
  • ↑ Lopez, C., Carretero, M., & Rodriguez-Moneo, M. (2014). Telling a national narrative that is not your own. Does it enable critical historical consumption? Culture & Psychology , 20 (4 ), 547–571. http://doi.org/10.1177/1354067X14554156
  • ↑ Whitworth, S. A., & Berson, M. J. (2003). Computer technology in the social studies: An examination of the effectiveness literature (1996-2001). Contemporary Issues in Technology and Teacher Education [Online serial], 2(4). Retrieved from http://www.citejournal.org/volume-2/issue-4-02/social-studies/computer-technology-in-the-social-studies-an-examination-of-the-effectiveness-literature-1996-2001
  • ↑ Johnson, D. W., & Johnson, R. T. (1993). Creative and critical thinking through academic controversy. The American Behavioral Scientist, 37(1), 40-53. Retrieved from https://www.proquest.com/docview/1306753602

instruction for problem solving

  • Book:Cognition and Instruction

Navigation menu

Using Schema-Based Instruction to Improve Students’ Mathematical Word Problem Solving Performance

Asha K. Jitendra

The purpose of this chapter is to describe an evidence-based instructional program, schema-based instruction (SBI), which supports word problem solving for students with mathematics difficulties (MD). First, I describe mathematical word problem solving and the critical components linked to the ability to understand and solve word problems. Second, I describe the theoretical framework for SBI, including a discussion of its unique features and how SBI contributes to word problem solving performance. Third, I summarize previous research on SBI to identify the instructional conditions that need to be in place to support mathematical word problem solving for students with MD. Finally, I conclude with a discussion of challenges yet to be addressed.

Keywords: word problem solving, schema-based instruction, mathematics difficulties



Chapter citation: Jitendra, A. K. (2019). Using schema-based instruction to improve students’ mathematical word problem solving performance. In A. Fritz, V. G. Haase, & P. Räsänen (Eds.), International Handbook of Mathematical Learning Difficulties. Cham: Springer. https://doi.org/10.1007/978-3-319-97148-3_35


Review Article | Open access | Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169), Wei Wang & Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


Collaborative problem-solving has been widely embraced in classroom instruction on critical thinking, which is regarded both as the core of competency-based curriculum reform and as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This research presents the major findings of a meta-analysis of 36 empirical studies published in international educational periodicals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving raises or lowers critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the two dimensions of critical thinking, collaborative problem-solving significantly enhances students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas its effect on students’ cognitive skills is only upper-middle (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all affect critical thinking and can be viewed as important moderating factors in its development. On the basis of these results, recommendations are made for further study and for instruction that better supports students’ critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept, now regarded as an essential competence for learners in the 21st century, has recently attracted renewed attention from researchers and teaching practitioners (National Research Council, 2012). Critical thinking should be at the core of competency-based curriculum reform (Peng and Deng, 2017) because students who can think critically not only understand the meaning of knowledge but can also solve practical problems in real life even after the knowledge itself is forgotten (Kek and Huijser, 2011). There is no universal definition of critical thinking (Ennis, 1989; Castle, 2009; Niu et al., 2013). In general, critical thinking is defined as a self-aware and self-regulated thought process (Facione, 1990; Niu et al., 2013): the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information, together with the attitudinal tendency to apply these abilities (Halpern, 2001). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by researchers (e.g., Kuncel, 2011; Leng and Lu, 2020), prompting educators’ efforts to foster it among students. In teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989). The first is an independent curriculum, in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum, in which critical thinking is integrated into the teaching of other disciplines as an explicit teaching goal; and the third is a mixed curriculum, in which critical thinking is taught in parallel with the teaching of other disciplines. Furthermore, numerous measurement tools have been developed to measure critical thinking in teaching practice, including standardized instruments such as the WGCTA, CCTST, CCTT, and CCTDI, which have been verified in repeated experiments and are considered valid and reliable by international scholars (Facione and Facione, 1992). In short, these descriptions of critical thinking (its two dimensions of attitudinal tendency and cognitive skills, the different types of teaching courses, and the standardized measurement tools) provide a normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular instructional approaches for critical thinking is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in collaborative groups is a progressive form of active learning that can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning: it places learners at the center of the learning process and takes ill-structured problems in real-world situations as its starting point (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach consensus on problems in the domain, and form solutions through social processes such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching of critical thinking, and several studies have attempted systematic reviews and meta-analyses of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking specifically. Examining how to implement critical thinking instruction within collaborative problem-solving is therefore the most direct way to develop and enhance it, yet this issue remains largely unexplored, which leaves many teachers ill-equipped to teach critical thinking well (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) reported meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking is truly teachable. The authors found that learners significantly improve their critical thinking while in college and that gains differ with factors such as teaching strategy, intervention duration, subject area, and teaching type. That study, however, did not determine the usefulness of collaborative problem-solving in fostering students’ critical thinking, nor did it reveal whether significant variations exist among these elements. A meta-analysis of 31 educational studies was conducted by Liu et al. (2020) to assess the impact of problem-solving on college students’ critical thinking. The authors found that problem-solving can promote the development of critical thinking among college students and proposed, for follow-up study, establishing a reasonable group structure for problem-solving to further improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. (2008) carried out an experiment on integrated curriculum teaching for college students based on a web bulletin board, with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. Their research revealed that, through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, according to research by Naber and Wyatt (2014) and Sendag and Odabasi (2009) on undergraduate and high school students, respectively, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation, but it could not significantly improve students’ critical thinking compared with traditional classroom teaching.

The studies above show that the evidence on the effectiveness of collaborative problem-solving in promoting students’ critical thinking is inconsistent. It is therefore essential to conduct a thorough and trustworthy review to determine whether, and to what degree, collaborative problem-solving raises or lowers critical thinking. Meta-analysis is a quantitative approach for synthesizing data from separate studies that address the same research topic. It characterizes the overall effectiveness of an intervention by averaging the effect sizes of numerous individual studies, reducing the uncertainty inherent in any single study and producing more conclusive findings (Lipsey and Wilson, 2001).

This paper therefore carried out a meta-analysis of the effectiveness of collaborative problem-solving in promoting students’ critical thinking, in order to contribute to both research and practice. The following research questions were addressed:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects of the various experimental designs in the included studies are heterogeneous, which moderating variables account for the disparities between study conclusions?

This research followed the rigorous procedures (e.g., database searching, identification, screening, eligibility checking, merging, duplicate removal, and analysis of included studies) of the meta-analysis approach proposed by Cooper (2010) for examining quantitative data from separate studies that address the same research topic. Relevant empirical research published in international educational periodicals during the 21st century was meta-analyzed using RevMan 5.4. The consistency of the data extracted independently by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
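For illustration, the inter-rater reliability check mentioned above can be computed as Cohen’s kappa over the two coders’ screening decisions. The sketch below is minimal and hypothetical (the labels and data are invented; it is not the authors’ code):

```python
# Minimal sketch of Cohen's kappa for two coders' include/exclude decisions.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """kappa = (p_o - p_e) / (1 - p_e): observed vs. chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical screening decisions for six candidate articles:
print(cohens_kappa(["in", "in", "out", "in", "out", "in"],
                   ["in", "in", "out", "out", "out", "in"]))  # ~0.67
```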

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which reports the number of articles included and eliminated at each stage of the selection process according to the study eligibility criteria.

Figure 1. Flowchart showing the number of records identified, included, and excluded.

First, the databases systematically searched for relevant articles were the Web of Science Core Collection and, within CNKI, the Chinese Core journals and the Chinese Social Science Citation Index (CSSCI) source journals. These databases were selected because they are credible platforms for scholarly, peer-reviewed information, offer advanced search tools, and contain literature relevant to our topic from reliable researchers and experts. The Boolean search string used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was January 1, 2000, to December 30, 2021. A total of 412 papers were obtained. The Boolean search string used in CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching), and 56 studies were found over the same period. All duplicates and retractions were eliminated before the references were exported into EndNote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed the full text of each included article against the inclusion and exclusion criteria. Meanwhile, a snowball search of the references and citations of the included articles was performed to ensure complete coverage. Ultimately, 36 articles were kept.

Two researchers carried out this entire process together, and a consensus rate of approximately 94.7% was reached after discussion and negotiation to resolve any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles not meeting the language requirement, or not published between 2000 and 2021, were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies had to be a randomized controlled experiment, a quasi-experiment, or a natural experiment; these designs have a higher degree of internal validity and can plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies had to report the statistics needed to gauge critical thinking’s impact and compute an effect size (e.g., sample size, mean, and standard deviation). Articles that lacked specific measurement indicators for critical thinking, so that no effect size could be calculated, were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify its properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1). Ultimately, 16 coding fields were retained.

The designed data coding template consisted of three kinds of information. The descriptive information comprised basic facts about each paper: publishing year, author, serial number, and title.

The variable information for the experimental design covered three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Given the topic of this study, the intervention strategy, as the independent variable, was coded as collaborative or non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills or attitudinal tendency. Seven moderating variables were then created by grouping and combining the experimental design variables found within the 36 studies (see Table 1):

  • learning stage: higher education, high school, middle school, and primary school or lower;
  • teaching type: mixed courses, integrated courses, and independent courses;
  • intervention duration: 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks;
  • group size: 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons;
  • learning scaffold: teacher-supported, technique-supported, and resource-supported;
  • measuring tool: standardized instruments (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapted instruments (e.g., modified or made by researchers);
  • subject area: coded according to the specific subjects used in the 36 included studies.

The data information contained the three statistics used to measure critical thinking: sample size, mean, and standard deviation. It is vital to remember that studies with different experimental designs frequently require different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
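For orientation, the pretest-posttest-control form of the SMD commonly attributed to Morris (2008) is sketched below; the exact variant applied in this meta-analysis is the one given in Supplementary Table S3, so this should be read as an illustrative reconstruction rather than the paper’s definitive formula:

```latex
d_{\mathrm{ppc}}
  = c_p \cdot \frac{(M_{T,\mathrm{post}} - M_{T,\mathrm{pre}})
                  - (M_{C,\mathrm{post}} - M_{C,\mathrm{pre}})}{SD_{\mathrm{pre}}},
\qquad
c_p = 1 - \frac{3}{4\,(n_T + n_C - 2) - 1},
```

where the denominator is the pooled pretest standard deviation of the treatment (T) and control (C) groups:

```latex
SD_{\mathrm{pre}}
  = \sqrt{\frac{(n_T - 1)\,SD_{T,\mathrm{pre}}^{2}
              + (n_C - 1)\,SD_{C,\mathrm{pre}}^{2}}{n_T + n_C - 2}}.
```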

Procedure for extracting and coding data

Following the data coding template (see Table 1), two researchers retrieved the information from the 36 papers and entered it into Excel (see Supplementary Table S1). During data extraction, the results of each study were extracted separately if an article contained multiple studies on critical thinking or if a study assessed different dimensions of critical thinking. For instance, Tiwari et al. (2010) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficient was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect sizes, including descriptive information (e.g., publishing year, author, serial number, and title), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., means, standard deviations, and sample sizes). Following that, publication bias and heterogeneity tests were run on the sample data using RevMan 5.4, and the test results were used to guide the meta-analysis.

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the overall body of research on the subject, and it can undermine the reliability and accuracy of the meta-analysis. For this reason, the sample data must be checked for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the data points are distributed roughly symmetrically on either side of the average effect size and concentrated in the upper region of the funnel. The funnel plot for this analysis (see Fig. 2) shows the data dispersed evenly within the upper portion of the funnel, indicating that publication bias is unlikely here.
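
A funnel plot of this kind can be rebuilt from each study's effect size and standard error. The matplotlib sketch below uses synthetic values in place of the 79 extracted effect sizes; it assumes only that each study contributes an SMD and its standard error.

```python
# Funnel-plot sketch: SMD on the x-axis, standard error on an inverted y-axis,
# with pseudo 95% confidence limits around the pooled effect. The data below
# are synthetic stand-ins for the 79 extracted effect sizes.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
se = rng.uniform(0.08, 0.45, size=79)        # per-study standard errors
smd = rng.normal(loc=0.82, scale=se)         # effect sizes around the pooled mean

pooled = np.average(smd, weights=1.0 / se ** 2)  # inverse-variance pooled estimate
y = np.linspace(se.min(), se.max(), 100)

plt.scatter(smd, se)
plt.axvline(pooled, linestyle="--")
plt.plot(pooled - 1.96 * y, y)               # left pseudo confidence limit
plt.plot(pooled + 1.96 * y, y)               # right pseudo confidence limit
plt.gca().invert_yaxis()                     # precise studies sit near the top
plt.xlabel("Standardized mean difference (SMD)")
plt.ylabel("Standard error")
plt.title("Funnel plot (synthetic illustration)")
plt.show()
```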

Figure 2. Funnel plot of the 79 effect sizes across the 36 included studies, used to assess publication bias.

Heterogeneity test

The results of a heterogeneity test on the effect sizes determine which effect model is appropriate for the meta-analysis. It is common practice to gauge the degree of heterogeneity with the I² statistic: I² ≥ 50% is typically taken to indicate medium-to-high heterogeneity, calling for a random-effects model; otherwise, a fixed-effect model should be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) yielded I² = 86%, indicating significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using a random-effects model.
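
For reference, the I² statistic is derived from Cochran's Q; the minimal sketch below computes both from a set of effect sizes and their within-study variances. The inputs are synthetic, not the study data.

```python
# Cochran's Q and the I^2 heterogeneity statistic from effect sizes and their
# within-study variances (synthetic inputs for illustration only).
import numpy as np


def i_squared(effects, variances):
    y = np.asarray(effects)
    w = 1.0 / np.asarray(variances)    # fixed-effect (inverse-variance) weights
    pooled = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - pooled) ** 2)  # Cochran's Q
    df = len(y) - 1
    return 0.0 if q == 0 else max(0.0, (q - df) / q) * 100.0


effects = [0.4, 1.1, 0.7, 1.6, 0.2]
variances = [0.04, 0.09, 0.05, 0.12, 0.06]
# I^2 >= 50% suggests medium-to-high heterogeneity and a random-effects model.
print(f"I^2 = {i_squared(effects, variances):.0f}%")
```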

The analysis of the overall effect size

This meta-analysis used a random-effects model to account for heterogeneity when examining the 79 effect sizes from the 36 studies. Judged against Cohen's criterion (Cohen, 1992), the analysis results shown in the forest plot of the overall effect (see Fig. 3) make clear that the overall effect size of collaborative problem-solving is 0.82, which is large and statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); collaborative problem-solving can therefore encourage learners to practice critical thinking.
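
The random-effects pooling reported here is typically computed with the DerSimonian–Laird estimator (the method RevMan uses for random-effects meta-analysis); the compact sketch below shows the estimator with synthetic inputs standing in for the extracted effect sizes.

```python
# DerSimonian-Laird random-effects pooling: estimate the between-study
# variance tau^2, re-weight the effect sizes, and form the pooled estimate
# with its z statistic and 95% CI. Synthetic inputs for illustration only.
import numpy as np


def dersimonian_laird(effects, variances):
    y = np.asarray(effects)
    v = np.asarray(variances)
    w = 1.0 / v                                   # fixed-effect weights
    pooled_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - pooled_fe) ** 2)          # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, pooled / se, (pooled - 1.96 * se, pooled + 1.96 * se)


es, z, ci = dersimonian_laird([0.9, 0.5, 1.3, 0.7, 0.8],
                              [0.05, 0.08, 0.10, 0.04, 0.07])
print(f"ES = {es:.2f}, z = {z:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```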

Figure 3. Forest plot of the overall effect size across the 36 included studies.

In addition, this study examined the two dimensions of critical thinking separately to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although both dimensions improve, the gains in students' attitudinal tendency are much more pronounced, with a large and significant overall effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in cognitive skills are more modest, just above the medium level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

A two-tailed test on the 79 effect sizes in the full forest plot revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may stem from moderating factors rather than sampling error alone. Subgroup analysis was therefore used to explore the moderating factors that might produce this considerable heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area coded across the 36 experimental designs, in order to identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the moderating factors all have advantageous effects on critical thinking. The subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not differ significantly between groups (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), the data cannot explain how these two factors support the cultivation of critical thinking in the context of collaborative problem-solving. The between-subgroups test behind these χ² values is sketched below; the specific results then follow.
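
As a point of reference, RevMan's test for subgroup differences can be read as a between-subgroups heterogeneity test under a fixed-effect framework (Q_between = Q_total − ΣQ_within, compared against a χ² distribution). The sketch below illustrates the computation with synthetic inputs; the grouping labels are hypothetical and do not reproduce the study data.

```python
# Between-subgroups heterogeneity test: Q_between = Q_total - sum(Q_within),
# compared against a chi-square distribution with (#subgroups - 1) degrees of
# freedom. Synthetic inputs for illustration only.
import numpy as np
from scipy.stats import chi2


def cochran_q(effects, variances):
    """Cochran's Q for a set of effect sizes and within-study variances."""
    y = np.asarray(effects)
    w = 1.0 / np.asarray(variances)
    pooled = np.sum(w * y) / np.sum(w)
    return np.sum(w * (y - pooled) ** 2)


subgroups = {  # hypothetical effect sizes grouped by teaching type
    "mixed":       ([1.2, 1.5, 1.3], [0.05, 0.07, 0.06]),
    "integrated":  ([0.7, 0.9, 0.8], [0.04, 0.06, 0.05]),
    "independent": ([0.2, 0.3],      [0.05, 0.04]),
}

all_effects = [e for es, _ in subgroups.values() for e in es]
all_vars = [v for _, vs in subgroups.values() for v in vs]

q_between = cochran_q(all_effects, all_vars) - sum(
    cochran_q(es, vs) for es, vs in subgroups.values())
df = len(subgroups) - 1
print(f"chi2 = {q_between:.2f}, df = {df}, P = {chi2.sf(q_between, df):.4f}")
```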

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school showed the largest effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage's beneficial influence on learners' critical thinking, the data cannot explain how it matters for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that mixed courses are the most effective teaching type for cultivating critical thinking through collaborative problem-solving.

Various intervention durations significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration is positively correlated with the impact on critical thinking: the longer the intervention, the greater the effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The teacher-supported learning scaffold displayed a high and significant level of impact (ES = 0.92, P < 0.01), while the resource-supported (ES = 0.69, P < 0.01) and technique-supported (ES = 0.63, P < 0.01) scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). The effect on critical thinking showed a general declining trend as group size increased: the overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and once the group size exceeded 7 people, the improvement in critical thinking fell to the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as group size grows, the overall impact shrinks.

Various measuring tools captured positive influences on critical thinking, without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). The self-adapted measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the standardized measurement tools yielded the largest overall effect size, reaching a significant level (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence observed regardless of measuring tool, the data cannot explain how the choice of tool matters for fostering critical thinking through collaborative problem-solving.

Different subject areas showed varying impacts on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached significant levels. Programming technology was the least effective (ES = 0.39, P < 0.01), with only a medium-low degree of effect compared to education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy has a considerable overall impact on cultivating learners' critical thinking and a favorable promotional effect on both of its dimensions. Several studies have argued that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004), and this meta-analysis provides convergent data in support of that view. The findings thus not only address the first research question, regarding the overall effect on critical thinking and on its two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen confidence in using collaborative problem-solving interventions to cultivate critical thinking in classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, while the corresponding improvements in cognitive skills are only marginally better. Some studies suggest that cognitive skills differ from attitudinal tendency in classroom instruction: the former, as a key ability, is cultivated through gradual accumulation, while the latter, as an attitude, is affected by the teaching context (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging; because it puts learners at the center and engages them with ill-structured problems in real situations, it can inspire students to realize their problem-solving potential, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency influences cognitive skills during problem-solving (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with better learning achievement and cognitive ability (Sison, 2008; Zhang et al., 2022). Collaborative problem-solving thus affects critical thinking as a whole as well as each of its two dimensions, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors coded across the 36 experimental designs, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area, could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool are not significant, so the data cannot explain how these two factors matter for cultivating critical thinking through collaborative problem-solving.

In terms of the learning stage, various learning stages influenced critical thinking positively but without significant intergroup differences, so the data cannot explain how this factor matters. Although higher education accounts for 70.89% of all the empirical studies, high school, which has the largest overall effect size, may be the most suitable learning stage for fostering students' critical thinking through collaborative problem-solving. This phenomenon may be related to students' cognitive development and needs to be examined in follow-up research.

With regard to teaching type, mixed course teaching may be the best method for cultivating students' critical thinking. Relevant studies have shown that, in actual teaching, if students are trained in thinking methods alone, the methods they learn remain isolated and divorced from subject knowledge, which hinders transfer; conversely, if students' thinking is trained only within subject teaching, without systematic method training, it is difficult to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course, in parallel with other subject teaching, achieves the best effect on learners' critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size trends upward with longer intervention times, so intervention duration is positively correlated with the impact on critical thinking. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies should therefore account for this by teaching critical thinking over longer periods.

With regard to group size, groups of 2–3 persons showed the highest effect size, and the overall effect size generally decreased as group size increased. This outcome is in line with earlier findings; for example, a group of two to four members has been found most appropriate for collaborative learning (Schellens and Valcke, 2006). The meta-analysis results also indicate that once the group size exceeds 7 people, larger groups no longer produce better interaction and performance. One possible explanation is that, although larger groups may increase the diversity of views, the learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members most readily in smaller groups, which helps cultivate critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with earlier findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds help students apply learning approaches more successfully during collaborative problem-solving, with teacher-supported scaffolds exerting the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) are acknowledged worldwide as reliable and valid, only 54.43% of the studies included in this meta-analysis adopted them, and the results indicated no intergroup differences. This suggests that standardized measurement tools are not appropriate for every teaching circumstance. As Simpson and Courtney (2002, p. 91) put it, "The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners' critical thinking." Accordingly, to gauge how learners' critical thinking evolves more fully and precisely, standardized measuring tools must be properly adapted to collaborative problem-solving learning contexts.

With regard to the subject area, the overall effect size for scientific subjects (e.g., mathematics, science, medical science) is larger than for language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic component of scientific literacy. Students with scientific literacy can justify their judgments with accurate evidence and reasonable standards when facing challenges or ill-structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and for applying that understanding to practical problems related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Beyond the points made in the discussion above, the following suggestions are offered for critical thinking instruction through collaborative problem-solving.

First, teachers should focus on the two core elements, collaboration and problem-solving, and design real problems grounded in collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students' critical thinking. Posing questions about real situations and letting learners take part in critical discussions of real problems during class are key ways to teach critical thinking, rather than simply having students read speculative articles without practice (Mulnix, 2012). Moreover, students' critical thinking improves through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Teachers should therefore design real problems and encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners' critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating critical thinking that transfers flexibly to real problem-solving situations. This meta-analysis shows that mixed course teaching has a substantial impact on cultivating learners' critical thinking. Teachers should therefore design mixed course teaching that embeds real collaborative problem-solving situations in the knowledge content of specific disciplines, teach critical thinking methods and strategies through ill-structured problems, and provide practical activities in which students interact to build knowledge and critical thinking through collaborative problem-solving.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking and should be conscious of how teacher-supported learning scaffolds can promote it. The teacher-supported learning scaffold had the greatest impact on learners' critical thinking, being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students' growth and use appropriate approaches when designing instructional activities (Forawi, 2016). To enable teachers to create learning scaffolds that cultivate critical thinking through collaborative problem-solving, it is therefore essential to focus on teacher-supported scaffolds and to strengthen critical thinking instruction for teachers, especially preservice teachers.

Implications and limitations

This meta-analysis has certain limitations that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles reviewed. Second, some data were missing from the included studies, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, more studies were published while this meta-analysis was being conducted, so the search has a cutoff date. As relevant research develops, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of how strongly collaborative problem-solving fosters students' critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be drawn:

Regarding the overall results, collaborative problem-solving is an effective teaching approach for fostering learners' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving significantly improves students' attitudinal tendency, with a large overall effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); the improvement in students' cognitive skills is more modest, at an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by the results and discussion, all seven moderating factors identified across the 36 studies have varying degrees of beneficial effect on students' critical thinking. The teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all significantly moderate the impact on critical thinking and can be viewed as important factors in how critical thinking develops. Since the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tool (χ² = 0.08, P = 0.78 > 0.05) did not show significant intergroup differences, the data cannot explain how these two factors support the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 16(3):235–266

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10 , 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received : 07 August 2022

Accepted : 04 January 2023

Published : 11 January 2023

DOI : https://doi.org/10.1057/s41599-023-01508-1





PROTOCOL: Problem solving before instruction (PS‐I) to promote learning and motivation in child and adult students

Eduardo González‐Cabañes

1 Department of Psychology, Psychology Faculty, University of Oviedo, Oviedo, Asturias, Spain

Trinidad Garcia

Catherine Chase

2 Department of Human Development, Teachers College, Columbia University, New York, New York, USA

Jose Carlos Núñez

Associated data

This is the protocol for a Campbell systematic review. The purpose of this review is to synthesize the evidence about the efficacy of problem solving before instruction (PS‐I) to promote learning and motivation in students. Specifically, the review is designed to answer the following questions:

  • To what degree does PS‐I affect learning and motivation, relative to alternative learning approaches?
  • To what extent is the efficacy of PS‐I associated with the use of different design features within it, including the use of group work, contrasting cases, and metacognitive guidance in the initial problem‐solving activity, and the use of explanations that build upon students' solutions in the explicit instruction phase?
  • To what extent is the relative efficacy of PS‐I associated with the contextual factors of activities used as control, age of students, duration of the interventions, and learning domain?
  • What is the quality of the existing evidence for evaluating these questions, in terms of the number of studies included and potential biases derived from publication and methodological restrictions?

1. BACKGROUND

1.1. Description of the condition

A typical form of instruction is for teachers to explain a new concept or procedure and then ask students to apply it in a set of activities. However, some research suggests that it may be more beneficial to first give students the opportunity to problem‐solve in relation to the new contents before providing any explicit instruction on them. This review asks how these two approaches compare in promoting motivation and learning. For example, before explaining how to measure statistical variability, how to solve an equation, or a psychological theory of our attentional experiences, would it be better for students to first try to find their own solutions to these problems (e.g., Carnero‐Sierra, 2020; Fyfe, 2014; Kapur, 2012), or to start by providing them with instructions and concepts for solving the problems?

Problem‐solving activities are highly valued in education because they offer the opportunity for students to practice at their own pace (Jackson,  2021 ). Allowing students to take their time is an important part of the reflection processes. However, deep reflection processes are not always activated in problem‐solving activities. When students know the basic procedures to solve them, the problems often become a routine that students solve mechanically without devoting enough attention to the structural aspects (Moore,  1998 ). To encourage these reflection processes, it might be useful to give students the opportunity to problem‐solve before they receive any explanation about the relevant procedures (Schwartz,  2004 ).

These types of interventions that combine an initial phase of problem‐solving and a following phase of explicit instruction have been formulated in different specific approaches, such as the Productive Failure approach (Kapur,  2012a ), the Invention approach (Schwartz,  2004 ), or problem‐solving before instruction (Loibl,  2014a ). In this review we will generally refer to all these related approaches as problem‐solving before instruction (PS‐I).

It has been argued that PS‐I interventions can have important implications for both learning and motivation. Specifically, generating solutions in the initial problem‐solving phase can help students become more aware of their knowledge gaps (Loibl, 2014), activate and differentiate prior knowledge (Kapur, 2012), and adopt healthy motivations (Belenky, 2012). In this regard, several studies have shown that students who learned through PS‐I, in comparison to students who directly received explanations of the target concepts and procedures, reported higher interest in the content taught (Glogger‐Frey, 2015; Weaver, 2018). They also demonstrated higher understanding of the content and greater capacity to transfer this understanding to novel situations (Glogger‐Frey, 2017; Kapur, 2014; Schwartz, 2011; Weaver, 2018).

However, PS‐I can sometimes produce negative reactions related to learning and motivation. During the initial problem‐solving, students can feel overchallenged with the search for many possible solutions. They might spend most of the time paying attention to irrelevant aspects (Clark,  2012 ). Also, the uncertainty of not finding correct solutions in this task can be frustrating, and students might end up acquiring a passive role (Clark,  2012 ). There are some studies that have shown greater negative affect (Lamnina,  2019 ), lower motivation (Glogger‐Frey,  2015 ), and reduced learning (Fyfe,  2014 ; Glogger‐Frey,  2015 ; Newman,  2019 ) for students in PS‐I interventions than for students in alternative interventions.

Considering this variability of results in the literature, it is important to systematically review the evidence concerning the efficacy of PS‐I and the conditions that can influence this efficacy. This review may also have important implications for educational practice. Many instructors have negative attitudes towards the uncertainty of starting lessons with problem‐solving activities in which students can experience initial failures and negative affect (Pan, 2020), and empirical evidence of PS‐I's efficacy might help these instructors reduce this uncertainty and make more informed decisions.

In terms of learning, it is important to evaluate to what extent PS‐I can promote the development of conceptual knowledge, which refers to the understanding of the content taught, and transfer, which refers to the capacity to apply this understanding to novel situations. Several national and international evaluations suggest that a great proportion of students learn by memorizing procedures (Mallart, 2014; OECD, 2016; Silver, 2000). For example, in the Spanish mathematics examinations for university admission, the majority of students passed because they were able to solve problems similar to those seen in class, yet most failed to answer comprehension questions or to correctly solve problems that required flexibly applying the procedures learnt (Mallart, 2014). Considering that real‐life situations generally differ from class situations, acquiring deeper forms of knowledge is of great relevance to students' future autonomy, especially in a world that is increasingly changing because of globalization and the development of new technologies (OECD, 2014).

Of no less importance is the potential efficacy of PS‐I to promote motivation for learning. Several evaluations suggest that a great proportion of students are not motivated to learn class content (Council National Research,  2003 ; OECD,  2018 ). A recent PISA evaluation with high school students from 79 countries showed that most students reported studying before or after their last class, but only 48% of these students reported interest as one of their motives (OECD,  2018 ). Promoting motivation for learning is of great importance, because, rather than just being a predisposition that can help learning (Chiu,  2008 ; Liu,  2020 ; Mega,  2014 ), it is a main factor that determines the well‐being of students during the learning process (Ryan,  2009 ).

Four reviews have provided interesting evidence comparing PS‐I with other educational interventions (Darabi, 2018; Jackson, 2021; Loibl, 2017; Sinha, 2021). They suggested a general efficacy of PS‐I in promoting learning (Darabi, 2018), and more specifically in promoting conceptual knowledge and transfer, but not procedural knowledge (Loibl, 2017; Sinha, 2021). Furthermore, the qualitative review of Loibl (2017) and the meta‐analysis of Sinha (2021) suggested that PS‐I was associated with higher efficacy when it was presented with guidance strategies that help students become more aware of their knowledge gaps and focus their attention on relevant features of the contents. However, these reviews are limited because they used only a small number of databases or search techniques for the identification of studies. Additionally, important aspects were not addressed in these previous reviews, such as the evaluation of motivational outcomes and the consideration of the types of control activities used for the comparisons.

The upcoming review aims to address these aspects and to update the studies included in these previous reviews. We will review studies in which PS‐I interventions are systematically compared with alternative interventions that provide the same contents and activities but provide explicit instruction from the beginning, and that quantify the results with conceptual knowledge tests, transfer tests, and self‐reports of motivation for learning. Additional exploratory analyses might include studies that either have more general measures of learning or lack such strict control of the equivalence of learning activities between conditions. The general goal is to provide educators and policy makers with information that can help them make decisions about introducing PS‐I in educational practice, and about the various factors that can influence the efficacy of PS‐I.

1.2. Description of the intervention

The uniqueness of PS‐I educational interventions resides in the combination of two phases: an initial problem‐solving activity to explore a concept that students have not yet learned, and a subsequent explicit instruction phase to explain the concept. The initial problem‐solving activity consists of one or several problems that students can explore with their prior knowledge but must develop their own criteria to solve, because the main solution procedures are based on concepts they have not yet learned, and no content‐related guidance is given during problem‐solving. Students are not expected to find the correct solutions; rather, this initial exploration is thought to prepare them to learn from subsequent explicit instruction. The second phase consists of any activity in which students can read or listen to explanations of the target concepts, such as a lecture, a video, or an interactive discussion with concept explanations.

A typical approach in which PS‐I has been conducted is called Invention (Schwartz,  2004 ). In this approach the initial problem‐solving activity is generally formulated with invention goals, which refers to instructions to infer a general procedure, rule, or concept. Also, the data of the problem is often presented in small data sets or contrasting cases, which are examples that differ on a few key features (e.g., Section 1a of Table  1 ). This type of data presentation has been applied in a great variety of learning areas, including physics (Schwartz,  2011 ), statistics (González‐Cabañes,  2021 ), and educational sciences (Glogger‐Frey,  2015 ). The combination of invention goals and contrasting cases is meant to encourage students to discern and actively integrate relevant problem features (Schwartz,  2004 ; Schwartz,  2011 ; refer to How the intervention might work for a further description of these features).

Table 1. Examples of design features in problem solving before instruction (PS‐I). (Table body not reproduced.) Note: adapted from Loibl (2017) using transformations of the PS‐I intervention in Kapur (2012) as examples.

Another typical PS‐I approach is Productive Failure (Kapur,  2012a ). Studies following this approach present students with rich problems, in which the data is complex and relevant features are not highlighted (e.g., Section 1b of Table  1 ). The problem generally allows several possible solutions, and the ensuing explicit instruction includes explanations that build on students’ solutions, commenting on the affordances and limitations of students’ solutions in comparison to the affordances and limitations of the correct solutions (e.g., Section 3a of Table  1 ; refer to How the intervention might work for a further description). The Productive Failure approach has also been used in a great variety of learning areas, including physics (Kapur,  2010 ), statistics (Kapur,  2012 ), maths (Mazziotti,  2019 ), and biology (Song,  2018 ). It is emphasized in this approach that ‘failures’ to reach the correct solutions in the initial problem are not conceived as failures, but as exploration opportunities that help students activate prior knowledge, and as opportunities to comprehend relevant features and relations when these solutions are compared with the correct ones.

To give a sense of the variability across PS‐I interventions, it is also important to consider the context in which the interventions are implemented. PS‐I interventions are generally implemented by the teachers or by the researchers who conduct the evaluations. They can be applied to different types of domains. In the literature they have been generally applied in math or science domains such as statistics or physics (e.g., González‐Cabañes,  2020 ; Kapur,  2014 ; Newman,  2019 ; Schwartz,  2011 ), but also in other domains such as psychology, or pedagogy (e.g., Carnero‐Sierra,  2020 ; Glogger‐Frey,  2015 ; Schwartz,  1998 ). PS‐I methods have been applied successfully with students from age 12 through adulthood (e.g., González‐Cabañes,  2020 ; Kapur,  2014 ; Schwartz,  2011 ), and a few studies have used PS‐I methods with primary school children (e.g., Chase,  2017 ; Mazziotti,  2019 ). PS‐I interventions have also been conducted in collaborative contexts, in which groups of two or more students work on the initial problem‐solving activity (e.g., Glogger‐Frey,  2017 ; Kapur,  2012a ), or in individual contexts, in which students work by themselves on the learning activities (e.g., Glogger‐Frey,  2015 ; González‐Cabañes,  2020 ). For the purpose of generalizing our findings broadly, in the upcoming review we will include studies conducted with students of any age, in any type of course, with collaborative or individual work, and with any kind of implementer.

There is also great potential variability regarding the intensity of PS‐I interventions. The duration of the initial problem‐solving phase generally spans between 12 min (e.g., González‐Cabañes,  2020 ) and 100 min (e.g., Kapur,  2012 ). The number of problems included in this initial activity can also vary, generally ranging between one (e.g., Glogger‐Frey,  2015 ; Weaver,  2018 ) and two (e.g., Chase,  2017 ; Glogger‐Frey,  2017 ). In regard to the times in which PS‐I is implemented within an intervention, generally PS‐I is only applied once for one specific lesson (e.g., Chase,  2017 ; Glogger‐Frey,  2017 ; González‐Cabañes,  2020 ; Kapur,  2012 ; Weaver,  2018 ), but there are other studies that applied it repeatedly over a longer time frame (e.g., Likourezos,  2017 ). Considering all these factors and the different potential durations of the explicit instruction phase, there are interventions with a great variety of total time invested.

It is just as important to consider the variability of the control interventions against which PS‐I's efficacy is compared. PS‐I is often compared with what are generally called 'instruction before problem‐solving' (I‐PS) interventions. Like PS‐I, these include a problem‐solving phase related to the target contents, but only after students have received some explicit instruction about them. This initial instruction is often provided through a lecture (e.g., Kapur, 2014; Weaver, 2018) or through worked examples to study (e.g., Schwartz, 2011); worked examples are problems that show the resolution procedures. Interventions with no problem‐solving phase are also used, in which the initial problem‐solving activity of PS‐I is replaced with worked‐example study (e.g., Glogger‐Frey, 2015; Glogger‐Frey, 2017). Lastly, other comparison interventions resemble PS‐I in that they also start with the initial problem‐solving activity, but they provide some content guidance along with it; for example, some parts of the solution procedures may be given in writing (e.g., Likourezos, 2017; Newman, 2019). All of these types of comparisons will be considered in this review insofar as they include the same activities in the instruction phase as the PS‐I intervention, but separate meta‐analyses will be used for each type of comparison.

1.3. How the intervention might work

There are several mechanisms of PS‐I interventions that can influence learning and motivation, either positively or negatively. These mechanisms can interact, either compensating or reinforcing each other. Figure  1 depicts a proposal of these mechanisms, which is based on the theoretical proposal of Loibl ( 2017 ) regarding the cognitive PS‐I mechanisms that influence learning, but it aims to integrate motivational mechanisms within it.

Figure 1. Theoretical model of different variables that might be associated with the efficacy of problem solving before instruction (PS‐I).

1.3.1. Potential PS‐I learning mechanisms

One potential mechanism through which PS‐I can favour learning is the opportunity in the initial problem‐solving activity to activate, differentiate, and generate prior knowledge in relation to the concepts that will be explained later (Kapur,  2012a ). As students try to explore different solutions, they can become familiar with the problem situation and relevant features of the concepts to be explained. This familiarization can help students to more easily understand and integrate the explanations given later.

Furthermore, PS‐I also gives students a creative role during the exploration of the initial problem. Students can generate solutions using their own ideas, including ideas seen in previous classes, but also ideas from real life experiences. Ideas from real life experiences are accessible to all students and can constitute an additional and important support to integrated learning (Kapur,  2011a ).

Several studies support the efficacy of activating prior knowledge with this creative component of generating personal ideas. Relative to other interventions where prior knowledge is activated through exploratory activities without this creative component, PS‐I led to greater conceptual understanding and transfer at the end of the lesson (Glogger‐Frey, 2017; Kapur, 2011; Kapur, 2014a; Schwartz, 2011). It is also interesting that the number of solutions generated by students during the initial problem‐solving phase of PS‐I has been associated with greater conceptual knowledge and transfer, regardless of whether the generated solutions were right or wrong (Kapur, 2011a; Kapur, 2012).

Another complementary mechanism of PS‐I that can favour learning is its potential to increase awareness of knowledge gaps. Humans often process information superficially, and unconsciously use this superficial knowledge to support a false illusion of understanding (Kahneman,  2011 ). In this regard, the experience of impasses within the initial problem can help students to become more aware of their knowledge gaps, which in turn can facilitate further exploration and recognition of deep features (Chi,  2000 ; VanLehn,  1999 ).

In support of these claims, several studies showed that students in PS‐I interventions reported higher awareness of knowledge gaps than students in alternative interventions in which the new concepts were directly explained from the beginning (Glogger‐Frey, 2015; Loibl, 2014). Also, students in PS‐I interventions showed better memory for the structural components of the problems presented than students in other interventions where the same problems were presented after some explanations (Schwartz, 2011) or together with explanations (Glogger‐Frey, 2017).

As discussed in Loibl ( 2017 ), it is important to consider the synergy that can occur between these mechanisms. It is likely that the greater the creative role assumed by the students, the greater the number of solutions they try, and the greater their activation of prior knowledge. In turn, it is also likely that the greater number of solutions attempted leads to greater opportunities to make mistakes, find impasses, and become aware of their learning gaps (see diagonal arrows in Figure  1 towards conceptual knowledge). The process can also be recursive. As learners become more aware of their learning gaps, prior knowledge activation is also more likely to be related to relevant aspects of concepts.

Lastly, another mechanism that can synergistically reinforce these learning processes is the potential of PS‐I to increase motivation for learning, which is discussed in the following section. Motivation is defined as our desire to engage in the learning activity (Núñez, 2009), and can increase engagement in all the learning processes previously mentioned (see the wide white arrow coming out of Motivation for learning in Figure 1).

1.3.2. Potential PS‐I motivation mechanisms

PS‐I can increase motivation for learning through several potential mechanisms (see horizontal arrows in Figure 1). First, it can be facilitated by the previously hypothesized PS‐I effect of promoting awareness of knowledge gaps. Some theories assume that we are intrinsically driven to acquire knowledge, and the mere perception of knowledge gaps often triggers curiosity, or the desire to fill that gap (Golman, 2018; Loewenstein, 1994). Several studies have shown that students who learned through PS‐I experienced greater curiosity and interest than students who started the learning process with explicit instruction (Glogger‐Frey, 2015; Lamnina, 2019; Weaver, 2018). Also, the study by Glogger‐Frey (2015) found associations between the perception of knowledge gaps in PS‐I and curiosity.

The creative role that students can adopt in PS‐I can also stimulate achievement motivation, which refers to the motivation to perform well in the learning activities (Núñez, 2009). Students in the initial problem‐solving activity of PS‐I are creating information, rather than just assimilating information given from outside, which can trigger a sense of responsibility and ownership in the task of constructing knowledge (Ryan, 2009; Urdan, 2006). Based on that, some students might try to perform as well as they can, to test their performance capabilities and to maximize their learning. Literature on this point is very scarce, but one study observed that high school students in a PS‐I condition for learning geometry experienced higher achievement motivation than students in more guided interventions (Likourezos, 2017).

It is also important to consider the interrelation between learning and motivation. The hypothesized higher learning in PS‐I can lead to a higher sense of self‐efficacy for students in this condition, which in turn can increase motivation for learning (Núñez,  2009 ).

1.3.3. Potential negative effects of PS‐I on learning

Although experiencing impasses can have important benefits by triggering awareness of knowledge gaps, it can also have negative implications. Given students' inexperience with the topic, it is likely that they spend time attending to irrelevant information while trying to resolve the impasses. In turn, this can generate extraneous cognitive load (Clark, 2012), which refers to a saturation of our attentional capacity due to processing irrelevant aspects, and can reduce the attentional resources available to focus on important information (Sweller, 2019). Therefore, this extraneous cognitive load can interfere with the recognition of deep features during the initial problem (Clark, 2012).

In this regard, several studies have shown that students in the initial problem‐solving activity of PS‐I reported higher extraneous cognitive load than students who faced similar problems that included explanations of the solution processes (Glogger‐Frey, 2015; Glogger‐Frey, 2017; Likourezos, 2017). In turn, this higher extraneous cognitive load was associated with lower learning (Glogger‐Frey, 2015; Glogger‐Frey, 2017), even when final learning was higher for students in the PS‐I condition (Glogger‐Frey, 2017). Overall, these results suggest a complex interaction of negative and positive effects in PS‐I interventions, in which the positive mechanisms that we have described are balanced against the extraneous cognitive load that some students can experience (see the dashed arrow from extraneous cognitive load towards recognition of deep features in Figure 1).

1.3.4. Potential negative effects of PS‐I on motivation

A potential factor that can demotivate students in PS‐I interventions is the frustration they can feel in the initial problem‐solving phase (Clark, 2012). Frustration can arise in the initial problem‐solving activity because students experience extraneous cognitive load or the sensation of failing to reach the correct solution during impasses. In turn, frustration can have demotivating effects, such as reducing intrinsic motivation (Loderer, 2018) or contributing to fatigue (Pekrun, 2011; Pekrun, 2012). Recent studies have shown higher frustration (González‐Cabañes, 2020) or negative affect (Lamnina, 2019) in PS‐I students than in students in typical instruction conditions that started the learning process with explanations.

1.3.5. Hypotheses about general effects

Considering this variety of potential positive and negative mechanisms, it is of great importance to study the final effects on learning and motivation. In line with the previous reviews of Loibl (2017) and Sinha (2021), we expect that PS‐I, in comparison with alternative interventions, will be associated with higher student performance in post‐tests of conceptual knowledge taken after the lesson, but not with performance in concurrent post‐tests of procedural knowledge. Procedural knowledge can be acquired through memorization, and therefore the described potential PS‐I mechanisms of promoting activation of prior knowledge and awareness of knowledge gaps might have little influence on it. Yet, we expect that these potential mental processes can greatly impact conceptual knowledge, which refers to the understanding of principles and relationships that underlie concepts and procedures. Conceptual knowledge relies not only on memorization, but also on the identification of structural features of the concepts. These mental processes can also have a great impact on transfer. Transfer can be facilitated by the activation of prior knowledge, as it can rely on the integration of prior knowledge with new knowledge (Loibl, 2017), and more generally by the acquisition of conceptual knowledge (Mayer, 2003). Only with a clear mental representation of how the procedures work can we perceive whether the procedures generalize to other contexts.

Although there is no previous review regarding motivation for learning, we expect that PS‐I interventions will be associated with higher scores in self‐reports of interest taken after the lesson, because of the effects PS‐I can have on achievement motivation and curiosity.

1.3.6. Factors that can moderate the efficacy of PS‐I

In spite of these hypothesized general effects of PS‐I on learning and motivation, a great variety of factors can moderate these effects. Among them are design features of the interventions, intensity of the interventions, age of students, learning domain, and activities used as control. It is important to note that, as previously described, learning and motivation can benefit each other, and therefore we will consider all moderators as potentially influencing both.

Design features

The different design features used in PS‐I interventions can have different effects on the previously described mechanisms, and therefore on the general efficacy of PS‐I for learning and motivation. Below we describe some of the design features frequently discussed and used in the literature, which we will consider as potential moderators in the present review.

  • Contrasting cases. Contrasting cases are a form of guidance often used to present the data of the initial problem (e.g., Loibl, 2020; Schwartz, 2011). They consist of examples that differ on a few features that are relevant to the target knowledge (Schwartz, 2004). For example, in the contrasting cases shown in Section 1 of Table 1, which were designed for learning about the statistical variability of data distributions, the distributions of scores for player A and player B differ in the range, but not in other features of the distribution such as the mean, the number of scores, and the spread of scores. In contrast, the distributions of player B's and player C's scores differ in the spread of the scores, but not in the range or other characteristics. It has been argued that the comparison of cases can help students focus on the relevant features of the problem (Loibl, 2017; Salmerón, 2013; Schwartz, 2004). Also, contrasting cases can help students become aware of their knowledge gaps, because students can rank the cases and self‐evaluate their solutions against them (Loibl, 2017; Schwartz, 2004). Finally, contrasting cases may reduce extraneous cognitive load during problem‐solving and its associated frustration.
  • Metacognitive guidance . The initial problem is often presented with metacognitive guidance during problem solving (e.g., Holmes,  2014 ; Roll,  2012 ). Metacognitive guidance refers to prompts that do not address content, but rather are meant to stimulate conscious mental strategies such as monitoring and reflection processes. This type of guidance can stimulate mental processes that can lead students to become more aware of knowledge gaps and to recognize deep features (Holmes,  2014 ; Roll,  2012 ). For example, the metacognitive guidance in Section 2 of Table  1 can trigger students to reflect on critical features they perceive in the data distributions and the limitations of the solution ideas they generate.
  • Collaborative work . Allowing students to work on the initial problem‐solving activity in small groups, rather than asking them to work individually, might influence the relative efficacy of PS‐I (Mazziotti,  2019 ). Collaborative problem‐solving is a context that brings opportunities for elaborating and critiquing ideas (Kapur,  2012a ; Webb,  2014 ). Several studies have found that problem‐solving in pairs was associated with higher performance than working individually (Teasley,  1995 ; Webb,  1993 ). Also, the extent to which students engage in dialectical argumentation with each other's ideas and explain their problem‐solving strategies has been associated with higher problem‐solving achievement and higher acquisition of conceptual knowledge (Asterhan,  2009 ; Webb,  2014 ). Based on this, we expect that working on the initial problem in groups will be associated with higher efficacy for PS‐I than working individually.
  • Building explanations of the explicit instruction phase on students' solutions. PS‐I interventions often include explanations in the explicit instruction phase that draw students' attention to the affordances and limitations of the typical solutions students generated in the previous problem‐solving phase (e.g., Kapur, 2012a; Kapur, 2014; Loibl, 2014a). An example can be seen in Section 3 of Table 1. It can be considered a form of guidance for the problem‐solving activity that, rather than being given during the problem‐solving phase, is provided afterwards to help students reorganize the ideas they activated during problem resolution. It has been argued that this feedback can help students become more aware of their knowledge gaps and focus on the relevant features within the complexity of the target concepts (Kapur, 2012a; Loibl, 2017).

Duration of the PS‐I intervention

The duration of the PS‐I intervention can have an important effect on its efficacy to promote learning and motivation, because longer interventions provide a higher dosage of the hypothesized PS‐I mechanisms. We therefore expect higher efficacy of PS‐I in longer interventions.

Age of students

As considered in the prior review of Sinha ( 2021 ), age might be associated with the relative efficacy of PS‐I because of the relation between age and metacognitive development. Metacognition refers to the awareness of our own mental processes and the control we have over them (Schraw,  1994 ), and has been argued to have an important influence on several PS‐I mechanisms (Glogger‐Frey,  2017 ). First, it can help students to become aware of their knowledge gaps. Students with low metacognitive skills might not relate the limitations of the solutions they generated with the solutions explained later (Roll,  2012 ). Second, metacognition can also help students to discern what information is relevant from the information that is not, which can reduce the extraneous cognitive load experienced during the initial problem‐solving phase and its associated frustration. However, these metacognitive capacities might develop slowly with age (Veenman,  2005 ). Based on these assumptions, we expect that the higher the age of the students, the higher the efficacy of PS‐I to promote conceptual knowledge, transfer, and motivation for learning.

Learning domain

The learning domain in which PS‐I is applied might have an important influence on the efficacy of PS‐I. In math and science domains (e.g., statistics, physics) conceptual structures are often abstract and complex, and the deep learning processes expected to be promoted in PS‐I interventions might be more significant in these domains. Specifically, in this review we expect that higher efficacy of PS‐I will be found in math and science domains than in other domains.

Control conditions used for comparison

The types of control conditions used to compare PS‐I can have a great influence on the relative efficacy of PS‐I. In the literature we have identified several types of comparative interventions:

  • 1. Instruction with lecture before problem‐solving. These interventions share with the PS‐I intervention both the problem‐solving phase and the other learning activities in the instruction phase, but instead of introducing the contents with the problem‐solving activity, they introduce them with a lecture about the target concepts, in which students listen to explanations of the concepts at a given pace.
  • 2. Instruction with worked‐examples exploration before problem‐solving. These interventions share with the PS‐I intervention both the problem‐solving phase and the other learning activities in the instruction phase, but instead of introducing the contents with the problem‐solving activity, they introduce them with a worked example that students study at their own pace.
  • 3. Instruction with worked example exploration before further instruction. These interventions do not include problem‐solving activities. They share with the PS‐I intervention all the learning activities in the instruction phase, which do not include problem‐solving. Instead of the initial problem‐solving activity of PS‐I, they start with a worked example, in which students study the resolution procedures of problems at their own pace.
  • 4. Problem‐solving with content guidance before instruction. These interventions share with the PS‐I intervention all the learning activities in the instruction phase, and also start with the initial problem‐solving activity used in the PS‐I intervention, but provide students with some content guidance during it.

We expect that PS‐I will lead to higher benefits in terms of learning and motivation than these control conditions. Although extraneous cognitive load and frustration might be lower in the initial phases of these control conditions than in PS‐I, the initial problem‐solving phase of PS‐I gives students an opportunity to take on a creative role and to experience impasses that the other conditions do not provide.

Nevertheless, we expect that the relative efficacy of PS‐I will be higher when compared with conditions that introduce the concepts with a lecture (1) rather than with worked examples or problems with content guidance (2‐4). In these latter control conditions, students start by exploring information at their own pace, which, similarly to PS‐I, gives them the opportunity to activate prior knowledge before receiving explanations from the professor. As we have previously described, activation of prior knowledge can potentially favour the assimilation of explanations (Carriedo, 1995; Smith, 1992; Sweller, 2019). This pattern of results would be in line with the results of Newman (2019), in which the learning advantage of PS‐I was higher when PS‐I was compared against the introduction of concepts with a lecture than when it was compared against interventions that introduced concepts with worked examples or problems with content guidance.

Additionally, among these latter comparative interventions (2‐4), we also hypothesize that the relative efficacy of PS‐I will be higher when compared with interventions that do not include a problem‐solving phase (3) than when this phase is included (2, 4). Problem‐solving activities can help students reflect and reason about the target concepts, and omitting these activities can have implications for the conceptual knowledge acquired.

1.4. Why it is important to do this review

There are four reviews in the literature that have provided interesting insights into the efficacy of PS‐I (Darabi,  2018 ; Jackson,  2021 ; Loibl,  2017 ; Sinha,  2021 ).

First, the review by Jackson (2021) is a qualitative review that included studies from educational databases and conferences addressing factors that can influence learning from failures in STEM courses. It included studies about PS‐I interventions, where failure is expected in the initial problem, but also other types of studies that addressed learning from failures. Considering the results of the 35 papers included, they discussed that the efficacy of learning from failure could depend on factors such as whether students conceptualize failures as learning opportunities, the promotion of positive affective reactions such as persistence, the promotion of a classroom climate in which failures can be discussed and embraced, and the use of failures as stimuli to identify misconceptions and induce thoughtfulness about key features. Although the presence of these factors can be important for the efficacy of PS‐I in promoting learning and motivation, this review was limited in that these factors were not addressed quantitatively.

The review by Loibl (2017) focused specifically on the efficacy of PS‐I to promote learning, and used a vote counting procedure to synthesize the literature results. Their results suggested that the efficacy of PS‐I depended on the type of learning outcome considered. Across the 34 studies they identified, most studies reported no significant differences between PS‐I and other alternative approaches in terms of procedural knowledge, which just refers to the capacity to reproduce memorized procedures covered in class. However, when the evaluation was made in terms of conceptual knowledge or transfer, PS‐I generally led to more positive results. For example, out of 17 studies, they identified 10 studies where transfer was significantly higher in PS‐I approaches, 1 study in which it was higher in alternative approaches, and 6 studies showing no significant difference. They also explored the effect of different PS‐I design features, for which they proposed an interesting moderator: whether PS‐I was presented in combination with techniques oriented to foster awareness of knowledge gaps and recognition of deep features, such as contrasting cases or building instruction on students' solutions. They found that when any of these forms of guidance was present, a positive significant difference for PS‐I was more likely.

However, there are important limitations in the scope and methods of this review. First, the results were analysed using a vote counting procedure instead of meta‐analysis techniques. Second, the results could easily be contaminated by publication and availability bias, because, beyond looking into the reference lists of some of the localized studies, they did not try to find studies within the grey literature. Finally, they did not consider the outcome of motivation for learning, nor some of the other potential moderating factors discussed here, such as the type of control activities used for comparing PS‐I or the intensity of PS‐I interventions.

The review by Darabi (2018) provided some meta‐analytic evidence about the general efficacy of PS‐I on learning. However, the scope of this review was broader and less specific. Their goal was to evaluate educational approaches based on learning from failures, which included PS‐I approaches but also other failure‐driven approaches. In spite of that, out of the 23 studies they ended up identifying, 22 were about the effectiveness of PS‐I. The results suggested that students who learned through failure‐driven approaches acquired more knowledge than students who learned through alternative approaches, with a moderate effect size (g = 0.43). They also explored the influence of interesting moderators such as age or intensity of the intervention, for which they found no significant results.

Nevertheless, this review also has important limitations. Beyond mixing results of PS‐I interventions with other failure‐driven interventions, the results can be biased because different types of learning outcomes were mixed. They aggregated together outcomes referring to procedural knowledge, conceptual knowledge, and transfer, which according to the review by Loibl (2017) can lead to very different results. Also, these results might be affected by availability and publication bias, as suggested by their post‐hoc analyses. They searched only a few databases using a very limited set of keywords, which can explain why, despite being more recent than the Loibl (2017) review, they identified considerably fewer studies.

Lastly, the review by Sinha (2021) focused specifically on the efficacy of PS‐I interventions to promote learning in comparison with interventions in which explicit instruction was provided from the beginning. To date, it is the review that includes the most studies on this topic. Sinha and Kapur (2021) included 53 studies, selected from the studies that had cited in Google Scholar some of the reference articles of productive failure, a specific approach to PS‐I. Their review also had the advantage of analysing the results with meta‐analysis techniques. The results showed a significant moderate effect in favour of PS‐I (g = 0.36) versus alternative interventions, using an aggregation of measures that included tests of conceptual knowledge and transfer. Their moderation analyses showed that this effect was higher when using PS‐I design features such as group work in the problem‐solving phase (g = 0.49), or building the explanations of the explicit instruction phase on students' solutions (g = 0.56). The use of other design features, duration of the interventions, age, and learning discipline did not show significant differences.

However, it is important to consider some limitations of this review. First, the search was limited to studies citing pioneering papers of productive failure in Google Scholar, which can miss PS‐I studies not available in this source or disconnected from the productive failure literature. Second, it did not consider the different types of control interventions used to compare the efficacy of PS‐I. Also, it did not include motivational outcomes. Lastly, most of the effect sizes reported were based on a substantial body of studies in which equivalence between conditions in terms of learning materials was not maintained. Only the effect size they reported for the subgroup of experimental studies (g = 0.25) is expected to be free from this issue.

While trying to overcome the methodological limitations just mentioned, the present review aims to update the evidence of these four reviews. It also aims to consider a greater variety of outcomes and moderators. Regarding the outcomes, rather than just considering different types of learning, the present review will also consider motivation for learning. Regarding factors that can moderate the efficacy of PS‐I, rather than just considering different PS‐I design features and contextual factors such as duration or learning domain, it will also consider the different types of control activities used to compare the efficacy of PS‐I. Lastly, it will provide separate results for the main analyses, in which equivalence of materials between PS‐I and the other interventions is maintained, and for additional exploratory analyses, in which such equivalence is not necessarily maintained.

Results of this review can have important implications when considering whether or not to introduce PS‐I into educational practice. The use of PS‐I is currently very scarce (Pan, 2020), and it is important to offer updated evidence on whether it can contribute to promoting motivation, conceptual knowledge, and the capacity to transfer learning. This evidence can help instructors reduce the uncertainty of trying it, and can give them guidance about which design features or contextual factors contribute to its efficacy.

2. OBJECTIVES

The purpose of this review is to synthesize the evidence about the efficacy of PS‐I to promote learning and motivation in students. Specifically, this review is designed to answer the following questions:

  • To what degree does PS‐I affect learning and motivation, relative to alternative learning approaches?
  • To what extent is the efficacy of PS‐I associated with the use of different design features within it, including the use of group work, contrasting cases, and metacognitive guidance in the initial problem‐solving activity, and the use of explanations that build upon students' solutions in the explicit instruction phase?
  • To what extent is the relative efficacy of PS‐I associated with the contextual factors of activities used as control, age of students, duration of the interventions, and learning domain?
  • What is the quality of the existing evidence for evaluating these questions, in terms of the number of studies included and the potential biases derived from publication and methodological restrictions?

3.1. Criteria for considering studies for this review

3.1.1. Types of studies

All studies included in the review will fulfil the following requirements:

  • They must involve a comparison of at least one group that goes through PS‐I with at least one comparative group that goes through an alternative intervention in which the teaching of the target concepts starts by providing students with some content.
  • They will either be randomized controlled trials or quasi‐experimental designs in which different students are assigned to the PS‐I conditions and the control conditions. For both types of designs, we will include studies in which the unit of assignment is either the students or students' groups (e.g., class groups, work groups). Also, for both types of designs, we will include studies in which the assignment method is random, quasi‐random, or even not random.

Nevertheless, we will exclude studies if the assignment leads to any difference between the PS‐I group and the comparative group that can affect learning (e.g., if one group belongs to class groups or schools with recognized better performance than the other group), or if pre‐existing differences between these two groups in terms of age, gender, or previous knowledge are statistically significant, as indicated by inferential statistical tests for group comparisons, using a level of statistical significance of p < .05. This exclusion criterion will apply to both quasi‐experimental designs and randomized controlled trials. Studies where teaching time is not the same for both groups will also be excluded.
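
As an illustration of this exclusion rule, the following minimal sketch shows how such a baseline comparison could be run. The pretest scores and the variable names are hypothetical illustration material, not data from any included study.

    # Minimal sketch of the baseline-equivalence check used as an exclusion rule.
    # The pretest scores below are hypothetical.
    from scipy import stats

    psi_pretest = [12, 15, 11, 14, 13, 16, 12, 15]      # PS-I group (hypothetical)
    ctrl_pretest = [13, 14, 12, 15, 14, 13, 15, 12]     # control group (hypothetical)

    # Independent-samples t test for a pre-existing difference in prior knowledge.
    t_stat, p_value = stats.ttest_ind(psi_pretest, ctrl_pretest)

    # Per the protocol, the study is excluded if the baseline difference is significant.
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}",
          "-> exclude" if p_value < 0.05 else "-> keep")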

In regard to the equivalence of teaching contents, we will have different inclusion criteria for the main analyses and the complementary exploratory analyses. For the main analyses, we will only include studies in which the PS‐I group and the control group receive the same contents about the target concepts, and in which the learning activities are also the same, except that the PS‐I group performs a problem‐solving activity at the beginning of the intervention while the comparative group performs, for the same amount of time but not necessarily at the same point in the sequence, alternative activities covering the same contents.

For the additional exploratory analyses, we will include studies in which such an equivalence of contents and activities is not maintained, which often occurs in studies that use a business‐as‐usual comparative condition. For example, in some studies the explicit instruction phase of the PS‐I condition includes explanations that build on the students’ generated solutions, while such explanations are not given in the comparative condition (e.g., Kapur,  2012a ).

3.1.2. Types of participants

The studies must have child or adult students as participants. We will not have an exclusion criterion based on age. Students from any developmental stage can potentially benefit from PS‐I. Eligible samples would also include populations at risk because of socio‐economic disadvantage, such as students from specific ethnicities or minorities, inner‐city schools, prison populations, or students who have poor school performance. These populations are important for inclusive education policies, and all of them have the potential to benefit from PS‐I. Samples consisting exclusively of people with a specific psychological diagnosis will be excluded because of the complexity of interpreting the variability generated by these populations.

3.1.3. Types of interventions

Studies eligible for this review will have to examine the effectiveness of PS‐I, and therefore will be required to have at least one group of students go through this approach, which will be defined by the following components:

  • Students start the learning process with a problem‐solving activity that targets concepts they have not yet learned,
  • For which they are given time to develop solutions on their own,
  • And that will be followed by a separate phase of explicit instruction about the new concepts, in which students can listen to or read these concepts.

Within PS‐I interventions there are possible additional characteristics that we might consider in the moderation analyses, but not within our inclusion criteria. Specifically, for the initial problem‐solving phase we will consider the presence of (a) contrasting cases; (b) metacognitive guidance; and (c) collaborative work. For the posterior instruction phase we will consider the presence of explanations that build upon students' solutions. Table  1 shows examples for these variables. We will also consider interventions with different durations, which can range from one session to several sessions.

It is important to note that we will exclude studies where students are faced with novel problems but they are not given the opportunity to solve them on their own. Examples of this situation include studies where students have access to external sources of information from the beginning of the problem‐solving activity (e.g., Tandogan,  2007 ), or where the problem only acts as a scenario to stimulate students' expression of their first intuitions.

Regarding the comparison condition, eligible studies have to include at least one control group that is given the same learning materials as the PS‐I group, but that works through an alternative activity instead of going through the initial problem‐solving activity. Examples of these alternative activities include: (a) the same problem‐solving activity but used as a practice activity, provided to students once they have received all or part of the instruction about the target concepts; (b) the same problem‐solving activity but used in the form of a worked example; (c) other alternative activities that maintain a balance between the two interventions in terms of time and content covered.

Examples of eligible studies are summarized below:

  • Study 1 in Kapur (2014) assigned several statistics classes, comprising 75 9th grade students, to two learning conditions. In the PS‐I condition, students first had one hour to solve a novel problem about designing variability measures (phase 1), and then received another hour of explicit instruction about the concept of standard deviation, accompanied by practice problems (phase 2). In the control condition, students went through the same two phases but in reverse order. In the post‐test, students in the PS‐I condition outperformed those in the control condition in conceptual knowledge and transfer of learning, but not in procedural knowledge. During the learning process, students' engagement did not differ between conditions, but mental effort was higher for students in the PS‐I condition.
  • A study by Likourezos (2017) assigned, within their classes, 72 8th grade students to three learning conditions that spanned six 1‐h sessions of geometry. In the PS‐I condition each session was composed of two phases: a 30‐min phase in which students solved novel problems, followed by a 30‐min explicit instruction phase. In the first control condition the initial problems were substituted by worked‐out examples, which were the same problems used in the PS‐I condition but fully solved and accompanied by explanations that students could study. In a second control condition, which the authors called partially guided, these worked examples only included the final solution, and students had to figure out the process. Post‐test results showed no significant differences between conditions in transfer or procedural knowledge. Yet, some differences were found during the learning process in terms of motivation and the cognitive load students experienced.

3.1.4. Types of outcome measures

Primary outcomes

Eligible studies must report either outcomes for learning or motivation for learning after the intervention.

In terms of learning we will consider two of the primary outcomes already analysed in the review by Loibl (2017), conceptual knowledge and transfer:

  • Conceptual knowledge is defined as the understanding of the structure and relationships that underlie a taught concept or procedure. It is generally measured by testing students in principle‐based reasoning, where they have to explain the why of different situations or procedures, or in the ability to connect different representations (refer to the conceptual knowledge post‐test in Kapur,  2012 for an example).
  • Transfer is defined as the ability to adapt the learned concepts to new situations. It is generally measured by asking students to solve problems that have no explicit relation with the concepts learned, and that are novel in the sense that they have a new structure or occur in a new context compared to the problems students have previously seen (refer to the transfer post‐test in Kapur, 2012 for an example).

Measures of conceptual knowledge and transfer reported in the literature are generally not previously validated; they are typically created for the specific contents covered in each study. To be as comprehensive as possible we will include studies with non‐validated measures as long as the items correspond to our definitions of conceptual knowledge or transfer. We will only consider measures of students' performance, based on knowledge tests completed by students after the end of the interventions.

Concerning motivation, the planned primary outcome will be motivation for learning, which we define as a desire to engage in learning about the topic that has been taught. For its measurement we will primarily consider self‐report measures of interest, which refers to the perception of caring about learning the topic, and which is generally measured with questionnaires that ask students about the intensity with which they hold different motives for learning. As a second priority, we will also consider self‐report measures of curiosity. Curiosity is an important component of interest, but it is more specific in the sense that it refers to the desire to know. It is generally measured through questionnaires that ask students about the intensity with which they feel this sensation. The PS‐I literature has often used measures of interest or curiosity that have not been previously validated. To be as inclusive as possible, studies with non‐validated measures will be considered, but only as long as the items correspond to our definition of motivation for learning. Measures of engagement in the learning task will not be considered as indicators of motivation for learning, because engagement can be influenced by factors not related to motivation, such as the task requirements.

Finally, it is important to consider that in the literature these outcomes are often measured at different time points during and after PS‐I interventions. For the main analysis we will consider the first measurement taken at the end of the learning process. Yet, other measurement times might be considered if available for several studies.

Secondary outcomes

We will code, and potentially consider as secondary outcomes:

  • Procedural knowledge, defined as the ability to correctly apply the learned procedures (Loibl,  2017 ). It is generally measured by testing students in the ability to carry out a set of steps, such as solving plug‐and‐chug problems, or questions to develop a set of learned procedures.
  • General measures of learning that mix items of procedural knowledge, conceptual knowledge, and/or transfer. These types of measures can be common in applied studies that use a typical exam to evaluate performance.
  • Factors that can influence the learning process, such as engagement, cognitive load, or number of solutions generated during the problem‐solving activity.
  • Factors that can influence motivation, such as self‐efficacy, anxiety and frustration.
  • Outcomes related to implementing the activities, such as the workload experienced by the professors who implement them.

3.2. Search methods for identification of studies

Different sources will be searched to include published and unpublished international studies written in English or Spanish, with no date restriction. Although we might have problems scanning studies written in languages other than English or Spanish, no language limits will be applied in the searches.

3.2.1. Electronic searches

Based on the guidelines and lists of databases of Kugley ( 2017 ) for selecting electronic searches, we will search within the following sources that include journal articles, conference proceedings, government documents, and dissertations:

  • Databases for scientific literature, either with a general scope or with a scope focused on education. Among these, we will search all six citation indexes of Web of Science, PsycINFO, ERIC, MEDLINE, Google Scholar, Academic Search Complete (EBSCO), Education Abstracts (EBSCO), Education Full Text (EBSCO), SciELO, and Dialnet.
  • Databases that are broadly open to grey literature. Among these, we will search EBSCO Open Dissertations, ProQuest Dissertations & Theses Global, EThOS, TESEO, and the Networked Digital Library of Theses and Dissertations.

Within these databases, we will use a combination of keywords that refer to PS‐I interventions (e.g., ‘Problem‐solving’ AND ‘Explicit instruction’ OR ‘Problem‐solving before Instruction’ OR ‘Productive Failure’ OR ‘Inventing to Prepare for Future Learning’). To make the output more specific, this combination may be restricted with a combination of keywords that refer to our primary outcomes (e.g., ‘learning’ OR ‘motivation’) and/or a combination of keywords referring to our eligible population (e.g., ‘students’ OR ‘pupils’). Appendix 1 shows an example of a strategy search in PsycINFO.

3.2.2. Searching other resources

Beyond electronic searches, we will use other sources, including:

  • Citations associated with systematic reviews and relevant studies. Specifically, we will search in the list of references of previous systematic reviews about PS‐I (Darabi,  2018 ; Jackson,  2021 ; Loibl,  2020 ; Sinha,  2021 ). Additionally, we will use Google Scholar to search across the documents that have cited either these reviews or the reports that are considered pioneers in PS‐I approaches (Kapur,  2008 ; Kapur,  2012a ; Schwartz,  1998 ; Schwartz,  2004 ). Lastly, the review team will check reference lists of included studies, and the citations in Google Scholar to these included studies.
  • Conference proceedings of educational conferences. We will search the proceedings of conferences held in the last 15 years by the European Educational Research Association (EERA) and the International Society of the Learning Sciences (ISLS).
  • Documents from international and national organizations. We will search for publications in the OECD ( https://www.oecd-ilibrary.org/ ), the UNESCO ( https://www.unesco.org/es/ideas-data/publications ), the World Bank ( https://www.worldbank.org/en/research ), the Eurydice Network ( https://eacea.ec.europa.eu/national-policies/eurydice/ ), the US Department of Education ( https://www.ed.gov/ ), and the Spanish Department of Education and Professional Training ( https://www.educacionyfp.gob.es/portada.html ).
  • Hand searches of journals. Four journals that frequently publish about PS‐I interventions will be hand searched for documents published in the last 5 years: Instructional Science , Learning and Instruction , Cognition and Instruction , and Journal of Educational Psychology .
  • Communications with international experts. After finishing the search in other sources, we will email all the corresponding authors of the identified studies to ask them about additional studies they may know of, including unpublished studies. This email will contain a comprehensive list of the included articles along with the inclusion criteria.

3.3. Data collection and analysis

3.3.1. Selection of studies

Study selection will be done with the software Covidence. After eliminating duplicated manuscripts, we will screen the titles and abstracts of the remaining manuscripts to evaluate their potential inclusion. Among these pre‐selected manuscripts, we will screen the full texts to consider whether they meet our inclusion criteria. For these two screening processes, 20% of the manuscripts will be screened individually by two members of the team. If, for a given subset of manuscripts, the level of agreement is below 80%, the subset will be screened again until this standard is reached. The level of agreement will be reported. Disagreements will be resolved by discussion until consensus is reached. If disagreements persist, a third reviewer in the team will be consulted.
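
As an illustration, the 80% standard can be computed as simple percent agreement between the two screeners. The sketch below assumes hypothetical include/exclude decisions; it is one possible implementation, not a description of Covidence's internals.

    # Percent agreement between two screeners on a subset of manuscripts.
    # Decision vectors are hypothetical (1 = include, 0 = exclude).
    screener_a = [1, 0, 0, 1, 1, 0, 1, 1, 0, 0]
    screener_b = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0]

    agreements = sum(a == b for a, b in zip(screener_a, screener_b))
    agreement_rate = agreements / len(screener_a)

    # Per the protocol, a subset below 80% agreement is screened again.
    print(f"Agreement: {agreement_rate:.0%}",
          "-> rescreen subset" if agreement_rate < 0.80 else "-> standard met")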

3.3.2. Data extraction and management

Data from the primary studies will be entered directly into two forms in a Microsoft Access database that can be found at the following link, together with the coding manual: https://www.dropbox.com/sh/u3nr12ayilaezps/AADQngLciNF_gGLrRSpqKtofa?dl=0 .

The first form is the Reports Form, and will be used to code information about each report that, after screening, contains any study suspected to be included in the review. It includes variables related to the following:

  • Title of the report.
  • Year of publication and type of publication.
  • Authors and affiliations.
  • Studies contained in the report.

The second form is the Studies Form, and will be used to code information on each study in the reports that has been preliminarily accepted for inclusion. It includes variables related to the following:

  • Setting (e.g., public vs. private institutions, special education units, topic taught).
  • Sample characteristics (e.g., sex ratio, age mean).
  • Design features of the PS‐I and comparative interventions (e.g., use of contrasting cases, group work, metacognitive guidance).
  • Information related to risk of bias (e.g., assignment procedures, control of extraneous variables, attrition)
  • Implementation characteristics (e.g., person who delivers the intervention, duration of intervention).
  • Types of control groups being compared.
  • Characteristics of measures used (e.g., internal reliability).
  • Characteristics of the effect sizes (e.g., time passed from the end of the intervention to the measurement).
  • Effect sizes for different subgroups (e.g., effect sizes of subsamples with different levels of prior knowledge).

This form automatically calculates effect sizes and their related statistics after the means, standard deviations, and sample sizes reported in the primary studies are entered. In cases where this information is not reported, the coders will use the Campbell Collaboration online calculator to calculate effect sizes from other reported values.
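
For instance, when only a t statistic and the group sizes are reported, the standardized mean difference can be recovered with the standard conversion sketched below. This illustrates the general formula; it is not a reproduction of the Campbell Collaboration calculator itself, and the reported values are hypothetical.

    import math

    def d_from_t(t: float, n1: int, n2: int) -> float:
        """Standardized mean difference from an independent-samples t statistic."""
        return t * math.sqrt(1.0 / n1 + 1.0 / n2)

    # Hypothetical reported result: t = 2.10 with groups of 38 and 37 students.
    print(f"d = {d_from_t(2.10, 38, 37):.3f}")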

To evaluate the coding reliability, the Studies Form will be completed by two coders for a random selection of 10% of the studies. Discrepancies will be resolved by further review of the reports and by discussion until an agreement is reached. If we identify relevant variables during the coding process, they will be added to the questionnaire.

3.3.3. Assessment of risk of bias in included studies

Risk of bias will be assessed using several items in the Studies Form (refer to Data extraction and management) that address the five domains of the Cochrane Risk of Bias Tool for Randomized Trials (Sterne, 2019). However, in comparison to this tool, we adapted some items in each domain to the specific context of this review:

  • Randomization process: we will code whether the units of assignment are students or students' groups, and whether assignment is random. We will also code the identification of baseline differences in terms of gender, age, previous knowledge, or other relevant variables identified, and whether these data are reported.
  • Deviations from the intended intervention: we will code whether the PS‐I interventions and the control interventions were implemented in the same place, at the same time, with the same implementers, with the same durations, with similar levels of attrition, and covering the same contents. It will also be coded whether implementers were blind, and whether a pre‐test including problem‐solving activities related to the target contents was used, which can create a PS‐I effect in the control interventions and therefore contaminate the results (Newman, 2019).
  • Missing outcome data: missing data higher than 5% for any relevant comparison will be identified.
  • Measurement of the outcome: appropriateness of the measure will be coded regarding whether the items correspond to the definition of the construct, using a Likert type scale (yes, probably yes, probably no, no, cannot tell). Other factors that will be coded include whether the measure was previously validated, and reliability indicators in terms of internal reliability and inter‐rater reliability.
  • Selection in the reported result: the coder will assess the probability that the reported assessments or analyses were selected on the basis of the findings, using a Likert type scale (yes, probably yes, probably no, no, cannot tell).

For each of these five dimensions, coders will assess the degree of risk of bias (low, high, or some concerns). In case of assessing them as ‘high’ or ‘some concerns’, they will describe the specific effect sizes affected by this judgement, the direction in which the potential bias is suspected to affect (favours experimental, favours comparator, towards null, away from null, unpredictable), and the reasons behind it.

After evaluating these questions, coders will re‐evaluate whether some or all effect sizes taken from the study should be analysed according to the inclusion criteria. They will also classify these effect sizes into three categories referring to the general risk of bias: low, some concerns, or high.

  • Low risk of bias will be assigned to studies in which two requirements are fulfilled: (a) participants are randomly assigned to conditions (the unit of assignment is the participant and the method of assignment was totally random), and (b) there is enough information to assume equivalence between groups and interventions.
  • Some concerns for risk of bias will be assigned to studies in which only one of these two requirements is fulfilled.
  • High risk of bias will be assigned to studies in which neither of these two requirements is fulfilled.

In case of selecting options ‘high’ or ‘some concerns’, descriptions about the specific effect sizes affected by this assessment, the direction of the potential bias, and the reasons behind it will be added.

3.3.4. Measures of treatment effect

For the three primary outcomes of conceptual knowledge, transfer, and motivation for learning, we will use standardized mean difference effect sizes, or Cohen's d (Cohen, 1988), to estimate the effect of PS‐I interventions in comparison with other interventions used as a control, as indicated in the following formula:

    d = (M_PSI − M_control) / SD_pooled

where the numerator is the difference of the PS‐I group mean minus the control group mean, and the denominator is the pooled standard deviation for the two comparison groups. Larger effect sizes will represent a higher quantity of the outcome in the PS‐I group in comparison to the control group. Once these effect sizes are obtained, they will be adjusted with the small‐sample correction factor to provide unbiased estimates (Hedges, 1981), and 95% confidence intervals will be calculated from them.
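
A minimal computational sketch of these steps, assuming the usual pooled standard deviation, the Hedges (1981) small‐sample correction factor, and a normal‐approximation confidence interval; the input values are hypothetical.

    import math

    def hedges_g(m_psi, m_ctrl, sd_psi, sd_ctrl, n_psi, n_ctrl):
        """Bias-corrected standardized mean difference with a 95% CI."""
        df = n_psi + n_ctrl - 2
        # Pooled standard deviation of the two comparison groups.
        sd_pooled = math.sqrt(((n_psi - 1) * sd_psi**2 +
                               (n_ctrl - 1) * sd_ctrl**2) / df)
        d = (m_psi - m_ctrl) / sd_pooled          # Cohen's d
        j = 1 - 3 / (4 * df - 1)                  # small-sample correction factor
        g = j * d                                 # corrected estimate (Hedges' g)
        var_g = j**2 * ((n_psi + n_ctrl) / (n_psi * n_ctrl)
                        + d**2 / (2 * (n_psi + n_ctrl)))
        se = math.sqrt(var_g)
        return g, (g - 1.96 * se, g + 1.96 * se)

    # Hypothetical post-test means, SDs, and group sizes.
    g, ci = hedges_g(7.8, 6.9, 2.1, 2.3, 40, 40)
    print(f"g = {g:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")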

3.3.5. Unit of analysis issues

To prevent the inclusion of the same effect size twice in one analysis, effect sizes for different constructs and different evaluation moments will be analysed separately. In cases where one study provides more than one measure for one of the constructs we have defined, we will select only one measure. First, for that selection we will follow the priorities already specified for the primary outcomes (refer to Primary outcomes). Second, if several candidate outcomes remain, we will select a previously validated measure over a non‐validated measure. Last, if several candidate outcomes still remain, we will select the measure most similar to those used by the other studies.

To prevent a study published in several reports from being included several times in the analyses, at the end of the coding process we will look for non‐obvious duplicates by checking for repetitions in key variables such as authors, date of publication, or effect sizes.

In multi‐aim primary studies that compare two PS‐I groups with one control group, we will follow the recommendations in Higgins (2019) to avoid weighting the values of the control group twice in the aggregated analyses, considering two options: (a) when the two PS‐I groups are similar, we will treat them as a single group; (b) when they are not similar, the sample size of the control group will be divided in half before being compared with each PS‐I group. A similar strategy, but in reverse, will be conducted when a study compares one PS‐I group with two control groups.

Clustering issues

In the PS‐I literature it is common that the units of assignment to conditions are not the students, but clusters of students, either class groups or working groups (pairs or small groups of students that work together in the interventions). To correct for the artificial reduction in the standard error of the effect size estimates due to this clustering, we will follow the recommendation in Higgins (2019) of multiplying the standard error by the square root of the 'design effect', whose formula is

    Design effect = 1 + (M − 1) × ICC

where M is the average cluster size and ICC is the intracluster correlation coefficient.
For studies in which the intracluster correlation coefficient is not reported, we will use the coefficient of similar studies included in the review.
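
A short sketch of this correction, with illustrative values for the average cluster size and the ICC:

    import math

    def cluster_adjusted_se(se: float, avg_cluster_size: float, icc: float) -> float:
        """Inflate a standard error by the square root of the design effect."""
        design_effect = 1 + (avg_cluster_size - 1) * icc
        return se * math.sqrt(design_effect)

    # Hypothetical values: SE of 0.20, classes of 25 students, and an ICC of
    # 0.10 borrowed from a similar study, as the protocol allows.
    print(f"adjusted SE = {cluster_adjusted_se(0.20, 25, 0.10):.3f}")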

3.3.6. Dealing with missing data

To deal with missing data, authors from primary studies will be contacted via email. In case the requested information is not received, the study will be reported, but the effects for which there is missing data will not be included in the analyses.

3.3.7. Assessment of heterogeneity

We will evaluate the variability across studies using the Q statistic and its associated chi‐square test for inference. Additionally, we will provide the I² statistic as an indicator of the approximate proportion of variation that is due to between‐study heterogeneity rather than sampling error. Lastly, we will estimate τ² as an absolute measure of the magnitude of variation between studies.
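
These statistics can be computed directly from the study effect sizes and their variances. The sketch below uses the DerSimonian-Laird (method‐of‐moments) estimator for τ², which is one common choice rather than the only one; all input values are hypothetical.

    import numpy as np
    from scipy import stats

    def heterogeneity(effects, variances):
        """Q with its p value, I^2 (%), and the DerSimonian-Laird tau^2."""
        y, v = np.asarray(effects), np.asarray(variances)
        w = 1.0 / v                              # fixed-effect weights
        mean_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - mean_fe) ** 2)
        df = len(y) - 1
        p = stats.chi2.sf(q, df)                 # chi-square test for Q
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)
        return q, p, i2, tau2

    # Hypothetical effect sizes (g) and variances from five studies.
    q, p, i2, tau2 = heterogeneity([0.42, 0.15, 0.60, 0.33, 0.05],
                                   [0.04, 0.06, 0.05, 0.03, 0.07])
    print(f"Q = {q:.2f} (p = {p:.3f}), I^2 = {i2:.1f}%, tau^2 = {tau2:.3f}")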

3.3.8. Assessment of reporting biases

To estimate the impact of publication bias, we will use funnel plots in combination with trim‐and‐fill analyses. Additionally, we will analyse the risk of publication bias with the Egger regression test and the Kendall tau test.
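
Both tests can be run with standard routines, as in the sketch below: the Egger test regresses the standardized effects on their precisions and examines the intercept, and the Kendall tau (Begg‐type) test rank‐correlates effects with their variances. The data are hypothetical, and the intercept test assumes SciPy 1.7+ for the intercept standard error.

    import numpy as np
    from scipy import stats

    # Hypothetical effect sizes and standard errors from eight studies.
    effects = np.array([0.45, 0.30, 0.62, 0.18, 0.55, 0.10, 0.70, 0.25])
    ses = np.array([0.15, 0.12, 0.20, 0.10, 0.18, 0.09, 0.22, 0.11])

    # Egger regression: standardized effect on precision; a nonzero
    # intercept indexes funnel plot asymmetry.
    res = stats.linregress(1.0 / ses, effects / ses)
    t_int = res.intercept / res.intercept_stderr
    p_int = 2 * stats.t.sf(abs(t_int), len(effects) - 2)

    # Begg/Kendall rank correlation between effect sizes and their variances.
    tau, p_tau = stats.kendalltau(effects, ses**2)

    print(f"Egger intercept = {res.intercept:.2f} (p = {p_int:.3f}); "
          f"Kendall tau = {tau:.2f} (p = {p_tau:.3f})")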

3.3.9. Data synthesis

Analyses will include a descriptive summary of the contextual characteristics, methodological characteristics, sample characteristics, and outcome characteristics of the included studies.

PS‐I interventions and control interventions will be compared using averaged effect sizes based on the standardized mean difference, weighted with the inverse‐of‐variance method. Separate averages will be reported for each of the three primary outcomes of motivation for learning, conceptual knowledge, and transfer. In turn, for each of these outcomes, separate meta‐analyses will be performed for the comparison of PS‐I interventions with each type of control intervention (as previously described, four different types of control interventions have been identified: instruction with lecture before problem‐solving, instruction with worked‐examples exploration before problem‐solving, instruction with worked examples exploration before further instruction, and problem‐solving with content guidance before instruction; other types of control interventions might still be identified during the review process).

A random effects model will be assumed. This option was chosen instead of a fixed effects model because we expect that a great variety of factors would influence the effect sizes, and therefore it is difficult to assume a common effect size for the studies (Borenstein,  2010 ). The 95% confidence intervals will be reported for the averaged effect sizes. Funnel plots will be used to visually represent their aggregation.
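
A sketch of the inverse‐variance aggregation under the random effects model, reusing a τ² estimate such as the one from the heterogeneity step; all values are hypothetical.

    import math
    import numpy as np

    def random_effects_pool(effects, variances, tau2):
        """Inverse-variance weighted average under a random effects model."""
        y, v = np.asarray(effects), np.asarray(variances)
        w = 1.0 / (v + tau2)                 # random-effects weights
        pooled = np.sum(w * y) / np.sum(w)
        se = math.sqrt(1.0 / np.sum(w))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Hypothetical study effects and variances, with tau^2 = 0.02.
    est, ci = random_effects_pool([0.42, 0.15, 0.60, 0.33, 0.05],
                                  [0.04, 0.06, 0.05, 0.03, 0.07], tau2=0.02)
    print(f"pooled g = {est:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")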

The comparison between PS‐I and the several types of control activities might be complemented with network meta‐analysis, as long as the comparisons fulfil the transitivity assumption, which will be checked by observing the distribution of significant moderators in each comparison. For the network meta‐analyses, we will report a network plot to describe the direct and indirect evidence available across interventions. Effect sizes between treatments will be reported with 95% confidence intervals using a random effects model, and p < .05 will be considered statistically significant.
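
The logic of the indirect evidence can be illustrated with the simplest network: if PS‐I has been compared directly with a lecture‐first control and with a worked‐example‐first control, an indirect estimate for lecture versus worked example follows from the difference of the two direct estimates, with their variances adding. This is a conceptual sketch with hypothetical numbers; an actual network meta‐analysis would rely on dedicated routines.

    import math

    # Hypothetical direct estimates (g) and variances, sharing the PS-I node.
    g_psi_vs_lecture, v1 = 0.45, 0.010     # PS-I vs. lecture-first control
    g_psi_vs_worked, v2 = 0.20, 0.012      # PS-I vs. worked-example-first control

    # Indirect comparison of the two control conditions through PS-I.
    g_indirect = g_psi_vs_lecture - g_psi_vs_worked
    se_indirect = math.sqrt(v1 + v2)       # variances of independent estimates add

    lo, hi = g_indirect - 1.96 * se_indirect, g_indirect + 1.96 * se_indirect
    print(f"lecture vs. worked example (indirect): g = {g_indirect:.2f}, "
          f"95% CI = [{lo:.2f}, {hi:.2f}]")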

Beyond these main analyses, we will conduct exploratory analyses, which will include similar comparisons between PS‐I interventions and control interventions, but we will consider secondary outcomes and studies in which there is not strict equivalence of learning materials between the PS‐I interventions and the control interventions.

3.3.10. Subgroup analysis and investigation of heterogeneity

For all of the separate meta-analyses in which PS-I is compared with each of the control activities on each of the three primary outcomes, in cases where we find significant statistical heterogeneity, we will perform moderation analyses to identify factors associated with the efficacy of PS-I. Correlations between potential moderators will precede these analyses to identify whether the effects of different moderators might be confounded with each other, and to identify potential groupings of moderating variables.

Moderation analyses will be performed individually for each of the variables discussed in How the intervention might work. Specifically, within design features of PS-I, we will test for the use in the initial problem-solving activity of contrasting cases (yes vs. no), metacognitive guidance (yes vs. no), and collaborative work (yes vs. no), and for the use in the explicit instruction phase of explanations that build upon students' solutions (yes vs. no). Within contextual factors, we will test for the duration of the intervention in minutes, the average age of the sample in years, and the learning domain (math-related domains vs. other domains). These individual analyses will also be performed with the general risk of bias variable (low risk vs. some concerns vs. high risk of bias). For the categorical variables we will perform subgroup analyses, and for the continuous variables we will perform individual meta-regression analyses. Further combinations of moderating variables are not initially hypothesized. A minimum aggregation of three studies will be considered necessary for the analyses to be performed.
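
In metafor-style notation (still an assumption about software, with hypothetical variable names from our coding forms), the two kinds of analyses look as follows:

    # Subgroup analysis for a categorical moderator (e.g., contrasting cases: yes vs. no).
    res_cat <- rma(yi, vi, mods = ~ factor(contrasting_cases), data = dat)
    res_cat$QM   # omnibus test of the moderator
    # Meta-regression for a continuous moderator (e.g., intervention duration in minutes).
    res_con <- rma(yi, vi, mods = ~ duration_minutes, data = dat)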

3.3.11. Sensitivity analysis

We will conduct sensitivity analyses to determine the impact of several decisions, such as removing studies with outlier effect sizes, removing unpublished studies, removing studies with high risk of bias, or using alternative ways for coding or including moderator variables in the analyses.
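
Two of these checks, sketched with the same assumed metafor setup and a hypothetical risk_of_bias column:

    leave1out(res)   # re-estimate the pooled effect omitting one study at a time (outlier check)
    rma(yi, vi, data = dat,
        subset = risk_of_bias != "high")   # re-run the analysis without high risk-of-bias studies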

3.3.12. Summary of findings and assessment of the certainty of the evidence

This is the protocol for a Campbell review whose objective is to explore the efficacy of the educational strategy of problem-solving before instruction (PS-I) in promoting learning and motivation in child and adult students.

CONTRIBUTIONS OF AUTHORS

  • Content: Catherine Chase, Eduardo González-Cabañes, Trinidad García, and José Carlos Núñez.
  • Systematic review methods: González-Cabañes, García, Chase, and Núñez.
  • Information retrieval: González-Cabañes, García, and Chase. We have the advisory assistance of librarians at our universities.

DECLARATIONS OF INTEREST

None of the researchers involved in the team has financial or personal interests in the results of this review, nor do they belong to any organization with such interests. All of us have published studies on the problem-solving before instruction (PS-I) method. This review is designed as an independent study, and procedures will be detailed to allow replication from perspectives different from ours.

SOURCES OF SUPPORT

Internal sources

  • No sources of support provided

External sources

  • Ministry of Universities of the Government of Spain, Spain
  • Scholarship to conduct PhD studies (grant number: FPU16/05802)
  • Ministry of Economy, Industry, and Competitiveness of the Government of Spain, Spain
  • Research Project (Reference PID2019‐107201GB‐100)
  • Principality of Asturias, Spain
  • Research Project (Reference: FC‐GRUPIN‐IDI/2018/000199)

Supporting information


ACKNOWLEDGEMENTS

This research is being funded by the Principality of Asturias (reference FC‐GRUPIN‐IDI/2018/000199), by the Ministry of Economy, Industry, and Competitiveness of the Government of Spain (reference: PID2019‐107201GB‐100) and by a predoctoral grant from the Ministry of Universities of Spain (grant number: FPU16/05802). We would like to thank Cheryl von Asten for her contribution to editing the English of the manuscript, Juan Botella for teaching and assisting us with methodological questions, and Jarson Varela for his consultory assistance with the creation of coding forms.

González-Cabañes, E., García, T., Chase, C., & Núñez, J. C. (2023). PROTOCOL: Problem solving before instruction (PS-I) to promote learning and motivation in child and adult students. Campbell Systematic Reviews, 19, e1337. 10.1002/cl2.1337

OTHER REFERENCES

ADDITIONAL REFERENCES

Asterhan 2009

  • Asterhan, C. S. C., & Schwarz, B. B. (2009). Argumentation and explanation in conceptual change: Indications from protocol analyses of peer-to-peer dialog. Cognitive Science, 33(3), 374–400.

Belenky 2012

  • Belenky, D. M., & Nokes-Malach, T. J. (2012). Motivation and transfer: The role of mastery-approach goals in preparation for future learning. Journal of the Learning Sciences, 21(3), 399–432.

Borenstein 2010

  • Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2010). A basic introduction to fixed-effect and random-effects models for meta-analysis. Research Synthesis Methods, 1(2), 97–111.

Carnero-Sierra 2020

  • Carnero-Sierra, S., & González-Cabañes, E. (2020). Resolución de Problemas Previo a Instrucción, aplicado al aprendizaje online de Modelos de Atención Selectiva. Magister, 1(32), 49–54.

Carriedo 1995

  • Carriedo, N., & Alonso Tapia, J. (1995). Comprehension strategy training in content areas. European Journal of Psychology of Education, 10(4), 411–431.

Chase 2017

  • Chase, C. C., & Klahr, D. (2017). Invention versus direct instruction: For some content, it's a tie. Journal of Science Education and Technology, 26(6), 582–596.

Chi 2000

  • Chi, M. T. H. (2000). Self-explaining expository texts: The dual processes of generating inferences and repairing mental models. Advances in Instructional Psychology, 5(5), 161–238.

Chiu 2008

  • Chiu, M., & Xihua, Z. (2008). Family and motivation effects on mathematics achievement: Analyses of students in 41 countries. Learning and Instruction, 18(4), 321–336.

Clark 2012

  • Clark, R. E., Kirschner, P. A., & Sweller, J. (2012). Putting students on the path to learning: The case for fully guided instruction. American Educator, 36(1), 5–11.

Cohen 1988

  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Routledge.

National Research Council 2003

  • National Research Council. (2003). Engaging schools: Fostering high school students' motivation to learn. National Academies Press.

Darabi 2018

  • Darabi, A., Logan, A. T., & Erkan, S. (2018). Learning from failure: A meta-analysis of the empirical studies. Educational Technology Research and Development, 66(5), 1101–1118.

Fyfe 2014

  • Fyfe, E. R., DeCaro, M. S., & Rittle-Johnson, B. (2014). An alternative time for telling: When conceptual instruction prior to problem solving improves mathematical knowledge. British Journal of Educational Psychology, 84(3), 502–519.

Glogger-Frey 2015

  • Glogger-Frey, I., Fleischer, C., Grueny, L., Kappich, J., & Renkl, A. (2015). Inventing a solution and studying a worked solution prepare differently for learning from direct instruction. Learning and Instruction, 39, 72–87.

Glogger-Frey 2017

  • Glogger-Frey, I., Gaus, K., & Renkl, A. (2017). Learning from direct instruction: Best prepared by several self-regulated or guided invention activities? Learning and Instruction, 51, 26–35.

Golman 2018

  • Golman, R., & Loewenstein, G. (2018). Information gaps: A theory of preferences regarding the presence and absence of information. Decision, 5(3), 143–164.

González-Cabañes 2020

  • González-Cabañes, E., García, T., Rodríguez, C., Cuesta, M., & Núñez, J. C. (2020). Learning and emotional outcomes after the application of invention activities in a sample of university students. Sustainability, 12(18), 7306.

González-Cabañes 2021

  • González-Cabañes, E., García, T., Núñez, J. C., & Rodríguez, C. (2021). Problem-solving before instruction (PS-I): A protocol for assessment and intervention in students with different abilities. JoVE, 175, e62138.

Hedges 1981

  • Hedges, L. V. (1981). Distribution theory for Glass's estimator of effect size and related estimators. Journal of Educational Statistics, 6(2), 107–128.

Higgins 2019

  • Higgins, J. P. T., Eldridge, S., & Li, T. (2019). Chapter 23: Including variants on randomized trials. In J. P. T. Higgins, J. Thomas, J. Chandler, M. Cumpston, T. Li, M. J. Page, & V. A. Welch (Eds.), Cochrane handbook for systematic reviews of interventions version 6.0 (updated July 2019) (pp. 569–593). Cochrane.

Holmes 2014

  • Holmes, N. G., Day, J., Park, A. H. K., Bonn, D. A., & Roll, I. (2014). Making the failure more productive: Scaffolding the invention process to improve inquiry behaviors and outcomes in invention activities. Instructional Science, 42(4), 523–538.

Jackson 2021

  • Jackson, A., Godwin, A., Bartholomew, S., & Mentzer, N. (2021). Learning from failure: A systematized review. International Journal of Technology and Design Education, 1, 1–21.

Kahneman 2011

  • Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Kapur 2008

  • Kapur, M. (2008). Productive failure. Cognition and Instruction, 26(3), 379–425.

Kapur 2010

  • Kapur, M. (2010). Productive failure in mathematical problem solving. Instructional Science, 38(6), 523–550.

Kapur 2011

  • Kapur, M. (2011). A further study of productive failure in mathematical problem solving: Unpacking the design components. Instructional Science, 39(4), 561–579.

Kapur 2011a

  • Kapur, M., & Bielaczyc, K. (2011). Classroom-based experiments in productive failure. In L. Carlson, C. Hoelscher, & T. F. Shipley (Eds.), Proceedings of the 33rd Annual Meeting of the Cognitive Science Society (pp. 2812–2817). Cognitive Science Society.

Kapur 2012

  • Kapur, M. (2012). Productive failure in learning the concept of variance. Instructional Science, 40(4), 651–672.

Kapur 2012a

  • Kapur, M., & Bielaczyc, K. (2012). Designing for productive failure. Journal of the Learning Sciences, 21(1), 45–83.

Kapur 2014

  • Kapur, M. (2014). Productive failure in learning math. Cognitive Science, 38(5), 1008–1022.

Kapur 2014a

  • Kapur, M. (2014). Comparing learning from productive failure and vicarious failure. Journal of the Learning Sciences, 23(4), 651–677.

Kugley 2017

  • Kugley, S., Wade, A., Thomas, J., Mahood, Q., Jørgensen, A.-M. K., Hammerstrøm, K., & Sathe, N. (2017). Searching for studies: A guide to information retrieval for Campbell systematic reviews. Campbell Systematic Reviews, 13(1), 1–73.

Lamnina 2019

  • Lamnina, M., & Chase, C. C. (2019). Developing a thirst for knowledge: How uncertainty in the classroom influences curiosity, affect, learning, and transfer. Contemporary Educational Psychology, 59, 101785.

Likourezos 2017

  • Likourezos, V., & Kalyuga, S. (2017). Instruction-first and problem-solving-first approaches: Alternative pathways to learning complex tasks. Instructional Science, 45(2), 195–219.

Liu 2020

  • Liu, Y., Hau, K.-T., & Zheng, X. (2020). Does instrumental motivation help students with low intrinsic motivation? Comparison between Western and Confucian students. International Journal of Psychology, 55(2), 182–191.

Loderer 2018

  • Loderer, K., Pekrun, R., & Lester, J. C. (2018). Beyond cold technology: A systematic review and meta-analysis on emotions in technology-based learning environments. Learning and Instruction, 70, 101162.

Loewenstein 1994

  • Loewenstein, G. (1994). The psychology of curiosity: A review and reinterpretation. Psychological Bulletin, 116(1), 75–98.

Loibl 2014

  • Loibl, K., & Rummel, N. (2014). Knowing what you don't know makes failure productive. Learning and Instruction, 34, 74–85.

Loibl 2014a

  • Loibl, K., & Rummel, N. (2014). The impact of guidance during problem-solving prior to instruction on students' inventions and learning outcomes. Instructional Science, 42(3), 305–326.

Loibl 2017

  • Loibl, K., Roll, I., & Rummel, N. (2017). Towards a theory of when and how problem solving followed by instruction supports learning. Educational Psychology Review, 29(4), 693–715.

Loibl 2020

  • Loibl, K., Tillema, M., Rummel, N., & van Gog, T. (2020). The effect of contrasting cases during problem solving prior to and after instruction. Instructional Science, 48(2), 115–136.

Mallart 2014

  • Mallart, S. A. (2014). La resolución de problemas en la prueba de Matemáticas de acceso a la universidad: Procesos y errores. Educatio Siglo XXI, 32, 233–254.

Mayer 2003

  • Mayer, R. E. (2003). Learning and instruction. Prentice Hall.

Mazziotti 2019

  • Mazziotti, C., Rummel, N., Deiglmayr, A., & Loibl, K. (2019). Probing boundary conditions of productive failure and analyzing the role of young students' collaboration. NPJ Science of Learning, 4, 2.

Mega 2014

  • Mega, C., Ronconi, L., & De Beni, R. (2014). What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. Journal of Educational Psychology, 106(1), 121–131.

Moore 1998

  • Moore, J. L., & Schwartz, D. L. (1998). On learning the relationship between quantitative properties and symbolic representations. In A. Bruckman, M. Guzdial, J. Kolodner, & A. Ram (Eds.), Proceedings of the International Conference of the Learning Sciences (pp. 209–214). Association for the Advancement of Computing in Education.

Newman 2019

  • Newman, P. M., & DeCaro, M. S. (2019). Learning by exploring: How much guidance is optimal? Learning and Instruction, 62, 49–63.

Núñez 2009

  • Núñez, J. C. (2009). Motivación, aprendizaje y rendimiento académico. In B. Duarte Silva, L. Almeida, & A. Barca Lozano (Eds.), X Congresso Internacional Galego-Português de Psicopedagogia (pp. 41–67). Universidade do Minho.

OECD 2014

  • OECD. (2014). PISA 2012 results: Creative problem solving: Students' skills in tackling real-life problems (Volume V). OECD Publishing.

OECD 2016

  • OECD. (2016). PISA 2015 results (Volume I): Excellence and equity in education. OECD Publishing.

OECD 2018

  • OECD. (2018). PISA 2018 database.

Pan 2020

  • Pan, S. C., Sana, F., Samani, J., Cooke, J., & Kim, J. A. (2020). Learning from errors: Students' and instructors' practices, attitudes, and beliefs. Memory, 28, 1–18.

Pekrun 2011

  • Pekrun, R., Goetz, T., Frenzel, A. C., Barchfeld, P., & Perry, R. P. (2011). Measuring emotions in students' learning and performance: The Achievement Emotions Questionnaire (AEQ). Contemporary Educational Psychology, 36(1), 36–48.

Pekrun 2012

  • Pekrun, R., & Stephens, E. J. (2012). Academic emotions. In K. R. Harris, S. Graham, T. Urdan, S. Graham, J. M. Royer, & M. Zeidner (Eds.), APA educational psychology handbook, Vol. 2: Individual differences and cultural and contextual factors (pp. 3–31). American Psychological Association.

Roll 2012

  • Roll, I., Holmes, N. G., Day, J., & Bonn, D. (2012). Evaluating metacognitive scaffolding in guided invention activities. Instructional Science, 40(4), 691–710.

Ryan 2009

  • Ryan, R. M., & Deci, E. L. (2009). Promoting self-determined school engagement: Motivation, learning, and well-being. In K. Wentzel & A. Wigfield (Eds.), Handbook of motivation at school (pp. 171–196). Routledge.

Salmerón 2013

  • Salmerón, L. (2013). Actividades que promueven la transferencia de los aprendizajes: Una revisión de la literatura. Revista de Educación, Extra(1), 34–53.

Schraw 1994

  • Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460–475.

Schwartz 1998

  • Schwartz, D. L., & Bransford, J. D. (1998). A time for telling. Cognition and Instruction, 16(4), 475–522.

Schwartz 2004

  • Schwartz, D. L., & Martin, T. (2004). Inventing to prepare for future learning: The hidden efficiency of encouraging original student production in statistics instruction. Cognition and Instruction, 22(2), 129–184.

Schwartz 2011

  • Schwartz, D. L., Chase, C. C., Oppezzo, M. A., & Chin, D. B. (2011). Practicing versus inventing with contrasting cases: The effects of telling first on learning and transfer. Journal of Educational Psychology, 103(4), 759–775.

Silver 2000

  • Silver, E. A., & Kenney, P. A. (2000). Results from the seventh mathematics assessment of the National Assessment of Educational Progress. National Council of Teachers of Mathematics.

Sinha 2021

  • Sinha, T., & Kapur, M. (2021). When problem solving followed by instruction works: Evidence for productive failure. Review of Educational Research, 91(5), 761–798.

Smith 1992

  • Smith, E. E., & Swinney, D. A. (1992). The role of schemas in text: A real-time examination. Discourse Processes, 15(3), 303–316.

Song 2018

  • Song, Y. (2018). Improving primary students' collaborative problem solving competency in project-based science learning with productive failure instructional design in a seamless learning environment. Educational Technology Research and Development, 66(4), 979–1008.

Sterne 2019

  • Sterne, J. A. C., Savović, J., Page, M. J., Elbers, R. G., Blencowe, N. S., Boutron, I., Cates, C. J., Cheng, H.-Y., Corbett, M. S., Eldridge, S. M., Hernán, M. A., Hopewell, S., Hróbjartsson, A., Junqueira, D. R., Jüni, P., Kirkham, J. J., Lasserson, T., Li, T., McAleenan, A., Reeves, B. C., Shepperd, S., Shrier, I., Stewart, L. A., Tilling, K., White, I. R., Whiting, P. F., & Higgins, J. P. T. (2019). RoB 2: A revised tool for assessing risk of bias in randomised trials. BMJ, 366, l4898.

Sweller 2019

  • Sweller, J., van Merrienboer, J. J. G., & Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. Educational Psychology Review, 31(2), 261–292.

Tandogan 2007

  • Tandogan, R. O., & Orhan, A. (2007). The effects of problem-based active learning in science education on students' academic achievement, attitude and concept learning. Eurasia Journal of Mathematics, Science & Technology Education, 3(1), 71–81.

Teasley 1995

  • Teasley, S. D. (1995). The role of talk in children's peer collaborations. Developmental Psychology, 31(2), 207–220.

Urdan 2006

  • Urdan, T., & Schoenfelder, E. (2006). Classroom effects on student motivation: Goal structures, social relationships, and competence beliefs. Journal of School Psychology, 44(5), 331–349.

VanLehn 1999

  • VanLehn, K. (1999). Rule-learning events in the acquisition of a complex skill: An evaluation of Cascade. Journal of the Learning Sciences, 8(1), 71–125.

Veenman 2005

  • Veenman, M. V. J., & Spaans, M. A. (2005). Relation between intellectual and metacognitive skills: Age and task differences. Learning and Individual Differences, 15(2), 159–176.

Weaver 2018

  • Weaver, J. P., Chastain, R. J., DeCaro, D. A., & DeCaro, M. S. (2018). Reverse the routine: Problem solving before instruction improves conceptual knowledge in undergraduate physics. Contemporary Educational Psychology, 52, 36–47.

Webb 1993

  • Webb, N. M. (1993). Collaborative group versus individual assessment in mathematics: Processes and outcomes. Educational Assessment, 1(2), 131–152.

Webb 2014

  • Webb, N. M., Franke, M. L., Ing, M., Wong, J., Fernandez, C. H., Shin, N., & Turrou, A. C. (2014). Engaging with others' mathematical ideas: Interrelationships among student participation, teachers' instructional practices, and learning. International Journal of Educational Research, 63, 79–93.


Original Research Article

Mathematical Problem-Solving Through Cooperative Learning—The Importance of Peer Acceptance and Friendships


  • 1 Department of Education, Uppsala University, Uppsala, Sweden
  • 2 Department of Education, Culture and Communication, Mälardalen University, Västerås, Sweden
  • 3 School of Natural Sciences, Technology and Environmental Studies, Södertörn University, Huddinge, Sweden
  • 4 Faculty of Education, Gothenburg University, Gothenburg, Sweden

Mathematical problem-solving constitutes an important area of mathematics instruction, and there is a need for research on instructional approaches supporting student learning in this area. This study aims to contribute to previous research by studying the effects of a cooperative learning instructional approach on students’ mathematical problem-solving in heterogeneous classrooms in grade five, in which students with special needs are educated alongside their peers. The intervention combined a cooperative learning approach with instruction in problem-solving strategies including mathematical models of multiplication/division, proportionality, and geometry. The teachers in the experimental group received training in cooperative learning and mathematical problem-solving, and implemented the intervention for 15 weeks. The teachers in the control group received training in mathematical problem-solving and provided instruction as usual. Students (269 in the intervention and 312 in the control group) participated in tests of mathematical problem-solving in the areas of multiplication/division, proportionality, and geometry before and after the intervention. The results revealed significant effects of the intervention on student performance in overall problem-solving and problem-solving in geometry. The students who received higher scores on social acceptance and friendships on the pre-test also received higher scores on the selected tests of mathematical problem-solving. Thus, the cooperative learning approach may lead to gains in mathematical problem-solving in heterogeneous classrooms, but social acceptance and friendships may also greatly impact students’ results.

Introduction

The research on instruction in mathematical problem-solving has progressed considerably during recent decades. Yet, there is still a need to advance our knowledge of how teachers can support their students in carrying out this complex activity ( Lester and Cai, 2016 ). Results from the Program for International Student Assessment (PISA) show that only 53% of students from the participating countries could solve problems requiring more than direct inference and using representations from different information sources ( OECD, 2019 ). In addition, OECD (2019) reported a large variation in achievement with regard to students’ diverse backgrounds. Thus, there is a need for instructional approaches to promote students’ problem-solving in mathematics, especially in heterogeneous classrooms in which students with diverse backgrounds and needs are educated together. Small group instructional approaches have been suggested as important for promoting the learning of low-achieving students and students with special needs ( Kunsch et al., 2007 ). One such approach is cooperative learning (CL), which involves structured collaboration in heterogeneous groups, guided by five principles to enhance group cohesion ( Johnson et al., 1993 ; Johnson et al., 2009 ; Gillies, 2016 ). While CL has been well-researched in whole classroom approaches ( Capar and Tarim, 2015 ), few studies of the approach exist with regard to students with special educational needs (SEN; McMaster and Fuchs, 2002 ). This study contributes to previous research by studying the effects of the CL approach on students’ mathematical problem-solving in heterogeneous classrooms, in which students with special needs are educated alongside their peers.

Group collaboration through the CL approach is structured in accordance with five principles of collaboration: positive interdependence, individual accountability, explicit instruction in social skills, promotive interaction, and group processing ( Johnson et al., 1993 ). First, the group tasks need to be structured so that all group members feel dependent on each other in the completion of the task, thus promoting positive interdependence. Second, for individual accountability, the teacher needs to assure that each group member feels responsible for his or her share of work, by providing opportunities for individual reports or evaluations. Third, the students need explicit instruction in social skills that are necessary for collaboration. Fourth, the tasks and seat arrangements should be designed to promote interaction among group members. Fifth, time needs to be allocated to group processing, through which group members can evaluate their collaborative work to plan future actions. Using these principles for cooperation leads to gains in mathematics, according to Capar and Tarim (2015) , who conducted a meta-analysis on studies of cooperative learning and mathematics, and found an increase of .59 on students’ mathematics achievement scores in general. However, the number of reviewed studies was limited, and researchers suggested a need for more research. In the current study, we focused on the effect of CL approach in a specific area of mathematics: problem-solving.

Mathematical problem-solving is a central area of mathematics instruction, constituting an important part of preparing students to function in modern society ( Gravemeijer et al., 2017 ). In fact, problem-solving instruction creates opportunities for students to apply their knowledge of mathematical concepts, integrate and connect isolated pieces of mathematical knowledge, and attain a deeper conceptual understanding of mathematics as a subject ( Lester and Cai, 2016 ). Some researchers suggest that mathematics itself is a science of problem-solving and of developing theories and methods for problem-solving ( Hamilton, 2007 ; Davydov, 2008 ).

Problem-solving processes have been studied from different perspectives ( Lesh and Zawojewski, 2007 ). Pólya’s (1948) problem-solving heuristic has largely influenced our perceptions of problem-solving, and it includes four principles: understanding the problem, devising a plan, carrying out the plan, and looking back and reflecting upon the suggested solution. Schoenfeld (2016) suggested the use of specific problem-solving strategies for different types of problems, which take into consideration metacognitive processes and students’ beliefs about problem-solving. Further, models and modelling perspectives on mathematics ( Lesh and Doerr, 2003 ; Lesh and Zawojewski, 2007 ) emphasize the importance of engaging students in model-eliciting activities in which problem situations are interpreted mathematically, as students make connections between problem information and knowledge of mathematical operations, patterns, and rules ( Mousoulides et al., 2010 ; Stohlmann and Albarracín, 2016 ).

Not all students, however, find it easy to solve complex mathematical problems. Students may experience difficulties in identifying solution-relevant elements in a problem or visualizing an appropriate solution to a problem situation. Furthermore, students may need help recognizing the underlying model in problems. For example, in two studies by Degrande et al. (2016) , students in grades four to six were presented with mathematical problems in the context of proportional reasoning. The authors found that the students, when presented with a word problem, could not identify an underlying model, but rather focused on superficial characteristics of the problem. Although the students in the study showed more success when presented with a problem formulated in symbols, the authors pointed out a need for activities that help students distinguish between different proportional problem types. Furthermore, students exhibiting specific learning difficulties may need additional support in both general problem-solving strategies ( Lein et al., 2020 ; Montague et al., 2014 ) and specific strategies pertaining to underlying models in problems. The CL intervention in the present study focused on supporting students in problem-solving, through instruction in problem-solving principles ( Pólya, 1948 ), specifically applied to three models of mathematical problem-solving—multiplication/division, geometry, and proportionality.

Students’ problem-solving may be enhanced through participation in small group discussions. In a small group setting, all the students have the opportunity to explain their solutions, clarify their thinking, and enhance understanding of a problem at hand ( Yackel et al., 1991 ; Webb and Mastergeorge, 2003 ). In fact, small group instruction promotes students’ learning in mathematics by providing students with opportunities to use language for reasoning and conceptual understanding ( Mercer and Sams, 2006 ), to exchange different representations of the problem at hand ( Fujita et al., 2019 ), and to become aware of and understand groupmates’ perspectives in thinking ( Kazak et al., 2015 ). These opportunities for learning are created through dialogic spaces characterized by openness to each other’s perspectives and solutions to mathematical problems ( Wegerif, 2011 ).

However, group collaboration is not only associated with positive experiences. In fact, studies show that some students may not be given equal opportunities to voice their opinions, due to academic status differences ( Langer-Osuna, 2016 ). Indeed, problem-solvers struggling with complex tasks may experience negative emotions and the uncertainty of not knowing the definite answer, which places demands on peer support ( Jordan and McDaniel, 2014 ; Hannula, 2015 ). Thus, especially in heterogeneous groups, students may need additional support to promote group interaction. Therefore, in this study, we used a cooperative learning approach, which, in contrast to collaborative learning approaches, puts greater focus on supporting group cohesion through instruction in social skills and time for reflection on group work ( Davidson and Major, 2014 ).

Although the cooperative learning approach is intended to promote cohesion and peer acceptance in heterogeneous groups ( Rzoska and Ward, 1991 ), previous studies indicate that challenges in group dynamics may lead to unequal participation ( Mulryan, 1992 ; Cohen, 1994 ). Peer-learning behaviours may impact students’ problem-solving ( Hwang and Hu, 2013 ), and working in groups with peers who are seen as friends may enhance students’ motivation to learn mathematics ( Deacon and Edwards, 2012 ). With the importance of peer support in mind, this study set out to investigate whether the results of the intervention using the CL approach are associated with students’ peer acceptance and friendships.

The Present Study

In previous research, the CL approach has been shown to be promising for teaching and learning mathematics ( Capar and Tarim, 2015 ), but fewer studies have been conducted on whole-class approaches in general and on students with SEN in particular ( McMaster and Fuchs, 2002 ). This study aims to contribute to previous research by investigating the effect of the CL intervention on students’ mathematical problem-solving in grade 5. With regard to the complexity of mathematical problem-solving ( Lesh and Zawojewski, 2007 ; Degrande et al., 2016 ; Stohlmann and Albarracín, 2016 ), the CL approach in this study was combined with problem-solving principles pertaining to three underlying models of problem-solving—multiplication/division, geometry, and proportionality. Furthermore, considering the importance of peer support in problem-solving in small groups ( Mulryan, 1992 ; Cohen, 1994 ; Hwang and Hu, 2013 ), the study investigated how peer acceptance and friendships were associated with the effect of the CL approach on students’ problem-solving abilities. The study aimed to find answers to the following research questions:

a) What is the effect of CL approach on students’ problem-solving in mathematics?

b) Are social acceptance and friendship associated with the effect of CL on students’ problem-solving in mathematics?

Participants

The participants were 958 students in grade 5 and their teachers. According to power analyses prior to the start of the study, 1,020 students and 51 classes were required, with an expected effect size of 0.30 and power of 80%, provided that there are 20 students per class and the intraclass correlation is 0.10. An invitation to participate in the project was sent to teachers in five municipalities via e-mail. Furthermore, the information was posted on the website of Uppsala University and distributed via Facebook interest groups. As shown in Figure 1 , teachers of 1,165 students agreed to participate in the study, but informed consent was obtained only for 958 students (463 in the intervention and 495 in the control group). Further attrition occurred at pre- and post-measurement, resulting in 581 students’ tests as a basis for analyses (269 in the intervention and 312 in the control group). Fewer students (n = 493) were finally included in the analyses of the association between students’ social acceptance and friendships and the effect of CL on students’ mathematical problem-solving (219 in the intervention and 274 in the control group). The reasons for attrition included teacher dropout due to sick leave or personal circumstances (two teachers in the control group and five teachers in the intervention group). Furthermore, some students were sick on the day of data collection, and some teachers did not send the test results to the researchers.
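
The reported figures are consistent with a standard two-step calculation: an individual-level power analysis inflated by the design effect for clustering. A reconstruction in R with the pwr package (the authors do not state their procedure or software, so this is only a plausible reading of the numbers):

    library(pwr)
    n_ind <- 2 * pwr.t.test(d = 0.30, power = 0.80)$n  # ~351 students, ignoring clustering
    de    <- 1 + (20 - 1) * 0.10                       # design effect = 2.9 (20 per class, ICC .10)
    n_tot <- ceiling(n_ind * de)                       # ~1,018 students, close to the reported 1,020
    n_cls <- ceiling(n_tot / 20)                       # ~51 classes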


FIGURE 1 . Flow chart for participants included in data collection and data analysis.

As seen in Table 1 , classes in both intervention and control groups included 27 students on average. For 75% of the classes, 33–36% of the students had SEN. In Sweden, no formal medical diagnosis is required for the identification of students with SEN. It is teachers and school welfare teams who decide on students’ need for extra adaptations or special support ( Swedish National Educational Agency, 2014 ). The information on individual students’ type of SEN could not be obtained due to regulations on the protection of information about individuals ( SFS 2009 ). Therefore, the information on the number of students with SEN at class level was obtained through teacher reports.


TABLE 1 . Background characteristics of classes and teachers in intervention and control groups.

Intervention

The intervention using the CL approach lasted for 15 weeks, and the teachers worked with the CL approach three to four lessons per week. First, the teachers participated in two days of training on the CL approach, using a CL manual elaborated for the project ( Klang et al., 2018 ). The training focused on the five principles of the CL approach (positive interdependence, individual accountability, explicit instruction in social skills, promotive interaction, and group processing). Following the training, the teachers introduced the CL approach in their classes and focused on group-building activities for 7 weeks. Then, 2 days of training were provided to the teachers, in which the CL approach was embedded in activities in mathematical problem-solving and reading comprehension. Educational materials containing mathematical problems in the areas of multiplication and division, geometry, and proportionality were distributed to the teachers ( Karlsson and Kilborn, 2018a ). In addition to the specific problems, adapted for the CL approach, the educational materials contained guidance for the teachers, in which problem-solving principles ( Pólya, 1948 ) were presented as steps in problem-solving. Following the training, the teachers applied the CL approach in mathematical problem-solving lessons for 8 weeks.

Solving a problem is a matter of goal-oriented reasoning, starting from understanding the problem and proceeding to devising its solution by using known mathematical models. This presupposes that the problem is chosen from a known context ( Stillman et al., 2008 ; Zawojewski, 2010 ). This differs from textbook problem-solving, which aims to train already known formulas and procedures ( Hamilton, 2007 ). Moreover, it is important that students learn modelling according to their current abilities and conditions ( Russel, 1991 ).

In order to create similar conditions in the experimental group and the control group, the teachers were expected to use the same educational material ( Karlsson and Kilborn, 2018a ; Karlsson and Kilborn, 2018b ), written in light of the specified view of problem-solving. The educational material is divided into three areas—multiplication/division, geometry, and proportionality—and begins with a short teachers’ guide presenting a view of problem-solving based on the work of Pólya (1948) and Lester and Cai (2016) . The tasks are constructed in such a way that conceptual knowledge, not formulas and procedural knowledge, is in focus.

Implementation of the Intervention

To ensure the implementation of the intervention, the researchers visited each teacher’s classroom twice during the two phases of the intervention period, as described above. During each visit, the researchers observed the lesson, using a checklist comprising the five principles of the CL approach. After the lesson, the researchers gave written and oral feedback to each teacher. As seen in Table 1 , in 18 of the 23 classes, the teachers implemented the intervention in accordance with the principles of CL. In addition, the teachers were asked to report on the use of the CL approach in their teaching and the use of problem-solving activities embedding CL during the intervention period. As shown in Table 1 , teachers in only 11 of 23 classes reported using the CL approach and problem-solving activities embedded in the CL approach at least once a week.

Control Group

The teachers in the control group received 2 days of instruction in enhancing students’ problem-solving and reading comprehension. The teachers were also supported with educational materials including mathematical problems ( Karlsson and Kilborn, 2018b ) and problem-solving principles ( Pólya, 1948 ). However, none of the activities during training or in the educational materials included the CL approach. As seen in Table 1 , only 10 of 25 teachers reported devoting at least one lesson per week to mathematical problem-solving.

Tests of Mathematical Problem-Solving

Tests of mathematical problem-solving were administered before and after the intervention, which lasted for 15 weeks. The tests were focused on the models of multiplication/division, geometry, and proportionality. The three models were chosen based on the syllabus of the subject of mathematics in grades 4 to 6 in the Swedish National Curriculum ( Swedish National Educational Agency, 2018 ). In addition, the intention was to create a variation of types of problems to solve. For each of these three models, there were two tests, a pre-test and a post-test. Each test contained three tasks with increasing difficulty ( Supplementary Appendix SA ).

The tests of multiplication and division (Ma1) were chosen from different contexts and began with a one-step problem, while the following two tasks were multi-step problems. Concerning multiplication, many students in grade 5 still understand multiplication as repeated addition, causing significant problems, as this conception is not applicable to multiplication beyond natural numbers ( Verschaffel et al., 2007 ). This might be a hindrance in developing multiplicative reasoning ( Barmby et al., 2009 ). The multi-step problems in this study were constructed to support the students in multiplicative reasoning.

Concerning the geometry tests (Ma2), it was important to consider a paradigm shift concerning geometry in education that occurred in the mid-20th century, when strict Euclidean geometry gave way to other aspects of geometry like symmetry, transformation, and patterns. van Hiele (1986) prepared a new taxonomy for geometry in five steps, from a visual to a logical level. Therefore, in the tests there was a focus on properties of quadrangles and triangles, and how to determine areas by reorganising figures into new patterns. This means that structure was more important than formulas.

The construction of tests of proportionality (Ma3) was more complicated. Firstly, tasks on proportionality can be found in many different contexts, such as prescriptions, scales, speeds, discounts, interest, etc. Secondly, the mathematical model is complex and requires good knowledge of rational numbers and ratios ( Lesh et al., 1988 ). It also requires a developed view of multiplication, useful in operations with real numbers, not only as repeated addition, an operation limited to natural numbers ( Lybeck, 1981 ; Degrande et al., 2016 ). A linear structure of multiplication as repeated addition leads to limitations in terms of generalization and development of the concept of multiplication. This became evident in a study carried out in a Swedish context ( Karlsson and Kilborn, 2018c ). Proportionality can be expressed as a/b = c/d or as a/b = k. The latter can also be expressed as a = b∙k, where k is a constant that determines the relationship between a and b. Common examples of k are speed (km/h), scale, and interest (%). An important piece of pre-knowledge for dealing with proportions is to master fractions as equivalence classes like 1/3 = 2/6 = 3/9 = 4/12 = 5/15 = 6/18 = 7/21 = 8/24 … ( Karlsson and Kilborn, 2020 ). It was important to take all these aspects into account when constructing and assessing the solutions of the tasks.

The tests were graded by an experienced teacher of mathematics (4th author) and two students in their final year of teacher training. Prior to grading, acceptable levels of inter-rater reliability were achieved by independent rating of students’ solutions and discussions in which differences between the graders were resolved. Each student response was assigned one point when it contained a correct answer and two points when the student provided argumentation for the correct answer and elaborated on the explanation of his or her solution. The assessment was thus based on quality aspects with a focus on conceptual knowledge. As each subtest contained three questions, it generated three student solutions. Scores for each subtest therefore ranged from 0 to 6 points, and total scores from 0 to 18 points. To ascertain that pre- and post-tests were equivalent in degree of difficulty, the tests were administered to an additional sample of 169 students in grade 5. Tests for each model were conducted separately, as students participated in the pre- and post-test for each model during the same lesson. The order of tests was switched for half of the students in order to avoid the effect of the order in which the pre- and post-tests were presented. The correlation between students’ performance on pre- and post-test was .39 ( p < .001) for tests of multiplication/division; .48 ( p < .001) for tests of geometry; and .56 ( p < .001) for tests of proportionality. Thus, the degree of difficulty may have differed between pre- and post-test.

Measures of Peer Acceptance and Friendships

To investigate students’ peer acceptance and friendships, peer nominations rated pre- and post-intervention were used. Students were asked to nominate peers whom they preferred to work in groups with and whom they preferred to be friends with. Negative peer nominations were avoided due to ethical considerations raised by teachers and parents ( Child and Nind, 2013 ). Unlimited nominations were used, as these are considered to have high ecological validity ( Cillessen and Marks, 2017 ). Peer nominations were used as a measure of social acceptance, and reciprocated nominations were used as a measure of friendship. The number of nominations for each student was aggregated and divided by the number of nominators to create a proportion of nominations for each student ( Velásquez et al., 2013 ).
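
A small sketch of these two measures in R (our reconstruction of the prose description; the toy nomination matrix and the choice of denominator are assumptions):

    # nom[i, j] = 1 if student i nominated student j; three hypothetical students.
    nom <- matrix(c(0, 1, 1,
                    1, 0, 0,
                    1, 1, 0), nrow = 3, byrow = TRUE)
    n_nominators <- nrow(nom) - 1                 # assuming all classmates acted as nominators
    acceptance <- colSums(nom) / n_nominators     # proportion of nominations received
    recip      <- nom * t(nom)                    # keeps reciprocated nominations only
    friendship <- colSums(recip) / n_nominators   # proportion of mutual nominations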

Statistical Analyses

Multilevel regression analyses were conducted in R with the lme4 package ( Bates et al., 2015 ) to account for nestedness in the data. Students’ classroom belonging was treated as a level 2 variable. First, we used a model in which students’ results on tests of problem-solving were studied as a function of time (pre- and post-test) and group belonging (intervention and control group). Second, the same model was applied to subgroups of students who performed above and below the median at pre-test, to explore whether the CL intervention had a differential effect on student performance. In this second model, the results could not be obtained for the geometry tests for the subgroup below the median or for the proportionality tests for the subgroup above the median; a possible reason for this may have been the skewed distribution of the students in these subgroups. Therefore, another model was applied that investigated students’ performance in math at both pre- and post-test as a function of group belonging. Third, the students’ scores on social acceptance and friendships were added as an interaction term to the first model. In our previous study, students’ social acceptance changed as a result of the same CL intervention ( Klang et al., 2020 ).
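
A sketch of the first model in lme4, which the authors name; the formula and the column names (score, time, group, classroom) are our reading of the description rather than the authors’ code:

    library(lme4)
    # Problem-solving score as a function of time, group, and their interaction,
    # with a random intercept for classroom (the level 2 unit).
    m1 <- lmer(score ~ time * group + (1 | classroom), data = long_data)
    summary(m1)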

The assumptions for the multilevel regression were checked during the analyses ( Snijders and Bosker, 2012 ). The assumption of normality of residuals was met, as controlled by visual inspection of quantile-quantile plots. For subgroups, however, the plotted residuals deviated somewhat from the straight line. The number of outliers, which had a studentized residual value greater than ±3, varied from 0 to 5, but none of the outliers had a Cook’s distance value larger than 1. There was no evidence of problematic multicollinearity, as the variance inflation factors (VIF) did not exceed a value of 10. Before the analyses, the cases with missing data were deleted listwise.

What Is the Effect of the CL Approach on Students’ Problem-Solving in Mathematics?

As seen in the regression coefficients in Table 2 , the CL intervention had a significant effect on students’ mathematical problem-solving total scores and students’ scores in problem solving in geometry (Ma2). Judging by mean values, students in the intervention group appeared to have low scores on problem-solving in geometry but reached the levels of problem-solving of the control group by the end of the intervention. The intervention did not have a significant effect on students’ performance in problem-solving related to models of multiplication/division and proportionality.

www.frontiersin.org

TABLE 2 . Mean scores (standard deviation in parentheses) and unstandardized multilevel regression estimates for tests of mathematical problem-solving.

The question is, however, whether the CL intervention affected students with different pre-test scores differently. Table 2 includes the regression coefficients for subgroups of students who performed below and above the median at pre-test. As seen in the table, the CL approach did not have a significant effect on students’ problem-solving when the sample was divided into these subgroups. A small negative effect was found for the intervention group in comparison to the control group, but the confidence intervals (CI) for the effect indicate that it was not significant.

Are Social Acceptance and Friendships Associated With the Effect of CL on Students’ Problem-Solving in Mathematics?

As seen in Table 3 , students’ peer acceptance and friendship at pre-test were significantly associated with the effect of the CL approach on students’ mathematical problem-solving scores. Changes in students’ peer acceptance and friendships were not significantly associated with the effect of the CL approach on students’ mathematical problem-solving. Consequently, it can be concluded that being nominated by one’s peers and having friends at the start of the intervention may be an important factor when participation in group work, structured in accordance with the CL approach, leads to gains in mathematical problem-solving.

www.frontiersin.org

TABLE 3 . Mean scores (standard deviation in parentheses) and unstandardized multilevel regression estimates for tests of mathematical problem-solving, including scores of social acceptance and friendship in the model.

In light of the limited number of studies on the effects of CL on students’ problem-solving in whole classrooms ( Capar and Tarim, 2015 ), and for students with SEN in particular ( McMaster and Fuchs, 2002 ), this study sought to investigate whether the CL approach embedded in problem-solving activities has an effect on students’ problem-solving in heterogeneous classrooms. The need for the study was justified by the challenge of providing equitable mathematics instruction to heterogeneous student populations ( OECD, 2019 ). Small group instructional approaches such as CL are considered promising in this regard ( Kunsch et al., 2007 ). The results showed a significant effect of the CL approach on students’ problem-solving in geometry and on total problem-solving scores. In addition, with regard to the importance of peer support in problem-solving ( Deacon and Edwards, 2012 ; Hwang and Hu, 2013 ), the study explored whether the effect of CL on students’ problem-solving was associated with students’ social acceptance and friendships. The results showed that students’ peer acceptance and friendships at pre-test were significantly associated with the effect of the CL approach, while change in students’ peer acceptance and friendships from pre- to post-test was not.

The results of the study confirm previous research on the effect of the CL approach on students’ mathematical achievement ( Capar and Tarim, 2015 ). The specific contribution of the study is that it was conducted in classrooms, in 75% of which 33–36% of the students had SEN. Thus, while a previous review revealed inconclusive findings on the effects of CL on student achievement ( McMaster and Fuchs, 2002 ), the current study adds to the evidence of the effect of the CL approach in heterogeneous classrooms, in which students with special needs are educated alongside their peers. In a small group setting, the students have opportunities to discuss their ideas of solutions to the problem at hand, providing explanations and clarifications, thus enhancing their understanding of problem-solving ( Yackel et al., 1991 ; Webb and Mastergeorge, 2003 ).

In this study, in accordance with previous research on mathematical problem-solving ( Lesh and Zawojewski, 2007 ; Degrande et al., 2016 ; Stohlmann and Albarracín, 2016 ), the CL approach was combined with training in problem-solving principles ( Pólya, 1948 ) and educational materials providing support for instruction in underlying mathematical models. The intention of the study was to provide evidence for the effectiveness of the CL approach over and above instruction in problem-solving, as problem-solving materials were accessible to teachers of both the intervention and control groups. However, due to implementation challenges, not all teachers in the intervention and control groups reported using the educational materials and training as expected. Thus, it is not possible to draw conclusions about the effectiveness of the CL approach alone. However, in everyday classroom instruction it may be difficult to separate the content of instruction from the activities that are used to mediate this content ( Doerr and Tripp, 1999 ; Gravemeijer, 1999 ).

Furthermore, for successful instruction in mathematical problem-solving, scaffolding for content needs to be combined with scaffolding for dialogue ( Kazak et al., 2015 ). From a dialogical perspective ( Wegerif, 2011 ), students may need scaffolding in new ways of thinking, involving questioning their understandings and providing arguments for their solutions, in order to create dialogic spaces in which different solutions are voiced and negotiated. In this study, small group instruction through CL approach aimed to support discussions in small groups, but the study relies solely on quantitative measures of students’ mathematical performance. Video-recordings of students’ discussions may have yielded important insights into the dialogic relationships that arose in group discussions.

Despite the positive findings of the CL approach on students’ problem-solving, it is important to note that the intervention did not have an effect on students’ problem-solving pertaining to models of multiplication/division and proportionality. Although CL is assumed to be a promising instructional approach, the number of studies on its effect on students’ mathematical achievement is still limited ( Capar and Tarim, 2015 ). Thus, further research is needed on how CL intervention can be designed to promote students’ problem-solving in other areas of mathematics.

The results of this study show that the effect of the CL intervention on students’ problem-solving was associated with students’ initial scores of social acceptance and friendships. Thus, it is possible to assume that students who were popular among their classmates and had friends at the start of the intervention also made greater gains in mathematical problem-solving as a result of the CL intervention. This finding is in line with Deacon and Edwards’ study of the importance of friendships for students’ motivation to learn mathematics in small groups ( Deacon and Edwards, 2012 ). However, the effect of the CL intervention was not associated with change in students’ social acceptance and friendship scores. These results indicate that students whose nominations and friendships increased over the course of the intervention did not necessarily benefit more from the CL intervention. With regard to previously reported inequalities in cooperation in heterogeneous groups ( Mulryan, 1992 ; Cohen, 1994 ; Langer-Osuna, 2016 ) and the importance of peer behaviours for problem-solving ( Hwang and Hu, 2013 ), teachers should consider creating inclusive norms and supportive peer relationships when using the CL approach. The demands of solving complex problems may create negative emotions and uncertainty ( Jordan and McDaniel, 2014 ; Hannula, 2015 ), and peer support may be essential in such situations.

Limitations

The conclusions from the study must be interpreted with caution, due to a number of limitations. First, due to the regulations on the protection of information about individuals ( SFS 2009 ), the researchers could not obtain information on the type of SEN for individual students, which limited the study’s possibilities for investigating the effects of the CL approach for these students. Second, not all teachers in the intervention group implemented the CL approach embedded in problem-solving activities, and not all teachers in the control group reported using the educational materials on problem-solving. The insufficient levels of implementation pose a significant challenge to the internal validity of the study. Third, the additional investigation to explore the equivalence in difficulty between pre- and post-test, including 169 students, revealed weak to moderate correlations in students’ performance scores, which may indicate challenges to the internal validity of the study.

Implications

The results of the study have some implications for practice. Based on the significant effect of the CL intervention on students’ problem-solving, the CL approach appears to be a promising instructional approach for promoting students’ problem-solving. However, as the results of the CL approach were not significant for all subtests of problem-solving, and due to insufficient levels of implementation, it is not possible to draw firm conclusions about the importance of the CL intervention for students’ problem-solving. Furthermore, it appears to be important to create opportunities for peer contacts and friendships when the CL approach is used in mathematical problem-solving activities.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by the Uppsala Ethical Regional Committee, Dnr. 2017/372. Written informed consent to participate in this study was provided by the participants’ legal guardian/next of kin.

Author Contributions

NiK was responsible for the project and participated in data collection and data analyses. NaK and WK were responsible for the intervention, with special focus on the educational materials and tests in mathematical problem-solving. PE participated in the planning of the study and in the data analyses, including coordinating the analyses of students’ tests. MK participated in designing and planning the study as well as in data collection and data analyses.

Funding

The project was funded by the Swedish Research Council under Grant 2016-04679.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

We would like to express our gratitude to teachers who participated in the project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.710296/full#supplementary-material

Barmby, P., Harries, T., Higgins, S., and Suggate, J. (2009). The array representation and primary children's understanding and reasoning in multiplication. Educ. Stud. Math. 70 (3), 217–241. doi:10.1007/s10649-008-9145-1

Bates, D., Mächler, M., Bolker, B., and Walker, S. (2015). Fitting Linear Mixed-Effects Models Using lme4. J. Stat. Soft. 67 (1), 1–48. doi:10.18637/jss.v067.i01

Capar, G., and Tarim, K. (2015). Efficacy of the cooperative learning method on mathematics achievement and attitude: A meta-analysis research. Educ. Sci-theor Pract. 15 (2), 553–559. doi:10.12738/estp.2015.2.2098

Child, S., and Nind, M. (2013). Sociometric methods and difference: A force for good - or yet more harm? Disabil. Soc. 28 (7), 1012–1023. doi:10.1080/09687599.2012.741517

Cillessen, A. H. N., and Marks, P. E. L. (2017). Methodological choices in peer nomination research. New Dir. Child Adolesc. Dev. 2017, 21–44. doi:10.1002/cad.20206

Clarke, B., Cheeseman, J., and Clarke, D. (2006). The mathematical knowledge and understanding young children bring to school. Math. Ed. Res. J. 18 (1), 78–102. doi:10.1007/bf03217430

Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Rev. Educ. Res. 64 (1), 1–35. doi:10.3102/00346543064001001

Davidson, N., and Major, C. H. (2014). Boundary crossings: Cooperative learning, collaborative learning, and problem-based learning. J. Excell. Coll. Teach. 25 (3-4), 7.

Davydov, V. V. (2008). Problems of Developmental Instruction: A Theoretical and Experimental Psychological Study. New York: Nova Science Publishers, Inc.

Deacon, D., and Edwards, J. (2012). Influences of friendship groupings on motivation for mathematics learning in secondary classrooms. Proc. Br. Soc. Res. into Learn. Math. 32 (2), 22–27.

Degrande, T., Verschaffel, L., and van Dooren, W. (2016). “Proportional word problem solving through a modeling lens: a half-empty or half-full glass?,” in Posing and Solving Mathematical Problems, Research in Mathematics Education . Editor P. Felmer.

Doerr, H. M., and Tripp, J. S. (1999). Understanding how students develop mathematical models. Math. Thinking Learn. 1 (3), 231–254. doi:10.1207/s15327833mtl0103_3

Fujita, T., Doney, J., and Wegerif, R. (2019). Students' collaborative decision-making processes in defining and classifying quadrilaterals: a semiotic/dialogic approach. Educ. Stud. Math. 101 (3), 341–356. doi:10.1007/s10649-019-09892-9

Gillies, R. (2016). Cooperative learning: Review of research and practice. Ajte 41 (3), 39–54. doi:10.14221/ajte.2016v41n3.3

Gravemeijer, K. (1999). How Emergent Models May Foster the Constitution of Formal Mathematics. Math. Thinking Learn. 1 (2), 155–177. doi:10.1207/s15327833mtl0102_4

Gravemeijer, K., Stephan, M., Julie, C., Lin, F.-L., and Ohtani, M. (2017). What mathematics education may prepare students for the society of the future? Int. J. Sci. Math. Educ. 15 (S1), 105–123. doi:10.1007/s10763-017-9814-6

Hamilton, E. (2007). “What changes are needed in the kind of problem-solving situations where mathematical thinking is needed beyond school?,” in Foundations for the Future in Mathematics Education. Editors R. Lesh, E. Hamilton, and J. Kaput (Mahwah, NJ: Lawrence Erlbaum), 1–6.

Hannula, M. S. (2015). “Emotions in problem solving,” in Selected Regular Lectures from the 12th International Congress on Mathematical Education. Editor S. J. Cho. doi:10.1007/978-3-319-17187-6_16

Hwang, W.-Y., and Hu, S.-S. (2013). Analysis of peer learning behaviors using multiple representations in virtual reality and their impacts on geometry problem solving. Comput. Edu. 62, 308–319. doi:10.1016/j.compedu.2012.10.005

Johnson, D. W., Johnson, R. T., and Johnson Holubec, E. (2009). Circle of Learning: Cooperation in the Classroom . Gurgaon: Interaction Book Company .

Johnson, D. W., Johnson, R. T., and Johnson Holubec, E. (1993). Cooperation in the Classroom . Gurgaon: Interaction Book Company .

Jordan, M. E., and McDaniel, R. R. (2014). Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influence in robotics engineering activity. J. Learn. Sci. 23 (4), 490–536. doi:10.1080/10508406.2014.896254

Karlsson, N., and Kilborn, W. (2018a). Inclusion through learning in group: tasks for problem-solving. [Inkludering genom lärande i grupp: uppgifter för problemlösning] . Uppsala: Uppsala University .

Karlsson, N., and Kilborn, W. (2018c). It's enough if they understand it. A study of teachers' and students' perceptions of multiplication and the multiplication table [Det räcker om de förstår den. En studie av lärares och elevers uppfattningar om multiplikation och multiplikationstabellen]. Södertörn Stud. Higher Educ., 175.

Karlsson, N., and Kilborn, W. (2018b). Tasks for problem-solving in mathematics. [Uppgifter för problemlösning i matematik] . Uppsala: Uppsala University .

Karlsson, N., and Kilborn, W. (2020). “Teacher’s and student’s perception of rational numbers,” in Interim Proceedings of the 44th Conference of the International Group for the Psychology of Mathematics Education, Interim Vol., Research Reports. Editors M. Inprasitha, N. Changsri, and N. Boonsena (Khon Kaen, Thailand: PME), 291–297.

Kazak, S., Wegerif, R., and Fujita, T. (2015). Combining scaffolding for content and scaffolding for dialogue to support conceptual breakthroughs in understanding probability. ZDM Math. Edu. 47 (7), 1269–1283. doi:10.1007/s11858-015-0720-5

Klang, N., Olsson, I., Wilder, J., Lindqvist, G., Fohlin, N., and Nilholm, C. (2020). A cooperative learning intervention to promote social inclusion in heterogeneous classrooms. Front. Psychol. 11, 586489. doi:10.3389/fpsyg.2020.586489

Klang, N., Fohlin, N., and Stoddard, M. (2018). Inclusion through learning in group: cooperative learning [Inkludering genom lärande i grupp: kooperativt lärande] . Uppsala: Uppsala University .

Kunsch, C. A., Jitendra, A. K., and Sood, S. (2007). The effects of peer-mediated instruction in mathematics for students with learning problems: A research synthesis. Learn. Disabil Res Pract 22 (1), 1–12. doi:10.1111/j.1540-5826.2007.00226.x

Langer-Osuna, J. M. (2016). The social construction of authority among peers and its implications for collaborative mathematics problem solving. Math. Thinking Learn. 18 (2), 107–124. doi:10.1080/10986065.2016.1148529

Lein, A. E., Jitendra, A. K., and Harwell, M. R. (2020). Effectiveness of mathematical word problem solving interventions for students with learning disabilities and/or mathematics difficulties: A meta-analysis. J. Educ. Psychol. 112 (7), 1388–1408. doi:10.1037/edu0000453

Lesh, R., and Doerr, H. (2003). Beyond Constructivism: Models and Modeling Perspectives on Mathematics Problem Solving, Learning and Teaching . Mahwah, NJ: Erlbaum .

Lesh, R., Post, T., and Behr, M. (1988). “Proportional reasoning,” in Number Concepts and Operations in the Middle Grades . Editors J. Hiebert, and M. Behr (Hillsdale, N.J.: Lawrence Erlbaum Associates ), 93–118.

Lesh, R., and Zawojewski, J. S. (2007). “Problem solving and modeling,” in Second Handbook of Research on Mathematics Teaching and Learning: A Project of the National Council of Teachers of Mathematics, Vol. 2. Editor F. K. Lester (Charlotte, NC: Information Age Pub).

Lester, F. K., and Cai, J. (2016). “Can mathematical problem solving be taught? Preliminary answers from 30 years of research,” in Posing and Solving Mathematical Problems. Research in Mathematics Education .

Lybeck, L. (1981). “Archimedes in the classroom. [Arkimedes i klassen],” in Göteborg Studies in Educational Sciences (Göteborg: Acta Universitatis Gotoburgensis ), 37.

McMaster, K. N., and Fuchs, D. (2002). Effects of Cooperative Learning on the Academic Achievement of Students with Learning Disabilities: An Update of Tateyama-Sniezek's Review. Learn. Disabil Res Pract 17 (2), 107–117. doi:10.1111/1540-5826.00037

Mercer, N., and Sams, C. (2006). Teaching children how to use language to solve maths problems. Lang. Edu. 20 (6), 507–528. doi:10.2167/le678.0

Montague, M., Krawec, J., Enders, C., and Dietz, S. (2014). The effects of cognitive strategy instruction on math problem solving of middle-school students of varying ability. J. Educ. Psychol. 106 (2), 469–481. doi:10.1037/a0035176

Mousoulides, N., Pittalis, M., Christou, C., and Sriraman, B. (2010). “Tracing students’ modeling processes in school,” in Modeling Students’ Mathematical Modeling Competencies. Editor R. Lesh (Berlin, Germany: Springer Science+Business Media). doi:10.1007/978-1-4419-0561-1_10

Mulryan, C. M. (1992). Student passivity during cooperative small groups in mathematics. J. Educ. Res. 85 (5), 261–273. doi:10.1080/00220671.1992.9941126

OECD (2019). PISA 2018 Results (Volume I): What Students Know and Can Do . Paris: OECD Publishing . doi:10.1787/5f07c754-en

Pólya, G. (1948). How to Solve it: A New Aspect of Mathematical Method . Princeton, N.J.: Princeton University Press .

Russell, S. J. (1991). “Counting noses and scary things: Children construct their ideas about data,” in Proceedings of the Third International Conference on the Teaching of Statistics. Editor D. Vere-Jones (Dunedin, NZ: University of Otago), 141–164.

Rzoska, K. M., and Ward, C. (1991). The effects of cooperative and competitive learning methods on the mathematics achievement, attitudes toward school, self-concepts and friendship choices of Maori, Pakeha and Samoan Children. New Zealand J. Psychol. 20 (1), 17–24.

Schoenfeld, A. H. (2016). Learning to think mathematically: Problem solving, metacognition, and sense making in mathematics (reprint). J. Edu. 196 (2), 1–38. doi:10.1177/002205741619600202

SFS 2009:400. Offentlighets- och sekretesslag [Law on publicity and confidentiality]. Retrieved from https://www.riksdagen.se/sv/dokument-lagar/dokument/svensk-forfattningssamling/offentlighets--och-sekretesslag-2009400_sfs-2009-400 on the 14th of October.

Snijders, T. A. B., and Bosker, R. J. (2012). Multilevel Analysis. An Introduction to Basic and Advanced Multilevel Modeling . 2nd Ed. London: SAGE .

Stillman, G., Brown, J., and Galbraith, P. (2008). “Research into the teaching and learning of applications and modelling in Australasia,” in Research in Mathematics Education in Australasia 2004–2007. Editors H. Forgasz, A. Barkatas, A. Bishop, B. Clarke, S. Keast, W. Seah, and P. Sullivan (Rotterdam: Sense Publishers), 141–164. doi:10.1163/9789087905019_009

Stohlmann, M. S., and Albarracín, L. (2016). What is known about elementary grades mathematical modelling. Edu. Res. Int. 2016, 1–9. doi:10.1155/2016/5240683

Swedish National Educational Agency (2014). Support measures in education – on leadership and incentives, extra adaptations and special support [Stödinsatser I utbildningen – om ledning och stimulans, extra anpassningar och särskilt stöd] . Stockholm: Swedish National Agency of Education .

Swedish National Educational Agency (2018). Syllabus for the subject of mathematics in compulsory school. Retrieved in July 2021 from https://www.skolverket.se/undervisning/grundskolan/laroplan-och-kursplaner-for-grundskolan/laroplan-lgr11-for-grundskolan-samt-for-forskoleklassen-och-fritidshemmet?url=-996270488%2Fcompulsorycw%2Fjsp%2Fsubject.htm%3FsubjectCode%3DGRGRMAT01%26tos%3Dgr&sv.url=12.5dfee44715d35a5cdfa219f

van Hiele, P. (1986). Structure and Insight. A Theory of Mathematics Education . London: Academic Press .

Velásquez, A. M., Bukowski, W. M., and Saldarriaga, L. M. (2013). Adjusting for Group Size Effects in Peer Nomination Data. Soc. Dev. 22 (4), a–n. doi:10.1111/sode.12029

Verschaffel, L., Greer, B., and De Corte, E. (2007). “Whole number concepts and operations,” in Second Handbook of Research on Mathematics Teaching and Learning: A Project of the National Council of Teachers of Mathematics . Editor F. K. Lester (Charlotte, NC: Information Age Pub ), 557–628.

Webb, N. M., and Mastergeorge, A. (2003). Promoting effective helping behavior in peer-directed groups. Int. J. Educ. Res. 39 (1), 73–97. doi:10.1016/S0883-0355(03)00074-0

Wegerif, R. (2011). “Theories of Learning and Studies of Instructional Practice,” in Theories of learning and studies of instructional Practice. Explorations in the learning sciences, instructional systems and Performance technologies . Editor T. Koschmann (Berlin, Germany: Springer ). doi:10.1007/978-1-4419-7582-9

Yackel, E., Cobb, P., and Wood, T. (1991). Small-group interactions as a source of learning opportunities in second-grade mathematics. J. Res. Math. Edu. 22 (5), 390–408. doi:10.2307/749187

Zawojewski, J. (2010). “Problem solving versus modeling,” in Modeling Students’ Mathematical Modeling Competencies: ICTMA. Editors R. Lesh, P. Galbraith, C. R. Haines, and A. Hurford (New York, NY: Springer), 237–243. doi:10.1007/978-1-4419-0561-1_20

Keywords: cooperative learning, mathematical problem-solving, intervention, heterogeneous classrooms, hierarchical linear regression analysis

Citation: Klang N, Karlsson N, Kilborn W, Eriksson P and Karlberg M (2021) Mathematical Problem-Solving Through Cooperative Learning—The Importance of Peer Acceptance and Friendships. Front. Educ. 6:710296. doi: 10.3389/feduc.2021.710296

Received: 15 May 2021; Accepted: 09 August 2021; Published: 24 August 2021.

Copyright © 2021 Klang, Karlsson, Kilborn, Eriksson and Karlberg. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Nina Klang, [email protected]
