Streaming, or tracking, students is common around the world: at some point in their K-12 education, students are grouped by ability or perceived intelligence. Many educators and parents support this idea. The philosophy behind the practice is that it allows teachers to teach groups of similar ability, and many believe that the high-end students, through streaming, can be enriched with topics beyond the course.
Streaming occurs at different ages, or grades, in different countries. In Alberta, it usually begins in grade 10, while in the USA the practice starts in middle school. Finland, an international leader in education, has outlawed streaming entirely. This raises the question: what is best practice for streaming students?
Some educators who support streaming say that it allows them to teach "like-minded" individuals, and that the need for scaffolding diminishes because the group in front of them is homogeneous. The material, consequently, is presented in a way the "average" of the group can understand, on the logic that all students in the group are more or less the same.
This, unfortunately, contradicts almost all research on how individual students grow. No matter how well you group students by ability, some will find a certain topic easy and other topics more difficult. Teaching to the "middle" causes some students to struggle while preventing others from being enriched. I also fear that, in an education system where streaming is prevalent, the attitude becomes, "If you are not understanding, you must simply be in the wrong stream."
In a mixed-ability class, the teacher must create material that all levels can benefit from, where the top students are challenged while the weaker students are supported. This results in all students learning at the highest of their levels and a teaching philosophy that "all can succeed."
Also, when students are streamed, certain stereotypes form. Early in my career, this was a regular occurrence for me. I remember thinking, while teaching a dash-2 course (a second stream in Alberta), "Well, this is the lower stream, so they won't be able to handle this..." or "We won't have time to do that project, as my students will take longer to learn...". This mindset is harmful to students.
In the 1960s, Rosenthal and Jacobson conducted an experiment on the impact of teacher expectations. Students were randomly placed in two groups, regardless of ability or talent, and the groups were labelled "smart" and "weak" for two teachers. After a certain period, the "smart" class had scored higher on IQ tests, while the "weak" class struggled with many of the concepts taught. Teacher expectations were the difference.
This stereotype, once placed, becomes almost unbreakable. In England, where students are streamed by age 4, it has been shown that 88% of children remain in the same groupings until they leave school. This should be alarming! A label given to a child who should be playing with blocks and dolls and laughing will shape his or her success for life. This, again, contradicts almost all research on child development and learning.
This label limits not only the weak students but the "smart" ones as well. Carol Dweck found that the moment students are streamed by ability or intelligence, those most negatively affected are the ones placed in the top rank. Their growth-mindset thinking diminishes almost instantly; they become fearful of making mistakes and consequently avoid more challenging work. This was especially prevalent among high-achieving girls.
Students and parents usually support streaming because it supposedly allows schools to prepare students more appropriately for their futures. This argument is flawed. Jo Boaler followed students from two different school experiences: the first school organized students heavily by ability, while the second mixed all abilities together. She found that the students who experienced mixed-ability grouping, despite growing up in one of the poorest areas in the country, went on to more professional jobs than those who had been streamed.
Even more interesting was the attitude of the students who learned in mixed-ability settings. At first, some of the brightest students were apprehensive about constantly having to explain their ideas to the rest of the class. After a year, however, this changed, as they quickly realized that the practice actually helped them understand the concepts being taught.
What is the solution?
Simply don't stream students based on prior knowledge or ability.
A great practice I have seen comes from a colleague of mine, Jonathan Mauro. He is the department head of a high school physical education department and has started a creative alternative to streaming: he lets the students decide. Instead of saying, "this group has high physical literacy, and this group has low physical literacy," he gives students options for engaging in physical literacy. Three full classes are scheduled in the same time slot, and the three teachers each teach the unit through a different sport or activity that addresses the PE curriculum. One student can learn through volleyball, while another learns through badminton. Regardless of previous ability or experience, each student is free to choose the activity that is most engaging to them.
More to the PISA scores
Many people try to use PISA test data to infer that there is a problem with math education in their area. I have seen this argument in many blogs, papers, and social media outlets.
Typically, they show fancy graphs and try to convince others that a drop in PISA results was caused by a change in curriculum, then conclude that unless we dig up some old curriculum, mathematics scores will continue to drop.
First, these arguments show, at best, only a correlation between the type of math curriculum an area has and a drop in PISA scores. Every argument I have seen fails to show causation. What is the difference?
Correlation means that two or more things or events tend to occur around the same time. They may be associated with each other, but they are not necessarily connected by a cause-and-effect relationship.
An easy example: when people catch a cold during the winter months, they usually end up with a runny nose and a sore throat. The two symptoms are correlated, but we cannot conclude that a runny nose causes a sore throat; both are caused by the cold.
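To make the distinction concrete, here is a minimal sketch in Python, using made-up probabilities rather than real medical data, in which a common cause produces two correlated symptoms even though neither symptom causes the other:

```python
import random

# Minimal sketch of correlation without causation, using made-up probabilities.
# A common cause ("has a cold") drives both symptoms; neither causes the other.
random.seed(0)

def simulate_person():
    has_cold = random.random() < 0.3                        # hypothetical base rate
    runny_nose = random.random() < (0.8 if has_cold else 0.1)
    sore_throat = random.random() < (0.7 if has_cold else 0.1)
    return runny_nose, sore_throat

people = [simulate_person() for _ in range(100_000)]

p_sore = sum(s for _, s in people) / len(people)
with_runny = [s for r, s in people if r]
p_sore_given_runny = sum(with_runny) / len(with_runny)

print(f"P(sore throat)              = {p_sore:.2f}")
print(f"P(sore throat | runny nose) = {p_sore_given_runny:.2f}")
# The conditional probability comes out much higher, so the symptoms are
# correlated, even though the simulation never lets one symptom cause the other.
```

Knowing a student's area changed its curriculum and that its PISA scores dropped puts us in exactly this position: an association, with the actual cause left unidentified.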
Most arguments around PISA tend to show some sort of data analysis and link it to a change in curriculum; again, this is correlation at best. However, we can take a closer look at the data ourselves...
If we look at the Canadian PISA results by province, every province dropped from 2009 to 2012 other than Quebec and Saskatchewan. Internationally, the Netherlands, Belgium, Australia, Denmark, New Zealand, France, and many other countries fell at rates similar to Canada's.
If a drop in scores means there is a Math Crisis in your region, then there must also be a Math Crisis across the entire planet. Does that sound likely? I would hope not!
Some "back to the basics" folk will ignore the fact that they are implying here is an International Math Crisis and simply tell us that:
Shanghai had the highest score on the 2012 PISA test with an average of 600.24, while Australia had a much lower average of 515.01.
What can you conclude? That Australia's math curriculum is much weaker than Shanghai's?
If we look at students who were born in China and moved to Australia before ever entering school, their average on the 2012 PISA test was 614.77... 14 POINTS HIGHER THAN SHANGHAI!!!
This must mean that Australia's math curriculum is superior to Shanghai's? See the problem?
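To see why country averages say little about curriculum on their own, here is a minimal sketch using the figures quoted above; the 5% subgroup share is an assumption for illustration only, not a figure from PISA:

```python
# Averages quoted above; the subgroup share is a *hypothetical* illustration value.
australia_overall = 515.01
china_born_in_australia = 614.77
china_born_share = 0.05  # assumption, not PISA data

# The overall mean is the weighted mean of the two subgroups, so the remaining
# students' average can be recovered from the other three numbers.
other_students = (australia_overall
                  - china_born_share * china_born_in_australia) / (1 - china_born_share)
print(f"Implied average of the remaining students: {other_students:.1f}")
# One curriculum, two very different subgroup averages: the country-level score
# says as much about who sits the test as about what curriculum they were taught.
```

The same Australian curriculum sits behind both the 614.77 and the 515.01, so the gap between Shanghai and Australia cannot be pinned on curriculum alone.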
Using PISA scores alone to judge the quality of education in a province, state, region, or country is like judging someone's ability to drive only by watching them parallel park. While this test can offer useful observations about education on a global scale, we need to remember that it is exactly that: a single test.
There are many more variables at play in an education system than the results of a test that some countries value more than others. If you attend school in Scotland and are called to write PISA, you will be shown champions winning gold medals for their country and told that you have the ability to "bring home the Gold for Scotland".
If you attend school in China and are called to write PISA, your name will be broadcast and people will cheer you on as you walk into the testing room, and the test itself will resemble the preparation you have received over the previous months.
If you attend school in Canada and are called to write PISA, you will be quietly pulled from class and brought to a room to write a test you know nothing about and have had no formal preparation for.
So I ask again, does a drop in PISA scores really mean a math crisis? I think not!
Lastly, we need to understand that many arguments against current math practices are really attacks on teachers, not on curriculum.
Further Reading:
Sam Sellar, Globalizing Educational Accountabilities
PISA Key findings