Thursday, 24 November 2016

Use it or Lose it!

This is not a column about sexual health.

I’m talking about the basic math facts — times tables and all that stuff that should be stored in our long-term memory:

Will we really forget those facts if we don’t use them? 

Our brains are flexible. Neuroscientists use the word "plastic". They tell us that the brain's circuits can be repurposed, new ones can grow, and that frequent use strengthens neural connections. My guess is that YES, our memory of these basic facts will fade if we don't use them.

But I have a follow-up question.

Do you know that 1 / 1.04 = .96154?

I do know that! And it's a fact that I don't use.

Here's the story.

Fifty years ago I worked for a large Canadian insurance company. I was a clerk in the mathematics section of the Policy Change department. My job was to compute how much to charge or refund when clients changed their insurance policies.

Among the many options available was the possibility of paying premiums in advance. The company liked it when clients did this, and to encourage them the company offered a discount.

We used an interest rate of 4%.  So, if the premium was, say, $150.00, and the payment was being made a year in advance, the client would pay $144.23, which is what you get from the following computation:

$150.00 / 1.04.

Rounded to five decimal places, 1/1.04 = 0.96154, so we could also calculate the discounted premium this way:

$150.00  ×  .96154.

Actually, 1/1.04 is slightly smaller than 0.96154, but for all practical purposes the two methods are identical. Yet, despite their equivalence, our boss made us use the second one. Instead of dividing by 1.04 we had to multiply by 0.96154. It was not because our boss had some weird mathematical fetish: an official company policy memo explicitly stated that when computing prepayments of premiums, we had to use the factors from the supplied interest tables. Policy trumps mathematics.
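The equivalence is easy to check. Here is a quick sketch (in Python, using the $150.00 premium from the example above) showing that dividing by 1.04 and multiplying by the table factor 0.96154 agree to the cent:

```python
# Compare the two methods: dividing by 1.04 versus multiplying
# by the table factor 0.96154 (1/1.04 rounded to five decimals).
premium = 150.00

by_division = round(premium / 1.04, 2)     # the mathematical way
by_factor = round(premium * 0.96154, 2)    # the company-policy way

print(by_division)  # 144.23
print(by_factor)    # 144.23
```

For premiums of the size we handled, the tiny gap between 1/1.04 and 0.96154 vanishes when the result is rounded to the nearest cent.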

After a short time on the job, the "fact" that 1/1.04 equalled .96154 had established permanent residence in my mind and it still remains there even though I haven't used it for over half a century.

* * * * *

This sort of thing is not unusual.

  • My parents-in-law, veterans of WWII, could recall their military ID numbers for their entire lives, even though they seldom used them.
  • When my wife was a child, she and her mother used to play a game: they recited the alphabet backwards as rapidly as they could. It's a skill that my wife has not used for over 60 years. Yet yesterday she showed me that she can still reel off the alphabet backwards without hesitation.

So my intuition that facts fade from memory when they are not used may be wrong. Some of us retain basic facts and practiced skills even if we don't use them for an extremely long time.

* * * * *

On the other hand - 

Sometimes when people say "Use it or lose it" they really mean

Use it and you won't lose it!

Well, we all know that statement is false.

How else to account for my spotty recall of simple multiplication facts? I've used those basic multiplication facts quite a lot since leaving that insurance company, and I'm still not "fluent" with them. I can fire off the squares of the natural numbers up to 12 × 12, but ask me what 12 × 7  or 12 × 11 is and I have to perform a mental computation. And 13 × 13 is 169, but what is 13 × 9?

Some people can recall basic math facts quickly and without conscious effort — they automatically know them without thinking. Like the way I know that 1/1.04 is .96154.

Others will never completely achieve this type of fluency. When someone says "Twelve times seven," it may not provoke the involuntary "Eighty-four." Instead, the reaction may be "seventy plus fourteen," after breaking 12 × 7 into 10 × 7 plus 2 × 7.

To be truthful, I am fluent with most of the basic multiplication facts. But I really do still have trouble with a few parts of the 6, 7, 8 and 12 times tables. And, as I prepared this post, I actually did have to retrieve 12 × 7 by that quick mental computation. However, once done, I could instantaneously recall the answer for the rest of the post, and I know that the automaticity will remain for several days, maybe even a couple of weeks. But it’s as impermanent as those facts acquired by cramming for an exam: very soon it will fade from my memory.

My advice 

If my children were young, this is what I would tell them today:

Practice your times tables. Memorize as much as you can. It is worth the effort. Maybe you will never be able to keep all of them in your head. But if you can keep a few there, and if you learn how to be flexible with numbers, you will find ways to derive the more difficult ones very quickly — almost as fast as if you had actually memorized them — and it won’t stop you from learning more mathematics.

As for getting every single one of those basic facts into genuine long term memory in a way that they can be instantly and automatically recalled: forget it — for me, and maybe for quite a few others:

It's • Not • Gonna • Happen


Just as I was finishing this post, I came across an article (linked below) by Maria Konnikova in The New Yorker magazine. It's about the tenuous relationship between practice and achievement across a variety of fields (including mathematics). It's really worthwhile reading. 

Friday, 11 November 2016

Understanding the PISA Tables

The PISA math results show Canada trending downwards internationally. Here are the rankings of the top 20 regions (of approximately 65) that participated in the 2003, 2006, 2009, and 2012 rounds:

Note: Shanghai and Singapore have been greyed out for 2003 and 2006 because neither took part in those two rounds. Since it is hard to assess a trend when the population base changes, I think that either Shanghai and Singapore should be included for all rounds, or both should be excluded for all rounds. I have chosen to include them, and, based on their results in 2009 and 2012, I have assumed that they would have finished atop the PISA table had they competed in the earlier rounds.

With Shanghai and Singapore included, the perspective on the Canadian trend changes somewhat. However, some slippage is definitely happening, and for many people the trend is alarming.

But I wonder if we are concentrating too closely on the rankings. Here is the same table when grades are included:

Note: OK. This doesn't look like the PISA scores that you have seen. The results shown above are percentage grades, and the PISA consortium does not report the scores this way. Instead, they use a numerical scale that ranges from about 300 to over 600.

I have a fondness for percentage grades. They provide me with a more serviceable scale than grades out of 600. For example, to me, a class average of 67% indicates an ordinary but acceptable performance, while I am not sure what an average of 527 on the PISA scale conveys.

The conversion from the PISA scale to percentage grades in the table above was done linearly:

Percent_grade = 60 + (Pisa_Score − 482) × (10 / 62.3).

As is evident, Canada has remained remarkably stable: 68%, 67%, 67%, 66%. 
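For anyone who wants to reproduce the table, the conversion is a one-liner. A minimal sketch (the function name is my own invention):

```python
def pisa_to_percent(pisa_score):
    # Linear map described above: 482 on the PISA scale corresponds
    # to 60%, and each 62.3 PISA points span 10 percentage points.
    return 60 + (pisa_score - 482) * (10 / 62.3)

print(round(pisa_to_percent(527)))  # 67 -- the example score of 527 is an ordinary 67%
print(round(pisa_to_percent(500)))  # 63 -- the OECD mean of 500 lands at 63%
```
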

Proficiency Levels

The PISA results are intended to be a measure of mathematical proficiency, and PISA describes six different levels. The grade ranges for each level are as follows:

The descriptions above are extremely abbreviated. Detailed descriptions can be found in the 2012 PISA technical report (link [5] below). Also, PISA does not use verbal descriptions like Highly proficient or Basic proficiency.

Some features of the PISA proficiency levels

  • For each round, the PISA consortium adjusts the grades so that the proficiency levels retain the same meaning over the years. A percentage score of 65% in 2003 has exactly the same meaning as a percentage score of 65% in 2012. In other words, PISA takes care to avoid grade inflation.
  • On the percentage scale, each Level band is 10 points wide. On the PISA scale, the bands are 62.3 points wide.
  • The PISA scales are designed so that the OECD mean is 500 (or very close to 500). On the percentage scale, the OECD mean is 63%.
  • Over the four rounds from 2003 to 2012, Level 6 has never been achieved by any of the participating regions. Level 5 has been achieved by one participant. Level 4 has been achieved by five participants.  In the four rounds from 2003 to 2012, Canada has resided in the Level 3 band.

Is the slippage significant?

The difference between Canada’s score in 2003 and 2012 is statistically significant. Those words mean that, with high probability, the difference in the scores is real and is not a result of the chance that arises from testing only a sample of the population. Statistical significance tells us that the statistics are reliable, but it doesn't tell us anything about whether or not the statistics are important.

(The word "significant" has considerable emotional impact. I remember a Statistics professor telling us that the word should be outlawed when we are discussing statistical results.)

Do the differences in the PISA tables show that Canada must do something to improve how our students learn mathematics? Some people believe that the statistically significant difference between 2003 and 2012 is also of great practical significance, and they have strong opinions about what should be done.

As well, the PISA group itself has investigated what will and won't work to improve students' mathematical proficiency, and has published its findings in a 96-page document [6]. Some of the conclusions appear to run counter to what the back-to-basics people advocate — one conclusion in the document is that dependence on intensive memorization and rote practice of the basic algorithms is a poor strategy for learning anything but the most trivial mathematics.

That particular PISA document may leave some of my friends in a slightly awkward place: Those who have used the PISA tables as a rallying cry for a return to traditional methods now find themselves having to rebut some of PISA's own conclusions about what constitutes good teaching and good curriculum.

In any event, when I look at the percentages for Canada from 2003 to 2012, I see changes, but I don't think that they are earth-shattering. Would you be concerned if one section of your Calculus course averaged 68% and another section averaged 66%? I’ve run into that situation teaching back-to-back sections of the same course, and I was not convinced that the section with the higher score was the superior one.

* * *

In a few weeks, the OECD will release the results of the 2015 PISA round. I am very interested in seeing if Canada's results are significantly different in a practical sense. My expectation (based on no evidence whatsoever) is that the results will be much the same as before, and will fall within the 65% to 67% range. (Hey! A Halifax weatherman had a very successful forecasting run by predicting tomorrow's weather to be the same as today's.) I'll do a Trumpian gloat if my prediction is correct, and a Clintonian sob if it's wrong.


[1] 2003 PISA report (Math scores are on page 92)

[2] 2006 PISA report (Math scores are on page 53)

[3] 2009 PISA report (Math scores are on page 134)

[4] 2012 PISA report (Math scores are on page 5)

[5] PISA 2012 Technical report (Proficiency levels are described on pages 297 - 301)

[6] Ten Questions for Mathematics Teachers… and How PISA Can Help Answer Them, OECD Publishing, Paris (2016).