Linear Versus Logarithmic Thinking about Numbers

Some folks argue that humans intuitively think about numbers logarithmically rather than linearly. My experience strongly suggests that this is not the case. If you’ve ever tried to explain logarithms or logarithmic scales to people, or asked them to interpret graphs with logarithmic scales, as I have often done, you probably share my belief that logarithms are not cognitively intuitive. The behaviors that are sometimes described as intuitive logarithmic thinking about numbers can be reasonably explained as something else entirely.

According to some sources, a research study found that six-year-old children, when asked to identify the number that falls halfway between one and nine, often selected three. Unfortunately, after extensive searching I cannot find a study that actually performed this particular experiment. One article that makes this claim cites a study titled “A Framework for Bayesian Optimality of Psychophysical Laws” as the source, but that study does not mention this particular experiment or finding. Instead, it addresses the logarithmic nature of perception, especially auditory perception. Keep in mind that perception and cognition are related but different. Many aspects of perception do indeed appear to be logarithmic. As the authors of the study mentioned above observed about auditory perception, “…under the Weber–Fechner law, a multiplicative increase in stimulus intensity leads to an additive increase in perceived intensity,” but that’s a different matter. I’m talking about cognition.

Even if many kids actually did select three as halfway between one and nine in an experiment, I doubt that they were thinking logarithmically. At age six, children have not yet learned to think quantitatively beyond a rudimentary understanding of numbers. Until they begin to learn mathematics, children tend to think with a limited set of numbers consisting of one, two, three, and more, which corresponds to the preattentive perception of numerosity that is built into our brains. With this limited understanding, three is the largest number that they identify individually, so it might be natural for them to select three as the value that falls halfway between one and any number larger than three. If the numbers were displayed linearly and in sequence for the children to see when asked to select the number in the middle (illustrated below), however, I suspect that they would correctly select five.

1   2   3   4   5   6   7   8   9

You might argue that this works simply because it allows children to rely on spatial reasoning to identify the middle number. That is absolutely true. We intentionally take advantage of spatial reasoning when introducing several basic concepts of mathematics to children. This works as a handy conceptual device to kickstart quantitative reasoning. Believing that children naturally think logarithmically would lead us to predict that, if asked to identify the number halfway between 1 and 100, they would be inclined to choose 10. Somehow, I doubt that they would.
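
Incidentally, the “logarithmic” answers in both examples can be computed directly: the midpoint of a logarithmic scale is the geometric mean of its endpoints, which is where three and ten come from. A minimal check in R (my own illustration, not from any cited study):

# The midpoint of a log scale is the geometric mean of the endpoints:
sqrt(1 * 9)    # 3  -- the logarithmic midpoint of 1 and 9
sqrt(1 * 100)  # 10 -- the logarithmic midpoint of 1 and 100
10 ^ mean(log10(c(1, 9)))  # 3 again, computed explicitly via logarithms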

Another research-based example that has been used to affirm the intuitive nature of logarithmic thinking about numbers is the fact that people tend to think of the difference between the numbers one and two as greater than the difference between the numbers eight and nine. I suspect that they do this, however, not because they’re thinking logarithmically, but more simply because they’re thinking in terms of relative magnitude (i.e., proportions). Even though the incremental difference between both pairs of numbers is a value of one (i.e., 2 – 1 = 1 and 9 – 8 = 1), the number two represents twice the magnitude of one while the number nine is only 12.5% greater than eight, a significantly smaller proportion. I anticipate that some of you who are mathematically inclined might object: “But logarithmic thinking and proportional thinking are one and the same.” Actually, this is not the case. While logarithms always involve proportions, not all proportions involve logarithms. A logarithmic scale involves a consistent proportional sequence. For example, with a log base 10 scale (i.e., log10), each number along the scale is ten times the previous number. Only when we think of a sequence of numbers in which each number exhibits a consistent proportion relative to the previous number are we thinking logarithmically. We do not appear to do that naturally.
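
To make the distinction concrete, here is a small R sketch (again, purely illustrative): comparing a single pair of numbers proportionally requires no logarithms, whereas a logarithmic scale imposes the stronger requirement of a constant ratio from one step to the next.

# Proportional comparison of single pairs -- no logarithms required:
(2 - 1) / 1  # 1.000: two is 100% greater than one
(9 - 8) / 8  # 0.125: nine is only 12.5% greater than eight

# A logarithmic scale requires a constant ratio between consecutive values:
vals <- c(1, 10, 100, 1000)
diff(log10(vals))  # 1 1 1 -- equal steps, because each value is 10 times the last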

Another example, occasionally cited, is that people tend to think of the differences between one thousand, one million, one billion, one trillion, etc., as equal when in fact each of these numbers is 1,000 times the previous. Is this because people are thinking logarithmically? I doubt it. I suspect that it is simply because each of these values exhibits the next change in the label (e.g., from the label “thousand” to the label “million”), and changes in the labels suggest equal distances. If people intuitively thought about numbers logarithmically, they would automatically recognize that each of these values (one billion versus one million versus one thousand, etc.) is 1,000 times the previous, but most of us don’t realize this fundamental fact about our decimal system without first doing the math.
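
For anyone who wants to see the arithmetic, here is a quick R sketch (an illustration of the point, nothing more): each step in the sequence is a factor of 1,000, so the labels sit at equal intervals on a log scale even though the linear gaps grow enormously.

# thousand, million, billion, trillion
x <- c(1e3, 1e6, 1e9, 1e12)
x[-1] / x[-length(x)]  # 1000 1000 1000 -- each value is 1,000 times the previous
log10(x)               # 3 6 9 12 -- equally spaced steps on a log10 scale
diff(x)                # 9.99e+05 9.99e+08 9.99e+11 -- wildly unequal linear gaps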

Along linear scales, increments from one value to the next are determined by addition—you always add a particular value to the previous value to produce the next value in the sequence, such as by adding a value of one to produce the scale 0, 1, 2, 3, 4, etc., or a value of ten to produce the scale 0, 10, 20, 30, 40, etc. Along logarithmic scales, on the other hand, increments are determined by multiplication—you always multiply the previous value by a particular number to produce the next value in the sequence, such as by multiplying each value by two to produce the scale 1, 2, 4, 8, etc., or by ten to produce the scale 1, 10, 100, 1,000, etc. The concept of logarithms, when clearly explained, is not difficult to understand once you’ve learned the mathematical concept of multiplication, but thinking about numbers logarithmically does not appear to be intuitive. It takes training.
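
If it helps, here is a brief R sketch of the two constructions (purely illustrative):

# Linear scale: repeated addition of a constant increment
seq(from = 0, by = 10, length.out = 5)  # 0 10 20 30 40

# Logarithmic scale: repeated multiplication by a constant factor
2 ^ (0:3)   # 1 2 4 8
10 ^ (0:3)  # 1 10 100 1000
cumprod(c(1, rep(10, 3)))  # 1 10 100 1000 -- the repeated multiplication made explicit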

10 Comments on “Linear Versus Logarithmic Thinking about Numbers”


By Bretwood Higman. December 26th, 2019 at 12:41 pm

Good points, and agreed. I wonder if one could delve deeper into what’s going on with the history of developing counting systems. You mentioned that going from millions to billions is simply a step in naming, but why is that so? Given how intuitive linear comparisons are, why don’t we use a system in which the names correspond to quantities separated by the same difference?

I think the answer provides additional evidence in favor of your argument here.

The number system we do use has unique names for every number from zero to twenty – a pattern that presumably has to break down eventually, but shows an interest in this linear framework. If we look at some more ancient number systems, like Roman numerals, we see the normal linear thinking. 1000 is represented by M, while 2000 is represented by MM. For obvious practical reasons they didn’t represent 2000 with a series of 2000 I’s, though one might find examples of tally marks that range into the 100s. Our Hindu-Arabic counting system is arranged logarithmically, but not because that’s the intuitive way people want their numbers arranged – it’s because it was an innovation that has proven useful and worth the effort to learn.

By Berry. December 28th, 2019 at 5:57 am

Thanks for your post! I agree it does not come intuitively, but can be learned. I work a lot with logarithmic axes and tend to explain them in two ways:
– to number-savvy people: addition vs multiplication, as you wrote.
– to lay-people: zoom out for large numbers (relatively similar) while zooming in for small numbers (to better see small differences).

Examples I bring up are the roughly logarithmic scaling of coin and banknote values, or 5€ being a relevant difference when choosing a pack of coffee but not a washing machine (of which you rarely buy a new one, hence being guided by relative rather than absolute differences is not economically stupid).

I think most people would correctly pick 3 as the center between 1 and 9 on a logarithmic axis. In R, get this with:
plot(1:9, log="x", type="n"); text(1:9, rep(5, 9), 1:9)
That would be interesting to test with people…

By Stephen Few. December 28th, 2019 at 10:28 am

Berry,

Your guidance for lay-people isn’t clear to me. When you’re explaining the use of log scales in graphs to lay-people, what do you say?

Steve

By Eric. December 29th, 2019 at 2:17 am

This might be helpful in tracking down sources.

Varshney, L. R., & Sun, J. Z. (2013). Why do we perceive logarithmically? Significance, 10(1), 28–31.

I believe the book titled Numbersense discusses the issue, but I don’t have it at hand.

By Stephen Few. December 29th, 2019 at 9:33 am

Eric,

I’ve read the paper by Varshney and Sun and also the book “The Number Sense” by Dehaene. The paper mentions the study involving children but does not provide a reference for it, which suggests that the authors have heard about it but never actually read it themselves, assuming it actually exists. Dehaene’s book does not mention the study about children. Researchers should never base an argument on a study that they have not themselves read and examined with care. If the study involving children actually exists, I want to know precisely what question was asked of the children and how it was asked. I also want to see for myself the actual data that the researchers collected. In my own field of expertise, when I encounter research claims that conflict with my own knowledge and experience, I usually find that those claims were based on flawed research.

As the title of the paper suggests, it focuses on logarithmic perception rather than cognition, although the authors conflate the two a bit. Both the paper and the book cite the fact that people tend to compare numbers proportionally, which is true, but this is easy to explain without claiming that people are thinking logarithmically about numbers.

There is clearly a survival advantage that’s associated with the logarithmic perception of many phenomena (e.g., sound), otherwise our senses would not have evolved as they have. The way that we think about numbers, however, is not the product of evolution. We’ve taught ourselves to think about numbers in particular ways, and that training hasn’t made logarithmic thinking about numbers intuitive, despite the fact that we can certainly learn to understand logarithms.

By Berry. January 2nd, 2020 at 10:21 am

When I explain a log scale to lay-people, I say that the axis is “zoomed in” for small values (a difference between e.g. 2 and 4 is clearly visible) while at the same time being “zoomed out” for larger values (200 and 400 have the same visual space on paper between them as 2 and 4).
In both examples the values have doubled, but on a linear axis from say 1 to 500, 2 and 4 would be hard to distinguish.
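
A quick numerical check of that claim, in the spirit of the earlier snippet (a sketch, for anyone who wants to verify it): on a log10 axis, the plotted distance between two values depends only on their ratio.

log10(4) - log10(2)      # 0.30103
log10(400) - log10(200)  # 0.30103 -- identical spacing, since 4/2 == 400/200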

I am well aware that the zoom analogy is not technically correct. But in a presentation setting with an animated transition from a linear to logarithmic axis, followed by explaining that now rates of change can be compared (with 2-4 and 200-400 as a visual example), I’ve often received the reaction “I finally understood logarithmics!”

Sorry for the long side track. I hope my previous comment is now comprehensible…

By Stephen Few. January 2nd, 2020 at 10:37 am

Thanks Berry. This is what I assumed that you meant, but I wanted to make sure that your explanation was clear for everyone to understand and appreciate.

By Clive. June 13th, 2021 at 5:07 am

Hi, I am researching the potential role of logarithmic thinking versus linear thinking for my dissertation.

The paper you are looking for might be:

Siegler, R. S., & Booth, J. L. (2004). Development of numerical estimation in young children. Child Development, 75(2), 428–444. https://doi.org/10.1111/j.1467-8624.2004.00684.x

Another paper that might be of interest to this discussion:

Halberda, J., & Feigenson, L. (2008). Developmental change in the acuity of the “number sense”: The approximate number system in 3-, 4-, 5-, and 6-year-olds and adults. Developmental Psychology, 44(5), 1457–1465. https://doi.org/10.1037/a0012682

By Stephen Few. June 15th, 2021 at 8:36 am

Hi Clive,

Thanks for pointing me to the study titled “Development of Numerical Estimation in Young Children.” This study is interesting and relevant to my blog article, but it does not address the specific claim that children, when asked to pick the number midway between 1 and 9 on a number line, tend to pick 3. That study remains elusive. I suspect that it does not exist.

Steve

By Clive Staples. July 20th, 2021 at 4:40 am

Thanks Steve,

You could be right that it’s just a myth.

I had another look and came up with this:

Siegler, R. S., & Opfer, J. E. (2003). The development of numerical estimation: Evidence for multiple representations of numerical quantity. Psychological Science, 14(3), 237–250.

It’s the closest I could find so far. This study examines children’s estimations of numbers on number lines of 1-100 and 1-1000. Unfortunately, not 1-10. Perhaps the idea that young children pick 3 as halfway on a 1-10 line is an over-simplification that misrepresents their findings?

For a thorough review of the literature on this topic, but still lacking a reference to the elusive study in question, there is:

Opfer, J. E., & Siegler, R. S. (2012). Development of quantitative thinking. In The Oxford Handbook of Thinking and Reasoning. https://doi.org/10.1093/oxfordhb/9780199734689.013.0030

Clive
