
The invention of the decimal dot that changed mathematics forever

Dawn of an idea: The decimal point has stuck where other ways of splitting integers have not

Historians have discovered what may be the world's first decimal point, in a 15th-century manuscript written some 150 years before its next known appearance. There have been many ways to split integers, but this little dot has proven uniquely powerful.

The mathematics we all learn at school seems so fundamental that it doesn’t feel like its individual concepts would ever have needed “inventing,” but these pieces arose separately as scientists and mathematicians realized they were needed. For instance, researchers recently found the oldest written record of the numeral “0,” pushing its origin back 500 years earlier than previously thought.

Now, it looks like the decimal point is also older than expected. Ever since we realized we sometimes need to break numbers into smaller pieces, humans have marked the divide between whole and fractional parts with various symbols – dashes, vertical lines, arcs and underscores have all filled the role, but none of them survived into modern usage. Commas and periods are the most common today, so when did they first appear?

Previously, the earliest known use of a period as a decimal point was thought to be an astronomical table produced by the German mathematician Christopher Clavius in 1593. But, as researchers have pointed out, that kind of table is a strange place to introduce such a major concept to the world, and Clavius didn’t go on to use the idea much in his later writings. Basically, if he had realized the need for the concept and invented a neat way to display and work with it, why didn’t he brag about it?

The answer, it seems, is that Clavius was just borrowing an older idea that had essentially been lost to time, and wasn’t the preferred method in his era. A new study has found that the decimal point dates back to the 1440s – about 150 years earlier – first appearing in the writings of Italian mathematician Giovanni Bianchini.

Bianchini was a professor of mathematics and astronomy at the University of Ferrara, but he also had a background in what we’d now call finance – he was a merchant who managed assets and investments for a wealthy ruling family of the time. That real-world experience seems to have influenced his mathematical work, since Bianchini was known to have created his own system of dividing measurement units like feet into 10 equal parts to make them easier to work with. As fundamental as it feels to modern sensibilities, it didn’t catch on with the 15th-century crowd, who were used to a base-60 system.

Trigonometrical tables from Giovanni Bianchini's 1440s manuscript Tabulae primi mobilis B, demonstrating the first known use of the decimal point

Now, Dr. Glen Van Brummelen, a professor at Trinity Western University in Canada, has discovered that Bianchini illustrated this system with a decimal point – the earliest known use of one. Van Brummelen found that in a manuscript called Tabulae primi mobilis B, Bianchini was using numbers with dots in the middle – the first one being 10.4 – and showing how to multiply them, something that was tricky in a base-60 system.

“I realized that he’s using this just as we do, and he knows how to do calculations with it,” Van Brummelen told Nature. “I remember running up and down the hallways of the dorm with my computer trying to find anybody who was awake, shouting ‘look at this, this guy is doing decimal points in the 1440s!’”
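To see why the little dot did real work, consider what multiplying fractional quantities looks like in the two systems. The Python sketch below is a rough modern illustration, not Bianchini's actual procedure, and the second factor is invented purely for the example: it multiplies 10.4 by another decimal value directly, then performs the same multiplication on base-60 (sexagesimal) representations, where the values have to be converted, combined and re-normalized digit by digit.

```python
from fractions import Fraction

def sexagesimal_to_fraction(whole, parts):
    """Convert a base-60 value such as 10;24 (i.e. 10 + 24/60) to an exact fraction."""
    value = Fraction(whole)
    for i, p in enumerate(parts, start=1):
        value += Fraction(p, 60 ** i)
    return value

def fraction_to_sexagesimal(value, places=2):
    """Convert a fraction back into (whole part, list of base-60 digits)."""
    whole = int(value)
    rest = value - whole
    digits = []
    for _ in range(places):
        rest *= 60
        digits.append(int(rest))
        rest -= int(rest)
    return whole, digits

# Decimal notation: one familiar operation.
print(10.4 * 2.5)  # 26.0

# Sexagesimal notation: 10;24 is 10 + 24/60 = 10.4, and 2;30 is 2.5.
a = sexagesimal_to_fraction(10, [24])
b = sexagesimal_to_fraction(2, [30])
print(fraction_to_sexagesimal(a * b))  # (26, [0, 0])
```

The decimal dot lets the fractional part ride along through ordinary arithmetic, while base-60 values need an extra layer of bookkeeping at every step.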

That manuscript contains a series of trigonometric tables, in which Bianchini breaks numbers into tenths, hundredths and thousandths, denoted with a decimal point after the whole number. This notation appears in sections where he tells readers how much to add or subtract to calculate values that fall between entries in the table. Intriguingly, this is exactly the same way Clavius used decimal points in his work 150 years later.
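Those "add or subtract" instructions amount to what we would now call linear interpolation. As a minimal modern sketch, not Bianchini's notation, and with tabulated values chosen purely for illustration (they happen to be the sines of 10° and 11°), the decimal fraction of the argument scales the difference between neighboring table entries:

```python
def interpolate(table, x):
    """Estimate a value between tabulated entries by scaling the difference
    between neighbors with the decimal fraction of the argument."""
    lo = int(x)
    frac = x - lo                       # the part the decimal point isolates
    return table[lo] + frac * (table[lo + 1] - table[lo])

# Illustrative table values only (sines of 10 and 11 degrees).
table = {10: 0.17365, 11: 0.19081}
print(interpolate(table, 10.4))         # about 0.18051
```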

Van Brummelen says this discovery shows that Clavius was inspired by Bianchini's earlier work, and that Clavius in turn went on to influence mathematicians and astronomers to adopt the decimal system, eventually cementing it in science.

The research was published in the journal Historia Mathematica.

Source: Trinity Western University

2 comments
Ignatz
So maybe now Europeans will stop misusing commas for decimal points and vice versa.
DavidB
It does raise a question about the superiority of decimals over fractions:
What's the point?