But there's an even bigger complication than that - after all, that's just a difference of days: the year in which things happened would still be the same...unless, of course, the event happened within 13 days of the new year...and it's the new year that this further complication relates to. In medieval Europe, even though 1 January was always called 'New Year's Day', the next numbered year was counted as beginning, in some places, on 25 December, in others on 25 March, and in others at Easter (which gets really complicated, because Easter falls on a different day each year). All of those differences are related to Christianity: 25 December is Christmas, 25 March was recognised as the date of the Annunciation - when the angel informed Mary that she was going to have a child - and Easter is well known enough that I don't need to explain it.
Things also got very complicated in France during the French Revolution, because drastic changes were made to the calendar. They basically wanted to decimalise everything. The year still had twelve months, although the months began and ended on different days to the Gregorian months, and had different names. Each month had 30 days, and the extra 5 or 6 days each year were not assigned to any month - they were called 'complementary' days. Each month consisted of three weeks, because a week was 10 days long, and each day was 10 hours long, with each hour being 100 minutes long. Years were numbered from the proclamation of the French Republic - so the year beginning on 22 September 1792 was Year I, the next was Year II, and so on. Decimal time never really took off and was officially abandoned in 1795, and the new calendar was abolished, with effect from 1 January 1806, by Napoleon, who had become emperor of France in 1804.
One thing that really did catch on from the changes made during the French Revolution is the metric system - millimetres, centimetres, kilometres, etc. - and now, in the 20th/21st century, we have bytes, kilobytes, megabytes, gigabytes, terabytes...it's interesting that there's nothing axiomatic or fundamental to the characteristics being quantified that lends itself to decimalisation. A byte is made up of 8 bits, and a kilobyte is traditionally 1024 bytes, not 1000 bytes. The same ratio applies as you go up, e.g. a megabyte is 1024 kilobytes.
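A quick sketch (my own example in Python, not from the original) of how those binary storage units stack up against what strictly decimal metric prefixes would give:

```python
# Binary storage units grow by factors of 2**10 = 1024,
# while metric prefixes grow by factors of 10**3 = 1000.
units = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte"]

for power, name in enumerate(units):
    binary = 1024 ** power   # size of one unit in bytes (binary convention)
    decimal = 1000 ** power  # what a strictly metric prefix would mean
    print(f"1 {name} = {binary} bytes (a metric prefix would say {decimal})")
```

The gap widens as you go up: a binary terabyte is about 10% larger than a decimal one, which is why standards bodies later introduced separate names (kibibyte, mebibyte, and so on) for the powers of 1024.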
Who knows why decimalisation of time didn't catch on while decimalisation of weights and measures did? Although, in some ways, time is decimalised...we talk about centuries and millennia.
Even numbers themselves weren't always as they are now. Our numbers are decimal - based on powers of 10 - and that system didn't exist before the 6th century. It relies on two crucial principles developed by Indian mathematicians: place-value notation, developed by Aryabhata of Kusumapura in the 5th century CE, and the use of zero, developed by Brahmagupta in the 7th century. It's amazing to think how revolutionary that symbol - zero, the symbol for nothing - has been. The combination of those two principles means that, as seems simply logical to us now, adding zeros to the end of a number dramatically increases it, but before then there was no such idea.
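A modern illustration (my own sketch, not from the post) of those two principles: in place-value notation, each digit contributes the digit times a power of ten determined purely by its position, and zero is what holds an 'empty' position open:

```python
def place_value_terms(numeral):
    """Expand a decimal numeral string into its digit * 10**position terms."""
    terms = []
    for position, digit in enumerate(reversed(numeral)):
        terms.append(int(digit) * 10 ** position)
    return terms

# '507' means 7*1 + 0*10 + 5*100 - the zero keeps the 5 in the hundreds place
print(place_value_terms("507"))        # -> [7, 0, 500]
print(sum(place_value_terms("507")))   # -> 507

# Appending a zero shifts every digit up one power of ten
print(sum(place_value_terms("5070")))  # -> 5070
```

Without a symbol for nothing, there is no way to write '507' so that the 5 unambiguously means five hundreds, which is exactly why the zero was so revolutionary.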
That brings me back to my original point and makes me realise that, although history and the world and the universe and our reality are complex and sometimes mysterious, we can talk about facts. We can talk about principles and ideas that were developed and we can talk about changes that happened, in a meaningful way.