The division of time into units such as hours, minutes, and seconds, based on a base-60 system, can be traced back to the ancient Babylonians. They used a sexagesimal (base-60) numeral system, which influenced the way they measured time. This system was likely adopted because 60 has many divisors (1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, and 60), making it convenient for dividing quantities into whole parts.
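To make the divisor argument concrete, here is a small sketch (not part of the original answer) that computes the divisors of 60 and compares them with those of 10 and 100. The helper name `divisors` is just an illustrative choice:

```python
def divisors(n: int) -> list[int]:
    """Return all positive divisors of n in ascending order."""
    return [d for d in range(1, n + 1) if n % d == 0]

# 60 splits evenly into far more whole parts than 10 or 100 of
# comparable size, which is the usual argument for its convenience.
print(divisors(60))   # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
print(divisors(10))   # [1, 2, 5, 10]
print(divisors(100))  # [1, 2, 4, 5, 10, 20, 25, 50, 100]
```

An hour, for example, divides cleanly into halves, thirds, quarters, fifths, sixths, tenths, and twelfths, all without fractions of a minute.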
However, when it comes to larger units of time, such as centuries and millennia, the choice of a base-10 (decimal) system with 100 years in a century is more a historical and cultural convention than a mathematical one.
The decimal system, based on powers of 10, has been widely used in many human cultures throughout history. It is likely that when people started measuring longer periods of time, such as centuries, they adopted the base-10 system that was already familiar and commonly used for various other purposes, including counting and arithmetic calculations.
The adoption of 100 years as a century may also reflect how neatly it aligns with decades (groups of 10 years), a common way to organize and refer to periods of time. A base-60 unit for long spans of history would have been less intuitive and harder to relate to everyday decimal counting.
Ultimately, the choice of using 100 years for a century and 60 for smaller units of time is a result of historical and cultural conventions that have developed over time, influenced by mathematical convenience, practicality, and human preferences.