Centigrade Scale: Who Invented It? Things to Know About It

The centigrade scale is another name for the Celsius scale. The Celsius scale is a temperature scale based on 100° for the boiling point and 0° for the freezing point of water.

Anders Celsius, a Swedish astronomer, physicist, and mathematician, devised the original version of the scale, which was later named in his honor.

Celsius's original scale set the boiling point of water at 0° and the freezing point of water at 100°.

Later, the scale was inverted so that the freezing point became 0° and the boiling point 100°, and it is this form of the Celsius scale that came into wide use.

Celsius temperatures are measured on a relative interval scale rather than an absolute ratio scale. Ratio scales are the kind used to measure quantities such as weight or distance.

For example, when a mass is doubled (say, from 10 kg to 20 kg), the object genuinely contains twice the amount of matter.

Likewise, an increase from 10 kg to 20 kg is the same size as an increase from 50 kg to 60 kg. It is important to remember, however, that the Celsius scale does not treat heat energy this way.

The difference between 10 °C and 20 °C is the same 10° step as the difference between 20 °C and 30 °C, yet a temperature of 20 °C does not represent twice the heat energy of 10 °C, as the short sketch below illustrates.
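The contrast between interval and ratio behaviour can be checked with a few lines of arithmetic. The sketch below is illustrative only; converting to the kelvin scale (an absolute scale not discussed above) is an assumption introduced to make the ratio comparison meaningful.

```python
def celsius_to_kelvin(t_c: float) -> float:
    """Convert a temperature from degrees Celsius to kelvin."""
    return t_c + 273.15

# Equal intervals: the step from 10 °C to 20 °C is the same size as from 20 °C to 30 °C.
print(20 - 10, 30 - 20)   # 10 10

# A naive ratio on the Celsius scale would suggest 20 °C is "twice" 10 °C ...
print(20 / 10)            # 2.0

# ... but on the absolute kelvin scale the ratio is only about 1.035,
# so 20 °C is nowhere near twice as hot as 10 °C.
print(celsius_to_kelvin(20) / celsius_to_kelvin(10))   # ≈ 1.035
```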

What Is The Current Celsius Scale?

The modern Celsius scale is instead based on absolute zero and the triple point of Vienna Standard Mean Ocean Water. This means that neither the boiling nor the melting point of water defines the present Celsius scale.

In practical circumstances, the differences between the formal and everyday definitions of the Celsius scale are negligible.

The boiling point of water under the present definition differs from the original value of 100° by just 16.1 millikelvins.
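A short back-of-the-envelope calculation reproduces the 16.1 mK figure, assuming the commonly cited boiling-point value of about 373.1339 K (99.9839 °C) for VSMOW at standard pressure; that measured value is taken from reference tables rather than from this article.

```python
def kelvin_to_celsius(t_k: float) -> float:
    """Convert kelvin to degrees Celsius (0 K = -273.15 °C)."""
    return t_k - 273.15

modern_boiling_point_c = kelvin_to_celsius(373.1339)   # ≈ 99.9839 °C (assumed reference value)
original_boiling_point_c = 100.0                       # original centigrade definition

difference_mk = (original_boiling_point_c - modern_boiling_point_c) * 1000
print(f"{difference_mk:.1f} mK")                       # ≈ 16.1 mK
```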

Celsius vs. Centigrade: Key Points

Since the nineteenth century, the scientific and thermometry communities have used the term “centigrade scale,” and temperatures were frequently reported simply as “degrees” or, when greater specificity was wanted, as “degrees centigrade,” with the symbol °C.

However, in French and Spanish the name “centigrade” was also used for a unit of angular measurement (1/100 of a right angle), with a similar connotation in other languages.

International standards bodies such as the BIPM therefore used the term gradian or centesimal degree (“gon” or “grad”: 1 gon = 0.9°, 100 gon = 90°) when unambiguous language was required.

This angular unit is now more properly referred to as the “gon.” To avoid confusion between the units of angular measurement and temperature, the 9th General Conference on Weights and Measures and the Comité International des Poids et Mesures formally adopted the “degree Celsius” for the unit of temperature in 1948, keeping the recognized degree symbol (°) rather than the centesimal or gradian degree symbol (gon or g).
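For reference, the gradian conversion quoted above (1 gon = 0.9°, 100 gon = 90°) is easy to check in a couple of lines; the helper name below is purely for illustration.

```python
def gon_to_degrees(angle_gon: float) -> float:
    """Convert an angle from gon (gradians) to degrees: 1 gon = 0.9 degrees."""
    return angle_gon * 0.9

print(gon_to_degrees(1))    # 0.9   (one centesimal degree)
print(gon_to_degrees(100))  # 90.0  (one right angle)
```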

“Celsius” is usually used for scientific purposes, but in English-speaking countries “centigrade” is still commonly heard, especially in informal contexts. It was not until February 1985 that the BBC’s weather forecasts switched from “centigrade” to “Celsius.”

Though the size of the degree has been more accurately specified, the Celsius scale remains a centigrade scale, with 100 degrees from the freezing point (0 °C) to the boiling point (100 °C) of water. At standard pressure, the triple point of water and the freezing point of water differ by 0.01 °C.
