How to pronounce the word entropy

The Chinese word for entropy is pronounced shāng.

Interpretation: Entropy is a secondary common character in Chinese. 1. In physics, it refers to the quotient obtained by dividing heat by temperature, indicating the degree to which heat can be converted into work. 2. In science and technology, it generally refers to a measure of the state of certain material systems and of the degree to which certain states of those systems may occur. The social sciences also borrow it to describe the degree of certain states of human society.

Modern interpretation

Entropy: shāng (ㄕㄤ)

1. In physics, it refers to the quotient obtained by dividing heat by temperature, indicating the degree to which heat can be converted into work.

2. In science and technology, it generally refers to a measure of the state of certain material systems and of the degree to which certain states of those systems may occur. The social sciences also borrow it to describe the degree of certain states of human society.

Characteristics of entropy

1. Entropy is a state function of the system; its value is independent of the path by which that state is reached.

2. Entropy is defined by dS = δQ_R/T, where δQ_R is the heat exchanged in a reversible process. The entropy change of an irreversible process must therefore be calculated from the heat δQ_R of a reversible process that shares the same initial and final states.
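The definition dS = δQ_R/T can be sketched numerically for a simple reversible path: heating a substance at constant pressure, where δQ_R = m·c_p·dT. The material values below (1 kg of water, c_p = 4186 J/(kg·K)) are illustrative assumptions, not taken from the text; the point is that integrating δQ_R/T over the path reproduces the closed-form result m·c_p·ln(T2/T1).

```python
import math

def entropy_change_heating(mass_kg, c_p, t1_k, t2_k, steps=100_000):
    """Numerically integrate dS = dQ_rev / T for reversible heating
    at constant pressure, where dQ_rev = mass * c_p * dT."""
    dT = (t2_k - t1_k) / steps
    s = 0.0
    t = t1_k
    for _ in range(steps):
        t_mid = t + dT / 2                 # midpoint rule for the integral
        s += mass_kg * c_p * dT / t_mid
        t += dT
    return s

# Illustrative values: 1 kg of water heated from 20 °C to 100 °C.
m, cp, T1, T2 = 1.0, 4186.0, 293.15, 373.15
numeric = entropy_change_heating(m, cp, T1, T2)
exact = m * cp * math.log(T2 / T1)         # closed form: m * c_p * ln(T2/T1)
print(numeric, exact)
```

Because entropy is a state function, any reversible path between the same two temperatures would give the same ΔS, which is exactly why the irreversible case is computed via a reversible substitute path.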

Naming significance of entropy words

In physical terms, it denotes the quotient obtained by dividing heat by temperature.

Is entropy a good character for a name? Entropy is an uncommon naming character with a neutral meaning. Note that a name cannot be judged from the character alone; it must be considered together with the Eight Characters (bāzì) of one's birth.

Taboos when naming with the character entropy:

1. In the Five Elements, entropy belongs to fire. By the principle that fire overcomes metal, it is taboo to pair the character entropy with characters whose element is metal.

2. When naming with the character entropy, avoid characters with the same final āng or the same tone, which make the name awkward to read and devoid of rhythm.

3. The character entropy must not repeat a character in an ancestor's name; if an ancestor's name contains it, descendants are forbidden to use it.

The concept of entropy

The concept of entropy was proposed by the German physicist Clausius in 1865. Originally one of the state parameters of matter, used to describe "energy degradation", it is widely applied in thermodynamics.

In 1948, Claude Elwood Shannon introduced the entropy of thermodynamics into information theory, where it is therefore also called Shannon entropy. In information theory, entropy is the average amount of information contained in each received message; it is also known as information entropy, source entropy, and average self-information.
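The "average amount of information per message" has a standard formula, H = −Σ p_i log₂ p_i (in bits), which can be sketched directly; the coin and die distributions below are illustrative examples, not from the text.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)).
    Terms with p = 0 contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))      # 1.0
# A fair six-sided die carries log2(6) ≈ 2.585 bits per roll.
print(shannon_entropy([1 / 6] * 6))
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))           # -0.0
```

A uniform distribution maximizes the entropy for a given number of outcomes, which matches the intuition that the most unpredictable source carries the most information per message.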