An example of something with very low information entropy is a highly predictable sequence or pattern. Let's consider a simple example:

Suppose we have a fair, unbiased coin with an equal chance of landing on heads or tails. If we flip this coin and record the outcomes, we expect a random sequence of heads (H) and tails (T). That sequence has high information entropy (one bit per flip) because each outcome is unpredictable.

However, if we already know the coin is rigged and will always land on heads (H), then the sequence of outcomes has very low information entropy (zero bits, in fact). The pattern is completely predictable and the outcome is always the same: the sequence is just a repeated string of 'H's, with no uncertainty or randomness involved.

In general, low information entropy implies a high level of predictability, repetition, or lack of uncertainty, whereas high information entropy suggests unpredictability, diversity, and a greater range of possible outcomes.
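The two coins above can be checked numerically with the Shannon entropy formula, H = -Σ p·log₂(p). Here is a minimal Python sketch (the function name `shannon_entropy` is ours, chosen for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p = 0 terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two equally likely outcomes -> maximum entropy of 1 bit per flip
print(shannon_entropy([0.5, 0.5]))  # 1.0

# Rigged coin that always lands heads: no uncertainty -> 0 bits per flip
print(shannon_entropy([1.0, 0.0]))  # 0.0
```

The `if p > 0` guard handles the rigged coin's zero-probability outcome, since log₂(0) is undefined but a never-occurring outcome contributes nothing to the entropy.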
