Physics · Last updated: Apr 2026

What is Entropy?

/ˈentrəpi/

A measure of the disorder or randomness in a system. The second law of thermodynamics states that the entropy of an isolated system never decreases: left to itself, the universe drifts towards disorder.
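In standard textbook notation (not part of the original snack), the second law and Clausius's definition of entropy change read:

```latex
% Second law: the entropy of an isolated system never decreases.
\Delta S \ge 0
% Clausius's definition: the entropy change along a reversible path,
% where \delta Q_{\text{rev}} is heat absorbed at absolute temperature T.
dS = \frac{\delta Q_{\text{rev}}}{T}
```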

Everyday Example

A messy room is high entropy; a tidy room is low entropy. Left alone, rooms get messier — it takes energy (your effort) to create order. The universe does the same on a cosmic scale.

Real-World Application

Data centres generate enormous heat because computing carries a thermodynamic price. Landauer's principle says that erasing even a single bit of information must release a tiny amount of heat, so every irreversible calculation increases the entropy of the universe slightly. It's an unavoidable physical cost of information processing.
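To get a feel for the size of that cost, here is a small Python sketch of the Landauer limit, the minimum heat released when one bit is erased. The formula E = k_B · T · ln 2 is standard physics; the script, and the erasure rate in the second print, are made-up figures purely for scale.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in joules per kelvin (exact SI value)

def landauer_limit_joules(temperature_k: float) -> float:
    """Minimum heat dissipated by erasing one bit: E = k_B * T * ln(2)."""
    return K_B * temperature_k * math.log(2)

# At room temperature (about 300 K) the cost per erased bit is tiny...
per_bit = landauer_limit_joules(300.0)
print(f"Per erased bit: {per_bit:.2e} J")  # ~2.87e-21 J

# ...so even a hypothetical machine erasing 10^20 bits per second would
# dissipate well under a watt from erasure alone (an assumed toy rate):
print(f"At 1e20 bits/s: {per_bit * 1e20:.3f} W")  # ~0.287 W
```

Real hardware runs many orders of magnitude above this floor; most data-centre heat is ordinary electrical inefficiency, with the Landauer limit as the ultimate lower bound.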

Did you know?

Rudolf Clausius coined the term entropy in 1865. Ludwig Boltzmann later showed that entropy is fundamentally statistical — a high-entropy state simply has more possible arrangements.
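Boltzmann's counting idea can be made concrete with a toy model (our illustration, using coins in place of molecules): the "half heads" macrostate has overwhelmingly more possible arrangements than the "all heads" one, so it has higher entropy.

```python
import math

def microstates(n_coins: int, n_heads: int) -> int:
    """Number of distinct arrangements (microstates) with exactly n_heads heads."""
    return math.comb(n_coins, n_heads)

def entropy_in_kb(w: int) -> float:
    """Boltzmann entropy S = k_B * ln(W), reported in units of k_B."""
    return math.log(w)

n = 100
for heads in (0, 25, 50):
    w = microstates(n, heads)
    print(f"{heads:>2} heads out of {n}: W = {float(w):.3e}, S/k_B = {entropy_in_kb(w):.1f}")
```

The tidy room corresponds to the single "all heads" arrangement; the messy room corresponds to astronomically many, which is why messiness wins by default.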


Key Insight

Entropy explains why time only moves forward. Processes that increase disorder are statistically irreversible — you can't un-break an egg because the odds of all its molecules reassembling are essentially zero.
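The same counting argument shows why un-breaking is hopeless. A back-of-the-envelope sketch, assuming a toy gas whose N molecules each land independently in either half of a box:

```python
import math

# Probability that all N molecules spontaneously gather in the left half
# of the box at the same instant: (1/2)^N. Printed as a power of ten,
# since the value underflows ordinary floats long before N gets large.
for n in (10, 100, 10**6):
    log10_p = n * math.log10(0.5)
    print(f"N = {n:>9,}: P ≈ 10^{log10_p:,.0f}")
```

For anything egg-sized, N is around 10^24 molecules, so the wait for a spontaneous reversal dwarfs the age of the universe.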
