What is entropy? - Jeff Phillips

There’s a concept that’s crucial
to chemistry and physics.

It helps explain why physical processes
go one way and not the other:

why ice melts,

why cream spreads in coffee,

why air leaks out of a punctured tire.

It’s entropy, and it’s notoriously
difficult to wrap our heads around.

Entropy is often described as
a measurement of disorder.

That’s a convenient image,
but it’s unfortunately misleading.

For example, which is more disordered -

a cup of crushed ice or a glass
of room temperature water?

Most people would say the ice,

but that actually has lower entropy.

So here’s another way of thinking
about it through probability.

This may be trickier to understand,
but take the time to internalize it

and you’ll have a much better
understanding of entropy.

Consider two small solids

each composed
of six atomic bonds.

In this model, the energy in each solid
is stored in the bonds.

Those can be thought of
as simple containers,

which can hold indivisible units of energy
known as quanta.

The more energy a solid has,
the hotter it is.

It turns out that there are numerous
ways that the energy can be distributed

in the two solids

and still have the same
total energy in each.

Each of these options
is called a microstate.

For six quanta of energy in Solid A
and two in Solid B,

there are 9,702 microstates.
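That count can be verified with a short script. Placing q indivisible quanta into 6 bonds is a "stars and bars" counting problem with C(q + 5, 5) arrangements, and the two solids are independent, so their counts multiply. A sketch (the function name is mine, not from the lesson):

```python
from math import comb

def microstates(quanta: int, bonds: int = 6) -> int:
    """Ways to place `quanta` indivisible units into `bonds` containers
    ("stars and bars"): C(quanta + bonds - 1, bonds - 1)."""
    return comb(quanta + bonds - 1, bonds - 1)

# Six quanta in Solid A, two in Solid B, six bonds each:
print(microstates(6) * microstates(2))  # 462 * 21 = 9702
```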

Of course, there are other ways our eight
quanta of energy can be arranged.

For example, all of the energy
could be in Solid A and none in B,

or half in A and half in B.

If we assume that each microstate
is equally likely,

we can see that some of the energy
configurations

have a higher probability of occurring
than others.

That’s due to their greater number
of microstates.

Entropy is a direct measure of each
energy configuration’s probability.

What we see is that the energy
configuration

in which the energy
is most spread out between the solids

has the highest entropy.

So in a general sense,

entropy can be thought of as a measurement
of this energy spread.

Low entropy means
the energy is concentrated.

High entropy means it’s spread out.
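One standard way to make this precise, though the lesson doesn't spell it out, is Boltzmann's relation S = k·ln W: entropy grows with the number of microstates W. Ranking every split of the eight quanta in the same six-bond toy model puts the evenly spread split on top:

```python
from math import comb, log

def microstates(quanta: int, bonds: int = 6) -> int:
    # stars and bars: ways to place quanta into bonds containers
    return comb(quanta + bonds - 1, bonds - 1)

# Multiplicity W and entropy S = ln W (in units of k_B)
# for every split of 8 quanta between Solid A and Solid B.
for q_a in range(9):
    w = microstates(q_a) * microstates(8 - q_a)
    print(f"A={q_a}, B={8 - q_a}:  W={w:5d}  S/k_B={log(w):.2f}")
# The even split (4, 4) tops the list with W = 15,876.
```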

To see why entropy is useful for
explaining spontaneous processes,

like hot objects cooling down,

we need to look at a dynamic system
where the energy moves.

In reality, energy doesn’t stay put.

It continuously moves between
neighboring bonds.

As the energy moves,

the energy configuration can change.

Because of the distribution
of microstates,

there’s a 21% chance that the system
will later be in the configuration

in which the energy is maximally
spread out,

there’s a 13% chance that it will
return to its starting point,

and an 8% chance that A will actually
gain energy.
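Those percentages come from counting the microstates of every possible split of the eight quanta (q quanta in 6 bonds have C(q + 5, 5) arrangements) and treating each microstate as equally likely. A quick check, not from the lesson itself:

```python
from math import comb

def microstates(quanta: int, bonds: int = 6) -> int:
    # stars and bars: ways to place quanta into bonds containers
    return comb(quanta + bonds - 1, bonds - 1)

# Multiplicity of each split of the 8 quanta (q in A, 8 - q in B).
W = [microstates(q) * microstates(8 - q) for q in range(9)]
total = sum(W)  # 75,582 microstates across all splits

print(f"maximally spread out (4,4): {W[4] / total:.0%}")           # 21%
print(f"back to the start (6,2):    {W[6] / total:.0%}")           # 13%
print(f"A gains energy (7,1)+(8,0): {(W[7] + W[8]) / total:.0%}")  # 8%
```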

Again, we see that because there are
more ways to have dispersed energy

and high entropy than concentrated energy,

the energy tends to spread out.

That’s why if you put a hot object
next to a cold one,

the cold one will warm up
and the hot one will cool down.

But even in that example,

there is an 8% chance that the hot object
would get hotter.

Why doesn’t this ever happen
in real life?

It’s all about the size of the system.

Our hypothetical solids only had
six bonds each.

Let’s scale the solids up to 6,000 bonds
and 8,000 units of energy,

and again start the system with
three-quarters of the energy in A

and one-quarter in B.

Now we find that the chance of A
spontaneously acquiring more energy

is this tiny number.
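The transcript doesn't spell the number out, but its scale can be estimated with log-gamma arithmetic, since the microstate counts themselves are far too large to write down. This sketch assumes 6,000 bonds in each solid, with 6,000 quanta starting in A and 2,000 in B:

```python
from math import lgamma, exp

def log_w(quanta: int, bonds: int) -> float:
    """log of C(quanta + bonds - 1, bonds - 1), the number of ways to
    place the quanta into the bonds, computed via log-gamma because
    the counts are astronomically large."""
    return lgamma(quanta + bonds) - lgamma(quanta + 1) - lgamma(bonds)

BONDS = 6000   # assumption: 6,000 bonds in EACH solid
TOTAL = 8000   # total quanta; the system starts with 6,000 in A, 2,000 in B

# Log-multiplicity of every split (q quanta in A, TOTAL - q in B).
logs = [log_w(q, BONDS) + log_w(TOTAL - q, BONDS) for q in range(TOTAL + 1)]
peak = max(logs)
weights = [exp(x - peak) for x in logs]  # rescale to avoid overflow

# Probability that A spontaneously ends up with MORE than its 6,000 quanta.
p_gain = sum(weights[6001:]) / sum(weights)
print(p_gain)  # far below 1e-100: effectively zero
```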

Familiar, everyday objects have many, many
times more particles than this.

The chance of a hot object
in the real world getting hotter

is so absurdly small,

it just never happens.

Ice melts,

cream mixes in,

and tires deflate

because these states have more
dispersed energy than the originals.

There’s no mysterious force
nudging the system towards higher entropy.

It’s just that higher entropy is always
statistically more likely.

That’s why entropy has been called
time’s arrow.

If energy has the opportunity
to spread out, it will.
