How computer memory works
Kanawat Senanan

In many ways, our memories make us who we are, helping us remember our past, learn and retain skills, and plan for the future.

And for the computers that often act as extensions of ourselves, memory plays much the same role. Whether it's a two-hour movie, a two-word text file, or the instructions for opening either, everything in a computer's memory takes the form of basic units called bits, or binary digits.

Each of these is stored in a memory cell that can switch between two states for two possible values, 0 and 1.
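
To make that concrete, here is a small Python sketch (the sample text is invented for illustration) showing how a short piece of text is ultimately nothing but a pattern of 0s and 1s:

```python
# Every character in this tiny "text file" ends up as bits in memory.
text = "Hi there"                      # an invented two-word example
for byte in text.encode("utf-8"):      # the bytes such a file would hold
    print(f"{byte:3d} -> {byte:08b}")  # each byte is eight bits, each one a 0 or a 1
```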

Files and programs consist of millions of these bits, all processed in the central processing unit, or CPU, that acts as the computer's brain.

And as the number of bits needing to be processed grows exponentially, computer designers face a constant struggle between size, cost, and speed.

Like us, computers have short-term memory for immediate tasks, and long-term memory for more permanent storage.

When you run a program, your operating system allocates an area within the short-term memory for performing those instructions. For example, when you press a key in a word processor, the CPU will access one of these locations to retrieve bits of data. It could also modify them, or create new ones. The time this takes is known as the memory's latency. And because program instructions must be processed quickly and continuously, all locations within the short-term memory can be accessed in any order, hence the name random access memory.
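
As a rough illustration of that idea, here is a toy model in Python (the 1 KB size and the addresses are invented): any location in this pretend RAM can be read or written directly, in any order.

```python
ram = bytearray(1024)        # pretend this is a small slice of short-term memory

ram[512] = 0b01001000        # write the bits for 'H' at one arbitrary address
ram[7]   = 0b01101001        # write the bits for 'i' at a completely different one

for address in (512, 7, 300):                    # visit locations in no particular order
    print(address, format(ram[address], "08b"))  # reading any of them works the same way
```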

The most common type of RAM is dynamic RAM, or DRAM. There, each memory cell consists of a tiny transistor and a capacitor that store electrical charges: a 0 when there's no charge, or a 1 when charged. Such memory is called dynamic because it only holds charges briefly before they leak away, requiring periodic recharging to retain data.
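
A minimal sketch of that refresh cycle, with made-up numbers for how quickly charge leaks and how often the memory controller rewrites it:

```python
charge = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]   # full charge stands for 1, no charge for 0
DECAY = 0.25        # assumed fraction of charge lost per time step
THRESHOLD = 0.5     # above this, a cell still reads back as a 1

def read(cells):
    """Interpret each cell's remaining charge as a 0 or a 1."""
    return [1 if c > THRESHOLD else 0 for c in cells]

for step in range(1, 7):
    charge = [c * (1 - DECAY) for c in charge]              # charge leaks away every step
    if step % 2 == 0:                                       # periodic refresh rewrites full charges
        charge = [1.0 if bit else 0.0 for bit in read(charge)]
    print(step, read(charge))
# Without the refresh on even steps, every stored 1 would fade to a 0 within a few ticks.
```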

But even DRAM's low latency of 100 nanoseconds is too long for modern CPUs, so there's also a small, high-speed internal memory cache made from static RAM. That's usually made up of six interlocked transistors, which don't need refreshing. SRAM is the fastest memory in a computer system, but also the most expensive, and it takes up three times more space than DRAM.
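
Here is a rough sketch of why that cache helps; the 1-nanosecond SRAM figure and the toy memory contents are assumptions for illustration, not measurements. Recently used addresses stay in the small, fast cache, so repeated accesses skip the slower trip to DRAM.

```python
DRAM_LATENCY_NS = 100      # roughly the DRAM figure quoted above
SRAM_LATENCY_NS = 1        # assumed cache latency for this sketch

cache = {}                                        # address -> value, standing in for the SRAM cache
dram = {addr: addr * 2 for addr in range(1000)}   # pretend main memory

def load(address):
    """Return (value, cost in nanoseconds) for one memory read."""
    if address in cache:                  # cache hit: the fast path
        return cache[address], SRAM_LATENCY_NS
    value = dram[address]                 # cache miss: go out to DRAM
    cache[address] = value                # keep it nearby for next time
    return value, DRAM_LATENCY_NS

total = 0
for addr in (42, 42, 42, 7, 42):          # repeated accesses mostly hit the cache
    _, cost = load(addr)
    total += cost
print("total latency:", total, "ns")      # 100 + 1 + 1 + 100 + 1 = 203 ns
```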

But RAM and cache can only hold data as long as they're powered. For data to remain once the device is turned off, it must be transferred into a long-term storage device, which comes in three major types.

In magnetic storage, which is the cheapest, data is stored as a magnetic pattern on a spinning disc coated with magnetic film. But because the disc must rotate to where the data is located in order to be read, the latency for such drives is 100,000 times slower than that of DRAM.
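
To put that in perspective, a quick back-of-the-envelope calculation using the figures above:

```python
dram_ns = 100                    # DRAM latency from earlier, in nanoseconds
disk_ns = dram_ns * 100_000      # a drive roughly 100,000 times slower
print(disk_ns, "ns =", disk_ns / 1_000_000, "ms")   # -> 10000000 ns = 10.0 ms per access
```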

On the other hand, optical-based storage like DVD and Blu-ray also uses spinning discs, but with a reflective coating. Bits are encoded as light and dark spots using a dye that can be read by a laser. While optical storage media are cheap and removable, they have even slower latencies than magnetic storage, and lower capacity as well.

Finally, the newest and fastest types of long-term storage are solid-state drives, like flash sticks. These have no moving parts, instead using floating gate transistors that store bits by trapping or removing electrical charges within their specially designed internal structures.

So how reliable are these billions of bits? We tend to think of computer memory as stable and permanent, but it actually degrades fairly quickly. The heat generated from a device and its environment will eventually demagnetize hard drives, degrade the dye in optical media, and cause charge leakage in floating gates.

Solid-state drives also have an additional weakness: repeatedly writing to floating gate transistors corrodes them, eventually rendering them useless.
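
This is one reason SSD firmware typically spreads writes across the whole drive, a technique known as wear leveling. A minimal sketch, with an invented write limit and block count:

```python
WRITE_LIMIT = 3_000                   # assumed number of write cycles a block survives
blocks = [0] * 8                      # write counts for eight pretend flash blocks

def write(data):
    """Send each write to the least-worn block instead of hammering one spot."""
    target = blocks.index(min(blocks))
    if blocks[target] >= WRITE_LIMIT:
        raise IOError(f"block {target} is worn out")
    blocks[target] += 1
    return target

for _ in range(20_000):               # many small writes, spread evenly
    write(b"log entry")
print(blocks)                         # each block carries 2,500 writes, well under the limit
```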

With data on most current storage media having less than a ten-year life expectancy, scientists are working to exploit the physical properties of materials down to the quantum level in the hopes of making memory devices faster, smaller, and more durable.

For now, immortality remains out of reach, for humans and computers alike.
