What do all languages have in common? - Cameron Morin

Language is endlessly variable.

Each of us can come up with
an infinite number of sentences

in our native language,

and we’re able to do so from an early age—

almost as soon as we start
to communicate in sentences.

How is this possible?

In the early 1950s, Noam Chomsky
proposed a theory

based on the observation that the key
to this versatility seems to be grammar:

the familiar grammatical structure
of an unfamiliar sentence

points us toward its meaning.

He suggested that there are
grammatical rules

that apply to all languages,
and that the rules are innate—

the human brain is hardwired to process
language according to these rules.

He labelled this faculty
universal grammar,

and it launched lines of inquiry
that shaped both the field of linguistics

and the emerging field
of cognitive science for decades to come.

Chomsky and other researchers
set out to investigate

the two main components
of universal grammar:

first, whether there are, in fact,
grammar rules

that are universal to all languages,

and, second, whether these rules
are hardwired in the brain.

In attempts to establish
the universal rules of grammar,

Chomsky developed an analytical tool
known as generative syntax,

which represents the order of words
in a sentence in hierarchical syntax trees

that show what structures are possible.
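
To make the idea concrete, here is a minimal sketch, not from the original lesson, of how such a hierarchical tree could be represented in Python; the sentence, the node labels, and the structure are illustrative assumptions.

    # A phrase-structure tree as nested tuples (illustrative sketch;
    # the labels S, NP, VP, Adv and the sentence are assumptions,
    # not taken from the lesson).
    tree = ("S",
            ("NP", ("N", "she")),
            ("VP",
             ("V", "runs"),
             ("Adv", "quickly")))  # the adverb sits inside the verb phrase

    def words(node):
        # Read the sentence back off the leaves of the tree.
        if isinstance(node, str):
            return [node]
        label, *children = node
        out = []
        for child in children:
            out += words(child)
        return out

    print(" ".join(words(tree)))  # -> she runs quickly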

Based on a tree like this, we could suggest
a grammar rule

that adverbs must occur in verb phrases.

But with more data,
it quickly becomes clear

that adverbs can appear
outside of verb phrases.

This simplified example illustrates
a major problem:

it takes a lot of data
from each individual language

to establish the rules for that language,

before we can even begin to determine

which rules all languages
might have in common.

When Chomsky proposed universal grammar,

many languages lacked the volume
of recorded samples

necessary to analyze them
using generative syntax.

Even with lots of data,

mapping the structure of a language
is incredibly complex.

After 50 years of analysis, we still
haven’t completely figured out English.

As more linguistic data
was gathered and analyzed,

it became clear that languages
around the world differ widely,

challenging the theory that there were
universal grammar rules.

In the 1980s, Chomsky revised his theory

in an attempt to accommodate
this variation.

According to his new hypothesis
of principles and parameters,

all languages shared certain
grammatical principles,

but could vary in their parameters,
or the application of these principles.

For example, a principle is
“every sentence must have a subject,”

but the parameter of whether the subject
must be explicitly stated

could vary between languages.
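
A minimal sketch of this idea in Python, assuming the toy "pro-drop" parameter linguists often cite; the languages, settings, and function here are illustrative, not from the original lesson.

    # Shared principle: every sentence has a subject.
    # Language-specific parameter: may the subject stay unstated?
    PRO_DROP = {
        "English": False,  # "She speaks."  (subject must surface)
        "Spanish": True,   # "Habla."       (subject may stay implicit)
    }

    def realize(language, subject, verb):
        # The subject always exists; the parameter only decides
        # whether it is pronounced.
        if PRO_DROP[language]:
            return verb.capitalize() + "."
        return f"{subject.capitalize()} {verb}."

    print(realize("English", "she", "speaks"))  # -> She speaks.
    print(realize("Spanish", "ella", "habla"))  # -> Habla.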

The hypothesis of principles
and parameters

still didn’t answer the question of which
grammatical principles are universal.

In the early 2000s, Chomsky suggested
that there’s just one shared principle,

called recursion, which means structures
can be nested inside each other.

Take a sentence like “She said that
he thought that they left,”

which embeds a sentence within a sentence
within a sentence.

Or a phrase like “the handle of the door
of the house,” which embeds a noun phrase

in a noun phrase in a noun phrase.

Recursion was a good candidate
for a universal grammar rule

because it can take many forms.
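
Here is a minimal sketch of recursion in this sense, as a toy Python generator in which a sentence embeds another sentence to any depth; the vocabulary and the single embedding pattern are illustrative assumptions.

    import random

    def sentence(depth):
        # Base case: a simple, unembedded sentence.
        # Recursive case: a whole sentence nested inside another.
        subject = random.choice(["she", "he", "the linguist"])
        if depth > 0:
            return f"{subject} said that {sentence(depth - 1)}"
        return f"{subject} left"

    print(sentence(3))
    # e.g. -> she said that he said that the linguist said that she left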

However, in 2005 linguists
published findings

on an Amazonian language called Pirahã,

which doesn’t appear to have
any recursive structures.

So what about the other part
of Chomsky’s theory,

that our language faculty is innate?

When he first proposed universal grammar,

the idea that there was a genetically
determined aspect of language acquisition

had a profound, revolutionary impact.

It challenged the dominant paradigm,
called behaviorism.

Behaviorists argued that all animal
and human behaviors, including language,

were acquired from the outside
by the mind,

which starts out as a blank slate.

Today, scientists agree that behaviorism
was wrong,

and there is underlying,
genetically encoded biological machinery

for language learning.

Many think the same biology
responsible for language

is also responsible for other
aspects of cognition.

So they disagree with Chomsky’s idea

that there is a specific, isolated,
innate language faculty in the brain.

The theory of universal grammar
prompted the documentation and study

of many languages
that hadn’t been studied before.

It also caused an old idea to be
reevaluated and eventually overthrown

to make room for our growing
understanding of the human brain.
