
Language-like features of hierarchical
conceptualization: Implications for
the thought-language relation∗
Chris Thornton
Centre for Research in Cognitive Science
University of Sussex
Brighton
BN1 9QJ
UK
[email protected]
July 5, 2016
Abstract
Conceptual thought provides the means of using one concept as the
semantic container for one or more other concepts taken in combination
(e.g., conceiving of a family made up of a mother and child). As a construction of this form is itself a concept, the process is inherently recursive.
It is a medium through which hierarchical structures with complex, compositional meanings can be constructed. Hierarchical conceptualization is
a language-like medium in this sense. Although its properties have not
previously been analyzed from a formal perspective, they are of potential
significance for the study of mind and language. Whether this conceptual
but language-like medium plays any role in language itself becomes of
interest. On close examination of the situation, the present paper reaches
the conclusion that a functional dependency cannot be ruled out. This
casts new light on the relationship between thought and language.
1 Introduction
A basic observation about concepts is that they are amenable to hierarchical combination (Murphy, 2002; Carey, 2009). Thanks to their capacity for semantic
accommodation, they have the ability to ‘fit inside’ each other. Particular concepts mandate particular hierarchical structures, and articulation of these is a
common mode of thought. Consider, for example, the act of characterizing the
combination of a mother and child as a family. This can be done because the
combination of mother and child concepts can be semantically accommodated
within the family concept. The meaning of the latter allows for the idea of a
family with this make-up. The hierarchical structure that arises is a product of
the concepts’ semantics. They combine in this way by virtue of the fact that
one can encompass the other two, taken in combination.1

∗ This research was supported by Leverhulme Trust grant F/00 230/AI.
Since a hierarchical combination of concepts is itself a concept, constructions of more than one level are a possibility. The process readily gives rise
to multi-level structures. Say we start with the following five concepts: house,
garden, residence, lawn and flower-bed.2 A garden can be constituted of a lawn
and a flower-bed. A residence can be constituted of a house and a garden. A
hierarchical structure of two levels is thus mandated. This comprises a residence
conceived as encompassing a house and garden, with the latter conceived as encompassing a lawn and flower-bed.3 The capacities of semantic accommodation
that concepts possess give rise to specific hierarchical structures in this way.
Any collection of concepts yields a well-defined set of such structures. The set may be empty—a
particular selection of concepts may yield no hierarchical structures at all. But
it is always a well-defined product of the meanings the concepts possess.
What can then be shown is the degree to which hierarchical conceptualization
is a language-like medium. Since it is a way of building constructions with
particular meanings, it can justifiably be seen as a way of constructing meaning.
This is one way the process is language-like. But there are other features to
take into account. The process is inherently compositional, for example. In
a hierarchical combination of concepts, the encompassing concept becomes a
semantic container for the encompassed concepts. The effect is to produce a
composed structure. That the process is inherently generative is also evident.
Combining concepts hierarchically produces a new concept, which may be the
means of producing a new combination, and further concepts, in an ongoing
way. In principle, hierarchical conceptualization is an infinitely productive way
of constructing meaning. This characteristic is considered a key hallmark of
language.
1 Notice that the structure obtained by hierarchical concept combination is not simply a
part-whole hierarchy (a meronomy). In a meronomy, each hierarchical unit defines a whole
to be the sum of the cited parts: there is no allowance for imposition of an accommodating
concept. Hierarchical concept combination is also not to be confused with generic concept
combination. This is a more diverse process that has been modeled in a range of ways (e.g.
Hampton, 1991; Thagard, 1997; Rips, 1995; Wisniewski, 1997; Costello and Keane, 2001;
Hampton, 1997, 2011).
2 Following Murphy’s (2002) approach, concept names are not written in a special case or
font.
3 The term ‘encompass’ is used in its constitutive sense throughout. The assertion that ‘X
encompasses Y’ implies that X encompasses Y as part of its make-up.

The question of how this language-like medium of conceptualization relates
to language itself then becomes of interest. The constructions it gives rise to
are hierarchical and compositional, as are constructions of language (Pinker,
1994; Jackendoff, 2002). Any idea formed as a hierarchical structure in the
conceptual system can presumably also be expressed by means of a hierarchical
structure constructed in the linguistic system. Should we assume, then, that
the two structures are related in some way? Is it possible that the former shapes
the latter? If so, how does the shaping work? The present paper looks into the
issue in some detail. It is shown that hierarchical constructions in the conceptual
system might have no functional relationship to those of the linguistic system,
but that the existence of a shaping influence cannot be ruled out. It is at least
possible that the constructive functionality of the conceptual system shapes the
constructive functionality of the linguistic system. This has implications for the
relation between thought and language. If the latter directly exploits processes
of the former, the idea of there being a functionally independent language system
becomes open to question.
The remainder of the paper sets out the argument, and considers the implications in more detail. There are four main sections in all. The section to follow
(Section 2) introduces a shorthand for writing hierarchical conceptualizations,
and uses this to demonstrate the semantic power of the process. Following this,
Section 3 sets out the possible forms of relationship that might exist between
hierarchical conceptualization and hierarchical language. Section 4 focuses on
the case where we assume the former shapes the latter, and establishes one way
in which this might be accomplished. Finally, Section 5 discusses the general
implications of the analysis for the relation between thought and language.
Some technical points need to be noted at the outset. The material dealing
with conceptual structures follows the approach of (Murphy, 2002) in avoiding
use of special fonts or cases in the naming of concepts. Where there is any
ambiguity, the phrase ‘the concept of X’ is used to indicate that X is a concept
name. Phrasal concept names use the dot ‘.’ as a connective; the concept
of an ongoing event thus has the name ‘ongoing.event’. The material dealing
with language (Section 3 onwards) makes use of interlinear glosses as a way of
identifying grammatical structure. All the glosses presented can also be found
in the online resource ‘The World Atlas of Language Structures’ (Dryer and
Haspelmath, 2011). References to this resource use the acronym ‘WALS’ and
relate to its state at the current time. The internet location of the resource is
http://wals.info.
2 Hierarchical conceptualization
To bring out the constructive possibilities of hierarchical conceptualization, complex examples will need to be examined. Since English descriptions of multi-level
structures quickly become unreadable, it is helpful to introduce a shorthand at
this point. The convention henceforth will be that a hierarchical structure in
which concept X is conceived as encompassing some combination of concepts,
will be denoted by enclosing all the concepts in square brackets with X placed
first. Thus
[X Y Z]
is shorthand for a hierarchical combination in which concept X is conceived as
encompassing concepts Y and Z, taken in combination. The encompassing concept is emboldened for emphasis. The encompassed concepts can be arbitrarily
numerous, and are not in any particular order. Embedding of structure is then
dealt with by bracketing in the obvious way. The concept of an X encompassing
the combination of a Y and Z, where the Z itself encompasses the combination
of B, D and E, has the shorthand
[X Y [Z B D E]]
The two examples used in the introduction can be expressed more succinctly
using this approach. The concept of a family encompassing a mother and child
can be expressed as
[family mother child]
The concept of a residence encompassing a house and garden, where the garden
itself is conceived as encompassing a lawn and flower-bed can be expressed as
[residence house [garden lawn flower-bed]]
Formally, the shorthand obeys the following Backus–Naur form (BNF) definition.

⟨spec⟩ ::= ⟨concept-name⟩
⟨spec⟩ ::= "[" ⟨spec⟩ ⟨spec⟩+ "]"

The non-terminal ⟨spec⟩ denotes the specification of a concept. This is defined to
be either the name of a concept, or a square-bracketed sequence containing two
or more ⟨spec⟩. Use of the ‘+’ allows that hierarchical combinations
can have any number of encompassed elements. Without this, the effect would
be to enforce combinations with only one.4
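For concreteness, the sketch below shows one way the shorthand could be represented and parsed. Python is assumed, and the names (Combo, parse, tokenize) are purely illustrative; nothing here is part of the shorthand's definition. Holding the encompassed elements in a frozenset reflects the fact that they have no order, so [X Y Z] = [X Z Y] falls out automatically.

# A minimal sketch of the bracket shorthand: hypothetical names, not a
# prescribed implementation.
from dataclasses import dataclass
from typing import Union, FrozenSet

Spec = Union[str, "Combo"]

@dataclass(frozen=True)
class Combo:
    head: Spec               # the encompassing concept
    parts: FrozenSet[Spec]   # the encompassed concepts, which have no order

def tokenize(text):
    return text.replace("[", " [ ").replace("]", " ] ").split()

def parse(tokens):
    """Consume and return one <spec> from a list of tokens."""
    tok = tokens.pop(0)
    if tok != "[":
        return tok           # a bare concept name
    head = parse(tokens)
    parts = []
    while tokens[0] != "]":  # the BNF requires at least one part; not enforced here
        parts.append(parse(tokens))
    tokens.pop(0)            # discard the closing bracket
    return Combo(head, frozenset(parts))

a = parse(tokenize("[residence house [garden lawn flower-bed]]"))
b = parse(tokenize("[residence [garden flower-bed lawn] house]"))
assert a == b                # unordered encompassed elements: [X Y Z] = [X Z Y]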
The shorthand allows hierarchical combinations to be expressed in a succinct
way. There is the temptation to view it as a notation, but its classification as
such is problematic. The convention of placing the encompassing concept first
in sequence seems indicative of a prefix or Polish notation—one in which the
operator of a construction is placed before its arguments. But this classification
is incorrect. The operator applied in a construction is hierarchical encompassment by means of the initially placed concept. It is the initial concept deployed
in a particular way. The initial concept is not itself the constructive operator,
and the shorthand is not a prefix or Polish notation for this reason.
The shorthand is also not in any sense a version of predicate logic. This is an
important point, as predicate logic has often been used to notate concept constructions (e.g. Gentner, 1983). A hierarchical construction in predicate logic
does not describe a hierarchical concept combination. The operation of logical
predication is not the operation of constitutive encompassment. In one case,
the outcome is a boolean; in the other, it is a concept. It would be misleading,
therefore, to use X(Y,Z) as shorthand for the concept of an X encompassing a
Y and Z.
4 Notice that since the encompassed elements in a construction have no order, it is always
the case that [X Y Z] = [X Z Y].
If the shorthand has to be classified in a formal way, the least-bad option
is to consider it to be a programming language for concepts. The shorthand
has a kinship with LISP, a functional programming language often used in AI
(McCarthy et al., 1985/1962). LISP also uses prefix positioning to denote a
particular deployment of a named entity, on which basis we might view the
shorthand as a version of LISP in which function-calls evaluate to concepts. But,
again, the correspondence is far from perfect. Application of a computational
function is not the same thing as imposition of an encompassing concept. The
treatment of arguments also differs. In a programming language such as LISP,
arguments are given particular roles by positioning. This is not the case in the
shorthand, where the encompassed elements form a combination, and have no
order.
There is a situation in which the shorthand has the potential to serve as a
formal notation. If all the concepts named in a construction have a meaning
defined in some other way, the construction then has a formal meaning too.
It is a recursive structuring of the pre-defined meanings using the operation of
hierarchical accommodation. In this situation, constructions remain semantically precise regardless of their depth and complexity. For present purposes,
however, no such assumption is made. The shorthand should not be considered
a notation.
To get a general picture of the expressive power of hierarchical conceptualization, we need to look at the ways constructions can be built up to form
complex compositions. Imagine we are provided with a base of four concepts:
the concept of a flight, the concept of a drive, the concept of a journey and the
concept of a pilgrimage. The set of given concepts is then: flight, drive, journey,
pilgrimage. Since a journey can be made up of a flight and a drive, a potential
hierarchical combination is
[journey flight drive]
This places the concept of a journey into the encompassing role, with flight and
drive as the encompassed elements. It realizes the idea of a journey encompassing (within its make-up) a flight and drive. Another possibility is
[pilgrimage flight drive]
This is the subtly different concept of a pilgrimage encompassing a flight and
a drive. While the same concepts are encompassed, the encompassing element
differs, with the result that a different meaning is realized. The meaning of the
construction is the concept of a particular type of pilgrimage.
Also of interest are hierarchical combinations incorporating a single encompassed concept; e.g.,
[journey drive]
This expresses the idea of a journey that solely encompasses a drive. Another
combination yields the idea of a journey that solely encompasses a flight:
[journey flight]
Minimal constructions like these are termed ‘singles’ below. Intuitively, they
are best seen as classifications. The second, for example, can be seen to express
the idea of a flight that is also classified as a journey. But notice there is no
implication of either element being more general than the other. Singles do
not define hyponyms. The signified relationship is hierarchical encompassment,
which, in the case of a single encompassed element, implies ‘both’.5
Singles are also inherently reversible. An X that is classified as a Y can also
be seen as a Y classified as an X. By definition, therefore
[X Y] = [Y X]
Taking singles into account, it is important to remember the denoted relation in
a construction is constitutive encompassment. The structure [journey flight]
expresses the concept of a journey that encompasses a flight as its (single) constitutive element. There is the potential for confusion because, where only two
concepts are referenced, encompassment need not imply constitution. We can
speak of ‘night encompassing a building’ in a way that does not imply the latter
contributes to the former’s constitution. This is not the sense of ‘encompassing’
used here. In all cases, encompassment is considered to be specifically constitutive in nature.
Another factor contributing to the expressive power of hierarchical conceptualization is the potential to place relational concepts in the encompassing role.
With a relational concept deployed in this way, the effect achieved is something
like a relational schema. An example illustrating this is
[understanding teacher lawyer]
This constructs the concept of an understanding encompassing a teacher and a
lawyer (or an understanding between a teacher and a lawyer, as it would normally
be expressed). The encompassing concept is implicitly an imposed relation,
and what is constructed is, in effect, a schema. But notice that no schema-making apparatus is involved. The concept of understanding provides the main
contribution. Deployed in the encompassing role, it forms the ‘glue’ that holds
the two constituent concepts together. The relational arrangement is captured
purely by hierarchical combination—by treating one concept as encompassing
the other two.
Singles can be used to refine concepts of this type. For example, the following
two-level structure might be realized:
[understanding [agent teacher] [recipient lawyer]]
This expresses the concept of an understanding encompassing a teacher and a
lawyer, in which the teacher is classified as agent, and the lawyer is classified
as recipient. It builds the concept of a teacher understanding a lawyer, rather
than the other way around. It is what we would express in English by saying
something like the teacher understands the lawyer. The process of adding levels
of construction can continue as long as we like. Consider, for example,

[improving.thing [understanding [agent teacher] [recipient lawyer]]]

This adds a new layer of meaning: the teacher’s understanding of the lawyer is
now classified as improving.

5 The distinction with meronomies also becomes apparent here, since a meronomical structure with a single ‘part’ is a logical impossibility. In hierarchical concept combination, we
have a specified encompassing concept for each hierarchical unit. This means that although
there can only be one meronomy based on the flight and drive concepts, there can be any
number of hierarchical combinations.
Use of singles, relational concepts and embedding increases the representational power of hierarchical concept combination, then. Using all the possibilities on offer, we can get closer to the kinds of meaning that require language to
express. Consider the following, for example:
[seeing.action [subject John] [object [definite.thing book]]]
This constructs the idea of a seeing.action encompassing John classified as subject, and a book classified as object, in which the book is also classified as a
definite.thing. What is constructed is the idea of some individual John6 seeing
a definite book. This is something that we would express in English by saying
John sees the book.
A slightly more complex case is
[[past.behavior reading.action] [subject John] [definite.thing letter]]
This is a similar construction except that here the object is a letter, the action
is reading rather than seeing, and this is itself classified as past.behavior. The
effect is to place the reading action into the past, realizing a meaning that we
would express in English by saying John read the letter. This illustrates the way
the meaning of a tense can be captured.
A still more elaborate example is
[yesterday.event
[giving.action
[subject [indefinite.thing man]]
[object bread]
[indirect.object John] ] ]
Key to the meaning of this four-level construction is the first encompassed concept. Itself a structure, this expresses the idea of a giving action encompassing
an indefinite man (classified as subject), bread (classified as object) and John
classified as an indirect object. What this yields is the idea of an event in which
an indefinite man gives bread to John. This event is then itself classified as a
‘yesterday.event’, i.e., an event occurring yesterday. The final product is the
idea of a man giving bread to John yesterday, a meaning we could express in
English by saying Yesterday a man gave bread to John.
Some specialized forms of meaning, such as questions, can also be captured.
Consider this, for example:
[question
[[event drinking.action focal.thing]
[subject [definite.thing teacher]]
[object [definite.thing [substance water]] ] ] ]

The central concept here is

[event drinking.action focal.thing]

This expresses the idea of an event encompassing a drinking.action and a focal.thing. Encompassed by this are a teacher and water, classified as subject
and object respectively (and each also classified as a definite thing). This idea is then itself
classified as a question. The final result is thus (the idea of) a question that asks
whether a definite teacher is drinking some definite water. This is something
we could express in English by asking Is the teacher drinking the water?

6 For present purposes, names of individuals are taken to name the concept of the individual
in question.
These latter examples give a sense of the semantic range that hierarchical
conceptualization has when applied recursively, and with utilization of singles
and relational constructions. We see that the complex, structured meanings
we express using language can sometimes be produced by a purely conceptual
mechanism, operating independently of language. Language is a medium which
allows complex, compositional meanings to be assembled. So too is hierarchical
conceptualization. The conceptual system and the linguistic system have this
in common. What is implied for their relationship can then be examined.
3 Hierarchical conceptualization versus hierarchical language
Does the ability to form hierarchical linguistic constructions depend in some
way on the ability to form conceptual constructions? Or are these two areas of
functionality independent? The linguistic and conceptual systems are normally
viewed as separate faculties of the mind. So the least controversial assessment
is to assume there is no functional dependency. This is certainly a legitimate
way of interpreting the evidence. Clearly the conceptual system must have the
capacity to represent the meanings we express through language. If it does so,
in some cases, by deploying a constructive mechanism that happens to be hierarchical, compositional and infinitely productive, this might simply be because
a language-like approach is the one that works best.
But when particular cases are examined, the situation becomes less clear-cut. If we compare a hierarchical linguistic (i.e., grammatical) structure with
the hierarchical conceptual structure that best captures its meaning, there is
an obvious correspondence. The two structures cannot be formally isomorphic,
due to the difference in the ordering of branches: in the latter, branches are not
in any order, whereas in the former, the ordering of branches is a key part of the
specification. But a basic likeness is always apparent. This can be demonstrated
by showing the way in which one structure can be derived from the other.
Consider again the English sentence ‘John sees the book.’ This takes the
form of a verb phrase in which ‘sees’ is the verb, ‘John’ is the subject, and ‘the
book’ is the object. The grammatical structure can be analyzed as follows:
(VP sees (N John) (NP (DET the) (N book)))
To obtain a conceptual equivalent of this structure, we proceed as follows. First,
we replace each grammatical constituent with a semantically equivalent conceptual constituent. In the simplest case, the constituent is a content word, and
the replacement is just the corresponding concept. The two content words here
are ‘John’ and ‘book’, but both can be seen as concept names in their own right,
so no change is needed. The second step involves replacing each grammatical
construction with a semantically equivalent conceptual construction. This is
a hierarchical combination in which the encompassing concept has the grammatical meaning that the grammatical construction imposes on its subordinate
elements. In this case there are two such constructions: the verb phrase and the
noun phrase. In the latter, the word ‘the’ is a definite article (an ART.DEF).
This imposes the meaning of being a definite thing on the subordinate entity
‘book’. This construct is thus replaced with
[definite.thing book]
The verb ‘sees’ imposes the meaning of a seeing action on the subordinate
subject and object. This is thus replaced with the construction
[seeing.action John [definite.thing book]]
This expresses the idea of a seeing action encompassing John and a definite
book. Still needed is discrimination of subject and object. This is where the
grammatical structure captures meaning by the ordering of branches. The SVO
ordering of the verb phrase has the effect of classifying ‘John’ as subject, and
‘the book’ as object. Branches in the conceptual structure are unordered, so
for the conversion, we need to add in the relevant classifications. The structure
then becomes
[seeing.action [subject John] [object [definite.thing book]]]
With John classified as subject, and the book as object, this expresses the idea
of John seeing a definite book, which is the meaning of ‘John sees the book.’
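As an illustration of this conversion, here is a toy sketch, Python assumed, hard-wired to this single example; the small lexicon and the two helper functions are hypothetical stand-ins for whatever general mechanism performs the replacements.

# A toy sketch of the grammatical-to-conceptual conversion described above,
# specific to 'John sees the book'; names and lexicon are illustrative only.
CONCEPT = {"John": "John", "book": "book",
           "sees": "seeing.action", "the": "definite.thing"}

def convert_np(np):
    # (NP (DET the) (N book)) -> [definite.thing book]
    (_, (_, det), (_, noun)) = np
    return [CONCEPT[det], CONCEPT[noun]]

def convert_vp(vp):
    # (VP sees (N John) (NP ...)) -> [seeing.action [subject John] [object ...]]
    (_, verb, (_, subj), obj_np) = vp
    return [CONCEPT[verb],
            ["subject", CONCEPT[subj]],      # classifications added explicitly,
            ["object", convert_np(obj_np)]]  # since conceptual branches are unordered

parse_tree = ("VP", "sees", ("N", "John"), ("NP", ("DET", "the"), ("N", "book")))
print(convert_vp(parse_tree))
# ['seeing.action', ['subject', 'John'], ['object', ['definite.thing', 'book']]]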
Converting a linguistic structure to an equivalent conceptual structure is a
relatively simple procedure, then. Individual nodes of the linguistic structure
are replaced with their conceptual counterparts, and explicit classifications are
added to capture any meanings conveyed by ordering. The procedure can also be
applied in reverse. This involves replacing conceptual entities with their grammatical counterparts, and explicit classifications with branch-orderings where
possible. This has to be done in accordance with applicable grammatical preferences, both in regard to the way concepts are symbolized, and to ordering
preferences, e.g., the preference (in this case) for SVO ordering. The reverse
conversion has to be performed in light of grammatical and lexical preferences,
in other words.
Since the grammatical structure of an utterance also defines the utterance
itself, a functional dependency between the conceptual and linguistic systems
can then be envisaged. Production of an utterance might be the result of converting a conceptual structure into a linguistic form by applying the relevant
grammatical preferences. On this basis, the conceptual and linguistic systems
are potentially tightly coupled, with the latter being functionally dependent on
the former. Rather than it being the linguistic system which does all the work
in language production, it is seen to be the two systems working cooperatively.
Utterances are produced by an assembly-line process, the first stage of which is
the work of the conceptual system. To what degree is a cooperative regime of
this form viable in practice? The following section sets out a number of worked
examples that showcase its feasibility.
4 Assembly-line language production
We can begin by looking at derivation of an utterance from the structure specified above:
[seeing.action [subject John] [object [definite.thing book]]]
This realizes the idea of John seeing a particular book, a meaning that is expressed in English by the utterance John sees the book. To show how realization
of the conceptual structure might shape production of the utterance, we need
to show how mapping this structure to a symbol sequence in accordance with
applicable grammatical preferences yields the sentence John sees the book.
The mapping can be specified using rewrite rules, each of which captures one
of the preferences applying. The mapping can then be performed by applying
the rules to the structure. Rules are needed to specify both the lexical and
organization preferences that apply. For each concept cited in the structure, we
need a rule giving the symbol that is the preferred encoding of the concept in the
context arising. In addition, we need a rule to specify each ordering preference.
The lexical preferences here are as follows.
book ← book
the ← definite.thing
John ← John
sees ← seeing.action
Each rule has a concept specification on the right, and the preferred English
symbol (i.e., word) on the left. The specified symbol is the preferred encoding
of the concept in the context of this particular utterance. The preferred symbol
for this particular usage of the concept of a book is book, for example.
There are also three organizational preferences that apply: (1) the preference
for SVO ordering, (2) the preference to classify subject and object by ordering,
and (3) the preference to use head-initial organization in simple phrases. These
can be captured by means of the following three rules.
2 1 3   ←a   [= subject object]
2       ←b   [subject/object =]
1 2     ←c   [= =]

These rules are notated on a right-to-left basis, with labels on the arrows for easy reference. The first rule (labeled ←a) expresses the preference for SVO organization; the second (labeled ←b) expresses the preference to encode subject and object classifications by means of ordering, while the third (labeled ←c) expresses the preference for head-initial organization in simple phrases.

The notation works in the following way. Rule ←a applies to any conceptual structure of the form

[= subject object]

The structure can have anything as its encompassing element (the ‘=’ is a wildcard) but the encompassed elements must comprise subject and object concepts. (As this is a concept specification, the encompassed elements have no order.) A subject concept is specified either by name (i.e., as ‘subject’) or as a construct for which ‘subject’ is encompassing either explicitly or implicitly. This means [subject John] is a subject concept, as is [definite.thing [subject John]].

The numbers on the left of a rule specify the way symbols should be ordered. Each number indexes an element of the specification on the right, while its position says where symbols arising for that element should be placed. Rule ←a has ‘2 1 3’ on the left. This means symbols arising for whatever matches the 2nd element should be placed first, followed by symbols for whatever matches the 1st element, followed by symbols for whatever matches the 3rd element. Given the structures it can match to, this rule captures the preference for SVO ordering.

Rule ←b uses ‘/’ to denote alternatives. The specification matches any structure in which the first element is either a subject or object concept. The designation on the left is just ‘2’, meaning only the encompassed element is symbolized. Given the structures it can match to, this rule captures the preference for expressing subject and object classifications implicitly. The final rule deals with any single (i.e., any concept with a single encompassed element). It specifies that symbol(s) for the encompassing element should be placed before symbol(s) for the encompassed element. Given its coverage, this rule expresses the preference for head-initial organization in simple phrases. Earlier rules take precedence, so ←a has priority over ←b, which has priority over ←c.
Applying these rules to the conceptual structure, we then obtain the following processing:
→ [seeing.action [subject John] [object [definite.thing book]]]
    → [subject John]
        ← John (John)
    ←b John
    ← sees (seeing.action)
    → [object [definite.thing book]]
        → [definite.thing book]
            ← the (definite.thing)
            ← book (book)
        ←c the book
    ←b the book
←a John sees the book
Notice the general effect of the processing is to break the conceptual structure
down into its terminal elements. (Indentation is used to represent embedding.)
At the limit of each decomposition, a concept name is translated to its preferred symbol. As the recursion unwinds, the relevant ordering rule is applied
at each stage. The effect is to map the conceptual structure to symbols in accordance with grammatical preferences. The symbol sequence John sees the book
is obtained as final output.
The layout of the listing is as follows. For each application of a lexical
preference, there is a line which ends with the relevant concept name. For
example, mapping of the concept name ‘book’ to the symbol book is denoted by
the line
← book (book)
For each application of an organizational preference, there is a line showing the
concept that is processed and—at the same indentation below—a second line
showing the symbol sequence assembled. Use of rule ←c to turn [definite.thing
book] into the book thus has an upper line of the form

→ [definite.thing book]

and a lower line of the form

←c the book
The example illustrates the way production of an utterance can be analyzed as
a cooperative effort between the conceptual and linguistic systems. The conceptual system is seen to contribute the original hierarchical structure. Mapping
of this structure to a sequence of symbols in accordance with grammatical preferences then yields the final utterance. The analysis thus comprises the
original conceptual structure, and the grammatical preferences that define the
mapping. Grammatical rules play no role.
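To make the preference-driven mapping concrete, here is a minimal sketch, Python assumed. Conceptual structures are written as nested lists with the encompassing concept first, the two tables are transcriptions of the lexical and organizational preferences given above, and the function names (is_a, linearize) are illustrative rather than anything proposed in the text. For brevity, pattern slots are matched against the structure as written, whereas strictly the encompassed elements have no order.

# A sketch of preference-driven linearization; illustrative, not a definitive
# implementation of the assembly-line model.
LEXICAL = {
    "book": "book",
    "definite.thing": "the",
    "John": "John",
    "seeing.action": "sees",
}

# (label, pattern, output order); earlier rules take precedence.
ORDERING = [
    ("a", ["=", "subject", "object"], [2, 1, 3]),  # SVO ordering
    ("b", ["subject/object", "="],    [2]),        # drop the classification (conveyed by position)
    ("c", ["=", "="],                 [1, 2]),     # head-initial singles
]

def is_a(x, name):
    """Does x count as a `name` concept? Only the explicit case is handled."""
    if name == "=":                  # wildcard
        return True
    if isinstance(x, str):
        return x in name.split("/")  # '/' separates alternatives
    return is_a(x[0], name)          # a combination counts via its encompassing concept

def linearize(x, rules=ORDERING, lexicon=LEXICAL):
    if isinstance(x, str):
        return [lexicon[x]]          # lexical preference: concept -> symbol
    for _, pattern, order in rules:  # first matching ordering preference wins
        if len(pattern) == len(x) and all(is_a(e, p) for e, p in zip(x, pattern)):
            out = []
            for i in order:          # 1-based indices into the matched structure
                out += linearize(x[i - 1], rules, lexicon)
            return out
    raise ValueError(f"no ordering preference matches {x!r}")

structure = ["seeing.action", ["subject", "John"],
             ["object", ["definite.thing", "book"]]]
print(" ".join(linearize(structure)))  # -> John sees the book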
More complex examples can also be generated. All the later examples of the
previous section can be put to use in this way. We can also vary the language
in which the output comes to be expressed. Consider again the utterance ‘John
read the letter.’ Translated into Japanese, this becomes
Johnga tegamio yonda.
For an assembly-line analysis of this sentence we require a hierarchical concept combination which expresses the correct meaning. As previously noted, a
conceptual construction with the meaning of ‘John read the letter’ is
[[past.behavior reading.action] [subject John] [definite.thing letter]]
The grammatical preferences that apply in this context then need to be specified.7 The following six rules capture the relevant lexical preferences:
John ← John
tegami ← letter
yon ← reading.action
da ← past.behavior
ga ← subject
o ← definite.thing
Two organizational preferences are also applicable: the preference for SOV ordering, and the preference for head-final organization in simple phrases. These
can be captured as follows:
2 3 1   ←a   [= subject definite.thing]
2 1     ←b   [= =]

Given the concepts it can match to, use of ‘2 3 1’ in rule ←a expresses the preference for SOV organization, while rule ←b expresses the preference for head-final organization in simple phrases.
With these preferences specified, mapping of the conceptual structure to a
symbol sequence proceeds as follows:
→ [[past.behavior reading.action] [subject John] [definite.thing letter]]
    → [subject John]
        ← John (John)
        ← ga (subject)
    ←b John ga
    → [definite.thing letter]
        ← tegami (letter)
        ← o (definite.thing)
    ←b tegami o
    → [past.behavior reading.action]
        ← yon (reading.action)
        ← da (past.behavior)
    ←b yon da
←a John ga tegami o yon da

With conventional word-breaks imposed, the output is Johnga tegamio yonda,
which is the original Japanese sentence.

7 These are derived from the analysis of (Kuno, 1973, p. 10).

    John-ga      tegami-o     yon-da
    John-SUBJ    letter-OBJ   read-PST
    ‘John read the letter’

See also WALS, Ch. 82, Ex. 2.
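Under the same assumptions, the Japanese analysis amounts to swapping in the two rule tables just listed; the snippet below reuses the hypothetical linearize sketch given earlier for English and is, again, purely illustrative.

# Japanese preferences transcribed for the hypothetical linearize sketch above.
LEXICAL_JA = {"John": "John", "letter": "tegami", "reading.action": "yon",
              "past.behavior": "da", "subject": "ga", "definite.thing": "o"}

ORDERING_JA = [
    ("a", ["=", "subject", "definite.thing"], [2, 3, 1]),  # SOV ordering
    ("b", ["=", "="],                         [2, 1]),     # head-final singles
]

structure_ja = [["past.behavior", "reading.action"],
                ["subject", "John"], ["definite.thing", "letter"]]
print(" ".join(linearize(structure_ja, ORDERING_JA, LEXICAL_JA)))
# -> John ga tegami o yon da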
Another example taken from the previous section is
[yesterday.event
[giving.action
[subject [indefinite.thing man]]
[object bread]
[indirect.object John] ] ]
Recall that this builds the idea of an indefinite man giving bread to an individual, John, at a particular point in time, namely yesterday. The meaning is one
we could express in English by saying ‘Yesterday a man gave bread to John.’ A
sentence from Arawak, a language of Suriname, with a not dissimilar meaning
is
Miaka aba wadili sika khali damyn.
This means ‘yesterday a man gave cassava bread to me’. To capture this meaning, the conceptual structure above needs to be modified in two ways. The
recipient of the action needs to be specified as ‘me’ rather than ‘John’, and the
object needs to be ‘cassava.bread’ rather than ‘bread’. This yields
[yesterday.event
[giving.action
[subject [indefinite.thing man]]
[object cassava.bread]
[indirect.object me] ] ]
To complete the analysis of this Arawak sentence, we now need to show how
mapping the structure in accordance with grammatical preferences produces a
grammatically correct expression.8 The lexical preferences which arise are
khali ← cassava.bread
sika ← giving.action
aba ← indefinite.thing
myn ← indirect.object
wadili ← man
da ← me
miaka ← yesterday.event

8 Preferences are derived from the analysis of (Pet, 1987).

    Miaka       aba     wadili   sika   khali            da-myn
    yesterday   INDEF   man      give   cassava.bread    1SG-to
    ‘Yesterday a man gave cassava.bread to me’

See also WALS, Ch. 84, Ex. 4.
The organizational preferences of Arawak that apply in this case are captured
by the following four rules. The first states the preference for SVO organization;
the second, the preference for dealing with subject and object classifications by
ordering; the third, the preference for head-final organization in simple phrases
denoting an indirect object; and the last, the default preference for head-initial
organization.

2 1 3 4   ←a   [= subject object indirect.object]
2         ←b   [subject/object =]
2 1       ←c   [indirect.object =]
1 2       ←d   [= =]
Mapping of the conceptual structure then proceeds as follows. (Some lines are
truncated.)
→ [yesterday.event [giving.action [subject [indefinite.thing man]] ...
    ← miaka (yesterday.event)
    → [giving.action [subject [indefinite.thing man]] [object ...
        → [subject [indefinite.thing man]]
            → [indefinite.thing man]
                ← aba (indefinite.thing)
                ← wadili (man)
            ←d aba wadili
        ←b aba wadili
        ← sika (giving.action)
        → [object cassava.bread]
            ← khali (cassava.bread)
        ←b khali
        → [indirect.object me]
            ← da (me)
            ← myn (indirect.object)
        ←c da myn
    ←a aba wadili sika khali da myn
←d miaka aba wadili sika khali da myn
With standard word-breaks imposed, the output is Miaka aba wadili sika
khali damyn, which is the original Arawak sentence.
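The Arawak case runs through the same hypothetical sketch; what changes is that there are now four ordering preferences, with the special-case rule for indirect objects ranked above the head-initial default. The transcription is illustrative only and assumes the earlier definitions are in scope.

# Arawak preferences transcribed for the hypothetical linearize sketch above.
LEXICAL_AR = {"cassava.bread": "khali", "giving.action": "sika",
              "indefinite.thing": "aba", "indirect.object": "myn",
              "man": "wadili", "me": "da", "yesterday.event": "miaka"}

ORDERING_AR = [
    ("a", ["=", "subject", "object", "indirect.object"], [2, 1, 3, 4]),
    ("b", ["subject/object", "="],                       [2]),
    ("c", ["indirect.object", "="],                      [2, 1]),  # head-final here
    ("d", ["=", "="],                                    [1, 2]),  # head-initial default
]

structure_ar = ["yesterday.event",
                ["giving.action",
                 ["subject", ["indefinite.thing", "man"]],
                 ["object", "cassava.bread"],
                 ["indirect.object", "me"]]]
print(" ".join(linearize(structure_ar, ORDERING_AR, LEXICAL_AR)))
# -> miaka aba wadili sika khali da myn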
To complete this series of examples, it is useful to look at
[question
[[drinking.action focal.thing]
[subject [definite.thing teacher]]
[object [definite.thing [substance water]] ] ] ]
Recall that this builds an idea that would be expressed in English by asking the
question ‘Is the teacher drinking the water?’ Say we would like to develop an
analysis of this utterance expressed in German. The utterance then takes the
form
Trinkt der lehrer das wasser?
The lexical preferences9 that arise are as follows:
lehrer ← teacher
wasser ← water
trink ← drinking.action
das ← [definite.thing substance]
der ← definite.thing
t ← focal.thing
Notice the use of a structured specification in the case of das. This is required
to capture the restricted range of this determiner. In addition, there are four
organizational preferences, captured by the four rules below. The first states the
preference to encode subject and object classifications by ordering; the second,
the preference for VSO structure given a meaning classified as a question; the
third, the preference for SVO organization otherwise, and the last, the preference
for head-initial organization in simple phrases.
2       ←a   [subject/object =]
2 3 4   ←b   [question [= subject object]]
2 1 3   ←c   [= subject object]
1 2     ←e   [= =]
Mapping of the conceptual structure in accordance with these preferences then
proceeds as follows:
→ [question [[drinking.action focal.thing] [subject ...
    → [drinking.action focal.thing]
        ← trink (drinking.action)
        ← t (focal.thing)
    ←e trink t
    → [subject [definite.thing teacher]]
        → [definite.thing teacher]
            ← der (definite.thing)
            ← lehrer (teacher)
        ←e der lehrer
    ←a der lehrer
    → [object [definite.thing [substance water]]]
        ← das ([definite.thing substance])
        ← wasser (water)
    ←a das wasser
←b trink t der lehrer das wasser

Once standard word-breaks have been imposed, the output is seen to be the
original German question: Trinkt der lehrer das wasser?

9 Preferences are derived from the analysis of (Dryer and Haspelmath, 2011, Ch. 116, Ex. 6).

    Trink-t     der   lehrer    das   wasser
    drink-3SG   DEF   teacher   DEF   water
    ‘Is the teacher drinking the water’
If we remove the question classification in the original conceptual structure,
only the inner conceptual structure remains:
[[drinking.action focal.thing]
[subject [definite.thing teacher]]
[object [definite.thing [substance water]] ] ]
This realizes a meaning we might express in English by stating that ‘the teacher
is drinking the water.’ Applying the rules to this reduced structure then invokes
the default SVO ordering, as follows:
→ [[drinking.action focal.thing] [subject ...
    → [subject [definite.thing teacher]]
        → [definite.thing teacher]
            ← der (definite.thing)
            ← lehrer (teacher)
        ←e der lehrer
    ←a der lehrer
    → [drinking.action focal.thing]
        ← trink (drinking.action)
        ← t (focal.thing)
    ←e trink t
    → [object [definite.thing [substance water]]]
        ← das ([definite.thing substance])
        ← wasser (water)
    ←a das wasser
←c der lehrer trink t das wasser
With word-breaks imposed, the output is then der lehrer trinkt das wasser,
which is ‘the teacher is drinking the water’ in German.
What these examples demonstrate is the feasibility of cooperative language
production. They show that, at least in some cases, it is possible to explain production of an utterance as an assembly-line process. According to this model,
the conceptual and language systems engage in carefully coordinated joint activity. The conceptual system contributes the initial, semantic object. The
language system then provides this with a linear encoding. The point of interest is that it is grammatical preferences (e.g., for use of SVO ordering) which are
brought into play, not grammatical rules. The grammaticality of the derived
utterance is emergent. It stems from the way the hierarchical structuring of
the conceptual object interacts with the preference-driven mapping. Grammatical rules play no role, other than as a way of legitimizing the output that is
obtained.
5 Discussion
With the language-like properties of hierarchical conceptualization taken into
account, it is natural to consider whether the process might contribute to language itself. As noted in the previous section, a case can be made that it does.
Utterances might be produced via an assembly-line arrangement, in which the
conceptual system supplies the hierarchical structure, and the language system
adds the linear encoding. But any new conception of the relationship between
thought and language must clearly take stock of what has previously been proposed in this context. There is a vast literature relating to this relationship, as
it has long been the subject of debate. Theorists divide into two main camps,
one favouring the idea that it is thought which shapes grammar, and the other
that it is grammar which shapes thought. Those in the former camp include
(Bates and MacWhinney, 1979, 1987; Bickerton, 1990; Fodor, 2001; Elman,
2004; Kirby, 1999; Jackendoff, 2002; Tomasello, 2003; Clair et al., 2009; Chater
and Christiansen, 2010; Tomasello, 2008). Those in the latter include (Fitch
and Chomsky, 2005; Chomsky, 2012, 2009; Hauser, 2009; Hinzen, 2009).
Theorists in what we might call the ‘grammar from thought’ camp argue
that a mechanism for expressing complex thought could only have evolved after
the capacity for such thought was established. Various grounds for this are
given. A key point was made by Darwin, as follows:
The mental powers of some early progenitor of man must have been
more highly developed than in any existing ape, before even the
most imperfect form of speech could have come into use. (Darwin,
1871, p. 57)
There is also evolutionary evidence which suggests complex thought must
have evolved before the capacity for language (Donald, 1991; Mithen, 1996).
The ontological priority of thought over language also seems difficult to deny.
As Pinker comments, ‘if thoughts depended on words, how could a new word
ever be coined?’ (Pinker, 1994, p. 58)
What develops out of this is the assumption that properties of language
must ‘arise from the structure of the thoughts language is required to express’
(Christiansen and Chater, 2008, p. 501). This entails that linguistic mechanisms somehow exploit more general cognitive functionality. As Penn et al. see
it, the arrangement must be one in which
our faculty for language relies largely on domain-general cognitive
systems that originally evolved for other purposes and still perform
these non-linguistic functions to this day. (Penn et al., 2009, p. 463)
The practical difficulty for this camp is the absence of a theory of thought
that would show how this implementational process might work. As Hinzen
notes, the key requirement for the grammar-from-thought thesis is ‘to account
for a mode of thought that appears inherently grammatical’ (Hinzen, 2012, p.
647). No such theory exists as yet; although, since such a theory would have
to explain the infinite productivity of thought (Fodor, 2008), we can assume
it would render thought in some way compositional. In this vein, Christiansen and
Chater (2009) argue that
whatever the nature of our mental representations, they apparently
afford an infinite range of different thoughts, promoting the likely
emergence of compositionality in language. (Christiansen and Chater,
2009, p. 452)
Theorists in the alternative ‘thought from grammar’ camp see things very
differently. Chomsky argues that ‘thought itself is fundamentally linguistic’
(Corballis, 2014, p. xiii), on which basis, grammar can also be seen as a theory of
thought. This is a point particularly stressed by Hinzen, who argues that ‘insofar
as our mode of thought is species-specific and needs to find an explanation,
grammar is the most likely such explanation’ (Hinzen, 2012, p. 646). Generative
grammar is seen to mediate not only language but also thought:
it ‘is the essential mechanism we need, with no additional and independent
language of thought required’ (Hinzen, 2012, p. 647).10
Here too a fundamental difficulty is encountered, however. How a mechanism
that is characteristically linguistic in nature could give rise to abstract thought
begs clarification. Admittedly, the problem is mitigated to some extent by
bringing into play the concept of Universal Grammar (UG), a formulation that
abstracts away language-specific details.11 Even so,
A theory of UG will ultimately have to provide a more detailed
account of how exactly the grammatical combinatorics feed into a
new mode of thought apparently unique to our species, and the
ability to generate a new and infinite range of grammatical meanings.
(Hinzen, 2012, p. 639)

10 Hinzen draws attention to the way grammar and thought have often been treated as
unified in the past: ‘According to the 7th century Indian thinker Bhartrhari, there is no
thought without language, and for something to count as knowledge it has to be given a
linguistic form’ (Hinzen, 2012, p. 636).
11 In Chomsky’s view, ‘UG primarily constrains the “language of thought,” not the details
of its external expression’ (Chomsky, 2007, p. 22).
The relatively recent Minimalist approach (Chomsky, 1995, 2001; Hauser et
al., 2002) potentially offers a way forward. This aims to reduce UG to a single
recursive constructive operator (termed ‘Merge’) which can work with entities
of any type. This is seen as providing an internal generative system capable of
implementing thought. In Chomsky’s view,
Emergence of unbounded Merge in human evolutionary history provides what has been called a language of thought, an internal generative system that constructs thoughts of arbitrary richness and
complexity, exploiting conceptual resources that are already available or may develop with the availability of structured expressions.
(Chomsky, 2007, p. 22)
Again, the implementation details remain sketchy, however. It is easy to
see how an operator that is ‘hierarchical, generative, recursive, and virtually
limitless with respect to its scope of expression’ (Hauser et al., 2002, p.1569)
might play an important role in thought. How it would implement thought in
general is less obvious. Hinzen argues that a ‘science of grammar is or can be a
science of human thought because it uncovers the principles and organization of
thought, and it can do so because our mode of thought is uniquely grammatical.’
(Hinzen, 2012, p. 642) But one foundation of the argument is the assumed lack of
a non-grammatical way of realizing the meanings we express through language.
Hinzen, for example, argues that
we see new forms of meaning arising in ways that exactly reflect a
narrow range of grammatical operations and options. The claim here
is that there is no known non-grammatical way in which meanings
of that specific kind arise, or on what generative system they should
be based. (Hinzen, 2012, p. 646)
This, then, is a convenient moment to bring in the present proposal.
The existence of a non-grammatical way of realizing such meanings (namely hierarchical conceptualization) allows that ‘grammar from thought’ and ‘thought
from grammar’ might both be correct. Hinzen himself points out that a more parsimonious explanation of the relationship between grammar and thought would
turn ‘the latter into the generative mechanism behind the former’ (Hinzen, 2012,
p. 637). This is what the present proposal aims to do. Conceptualization is
recognized to create hierarchical structures12, while grammatical language is
seen to result from the way these structures are given sequential encoding. On
this basis the posited relationship might be characterized as ‘grammar from
sequential encoding of thought.’

12 The generativity of conceptualization is also recognized elsewhere, of course. Jackendoff,
for example, comments that the conceptual system is a ‘combinatorial system independent
of, and far richer than, syntactic structure’ (Jackendoff, 2002, p. 123), with the conclusion
being that meaning would then ‘be the first generative component of language to emerge’
(Jackendoff, 2003, p. 664).
The sense in which this hybrid interpretation accommodates the ‘grammar
from thought’ thesis will be apparent. But notice the ‘thought from grammar’
claim is also upheld to some degree. According to the proposed assembly-line
scheme, the sequential patterns we observe in language are an emergent phenomenon stemming from hierarchical conceptualization and sequential encoding. Each sequential pattern is seen to be the result of the way a particular
hierarchical conceptualization combines with a particular method of encoding.
Each grammatical rule is seen to provide such a combination with a dedicated
description that conflates the two factors. But if we let ‘grammar’ also mean
the ordering preferences that define the encoding, and allow that utterances
can have a knock-on effect on thought, there is a sense in which thought is also
shaped by grammar. The way in which thought comes to be expressed is mediated by grammatical ordering preferences. Assuming expressions of present
thought must influence future thought, a ‘thought from grammar’ effect is seen
to exist.
The assembly-line scheme can integrate the ‘grammar from thought’ and
‘thought from grammar’ theses, then. The reconciliation is achieved by turning
thought into the generative mechanism behind language, in the way Hinzen
envisages. But does this really lead to an explanation that is more parsimonious?
On the face of it, the assembly-line model seems more cumbersome. To analyze
production of an utterance, we now need two formalisms, one dealing with
hierarchical conceptualization, and one with ordering preferences. Neither is at
all familiar. Indeed, grammatical preferences (e.g., for verb phrase ordering) are
regarded as inherently mysterious (Polinsky, 2012). But against this cost, the
approach does offer some advantages. The requirement for grammatical rules
is eliminated, for example. The need for a traditional syntactic formalism is
avoided altogether. This simplifies the analysis, and in a way that is attractive
from the psycholinguistic viewpoint. Competent use of a language need not be
seen as relying on unconscious deployment of grammatical rules. Rather, it can
be seen as stemming from a capacity for hierarchical conceptualization coupled
with an ability to map conceptual structures to symbol sequences.
The approach also avoids the need to deal with semantics separately.13 Semantics becomes part of syntax. Traditionally, an utterance is seen to have a
syntactic structure (mandated by the grammar of the language) and separately,
a semantics (Pinker, 1994). Hence the syntax v. semantics distinction.14 A
complete analysis of an utterance should then identify not only its syntactic
structure, but also its semantic properties. In the assembly-line model, the
two components become one. A hierarchical conceptualization provides the semantics. But this is, at the same time, the underlying form of the syntactic
structure. The syntactic structure comes from mapping the semantic structure
to a sequence in accordance with prevailing grammatical preferences. It is the
semantic structure that provides the skeleton on which the syntactic structure
is built.

13 Dealing with semantics at all has been characterized as the ‘final frontier’ of linguistics
(Fitch, 2009, p. 306).
14 Hinzen comments that ‘the spirit of the autonomy of syntax clearly lives on’ (Hinzen,
2012, p. 638).
The essence of the assembly-line model is thus that the linguistic system
‘piggybacks’ on the conceptual system. Constructive functionality of the latter
is seen to be directly used within the former. The two systems are seen as
functionally fused. Whether such an arrangement exists in practice can only be
established by empirical investigation, it hardly needs to be said. The present
paper provides a proof of concept, no more. Various ways of investigating the
situation can be envisaged, however. An interesting property of the model is
that it provides a convenient way of generating artificial languages. By converting conceptual structures into utterances according to some random selection of
grammatical preferences, we can generate an arbitrarily large ‘sample’ of sentences from a language that does not exist. The prediction of the theory is
that an invented language of this form should pass muster as a natural language when scrutinized by a linguist unaware of its true origin. The theory also
predicts that an artificial language constructed in this way should be no more
difficult to acquire than a natural language. Both predictions could be put to
the test. Confirmation or disconfirmation of either would shed some light on
the practical viability of the proposed model. It is hoped future work will make
some progress towards this goal.
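As a pointer to how such an investigation might be set up, the sketch below (Python assumed, reusing the hypothetical linearize function and English LEXICAL table from Section 4) draws a clause ordering and a head direction at random and linearizes a stock of conceptual structures with them; it is illustrative only, not a proposed experimental design.

# Generating a 'sample' of an invented language from random ordering preferences.
import random

def random_preferences():
    clause = [1, 2, 3]
    random.shuffle(clause)                   # random permutation of verb, subject, object
    head = random.choice([[1, 2], [2, 1]])   # head-initial or head-final singles
    return [("a", ["=", "subject", "object"], clause),
            ("b", ["subject/object", "="],    [2]),
            ("c", ["=", "="],                 head)]

structures = [["seeing.action", ["subject", "John"],
               ["object", ["definite.thing", "book"]]]]

rules = random_preferences()
for s in structures:
    print(" ".join(linearize(s, rules, LEXICAL)))
# e.g. "book the John sees" when the draw is object-subject-verb with head-final singles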
References
Bates, E. and MacWhinney, B. (1979). A functionalist approach to the acquisition of grammar. In Ochs and Schieffelin (Eds.), Developmental Pragmatics
(pp. 167-209), Academic Press.
Bates, E. and MacWhinney, B. (1987). Competition, Variation, and Language
Learning. In MacWhinney (Ed.), Mechanisms of Language Acquisition (pp.
157-93), Erlbaum.
Bickerton, D. (1990). Language and species, Chicago: University of Chicago
Press.
Carey, S. (2009). The Origin of Concepts, Oxford University Press.
Chater, N. and Christiansen, M. H. (2010). Language Acquisition Meets Language Evolution. Cognitive Science, 34, No. 7 (pp. 1131-1157).
Chomsky, N. (1995). The Minimalist Program, The MIT Press.
Chomsky, N. (2001). Derivation by Phase. Ken Hale: A Life in Language (pp.
1-52), The MIT Press.
Chomsky, N. (2007). Of Minds and Language. Biolinguistics, 1 (pp. 9-27).
Chomsky, N. (2009). Concluding Remarks. In Piattelli-Palmarini, Uriagereka
and Salaburu (Eds.), Of Minds and Language: A dialogue with Noam Chomsky in the Basque Country (pp. 379-409), Oxford University Press.
Chomsky, N. (2012). The Science of Language: Interviews with James McGilvray,
Cambridge: Cambridge University Press.
Christiansen, M. H. and Chater, N. (2008). Language as shaped by the brain.
Behavioral and Brain Sciences, 31 (pp. 489-558).
Christiansen, M. H. and Chater, N. (2009). The myth of language universals
and the myth of universal grammar. Behavioral and Brain Sciences, 32 (pp.
452-453).
Clair, M. C. S., Monaghan, P. and Ramscar, M. (2009). Relationships Between
Language Structure and Language Learning: The Suffixing Preference and
Grammatical Categorization. Cognitive Science, 33, No. 7 (pp. 1317-1329).
Corballis, M. C. (2014). The Recursive Mind: The Origins of Human Language, Thought, and Civilization, Princeton and Oxford: Princeton University Press.
Costello, F. J. and Keane, M. T. (2001). Testing Two Theories of Conceptual
Combination: Alignment versus Diagnosticity in the Comprehension and
Production of Combined Concepts. Journal of Experimental Psychology:
Learning, Memory and Cognition, 27, No. 1 (pp. 255-271).
Darwin, C. (1871). The Descent of Man and Selection in Relation to Sex, John
Murray.
Donald, M. (1991). Origins of the Modern Mind: Three Stages in the Evolution
of Culture and Cognition, Cambridge, Mass: Harvard University Press.
Dryer, M. S. and Haspelmath, M. (eds.) (2011). The World Atlas of Language
Structures Online, Munich: Max Planck Digital Library. Package online at
http://wals.info/ Accessed on 2013-04-06.
Elman, J. L. (2004). An alternative view of the mental lexicon. Trends in
Cognitive Sciences, 8 (pp. 201-206).
Fitch, W. T. and Chomsky, N. (2005). The Evolution of the Language Faculty:
Clarifications and Implications. Cognition, 97, No. 2 (pp. 179-210).
Fitch, W. T. (2009). Prolegomena to a future science of biolinguistics. Biolinguistics, 3, No. 4 (pp. 283-320).
Fodor, J. (2001). Language, Thought, and Compositionality. Mind Language,
16, No. 1 (pp. 1-15).
Fodor, J. A. (2008). LOT 2: The Language of Thought Revisited, Oxford University Press.
Gentner, D. (1983). Structure-Mapping: A Theoretical Framework for Analogy.
Cognitive Science, 7 (pp. 155-170).
Hampton, J. A. (1991). The Combination of Prototype Concepts. In Schwanenflugel (Ed.), The Psychology of Word Meanings, Hillsdale, N.J: Lawrence
Erlbaum Associates.
Hampton, J. (1997). Conceptual Combination: Conjunction and Negation of
Natural Concepts. Memory and Cognition, 25 (pp. 888-909).
Hampton, J. (2011). Conceptual Combination and Fuzzy Logic. In Bĕlohlavek
and Klir (Eds.), Concepts and Fuzzy Logic (pp. 209-232), MIT Press.
Hauser, M. D., Chomsky, N. and Fitch, W. T. (2002). The faculty of language:
what is it, who has it, and how did it evolve? Science, 298 (pp. 1569-1579).
Hauser, M. D. (2009). Evolingo: The Nature of the Language Faculty. In
Piattelli-Palmarini, Uriagereka and Salaburu (Eds.), Of Minds and Language: A dialogue with Noam Chomsky in the Basque Country (pp. 74-84),
Oxford University Press.
Hinzen, W. (2009). Hierarchy, Merge, and Truth. In Piattelli-Palmarini, Uriagereka
and Salaburu (Eds.), Of Minds and Language: A dialogue with Noam Chomsky in the Basque Country (pp. 124-141), Oxford University Press.
Hinzen, W. (2012). The philosophical significance of Universal Grammar. Language Sciences, 34 (pp. 635-649).
Jackendoff, R. (2002). Foundations of Language: Brain, Meaning, Grammar,
Evolution, Oxford University Press.
Jackendoff, R. (2003). Précis to Foundations of Language: Brain, Meaning,
Grammar, Evolution. Behavioral and Brain Sciences, 26, No. 6 (pp. 651-665).
Kirby, S. (1999). Function, Selection and Innateness: The Emergence of Language Universals, Oxford University Press.
Kuno, S. (1973). The Structure of the Japanese Language, Cambridge, Mass:
MIT Press.
McCarthy, J., Abrahams, P. W., Edwards, D. J., Hart, T. P. and Levin, M. I.
(1985/1962). LISP 1.5 Programmer’s Manual (2nd ed.), Cambridge, Mass:
MIT Press.
Mithen, S. (1996). The Prehistory of the Mind: A Search for the Origins of
Art, Religion and Science, Thames and Hudson.
Murphy, G. L. (2002). The Big Book of Concepts, London, England: The MIT
Press.
Penn, D., Holyoak, K. J. and Povinelli, D. J. (2009). Universal grammar and
mental continuity: two modern myths. Behavioral and Brain Sciences, 32
(pp. 462-463).
Pet, W. A. P. (1987). Lokono Dian, the Arawak Language of Suriname: A
Sketch of its Grammatical Structure and Lexicon, Cornell University Doctoral Thesis.
Pinker, S. (1994). The Language Instinct: The New Science of Language and
Mind, The Penguin Press.
Polinsky, M. (2012). Headedness, again. UCLA Working Papers in Linguistics,
Theories of Everything, 17, No. 40 (pp. 348-359).
Rips, L. J. (1995). The Current Status of Research on Concept Combination.
Mind and Language, 10 (pp. 72-104).
Thagard, P. (1997). Coherent and Creative Conceptual Combination. In Ward,
Smith and Vaid (Eds.), Creative Thought: An Investigation of Conceptual
Structures and Processes, Washington, DC: American Psychological Association.
Tomasello, M. (2003). Constructing a Language: A Usage-based Theory of
Language Acquisition, Harvard University Press.
Tomasello, M. (2008). Origins of Human Communication, Cambridge, Massachusetts: The MIT Press.
Wisniewski, E. J. (1997). When Concepts Combine. Psychonomic Bulletin and
Review, 4 (pp. 167-183).