Some Remarks on the State of the Field and on Ways to Ensure Progress:
Description, Analysis, Formalization, Implementation

Stefan Müller
Deutsche Grammatik
Institut für Deutsche und Niederländische Philologie
Freie Universität Berlin
[email protected]

August 27, 2013

Where are we and how did we get there?

Where are we?
• We have several major theories: GB/Minimalist theories, HPSG, CxG, Categorial Grammar, LFG, TAG, Dependency Grammar, Relational Grammar/Arc Pair Grammar, Integrational Linguistics, ...
• Many analyses are translatable into other frameworks (Müller, 2013c, To appear).
• Concepts and ideas are taken over:
  • HPSG = Frankenstein theory (GB, CG, DG, CxG)
  • LFG (inheritance, constructions)
  • Minimalism (functional application, inheritance, structured feature value pairs, LP constraints)
• But see Lappin, Levine and Johnson (2000); Culicover and Jackendoff (2005); Felix (2010); Sternefeld and Richter (2012):
  • Problems in the style of argumentation
  • Problems with decreasing coverage of theories (example: labeling; coverage falls far behind that of the 1980s)
  • Problems with the motivation of analyses
  • Problems with crosslinguistic validation of approaches
  • Problems with consistency and formalization
How did we get there? Uniformity and l'art pour l'art
• One reason for the way grammars look nowadays is the goal of uniformity.
• Culicover and Jackendoff (2005); Sternefeld and Richter (2012):

As a mathematical discipline travels far from its empirical source, or still more, if it is a second and third generation only indirectly inspired from ideas coming from 'reality,' it is beset with very grave dangers. It becomes more and more purely aestheticizing, more and more purely l'art pour l'art. This need not be bad, if the field is surrounded by correlated subjects, which still have closer empirical connections, or if the discipline is under the influence of men with an exceptionally well-developed taste.

But there is a grave danger that the subject will develop along the line of least resistance, that the stream, so far from its source, will separate into a multitude of insignificant branches, and that the discipline will become a disorganized mass of details and complexities.

In other words, at a great distance from its empirical source, or after much 'abstract' inbreeding, a mathematical subject is in danger of degeneration. At the inception the style is usually classical; when it shows signs of becoming baroque the danger signal is up. It would be easy to give examples, to trace specific evolutions into the baroque and the very high baroque, but this would be too technical. (von Neumann, 1947)

Poverty of the Stimulus and Argumentation for Current Theories

Poverty of the Stimulus
• Chomsky: There is a poverty of the stimulus. Language cannot be acquired from input → innate domain-specific knowledge is required.
• Example: Auxiliary inversion in English (Chomsky, 1971, p. 29–33; 2013, p. 39)
• Bod (2009): Auxiliary inversion can be learned without access to the data that Chomsky claimed to be required.
• Promising results from input-based theories: Freudenthal et al., 2007, 2009
Example Cases

Evidence from a single language and UG
• What does it mean for other languages that a rule/morpheme is present in one particular language?
• Possible answer: If we have a certain structure in language X, it must be present in all languages.
• Examples:
  • Basque: tree positions for object agreement (AgrO, AgrIO)
  • Japanese/Gungbe: a tree position for the topic marker
• German and Dutch have neither object agreement nor topic morphemes.
• If such inferences regarding properties of particular languages are drawn, one has to assume (very specific!) innate linguistic knowledge.
• Conclusion: Innateness has to be assumed.

German is English/Romance (SVO, Laenzlinger following Kayne)
• All languages are Spr-H-Comp (specifier-head-complement) underlyingly.
  [Tree for 'weil er ihn gelesen hat' ('because he has read it'): a CP dominating SubjP, ObjP, AuxP, νP and VP projections, with the auxiliary 'hat' and the participle 'gelesen'.]
• The object is moved out of the VP.
• The subject is fronted.
• The empty VP is fronted.
• There are further empty heads (Cinque, 1999).

German is German (GB variants, CG, LFG, HPSG, ...)
  [Tree for 'weil er ihn gelesen hat': a CP with the complementizer 'weil' and a verb-final VP in which the subject NP 'er' and the object NP 'ihn' precede the verbs 'gelesen hat'.]

English, German, ... are Hungarian
• Hornstein, Nunes and Grohmann (2005, p. 124): an agreement head for the checking of case features
• The DP is put into the specifier position of this head.
• The preposition is moved there.
• Evidence for this: agreement in Hungarian postpositional phrases
  [Tree: an AgrP above the PP 'hinter mir' ('behind me'), with the preposition 'hinter' moved to Agr and the DP 'mir' in the specifier position, leaving traces.]
• English is like Hungarian, but the movement is invisible.
German is German, ... Hungarian is Hungarian
• A PP is a P together with an NP (or DP).
  [Tree: PP consisting of the P 'hinter' and the DP 'mir'.]
• No movement instead of two movements.
• The structure has five nodes fewer.
• Truly minimal!
• Question: What constitutes an explanation? Where and how is the complexity of language represented?

Bottom Up Discovery of Linguistic Generalizations
Crosslinguistic Considerations

Evidence from other languages?
• So should we refrain from using evidence from other languages?
• No, it is a valuable resource for understanding how human language works.
• If we have two possibilities to analyze language X, and only one of them is compatible with language Y, we should go for this one.
• But this argumentation has to happen on an entirely different level (Fanselow, 2009, p. 137).
• Non-trivial fragments should be compared, and each fragment should be motivated on its own.
• This is done in the CoreGram project: Müller, 2013b,a
Bottom Up with Cheating
[Figure: three constraint-set hierarchies illustrating the stepwise broadening of the language sample (German and Dutch; German, Dutch and Danish; German, Dutch, Danish, English and French), with numbered sets of shared constraints (Set 1 to Set 13) covering argument structure, V2, SOV, SVO and the verbal complex; the constraints common to all languages sit in the topmost set.]
• Stepwise broadening of the research perspective
• Reuse of analyses if possible
• German, Danish, Persian, Maltese, Chinese, English, Yiddish, French, ...

One Language at a Time

"One Language at a Time" and Methodological Opportunism
• Croft (2009, p. 154): From a typologist's perspective, there are two serious problems with the generative methodology for deriving syntactic universals and evaluating them with respect to evidence from crosslinguistic comparison. The first has to do with the "one language at a time" method. In this method, the languages are examined individually before they are compared, or more precisely, one starts with one language, and successively compares a second language to the first, then a third to the second, and so on, modifying the hypothesis as one goes on. This method contrasts with the typological method, in which one examines a broad sample of languages to begin with, and formulates hypotheses based on the evidence from the broad sample as a whole.
• In order to make statements on a broad sample, one needs a description of the respective languages.
• We provide the descriptions in a formal way, and the generalizations fall out on the way.
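The bottom-up construction of shared constraint sets can be made concrete with a small sketch. The following Python toy is not the CoreGram code (the actual grammars are HPSG fragments implemented in TRALE), and the constraint names and their distribution over the languages are invented for illustration; it merely shows how sets of shared constraints, and candidate generalizations, can be read off from per-language constraint inventories.

```python
# Toy illustration of bottom-up constraint sharing; not the CoreGram code.
# Constraint names and their distribution over the languages are invented.
from collections import defaultdict

GRAMMARS = {
    "German":  {"arg-str", "v2", "sov", "verbal-complex"},
    "Dutch":   {"arg-str", "v2", "sov", "verbal-complex"},
    "Danish":  {"arg-str", "v2", "svo"},
    "English": {"arg-str", "svo"},
    "French":  {"arg-str", "svo"},
}

def shared_sets(grammars):
    """Group constraints by the exact set of languages they hold in.

    The group attached to all languages corresponds to the topmost set
    of the hierarchy; smaller language groups correspond to lower sets."""
    groups = defaultdict(set)
    for constraint in set().union(*grammars.values()):
        langs = frozenset(l for l, cs in grammars.items() if constraint in cs)
        groups[langs].add(constraint)
    return groups

def implies(grammars, c1, c2):
    """Does every examined language that has c1 also have c2?"""
    return all(c2 in cs for cs in grammars.values() if c1 in cs)

for langs, constraints in sorted(shared_sets(GRAMMARS).items(),
                                 key=lambda kv: -len(kv[0])):
    print(sorted(langs), "->", sorted(constraints))

print("SOV -> V2 in this sample:", implies(GRAMMARS, "sov", "v2"))
```

In this toy sample the SOV → V2 check comes out true, which mirrors the "all SOV languages are V2" generalization discussed on a later slide; a broader sample shows that such read-off generalizations can be wrong.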
"One Language at a Time" and the Pro-Drop Parameter
• Pro-Drop Parameter: one binary switch assumed to be responsible for a lot of phenomena
• All correlations turned out to be wrong.
• Croft (2009), discussing the Pro-Drop Parameter: it implies that the "one language at a time" approach is not a fruitful one for finding syntactic universals.
• Croft uses sample sizes of 12 (p. 158). This is manageable.
• It is unlikely that we will reach sample sizes of 400+ languages, but who knows ...
  You are invited to join the enterprise.
• Difference between CoreGram and GB/MP:
  • No broad claims are made (right now).
  • There is no need to explain problematic data away.

Predictions/Restrictions

Predictions: Individual Grammars Make Predictions
• MIT: This does not make any predictions concerning possible/impossible languages.
• These constraint sets make predictions (Müller, 1999, p. 439; Netter, 1991): Third Construction + PVP:

(1) a. [Versucht, zu lesen], hat er das Buch nicht.
       tried to read has he the book not
       'He did not try to read the book.'
    b. [Versucht, einen Freund vorzustellen], hat er ihr noch nie.
       tried a friend to.introduce has he her yet never
       'He never tried to introduce her to a friend.'
Constraint Sets Make Predictions for All Languages/Language Classes
• Similarly: the topmost constraint set holds for all (examined) languages.
• It predicts that certain structures are impossible; for instance, languages that put the verb in penultimate position (Kayne, 1994, p. 50) are ruled out because of information structure constraints.
• If it turns out that the topmost constraints are too restrictive, we have to change them.

Things We Do Not Have to Exclude
• Why are there no languages that form questions by reversing the order of the words (Musso et al., 2003)?
• Answer: because we do not have sufficient memory.
• There is no need to account for this in a grammar.
Predictions: All SOV Languages Are V2 (Not True)
[Figure: the constraint-set hierarchy for German, Dutch, Danish, English and French again (Sets 1–13), with the argument-structure, V2, SOV, SVO and verbal-complex constraints distributed over the sets.]
• Since all languages that have Set 4 also have Set 7, it follows that all SOV languages are also V2 languages.
• This is obviously wrong, but it shows the kind of knowledge that can be read off from such hierarchies.

The Cure

Quote from a letter by Haj Ross to the MIT Linguistics Department for its 50th anniversary (Ross, 2011):

As a mathematical discipline travels far from its empirical source, or still more, if it is a second and third generation only indirectly inspired from ideas coming from 'reality,' it is beset with very grave dangers. It becomes more and more purely aestheticizing, more and more purely l'art pour l'art. This need not be bad, if the field is surrounded by correlated subjects, which still have closer empirical connections, or if the discipline is under the influence of men with an exceptionally well-developed taste.

But there is a grave danger that the subject will develop along the line of least resistance, that the stream, so far from its source, will separate into a multitude of insignificant branches, and that the discipline will become a disorganized mass of details and complexities.

In other words, at a great distance from its empirical source, or after much 'abstract' inbreeding, a mathematical subject is in danger of degeneration. At the inception the style is usually classical; when it shows signs of becoming baroque the danger signal is up. It would be easy to give examples, to trace specific evolutions into the baroque and the very high baroque, but this would be too technical.

In any event, whenever this stage is reached, the only remedy seems to me to be the rejuvenating return to the source: the reinjection of more or less directly empirical ideas. I am convinced that this is a necessary condition to conserve the freshness and the vitality of the subject, and that this will remain so in the future.
More Data and Consistency Checks
• We need more data, and we have it:
  • psycholinguistics: Tanenhaus et al., 1996; Wittenberg et al., To appear
  • neurolinguistics: Cappelle, Shtyrov and Pulvermüller, 2010
  • corpus linguistics: Meurers and Müller, 2009; Kiss, 2008; Stefanowitsch and Gries, 2009; Schäfer and Bildhauer, 2012
  • more languages with the same underlying assumptions
• If everything is taken into account, an enormous complexity results.
• Implementation is our only chance to verify consistency.
• We need standards to make analyses comparable and implementable (Fanselow, 2009).
• More data is the cure that will wipe out bad taste. Bad taste analyses just do not scale!
• Books should be published with a list of sample sentences that the theory covers.

Desiderata

Desiderata for linguistic frameworks:
• constraint-based formalization (Pullum and Scholz, 2001; Pullum, 2007; Sag and Wasow, 2011)
• strongly lexicalist orientation (Sag and Wasow, 2011; Müller, 2006, To appear)
• parallel/sign-based architecture including constraints on phonology, morphology, syntax, semantics, and information structure and the interactions between the various levels of linguistic description (Jackendoff, 2011; Kuhn, 2007)
• not restricted to headed configurations (Jackendoff, 2008; Jacobs, 2008)
• the possibility to describe complex linguistic objects rather than just lexical items (Kay and Fillmore, 1999; Sag, 1997)
Conclusions
• Some analyses that are currently suggested are bizarre.
• Parts of the field seem to have disconnected from the empirical basis.
• Bad taste?
• Cure:
  • more data,
  • larger fragments (implementation and systematic testing; coverage never goes down)
• Needed: ways to express linguistic theories that can be understood by everybody and translated into a format that can be used for consistency checks.
• Collaboration between theoretical and computational linguists

Appendix

Poverty of the Stimulus and U-DOP
• U-DOP learns from examples that do not contain auxiliary inversion combined with relative clauses (Bod, 2009).

(2) a. The man who is eating is hungry.
    b. Is the boy hungry?

• Once one has learned the correct trees for (2), one can also assign the correct structure to sentences with auxiliary inversion (p. 778):

(3) Is the man who is eating hungry?

• To acquire (2), for example, the sentences in (4) are sufficient:

(4) a. The man who is eating mumbled.
    b. The man is hungry.
    c. The man mumbled.
    d. The boy is eating.
Poverty of the Stimulus and U-DOP – II
• Procedure:
  1. Compute all possible (binary branching) trees (without category symbols) for a set of given sentences.
  2. Compute all subtrees of these trees.
  3. Compute the best tree for a given sentence.
• The acquired grammars make the same mistakes as children!

Possible binary branching structures for 'Watch the dog' and 'The dog barks'
[Figure: the unlabeled binary-branching trees for 'watch the dog' ([watch [the dog]] and [[watch the] dog]) and for 'the dog barks' ([the [dog barks]] and [[the dog] barks]), with all internal nodes labelled X.]
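As a minimal illustration of step 1 of this procedure (a sketch only, not Bod's implementation; steps 2 and 3, collecting the subtrees and scoring them, are omitted here), the following Python function enumerates all unlabeled binary-branching trees for the two example sentences:

```python
# Minimal sketch of U-DOP step 1: all unlabeled binary-branching trees
# over a sentence.  A simplification for illustration, not Bod's (2009) code.

def binary_trees(words):
    """Return every binary bracketing of `words` as nested tuples."""
    if len(words) == 1:
        return [words[0]]                    # a single word is a leaf
    trees = []
    for split in range(1, len(words)):       # try every split point
        for left in binary_trees(words[:split]):
            for right in binary_trees(words[split:]):
                trees.append((left, right))
    return trees

for sentence in (["watch", "the", "dog"], ["the", "dog", "barks"]):
    for tree in binary_trees(sentence):
        print(tree)
# ('watch', ('the', 'dog'))   (('watch', 'the'), 'dog')
# ('the', ('dog', 'barks'))   (('the', 'dog'), 'barks')
```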
Subtrees
[Figure: all subtrees of the four trees above, again with unlabeled X nodes, including fragments with substitution sites such as [the X].]

Analysis with Subtrees and Probabilities
• Every tree has a probability of 1/12.
• [the dog] appears twice! Its probability is therefore 2/12.
• The structure [the [dog barks]] has probability 13/144: it is produced by the complete tree (1/12 = 12/144) and by [the X] ◦ [dog barks] (1/12 × 1/12 = 1/144).
• The structure [[the dog] barks] has probability 14/144: it is produced by the complete tree (1/12 = 12/144) and by [X barks] ◦ [the dog] (1/12 × 2/12 = 2/144).
• The best tree for 'the dog barks' is therefore [[the dog] barks].
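The arithmetic on this slide can be checked directly from the stated subtree probabilities; the snippet below simply recomputes the two derivation probabilities (the variable names are ad hoc):

```python
# Recomputing the derivation probabilities stated on the slide above.
from fractions import Fraction

tree      = Fraction(1, 12)   # each complete corpus tree: 1/12
the_dog   = Fraction(2, 12)   # [the dog] occurs twice: 2/12
dog_barks = Fraction(1, 12)
the_X     = Fraction(1, 12)   # fragment [the X]
X_barks   = Fraction(1, 12)   # fragment [X barks]

# [the [dog barks]]: the complete tree, or [the X] combined with [dog barks]
p_right_branching = tree + the_X * dog_barks   # 12/144 + 1/144
# [[the dog] barks]: the complete tree, or [X barks] combined with [the dog]
p_left_branching  = tree + X_barks * the_dog   # 12/144 + 2/144

assert p_right_branching == Fraction(13, 144)
assert p_left_branching  == Fraction(14, 144)
print(p_left_branching > p_right_branching)    # True: [[the dog] barks] wins
```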
The Swiss Cheese
• From another textbook: Radford (1997, p. 452)
  [Tree: a pP shell above AgrOP and PP for English 'me', with the preposition moving via AgrO to p, the D 'me' in the specifier of AgrOP, an empty element ∅ and traces t, t′.]
• Sternefeld (2006, p. 549–550) calls this a Swiss Cheese analysis, but there are more holes (5) than cheese (2).

Cinque and UG
• Cinque: Certain elements appear in the same order in all languages.
• Analysis: one innate tree structure for all languages. If there is no evidence for elements in a certain language, empty nodes are assumed.
• True: we would never arrive at Cinque's analyses, since we reject invisible elements for which we do not have evidence in a particular language.
• But if we really insisted, we could do better than Cinque:
  • Put all categories that appear in these fixed orders in the topmost set.
  • Put LP constraints in the topmost set.
  • If categories are present in a language, they are ordered properly; if not, the LP constraints do not do any harm.
  • No empty nodes in trees are needed.
• But we reject 400+ innate categories as suggested by Cinque and Rizzi (2010, p. 57). See Bod, 2009 on the Poverty of the Stimulus.
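The LP-constraint alternative can be illustrated with a minimal sketch: linear precedence constraints only order categories that are actually present, so a language lacking a category is simply unaffected by them. The category names below are invented, and the code is not an implementation of any particular grammar:

```python
# Toy illustration of linear precedence (LP) constraints: a constraint
# (A, B) means "A precedes B" and only applies if both categories occur.
# The category names are hypothetical.

LP_CONSTRAINTS = [("Top", "Foc"), ("Foc", "Fin"), ("AgrIO", "AgrO")]

def satisfies_lp(categories, constraints=LP_CONSTRAINTS):
    """True if the category sequence violates no applicable LP constraint."""
    position = {c: i for i, c in enumerate(categories)}
    for before, after in constraints:
        if before in position and after in position:   # both present?
            if position[before] > position[after]:
                return False
    return True

# A language realizing all categories has to order them properly ...
print(satisfies_lp(["Top", "Foc", "Fin", "AgrIO", "AgrO"]))  # True
print(satisfies_lp(["Foc", "Top", "Fin"]))                   # False
# ... while a language without Top, Foc or AgrIO is simply unaffected:
print(satisfies_lp(["Fin", "AgrO"]))                         # True
```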
Formalization

Chomsky on Formalization

Precisely constructed models for linguistic structure can play an important role, both negative and positive, in the process of discovery itself. By pushing a precise but inadequate formulation to an unacceptable conclusion, we can often expose the exact source of this inadequacy and, consequently, gain a deeper understanding of the linguistic data. More positively, a formalized theory may automatically provide solutions for many problems other than those for which it was explicitly designed. Obscure and intuition-bound notions can neither lead to absurd conclusions nor provide new and correct ones, and hence they fail to be useful in two important respects. I think that some of those linguists who have questioned the value of precise and technical development of linguistic theory have failed to recognize the productive potential in the method of rigorously stating a proposed theory and applying it strictly to linguistic material with no attempt to avoid unacceptable conclusions by ad hoc adjustments or loose formulation. (Chomsky, 1957, p. 5)

Notes on Complexity

Bierwisch: Interaction of Phenomena beyond Human Capacities

Es ist also sehr wohl möglich, daß mit den formulierten Regeln Sätze erzeugt werden können, die auch in einer nicht vorausgesehenen Weise aus der Menge der grammatisch richtigen Sätze herausfallen, die also durch Eigenschaften gegen die Grammatikalität verstoßen, die wir nicht wissentlich aus der Untersuchung ausgeschlossen haben. Das ist der Sinn der Feststellung, daß eine Grammatik eine Hypothese über die Struktur einer Sprache ist. Eine systematische Überprüfung der Implikationen einer für natürliche Sprachen angemessenen Grammatik ist sicherlich eine mit Hand nicht mehr zu bewältigende Aufgabe. Sie könnte vorgenommen werden, indem die Grammatik als Rechenprogramm in einem Elektronenrechner realisiert wird, so daß überprüft werden kann, in welchem Maße das Resultat von der zu beschreibenden Sprache abweicht. (Bierwisch, 1963, p. 163)

Translation: It is very possible that the rules that we formulated generate sentences which fall outside of the set of grammatical sentences in an unforeseen way, that is, sentences that violate grammaticality due to properties that we did not deliberately exclude from the investigation. This is what is meant by the statement that a grammar is a hypothesis about the structure of a language. A systematic check of the implications of a grammar that is appropriate for natural languages is surely a task that can no longer be done by hand. It could be carried out by implementing the grammar as a program on a computer, so that it becomes possible to verify to what degree the result deviates from the language to be described.
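Bierwisch's point can be illustrated with a toy test-suite run: a deliberately incomplete grammar is checked against a list of sample sentences annotated with the expected judgements, and mismatches show where it over- or undergenerates. The grammar, lexicon and sentences below are invented; real HPSG fragments are tested in systems such as TRALE or the LKB rather than with a toy CKY recognizer like this one:

```python
# Toy test-suite run for a deliberately incomplete grammar in CNF.
# Grammar, lexicon and sentences are invented for illustration.

RULES = {("NP", "VP"): "S", ("Det", "N"): "NP", ("V", "NP"): "VP"}
LEXICON = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"},
           "chases": {"V"}, "barks": {"V"}}

def parses(words):
    """CKY recognizer: True if the toy grammar derives `words` as an S."""
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for k in range(i + 1, i + span):
                for a in chart[i][k]:
                    for b in chart[k][i + span]:
                        if (a, b) in RULES:
                            chart[i][i + span].add(RULES[(a, b)])
    return "S" in chart[0][n]

TEST_SUITE = [
    ("the dog chases the cat", True),
    ("the dog barks", True),    # no rule for intransitive verbs: MISMATCH
    ("dog the chases", False),
]

for sentence, expected in TEST_SUITE:
    got = parses(sentence.split())
    status = "ok" if got == expected else "MISMATCH"
    print(f"{status:8} {sentence!r}: grammar={got}, expected={expected}")
```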
Abney: Fragment Size → Dramatic Increase of Complexity!
A goal of earlier linguistic work, and one that is still a central goal of the linguistic
work that goes on in computational linguistics, is to develop grammars that assign
a reasonable syntactic structure to every sentence of English, or as nearly every
sentence as possible. [. . . ] The scope of the problem of identifying the correct
parse cannot be appreciated by examining behavior on small fragments, however
deeply analyzed. Large fragments are not just small fragments several times
over—there is a qualitative change when one begins studying large fragments. As
the range of constructions that the grammar accommodates increases, the number
of undesired parses for sentences increases dramatically. (Abney, 1996, p. 20)
References

Abney, Steven P. 1996. Statistical Methods and Linguistics. In Judith L.
Klavans and Philip Resnik (eds.), The Balancing Act: Combining
Symbolic and Statistical Approaches to Language, Language, Speech,
and Communication, pages 1–26, London, England/Cambridge, MA:
MIT Press.
Ackerman, Farrell and Webelhuth, Gert. 1998. A Theory of Predicates.
CSLI Lecture Notes, No. 76, Stanford, CA: CSLI Publications.
Asudeh, Ash, Dalrymple, Mary and Toivonen, Ida. 2008. Constructions with
Lexical Integrity: Templates as the Lexicon-Syntax Interface. In Miriam
Butt and Tracy Holloway King (eds.), Proceedings of the LFG 2008
Conference, Stanford, CA: CSLI Publications. http://cslipublications.
stanford.edu/LFG/13/, 27.09.2012.
Asudeh, Ash, Dalrymple, Mary and Toivonen, Ida. 2013. Constructions with
Lexical Integrity. Journal of Language Modelling 1(1), 1–54.
Ayoub, Raymond George (ed.). 2005. Musings of the Masters: An
Anthology of Mathematical Reflections. The Mathematical Association
of America.
Berwick, Robert C. and Epstein, Samuel David. 1995. On the Convergence
of the ‘Minimalist’ Syntax and Categorial Grammar. In Anton Nijholt,
Giuseppe Scollo and Rene Steetskamp (eds.), Algebraic Methods in
Language Processing , pages 143–148, Enschede: University of Twente.
http://eprints.eemcs.utwente.nl/9555/01/twlt10.pdf, 27.03.2013.
Bierwisch, Manfred. 1963. Grammatik des deutschen Verbs. studia
grammatica, No. 2, Berlin: Akademie Verlag.
Bod, Rens. 2009. From Exemplar to Grammar: Integrating Analogy and
Probability in Language Learning. Cognitive Science 33(4), 752–793.
http://staff.science.uva.nl/∼rens/analogy.pdf, 15.07.2008.
Bouma, Gosse and van Noord, Gertjan. 1998. Word Order Constraints on
Verb Clusters in German and Dutch. In Erhard W. Hinrichs, Andreas
Kathol and Tsuneko Nakazawa (eds.), Complex Predicates in
Nonderivational Syntax, Syntax and Semantics, No. 30, pages 43–72,
San Diego: Academic Press. http://www.let.rug.nl/∼vannoord/
papers/, 31.05.2010.
Cappelle, Bert, Shtyrov, Yury and Pulvermüller, Friedemann. 2010. Heating
up or cooling up the Brain? MEG Evidence that Phrasal Verbs are
Lexical Units. Brain and Language 115, 189–201.
Chomsky, Noam. 1957. Syntactic Structures. Janua Linguarum / Series
Minor, No. 4, The Hague/Paris: Mouton.
Chomsky, Noam. 1971. Problems of Knowledge and Freedom. London:
Fontana.
Chomsky, Noam. 1993. A Minimalist Program for Linguistic Theory. In
Kenneth Hale and Samuel Jay Keyser (eds.), The View from Building
20: Essays in Linguistics in Honor of Sylvain Bromberger , Current
Studies in Linguistics, No. 24, pages 1–52, Cambridge, MA/London:
MIT Press.
Chomsky, Noam. 2008. On Phases. In Robert Freidin, Carlos P. Otero and
Maria Luisa Zubizarreta (eds.), Foundational Issues in Linguistic
Theory. Essays in Honor of Jean-Roger Vergnaud , pages 133–166,
Cambridge, MA: MIT Press.
Chomsky, Noam. 2013. Problems of Projection. Lingua 130, 33–49.
Cinque, Guglielmo. 1999. Adverbs and Functional Heads. A Cross-Linguistic
Perspective. New York, Oxford: Oxford University Press.
Cinque, Guglielmo and Rizzi, Luigi. 2010. The Cartography of Syntactic
Structures. In Bernd Heine and Heiko Narrog (eds.), The Oxford
Handbook of Linguistic Analysis, pages 51–65, Oxford: Oxford
University Press.
Citko, Barbara. 2008. Missing Labels. Lingua 118(7), 907–944.
Croft, William. 2001. Radical Construction Grammar. Syntactic Theory in
Typological Perspective. Oxford University Press.
Croft, William. 2009. Methods for Finding Language Universals in Syntax.
In Sergio Scalise, Elisabetta Magni and Antonietta Bisetto (eds.),
Universals of Language Today , volume 76 of Studies in Natural
Language and Linguistic Theory , pages 145–164, Springer Netherlands.
Culicover, Peter W. and Jackendoff, Ray S. 2005. Simpler Syntax. Oxford:
Oxford University Press.
Freudenthal, Daniel, Pine, Julian M. and Gobet, Fernand. 2006. Modeling
the Development of Children’s Use of Optional Infinitives in Dutch and
English Using MOSAIC. Cognitive Science 30(2), 277–310.
Dryer, Matthew S. 1997. Are Grammatical Relations Universal? In Joan
Bybee, John Haiman and Sandra Thompson (eds.), Essays on
Language Function and Language Type: Dedicated to T. Givon, pages
115–143, John Benjamins Publishing Co.
Freudenthal, Daniel, Pine, Julian M. and Gobet, Fernand. 2009. Simulating
the Referential Properties of Dutch, German, and English Root
Infinitives in MOSAIC. Language Learning and Development 5(1),
1–29.
Eisenberg, Peter. 1992. Platos Problem und die Lernbarkeit der Syntax. In
Peter Suchsland (ed.), Biologische und soziale Grundlagen der
Sprache, Linguistische Arbeiten, No. 280, pages 371–378, Tübingen:
Max Niemeyer Verlag.
Gold, Mark E. 1967. Language Identification in the Limit. Information and
Control 10(5), 447–474.
Fanselow, Gisbert. 2001. Features, θ-Roles, and Free Constituent Order.
Linguistic Inquiry 32(3), 405–437.
Fanselow, Gisbert. 2002. Against Remnant VP-Movement. In Artemis
Alexiadou, Elena Anagnostopoulou, Sjef Barbiers and Hans-Martin
Gärtner (eds.), Dimensions of Movement. From Features to Remnants,
Linguistik Aktuell/Linguistics Today, No. 48, pages 91–127,
Amsterdam/Philadelphia: John Benjamins Publishing Co.
Fanselow, Gisbert. 2009. Die (generative) Syntax in den Zeiten der
Empiriediskussion. Zeitschrift für Sprachwissenschaft 28(1), 133–139.
Felix, Sascha W. 2010. Me and Chomsky: Remarks from Someone Who
Quit. In Thomas Hanneforth and Gisbert Fanselow (eds.), Language
and Logos: Studies in Theoretical and Computational Linguistics,
Festschrift for Peter Staudacher on his 70th Birthday , Studia
grammatica, No. 72, pages 64–71, Berlin, Boston: Akademie Verlag.
Fox, Danny and Pesetsky, David. 2005. Cyclic Linearization of Syntactic
Structure. Theoretical Linguistics 31(1–2), 1–45.
Freudenthal, Daniel, Pine, Julian M., Aguado-Orea, Javier and Gobet,
Fernand. 2007. Modeling the Developmental Patterning of Finiteness
Marking in English, Dutch, German, and Spanish Using MOSAIC.
Cognitive Science 31(2), 311–341.
Grewendorf, Günther. 2002. Minimalistische Syntax. UTB für Wissenschaft:
Uni-Taschenbücher, No. 2313, Tübingen, Basel: A. Francke Verlag
GmbH.
Haider, Hubert. 2000. OV is More Basic than VO. In Peter Svenonius (ed.),
The Derivation of VO and OV , pages 45–67, Amsterdam/Philadelphia:
John Benjamins Publishing Co.
Haider, Hubert. 2001. Parametrisierung in der Generativen Grammatik. In
Martin Haspelmath, Eckehard König, Wulf Oesterreicher and Wolfgang
Raible (eds.), Sprachtypologie und sprachliche Universalien – Language
Typology and Language Universals. Ein internationales Handbuch – An
International Handbook, pages 283–294, Berlin: Mouton de Gruyter.
Haspelmath, Martin. 2010. Comparative Concepts and Descriptive
Categories in Crosslinguistic Studies. Language 86(3), 663–687.
Hornstein, Norbert, Nunes, Jairo and Grohmann, Kleantes K. 2005.
Understanding Minimalism. Cambridge Textbooks in Linguistics,
Cambridge, UK: Cambridge University Press.
Jackendoff, Ray S. 2008. Construction after Construction and Its
Theoretical Challenges. Language 84(1), 8–28.
Jackendoff, Ray S. 2011. What is the human language faculty? Two views.
Language 87(3), 586–624.
Müller, Stefan. 2010. Persian Complex Predicates and the Limits of
Inheritance-Based Analyses. Journal of Linguistics 46(3), 601–655.
http://hpsg.fu- berlin.de/∼stefan/Pub/persian- cp.html, 27.08.2013.
Müller, Stefan. 2013a. The CoreGram Project: A Brief Overview and
Motivation. In Denys Duchier and Yannick Parmentier (eds.),
Proceedings of the Workshop on High-level Methodologies for
Grammar Engineering (HMGE 2013), Düsseldorf , pages 93–104.
Müller, Stefan. 2013b. The CoreGram Project: Theoretical Linguistics,
Theory Development and Verification. Ms. Freie Universität Berlin.
http://hpsg.fu- berlin.de/∼stefan/Pub/coregram.html, 27.08.2013.
Müller, Stefan. 2013c. Grammatiktheorie. Stauffenburg Einführungen,
No. 20, Tübingen: Stauffenburg Verlag, second edition. http://hpsg.
fu- berlin.de/∼stefan/Pub/grammatiktheorie.html, 27.08.2013.
Müller, Stefan. To appear. Unifying Everything: Some Remarks on Simpler
Syntax, Construction Grammar, Minimalism and HPSG. Language .
http://hpsg.fu- berlin.de/∼stefan/Pub/unifying- everything.html,
27.08.2013.
Müller, Stefan and Ghayoomi, Masood. 2010. PerGram: A TRALE
Implementation of an HPSG Fragment of Persian. In Proceedings of
2010 IEEE International Multiconference on Computer Science and
Information Technology – Computational Linguistics Applications
(CLA’10). Wisla, Poland, 18–20 October 2010 , volume 5, pages
461–467, Polish Information Processing Society. http://hpsg.
fu- berlin.de/∼stefan/Pub/pergram.html, 27.08.2013.
Müller, Stefan and Lipenkova, Janna. 2009. Serial Verb Constructions in
Chinese: An HPSG Account. In Stefan Müller (ed.), Proceedings of the
16th International Conference on Head-Driven Phrase Structure
Grammar, University of Göttingen, Germany , pages 234–254, Stanford,
CA: CSLI Publications. http://hpsg.fu- berlin.de/∼stefan/Pub/
chinese- svc.html, 27.08.2013.
Kayne, Richard S. 1994. The Antisymmetry of Syntax. Linguistic Inquiry
Monographs, No. 25, Cambridge, MA: MIT Press.
Kiss, Tibor. 2008. Towards a Grammar of Preposition-Noun Combinations.
In Stefan Müller (ed.), Proceedings of the 15th International
Conference on Head-Driven Phrase Structure Grammar , pages
116–130, Stanford, CA: CSLI Publications. http://cslipublications.
stanford.edu/HPSG/9/, 31.10.2008.
Klein, Wolfgang. 1986. Second Language Acquisition. Cambridge
Textbooks in Linguistics, Cambridge, UK: Cambridge University Press.
Kuhn, Jonas. 2007. Interfaces in Constraint-Based Theories of Grammar. In
Gillian Ramchand and Charles Reiss (eds.), The Oxford Handbook of
Linguistic Interfaces, pages 613–650, Oxford: Oxford University Press.
Laenzlinger, Christoph. 2004. A Feature-Based Theory of Adverb Syntax. In
Jennifer R. Austin, Stefan Engelberg and Gisa Rauh (eds.), Adverbials:
The Interplay Between Meaning, Context, and Syntactic Structure,
Linguistik Aktuell/Linguistics Today, No. 70, pages 205–252,
Amsterdam/Philadelphia: John Benjamins Publishing Co.
Lappin, Shalom, Levine, Robert D. and Johnson, David E. 2000. The
Structure of Unscientific Revolutions. Natural Language and Linguistic
Theory 18(3), 665–671.
Lipenkova, Janna. 2009. Serienverbkonstruktionen im Chinesischen und ihre
Analyse im Rahmen von HPSG . Masters Thesis, Institut für Sinologie,
Freie Universität Berlin. http://hpsg.fu- berlin.de/∼lipenkov/magister.
html, 03.08.2010.
Lüdeling, Anke and Kytö, Merja (eds.). 2009. Corpus Linguistics. An
International Handbook, volume 29.2 of Handbücher zur Sprach- und
Kommunikationswissenschaft. Berlin: Mouton de Gruyter.
Marslen-Wilson, William. 1975. Sentence Perception as an Interactive
Parallel Process. Science 189(4198), 226–228.
Meinunger, André. 2000. Syntactic Aspects of Topic and Comment.
Linguistik Aktuell/Linguistics Today, No. 38, Amsterdam/Philadelphia:
John Benjamins Publishing Co.
Meurers, Walt Detmar and Müller, Stefan. 2009. Corpora and Syntax. In
Lüdeling and Kytö (2009), Chapter 42, pages 920–933.
Mineur, Anne-Marie. 1995. Interview with Bob Carpenter. Ta!, the Dutch
Students’ Magazine for Computational Linguistics 3(1).
Müller, Stefan. 1999. Deutsche Syntax deklarativ. Head-Driven Phrase
Structure Grammar für das Deutsche. Linguistische Arbeiten, No. 394,
Tübingen: Max Niemeyer Verlag. http://hpsg.fu- berlin.de/∼stefan/
Pub/hpsg.html, 27.08.2013.
Müller, Stefan. 2006. Phrasal or Lexical Constructions? Language 82(4),
850–883. http://hpsg.fu- berlin.de/∼stefan/Pub/phrasal.html,
27.08.2013.
Müller, Stefan. 2007. Head-Driven Phrase Structure Grammar: Eine
Einführung . Stauffenburg Einführungen, No. 17, Tübingen:
Stauffenburg Verlag, first edition. http://hpsg.fu- berlin.de/∼stefan/
Pub/hpsg- lehrbuch.html, 27.08.2013.
Müller, Stefan. 2009a. A Head-Driven Phrase Structure Grammar for
Maltese. In Bernard Comrie, Ray Fabri, Beth Hume, Manwel Mifsud,
Thomas Stolz and Martine Vanhove (eds.), Introducing Maltese
Linguistics. Papers from the 1st International Conference on Maltese
Linguistics (Bremen/Germany, 18–20 October, 2007), Studies in
Language Companion Series, No. 113, pages 83–112, Amsterdam/
Philadelphia: John Benjamins Publishing Co. http://hpsg.fu- berlin.de/
∼stefan/Pub/maltese- sketch.html, 27.08.2013.
Müller, Stefan. 2009b. On Predication. In Stefan Müller (ed.), Proceedings
of the 16th International Conference on Head-Driven Phrase Structure
Grammar , pages 213–233, Stanford, CA: CSLI Publications. http://
hpsg.fu- berlin.de/∼stefan/Pub/predication.html, 27.08.2013.
Jacobs, Joachim. 2008. Wozu Konstruktionen? Linguistische Berichte 213,
3–44.
Kay, Paul and Fillmore, Charles J. 1999. Grammatical Constructions and
Linguistic Generalizations: the What’s X Doing Y? Construction.
Language 75(1), 1–33.
Müller, Stefan and Ørsnes, Bjarne. 2011. Positional Expletives in Danish,
German, and Yiddish. In Stefan Müller (ed.), Proceedings of the 18th
International Conference on Head-Driven Phrase Structure Grammar,
University of Washington, U.S.A., pages 167–187, Stanford, CA: CSLI
Publications. http://hpsg.fu- berlin.de/∼stefan/Pub/expletives.html,
27.08.2013.
Müller, Stefan and Ørsnes, Bjarne. In Preparation. Danish in Head-Driven
Phrase Structure Grammar . Empirically Oriented Theoretical
Morphology and Syntax, Berlin: Language Science Press. http://hpsg.
fu- berlin.de/∼stefan/Pub/danish.html, 27.08.2013.
Müller, Stefan, Samvelian, Pollet and Bonami, Olivier. In Preparation.
Persian in Head-Driven Phrase Structure Grammar . Empirically
Oriented Theoretical Morphology and Syntax, Berlin: Language
Science Press. http://hpsg.fu- berlin.de/∼stefan/Pub/persian.html,
27.08.2013.
Musso, Mariacristina, Moro, Andrea, Glauche, Volkmar, Rijntjes, Michel,
Reichenbach, Jürgen, Büchel, Christian and Weiller, Cornelius. 2003.
Broca’s Area and the Language Instinct. Nature Neuroscience 6(7),
774–781.
Netter, Klaus. 1991. Clause Union Phenomena and Complex Predicates in
German. In Klaus Netter and Mike Reape (eds.), Clause Structure and
Word Order Variation in Germanic, DYANA Report, Deliverable
R1.1.B, University of Edinburgh.
Newmeyer, Frederick J. 2005. Possible and Probable Languages: A
Generative Perspective on Linguistic Typology . Oxford: Oxford
University Press.
Ott, Dennis. 2011. A Note on Free Relative Clauses in the Theory of
Phases. Linguistic Inquiry 42(1), 183–192.
Ørsnes, Bjarne. 2009. Preposed Negation in Danish. In Stefan Müller (ed.),
Proceedings of the 16th International Conference on Head-Driven
Phrase Structure Grammar, University of Göttingen, Germany , pages
255–275, Stanford, CA: CSLI Publications.
Pullum, Geoffrey K. 2007. The Evolution of Model-Theoretic Frameworks
in Linguistics. In James Rogers and Stephan Kepser (eds.),
Model-Theoretic Syntax at 10 – Proceedings of the ESSLLI 2007
MTS@10 Workshop, August 13–17 , pages 1–10, Dublin: Trinity
College Dublin. http://cs.earlham.edu/esslli07mts/, 30.11.2011.
Pullum, Geoffrey K. and Scholz, Barbara C. 2001. On the Distinction
between Generative-Enumerative and Model-Theoretic Syntactic
Frameworks. In Philippe de Groote, Glyn Morrill and Christian Retoré
(eds.), Logical Aspects of Computational Linguistics: 4th International
Conference, Lecture Notes in Computer Science, No. 2099, pages
17–43, Berlin/Heidelberg/New York, NY: Springer Verlag.
Pullum, Geoffrey K. and Scholz, Barbara C. 2002. Empirical Assessment of
Stimulus Poverty Arguments. The Linguistic Review 19(1–2), 9–50.
Radford, Andrew. 1997. Syntactic Theory and the Structure of English: a
Minimalist Approach. Cambridge Textbooks in Linguistics, Cambridge,
UK: Cambridge University Press.
Ross, John Robert. 2011. Letter to the MIT Linguistics Department for its
50th Anniversary. http://ling50.mit.edu/replies/haj- ross, 26.08.2013.
Sag, Ivan A. 1997. English Relative Clause Constructions. Journal of
Linguistics 33(2), 431–484. http://lingo.stanford.edu/sag/papers/
rel- pap.pdf, 30.05.2004.
Sag, Ivan A. and Wasow, Thomas. 2011. Performance-Compatible
Competence Grammar. In Robert Borsley and Kersti Börjars (eds.),
Non-Transformational Syntax: Formal and Explicit Models of
Grammar: A Guide to Current Models, pages 359–377, Oxford, UK/
Cambridge, MA: Blackwell Publishing Ltd.
Sauerland, Uli and Elbourne, Paul. 2002. Total Reconstruction, PF
Movement, and Derivational Order. Linguistic Inquiry 33(2), 283–319.
Schäfer, Roland and Bildhauer, Felix. 2012. Building Large Corpora from
the Web Using a New Efficient Tool Chain. In Nicoletta Calzolari,
Khalid Choukri, Thierry Declerck, Mehmet Uğur Doğan, Bente
Maegaard, Joseph Mariani, Jan Odijk and Stelios Piperidis (eds.),
Proceedings of the Eight International Conference on Language
Resources and Evaluation (LREC’12), pages 486–493, Istanbul,
Turkey: European Language Resources Association (ELRA).
Scholz, Barbara C. and Pullum, Geoffrey K. 2002. Searching for Arguments
to Support Linguistic Nativism. The Linguistic Review 19(1–2),
185–223.
Stefanowitsch, Anatol and Gries, Stephan Th. 2009. Corpora and Grammar.
In Lüdeling and Kytö (2009), Chapter 43, pages 933–952.
Sternefeld, Wolfgang. 2006. Syntax: eine morphologisch motivierte
generative Beschreibung des Deutschen. Stauffenburg Linguistik,
No. 31, Tübingen: Stauffenburg.
Sternefeld, Wolfgang and Richter, Frank. 2012. Wo stehen wir in der
Grammatiktheorie? — Bemerkungen anläßlich eines Buchs von Stefan
Müller. Zeitschrift für Sprachwissenschaft 31(2), 263–291.
Tanenhaus, Michael K., Spivey-Knowlton, Michael J., Eberhard,
Kathleen M. and Sedivy, Julie C. 1996. Using Eye Movements to Study
Spoken Language Comprehension: Evidence for Visually Mediated
Incremental Interpretation. In Toshio Inui and James L. McClelland
(eds.), Information Integration in Perception and Communication,
Attention and Performance, No. XVI, pages 457–478, Cambridge, MA:
MIT Press.
Veenstra, Mettina Jolanda Arnoldina. 1998. Formalizing the Minimalist
Program. Ph.D. thesis, Groningen.
von Neumann, John. 1947. The Mathematician. Reprint in: Ayoub, 2005.
http://www.math.ubc.ca/∼fsl/von%20Neumann.pdf, 25.08.2013.
von Stechow, Arnim. 1996. The Different Readings of wieder “again”: A
Structural Account. Journal of Semantics 13(2), 87–138.
Wittenberg, Eva, Jackendoff, Ray S., Kuperberg, Gina, Paczynski, Martin,
Snedeker, Jesse and Wiese, Heike. To appear. The Processing and
Representation of Light Verb Constructions. In Asaf Bachrach, Isabelle
Roy and Linnaea Stockall (eds.), Structuring the Argument,
Amsterdam/Philadelphia: John Benjamins Publishing Co.