1
2
- “As of 1965, and even later, we find in the bowels of Building 20 [the
home of the MIT linguistics department] a group of dedicated
co-conspirators, united by missionary zeal and shared purpose. A year or two later, the garment is
unraveling, and by the end of the decade the mood is total warfare. The field was always closed off
against the outside: no serpent was introduced from outside of Eden to
seduce or corrupt. Any dissension
had to be home-brewed” (Robin Lakoff, LW 102).
3
- “The approach taken by Katz, Fodor, and Postal has been to view a
semantic theory as being necessarily interpretive, rather than
generative. The problem, as they
see it, is to take given sentences of a language and find a device to
tell what they mean. A generative
approach to the problem might be to find a device that could generate
meanings and could map those meanings onto syntactic structures” (p.
105, George Lakoff).
4
- Chomsky: grammar that “specifies the infinite set of well-formed
sentences and assigns to each of these one or more structural
descriptions.”
- Liberal use during the ‘wars’: creative, productive, and other dynamic terms (p. 106)
5
- Autonomous Syntax?
- Was it wise?
6
7
8
- Transformations occur before structures reach the surface (e.g., Minimalism
  and X-bar theory)
9
- “The deep structures of all languages are identical, up to the ordering
of constituents immediately dominated by the same node” (p. 119, Haj
Ross).
10
- Are they syntactic or semantic?
- Are they absolute restrictions?
11
- Every restriction has a counterexample.
12
- It was discovered that, without constraints, a transformational grammar is
  equivalent to a Turing machine.
- A Turing machine can implement any algorithm, so an unconstrained grammar
  places no restriction on what counts as a possible language.
- Chomsky and company assumed that linguistics should identify constraints
  on what a possible language is.
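The Turing-equivalence point can be illustrated with a toy string-rewriting system (a hypothetical sketch, not from the lecture): rewrite rules with no constraints on their form can, like unconstrained transformations, carry out arbitrary computation. Here two rules compute unary addition.

```python
def rewrite(s, rules, max_steps=1000):
    """Apply the first matching rule repeatedly until no rule applies."""
    for _ in range(max_steps):
        for lhs, rhs in rules:
            i = s.find(lhs)
            if i != -1:
                s = s[:i] + rhs + s[i + len(lhs):]
                break
        else:                  # no rule matched: computation has halted
            return s
    return s

# Unary addition as unconstrained rewriting: move each 1 across the +,
# then erase the +.  "11+111" denotes 2 + 3.
rules = [("+1", "1+"), ("+", "")]
print(rewrite("11+111", rules))  # -> 11111
```

Because nothing restricts what the left- and right-hand sides of a rule may be, such a system has the full power of a Turing machine, which is exactly why the result was troubling: a theory that permits everything explains nothing about possible languages.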
13
- What does Generative Semantics do to this structure by breaking the
American Structuralist protocol for examining language?
- Answer: it reverses the direction of dependence, making syntax depend on
  semantics rather than semantics on syntax.
14
- Lakoff’s deep structure kept going deeper and deeper.
- He never stopped to resolve the problems that came up, the exceptions to
  the rules.
- The transformations were different almost every time; there was no
  consistent model.
15
16
- Competence: has nothing to do with comprehension, nothing to do with
  production; it is a model of knowledge in an abstract form and does not
  tell you how to understand a sentence.
- L = {S1, S2, S3, . . . , Sn}, only well-formed sentences
- The grammar pairs each sentence with a deep structure and a logical form;
  this has nothing to do with the way humans generate sentences.
- Grammar is a mathematical system that exists independently of humans.
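As a sketch of the idea that a grammar is an abstract mathematical object, the following toy grammar (hypothetical, not from the lecture) enumerates a set L of well-formed sentences and pairs each with a structural description, while saying nothing about how a speaker would produce or understand any of them.

```python
from itertools import product

# Toy phrase-structure grammar: nonterminal -> list of right-hand sides.
GRAMMAR = {
    "S":  [("NP", "VP")],
    "NP": [("the", "N")],
    "VP": [("Vt", "NP"), ("Vi",)],   # transitive vs. intransitive verbs
    "N":  [("dog",), ("cat",)],
    "Vt": [("saw",), ("chased",)],
    "Vi": [("slept",)],
}

def derive(symbol):
    """Yield (words, bracketed structural description) pairs for a symbol."""
    if symbol not in GRAMMAR:                      # terminal word
        yield [symbol], symbol
        return
    for rule in GRAMMAR[symbol]:
        for parts in product(*(list(derive(s)) for s in rule)):
            words = [w for ws, _ in parts for w in ws]
            struct = f"[{symbol} " + " ".join(st for _, st in parts) + "]"
            yield words, struct

# L pairs each well-formed sentence with its structural description.
# With a recursive rule (e.g. relative clauses), L would be infinite.
L = {" ".join(words): struct for words, struct in derive("S")}
print(len(L))
print(L["the cat slept"])
```

The point of the sketch is the register in which it speaks: L is simply a set of sentence/structure pairs, a characterization of knowledge rather than a procedure, which is exactly the competence/performance distinction at issue.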
17
- You can’t decide whether a sentence is acceptable unless you consider
meaning, but you can’t assign a particular meaning without considering
performance.
18
- Fear – biting off more than he can chew; what if it all falls apart
(like generative semantics)?
- Faith – when there is a performance model that shows actual production,
it will be based on competence.