Understanding Sentences


Note on Understanding Sentences, created by wrennie on 04/21/2013.


The linguistic theory of Chomsky

Based on two related ideas:
(1) relations between language and the brain
(2) a technical description of the structure of language
Language is a special feature that is innate, species-specific, and biologically pre-programmed, and a faculty independent of other cognitive structures

The goal of the study of syntax is a grammar (a set of rules that enables us to produce and understand language)

Distinguish between Competence (idealised) and Performance. Competence is what is tapped by our intuitions about which are acceptable sentences of our language and which are ungrammatical strings of words; it concerns our abstract knowledge of our language, and the judgements we would make if we had sufficient time and memory capacity. Performance is what we actually say and understand, limited by factors such as memory and attention.

Chomsky (1986) distinguished between externalised language (E-language) and internalised language (I-language). E-language linguistics is about collecting samples of language and understanding their properties, and about describing the regularities of a language in the form of a grammar; language as a social phenomenon. I-language linguistics is about what speakers know about their language; language as a mental phenomenon. This is competence.

Generative grammar: grammar uses a finite number of rules that in combination can generate all the sentences of a language

Grammar must be capable of generating all the sentences of a language, it should also never generate non-sentences
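A generative grammar in this sense can be sketched directly in code. Below is a minimal Python illustration (the rules and vocabulary are invented for the example, not taken from any real grammar): a finite set of rewrite rules generates every sentence of a tiny language, and because transitive and intransitive verbs get separate rules, the grammar never generates non-sentences like "the dog slept the cat".

```python
import itertools

# Toy phrase-structure grammar: uppercase symbols are non-terminals,
# lowercase words are terminals. Hypothetical rules for illustration.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["Vt", "NP"], ["Vi"]],   # transitive verb + object, or intransitive verb
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "Vt":  [["chased"]],
    "Vi":  [["slept"]],
}

def generate(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in RULES:              # terminal: an actual word
        yield [symbol]
        return
    for expansion in RULES[symbol]:      # non-terminal: apply each rewrite rule
        for parts in itertools.product(*(list(generate(s)) for s in expansion)):
            yield [w for part in parts for w in part]

sentences = sorted(" ".join(words) for words in generate("S"))
for s in sentences:
    print(s)
```

The finite rule set yields exactly six sentences here; adding a recursive rule (see recursion below) would make the set infinite.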

Grammar must give an account of the underlying syntactic structure of sentences

Linguistic theory should explain how children acquire these rules

Syntactic Structures (1957): language is rule based, and our knowledge of syntax can be captured in a finite number of syntactic rules

Describing syntax and phrase-structure grammar

Phrase-structure rules describe how words can be combined, and provide a method of describing the structure of a sentence. Sentences are built up hierarchically from smaller units using rewrite rules.

Phrase-structure grammar: set of rewrite rules

In a phrase-structure grammar, there are 2 types of symbol: terminal elements (the actual words of the language) and non-terminal elements (syntactic categories such as NP and VP, which can be rewritten further)
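The terminal/non-terminal distinction can be shown with a step-by-step derivation: starting from the non-terminal S, keep rewriting the leftmost non-terminal until only terminal elements (words) remain. A minimal sketch, with invented rules (each symbol given a single expansion for simplicity):

```python
# Leftmost derivation with a toy phrase-structure grammar.
# Keys are non-terminals; anything not a key is a terminal (a word).
RULES = {
    "S":   ["NP", "VP"],
    "NP":  ["Det", "N"],
    "VP":  ["V", "NP"],
    "Det": ["the"],
    "N":   ["dog"],
    "V":   ["chased"],
}

string = ["S"]
steps = [" ".join(string)]
while any(sym in RULES for sym in string):   # non-terminals remain
    i = next(i for i, sym in enumerate(string) if sym in RULES)
    string = string[:i] + RULES[string[i]] + string[i + 1:]
    steps.append(" ".join(string))

for step in steps:
    print(step)
# S
# NP VP
# Det N VP
# ...
# the dog chased the dog
```

Reading the steps top to bottom traces exactly the hierarchy a tree diagram would show.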

Content words do most of the semantic work of the language: nouns, adjectives, verbs, and most adverbs. A very large and changing set: open-class words.

Function words do most of the grammatical work: determiners, conjunctions, prepositions. Short and used frequently: closed-class items.

Phrases combine to make clauses: contain a subject and a predicate (the element of the clause that gives information about the subject)

Constituents: a linguistic unit that is part of a larger linguistic unit

Different types of verbs each require different syntactic roles to create acceptable structures: transitive verbs require a single noun phrase, called a direct object (e.g. "kisses"); intransitive verbs do not require any further noun phrase (e.g. "laughs").

Thematic or semantic roles are a sort of interface between syntax (grammatical form) and semantics (meaning). They allow us to interpret who did what to whom from the grammatical structure. Typical thematic roles include:
AGENT (*the dog* chased the cat)
PATIENT (the dog chased *the cat*)
INSTRUMENT (the dog annoyed the cat with *the harmonica*)
GOAL (the boy gave a letter to *the postman*)

Thematic roles: information about which thematic roles are allowed for a particular verb is stored, or specified, in the mental lexicon, e.g. chase: subject = agent, object = patient

The underlying structure of a sentence/phrase is called its phrase structure or phrase marker. Tree diagrams are important for the analysis of syntax

Recursion allows us to produce an infinite number of sentences from a finite number of rules and words
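Recursion means a rule can mention its own category inside its expansion, e.g. an NP containing a relative clause that itself contains an NP. A tiny sketch (the vocabulary is invented; depth is capped only for the demonstration, since the grammar itself imposes no limit):

```python
# Recursive NP rule: NP -> "the cat" | "the cat that chased" + NP
# One recursive rule yields unboundedly many noun phrases.
def np(depth):
    if depth == 0:
        return "the cat"
    return f"the cat that chased {np(depth - 1)}"

for d in range(3):
    print(np(d) + " slept")
# the cat slept
# the cat that chased the cat slept
# the cat that chased the cat that chased the cat slept
```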

Iteration enables us to carry on repeating the same rule, potentially forever

Different types of phrase-structure grammar: context-free grammars contain only rules that are not specified for particular contexts; context-sensitive grammars can have rules that can only be applied in certain circumstances.

Chomsky argued that although a phrase-structure grammar can provide an account of the structure of sentences, it is not capable of capturing our linguistic competence: it cannot explain the relation between related sentences.

Chomsky (1957) showed that knowledge of the relations between sentences could be captured by introducing special rewrite rules known as transformations. Transformations are so central to the theory that the whole approach became known as transformational grammar. A transformation rule is a rewrite rule that takes a string of symbols on the left-hand side and rewrites it as another string on the right-hand side. This accounts for relations such as the passivization transformation, or turning the affirmative declarative form of a sentence into an interrogative (question) or negative form.

Transformation rules capture our intuitions about how sentences are related; they enable the grammar to be simplified, primarily because rules that rewrite strings as other strings capture many aspects of the dependencies between words
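The idea of a string-to-string transformation can be illustrated with the yes/no-question rule often given in textbook presentations: rewrite "NP Aux X" as "Aux NP X". A minimal sketch (the function name and the hand-supplied NP boundary are illustrative; locating the NP is the parser's job, not the transformation's):

```python
# Toy transformation rule: move the auxiliary verb in front of the
# subject noun phrase, turning a declarative into a yes/no question.
def question_transformation(words, np_len):
    """Rewrite the string [NP..., Aux, rest...] as [Aux, NP..., rest...]."""
    np, aux, rest = words[:np_len], words[np_len], words[np_len + 1:]
    return [aux] + np + rest

declarative = "the dog is chasing the cat".split()
print(" ".join(question_transformation(declarative, np_len=2)))
# is the dog chasing the cat
```

Note how one rule relates a whole family of declarative/interrogative sentence pairs, which is exactly what a plain phrase-structure grammar could not express.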

When we access the lexical entry for a word, two types of information become available: information about the word's meanings, and information about the syntactic and thematic roles that the word can take

The goal of sentence interpretation is to assign thematic roles to words in the sentence - who is doing what to whom

An important guide to thematic roles is the verb's argument structure (subcategorization frame). For example, the verb "give" has the structure AGENT gives THEME to RECIPIENT (e.g., "Vlad gave the ring to Agnes")
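One way to picture how such frames might be stored and used is as a lookup table in the mental lexicon: the verb's entry lists its roles in order, and the parsed phrases are paired with them. A minimal sketch (the entries and the function are invented for illustration):

```python
# Hypothetical lexicon: each verb's subcategorization frame lists the
# thematic roles its arguments take, in grammatical order.
LEXICON = {
    "chase": ("AGENT", "PATIENT"),             # subject, direct object
    "give":  ("AGENT", "THEME", "RECIPIENT"),  # subject, object, to-phrase
}

def assign_roles(verb, *arguments):
    """Pair each argument phrase with the role the verb's frame specifies."""
    frame = LEXICON[verb]
    return dict(zip(frame, arguments))

print(assign_roles("chase", "the dog", "the cat"))
# {'AGENT': 'the dog', 'PATIENT': 'the cat'}
print(assign_roles("give", "Vlad", "the ring", "Agnes"))
```

The output is the "who did what to whom" representation that sentence interpretation aims at.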

To assign thematic roles, we must first parse the sentence, i.e. compute its syntactic structure

In autonomous models, the initial stages of parsing at least can only use syntactic information to construct a syntactic representation. According to interactive models, other sources of information (e.g., semantic info) can influence the syntactic processor at an early stage

The question of the number of stages is about whether parsing is modular or interactive: in one-stage models, syntactic and semantic information are both used to construct the syntactic representation in one go; in two-stage models, the first stage is invariably seen as an autonomous stage of syntactic processing, and semantic information is used only in the second stage

Reasons to study syntax: syntactic processing is a stage in extracting the meaning of what we hear/read. Fodor (1975) argued that there is a "language of thought" that bears a close resemblance to our surface language; in particular, the syntax that governs the language of thought may be very similar or identical to that of external language. Studying syntax may therefore be a window onto fundamental cognitive processes

A sentence is syntactically ambiguous when multiple phrase structures can potentially be assigned to it.

The bulk of evidence shows that we spend no longer reading the ambiguous regions of sentences than the unambiguous regions of control sentences, but we often spend longer reading the disambiguation region.

Central issue in parsing: when are different types of information used?
Serial autonomous model: construct one syntactic representation using syntactic information alone, then use semantics to evaluate it; if it fails, try again.
Parallel autonomous model: construct all syntactic representations in parallel using syntactic information, then use semantic/other information to choose between them.
Interactive model: use semantic information to guide parsing from the start; or activate representations of all possible analyses, with the level of activation affected by the plausibility of each.

In "the horse raced past the barn fell", the verb "raced" could be a main verb or a past participle (a word derived from a verb acting as an adjective). We initially try to parse the sentence as a simple noun phrase followed by a verb phrase. In fact it contains a reduced relative clause (a relative clause is one that modifies the main noun, and it is "reduced" because it lacks the relative pronoun "which" or "that").

Rayner and Frazier (1987) intentionally omitted punctuation in order to mislead the participants' parsing processes. Deletion of the complementizer "that" can also produce misleading results. In such cases, these sentences may not be telling us as much about normal parsing as we think. In fact, reduced relative clauses are common: "that" was omitted in 33% of sentences containing relative clauses in a sample from the Wall Street Journal.

There is evidence that appropriate punctuation, such as commas, can reduce the magnitude of the garden path effect by enhancing the reader's awareness of the phrasal structure. In real life, speakers give prosodic cues to provide disambiguating information, and listeners are sensitive to this type of information; for example, speakers tend to emphasise direct-object nouns and insert pauses akin to punctuation. Disfluencies also influence the way in which people interpret garden path sentences: when an interruption (saying "uh") comes before an unambiguous noun phrase, listeners are more likely to think that the noun phrase is the subject of a new clause rather than the object of an old one.

However, speakers do not always give these cues for the express purpose of helping the listener (the audience design hypothesis). Speakers are not always aware that what they are saying is ambiguous, and they tend to produce the same cues even when there is no audience. Prosody and pauses probably reflect both the planning needs of the speaker and a deliberate source of information to aid the listener.

McKoon and Ratcliff (2003) showed that sentences with reduced relatives with verbs like "race" occur in natural language with near-zero probability. So although such sentences might technically be syntactically correct, most people find these sorts of sentences unacceptable. McKoon and Ratcliff argue that sentences with reduced relatives with verbs similar to "race" are ungrammatical. Hence considerable caution is necessary when drawing conclusions about the syntactic processor from studies of garden path sentences.
