Natural language
From the very beginnings of AI, researchers have had the goal of interacting with computers in a natural language, that is, in a language spoken by people.
• ease of use: no need to learn a computer language, especially for non-technical users
• richness: refer to objects in a descriptive way


Find me the article written by Jane just before she left on her trip to the Middle East.
• AI learning via books: much of what we know is written down in books
• success in limited domains (limited vocabulary, limited subject matter, limited accuracy) for
– language translation
– front-ends to database systems
– other applications

Here: a very simple form of natural language processing

Linguistics
Linguistics is the study of natural language at a variety of levels
• phonetics: the sounds in words
• morphology: roots of words, prefixes, suffixes
• syntax: how do the words group together?
– Mary kicked the boy in the knee. vs.
Mary kicked the boy in the first row.
• semantics: what do the words mean?
– The astronomer spotted a star. vs.
The astronomer married a star.
– The councillors refused a permit to the demonstrators because they feared violence. vs.
because they advocated violence.
• pragmatics: what are the words being used for?
– Can you juggle? vs.
Can you pass the salt?
– I wouldn’t turn around if I were you.
Here: just syntax and some semantics

Syntax

The study of how words group together in a language: phrases, sentences, paragraphs, …
– the boy in the park with the red bench
– the boy in the park with the grey sweater
– the cat the dog Mary owns chases
and understanding how well-formed strings of words can be generated and recognized.
Syntactically well-formed sentences need not be semantically well-formed
Colourless green ideas sleep furiously.
Syntactically ill-formed word sequences can sometimes be somewhat meaningful
Accident driver car drunk hospital.

Lexicon

The starting point for the syntactic analysis of a language is a lexicon. This specifies the word categories of the language:
• article: a, the
• adjective: fat, rich, happy, oldest, orange, …
• proper nouns: Mary, John, Toronto, Great Britain, …
• common nouns: boy, sweater, park, milk, justice, …
• transitive verbs: kick, love, eat, …
• intransitive verbs: swim, walk, sleep, …
• copula verbs: is, seems, …
• prepositions: in, on, from, beside,…
+ others: pronouns, adverbs, interjections, …

Words can appear in many categories:
for example: set, run, fat

Grammar rules
A grammar of a language is a specification of the various types of well-formed word groups.
Usually grammars are specified by a collection of rules (not unlike Prolog rules) describing how each type of word group is formed from other word groups.
We need to distinguish between
• lexical or terminal categories, like article, or transitive_verb. We write these in lower-case.
• group or non-terminal categories, like phrases or sentences. We will write these in upper case.
It is customary to write them using short cryptic abbreviations: NP instead of Noun_phrase
A grammar rule will have the following form:
category → category1 category2 … categoryn
where the category on the left must be a non-terminal, and the categories on the right (if any) can be either terminals or non-terminals.

Sample grammar
Here is a grammar for simple English declarative sentences
S → NP VP
VP → copula_verb Mods
VP → transitive_verb NP Mods
VP → intransitive_verb Mods
Mods →
Mods → PP Mods
PP → preposition NP
NP → proper_noun
NP → article NP2
NP2 → adjective NP2
NP2 → common_noun Mods
Note: no pronouns, no clauses, …

Parse tree
Parsing is the process of taking a sequence of words and determining how they fit into a given grammar.
This is done by producing a parse tree with the words at the leaves, and the desired group category at the root.
[Parse-tree diagram for the sentence "The fat cat is on the mat", with leaves labelled article, adjective, noun, copula, preposition, article, noun; the tree itself did not survive extraction.]
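The tree can be reconstructed from the sample grammar (a text rendering; indentation shows constituency):

S
  NP
    article: The
    NP2
      adjective: fat
      NP2
        common_noun: cat
        Mods: (empty)
  VP
    copula_verb: is
    Mods
      PP
        preposition: on
        NP
          article: the
          NP2
            common_noun: mat
            Mods: (empty)
      Mods: (empty)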

A grammar is ambiguous if there is a sequence of words with two distinct parse trees
[Two parse-tree diagrams for the same noun phrase appeared here; only the captions survived extraction.]
Here, "with the hat" modifies "boy".
Here, "with the hat" modifies "park".
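As an illustration (the full phrase on the original slide is not recoverable, so assume it was "the boy in the park with the hat"), the two readings correspond to the bracketings:

the boy [in the park] [with the hat]        ("with the hat" modifies "boy")
the boy [in [the park [with the hat]]]      ("with the hat" modifies "park")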

What we will do here is write a program in Prolog that does simple syntactic and semantic processing for noun phrases only:
• parse legal noun phrases according to the given grammar
• determine a referent for the legal noun phrases
that is: find the name(s) of the individual(s) being referred to by the noun phrase
For example:
who([a,woman], X).
X = mary ;
X = linda ;

Note: we will be using lists of words, so ignore the commas.

who([a,woman,with,a,red,hat], X).
X = linda ;

who([a,red,with,a,woman,hat], X).

The approach
The program has 3 pieces
• a database
These are clauses that represent the facts in the world we are interested in: who the people are, where they are, what they are wearing, etc.
Nothing in the database is intended to be language specific.
• a lexicon
These are clauses that describe the English words we will be using in the noun phrases.
The lexicon also links these words to their meanings via the predicates and constants in the database.
Nothing in the lexicon depends on the grammar.
• a parser / interpreter
These are clauses that define the grammar.
The parser also uses information provided by the separate lexicon and database to decide what individual is being referred to.

Example database
In this example, what we know about the world is represented by a collection of atomic facts about some people, parks, trees, and hats. Each has a unique name.
person(john).    sex(john,male).      size(john,small).
person(george).  sex(george,male).    size(george,big).
person(mary).    sex(mary,female).    size(mary,small).
person(linda).   sex(linda,female).   size(linda,small).

park(qbeach).    park(queens_park).

tree(tree01).    size(tree01,big).
tree(tree02).    size(tree02,small).
tree(tree03).    size(tree03,small).

hat(hat01).      size(hat01,small).   colour(hat01,red).
hat(hat02).      size(hat02,small).   colour(hat02,blue).
hat(hat03).      size(hat03,big).     colour(hat03,red).
hat(hat04).      size(hat04,big).     colour(hat04,blue).

in(john,qbeach).         on(hat01,john).
in(george,qbeach).       on(hat04,george).
in(mary,queens_park).    on(hat02,mary).
in(linda,queens_park).   on(hat03,linda).
beside(mary,linda).

in(tree01,queens_park).  in(tree02,queens_park).
in(tree03,qbeach).
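A few hypothetical queries against this database (not from the original notes), showing the kind of fact it can answer directly:

size(X,big).
X = george ;
X = tree01 ;
X = hat03 ;
X = hat04 ;
no

on(Hat,linda), colour(Hat,C).
Hat = hat03   C = red ;
no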

Building a lexicon
In the lexicon, we need to describe all of the words we intend to use in noun phrases.
In addition, for each word, we need to know what semantic constraint it imposes on the referent we are looking for.
For example:
• If the word is “hat” and the referent is X, the
goal hat(X) should succeed
• If the word is “red” and the referent is X, the
goal colour(X,red) should succeed
For prepositions, there are two referents involved.
For example,
• If the word is “on” and the referents are X and Y,
the goal on(X,Y) should succeed
So when we parse “the red hat on John”, we will end up
with the goal
hat(X), colour(X,red), on(X,john).
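Against the example database above, this conjunction of goals succeeds with X = hat01 (the small red hat that is on John).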

Example lexicon
article(a).
article(the).
common_noun(park,X) :- park(X).
common_noun(tree,X) :- tree(X).
common_noun(hat,X) :- hat(X).
common_noun(man,X) :- person(X), sex(X,male).
common_noun(woman,X) :- person(X), sex(X,female).
adjective(big,X) :- size(X,big).
adjective(small,X) :- size(X,small).
adjective(red,X) :- colour(X,red).
adjective(blue,X) :- colour(X,blue).
preposition(on,X,Y) :- on(X,Y).
preposition(in,X,Y) :- in(X,Y).
preposition(beside,X,Y) :- beside(X,Y).
preposition(beside,X,Y) :- beside(Y,X).
/* The preposition "with" is flexible in its use */
preposition(with,X,Y) :- on(Y,X).
preposition(with,X,Y) :- in(Y,X).
preposition(with,X,Y) :- beside(Y,X).
preposition(with,X,Y) :- beside(X,Y).
/*For purposes of this example, any word that is not in
one of the four categories above is a proper noun */
proper_noun(X) :-
not article(X), not common_noun(X,_),
not adjective(X,_), not preposition(X,_,_).
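A couple of hypothetical queries against this lexicon (again, not from the original notes), showing how a lexical entry both recognizes a word and constrains its referent:

common_noun(woman,X).
X = mary ;
X = linda ;
no

adjective(red,X).
X = hat01 ;
X = hat03 ;
no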

The database vs. the lexicon
In our example, the database names are very similar to the lexicon names:
common_noun(hat,X) :- hat(X).
But observe that they have very different purposes:
• the lexicon describes the words we are interested in using (like the word “hat”)
• the database describes the facts of the world we care about (using the concept of a hat)
Different words may involve the same concept:
• synonyms
common_noun(cap,X) :- hat(X).
common_noun(derby,X) :- hat(X).
• other languages
common_noun(chapeau,X) :- hat(X).
Different concepts may involve the same word:
common_noun(derby,X) :- hat(X).
common_noun(derby,X) :- race(X).

Example parser
For each non-terminal category in the grammar, we will have a predicate in the parser.
Each such predicate will take two arguments:
• a list of words to be parsed
• a referent
Then each grammar rule becomes a rule in Prolog:
who(Words,Ref) :- np(Words,Ref).
np([Name],Name) :- proper_noun(Name).
np([Art|Rest],Who) :-
article(Art), np2(Rest,Who).
np2([Adj|Rest],Who) :-
adjective(Adj,Who), np2(Rest,Who).
np2([Noun|Rest],Who) :-
common_noun(Noun,Who), mods(Rest,Who).
mods([],_).
mods(Words,Who) :-
    append(Start,End,Words),   /* split Words into a Start list and an End list */
    pp(Start,Who),
    mods(End,Who).
pp([Prep|Rest],Who) :-
preposition(Prep,Who,Who2), np(Rest,Who2).
append([],X,X).
append([H|T],X,[H|Y]) :- append(T,X,Y).
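The call append(Start,End,Words) is used as a generator here: on backtracking it enumerates every way of splitting Words into a prefix Start and a suffix End. For example (a hypothetical query, not in the original notes):

append(Start,End,[beside,mary]).
Start = []   End = [beside,mary] ;
Start = [beside]   End = [mary] ;
Start = [beside,mary]   End = [] ;
no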

Tracing a parse
who([a,small,woman], X).
np([a,small,woman], X).
article(a), np2([small,woman], X).
np2([small,woman], X).
adjective(small,X), np2([woman], X).
size(X,small), np2([woman], X).
np2([woman], john).                                  eventually fails
np2([woman], mary).
adjective(woman,mary), np2([ ], mary).               fails
common_noun(woman,mary), mods([ ], mary).
person(mary), sex(mary,female), mods([ ], mary).     succeeds: X = mary

Using a prepositional modifier
who([a,woman,beside,mary], X).
np([a,woman,beside,mary], X).
article(a), np2([woman,beside,mary], X).
adjective(woman,X), np2([beside,mary], X).                 fails
common_noun(woman,X), mods([beside,mary],X).
person(X), sex(X,female), mods([beside,mary],X).
sex(john,female), mods([beside,mary],john).                fails
sex(george,female), mods([beside,mary],george).            fails
sex(mary,female), mods([beside,mary],mary).
mods([beside,mary],mary).                                  eventually fails
sex(linda,female), mods([beside,mary],linda).
mods([beside,mary],linda).
append(Start,End,[beside,mary]), pp(Start,linda), mods(End,linda).
pp([ ],linda), mods([beside,mary],linda).                  fails
pp([beside],linda), mods([mary],linda).                    fails
pp([beside,mary],linda), mods([ ],linda).
preposition(beside,linda,Who2), np([mary],Who2), mods([ ],linda).
beside(linda,Who2), np([mary],Who2), mods([ ],linda).      fails
beside(Who2,linda), np([mary],Who2), mods([ ],linda).
np([mary],mary), mods([ ],linda).
success X = linda

Dealing with ambiguity
We can use semantic information to resolve an otherwise ambiguous noun phrase
np2([man,in,the,park,with,a,tree], _).
common_noun(man,_), mods([in,the,park,with,a,tree], _).
mods([in,the,park,with,a,tree], john).
append(Start,End,[in,the,park,with,a,tree]), pp(Start,john), mods(End,john)
Start=[in,the,park]  End=[with,a,tree]:
pp([in,the,park],john), mods([with,a,tree],john)             fails
Start=[in,the,park,with,a,tree]  End=[ ]:
pp([in,the,park,with,a,tree],john), mods([ ],john)           succeeds

The failed execution (the split with Start=[in,the,park]) corresponds to the parse tree where "with a tree" is understood to be a PP that modifies the noun "man" (in this case, John).

Where are the parse trees?
A parse tree can be reconstructed from a successful execution path (ignoring certain details).
[A parse-tree diagram appeared here, with NP2, Mods, PP and NP nodes over article, noun and preposition leaves; the tree itself did not survive extraction.]
np2([man,in,the,park,with,a,tree], _).
common_noun(man,_), mods([in,the,park,with,a,tree], _).
pp([in,the,park,with,a,tree],john), mods([ ],john)
preposition(in,john,Who2), np([the,park,with,a,tree],Who2)
article(the), np2([park,with,a,tree],qbeach)
common_noun(park,qbeach), mods([with,a,tree], qbeach).
pp([with,a,tree],qbeach), mods([ ],qbeach)
preposition(with,qbeach,Who2), np([a,tree],Who2)
article(a), np2([tree],tree03)
common_noun(tree,tree03), mods([ ], tree03).

More examples
who([a,man,with,a,big,hat],X).
X = george ;

who([the,hat,on,george],X).
X = hat04 ;

who([a,man,in,a,park,with,a,big,tree],X).

who([a,woman,in,a,park,with,a,big,tree],X).
X = mary ;
X = linda ;

who([a,woman,in,a,park,with,a,big,red,hat],X).
X = linda ;

who([a,woman,beside,a,woman,with,a,blue,hat],X).
X = mary ;
X = linda ;
no

who([a,woman,with,a,blue,hat,beside,a,woman],X).
X = mary ;

Fun with parsing
who([the,W,on,john],hat01).
W = hat ;

who([a,man,with,H],P), who([the,hat,on,P],H).
H = hat01   P = john ;
H = hat04   P = george ;

who([a,X,Y],linda).
X = small   Y = woman ;
no

who(L,linda).
L = [linda] ;
[Execution aborted]

Execution here will not terminate. The parser is constructing a noun phrase of the form "a small small small small …" to describe Linda.

As above, but only 5 words are allowed, the first of which must be the article "a":

L = [a,_,_,_,_], who(L,linda).
L = [a,small,small,small,woman] ;
L = [a,small,woman,in,queens_park] ;
L = [a,small,woman,beside,mary] ;
L = [a,small,woman,with,hat03] ;
L = [a,small,woman,with,mary] ;
L = [a,woman,in,a,park] ;
L = [a,woman,in,the,park] ;
L = [a,woman,beside,a,woman] ;
L = [a,woman,beside,the,woman] ;
L = [a,woman,with,a,hat] ;
L = [a,woman,with,the,hat] ;
L = [a,woman,with,a,woman] ;
L = [a,woman,with,the,woman] ;

Interrogative sentences
To handle some interrogative sentences, we can use grammar rules similar to before:
• wh-questions
WH → wh_word copula_verb NP
Who is the woman with Linda?
What is the hat on the man in the park?

WH → wh_word copula_verb PP
Who is beside Mary?
What is in Queen’s park?

• yes-no questions
YN → copula_verb NP NP
Is Mary a woman in Queen’s Park?
Is the man with the blue hat John?

YN → copula_verb NP PP
Is John beside Mary?
Is the big red hat on George?

A program for yes-no questions
To parse simple yes-no interrogatives, we use a program similar to before.
We want a predicate ynq(words,answer)
to succeed when words is a proper yes-no question and answer is the appropriate response
E.g.: ynq([is,john,beside,mary],X).
ynq([Verb|Rest],Answer) :-
copula_verb(Verb),
append(W1,W2,Rest),
np(W1,Ref),
same_ref(W2,Ref,Answer).
same_ref(W2,Ref,yes) :- np(W2,Ref).
same_ref(W2,Ref,yes) :- pp(W2,Ref).
same_ref(W2,Ref,no) :-
np(W2,Ref2),
not Ref=Ref2.
same_ref(W2,Ref,no) :-
pp(W2,Ref2),
not Ref=Ref2.
Note that we distinguish between failure (e.g. the question is ill-formed or the referents do not exist) and returning no as the answer.
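For example, with the example database and parser above we would expect behaviour along these lines (a hypothetical session, not from the original notes):

ynq([is,john,beside,mary],X).
X = no ;

ynq([is,mary,beside,linda],X).
X = yes ;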

Problems with declaratives
To handle declarative sentences, we need a very different form of semantics
Consider: “John is in the park with the big tree.”
• After we have determined that the referents in question are John and queens_park, we do not want to test in(john,queens_park).
• Most likely, what is intended is that we should treat this as new information, and add it as a new fact into our database for later use.
This raises two problems:
• How do we get a program to add facts to a Prolog database?
• Our lexicon is geared to retrieval. For example, in determining that “in” is a preposition, we end up generating a goal of the form in(X,Y).
The solution: some new features of Prolog…

Atoms as terms
So far, atoms have been of the form
predicate(term, …, term)
where a term is either a constant, a variable or a list whose elements are terms
However, a term may also be another atom. So instead of using an atom like
likes(john, [book,steinbeck,[east,of,eden]])
we can make the second argument into an atom:
likes(john, book(steinbeck,[east,of,eden])).
If we match this against
likes(john, book(X,[east,of,eden]))
we get X = steinbeck. Similarly, if we match it with
likes(john, X)
we get X = book(steinbeck,[east,of,eden]), so a variable may have an atom as its value.

Meta-predicates
There are three special operations that can be performed within Prolog programs on atoms:
• assert(atom)
This goal always succeeds, and has the effect of adding the atom as a fact to Prolog’s database.
• retract(atom)
This goal has the effect of removing the first fact in Prolog’s database that matches the atom. It fails iff there is no match.
• call(atom)
This goal has the effect of attempting to solve the atom, possibly binding variables it contains. It succeeds iff the atom succeeds.
So we can write “update” programs like:
get_married(X) :-
retract(single(X)),
assert(married(X)).
which could be used to keep a model of a changing world up to date.
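A hypothetical session illustrating the effect (assuming single(john) is asserted first; depending on the Prolog system, predicates changed by assert and retract may also need to be declared dynamic):

assert(single(john)).
yes
get_married(john).
yes
married(john).
yes
single(john).
no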

A new lexicon
Instead of making the lexicon call the database predicate directly as a goal, as in,
adjective(blue,X) :- colour(X,blue).
preposition(on,X,Y) :- on(X,Y).
we can have each lexical category simply return the predicate as an atom:
adjective(blue, X, colour(X,blue)).
preposition(on, X, Y, on(X,Y)).
It is then up to the parsing program to decide what to do with the database atoms.
Within noun phrases, we might simply call the atom:
np2([Adj|Rest],Who) :-
    adjective(Adj,Who,Atom),   /* find the atom     */
    call(Atom),                /* call it as a goal */
    np2(Rest,Who).
This forces us to use noun phrases “referentially” only.
It won’t work for: John wants to marry a rich lawyer.
(attributive vs. referential noun phrases)
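The other lexical categories and parser rules can be adapted in the same way. A minimal sketch for common nouns (the three-argument common_noun entries below are illustrative assumptions, not part of the original lexicon):

common_noun(hat, X, hat(X)).
common_noun(woman, X, (person(X), sex(X,female))).

np2([Noun|Rest],Who) :-
    common_noun(Noun,Who,Atom),   /* look up the database atom for the noun */
    call(Atom),                   /* call it as a goal to find a referent   */
    mods(Rest,Who).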

Prepositional phrases
For PPs, we might want to find a referent using the embedded NP only, and return an atom containing the main predicate:
pp([Prep|Rest], X, Atom) :-
preposition(Prep,X,Y,Atom),
np(Rest,Y).
Note that the variable Y here appears in Atom, and will get a value as a result of parsing the embedded noun phrase with np(Rest,Y).
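This is what makes declarative sentences workable: instead of calling the returned atom, the parser can hand it back to be asserted. A rough sketch (my own illustration, not from the notes, and assuming proper_noun and np are adjusted to the new lexicon arities) of how a declarative of the form "Name is PP" might be handled:

decl([Name,is|Rest]) :-
    proper_noun(Name),      /* e.g. john                                          */
    pp(Rest,Name,Atom),     /* parse the PP, returning e.g. in(john,queens_park)  */
    assert(Atom).           /* add the new fact to the database for later use     */

So decl([john,is,in,the,park,with,the,big,tree]) would add in(john,queens_park) to the database rather than testing it.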

