Semantic Analysis in Language Technology
Lecture 1: Introduction
Course Website: http://stp.lingfil.uu.se/~santinim/sais/sais_fall2013.htm

MARINA SANTINI
PROGRAM: COMPUTATIONAL LINGUISTICS AND LANGUAGE TECHNOLOGY

DEPT OF LINGUISTICS AND PHILOLOGY

UPPSALA UNIVERSITY, SWEDEN

12 NOV 2013
Acknowledgements
 Thanks to Mats Dahllöf for the many slides I borrowed from his previous course and for structuring such interesting and comprehensive content.

Practical Information
INTENDED LEARNING OUTCOMES
ASSIGNMENTS AND EXAMINATION
READING LIST
DEMOS

Course Website & Contact Details
 Course website:
 http://stp.lingfil.uu.se/~santinim/sais/sais_fall2013.htm
 Contact details:
 santinim@stp.lingfil.uu.se
 marinasantini.ms@gmail.com
 marinaromestockholm@gmail.com

Check the website regularly and make sure to refresh the page: we are building up this course together, so this page will be continuously updated!

About the Course
 Introduction to Semantics in Language Technology and NLP.
 Focus on methods used in Language Technology and NLP to perform the following tasks:
 Sentiment Analysis (SA)
 Information Extraction (IE)
 Word Sense Disambiguation (WSD)
 Predicate-Argument Structure extraction (PAS)

Intended Learning Outcomes
 In order to pass the course, a student must be able to:
 describe systems that perform the following tasks, apply them to authentic linguistic data, and evaluate the results:
1. detect and extract attitudes and opinions from text, i.e. Sentiment Analysis (SA);
2. use semantic analysis in the context of Information Extraction (IE);
3. disambiguate instances of polysemous lemmas, i.e. Word Sense Disambiguation (WSD);
4. use robust methods to extract the Predicate-Argument Structure (PAS).

Compulsory Readings
1. Bing Liu (2012). Sentiment Analysis and Opinion Mining. Morgan & Claypool.
2. Richard Johansson and Pierre Nugues (2008). Dependency-based Syntactic–Semantic Analysis with PropBank and NomBank. CoNLL 2008: Proceedings of the 12th Conference on Computational Natural Language Learning.
3. Daniel Jurafsky and James H. Martin (2009). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Second Edition, Pearson Education.
4. Daniel Gildea and Daniel Jurafsky (2002). Automatic Labeling of Semantic Roles. Computational Linguistics 28(3), 245-288.
5. Martha Palmer, Daniel Gildea and Paul Kingsbury (2005). The Proposition Bank: An Annotated Corpus of Semantic Roles. Computational Linguistics 31(1), 71-106.
6. Additional suggested readings will be listed at the end of each lecture.

Demos & Tutorials
 This list will be continuously updated, also with your contribution…

Assignments and Examination
 Four Assignments:
1. Essay writing: independent study of a system, an approach, or a field within semantics-oriented language technology. The study will be presented both as a written essay and an oral presentation. The essay work will also include a feedback step where the work of another group is reviewed.
2. Assignment on Predicate-Argument Structure (PAS)
3. Assignment on Sentiment Analysis (SA)
4. Assignment on Word Sense Disambiguation (WSD)
 General Info:
 No lab sessions, supervision by email
 Essay and assignments must be submitted to santinim@stp.lingfil.uu.se
 Examination:
 Written report submitted for each assignment
 All four assignments necessary to pass the course
 Grade G will be given to students who pass each assignment. Grade VG to those who pass the essay assignment and at least one of the other ones with distinction.

IMPORTANT!
 Start thinking about a topic you are interested in for your essay writing assignment!

Practical Organization
 45 min + 15 min break
 Lectures on Course webpage and SlideShare
 Email all your questions to me: santinim@stp.lingfil.uu.se
 IMPORTANT:
 Send me an email to santinim@stp.lingfil.uu.se, so that I can make sure that I have all the correct email addresses. If you do not get an acknowledgement of receipt, please give me a shout!

Interaction and Cooperation
 Communicate with me and with your classmates to exchange ideas if you have problems understanding notions, concepts, or practical implementations.
 Recommendation: share your knowledge with your peers and let off steam.
 Cheating is not permitted 

Semantics in Language Technology Overview
SEMANTICS IN LANGUAGE TECHNOLOGY
APPLICATIONS
LEXICAL SEMANTICS
REPRESENTATION OF MEANING
SUMMARY

Semantics in Language Technology

Logic and Semantics
 Aristotelian logic – important ever since.
 Syllogisms, e.g.:
 Premise: No reptiles have fur.
 Premise: All snakes are reptiles.
 Conclusion: No snakes have fur.
 Modern logic develops, late 19th Century – more general and systematic.
 Formal semantics in linguistics and philosophy based on logic (20th Century).

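As a minimal illustration (added here, not part of the original slides), the syllogism above can be checked mechanically by treating each predicate as a set of individuals in a toy domain:

```python
# Toy sketch: the predicates of the syllogism as sets over a small, invented domain.
reptiles = {"python", "iguana", "cobra"}
snakes   = {"python", "cobra"}
furry    = {"cat", "dog"}

assert reptiles.isdisjoint(furry)   # Premise: No reptiles have fur.
assert snakes.issubset(reptiles)    # Premise: All snakes are reptiles.
assert snakes.isdisjoint(furry)     # Conclusion: No snakes have fur.
print("The syllogism holds in this model.")
```
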
Formal and Computational Semantics
 Computational semantics “is the study of how to automate the process of constructing and reasoning with meaning representations of natural language expressions.” (Wikipedia)
 Early systems rule-based, most famous example: “Montague grammar” (1970). Sophisticated mechanisms for translation of English into a very rich logic.
 Language technology: recent interest in data-driven and machine learning-based methods.

Semantics in NLP
 NLP semantics is typically more limited in scope than NL semantics as analysed in linguistics and philosophy.
 NLP applications often handle semantic aspects without having explicitly semantic components, e.g. in machine translation.
 Other aspects of language – morphology, syntax, etc. – can be seen as support systems for semantics: the purpose of language lies in the use of expressions as carriers of semantic meaning. And that is what many NLP systems have to respect, e.g. MT, retrieval, classification, etc.

Semantics and Truth (i)
Semantics, meanings and states of affairs:
 What a sentence means: a structure involving (lexical) concepts and relations among them. Can be articulated as a semantic representation.
E.g. I ate a turkey sandwich. in predicate logic (one possible rendering is sketched after this slide).
 A sentence, and the semantic representation of a sentence, is also the representation of a possible state of affairs.

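The predicate-logic formula itself appeared as an image on the original slide. One common event-based (neo-Davidsonian) rendering, in the style used by Jurafsky & Martin – offered here only as an assumed reconstruction, not the slide's exact formula – is:

```latex
\exists e \, \exists x \; \big( \mathit{Eating}(e) \wedge \mathit{Eater}(e, \mathit{Speaker}) \wedge \mathit{Eaten}(e, x) \wedge \mathit{TurkeySandwich}(x) \big)
```
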
Semantics and Truth (ii)
 Correspondence theory of truth: if the content of a sentence corresponds to an actual state of affairs, it is true; otherwise, it is false.
 Ignoring philosophical complications, in many cases we can extract knowledge from texts.
E.g. Warmer climate entails increased release of carbon dioxide by inland lakes. (From a uu.se press release.)
 Related issue: which texts should we trust?
 Many sentences are difficult to formalize in logic. (Modality, conditionality, vague quantification, tense, etc.)

Representation of Meaning

Formalizing Meaning
 Linguistic content has – at least to a certain degree – a logical structure that can be formalized by means of logical calculi – meaning representations.
 The representation languages should be simple and unambiguous – in contrast to complex and ambiguous NL.
 Logical calculi come with accounts of logical inference. They are useful for reasoning-based applications.
 Meaning formalization faces far-reaching conceptual and computational difficulties.

Compositionality
 Linguistic content is compositional: simple expressions have a given (lexical) meaning; the meaning of complex expressions is determined by the meanings of their constituents. People produce and understand new phrases and sentences all the time. (NLP must also deal with these.)
 Compositionality is studied in detail in compositional syntax-driven semantics. Work in this field is typically about hand-coded rule systems for small fragments of NL.

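To make the idea concrete, here is a minimal sketch (an assumed illustration, not one of the hand-coded systems referred to above) of syntax-driven composition for a tiny fragment of English: lexical meanings are functions, and the meaning of a sentence is obtained by applying the verb meaning to its arguments.

```python
# Minimal compositional-semantics sketch for a toy S-V / S-V-O fragment.
lexicon = {
    "john":   "john",                                          # individual constant
    "mary":   "mary",
    "sleeps": lambda subj: f"sleep({subj})",                   # unary predicate
    "likes":  lambda obj: lambda subj: f"like({subj},{obj})",  # binary predicate
}

def interpret(sentence: str) -> str:
    """Compose word meanings by function application."""
    words = sentence.lower().split()
    if len(words) == 2:                       # e.g. "John sleeps"
        subj, verb = words
        return lexicon[verb](lexicon[subj])
    if len(words) == 3:                       # e.g. "John likes Mary"
        subj, verb, obj = words
        return lexicon[verb](lexicon[obj])(lexicon[subj])
    raise ValueError("fragment only covers S-V and S-V-O sentences")

print(interpret("John sleeps"))      # sleep(john)
print(interpret("John likes Mary"))  # like(john,mary)
```
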
Compositional Aspects
Compositional Aspects – Argument Structure
Discourse-Related Aspects
Compositional semantics in Language Technology

First-Order Predicate Logic (i)
 “flexible, well-understood, and computationally tractable approach to the representation of knowledge [and] meaning” (J&M 2009: 589)
 expressive
 verifiability against a knowledge base (related to database languages)
 inference
 model-theoretic semantics

First-Order Predicate Logic (ii)
 Boolean operators: negation and connectives
 Existential/universal quantification
 Individual constants
 Predicates (taking a number of arguments)

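The "verifiability against a knowledge base" mentioned on the previous slide can be sketched with a toy model checker (an assumed illustration with invented facts): constants name individuals, predicates are sets of tuples, and quantifiers range over the domain.

```python
# Toy sketch of model checking first-order statements against a tiny knowledge base.
domain = {"fido", "felix", "kanzi"}
facts = {                                   # extension of each predicate
    "Dog":   {("fido",)},
    "Cat":   {("felix",)},
    "Ape":   {("kanzi",)},
    "Barks": {("fido",)},
}

def holds(pred, *args):
    return tuple(args) in facts.get(pred, set())

# Universal quantification: every dog barks.
print(all(holds("Barks", x) for x in domain if holds("Dog", x)))   # True

# Existential quantification with negation: some individual is not a cat.
print(any(not holds("Cat", x) for x in domain))                    # True
```
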
When to assume compositionality?
Multi-Word Expressions
MWEs (a.k.a. multiword units or MUs) are lexical units encompassing a wide range of linguistic phenomena, such as idioms (e.g. kick the bucket = to die), collocations (e.g. cream tea = a small meal eaten in Britain, with small cakes and tea), regular compounds (cosmetic surgery), graphically unstable compounds (e.g. self-contained <> self contained <> selfcontained – all graphical variants have a huge number of hits in Google), light verbs (e.g. do a revision vs. revise), lexical bundles (e.g. in my opinion), etc. While easily mastered by native speakers, the correct interpretation of MWEs remains challenging both for non-native speakers and for language technology (LT), due to their complex and often unpredictable nature.

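As a small, assumed illustration of one of the phenomena above, graphically unstable compounds can be grouped by normalising away hyphens and whitespace before counting occurrences:

```python
# Group graphical variants of a compound under one normalised form.
from collections import Counter
import re

def normalise(span: str) -> str:
    return re.sub(r"[\s-]+", "", span.lower())

candidates = ["self-contained", "self contained", "selfcontained", "self-contained"]
print(Counter(normalise(c) for c in candidates))   # Counter({'selfcontained': 4})
```
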
Cross-linguality
Use Case: Information Access
In multi-ethnic societies, like Swedish society, it is common that many non-native speakers use public websites – e.g. the Arbetsförmedlingen or Pensionsmyndigheten websites – to access information that is vital to their living and integration in the host country. National regulations are often accompanied by special terminology and new coinages. For instance, the Swedish expression /egenremiss/ (14,900 hits, Google.se April 2013) – or alternatively, as an MWE, /egen remiss/ (8,210 hits, Google.se April 2013) – denotes a referral to a specialist doctor written by patients themselves. This expression is made up of two common Swedish words, /egen/ `own (adj)' and /remiss/ `referral'. It is a recent expression (probably coined around 2010) and not yet recorded in any official dictionary, nor in Wiktionary or other multilingual online lexical resources. However, it is very frequent in query logs belonging to a Swedish public health service website. When trying to implement a cross-lingual search based on the automatic translation of query logs, it turned out that none of the existing multilingual lexical resources contained this expression.

Use Case: Personal Use & Text Understanding
 The use of expressions that are marked for style, genre, domain, or register (and/or other textual categories), or the use of expressions which are misspelled or idiomatic for some textual category, is beyond the competence of a novice reader or a non-native speaker. Additionally, in a web search or in social networks, one cannot tell if the texts one reads are good or bad the way a first-language reader can. When readers/users read a language they do not know at all, they can use automatic translation, online dictionaries, or other lexical resources. However, what they cannot determine well is the *type* of text they are reading. They cannot tell if the text is verbose, terse, formal, informal, stupid, funny, bad, or good.
 For instance, the phrase "es ist zum Kotzen" signals vernacular and unrefined text as well as being a controversial expression. The phrase "isch alle", instead, signals that this line in the text is spoken by a Berliner.

Semantics vs Pragmatics/Discourse (i)
 What does a word, a phrase, a text segment mean as an NL expression? (“Linguistic meaning” – semantics.) Conventional, static, systemic aspect of meaning.
 What does the author intend to convey by means of a word, a phrase, a text segment? (“Speaker meaning” – pragmatics/discourse.) Contextual, dynamic aspect of meaning.
 The two aspects depend on each other, of course.

Semantics vs Pragmatics/Discourse (ii)
Semantics vs Pragmatics/Discourse (iii)

Applications

Semantics-oriented NLP applications
 Machine translation: the translation of a text segment should mean the same as the original (to emphasize linguistic meaning) or should convey the same content (to emphasize speaker meaning).
 Information extraction is to extract components of the information conveyed by a text.
 Question answering is extraction – combined with inference – of an answer to a given question.
 Text classification, in typical cases, relates to the meanings of the texts being classified.

Semantics and Generation
 Generation: semantic representation → NL. Less challenging than analysis – the structure of the input is under control. Needed in e.g. dialogue systems.
 Interlingua – semantic representation in machine translation:
Analysis: source language → interlingua.
Generation: interlingua → target language.
Would be economical if many languages are involved. The idea has not proved very successful so far.

Reference
 Reference is very important – what statements are about.
 Referring expressions are very common.
 Reference is a discourse phenomenon.
 Resolving reference is a crucial step in e.g.
 extraction, e.g. in sentiment analysis
 translation, e.g. to get agreement right: English it vs French il/elle vs Swedish den/det.

Reference – An Example

Kinds of Referring Expressions
 Indefinite noun phrases. E.g. a book. Introduce new entities.
 Pronouns. E.g. he. Typically coreferent with a previous referring expression (antecedent).
 Names. E.g. Bill Gates.
 Demonstratives. E.g. this room.
 Other definite noun phrases. E.g. the first chapter. Reference to a somehow known entity, often previously mentioned.

Named Entity Recognition (NER)
 To identify expressions being used as names. (What characterizes a “name”?)
 Also to identify what kind of name it is: e.g. of a person, or a place, or a stretch of time, or a chemical compound, or a gene, etc.
 “State-of-the-art NER systems for English produce near-human performance. For example, the best system entering MUC-7 scored 93.39% of F-measure while human annotators scored 97.60% and 96.95%” (Wikipedia).

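For reference, the F-measure quoted above combines precision and recall as F1 = 2PR / (P + R). A minimal helper (with made-up example values, not the MUC-7 figures):

```python
# F-measure: harmonic combination of precision and recall (beta weights recall).
def f_measure(precision: float, recall: float, beta: float = 1.0) -> float:
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

print(round(f_measure(0.94, 0.93), 3))   # 0.935 (illustrative values only)
```
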
Anaphora and Deixis Resolution
 Pronouns (they), pronominal adverbs (there, then), and definite NPs refer to entities by means of contextually given information.
 E.g. by referring to previously mentioned referents – anaphora.
 E.g. by reference based on the participants, time, and place of the discourse – deixis (e.g. I, you, here, yesterday).
 Anaphora and deixis resolution is a much more challenging task than NER. The reference of name-like graph words is much more predictable. Compare Barack Obama and he.

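A deliberately naive sketch (an assumed example, not a course system) of pronoun resolution: pick the most recent preceding mention that is compatible in gender and number. Real resolvers use far richer information.

```python
# Recency-based pronoun resolution over a toy list of mentions.
MENTIONS = [  # (position, surface form, gender, number)
    (0, "Barack Obama", "masc", "sg"),
    (5, "Angela Merkel", "fem", "sg"),
]
PRONOUNS = {"he": ("masc", "sg"), "she": ("fem", "sg"), "they": (None, "pl")}

def resolve(pronoun: str, position: int):
    gender, number = PRONOUNS[pronoun]
    candidates = [m for m in MENTIONS
                  if m[0] < position
                  and (gender is None or m[2] == gender)
                  and m[3] == number]
    return candidates[-1][1] if candidates else None

print(resolve("he", 10))    # Barack Obama
print(resolve("she", 10))   # Angela Merkel
```
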
Sentiment Analysis – an extraction task
 What views do people express in blogs and reviews? That’s interesting for politicians and marketing people.
 Opinions are often expressed in a personal and informal way.
E.g. Peter bought me a Baileys marzipan chocolate thing which I washed down with Gluehwein and that, in combination with the bright lights and cheery faces really made me feel warm inside! (From a blog post.)
 Sentiment analysis: to extract the referent of a “sentiment” and the positive–negative polarity associated with it.
E.g. Baileys marzipan chocolate – positive.

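A minimal lexicon-based sketch (an assumed example; the readings cover real SA methods): assign a polarity to a text by counting sentiment-bearing words.

```python
# Count positive vs negative lexicon hits to label a text's polarity.
POSITIVE = {"warm", "cheery", "bright", "good", "great"}
NEGATIVE = {"bad", "awful", "cold", "terrible"}

def polarity(text: str) -> str:
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("the bright lights and cheery faces made me feel warm inside"))
# positive
```
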
Lexical Semantics

Lexical Concepts
 Words are often grammatically simple, but carry a structured conceptual content. Definitions “unpack” the content of concepts:
 friend – a person whom one knows well, is loyal to, etc.
 turkey – a kind of animal, a bird, etc.
 sandwich – a kind of food item, contains bread, etc.
 eat – a relation (holding in/of an event) between an organism and a food item; the food is chewed and ingested, etc.

Lexical Concepts - Decomposition
Lexical Concepts – Relations (i)
Lexical Concepts – Relations (ii)

Synonymy

Synonymy holds between two words (word tokens) which express the same or similar concepts.
 Unsupervised detection of synonymy can be based on “the Distributional Hypothesis: words with similar distributions have similar meanings” – i.e. the theory, in linguistics, that words that occur in the same contexts tend to have similar meanings. The underlying idea that “a word is characterized by the company it keeps” was popularized by Firth.
“Random Indexing” is a method here. (“a high-dimensional model can be projected into a space of lower dimensionality without compromising distance metrics if the resulting dimensions are chosen appropriately”)
 Synonymy knowledge is useful in e.g. translation, text classification, and information extraction. Also “query expansion” in retrieval.

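A small sketch of the distributional idea behind Random Indexing (an assumed illustration with a toy corpus, not an actual implementation): every vocabulary item gets a sparse random index vector, each target word accumulates the index vectors of its neighbours, and similarity of distributions is measured by cosine.

```python
# Toy Random Indexing: sparse random index vectors, summed over a context window.
import math, random
random.seed(0)

DIM, NONZERO = 300, 6

def index_vector():
    v = [0.0] * DIM
    for i in random.sample(range(DIM), NONZERO):
        v[i] = random.choice([-1.0, 1.0])
    return v

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

corpus = "the cat sat on the mat the dog sat on the mat the cat chased the dog".split()
index = {}     # random index vector per vocabulary item
context = {}   # accumulated context vector per target word

for i, w in enumerate(corpus):
    ctx = context.setdefault(w, [0.0] * DIM)
    for j in range(max(0, i - 2), min(len(corpus), i + 3)):
        if j == i:
            continue
        iv = index.setdefault(corpus[j], index_vector())
        for k in range(DIM):
            ctx[k] += iv[k]

# "cat" and "dog" occur in similar contexts, so their vectors should be relatively close.
print(round(cosine(context["cat"], context["dog"]), 2))
```
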
Lexical Ambiguity
Lexical Ambiguity - WSD
Word Ambiguity: Homography vs Polysemy (i)
Word Ambiguity: Homography vs Polysemy (ii)

Word Senses
 Discerning word senses (for a lemma) – lexicographical task, matter of sophisticated linguistic judgements.
 Theoretical principles. Practical purpose.
 Different dictionaries make different analyses.
 English: WordNet – a standard resource.

Senses of day in WordNet, for instance (i)
Senses of day in WordNet, for instance (ii)

Word Sense Disambiguation (WSD)
 A distributional hypothesis for WSD: words representing the same sense have more similar distributions than words representing different senses. I.e. distribution similarity implies sense similarity.
 We can use this for supervised learning of WSD.
 This requires data in the form of a sense-tagged corpus (based on a given sense inventory, e.g. the one given by WordNet).

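A bare-bones sketch of supervised WSD in this spirit (an assumed example with an invented, tiny sense-tagged corpus): learn which context words indicate which sense of bank, then tag a new occurrence by context-word overlap.

```python
# Toy supervised WSD: per-sense context-word profiles learned from sense-tagged examples.
from collections import Counter, defaultdict

SENSE_TAGGED = [
    ("deposit money in the bank account", "bank_finance"),
    ("the bank raised its interest rate", "bank_finance"),
    ("we sat on the grassy bank of the river", "bank_river"),
    ("fish along the muddy river bank", "bank_river"),
]

profiles = defaultdict(Counter)          # sense -> context-word counts
for sentence, sense in SENSE_TAGGED:
    profiles[sense].update(w for w in sentence.split() if w != "bank")

def disambiguate(sentence: str) -> str:
    context = [w for w in sentence.split() if w != "bank"]
    return max(profiles, key=lambda s: sum(profiles[s][w] for w in context))

print(disambiguate("she opened a bank account"))   # bank_finance
print(disambiguate("the river bank was muddy"))    # bank_river
```
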
Manual Sense-Tagging
 More difficult than typical grammatical tagging.
 As we saw in the day example, senses and their distinctions can be quite subtle. Definitions and examples are often far from obvious.
 Expensive: requires competent people and standardised procedures.
 Quality measure: inter-annotator agreement. Ex: “Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than simple percent agreement calculation since κ takes into account the agreement occurring by chance.”

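Cohen's kappa corrects the observed agreement p_o for the agreement p_e expected by chance: κ = (p_o − p_e) / (1 − p_e). A minimal computation over invented annotations:

```python
# Cohen's kappa for two annotators over the same items (labels are invented).
from collections import Counter

def cohens_kappa(ann1, ann2):
    n = len(ann1)
    p_o = sum(a == b for a, b in zip(ann1, ann2)) / n              # observed agreement
    c1, c2 = Counter(ann1), Counter(ann2)
    p_e = sum((c1[l] / n) * (c2[l] / n) for l in set(c1) | set(c2))  # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = ["s1", "s1", "s2", "s2", "s1", "s2"]   # hypothetical sense tags, annotator A
b = ["s1", "s1", "s2", "s1", "s1", "s2"]   # hypothetical sense tags, annotator B
print(round(cohens_kappa(a, b), 2))        # 0.67
```
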
Summary

Conclusions (i)
 Logic-based semantics is a theoretical foundation for NLP semantics, but implemented systems are typically more coarse-grained and of a more limited scope.
 Meaning depends both on literal content and contextual information. This is a challenge for most NLP tasks.
 Most NLP applications have to be highly sensitive to semantics.

Conclusions (ii)
 Finding and interpreting names and other referential expressions is a central issue for NLP semantics.
 Disambiguation of polysemous lexical tokens is also a central issue for NLP semantics.
 Accessing the content of lexical tokens is also useful.
 Meaning representation involves predicate-argument structure, which captures a basic aspect of NL compositionality.

Start thinking about a topic of interest for your essay writing! Tell me your thoughts next time…

Suggested Readings
 Term Logic (Wikipedia)
 Predicate Logic (Wikipedia)
 Jurafsky and Martin (2009):
 Ch. 17 ”Representation of Meaning”
 Ch. 18 ”Computational Semantics”
 Ch. 19 ”Lexical Semantics”
 Ch. 20 ”Computational Lexical Semantics”
 Clark et al. (2010):
 Ch. 15 ”Computational Semantics”
 Indurkhya and Damerau (2010):
 Ch. 5 ”Semantic Analysis”

This is the end… Thanks for your attention!

Lecture 1: Semantic Analysis in Language Technology

  • 1. Semantic Analysis in Language Technology Lecture 1: Introduction Course Website: http://stp.lingfil.uu.se/~santinim/sais/sais_fall2013.htm MARINA SANTINI PROGRAM: COMPUTATIONAL LINGUISTICS AND LANGUAGE TECHNOLOGY DEPT OF LINGUISTICS AND PHILOLOGY UPPSALA UNIVERSITY, SWEDEN 12 NOV 2013
  • 2. Acknowledgements 2  Thanks to Mats Dahllöf for the many slides I borrowed from his previous course and for structuring such an interesting and comprehensive content. Lecture 1: Introduction
  • 3. Practical Information 3 INTENDED LEARNING OUTCOMES ASSIGNMENTS AND EXAMINATION READING LIST DEMOS Lecture 1: Introduction
  • 4. Course Website & Contact Details 4  Course website:  http://stp.lingfil.uu.se/~santinim/sais/sais_fall2013.htm  Contact details:  santinim@stp.lingfil.uu.se  marinasantini.ms@gmail.com  marinaromestockholm@gmail.com Lecture 1: Introduction
  • 5. Check the website regularly and make sure to refresh the page: we are building up this course together, so this page will be continously updated! 5 Lecture 1: Introduction
  • 6. About the Course 6  Introduction to Semantics in Language Techology and NLP.  Focus on methods used in Language Technology and NLP for the perform the following tasks:     Sentiment Analysis (SA) Information Extraction (IE) Word Sense Disambiguation (WSD) Predicate-Argument Extraction (PAS) Lecture 1: Introduction
  • 7. Intended Learning Outcomes 7  In order to pass the course, a student must be able to:  describe systems that perform the following tasks, apply them to authentic linguistic data, and evaluate the results: 1. detect and extract attitudes and opinions from text, i.e. Sentiment Analysis (SA); 2. use semantic analysis in the context of Information Extraction (IE) 3. disambiguate instances of polysemous lemmas, i.e. Word Sense Disambiguation (WSD); 4. use robust methods to extract the Predicate-Argument Structure (PAS). Lecture 1: Introduction
  • 8. Compulsory Readings 8 1. Bing Liu (2012) Sentiment Analysis and Opinion Mining, Morgan & Claypool. 2. Richard Johansson and Pierre Nugues. 2008. Dependency-based Syntactic– Semantic Analysis with PropBank and NomBank, CoNLL 2008: Proceedings of the 12th Conference on Computational Natural Language Learning. 3. Daniel Jurafsky and James H. Martin (2009), Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Second Edition, Pearson Education. 4. Daniel Gildea and Daniel Jurafsky. 2002. Automatic Labeling of Semantic Roles, Computational Linguistics 28:3, 245-288. 5. M Palmer, D Gildea, P Kingsbury. 2005. The proposition bank: An annotated corpus of semantic roles, Computational Linguistics 31 (1), 71-106. 6. Additional suggested readings will be listed at the end of each lecture Lecture 1: Introduction
  • 9. Demos & Tutorials 9  This list will be continuosly updated, also with your contribution… Lecture 1: Introduction
  • 10. Assignments and Examination 10  Four Assignments: 1. Essay writing: independent study of a system, an approach, or a field within semantics-oriented language technology. The study will be presented both as a written essay and an oral presentation. The essay work will also include a feedback step where the work of another group is reviewed. 2. Assignment on Predicate-Argument Structure (PAS) 3. Assignment on Sentiment Analysis (SA) 4. Assignment on Word Sense Disambiguation (WSD)  General Info:  No lab sessions, supervision by email  Essay and assignments must be submitted to santinim@stp.lingfil.uu.se  Examination:  Written report submitted for each assignment  All four assignments necessary to pass the course  Grade G will be given to students who pass each assignment. Grade VG to those who pass the essay assignment and at least one of the other ones with distinction. Lecture 1: Introduction
  • 11. IMPORTANT! 11  Start thinking about a topic you are interested in for your essay writing assignment! Lecture 1: Introduction
  • 12. Practical Organization 12  45 min + 15 min break  Lectures available on the course webpage and on SlideShare  Email all your questions to me: santinim@stp.lingfil.uu.se  IMPORTANT:  Send me an email to santinim@stp.lingfil.uu.se, so that I can make sure I have all the correct email addresses. If you do not get an acknowledgement of receipt, please give me a shout! Lecture 1: Introduction
  • 13. Interaction and Cooperation 13  Communicate with me and with your classmates to exchange ideas, if you have problems understanding notions, concepts, or practical implementations.  Recommendation: share your knowledge with your peers and let off steam.  Cheating is not permitted  Lecture 1: Introduction
  • 14. Semantics in Language Technology Overview 14 SEMANTICS IN LANGUAGE TECHNOLOGY APPLICATIONS LEXICAL SEMANTICS REPRESENTATION OF MEANING SUMMARY Lecture 1: Introduction
  • 15. 15 Semantics in Language Technology Lecture 1: Introduction
  • 16. Logic and Semantics 16  Aristotelian logic – important ever since.  Syllogisms, e.g.:  Premise: No reptiles have fur.  Premise: All snakes are reptiles.  Conclusion: No snakes have fur.  Modern logic develops, late 19th Century – more general and systematic.  Formal semantics in linguistics and philosophy based on logic (20th Century). Lecture 1: Introduction
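The same syllogism can be rendered in modern first-order notation; a minimal illustrative formalization (not reproduced from the slides):

```latex
\forall x\,(\mathit{Reptile}(x) \rightarrow \neg\mathit{HasFur}(x))              % Premise: No reptiles have fur.
\forall x\,(\mathit{Snake}(x) \rightarrow \mathit{Reptile}(x))                   % Premise: All snakes are reptiles.
\therefore\; \forall x\,(\mathit{Snake}(x) \rightarrow \neg\mathit{HasFur}(x))   % Conclusion: No snakes have fur.
```

The conclusion follows by chaining the two implications under the universal quantifier, exactly the kind of inference that modern logic makes general and systematic.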
  • 17. Formal and Computational Semantics 17  Computational semantics “is the study of how to automate the process of constructing and reasoning with meaning representations of natural language expressions.” (Wikipedia).  Early systems rule-based, most famous example: “Montague grammar” (1970). Sophisticated mechanisms for translation of English into a very rich logic.  Language technology: Recent interest in data-driven and machine learning-based methods. Lecture 1: Introduction
  • 18. Semantics in NLP 18  NLP semantics is typically more limited in scope than NL semantics as analysed in linguistics and philosophy.  NLP applications often handle semantic aspects without having explicitly semantic components, e.g. in machine translation.  Other aspects of language – morphology, syntax, etc. – can be seen as support systems for semantics: The purpose of language lies in the use of expressions as carriers of semantic meaning. And that is what many NLP systems have to respect, e.g. MT, retrieval, classification, etc. Lecture 1: Introduction
  • 19. Semantics and Truth (i) 19 Semantics, meanings and states of affairs:  What a sentence means: a structure involving (lexical) concepts and relations among them. It can be articulated as a semantic representation, e.g. I ate a turkey sandwich rendered in predicate logic.  A sentence – and its semantic representation – is also a representation of a possible state of affairs. Lecture 1: Introduction
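The predicate-logic formula on the original slide is not reproduced in this transcript; a common event-based (neo-Davidsonian) rendering of I ate a turkey sandwich, given here purely as an illustration, is:

```latex
\exists e\,\exists x\,\big(\mathit{Eating}(e) \wedge \mathit{Eater}(e,\mathit{Speaker}) \wedge \mathit{Eaten}(e,x) \wedge \mathit{TurkeySandwich}(x) \wedge \mathit{Past}(e)\big)
```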
  • 20. Semantics and Truth (ii) 20  Correspondence theory of truth: if the content of a sentence corresponds to an actual state of affairs, it is true; otherwise, it is false.  Ignoring philosophical complications, in many cases we can extract knowledge from texts. E.g. Warmer climate entails increased release of carbon dioxide by inland lakes. (From a uu.se press release.)  Related issue: Which texts should we trust?  Many sentences are difficult to formalize in logic. (Modality, conditionality, vague quantification, tense, etc.) Lecture 1: Introduction
  • 22. Formalizing Meaning 22  Linguistic content has – at least to a certain degree – a logical structure that can be formalized by means of logical calculi – meaning representations.  The representation languages should be simple and unambiguous – in contrast to complex and ambiguous NL.  Logical calculi come with accounts of logical inference. They are useful for reasoning-based applications.  Meaning formalization faces far-reaching conceptual and  computational difficulties. Lecture 1: Introduction
  • 23. Compositionality 23  Linguistic content is compositional: Simple expressions have a given (lexical) meaning; the meaning of complex expressions is determined by the meanings of their constituents. People produce and understand new phrases and sentences all the time. (NLP must also deal with these.)  Compositionality is studied in detail in compositional syntax-driven semantics. Work in this field is typically about hand-coded rule systems for small fragments of NL. Lecture 1: Introduction
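As a toy illustration of composition by function application, here is a sketch using NLTK's logic module; the lambda terms are invented lexical entries, not material from the course:

```python
from nltk.sem.logic import Expression

read = Expression.fromstring

# Invented lexical meanings: "every dog" as a generalized quantifier, "barks" as a property.
# Applying the first to the second and beta-reducing yields the sentence meaning.
sentence = read(r'(\P.all x.(dog(x) -> P(x)))(\x.bark(x))')
print(sentence.simplify())   # all x.(dog(x) -> bark(x))
```

Hand-coded compositional grammars scale this idea up by pairing each syntactic rule with a semantic composition rule.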
  • 25. Compositional Aspects – Argument Structure 25 Lecture 1: Introduction
  • 27. Compositional semantics in Language Technology 27 Lecture 1: Introduction
  • 28. First-Order Predicate Logic (i) 28  "flexible, well-understood, and computationally tractable approach to the representation of knowledge [and] meaning" (J&M 2009: 589)  Expressive  Verifiability against a knowledge base (related to database languages)  Inference  Model-theoretic semantics Lecture 1: Introduction
  • 29. First-Order Predicate Logic (ii) 29  Boolean operators: negation and connectives  Existential/universal quantification  Individual constants  Predicates (taking a number of arguments) Lecture 1: Introduction
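The inventory above can be illustrated with a few made-up formulas (kaa is an invented individual constant):

```latex
\mathit{Reptile}(\mathit{kaa})                                        % predicate applied to an individual constant
\neg \mathit{HasFur}(\mathit{kaa})                                    % negation
\forall x\,(\mathit{Snake}(x) \rightarrow \mathit{Reptile}(x))        % universal quantification + connective
\exists y\,(\mathit{Mouse}(y) \wedge \mathit{Eats}(\mathit{kaa},y))   % existential quantification
```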
  • 30. When to assume compositionality? 30 Lecture 1: Introduction
  • 31. Multi-Word Expressions 31 MWEs (a.k.a. multiword units or MUs) are lexical units encompassing a wide range of linguistic phenomena, such as idioms (e.g. kick the bucket = to die), collocations (e.g. cream tea = a small meal eaten in Britain, with small cakes and tea), regular compounds (cosmetic surgery), graphically unstable compounds (e.g. self-contained <> self contained <> selfcontained – all graphical variants have a huge number of hits on Google), light verbs (e.g. do a revision vs. revise), lexical bundles (e.g. in my opinion), etc. While easily mastered by native speakers, MWEs' correct interpretation remains challenging both for non-native speakers and for language technology (LT), due to their complex and often unpredictable nature. Lecture 1: Introduction
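One common data-driven heuristic for spotting candidate MWEs is to rank word pairs by an association measure such as pointwise mutual information. A minimal sketch with NLTK's collocation tools (the toy corpus is invented):

```python
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

# Tiny invented token list; a real study would use a large corpus
tokens = ("we had cream tea in the afternoon and more cream tea the next day "
          "plain tea was served too").split()

finder = BigramCollocationFinder.from_words(tokens)
finder.apply_freq_filter(2)          # ignore bigrams seen only once
measures = BigramAssocMeasures()

# Strongly associated, recurring pairs such as ('cream', 'tea') are MWE candidates
print(finder.nbest(measures.pmi, 3))
```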
  • 32. Cross-linguality Use Case: Information Access 32 In multi-ethnic societies, like Swedish society, it is common that many non-native speakers use public websites – e.g. the Arbetsförmedlingen or Pensionsmyndigheten websites – to access information that is vital to their living and integration in the host country. National regulations are often accompanied by special terminology and new coinages. For instance, the Swedish expression /egenremiss/ (14,900 hits, Google.se April 2013) – or alternatively, as an MWE, /egen remiss/ (8,210 hits, Google.se April 2013) – denotes a referral to a specialist doctor written by the patients themselves. This expression is made up of two common Swedish words, /egen/ `own (adj)' and /remiss/ `referral'. It is a recent expression (probably coined around 2010) and not yet recorded in any official dictionary, nor in Wiktionary or other multilingual online lexical resources. However, it is very frequent in query logs belonging to a Swedish public health service website. When trying to implement cross-lingual search based on the automatic translation of query logs, it turned out that none of the existing multilingual lexical resources contained this expression. Lecture 1: Introduction
  • 33. Use Case: Personal Use & Text Understanding 33  The use of expressions that are marked for style, genre, domain, or register (and/or other textual categories), or the use of expressions which are misspelled or idiomatic for some textual category, is beyond the competence of a novice reader or a non-native speaker. Additionally, in a web search or in social networks, one cannot tell whether the texts one reads are good or bad the way a first-language reader can. When readers/users read a language they do not know at all, they can use automatic translation, online dictionaries, or other lexical resources. However, what they cannot determine well is the *type* of text they are reading. They cannot tell if the text is verbose, terse, formal, informal, stupid, funny, bad, or good.  For instance, the German phrase "es ist zum Kotzen" signals vernacular and unrefined text as well as a controversial expression. The phrase "isch alle", instead, signals that this line in the text is spoken by a Berliner. Lecture 1: Introduction
  • 34. Semantics vs Pragmatics/Discourse (i) 34  What does a word, a phrase, a text segment mean as an NL expression? (“Linguistic meaning” – semantics.) Conventional, static, systemic aspect of meaning.  What does the author intend to convey by means of a word, a phrase, a text segment? (“Speaker meaning” – pragmatics/discourse.) Contextual, dynamic aspect of meaning.  The two aspects depend on each other, of course. Lecture 1: Introduction
  • 35. Semantics vs Pragmatics/Discourse (ii) 35 Lecture 1: Introduction
  • 36. Semantics vs Pragmatics/Discourse (iii) 36 Lecture 1: Introduction
  • 38. Semantics-oriented NLP applications 38  Machine translation: The translation of a text segment should mean the same as the original (to emphasize linguistic meaning) or should convey the same content (to emphasize speaker meaning).  Information extraction extracts components of the information conveyed by a text.  Question answering is extraction – combined with inference – of an answer to a given question.  Text classification, in typical cases, relates to the meanings of the texts being classified. Lecture 1: Introduction
  • 39. Semantics and Generation 39  Generation: semantic representation → NL. Less challenging than analysis – the structure of the input is under control. Needed in e.g. dialogue systems.  Interlingua – semantic representation in machine translation: Analysis: source language → interlingua. Generation: interlingua → target language. Would be economical if many languages are involved. The idea has not proved very successful so far. Lecture 1: Introduction
  • 40. Reference 40  Reference is very important – what statements are about.  Referring expressions are very common.  Reference is a discourse phenomenon.  Resolving reference is a crucial step in e.g.:  extraction, e.g. in sentiment analysis;  translation, e.g. to get agreement right: English it vs French il/elle vs Swedish den/det. Lecture 1: Introduction
  • 42. Kinds of Referring Expressions 42  Indefinite noun phrases, e.g. a book: introduce new entities.  Pronouns, e.g. he: typically coreferent with a previous referring expression (the antecedent).  Names, e.g. Bill Gates.  Demonstratives, e.g. this room.  Other definite noun phrases, e.g. the first chapter: reference to an entity that is somehow known, often previously mentioned. Lecture 1: Introduction
  • 43. Named Entity Recognition (NER) 43  To identify expressions being used as names. (What characterizes a “name”?)  Also to identify what kind of name it is: E.g. of a person, or a place, or a stretch of time, or a chemical compound, or a gene, etc.  “State-of-the-art NER systems for English produce near-human performance. For example, the best system entering MUC-7 scored 93.39% of F-measure while human annotators scored 97.60% and 96.95%” (Wikipedia). Lecture 1: Introduction
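A minimal NER sketch using NLTK's off-the-shelf chunker (the example sentence is invented, and the exact data-package names depend on the installed NLTK version):

```python
import nltk

# Requires NLTK data packages such as 'punkt', a POS-tagger model,
# 'maxent_ne_chunker' and 'words'
sentence = "Barack Obama visited Uppsala University in Sweden in November 2013."
tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
tree = nltk.ne_chunk(tagged)

# Named-entity chunks appear as labelled subtrees, e.g. PERSON, ORGANIZATION, GPE
for subtree in tree.subtrees():
    if subtree.label() != "S":
        print(subtree.label(), " ".join(word for word, tag in subtree.leaves()))
```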
  • 44. Anaphora and Deixis Resolution 44  Pronouns (they), pronominal adverbs (there, then), and definite NPs refer to entities by means of contextually given information.  E.g. by referring to previously mentioned referents – anaphora.  E.g. by reference based on the participants, time, and place of the discourse – deixis (e.g. I, you, here, yesterday).  Anaphora and deixis resolution is a much more challenging task than NER. The reference of name-like graph words is much more predictable. Compare Barack Obama and he. Lecture 1: Introduction
  • 45. Sentiment Analysis – an extraction task 45  What views do people express in blogs and reviews? That's interesting for politicians and marketing people.  Opinions are often expressed in a personal and informal way. E.g. Peter bought me a Baileys marzipan chocolate thing which I washed down with Gluehwein and that, in combination with the bright lights and cheery faces really made me feel warm inside! (From a blog post.)  Sentiment analysis: to extract the referent of a "sentiment" and the positive–negative polarity associated with it. E.g. Baileys marzipan chocolate – positive. Lecture 1: Introduction
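A deliberately tiny lexicon-based polarity sketch; the word lists are invented, and real systems use large sentiment lexicons, negation handling, and machine-learned models:

```python
# Invented toy lexicons, purely illustrative
POSITIVE = {"warm", "cheery", "bright", "great", "love"}
NEGATIVE = {"cold", "awful", "boring", "hate", "disappointing"}

def polarity(text: str) -> str:
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("the bright lights and cheery faces really made me feel warm inside"))
# -> positive
```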
  • 47. Lexical Concepts 47  Words are often grammatically simple, but carry a structured conceptual content. Definitions "unpack" the content of concepts:  friend – a person whom one knows well, is loyal to, etc.  turkey – a kind of animal, a bird, etc.  sandwich – a kind of food item, contains bread, etc.  eat – a relation (holding in/of an event) between an organism and a food item; the food is chewed and ingested, etc. Lecture 1: Introduction
  • 48. Lexical Concepts - Decomposition 48 Lecture 1: Introduction
  • 49. Lexical Concepts – Relations (i) 49 Lecture 1: Introduction
  • 50. Lexical Concepts – Relations (ii) 50 Lecture 1: Introduction
  • 51. Synonymy 51 Synonymy holds between two words (word tokens) which express the same or similar concepts.  Unsupervised detection of synonymy can be based on the Distributional Hypothesis: "words with similar distributions have similar meanings", i.e. the theory that words that occur in the same contexts tend to have similar meanings. The underlying idea that "a word is characterized by the company it keeps" was popularized by Firth. "Random Indexing" is one method here ("a high-dimensional model can be projected into a space of lower dimensionality without compromising distance metrics if the resulting dimensions are chosen appropriately").  Synonymy knowledge is useful in e.g. translation, text classification, and information extraction. Also for "query expansion" in retrieval. Lecture 1: Introduction
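A bare-bones illustration of the distributional idea: count context words and compare cosine similarities. The toy corpus is invented; Random Indexing would additionally project such counts into a lower-dimensional random space:

```python
import numpy as np

# Invented toy corpus; a real distributional model is built from millions of tokens
corpus = ("the cat chased the mouse . the dog chased the cat . "
          "the dog bit the man . the cat bit the mouse .").split()
vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}

def context_vector(target, window=2):
    """Count the words occurring within +/- window positions of each target token."""
    vec = np.zeros(len(vocab))
    for i, w in enumerate(corpus):
        if w == target:
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    vec[index[corpus[j]]] += 1
    return vec

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words with similar contexts ('cat', 'dog') score higher than dissimilar ones ('cat', 'man')
print(cosine(context_vector("cat"), context_vector("dog")))   # ~0.99
print(cosine(context_vector("cat"), context_vector("man")))   # ~0.77
```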
  • 53. Lexical Ambiguity - WSD 53 Lecture 1: Introduction
  • 54. Word Ambiguity: Homography vs Polysemy (i) 54 Lecture 1: Introduction
  • 55. Word Ambiguity: Homography vs Polysemy (ii) 55 Lecture 1: Introduction
  • 56. Word Senses 56  Discerning word senses (for a lemma) – lexicographical task, matter of sophisticated linguistic judgements.  Theoretical principles. Practical purpose.  Different dictionaries make different analyses.  English: WordNet – a standard resource. Lecture 1: Introduction
  • 57. Senses of day in WordNet, for instance (i) 57 Lecture 1: Introduction
  • 58. Senses of day in WordNet, for instance (ii) 58 Lecture 1: Introduction
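The WordNet senses shown on these slides can be listed directly with NLTK (requires the 'wordnet' data package; the exact senses returned depend on the WordNet version):

```python
from nltk.corpus import wordnet as wn

# List the noun senses WordNet records for the lemma 'day'
for synset in wn.synsets("day", pos=wn.NOUN):
    print(synset.name(), "-", synset.definition())
```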
  • 59. Word Sense Disambiguation (WSD) 59  A distributional hypothesis for WSD: words representing the same sense have more similar distributions than words representing different senses. I.e. distribution similarity implies sense similarity.  We can use this for supervised learning of WSD.  This requires data in the form of a sense-tagged corpus (based on a given sense inventory, e.g. the one given by WordNet). Lecture 1: Introduction
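A minimal supervised-WSD sketch along these lines, using the Senseval-2 sample distributed with NLTK as a sense-tagged corpus (requires the 'senseval' data package; the feature design and the train/test split are illustrative assumptions):

```python
import nltk
from nltk.corpus import senseval

def features(instance):
    """Bag-of-context-words features for one tagged occurrence of the lemma."""
    feats = {}
    for item in instance.context:
        if isinstance(item, tuple):          # context items are (word, POS) pairs
            word = item[0].lower()
            if word.isalpha():
                feats[f"ctx({word})"] = True
    return feats

# Each instance is one occurrence of 'hard' annotated with its sense label
data = [(features(inst), inst.senses[0]) for inst in senseval.instances("hard.pos")]
test, train = data[:200], data[200:]

classifier = nltk.NaiveBayesClassifier.train(train)
print("accuracy:", nltk.classify.accuracy(classifier, test))
```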
  • 60. Manual Sense-Tagging 60  More difficult than typical grammatical tagging.  As we saw in the day example, senses and their distinctions can be quite subtle. Definitions and examples are often far from obvious.  Expensive: requires competent people and standardised procedures.  Quality measure: inter-annotator agreement. Ex: "Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than simple percent agreement calculation since κ takes into account the agreement occurring by chance." Lecture 1: Introduction
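A worked toy example of Cohen's kappa, κ = (p_o − p_e) / (1 − p_e), with invented annotations from two annotators:

```python
from collections import Counter

# Invented sense tags assigned by two annotators to the same ten tokens
ann_a = ["s1", "s1", "s2", "s2", "s1", "s3", "s1", "s2", "s1", "s3"]
ann_b = ["s1", "s2", "s2", "s2", "s1", "s3", "s1", "s1", "s1", "s3"]

n = len(ann_a)
p_o = sum(a == b for a, b in zip(ann_a, ann_b)) / n      # observed agreement (0.8)
freq_a, freq_b = Counter(ann_a), Counter(ann_b)
p_e = sum((freq_a[label] / n) * (freq_b[label] / n)      # agreement expected by chance (0.38)
          for label in set(ann_a) | set(ann_b))
kappa = (p_o - p_e) / (1 - p_e)
print(round(p_o, 2), round(kappa, 2))                    # 0.8 0.68
```

Here simple percent agreement (0.8) overstates reliability; kappa corrects it down to about 0.68 by discounting agreement expected by chance.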
  • 62. Conclusions (i) 62  Logic-based semantics is a theoretical foundation for NLP semantics, but implemented systems are typically more coarse-grained and of a more limited scope.  Meaning depends both on literal content and contextual information. This is a challenge for most NLP tasks.  Most NLP applications have to be highly sensitive to semantics. Lecture 1: Introduction
  • 63. Conclusions (ii) 63  Finding and interpreting names and other referential expressions is a central issue for NLP semantics.  Disambiguation of polysemous lexical tokens is also a central issue for NLP semantics.  Accessing the content of lexical tokens is also useful.  Meaning representation involves predicate-argument structure, which captures a basic aspect of NL compositionality. Lecture 1: Introduction
  • 64. 64 Start thinking about a Topic of interest for your essay writing! Tell me your thoughts next time… Lecture 1: Introduction
  • 65. Suggested Readings 65  Term Logic (Wikipedia)  Predicate Logic (Wikipedia)  Jurafsky and Martin (2009):  Ch. 17 ”Representation of Meaning”  Ch. 18 ”Computational Semantics”  Ch. 19 ”Lexical Semantics”  Ch. 20 ”Computational Lexical Semantics”  Clark et al. (2010):  Ch 15 ”Computational Semantics”  Indurkhya and Damerau (2010):  Ch 5 ”Semantic Analysis” Lecture 1: Introduction
  • 66. 66 This is the end… Thanks for your attention! Lecture 1: Introduction
