Through a recent survey, we observe that a significant
percentage of people still share photos through email.
When a user composes an email with an attachment,
he or she most likely discusses the attachment in the
body of the email. In this paper, we develop a supervised
machine learning framework to extract keywords
relevant to the attachments from the body of
the email. The extracted keywords can then be stored
as an extended attribute to the file on the local filesystem.
Both desktop indexing software and web-based
image search portals can leverage the context-enriched
keywords to enable a richer search experience. Our
results on the public Enron email dataset show that
the proposed extraction framework provides both high
precision and recall. As a proof of concept, we have
also implemented a version of the proposed algorithms
as a Mozilla Thunderbird Add-on.
2. Survey question: at least one email with an attachment in the last 2 weeks?
Image Source: http://www.crcna.org/site_uploads/uploads/osjha/mdgs/mdghands.jpg
Yahoo! Confidential
3. Introduction: The "What?"
• Email attachments are a very popular mechanism to exchange content.
  - Both in our personal and office worlds.
• The one-liner:
  - The email usually contains a crisp description of the attachment.
  - "Can we auto-generate tags for the attachment from the email?"
4. Introduction: The "Why?"
• Tags can be stored as extended attributes of files.
• Applications like desktop search can use these tags for building indexes.
• Tags are generated even without parsing the attachment content:
  - Useful for images.
  - In some cases, the tags have more context than the attachment itself.
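The extended-attribute idea above can be sketched on Linux as follows. This is a minimal illustration: the attribute name "user.tags" and the comma-separated encoding are our own choices, not from the paper, and `os.setxattr` is Linux-only.

```python
import os

def store_tags(path, tags):
    """Attach mined keywords to a file as a user extended attribute.

    Assumes a Linux filesystem with xattr support. The attribute name
    "user.tags" and comma-separated encoding are illustrative choices.
    """
    os.setxattr(path, b"user.tags", ",".join(tags).encode("utf-8"))

def read_tags(path):
    """Read back the stored tags; empty list if none are set."""
    try:
        return os.getxattr(path, b"user.tags").decode("utf-8").split(",")
    except OSError:
        return []
```

A desktop indexer could then read `user.tags` when building its index, without ever parsing the attachment content itself.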
5. Outline
• Problem Statement
• Challenges
• Overview of Solution
• The Dataset: Quirks and Observations
• Feature Design
• Experiments and Overview of Results
• Conclusion
6. Problem Statement
"Given an email E that has a set of attachments A, find the set of words KEA that appear in E and are relevant to the attachments A."
Subproblems:
• Sentence Selection
• Keyword Selection
7. Overview of Solution
• Solve a binary classification problem:
  - Is a given sentence relevant to the attachments or not?
• Two-part solution:
  - What features to use?
    • The most insightful and interesting aspect of this work.
    • The focus of this talk.
  - What classification algorithm to use?
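Under this formulation, each sentence becomes a feature vector fed to the classifier. A minimal sketch of what that vector might look like, using the feature names from the talk (the field names and exact encoding are our own; the paper's may differ):

```python
def sentence_features(total_sentences, conversation_level,
                      has_strong_phrase, has_extension_word,
                      has_filename_token, is_noisy, is_anaphoric):
    """Per-sentence feature vector following the deck's taxonomy:
    email-level, conversation-level, and sentence-level features."""
    return {
        "short_email": int(total_sentences < 4),         # email level
        "conversation_level": conversation_level,        # thread depth
        "strong_phrase_anchor": int(has_strong_phrase),  # sentence level
        "extension_anchor": int(has_extension_word),
        "attachment_name_anchor": int(has_filename_token),
        "noisy": int(is_noisy),
        "anaphora": int(is_anaphoric),
    }
```

Vectors like this can be handed to any of the classifiers compared later in the talk (Naïve Bayes, SVM, CRF).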
8. Challenges: Why Is the Classification Hard?
• Only a small part of the email is relevant to the attachment. Which part?
• While writing emails, users tend to rely on context that is easily understood by humans:
  - usage of pronouns
  - nicknames
  - cross-referencing information from a different conversation in the same email thread.
9. The Enron Email Dataset
• Made public by the Federal Energy Regulatory Commission during its investigation [1].
• The curated version [2] consists of 157,510 emails belonging to 150 users.
• A total of 30,968 emails have at least one attachment.
[1] http://www.cs.cmu.edu/~enron/
[2] Thanks to Mark Dredze for providing the curated dataset.
10. Observation 1: User Behavior for Attachments
• Barring a few outliers, almost all users sent/received emails with attachments.
• A rich, well-suited corpus for studying attachment behavior.
11. Observation 2: Email Length
• Roughly 80% of the emails have fewer than 8 sentences.
• In another analysis: even in emails with fewer than 3 sentences, not every sentence is related to the attachment!
13. Feature Taxonomy
• Email: Length (short email: fewer than 4 sentences?)
• Conversation: Level (the older a message is in the email thread, the higher its conversation level)
• Sentence:
  - Anchor: Strong Phrase, Extension, Attachment Name, Weak Phrase
  - Noisy: Noun, Verb
  - Anaphora
• Lexicon (a high-dimension, sparse, bag-of-words feature)
14. Feature Design: Sentence Level
[Diagram: the sentences S1-S6 of an email grouped into three classes]
• Anchor sentences: most likely positive matches.
• Noisy sentences: most likely negative matches.
• Anaphora sentences: have linguistic relationships to anchor sentences.
15. Feature Design: Sentence Level → Anchor
• Strong Phrase Anchor: feature value set to 1 if the sentence has any of the words:
  - attach
  - here is
  - enclosed
• Of the 30,968 emails that have an attachment, 52% had a strong anchor phrase.
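The Strong Phrase Anchor feature can be sketched as a case-insensitive substring check. Only the three phrases from the slide are included; the paper's full phrase list may be longer.

```python
# Illustrative list; matching "attach" as a substring also
# covers "attached" and "attachment".
STRONG_ANCHOR_PHRASES = ("attach", "here is", "enclosed")

def strong_phrase_anchor(sentence):
    """Return 1 if the sentence contains any strong anchor phrase."""
    s = sentence.lower()
    return int(any(phrase in s for phrase in STRONG_ANCHOR_PHRASES))
```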
16. Feature Design: Sentence Level → Anchor
• Behavioral observation: users tend to refer to an attachment by its file type.
• Extension Anchor: feature value set to 1 if the sentence has any of the extension keywords:
  - xls → spreadsheet, report, excel file
  - jpg → image, photo, picture
• Example:
  "Please refer to the attached spreadsheet for a list of Associates and Analysts who will be ranked in these meetings and their PRC Reps."
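The Extension Anchor can be sketched as a lookup from attachment extensions to descriptive keywords. The map below contains only the two mappings shown on the slide; the paper's lexicon is presumably larger.

```python
# Illustrative extension -> keyword map (only the slide's examples).
EXTENSION_KEYWORDS = {
    "xls": ["spreadsheet", "report", "excel file"],
    "jpg": ["image", "photo", "picture"],
}

def extension_anchor(sentence, attachment_extensions):
    """Return 1 if the sentence mentions any keyword mapped from
    the extensions of the email's attachments."""
    s = sentence.lower()
    return int(any(
        keyword in s
        for ext in attachment_extensions
        for keyword in EXTENSION_KEYWORDS.get(ext.lower(), [])
    ))
```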
17. Feature Design: Sentence Level → Anchor
• Behavioral observation: users tend to use file name tokens of the attachment to refer to the attachment.
• Attachment Name Anchor: feature value set to 1 if the sentence has any of the file name tokens.
  - Tokenization is done on case and type transitions.
• Example:
  - Attachment name: "Book Request Form East.xls"
  - "These are book requests for the Netco books for all regions."
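The tokenization on case and type transitions might look like this regex-based sketch (the paper does not give its exact tokenizer, so the rules here are an assumption):

```python
import re

def filename_tokens(name):
    """Tokenize a file name on separators, then on case transitions
    (camelCase) and letter/digit type transitions."""
    parts = re.split(r"[^A-Za-z0-9]+", name)
    tokens = []
    for part in parts:
        tokens.extend(
            re.findall(r"[A-Z]+(?![a-z])|[A-Z][a-z]*|[a-z]+|[0-9]+", part))
    return [t.lower() for t in tokens if t]

def attachment_name_anchor(sentence, attachment_name):
    """Return 1 if the sentence contains any file name token."""
    words = set(re.findall(r"[a-z0-9]+", sentence.lower()))
    return int(any(t in words for t in filename_tokens(attachment_name)))
```

On the slide's example, the token "book" from "Book Request Form East.xls" fires the anchor in "These are book requests...".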
18. Feature Design: Sentence Level
(Recap of the sentence-level diagram from slide 14: anchor, noisy, and anaphora sentences.)
19. Feature Design: Sentence Level → Noisy
• Noisy sentences are usually salutations, signature sections, and email headers of conversations.
• Two features to capture noisy sentences:
  - Noisy Noun: marked true if more than 85% of the words in the sentence are nouns.
  - Noisy Verb: marked true if there are no verbs in the sentence.
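The two noisy-sentence features can be sketched over POS-tagged input. The sketch assumes Penn Treebank tags (NN* for nouns, VB* for verbs) from any off-the-shelf tagger; the tagger itself is outside the sketch.

```python
def noisy_noun(tagged_sentence, threshold=0.85):
    """True if more than `threshold` of the words are nouns.

    `tagged_sentence` is a list of (word, pos) pairs with
    Penn Treebank tags (NN, NNS, NNP, NNPS for nouns).
    """
    if not tagged_sentence:
        return False
    nouns = sum(1 for _, pos in tagged_sentence if pos.startswith("NN"))
    return nouns / len(tagged_sentence) > threshold

def noisy_verb(tagged_sentence):
    """True if the sentence contains no verbs (Penn tags VB*)."""
    return not any(pos.startswith("VB") for _, pos in tagged_sentence)
```

A signature block like "John Smith / Enron / Houston" fires both features, which matches the intuition on the slide.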
20. Feature Design: Sentence Level
(Recap of the sentence-level diagram from slide 14: anchor, noisy, and anaphora sentences.)
21. Feature Design: Sentence Level → Anaphora
• Once anchors have been identified:
  - An NLP technique called anaphora detection can be employed.
  - It detects other sentences that are linguistically dependent on an anchor sentence.
  - It tracks the hidden context in the email.
• Example:
  - "Thought you might be interested in the report. It gives a nice snapshot of our activity with our major counterparties."
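Full anaphora detection requires a proper NLP component. Purely as an illustration of the idea, a crude heuristic might flag a sentence that directly follows an anchor and opens with a pronoun; this first-word check is our own simplification, not the paper's method.

```python
# Illustrative pronoun list for the heuristic.
PRONOUNS = {"it", "this", "that", "these", "those", "they"}

def anaphora_feature(sentences, anchor_flags):
    """Heuristic sketch: a sentence is anaphoric if it follows an
    anchor sentence and begins with a pronoun, so its meaning
    depends on the anchor. Returns one boolean per sentence."""
    flags = []
    prev_was_anchor = False
    for sent, is_anchor in zip(sentences, anchor_flags):
        words = sent.strip().split()
        first = words[0].lower().strip(".,") if words else ""
        flags.append(prev_was_anchor and first in PRONOUNS)
        prev_was_anchor = is_anchor
    return flags
```

On the slide's example, "It gives a nice snapshot..." is flagged because it follows the anchor "Thought you might be interested in the report."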
22. Correlation Analysis
• Best positive correlation: the strong phrase anchor and the anaphora feature.
• Short email → low correlation coefficient.
• Noisy verb feature → good negative correlation.
• The conversation level <= 2 feature has a lower negative correlation than the conversation level > 2 feature.
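The coefficients behind this analysis can be computed per feature as the Pearson correlation between the binary feature column and the relevance labels. This is the standard formula, not code from the paper.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between a feature column and the
    relevant/not-relevant labels (both 0/1 lists)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0
```

A feature that fires mostly on relevant sentences yields a coefficient near +1 (like the strong phrase anchor here); one that fires mostly on irrelevant sentences yields a negative coefficient (like the noisy verb feature).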
23. Experiments
• Ground truth data:
  - Randomly sampled 1,800 sent emails.
  - Two independent editors produced class labels for every sentence in the above sample.
  - Reconciliation: discarded emails that had at least one sentence with conflicting labels.
• ML algorithms studied: Naïve Bayes, SVM, CRF.
24. Summary of Results: F1 Measure
• With all features used, the F1 scores are:
  - CRF: 0.87
  - SVM: 0.79
  - Naïve Bayes: 0.74
• CRF consistently beats the other two methods across all feature subsets.
  - The sequential nature of the data works in the CRF's favor.
• The phrase anchor provides the best increase in precision.
• The anaphora feature provides the best increase in recall.
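For reference, the precision/recall/F1 computation over per-sentence binary labels uses the standard definitions (this is not the paper's evaluation code):

```python
def precision_recall_f1(predicted, actual):
    """Precision, recall, and F1 for binary sentence labels
    (1 = sentence relevant to the attachment)."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    fp = sum(p and not a for p, a in zip(predicted, actual))
    fn = sum(not p and a for p, a in zip(predicted, actual))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```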
25. Summary of Results: User-Specific Performance
• In the majority of cases, CRF outperforms the other methods.
• With the same set of features, the CRF can learn a more generic model applicable to a variety of users.
26. In the Paper . . .
• A specific use case for attachment-based tagging:
  - Images sent over email.
  - A user study based on a survey.
• Heuristics for going from sentences to keywords.
• More detailed evaluation studies and comparisons of Naïve Bayes, SVM, and CRF.
• TagSeeker: a prototype of the proposed algorithms implemented as a Thunderbird plugin.
27. Closing Remarks
• Measuring the improvement in retrieval effectiveness due to the mined keywords:
  - Could not be performed because the attachment content is not available.
  - Working on an experiment to do this on a different dataset.
• Thanks to:
  - the reviewers for the great feedback!
  - the organizers for the effort putting together this conference!
28. Conclusion
• Presented a technique to extract information from noisy data.
• The F1 measure of the proposed methodology:
  - Is in the high eighties. Good!
  - Generalizes well across different users.
• For more information on this work or on Information Extraction @ Yahoo!:
  - aravindr@yahoo-inc.com
After reconciliation, we were left with 1,150 emails whose sentences were marked either relevant to the attachment or not. This data corresponds to 112 users from the Enron corpus and 6,472 sentences.