Adgrams: Categories and Linguistics




                            Marco Benini

                         M.Benini@leeds.ac.uk

                    Department of Pure Mathematics
                          University of Leeds


                         26th October 2011
An example

               “Paul is going to study maths in the library”.
[Adtree diagram of the sentence: a binary tree whose leaves are the atoms “Paul”, “is”, “go”, “study”, “maths”, “the”, “library”, whose adpositions include ϵ, in, to and -ing, and whose nodes carry grammar characters such as O, A, D and I2.]

2 of 23
Categories as languages



To interpret the objects of a small category as the collection of expressions
of some language, the category must have enough “structure” to encode
the grammar constructions in its arrows.

The first part of the formal model, which we are going to describe in this talk,
is devoted to describing the required structure.

Although the motivation for our definitions can be traced back to what
happens in natural languages, we will limit ourselves to the example just
shown, as it suffices for the purposes of this talk.




 3 of 23
Grammar character


A grammar character is a collection of expressions in the language that can
be used to construct new expressions. Informally, “nouns”, “adjectives” and
“verbs” are grammar characters (but beware that they do not behave quite as
you might expect!)

Definition 1 (Grammar Character)
Given a small category C, we say that G is a set of grammar characters on
C, indexed by GC, if G = {Gi}i∈GC is a partition of Obj C such that every
class is inhabited.

In the example, the grammar characters’ names are A, I2, I2₁, I2₂, O and D.
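To make the partition condition concrete, here is a minimal Python sketch (not part of the original slides); the objects and the character assignment are a hypothetical toy, loosely echoing the names A, O and I2 from the example.

```python
# Sketch only: grammar characters as a partition {G_i}_{i in GC} of Obj C
# into inhabited classes. Objects are modelled as plain strings.

def is_set_of_grammar_characters(objects, classes):
    """classes maps each index i in GC to the class G_i (a set of objects)."""
    union = set().union(*classes.values())
    covers = union == set(objects)                                   # every object is classified
    disjoint = sum(len(g) for g in classes.values()) == len(union)   # classes do not overlap
    inhabited = all(len(g) > 0 for g in classes.values())            # no empty class
    return covers and disjoint and inhabited

# Hypothetical toy assignment, loosely echoing the character names A, O, I2.
objects = {"Paul", "maths", "library", "the", "go", "study", "is"}
classes = {
    "O": {"Paul", "maths", "library"},
    "A": {"the"},
    "I2": {"go", "study", "is"},
}
print(is_set_of_grammar_characters(objects, classes))  # True
```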




 4 of 23
Grammar product



Definition 2 (Grammar Product)
Given a small category C, a set G = {Gi}i∈GC of grammar characters on C
and W ∈ G (the adpositions), any pair 〈Y1 × · · · × Yn , (z1 , . . . , zn )〉 is a
grammar product on C, G, W when n ∈ N (the degree of the product) and,
for every i in 1 . . . n, Yi ∈ G \ {W } and zi ∈ W .

In the example, every part of a branch starting with a right-most leaf is an
instance of a grammar product.

Since most of the relevant information about a construction is in the
grammar product, it is convenient to give a separate definition for this
auxiliary concept.
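A small Python sketch of the data carried by a grammar product, again not from the slides: the pair 〈Y1 × · · · × Yn , (z1 , . . . , zn )〉 is kept as two parallel tuples and the degree is n; the character and adposition names are hypothetical strings.

```python
from dataclasses import dataclass
from typing import Tuple

# Sketch only: a grammar product <Y1 x ... x Yn, (z1, ..., zn)> on C, G, W.

@dataclass(frozen=True)
class GrammarProduct:
    characters: Tuple[str, ...]   # names of Y1, ..., Yn, each in G \ {W}
    adpositions: Tuple[str, ...]  # z1, ..., zn, each in W

    def __post_init__(self):
        # Both components have length n: the degree of the product.
        assert len(self.characters) == len(self.adpositions)

    @property
    def degree(self) -> int:
        return len(self.characters)

# The verbal product <O x O, (eps, to)> read off the example adtree.
verbal = GrammarProduct(characters=("O", "O"), adpositions=("eps", "to"))
print(verbal.degree)  # 2
```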



 5 of 23
Grammar construction I


Definition 3 (Grammar construction)
Given a small category C, a set G = {Gi}i∈GC of grammar characters on C
and W ∈ G, a grammar construction θP,x over P from x is an indexed
collection {fj}j∈P′ where
   P = 〈P′, α〉 is a grammar product over C, G, W ;
   x is an object of C, called the governor of the construction;
   there is V ∈ G \ {W } such that, for every j ∈ P′, fj ∈ HomC (x , y ) with
   y ∈ V . Every component of j is called a dependent of the construction.
We denote V as cod θP,x , x as dom θP,x and sometimes we write θj,x for fj .
A family {θP,x}x∈U of grammar constructions over P from x, varying in the U
grammar character, forms a grammar construction θP,U over P from U when
its elements share the same codomain.
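The following Python sketch is my own toy encoding, not the authors’: a grammar construction is stored as its product, its governor, the codomain character and the family of arrows indexed by the dependent tuples j ∈ P′; the arrow names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

# Sketch only: a grammar construction theta_{P,x} over a product P from a governor x.

@dataclass
class Construction:
    product: Tuple[Tuple[str, ...], Tuple[str, ...]]   # <Y1 x ... x Yn, (z1, ..., zn)>
    governor: str                                      # the object x (dom of the construction)
    codomain: str                                      # the character V (cod of the construction)
    arrows: Dict[Tuple[str, ...], str] = field(default_factory=dict)  # j in P' -> f_j

# A hypothetical degree-1 verbal construction governed by "go", with one dependent.
verbal_go = Construction(
    product=(("O",), ("eps",)),
    governor="go",
    codomain="I2",
    arrows={("Paul",): "f_(go,Paul)"},
)
print(verbal_go.codomain, verbal_go.arrows[("Paul",)])  # I2 f_(go,Paul)
```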


 6 of 23
Grammar construction II




In the example, we see instances of
   the verbal constructions with products 〈O × O, (ϵ, ϵ)〉 and 〈O × O, (ϵ, to)〉;
   the infinitive construction with product 〈D, (ϵ)〉;
   the gerund construction with product 〈D, (ϵ)〉;
   the adjunctive construction with product 〈A, (ϵ)〉;
   the circumstantial construction with product 〈O, (in)〉.




 7 of 23
Conjugate construction

Definition 4 (Conjugate construction)
Given a small category C, a set G = {Gi}i∈GC of grammar characters on C,
with W ∈ G, and a grammar construction θ = {{fj : x → c (x , j )}j∈P′}x∈V over
P = 〈P′ = P1 × · · · × Pn , (a1 , . . . , an )〉 from the V grammar character, we say
that a grammar construction σ over Q from U is conjugate to θ if, for some
k in 1 . . . n, U = Pk ,

Q = 〈P1 ×· · ·× Pk−1 × V × Pk+1 ×· · ·× Pn , (ak , a2 , . . . , ak−1 , a1 , ak+1 , . . . , an )〉 ,

and, for all (x , j1 , . . . , jn ) ∈ V × P′, cod θ(j1,...,jn),x = cod σ(j1,...,jk−1,x,jk+1,...,jn),jk .

Essentially, a conjugate construction is just the “normal” construction where
the governor is exchanged with one dependent. It models a different, but
equivalent, point of view of what matters in the construction.
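As a concrete reading of the swap in Definition 4, here is a Python sketch (toy data, 1-based k as on the slide) building the conjugate product Q from P: the k-th character is replaced by the governor’s character V and the adpositions a1 and ak are exchanged.

```python
# Sketch only: the conjugate product of <P1 x ... x Pn, (a1, ..., an)> at position k.

def conjugate_product(characters, adpositions, k, V):
    chars, adps = list(characters), list(adpositions)
    chars[k - 1] = V                              # P_k is replaced by V
    adps[0], adps[k - 1] = adps[k - 1], adps[0]   # (a_k, a_2, ..., a_1, ..., a_n)
    return tuple(chars), tuple(adps)

# Hypothetical: conjugating the verbal product <O x O, (eps, to)> at its second slot,
# when the governor lives in the character I2.
print(conjugate_product(("O", "O"), ("eps", "to"), k=2, V="I2"))
# (('O', 'I2'), ('to', 'eps'))
```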

  8 of 23
Composition of constructions I


Definition 5 (Composition)
Given a small category C, a set G = {Gi }i ∈GC of grammar characters on C,
with W ∈ G, and grammar constructions
          η = {{f(y1,...,yn) : x → c (x , y1 , . . . , yn )}y1∈P1,...,yn∈Pn}x∈V

over P = 〈P1 × · · · × Pn , (a1 , . . . , an )〉 from V , and

          θ = {{g(z1,...,zm) : x → d (x , z1 , . . . , zm )}z1∈Q1,...,zm∈Qm}x∈U

over Q = 〈Q1 × · · · × Qm , (b1 , . . . , bm )〉 from U, the composition θ ◦ η is
defined when dom θ = U = cod η.                                                      →


 9 of 23
Composition of constructions II



→ (Composition)
In this case,
         θ ◦ η = {{g(z1,...,zm) ◦ f(y1,...,yn)}y1∈P1,...,yn∈Pn,z1∈Q1,...,zm∈Qm}x∈V

so dom θ ◦ η = dom η = V , cod θ ◦ η = cod θ , and the product is
〈P1 × · · · × Pn × Q1 × · · · × Qm , (a1 , . . . , an , b1 , . . . , bm )〉.


Despite the apparent complexity, composition is what one naturally expects
it to be.
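A tiny Python sketch of the bookkeeping above (toy strings, not the authors’ code): the product of θ ◦ η is the concatenation of the two products, while dom and cod are inherited from η and θ respectively.

```python
# Sketch only: the product of a composition is the concatenation of the products.

def compose_products(eta_product, theta_product):
    (p_chars, p_adps), (q_chars, q_adps) = eta_product, theta_product
    return p_chars + q_chars, p_adps + q_adps

eta_product = (("O", "O"), ("eps", "to"))  # <P1 x ... x Pn, (a1, ..., an)>, from V
theta_product = (("O",), ("in",))          # <Q1 x ... x Qm, (b1, ..., bm)>, from U = cod(eta)
print(compose_products(eta_product, theta_product))
# (('O', 'O', 'O'), ('eps', 'to', 'in'))
```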




 10 of 23
Grammar category I

Definition 6 (Grammar category)
Consider a small category C, a set G = {Gi }i ∈GC of grammar characters on
C, with W ∈ G, and a set C of grammar constructions over products in C, G,
W from grammar characters in G. Then, the structure 〈C, G, W , C 〉 is said
to be a grammar category when
1. (well-foundedness) There is M ⊆ Obj C, the set of atoms, such that, for
   each object x, there is an arrow f ∈ HomC (m, x ) for some m ∈ M.
   Moreover, if x ∈ M, f = 1x .
2. (covering) For each arrow f ∈ HomC (a, b), there is ηP,V ∈ C such that
   a ∈ V and f ∈ ηP,a .
3. (1-degree regularity) Every construction in C is the composition of a
   finite sequence of constructions in C each one having degree 1.
4. (conjugate regularity) Each construction in C has exactly one conjugate
   construction for every element in its product.
 11 of 23
Grammar category II

Apart from the technicalities in the definition, a small category can be interpreted
as a grammar category, i.e., a language with a formal adpositional
grammar, when we can find a set of grammar characters, a set of
adpositions, a set of atoms and a set of constructions in it such that
    every object can be constructed starting from the atoms;
    every arrow is part of some construction;
    every construction can be decomposed into a sequence of elementary
    (one governor, one dependent) constructions;
    conjugate constructions depend only on the product.

It is always possible to satisfy these requirements in a trivial way. Things
become interesting, as usual, in the non-trivial case.
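For instance, on a finite presentation of the category the requirements can be checked mechanically; the Python sketch below tests only the well-foundedness condition, on hypothetical data, with identities on atoms taken for granted.

```python
# Sketch only: condition 1 (well-foundedness) on a finite presentation,
# given as a set of objects, a set of atoms M, and arrows as (source, target) pairs.

def well_founded(objects, atoms, arrows):
    # Atoms reach themselves via identities; every other object must be the
    # target of at least one arrow whose source is an atom.
    reached = set(atoms) | {t for (s, t) in arrows if s in atoms}
    return set(objects) <= reached

objects = {"Paul", "study", "maths", "study maths"}
atoms = {"Paul", "study", "maths"}
arrows = [("study", "study maths"), ("maths", "study maths")]
print(well_founded(objects, atoms, arrows))  # True
```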

And yes, English, Italian, Chinese, . . . can be described in this way!

 12 of 23
Adtrees

Given a grammar category, we can write every arrow in a construction
                     {f(y1,...,yn) : x → c (x , y1 , . . . , yn )}y1∈Y1,...,yn∈Yn

over the product 〈Y1 × · · · × Yn , (a1 , . . . , an )〉 from x as

        f(y1,...,yn) : x → an (yn , an−1 (yn−1 , . . . , a1 (x , y1 ) . . . )) = c (x , y1 , . . . , yn ) .



If we omit f , which is normally clear from the context, and write the target
term as a tree, we get a graphical representation like the one in our example.
This representation is called an adtree.

When every object can be represented in a unique way as an adtree
(modulo conjugate constructions), a grammar category is said to be pure.
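The nested target term can be computed mechanically; the Python sketch below (my own encoding, not the one in the book) builds an (yn , an−1 (yn−1 , . . . , a1 (x , y1 ) . . . )) as nested tuples from the governor and the list of (adposition, dependent) pairs.

```python
# Sketch only: building a_n(y_n, a_{n-1}(y_{n-1}, ..., a_1(x, y_1) ...)) as nested tuples.

def adtree_term(governor, steps):
    """steps: the pairs (a_1, y_1), ..., (a_n, y_n) of the construction."""
    (a1, y1), *rest = steps
    term = (a1, governor, y1)        # a_1(x, y_1)
    for a_i, y_i in rest:
        term = (a_i, y_i, term)      # a_i(y_i, previous term)
    return term

# Hypothetical fragment of the example: "study maths", then "in the library".
print(adtree_term("study", [("eps", "maths"), ("in", "the library")]))
# ('in', 'the library', ('eps', 'study', 'maths'))
```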
 13 of 23
Parsing

Theorem 7
Let C be a grammar category. Then, every object of C is denoted by an
adtree X , and there is an adtree Y , denoting the same object, such that Y
is linear (its governor and its dependents are atoms) and Y is equivalent to
X modulo conjugation.

When the collections of objects and arrows of C are small enough, the
proof of the theorem provides a (very inefficient) algorithm to reconstruct X
from Y , and Y is (almost) the usual way we write expressions.

Moreover, we can limit ourselves to observing adtrees: if a grammar category
is pure, up to conjugates, every adtree represents (“is”) a unique object; in
the general case, a grammar category is a quotient of the category
Adtree(C) having adtrees as objects and instances of constructions as
arrows.
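As a toy illustration of the linear form, the Python sketch below reads off the leaves of an adtree encoded as nested triples; it makes no attempt to recover English word order, it only lists the atoms Theorem 7 speaks about.

```python
# Sketch only: the atoms occurring in a toy adtree (adposition, left_subtree, right_subtree).

def leaves(adtree):
    if isinstance(adtree, str):      # an atom
        return [adtree]
    _adposition, left, right = adtree
    return leaves(left) + leaves(right)

t = ("eps", ("in", "the library", ("eps", "study", "maths")), "Paul")  # hypothetical
print(leaves(t))  # ['the library', 'study', 'maths', 'Paul']
```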
 14 of 23
Transformations

Definition 8 (Transformation)
Given a grammar category C, a transformation on C is an endofunctor on
the category Adtree(C) of adtrees on C.

Defining constructions is easy when we deal with the elementary aspects of
a natural language. But, when dealing with sophisticated concepts, it is
easier to define a construction as the result of a transformation over (a
specific subset of) adtrees.

In our example, the verbal construction and the circumstantial construction
are elementary, while the infinitive and the gerund are best understood as
the result of appropriate transformations mapping the “direct” sentence into
the “abstract” tense.

In general, one may think of transformations as “translations” of expressions
into a context.
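To fix ideas, here is a toy transformation in Python: a purely illustrative rewriting on adtrees encoded as nested triples, replacing the adposition “to” by “-ing” everywhere. This is not the actual infinitive or gerund transformation of the framework, only the kind of systematic, structure-respecting map an endofunctor induces on objects.

```python
# Sketch only: a structure-respecting rewriting on toy adtrees (adp, left, right).

def to_gerund(adtree):
    if isinstance(adtree, str):                 # atoms are left untouched
        return adtree
    adp, left, right = adtree
    adp = "-ing" if adp == "to" else adp        # hypothetical rewriting rule
    return (adp, to_gerund(left), to_gerund(right))

print(to_gerund(("to", ("eps", "study", "maths"), "go")))
# ('-ing', ('eps', 'study', 'maths'), 'go')
```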
 15 of 23
Language generation I


In the case of a pure grammar category C, the category Adtree(C) is
isomorphic to C, so the language can be generated by constructing every
possible adtree. This is possible because the grammar category is
well-founded.

In the case of English, it is immediate to see that every correct adtree is
generated, but incorrect ones are generated as well.

So, we have to recognize the correct adtrees among the collection of all the
generated adtrees. A closer analysis reveals that the “wrong” adtrees have
the “right” structure, but they fail to agree on person (go vs. goes), on
number (singular vs. plural), and so on.




 16 of 23
Language generation II

All these details are called redundancies in general linguistics, as their role
is to “repeat” pieces of information already present in an expression, so as
to unambiguously determine its meaning.

To cope with redundancies we need to deal with the morphemic structure,
i.e., with the way words are formed; I will not go deeper into this aspect in this talk. It
suffices to say that the morphemic structure is captured by grammar
categories whose adtrees represent “words” and idiomatic expressions.

Also, we can always build a grammar category over another one, whose
adtrees become the atoms of the new category. This construction is beyond
the scope of this talk, but it suffices to say that the linguistic level, which is a
grammar category, is built upon the morphemic level, another grammar
category, and it is possible to recover the morphemic structure in a
canonical way from the linguistic level, when the morphemic category is
known.
 17 of 23
Language generation III


It is always possible to imagine a transformation that maps an adtree X into
another one where a redundancy aspect is corrected if it was wrong in X .
By the way, this is the “usual” way to explain redundancies (except that
nobody speaks of transformations. . . ).

So, let us suppose we have a collection R of transformations devoted to
correcting redundancies. Evidently, an adtree X is correct with respect to
redundancies if, and only if, X = τ(X ) for every τ ∈ R. That is, X is correct if
and only if X is a fixed point for every element of R.

Hence, the language generated by a grammar category with respect to a
set of redundancy-transformations R is the collection of adtrees which are
fixed points for R.
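A minimal Python sketch of this fixed-point criterion, with toy adtrees and a hypothetical agreement rule rather than the actual redundancies of English:

```python
# Sketch only: X is correct w.r.t. the redundancies iff tau(X) == X for every tau in R.

def fix_person_agreement(adtree):
    """Hypothetical redundancy: a third-person singular subject forces 'goes'."""
    if isinstance(adtree, str):
        return adtree
    adp, gov, dep = adtree
    gov, dep = fix_person_agreement(gov), fix_person_agreement(dep)
    if gov == "go" and dep in {"Paul", "he", "she"}:
        gov = "goes"
    return (adp, gov, dep)

R = [fix_person_agreement]

def correct(adtree):
    return all(tau(adtree) == adtree for tau in R)

print(correct(("eps", "go", "Paul")))    # False: the agreement redundancy rewrites it
print(correct(("eps", "goes", "Paul")))  # True: a fixed point of every tau in R
```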



 18 of 23
Semantics and pragmatics

We can develop semantics and pragmatics inside this framework by
considering “narratives”: the meaning and the intention of an expression are
given by relations with the world. And, in turn, the world is a huge
description (a narrative) which is presumed by the speaker.

The world’s narrative can be coded by some adtree λ and the expression
X , together with its narrative, becomes the enriched adtree ϵ(λ, X ).

Semantics arises when we precisely define how X relates to λ. Pragmatics
arises when we study how a sequence of expressions develops over time, in
particular, in which way the λ’s are modified.

In our approach, semantics has been partially developed, although it has
not been completed. On the other hand, pragmatics has been developed
according to the ideas of Searle, and it has been applied to study the
interaction between a therapist and a patient in psychoanalytical practice.

 19 of 23
Links with Leeds I



Building a theory in Mathematics is also a linguistic act!

We define a formal language, we declare axioms, we provide definitions, we
prove theorems, referring to an implicit, but well-described, world.

To what extent can we use the linguistic approach illustrated here to infer
properties of mathematical theories?

Is being constructive, or being predicative, perhaps a “linguistic” property of
a theory?




 20 of 23
Links with Leeds II



I believe that we cannot expect to gain deep results on syntax or semantics
from the illustrated approach.
But I also believe that we can, and should, learn more about our theories
when we also consider the pragmatics of the development of mathematical
theories.

In particular, I think that being constructive or predicative are mainly
linguistic properties of a theory. And this is an “attack point” for the research
project I’ll work on during my time in the School of Mathematics.




 21 of 23
More information




   Federico Gobbo and Marco Benini, Constructive Adpositional
   Grammars: Foundations of Constructive Linguistics, Cambridge Scholars
   Publishing, Newcastle upon Tyne, UK (2011)
   Ask the second author!




22 of 23
The end




           Questions?




23 of 23
