
9h30 – 11h00 : Tutorial by David Chiang (Univ. of Southern California)
Synchronous Grammars
Synchronous context-free grammars (CFGs), first proposed in the 1960s, have become a popular and powerful tool in machine translation, semantic parsing, and other areas of natural language processing. Synchronous tree-adjoining grammars (TAGs) were first proposed in the 1990s and are starting to see interesting applications. The theory behind synchronous grammars, and the algorithms that power their applications, are sometimes natural extensions of those of conventional grammars, but there are also some surprising twists and turns. I will give an introduction to synchronous CFGs and TAGs, present some of their key formal properties, and describe the main algorithms that use them. I will also describe some synchronous grammar formalisms beyond synchronous TAG, like synchronous hyperedge replacement grammars.
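As a minimal illustration of the core idea (not part of the tutorial; the toy grammar and function names below are invented for exposition), a synchronous CFG pairs a source and a target rule and links nonterminal occurrences across the two sides, so that both sides are expanded by the same subderivation:

```python
# Toy synchronous CFG: each rule is a pair (source RHS, target RHS).
# Linked nonterminal occurrences carry an index, e.g. "NP.1" on both
# sides must be rewritten by the same subderivation.
SCFG = {
    "S":  [(["NP.1", "VP.2"], ["NP.1", "VP.2"])],
    # The target side reorders verb and object, as in reordering for MT.
    "VP": [(["V.1", "NP.2"], ["NP.2", "V.1"])],
    "NP": [(["john"], ["jean"]), (["apples"], ["pommes"])],
    "V":  [(["eats"], ["mange"])],
}

def derive(item, choices):
    """Expand a linked item such as "NP.1" (or the start symbol "S")
    top-down, returning the paired (source, target) yields.
    `choices` maps a linked occurrence to the rule index used for it."""
    name = item.split(".")[0]
    if name not in SCFG:                 # terminal: yields itself
        return [item], [item]
    src_rhs, tgt_rhs = SCFG[name][choices.get(item, 0)]
    # Expand each linked nonterminal exactly once; both sides reuse the
    # same subderivation, which is what makes the rewriting synchronous.
    subs = {x: derive(x, choices) for x in src_rhs if x.split(".")[0] in SCFG}
    src = [w for x in src_rhs for w in (subs[x][0] if x in subs else [x])]
    tgt = [w for x in tgt_rhs for w in (subs[x][1] if x in subs else [x])]
    return src, tgt
```

For example, `derive("S", {"NP.2": 1})` yields the pair `(["john", "eats", "apples"], ["jean", "pommes", "mange"])`: the two sides are built from one derivation tree, with the VP rule swapping verb and object on the target side.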
11h30 – 13h00 : Tutorial by Vera Demberg (Univ. of Saarland)
Tree-Adjoining Grammars from a Psycholinguistic Perspective
We will first review some psycholinguistic experiments that are revealing about certain properties of human language comprehension, such as to what degree sentence processing is incremental and possibly even connected, as well as studies that indicate that humans actively predict upcoming input. We will thereby cover syntactic as well as semantic and discourse-level effects.
In the second part of the tutorial, I will discuss how these effects can be modelled using Tree-Adjoining Grammars.
LCFRS+: Linear Context-Free Rewriting Systems and Related Formalisms
Recently, there has been an increased interest in Linear Context-Free Rewriting Systems (LCFRSs), due to their mild context-sensitivity and their capacity to describe discontinuous constituents and non-projective dependencies. LCFRS research has particularly intensified in the area of parsing.
In this tutorial, LCFRSs will be motivated and introduced. Furthermore, closely related formalisms such as Multiple Context-Free Grammars (MCFG) and simple Range Concatenation Grammars (RCG) will be defined and related to LCFRS. The link between LCFRS and the notion of mild context-sensitivity will be discussed and, in this context, the question whether one might even want to go beyond LCFRS will be raised. The more powerful formalisms of (unrestricted) RCG and Literal Movement Grammars (LMG) will be introduced, both of which are natural extensions of LCFRS, depending on whether the LCFRS rules are understood as manipulating strings or manipulating concrete occurrences of substrings of some input string. The former leads to LMG while the latter leads to RCG.
The aim of the tutorial is to give an overview of the formal grammar landscape ranging from CFG to LCFRS, RCG and LMG, relating the different types of rewriting rules and the different language classes defined by these formalisms. The expressive power and the limitations of these grammars are illustrated by numerous examples.
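As a small illustration of what a nonterminal with fan-out greater than one buys (a textbook-style example, not taken from the tutorial), the following sketch encodes an LCFRS for the non-context-free language a^n b^n c^n d^n, in which a single nonterminal A contributes two discontinuous spans:

```python
def A(n):
    """Fan-out-2 nonterminal A: derives the PAIR of strings
    (a^n b^n, c^n d^n), i.e. one constituent contributing two
    discontinuous spans of the final string."""
    if n == 0:
        # LCFRS rule  A(eps, eps).
        return ("", "")
    x, y = A(n - 1)
    # LCFRS rule  A(a x b, c y d) <- A(x, y): each component of the
    # left-hand side concatenates terminals and variables, and every
    # variable of the right-hand side is used exactly once (linearity).
    return ("a" + x + "b", "c" + y + "d")

def S(n):
    # LCFRS rule  S(x y) <- A(x, y): the start symbol (fan-out 1)
    # concatenates A's two components into the full string.
    x, y = A(n)
    return x + y
```

Here `S(2)` yields "aabbccdd": the a/b block and the c/d block grow in lockstep because both belong to the same A constituent, exactly the kind of discontinuity a CFG nonterminal (fan-out 1) cannot express.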
Trees abound: A primer on tree automata and tree transducers
We introduce tree automata and tree transducers formally and through examples. We also link them to the (synchronous) grammar notions that are better known in NLP. We then proceed to review most of the basic tree automata and tree transducer results, with tie-ins to current results obtained in the NLP community. Finally, we cover some interesting advanced results that have so far received little interest from the NLP community.
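To make the basic object concrete (a standard textbook example, not drawn from this tutorial), here is a deterministic bottom-up tree automaton over the ranked alphabet {and/2, or/2, not/1, t/0, f/0}, with states "T" and "F" and final state "T"; it accepts exactly the boolean expression trees that evaluate to true:

```python
# Transition table of a deterministic bottom-up tree automaton:
# maps (symbol, child states...) to the state assigned at that node.
TRANSITIONS = {
    ("t",): "T", ("f",): "F",
    ("not", "T"): "F", ("not", "F"): "T",
    ("and", "T", "T"): "T", ("and", "T", "F"): "F",
    ("and", "F", "T"): "F", ("and", "F", "F"): "F",
    ("or",  "T", "T"): "T", ("or",  "T", "F"): "T",
    ("or",  "F", "T"): "T", ("or",  "F", "F"): "F",
}

def run(tree):
    """Assign a state to `tree` bottom-up. Trees are nested tuples,
    e.g. ('and', ('t',), ('not', ('f',)))."""
    symbol, children = tree[0], tree[1:]
    child_states = tuple(run(c) for c in children)
    return TRANSITIONS[(symbol,) + child_states]

def accepts(tree):
    # Acceptance: the state computed at the root is the final state "T".
    return run(tree) == "T"
```

For instance, `accepts(('and', ('t',), ('not', ('f',))))` holds, while `('or', ('f',), ('f',))` is rejected; the automaton computes states strictly from the leaves upwards, with no access to context above a node.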
9h00 – 10h00 : Invited Talk by Bonnie Webber (Univ. of Edinburgh)
Chair: Laurence Danlos
Alternatives, Discourse Semantics and Discourse Structure
Sentence-level modality and negation give rise to “alternative” events and/or situations that contribute to discourse semantics in interesting ways. But they are not the only linguistic elements that do this. In this talk, I will try to characterise the range of elements that give rise to alternatives and the nature of these events and situations. I will then show how these alternatives are distinct from the coherence relations that provide a low level of discourse structure.
Delayed Tree-Locality, Set-Locality, and Clitic Climbing
Joan Chen-Main (Univ. of Pennsylvania), Tonia Bleam (Univ. of Maryland) and Aravind Joshi (Univ. of Pennsylvania)
Since Bleam's (2000) initial claim that capturing clitic climbing patterns in Romance requires the descriptive power of set-local MCTAG (Weir, 1988), alternative approaches to relaxing tree-locality restrictions have been developed, including delayed tree-local MCTAG (Chiang and Scheffler, 2008), which, unlike set-local MCTAG, is weakly equivalent to standard TAG. This paper compares 2-delayed tree-local MCTAG with set-local MCTAG in terms of how well the two formalisms can account for the clitic climbing data. We confirm that 2-delayed tree-local MCTAG has the formal expressivity to cover the data by proposing an explicit grammar that does so. However, we also find that the constraint on set locality is particularly well-suited for capturing these clitic climbing patterns. That is, though globally less restrictive, set-local MCTAG appears to be restrictive in just the right way in this specific case.
Deriving syntax-semantics mappings: node linking, type shifting and scope ambiguity
Dennis Ryan Storoshenko (Yale University) and Robert Frank (Yale University)
In this paper, we introduce a type-shifting operation which provides a principled means of describing the derivational links required in Synchronous TAG accounts of quantification. Links no longer appear on the root nodes of predicates on an ad hoc basis; rather, they are generated as part of a type-shifting mechanism over the arguments of the predicate. By introducing a set of temporal variables to the system, we show how this operation can also be used to account for the scope interactions of clausal embedding. We then move on to consider additional cases of multiple clausal embedding and coordination.
Tree Adjunction as Minimalist Lowering
Thomas Graf (UCLA)
Even though Minimalist grammars are more powerful than TAG on the string level, the classes of tree languages the two define are incomparable. I give a constructive proof that if the standard Move operation in Minimalist grammars is replaced by Reset Lowering, every TAG tree language can be generated. The opposite does not hold, so the strong generative capacity of Minimalist grammars with Reset Lowering exceeds that of TAG.
A Frame-Based Semantics of Locative Alternation in LTAG
Yulia Zinova (Univ. of Düsseldorf) and Laura Kallmeyer (Univ. of Düsseldorf)
In this paper we present an analysis of locative alternation phenomena in Russian and English within a frame-based LTAG syntax-semantics interface. The combination of a syntactic theory with an extended domain of locality and frames provides a powerful mechanism for argument linking. Furthermore, the concept of tree families and unanchored trees in LTAG allows for a decomposition of meaning into lexical and constructional components.
A Logical Characterization of Extended TAGs
Uwe Mönnich (Univ. of Tübingen)
Context-free tree grammars, originally introduced by Rounds (1970), are powerful grammar devices for the definition of tree languages. In the present paper, we consider a subclass of the class of context-free tree languages, namely the class of monadic simple context-free tree languages. We show that this class, which is a faithful rendering of extended TAGs, can be given a simple logical characterization in terms of monadic second-order transductions.
Synchronous Tree Unification Grammar
Timm Lichte (Univ. of Düsseldorf)
This paper presents a novel grammar formalism, Synchronous Tree Unification Grammar (STUG), that borrows ideas from two rather distinct exemplars of tree-based grammar formalisms, namely Synchronous Tree Adjoining Grammar and Tree Unification Grammar. At the same time, STUG differs considerably from both in that it allows for a clean separation of syntax and valency. Exploiting this potential in the modelling of natural language grammar has a number of interesting consequences that we will sketch in the course of this paper.
Synchronous Context-Free Tree Grammars
MarkJan Nederhof (Univ. of St Andrews) and Heiko Vogler (Univ. of Dresden)
We consider pairs of context-free tree grammars combined through synchronous rewriting. The resulting formalism is at least as powerful as synchronous tree adjoining grammars and linear, non-deleting macro tree transducers, while the parsing complexity remains polynomial. Its power is subsumed by context-free hypergraph grammars. The new formalism has an alternative characterization in terms of bimorphisms. An advantage over synchronous variants of linear context-free rewriting systems is the ability to specify tree-to-tree transductions.
Incremental Neo-Davidsonian semantic construction for TAG
Asad Sayeed (Univ. of Saarland) and Vera Demberg (Univ. of Saarland)
We propose a Neo-Davidsonian semantics approach as a framework for constructing a semantic interpretation simultaneously with a strictly incremental syntactic derivation using the PLTAG formalism, which supports full connectedness of all words under a single node at each point in time. This paper explains why Neo-Davidsonian semantics is particularly suitable for incremental semantic construction and outlines how the semantic construction process works. We also focus on quantifier scope, which turns out to be a particularly interesting question in the context of incremental TAG.
Representing Focus in LTAG
Kata Balogh (Univ. of Düsseldorf)
The paper proposes an LTAG semantic analysis to derive semantic representations for different focus constructions in a uniform way. The proposal is shown via examples of different narrow focus constructions, multiple foci and focus in questions.
Describing São Tomense Using a Tree-Adjoining Meta-Grammar
Emmanuel Schang (Univ. of Orléans), Denys Duchier (Univ. of Orléans), Brunelle Magnana Ekoukou (Univ. of Orléans), Yannick Parmentier (Univ. of Orléans) and Simon Petitjean (Univ. of Orléans)
In this paper, we show how the interactions between the tense, aspect and mood preverbal markers in São Tomense can be formally and concisely described at an abstract level, using the concept of projection. More precisely, we show how to encode the different valid orders of preverbal markers in an abstract description of a Tree-Adjoining Grammar of São Tomense. This description is written using the XMG metagrammar language (Crabbé and Duchier, 2004).
An Attempt Towards Learning Semantics: Distributional Learning of IO Context-Free Tree Grammars
Ryo Yoshinaka (Univ. of Kyoto)
Solid techniques based on distributional learning have been developed targeting rich subclasses of CFGs and their extensions, including linear context-free tree grammars. Along this line, we propose a learning algorithm for some subclasses of IO context-free tree grammars.
Delayed Tree Locality and the Status of Derivation Structures
Joan Chen-Main (Univ. of Pennsylvania)
While the derived trees yielded by TAG derivations are uncontroversially taken to correspond to phrase structure, the status of TAG derivation structures as more than a record of TAG operations is less certain. An attractive possibility is to interpret the derivation structure as some representation of semantic meaning, such as a dependency analysis. However, the literature has identified cases where doing so is problematic (Rambow et al., 1995; Candito and Kahane, 1998; Frank and van Genabith, 2001; Gardent and Kallmeyer, 2003; Kallmeyer and Romero, 2008), including what has been referred to as the Missing Link Problem: predicates which should have a dependency link are unconnected in the derivation structure. This paper shows that delayed tree-local MCTAG (Chiang and Scheffler, 2008) provides a solution for certain types of missing links. Further, we observe that the regular-form 2-level TAG solutions to the Missing Link Problem given in (Dras et al., 2004) can be reinterpreted using delayed tree-local MCTAG: the object-level derivations of the 2-level TAG derivations can be converted into legal 1-delayed tree-local MCTAG derivations. Thus, delayed tree-locality maintains the possibility that TAG derivation structures can be more meaning-laden than solely a record of the combination of trees.
A Formal Model for Plausible Dependencies in Lexicalized Tree Adjoining Grammar
Laura Kallmeyer (Univ. of Düsseldorf) and Marco Kuhlmann (Univ. of Uppsala)
Several authors have pointed out that the correspondence between LTAG derivation trees and dependency structures is not as direct as it may seem at first glance, and various proposals have been made to overcome this divergence. In this paper we propose to view the correspondence between derivation trees and dependency structures as a tree transformation during which the direction of some of the original edges is reversed. We show that, under this transformation, LTAG is able to induce both ill-nested dependency trees and dependency trees with gap-degree greater than 1, which is not possible under the direct reading of derivation trees as dependency trees.
Using FB-LTAG Derivation Trees to Generate Transformation-Based Grammar Exercises
Claire Gardent (CNRS/LORIA, Nancy) and Laura Perez-Beltrachini (Univ. of Lorraine)
Using a Feature-Based Lexicalised Tree Adjoining Grammar (FB-LTAG), we present an approach for generating pairs of sentences that are related by a syntactic transformation, and we apply this approach to create language learning exercises. We argue that the derivation trees of an FB-LTAG provide a good level of representation for capturing syntactic transformations. We relate our approach to previous work on sentence reformulation, question generation and grammar exercise generation. We evaluate precision and linguistic coverage, and we demonstrate the genericity of the proposal by applying it to a range of transformations, including the Passive/Active transformation, the pronominalisation of an NP, the assertion / yes-no question relation and the assertion / wh-question transformation.
9h00 – 10h00 : Invited Talk by Kevin Knight (Univ. of Southern California)
Chair: Khalil Sima'an
Transformation Frameworks for Machine Translation: Strings, Trees, and Graphs
Accurate machine translation (MT) of human languages is a longstanding challenge for computer science. Probabilistic string, tree, and graph automata provide nice formal frameworks for largescale statistical MT systems. This talk addresses the power of these frameworks and how well they fit observed human translation data.
PLCFRS Parsing Revisited: Restricting the Fan-Out to Two
Wolfgang Maier (Univ. of Düsseldorf), Miriam Kaeshammer (Univ. of Düsseldorf) and Laura Kallmeyer (Univ. of Düsseldorf)
Linear Context-Free Rewriting System (LCFRS) is an extension of Context-Free Grammar (CFG) in which a nonterminal can dominate more than a single continuous span of terminals. Probabilistic LCFRS have recently been used successfully for the direct data-driven parsing of discontinuous structures. In this paper we present a parser for binary PLCFRS of fan-out two, together with a novel monotone estimate for A* parsing, with which we conduct experiments on modified versions of the German NeGra treebank and the Discontinuous Penn Treebank in which all trees have block degree two. The experiments show that, compared to previous work, our approach provides an enormous speedup while delivering output of comparable richness.
Decomposing TAG Algorithms Using Simple Algebraizations
Alexander Koller (Univ. of Potsdam) and Marco Kuhlmann (Uppsala Univ.)
We review a number of different "algebraic" perspectives on TAG and STAG in the framework of interpreted regular tree grammars (IRTGs). We then use this framework to derive a new parsing algorithm for TAGs, based on two algebras that describe strings and derived trees. Our algorithm is extremely modular, and can easily be adapted to the synchronous case.
Practical Parsing of Parallel Multiple ContextFree Grammars
Peter Ljunglöf (Gothenburg & Chalmers Univ. of Technology)
We discuss four previously published parsing algorithms for parallel multiple context-free grammar (PMCFG) and argue that they are similar to each other, each implementing an Earley-style top-down algorithm. Starting from one of these algorithms, we derive three modifications: one bottom-up and two variants using a left-corner filter. An evaluation shows that substantial improvements can be made by using the algorithm that performs best on a given grammar. The algorithms are implemented in Python and released under an open-source licence.
Idioms and extended transducers
Gregory M. Kobele (Univ. of Chicago)
There is a tension between the idea that idioms are listed in the lexicon and the idea that they are composed, in the standard way, of the lexical items which seem to inhabit them. In other words, in order to maintain the insight that idioms actually contain the words they look like they contain, we need to derive them syntactically from these words. However, the entity that should be assigned a special meaning is then a derivation, which is not the kind of object that can occur in a lexicon (which contains, by definition, the atoms of which derivations are built), and thus not the kind of thing that we are able to assign meanings to directly. Here I will show how to resolve this tension in an elegant way, one which bears striking similarities to proposals by psychologists and psycholinguists working on idioms.
Creating a Tree Adjoining Grammar from a Multilayer Treebank
Rajesh Bhatt (Univ. of Massachusetts), Owen Rambow (Columbia Univ.) and Fei Xia (Univ. of Washington)
We propose a method for the extraction of a Tree Adjoining Grammar (TAG) from a dependency treebank which has some representative examples annotated with phrase structures. We show that the resulting TAG, along with the corresponding dependency structures, can be used to convert a dependency treebank to a TAG-based phrase structure treebank.
Search Space Properties for Learning a Class of Constraint-based Grammars
Smaranda Muresan (Rutgers Univ.)
We discuss a class of constraint-based grammars, Lexicalized Well-Founded Grammars (LWFGs), and present the theoretical underpinnings for learning these grammars from a representative set of positive examples. Given several assumptions, we define the search space as a complete grammar lattice. In order to prove a learnability theorem, we give a general algorithm through which the top and bottom elements of the complete grammar lattice can be built.
State-Split for Hypergraphs with an Application to Tree Adjoining Grammars
Johannes Osterholzer (Univ. of Dresden) and Torsten Stüber (Univ. of Dresden)
In this work, we present a generalization of the state-split method to probabilistic hypergraphs. We show how to represent the derivational structure of probabilistic tree-adjoining grammars by hypergraphs and detail how the generalized state-split procedure can be applied to such representations, yielding a state-split procedure for tree-adjoining grammars.
Is Syntactic Binding Rational?
Thomas Graf (UCLA) and Natasha Abner (UCLA)
Recent results show that both TAG and Minimalist grammars can be enriched with rational constraints without increasing their strong generative capacity, where a constraint is rational iff it can be computed by a bottom-up tree automaton. This raises the question of which aspects of syntax can be adequately formalized using only such constraints. One of the hardest phenomena commonly studied by syntacticians is binding theory. In this paper, we give a high-level implementation of (the syntactic parts of) binding theory in terms of rational constraints, and we argue that this implementation is sufficiently powerful for natural language. This conclusion is backed up by data drawn from English, German, and American Sign Language.
Incremental Derivations in CCG
Vera Demberg (Univ. of Saarland)
This paper presents a research note on the degree to which strictly incremental derivations (that is, derivations which are fully connected at each point in time) are possible in Combinatory Categorial Grammar (CCG). There has been a recent surge of interest in incremental parsing, both from the psycholinguistic community in a bid to build psycholinguistically plausible models of language comprehension, and from the NLP community for building systems that process language greedily in order to achieve shorter response times in spoken dialogue systems, speech recognition and machine translation. CCG allows for a variety of different derivations, including derivations that are almost fully incremental. This paper explores the syntactic constructions for which full incrementality is not possible in standard CCG, a point that recent work on incremental CCG parsing has glossed over.
On the Form-Meaning Relations Definable by CoTAGs
Gregory M. Kobele (Univ. of Chicago) and Jens Michaelis (Univ. of Bielefeld)
Adding cosubstitution to the classical TAG operations of substitution and adjunction, coTAGs have been proposed as an "alternative conceptualization" to resolve the tension between the TAG mantra of locality of syntactic dependencies and the seeming non-locality of quantifier scope. CoTAGs follow the tradition of synchronous TAGs (STAGs) in that they derive syntactic and semantic representations simultaneously as pairs. We demonstrate that the mappings definable by coTAGs go beyond those of "simple" STAGs. While with regard to the first component coTAGs are weakly and strongly equivalent to classical TAGs, the second projection of the synchronously derived representations can in particular be, up to a homomorphism, the non-tree-adjoining language MIX(k), for any k > 3.
A linguistically-motivated 2-stage Tree to Graph Transformation
Corentin Ribeyre (INRIA-Alpage), Djamé Seddah (INRIA-Alpage & Paris-Sorbonne) and Éric Villemonte de la Clergerie (INRIA-Alpage)
We propose a new model for transforming dependency trees into target graphs, relying on two distinct stages. During the first stage, standard local tree transformation rules based on patterns are applied to collect a first set of constrained edges to be added to the target graph. In the second stage, motivated by linguistic considerations, the constraints on edges may be used to displace them or their neighbouring edges upwards, or to build new mirror edges. The main advantages of this model are that it simplifies the design of a transformation scheme, with a smaller set of simpler local rules for the first stage, and that it has good termination and confluence properties for the second stage.
Scope Economy and TAG Locality
Michael Freedman (Yale Univ.)
This paper gives an analysis explaining various cases where the scope of two logical operators is non-permutable in a sentence. The explanation depends on a theory of derivational economy couched in the synchronous tree adjoining grammar framework. Locality restrictions made using the STAG formalism allow us to limit the computational complexity of using transderivational constraints and allow us to make interesting empirical predictions.
The Shape of Elementary Trees and Scope Possibilities in STAG
Robert Frank (Yale Univ.) and Dennis Ryan Storoshenko (Yale Univ.)
Work on the syntax-semantics interface in the TAG framework has grappled with the problem of identifying a system with sufficient power to capture semantic dependencies which also imposes formally and linguistically interesting constraints on the kinds of dependencies that can be expressed. The consensus in recent years appears to have shifted to the use of a system that is substantially more expressive than TAG. In this paper, we revisit some of the arguments in favor of more formal power, particularly those of Nesson and Shieber (2008). We show that these arguments can be defused once we adopt a different perspective on predicate-headed semantic elementary trees, namely that they are divided into scope and variable components like their quantificational counterparts. We demonstrate as well that this proposal provides a new perspective on scope rigidity.

