Language Learning

8 Videos · Dec 11, 2009

About

Grammar Induction, Representation of Language and Language Learning

Now is the time to revisit some of the fundamental grammar/language learning tasks, such as grammar acquisition, language acquisition, language change, and the general problem of automatically inferring generic representations of language structure in a data-driven manner. Although the underlying problems are known to be computationally intractable for the standard representations of the Chomsky hierarchy, such as regular grammars and context-free grammars, progress has been made by modifying or restricting these classes to make them more observable. Generalisations of distributional learning have shown promise in unsupervised learning of linguistic structure, using tree-based representations or non-parametric approaches to inference. More radically, significant advances have been made by switching to different representations, such as the work of Clark, Eyraud & Habrard (2008), which addresses language acquisition but has the potential to cross-fertilise a wide range of problems requiring data-driven representations of language. Such approaches are starting to make inroads into one of the fundamental problems of cognitive science: learning complex representations that encode meaning. This adds a further motivation for returning to the topic now.

Grammar induction was the subject of intense study in the early days of Computational Learning Theory, with the theory of query learning largely developing out of this research. More recently, new methods of representing language and grammars through complex kernels and probabilistic modelling, together with algorithms such as structured output learning, have enabled machine learning methods to be applied successfully to a range of language-related tasks, from simple topic classification through part-of-speech tagging to statistical machine translation. These methods typically rely on more fluid structures than those derived from formal grammars, yet they compete favourably with classical grammatical approaches that require significant input from domain experts, often in the form of annotated data.
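
The distributional idea underlying this line of work is that substrings of the same syntactic category tend to occur in the same contexts. As a rough illustration only (a minimal sketch in Python, not code from any of the cited papers), the following groups the substrings of a toy corpus whose context sets coincide exactly; practical distributional learners relax this strict congruence criterion.

from collections import defaultdict

def contexts(corpus):
    # Map each substring (a word sequence) to the set of contexts
    # (left, right) it occurs in, so that left + sub + right is a sentence.
    ctx = defaultdict(set)
    for sentence in corpus:
        words = sentence.split()
        n = len(words)
        for i in range(n):
            for j in range(i + 1, n + 1):
                sub = " ".join(words[i:j])
                ctx[sub].add((" ".join(words[:i]), " ".join(words[j:])))
    return ctx

def congruence_classes(corpus):
    # Group substrings whose context sets are identical: the strict
    # syntactic-congruence criterion that practical learners relax.
    classes = defaultdict(list)
    for sub, c in contexts(corpus).items():
        classes[frozenset(c)].append(sub)
    return [sorted(subs) for subs in classes.values() if len(subs) > 1]

toy = ["the dog barks", "the cat barks", "the dog sleeps", "the cat sleeps"]
for cls in congruence_classes(toy):
    print(cls)  # e.g. ['cat', 'dog'] and ['barks', 'sleeps'] cluster together

On this toy corpus the sketch recovers noun-like and verb-like classes purely from distribution, which is the intuition that the tree-based and non-parametric generalisations above build on.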

The Workshop homepage can be found at http://www.cs.ucl.ac.uk/staff/rmartin/grll09/

Videos

Learnable Representations for Natural Language
Alexander Clark
54:42 · Jan 19, 2010 · 6555 views

Learning to Disambiguate Natural Language Using World Knowledge
Antoine Bordes
20:59 · Jan 19, 2010 · 4331 views

Language Modeling with Tree Substitution Grammars
Matt Post
16:28 · Jan 19, 2010 · 6199 views

Sparsity in Grammar Induction
Jennifer A. Gillenwater
25:07 · Jan 19, 2010 · 6386 views

Learning Languages and Rational Kernels
Mehryar Mohri
57:28 · Jan 19, 2010 · 4968 views

Inference for PCFGs and Adaptor Grammars
Mark Johnson
46:13 · Jan 19, 2010 · 4964 views

A Preliminary Evaluation of Word Representations for Named-Entity Recognition
Joseph Turian
20:19 · Jan 19, 2010 · 7881 views

Poster Spotlights (private)
Edouard Gilbert, Adam R. Teichert, Mehmet Ali Yatbaz, Valentin I. Spitkovsky, Oskar Kohonen, Kewei Tu
07:14 · Jan 14, 2010 · 2479 views

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.