English Resource Grammar
The LinGO English Resource Grammar (ERG) is a broad-coverage, linguistically precise HPSG-based grammar of English. The ERG is semantically grounded in Minimal Recursion Semantics (MRS), a flat semantic representation formalism that supports underspecification. The ERG is developed as part of the international Deep Linguistic Processing with HPSG Initiative (DELPH-IN) and can be processed by a number of parsing and realization systems, including the LKB grammar engineering environment and, for applications, the more efficient PET and ACE run-time parsers, among others.
For easy exploration of ERG analyses, there is an on-line interface for parsing individual utterances. This on-line demonstration, however, is limited both in the computational resources available to it and in the range of views it offers on the syntactic and semantic analyses the ERG provides. To get started using the ERG on your own hardware, we recommend our instructions for novice users.
The early development and first application of the ERG was in the Verbmobil spoken language machine translation project. CSLI was responsible for building the English grammar for the deep-processing component of Verbmobil, which used a semantic transfer approach requiring both parsing and generation of conversational English dialogues. Since then, the ERG has been used in a commercial application providing automatic responses to customer email messages; in a second machine translation research project, LOGON, for translation from Norwegian to English; and most recently for grammar correction in a large-scale online English Language Arts course used by tens of thousands of elementary school students (see EPGY). The grammar is also being used in continuing research on parsing larger corpora, including the English Wikipedia (see the WeSearch project) and the familiar newspaper text of the Wall Street Journal (see the DeepBank project).
Dan Flickinger (CSLI) is the principal ERG developer. Other individuals who have made major contributions to the grammar are Emily Bender (University of Washington), Ann Copestake (University of Cambridge), Rob Malouf (San Diego State University), and Stephan Oepen (University of Oslo). Several former graduate students (Brady Clark, Judith Tonhauser, Kathryn Campbell-Kibler, Martina Faller, Ash Asudeh, Susanne Riehemann) and visiting graduate students (Jesse Tseng, University of Edinburgh; Ken Bame, Ohio State University; Judith Eckle-Kohler, University of Stuttgart) have also done detailed work, including building the lexicon, developing test suites, isolating phenomena found in corpora, and developing analyses in the HPSG formalism. Jeff Smith (Professor, San Jose State University) has also spent time at CSLI developing various aspects of the grammar. In addition to this direct implementation work, weekly technical project meetings have provided an important forum for critique of specific analyses, particularly from Ivan Sag (Professor, Stanford) and Tom Wasow (Professor, Stanford).
Dan Flickinger (2011) Accuracy vs. Robustness in Grammar Engineering. In E.M. Bender and J.E. Arnold (eds.) Language from a Cognitive Perspective: Grammar, Usage, and Processing, Stanford: CSLI Publications, pp. 31-50.
Dan Flickinger (2002) On Building a More Efficient Grammar by Exploiting Types. In Stephan Oepen, Dan Flickinger, Jun'ichi Tsujii and Hans Uszkoreit (eds.) Collaborative Language Engineering, Stanford: CSLI Publications, pp. 1-17.
Ann Copestake, Dan Flickinger, Ivan A. Sag and Carl J. Pollard (1999) Minimal Recursion Semantics: An Introduction.
Ann Copestake and Dan Flickinger (2000) An Open-Source Grammar Development Environment and Broad-Coverage English Grammar Using HPSG. In Proceedings of the Second Conference on Language Resources and Evaluation (LREC 2000), Athens, Greece.