The introduction of computers into the field of language study during the 1950s had a profound effect on the formalization and mathematization of linguistic theory. This thesis provides a historical overview of the interaction between theoretical linguistics and computational linguistics for natural language processing, particularly highlighting the differences between formalist and functionalist approaches. It is asserted that formalism in theoretical linguistics was largely a result of a drive for computability, but that a more functionalist approach to both theoretical and computational linguistics is required to truly capture the nuances of natural language.

The influences leading to a renewed interest in functional, and particularly cognitive, linguistic theories as a basis for theorization are noted. In particular, Langacker's theory of cognitive grammar is chosen as the basis for a theory and implementation of natural language processing (NLP).

The NLP system developed in this research is called Computational Cognitive Linguistics (CCL). To develop the theory for CCL, the cogent aspects of Langacker's linguistic theory are summarized, and the shortcomings of traditional semantic representations as an implementation medium for CCL are demonstrated. In their place, a new semantic representation, called L-space, is developed as a mathematical model, a computational implementation, and a cognitive interpretation.

An example of CCL processing is provided, along with a description of the application of CCL to a real-world system: the U.S. Army's Crisis Action Message Analyzer (CAMA).