
Archive for the ‘hlt’ Category

Shallow parsing (also called chunking or “light parsing”) is an analysis of a sentence that identifies its constituents (noun groups, verbs, verb groups, etc.) but specifies neither their internal structure nor their role in the main sentence.

It is a technique widely used in natural language processing. It is similar to the concept of lexical analysis for computer languages.

There are several parsing methods:

  1. Voting model
  2. Maximum entropy
  3. Language model
  4. Support vector machines  
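Before any of these statistical methods, the simplest way to see what chunking does is a rule-based sketch: group maximal runs of determiner/adjective/noun tags into noun groups. The function below is a minimal illustration, not one of the methods listed above; the tag names follow the Penn Treebank convention and the sentence is invented.

```python
# Minimal rule-based noun-group chunker over pre-tagged tokens.
# Tags follow the Penn Treebank convention (DT, JJ, NN, VBD, IN, ...).

def chunk_noun_groups(tagged):
    """Collect maximal runs of determiner/adjective/noun tags as chunks."""
    noun_group_tags = {"DT", "JJ", "NN", "NNS", "NNP"}
    chunks, current = [], []
    for word, tag in tagged:
        if tag in noun_group_tags:
            current.append(word)
        elif current:
            chunks.append(" ".join(current))
            current = []
    if current:
        chunks.append(" ".join(current))
    return chunks

sentence = [("The", "DT"), ("quick", "JJ"), ("fox", "NN"),
            ("jumped", "VBD"), ("over", "IN"),
            ("the", "DT"), ("lazy", "JJ"), ("dog", "NN")]
print(chunk_noun_groups(sentence))  # ['The quick fox', 'the lazy dog']
```

Note that the chunker finds the noun groups but says nothing about their internal structure or grammatical role, which is exactly the "shallow" part.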
Read Full Post »

Computational semantics is the study of how to automate the process of constructing and reasoning with meaning representations of natural language expressions. It consequently plays an important role in natural language processing and computational linguistics.

Matthew Stone. Towards a Computational Account of Knowledge, Action and Inference in Instructions. To appear in Journal of Language and Computation, 2000.
I consider abstract instructions, which provide indirect descriptions of actions in cases where the speaker has key information that a hearer can use to identify the right action to perform, but the speaker alone cannot identify that action. The communicative effects of such instructions, that the hearer should know what to do, are in effect implicatures.

Computational semantics shares with formal semantics research in linguistics and philosophy an absolute commitment to formalizing the meanings of sentences and discourses exactly. The differences among these fields reflect their overall enterprises. Linguistic semantics, for example, seeks an account of human knowledge of meaning that explains crosslinguistic variation and human language learnability. Philosophical semantics aims to situate knowledge of meaning within a general understanding of the intentionality of human mental states.
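A toy example of "constructing meaning representations" is compositional interpretation: each word denotes a constant or a function, and meanings combine by function application. The lexicon, names, and predicate notation below are purely illustrative assumptions, not a real computational-semantics system.

```python
# Toy compositional semantics: words denote constants or functions,
# and sentence meanings are built by function application.
# All names and predicates here are invented for illustration.

lexicon = {
    "John":  "john",
    "Mary":  "mary",
    "walks": lambda subj: f"walks({subj})",
    "sees":  lambda obj: lambda subj: f"sees({subj},{obj})",
}

def interpret(words):
    """Interpret [Subj Verb] or [Subj Verb Obj] by applying the verb
    meaning to its arguments, yielding a first-order-style formula."""
    if len(words) == 2:                      # intransitive: Subj V
        subj, verb = words
        return lexicon[verb](lexicon[subj])
    subj, verb, obj = words                  # transitive: Subj V Obj
    return lexicon[verb](lexicon[obj])(lexicon[subj])

print(interpret(["John", "walks"]))         # walks(john)
print(interpret(["John", "sees", "Mary"]))  # sees(john,mary)
```

Once sentences are mapped to formulas like these, the "reasoning" half of computational semantics can operate on them with a theorem prover or model checker.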

Read Full Post »

Natural logic is the group of terms and rules that comes with natural language and allows us to reason and argue in it. Examples of logical terms are: “and”, “or”, “not”, “true”, “false”, “if”, “therefore”, “every”, “some”, “necessary”…

We presuppose Natural Logic in much the same way as we presuppose Natural Language: as something we have to start with and make precise later, and that may well come to be revised or extended quite seriously, but also as something that seems, at least in part, to be given in more or less the same way to any able speaker of a Natural Language. It contains a considerable number of terms and (usually implicit) rules that enable every speaker of the language to argue and reason, and that every speaker knows and has extensive experience with.
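The quantifier terms in the list above ("every", "some") can be given precise truth conditions as relations between sets, which is one standard way of making natural logic formal. The sets and entity names below are invented for illustration.

```python
# Truth conditions for a few natural-logic quantifiers, modelled as
# relations between sets. The example sets are invented.

birds  = {"tweety", "woodstock", "polly"}
flyers = {"tweety", "woodstock", "polly", "superman"}
fish   = {"nemo"}

def every(a, b):
    return a <= b           # "every A is B": A is a subset of B

def some(a, b):
    return bool(a & b)      # "some A is B": A and B overlap

def no(a, b):
    return not (a & b)      # "no A is B": A and B are disjoint

print(every(birds, flyers))  # True:  every bird flies
print(some(birds, fish))     # False: no bird is a fish
print(no(fish, flyers))      # True:  no fish flies
```

Rules like "every A is B, and x is an A, therefore x is a B" then fall out as set-theoretic facts, which is the sense in which the logic comes "with" the language.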

Interface generally refers to an abstraction that an entity provides of itself to the outside. This separates the methods of external communication from internal operation, allows the entity to be internally modified without affecting the way outside entities interact with it, and lets it provide multiple abstractions of itself. It may also provide a means of translation between entities which do not speak the same language, such as between a human and a computer. Because interfaces are a form of indirection, some additional overhead is incurred versus direct communication.

The interface between a human and a computer is called a user interface. Interfaces between hardware components are physical interfaces. This article deals with software interfaces, which exist between separate software components and provide a programmatic mechanism by which these components can communicate.
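A small sketch of the software-interface idea: callers depend only on an abstract interface, so the implementation behind it can be swapped without changing them. The `Storage` interface and class names here are invented for illustration, using Python's standard `abc` module.

```python
# A software interface sketched with Python's abc module. Callers depend
# only on the abstract Storage interface, so the backing implementation
# can change without affecting them. Names are illustrative.

from abc import ABC, abstractmethod

class Storage(ABC):
    """The interface: what the component promises to the outside."""
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...

class MemoryStorage(Storage):
    """One internal implementation; callers never see the dict inside."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

def store_greeting(storage: Storage):
    """Client code written against the interface, not the implementation."""
    storage.put("greeting", "hello")
    return storage.get("greeting")

print(store_greeting(MemoryStorage()))  # hello
```

Replacing `MemoryStorage` with, say, a file-backed implementation of the same two methods would leave `store_greeting` untouched, which is the decoupling the paragraph above describes.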

Read Full Post »

Topics list

  1. Common Language Resources and Technology Infrastructure
  2. Dialoging NPCs in natural game environments
  3. Question answering learning technologies in a multilingual and multimodal environment
  4. Computational Semantics
  5. Natural logic and interface
  6. Semantic taxonomy introduction
  7. Shallow semantic parsing
  8. Knowledge Representation from Text
  9. Lexikoaren behatokia

Read Full Post »

MARTIN KAY

Kay was responsible for introducing the notion of chart parsing in computational linguistics, and the notion of unification in linguistics generally. With Ron Kaplan, he pioneered finite-state morphology. He has been a longtime contributor to, and critic of, work on machine translation. Permanent chairman of the International Committee on Computational Linguistics, Kay was a Research Fellow at the Xerox Palo Alto Research Center until 2002. Gothenburg University has made him an honorary Filosofi Doktor.

YORICK WILKS

He is a professor at the University of Sheffield. He has taught courses and delivered lecture series on functional programming languages, artificial intelligence, intelligent knowledge-based systems, research in computational linguistics, theoretical linguistics, machine translation, philosophy of language, foundations of artificial intelligence, and logic and language, and he introduced a new first-year undergraduate artificial intelligence course in 2001. He has also published many books, such as Machine Translation: its scope and limits, Readings in the Lexicon and Readings in Machine Translation. He has also won many prizes and honours: the 2008 Zampolli Prize (ELDA, awarded at LREC-08 in Marrakech, Morocco), the 2008 Lifetime Achievement Award (ACL, awarded at ACL-08 in Columbus, OH), and a visiting professorship at the University of Oxford (2006-).

*Martin Kay (2004, October 21). In Stanford Department of Linguistics. Retrieved 11:41, March 18, 2009, from http://www-linguistics.stanford.edu/people/pages/kay.shtml

*Yorick Wilks (2007, April 17). In University of Sheffield, Department of Computer Science. Retrieved 11:50, March 18, 2009, from http://www.dcs.shef.ac.uk/~yorick/cv.html


Read Full Post »

The term linguistic resources refers to (usually large) sets of language data and descriptions in machine readable form, to be used in building, improving, or evaluating natural language (NL) and speech algorithms or systems. Examples of linguistic resources are written and spoken corpora, lexical databases, grammars, and terminologies, although the term may be extended to include basic software tools for the preparation, collection, management, or use of other resources.

Language technology is often called human language technology (HLT) or natural language processing (NLP); it has computational linguistics (CL) and speech technology at its core, but it also includes many application-oriented aspects of them. Language technology is closely connected to computer science and general linguistics.

Natural language processing (NLP) is a field of computer science concerned with the interactions between computers and human (natural) languages. Natural language generation systems convert information from computer databases into readable human language. Natural language understanding systems convert samples of human language into more formal representations that are easier for computer programs to manipulate. Many problems within NLP apply to both generation and understanding; for example, a computer must be able to model morphology (the structure of words) in order to understand an English sentence, but a model of morphology is also needed for producing a grammatically correct English sentence.
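The point that one morphology model serves both understanding and generation can be sketched with a tiny, deliberately simplified model of English plurals: the same suffix knowledge drives analysis (plural to stem) in one direction and generation (stem to plural) in the other. The rules below cover only regular forms and are an illustrative assumption, not a full morphological analyzer.

```python
# A deliberately tiny model of English plural morphology, used in both
# directions: analysis (understanding) and generation. Regular forms only.

RULES = [("ies", "y"), ("es", ""), ("s", "")]  # plural suffix -> stem suffix

def analyse(plural):
    """Understanding: strip a plural suffix to recover the stem."""
    for suffix, replacement in RULES:
        if plural.endswith(suffix):
            return plural[: -len(suffix)] + replacement
    return plural

def generate(stem):
    """Generation: attach the appropriate plural suffix to a stem."""
    if stem.endswith("y"):
        return stem[:-1] + "ies"
    if stem.endswith(("s", "x", "ch", "sh")):
        return stem + "es"
    return stem + "s"

print(analyse("cities"))  # city
print(generate("box"))    # boxes
```

An understanding system would call `analyse` while parsing input, and a generation system would call `generate` while producing output; the shared rule knowledge is the overlap the paragraph above describes.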

Read Full Post »