Since 1987 - Covering the Fastest Computers in the World and the People Who Run Them

June 9, 2011

Bringing Natural Language Processing Home

Nicole Hemsoth

IBM’s Watson supercomputer did more than bring advances in HPC hardware to mainstream audiences; it also ushered the concept of natural language processing (NLP) into the spotlight.

As a recent article that explores the importance of NLP in clinical settings revealed, researchers have always had a fascination with bringing sophisticated “humanlike” technology into the highly personal realm of medicine.  

As Dr. Ronlinda Lacson noted, the desire to bring natural language processing into the clinical setting dates back well over 40 years before the famous Jeopardy quiz show appearance demonstrated the might of HPC and NLP. Natural language processing first caught public attention back in the mid-1960s in the form of an automated psychologist called ELIZA. This keyword-driven machine-shrink could hold a basic question-driven conversation with a human patient, engaging in the kind of dialogue that might take place in a clinical office setting.

At this moment, the key role for Watson as it moves from the limelight into practical use is also in the medical field, doing the diagnostic work that was once the distinct domain of highly trained human physicians.

According to Lacson, the growing sophistication of NLP is making the machine-doctor concept seem far less like science fiction than it did in the 1960s.

She notes, however, that there are many aspects to natural language processing that are worth keeping in mind when we see public examples such as IBM’s Watson.

Lacson explained that the components of NLP are as follows:

  • Morphological knowledge—How words are constructed from basic units, or morphemes. In “The nodule is smaller,” the two morphemes ‘small’ and ‘er’ (suffix) convey a comparison built on the root word ‘small.’
  • Lexical knowledge—References the meaning of individual words, which software can delineate with the word sense or parts of speech.
  • Syntax knowledge—The structuring of words within a sentence.
  • Semantic knowledge—The way in which the meanings of individual words combine to form the meaning of a sentence.
  • Discourse knowledge—Understanding text across adjacent sentences. This can include anaphora resolution, wherein a pronoun is recognized as referring to an entity mentioned in a previous sentence.
  • Pragmatic knowledge—Comprehension of sentences in various contexts, where world knowledge, or a user’s goals and beliefs, is invoked to grasp abstract or non-literal meanings.
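The morphological level described above is the easiest to illustrate concretely. The following is a hypothetical toy sketch of naive suffix stripping, using the example word from Lacson’s list; it is not drawn from her work or from Watson, and the suffix list is an assumption chosen for illustration only:

```python
# Toy morphological analysis: split a word into root + known suffix.
# The suffix inventory here is an illustrative assumption, not a real lexicon.
SUFFIXES = ("er", "est", "ing", "ed", "s")

def morphemes(word):
    """Return (root, suffix) if a known suffix applies, else (word,)."""
    for suffix in SUFFIXES:
        root = word[: -len(suffix)]
        # Require a minimally long root so e.g. "her" isn't split as "h" + "er".
        if word.endswith(suffix) and len(root) >= 3:
            return (root, suffix)
    return (word,)

print(morphemes("smaller"))  # ('small', 'er') — comparative suffix on root 'small'
```

Even this trivial splitter hints at why the harder levels matter: knowing that ‘er’ marks a comparison is lexical and semantic knowledge, which a pure string operation cannot supply.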

She says that the difficulty goes beyond understanding and implementing each of these elements individually; the greater challenge now lies in integrating them into a coherent whole.

Full story at JACR
