  1. Abstract
  2. The skills needed to effectively analyze faces
  3. The most widely accepted model
  4. How the brain processes the face
  5. Emotions: discrete and clear-cut or fuzzy categorization
  6. Conclusion
  7. References

Humans inherently focus on the face to understand other people's emotional states. Facial expressions are used to communicate worldwide, suggesting they are innate and universal. This essay addresses the various ways in which the brain processes facial information. Specific evidence is given for the part-based, gestalt, and configural models. The evidence presented also bears on the time course of processing identity, gaze, and expression. The essay also weighs categorical emotion perception against two-dimensional emotion theories. Computational models shed light on the inner workings of our perception of emotion in categories.

Keywords: Russell, Fernández-Dols, Ellison, Massaro

[...] In truth, subjects did place sharp boundaries between the categories, leading some researchers to conclude that people cannot help but perceive a distinct emotion whenever they see a face (p. 1159). Other scholars hold that emotion is perceived most basically along two dimensions, arousal-sleep and displeasure-pleasure, and that the categories are derived from these dimensions (Russell, p. 304). If these dimensions were visualized as a set of axes, depressed expressions would fall in the quadrant mixing sleep and displeasure, while excited expressions would fall in the quadrant mixing arousal and pleasure. [...]
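The quadrant reading of Russell's two-dimensional account can be sketched in a few lines. This is only an illustration of the geometry described above: an expression is treated as a point on the arousal-sleep and displeasure-pleasure axes, and a category label is read off from the quadrant it falls in. The coordinate values and the label names are illustrative assumptions, not figures from Russell.

```python
def quadrant_label(arousal: float, pleasure: float) -> str:
    """Map a point in the two-dimensional emotion space to a quadrant label.

    Positive arousal = aroused, negative = sleep;
    positive pleasure = pleasure, negative = displeasure.
    """
    if arousal >= 0 and pleasure >= 0:
        return "excitement"    # arousal + pleasure
    if arousal >= 0 and pleasure < 0:
        return "distress"      # arousal + displeasure
    if arousal < 0 and pleasure < 0:
        return "depression"    # sleep + displeasure
    return "contentment"       # sleep + pleasure

# Illustrative points (arousal, pleasure) for the two examples in the text:
print(quadrant_label(0.8, 0.7))    # excited expression -> "excitement"
print(quadrant_label(-0.6, -0.7))  # depressed expression -> "depression"
```

On this view, the discrete categories are not primitive: they are regions of a continuous space, which is why expressions near a quadrant boundary are predicted to be ambiguous.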

[...] Three models have been developed to explain the way that emotional information is portrayed in a face. The purely categorical model explains individual expressions as acting like words in a language: by moving an eyebrow or crinkling a nose, one is displaying "arbitrary symbols whose meanings are determined by convention" (p. 230). Evidence against this model comes from the appropriate reactions of newborns to different facial expressions. A second model, the componential model, theorizes that some components contributing to a given expression are inherently meaningful. [...]

[...] In 1997, Ellison and Massaro arrived at a completely opposite conclusion for face processing. They created the part-based model, which states that "facial expressions are represented and identified in terms of their individual parts or features" (as cited). Using a computer-generated face, Ellison and Massaro manipulated the lips and the eyebrows in five different ways each. The feature positions were combined factorially to create 25 full-face images. The tests showed that "participants' responses to the whole-face images could be reliably predicted from their responses to the half-face images" (as cited). [...]
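The logic of that factorial design can be sketched as follows. Under a part-based account in the spirit of Massaro's fuzzy logical model of perception, each feature position carries an independent degree of support for a given emotion, and the predicted response to a whole face combines those supports multiplicatively, normalized against the complementary evidence. The support values below are illustrative assumptions, not Ellison and Massaro's data; only the 5 × 5 = 25 structure comes from the text.

```python
from itertools import product

# Illustrative per-feature support for "happy", one value per position.
brow_support = [0.1, 0.3, 0.5, 0.7, 0.9]  # 5 eyebrow positions
lip_support = [0.1, 0.3, 0.5, 0.7, 0.9]   # 5 lip positions

def predict_happy(brow: float, lip: float) -> float:
    """Predicted P("happy") for a whole face from its two feature supports.

    Multiplies the independent supports and normalizes against the
    complementary (1 - support) evidence, as in Massaro's FLMP rule.
    """
    happy = brow * lip
    not_happy = (1 - brow) * (1 - lip)
    return happy / (happy + not_happy)

# 5 x 5 = 25 full-face combinations, mirroring the factorial design.
predictions = {
    (b, l): predict_happy(brow_support[b], lip_support[l])
    for b, l in product(range(5), range(5))
}
print(len(predictions))  # 25
```

This is what makes the part-based claim testable: if whole-face responses really are composed from independent feature responses, the 25 whole-face judgments should be predictable from the 10 half-face judgments, which is the pattern Ellison and Massaro report.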
