Face processing models suggest a neural and functional dissociation between processing facial identity and expressive states. This may explain why the relationship between the ability to read emotional meaning from expressive cues and the ability to remember faces from appearance cues remains a relatively uncharted area of inquiry. One of the fundamental ways the human face differs from other visual objects is that people read complex social meaning into faces. We therefore hypothesized that an important component of face memory is the extent to which people read such meaning into faces. Specifically, we predicted that (a) individual differences in the ability to decode emotional messages from expressive faces would be positively associated with the ability to encode and subsequently remember a separate set of neutral faces in the same participants, and (b) stimulus-level differences in the extent to which a separate group of raters ascribed emotionality to these same neutral faces would also be positively associated with face memory. We report evidence supporting both hypotheses. These findings suggest that individual differences in emotional state reasoning from faces, at both the decoder and encoder levels, are meaningfully associated with the ability to remember facial identity.