Dear Sharon,
Here's a somewhat shameless plug for our just-published database. It contains >50
expressions from 20 actors at two intensities and from three camera angles (the resolution
is PAL). The database is NOT FACS-coded (yet), and we'd love help and pointers for doing
so. Two sets of validation experiments are also published with the database.
K. Kaulard, D. W. Cunningham, H. H. Bülthoff, and C. Wallraven (2012). The MPI facial
expression database - a validated database of emotional and conversational facial
expressions. PLoS ONE.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0032321
Also, in case you are interested, here are two previous papers on dynamic expression
recognition from our lab.
D. W. Cunningham and C. Wallraven. Temporal information for the recognition of
conversational expressions. Journal of Vision, 9(13):1-17, December 2009.
M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff. The contribution of
different facial regions to the recognition of conversational expressions. Journal of
Vision, 8(8):1-23, June 2008.
Best
Christian
On Jun 2, 2012, at 10:58 PM, Sharon Gilad-Gutnick wrote:
Hi,
I am planning an experiment that measures expression recognition from dynamic face
information. Does anyone know of a video stimulus set that I might be able to use?
Male/female faces would both be good. Specifically, I am interested in faces that
progress from neutral to different expressions.
Thanks,
Sharon Gutnick.
sharongilad1(a)gmail.com
--
Sharon Gilad-Gutnick
(Visiting Graduate Student)
Sinha Lab for Vision Research
MIT Department of Brain and Cognitive Sciences
46-4089
77 Massachusetts Avenue, Cambridge, MA 02139
_______________________________________________
Face-research-list mailing list
Face-research-list(a)lists.stir.ac.uk
http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
--
Christian Wallraven
Cognitive Systems Lab
Dept. of Brain & Cognitive Engineering
Korea University
email: wallraven(a)korea.ac.kr
web: cogsys.korea.ac.kr