*****************************************************************************
REMINDER CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission Deadline: June 4th, 2012
*****************************************************************************
The first workshop on "Context Based Affect Recognition" (CBAR12, http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing (SocialCom2012, http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts toward automatic recognition of human affect from facial, vocal, bodily, and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g., depression, anxiety, and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is always displayed in a particular context, shaped by factors such as the task at hand, the other people involved, and the identity and natural expressiveness of the individual. The context tells us which expressions are more likely to occur and can thus bias a classifier toward the most likely and relevant classes. Without context, even humans may misinterpret an observed facial expression. By tackling the issues of context-based affect recognition, i.e., the careful study of contextual information and its relevance to domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
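As a purely illustrative sketch of this biasing idea (not part of the call; the class names, probabilities, and function below are all invented for illustration), a context prior can reweight the posterior of a context-free expression classifier:

import numpy as np

# Hypothetical sketch: a context prior p(class | context) reweights the
# posterior p(class | signal) of a context-free classifier, assuming a
# uniform base rate over classes, so context-plausible classes are favored.
CLASSES = ["neutral", "happy", "stressed", "fatigued"]

def rescore_with_context(posterior, context_prior):
    posterior = np.asarray(posterior, dtype=float)
    context_prior = np.asarray(context_prior, dtype=float)
    scores = posterior * context_prior   # elementwise Bayesian reweighting
    return scores / scores.sum()         # renormalize to a distribution

# In a driver-monitoring context, "fatigued" is far more plausible a priori.
p_signal = [0.30, 0.10, 0.25, 0.35]      # output of a context-free classifier
p_context = [0.20, 0.05, 0.15, 0.60]     # learned or hand-set per domain

print(dict(zip(CLASSES, rescore_with_context(p_signal, p_context).round(3))))

Here the context prior could itself be learned from domain-labeled data; the point is only that context enters the decision as an explicit factor.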
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis is still a largely unexplored area of automatic affect recognition, given the difficulty of modeling contextual variables and of introducing them into the classification process. Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Contextual information helps us interpret and respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation, and recognition. We wish to investigate what methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, what contextual information is relevant (i.e., is it domain specific or not?), and how it can improve the performance of existing frameworks for affect recognition.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedding contextual information, such as culture, gives a different flavor to each interaction and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop, we invite scientists working in the related areas of affective computing, ambient computing, machine learning, psychology, and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications (context-related affect applications)
For details concerning the workshop program, paper submission guidelines, etc., please visit our workshop website at:
http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
Dear Sharon,
Here's a somewhat shameless plug for our just-published database. It contains more than 50 expressions from 20 actors, at two intensities and from three camera angles (resolution is PAL). The database is NOT FACS-coded (yet), and we'd love help and pointers for doing so. Two sets of validation experiments are published with the database as well.
K. Kaulard, D. W. Cunningham, H. H. Bülthoff, and C. Wallraven (2012). The MPI facial expression database - a validated database of emotional and conversational facial expressions. PLoS ONE.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0032321
Also, in case you are interested, here are two previous papers on dynamic expression recognition from our lab.
D. W. Cunningham and C. Wallraven. Temporal information for the recognition of conversational expressions. Journal of Vision, 9(13):1-17, December 2009.
M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff. The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8(8):1-23, June 2008.
Best
Christian
On Jun 2, 2012, at 10:58 PM, Sharon Gilad-Gutnick wrote:
> Hi,
>
> I am planning an experiment that measures expression recognition from dynamic face information. Does anyone know of a video stimulus set that I might be able to use? Male and female faces would both be good. Specifically, I am interested in faces that progress from neutral to different expressions.
>
> Thanks,
> Sharon Gutnick.
>
> sharongilad1(a)gmail.com
>
> --
> Sharon Gilad-Gutnick
> (Visiting Graduate Student)
> Sinha Lab for Vision Research
> MIT Department of Brain and Cognitive Sciences
> 46-4089
> 77 Massachusetts Avenue, Cambridge, MA 02139
--
Christian Wallraven
Cognitive Systems Lab
Dept. of Brain & Cognitive Engineering
Korea University
email: wallraven(a)korea.ac.kr
web: cogsys.korea.ac.kr