*****************************************************************************
CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission Deadline: May 11th, 2012
*****************************************************************************
The first workshop on "Context Based Affect Recognition" (CBAR12, http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing, SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts toward automatic recognition of human affect using facial, vocal, bodily, and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver-fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g., depression, anxiety, and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is always displayed in a particular context, such as the task at hand, the other people involved, and the identity and natural expressiveness of the individual. The context tells us which expressions are more likely to occur, and thus can bias the classifier toward the most likely and relevant classes. Without context, even humans may misunderstand an observed facial expression. By tackling the issues of context-based affect recognition, namely the careful study of contextual information and its relevance to domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
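To make the biasing idea concrete, context can enter a probabilistic classifier as a class prior. The following minimal Python sketch is purely illustrative (the scores, the prior, and the conditional-independence assumption are ours, not a method proposed by the workshop):

    import numpy as np

    EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

    # p(features | class): e.g., scores from any context-free classifier.
    likelihood = np.array([0.10, 0.05, 0.15, 0.35, 0.20, 0.15])

    # p(class | context): hypothetical prior for one specific context,
    # e.g. a monitoring task where sadness and anger are the states of interest.
    context_prior = np.array([0.25, 0.05, 0.10, 0.10, 0.40, 0.10])

    # Bayes' rule, assuming features are conditionally independent of the
    # context given the class:
    #   p(class | features, context) is proportional to
    #   p(features | class) * p(class | context)
    posterior = likelihood * context_prior
    posterior /= posterior.sum()

    print(dict(zip(EMOTIONS, posterior.round(3))))

Under a uniform prior the classifier would pick "happiness"; the context prior shifts the decision toward "sadness", illustrating how context can re-rank otherwise ambiguous classes.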
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis is still a largely unexplored area in automatic affect recognition, given the difficulty of modeling this variable and of introducing it into the classification process. Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Contextual information helps us interpret and respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation, and recognition. We wish to investigate what methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, what contextual information is relevant (i.e., is it domain-specific or not?), and how it can improve the performance of existing frameworks for affect recognition.
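As one hypothetical way to make the representation question concrete, contextual metadata could be attached to each corpus sample as a structured annotation. The Python sketch below is an illustrative assumption, not an established schema; it records the kinds of context mentioned above (the task, the other people involved, culture, and the identity of the individual):

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ContextAnnotation:
        task: Optional[str] = None        # e.g., "driving", "tutoring session"
        interlocutors: int = 0            # number of other people involved
        culture: Optional[str] = None     # cultural background, if recorded
        subject_id: Optional[str] = None  # identity, for per-person expressiveness
        extras: dict = field(default_factory=dict)  # domain-specific information

    @dataclass
    class CorpusSample:
        recording_path: str               # audio/video recording of the sample
        label: str                        # affect label, e.g., "happiness"
        context: ContextAnnotation = field(default_factory=ContextAnnotation)

    # Example annotation for a hypothetical recording:
    sample = CorpusSample(
        recording_path="recordings/s01_take3.avi",
        label="happiness",
        context=ContextAnnotation(task="tutoring session", interlocutors=1),
    )

Which of these fields matter, and whether a flat schema like this suffices, is exactly the kind of question the workshop aims to discuss.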
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedding contextual information, such as culture, gives each interaction a different flavor and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop we invite scientists working in the related areas of affective computing, ambient computing, machine learning, psychology, and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications of context-based affect recognition
For details concerning the workshop program, paper submission guidelines, etc. please visit our workshop website at: http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction, Facial Expression Recognition, Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
face-place.org
The CAFE dataset (10 people, each displaying all 6 basic emotions; FACS-certified) is downloadable from my website:
http://cseweb.ucsd.edu/users/gary/CAFE/
There are plenty of others, of course. The NimStim set is available from Nim Tottenham: nimtottenham(a)ucla.edu
Gary Cottrell 858-534-6640 FAX: 858-534-7029
Computer Science and Engineering 0404
IF USING FED EX INCLUDE THE FOLLOWING LINE:
CSE Building, Room 4130
University of California San Diego
9500 Gilman Drive # 0404
La Jolla, CA 92093-0404
"A grapefruit is a lemon that saw an opportunity and took advantage of it." - note written on a door in Amsterdam on Lijnbaansgracht.
"Physical reality is great, but it has a lousy search function." -Matt Tong
"Only connect!" -E.M. Forster
"I am awaiting the day when people remember the fact that discovery does not work by deciding what you want and then discovering it."
-David Mermin
Email: gary(a)ucsd.edu
Home page: http://www-cse.ucsd.edu/~gary/
Dear Marieke,
1. There is a very good set at Radboud University in the Netherlands (the Radboud Faces Database); Google them.
2. We have a synthetic face generator (at the University of Cape Town) that also generates a number of emotions in the faces. Tests show that the emotions are well recognised by human observers. The database underlying the model is quite small, but the resulting stimuli are quite good. We would be happy for you to use it, if you would like.
--
Colin Tredoux
Professor
Dept Psychology UCT
South Africa
colin.tredoux(a)uct.ac.za
Tel: +2721 6503424
Fax: +2721 6504104
Dear all,
I am a Master's student at the University of Sussex and am about to start
my dissertation project on mirror neurone response to emotional faces. I am
currently trying to track down a set of emotional face images to use as
stimuli. I had hoped to find a set in my department, but so far have had no
luck. If someone could point me in the right direction I would be very
grateful.
Many thanks.
Marieke
Dear all,
I would very much appreciate it if you could circulate the following PhD opportunities to any interested students/lab members:
A fully-funded PhD studentship at Bournemouth University (UK) is available, for a project entitled "Face facts: Why face processing skills should be improved in forensic and national security settings." The scholarship covers tuition fees for three years, and provides a £14,000 per year stipend.
A number of Vice-Chancellor Doctoral scholarships (fee waiver only) are also available, and I would be very happy to hear from potential candidates interested in carrying out investigations into face processing, particularly examining face recognition impairments in adults and children with prosopagnosia, autistic spectrum disorder, or Moebius syndrome.
Further information can be obtained by contacting me directly (sbate(a)bournemouth.ac.uk).
Many thanks for your help,
Sarah