Dear colleagues, please excuse the mass email.
I am currently looking to appoint two full-time researchers, a Postdoctoral Research Associate (salary range: £31,948 - £35,938) and a Research Assistant (salary range: £26,004 - £29,249), to assist with a five-year project funded by the European Research Council.
The project will investigate the effects of exogenous and endogenous hormones on aspects of women's mate preferences and choices, emotion processing, appearance, and sexual behaviour. There is also plenty of scope within the project for the appointed researchers to develop their own strands of research, as well as to contribute to other ongoing projects in our lab (www.facelab.org).
Please note that, by the time of appointment, our lab will have relocated to the Research Institute of Neuroscience and Psychology at the University of Glasgow (www.gla.ac.uk/researchinstitutes/neurosciencepsychology/).
Please pass these details on to any students, postdocs, or mailing lists you think might be interested. I can be contacted at ben.jones(a)abdn.ac.uk with queries about the positions and project.
More details about the positions, including details of how to apply, are given here:
Postdoctoral Research Associate: http://facelab.org/postdoc
Research Assistant: http://facelab.org/ra
Best wishes and kind regards,
Ben Jones
Benedict Jones, PhD
Personal Chair in Psychology
Face Research Laboratory
University of Aberdeen
Scotland, UK
www.facelab.org
*****************************************************************************
CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission
Deadline: May 11th, 2012
*****************************************************************************
The first workshop on Context-Based Affect Recognition, CBAR12 (http://contextbasedaffectrecog.blogspot.com/), will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing, SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed increasing efforts toward the automatic recognition of human affect using facial, vocal, body, and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver-fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g. depression, anxiety, and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, affect is always displayed in a particular context, such as the task at hand, the other people involved, and the identity and natural expressiveness of the individual. Context tells us which expressions are more likely to occur, and can thus bias a classifier toward the most likely or relevant classes. Without context, even humans may misunderstand an observed facial expression. By tackling the issues of context-based affect recognition, i.e. the careful study of contextual information and its relevance in domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
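As a minimal sketch of what such biasing could look like (not part of the call itself; the six-class expression set, the driver-monitoring context, and all probability values below are made up purely for illustration), one simple option is to treat context as a prior over expression classes and combine it with a classifier's per-class likelihoods via Bayes' rule:

import numpy as np

# Hypothetical illustration: context as a prior over expression classes.
CLASSES = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def context_biased_posterior(likelihoods, context_prior):
    """p(class | features, context) is proportional to
    p(features | class) * p(class | context)."""
    unnormalised = likelihoods * context_prior
    return unnormalised / unnormalised.sum()

# Per-class scores from some face-expression classifier for one image
# (invented numbers).
likelihoods = np.array([0.15, 0.05, 0.20, 0.25, 0.15, 0.20])

# A driver-monitoring context might make fear and surprise more plausible
# and disgust less so (invented prior).
driving_prior = np.array([0.10, 0.02, 0.30, 0.13, 0.15, 0.30])

posterior = context_biased_posterior(likelihoods, driving_prior)
for cls, p in zip(CLASSES, posterior):
    print(f"{cls}: {p:.3f}")

Any classifier producing per-class scores could be plugged in; the point is only that the context prior reweights the posterior toward context-plausible classes.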
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis is still a largely unexplored area for automatic affect recognition, given the difficulty of modeling contextual variables and of introducing them into the classification process. Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Contextual information helps us interpret and respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation, and recognition. We wish to investigate what methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, what contextual information is relevant (i.e. is it domain-specific or not?), and how it will improve the performance of existing frameworks for affect recognition.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedding contextual information, such as culture, gives each interaction a different flavor and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop we invite scientists working in related areas of affective computing, ambient computing, machine learning, psychology, and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on topics including, but not limited to:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications (context-related affect applications)
For details concerning the workshop program, paper submission guidelines, etc., please visit the workshop website: http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
face-place.org
mike@iPhone
On Apr 10, 2012, at 5:00 AM, face-research-list-request(a)lists.stir.ac.uk wrote:
> Dear all,
>
> I am a Masters student at the University of Sussex and am about to start
> my dissertation project on mirror neurone response to emotional faces. I am
> currently trying to track down a set of emotional face images to use as
> stimuli. I had hoped to find a set in my department, but so far have had no
> luck. If someone could point me in the right direction I would be very
> grateful.
>
> Many thanks.
> Marieke
The CAFE dataset (10 people, each posing all 6 basic emotions, FACS-certified) is downloadable from my website:
http://cseweb.ucsd.edu/users/gary/CAFE/
There are plenty of others, of course. The NimStim set is available from Nim Tottenham: nimtottenham(a)ucla.edu
Gary Cottrell 858-534-6640 FAX: 858-534-7029
Computer Science and Engineering 0404
IF USING FED EX INCLUDE THE FOLLOWING LINE:
CSE Building, Room 4130
University of California San Diego
9500 Gilman Drive # 0404
La Jolla, Ca. 92093-0404
"A grapefruit is a lemon that saw an opportunity and took advantage of it." - note written on a door in Amsterdam on Lijnbaansgracht.
"Physical reality is great, but it has a lousy search function." -Matt Tong
"Only connect!" -E.M. Forster
"I am awaiting the day when people remember the fact that discovery does not work by deciding what you want and then discovering it."
-David Mermin
Email: gary(a)ucsd.edu
Home page: http://www-cse.ucsd.edu/~gary/
Dear Marieke,
1. There is a very good set at Radboud University in the Netherlands; Google them.
2. We have a synthetic face generator (at the University of Cape Town)
that also generates a number of emotions in the faces. Tests show that the
emotions are well recognised by human observers. The database underlying
the model is quite small, but the resulting stimuli are quite good. We
would be happy for you to use it, if you would like.
--
Colin Tredoux
Professor
Dept Psychology UCT
South Africa
colin.tredoux(a)uct.ac.za
Tel: +2721 6503424
Fax: +2721 6504104
Dear all,
I am a Masters student at the University of Sussex and am about to start
my dissertation project on mirror neurone response to emotional faces. I am
currently trying to track down a set of emotional face images to use as
stimuli. I had hoped to find a set in my department, but so far have had no
luck. If someone could point me in the right direction I would be very
grateful.
Many thanks.
Marieke
Dear all,
I would very much appreciate it if you could circulate the following PhD opportunities to any interested students/lab members:
A fully-funded PhD studentship at Bournemouth University (UK) is available, for a project entitled "Face facts: Why face processing skills should be improved in forensic and national security settings." The scholarship covers tuition fees for three years, and provides a £14,000 per year stipend.
A number of Vice-Chancellor Doctoral scholarships (fee waiver only) are also available, and I would be very happy to hear from potential candidates interested in carrying out investigations into face processing, particularly examining face recognition impairments in adults and children with prosopagnosia, autistic spectrum disorder, or Moebius syndrome.
Further information can be obtained by contacting me directly (sbate(a)bournemouth.ac.uk).
Many thanks for your help,
Sarah
Hi everyone,
Some of you may not know that a few months ago Google Scholar introduced a
facility that lets you create a personal page where you can list your
publications and track citations. Like ResearcherID.com, but it covers a
wider range of sources (e.g. books and chapters as well as journals) and
offers citation alerts and so on.
See the explanatory URL:
http://scholar.google.co.uk/intl/en/scholar/citations.html
I've already found this useful for a range of reasons. For example, it's a simpler way of finding out what someone has done than a literature search; it lets you follow what former students and colleagues are up to; and you can create alerts for when someone whose work interests you publishes a paper, or when a key paper you're interested in gets cited. I also used it to create pages for Hadyn Ellis and Freda Newcombe, which may be helpful to anyone interested in where current cognitive and neuropsychological approaches came from.
The downside seems to be that searching the content isn't very effective
at the moment. For example, the keywords 'face perception', 'face
recognition' and 'face processing' will find different sets of people.
But since Google is behind this, I'm guessing this will be improved.
Cheers,
Andy Young.
Dear All, I have had an email (below) from a senior police officer in Cumbria who is interested in 3D modelling of hands. Does anyone have experience of this, or know of anyone who has? Feel free to contact Richard directly. I believe this application is for a court case. Regards, Charlie.
________________________________
From: San Jose, Richard [Richard.SanJose(a)cumbria.police.uk]
Sent: 27 March 2012 13:50
To: Charlie Frowd
Subject: Hands
Charlie
Are you aware, or do any of your colleagues have knowledge, of any research and/or applications that we can use within policing to demonstrate the position of a person's hands on, say, another person's neck or a bottle, and also the ability to manipulate the fingers to show different positions?
Richard San José 8808
Scientific Support Manager
CID
T: 01768 217350
M: 07970 122834
E: Richard.SanJose(a)cumbria.police.uk
Find us on…
W: www.cumbria.police.uk
Facebook: www.facebook.com/cumbriapolice
Twitter: www.twitter.com/cumbriapolice
Police Headquarters, Carleton Hall, Penrith, Cumbria. CA10 2AU.
______________________________________________________
Cumbria Constabulary - Safer Stronger Cumbria
_______________________________________________________