Apologies for cross-posting, but I'd like to encourage facey people to consider applying; the cognitive and developmental positions might suit.
Peter
The University of Stirling - School of Natural Sciences - Psychology is pleased to announce the following available posts:
The School of Natural Sciences is looking to appoint two Lecturers and one Senior Lecturer in Psychology. The successful candidates will be expected to undertake internationally excellent research, high-quality relevant teaching, appropriate administration and other activities in Psychology and the School of Natural Sciences, to support and develop the School's academic profile. The posts will have an emphasis on the following areas: Cognition, Cognitive Neuroscience, Development, Aging and Health, and provide an exciting opportunity to enhance two key research groups: Cognition in Complex Environments and the Centre for Health and Behaviour Change. Suitably qualified candidates will be considered for the Senior Lectureship, based on evidence of research leadership and a substantial record of external funding.
For all three posts, experience of both undergraduate and postgraduate teaching is essential; experience of, and interest in, teaching focused on employability and enhancing the student experience will be an advantage.
These three posts are full-time and open-ended. Dependent on qualifications and experience, the salary for the two Lecturer posts will be within Grade 8 (£37,012 to £44,165 p.a.), and the Senior Lecturer post will be within Grade 9 (£45,336 to £52,556 p.a.). The closing date for applications is 29 July 2012. Informal enquiries can be made to Professor David Donaldson, telephone 01786 467657, or email d.i.donaldson(a)stir.ac.uk
For further particulars and to apply:
http://www.hr-services.stir.ac.uk/jobs/details.php?id=QUUFK026203F3VBQB7V79…
Peter Hancock
Professor,
Deputy Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/staff-profiles/academic-staff/peter-…
--
The University of Stirling is ranked in the top 50 in the world in The Times Higher Education 100 Under 50 table, which ranks the world's best 100 universities under 50 years old.
The University of Stirling is a charity registered in Scotland,
number SC 011159.
Does anyone have, or know of, a set of faces that have been rated for power/dominance? A colleague of mine has been trying to locate one for a while but has had no luck.
Thanks for any help,
David Ross
Vanderbilt University
Hello,
Does anyone have a composite face test? I'm assessing a group of
prosopagnosic patients and would like to measure holistic processing. If
anyone has a composite face test that they would allow me to use, I would be
very grateful.
Many thanks,
Joe
M.Res Psychology candidate
University of St Andrews
Does anyone have photos of (biologically related) brothers, and of sisters, aged 18-25 at the time the photos were taken? (Colour or black and white are both fine; preferably full face or near-full face.) We're doing an experiment on kin recognition and can't find enough siblings!
Thanks in anticipation,
Dr. Graham Hole,
School of Psychology,
University of Sussex.
Post-doctoral Researcher Position at OSU
We are looking for a post-doc to work in the area of computational modeling of face perception and/or computational neuroscience. A strong background in vision, cognitive science (or cognitive neuroscience) and statistics is required. We will consider PhDs from cognitive science, psychology, neuroscience, computer vision and machine learning. The ideal candidate has previous publications in the top journals and conferences in the area.
Please contact Prof. Aleix Martinez (aleix(a)ece.osu.edu) with the subject line: Post-doctoral Research Position.
Hello
I used these stimuli in a past study and scanned them for this purpose.
I will send them to you via e-mail.
Yours
Nathalie George
On 04/06/2012 13:00, face-research-list-request(a)lists.stir.ac.uk wrote:
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. mooney face request (Kelly Garner)
> 2. Re: mooney face request (Etienne B. Roesch)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 4 Jun 2012 09:41:22 +1000
> From: Kelly Garner<getkellygarner(a)googlemail.com>
> To: face-research-list(a)lists.stir.ac.uk
> Subject: [Face-research-list] mooney face request
> Message-ID:
> <CAEqKWdj1GZPh7rcPpMrzuUxhOR21y379+QXkr9ydX9BJPi4g2w(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hello,
>
> I'm planning a study that requires the use of a set of mooney faces - does
> anyone know where I could access a set of these stimuli? Or does anyone
> have a set they would be willing to email me?
>
> Many thanks and best wishes,
> Kelly
>
--
If everything here below were excellent, there would be nothing excellent.
--
Nathalie GEORGE
CRICM, UMR 7225 / UMR-S 975, UPMC/CNRS/INSERM
Équipe Cogimage, 3e étage
Institut du Cerveau et de la Moelle Épinière (ICM)
GHU Pitié-Salpêtrière
47, Bd de l'Hôpital
F-75651 PARIS Cedex 13
tel: +33(0)1 57 27 43 79
fax: +33(0)1 57 27 47 93
e-mail: nathalie.george(a)upmc.fr
http://cogimage.dsi.cnrs.fr
http://cogimage.dsi.cnrs.fr/perso/ngeorge/ngeorge.htm
***********
Hello,
I'm planning a study that requires the use of a set of mooney faces - does
anyone know where I could access a set of these stimuli? Or does anyone
have a set they would be willing to email me?
Many thanks and best wishes,
Kelly
--
Kelly Garner
PhD Candidate
Queensland Attention and Control Lab
School of Psychology
University of Queensland
k.garner(a)uq.edu.au
UQ Profile: http://www.psy.uq.edu.au/directory/index.html?id=2024#
Queensland Attention and Control Lab: http://www.paulduxlab.org/
Remington Eye Movement and Attention Lab: http://remingtonlab.wordpress.com/
*****************************************************************************
REMINDER CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission
Deadline: June 4th, 2012
*********************************************************************************************
The first workshop on "Context Based Affect Recognition" CBAR12
(http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts towards the automatic recognition of human affect using facial, vocal, body and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g. depression, anxiety and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is displayed in a particular context, such as the ongoing task, the other people involved, and the identity and natural expressiveness of the individual. The context tells us which expressions are more likely to occur, and can thus bias the classifier toward the most likely/relevant classes. Without context, even humans may misunderstand an observed facial expression. By tackling the issues of context-based affect recognition, i.e. the careful study of contextual information and its relevance in domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis is still a largely unexplored area for automatic affect recognition, given the difficulty of modeling this variable and of introducing it into the classification process. Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Contextual information helps us interpret and respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation and recognition. We wish to investigate which methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, which contextual information is relevant (i.e. is it domain-specific or not?), and how it can improve the performance of existing frameworks for affect recognition.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedded contextual information, such as culture, gives a different flavor to each interaction and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop, we invite scientists working in the related areas of affective computing, ambient computing, machine learning, psychology and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications (context-related affect applications)
For details concerning the workshop program, paper submission
guidelines, etc. please visit our workshop website at:
http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics
Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
Dear Sharon,
here's a somewhat shameless plug for our just-published database. It contains >50 expressions from 20 actors, at two intensities and from three camera angles (resolution is PAL). The database is NOT FACS-coded (yet), and we'd love help and pointers for doing so. Two sets of validation experiments are published with the database as well.
K. Kaulard, D.W. Cunningham, H.H. Bülthoff, and C. Wallraven (2012). The MPI facial expression database - a validated database of emotional and conversational facial expressions. PLoS ONE.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0032321
Also, in case you are interested, here are two previous papers on dynamic expression recognition from our lab.
D. W. Cunningham and C. Wallraven. Temporal information for the recognition of conversational expressions. Journal of Vision, 9(13):1-17, 2009.
M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff. The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8(8):1-23, 2008.
Best
Christian
On Jun 2, 2012, at 10:58 PM, Sharon Gilad-Gutnick wrote:
> Hi,
>
> I am planning an experiment that measures expression recognition from dynamic face information. Does anyone know of a video stimulus set that I might be able to use? Male and female faces would both be good. Specifically, I am interested in faces that progress from neutral to different expressions.
>
> Thanks,
> Sharon Gutnick.
>
> sharongilad1(a)gmail.com
>
> --
> Sharon Gilad-Gutnick
> (Visiting Graduate Student)
> Sinha Lab for Vision Research
> MIT Department of Brain and Cognitive Sciences
> 46-4089
> 77 Massachusetts Avenue, Cambridge, MA 02139
>
>
>
>
> _______________________________________________
> Face-research-list mailing list
> Face-research-list(a)lists.stir.ac.uk
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
--
Christian Wallraven
Cognitive Systems Lab
Dept. of Brain & Cognitive Engineering
Korea University
email: wallraven(a)korea.ac.kr
web: cogsys.korea.ac.kr
Hi,
I am planning an experiment that measures expression recognition from
dynamic face information. Does anyone know of a video stimulus set that I
might be able to use? Male and female faces would both be good. Specifically,
I am interested in faces that progress from neutral to different
expressions.
Thanks,
Sharon Gutnick.
sharongilad1(a)gmail.com
--
Sharon Gilad-Gutnick
(Visiting Graduate Student)
Sinha Lab for Vision Research
MIT Department of Brain and Cognitive Sciences
46-4089
77 Massachusetts Avenue, Cambridge, MA 02139
-----Original Message-----
From: Whitaker, Lydia [mailto:lwhita@essex.ac.uk]
Sent: 28 May 2012 15:29
To: face-research-list-bounces
Subject: Asian face set with different intensities of expression
Dear all,
My name is Lydia Whitaker and I am a PhD student at the University of Essex, UK. I am looking for a stimulus set of Asian faces that vary in the intensity of expression portrayed, preferably 40%-100% intensities. I would be very grateful if anyone could point me in the right direction of a face set like this.
Many thanks and kind regards,
Lydia
--
The Sunday Times Scottish University of the Year 2009/2010
The University of Stirling is a charity registered in Scotland,
number SC 011159.
Dear Administrator,
We would be greatly obliged if you could circulate the funding opportunity detailed below to the FR community.
Thanks for organising the list - it is a great resource,
David
=================
David White PhD
School of Psychology
University of New South Wales
tel. +61 (0) 2 9385 3254
mob. +61 (0) 4 1675 5100
Advertisement text as follows:
-----------------------------------------------------------
2012 University of New South Wales (Sydney, Australia) Vice-Chancellor's Postdoctoral Research Fellowships (for 2013)
Applications are invited for a UNSW VC Fellowship, to work in collaboration with Richard Kemp and David White on applied and/or theoretical aspects of face recognition. Interested candidates should forward an academic CV and a brief description of their research interests to Richard Kemp (richard.kemp(a)unsw.edu.au). Competition for the VC Fellowships is generally very strong, so if you are interested in submitting an application please contact Richard ASAP so that we can draft a strong proposal.
Some information on eligibility/funding:
- The UNSW VC's postdoc scheme for 2013 is expected to open on 18 June 2012, with a closing date of 23 August.
- Applicants must have been awarded a PhD, conferred no earlier than 1 January 2008 and no later than 31 December 2011.
- A salary (taxable) will be provided at Level A or B, based on years of experience.
- A centrally funded research support grant of A$10,000 per annum for three years will be provided to assist with research costs.
- The funding is for three years (2 + an extra year dependent on performance).
Further info here - http://research.unsw.edu.au/vcfellowships
Kind regards,
Richard & David
Dear all,
Please draw the attention of any potentially interested students to our Masters program at Stirling: Research Methods in Psychology of Faces. This is a one-year MSc that will cover most aspects of the psychology of faces, from low-level visual processing to expression and gaze perception, social cognition, facial composite systems, and face recognition and matching. It is particularly aimed at developing the skills required for research in face perception; students will be taught how to morph faces in 2D and 3D, how to use an eye-tracker, how to program in E-Prime and, above all, how to produce sufficiently controlled stimuli and design experiments. The course also covers more general aspects of psychology research: advanced statistics, research methods and key skills, together with a placement in a research laboratory.
http://www.psychology.stir.ac.uk/research/cognitive-neuroscience/Face-Resea…http://www.stir.ac.uk/postgraduate/programme-information/prospectus/psychol…
Thanks, Peter
Peter Hancock
Professor,
Deputy Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/staff-profiles/academic-staff/peter-…
Hi,
A 2-3 year Research Assistant or Research Associate (post-doctoral) position is available to work with Dr Chris Petkov and Dr Quoc Vuong (Newcastle University, UK) on a BBSRC-funded project entitled "The impact of attention on the neuronal mechanisms of adaptation in humans and animal models". The proposed research combines imaging and electrophysiological recordings to test neuronal models of adaptation and how attention affects adaptation mechanisms. These adaptation models will be tested in the domains of visual face perception and/or auditory voice perception in both humans and the model system.
Review of applications will begin on June 7, 2012 and continue until a suitable candidate has been appointed.
Start date: September/October 2012
For further details and the online application, please visit the link:
https://www15.i-grasp.com/fe/tpl_newcastle02.asp?newms=jj&id=44821&newlang=1
For informal enquiries, please feel free to contact Chris by email (chris.petkov(a)newcastle.ac.uk).
Best,
Quoc.
----------------------------------------
Institute of Neuroscience
School of Psychology
Henry Wellcome Building for Neuroecology
Newcastle University
Framlington Place
Newcastle upon Tyne
NE2 4HH
Tel: +44 (0)191 222 6183
Fax: +44 (0)191 222 5622
Web: www.staff.ncl.ac.uk/q.c.vuong/
----------------------------------------
Workshop to be held in conjunction with ECCV 2012, 12th October 2012
The topic of human facial analysis has engaged researchers in multiple fields, including computer vision, biometrics, forensics, cognitive psychology and medicine. Interest in this topic has been fuelled by scientific advances that suggest insight into a person's identity, intent, attitude and health solely based on face images. The "What's in a Face?" workshop aims to provide a forum for interdisciplinary exchange on the topic of the human face. The interdisciplinary aspect will promote a lively exchange of ideas between researchers in computer vision, biometrics, cognitive psychology and forensics. This exchange will be facilitated by invited talks from leading researchers in these disciplines. Additionally, a panel session will be conducted to bring new perspectives to the fore and promote active collaboration between these disciplines.
Call for Papers
We invite high quality, original contributions on the following topics:
- Novel 2D and 4D face recognition algorithms
- Neuropsychology of face recognition in humans
- Face understanding in social/cognitive psychology
- Face behaviometrics
- Age, gender and race prediction from faces
- Emotion and deception detection from faces
- Familial relationships from face images
- Facial forensics based on scars, moles, tattoos
- Facial micro-expressions
- Detection of social intent from faces
- Recognition of attentional focus
Papers must be submitted online through the ECCV 2012 CMT submission system and will be peer-reviewed by the program committee. Submissions should adhere to the main ECCV 2012 proceedings style, and have a maximum length of 10 pages. Papers accepted and presented at the workshop will be published in the ECCV 2012 conference proceedings.
Organizers: Arun Ross, West Virginia University, USA; Alice O'Toole, University of Texas at Dallas, USA; Maja Pantic, Imperial College London, UK; Antitza Dantcheva, West Virginia University, USA; Stefanos Zafeiriou, Imperial College London, UK
Dear list:
A 3-year research assistant/research associate position is available to work with Dr Quoc Vuong (Newcastle University, UK) and Dr Bruno Rossion (University of Louvain, Belgium) on an ESRC-funded project entitled "A neuropsychological approach to dissect face perception and perceptual expertise". The proposed research combines neuropsychological cases, behavioural training, computer graphics, and functional imaging to understand the mechanisms of face perception and expertise. The successful candidate will train and test individuals with acquired prosopagnosia, scan healthy control participants, analyze the data, and publish the results. The post is based at Newcastle, but there will be funded trips to Belgium for patient testing and meetings. The candidate will also be expected to present at conferences and participate in public engagement activities. Applicants with research experience in testing patients, psychophysics, and/or imaging, as evidenced by publications in these areas, are strongly encouraged to apply. Familiarity with programming (particularly MATLAB and the Psychtoolbox) and fMRI analyses (SPM, FSL, BrainVoyager) is desirable. Applicants need to be from EU-member countries.
Deadline for application: May 27, 2012
Start date: ~ July/August 2012
For further details and the online application, please visit the link:
https://www15.i-grasp.com/fe/tpl_newcastle02.asp?newms=jj&id=44816&newlang=1
For informal enquiries, please feel free to contact me by email (quoc.vuong(a)newcastle.ac.uk).
Thanks,
Quoc.
----------------------------------------
Institute of Neuroscience
School of Psychology
Henry Wellcome Building for Neuroecology
Newcastle University
Framlington Place
Newcastle upon Tyne
NE2 4HH
Tel: +44 (0)191 222 6183
Fax: +44 (0)191 222 5622
Web: www.staff.ncl.ac.uk/q.c.vuong/
----------------------------------------
Dear colleagues, please excuse the mass email.
I am currently looking to appoint two full-time researchers, one postdoctoral (Salary Range: £31,948 - £35,938) and one not (Salary Range: £26,004 - £29,249), to assist with a five-year project, funded by the European Research Council.
The project will investigate the effects of exogenous and endogenous hormones on aspects of women's mate preferences and choices, emotion processing, appearance and sexual behaviour. There is also plenty of scope within the project for the appointed researchers to develop their own strands of the research, as well as contribute to other ongoing projects in our lab (www.facelab.org).
Please note that, by the time of appointment, our lab will have relocated to the Research Institute of Neuroscience and Psychology at the University of Glasgow (www.gla.ac.uk/researchinstitutes/neurosciencepsychology/).
Please pass these details on to any students, postdocs, or mailing lists you think might be interested. I can be contacted at ben.jones(a)abdn.ac.uk with queries about the positions and project.
More details about the positions, including details of how to apply, are given here:
Postdoctoral Research Associate: http://facelab.org/postdoc
Research Assistant: http://facelab.org/ra
Best wishes and kind regards,
Ben Jones
Benedict Jones, PhD
Personal Chair in Psychology
Face Research Laboratory
University of Aberdeen
Scotland, UK
www.facelab.org
The University of Aberdeen is a charity registered in Scotland, No SC013683.
*****************************************************************************
CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission
Deadline: May 11th, 2012
*****************************************************************************
The first workshop on "Context Based Affect Recognition" CBAR12
(http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts towards the automatic recognition of human affect using facial, vocal, body and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g. depression, anxiety and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is displayed in a particular context, such as the ongoing task, the other people involved, and the identity and natural expressiveness of the individual. The context tells us which expressions are more likely to occur, and can thus bias the classifier toward the most likely/relevant classes. Without context, even humans may misunderstand an observed facial expression. By tackling the issues of context-based affect recognition, i.e. the careful study of contextual information and its relevance in domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis is still a largely unexplored area for automatic affect recognition, given the difficulty of modeling this variable and of introducing it into the classification process. Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Contextual information helps us interpret and respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation and recognition. We wish to investigate which methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, which contextual information is relevant (i.e. is it domain-specific or not?), and how it can improve the performance of existing frameworks for affect recognition.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedded contextual information, such as culture, gives a different flavor to each interaction and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop, we invite scientists working in the related areas of affective computing, ambient computing, machine learning, psychology and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications (context-related affect applications)
For details concerning the workshop program, paper submission guidelines, etc. please visit our workshop website at: http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics
Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
face-place.org
mike@iPhone
On Apr 10, 2012, at 5:00 AM, face-research-list-request(a)lists.stir.ac.uk wrote:
>
> Today's Topics:
>
> 1. set of emotional face images to use in an EEG study
> (Marieke Roebuck)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 09 Apr 2012 19:02:34 +0100
> From: Marieke Roebuck <M.Roebuck(a)sussex.ac.uk>
> Subject: [Face-research-list] set of emotional face images to use in
> an EEG study
> To: <face-research-list(a)lists.stir.ac.uk>
> Message-ID: <6fe743df94dba4a77347457e0a7c28ab(a)sussex.ac.uk>
> Content-Type: text/plain; charset=UTF-8
>
> Dear all,
>
> I am a Masters student at the University of Sussex and am about to start
> my dissertation project on mirror neurone response to emotional faces. I am
> currently trying to track down a set of emotional face images to use as
> stimuli. I had hoped to find a set in my department, but so far have had no
> luck. If someone could point me in the right direction I would be very
> grateful.
>
> Many thanks.
> Marieke
>
>
>
>
>
> ------------------------------
>
>
>
> End of Face-research-list Digest, Vol 15, Issue 2
> *************************************************
>
The CAFE dataset (10 people displaying all 6 emotions, FACS-certified) is downloadable from my website:
http://cseweb.ucsd.edu/users/gary/CAFE/
There are plenty of others, of course. The NimStim stimuli are available from Nim Tottenham: nimtottenham(a)ucla.edu
Gary Cottrell 858-534-6640 FAX: 858-534-7029
Computer Science and Engineering 0404
IF USING FED EX INCLUDE THE FOLLOWING LINE:
CSE Building, Room 4130
University of California San Diego
9500 Gilman Drive # 0404
La Jolla, Ca. 92093-0404
"A grapefruit is a lemon that saw an opportunity and took advantage of it." - note written on a door in Amsterdam on Lijnbaansgracht.
"Physical reality is great, but it has a lousy search function." -Matt Tong
"Only connect!" -E.M. Forster
"I am awaiting the day when people remember the fact that discovery does not work by deciding what you want and then discovering it."
-David Mermin
Email: gary(a)ucsd.edu
Home page: http://www-cse.ucsd.edu/~gary/
Dear Marieke,
1. There is a very good set at Radboud University in the Netherlands (the
Radboud Faces Database); google them.
2. We have a synthetic face generator (at the University of Cape Town)
that can also render a range of emotional expressions. Tests show that the
emotions are well recognised by human observers. The database underlying
the model is quite small, but the resulting stimuli are quite good. We
would be happy for you to use it, if you would like.
On 10 April 2012 13:00, <face-research-list-request(a)lists.stir.ac.uk> wrote:
--
Colin Tredoux
Professor
Dept Psychology UCT
South Africa
colin.tredoux(a)uct.ac.za
Tel: +2721 6503424
Fax: +2721 6504104
Dear all,
I would very much appreciate it if you could circulate the following PhD opportunities to any interested students/lab members:
A fully-funded PhD studentship at Bournemouth University (UK) is available, for a project entitled "Face facts: Why face processing skills should be improved in forensic and national security settings." The scholarship covers tuition fees for three years, and provides a £14,000 per year stipend.
A number of Vice-Chancellor Doctoral scholarships (fee waiver only) are also available, and I would be very happy to hear from potential candidates interested in carrying out investigations into face processing, particularly examining face recognition impairments in adults and children with prosopagnosia, autistic spectrum disorder, or Moebius syndrome.
Further information can be obtained by contacting me directly (sbate(a)bournemouth.ac.uk<mailto:sbate@bournemouth.ac.uk>).
Many thanks for your help,
Sarah
This email is intended only for the person to whom it is addressed and may contain confidential information. If you have received this email in error, please notify the sender and delete this email, which must not be copied, distributed or disclosed to any other person.
Any views or opinions presented are solely those of the author and do not necessarily represent those of Bournemouth University or its subsidiary companies. Nor can any contract be formed on behalf of the University or its subsidiary companies via email.
Hi everyone,
Some of you may not know that a few months ago Google Scholar introduced a
facility that lets you create a personal page where you can list your
publications and track citations. It's similar to ResearcherID.com, but
covers a wider range of sources (e.g. books and chapters as well as
journals) and offers citation alerts and so on.
See the explanatory URL:
http://scholar.google.co.uk/intl/en/scholar/citations.html
I've already found this useful for a range of reasons. For example, it's
a simpler way of finding out what someone has done than a literature
search; it allows you to follow what former students and colleagues are up
to; and you can create alerts for when someone whose work interests you
publishes a paper, or when a key paper you're interested in gets cited. I
also used it to create pages for Hadyn Ellis and Freda Newcombe, which may
be helpful to anyone interested in where current cognitive and
neuropsychological approaches came from.
The downside seems to be that searching the content isn't very effective
at the moment. For example, the keywords 'face perception', 'face
recognition' and 'face processing' will find different sets of people.
But since Google is behind this, I'm guessing this will be improved.
Cheers,
Andy Young.
Dear All. I have had an email (below) from a senior police officer from Cumbria who is interested in 3D modelling of hands. Does anyone have any experience of this, or know of anyone who has? Feel free to contact Richard directly. I think this application is for a court case. Regards, Charlie.
________________________________
From: San Jose, Richard [Richard.SanJose(a)cumbria.police.uk]
Sent: 27 March 2012 13:50
To: Charlie Frowd
Subject: Hands
Charlie
Are you aware of, or do any of your colleagues have knowledge of, any research and/or applications that we could use within policing to demonstrate the position of a person's hands on, say, a person's neck or a bottle, with the ability to manipulate the fingers to show different positions?
Richard San José 8808
Scientific Support Manager
CID
T: 01768 217350
M: 07970 122834
E: Richard.SanJose(a)cumbria.police.uk<mailto:Richard.SanJose@cumbria.police.uk>
Find us on…
W: www.cumbria.police.uk<http://www.cumbria.police.uk/>
Facebook: www.facebook.com/cumbriapolice<http://www.facebook.com/cumbriapolice>
Twitter: www.twitter.com/cumbriapolice<http://www.twitter.com/cumbriapolice>
Police Headquarters, Carleton Hall, Penrith, Cumbria. CA10 2AU.
______________________________________________________
Cumbria Constabulary - Safer Stronger Cumbria
_______________________________________________________
IMPORTANT NOTICE:
This email, its content and any file transmitted with it, may be confidential/legally privileged, and are for the intended recipient only. It may contain information protected by law. Any unauthorised copying, disclosure or other processing of this information may be unlawful and may be a criminal offence. If you have received this email in error please advise Cumbria Constabulary on 101 or via email return, and delete this email and any attachments immediately. Any opinions expressed in this document are those of the author and do not necessarily reflect the opinion of Cumbria Constabulary.
This footnote also confirms that this email message has been swept by MIMEsweeper for the presence of computer viruses.
_______________________________________________________
Please consider your environmental responsibility. Before printing this e-mail ask yourself: "Do I need a hard copy?"