Apologies for cross-posting, but I'd like to encourage facey people to consider applying; the cognitive and developmental positions might suit.
Peter
The University of Stirling, School of Natural Sciences (Psychology), is pleased to announce the following available posts:
The School of Natural Sciences is looking to appoint two Lecturers and one Senior Lecturer in Psychology. The successful candidates will be expected to undertake internationally excellent research, high-quality relevant teaching, and appropriate administration and other activities in Psychology and the School of Natural Sciences, to support and develop the School's academic profile. The posts will have an emphasis in the following areas: Cognition, Cognitive Neuroscience, Development, Aging, and Health, and will provide an exciting opportunity to enhance two key research groups: Cognition in Complex Environments and the Centre for Health and Behaviour Change. Suitably qualified candidates will be considered for a Senior Lectureship, based on evidence of research leadership and a substantial record of external funding.
For all three of these posts, experience of both undergraduate and postgraduate teaching is essential; experience and interest in teaching that is focused on Employability and enhancing the student experience will be an advantage.
These three posts are full-time and open-ended. Depending on qualifications and experience, the salary for the two Lecturer posts will be within Grade 8 (£37,012 to £44,165 p.a.), and the Senior Lecturer post will be within Grade 9 (£45,336 to £52,556 p.a.). The closing date for applications is 29 July 2012. Informal enquiries can be made to Professor David Donaldson, telephone 01786 467657, or email d.i.donaldson(a)stir.ac.uk
For further particulars and to apply:
http://www.hr-services.stir.ac.uk/jobs/details.php?id=QUUFK026203F3VBQB7V79…
Peter Hancock
Professor,
Deputy Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/staff-profiles/academic-staff/peter-…
Does anyone have, or know of, a set of faces that have been rated for power/dominance? A colleague of mine has been trying to locate some for a while but has not had any luck.
Thanks for any help,
David Ross
Vanderbilt University
Hello,
Does anyone have a composite face test? I'm testing a group of prosopagnosic patients and would like to assess their holistic processing. If anyone has a composite face test that they would allow me to use, I would be very grateful.
Many thanks,
Joe
MRes Psychology candidate
University of St Andrews
Does anyone have photos of biologically related brothers, and photos of sisters, aged 18-25 at the time the photos were taken? (Colour or black and white are both fine; preferably full face or near-full face.) We're doing an experiment on kin recognition and can't find enough siblings!
Thanks in anticipation,
Dr. Graham Hole,
School of Psychology,
University of Sussex.
Post-doctoral Researcher Position at OSU
We are looking for a post-doc to work in the area of computational modeling of face perception and/or computational neuroscience. A strong background in vision, cognitive science (or cognitive neuroscience), and statistics is required. We will consider PhDs from cognitive science, psychology, neuroscience, computer vision, and machine learning. The ideal candidate will have previous publications in the top journals and conferences in the area.
Please contact Prof. Aleix Martinez (aleix(a)ece.osu.edu) with the subject line: Post-doctoral Research Position.
Hello
I used these stimuli in a past study and scanned them for this purpose.
I will send them to you via e-mail.
Yours
Nathalie George
On 04/06/2012 13:00, face-research-list-request(a)lists.stir.ac.uk wrote:
> Hello,
>
> I'm planning a study that requires the use of a set of Mooney faces. Does
> anyone know where I could access a set of these stimuli? Or does anyone
> have a set they would be willing to email me?
>
> Many thanks and best wishes,
> Kelly
>
--
If everything here below were excellent, there would be nothing excellent.
--
Nathalie GEORGE
CRICM, UMR 7225 / UMR-S 975, UPMC/CNRS/INSERM
Equipe Cogimage, 3e étage
Institut du Cerveau et de la Moelle Epiniere (ICM)
GHU Pitié-Salpetriere
47, Bd de l'Hopital
F-75651 PARIS Cedex 13
tel: +33(0)1 57 27 43 79
fax: +33(0)1 57 27 47 93
e-mail: nathalie.george(a)upmc.fr
http://cogimage.dsi.cnrs.fr
http://cogimage.dsi.cnrs.fr/perso/ngeorge/ngeorge.htm
***********
Hello,
I'm planning a study that requires the use of a set of Mooney faces. Does anyone know where I could access a set of these stimuli? Or does anyone have a set they would be willing to email me?
Many thanks and best wishes,
Kelly
--
Kelly Garner
PhD Candidate
Queensland Attention and Control Lab
School of Psychology
University of Queensland
k.garner(a)uq.edu.au
UQ Profile: http://www.psy.uq.edu.au/directory/index.html?id=2024#
Queensland Attention and Control Lab: http://www.paulduxlab.org/
Remington Eye Movement and Attention Lab: http://remingtonlab.wordpress.com/
*****************************************************************************
REMINDER CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission Deadline: June 4th, 2012
*********************************************************************************************
The first workshop on "Context Based Affect Recognition" (CBAR12, http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing, SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts towards automatic recognition of human affect using facial, vocal, bodily, and physiological signals. Several research areas could benefit from such systems: interactive teaching systems, which allow teachers to be aware of student stress and inattention; accident prevention, such as driver fatigue detection; and medical tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive disorders (e.g. depression, anxiety, and autism) and pain assessment. However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is always displayed in a particular context, shaped by the task at hand, the other people involved, and the identity and natural expressiveness of the individual. Context tells us which expressions are more likely to occur, and can thus bias a classifier toward the most likely or relevant classes. Without context, even humans may misinterpret an observed facial expression. By tackling the issues of context-based affect recognition, i.e. the careful study of contextual information and its relevance in domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis remains a largely unexplored area of automatic affect recognition, given the difficulty of modeling contextual variables and of introducing them into the classification process. Humans unconsciously evaluate situations based on environmental and social parameters when recognizing emotions in social interactions; contextual information helps us interpret and respond to those interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation, and recognition. We wish to investigate which methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, which contextual information is relevant (i.e. is it domain-specific or not?), and how it can improve the performance of existing frameworks for affect recognition.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Embedded contextual information, such as culture, gives each interaction a different flavor and makes for an interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially in developing human-centric systems.
For the workshop we invite scientists working in related areas of affective computing, ambient computing, machine learning, psychology, and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications of context-based affect recognition
For details concerning the workshop program, paper submission guidelines, etc., please visit our workshop website at:
http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
Dear Sharon,
Here's a somewhat shameless plug for our just-published database. It contains more than 50 expressions from 20 actors, at two intensities and from three camera angles (PAL resolution). The database is NOT FACS-coded (yet), and we'd love help and pointers for doing so. There are two sets of validation experiments published with the database as well.
K. Kaulard, D. W. Cunningham, H. H. Bülthoff, and C. Wallraven (2012). The MPI facial expression database - a validated database of emotional and conversational facial expressions. PLoS ONE.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0032321
Also, in case you are interested, here are two previous papers on dynamic expression recognition from our lab.
D. W. Cunningham and C. Wallraven. Temporal information for the recognition of conversational expressions. Journal of Vision, 9(13):1-17, December 2009.
M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff. The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8(8):1, 1-23, June 2008.
Best
Christian
On Jun 2, 2012, at 10:58 PM, Sharon Gilad-Gutnick wrote:
> Hi,
>
> I am planning an experiment that measures expression recognition from dynamic face information. Does anyone know of a video stimulus set that I might be able to use? Male/female faces would both be good. Specifically, I am interested in faces that progress from neutral to different expressions.
>
> Thanks,
> Sharon Gutnick.
>
> sharongilad1(a)gmail.com
>
--
Christian Wallraven
Cognitive Systems Lab
Dept. of Brain & Cognitive Engineering
Korea University
email: wallraven(a)korea.ac.kr
web: cogsys.korea.ac.kr
Hi,
I am planning an experiment that measures expression recognition from dynamic face information. Does anyone know of a video stimulus set that I might be able to use? Male/female faces would both be good. Specifically, I am interested in faces that progress from neutral to different expressions.
Thanks,
Sharon Gutnick.
sharongilad1(a)gmail.com
--
Sharon Gilad-Gutnick
(Visiting Graduate Student)
Sinha Lab for Vision Research
MIT Department of Brain and Cognitive Sciences
46-4089
77 Massachusetts Avenue, Cambridge, MA 02139