Does anyone have photos of biologically related brothers, or of sisters, aged 18-25 at the time the photos were taken? (Colour or black and white are both fine; preferably full face or near-full face.) We're doing an experiment on kin recognition and can't find enough siblings!
Thanks in anticipation,
Dr. Graham Hole,
School of Psychology,
University of Sussex.
Post-doctoral Researcher Position at OSU
We are looking for a post-doc to work in the area of computational modeling of face perception and/or computational neuroscience. A strong background in vision, cognitive science (or cognitive neuroscience), and statistics is required. We will consider PhDs in cognitive science, psychology, neuroscience, computer vision, or machine learning. The ideal candidate has prior publications in the top journals and conferences in the area.
Please contact Prof. Aleix Martinez (aleix(a)ece.osu.edu) with the subject line: Post-doctoral Research Position.
Hello
I used these stimuli in a past study and scanned them for this purpose.
I will send them to you via e-mail.
Yours
Nathalie George
On 04/06/2012 13:00, face-research-list-request(a)lists.stir.ac.uk wrote:
> From: Kelly Garner <getkellygarner(a)googlemail.com>
> Subject: [Face-research-list] Mooney face request
>
> Hello,
>
> I'm planning a study that requires the use of a set of Mooney faces - does
> anyone know where I could access a set of these stimuli? Or does anyone
> have a set they would be willing to email me?
>
> Many thanks and best wishes,
> Kelly
>
--
If everything here below were excellent, there would be nothing excellent.
--
Nathalie GEORGE
CRICM, UMR 7225 / UMR-S 975, UPMC/CNRS/INSERM
Equipe Cogimage, 3e étage
Institut du Cerveau et de la Moelle Épinière (ICM)
GHU Pitié-Salpêtrière
47, Bd de l'Hôpital
F-75651 PARIS Cedex 13
tel: +33(0)1 57 27 43 79
fax: +33(0)1 57 27 47 93
e-mail: nathalie.george(a)upmc.fr
http://cogimage.dsi.cnrs.fr
http://cogimage.dsi.cnrs.fr/perso/ngeorge/ngeorge.htm
***********
Hello,
I'm planning a study that requires the use of a set of Mooney faces - does
anyone know where I could access a set of these stimuli? Or does anyone
have a set they would be willing to email me?
Many thanks and best wishes,
Kelly
--
Kelly Garner
PhD Candidate
Queensland Attention and Control Lab
School of Psychology
University of Queensland
k.garner(a)uq.edu.au
UQ Profile: http://www.psy.uq.edu.au/directory/index.html?id=2024#
Queensland Attention and Control Lab: http://www.paulduxlab.org/
Remington Eye Movement and Attention Lab: http://remingtonlab.wordpress.com/
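For anyone who cannot locate an existing set: Mooney-style two-tone faces are commonly produced by converting a face photograph to grayscale, smoothing away fine detail, and thresholding to black and white. The Python sketch below (using Pillow) illustrates that generic recipe only; the blur radius and threshold are illustrative assumptions that must be tuned per photograph, and this is not the procedure behind any particular published set.

from PIL import Image, ImageFilter, ImageOps

def make_mooney(path, blur_radius=8, threshold=128):
    """Mooney-style two-tone image: grayscale, smooth, binarize.
    blur_radius and threshold are illustrative defaults, not
    parameters from any published stimulus set."""
    img = Image.open(path).convert("L")            # grayscale
    img = ImageOps.equalize(img)                   # spread the intensity range
    img = img.filter(ImageFilter.GaussianBlur(blur_radius))
    # Binarize: pixels at or above threshold become white, the rest black.
    return img.point(lambda p: 255 if p >= threshold else 0).convert("1")

# Example usage: make_mooney("face01.jpg").save("face01_mooney.png")

In practice, published Mooney sets are usually hand-selected or hand-edited after thresholding, so a script like this is a starting point rather than a finished stimulus set.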
*****************************************************************************
REMINDER CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission deadline: June 4th, 2012
*********************************************************************************************
The first workshop on "Context Based Affect Recognition" (CBAR12, http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing, SocialCom2012 (http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number of efforts towards automatic recognition of human affect from facial, vocal, bodily, and physiological signals. Several application areas could benefit from such systems: interactive teaching systems that let teachers monitor student stress and inattention; accident prevention, such as driver-fatigue detection; and medical tools for automatic diagnosis and monitoring, such as pain assessment and the diagnosis of cognitive disorders (e.g. depression, anxiety, and autism). However, despite the significant amount of research on automatic affect recognition, the current state of the art has not yet achieved the long-term objective of robust affect recognition, particularly context-based affect analysis and interpretation. Indeed, it is well known that affect is always displayed in a particular context, such as the task at hand, the other people involved, and the identity and natural expressiveness of the individual. The context tells us which expressions are more likely to occur, and thus can bias the classifier toward the most likely or relevant classes. Without context, even humans may misunderstand an observed facial expression. By tackling the issues of context-based affect recognition, i.e. the careful study of contextual information and its relevance in domain-specific applications, its representation, and its effect on the performance of existing affect recognition methods, we take a step towards real-world, real-time affect recognition.
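To make the biasing idea concrete, one simple scheme treats context as a prior over expression classes and combines it with a classifier's per-class likelihoods in a naive-Bayes fashion. The Python sketch below is purely illustrative: the class set, the contexts, and all numbers are assumptions made for the example, not part of this call.

import numpy as np

CLASSES = ["neutral", "happy", "sad", "angry", "fearful"]

# Hypothetical context-dependent priors P(class | context); the values
# are invented for illustration.
CONTEXT_PRIORS = {
    "driving":   np.array([0.60, 0.05, 0.05, 0.15, 0.15]),
    "classroom": np.array([0.50, 0.20, 0.15, 0.10, 0.05]),
}

def contextual_posterior(likelihoods, context):
    """P(class | signal, context) is proportional to
    P(signal | class) * P(class | context)."""
    unnorm = np.asarray(likelihoods) * CONTEXT_PRIORS[context]
    return unnorm / unnorm.sum()

# A face-only classifier torn between "happy" and "fearful" is
# disambiguated by the driving context, which makes "happy" unlikely:
scores = [0.10, 0.35, 0.05, 0.15, 0.35]
print(dict(zip(CLASSES, contextual_posterior(scores, "driving").round(3))))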
-----------------------------
Workshop Objectives
-----------------------------
Context-related affect analysis remains a largely unexplored area in automatic affect recognition, given the difficulty of modeling contextual variables and of incorporating them into the classification process. Humans unconsciously evaluate situations based on environmental and social parameters when recognizing emotions in social interactions; contextual information helps us interpret and respond to those interactions.
The purpose of the workshop is to explore the benefits and drawbacks of integrating context into affect production, interpretation, and recognition. We wish to investigate which methodologies can be applied to include contextual information in emotion corpora, how it ought to be represented, what contextual information is relevant (i.e. is it domain-specific or not?), and how it can improve the performance of existing affect recognition frameworks.
The workshop is relevant to the study of naturalistic social interactions, since contextual information cannot be discounted in the automatic analysis of human behavior. Contextual information such as culture gives each interaction a different flavor and makes for interesting scientific study. Such analyses lead us to consider real-world parameters and complexities in affect recognition, especially when developing human-centric systems. For the workshop we invite scientists working in the related areas of affective computing, ambient computing, machine learning, psychology, and cognitive behavior to share their expertise and achievements in the emerging field of automatic, context-based affect analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on topics including, but not limited to:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications of context-based affect recognition
For details concerning the workshop program, paper submission
guidelines, etc. please visit our workshop website at:
http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
Dear Sharon,
here's a somewhat shameless plug for our just-published database. It contains >50 expressions from 20 actors, at two intensities and from three camera angles (PAL resolution). The database is not yet FACS-coded, and we'd love help and pointers for doing so. Two sets of validation experiments are published with the database as well.
K. Kaulard, D.W. Cunningham, H.H. Bülthoff, and C. Wallraven (2012). The MPI facial expression database - a validated database of emotional and conversational facial expressions. PLoS ONE.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0032321
Also, in case you are interested, here are two previous papers on dynamic expression recognition from our lab.
D. W. Cunningham and C. Wallraven (2009). Temporal information for the recognition of conversational expressions. Journal of Vision, 9(13):1-17.
M. Nusseck, D. W. Cunningham, C. Wallraven, and H. H. Bülthoff (2008). The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8(8):1-23.
Best
Christian
On Jun 2, 2012, at 10:58 PM, Sharon Gilad-Gutnick wrote:
> Hi,
>
> I am planning an experiment that measures expression recognition from dynamic face information. Does anyone know of a video stimulus set that I might be able to use? Male/female faces would both be good. Specifically, I am interested in faces that progress from neutral to different expressions.
>
> Thanks,
> Sharon Gutnick.
>
> sharongilad1(a)gmail.com
>
> --
> Sharon Gilad-Gutnick
> (Visiting Graduate Student)
> Sinha Lab for Vision Research
> MIT Department of Brain and Cognitive Sciences
> 46-4089
> 77 Massachusetts Avenue, Cambridge, MA 02139
--
Christian Wallraven
Cognitive Systems Lab
Dept. of Brain & Cognitive Engineering
Korea University
email: wallraven(a)korea.ac.kr
web: cogsys.korea.ac.kr
Hi,
I am planning an experiment that measures expression recognition from
dynamic face information. Does anyone know of a video stimulus set that I
might be able to use? Male/female faces would both be good. Specifically,
I am interested in faces that progress from neutral to different
expressions.
Thanks,
Sharon Gutnick.
sharongilad1(a)gmail.com
--
Sharon Gilad-Gutnick
(Visiting Graduate Student)
Sinha Lab for Vision Research
MIT Department of Brain and Cognitive Sciences
46-4089
77 Massachusetts Avenue, Cambridge, MA 02139
-----Original Message-----
From: Whitaker, Lydia [mailto:lwhita@essex.ac.uk]
Sent: 28 May 2012 15:29
To: face-research-list-bounces
Subject: Asian face set with different intensities of expression
Dear all,
My name is Lydia Whitaker and I am a PhD student at the University of Essex, UK. I am looking for a stimulus set of Asian faces that vary in the intensity of the expression portrayed, preferably 40%-100% intensities. I would be very grateful if anyone could point me towards a face set like this.
Many thanks and kind regards,
Lydia
Dear Administrator,
We would be greatly obliged if you could circulate the funding opportunity detailed below to the FR community.
Thanks for organising the list - it is a great resource,
David
=================
David White PhD
School of Psychology
University of New South Wales
tel. +61 (0) 2 9385 3254
mob. +61 (0) 4 1675 5100
Advertisement text as follows:
-----------------------------------------------------------
2012 University of New South Wales (Sydney, Australia) Vice-Chancellor's Postdoctoral Research Fellowships (for 2013)
Applications are invited for a UNSW VC Fellowship, to work in collaboration with Richard Kemp and David White on topics in the area of applied and/or theoretical aspects of face recognition. Interested candidates should forward an academic CV and a brief description of their research interests to Richard Kemp (richard.kemp(a)unsw.edu.au). Competition for the VC Fellowships is generally very strong, so if you are interested in submitting an application please contact Richard as soon as possible so that we can draft a strong proposal.
Some information on eligibility / funding:
- The UNSW VC's postdoc scheme for 2013 is expected to open on June 18, 2012, with a closing date of August 23.
- Applicants must have been awarded a PhD, conferred no earlier than 1 January 2008 or later than 31 December 2011.
- A salary (taxable) will be provided at Level A or B based on years of experience.
- A centrally-funded research support grant of A$10,000 per annum for three years will be provided to assist with research costs.
- The funding is for three years (two years, plus an extra year dependent on performance).
Further info here - http://research.unsw.edu.au/vcfellowships
Kind regards,
Richard & David
Dear all,
Please draw the attention of any potentially interested students to our Masters program at Stirling: Research Methods in Psychology of Faces. This is a one-year MSc covering most aspects of the psychology of faces, from low-level visual processing to expression and gaze perception, social cognition, facial composite systems, and face recognition and matching. It is particularly aimed at developing the skills required for research in face perception: students will be taught how to do morphing in 2D and 3D, how to use an eyetracker, how to program in E-Prime, and, above all, how to produce sufficiently controlled stimuli and design experiments. The course also covers more general aspects of psychology research: advanced statistics, research methods, and key skills, together with a placement in a research laboratory.
http://www.psychology.stir.ac.uk/research/cognitive-neuroscience/Face-Resea…
http://www.stir.ac.uk/postgraduate/programme-information/prospectus/psychol…
Thanks, Peter
Peter Hancock
Professor,
Deputy Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/staff-profiles/academic-staff/peter-…