*****************************************************************************
CBAR 2012: CALL FOR PAPERS
SocialCom12 1st International Workshop on CONTEXT BASED AFFECT RECOGNITION
http://contextbasedaffectrecog.blogspot.com/
Submission Deadline: May 11th, 2012
*****************************************************************************
The first workshop on "Context Based Affect Recognition" (CBAR12, http://contextbasedaffectrecog.blogspot.com/) will be held in conjunction with the 2012 ASE/IEEE International Conference on Social Computing (SocialCom2012, http://www.asesite.org/conferences/socialcom/2012/).
-----------------------------
Workshop Description
-----------------------------
The past 20 years have witnessed an increasing number
of efforts toward automatic recognition of human affect using facial, vocal and bodily
as well as physiological signals. Several research
areas could benefit from such systems: interactive teaching systems, which
allow teachers to be aware of student stress and inattention; accident
prevention, such as driver-fatigue detection; and medical
tools for automatic diagnosis and monitoring, such as the diagnosis of cognitive
disorders (e.g. depression, anxiety and autism) and pain assessment. However,
despite the significant amount of research on automatic affect recognition, the
current state of the art has not yet achieved the long-term objective of robust
affect recognition, particularly context-based affect analysis and
interpretation. Indeed, it is well known that affect
is always produced and displayed in a particular context, shaped by the
task at hand, the other people involved, and the identity and natural
expressiveness of the individual. The context tells us which expressions are
more likely to occur, and can thus bias the classifier toward the most
likely/relevant classes. Without context, even humans may misunderstand an
observed facial expression. By tackling the issues of context-based affect
recognition, i.e. careful study of contextual information and its relevance in
domain-specific applications, its representation, and its effect on the
performance of existing affect recognition methods, we take a step towards
real-world, real-time affect recognition.
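The idea that context can bias a classifier toward the most likely classes can be sketched as a simple Bayesian reweighting of the classifier's output by a context prior. This is purely an illustrative sketch by the editor, not a method from the CFP; the class labels, probabilities, and function names are all hypothetical.

```python
import numpy as np

# Illustrative sketch: context as a prior that reweights a context-free
# classifier's posterior over affect classes. All names and numbers here
# are made up for illustration.

AFFECTS = ["joy", "anger", "fear", "sadness"]

def contextual_posterior(classifier_probs, context_prior):
    """Multiply classifier output by a context prior and renormalise."""
    p = np.asarray(classifier_probs) * np.asarray(context_prior)
    return p / p.sum()

# The context-free classifier is torn between joy and fear...
probs = [0.40, 0.10, 0.40, 0.10]
# ...but the context (say, watching a horror film) favours fear.
prior = [0.10, 0.10, 0.70, 0.10]
p = contextual_posterior(probs, prior)
print(AFFECTS[int(np.argmax(p))])  # fear
```

The same multiply-and-renormalise step could sit on top of any existing affect recogniser that outputs class probabilities.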
-----------------------------
Workshop Objectives
-----------------------------
Context-based affect analysis remains a largely unexplored area of
automatic affect recognition, given the difficulty of modeling contextual variables and
of introducing them into the classification process. Unconsciously, humans
evaluate situations based on environmental and social parameters when recognizing
emotions in social interactions. Contextual information helps us interpret and
respond to social interactions.
The purpose of the workshop is to explore the benefits and drawbacks of
integrating context into affect production, interpretation and recognition. We
wish to investigate what methodologies can be applied to include contextual
information in emotion corpora, how it ought to be represented, what contextual
information is relevant (i.e. is it domain specific or not?), and how it can
improve the performance of existing frameworks for affect recognition.
The workshop is relevant in the study of naturalistic social
interactions since contextual information cannot be discounted in doing
automatic analysis of human behavior. Embedding contextual information, such as
culture, provides a different flavor to each interaction, and makes for an
interesting scientific study. Such kinds of analysis lead us to consider
real-world parameters and complexities in affect recognition, especially in
developing human-centric systems.
For the workshop we invite scientists
working in the related areas of affective computing, ambient computing, machine
learning, psychology and cognitive behavior to share their expertise and
achievements in the emerging field of automatic, context-based affect
analysis and recognition.
-----------------------------
Workshop Topics
-----------------------------
We invite new and unpublished papers on, but not limited to, the
following topics:
· Context source detection
· Context interpretation and analysis
· Context-based affect production
· Context-based facial affect recognition
· Context-based vocal affect recognition
· Context-based gesture affect recognition
· Context-based multimodal fusion
· Applications (context-related affect applications)
For details concerning the workshop program, paper submission guidelines, etc. please visit our workshop website at: http://contextbasedaffectrecog.blogspot.com/
Best regards,
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
A position is now available for a Postdoctoral Research Associate to investigate gaze perception and adaptation as part of a scientific team based within the School of Psychology at the University of Sydney. The position is funded by a Discovery Project from the Australian Research Council (ARC) awarded to Professor Colin Clifford in collaboration with Dr Andy Calder from the MRC Cognition and Brain Sciences Unit, Cambridge, UK.
Full details can be found by copying and pasting the following into your web browser:
http://usyd.nga.net.au/cp/index.cfm?event=jobs.checkJobDetailsNewApplicatio…
--
COLIN W G CLIFFORD | Professorial Research Fellow
School of Psychology | Faculty of Science
THE UNIVERSITY OF SYDNEY
T +61 2 9351 6810 | F +61 2 9351 2603
--
The Sunday Times Scottish University of the Year 2009/2010
The University of Stirling is a charity registered in Scotland,
number SC 011159.
Hi everyone,
does anyone know of a collection of Iraqi faces?
Thanks,
Rachel
--
You make a living by what you get; you make a life by what you give.
-Winston Churchill.
For anyone interested in face perception within / between developmental disorders:
You, or someone you know, might be interested in the following seminar information, therefore please distribute this information as widely as possible
Thanks
Debbie
Seminar Title: Developmental Disorders: co-morbidity, subgroups and variability
On Friday 29th June 2012, Kingston University is hosting a one-day workshop discussing co-morbidity, variability and sub-groups within neurodevelopmental disorders.
This workshop is part of a seminar series entitled 'Neuro-developmental disorders: Exploring sensitive methods of assessment across development' which explores recent findings in neurodevelopmental disorders, with a particular focus on 1) the new research tools and methods used, 2) discussion of the wider applicability of these new tools and methods across different neurodevelopmental disorders, 3) identifying future challenges or controversies when studying neurodevelopmental disorders using a developmental approach.
The seminar series hopes to bring together specialists and established researchers as well as post-graduates, post-doctoral researchers and early career researchers in neurodevelopmental disorders.
The series is sponsored by the British Psychological Society and the Williams Syndrome Foundation UK and is being organised by Dr Jo van Herwegen, Dr Emily Farran and Dr Debbie Riby. In total three seminars will be held around the UK between June 2012 and April 2013.
More information about the seminar series can be obtained from:
http://www.neurodevelopmentaldisorders-seminarseries.co.uk/
Dr Debbie Riby
School of Psychology
Newcastle University
Ridley Building 1
Framlington Place
Newcastle upon Tyne
NE1 7RU
deborah.riby(a)ncl.ac.uk
http://www.ncl.ac.uk/psychology/staff/profile/deborah.riby#tab_profile
Dear all, over the years I have tried various face adaptation experiments, with mostly incoherent results, and watched puzzled as everyone else publishes neat little findings. I began to suspect strong temporal effects and we have finally managed to publish some results demonstrating this with adaptation to antifaces: almost all the effect derives from the first few trials; it is as if whatever is adapting gets 'tired' after that. It may be that others have picked up on this already but I figured it might be useful to draw attention to the finding in case there are those who, like me, are baffled by otherwise strange results.
http://www.frontiersin.org/perception_science/10.3389/fpsyg.2012.00019/abst…
Peter
Peter Hancock
Professor,
Deputy Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/phancock
Dear all, does any of you know of a stimulus database where I could get a larger number (in the range of dozens) of morphed MONKEY (Macaca mulatta is best) face pairs?
thanks a lot
Gyula Kovacs
---
Gyula Kovacs
Dept.Cognitive Sciences, Prof.
Budapest Univ. Technology and Economics
Hungary H-1111 Stoczek u 3 III.318
T:0036-1463-1176
F:00361463-1072
---------------
Inst. Psychology
Univ. Regensburg
Gebäude PT, Zi. 4.0.35
Universitätsstraße 31,
93053 Regensburg, D
T:0049-941-943-3852;
Mobile:0049178 1725506
Fax:0049 941 943 3233
Hi everyone,
I am conducting a study on gaze direction and I need full front and 3/4 profile pictures of cars (control stimuli).
Does anybody know where I can find such stimuli ?
Thanks for your help
Benoît Montalan
___
Post-doctorant
Laboratoire de Neurosciences Cognitives
Social Group
Ecole Normale Supérieure
Département d'Études Cognitives
29, rue d'Ulm
75005 Paris, France
http://www.grezes.ens.fr/people.php
http://sites.google.com/site/benoitmontalanperso/home
RESEARCH ASSOCIATE/RESEARCH ASSISTANT PROFESSOR (REF: 3915)
ARC Centre of Excellence in Cognition and its Disorders (CCD)
School of Psychology, University of Western Australia
• 2 year appointment
• Salary range: Level A $56,983 - $77,328 p.a. - minimum starting salary for appointee with PhD will be $77,328 p.a.
• Salary range: Level B $81,400 - $96,663 p.a.
• Plus 17% superannuation
• Level of appointment is dependent on qualifications and experience
• Closing date: Friday, 30 March 2012
We are seeking a highly motivated early-career researcher to join the Person Perception Program (http://www.ccd.edu.au/research/personperception/) and the CCD in the School of Psychology. This position provides an opportunity to work with Associate Professor Romina Palermo (https://sites.google.com/site/drrominapalermo/) and Winthrop Professor Gillian Rhodes (https://www.socrates.uwa.edu.au/S) in their labs. The appointee will conduct independent and collaborative research to investigate the mechanisms (perceptual, cognitive, neural, evolutionary) underlying person perception (faces, bodies, voices). The successful candidate will have an excellent publication record, theoretical knowledge of current issues in person perception, experience designing and conducting experiments, strong quantitative skills and excellent communication skills.
For further information regarding the position please contact Associate Professor Romina Palermo on +61 8 6488 3256 or email romina.palermo(a)uwa.edu.au.
The position description is attached.
Best wishes,
Romina
In response to requests to be able to replicate the stimuli from our
study:
Chen Zhao, Peggy Series, Peter J. B. Hancock, and James
A. Bednar. Similar neural adaptation mechanisms underlying face
gender and tilt aftereffects. Vision Research, 51(18):2021-2030, 2011.
http://dx.doi.org/10.1016/j.visres.2011.07.014
we have made them freely available at:
http://homepages.inf.ed.ac.uk/jbednar/stimuli/
The stimulus set includes 1,612 PNG-format color images generated
along an image morph continuum between the average male face and the
average female face from a large database of feature-tagged face
images; see the paper for the full details.
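For readers who want a quick feel for what an image morph continuum is, here is a minimal sketch: a pixelwise cross-dissolve between two aligned images. Note that this is an editor's simplification for illustration only; the actual stimuli were generated from feature-tagged faces with proper shape morphing, as described in the paper, and the function and toy arrays below are hypothetical.

```python
import numpy as np

# Illustrative sketch only: a pixelwise cross-dissolve between two
# aligned images. Real face morphs also warp feature geometry.

def morph_continuum(face_a, face_b, n_steps):
    """Return n_steps images interpolating linearly from face_a to face_b."""
    a = np.asarray(face_a, dtype=float)
    b = np.asarray(face_b, dtype=float)
    return [(1 - t) * a + t * b for t in np.linspace(0.0, 1.0, n_steps)]

# Toy 2x2 'images': the endpoints are recovered at t=0 and t=1.
male = np.zeros((2, 2))
female = np.full((2, 2), 255.0)
frames = morph_continuum(male, female, 5)
print(frames[2])  # midpoint frame: every pixel 127.5
```

A real continuum of 1,612 steps would simply use a finer `np.linspace`, with each frame saved out as a PNG.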
Chen Zhao, Peggy Series, Peter J. B. Hancock, and James A. Bednar
--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
I'm forwarding this to face research list in the hope that there may be one or two modellers out there who would be interested.
Peter
From: Julien Mayor [mailto:Julien.Mayor@unige.ch]
Sent: 09 February 2012 09:32
To: Julien.Mayor(a)unige.ch
Subject: NCPW13 announcement
Dear colleague,
We cordially invite you to participate in the Thirteenth Neural Computation and Psychology Workshop (NCPW13), to be held in San Sebastian (Spain) from July 12-14, 2012: http://www.bcbl.eu/events/ncpw13
This well-established and lively workshop aims to bring together researchers from different disciplines, such as artificial intelligence, cognitive science, computer science, neurobiology, philosophy and psychology, to discuss their work on models of cognitive processes. Previous themes have encompassed categorisation, language, memory, development, and action. There will be no specific theme, but papers must be about emergent models -- frequently, but not necessarily, of the connectionist/neural network kind -- applied to cognition. These workshops have always been characterised by their limited size, high-quality papers, the absence of parallel talk sessions, and a schedule explicitly designed to encourage interaction among the researchers present in an informal setting.
Furthermore, this workshop will feature a unique set of invited speakers:
* Mark Seidenberg. University of Wisconsin-Madison, USA.
* Jeffrey Elman. University of California, San Diego. USA.
* Randall C. O'Reilly. University of Colorado. USA.
* Kim Plunkett. University of Oxford, UK.
Important dates to remember:
Abstract deadline: March 31st, 2012
Notification of abstract acceptance: May 1st, 2012
Early registration deadline: June 1st, 2012
Online registration deadline: July 1st, 2012
Conference dates: July 12 - 14, 2012
Looking forward to your participation,
Julien Mayor
---
Julien Mayor
University of Geneva
40 Bd Pont d'Arve
1205 Genève
Tel: +41 (0)22 3798150
http://www.unige.ch/fapse/psycholinguistique/model.html
Hi everyone:
My name is Esther and I'm a postdoc fellow at the University of British Columbia.
We are conducting a face perception study and we need Asian faces.
Does anybody know where I can find Asian faces of old and young men and women, frontal views, neutral expression?
Thanks for your help
Esther
This came from Visionlist, but for anyone who is not on that list I thought
this might be useful information.
Rachel
Date: Tue, 17 Jan 2012 16:16:05 -0500
From: "Yun (Raymond) Fu" <yunfu(a)buffalo.edu>
Subject: [visionlist] A New Image Database: UB KinFace Database
To: visionlist(a)visionscience.com
Message-ID: <4F15E515.9010206(a)buffalo.edu>
Content-Type: text/plain; charset="iso-8859-1"; Format="flowed"
A new image database, the UB KinFace Database, is now available online.
http://www.cse.buffalo.edu/~yunfu/research/Kinface/Kinface.htm
The UB KinFace database is used to develop, test, and evaluate kinship
verification and recognition algorithms. It comprises 600 images of 400
people, which can be separated into 200 groups. Each group is composed of
child, young-parent and old-parent images. Most of the images in the
database are real-world collections of public figures (celebrities and
politicians) from the Internet.
--
Yun (Raymond) Fu
Dr. and Assistant Professor
Department of Computer Science and Engineering
State University of New York (SUNY) at Buffalo
331 Davis Hall Buffalo, NY 14260-2500, USA
Ph: +1 (716) 645 2670
Email:yunfu@buffalo.edu
Web:http://www.cse.buffalo.edu/~yunfu/
Hi everyone,
I'm looking for some Mooney Face stimuli to use in an experiment.
If anyone has any that I might be able to use, or could perhaps point me in the right direction, I would be extremely grateful.
Many thanks in advance
Hayley
Dr Hayley Ness
Psychology Lecturer
Dept of Psychology
Open University
Walton Hall,
Milton Keynes
MK7 6AA
h.ness(a)open.ac.uk
01908 653 557
--
The Open University is incorporated by Royal Charter (RC 000391), an exempt charity in England & Wales and a charity registered in Scotland (SC 038302).
Hello everyone,
Happy new year!
Hard times lead me to kick off the new year with a bit of shameful
self-promotion of a new book that may be of interest to some of you:
Bruce, V. and Young, A. Face perception. Hove: Psychology Press, 2012,
496 pages, ISBN 978-1-84169-878-6.
Further details and a sample chapter are available via the publisher's web
site:
http://www.cognitivepsychologyarena.com/face-perception-9781841698786
It can be ordered via the publisher or on Amazon.
We originally intended this as an update of our previous book (Bruce and
Young, 1998, In the eye of the beholder, OUP), but the literature has
moved on so much that in the end we were only able to retain a chunk of
chapter 1 and some other bits and pieces - the rest is mostly new.
We have tried to take a broad perspective on face perception and range
fairly widely in terms of sources of evidence. However, there is now such
a wealth of studies that many had to be left out. I can only apologise to
those whose work may not have received the amount of coverage it deserved.
A number of difficult choices had to be made.
Cheers,
Andy Young.
Hi all,
On behalf of Susann Fiedler (apologies for cross-posting):
Dear fellow psychologists,
To maintain high scientific standards in our field, it is extremely
important that we regularly take a step back and assess the research,
data analysis, reporting, and publication practices that represent the
current standards in psychology. You may be aware of the debate going on
in the psychological community regarding a recent article that addressed
the question of whether current practices should be revised (Simmons et
al., 2011, Psychological Science). The article has led to diverse
reactions among psychologists; and there is much speculation concerning
the popular opinion of "psychologists as a whole" towards the
recommendations put forth by the article.
In cooperation with the Open Science Framework
(http://openscienceframework.org/), we are conducting a worldwide survey
of psychologists to determine the extent to which the recommendations in
the above mentioned article are supported by the psychological
community. Everyone's opinion is important; and it is not necessary that
you have read the article in order to participate.
Share your opinion about psychological research practices and the
quality of publications in psychological journals by participating in
this short survey!
The survey takes approximately 10 minutes and is completely anonymous.
Participants may also enter into a raffle to win one of three $50 Amazon
gift certificates.
Link to survey: http://ww3.unipark.de/uc/extern/9afd/
Please forward this e-mail to your colleagues.
Sincerely,
Susann Fiedler
Max Planck Institute for Research on Collective Goods
Kurt-Schumacher-Str. 10
53113 Bonn
Germany
--
Jessica Komes, M.Sc.
Department of General Psychology and Cognitive Neuroscience
and DFG Research Unit Person Perception
Friedrich Schiller University of Jena
Am Steiger 3, Haus 1
07743 Jena
Germany
Phone: +49 (0)3641 945934
E-Mail: jessica.komes(a)uni-jena.de
http://www2.uni-jena.de/svw/Allgpsy1/jessica.htm
Hi all -- Face applicants for the below position would be especially
welcome : )
Also note we have another faculty position available in Abnormal/Clinical
Psychology (http://jobs.anu.edu.au/PositionDetail.aspx?p=2445) for
applicants who come in pairs.
Lecturer / Senior Lecturer in Cognitive Psychology, Australian
National University
Duration of contract: Permanent. Grade Level: B-C.
Salary Package: AUD$80,166 - $108,000 pa plus 17% superannuation
(Australian dollar approx equal to US dollar).
Closing Date: 8 January 2012
The Department of Psychology seeks a productive and enthusiastic
scholar in any area of human cognition, to pursue creative original
research, supervise honours and postgraduate research students, and
contribute to teaching cognitive psychology to undergraduate students.
ANU is Australia's top research university and was ranked 26th in the
world in the most recent QS World University rankings.
The Department has cognition research strengths in face recognition,
attention, language, reading and dyslexia, neuropsychology, cognitive
aging, cognitive development, visual cognition, and decision making.
Facilities include space for behavioural studies, eye movement
equipment, EEG, and neurostimulation (TMS, tDCS). This is a continuing
academic position at the Lecturer/Senior Lecturer level (loosely
equivalent to Assistant/Associate Professor in North America). The
research ethos within the Department is highly valued and
enthusiastically encouraged. If you think that your background and
skills fit this role, we welcome your application to join our team.
Details: http://jobs.anu.edu.au/PositionDetail.aspx?p=2444
Enquiries: Professor Don Byrne, T: +61 2 6125 3974 E: Don.Byrne(a)anu.edu.au
---
Professor Elinor McKone, PhD
Queen Elizabeth II Fellow
Department of Psychology
Australian National University
ACT 0200 Australia
ph: +61 2 6125 2822
fax: +61 2 6125 0499
email: elinor.mckone(a)anu.edu.au
As I recall Alice O'Toole has a set of these that are naturally evoked by films, but I don't know if she gives these out. Naturally evoked expressions have different dynamics than posed ones.
Alice's email is otoole(a)utdallas.edu
g.
On Nov 24, 2011, at 4:00 AM, face-research-list-request(a)lists.stir.ac.uk wrote:
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. Re: Face-research-list Digest, Vol 10, Issue 5 (Michael J. Tarr)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 23 Nov 2011 09:53:57 -0500
> From: "Michael J. Tarr" <michaeltarr(a)cmu.edu>
> Subject: Re: [Face-research-list] Face-research-list Digest, Vol 10,
> Issue 5
> To: face-research-list(a)lists.stir.ac.uk
> Message-ID: <4ECD0905.9060408(a)cmu.edu>
> Content-Type: text/plain; charset="iso-8859-1"; Format="flowed"
>
> I have lots (200+) of many different races of people expressing disgust,
> fear, sadness, and laughing. These are in standard def video, which
> means the faces are relatively small (like 200x200). I am happy to
> figure out a way to put these videos on-line if people feel they are
> useful. The static images are available at face-place.org
>
> -mike tarr
>
>
>> face-research-list-request(a)lists.stir.ac.uk
>> <mailto:face-research-list-request@lists.stir.ac.uk>
>> November 22, 2011 7:00 AM
>> Send Face-research-list mailing list submissions to
>> face-research-list(a)lists.stir.ac.uk
>>
>> To subscribe or unsubscribe via the World Wide Web, visit
>> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
>> or, via email, send a message with subject or body 'help' to
>> face-research-list-request(a)lists.stir.ac.uk
>>
>> You can reach the person managing the list at
>> face-research-list-owner(a)lists.stir.ac.uk
>>
>> When replying, please edit your Subject line so it is more specific
>> than "Re: Contents of Face-research-list digest..."
>>
>>
>> Today's Topics:
>>
>> 1. Set of moving faces displaying facial expressions of emotion
>> (Whitaker, Lydia)
>> 2. Re: Set of moving faces displaying facial expressions of
>> emotion (Chris Benton)
>> 3. Re: Set of moving faces displaying facial expressions of
>> emotion (Micha? Olszanowski)
>>
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Mon, 21 Nov 2011 12:11:10 +0000
>> From: "Whitaker, Lydia" <lwhita(a)essex.ac.uk>
>> Subject: [Face-research-list] Set of moving faces displaying facial
>> expressions of emotion
>> To: face-research-list Mailing List
>> <face-research-list(a)lists.stir.ac.uk>
>> Message-ID: <CAEFF05E.1E1C%lwhita(a)essex.ac.uk>
>> Content-Type: text/plain; charset="us-ascii"
>>
>> Dear all,
>>
>> My name is Lydia Whitaker and I am a PhD student at the University of
>> Essex. I wondered if anyone knows of or has access to a stimuli set of
>> moving faces that are displaying facial expressions of emotion? If so
>> would anyone be willing to let me have access to their stimuli set or
>> point me in the right direction?
>>
>> Many thanks for any help you can give me.
>>
>>
>> Lydia Whitaker
>>
>>
>>
>>
>>
>> ------------------------------
>>
>> Message: 2
>> Date: Mon, 21 Nov 2011 13:57:41 +0000
>> From: Chris Benton <chris.benton(a)bristol.ac.uk>
>> Subject: Re: [Face-research-list] Set of moving faces displaying
>> facial expressions of emotion
>> To: face-research-list(a)lists.stir.ac.uk
>> Message-ID: <4ECA58D5.6050108(a)bristol.ac.uk>
>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>
>> Hi Lydia,
>>
>> I've got a bunch of moving faces - various expressions taken
>> simultaneously from 5 different angles at 25 Hz. You're welcome to a copy.
>>
>> There's a poster describing the image set at
>>
>> http://seis.bris.ac.uk/~pscpb/vss2007_poster.pdf
>>
>> cheers, chris
>>
>> ---
>> Dr Chris Benton
>> Admissions Tutor, Experimental Psychology
>> http://www.bris.ac.uk/expsych/people/academic/chrisbenton.html
>>
>>
>>
>>
>>
>>
>> ------------------------------
>>
>> Message: 3
>> Date: Tue, 22 Nov 2011 11:41:32 +0100
>> From: Micha? Olszanowski <molszanowski(a)swps.edu.pl>
>> Subject: Re: [Face-research-list] Set of moving faces displaying
>> facial expressions of emotion
>> To: "'Whitaker, Lydia'" <lwhita(a)essex.ac.uk>, "'face-research-list
>> Mailing List'" <face-research-list(a)lists.stir.ac.uk>
>> Message-ID: <00f801cca903$50056a70$f0103f50$(a)edu.pl>
>> Content-Type: text/plain; charset="iso-8859-2"
>>
>> Hi,
>>
>> Some time ago we've created Warsaw Set of Dynamic Facial Expression -
>> based
>> on Warsaw Set of Emotional Facial Expression Pictures (WSEFEP). Movies are
>> morphed picture frames, captured during photo sessions (photo camera was
>> taking 5 frames per second, so basically we used around 6-7frames to
>> create
>> movie). As far as I remember there are 4 displayers, each with 5
>> expressions
>> (anger, joy, disgust, fear, surprise).
>>
>> Here you can download a sample: www.emotional-face.org/mov/sample.avi. If
>> you'd need more please contact me, so I'll send you the rest.
>>
>> Regards,
>> Michal Olszanowski
>>
>>
>>
>> **********
>> Michal Olszanowski, PhD.
>> Warsaw School of Social Sciences & Humanities
>> Faculty of Psychology, Cognitive Psychology Department
>> Chodakowska Street 19/31, PL - 03815 Warsaw
>> www.swps.pl, www.emotional-face.org
>>
>>
>>
>> -----Original Message-----
>> From: face-research-list-bounces(a)lists.stir.ac.uk
>> [mailto:face-research-list-bounces@lists.stir.ac.uk] On Behalf Of
>> Whitaker,
>> Lydia
>> Sent: Monday, November 21, 2011 1:11 PM
>> To: face-research-list Mailing List
>> Subject: [Face-research-list] Set of moving faces displaying facial
>> expressions of emotion
>>
>> Dear all,
>>
>> My name is Lydia Whitaker and I am a PhD student at the University of
>> Essex.
>> I wondered if anyone knows of or has access to a stimuli set of moving
>> faces
>> that are displaying facial expressions of emotion? If so would anyone be
>> willing to let me have access to their stimuli set or point me in the
>> right
>> direction?
>>
>> Many thanks for any help you can give me.
>>
>>
>> Lydia Whitaker
>>
>>
>>
>> _______________________________________________
>> Face-research-list mailing list
>> Face-research-list(a)lists.stir.ac.uk
>> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
>>
>>
>>
>>
>>
>>
>> ------------------------------
>>
>> _______________________________________________
>> Face-research-list mailing list
>> Face-research-list(a)lists.stir.ac.uk
>> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
>>
>>
>> End of Face-research-list Digest, Vol 10, Issue 5
>> *************************************************
>>
>
>
>
I have lots (200+) of many different races of people expressing disgust,
fear, sadness, and laughing. These are in standard def video, which
means the faces are relatively small (like 200x200). I am happy to
figure out a way to put these videos on-line if people feel they are
useful. The static images are available at face-place.org
-mike tarr
> face-research-list-request(a)lists.stir.ac.uk
> <mailto:face-research-list-request@lists.stir.ac.uk>
> November 22, 2011 7:00 AM
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. Set of moving faces displaying facial expressions of emotion
> (Whitaker, Lydia)
> 2. Re: Set of moving faces displaying facial expressions of
> emotion (Chris Benton)
> 3. Re: Set of moving faces displaying facial expressions of
> emotion (Micha? Olszanowski)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 21 Nov 2011 12:11:10 +0000
> From: "Whitaker, Lydia" <lwhita(a)essex.ac.uk>
> Subject: [Face-research-list] Set of moving faces displaying facial
> expressions of emotion
> To: face-research-list Mailing List
> <face-research-list(a)lists.stir.ac.uk>
> Message-ID: <CAEFF05E.1E1C%lwhita(a)essex.ac.uk>
> Content-Type: text/plain; charset="us-ascii"
>
> Dear all,
>
> My name is Lydia Whitaker and I am a PhD student at the University of
> Essex. I wondered if anyone knows of or has access to a stimuli set of
> moving faces that are displaying facial expressions of emotion? If so
> would anyone be willing to let me have access to their stimuli set or
> point me in the right direction?
>
> Many thanks for any help you can give me.
>
>
> Lydia Whitaker
>
>
>
>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 21 Nov 2011 13:57:41 +0000
> From: Chris Benton <chris.benton(a)bristol.ac.uk>
> Subject: Re: [Face-research-list] Set of moving faces displaying
> facial expressions of emotion
> To: face-research-list(a)lists.stir.ac.uk
> Message-ID: <4ECA58D5.6050108(a)bristol.ac.uk>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> Hi Lydia,
>
> I've got a bunch of moving faces - various expressions taken
> simultaneously from 5 different angles at 25 Hz. You're welcome to a copy.
>
> There's a poster describing the image set at
>
> http://seis.bris.ac.uk/~pscpb/vss2007_poster.pdf
>
> cheers, chris
>
> ---
> Dr Chris Benton
> Admissions Tutor, Experimental Psychology
> http://www.bris.ac.uk/expsych/people/academic/chrisbenton.html
>
>
>
>
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 22 Nov 2011 11:41:32 +0100
> From: Michał Olszanowski <molszanowski(a)swps.edu.pl>
> Subject: Re: [Face-research-list] Set of moving faces displaying
> facial expressions of emotion
> To: "'Whitaker, Lydia'" <lwhita(a)essex.ac.uk>, "'face-research-list
> Mailing List'" <face-research-list(a)lists.stir.ac.uk>
> Message-ID: <00f801cca903$50056a70$f0103f50$(a)edu.pl>
> Content-Type: text/plain; charset="iso-8859-2"
>
> Hi,
>
> Some time ago we created the Warsaw Set of Dynamic Facial Expressions,
> based on the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP).
> The movies are morphed picture frames captured during the photo sessions
> (the camera took 5 frames per second, so we used around 6-7 frames to
> create each movie). As far as I remember there are 4 displayers, each
> with 5 expressions (anger, joy, disgust, fear, surprise).
>
> Here you can download a sample: www.emotional-face.org/mov/sample.avi. If
> you need more, please contact me and I'll send you the rest.
>
> Regards,
> Michal Olszanowski
>
>
>
> **********
> Michal Olszanowski, PhD.
> Warsaw School of Social Sciences & Humanities
> Faculty of Psychology, Cognitive Psychology Department
> Chodakowska Street 19/31, PL - 03815 Warsaw
> www.swps.pl, www.emotional-face.org
>
>
>
> -----Original Message-----
> From: face-research-list-bounces(a)lists.stir.ac.uk
> [mailto:face-research-list-bounces@lists.stir.ac.uk] On Behalf Of
> Whitaker,
> Lydia
> Sent: Monday, November 21, 2011 1:11 PM
> To: face-research-list Mailing List
> Subject: [Face-research-list] Set of moving faces displaying facial
> expressions of emotion
>
> Dear all,
>
> My name is Lydia Whitaker and I am a PhD student at the University of
> Essex. I wondered if anyone knows of or has access to a stimulus set of
> moving faces displaying facial expressions of emotion? If so, would
> anyone be willing to let me have access to their stimulus set or point me
> in the right direction?
>
> Many thanks for any help you can give me.
>
>
> Lydia Whitaker
>
>
>
> _______________________________________________
> Face-research-list mailing list
> Face-research-list(a)lists.stir.ac.uk
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
>
>
>
>
>
>
> ------------------------------
>
> _______________________________________________
> Face-research-list mailing list
> Face-research-list(a)lists.stir.ac.uk
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
>
>
> End of Face-research-list Digest, Vol 10, Issue 5
> *************************************************
>
Dear Colleagues,
We are currently taking applications for a postdoctoral Research Fellow (2-year post) and would appreciate it if you could forward the advertisement below to any researchers you know who might be interested.
Kind regards,
Lisa DeBruine and Ben Jones
-----------------------------------------------------------------------
The Face Research Lab (http://facelab.org) is seeking applications for a postdoctoral Research Fellow for a 2-year ESRC-funded position starting 1 October 2011. The Research Fellow will be responsible for conducting a longitudinal study of mate choice and face preferences and will be supervised by Dr Lisa DeBruine and Prof Benedict Jones.
Criteria
* PhD in Psychology or a cognate discipline.
* A proven track record of research and publication.
* Experience of conducting large-scale laboratory research.
* Background in social cognition, evolutionary theories of behaviour, and/or mate preference/choice.
* Expertise in using Excel and SPSS (or equivalent).
* Experience using Psychomorph to manipulate faces is desirable.
* Ability to work as part of a team.
* Good IT and communication skills (both written and oral).
* Ability to think and work independently.
* Ability to manage long-term research projects with large numbers of participants.
* Willingness to travel to conferences, including air travel.
Please see http://abdn.ac.uk/jobs (reference: 1201349) to apply.
Online applications are due 31 August 2011.
Contact Lisa DeBruine at l.debruine(a)abdn.ac.uk for further information.
Peter Hancock
Professor
Acting Head of Psychology,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://www.psychology.stir.ac.uk/staff/staff-profiles/academic-staff/peter-…
--
The Sunday Times Scottish University of the Year 2009/2010
The University of Stirling is a charity registered in Scotland,
number SC 011159.
Good luck Robbie :)
At my place, every month is MOvember, hehe.
Regards,
Hugh
On Fri, Nov 11, 2011 at 11:00 PM, <face-research-list-request(a)lists.stir.ac.uk> wrote:
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. Movember (Cooper, Robbie)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Thu, 10 Nov 2011 15:19:22 +0000
> From: "Cooper, Robbie" <R.Cooper(a)napier.ac.uk>
> Subject: [Face-research-list] Movember
> To: "Face-research-list(a)lists.stir.ac.uk"
> <Face-research-list(a)lists.stir.ac.uk>
> Message-ID:
> <9DCFABEB349E1A4DB2AD36B765897682014B2FB9941A(a)E2K7MBX.napier-mail.napier.ac.uk>
>
> Content-Type: text/plain; charset="us-ascii"
>
> Hello fellow face fanciers,
>
> I reach your inbox with an unashamed moment of self-promotion in the name of a good cause. Some of you may be aware that in the last few years November has become a time when men, previously frightened of the power of a fine moustache (mustache if you are 'stateside'), attempt to cultivate something on their top lip in order to raise awareness of cancers that are specific to men (e.g. prostate cancer, testicular cancer). This has been dubbed 'Movember'.
>
> I give you here a link to my Movember site where you can find out more information and if you are feeling particularly generous you might even leave a donation for this worthy charity.
>
> http://mobro.co/RobbieCooper
>
> I have had my own battle with cancer in the last few months and I urge you to arm yourselves with knowledge. Knowledge is power brothers and sisters. We need to look after one another.
>
> Best wishes,
> Robbie
>
> --------------------------------------------
> Dr Robbie Cooper
> Lecturer in Psychology
> School of Life, Sport and Social Sciences
> Edinburgh Napier University
> Sighthill Campus
> Sighthill Loan
> Edinburgh, UK.
> EH11 4BN
> ph: +44 (0)131 455 6481
> r.cooper(a)napier.ac.uk
>
> Staff Webpage: http://www.napier.ac.uk/fhlss/SLSSS/Staff/Pages/Robbie_Cooper.aspx
>
>
>
> Edinburgh Napier University is one of Scotland's top universities for graduate employability. 93.2% of graduates are in work or further study within six months of leaving. The university is also proud winner of the Queen's Anniversary Prize for Higher and Further Education 2009, awarded for innovative housing construction for environmental benefit and quality of life.
>
> This message is intended for the addressee(s) only
> and should not be read, copied or disclosed to anyone else outwith the University without the permission of the sender. It is your responsibility to ensure that this message and any attachments are scanned for viruses or other defects.
> Edinburgh Napier University does not accept liability for any loss or
> damage which may result from this email or any attachment, or for errors or omissions arising after it was sent. Email is not a secure medium. Email entering the University's system is subject to routine monitoring and filtering by the University.
>
> Edinburgh Napier University is a registered Scottish
> charity.
> Registration number SC018373
>
>
>
>
face-research-list-request(a)lists.stir.ac.uk wrote:
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. Stimulus request (Gavin Perry)
> 2. Re: Stimulus request (Etienne B. Roesch)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 1 Nov 2011 11:47:13 +0000
> From: Gavin Perry <perry_gavin(a)hotmail.com>
> Subject: [Face-research-list] Stimulus request
> To: <face-research-list(a)lists.stir.ac.uk>
> Message-ID: <BAY162-W56D2626543CBFFA9E8DECA8BD70(a)phx.gbl>
> Content-Type: text/plain; charset="iso-8859-1"
>
>
> Dear all,
>
> I was wondering if anyone would be able to help me out with some stimuli I'm looking for. I'm after a morph continuum of faces from an average female face to an average male face, so that I can present faces with varying degrees of male/female-ness. Does anyone have such a set that they are willing to share, or know of a set that is publicly available?
>
> Thanks in advance,
>
> Dr Gavin Perry
>
> CUBRIC
> School of Psychology
> Cardiff University
> 60 Park Place
> Cardiff
> CF10 3AT
>
>
>
I am very pleased to announce the publication of the following paper
in Vision Research:
Chen Zhao, Peggy Seriès, Peter J. B. Hancock, and James A. Bednar.
Similar neural adaptation mechanisms underlying face gender
and tilt aftereffects.
Vision Research, 51(18):2021-2030, 2011.
http://dx.doi.org/10.1016/j.visres.2011.07.014
This paper may be of interest to both computational and psychophysical
researchers, because it shows how computational models developed for
low-level vision (oriented lines) can help explain higher visual
function (face gender perception). Specifically, we found that models
based on the primary visual cortex successfully predicted previously
unknown and important aspects of face gender perception. These
results support the idea that higher vision uses mechanisms similar to
those of early vision, and they conflict with prevailing theories of face
perception that rely on norm-based encoding.
Additional details and related models are available in Roger's
recently completed PhD:
http://homepages.inf.ed.ac.uk/jbednar/papers/zhao.phd11.pdf
We are very interested in hearing feedback about this work,
particularly from those working on norm-based theories of higher
visual perception.
Jim
Dr. James A. Bednar
Institute for Adaptive and Neural Computation
University of Edinburgh, UK
http://homepages.inf.ed.ac.uk/jbednar
--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
Hi all,
Given that the Cambridge Face Memory Test (CFMT) has been very widely
used, I thought some of you may be interested to know that we have
just published a paper with young adult norms for a new non-face
object memory test (the CCMT) with the same format as the CFMT.
As with the CFMT, the CCMT has a range suitable for investigating
individual differences in the normal adult population.
You can access the paper at:
http://www.springerlink.com/content/c36317587643qt37/
I have also included the abstract below.
Regards,
Hugh Dennett
--
The Cambridge Car Memory Test: A task matched in format to the
Cambridge Face Memory Test, with norms, reliability, sex differences,
dissociations from face memory, and expertise effects
Hugh W. Dennett, Elinor McKone, Raka Tavashmi, Ashleigh Hall,
Madeleine Pidcock, Mark Edwards and Bradley Duchaine
Behavior Research Methods
DOI: 10.3758/s13428-011-0160-2
Many research questions require a within-class object recognition task
matched for general cognitive requirements with a face recognition
task. If the object task also has high internal reliability, it can
improve accuracy and power in group analyses (e.g., mean inversion
effects for faces vs. objects), individual-difference studies (e.g.,
correlations between certain perceptual abilities and face/object
recognition), and case studies in neuropsychology (e.g., whether a
prosopagnosic shows a face-specific or object-general deficit). Here,
we present such a task. Our Cambridge Car Memory Test (CCMT) was
matched in format to the established Cambridge Face Memory Test,
requiring recognition of exemplars across view and lighting change. We
tested 153 young adults (93 female). Results showed high reliability
(Cronbach's alpha = .84) and a range of scores suitable both for
normal-range individual-difference studies and, potentially, for
diagnosis of impairment. The mean for males was much higher than the
mean for females. We demonstrate independence between face memory and
car memory (dissociation based on sex, plus a modest correlation
between the two), including where participants have high relative
expertise with cars. We also show that expertise with real car makes
and models of the era used in the test significantly predicts CCMT
performance. Surprisingly, however, regression analyses imply that
there is an effect of sex per se on the CCMT that is not attributable
to a stereotypical male advantage in car expertise.
--
Hugh Dennett
Ph.D. Candidate
Department of Psychology
The Australian National University
Canberra ACT 0200
E: hugh.dennett(a)anu.edu.au
T: +61 2 6125 2716
W: http://psychology.anu.edu.au/_people/people_details.asp?recId=177