This is the template my lab uses to address some problems that the standard template has around the nose, ears and neck.
[Two attached images]
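In case it helps to see what the averaging step does with those delineation points, here is a minimal illustrative sketch (Python with NumPy and scikit-image; the function name average_faces and the use of PiecewiseAffineTransform are my own choices for illustration, not Psychomorph's internals): the landmark templates are averaged into a mean shape, each face is warped to that shape, and the warped pixel values are averaged.

```python
# Minimal sketch of landmark-based face averaging (illustrative only, not
# Psychomorph's implementation). Assumes every image is the same size and
# comes with a delineation template: an (N, 2) array of (x, y) landmarks.
import numpy as np
from skimage import io
from skimage.transform import PiecewiseAffineTransform, warp

def average_faces(image_paths, landmark_arrays):
    """Average faces by warping each one to the mean landmark shape."""
    images = [io.imread(p).astype(float) / 255.0 for p in image_paths]
    landmarks = [np.asarray(l, dtype=float) for l in landmark_arrays]

    # 1. Average the delineation templates to get the mean shape.
    mean_shape = np.mean(landmarks, axis=0)

    # 2. Warp each face so its landmarks line up with the mean shape.
    h, w = images[0].shape[:2]
    warped = []
    for img, pts in zip(images, landmarks):
        tform = PiecewiseAffineTransform()
        # warp() needs a map from output coordinates back to input
        # coordinates, so estimate mean_shape -> original landmarks.
        tform.estimate(mean_shape, pts)
        # Pixels outside the landmark mesh (e.g. below the lowest neck
        # points) are left undefined/black by the piecewise-affine warp.
        warped.append(warp(img, tform, output_shape=(h, w)))

    # 3. Average the shape-normalised images pixel by pixel.
    return np.mean(warped, axis=0)
```

Because the warp only covers the region inside the landmark mesh, areas below the lowest points (such as the neck under the jawline) are exactly where averaging artifacts tend to appear, which is one reason adding delineation points in that region helps.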
----------------------------------------------------------
Dr Lisa M DeBruine
Institute of Neuroscience and Psychology
University of Glasgow
58 Hillhead Street
G12 8QB
lisa.debruine(a)glasgow.ac.uk
http://facelab.org
0141 330 5351
----------------------------------------------------------
> On 12 Aug 2015, at 09:15, face-research-list-request(a)lists.stir.ac.uk wrote:
>
> Send Face-research-list mailing list submissions to
> face-research-list(a)lists.stir.ac.uk
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.stir.ac.uk/cgi-bin/mailman/listinfo/face-research-list
> or, via email, send a message with subject or body 'help' to
> face-research-list-request(a)lists.stir.ac.uk
>
> You can reach the person managing the list at
> face-research-list-owner(a)lists.stir.ac.uk
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Face-research-list digest..."
>
>
> Today's Topics:
>
> 1. averaging faces (Emma Mullings)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Fri, 7 Aug 2015 12:41:21 +0000
> From: Emma Mullings <emma.mullings(a)manchester.ac.uk>
> To: "face-research-list(a)lists.stir.ac.uk"
> <face-research-list(a)lists.stir.ac.uk>
> Subject: [Face-research-list] averaging faces
> Message-ID:
> <0CBCFBC4D5A0C940A71B2507534A1798016F230168(a)MBXP07.ds.man.ac.uk>
> Content-Type: text/plain; charset="utf-8"
>
> Hi there,
>
> I have been using psychomorph to average 3 faces at a time.
>
> The images are really impressive; however, the neck is a bit problematic. I've attached an image as an example (left side of the face, just under the jawline). I was wondering if you had any advice on how I could rectify this?
>
> I was also wondering if it was possible to add more delineation points to an image?
>
> Best wishes
>
> Emma
>
> Dr. Emma Mullings | Neuroscience and Psychiatry Unit, University of Manchester, G700, Stopford Building, Oxford Road, Manchester, M13 9PT | 0161 275 7432 | 07970 103411 | emma.mullings(a)manchester.ac.uk
>
> Visit our website to find out more about the PERS study http://www.inflammation-repair.manchester.ac.uk/PERS
>
> [PERS QR code]
>
>
Dear Colleagues
Please see below for a Research Assistant position that we are currently advertising at the University of Manchester. Please circulate it to your department, or to anyone you think might be interested.
See http://www.jobs.ac.uk/job/ALR798/mhs-06883-research-assistant/
Closing date: 22/08/2015
Reference: M&HS-06883
Faculty / Organisational unit: Medical & Human Sciences
School / Directorate: School of Psychological Sciences
Employment type: Fixed Term
Duration: Until 31 August 2017
Location: Oxford Road, Manchester
Salary: £25,513 to £27,057 per annum
Hours per week: Full time
Applications are invited for a 24-month, full-time Research Assistant post to support a Leverhulme Trust grant, ‘Investigating the role of movement in the recognition of facial composites’, awarded to Dr Karen Lander, Dr Charlie Frowd and Professor Tim Cootes. In this project we build on existing bodies of research on moving faces and composite creation, and investigate the role of motion in the recognition of identity from composites. In a criminal investigation, facial composites are images constructed by witnesses and victims of people they have seen commit a crime. This project will aid the theoretical understanding of the moving-face advantage and investigate the interaction, and the relative importance, of the static and dynamic information available from the face. Research on this issue may also aid the development of useful composite systems.
Your role will be to set up, run and analyse the experiments involved in the project. You will have a degree in psychology, neuroscience or a related discipline, with a strong interest in experimental psychology and, more specifically, face perception and recognition. Experience in scientific research and in running experiments, particularly with human volunteers, is important. Good communication and team-working skills are essential, and familiarity with experimental presentation software, statistical packages (e.g. SPSS) and composite creation systems (e.g. Evo-FIT, PRO-fit) would be useful. Ideally you will also have experience of image manipulation and creation techniques, including running command-line tools to manipulate data.
Closing date is 22nd August 2015
Many thanks
Karen Lander
Senior Lecturer
University of Manchester
************************************************
CFP - Apologies for multiple copies
************************************************
******** EXTENDED SUBMISSION DEADLINE: August 2nd, 2015! ********
The First International Workshop on Modeling INTERPERsonal SynchrONy - INTERPERSONAL@ICMI2015
(http://interpersonalicmi2015.isir.upmc.fr)
@ the 17th International Conference on Multimodal Interaction (ICMI 2015) (http://icmi.acm.org/2015/)
_______
SCOPE
_______
Understanding human behavior through computer vision and signal processing has become of major interest with the emergence of social signal processing and affective computing and their applications to human-computer interaction. With few exceptions, research has focused on the detection of individual persons and their nonverbal behavior in the context of emotion and related psychosocial constructs. With advances in methodology, there is increasing interest in moving beyond the individual to the social interaction of multiple individuals. This level of analysis brings to the fore the detection and understanding of interpersonal influence and interpersonal synchrony in social interaction.
Interpersonal synchrony between interactive partners is the dynamic and reciprocal adaptation of their verbal and nonverbal behaviors during social interaction. It affords both a novel domain for computer vision and machine learning and a novel context in which to examine individual variation in the cognitive, physiological, and neural processes of the interacting partners. Interdisciplinary approaches to interpersonal synchrony are encouraged. Investigating these complex phenomena has both theoretical and practical applications.
The proposed workshop will explore the challenges of modeling, recognizing, and synthesizing influence and interpersonal synchrony. It will address theory, computational models, and algorithms for the automatic analysis and synthesis of influence and interpersonal synchrony. We wish to explore both influence and interpersonal synchrony in human-human and human-machine interaction, in dyadic and multi-person scenarios. Expected topics include the definition of different categories of interpersonal synchrony and influence, multimodal corpus annotation of interpersonal influence, the dynamics of relevant behavioral patterns, and the synthesis and recognition of verbal and nonverbal patterns of interpersonal synchrony and influence. The INTERPERSONAL workshop will afford an opportunity to discuss new applications such as clinical assessment, consumer behavior analysis, and the design of socially aware interfaces.
The INTERPERSONAL workshop will identify and promote research challenges relevant to this exciting topic of synchrony.
______________
LIST OF TOPICS
______________
We encourage papers and demos addressing, but not limited to, the following research topics:
- Theoretical approaches to interpersonal synchrony in human/human and human/machine interaction
- Analysis and detection of non-verbal patterns of interpersonal synchrony/influence
- Models taking into account the relationship between influence and synchrony
- Analysis and detection of physiological signals
- Modeling interpersonal synchrony in dyadic and in multi-party social interaction
- Psychological correlates of interpersonal synchrony/influence
- Analysis and detection of functional roles, persuasion, trust, dominance and so on
- Recording and annotation of corpora that vary in degree of experimental control
- Qualitative and quantitative evaluation
- Design of social agents and dialog systems.
_________________________
SUBMISSIONS AND REVISIONS
_________________________
Long paper: 8 pages maximum in the two-column ACM format used by the main conference. Accepted long papers will be presented as either a long talk or a poster.
Short paper: 4 pages maximum in the two-column ACM format used by the main conference. Accepted short papers will be presented as either a short talk or a poster.
Submissions should include: title, author(s), affiliation(s), e-mail address(es), tel/fax number(s), and postal address(es).
Papers should be submitted via the following link:
https://easychair.org/conferences/?conf=interpersonalicmi201
All contributions will be peer-reviewed by at least three reviewers from the Program Committee.
The INTERPERSONAL review process is double blind: the authors do not know the names of the reviewers, and the reviewers do not know the names of the authors. Each submission should therefore be anonymised: please remove the authors' names and any information that could identify the authors.
__________
DEADLINES
__________
August 2nd, 2015: Submission deadline
August 25th, 2015: Notification of acceptance
August 17th, 2015: Camera-ready version due (in electronic form)
November 13th, 2015: 2015 INTERPERSONAL@ICMI2015 Workshop
______________
ORGANIZATION
______________
Mohamed Chetouani,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(mohamed.chetouani at upmc.fr)
Giovanna Varni,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(varni at isir.upmc.fr)
Hanan Salam,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(salam at isir.umpc.fr)
Zakia Hammal
The Robotics Institute
Carnegie Mellon University
(zakia_hammal at yahoo.fr)
Jeffrey F. Cohn
University of Pittsburgh
The Robotics Institute
Carnegie Mellon University
(jeffcohn at cs.cmu.edu)
______________
SPONSORS
______________
This workshop is partially supported by the Laboratory of Excellence SMART (http://www.smart-labex.fr)
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html
Hi,
I'm conducting research into facial recognition and was wondering if anyone had any photos of East Asian male faces, preferably on a white background, but any colour will do.
Thanks,
Steff
*Postdoctoral position in object and face recognition*
A postdoctoral research position is open at the Objects and Knowledge
Laboratory, headed by Dr. Olivia Cheung, at New York University Abu Dhabi.
This position is based at New York University in New York. The postdoctoral
researcher will carry out fMRI experiments on human object and face
recognition. Potential research projects include, but are not limited to,
investigations of the influences of experience and conceptual knowledge on
recognition processes.
Applicants must have a Ph.D. in Psychology, Cognitive Neuroscience, or a
related field, and should possess strong programming skills (e.g., Matlab).
Prior experience with neuroimaging and psychophysical techniques is
required. Initial appointment is for one year, with the possibility of
renewal depending on the availability of funding. Starting date is
flexible, preferably around September 1, 2015.
The Objects and Knowledge Laboratory is part of the rapidly growing
Psychology division at New York University Abu Dhabi. For this position,
the postdoctoral researcher will work in New York, and will have access to
neuroimaging facilities (such as MRI) at the Center for Brain Imaging, New
York University.
New York University has established itself as a Global Network University,
a multi-site, organically connected network encompassing key global cities
and idea capitals. The network has three foundational degree-granting
campuses: New York, Abu Dhabi, and Shanghai, complemented by a network of
eleven research and study-away sites across five continents. Faculty and
students will circulate within this global network in pursuit of common
research interests and the promotion of cross-cultural and
interdisciplinary solutions for problems both local and global.
Interested individuals should email a curriculum vitae, the expected date of
availability, and contact information of two referees to Olivia Cheung (
olivia.cheung(a)nyu.edu). Informal inquiries regarding the position are
encouraged.
http://www.stir.ac.uk/natural-sciences/news/2015/psychologystudentship/
Evolutionary approaches to the social perception of faces using a range of stimuli and measures.
This studentship will investigate how evolutionary pressures may shape our processing of faces across a range of social perceptual tasks, using a variety of stimulus types and response measures.
In recent years, an evolutionary approach has shed light on how we perceive the attractiveness of faces. This evolutionary framework has also been applied to the perception of leadership, the attribution of personality traits, and the recognition of identity from face images. Much of this previous work relies on 2D facial photographs and explicit judgements. For example, attractiveness is often assessed by showing participants 2D photographs and asking them to rate the images for attractiveness. The goal of this studentship is to address how the type of stimulus used and the type of measure employed affect conclusions about face judgements. The student will explore the social perception of faces using a variety of stimuli (e.g., 2D faces, 3D faces, moving faces) and a range of measures of preference and attention (e.g., explicit and implicit preferences, eye-tracking).
More information about current research topics can be found at: www.alittlelab.com<http://www.alittlelab.com/>
Deadline 31 July
Peter Hancock
Professor,
School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://rms.stir.ac.uk/converis-stirling/person/11587
Psychology at Stirling: 100% 4* Impact, REF2014
A research assistant position is available in the Computational Cognitive
Neuroscience Laboratory in the Department of Psychology at Florida
International University. Research in the lab focuses on understanding the
interplay between learning and visual processes in object categorization,
using a combination of behavioral, computational and brain imaging
techniques.
Responsibilities will include programming and conducting behavioral and
fMRI experiments, scheduling and screening human volunteers for
participation in experiments, analyzing behavioral and neuroimaging data,
working with computational models of vision and learning, and performing
lab management tasks.
A bachelor’s (or higher) degree in neuroscience, psychology, statistics,
computer science, engineering, mathematics, physics, or other related field
is required. Strong quantitative skills, computer programming skills
(especially Python; experience with R and/or C++ is a plus), and
organizational skills are required.
The preferred start date is September 1st, 2015. The position requires a
commitment of two years.
This position is ideal for someone interested in obtaining experience in
cognitive neuroscience research, and improving quantitative and
computational skills, with the goal of applying to graduate school.
For informal inquiries, please send a CV, names of references, and a brief
statement of background skills and interests to fabian.soto(a)fiu.edu
Fabian A. Soto
Assistant Professor
Department of Psychology
Florida International University
Miami, FL
Hello,
I am advertising a fully-funded PhD studentship at Bangor University (covering UK/EU fees, stipend, and travel and research expenses), starting in October 2015, to investigate social attributions to faces. The position would be most appropriate for someone who already has, or is currently enrolled in, an MSc in psychology or a related area. A general description and a link to the advert are below. As you can see, there is some deliberate flexibility for the student to shape the project topic.
If you are interested, please contact me to discuss. The deadline for applications is 27 Feb.
Thank you,
Rob
Professor Robert Ward
Wolfson Centre for Clinical and Cognitive Neuroscience
School of Psychology
Bangor University
Bangor LL57 2AS
http://psychology.bangor.ac.uk/ward
http://www.bangor.ac.uk/psychology/postgraduate/studentships/human-face.php…
Humans are both highly visual and highly social beings, and people are quick to make attributions of personality and other social traits on the basis of mere appearance. For the past few years my students and I have been investigating the accuracy of visually-based judgements from the face, and in a series of studies we have found that these attributions can be surprisingly accurate, even when based on neutral "passport"-style photographs. Such photographs contain enough information to identify stable personality traits and aspects of mental health in strangers (e.g., papers from former PhD students: Kramer & Ward, 2010; Jones et al., 2012; Scott et al., 2013).

Such findings raise a number of issues to be explored in this studentship. A key general issue is whether these facial cues to behaviour are part of an evolved signal system. Theories of evolved signal systems emphasise the co-evolution of the signal sender and receiver: for the system to remain stable, it must have benefits for both. What adaptive benefits might there be for someone to signal their socially undesirable traits to others? Can facial signals be masked to deceive the receiver? For example, to what extent is the signal disrupted by voluntary emotional expressions? A second general issue concerns the signal content: what facial information do observers use to identify social traits? Possibilities include subtle micro-expressions, postures, facial morphology, and more. Finally, what causal factors might produce a joint influence on facial appearance and behaviour?

Within the context of these general issues, there is scope for the student to shape the project aims and focus.