**Apologies for cross-posting**
ISRE 2022 in person: Hold the Date!
We are happy to announce the call for submissions and for pre-conference proposals for the biennial ISRE (International Society for Research on Emotion) conference. The conference will take place in person on 15-18 July 2022 at the University of Southern California, Los Angeles, USA.
The ISRE conference is an exciting opportunity to meet international colleagues, present your work, and to stay up-to-date with the latest developments in emotion research. ISRE members study emotions from a wide range of disciplines including psychology, neuroscience, philosophy, sociology, linguistics, affective computing, history, anthropology, art and design. The ISRE conference 2022 will include keynote addresses by Antonio Damasio, Barbara Fredrickson and Eran Halperin.
If you would like to contribute to the ISRE conference by presenting your research, we invite you to submit an abstract of max. 250 words by November 21, 2021 (end-of-day for all time zones). Submissions are welcome from scholars in all relevant disciplines for symposia (of up to four talks and a discussant, or five talks), individual talks, and posters. Symposia are encouraged to include more than one discipline to facilitate cross-disciplinary exchange. Talks will be 15 minutes long. Abstracts should be submitted online at the ISRE 2022 conference website (http://isre22.org). Submissions will open later this October. Please consult the guidelines on the website before preparing your submission.
If you would like to organize a pre-conference, we will be accepting proposals up until November 30, 2021. Please submit a 1-page PDF proposal to the pre-conference chairs, Gale Lucas (lucas@ict.usc.edu) and Rachael Jack (Rachael.Jack@glasgow.ac.uk). Proposals should include the names and affiliations of the organizers, a description of up to 400 words, and a provisional line-up of speakers and topics.
All abstracts will be subject to peer review by an international scientific committee; accepted abstracts will be published in the conference program. Notification of acceptance decisions will be communicated in February 2022. Online registration is expected to be available shortly after that.
We are looking forward to welcoming you in Los Angeles in July 2022!
Jonathan Gratch and Stacy Marsella
Organizers, ISRE 2022 Conference
** Apologies for cross-posting **
We are happy to announce that the call for submissions for pre-conferences held in concert with the biennial ISRE (International Society for Research on Emotion) conference is now open. Pre-conferences will be held on 15 July 2022 at the University of Southern California, Los Angeles, USA.
The main ISRE conference and associated pre-conferences each provide exciting opportunities to meet international colleagues, present your work, and stay up-to-date with the latest developments in emotion research. ISRE members study emotions from a wide range of disciplinary perspectives, including psychology, neuroscience, philosophy, sociology, linguistics, affective computing, history, anthropology, and design (http://isre22.org).
ISRE2022 welcomes pre-conference proposals on any topic related to the field of emotion science. Previous pre-conferences include (but are not limited to) Affective Computing, Emotion Development, Social Dimensions of Emotion, and Culture and Emotions.
If you would like to submit a proposal for an ISRE pre-conference, please prepare a 1-page PDF that includes:
1. Title of pre-conference
2. Names of organizers each with affiliations and contact details
3. Summary (up to 400 words) describing the key questions, relevance to emotion science, and the aims of the pre-conference
4. Provisional line-up of the speakers and their topics
Please send your submission by email to Gale Lucas at lucas@ict.usc.edu with the subject heading ‘ISRE2022 Preconference Proposal’ by 30th November 2021, 11:59 pm International Date Line West (IDLW; UTC-12). Confused by time zone conversions? Check out worldtimebuddy (https://www.worldtimebuddy.com/) and don’t forget to select the date 30 November 2021.
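For anyone who prefers to check the deadline programmatically rather than via worldtimebuddy, here is a minimal Python sketch (an illustration only, not part of the official call; it assumes Python 3.9+ for the zoneinfo module, and Europe/London is just an example target zone):

from datetime import datetime, timezone, timedelta
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Deadline: 30 November 2021, 11:59 pm in UTC-12 (International Date Line West)
deadline_idlw = datetime(2021, 11, 30, 23, 59, tzinfo=timezone(timedelta(hours=-12)))

# Convert to an example local zone (swap in your own IANA zone name)
print(deadline_idlw.astimezone(ZoneInfo("Europe/London")))
# -> 2021-12-01 11:59:00+00:00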
ISRE2022 will host a maximum of 6 pre-conferences. All proposals will be reviewed by the pre-conference chairs, Gale Lucas and Rachael Jack, and will be evaluated on the following criteria: (a) relevance to emotion science, (b) interest to the ISRE community, (c) the novelty/groundbreaking nature of the topic, and (d) potential to advance knowledge and stimulate new lines of research.
IMPORTANT DATES:
Deadline for pre-conference proposal submissions: 30th November 2021 11:59 pm IDLW
Decision outcomes announced: Early December 2021
Questions? Please contact Gale Lucas at lucas@ict.usc.edu or Rachael Jack at Rachael.Jack@glasgow.ac.uk.
We are looking forward to welcoming you!
Gale Lucas and Rachael Jack
Prof. Rachael E. Jack, Ph.D.
Professor of Computational Social Cognition
Institute of Neuroscience & Psychology
School of Psychology
University of Glasgow
Scotland, G12 8QB
+44 (0) 141 330 5087
Dear Colleagues,
We are conducting a meta-analysis on the body inversion effect, which reflects the difference in recognition between upright and inverted body stimuli. Our goal is to provide a summary of this effect, including its magnitude and the moderating factors that influence it. We are seeking unpublished data from studies that meet the following criteria:
1. Neurotypical participants completed a visual body perception/recognition task that included upright and inverted conditions.
2. Body stimuli are human bodies (either computer-generated or real images).
3. Outcome measures of perception/recognition were collected for both upright and inverted conditions. These may include behavioral accuracy (% correct, hit rate), electrophysiological data (e.g., N170 signal), or neuroimaging data.
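For contributors wondering how the "magnitude" of the inversion effect mentioned above might be summarized, here is a minimal, purely illustrative Python sketch (hypothetical numbers, and not the authors' actual analysis pipeline) of a within-subject standardized mean difference computed from per-participant accuracy in the two conditions:

import numpy as np

# Hypothetical proportion-correct scores per participant (illustrative values only)
upright  = np.array([0.92, 0.85, 0.95, 0.78, 0.88])
inverted = np.array([0.80, 0.79, 0.84, 0.75, 0.70])

diff = upright - inverted                 # inversion effect for each participant
d_z = diff.mean() / diff.std(ddof=1)      # Cohen's d_z for a within-subject design
print(f"Mean inversion effect = {diff.mean():.3f}, d_z = {d_z:.2f}")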
If you have data that fit these criteria and would like them to be included in our synthesis, please contact Flora Oswald (feo5020@psu.edu) by October 18, 2021.
Thank you and best wishes,
Flora Oswald
Flora Oswald, M.S. (she/her)
SSHRC Doctoral Fellow
Underrepresented Perspectives Lab<https://jmatsick.wixsite.com/uplab>, Social Vision & Interpersonal Perception Lab<https://sites.google.com/site/socialviplab/>
Departments of Psychology & Women's, Gender, and Sexuality Studies
The Pennsylvania State University
Dear All,
I would appreciate it if you'd share the following opportunity.
The Institute of Psychology at the University of Pécs, Hungary, has started
a PhD program for international students. During the program, among other
possibilities, students can join a research project that aims to extend our
knowledge of the cognitive and neural background of face perception.
We're particularly interested in how semantic knowledge about a person
interacts with affective processes. The student will have access to the
following equipment in our lab:
- device for accurate reaction time measurement (Cedrus)
- eye-tracker (Tobii TX300)
- physiological measurements (BIOPAC modules: EDA, heart rate, respiration rate, EMG, etc.)
- EEG, fMRI
Please note that there is a tuition fee (3500 euros per semester in the
first and second year, and 2500 euros per semester in the third and fourth
year). However, students from selected countries (mainly from Asia, Africa,
and South America) may apply for a *scholarship* through the *Stipendium
Hungaricum* programme, which covers both the tuition fee and living costs. The list of
eligible countries can be found here:
https://stipendiumhungaricum.hu/partners/
Details about the programme will appear soon at the following sites:
https://btk.pte.hu/en and https://psychology.pte.hu/
Until then, informal enquiries can be sent to kocsor.ferenc@pte.hu.
Students with a background in psychology, biology, or other related fields
are welcome to apply.
Best regards,
*Ferenc Kocsor, PhD*
Institute of Psychology
Faculty of Humanities and Social Sciences
University of Pécs
Hungary
psychology.pte.hu <https://psychology.pte.hu/ferenc-kocsor-phd>
Dear colleagues,
We are organizing a special session on “Applications in Healthcare and
Health Monitoring” in conjunction with the 16th IEEE Conference on
Automatic Face and Gesture Recognition to be held between 15th-18th
December 2021 in Jodhpur, India (Hybrid Event). Kindly find the related
call for papers below.
*Important dates*
Paper submission deadline: 20 August 2021
Decisions: 25 September 2021
Final camera-ready papers: 20 October 2021
*Submission instructions* can be found at
http://iab-rubric.org/fg2021/submission.html.
*For submission*, log into https://cmt3.research.microsoft.com/fg2021/,
proceed to “create new submission”, and select “special session track and
subject area” as “Applications in Healthcare and Health Monitoring”.
Accepted papers will be included in the FG2021 proceedings and will appear in
the IEEE Xplore digital library.
Please feel free to contact us for any further details. Kindly disseminate
this email to others who might be interested.
We look forward to your contributions.
Abhijit Das (Thapar University, India)
Babak Taati (University of Toronto, Canada)
Antitza Dantcheva (INRIA, France)
Diego Guarin (Florida Institute of Technology, USA)
Srijan Das (Stony Brook University, USA)
Andrea Bandini (University of Toronto, Canada)
Hu Han (CAS, China)
Yana Yunusova (University of Toronto, Canada)
François Brémond (INRIA, France)
Xilin Chen (CAS, China)
--------------------------------------------------------------------------------------------
*Call for papers for the FG 2021 special session*
*on*
*Applications in Healthcare and Health Monitoring*
-----------------------------------------------------------------------------------
Automated human health monitoring based on computer vision has gained rapid
scientific attention in the last decade, fueled by many research articles
and commercial systems. Recently, the COVID-19 pandemic has increased the need
for virtual diagnosis and for monitoring health protocols such as regulating
social distancing, surveillance of individuals wearing masks in crowds, and
gauging body temperature and other physiological measurements from a
distance. Consequently, researchers from both the computer vision and
medical science communities have given significant attention to goals
ranging from patient analysis and monitoring to diagnostics (e.g., for
dementia, depression, healthcare, physiological measurement, rare
neurologic diseases). Moreover, healthcare represents an area of broad
economic, social, and scientific impact. The goal of this special session
is to bring together researchers and practitioners working in this area of
computer vision and medical science and to address a wide range of
theoretical and practical issues related to real-life healthcare systems.
We especially invite papers resulting from collaboration between technical
and clinical experts. Hence, this FG Special Session represents a venue for
fostering these collaborations, providing a unique and welcoming
environment for transdisciplinary research that is sometimes labelled as
being “too clinical” by technical journals or “too technical” by clinical
journals.
Topics of interest include, but are not limited to:
Health monitoring based on face analysis,
Health monitoring based on gesture analysis,
Health monitoring based on corporeal visual features,
Depression analysis based on visual features,
Face analytics for human behaviour understanding,
Anxiety diagnosis based on face and gesture,
Physiological measurement employing face analytics,
Databases on health monitoring, e.g., depression analysis,
Augmentative and alternative communication,
Human-robot interaction,
Home healthcare,
Technology for cognition,
Automatic emotional hearing and understanding,
Visual attention and visual saliency,
Assistive living,
Privacy-preserving systems,
Quality of life technologies,
Mobile and wearable systems,
Applications for the visually impaired,
Sign language recognition and applications for the hearing impaired,
Applications for the ageing society,
Personalized monitoring,
Egocentric and first-person vision,
Assessing physical and/or cognitive ability based on face and body
movement analysis,
Orofacial assessment in clinical populations,
Hand function assessment in clinical populations,
Assessment of gait and/or balance,
Assistive technology,
Applications to improve the health and wellbeing of children and the elderly.
Dear all
I hope you're all well.
I was wondering if there was anyone who would be willing to send me some composite facial images I can use for my book chapter. Ideally it would be great if there were some from different systems, e.g. EFIT V, EvoFit, versus the older systems EFIT, PhotoFit, Faces, etc. I am trying to demonstrate that the newer holistic systems create more life-like and recognisable composites compared to the older feature-based systems.
Also, if anyone has specific examples of composites that have successfully been used by the police and have led to arrests/convictions, it would be really great if I could include one of those too.
I will of course reference the images and/or the paper in which they originally appeared, and give you full credit in my book.
If you need any more details about the book chapter, please do get in touch and I'll be happy to provide them.
All the best,
Trina
Dr Catriona Havard | Senior Lecturer in Psychology
The Open University, Walton Hall, Milton Keynes, MK7 6AA
To see a selection of my papers please visit http://www.open.ac.uk/people/ch22572
From: Ailsa E Millen
Sent: 06 July 2021 16:07
To: face-research-list@lists.stir.ac.uk
Subject: Postdoctoral Research Fellow Vacancy
Dr Ailsa Millen is offering a full-time, fixed-term Postdoctoral Research Fellow position to work on the ESRC project 'Identifying Novel Markers of Concealed Face Recognition' (https://www.stir.ac.uk/about/work-at-stirling/list/details/?jobId=2584&jobT…), starting August 2021 (or as soon as possible thereafter). Ailsa is seeking an excellent postdoctoral research fellow to conduct experiments combining eye-tracking, skin conductance, facial expressions and vocal cues. A crucial aspect of this role is to streamline the completion of the existing ESRC grant by taking over programming, data analysis, and manuscript writing, so excellent programming and analysis skills are essential prior to starting. The project aims to further our understanding of how our brains recognise faces and to find ways to help the police detect crime (www.conface.org). The post is initially for 10 months, but we will request an extension three months prior to the June 2022 end date.
Enquiries to ailsa.millen@stir.ac.uk
Dr Ailsa E. Millen (she/her)
Lecturer in Psychology
University of Stirling
Phone: + 44 (0) 1786 466372
Twitter @ailsamillen
Staff page: https://www.stir.ac.uk/people/255892
Project page: www.conface.org @confacedotorg
Leader of The Cognition Research Group @corgis_uos
I aim to reply within 3 working days. My messages may arrive outside of the working day, but this does not imply any expectation that you should reply outside of your normal working hours. If you wish to respond, please do so when convenient.
Latest papers
Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability (2020). Advances in Methods and Practices in Psychological Science. https://doi.org/10.1177/2515245920958687
Many Labs 5: Registered Replication Report of Crosby, Monin & Richardson (2008) (2020). Advances in Methods and Practices in Psychological Science. https://doi.org/10.1177/2515245919870737
Eye spy a liar: Assessing the utility of eye fixations and confidence judgments for detecting concealed recognition of people, places and objects (2020). Cognitive Research: Principles and Implications. https://rdcu.be/b6g6X
Registered Replication Report on Fischer, Castel, Dodd, and Pratt (2003) (2020). Advances in Methods and Practices in Psychological Science. https://doi.org/10.1177/2515245920903079
Eye see through you! Eye tracking unmasks concealed face recognition despite countermeasures (2019). Cognitive Research: Principles and Implications. https://rdcu.be/bNlKn
INVITATION to the International Association of Craniofacial Identification (IACI)
One-day Symposium - 23 July 2021
The IACI conference in Liverpool has been postponed to July 2022 due to the continued pandemic. However, we will be hosting a one-day online symposium as a free IACI taster event on 23 July 2021, 10:00-15:30 BST.
The theme of the event will be:
'Race and Face: bias in forensic and archaeological investigation'
Keynote speakers will be:
* Race and Forensic Investigation - Prof Amade M'Charek, University of Amsterdam
* Race and Facial Recognition Algorithms - Dr Jonathon Phillips, National Institute of Standards and Technology's Information Technology Laboratory
* Race and Super-recognisers - Dr Josh Davis, University of Greenwich
* Race and Forensic Genetics - Dr David Skinner, Anglia Ruskin University
* Historical ethnographic craniofacial collections - Dr Tobias Houlton, University of Dundee
There will be an online poster event and the opportunity for some short presentations on any of the following topics:
Facial identification of the dead
* Facial reconstruction/approximation
* Craniofacial anatomy
* Craniofacial superimposition
* Depiction of preserved remains for museum exhibition
* Ethical issues relating to the presentation of faces of the dead
* DNA analysis for facial depiction of skeletal remains
* Migrant disaster victim identification
* CGI and animation
Facial identification of the living
* Age progression
* Eyewitness composites
* DNA-to-face
* Facial recognition
* CCTV analysis
* Facial morphing and deep fakes
If you would like to present a short paper or poster at this symposium, please submit an abstract with your name and affiliation to facelab@ljmu.ac.uk by 1 July 2021.
Details of how to access the symposium will be sent out at a later date.
Please forward this to any interested parties.
Dr Sarah Shrimpton BA (Hons), MSc, AFHEA, PhD
Research Assistant, Face Lab
IC1 Liverpool Science Park, 131 Mount Pleasant, Liverpool, L3 5TF
tel: 0151 482 9609 (Direct) or 0151 482 9605 (Lab)
email: s.l.shrimpton@ljmu.ac.uk
GLASGOW FACE MATCHING TEST 2 (GFMT2)
A new psychometric test of face matching ability has been developed by UNSW Sydney and University of York and is freely available for scientific use.
White, D., Guilbert, D., Varela, V. P. L., Jenkins, R., & Burton, A. M. (2021). GFMT2: A psychometric measure of face matching ability. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01638-x
GFMT2 is a new expanded version of the original Glasgow Face Matching Test. Test forms include:
GFMT2-S: A short 80-item test with test-retest reliability over one week of r = .774. There are two equally difficult 40-item forms for use in experimental intervention studies.
GFMT2-Low: Specifically designed to target lower-than-average performers, suited for assessing acquired or developmental prosopagnosia.
GFMT2-High: Specifically designed to target higher-than-average performers, suited for assessing super-recognisers and certain professional groups.
The short tests do not contain repeated identities or items from the original GFMT. Image pairs now include variation in head angle, pose, expression, and subject-to-camera distance, making the new test more difficult and more representative of the challenges in everyday face identification tasks.
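As a rough illustration of how a test of this kind is typically scored and how test-retest reliability is expressed (a hypothetical sketch with made-up numbers, not the published GFMT2 scoring code or data), percent correct can be computed per participant and correlated across sessions:

import numpy as np

# Hypothetical percent-correct scores for the same participants tested one week apart
# (illustrative values only; not GFMT2 data or norms)
week1 = np.array([72.5, 85.0, 90.0, 66.3, 78.8, 81.3, 95.0, 70.0])
week2 = np.array([75.0, 82.5, 91.3, 70.0, 76.3, 83.8, 93.8, 68.8])

r = np.corrcoef(week1, week2)[0, 1]  # test-retest reliability as a Pearson correlation
print(f"Test-retest r = {r:.3f}")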
The publication is available here: https://rdcu.be/cm1YR
Executable versions of the test are available for PC and Mac via www.gfmt2.org