Dear all
We’re excited to announce two EPSRC-funded PhD scholarships on CS+Psych projects, hosted in the School of Psychology and Neuroscience in collaboration with the School of Computing Science at the University of Glasgow. Please share widely. Deadline: 17 June.
Project 1: The unconscious effect of physical beauty in human social interactions
The aim of the project is to investigate how "physical beauty" can bias the outcomes of social decisions (e.g. job interviews). To do this, we aim to create an algorithm able to transform the "physical beauty" of participants in real time and to use that algorithm during negotiations to see how it influences social outcomes and non-verbal behaviour. We are searching for a multidisciplinary candidate who is interested in real-time computer vision (e.g. voice/face transformation and analysis) as well as social cognition (e.g. social interactions, non-verbal data analysis, social biases).
Project description: https://www.gla.ac.uk/postgraduate/doctoraltraining/mvls-epsrc/projects/pab…
Project team members: Pablo Arias Sarah (primary supervisor), Alessandro Vinciarelli (co-supervisor), Mathieu Chollet (co-supervisor)
Questions? Contact: pablo.arias@glasgow.ac.uk
Project 2: High-fidelity 3D facial reconstruction for social signal understanding
Human faces convey a wealth of rich social and emotional information: facial expressions often convey our internal emotional states, while the shape, colour, and texture of faces can betray our age, sex, and ethnicity. As a highly salient source of social information, human faces are integral to shaping social communication and interactions. Faces in video can be viewed as a temporal sequence of facial images with intrinsic dynamic changes, and establishing correspondences between faces in different frames is important for tracking and reconstructing faces from videos. Jointly modelling fine facial geometry and appearance in a data-driven manner enables a model to learn the relationship between a single 2D face image and the corresponding 3D face model, and thus to reconstruct a high-quality 3D face model by leveraging the high capacity of deep neural networks. This project will investigate computational methods for high-fidelity 3D facial tracking in videos for social signal analysis in social interaction scenarios. It involves developing computational models for reconstructing 3D facial details that capture geometric facial expression changes, and for analysing social signals.
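For candidates curious what this kind of 2D-to-3D learning can look like in practice, below is a minimal, purely illustrative sketch (not the project's actual method) of regressing 3D morphable model (3DMM)-style coefficients from a single cropped face image with a convolutional network. It assumes a PyTorch/torchvision setup; the ResNet-18 backbone, coefficient counts and all names are hypothetical placeholders.

# Illustrative sketch only: regress hypothetical 3DMM-style coefficients
# (shape, expression, texture, pose) from a single 2D face image.
import torch
import torch.nn as nn
from torchvision import models

class FaceCoeffRegressor(nn.Module):
    def __init__(self, n_shape=80, n_expr=64, n_tex=80, n_pose=6):
        super().__init__()
        # Image encoder; ResNet-18 is just a convenient stand-in backbone.
        self.backbone = models.resnet18(weights=None)
        n_out = n_shape + n_expr + n_tex + n_pose
        # Replace the classification head with a coefficient regressor.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, n_out)
        self.split_sizes = [n_shape, n_expr, n_tex, n_pose]

    def forward(self, img):
        # img: (batch, 3, 224, 224) cropped face images
        coeffs = self.backbone(img)
        shape, expr, tex, pose = torch.split(coeffs, self.split_sizes, dim=1)
        # In a full system these coefficients would drive a 3DMM to
        # produce the reconstructed 3D face mesh and texture.
        return {"shape": shape, "expression": expr, "texture": tex, "pose": pose}

if __name__ == "__main__":
    model = FaceCoeffRegressor()
    dummy = torch.randn(2, 3, 224, 224)  # a batch of two fake face crops
    out = model(dummy)
    print({k: v.shape for k, v in out.items()})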
Project description: https://www.gla.ac.uk/postgraduate/doctoraltraining/mvls-epsrc/projects/hui…
Project team members: Hui Yu (primary supervisor), Rachael Jack (co-supervisor), Tanaya Guha (co-supervisor)
Questions? Contact: Hui.Yu@glasgow.ac.uk
Prof. Rachael E. Jack, Ph.D.
Professor of Computational Social Cognition
School of Psychology & Neuroscience
University of Glasgow
Scotland, G12 8QB
+44 (0) 141 330 5087
A nearly final program for our workshop here on 25-26 July is now available. The posters are still a bit fluid: if you’d like to bring one, please let me know.
In keeping with the workshop format, we’re allowing extended time for discussion after each set of talks.
To book attendance, which is free: https://faceresearch.stir.ac.uk/july-workshop/
Peter
Program
Thursday 25th July
9:00 Registration
Session 1 Face representations
9:30 How the learning of unfamiliar faces is affected by their resemblance to familiar faces
Katie L.H. Gray, Maddie Atkinson, Kay Ritchie, Peter Hancock
9:50 How Does Increased Familiarity Change Face Representation in Memory?
Mintao Zhao, Isabelle Bülthoff
10:10 The contribution of distinctive features to cost-efficient facial representations
Christel Devue and Mathieu Blondel
10:30 Discussion
10:50 Coffee break
11:30 Keynote 1: Meike Ramon: Unique traits, computational insights: studying Super-Recognizers for societal applications
12:30 Lunch
Session 2: Decision making
13:30 Human-computer teaming with low mismatch incidence
Anna Bobak, Melina Muller, Peter Hancock
13:50 Unfamiliar face matching and metacognitive efficiency
Robin Kramer, Robert McIntosh
14:10 Distinct criterion placement for intermixed face matching tasks
Kristen A. Baker, Markus Bindemann
14:30 Discussion
14:50 Break and posters
16:00 Keynote 2: Alice O’Toole: Dissecting Face Representations in Deep Neural Networks: Implications for Rethinking Neural Codes
17:00 Break
18:00 Public Lecture: Peter Hancock: Face recognition by humans and computers: criminal injustice?
19:30 Dinner
Friday 26th July
Session 3: Factors affecting face recognition
9:30 Face masks and fake masks: Have we been underestimating the problem of face masks in face identity perception?
Kay L Ritchie, Daniel J Carragher, Josh P Davis, Katie Read, Ryan E Jenkins, Eilidh Noyes, Katie LH Gray, Peter JB Hancock
9:50 Identification of masked faces: typical observers, super-recognisers, forensic examiners and algorithms.
Eilidh Noyes, Reuben Moreton, Peter Hancock, Kay Ritchie, Sergio Castro Martinez, Katie Gray, and Josh Davis
10:10 Individual variation, socio-emotional functioning and face perception
Karen Lander, Grace Talbot, Anastasia Murphy & Richard Brown
10:30 Discussion
10:50 Coffee
Session 4: Identification of suspects
11:20 Identity Recognition of Composites Constructed of Unfamiliar Faces
Charlie Frowd
11:40 Inverse caricature effects in eyewitness identification performance and deep learning models of face recognition
Gaia Giampietro, Ryan McKay, Thora Bjornsdottir, Laura Mickes, Nicholas Furl
12:00 Implicit markers of concealed face recognition
Ailsa Millen
12:20 Discussion
13:00 Workshop end
Posters
As good as it gets? Computer-enhanced recognition of single-view faces does not improve performance across matching or recognition tasks. Scott P Jones, Peter Hancock
"They're just not my cup of tea": random preferences are more important than random effects in modelling facial attractiveness ratings. Thomas Hancock, Peter Hancock, Anthony Lee, Morgan Sidari, Amy Zhao, Brendan Zietsch
Investigating the modulatory effects of emotional expressions on short-term face familiarity. Constantin-Iulian Chiță, Simon Paul Liversedge, Philipp Ruhnau
Human-computer teaming with low quality images. Dan Carragher, Peter Hancock, David White
Wisdom of the crowds, within and between individuals. Dan Carragher and Peter Hancock
Islands of Expertise and face matching. Emily Cunningham, Anna Bobak, Peter Hancock
Investigating Face Recognition Ability in Neurodiverse Individuals. Caelan Dow, Anna Bobak, Jud Lowes
The Heterogeneity of Face Processing in Developmental Prosopagnosia from a Single Case Analysis Approach. Benjamin Armstrong, Anna Bobak, Jud Lowes
The effects of age on face recognition. Zsofi Kovacs-Bodo, Stephen Langton, Peter Hancock & Anna Bobak
Seeing through the lies: effectiveness of eye-tracking measures for the detection of concealed recognition of newly familiar faces and objects. Amir Shapira and Ailsa Millen
Peter Hancock (he/him)
Professor
Psychology, School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
http://rms.stir.ac.uk/converis-stirling/person/11587
@pjbhancock
Latest papers:
Face masks and fake masks: the effect of real and superimposed masks on face matching with super-recognisers, typical observers, and algorithms https://rdcu.be/dxAIR
Balanced Integration Score: A new way of classifying Developmental Prosopagnosia
https://www.sciencedirect.com/science/article/abs/pii/S0010945224000054
My messages may arrive outside of the working day but this does not imply any expectation that you should reply outside of your normal working hours. If you wish to respond, please do so when convenient.
Web: www.stir.ac.uk
Dear Colleagues,
Please find below the invitation to contribute to the 5th Workshop and Competition on Affective Behavior Analysis in-the-wild (ABAW), to be held in conjunction with the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023.
(1): The Competition is split into the following four Challenges:
* Valence-Arousal Estimation Challenge
* Expression Classification Challenge
* Action Unit Detection Challenge
* Emotional Reaction Intensity Estimation Challenge
The first three Challenges are based on an augmented version of the Aff-Wild2 database, an audiovisual in-the-wild database of 594 videos of 584 subjects, comprising around 3M frames; it contains annotations in terms of valence-arousal, expressions and action units.
The last Challenge is based on the Hume-Reaction dataset, a multimodal dataset of about 75 hours of video recordings of 2,222 subjects; it contains continuous annotations for the intensity of seven emotional experiences.
Participants are invited to take part in one or more of these Challenges.
There will be one winner per Challenge. The top-3 performing teams in each Challenge will be required to contribute a paper describing their approach, methodology and results to our Workshop; all other teams are also encouraged to submit papers describing their solutions and final results. All accepted papers will be part of the CVPR 2023 proceedings.
More information about the Competition can be found at https://ibug.doc.ic.ac.uk/resources/cvpr-2023-5th-abaw/.
Important Dates:
* Call for participation announced, team registration begins, data available: 13 January 2023
* Final submission deadline: 18 March 2023
* Winners announcement: 19 March 2023
* Final paper submission deadline: 24 March 2023
* Review decisions sent to authors; notification of acceptance: 3 April 2023
* Camera-ready version deadline: 8 April 2023
Chairs:
Dimitrios Kollias, Queen Mary University of London, UK
Stefanos Zafeiriou, Imperial College London, UK
Panagiotis Tzirakis, Hume AI
Alice Baird, Hume AI
Alan Cowen, Hume AI
(2): The Workshop solicits contributions on recent progress in the recognition, analysis, generation and modelling of face, body, and gesture, embracing the most advanced systems available for face and gesture analysis, particularly in-the-wild (i.e., in unconstrained environments) and across modalities (e.g., from face to voice). In parallel, this Workshop solicits contributions towards building fair models that perform well on all subgroups and improve in-the-wild generalisation.
Original high-quality contributions, including:
- databases or
- surveys and comparative studies or
- Artificial Intelligence / Machine Learning / Deep Learning / AutoML / (Data-driven or physics-based) Generative Modelling Methodologies (either Uni-Modal or Multi-Modal; Uni-Task or Multi-Task ones)
are solicited on the following topics:
i) "in-the-wild" facial expression or micro-expression analysis,
ii) "in-the-wild" facial action unit detection,
iii) "in-the-wild" valence-arousal estimation,
iv) "in-the-wild" physiological-based (e.g., EEG, EDA) affect analysis,
v) domain adaptation for affect recognition in the previous four cases,
vi) "in-the-wild" face recognition, detection or tracking,
vii) "in-the-wild" body recognition, detection or tracking,
viii) "in-the-wild" gesture recognition or detection,
ix) "in-the-wild" pose estimation or tracking,
x) "in-the-wild" activity recognition or tracking,
xi) "in-the-wild" lip reading and voice understanding,
xii) "in-the-wild" face and body characterization (e.g., behavioral understanding),
xiii) "in-the-wild" characteristic analysis (e.g., gait, age, gender, ethnicity recognition),
xiv) "in-the-wild" group understanding via social cues (e.g., kinship, non-blood relationships, personality)
xv) subgroup distribution shift analysis in affect recognition
xvi) subgroup distribution shift analysis in face and body behaviour
xvii) subgroup distribution shift analysis in characteristic analysis
Accepted workshop papers will appear in the CVPR 2023 proceedings.
Important Dates:
Paper Submission Deadline: 24 March 2023
Review decisions sent to authors; Notification of acceptance: 3 April 2023
Camera-ready version deadline: 8 April 2023
Chairs:
Dimitrios Kollias, Queen Mary University of London, UK
Stefanos Zafeiriou, Imperial College London, UK
Panagiotis Tzirakis, Hume AI
Alice Baird, Hume AI
Alan Cowen, Hume AI
In case of any queries, please contact d.kollias@qmul.ac.uk
Kind Regards,
Dimitrios Kollias,
on behalf of the organising committee
========================================================================
Dr Dimitrios Kollias, PhD, MIEEE, FHEA
Lecturer (Assistant Professor) in Artificial Intelligence
Member of Multimedia and Vision (MMV) research group
Member of Queen Mary Computer Vision Group
Associate Member of Centre for Advanced Robotics (ARQ)
Academic Fellow of Digital Environment Research Institute (DERI)
School of EECS
Queen Mary University of London
========================================================================
Hello,
I hope this message finds you well.
We are excited to announce an upcoming workshop that aims to break new ground in the realm of affective computing. The workshop, titled “From Lab to Life: Realising the Potential of Affective Computing”, will encourage discussion between academic and industry experts and promises to be an enriching experience for professionals and researchers alike.
Machine capabilities are on the rise. New advances in AI and Robotics have enabled the creation of ever more competent artificial systems that have the potential to contribute to various types of human activities. However, for this potential to result in a step-change in how humans and machines interact and work with each other, machines also need to be competent at understanding their human counterparts. How can task-competent machines become competent teammates, assistants, and companions for human users? How can technology make sense of human behaviour, responses, and experiences?
Affective Computing research has been spearheading the effort to answer these questions and resolve challenges of human-machine interaction. To take the unique insights and innovations developed in Affective Computing from lab prototypes to robust and reliable technology solutions for human users, there is a need for academic and industry researchers to come together. In this workshop, we create a forum for this conversation structured around three specific themes: (1) ethics and regulations, (2) industry perspectives, and (3) academic perspectives.
Here’s what you can expect from the workshop:
1. Expert insights: gain valuable insights from renowned experts in academia and industry who will share their experiences, perspectives, and ethical considerations on affective computing.
2. Interactive discussions: participate in discussions with experts by submitting a two-page perspective piece and sharing your views during a moderated panel alongside the speakers.
3. Networking opportunities: connect with fellow participants, industry professionals, and researchers to exchange ideas, forge new partnerships, and explore potential collaborations.
Whether you are a seasoned researcher, an industry professional, or someone in between who is interested in the latest developments in affective computing, the ethical concerns involved in taking affective computing to industry, or how affective computing products can safely reach consumers, this workshop offers a unique opportunity to expand your knowledge, broaden your network, and contribute to the advancement of this exciting field.
We invite you to share your insights by submitting your work to our workshop by June 12th, 2024. For more information on what to submit and how, please visit https://www.cambridgeconsultants.com/acii2024-fromlabtolife/.
Thank you for taking the time to consider our invitation and we hope to see you at our workshop this September.
Kind Regards,
Emma Hughson
Senior Affective Computing Engineer, Human Machine Understanding
Cambridge Consultants Ltd
29 Science Park, Milton Road,
Cambridge, CB4 0DW, UK
www.cambridgeconsultants.com
Sign up for our newsletter: https://www.cambridgeconsultants.com/newsletter