A postdoctoral researcher position is available immediately at the Objects
and Knowledge Laboratory, headed by Dr. Olivia Cheung, at NYU Abu Dhabi (
http://www.oliviacheunglab.org/). The postdoctoral researcher will carry
out experiments on high-level vision (object/face/letter/scene recognition)
in humans using behavioral, fMRI, and computational methods. Potential
research projects include, but are not limited to, investigations of the
influences of experience and conceptual knowledge on visual recognition.
Applicants must have a Ph.D. in Psychology, Cognitive Neuroscience,
Cognitive Science, or a related field, and should possess strong
programming skills (e.g., R, Matlab, Python). Prior experience with
neuroimaging, computational, or psychophysical techniques is highly
preferred. Initial appointment is for two years, with the possibility of
renewal. Starting date is flexible, preferably by summer/early fall 2021.
The Objects and Knowledge Laboratory is part of the rapidly growing Psychology department at NYU Abu Dhabi. The lab has access to state-of-the-art neuroimaging and behavioral facilities (including MRI, MEG, and eye-tracking).
The terms of employment are very competitive, including relocation and housing costs and other benefits such as educational subsidies for children. Informal inquiries regarding the position, university, or area are encouraged. To apply, individuals should email a curriculum vitae, cover letter, statement of research interests, expected date of availability, and contact information for two referees. All correspondence should be sent to Olivia Cheung (olivia.cheung(a)nyu.edu).
The NYU Abu Dhabi campus is located on Saadiyat Island (Abu Dhabi’s cultural hub), minutes away from white-sand beaches as well as the world-class entertainment, big-city, and nature activities that have made the area one of the top ten tourist destinations in the world. More information about living in Abu Dhabi can be found here:
https://nyuad.nyu.edu/en/campus-life/living-in-abu-dhabi.html
*About NYUAD:*
NYU Abu Dhabi is a degree-granting research university with a fully
integrated liberal arts and science undergraduate program in the Arts,
Sciences, Social Sciences, Humanities, and Engineering. NYU Abu Dhabi, NYU
New York, and NYU Shanghai form the backbone of NYU’s global network
university, an interconnected network of portal campuses and academic
centers across six continents that enable seamless international mobility
of students and faculty in their pursuit of academic and scholarly
activity. This global university represents a transformative shift in
higher education, one in which the intellectual and creative endeavors of
academia are shaped and examined through an international and multicultural
perspective. As a major intellectual hub at the crossroads of the Arab
world, NYU Abu Dhabi serves as a center for scholarly thought, advanced
research, knowledge creation, and sharing through its academic, research,
and creative activities.
Hello colleagues,
I am recruiting a student for a fully funded, 3-year PhD studentship at Queen Margaret University, in collaboration with Edinburgh Napier University. The studentship will be supervised by me (Dr Jamal Mansour) at Queen Margaret University as well as Dr Alex McIntyre and Dr Faye Skelton of Edinburgh Napier University. The project is entitled "Reducing cross-race identification errors via the creation of a cross-race recognition training tool and a diagnostic test of cross-race recognition ability".
For more information about the project, please see here: https://www.qmu.ac.uk/ad/napier-collaborative-phds. Potential applicants are encouraged to contact me via email (jmansour(a)qmu.ac.uk) to discuss the project. Applications are due by Wednesday, June 16, 2021 and should be submitted to the Queen Margaret University Graduate School. Details of how to apply can be found here: https://www.qmu.ac.uk/study-here/how-to-apply/phd-and-professional-doctorat….
Could I ask you to please circulate this opportunity via your relevant networks and share it with any potential candidates?
Very best wishes,
Jamal.
---------------------------------------------------------------------------------
Jamal K. Mansour, PhD
Senior Lecturer in Psychology
Psychology, Sociology, & Education
Queen Margaret University
Edinburgh, UK
EH21 6UU
Email: jmansour(a)qmu.ac.uk
Phone: +44 (0) 131 474 0000 and say my name (Jam-el Man-sir) when prompted
Fax: +44 (0) 131 474 0001
Web: https://www.qmu.ac.uk/schools-and-divisions/psychology-and-sociology/psycho…
Memory Research Group Web site: https://memoryresearchgroup.wordpress.com/
Twitter: @EyewitnessIDUp
Check out my new article on eyewitness identification confidence: https://www.sciencedirect.com/science/article/abs/pii/S2211368120300048
Dear All,
Please find below the invitation to contribute to the 2nd Workshop and Competition on Affective Behavior Analysis in-the-wild (ABAW) to be held in conjunction with the International Conference on Computer Vision (ICCV) 2021.
(1): The Competition is split into three Challenge-Tracks, all based on the same database, Aff-Wild2, which is the first comprehensive benchmark for the three affect recognition tasks in-the-wild:
* dimensional affect recognition (valence and arousal estimation)
* categorical affect classification (classification of the seven basic expressions)
* facial action unit detection
Aff-Wild2 is an audiovisual in-the-wild database of 564 videos of around 2.8M frames.
Participants may take part in one or more of these Challenges.
There will be one winner per Challenge-Track. Each winner is expected to contribute a paper describing their approach, methodology, and results; all other teams may also submit a paper describing their solutions and final results. All accepted papers will be included in the ICCV 2021 proceedings.
For more information about the challenge, see: https://ibug.doc.ic.ac.uk/resources/iccv-2021-2nd-abaw/
Important Dates:
* Call for participation announced, team registration begins, data available:
12 May, 2021
* Final submission deadline:
10 July, 2021
* Winners Announcement:
11 July, 2021
* Final paper submission deadline:
21 July, 2021
* Review decisions sent to authors; Notification of acceptance:
10 August, 2021
* Camera ready version deadline:
17 August, 2021
Chairs:
Dimitrios Kollias, University of Greenwich, UK
Stefanos Zafeiriou, Imperial College London, UK
Irene Kotsia, Middlesex University London, UK
Elnar Hajiyev, Realeyes - Emotional Intelligence
(2): The Workshop solicits contributions on recent progress in the recognition, analysis, generation, and modelling of face, body, and gesture, while embracing the most advanced systems available for face and gesture analysis, particularly in-the-wild (i.e., in unconstrained environments) and across modalities such as face to voice.
Original high-quality contributions, including:
- databases,
- surveys and comparative studies, or
- Artificial Intelligence / Machine Learning / Deep Learning / AutoML / (data-driven or physics-based) generative modelling methodologies (either uni-modal or multi-modal),
are solicited on the following topics:
i) "in-the-wild" facial expression or micro-expression analysis,
ii) "in-the-wild" facial action unit detection,
iii) "in-the-wild" valence-arousal estimation,
iv) "in-the-wild" physiological-based (e.g., EEG, EDA) affect analysis,
v) domain adaptation for affect recognition in the previous four cases,
vi) "in-the-wild" face recognition, detection or tracking,
vii) "in-the-wild" body recognition, detection or tracking,
viii) "in-the-wild" gesture recognition or detection,
ix) "in-the-wild" pose estimation or tracking,
x) "in-the-wild" activity recognition or tracking,
xi) "in-the-wild" lip reading and voice understanding,
xii) "in-the-wild" face and body characterization (e.g., behavioral understanding),
xiii) "in-the-wild" characteristic analysis (e.g., gait, age, gender, ethnicity recognition),
xiv) "in-the-wild" group understanding via social cues (e.g., kinship, non-blood relationships, personality)
Accepted papers will appear in the ICCV 2021 proceedings.
Important Dates:
Paper Submission Deadline: 21 July, 2021
Review decisions sent to authors; Notification of acceptance: 10 August, 2021
Camera-ready version deadline: 17 August, 2021
Chairs:
Dimitrios Kollias, University of Greenwich, UK
Stefanos Zafeiriou, Imperial College London, UK
Irene Kotsia, Middlesex University London, UK
Elnar Hajiyev, Realeyes - Emotional Intelligence
In case of any queries, please contact D.Kollias(a)greenwich.ac.uk
Kind Regards,
Dimitrios Kollias,
on behalf of the organising committee
===================================================
Dr Dimitrios Kollias
Senior Lecturer in Computer Science (Artificial Intelligence)
School of Computing and Mathematical Sciences
University of Greenwich
====================================================
Dear colleagues
We are delighted to announce the opening of a 3-year ERC-funded postdoctoral position in the FaceSyntax lab (Director: Prof. Rachael Jack, University of Glasgow, Scotland). Application portal here: https://www.jobs.ac.uk/job/CFV656/research-assistant-associate; Deadline: 3 June 2021
Please share with your networks.
Many thanks
Prof. Rachael E. Jack, Ph.D.
Professor of Computational Social Cognition
Institute of Neuroscience & Psychology
School of Psychology
University of Glasgow
Scotland, G12 8QB
+44 (0) 141 330 5087
Dear colleagues,
We are inviting abstract submissions for a special session on “Artificial
Intelligence for Automated Human Health-care and Monitoring”, as part of
the 16th IEEE International Conference on Automatic Face and Gesture
Recognition (FG’21, http://iab-rubric.org/fg2021/), December 15-18, 2021.
Details on the special session follow below.
Title, abstract, list of authors, as well as the name of the corresponding
author should be emailed directly to Abhijit Das (abhijitdas2048(a)gmail.com).
Please submit your abstracts before Sunday, May 8th, 2021. The expected paper submission deadline is 1st August 2021.
Feel free to contact Abhijit Das, if you have any further questions.
Kindly circulate this email to others, who might be interested.
We look forward to your contributions!
Abhijit Das (Thapar University, India)
Antitza Dantcheva (Inria, France)
Srijan Das (Stony Brook University, USA)
François Brémond (Inria, France)
Xilin Chen (CAS, China)
Hu Han (CAS, China)
-------------------------------------------------------------------------------------
*Call for abstracts for the FG 2021 special session*
*on*
*Artificial Intelligence for Automated Human Health-care and Monitoring*
------------------------------------------------------------------------------------
Automated human health monitoring based on computer vision has gained rapid scientific attention over the past decade, fueled by a large number of research articles and commercial systems based on features extracted from face and gesture. Recently, the COVID-19 pandemic has increased the need for virtual diagnosis and for monitoring health protocols (such as regulations for social distancing, surveillance of mask wearing in crowds, and gauging body temperature and other physiological measurements from a distance). Consequently, researchers from computer vision as well as from the medical science community have given significant attention to goals ranging from patient analysis and monitoring to diagnostics (e.g., for dementia, depression, general healthcare, physiological measurements, and rare neurologic diseases). Moreover, healthcare represents an area of broad economic [1], social, and scientific impact.
We aim to document recent advancements in automated healthcare, as well as to enable and discuss further progress. The goal of this special session is therefore to bring together researchers and practitioners working in this area of computer vision and medical science, and to address a wide range of theoretical and practical issues related to real-life healthcare systems.
Topics of interest include, but are not limited to:
· Health monitoring based on face analysis,
· Health monitoring based on gesture analysis,
· Health monitoring based on corporeal visual features,
· Depression analysis based on visual features,
· Face analytics for human behaviour understanding,
· Anxiety diagnosis based on face and gesture,
· Physiological measurement employing face analytics,
· Databases on health monitoring, e.g., depression analysis,
· Augmentative and alternative communication,
· Human-robot interaction,
· Home healthcare,
· Technology for cognition,
· Automatic emotional hearing and understanding,
· Visual attention and visual saliency,
· Assistive living,
· Privacy preserving systems,
· Quality of life technologies,
· Mobile and wearable systems,
· Applications for the visually impaired,
· Sign language recognition and applications for hearing impaired,
· Applications for the ageing society,
· Personalized monitoring,
· Egocentric and first-person vision,
· Applications to improve health and wellbeing of children and elderly.
------------------------------
[1] https://www.prnewswire.com/news-releases/healthcare-automation-market-to-re…
Dear All,
I would appreciate it if you'd circulate the following opportunity.
The Institute of Psychology at the University of Pecs, Hungary, is planning to start a PhD program for international students. During the program, among other possibilities, students can join a research project that aims at extending our knowledge about the cognitive background of face perception. We're particularly interested in how semantic knowledge about a person interacts with affective processes. The student will have access to the following equipment in our lab:
- device for accurate reaction time measurement (Cedrus)
- eye-tracker (Tobii TX300)
- physiological measurements (BIOPAC modules: EDA, heart rate, respiration rate, EMG, etc.)
- EEG
Please note that there is a tuition fee (3,500 euros per semester in the first and second years, and 2,500 euros per semester in the third and fourth years). However, from the second year on, students from selected countries may apply for a Stipendium Hungaricum scholarship, which covers both the tuition fee and living costs. The list of eligible countries can be found here:
https://stipendiumhungaricum.hu/partners/
We are still waiting for official approval of the PhD program from the university administration, hence applications will be possible only from July. Until then, informal enquiries can be sent to kocsor.ferenc(a)pte.hu. Students with a background in psychology, biology, or other related fields are welcome to apply.
Best regards,
*Ferenc Kocsor, PhD*
Institute of Psychology
Faculty of Humanities
University of Pécs
psychology.pte.hu <https://psychology.pte.hu/ferenc-kocsor-phd>
evolutionpsychology.com
Dear colleagues
We are delighted to announce the opening of a 3-year fully funded PhD position in the FaceSyntax lab (Director: Prof. Rachael Jack, University of Glasgow, Scotland). Please find advert attached. Sharing widely would be greatly appreciated.
Best wishes,
Prof. Rachael E. Jack, Ph.D.
Professor of Computational Social Cognition
Institute of Neuroscience & Psychology
School of Psychology
University of Glasgow
Scotland, G12 8QB
+44 (0) 141 330 5087
** Postdoc opportunity ** for neuroscientists and/or neuro-enthusiast engineers
We are seeking to appoint a highly motivated, adventurous, interdisciplinary neuroscientist to help us understand how humans prioritise information in support of the decisions they make. We focus on visual perception, attention, and individual differences, and will be using psychometric assessments and psychophysical measurements as well as concurrent EEG-fMRI recording. Full time, fixed term for up to 24 months, with the possibility of extension.
This role falls within the remit of CHAI, a new £2.4M research project funded by the UK EPSRC and led by the Internet of Things and Security Centre (ISEC) at the University of Greenwich, in collaboration with University College London (UCL), the University of Bristol, and Queen Mary University of London, as well as industrial partners. As a follow-up to the €1.5M EU project “Cocoon: Emotion psychology meets Cyber Security” (2016-2020), led by us, which measured and established how users of connected Internet-of-Things devices react to cyber security risks, “CHAI: Cyber Hygiene in AI-enabled domestic life” (2021-2024) examines the particular threats posed by Artificial Intelligence. CHAI addresses the challenge of figuring out how best to help users protect themselves against the security risks they will face in a world supported by AI (see: http://bit.ly/chai-introduction-video).
The role advertised relates to the fundamental research on human decision making that will inform the technical and pedagogical developments of our partners in the project. The remit is intentionally flexible to allow emerging opportunities for collaborations and includes funding for networking and training.
You will have:
- excellent computer and statistical skills, including a programming language (Python)
- experience analysing neuroimaging data, including EEG and/or fMRI
- the motivation to pursue research at the interface between academia and industry
This work will be carried out in the Centre for Integrative Neuroscience & Neurodynamics (CINN), in the School of Psychology and Clinical Language Sciences at the University of Reading. CINN is a research platform currently bringing together over 100 research staff and students. It hosts a research-dedicated 3T Siemens PRISMA MR scanner, MR-compatible EEG (Brain Products) and TMS (MagVenture) equipment, eye-tracking, and versatile computing clusters (incl. cloud management and GPUs), all of which are available to the project.
Other research projects in the lab currently include the development of a novel Bayesian framework for the analysis of data in psychology and neuroscience, neuroimaging of visual perception and attention, as well as projects with industrial partners on topics as varied as machine learning and brain-computer interfaces for a wide range of sectors.
CINN and the School of Psychology are a tight-knit community, committed to open research and reproducibility and in many ways at the forefront of practice in the UK. The post holder will have the opportunity to participate in many initiatives, and to propose new ones, including training events on reproducibility and best practices in neuroimaging, data analysis, and coding, such as Software Carpentry workshops.
The University of Reading was the first University in the UK to publicly commit to open research. It is one of the first institutional members of the UK Reproducibility Network, and a member of the data and software Carpentries. The University is signatory to the Leiden Manifesto for Research Metrics (http://leidenmanifesto.org), is committed to having a diverse and inclusive workforce, supports the gender equality Athena SWAN Charter and the Race Equality Charter, and is a Diversity Champion for Stonewall, the leading LGBT+ rights organisation in the UK. Applications for job-share, part-time and flexible working arrangements are welcomed and will be considered in line with the project’s needs.
Deadline: 19/4
Interviews (online): Early May
Informal inquiries to Etienne Roesch, e.b.roesch(a)reading.ac.uk
More information at: https://jobs.reading.ac.uk/displayjob.aspx?jobid=7590
Etienne
—
dr. Etienne B. Roesch | Associate Professor of Cognitive Science | Univ. Reading
Deputy Director of Imaging, Centre for Integrative Neuroscience and Neurodynamics (CINN)
Programme Director, MSc Cognitive Neuroscience, School of Psychology and Clinical Language Sciences
meet: Book yourself in my University Outlook calendar
www: etienneroes.ch | osf.io/ede88 | github.com/eroesch | rdg.ac.uk/cinn
Dear colleagues,
We are happy to invite proposals for contributions to a forthcoming special issue “Bridging the gap between intergroup and face perception research: Understanding the mechanisms underlying the other-‘race’ effect” for the British Journal of Psychology (see https://bpspsychub.onlinelibrary.wiley.com/page/journal/20448295/homepage/c…)
Interested authors should initially submit a proposal (including title, prospective author(s), affiliation(s), and an abstract of at most 200 words) to bjop(a)wiley.com no later than 1 March 2021.
We look forward to receiving your proposal.
Best wishes,
Marleen Stelter (marleen.stelter(a)uni-hamburg.de)
Stefan Schweinberger (stefan.schweinberger(a)uni-jena.de)
--
Dr. Marleen Stelter
Universität Hamburg
Department of Social Psychology
Von-Melle-Park 5 - 20146 Hamburg, Germany
Tel.: +49 (0) 40 42838-5530
UHH personal webpage: https://www.psy.uni-hamburg.de/arbeitsbereiche/sozialpsychologie/personen/s…
Call For Papers - Frontiers Research Topic: Affective Shared Perception
I. Aim and Scope
Our perception of the world depends on both sensory inputs and prior knowledge. This applies to sensing in general and has particular implications for affective understanding. Humans adapt their social and affective perception as a function of the current stimulation, the context, the history of the interaction, and the status of the partner. This influences their behavior, which in turn modifies the social and affective perception of both partners and the evolution of the interaction.
Understanding shared perception as part of affective processing will allow us to tackle this problem and to provide the next step towards a real-world affective computing system. The goal of this research topic is to present and discuss new findings, theories, systems, and trends in affective shared perception and computational models.
We are interested in collecting exciting research from researchers in the areas of social cognition, affective computing, and human-robot interaction, including, but not restricted to, specialists in computer and cognitive science, psychologists, neuroscientists, and specialists in bio-inspired solutions. We envision that this collection will allow us to tackle the existing problems in this area and provide the next step towards a real-world affective computing system.
II. Potential Topics
Topics include, but are not limited to:
- Affective perception and learning
- Affective modulation and decision making
- Developmental perspectives of shared perception
- Machine learning for shared perception
- Bio-inspired approaches for affective shared perception
- Affective processing for embodied and cognitive robots
- Multisensory modeling for conflict resolution in shared perception
- New psychological findings on shared perception
- Assistive aspects and applications of shared affective perception
III. Submission
- Abstract - 24 October 2020
- Paper Submission - 21 February 2021
IV. Guest Editors
Pablo Vinicius Alves De Barros, Italian Institute of Technology (IIT), Genova, Italy
Alessandra Sciutti, Italian Institute of Technology (IIT), Genova, Italy
Ginevra Castellano, Uppsala University, Uppsala, Sweden
Yukie Nagai, The University of Tokyo, Bunkyō, Japan
----------------------------------------
Dr. Pablo Barros
Postdoctoral Researcher - CONTACT Unit
Istituto Italiano di Tecnologia – Center for Human Technologies
Via Enrico Melen 83, Building B, 16152 Genova, Italy
email: pablo.alvesdebarros(a)iit.it
website: https://www.pablobarros.net
twitter: @PBarros_br