Dear colleagues,
I would be grateful if you could share this call for EoIs with your colleagues and on your mailing lists.
Deadline 4 FEB 2019.
Thanks
The Center for Social, Cognitive, Affective Neuroscience (cSCAN: http://cscan.gla.ac.uk/) at the University of Glasgow, Scotland, seeks expressions of interest (EoI) in the Future Leaders Fellowship (FLF) from UK Research and Innovation (www.ukri.org/funding/funding-opportunities/future-leaders-fellowships/). The FLF seeks to “develop, retain, attract and sustain research and innovation talent in the UK” and offers a research-focused post for 4 years, plus a possible 3 additional years, followed by an ongoing academic post within cSCAN.
cSCAN researchers address fundamental mechanisms of social perception, social cognition, and social interaction from a unique, highly transdisciplinary perspective that spans psychology, neuroscience, and the computational/engineering sciences. More details can be found at http://cscan.gla.ac.uk/. Only candidates who are early-career researchers (within ~6-7 years post-PhD), with very strong publication records relative to career stage and with clear added value to the cSCAN research program, will be considered.
Please email a current CV and a 2-page research statement to rachael.jack@glasgow.ac.uk by February 4th, 2019 at the latest. Include ‘Interest in Future Leaders Fellowship’ in the subject line of the email. We will be in touch with short-listed applicants to progress to the next stage and will work with them to develop applications for submission to UKRI. Applicants should refer to the UKRI website for deadline information.
Dr. Rachael E. Jack, Ph.D.
Reader
Institute of Neuroscience & Psychology
School of Psychology
University of Glasgow
+44 (0) 141 5087
www.psy.gla.ac.uk/schools/psychology/staff/rachaeljack/
Apologies for cross-posting
***********************************************************************************
CBAR 2019: CALL FOR PAPERS
6th International Workshop on CONTEXT BASED AFFECT RECOGNITION
https://cbar2019.blogspot.com/
Submission Deadline: January 28th, 2019
***********************************************************************************
The 6th International Workshop on Context Based Affect Recognition (CBAR
2019) will be held in conjunction with FG 2019 in May 2019 in Lille
France – http://fg2019.org/
For details concerning the workshop program, paper submission, and
guidelines please visit our workshop website at:
https://cbar2019.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/
Dear all,
I am on the hunt for a face database that contains expressions
of embarrassment, guilt, flirtation, boredom, arrogance, and admiration, as
well as neutral and basic-emotion expressions (e.g. happy, angry, sad, etc.),
for an EEG experiment I am running with a dissertation student of mine. If
anyone can point me in the right direction or has access to a number of
faces with these expressions, I would be most grateful!
Best wishes,
Nicola
--
Dear face researchers,
We want to create caricatures of some famous faces that we used in an
experiment last year. These include both men and women across a range of
ages from the mid-20s to 60-70s (and the Queen). All are Caucasian/White.
Does anyone have male and female average faces for approximately <30 years,
30-60 years, and 60+ years that they would be willing to share with us?
We also have the problem that many, but not all, of the faces show teeth,
so we will probably need separate averages for mouths with teeth showing
versus no teeth showing...
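In case it helps anyone thinking along the same lines: the caricaturing step itself is usually just a linear extrapolation of landmark positions away from the matched average. Below is a minimal Python sketch of that idea (the toy coordinates and the 1.5 exaggeration factor are made up for illustration, not taken from any of the stimuli mentioned above):

import numpy as np

def caricature(face_landmarks, average_landmarks, strength=1.5):
    # strength = 1.0 reproduces the original face; values > 1.0 exaggerate
    # its deviations from the average (a caricature); 0 < strength < 1
    # gives an anti-caricature.
    return average_landmarks + strength * (face_landmarks - average_landmarks)

# Toy example with three 2-D landmarks (illustrative values only).
avg = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
face = np.array([[0.1, -0.05], [0.9, 0.1], [0.5, 1.2]])
print(caricature(face, avg, strength=1.5))

The exaggerated landmarks would then drive whatever image-warping tool is used to render the caricatured face.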
Thanks!
Rachel
--
“It is not our differences that divide us. It is our inability to
recognize, accept, and celebrate those differences.” - Audre Lorde
Apologies for cross-posting
***********************************************************************************
CBAR 2019: CALL FOR PAPERS
6th International Workshop on CONTEXT BASED AFFECT RECOGNITION
https://cbar2019.blogspot.com/
Submission Deadline: January 28th, 2019 (Extended)
***********************************************************************************
The 6th International Workshop on Context Based Affect Recognition (CBAR
2019) will be held in conjunction with FG 2019 in May 2019 in Lille
France – http://fg2019.org/
For details concerning the workshop program, paper submission, and
guidelines please visit our workshop website at:
https://cbar2019.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/
Dear members,
We are pleased to announce the publication of a first-of-its-kind database of
children's spontaneous facial expressions: the LIRIS Children
Spontaneous Facial Expression Video Database (LIRIS-CSE). This unique
database contains spontaneous/natural facial expressions of 12 children
in diverse settings, with variable recording scenarios, showing the six
universal or prototypic emotional expressions (happiness, sadness, anger,
surprise, disgust, and fear). Children were recorded in a constraint-free
environment (no restriction on head or hand movement, free seating, no
restrictions of any sort) while they watched specially built or selected
stimuli. This constraint-free environment allowed us to record children's
spontaneous/natural expressions as they occurred. The database has been
validated by 22 human raters. Details of the database are presented in the
following publication:
Rizwan Ahmed Khan, Arthur Crenn, Alexandre Meyer, and Saida Bouakaz. A novel
database of children's spontaneous facial expressions (LIRIS-CSE). arXiv
preprint arXiv:1812.01555, 2018. https://arxiv.org/abs/1812.01555
To request the database download (for research purposes only), visit the
project webpage at: https://childrenfacialexpression.projet.liris.cnrs.fr/
__________________
Best Regards,
Dr. Rizwan Ahmed KHAN
Associate Professor, Barrett Hodgson University, Karachi, Pakistan.
|| Researcher
- Laboratoire d'InfoRmatique en Image et Systèmes d'information (LIRIS),
Lyon, France.
<https://sites.google.com/site/drkhanrizwan17/>
<http://scholar.google.com/citations?user=T66djn8AAAAJ&hl=en>
<http://dblp.uni-trier.de/pers/hd/k/Khan:Rizwan_Ahmed.html>
<https://www.youtube.com/user/Rizwankhan2000/videos?view_as=subscriber>
*Help preserve the color of our world – Think before you print.*
Apologies for cross-posting
***********************************************************************************
CBAR 2019: CALL FOR PAPERS
6th International Workshop on CONTEXT BASED AFFECT RECOGNITION
https://cbar2019.blogspot.com/
Submission Deadline: January 7th, 2019
***********************************************************************************
The 6th International Workshop on Context Based Affect Recognition (CBAR
2019) will be held in conjunction with FG 2019 in May 2019 in Lille
France – http://fg2019.org/
For details concerning the workshop program, paper submission, and
guidelines please visit our workshop website at:
https://cbar2019.blogspot.com/
Best regards,
Zakia Hammal
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/
CALL FOR PAPERS
Extended Deadline for the IEEE Transactions on Affective Computing
Special Issue on Automated Perception of Human Affect from Longitudinal
Behavioral Data
Website:
https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/tacSpecialIssue201…
I. Aim and Scope
Research trends within artificial intelligence and cognitive sciences are
still heavily based on computational models that attempt to imitate human
perception in various behavior categorization tasks. However, most of the
research in the field focuses on instantaneous categorization and
interpretation of human affect, such as the inference of six basic emotions
from face images, and/or affective dimensions (valence-arousal), stress and
engagement from multi-modal (e.g., video, audio, and autonomic physiology)
data. This diverges from the developmental aspect of emotional behavior
perception and learning, where human behavior and expressions of affect
evolve and change over time. Moreover, these changes are present not only
in the temporal domain but also within different populations and more
importantly, within each individual. This calls for a new perspective when
designing computational models for analysis and interpretation of human
affective behaviors: models that can adapt to different contexts and
individuals over time, efficiently and in a timely manner, while also
incorporating existing neurophysiological and psychological findings (prior
knowledge). Thus, the long-term goal is to create life-long personalized
learning and inference systems for analysis and perception of human
affective behaviors. Such systems would benefit from long-term contextual
information (including demographic and social aspects) as well as
individual characteristics. This, in turn, would allow building intelligent
agents (such as mobile and robot technologies) capable of adapting their
behavior in a continuous and on-line manner to the target contexts and
individuals.
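As a purely illustrative sketch (not part of the call; the class and variable names below are assumptions), one simple form of such per-individual, online adaptation is a population-level predictor augmented with a per-subject correction that is updated as new observations arrive:

class PersonalizedAffectModel:
    # Wraps any population-level predictor (features -> affect estimate,
    # e.g. valence) with a per-subject bias that adapts online over time.
    def __init__(self, population_model, learning_rate=0.1):
        self.population_model = population_model
        self.learning_rate = learning_rate
        self.subject_bias = {}  # subject_id -> running correction

    def predict(self, subject_id, features):
        return self.population_model(features) + self.subject_bias.get(subject_id, 0.0)

    def update(self, subject_id, features, observed_affect):
        # Nudge the subject-specific correction toward the current prediction error.
        error = observed_affect - self.predict(subject_id, features)
        self.subject_bias[subject_id] = (
            self.subject_bias.get(subject_id, 0.0) + self.learning_rate * error)

A full contribution would of course go well beyond such a baseline, for example by modeling temporal dynamics and contextual factors explicitly.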
This special issue aims at contributions from computational neuroscience
and psychology, artificial intelligence, machine learning, and affective
computing, challenging and expanding current research on interpretation and
estimation of human affective behavior from longitudinal behavioral data,
i.e., single or multiple modalities captured over extended periods of time
allowing efficient profiling of target behaviors and their inference in
terms of affect and other socio-cognitive dimensions. We invite
contributions focusing on theoretical and modeling perspectives as well as
on applications spanning human-human, human-computer, and human-robot
interaction.
II. Potential Topics
Giving computational models the capability to perceive and understand
emotional behavior is an important and popular research topic. Recent
special issues of the IEEE Transactions on Affective Computing have
accordingly covered topics ranging from emotion behavior analysis
“in-the-wild” to personality analysis. However, most of the research
published through these calls treats emotional behavior as an instantaneous
event, relating mostly to emotion recognition, and thus neglects the
development of complex models of emotional behavior. Our special issue will
foster the development of the field by focusing on excellent research on
emotion models for long-term behavior analysis.
The topics of interest for this special issue include, but are not limited
to:
- New theories and findings on continuous emotion recognition
- Multi- and Cross-modal emotion perception and interpretation
- Lifelong affect analysis, perception, and interpretation
- Novel neural network models for affective processing
- New neuroscientific and psychological findings on continuous emotion
representation
- Embodied artificial agents for empathy and emotion appraisal
- Machine learning for affect-driven interventions
- Socially intelligent human-robot interaction
- Personalized systems for human affect recognition
III. Submission
Prospective authors are invited to submit their manuscripts electronically,
adhering to the IEEE Transactions on Affective Computing guidelines (
https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=5165369). Please
submit your papers through the online system (
https://mc.manuscriptcentral.com/taffc-cs) and be sure to select the
special issue: Special Issue/Section on Automated Perception of Human
Affect from Longitudinal Behavioral Data.
IV. IMPORTANT DATES:
Submission Deadline: February 15th, 2019
V. Guest Editors
Pablo Barros, University of Hamburg, Germany
Stefan Wermter, University of Hamburg, Germany
Ognjen (Oggi) Rudovic, Massachusetts Institute of Technology, United States
of America
Hatice Gunes, University of Cambridge, United Kingdom
--
Dr. Pablo Barros
Postdoctoral Research Associate - Crossmodal Learning Project (CML)
Knowledge Technology
Department of Informatics
University of Hamburg
Vogt-Koelln-Str. 30
22527 Hamburg, Germany
Phone: +49 40 42883 2535
Fax: +49 40 42883 2515
barros at informatik.uni-hamburg.de
http://www.pablobarros.net
https://www.inf.uni-hamburg.de/en/inst/ab/wtm/people/barros.html
https://www.inf.uni-hamburg.de/en/inst/ab/wtm/