The BPS Cognitive Section annual meeting will be held in Stirling from 26th to 28th August. One theme will be faces. If you think you would like to present, please let me know. The formal call for submissions will follow later in the spring.
There will be two additional reasons to make the trip:
Either as part of the meeting, or alongside it, we are planning to mark the retirement of Vicki Bruce with talks by her former students and colleagues.
The Edinburgh Festival will be in full swing at the time, so you could go along there either before or after the meeting.
Peter
Peter Hancock
Professor
Psychology, School of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
http://rms.stir.ac.uk/converis-stirling/person/11587
@pjbhancock
Psychology at Stirling 100% 4* Impact, REF 2014
My messages may arrive outside of the working day but this does not imply any expectation that you should reply outside of your normal working hours. If you wish to respond, please do so when convenient.
The University of Stirling is a charity registered in Scotland, number SC 011159.
Applications are invited from highly motivated researchers for a postdoctoral position immediately available in the Computational & Cognitive Neuroscience Lab, led by Prof. Angela Yu, at the University of California, San Diego. The initial appointment is for one year, renewable for up to 2-3 years.
Dr. Yu’s lab applies modern machine learning and statistical tools to extract computational principles that underlie cognitive processes that enable intelligent behavior, in particular how humans and other intelligent agents perform inference, learning, decision-making, and social interactions under conditions of uncertainty and non-stationarity. Current interests include perception, face processing, active sensing, judgement and decision making, and social cognition.
Applicants should be committed to applying rigorous mathematical tools to model cognitive functions and/or their neural underpinnings. Experience or interest in carrying out human behavioral experiments (either in person or on Amazon M-Turk) and/or collaborating with other neuroimaging/neurophysiology laboratories is desirable.
Dr. Yu's lab is situated within the Cognitive Science department, and also affiliated with the Halıcıoğlu Data Science Institute, the Computer Science Department, the Neurosciences Graduate Program, and the Institute of Neural Computation. There are ample opportunities for collaborations with related groups across the UCSD main campus, the medical school, and the Salk Institute.
Interested candidates should send a research statement, along with a CV including publications, to Dr. Angela Yu (ajyu(a)ucsd.edu) with the subject “Postdoc Application”. Two or more letters of reference should be sent directly by the recommenders to ajyu(a)ucsd.edu. More information about Dr. Yu’s group can be found at http://www.cogsci.ucsd.edu/~ajyu.
----------------------------------------------
Angela Yu
Associate Professor
Cognitive Science, UCSD
858-822-3317
http://www.cogsci.ucsd.edu/~ajyu
https://ucsd.zoom.us/my/angelajyu
----------------------------------------------
*** Please accept our apologies if you receive multiple copies of this CfP ***
***********************************************************************************
AAP 2020: CALL FOR PAPERS
International Workshop on Automated Assessment of Pain
http://aap-2020.net/
Submission Deadline: January 15th, 2020
***********************************************************************************
The International Workshop on Automated Assessment of Pain (AAP 2020) will be held in conjunction with IEEE FG 2020 on May 18th-22nd in Buenos Aires, Argentina.
For details concerning the workshop program, paper submission, and
guidelines please visit our workshop website at:
http://aap-2020.net/
Best regards,
Zakia Hammal, Steffen Walter, Nadia Berthouze
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/
To whom it may concern,
Please find attached an advert for PhD and postdoc positions at the University of Fribourg, Switzerland, for distribution.
Sincerely,
Meike Ramon
---
Meike Ramon, PhD
SNSF PRIMA Fellow and Group Leader
Applied Face Cognition Lab
University of Fribourg
Department of Psychology
Faucigny 2
1700 Fribourg
Switzerland
Tel : +41 26 300 7533
Fax: +41 26 300 9712
Email : meike.ramon(a)gmail.com
www.meikeramon.com
https://www3.unifr.ch/psycho/de/forschungseinheiten/afcl.html
We would like to let the community members know about the Consortium of European Research on Emotion (CERE) 2020 Conference in Granada, Spain, 5th-6th June. http://www.cere-emotionconferences.org/
The deadline for abstract submission has been extended from 15th January to 31st January (23:59 CET).
Due to many requests for an extension of the deadline, the CERE Granada 2020 Conference organizers have agreed to extend the deadline for submission.
You are welcome to submit your abstract by following the links available
from
http://www.cere-emotionconferences.org/abstract-submission-and-presentation…
Dr. Manuel J. Ruiz and Dr. Inmaculada Valor-Segura
CERE Conference chairs
--
Manuel J. Ruiz, PhD
Departamento de Psicología y Antropología
Personalidad, Evaluación y Tratamiento Psicológico
Universidad de Extremadura
Despacho 2.22 (Edif. Principal)
Facultad de Educación
Avda. de Elvas S/N
06006 Badajoz (Spain)
Tfno:
Email: mjrm(a)unex.es
ORCID: 0000-0002-1286-6624
Call for Multimodal Grand Challenges
***************************************
ICMI 2020: Call for Multimodal Grand Challenges
https://icmi.acm.org/2020/index.php?id=CfC
25-29 Oct 2020, Utrecht, The Netherlands
***************************************
We are calling for teams to propose one or more ICMI Multimodal Grand
Challenges.
The International Conference on Multimodal Interaction (ICMI) is the
premier international forum for multidisciplinary research on multimodal
human-human and human-computer interaction, interfaces, and system
development. Developing systems that can robustly understand human-human
communication or respond to human input requires identifying the best
algorithms and their failure modes. In fields such as computer vision,
speech recognition, computational (para-) linguistics and physiological
signal processing, for example, the availability of datasets and common
tasks has led to great progress. We invite the ICMI community to
collectively define and tackle the scientific Grand Challenges in our
domain for the next 5 years. ICMI Multimodal Grand Challenges aim to
inspire new ideas in the ICMI community and create momentum for future
collaborative work. Analysis, synthesis, and interactive tasks are all
possible.
Challenge papers will be indexed in the main proceedings of ICMI.
The grand challenge sessions are still to be confirmed. We invite
organizers from various fields related to multimodal interaction to propose
and run Grand Challenge events. We are looking for exciting and stimulating
challenges including but not limited to the following categories:
- Dataset-driven challenge: This challenge will provide a dataset that is
exemplary of the complexities of current and future multimodal problems,
and one or more multimodal tasks whose performance can be objectively
measured and compared in rigorous conditions. Participants in the Challenge
will evaluate their methods against the challenge data in order to identify
areas of strengths and weaknesses.
- Use-case challenge: This challenge will provide an interactive system
(e.g., dialog-based or nonverbal) and the associated resources, allowing
people to participate through the integration of specific modules or
alternative full systems. Proposers should also establish systematic
evaluation procedures.
- Health challenge: This challenge will provide a dataset that is exemplary
of a health related task, whose analysis, diagnosis, treatment or
prevention can be aided by Multimodal Interactions. The challenge should
focus on exploring the benefits of multimodal (audio, visual,
physiological, etc) solutions for the stated task.
- Policy challenge: Legal, ethical, and privacy issues of Multimodal
Interaction systems in the age of AI. The challenge could revolve around
opinion papers, panels, discussions, etc.
Prospective organizers should submit a five-page maximum proposal
containing the following information:
1. Title
2. Abstract appropriate for possible Web promotion of the Challenge
3. Distinctive topics to be addressed and specific goals
4. Detailed description of the Challenge and its relevance to multimodal
interaction
5. Length (full day or half day)
6. Plan for soliciting participation
7. Description of how submissions (challenge’s submissions and papers) will
be evaluated, and a list of proposed reviewers
8. Proposed schedule for releasing datasets (if applicable) and/or systems
(if applicable) and receiving submissions
9. Short biography of the organizers (preferably from multiple institutions)
10. Funding source (if any) that supports or could support the challenge
organization
11. Draft call for papers; affiliations and email addresses of the
organizers; summary of the Grand Challenge; list of potential Technical
Program Committee members and their affiliations; important dates
Proposals will be evaluated based on originality, ambition, feasibility,
and implementation plan. Preference will be given to a Challenge whose
dataset(s) or system(s) have pilot results demonstrating their
representativeness and suitability for the proposed task; in such cases,
an additional one-page description must be attached. Continuations of or
variants on the 2019 challenges are welcome, though we ask such
submissions to report the number of participants in the previous year and
to describe what changes (if any) will be made.
The ICMI organizers will offer support with basic logistics, including
rooms and equipment to run the Workshop; coffee breaks can be offered if
synchronised with the main conference.
Important Dates and Contact Details
*Proposals due: January 13, 2020*
Proposal notification: February 3, 2020
Paper camera-ready: August 17, 2020
Proposals should be emailed to both ICMI 2020 Multimodal Grand Challenge
Chairs, Dr. Hayley Hung and Dr. Laura Cabrera-Quiros, via
grandchallenge.ICMI20(a)gmail.com. Prospective organizers are also
encouraged to contact the co-chairs with any questions.
Dear colleagues,
Calling all Psychologists, Neuroscientists & Philosophers for the 20th ACM International Conference on Intelligent Virtual Agents, 9th-12th September 2020 in Glasgow, UK.
Any questions, get in touch!
Rachael E. Jack
IVA 2020 General Co-Chair
Reader (Associate Professor)
Institute of Neuroscience & Psychology
School of Psychology
University of Glasgow
Scotland, G12 8QB
+44 (0) 141 330 5087
Call for Submissions
ACM 20th International Conference on Intelligent Virtual Agents
September 9-12th 2020 University of Glasgow, Scotland
https://iva2020.gla.ac.uk/
SUBMISSION DATES
Papers: Sunday 5th April 2020 (23:59 UTC-12)
Extended Abstracts: Sunday 24th May 2020 (23:59 UTC-12)
---------------------------------------------
2020 Intelligent Virtual Agents
---------------------------------------------
The Intelligent Virtual Agents (IVA) Annual Convention is the premier
international event for interdisciplinary research on the design,
application, and evaluation of Intelligent Virtual Agents (IVAs), with a
specific focus on the ability to interact socially.
*** IVA 2020 will be the 20th Annual Convention, held in Glasgow, Scotland,
9th-12th September 2020. ***
IVAs are interactive characters that exhibit human-like qualities including
communicating using natural human modalities such as facial expressions,
speech and gesture. IVAs are also capable of real-time perception,
cognition, emotion and action that allow them to participate in dynamic
social situations.
IVA 2020 aims to showcase cutting-edge research on the design, application,
and evaluation of IVAs, as well as the basic research underlying the
technology that supports human-agent interaction such as social perception,
dialog modeling, and social behavior planning. We also welcome submissions
on central theoretical issues, uses of virtual agents in psychological
research and showcases of working applications.
IVA 2020 offers two submission tracks: Papers (8 pages, including
references) and Extended Abstracts (3 pages, including references).
All submissions will be double-blind peer-reviewed by external expert
reviewers. All accepted submissions will be published in the proceedings.
Accepted papers will be presented as a talk. Accepted extended abstracts
will be presented as a talk or a poster, depending on the outcome of the
review process.
SCOPE AND LIST OF TOPICS
IVA invites submissions on a broad range of topics, including but not
limited to:
Agent Design and modeling of:
* Cognition
* Emotion (including personality and cultural differences)
* Socially communicative behavior (e.g., of emotions, personality traits)
* Conversational behavior
* Social perception
* Machine learning approaches to agent modeling
* Approaches to realizing adaptive behavior
* Models informed by theoretical and empirical research from psychology
Multimodal interaction:
* Verbal and nonverbal behavior coordination
* Face-to-face communication skills
* Engagement
* Managing co-presence and interpersonal relation
* Multi-party interaction
* Data driven multimodal modeling
Social agent architectures:
* Design criteria and design methodologies
* Engineering of real-time human-agent interaction
* Standards / measures to support interoperability
* Portability and reuse
* Specialized tools, toolkits and tool chains
Evaluation methods and studies:
* Evaluation methodologies and user studies
* Ethical considerations and societal impact
* Applicable lessons across fields (e.g. between robotics and virtual agents)
* Social agents as a means to study and model human behavior
Applications:
* Applications in education, skills training, health, counseling, games, art, etc.
* Virtual agents in games and simulations
* Social agents as tools in psychology
* Migration between platforms
SPECIAL IVA 2020 TOPIC:
Exploring Connections between Computer Science, Robotics and Psychology.
Across computer science, robotics, psychology and the commercial world,
there has been a rapid growth in the research, development and application
of artificial social agents. Computer scientists and roboticists are
researching graphics-based and physical social agents. Psychologists and
neuroscientists are using these artifacts in laboratory experiments in
order to study our interaction with them as well as to use them as
confederates in the study of human behavior. Companies are actively
developing similar technologies. However, these communities too rarely
interact, even though there are close synergies between psychology (the
study of human behavior) and artificial social agents (the engineering of
human behavior). The design of an artificial social agent involves the
formalization of theories and data about human behavior, integration of
resulting models into an agent and evaluation of its behavior, leveraging
techniques derived from psychology. Each of these steps can in turn be of
fundamental value to psychological research. For example, formalization
and integration force one to specify theoretical constructs concretely and
thereby expose hidden assumptions and gaps in theories. IVA 2020’s Special
Topic invites researchers and developers across disciplines to share their
work on the challenges and uses of social agent research, in the hope of
furthering transdisciplinary collaboration.
INSTRUCTIONS FOR AUTHORS
Paper submissions should be anonymous and prepared in the "ACM Standard"
format, more specifically the "SigConf" format.
* The LaTeX template for the "ACM Standard"/"SigConf" format can be found inside the official 2017 ACM Master article template package. Please use the most recent version (1.65) available at: https://www.acm.org/publications/proceedings-template
* The "ACM Standard" Microsoft Word template is currently not part of the downloadable package, as the ACM is revising it to improve the accessibility of resulting PDF documents. Please use the "Interim Word Template" instead: https://www.acm.org/binaries/content/assets/publications/word_style/interim…
* IVA 2020 accepts two types of submissions:
* Full papers: 8 pages (including references)
* Extended abstracts: 3 pages (including references)
* All papers must be submitted in PDF-format.
Important Dates:
Papers
---> Submission Deadline: Sunday 5th April 2020 (23:59 UTC-12)
---> Notification of acceptance: 15th May, 2020
Extended Abstracts
---> Submission Deadline: Sunday 24th May 2020 (23:59 UTC-12)
---> Notification of acceptance: 21st June, 2020
CONFERENCE ORGANIZERS
Conference Chairs:
* Stacy Marsella, University of Glasgow
* Rachael Jack, University of Glasgow
Program Chairs:
* Hannes Vilhjalmsson, Reykjavik University
* Pedro Sequeira, SRI International
* Emily Cross, University of Glasgow
Workshop/Demonstration Organization Chairs:
* Lucile Callebert, University of Glasgow
* Florian Pecune, University of Glasgow
* Contact: workshopsdemos.iva2020(a)gmail.com
Web Site
* Amol Deshmukh, University of Glasgow
Doctoral Consortium
* Jonathan Gratch, ICT/USC
Treasurer
* Catherine Pelachaud, CNRS
Publicity Chair
* Mary Ellen Foster, University of Glasgow
Volunteer Coordinator:
* Carolyn Saund, University of Glasgow
We would like to let the community members know about the Consortium of
European Research on Emotion (CERE) 2020 Conference in Granada, Spain,
5th-6th June. Call for abstracts open until 15th January 2020.
http://www.cere-emotionconferences.org/
Manuel J. Ruiz and Inmaculada Valor
Dear colleagues,
We are organizing a special session on “Computer Vision for Automatic Human
Health Monitoring” in conjunction with the 15th IEEE Conference on
Automatic Face and Gesture Recognition (FG 2020), to be held 18th-22nd May
2020 in Buenos Aires, Argentina. Kindly find the related call for papers
below.
*Important dates*
Paper submission deadline: 10th January 2020, midnight PST (firm deadline,
no further extension)
Paper notification deadline: 10th February 2020
Final camera-ready papers: 28 February 2020
*Submission instructions* can be found at
https://fg2020.org/instructions-of-paper-submission-for-review/.
*For submission*, log into
https://cmt3.research.microsoft.com/FG2020/Submission/Index and proceed to
“create new submission”. Select “special session track and subject area” as
“Special session: Computer Vision for Automatic Human Health Monitoring”.
Accepted papers will be included in the FG2020 proceedings and will appear
in the IEEE Xplore digital library.
Please feel free to contact us for any further details. Kindly disseminate
this email to others who might be interested.
We look forward to your contributions.
Antitza Dantcheva (INRIA, France)
Abhijit Das (USC, USA)
François Brémond (INRIA, France)
Xilin Chen (CAS, China)
Hu Han (CAS, China)
--------------------------------------------------------------------------------------------
*Call for paper for FG 2020 special session *
*on *
*COMPUTER VISION FOR AUTOMATIC HUMAN HEALTH MONITORING*
-----------------------------------------------------------------------------------
Automatic human health monitoring based on computer vision has gained
rapid scientific attention in the last decade, fueled by a large number of
research articles and commercial systems based on features extracted from
face and gesture. Consequently, researchers from the computer vision and
medical science communities have devoted significant attention to this
area, with goals ranging from patient analysis and monitoring to
diagnostics. The goal of this special session is to bring together
researchers and practitioners working at the intersection of computer
vision and medical science, and to address a wide range of theoretical and
practical issues related to real-life healthcare systems.
Topics of interest include, but are not limited to:
Health monitoring based on face analysis,
Health monitoring based on gesture analysis,
Health monitoring based on corporeal visual features,
Depression analysis based on visual features,
Face analytics for human behavior understanding,
Anxiety diagnosis based on face and gesture,
Physiological measurement employing face analytics,
Databases on health monitoring, e.g., depression analysis,
Augmentative and alternative communication,
Human-robot interaction,
Home healthcare,
Technology for cognition,
Automatic emotional hearing and understanding,
Visual attention and visual saliency,
Assistive living,
Privacy preserving systems,
Quality of life technologies,
Mobile and wearable systems,
Applications for the visually impaired,
Sign language recognition and applications for hearing impaired,
Applications for the ageing society,
Personalized monitoring,
Egocentric and first-person vision,
Applications to improve health and wellbeing of children and elderly.