Apologies for cross-postings
Call for challenge participation
Sixth Emotion Recognition in the Wild (EmotiW) Challenge 2018
https://sites.google.com/view/emotiw2018
@ ACM International Conference on Multimodal Interaction 2018, Boulder,
Colorado.
---------------------------------------------------------------------
The sixth Emotion Recognition in the Wild (EmotiW) 2018 Grand Challenge
consists of an all-day event with a focus on affective sensing in
unconstrained conditions. There are three sub-challenges: the engagement in the
wild prediction sub-challenge, the audio-video based emotion classification
sub-challenge, and the image based group emotion recognition sub-challenge.
*Challenge website*: https://sites.google.com/view/emotiw2018
*Contact email*: emotiw2014[AT]gmail.com
*Timeline*
Challenge website - January 2018
Training and validation data available - March 2018
Test data available - 8 June 2018
Last date for uploading the results - 23 June 2018
Paper submission deadline - 1 July 2018
Paper notification - 30 July 2018
Camera-ready papers - 8 August 2018
*Organizers*
Abhinav Dhall (Indian Institute of Technology Ropar, India)
Roland Goecke (University of Canberra, Australia)
Jyoti Joshi
Tom Gedeon (Australian National University, Australia)
--
Abhinav Dhall, PhD
Assistant Professor,
Indian Institute of Technology, Ropar
Webpage: https://goo.gl/5LrRB7
Google Scholar: https://goo.gl/iDwNTx
Hi everyone,
Could I please ask you to pass on this PhD bursary opportunity to any students you think might be interested?
Queen Margaret University (Edinburgh, UK) now invites applications to its PhD bursary competition. One of the bursaries available may be awarded to a student interested in studying eyewitness identification. Dr Jamal Mansour (https://www.qmu.ac.uk/schools-and-divisions/psychology-and-sociology/psycho… ) welcomes applications from competitive students with an honours undergraduate or master's degree. The bursary covers tuition and provides an annual living stipend and a small research budget. The deadline for applications is Friday, March 30. The details of the eyewitness identification project can be found here: https://www.qmu.ac.uk/media/4209/cass-phd-bursary-topics-2018.pdf (BUR18-03). Further details about the competition can be found here: https://www.qmu.ac.uk/study-here/postgraduate-research-study/graduate-schoo…. Jamal would like to encourage anyone who is considering applying to email her directly at jmansour(a)qmu.ac.uk.
Thanks!
Jamal.
---------------------------------------------------------------------------------
Jamal K. Mansour, PhD
Senior Lecturer in Psychology
Psychology & Sociology
Queen Margaret University
Edinburgh, UK
EH21 6UU
Email: jmansour(a)qmu.ac.uk
Phone: +44 (0) 131 474 0000 and say my name (Jam-el Man-sir) when prompted
Fax: +44 (0) 131 474 0001
Web: https://www.qmu.ac.uk/schools-and-divisions/psychology-and-sociology/psycho…
Memory Research Group Web site: https://memoryresearchgroup.wordpress.com/
Twitter: @EyewitnessIDUp
Check out my recent paper on conducting multiple-trial lineup experiments: https://link.springer.com/article/10.3758/s13428-017-0855-0
Participate in our study on legal attitudes! https://www.psytoolkit.org/cgi-bin/psy2.4.0/survey?s=Z8jMR
Thanks Peter, that's very helpful!
And thanks, Lisa DeBruine, for the other email about WebMorph; I had already
applied for an account, so I will check it out. :)
I will also send a separate email to you about template files.
Regards,
Rachel
Hi Rachel,
You might find Webmorph.org useful (it’s a web-based version of PsychoMorph). It has a lot of extra batch functions that are easier to use than PsychoMorph's.
Send me your email and I’ll sign you up for a beta testing account.
I’d also be keen to add a function to webmorph to import FantaMorph templates. If you have any examples of template files you could send me, I can have a bash at writing a conversion script.
Cheers,
Lisa
----------------------------------------------------------
Dr Lisa M DeBruine
Institute of Neuroscience and Psychology
University of Glasgow
58 Hillhead Street
G12 8QB
lisa.debruine(a)glasgow.ac.uk
http://facelab.org
0141 330 5351
----------------------------------------------------------
On 25 Jan 2018, at 09:22, face-research-list-request(a)lists.stir.ac.uk wrote:
Date: Thu, 25 Jan 2018 09:22:57 +0000
From: Peter Hancock <p.j.b.hancock(a)stir.ac.uk>
To: face-research-list Mailing List <face-research-list(a)lists.stir.ac.uk>
Subject: Re: [Face-research-list] PsychoMorph Questions
As far as I know, the lines in PsychoMorph are also just there to help with placement. The section at the end of the file simply tells you which points are joined together. Here’s the start of a ‘standard’ template file’s line section:
39 # 39 line definitions
0 # move on
2 #first line has two points
0 0 #they are both point 0 ## I don’t know the significance of defining a zero length line, this is the pupil
0 # move on
2 #next line has two points
1 1 #other pupil
0
9 # a proper line with 9 points!
2 3 4 5 6 7 8 9 2 # the line forms a ring, starting and ending at point 2
0
9
10 11 12 13 14 15 16 17 10
You can define the lines from within Psychomorph from the delineate menu.
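If it helps to sanity-check a template programmatically, something along the lines of the rough Python sketch below should read one back in. It is untested and only assumes the layout above (point count on the first line, then one x y pair per point, then the line definitions); the name read_tem is just for illustration, and the "move on" zeros are skipped because I don't know what they encode.

# Rough, untested sketch for reading a PsychoMorph .tem file.
# Assumes: first token = number of points, then x y pairs, then the
# line section (line count, then for each line a 0, a point count and
# that many point indices). The "# ..." annotations above are mine,
# not part of the file, but they are stripped here just in case.
def read_tem(path):
    with open(path) as f:
        text = "\n".join(line.split("#", 1)[0] for line in f)
    tokens = text.split()                     # whitespace-separated numbers

    n_points = int(tokens[0])
    points = [(float(tokens[1 + 2 * i]), float(tokens[2 + 2 * i]))
              for i in range(n_points)]

    pos = 1 + 2 * n_points                    # first token of the line section
    n_lines = int(tokens[pos]); pos += 1      # e.g. the 39 above
    lines = []
    for _ in range(n_lines):
        pos += 1                              # the "move on" 0; meaning unknown
        n = int(tokens[pos]); pos += 1        # number of points in this line
        lines.append([int(t) for t in tokens[pos:pos + n]])
        pos += n
    return points, lines

# e.g. points, lines = read_tem("face01.tem")
# for a template laid out like the excerpt above, lines[2] would come back
# as [2, 3, 4, 5, 6, 7, 8, 9, 2], i.e. the ring starting and ending at point 2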
I’ve attached a superbatch file for caricaturing, with some comments
Peter
Hi everyone,
I am trying to learn PsychoMorph, having previously used FantaMorph. I have
read through Clare Sutherland's basic guide, but I need some help with more
detailed questions and I can't find anything on the Wiki site. If anyone can
provide advice on any of these questions, I would be very grateful!
In FantaMorph the lines are purely for visual grouping and don't do
anything; morphing is all about matched dot placement, and you can
check that dots are correctly matched by looking at the triangles. Do the
lines in PsychoMorph do anything, or are they just guides?
Part of the reason I need to know is that I am trying to import FantaMorph
information into PsychoMorph. I have managed to import my dot position
information by taking the lines of paired dot positions from the .fmd files
and putting them into a .tem file, with the number of dots in the first line
corrected. However, I couldn't figure out exactly what the information at the
end of the original .tem files generated by PsychoMorph is, and whether I need
it or something equivalent. It SEEMS to be the information about lines; does
anyone know about this?
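In script form, the conversion I have been doing amounts to roughly the sketch below (a rough Python illustration rather than a finished converter; it assumes the paired x y values have already been pulled out of the .fmd file into a list, the helper name write_tem is made up, and the section at the end of the .tem file is left out because I don't know what belongs there).

# Rough sketch of my FantaMorph-to-PsychoMorph conversion (illustrative only).
# 'points' is a list of (x, y) pairs already extracted from the .fmd file.
# Writes the number of dots on the first line, then one "x y" pair per line.
# The line-definition section at the end is omitted, since I don't know
# what it should contain.
def write_tem(points, path):
    with open(path, "w") as f:
        f.write(f"{len(points)}\n")        # number of dots, corrected
        for x, y in points:
            f.write(f"{x} {y}\n")          # one dot position per line

# e.g. write_tem([(512.0, 384.0), (600.0, 384.0)], "face01.tem")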
I would also love to be able to batch import and/or make caricatures
if I can get the .tem files set up properly. It seems like I might be able
to do this with SuperBatchTransform, but from this page
http://cherry.dcs.aber.ac.uk:8080/wiki/batch
I can't figure out exactly what needs to go in my input file. Does anyone
have an example they would be willing to share?
Thanks!
Rachel
--
You make a living by what you get; you make a life by what you give.
-Winston Churchill.
** Apologies for cross-posting **
**********************************************
CALL FOR PAPERS - FG 2018 WORKSHOPS
Submission deadlines approaching
May 15th and May 19th, 2018
Xi'an, China
Visit: https://fg2018.cse.sc.edu/Workshop.html
**********************************************
The paper submission deadlines for several workshops held in conjunction
with the 2018 edition of the IEEE International Conference on Automatic
Face and Gesture Recognition (FG 2018) are approaching. Prospective
authors are invited to submit a contribution.
** Workshops **
1. 8th Int. Workshop on Human Behavior Understanding in conjunction
with the 2nd Int. Workshop on Automatic Face Analytics for Human
Behavior Understanding
Organizers: Carlos Busso, Xiaohua Huang, Takatsugu Hirayama, Guoying Zhao,
Albert Ali Salah, Matti Pietikäinen, Roberto Vezzani, Wenming Zheng,
Abhinav Dhall
2. Latest developments of FG technologies in China
Organizers: Qingshan Liu, Shiqi Yu, Zhen Lei
3. First Workshop on Large-scale Emotion Recognition and Analysis
Organizers: Abhinav Dhall, Yelin Kim, Qiang Ji
4. Workshop on Dense 3D Reconstruction of 2D Face Images in the Wild
Organizers: Zhenhua Feng, Patrik Huber, Josef Kittler, Xiaojun Wu
5. Face and Gesture Analysis for Health Informatics (FGAHI)
Organizers: Kévin Bailly, Liming Chen, Mohamed Daoudi, Arnaud Dapogny,
Zakia Hammal, Di Huang
6. Facial Micro-Expression Grand Challenge (MEGC): Methods and Datasets
Organizers: Moi Hoon Yap, Sujing Wang, John See, Xiaopeng Hong, Stefanos
Zafeiriou
7. The 1st International Workshop on Real-World Face and Object
Recognition from Low-Quality Images (FOR-LQ)
Organizers: Dong Liu, Weisheng Dong, Zhangyang Wang, Ding Liu
** Additional Information **
For more information on the workshops please visit:
https://fg2018.cse.sc.edu/Workshop.html
--
Assoc. Prof. Vitomir Štruc, PhD
Laboratory for Machine Intelligence
Faculty of Electrical Engineering
University of Ljubljana
Slovenia
Tel: +386 1 4768 839
Fax: +386 1 4768 316
URL: luks.fe.uni-lj.si/nluks/people/vitomir-struc/
Workshop and Tutorial Co-Chair: Automatic Face and Gesture Recognition 2018
http://www.fg2018.org/
Finance Chair: Automatic Face and Gesture Recognition 2019
Guest editor:
Image and Vision Computing SI: Biometrics in the Wild
Several researcher positions (Postdocs and PhD students) are available at the Human Communication Research Group, led by Katharina von Kriegstein. The group is currently based at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig (MPI CBS; http://www.cbs.mpg.de/independent-research-groups/human-communication ) and will transfer to the Psychology Faculty of the TU Dresden in 2018.
The positions are funded by the ERC consolidator grant SENSOCOM. The aim of the SENSOCOM project is to investigate the role of auditory and visual subcortical sensory structures in analysing human communication signals and to specify how their dysfunction contributes to human communication disorders such as developmental dyslexia and autism spectrum disorders. For examples of our work on these topics, see von Kriegstein et al., 2008, Current Biology; Diaz et al., 2012, PNAS; and Müller-Axt et al., 2017, Current Biology. The projects include experiments using cognitive neuroscience methods to understand the basic mechanisms of cortico-subcortical interactions, as well as the development of training programmes that are aimed at creating behavioural intervention programmes for communication deficits (for a brief description see http://cordis.europa.eu/project/rcn/199655_en.html ).
The positions will be based at the TU Dresden. Research will be performed at the Neuroimaging Centre at the TU Dresden ( http://www.nic-tud.de ) and MPI CBS in Leipzig. The centres offer cutting-edge infrastructure with 3-Tesla MRI, 7-Tesla MRI, a Connectom scanner, MRI-compatible eye-tracking, several EEG systems, 306-channel MEG, and neurostimulation units including neuronavigation, TMS and tDCS devices. Besides an excellent infrastructure, the centres offer an international and friendly environment with researchers from diverse backgrounds. All experimental facilities are supported by experienced physics and IT staff. For analyses with high computational demands, there is access to high-performance computing clusters.
Candidates should have a strong interest in perceptual aspects of human communication and experience with experimental methods of cognitive neuroscience, such as psychophysics, functional or structural MRI, TMS, diffusion-weighted imaging, brainstem recordings or EEG/MEG. Experience with clinical populations (e.g. developmental dyslexia) would be an asset but is not essential. PhD student candidates must have a Master’s degree (or equivalent) in neuroscience, clinical linguistics, psychology, cognitive science, biology, or a related field. Postdoc candidates must have a PhD in similar fields and should be able to demonstrate a consistently outstanding academic record, including publications.
The starting date is flexible. The positions are initially for two years (postdocs) or three years (PhD students), with the possibility of extension. Remuneration depends on experience and is based on the Max Planck Society pay scale. MPI CBS is an equal opportunities employer, committed to the advancement of individuals without regard to ethnicity, religion, gender, or disability. PhD students will have the opportunity to participate in the TU Dresden graduate academy (https://tu-dresden.de/ga?set_language=en). TU Dresden is one of eleven German Universities of Excellence and offers an interdisciplinary scientific environment.
To apply, please submit a CV, contact information for two references, a brief personal statement describing your qualifications and future research interests, and copies of up to two of your publications. Please submit your application via our online system at http://www.cbs.mpg.de/vacancies (using the subject heading “ERC 01/18”). The deadline for applications is 15th February 2018. Contact for informal enquiries about the post: Prof. Dr. Katharina von Kriegstein (katharina.von_kriegstein(a)tu-dresden.de).
---
Prof. Dr. Katharina von Kriegstein
Max Planck Institute for Human Cognitive and Brain Sciences
Stephanstr. 1A, 04103 Leipzig, Germany
Technische Universität Dresden
Bamberger Str. 7, 01187 Dresden, Germany
Phone +49 (0) 341-9940-2476
http://www.cbs.mpg.de/independent-research-groups/human-communication
https://twitter.com/kvonkriegstein
Apologies for cross-posting
***********************************************************************************
FGAHI 2018: CALL FOR PAPERS
1st International Workshop on Face and Gesture Analysis for Health Informatics
http://fgahi.isir.upmc.fr
Submission Deadline: January 28th, 2018
***********************************************************************************
The 1st International Workshop on Face and Gesture Analysis for Health Informatics (FGAHI
2018) will be held in conjunction with IEEE FG 2018 on May 15-19, 2018, in Xi’an, China – https://fg2018.cse.sc.edu/
For details concerning the workshop program, paper submission, and
guidelines please visit our workshop website at:
http://fgahi.isir.upmc.fr
Best regards,
Zakia Hammal
Organising committee
Kevin Bailly, Liming Chen, Mohamed Daoudi, Arnaud Dapogny, Zakia Hammal, and Di Huang
Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/
Hi all,
I have a funded PhD studentship on the influence of social contexts and social motivation on face memory, face recognition and first impression formation. The closing date is 26th February. I would be very grateful if you would circulate this advert around your contacts, and send it to any students who you think might be interested.
The School of Psychology at the University of Lincoln has recently moved into a purpose-built building, and is expanding its research expertise in the area of person perception.
https://www.findaphd.com/search/ProjectDetails.aspx?PJID=94153
Thanks a lot
Kay
Dr. Kay Ritchie | Lecturer in Cognitive Psychology
School of Psychology, College of Social Science
University of Lincoln. Brayford Pool, Lincoln, Lincolnshire. LN6 7TS
tel: +44 (0)1522 835463
Web: http://www.lincoln.ac.uk/home/psychology/ | Twitter: @PsychLincoln | @kayritchiepsych | Personal site: https://kayritchie87.wixsite.com/kayritchiepsychology
The University of Lincoln, located in the heart of the city of Lincoln, has established an international reputation based on high student satisfaction, excellent graduate employment and world-class research.