**Please forward to anyone who might be interested**
Apologies for cross-posting
eNTERFACE'17
the 13th Intl. Summer Workshop on Multimodal Interfaces
Porto, Portugal, July 3rd - 28th, 2017
Call for Participation - May 2, 2017
The eNTERFACE 2017 Workshop is being organized this summer in Porto, Portugal, from July 3rd to 28th, 2017. The Workshop will be held at the Digital Creativity Centre (http://artes.ucp.pt/ccd), Universidade Catolica Portuguesa.
The eNTERFACE Workshops offer an opportunity for collaborative research and software development by gathering, in a single place, a team of senior project leaders in multimodal interfaces, PhD students, and undergraduate students to work on a pre-specified list of challenges for four weeks. Participants are organized in teams assigned to specific projects. The ultimate goal is to make this event a unique opportunity for students and experts from all over the world to meet and work together effectively, so as to foster the development of tomorrow's multimodal research community.
Senior researchers, PhD students, or undergraduate students interested in participating in the Workshop should send their application by emailing the Organizing Committee at enterface17@porto.ucp.pt on or before May 2, 2017. The application should contain:
- A short CV
- A list of three preferred projects to work on
- A list of skills to offer for these projects.
Participants must cover their own travel and accommodation expenses. Information about the venue and accommodation is provided on the eNTERFACE'17 website (http://artes.ucp.pt/enterface17). Note that although no scholarships are available for PhD students, there are no application fees.
eNTERFACE'17 will welcome students, researchers, and senior researchers, working in teams on the following projects:
#01 How to Catch A Werewolf, Exploring Multi-Party Game-Situated Human-Robot Interaction
#02 KING'S SPEECH Foreign language: pronounce with style!
#03 The RAPID-MIX API: a toolkit for fostering innovation in the creative industries with Multimodal, Interactive and eXpressive (MIX) technology
#04 Prynth
#05 End-to-End Listening Agent for Audio-Visual Emotional and Naturalistic Interactions
#06 Cloud-based Toolbox for Computer Vision
#07 Across the virtual bridge
#08 ePHoRt project: A telerehabilitation system for reeducation after hip replacement surgery
#09 Big Brother can you find, classify, detect and track us?
#10 Networked Creative Coding Environments
#11 Study of the reality level of VR simulations*
#12 Audiovisual Experience Through Digital Art Holography*
The full detailed description of the projects is available at http://artes.ucp.pt/enterface17/Call.for.participation_eNTERFACE17.html
Dear all,
The School of Psychology at the University of Plymouth has three fully funded, full-time PhD studentships (3 years) and two part-time studentships (5 years, with teaching duties). Full details of all projects are available here:
https://www.plymouth.ac.uk/schools/psychology/phd-studentships
Of interest to this list, there are two potential projects on offer involving the development of face processing in childhood and face processing in social anxiety. Details of these projects are available via the link above, but note that the descriptions are deliberately brief to allow candidates to discuss their own direction with the supervisors.
If you know of any MSc students who are looking to do a PhD in face processing, please forward these details to them, or ask them to contact me directly (chris.longmore@plymouth.ac.uk).
Queries about funding for non-UK applicants should be sent to Prof. Chris Mitchell (christopher.mitchell@plymouth.ac.uk), PG tutor, for the full-time positions, or to Dr Jeremy Goslin (jeremy.goslin@plymouth.ac.uk) for the part-time positions.
Thanks,
Chris
--
Dr Chris Longmore
Admissions Tutor
School of Psychology
Faculty of Health and Human Sciences
Plymouth University
Drake Circus
Plymouth
PL4 8AA
Tel: +44 (0)1752 584890
Fax: +44 (0)1752 584808
Email: chris.longmore@plymouth.ac.uk
Hello all
The Department of Psychology at Bournemouth University is currently advertising permanent Lectureships/Senior Lectureships in face processing. Please see the advert below:
http://www.jobs.ac.uk/job/AXR916/senior-lecturer-lecturer-academic-in-psych…
Many thanks
Sarah
Dear colleagues
We are excited to announce a 4-year, fully ESRC-funded MSc+PhD position on the project "Building a Culturally Flexible Generative Model of Face Signalling for Social Robots" at the Institute of Neuroscience & Psychology, University of Glasgow, UK, in collaboration with Dimensional Imaging (DI4D), Glasgow, UK.
We are looking for someone with multidisciplinary experience and so are casting a very wide net. See the findaphd.com advert here: https://tinyurl.com/maw9fad
I have also attached the advert in PDF and JPEG format. I would be very grateful if you could share it within your departments and with any other people or groups you think might be interested or well connected.
See also my posts on FB and Twitter (@rachaelejack). I would be very grateful if you could RT and share.
Many thanks!
Rachael E. Jack, Ph.D., RSE YAS
Lecturer
Institute of Neuroscience & Psychology
School of Psychology
http://www.gla.ac.uk/schools/psychology/staff/rachaeljack/
Member of the RSE Young Academy of Scotland
The FOX research group (http://cristal.univ-lille1.fr/FOX) of the CRIStAL laboratory (http://www.cristal.univ-lille.fr) (UMR CNRS 9189), France, is looking for a promising candidate to work in the field of Human Behavior Analysis from video under unconstrained settings.
The recognition and prediction of human behaviour from video are major concerns in the field of computer vision. A specific class of behaviour analysis, facial expression recognition, attracts a lot of attention from researchers and industry in various fields.
State-of-the-art solutions work well in controlled environments, where expressions are exaggerated and the subject's head stays still, but as soon as the subject moves freely and expressions are natural, the performance of existing systems drops significantly. This observation is confirmed by performance evaluations conducted on newer datasets (such as RECOLA and GEMEP) whose acquisition conditions are similar to natural interaction settings.
We are looking for a PhD candidate to study and propose algorithms that analyse human behaviour from video in unconstrained environments.
http://sujets-these.lille.inria.fr/details.html?id=a1382de1c76647509ef9e25c…
== Required expertise
Strong preference will be given to candidates with experience in Computer Vision and Pattern Recognition and a good knowledge of written and spoken English. A background in motion analysis would be appreciated.
Applicants are expected to have a strong background in Computer Science. Strong programming skills (C or C++) are a plus. French language skills are not required; English is mandatory.
The thesis will start on October 1st in Lille.
Applications must be sent by email to Prof. Ch. Djeraba (chabane.djeraba@univ-lille1.fr) and IM Bilasco (marius.bilasco@univ-lille1.fr), Subject: [Phd Position].
The application must contain a statement of interest, a CV, a list of publications (if any), and the names of two references.
Please contact Prof. Ch. Djeraba (chabane.djeraba@univ-lille1.fr) or IM Bilasco (marius.bilasco@univ-lille1.fr) for more information.
—
Ioan Marius BILASCO
MCF Univ Lille 1
Centre de Recherche en Image, Signal et Automatique (CRIStAL)
Équipe FOX - Groupe IMAGE
Bureau 336, Bât M3 Ext
Cité Scientifique
59650 Villeneuve d'Ascq Cedex - France
2emePrenom.Nom@univ-lille1.fr
http://www.cristal.univ-lille.fr/~bilasco
phone: (+33) (0)3 20 43 41 88 / (0)3 62 53 15 84
fax: (+33) (0)3 28 77 85 37
Trust in CNRS's certificates: http://igc.services.cnrs.fr/Doc/General/trust.html
Apologies for cross-postings
Call for challenge participation
Fifth Emotion Recognition in the Wild (EmotiW) Challenge 2017
https://sites.google.com/site/emotiwchallenge
@ ACM International Conference on Multimodal Interaction 2017, Glasgow
---------------------------------------------------------------------
The Fifth Emotion Recognition in the Wild 2017 Challenge consists of multimodal classification challenges that mimic real-world conditions. Traditionally, emotion recognition has been performed on laboratory-controlled data. While undoubtedly worthwhile at the time, such lab-controlled data poorly represents the environment and conditions faced in real-world situations. With the increase in the number of video clips available online, it is worthwhile to explore the performance of emotion recognition methods that work 'in the wild'. There are two sub-challenges: audio-video based emotion recognition in videos and group-level emotion recognition in images (new).
Timeline:
Challenge Website Up: early March 2017
Train and validate data available: April 2017
Test data available: 8 July 2017
Last date for uploading the results: 23 July 2017
Paper submission deadline: 10 August 2017
Notification: 1 September 2017
Camera-ready papers: 21 September 2017
Organisers
Abhinav Dhall, Roland Goecke, Jyoti Joshi, Jesse Hoey and Tom Gedeon
Contact
emotiw2014@gmail.com
--
Abhinav Dhall, PhD (ANU)
Assistant Professor,
Indian Institute of Technology Ropar
I asked about the relevance of the Image and Vision Computing (IVC) Special Issue on Biometrics in the Wild to Psychologists. This was Vito's response:
It may be important to provide some context first. The special issue builds on a workshop that we are organizing as part of the 2017 edition of the IEEE Conference on Automatic Face and Gesture Recognition (http://luks.fe.uni-lj.si/bwild17/), but is open to everyone interested in contributing. While we issued a call for biometrics in general, we expect most submitted papers to be face and (maybe gesture) related - due to our connection to AFGR.
I feel that papers on the psychology of face recognition would fit nicely into the scope of the planned special issue. Work on perceptual and cognitive aspects of face recognition and connections to recent machine learning models would certainly be interesting as well. I see a lot of work in this area that would make sense for the special issue.
Peter Hancock
Professor,
Deputy Head of Psychology,
Faculty of Natural Sciences
University of Stirling
FK9 4LA, UK
phone 01786 467675
fax 01786 467641
http://stir.ac.uk/190
http://orcid.org/0000-0001-6025-7068
http://www.researcherid.com/rid/A-4633-2009
Psychology at Stirling: 100% 4* Impact, REF2014
Come and study Face Perception at the University of Stirling! Our unique MSc in the Psychology of Faces is open for applications. For more information see http://www.stir.ac.uk/postgraduate/programme-information/prospectus/psychol…
** Apologies for cross-posting **
****************************************************
CALL FOR PAPERS
Image and Vision Computing (IVC) Special Issue on:
Biometrics in the Wild
Submission deadline: 30 June, 2017
Target publication date: April, 2018
****************************************************
CALL FOR PAPERS
IVC SI: Biometrics in the Wild
** Motivation **
Biometric recognition from data captured in unconstrained settings,
commonly referred to as biometric recognition in the wild, represents a
challenging and highly active area of research. The interest in this
area is fueled by the numerous application domains that deal with
unconstrained data acquisition conditions such as forensics,
surveillance, social media, consumer electronics or border control.
While the existing biometric technology has matured to a point where
excellent performance can be achieved for various tasks in ideal
laboratory-like settings, many problems related to in-the-wild scenarios
still require further research and novel ideas. The goal of this special
issue is to present the most advanced work related to biometric
recognition in unconstrained settings and introduce novel solutions to
open biometrics-related problems. Submitted papers should make a
significant contribution in terms of theoretical findings or empirical
observations, demonstrate improvements over the existing
state-of-the-art and use the most challenging datasets available.
** Topics of Interest **
We invite high-quality papers on topics related to biometric recognition
in the wild, including, but not limited to:
• Region of interest detection (alignment, landmarking) in the wild,
• Soft biometrics in the wild,
• Context-aware techniques for biometric detection and recognition,
• Novel normalization techniques,
• Multi-modal biometrics in the wild,
• Biometric recognition in the wild,
• Biometrics from facial behavior (e.g., eye movement, facial
expressions, micro-expressions),
• Biometrics based on facial dynamics,
• Novel databases and performance benchmarks,
• Ethical issues, privacy protection and de-identification,
• Spoofing and countermeasures,
• Deep learning approaches for unconstrained biometric recognition,
• Related applications, especially mobile.
** Important Dates **
Submission deadline: 30 June, 2017
Notifications to authors: 31 January, 2018
Target publication date: April, 2018
** Guest Editors **
Bir Bhanu, University of California, Riverside, United States
Abdenour Hadid, University of Oulu, Finland
Qiang Ji, Rensselaer Polytechnic Institute, United States
Mark Nixon, University of Southampton, United Kingdom
Vitomir Struc, University of Ljubljana, Slovenia
** Advisory Editors **
Rama Chellappa, University of Maryland, United States
Josef Kittler, University of Surrey, United Kingdom
For more information visit:
www.journals.elsevier.com/image-and-vision-computing/call-for-papers/specia…
--
Assist. Prof. Vitomir Struc, PhD
Laboratory of Artificial Perception, Systems and Cybernetics
Faculty of Electrical Engineering
University of Ljubljana
Slovenia
Tel: +386 1 4768 839
Fax: +386 1 4768 316
URL: luks.fe.uni-lj.si/nluks/people/vitomir-struc/
Co-organizer: Workshop on Biometrics in the Wild 2017
http://luks.fe.uni-lj.si/bwild17
Program Co-chair: International Symposium on Image and Signal Processing and Analysis 2017
http://www.isispa.org/
Competition Co-chair: International Joint Conference on Biometrics 2017
http://www.ijcb2017.org/
Dear colleagues
I wonder if you are able to help me, please. I am a third-year Psychology student with the Open University and I am just about to embark on my final-year experimental project.
My area of interest for this project is the interaction between simultaneous vs. sequential presentation and neutral vs. leading phrasing on accuracy rates of rejection in target-absent line-ups. I have been searching high and low for short videos of non-violent crimes (I'm aiming for 4).
Do you have any advice about where I could locate anything like that?
I really appreciate you taking the time to read my email.
Thank you
Kind regards
Bonnie Parker
Research Administrator
Department of Physics
King's College London
Strand Building S3.07 | Strand | London | WC2R 2LS
+44 (0) 20 7848 2155
bonnie.parker@kcl.ac.uk
www.kcl.ac.uk/physics
**Please forward to anyone who might be interested**
Apologies for cross-posting
**************************************************************************
CfP eNTERFACE'17 Workshop: International Summer Workshop on Multimodal Interfaces
http://artes.ucp.pt/enterface17/
Final project proposal submission deadline (10 February)
http://artes.ucp.pt/enterface17/authors-kit/eNTERFACE17_Authors_Kit.pdf
**************************************************************************
eNTERFACE workshops aim to establish a tradition of collaborative, localized research and development work by gathering, in a single place, a team of senior project leaders in multimodal interfaces, researchers, and (undergraduate) students to work on a pre-specified list of challenges for four weeks.
Following the success of the previous eNTERFACE workshops held in Mons (Belgium, 2005), Dubrovnik (Croatia, 2006), Istanbul (Turkey, 2007), Paris (France, 2008), Genova (Italy, 2009), Amsterdam (Netherlands, 2010), Plzen (Czech Republic, 2011), Metz (France, 2012), Lisbon (Portugal, 2013), Bilbao (Spain, 2014), Mons (Belgium, 2015) and Twente (Netherlands, 2016), the Digital Creativity Centre (CCD) (http://artes.ucp.pt/ccd), Universidade Catolica Portuguesa, is pleased to host eNTERFACE'17 (http://artes.ucp.pt/enterface17), the 13th Summer Workshop on Multimodal Interfaces, to be held in Porto, Portugal, from July 3rd to 28th, 2017.
The eNTERFACE'17 committee (http://artes.ucp.pt/enterface17/Committees_eNTERFACE17.html) now invites researchers to submit project proposals, which will be evaluated by the scientific committee. All the information needed to submit a project is available on the workshop website. Proposals should contain a full description of the project's objectives, the required hardware/software, and the relevant literature.
Participants are organized in teams, attached to specific projects, working on free software. Each week will typically consist of working sessions by the teams on their respective projects, plus a tutorial given by an invited senior researcher and a presentation of the results achieved by each project group. The last week will be devoted to writing an article on the results obtained by the teams, plus a final session where all the groups will present their achievements.
Proceedings are expected to be published by CITAR Journal. CITAR Journal was recently (July 2016) accepted for inclusion in a new index of the Web of Science (WoS) Core Collection: the Emerging Sources Citation Index (ESCI), and has also been accepted for indexing by Elsevier's Scopus.
TOPICS
Although the list is not exhaustive, submitted projects can cover one or several of the following topics: Art and Technology, Affective Computing, Assistive and Rehabilitation Technologies, Assistive Technologies for Education and Social Inclusion, Augmented Reality, Conversational Embodied Agents, Human Behavior Analysis, Human Robot Interaction, Interactive Playgrounds, Innovative Musical Interfaces, Interactive Systems for Artistic Applications, Multimodal Interaction, Signal Analysis and Synthesis, Multimodal Spoken Dialog Systems, Search in Multimedia and Multilingual Documents, Smart Spaces and Environments, Social Signal Processing, Tangible and Gesture Interfaces, Teleoperation and Telerobotics, Wearable Technology, Virtual Reality
SUBMISSION PROCEDURE
The general procedure of the eNTERFACE workshop series is as follows. Researchers are invited to submit project proposals, which are evaluated by the eNTERFACE steering committee. If accepted, the projects are published, and researchers and students are invited to apply for up to 3 projects they would like to be part of. After the applicants have been notified, the project leaders can start building their teams. Final project proposals must be submitted by email (PDF) to enterface17@porto.ucp.pt.
* Instructions for the final proposal: http://artes.ucp.pt/enterface17/authors-kit/eNTERFACE17_Authors_Kit.pdf
IMPORTANT DATES
* 20 January 2017: Notification of interest for a project proposal with a summary of project goals, work-packages and deliverables (1-page)
* 10 February 2017: Submission deadline: Final project proposal
* 20 February 2017: Notification of acceptance to project leaders
* 06 March 2017: Start Call for Participation, participants can apply for projects
* 21 April 2017: Call for Participation is closed
* 28 April 2017: Teams are built, notification of acceptance to participants
* 03 – 28 July 2017: eNTERFACE’17 Workshop
Scientific Committee
* Prof. Albert Ali Salah, Bogazici University, Turkey
* Prof. Alvaro Barbosa, University of Saint Joseph, Macao, China
* Prof. Andrew Perkis, Norwegian University of Science and Technology, Norway
* Prof. Antonio Camurri, University of Genova, Italy
* Prof. Benoit Macq, Université Catholique de Louvain (UCL), Belgium
* Prof. Bruce Pennycook, University of Texas at Austin, USA
* Prof. Christophe d'Alessandro, CNRS-LIMSI, France
* Dr. Daniel Erro, Cirrus Logic, Spain
* Prof. Dirk Heylen, University of Twente, Netherlands
* Prof. Gualtiero Volpe, University of Genova, Italy
* Prof. Igor S. Pandžić, University of Zagreb, Croatia
* Prof. Inma Hernaez, University of the Basque Country, Spain
* Prof. Jean Vanderdonckt, Université Catholique de Louvain (UCL), Belgium
* Prof. Jorge C. S. Cardoso, University of Coimbra, Portugal
* Prof. Khiet Truong, University of Twente, Netherlands
* Prof. Kostas Karpouzis, National Technical University of Athens, Greece
* Prof. Ludger Brümmer, ZKM | Center for Art and Media Karlsruhe, Germany
* Prof. Luis Teixeira, Universidade Católica Portuguesa (UCP), Portugal
* Prof. Martin Kaltenbrunner, Kunstuniversität Linz, Austria
* Prof. Maureen Thomas, Cambridge University Moving Image Studio, UK
* Prof. Milos Zelezny, University of West Bohemia, Czech Republic
* Prof. Nuno Guimarães, Information Sciences, Technologies and Architecture Research Center (ISTAR-UL), Portugal
* Prof. Olivier Pietquin, University of Lille | Google DeepMind, France
* Prof. Sandra Pauletto, University of York, UK
* Prof. Stefania Serafin, Aalborg University Copenhagen, Denmark
* Prof. Thierry Dutoit, University of Mons, Belgium
* Prof. Yves Rybarczyk, New University of Lisbon, Portugal
INFRASTRUCTURE
eNTERFACE’17 will be held in the Digital Creativity Centre, located on the campus of the Universidade Catolica Portuguesa in the city of Porto, Portugal. The Digital Creativity Centre (http://artes.ucp.pt/ccd) is a centre of competence and creative excellence, with an infrastructure equipped with cutting-edge technology in the areas of Digital and Interactive Arts, Computer Music, Sound Design, Audiovisual and Cinematic Arts, and Computer Animation.
Facilities include experiment spaces and meeting rooms, as well as a Motion Capture (MoCap) Lab equipped with a Vicon Motion Capture System, and a range of digital and interactive arts equipment (a Yamaha Disklavier Grand Piano Robotic Performance System, a Notomoton Percussion Robot, two Reactable Live systems and one Reactable Media Bench, various Microsoft Kinects, Nintendo Wiimotes, LeapMotion sensors, webcams, 3D printers, and Arduino and Raspberry Pi systems).
ORGANIZATION
eNTERFACE’17 will be organized and hosted by the Digital Creativity Centre, Universidade Catolica Portuguesa - School of Arts.