<div dir="ltr"><div class="gmail_default" style="">In recent times digital biometrics is of immense importance in all spheres of life. Mostly the advances are in the direction of 3D biometrics and the face is the body part that is used<br>mostly. Though face biometrics is one of the most used forms after fingerprint right now, it is also open to many kinds of presentation attack instruments. Presentation attack instruments are mainly videos, photographs or masks and many times expert impersonators with prosthetic makeup. The 3D face biometrics is sometimes strengthened with the ear, and in many cases, the ear alone is sufficient for the recognition of individuals. The ear is agnostic of expressions and thus easy to recognize but forging a plastic-based ear is also a lot easier than face. 3D ear recognition mitigates the effect to a consider-<br>able extent. 3D vascular biometrics and palm-based biometrics have recently gained steam. Thus in many forms of human biometrics, 3D information is crucial. But the need for sophisticated and expensive hardware components works as a deterrent to its widespread adoption. To record and promote this area of this research we plan to host this special session. We invite practitioners, researchers, and engineers from biometrics, signal processing, computer vision, and machine learning fields to contribute their expertise to uplift the state-of-the-art.<br><font face="arial, sans-serif" style="" color="#000000">Topics of interest include but are not limited to<br><br>• 3D shape capturing and reconstruction for the human body or body parts from monocular vision<br>• 3D vasculature and palm-based biometrics from monocular vision<br>• 3D ear biometrics from monocular vision<br></font><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"><span style="font-size:small"><font face="arial, sans-serif" style="" color="#000000">• 3D air signature from monocular vision</font></span></span></p><p dir="ltr" style="line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">* Passive 3D Gait biometrics-based recognition from monocular vision<br>• 3D face by the monocular vision for biometric application<br>• Emotion and artifact agnostic 3D biometrics by monocular vision<br>• Multimodal sensors for real-time 3D shape capturing<br>• 3D face estimation with high occlusion and monocular camera<br>• 3D information capture under low lighting conditions from the monocular camera<br>• 3D biometrics from short videos<br>• Advancement in inexpensive single-shot sensor technology for 3D biometrics capture<span style="font-size:small"><font face="arial, sans-serif" style="" color="#000000"><br></font></span></span></p><p dir="ltr" style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"><br></span></p><p dir="ltr" 
style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">Submission Guidelines:</span></p><p dir="ltr" style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> Submit your papers at: <a href="https://cmt3.research.microsoft.com/IJCB2023" target="_blank">https://cmt3.research.microsoft.com/IJCB2023</a> in a special session track.</span></p><p dir="ltr" style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> The paper presented at this session will be published as part of the IJCB2023 and should, therefore, follow the same guideline as the main conference.</span></p><p dir="ltr" style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> Page limit: A paper can be up to 8 pages including figures and tables, plus additional pages for references only. </span></p><p dir="ltr" style="color:rgb(61,133,198);font-family:georgia,serif;line-height:1.38;margin-top:0pt;margin-bottom:0pt"><span style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap"> Papers will be double-blind peer-reviewed by at least three reviewers. Please remove author names, affiliations, email addresses, etc. from the paper. 
Remove personal acknowledgements.

Important Dates:

Full Paper Submission: July 17, 2023, 23:59:59 PDT
Acceptance Notice: August 17, 2023, 23:59:59 PDT
Camera-Ready Paper: August 21, 2023, 23:59:59 PDT

Organizing Committee:

Abhijit Das, BITS Pilani, India
Aritra Mukherjee, BITS Pilani, India
Xiangyu Zhu, CAS, China
style="font-size:11pt;font-family:Arial;color:rgb(0,0,0);background-color:transparent;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;vertical-align:baseline;white-space:pre-wrap">
</span></p></div><img src="https://wordpress.stir.ac.uk/files/2019/03/1pixel.png" width="0" height="0" alt="Web Bug from https://mailtrack.io/trace/mail/d7dc9bc3f4aa9236723dcf482d21a37b833900bb.png?u=339165" /></div>