<div dir="ltr"><div class="gmail_default" style="font-family:georgia,serif;color:#3d85c6"><div class="gmail_default" style="color:rgb(80,0,80)"><b style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Final Call for Papers (</b><b style="font-family:Arial,Helvetica,sans-serif"><font color="#ff0000">no more extension</font></b><b style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">)</b></div><div class="gmail_default"><b><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"></b><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">I</span><u style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span class="gmail_default" style="font-family:georgia,serif;color:rgb(61,133,198)"></span>CPR 2024:  2nd Workshop on- Call for Papers<span class="gmail_default" style="font-family:georgia,serif;color:rgb(61,133,198)"> </span>Fairness in Biometric Systems</u><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Biometric systems have spread worldwide and therefore have been increasingly involved in critical decision-making processes, including finances, public security, and forensics.  Despite their increasing impact on everybody’s daily life, many biometric solutions perform highly divergent for different groups of individuals, as previous works have shown. Consequently, the recognition performance of such systems is significantly impacted by demographic and non-demographic attributes of users. 
This brings to the fore the discriminatory and unfair treatment of the users of such systems.</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">At the same time, several legal provisions, such as Article 7 of the Universal Declaration of Human Rights and Article 71 of the General Data Protection Regulation (GDPR), have highlighted the importance of the right to non-discrimination. These efforts underline the pressing need to analyze and mitigate equability concerns in biometric systems. Given the increasing impact on everybody’s daily life, as well as the associated social interest, research on fairness in biometric solutions is urgently needed.</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Topics of interest include:</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">• Developing and analyzing biometric datasets</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">• Proposing metrics related to equability in biometrics</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">• Demographic and non-demographic factors in biometric systems</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">• Investigating and mitigating equability concerns in biometric algorithms, including:</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Identity verification and identification</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Soft-biometric attribute estimation</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Presentation attack detection</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Template protection</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Biometric image generation</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">o Quality assessment</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><b>Important Dates</b></span></div><div class="gmail_default">--------------------------------<br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Full Paper Submission: September 8, 2024</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Acceptance Notification: September 20,
2024</span><br style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Camera-Ready Paper: September 24, 2024</span><br clear="all"></div><div class="gmail_default"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif">Workshop: December 1, 2024</span></div></div></div>