Call for Papers (journal): Artificial Intelligence and Exclusion on Ethnic, Religious, and Gender Grounds (the journal "Social Inclusion") [English] Abstract deadline: 15.9.22

כתובת ההודעה: https://www.hum-il.com/message/2072100/

Artificial Intelligence and Ethnic, Religious, and Gender-Based Discrimination

The use of artificial intelligence-based technologies, including biometrics and blockchain, is on the rise in many sectors. Still, these new technologies are often deployed with little regulation and weak oversight and governance mechanisms. The existing literature suggests that their deployment has been opaque, with little knowledge of who has access to the data, with whom it is shared, and who is accountable for the wrongdoings of human and automated decision-makers in the process. In many sectors, recipients are obliged to give their consent in order to receive a product or service, without knowing how their data will be used or how it will be protected.

Moreover, these new technologies have been introduced without an examination of the forms of exclusion they may produce. Like any other tool, technology is not in itself neutral. The ability to design, own, and use AI-based technologies is directly related to relationships of power. Beyond assessing individual characteristics and posing a risk to privacy rights, biometric identification can also discriminate according to group-based (gender, ethnic, religious) characteristics. For instance, preliminary research finds that 'machine bias' against gendered and racialised characteristics of individuals persists in the scanning of CVs and in assessments of the likelihood that offenders will reoffend. Even though there is growing activism around risks to data privacy, there are very few scholarly investigations of how AI-based technologies can give rise to discrimination against certain groups.

This thematic issue will explore the following questions and related topics: To what extent does the use of new technologies result in discrimination based on gender, ethnic, or religious background? What governance mechanisms are newly emerging to mitigate such forms of discrimination? How is accountability ensured at the design and implementation stages? What is the role of civil society and the courts in challenging 'machine bias'? This thematic issue invites articles with a critical lens and empirically novel findings across various spheres, including but not limited to courts, public security, and border management.


Posted by
The RAMA editorial team – the Humanities and Social Sciences Network, info@hum-il.com *** RAMA is not affiliated with this announcement and has no further information about it. For any question regarding the announcement and for contact details, please follow the link to the relevant website.
