Automated Discrimination?
How to Prevent Bias in AI and Algorithmic Decision-Making
WATCH THE EVENT LIVE
Artificial intelligence is playing an increasingly important yet invisible role in our everyday lives. Many sectors, including healthcare, education, financial services, labour markets and advertising, now rely on automated decision-making processes that draw on large amounts of data. While automated decision-making and other types of AI offer us benefits and seem to make our lives easier, they can be subject to bias. Discrimination against women, people of colour and poor people is making its way into algorithmic decision-making, which may exacerbate existing inequalities.
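To make the problem concrete, the short sketch below is purely illustrative (it is not material from the hearing and uses only synthetic data with hypothetical feature names): a simple model is trained on historically biased decisions, and even though the protected attribute is excluded from the features, the model reproduces the disparity through a correlated proxy such as a postcode.

```python
# Illustrative sketch (synthetic data, hypothetical features): how historical bias
# can persist in an automated decision system even when the protected attribute
# itself is dropped from the model's inputs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: group 0 (majority) / group 1 (historically disadvantaged).
group = rng.integers(0, 2, size=n)

# "Postcode" acts as a proxy: it is correlated with group membership.
postcode = group + rng.normal(0, 0.5, size=n)

# Income is distributed similarly across both groups.
income = rng.normal(50, 10, size=n)

# Historical decisions (the training labels) were biased against group 1.
historical_approval = (income + rng.normal(0, 5, size=n) - 8 * group) > 48

# Train on income and postcode only -- the protected attribute is excluded.
X = np.column_stack([income, postcode])
model = LogisticRegression().fit(X, historical_approval)
pred = model.predict(X)

# Measure the disparity the model reproduces via the proxy feature.
rate_0 = pred[group == 0].mean()
rate_1 = pred[group == 1].mean()
print(f"approval rate, group 0: {rate_0:.2f}")
print(f"approval rate, group 1: {rate_1:.2f}")
print(f"disparate impact ratio (group 1 / group 0): {rate_1 / rate_0:.2f}")
```

Running the sketch prints markedly different approval rates for the two groups: the kind of disparate impact the panels below will examine.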
On behalf of the Digital Working Group of the Greens/EFA, MEPs Alexandra Geese, Patrick Breyer, Marcel Kolaja, Kim van Sparrentak, Sergey Lagodinsky and Damian Boeselager would like to invite you to a hearing to discuss the societal impact of automated decision-making processes with a special focus on discrimination.
The hashtag for the event will be #AIdiscrimination
Moderator: Jennifer Baker, European journalist specialising in EU policy and legislation in the technology sector.
15:00 – 15:15: Welcome and introduction by Alexandra Geese MEP
15:15 – 16:30: PANEL I – Exposing bias and societal imbalances in algorithmic systems
How do bias and societal imbalances arise, and how are they perpetuated, through algorithmic systems? Where do problems occur when automated decision-making is used by public authorities and by private entities?
Chair: Alexandra Geese MEP; @alexandra_geese
Moderation: Jennifer Baker; @BrusselsGeek
Speakers:
- Christiaan van Veen, Director of the Digital Welfare State and Human Rights project at New York University School of Law and Special Advisor on new technologies and human rights to the UN Special Rapporteur on Extreme Poverty and Human Rights; @cpjvanveen
- Dr. F.S. (Seda) Gürses, Assistant Professor in the Department of Multi-Actor Systems, Faculty of Technology, Policy and Management, TU Delft, and affiliate of the COSIC Group in the Department of Electrical Engineering (ESAT), KU Leuven; @sedyst
- Nakeema Stefflbauer, Founder and Program Director of FrauenLoop gUG, a Berlin non-profit organisation that trains resident, refugee and immigrant women for careers in the technology field; @DocStefflbauer
- Prof. Florian Gallwitz, Professor of computer science at the Nuremberg Institute of Technology (TH Nürnberg Georg-Simon-Ohm), specialising in pattern recognition, computer vision, speech recognition, deep learning and robotics; @FlorianGallwitz
- Sanne Blauw, Dutch journalist focusing on Artificial Intelligence for the news platform "The Correspondent"; @sanneblauw
Respondent: Gabriele Mazzini, Policy Officer at DG CNECT, Artificial Intelligence and Digital Industry Unit of the European Commission
16:30 – 16:45: COFFEE BREAK
16:45 – 18:00: PANEL II – Tackling the problems
What solutions can tackle bias and societal imbalances in algorithmic systems? Which legislation needs to be updated, and which new rules do we need? What are possible technological responses?
Chair: Patrick Breyer MEP; @echo_pbreyer
Moderation: Jennifer Baker
Speakers:
- Katharine Jarmul, Founder at Kjamistan, machine learning engineer and privacy activist; @kjam
- Fanny Hidvegi, Member of the EU's High-Level Expert Group on Artificial Intelligence and Policy Manager of Access Now Europe; @infoFannny
- Frederike Kaltheuner, technology policy expert and former Director of Programme at the global civil liberties organisation Privacy International, 2019/2020 Mozilla Tech Policy fellow; @F_Kaltheuner
Respondent: Gabriele Mazzini, Policy Officer at DG CNECT, Artificial Intelligence and Digital Industry Unit of the European Commission