Authors: Clemens Apprich; Wendy Hui Kyong Chun; Florian Cramer; Hito Steyerl
Date available: 2019-03-25
Date issued: 2018
ISBN: 9783957961457
URI: https://mediarep.org/handle/doc/4407

Abstract: Algorithmic identity politics reinstate old forms of social segregation: in a digital world, identity politics is pattern discrimination. It is by recognizing patterns in input data that artificial intelligence algorithms create bias and practice racial exclusions, thereby inscribing power relations into media. How can we filter information out of data without reinserting racist, sexist, and classist beliefs?

Contents:
<ul>
  <li><a href='https://doi.org/10.25969/mediarep/12352'>Clemens Apprich: <i>Introduction</i></a></li>
  <li><a href='https://doi.org/10.25969/mediarep/12348'>Hito Steyerl: <i>A Sea of Data: Pattern Recognition and Corporate Animism (Forked Version)</i></a></li>
  <li><a href='https://doi.org/10.25969/mediarep/12349'>Florian Cramer: <i>Crapularity Hermeneutics: Interpretation as the Blind Spot of Analytics, Artificial Intelligence, and Other Algorithmic Producers of the Postapocalyptic Present</i></a></li>
  <li><a href='https://doi.org/10.25969/mediarep/12350'>Wendy Hui Kyong Chun: <i>Queerying Homophily</i></a></li>
  <li><a href='https://doi.org/10.25969/mediarep/12351'>Clemens Apprich: <i>Data Paranoia: How to Make Sense of Pattern Discrimination</i></a></li>
</ul>

Language: eng
License: Creative Commons Attribution Non Commercial 4.0 Generic
Keywords: critical algorithm studies; artificial intelligence; Algorithmus [algorithm]; algorithm; Künstliche Intelligenz [artificial intelligence]
DDC: 384
Title: Pattern Discrimination
DOI: 10.14619/1457
DOI (mediarep): 10.25969/mediarep/3655
URI: https://mediarep.org/handle/doc/16078