How do “human” prejudices reemerge in algorithmic cultures allegedly devised to be blind to them? To answer this question, this book investigates a fundamental axiom in computer science: pattern discrimination. Because patterns impose identity on input data in order to filter, that is, to discriminate, signals from noise, they become a highly political issue. Algorithmic identity politics reinstate old forms of social segregation, such as class, race, and gender, through defaults and paradigmatic assumptions about the homophilic nature of connection.