New computer-based affective technologies marketed to express, regulate, and manipulate human emotions continue to increase in availability. From biometric emotion-recognition surveillance cameras alleged to fight terrorism and keep us safer by identifying persons evidencing dangerous emotions such as rage, revenge, and fear, to video games designed to allow players to experience depression for the purpose of developing empathy, to consumer strategies designed to increase market share by engaging potential consumers’ emotional responses to the marketed commodity, new information and communication technologies rely heavily on attempts to understand the contemporary “structure of feeling” (Williams, 1977). Brave new possibilities are imagined for these technologies, from the protection of our national security to the improvement of our emotional intelligence and collective psychological welfare. Consistent with the SSHRC goal of examining the roles that “emerging and/or disruptive information and communication technologies play in learning for individuals, institutions and society,” this project, led by Professor Shoshana Magnet, will examine the implications of new technologies aimed at automating emotions. Grounding my project within a current cultural context that remains preoccupied with finding technologies to help us manage our emotions, and interrogating the relationship between these technologies and systemic markers of inequality, I will investigate three institutional contexts.
These include: (1) Surveillance: one of the first uses of biometric technologies claimed to be able to identify persons who are security risks by recognizing negative emotional states; (2) Marketing: the expansion of technologies that read consumers’ emotional responses in order to improve sales; and (3) Education: the pedagogical use of children’s literature and video games to teach affective states, examined through the development of “empathy games” and “empathy literature” aimed at teaching people how to navigate traumatic emotional experiences. Although the real-world deployment of technologies aimed at analyzing emotions is often described as transparent and self-evident, I argue that their uses remain complex, ambiguous, differentially imposed, and targeted to specific audiences and are, as a result, inherently problematic. As SSHRC highlights, these technologies are being developed “at breakneck speed. In order to benefit from, integrate and adapt to these technologies effectively, we need to understand their ethical, environmental, economic, legal and social implications.” The central problematic of my project is an examination of the implications of these new emotive technologies; its specific focus is the differential implications these technologies hold for communities marked by gender, race, sexuality, class and disability. What are the implications for a surveillance society of biometric emotion technologies coded to identify anger and aggression in order to predict criminal behaviour? What are the costs, both financial and social, of keeping us safer?
Looking both at the possibilities these technologies aimed at automating emotions hold for providing us with new and richer understandings of our emotional worlds and at the ways in which they may intensify existing inequalities, I aim to examine the social consequences of our longing for new technologies skilled at deciphering the complex world of our emotions.