Call for contributions: “ECREA Book: AI Infrastructures and Sustainability”, Muenster University, February 2024

for an Open Access publication with Palgrave as part of the ECREA Open Access Book Series

Proposal for an edited book on

AI Infrastructures and Sustainability

Deadline: 29.02.2024
Editors: Anne Mollen, Sigrid Kannengießer, Fieke Jansen, Julia Velkova

This call for contributions follows an invitation by the ECREA Open Access Book Series Committee to develop a proposal for a volume on “AI Infrastructures and Sustainability”. The Committee has invited three editorial teams overall to develop full proposals – one of which will be selected for Open Access publication with Palgrave as part of the ECREA book series.

The proposed volume assembles research in media and communications on AI infrastructures in relation to questions of sustainability. We invite critical theoretical, historical, methodological, and empirical reflections on the “sustainability” of technologies that go under the label of “AI”. Contributions could include analyses of how sustainability, infrastructures, or other related notions can be conceptualized in relation to technologies of automation – deconstructing how AI-related narratives, imaginaries, norms, practices, etc., with their ensuing implications, manifest in infrastructures of automated communication. We also welcome authors who introduce new concepts that contribute to creating more affective, transformative, and theoretically nuanced narratives and understandings of how to make liveable relations with AI. Considering the necessity of a great socio-ecological transformation, the proposed volume also encourages reflections on transformative perspectives in media and communication research, addressing media and communication’s role in shaping and transforming societies that are increasingly reliant on technologies of automation.

Submission details and expected time frame for publication

We are seeking abstracts (250-300 words, excluding references) to be submitted by 29 February 2024, addressing – but not limited to – one or more of the following topics:

  • Critical theoretical, conceptual, empirical, and methodological work on “AI”, infrastructures and sustainability in media and communication research
  • Critical discussions of sustainability, sustainability narratives and normative frameworks in relation to AI infrastructures
  • Imaginaries of sustainability and AI (their construction as well as resistance to it)
  • Human rights and digital justice implications of AI
  • Extractivism and AI (labour, data, resources etc.), including AI-related protest and movements
  • Intersectional perspectives on AI and sustainability
  • Resource consumption and environmental impacts of AI
  • Intersections of AI with local and energy politics
  • Market concentration, political economy, geopolitical perspectives and global distributional (in)justices in relation to AI infrastructures
  • Bias and discrimination in AI infrastructures; representation and AI
  • Transformative and transdisciplinary perspectives on AI and sustainability

Media and communication research can contribute nuanced, critical, and normative analyses of the socio-technical relations that make and sustain AI infrastructures. This perspective is urgently needed in the discussion of AI and sustainability, which must be acknowledged as more than a technical concern to which technical solutions can be found. A comprehensive media and communication perspective can instead assess the manifestations, contestations, and historical continuities in the emergence of AI infrastructures while reflecting on matters of sustainability. With the proposed volume, we call on scholars to orient discussions of automation and human–machine interaction in media and communications towards an interrogation – through the lens of sustainability – of the infrastructures, practices, and more-than-human relations that constitute the operations of technologies that go under the label of “AI”.
