KRAKEN: an Artificial Intelligence Coded by an Octopus

Goldsmiths Visual Cultures department hosts Etic Lab / 0rphan Drift talk about collaborative octopus artificial intelligence (AI) project.

KRAKEN? was a talk by Etic Lab and 0rphan Drift at the Department of Visual Cultures, Goldsmiths, University of London, on 24th October 2019.

imagining an artificial intelligence communicating with an octopus

Maggie Roberts, film still

Artificial Intelligence Coded by an Octopus

Etic Lab and 0rphan Drift ask:

What does it mean to communicate with an Alien Intelligence and how might we try to do it?

How to address human exceptionalism’s limited understanding of ourselves in relation to other life forms?

What alternative forms of experience, perception, ethics and entanglement are revealed in the process of stripping away the human tendency to deploy anthropomorphism, to ‘see’ another form of intelligent life?

What is the potential for artificial intelligence to process uncertainty and changing, dynamic environments through engagement with a distributed consciousness, resisting its evolution as a surveilling and predictive modeling tool?

Maggie Mer Roberts of artist collective 0rphan Drift is introduced by design engineer Dr Ramon Amaro of Goldsmiths Visual Cultures department. She outlines 0rphan Drift’s approach to the project of producing an artificial intelligence coded by an octopus, in collaboration with Aberystwyth University’s Marine Biology department. Etic Lab’s Stephanie Moran discusses the challenges of communicating with octopuses and explains the kind of machine learning the project will adopt. Finally, Dr Kevin Hogan of Etic Lab presents the development approach and elaborates on our ethical and philosophical position.


As part of the artist collective 0rphan Drift, I have long focused on ‘machine vision’: the co-evolution of human perception and artificial intelligence, the sensory experience expanding with digital imaging tools, and collaboration with the unknown. We have always been interested in kinds of perception other than the human.

Scenarios once science-fictional are now defining and expanding the present and what it means to be human, or nonhuman. Ecological crisis highlights the urgency of engaging with otherness, organic and inorganic, across difficult-to-image spatio-temporal dimensions. And the co-option of algorithmic systems to surveillance and marketing agendas is shaping reductive, unsustainable and unethical futures. The work we are thinking about in this Etic Lab – 0rphan Drift collaboration marks the beginning of an experiment to facilitate a different model of artificial intelligence by engaging it in communication with an octopus, and to somewhat reprogram the humans involved in the process. The octopus is an otherness so strange and mesmerizing that it is a challenge to imagine into its sentience at all.

Cross-disciplinary research over the last couple of years has been catalysed by two comments: philosopher Vilém Flusser stating in Vampyroteuthis Infernalis that an “octopoid revolution in consciousness is needed”; and design theorist Betti Marenko’s closing remark that “the other side of the digital is the octopus” in her text FutureCrafting: A Speculative Method for an Imaginative AI [Artificial Intelligence].

Watch video here. Script here.


I consider what it might mean to communicate with an alien intelligence and how we might try to do it. I discuss astrobiology, a form of future- and outer-space-oriented speculative evolutionary biology that already considers this question. Computer scientist Michael Arbib’s octoplus was a thought experiment in how language could have evolved differently, given a different set of sensory and cognitive apparatus, based on the alien entity of the octopus. Arbib (2011) asks,

what might be some of the properties of a language evolved from the basis of chromatophores and body texture rather than visual control of the hand?

I then discuss biosemiotic approaches, followed by a brief overview of machine learning techniques, neural nets and how they work, and a more detailed description of how we intend to apply reinforcement learning.
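Since the talk names reinforcement learning as the technique the project intends to apply, a minimal, purely illustrative sketch may help: tabular Q-learning, the simplest form of the family. Nothing below comes from the actual project — one could imagine states standing in for display patterns shown to the octopus, actions for the next pattern, and reward for some measured response, but those mappings are hypothetical.

```python
import random

class QLearner:
    """Tabular Q-learning agent (illustrative sketch, not the project's system)."""
    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        # Q-table: expected return for each (state, action) pair
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.n_actions = n_actions

    def act(self, state):
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore
        if random.random() < self.epsilon:
            return random.randrange(self.n_actions)
        row = self.q[state]
        return row.index(max(row))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update toward the temporal-difference target
        td_target = reward + self.gamma * max(self.q[next_state])
        self.q[state][action] += self.alpha * (td_target - self.q[state][action])


# Toy loop: the agent is rewarded only for action 0 in state 0.
random.seed(0)
agent = QLearner(n_states=2, n_actions=2)
state = 0
for _ in range(500):
    action = agent.act(state)
    reward = 1.0 if (state == 0 and action == 0) else 0.0
    next_state = (state + 1) % 2
    agent.update(state, action, reward, next_state)
    state = next_state
```

After training, the rewarded action dominates the Q-table for state 0 — the core mechanism by which an agent's behaviour is shaped by reward signals alone, without any explicit model of what the rewarding entity is.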

I continue with some implications of embodied cognition for an octopus-coded artificial intelligence. Humans are physically, bodily surface-bound, with free movement on the horizontal plane but limited movement vertically, constraints that are reflected in our spatial semantics. By this logic, if we imagine a species whose lifeworld is characterized by weightlessness, concepts of up and down are much less relevant for them. Artificial intelligence does not possess a human-like set of sensory apparatus; it lacks defined morphology, pharmacology and embodiment, but can be hard-coded with teleological purpose; like humans, this can give it the emergent property of appearing to have a conscious purpose. In a context where machine learning algorithms are now actively changing human culture, their systemic rules shaping users’ behaviour, what do we want from an artificial intelligence, and how can we consider the ethics of technology beyond the human?

What we ultimately want, rather than constructing an interpretation of a digital octopus in a digital environment, is an artificial intelligence based on real-world interactions. We want a real octopus to programme the artificial intelligence. We are interested in decentring anthropocentric narratives; the nonhuman intelligence of AI may both help and hinder with this.

Watch me mumbling the overlong and dense paper into a microphone here. Alternatively, read the full script with presentation images here.


Kevin speaks about how we intend to use artificial intelligence to communicate with an octopus, and how an octopus might program a very different kind of artificial intelligence. He talks about the problems inherent to the idea of communicating with another life form, and human-centric ideas about intelligence, other definitions, and ideas about how intelligence arises. He discusses octopus morphology and perceptual systems, embodied and situated intelligence and cognition. He then describes our project in more detail, our initial system diagram, how we intend to work with an octopus in a smart tank, and how we will provide it with the capacity to program an artificial intelligence:

Etic Lab has developed, and is developing, algorithms capable of learning from the data produced by the system described above, using the output to drive either a visual display or an effector output such as a robotic arm. In summary, sensing capabilities are localised within the tank, combining eye tracking, detection of changes in skin colour, and quantification of the octopus’s movement interactions with an intelligent object/display. A further software stack then learns from these data and uses them to drive a new input for the octopus.
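The closed loop described above — localised sensing in the tank feeding a learning stack whose output drives the next stimulus — can be sketched in outline. Every class, field and rule here is a hypothetical placeholder for illustration only; the actual system's components and names are not public in this text.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One sensing tick from the tank (illustrative fields, not the real schema)."""
    gaze: tuple          # eye-tracking estimate, e.g. (x, y) on the display
    skin_delta: float    # quantified change in skin colour/texture
    movement: float      # quantified movement interaction with the object/display

class LearningStack:
    """Placeholder for the software stack that learns from tank data
    and drives a new input for the octopus."""
    def __init__(self):
        self.history = []

    def step(self, obs: Observation) -> dict:
        self.history.append(obs)
        # Toy rule standing in for a learned policy: the more activity
        # sensed, the higher-contrast the next displayed stimulus.
        activity = obs.skin_delta + obs.movement
        return {"display_contrast": min(1.0, activity)}

# One iteration of the loop: sense -> learn -> drive a new stimulus.
stack = LearningStack()
stimulus = stack.step(Observation(gaze=(0.4, 0.6), skin_delta=0.3, movement=0.5))
```

The point of the sketch is the shape of the loop, not the rule: whatever the stack learns, its output closes the circuit back into the octopus's environment, which is what lets the animal's responses shape the system over time.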

Watch Kevin’s keynote here.