Cybernetics — Adaptive Systems

  • Emerged in the 1940s

  • Norbert Wiener: »Cybernetics: or, Control and Communication in the Animal and the Machine« (1948)

  • Was interdisciplinary from its beginning. Wiener tried to connect “digital electronic computing (then still novel), information theory, early work on neural networks, the theory of servomechanisms and feedback systems, and work in psychology, psychiatry, decision theory, and the social sciences”. [Pic10](3)

Control — Subjugation

  • »Control« is a key term of cybernetics

  • Critics fear increasing rationalization and subjugation through the imposition of control on people, animals, society, the environment, …

  • According to Felix Stalder, the terms »control« and »communication« were used synonymously. This still affects our information technology and the internet today. [Sta16](81 ff.)

Ontology of Modern Science

  • The world consists of fixed and specifiable entities

  • We can achieve (produce) knowledge through science

  • This is the mainstream ontology in educational institutions

Ontology of Posthumanism

  • “Cybernetics […] stages for us a nonmodern ontology in which people and things are not so different after all.” [Pic10](18)

  • Bateson argues in »Conscious purpose versus nature« that the environment is too complex and thus we can’t fully predict the consequences of our actions. [Pic15](647)

  • → Less anthropocentric worldview

Ontology of Unknowability

  • According to Stafford Beer, cybernetics is the “science of exceedingly complex systems that modern science can never quite grasp”. [Pic10](23 f.)

  • Andrew Pickering calls this an “ontology of unknowability”. [Pic10](23 f.)

  • → Cybernetics is about dealing with unknowability and uncertainty through the adaptive brain (adaptive system)

Modes of Control — Static

  • Targeted change of the behaviour of a system

  • The procedure remains unchanged (no feedback)

  • Example: simple traffic light
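
  A minimal sketch of static (open-loop) control, assuming a fixed-cycle traffic light: the phase schedule is hard-coded and never changes, regardless of what happens in the environment. The phase names and durations are illustrative assumptions.

  ```python
  # Static control: a fixed schedule with no feedback from the environment.
  # Phase durations are assumptions for illustration only.
  FIXED_CYCLE = [("green", 30), ("yellow", 5), ("red", 35)]  # (phase, seconds)

  def static_light(cycles):
      """Yield (phase, duration) pairs; the procedure ignores all input."""
      for _ in range(cycles):
          for phase, duration in FIXED_CYCLE:
              yield phase, duration

  phases = list(static_light(cycles=1))
  # The procedure is identical on every cycle: there is no feedback loop.
  ```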

Modes of Control — Feedback

  • Targeted change of the behaviour of a system

  • The procedure changes during operation through feedback

  • Example: AI traffic light (see the chapter »Technical Activity«)
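
  A hedged sketch of feedback control: a hypothetical adaptive traffic light that lengthens its green phase as the measured queue grows. The sensor reading and the update rule are illustrative assumptions, not a description of any real AI traffic light.

  ```python
  # Feedback control: the procedure changes during operation because the
  # controller's output depends on what it observes in the environment.
  # Parameters (base, per_car, max_green) are illustrative assumptions.
  def adaptive_green(queue_length, base=20, per_car=2, max_green=60):
      """Return a green-phase duration (seconds) adjusted by observed demand."""
      return min(base + per_car * queue_length, max_green)

  assert adaptive_green(0) == 20    # light traffic: short green phase
  assert adaptive_green(10) == 40   # more cars: longer green phase
  assert adaptive_green(100) == 60  # capped at the maximum
  ```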

Modes of Control — Equilibrium

  • The system gains an equilibrium with its environment

  • The target comes from inside the system, not from outside
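
  An illustrative sketch in the spirit of Ashby’s homeostat: the system randomly reconfigures itself until an internal “essential variable” returns to bounds that the system itself defines. The bounds, step size, and update rule are assumptions for illustration.

  ```python
  # Equilibrium-seeking control: the target (the bounds on the essential
  # variable) is internal to the system, not imposed from outside.
  import random

  def settle(essential_variable, lower=-1.0, upper=1.0, seed=0, max_steps=10000):
      """Randomly reconfigure until the essential variable is within bounds."""
      rng = random.Random(seed)
      for step in range(max_steps):
          if lower <= essential_variable <= upper:
              return essential_variable, step
          essential_variable += rng.uniform(-1, 1)  # random reconfiguration
      raise RuntimeError("system did not reach equilibrium")

  value, steps = settle(3.0)
  # On return, the essential variable is back inside the internal bounds.
  ```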

Control — Adaptation & Performance

  • In British cybernetics, »control« is not about disciplining or subjugation

  • We can’t control the complexity of the world; control is therefore about performatively dealing with the unknown

Adaptive Systems

  • “[…] the cybernetic brain was not representational but performative, as I shall say, and its role in performance was adaptation.” [Pic10](6)

  • “Bridges and buildings, lathes and power presses, cars, televisions, computers, are all designed to be indifferent to their environment, to withstand fluctuations, not to adapt to them.” [State of 2010.][Pic10](7)

  • Cybernetic systems interact with their environment and adapt to it.

Gordon Pask: Musicolor

  • Machine takes the sound of a musician as input

  • Analyzes it

  • This controls the lights (output)

  • The input/output relation is not static

  • The machine has an internal state that depends on past input

  • If there’s not enough variety in the musician’s play, the lights turn off

  • The musician has to adapt

  • According to Pask, adaptation is integral to our being. [Pic10](332)
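
  A speculative sketch of the Musicolor dynamic, assuming “variety” can be proxied by the number of distinct notes in a recent window. When the performance becomes too repetitive, the machine switches the lights off, pushing the musician to adapt. The window size, variety threshold, and boredom rule are illustrative assumptions, not Pask’s actual circuitry.

  ```python
  # Musicolor-style loop: input (notes) feeds an internal state (recent
  # history); the output (lights) depends on the variety of past input.
  from collections import deque

  class MusicolorSketch:
      def __init__(self, window=8, min_variety=3):
          self.recent = deque(maxlen=window)  # internal state: past input
          self.min_variety = min_variety

      def step(self, note):
          """Take one note as input; return whether the lights stay on."""
          self.recent.append(note)
          variety = len(set(self.recent))
          return variety >= self.min_variety  # lights off when play is monotonous

  m = MusicolorSketch()
  varied = [m.step(n) for n in ["C", "E", "G", "B", "D"]]  # varied playing
  bored = [m.step("C") for _ in range(8)]                  # same note repeated
  # Once the window fills with repetition, the lights go dark and the
  # musician must adapt to re-engage the machine.
  ```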

Kerstin Ergenzinger: Navigating Noise

See also her work Room Tunings, “in which a room is tuned to its own modes by measuring its acoustic resonance peaks.”



Andrew Pickering. The Cybernetic Brain: Sketches of Another Future. The University of Chicago Press, Chicago, 2010.

Andrew Pickering. Cybernetics. 2015. URL:

Felix Stalder. Kultur der Digitalität. Suhrkamp, Berlin, 2016.