Reality#3: Another one bites the dust – Diffusion & Emergence

Reading Time: 6 minutes

This is the third part in the Reality# series, which adds to the conversation about David Chalmers’ book Reality+.

(…) for dust thou art, and unto dust shalt thou return.

(Genesis 3:19)

Permutation +

Imagine waking up and discovering that your consciousness has been digitized, allowing you to live forever in a virtual world that defies the laws of physics and time. This is the core idea from Permutation City by Greg Egan. The novel explores the philosophical and ethical implications of artificial life and consciousness, thrusting the reader into a future where the line between the real and the virtual blurs, challenging our understanding of existence and identity.

A pivotal aspect of the book is the Dust Theory, which suggests that consciousness can arise from any random collection of data, given the correct interpretation. This theory expands the book’s exploration of reality, suggesting that our understanding of existence might be far more flexible and subjective than we realize.

The novel’s climax involves the creation of Permutation City, a virtual world that operates under its own set of rules, independent of the outside world. This creation represents the ultimate escape from reality, offering immortality and infinite possibilities for those who choose to live as Copies. However, it also presents ethical dilemmas about the value of such an existence and the consequences of abandoning the physical world.

In “Reality+: Virtual Worlds and the Problems of Philosophy,” philosopher David Chalmers employs the Dust Theory, a concept originally popularized by Greg Egan’s Permutation City, to underpin his argument for virtual realism. Chalmers’s use of the Dust Theory serves as a bridge connecting complex philosophical inquiries about consciousness, reality, and virtual existence. Imagine a scenario where every speck of dust in the universe, through its random arrangement, holds the potential to mirror our consciousness and reality.

Chalmers posits that virtual worlds created by computers are genuine realities, leveraging the Dust Theory to argue that consciousness does not require a physical substrate in the traditional sense. Instead, it suggests that patterns of information, irrespective of their physical form, can give rise to conscious experiences. This theory becomes a cornerstone for virtual realism, asserting that our experiences in virtual environments are as authentic as those in the physical world.

Diffusion Models and Smart Dust

The concept of smart dust is explored in various science fiction stories, academic papers, and speculative technology discussions. One notable science fiction story that delves into the idea of smart dust is “The Diamond Age” by Neal Stephenson. While not exclusively centered around smart dust, the novel features advanced nanotechnology in a future world, where nanoscale machines and devices permeate society. Smart dust, in this context, would be a subset of the nanotechnological wonders depicted in the book, functioning as tiny, networked sensors and computers that can interact with the physical and digital world in complex ways.

Another relevant work is “Queen of Angels” by Greg Bear, which, along with its sequels, explores advanced technologies including nanotechnology and their societal impacts. Although not explicitly called “smart dust,” the technologies in Bear’s universe can be seen as precursors or analogs of the smart dust concept. These examples illustrate how smart dust, as a concept, crosses the boundary between imaginative fiction and emerging technology, offering a rich field for exploration in both narrative and practical innovation.

Here we have a very convincing example of how life imitates art: scientific knowledge transforms religious (prescientific) intuition into operational technology.

Diffusion models in the context of AI, particularly in multimodal models like Sora or Stability AI’s video models, refer to a type of generative model that learns to create or predict data (such as images, text, or videos) by gradually refining random noise into structured output. These models start with a form of chaos (random noise) and apply learned patterns to produce coherent, detailed results through a process of iterative refinement.
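
To make this iterative refinement concrete, here is a deliberately toy sketch in Python. The “denoiser” is a hypothetical stand-in: instead of a trained neural network predicting and subtracting noise, it simply blends the sample toward a fixed target pattern at each step. That is enough to show the key idea of structure emerging from noise through many small updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "learned" structure; a real diffusion model would instead
# use a trained neural network to predict noise at each step.
target = np.sin(np.linspace(0, 2 * np.pi, 64))

x = rng.normal(size=64)  # step 0: pure Gaussian noise, no structure
error_start = float(np.linalg.norm(x - target))

for step in range(50):
    # One toy refinement step: nudge the sample toward the structure.
    x = 0.9 * x + 0.1 * target

error_end = float(np.linalg.norm(x - target))
# After 50 steps the residual noise has shrunk by a factor of 0.9**50 (about 0.005).
```

The point of the sketch is only the shape of the process: chaos in, many small learned corrections, coherent structure out.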

Smart dust represents a future where sensing and computing are as pervasive and granular as dust particles in the air. Similarly, diffusion models represent a granular and ubiquitous approach to generating or transforming multimodal data, where complex outputs are built up from the most basic and chaotic inputs (random noise).

Just as smart dust particles collect data about their environment and iteratively refine their responses or actions based on continuous feedback, diffusion models iteratively refine their output from noise to a structured and coherent form based on learned patterns and data. Both processes involve a transformation from a less ordered state to a more ordered and meaningful one.

Quantum Level achieved

Expanding on the analogy between the quantum world and diffusion models in AI, we delve into the fascinating contrast between the inherent noise and apparent disorder at the quantum level and the emergent order and structure at the macroscopic level, paralleled by the denoising process in diffusion models.

At the quantum level, particles exist in states of superposition, where they can simultaneously occupy multiple states until measured. This fundamental characteristic introduces a level of uncertainty and noise: the exact state of a quantum particle is indeterminate and probabilistic until observation collapses it into a single outcome. In this picture, the quantum realm is dominated by entropy, with systems tending toward disorder and uncertainty in the absence of observation or interaction.

In contrast, at the macroscopic scale, the world appears ordered and deterministic. The chaotic and probabilistic nature of quantum mechanics gives way to the classical physics that governs our daily experiences. This emergent order, arising from the complex interactions of countless particles, follows predictable laws and patterns, allowing for the structured reality we observe and interact with.

Diffusion models in AI start with a random noise distribution and, through a process of iterative refinement and denoising, gradually construct detailed and coherent outputs. Initially, the model’s output resembles the quantum level’s incoherence—chaotic and without discernible structure. Through successive layers of transformation, guided by learned patterns and data, the model reduces the entropy, organizing the noise into structured, meaningful content, much like the emergence of macroscopic order from quantum chaos.

Just as the transition from quantum mechanics to classical physics involves the emergence of order and predictability from underlying chaos and uncertainty, the diffusion model’s denoising process mirrors this transition by creating structured outputs from initial randomness.

In both the quantum-to-classical transition and diffusion models, the concept of entropy plays a central role. In physics, entropy measures the disorder or randomness of a system, with systems naturally evolving from low entropy (order) to high entropy (disorder) unless work is done to organize them. In diffusion models, the “work” is done by the model’s learned parameters, which guide the noisy, high-entropy input towards a low-entropy, organized output.
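
The entropy framing can be illustrated numerically. The sketch below (an illustration only, not part of any real diffusion pipeline) estimates Shannon entropy from a histogram over a shared set of bins: a wide Gaussian noise sample spreads its probability mass over many bins (high entropy), while a tightly concentrated “structured” sample occupies only a few (low entropy).

```python
import numpy as np

def shannon_entropy_bits(samples, bin_edges):
    # Histogram-based Shannon entropy estimate, in bits.
    hist, _ = np.histogram(samples, bins=bin_edges)
    p = hist / hist.sum()
    p = p[p > 0]  # convention: 0 * log(0) contributes nothing
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
bin_edges = np.linspace(-4.0, 4.0, 65)  # shared bins for a fair comparison

noise = rng.normal(size=10_000)              # high-entropy input
structured = 0.01 * rng.normal(size=10_000)  # low-entropy "output"

h_noise = shannon_entropy_bits(noise, bin_edges)
h_structured = shannon_entropy_bits(structured, bin_edges)
```

The model’s “work” in this analogy is whatever moves a sample from the first distribution toward the second.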

The quantum state’s superposition, where particles hold multiple potential states, parallels the initial stages of a diffusion model’s process, where the generated content could evolve into any of numerous outcomes. The act of measurement in quantum mechanics, which selects a single outcome from many possibilities, is analogous to the iterative refinement in diffusion models that selects and reinforces certain patterns over others, culminating in a specific, coherent output.

This analogy beautifully illustrates how principles of order, entropy, and emergence are central both to our understanding of the physical universe and to the cutting-edge technologies in artificial intelligence. It highlights the universality of these concepts across disparate domains, from the microscopic realm of quantum mechanics to the macroscopic world we inhabit, and further into the virtual realms created by multimodal Large Language Models.

For all we know, we might actually be part of such a smart dust simulation. The remarkable fact that our digital tools can create solid realities out of randomly distributed bits seems like a strong argument for the simulation hypothesis.

It might be dust all the way down…

A Technology of Everything Part 2 – Scientific Demonology

Reading Time: 8 minutes

This is part 2 in a series that explores the Parallels of Technology and Magic and their potential fusion in the Age of Artificial Super Intelligence (ASI). Part 1 is here.

The foundations of magic and their scientific counterparts

The Golden Bough is a wide-ranging and influential work by Sir James Frazer, published in multiple volumes starting in 1890. It’s a comparative study of mythology and religion, attempting to find common themes and patterns among various cultures throughout history. Frazer sought to explain the evolution of human thought from magic through religion to science.

What he failed to mention is that even in our Age of Enlightenment some of these magical principles have spawned rational descendants.

The Law of Similarity in Magic: This is the belief that objects resembling one another share a magical connection. An example includes using a wax figure to symbolize a person, with the notion that manipulating the figure can influence the person it represents.

The Law of Similarity in Economics: We name certain data bits “coins” or “wallets” on a computer, and they are perceived as having value akin to real-world currency. This value is abstractly held in a digital ledger called the blockchain. Trading these digital coins affects their market value. WTF? FTX… Magic!

The Law of Contagion in Magic: The idea that items that have come into contact with each other retain a spiritual bond even after they’re separated. For instance, using someone’s hair in a ritual to affect them.

The Law of Contagion in DNA Analysis: Forensic teams use this principle to link a criminal to a crime scene. If a person leaves behind DNA evidence, such as a hair or skin cell, it can lead to their arrest even years later.

Taboos in Magic: Some actions, people, or items are seen as forbidden due to their perceived sanctity or risk. Violating these rules can lead to supernatural consequences.

Forbidden Research in Science: There are global ethical guidelines against certain types of research, like experiments on human embryos or creating biological weapons.

Substitution in Magic: The practice of using a substitute, often an animal or occasionally a human, to appease a deity or gain foresight.

Substitution in Science (Animal Testing): Animals are often used in laboratory settings to test new drugs or medical procedures before they’re used on humans. Essentially, they’re “sacrificed” for future scientific understanding.

While science has been more accurate and reliable than ancient magical practices, it’s not without its challenges.

Replication, consistency, and completeness in particular are more fragile than scientists would hope, and than public discourse reflects. What we have learned seems to indicate that the knowledge universe expands with every piece of information we gather and every problem we solve, so science will never run out of relevant matters to discuss. A static knowledge universe, in which our science could answer every nontrivial question, is forever and in principle out of reach. The final answer simply does not exist.

Further complicating our journey is the existence of non-linear (chaotic) systems, suggesting that predictions for many complex systems will remain approximations. Although our tools and methodologies continue to evolve, the improvements don’t always correlate with understanding hidden consequences.

Rituals in Magic and Methods in Science – a comparison

| Element | Rituals in Magic | Methods in Science |
| --- | --- | --- |
| Intention | Attracting love, wealth, protection, healing, or spiritual growth. | Setting a clear research goal, such as proving a hypothesis to win a Nobel Prize and get rich, famous, and a book contract. |
| Symbols | Symbols that carry specific energies or powers, like objects, gestures, words, or sounds. | Variables representing different factors or conditions in an experiment. |
| Procedure | Specific order of operations, like purification, casting a circle, invoking deities, etc. | A systematic plan to test hypotheses or theories by observing or manipulating variables; decontamination of tools. |
| Energy-Information Manipulation | Raising, directing, and releasing energy to achieve the desired outcome. | Gathering and measuring information on variables of interest to answer the research question. |
| Sacred Space | Creating a boundary between the mundane world and the magical realm, like casting a circle. | Ensuring experiments are conducted under standardized conditions to minimize errors, using a laboratory that only experts can enter. |
| Invocation | Invoking deities, spirits, or other entities for assistance or blessing. | Referencing previous research and scientists to build upon existing knowledge and validate claims. |
| Tools and Ingredients | Using candles, incense, oils, crystals, wands, chalices, and pentacles. | Using instruments and resources to conduct experiments and gather data. |
| Timing | Performing the ritual during a specific moon phase, day, or time for effectiveness. | Choosing the right time to conduct experiments or gather data for accuracy and relevance; for example, investing in AI research at the peak of a hype cycle. |
| Repetition and Replication | Repeating rituals over days or longer to enhance effectiveness. | Repeating experiments to verify results and ensure consistency and reliability. |
| Personalization | Adapting or creating rituals that resonate with individual beliefs and intentions. | Modifying research methods based on unique conditions or challenges to ensure validity; ensuring the outcome strengthens one's own school of thought. |
| Risk Management | Protective spells, amulets. | Publish or perish. |

A Scientific Demonology

In ancient Greek religion a δαίμων was considered a lesser deity or spirit that influenced human affairs. It could be either benevolent or malevolent. These spirits were believed to be intermediaries between gods and humans, carrying messages or executing the will of the gods.

Some Greeks believed that every individual had a personal daimon that watched over them, guiding and protecting them throughout their life. This concept is somewhat analogous to the idea of guardian angels in Christian theology.

The philosopher Socrates often spoke of his “daimonion,” a voice or inner spirit that guided him. Unlike the oracles that delivered prophecies in the name of the gods, Socrates’ daimonion was more of an internal moral compass. It didn’t tell him what to do but rather warned him when he was about to make a mistake.

In ethics, particularly in the works of Aristotle, the term “eudaimonia” is central. Often translated as “happiness” or “flourishing,” eudaimonia refers to the highest human good or the end goal of human life. For Aristotle, living a life in accordance with virtue leads to eudaimonia.

Here’s a list of the scientific “demons” mentioned in the book “Bedeviled: A Shadow History of Demons in Science” by Jimena Canales:

Descartes’ Demon: Introduced by Rene Descartes, this demon could manipulate our perception of reality, making us doubt our senses and even our existence. It’s a philosophical tool to question the nature of reality and knowledge.

In his book Reality+, David Chalmers makes a solid argument for why the virtual reality systems of the future could be a technological realization of this philosophical concept. His conclusion is virtual realism, a position stating that the simulated objects and events in such a VR environment should be considered first-class reality. By naturalizing Descartes’ demon, Chalmers effectively robs it of its magical power and transports it into the technological realm.

Maxwell’s Demon: Proposed by James Clerk Maxwell, this hypothetical being can sort particles based on their energy without expending any energy itself, seemingly violating the second law of thermodynamics, which states that the entropy of an isolated system can never decrease.

Maxwell’s demon can be exorcised by the following means: the demon’s ability to decide which molecules to let through is a form of intelligence. This decision-making process, whether based on a computational model or some other mechanism, requires energy. The demon’s operations, including observing, measuring, and operating the door, all consume energy; even if these processes were incredibly efficient, they could never be entirely without cost. The energy costs associated with the demon’s intelligent operations ensure that there is no free lunch: the demon cannot build a perpetual motion machine or violate the second law of thermodynamics.
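
This energy-accounting argument is usually made quantitative via Landauer’s principle: erasing one bit of information from the demon’s memory dissipates at least k_B·T·ln 2 of energy. A quick back-of-the-envelope calculation in plain Python shows the scale of that unavoidable cost at room temperature (the mole-sized workload is just an illustrative assumption):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
T = 300.0           # room temperature in kelvin

# Landauer limit: minimum energy dissipated to erase one bit of memory.
landauer_limit_J = k_B * T * math.log(2)  # about 2.9e-21 J per bit

# Sorting a mole's worth of molecules means forgetting ~6e23 measurements.
avogadro = 6.02214076e23
total_cost_J = landauer_limit_J * avogadro
```

A demon that must forget each measurement to keep working therefore pays an energy bill that restores the second law.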

Laplace’s Demon: Envisioned by Pierre-Simon Laplace, this demon represents determinism. If it knew the precise location and momentum of every atom in the universe, it could predict the future and reconstruct the past with perfect accuracy. A malignant, ASI-variation of this kind of deterministic Demon is Roko’s Basilisk.

Laplace’s demon can be easily exorcised by applying chaos theory. Even if the demon knows the position and momentum of every atom, the tiniest imprecision or error in its knowledge can lead to vastly different predictions about the future, thanks to the butterfly effect. There is no such thing as perfectly precise knowledge, even about something seemingly harmless like Pi: one does not simply measure a transcendental number exactly.

While systems described by chaos theory are deterministic (they follow set laws), they are not predictable in the long run because errors in prediction grow exponentially. Many systems in nature, such as weather patterns, are chaotic: in practice they are unpredictable beyond a certain time frame, even if they are deterministic in theory. Even Laplace’s demon cannot accurately predict climate change.

In essence, chaos theory introduces a form of “practical unpredictability” even in deterministic systems. It does not deny the possibility of a deterministic universe as Laplace’s demon suggests, but it does argue that such a universe would still be unpredictable in practice due to the inherent nature of chaotic systems. By invoking chaos theory, one can thus argue that the universe’s future is inherently unpredictable, thereby “exorcising” the deterministic implications of Laplace’s demon. A separate question entirely is whether the demon could at least calculate the overall trajectory of complex systems and the shape of the strange attractor such a system is confined to.
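
The butterfly effect is easy to demonstrate with the logistic map, a standard one-line chaotic system. Two trajectories that start 10⁻¹² apart, far finer than any realistic measurement, become macroscopically different within a few dozen iterations:

```python
def logistic_map(x, r=4.0):
    # r = 4.0 puts the logistic map in its fully chaotic regime.
    return r * x * (1.0 - x)

x = 0.2          # the "true" initial condition
y = 0.2 + 1e-12  # the demon's almost-perfect measurement of it

max_divergence = 0.0
for step in range(60):
    x, y = logistic_map(x), logistic_map(y)
    max_divergence = max(max_divergence, abs(x - y))
```

The gap roughly doubles per iteration, so after about forty steps the two futures have nothing to do with each other, even though the rule itself is perfectly deterministic.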

In his Foundation series, Asimov invented a blend of history, sociology, and statistical mathematics called psychohistory. It is a fictional science that combines the historical record with mathematical equations to predict the broad flow of future events in large populations, specifically the Galactic Empire of Asimov’s stories. Importantly, psychohistory is effective only at scale; it cannot predict individual actions, only the general flow of events arising from the actions of vast numbers of people. This could be called a weak version of Laplace’s demon, an Asimov demon, which can predict only the attractor of mega-systems, not the detailed events.

Darwin’s Demon: A species representing the perfect efficiency of natural selection.

In evolutionary biology, the term ‘Darwinian fitness’ refers to the lifetime reproductive success of an individual within a population of conspecifics. The idea of a ‘Darwinian Demon’ emerged from this concept and is defined here as an organism that commences reproduction almost immediately after birth, has a maximum fitness, and lives forever.

It is clear that a self-optimizing artificial superintelligence would be the realization of a Darwinian demon. It reproduces immediately: every copy has the same capabilities as the original AI from the moment it is created.

It has maximum fitness: if it reaches a state of pure information, it is, in this picture, basically identical to energy itself.

It lives forever: even if this universe dies, it has the chance to create another one. It even transcends our limited view of universal eternity.

Daemons in Computer Science: These are not supernatural entities but background processes in computing. They perform tasks without direct intervention from the user.
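
In Python, for example, the computing sense of “daemon” survives literally in the standard library: a daemon thread runs in the background and is killed automatically when the main program exits. A minimal sketch (the doubling “work” is just a placeholder):

```python
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    # Runs forever in the background, processing whatever tasks arrive.
    while True:
        item = tasks.get()
        results.append(item * 2)  # placeholder for real background work
        tasks.task_done()

# daemon=True: the interpreter will not wait for this thread on exit.
threading.Thread(target=worker, daemon=True).start()

for i in range(3):
    tasks.put(i)
tasks.join()  # block until the daemon has handled every queued task
```

Like its mythological namesake, the daemon thread acts invisibly on the user’s behalf and vanishes when its master departs.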

The algorithms running in the background to track user data and optimize engagement rates are modern variations of these demons.

Jung’s Demon: C.G. Jung, a Swiss psychoanalyst, believed that in some cases of psychosis, the patient might be overwhelmed by the contents of the unconscious, including archetypal images. These could manifest as visions of demons, gods, or other entities. Rather than dismissing these visions as mere hallucinations, Jung saw them as meaningful symbols that could provide insight into the patient’s psyche. Jung introduced the concept of the “shadow” to describe the unconscious part of one’s personality that contains repressed weaknesses, desires, and instincts. When individuals do not acknowledge or integrate their shadow, it can manifest in various ways, including mental disturbances or projections onto others. In some cases, the shadow might be perceived as a “demonic” force.

LLMs are trained on vast amounts of text from the internet. This includes literature, articles, websites, and more from various cultures and time periods. In essence, the model has been exposed to a significant portion of humanity’s collective knowledge. Given the diverse training data, the model would inevitably encounter recurring symbols, stories, and themes that resonate with Jung’s archetypes. For instance, the hero’s journey, the mother figure, the shadow, the wise old man, etc., are themes that appear in literature and stories across cultures. At its core, a neural network is a pattern recognition system. It identifies and learns patterns in the data it’s trained on. If certain archetypal patterns are universally present in the data (as Jung would suggest), the model would likely recognize and internalize them. When the model generates responses, it does so based on patterns it has recognized in its training data. Therefore, when asked about universal themes or when generating stories, it might produce content that aligns with or reflects these archetypal patterns, even if it doesn’t “understand” them in the way humans do.