# Tunneling Into The Soup

• Context
• Our Photoreceptive Hardware
• The Birthplace Of Human-Visibility
• Takeaways

Context

In Knowledge: An Empirical Sketch, I left you with the following image of perceptual tunneling:

Today, we will explore this idea in more depth.

Recall what you know about the nature of light:

A photon’s energy is given by E = hc/λ. Since h (Planck’s constant) and c (the speed of light) are just constants, the relation becomes very simple: energy is inversely proportional to wavelength. Rather than identifying a photon by its energy, then, let’s identify it by its wavelength. We will do this because wavelength is easier to measure (in my language, we have selected a measurement-affine independent variable).
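
The inverse relation can be sketched in a few lines of Python. The constants are standard CODATA values; the example wavelengths are arbitrary choices of mine:

```python
# A minimal sketch of the energy-wavelength relation E = h*c / wavelength.
H = 6.62607015e-34   # Planck's constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photon_energy_joules(wavelength_nm: float) -> float:
    """Energy of a single photon, given its wavelength in nanometers."""
    return H * C / (wavelength_nm * 1e-9)

# Halving the wavelength doubles the energy: inverse proportionality.
assert abs(photon_energy_joules(275) / photon_energy_joules(550) - 2.0) < 1e-9
```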

So we can describe one photon by its wavelength. How about billions? In such a case, it would be useful to draw a map, on which we can locate photon distributions. Such a photon map is called an electromagnetic spectrum.

With this background knowledge in place, we can explore a photon map of solar radiation: what types of photons strike the Earth from the Sun?

This image is rather information-dense. Let’s break it down.

Compare the black curve to the yellow distribution. The former represents an idealized energy radiator (a black body), whereas the latter represents the actual transmission characteristics of the Sun. As you can see, while the black body abstraction does not perfectly model the idiosyncrasies of our Sun, it does a pretty good job “summarizing” the energy output of our star.

Next, compare the yellow distribution to the red distribution. The former represents solar radiation before it hits our atmosphere; the latter represents solar radiation after it passes through the atmosphere (when it strikes the Earth). As you can see, at some wavelengths light passes through the atmosphere easily (red ~= yellow; call this atmospheric lucidity) whereas at other wavelengths, the atmosphere absorbs most of the photons (red << yellow; call this atmospheric opacity).
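
The lucid/opaque distinction is just a ratio between the two curves. Here is a toy sketch of that comparison; the irradiance numbers are invented for illustration (though ~940 nm really does sit in a water-vapor absorption band):

```python
# Comparing top-of-atmosphere irradiance ("yellow" curve) with surface
# irradiance ("red" curve). All numeric values are hypothetical.
top_of_atmosphere = {500: 1.9, 940: 1.1}   # W/m^2/nm at two wavelengths
at_surface        = {500: 1.5, 940: 0.1}   # W/m^2/nm at the same wavelengths

def transmittance(wavelength_nm: int) -> float:
    """Fraction of incoming light that survives the trip to the ground."""
    return at_surface[wavelength_nm] / top_of_atmosphere[wavelength_nm]

for wl in (500, 940):
    label = "lucid" if transmittance(wl) > 0.5 else "opaque"
    print(wl, round(transmittance(wl), 2), label)
```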

These different responses to light of different energies do not occur at random, of course. Rather, the chemical composition of the atmosphere causes atmospheric opacity. Ever hear the meme “the ozone layer protects us from UV light”? Well, here is the data underlying that meme (see the “O3” marker near the 300 nm mark?). Other, more powerful but less well-known effects are also visible in the above spectrum, characterizing the shielding contributions of water vapor, carbon dioxide, and oxygen.

Our Photoreceptive Hardware

Your eyes house two types of photoreceptive cell: the rod and the cone.

Rods are tuned to perform well in low-light conditions. After roughly 30 minutes in the dark, they reach peak sensitivity. In this “dark-adapted” state, the visual system is amazingly sensitive: a single photon can cause a rod to trigger, and you will see a flash of light if as few as seven rods absorb photons at one time.

Cones, on the other hand, excel in daylight. They also underwrite the experience of color (a phenomenon we will discuss next time). Now, unless you are a tetrachromat mutant, your eyes contain three kinds of cone:

• 16% blue cones
• 10% green cones
• 74% red cones

Below are the absorption spectra. Please note that, while not shown, rods manifest a similar spectrum: they reside between the blue and green curves, with a peak receptivity at 498 nm.
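
As a rough sketch, each cone’s absorption spectrum can be approximated by a bell curve around its peak. The peak wavelengths below are commonly cited values (~420, ~534, ~564 nm); the shared 40 nm width is a simplification of mine:

```python
import math

# Toy model: each cone type's absorption spectrum as a Gaussian curve.
PEAKS = {"blue": 420.0, "green": 534.0, "red": 564.0}  # peak wavelengths, nm
WIDTH = 40.0  # shared curve width, nm (a simplifying assumption)

def absorption(cone: str, wavelength_nm: float) -> float:
    """Relative absorption (0..1) of a cone type at a given wavelength."""
    return math.exp(-((wavelength_nm - PEAKS[cone]) / WIDTH) ** 2)

# Each cone responds maximally at its own peak...
assert absorption("green", 534) == 1.0
# ...and the curves overlap: a 550 nm photon excites green and red cones alike.
assert absorption("green", 550) > 0.5 and absorption("red", 550) > 0.5
```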

It is important to take the above in the context of the cone’s broader function. By virtue of phototransductive chemical processes, sense organs like the cone accept photons matching the above spectrum as input, and thereby produce spike trains (neural code) as system output.

The Birthplace Of Human-Visibility

We now possess two spectra: one for solar radiation, and one for the photoreceptor response profiles. Time to combine spectra! After resizing the images to achieve scale consistency, we arrive at the following:

Peak solar radiation corresponds to the cone spectra! Why should this be? Well, recall the purpose of vision in the first place. Vision is an evolutionary adaptation that extracts information from the environment & makes it available to the nervous system of its host. If most photon-mediated information arrives in the 450-700 nm band, should we really be so surprised to learn that our eyes have adapted to this particular range?

Notice that we drew two dotted lines around the intersection boundaries. We have now earned the right to use a name. Let us name photons residing within this interval visible light. Then,

• Let “ultraviolet light” represent photons to the left of the interval (shorter wavelengths, higher energy)
• Let “infrared light” represent photons to the right of the interval (longer wavelengths, lower energy)

We have thus stumbled on the definition of visibility. Visibility is not an intrinsic physical property, like charge. Rather, it is a human invention: the boundary at which our idiosyncratic photoreceptors carve into the larger particle soup.

Takeaways

We have now familiarized ourselves with the mechanism of tunneling. Perceptual tunneling occurs when a sense organ transduces some slice of the particle soup of reality. In vision, photoreceptor cells transduce photons within the 450-700 nm wavelength band into the neural code.

With this precise understanding of transduction, we begin to develop a sense of radical contingency. For example,

• Ever wonder what would happen if the human eye also contained photoreceptors sensitive in the 1100 nm range? The human umwelt would extend into the infrared. As you heat a fire, for example, you would see tendrils of flame brighten, then vanish, then reappear. I suspect everyday language would feature terms like “bright-visible” and “dim-visible”.
• Consider what would have happened if, during our evolution, the Sun had been colder than 5,250 degrees Celsius. The idealized black-body spectrum of the Sun would shift, its peak moving towards the right. The actual radiation signature of the Sun (the yellow distribution) would follow. Given how precisely the photoreceptors in our eyes “found” the band of peak emitted energy in this universe, it seems likely that in that world we would be wearing different photoreceptors, with an absorption signature better calibrated to the new information. Thus, we have a causal link between the temperature of the Sun and the composition of our eyeballs.
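
The second thought experiment can be made quantitative with Wien’s displacement law, which locates the peak of a black body’s emission at λ_peak = b/T. The 4000 K “cooler sun” below is an arbitrary choice of mine:

```python
# Wien's displacement law: lambda_peak = b / T.
# A cooler star shifts its emission peak toward longer wavelengths.
B_WIEN = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k: float) -> float:
    """Wavelength of peak black-body emission, in nanometers."""
    return B_WIEN / temperature_k * 1e9

sun_now    = peak_wavelength_nm(5778)   # ~501 nm: inside our visible band
cooler_sun = peak_wavelength_nm(4000)   # ~725 nm: pushed toward the infrared
assert cooler_sun > sun_now
```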

I began this post with a partial quote from Metzinger. Here is the complete quote:

The evening sky is colorless. The world is not inhabited by colored objects at all. It is just as your physics teacher in high school told you: Out there, in front of your eyes, there is just an ocean of electromagnetic radiation, a wild and raging mixture of different wavelengths. Most of them are invisible to you and can never become part of your conscious model of reality. What is really happening is that your brain is drilling a tunnel through this inconceivably rich physical environment and in the process painting the tunnel walls in various shades of color. Phenomenal color. Appearance.

Next time, we’ll explore the other half of Metzinger’s quote: “painting the tunnel walls in various shades of color, phenomenal color”…

# Knowledge: An Empirical Sketch

• Introduction
• All The World Is Particle Soup
• Soup Texture
• Perceptual Tunnels
• On Resolution
• Sampling
• Light Cones, Transduction Artifacts, Translation Proxies
• Step One: Conceptiation
• Step Two: Graphicalization
• Step Three: Annotation
• Putting It All Together: The Triad
• Conclusion
• Going Meta
• Takeaways

## Introduction

All The World Is Particle Soup

Scientific realism holds that the entities scientists refer to are real things. Electrons are not figments of our imagination; they possess an existence independent of our minds. What does it mean for us to view particle physics with such a lens?

Here’s what it means: every single thing you see, smell, touch… every vacation, every distant star, every family member… it is all made of particles.

This is an account of how the nervous system (a collection of particles) came to understand the universe (a larger collection of particles). How could Particle Soup ever come to understand itself?

Soup Texture

Look at your hand. How many types of particles do you think you are staring at? A particle physicist might answer: nine. You have four first-generation fermions (roughly, particles that comprise matter) and five bosons (roughly, particles that carry force). Sure, you may get lucky and find a couple exotic particles within your hand, but such a nuance would not detract from the moral of the story: in your hand, the domain (number of types) of particles is very small.

Look at your hand. How large a quantity of particles do you think you are staring at? The object of your gaze is a collection of about 700,000,000,000,000,000,000,000,000 (7.0 * 10^26) particles. Make a habit of thinking in this way, and you’ll find a new appreciation for the Matrix Trilogy. 🙂 In your hand, the cardinality (number of tokens) of particles is very large.
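
A back-of-envelope check of that figure, under assumptions of mine (a hand of roughly 0.8 kg, counting protons, neutrons, and electrons):

```python
# Rough particle census of a human hand. Assumed values, not measurements.
HAND_MASS_KG = 0.8
NUCLEON_MASS_KG = 1.67e-27          # approximate proton/neutron mass

nucleons = HAND_MASS_KG / NUCLEON_MASS_KG   # ~4.8e26 protons and neutrons
electrons = nucleons / 2                    # ~1 electron per 2 nucleons in light elements
total = nucleons + electrons                # ~7e26 particles, as advertised
print(f"{total:.1e}")
```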

These observations generalize. There aren’t many kinds of sand in God’s Sandbox, but there is a lot of it, with different consistencies across space.

## Perceptual Tunnels

On Resolution

Consider the following image. What do you see?

Your eyes filter images at particular frequencies. At this default human frequency, your “primitives” are the pixelated squares. However, imagine being able to perceive this same image at a lower resolution (sound complicated? move your face away from the screen :P). If you do this, the pixels fade, and a face emerges.
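
The “step back from the screen” trick is just block averaging. Here is a minimal sketch on an invented 4x4 toy image:

```python
# Lowering resolution by averaging 2x2 blocks of pixels into one.
# The toy image values are invented for illustration.
image = [
    [10, 0, 0, 10],
    [0, 10, 10, 0],
    [0, 10, 10, 0],
    [10, 0, 0, 10],
]

def downsample_2x(img):
    """Average each 2x2 block into a single coarser pixel."""
    out = []
    for r in range(0, len(img), 2):
        row = []
        for c in range(0, len(img[0]), 2):
            block = [img[r][c], img[r][c + 1], img[r + 1][c], img[r + 1][c + 1]]
            row.append(sum(block) / 4)
        out.append(row)
    return out

# At the coarser resolution, the checkered detail vanishes into uniform gray.
assert downsample_2x(image) == [[5.0, 5.0], [5.0, 5.0]]
```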

Here, we learn that lenses of different resolutions may complement one another, despite imaging the same underlying reality. In much the same way, we can enrich our cognitive toolkit by examining the same particle soup with different “lens settings”.

Sampling

By default, the brain does not really collect useful information. It is only by way of sensory transducer cells – specialized cells that translate particle soup into Mentalese – that the brain gains access to some small slice of physical reality. With increasing quantity and type of these sensory organs, the perceptual tunnel burrowed into the soup becomes wide enough to support a lifeform.

Another term for the perceptual tunnel is the umwelt. Different biota experience different umwelts; for example, honeybees are able to perceive the Earth’s magnetic field as directly as we humans perceive the sunrise.

Perceptual tunneling may occur at different resolutions. For example, your proprioceptive cells create signals only in the event of the coordinated effort of trillions and trillions of particles (e.g., the wind pushes against your arm). In contrast, your vision cells create signals at very fine resolutions (e.g., if a single photon strikes your photoreceptor, it will fire).

Light Cones, Transduction Artifacts, Translation Proxies

Transduction is a physically-embedded computational process. As such, it is subject to several pervasive imperfections. Let me briefly point towards three.

First, nature precludes the brain from sampling the entirety of the particle soup. Because your nervous system is embedded within a particular spatial volume, it is subject to one particular light cone. Since particles cannot move faster than the speed of light, you cannot perceive any non-local particles. Speaking more generally: all information outside of your light cone is closed to direct experience.

Second, the nervous system is an imperfect medium. It has difficulty, for example, representing negative numbers (ever try to get a neuron firing -10 times per second?). Another such transduction artifact is our penchant for representing information in a comparative, rather than absolute, format. Think of all those times you have driven on the highway with the radio on: when you turn onto a sidestreet, the music feels louder. This experience has nothing at all to do with an increased sound wave amplitude: it is an artifact of a comparison (music minus background noise). Practically all sensory information is stained by this compressive technique.

Third, perceptual data may not represent the actual slice of the particle soup we want. To take one colorful example, suppose we ask a person whether they perceived a dim flashing light, and they say “yes”. Such self-reporting, of course, represents sensory input (in this case, audio vibrations). But this kind of sensory information is a kind of translation proxy to a different collection of particles we are interested in observing (e.g., the activity of your visual cortex).

This last point underscores an oft-neglected aspect of perception: it is an active process. Our bodies don’t just sample particles, they move particles around. Despite the static nature of our umwelt, our species has managed to learn ever more intricate scientific theories in virtue of sophisticated measurement technology; and measurement devices are nothing more than mechanized translation proxies.

Step One: Conceptiation

Plato once describes concept acquisition as “carving nature at its joints”. I will call this process (constructing Mentalese from the Soup) theory conceptiation.

If you meditate on this diagram for a while, you will notice that theory conceptiation is a form of compression. According to Kolmogorov complexity theory, the efficacy of compression hinges on how many patterns exist within your data. This is why you’ll find leading researchers claiming that:

Compression and Artificial Intelligence are equivalent problems
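
The pattern-dependence of compression is easy to demonstrate with Python’s standard `zlib` module: a highly patterned byte string collapses to a tiny description, while incompressible noise barely shrinks at all.

```python
import os
import zlib

# Compression efficacy hinges on patterns in the data.
patterned = b"soup" * 25_000   # 100,000 bytes of pure repetition
noise = os.urandom(100_000)    # 100,000 bytes of randomness

small = len(zlib.compress(patterned))
large = len(zlib.compress(noise))

assert small < 1_000      # the pattern compresses to a tiny description
assert large > 90_000     # the noise resists compression
```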

A caveat: concepts are not carved solely from perception; as one’s bag of concepts expands, such pre-existing mindware exerts an influence on the further carving-up of percepts. This is what the postmoderns attribute to hermeneutics; this is the root of memetic theory; this is what is meant by the nature vs. nurture dialogue.

Step Two: Graphicalization

Once the particle soup is compressed into a set of concepts, relations between these concepts are established. Call this process theory graphicalization.

If I were to ask you to complete the word “s**p”, would you choose “soap” or “soup”? How would your answer change if we were to have a conversation about food network television?

Even if I never once mention the word “soup”, you become significantly more likely to auto-complete that alternative after our conversation. Such priming is explained through concept graphs: our conversation about the food network activates food-proximate nodes like “soup” much more strongly than graphically distant nodes like “soap”.
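
Priming of this sort can be sketched as spreading activation over a concept graph. The graph, edge weights, and decay rule below are all invented for illustration:

```python
# A toy spreading-activation model of priming. Activation starts at 1.0
# and decays multiplicatively along each (hypothetical) edge weight.
graph = {
    "food network": {"soup": 0.9, "recipe": 0.8},
    "soup": {"soap": 0.1},   # phonetic neighbors are only weakly linked
}

def activation(start: str, target: str) -> float:
    """Strongest activation reaching `target` from `start`."""
    best = 0.0
    stack = [(start, 1.0)]
    while stack:
        node, strength = stack.pop()
        if node == target:
            best = max(best, strength)
            continue
        for neighbor, weight in graph.get(node, {}).items():
            stack.append((neighbor, strength * weight))
    return best

# A food-network conversation primes "soup" far above "soap".
assert activation("food network", "soup") > activation("food network", "soap")
```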

Step Three: Annotation

Once the graph structure is known, metagraph information (e.g., “this graph skeleton occurs frequently”) is appended. Such metagraph information is not bound to any particular graph. Call this process theory annotation.

We can express a common complaint about metaphysics thusly: theoretical annotation is invariant to changes in conceptiation & graphicalization results. In my view (as hinted at by my discussion of normative therapy) theoretical annotation is fundamentally an accretive process – it is logically possible to generate an infinite annotative tree; this is not seen in practice because of the computational principle of cognitive speed limit (or, to make a cute analogy, the cognition cone).

Putting It All Together: The Triad

Call the cumulative process of conceptiation, graphicalization, and annotation the lens-dependent theorybuilding triad.

## Conclusion

Going Meta

One funny thing about theorybuilding is how amenable it is to recursion. Can we explain this article in terms of Kevin engaging in theorybuilding? Of course! For example, consider the On Resolution section above. Out of all possible adjectives used to describe theorybuilding, I deliberately chose to focus my attention on spatial resolution. What phase of the triad does that sound like to you? Right: theory conceptiation.

Takeaways

This article does not represent serious research. In fact, its core model – the lens-dependent theorybuilding triad – cites almost no empirical results. It is a toy model designed to get us thinking about how a cognitive process can construct a representation of reality. Here is an executive summary of this toy model:

1. Perception tunneling is how organisms begin to understand the particle soup of the universe.
    1. Tunneling only occurs by virtue of sensory organs, which transduce some subset of data (sampling) into Mentalese.
    2. Tunneling is a local effect; it discolors its target, and it sometimes merely represents data located elsewhere.
2. The Lens-Dependent Theorybuilding Triad takes the perception tunnel as input and builds models of the world. There are three phases:
    1. During conceptiation, perception contents are carved into isolable concepts.
    2. During graphicalization, concept interrelationships are inferred.
    3. During annotation, abstracted properties and metadata are attached to the conceptual graph.