Noetic Archeology: Unearthing Conceptual Gold from Cultural Debris
Memetic Evolution: Mutations, Mimicry, and Membranes
I spent years frustrated by misrepresentations of ideas I valued: fake Buddha and Nietzsche quotes. Aromatherapy to align your chakras. Naive technological utopias promising that AI or Bitcoin will solve all global problems.
Then I recognized a pattern: every valuable idea attracts superficial interpretations, hype, and charlatans. Conceptual gold is invariably buried in muck. In this piece, I'll explore why this occurs. If you find these arguments persuasive, you may stop resenting the mistreatment of your cherished ideas. You might even join me in becoming a dung-sifter of the noosphere, a noetic archeologist.
The Memetic Lens: Ideas Have People
To explore this topic, we'll use the framework of cultural evolution. Just as genes drive biological evolution, ideas or "memes" (a term coined by Richard Dawkins) shape cultural evolution. While internet memes are a subset of this concept, I’m using the broader definition in this piece: memes are units of cultural transmission passed between people.
Humans are monkeys infected with ideas.
Memes, like genes, are selected for fitness in their environment. However, 'fitness' here means usefulness rather than truth, encompassing explanatory power, status-conferring ability, and compatibility with a host's existing beliefs. In this piece, I’m equating adaptive ideas with “good” ideas: they are useful to their hosts, and platonic musings on definitions are out of scope.
Memes interact to form complex structures called "memeplexes," analogous to genomes. For instance, Jesus is a meme, while Christianity is a memeplex.
The Memetic Lens explains why good ideas are usually surrounded by bad ones. We analyze three main areas: mutations, mimicry, and membranes.
Mutating Memes: Success Breeds Variation
Highly fit memes spread rapidly, and every additional transmission is another chance to mutate - much like an elaborate game of telephone. Mutations can occur in three areas:
Sender:
Faulty message: The sharer may not fully understand the meme, perhaps due to receiving a mutated version or misinterpreting it.
Encoding errors: Even if understood, the sender might fail to articulate the meme effectively.
Signal:
Faulty transmission: e.g. distortions from language barriers or cultural differences
Noise interference: e.g. background static leading to misunderstandings
Receiver: Misinterpretation in various ways (decoding errors)
Memetic friction, another source of mutation, occurs when memes encounter the receiver's existing beliefs. This can result in rejection, acceptance, or partial assimilation with modifications.
Highly adaptive memes are more likely to experience memetic friction. Their novelty and potential benefits make them attractive for assimilation, but also more prone to modification to fit existing belief systems.
For example, the Buddhist concept of "dukkha" (struggle with experience) is often translated as "suffering." This translation, while more relatable to Judeo-Christian thought, alters the original meaning.
In summary, highly adaptive memes propagate more often and therefore mutate more often, leading to more butchered versions of the originally good idea floating around.
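The sender/signal/receiver breakdown above is essentially a noisy-channel model, and the telephone-game dynamic is easy to watch in a toy simulation. Purely illustrative: the `transmit` function, the error rate, and the message are all invented for this sketch, and a single corruption probability crudely stands in for encoding errors, noise, and decoding errors alike.

```python
import random

random.seed(0)

def transmit(message, error_rate=0.02):
    """One sender -> signal -> receiver hop. Each character may be
    corrupted, a crude stand-in for encoding errors, transmission
    noise, and decoding errors combined."""
    alphabet = "abcdefghijklmnopqrstuvwxyz "
    return "".join(
        random.choice(alphabet) if random.random() < error_rate else ch
        for ch in message
    )

def telephone(message, hops):
    """Relay the message through a chain of hosts."""
    for _ in range(hops):
        message = transmit(message)
    return message

original = "all conditioned experience is unsatisfactory"
for hops in (1, 10, 50):
    garbled = telephone(original, hops)
    fidelity = sum(a == b for a, b in zip(original, garbled)) / len(original)
    print(f"{hops:3d} hops -> fidelity {fidelity:.0%}: {garbled!r}")
```

The more hops (i.e., the more widely a meme spreads), the lower the fidelity to the original - which is the whole point of this section in miniature.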
Cognitive Cuckoos: On Memetic Mimicry
Mimicry is ubiquitous in biology. Harmless species mimic dangerous ones for protection, and predators mimic the prey of their prey to prey on their prey.
The former example is called Batesian mimicry in biology. Similarly, bad ideas may mimic the appearance or structure of good ideas to propagate. Fake gurus with their Swami Safu Rinpoche trappings - bindi markings, mala beads, and the affected accent - use appearances to give their ideas more credibility. Or the latest high-TPS, horizontally scalable blockchain for RWAs. All the right buzzwords but none of the substance.
Buzzwords are often a tell for mimicry, serving as the memetic equivalent of bright colors in biological mimicry. "Transformative non-dual quantum consciousness," anyone?
While Batesian memetic mimicry typically benefits the originator (gaining followers, selling courses, or attracting investments), more nefarious forms exist. Aggressive memetic mimicry, akin to its biological counterpart, involves harmful ideas imitating benign ones to lure unsuspecting targets, such as phishing emails or fake news sites mimicking legitimate outlets.
Memetic drift, analogous to genetic drift, occurs when adaptively neutral memes randomly become more frequent within a cultural niche. Think of groups of friends subconsciously picking up each other's vocabulary. This mimicry happens at the carrier level rather than as an adaptive strategy of the memes themselves. Unlike mutations, which change the content of memes, memetic drift changes meme frequency randomly, independent of adaptive value. This explains the sometimes random assortment of ideas clustering around a central, beneficial concept.
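Memetic drift is the same process population geneticists model as neutral genetic drift, and the standard simulation carries over directly. Everything here - the hosts, the slang variants, the copy rule - is a made-up toy, not a claim about real populations.

```python
import random

random.seed(1)

def drift(population, generations):
    """Neutral cultural drift: each generation, every host adopts the
    variant of a randomly chosen member of the previous generation.
    No variant is fitter than any other, yet frequencies wander and
    one variant eventually takes over ("fixes")."""
    for _ in range(generations):
        population = [random.choice(population) for _ in population]
    return population

# Ten hosts, ten initially distinct slang words (all adaptively neutral).
hosts = [f"word{i}" for i in range(10)]
for gen in (0, 5, 20, 100):
    print(gen, sorted(set(drift(hosts, gen))))
```

Run it a few times with different seeds: which word wins is arbitrary, but the collapse toward a single dominant variant is reliable - frequency change with no adaptive value, exactly as described above.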
Conceptual Castles: Memetic Membranes
Memes travel in memeplexes - interconnected idea structures like stories, philosophies, worldviews, and religions. Just as a cell's DNA is protected by membranes, memeplexes have membranes safeguarding their integrity.
These memetic membranes are dense and challenging to penetrate, often manifesting as complex keystone concepts. Think of Buddhism's dependent origination or Christianity's trinity. They can also take the form of canonical texts, from the Bible to the “rationalist canon”.
Like their biological counterparts, memetic membranes are semi-permeable, rejecting most memes while allowing some to pass. For memes, hitching a ride with a successful memeplex is a winning strategy. This membrane dynamic explains the clustering of bad ideas around good ones in two ways:
Bad ideas that slip through the membrane
Accumulation of rejected bad ideas at the membrane's edge
Once inside, bad ideas are tough to evict. Holy books, for instance, often contain ethnocentric prejudices that aren't central to the teachings but reflect the cultural assumptions of their time.
Most memes, particularly bad ones born from mutations or mimicry, fail to breach adaptive memeplexes. These rejected ideas often outnumber the memes within the memeplex itself. Case in point: the misunderstood Buddhist teachings flooding YouTube and Twitter vastly exceed the word count of the original Suttas. Similarly, vapid AI and crypto startups far outnumber genuine technological breakthroughs.
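The two-way split - a few memes admitted, many more piling up at the membrane's edge - can be sketched as a simple filter. The compatibility scores here are random stand-ins; in reality, "compatibility" would depend on the host population's existing beliefs and the memeplex's keystone concepts.

```python
import random

random.seed(2)

def membrane_filter(incoming, threshold=0.8):
    """Semi-permeable membrane: each candidate meme carries a
    compatibility score with the memeplex's keystone concepts.
    Few pass the threshold; most are rejected and accumulate
    outside the membrane."""
    inside, outside = [], []
    for meme, compatibility in incoming:
        (inside if compatibility >= threshold else outside).append(meme)
    return inside, outside

# 1,000 candidate memes with random compatibility scores.
candidates = [(f"meme-{i}", random.random()) for i in range(1000)]
inside, outside = membrane_filter(candidates)
print(f"admitted: {len(inside)}, accumulated outside: {len(outside)}")
```

With a high threshold, the pile outside dwarfs the contents inside - which is why the misunderstood versions of an idea so vastly outnumber the idea itself.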
Armor of Absurdity: When Bad Ideas Shield Good Ones
While sifting through bad ideas to grasp a new concept is frustrating, these intellectual weeds might actually benefit the memeplex. In nature, parasites sometimes evolve into symbionts. Similarly, in the realm of ideas, seemingly harmful concepts can protect memeplexes from destructive contact with other idea systems or camouflage their true value. These strategies help memeplexes select their ideal hosts, potentially spreading further by choosing which populations they infiltrate.
Memetic endosymbiosis occurs when seemingly maladaptive memes reside inside the memeplex. These ideas act as an immune system, ensuring that hosts with incompatible beliefs reject the entire structure rather than cherry-picking parts. This all-or-nothing approach benefits the memeplex, as its ideas mutually reinforce each other and spread more effectively as a unit.
Consider Christianity's doctrine of the virgin birth. This idea, rejected by anyone with a strictly rational worldview (like Richard Dawkins), acts as a gatekeeper. It repels carriers of the rational/scientific worldview, preserving the memeplex's integrity by attracting only traditionalists or post-rationalists. This selectivity helps the memeplex of Christianity avoid being torn apart by scientific scrutiny or scientism.
Memetic exosymbiosis, on the other hand, involves bad ideas clustering outside the memeplex's core. This “firewall of nonsense” camouflages the memeplex's true value with superficial or misunderstood versions. It’s oddly beneficial that most bankers still view Bitcoin as a criminal tool. I prefer that most psychopathic CEOs think that meditation is nonsense for naive hippies. We can thank ransomware scammers and New Age influencers for these protective misconceptions.
This outer layer of misunderstanding ensures that only the truly curious or persistent discover the memeplex's real value. While it might seem counterintuitive for memes to limit their spread, this strategy mirrors successful startup tactics: dominate a niche market before expanding. For memes, achieving high saturation in smaller, coordinated populations proves more effective for long-term propagation than a rapid, shallow global spread.
Idea Infection: Recap & Glossary
Drawing parallels between evolutionary biology and cultural evolution to explain the prevalence of bad ideas surrounding good ones was a ton of fun.
To recap: Highly adaptive memes spread and mutate more frequently. Since most mutations are maladaptive, this explains why good ideas often become surrounded by their inferior variants.
Mimicry, employed by both memes and their hosts, is a strategy to spread by imitating adaptive memes' structure or appearance. This phenomenon explains the proliferation of buzzwords by fake gurus and influencers.
Memes typically travel in memeplexes with semi-permeable membranes. As various memes attempt to infiltrate adaptive memeplexes, some bad ideas manage to slip inside and circulate within. Many more accumulate around the membrane, effectively shrouding the adaptive memeplex. Both internal and external bad ideas can benefit the memeplex's propagation by allowing it to select for suitable hosts.
Here is a quick glossary of the words I made up:
Memetic friction: Mutations arising from idea collisions as a host assimilates a memeplex.
Batesian memetic mimicry: Memes imitating adaptive memes to spread ("copycats").
Aggressive memetic mimicry: Memes imitating adaptive memes to lure hosts for nefarious purposes ("bait").
Memetic drift: Random frequency increases of neutral memes within a memeplex as hosts imitate one another.
Memetic membranes: Keystone ideas and structures (e.g., books, canons) protecting memeplex integrity and filtering incoming memes.
Memetic endosymbiosis: Initially non-adaptive memes within an adaptive memeplex that act as an immune system, preventing contact with hosts carrying incompatible memeplexes.
Memetic exosymbiosis: Accumulation of bad ideas outside an adaptive memeplex's membrane, camouflaging its value and selecting for host populations to achieve higher saturation in strategic social niches.
These concepts are initial attempts to understand the evolutionary dynamics in idea space. At the very least, I hope these musings have demonstrated the value of applying a memetic framework to cultural evolution.
Beyond Buzzwords: Become a Noetic Archeologist
From a memetic perspective, every powerful idea seems to spawn a swarm of nonsense. Conceptual gold lies buried beneath layers of intellectual detritus.
Here’s the kicker: This means that great ideas can be uncovered by digging into the biggest heaps of bullshit around. You can follow the trail of buzzwords, scammers, and naive idealists back to genuine sources of insight. I certainly found this to work. As an unexpected benefit, you begin to value the most ridiculous ideas - they serve as breadcrumbs leading to profound insights lurking in the same cognitive neighborhood.
So, what's the most absurd notion you keep encountering? Start your excavation there.