Essay

Spring 2023


RATIONAL MAGIC

Why a Silicon Valley culture that was once obsessed with reason is going woo

Tara Isabella Burton



“It broke open a shell in my heart,” the young man I’ll call Vogel said of
reading Nietzsche’s Human, All Too Human when we met for an interview earlier
this year at a Brooklyn bar. “I was very, very depressed at that time…. Beauty
in the world had become hauntingly distant. It existed over the horizon behind
some mountain and I couldn’t access it.”

Vogel wears the owl of Minerva around his neck. It’s a reference to the pursuit
of wisdom, but the charm also evokes Hegel’s maxim “The owl of Minerva spreads
its wings only with the falling of the dusk” — the idea that true insight only
comes late, at the end of an era. His bracelets, too, are symbolic: they
represent Huginn and Muninn, who in Norse mythology are the two ravens who sit
on the shoulders of the god Odin, and whose names mean thought and memory.

Vogel is part of a loose online subculture known as the postrationalists — also
known by the jokey endonym “this part of Twitter,” or TPOT. They are a group of
writers, thinkers, readers, and Internet trolls alike who were once
rationalists, or members of adjacent communities like the effective altruism
movement, but grew disillusioned. To them, rationality culture’s technocratic
focus on ameliorating the human condition through hyper-utilitarian goals —
increasing the number of malaria nets in the developing world, say, or
minimizing the existential risk posed by the development of unfriendly
artificial intelligence — had come at the expense of taking seriously the less
quantifiable elements of a well-lived human life.

On Twitter, Vogel calls himself Prince Vogelfrei and tweets a combination of
subcultural in-jokes, deeply earnest meditations on the nature of spiritual
reality, and ambiguous amalgamations of the two (example: “get over all social
fomo by contemplating the inaccessible experience of all history and prehistory,
the primordial love stories of rodent-like ancestors”). Vogelfrei in German
means outlawed, but literally translates as “free as a bird” — not a bad thing to be, on the
often intellectually siloed birdsite. It’s also a reference to a series of poems
by Nietzsche sung by a prince who, imitating birds, sets himself spiritually
free.

Vogel’s pursuit of truth had hardly been painless. Raised a pastor’s son and
educated in evangelical Christian homeschool circles, as a teenager he was
living in Louisville, Kentucky, when he experienced a crisis of faith, or what he
calls a “deconversion.” It was, he told me, a “psychological shock,” even “a
mystical experience.” He had “a vision of God being sacrificed on the altar of
truth.” Traditional Christianity now seemed untenable to him; untenable, too,
the secular world, which seemed no less full of unexamined dogma, tinged with
moral and intellectual unseriousness. Unwilling to enter the standard life
tracks that seemed most easily available to him — ministry, say, or conservative
politics — he moved to Seattle, where he worked for a while as a janitor at the
University of Washington.

He continued to seek out new avenues of intellectual and spiritual engagement.
He got involved with his local rationalist community, ultimately running a
rationalist reading group. He also got involved with a group devoted to the
practice of Historical European martial arts: people he described as
“Renaissance Faire, pagan types.” Reading Nietzsche around this time, he saw in
the philosopher a model for how to bridge his intellectual and creative worlds.
Nietzsche, Vogel argued, “disarms some of the reasons that intelligent people
often end up very cynical by doing better than them, but still coming back
around to a perspective of hope, essentially.”

Vogel’s enthusiasm for beauty, for poetry, for mythic references, for an
esoteric strain of quasi-occult religious thought called Traditionalism: all of
this, his onetime compatriots in the rationality community might once upon a
time have dismissed as New Age claptrap. But Vogel’s personal journey from
rationalism to postrationalism is part of a wider intellectual shift — in
hyper-STEM-focused Silicon Valley circles and beyond — toward a new openness to
the religious, the numinous, and the perilously “woo.”

You might call it the postrationalist turn: a cultural shift in both relatively
“normie” and hyper-weird online spaces. Whether you call it spiritual hunger,
reactionary atavism, or postliberal epistemology, more and more young,
intellectually inclined, and politically heterodox thinkers (and would-be
thinkers) are showing disillusionment with the contemporary faith in technocracy
and personal autonomy. They see this combination as having contributed to the
fundamentally alienating character of modern Western life. The chipper,
distinctly liberal optimism of rationalist culture that defines so much of
Silicon Valley ideology — that intelligent people, using the right epistemic
tools, can think better, and save the world by doing so — is giving way, not to
pessimism, exactly, but to a kind of techno-apocalypticism. We’ve run up against
the limits — political, cultural, and social alike — of our civilizational
progression; and something newer, weirder, maybe even a little more exciting,
has to take its place. Some of what we’ve lost — a sense of wonder, say, or the
transcendent — must be restored.

‘RAISE THE SANITY WATERLINE’

A quick primer for the less-online. The rationality community got its start on a
few blogs in the early 2000s. The first, Overcoming Bias, founded in 2006 and
affiliated with Oxford’s Future of Humanity Institute, was initially co-written
by economics professor Robin Hanson and, somewhat improbably, Eliezer Yudkowsky,
a self-taught AI researcher. Yudkowsky’s chief interest was in saving the world
from the existential threat posed by the inevitable development of a hostile
artificial intelligence capable of wiping out humanity, and his primary medium
for recruiting people to his cause was a wildly popular, nearly 700,000-word
fanfiction called Harry Potter and the Methods of Rationality, in which Harry
learns that the human mind is capable of far more magic than a wooden wand could
ever provide.

As its name might suggest, Overcoming Bias was dedicated to figuring out all the
ways in which human beings have gotten very good at lying to ourselves, whether
through fear of the unknown or a desire for self-aggrandizement or just plain
being really bad at math, as well as all the ways in which we might train
ourselves to think better. By 2009, Yudkowsky had decamped to his own blog,
LessWrong, which purported to help people be, well, just that, by hacking into
our primordial predator-avoiding monkey-brains and helping them to run new
neurological software, optimized for life in a complicated modern world.

Both LessWrong and the similarly-focused Slate Star Codex, founded in 2013 by a
Bay Area psychiatrist writing under the pen name Scott Alexander, attracted not
just passive readers but enthusiastic commenters, who were drawn to the promise
of individual self-improvement as well as the potential to discuss philosophy,
science, and technology with people as uncompromisingly devoted to the truth as
they believed they were. These commenters — a mixture of the traditionally
educated and autodidacts, generally STEM-focused and with a higher-than-average
share of people who identified as being on the autism spectrum — tended to be
suspicious not just of the humanities as a discipline, but of all the ways in which
human emotional response clouded practical judgment.

Central to the rationalist worldview was the idea that nothing — not social
niceties, not fear of political incorrectness, certainly not unwarranted emotion
— could, or should, get between human beings and their ability to apprehend the
world as it really is. One longtime rationalist of my acquaintance described the
rationalist credo to me as “truth for truth’s sake.” No topic, no matter how
potentially politically incendiary, was off-limits. Truth, the rationalists
generally believed, would set humanity free. Sure, that meant tolerating the odd
fascist, Nazi, or neoreactionary in the LessWrong or Slate Star Codex comments
section (New Right leader Curtis Yarvin, then writing as Mencius Moldbug, was
among them). But free and open debate, even with people whose views you find
abhorrent, was so central to the rationalist ethos that the most obvious
alternative — the kinds of harm-focused safeguarding central to fostering the
ostensibly “safe spaces” of the social justice left — seemed unthinkable.

The rationalist universe soon expanded beyond the blogs themselves. Members of
the wider LessWrong community founded the Center for Applied Rationality in
Berkeley in 2012. Its purpose was to disseminate rationalist principles more
widely — or, in rationalist parlance, to “raise the sanity waterline.” They
focused on big-picture, global-level issues, most notably and controversially
Yudkowsky’s pet concern: the “x-risk” (“x” for existential) that we will
inadvertently create unfriendly artificial intelligence that will wipe out human
life altogether.

There were rationalist sister movements: the transhumanists, who believed in
hacking and improving the “wetware” of the human body; and the effective
altruists, who posited that the best way to make the world a better place is to
abandon cheap sentiment entirely — such as our attachment to those who live in
proximity to us — and figure out how to maximize one’s overall utility to the
wider world. In practice, that usually means making a lot of money at tech or
finance jobs and then donating it to global health initiatives.

There were commune-style rationalist group houses and polyamorous rationalist
group houses devoted to modeling rational principles of good living. (In his
ethnography of the rationalists, journalist Tom Chivers recounts one group that
uses a randomized, but weighted, math game to determine how to split restaurant
bills fairly.)

Rationalist culture — and its cultural shibboleths and obsessions — became
inextricably intertwined with the founder culture of Silicon Valley as a whole,
with its faith in intelligent creators who could figure out the tech, mental and
physical alike, that could get us out of the mess of being human. Investor Peter
Thiel gave over $1 million to Yudkowsky’s Machine Intelligence Research
Institute. Elon Musk met his now-ex Grimes when the two bonded on Twitter over a
rationalist meme. Meanwhile, the effective altruism movement took a major public
relations hit late last year when one of its wealthiest proponents, Sam
Bankman-Fried, was arrested and charged with fraud, conspiracy to commit money
laundering, and other crimes related to FTX, the cryptocurrency exchange he had
founded.

‘VITAMIN DEFICIENCY’

For many, rationality culture had at least initially offered a thrilling sense
of purpose: a chance to be part of a group of brilliant, committed young heroes
capable of working together to save all humanity. One former high-level employee
of the Centre for Effective Altruism, who asked not to be identified by name,
called the period he spent there in the 2010s the “most exciting time of my
career by quite a lot.”

The center had been founded in Oxford in 2011 to help maximize giving and career
impact. “I thought ‘this thing is perfect,’” he told me in a Zoom interview,
“we’re going to figure out exactly where money should go, and how to improve the
world, and if we’re wrong, we’ll figure out how we’re wrong and then we’ll fix
that.”

But he soon grew disillusioned with the utilitarianism of rationality culture,
which focused so intently on quantifiable markers of success — the number of
people on college campuses recruited into EA-approved professional fields, say —
that it seemed to leave out something profound about the other side of human
life.

Effective altruism, he found, “depowered a lot of people. It made them less
interesting and vibrant as people, and more like — trying to fit into a slightly
soulless bureaucracy of good-doing.”

Likewise, Tyler Alterman, a former director of growth at the Centre for
Effective Altruism — and now somewhere in the postrationalist landscape —
describes his experience there as analogous to a “vitamin deficiency.”

“I took that vitamin deficiency to just be the type of necessary sacrifice that
one needs in order to think clearly and to put aside one’s personal desires in
order to effectively save and improve lives,” Alterman said in a phone
interview. Only crushing depression — exacerbated by a severe stress-related
gastrointestinal illness — made him re-examine his priorities. “It took a few
years of basically being disabled, disabled to the point of not being able to
work, to realize that, oh, actually these things that I deemed to be irrational
or, like, useless types of creativity, were essential to my functioning.” He
longed to form genuine friendships based on mutual affinity and understanding,
rather than by screening potential friends for qualities that would “make them a
good ally, which will contribute to you both working on existential risk
together in an effective way.”

Alterman wasn’t just concerned with his own personal happiness. He was also
increasingly convinced that intuition could be useful for the broader
rationalist project: namely, figuring out the truth about the world, and using
that knowledge to save it from itself.

Tyler Alterman gives a TEDx talk on “Morality for a Godless Generation” in 2014.
Yogesh Patel / TEDx CUNY / Flickr

“It turns out that, like, intuition is incredibly powerful … an incredibly
powerful epistemic tool,” he said, “that it just seems like a lot of
rationalists weren’t using because it falls into this domain of ‘woo stuff.’”

These critiques were not isolated ones. More and more rationalists and
fellow-travelers were yearning to address personal existential crises alongside
global existential risks. The realm of the “woo” started to look less like a
wrong turn and more like territory to be mined for new insights.

This wasn’t totally out of left field, even for rationalists. They even had a
word for such impulses, according to a former employee of the Center for Applied
Rationality, Leah Libresco Sargeant, who writes regularly on how rationalism led
her to her Catholic faith. They called it “pica,” after a compulsion that causes
people to eat dirt or other non-food objects, and that is often a sign of
nutritional imbalance.

“When people respond to something,” Sargeant told me in a phone interview,
“there’s some hunger here. What is that hunger aimed at? And can you aim it at
the right thing?” Whatever was out there, and however you could (or couldn’t)
justify it with propositional truth-claims or Bayesian reasoning, it probably
pointed to something worth exploring.

‘BETTER TO BE INTERESTING AND WRONG … ’

By the late 2010s, the rationalist landscape had started to shift, becoming
increasingly open to investigating, if not necessarily the truth claims of
spirituality, religion, and ritual, then at least some of their beneficial
effects. (A rationalist mega-meetup I attended while researching an article in
2018, for example, included a talk on Tarot.) Wider rationalist-verse
institutions like Leverage Research — a controversial, Peter Thiel–funded think
tank that employed heavily from within the rationalist and effective altruism
communities — began to look into more esoteric topics, such as “intention
research”: how practitioners of bodywork, energy healing, or mesmerism could use
nonverbal cues to subtly influence the mindset of the people on whom they
worked.

The rationalists weren’t the only ones experiencing pica. Over the past decade
or so, several different intellectual — and less-than-intellectual — subcultures
have become far more open than they once were to the language and imagery of the
spiritual, the magical, and the religious, and to the traditions that once
sustained them.

There’s the rise of what you could call popular neo-Jungianism: figures like
Jordan Peterson, who point to the power of myth, ritual, and a relationship to
the sacred as a vehicle for combating postmodern alienation — often in uneasy
alliance with traditionalist Christians. (A whole article could be written on
Peterson’s close intellectual relationship with Roman Catholic Bishop Robert
Barron.) There’s the progressive-coded version you can find on TikTok, where
witchcraft and activism and sage cleansing and “manifesting” co-exist in a
miasma of vibes. There’s the openly fascist version lurking at the margins of
the New Right, where blood-and-soil nationalists, paleo bodybuilders, Julius
Evola–reading Traditionalists like Steve Bannon, and Catholic sedevacantist
podcasters make common cause in advocating for the revival of the mores of a
mystic and masculinist past, all the better to inject life into the sclerotic
modern world.

But the specific postrationalist version of this tendency is all the more
striking for the fact that its genesis lies in a subculture ostensibly dedicated
to the destruction of all thoughts non-rational. For example, when I was writing
a piece on the rationalists for Religion News Service in 2018, I attended a
rationalist-affiliated “Secular Solstice” in New York — a non-theistic version
of Hanukkah in which a series of (battery-operated) candles were lit and
subsequently extinguished to represent the snuffing out of superstitions. The
ceremony culminated (or would have culminated, if a stubborn candle hadn’t
refused to go out) in total darkness, during which we were invited to meditate
upon the finality of death, the non-existence of God, and the sole avenue for
hope: supporting — financially, intellectually, or otherwise — quixotic
scientific initiatives capable of prolonging life, or of eliminating death
altogether.

It’s possible, of course, to look at the rise of the postrationalists as merely
the kind of development you’d see in any online subculture that lasts more than
a couple of years: the replacement of one model of discourse or fandom by its
younger, self-proclaimedly punker cousin. And, certainly, there’s something even
more extremely online, and extremely 2020s, about postrationalism’s freewheeling
eclecticism. If rationality culture arose out of a very specific early-2000s
blog culture — big-name essayists like LessWrong’s Eliezer Yudkowsky and Slate
Star Codex’s Scott Alexander, meticulously parsed by hyper-serious interlocutors
in the comments section — “postrat culture” is no less wedded to its own
particular medium: Twitter, along with a backchannel network of private group
chats and Discord servers and Zoom rooms.

Like their rationalist forebears, the postrationalist community has its own
blogger-luminaries — Venkatesh Rao at Ribbonfarm; Sarah Perry, also a Ribbonfarm
contributor and author of the anti-natalist manifesto Every Cradle Is a Grave;
and David Chapman at Meaningness. But the postrationalists also have a more
anarchic side, marked by the ubiquity of pseudonymous Twitter micro-celebrities
— like eigenrobot (43k followers), and Zero H. P. Lovecraft (98k), who has
rejected the postrat label but is widely followed by them — whose accounts, like
Vogel’s, sometimes blend sincerity and shitposting. They share some of
rationality culture’s shibboleths — a fondness for speaking in obscure jargon, a
commitment to an Overton Window so wide it might as well be a glass house, a
contempt for the “wokeness” they see as stifling free intellectual discourse.

But they’re also far more likely to embrace the seemingly irrational — religious
ritual, Tarot, meditation, or the psychological-meets-spiritual self-examination
called “shadow work” — in pursuit of spiritual fulfillment, and a vision of life
that takes seriously the human need for beauty, meaning, and narrative. Today’s
postrationalists might be, for example, practitioners of Vajrayana Buddhism, or
they might adopt the carefully choreographed practices of self-proclaimed
radical agnostic and ritual artist Rebecca Fox, who designs bespoke rituals she
refers to as “psychospiritual technology.” The movement’s defining maxim —
according to at least one person familiar with the movement I spoke to — might
be a proclamation by writer Sarah Perry: “It is better to be interesting and
wrong than it is to be right and boring.”

“The virtue of darkness is to face reality at its harshest without looking
away.” — a speaker at a “Secular Solstice” event on Zoom in 2020
Screenshot of a YouTube video uploaded by Rachel Shu

‘DAWN OF THE METATRIBE’

The online rationalist ecosystem had become wider — and weirder — sparked in
part by the organic if tech-boosted formation of communities on Twitter, where
people-you-may-know algorithms were increasingly connecting members of the
burgeoning postrationalist scene with old-school rationalists. These connections
only intensified during the pandemic, when people’s lives moved more online and
the sacrifices engendered by isolation made many under the rationalist umbrella
more conscious of the importance of embodied community.

Tyler Alterman, expanding upon a term he first heard on the Intellectual
Explorers Club podcast by Peter Limberg, has called this new wider social
landscape the metatribe. In a September 2020 Twitter thread influential enough
that several people I spoke to for this piece seemed aware of it, Alterman
declared the year 2020 the “dawn of the metatribe.”

The metatribe, Alterman wrote, “is neither nihilist nor locked onto an ethical
system. It has political opinions without being left, right, or center….
metatribers often appear to be ‘heterodox.’” The metatribe, furthermore, “is
scientific without scientism. It is spiritual while being neither new age nor
traditionally religious.” It includes both members of the specific “postrat”
subculture as well as thinkers from other subcultures caught up in the wider
postrationalist turn.

The “metatribe” is not the only term used by members or ideological
fellow-travelers. Others call it “the liminal web,” “the sense-making web,” or
the “intellectual deep web.” Who counts as metatribe members — rationalists,
postrationalists, metamodernists, or accounts that just post good memes — is
hardly set in stone.

But we can identify in the modern metatribe a distinct cultural and intellectual
identity: at once hyper-aware of the problems posed by human irrationality, and
committed to the notion that our emotional and spiritual lives are as
fundamental to human flourishing as our intellectual ones. Being “allergic to
political ideology,” as Alterman told me in our interview, metatribe members are
often political magpies, taking their practices and theories from across the
ideological spectrum, even as their commitment to radical openness renders them
at times uncomfortably close to more explicitly right-wing circles like the
intellectual dark web — at least according to their progressive critics.

They’re interested in dissolving the barriers between intellectual disciplines,
as well as between the mental and embodied life — Alterman compares them to the
ancient Greeks, where “it wasn’t that unusual for someone to be both a
philosopher and a wrestler.” And while few of them find a home among the
seemingly implausible dogmas of traditional, organized religion, they’re far
more willing than their rationalist forebears to see in religious, spiritual, or
even esoteric or occult practice an avenue toward self-transformation in the
service of a meaningful life. They at once evoke the classic Californian
Ideology famously described in 1995 by Richard Barbrook and Andy Cameron — equal
parts hippie mysticism and relentless self-development — and subvert its linear
narrative of human progress.

John Vervaeke, a cognitive scientist at the University of Toronto whose work
many of the metatribe cite as highly influential, calls the metatribe’s practice
a re-inventio. In Latin, Vervaeke told me in a Zoom interview, inventio means
both “invent” and “discover.”

“It’s all mixed together,” Vervaeke said. “This is one of the most culturally
significant things that is happening right now — this re-inventio of what
sacredness means and what that experience of sacredness points to…. [which] has
moved out of being the proprietary purview of the established religions.”
Religions, Vervaeke says, are “still important partners in this process, but
they don’t have the monopoly on it anymore.”

Vervaeke’s work deals extensively with what he calls the modern “meaning
crisis”: the idea that we don’t understand what, exactly, we’re living for.
According to Vervaeke, rationalist culture — to say nothing of our contemporary
world more broadly — has, in its technocratic worship of human computing power,
lost sight of the more complex questions involved in living the good life, or
what he calls wisdom.

“Our Cartesian reduction of rationality to sort of computational abilities, and
then the reduction of that to just communication and communicative manipulation
— we have lost a lot,” said Vervaeke. “That notion of rationality that
[rationalists] are making use of is seriously truncated, seriously missing what
most of the ancient world thought they were referring to. They used words like
logos and ratio, and those older notions of rationality were bound up with
wisdom, were bound up with practice, with the use of the imaginal” — that which
the imagination can transform, rather than the merely imaginary — with “ritual,
transformation, aspiration. And so all of those things are now coming back in.”

Wisdom, for Vervaeke — as for the metatribe generally — is something distinct
from, if related to, the raw, computational processing power so prized by the
rationalist community. It involves a holistic approach to thinking — what does
it mean to live a good life? — that can’t be quantified the way you can
quantify, say, the number of malaria nets you’ve sent to the developing world.
It’s also, unlike the often hyper-individualistic (and autodidact-focused)
rationality culture, deeply wedded to a conception of tradition, and of the
collected insights of others more broadly, as a source of intellectual value.

To mature in wisdom, after all, takes a village, says Vervaeke. “You have to
acquire identities and roles and responsibilities and virtues … in order to
properly become wise. That takes a community that is willing to hang with you
for a long time, which means the best shot of finding such a community is one
that has a tradition and a history behind it.”

In this sense, the metatribe project is as much about recovery as it is about
progression: reviving a vision of communal life, communal responsibility, and
communal reverence for the sacred that the atomized modern world has rendered
increasingly rare — while still embracing the freedom and technological comfort
modernity has made possible. As another plugged-in postrat, who tweets as
@bigmastertroll, told me via Twitter DM: “There’s a sense where modernity is
kind of great, because sacredness leads to irrationality, problems, wars etc.
That maybe trading off a bit of guaranteed meaning for more choice, warm homes
and less violence … is actually a pretty good deal. At the same time,
[postrationalists] are basically all mourning the loss of enchantment.”

“In a way,” he wrote, “you can summarise post rationality, as, like, ‘how do you
get the sacred without violence.’”

‘TO DISCOVER THE LAWS OF MAGIC AND BECOME … ’

If the metatribe reflects anything about our wider cultural moment, it is our
shared disillusionment with the broader liberal optimism the rationalists have
come to embody. The promise proffered by so much of Silicon Valley — that we can
hack our way to Enlightenment, transcending our humanity along the way — no
longer seems plausible amid the broad ennui and general pessimism that has
settled into our culture over the last decade.

As Vogel puts it:

> The reason that history feels both out of control and stagnant is because
> we’re alienated from it. Our spirits can’t actively participate in it. We
> constantly engage with it in terms of geopolitics, or trying to build the
> right kind of AI, or design the right kind of society. We’re like “Can we
> figure out the formula for making people into great founders?”

At its best, the kind of holistic questioning preferred by many metatribe
members offers a way forward beyond either liberal optimism or the atavistic
pessimism of the New Right: How can we effectively and altruistically advocate
for the good life of others, when we ourselves aren’t sure what the good life
even means?

But it’s also true that metatribe discourse remains more wedded to contemporary
liberal individualism than many of its members might care to admit. While they
are often hungry for, and vocally supportive of, the kind of communal spiritual
practices and rituals that might anchor a well-lived life, they’re more
interested in spirituality’s function — how it contributes to personal
fulfillment — than in its claims to truth, which they largely treat as
irrelevant or obsolete.

The postrationalists’ interest in religion and spirituality, says Leah Libresco
Sargeant, can be likened to the wider metatribe interest in psychedelics: both
are tools in the service of better thinking or optimized life perspective. She
describes their general mindset as: “I think religion is very powerful and it’s
interesting that religious people have greater life satisfaction. So should I
try doing religion like it’s recreational drugs to see where that takes me?”

Religion, meditation, magic, occultism, shadow work — all these, in the
metatribe model, are mere avenues for self-development and self-transcendence.
Cast a love spell, go to church, attend a 5Rhythms ecstatic dance class, take
psychedelic mushrooms — all of these, functionally, amount to the same thing: an
injection of what foundational postrat writer David Chapman calls meaningness.

“Meaningness,” Chapman writes on his website, entails that “meaning is real but
not definite. It is neither objective nor subjective. It is neither given by an
external force nor a human invention.” It requires, he says, taking from the
eternalist stance the commitment that human beings do, and indeed should,
experience the world as a locus of meaning, and from the nihilist stance the
rejection that there is a single “eternal source of meaning” behind it. Or, as
Sarah Perry puts it, “There’s no One True Ritual Order that’s going to survive
forever. The best hope is maybe there are [ritual] micronutrients or vitamins
that we can discover, and then figure out how to supply them under different
technological regimes.” Spirituality exists not in itself, but for us.

What we see here is the positing that meaning is good and important and out
there, but also the conviction that, in practice, the precise contours of that
meaning are up to us to decide. It’s spirituality for a secular age: anchored by
the conviction that reality is downstream of our personal psychological power.

If there is a doctrine underpinning both rationalist and postrationalist
thought, it is this quintessential liberal faith in human potential, combined
with an awareness of the way in which human imaginal power does not merely
respond to, but actively shapes, the world around us. The rationalists dreamed
of overcoming bias and annihilating death; the postrats are more likely to dream
of integrating our shadow-selves or experiencing oneness. But both camps evince
a profound faith in what we might call human godliness: the idea that we are not
only the recipients of the world around us but also its creators. Indeed, it’s
little wonder that so many metatribe members find themselves drawn to esoteric
or occult spiritual schools of thought like chaos magick or Traditionalism,
schools in which it is difficult to distinguish the human power to shape and
persuade from the outright supernatural.

In his Harry Potter and the Methods of Rationality — perhaps old-school
rationalists’ most effective recruiting text — Eliezer Yudkowsky is clear that
part of the appeal of rationality is the promise of self-overcoming, of becoming
more than merely human. Harry, we learn, “wants to discover the laws of magic
and become a god.” Yet it is rationality, in the end, that gives Harry the
godlike powers of understanding, and shaping, his world — a world that,
Yudkowsky tells us, will one day be one in which “the descendants of humanity
have spread from star to star” and “won’t tell the children about the history of
Ancient Earth until they’re old enough to bear it; and when they learn they’ll
weep to hear that such a thing as Death had ever once existed!”

The metatribe may have different, well, methods. But their goal, too, is
self-transcendence. As Vogel told me: “Both Nietzscheanism and the occult
discourse of — the hermeticism — and even modern rationality: a thread through
all of these things is the implicit desire to become a god.” Doing the
psycho-spiritual work necessary to unchain yourself from mere human facticity is
the only way out of the tragic mire of ordinary human life.

Does Vogel, personally, wish to be a god?

He declined to answer on the record.

TOPICS

 * Social Media
 * On the Modern Project


Tara Isabella Burton is the author of Strange Rites: New Religions for a Godless
World (Public Affairs, 2022), The World Cannot Give: A Novel (Simon & Schuster,
2022), and the forthcoming Self-Made: Creating Our Identities from Da Vinci to
the Kardashians (Public Affairs, June 2023).
Tara Isabella Burton, “Rational Magic,” The New Atlantis, Number 72, Spring
2023, pp. 3–17.
Header image: Hannah Yoest
