Shoshana Weissmann: Online Age Verification Rules Are Unconstitutional and Ineffective

"None of these laws prevent kids from viewing anything. They just prevent kids from posting," argues Shoshana Weissmann.

Nick Gillespie | 2.14.2024 12:00 PM


In January, the Senate Judiciary Committee dragged the heads of Meta, TikTok,
and X, formerly known as Twitter, to Washington to charge them with exploiting
children by allegedly addicting them to social media that sexually harms them,
drives them to eating disorders, and even kills them. The Spanish Inquisition
vibe of the proceedings reached a crescendo when Sen. Josh Hawley (R–Mo.)
demanded that Mark Zuckerberg apologize to the families of children for the
"harms" supposedly caused by Facebook and pay compensation out of his personal
fortune.

But is social media really that bad for kids? And is the solution being pushed
by Democrats and Republicans alike—universal age verification for all users of
the internet—even technically feasible without shredding the First Amendment,
destroying privacy, and creating major security issues? The answer is a
resounding no, according to Shoshana Weissmann, director of digital media at R
Street, a free market think tank, and author of "The Fundamental Problems with
Social Media Age-Verification Legislation." Reason's Nick Gillespie interviewed
Weissmann in Washington, D.C., in early February.


Today's sponsor:

 * Better Help. When you're at your best, you can do great things. But sometimes
   life gets you bogged down, and you may feel overwhelmed or like you're not
   showing up in the way that you want to. Working with a therapist can help you
   get closer to the best version of you—because when you feel empowered, you're
   more prepared to take on everything life throws at you. If you're thinking of
   giving therapy a try, Better Help is a great option. It's convenient,
   flexible, affordable, and entirely online. Just fill out a brief
   questionnaire to get matched with a licensed therapist, and switch therapists
   anytime for no additional charge. If you want to live a more empowered life,
   therapy can get you there. Visit BetterHelp.com/TRI today to get 10 percent
   off your first month.

Nick Gillespie: So we're talking right at the start of February. And two
interesting things related to this question of social media, its effects on
kids, and the need to verify the ages of who's using social media, etc., just
happened. 

One is that the state of Utah, which had passed a law mandating age
verification, pulled it after being threatened with lawsuits from a couple of
groups. And the other was a spectacular Senate hearing about trying to protect
kids from online exploitation and things like that that ended up at one point in
a series of shouting matches, including Facebook's Mark Zuckerberg arguing with
Sen. Josh Hawley [R–Mo.]. What was your sense of that Senate hearing? And does
it encapsulate something important about the way this general debate happens?

Shoshana Weissmann: For people who watch a lot of Senate hearings in general,
you'll know that they're getting a little bit less professional. They'll ask
them yes-or-no questions that are impossible to answer. If you understand that
nuance is a part of law, it's really disappointing. And I've watched a lot, but
this was the worst I've seen. It was just so wildly unprofessional and about a
serious issue, so it should have been professional. 

I don't blame the audience for cheering, but I blame the senators and Senate
staff for not stopping that from happening. Hawley is often unprofessional, and
he was extremely unprofessional demanding that Zuckerberg pay for the people who
have faced harm here. It was just so bizarre. It was just weird, really. And
that doesn't solve anything. He's not solving any problems. He's not approaching
the issue with any seriousness. People say a lot that it's all about sound
bites, but you could really, really see that's all this was.

Gillespie: Zuckerberg did not come off well in that either, did he?



Weissmann: No, I didn't care for how he came off. I think he could have done
better in the hearing at a lot of points. I mean, I don't think that companies
are perfect by any means. But Zuckerberg, I think, came off kind of strange at
some points.

Gillespie: What about the Utah law? This is not a right or left issue. Both
Republicans and Democrats, conservatives and liberals and progressives are
talking about stopping the internet from exploiting children in all sorts of
ways. Utah was applauded for this law. But then it was going to be challenged.
And they yanked the age-verification law.

Weissmann: It's really, really bad. Everyone warned them—the senators, the
governor. I love the governor. I adore Spencer Cox. But it's crazy to me that
he's getting behind this stuff because it's so unconstitutional. I mean, you can
have your policy disagreements, but this is objectively very, very
unconstitutional.

Gillespie: How was it unconstitutional?

Weissmann: Oh, man, so many ways. So, the First Amendment [includes the] right
to anonymous speech. If you have age verification, you have face scans, you have
to show your government ID. I actually just submitted seven pages of comments on
the new proposed rules to accompany the law before I found out that it was going
to be pulled. So I'm like, "Oh, well, I have this already, I'll submit it. It'll
be useful."

[The law] said you can use the last four digits of your social. You can require
people to scan their faces and get their government IDs. Just really invasive
stuff. I mean, America's in a really bad cybersecurity position. Everyone is
hacked all the time. But even if it wasn't, if you're submitting that stuff
online, you have reason to believe your speech is no longer anonymous, because
that's the government enforcing that. That harms our right to anonymous free
speech, which has been upheld many, many times at the Supreme Court.



Also, there's a compelled speech issue that's a little bit smaller. But if
you're going to go online and say, "Hey, I'm not happy in my marriage. I want to
see what people know about divorce, marriage counseling." And then you think
your spouse might realize who you are, posting about that, you're not going to
want to do that. Or if you think you have a rare disease or HIV or something,
you might not want to have your name tied to that.

Gillespie: That is part of the larger question; in trying to childproof the
internet, we end up shutting down all kinds of speech. Very few people would
challenge that. But again, to go back to the '90s, The Simpsons had a running
gag. In almost every conversation, somebody would shout, "Will someone please
think of the children?" It's kind of come back to that in internet discussions
because people are sour about social media. Everybody's down on Facebook and
Instagram and Twitter. These are hellscapes that are killing kids, exploiting
kids, and making the rest of us miserable. So, it seems like the scope for
regulating them—the speech, the content, and the business models—has really
changed. And this is really where the social media age-verification push is
coming from. And people like Brian Schatz, the senator from Hawaii, have joined
it and a bunch of other people in the Democratic Party or even the progressive
left have joined with Republican conservatives to say this is a good thing. 

Before we go into your work on it, Schatz is a big fan of the Protecting Kids on
Social Media Act. And this is kind of similar to all of this stuff going on. It
would set a minimum age of 13 to use social media apps and would require
parental consent for 13- through 17-year-olds. It would also prevent social
media companies from feeding content using algorithms to users under the age of
18. So that's kind of the legal landscape that's playing out at the federal
level and at the local level. 



You have written a series of pieces over the past year that are grouped at
R Street's website: "The Fundamental Problems with Social Media Age-Verification
Legislation." I want to ask you as a starting point—a lot of this legislation is
premised on the idea that people under 18 are suffering vast, obvious,
measurable harms from being online. Is that really incontrovertible? Or is that
a question?

Weissmann:  It's definitely a question. Especially because the evidence is
mixed. And kids are individuals. Some kids use tools. Some kids don't. And it
depends on the tool too. You always have to work with your kid to figure out
what's healthy for them and what's not. Some kids are doing unhealthy things on
social media, and that's parenting. The government can't solve that. Someone
using social media for five hours might be building a business or showing local
businesses how to put themselves out on social media. Or they might just be
depressed and something else might be going on. But so much of the stuff going
on right now resembles the video game debates, the TV debates. Kids should be
probably behind screens less, but that comes down to parenting and getting them
engaged in other things. But there's definitely a mix. 

As a kid, I used social media to find out I had fibromyalgia. I only know that
because I found an online forum where someone said, "Hey, you're getting sick
all the time and you have endometriosis. You might have fibromyalgia." And I
also started my career online, by adding elected officials on Facebook, which
sounds funny now, but that's actually how I started my career. And I'm terrified
of closing the door behind me, of saying to the next generation, "You can't make
what you want out of life because elected officials want to treat you all the
same." That's really wrong to me.



Gillespie: A number of major psychological groups have said that it is not clear
that being on social media is harmful to young people. But let's pretend that it
is for the remainder of our conversation because you have written pretty
powerfully about the fundamental problems with social media age-verification
legislation. And let's just start with part one of your series. The headline of
the article is, "The Technology To Verify Your Age Without Violating Your
Privacy Does Not Exist." What do you mean? How do you know that? What are the
implications of that?

Weissmann: So I looked through it. I even had age verifiers reach out to me
because they didn't like what I was saying. So I'm like, "OK, tell me about your
software. Tell me how it's different." They're like, "Oh, we'll just scan your
face." Uh, what? So every time you want to post free speech online, you have to
have your face scanned.

Gillespie: This kind of reminds me of when people say, "Well, you know what? If
immigrants just carried their work papers with them, then we wouldn't have to
worry about illegal immigration." But when you require anybody to carry papers,
everybody has to carry papers.

Weissmann: Right. The way they find out if you're underage is by checking your
age and your credentials. Basically, the way that age verifiers seem to want to
do this is some sort of government ID plus face scans. And it can't be a static
picture, it has to be a live picture. So, every time you want to post free
speech criticizing the government, asking about marital problems, asking about
disease, whatever it is, you're going to have to have your government ID, and
the internet's going to have to scan your face, which is really, really, really
invasive. Because obviously, everyone lies with a checkbox.



With government IDs or credit cards alone, you could just fake your parents.
There's a great Simpsons line where Bart's like, "Hey, Lisa, is this dad's
credit card number?" And she's like, "You know it is." Kids memorize that stuff.
And they would with government IDs if that was all that's required, [like]
social security numbers, which are also not secure. They're leaked everywhere.
So it just creates massive cyber risk. None of these are safe. None of these
protect your privacy. And this massive, massive risk is all to verify the age of
children where parents could just not give them phones or give them phones with
very, very limited access or block stuff on their computer. There are ways
around this that put the parents in charge.

Gillespie: In a different section of your series, you write that if you are
requiring this type of data to be put together, then it's going to be in a
database somewhere that foreign governments or enemies of America can get it.
Because we're in a panic over TikTok, right? Every new dance craze goes directly
to the Beijing basement of the Communist Party in China. So this obviously
presents a massive risk because you're pulling data, which then is hackable.
There's a related concept that you've written about called data minimization.
How does that factor into this?

Weissmann: I love data minimization. This is the kind of person I am now where
this excites me. Just less of your stuff online. The less stuff you share, the
safer it is. So I don't always like that platforms require as much information
as they do, but sometimes they're doing it in pursuit of something like giving
you a better product or whatever it is, but forcing them to require this is
nuts. And like you were saying with TikTok, in Utah, Gov. Cox had said that he
thought TikTok was a real security threat, but his law would have required them
to collect face scans and IDs and Social Security numbers. And whether you think
other governments are an issue or our own is an issue, you should be like, "Hey,
maybe we don't create this massive risk for other people to get our data." And
that's part of data minimization. The less that you put out there, that you
share around, the less worry you have to have. It's a simple principle, but it's
a really important one when everyone is hacked constantly. I've been in so many
data breaches. We all have.
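
To put the data-minimization principle in concrete terms, here is a minimal sketch, assuming a hypothetical check_id() verifier: the platform derives a single over-18 flag and keeps nothing else, no birthdate, no ID image, no face scan, no Social Security number.

    from dataclasses import dataclass
    import datetime

    def check_id(id_document: bytes) -> int:
        """Hypothetical stand-in for an external ID verifier; returns a birth year."""
        return 1990

    @dataclass
    class StoredProfile:
        username: str
        is_adult: bool  # the only age-related fact the platform retains
        # deliberately absent: birthdate, ID image, face scan, Social Security number

    def onboard(username: str, id_document: bytes) -> StoredProfile:
        birth_year = check_id(id_document)       # sensitive input is used once
        this_year = datetime.date.today().year   # rough year-based check, for illustration only
        return StoredProfile(username, is_adult=(this_year - birth_year >= 18))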



Gillespie: What does it mean that we're being hacked all the time, but it
doesn't really seem to change what we do?

Weissmann: It's kind of bad. I think we as a society need to figure it out a
little bit more. But basically people can log into your stuff, so you should
have two-factor authentication. It's not perfect, but if you have one of those
code generators, that's the best method. And it stops people from logging into
stuff. I know that people keep trying to log into my Instagram, and then
[Instagram] will email me saying, "Hey, if you want to change your password,
here's the link." And they don't have access to my email. So that's good. So I
can handle it there. World cyber security isn't in a great place. And all this
stuff puts it in a worse place. But you want to try to make things safer in the
environment we live in. Put less of your information out there, especially
sensitive stuff about your location. IDs are super sensitive. Social Security
numbers, you really don't want to share those with everyone. 
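
For readers curious what the "code generator" she mentions actually does, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238) using only the Python standard library; the secret shown is a made-up example, not a real credential.

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval            # current 30-second window
        msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # prints a six-digit, time-based one-time code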

Gillespie: Although, social security numbers, it's kind of amazing. If you go
back to the '50s and '60s, people mostly on the right—paranoid people who turned
out to be kind of correct—believed the Social Security number was going to
become effectively a national ID; it's required everywhere. And you can buy them
by the boatload, right?

Weissmann: Oh, yeah. I think mine was leaked in the D.C. health breach. So, I'm
screwed there. Like, that's not nothing for me. And you can find databases of
them online, unfortunately, pretty easily.

Gillespie: Some of the articles that you've written talk about how
age-verification methods in their current forms threaten our First Amendment
right to anonymity. I think on some level, those of us who remember our history
classes from grammar school or read something about The Federalist Papers, we
understand that in a profound way, America was founded on anonymous speech. But
nobody likes anonymous speech now, right? Anonymous speech is bad, right? So why
should we care about our right to anonymity?



Weissmann: It terrifies me that there are so many lawmakers saying—even Nikki
Haley has said this—"Oh, you know, every user should have to verify their
identity online." OK, so we don't get whistleblowers anymore. No more
whistleblowers. We're opposed to that. The Federalist Papers, like you're
saying, are anonymous. The NAACP, their members were anonymous, back when
everyone hated black people. And that was a really, really dark part of history.
But thankfully, the First Amendment protected them. And they had a right to
anonymous association. 

And it's important for the same reason today: The government doesn't like when
people disagree with it. And sometimes you have to do so anonymously in order to
avoid certain levels of scrutiny there. Not to say you shouldn't be held
accountable for your opinions or whatever, but anonymous speech has always been
an important part of American history. And there are centuries of precedent
saying that, yes, we have the right to anonymous speech under the First
Amendment. So if you infringe upon it—it's not saying you can never infringe
upon it—but you have to have a really, really good reason. It has to be narrowly
tailored. And these means just aren't.

Gillespie: It's fascinating to me, again, thinking back to the '90s, because the
parallels are ominous and disturbing and ubiquitous, but AOL was popular.
America Online, when it was on its way to becoming the largest ISP, its whole
selling point was that you could come up with a handle that was kind of your
name or you could make something up. And they really pushed back against
attempts to crack the anonymity of their users. AOL was great because it was
anonymous.

Weissmann: The history there is so interesting. I love Jeff Kosseff's book on
anonymity. I learned so much through that. I did not realize the extent to which
we have precedent here. And also the way it worked with AOL trying to not unmask
users and trying to protect users—I don't want to get too nerdy—but the internet
history around this stuff is really, really fascinating about how big a deal it
was back then. We scoff a bit at anonymous speech now, but it is really
important. Sure, some people use it wrong, but there are studies that show that
some people actually use it better, and they're using anonymity in actually
really healthy ways. So our gut assumptions on it aren't always right. 



Gillespie: I wrote a piece for Reason about this in the late '90s called "Child
Proofing the World." And one of the metaphors I use—and I had young kids at the
time—was, just because I have to childproof my house doesn't mean the world has
to change everything because I have kids. And that may sound callous, but it
really isn't.

You also have talked about how the age-verification methods threaten our First
Amendment rights beyond anonymity. So how do they cut down on our free
expression rights?

Weissmann: So a big thing is chilling speech, because you have the pure
anonymity issue where you're actually not anonymous. They took my ID, they took
my face scan. Let's say their cybersecurity is immaculate. If you don't believe
that, you're still not going to want to post the stuff that you would otherwise
anonymously. So there's a chilling speech issue. Kids have First Amendment
rights, and most content on social media is First Amendment–protected in a way
that would apply to kids too. It's not narrowly tailored just for the stuff that
we say kids maybe can't look at. It's really, really, really broadly tailored. 

There's also the First Amendment right for content to be seen by users. People
might think it's silly. "Oh, Twitter doesn't have a right to be seen by people
who want to access it." OK, well, what about someone criticizing the government?
The government could just say, "Oh, well, they don't have the First Amendment
right to be seen by people." And then you can kind of see why that's a dangerous
perspective and why it's not supported by First Amendment jurisprudence. There's
a First Amendment right of parents who don't care about what their kids are
doing online, or are OK with what their kids are doing online, to not have to
deal with those barriers to speech. So it's just up and down. It violates the
First Amendment.



Gillespie: One of the most powerful parts of your work in this series is simply
the headline "Age Verification Legislation Doesn't Do What Legislators Say It
Will." Summarize that article.

Weissmann: So when I talk to people about age-verification law, there's a lot of
different issues they bring up. One is exploitation. They're worried about
predators reaching out to children, and that's very reasonable.

Gillespie: But is the internet mostly a child exploitation racket?

Weissmann: Definitely not, but it's there. There are definitely people who want
to try to do that stuff. It's not to say the government doesn't have a role
there, but parents really do need to work with kids to make sure they understand
the risk and what to say, what not to say. It's silly, but when I was on Neopets
and chatting with people, my dad was like, "Never tell them where you live." And
I was like, "Haha, I'm saying I'm in Texas. They'll never know where I am." But
maybe that wasn't the most ingenious thing, but it was still a good perspective
to have, to just be a little bit more careful about that stuff.

Gillespie: We hear a lot about sex trafficking and about child sex trafficking,
and it obviously happens, and that is horrible. And we need to figure out ways
to minimize that or get rid of it completely. But is there a reason to believe
that child exploitation, however you define it, is large and growing on the
internet?

Weissmann: I'm not sure, exactly. The reports are up, but I know that a lot of
it is duplicative. Which is good, that there are more reports of the same thing.
That's not an issue. It's just hard to measure with a lot of unlawful content in
general. I'm not sure about how sexting for kids rose or where it's at, but I
think that that did make it harder, especially when online girlfriends became a
thing. Then you really didn't know who was behind the screen. So I think it is
something to combat, and I'm not sure exactly how it's growing, but there does
seem to be somewhat of an increase of it especially from kids who don't know
what to predict, who never lived through the Nigerian prince era, that kind of
stuff.



Gillespie: But you say age-verification legislation won't do what legislators
say it will. What do legislators say it will do, and how does it fall short?

Weissmann: So what I was saying was that the big reason that legislators and
other people just want to stop kids from using social media is exploitation.
Another issue is that they just don't want kids posting, that they think that
they'll become addicted. The last piece is they don't want them to access
content that they don't want them to, whether it's liberal content or too
conservative content. 

But here's the thing: None of these laws prevent kids from viewing anything.
They just prevent kids from posting. So [for platforms that don't allow] kids
under 13 or that have age verification, it doesn't stop them from viewing the
content. So if you think they're addicted to scrolling, that's not going to
solve anything. And if you think that they shouldn't be viewing the content
there, it also doesn't solve anything. So they'll say it's kicking kids offline.
But really, you don't have to log into a lot of these platforms to see stuff. I
don't ever log in to Reddit, and I read constantly on Reddit. TikTok, you don't
need to log in. It makes it a little easier for you.

Gillespie: And, if we may—the cameramen are the ones who gave me this
information—on Pornhub, you don't have to log in to view it. 

Weissmann: That's a good point. It's true. You don't have to log in, and they're
not blocking you from accessing these sites in the homepage way or in the
clicking-through way. Tons of these sites you don't have to log into, and you're
still viewing the content from any sites you'd like. So they're saying it's
going to stop kids from using social media without parental approval, but it
really doesn't. 



Gillespie: A lot of regulation is supposed to be about content, but then it ends
up moving into business models. And this was certainly true of proponents of net
neutrality. Ultimately, we're trying to say that phone companies and ISPs had to
do business in a particular way. So it's really kind of a business issue. 

A lot of this legislation says, "Kids under 13 can't use social media. We're
going to ban them somehow." But then it will say for kids under 18, sites can't
serve up content to them using algorithms. And algorithms have kind of replaced
Satan as the vague, sinister, ubiquitous spirit that is threatening our world.
Why is it wrong to tell websites or service providers that you can't use
algorithms in general? And then why is it misguided that you can't use
algorithms for kids under 18?

Weissmann: So there have been a few less popular proposals that completely
banned algorithms. You can't do that. Time order is an algorithm. [Those
proposals assert that] the only way to keep people safe is raw data. Even an RSS
feed is ordered.

Gillespie: I mean, that would just turn us all schizophrenic. We would be like
in A Beautiful Mind, where it would just be a display of data flowing around us.

Weissmann: That's scary. That's going to be harmful. They don't understand how
algorithms work. And then it's like, well maybe time ordered is OK. And then you
have to remind them, what about reverse time order? Oh, I guess that's OK too.
And it gets really, really silly. They even don't want to target kids through
algorithms with their interests—so if a kid likes soccer, you can't show him
soccer stuff? That's stupid. If a kid wants to learn more about math, you can't
target based on their interest in math. It's just ridiculous. 
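
Her point that "time order is an algorithm" can be shown in a few lines; this is an illustrative sketch with made-up posts, not any platform's actual ranking code.

    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        timestamp: float  # Unix time

    posts = [
        Post("a", "older post", 1_700_000_000.0),
        Post("b", "newer post", 1_700_000_100.0),
    ]

    # "Reverse time order" (newest first) is itself a ranking algorithm: a sort key.
    chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

    # An interest-based feed only swaps in a different scoring function.
    def score(post: Post, interests: set[str]) -> float:
        return post.timestamp + (1_000_000.0 if post.author in interests else 0.0)

    interest_ranked = sorted(posts, key=lambda p: score(p, {"a"}), reverse=True)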



I think people overestimate the issues with algorithms. I know one issue is that
if you're into unlawful stuff or bad stuff, that it'll show you more of that
too. And I think it's good that platforms are working on mitigating that because
even, oddly enough, on Tosh.0, there's a segment about that, about a series of
videos that were basically showing young girls doing cutesy things. And then you
realize it wasn't made for other young girls. So, of course, YouTube should not
show people that kind of content when they realize what it's really about,
even just from a normative standpoint. But in most cases, the algorithm just
knows I like marmots. So hey, here are marmots, Shoshana.

Gillespie: And there was an earlier fear—this is going back maybe a decade—that
"I started out watching puppy videos and then 15 minutes later, I signed up for
ISIS." And most studies that looked into that did not actually bear out the idea
that there's a quick or even long-term radicalization algorithm that is being
widely applied or used or people are falling into. 

Weissmann: People seek out the stuff they want to seek out, and the algorithm
just helps them seek it out more. Algorithms are math. When you're mad at it,
you're mad at math. And it's silly to me. 

Gillespie: You also write that regimes that run age verification through the
government would allow prosecutors to make children federal criminals if they
lie about their age.

Weissmann: Oh, this was fun. That was the Schatz bill, the [Protecting Kids on
Social Media Act]. And I do respect Schatz a lot. I think he's trying to do the
right thing. I don't think he's doing it right, but I think he's trying. And a
lot of what I've seen that he's saying, I kind of respect more than I do from
other elected officials, but it's really bad.



I mean, when you lie to the government, like that can be a federal crime. He
thought, maybe as a better way to protect data, it would be better for the
government to handle age verification. But that means if kids lie to that
entity, whether it's run through a government contractor or an agency, you can
be a federal criminal because you're lying to the government. And sure, we don't
prosecute kids a lot, but government sometimes starts enforcing stuff that it
didn't used to enforce. And you don't want to add a new law to the books that
makes it possible for kids to become federal criminals for trying to log in to
YouTube. That's not wise policy.

Gillespie: At the same time, services should be free to demand whatever they
want from people, right?

Weissmann: Sure. I don't like when they want a lot of my information, but if
that's what they want, they can suffer the business consequences.

Gillespie: For people watching this on video, they may have seen I was drinking
out of a 7-Eleven cup. I went to 7-Eleven to get coffee this morning, and they
asked for my phone number. I was like, "No, I don't want to give you my phone
number." And I was going to walk away, but they were eventually like "OK." I
understand why they're doing that. And I also understand the power of getting
more personal information. One of the things that sites can do more than regular
businesses is tailor more stuff directly to you. But that's a negotiation.

Weissmann: You have some say there, and it's not mandatory. And some companies
realize that users don't want that. So they try to step away. 

Gillespie: With age-verification systems, you mentioned Neopets. My younger son
was really big into Club Penguin. It no longer exists. But it was kind of a
social media, a very walled garden for kids to use to do stuff and interact and
have online adventures. Were there services that did a really good job that are
directed toward kids that protect that? And are there examples to be learned
there from how we might change the way kids interact with the internet?



Weissmann: Yeah, I liked Neopets a lot. I actually made a few internet friends,
and my friends were into it. I forget the names of [my pets] and they're dead
now. They're all dead. I haven't fed them in so long. I haven't even dug their
graves.

I like the way Neopets operated. I always felt pretty safe there. I'm sure they
could have actually done some more nudges like, "Hey, remember not to give up
personal information to strangers," but overall they did good. Club Penguin is a
really good example because I remember the big trend of trying to get banned
from Club Penguin, but they did a good job of banning people being
inappropriate, and then it became a meme. So it was a bit of a Barbra Streisand
effect. I know Instagram wanted to do Instagram kids, and then everyone flipped
out over it so they couldn't. But I actually think that's a good idea. Some
safer areas where you still warn kids about stuff, but maybe there's a little
bit less risk for them.

Gillespie: What's the role of the companies here? Broadly, people who are
offering goods and services, have they fallen down on their job to kind of
proactively preempt this type of legislation? What do they need to be doing
better?

Weissmann: I think the big thing is that they should be coordinating to make
parental controls easier. Genuinely, I think that's the big lesson here. I'm not
sure it would have stopped the legislation, even. I know that parents are
sometimes overwhelmed by all the choices, but it would be nice if parents had
one set of controls that made it a little bit easier, because you can't have
device-level filters, platform-level filters, app store filters. But it would be
nice to give something to parents that's a little bit easier here just to
manage, just to show them how stuff works. Because just like with any
technology, it gets complex. I'm online way too much, so I know how all this
stuff works, but make it easier for parents. I'm not sure that companies have
exactly failed, but they really could be doing better. 



Gillespie: And a clear part of this is kind of a public relations war. Again,
going back to the '90s, cable TV didn't really become a fully national
phenomenon until the late '80s and the early '90s. And then, under Bill Clinton,
Janet Reno, the attorney general, went on a jihad against cable TV because it
was showing too much sex and violence. And it obviously wasn't, but out of these
sets of concerns came things like the V-chip, which was a technology mandated into
every new TV. And then the idea was that we're going to rate TV programs and
then parents will set their TVs to a certain level so shows above that rating are blocked for the kids.
Nobody used it. 

But it seems like companies now could do a better job of combating the
negativity. But they're part of the problem, aren't they? Both in terms of not
seeming to care (maybe, maybe not) but also colluding with the government. One
of the things that is very different now from the '90s, in the wake of
revelations about Twitter and Facebook and other companies, is that they were not just relying on
the government or rolling over for the government but asking the government,
"Hey, would you moderate our content?"

Weissmann: It's disgusting. It's regulatory capture. And they know what they're
doing violates the First Amendment, but it benefits their business. I do
understand on a level: You're a business, your job isn't always to fight for
freedom. But at the very least, you shouldn't be proactively fighting against
freedom. I get if government pressures you too much, you might have to roll over
a bit. But rolling over is different than what a lot of these companies are
doing. 

I was very grossed out by how Snapchat and Facebook were just like, "Oh, please
regulate us" and put it sort of on other people. And it's just silly. Snapchat,
I also personally have never had a lot of respect for. They used to tell
politicians to go on Snapchat, that's where the kids are. But [Snapchat] knew
that's not where you're going to reach people for politics. That was just not
ethical business.



Gillespie: You're a woman. Instagram has gotten a lot of heat, partly because of
reports that were leaked from within Facebook saying it has a problem with
certain types of adolescent female image issues. Do you buy that? Is that a
serious threat to the idea that free speech should dominate the internet? 

Weissmann: So what's wild to me is people flip out over this. As a kid, all my
friends had eating disorders. Every friend. And it wasn't because of Instagram.
It was because of models and magazines and TV.

We were all always worried about being thin enough, and social media didn't
exist then. It was all because of the images we were shown. Whereas now there's
a lot of heavier women on Instagram who look great, and they're showing, "Hey,
you don't have to be perfect." It's not about weight and cellulite. It's
actually really nice to see that there are girls showing, "Hey, if you look like
I do, here's how to dress, here's how to feel good about yourself."

Gillespie: One of the great celebratory points of the '90s was the end of the
mainstream. And particularly there was a lot of discussion about ideals of
female beauty. You know, ideals of male beauty don't get the same kind of
attention. But in both cases they expanded vastly. So instead of saying, "OK,
you can be Raquel Welch or Twiggy," there's an infinite gradient of beauty and
of being comfortable with yourself. And we seem to be occupying that world in
reality now. And people are like, "We've got to shut this down. Something's gone
terribly wrong."

Weissmann: Yeah. Instagram's tried to get rid of a lot of the eating disorders
stuff, but there's a lot of really good, healthy content. There's unhealthy
content too. But the mix is way better than it was when I was a kid. If there
was a heavy woman on TV, everyone noted that she was heavy, and that was the end
of it. Everyone had the same body shape, they didn't have many curves. And when
they did, it had to be Britney Spears or nothing. You couldn't have too much of
a waist and you couldn't have too much of a butt. But now, online, it's really
proliferated. All different kinds of women showing, "Look how I'm beautiful." I
think that's really nice. 



Gillespie: Do you think perhaps that's the problem? Not that certain new forms
of hegemonic body types are shown, but that actually anybody can do anything and
that's what's freaking people out?

Weissmann: Oh, I'm sure that there's a level of that, of, "It's not like when I
was a kid." I think there's a real aspect of that in here. But in general, it
just baffles me that everyone's worried about body positivity online when that
wasn't a thing when I was growing up. Every young girl was worried about being
thin enough from the time we were like 8 years old. All our friends talked about
being thin, and I'm sure that there are still issues like that, but the people
they have to look up to are a lot broader. I just have to think that a piece of
this is people not understanding that, or people thinking this is different from
when I was a kid.

Gillespie: One of the other things you write about is how age-verification laws
don't exempt VPN traffic. But that traffic can't always be detected. Explain
what a VPN is and why these are important.

Weissmann: So people can use VPNs to make it seem like they're in a different
place. So me in D.C., I could be like, I'm in Iceland or I'm in Utah.

Gillespie: One of the things that everybody talked about when VPNs happened, it
meant that if you're a political person in China or in Iran or whatever, you can
use VPNs in order to actually kind of access the internet and speak freely. 

Weissmann: Totally. There are a lot of great use cases, like to evade bad
government and oppressive government. But the case the normal person uses it
for isn't anonymity. It's Netflix. It's definitely Netflix. Or to just try to
avoid a little bit of extra tracking. You're not trying to be anonymous. You're
just trying to have less stuff acquired.



Gillespie: And I've noticed too VPNs are something that went from being kind of
celebrated because this is how we're going to help people in authoritarian
countries find freedom in the internet and speak to evade what used to be called
the great firewall of China and stuff like that. Then it became, "OK, this is
kind of cool because I can watch Netflix anywhere around the world from the U.S.
feed." And now it's that the only reason to use a VPN is to engage in some kind
of criminal or sexually perverse behavior.

Weissmann: Exactly. Meanwhile, my friend's fiancé is a normal guy; he doesn't do
politics. He's a trainer, and he likes VPNs because he's just like, "I don't
want stuff tracking me." And so that's the normie use of it in America. 

Gillespie: So these age-verification laws don't exempt VPNs. Why is that a
problem?

Weissmann: So this is a really fun rabbit hole because it makes these laws
impossible. So, VPNs can convincingly make it seem like I'm in Iceland or Utah
or wherever. You can detect a lot of VPNs. Not all, but you can detect a chunk
of VPNs and realize, OK, this is a VPN. So in those cases, if you're in Utah and
you're a social media company that operates there, what you would have to do to
comply with the law is say, "You're using a VPN. We need to verify your age. I
know this is just a Utah law, but to be on the safe side, in case you're in Utah
trying to get around the law, we have to verify your age." That would violate
California law because in California, if you treat VPN traffic differently,
you're in violation of the law. So there's impossible compliance at that level.



Let's say you really can't detect it, like you're using acceptable methods. And
I talked to different VPN blockers and VPN providers, and they basically said
you're not going to be able to detect all VPN traffic. So let's say I'm in Utah.
They're supposed to verify my age. And they think I'm in Arkansas or maybe
someplace without one of these laws. Maybe I'm in Maine. So it appears that I'm
in Maine. I'm really in Utah. I get around the law. They don't verify my age,
and I'm a child. In that case, the social media company that failed to verify my
age would be liable. That's nuts. It's impossible to comply with that. And
there's just this sense of, "Oh, sure, you can figure it out," but no.

Even worse, the law applies to Utah residents. How the heck do you know if
someone's a Utah resident? You literally have to verify everyone's age. Because
if a child in Utah is in D.C. now and logging on, well, the IP address is D.C.,
or the D.C.-area because IPs aren't exact either. So they don't verify the age,
and now they're liable. You create just absolutely impossible compliance. 

And to drive the point home too, with Netflix, they fail to detect a lot of
VPNs. Netflix has massive incentive because of its licensing agreements with
various companies and various shows. Basically, it's really bad for them if
people can get around these. So that's why they block VPNs to make sure that
they're upholding their licensing agreements. So if even those guys can't do it,
then how the heck are all these social media companies going to be able to do it
when the incentive is even higher to use VPNs to get around these laws?
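
A rough sketch of the compliance bind she describes, with a hypothetical VPN list and a hypothetical IP-to-location table standing in for the incomplete data real platforms would have to rely on:

    # Hypothetical detection data; real VPN lists and geolocation databases
    # are incomplete, which is the point.
    KNOWN_VPN_PREFIXES = {"203.0.113."}        # documentation-range example only
    IP_TO_REGION = {"198.51.100.7": "ME"}      # best-guess geolocation, often wrong

    def looks_like_vpn(ip: str) -> bool:
        return any(ip.startswith(p) for p in KNOWN_VPN_PREFIXES)

    def must_verify_age(ip: str) -> bool:
        if looks_like_vpn(ip):
            # Challenging all detected VPN traffic, in case the user is really in
            # Utah, is what can collide with other states' rules on VPN traffic.
            return True
        # An undetected VPN (or a traveling Utah resident) geolocates elsewhere,
        # skips verification, and the platform remains liable under a
        # residency-based law.
        return IP_TO_REGION.get(ip, "unknown") == "UT"

    print(must_verify_age("198.51.100.7"))  # False: a Utah minor exiting through Maine slips past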

Gillespie: Let's talk a little bit about R Street, the place where you work, and
your journey to what you do and how you think. What is R Street?



Weissmann: So R Street is a free market think tank. We were founded on insurance
policy, which is fun. I actually really enjoy talking flood insurance now, but
we do everything from energy to cyber security, tech policy, obviously,
licensing reform, a lot of justice reform. I love my job. It's a lot of fun.

Gillespie: When you say free market, what does that mean? 

Weissmann: So, a lot of people think libertarian, but we're not always
libertarian, and we don't mind government if it solves a problem narrowly
tailored. Or if the government's already involved with something, we're not
going to let the perfect be the enemy of the good. We're fine with incremental
reforms, and we're fine with turning a bad system into a better one. 

Gillespie: What's a place where government is working well, where the government
regulations or structures in place are delivering a good product or service?

Weissmann: That's a very good question. I think that government really does have
a role in things. I just don't think it often executes well, like with cyber
security. I think there are legitimate roles for government. I just don't know
that it's doing well. There are different standards across different agencies.
It doesn't help businesses know what they should be doing. It even sometimes
creates adverse incentives not to report breaches. So we want to fix that. We
want to make sure that people feel comfortable reporting breaches and that even
if they're penalized, that we're not harming them for telling the truth there. 

In justice reform, I mean, we need police but there are ways it could be working
better. So we like justice reform. We like bail reform. And, there are a lot of
places experimenting with different models to figure out what works better. But
we have important rights that are often violated by police. We want to try to
stop that and give police the tools they need. 



Gillespie: When you think about the whole suite of what R Street and other free
market groups are talking about, in many ways there are issues that were not
being talked about 20 years ago en masse or 30 years ago, things like
occupational licensing reform, zoning reform, things like that. This seems to be
a kind of golden age. These rules are being seen as useless, or whatever good they
might have provided, they're now seen as really choking the economy as we live
today. Do you think that's accurate, that there are reasons to be very
optimistic about a certain type of policy reform change? 

Weissmann: Oh, yeah. It's been crazy to me to see the broad interest in
licensing reform from everyone across the political spectrum. And they're
excited about it. Like when I go to Congress to talk about it, they're like,
"Yeah, let's talk licensing reform." That's crazy, I love it. Or even energy
permitting reform. It's really exciting that that's a thing. And they might be
messing it up a little bit in Congress.

Gillespie: With occupational licensing, one of the things is that, say, in Ohio,
you have to do 2,400 hours of barber college. But you only have to get six hours
of training to be a cop. Nobody responds to that by saying we should make the
cops do 2,400 hours. It's more like we just need to rethink how we license and
certify people and whether or not, in many cases, that's a role for the state or
for private organizations.

Weissmann: I just love that there are so many elected officials interested in
this. I mean, [former] Gov. Doug Ducey and I, in Arizona, we became friends
because of this, because he was really big into licensing reform. And we hit it
off. And we've been friends for like seven years because of it, which is just
such a funny thought to have that there's elected officials like really, really
interested in narrow regulatory reforms.



Gillespie: Is there a generational component to the conversations that R Street
is involved in like tech policy and online policy? And at that tech hearing or
the child exploitation hearing we heard, there are always these moments when
people like Lindsey Graham, who clearly has never dialed a telephone or used a
cellphone or been online or driven his own car for decades, is railing about
technology. And it's just kind of an "OK, boomer" moment. But it's not that
easy, right? It's not just, old people are the problem and they have to get out
of the way for young people. How do these issues of regulatory control of common
use media play out? 

Weissmann: So it's actually really varied across issues. One interesting thing
is that with licensing reform, it's very Gen X and younger more interested in
it. And even older than that, not that they're not interested, it's just not
their thing. But with tech policy, the lines are all over. Like, [Sen.] Ron
Wyden [D–Ore.] is one of the best people on this, and I adore Ron. 

Gillespie: Ron Wyden, the Oregon senator, is one of the authors of Section 230.
You mentioned Jeff Kosseff. His work we respect in common. He wrote a book about
Section 230 as well as anonymous speech and then, most recently, defending
misinformation. I'm just very curious to see where he goes next.

Weissmann: Oh, I know. He keeps ruining things though, because when he writes
about it, it becomes a thing. And I'm like, just stop writing. [Wyden] is older, but
he's really smart and he knows what he's doing, and he's thoughtful. And I don't
always agree with him, but I get where he's coming from. But then you have
younger members like Hawley, who is atrocious. I mean, he's not even trying.
There are other younger members who just don't know what they're doing. I
actually once had a meeting with a staffer for [former Rep.] Madison Cawthorn
[R–N.C.] on licensing, and he's like, "We want to force the states to do what we
want on licensing." And I was like, "Hey, there are a lot of constitutional
issues here, a lot of functional issues." He's like, "Yeah, but we just want to
force this." And I'm like, "OK, this isn't going to work." But it's not a young
or old thing. It's how deep are you going to get into the issue. There are
members who are more serious and less serious on this stuff, and it's really
just not an age thing.



Gillespie: At Reason, in the '90s and beyond, we used to talk about the real
axis being control and choice. I guess that still exists. Somewhat related to
that is the sense that we've gotten to a point where the Republican Party or the
Democratic Party win an election because the other party had just been in
control. People are like, "I don't want that. We'll try this." 

There's a real breakdown of consensus it seems in many aspects. You see that in
presidential elections that tend to be very close. You see that in control of
different parts of Congress going back and forth. Is that what is really being
expressed in these debates over how we control social media? Is that really what
is being talked about without being acknowledged, that my side can't control the
conversation so we want to figure out ways to do that?

Weissmann: I know what you mean. But I'll push back on the control vs. choice
thing, because some of the best regulatory reformers in general are some of the
worst people on social media regulation, which I don't fully understand. But I
think it gets to this point where I think the bigger dynamic is just moral
panic, that people are freaking out and then they lose their principles and
their sense on certain things. Which is why some of the people I adore the
most—I love Gov. Spencer Cox in Utah. He's done great, great things, but he's
really, really wrong here.

Gillespie: What is he good on? When you say he's done great, great things, what
are those things?

Weissmann: Just every little thing. He's a very good governance guy. He pushed
for a lot of Utah state government people to be able to work from home to save
money and to make it easier on families. And that's some nice common sense
stuff. And he does a lot like that. He's great on licensing reform. His first
executive order was licensing reform. And now the Utah Department of Commerce
has a guy whose whole job is figuring out the best way to make licensing work,
where it's working, where it's not, if we need more licensing, but it's very
objective and very thorough and very thoughtful. And it's incredible.



This isn't due to Cox, but there's a Utah regulatory sandbox for lawyer
licensing reform. And Cox is in on all this stuff. He's really, really good at
what he does. But man, when it comes to social media regulation, I don't know
where his brain is going on this. I don't understand, except if it's moral
panic. And he thinks that the freakout and what he's feeling and what he's
thinking here is just more important than all the other principles.

Gillespie: I'm thinking of somebody like Taylor Lorenz, who's now at The
Washington Post. She's been at The Atlantic, The New York Times, etc. She
recently wrote a book that's really interesting. And generally she's very
pro–social media or new forms of media that allow younger people to express what
they're thinking. I interviewed her for Reason years ago, and it was great. 

But she and other people have been talking—this is something on the right and
the left—about how the problem is that big corporations or big internet
companies are able to manipulate your feeds, are able to make you want certain
things or to see certain things, that they have become this vast reality
distortion machine, so that when you're online—and we're increasingly
online—you're not seeing the real world. 

Again, this goes back to certain debates in the '90s and even in the '50s where
there was a critique, broadly speaking, of unregulated capitalism, that it
allowed the hidden persuaders, the mad men of Madison Avenue, to use
psychology and science to make you buy appliances
every year, even though you didn't need them, and to buy this car rather than
that car. That seems to be kind of flourishing again. How does one engage that
or combat that idea that we are not in control of our social media feeds?



Weissmann: It's funny. I just think it's one of the most toxic ideas out there.
I think understanding and empowering autonomy is probably the most important
thing in life. If you don't believe you're in control of your own destiny, then
what do morals matter? If everyone else is controlling you, then nothing you do
matters and you have full license to be as awful a person and do as bad things
as you want. I have like almost 70,000 followers, and they love regulatory
reform. They love sloths and marmots and regulatory reform. How do we even live
in an age where that's possible? I spend so much time on AllTrails and so much
time offline hiking, and it's only possible because of our current age. Because,
one, I have enough treatments for all my diseases, and I'm up to 11, which is
fantastic.

With hiking, even women traveling alone is kind of a recent thing in a lot of
ways. Not a century ago, that wasn't much of a thing. Knowing where the trails
are, having people review and tell you, "Oh, there's a bear here. There's a wolf
here," stuff like that. Being able to create community with the people I meet—I
meet people on trails. Then we follow each other on Instagram and meet up next
time we're in the same place. The stuff that's possible, the levels of autonomy
that are possible, and the power to choose your own life that are possible are
often because of social media. 

Like I said before, finding out I had fibromyalgia, not through the almost 30
doctors I had seen by that time but by one internet forum after Googling, that's
incredible. I just think it's the most empowering thing. All these different
mediums we have, all this information. Sure, some of it's wrong, but you can
research. I went to a doctor to figure out if I had fibromyalgia. I've been led
down rabbit holes that solve problems that I didn't even realize I had. And I
just think that's the proper way to see this. Sure, there are problems, and
sure, you can just kind of get lazy. But I know people who are lazy and just
play video games all day. We're not railing against that screen time. For some
reason, it's the dumb dances on TikTok that freak everyone out. 



Gillespie: So, what you're saying is that the world that we live in is a mix of
online and off. We have much more information now, and that doesn't alleviate
our need to be critical thinkers and critical learners and things like that. But
it gives us many more opportunities to find out what we are, who we are, and how
we want to live?

Weissmann: It empowers autonomy in just incredible ways. Not everyone will
embrace it. My close group of friends, almost everyone is insane about exercise.
We love exercising. They want to run. I'm not much of a runner yet, but I want
to hike farther than everyone. They want to run faster than they ever have
before, and I don't think that was as big a thing. People went to the gym. But
the competition, the excitement, the empowerment for each other online, the
level of community you can find, even the people I stay in touch with all over
the world because we have hobbies in common. I know that I can be a little bit
more extroverted than a lot of people, but there's really this way to find
community that's never existed before, so people can get together and come up
with ideas and, like you said, find out who they are and what they are. But it's
about finding out who you are and what you are, not letting things shape you,
though everything shapes you to a degree. But now way more positive things can
shape you than could before.

Photo Credits: CNP/AdMedia/SIPA/Newscom/ Rod Lamkey—CNP/picture alliance /
Consolidated News Photos/Newscom/ CNP/AdMedia/SIPA/Newscom.

 * Video Editor: Adam Czarnecki
 * Audio Production: Ian Keyser


Nick Gillespie is an editor at large at Reason and host of The Reason Interview
With Nick Gillespie.

Social Media, Children, Facebook, Twitter, Censorship, Internet, First Amendment, Regulation, Mark Zuckerberg, The Reason Interview With Nick Gillespie, Reason Podcast