reason.com
75.2.24.81
Public Scan
URL:
https://reason.com/2024/01/01/we-absolutely-do-not-need-an-fda-for-ai/
Submission: On January 02 via manual from US — Scanned from US
Form analysis
3 forms found in the DOM
GET https://reason.com/
<form role="search" method="get" class="search-form" action="https://reason.com/">
<label>
<span class="screen-reader-text">Search for:</span>
<input type="search" class="search-field" placeholder="Search …" value="" name="s">
</label>
<input type="submit" class="search-submit" value="Search">
</form>
POST
<form method="post" id="gform_0" class="recaptcha-v3-initialized"><input type="hidden" name="login_redirect" value="/2024/01/01/we-absolutely-do-not-need-an-fda-for-ai/">
<div class="gform_heading">
<h3 class="gform_title">Login Form</h3>
</div>
<div class="gform_body">
<div id="gform_fields_login" class="gform_fields top_label">
<div id="field_0_1" class="gfield gfield--type-text gfield_contains_required field_sublabel_below gfield--no-description field_description_below gfield_visibility_visible" data-js-reload="field_0_1"><label class="gfield_label gform-field-label"
for="input_1">Username<span class="gfield_required"><span class="gfield_required gfield_required_text">(Required)</span></span></label>
<div class="ginput_container ginput_container_text"><input name="input_1" id="input_1" type="text" value="" class="" aria-required="true" aria-invalid="false"> </div>
</div>
<div id="field_0_2" class="gfield gfield--type-text gfield_contains_required field_sublabel_below gfield--no-description field_description_below gfield_visibility_visible" data-js-reload="field_0_2"><label class="gfield_label gform-field-label"
for="input_2">Password<span class="gfield_required"><span class="gfield_required gfield_required_text">(Required)</span></span></label>
<div class="ginput_container ginput_container_text"><input name="input_2" id="input_2" type="password" value="" class="" aria-required="true" aria-invalid="false"> </div>
</div>
<div id="field_0_3" class="gfield gfield--type-remember_me field_sublabel_below gfield--no-description field_description_below hidden_label gfield_visibility_visible" data-js-reload="field_0_3"><label
class="gfield_label gform-field-label screen-reader-text gfield_label_before_complex"></label>
<div class="ginput_container ginput_container_checkbox">
<div class="gfield_checkbox" id="input_3">
<div class="gchoice gchoice_3">
<input class="gfield-choice-input" name="input_3.1" type="checkbox" value="1" id="choice_3">
<label for="choice_3" id="label_3">Remember Me</label>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="gform_footer top_label"> <button type="submit" id="gform_submit_button_0" class="gform_button button"
onclick="if(window['gf_submitting_0']){return false;} if( !jQuery('#gform_0')[0].checkValidity || jQuery('#gform_0')[0].checkValidity()){window['gf_submitting_0']=true;} "
onkeypress="if( event.keyCode == 13 ){ if(window['gf_submitting_0']){return false;} if( !jQuery('#gform_0')[0].checkValidity || jQuery('#gform_0')[0].checkValidity()){window['gf_submitting_0']=true;} jQuery('#gform_0').trigger('submit',[true]); }">Login</button>
<input type="hidden" class="gform_hidden" name="is_submit_0" value="1">
<input type="hidden" class="gform_hidden" name="gform_submit" value="0">
<input type="hidden" class="gform_hidden" name="gform_unique_id" value="">
<input type="hidden" class="gform_hidden" name="state_0" value="WyJbXSIsIjVmZDk0MDRiMTc0NTYwODJmYTIwNGZlZDYxN2ViYzJjIl0=">
<input type="hidden" class="gform_hidden" name="gform_target_page_number_0" id="gform_target_page_number_0" value="0">
<input type="hidden" class="gform_hidden" name="gform_source_page_number_0" id="gform_source_page_number_0" value="1">
<input type="hidden" name="gform_field_values" value="">
</div>
</form>
POST /2024/01/01/we-absolutely-do-not-need-an-fda-for-ai/#gf_17
<form method="post" enctype="multipart/form-data" target="gform_ajax_frame_17" id="gform_17" class="puprf-signup-widget recaptcha-v3-initialized" action="/2024/01/01/we-absolutely-do-not-need-an-fda-for-ai/#gf_17" data-formid="17" novalidate="">
<div class="gf_invisible ginput_recaptchav3" data-sitekey="6LeMnkUaAAAAALL8T1-XAyB7vxpOeTExu6KwR48-" data-tabindex="0"><input id="input_1efd75c4c074a2cc62f8fe9a651ea3c9" class="gfield_recaptcha_response" type="hidden"
name="input_1efd75c4c074a2cc62f8fe9a651ea3c9" value=""></div>
<div class="gform-body gform_body">
<div id="gform_fields_17" class="gform_fields top_label form_sublabel_below description_below">
<div id="field_17_1" class="gfield gfield--type-email gfield_contains_required field_sublabel_below gfield--no-description field_description_below hidden_label gfield_visibility_visible" data-js-reload="field_17_1"><label
class="gfield_label gform-field-label" for="input_17_1">Email<span class="gfield_required"><span class="gfield_required gfield_required_text">(Required)</span></span></label>
<div class="ginput_container ginput_container_email">
<input name="input_1" id="input_17_1" type="email" value="" class="large" placeholder="Email Address" aria-required="true" aria-invalid="false">
</div>
</div>
<div id="field_17_2" class="gfield gfield--type-honeypot gform_validation_container field_sublabel_below gfield--has-description field_description_below gfield_visibility_visible" data-js-reload="field_17_2"><label
class="gfield_label gform-field-label" for="input_17_2">Comments</label>
<div class="ginput_container"><input name="input_2" id="input_17_2" type="text" value="" autocomplete="new-password"></div>
<div class="gfield_description" id="gfield_description_17_2">This field is for validation purposes and should be left unchanged.</div>
</div>
</div>
</div>
<div class="gform_footer top_label"> <button type="submit" id="gform_submit_button_17" class="gform_button button"
onclick="if(window['gf_submitting_17']){return false;} if( !jQuery('#gform_17')[0].checkValidity || jQuery('#gform_17')[0].checkValidity()){window['gf_submitting_17']=true;} "
onkeypress="if( event.keyCode == 13 ){ if(window['gf_submitting_17']){return false;} if( !jQuery('#gform_17')[0].checkValidity || jQuery('#gform_17')[0].checkValidity()){window['gf_submitting_17']=true;} jQuery('#gform_17').trigger('submit',[true]); }">Submit</button>
<input type="hidden" name="gform_ajax" value="form_id=17&title=&description=1&tabindex=0&theme=data-form-theme='gravity-theme'">
<input type="hidden" class="gform_hidden" name="is_submit_17" value="1">
<input type="hidden" class="gform_hidden" name="gform_submit" value="17">
<input type="hidden" class="gform_hidden" name="gform_unique_id" value="">
<input type="hidden" class="gform_hidden" name="state_17" value="WyJbXSIsIjVmZDk0MDRiMTc0NTYwODJmYTIwNGZlZDYxN2ViYzJjIl0=">
<input type="hidden" class="gform_hidden" name="gform_target_page_number_17" id="gform_target_page_number_17" value="0">
<input type="hidden" class="gform_hidden" name="gform_source_page_number_17" id="gform_source_page_number_17" value="1">
<input type="hidden" name="gform_field_values" value="">
</div>
<p style="display: none !important;"><label>Δ<textarea name="ak_hp_textarea" cols="45" rows="8" maxlength="100"></textarea></label><input type="hidden" id="ak_js_1" name="ak_js" value="1704167647135">
<script>
document.getElementById("ak_js_1").setAttribute("value", (new Date()).getTime());
</script>
</p>
</form>
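The signup form above pairs a Gravity Forms honeypot field (the hidden "Comments" input, described as "for validation purposes") with an Akismet timestamp input. As an illustrative sketch only — this is not Reason's or Gravity Forms' actual server code, and the field names are simply borrowed from the markup above — a server-side honeypot check rejects any submission in which the hidden field arrives non-empty:

```python
# Illustrative sketch of server-side honeypot validation, modeled on the
# hidden "Comments" input (name="input_2") in form 17 above. The function
# name and rejection logic are hypothetical, not taken from any real plugin.

def is_spam_submission(post_data: dict) -> bool:
    """Flag a submission as spam if the honeypot field was filled in.

    Humans never see the honeypot input (it is visually hidden), so any
    non-empty value almost certainly came from a bot that auto-filled
    every field it found in the form.
    """
    honeypot_value = post_data.get("input_2", "")  # hidden "Comments" field
    return honeypot_value.strip() != ""


# A bot fills every input; a human browser submits the hidden one blank.
bot_post = {"input_1": "bot@example.com", "input_2": "Buy cheap meds"}
human_post = {"input_1": "reader@example.com", "input_2": ""}
```

The `autocomplete="new-password"` attribute on the real field serves the same goal from the other direction: it discourages browsers from helpfully auto-filling the trap on behalf of legitimate users.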
Text Content
Artificial Intelligence

WE ABSOLUTELY DO NOT NEED AN FDA FOR AI

If our best and brightest technologists and theorists are struggling to see the way forward for AI, what makes anyone think politicians are going to get there first?

Katherine Mangu-Ward | From the February 2024 issue

(Photo: @eshear/X)

I don't know whether artificial intelligence (AI) will give us a 4-hour workweek, write all of our code and emails, and drive our cars—or whether it will destroy our economy and our grasp on reality, fire our nukes, and then turn us all into gray goo. Possibly all of the above. But I'm supremely confident about one thing: No one else knows either.

November saw the public airing of some very dirty laundry at OpenAI, the artificial intelligence research organization that brought us ChatGPT, when the board abruptly announced the dismissal of CEO Sam Altman. What followed was a nerd game of thrones (assuming robots are nerdier than dragons, a debatable proposition) that consisted of a quick parade of three CEOs and ended with Altman back in charge. The shenanigans highlighted the many axes on which even the best-informed, most plugged-in AI experts disagree. Is AI a big deal, or the biggest deal? Do we owe it to future generations to pump the brakes or to smash the accelerator? Can the general public be trusted with this tech? And—the question that seems to have powered more of the recent upheaval than anything else—who the hell is in charge here?

OpenAI had a somewhat novel corporate structure, in which a nonprofit board tasked with keeping the best interests of humanity in mind sat on top of a for-profit entity with Microsoft as a significant investor. This is what happens when effective altruism and ESG do shrooms together while rolling around in a few billion dollars. After the events of November, this particular setup doesn't seem to have been the right approach. Altman and his new board say they're working on the next iteration of governance alongside the next iteration of their AI chatbot.

Meanwhile, OpenAI has numerous competitors—including Google's Bard, Meta's Llama, Anthropic's Claude, and something Elon Musk built in his basement called Grok—several of which differentiate themselves by emphasizing different combinations of safety, profitability, and speed.

Labels for the factions proliferate. The e/acc crowd wants to "build the machine god." Techno-optimist Marc Andreessen declared in a manifesto that "we believe intelligence is in an upward spiral—first, as more smart people around the world are recruited into the techno-capital machine; second, as people form symbiotic relationships with machines into new cybernetic systems such as companies and networks; third, as Artificial Intelligence ramps up the capabilities of our machines and ourselves." Meanwhile, Snoop Dogg channeled AI pioneer-turned-doomer Geoffrey Hinton when he said on a recent podcast: "Then I heard the old dude that created AI saying, 'This is not safe 'cause the AIs got their own mind and these motherfuckers gonna start doing their own shit.' And I'm like, 'Is we in a fucking movie right now or what?'" (Hinton told Wired, "Snoop gets it.") And the safetyists just keep shouting the word guardrails. (Emmett Shear, who was briefly tapped for the OpenAI CEO spot, helpfully tweeted this faction compass for the uninitiated.)

If even our best and brightest technologists and theorists are struggling to see the way forward for AI, what makes anyone think that the power elite in Washington, D.C., and state capitals are going to get there first?

When the release of ChatGPT 3.5 about a year ago triggered an arms race, politicians and regulators collectively swiveled their heads toward AI like a pack of prairie dogs. State legislators introduced 191 AI-related bills this year, according to a September report from the software industry group BSA. That's a 440 percent increase from the number of AI-related bills introduced in 2022.

In a May hearing of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, at which Altman testified, senators and witnesses cited the Food and Drug Administration and the Nuclear Regulatory Commission as models for a new AI agency, with Altman declaring the latter "a great analogy" for what is needed. Sens. Richard Blumenthal (D–Conn.) and Josh Hawley (R–Mo.) released a regulatory framework that includes a new AI regulatory agency, licensing requirements, increased liability for developers, and many more mandates. A bill from Sens. John Thune (R–S.D.) and Amy Klobuchar (D–Minn.) is softer and more bipartisan, but would still represent a huge new regulatory effort. And President Joe Biden announced a sweeping executive order on AI in October.

But "America did not have a Federal Internet Agency or National Software Bureau for the digital revolution," as Adam Thierer has written for the R Street Institute, "and it does not need a Department of AI now."

Aside from the usual risk of throttling innovation, there is the concern about regulatory capture. The industry has a handful of major players with billions invested and a huge head start, who would benefit from regulations written with their input. Though he has rightly voiced worries about "what happens to countries that try to overregulate tech," Altman has also called concerns about regulatory capture a "transparently, intellectually dishonest response." More importantly, he has said: "No one person should be trusted here….If this really works, it's quite a powerful technology, and you should not trust one company and certainly not one person."

Nor should we trust our politicians.

One silver lining: While legislators try to figure out their priorities on AI, other tech regulation has fallen by the wayside. Regulations on privacy, self-driving cars, and social media have been buried by the wave of new bills and interest in the sexy new tech menace.

One thing is clear: We are not in a Jurassic Park situation. If anything, we are experiencing the opposite of Jeff Goldblum's famous line about scientists who "were so preoccupied with whether or not they could, they didn't stop to think if they should." The most prominent people in AI seem to spend most of their time asking if they should. It's a good question. There's just no reason to think politicians or bureaucrats will do a good job answering it.

Katherine Mangu-Ward is editor in chief of Reason.

Topics: Artificial Intelligence, Future, Technology, Big Government, Regulation, Innovation

© 2023 Reason Foundation