Regulating AI will be essential. And complicated

The middle ground would be creating a new federal agency, like the SEC or the FDA, but staffed with experts on artificial intelligence

    • Governments and society will have to grapple with the issue of how to regulate artificial intelligence and who should do so.
    Published Mon, Apr 3, 2023 · 01:53 PM

    WHETHER or not calls for pausing artificial intelligence (AI) development succeed (spoiler: they won’t), AI is going to need regulation. Every technology in history with comparably transformational capabilities has been subject to rules of some sort. What that regulation should look like is going to be an important and complicated problem, one that I and others will be writing a lot about in the months and years to come.

    Before we even get to the content of the regulation needed, however, there’s a crucial threshold question that needs to be addressed: Who should regulate AI? If it’s government, which part of government, and how? If it’s industry, what are the right kinds of mechanisms to balance innovation with safety?

    I’d like to start by suggesting some basic principles that should guide our approach, beginning with government regulation. I’ll save the question of private sector self-regulation for a future column. (Disclosure: I advise a number of companies that are involved in AI, including Meta.)