THE BROAD VIEW

Human this Christmas

ChatGPT can program a reader but only mimic a writer

    ChatGPT and other artificial approximations of human expression have a hype cycle: They promise, they catch hold, but ultimately under-deliver. PHOTO: NYTIMES
    Published Fri, Dec 23, 2022 · 10:00 AM

    EVERYONE in my professional life – fellow faculty members, other writers – is up in arms about ChatGPT, the new artificial intelligence (AI) tool that can write like a human being.

    Tech is not supposed to be human. It is only ever supposed to be humanoid. But this chatbot can take multiple ideas and whip up a cogent paragraph. The professional classes are aghast.

    Some of us professors are primarily obsessed with assessment and with guarding the integrity of, well, everything. We scan essays into proprietary cheating detectors and tut-tut when a program flags a suspiciously high proportion of copied text. For at least 10 years, academics have argued over how best to root out computer-assisted cheating. Should we build better tests, or scare students straight like a 1980s after-school special? We are split.

