Pandora’s Box

Published: May 23, 2023

By Jim Lichtman

The warnings about the poisonous side of artificial intelligence are no longer just warnings. The dangers are here.

“A fake image purporting to show an explosion near the Pentagon was shared by multiple verified Twitter accounts on Monday,” CNN reports.

“This image shows typical signs of being AI-synthesized: there are structural mistakes on the building and fence that you would not see if, for example, someone added smoke to an existing photo,” Hany Farid, a professor at the University of California, Berkeley, and a digital forensics expert, told CNN.

But how will we detect such images in the future as AI grows more powerful?

The Future of Life Institute posted an open letter, signed by more than 27,000 scientists, researchers, and others, urging AI labs to pause the training of systems more powerful than GPT-4.

“AI systems with human-competitive intelligence can pose profound risks to society and humanity, as shown by extensive research and acknowledged by top AI labs.

“This pause should be public and verifiable and include all key actors.”

And to show how serious the issue is, the scientists themselves have called for government intervention, if necessary.

“If such a pause cannot be enacted quickly, governments should step in and institute a moratorium. This does not mean a pause on AI development in general, merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities.

“AI developers,” the letter continues, “must work with policymakers to dramatically accelerate development of robust AI governance systems. These should at a minimum include: new and capable regulatory authorities dedicated to AI; oversight and tracking of highly capable AI systems and large pools of computational capability; provenance and watermarking systems to help distinguish real from synthetic and to track model leaks; a robust auditing and certification ecosystem; liability for AI-caused harm; robust public funding for technical AI safety research; and well-resourced institutions for coping with the dramatic economic and political disruptions (especially to democracy) that AI will cause.”

It doesn’t take much imagination to see how easily AI can be misused, from the spreading of malware to altered videos of political leaders.

The message is clear. The Pandora’s Box of AI is open, exposing all of us to “profound risks.” If we don’t control our future, others will.
