Musk, experts urge pause on training of AI systems that can outperform GPT-4

GPT-4 is the newest version of OpenAI’s language model systems. PHOTO: AFP

MASSACHUSETTS, United States – Mr Elon Musk and a group of artificial intelligence experts and industry executives have called in an open letter for a six-month pause in the training of systems more powerful than OpenAI’s newly launched model GPT-4, citing potential risks to society and humanity.

The letter, issued by the non-profit Future of Life Institute and signed by more than 1,000 people – including Mr Musk, Stability AI chief executive Emad Mostaque, researchers at Alphabet-owned DeepMind, and AI heavyweights Yoshua Bengio and Stuart Russell – called for a pause on advanced artificial intelligence (AI) development until shared safety protocols for such designs were developed, implemented and audited by independent experts.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter said.

The letter also detailed potential risks to society and civilisation from human-competitive AI systems in the form of economic and political disruptions, and called on developers to work with policymakers on governance and regulatory authorities.

The letter comes as European Union police force Europol on Monday joined a chorus of ethical and legal concerns over advanced AI like ChatGPT, warning about the potential misuse of the system in phishing attempts, disinformation and cybercrime.

Mr Musk, whose carmaker Tesla is using AI for its Autopilot system, has been vocal about his concerns.

Since its release in 2022, Microsoft-backed OpenAI’s ChatGPT has prompted rivals to launch similar products, and companies to integrate it or similar technologies into their apps and services.

Mr Sam Altman, chief executive at OpenAI, has not signed the letter, a spokesman at Future of Life told Reuters.

OpenAI did not immediately respond to a request for comment.

“The letter isn’t perfect, but the spirit is right: We need to slow down until we better understand the ramifications,” said expert Gary Marcus, an emeritus professor at New York University who signed the letter.

“They can cause serious harm... The big players are becoming increasingly secretive about what they are doing, which makes it hard for society to defend against whatever harms may materialise.” REUTERS
