Elon Musk, tech experts call for pause on AI systems amid risks to humanity

Billionaire Elon Musk and a number of tech experts called in a letter Wednesday for a pause on artificial intelligence (AI) experiments and systems, citing risks to society and humanity.

In recent months, AI labs around the world have been "locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one, not even their creators, can understand, predict, or reliably control," said the letter published on the website of the Future of Life Institute, a nonprofit organization based in Massachusetts, US.

The necessary level of planning and management for a technology that could bring "a profound change in the history of life on Earth" is not happening, said the letter, which was signed by dozens of tech experts, including Apple co-founder Steve Wozniak.

"Contemporary AI systems are now becoming human-competitive at general tasks, and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?" the letter said.

Signatories argued that such decisions must not be delegated to unelected tech leaders, and powerful AI systems should be developed only once humanity is confident that their effects will be positive and their risks will be manageable.

They called for AI labs to immediately pause for at least six months the training of AI systems more powerful than GPT-4, the latest multimodal large language model released earlier this month by San Francisco-based research lab OpenAI.

"This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium," they urged.

The signatories included prominent professors of physics, computer science and neural networks, co-founders of many US-based tech firms, and researchers at private institutions.
