Oh Good – AI Could 'Kill Many Humans' Within Two Years, Says Rishi Sunak’s Adviser

Matt Clifford warns of “very powerful” systems that humans could struggle to control without global regulation.
Asked what percentage chance he would give that humanity could be wiped out by AI, Matt Clifford said: “I think it is not zero.”

Rishi Sunak’s adviser on artificial intelligence has warned there are just two years to save the world from AI systems.

Matt Clifford said AI could have the capability to be behind advances that “kill many humans” within that timeframe.

He said that unless producers are regulated on a global scale then there could be “very powerful” systems that humans could struggle to control.

Even the short-term risks were “pretty scary”, he told TalkTV, with AI having the potential to create cyber and biological weapons that could inflict many deaths.

The comments come after a letter backed by dozens of experts, including AI pioneers, was published last week warning that the risks of the technology should be treated with the same urgency as pandemics or nuclear war.

Senior bosses at companies such as Google DeepMind and Anthropic signed the letter along with the so-called “godfather of AI”, Geoffrey Hinton, who resigned from his job at Google earlier this month, saying that in the wrong hands, AI could be used to harm people and spell the end of humanity.

EXCLUSIVE: The PM’s AI Task Force adviser Matt Clifford says the world may only have two years left to tame Artificial Intelligence before computers become too powerful for humans to control.

— TalkTV (@TalkTV) June 5, 2023

Clifford is advising the prime minister on the development of the UK government’s Foundation Model Taskforce, which is looking into AI language models such as ChatGPT and Google Bard, and is also chairman of the Advanced Research and Invention Agency (Aria).

He told TalkTV: “I think there are lots of different types of risks with AI and often in the industry we talk about near-term and long-term risks, and the near-term risks are actually pretty scary.

“You can use AI today to create new recipes for bio weapons or to launch large-scale cyber attacks. These are bad things.

“The kind of existential risk that I think the letter writers were talking about is… about what happens once we effectively create a new species, an intelligence that is greater than humans.”

While conceding that a two-year timescale for computers to surpass human intelligence was at the “bullish end of the spectrum”, Clifford said AI systems were becoming “more and more capable at an ever increasing rate”.

Asked on the First Edition programme on Monday what percentage chance he would give that humanity could be wiped out by AI, Clifford said: “I think it is not zero.”

He continued: “If we go back to things like the bio weapons or cyber (attacks), you can have really very dangerous threats to humans that could kill many humans – not all humans – simply from where we would expect models to be in two years’ time.

“I think the thing to focus on now is how do we make sure that we know how to control these models because right now we don’t.”

The technology expert said AI production needed to be regulated on a global scale and not only by national governments.

AI apps have gone viral online, with users posting fake images of celebrities and politicians, and students using ChatGPT and other large language models to generate university-grade essays.

But AI can also perform life-saving tasks, such as algorithms analysing medical images including X-rays, scans and ultrasounds, helping doctors to identify and diagnose diseases such as cancer and heart conditions more accurately and quickly.

Clifford said that AI, if harnessed in the right way, could be a force for good.

“You can imagine AI curing diseases, making the economy more productive, helping us get to a carbon neutral economy,” he said.
