The Godfather of A.I. leaves Google and warns of danger ahead

May 1, 2023

“I contributed to the development of a technology that has become dangerous to the world … and that is why I stepped away from working on it.” With these words, Geoffrey Hinton, dubbed the “godfather of artificial intelligence,” justified on Monday his decision to leave Google after years of successful partnership.

This announcement came as a surprise in the world of technology, especially since Hinton's pioneering work on neural networks underpins the artificial intelligence systems that power many contemporary products.

Despite this, Hinton announced that he left his job at Google last week in order to speak publicly about the "risks" of the technology he helped develop.

Through a decade of part-time work at the technology giant, Hinton contributed to Google's artificial intelligence development efforts, but he has since grown concerned about the technology and his role in building it, according to The New York Times.

“I console myself with the well-known excuse: If I hadn't done it, someone else would have,” Hinton said. In a tweet on Monday, he attributed his departure to a desire to speak freely about the dangers of AI, not to criticize the company.
In turn, the company's chief scientist, Jeff Dean, commented on Hinton's decision, saying, "He has achieved major breakthroughs in the field of artificial intelligence," expressing his appreciation for "a decade of contributions to Google."

Dean also affirmed the company's commitment to a responsible approach to AI, saying, "We are constantly learning to understand emerging risks while innovating boldly."

Hinton's decision to go public with his concerns comes as a growing number of lawmakers and advocacy groups warn that a new generation of chatbots could spread false information and eliminate large numbers of jobs.

Hinton is not the first Google employee to warn of the dangers of artificial intelligence. Last July, the company fired an engineer who claimed that one of its artificial intelligence systems had become conscious, a claim that many in the field strongly disputed.


Softimpact. All Rights Reserved © 2025 | Privacy Policy