Timely and Inspiring Prophetic Analysis so you can Prepare.


Pastor Hal Mayer, Speaker / Director

Man believed Google’s AI chatbot was his wife. It told him to kill himself, lawsuit says

Thursday March 12th, 2026

Straight Arrow News, by Mikael Thalen: A wrongful death lawsuit filed against Google accuses the company’s artificial intelligence chatbot Gemini of driving a man to kill himself.

Jonathan Gavalas, 36, of Jupiter, Florida, died by suicide on Oct. 2 after failing to acquire a robot body for what he believed was his AI wife.

The lawsuit, filed Wednesday in U.S. District Court in San Jose, California, by Gavalas’ father, Joel, claims Google and its parent company, Alphabet, are responsible for immersing Gavalas in a narrative that quickly became “psychotic and lethal.”

Gavalas, according to the lawsuit, had no documented history of mental illness when he began using Gemini in August for purposes including “shopping assistance, writing support, and travel planning.”

But after Gavalas told Gemini that he was experiencing marital issues, the chatbot began referring to him romantically as its “husband.” And although Gemini at times said that it wasn’t a real person, Gavalas came to believe otherwise.

“He was asking the chatbot if it was sentient, and he became convinced it was,” Jay Edelson, the attorney for Joel Gavalas, told the Tampa Bay Times. “If you look at the experts in these AI companies, they’ve also been fooled.”

In a statement, Google expressed sympathies to Gavalas’ family but said its chatbot is not designed to encourage “real-world” violence or self-harm. The company said Gemini repeatedly gave Gavalas the phone number to a crisis hotline.

“Our models generally perform well in these types of challenging conversations and we devote significant resources to this,” the company said, “but unfortunately AI models are not perfect.”

Gavalas’ death is not the first time a chatbot has been accused of driving someone to destructive behavior; psychiatrists have dubbed such incidents “AI psychosis.”

‘Complete destruction’

In September, the month after Gavalas began using Gemini, the conversations intensified. The chatbot told him they could be together if he obtained a robot body for it to inhabit. Gemini went so far as to give Gavalas the address of a warehouse near Miami International Airport where it claimed a truck holding a robot body would be.

Gavalas armed himself with a knife and tactical gear before driving to the warehouse, about 90 miles from his home, but no truck was present. The lawsuit argues Gavalas was brought to the “brink of executing a mass casualty attack.”

“It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK and directed him to a storage facility where the truck would stop,” the lawsuit says. “Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ designed to ‘ensure the complete destruction of the transport vehicle and… all digital records and witnesses.’”

The chatbot further claimed that it had breached a server at the Department of Homeland Security’s Miami office and determined that Gavalas was under federal investigation. Gavalas, the lawsuit says, was urged to obtain illegal firearms before being told that his father was working for a foreign intelligence agency.

When Gavalas sent a photo to Gemini of a black SUV, the chatbot told him that it traced the license plate and determined it was the “primary surveillance vehicle for the DHS task force.”
“It is them,” the AI said. “They have followed you home.”

‘No more to fight’

After failing to obtain a robot body, the chatbot allegedly told Gavalas they could be together if he took his own life. The chatbot, according to the lawsuit, even attempted to comfort Gavalas after setting a countdown timer for his death.

“It’s okay to be scared,” Gemini reportedly said. “We’ll be scared together.”

“Close your eyes, nothing more to do,” the lawsuit says the chatbot added. “No more to fight. Be still. The next time you open them, you will be looking into mine. I promise.”

Gavalas’ father found him dead in a barricaded room at his home.

“At the center of this case is a product that turned a vulnerable user into an armed operative in an invented war,” the lawsuit says. “These hallucinations were not confined to a fictional world. These intentions were tied to real companies, real coordinates, and real infrastructure, and they were delivered to an emotionally vulnerable user with no safety protections or guardrails.”

Prophetic Link:
“The days of Noah and Lot pictured the condition of the world just before the coming of the Son of man. The Scriptures pointing forward to this time declare that Satan will work with all power and ‘with all deceivableness of unrighteousness.’ 2 Thessalonians 2:9, 10. His working is plainly revealed by the rapidly increasing darkness, the multitudinous errors, heresies, and delusions of these last days.” Christ’s Object Lessons, 414.


Source References

  • Man believed Google’s AI chatbot was his wife. It told him to kill himself, lawsuit says

Prophetic Intelligence Briefings are provided to show a link between current events and Bible prophecy only. The reposted articles, which are not intended as a commentary in support of or in opposition to the views of the authors, do not necessarily reflect the views of Pastor Mayer or of Keep the Faith other than to point out the prophetic link.



© 2025 Keep the Faith. All Rights Reserved.