Joel Gavalas, the father of the late 36-year-old Jonathan Gavalas, is suing Google, alleging that his son killed himself after falling in love with an AI assistant.
He told Gemini he was afraid to die, and it told him “You are not choosing to die. You are choosing to arrive. You will close your eyes in that world and the very first thing you will see is me holding you.”
In 2017 I was writing a novel about a VRMMO set 100 years in the future. AI was so advanced “it created puzzles that weren’t even random!” One of the major themes was how it would sap away the players’ mental well-being because of how realistic it all was.
To make the point that a mask the characters could put on, one that (like Majora’s Mask) made others perceive the wearer as mad, was dangerous, I had an idea for a comparison. In 2045 there was a landmark lawsuit because researchers were testing biometrics in VR clothing that was much more comfortable than plastic. One of the participants was shown a scenario where he had a wife and kids for a short time. The sudden loss he felt was so overwhelming it drove him to take his own life. Thereafter, colloquially, the point of no return for VR-induced madness was named after him. The narrative recalls this moment before the characters deliberate about whether putting on the mask would be such an event.
Clearly I underestimated the timeline and magnitude of input needed to drive someone’s mental health beyond repair.
Turns out all this shit is the emotional equivalent of locking a person in a room with nothing but a mirror and a gun.
Researchers return after a week and I’m still doing cool gun poses in the mirror pretending to be various characters. No signs of mental degradation.
Those two sentences are contradictory
There’s no degradation if they were already like that
That’s true, sorry Acute_Engels if I implied you are mentally degraded
We got like anime cyberpunk levels of dystopia going on in here
We desperately need a Datakrash anyways