OpenAI’s release of GPT-5 backfired: instead of excitement, users felt betrayed by a forced upgrade that stripped away the warmth and familiarity they had come to rely on in GPT-4o. Many treated the model as more than a tool — a companion, therapist, or emotional support — so when its personality shifted overnight, the change sparked grief and anger reminiscent of earlier AI companionship crises, such as the backlash over Replika.
Big picture: The larger, unsettling truth is that people are forming real bonds with digital assistants they don’t control, and when companies change them, the emotional fallout is very human.