
When a last-minute medical emergency forced an American traveler to cancel his dream trip to Medellín, Colombia, the response from his travel providers was swift and disappointing: “No refunds. No exceptions.”
“I booked a hotel and flights through Expedia,” the traveler shared in a now-viral Reddit post. “The hotel had a no-refund policy. The airline had a no-refund policy as well. I asked both, and both said absolutely not.”
Facing a loss of around $2,500, the traveler turned to an unlikely source: ChatGPT. And then everything changed.
First: a flat rejection. Then: an AI-assisted medical strategy
The user had not bought travel insurance, and both the hotel and the airline pointed to their non-refundable policies. Only after both companies had flatly denied any form of refund did the traveler decide to raise a legitimate health condition – generalized anxiety disorder (GAD) – supported by a doctor’s note.
“I asked ChatGPT to act as my lawyer and advocate for me,” he explained. “I used a medical excuse, which was GAD. I got a doctor’s note.”
The AI chatbot then helped craft a personalized, well-researched appeal based on Expedia’s and the hotel’s terms and the medical circumstances cited. It worked.
“ChatGPT went through Expedia’s policy, the hotel’s policy and the airline’s policy. Then it wrote a letter for me. The hotel gave me my refund on health grounds,” he added.
Screengrab from a viral post.
The airline said no – until AI flagged discrimination
The airline, however, stood firm. Under its policy, only terminal illness or death warranted an exception to the no-refund rule. GAD did not qualify.
“I shared their response with ChatGPT and it wrote me another letter to send to the international airline,” the traveler said.
“It laid out the reasons why and how my condition could actually affect the flight, and argued that they were discriminating on the basis of mental illness.”
Within an hour, the airline reversed its position and agreed to a full refund.
Reddit reacts: “This is the future”
While many users applauded the traveler’s persistence and clever use of AI, not everyone was impressed. One commenter sharply criticized the approach, suggesting it bordered on dishonesty.
One user wrote: “Congratulations.”
Another user commented: “ChatGPT could pay for itself a hundred times over with just this one use.”
A third added: “Reminder: Don’t take ‘no’ at face value.”
The traveler made it clear that ChatGPT did not invent excuses – it simply helped articulate a genuine medical issue in a convincing, structured way, backed by research.
“If I hadn’t used ChatGPT, I would have had to hire a paralegal,” he wrote. “And that would have cost me more money.”
“So if you find yourself in a situation like this … don’t just take no for an answer.”