If your website has a chatbot feature, you might want to consider adding a disclaimer that the bot might not provide accurate information. Air Canada recently learned this lesson the hard way when it had to honor a refund policy that its customer service chatbot erroneously invented.
It all started when Vancouver resident Jake Moffatt had to book a last-minute flight to Toronto after his grandmother passed away. Unsure of Air Canada’s bereavement fare policy, he turned to the chatbot on the airline’s website for help. The bot informed him that he could book a flight at full price and then request a partial refund within 90 days.
Moffatt was understandably surprised when the airline rejected his refund request, stating that it does not provide bereavement refunds for tickets that have already been booked. Moffatt shared a screenshot of the chatbot’s original answer and continued to press for a partial refund. The case eventually went before Canada’s Civil Resolution Tribunal, which decided in Moffatt’s favor. Air Canada was ordered to refund CA$650.88 (about US$482) of the original CA$1,640.36 (about US$1,216) fare, plus interest and Moffatt’s tribunal fees. Experts reportedly told the Vancouver Sun that Air Canada might have avoided liability if its chatbot had warned users that the information it provided could be inaccurate.