A recent ruling by British Columbia's Civil Resolution Tribunal, Canada's online small claims tribunal, has significant implications for companies deploying AI-powered chatbots. Air Canada was ordered to compensate a customer after its chatbot gave him incorrect information about bereavement fares.
The customer, who was grieving the death of his grandmother, asked the chatbot about bereavement fares. The chatbot incorrectly told him he could book a full-price ticket and apply for a bereavement refund afterward, and he flew on that assumption. Although the chatbot's reply linked to Air Canada's actual bereavement policy, which contradicted that advice, the airline initially refused the refund and offered a $200 flight credit instead.
Air Canada took the unusual position that the chatbot was a separate legal entity responsible for its own actions. The tribunal rejected this defense, holding the airline accountable for all information on its website, whether it came from a static page or a chatbot.
The decision signals that companies deploying AI assistants must ensure the information those assistants provide is accurate. Businesses can be held liable for what their chatbots tell customers, underscoring the real legal consequences of AI errors.