
Air Canada found responsible for bad advice provided by its chatbot


A screenshot of Air Canada's customer information page. (Archive photo)

Radio-Canada


The airline Air Canada has been ordered to pay compensation to a British Columbia resident who says he was misinformed by a chatbot while purchasing a plane ticket.

The airline tried to distance itself from the bad advice given by its own chatbot in responding to a dissatisfied customer's claim before the Civil Resolution Tribunal of British Columbia.

In November 2022, following the death of his grandmother, Jake Moffatt booked a flight with Air Canada, according to the decision rendered by Christopher Rivers, a member of the tribunal.

The chatbot told Jake Moffatt that he could retroactively request a special fare for grieving travellers. Based on the chatbot's advice, Jake Moffatt purchased the tickets at the regular price. Afterwards, Air Canada employees informed him that the company did not allow retroactive requests once the trip had taken place.

An Air Canada representative acknowledged that the chatbot had provided misleading information, the decision continues. Air Canada added that it had noted the problem so that it could update the chatbot.


These explanations were apparently not enough for Jake Moffatt, who filed a complaint with the tribunal.

In response to the complaint, the company attempted to defend itself by claiming that the online tool is a separate legal entity responsible for its own actions.

According to the decision, Air Canada also argued that it could not be held responsible for information provided by any of its agents, servants or representatives – including a chatbot – without indicating why.

Tribunal member Christopher Rivers did not fail to react to the company's argument, which he wryly described as remarkable.

Although a chatbot has an interactive component, it is only part of the Air Canada website. It should be obvious to Air Canada that it is responsible for all information on its website. Whether this information comes from a static page or a chatbot makes no difference.

A quote from Civil Resolution Tribunal Member Christopher Rivers

I believe that Air Canada failed to take reasonable steps to ensure its chatbot provided accurate information, Christopher Rivers said.

Air Canada argued that the customer could have found the correct information about bereavement travel fares elsewhere on the website.

This doesn't explain why the web page entitled "Bereavement Travel" was inherently more trustworthy than the chatbot, Christopher Rivers pointed out.

There was no reason for Jake Moffatt to know that one section of Air Canada's web page was accurate and another was not, Christopher Rivers wrote.

Christopher Rivers ordered Air Canada to pay $812 to Jake Moffatt, an amount corresponding to the difference between the bereavement fare and the regular price of the tickets.

A search of the Canadian Legal Information Institute's database turns up few cases in which chatbots have given bad advice. Jake Moffatt's appears to be the first.

With information from Jason Proctor
