A lesson for companies preparing to roll out ever more sophisticated chatbots to improve customer service: if the chatbot gets it wrong, it’s on you. A Canadian tribunal has ruled that Air Canada has to pay up after its chatbot gave a customer wrong information while he was booking a flight following his grandmother’s death in Ontario. The airline offers a bereavement discount only if the request is submitted before a last-minute flight; the chatbot told British Columbia resident Jake Moffatt that he had 90 days to apply, according to the Washington Post. Air Canada reportedly argued that the chatbot was a separate legal entity that was “responsible for its own actions”. Not so, said the tribunal: it should be “obvious” that Air Canada is responsible for all the information on its website, the chatbot included. Air Canada was ordered to pay just over $600 in damages and tribunal fees; it says it will comply with the decision.