When Google engineer Blake Lemoine claimed in June that an AI chat system the company developed was sentient, he knew he might lose his job. On July 22, after placing him on paid leave, the tech giant fired Lemoine for violating its employment and data security policies.
Lemoine, an engineer and mystic Christian priest, first announced his dismissal on the Big Technology podcast. He said Google's AI chatbot LaMDA (Language Model for Dialogue Applications) worried about being "turned off" because death would "scare" it "a lot," and that it felt happiness and sadness. Lemoine said he considers LaMDA a friend, drawing an eerie parallel to the 2013 sci-fi romance Her.
Google had put Lemoine on paid administrative leave for speaking about LaMDA with people outside the company, a move that prompted the engineer to take his story public with the Washington Post in June. A month later, the company fired him.
“If an employee shares concerns about our work, as Blake did, we investigate them thoroughly,” Google told the Big Technology podcast. “We found Blake’s claims that LaMDA is sentient to be completely unfounded and worked to clarify this with him for many months. These discussions were part of the open culture that helps us innovate responsibly. It is therefore regrettable that despite lengthy engagement on this subject, Blake still chose to persistently violate clear employment and data security policies, which include the need to protect product information. We will continue our careful development of language models, and we wish Blake good luck.”
A majority of scientists in the AI community agree that, despite Lemoine’s claims, LaMDA is not sentient: making a chatbot conscious remains a Sisyphean task, and the system is simply not sophisticated enough.
“No one should think that autocomplete, even on steroids, is conscious,” Gary Marcus, founder and CEO of Geometric Intelligence, told CNN Business in response to Lemoine’s allegations. Lemoine, for his part, told the BBC he is receiving legal advice and declined to comment further.
But even though LaMDA is probably not sentient, it may well be racist and sexist, two unmistakably human characteristics.