AI is the foundational technology at Google and its parent company Alphabet, CEO Sundar Pichai told the audience at this year’s Code conference in Los Angeles. He pointed out the “extraordinary” successes of the Google AI and DeepMind teams in areas such as large language models and the AlphaFold project, which predicted the structures of 200 million proteins. He said Google was now applying deep computer science and AI to all its products, from search to its AlphaFold work with pharmaceutical companies to self-driving cars. But, he added, it is “important that we develop AI aligned with human values.”
Conference host Kara Swisher showed a 2016 interview in which Pichai (then interviewed by the now-retired Walt Mossberg) said he expected we would have true “conversational AI” to help get things done within 5 to 10 years. Pichai said that capability is not fully here yet and is still a few years out, and that Google has more work to do. For instance, he said, it would be great if everyone could have a tutor in their pocket on any topic, but AI is not yet good enough for that.
Swisher pressed him on how he got the company to pivot to AI. He talked about framing the move to being “AI first” as the same kind of shift as being “mobile first,” predicting that AI would be “as big as the Internet,” and then issuing a big call to arms within the organization. Now, he said, “AI is a part of all our products.”
Asked about AI ethics and how some people have left the company with complaints about how it was handling the issue, he said that, compared to most technologies, AI is still very early. He gave credit to people who have highlighted the issue. He said companies are talking about it, and Google had done its part by publishing hundreds of papers and open-sourcing its models. He acknowledged that there have been internal disagreements about what should have been published, saying “we should have handled it better.” In this situation, he said, the company should have slowed down and engaged better. He said that Google’s emphasis on decentralization gave it more speed, but at a cost.
Since the controversy, he said, Google has expanded its AI Ethics team. He also praised the “independent voices” that scrutinize AI and said there was a role for government in AI regulation, such as the rules being advanced in the European Union.
On the hubbub over whether Google’s LaMDA model was indeed sentient, Pichai said Google slowed things down and followed its process. “I don’t think LaMDA is sentient by any measures,” he said, but the episode showed the issues Google faces. “I think there’s a long way to go,” he said, adding that the discussion gets into philosophical questions about what sentience is. “The good news is we’re far from that and we may never get there.”
The conversation touched on many other issues facing Google and the tech industry. Asked about competition, he said competition comes out of nowhere, noting that “none of us were talking about TikTok a few years ago.” In general, he said, Google’s biggest competitors are the other big tech giants, including Amazon, Apple, Microsoft, and Facebook; emerging players such as TikTok; and in some areas, the Chinese companies (Alibaba, Tencent, and Baidu). But he added, “you tend to go wrong by focusing on competitors,” and big companies that fail usually fail from within.
Later, asked about antitrust issues and Google’s large position in the advertising market, he said he doesn’t see “general search” as the market that should be considered, because people can see advertising and buy products in a variety of contexts. He noted that TikTok now has $12 billion in ad revenue, Amazon has $30 billion, and Apple is expanding its ad business, saying “competition is hyper-intense, and can come from anywhere.”
While he said he had a lot of respect for Senator Amy Klobuchar (who spoke at the conference earlier about her antitrust bill and said it was right for elected officials and regulators to look at tech), Pichai thinks most American consumers would rather see a focus on privacy and the safety of children online.
On the controversial subject of content moderation, he noted that the company faced a lot of scrutiny a few years ago for hosting content that was unsafe for children. Google declared an internal “code yellow” to fix it and has made content responsibility a main pillar in areas from search to YouTube. “Our mission is high-quality information,” he said.
Asked for specifics, Pichai said the company blocked new videos from Donald Trump after January 6 because of “the risk of violence,” but said Google is “committed to freedom of speech” and that the decision will be re-evaluated as circumstances change. He said Trump’s Truth Social app had only recently been submitted to the Play Store and that the Google team was working with Truth Social on things such as a mechanism for removing directly violent content. He noted that the Parler social network was recently approved.