Google Overhauls Search with AI, Led by Robby Stein's New Strategy
Photo by Dylan Carr (unsplash.com/@dyl_carr) on Unsplash
Google announced a major overhaul of Search: VP of Search Robby Stein unveiled an AI‑driven "AI Mode" built on Gemini 3 (Flash and Pro) that adds multi‑turn reasoning, real‑time tool use, and optional "personal intelligence" integration, according to an interview on the Share podcast.
Key Facts
- Key company: Google
Google’s new “AI Mode” is built on Gemini 3, the company’s latest large‑language model, and it splits into two flavors: Gemini 3 Flash, optimized for speed, and Gemini 3 Pro, tuned for deeper reasoning. According to the Share interview with Robby Stein, Flash can return a response in under a second for straightforward queries, while Pro takes a few extra seconds to chain together multiple inference steps, enabling the multi‑turn conversations that were previously impossible in a keyword‑driven interface. The distinction lets Google route simple look‑ups—like weather or sports scores—to the faster engine, while more complex tasks—such as planning a multi‑city trip or troubleshooting a code bug—are handed to the more deliberative Pro model.
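The routing described above can be pictured as a small dispatch function. This is a hypothetical sketch, not Google's implementation: the model names, word-count threshold, and lookup patterns are all illustrative assumptions.

```python
# Hypothetical complexity-based model routing, illustrating the Flash/Pro
# split described in the article. Heuristics and model names are assumptions.

SIMPLE_PATTERNS = ("weather", "score", "time in", "define")

def route_query(query: str) -> str:
    """Pick a model tier: the fast engine for simple lookups, the deeper one otherwise."""
    q = query.lower()
    # Short queries matching lookup-style patterns go to the speed-optimized tier.
    if len(q.split()) <= 6 and any(p in q for p in SIMPLE_PATTERNS):
        return "gemini-3-flash"   # sub-second, single inference step
    # Everything else gets the deliberative tier for multi-step reasoning.
    return "gemini-3-pro"

print(route_query("weather in Paris"))                            # gemini-3-flash
print(route_query("plan a multi-city trip through Spain in May")) # gemini-3-pro
```

In a real system the classifier would itself likely be a learned model rather than keyword rules, but the shape of the decision, cheap tier first, expensive tier on demand, is the same.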
Beyond raw model power, AI Mode integrates a “tool use” layer that lets the system call external APIs in real time. Stein explained that the engine can pull live financial data, generate on‑the‑fly graphs, or even execute a reservation through a partner’s booking service without the user leaving the search page. This capability, demonstrated in the Share episode with live finance simulations, marks a shift from static answer snippets to an execution platform that can act on behalf of the user, a concept the team refers to as “agentic search.”
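The "agentic search" pattern, a model emitting tool calls that the system executes on the user's behalf, can be sketched as a dispatch loop. Everything here is an assumption for illustration: the tool names, the plan format, and the stub implementations are invented, not Google's API.

```python
# Illustrative agentic tool-use loop. Tool names, plan schema, and the
# stand-in implementations are hypothetical, not Google's actual interface.

def get_stock_quote(symbol: str) -> str:
    return f"{symbol}: 142.17"                       # stand-in for a live finance API

def book_table(restaurant: str, time: str) -> str:
    return f"confirmed at {restaurant} for {time}"   # stand-in partner booking service

TOOLS = {"get_stock_quote": get_stock_quote, "book_table": book_table}

def run_agent(plan: list[dict]) -> list[str]:
    """Execute a model-produced plan of tool calls and collect the results."""
    results = []
    for step in plan:
        tool = TOOLS[step["tool"]]            # look up the tool the model requested
        results.append(tool(**step["args"]))  # call it with model-chosen arguments
    return results

# A model might emit a plan like this for "how is GOOG doing, and book dinner":
plan = [
    {"tool": "get_stock_quote", "args": {"symbol": "GOOG"}},
    {"tool": "book_table", "args": {"restaurant": "Nopa", "time": "7pm"}},
]
print(run_agent(plan))
```

The key shift the article describes is exactly this loop: the search page stops at rendering `results` for the user instead of handing back ten blue links.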
Personalization is another pillar of the overhaul. Users who opt in to “personal intelligence” can link Google services such as Gmail, Calendar, and Photos, allowing the model to surface context‑aware results that draw on a person’s own data. Stein said the feature is strictly voluntary and processed on‑device where possible, but it enables scenarios like surfacing a relevant email thread when searching for a past meeting or suggesting a photo album when looking for travel ideas. The Share podcast notes that this approach keeps the user’s data under their control while delivering a more tailored search experience.
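The opt-in boundary Stein describes can be made concrete with a minimal sketch: only explicitly linked sources are ever consulted. The source names, data shapes, and consent model below are illustrative assumptions, not Google's design.

```python
# Hedged sketch of opt-in personal context lookup. Source names and the
# consent mechanism are hypothetical; the point is that unlinked sources
# are never read at all.

CONSENTED = {"gmail", "calendar"}      # sources the user has explicitly linked

PERSONAL_DATA = {
    "gmail":    ["email: Q3 planning meeting notes"],
    "calendar": ["event: Q3 planning, Mar 4"],
    "photos":   ["album: Lisbon trip"],   # not linked, so never consulted
}

def personal_context(query: str) -> list[str]:
    """Return matching personal items, but only from opted-in sources."""
    words = query.lower().split()
    hits = []
    for source, items in PERSONAL_DATA.items():
        if source not in CONSENTED:
            continue                      # unlinked sources stay untouched
        hits += [item for item in items
                 if any(w in item.lower() for w in words)]
    return hits

print(personal_context("Q3 planning"))
```

A search for a past meeting would then surface the email thread and calendar event, while the unlinked Photos library contributes nothing, which is the control boundary the article emphasizes.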
The rollout is staged, with AI Mode initially available on desktop and Android, and a beta on iOS slated for the next quarter. Early internal testing, as described by Stein, shows a 30 percent reduction in the number of follow‑up clicks users need to complete a task, indicating that the conversational flow is already cutting friction. Google plans to expand the tool‑use catalog and add more personal‑intelligence integrations over the next six to nine months, turning the search bar into a planning and execution hub rather than a simple information dump.
Industry observers see the move as Google’s answer to the wave of AI‑first search products launched by rivals. While the Share interview does not cite competitor data, Stein’s framing of AI Mode as a “thought partner” and “execution layer” underscores Google’s intent to reclaim the narrative around search as a proactive assistant rather than a passive index. If the adoption metrics hold, the new architecture could redefine how billions of users interact with the web, turning a single query into a multi‑step, context‑rich dialogue that bridges information retrieval and action.
Sources
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.