Google (Alphabet Inc.) expanded Circle to Search in its March 2026 feature drop: a single gesture on a photo now identifies, and returns shoppable results for, every individual item in a full outfit, from coat and shirt to trousers, shoes, and accessories. The update, rolling out to Android and iOS devices globally, marks the most comprehensive upgrade to visual shopping since Circle to Search launched in January 2024.
The same feature drop introduced Magic Cue for Gemini-powered restaurant recommendations, extended Live Translate for headphones to iOS users in more than 200 countries, and pushed several Gemini integrations deeper into Google’s core app ecosystem.
Circle to Search 2026: Full Outfit Identification in One Tap
Previously, Circle to Search let users draw around a single item — a jacket, a sneaker — to find it or similar products online. The March 2026 update removes that constraint entirely. Draw around an entire person in a photo and Circle to Search now parses every visible clothing item individually, returns shoppable results for each piece, and presents them as a unified outfit breakdown.
The feature works on any image source: social media feeds, editorial photos, screenshots, or camera roll images. Google’s underlying Vision AI models segment each garment by category — outerwear, tops, bottoms, footwear, bags, jewelry — and match them against Google Shopping’s catalog of billions of indexed products.
According to Google’s product blog, the outfit segmentation model was trained on hundreds of millions of fashion images and can distinguish between similar silhouettes — a trench coat versus a mac, oxford shoes versus derby shoes — with subcategory precision. Internal testing cited in the announcement showed correct category identification in over 90% of cases.
How Circle to Search Outfit Detection Actually Works
Circle to Search’s outfit mode runs on a multi-stage pipeline. First, a person detection model isolates the subject from the background. Second, a semantic segmentation layer maps body regions to garment categories. Third, each garment segment passes through a fine-grained fashion recognition model that extracts attributes — color, pattern, silhouette, material texture — before querying Google Shopping.
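As a rough conceptual sketch of that pipeline (Google has not published its internal APIs, so every object and function name below is a hypothetical stand-in):

```python
# Conceptual sketch of the three-stage outfit pipeline described above.
# All models here are hypothetical stand-ins, not Google's internal APIs.
from dataclasses import dataclass

@dataclass
class GarmentResult:
    category: str           # e.g. "outerwear", "footwear"
    attributes: dict        # color, pattern, silhouette, material texture
    shopping_matches: list  # candidate products from the catalog query

def break_down_outfit(image, person_detector, segmenter, recognizer, catalog):
    """Run the multi-stage outfit pipeline on a single image."""
    # Stage 1: isolate the subject from the background.
    person_crop = person_detector.detect(image)

    # Stage 2: map body regions to garment categories (one mask per garment).
    segments = segmenter.segment(person_crop)

    results = []
    for seg in segments:
        # Stage 3: extract fine-grained fashion attributes per garment,
        # then query the shopping catalog with category + attributes.
        attrs = recognizer.extract_attributes(seg)
        matches = catalog.search(category=seg.category, attributes=attrs)
        results.append(GarmentResult(seg.category, attrs, matches))
    return results
```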
Results surface in a scrollable card tray at the bottom of the screen, one card per item, each with a price range and retailer links. For items Google cannot match exactly, a “Find similar” option falls back to style-based recommendations.
The architecture builds on Google Lens, which processed over 20 billion visual searches per month as of late 2025, according to Google’s internal figures. Outfit mode adds a coordination layer that preserves relationships between garments — filtering by “under $100” applies across all identified items simultaneously rather than item by item.
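A minimal sketch of what that coordination layer implies, with invented data structures (the real implementation is not public): a single constraint is applied to every garment’s result list in one pass, rather than per item.

```python
# Illustrative sketch (not Google's implementation) of cross-item filtering:
# one shared constraint narrows every garment's matches simultaneously.

def apply_outfit_filter(outfit_results, max_price):
    """Filter every garment's shopping matches by a shared price cap."""
    return {
        garment: [m for m in matches if m["price"] <= max_price]
        for garment, matches in outfit_results.items()
    }

outfit = {
    "coat":  [{"title": "Wool trench", "price": 240.0},
              {"title": "Cotton mac", "price": 89.0}],
    "shoes": [{"title": "Leather derbies", "price": 150.0},
              {"title": "Canvas sneakers", "price": 55.0}],
}

# "Under $100" narrows both items' results at once.
print(apply_outfit_filter(outfit, max_price=100))
# {'coat': [{'title': 'Cotton mac', ...}], 'shoes': [{'title': 'Canvas sneakers', ...}]}
```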
Magic Cue: Gemini Now Recommends Restaurants Mid-Conversation
The March drop also introduced Magic Cue, a contextual trigger that lets Gemini surface Google Maps restaurant recommendations directly inside a chat session. When a conversation includes location signals — a city name, a neighborhood, a phrase like “looking for dinner” — Magic Cue activates automatically and presents a restaurant card carousel without requiring the user to leave the chat or rephrase a separate query.
Magic Cue pulls from Google Maps’ review database of over 250 million places and surfaces results filtered by cuisine, price tier, current open status, and proximity. It is Google’s clearest push yet to make Gemini a direct competitor to conversational dining apps like Yelp’s AI assistant and OpenTable’s recently launched chat interface.
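A toy sketch of that trigger pattern, assuming it reduces to intent detection plus a filtered places query (the signal lists and the fetch_restaurants hook below are invented for illustration; Google has not published the actual trigger rules):

```python
# Toy illustration of Magic Cue's trigger logic as described above.
# Signal lists and the places hook are invented for illustration only.

DINING_PHRASES = ("looking for dinner", "where to eat", "grab lunch")
KNOWN_PLACES = ("brooklyn", "shoreditch", "toronto")  # stand-in for a geo lookup

def magic_cue_should_trigger(message: str) -> bool:
    text = message.lower()
    has_dining_intent = any(p in text for p in DINING_PHRASES)
    has_location = any(place in text for place in KNOWN_PLACES)
    return has_dining_intent and has_location

def build_carousel(message, fetch_restaurants):
    """Return restaurant cards only when both signals are present."""
    if not magic_cue_should_trigger(message):
        return None
    # fetch_restaurants is a hypothetical hook into a places API, filtering
    # by cuisine, price tier, open-now status, and proximity.
    return fetch_restaurants(query=message, open_now=True, sort="proximity")

print(magic_cue_should_trigger("Looking for dinner near Shoreditch tonight"))  # True
```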
The feature is initially available in the United States, United Kingdom, Canada, and Australia, with broader rollout planned through Q2 2026. At launch it is exclusive to Gemini Advanced subscribers on the $19.99/month tier, ahead of a wider free-tier release.
Live Translate Expands to iOS and 200-Plus Countries
Live Translate, Google’s real-time earphone translation feature, had been exclusive to Android since its debut with Pixel Buds Pro in 2022. The March 2026 update breaks that exclusivity: Live Translate now works with AirPods Pro (2nd generation and later) and select Beats headphones on iOS 17.4+, across more than 200 countries.
The expansion covers 48 language pairs at launch, up from 30 available on Android. Translation latency, per Google’s published specs, sits at under 500 milliseconds for the top 20 language pairs — fast enough for natural conversational cadence. The feature requires a Google account and the Google Translate app installed on iOS.
The practical implication is largest in markets where Android penetration is low and iPhone dominates. Japan recorded a 68% iPhone market share as of Q4 2025, per Counterpoint Research. Live Translate is now a functional travel tool for iPhone users in those markets for the first time.
Circle to Search vs. Pinterest Lens, Amazon StyleSnap, and Bing Visual Search
Google is not alone in visual fashion search. Pinterest Lens, Amazon’s StyleSnap, and Bing Visual Search all offer overlapping capabilities. The March 2026 Circle to Search update shifts the competitive landscape on the specific dimension of full-outfit parsing — a feature none of its major rivals currently match.
| Tool | Full Outfit Breakdown | Shopping Integration | Platform | Free Tier |
|---|---|---|---|---|
| Google Circle to Search | Yes (March 2026) | Google Shopping (billions of products) | Android, iOS | Yes |
| Pinterest Lens | No (single item) | Pinterest catalog + partner retailers | iOS, Android | Yes |
| Amazon StyleSnap | No (single item) | Amazon catalog only | iOS, Android | Yes |
| Bing Visual Search | Limited (2-3 items) | Bing Shopping index | iOS, Android, Web | Yes |
Pinterest Lens remains the strongest competitor for style discovery — its recommendation engine draws on signals from 465 million monthly active users, per Pinterest’s Q4 2025 earnings report — but it does not attempt to identify every item in a photo simultaneously. Amazon StyleSnap is effective within Amazon’s own catalog but provides no path to third-party retailers. Google’s advantage is catalog breadth and the integration of Maps, Shopping, and AI into one surface.
The Broader March 2026 Google AI Feature Drop
The outfit detection update arrived alongside six other Gemini and Google AI changes in March. Beyond Magic Cue and Live Translate, Google updated Gemini in Gmail to summarize email threads up to 50 messages long (previously capped at 10), extended NotebookLM audio overviews to 40 languages, and pushed Gemini Nano on-device processing to all Android 15 devices — not just Pixel flagships.
The pace of Google’s feature drops has accelerated sharply. The company released 23 distinct Gemini feature updates in Q1 2026, compared to 11 in Q1 2025, according to Google’s release notes archive. That cadence reflects competitive pressure from OpenAI’s expanding consumer integrations and the broader race to embed AI into daily workflows before usage habits solidify.
MegaOne AI tracks 139+ AI tools across 17 categories. Circle to Search’s outfit mode is the most direct challenge yet to standalone fashion-AI apps — a category that includes startups like Stylitics and Acustom Apparel — because it operates at zero additional cost inside a product users already open dozens of times daily.
What Outfit Search Means for Retailers and E-Commerce Teams
For e-commerce teams, Circle to Search’s outfit mode changes acquisition math. Product discovery driven by visual search converts at roughly 2.5x the rate of text search for fashion categories, according to Salesforce’s 2025 Commerce Cloud benchmarks. A tool that surfaces an entire outfit — rather than a single item — multiplies the potential basket size per discovery session.
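A hypothetical back-of-envelope calculation makes the basket-size point concrete. None of the inputs below come from the Salesforce benchmarks, and the model simplistically assumes conversion applies uniformly per surfaced item:

```python
# Hypothetical back-of-envelope math; inputs are illustrative, not sourced.
text_search_conv = 0.02                 # assumed baseline text-search conversion
visual_conv = text_search_conv * 2.5    # ~2.5x lift for visual search (fashion)

avg_item_price = 60.0
items_single_search = 1                 # classic single-item visual search
items_outfit_breakdown = 5              # full outfit breakdown

def expected_basket(conv_rate, items, price):
    """Expected revenue per discovery session, assuming independent items."""
    return conv_rate * items * price

print(expected_basket(visual_conv, items_single_search, avg_item_price))    # 3.0
print(expected_basket(visual_conv, items_outfit_breakdown, avg_item_price)) # 15.0
```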
Retailers who have optimized product imagery for Google Shopping’s structured data will benefit most. Google’s outfit model relies on accurate product titles, category tags, and high-resolution imagery to make clean matches. Brands with poor catalog hygiene will surface inconsistently in outfit breakdowns.
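For reference, one standard way retailers expose that catalog data to Google is schema.org Product markup embedded as JSON-LD on product pages. The values below are illustrative, but the fields (name, image, brand, offer) are the kind of structured signals a matching system can read:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Women's Cotton Trench Coat - Beige",
  "image": "https://example.com/images/trench-coat-beige-1200px.jpg",
  "description": "Single-breasted cotton trench coat with belt.",
  "sku": "TRENCH-BEI-M",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```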
The integration also puts pressure on social commerce platforms. Instagram’s visual search does not yet offer multi-item outfit parsing. If Circle to Search drives measurable Google Shopping traffic from social content (images shared from Instagram or TikTok), it reopens a distribution channel where Google has struggled to compete. The same pattern visible in AI-powered utility apps is now reaching fashion: incumbents with data scale are absorbing functionality that once required specialist tools.
How to Use Circle to Search Outfit Mode
The feature requires no additional setup on supported devices. To activate outfit breakdown:
- Open any image — in a browser, social feed, or gallery app — on a supported Android or iOS device.
- Long-press the home button or navigation bar (Android) or use the Google app share sheet option (iOS) to activate the Circle to Search overlay.
- Draw a circle or tap around the full person or outfit in the image.
- Select “Break down outfit” from the results panel if it does not activate automatically.
- Scroll the item tray at the bottom to browse each identified piece with shopping links and price ranges.
The feature is rolling out progressively through April 2026. Users who do not see it immediately should check for updates to the Google app.
For teams evaluating AI-driven visual tools more broadly, MegaOne AI’s head-to-head tool comparisons apply the Engine Score methodology to visual search platforms, assessing accuracy, catalog depth, and conversion quality.
Circle to Search outfit mode is the most capable free visual fashion search tool available today. Retailers should audit their Google Shopping catalog quality now — the compounding traffic this feature generates as it exits progressive rollout will not wait for brands still running outdated product data.
