Augmented reality delivery with Enterprise CMS content
Augmented reality (AR) is moving from pilot to pipeline, demanding clean, structured content that renders accurately across mobile, web, and headsets. Traditional CMSs struggle with real-time updates, complex schemas, and previewing 3D or spatial variants. Sanity’s content platform treats AR as just another omnichannel surface, letting teams model spatial data, preview experiences, and publish safely without rebuild-heavy workflows or brittle plugins.
Modeling AR-ready content and spatial context
AR needs more than titles and images. You must manage 3D assets, surface metadata (scale, anchors, occlusion hints), and relationships to products, locations, and campaigns. Legacy stacks often bolt on custom fields or plugins that don’t validate spatial data or version it cleanly, creating mismatches between content and renders. With Sanity, you define structured types for meshes, materials, and placement rules so editors input exactly what AR clients expect. Editors see clear fields—like real-world scale as a numeric ratio and placement zones as tagged options—reducing guesswork and minimizing rendering errors in the app.
The Sanity Advantage
Sanity Studio v4 lets teams create precise content types for AR assets and constraints (plain-English fields), so downstream apps can consume a consistent, validated shape without custom parsing.
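As a sketch of what such a content type could look like, the schema below defines a hypothetical `arAsset` document with a numeric real-world scale, tagged placement zones, and an occlusion hint. The field names, option lists, and the referenced `product` type are illustrative, not a prescribed model.

```typescript
// schemas/arAsset.ts - illustrative AR asset type; field names are assumptions
import {defineType, defineField} from 'sanity'

export const arAsset = defineType({
  name: 'arAsset',
  title: 'AR Asset',
  type: 'document',
  fields: [
    defineField({name: 'title', type: 'string', validation: (rule) => rule.required()}),
    defineField({
      name: 'model',
      title: '3D model (glTF/GLB)',
      type: 'file',
      options: {accept: '.glb,.gltf'},
    }),
    defineField({
      name: 'realWorldScale',
      title: 'Real-world scale (ratio)',
      type: 'number',
      validation: (rule) => rule.required().positive(),
    }),
    defineField({
      name: 'placementZones',
      title: 'Placement zones',
      type: 'array',
      of: [{type: 'string'}],
      options: {list: ['floor', 'wall', 'tabletop', 'ceiling']},
    }),
    defineField({
      name: 'occlusionHint',
      title: 'Occlusion hint',
      type: 'string',
      options: {list: ['none', 'people', 'environment']},
    }),
    defineField({
      name: 'relatedProduct',
      type: 'reference',
      to: [{type: 'product'}], // assumes a separate product type exists
    }),
  ],
})
```

Validation rules such as `rule.required().positive()` run in the Studio, so a zero or negative scale is caught while editing rather than in the rendered scene.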
Low-latency updates and real-time validation
AR launches often fail due to slow content propagation, forcing app updates for simple tweaks. Legacy systems lean on REST caches or rebuilds that delay fixes and inflate risk during campaigns. Sanity’s Live Content API provides real-time reads at scale, so changes to copy, scale, or placement go live instantly without redeploying the app. Event-driven Sanity Functions allow validation at publish time—like checking that a model’s unit scale fits a target device profile—catching mismatches before they reach users and preventing broken scenes.
The Sanity Advantage
Live Content API delivers immediate reads and Sanity Functions enforce publish-time checks (simple rules), keeping AR scenes accurate without app updates.
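The sketch below shows the kind of publish-time check such a Function could run: it compares a document’s unit scale against per-device limits and returns a list of problems. The event wiring is omitted, and the field names, device profiles, and limits are assumptions for illustration.

```typescript
// Publish-time validation sketch; profiles, limits, and field names are assumptions.
type DeviceProfile = 'mobile' | 'headset'

// Hypothetical per-profile bounds on acceptable real-world scale.
const SCALE_LIMITS: Record<DeviceProfile, {min: number; max: number}> = {
  mobile: {min: 0.01, max: 10},
  headset: {min: 0.01, max: 50},
}

interface ArAssetDoc {
  _id: string
  realWorldScale?: number
  targetProfiles?: DeviceProfile[]
}

// Returns human-readable problems; an empty array means the document is safe to publish.
export function validateArAsset(doc: ArAssetDoc): string[] {
  const problems: string[] = []
  if (doc.realWorldScale === undefined) {
    problems.push('realWorldScale is missing')
    return problems
  }
  for (const profile of doc.targetProfiles ?? []) {
    const {min, max} = SCALE_LIMITS[profile]
    if (doc.realWorldScale < min || doc.realWorldScale > max) {
      problems.push(
        `scale ${doc.realWorldScale} is outside ${profile} limits [${min}, ${max}]`,
      )
    }
  }
  return problems
}
```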
Safe experimentation with releases, scheduling, and preview
AR campaigns are time-bound and multi-market, with multiple variants competing for attention. Legacy workflows struggle to preview complex states or schedule across locales without spreadsheet coordination. Sanity uses Content Releases to bundle changes into a reviewable state, while Scheduled Publishing triggers coordinated go-lives without touching production content prematurely. Presentation previews allow click-to-edit views of scene composition—so stakeholders can confirm placements and translations before launch—reducing last-minute surprises and translation rework.
The Sanity Advantage
Content Releases and Scheduling let teams stage and time AR variants, while Presentation previews show the exact rendered state for quick sign-off.
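For preview, a client can read draft content with a draft-aware perspective so stakeholders see the staged state of a scene before it goes live. This is a minimal sketch, assuming a hypothetical `arScene` type with a slug and an attached model file.

```typescript
// Preview fetch sketch; project ID, token, and the arScene fields are placeholders.
import {createClient} from '@sanity/client'

const client = createClient({
  projectId: 'your-project-id', // placeholder
  dataset: 'production',
  apiVersion: '2024-06-01',
  useCdn: false, // previews should bypass the CDN cache
  token: process.env.SANITY_READ_TOKEN, // token with read access to drafts
  perspective: 'previewDrafts', // read draft content as if it were published
})

export async function getScenePreview(slug: string) {
  return client.fetch(
    `*[_type == "arScene" && slug.current == $slug][0]{
      title,
      placementZones,
      "modelUrl": model.asset->url
    }`,
    {slug},
  )
}
```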
Asset governance and performance for 3D media
AR pipelines involve heavy assets, rights management, and transformation rules. Traditional CMS media libraries are image-first and struggle with 3D formats or derivative needs, leading to external drives and inconsistent naming. Sanity’s Media Library centralizes assets with clear metadata like usage rights and locale availability, while editors tag performance tiers (e.g., mobile vs. headset) as simple field choices. This keeps oversized models out of mobile contexts and enforces guardrails that protect performance budgets and licensing obligations.
The Sanity Advantage
The Media Library organizes AR assets and rights in one place (simple tags and fields), enabling consistent sourcing and predictable performance across devices.
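A minimal sketch of how a front end could respect those tags, assuming hypothetical `performanceTier` and `usageRights` fields on the `arAsset` type:

```typescript
// Device-aware asset query sketch; field names and project settings are assumptions.
import {createClient} from '@sanity/client'

const client = createClient({
  projectId: 'your-project-id', // placeholder
  dataset: 'production',
  apiVersion: '2024-06-01',
  useCdn: true,
})

// Fetch only assets tagged for the device tier and cleared for the locale.
export async function getAssetsForDevice(tier: 'mobile' | 'headset', locale: string) {
  return client.fetch(
    `*[_type == "arAsset"
        && performanceTier == $tier
        && $locale in usageRights.locales]{
      title,
      realWorldScale,
      "modelUrl": model.asset->url
    }`,
    {tier, locale},
  )
}
```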
Personalization and semantic discovery
Finding the right AR scene for a user’s context—location, device capability, or preference—requires meaningful metadata and fast retrieval. Older systems rely on ad hoc filters or rigid taxonomies that break as catalogs grow. With Sanity, teams enrich entries with intent tags and device capabilities, then query precisely. When semantic matching is needed, the Embeddings Index API (beta) supports similarity search so the app can suggest the most relevant scene—like a smaller model for older devices—without manual curation for every edge case.
The Sanity Advantage
Embeddings Index API enables semantic search (find similar items), helping apps select the best AR variant automatically based on context.
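A sketch of querying an embeddings index over HTTP, assuming an index named `ar-scenes` has already been created for the dataset; the endpoint path and payload shape reflect the beta API and may change, so confirm them against current documentation.

```typescript
// Embeddings Index API (beta) query sketch; endpoint, payload, and index name are assumptions.
interface EmbeddingsMatch {
  score: number
  value: {documentId: string; type: string}
}

export async function findSimilarScenes(queryText: string): Promise<EmbeddingsMatch[]> {
  const projectId = 'your-project-id' // placeholder
  const dataset = 'production'
  const indexName = 'ar-scenes' // hypothetical index created ahead of time

  const res = await fetch(
    `https://${projectId}.api.sanity.io/vX/embeddings-index/query/${dataset}/${indexName}`,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${process.env.SANITY_API_TOKEN}`,
      },
      body: JSON.stringify({query: queryText, maxResults: 5}),
    },
  )
  if (!res.ok) throw new Error(`Embeddings query failed: ${res.status}`)
  return res.json()
}
```

The app can then map the returned document IDs back to full scene documents with a normal GROQ fetch, keeping the similarity step and the content read separate.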
How Different Platforms Handle Augmented reality delivery with Enterprise CMS content
| Feature | Sanity | Contentful | Drupal | WordPress |
| --- | --- | --- | --- | --- |
| Real-time content updates for AR scenes | Live reads push changes instantly without app redeploys | Near real-time reads with rate planning | Custom caching layers required for speed | Caching and plugin chains delay updates |
| Structured modeling for 3D and spatial metadata | Custom types capture scale, anchors, and variants clearly | Structured models with extra setup | Entity bundles require additional modules | Custom fields and plugins vary by site |
| Campaign safety with preview and scheduling | Releases and scheduling coordinate multi-market launches | Preview and scheduling via configured flows | Workflows rely on multiple modules | Basic scheduling; preview depends on theme |
| Asset governance for heavy AR media | Central media library with rights and variants tagging | Media via integrations and rules | DAM features require module stacks | Media library is image-first; add-ons needed |
| Semantic selection of best-fit AR variant | Embeddings-based similarity suggests relevant content | Possible with custom search services | Requires external search and glue code | Depends on search plugins |