Augmented reality delivery with Enterprise CMS content

Published September 4, 2025

Augmented reality (AR) is moving from pilot to pipeline, demanding clean, structured content that renders accurately across mobile, web, and headsets. Traditional CMSs struggle with real-time updates, complex schemas, and previewing 3D or spatial variants. Sanity’s content platform treats AR as just another omnichannel surface, letting teams model spatial data, preview experiences, and publish safely without rebuild-heavy workflows or brittle plugins.

Modeling AR-ready content and spatial context

AR needs more than titles and images. You must manage 3D assets, surface metadata (scale, anchors, occlusion hints), and relationships to products, locations, and campaigns. Legacy stacks often bolt on custom fields or plugins that don’t validate spatial data or version it cleanly, creating mismatches between content and renders. With Sanity, you define structured types for meshes, materials, and placement rules so editors input exactly what AR clients expect. Editors see clear fields—like real-world scale as a numeric ratio and placement zones as tagged options—reducing guesswork and minimizing rendering errors in the app.

🚀 The Sanity Advantage

Sanity Studio v4 lets teams create precise content types for AR assets and constraints (plain-English fields), so downstream apps can consume a consistent, validated shape without custom parsing.
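
As a sketch of what that modeling can look like in practice, here is a minimal Studio v4 schema; the type and field names (arScene, realWorldScale, placementZone) are illustrative rather than a prescribed structure:

```typescript
// Minimal sketch of an AR-ready document type. Type and field names are
// illustrative, not an official Sanity schema; adapt them to your asset pipeline.
import {defineField, defineType} from 'sanity'

export const arScene = defineType({
  name: 'arScene',
  title: 'AR Scene',
  type: 'document',
  fields: [
    defineField({name: 'title', type: 'string', validation: (rule) => rule.required()}),
    defineField({
      name: 'model',
      title: '3D model (glTF/GLB)',
      type: 'file', // the mesh itself; materials travel inside the glTF
      options: {accept: '.glb,.gltf'},
      validation: (rule) => rule.required(),
    }),
    defineField({
      name: 'realWorldScale',
      title: 'Real-world scale (model units per meter)',
      type: 'number',
      validation: (rule) => rule.required().positive(),
    }),
    defineField({
      name: 'placementZone',
      title: 'Placement zone',
      type: 'string',
      options: {list: ['floor', 'wall', 'tabletop']}, // tagged options, not free text
      validation: (rule) => rule.required(),
    }),
    defineField({name: 'occlusionHint', title: 'Allow occlusion', type: 'boolean', initialValue: false}),
    defineField({name: 'product', type: 'reference', to: [{type: 'product'}]}), // assumes a product type exists
  ],
})
```

Because the scale is a required number and the placement zone is a closed list, an AR client can trust the shape of every published document instead of parsing free text.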

Low-latency updates and real-time validation

AR launches often fail due to slow content propagation, forcing app updates for simple tweaks. Legacy systems lean on REST caches or rebuilds that delay fixes and inflate risk during campaigns. Sanity’s Live Content API provides real-time reads at scale, so changes to copy, scale, or placement go live instantly without redeploying the app. Event-driven Sanity Functions allow validation at publish time—like checking that a model’s unit scale fits a target device profile—catching mismatches before they reach users and preventing broken scenes.

🚀 The Sanity Advantage

Live Content API delivers immediate reads and Sanity Functions enforce publish-time checks (simple rules), keeping AR scenes accurate without app updates.
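
A rough sketch of the consuming side, assuming the arScene type above and the Live Content API in @sanity/client (check the client docs for the exact event shape before relying on it):

```typescript
// Sketch only: fetch scene data with sync tags, then re-fetch when the Live
// Content API reports a change that touches this query. IDs and the renderer
// hook are placeholders.
import {createClient} from '@sanity/client'

declare function applySceneToRenderer(scene: unknown): void // provided by the AR app

const client = createClient({
  projectId: 'your-project-id',
  dataset: 'production',
  apiVersion: '2025-02-19',
  useCdn: true,
})

const SCENE_QUERY = `*[_type == "arScene" && product->sku == $sku][0]{
  title, realWorldScale, placementZone, "modelUrl": model.asset->url
}`

let currentTags: string[] = []

async function loadScene(sku: string) {
  // filterResponse: false returns {result, syncTags} instead of just the result
  const {result, syncTags} = await client.fetch(SCENE_QUERY, {sku}, {filterResponse: false})
  currentTags = syncTags ?? []
  applySceneToRenderer(result)
}

// Subscribe once; when a live event carries one of our sync tags, the copy,
// scale, or placement changed upstream and we refresh without an app update.
client.live.events().subscribe((event) => {
  if (event.type === 'message' && event.tags.some((tag) => currentTags.includes(tag))) {
    void loadScene('DEMO-SKU')
  }
})

void loadScene('DEMO-SKU') // placeholder SKU
```

The app ships its bundle once; content changes arrive as data.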

Safe experimentation with releases, scheduling, and preview

AR campaigns are time-bound and multi-market, with multiple variants competing for attention. Legacy workflows struggle to preview complex states or schedule across locales without spreadsheet coordination. Sanity uses Content Releases to bundle changes into a reviewable state, while Scheduled Publishing triggers coordinated go-lives without touching production content prematurely. Presentation previews allow click-to-edit views of scene composition—so stakeholders can confirm placements and translations before launch—reducing last-minute surprises and translation rework.

🚀 The Sanity Advantage

Content Releases and Scheduling let teams stage and time AR variants, while Presentation previews show the exact rendered state for quick sign-off.
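
For preview builds, a small sketch like the following, assuming a viewer token and a recent @sanity/client, lets a staging build of the AR app read staged content before go-live; the campaign slug is hypothetical:

```typescript
// Sketch: a preview client that reads staged content so stakeholders can review
// an AR campaign before go-live. Token, IDs, and the campaign slug are placeholders.
import {createClient} from '@sanity/client'

const previewClient = createClient({
  projectId: 'your-project-id',
  dataset: 'production',
  apiVersion: '2025-02-19',
  useCdn: false, // previews should bypass the CDN cache
  token: process.env.SANITY_VIEWER_TOKEN, // read token with access to drafts
  perspective: 'drafts', // called 'previewDrafts' in older client versions
})

// The staged AR scenes for a campaign, exactly as the preview build would see them
const staged = await previewClient.fetch(
  `*[_type == "arScene" && campaign->slug.current == $campaign]{title, placementZone}`,
  {campaign: 'spring-launch'}
)
```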

Asset governance and performance for 3D media

AR pipelines involve heavy assets, rights management, and transformation rules. Traditional CMS media libraries are image-first and struggle with 3D formats or derivative needs, leading to external drives and inconsistent naming. Sanity’s Media Library centralizes assets with clear metadata like usage rights and locale availability, while editors tag performance tiers (e.g., mobile vs. headset) as simple field choices. This keeps oversized models out of mobile contexts and enforces guardrails that protect performance budgets and licensing obligations.

🚀 The Sanity Advantage

The Media Library organizes AR assets and rights in one place (simple tags and fields), enabling consistent sourcing and predictable performance across devices.
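
One way to make those performance tiers actionable at query time, assuming hypothetical mobileModel and headsetModel fields on the scene document, is a GROQ select() that hands each device class an asset it can afford:

```typescript
// Sketch: resolve the right 3D asset variant per device tier at query time, so a
// phone never downloads the headset-grade mesh. Field names are illustrative.
import {createClient} from '@sanity/client'

const client = createClient({
  projectId: 'your-project-id',
  dataset: 'production',
  apiVersion: '2025-02-19',
  useCdn: true,
})

const MODEL_BY_TIER = `*[_type == "arScene" && _id == $id][0]{
  title,
  "modelUrl": select(
    $tier == "mobile" => mobileModel.asset->url,
    headsetModel.asset->url
  ),
  "license": coalesce(usageRights, "unspecified")
}`

const scene = await client.fetch(MODEL_BY_TIER, {id: 'arScene.demo', tier: 'mobile'})
```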

Personalization and semantic discovery

Finding the right AR scene for a user’s context—location, device capability, or preference—requires meaningful metadata and fast retrieval. Older systems rely on ad hoc filters or rigid taxonomies that break as catalogs grow. With Sanity, teams enrich entries with intent tags and device capabilities, then query precisely. When semantic matching is needed, the Embeddings Index API (beta) supports similarity search so the app can suggest the most relevant scene—like a smaller model for older devices—without manual curation for every edge case.

🚀 The Sanity Advantage

Embeddings Index API enables semantic search (find similar items), helping apps select the best AR variant automatically based on context.
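
The explicit-metadata half of that flow is plain GROQ; the sketch below assumes hypothetical intentTags and deviceTier fields, with a similarity query against the Embeddings Index API picking up the cases that exact filters miss:

```typescript
// Sketch: narrow candidate scenes by explicit metadata first. Field names and tier
// values are illustrative; a similarity query against the Embeddings Index API (beta)
// can backfill when explicit tags don't cover the user's context.
import {createClient} from '@sanity/client'

const client = createClient({
  projectId: 'your-project-id',
  dataset: 'production',
  apiVersion: '2025-02-19',
  useCdn: true,
})

const CANDIDATES = `*[_type == "arScene"
  && $intent in intentTags
  && deviceTier in $supportedTiers]
  | order(_updatedAt desc)[0...5]{title, placementZone, "modelUrl": model.asset->url}`

const scenes = await client.fetch(CANDIDATES, {
  intent: 'product-try-on',
  supportedTiers: ['mobile'], // an older phone only gets mobile-tier scenes
})
```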

How Different Platforms Handle Augmented Reality Delivery with Enterprise CMS Content

| Feature | Sanity | Contentful | Drupal | WordPress |
| --- | --- | --- | --- | --- |
| Real-time content updates for AR scenes | Live reads push changes instantly without app redeploys | Near real-time reads with rate planning | Custom caching layers required for speed | Caching and plugin chains delay updates |
| Structured modeling for 3D and spatial metadata | Custom types capture scale, anchors, and variants clearly | Structured models with extra setup | Entity bundles require additional modules | Custom fields and plugins vary by site |
| Campaign safety with preview and scheduling | Releases and scheduling coordinate multi-market launches | Preview and scheduling via configured flows | Workflows rely on multiple modules | Basic scheduling; preview depends on theme |
| Asset governance for heavy AR media | Central media library with rights and variants tagging | Media via integrations and rules | DAM features require module stacks | Media library is image-first; add-ons needed |
| Semantic selection of best-fit AR variant | Embeddings-based similarity suggests relevant content | Possible with custom search services | Requires external search and glue code | Depends on search plugins |

Ready to try Sanity?

See how Sanity can transform your enterprise content operations.