Optimizing Layout Service and API Caching in Sitecore Headless

March 7, 2025 · 3 min read

Tags: Sitecore, Headless, JSS, API, Caching, Experience Edge

Performance is critical in headless Sitecore projects. Efficient use of APIs and smart caching can drastically improve page speed and scalability while reducing hosting costs. In XM Cloud projects, the Experience Edge GraphQL Delivery API is the preferred mechanism for content delivery, with the Layout Service still available for legacy and hybrid use cases.

Understanding API Options

Sitecore Headless provides multiple delivery mechanisms:

  • Experience Edge (GraphQL)

    • Delivery endpoint → optimized for public traffic, cached at the CDN.
    • Preview endpoint → uncached, for Pages authoring & in-context editing.
  • Layout Service

    • Still supported, but primarily for legacy solutions.
    • In new builds, GraphQL Delivery is preferred for performance and flexibility.
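
To make the Delivery option concrete, here is a minimal sketch of posting a query to the Experience Edge Delivery endpoint with plain fetch. The endpoint URL, the EDGE_TOKEN variable name, and the sc_apikey header are assumptions; take the exact values from your own XM Cloud environment.

TS
// lib/edge-delivery.ts
// Sketch: query the Experience Edge Delivery endpoint directly.
// Endpoint URL, env variable names, and auth header are assumptions.
const EDGE_ENDPOINT =
  process.env.EDGE_ENDPOINT ?? 'https://edge.sitecorecloud.io/api/graphql/v1';
const EDGE_TOKEN = process.env.EDGE_TOKEN ?? '';

export async function queryEdge<T>(
  query: string,
  variables: Record<string, unknown> = {}
): Promise<T> {
  const res = await fetch(EDGE_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Delivery token for Experience Edge; header name may differ in your setup.
      sc_apikey: EDGE_TOKEN,
    },
    body: JSON.stringify({ query, variables }),
  });

  if (!res.ok) {
    throw new Error(`Edge query failed: ${res.status}`);
  }

  const { data, errors } = await res.json();
  if (errors?.length) {
    throw new Error(errors[0].message);
  }
  return data as T;
}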

Caching Layers in XM Cloud

  1. Experience Edge CDN

    • GraphQL Delivery API responses are cached globally.
    • The Preview API is never cached (always fresh for authoring).
  2. Next.js Incremental Static Regeneration (ISR)

    • Pages are statically generated and revalidated on schedule.
    • Combine ISR with GraphQL Delivery to minimize API calls.
  3. Application-Level Cache (optional)

    • Use @apollo/client or similar for short-lived client-side caching.
    • Good for frequently reused fragments (menus, footers).
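
For that third layer, a minimal Apollo Client setup might look like the sketch below; it is also the client imported as ../lib/apollo-client in the next example. The environment variable names are placeholders, and InMemoryCache simply deduplicates identical queries so shared fragments (navigation, footer) are fetched once and reused across components.

TS
// lib/apollo-client.ts
// Sketch of an Apollo Client with a short-lived in-memory cache.
// Endpoint and API key env variable names are placeholders for illustration.
import { ApolloClient, HttpLink, InMemoryCache } from '@apollo/client';

const client = new ApolloClient({
  link: new HttpLink({
    uri: process.env.GRAPH_QL_ENDPOINT, // Experience Edge Delivery endpoint
    headers: { sc_apikey: process.env.SITECORE_API_KEY ?? '' },
  }),
  // InMemoryCache keeps query results on the client, so frequently reused
  // fragments (menus, footers) are not refetched on every render.
  cache: new InMemoryCache(),
});

export default client;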

Example: Next.js with ISR + GraphQL Delivery

TSX
// pages/[...path].tsx
import { GetStaticPaths, GetStaticProps } from 'next';
import { gql } from '@apollo/client';
import client from '../lib/apollo-client';

const PAGE_QUERY = gql`
  query PageQuery($path: String!) {
    item(path: $path) {
      id
      name
      ... on Page {
        title { value }
        body { value }
      }
    }
  }
`;

// A catch-all route also needs getStaticPaths; 'blocking' generates
// unknown paths on first request and then caches them like any ISR page.
export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [],
  fallback: 'blocking',
});

export const getStaticProps: GetStaticProps = async ({ params }) => {
  // For a [...path] route, params.path is a string[] (undefined for the root).
  const segments = params?.path;
  const path = '/' + (Array.isArray(segments) ? segments.join('/') : segments ?? '');

  const { data } = await client.query({ query: PAGE_QUERY, variables: { path } });

  return {
    props: {
      item: data.item,
    },
    revalidate: 60, // ISR: regenerate at most once every 60 seconds
  };
};

// Minimal page component so the route renders; adapt the markup to your design.
export default function Page({ item }: { item: any }) {
  return (
    <main>
      <h1>{item?.title?.value ?? item?.name}</h1>
      <div dangerouslySetInnerHTML={{ __html: item?.body?.value ?? '' }} />
    </main>
  );
}

✅ revalidate: 60 ensures the page is rebuilt at most once per minute — balancing freshness and performance.

When to Use Layout Service

  • Legacy integrations already built on Layout Service contracts.
  • Complex presentation scenarios where rendering variants aren’t yet migrated to GraphQL.
  • Transitional projects upgrading from XP/XM to XM Cloud.

If starting fresh: use GraphQL Delivery.
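
For those legacy and transitional cases, fetching layout data is a plain REST call against the JSS Layout Service endpoint. A rough sketch, assuming the default /sitecore/api/layout/render/jss route and a host and API key from your own environment:

TS
// lib/layout-service.ts
// Sketch of fetching layout data from the (legacy) Layout Service REST endpoint.
// Host, API key, and language defaults are assumptions; adjust to your setup.
export async function fetchLayout(itemPath: string, language = 'en') {
  const host = process.env.SITECORE_HOST ?? 'https://cm.example.com';
  const apiKey = process.env.SITECORE_API_KEY ?? '';

  const url =
    `${host}/sitecore/api/layout/render/jss` +
    `?item=${encodeURIComponent(itemPath)}` +
    `&sc_lang=${language}` +
    `&sc_apikey=${apiKey}`;

  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Layout Service request failed: ${res.status}`);
  }

  // The response contains sitecore.context and sitecore.route
  // (placeholders, components, and field values).
  return res.json();
}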

Common Pitfalls

⚠️ Don’t cache Preview endpoints → they must always reflect authoring changes in real-time.
⚠️ Beware of stale personalization → ISR serves a single cached render, so server-rendered variants go stale unless personalization runs client-side.
⚠️ Over-fetching in GraphQL → always limit fields to what your component needs.
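
On the over-fetching point, keep queries component-scoped. Below is a hypothetical navigation query that requests only what the menu actually renders; the NavigationItem template and navTitle field are illustrative, not real schema names.

TS
// Component-scoped query: only the fields the navigation menu renders.
// Template and field names (NavigationItem, navTitle) are hypothetical.
import { gql } from '@apollo/client';

export const NAV_QUERY = gql`
  query NavQuery($rootPath: String!) {
    item(path: $rootPath) {
      children {
        results {
          id
          url { path }
          ... on NavigationItem {
            navTitle { value }
          }
        }
      }
    }
  }
`;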

Best Practices Checklist

  • ✅ Use Experience Edge Delivery API for production traffic.
  • ✅ Combine ISR with GraphQL queries for scalable builds.
  • ✅ Never cache Preview endpoints.
  • ✅ Cache fragments (menus, footers) at the component level for extra wins.
  • ✅ Migrate Layout Service usage to GraphQL when possible.

🚀 With smart caching and the right API strategy, Sitecore JSS + XM Cloud apps deliver fast, scalable, and resilient digital experiences.
