
Why You Should Never Let Your Metadata Hydrate

Client-side metadata is a silent killer for organic rankings; discover why static injection remains the only way to guarantee search bots see your site correctly.


Ever wonder why your beautifully crafted React app looks like a broken link when you share it on Slack or Twitter, even though you’re using a fancy library to update your document head?

It’s a trap many of us fell into during the "Single Page App" gold rush. We assumed that because Google *can* crawl JavaScript, we could just let our app mount, wait for a useEffect to trigger, and then update the <title> and <meta> tags.

Technically, it works. But in the high-stakes world of SEO and social sharing, "technically works" is usually just another way of saying "it's broken half the time."

The Lie of "Two-Wave" Indexing

We’ve all heard the pitch: Googlebot renders JavaScript. While that's true, Googlebot is also incredibly lazy (or rather, resource-conscious).

When Google crawls a site, it performs "two-wave indexing." First, it scrapes the raw HTML. If it sees a bunch of empty <div> tags and a script bundle, it puts that page in a queue to be rendered later when it has spare CPU cycles. This "later" can be hours or even days. If your metadata only appears *after* JavaScript executes, your site is invisible to the first wave.

But here’s the kicker: Social media bots don't even try.

If you share a link on Discord, LinkedIn, or iMessage, their crawlers fire off a single GET request and read whatever strings come back in the raw HTML. They aren't going to spin up a headless Chrome instance just to see what your og:description is. They’ll just show your root index.html title—usually something generic like "Vite + React"—and your click-through rate will plummet.
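
To make that concrete, here's a rough sketch of all a link-preview bot effectively does: one request, one pass over the raw HTML, zero JavaScript execution. The function name and URL below are made up for illustration; real bots have their own parsers, but the principle is the same.

// A rough approximation of a link-preview bot: one GET, no JavaScript.
// Runs on Node 18+ (global fetch); the URL is a placeholder.
async function previewLikeABot(url: string) {
  const html = await (await fetch(url)).text();

  // Only strings present in the raw HTML count; nothing React injects later.
  const title = html.match(/<title>(.*?)<\/title>/i)?.[1];
  const description = html.match(
    /<meta[^>]+property=["']og:description["'][^>]+content=["']([^"']*)["']/i
  )?.[1];

  console.log({ title, description });
}

previewLikeABot("https://example.com/products/42");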

What "Hydrated" Metadata Looks Like

In a typical client-side React app, you might see something like this using a library like react-helmet:

// This is the "bad" way for SEO
import { Helmet } from "react-helmet";

function ProductPage({ product }) {
  return (
    <div>
      <Helmet>
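        {/* These tags are written into the document head only after React
            mounts in the browser; they never appear in the server's HTML. */}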
        <title>{product.name} | My Awesome Store</title>
        <meta name="description" content={product.description} />
      </Helmet>
      <h1>{product.name}</h1>
    </div>
  );
}

When a bot hits this URL, the raw source code it receives looks like this:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>My App</title> <!-- Default title from public/index.html -->
  </head>
  <body>
    <div id="root"></div>
    <script src="/assets/index.js"></script>
  </body>
</html>

The metadata hasn't "hydrated" yet. The JavaScript hasn't run, the product data hasn't been fetched, and Helmet hasn't injected the tags into the DOM. To a bot, every single page on your site looks identical.

Static Injection: The Only Real Fix

The solution isn't to get better at client-side rendering; it’s to stop doing it for metadata entirely. You need your server (or your build process) to bake that metadata into the HTML before it ever leaves the building.

If you’re using a framework like Next.js, this is handled through the Metadata API, which ensures the tags are in the initial response.

// layout.tsx or page.tsx in Next.js (App Router)
import type { Metadata } from 'next'

export async function generateMetadata(
  { params }: { params: { id: string } }
): Promise<Metadata> {
  const product = await getProduct(params.id);
 
  return {
    title: `${product.name} | My Awesome Store`,
    description: product.description,
    openGraph: {
      images: [product.image],
    },
  }
}

export default function Page() {
  // No metadata concerns here; the <head> is already handled above.
  return <main>{/* ... page content */}</main>;
}

By doing this, the server-side process fetches the data and constructs the <head> before sending the bytes to the client. When the bot hits the page, it gets the full story immediately. No waiting, no "second wave," and no broken previews.
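
For the hypothetical product page above, the raw source a bot receives now looks something like this (the product values are invented for illustration):

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <!-- Illustrative values, rendered on the server from the fetched product -->
    <title>Ergonomic Desk Chair | My Awesome Store</title>
    <meta name="description" content="A chair your back will actually thank you for." />
    <meta property="og:image" content="https://example.com/images/chair.jpg" />
  </head>
  <body>
    <!-- server-rendered page content -->
  </body>
</html>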

The "Gotcha" with Static Sites

I’ve seen developers move to SSG (Static Site Generation) and still mess this up. They’ll generate a static index.html for the whole site but then use a client-side fetch to get the page content and update the title.

If you are using a static generator, you must ensure that your build step generates a unique HTML file for every route. If you have 1,000 products, you should have 1,000 .html files (or a server that generates them on the fly). If they all point back to the same generic index.html, you’ve just reinvented the client-side metadata problem with extra steps.
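
In Next.js's App Router, for instance, that per-route generation is what generateStaticParams handles. Here's a minimal sketch, assuming a getAllProducts helper that loads your catalogue; that helper and its import path are placeholders, not a Next.js API.

// app/products/[id]/page.tsx
import { getAllProducts } from "@/lib/products"; // hypothetical data helper

// Tells the build which product routes exist, so every product gets its
// own pre-rendered HTML file with its own <head> baked in.
export async function generateStaticParams() {
  const products = await getAllProducts();
  return products.map((product) => ({ id: String(product.id) }));
}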

Keep It Boring

SEO is one area where "boring" technology wins every time. A raw HTML string is more reliable than the most sophisticated JavaScript framework in the world.

If you're auditing your site right now, do the "View Source" test. Don't use the Inspect tool (which shows the current state of the DOM after JS has run). Right-click and select View Page Source. If you don't see your page title and meta tags staring back at you in that wall of text, you’re losing rankings.
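
If you want to make that check repeatable, a few lines of Node will run it across your routes. The URLs and the GENERIC_TITLE value below are placeholders; swap in your own routes and whatever default title ships in your index.html.

// A quick-and-dirty audit: flag any route still serving the default title.
const GENERIC_TITLE = "My App"; // whatever your bare index.html ships with

async function auditTitles(urls: string[]) {
  for (const url of urls) {
    const html = await (await fetch(url)).text();
    const title = html.match(/<title>(.*?)<\/title>/i)?.[1] ?? "(none)";
    const ok = title !== "(none)" && title !== GENERIC_TITLE;
    console.log(`${ok ? "OK  " : "FAIL"} ${url} -> ${title}`);
  }
}

auditTitles([
  "https://example.com/",
  "https://example.com/products/42",
]);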

Stop letting your metadata hydrate. Inject it, bake it, or server-render it—just make sure it’s there when the door opens.