Enhancing SEO: Next.js Markdown, Dynamic Sitemaps, and Meta Tags

By Priyash Patil | Updated: Oct 21, 2023 12PM IST

This is a follow-up to my previous post on Adding code syntax highlighting. If you want to follow along with this post, you can find the source code at this GitHub Revision.

Problem statement

As you may know, this blog was started on the basis of the Learn Next.js tutorial. That guide already provides an excellent explanation of SEO, but this blog is still missing it because it's a Markdown-only blog. So based on that, I've prepared the following list of features I want, at least for now:

- A dynamic sitemap.xml
- Meta tags for social media sharing
- A robots.txt for crawlers

Since the Next.js tutorial already has a guide on SEO, I'll be following that to achieve the above goals.

Dynamic sitemap.xml

The Learn Next.js tutorial has an example for a dynamic sitemap.xml. It uses Next.js's SSR feature to dynamically return sitemap.xml.

So, I'll be creating a new file under pages named sitemap.xml.tsx with the following content.

import { GetServerSideProps } from "next";
import { getAllCategoryIds, getSortedPostsData } from "../lib/posts";

const POSTS_ENDPOINT_URL = "https://www.example.com/posts";
const CATEGORIES_ENDPOINT_URL = "https://www.example.com/categories";

function generateSiteMap(
  posts: { id: string }[],
  categories: { id: string }[]
) {
  return `<?xml version="1.0" encoding="UTF-8"?>
   <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
     <!--We manually set the two URLs we know already-->
     <url>
       <loc>https://www.example.com</loc>
     </url>
     ${posts
       .map(({ id }) => {
         return `
       <url>
         <loc>${`${POSTS_ENDPOINT_URL}/${id}`}</loc>
       </url>`;
       })
       .join("")}
     ${categories
       .map(({ id }) => {
         return `
       <url>
         <loc>${`${CATEGORIES_ENDPOINT_URL}/${id}`}</loc>
       </url>`;
       })
       .join("")}
   </urlset>`;
}

function SiteMap() {
  // getServerSideProps will do the heavy lifting
}

export const getServerSideProps: GetServerSideProps = async ({ res }) => {
  const posts = await getSortedPostsData();
  const categoriesIds = await getAllCategoryIds();
  const categories = categoriesIds.map((cat) => {
    return { id: cat.params.id };
  });

  // We generate the XML sitemap with the posts data
  const sitemap = generateSiteMap(posts, categories);

  res.setHeader("Content-Type", "text/xml");
  // we send the XML to the browser
  res.write(sitemap);
  res.end();

  return {
    props: {},
  };
};

export default SiteMap;
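Note that sitemap.xml.tsx only needs each helper to return objects carrying an id. As a rough sketch (these are hypothetical stubs, not the real implementations, which come from earlier posts in this series), the shapes it assumes from lib/posts look like this:

// lib/posts.tsx — hypothetical stubs showing only the shapes
// that sitemap.xml.tsx relies on
export function getSortedPostsData(): { id: string }[] {
  // real version: read posts/*.md, parse front matter, sort by date
  return [{ id: "example-post" }];
}

export function getAllCategoryIds(): { params: { id: string } }[] {
  // real version: collect unique category ids across all posts,
  // wrapped in Next.js dynamic-route params objects
  return [{ params: { id: "example-category" } }];
}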

Meta tags

One of the reasons I wanted to fix SEO is link previews on social media. Adding a few tags related to social media sharing makes for a better experience when sharing links on social media sites.

To add the social-media-related meta tags, I made the following changes to pages/posts/[id].tsx and lib/posts.tsx.

// pages/posts/[id].tsx
// ... (imports include Head from "next/head" and the existing Layout component)

export default function Post({
  postData,
}: {
  postData: {
    title: string;
    id: string;
    excerpt: string;
    date: string;
    contentHtml: string;
    categories: string[];
  };
}) {
  return (
    <Layout>
      <Head>
        {/* metadata */}
        <meta name="title" content={postData.title} />
        <meta name="description" content={postData.excerpt} />

        {/* og metadata */}
        <meta property="og:type" content="website" />
        <meta
          property="og:url"
          content={"https://www.example.com/posts/" + postData.id}
        />
        <meta property="og:title" content={postData.title} />
        <meta property="og:description" content={postData.excerpt} />

        {/* twitter metadata */}
        <meta property="twitter:card" content="summary_large_image" />
        <meta property="twitter:title" content={postData.title} />
        <meta
          property="twitter:url"
          content={"https://www.example.com/posts/" + postData.id}
        />
        <meta property="twitter:description" content={postData.excerpt} />
      </Head>
      {/* ... */}
    </Layout>
  );
}

// lib/posts.tsx
// ...
export async function getPostData(id: string): Promise<{
  date: string;
  title: string;
  id: string;
  excerpt: string;
  contentHtml: string;
  categories: string[];
}> {
  // ...
  return {
    id,
    contentHtml,
    excerpt: matterResult.data.excerpt ?? "Priyash Patil Blog",
    ...(matterResult.data as { date: string; title: string }),
    categories: matterResult.data.categories ?? ["Uncategorized"],
  };
}

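For context, this extended getPostData is consumed by the post page's getStaticProps, following the Learn Next.js pattern. A minimal sketch:

// pages/posts/[id].tsx — how getPostData feeds the Post component
import { GetStaticProps } from "next";
import { getPostData } from "../../lib/posts";

export const getStaticProps: GetStaticProps = async ({ params }) => {
  // fetch the parsed post (front matter + rendered HTML) at build time
  const postData = await getPostData(params?.id as string);
  return {
    props: {
      postData,
    },
  };
};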
To get the meta tags working, we need to ensure the following front matter attributes exist in all Markdown files.

title: "Adding code syntax highlighting to my Next.js Markdown blog"
date: "2022-02-15"
excerpt: "One of the very important feature of any tech blog is code syntax highlighting.
Browsers by default doesn't support code highlighting other than just very basic different font-family."
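Since a missing attribute would silently fall back to the defaults shown above, it can be worth checking every post once. Here is a hypothetical helper script (the posts directory name and script path are assumptions) using gray-matter, which the blog already uses for front matter parsing:

// scripts/check-front-matter.ts — hypothetical sanity check
import fs from "fs";
import path from "path";
import matter from "gray-matter";

const postsDirectory = path.join(process.cwd(), "posts");
const required = ["title", "date", "excerpt"] as const;

for (const fileName of fs.readdirSync(postsDirectory)) {
  // parse the front matter of each Markdown file
  const fullPath = path.join(postsDirectory, fileName);
  const { data } = matter(fs.readFileSync(fullPath, "utf8"));
  // warn about any required attribute that is absent
  const missing = required.filter((key) => data[key] == null);
  if (missing.length > 0) {
    console.warn(`${fileName} is missing: ${missing.join(", ")}`);
  }
}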

Missing robots.txt

For crawlers, I added a robots.txt in the public directory with the following content.

# Allow all crawlers
User-agent: *
Allow: /
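Optionally, the same file can point crawlers straight at the sitemap using the standard Sitemap directive (the domain here is a placeholder, matching the examples above):

# Point crawlers at the dynamic sitemap
Sitemap: https://www.example.com/sitemap.xml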

Source code

The final source code is on this GitHub Revision. Do note that this repository is not the one behind my actual website; I'll be maintaining a separate source code repository for this blog series.


I think the requirement of doing basic SEO is fulfilled, though there's a lot more that could be done. SEO is not a one-time thing, so I might have more posts on it coming up in the future. Stay tuned for that.