Search Engine Optimization (SEO) improves your website’s ranking on the Search Engine Results Page (SERP). An exceptional SEO strategy can attract customer traffic and provide growth in revenue and sales. Let us look at some of the SEO best practices that you can implement with Contentstack to improve your site’s visibility on a search engine.
Meta descriptions and title tags provide a brief introduction to your website. This introductory content appears on a SERP when a customer’s search query matches keywords in your website’s metadata.
Title tags and meta descriptions are pieces of HTML code placed within the <head> tag of a web page.
Let us look at how you can assign appropriate meta descriptions and title tags to your web pages with Contentstack.
While defining the structure of your Contentstack website, perform the following steps:
Additional Resource: Want to know more about content types? Check out the Create Content Types section that has a set of guides with all the details that you need to know.
This will make sure you have relevant metadata in place when you decide to publish content to your website.
Perform the following steps to add SEO related tags to your website code:
<meta name="title" content="an appropriate title for your website content">
This code assigns a title to your meta tag.
<meta name="description" content="appropriate description for your website content.">
This code assigns description details to your <meta> tag.
This code will make sure that the necessary SEO calls fetch and display appropriate metadata on your website.
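Put together, a page’s <head> section might look like the following minimal sketch. The title and description text are placeholders; note that the <title> element is what search engines typically display as the clickable headline on a SERP:

```html
<head>
  <!-- The <title> element is shown as the page's headline on the SERP -->
  <title>An appropriate title for your website content</title>
  <meta name="title" content="An appropriate title for your website content">
  <meta name="description" content="An appropriate description for your website content.">
</head>
```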
Displaying images is often the best way to explain a specific feature of your company’s software. Images give users a direct view of the steps involved in using a particular feature.
The “alt” (short for “alternate”) text appears in place of an image on a website if the image fails to load on a user’s screen. This alternate text can also describe an image while it loads on the screen. Alt text improves SEO because it gives search engine crawlers a textual description of the image, making your images and pages easier to index.
Contentstack also uses alt text to provide information about the images uploaded to our website. Follow the steps below to add “alt” text to images within Contentstack:
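On the rendered page, the alt text ends up in the image tag’s alt attribute. A minimal example, where the file name and description are placeholders:

```html
<img src="feature-screenshot.png"
     alt="Screenshot of the dashboard showing the publish workflow">
```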
If you happen to change the URL of an already-published entry, it is highly advised that you redirect the previous URL to the newer one. This is considered an appropriate SEO best practice. Users trying to visit the older link to the website will be automatically taken to the newer link, instead of seeing a “Page not found” error.
Additional Resource: We have a detailed, step-by-step guide on Redirecting URLs that you can refer to for more details.
Let’s see what you need to do to manage URL redirects for your Contentstack-powered websites.
You need to create a separate content type to manage the redirection of the changed URLs.
If you change the URL of any published entry, create an entry for your Redirect content type.
If any user visits the old URL, they will be redirected to the new URL automatically.
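The steps above can be sketched in server-side code. The entry shape and field names below are illustrative assumptions, not Contentstack’s actual schema; in a real site you would fetch the Redirect entries via the Contentstack API and respond with an HTTP 301 when a match is found:

```javascript
// Hypothetical redirect entries, as they might be modeled in a
// "Redirect" content type: each entry maps an old URL to a new one.
const redirectEntries = [
  { oldUrl: "/blog/launch-announcement", newUrl: "/blog/product-launch" },
  { oldUrl: "/docs/setup", newUrl: "/docs/getting-started" },
];

// Look up an incoming path against the redirect entries.
// Returns the new URL if a redirect exists, otherwise null.
function resolveRedirect(entries, path) {
  const match = entries.find((entry) => entry.oldUrl === path);
  return match ? match.newUrl : null;
}

// The server would issue a 301 (permanent) redirect when this returns a URL.
console.log(resolveRedirect(redirectEntries, "/docs/setup")); // → "/docs/getting-started"
console.log(resolveRedirect(redirectEntries, "/pricing"));    // → null
```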
Broken links can affect your website in the following ways:
Additional Resource: Contentstack uses Xenu for finding broken links on the website. This tool is available online, and you can download it for free. It connects to your website and checks for broken links by crawling through web pages just like any search engine would.
A canonical tag specifies to the search engine that a particular URL is a master copy of a specific page. This tag indicates which version of a particular URL should be displayed in the search results that a search engine derives.
Using the canonical tag helps prevent the same content on multiple URLs from being treated as duplicates. It allows the search engine to easily crawl the web and pick out the URL mentioned in the canonical tag when browsing your website. Easy navigation enables better SEO rankings.
Let us consider a sample website that can be reached through all of the following links:
A search engine will consider every single one of these URLs as a unique webpage. However, they are five similar copies of the homepage. Here, setting a canonical tag that points to a specific URL (e.g., https://www.samplesite.com) will tell the search engine to include just that URL in the search results.
When setting up the code for your website on Contentstack, add the following line of code inside the <head> tag of your HTML:
<link rel="canonical" href="your preferred website address">
In the href value, add the version of your website address that should be displayed on the search results page.
A sitemap provides a list of all the important URLs that exist on your website. The sitemap forms a roadmap that a search engine such as Google can follow to reach a particular URL on your website.
The use of a sitemap benefits organizations that have websites:
By providing a list of all the essential web pages on a sitemap, you indicate to a search engine that these are appropriate landing pages on your website. The sitemap then provides an easily accessible index for the search engine to traverse through your website intelligently. This offers better on-site SEO.
You can use service providers such as Google’s online XML sitemap generator service to make use of sitemaps with Contentstack. These sitemaps are normally stored within a file named sitemap-index.xml.
You can also create a plugin that autogenerates the sitemap-index.xml file when a particular entry on your Contentstack website is published.
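A sitemap index file typically lists the individual sitemap files, each of which in turn lists page URLs. A minimal sketch of such a sitemap-index.xml, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.samplesite.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.samplesite.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```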
High-quality content attracts search engines and visitors to your website. Visitors prefer crisp and engaging content that sticks to the main topic that your website is all about. Users can find relevant content and may end up visiting related sections of your site too.
Content that promotes good on-site SEO should contain the following attributes:
When search engines can easily crawl a particular website URL, on-site SEO rankings improve. Search engine-friendly URLs provide users with an appropriate path to their desired destination.
Let us see how URLs can be optimized for SEO.
It is a good practice to include keywords in the URL that tell crawlers what the webpage is about. When you have the most relevant term alongside the root domain of the website URL, this helps search engines direct the right traffic to your site.
Consider a website with the URL https://example.com/headlesscms. Here, the keyword headlesscms specifies what the webpage is about.
An SEO-friendly URL should flow logically from the root domain to categories and sub-categories. This hierarchy will allow search engines and users to crawl through your website with ease.
Consider the following URLs:
The first URL is SEO-friendly, as it flows smoothly toward the CMS page, while the second confuses users with its inappropriate categorization.
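As a hypothetical illustration of this kind of hierarchy (these URLs are invented for the example, not taken from a real site):

```text
https://www.samplesite.com/products/cms/headless-cms   (flows from domain to category to sub-category)
https://www.samplesite.com/headless-cms/products       (category order inverted, confusing crawlers and users)
```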
Here are a few more practices that help maintain SEO-friendly URLs on your website:
It is a good practice to optimize the header tags on your website for a better user experience and on-site SEO. The use of SEO-friendly headlines makes your content easier to browse through and improves SEO ranking for your website.
Let us look at some best practices to be followed while using header tags for your website.
An H1 tag provides the most significant headline available on your webpage. This tag usually represents your web page's title or other significant content that should stand out from the remaining data on the web page.
It is considered a best practice to use only one H1 tag on your web page, as more than one title would confuse visitors who scan through your website.
For example, the Contentstack website follows the same ideology on its webpage. The webpage maintains “Industry Leader in Headless CMS” as the sole H1 tag, providing better readability for its visitors.
The use of a single header tag enhances user experience and improves SEO rankings for the website.
The header tags shape the content on your website and provide an appropriate context for the data present on your webpage. To provide the appropriate flow of information on your website, you can use different header tags to structure your content.
Let us take a look at the different types of header tags:
For example, the H1 tag would create the title of your blog post, while the tags ranging from H2 to H6 would form the subsequent sub-headings inside your blog post.
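In HTML, that blog-post structure might look like the following sketch (the heading text other than the H1 is a placeholder):

```html
<h1>Industry Leader in Headless CMS</h1>  <!-- page title: one H1 only -->
  <h2>What is a Headless CMS?</h2>        <!-- major section -->
    <h3>Benefits for Developers</h3>      <!-- sub-section within it -->
  <h2>Getting Started</h2>                <!-- next major section -->
```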
Robots.txt is a text file that tells search engines which pages on a website to crawl and which to skip.
Let’s consider the following basic syntax that lies within the “Robots.txt” file:
User-agent: *
Disallow: /
Here, the asterisk addresses all the search engines or web crawlers that crawl the website, while the slash instructs them not to crawl any page of the website, including the homepage.
Here is another way to instruct a specific web crawler to stop crawling a specific web page:
User-agent: Googlebot
Disallow: /example
This syntax instructs Google’s crawler agent (Googlebot) not to crawl any webpage whose URL includes the /example string, for instance, www.website.com/example.
By default, any search engine that visits your website will crawl through all of its webpages. However, you may not want the engine to crawl every page, or you may want it to crawl in a structured way.
Now, this is where the “Robots.txt” file comes in handy. In this file, you can tell the search engine to crawl or not crawl certain pages. This helps improve SEO rankings for your website.
As you create more pages for your site, you need to keep updating this file. While you can manage this file on your own, you can also use Contentstack to manage the content of this file, so that updating content becomes easy.
You can do this by creating a single content type called “no-index”, with a URL field (marked as multiple). If there are any pages that shouldn’t be crawled, the content manager or the developer can add the relevant URLs and publish the entry.
The developer maintaining the site code needs to ensure that URLs published as part of this entry should be:
An alternate way is to create a content type called “Robots.txt” with different multi-line textbox fields for different environments. When you create an entry, you can have the whole Robots.txt content for each environment in its respective field. The developer just needs to ensure that the changes you make here are updated in the original Robots.txt.
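The “no-index” approach above can be sketched as a build step that assembles the robots.txt content from a published entry. The entry shape and field names here are illustrative assumptions, not Contentstack’s actual schema:

```javascript
// Hypothetical shape of a published "no-index" entry: a list of
// paths that should not be crawled.
const noIndexEntry = {
  urls: ["/internal/", "/drafts/", "/search"],
};

// Build the robots.txt content: disallow each listed path for all crawlers.
function buildRobotsTxt(entry) {
  const lines = ["User-agent: *"];
  for (const url of entry.urls) {
    lines.push(`Disallow: ${url}`);
  }
  return lines.join("\n");
}

console.log(buildRobotsTxt(noIndexEntry));
// User-agent: *
// Disallow: /internal/
// Disallow: /drafts/
// Disallow: /search
```

The developer would write this output to the site’s robots.txt whenever the entry is republished.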