SEO, or Search Engine Optimization, is a set of practices and strategies aimed at improving a website's visibility on search engines. The primary goal of SEO is to improve a website's organic (non-paid) search engine rankings, making it more likely to appear prominently in search results when users search for a relevant keyword. You can buy visibility through advertisements and promotions, but for small businesses in particular that is often not a feasible strategy. A good SEO strategy, on the other hand, can improve a site's position in Search Engine Result Pages (SERPs) in a very economical way.
There are 3 main steps involved when you perform a search on any search engine -
Crawling - search engine bots follow links across the web to discover pages.
Indexing - the discovered pages are analyzed and stored in the search engine's index.
Ranking - when a user searches, the engine picks the most relevant indexed pages and orders them by relevance and quality.
So what are the factors that can make our website more friendly to search engines? Here's a checklist of elements that provide essential information about your content to search engines, helping them understand the context and relevance of your pages to users.
Text written within the title tag appears as the clickable headline in search engine results pages (SERPs) and is crucial for both user experience and SEO. It directly helps search engines understand what the page is about and show it to users looking for such content.
Meta tags provide additional information that helps search engines better understand the content of a page. Details like the description, keywords, publish date, authors and media are provided through meta tags. This is what the basic structure looks like -
<html lang="en">
<head>
<title>Optimizing Your Website for SEO</title>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="description" content="Learn how to optimize your website for better search engine rankings.">
<meta name="keywords" content="SEO, website optimization, search engine rankings">
</head>
<body>
<!-- Your webpage content goes here -->
</body>
</html>
A well-structured URL not only provides a good user experience but also helps search engines understand and navigate your website's architecture. Avoid using IDs and hashes in URLs, and try to include keywords instead. For example -
Wrong way - https://example.com/products/hcu457nfc4iix32dj8d45d
Right way - https://example.com/products/herbals-facewash-for-men
This also improves the readability of the URL.
Canonical tags are HTML elements that specify the preferred version of a page when multiple versions with similar content exist. They help address duplicate content issues and guide search engines on which version to index and rank. For example, your website might have an archived page, and you have now made a new page with fresh content that is pretty similar to the archived one. You want the new page to be indexed and ranked, and that's when you can make use of canonical referencing.
<link rel="canonical" href="https://www.example.com/preferred-url">
Core Web Vitals are a set of user-centered metrics introduced by Google to measure the performance and user experience of web pages. These vitals are crucial for SEO because Google treats user experience as a significant ranking factor, so improved Core Web Vitals can positively impact your website's SEO. In practice, this means building the website so that it loads fast, responds quickly to interaction, follows best practices, does not block essential content and provides a smooth user experience.
Fun fact - Google has reported that as page load time goes from one second to 10 seconds, the probability of a mobile site visitor bouncing increases by 123%.
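A few common front-end tweaks tend to help here: giving images explicit dimensions (less layout shift), lazy-loading below-the-fold images, deferring non-critical scripts and preloading critical assets. Here is a sketch with placeholder file names -

```html
<head>
  <!-- Preload the font used for above-the-fold text so it renders sooner -->
  <link rel="preload" href="/fonts/body-font.woff2" as="font" type="font/woff2" crossorigin>
  <!-- Defer non-critical JavaScript so it does not block the first paint -->
  <script src="/js/analytics.js" defer></script>
</head>
<body>
  <!-- Explicit width/height reserves space and avoids layout shift -->
  <img src="/images/hero.jpg" width="1200" height="600" alt="Hero banner">
  <!-- Lazy-load images that are below the fold -->
  <img src="/images/footer-banner.jpg" width="1200" height="300" alt="Footer banner" loading="lazy">
</body>
```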
This is probably the most underrated aspect of an SEO strategy. Let's learn a little more about sitemaps.
A sitemap is a file that provides information about the pages, videos, and other files on your website and the relationships between them. It serves as a roadmap for search engines, helping them navigate and understand the structure of your site. Search engine bots, such as Googlebot, use sitemaps to discover and crawl pages on your website more efficiently.
Sitemaps often include information about the priority and frequency of updates for each page. This allows you to signal to search engines which pages are more important or how frequently they are updated. If certain pages on your website are not linked to from other pages or have limited internal links, a sitemap can serve as a means for search engines to discover and index these otherwise less accessible pages.
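As a sketch, a minimal XML sitemap (sitemap.xml) with placeholder URLs and dates might look like this. The changefreq and priority fields carry the update-frequency and importance hints mentioned above -

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/products/herbals-facewash-for-men</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```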
Semantic HTML tags play a crucial role in enhancing the SEO of a website by providing a clear and meaningful structure to the content. Search engines rely on these semantic tags to understand the context and relevance of different parts of a webpage. This understanding allows search engines to better index and rank the content, ensuring that it appears in relevant search results.
These points can be helpful in writing semantic HTML -
Use <header> for the header section, <nav> for navigation links, <main> for the main content area, and <footer> for the footer section.
Use <ul>, <ol>, and <dl> for unordered lists, ordered lists, and description lists, respectively.
Always add descriptive alt text to the <img> tag. This is crucial for accessibility and helps search engines understand the content of images.

Backlinks, also known as inbound or incoming links, are links from one website to another. Backlinks play a crucial role in improving search engine rankings: search engines like Google view them as signals of a website's authority, relevance, and credibility. This is a non-technical, off-page strategy, but it plays an important role in SEO.
If reputable and high-authority websites link to your content, it signals to search engines that your site is credible and deserves higher rankings. Websites with more backlinks are often crawled more frequently by search engine bots, which means that new content on your site is more likely to be discovered and indexed quickly if you have a strong backlink profile.
robots.txt file

The robots.txt file itself doesn't directly improve SEO, but it plays a crucial role in controlling the crawling and indexing of a website. It allows website owners to specify which areas or pages of their site should or should not be crawled by search engine bots. By controlling which parts of the site are crawled, you can ensure that search engines focus on indexing the most important and relevant content. Here's an example of a very simple robots.txt file -
User-agent: *
Disallow: /private/
Disallow: /admin/
Allow: /public/
If there are areas of your website that contain private or sensitive information, like an admin panel or a dashboard page, you can use the robots.txt file to ask search engines not to crawl those sections and keep them out of search results. Keep in mind, though, that robots.txt is publicly readable and is only a request to crawlers, not an access control, so it should not be your only protection for sensitive content.
We can also provide the location of the sitemap file in robots.txt so that search engines can find and access it.
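For instance, a Sitemap directive in robots.txt looks like this (the URL is a placeholder) -

```txt
Sitemap: https://example.com/sitemap.xml
```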
So these were some points that can help you get started with SEO and make your webpages rank on search engine result pages. These are all tried and tested action steps, and I have implemented every one of them on this blog website. Just to give you a glimpse, here are the Google Chrome Lighthouse audit results from different pages of my website.
On mobile devices -
On desktop devices -