What is SEO
SEO, or Search Engine Optimization, is a set of practices and strategies aimed at optimizing a website's visibility on search engines. The primary goal of SEO is to improve a website's organic (non-paid) search rankings, making it more likely to appear prominently in search results when users search for a relevant keyword. You can improve a website's placement quickly by paying for advertisements and promotions, but for many businesses, especially small ones, that is not a feasible strategy. A good SEO strategy can instead increase their ranking in Search Engine Result Pages (SERPs) in a very economical way.
Why is it important for businesses
Visibility - Users are more likely to visit websites that appear at the top of search results. Improving SEO can raise your ranking in search results so that your website starts getting listed near the top. This can really improve your website's visibility and increase the number of visitors to your website.
Credibility - Websites listed higher in search results are considered more credible and trustworthy by users. This not only compels them to visit your site but also builds trust even before they land on your website.
Economical - Paid advertisements can improve your rankings quickly, but for someone just starting a new venture, they might not always be feasible. SEO might not bring immediate results, but in the long run it is always rewarding if practiced properly with the right strategy.
How search engines work
There are 3 main steps involved when you perform a search on any search engine -
Crawling - This step involves automated bots called crawlers. The role of these crawlers is to literally go (or crawl) from one webpage to other webpages whose links are given on that page, and collect the data from these pages. This data is then sent to a massive database. Well-behaved crawlers follow the rules mentioned in a special file called robots.txt. This file should be served from the root of your website (e.g. https://example.com/robots.txt) for crawlers to find it. In this file, we define rules for crawlers to follow while collecting information from our website.
Indexing - Once crawlers have done their job, the collected data is indexed in a massive database. Indexing refers to organizing data so that it can be fetched quickly. It involves sophisticated data structures (like the inverted index, a kind of hashmap from words to the documents that contain them) and algorithms (like stemming, lemmatization and many others) to optimize search queries and deliver quick search results.
Ranking - This step also involves complex search algorithms to sort the thousands of search results in order of relevance, considering keywords, backlinks, user experience, site credibility, security and many other metrics. The most search-engine-optimized pages are placed at the top of search engine result pages (SERPs).
So what are the factors that can help make our website more friendly to search engines? Here’s a comprehensive checklist which can provide essential information about your content to search engines, helping them understand the context and relevance of your pages to users.
Title tag and meta tags
Text written within the title tag appears as the clickable headline in search engine results pages (SERPs) and is crucial for both user experience and SEO. It directly helps search engines understand what the page is about and show it to users looking for such content.
Meta tags provide additional information that helps search engines better understand the details of the content on the page. Things like the description, viewport, publish date and authors are provided through meta tags (the keywords meta tag still exists, but most major search engines ignore it today). This is how the basic structure looks -
<head>
  <title>Optimizing Your Website for SEO</title>
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <meta name="description" content="Learn how to optimize your website for better search engine rankings.">
  <meta name="keywords" content="SEO, website optimization, search engine rankings">
</head>
<!-- Your webpage content goes here -->
Proper URL structure and canonical referencing
Having a well-structured URL not only provides a good user experience but also helps search engines understand your website's architecture and navigate it better. Avoid using IDs and hashes in URLs; try including keywords instead. For example -
Wrong way - https://example.com/products/hcu457nfc4iix32dj8d45d
Right way - https://example.com/products/herbals-facewash-for-men
This also improves the readability of the URL.
Canonical tags are HTML elements that specify the preferred version of a page when multiple versions with similar content exist. They help address duplicate content issues and guide search engines on which version to index and rank. For example, your website might have an archived page, and you have now made a new page with fresh content that is quite similar to the archived one. You want your new page to get indexed and ranked, and that's when you can make use of canonical referencing.
<link rel="canonical" href="https://www.example.com/preferred-url">
Improve the core web vitals and user experience
Core Web Vitals are a set of user-centered metrics introduced by Google to measure the performance and user experience of web pages. These vitals are crucial for SEO because Google considers user experience a significant ranking factor, so improved Core Web Vitals can positively impact your website's SEO. This involves developing the website so that it loads faster, has better interactivity, follows best practices, does not block essential content and provides a smooth user experience. Fun fact - Google has reported that as page load time goes from one second to 10 seconds, the probability of a mobile site visitor bouncing increases by 123%.
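As a small illustration, deferring non-critical resources is one of the easiest Core Web Vitals wins. A sketch (the file names here are hypothetical):

```html
<!-- Set explicit width/height so the browser reserves space
     and avoids layout shifts (CLS) while the image loads -->
<!-- loading="lazy" keeps below-the-fold images from competing
     with the main content for bandwidth -->
<img src="product-photo.jpg" alt="Herbal facewash for men"
     width="400" height="300" loading="lazy">

<!-- defer stops non-critical JavaScript from blocking rendering -->
<script src="analytics.js" defer></script>
```

Attributes like loading="lazy" and defer are standard HTML, but which resources to defer depends on your own page layout.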
Sitemaps
This is probably the most underrated aspect of SEO strategy. Let's learn a little more about sitemaps.
A sitemap is a file that provides information about the pages, videos, and other files on your website and the relationships between them. It serves as a roadmap for search engines, helping them navigate and understand the structure of your site. Search engine bots, such as Googlebot, use sitemaps to discover and crawl pages on your website more efficiently.
Sitemaps often include information about the priority and frequency of updates for each page. This allows you to signal to search engines which pages are more important or how frequently they are updated. If certain pages on your website are not linked to from other pages or have limited internal links, a sitemap can serve as a means for search engines to discover and index these otherwise less accessible pages.
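A sitemap is just an XML file listing your URLs along with optional metadata such as last modification date, change frequency and priority. A minimal sketch (the URLs and dates below are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/products/herbals-facewash-for-men</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

This file is conventionally served as sitemap.xml from the site root and can also be submitted directly through tools like Google Search Console.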
Semantic HTML
Semantic HTML tags play a crucial role in enhancing the SEO of a website by providing a clear and meaningful structure to the content. Search engines rely on these semantic tags to understand the context and relevance of different parts of a webpage. This understanding allows search engines to better index and rank the content, ensuring that it appears in relevant search results.
These points can be helpful in writing semantic HTML -
Instead of using a generic div for all the content, use <header> for the header section, <nav> for navigation links, <main> for the main content area, and <footer> for the footer section.
A proper headings hierarchy, from h1 to h6.
<ul>, <ol> and <dl> for unordered lists, ordered lists, and description lists, respectively.
Always include descriptive and meaningful alt attributes for images using the <img> tag. This is crucial for accessibility and helps search engines understand the content of images.
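Putting these points together, a semantic page skeleton might look like this (the page content and file names are illustrative):

```html
<header>
  <h1>Optimizing Your Website for SEO</h1>
  <nav>
    <a href="/blog">Blog</a>
    <a href="/about">About</a>
  </nav>
</header>
<main>
  <article>
    <h2>Why semantic tags matter</h2>
    <p>Search engines use these landmarks to understand page structure.</p>
    <img src="page-layout.png" alt="Diagram of a semantic HTML page layout">
  </article>
</main>
<footer>
  <p>&copy; example.com</p>
</footer>
```

Compared to a page built entirely from divs, this markup tells crawlers which part is navigation, which is the main content, and which is boilerplate.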
Backlinks
Backlinks, also known as inbound or incoming links, are links from one website to another. Backlinks play a crucial role in improving search engine rankings, because search engines like Google view them as signals of a website's authority, relevance, and credibility. This is a non-technical, off-page SEO strategy, but it plays an important role in SEO.
If reputable, high-authority websites link to your content, it signals to search engines that your site is credible and deserves higher rankings. Websites with more backlinks are often crawled more frequently by search engine bots, which means that new content on your site is more likely to be discovered and indexed quickly if you have a strong backlink profile.
The robots.txt file
While the robots.txt file itself doesn't directly improve SEO, it plays a crucial role in controlling the crawling and indexing of a website. The robots.txt file allows website owners to specify which areas or pages of their site should or should not be crawled by search engine bots. By controlling which parts of the site are crawled, you can ensure that search engines focus on indexing the most important and relevant content. Here's an example of a very simple robots.txt file -
User-agent: *
Disallow: /private/
Disallow: /admin/
Allow: /public/
If there are areas of your website that contain private or sensitive information, like the admin panel or a dashboard page, you can use the robots.txt file to ask search engines not to crawl those sections. Keep in mind that robots.txt is a crawling hint, not an access control mechanism, so it should not be your only protection for sensitive pages.
We can also provide the location of the sitemap file in robots.txt for search engines to access it.
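The sitemap reference is a single Sitemap line in robots.txt; a sketch (the URL is illustrative):

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

Unlike the other directives, the Sitemap line takes a full absolute URL and applies to all crawlers regardless of the User-agent groups above it.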
Mobile responsive - Most users discover new websites on mobile devices, and Google crawls websites with a mobile browser by default (mobile-first indexing). Your mobile speed affects your search rank on both mobile and desktop devices.
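A responsive layout usually starts with the viewport meta tag shown earlier plus CSS media queries. A minimal sketch (the class name and 600px breakpoint are illustrative choices, not fixed rules):

```html
<style>
  /* Three columns on wide screens */
  .products { display: grid; grid-template-columns: repeat(3, 1fr); }

  /* Collapse to a single column on narrow (mobile) screens */
  @media (max-width: 600px) {
    .products { grid-template-columns: 1fr; }
  }
</style>
```

The same content then adapts to the viewport instead of requiring a separate mobile site, which is the behavior mobile-first indexing rewards.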
So these were some points that can help you get started with SEO and make your webpages rank on search engine result pages. These are all tried and tested action steps, and I have implemented all of them personally on this blog website. Just to give you a glimpse, these are the Google Chrome Lighthouse audit results from different pages of my website.
On mobile devices -
On desktop devices -