What is SEO (Search Engine Optimization)?

SEO stands for search engine optimization: the process of improving both the quality and quantity of traffic that a website or web page receives from search engines, by helping those engines understand what the site is about.

How does search engine optimization affect search engines?
Every search engine has its own way of reading and understanding a website before showing it to users in response to a search query. Popular search engines such as Google and Bing crawl web pages, index them, and rank them on their results pages according to various SEO factors such as page quality, page speed, etc. Everything we do in SEO is aimed at improving our rank on the SERP, that is, the Search Engine Results Page.

Factors affecting search engine optimization or SEO
1. Secure connection to a website
2. Page speed
3. Mobile-friendliness of website
4. Quality of content
5. References and links
6. Technical SEO
7. A clear, logical site structure for users
8. User interface and user experience

What are the types of SEO?
1. On-page SEO - On-page SEO is the process of optimizing individual web pages to rank higher in the SERPs by improving the content and the HTML source code of our website's pages.
E.g. - meta tags, etc.

2. Off-page SEO - Off-page SEO is the process of increasing traffic to our website through external means or external recognition. It refers to activities that occur outside our website but affect its rankings.
E.g. - backlinks, etc.

Let's dig deeper now!
On-page SEO or Technical SEO
The most significant factors affecting SERPs are as follows:

  1. Meta Tags
  2. Canonical Tags
  3. Sitemaps
  4. robots.txt
  5. Redirection
  6. Schema Markup or Structured Data

Meta Tags

a. Meta Title - It's the most important tag to optimize first. The title is placed in the <title> tag, and search engines show it as the clickable headline when someone searches a query.

<title>This is the title of the page.</title>

b. Meta Description - It's a short description that summarizes the page. This tag tells the search engine what the page is all about, and it is often shown as the snippet below the title in search results.

<meta name="description" content="Place the meta description text here.">

c. Meta Robots - This tag gives crawlers instructions on how the page should be handled. It tells crawlers whether or not to index the page, and whether or not to follow the links on the page.

  1. index: tells bots to index the page;
  2. noindex: tells bots not to index the page;
  3. follow: tells bots to crawl the links on the page and pass value to the linked pages;
  4. nofollow: tells bots not to crawl the links on the page or pass value through them.
Not setting a meta robots tag at all is equivalent to index, follow.

<meta name="robots" content="noindex,nofollow">

d. Meta Viewport - This tag instructs the browser how to render the page on devices such as mobiles and tablets. It is recommended that this tag be used on all web pages as standard.

<meta name="viewport" content="width=device-width, initial-scale=1.0">
The content attribute of the viewport tag can be adjusted as needed.

e. Meta Charset - This tag declares the page's character encoding and should be used on all web pages, with the correct syntax.

<meta charset="utf-8">
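Putting the tags above together, a minimal <head> section might look like this sketch (the title, description, and robots values are placeholders for illustration):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Character encoding: declare early, before any text content -->
  <meta charset="utf-8">
  <!-- Responsive rendering on mobiles and tablets -->
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <!-- Shown as the clickable headline in search results -->
  <title>This is the title of the page.</title>
  <!-- Often shown as the snippet below the title in search results -->
  <meta name="description" content="Place the meta description text here.">
  <!-- Allow indexing and link following (also the default if omitted) -->
  <meta name="robots" content="index,follow">
</head>
<body>
  ...
</body>
</html>
```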

Canonical Tags

This tag tells search engines which version of a page is the original (canonical) one. When several identical or near-identical pages exist, only the original page should be indexed, and the canonical tag on the duplicates directs crawlers to it.

<link rel="canonical" href="https://example.com/canonical-page/" />
(Replace the href value with the original page's URL.)


Sitemaps

A sitemap helps crawlers crawl a website systematically, according to the priorities we set. In it, we tell crawlers how much value the site owner assigns to each web page. Use Google Search Console to audit the sitemap. The sitemap.xml file needs to be uploaded to the site's root directory.

e.g. - Sitemap of diigialmarketer.tech / { https://example.com/sitemap.xml }
Tool for creating a sitemap - XML Sitemaps
Priority values range from 0.0 to 1.0, where 1.0 is the highest priority and 0.0 the lowest.
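A minimal sitemap.xml following the sitemap protocol might look like this sketch (the URLs, dates, and priorities are placeholders; <lastmod> and <priority> are optional fields):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest priority -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-01</lastmod>
    <priority>1.0</priority>
  </url>
  <!-- A less important page: default-level priority -->
  <url>
    <loc>https://example.com/blog/some-post/</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```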


robots.txt

The robots.txt file is used to allow or disallow search engine bots and to tell them which pages they may or may not crawl. Specific bots such as Googlebot and Bingbot can be addressed individually. For example, it helps prevent the crawling of pages that shouldn't appear in search, such as an admin login page.

Tool for creating a robots.txt file - smallseotools
Locating the robots.txt file - { https://example.com/robots.txt }
Syntax e.g. - { https://www.indiatimes.com/robots.txt }
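A simple robots.txt might look like this sketch (the paths and the Bingbot rule are illustrative placeholders):

```text
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /login/

# Bot-specific rules: address one crawler by name
User-agent: Bingbot
Disallow: /staging/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```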

301 & 302 Redirects

A 301 redirect is used for a permanent link/URL transfer, such as HTTP to HTTPS, and passes most of the original page's ranking power (link equity) to the new URL.
A 302 redirect is used for a temporary link transfer and signals search engines to keep the original URL indexed, so it is not relied on to pass link equity.
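As a sketch, on an Apache server these redirects can be configured in an .htaccess file (assuming mod_rewrite and mod_alias are enabled; the domains and paths are placeholders):

```apache
# Force HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Permanently redirect a single moved page
Redirect 301 /old-page/ https://example.com/new-page/

# Temporarily (302) redirect a page, e.g. during a campaign
Redirect 302 /sale/ https://example.com/holiday-sale/
```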

Schema Markup or Structured Data

Structured data, also called schema markup, is a type of code that makes it easier for search engines to crawl, organize, and display the content of your web pages. It tells search engines what your data is all about.
To create a schema markup script, use the Google Structured Data Markup Helper.
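As an illustration, a JSON-LD script (one common format for schema markup) describing a blog article might look like this; the headline, author name, and date are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is SEO (Search Engine Optimization)?",
  "author": {
    "@type": "Person",
    "name": "Author Name"
  },
  "datePublished": "2021-01-01"
}
</script>
```

This block goes in the page's <head> or <body>; search engines read it to understand the page and may use it to show rich results.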

These steps will lead to effective on-page SEO.

Follow our other blogs. Thanks for your support!