Thursday 28 May 2020

What is GSC or Google Search Console? How can you verify your site in GSC?


GOOGLE SEARCH CONSOLE


Google Search Console (GSC), formerly known as Google Webmaster Tools, is a free service offered by Google.

Google Search Console shows you how your website appears and performs in Google Search and how visitors find it.

Here's Google's definition of GSC: "Google Search Console is a free service offered by Google that helps you monitor and maintain your site's presence in Google's search results. You don't have to sign up for your site to be included in Google's search results, but doing so can help you understand how Google views your site and optimize its performance in search results."

Adding and verifying a site in GSC


New users should add and verify their site(s) before doing anything else.

Adding a site in GSC:
  • Create a GSC account and log in.
  • Enter the URL of the website you want to add in the "Add Property" box.
Verifying a site in GSC:

There are a few different methods through which you can verify your site.
  • Adding an HTML tag: Google gives you a simple meta tag. Copy it and paste it within the head section of your site's HTML (see the sketch after this list). Once this is done, go back to Search Console and click verify.
  • Uploading an HTML file: Google gives you an HTML file. Upload it to the root directory of your website, then go back to Search Console and click verify.
  • Via domain name provider: Your domain name provider is the company from which you bought your domain, or where your website is hosted. This is a good option if you own a large website, because it shows Google that you own the main domain along with its subdomains and subdirectories. You will be asked to create a DNS TXT record with your provider to get verified.
  • Adding the Google Analytics code: If you already use GA on your site, this is the best method. Make sure the GA tracking code is placed within the head section of your homepage's HTML. If this condition is satisfied, you can easily verify your site.
  • Using Google Tag Manager: If you already use GTM on your site, this may be the easiest method. Make sure the GTM code is placed immediately after your site's opening body tag. If this condition is satisfied, you can easily verify your site.
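As a rough sketch of where these snippets live (the verification token below is a placeholder; Google generates the real one for your property in GSC):

  <!DOCTYPE html>
  <html>
  <head>
    <!-- HTML tag method: paste the meta tag from GSC inside <head> -->
    <meta name="google-site-verification" content="YOUR_TOKEN_FROM_GSC" />
    <!-- Google Analytics method: your GA tracking code also belongs inside <head> -->
    <title>My Site</title>
  </head>
  <body>
    <!-- Google Tag Manager method: the GTM snippet goes immediately after the opening <body> tag -->
  </body>
  </html>

For the domain name provider method, the DNS TXT record you create has roughly this form (again, the token is a placeholder):

  example.com.  IN  TXT  "google-site-verification=YOUR_TOKEN_FROM_GSC"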

Google Search Console Reports

Google Search Console provides a set of statistics that help webmasters analyze the overall performance of their website in Google Search. Let's dive a little deeper into the features of Google Search Console.

1. Performance Report

This is one of the most important and useful tools provided by Google Search Console. It reports the number of clicks, impressions, average click-through rate (CTR), and average position of your website over a particular time interval.
You can look at data from up to the last 16 months, and Google Search Console even lets you compare results across different time periods.

You can also compare how your website performs for certain queries, how your pages perform for them, and the overall number of clicks, impressions, click-through rates, and average position.

Top Pages & Top Search Queries

You can see the top pages on your website and the top search queries that bring it the most traffic. This can help you analyze the performance of your keywords and guide your content marketing strategy.

Average CTR

The click-through rate (CTR) is the ratio of clicks to impressions for a page. You can view the CTR of the pages with the largest number of impressions and adopt similar content strategies for other pages with lower CTR.
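For example, a page that appears 1,000 times in search results (1,000 impressions) and receives 50 clicks has a CTR of 50 / 1,000 = 5%.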

Average Position for Pages

This lets you know the average position of your pages on the Google search engine results page (SERP). Note that a numerically higher average position means your page appears further down the results, and the further down a page appears, the lower its click-through rate will normally be.

2. URL Inspection

This tool lets you check whether your page can be indexed by Google and request that the search engine crawl the page. It also retrieves information about Google's indexed version of your page.
It is very important to rectify any errors or coverage issues noted here to maintain your website's reach.

3. Indexing- Coverage & Sitemaps'

The index reports show how well your website is covered in Google's search results and let you make sure your pages have been submitted to Google.

Coverage

It is important to fix any coverage issues detected on your website as soon as possible. Also make sure that the number of valid pages increases as you keep adding content to the website.

Sitemap
By placing a properly formatted XML sitemap file on your web server, you enable crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.
After creating a sitemap, you can add and submit it in Search Console. This helps keep your website's pages up to date in search results.
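A minimal XML sitemap (the URLs and dates are placeholders) looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2020-05-28</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/blog/first-post</loc>
      <lastmod>2020-05-20</lastmod>
    </url>
  </urlset>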

4. Enhancements

This section of Google Search Console shows any enhancements found on your website the last time it was crawled by Google's bots. If no enhancements were detected, this section will be empty.

Mobile Usability

This section shows any mobile usability issues detected on your website.
You can also see information about any rich result types (structured data) found on a page, including the number of valid items found on the URL, a description of each item, and details about any warnings or errors found.
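As a sketch of what such structured data looks like (all values here are hypothetical), a page might mark itself up as an Article with a JSON-LD block in its head section; GSC reports warnings or errors for items like this:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What is Google Search Console?",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "datePublished": "2020-05-28"
  }
  </script>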

5. Security And Manual Actions

This section of Google Search Console shows any security issues associated with your website. Google tells you what to fix, identifies hacks, detects flaws, and sends you an email when these issues pop up.

6. Links

This report shows the external sites linking to your website as well as your internal links, which helps you analyze your backlinks. You get a clear idea of which sites link to your website, how your internal links are performing, and so on.








Wednesday 27 May 2020

Major Google Algorithm Updates

GOOGLE ALGORITHM UPDATES

To improve the quality of its search engine, Google releases algorithm updates to counter the spammy techniques practiced by some webmasters.



Google Panda Update - 2011

Google imposed penalties in cases of content spamming by webmasters. Content duplication, thin content, user-generated spam, and keyword stuffing were among the practices penalized. Let's look at the main cases:


  • Content duplication: Content copied from other sites. The duplicate content issue can also happen on your website when you have multiple pages featuring the same text with little or no variation.
  • Content farming: A large number of low-quality pages often aggregated from other websites.
  • High ad-to-content ratio: Pages that contain more paid ads than original content.
  • Thin content: Weak pages with very little relevant text and resources.
  • Low-quality user-generated content: Pages made up of user-submitted posts that are full of grammatical or spelling errors and lack authoritative information.


Google Penguin Algorithm - 2012


The Google Penguin update targeted two specific practices: manipulative link schemes and keyword stuffing.
                                    

What triggers Google Penguin Algorithm?

  • Paid link recommendations or purchased backlinks from low-quality or unrelated websites, creating an artificial picture of popularity and relevance.
  • Mutual exchange of backlinks to increase rankings.
  • Recommendations from low-quality, spammy pages.
  • Building links using artificial methods or automated mechanisms.
  • Hacking websites and getting backlinks from them.
  • Adding backlinks through unwanted or unrelated comments.
  • Adding backlinks through pages that users can edit (e.g., Wikipedia), a link scheme practiced by webmasters to achieve high rankings.

In 2016, the Penguin algorithm was made real-time, so changes become visible much faster, shortly after Google recrawls and reindexes the affected pages.

Google Hummingbird Algorithm Update - 2013

Hummingbird has been described as a major overhaul of Google's core algorithm, rather than an add-on like the Panda or Penguin updates.


  • It changed how the search engine worked by focusing on synonyms and theme-related topics.
  • Because Hummingbird used context and intent to deliver results matching the user's needs, local results became more precise.
  • It allowed users to confidently search for topics and subtopics.
  • Hummingbird helped Google deliver semantic search results, i.e., it improved search results by focusing on user intent and on how the subject of the search relates to other information in a wider sense.

Google RankBrain Update - 2015


RankBrain is a machine-learning algorithm that Google uses to sort search results. Depending on the keyword, RankBrain will increase or decrease the importance of backlinks, content length, content freshness, domain authority, etc.

RankBrain vs Google Engineers


  • Signals like user location and content freshness are taken into account to interpret intent and deliver the results most likely to satisfy searchers.
  • Pre-RankBrain, Google used its basic algorithm to determine the results shown for a query.
  • Post-RankBrain, it's believed that the query goes through an "interpretation model" that can apply factors like the user's location, personalization, and the words of the query to determine the searcher's true intent, and thus deliver more relevant results.

Importance of RankBrain: Let's consider an example to get an idea about the importance of RankBrain. 


When a user searches for "Olympics location", Google "tries" to infer the user's intention. Imagine that the Olympics just concluded in Russia, and the official Olympics website earned millions of links for its content about the past event.

If your algorithm is simplistic, it may show results about the Sochi Games, because they have earned the most links.



This is where RankBrain comes in. By mathematically analyzing the patterns it "notices" in searcher behavior, the machine-learning algorithm can work out that users searching for "Olympics location" are looking for the location of the upcoming Games. In this case, Google will return results about the locations of the Olympics to be held in upcoming years.


Google Pigeon Update - 2014

The Pigeon update helps provide more useful, relevant, local search results to users.

Google stated that this update improves their distance and location ranking parameters to provide local, relevant results to the user based on proximity. 

This update was cited as the most impactful update since the Google Venice update in 2012.


To increase local rankings, webmasters should get their sites verified on Google My Business, mainly by letter (postcard), OTP, or phone call.

Adding local keywords to the homepage, listing the website's details in local directories, providing an embedded map on the homepage, and getting interactions from local users through social media also help increase a website's local rankings.

Parked Domain Update

This update was made to fight the appearance of parked domain websites on the Google search results page. Parked domain sites are placeholder sites that are seldom useful and often filled with ads.

Parked domain sites have no valuable content for users, so Google prefers not to show them in SERPs.

EMD or Exact Match Domain Update - 2012

Google targeted exact match domain names in this update. However, the intention was not to target exact match domains exclusively, but to target poor-quality, thin-content sites that combined exact match domains with spammy tactics.

Pirate Update - 2012

The Pirate update was introduced as a filter to prevent sites with many copyright infringement reports, as filed through Google's DMCA system, from ranking well in search listings. It demotes sites in the results that have a large number of valid copyright removal notices.

Mobilegeddon Algorithm - 2015

On April 21, 2015, Google released a significant new mobile-friendly algorithm to give a boost to mobile-friendly pages in Google's mobile search results.

Searchers could more easily find high-quality, relevant, mobile-friendly results: pages whose text is readable without tapping or zooming, whose tap targets are spaced appropriately, and which avoid unplayable content and horizontal scrolling.

                             

While the mobile-friendly change is important, a variety of signals are still used to rank websites. Even if a page with high-quality content is not mobile-friendly, it can still rank highly if it has great content for the query.

On-Page Search Engine Optimization Guidelines

ON-PAGE SEO TECHNIQUES

On-page SEO (or on-site SEO) is the practice of optimizing web page content for search engines and users.
Common on-page SEO practices include optimizing title tags, content, and internal links.

If optimization is not done correctly, Google's crawlers may interpret the content wrongly, and the website may be shown to unrelated or unintended users.

      
                                  
The structure of a web page can be divided into two portions: the head and body sections. The head section contains the title tag and other elements that make up the basic snippet, i.e., the portion of the website shown on the Google search results page.

On-Page SEO Basics


Importance of Page Title
  • Gives the crawler a hint about the content of the page. (During the crawling and indexing phase, search engines try to associate a page with related keywords, and the page title is one of the elements considered.)
  • It is shown on the search results page. (It should be interesting enough to encourage users to click and visit the website.)
  • The page title is shown in social media snippets. (The title also appears on social networks when the page is shared or posted. It should be good enough to earn likes and shares, which increases visibility and click-through rate, and in turn improves SEO.)


How to optimize the title tags?

A title tag is a great way to increase organic traffic; a sketch of a well-optimized title follows the list below.

  • Include keywords or search terms in the title. This attracts relevant users in particular and increases CTR.
  • Add a question mark to the title in relevant cases. Users looking for answers to similar questions will click to get their answer.
  • Add brackets and numbers where appropriate.
  • Keep the length of the title to 60 characters or fewer (about 512 pixels wide), so that users can see the title in the snippet completely.
  • Avoid using all capital letters or all lowercase letters in title tags, and use at least 3 words.
  • Use title tag modifiers (e.g., fast, guide, checklist, review) to target long-tail versions of your target words.
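Putting these tips together, a title tag (the wording here is just an illustration) might look like:

  <head>
    <!-- Keyword first, a modifier ("Guide"), a number, brackets, under 60 characters -->
    <title>On-Page SEO Guide: 10 Tips for Beginners [2020]</title>
  </head>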

Description Meta Tags

A good meta description helps your result stand out, which can boost organic CTR; Google bolds the terms that match the searcher's query. A sketch follows the list below.

  • The character limit is 150-160 characters for pages, and about 155 characters for blog posts or articles, since the date also takes up some of the snippet's pixel width.
  • It should be free from spelling and grammatical errors.
  • If the description does not match the content of the page, Google generates a description by itself, or takes one from an authoritative page that mentions yours.
  • If the website is listed in the Open Directory Project, Google may use the description provided there as the meta description.
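A meta description following these guidelines (the wording is just an illustration, kept under 160 characters) sits in the head section alongside the title:

  <meta name="description" content="Learn what Google Search Console is, how to verify your site, and how to read its reports in this beginner-friendly guide." />
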
Content Optimization

Effective content optimization is a vehicle for modern SEO. Let's discuss some techniques and tips which can be used to optimize the content.


  • Use the target words in the first 100 words: Even though this is an old-school tactic, it still helps attract the eyes of users, and Google puts more weight on terms that occur early on the page.
  • Make your blog post title an H1 and the subheadings H2 tags (see the sketch after this list).
  • A relevant H1 tag helps give both the user and the web crawler the gist of the content. It is advised to use the H1 tag only once on a page to avoid confusion. The H2 tag is also best used only once on a page, while H3 and H4 tags can be used multiple times depending on the length and depth of your content.
  • Include your target word in at least one subheading.
  • Add anchor text in your content. Anchor text is the most visible part of a link and helps increase the visibility of keywords and other relevant terms.
  • Keyword density is how often the keyword occurs on a page, usually expressed as a share of the total word count; for example, a keyword used 10 times in a 1,000-word post has a density of 1%. Even though Google may deny that using the same word multiple times helps, SEO practitioners swear by this technique.
  • Use outbound links: External links to related (authority) pages help Google understand the topic better. They also show that the page is a hub of quality information.
  • Optimize URLs for SEO: Keep them short and include the target word in the URL.
  • Include the focus keyword just before a full stop or comma.
  • Bolding the focus keyword or other important terms helps the user get an overview of what is being discussed in the content. It also improves readability.
  • Never link to related pages that have the same content as your page. This can pass equity to the outbound links, which in turn decreases your page's ranking.
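A sketch of the structure described above: one H1 for the post title, the target word in a subheading, and descriptive anchor text (the wording and URL are placeholders):

  <article>
    <h1>On-Page SEO Techniques</h1>
    <p>Use the target word, e.g. <strong>on-page SEO</strong>, in the first 100 words...</p>
    <h2>On-Page SEO Basics</h2>  <!-- target word in a subheading -->
    <h3>Title Tags</h3>
    <h3>Meta Descriptions</h3>
    <p>Read our <a href="https://www.example.com/seo-guide">complete SEO guide</a> for more.</p>
  </article>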

Image Optimization Techniques



  • Give every image on the page alt text and a descriptive filename; this helps Google understand what the image is about (see the sketch after this list).
  • Optimize one image around your keyword and use that keyword as part of the image's alt text.

  • Optimized images help search engines understand the content of the page.
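For example (the filename and alt text are placeholders), an optimized image tag looks like:

  <img src="images/on-page-seo-checklist.png"
       alt="On-page SEO checklist covering titles, meta descriptions, and headings" />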







History and Evolution of Search Engine Optimization or SEO


HISTORY AND EVOLUTION OF SEO


Ever wondered how Google or other search engines rank your websites within your searches?

Or how content such as videos or local listings are shown and ranked based on what the search engine considers more relevant to the users? 

This opens the door to the concept of Search Engine Optimization or SEO.


What is SEO?

SEO or Search Engine Optimization is the practice of increasing the quantity and quality of traffic to your website through organic search engine results.


Google has a crawler that goes out and gathers information about all the content they can find on the Internet. The crawlers then build an index based on the collected data. The index is then fed through an algorithm that tries to match all that data with your query. 


History of SEO

To better understand the history of SEO, it helps to have a brief idea of the history of Google.

Google was founded by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University in California.


After the World Trade Center attack, more people started using Google to search for information about it, but Google could not give its users satisfying results. In its early days, Google was not a real-time search engine and could not give instant results; instead, results were sent to the user's email address within 24 hours. Google's officials arranged a meeting to discuss this issue.

They decided to make Google crawlable, so that fresh and newly edited content could be added to Google's index, making the search results wider.

GoogleBot is Google's robot that visits sites and looks for updated content, which helps in ranking the sites in the SERP.

 


Google published an SEO starter guide for webmasters and urged them to start practicing it.

By optimizing their sites, webmasters could now help Google's crawlers index them effectively. This is where SEO emerged.


Evolution of SEO

Google's key objective at that time was to ensure that its search engine technology focused on providing quality results to users. Google issued guidelines for creating quality content.

At first, a website's PageRank was measured on the basis of keyword occurrence.

Content Specific SEO


This resulted in keyword stuffing by webmasters, a 'black-hat' SEO technique, which called for an update to the way Google worked.

Google was then made link-specific, i.e., a website's PageRank was measured on the basis of its inbound links. The more inbound links a website had, the better its PageRank.


Google also provided a toolbar for Internet Explorer that let webmasters check PageRank easily.

This started to affect the quality of the search engine, since webmasters tried to get inbound links from other sites by paying for them.

To deal with this issue, Google made ranking quality-link specific, i.e., inbound links from highly ranked sites were given more weight.

Google AdWords also began during this time, and you could see paid results alongside organic search results. At this time, link quality was not measured, and webmasters were not really following the 'white-hat' practices suggested by Google to get rankings.


It was then that Google started off with its algorithm updates.

Google made an update in 2003.

  • That year, many sites lost their search engine rankings.
  • Sites that practiced keyword stuffing got penalized.
  • Google acquired Blogger.com and created the blog monetization platform AdSense.
  • Many blogs that created poor-quality content used AdSense to monetize and make quick money.
In 2009, Google made itself more interactive with users.
  • If users spent a good amount of time on a website and made inquiries, likes, comments, or other interactions, Google treated it as a site users preferred, and such websites were ranked higher in the search result listings.
  • Google introduced personalized search by utilizing searcher history and preferences.
The year 2010 was a turning point in the evolution of SEO.
  • It was a period of growing social media that could influence the visibility of content.
  • Google Plus was born, along with the +1 button, both of which played a significant role in boosting content visibility.
  • Content shared across the web and social media created valuable backlinks and engagement that built authority.