How to Get Your Site Indexed by Google

This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

If it isn’t already, organic search traffic needs to be a priority in your digital marketing plan. More than half of all traffic on the internet comes from search engines (some estimates put it as high as 60%), and organic search is also hugely important in generating online sales. Of course, you realize SEO is a priority, but where do you start? All SEO starts with getting your website found, crawled and indexed by search engine robots.

In this piece you’ll learn about the technical and on-page aspects of SEO, and how you can use them to attract the attention of Google, Bing and other search engines.

Step 1: On-Page SEO

The first step to getting found by search engines is to create your pages in a way that makes it easy for them. Start by figuring out who your website is targeting and what keywords your audience uses to find you. This will determine which keywords you want to rank for. Best practice is to target long-tail keywords: they account for the vast majority of search traffic, have less competition (making it easier to rank highly) and can indicate that a searcher is in-market. They also have the added bonus of earning a higher click-through rate (CTR) and converting more often.

There are quite a few free keyword research tools available on the web.

Once you have your target keywords, use them to build an optimized foundation for your pages. Put your keywords into these on-page elements:

Title tag: Title tags are one of the most important on-page factors search engines look at when deciding on the relevance of a page. Keywords in the title tag tell search engines what they will find on the page. Keep your title tags to 60 characters or less and use your most important keyword at the beginning. A correctly used title tag looks like this:

<title>Page Title</title>
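
For example, if you were targeting a hypothetical long-tail keyword like “handmade leather dog collars”, the title tag might look something like this (the keyword and brand name are placeholders):

<!-- Keyword first, whole title under 60 characters -->
<title>Handmade Leather Dog Collars | Acme Pet Supplies</title>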

Meta description: Meta descriptions by themselves don’t have much of an impact on the way search engines see your page. What they do influence is the way humans see your search snippet: the title, URL and description displayed in search results. A good meta description will get users to click on your site, increasing its CTR, which does have a major impact on your ranking. Keywords used in the description appear in bold in the snippet, so, again, use yours here.
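
A minimal sketch, reusing the hypothetical keyword from above (the copy is a placeholder):

<!-- Keep it under roughly 160 characters; the keyword will appear in bold in the snippet -->
<meta name="description" content="Shop handmade leather dog collars, cut and stitched to order. Free shipping on orders over $50.">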

Page Content: Obviously, you need to put your keywords in your page content. Don’t stuff your content, though; just use your keyword 3-5 times throughout the page. Incorporate some synonyms and latent semantic indexing (LSI) keywords as well.
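
As a rough sketch of natural placement, again with the hypothetical keyword:

<!-- Keyword in the main heading and once in the opening copy, not stuffed -->
<h1>Handmade Leather Dog Collars</h1>
<p>Every one of our handmade leather dog collars is cut, stitched and finished to order in our workshop.</p>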

Add a blog: Aside from the more obvious content marketing benefits, blogs are crawling and indexing powerhouses for your site. Sites that have blogs get, on average:

  • 97% more inbound links
  • 55% more visitors
  • 434% more indexed pages

Adding pages and updating content on your site encourages more frequent crawling by search engines.

Step 2: Technical SEO

Robots.txt

After you’ve optimized your on-page SEO factors for your target keywords, take on the technical aspects of getting Google to visit your page. Use a robots.txt file to help search engine crawlers navigate your site. Very simply, a robots.txt file is a plain text file in the root directory of your website. It contains directives that dictate which user agents have access to which files. It usually looks something like this:

User-agent: *
Disallow:

The first line, as you can probably guess, defines the user agent; in this case the * denotes all bots. Leaving the Disallow line blank gives bots access to the entire site. You can add multiple Disallow lines under a single user-agent line, but you must create a separate Disallow line for each URL path. So if you want to block Googlebot from accessing multiple directories, you need to add multiple Disallow lines:

User-agent: Googlebot
Disallow: /tmp/
Disallow: /junk/
Disallow: /private/

Do this for each bot you want to block from those pages. You can also use the robots.txt file to keep bots from trying to crawl certain file types like PowerPoints or PDFs:

User-agent: *
Disallow: /*.ppt$
Disallow: /*.pdf$

To block all bots from your entire site, add a slash:

User-agent: *
Disallow: /

It’s good practice to block all robots from accessing your entire site while you are building or redesigning it. Just be sure to restore access to crawlers when your site goes live, or it won’t be indexed. Also check that you haven’t blocked the pages containing your Schema.org markup, or it won’t show up in Google’s rich search results.
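
For instance, a rule like this hypothetical one would hide any structured data on the blocked pages from Google:

User-agent: Googlebot
# Hypothetical mistake: these pages carry Schema.org markup,
# so blocking them keeps it out of rich results
Disallow: /products/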

If you have a Google Search Console account, you can submit and test your file with the robots.txt Tester, found in the Crawl section.

[Screenshot: the robots.txt Tester in Google Search Console]
