How to Improve Indexing
One of the most popular discussions in the Google forums revolves around indexing: many people complain that their new content isn’t appearing online. It would be ideal if Google visited your site on a daily basis and content was instantly available in search engine rankings, but this will only ever happen if, and when, Google recognizes how great your site is.
While you cannot force Google to index your site faster or increase its crawl rate, there are some things you can do to encourage Google to visit your site more often. Before you look at the steps I have outlined below, make sure your site is actually indexed by Google. If not, you’ll need to submit it to Google first.
Produce unique content, regularly and often. If you reuse old content or pull your content from article syndication sites, you will find that it takes quite some time before Google indexes your new pages; in some cases, it may not index them at all. Google doesn’t like duplicate content, and if it recognizes that the articles or pages you have published appear elsewhere on the web, there is a high chance it will disregard them. At best, it will place your content very low in the rankings. However, if you create genuinely unique content on a regular basis, Google will be encouraged to visit regularly and will increase your crawl rate; it likes rapidly evolving sites! So the basic rule of thumb for web search optimization is to put effort into creating content that your readers will not have seen before; this will naturally increase your Google crawl rate. If you have a large amount of content that isn’t unique, consider getting it rewritten so that you can avoid duplicate content issues. Try our rewriting services.
Make sure Google robots can crawl your site. The Google crawlers will only spend a limited amount of time crawling your site before moving on to another site. Make sure that Google can crawl your site quickly and effectively so that it can cover as many pages as possible during its visit. Here are some things to look out for:
- Pages that are slow to load. If the search robots spend a long time crawling large images and files, they may not be able to reference many of your pages during their visit.
- Duplicate content on the site. Do you have several pages that contain the same content but on different URLs? If so, consider removing the duplicates. Duplicate content confuses the search spiders and slows the rate at which they can crawl your site.
- Broken pages. Does your site always work correctly? Google’s Webmaster Tools has a section that lets you see crawl errors, which is useful for identifying any issues your site has. Fix any errors as soon as possible to ensure that the Google spiders can crawl your site effectively.
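A quick way to confirm that the Google robots are not being blocked is to test your robots.txt rules directly. Here is a minimal sketch using Python’s standard urllib.robotparser module; the robots.txt rules and paths shown are hypothetical examples, not your actual site configuration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content - substitute your site's real rules.
robots_txt = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /tmp/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# Check whether Googlebot may fetch specific paths.
print(rp.can_fetch("Googlebot", "/blog/my-new-post"))  # True
print(rp.can_fetch("Googlebot", "/private/draft"))     # False
```

Running a few checks like this against the pages you most want indexed takes seconds and catches accidental Disallow rules before they cost you a crawl.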
Ask Google to visit more often. Google’s Webmaster Tools can help when you are trying to improve indexing for your website. If your site is established, you can request that Google increase the crawl rate through the Settings screen on the dashboard. I personally would not advise this, because Google generally sets the crawl rate according to what it believes your site’s server can handle. Even if you set the crawl rate to “fastest”, the rate at which the spider can crawl the site will still be limited by the server response time. If your server cannot accommodate Google’s crawl speed, Google will remember this and revisit at a later date with a reduced crawl speed.
N.B.: This option is only available to sites that have been established for some time. If you see the following message in your Google Webmaster Tools account, you cannot change the crawl rate.
“Your site has been assigned special crawl rate settings. You will not be able to change the crawl rate.”
Add your sitemap to Google Webmaster Tools. Make sure that you regularly update your sitemap and that an XML version of it is available in Google Webmaster Tools. The sitemap will give Google additional information about your URL structure, and this will improve indexing because it will help Google to crawl your site more efficiently. Be careful when using online sitemap generators, because many of these sites are only interested in gaining access to the data on your website and do nothing to help your web search optimization efforts.
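Rather than trusting a third-party generator, you can build the sitemap yourself. The sketch below uses Python’s standard xml.etree.ElementTree to produce a sitemap in the standard sitemaps.org format; the URLs and lastmod dates are hypothetical placeholders for your own page list.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages: (URL, last-modified date).
pages = [
    ("https://www.example.com/", "2012-06-01"),
    ("https://www.example.com/about", "2012-05-20"),
]

# Build the <urlset> root with the standard sitemap namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Write the resulting string to sitemap.xml at your site root, then submit that URL in Webmaster Tools; regenerating it whenever you publish keeps Google’s picture of your URL structure current.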
Write and publish an article with a link back to your site. I notice that every time one of my articles is published on a free article site like eZine, the link back to my site triggers Google to recrawl and index my content. Creating backlinks by posting comments on other blogs or writing press releases has the same effect. Try to avoid spamming, though; many people will simply remove spam links and you will have wasted your time.
Ping Google. Let Google know that you’ve updated your content, and while you’re at it, ping all the major search engine services. Many blogging platforms, like WordPress, have a ping plug-in that you can download. Such services inform hundreds of URLs that you’ve updated your site, and this may lure the spider to crawl your site and aid your search rankings.
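If you want to ping Google yourself rather than rely on a plug-in, Google has accepted sitemap pings at a simple GET endpoint. The sketch below only constructs the request URL with the standard library; the sitemap location is a hypothetical example, and you would send the request (for instance with urllib.request.urlopen) only after verifying the endpoint is still supported.

```python
from urllib.parse import urlencode

# Hypothetical sitemap location - substitute your own.
sitemap_url = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint takes the sitemap URL as a query parameter.
ping_url = "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})
print(ping_url)
```

Fetching that URL after each content update tells Google there is something new to crawl, which is exactly what the WordPress ping plug-ins do behind the scenes.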
Create dynamic content. Dynamic content can attract bots to your site, so it is worth considering an area on your site that serves random content. Many blogging platforms offer dynamic content plug-ins and widgets that show random posts from the blog, or random quotes and sayings. By including these, you frequently change the content without doing any work, and this may improve the rate at which Google indexes your site content.
Interlink your website pages. Many people don’t realize this, but internal linking is just as important as external linking. Ensure that you use simple links throughout your site and check them on a regular basis to ensure that they are still working; Google does not like broken links. An effective internal linking structure will allow both your customers and Google to easily navigate your site, and this is a crucial element of good SEO. Many blogging packages, like WordPress, automatically insert trackbacks when you add a link. These will ping Google, and you will find that your new content is indexed within hours. Try the SEOpresser WordPress plugin.
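Because Google does not like broken links, it pays to check your internal links regularly. Here is a minimal sketch using Python’s standard html.parser that collects the internal links on a page and flags any that do not match a known URL; the sample HTML and the set of known paths are hypothetical.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values of internal (site-relative) <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("/"):  # internal links only
                self.links.append(href)

# Hypothetical site data.
known_paths = {"/", "/about", "/blog/my-new-post"}
page_html = '<a href="/about">About</a> <a href="/old-page">Old</a>'

collector = LinkCollector()
collector.feed(page_html)
broken = [link for link in collector.links if link not in known_paths]
print(broken)  # ['/old-page']
```

Run this over each page of your site and fix whatever turns up in the broken list; a clean internal linking structure helps both visitors and the Google spiders navigate your site.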
Use unique title and description metatags on every page. Some blogging packages use the same metatags on every page. This is not effective, and the Google robots may assume that the content is duplicate. Google Webmaster Tools will highlight any pages that have duplicate metatags; deal with them as soon as you can in order to improve the overall crawl rate of your site.
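You can spot duplicate titles yourself before Webmaster Tools flags them. This is a minimal sketch of the idea; the page paths and titles below are hypothetical, and on a real site you would extract them from each page’s title tag and description metatag.

```python
from collections import Counter

# Hypothetical mapping of page path -> title tag content.
page_titles = {
    "/": "Acme Widgets - Home",
    "/about": "Acme Widgets - Home",   # duplicate: same title as "/"
    "/contact": "Contact Acme Widgets",
}

# Count how often each title appears, then flag pages sharing a title.
counts = Counter(page_titles.values())
duplicates = [path for path, title in page_titles.items() if counts[title] > 1]
print(sorted(duplicates))  # ['/', '/about']
```

The same counting approach works for description metatags; any page that turns up in the duplicates list needs its own unique wording.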
Ensure that you optimize the images that you use on your site. By doing this, you allow Google to read your full page. Optimizing images makes them search engine friendly, so Google can scan your page more effectively and quickly move on to the next page, improving your overall crawl rate.
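One basic part of image optimization is giving every image descriptive alt text, since Google reads the text rather than the image itself. Here is a minimal sketch, using the standard html.parser, that flags img tags missing alt text; the sample HTML is hypothetical.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Record the src of any <img> tag that has no alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.missing_alt.append(attrs.get("src"))

checker = AltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="banner.jpg">')
print(checker.missing_alt)  # ['banner.jpg']
```

Adding descriptive alt text (and keeping file sizes small) lets the spider understand and move past each image quickly instead of stalling on it.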
If you need help to improve indexing and increase your website ranking, you should consider using our web content editing services. Order online today!