Here’s how Google finds websites. The Google bot follows all the links on the internet that it can find. If it finds your website, it will add your pages to its search index so they can appear in search results.
Then, it will look at all your internal links and index every page it can find on your website.
In order for Google to know your website exists, another website must link to you. If you have a brand new website and there are no links to you yet, it will be a while before the bot finds you.
How can you speed up the process? John Chow gives you some insider tips in this video, filmed at the Platinum Mastermind in Fiji.
GET INTO GOOGLE FAST
If you have no links yet, it’s still possible to get into Google within 24 hours. Just tell Google you exist.
Head over to Google Webmaster Tools (now called Google Search Console). If you have a Gmail address, you already have a Google account you can sign in with. Then just enter the URL of the site you own.
Google will ask you to take a few steps to confirm you are the owner. Typically, they’ll give you a small verification file to upload to your website. Once you’ve done this, your website should start appearing in Google search results.
USE A SITE MAP
Using a site map makes the Google bot’s job of finding all the pages of your site much easier. A site map is an XML file that lists the pages of your site, along with optional notes on how often each page is updated and which pages take priority.
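As a rough illustration, here is a minimal site map in the standard sitemaps.org format. The domain and page names are made up; yours would list your actual pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Home page: changes often, highest priority -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- An individual blog post: changes rarely, normal priority -->
  <url>
    <loc>http://www.example.com/my-first-post/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The file usually lives at the root of your site (e.g. example.com/sitemap.xml), and you can submit its address directly in Google’s webmaster tools.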
The easier you make Google’s job, the faster you will rank.
If you register with no site map, the bot has a harder time finding all your pages. When you make changes to your site, it may also take longer for them to be reflected in the Google search results. That’s not good for you, for Google or for people searching on Google.
AUTOMATICALLY UPDATE YOUR SITE MAP
If your website runs on WordPress, there’s a plugin that automatically creates and updates your site map. It’s called Google XML Sitemaps, by Arne Brachhold.
By using this plugin, you just focus on updating your website and the site map takes care of itself.
LEAVE OUT WHAT YOU DON’T WANT
There are certain pages of your site you don’t want the Google bot to index. Your admin folder contains important files that keep your website functioning; there’s no reason for it to show up in search results. The same goes for the password-protected part of your website. That’s for you only.
A file called robots.txt lets you list the parts of your website that you don’t want search engines to crawl. The Google bot will read it and leave those sections out of Google search results. One caveat: robots.txt only asks well-behaved crawlers to stay away. It doesn’t actually block access, and the file itself is publicly readable, so never rely on it to protect sensitive content — use real password protection for that.
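Here’s a sketch of what such a robots.txt might look like. The folder names are examples (the admin path shown is the WordPress default; the members path is hypothetical), so adjust them for your own site:

```text
# robots.txt — placed at the root of your site (e.g. example.com/robots.txt)

User-agent: *
# Keep crawlers out of the admin area (WordPress default path)
Disallow: /wp-admin/
# Keep crawlers out of a password-protected members area (hypothetical path)
Disallow: /members/
```

`User-agent: *` means the rules apply to all crawlers, and each `Disallow` line names a path crawlers should skip.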
Another thing you want to do is eliminate duplicate content. When you publish a blog post, the same content typically appears in at least three places: the post’s own page, your home page, and archive pages such as category or date listings. You don’t need three different pages in Google that all carry the same content. That’s confusing both for the Google bot and for searchers.
The robots.txt is your chance to point out to the Google bot which folders contain duplicates. It will skip those and index everything else.
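For example, on a typical WordPress blog the category, tag, and date archives repeat content that already lives on the individual post pages. A sketch of how you might exclude them (paths and domain are assumptions; check how your own site is structured before copying this):

```text
User-agent: *
# Typical WordPress archive folders that duplicate post content
Disallow: /category/
Disallow: /tag/

# You can also point crawlers at your site map from here
Sitemap: http://www.example.com/sitemap.xml
```

The `Sitemap:` line is a standard robots.txt directive, so the bot finds your site map and your exclusion rules in the same place.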
The MOBE Gold Masterclass goes into more depth on this topic. It will show you how to set up your website and build a consistently profitable business using a Customer Acquisition Process. To learn more about the Gold Masterclass, click HERE.