The Google Sandbox is the term given to the holding area for domains against which Google has raised a red flag. When a domain is placed in the Sandbox it does not receive a ranking in the search engine, its content does not get crawled by Google's spiders and the website's indexed pages are placed into a supplemental index. Getting sandboxed is one of the nightmare scenarios for webmasters and online traders. If you have conducted and implemented proper keyword research in relation to your product, then search engine referrals can account for up to 90% of your traffic. Invisible websites cash invisible cheques.

Google can place a website in the Sandbox if it conforms to any of the following criteria:

- The domain name is newly registered.
- The domain or website is constantly changing its IP or DNS address(es).
- The website links to, or receives links from, 'bad neighbourhoods' (such as other websites in the Sandbox or those with questionable content).
- The website is involved in a link farm, or has used Black Hat tactics to achieve a higher than justified ranking in the search engines.
- The website has abused 301 Permanent redirects.

Some people believe that the Sandbox doesn't exist. Matt Cutts, Google's Chief of Web Spam, explained in a 2005 interview that Google's search algorithm "might affect some sites, under some circumstances, in a way that a webmaster would perceive as being sandboxed."

How to tell if you have been Sandboxed

The quickest way to determine if you have been Sandboxed is to check whether your content is being indexed by the search engines. Open up Google.com and type the following command into the search bar – site:your-domain-name.com – I have demonstrated this in the screenshot below. The 'site' command runs a query against Google's data centres and reports how many of your site's pages are indexed. Only pages indexed by Google appear as results when end users search.
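If you prefer to script this check, the 'site:' query URL can be assembled with a few lines of Python. This is only a sketch using the standard library: it builds the URL you would open in a browser, since querying Google's results programmatically is against its terms of service.

```python
from urllib.parse import quote_plus

def site_query_url(domain):
    """Build the Google search URL for a 'site:' index check."""
    return "https://www.google.com/search?q=" + quote_plus("site:" + domain)

# Open the printed URL in a browser to see how many pages are indexed.
print(site_query_url("your-domain-name.com"))
```

The function simply URL-encodes the `site:` operator and your domain, so the same helper works for any domain you want to check.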
If your site isn't indexed then it won't appear. The image below shows the corresponding result of the site query. It displays the number of pages indexed in Google for the domain macblogger.net. As you can see, there are approximately 31 pages of this website indexed by Google, which indicates that the domain is not Sandboxed. If the domain were Sandboxed there would be a grand total of zero results.

How to avoid getting Sandboxed

Websites which Google trusts do not appear in the Sandbox. So how do you gain trust? This can be a time-consuming process: not because it requires a lot of attention or man-hours on your part to get un-boxed, but because Google takes its time in applying trust to new websites. Patience is a virtue here, as it can take up to six months to make a reappearance in the search engine results pages (SERPs). That said, there are a few recommendations I can make to new site owners who hope to avoid getting Sandboxed in the first place.

"The Do Not's"

Don't use search engine auto submit tools. Google, and the other search engines, want things to occur naturally. Using auto submit tools to submit your site to hundreds of search engines is a sure way to draw unwanted attention to your website, and can be seen as mass spamming. I don't recommend using Google's Add URL (Add your URL to Google) page either. By using this tool you are adding your website to a list of sites for Google to crawl. It's best to avoid this list and instead focus on getting a handful of quality backlinks from relevant and trusted sites. This is the organic approach which Google prefers.

Don't race into a link building strategy. Google loves links; they are one of the major factors that can propel your website from search engine obscurity to the top of the rankings. If Google feels that your site is obtaining too many links too quickly then your site is a candidate for the Sandbox.
Google wants to see organic linking between websites which have relevancy to one another. Very rarely does a brand new website receive hundreds of links instantly; Google may conclude you are involved in a link farm or that you are buying links in order to inflate your backlink count. Build your links slowly and as naturally as possible, and choose wisely who you link to and who you ask for links from. If you are in doubt about linking to a website then add 'rel=nofollow' to the link. The 'nofollow' attribute advises search engine spiders not to follow the link, as you cannot be certain the link is trustworthy.

Respect 301 redirects. If your website has moved from old-domain.com to new-domain.com, then a 301 redirect is normally used to transfer the entire site's content from the old domain to the new one. This is sometimes abused by developers who purchase a number of websites and attempt to redirect those sites' indexed content to a new domain.

"The Do's"

Register your domain for as long as possible. Most new websites register their domain name for a single year and renew it annually. If you register your domain for as long as you can afford, you build trust with the search engines. By registering your domain name for five or more years you are giving a clear indication that you plan on being here for the long haul. Google became an official domain name registrar in 2005 and is very likely using ICANN registration records within its search algorithm to some extent.

Purchase an SSL Certificate and use it. SSL Certificates are used on web pages whose content needs to be encrypted. Checkout pages for online shops and website logins generally use SSL to build trust with the end user, and search engines can respond in the same way. By securing a page on your website with an SSL Certificate you are further building trust with the search engines.

Host your website on a dedicated IP. Most small websites are hosted on shared servers.
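To see which IP address your domain currently resolves to (the address you would be sharing with every other site on a shared server), a simple DNS lookup is enough. A minimal Python sketch using the standard socket module follows; localhost is used here only so the example runs without network access, so substitute your own domain when trying it.

```python
import socket

def resolve_ip(hostname):
    """Return the IPv4 address a hostname resolves to."""
    return socket.gethostbyname(hostname)

# Substitute your own domain, e.g. resolve_ip("your-domain-name.com").
# localhost is used here so the sketch works offline.
print(resolve_ip("localhost"))
```

If several unrelated domains resolve to the same address as yours, you are on shared hosting and share that IP's reputation with them.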
A shared server can host thousands of other websites. If some of these websites get blacklisted by Google or any other search engine, then there is a possibility that your site might follow suit and join them on the list, because all of the websites hosted on the shared server share an IP address with each other. It's best to avoid any common denominators with blacklisted sites, and the only way to do so is to host on a dedicated server. If you are serious about your company's online presence then you should use dedicated hosting.

Sit on your site. Once you have registered your site you should put a few pages of content up. These pages don't have to be part of the final design of the site, but they act as an indicator of the site's future content and give search engine spiders something to crawl; crawling websites is their job. If all you have is a logo with 'coming soon' text, the spider won't care much for your site and won't be in a hurry to return. Leave approximately four or five pages of content on your site. In a month's time, check whether your site appears in the search engine results by running the command 'site:my-domain-name.com'. If the spiders have indexed your site then you have successfully avoided the Sandbox.

Conclusion

It is imperative that you avoid the Sandbox. If you abuse the system then the system will make you pay. Read up on Google's Guidelines for Webmasters (Webmaster Help Center) and avoid Black Hat SEO tactics. If you implement the advice I have provided then you should avoid having your website Sandboxed.