The So-Called Sandbox is Page-Level NOT Site-Level

For those of you who do not know, the Google Sandbox is widely thought of as a probationary period for new websites, during which a site has to really struggle to show up in Google’s search results. Many people deny that the Google Sandbox even exists.

I’d like to argue that the Sandbox, defined as a probationary period for new sites, does not exist. However, I do think that modern search engines apply a buffering algorithm. In my experience, this buffering is applied on a page-by-page basis, and even old websites with plenty of backlinks can suddenly find many of their pages diminished in the search results, simply due to a loss of page-level authority.

What’s my evidence?

I have several websites with between 20 and 50 total pages. These websites are not under any obvious manual penalty. Yet of those 20 to 50 pages, only 1 or 2 get any search traffic.

You might suggest that this is coincidence, but there is a common pattern. The 1 or 2 pages that do get traffic were each promoted as linkbait. They each received numerous natural links (ranging from 3 to 141). And 6-9 months after being promoted as linkbait, they still draw in dozens of searches each day. The other articles on the site receive ZERO searches each day.

Possible counter evidence

On some of my websites, a new article will show up high in the search results within minutes of being published. This seems to indicate that modern search engines favor sites that meet some authority criterion and hold back sites that fail to meet it. In other words, it suggests that Sandboxing takes place at the domain level.

I think this is wrong. Instead, you should think of strength/authority as a page-level value. The homepage of any website has a certain strength/authority value, and it is often the strongest of any page on the site. When a new article is posted to a homepage with high strength/authority, that strength gets transferred to the new article.

If a site has a very weak homepage, then you will not see the transfer of authority from the homepage to the new article. In fact, if you’ve done some linkbait articles, it could turn out that your linkbait articles have more authority than your homepage. Keep in mind that some of the authority from your linkbait articles will leak into your homepage, so the more linkbait you do, the better it is for your homepage. But if you’ve only done a few, it’s still quite possible that your homepage is weak.
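To make that mechanism concrete, here is a toy sketch in Python of the way I picture it. This is purely illustrative: the page names, scores, rates, and the visibility threshold are all invented for the example, and it is not meant to describe Google’s actual algorithm.

```python
# Purely illustrative toy model of page-level authority -- NOT Google's
# actual algorithm. Every page name, score, and rate below is invented.

authority = {
    "homepage": 2.0,            # weak homepage with few links of its own
    "linkbait-article": 8.0,    # earned its own natural backlinks
    "ordinary-article": 0.0,    # never promoted, no links of its own
}

VISIBILITY_THRESHOLD = 3.0      # pages below this stay "buffered" out of the results
TRANSFER_RATE = 0.5             # share of homepage authority passed to a brand-new post
LEAK_RATE = 0.2                 # share of an article's authority that leaks up to the homepage

def leak_to_homepage():
    """Successful articles strengthen the homepage a little."""
    for page, score in list(authority.items()):
        if page != "homepage":
            authority["homepage"] += LEAK_RATE * score

def publish(page):
    """A new article inherits a slice of the homepage's current authority."""
    authority[page] = TRANSFER_RATE * authority["homepage"]

def visible(page):
    """Page-level gate: each page passes or fails on its own score."""
    return authority[page] >= VISIBILITY_THRESHOLD

leak_to_homepage()              # the linkbait success props up the weak homepage
publish("new-article")          # a new post inherits from the (now stronger) homepage

for page, score in authority.items():
    print(f"{page:18} authority={score:5.2f} visible={visible(page)}")
```

Run it and you’ll see that only the linkbait article clears the threshold on its own, the homepage crosses it only after the linkbait authority leaks upward, and a brand-new article inherits just a fraction of the homepage’s score, so it stays buried.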

More counter evidence

I’ve had websites whose articles showed up high in the SERPs, and then the homepage accidentally lost thousands of backlinks. Within a month, almost all search referrals stopped… it was almost like a retro-Sandbox… except for the search referrals coming into the linkbait articles. You see, the linkbait articles had independent authority that wasn’t fully removed when the homepage lost its authority.

Conclusion

If you like the Sandboxing metaphor for understanding the way search engines list your articles in their results, the most important thing to remember is that Sandboxing analysis takes place for each page on your site… including your homepage. By building up natural backlinks to the articles on your site, you transfer authority throughout the site, including to your homepage. The more authority coming into your site from a variety of angles (articles, category pages, homepage), the stronger your index pages will be, and the stronger your index pages, the more quickly new articles will show up high in the search results.

One thought on “The So-Called Sandbox is Page-Level NOT Site-Level”

  1. Thanks for this thought – it may indeed explain why the majority of my pages (over 3,000 of them) don’t draw traffic and keep disappearing off Google’s radar. Don’t know if it’s true or not, but I’m going to run a few trials and see if I can make some changes. At least it gives me a working hypothesis and a way to test and/or possibly improve rankings/traffic. Good post.
