
Confusing Correlation With Causation in SEO

I am flabbergasted at how frequently I come across either really bad SEO advice or wild conjecture with no supporting results to back up the claims. However, given the secrecy surrounding the search engine algorithms and the time-consuming nature of rigorously testing SEO tactics, it is understandable that correlation often gets confused with causation.

Here is an example from this weekend demonstrating how easy it is to jump to a conclusion based on a correlation that may or may not be supported by causation. On Friday, I created a Twitter account for a site that sells plastic shopping baskets at twitter.com/shoppingbaskets. Over the weekend, the shopping basket website linked from that new account moved from the second to the first page on Google for the term “shopping baskets”. The easy conclusion to reach is that creating the Twitter profile caused the jump on Google. However, as tempting as it is to think I have stumbled onto an “ah-ha”, this experiment needs to be repeated numerous times before any conclusion can be drawn.

Assigning causation for a change in search engine rankings to any single factor is likely to be wrong because: a) it is seldom possible to devise a clean test that isolates just one factor influencing the results, and b) the search engine algorithms are tweaked almost daily, so a change in rankings may have nothing to do with any measurable activity; a quiet change in the algorithm may be solely responsible.
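To make point b) concrete, here is a minimal simulation in Python of how ordinary day-to-day ranking noise, with no tactic applied at all, could move a page-two result onto page one over a weekend. Every parameter is invented for illustration; nobody outside the search engines knows the real size or shape of daily fluctuations.

```python
import random

# Illustrative only: all of these parameters are assumptions.
# Suppose a site sits at position 12 (page two) and daily algorithm
# tweaks nudge its rank up or down by a few positions at random.
TRIALS = 100_000
START_RANK = 12
DAYS = 3             # e.g., Friday through Sunday
MAX_DAILY_SHIFT = 3  # assumed size of a random daily fluctuation

first_page = 0
for _ in range(TRIALS):
    rank = START_RANK
    for _ in range(DAYS):
        rank = max(1, rank + random.randint(-MAX_DAILY_SHIFT, MAX_DAILY_SHIFT))
    if rank <= 10:  # landed on page one (ten results per page)
        first_page += 1

print(f"Chance of reaching page one by noise alone: {first_page / TRIALS:.1%}")
```

If random noise alone can produce the observed jump a meaningful fraction of the time, a single before-and-after observation proves nothing.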

Testing the impact of a single tactic on search engine rankings can produce interesting results, but those results are unlikely to be definitive. Link diversity is widely assumed to be an important factor in the ranking algorithms, so a test of any single tactic will typically create an unnatural link profile, and the results may not be replicable in the “wild”. Also, with Google reportedly weighing some 200 ranking factors, any attempt to test a single factor may be muddied by other influences that are not being controlled for.

I am not discounting SEO testing. However, for the testing to be valid, it should probably be conducted on a number of sites, with the results measured over a period of more than 30 days. With that caveat in mind, I still recommend reading Alex Whalley’s test of article marketing and the SEOptimise test of attempting to rank a new domain with just Facebook and Twitter. Because both tests were conducted on single sites, it is impossible to know what outside factors may be influencing or polluting the results. But they are interesting tests, and neither author makes wild, unsubstantiated claims based on the results.
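For anyone attempting that kind of multi-site test, here is a rough sketch of what the analysis might look like. The rank changes below are entirely hypothetical, and the permutation test is just one simple way of asking whether the sites that applied the tactic actually outperformed a control group by more than chance would allow.

```python
import random
import statistics

# Hypothetical data: change in Google position over 30+ days
# (negative = moved up) for sites that applied the tactic versus
# control sites that did not. All of these numbers are made up.
test_changes = [-4, -2, -5, 0, -3, -1]
control_changes = [-1, 1, -2, 0, 2, -1]

observed = statistics.mean(test_changes) - statistics.mean(control_changes)
print(f"Observed difference in mean rank change: {observed:+.2f}")

# Permutation test: shuffle the group labels many times and count how
# often chance alone produces a gap at least as large as the observed one.
pooled = test_changes + control_changes
extreme = 0
RESAMPLES = 10_000
for _ in range(RESAMPLES):
    random.shuffle(pooled)
    diff = (statistics.mean(pooled[:len(test_changes)])
            - statistics.mean(pooled[len(test_changes):]))
    if diff <= observed:  # as negative (as good) as the real result
        extreme += 1

print(f"Permutation p-value: {extreme / RESAMPLES:.3f}")
```

A permutation test suits this situation because rank data from a handful of sites is a small, messy sample, and the test makes no assumptions about how rank changes are distributed.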

Conclusion

When it comes to SEO advice, be very skeptical of claims that are not supported by actual results. And even where there is supporting evidence, ask whether causation has really been demonstrated or whether it is simply correlation.