Considering the amount of money at stake, SEO seems to suffer from a wash of supposition and theory, complete with conclusions based on nothing more than observations that are seldom tested.
Contrast that with the equally obscure theories about the existence of new particles such as the Higgs boson: theories that are nevertheless tested empirically.
There's the difference.
In particle physics, someone develops a theory that a particle should exist and then designs an experiment to test it. The existence of many particles, such as the Higgs boson, can only be shown by the effect they have on their surroundings, or by the production of other particles that, in theory, could only come from a Higgs boson.
Contrast that with SEO, where we get as far as observing that something appears to have happened to search results and then conclude that the algorithm contains a certain something that made those results change.
But then we miss out the next step.
The next step should be to theorise that the algorithm contains a function X, and that if it does, we should be able to observe something very specific happening: a very specific set of observations that should change.
The obvious argument against this is that there are too many variables to run the experiment. But that just leaves us with guesswork. Is that good enough?
Suppose someone does start doing those experiments, however difficult. Imagine the advantage that would give them. Imagine how many SEO snake oil salesmen would go out of business. Nice.
Here's one theory: Google has it in for exact-match domains (EMDs). No subtlety. Just de-rank them. It may sound stupid, but this theory is spouted all over the net.
What would we expect from a blanket 'hammer EMDs' function in the algorithm? That EVERY EMD would suffer ranking loss.
Clearly that has not happened.
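The blanket-EMD theory is falsifiable in exactly this way: if the algorithm de-ranked every EMD, then no EMD in a tracked sample should have held or gained position after the update. A minimal sketch of that check, assuming you already have before/after rank data (the domains and numbers below are invented for illustration):

```python
# Hypothesis: a blanket EMD penalty exists.
# Prediction: EVERY tracked exact-match domain loses rank after the update.
# A single counter-example falsifies the blanket version of the theory.

# Invented sample data: rank before and after the update (lower = better).
ranks = {
    "cheap-widgets.example":  {"before": 3,  "after": 12},
    "buy-gadgets.example":    {"before": 7,  "after": 7},
    "best-sprockets.example": {"before": 5,  "after": 2},
}

# Any domain that held or improved its rank contradicts the prediction.
counter_examples = [
    domain for domain, r in ranks.items() if r["after"] <= r["before"]
]

if counter_examples:
    print("Blanket EMD theory falsified by:", counter_examples)
else:
    print("Prediction held for this sample; theory survives, for now.")
```

The point is not the code but the shape of the test: the theory commits to a prediction, and one observation to the contrary is enough to kill the blanket version of it.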
Here's another theory: Google has a function in its algorithm that penalises sites where the title, description and meta tags share the same structure, with only the wording changed to suit the keywords targeted.
In which case, ANY site with templated keyword changes would have suffered.
So why did so many shops have good rankings, despite being built on templated site builders with very similar title and meta-tag structures?
We can't observe what is happening in the algorithm, but we can theorise on what we would expect to see if certain things were present within it.
Suppose Google had a function within the algorithm that de-ranked EMDs IF they also had numerous external links to retailers and contained no unique content. Forget why it might have that. What would you expect to see in the rankings?
...and what has been seen ?
Suppose the Google algorithm had a function that rewarded unique structure as well as content ?
There's one to have a think about...
Would you only see one price-comparison site in the first two pages?
Clearly anyone who discovers an aspect of the algorithm that benefits their own sites is not going to broadcast it.
So perhaps a lot of webmasters would do well just to study scientific method a little bit and take a cold look at facts.
If you read SEO material, you can fill your head with all sorts of possibilities, so why not write them all down, and then write down what you would expect to observe IF each theory were true?
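One way to take that cold look is literally to keep the list in a form that forces each theory to carry a testable prediction. A minimal sketch of such a hypothesis log, where the entries simply restate theories discussed above (nothing here is a claim about what Google actually does):

```python
# A simple hypothesis log: each entry pairs a theory with the specific
# observation that would be expected IF the theory were true.

hypotheses = [
    {
        "theory": "Blanket penalty on exact-match domains",
        "prediction": "Every EMD loses rank after the update",
    },
    {
        "theory": "Penalty on templated title/meta structures",
        "prediction": "Any site with templated keyword swaps loses rank",
    },
    {
        "theory": "Reward for unique structure as well as content",
        "prediction": "At most one price-comparison site in the first two pages",
    },
]

for h in hypotheses:
    print(f"IF {h['theory']} THEN expect: {h['prediction']}")
```

Once each theory is written this way, a single look at real rankings can rule some of them straight out.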
It might just give you a few clues.
Google’s updates are like the Higgs boson: you can only infer their properties from their effect on their surroundings, and when you try to isolate them, they immediately decay into other updates.