The title of this post looks clickbaity, but it is not. I am going to explain why all the SEO information you know is likely to be wrong.
Who am I to have the audacity to tell you that you don’t know SEO?
Do I know something you and other SEO experts are not likely to know?
Let me answer that right away.
I am not an insider who sneaked into Google’s secret laboratory to steal the golden SEO formula. However, if you read this post to the end, I am going to prove that everything you know about SEO is likely to be wrong.
I am going to show that what you believe is reliable SEO information is nothing but personal belief.
Ok, let’s get into that!
In 2009, Google’s Matt Cutts said there are over 200 variables used to decide where a website should rank. Ten years later, what are the chances that there are still only 200 ranking factors?
Google has improved its algorithm considerably since then. Since Google hasn’t publicly announced an updated number, I can only speculate that there are now over 2,000 ranking factors.
Also, who knows whether Matt Cutts told the truth about the actual number of ranking factors. Maybe Google uses only 20, but wants you to think the algorithm is more sophisticated than it really is.
Maybe that is the only way to discourage you from trying to game the algorithm.
In my opinion, a talk given by a Google representative 10 years ago should not be treated as factual SEO information today. Unfortunately, most SEO information available on the internet is still based on it.
Let’s talk a little bit about link building.
Many SEO experts and internet marketing influencers make videos teaching how to build links the “right” way.
They say you should be doing only “white hat link building” or “gray hat link building,” etc. Most experts cannot even agree on the definitions of these terms.
Some SEOs recommend guest post links as white hat links, while another group classifies them as gray hat links.
Google, on the other hand, discourages guest blogging completely.
Here is what Matt Cutts said:
Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.
Here is what else he added:
So stick a fork in it: guest blogging is done; it’s just gotten too spammy. In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well. Likewise, I wouldn’t recommend relying on guest posting, guest blogging sites, or guest blogging SEO as a linkbuilding strategy.
Then should you do guest blogging?
Truth be told, guest posting looks like an effective way of building links today. There is almost no other reliable way to build links, since Google has patched the other holes in its algorithm.
However, it is not a white hat practice, because any time you contact someone to build links, you are violating Google’s terms of service.
It is likely that in the coming years Google will roll out algorithm updates to kill guest blogging completely.
Should you delete or disavow guest post links coming from crappy websites?
I don’t know. In fact, no one knows other than Google.
Anyone who advises you to build or delete certain links is just repeating information they learned somewhere else.
You may buy 20 cheap PBN links from Fiverr and rank your website on the first page of Google. Someone else may pay hundreds of dollars to expensive PBN link vendors and still get caught by Google.
But aren’t there SEO case studies enlightening us about safe link building techniques?
Am I just ignoring those case studies made by well-known SEO gurus?
They have made case studies to prove that guest posting works or that PBNs harm a website.
Let me whisper the truth to you.
Most of these case studies are nothing but garbage. They provide no factual information you can use to improve your SEO campaigns.
I am going to explain, scientifically, why you should not take any SEO expert’s advice as factual information for building your SEO campaigns.
SEO case studies should be scientific experiments; otherwise, they are meaningless.
In science, any given experiment has numerous control variables. A controlled variable is one the researcher holds constant (controls) during an experiment.
If a control variable changes during an SEO case study, it can invalidate the correlation between the dependent and independent variables.
Let’s assume you make a case study about the effect of a particular link coming from a certain website.
During your SEO case study, you would need to stop publishing content, on-page visitor behavior would have to stay constant, and Google must not roll out algorithm updates during your experiment (which it does around 3,000 times a year).
We also cannot ignore the residual SEO effect of a link you built three weeks ago from a PBN, or a citation boost from someone mentioning your name in their content.
Also, all of this assumes a single website. If more than one website is used in the experiment, the situation gets even more complicated.
We don’t even know which variables need to be held constant to run a proper SEO case study, and most of those variables are not under our control.
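The problem can be sketched with a toy simulation (all numbers are invented): a single uncontrolled variable, such as an algorithm update landing mid-experiment, is enough to manufacture a correlation between link building and rankings that the links did not cause.

```python
import random

random.seed(42)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient (no external libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

weeks = list(range(20))
# Links are built at a steady pace throughout the 20-week "case study"...
links_built = [w * 2 for w in weeks]
# ...but an uncontrolled algorithm update in week 10 is what actually lifts
# the ranking score. In this simulation the links have zero causal effect.
ranking_score = [50 + (15 if w >= 10 else 0) + random.gauss(0, 2) for w in weeks]

print(f"correlation(links, rankings) = {pearson(links_built, ranking_score):.2f}")
# A strong positive correlation shows up even though the links did nothing.
```

Mistaking that correlation for the effect of the links is exactly the trap described above.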
What most internet marketers do is use correlation to imply causation.
It is no different from reasoning that if malaria appears around swamps, then swamps must cause malaria. Or that because mosquitoes transmit malaria, every mosquito must be the cause of it.
I think you understand my point.
The SEO community is full of received wisdom. It is very hard to know whether any of it is scientifically proven or just another urban legend.
People say private blog networks don’t work.
When I hear this type of statement, my reaction is: prove it!
Do I think PBNs work? Am I recommending building PBN links?
It doesn’t matter what I think or what your SEO guru thinks. No one knows Google’s algorithm, and none of us has the capability to test Google’s algorithm on a large scale.
Then why do all those SEO case studies exist?
For one good reason: people like to produce scientific-looking case studies to position themselves as experts in the SEO industry. It makes it easier to sell their $3,000 SEO courses or private mentorship programs.
These case studies also tend to attract links, and links lead to better rankings and more profit. In a field where no one knows anything, you can lead people with incomplete information.
Are SEO case studies completely unreliable?
No; SEO case studies done right can reveal some kind of correlation. However, they need a large enough sample size to show statistical significance.
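A minimal sketch of why sample size matters, using the standard t-statistic for a Pearson correlation; the correlation value and site counts below are invented for illustration, not real SERP data.

```python
import math

def t_statistic(r, n):
    """t = r * sqrt((n - 2) / (1 - r^2)); a larger |t| means stronger
    evidence that the correlation is not just sampling noise."""
    return r * math.sqrt((n - 2) / (1 - r * r))

r = 0.05  # a weak correlation, e.g. HTTPS vs. ranking position (invented)
for n in (10, 1_000, 1_000_000):
    t = t_statistic(r, n)
    # As a rough rule of thumb, |t| > 2 corresponds to p < 0.05.
    verdict = "significant" if abs(t) > 2 else "indistinguishable from noise"
    print(f"n={n:>9,}  t={t:8.2f}  -> {verdict}")
```

The same weak correlation that is pure noise across 10 websites becomes overwhelming evidence across a million search results, which is why sample size decides whether a case study means anything.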
As an example, we can look at the case study by Brian Dean.
He analyzed 1 million Google search results. When you can examine that volume of URLs, you can expect to extract some useful data.
Brian Dean is a successful internet marketer who is well respected in the SEO community, and I want to thank him for the time he took to do this research.
However, the findings are still very general. Here are some of the conclusions Brian drew from his study.
Backlinks remain an extremely important Google ranking factor.
Yes, we know backlinks are still a very important ranking factor. But we want to know which types of backlinks matter and what combination of backlinks gives the best results.
At what volume and velocity should they be built so as not to trigger Google’s webspam algorithm?
The average Google first page result contains 1,890 words.
OK, this one is useful. Knowing the average length of content on Google’s first page can help define a modern content format. Thanks for letting us know that a 500-word article doesn’t work anymore.
HTTPS had a reasonably strong correlation with first page Google rankings.
We already know HTTPS is more secure, and that Google gives an SEO boost to websites using SSL encryption.
The question is: how much does having an SSL certificate really matter? We still cannot assign it a numerical weight relative to all the other Google ranking factors.
Our data shows that the use of Schema markup doesn’t correlate with higher rankings.
There are contradicting opinions about structured data in the SEO community. What methodology was followed in this case study to conclude that Schema markup is not a Google ranking factor?
Content with at least one image significantly outperformed content without any images.
Google representatives had already mentioned that having an image on a page can improve user experience and rankings, but it is good to have confirmation that we should include images on our pages.
By the way, how many people have poor SEO results simply because a page lacks an image? Almost all content I find online has at least one image.
We found a very small relationship between title tag keyword optimization and ranking.
So title tags don’t work. Google stopped looking at them a long time ago, and Google representatives mentioned as much earlier.
Site speed matters. Pages on fast-loading sites rank significantly higher than pages on slow-loading sites.
OK, thanks for letting us know. However, is there numerical data we should target? For instance, can we say a webpage that loads in 1 second is 25% more likely to rank on Google than a webpage that loads in 1.5 seconds?
Despite Google’s many Penguin updates, exact match anchor text appears to have a strong influence on rankings.
We know that links with exact-match anchors are more powerful than links with generic anchors or naked URLs. But do we know up to what percentage of links can be built with exact-match anchors without triggering the Penguin algorithm?
Can we say that up to 20% of your backlinks can be exact match without incurring an exact-match anchor penalty?
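No one knows a safe threshold, but measuring your own anchor distribution is at least straightforward. Here is an illustrative sketch; the keyword, the backlink list, and the 20% cutoff are all hypothetical. The cutoff is the figure from the question above, not a limit Google has confirmed.

```python
target_keyword = "blue widgets"  # hypothetical money keyword

# Hypothetical backlink profile: (anchor text, linking page)
backlinks = [
    ("blue widgets", "https://example-blog-1.com/post"),
    ("click here", "https://example-blog-2.com/post"),
    ("https://yoursite.example", "https://example-forum.com/thread"),
    ("blue widgets", "https://example-news.com/story"),
    ("your brand name", "https://example-directory.com/listing"),
]

def exact_match_ratio(links, keyword):
    """Fraction of links whose anchor text is exactly the target keyword."""
    exact = sum(1 for anchor, _ in links if anchor.lower() == keyword.lower())
    return exact / len(links)

ratio = exact_match_ratio(backlinks, target_keyword)
print(f"exact-match anchors: {ratio:.0%}")  # 2 of 5 -> 40%
if ratio > 0.20:  # the 20% cutoff is hypothetical, not a confirmed limit
    print("above the often-cited (but unverified) 20% line")
```

Knowing your own ratio is easy; knowing where Penguin draws the line is the part no case study has ever established.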
Using data from SimilarWeb, we found that a low bounce rate was associated with higher Google rankings.
Yes, a low bounce rate means high engagement on a page. If people land on your site, read your content, and click through to other pages, you are doing great.
However, we still don’t know how important on-page user behavior is. Can I assume user behavior metrics carry a 10% weight in the algorithm?
Brian Dean did a great job with his case study, within the limits of what is possible. Most SEO case studies are not even as comprehensive as this one.
What I want to emphasize here once again is that Google is a multibillion-dollar business. It employs hundreds of the smartest engineers from all around the world, and it uses artificial intelligence and machine learning to learn from earlier experiments and sharpen its systems.
Most SEO case studies are done using tools like Ahrefs and SEMrush. Although these tools are great for improving SEO campaigns, they have limited ability to test the effect of multiple variables on SERP results.
SEO case studies made today, and for many years to come, can only pick up hints about what Google values. There is no methodical way to measure the actual effect of any SEO element in isolation.
Hence, everything you know about SEO is likely to be far from reality. There is even a chance that Google still values PBN links and tolerates them up to a certain level, as long as you don’t take too much advantage of them.
I find that SEO and poker have certain similarities: you are playing against the other players, and Google is the dealer. You can successfully execute a bluff and win, or you can lose everything.
So don’t blindly follow any SEO expert while ignoring your own judgment, and don’t risk more than you can afford to lose.
Have a great SEO journey!