🤖Google’s Ranking Factors: The Complete List for 2020

By Altay Gursel | August 8, 2020


There are over 200 Google ranking factors. In fact, Google’s Matt Cutts said “200 variables”. Obviously, Google is not transparent about its algorithm. We don’t know exactly how many factors there are, what they are, or how they affect websites.

In this post, I will discuss all the ranking factors that Google is likely using to assess websites. They are based on my opinions, research, and observations from analyzing SERP listings.

I believe I haven’t left out anything you need to know.

Let’s get started!

Domain Age

There are only a few things that become more valuable with age. Wine is one, and a domain name is another.

A domain’s age is a ranking factor.

Google takes its time to monitor new websites, forming an opinion of who they are and how they will behave.

Most SEOs call this period the Google Sandbox.

If you start with a brand-new domain, it will take you longer to get where you want than if you had started with an aged domain.

Google really wants to trust new websites before rewarding them with consistent search traffic.

Unless you have a valid reason, it is always better to start with an aged domain. Ideally, you should choose domains that have at least a certain amount of authority.

But how old does a domain have to be before it is no longer considered new?

Let’s get the answer to that question straight from the source.

Here is what Google’s Matt Cutts said:

“The difference between a domain that’s six months old versus one year old is really not that big at all.”
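To put a number on it yourself, you can compute a domain’s age from its WHOIS creation date. The sketch below (Python, standard library only) shows the date arithmetic; in practice you would first fetch the creation date with a WHOIS lookup, for example via the third-party python-whois package (an assumption on my part, not something this post prescribes).

```python
from datetime import date

def domain_age_years(creation_date: date, today: date) -> float:
    """Return a domain's age in years, given its WHOIS creation date."""
    return (today - creation_date).days / 365.25

# Example: a domain registered in mid-2014, checked in mid-2020
age = domain_age_years(date(2014, 6, 1), date(2020, 8, 1))
print(round(age, 1))  # → 6.2
```

Anything above roughly a year is, per the Matt Cutts quote above, no longer meaningfully “new”.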

When starting a new website, consider picking an expired domain. These domains are aged, and for some reason, their earlier owners dropped them.

Make sure you choose one that has a clean history and is relevant to your niche.

How can you check whether you are picking a good domain?

There are popular third-party metrics you can use, such as Moz’s Domain Authority (DA), Ahrefs’ Domain Rating (DR), and Majestic’s Trust Flow (TF).

In my opinion, aged domains with at least a few niche-relevant backlinks are excellent. Obviously, more is better if you can afford it.

The better the backlink profile of a domain, the more expensive it is.

Domains with good metrics are worth a lot because they have “domain authority” and “link juice” behind them to help them rank on Google.

There is no way to know what Google really thinks about a certain domain. The third-party tools I have just mentioned can only estimate it within their abilities.


Should you have your keyword in your domain name?

In my opinion, yes. Google representatives have also hinted that choosing a relevant domain may boost SEO a little bit.

Have a phrase match of your main keyword, or a complementary LSI keyword, in your domain name.

However, avoid picking an EMD (exact match domain).

EMDs look too spammy, and Google devalued them a long time ago.

You can pick brandable domains as well. Domain selection primarily depends on personal choice.

Where Could You Find Expired Domains?

Finding expired domains is quite easy. You can visit expireddomains.net to find a ton of expired or deleted domains.

However, finding expired domains with good metrics and not spammed is hard.

Searching for high-quality domains manually is like looking for a needle in a haystack. It is inefficient and unlikely to give any positive results.

You should either use a tool like Scrapebox or delegate this task to a freelancer on Fiverr. Programs like Scrapebox reverse-crawl the web to find expired domains. If you did this manually, it could take you forever.

Are there any other ways expired domains can help SEO?

Yes, you can benefit from their authority even if you don’t build a website on them.

Some people simply buy expired domains to 301-redirect them to their site. In this way, they pass domain authority to their site. This technique also brings some referral traffic as a bonus.

Keep in mind, it may be okay to 301-redirect one or a few expired domains to your site. However, do not scale this technique: it may trigger Google’s algorithm if done excessively.
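If you do experiment with a redirected expired domain, it is worth verifying that the chain really consists of 301s (permanent) rather than 302s (temporary). A minimal Python sketch: `check_redirect` performs a live request (the stdlib follows redirects for you and reports where you land), while `is_permanent` is a pure helper for a list of hop statuses you might collect with a richer HTTP client such as the third-party requests library.

```python
import urllib.request  # stdlib only; no third-party dependencies

def check_redirect(url: str):
    """Follow redirects and report the final landing URL and status.

    NOTE: performs a live HTTP request; run it against your own domains.
    """
    with urllib.request.urlopen(url) as resp:
        return resp.geturl(), resp.status

def is_permanent(status_chain):
    """True if every hop in a redirect chain was a 301 (permanent)."""
    return bool(status_chain) and all(s == 301 for s in status_chain)

print(is_permanent([301]))       # → True  (single clean 301)
print(is_permanent([302, 301]))  # → False (a temporary hop sneaks in)
```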

Domain History

If you have a domain whose history you don’t know, you may be in trouble.

The majority of domains change registrars multiple times throughout their life cycle.

Some domains were used for black hat projects or for other purposes that search engines consider illegitimate.

If you are using a domain that was penalized in the past, your site’s rankings may be negatively affected by that.

Also, if your domain history is not relevant to what you do with that domain now, you may not get any relevancy boost from Google.

If you plan to buy an expired domain, I highly recommend scrutinizing its entire history on the Wayback Machine.

Here are some tips for choosing expired domains.

Stay away from domains that were previously:

  • used for aggressive affiliate marketing campaigns,
  • used to publish content in a foreign language (or any language other than your target language),
  • used to publish sexual content,
  • owned by a company or organization notorious for a bad reputation.

We never know what Google thinks of a specific domain.

However, we can lower our risk by not starting with a domain that is obviously a bad match.
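The Wayback Machine also exposes a simple “availability” JSON API you can script against to pull up a domain’s closest archived snapshot for a given date. A Python sketch using only the standard library; the response layout (`archived_snapshots.closest`) follows the API’s documented shape.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://archive.org/wayback/available"

def snapshot_query(domain: str, timestamp: str) -> str:
    """Build a Wayback 'available' API query for the snapshot closest to a date."""
    return API + "?" + urlencode({"url": domain, "timestamp": timestamp})

def closest_snapshot(domain: str, timestamp: str):
    """Fetch the closest archived snapshot (live request; may return None)."""
    with urlopen(snapshot_query(domain, timestamp)) as resp:
        data = json.load(resp)
    return data.get("archived_snapshots", {}).get("closest")

print(snapshot_query("example.com", "20150101"))
# → https://archive.org/wayback/available?url=example.com&timestamp=20150101
```

Looping `closest_snapshot` over a few timestamps per year gives you a quick timeline of what the domain used to host.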

Quality Of Content

“Content is King”. I am sure you have heard it before.

It is a very valid statement that you need to fully grasp if you want to earn a competitive Google listing.

High-quality, informative content is hard to create and expensive to outsource. If you plan to outsource your content, you need to be very careful.

Many people who outsource content use platforms like Fiverr and Upwork. However, these are usually a bad idea if you need solid content that is well researched and professionally written.

I have written a dedicated post on ordering content from Fiverr or Upwork. I highly recommend reading it.

I have tried both of those websites to order content in the past. No matter which writer I worked with, the results were terrible.

No matter how detailed the instructions I provided, writers did whatever was comfortable for them.

Surely, there are great writers producing very high-quality content. But it is hard to find writers who create well-researched, human-readable, SEO-optimized content.

Where to outsource your content then?

Ideally, you should work with agencies that have a team of writers. There are agencies offering these kinds of services.

Although it is going to cost you more, you will have a certain level of quality assurance. Agencies don’t keep writers for long if they produce poor results.

I have ordered content from Content Development Pros for one of my niche sites, and the results were very satisfactory.

Publish long-form content

Back in 2012, it was possible to rank on Google with a short 400-word SEO article.

Google SEO is more complicated today. You need well-researched, in-depth content in order to earn a competitive Google listing.

Brian Dean and Eric Van Buskirk analyzed 1 million websites in 2016. According to the research they made, they concluded that:

“Based on SERP data from SEMrush, we found that longer content tends to rank higher in Google’s search results. The average Google first page result contains 1,890 words.”

Needless to say, the longer the content, the higher the chances of ranking on Google. The reason: longer content naturally covers more topics.

Covering more topics means including more ideas and more keywords.

However, you should create long-form content only as long as you stay relevant to your topic.

Here you can learn how to write content that ranks on Google.

Stay relevant

Let’s remember: what is Google’s goal?

Serving the most relevant information the user is looking for.

Content that doesn’t answer the user’s intent is irrelevant. Google measures user behavior to understand whether a certain piece of content meets the user intent.

If our content doesn’t meet the user’s expectations, Google readjusts our position in the SERP by lowering our ranking.

How does Google measure relevancy?

Keywords in the content are Google’s number one signal for measuring relevancy. However, only targeting keywords while providing a poor user experience won’t help you much with Google SEO.

Google constantly measures signals such as CTR (click-through rate), average time spent on site, scrolling behavior, etc.

You may think you have perfectly SEO-optimized content that should rank high on Google. In reality, if no one is reading your content, Google will eventually lower your position to where your site deserves to be.

If you outsource your content from the places I mentioned earlier, your content is likely to be irrelevant most of the time.

Hired writers don’t care whether your content is relevant and engaging. Unless you pay special attention, they will only deliver the word count you asked them to write.

Google ranking is more complicated today than ever. It uses machine learning and can definitely tell whether your content meets the user intent.

Hence, create content that is as comprehensive yet as concise as possible.

Stay away from robot-generated content

Google’s algorithm is smart enough to distinguish human-written content from robot-generated content.

Many people use article spinners or paraphrasing tools to create content for their websites.

These tools usually create readable content that is grammatically fine. However, it lacks the human touch and cannot keep the reader on a website.

If people are not engaging with your content, you will not rank on Google for long.

For Google, unique content does not only mean unique wording. It also means content that is unique in its ideas, presentation, and customization.

Keyword stuffing is bad

SEO copywriting is not about using the same keywords over and over again in a piece of content. Google considers using keywords more often than is natural to be spam. In SEO terminology, this is called “keyword stuffing”.

Google’s algorithm knows how often a human writer naturally uses a certain keyword in content.

Using a keyword more often than is natural is over-optimization. Google doesn’t want to see people optimizing their content to manipulate search engine rankings.

Write your content as if SEO were not a concern. Google will still catch the keywords in your content and will rank your page as long as your content as a whole is relevant.
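One quick sanity check is to measure how much of your text a keyword phrase occupies. A rough Python sketch; what counts as a “healthy” density is a judgment call on my part, not a number Google publishes.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the text's words taken up by a keyword phrase (0.0 to 1.0)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase = keyword.lower().split()
    if not words or not phrase:
        return 0.0
    # Count non-overlapping-style phrase matches at every word position
    hits = sum(
        1 for i in range(len(words) - len(phrase) + 1)
        if words[i:i + len(phrase)] == phrase
    )
    return hits * len(phrase) / len(words)

text = "Green apples are sour. I love green apples more than red apples."
print(round(keyword_density(text, "green apples"), 2))  # → 0.33
```

If a two-word phrase occupies a third of your copy, as in this toy example, you are almost certainly over-optimizing.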

Structure your content well

Google values user experience over everything else, because people will keep using Google only as long as they have a good search experience.

It is important to provide well-structured, easily skimmable content. Visitors should be able to find what they need by merely glancing at the titles and subtitles on your pages.

Never leave large chunks of text on a page. People will be turned off by your content if you don’t make it easy to read.

Include images and other relevant multimedia in your content. Avoid overly long paragraphs; they are really difficult for the human eye to follow.

If your content has many titles and subtitles consider having a table of contents.

Delete low-quality backlinks

It is no surprise backlinks are still a very important Google ranking factor.

However, having backlinks from websites that are spammy or low quality is bad for your rankings.

Google is very likely to ignore low-quality links. This is an optimistic approach though. Your site may get penalized for having them.

Consider deleting the toxic links pointing to your website.

Don’t build PBN links

Private Blog Network (PBN) sites are groups of websites built to rank one or more target websites by providing backlinks.

Google has taken massive action against PBNs in recent years.

Many websites that have backlinks from PBN sites have been penalized.

Many businesses have lost almost all of their traffic. The ones that could disavow the bad links and get rid of the penalties are the lucky ones.

Have a fast website

It is no surprise that slower websites don’t provide a good user experience. People usually don’t wait for slow-loading sites and bounce back to the SERP.

Therefore, Google tends to rank slower websites poorly in search results.

Is there an optimum time for a page to open?

There is no single optimum load time for a webpage.

However, pages that fully load in 3 seconds or less (on both desktop and mobile) should be fine.

Obviously, faster is always better. You can read my other post, linked here, if you want to speed up your website.

Google provides many valuable web performance tools to help site owners and developers optimize their websites.

Google PageSpeed Insights is one of them.


Use it to understand what Google thinks about the performance of your website.
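PageSpeed Insights also has a JSON API (v5) you can script against. The sketch below only builds the request URL; fetching and parsing the Lighthouse scores from the response is left out, and an API key (optional for light use) would be passed as an extra `key` parameter.

```python
from urllib.parse import urlencode

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 API request URL for a page.

    strategy must be "mobile" or "desktop".
    """
    if strategy not in ("mobile", "desktop"):
        raise ValueError("strategy must be 'mobile' or 'desktop'")
    return PSI + "?" + urlencode({"url": page, "strategy": strategy})

print(psi_request_url("https://example.com/", "desktop"))
# → https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https%3A%2F%2Fexample.com%2F&strategy=desktop
```

Scripting the API makes it easy to test a whole list of pages, which matters for the next point.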

Test both the home page and internal pages

A common mistake folks make is testing the performance of the home page only. Although that test is very necessary, it can be misleading as well.

Always test at least several pages from your site to have a broader idea of your site performance.

Use 3rd party tools as well.

Although Google’s tool tells us the performance of our website, it is still a great idea to use other third-party tools.

This can be particularly useful if you target a specific region and want to learn how your site performs when reached from that location.

I like using GTmetrix and Pingdom to test my blogs’ performance.

Keep in mind that a website with a loading time of 1.5 s may not have a competitive ranking advantage over one with a loading time of 1.8 s.

There are over 200 ranking factors, and site performance is only one of them.

Well enough is well enough.

Benefit from Accelerated Mobile Pages (AMP)

Accelerated Mobile Pages are essentially stripped-down, clean HTML versions of existing webpages, offering quicker loading times than their conventional HTML5 counterparts.

This is particularly important for mobile performance. With AMP, your website will load much faster for visitors on a 3G connection.

Also, AMP pages with structured data are more likely to be listed in “Rich Search Results” such as Google’s News Carousel.

Have a Premium Theme

Let’s reverse engineer the low-quality sites for now.

What do the sites that are not planned to be around for long have in common?

Let me answer: a lack of good customization. Low-quality projects focus on throwing up content and hammering it with backlinks.

A custom or premium theme can be a ranking signal.

I don’t mean that websites with free WordPress themes are of low quality. I mean that customization signals uniqueness and authenticity to Google.

You can pick great themes on ThemeForest that will set your website apart from the crowd.

Use a mobile-optimized theme

More than 50% of web traffic goes through mobile devices, and this percentage is expected to rise in the coming years.

If your theme is not mobile-optimized, it is time to get one now.

Use Google’s free Mobile-Friendly Test to check if your website is mobile-friendly.

Have a Logo and Favicon

Google wants to see that websites are uniquely customized. A logo and a favicon are essential hallmarks of a branded website.

Create essential pages

Google wants to list legitimate websites only. Not having a privacy policy, a disclaimer, and a contact page signals a low-quality website to Google.

Display content author

If you post content to the web without showing whose work it is, Google obviously doesn’t like it.

Especially after the E-A-T update, Google prioritizes websites that are transparent with their content.

It is a quality signal to include a short bio and an image for each author on your site. If this is not possible, create a separate page showing your content team.

Private WhoIs is not good

Let’s reverse engineer low-quality sites once again.

Who would want to hide their identity as a website owner?

Maybe a scammer, or someone with a secret agenda for ranking their website.

Google obviously doesn’t like it and expects websites to be transparent. A privately registered domain may be a signal of a low-quality website.

Matt Cutts earlier hinted that Private Whois may be used as a negative ranking factor for a website.

He said:

“Rather than any real content, most of the pages were pay-per-click (PPC) parked pages, and when I checked the whois on them, they all had “whois privacy protection service” on them.

That’s relatively unusual. Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”

I don’t think Google treats public and private WhoIs domain records in the same manner.

Sharing hosting with a spammy neighbor

In my opinion, shared hosting is terrible. On shared hosting, you share the same IP address with other people.

Your neighbor may be running a PBN or other black hat projects on your shared IP.

We have seen Google penalize entire server IP ranges in the past. Get your own unique IP by using high-quality, reliable web hosting that you can trust.

I am using Digital Ocean to host this website, and the hosting experience they provide is truly amazing.

Use a high-quality Webhosting

On shared hosting, you are sharing server resources with other people. A lot of the time, the servers cannot handle the amount of computing power requested by the users.

Budget hosting companies think only of their business, not yours. This naturally causes much more frequent and longer server downtime than premium hosting.

If your website is frequently down, Google will definitely take it as a low-quality signal.

Use Google Search Console & Analytics

Google wants to track everything and does it almost perfectly. 75% of internet browsing goes through the Google Chrome browser.

Why would someone not want to use Google Analytics and its developer tools?

Maybe someone who doesn’t want to share information with Google. Ranking on Google requires being transparent with Google about your website.

Choose the right domain extension

Unless you have a reason to do otherwise, it is best to use the .com, .net, or .org domain extensions.

It is a very bad idea to target global rankings while picking a domain with, say, a .ru extension. Local domain extensions may give you a little boost if you target that specific country.

Have an SSL certificate

HTTPS stands for “Hypertext Transfer Protocol Secure”. It uses an SSL/TLS certificate (commonly with a 2048-bit key) to protect a website connection through authentication and encryption.

It allows a secure connection from a web server to a browser.

Should we use an SSL certificate?

The following are some reasons to consider having an SSL certificate:

  • Most users will not proceed with a purchase if they see that the site doesn’t have a secure connection.
  • Google hinted earlier that it will work harder to make the web a more secure space.
  • The Chrome browser labels websites without a secure connection as “Not secure”.
  • HTTPS enables HTTP/2, which improves page loading times, and we definitely want that.
  • Most of Google’s first-page results are dominated by websites that have an SSL certificate.

You can get an affordable SSL certificate here (from Namecheap).
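A related housekeeping task is making sure the certificate does not expire unnoticed. A Python sketch using only the ssl and socket modules; `cert_days_left` opens a live TLS connection, so point it at your own site.

```python
import socket
import ssl
from datetime import datetime, timezone

def cert_days_left(host: str, port: int = 443) -> float:
    """Connect over TLS and return the days until the certificate expires.

    NOTE: opens a live network connection.
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like "Jun  1 12:00:00 2030 GMT"
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    return (expires - datetime.now(timezone.utc).timestamp()) / 86400

# The date parsing is the pure, testable part:
later = ssl.cert_time_to_seconds("Jun  1 12:00:00 2030 GMT")
earlier = ssl.cert_time_to_seconds("Jun  1 12:00:00 2020 GMT")
print(later > earlier)  # → True
```

Running this from a cron job and alerting when the count drops below 30 days is a cheap safeguard.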

Link to authority sites & internal pages

The internet is a network of information. It works by referencing data on other sources.

If your content doesn’t link to other high-quality sources, it is not credible.

It is equally important to link to our own internal pages. This creates a better user experience and passes link juice to the internal pages.

Building a content silo architecture is my priority when creating websites. There are many resources online showing how you can build a powerful silo architecture.

Keep anchor text distribution natural

Google’s algorithm is very strict with regard to anchor text ratios. It is necessary to have a natural mix of the following anchor text types.

Exact-match

“Exact match” anchor text is using the exact keywords in the anchor text. For instance, linking to a page about “green apples” and using the anchor text green apple.

Partial-match

“Partial match” is including some part of the keyword in the anchor text. For instance, linking to a page about “sour green apples” and using the anchor text green apples.

Branded

A brand name used as the anchor text. For instance, “Metriculum” or “Metriculum Blog” are branded anchors for my website.

Naked URL

Using the anchor “https://metriculum.com” as the anchor text.

Generic

Using generic anchor text or phrases such as “Learn more”, or “Click here”.

Images

Links that come to a website through images (the image’s alt text acts as the anchor).
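To audit your own backlink profile, you can roughly bucket anchor texts into the categories above. This is a heuristic sketch: the keyword and brand inputs, the precedence of the checks, and the category labels are my own choices, not an industry standard.

```python
def classify_anchor(anchor: str, keyword: str, brand: str) -> str:
    """Rough anchor-text classifier for the categories described above."""
    a = anchor.lower().strip()
    if not a:
        return "image"  # empty/missing anchor text, e.g. an image link
    if a.startswith(("http://", "https://", "www.")):
        return "naked-url"
    if brand.lower() in a:
        return "branded"
    if a == keyword.lower():
        return "exact-match"
    if any(w in a.split() for w in keyword.lower().split()):
        return "partial-match"
    return "generic"

print(classify_anchor("green apples", "green apples", "Metriculum"))       # → exact-match
print(classify_anchor("sour green apples", "green apples", "Metriculum"))  # → partial-match
print(classify_anchor("click here", "green apples", "Metriculum"))         # → generic
```

Tallying the categories over all your backlinks quickly shows whether, say, exact-match anchors dominate unnaturally.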

Have plenty of sitewide content

If you publish a single post, no matter how good it is, the chances of ranking it are very low.

The more content published on a website, the more trust and relevancy Google will assign to that website.

“Topical authority” is an important Google ranking factor, showing authority over a niche rather than authority over a single idea.

Publish a lot of great content. The number of pages on a site is definitely a ranking factor.

Set up branded social media accounts

Google wants to rank brands. Almost all brands you can imagine have a good social presence.

Claim your branded social media accounts if you haven’t done so yet. Create Facebook, LinkedIn, Twitter, and Pinterest profiles at a minimum.

Whenever you publish a new piece of content, syndicate it on all of your accounts. It may be challenging to do this manually.

Consider investing in an automation tool like Revive Social.

Get your content shared

People love sharing good content. If no one is sharing your content, it may signal to Google that you are not creating something good.

Encourage people to share your content. Add sharing buttons, and make your content very easy to share.

Don’t buy/sell links

Buying or selling links is a clear violation of Google’s terms of service.

If your site is reported to Google for buying or selling links, you are in big trouble. It may lead to penalties that are not reversible.

Brand mentions

A website that is not mentioned anywhere is not a brand.

Everyone on the web mentions what great content marketers and SEO professionals Rand Fishkin, Neil Patel, Brian Dean, Barry Schwartz, Nathan Gotch, Jim Harmer, and Rick Kesler are.

Google immensely values websites and businesses that are mentioned everywhere.

Keep an eye on your competitor

“The supreme art of war is to subdue the enemy without fighting.”

The Art of War by Sun Tzu

I like this quote, although I am against the value system it defends. There are many people who will do whatever it takes to gain a better position over their competitors.

There is a chance your competitor has a secret agenda against you. There are dedicated gigs on outsourcing platforms that use “black hat techniques” to devalue a specific website or business.

Do keyword research

The most essential step of a content marketing campaign is keyword research. It starts with understanding the keyword intent. If this is ignored, no campaign can sustain long-term success.

If you want to rank for blue shoes, you need to mention blue shoes in your content. We cannot expect to rank for a keyword if our content doesn’t contain that keyword.

Content that lacks the targeted keyword is considered keyword-deficient. Make sure you use your keyword consistently throughout your content.

Finding overlooked keywords with low competition can help you acquire a competitive advantage in any niche.

Know seasonal trends

Every website has a certain amount of seasonality. However, certain niches feel its effect more intensely.

A surfing blog should naturally expect less traffic during the winter months. Similarly, a snowboarding blog is likely to get less traffic during the summer months.

Knowing the seasonal trends in your niche will help you make better decisions.

No matter what monetization method a website pursues, it will be affected by seasonal trends. An affiliate blog, a display-ads site, or a rank-and-rent business will all feel them, since the number of actual visitors determines the actual income a website generates.

Ranking on Google takes time

Google needs time to evaluate your content before it decides where it should rank. It also wants to trust your website before rewarding you with a good SERP position.

Unless you target a zero-competition keyword, your content needs to age.

Content published by high-authority sites ranks faster on Google because those sites have already established trust with Google.

Use SEO tools

Ranking on Google means competing with others for a better position for your website.

Many internet marketers are using advanced SEO tools to be competitive in their niches.

Is it a must to use SEO tools?

It is almost a must. Fortunately, there are free alternatives to almost all SEO tools on the market.

The bad news is that they are not as effective as the premium ones.

Learn the right SEO strategies

Google SEO is not a “learn once and implement forever” type of science. It is continuously evolving with ongoing algorithmic updates. If you operate on incorrect or outdated information, you will not achieve a good Google listing.

Many people still approach SEO today as if it were 1999. The days of ranking with low-quality content and spammy backlinks are gone.

Google updates its algorithm very frequently.

How many times did you update your SEO information in recent years?

Read Google’s SEO Starter Guide

You want to rank on Google.

Have you ever read Google’s Search Engine Optimization (SEO) Starter Guide?

I bet many people are not even aware of the existence of this guide.

We want to clearly understand what Google tells us.

Follow authority websites

“The only thing I know is that I know nothing, and I am not quite sure that I know that.”

Socrates

If you think you already know everything about SEO, then good luck. Everyone has a unique approach to SEO.

Anyone planning to be consistently successful in this market should be humble enough to learn from others as well. I like reading other authority blogs to sharpen the SEO strategies that I use.

Include semantic keywords

Semantic keywords are the ones that complement our target keywords.

They provide more meaning to our content by creating a “word cloud” and topical relevancy.

We want our content optimized for the true intent rather than just answering a single query.

By doing so, we bring more depth to our content and, in return, more value.

Needless to say, it creates more opportunities to rank for a variety of keywords. You can read my post on how to use LSI keywords to rank your website.

Refer authority sites in your niche

By referring to other authority websites in your content, you create topical relevance for the search engines. Google can better understand your site if your content mentions other sites in your niche.

Here is an example.

If you mention Satoshi Nakamoto in your content, Google will assume that your content may be related to Bitcoin.

Keep outbound links natural

There was a myth that was debunked a long time ago: if you provide a “dofollow” backlink to another website, the site receiving the backlink will suck the link juice out of your site.

That would mean you lose authority from your website while you increase the authority of another website.

This assumption is only partially true.

While you provide value for the website you are linking, your website doesn’t lose value or so-called “link juice”.

There is another debunked myth I need to mention: linking to other websites only with “nofollow” backlinks to protect the authority of your domain.

Google has reviewed billions of websites.

Its advanced algorithm knows what percentage of a site’s outbound links would naturally be distributed as dofollow or nofollow.

If almost all of your outbound links are dofollow, or almost all are nofollow, Google may consider it unnatural.

Google has recently announced new link attributes (rel="sponsored" and rel="ugc") alongside nofollow. As a best practice, I recommend using them wherever it is appropriate.
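You can get a quick dofollow/nofollow count for any page with a small HTML parser. A stdlib Python sketch; what ratio counts as “natural” is not published by Google, so treat the numbers as descriptive only.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Count dofollow vs. nofollow links in an HTML document."""

    def __init__(self):
        super().__init__()
        self.dofollow = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.split():
            self.nofollow += 1
        else:
            self.dofollow += 1

html = '<a href="/x">in</a> <a rel="nofollow" href="https://a.com">out</a>'
auditor = LinkAuditor()
auditor.feed(html)
print(auditor.dofollow, auditor.nofollow)  # → 1 1
```

Feed it the rendered HTML of a few posts to see whether one link type dominates across your site.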

Having no traffic from the other search engines (Bing, Yandex or Baidu)

This one is completely my speculation, and I wouldn’t expect everyone to agree with my opinion.

We know Google can track which websites you are getting traffic from. If you receive a low amount of traffic from Bing, Yandex, or Baidu, this may signal to Google that something is not right with your content.

Why would Google, the biggest search engine ever, need the opinion of another search engine?

Every algorithm, including Google’s, has flaws. It is always good to have a second pair of eyes on any job. I have a strong hunch that Google checks how your website is indexed in comparison to Bing’s index.

This would work as a filter to eliminate people messing with Google’s algorithm and taking advantage of it.

Not building an email list

Websites that build relationships with their readers are considered high quality. Google definitely favors sites where visitors interact with the authors and the business overall.

An email list is a good benchmark for deciding whether a website intends to build relationships with its readers.

Content blocking ads and pop-ups

Google definitely hates content-blocking ads, above-the-fold banners, and other advertising techniques that degrade the user experience.

In 2014, John Mueller said that pop-ups can negatively affect your Google rankings.

Link fonts similar to your content or background

There was a black hat technique popular among internet marketers in the early 2010s: people used link colors identical to their body text to hide links from visitors. It was also very common to use a text color the same as, or similar to, the background color of the page.

Similarly, it was a common black hat technique to hide links behind a 1px-by-1px image or in incredibly small text.

Google strictly penalizes these kinds of actions. If you try to fool Google with such techniques, the odds are very high that you will eventually get caught.

I mention it because I know there are still people doing it, sometimes without even knowing it. Make sure your content, links, and background are set up decently.

Your content is not readable and cannot pass the “Flesch Reading Ease” test

Readability is an important Google ranking factor. If you are using the Yoast SEO plugin, you can automatically track the readability score of your posts.

The “Flesch Reading Ease” formula is a reliable way to score the readability of English-language content. Make sure your content scores at least 60 points on this test.
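The Flesch Reading Ease formula itself is public: 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words). Below is a rough Python implementation; the syllable counter is a crude heuristic, so expect scores to differ a little from Yoast’s.

```python
import re

def syllables(word: str) -> int:
    """Very rough syllable estimate: runs of vowels, minus a silent final 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    n = max(len(words), 1)
    syll = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syll / n)

print(round(flesch_reading_ease("The cat sat on the mat. It was happy."), 1))
# → 108.3 (very simple text; aim for at least 60 on real articles)
```

Short sentences and short words push the score up, which is exactly what the 60-point target encourages.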

Your text is too small to read

Google tells you if your font is too small to read.

Your text should be easily readable on different display sizes, both on mobile and desktop.

The above screenshot is from my Google Search Console. You will notice that Google thought the default font that came with my theme was not readable for humans. I later fixed this issue using the Easy Google Fonts plugin. If you are using WordPress, it is a must-have plugin.

Not using correct grammar & having many writing & spelling mistakes

Always make sure you are using correct grammar and spelling. You can check your article while writing in Google Docs, or alternatively use Grammarly, to make sure your content doesn’t have any mistakes.

Having duplicate content

Duplicate content (External)

Using duplicate content that is plagiarized will devalue your content overall. It may also cause copyright problems with the original owner of the content.

Google loves fresh, unique content that has not been published anywhere else before: content that is unique not only in its words but also in its ideas. All in all, duplicate content is not going to perform well on any search engine.

Duplicate content (Internal)

This one is really tricky. If you are using WordPress, the chances are high that you have some type of duplicate content. Category and tag pages will automatically generate duplicate content on your site unless you pay special attention to them.

Many people are not aware that they have duplicate content on their websites. For instance, serving both the www and non-www versions of your site may trick Google into thinking you have duplicate content.
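To check whether two addresses point at the same page, you can normalize the host before comparing. A minimal sketch in Python; picking the non-www form as canonical is an arbitrary choice here, and in practice you would enforce it with a 301 redirect:

```python
from urllib.parse import urlsplit, urlunsplit

# Assumption: the non-www hostname is chosen as canonical;
# the same idea works the other way around.
def canonical_url(url):
    scheme, netloc, path, query, fragment = urlsplit(url)
    if netloc.startswith("www."):
        netloc = netloc[len("www."):]
    return urlunsplit((scheme, netloc, path, query, fragment))

print(canonical_url("https://www.example.com/post/seo-tips"))
```

Two URLs that normalize to the same string are the same page as far as your site should be concerned.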

Use the right URL structure

Your website's URL structure is an important Google ranking factor. If you are using WordPress, go to “Settings” in your dashboard and find “Permalinks”. You will see several options for setting up your permalinks.

Post name

Set your permalinks to “Post name” before posting any content. There is important merit in doing so for your Google rankings. Let me explain why:

Plain

The “Plain” option is not understandable by search engines. It also doesn't make any sense to visitors.

Day and name

The “Day and name” option is slightly better. However, it timestamps your posts. If your content becomes old and you want to update it, you cannot change the URL without causing a broken page on your website.

Month and name

“Month and name” is very similar to the “Day and name” option we have already mentioned. We don't want our URLs to contain any time data.

Numeric

“Numeric” works, but it gives no boost to our SEO efforts. We cannot include our keywords in a numeric permalink structure. Another minus: no one can tell what a post is about by looking at the URL, and people naturally don't trust links when they don't know where they lead.

Custom structure

“Custom Structure” lets us use a combination of timestamps (year, month, day, etc.) with keywords and tags. However, it is quite complicated to use. Unless you are planning a specific website architecture or silo system, it should not be used.

Use a CDN service

There are many reasons you should be using a high-quality CDN (Content Delivery Network) service. I am not going to mention all of them in this post.

If you cache or temporarily store your content on a CDN, your content is delivered from the edge to your visitors faster than if it is delivered from the origin.

A CDN lets our site load much faster on both desktop and mobile devices. A CDN is also especially important during times when our website receives a large amount of traffic. It simply enhances our site's performance.

Your main navigation does not have your keywords

The main navigation menu is a unique opportunity that you should definitely take advantage of. Instead of using generic labels in your main navigation, try to include keywords that you want to rank for on Google. Keywords in the main navigation menu are repeated on every page of your website.

Your content may be stolen by someone else without you knowing it.

Once you post an article, you need to visit Google Search Console and submit your URL for indexing right away. Many people posting content online today skip this step and expect Google to find their post on its own. Although that may seem okay on a surface level, it carries significant risks.

There are software bots constantly scraping the web to find content that is not yet indexed by Google. If someone finds your content and publishes it before Google indexes your page, as far as Google is concerned, it is not your content anymore.

You would need to report it to Google and convince them that you are the original owner of that piece of content. Stay on the safe side: submit your URL to Google as soon as you publish it.

You don’t have meta titles & meta descriptions, or not optimized properly

The whole process of getting traffic from Google works like this:

Google lists your website in its SERP. The user chooses your website from among the other alternatives (your competitors) and clicks through to your site.

The parts of your website shown on the SERP are the “meta title” and “meta description”.

If you don't optimize your meta titles, people will have no reason to click on your website. A low CTR (click-through rate) is an important signal that tells Google your website may not be related to what the user is searching for. We want our meta titles and meta descriptions to entice users to click.

I don't mean we should write clickbait titles that don't represent our content. That causes other problems, such as pogo-sticking. Pogo-sticking happens when a visitor performs a search, clicks through to a website, and quickly clicks back to the SERP.

This tells search engines that there is immediate dissatisfaction with the website that was visited. Pogo-sticking always negatively affects search engine rankings.

Your website has poor customization

Google wants to see that your website is uniquely valuable. Not only your content and images but your website as a whole should be customized. As I mentioned earlier, you should definitely use either a custom or a premium WordPress theme to separate yourself from the crowd.

Beyond that, try changing colors and font sizes to make your website distinctly different from all the others.

In your content, use underlined, bold, italic, and color-marked text wherever it is appropriate.

Not having enough unique multimedia

As mentioned earlier, Google prioritizes a good user experience over everything else.

Part of providing a good user experience is using a healthy amount of multimedia to engage visitors with our content. The more engaging multimedia we use, the more time visitors will spend on our website.

In return, this will improve our Google rankings. I would recommend reading this post to learn the optimum number of images to use in your content.

If you have the budget for it, take advantage of infographics. Infographics are a great way of producing content with the added benefit of high shareability. The more people use and share our infographic, the more backlinks we build to our website.

Another great type of multimedia is video. YouTube is owned by Google, which definitely gives a boost to video content shared on YouTube and embedded on our site.

Consider using audio podcasts as well to diversify the way you use multimedia. In my opinion, Google definitely pays attention to how many different ways we present our content.

When you share video content on your website, do not host the video files yourself, as this may slow down the overall speed of your page. Upload your videos to YouTube and then embed them on your page.

Your content doesn’t let people comment on it

High-quality content is content that engages users. Letting people comment on our posts is a great way to create user engagement.

If you are using WordPress, there is a default “Leave a comment” section under each post. You can let people comment there, provided you can moderate the comments well enough.

There are multiple benefits to doing so. First, it will increase engagement with your post. Also, the more people comment on your post, the more keywords your post is likely to rank for.

Never allow auto-approved blog comments; otherwise spammers will flood your site with spam comments.

You can learn about the SEO benefits of allowing blog comments on your site in this post.

Not Having Enough Backlinks

Backlinks are still a very important factor in determining the authority of a website. They are the web's form of endorsement.

Here is what Matt Cutts said:

“It turns out that backlinks, even though there’s some noise and certainly a lot of spam, for the most part are still a really really big win in terms of quality of search results.”

In the past, Google experimented with lowering the effect of backlinks when ranking websites. The results were terrible.

🔗 Type Of Backlinks

Not all backlinks are created equal. Ideally, we should aim to earn backlinks organically. The way to do that is to create content worth linking to from other sites.

This is not practical every time, hence blogs use outreach to guest post on other blogs. This is a kind of “gray hat link building strategy” that Google may accept as okay.

There are also other ways of building backlinks, but many of them are considered “black hat link building strategies”. There are outsourcing platforms such as Fiverr where people blast thousands of backlinks using automated software. This is extremely dangerous and should be avoided at all costs.

🔗 Link Building Velocity

Link building velocity is an important Google ranking factor. It is very unusual for a new website to have hundreds of backlinks in the first couple of months of its life. Google knows how fast websites naturally acquire links.

Building links faster than is naturally possible may raise red flags in Google's algorithm. Manual actions and manual reviews exist precisely to examine websites suspected of unnatural link building behavior.

🔗 Backlinks that have no backlinks

Having backlinks from webpages that themselves have no backlinks is questionable. The internet is built on networking. If no one else links to the page that links to your page, you have a backlink from an orphan page.

A few of them may be okay, but having too many is not good for your rankings. Hence, many SEO professionals use tiered link building strategies, building backlinks to the pages they have backlinks from.

I am certainly against artificial link building techniques that try to take advantage of the Google algorithm.

You don’t have schema markup code

Schema.org is a joint project of the major search engines to help them understand information more efficiently. It is like giving search engines a better map of how to interpret our content.

Usually, a piece of schema markup code is placed on a page to structure its information.

But how can not using structured data be a negative ranking factor?

If your competitors are taking advantage of structured data and you don’t, then not using it works against you.

An example is the structured schedule data Google shows in its results for the search query “New York bus schedule”.
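Structured data is normally embedded as a JSON-LD script tag in the page head. Here is a minimal sketch that builds one in Python; the schema type and field values are illustrative placeholders, not markup from any real page:

```python
import json

# Illustrative Article markup; the values are placeholders.
markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google's Ranking Factors: The Complete List",
    "datePublished": "2020-08-08",
    "author": {"@type": "Person", "name": "Altay Gursel"},
}

# Wrap it in the script tag that goes into the page's <head>.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(markup)
print(snippet)
```

WordPress SEO plugins such as Yoast generate tags like this automatically, so you rarely need to write them by hand.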

You don’t have fresh content. 

You may have a post titled “Who is likely to be the next US president?” because you thought Bill Clinton might be the next president.

It is time to update that old post you wrote in 1992. Google loves fresh content that is relevant today. It is recommended to update your content at least once a year to show Google it is fresh and still very relevant.

You are in a very competitive niche

Mike Tyson delivered one of the most brilliant quotes ever:

“Everybody has a plan until they get punched in the mouth.”

Think about the following for a second!

If your competitor is better than you, you don't stand much chance of getting a better position, at least until you become better and/or smarter than they are.

The reason for your poor rankings may not depend only on what you do or don't do. You may be trying to compete in a very competitive market that requires much more invested effort.

In my experience, search queries returning 50 million or fewer results still leave plenty of room to rank.

Improper usage of Hyphen / Dash / Underscore In Domain Names

Let's have a look at what hyphens, dashes, and underscores are before deciding which option works best when choosing SEO-friendly domain names.

A hyphen (-) is a punctuation mark used to join two or more words, as in “up-to-date” or “one-third”. Hyphens are okay as long as you don't use more than one in your domain name.

A dash (–) is longer than a hyphen and is used to indicate a range of values or a pause. It is also okay if used in the same way as a hyphen.

An underscore ( _ ), also called an understrike, is a character that originated on typewriter keyboards as a way to underline words, phrases, or numbers. Underscores create an exact match for the keywords they join, which may be useful if implemented strategically.

We want our domain name to be friendly to the search engines, and it is always best not to use any of these characters in our domain at all.

There was a time when using an “exact match domain” (EMD) brought a big ranking boost. Everyone was using very long, keyword-stuffed domains separated with hyphens.

Those times have passed. Google devalued the ranking boost that came with keywords in domain names.

It is still suggested to have your keyword in the domain name. However, very long domains separated with hyphens bring no added benefit. Google will no longer prioritize us simply because we chose the domain name “best-red-cars-run-with-water”.

Improper usage of Dash / Underscore in URL Structure

It is okay to use dashes in our URLs as long as we don't go overboard. Here is what Matt Cutts said:

“If you are going to make a site and you’re starting fresh, so you’ve got a blank slate to work with, I would probably go ahead and go with dashes. I would continue to go with dashes at least for the foreseeable future.”

“Nobody’s slated to be working on that so at least for the time being it’s better to use the dash.”

Long Domain Registration Term

Although it is highly speculative, domains with a longer time until expiration may get a small boost in Google rankings.

Google hates spam and fights against it.

Websites with a longer registration period may signal trust to Google. Planning to be around for a long time is the opposite of what spammers and blackhat SEO websites do.

Not having a reputable Webhosting provider

It is very important to host our websites with a reputable web hosting company. This is related to the trust factor we want to build with Google.

There are many $1/month web hosting companies used to host PBN sites because they provide a unique IP for each hosting plan. Google knows all of these businesses.

Make sure you don't cut corners when selecting a hosting provider. Don't choose the cheapest web hosting company you can find.

Not having your keywords in your categories and tags section

I personally don't use tags when I publish a post. However, I know many people still use them as a good SEO practice. I like to name my categories after the keywords I want to rank for on Google. It is something we should definitely take advantage of, and luckily WordPress makes it very easy without requiring any coding experience.

You didn't submit an XML sitemap

using sitemap
Submitted sitemap to Google

XML sitemaps are very important for SEO. A sitemap is like a blueprint of our website that tells search engines how our site is structured.

Submitting an XML sitemap makes it very easy for Google to find the content we publish. It also makes the crawling job easier for search engines.

It is recommended to submit a sitemap to Google as soon as you publish your first content.

If you are using WordPress, you can generate and submit a sitemap using plugins such as Yoast or Google XML Sitemaps.
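If you ever need a sitemap outside WordPress, the format itself is tiny. A minimal sketch using only the Python standard library; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

# Minimal sitemap builder following the sitemaps.org protocol.
def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap)
```

Real sitemaps often add optional `lastmod` and `changefreq` elements per URL, but `loc` alone is valid.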

Conflicting plugins trying to do the same job

plugins with same functionality
Yoast & Google XML Sitemaps Can Do The Same Thing.

This one is especially important for WordPress users. If you are using plugins that do the same or similar jobs, you may run into problems. For instance, Yoast and “All in One SEO Pack” are two popular SEO plugins.

When used together, they may create technical problems. The same principle applies to any type of WordPress plugin. Similarly, don't run two caching plugins at the same time; your website is likely to crash if two or more plugins conflict with each other.

You have too many plugins

It is always best to have the minimum number of plugins functionally possible. I feel uncomfortable if I have more than 15 plugins in my WordPress dashboard.

Each plugin you use brings additional PHP, JavaScript, and other files with it. This decreases the performance of your website and creates security risks.

Your text/code ratio is too low

The text/code ratio is not a direct Google ranking factor. However, it may hurt you if it is overlooked.

As a good SEO practice, a text/code ratio anywhere from 25 to 70 percent is fine.

Websites with a higher text/code ratio are more readable and user-friendly. The page load speed is also faster, which is another positive factor for SEO. Finally, having less code on our webpages improves their crawl performance.
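You can estimate the ratio yourself with the standard library. This sketch simply divides the visible text length by the total HTML length; for simplicity it does not exclude script or style contents, as a real checker would:

```python
from html.parser import HTMLParser

# Collect the text nodes of an HTML document.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_code_ratio(html):
    parser = TextExtractor()
    parser.feed(html)
    text_len = sum(len(c.strip()) for c in parser.chunks)
    return text_len / len(html)

page = "<html><body><h1>Hello</h1><p>Some content here.</p></body></html>"
print(round(text_code_ratio(page), 2))
```

A tiny page like this lands in the low 30s percent; heavily templated pages often score far lower.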

You don't use styling: fonts, font sizes, underline, bold, etc.

Bold, strong, italic, and underline tags are important for good SEO. They signal the semantic importance of parts of the text to search engines.

They also help us by adding a unique touch to our content.

Not benefiting from bullet lists or overusing it

It is always a good idea to use bullet lists when we have long items to show in order. The benefits bullet lists provide:

  • Better organization of our data
  • Easier to consume content for the reader
  • Better presentation of data for the search engines.

Although they are not a direct Google ranking factor, I believe there are SEO benefits to using bullet lists. Having two or three bullet lists per 1,000 words of content looks optimal for a good user experience.

Don't overuse bullet lists, or your content starts to look unnatural and spammy.

You didn't define your site identity and tagline in WordPress

Tell search engines what your WordPress site is about.

Take advantage of the “Site Title” and “Tagline” sections in WordPress. Your site title and tagline are displayed on every page you create, so write impressive ones with your target keywords included.

You don’t use a caching plugin

This one indirectly improves your website's SEO. Caching is the process of creating static versions of your pages and serving them to your visitors. Static pages usually render more quickly in the browser, so you will have a faster website.

There are very good WordPress caching plugins that help us optimize our website with only a few clicks.

Your page size is too big

Webpages with large file sizes have poor load times. If you don't optimize your multimedia and you use an excessive amount of code, you will have a slower website.

Uninteresting or Non-Novel Content

Google has explicitly discussed uninteresting content that brings no value. Content that is very similar to previously published content is useless. Your content should not only be unique in its wording (enough to pass Copyscape) but should also bring ideas and opinions that have not been discussed before.

Google is believed to assign a “novelty score” to each piece of published content and to use this score as an input to its algorithm.

I would recommend reading this post to improve your content immediately.

Sitewide Average Novelty Score

Google goes one step further with the novelty score: they are believed to assign an “average novelty score” that applies to your content sitewide.

Positive Sentiment in Comments

Websites that attract many negative comments, or very little positive sentiment across the web, may be viewed less favorably.

Not using / improper using rel=”hreflang” tags

Hreflang tags are used by sites that offer similar content in multiple languages. If your content is English-only, you can skip this one.
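The tags themselves are just alternate link elements in the page head. A small generator sketch; the language codes and URLs are illustrative:

```python
# Build alternate-language link tags for a page's <head>.
def hreflang_tags(alternates):
    lines = []
    for lang, url in alternates.items():
        lines.append('<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url))
    return "\n".join(lines)

print(hreflang_tags({
    "en": "https://example.com/page",
    "de": "https://example.com/de/page",
    "x-default": "https://example.com/page",
}))
```

Every language version should list all the alternates, including itself, and `x-default` marks the fallback page.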

Exceedingly Long URLs

Matt Cutts said that using more than about five words in a URL lowers the “keyword in URL” boost. Search engines may treat very long URLs as spammy even when no word is repeated in the URL.
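Checking a slug against that rough five-word guideline is trivial to automate. A small sketch:

```python
import re
from urllib.parse import urlsplit

# Count the words in a URL's path; roughly five or fewer is the
# rule of thumb attributed to Matt Cutts above.
def slug_word_count(url):
    path = urlsplit(url).path
    return len(re.findall(r"[a-z0-9]+", path.lower()))

print(slug_word_count("https://example.com/googles-ranking-factors"))
```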

Long Internal Link Anchors

Longer anchor text is generally natural. However, very long anchor text may be treated as keyword stuffing. In my personal opinion, we should avoid anchor texts longer than 10 words.

High Ratio of Links to Text

Webpages with very little content but many links are considered low-quality pages. People built such pages in the past purely for linking (internal and external). Although we have no perfect data, we should aim for a text/link ratio that is as natural as possible.

JavaScript-Hidden Content

Hiding text with JavaScript is considered a blackhat technique and is not recommended. Although search engines can often crawl such text, it is still bad practice, and you risk a cloaking penalty if you do it.

Copyright Violation

Content published in a manner that violates the “Digital Millennium Copyright Act” (DMCA) or equivalent laws outside of the U.S. can lead to severe penalties. Google tries to detect unattributed sources and unlicensed content automatically, but people can also report infringement directly to Google, which will result in a manual action against the website.

Doorway Pages

Doorway pages are pages built with the sole goal of capturing search traffic. They don't provide any value to visitors. An example would be creating a website listing chiropractors in the US without having an actual business or providing any useful information to users.

Google categorizes this as “spamdexing”, a contraction of “spamming the index”.

Using Clickbait Meta Titles

Using catchy titles to get people to click on our website is good SEO practice. However, using clickbait titles that don't represent our content is against Google's guidelines.

It is also bad for our SEO. If people cannot find on our website what we promised them, they will click the back button and return to the search results page. Google will take this signal as user dissatisfaction and readjust our search ranking position.

Overuse Of Bold, Italic, or Other Emphasis

It is very good SEO practice to use bold, italics, and other emphasis that adds personality to our posts. However, used excessively, it will be taken as spammy activity.

Broken Internal Links

Broken internal links reduce the overall quality score of our website. They also make our site more difficult for search engines to crawl. Make sure you don't have any broken links on your website.

My favorite free tool to find broken links is Screaming Frog SEO Spider.

Screaming Frog SEO Spider
Screaming Frog Crawls Your Website To Show You On-site Problems

It finds almost all on-site problems you can possibly have. It is also a very popular tool used to optimize page headings, meta titles, meta descriptions, etc.
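If you prefer to script the check yourself, the first step is collecting a page's internal links. A minimal sketch using the standard library; requesting each collected URL and inspecting the HTTP status (for example with urllib.request) would then reveal the broken ones:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

# Collect a page's internal links from its HTML.
class LinkCollector(HTMLParser):
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    url = urljoin(self.base, value)
                    # Keep only links on the same host.
                    if urlsplit(url).netloc == urlsplit(self.base).netloc:
                        self.links.append(url)

base = "https://example.com/post"
collector = LinkCollector(base)
collector.feed('<a href="/about">About</a> <a href="https://other.com/">Other</a>')
print(collector.links)
```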

Redirected Internal Links

Matt Cutts suggested in the past that redirects are subject to “PageRank decay”, meaning some page authority is lost every time one page redirects to another. In 2016, however, Gary Illyes tweeted that this is no longer an issue to worry about.

Dynamic Content

Dynamic content is used for different goals: a webpage may display different content for every session, or rotate depending on the time of day or the location of the user accessing it.

Dynamic content can make it genuinely difficult for search engine spiders to understand our content. Using “noindex” on such pages and minimizing dynamic content, particularly where Google can access it, is considered to result in a better user experience and higher rankings.

Page “NoIndex” Tag

Pages with the meta tag “noindex” are not indexed and therefore not shown in search results. Sometimes website owners are not even aware that they have activated the “noindex” value. Make sure this is not the case for any webpage you want to rank on Google.

However, it is good practice to keep this tag active on pages that don't provide much benefit to users. For instance, tag and category pages don't generally provide a good user experience, and it is recommended to keep them tagged as noindex.
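A quick way to audit a page for an accidental noindex is to scan its HTML for the robots meta tag. A minimal sketch:

```python
from html.parser import HTMLParser

# Detect a robots "noindex" meta tag in a page's HTML.
class NoindexChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and "noindex" in (d.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

print(has_noindex('<meta name="robots" content="noindex, follow">'))
```

Note that pages can also be blocked by an `X-Robots-Tag` HTTP header, which this HTML-only check would not see.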

Fake Whois Information

Similar to private whois data, representatives from Google have made it clear that they are aware of this common trick and treat it as a problem. If for no other reason than that it violates ICANN guidelines, and could allow a domain hijacker to steal your domain via a dispute without you getting a say, don't use fake information to register a domain.

Penalized Registrant

Although this is very speculative, you personally may be the reason for your website's poor Google ranking.

If you have previously spammed or used black hat techniques across several sites, Google may have blacklisted your name as a registrant.