
Why Your Website Is Not Ranking On Google?


All Google Ranking Factors You Need To Know

Which is the bigger secret: Google's algorithm or the formula of Coca-Cola?

No one knows exactly how Google's algorithm works. Does it work the way Google tells us it does, or does Google only tell us what it wants us to know?

It is believed that over 200 Google ranking factors exist in the algorithm.

Let's dive in and find out what those Google ranking factors are.

Starting With A Brand New Domain 

When you start with a brand new domain, accept that it will take longer for Google to trust your website. Domain age is still a major ranking factor, and a domain with an established reputation helps a lot.

But how old is old enough to no longer count as new?

Google's Matt Cutts states that:

"The difference between a domain that's six months old versus one year old is really not that big at all."

It is always a good idea to start with an expired domain. If you pick an expired domain, make sure it is relevant to your niche and has a clean history.

Check metrics like Moz's Spam Score and Majestic's Flow Metrics (Trust Flow & Citation Flow) before making any purchase.

In my opinion, aged domains with at least a few niche-relevant backlinks are excellent. Obviously, more is better if you can afford it.

Domains with good metrics are worth a lot, because they have "domain authority" and "link juice" behind them to help them rank on Google.

It is always a good idea to have your keyword in your domain. But you can go with brandable domains as well. Domain selection primarily depends on personal choice.

How Could You Find Expired Domains?

Finding expired domains is not a big challenge. You can simply visit expireddomains.net to list a ton of expired or deleted domains.

However, finding expired domains with good metrics without a spammy background is hard.

I wouldn't recommend searching for expired domains manually. It is too inefficient to give any good results.

There are programs like Scrapebox that help you to find high authority expired domains easily.

These programs reverse-crawl the web, which would take you forever to do manually.

You can benefit from expired domains even if you don't want to build a website on them. Some people simply buy expired domains and 301 redirect them to their site. This boosts domain authority and brings some free traffic as a bonus.

Keep in mind, it may be okay to 301 redirect one or a few expired domains to your site. However, do not scale this technique, since it may trigger Google's algorithm and put you in a bad position.

Domain With A History Of Spam

If you have a domain whose history you don't know, you start the game at a big disadvantage. Many domains change registrars multiple times throughout their life cycle.

Some domains were used for black hat projects or other purposes that search engines consider illegitimate. If a domain was penalized in the past, that drastically lowers your chances of ranking it.

Maybe the earlier owners of your domain used it for a legitimate purpose that simply isn't relevant to your niche. In this case, you won't get a relevancy boost from the expired domain.

If you plan to buy an expired domain, I would highly recommend checking its entire history on the Wayback Machine. Stay away from domains that:

  • ran aggressive affiliate marketing in the past,
  • published content in a foreign language (or a language other than your target language),
  • published sexual content,
  • were owned by a company or organization notorious for a bad reputation.

We never know what Google thinks of a specific domain. However, we can lower our risk by not starting with a domain that was too obviously used for spam in the past.

Low-Quality Content

“Content is King”. I am sure you have heard it before. It is a very valid statement if you want to rank your website on Google.

Providing high-quality content should be the priority of every website. However, there are many mistakes people make while producing their content.

Good, informative content is hard to create. Hence, many people prefer ordering content from Fiverr or Upwork.

Honestly speaking, I have tried those websites to order my content in the past. No matter which writer I worked with, the results were terrible.

No matter how detailed the instructions I provided, the writers did whatever was most comfortable for them.

Surely, there are great writers producing very high-quality content. But it is extremely expensive to outsource well-researched, human-readable, SEO-optimized content.

📌 Thin Content

Back in 2012, it was possible to rank on Google with a short 400-word SEO article. Those times passed a long time ago. Search engines today prioritize longer-form content on their SERPs (Search Engine Result Pages).

Brian Dean and Eric Van Buskirk analyzed 1 million Google search results in 2016. Based on that research, they concluded that:

"Based on SERP data from SEMrush, we found that longer content tends to rank higher in Google's search results. The average Google first page result contains 1,890 words."

Needless to say, the longer the content, the higher the chance of ranking on Google. The reason is that longer content naturally covers more topics, and covering more topics means including more ideas, more keywords, and more keyword combinations.

Here you can learn how to write content that ranks on Google.

📌 Irrelevant Content

Let's remember: what is Google's goal?

Serving the most relevant information for whatever the user typed into the Google search bar.

Content that doesn't answer the user's intent is irrelevant. Google measures user behavior to understand whether our content is relevant.

If our content doesn't meet user expectations, Google readjusts our position in the SERP by lowering our ranking.

How does Google measure relevancy?

I think keywords are the number one ingredient Google uses to measure relevancy. However, only targeting keywords and providing a poor user experience wouldn’t take us too far.

Google constantly tests and measures signals such as CTR (Click-Through Rate), average time on page, scrolling behavior, etc.

We may think we have a perfectly fine, keyword-optimized article that should rank #1 on Google. In reality, if no one is reading that content, Google will eventually lower our position to where it deserves to be.

If you are ordering your content, I bet your content is garbage. Hired writers are not interested in whether your content is engaging, let alone whether the work is keyword optimized and structured with good SEO practices.

Any content can be served in a higher position on Google for a while. But remember, Google's algorithm is getting smarter and smarter.

Optimize your content for the Google RankBrain algorithm. RankBrain is an advanced AI algorithm developed by Google Engineers to sort search results more efficiently. It uses machine learning techniques to constantly improve itself.

Take it or not, my advice is to create your content as if you were a student in an exam.

Your teacher will read it once the exam is finished. The only way to succeed in the exam is to create a great piece of content that your teacher reads and thinks:

"Wow, I have never read something like this before."

That teacher is Google, and it will reward you if you are genuinely good and don't try to fool it.

📌 Spun / Robot-Generated Content

Google's algorithm is smart enough to distinguish human-written content from robot-generated content.

In recent years, article spinning programs have also started using machine learning to imitate human-written content. Honestly speaking, there are great programs that may help you produce content that is perfectly readable for humans.

However, you cannot rely on article spinning programs as your only content-building ingredient.

For Google, unique content does not only mean unique wording. It also means unique ideas, presentation, and customization.

📌 Content with unnaturally high keyword density

Google considers using keywords more often than is natural to be spam. In the SEO community this is called "keyword stuffing".

Google's algorithm knows how often a human writer naturally uses a certain keyword in content. Google has spent billions of dollars to prevent web spam.

📌 Poorly Structured Content

Google values user experience over everything else, because people will keep using Google as long as they have a good experience. As a result, Google keeps making money from advertisers.

It is important to provide easily skimmable content. Visitors should be able to find what they are looking for just by glancing at the titles and subtitles on your page.

You should organize your content into titles and subtitles instead of one large block of text. People get turned off if you don't make your content easy to read.

Use images and other multimedia within your content. Avoid overly long paragraphs; they are really difficult for the human eye to follow.

If your content has many titles and subtitles consider having a table of contents.

Having backlinks from low-quality spammy websites

It is no surprise that backlinks are still a very important Google ranking factor. I am not going to discuss here whether you should build backlinks.

However, having backlinks from websites that are notorious for being spammy will lower your rankings. To be more optimistic, they will not give you any ranking boost even if they don't cause a penalty.

Having backlinks from former PBN sites.

Private Blog Network (PBN) sites are groups of websites built to rank one or more target websites by providing backlinks. Google has taken massive action against PBNs in recent years.

Many websites with backlinks from PBN sites were penalized, and many businesses lost almost all of their traffic. The websites that could disavow the bad links and get rid of the penalties are the lucky ones.

Not having a fast enough website

We already know Google prioritizes user experience over everything else. It is no surprise that slower websites don't provide a good user experience. A lot of the time, people get sick of waiting and bounce back to the SERP.

Really, who wants to wait 6 seconds for a webpage to load?

Hence, Google tends to rank slower websites poorly in search results. There is no single optimum value for how fast a webpage should load. However, webpages that load in 3 seconds or less (both on desktop and mobile) should be fine. Obviously, faster is always better. You can read my other post if you want to accelerate the performance of your website.

Google provides many valuable web performance tools to help site owners and developers optimize their websites. "Google PageSpeed Insights" is one of them.


💨 Test Loading Speed of Home & Internal Pages

A common mistake folks make is testing site speed only for the home page. This type of measurement can be misleading. Always test at least several pages of your site to get a broader idea of your overall site speed.

💨 Use Multiple Measurement Tools To Test Site Speed

You can visit Google PageSpeed Insights to test how fast your website loads. This tool is very valuable because it tells us what Google thinks about our website.

There are also 3rd-party websites you can use to test the speed of your website from a selected server location. This is particularly useful if you target a specific region and want to learn how your site performs when reached from that location.

I also like using GTmetrix and Pingdom to test my blog's performance.

I also believe a website with a loading time of 1.5 seconds has no competitive ranking advantage over one with a loading time of 1.8 seconds. Good enough is good enough.

💨 Use Accelerated Mobile Pages (AMP)

Accelerated Mobile Pages are essentially stripped-down, clean HTML versions of a webpage, offering quicker loading times than the conventional HTML5 version.

This is particularly important to increase mobile performance. Using AMP, your website will load much faster for visitors on a 3G connection.

Also, AMP pages with structured data are more likely to be listed in “Rich Search Results” such as Google’s News Carousel.
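
A minimal sketch of how the two versions typically reference each other (example.com and the paths are placeholders, not from this article):

```html
<!-- On the regular (canonical) page: advertise the AMP version -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP page: point back to the canonical version -->
<link rel="canonical" href="https://example.com/article/">
```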

Not using a premium or custom theme

Let's reverse engineer this for a moment. What do the sites that are planned to be around for a while have in common?

Let me answer: churn-and-burn projects lack good customization, because they only focus on throwing up content and hammering it with backlinks.

A custom or premium theme can be a signal of a serious business. I don't mean that websites with free WordPress themes are low quality; this website runs on a free theme. I mean that customization signals uniqueness and authenticity to Google.

Not Having a mobile-optimized theme

More than 50% of web traffic goes through mobile devices, and this percentage is expected to rise in the coming years.

If your theme is not mobile optimized, it is time to get one now. You can check whether your website is mobile friendly with Google's free Mobile-Friendly Test page.
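
Whatever theme you end up with, a mobile-friendly page starts with a responsive viewport declaration in the <head>. A minimal sketch:

```html
<!-- Tells browsers to scale the page to the device width instead of a fixed desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```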

Not setting up Logo and Favicon

Google wants to see that websites are uniquely customized. A logo and a favicon are essential benchmarks of branding a website.
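
As a rough sketch, this is how a favicon is usually declared in the <head> (the file names are placeholders):

```html
<!-- Classic favicon plus a touch icon for mobile home screens -->
<link rel="icon" href="/favicon.ico" sizes="any">
<link rel="apple-touch-icon" href="/apple-touch-icon.png">
```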

Not having a privacy policy, disclaimer, contact pages

Google wants to list only legitimate websites. Not having a privacy policy, disclaimer, and contact page signals a low-quality website to Google.

Having no author for the published content

You post content to the web, but who knows whose work it is? Obviously, Google doesn't like that either. Mentioning who the author is sends a quality signal. It is even better if you include a short bio and an image of the author.

Locking your domain WhoIs as private

Let's reverse engineer once again. Who might want to hide their identity as a website owner? Maybe a scammer, or someone with a secret agenda for ranking their website.

Google obviously doesn't like it and expects websites to be transparent. A privately registered domain may be a signal of a low-quality website.

Matt Cutts earlier hinted that Private Whois may be used as a negative ranking factor for a website.

He said:

“Rather than any real content, most of the pages were pay-per-click (PPC) parked pages, and when I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”

Logically, a penalized domain owner will naturally want to use private WhoIs. I don't think Google treats public and private WhoIs domain records in the same manner.

Sharing hosting with a spammy neighbor


In my opinion, shared hosting is terrible. On shared hosting, you have the same IP address as other people. Your neighbor may be running a PBN or black hat tactics on your shared IP.

We have seen Google penalize entire server IP ranges in the past. Get your own unique IP by using high-quality, reliable web hosting that you can trust.

Using a low-quality Webhosting

On shared hosting, you are sharing server resources with other people. A lot of the time, the server cannot handle the amount of computing power requested by its users.

Hosting companies think only of their own business, not yours. This naturally causes much more frequent and longer server downtime than premium hosting.

If your website is down frequently, Google will definitely take it as a low-quality signal.

Not connecting your website to Google Search Console & Google Analytics

Google wants to track everything and does it almost perfectly. 75% of internet browsing goes through the Google Chrome browser.

Why would someone not want to use Google Analytics and Developer Tools?

Maybe someone who doesn't want to share information with Google. Ranking on Google requires being transparent with Google about what you are doing.

Choosing the wrong domain extension

Unless you have a specific reason not to, it is best to use a .com, .net, or .org domain extension.

It is a very bad idea to have a website that targets global rankings but uses a .ru domain extension. A local domain extension may give a little boost if you target that specific country.

Not having an SSL certificate

HTTPS stands for "Hypertext Transfer Protocol Secure". It uses an SSL/TLS certificate (commonly with a 2048-bit key) to protect a website connection through authentication and encryption, allowing a secure connection between a web server and a browser.

Should we use an SSL certificate?

Here are some reasons you should consider having an SSL certificate:

  • Most users will not proceed with a purchase if they see the site doesn't have a secure connection.
  • Google has hinted that it will keep working to make the web a more secure space.
  • The Chrome browser labels websites without a secure connection as "Not secure".
  • HTTPS enables HTTP/2, which can improve page loading times and, in turn, rankings.
  • Most of Google's first-page results are dominated by websites that have an SSL certificate.

Content that doesn't link to any authority sites or internal pages

The internet is a network of information; it works by referencing data from other sources. If your content doesn't link to high-quality resources, it is not credible. We want our website to link contextually to reliable sources.

It is equally important to link to internal pages on the website. This creates a better user experience and passes link juice to those pages.

Having a content silo strategy is my priority when building websites. There are many resources online showing how to use silo architecture properly.

Not having a natural anchor text distribution

Google's algorithm is very strict with regard to anchor text ratios. It is necessary to have a natural mix of the following anchor text types (see the HTML sketch after the list).

🔗 Exact-match

“Exact match” anchor text is using the exact keywords in the anchor text. For instance, linking to a page about “green apples” and using the anchor text green apple.

🔗 Partial-match

"Partial match" means including part of the keyword in the anchor text. For instance, linking to a page about "sour green apples" and using the anchor text green apples.

🔗 Branded

A brand name used as the anchor text. For instance, “Metriculum” links to a post about “Metriculum Blog”.

🔗 Naked URL

Using the bare URL 'https://metriculum.com' as the anchor text.

🔗 Generic

Using generic anchor text or phrases such as “Learn more”, or “Click here”.

🔗 Images

Links that come to a website through images; the image's alt text acts as the anchor.
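
To make these categories concrete, here is a hedged HTML sketch of each anchor type. The example.com URLs and file names are placeholders; metriculum.com is the domain used in the examples above:

```html
<!-- Exact match -->
<a href="https://example.com/green-apples/">green apples</a>

<!-- Partial match -->
<a href="https://example.com/sour-green-apples/">where to buy green apples</a>

<!-- Branded -->
<a href="https://metriculum.com/">Metriculum</a>

<!-- Naked URL -->
<a href="https://metriculum.com/">https://metriculum.com</a>

<!-- Generic -->
<a href="https://example.com/green-apples/">Learn more</a>

<!-- Image link: the alt text acts as the anchor -->
<a href="https://example.com/green-apples/"><img src="green-apples.jpg" alt="green apples"></a>
```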

Not having a good amount of content sitewide

If you publish a single post, no matter how good it is, the chance of ranking it is very low. The more content published on a website, the more trust and relevancy Google will assign to that website.

"Topical authority" is an important Google ranking factor, showing authority over a whole niche rather than over a single idea.

For instance, a website dedicated to SEO is expected to rank higher for SEO topics than a website that covers everything and has no topical authority on the subject.

No Social media accounts set up and pointing to your website.

Google wants to see websites involved with relevant communities on social media. Having active social media accounts and constantly interacting with people there is a high-quality ranking signal.

Your content is not shared on social media 


People love sharing good content. If no one is sharing your content, it may signal to Google that you are not creating something good.

Some niches may not be very suitable for active social media sharing. However, if that is true for your site, it applies to all the other sites in the same niche as well.

Having guest post links from a website that is apparently selling links

Let's say you visit Fiverr and pay a seller $50 for a niche-relevant, high-quality contextual guest post backlink. I believe you can buy guest post links there for anywhere from $5 to $500. You send your post to the seller to publish. It includes a link to your website, an image, and other outbound links to authority resources to make it look as natural as possible.

Three months later, some other guy wants to buy a guest post from the same website and has a dispute with the seller. He reports the domain for selling guest post links, with the whole conversation attached. Google can then be expected to slap all the sites that this website links out to.

Of course, this scenario will not play out every time. But trust me, some of your competitors would really love it if such an opportunity presented itself.

Guest post links are considered "gray hat link building" by many people. Because you usually get only one backlink from a single source, it doesn't raise many red flags for the algorithms.

However, buying links is strictly against Google's terms of service. Buy links if you want to spend your time looking over your shoulder. Alternatively, build content so good that other websites can't help but link to it.

Selling Links

Selling links to other websites is a clear violation of Google's Webmaster Guidelines. It may lead to Google penalties that are not reversible.

Your website or brand is not mentioned anywhere


Having a website or brand that is not mentioned anywhere on the web is a sign of low-quality work.

Everyone on the web mentions what great content marketers and SEO professionals Rand Fishkin, Neil Patel, Brian Dean, Ryan Stewart, Barry Schwartz, Nathan Gotch, Matthew Woodward, Jim Harmer, and Rick Kesler are.

Google immensely values the websites and businesses that are mentioned everywhere.

Your competitor is doing shady stuff

"The supreme art of war is to subdue the enemy without fighting."

The Art of War by Sun Tzu

I like this quote, although I am against the value system it defends. There are many people who will do whatever it takes to place themselves in a better position than their competitors.

There is a chance your competitor has a secret agenda against you. There are dedicated gigs on outsourcing platforms that use "black hat techniques" to devalue a specific website or business.

Not doing good keyword research

The most essential step of a successful keyword research process is knowing the keyword intent. No campaign that ignores it can achieve long-term profitability.

Keyword research is very important to achieve high Google rankings.

If you want to rank for blue shoes, you need to mention blue shoes somewhere in your content. We cannot expect to rank for "blue shoes" if our content is only about shoes.

Most content written without considering search engines has keyword deficiency problems.

To rank for a certain term, our content should have sufficient overall keyword consistency.

Finding keywords that most competitors haven't discovered and that have low competition is key to acquiring a competitive advantage in any niche.

Not understanding seasonal trends

If your website is focused on surfing, naturally you will have less traffic during the winter months. Also, if your website is focused on snowboarding you may have less traffic during the summer months.

Your content is not structured in a way that responds to the searcher's query

Google's job is to serve what the searcher is looking for. If your content is not structured in a way that responds to searchers' queries, Google will prioritize other content that performs better than yours.

You don't understand that ranking on Google takes time.

Ranking on Google requires time. Google needs time to evaluate your content against other content. It also wants to trust your website before giving it any significant position.

Many people call this waiting period the Google sandbox. We don't know whether it really exists or it is just another urban legend. However, there is a consensus among internet marketers that ranking a new website on Google takes at least a couple of months.

You are not using the necessary tools or technologies to rank your website

Ranking on Google means competing with others for a better position for your website. Many internet marketers today use advanced SEO tools and software to help them analyze keywords, backlinks, etc.

Of course, having them is not a must. But they make your SEO much easier and give you an important edge over your competitors.

You are not educated enough or have misinformation about SEO

If you want to rank a website, you need the right education. Many people today still take an approach that was good in 2015, when PBNs were ranking websites almost overnight.

Never read Google's SEO Starter Guide

You want to rank on Google, but you have never invested any time in reading the "Search Engine Optimization (SEO) Starter Guide"? I bet many people are not even aware this guide exists.

We want to clearly understand what Google wants from us in order to form a smart SEO strategy that works.

Not following authority websites to learn their approaches

"The only thing I know is that I know nothing, and I am not quite sure that I know that."

Socrates

If you think you know everything about SEO and how to rank websites on Google, good luck. Everyone in the SEO community has unique strategies that work great for them. Every individual brings a unique point of view to the same topic.

Anyone planning to be consistently successful in this market should be humble and open to learning from others. I like reading the blogs of the authorities and sharpening my approach and the SEO strategies I plan to use.

Content without semantic keywords relevant to your niche

Semantic keywords are keywords that are not your direct target but are closely related to what you want to rank for. They give your content more meaning by creating a "word cloud" and topical relevancy.

We want our content optimized for the true intent rather than just answering a single query. By doing so, we bring more depth to our content and, in return, more value. Needless to say, it also creates more opportunities to rank for a variety of keywords. You can read my post on how to use LSI keywords to rank your website.

Not mentioning authorities of your niche in your content

There are good reasons why you want to mention other people in your niche.

You create topical relevance, helping the search engines understand what your site is about. If you mention Satoshi Nakamoto somewhere in your content, Google will assume that your content may be related to Bitcoin.

Unnatural linking techniques, nofollow / dofollow outbound links ratio

There was a myth that was debunked a long time ago: if you provide a "dofollow backlink" to another website, the site receiving the backlink will suck the link juice out of your site.

That would mean you lose the authority of your website while you increase the authority of the other site.

In my opinion, this approach is only partially true.

While you provide value for the website you are linking to, your own website doesn't lose value, or so-called "link juice", by doing so.

There is another myth that has also been debunked but that I need to mention: linking to other websites with only "nofollow backlinks" to protect the authority of our own website.

Google has reviewed billions of websites.

Its advanced algorithm knows what percentage of your outbound links would naturally be dofollow versus nofollow.

If almost all of your outbound links are dofollow, or almost all are nofollow, Google may consider it unnatural.

Google recently announced new nofollow link attributes. As a best practice, I recommend using them wherever it is appropriate.
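
For context, the attributes Google announced in 2019 are rel="sponsored" for paid links and rel="ugc" for user-generated content, alongside the classic rel="nofollow". A sketch (example.com is a placeholder):

```html
<!-- Normal editorial link (followed by default) -->
<a href="https://example.com/">a trusted resource</a>

<!-- Paid or sponsored placement -->
<a href="https://example.com/" rel="sponsored">a sponsored product</a>

<!-- Link inside user-generated content, e.g. blog comments -->
<a href="https://example.com/" rel="ugc">a commenter's site</a>

<!-- Link you don't want to vouch for at all -->
<a href="https://example.com/" rel="nofollow">an unvetted site</a>
```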

Having no traffic from the other search engines (Bing, Yandex or Baidu)

This one is completely my speculation, and I wouldn’t expect everyone to agree with my opinion.

We know Google can track which websites you get traffic from. If you receive very little traffic from Bing, Yandex, or Baidu, this may signal to Google that something is not right with your content.

Why would Google, the biggest search engine ever, need the opinion of another search engine?

Every algorithm, including Google's, has flaws. It is always good to have a second pair of eyes on any job. I strongly suspect Google checks how your website is indexed in comparison to Bing's index.

This would work as a filter to eliminate people messing with Google’s algorithm and taking advantage of it.

Not building an email list

Websites that build relationships with their readers are considered high quality. Google definitely favors visitors interacting with the authors and the business overall.

An email list is a good benchmark for Google to decide whether a website intends to build relationships with its readers.

Content blocking ads and pop-ups

Google definitely hates content-blocking ads, above-the-fold banners, and other advertising techniques that lower the quality of the user experience.

In 2014, John Mueller said that pop-ups can negatively affect your Google rankings.

Link fonts that are similar to your content or background

There was a black hat technique popular amongst internet marketers. In the early 2010s, people made link colors the same as their body text to avoid showing the links to visitors. It was also very common to use a text color that was the same as, or similar to, the background color of the page.

Similarly, another common black hat technique was hiding links behind a 1px by 1px image or incredibly small text.

Google strictly penalizes this type of action. If you are trying to fool Google with such techniques, the odds are very high that you will eventually get caught.

I mention it because I know there are still people doing it without even realizing it. Make sure your content, links, and background colors are set sensibly.

Your content is not readable and cannot pass the "Flesch Reading Ease" test

Readability is an important Google ranking factor. If you are using the Yoast SEO plugin, you can automatically track the readability score of your posts.

The "Flesch Reading Ease" test is a reliable way to score the readability of English-language content. Make sure your content achieves at least 60 points on this test.
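
For reference, the standard Flesch Reading Ease formula (the one readability tools such as Yoast implement) is:

$$\text{FRE} = 206.835 - 1.015\left(\frac{\text{total words}}{\text{total sentences}}\right) - 84.6\left(\frac{\text{total syllables}}{\text{total words}}\right)$$

Shorter sentences and shorter words push the score up; a score of 60-70 corresponds roughly to plain English that 13- to 15-year-olds can read comfortably.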

Your text is too small to read

Google tells you if your font is too small to read.

Your text should be easily readable on different sizes of displays both on mobile and desktop.

The screenshot above is from my Google Search Console. You will notice Google thought the default font that came with my theme was not readable for humans. I later fixed this issue with the Easy Google Fonts plugin. If you are using WordPress, it is a must-have plugin.

Not using correct grammar & having many writing & spelling mistakes

Always make sure you are using correct grammar and spelling. You can check your article while writing in Google Docs, or alternatively use Grammarly, to make sure your content doesn't contain mistakes.

Having duplicate content

📝 Duplicate Content (3rd Party)

Using plagiarized, duplicate content will devalue our content overall. It may also cause copyright problems with the original owner of the content.

Google loves fresh, unique content that has not been published anywhere else before: content that is unique not only in its words but also in its ideas. All in all, duplicate content is not going to perform well on any search engine.

📝 Duplicate Content (Internal)

This one is really tricky. If you are using WordPress, the chances are high that you have some type of duplicate content. Category and tag pages will automatically generate duplicate content on your site unless you pay special attention to them.

Many people are not aware that they have duplicate content on their websites. Having both the www and non-www versions of the site accessible may trick Google into thinking you have duplicate content.
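
One common safeguard, assuming https://www.example.com is the version you prefer (the domain is a placeholder), is a canonical tag in the <head> of every page, ideally combined with a server-side 301 redirect from the non-www version:

```html
<!-- Every duplicate variant of the page points to the single preferred URL -->
<link rel="canonical" href="https://www.example.com/sample-page/">
```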

Not using an optimum URL structure

Your website's URL structure is an important Google ranking factor. If you are using WordPress, go to "Settings" in your dashboard and find "Permalinks". You will see the following options for setting up your permalinks.

Post name

Set your permalinks to "Post name" before posting any content. There is important merit in doing so for your Google rankings. Let me explain why:

Plain

The "Plain" option is not understandable by search engines. It also doesn't make any sense to visitors.

Day and name

The "Day and name" option is slightly better. However, it timestamps the posts. If your content gets old and you want to update it, you cannot change the URL without causing a broken page on your website.

Month and name

"Month and name" is very similar to the "Day and name" option we just mentioned. We don't want our URLs to contain any time data.

Numeric

"Numeric" works, but it gives no boost to our SEO efforts. We cannot include our keywords in a numeric permalink structure. The other minus is that no one can tell what a post is about by looking at the URL, and people naturally don't trust links when they can't tell what they are about.

Custom Structure

"Custom Structure" lets us use a combination of timestamps (year, month, day, etc.) with keywords and tags. However, it is quite complicated to use. Unless you are planning a specific website architecture or silo system, it should not be used.

Not using a high-quality CDN to serve your images

There are many reasons you should be using a high-quality CDN (Content Delivery Network) service. I am not going to cover all of them in this post.

If you cache or temporarily store your content on a CDN, your content is delivered to your visitors from the edge, faster than if it were delivered from the origin.

A CDN lets our site load much faster on both desktop and mobile devices. A CDN is also especially important during times when our website gets a larger amount of traffic. It simply enhances our site's performance.

Your main navigation does not have your keywords

The main navigation menu is a unique opportunity that you should definitely take advantage of. Instead of using generic labels in your main navigation, try to include keywords that you want to rank for on Google. Keywords in the main navigation menu are repeated on every page of your website.
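
A rough sketch of a keyword-focused main navigation (the labels and URLs are placeholders, not a prescription):

```html
<nav>
  <ul>
    <li><a href="/seo-guide/">SEO Guide</a></li>
    <li><a href="/keyword-research/">Keyword Research</a></li>
    <li><a href="/link-building/">Link Building</a></li>
  </ul>
</nav>
```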

Your content may be stolen by someone else without you knowing it.

Once you post an article, you need to go to Google Search Console and submit the URL for indexing right away. Many people posting content online today skip this and expect Google to find their post on its own. Although that may seem fine on the surface, it carries significant risks.

There are software bots constantly scraping the web to find content that is not yet indexed by Google. If someone finds your content and publishes it before Google indexes yours, it is not your content anymore in Google's eyes.

You would then need to report it to Google and convince them that you are the original owner of that piece of content. Stay on the safe side: submit your URL to Google as soon as you publish it.

You don't have meta titles & meta descriptions, or they are not optimized properly

The whole process of getting traffic from Google works like this:

Google lists your website in its SERP. The user chooses your website from among the other alternatives (your competitors) and clicks through to access it.

The parts of your website shown on the SERP are the "meta title" and the "meta description".

If you don't optimize your meta titles, people will have no reason to click on your website. A low CTR (click-through rate) is an important signal that tells Google your website may not be related to what the user is searching for. We want our meta titles and meta descriptions to entice users to click on them.
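
A hedged sketch of what that looks like in a page's <head>, using this article's topic as an example (the exact wording is illustrative, not a template):

```html
<head>
  <!-- The title and description Google usually shows in the SERP snippet -->
  <title>Why Your Website Is Not Ranking on Google (and How to Fix It)</title>
  <meta name="description" content="A walkthrough of the most common Google ranking problems, from thin content to spammy backlinks, and what to do about each one.">
</head>
```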

I don't mean we should write clickbait titles that don't represent our content. That causes other problems, such as pogo-sticking. Pogo-sticking happens when a visitor performs a search, clicks through to a website, and quickly clicks back to the SERP.


This tells search engines that there is immediate dissatisfaction with the website that was visited. Pogo-sticking always negatively affects search engine rankings.

Your website has poor customization

Google wants to see that your website is uniquely valuable. Not only your content and images, but your website overall should be customized. As I mentioned earlier, you should definitely use either a custom or a premium WordPress theme to separate yourself from the crowd.

Other than that, try to change colors and font sizes to make your website uniquely different from all the others.

In your content, use underline, bold, italic, and color-marked text as much as is appropriate.

Not having enough unique multimedia

As mentioned earlier, Google prioritizes good user experience over everything else.

Part of providing a good user experience is using a good amount of multimedia to engage visitors with our content. The more engaging multimedia we use in our content, the more time visitors will spend on our website.

In return, it will improve our Google rankings. I recommend reading this post to learn the optimum number of images to use in our content.

If you have the budget for it, take advantage of infographics. Infographics are a great way of producing content with the added benefit of high shareability. The more people use and share our infographic, the more backlinks we build to our website.

Another great type of multimedia is video. YouTube is owned by Google, and video content shared on YouTube and embedded on our site definitely gets a boost.

Consider using audio podcasts as well to diversify the way you use multimedia. In my opinion, Google definitely pays attention to how many different ways we present our content.

When you share video content on your website, do not host the video files on your own page. That may slow down the overall speed of your page. Upload your videos to YouTube, then embed them on your page.
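
A minimal sketch of a YouTube embed (VIDEO_ID is a placeholder for the video's identifier):

```html
<!-- The video streams from YouTube's servers instead of your own -->
<iframe width="560" height="315"
        src="https://www.youtube.com/embed/VIDEO_ID"
        title="YouTube video player"
        allowfullscreen></iframe>
```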

Your content doesn’t let people comment on it

High-quality content is content that is engaging for users. Letting people comment on our posts is a great way to create user engagement.

If you are using WordPress, there is a default "Leave a comment" section under each post. You can let people comment there as long as you can moderate the comments well enough.

There are multiple benefits to doing so. First, it will increase engagement with your post. Also, the more people comment on your post, the more keywords your post is likely to rank for.

Never allow auto-approved blog comments; otherwise, spammers will create millions of spam comments on your site.

You can learn about the SEO benefits of allowing blog comments on your site in this post.

Not Having Enough Backlinks

Backlinks are still a very important factor in determining the authority of a website. They are the web's form of endorsement between websites.

Here is what Matt Cutts said:

"It turns out that backlinks, even though there's some noise and certainly a lot of spam, for the most part are still a really really big win in terms of quality of search results."

In the past, Google experimented with lowering the effect of backlinks when ranking websites. The results were terrible.

🔗 Type Of Backlinks

Not all backlinks are created equal. Ideally, we should aim to earn backlinks organically. The way to do that is to create content worth linking to from other sites.

This is not practically possible every time. Hence, blogs use outreach to guest post on other blogs. This is a kind of "gray hat link building strategy" that may be tolerated by Google.

There are also other ways of building backlinks, but many of them are considered "black hat link building strategies". There are outsourcing platforms such as Fiverr where people will blast thousands of backlinks using software. This is extremely dangerous and should be avoided at all costs.

🔗 Link Building Velocity

Link building velocity is an important Google ranking factor. It is very unusual for a new website to have hundreds of backlinks in the first couple of months of its life. Google knows how fast websites naturally build links.

Building links faster than is naturally possible may raise red flags in Google's algorithm. Manual actions and manual reviews exist to examine websites suspected of unnatural link building behavior.

🔗 Backlinks that have no backlinks

Having backlinks from webpages that have no backlinks of their own is questionable. The internet is built on networking. If no one else links to the page that links to your page, you have a backlink from an orphan page.

It may be okay if you have a few of them. If you have too many, it is not good for your rankings. Hence, many SEO professionals use tiered link building strategies, building backlinks to the pages they already have backlinks from.

I am certainly against artificial link building techniques that try to take advantage of Google's algorithm.

You don’t have schema markup code

Schema.org is a joint project of the major search engines to understand information more efficiently. It is like giving the search engines a better map of how they should interpret our content.

Usually, a piece of schema markup code is placed on a page to structure its information.

But how can not using structured data be a negative ranking factor?

If your competitors are taking advantage of structured data and you don’t, then not using it works against you.

Here's an example: with the right schema markup, a Google search for "New York bus schedule" can return a rich result with the schedule details pulled directly from the structured data.
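
As a hedged sketch of the general idea, here is what a simple schema.org Article block looks like in JSON-LD form, placed in the page's <head> (the author, date, and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Why Your Website Is Not Ranking on Google",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-01-15",
  "image": "https://example.com/images/cover.jpg"
}
</script>
```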

You don't have fresh content

You may have a post with the title "Who is likely to be the next US president?" because you were thinking Bill Clinton might be the next president.

It is time to update that old post you wrote in 1992. Google loves fresh content that is relevant today. It is recommended to update your content at least once a year to show Google it is fresh and still very relevant.

You are in a very competitive niche

Mike Tyson delivered one of the most brilliant quotes ever:

“Everybody has a plan until they get punched in the mouth.”

Think about the following for a second!

If your competitor is better than you, you don't stand much chance of outranking them, at least until you become better and/or smarter than they are.

The reason for your poor rankings may not only depend on what you do or don't do. You may be trying to compete in a very competitive market that requires a lot more effort.

In my experience, searches that return 50 million or fewer results still leave a lot of room for ranking.

Improper usage of Hyphen / Dash / Underscore In Domain Names

Let's have a look at what hyphens, dashes, and underscores actually are before deciding which option works best when choosing an SEO-friendly domain name.

A hyphen (-) is a punctuation mark used to join two or more words. For instance, "up-to-date" and "one-third" are hyphenated. Hyphens are okay as long as you don't use more than one in your domain name.

A dash (–) is longer than a hyphen and is used to indicate a range of values or a pause. It is also okay if used in the same way as a hyphen.

An underscore ( _ ), also called an understrike, is a character that originated on typewriter keyboards as a simple way to underline words, phrases, or numbers. In a domain or URL, an underscore joins the surrounding keywords into one exact-match term, which may be useful if implemented strategically.

We want our domain name to be friendly to search engines. It is best not to use any of these characters in our domain at all.

There was a time when using an "exact match domain" (EMD) brought a big ranking boost. Everyone was using very long, keyword-stuffed domains separated with hyphens.

Those times have passed. Google devalued the ranking boost that came with keywords in domain names.

It is still suggested to have your keyword in the domain name. However, having very long domains separated with hyphens brings no added benefit. Google will no longer prioritize us simply because we chose the domain name "best-red-cars-run-with-water".

Improper usage of Dash / Underscore in URL Structure

It is okay to use dashes in our URLs as long as we don't go overboard. Here is what Matt Cutts said:

"If you are going to make a site and you're starting fresh, so you've got a blank slate to work with, I would probably go ahead and go with dashes. I would continue to go with dashes at least for the foreseeable future."

"Nobody's slated to be working on that so at least for the time being it's better to use the dash."

Long Domain Registration Term

Although it is highly speculative, domains that have a longer time until expiration may get a small boost in Google rankings.

Google hates spam and fights against it.

Websites with a longer registration period may signal trust to Google. Planning to be around for a long time is the opposite of what spammers and black hat SEO websites do.

Not having a reputable Webhosting provider

It is very important to host our websites with a reputable web hosting company. This is related to the trust we want to build with Google.

There are many $1/month web hosting companies that are commonly used to host PBN sites, since they provide a unique IP for each hosting plan. Google knows all of these businesses.

Make sure you don't cut corners when selecting a hosting provider. Don't choose the cheapest web hosting company you can find.

Not having your keywords in your categories and tags section

I personally don't use tags when I publish a post. However, I know there are still many people who use them as a good SEO practice. I like to name my categories with the keywords I want to rank for on Google. It is something we should definitely take advantage of. Luckily, WordPress makes it very easy without requiring any coding experience.

You didn't submit XML sitemaps


XML sitemaps are very important for SEO. A sitemap is like a blueprint of our website that tells search engines how our site is structured.

Submitting an XML sitemap makes it very easy for Google to find the content we publish. It also makes the crawling job easier for search engines.

It is recommended to submit a sitemap to Google as soon as you publish your first content.

If you are using WordPress, you can generate a sitemap with plugins such as Yoast or Google XML Sitemaps and then submit it in Search Console.
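
For reference, a sitemap is just an XML file like the sketch below (the URLs and dates are placeholders); the plugins generate and update it for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/why-your-website-is-not-ranking-on-google/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
    <lastmod>2019-11-02</lastmod>
  </url>
</urlset>
```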

Conflicting plugins trying to do the same thing

Yoast & Google XML Sitemaps can do the same thing.

This one is especially important for WordPress users. If you are using plugins that do the same or a similar job, you may have problems. For instance, Yoast and "All in One SEO Pack" are two popular SEO plugins.

When used together, they may create technical problems. The same principle applies to any type of WordPress plugin. Similarly, don't run two caching plugins at the same time; your website is likely to crash if two or more plugins conflict with each other.

You are having too many plugins

It is always best to have the minimum number of plugins functionally possible. I feel uncomfortable if I have more than 15 plugins on my WordPress dashboard.

Each plugin you use brings additional PHP, JavaScript, and other files with it. This decreases the performance of your website and creates security risks.

Your text/code ratio is too low

The text/code ratio is not a direct Google ranking factor. However, it may hurt rankings if it is overlooked.

As a good SEO practice, a text/code ratio anywhere from 25 to 70 percent is fine.

Websites with a higher text/code ratio are more readable and user-friendly. Their page load speed is also better, which is another positive factor for SEO. Finally, having less code on our webpages improves how efficiently they can be crawled.

You don't use styling: fonts, font sizes, underline, bold, etc.

Bold, strong, italic, and underline tags are important for good SEO. They signal the semantic importance of parts of the text to search engines.

They also help us by adding a unique touch to our content.
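
A small sketch of the HTML tags behind that emphasis (the sentence is just an illustration):

```html
<p>Readability is an <strong>important</strong> ranking factor; scoring <em>at least 60</em> on the
<u>Flesch Reading Ease</u> test is a common target.</p>
```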

Not benefiting from bullet lists, or overusing them

It is always a good idea to use bullet lists when we have several items to show in order. The benefits bullet lists provide:

  • Better organization of our data
  • Easier-to-consume content for the reader
  • Better presentation of data for the search engines.

Although it is not a direct Google ranking factor, I believe there are SEO benefits to using bullet lists. Having at least 2-3 bullet lists per 1,000 words of content looks optimal for a good user experience.

Don't overuse bullet lists, or they start to look unnatural and spammy.

You didn’t define site identity, the tagline in WordPress

Tell search engines what your WordPress site is about.

Take advantage of the "Site Title" and "Tagline" fields in WordPress. Your site title and tagline are displayed on every page you create. We should definitely use compelling ones with our target keywords included.

You don’t use a caching plugin

This one indirectly improves your website's SEO. Caching is the process of creating static versions of your pages and serving them to your visitors. Static pages usually render quicker in browsers, so you will have a faster website.

There are very good WordPress caching plugins that help us optimize our website with only a few clicks.

Your page size is too big

Webpages with large file sizes have poor load times. If you don't optimize your multimedia and you use an excessive amount of code, you will have a slower website.

Uninteresting or Unoriginal Content

Google has literally discussed uninteresting content that brings no value. Content that is very similar to previously published content is useless. Your content should not only be unique in its wording to pass Copyscape; it should also bring ideas and opinions that have not been discussed before.

Google assigns a "novelty score" to each piece of content published and uses this score as an input to its algorithm.

I would recommend reading this post to improve your content immediately.

Sitewide Average Novelty Score

Google goes one step further with the novelty score: they assign an "average novelty score" that applies to your content sitewide.

Positive Sentiment in Comments

Websites that have many negative comments or very little positive sentiment on the web may be treated as lower quality.

Not using / improperly using rel="hreflang" tags

Hreflang tags are used by sites that have similar content in multiple languages. If your content is "English only", you can skip this one.
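
A minimal sketch of hreflang annotations for an English and a German version of the same page (example.com and the paths are placeholders):

```html
<!-- Placed in the <head> of every language version, including a fallback for everyone else -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```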

Exceedingly Long URLs

Matt Cutts said that using more than about five words in a URL lowers the "keyword in URL" boost. Search engines may also consider very long URLs spammy, even if no word is repeated more than once in the URL.

Long Internal Link Anchors

Longer anchor text can be perfectly natural. However, very long anchor text may be treated as keyword stuffing. In my personal opinion, we should avoid anchor texts longer than 10 words.

High Ratio of Links to Text

Webpages with very little content but many links are considered low-quality pages. People built such pages in the past purely for linking (internal or external). Although we have no perfect data on the thresholds, we should aim for a text/link ratio that is as natural as possible.

JavaScript-Hidden Content

Hiding text inside JavaScript is considered a black hat technique and is not recommended. Although search engines can crawl this text, it is still not good practice. There is a chance of receiving a cloaking penalty if you do so.

Copyright Violation

Content published in a manner that violates the "Digital Millennium Copyright Act" (DMCA) or equivalent laws outside of the U.S. can lead to severe penalties. Google tries to detect unattributed sources and unlicensed content automatically, but people can also report infringement directly to Google, which can result in a manual action against the website.

Doorway Pages

Doorway pages are pages built with the sole goal of capturing search traffic. These pages don't provide any value to visitors. An example would be creating a website to list chiropractors in the US without having an actual business or providing any useful information to users.

Google categorizes this as "spamdexing", short for spamming the search index.

Using Clickbait Meta Titles

Using catchy titles to get people to click on our website is good SEO practice. However, using clickbait titles that don't represent our content is against Google's guidelines.

It is also bad for our SEO. If people cannot find what we promised them on our website, they will click the back button and return to the search results page. Google will take this as a signal of user dissatisfaction and readjust our ranking position.

Overuse Of Bold, Italic, or Other Emphasis

It is very good SEO practice to use bold, italic, and other emphasis that adds personality to our posts. However, used excessively, it will be taken as "spammy activity".

Broken Internal Links

Broken internal links reduce the overall quality score of our website. They also make it more difficult for search engines to crawl and navigate the site. Make sure you don't have any broken links on your website.

My favorite free tool to find broken links is Screaming Frog SEO Spider.

Screaming Frog crawls your website to show you on-site problems.

It finds almost all on-site problems you can possibly have. It is also a very popular tool used to optimize page headings, meta titles, meta descriptions, etc.

Redirected Internal Links

Matt Cutts suggested that redirects are subject to "PageRank decay", meaning some page authority is sacrificed every time one page redirects to another. In 2016, Gary Illyes posted a tweet saying this is no longer an issue to worry about.

Dynamic Content

Dynamic content is used for different purposes: a webpage may display different content each time it is loaded, or rotate what it shows depending on the time of day or the location of the user accessing it.

Dynamic content can really challenge search engine spiders trying to understand our content. Using "noindex" on such pages and minimizing dynamic content, particularly where Google can access it, is considered to result in a better user experience and better rankings.

Page “NoIndex” Tag

Pages with the "noindex" meta tag are not indexed and will not be shown in search results. Sometimes website owners are not even aware that they have the "noindex" value activated. We should make sure this is not the case for any webpage we want to rank on Google.

However, it is good practice to keep this tag activated on pages that don't provide much benefit to users. For instance, tag and category pages generally don't provide a good user experience, so it is recommended to keep them tagged as noindex.
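
The tag itself is a one-liner in the page's <head>; "follow" keeps the page's links crawlable even though the page stays out of the index:

```html
<meta name="robots" content="noindex, follow">
```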

Using fake WhoIs information to register your domain

Similar to private WhoIs data, Google representatives have made it clear that they are aware of this common trick and treat it as a problem. If for no other reason than that it violates ICANN guidelines, and could allow a domain hijacker to steal your domain through a dispute without you getting a say, don't use fake information to register a domain.

Penalized Registrant

Although this is very speculative, you yourself may be the reason for your website's poor Google ranking.

If you have earlier spammed, or used black hat techniques across several sites, Google may have blacklisted your name.
