Does it feel like you've done everything right, yet your website is still nowhere to be seen in Google's search results? The bad news: several things could be preventing you from showing up in Google. The good news: many of them are easy to fix.
Below, we explore nine possible reasons why you're not showing up in Google and how to fix each issue.
Before we start…
It's important to note that when you type something into Google hoping to see your website in the search results, you're not actually looking for your website.
You are looking for a page on your website.
That's an important distinction.
If Google doesn't know that the page you're trying to rank exists, or thinks it doesn't deserve to rank, then it won't show up anywhere that matters in the search results.
For that reason, to show up in Google, three things need to be true:
- Google knows that your website exists and can find and access all your important pages.
- You have a page that's a relevant result for the keyword you want to show up for.
- You've demonstrated to Google that your page is worthy of ranking for your target search query—more so than any other page from another website.
Most of the issues we tackle below relate to one of these three things.
Let's start with the simple stuff…
1. Your website is too new
It takes time for Google to discover new websites and web pages. If you only launched your site this morning, then the most straightforward explanation is that Google simply hasn't found it yet.
To check whether Google knows your website exists, run a search for:
site:yourwebsite.com
If there is at least one result, then Google knows about your website.
If there are no results, then it doesn't.
But even if Google knows about your website, it might not know about the specific page you're trying to rank. Check that it does by searching for:
site:yourwebsite.com/the-page-you-want-to-rank/
There should be one result.
If Google doesn't yet know about your website or the page you want to rank, submit a sitemap in Google Search Console:
Search Console > Sitemaps > Enter sitemap URL > Submit
A sitemap tells Google which pages are important on your site and where to find them.
It can also speed up the discovery process.
Not sure whether you have a sitemap? Go to yourwebsite.com/sitemap.xml. If there's nothing there, go to yourwebsite.com/robots.txt, as this file often lists the sitemap URL.
Still nothing? You might not have a sitemap and will need to create one.
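If you do end up creating one by hand, a sitemap is just an XML file listing the URLs you want Google to find. A minimal sketch (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourwebsite.com/a-page-you-want-to-rank/</loc>
  </url>
</urlset>
```

Most CMSs and SEO plugins generate this file for you, so writing one manually is usually a last resort.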
2. You're blocking search engines from indexing your pages
If you tell Google not to show certain pages in the search results, then it won't.
You do that with a “noindex” meta tag, which is a piece of HTML code that looks like this:
<meta name="robots" content="noindex"/>
Pages with that code won't be indexed, even if you created a sitemap and submitted it in Google Search Console.
You probably don't recall ever adding that code to any of your pages, but that doesn't mean it isn't there.
For example, WordPress adds it to every page if you check the wrong box (“Discourage search engines from indexing this site”) when setting up your site.
It's also something that a lot of web developers use to prevent Google from indexing a site during development, then forget to remove before publishing.
If Google has already crawled the pages in your sitemap, it'll tell you about any “noindexed” ones in the “Coverage” report in Google Search Console.
Just look for “Submitted URL marked ‘noindex’” errors.
If you recently submitted your sitemap to Google and it hasn't crawled the pages yet, run a crawl in Connekt Teacher Site Audit. This checks every page on your site for 100+ potential SEO issues, including the presence of “noindex” tags.
Remove “noindex” tags from any pages that shouldn't have them.
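If you'd rather spot-check a handful of pages yourself, the tag is also easy to detect programmatically. Here's a minimal sketch using Python's standard-library HTML parser; `has_noindex` is an illustrative helper name, not part of any tool mentioned above:

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Flags pages containing <meta name="robots" content="...noindex...">."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True


def has_noindex(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex


print(has_noindex('<meta name="robots" content="noindex"/>'))        # True
print(has_noindex('<meta name="robots" content="index, follow"/>'))  # False
```

Note that a noindex instruction can also be sent as an X-Robots-Tag HTTP header rather than a meta tag, which a snippet like this wouldn't catch.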
3. You're blocking search engines from crawling your pages
Most websites have something called a robots.txt file. This tells search engines where they can and can't go on your website.
Google can't crawl URLs blocked in your robots.txt file, which usually results in them not showing up in search results.
If you've submitted your sitemap via Google Search Console, it should alert you to issues related to this. Go to the “Coverage” report and look for “Submitted URL blocked by robots.txt” errors.
Once again, that only works if Google has already attempted to crawl the URLs in your sitemap. If you only recently submitted it, that may not yet be the case.
If you prefer not to wait, you can check manually. Just head to yourdomain.com/robots.txt.
You should see a plain text file made up of user-agent lines and directives.
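For instance, a harmless robots.txt for a typical site might look something like this (a generic sketch; the paths and sitemap URL are placeholders):

```txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourwebsite.com/sitemap.xml
```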
What you don't want to see here is this directive:
Disallow: /
… under either of these user-agents:
User-agent: *
User-agent: Googlebot
Why? Because it blocks Google from crawling every page on your site.
You also don't want to see a “Disallow” directive blocking any important content.
For example, a rule like Disallow: /blog/ would prevent Google from crawling all the posts on our blog.
Remove any directives blocking content that you want to show up on Google.
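If you want to check programmatically whether a given URL is blocked, Python's standard library ships a robots.txt parser for exactly this. A minimal sketch; the domain and rules below are placeholders:

```python
from urllib import robotparser

# An inline example file so the snippet runs offline. In practice you'd
# fetch the live file (see note below).
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /blog/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() reports whether a given crawler may access a given URL.
print(parser.can_fetch("Googlebot", "https://yourdomain.com/blog/my-post/"))  # False (blocked)
print(parser.can_fetch("Googlebot", "https://yourdomain.com/about/"))         # True (allowed)
```

To check your real file, call `parser.set_url("https://yourdomain.com/robots.txt")` followed by `parser.read()` instead of parsing an inline string.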
Robots.txt files can be complicated, and they're easy to mess up. If you suspect yours may be preventing pages from showing up on Google and you don't know much about this file, hire an expert to fix it.