There are a number of possible methods to redirect pages, and the right choice depends heavily on your usage and what you're trying to accomplish. For example, there are temporary redirects and permanent redirects. Each of these has its own use case and can be implemented through a wide array of technologies.
Types of methods to redirect a page
301 – Moved permanently, recommended
302 – Moved temporarily, should be used sparingly
Meta refresh
What’s a redirect?
A redirect is a method of sending users and/or search engines to a different URL from the one they originally requested.
301 redirects pass 90-99% of the page's ranking power (SEO "juice") to the destination page. The number 301 refers to the HTTP status code for this type of redirect. In most cases, a 301 is the redirect you want to implement.
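As a concrete illustration, here's a minimal sketch of a server-side 301 in Python, assuming the Flask framework and illustrative URLs (swap code=301 for code=302 to make the redirect temporary):

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # 301 tells browsers and search engines the move is permanent.
    # Use code=302 instead if the move is only temporary.
    return redirect("https://www.example.com/new-page", code=301)

if __name__ == "__main__":
    app.run()
```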
302 redirects are temporary redirects. According to some Google employees, a 302 redirect may in some cases be treated as a 301: if you leave a 302 in place for a long time, it may become a 301 in Google's eyes.
Meta refresh redirects are a type of redirect executed in the browser when a user hits your page, rather than on the server. They pass some link juice, but they aren't recommended because of their slow execution and the SEO value lost along the way.
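For comparison, here's a sketch of what a meta refresh looks like, again using Flask with illustrative URLs. The redirect happens in the visitor's browser after the page loads, which is exactly why it's slower than a server-side 301 or 302:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/moved")
def moved():
    # "content=0" means redirect after 0 seconds; the browser still has to
    # download and parse this page first, unlike a server-side redirect.
    return (
        '<html><head>'
        '<meta http-equiv="refresh" '
        'content="0; url=https://www.example.com/new-page">'
        '</head><body>This page has moved.</body></html>'
    )
```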
SEO best practices when redirecting one page to another
It's common to redirect one URL to another, but it's critical that you observe best practices, otherwise you risk losing SEO value. If your goal is to send both users and search engines to the new page, you should do a 301 redirect. Remember that when you set up a 301 redirect, it takes time for search engines to discover it and credit the new page; this timeline can stretch out if search engines rarely visit the page. Options like a 302 will not pass rankings or search engine credit to the new page.
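Once a redirect is live, it's worth verifying that it really returns a 301 and points where you intend. Here's a quick sketch using Python's requests library, with an illustrative URL:

```python
import requests

# allow_redirects=False lets us inspect the redirect itself
# instead of following it to the destination.
resp = requests.get("https://www.example.com/old-page", allow_redirects=False)
print(resp.status_code)              # expect 301 for a permanent redirect
print(resp.headers.get("Location"))  # the destination page
```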
If your goal is to do A/B testing and you won't be redirecting people to the destination page or pages permanently, then you should do a 302 redirect.
What to do when switching domains
If you're switching domain names, you should be considering a 301 redirect in most cases (assuming you aren't under a penalty). If you do a 301 redirect, you should redirect each page to its new counterpart. According to some Google reps, if you 301 redirect all the pages to the homepage, that could be treated as a soft 404 error on all of them, resulting in no SEO juice being passed to the new domain.
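Here's a sketch of a page-to-counterpart redirect in Flask, with illustrative domains. The key point is that the path (and query string) is preserved, so each old URL maps to its specific new counterpart rather than the homepage:

```python
from flask import Flask, redirect, request

app = Flask(__name__)  # this app runs on the old domain

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def move_domain(path):
    # Preserve path and query string so every page maps to its counterpart.
    target = "https://www.new-domain.com/" + path
    if request.query_string:
        target += "?" + request.query_string.decode()
    return redirect(target, code=301)
```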
What to do when A/B testing landing pages
If you're redirecting users to A/B test landing pages, we recommend using a 302 redirect. When you use a 302 for a short period of time, Google lets the original page keep ranking, so you can run your A/B test without disrupting your SEO. We don't recommend leaving this in place for an extended period, though: some Google reps have mentioned that 302 redirects can be treated as 301 redirects if left in place long enough, which leads us to believe that running an A/B test behind a 302 for an extended period could harm your rankings.
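A minimal sketch of a 302 split test in Flask, with illustrative variant URLs:

```python
import random
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/landing")
def landing():
    # 302 keeps ranking signals on the original URL while the test runs.
    # Remove the redirect when the test ends so it isn't treated as permanent.
    variant = random.choice(["/landing-a", "/landing-b"])
    return redirect(variant, code=302)
```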
What to do when you change the permalink slug of a page
If you change the URL of your page, for example from www.lawyer.com/criminal to www.lawyer.com/criminal-lawyers, then you should set up a 301 redirect from the original URL to the new one. The assumption here is that you changed the URL to improve the SEO friendliness of your page, and you want both users and search engines to see that the page has moved and to give SEO credit to the new page. A 301 redirect is the safest way to do this.
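A sketch of slug-change redirects in Flask, using the example slugs from the paragraph above (the mapping dictionary is illustrative):

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Old slug -> new slug; extend this as you rename more pages.
SLUG_MAP = {"criminal": "criminal-lawyers"}

@app.route("/<slug>")
def old_slug(slug):
    if slug in SLUG_MAP:
        return redirect("/" + SLUG_MAP[slug], code=301)
    abort(404)
```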
When you create a new website or blog for your business, the first thing you care about is whether people can find it, and one of the first ways they'll find it is through search engines. Typically, you have to wait for Google's crawler to visit your website and add it to the index. So the questions are: how can you see which pages are indexed, and how can you encourage Google to crawl your site more quickly? Here are some basics on how to tell whether your content is being indexed, along with some good ways to make sure Googlebot is crawling your website or blog.
What are Googlebot, crawling, and indexing?
Googlebot is the search crawler Google sends out to collect information about content on the web and add it to Google's searchable index. Crawling is the process by which Googlebot travels around the web finding new information; it follows links from one website to another to discover new things to crawl. Indexing is the processing of the information Googlebot gathers as it crawls. Once a crawled article is processed, it is added to the searchable index if it's determined to be of high quality. During indexing, Googlebot processes the words in the article, and elements like title tags and alt attributes are examined to help Google understand what the content is about.
Googlebot finds new content by looking at places like blogs, pages, and press releases for links. It crawls those web pages, then follows the links it finds to reach new places to crawl. It also looks at website sitemaps to find lists of subpages to crawl.
How you can get your content discovered
Here are some great ways for your new content to be discovered by Googlebot.
1. Create a sitemap – A sitemap is an XML document on your website that lists every page on the site. It should be updated frequently, so it tells search engines which new pages have been added and helps promote regular indexing of new content by crawlers. For example, if your website is built on WordPress, you can install one of numerous plugins to have the sitemap created and updated automatically (a bare-bones, hand-rolled version is sketched after this list).
2. Submit the sitemap to Webmaster Tools – After you create your sitemap, submit it to Google Webmaster Tools. If you don't have an account already, create a free one and add your website to it. Then go to the sitemaps option and add a link to your website's sitemap. This tells Google to crawl your sitemap and the pages listed in it (a programmatic way to ping Google with a sitemap is also sketched below).
3. Create social network profiles – Crawlers reach your website through links on other websites. One way to get your content discovered is to create social network profiles, then add links to your content on those profiles. Examples include Twitter profiles, Facebook pages, and Google+ pages.
4. Create content offsite – Remember, crawlers look at offsite content and follow the links embedded in it. One great method of earning links and getting Google to index your content is to create offsite content, such as a guest blog post on a website in your niche, with a link embedded back to your content. Please note: you don't want to create blackhat content or engage in spammy link-building techniques, as this is against Google's guidelines.
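If you're not on a platform that generates sitemaps for you, here's a bare-bones sketch of building one with Python's standard library; the URLs are illustrative:

```python
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/blog/first-post",
]

# The urlset element and its namespace come from the sitemaps.org protocol.
urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```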
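And here's a sketch of pinging Google with your sitemap URL programmatically, assuming Google's sitemap ping endpoint and an illustrative sitemap URL (submitting through the Webmaster Tools UI accomplishes the same thing):

```python
import urllib.parse
import urllib.request

sitemap_url = "https://www.example.com/sitemap.xml"
ping = ("https://www.google.com/ping?sitemap="
        + urllib.parse.quote(sitemap_url, safe=""))
with urllib.request.urlopen(ping) as resp:
    print(resp.status)  # 200 means the ping was received
```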
OK – so how do you check the number of pages indexed?
It's actually easy. The simplest way is to go to Google Webmaster Tools, which shows you the number of pages indexed. Another way to see how many pages you have indexed is the following:
1. Go to google.com
2. Type in site:domain.com
This will show you a list of all the pages on that domain that Google has indexed, and you can scroll through them.
Another trick is to search Google for an exact URL in quotation marks. For example, to see whether this page is indexed, you would search Google for “https://www.seocompany.ca/how-can-i-see-what-pages-are-indexed-in-google/” – if it's indexed, Google will show it as a search result.
What if the index count is going down?
Google's index count is a good indicator of how much of your content is indexed. Typically, Google will remove low-quality content from its index to reduce pollution, so the overall number of pages indexed can fluctuate. A 1% to 5% fluctuation in the total is pretty normal: things are in flux, and some of your outdated, lower-quality articles may get tossed out of the index. Sometimes the number of indexed pages fluctuates for no practical reason at all.
If the number of indexed pages drops drastically – meaning a drop of over 10% – that's a potential issue. Here are some reasons why the number of indexed pages might go down:
1) You blocked or de-indexed them yourself. This is a very common mistake: you may inadvertently block sections of your website from crawling via your robots.txt file, or mark pages as noindex or nofollow, and de-index parts of your site without meaning to (a quick robots.txt check is sketched below).
2) You may be under a Panda penalty, or have a filter applied to some of your content. If you see content being de-indexed, look at its quality – is it thin or low quality? For example, in the past we've seen websites get penalized heavily because they had 400-500 pages of content where each article was maybe 300-400 words long. In theory, having that much content sounds fantastic, but when each article is that thin, Google may conclude you're trying to game the system. Too much content like that can trigger a Panda penalty, which results in your entire website being penalized (a simple word-count audit is sketched at the end of this section).
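Here's the robots.txt check mentioned in point 1, a sketch using Python's standard-library robots.txt parser with illustrative URLs:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# If any of these print False, your robots.txt is blocking Googlebot.
for page in ["https://www.example.com/",
             "https://www.example.com/blog/some-post"]:
    print(page, "crawlable:", rp.can_fetch("Googlebot", page))
```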
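And here's the word-count audit mentioned in point 2, a rough sketch assuming the requests and beautifulsoup4 packages and illustrative URLs. It only approximates visible text length, but it's a quick way to surface candidates for thin content:

```python
import requests
from bs4 import BeautifulSoup

pages = [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/blog/post-2",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    words = len(text.split())
    if words < 400:  # rough "thin content" threshold from the text above
        print(f"{url}: only {words} words - review for thin content")
```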