
How to ensure SEO on pages that are not indexed...?

Posted at: 2015-03-04 
One approach is to create a robots.txt file and upload it to the root of the site. You can then add the following rules to make sure those pages aren't indexed.

User-agent: *

Disallow: /path/with-trailing-slash/
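
If you want to double-check the rules once the file is live, a quick script along the lines of the sketch below can help. It uses Python's standard urllib.robotparser; the domain and path are placeholders, so substitute your own.

from urllib.robotparser import RobotFileParser

# Placeholder site; point this at your own robots.txt.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# can_fetch returns False when the Disallow rule above blocks the page.
blocked = not rp.can_fetch("*", "https://www.example.com/path/with-trailing-slash/page.html")
print("blocked for all crawlers:", blocked)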

You can also add a meta tag to the pages that you don't want indexed. If you add the following inside the page's head section, it should not be indexed:

<meta name="robots" content="noindex">
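
To confirm the tag is actually being served on a live page, a rough check like the one below can work; the URL is a placeholder and the substring test is deliberately crude, not a full HTML parse.

from urllib.request import urlopen

# Placeholder URL for a page that should carry the noindex tag.
url = "https://www.example.com/path/with-trailing-slash/page.html"

html = urlopen(url).read().decode("utf-8", errors="replace").lower()

# A proper audit would parse the HTML; this only spots a page that is
# missing the robots meta tag entirely.
if 'name="robots"' in html and "noindex" in html:
    print("noindex meta tag found")
else:
    print("no noindex meta tag found on this page")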
Use robots.txt for this and add rules so that those entries are not indexed. I'm not sure whether you would still score the same search engine points from the links pointing at those pages, but it would certainly do the trick you want.

Hal Smith

URLdreamer Consultant

I guess if you want the benefit of backlinks then you will have to allow search engines to index your website, and if you don't, there's no benefit.

Download this software; it is called Instant Free Massive Traffic. I have saved it in a Zip file for you. Go through it and you will build massive lists. You might be asked to complete a survey at the link where I saved it; go ahead and complete the survey. It is better than buying the software, which cost me $135.00.

I completely agree with you, Oliver.

It's either all or nothing. You can't have both.

I am working on improving our search rankings and need some help. Many external pages will link to ours in the future, but we do not want the linked-to pages to be indexable or accessible via a search engine. However, if possible, we still want these links to score us search engine points. Any suggestions?