Software for finding outgoing links

Hi everyone,

I’ve got a problem I hope someone here can help me with. I’ve just started managing the corporate site for my company. The current setup has two versions of the site:
sandbox.company.com & www.company.com. The sandbox is used as a testbed for all content before it goes live. However, it seems Google has found the sandbox and started indexing it. I was wondering if there was a way to crawl the site and list all the outgoing links from it. Right now, Google isn’t being much help, since it treats both as the same domain (which they kinda are), so I was wondering if anyone had any tips.

Thanks beforehand,

Joel

Ever heard of the robots.txt file?

It sits in your site root and tells web spiders where they’re not allowed to go.

I have attached a file for you. Upload it to your root and that should stop Google and the others.
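For reference, a minimal robots.txt that blocks all well-behaved crawlers looks like this (it should be served from the root of the sandbox host only — e.g. sandbox.company.com/robots.txt — not from the live site):

```
User-agent: *
Disallow: /
```

Note that this only stops future crawling; pages already in Google’s index may linger for a while until they’re recrawled or removed.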

lol, that will work.

I only started managing this site 2 weeks ago, and I’ve been so busy trying to puzzle out the file structure that I never bothered to check whether they had a robots.txt. Thanks.

However, do you know of any software that would help me search the site for the links that caused the search engines to find the sandbox?
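If you’d rather script it than install software, a rough sketch in Python can do the job: parse each page of the live site and flag any link whose host is the sandbox. This is a minimal example using only the standard library; `sandbox.company.com` stands in for your actual sandbox host, and fetching the pages (e.g. with `urllib.request`) is left out here:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects every href/src value found in an HTML document."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                # Resolve relative links against the page URL
                self.links.append(urljoin(self.base_url, value))

def find_links_to_host(html, page_url, target_host):
    """Return the links in `html` that point at `target_host`."""
    parser = LinkExtractor(page_url)
    parser.feed(html)
    return [link for link in parser.links
            if urlparse(link).netloc == target_host]

# Example: one page from the live site containing a stray sandbox link
page = '<a href="http://sandbox.company.com/new">draft</a><a href="/about">about</a>'
print(find_links_to_host(page, "http://www.company.com/", "sandbox.company.com"))
```

Run that over every page of www.company.com and you get a list of exactly which pages leak links to the sandbox.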

Looking at your stats would show that.

The referrers of the spiders.
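To make that concrete: if the sandbox server keeps access logs in the common Apache/nginx "combined" format, you can pull out crawler hits and their referrers with a few lines of Python. This is a sketch — the bot names are just common examples, and field positions vary if your log format differs:

```python
import re

# Apache/nginx "combined" format: ... "request" status size "referer" "user-agent"
LOG_RE = re.compile(r'"(?P<request>[^"]*)" \d+ \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')

BOT_HINTS = ("Googlebot", "bingbot", "Slurp")  # example crawler UA substrings

def spider_referrers(lines):
    """Yield (request, referer, agent) for log lines whose user-agent looks like a crawler."""
    for line in lines:
        m = LOG_RE.search(line)
        if m and any(bot in m.group("agent") for bot in BOT_HINTS):
            yield m.group("request"), m.group("referer"), m.group("agent")

sample = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /draft HTTP/1.1" 200 512 '
          '"http://www.company.com/" "Mozilla/5.0 (compatible; Googlebot/2.1)"')
for request, referer, agent in spider_referrers([sample]):
    print(request, "<-", referer)
```

The referrer column tells you which live-site page the spider followed to reach the sandbox.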
