Don’t look at Me!



TRANSCRIPT

Page 1:

Don’t look at Me!

Page 2:

There are situations when you don’t want search engines digging through some files or indexing some pages.

You create a file in the root directory called robots.txt and list those files and pages in it. Examples:

dynamic search results pages that may display improperly without user input
404 pages
image directories
login pages
general content that you don’t want search engines seeing
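As a sketch of what such a file might look like (the paths here are hypothetical, chosen to mirror the examples above), a simple robots.txt could read:

    User-agent: *
    # dynamic search results pages
    Disallow: /search/
    # the 404 page
    Disallow: /404.php
    # image directories
    Disallow: /images/
    # login pages
    Disallow: /login/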

Page 3:

All spiders automatically look for this file in your root directory, so all you need to do is:

create it
upload it
wait for the spiders to read it

This file is not a secure file. Your stuff isn’t safe just because it’s listed here; the file simply keeps spiders from indexing it.

In fact, anyone can read your robots.txt file simply by going to your domain name followed by /robots.txt:

http://whitehouse.gov/robots.txt
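As a minimal sketch, here is how anyone could fetch and print that file with Python’s standard library (whitehouse.gov is just the example domain above):

    import urllib.request

    # Request the robots.txt file exactly the way a spider would.
    url = "http://whitehouse.gov/robots.txt"
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8", errors="replace"))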

Page 4:

It’s easy. User-agent names the search spider you want to receive the message; if you use an *, you address all spiders.

Preventing spiders from indexing content is done with the keyword Disallow, followed by the path to the private content. You can use the command more than once, for example:

Page 5:

User-agent: Googlebot    # or * for all spiders
# My private folder path
Disallow: /private-folder/
Disallow: /404.php

If you want to disallow just a photo folder from image search, you would do:

User-agent: Googlebot-Image
Disallow: /photos/
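As a minimal sketch of how these rules are applied, Python’s standard urllib.robotparser can check what a given spider may fetch (www.example.com and the file paths are hypothetical stand-ins for your own site):

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt.
    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    # can_fetch() applies the User-agent and Disallow rules for us.
    print(rp.can_fetch("Googlebot-Image", "http://www.example.com/photos/cat.jpg"))
    print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))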

Page 6:

http://www.robotstxt.org