

PROJECT REPORT

ON

“Search Engine Optimization”

Submitted in partial fulfillment of the requirements for the degree

of

MASTER OF COMPUTER APPLICATION

Final Year (6th Semester)

Session 2012-13

By

Rahul Garg

(Roll no.-----------)

(Batch 2010-2013)

Under the supervision of

Internal Guide: Ms. Abc (Asst. Prof., MCA)

HOD, MCA: Mr. Abc (Head, MCA)

DEPARTMENT OF MCA

IDEAL INSTITUTE OF TECHNOLOGY, GHAZIABAD


Project Approval

This is to certify that the project report entitled “Search Engine Optimization”, done by Mr. Rahul Garg, Roll No. -------, is an authentic work carried out by him at “Company Name”. To the best of my knowledge, the matter embodied in this project work has not been submitted earlier for the award of any degree or diploma.

Internal Guide External Examiner

Ms. ABC

(Asst. Prof, MCA)


Declaration

I, Rahul Garg, hereby declare that the project entitled “Search Engine Optimization” is being submitted in partial fulfillment of the requirements for the award of the degree of MCA at Ideal Institute of Technology, Ghaziabad (U.P.), under the supervision of Ms. ABC (Faculty, MCA).

I also declare that the content incorporated in this project has not been submitted in any form for the award of any degree or diploma at any other institution or university.

Rahul Garg

Roll No. 1------------

Semester 6th, MCA

Ideal Institute of Technology,

Ghaziabad.


Acknowledgment

I take this opportunity to express a deep sense of gratitude to Mr. ABC Agrawal, C.E.O., Company Name, for his cordial support, valuable information and guidance, which helped me in completing this task through its various stages.

I am obliged to the staff members of Company Name for the valuable information provided by them in their respective fields, and I am grateful for their cooperation during the period of my assignment.

I also take this opportunity to express my profound gratitude and deep regard to my guide, Mrs. ABC, for her guidance, monitoring and constant encouragement throughout the course of this project.

Last but not the least, I am very thankful to our HOD, MCA department, Mr. ABC, for providing a great deal of help and guidance.

Lastly, I thank the almighty, my parents and friends for their constant encouragement, without which this assignment would not have been possible.

Thank you

Rahul Garg


TABLE OF CONTENTS

1. Introduction

1.1. Aim of the Project

1.2. Goals/Objectives

1.3. Application to SEO

1.4. System Study

Existing System

Proposed System

2. Company Profile

3. System Requirements

3.1. Fundamental Requirements

3.2. Hardware Requirements

3.3. Software Requirements

4. System Analysis

4.1 Information Gathering

Request clarification

Feasibility Study

Request Approval

5. System Implementation

5.1 On Page factor

Title Tags
Meta Tags
Robots Control
Header Tags
The Crawl Map
Anchor Text Checklist
URL Structure

5.2 Off Page factor


Directory Submissions
Social Bookmarking Submissions
Blogs & Comments
Article Submission
Press Release Submissions
Forum Posting

5.3 SMO (Social Media Optimization)

6. Testing Tools

6.1 Tools For Measuring Results

6.2 PPC For Testing Purposes

7. Final Conclusion

Bibliography

Appendices


INTRODUCTION / OVERVIEW

• What is SEO?

– Search engine optimization (SEO) is the art and science of publishing and marketing information that ranks well for valuable keywords in search engines like Google, Yahoo! Search, and Microsoft Live Search.

• Main benefits of SEO

– Low cost, targeted traffic

– Traffic scale can be very large

– With a high volume of targeted traffic, comes substantial revenue

• Ranking on “Page One” of Google

– What it takes…

– An interesting success story

• Market Research

– Identify keywords your competitors are optimizing for

– Identify the “On Page” SEO strategy that is working for your competitors

– Identify the Linking Strategy that your competitors are utilizing

– You can learn what is necessary for you to rank above your competitors, from your competitors


AIM OF THE PROJECT:

Coming up with the right SEO answer is just one part of improving your website's performance online. The strategy you devise then needs implementation and, more often than not, project management. The purpose of this section is to share what I (and a few others) have learned about managing SEO strategies over the years. There isn't much hardcore SEO here, though.

Goals/Objectives:

If you're developing an SEO strategy for a website, you need to make sure you have some objectives in place. A simple 'increase traffic' or 'rank better' is not specific enough but, having said that, creating goals that are so specific they exclude any recognition of improvement across the board is similarly limiting.

Application to SEO: You're an in-house SEO for a website that sells cheese online. Your overall goal is to increase conversions on your site. Your strategy goals are threefold:

Reduce bounce rate by about x%
Increase the number of new visitors by about x%
Increase conversion rate by about x%

It's a painfully obvious thing to say, but having aims in place like this will really increase your chances of creating a successful strategy; everything that goes into it has to have a motivation. Recommending a Twitter account? Is that because you think it'll increase the number of new visitors by x% or because you quite like Twittering? By giving every task you outline a definite purpose, you'll reduce the risk of wasting time on tactics that don't work.

These goals and objectives need to be developed in partnership with whoever you're creating the strategy for, whether that's a client or your boss. It really helps if you can demonstrate to this person why you've chosen these goals and, once you've come up with the strategy, how you're going to achieve them. One of the main reasons for this is that you'll probably need their help at some point along the way.

SEO isn't rocket science but if your client's/boss's expertise lies elsewhere then it's really worth making sure they understand what you're trying to do and why you're trying to do it. Make sure someone (and it can be you) really believes in the strategy and can champion it to whoever needs convincing. It's important that this person can communicate the overall idea as well as go into the specifics. We've found that PowerPoint, graphs and the odd screenshot of a 'moz tool help with this. (My post about using 'moz tools in the sales process talks a bit about this.)

In terms of implementation, if you can show (preferably with diagrams) how changing that title tag or contacting that partner site is crucial to the strategy then you've won half the battle. 


SYSTEM STUDY:

EXISTING SYSTEM:

Our existing site (http://www.gitmgurgaon.com) had a low rank, and the URL of the website appeared on the eighth page of the Search Engine Result Pages (SERPs), so the website owner wanted to optimize the site in order to get maximum traffic and enhance the Page Rank (PR) of the site.

The company took on the responsibility of optimizing this site within 3 months. The site also required work on the on-page factors. The site had no outbound links and no inbound links, unlike its competitors' sites, which bring them traffic. The keywords in the title, content and links were not good enough to compete with rival sites.

The website owner suggested some keywords, and we had to research these keywords against the competitors who were gaining higher ranks. We also found that the keyword density of the website's content was too low, so the site was not returned when users searched for their desired keywords. We also found that the robots.txt disallow rules prevented the spiders from crawling some pages of the website, which degraded its rank. We analysed the site through different search engines such as Yahoo and MSN, but it showed the same problems there. This is what we analysed in the beginning, and it constitutes our existing system.

PROPOSED SYSTEM:

Our proposed site, http://www.gitmgurgaon.com, should attain a higher rank in search engine indexes such as Google.

The site should have a higher Page Rank (PR).

The site requires maximum quality backlinks from same-theme sites whose page rank is high.

The keyword density of a high-PR site should be in the range of 4-6%.

Proper ALT tags are needed in the code of the website.

The disallow rules need to be changed so that the spiders crawl the site (see the robots.txt sketch below).

A dynamic sitemap needs to be created.
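As an illustration of the robots.txt change (a hedged sketch only; the folder name is hypothetical and not taken from the actual site), a configuration that lets spiders crawl everything except an admin area might look like this:

User-agent: *
# Allow crawling of the whole site, blocking only a hypothetical admin folder
Disallow: /admin/

# Point crawlers at the XML sitemap (path is illustrative)
Sitemap: http://www.gitmgurgaon.com/sitemap.xml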


COMPANY PROFILE

Brainguru is a software development company specializing in IT enabled services. Founded in 2010 by a team of professionals with decades of experience in the field of information technology, Brainguru has developed into a world class organization serving national and international clients across continents with quality products and services. We have been serving clients across the US, UK, Canada and India with pride. With a state-of-the-art development center in the Indian capital region, we have offices across Greater Noida.

Areas of Expertise:

* Custom Website Designing and Redesigning

* E-Learning

* B2B and B2C portals

* Web Designing

* Web Development

* Software Development

* E-marketing

* Search Engine Optimization

How are we different from others?

What makes us different from the rest is that we have a dedicated team of professionals with high qualifications and experience across a diverse portfolio, working round the clock to deliver quality products and services.

Mission:

To be a world class IT company offering cost effective and quality products and services for the benefit of all


SYSTEM REQUIREMENTS

FUNDAMENTAL REQUIREMENTS

• So, what did I do when Matt called me up and asked me to take a look at his Adsense Experiment?

• I went to Google… and looked for opportunity

• I then reported back to him that we had a huge upside

• Let’s go to Google and see what we can find out about our competitors

– Who’s ranking on page one?

– Why are they there? Success leaves clues…

– What is their PageRank?

– What does their “On Page” optimization look like?

– Who’s got all the “content”?

– If there’s little content… they must have PageRank (PageRank sometimes trumps content)

– How many links do they have going to their page, and of what quality are those links?

– What does Yahoo Site Explorer have to say?

– Let’s dig in… we’re off to Google


HARDWARE REQUIREMENTS

Main Processor: Pentium IV

Hard-disk Capacity: 80 GB

RAM: 1 GB

CPU Speed: 2 GHz

Keyboard: 104 keys

Monitor: 16-inch LCD

SOFTWARE REQUIREMENTS

Operating System: Windows XP

Web Browser: Firefox, Internet Explorer, Opera, Chrome

Toolbar: Google Toolbar

Browser Tool: SEO for Firefox

Keyword Analysis: PPC Web Spy


SYSTEM ANALYSIS

Information Gathering

A request to take assistance from an information system can be made for many reasons, but in each case someone in the organization initiates the request. When the request is made, the first system activity, the preliminary investigation, begins. This activity has three parts:

Request Clarification

Feasibility Study

Request Approval

Request Clarification

Many requests from employees and users in the organization are not clearly defined. Therefore, it becomes necessary that a project request be examined and clarified properly before a systems investigation is considered.

Feasibility Study

The feasibility study is carried out by a small group of people who are familiar with information system techniques, understand the parts of the business or organization that will be involved in or affected by the project, and are skilled in the system analysis and design process.

"Everyone has different needs and so every site does not need SEO"

SEO feasibility is the foremost thing one should analyze when taking up a project. If you want to meet the client's expectations, wish to see your strategy working and want to avoid later hurdles, this is not one to miss. It is important to find out first whether your site is suitable for optimization and, if so, how worthwhile it can be.

It involves carrying out research and analysis which should cover the following key points:

Study and analyze the domain.
Identify the segment of people and their approach to accessing information on the web.
Identify the internet market share percentage.
Analyze the current web presence of the website.
Understand the objectives of the website.
Is search engine optimization a viable option to meet the desired goals?
Define the area of scope.
What will be the ROI for the optimization efforts?
Do we have a strategy to compete well with competitors?
Availability of resources to execute optimization activities.

Based on the above findings, a report can be created which scores each individual finding. Finally, jot down your points and conclude with an indication of whether to go further with SEO or not. If not, then alternative options from internet marketing can be checked to take it further. A good feasibility analysis report should recommend other marketing techniques if SEO does not apply effectively.

Advantages:

Outlines the limitations.
Helps in project planning.
Figures out realistic estimated efforts.
Better time utilization.

Above all these advantages, the main thing is that you get to know your boundaries while setting up client expectations. It brings transparency to clients, where you clearly indicate what can be achieved and what cannot. So the next time you take on a project, make sure you have done the feasibility analysis.

Market Research

• What is the easiest way to discover this important information about your competition?
– Install Google Toolbar
– Install SEO for FireFox
– Get PPC Web Spy

• Who are you competing against?
– We’re also going to see what other keywords these competitors are trying to rank for
– We’re going to see what Google thinks their site is about by putting their URL in their Keyword Tool
– We’re going to start building a list for the next step in the process

PageRank Explained

(The labels below refer to pages in an example link graph.)

• B has tons of incoming links and only links out to C
• C has only 1 incoming link, but it comes from B, which has a lot of incoming links, and C only links back to B
• E has a lot of incoming links and only two outgoing (one of which is reciprocal), but doesn’t have a lot of “important” links
• A has “juice” that it receives from D, but retains it all because it doesn’t link out to any other sites
• Wikipedia definition


How Do You Get Page Rank?

You “buy it”…
– You create content (write it or have it written) to “attract” PageRank (links). This is sometimes known as “Link Bait”
– You pay someone to attract links for you, by contacting important sites that would be interested in linking to your site (see who is linking to your competitors for ideas)
– You “spend” your time getting involved in your “community” online (talking to website owners, posting on Blogs, getting involved in forums, etc.)
– Using “Article Trading” strategies, you can post content on other people’s sites (mainly Blogs) to link to the most important pages on your site
– You use social media to build links to your site (by creating pages on “Authority Sites” and linking back to your site, you can generate both traffic and PageRank)

Request Approval


It is not necessary that all requested projects are desirable or feasible. Some organizations receive so many project requests from employees that only a few of them can be pursued. However, those projects that are feasible and desirable should be put into a schedule.

In some cases, development can start immediately, although usually system staff members are busy on other ongoing projects. When such a situation arises, management decides which projects are more urgent and schedules them accordingly. After a project request is approved, its cost, priority, completion time and personnel requirements are estimated and used to determine where to add it to the existing project list. Later on, when the other projects have been completed, the proposed application development can be initiated.

Analysis is the process of studying a problem to find the best solution to that problem. System analysis gives us the target for the design and the implementation. Analysis is one of the most important phases of the system development life cycle. System development is a problem-solving technique. Analysis involves interviewing the client and the users; these people and the existing documents about the current mode of operation are the basic sources of information for the analyst.


SYSTEM IMPLEMENTATION

SEO is not an overnight process, once a website is optimized it may take some time before you can see an improvement in your rankings. This is because search engines have their own re-crawl rates depending on the website itself.

If your website is not indexed in Google, do not try to submit it through the URL submission form. Instead, build a few backlinks to your website from other websites; Google finds more new websites through this process. Also, create an XML sitemap (www.xml-sitemaps.com) and register for a Google Webmaster Central account (www.google.co.uk/webmasters/). Once you have added your website to your new account, submit the XML sitemap; this will notify Google.
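For reference, a minimal XML sitemap (the URLs are placeholders, not the project site's real pages) follows the standard sitemaps.org format:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.mywebsite.com/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.mywebsite.com/about-us</loc>
  </url>
</urlset>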

Ensure you install an analytics platform; these are covered later in the document. We recommend Google Analytics as a free, quick and easy solution.

If your website rankings decline, don't panic. Your optimized pages will mean that your rankings increase either after a small decline or after a short period of time. Ranking fluctuations are expected, especially if the website has remained in an un-optimised state for a long period of time.

When building backlinks to your website, you should expect to wait a few months before these links are recognized. When checking your backlink status, use Yahoo Site Explorer instead of the backlink check in Google; Google does not release all backlink information, even in the Webmaster Central panel.

If you make a mistake during your optimization, even after you have uploaded the pages, you can make amends without too much impact.

When optimizing for a search engine, focus on Google; other search engines, including Yahoo and MSN, will tend to follow suit. Each search engine has its own ranking criteria.

Once you have completed the optimization on your website, get straight into social bookmarking, article submissions and resource development, maybe even a little PR. Boost popularity by letting the world know that your website is there and ready to be browsed.

Depending on the nature of your website, you could also join forums and communities, create some articles and point users of the website in the direction of those articles. Do not be misleading in any way, and do not plaster your website's URLs all over the forum; just one URL every couple of days will do.

Check your website's hosting; the geographic location of your website is important. If your website is hosted in a different country to its origin or target market, consider a change of host. This is covered later in the guide.

Download and install a ranking report client so you can keep track of a website's performance in natural search listings. We recommend Caphyon Advanced Web Ranking; you can download this from www.advancedwebranking.com.

Keep monitoring your website in Google webmaster central, look for any items that need attention such as broken links etc.

Hopefully these points will help you through your SEO implementation. Remember, hard work and dedication pay off, even if it takes a little while. In the long run, SEO is far more beneficial than pay per click.

Next we will look at the on-site and off-site factors that complete the whole SEO process. Both on-site and off-site SEO work together to portray quality and relevancy.


On Page Factor

Title Tags

Title tags are a very important part of search engine optimization. They are also important for webmasters and web designers/developers alike. Title tags act as the overhead descriptor for the document; in a short one-sentence summary they provide an overview of the page's topic/theme. Title tags are comparable to the spine of a book: in a library the book cover is hidden and only the spine is on view, and much like this, the title tag provides a short snippet about the page's content/theme.

The quality of the title tag is determined by how well it describes the page; title tags should be brief and direct, without excessive word stuffing. Cramming loads of content into the title tag is a very bad idea; instead, title tags should conform to SEO standards. There is no preset value, but as a rough guide, title tags should not exceed 100 characters in length; ideally, title tags should be within the 0-68 character threshold.

Title tags should use clear and concise English; avoid using abbreviations to represent standard text, try to be direct and, most importantly, do not stuff as many keywords as possible into the title tag. Google will automatically truncate titles that are excessively long. Truncation is where only some of the title is shown and the rest of the title is not visible; it is applied to long title tags, or title tags that are poorly compiled.

Here are some examples of title tags:
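(A hedged sketch; the page and keywords are illustrative and not taken from the project site.)

<!-- Concise, descriptive title within roughly 68 characters -->
<title>Engineering College in Gurgaon | GITM Gurgaon</title>

<!-- Over-stuffed title that Google is likely to truncate -->
<title>engineering college, best college, top college, admissions, courses, Gurgaon, Delhi NCR, India</title>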

Meta tags

Meta tags are a group of tags that sit in the page's header. The meta tags, or meta data as they are known, are generic descriptors for the document. Meta tags can specify the document's keywords, description, theme, audience, formatting, author, and updated dates. Other information can also be integrated into the meta group.

The most commonly used meta tags include "Meta Keywords" and "Meta Description". These tags are utilised so that the document has additional descriptors along with the title tag. The keywords meta tag is designed to assign keywords to the document's overview. This tag has been deprecated as search engine algorithms have evolved to detect the document's keywords by evaluating the page's backlinks and content.

The description meta tag is a brief overview of the page's content. This tag is commonly used within search engine listings as the document's descriptor. It should contain information describing what the page is about.

Most designers and webmasters do not utilise any other meta tags, as they offer no additional benefits to the webpage. Meta information was a part of the original HTML specification, which was revised over the years 1989, 1995, 1997, 1998, 2002, 2005, 2008 and 2009. As the years have gone on, gradual deprecation has been seen across the generic meta spectrum.

The only meta tags that are of importance are, 

Meta Keywords
Meta Description
Meta Content
Meta Robots



Meta attributes offer additional instructions to search engines and crawlers alike. The meta keywords tag is ignored by larger search engines such as Google and Yahoo. Other search engines such as MSN, Live, Excite, Ask, Lycos and AOL still utilise this tag, although with a lower priority than historically. A basic set of meta attributes is sketched below.
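(A minimal sketch of the common meta attributes in a page header; the values are illustrative.)

<head>
<title>Engineering College in Gurgaon | GITM Gurgaon</title>
<!-- Short summary often shown as the search listing description -->
<meta name="description" content="GITM Gurgaon offers undergraduate engineering and management courses in Gurgaon.">
<!-- Largely ignored by Google, still read by some smaller engines -->
<meta name="keywords" content="engineering college, Gurgaon, GITM">
<!-- Crawling and indexing instructions -->
<meta name="robots" content="index, follow">
</head>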

Utilisation of the Meta Robots tag allows the specification of page crawling. You can use the Meta Robots tag to prevent the document being indexed by search engines and other search engine crawlers (bots).

Robots Control

With the robots control functionality, tags can be implemented to prevent crawling of a document. Below are common usage tags.

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

This version of the meta tag prevents Google and other search engines from indexing the page or crawling the document's links. Applying both of these directives will prevent the page being indexed or acknowledged. This can be useful on pages that are either using duplicate content, or on important login pages.

<META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">

This version of the meta tag will prevent search engines from following the links on the page, but will allow search engines to index the page and its content. This is useful for pages whose outbound links should not pass page rank. It is commonly found on pages that are not important towards rankings, such as privacy policy pages, summary pages, contact pages or other pages with a large volume of outbound links.

<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW"> This version of the meta tag will prevent the search engines from indexing the page, but will allow the search engines to crawl links that are found on the page. This is useful and commonly implemented on search results pages which are dynamic, yet offer useful links to other parts of the website. 

<META NAME="ROBOTS" CONTENT="INDEX, FOLLOW"> This version of the meta tag will allow search engines to crawl and index the page whilst crawling links on the page. Typically search engines will do this without this tag being present. 

You should always seek professional advice before implementing tags to prevent crawling and indexing issues. 

Meta Refresh Tags for Redirects

Meta refresh tags have become obsolete but are sometimes used in areas where dynamic redirects are not possible. Meta refresh tags allow the document to redirect the end user to another page through HTML code. This tag has become obsolete as it is classed as an unfriendly search engine redirect. It should not be used on important pages, and should be avoided wherever possible. Pages with these redirects may not be indexed correctly, or may lose ground in search engine positions.
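For reference, a typical meta refresh tag (the delay and URL are illustrative) looks like this:

<!-- Sends the visitor to the new URL after 5 seconds; not a search engine friendly redirect -->
<meta http-equiv="refresh" content="5;url=http://www.mywebsite.com/new-page.html">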


Header tags are an HTML standard preset to highlight information contained within the tags. Header tags are an alternative way of representing information in bold and in a larger font face. Typically, header tags are designed to highlight important information, paragraphs, news and other important data. Because of the nature of header tags, search engines class them as an additional ranking attribute; not in great detail, but they do offer benefits when utilised properly.

With the advancements in web technology, header tags can now be styled in CSS so they can inherit a font, size and colour in order to keep the header tags in the same design format as the rest of the website. Header tags come in a number of preset sizes from H1 (biggest) to H6 (smallest). These can be used in any way, depending on the information that is to be represented.

Many websites adopt a hierarchy whereby the H1 is the leading header tag representing the theme of the page, and the following header tags are then shown in a smaller format to represent individual paragraphs or data. Header tags are a part of on-site search engine optimisation because they are a part of a page's structure.

When search engines index content from a page, the content itself is first parsed and the surrounding tags are evaluated to see how the data is being represented. Search engines use this principle to see how the page delivers its content to the end user; for example, a good structure will represent important information in a header tag, with the content that follows being directly relevant to the header tag.

This structure is SEO friendly, and contributes towards the website's ability to convey theme, as well as its position in natural search.

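A minimal sketch of a header tag hierarchy (the headings and copy are made up for illustration):

<!-- H1 carries the main theme of the page -->
<h1>Used Cars for Sale</h1>
<p>Browse our full range of used cars...</p>

<!-- Lower-level headers introduce individual sections relevant to the H1 -->
<h2>Vauxhall</h2>
<p>Details of available Vauxhall models...</p>

<h2>Ford</h2>
<p>Details of available Ford models...</p>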


Page content

Page content is the foundation of search engine optimisation as well as the theme of the website. Without content, a website is little more than a bare "www." in your browser. Websites convey information through the power of content, discussing everything from products and services to general information. The web has grown to become the world's biggest resource, and the net has grown to become an enormous hub of content, from text to video.

Page content is essential for a website's performance. Search engines typically parse a website to filter out all the text from the coding. The content is then evaluated by search engine algorithms to detect how good the quality of the content is, how broad the topic is and what references are made.

Typically, a search engine's goal is to return the pages most relevant to the user's search query. The foundation of a search engine relies on the quality of the results returned for keyword searches. Filtering good content from bad content is a difficult task even for humans, but for an automated search engine? Impossible? Not quite.

Search engines have evolved a long way from the traditional methods, which returned websites stuffed full of keywords, now, with advances in technology, search engines such as Google and Yahoo can now evaluate how good the content actually is. Because of this power, these search engines have dominated the market place as the search results are of a high quality and relevancy. 

So just how do search engines decipher good content from bad content? Well, the principle is actually quite simple: the process involves spectrum analysis, whereby content is evaluated on a number of different levels, including how broad the content is, what keywords are mentioned in relation to the core theme, the directive that the content is written in, the perspective, and the language/quality of language used.

Search engines such as Google will typically ignore keyword frequency, as this is a system that can still be abused by webmasters. It is a big misconception that keyword density levels can influence search rankings, when actually the quality of the content is far more likely to yield positive results. If the keyword density principle really did have a positive effect, people could write garbage, and then chuck the keyword elsewhere in the page x amount of times to maintain a density level.

Anchor text

Anchor text is the text used to represent clickable links. Anchor text is used internally and externally on websites, which is the standard for all websites to follow. Anchor text is an important part of web design, development, search engine optimisation and pay per click, simply because the links are avenues through a website and between websites. Anchor text can be made up of anything from numbers to special characters, although for search engine optimisation purposes this is not advisable.

When links are created, they will by default show as the URL that they point to; links can then be assigned independent text, or can inherit the URL. When utilising anchor text there are many things to consider, such as how relevant the anchor text is to the destination page, and whether the anchor text reads naturally. Using anchor text in the incorrect fashion can lead to problems when optimising a website.


Many websites have flaws because of their anchor text: either the anchor text is too long, or it does not use the correct wording/keywords. Other problems include using the same anchor text on links that link to different pages; this creates a relevancy separation issue.

If a website uses links to different pages that share the same anchor text, search engines can become confused when determining which page is the right page to return for the keywords used, which is often the cause of supplemental results. Other problems include the use of anchor text for dynamic URLs that are not reachable or properly indexed by search engines.

So what is good anchor text? Good anchor text is text that relates heavily to the destination page, whilst being represented in a short and easily understood format. Avoid using too many stop words; avoid stop words in anchor text altogether if possible. Try not to make link text too long; more than 100 characters can begin to appear spammy, and can cause supplemental issues with the destination page if that page does not have enough authority.

Optimal usage of anchor text includes using the primary keywords for the link. So if the destination page is about cheap Vauxhall exhausts, why not use that for the anchor text? "Cheap Vauxhall Exhausts" is far better than "Click Here" or "For More Information".
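As a quick sketch (the URL is hypothetical):

<!-- Descriptive anchor text that matches the destination page -->
<a href="http://www.mywebsite.com/exhausts/vauxhall">Cheap Vauxhall Exhausts</a>

<!-- Weak anchor text that tells users and search engines nothing about the target -->
<a href="http://www.mywebsite.com/exhausts/vauxhall">Click here</a>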

The Crawl Map

Anchor text is the visual representation for text-based links. When search engines such as Google crawl websites, they index, record and evaluate the linking profile. A big part of this includes the text used to represent the links. Using irrelevant text on links can be seen as a negative item; it can even be classed as misleading if the anchor text is completely incorrect or excessively long.

Search engines build a profile of internal links, the text used, and how many pages share duplicate anchor text, if a website shows no examples of duplicated anchor text it can be seen as a big bonus. 

The map is similar to that of a library, where the spines of books are the equivalent of anchor text; the descriptive anchor text provides the information for the end user to understand where the link will take them. This is why anchor text is not only an SEO-related factor, but a usability-related factor also.

Good Structures

Good anchor text will meet simple criteria, which include keeping the anchor text to a reasonable length, avoiding too many instances of one-word links, using text that represents the destination page in the best possible way, and more.

Misleading anchor text not only has an impact on the website's SEO compliance, it also has a knock-on effect on the website's bounce rate. Confused users will often leave a website that is littered with too many links, clutter, or confusing navigation aids.

Anchor Text Checklist

Use the options below to determine if your anchor text is in an SEO compliant format.

Is your anchor text for each link below 100 characters? (Not a preset volume, but good practice.)
Is the anchor text easily understood by the end user?
Does the anchor text relate directly to the destination page?
Does the anchor text utilise direct keywords?

Internal links


Internal links are a core part of any website. The internal links are created to link each page of your website together in a user-defined structure. The pages of a website are all connected through an internal web of links. These links are created by the webmaster to direct users through the website to find pages relevant to the query or browsing requirement of the website visitor.

Internal links consist of page-based, navigation-based, footer-based and reference-based links. Navigation-based links will typically link to the most important and core pages throughout a website, whilst page-based links are usually incorporated within content to direct users to anything else that may be relevant. Other links, such as reference links, are designed to lead the user to a more in-depth version of the content.

But a website's internal linking profile doesn't just help the end user, it also helps search engines as they crawl and navigate the website. The way in which a website is linked internally can have a large impact on search engine rankings. If a website has a good semantic structure due to internal links, it is more likely that visitors are able to find the content they are looking for; a website with poor internal links may fail in this regard, thus causing ranking problems as well as supplemental results.

If a website has a good internal link profile and users can find what they are looking for, then the bounce rate is likely to be lower, whilst browsing times are maintained. At the same time, good structures help the website rank in a more prominent position along with its internal pages. Good site structure is also recognised by Google and Yahoo; in this instance, the search listings provide additional links within the search description. These links are known as sitelinks, and are generated when a website has a sound internal URL structure.

Good linking structures also minimise the chances of internal pages becoming supplemental. This has been known to happen to pages that are barely referenced throughout the website, because a perception of unimportance is given to pages that do not have many internal links.

Using a Good Structure 

A good URL structure will consist of the home page linking to the most important top-level pages; these pages will break down into sub-pages that follow a hierarchy. If this pattern is maintained consistently, most of the website will interlink with itself, allowing page rank to be spread across these pages, whilst making the pages more manageable when crawling.

Internal linking also has an impact because search engines can evaluate how the website links internally, to see how related content is connected between pages and much more. Having a good URL structure benefits the end user the most, allowing them to find what they are looking for without having to scour the website.

Lower Level Pages

Lower-level pages will tend to have fewer link references, because the focus should always start from top to bottom. Ensuring a website's link profile is SEO compliant is essential. Linking to internal pages should be done carefully, as different pages should have independence rather than similarity.

Internal Links Checklist

Use the options below to determine if your internal links are in an SEO compliant format.

Are your website's main pages easily found from the home page?
Does your website have an SE-friendly URL structure allowing Google to reach lower-level pages?


Are your internal links unique, or do you have multiple links to the same page using different anchor texts?


These are just some of the things to note.   

URL Structure

URL structure is an important part of search engine optimisation, namely because URL structure helps dictate keyword usage within the URL name. URL structure is typically set up depending on the website's server technology and the website's structure. Friendly structures operate in a hierarchy which separates different website pages, typically by theme. A good URL structure will separate products, information, themes and so forth.

URL Structures are either based on URL rewriting, or by raw assignment, which will mean the page extension is shown. 

Example:

URL Rewriting Disabled:

www.mysite.com/myproduct/page1.html 

URL Rewriting Enabled:

www.mysite.com/myproduct/product1 
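On an Apache server, such a rewrite might be configured in .htaccess roughly as follows (a hedged sketch assuming mod_rewrite is available; the paths are illustrative):

RewriteEngine On
# Serve the friendly URL /myproduct/product1 from the underlying script
RewriteRule ^myproduct/([a-z0-9-]+)$ /myproduct/page.php?product=$1 [L]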

Search engine optimisation is aided by keyword usage within URLs. Typically, keyword usage in URLs can compensate for domains that do not contain the keyword. When keywords are used in URLs they should be implemented in a proper manner, avoiding excessively long URLs. The website structure should also be broken down based on the subjects and themes covered.

For Example:

www.mywebsite.com/theme/page-on-theme 

So the theme could be, for example, cars, and the page could be about Vauxhall; this principle should apply site-wide to separate information. This system helps to prevent supplemental results, mixed themes and other website issues.

A good foundation for a website will rely on its ability to separate information through a URL hierarchy, generally, a website will break information up into categories. Combining this with optimal keyword usage will help boost the rankings for the website. 

Keywords within URLs should be selected based on their most direct values; for example, if the URL is to discuss food, the top-level keyword should be based on food instead of multiple keywords. This helps keep the URL within a compliant length.

Ideally, URL length should remain below 250 characters; URLs exceeding this length can become truncated, or can appear spam-like.


Keyword Usage within URLs

For URLs, keywords should be kept in a trimmed format; using long-tail keyword combinations can cause the length of the URL to spiral out of control. Not only that, excessively long URLs can also complicate site structure, making it more difficult to separate and manage information. When a URL is indexed by a search engine, it is evaluated to see how well the keywords match up with the resulting page content.

A URL should be kept below 250 characters and should include the relevant keywords. Information separation sits hand in hand with keyword assignment.

Break Down of Information

The optimum way to break down information is by subject and theme, and then into subcategories, so the website can keep information separated.

For example, www.mywebsite.com/food/cooking/pasta

www.mywebsite.com/food/safety/handling-food 

www.mywebsite.com/cars/vauxhall/servicing

www.mywebsite.com/cars/mitsubishi/evo 

and so forth.  

This system helps to keep all topics separated.

URL Checklist

Use the options below to determine if your URL structure is in an SEO compliant format.

URL length below 250 characters
No special characters
Keywords separated by spacers instead of gaps
URL rewriting enabled if possible

  

Keyword emphasis

Keyword emphasis is a means of highlighting text within a page. Keyword emphasis can be utilised in a number of different ways for a number of different reasons. Typically, keyword emphasis was a part of the initial HTML coding standards. Even to this day, keyword emphasis and content emphasis remain incredibly popular, more so for the end user than just for search engines.

When emphasising a keyword, it becomes a highlighted object within the page, designed to stand out to the end user. Keywords and content can be highlighted to make sure they are seen as a prominent part of the page, whether the content is text related or image related.

Using content and keyword emphasis, you can highlight single keywords or multiple words in a sentence. More often than not this system is overused or not utilised properly. When highlighting information, it should be done so that the end user understands the information to be important. Search engines can detect when this tag is overused, or when it is used out of context, perhaps to highlight information that has no relevance to the page, e.g. CLICK HERE or MORE INFORMATION.

Within content, highlighted keywords not only stand out, but they also show the end user that the highlighted keyword is a point to remember or to understand further. References are also highlighted in content emphasis tags.

Keyword emphasis is a tag that can have a mixed approach; typically it is designed for the highlighting of selective content and not for SEO purposes, but when used effectively it can have beneficial effects on a website's SEO status.

Keyword emphasis is typically integrated using a BOLD tag within HTML. This is represented with a <B> tag and closed with a </B> tag. Whatever content is placed between the open and closed bold tags will be highlighted on the page as follows:

No bold: The World is full of Surprises
Bold: <B>The World is full of Surprises</B>

Another tag that is frequently used is a strong tag which is opened and closed by <STRONG> & </STRONG>. This has the same visual effects as a bold tag, but is a different representation within HTML. The influence is the same as a bold tag.

Bold Tags

The bold tags used within web pages highlight key pieces of information. This should be used on RELEVANT information throughout the web page; for example, a page talking about cars should highlight not only the keyword cars, but also content relative to it such as Vauxhall, Ford, Renault etc. This shows that the content is broad, and that the highlighting picks out the most important pieces of information.

Strong Tags

This is the equivalent of a bold tag and does exactly the same thing. Many argue that the strong tag is recognised by search engines as a better way of representing the information, even though this is known not to be the case. Because the tags have exactly the same effect, they are treated the same way.

Italic and Underlining Tags

This is an alternative method of highlighting keywords; both options can be used. Italic tags allow content to appear in italics, and underlining can be used to make content appear with an underline running below it.

These tags contribute towards SEO efforts on the website; the use of tagging should serve the end user as well as search engines. Search engines look for highlighted information and will evaluate the relevance of the highlighted content to the surrounding content.

Using italic and underlining tagging can be beneficial, but also can become detrimental if abused. It is ideal to highlight content only if it needs to be highlighted towards the end user.
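A short illustrative snippet combining these emphasis tags (the sentence is made up):

<p>Our garage services <strong>Vauxhall</strong>, <b>Ford</b> and <em>Renault</em> cars, with <u>same-day booking</u> available.</p>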

Other External Factors

301 Redirects

301 redirects are a search engine friendly form of redirect that allows users to be directed to a page's new location. A redirect is traditionally put in place when a page changes its filename, extension or location. When the page has moved, search engines still return the old URL, which equates to a broken link. Putting a redirect in place means that when the old page is requested, the user is automatically forwarded to the new page location.

There are many different types of redirect. The reason for this is the nature of website technology, and the way that the page has changed its target URL. For example, a temporary redirect is known as a 302 redirect; this tells search engines that although the location has changed, it is only temporary, and therefore search engines tend not to update search listings for these redirects.

301 redirects are known as permanent redirects because of the 301 code, which in HTTP standards equates to a permanent status. When using a 301 redirect, search engines such as Google, Yahoo and MSN will update their site records to return the page's new location instead of the old location; this process, however, can be slow. 301 redirects are vital, as they prevent masses of broken links, and the potential loss of rankings and traffic as a result.

301 redirects are known as search engine friendly redirects because they do not pose a risk to rankings. These redirects are very commonly implemented, and are very easy to implement in PHP and in .NET (ASPX). They can also be implemented via alternative means such as the configuration of .htaccess files. Also, within certain types of web server technology, the redirect can even be implemented in a dummy file that mimics the old filename that has been moved.

For example, let's say you have a product page under the following URL,

www.mywebsite.com/product/kettle.php 

if you have a new target page, you would create the target page as 

www.mywebsite.com/product/latest-kettles.php 

In the old kettle.php file you would replace the code with the redirect code, which would mean that the old file still exists, but users would not see it; instead they would be redirected to the appropriate page.

Implementing 301 Redirects

Implementing 301 redirects has become incredibly easy to do; below are some commonly used redirect implementations.

301 in PHP

Implementing a 301 redirect in PHP is very easy; simply apply the following code to the old file,

<?php
// Send the permanent redirect status, then the new location
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.mywebsite.com/product/pagename.php" );
exit();
?>

Change the location to the new filename and apply this code to the old filename; this will then form a 301 redirect.
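Where server configuration is available, the same permanent redirect can also be expressed in an Apache .htaccess file (a hedged sketch reusing the kettle example above):

# Permanently redirect the old product page to its new location
Redirect 301 /product/kettle.php http://www.mywebsite.com/product/latest-kettles.php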

301 in .NET (ASPX)

Implementing a 301 in Microsoft's .NET platform is not as easy, but it is fairly straightforward for technicians.


Open up your IIS (Internet Information Services) application and, once you have connected to your web server, select the file to redirect. Right click on the file and select the options for the page. Once you have the panel up, select the radio button "Redirection to a URL". Enter the new location, and then select "Exact URL entered above" and "A Permanent Redirection for this resource".

Then finish by clicking Apply. Once this has been completed, the new redirect will be in place.

302 redirects (or re-directs) are temporary redirects put in place for websites during a transitional period. Websites that are moving server or location, or are restructuring internally, will benefit from temporary redirects. Google will read a temporary redirect and, as a result, will leave the existing index in Google unchanged. If this redirect is left in place too long, then the page will begin to lose its positions in search results.

A temporary redirect is exactly what it says, temporary, and should not be used if the redirect is likely to be in place for more than 3 months. It is important to remember the differences between temporary and permanent redirects including the impact in natural search. 

When implementing a redirect, it's important to remember that your page's existing structure will remain in search engines, so if no redirects are put in place, users will land on broken links, and if a temporary redirect is left in place too long, the page will begin to slide in the rankings.

Setting up a 302 redirect couldn't be easier, as seen in the examples below. You can create the redirects in any server technology; just bear in mind the amount of time that the redirect is likely to be in place for. Websites can see huge declines in their rankings if the redirects are left unattended. It is common for redirects to be forgotten about when the website undergoes a large change, whether it be internal or external.

302 redirects have been common practice for over 10 years since their introduction in 1998. Misuse of this redirect is still prominent, as many websites lie dormant, or in low positions, due to the adverse effects of excessive time periods.

Temporary redirects are overlooked by search engines in the short term, but after a period of time, search engines will then assume that the redirect has been left, and will begin to devalue the page and its placement. 

This is why there are strict guidelines for the use of 302 redirects.

Definition of a 302 Redirect

Using a 302 redirection from http://www.mywebsite.com/ to http://www.mywebsite.com/new-folder/index.htm means that, although the page is temporarily redirected to the new page, the preferred address to be used in the future remains http://www.mywebsite.com/. 302 redirects should be used for temporary redirections.

Why is this an SEO No-No?

Temporary redirects present search engines with the problem of determining whether or not the new redirect location still poses as a relevant source of information relating to the user's search query. When a search engine recognises a temporary redirect, it will question whether or not the new page that users are being directed to is of value, and, if the redirected page is of value, how long it will remain in place until it is moved back to its old location.


Implementing a 302 Redirect

You can implement a 302 redirect as follows:

In PHP

Redirecting a page in PHP

<?php
// Send a temporary (302) redirect to the new location
header("Location: http://www.domain.com/temporary-address/temporary-file-name.html");
exit();
?>

in HTACCESS

Redirecting a page

Redirect /file-name.html http://www.domain.com/temporary-directory/temporary-file-name.html 

Redirecting a directory

Redirect /directory http://www.domain.com/temporary-directory

Redirecting an entire site

Redirect / http://www.temporary-domain.com/ 

HTTP Status

HTTP Status codes are codes returned by a server indicating the status of a requested document. Typically status codes are used by browser agents such as Mozilla, Internet Explorer etc, to identify if a page exists, if the agent has the correct permissions, if the document has moved and lots more. Search engines also use header status codes to identify the document as either Ok, Moved Temporarily, Moved Permanently, or not found. 

All documents in a website should return as a HTTP 200 which signifies OK, this means that the document is present and can be requested.  

HTTP 200:

Means the document can be requested and that it exists, this should be returned for all the pages within your website unless they have been redirected. 

HTTP 301:

Means the document has moved permanently, in this case, the URL that was once followed will be replaced by the new location. 

HTTP 302:

Means the document has moved temporarily, in this case, the old URL will remain as it will become useful when the re-direct is lifted. 

HTTP 400:

Means the request was malformed and could not be understood by the server (a bad request); search engines will treat a URL returning this code as inaccessible.

HTTP 404:

Means that the document could not be found at the given URL and that no alternative document could be provided; search engines will treat this as a broken link. 

Of course, these are only some of the many status codes in existence, but they are the most common ones encountered in SEO and development projects. The first digit indicates the series that a status code belongs to; HTTP 2xx codes generally mean that the document exists, and the remaining digits describe the actual state of the document at the time it was requested. 

Frequent HTTP status codes as per the W3C HTTP specification:

201 Created

202 Accepted

203 Non-Authoritative Information

204 No Content

205 Reset Content

206 Partial Content 

Why are Status Codes Important?

Status codes are vital to user agents and search engines because they provide clear instructions and permissions. If a document has a particular function, it will return a code to notify the user agent; for example, some documents process forms, which make a POST or GET request, and will typically return more than one header status depending on the stage of processing. 

Other documents provide an indication to search engines of their location. If a document has moved, a status code can be assigned to the moved document to inform search engines of the new location; this is part of a 301 or 302 redirect. 

Importance of Status Codes

This is CRITICAL to search engine optimisation and website usability: if a document cannot be found, it can affect the website's standing and cause PageRank to dissipate. 

The status codes most commonly dealt with in SEO are 200, 301 and 302. All documents should return HTTP 200 to reflect that they have been found; if any documents move permanently they should be assigned a 301, and 302 redirects should only be used temporarily, perhaps during a website migration or editing process.
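As a quick way to verify what a page returns, the status line can be inspected programmatically. A minimal PHP sketch, using PHP's built-in get_headers() function with a placeholder URL (a command-line tool such as curl -I can be used for the same check):

<?php
// Request the URL and print the status line of the first response received
// http://www.mywebsite.com/ is a placeholder address
$headers = get_headers("http://www.mywebsite.com/");
echo $headers[0]; // e.g. "HTTP/1.1 200 OK" or "HTTP/1.1 301 Moved Permanently"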

Canonicalization

Canonicalisation is a part of search engine optimisation that is overlooked by many basic agencies and SEO newcomers, and as a result it can have many negative effects when left unattended. The process involves the correct assignment of URLs within a website: deciding whether the URL is represented with www or without www, and with or without trailing slashes. 

Some websites use www and some do not; some use trailing slashes in URLs and some do not. Canonicalisation identifies the preferred form and then makes sure that the non-selected variants 301 redirect to it, preventing URL variants from causing duplicate or supplemental results. 

When search engines index a website, all the URLs that are found are indexed; however, if external websites link to variants, these can cause canonicalisation issues. So, if my website appears as follows, here is how the scenario works: 

http://www.mywebsite.com or http://mywebsite.com  

The two domains are the same; however, if they are both accessible, there is a canonicalisation error because both variants return separate pages. Canonicalisation means that the preferred domain should return an HTTP status code of 200, whilst the non-preferred version should 301 redirect to the preferred version. 

So, let's say we want to stick with the www version; it would sit as follows: 

http://www.mywebsite.com   <- HTTP 200 - Document OK
http://mywebsite.com   <- HTTP 301 - Moved Permanently  

This principle also applies to websites that have URL rewriting enabled. Let's look at URL rewriting: 

www.mywebsite.com/my-products    or

www.mywebsite.com/my-products/ 

The trailing slash makes the URLs distinct, so as with the scenario above, one variant should 301 redirect to the selected version.
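A minimal .htaccess sketch of the www scenario above, assuming an Apache server with mod_rewrite enabled and mywebsite.com as a placeholder domain:

# canonicalisation: send non-www requests to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mywebsite.com/$1 [R=301,L]

The same approach can be used to enforce (or remove) the trailing slash, so that only one variant of each URL returns an HTTP 200.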

Robots.txt file

Robots.txt is a file created by webmasters to instruct search engines on crawl behaviour, for example whether a directory or file type should be indexed. Robots.txt is used in many different scenarios, mainly to prevent the indexing of protected content, images and more. Search engines obey robots.txt, which allows the webmaster to set selective preferences for website security and privacy reasons. 

Hundreds of thousands of websites are built and released online with no restrictions on crawl permissions for any of their directories. This can create major security implications, as search engines will list protected content unless instructed not to. 

Robots.txt is also used for SEO purposes to prevent the indexing of various documents. Some webmasters prevent pages with duplicate content from being indexed, whilst others use robots.txt to stop Google from crawling internal link pages and more. 

Robots.txt is a plain text file placed on a web server, typically in the root of the domain, so that search engines can access it directly. Reputable search engines obey the robots.txt file, although this is a convention rather than an enforced requirement. Over the years the robots.txt file has become more useful and now brings substantial benefits, even allowing instructions to be tailored to each search engine based on the crawler's name. 


Search engines have bots that work on their behalf to crawl and index content. These bots have names assigned to them, and it is these names that can be used within robots.txt to either allow or block access to various documents. For example, you can prevent Yahoo's bot (Slurp) from indexing a particular part of the website whilst allowing other bots, such as Google's bot (Googlebot), to index it. 

This is very useful when tailoring results to each search engine's capabilities. Some search engines, for example, may not be able to index Flash; as a result, you could allow Google and Yahoo to crawl it whilst disallowing other search engines to prevent failed results. 
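A hedged robots.txt sketch of that scenario, assuming Googlebot and Slurp as the crawlers to allow and /flash/ as a placeholder directory (an empty Disallow means nothing is blocked for that bot):

User-agent: Googlebot
Disallow:

User-agent: Slurp
Disallow:

# every other bot is blocked from the Flash directory
User-agent: *
Disallow: /flash/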

This standard has become more widely used over the past eight years as the internet and search engines have evolved, and more webmasters really should utilise the power of robots.txt. 

More About Robots.txt

Search engines are numerous: with over 50,000 search engines online, only about ten of them actually provide any worthwhile traffic. A list of search engine robot names can be found at http://www.robotstxt.org/db.html. 

So what commands are available within robots.txt? 

There are several directives that can be specified in robots.txt, such as Disallow and Allow, which are elaborated on below: 

Example Robots.txt: 

# robots.txt for http://www.example.com/ 

User-agent: *

Disallow: /cyberworld/map/ # This is an infinite virtual URL space

Disallow: /tmp/ # these will soon disappear

Disallow: /foo.html 

Above is a snippet from an example robots.txt file; here is how the commands break down. 

Breakdown of Information

User-agent

User-agent is the name of the search engine bot. If you specify a chain of search bots here, you can instruct either all search engines or only the listed ones to adhere to the commands. If you would like all search engines to follow the commands, simply use an asterisk (*). 

Disallow

Whatever file or directory follows this command will be excluded for the specified user agent. For example, if you disallow mypage.html, the user agent will not index that file. You can also disallow entire directories by listing them as they appear on the server, i.e. /mydirectory/. 

Crawl permission


Crawl permissions are used to prevent many issues associated with website indexing. When search engines index a website, they will by default crawl everything that can be reached from internal site references. Setting crawl permissions allows you to manage how your website is indexed; crawl permissions is the umbrella term covering both robots.txt and on-page meta data. 

You can instruct search engines to crawl a page, to ignore it, or to read it but not index it. Let's see how that can be beneficial. 

I want search engines to follow a page but not to index it? 

This is commonly used on websites with lots of dynamic search results. You can instruct search engines to follow the links on a page but not to index the page itself; this is done through meta data, using a noindex, follow attribute added to the top of the document. Learn more about this on the meta tags page. 

I want search engines to completely ignore a specific page? 

This is another common scenario for various types of pages. It can be done through the robots.txt file by setting the URL as a disallow, or by adding a meta attribute to the top of the document that instructs search engines to nofollow and noindex the page. 

For more information visit the meta tags page or the robots.txt page. 

I want search engines to index a page but not to follow the links on the page? 

This is another common scenario for webmasters and is easily handled by adding a meta attribute to the page: an index, nofollow value will allow search engines to index the page but not to follow any of the links on it. 

For more information visit the meta tags page. 
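As a minimal illustration of the meta attributes described in these scenarios, placed inside the <head> of an HTML document (only one of these would appear on a given page, depending on the scenario):

<!-- follow the links on this page but do not index it -->
<meta name="robots" content="noindex, follow">

<!-- index this page but do not follow any of its links -->
<meta name="robots" content="index, nofollow">

<!-- ignore this page entirely -->
<meta name="robots" content="noindex, nofollow">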

These are just some of the permissions that can be set through robots.txt and meta data. For most SEO projects, these will be common scenarios when creating a sound SEO structure on the website itself.

Crawl Permissions

So how do search engines actually crawl? 

Well, the term crawl is used because search engines follow paths laid out by the design of the website. As the website is constructed, the links built internally on its pages determine how the website is crawled. 

Setting the permissions will help maintain a good profile by preventing the indexing of potentially duplicate documents, reducing PageRank spread and more.

404 errors

404 errors are more common than any other status code apart from HTTP 200 OK. A 404 occurs when a page on the website can no longer be found at the location of the URL. This is a critical problem for websites that are meant to perform well, as websites with multiple broken links can suffer performance and ranking declines in Google.


A search engine's goal is to return the most relevant results for the user's search query. A search engine will devalue a website that contains multiple broken links because it will be considered an unreliable resource. 

Search engines base their results on quality and consistency; if a website has frequent 404 errors, there is no guarantee that its pages will be available when requested from a high position. Many websites will encounter 404 errors over time, generated for a variety of reasons ranging from server issues to website migration issues. 

Websites that identify and resolve these issues quickly will not suffer the effects of traffic loss or ranking decline.

It is imperative to maintain a good linking profile within a website, and in order to do this, it's important to check that the links within your website are correct and not returning 404 errors. 

A good tool for monitoring and preventing errors is Google Webmaster Central, which can automatically report any 404 errors that have been encountered. Resolving these issues as they occur is a great way to stay proactive. 

404 errors can be handled gracefully with a custom 404 page, provided the server settings are correct. Some websites have custom 404 pages that direct the user to the next most relevant page, or offer navigational options to other parts of the website. This is a great way to prevent any resulting loss of rankings or site performance.
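On an Apache server this can be done with a single .htaccess directive; a minimal sketch, assuming a hypothetical /404.html page exists on the site:

# serve /404.html whenever a requested document cannot be found
ErrorDocument 404 /404.html

The custom page itself should still return a 404 status so that search engines do not index it as a normal page.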

So, either set up Google Webmaster Central or run a site crawling tool to identify broken links; we recommend a link checker such as Xenu's Link Sleuth, which can be downloaded from download.com. 

Subdomains

Sub domains are an extension of a domain name that typically appears before the domain name, separated by a full stop. Sub domains are treated as independent websites unless they are heavily linked with the parent domain. 

Sub domains are incredibly popular for large scale businesses and commercial websites that have a large product array. Sub domains can be optimised independently but can also be a part of the leading domain inheriting page rank and authority. 

There are lots of ways that sub domains can be beneficial; at the same time, if used in the wrong manner they can harm the performance of the top level domain. Sub domains appear in a form similar to http://mysubdomain.domain.com. Most sub domains avoid the www. prefix because of the connection to the root level of the domain. 

Sub domains will usually be hosted on the same server; it is important to know that hosting a sub domain on a different server can actually have a draining effect on the top level domain. Many configuration options are available to prevent this, but in general it will have an impact. 

Sub domains are utilised to split up a website into different categories or product bases, which is ideal for businesses looking to separate different sections of their website.

Utilising sub domains properly, however, can bring many benefits, including enhanced ranking placements for different keyword groups and themes, rather than a domain's general approach of core keywords on the home page with sub pages returned for the appropriate keyword searches. 


Sub domains should contain unique content, and should maintain a good internal link profile, having a poorly utilised sub domain can lead to trust issues with top level domains unless the sub domain is separated from the root domain with nofollow attributes. 

Sub-Domains and the Selection Process

When sub domains were introduced by registry foundations such as Nominet, they were often abused by adding in lots of different keyword groups into the URLs and then re-directed to the top level of the domain, which actually led to the decline of many popular websites. Sub domains are a separation factor only and should not be abused in order to enhance rankings. 

Be careful not to link too heavily between the domain and sub domain: because the sub domain is treated as an independent website, this could be seen as excessive intra-site linking, which can become detrimental to rankings for both the domain and the sub domain. 

Domain age

Domain age is an important factor within search engine optimisation because it plays a part in the trust built up for a website. Think of age in connection with trust: the older a website is, the more likely it was created for good use. Spam-like websites generally have a short life before being left to rot in the darkest corners of the internet, and it is the age factor that helps prevent these websites from gaining high placements. 

This system was introduced in order to filter results more stringently. Websites that are maintained and content driven are more likely to gain successful placement over a long period of time, because search engines look at update frequency, how the website updates, how the site changes and more. Domain age strips out the websites that are born and die quickly, which helps clean up results. 

Domain age can be an obstruction for newly acquired domains and newly built websites; typically this can be overcome within a 12 month period by maintaining the website and giving it attention so that it expands and develops a growing link profile. If search engines can see that the website has not been designed to come and go, it will become more trusted and can therefore achieve higher positions. 

This principle was once called the Google sandbox, however, it has been proven that there really is no sandbox, instead, it is more like an evaluation period. New websites can still rank and perform well, but it all depends on the market and the competition that the website enters against. 

Domain age plays a key part in any SEO project, which is why new domains need trust building before the website can make a true impact on performance. You will find that websites younger than a year struggle the most; after the first three years the domain will have built up a trust element within search engines, which is when it can begin to target highly competitive keywords.

Domain keyword utilization

Domain keyword utilisation is not an essential part of search engine optimisation, but it does play a part as an additional ranking attribute. Typically, keyword-rich domains have an edge over competitors who are not using keywords; however, since keyword attributes are directly influenced by the webmaster's choice, this signal has been devalued. 

Some websites are coupled with domains excessive in keyword utilisation, many featuring more than 3-4 keywords within the domain name, which immediately looks spammy. When a domain utilises keywords it should do so in a measured manner, ensuring that the domain name does not become too long and that keyword utilisation stays at the right level. 

For example, if you were running a car dealership and you bought a domain such as

www.my-cheap-car-sales-with-bad-credit.co.uk, you end up with a domain name that looks spammy and is less likely to be given trust attributes initially. A better example would look something like

www.bad-credit-cars.co.uk 

The domain is shorter and covers some of the keyword scope that the website can utilise. Keywords within the domain are a good idea provided they can be implemented without requiring excessive length.  

Creating a keyword-rich domain with two of the website's most popular keywords is ideal; however, implementing a long string of keywords separated by hyphens is a bad idea, as it can have a profound effect on the website's appearance in organic search results.  

Keyword Rich Domains

The easiest way to create a keyword-rich domain is to use the keywords that are most relevant to the website as a whole, e.g. Discount Cars - www.discount-cars.co.uk or something along those lines; stuffing keywords in will likely give the website a more spam-like appearance. 

Domain length is an important factor: excessively long domains end up truncated and are no better off than domains not utilising any keywords. Keep your domain short, and remember that on top of your domain you will have directories and page names, which only add to the URL length. 

Try to keep your domain below 70 characters in length (excluding www. and .com/.co.uk), and only use hyphens to separate keywords; do not use any other characters. 

Geographic targeting 

Geographic targeting is a vital part of search engine optimisation as it helps to define the end user's market. Geographic targeting is the process of basing hosting in the same country that the top level domain represents, and in the same area in which the website focuses its services. Think of it from this perspective: if you searched for car dealerships and you lived in Bristol, why would you want listings for car dealerships in New York? 

Alongside content, Google filters results using geographic targeting, so next time you search you get results tailored to your country, and this can be narrowed further to region. Generally the separation is done by visiting Google.co.uk or Google.com, but don't forget about Google.fr or Google.de; there are lots of countries, and lots of search engines with dedicated results. 

If you search in Google.de you will get sites that concentrate on the business relative to the keyword query within that country, so Google.de will return popular car dealerships in Germany, whilst Google UK will do the same for the UK and so on. 

Your hosting and top level domain contribute heavily towards this. If you have a UK based website with a .co.uk domain that is hosted in the US, this will drag down your final search positions; it can be overridden with content and backlinks, but in general the best approach is to keep your site and its hosting in the same country that you deal with, unless you operate in the global market, in which case you would split domains and hosting by region. 

Google introduced geographic targeting shortly after the Big Daddy update, which saw lots of websites shift based on their TLD (top level domain) in relation to their hosting. This helps to provide even more accurate results and to filter out results that would be useless to visitors and website owners alike. If you are searching for a product to buy in the UK, results from the USA are hardly likely to be of any benefit. 

Google Webmaster Central offers the ability to specify your target location, which is ideal; however, your hosting and TLD should still reflect it.  

Domain Geo Targeting

You can change your target market by having the appropriate top level domain and hosting in that region, so if you are focusing on a UK market you would have a .co.uk domain with hosting in the UK. If you operate in France and only deal with French business, you would have a .fr domain with hosting in the corresponding location. 

Geographic results are beneficial for the separation and relevancy of results. This is often overlooked when webmasters open a hosting account; many go for price over geographic factors, when really this is not the correct decision to make. 

Ideally, everything your website focuses on should be emphasised with geographic targeting. Websites suffering from a reduction in rankings could potentially make large improvements by shifting hosting to a region closer to home. The top level domain is also vital, it is rare to find foreign domains being listed in a search elsewhere in the world unless the keywords searched for are niche to that foreign region. 

Many large businesses set up domains with different extensions based upon the target market. You can install a number of Firefox add-ons to identify the location of hosting. One very useful addon is called Flagfox, which you can download from https://addons.mozilla.org/en-US/firefox/addon/5791 

The addon displays a flag in the URL bar based upon the hosting location. So if your flag doesn't show as being in your top level domain's associated country, then it might be time to consider a different host.

Off Page Optimization

Off page optimization is a search engine optimization method performed outside of the website in order to achieve the best positions for a particular set of keywords and attract more visitors. Off page optimization concentrates on getting backlinks for particular keywords via various link building strategies. Link building methods include directory submission, article distribution, press release submission, blogging, forum posting, manual link building, link exchanges, social bookmarking submission, social media optimization and many more. Backlinks are also often called inbound links; more quality inbound links give a better position in search engines, and it is very important to get inbound links from quality websites instead of link farms.


What are the most important methods for off page optimization? Off page optimization usually consists of different link building methods.

Directory Submissions:

Steps for directory submissions:

Step 1: Look for free directory sites in Google, Yahoo, Bing, etc.


Step 2: Open the directory with the highest PageRank:

Step 3: Look for the submit link on the opened site:


Step 4: Submit the link by entering the details in the following form:


Social Bookmarking Submission

Social bookmarking is a method for Internet users to share, organize, search, and manage bookmarks of web resources. Unlike file sharing, the resources themselves aren't shared, merely bookmarks that reference them.

Descriptions may be added to these bookmarks in the form of metadata, so that other users may understand the content of the resource without first needing to download it for themselves. Such descriptions may be free text comments, votes in favor of or against its quality, or tags that collectively or collaboratively become a folksonomy. Folksonomy is also called social tagging, "the process by which many users add metadata in the form of keywords to shared content".[1]

In a social bookmarking system, users save links to web pages that they want to remember and/or share. These bookmarks are usually public, and can be saved privately, shared only with specified people or groups, shared only inside certain networks, or another combination of public and private domains. The allowed people can usually view these bookmarks chronologically, by category or tags, or via a search engine.

Most social bookmark services encourage users to organize their bookmarks with informal tags instead of the traditional browser-based system of folders, although some services feature categories/folders or a combination of folders and tags. They also enable viewing bookmarks associated with a chosen tag, and include information about the number of users who have bookmarked them. Some social bookmarking services also draw inferences from the relationship of tags to create clusters of tags or bookmarks.

Many social bookmarking services provide web feeds for their lists of bookmarks, including lists organized by tags. This allows subscribers to become aware of new bookmarks as they are saved, shared, and tagged by other users.

As these services have matured and grown more popular, they have added extra features such as ratings and comments on bookmarks, the ability to import and export bookmarks from browsers, emailing of bookmarks, web annotation, and groups or other social network features.

Step 1: Look for a top social bookmarking site in a search engine:


Step 2: Open the bookmarking site:

Step 3: Register on the bookmarking site if not already registered, and log in using your username and password:


Step 4: Post your story:

Step 5: This is what you will see after submitting your story:


Blogs & Comments:

Step 1: Open blog sites like Blogger.com or WordPress.com.

Step 2: Log in with your username and password if registered (with a Gmail ID).

Step 3: Search for the blogger on a search engine, e.g. ([email protected])

Step 4: Enter your details and publish the post.


Article Submission

Step 1: Search for article submission directories in a search engine:

Step 2: The following results will be shown:


Step 3: Different directories with their PageRank will be displayed:

Step 4: Open the article directory with the highest PR.


Step 5: Register as a new user if not already registered:

Step 6: A confirmation email will be sent to your email address; log in using your username and password:


Step 7: Click on Add new article:

Step 8: Fill in the details of the form that appears:


Step 9: The submitted article will be displayed like this:

Press Release Submissions:

A press release, news release, media release, or press statement is a written or recorded communication directed at members of the news media for the purpose of announcing something claimed as having news value. Typically, they are mailed, faxed, or e-mailed to assignment editors at newspapers, magazines, radio stations, television stations, and/or television networks. Commercial press release distribution services, such as PR Newswire, PR NewsChannel and Business Wire, are also used to distribute them.

The use of a press release is common in the field of public relations, the aim of which is to attract favorable media attention to the public relations professional's client and/or provide publicity for products or events marketed by those clients. A press release provides reporters with the basics they need to develop a news story. Press releases can announce a range of news items such as scheduled events, personal promotions, awards, new products and services, sales and other financial data, accomplishments, etc. They are often used in generating a feature story or are sent for the purpose of announcing news conferences, upcoming events or changes in a corporation.

A press statement is information supplied to reporters. This is an official statement or account of a news story that is specially prepared and issued to newspapers and other news media for them to make known to the public.


Forum Posting:

An Internet forum, or message board, is an online discussion site.[1] It originated as the modern equivalent of a traditional bulletin board, and a technological evolution of the dialup bulletin board system.[2][3] From a technological standpoint, forums or boards are web applications managing user-generated content.[3][4]

People participating in an Internet forum may cultivate social bonds and interest groups for a topic made from the discussions.

A Bulletin Board System, or BBS, is a computer system running software that allows users to connect and log in to the system using a terminal program. Once logged in, a user can perform functions such as uploading and downloading software and data, reading news and bulletins, and exchanging messages with other users, either through electronic mail or in public message boards. Many BBSes also offer on-line games, in which users can compete with each other, and BBSes with multiple phone lines often provide chat rooms, allowing users to interact with each other.

Originally BBSes were accessed only over a phone line using a modem, but by the early 1990s some BBSes allowed access via a Telnet, packet switched network, or packet radio connection.

The term "Bulletin Board System" itself is a reference to the traditional cork-and-pin bulletin board often found in entrances of supermarkets, schools, libraries or other public areas where people can post messages, advertisements, or community news.

During their heyday from the late 1970s to the mid 1990s, most BBSes were run as a hobby free of charge by the system operator (or "SysOp"), while other BBSes charged their users a subscription fee for access, or were operated by a business as a means of supporting their customers. Bulletin Board Systems were in many ways a precursor to the modern form of the World Wide Web and other aspects of the Internet.

Early BBSes were often a local phenomenon, as one had to dial into a BBS with a phone line and would have to pay additional long distance charges for a BBS out of the local calling area. Thus, many users of a given BBS usually lived in the same area, and activities such as BBS Meets or Get Togethers, where everyone from the board would gather and meet face to face, were common.

As the use of the Internet became more widespread in the mid to late 1990s, traditional BBSes rapidly faded in popularity. Today, Internet forums occupy much of the same social and technological space as BBSes did, and the term BBS is often used to refer to any online forum or message board.

Ward Christensen and the computer that ran the first public Bulletin Board System, CBBS

Although BBSing survives only as a niche hobby in most parts of the world, it is still an extremely popular form of communication for Taiwanese youth (see PTT Bulletin Board System). Most BBSes are now accessible over telnet and typically offer free email accounts, FTP services, IRC chat and all of the protocols commonly used on the Internet.


Affiliate Marketing:

Affiliate marketing is a marketing practice in which a business rewards one or more affiliates for each visitor or customer brought about by the affiliate's marketing efforts. Examples include rewards sites, where users are rewarded with cash or gifts for the completion of an offer and the referral of others to the site. The industry has four core players: the merchant (also known as 'retailer' or 'brand'), the network, the publisher (also known as 'the affiliate') and the customer. The market has grown in complexity to warrant a secondary tier of players, including affiliate management agencies, super-affiliates and specialized third-party vendors.

Affiliate marketing overlaps with other Internet marketing methods to some degree, because affiliates often use regular advertising methods. Those methods include organic search engine optimization, paid search engine marketing, e-mail marketing, and in some sense display advertising. On the other hand, affiliates sometimes use less orthodox techniques, such as publishing reviews of products or services offered by a partner.

Affiliate marketing—using one website to drive traffic to another—is a form of online marketing, which is frequently overlooked by advertisers. [1] While search engines, e-mail, and website syndication capture much of the attention of online retailers, affiliate marketing carries a much lower profile. Still, affiliates continue to play a significant role in e-retailers' marketing strategies.[2]


Link Building

• You need to get your pages “indexed” in Google in order for them to rank. The best way to get your pages indexed is to get links from important sites to your pages

• Having your page listed in an important directory is a valuable link (being listed in DMOZ and Yahoo are probably two of the most important)

• You need links to rank; the higher the quality of the links you acquire, the fewer you need to rank above your competitors

• Search engines view links as “votes”, with some votes counting more than others. To get high quality links (that help your site rank better) you need to participate in the social aspects of your community and give away valuable unique content that people talk about and share with others (also known as “Link Bait”)

• How many sites are "voting for you" (by linking to you) and how many sites are linking to them (which determines how important they are)… this, in turn, determines how important your site is

• Not all links are the same… the “Logarithmic Scale” is alive and well.


• You also need to control your “link reputation” so that Google knows what your site is about. What other sites "say" about you when they link to you (in other words, the "anchor text").

• You need anchor text, on the links coming to your page, that uses the keyword phrase that you want to rank for… and have it link to a page that is about that keyword phrase.

• You need to vary your anchor text so that it looks “natural” to the SE. Think about how people would do it naturally… they wouldn’t all use the same exact anchor text on a link to your page.

SMO (Social Media Optimization)

Abbreviated as SMO, social media optimization is the process of increasing the awareness of a product, brand or event by using a number of social media outlets and communities to generate viral publicity.

Social media optimization includes using RSS feeds, social news and bookmarking sites, as well as social media sites and video and blogging sites. SMO is similar to SEO (search engine optimization) in that the goal is to drive traffic to your Web site.

Facebook: https://www.facebook.com/gitm.gurgaon.india

1. Create a Facebook page:


2. Facebook likes: increase Facebook likes through various activities

3. Daily posts on Facebook


LinkedIn: http://www.linkedin.com/groups/Admission-Open-2013-Helpline-91-4459157?gid=4459157&trk=hb_side_g

Twitter: https://twitter.com/gitmfarukhnagar


Google+: https://plus.google.com/u/0/117556983242816167057/posts

StumbleUpon: http://www.stumbleupon.com/stumbler/GlobalInstitute/likes


Blogspot: http://gitm-gurgaon.blogspot.in/

Youtube: http://www.youtube.com/user/GlobalInstituteGITM?feature=mhee


TESTING TOOLS

Tools For Measuring Results:

• Google Analytics
• Google Website Optimizer
• Google AdWords
• Google's Search-Based Keyword Tool
• Google Trends
• UltraCart Shopping Cart
• 1ShoppingCart
• Aweber Email Systems
• iContact Email Systems

PPC For Testing Purposes

• Using PPC advertising, you can identify the "Money Terms" you want to optimize your site for

• SEO is a long-term strategy that has a huge payoff when done correctly. This means optimizing for the right terms, your "Money Terms".

• By bidding on your top search phrases, you can find out almost immediately what your conversions are “by keyword”

• Once you identify the keywords that are most profitable for you in PPC, you can immediately go to work to create and optimize pages for “organic search”.


FINAL CONCLUSION

Our website www.gitmgurgaon.com is shown as top ranked in the Search Engine Results Page (SERP) for the following queries:

1. GITM Gurgaon


2. Global Institute Of Technology & Management

3. GITM


4. M.Tech College in Gurgaon

5. Top B-School in Gurgaon




One important characteristic of an expert and professional consultant is that he or she understands that the theories and techniques in their field are subject to quite concrete and specific limitations. In this section, I review the important ones that apply to search engine optimization and website marketing, and their practical significance.

SEO is made to order. If you have planned a product launch or a wedding, or been a party to a lawsuit, you know that the best laid plans are seldom executed without some major changes. Life is simply too complex. SEO is another of those human activities because, to be effective, it has to be tailored to a specific site and business.

There are other important limitations which you need to understand and take into consideration.

Searching is an evolving process from the point of view of providers (the search engines), users and website owners. What worked yesterday may not work today and may be counter-productive or harmful tomorrow. As a result, monitoring or regular checks of the key search engines and directories are required to maintain a high ranking once it is achieved.

Quality is everything. Since virtually everything that we do to improve a site's ranking will be known to anyone who knows how to find it, innovations tend to be short-lived. Moreover, search engines are always on the lookout for exploits that manipulate their ranking algorithms. The only thing that cannot be copied or exploited is high quality, valuable content, especially when others link to it for those reasons; only content of even higher quality and value trumps it.

The cost of SEO is rising. More expertise is required than before and this trend will continue. The techniques employed are more sophisticated, complex and time consuming. There are fewer worthwhile search engines and directories that offer free listings. Paid placement costs are rising and the best keywords are expensive.

The search lottery. Search engines collect only a fraction of the billions of pages on the web, for various technological reasons which change over time but which nonetheless mean, for the foreseeable future, that searching is akin to a lottery. SEO improves the odds but cannot remove the uncertainty altogether.

SEO is a marketing exercise and, accordingly, the same old business rules apply. You can sell almost anything to someone once but businesses are built and prosper through repeat customers to whom the reputation, brand or goodwill is important. Content quality and value is the key and that remains elusive, expensive and difficult to source but, in websites, is the only basis of effective marketing using SEO techniques.

Suffice it to say, if your site is included in a search engine and you achieve a high enough ranking for your requirements, then these limitations are simply costs of doing business in cyberspace.