
CHAPTER 1

1 Introduction to Search Engine Optimization Certification

Welcome to the Search Engine Optimization Certification from ExpertRating. We congratulate you on making a wise decision to educate yourself about one of the fastest-growing and most lucrative industries: the Search Engine Optimization industry. The ExpertRating Search Engine Optimization Certification has been developed by ExpertRating Solutions, a leader in online testing and certification with over 2,700,000 certified professionals in over 60 countries across more than 170 different skill areas. This certification program is one of the most comprehensive programs on SEO available to date and can successfully guide you toward your goal of becoming a professional optimizer. The ExpertRating SEO program has been developed by experienced professionals under stringent ISO specifications, and can fully equip you to join the SEO marketplace, whether as a self-employed professional or as a job seeker.

As you are no doubt aware, the incredible growth of the internet is leading to the development of millions of websites every year. Most websites are developed to generate profits for their owners through ecommerce or online advertising revenues. Every website owner would like his or her website to attract more visitors, and hence generate more business. This translates into a business potential worth billions of dollars for search engine optimizers, who can quickly master the intricacies of this industry through a well-researched and comprehensive program such as this one. This is an opportune time to become a Search Engine Optimization professional and reap the rewards through well-directed and dedicated effort. Best of luck, and happy optimizing!

You will proceed through the courseware chapter by chapter, according to the list of chapters in the program's table of contents.

1.1 The Final Certification Test

After you have gone through and revised the complete program material, you can
appear for the final test. You must appear for the test without referring to the text
material, and it is advisable that you are well prepared before attempting the test.
The specifications of the test are mentioned below:

ExpertRating Final Test Format

Type of Test - Multiple choice questions with one or more correct answer(s)

Duration - 35 minutes
Number of Questions - 35

Question Weightage - All questions carry equal marks

Navigation - You can go back and answer unanswered questions

Answer Review - At the end of the exam, you can go back and review or answer any marked questions

Exhibits - Some tests will require you to answer a question based upon an exhibit

Pass marks - 50%

Retake Policy - You can retake the test any number of times by paying the
required retake fee.

Note: Some tests may follow a different format. Please read the test details carefully
before taking the test.

All successful candidates receive a hard copy certificate of accomplishment stating that they have completed all the requirements of the Search Engine Optimization Certification process. This certificate can be used as a means of marketing your SEO business, as well as when seeking a job. It takes about three weeks to receive your certificate through registered post. You also get logo usage rights and an online transcript that you can immediately use to display your test marks and the areas in which you are proficient. You can link to the online transcript from your website, or ask friends, relatives or business associates to look it up on the internet.

1.2 Roles and Responsibilities of a Search Engine Optimization Professional

As a Search Engine Optimization professional, you will be responsible for optimizing websites or webpages for clients or for your employer. The SEO process covers many aspects of a website and includes various activities to improve the ranking of webpages in the SERPs (Search Engine Results Pages). SEO activities help to increase the visibility of a webpage not only by enabling it to rank higher in the SERPs of the major search engines, but also by making it visible in more places on the internet. As a result, the website attracts more visitors, more traffic and, most importantly, more business for its owners. SEO encompasses several facets of the website. Some areas of operation for search engine optimizers are:

 Website domain name (deciding the domain name)
 Webpage names and file/directory names (deciding the page names and file names)
 Deciding page titles, headers and sub-headers
 Choosing and embedding relevant keywords throughout the website
 Making alterations to the copy matter of the website
 Making changes to the flow and layout of the website and optimizing internal links
 Making design changes
 Making the site search engine readable by working with meta tags and robots.txt files
 Monitoring changes to the website in the SERPs of major search engines
 Developing strategies for increasing in-bound traffic
 Monitoring changes to search engine algorithms
 Studying competing websites and learning from their SEO efforts

What characteristics will help you become a good Search Engine Optimization
Professional?

The work of a Search Engine Optimization professional requires a lot of close interaction with clients, several times a month. Good communication and interpersonal skills can greatly help you, especially if you are running your own Search Engine Optimization business. A pleasing and helpful nature will help you attract more clients as well as keep your existing clients happy and satisfied. Good SEO professionals must also keep tabs on the latest happenings in the search engine industry and should understand how changes within the industry will affect search engine rankings. Since many clients may not know much about SEO, it is the duty of the SEO professional to keep them aware of the actual picture regarding their websites and how they will perform in the SERPs in the future.

Some characteristics that make great Search Engine Optimization Professionals are:

 A pleasing and helpful nature
 Good communication skills
 Selling skills (especially if you plan to run your own business)
 Good listening skills
 A responsible attitude toward clients
 Patience toward your SEO efforts and toward your clients' requests
 A good understanding of the science of search and the internal workings of the major search engines
 The ability to grasp new techniques and scientific concepts
 The habit of reading regularly about the latest SEO news

Some Expert Advice

Even if you do not possess some of the above characteristics, it does not mean that you cannot become a good Search Engine Optimization professional. You can develop the required skills and characteristics on the job and over time by keeping your weaknesses and strengths in mind. Successful Search Engine Optimization professionals who charge over $100 per hour are not born that way; they have slowly built up their careers and reputations over time by ironing out weaknesses and sharpening strengths.

One characteristic that will take your career far is the ability to keep learning constantly. It is often seen that professionals who stop learning stagnate in their careers and lose out to the competition. If you are to become successful and do justice to the Search Engine Optimization profession, you must keep in touch with the latest information on the search engine industry.

Secondly, if you plan to start your own Search Engine Optimization business it will
be a good idea to sharpen your selling skills. You may only get a couple of minutes
to convince your prospective client once he/she is in front of you. These moments
are crucial and you must say the right things at the right time by creating an
effective sales pitch.

Some good websites to keep you abreast of the latest developments in the
SEO industry are:

http://www.searchenginewatch.com/

http://www.searchenginejournal.com/

http://www.seochat.com/

http://www.seoinc.com/

http://www.seroundtable.com/

http://www.searchengineland.com/

http://www.selfseo.com/

http://www.pagerank.net/

http://www.sitepoint.com/

http://www.seo.com/

http://www.seobook.com/

http://www.seobythesea.com/
http://www.searchengineguide.com/

Ethical responsibility of the Search Engine Optimization Professional

Search Engine Optimization professionals have a responsibility to their clients as well as to society. SEO professionals are on their honor to uphold certain ethics and to perform their duties to their clients and employers according to a set code of ethics that they must keep in mind at all times. The foundation of this code of ethics revolves around the basic duties of the Search Engine Optimization professional towards the client, the practices to be followed during work, protecting the rights and interests of the client, maintaining professional relations with the client and performing one's duty toward the profession.

As a certified Search Engine Optimization Professional, you must adhere to the following code of ethics:

(A) The Relationship with the client

1. The Search Engine Optimization professional must ensure that the best advice and guidance are provided to the client, to the best of his/her ability.

2. Privacy of the client: The Search Engine Optimization professional must protect the personal information provided by the client, whether website information such as FTP details or business information. In the eventuality of the information being passed on to any Government authority, the client must be informed immediately.

3. Misguiding the client: The Search Engine Optimization professional must not
abuse his/her position to influence the client to incur any expense that is of no
benefit to the client's website or business.

4. The contractual relation: The Search Engine Optimization professional must adhere to the contract signed with the client and is bound to act honestly and in a trustworthy manner with the client at all times.

5. Discrimination: The Search Engine Optimization professional must not discriminate between clients on the basis of caste, creed, education level or any other distinguishing factor. All clients must be provided with the same level of service at the same price.

(B) The Conduct of the Search Engine Optimization Professional

1. The Search Engine Optimization professional should remain within the domain of knowledge acquired during education and certification, and should not experiment with the client's website.

2. The Search Engine Optimization professional must always act in a responsible manner on issues related to the client's website. Avoiding any form of damage to the website's rankings or reputation should be a prime concern for the Search Engine Optimization Professional.

3. The Search Engine Optimization professional must take the initiative to improve
and update his/her knowledge so as to incorporate the latest SEO techniques to the
client's website.

4. The Search Engine Optimization professional must not misrepresent or advertise himself or herself in any untruthful way with respect to his/her education, skills or experience.

5. The Search Engine Optimization professional should not try to promote any
product or service to the client simply because the selling company will pay
commissions. Since the Search Engine Optimization professional is in a position to
influence the client, it is very important to keep in mind that the welfare of the
client is far more important than any additional monetary benefits that may accrue
to the Search Engine Optimization professional. All recommendations should solely
be based on benefits derived by the client.
6. In case of any difference of opinion or interest with the client, the Search Engine
Optimization professional must make a decision that is in the best interest of the
client. If there is a conflict of interest with the client which is likely to affect the level
of service provided by the Search Engine Optimization professional, it is advised to
discontinue the contractual relationship to avoid harm to the client.

(C) Conduct at the work place and towards society

1. The Search Engine Optimization professional must cooperate with his/her colleagues at the work place in terms of offering help and exchanging information. The Search Engine Optimization professional must behave in a manner that is honest and in good faith with all colleagues.

2. The Search Engine Optimization professional must bring to the notice of management or higher authorities any activity that is not in the best interest of the client. This could extend to passing on information regarding the malpractices, unethical behavior or misdeeds of any colleague at the workplace.

3. The Search Engine Optimization professional must always respect the law and
make sure that he/she is complying with the statutory requirements of the State
Government. The Search Engine Optimization professional must also ensure that
he/she does not perform any activity that could lead to legal complications.

4. The Search Engine Optimization professional must endeavor to further the profession and to help promote the benefits of SEO for the good of other individuals and websites.

CHAPTER 2
2 Introduction to SEO and what it involves

Search Engine Optimization Terminology

Algorithm - The mathematical formula a search engine uses to determine rankings. Algorithms are changed frequently to prevent anyone from cracking their code.

Anchor Text - The words visible in a hyperlink to a webpage.

Backlinks - Also referred to as Inbound Links. These are links pointing to one's
website from outside sources or other pages on the site.

Black Hat - Optimization techniques looked down upon and discouraged by the
search engines.

Bot - A search engine robot or spider that search engines use to find, read, and
report back on websites in order to create entries on them in indexes or databases.

CPA - Cost per Acquisition: The total cost of an online sale or lead.

CPC - Cost per Click: The amount an advertiser pays for each click on its ad in a PPC campaign.

Crawl - The search engine spider's act of finding information on web pages.

CTR - Click-Through Rate: A ratio measure of online users who click on an ad to the
number of ad impressions delivered (the number of times the ad was shown). For
example, if an ad was delivered 100 times online, and 7 users clicked on that ad, it
would have a click-through rate of 7%.
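As a quick worked sketch of that formula (illustrative only, not part of the original courseware):

    def click_through_rate(clicks, impressions):
        # CTR expressed as a percentage: clicks per 100 ad impressions.
        if impressions <= 0:
            raise ValueError("impressions must be positive")
        return 100.0 * clicks / impressions

    print(click_through_rate(7, 100))  # 7.0, matching the example above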

DMOZ - Directory Mozilla: A human-reviewed website directory whose listings appear in many search engines. A DMOZ listing is rumored to boost rankings in Google's SERPs.

Keyword/phrase - A word or phrase that an online searcher may enter in a query. The results from the query relate back to the submitted keyword.

Meta Tag - Meta tags give search engines further details about a webpage, helping them to categorize it properly and supplying content for its listing in the SERPs. They include keywords, titles, and descriptions, and are placed in the HTML, invisible to online searchers.
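As a small illustration of where these tags live, the sketch below (assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder) pulls the title and description meta tag out of a page's HTML:

    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # The <title> element typically supplies the SERP listing title.
    title = soup.title.string if soup.title else None

    # The description meta tag typically supplies the SERP listing description.
    tag = soup.find("meta", attrs={"name": "description"})
    description = tag["content"] if tag else None

    print(title, description, sep="\n")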

Organic Results - Commonly referred to as Natural Results: the listings in a search engine results page (SERP) that the listed website has not paid for. These are the websites that the search engines, each through their own algorithm, consider relevant in regard to the search term.

Paid Inclusion - A marketing technique where a search engine charges a website in exchange for including it in the search index. Most of the major search engines provide some sort of paid inclusion program.

PPC - Pay-Per-Click Advertising: A form of online advertising where the client pays
only for the clicks on its ads which send searchers to a specified website. A variety
of factors are included in the click price including the market rate, competition,
keyword popularity, and landing page quality. Some popular PPC programs are
Google AdWords, Yahoo Search Marketing, and Microsoft adCenter.

PR/Page Rank - Google's algorithm to rank websites based on their quality and
relevance to a search term.

Rank - How a website compares with others in position on the SERPs. The lower the
rank number, the higher up in SERPs your website will appear.

ROI - Return on Investment: A measure of the effectiveness of investment spending.

Search Directory - A directory that organizes websites by category and subcategory. Unlike search engines, web directories do not show search results based on keywords. Rather, websites are usually placed in one or two categories based on the entire website. Some directories are very general; others are centered on specific niches.

Search Engine - A system designed to easily find information from various sources,
including web pages, images, databases, and directories. Search engines use
mathematical algorithms to manage their systems.

SEM - Search Engine Marketing

SEO - Search Engine Optimization

SERP - Search Engine Results Page: This page displays the search results of an
online query based on the search term. The organic search results are listed by
rank, and paid ads are usually listed on the side.

URL - Uniform Resource Locator: The website address.


White Hat - Optimization techniques that search engines approve of and promote.

What is the importance of search for websites and how can SEO save valuable
dollars in advertising expenses?

It is estimated that the online advertising spend in the UK will be $47 billion by the year 2010. The success of any website is proportional to the number of visitors to the site. Any site, no matter how well designed or developed, is a failure if it does not attract a sufficient number of visitors to make it commercially viable. Statistics show that traffic from search engines accounts for as much as 70% of the average traffic for websites. SEO can help webmasters and site owners vastly increase traffic from search engines without increasing advertising expenses. Depending upon their line of business, websites pay as much as $50 or even more per visitor on paid search engine listings. Therefore, free traffic through SEO efforts can actually be more valuable than you might imagine.

Moreover, most SEO efforts have a long-lasting impact on site traffic, as opposed to paid listings that go off the air as soon as the budget is exhausted. Believe it or not, 100 visitors for a keyword such as "New York lawyers" or "New Jersey lawyers" could actually be worth $5,000 in cost-per-click search engine listings such as Google AdWords or Yahoo Search Marketing Services.
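A back-of-the-envelope sketch of that comparison, using the per-visitor figure quoted above (real click prices vary widely by keyword and market):

    # Value of organic visitors priced at an equivalent pay-per-click rate.
    cost_per_click = 50.00    # dollars per click, the high-end figure cited above
    organic_visitors = 100

    print(f"${cost_per_click * organic_visitors:,.2f}")   # $5,000.00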

Which Search Engine should I concentrate on while performing SEO Activities?

There are hundreds of search engines, but only a handful cater to the bulk of the billions of searches conducted monthly. It therefore makes sense to focus your SEO efforts on the following search engines, which together cater to more than 80% of all searches undertaken on the internet.

The following is the approximate core search traffic share of the major search engines in the total worldwide search:

Google - 65.9% of overall search

Microsoft - 15.1% of overall search

Yahoo - 14.5% of overall search

Ask Network - 2.9% of overall search

AOL - 1.6% of overall search

This does not mean that it is a waste of time targeting other search engines, but the fact is that by targeting the top three, you are effectively addressing around 95% of the overall worldwide search. The remaining search volume is catered to by dozens of smaller players, the largest of which are the Ask Network and AOL. Another fact is that many of the smaller search engines use website information drawn from the larger engines. Therefore, the chances are that you would already figure in most of the smaller search engines if your site is listed in the big three. That said, the importance of directories cannot be ruled out, especially of DMOZ. It is wise to get listed in as many directories as possible, as they are looked at favorably by most search engines. A DMOZ directory listing should be a priority, as it gets a website automatically indexed by Google.

2.1 Who can or should perform SEO activities


SEO activities cover internal efforts related to the website, such as optimization and maintenance of the design, HTML changes and content layout, as well as external (often paid) efforts, which relate to link building, submission to search engines and directories, and creating partnerships for increasing in-bound links to the site. The person responsible for undertaking SEO should have access to, and the capability to handle, design aspects of the site related to:

 Changes in HTML (page titles, page headers, meta tags and text in the webpages)
 Changes in the site layout of pages

The person is also required to undertake the following activities:

 Browsing the internet and searching for link partners
 Locating directories, free listing areas and forums for adding the site details

Why is it a good idea to undertake SEO activities by yourself?

Since search engines follow a definite programmed way of working and ranking
websites, SEO techniques and skills can be learnt and practiced by anyone. There
are several compelling reasons why you should not outsource your SEO activities to
search marketing or SEO companies:

 More often than not, companies charge large fixed monthly fees for undertaking SEO activities. Undertaking SEO activities yourself helps you build your knowledge base for the future, and your own efforts are free.
 It is difficult to monitor what or how much SEO activity is going into the website, since it takes time for the results to show in most search engines. As a result, you could be paying an SEO company for little to no work.
 Many SEO companies ask customers to pay on an on-going basis for the positive search engine ranking results to persist over time. This often leaves websites bound to keep paying or face the eventuality of a loss in ranking/traffic.
 SEO companies may resort to undesirable or banned SEO techniques that could get the website blacklisted by major search engines. It is safer to trust your own judgment regarding which sites you should link to and which SEO techniques you should adopt.
 SEO is not a complex science that can only be mastered by a select few. Using the correct study resources, anybody can learn and incorporate SEO techniques for optimizing websites.
 Many SEO companies require their customers to furnish the website FTP details to allow changes to be made in the site HTML. This could lead to unwanted or incorrect alterations to the website, or even deletion of crucial files or folders.
 It is common for SEO activities outsourced to companies to make a poor ROI (return on investment). SEO companies may charge as much as $300 per month for optimizing a website for a single keyword. For the same money you could purchase up to 3,000 click-throughs in a pay-per-click listing on Google AdWords or Overture (Yahoo Search Marketing).

What are the areas of operation for Search Engine Optimization Professionals?

SEO encompasses several areas of operation. Some of these have been mentioned earlier as well:

 Discussing the client's objectives, keeping in mind the budget
 Understanding the business model of the website
 Website domain name (deciding the domain name)
 Webpage names and file/directory names (deciding the page names and file names)
 Deciding page titles, headers and sub-headers
 Choosing and embedding relevant keywords throughout the website
 Making alterations to the copy matter of the website
 Making changes to the flow and layout of the website and optimizing internal links
 Making design changes
 Making the site search engine readable by working with meta tags and robots.txt files (a small robots.txt sketch follows this list)
 Monitoring changes to the website in the SERPs of major search engines
 Developing strategies for increasing in-bound traffic
 Monitoring changes to search engine algorithms
 Studying competing websites and learning from their SEO efforts
 Handling customer inquiries and problems
 Providing timely ranking reports to the customer
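To make the robots.txt point concrete: robots.txt is a plain-text file placed at the site root that tells crawlers which paths they may read. Here is a minimal, hypothetical sketch in Python (the disallowed path and sitemap URL are placeholders, not recommendations):

    import textwrap

    # Illustrative only: write a robots.txt that lets all crawlers read the
    # whole site except a placeholder /private/ area, and point them at a sitemap.
    ROBOTS_TXT = textwrap.dedent("""\
        User-agent: *
        Disallow: /private/

        Sitemap: https://example.com/sitemap.xml
        """)

    with open("robots.txt", "w") as f:
        f.write(ROBOTS_TXT)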
2.2 Finding Clients for your SEO Business

Personal Connections/Local Community

For someone just starting out in the SEO industry, the best way to find clients is to
look at your personal connections and your local community. In the local area,
there may be less competition or no competition at all. Start by looking at
successful area businesses, analyzing their websites, and determining which ones
could benefit most from SEO services. Look at titles, tags, and backlinks. You can
even come up with a list of keywords you believe will be most beneficial for the site,
looking both at business and domain names. Finding the websites' ranks, if
possible, is also a good idea.

One can begin to contact businesses after this breakdown of the websites: explain
what you do, ask how satisfied they are with their websites, and inquire into which
areas need improvements. You can then offer suggestions on how you could help
the website.

If a business wants to talk about it further, make sure you are prepared; if not,
leave a business card in case the company later decides it needs your services.
Additionally, offering services pro bono to local clubs and charities is a good way to
gain experience, build a portfolio, and make connections.

Networking

Beyond a good rank, good reputation and trust can gain you valuable clients. Word-
of-mouth can be very effective. The easiest way to network is to attend industry
conferences and get to know more professionals. Beginners and veterans alike
attend these conferences, and you may just meet a few who are willing to give you
business. Conferences are also great places to build personal relationships with
clients themselves. Organizations that attend such conferences generally are
interested in obtaining SEO services and therefore provide valuable opportunities.
Some established SEO professionals find themselves with too many clients and are
looking to refer them to other credible professionals.

Website/Blog

Having a website where potential and current clients can contact you or follow up
on services is essential for SEO professionals of all experience levels. Starting your
business online rather than just locally is important because many potential clients
logically seek out SEO services on search engines. After all, an SEO business by
nature should be able to achieve high search engine rankings and attract clients in
this way. Additionally, setting up and optimizing the website for your own SEO
business will gain you valuable experience. Blogs especially are a simple and
convenient way to promote your services as well as your main website. Your blog
should have a link to a contact page and a list of services you offer. Maintaining the
blog is important to ensure your sites are frequently indexed. Blogging about
industry trends and your specialty areas will help you stay current, and discussion
of popular topics will keep readers coming back. This can attract a wide variety of
clients you may not otherwise have had access to.

Online Services

Several online sites and communities serve as channels for businesses to meet
professionals and contract work. Using these avenues can increase business and
add traffic to your main website. Forums.digitalpoint.com provides for discussion
on numerous topics, and there is a section specifically for SEO. Forums can be
particularly beneficial for gaining clients. If you answer people's questions
effectively, prove your knowledge, and gain trust without actively promoting your
business, many forum users may start offering you real SEO work. Sitepoint.com
has a marketplace where users can view listings and advertise services. Not only
can you market your specific SEO services, you can also sell documents and guides
to those who want to optimize their own sites. Consequently, these individuals may
request additional services.

vWorker.com allows professionals to bid on projects relating to custom software, writing, graphic design, and other services. By subscribing to the newsletter, you can view the daily bid requests of the site's registered buyers. Elance.com, oDesk.com, and Guru.com help companies find freelance talent and hire contractors. All of these websites facilitate SEO professionals in meeting clients and give an idea of what opportunities exist. Each of these sites allows freelancers to create profiles for free. Companies search these profiles, post projects, and view free quotes. After the contracted work is completed, clients rate the freelancers and give reviews. These evaluations are then available on the freelancers' profiles.
2.3 Selling your business through SEO

ROI: SEO versus Advertising

Search engine optimization usually generates a higher return on investment than advertising because of its more sustainable revenues and less substantial costs. The return for the two strategies is measured by increases in sales leads, while the investment factors in implementation and maintenance costs. Exact rates of return on investment, in terms of sales, are difficult to define and compare for either tactic because of differing variables across industries. Furthermore, ROIs for SEO are particularly hard to compute, in view of the fact that SEO sales leads are hard to measure and continue even after investments cease.

Return

Although advertising can show a more immediate and measurable return on investment, search engine optimization is a sustainable investment that generates revenue long after payments for SEO services have ended. SEO can be thought of as an investment in business infrastructure that will continue to produce results well into the future. There is no known end to the benefits. However, when advertising funds run out, so do the exposure and the revenue from the ads. Additionally, fundamental differences in the way the two strategies attract searchers favor SEO. Internet users exercise generous freedom when deciding where to go online. Searchers easily avoid advertisements and usually click on the more relevant search results. SEO can improve SERP rank through link popularity, magnifying this effect.

Investment

Online advertising costs are simple and straightforward: most advertisers pay the publishers (website owners) for every click on an ad (PPC), resulting in costs of hundreds to thousands of dollars a day. Offline costs include fees for billboards, print ads, and TV commercials. SEO costs to generate sales leads can be more complicated, requiring a modest initial investment and smaller follow-up costs to keep up with changes in the industry. However, most companies find that these costs, often starting around $3,000, are much smaller in comparison to costs incurred on other forms of advertisement. Overall SEO costs depend on a variety of factors, including online market competitiveness, keyword popularity, amount of content, and company age. There are both direct and indirect costs associated with SEO:

Direct costs include hiring an SEO firm, training in-house staff, and obtaining
technical resources. More substantial costs arise if optimization is initiated after the
website's launch, requiring rework of the site's technical infrastructure, design, and
layout.

Indirect costs include opportunity costs, displaced priorities, and time-to-market costs. Opportunity costs can be characterized as missed opportunities, failed implementations and not optimizing wherever possible. Displaced priorities occur when an outside team is not hired and in-house staff is instead trained for SEO purposes. This may cause the staff to neglect previous duties. Time-to-market costs represent lost traffic due to the slow implementation of search engine optimization.

CHAPTER 3
3 What are Search Engines & Directories

As a Search Engine Optimization professional, you must be very clear about what search engines exactly are and how they work. This chapter discusses the working of search engines from the point of view of SEO.

What is a Search Engine?

In simple terms, a search engine may be described as an automated computer program that enables one to search information, documents, or databases available on the internet. Search engines provide a smart and swift way of finding information on the internet. When a search term or a keyword is entered into a search engine, it returns the most relevant and matching webpages that contain similar search terms and keywords. Google, MSN Search, Yahoo, AOL Search, AlltheWeb.com, Ask Jeeves, Excite, Lycos, Yahoo Search Marketing and AltaVista are some of the famous search engines. Search engines are probably the only forum on earth that provides information on almost anything and everything, and that too in a couple of seconds on your desktop.

From the SEO perspective, search engines can be considered huge databases containing website information, which offer visitors the facility to query the database based upon their search terms. Search engines gather data about websites with the help of specially designed software called search engine spiders (also known as robots, crawlers or bots). This software scans the world wide web looking for new websites to add to the existing database of sites, and tracks changes to websites that the search engines have already indexed. From the SEO perspective, we can look at search engines as intelligent software that has been programmed to report website information in a meaningful manner based on search criteria. Since a search engine is software, a search engine optimization professional can learn how it has been programmed to think. Accordingly, the SEO professional can take action on websites so that they are viewed favorably by the search engine when it displays search results.

Brief History of Search Engines

Many of the first efforts at organizing information online were put together by university students. One of these early attempts, called Archie, was created in 1990 by a student at McGill University in Montreal, while another, called Veronica, came out of the University of Nevada. They were both indexed manually. These systems worked by running searches on a database of web filenames. After Tim Berners-Lee's creation of the World Wide Web in 1991, Matthew Gray created the first internet robot, a program which could execute tasks at speeds beyond human ability. He called the bot the World Wide Wanderer, and it was meant to measure the growth of the WWW. Online directories such as Galaxy and Yahoo! emerged in the mid-1990s. Yahoo! provided brief page descriptions in addition to the URLs. Next emerged Lycos, innovative in providing ranked relevance retrieval of millions of documents. Lycos was followed by Infoseek and AltaVista. In 1997, AskJeeves.com was launched with a new form of search using human editors to match queries. MSN Search joined the competition in 1998.

Sergey Brin and Larry Page created Google for a project at Stanford University in
1997. Google brought innovation to the search engine industry with a system to
rate websites, called PageRank. The search engine has since become the pioneer in
search innovation. It has succeeded in predicting and providing for the wants and
needs of internet users. The company has many vertical search services, including
Google News, Google Book Search, and Google Video, and it offers other
information services such as email (Gmail), maps (Google Maps), and calendars
(Google Calendar). Google has also gained vast brand exposure through its key
partnerships and large investment in advertising.

Various components of the SERPs (Search engine results pages) with special
focus on Google.

As a Search Engine Optimization professional, you must start looking at the search
engine listings with an eye for detail. Since what is displayed in the SERPs is the
moment of truth for the SEO professional who has been optimizing a website or a
webpage, it is all the more necessary to understand various components:

Below is a screenshot of the Google results page when the term "insurance lawyer" is searched for:


Notice the marked parts of the screenshot of the Google SERP (search engine
results page) for the term "insurance lawyer". Google is arguably the best search
engine with incisive algorithms that enable it to generate precise results that match
with the search keywords.

Notice the following important aspects in the above Google screenshot:

1. The Title of the listing in the SERPs has been marked by “A”. The contents of the title come from the title of the webpage.

Google displays up to 66 characters of a webpage's Title Tag as the title of the listing in the SERP. Google ensures two things:

 That there are no more than 66 characters in the title line.
 That the last word of the title line is a complete one, even if it means reducing the number of characters to less than 66.
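As an illustration of that truncation rule, here is a minimal sketch of the behavior described above (not Google's actual code):

    def serp_title(title, limit=66):
        # Truncate a page title at a word boundary, per the rule described above.
        if len(title) <= limit:
            return title
        cut = title[:limit]
        # Drop the trailing partial word so the last visible word stays complete.
        return cut.rsplit(" ", 1)[0] if " " in cut else cut

    # 68 characters, so the truncated title ends at "...Injury", not "Clai".
    print(serp_title("Insurance Lawyer - Free Consultations For Accident And Injury Claims"))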

2. The Description of the listing in the SERPs has been marked by “B”. The contents
of the description come from the Description meta tag of the webpage.

3. The searched term or parts of the searched term are in bold. Each SERP contains
ten listings on each page.

4. The number of results found by Google to contain the search term has been
marked by "C". For example, if you search for the term "insurance lawyer", Google
finds 58,600,000 incidences (the number may vary) of the presence of the term
"insurance lawyer".

5. “D” is the time (shown in seconds) taken by Google to produce the results on the search engine results page.

6. Sponsored Links have been marked as "E". Sponsored Links are the links that you
find on the top of the page and the right-hand side column of each Google SERP.
These are paid text advertisements for which Google charges a fee for every click.
"Insurance lawyer" is one of the most expensive search terms available. Note that
insurance.lawyers.com appears three times in the top five natural search results
and nowhere in the sponsored listings. This is a good example of how effective SEO
techniques can create value for a website.

7. The address of the page where the user will be taken on clicking the link is represented by "F".
8. On the left side of the page is a search section that allows the user to narrow
results to the news, by location, or for shopping results among other selections as
shown in "G". Users can also select related searches and more advanced search
tools.


9. 'Cached' has been marked as "I" and refers to a snapshot of the complete web
page (matching the search term) last crawled by Google. This appears when you
hover over a search result and click on the arrows that appear on the right side of
each listing (marked as "H"). The main feature of this page is that you can see a
page preview before you click the link and it highlights all or a part of the search
term.

You can use the cache feature to quickly see how many times the searched
keyword is present within a webpage.
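You can do a similar quick check programmatically. Below is a minimal sketch (it assumes the third-party requests and beautifulsoup4 packages; the URL and keyword are placeholders) that counts occurrences of a keyword in a page's visible text:

    import re

    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/"   # placeholder page to check
    keyword = "insurance lawyer"   # placeholder keyword

    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ")

    # Case-insensitive count of the exact keyword phrase in the visible text.
    count = len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))
    print(f"'{keyword}' appears {count} time(s)")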

10. 'Similar' (marked as "J") provides a list of pages indexed by Google containing
information similar to that contained on the website that you had a preview of.

Some other interesting features of Google Search Engine Result Pages:

 Google search results may include a link to a location map when searching for local results.
 Google can tell the current time of places around the world.
 You can use Google to find what things mean (definitions).
 A search box may appear within an organic search result.
 Related searches are suggested.
 Google can convert units.

Below is a screenshot of the Yahoo results page when the term "insurance
lawyer" is searched for:
Notice the following important aspects in the above screenshot of Yahoo
Directory Search Engine Result Page:

1. The Title of the listing in the SERPs has been marked by "A". The contents of the
title come from the title of the webpage.

2. The Description of the listing in the SERPs has been marked by "B". The contents
of the description come from the Description meta tag of the webpage.

3. The searched term or parts of the searched term are in bold. Each SERP contains
ten listings on each page.

4. The number of results found by Yahoo to contain the search term has been
marked by "C". For example, if you search for the term "insurance lawyer", Yahoo
finds 135,000,000 incidences (the number may vary) of the presence of the term
"insurance lawyer".

5. Sponsored Links have been marked by "D". Sponsored Links are the links that
you find at the top of the page and on the right-hand side column of each Yahoo
SERP. These are paid text advertisements where Yahoo charges a fee for every click.

6. “E” is the actual landing page that a visitor will arrive at once they click on the link
in the search engine results.

At the end of the day, all SEO efforts boil down to what people see in the search engine results pages. It is therefore essential that you know where the search engines pick up the various parts of their listings, because this is the only part of your website that people will get a glimpse of while on the search engine.

The following table shows where various parts of the SERP listings come from
for various search engines:

Where do the main search results, pay per click search results and Directory
results come from?

Search engines make use of information from several sources apart from their own
database of information. As a Search Engine Optimization professional, you should
know from where various search engines are picking up data that make up various
parts of their search results. The following terms should be clear:

 The main search results: The results that we see in the SERPs; these constitute the bulk of the results for any search engine.
 Pay-per-click search results: Search results that appear because advertisers pay the search engine to get a listing in the results, or to have a textual advertisement displayed next to the search results.
 Directory results: Many search engines maintain a directory of websites apart from the regular search engine feature. The listings within the directory are referred to as directory results.

The following table depicts the relationship between the various search engines
and directories:

Do Search Engines search the whole web while performing a search operation?

No, as discussed above. Contrary to common misconception, search engines do not actually search the internet; instead they search the documents that they have indexed and included in their databases. Search engines generate information only about those titles, phrases, keywords, subjects, or topics that are stored in their respective databases. Although there is no way to determine exactly how many web pages there are on the internet (because there are so many different web hosts and registrars), a Google engineer reported in 2008 that Google knew of over 1 trillion unique URLs on the world wide web.

3.1 Types of Search Engines

Crawler-based Search Engines: As the name suggests, crawler-based search engines use a spider or crawler to search for new webpages (or changes in existing pages). Google and Yahoo are the most powerful crawler-based search engines.

Human-Powered Directories: Human-powered directories rely on humans for reviewing and selecting the webpages. The Open Directory Project, Yahoo Directory, Google Directory, LookSmart, and EuroSeek are some of the most popular human-powered directories.

Hybrid Search Engines: Hybrid search engines use both crawlers and a directory for generating relevant results. Google, ExactSeek, Lycos and AltaVista are hybrid search engines which use a directory to supplement their own search engines.

Meta Search Engines: Meta search engines are those that search other search engines and directories. They extract the best of the searches from various popular search engines and directories and include the information in their own search results. Dogpile, WebCrawler, Excite, MetaCrawler, and Ixquick are some examples of meta search engines.

How do search engines work and how do they rank websites based upon a
search term?

As mentioned above, search engines consist of huge databases of websites that are queried based upon the search term entered by a user in the search box. The job of the search engine optimizer (or SEO professional) lies in understanding how the search engine ranks websites in the results pages, and then taking the necessary action to improve the websites' current rankings. Every search engine makes use of a search algorithm (or formula) to query its database of websites. This algorithm defines the factors which determine the relevancy of a website when a search is conducted. The algorithm differs from search engine to search engine and is a carefully developed formula that allows the search engine to display the websites most relevant to a search term, in order of relevance. Most companies, including Google, keep their search algorithms closely guarded secrets.

Search engines are programmed to seek out evidence that points to a connection (or relation) between the searched term and the webpages existing in their databases. Some of the important factors that are included within the algorithms of most major search engines (including Google), and that make up part of the evidence pointing to relevancy between the search term and a web page, are listed below (a toy scoring sketch follows the list):

 Whether the page title of the web page includes the searched term.
 Whether the header of the web page includes the searched term.
 Whether the body of the webpage includes the searched term, and how many times it appears within the webpage.
 The presence of the search term within pages of other sites that link to the webpage.
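To make this concrete, here is a deliberately simplified toy scorer over exactly those four factors. The weights are invented for illustration only; as noted below, real engines use dozens of undisclosed, evolving parameters:

    def toy_relevance_score(term, title, header, body, inbound_anchor_texts):
        # Toy relevance score for one page; the weights are illustrative only.
        term = term.lower()
        score = 0.0
        if term in title.lower():
            score += 3.0                              # term in the page title
        if term in header.lower():
            score += 2.0                              # term in the page header
        score += 0.5 * body.lower().count(term)      # term frequency in the body
        score += 1.0 * sum(term in a.lower()          # term in inbound anchor text
                           for a in inbound_anchor_texts)
        return score

    print(toy_relevance_score(
        "hypnosis ebook",
        title="Hypnosis Ebook - Learn Self-Hypnosis",
        header="Download the Hypnosis Ebook",
        body="This hypnosis ebook teaches self-hypnosis step by step.",
        inbound_anchor_texts=["hypnosis ebook", "great ebook"],
    ))  # -> 6.5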

The above list is in no way exhaustive. Search engines like Google have several dozen parameters upon which their search algorithms are based. The role of the Search Engine Optimization professional is to understand the well-known factors that make a considerable difference to the search results of the web page, and then to use optimization techniques on the web page so that it is viewed favorably by the search engine algorithm and gets a better ranking in the SERPs (Search Engine Result Pages) for a keyword/search term. Always keep in mind that it will never be possible to know the exact logic used by the major search engines, because it is never made public and because it is always evolving.

What are Directories and how do they differ from Search Engines?

Directories are sorted listings of websites that are maintained and approved for listing by human editors. Directory databases comprise websites that have been manually submitted by website owners or editors. Unlike search engines, which make use of automation through spiders and bots for discovering new websites, directories make use of human intelligence to ensure that each new site submitted meets certain standards and is placed in the appropriate category within the directory. Some of the important directories from the point of view of search engine optimization are:

1. DMOZ (http://www.dmoz.org/)

2. Yahoo Directory (dir.yahoo.com)

3. LookSmart Directory (www.looksmart.com)

A good site for locating directories is www.allwebdirectories.com, which lists several directories by topic. Most directories offer free listings. It is a good idea to get listed in as many as possible, since this is viewed favorably by most major search engines while ranking a site in the search results.

Difference between Search Engines and Directories

Searches can be conducted by using two different modes - search engines and
directories. However, the two work in completely different ways.
What is the Open Directory Project and how can a listing in DMOZ boost my
SEO efforts?

The Open Directory Project is the world's largest volunteer-based directory of websites, organized, maintained and reviewed by chosen directory editors who work as volunteers. The Open Directory Project is available at http://www.dmoz.org/ and is one of the first places where you should submit your website. Sites are submitted to DMOZ free of charge at http://www.dmoz.org/add.html, and are placed in an appropriate category after being reviewed by an editor who is a chosen specialist in that category within DMOZ. The importance of DMOZ can be judged from the fact that it supplies its directory listings to leading search engines such as Google, AltaVista, AOL, Go.com, iWon, MSN and Netscape.

The DMOZ directory listing is organized alphabetically, so a website named eqtest.com will figure above a website called iqtest.com. Since DMOZ is such an important place to get listed, many website owners actually choose domain names beginning with numbers, such as 1eqtest.com, so that they list high within the directory. This trick may not always work, since each site is reviewed by a real human being who may be able to see through the technique. Besides, domains without numbers in them are considered to be more valuable.

What is the importance of a Yahoo Directory listing?

Yahoo has its own paid directory listing at https://ecom.yahoo.com/dir/submit/intro/, which allows websites to get listed for one year at $299 ($600 for sites with adult content). Yahoo editors review all submissions within one week. The position of the site within the directory changes according to the number of people clicking on it, and it slowly moves up the ranks as more and more people click on the listing. The Yahoo Directory listings are used within the Yahoo search results, but spending the $299 has some other benefits that are usually overlooked:

 Getting a Yahoo Directory listing ensures that the website will quickly become available in the Google search results (within days), due to the agreement between the two mega search engines that allows Google to use the Yahoo Directory listing information. This technique can save new sites months of waiting to see their site in the Google SERPs.
 The Yahoo Directory passes PR (Google PageRank) to the listed sites. Most Yahoo Directory pages have a PR of 5 or 6. This is an excellent way of buying PR for your site from the world's number one visited site, and for this reason Google usually gives a PR boost to websites listed in the Yahoo Directory. You will read about the concept of Google PR later on.

What happens if Yahoo places you in the wrong category of its directory or on
a non-PR page?

If you had planned to get a Yahoo Directory listing to generate more traffic from Yahoo and to get ranking benefits from Google, you may come across a situation where your site is listed in a category which does not match your site's objectives. Or, you may encounter a situation where you expected a high-PR page but got a listing on a PR2 page. In such an eventuality, you can appeal to Yahoo within 30 days of getting listed, and you stand a good chance of being heard.

Do placements in PPC (pay-per-click) search engines help to increase a website's link popularity?

No. PPC search engines such as Overture and Google AdWords do not offer direct links to websites from their paid listings, or on websites that offer advertising space to PPC advertising partners. When someone clicks on a PPC search engine listing, the click first goes to the search engine for tracking click-through popularity, and is then re-directed to the advertiser's website. The same is true of affiliate networks such as Commission Junction and LinkShare.

How do you submit your site to search engines/directories?

You can either manually submit your site to search engines such as Google or Yahoo, or let the search engine spiders find your website automatically. If your website already has a link back from an existing spidered webpage on another website, it will most likely get indexed when the webpage linking to you is re-visited by the spider. Following are some of the places where you can submit your site to the various search engines yourself:

Google - http://www.google.com/addurl

Yahoo - http://search.yahoo.com/info/submit.html

Bing – http://www.bing.com/toolbox/webmaster

DMOZ - http://dmoz.org/add.html

While submitting pages to search engines, keep in mind that most engines have a maximum limit on the number of webpages they accept from any one website. It is a good idea to break up the page submission over a few days. If you feel the webpage has not been indexed even after 6-7 weeks, you can re-submit the URL. No search engine penalizes websites for re-submission of webpages unless it is done excessively. Keep in mind that it could take search engines like Google or Yahoo up to 6-7 weeks to index a website that has been submitted for inclusion. You can speed up the initial indexing of the site with one or more of the following techniques:

 Get a directory listing in DMOZ (which itself could take weeks).
 Get a link back from a quality site that has already been indexed. Websites that carry news or blogs are visited often by spiders; try to get links from such sites to speed up the indexing process.
 Get a paid directory listing on Yahoo.
 Submit your site to several free online directories.
 Always remember to be patient while the search engines index your site. Don't resort to any desperate measures such as re-submitting the site over and over again. The search engines will always take their time.

Factors to keep in mind while submitting a site to the search engines

 Ensure that the keywords and description meta tags have been crafted carefully. You will learn about meta tags later.
 Ensure that the pages are optimized and that the keywords are placed throughout.
 Add the site to the most relevant category. You can conduct a search for your keyword or check where your competition is listed in the directory.
 Ensure that your website is working properly. Some directory editors will check to see if the site is real.
 Ensure that there are no broken links (a small link-checking sketch follows this list).
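One way to automate that last check is sketched below (it assumes the third-party requests and beautifulsoup4 packages; the starting URL is a placeholder, and a production checker would also respect robots.txt and rate-limit its requests):

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    start = "https://example.com/"   # placeholder site to check

    html = requests.get(start, timeout=10).text
    links = [urljoin(start, a["href"])
             for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)]

    for url in links:
        try:
            # HEAD is cheaper than GET when only the status code is needed.
            status = requests.head(url, timeout=10, allow_redirects=True).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print("BROKEN:", url, status)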

How do you monitor the performance of your website in the search engines?

To check the ranking of your site in the SERPs, simply search for the keyword or keyword phrase in the search box of the search engine and check the rank of your optimized webpage. Keep working on your webpage until it reaches the first 10 positions, as most people don't go beyond the first page of the search results. If you have a large number of keywords, you can consider using software which will give you reports on the rankings of your keywords. Google does not approve of such software, but it is not taking any active steps against people who use it or companies who make it.
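The core computation behind such a report is trivial. Here is a minimal sketch that finds the position of your domain in an ordered list of result URLs, however that list was obtained (the domain and URLs are placeholders):

    from urllib.parse import urlparse

    def rank_of(domain, result_urls):
        # Return the 1-based position of `domain` in an ordered result list.
        for position, url in enumerate(result_urls, start=1):
            if urlparse(url).netloc.endswith(domain):
                return position
        return None  # not found among the results examined

    serp = ["https://example.org/a", "https://example.com/page", "https://example.net/b"]
    print(rank_of("example.com", serp))  # -> 2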

What is Search Engine Spam? Why should it be avoided?

Yahoo classifies search engine spam as "pages created deliberately to trick the
search engine into offering inappropriate, redundant, or poor-quality search
results." Some common spam techniques are hidden text, doorway pages, and
mirror sites. Hidden text is characterized by text that readers cannot see, such as
black text on a black background, but is still read by search engine spiders. This text
usually includes numerous keywords. Doorway pages are stand-alone, full-page
advertisements. When you click on a site in a search engine results page, a full-size
advertisement will pop up first that you have to look at before reaching your
desired page. Mirror sites are duplicates of sites that use different keywords. For
example, a winter sports website could have multiple duplicate pages with differing
keywords such as skiing, snowboarding, and snowmobiling. Search engines look
down on mirror sites because they effectively allow one website to occupy multiple
listings for the same content.

Search engine spamming does not serve the best interests of the consumer, can
lead to undesirable consequences for websites and their managers, and should
therefore be avoided. Search engines play an active role in detecting and
eliminating search engine spam. Leading search engines, including Google, Yahoo,
and MSN, have systems in place for users to report spam. Google's spam policy is
clear: "In especially egregious cases, we will remove spammers from our index
immediately, so they don't show up in search results at all. At a minimum we'll use
the data from each spam report to improve our site ranking and filtering
algorithms, which, over time, should increase the quality of our results." Search
engines take SEO spam very seriously, and it is important to avoid spam so that
your site does not drop in page rank or get banned. Reading different search
engines' statements of acceptable practices can help you determine whether your
SEO techniques are considered spam or not.

CHAPTER 4
4 Keywords - the key to successful SEO

What is the importance of keywords in SEO?

Keywords are probably the single most important aspect of all SEO activities. As
discussed earlier, search engines sift through indexes of websites available with
them based on complex algorithms that help match the search term with the most
relevant search results. Every search engine algorithm considers the keyword(s)
within a webpage as one of the most important factors in the ranking decision.
As soon as a search term is entered into the search box of a search engine, the
search engine immediately looks for webpages in its database that are most
closely related to the search term. One of the first and most obvious things it
looks for is whether the search term (the keywords) exists within the webpage.
This is why choosing the right keywords, and placing them correctly within the
webpage, is so important.

Here is an example to explain the meaning of keywords:

If you have a website selling a hypnosis eBook and you know that people find your
site by searching for the term "Hypnosis Ebook", then this search term is referred to
as a keyword for which you must optimize your website. By making changes to the
page (which you will learn later) such as including the keyword "Hypnosis Ebook" in
the title, header, body and meta tags of the web page, you will associate the web
page with the search term "Hypnosis Ebook". Now, when someone searches for a
hypnosis eBook in the search engines, your page will be viewed favorably and will
be ranked better in the SERPs. Some keyword related activities for the person
optimizing the page are:

 Choosing relevant keywords.


 Placing keywords throughout the site.
 Submitting or placing links on other sites using the selected keywords.
 Checking the keywords of competing sites and improving existing keywords for better
performance in the SERPs.

How do you search for the right keywords that will help bring in the most
traffic?

Now that you know your webpages should be optimized for a particular keyword or
keywords, the next important question becomes how to choose the right keywords.
One of the fundamental tasks of an SEO professional is to identify and then
optimize the website for keywords that will bring in the most targeted traffic to the
website. It is strongly advised that you optimize a single webpage for only 1-3
keywords, as the search engines tend to attach more importance to webpages that
are focused on a particular search term. To find the right keywords for your
website, use the following techniques.

Example: Your site sells IQ tests.

(A) Wordtracker keyword tool:


This keyword analysis tool from Wordtracker is available at
http://wordtracker.com. The tool gives you valuable information on the number of
people searching for various terms, and is by far the best keyword analysis tool on
the internet. A free trial is available; however, to keep the service you will need to
pay either $69 a month or $379 per year. If you search for the term "iq test", the
tool will display a list of related keywords (that include the term "iq test") along
with the number of people searching for them every month. The results will look
something like this:
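
Keyword             Searches per month
iq test             2593
free iq test        842
online iq test      139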

This information holds many secrets that can help boost visitors to your webpages.
As you can see, apart from the 2593 people searching for the term "iq test", you
also have 842 people searching for "free iq test" and 139 people searching for
"online iq test". By optimizing your web page only for "iq test", you would miss out
on about 1000 searches. There are two ways in which you can target the additional
traffic from the keywords "free iq test" and "online iq test": (1) Make two more
pages, and target one for "free iq test" and the other for "online iq test". (2) Include
the keywords "free iq test" and "online iq test" in the existing page which is already
optimized for "iq test". The first option is the recommended one, as it leads to a
better association between the web page and the keyword for which it is optimized.
The basic reason is that the keyword will appear alone in various parts of the page
such as the title, header and body, offering better keyword optimization and a clear
identity for the search engine. The second option, which places the additional
keywords on the same page, will be far less effective. Note: While it is
recommended that you create a new page for each major keyword, always keep in
mind that each web page should be at least 40% different from the others in terms
of content to avoid blacklisting by the search engines.

How do you make the most of micro searches?

When you search for the term "iq test" in Google, you are shown the following
information in the results bar - Results 1 - 10 of about 11,400,000 for iq test by
Google. This means that Google detected 11,400,000 documents containing
the term "iq test". Now when you search for a lesser keyword such as "fun iq
test", you are shown Results 1 - 10 of about 500,000 for fun iq test in the results
bar. A still smaller keyword search of "classic iq test" shows Results 1 - 10 of
about 459,000 for classic iq test. What does this mean from the point of view of
selecting keywords? Fewer existing webpages are optimized for the lesser
keywords, because these keywords tend to be ignored when webpages are
designed and optimized. This can be viewed as an opportunity: by targeting pages
at the lesser keywords, you can capture a part of the search traffic that the search
engine optimizers and website designers of other sites have left unaddressed. The
moral of this discussion, therefore, is to optimize your website for the lesser
keywords in addition to the major keywords.

(B) Check the keywords of competing websites

A good idea for selecting the best keywords is to simply search for the keyword you
are targeting in Google. You will come up with a list of the most relevant websites
for the keyword. Since Google has ranked these websites high in the SERPs, there is
every possibility that the sites have been optimized, or at least that a lot of thought
and research has gone into selecting and embedding keywords into them. Here is
what you should do to see what the best have done: Click and visit the webpages in
the first few search results in Google. Right click on the open webpage and choose
'view source' to read the HTML of the website. Look for the "keywords" meta tag at
the top of the HTML. You will see a list of keywords separated by commas. These
are the keywords which the website would like spiders and bots from search
engines to see in the meta tags. Pick out the keywords you think are relevant for
your site. Make sure not to select any trademarked keywords.

(C) Information generated from your website visitor logs

Check your server logs and see what people search for when they reach your site.
Visitor logs record the referrer URL of each visitor to your website, which contains
the search term entered into the search engine while looking for websites. The
following are details extracted from the ExpertRating logs. You can see that
someone searched for "get free online certificate of ms office" in the first log entry
and for "asp.net test" in the second and third.

1. IP address = 203.135.21.45
Referrer URL = http://search.yahoo.com/search?p=get+free+online+certificate+of+ms+office&ei=UTF-8&fl=0&fr=FP-tab-we
Landing page = /jobs/Secretarial-jobs/Office-Supervisor-jobs.asp
User agent = Mozilla/4.0 (compatible; MSIE 5.01; Windows 98)

2. IP address = 203.145.179.2
Referrer URL = http://www.google.co.in/search?hl=en&q=asp.net+test&btnG=Google+Search&meta=
Landing page = /aspdotnet-test.asp
User agent = Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)

3. IP address = 82.201.233.226
Referrer URL = http://www.google.co.in/search?hl=en&q=asp.net+test&btnG=Google+Search&meta=
Landing page = /aspdotnet-test.asp
User agent = Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.8) Gecko/20050511 Firefox/1.0.4

4. IP address = 61.9.101.160
Referrer URL = http://www.google.com.ph/search?hl=en&q=pre-school+teaching+jobs&meta=
Landing page = /jobs/Teaching-jobs/Preschool-Teacher-jobs.asp
User agent = Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; FunWebProducts)

5. IP address = 131.251.42.213
Referrer URL = http://www.google.co.in/search?hl=en&q=asp.net+test&btnG=Google+Search&meta=
Landing page = /aspdotnet-test.asp
User agent = Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.7.12) Gecko/20050923 Fedora/1.7.12-1.5.1

6. IP address = 62.3.240.46
Referrer URL = http://www.google.co.in/search?hl=en&q=asp.net+test&btnG=Google+Search&meta=
Landing page = /aspdotnet-test.asp
User agent = Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)

(D) Get a Google AdWords account

If you sign up for a Google AdWords account at
http://www.google.com/adwords, you get access to a keyword analysis tool that
gives you an indication of the number of searches for any given keyword, as well as
related keyword searches. Considering that Google could account for up to 50% of
all your traffic, and given that the Google AdWords sign-up fee is only $5, this could
be an excellent resource for choosing appropriate keywords for your site. You can,
of course, later run a Google advertising campaign using the account and attract
paid traffic to your site.

The following is a screenshot of the Google AdWords toolbar:


(E) Other keyword research engines and websites

Even though the Overture keyword suggestion tool should suffice for most of your
free keyword analysis requirements, you can also visit other sites:

http://www.optify.net - does keyword and competition analysis

http://keywords.submitexpress.com/ - free keyword suggestion service that allows
up to 15 searches per day

https://adwords.google.com/select/KeywordToolExternal - free keyword suggestion
service from Google

What should be your strategy for choosing keywords and getting more traffic
from the same search term?

The keyword analysis tools will give you a good idea of the number of searches for
various keywords. It is obvious that you will optimize webpages for the most
searched keywords; however, a simple Google search of these prized keywords will
instantly reveal that there is intense competition, and that several leading/acclaimed
websites are hogging the first 10-15 listings in the Google SERPs. A practical
assessment of the situation will quickly make you realize that it will not be easy to
rank alongside the big names quickly. Therefore, you must hunt for the lesser
searched keywords and optimize pages for those keywords as well. A keyword like
"IQ test" may have 2593 searches in the Wordtracker tool; however, people are also
looking for "free IQ test" (842 searches) and "online IQ test" (139 searches).
Optimizing your site for these lesser searched keywords could see you getting
immediate traffic, rather than waiting months for the major keywords to bring you
traffic. The traffic from lesser searched keywords may not look like much when
viewed individually; however, taken together it can amount to a sizeable amount of
traffic. On the plus side, there is a large number of lesser searched terms to choose
from.
To find even more suitable keywords, go to http://www.thesaurus.com and search
for synonyms of your keywords. This should yield fresh lists of keyword searches
for the synonyms of the main keyword. For example, a synonym for "IQ test" could
be "intelligence test" (8000 searches). The traffic you can derive from these
additional synonym keywords can therefore be substantial.

What should be your strategy while optimizing one keyword per webpage?

Wherever possible, you should try to optimize one webpage for only one keyword.
This may not be possible in all cases, but it should be attempted whenever possible.
The most important thing to keep in mind is to keep at least 40% of the content
different on each of the optimized pages. The simple reason for this is that identical
or nearly identical webpages are viewed as search engine spam and looked down
upon by search engines. Creating multiple pages with the same or similar content
could lead to your site getting blacklisted by the search engines.

What is keyword density and how should you optimize keyword density for
the search engines?

Most search engines take into account the keyword density within a webpage while
evaluating its ranking and importance in the SERPs. The keyword density is the
number of times the keyword figures in the HTML of the web page divided by the
total number of indexable words in the HTML page. An optimum keyword density
is in the region of 3-4% (preferably on the higher side).
You can use the "Word Count" feature in Microsoft Word to find the total number
of words in the document and then the "Find" option to count the occurrences of
the keyword; this is a quick way of calculating the keyword density. Note that
search engines don't index some common words within the web page, such as "or",
"and", "when" and "in". These are called STOP WORDS. For this reason, the keyword
density that you calculate by dividing the number of keyword occurrences by the
total number of words in the web page may not be fully accurate according to
some search engines. There are a number of tools that help to calculate the
keyword density. Two of the good ones are:

http://www.webmaster-toolkit.com/keyword-analysis-tool.shtml

http://www.live-keyword-analysis.com/
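
As a quick worked example of the calculation described above: if a web page
contains 500 indexable words and the keyword appears 18 times, the keyword
density is 18/500 = 3.6%, which falls within the recommended 3-4% range.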

What is keyword prominence and how does it affect SEO efforts?

Keyword prominence refers to the extra weight search engines give to keywords
placed in important parts of the web page; it is one of the factors that determines
the web page's relevance with respect to the keyword. We will delve into this area
of search engine optimization in the later pages of this course.

How should keywords with two or more words be ideally placed within the
webpage?

If you are placing keywords that consist of two or more words, such as "IQ test" (2
words), you must try your best to mention the keyword as "IQ test" instead of
placing the two words separately. Therefore, if your keyword is actually a phrase,
always refer to it in the form in which you would like to target it on the page. The
search engine will see the phrase "a test of IQ" very differently from "IQ test", so
make sure you always refer to your keywords accurately. If you must split the
words within your keyword phrase, try to keep them as close as possible to one
another. This is termed keyword proximity and helps the search engine to
interpret a relationship between the keywords. While discussing the relevance of
proximity, it must also be mentioned that some search engines also attach
importance to the independent existence, within the web page, of the words that
make up the keyword phrase. Therefore, taking the above example of "IQ test", you
should include the word "IQ" at various places within the web page and the word
"test" at other places. Doing this satisfies the more discerning aspects of the search
engine algorithms that look for more proof than the simple presence of the exact
keyword (i.e. "IQ test") in the web page.

4.1 What is the importance of keywords in the title of the webpage

Almost all major search engines give prominence to keywords in the title of the web
page. Always keep the title brief. Make it a 5-8 word title (up to 66 characters for
Google) that includes the keyword(s) and describes the essence of what the
webpage contains.

Think of the best line (of up to 66 characters) that you feel would be most
compelling for people to read and click on. Remember that it should contain 1-2
keywords for which you are optimizing the web page.

For example, the expertrating.com title is "Online Certification and Employment
Testing". This title cleverly includes two keywords, "Online Certification" and
"Employment Testing", and describes the main services of the site. Another very
important factor is that major search engines, including Google, display the page
title in the SERPs. This is one important reason why the title should describe what
is contained on the page and give a strong reason to click on the listing within the
SERPs.

For example, if you are selling an online eBook on hypnosis for $100 and you know
that it is the cheapest such offering on the internet, you should try to drive home
the point within the title. The title could read "Online Hypnosis ebook-$100". This is
what people would see in bold at the top of your listing in the Google SERPs and it
could be a compelling reason to click through to your site. Search engines truncate
the display of the page title in the SERPs: Google displays only up to 66 characters,
though Yahoo and Bing display a longer title. Please refer to the chapter on search
engines to get a clear idea of the various parts of the search engine results pages.
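
As an illustration, the hypnosis eBook title suggested above might be coded as
follows (a minimal sketch; the exact wording is hypothetical):

<HEAD>
<TITLE>Online Hypnosis Ebook - $100</TITLE>
</HEAD>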

How and where should you add keywords in the body of the web page?

As we discussed before, search engines attach more importance to keywords that
are placed at strategic locations within the body of the web page. This is called
keyword prominence and is used as an SEO technique to attach added relevance to
a keyword so that the web page ranks better in the SERPs for that particular
keyword search. Keywords can be placed in the following locations within the body
of the webpage to benefit from the keyword prominence factor.

 At or close to the beginning of the web page.
 Within the header or subheaders (H1, H2 Tags).
 Within the initial introduction of the webpage which is usually the first paragraph
of information that introduces the webpage to the visitors.
 At the commencement of all sentences and paragraphs within the web page.
 If you are adding sentences in bold text, try to include the keyword in bold as well.
 Distribute the keyword evenly throughout the web page apart from the
prominent locations. There should not be any artificial clustering of keywords.
 Within the link titles of hyper links. The link titles are used to describe the link
when the user places the mouse over the link. In the following example, the link
title is Online Hypnosis Ebook.

<A HREF="introduction.html" TITLE="Online Hypnosis Ebook">welcome to online


hypnosis ebook</a>

Always remember that an optimum keyword density should be in the region of
3-4%, preferably on the higher side.
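
A brief sketch of some of these placements, using the hypothetical hypnosis eBook
page (the content is illustrative only):

<BODY>
<H1>Hypnosis Ebook</H1>
<P><B>Hypnosis Ebook</B> - download our online hypnosis ebook and learn
self hypnosis at your own pace.</P>
</BODY>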

What is the impact on SEO of adding images to the web page and how can you
optimize images within the web page?

Search engines cannot spider text contained within an image used on a web page.
Therefore, important keywords related to an image should be placed in the ALT
text of the image tag. In the following example, the ALT text contains the keywords
"hypnosis ebook":

<IMG SRC="header.gif" WIDTH=200 HEIGHT=200 ALT="hypnosis ebook">

When using image maps for creating links within an image, remember that search
engines are either unable to read the links within the image maps, or find it difficult
to do so. Therefore, if you use image maps for creating internal or external links
from your website, you should place text links on your page as well. This will ensure
that the search engine bots can index all the pages of your site.

Are all major search engines keyword case sensitive?

No. You don't need to worry about case sensitivity while placing keywords in
webpages, as most major search engines are case insensitive.

What is keyword stuffing and why you should avoid it?

You may have often come across websites that blatantly repeat a keyword within
the page body or in the meta tags of a web page. Here's an example of one such
sentence within the body of the web page: IQ test, IQ test-IQ test, IQ-test, IQ test, IQ
test... Overzealous webmasters often try to increase the keyword density by
repeatedly inserting the keyword in the page. This is called keyword stuffing and is
treated as spam by all major search engines. If you try keyword stuffing while
making the text invisible (by using a text color that matches the background color),
you will still get detected, and your site is likely to get blacklisted, as this technique
is very easy for the search engines to discover.

Can you add keywords with a hyphen in the domain name?

There is ample evidence to suggest that adding keywords to the domain name of
the website is beneficial for the website in the SERPs. Try to add keywords that you
think are most searched for and which best describe your site. If the domain name
requires two words, you can use a hyphen between the words.

For example, a site about IQ testing can be named IQ-test.com.

Search engines do not penalize the hyphen between the two words. In fact, there is
evidence to suggest that it is favorably looked upon as compared to a domain name
without hyphens, such as the domain name IQtest.com. This is simply because the
search engine is able to easily identify the separate keywords in the name.
Should you add keywords in the file names?

Yes, whenever possible try to incorporate the keywords in the file names of the
webpages as these are considered by some search engines. An example of
keywords and hyphens in the file name is

http://www.expertrating.com/project-management-certification.asp

Notice that the file name contains the keyword "Project Management Certification"
and that each word is separated by a hyphen.

What other ways are there of finding keywords that generate high traffic?

a. Zeitgeists - Zeitgeists are search patterns revealed by the search engines, which
can yield some very high traffic keywords. The most popular is the Google Zeitgeist,
available at http://www.google.com/press/zeitgeist.html

Google has described the Zeitgeist in this way: "Pulling together interesting search
trends and patterns requires Google's human and computing power together.
Search statistics are automatically generated based on the millions of searches
conducted on Google over a given period of time - weekly, monthly, and annually.
With some help from humans (and PigeonRank -
http://www.google.com/technology/pigeonrank.html - when they have time), these
statistics and trends make their way from the depths of Google's hard drives to
become the Google Zeitgeist report." (The reference to pigeons is just Google trying
to be funny.) Below is a screenshot of the Google Zeitgeist.

b. Trending Topics - Check http://www.metacrawler.com/ and
http://www.twitter.com for trending topics.

c. Popular keyword lists - Some websites offer lists of the highest searched
keywords for the day, week or month. There are several providers of top keyword
lists, including Mike's Marketing Tools Top 500 list (http://www.mikes-marketing-
tools.com/keywords/) and Adsense Pay (http://adsensepay.com/), which lists the
most popular keywords based on bid prices.

Is there any advantage of displaying keywords in bold or strong text?

There is evidence to suggest that search engines attach importance to bold text in
the body of the page. Text can be highlighted and made bold using the <B> tag or
the <STRONG> tag, and you can use this highlighting technique for some of the
keywords within the web page. Bolding headers that already contain the keyword,
however, is of doubtful value with certain search engines.
Can search engines read keywords in the comment tags?
Comments can be added within the HTML using the <!-- and --> tags. Search
engines cannot read keywords in the comment tags, so this technique of hiding
keywords within them doesn't work.
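
For example, keywords placed within a comment like the following are simply
ignored by the search engines:

<!-- hypnosis ebook, free hypnosis ebook, online hypnosis -->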

CHAPTER 5
5 More about Google and PageRank (PR)

Google is arguably the best search engine, with incisive algorithms that enable it to
generate precise results matching the search keywords. Since Google accounts for
approximately 60% of all search engine traffic, almost every SEO activity has Google
in mind. In this chapter we will delve into the workings of Google and how it ranks
pages.

What is Google PageRank?

Every SEO professional must understand the concept of Google PageRank (or
Google PR). Google PageRank is a mathematical technique (based upon a formula)
used by Google for determining the relevance or importance of any webpage. The
system was developed by Google founders Larry Page and Sergey Brin at Stanford
University in 1998. Google PageRank is one of the most important factors affecting
your search engine rankings in Google, which also weighs over 100 other factors. It
is therefore essential to try to increase your Google PageRank as far as possible.

Google PageRank (also called Google PR; a trademark owned by Google) is a
Google technology that rates the significance of a webpage on the World Wide
Web. It is basically a numeric value from 0 to 10 that measures how important a
webpage is. A webpage with a Google PR of 2 is more important than a webpage
with a Google PR of just 1.

The foundation of the Google PR system of ranking webpages is the principle of
voting. When a website links to another website, it passes a vote in favor of that
website. As more and more websites link to it, the votes keep accumulating, and
the Google PR goes up. Another important factor is: who is linking? Is it an
important page, or just a fresh page that has not been indexed? The vote from an
important page counts for more than the vote from an unimportant page.
Therefore, Google calculates a page's importance on the basis of how many votes
are cast for the page and who is casting them.

Outbound links also have a bearing on a website's Google PageRank. Every web
page has some outbound links, whether they point to other websites or to other
pages of the same website. A page's available Google PR is divided among its
outbound links; therefore, the more outbound links there are, the less Google PR is
passed through each one. It is quite possible that a link from a Google PR 6 page
passes less Google PR than a PR 4 page if the PR 6 page has far more outbound
links than the PR 4 page. For example, if a web page has 8 outbound links, each link
will pass only one eighth of its available PageRank. If a page has only one outbound
link, that link will pass 100% of its available PageRank.

To summarize, Google PR is based on:

 The number of inbound links to a page.


 The quality of in-bound links to a page (what is the Google PR of the pages on which the links
reside).
 Google PR only flows from the sender to the receiver. It is not possible to lose Google PR by
linking to a low PR site.
 The ability to pass PR also depends on the number of out-bound links on the page.
 The relevance of the content of the page to the subject matter.

The Google PR is determined for every page of the website. The Google PR of the
homepage may be different from that of the interior pages, depending upon the
above factors. This is a very simplified explanation of Google PageRank: there are
over 100 other factors that determine the search rankings, but none is as
important as PR.

To check the Google PR of any webpage, download the Google Toolbar from
http://toolbar.google.com/. The green bar in the toolbar indicates the value of the
PR from 0 to 10. If you are checking the Google PR of a new site, you may have
to wait, as it can take up to 3 months for a site to get a PR after being indexed.

Below is a screenshot of the Google Toolbar. The green bar shows that the
PageRank is at 10.
If you prefer not to install the Google toolbar, or would like to check PageRank from
a different machine, you can also use the free service available at:
http://www.prchecker.info/.

This is in no way an exhaustive explanation of Google PR; however, we can discuss
a simplified scenario to gain a better understanding of Google PageRank.

If you have 2 options for getting inbound links to your site, of which one is a PR 7
page with 3 outbound links and the other a PR 5 page with 2 outbound links, which
would you choose? The answer should be the PR 5 page, as it will pass 5/2 = 2.5 PR,
while the PR 7 page will pass 7/3 = 2.3 PR. This scenario is depicted in the diagram
below.

The PageRank of web page 3 is 4.8 and can be calculated by adding the total PR
flowing to it from web page 1 and web page 2, which is:

PR of web page 3 = (PR of web page 1)/2 + (PR of web page 2)/3 = 2.5 + 2.3 = 4.8
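
For reference, the formula originally published by Page and Brin is:

PR(A) = (1 - d) + d x (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where T1...Tn are the pages linking to page A, C(T) is the number of outbound links
on page T, and d is a damping factor usually set to 0.85. The simplified calculation
above effectively ignores the damping factor (i.e. it assumes d = 1).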

You can use the following rule of thumb to rate webpages based on their
Google PR:
0 - Blacklisted or new websites or webpages

1 - Very low PR (not much use getting linked from such pages)

2 - Low PR (not much use getting linked from such pages)

3 - Low-average PR (not a very good linking opportunity, but still go ahead with a
links exchange)

4 - Average PR (most running sites fall in this category - exchange links)

5 - Good PR (This could be a good links exchange)

6 - Very good PR (Exchanging links with this site would be like finding a rare gem)

7 and above - An excellent opportunity to get linked. Almost never comes for free.

While referring to the above rule of thumb, keep in mind that you should not pass
up an opportunity to get linked from a low PR website if you think it could improve
in PR or has good future prospects.

More about the workings of the Google Search Engine

Google has harnessed the technique of parallel processing, through which it makes
use of thousands of computers on a distributed network for carrying out complex
queries and calculations in split-second timings. The components of the Google
search engine are:

1. The crawler called Googlebot (which finds and fetches new content and
webpages)

2. The indexer (that sorts all the information related to every document stored in
the database)

3. The query processor (which uses a complex algorithm to generate meaningful
results based upon a user's search term)

5.1 More on GoogleBot - Google's web crawler


As mentioned earlier, Googlebot is Google's web crawling robot. Googlebot works
by sending requests to website pages on the WWW and retrieving them for the
indexer. Googlebot finds webpages by crawling across hyperlinks, keeping a record
of all the hyperlinks it encounters on every page visited. The recorded hyperlinks
are crawled at a later time. Googlebot assigns a "visit frequency" priority to every
web page, which determines how often it will be crawled. Web pages on which the
information is constantly updated or that change frequently are crawled more
often, while pages that are left unchanged for long periods of time do not get
crawled often. This is how Google is able to quickly pick up the latest news from
news websites. The Googlebot that frequently crawls rapidly updating websites is
called the Google Freshbot, since it keeps the content fresh at all times.

Always keep adding fresh content so that the Googlebot updates your site more
often.

What is the Google dance and deep crawling?

Once in a while (usually once a month), Google indexes several levels down into
a website (meaning that pages several clicks away from the home page get
crawled). This is called a Google deep crawl, and it is the time when Google
attempts to update all the records it has about every single webpage. If you have
a new website that is not getting crawled (or if only the first level is being crawled),
Google will probably attempt further crawling during the next deep crawl. The
Google deep crawl is a time to watch out for, since most SEO professionals keenly
look out for an update in the Google PR as well as a change in the search engine
rankings.

Whenever a Google deep crawl takes place (which usually takes 3 to 5 days), the
Google PageRank is also updated, and the search results vary drastically for various
search terms. This wild fluctuation in the search results is Google updating the
information on each of its indexed webpages across the approximately 10,000
servers holding the data. The fluctuations are called the Google Dance. The deep
crawl bot uses an IP range of 216.239.46.x, while the Freshbot uses an IP range of
64.68.82.x. The servers can also be accessed at the following IPs, which are usually
used for checking PR across servers.

66.249.93.104

64.233.179.104

216.239.51.104

66.102.9.99

66.102.9.147

66.102.9.104

64.233.161.83

64.233.183.103

64.233.189.104

64.233.187.99
64.233.187.104

64.233.185.99

What are Google Advanced Operators?

Advanced Operators are commonly used by SEO professionals while assessing the
level of indexing of various webpages in the search engines. Advanced Operators
are query words which have a distinctive meaning to Google and can offer
additional information regarding a search. Advanced Operators help fine-tune the
search and make it more specific and relevant. The ExpertRating website is used as
the example in the explanations below.

Link:

The Advanced Operator (link:), as in link:www.expertrating.com, searches for all the
webpages that have links to www.expertrating.com.

Related:

The Advanced Operator (related:), as in related:www.expertrating.com, displays the
list of webpages that are similar to the ExpertRating home page.

Info:

The Advanced Operator (info:), as in info:www.expertrating.com, displays all the
information that Google has about the ExpertRating home page.

Cache:

The Advanced Operator (cache:), as in cache:www.expertrating.com certification,
displays the cached content of the page with the word "certification" highlighted.

Many of the Advanced Operators are punctuation marks and not words:
For example:

"" (the quote operator) - The Quote Operator is used when you are looking for the
results that contain the exact phrase you have typed in. If you are searching for a
Britney Spears song, you would simply put the title of the song in the quotation
marks. For example:

"You Drive Me Crazy"

+ (the plus operator) - The Plus Operator is used when you want to include a word
or phrase (which Google otherwise does not take into consideration) in addition to
the main search term to make your search more specific. For example, if you are
searching for the utility of almond oil specifically for the face, then instead of typing
in almond oil for face, just type in:

almond oil +face

This will ensure that the results complement your search by avoiding all that is
irrelevant. You must ensure that you include a space before the plus sign but none
after it.

- (the minus operator) - The Minus Operator is used when the search term you
have typed in has more than one meaning and you would like Google to omit one
of the meanings. For example, if you are searching for the term boil (the ulcer), you
would type in:

boil -recipe

The results complement your search by avoiding all that is irrelevant. You must
ensure that you include a space before the minus sign but none after it.

5.2 What are Google Query Modifiers

Query modifiers help to narrow down the search criteria so as to shed light on
important aspects of the search results. Some of the important query modifiers
are:

Site:

Inclusion of (site:) in the query makes Google generate results from within a single
website. For example, site:www.expertrating.com finds pages within
www.expertrating.com only. It must be kept in mind that there is no space between
"site:" and the domain.

allintitle:

Inclusion of (allintitle:) in the query makes Google generate the results which
contain the search term in the title. As an example, the search term:

allintitle:expert rating

displays only those webpages that have both the words "expert" and "rating" in the
title.

intitle:

Inclusion of (intitle:) in the query makes Google generate results consisting of
webpages that have the first search word in the title and the remaining words
elsewhere in the document. For example:

intitle:expert rating

displays the webpages that contain the word "expert" in their title and the word
"rating" anywhere in the document (the title or anywhere else). Again, make sure
there is no space between "intitle:" and the following word.

allinurl:

Inclusion of (allinurl:) in the query makes Google generate only those webpages
that have the search term in the url. For example:

allinurl:expert rating

displays only those webpages that have both "expert" and "rating" in the url.
allinurl: works only on words, and not on other URL components. It ignores any
punctuation marks.

inurl:

Inclusion of (inurl:) in the query makes Google generate only those results that have
the first search word in the url and the remaining words elsewhere in the
document. For example:

inurl:expert rating

displays the documents that contain the word "expert" in their url and the word
"rating" anywhere in the document (the url or anywhere else). It must be kept in
mind that there is no space between "inurl:" and the following word.

CHAPTER 6
6 Tuning the Meta Tags

What are meta tags and how do they help in the SEO efforts?

Meta tags are a special set of HTML tags that contain information about the
webpage and its contents. Search engine bots and spiders are programmed to read
meta tags and pick up information regarding the webpage. Part of an SEO
professional's job is to ensure that the meta tags that affect the optimization of the
website are included in the page HTML. The two main meta tags that search engine
spiders and bots detect and read are:

 Keywords meta tags


 Description meta tags
The syntax of the meta tags is shown below. Meta tags are always placed within the
<HEAD> tags.

<HEAD>

<TITLE>The page title comes here</TITLE>

<META NAME="description" CONTENT="the page description comes here.">

<META NAME="Keywords" CONTENT="keyword1,keyword2,keyword3">

</HEAD>

<BODY>

How should the description meta tag be created and what should it contain?

The description meta tag is placed within the <HEAD> </HEAD> tags, just before the
<BODY> tag of the HTML page. The syntax of the description meta tag is

<META NAME="description" CONTENT="the page description comes here.">

Many search engines, including Google and Inktomi, make use of the information in
the description meta tag. This information is used to generate the description of
the website in the search engine results page. Most search engines use exactly the
same description as is available in the meta tag, so it is important that you word it
accurately, keeping in mind your SEO objectives.

The description meta tag should contain approx. 150 characters of information
describing the content of the web page. Since the search engines use the meta
description to display the website description in the search result pages, keep in
mind the following important factors to get the most advantage out of meta tags.

 Keep the description meta tag to around 150 characters in size.


 Include a description in the tag that you would want people to see in the results page of the
search engines.
 Include your main keywords in the description, but be sure to keep a conversational tone and
don't overstuff it with keywords.
 Try to make the description catchy so that the visitors are compelled to click on the link in the
search results page.
 If you don't include a description meta tag in your HTML, most search engines will pick up the
first few lines or sentences of the first paragraph of the body of your HTML page and use them
as the description for the web page.

Following is an example of a description meta tag of the kind used by popular
websites, along with the description as it would be indexed and displayed in the
Google SERP (the wording below is a hypothetical illustration):
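
<META NAME="description" CONTENT="ExpertRating offers online certifications and employment testing services for individuals and companies.">

A description like this would typically be displayed almost verbatim as the snippet
below the page title in the Google SERP.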

What is the keywords meta tag and what should it contain?

The keywords meta tag is placed within the <HEAD> </HEAD> tags, just before the
<BODY> tag of the HTML page. The syntax of the keywords meta tag is

<META NAME="Keywords" CONTENT="keyword1,keyword2,keyword3">

Several search engines, including Yahoo, recommend the use of the keywords meta
tag. The keywords meta tag contains a list of important keywords that a search
engine spider or bot can read and relate to the webpage. Keep in mind the
following factors while writing the keywords meta tag:

 Pick out the most important keywords (between 2 and 7) for inclusion in the keywords meta tag.
(You will learn how to choose keywords later on in the program.)
 Separate the keywords with commas.
 Ensure that the keywords used in the keywords meta tag are the same as those used within the
webpage.
 Do not use any trademarks as keywords in the meta tag. Trademarks are protected by law and
should not be used without permission even within the meta tags which are invisible to the web
page viewer.

6.1 What is the Robots meta tag and how is it used

The robots meta tag is an alternative to the robots.txt file and instructs search
engine robots whether or not to index a webpage, follow its links, or archive it.
The syntax for the robots meta tag uses the following instructional words:

 Index
 Follow
 Archive

If the robots meta tag is excluded, search engines will by default index the webpage
and follow its links. The syntax for the robots meta tag is:

<META NAME="robots" CONTENT="noindex,follow">

To instruct the robot not to index the page and not to follow the links in the page,
the syntax is:
<META NAME="robots" CONTENT="noindex,nofollow">

To stop search engine spiders from caching a webpage by using the instruction
noarchive, the syntax is:

<META NAME="robots" CONTENT="noarchive">

To keep the Yahoo directory title and description from being displayed, the syntax
is:

<META NAME="robots" CONTENT="NOYDIR">

To keep the Open Directory Project title and description from being displayed, the
syntax is:

<META NAME="robots" CONTENT="NOODP">

To display only the page title and not the description, the syntax is:

<META NAME="robots" CONTENT="NOSNIPPET">

What is the robots.txt file and how can it be used to pass on instructions to
search engine robots?

The robots.txt file is a text file that is placed on the server and contains
instructions for search engine robots. The robots.txt file is recognized by most
search engine bots and spiders and can be effectively used to allow or disallow
search engine bots and spiders from crawling and indexing specific webpages and
files of the website. The robots.txt file is a standard means of stopping the
indexing of webpages. (Robot names are case insensitive.)

Where are robots.txt files placed on the server?

The robots.txt file should be uploaded to the top level root directory on the server.
This is the place where the index.html (home page) is located. Robots.txt files
are often used by SEO professionals for various reasons, such as hiding sensitive
data or blocking specific robots from pages that are not complete.

How do you create a robots.txt file?


A robots.txt file can be created in a simple text editor such as Notepad. Each rule in
a robots.txt file takes the following form:

User-agent: The name of the robot comes here, e.g. Googlebot

Disallow: The name of the file or directory comes here (this instruction disallows the
file/directory from being indexed). Here are some examples:

Following is an example of a robots.txt file that disallows all webpages from being
indexed:
User-agent:*
Disallow:/

Following is an example of a robots.txt file that allows all webpages to be indexed:


User-agent:*
Disallow:

Following is an example of a robots.txt file that disallows the AltaVista robot, called
"Scooter", from accessing the admin directory and the personal directory:
User-agent:Scooter
Disallow:/admin/
Disallow:/personal/

Below is an example of a robots.txt file that instructs bots not to crawl any file
ending in .pdf:
User-agent:*
Disallow:/*.pdf

The robots.txt file can also contain multiple sets of instructions for more than one
bot. Each set of instructions should be separated by a blank line. There is only one
robots.txt file per website.

Below is an example of a robots.txt file that disallows Google from crawling any of
the dynamically generated pages and allows the AltaVista Scooter bot to access
every page:
User-agent:Googlebot
Disallow:/*?
User-agent:Scooter
Disallow:
What is the recommended method for redirecting visitors from one page to
another?

Redirection should be avoided, but may be a necessity in some cases, especially if
a site owner is shifting an existing site to a new domain name and wants to retain
the traffic from the old site.

The following are search engine friendly ways of redirecting:

1. Implement 301 Permanent Redirects on the old pages to redirect to the new
pages. This method retains the search engine rankings. The code for ASP is:

<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.yourwebsite.com/newdirectory/newpage.htm"
%>

2. If you are using a Linux server, the most search engine friendly way for
redirection is to redirect with htaccess.

Create a .htaccess file (if it does not exist) in your root directory and add the
following line:

Redirect permanent / http://www.new-url.com

3. Another search engine friendly option to redirect pages is through IIS Redirect.

 In Internet Services Manager, right click on the file or folder you wish to redirect.
 Select the radio button titled "A redirection to a URL".
 Enter the redirection page.
 Check "The exact URL entered above" and "A permanent redirection for this
resource".
 Click on 'Apply'.

4. Redirecting method for ColdFusion.

<cfheader statuscode="301" statustext="Moved permanently">
<cfheader name="Location" value="http://www.new-url.com">

5. Method for PHP Redirect.

<?php
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.new-url.com" );
?>

What is the meta refresh tag?

The meta refresh tag can be used to reload an existing webpage in the browser
after a specified amount of time. The reloading of the page can also be used as a
method of redirecting the visitors to another page; however this method is not very
search engine friendly. Here is an example of a meta refresh tag that reloads the
page after 7 seconds and redirects it to www.expertrating.com.

<meta http-equiv="refresh" content="7;url=http://www.expertrating.com"/>

CHAPTER 7
7 Important factors that affect search engine rankings

What are the advantages of a site map for SEO?

A site map is a textual depiction (using links) of the structure of the website,
through which a visitor can navigate to any part or area of the website. From the
SEO standpoint, the sitemap can be viewed as a series of internal links within the
website, following a hierarchical path from the home page to the secondary and
tertiary levels of the website. Site maps are important for SEO since they are used
as highways by the spiders and bots for reaching different areas of the website.
Keep in mind the following important points while designing a site map for your
site.

 Create the site map in the form of an HTML webpage using text hyperlinks to depict the
different pages of the website. Don't use JavaScript in the site map, as bots find simple HTML
tags easier to read. (A minimal sketch of such a page appears after this list.)
 Include all the major areas of your site in the site map, depending upon how many links will fall
within it. Include the pages and areas of the site that you think should be indexed by the
search engine spiders.
 Don't create a graphics-only site map, or a site map that contains no text apart from the
links constituting the site map. Always add some other text (such as a small description of how
to use the site map) within the page to prevent the search engines from viewing the page as
spam. Do not add more than 70-80 links to the sitemap, since many search engines stop
indexing links beyond a certain number.
 Use keywords in the site map for the text of each link. The keyword should match the
keyword targeted by the page being linked to from the sitemap.
 Try to place a link to the site map on all important pages of the website, including the
homepage, so that it can be easily detected by the search engines and more Google PageRank
can flow to the sitemap.
 If your site has more than three levels (i.e. the homepage plus two more levels), you will find it
hard to get the third level onwards indexed in the search engines. You can use the sitemap to
include these sub-level pages within the map; this way you can get such pages indexed much
quicker. The search engine bots will index the site map from top to bottom, so try to keep all
important pages at the top of the map structure.
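
The following is a minimal sketch of such a site map page (all page names and link
text are hypothetical):

<HTML>
<HEAD>
<TITLE>Site Map - Hypnosis Ebook</TITLE>
</HEAD>
<BODY>
<H1>Site Map</H1>
<P>Use the text links below to reach any section of the site.</P>
<A HREF="hypnosis-ebook.html">Hypnosis Ebook</A><BR>
<A HREF="free-hypnosis-lessons.html">Free Hypnosis Lessons</A><BR>
<A HREF="hypnosis-faq.html">Hypnosis FAQ</A><BR>
</BODY>
</HTML>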

Should you use image maps?

No; search engines are rarely able to read links within image maps. If using image
maps is an essential design requirement of the website, then it is also essential that
the links within the image map are placed as text links somewhere else on the
page so that they get properly indexed.
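
A brief sketch of this approach (the image, coordinates and page names are
hypothetical); the links in the image map are duplicated as plain text links below
the image so that spiders can follow them:

<IMG SRC="navbar.gif" USEMAP="#nav" ALT="site navigation">
<MAP NAME="nav">
<AREA SHAPE="rect" COORDS="0,0,100,30" HREF="products.html" ALT="Products">
<AREA SHAPE="rect" COORDS="100,0,200,30" HREF="contact.html" ALT="Contact">
</MAP>
<P><A HREF="products.html">Products</A> | <A HREF="contact.html">Contact</A></P>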

What is the best way to use splash or flash intro pages?

Home pages that contain only a large graphic or a flash intro to the site are a bad
idea from the point of view of SEO, as search engines don't find any textual content
or links to index. If you must use a splash page or a flash intro, make sure that you
also have text links to the inner pages of the site somewhere on the page as well.
Doing this may defeat the very purpose of the all-graphical look of the splash page,
so it is better not to use such pages in the first place.

How do you optimize JavaScript within your webpage?

JavaScript is not considered search engine friendly and should be avoided in most
cases. However, due to the many benefits that JavaScript offers to website design
and the conservation of server resources, it may become a necessary evil. Here are
some of the problems with JavaScript from the SEO standpoint:

 Not readable: Whatever you include in JavaScript cannot be read by the search engines. Links
included in JavaScript cannot be read and indexed either.
 Hogs the area meant for text: Search engines like meaningful textual content that they can
index. JavaScript uses up a lot of the overall page content in the form of code that is unreadable
to search engines.
 JavaScript pushes down the content in the web page, as it uses up space for its code. Since
search engines attach decreasing importance to the contents of the page while reading from the
top of the page downwards, parts of the page content do not get the importance they deserve.
Moreover, the inclusion of JavaScript reduces the keyword density of the webpage (keyword
density was discussed earlier).

7.1 Solutions to improve the usage of Javascript

Move the JavaScript to another file: The JavaScript is usually added within the
<HEAD> tags of the HTML. Search engine spiders look for keywords and text to
identify with the description of the web page early on in the HTML, and this makes
it a bad place to include the JavaScript. It is therefore a good idea to shift the
JavaScript into a separate file and point to the file in the HTML. The following is an
example of pointing to a JavaScript file called "name of file.js" from within the
HTML:

<HTML>

<HEAD>

<TITLE>Example of Moving Javascript</TITLE>

<SCRIPT LANGUAGE="Javascript" SRC="name of file.js" TYPE="text/javascript">

</SCRIPT>

</HEAD>

<BODY>
By including the JavaScript in a separate file, the search engine spider can be
directed to the important text and keywords at the top of the HTML. Moreover, the
overall size of the web page decreases and the download time improves. Most
search engines are unable to read links within JavaScript code, so make it a point to
add text links using plain HTML alongside links generated by JavaScript, to ensure
that the search engine robots are able to properly spider all the pages that contain
JavaScript links.
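
For example (page names hypothetical), a link written by JavaScript can be paired
with a plain HTML link inside <NOSCRIPT> tags, which spiders that do not execute
JavaScript can still read:

<SCRIPT LANGUAGE="Javascript" TYPE="text/javascript">
// This link is written by JavaScript; most search engine spiders cannot follow it
document.write('<a href="products.html">Products</a>');
</SCRIPT>
<NOSCRIPT>
<A HREF="products.html">Products</A>
</NOSCRIPT>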

How can you optimize the use of frames within the web pages?

Frames are not recommended from the point of view of SEO as the search engine
spiders are not able to index the sub pages within the master page. Search engines
can index the text and links of the master (frameset) page if the frameset content is
placed within the <NOFRAMES> tags.

<FRAMESET>

<NOFRAMES>

content of the frameset document comes here

</NOFRAMES>

</FRAMESET>

How can you use CSS to hide keywords in pages that are deficient in keywords
and text?

Due to the design consideration of certain sites, such as those that employ splash
pages or flash pages that do not contain any or enough textual content, you can
use CSS to hide important keywords in hidden layers. The hidden CSS layers can be
viewed and indexed by the search engines but remain invisible to the person
browsing the page. The following CSS code helps to hide the keyword "Hypnosis
Ebook" within a hidden CSS layer:
<DIV STYLE="visibility:hidden;position:absolute;left:200px;top:200px">Hypnosis
Ebook</DIV>

This technique is considered a black hat technique by most search engines and
should be avoided even though it is near impossible for the search engines to
detect.

Can you add keywords in hidden value form tags?

Hidden value form tags are used to pass information from a page containing a form
to a target page. The syntax is <INPUT TYPE="hidden" NAME="keyword"
VALUE="Hypnosis Ebook"> Most search engines treat the technique of hiding
keywords in hidden value form tags as spam, so don't try it.

What is site popularity?

Site popularity refers to the amount of time that people spend on a particular
website after reaching the site. This is a good measure of how attention grabbing
your website is for visitors. Bounce rate is a similar concept. It is the rate at which
visitors leave a website from the same page they came in on without browsing any
other pages.

What is click through popularity?

Click through popularity refers to the system used by search engines and
directories to rank websites according to the number of clicks their listings receive
in the search engine results pages or in the directory listing.

CHAPTER 8
8 Good and Bad SEO Techniques
What are doorway pages?

Doorway pages can be cleverly developed to bring in a large amount of free traffic
for a website. Doorway pages, also called gateway pages, are optimized pages
submitted to search engines that help the site generate traffic by targeting specific
keywords through individual entry points to the website. The main purpose of
doorway pages is to get them spidered into the search engines so that they attract
traffic for the site from the different keywords embedded within each doorway
page.

Search engine algorithms are developed in such a way that they apply various
forms of logic to relate keywords or sets of keywords with single webpages in their
database of pages. Doorway pages are developed by optimizing the page to a very
high degree so that a search engine algorithm ranks the page high for the keyword
for which it is optimized. Since doorway pages are designed for search engines,
they often do not carry the look and feel of the main website. Doorway pages are
developed with more keyword rich textual content and for high download speeds.

Here is an example of how people use doorway pages.

Doorway pages are usually linked to internal pages of the website that call for
action from the visitor (such as making a purchase decision). For example, if a
website is selling mobile phones, doorway pages can be created that are optimized
for the popular models of phones that people are searching for. In the following
example, when people look for the Nokia 123 phone, they see the doorway page
(which is optimized for Nokia 123), which takes them straight to the buying
information. Similar pages could be created for the other models (Nokia 124, Nokia
125 and Ericsson 123). The hallway page contains the complete list of available
mobile phones that visitors to the site can conveniently browse through while on
the site.
What are Hallway pages?

If you plan to use doorway pages to boost your traffic, you will require pages to
organize the list of doorway pages for maintaining a systematic linking structure
between all the doorway pages. Hallway pages also help people navigate through
multiple doorway pages and enable search engine spiders to index the doorway
pages. Keeping the same example as above, the page that lists all the models of
mobile phones that the website sells would be ideal as a hallway page. Each listing
of the mobile phone models in the hallway page would link to an independent page
(the doorway page for each model).

How can you safely use doorway pages to generate huge traffic for your site?

Most search engines, including Google, treat doorway pages as search engine spam
and look down upon identical doorway pages or directories that look intentionally
created. However, it is very difficult for search engine algorithms to differentiate
between intentionally created doorway pages and regular pages of the website if
some important factors are adhered to. Keep the following factors in mind to create
doorway pages which the search engines will not treat as spam.

1. Each doorway page must consist of at least 40% text different from that of any
other doorway page.

2. Ensure that the Titles, Headers, Meta tags and embedded keywords are different
for each doorway page. The textual matter of the webpage body should also be at
least 40% different from that of any other page of the website.

3. Ensure that Hallway pages that link to the doorway pages don't have more than
80 links. Create new hallway pages if a single hallway page is not able to
accommodate the links to all the doorway pages.

4. Do not use any software to generate doorway pages, as such software usually
generates ugly, incomprehensible pages which are just a collection of keywords
stuffed within meaningless text. Search engine algorithms will soon be able to
detect which sites have used such software.

If you stick to the above rules and optimize all keywords and parameters of the
doorway page, you can target as many keywords as you think can be developed
into meaningful doorway pages.
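
One rough way to sanity-check rule 1 above is to measure the word-level overlap between two doorway pages. The following minimal Python sketch uses Jaccard similarity; the sample pages and the 0.6 threshold are illustrative assumptions, not any search engine's actual formula:

def text_overlap(page_a, page_b):
    # Jaccard similarity: shared words divided by total distinct words.
    words_a = set(page_a.lower().split())
    words_b = set(page_b.lower().split())
    return len(words_a & words_b) / len(words_a | words_b)

# An overlap above roughly 0.6 means the pages are less than 40% different
# and risk being treated as near-duplicates.
print(text_overlap("buy the Nokia 123 phone online today",
                   "buy the Nokia 124 phone online today"))   # 0.75 - too similar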

Do dynamically generated pages, such as those created with ASP, get indexed by
the search engines?

Webpages generated using programming languages such as ASP, ColdFusion and
PHP are displayed on the fly in response to a user request. Such pages
do not actually exist as webpages (such as HTML pages) before the user request;
instead, they are generated instantly on the basis of the user's response. This is the
reason search engine spiders cannot detect such pages until they are actually
generated in response to the search engine spider's request. URLs of websites that
use programming languages for generating webpages have characters such as
?, &, % or $ within them. Search engines may simply stop indexing URLs when
they come across such characters.
Dynamically generated pages can be altered so that search engines can index
them. This involves the instant conversion of dynamic pages into static pages
whenever a user makes a request. Software is also available that can convert
dynamically generated sites into static sites. Such software can rewrite the
unfriendly URLs into search engine friendly URLs without actually changing the
page to an HTML format.
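
As a simple illustration of such rewriting, the Python sketch below maps a static-looking, spider-friendly URL onto the dynamic query string the server actually handles. The path pattern, page name and parameter are hypothetical:

import re

# Hypothetical scheme: expose /phones/nokia-123.html to spiders while the
# server actually serves /product.asp?model=nokia-123 behind the scenes.
FRIENDLY_URL = re.compile(r"^/phones/(?P<model>[a-z0-9-]+)\.html$")

def rewrite(path):
    match = FRIENDLY_URL.match(path)
    if match:
        return "/product.asp?model=" + match.group("model")
    return path   # pass all other URLs through unchanged

print(rewrite("/phones/nokia-123.html"))   # /product.asp?model=nokia-123

In practice this mapping is usually performed by the web server itself (for example through a rewriting module), but the logic is the same.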

Which of the major search engines have the capability to spider dynamic
webpages?

Google, Yahoo, MSN and Altavista have the ability to spider webpages developed in
programming languages such as ASP and Coldfusion.

8.1 What is cloaking?

Cloaking is a controversial SEO technique that involves the creation of two sets of
pages, one for the website visitors and one for submission to search engines.
Cloaking is accomplished by identifying whether a visitor to a website is a search
engine bot or a real visitor. The recognition is accomplished by tracking the IP
address of the website visitor. Cloaking works as follows:

1. Two sets of webpages are developed. One is developed in accordance with the
site design and is visually appealing for the website visitors, while the other set,
which is dynamically generated, is highly optimized and served upon the detection
of a search engine bot visiting the website.

2. The software program that dynamically generates the webpage for the search
engines also converts the URL to an HTML URL so that the search engine bot can
index the optimized page. The software program can automatically detect the IP
address of the search engine bot and offer a highly optimized page on the fly.
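
Conceptually, the detection step boils down to an IP lookup. The minimal Python sketch below uses placeholder addresses and page names and is shown purely to illustrate the mechanism; as discussed later in this section, search engines treat cloaking as deceptive:

# Placeholder bot addresses -- real cloaking software maintains an
# auto-updated list of spider IPs for each search engine.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def select_page(visitor_ip):
    if visitor_ip in KNOWN_SPIDER_IPS:
        return "optimized.html"   # keyword-rich page served to the bot
    return "index.html"           # designed page served to human visitors

print(select_page("192.0.2.10"))   # optimized.html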

Why do search engine optimizers use cloaking?

Cloaking is used for two main purposes by website owners and SEO companies.

A) When the website owner would like visitors to see a particular webpage and
search engine bots to see a different webpage. This could be due to:

 The highly optimized page for the search engine not being visually appealing or
matching the site design.
 The site design requiring a lot of graphics leaving less room for important
keywords and text.

B) The second reason why website owners use cloaking is to keep the optimized
HTML, which is served only to the search engines, a business secret. If a site is
ranking in the top position, its owners might not want the code for the optimized
page to be visible for others to copy and use.

The use of cloaking to display a different page or pages to visitors than to search
engines is highly controversial and is often referred to as a "black hat" technique. In
fact, Google says in their webmaster tools that "cloaking refers to the practice of
presenting different content or URLs to users and search engines. Serving up
different results based on user agent may cause your site to be perceived as
deceptive and removed from the Google index."

What is cloaking software and what features should it have?

Cloaking software automates the process of cloaking webpages for all major search
engines. Good cloaking software constantly updates its list of search engine spider
IP addresses for all major search engines such as Google, Yahoo, Altavista, All the
Web, Inktomi and Lycos. Well designed cloaking software should not redirect real
website visitors in any way and should auto-update the list of search engine bot IP
addresses. A well-known website that sells such software is
http://www.fantomaster.com/. (ExpertRating does not recommend cloaking. It is,
however, covered in the program as it is widely used, and clients often request it to
be done.)

Is it true that search engines give priority to websites that stick to a
particular theme?

It has long been believed that the search algorithms of major search
engines such as Google take into account whether or not a website follows a
certain theme or subject. However, there is currently no evidence to suggest that
search engines consider how many topics a website is focused on or how many
webpages are devoted to each topic. Major search engines such as Google look at
each page in isolation and attach an importance or rank to the individual page.
Therefore, having 10 or 100 other webpages on the same topic within the same
website will not bias the opinion of the search engine towards the importance of
a webpage with respect to a keyword or subject. The notion that search engines
rank themed websites over other sites stems from the fact that search engines
rank content rich sites higher. Therefore, if a site talks about children's IQ tests, the
search engine will index more pages if the site has ten pages about child IQ tests
instead of only one page. The more pages indexed by the search engines, the
better the chances of getting fresh traffic from search engines. Moreover, the more
a website is an "authority" on a subject or keyword, the better it is considered by
Google.

Does Google prefer themed in-bound links?

Google does give priority to themed in-bound links with relevant anchor text in
the link. It is therefore a good idea to get link backs from websites that share a
common theme with your website.

For what reasons could I get penalized by Google?

1. Duplicate Content - If one or more of your webpages contains the same
information (or nearly the same information) as another webpage on your site or
another website, you could get banned.

2. Cloaking - It is hard for search engines to detect cloaking, but when they do, it
could lead to a ban.

3. Hidden text - Trying to hide keywords in hidden text or links.


4. Keyword stuffing - Trying to artificially hike up the keyword density by repeatedly
using keywords throughout the page.

5. Bad neighborhoods - If you link to sites banned by Google, FFA pages (free for all
links) or link farms.

6. Page creation software - Software that creates hundreds of pages for the search
engines. These pages are full of meaningless text with a high keyword density.

In what ways could Google penalize me?

As discussed above, there are many reasons why Google could blacklist your
website. Google could penalize a website in the following ways.

1. Loss of Google PageRank

2. Reduction of Google PageRank

3. Loss of position in the rankings in the Google results pages

4. Complete banning of the website from being indexed

5. Reduction or complete stoppage of the website's ability to pass Google PR

CHAPTER 9
9 Link Popularity and Linking Strategies

What is link popularity and how can it affect your search engine rankings?

Link popularity is one of the most important criteria used by most major search
engines for attaching importance or weightage to a webpage in the results pages.
Google has developed its own criteria for calculating the importance and relevancy
of each webpage in the search engine results using a system called Google
PageRank. The two most important factors that contribute towards a
webpage's link popularity (including Google PR) are:

 The number of links pointing to the webpage from other sites (number of in-bound links).
 The quality of the inbound links.

The greater the number of in-bound links pointing to a website, the better it fares
in the SERPs. The search engines consider each inbound link as a vote of
popularity for the website. The more votes, the more important the
webpage. Inbound links have special meaning to search engines, including Google,
Yahoo, MSN Search, and Ask Jeeves, as their number, duration of linking time, and
quality are part of the search engine algorithms. The quality of an inbound link
refers to how important the webpage supplying the in-bound link is. Therefore,
an inbound link from a webpage on CNN would be worth far more
than an in-bound link from an unknown website with a low Google PageRank. We
will discuss the quality of links later on in more detail.

What are the important aspects of a links program for a website?

There are two aspects to improving your in-bound links program.

 Increase the number of in-bound links: This can be done by starting a link exchange program
with other websites or by putting your own website's link on as many relevant sites as possible.
We will discuss how to develop an on-going link building campaign later on.
 Get in-bound links from better quality sites and authority sites: As mentioned earlier, the
quality of the in-bound links is important, as it affects the amount of benefit you get.

Some great ideas for locating websites that match your theme, and whom you
could send a reciprocal linking proposal to are:

1. The Google Directory: Navigate to the theme of your website in the Google
directory (http://dir.google.com). Locate websites that could exchange links with
you.

2. The Yahoo Directory: Navigate to the theme of your website in the Yahoo
directory (http://dir.Yahoo.com). Locate websites that could exchange links with
you.

3. Alexa.com provides a list of very high traffic sites at
http://www.alexa.com/site/ds/top_sites?ts_mode=lang&lang=en

4. www.linkmarket.net offers a free link exchange program with thousands of link
partners already enlisted.

5. www.gotlinks.com, which requires a code to be pasted into the link resources
page.

Your link building campaign must start with three essential sites that give good in-
bound linking benefits. These are the Google search engine (free submission), the
Yahoo Directory (paid submission), and the DMOZ directory (free submission).

These three sites are considered extremely important in terms of quality and help
the linked website get a boost in the SERPs of most major search engines.

What are authority sites and what significance do they have for sites linking
to them?

Search engines (especially Google) add a lot of importance to websites that they
consider "authority sites". Authority sites are masters on their subject and attract
free natural in-bound links due to the value of the content they have. The main
considerations for a website to fall under this prestigious category are:

 It should update its content very often (possibly every day).
 It should have a large number of webpages that offer useful information.
 It should stand out as a website that the search engines consider most important for its
keywords.
 The site should have abundant natural in-bound links related to its theme (or keywords).

Some examples of authority sites are:

www.amazon.com

www.myspace.com

www.Craigslist.com

www.friendster.com

www.cnet.com

Search engines attach more weightage to links from authority sites, as they believe
that these sites carry more "voting power" for the sites they link to, due to
their special authority status. Therefore, try to get links from as many authority
sites as possible.

How can you get links from quality websites that will make a bigger difference
in the SERPs?

Authoritative sites that rank high in search engines are the most sought after for
getting link backs, as their vote in the form of an inbound link to your site is
considered more important by the search engines. The following are some
ways to get quality sites to link to you.

1. Get your website submitted to all major search engines such as Google, Yahoo,
Altavista and Lycos, which all accept free submissions.

2. Get your site listed in important directories, most importantly in the ODP
(www.dmoz.org).

3. Get a paid directory listing ($299 per annum) at the Yahoo directory. It will give
you an important boost in search engine rankings.

4. Search for your keywords in Google and send out link exchange requests to the
sites coming up in the first few pages. Look for sites with a Google PR of 5 or
above, or an Alexa ranking within the top 50,000.
5. Try to get your site submitted to non-profit trade or industry listings.

6. Give a press release related to your website. This will instantly get you a large
number of in-bound links from sites that will carry the news.

How to issue a press release and get instant in-bound links and traffic?

Posting or giving an online press release will get you instant traffic since the news is
syndicated to several websites instantly. This gives you immediate traffic and back
links from leading websites. The press release services require you to draft a small
news item (which could be anything that you would like to announce about your
website).

Some of the best options for publicizing your press release include:

www.24-7pressrelease.com (paid service)

www.prlog.org (free service)

www.prweb.com (paid service)

www.pressbox.co.uk (a free service)

Choose the Basic $89 payment option at PRWeb to get your news distributed to
top search engines like Google, Yahoo, and Bing.
Try to include some links back to your website (such as www.expertrating.com)
inside the news item so that you get direct links to your site wherever the news
item is released.

9.1 What tools can you use to increase in-bound links from other sites?

Apart from the links from authority sites discussed above, you must develop an on-
going system for exchanging links with other websites. Search engine algorithms
give more importance to the in-bound links factor than to any other criterion that
measures the relevancy of webpages. Besides building inbound links, one must
also know how to gauge one's website's page rank.

Page Rank tools can help you identify the quality of sites that link to you:

The Google toolbar

One of the most widely followed page rank tools is the PageRank button on the
Google Toolbar (toolbar.google.com). The button shows an indication of the site's
rank on a gauge. Clicking on the button gives options of viewing similar pages,
backward links, or a cached snapshot of the page Google used when indexing it.
Keywords that match the search are highlighted throughout the page to show how
Google determined its relevancy. The PageRank button has received criticism,
however, on its reliability. The rank does not reveal an actual number; rather, it
displays a visible amount on a gauge. Some find this amount outdated - new
inbound links are usually listed as back links after two monthly updates.
You can download the Google Toolbar at http://toolbar.google.com/. This toolbar
can show you the Google PR of any site that is currently being browsed. Using the
Google Toolbar, you can quickly identify the PR of webpages that have been
indexed by Google. Try to get links from pages with a good Google PageRank.
Getting links from pages with a PR of 4 or higher is considered good; a link from a
page with a Google PR of 6 or above is excellent and rarely comes free.

The Google and Yahoo advanced operators

The advanced operators have been discussed in detail in the chapter on search
engines. You can use the Google "link:" command to find which sites are linking to
your site and to your competitors' sites. After identifying suitable sites, you can send
out nicely worded proposals for link exchanges.

To find out the sites linking to xyz.com, enter "link:www.xyz.com" in the Google
search box to come up with a list of sites. Use "link:http://www.xyz.com" for Yahoo.

The Alexa Toolbar

Another top page rank tool, called Traffic Rank, is provided by Alexa (alexa.com).
Traffic Rank is a derived value based on user reach (the number of users who visit a
site) and page views (how many pages on that site the users visit). An Alexa Rank
of 236, for example, is excellent, since lower numbers are better; Google is
currently ranked #1 and Amazon is ranked #10.

Rank is determined using calculations based on three months of historical
aggregated data. Alexa also offers other features:
Traffic Trends graphs short-term traffic fluctuations using a three-day moving
average of daily traffic. The Movers and Shakers list shows which websites are
generating the greatest change in number of users. It can also compare the change
in reach across different websites.

Despite Alexa's in-depth data analysis, one should be aware of possible sources of
inaccuracy. Because the Web is so large and users tend to frequent the more
popular sites, it is difficult to rank websites with less traffic. Alexa cautions its users
not to rely on rankings of sites with fewer than 1,000 users a month. Also, the data
used for determining rank may not be representative of global Internet usage.

Download the Alexa toolbar free of charge at Alexa.com. The toolbar is a very
useful tool as it displays the Alexa traffic rankings for over a million websites. The
Alexa rankings are a relative measure of traffic (in comparison to other sites), and
can give you an instant indication of the popularity of any site.

Quantcast

Quantcast is another powerful tool that reveals lots of relevant information
regarding a website's status, including demographic information on visitors such as
age, household income, race and ethnic origin, as well as similar audiences and
audience keywords. It also provides data regarding audience composition, rank and
share of visits. You can submit a site to Quantcast to be "quantified" for free if the
site information isn't already available.
SEO Software

WebPosition (www.webposition.com) is online software that can help you locate
good link partners, provide optimization reports, and analyze how your site
compares with the competition. Prices start at $29 per month for a basic
membership.

SEO Power Suite (www.link-assistant.com) is a top rated suite of software that
includes RankTracker, Website Auditor, SEO SpyGlass and Link Assistant. The
downloadable software is currently priced at $249 for the professional version
intended for individual webmasters, and $599 for the enterprise version which
includes additional client reporting functions for SEO professionals.

Free Tools
SEO Book (http://tools.seobook.com) offers free Firefox extensions such as their
SEO Toolbar, Rank Checker, and SEO for Firefox in addition to other free web based
SEO tools including a keyword generator and meta tag generator.

What should you look for in a themed links exchange?

There is no doubt that search engines attach added importance to in-bound links
from websites with a similar theme, apart from the fact that such links bring in
valuable traffic to your website. Look out for the following things in a website
before deciding whether or not it fits into your choice of a themed links partner.

- Same nature of site.

The website should address the same audience. This aspect is at the center of the
themed links exchange concept. Ensure that you exchange links with websites
which attract the same or a similar audience. If your website is
about Yamaha motorbikes, you should not exchange links with a website selling
chocolates.

- The link should not be more than one click away from the home page.

Ensure that the site linking to you is offering you a link from the home page or from
a links page that is no more than 1 link away from the home page.

- The links page should not be cluttered.

The benefits of an in-bound link (especially the Google PR) diminish as more links
are placed on the page containing the inbound link. Ensure that your links partner's
links page has no more than 40 other links. Pages with more than 80 links could
even be viewed as a link farm and attract a penalty from the search engines.
Remember to incorporate all these aspects in your own websites links page as well.

- Check the Google PR.

While short listing websites for exchanging links, try to ensure that the website has
a Google PR that is higher than your own. Do not link to sites with a Google PR of 1
or zero, as they could have been banned. Sites with a Google PR of 4 or higher are
good candidates for exchanging links with.

- Check the Alexa Ranking.

You can check the Alexa rankings of any website at Alexa.com. Try to pick websites
that are in the top 100,000 websites in terms of Alexa rankings.

You can view a graph showing the past traffic for a website at Alexa.com. The
graph can reveal if a site is growing.

You can use the Wayback Machine to see what a website looked like earlier. This is
available at http://www.archive.org/web/web.php
How can you find sites that could offer you in-bound links for raising your
links popularity from the search engines?

There are hundreds of thousands of sites that offer pages where website owners
can submit descriptions and links to their websites. Placing links on other sites,
whether through link exchange or otherwise, is the most important way of
increasing links popularity for free.

Here are some tips for finding sites that offer pages for submitting links.

 Search for the keyword "add URL" or its synonyms, such as "submit URL",
"submit site" or "submit link", in Google. You will come up with webpages that
contain these terms, and very often these turn out to be pages where you can add
your link.
 Search for URLs in Google that appear to be submission pages, such as
submitlink.html, addURL.html, addURL.asp or submitsite.html. Try
variations with the words separated by hyphens or underscores.
 Try the above searching techniques in other search engines, since every search
engine has its own search formula and will offer different results.

9.2 Inbound links and Class C IP

An Internet Protocol address is a distinctive address assigned to a computer or a
network (http://en.wikipedia.org/wiki/IP_address). Any group of addresses that
have the same first three octets (with the first ranging between 192 and 223) is
considered a Class C network. An address in a Class C network represents
'network.network.network.local.' Multiple domain names can correspond to the
same IP address, meaning the domains are using the same server. Many domains
share the same IP address(es) because their hosts have only a few or just one Class
C network.

Websites may benefit from checking the IP addresses of their inbound links to
ensure that the links are not owned by the same group. If two IP addresses are on
the same Class C network, they may be on the same server. This can be an indicator
of common ownership. Awareness of the addresses can help ensure the
uniqueness of inbound links; however, being in the same Class C network does not
necessarily mean the links belong to the same party. Large hosts have many C
networks, and websites registered to different organizations may end up on the
same one by chance. The biggest concern is for websites whose inbound links
substantially belong to the same C network, or when there is considerable linking
between many sites in that same network. Search engines may red flag or even ban
these sites. Holding inbound links from a variety of Class C networks may help
prevent such obstacles.
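
As a quick local check, comparing the first three octets of two addresses is enough to tell whether they fall in the same Class C (/24) block. Here is a minimal Python sketch, using documentation-range example addresses:

def same_class_c(ip_a, ip_b):
    # Two IPv4 addresses share a Class C (/24) block when their
    # first three octets are identical.
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_class_c("203.0.113.10", "203.0.113.77"))    # True: same block
print(same_class_c("203.0.113.10", "198.51.100.77"))   # False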

There are many websites where one can check the Class C status of websites. The
following sites may be helpful:

http://www.seochat.com/?option=com_seotools&tool=35

http://www.webrankinfo.com/english/tools/class-c-checker.php

http://www.ip-report.com/

Does the internal link popularity of a website affect rankings?

Yes, the webpages within a site pass a vote from one to the other and help
distribute link popularity within the site. It is therefore a good idea to get important
pages linked from other webpages that are receiving in-bound links from other
sites. Keep in mind the following points:

 Try to link all important pages amongst themselves within the website.
 If a page is many levels down within the website but is important, try to link to it
from other popular pages of the website.
 Try to refrain from adding multiple external links on several pages of the website,
as this dilutes the Google PR of the webpages that contain the links. If you are
adding a links page or a resources page to your website, just have it linked from
the home page, and not from other pages of the site.
 Minimize your out-bound links throughout the site.
 Do not link the "links partners page" from every page of the site. Many sites make
the mistake of linking it from the footer or header of every page.

Try to maintain as much Google PR within the website as possible by adding fewer
external links and adding the fresh in-bound links to pages with a lower Google PR.
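
To see why internal links redistribute importance, consider the classic simplified PageRank iteration, PR(p) = 0.15 + 0.85 * sum of PR(q)/C(q) over the pages q linking to p, where C(q) is the number of out-links on q. The Python sketch below runs it on an invented three-page site; Google's production formula is far more involved:

links = {                        # page -> pages it links out to
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home"],
}
pr = {page: 1.0 for page in links}

for _ in range(20):              # iterate until the values settle
    pr = {page: 0.15 + 0.85 * sum(pr[q] / len(links[q])
                                  for q in links if page in links[q])
          for page in links}

print(pr)   # "home" scores highest: it receives the most internal votes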

What is link anchor text and why is it important?

Anchor text is one of the most important external optimization factors. Anchor text
is the portion of a hyperlink that is viewed by a visitor to a webpage. This is usually
seen as the clickable text which is underlined in blue. Major search engines,
including Google attach a lot of importance to the anchor text of the hyperlink
going to a website. If the anchor text includes the website keywords, it is treated as
a good link.

Here is an example of anchor text:

<a href="http://www.expertrating.com">Online Certification and Employment
Testing</a>

In the above example, "Online Certification and Employment Testing" is the
anchor text and includes the two most important keywords related to the
expertrating.com website, namely "Online Certification" and "Employment
Testing".

What should the anchor link text be?


Use the exact keyword/s for which you are targeting your webpage. Remember that
the same keywords should also be present in the webpage you are getting linked to
so that the search engines can identify the anchor text with the webpage.

What factors should be kept in mind while designing an email to request a
links exchange?

All requests for link exchange should be short and simple and devoid of excessive
marketing language or hype about the advantages of reciprocating the link. A
suitable title is:

"Request for links exchange with XYZ.com" (where XYZ.com is the name of the
website on which you would like your link placed.)

Here is an example of a good links exchange proposal:

Hello,

I recently visited XYZ.com and found it a wonderful resource on college education. I
have added a link to your site on our resources page at
http://www.123collegeedu.com/res with the following description:

XYZ.com offers information and counselling for hundreds of colleges in the US.

I would be very grateful if you could reciprocate the link on your site as well.
123collegeedu.com offers information on college loans from over 50 leading loan
providers and receives substantial traffic from the undergraduate community in the
U.S. My website currently has an Alexa rank of 50,000, with traffic growing at 20%
per year.

Kindly link to my website using the following information:

www.123collegeedu.com offers student loans from over 50 providers at the best
possible rates.
Please let me know if you need any further information.

Regards,

Sam Woodford

(COO, www.123collegeedu.com)

Which page/s should I ask websites to link to for getting maximum linking
benefit for the whole site?

In general, in-bound links should be invited on pages which are strategically placed
within the website and which further link to other pages of the site. The pages which
receive most of the in-bound links will have the highest Google PR within the
website; therefore, they should be linked to other pages of the site so that the PR
can flow throughout the site. For most sites, the homepage is the chosen page for
getting linked to. Search engines also attach more importance to the home page,
making its chances of ranking well in the SERPs better.
Try to get the most links on your homepage, as the Google PageRank will flow
downwards throughout the website.

9.3 Is there something called link filters in Google?

Since Google keeps its search engine formula a secret, it is hard to say anything
with certainty. But it is widely believed that Google has initiated a dampening filter
on all new in-bound links. What this means is that the Google PR will not be fully
passed to a website as soon as the link is added, instead it will gradually flow over a
period of time. This has been done to dissuade people from buying high PR links in
order to attain a high PR for their websites.

Can you buy text links?

Some website owners resort to buying high PR text links as a quick way of getting
indexed by Google and attaining a high PR. As you will discover, it is not easy
getting back links from PR5, PR6 or PR7 websites, but it can be achieved by
buying links from leading websites related to your theme. Even though Google may
frown upon this method of increasing PR, it is not penalized as such. A good PR 8
link can come for $250 per month, and a good PR 9 link for $1000 per month.
Some good sites where you can buy text links are:

www.linkadage.com

www.textlinkbrokers.com

Remember to carefully word the anchor text in the links that will be placed.
Most text link brokering companies give you a discount if you pay in advance for 3
months.

Is it a good idea to buy text links to boost Google PageRank?

Buying text links to improve PageRank is not a recommended strategy. Link-
based analysis is an important way to estimate the value of a website; paid links
decrease the quality of search results and erode the trust between search engines
and links. According to the Google Webmaster Central Blog, paid links can hurt
quality in two ways. First, there are inaccuracies: links can become popular through
efforts that are not based on merit or relevance. Second, there are inequities: paid
links disproportionately advance those websites with the most money.

Many websites have improved their PageRank with paid links; however, Google has
taken a strong stance on the issue, claiming that paid links "manipulate" and
"deceive" search engines. Google's "Quality Guidelines" clearly address this issue as
a basic principle, warning, "Don't participate in link schemes designed to increase
your site's ranking or PageRank." Among the listed schemes are excessive
reciprocal links, excessive link exchanging, and buying or selling links that pass
PageRank. To enforce this policy, Google discounts or removes websites which use
paid links to improve rankings, and it even has a system in place for users to report
them. Websites which sell their links will not have their PageRank lowered nor will
they be removed, but these sites put their reputation at risk. This stance on paid
links is not unique to Google. Other leading search engines, including Yahoo!, MSN,
and Ask also look down upon using paid links to inflate search rankings.

Creating a network of your websites to boost your links program

A good way of boosting your links program, if you are unable to find link exchanges
or other link-back offers, is to develop your own microsites and get links from them.
Keep in mind the following factors while developing the microsites.

 Develop microsites to cater to a sub-topic of your main website so that all visitors
to the micro-site are also interested in the topic of your website.
 Develop microsites on topics that would generate relevant traffic for the main site,
but target keywords that are not targeted on the main site. This will bring in
fresh traffic from keywords which are also relevant to the main site. For
example, if you have a site on kids' IQ tests for 4-6 year olds, you could develop
microsites on topics such as dyslexia, learning aids for kids, school admission
information, etc.
 Ensure that each site is submitted to the major search engines and directories.
 Link to the main site from every page of the microsite using a suitable anchor text.
Ensure that the same keyword is available in the landing page of the link in the
main site.
 Link up all the microsites so that all the pages get indexed.
 Do not attempt to create any duplicate content in the micro sites as it will be
viewed as search engine spam.

Can reciprocal links be bad for your website?

There has been recent news that Google looks down upon reciprocal links.
However, if the reciprocal links are between websites with a similar theme or topic,
there is no reason for worry.

What are natural links?

You may often come across the term natural links. These are the best type of links
and have the following features:

 They are one way links.
 They are between websites that have a related theme.
 They are voluntary in nature.

Is it a good idea to join link farms?

Link farms are a vast network of webpages, usually spread over several sites. Link
farm members are required to add a webpage containing multiple links to other
webpages, which in turn contain more links. This is an artificial way of boosting the
link popularity of a website, but it is easily detected by the search engines and
treated as spam. This method should never be adopted for boosting link popularity.

Benefits of Blogging for SEO

Structured Content

Blog content can be structured, meaning that it can be published in an
XML or XHTML format which other machines and services can read. Most blog
software allows users to write in any code they desire. This enables the
organization of blogs by the nature of their content if the blogs are written in the
same code. For example, there are blogs about all sorts of information, such as
events and reviews. If blogs concerning the same topic are written in a common,
structured format, search engines recognize this and can aggregate the event
blogs separately from the review blogs, for example. Structured content effectively
promotes the blog in a search, thereby increasing the page rank of the main
website in queries related to the blog content.

Crawlable URLs

Blog software can make it easy for users to structure their own URLs including blog
title information. Search engine spiders can easily find the URL, and crawl its
content if the blog's title accurately reflects its substance. Facilitating the search
engine in this way can help the blog's homepage achieve a higher SERP rank.

Updated Content

Consistently updated blogs benefit internet users and websites alike. Blogs are a
good way for users to view new content and stay up-to-date, and websites can gain
from the visibility of a blog. Search engine spiders crawl updated content (the blog)
more frequently, which permits the refreshed master site to promptly appear on a
SERP. This advantage, of course, is fully realized only if the blog is updated often
enough, preferably three to five times a week.

Organic Traffic

Because of the constant interaction blogs foster in their online communities, they
naturally generate traffic from sources other than search engines. A good blog
becomes part of its community, promoting interaction between readers via
comment boards and trackback features. Interaction keeps readers coming back
for more, and they may endorse the blog by word of mouth, through email, or by
way of online forums. These are the types of links that search engines look for
when determining rank.

Another way that blogs develop organic traffic is through blogrolls. A blogger shares
the list of other blogs he or she follows, and established blogs often exchange links
with each other. This coordination among bloggers can generate substantial new
avenues of traffic for a website. Building a presence in an online community and
exchanging links with other blogs takes time. Additionally, creating considerable
organic traffic may require updating the blog every day, but the increase in traffic
will greatly help the rank of a blog's main webpage.
