How to increase search engine traffic
If you have a web site and you want to increase search engine traffic, this article is for you. Once you learn the methods described below, you will be able to repeat this feat again and again. The potential for such growth really exists, because the number of Internet users is constantly growing; if you do not attract them, your competitors will.
There are
a lot of ways to increase the number of visitors, but the most
promising one is drawing traffic from search engines. As a rule,
people who come to your site from Google or Yahoo belong to your target audience; they are not just a random army of surfers roaming the Internet.
How do you make search engines "like" you? How do you calculate the percentage of visitors who come to your site from popular search engines? AlterWind Log Analyzer Professional, a program that builds a wide range of reports from your log files, can answer these and other questions. Today we will try to tap the hidden search-traffic potential that can be uncovered with the help of the "Pages Not Visited from Search Engines" report.
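For readers who like to see the arithmetic behind such reports, here is a minimal Python sketch of the kind of calculation involved: the share of hits (not unique visitors) whose referrer is a popular search engine, read from an Apache combined-format access log. The file name access.log and the list of engines are assumptions for illustration; the analyzer computes this for you.

    import re

    # Matches the last two quoted fields of a combined log line: referrer and user agent.
    LOG_RE = re.compile(r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"\s*$')
    SEARCH_ENGINES = ("google.", "yahoo.", "msn.")  # substrings to look for in referrers

    total = from_search = 0
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_RE.search(line)
            if not match:
                continue
            total += 1
            if any(engine in match.group("referrer").lower() for engine in SEARCH_ENGINES):
                from_search += 1

    if total:
        share = 100 * from_search / total
        print(f"{from_search} of {total} hits ({share:.1f}%) came from search engines")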
First,
you should specify the time period for analyzing log files.
This period can vary a lot depending on the topics covered by
the site and on the number of visitors. For instance, a popular
site dealing with general topics may need only two or three days to collect a reliable sample of search queries, while a specialized resource may need two or three weeks or even more.
Use the
"General" template to analyze your log files in
AlterWind Log Analyzer
Professional. This template allows you to get as
much information as possible out of your log files. Note that
if the site contains a lot of pages or other data, you should increase the line limit for the reports you are interested in: only the first 300 lines are included by default.
Currently
we are interested in the "Pages Not Visited from
Search Engines" report located in the "Access
Statistics" folder. It contains the list of pages
that had no hits from search engines for the period in question.
The list is sorted by the number of visitors - the more people
viewed a certain page, the closer it is to the top. The question
is why such popular pages do not attract visitors from search
engines.
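To make the idea behind this report concrete, here is a rough Python sketch of the same logic applied directly to a raw combined-format access log: count visits per page, remember which pages ever had a search-engine referrer, and print the most visited pages that never did. The log file name and the engine list are illustrative assumptions.

    import re
    from collections import Counter

    # Request path, then status and size, then the quoted referrer (combined log format).
    LOG_RE = re.compile(r'"\S+ (?P<path>\S+)[^"]*" \d{3} \S+ "(?P<referrer>[^"]*)"')
    SEARCH_ENGINES = ("google.", "yahoo.", "msn.")

    visits = Counter()
    seen_from_search = set()
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_RE.search(line)
            if not match:
                continue
            visits[match.group("path")] += 1
            if any(e in match.group("referrer").lower() for e in SEARCH_ENGINES):
                seen_from_search.add(match.group("path"))

    # The most popular pages that search engines never sent a single visitor to.
    for path, count in visits.most_common():
        if path not in seen_from_search:
            print(count, path)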
Study the list as thoroughly as you can. It is sure to include some auxiliary pages that hold nothing of interest for visitors coming from search engines; that is normal. But if you find some of your site's key pages there, it is time to sound the alarm! The same applies to pages at the very bottom of the "Entry Resources from Search Engines" list. Write these pages down and do some research.
The reason may be as simple as this: search engines just do not know about the page. Perhaps there are no links pointing to it, either from your own site or from other sites. If you are sure that links do exist, check them: they may be invalid (for example, because of a webmaster's error), or they may simply go unnoticed by search spiders (for example, when they are buried in complicated menus written in JavaScript).
Let us get some help from AlterWind Log Analyzer Professional and see whether search spiders visit the page. You can do it with filters - an indispensable tool for analyzing particular aspects of how the site works. Create a new template, add a filter of the "Include" type and select "Spiders" from the drop-down list on the "Filters" page. Now add the string to filter search spiders by. We are primarily interested in Google, so add the "Googlebot" string to the filter (that is the name of this search engine's spider). By the way, if you are interested in one particular search engine from the start, you can add some simple rules to the previous "General" report to exclude hits from the other search engines. To do this, enable the "Hits with specific referring URLS" filter of the "Exclude" type and add a string for each search engine you want to exclude from the analysis, e.g. *yahoo.com*, *msn.com*, etc.
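If you prefer to double-check the filter's result against the raw log, a few lines of Python approximate the same "Include: Spiders" rule by matching the user-agent field of a combined-format log. The file name is an assumption.

    import re

    # The last two quoted fields of a combined log line: referrer and user agent.
    LINE_RE = re.compile(r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"\s*$')

    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.search(line)
            if match and "googlebot" in match.group("agent").lower():
                print(line.rstrip())  # keep only hits made by Googlebot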
After AlterWind Log Analyzer Professional processes the log files with the specified filters applied, the "General Statistics" page will show you the overall figures: the number of spider visits, the average page viewing time and the total traffic. Now let us find out which pages Googlebot visited - they are listed on the "Pages" page of the "Access Statistics" folder. Here we need the list of key pages without search engine hits that we made earlier - compare it with what you see now. If the "Pages" report contains pages from that list, it means the spider knows about them, but their content is being indexed incorrectly. We will dwell on this case a bit later in the article.
You can make the analysis even narrower if you also enable the "Hits that requested specific resources" filter for a certain page. Then the reports in the "Activity Statistics" folder will tell you when the spider visited this page for the first time, how often it explores its content and when it last requested it.
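The same questions can be asked of the raw log directly. The sketch below, assuming an Apache combined log and a hypothetical page path, reports when Googlebot first and last requested the page and how many times overall.

    import re
    from datetime import datetime

    PAGE = "/articles/keywords.html"  # hypothetical page of interest
    LOG_RE = re.compile(
        r'\[(?P<time>[^\]]+)\] "\S+ (?P<path>\S+)[^"]*" \d{3} \S+ '
        r'"[^"]*" "(?P<agent>[^"]*)"'
    )

    times = []
    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_RE.search(line)
            if not match or match.group("path") != PAGE:
                continue
            if "googlebot" not in match.group("agent").lower():
                continue
            times.append(datetime.strptime(match.group("time"), "%d/%b/%Y:%H:%M:%S %z"))

    if times:
        print("first request:", min(times))
        print("last request: ", max(times))
        print("total requests:", len(times))
    else:
        print("Googlebot never requested", PAGE)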
If you find out that spiders ignore some pages of your site, the true reason may not be easy to discover. Here are just a few possible explanations:
- Invalid links to the page - check them at
http://validator.w3.org/checklink
- The search spider cannot detect links (for example, if
they are located in a menu written in JavaScript)
- The page is located too deep in the structure of the site.
- Indexing the page or following the link is forbidden (the robots.txt file in the root directory, the robots meta tag, the rel="nofollow" attribute, etc.) - the robots.txt case is easy to check, as shown right after this list
- The page is too large
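The robots.txt case from the list above is the easiest one to rule out programmatically; the Python standard library can fetch and interpret the file for you. The site and page URLs below are placeholders.

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("http://www.site.com/robots.txt")
    parser.read()  # downloads and parses robots.txt

    page = "http://www.site.com/some-key-page.html"
    if parser.can_fetch("Googlebot", page):
        print("robots.txt allows Googlebot to fetch", page)
    else:
        print("robots.txt forbids Googlebot from fetching", page)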
The other
case is more common: spiders know that the page exists and
explore its content on a regular basis, but search engines
do not display it among the results of search queries.
One of the reasons can be invalid markup that prevents spiders from indexing the content of the page correctly. That is why you should validate the site with an HTML validation service. It can be an online service (e.g. http://validator.w3.org/) or a stand-alone utility (e.g. the free software from www.htmlvalidator.com). Validation may uncover serious errors that are invisible when you view the page in a browser but that interfere with the work of the search spider.
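The services above are the authoritative way to validate markup. As a quick local supplement, not a replacement, the sketch below uses Python's standard HTML parser to flag tags that are opened but never closed, one of the errors that can trip up a spider. The file name page.html is a placeholder.

    from html.parser import HTMLParser

    VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
            "link", "meta", "param", "source", "track", "wbr"}

    class TagBalanceChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.open_tags = []

        def handle_starttag(self, tag, attrs):
            if tag not in VOID:
                self.open_tags.append((tag, self.getpos()[0]))

        def handle_endtag(self, tag):
            # Pop the innermost matching open tag, if there is one.
            for i in range(len(self.open_tags) - 1, -1, -1):
                if self.open_tags[i][0] == tag:
                    del self.open_tags[i]
                    return
            print(f"line {self.getpos()[0]}: </{tag}> has no matching opening tag")

    checker = TagBalanceChecker()
    with open("page.html", encoding="utf-8", errors="replace") as page:
        checker.feed(page.read())
    for tag, line in checker.open_tags:
        print(f"line {line}: <{tag}> is never closed")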
Another reason is neglecting SEO principles when writing the text and the source code of pages. Two main factors influence the ranking of a page in search engines: the number of external links to the page and the content of the page itself. Here we will analyze the second factor and learn how to optimize the text of pages to draw maximum traffic from search engines, relying on the reports from Site Content Analyzer.
First of all, determine the key phrases your prospects will use in their search queries. Make a list of them and analyze the pages of the site with Site Content Analyzer now, so that you can compare the results after optimization.
The main
parameters influencing the position of a page in the results
of a certain search query are the weight and the density of
the key phrase in the text of the page. Weight is an aggregate indicator showing how significant a keyword or key phrase is for the page. It is derived from various parameters whose calculation algorithms also vary, so there is no single "absolute" value.
Density is a relative value showing how frequently a particular word or phrase occurs on the page compared with the other words. The more often the word appears on the page and the fewer words the page contains, the higher its density, and vice versa. The density of a keyword on a page is usually between 2% and 15%: smaller values are too insignificant for it to count as a keyword, while larger values may make search engines regard the page as a spam attempt.
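The density definition above is simple enough to compute by hand. Here is a small Python illustration: occurrences of a phrase, counted word by word, divided by the total number of words on the page. It follows one common counting convention (each word of a matched phrase counts toward density); the sample text and phrase are placeholders.

    import re

    def keyword_density(text: str, phrase: str) -> float:
        words = re.findall(r"[\w'-]+", text.lower())
        phrase_words = phrase.lower().split()
        if not words or not phrase_words:
            return 0.0
        # Count occurrences of the phrase as a consecutive word sequence.
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return 100.0 * hits * len(phrase_words) / len(words)

    sample = "Log analyzer software helps you study logs. A good log analyzer saves time."
    print(f"{keyword_density(sample, 'log analyzer'):.1f}%")  # 30.8% for this sample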
The greater the weight of a word or phrase, the more relevant your page will be when that phrase is used in a search query. So the essence of optimization is to increase the weight of your key phrases. The following simple tips can help you do that.
The most important thing is to use the keyword as often as possible, but do not overdo it, because search engines may treat the page as spam. The key phrase must be present in the title of the page (the TITLE tag) and in the description and keywords (the DESCRIPTION and KEYWORDS meta tags); it is also recommended to use it in headings (the H1, H2, etc. tags), in alternative text for images (the ALT attribute) and in the text of links.
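If you want to verify quickly where a key phrase actually appears in a page's markup, a short script with Python's standard HTML parser can do it. The checks below cover the title, the description and keywords meta tags, headings, image alt text and link text; the file name and key phrase are placeholders.

    from html.parser import HTMLParser

    PHRASE = "log analyzer"  # hypothetical key phrase
    HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

    class PlacementChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.current = None   # text-bearing tag we are currently inside
            self.found = set()

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title" or tag == "a" or tag in HEADINGS:
                self.current = tag
            elif tag == "meta" and (attrs.get("name") or "").lower() in ("description", "keywords"):
                if PHRASE in (attrs.get("content") or "").lower():
                    self.found.add("meta " + (attrs.get("name") or "").lower())
            elif tag == "img" and PHRASE in (attrs.get("alt") or "").lower():
                self.found.add("img alt")

        def handle_endtag(self, tag):
            if tag == self.current:
                self.current = None

        def handle_data(self, data):
            if self.current and PHRASE in data.lower():
                self.found.add("link text" if self.current == "a" else self.current)

    checker = PlacementChecker()
    with open("page.html", encoding="utf-8", errors="replace") as page:
        checker.feed(page.read())
    print("key phrase found in:", ", ".join(sorted(checker.found)) or "nowhere")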
Making
key phrases italic or bold in the main text also helps to
increase their weight. If your content management system
(CMS) allows you to specify any URLs, include key words
in the URL of this page, e.g.
www.site.com/you-see-keyword1-and-keyword2-there.
After
you optimize your pages, analyze the site with
Site Content
Analyzer once again. Look at the new values
of the weight and density of your key phrases and words -
they are sure to have increased! Now these pages will
become more attractive to search engines. We hope that
the pages that have had no hits from search engines so
far will become the leaders in the "Entry Resources
from Search Engines" report!