Marketing Through Search Optimization
How people search and how to be found
on the Web
Second edition
The right of Alex Michael and Ben Salter to be identified as the authors of
this work has been asserted in accordance with the Copyright, Designs
and Patents Act 1988
Permissions may be sought directly from Elsevier’s Science & Technology Rights
Department in Oxford, UK: phone: (+44) (0) 1865 843830; fax: (+44) (0) 1865 853333;
email: permissions@elsevier.com. Alternatively you can submit your request online
by visiting the Elsevier web site at http://elsevier.com/locate/permissions, and selecting
‘Obtaining permission to use Elsevier material’.
Notice
No responsibility is assumed by the publisher for any injury and/or damage to
persons or property as a matter of products liability, negligence or otherwise,
or from any use or operation of any methods, products, instructions or ideas
contained in the material herein.
Acknowledgements
We would both like to thank Sprite Interactive Ltd for their support with this book.
Introduction
Search engines provide one of the primary ways by which Internet users find websites. That’s why
a website with good search engine listings may see a dramatic increase in traffic. Everyone wants
those good listings. Unfortunately, many websites appear poorly in search engine rankings, or may
not be listed at all because they fail to consider how search engines work. In particular, submitting
to search engines is only part of the challenge of getting good search engine positioning. It’s also
important to prepare a website through ‘search engine optimization’. Search engine optimization
means ensuring that your web pages are accessible to search engines and are focused in ways that
help to improve the chances that they will be found.
This book provides information, techniques and tools for search engine optimization. This book
does not teach you ways to trick or ‘spam’ search engines. In fact, there is no such search engine
magic that will guarantee a top listing. However, there are a number of small changes you can
make that can sometimes produce big results.
The book looks at the two major ways search engines get their listings:
• Crawler-based search engines
• Human-powered directories

Crawler-based search engines
Crawler-based search engines, such as Google, create their listings automatically: they crawl (or ‘spider’) the Web, and people then search through what they have found. If you
change your web pages, crawler-based search engines eventually find these changes, and that can
affect how you are listed. This book will look at the spidering process and how page titles, body
copy and other elements can all affect the search results.
Human-powered directories
A human-powered directory, such as Yahoo! or the Open Directory, depends on humans for its
listings. The editors at Yahoo! will write a short description for sites they review. A search looks
for matches only in the descriptions submitted.
Changing your web pages has no effect on your listing. Things that are useful for improving a
listing with a search engine have nothing to do with improving a listing in a directory. The only
exception is that a good site, with good content, might be more likely to get reviewed for free
than a poor site.
Crawler-based search engines have three major parts. The first is the spider (also called the crawler), which visits web pages, reads them, and follows links to other pages; everything the spider finds goes into the second part of the search engine, the index.

The index, sometimes called the catalog, is like a giant book containing a copy of every web page
that the spider finds. If a web page changes, then this book is updated with new information.
Sometimes it can take a while for new pages or changes that the spider finds to be added to the
index, and thus a web page may have been ‘spidered’ but not yet ‘indexed’. Until it is indexed –
added to the index – it is not available to those searching with the search engine.
Search engine software is the third part of a search engine. This is the program that sifts through
the millions of pages recorded in the index to find matches to a search and rank them in order
of what it believes is most relevant.
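To make these three parts concrete, here is a minimal sketch in Python of how a spider’s output might feed an index that the ranking software then searches. It is purely illustrative: the pages, URLs and the crude frequency-based scoring are invented for the example, and no real engine works this simply.

    # Illustrative sketch of the spider/index/software split described above.
    # The pages, URLs and scoring rule are invented for the example.
    from collections import defaultdict

    pages = {                                    # what a spider might have fetched
        'http://example.com/seo': 'search engine optimization tips for marketing',
        'http://example.com/ppc': 'pay per click marketing guide',
    }

    index = defaultdict(dict)                    # the 'catalog': word -> {url: frequency}
    for url, text in pages.items():
        for word in text.lower().split():
            index[word][url] = index[word].get(url, 0) + 1

    def search(query):
        # The 'software' part: score pages by how often the query words appear.
        scores = defaultdict(int)
        for word in query.lower().split():
            for url, freq in index.get(word, {}).items():
                scores[url] += freq
        return sorted(scores, key=scores.get, reverse=True)

    print(search('marketing optimization'))      # most relevant URL first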
All crawler-based search engines have these basic parts, but each is tuned differently, which is why the same search on different engines often produces different results.
To begin with, some search engines index more web pages than others. Some search engines
also index web pages more often than others. The result is that no search engine has the exact
same collection of web pages to search through, and this naturally produces differences when
comparing their results.
One of the main rules in the ranking recipe is the location and frequency of keywords on a page: pages where the search terms appear in the title, near the top of the page, or more frequently than on other pages are generally assumed to be more relevant.

Many web designers mistakenly assume that META tags are the ‘secret’ in propelling their web
pages to the top of the rankings. However, not all search engines read META tags. In addition,
those that do read META tags may choose to weight them differently. Overall, META tags can
be part of the ranking recipe, but they are not necessarily the secret ingredient.
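As a small illustration of what a crawler actually sees when it does read META tags, the sketch below pulls the description and keywords tags out of a page using Python’s standard library. The HTML is an invented example, and nothing here reflects how any particular engine weights these tags.

    # Illustrative only: a crude way a crawler might read META tags from a page.
    # The example HTML is invented; real engines parse far more than this.
    from html.parser import HTMLParser

    class MetaReader(HTMLParser):
        def __init__(self):
            super().__init__()
            self.meta = {}

        def handle_starttag(self, tag, attrs):
            if tag == 'meta':
                attrs = dict(attrs)
                if 'name' in attrs and 'content' in attrs:
                    self.meta[attrs['name'].lower()] = attrs['content']

    html = ('<html><head>'
            '<meta name="description" content="Search optimization advice">'
            '<meta name="keywords" content="SEO, search, marketing">'
            '</head><body>Page copy goes here.</body></html>')

    reader = MetaReader()
    reader.feed(html)
    print(reader.meta)    # {'description': ..., 'keywords': 'SEO, search, marketing'}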
Search engines may also penalize pages, or exclude them from the index, if they detect search
engine ‘spamming’. An example is when a word is repeated hundreds of times on a page, to
increase the frequency and propel the page higher in the listings. Search engines watch for
common spamming methods in a variety of ways, including following up on complaints from
their users.
Off-the-page factors
Crawler-based search engines have plenty of experience now with webmasters who constantly
rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters may
even go to great lengths to ‘reverse engineer’ the location/frequency systems used by a particular
search engine. Because of this, all major search engines now also make use of ‘off-the-page’
ranking criteria.
Off-the-page factors are those that a webmaster cannot easily influence. Chief among these is link
analysis. By analysing how pages link to each other, a search engine can determine both what a
page is about and whether that page is deemed to be ‘important’, and thus deserving of a ranking
boost. In addition, sophisticated techniques are used to screen out attempts by webmasters to
build ‘artificial’ links designed to boost their rankings.
Another off-the-page factor is click-through measurement. In short, this means that a search
engine may watch which results someone selects for a particular search, then eventually drop
high-ranking pages that aren’t attracting clicks while promoting lower-ranking pages that do pull
in visitors. As with link analysis, systems are used to compensate for artificial clicks generated by eager webmasters.
Chapter 1
Introduction to search engine optimization
To implement search engine optimization (SEO) effectively on your website you will need to understand what people looking for your site are searching for, what your own needs are, and how best to meet both. Each SEO campaign is different, depending on a number of
factors – including the goals of the website, and the budget available to spend on the SEO. The
main techniques and areas that work today are covered throughout this book, but initially Chapter 1 will take you through the
background to search optimization. First of all we will look at the history of search engines, to
give you a context to work in, and then we’ll take a look at why people use search engines,
what they actually search for when they do, and how being ranked highly will benefit your
organization. Next we will provide a critical analysis of choosing the right SEO consultancy (if
you have to commission an external agency).
A brief history of search engines

In the early days of the Internet, before the Web and search engines, files were shared over FTP (File Transfer Protocol), and anyone
who wanted to share a file had first to set up an FTP server to make the file available. The only
way people could find out where a file was stored was by word-of-mouth; someone would have
to post on a message board where a file was stored.
The first ever search engine was called Archie, and was created in 1990 by a man called
Alan Emtage. Archie was the solution to the problem of finding information easily; the engine
combined a data gatherer, which compiled site listings of FTP sites, with an expression matcher
that allowed it to retrieve files from a user typing in a search term or query. Archie was the first
search engine; it ‘spidered’ the Internet, matched the files it had found with search queries, and
returned results from its database.
In 1993, with the success of Archie growing considerably, the University of Nevada developed
an engine called Veronica. These two became affectionately known as the grandfather and
grandmother of search engines. Veronica was similar to Archie, but was for Gopher files rather
than FTP files. Gopher servers contained plain text files that could be retrieved in the same way
as FTP files. Another Gopher search engine also emerged at the time, called Jughead, but this
was not as advanced as Veronica.
The next major advance in search engine technology was the World Wide Web Wanderer,
developed by Matthew Gray. This was the first ever robot on the Web, and its aim was to track
the Web’s growth by counting web servers. As it grew it began to count URLs as well, and this
eventually became the Web’s first database of websites. Early versions of the Wanderer software
did not go down well initially, as they caused loss of performance as they scoured the Web and
accessed single pages many times in a day; however, this was soon fixed. The World Wide Web
Wanderer was called a robot, not because it was a robot in the traditional sci-fi sense of the
word, but because on the Internet the term robot has grown to mean a program or piece of
software that performs a repetitive task, such as exploring the net for information. Web robots
usually index web pages to create a database that then becomes searchable; they are also known
as ‘spiders’, and you can read more about how they work in relation to specific search engines in
Chapter 4.
After the development of the Wanderer, a man called Martijn Koster created a new type of web
indexing software that worked like Archie and was called ALIWEB. ALIWEB was developed
in the summer of 1993. It was evident that the Web was growing at an enormous rate, and
it became clear to Martijn Koster that there needed to be some way of finding things beyond
the existing databases and catalogues that individuals were keeping. ALIWEB actually stood
for ‘Archie-Like Indexing of the Web’. ALIWEB did not have a web-searching robot; instead
of this, webmasters posted their own websites and web pages that they wanted to be listed.
ALIWEB was in essence the first online directory of websites; webmasters were given the
opportunity to provide a description of their own website and no robots were sent out, resulting
in reduced performance loss on the Web. The problem with ALIWEB was that webmasters
had to submit their own special index file in a specific format for ALIWEB, and most of them either did not understand how to create this file or did not bother to learn. ALIWEB therefore
suffered from the problem that people did not use the service, as it was only a relatively small
directory. However, it was still a landmark, having been the first database of websites that
existed.
The World Wide Web Wanderer inspired a number of web programmers to work on the
idea of developing special web robots. The Web continued growing throughout the 1990s, and
more and more powerful robots were needed to index the growing number of web pages. The
main concept behind spiders was that they followed links from web page to web page – it was
logical to assume that every page on the Web was linked to another page, and by searching
through each page and following its links a robot could work its way through the pages on
the Web. By continually repeating this, it was believed that the Web could eventually be
indexed.
At the end of December 1993 three search engines were launched that were powered by these
advanced robots; these were the JumpStation, the World Wide Web Worm, and the Repository
Based Software Engineering Spider (RBSE). JumpStation is no longer in service, but when it
was it worked by collecting the title and header from web pages and then using a retrieval system
to match these to search queries. The matching system searched through its database of results
in a linear fashion and became so slow that, as the Web grew, it eventually ground to a halt.
The World Wide Web Worm indexed titles and URLs of web pages, but like the JumpStation
it returned results in the order that it found them – meaning that results were in no order of
importance. The RBSE spider got around this problem by actually ranking pages in its index
by relevance.
All the spiders that were launched around this time, including Architext (the search software that
became the Excite engine), were unable to work out actually what it was they were indexing;
they lacked any real intelligence. To get around this problem, a product called EINet Galaxy was
launched. This was a searchable and browsable directory, in the same way Yahoo! is today (you
can read more about directories in Chapter 4). Its website links were organized in a hierarchical
structure, which was divided into subcategories and further subcategories until users got to the
website they were after. Take a look at the Yahoo! directory for an example of this in action today.
The service, which went live in January 1994, also contained Gopher and Telnet search features,
with an added web page search feature.
The next significant stage came with the creation of the Yahoo! directory in April 1994, which
began as a couple of students’ list of favourite web pages, and grew into the worldwide phe-
nomenon that it is today. You can read more about the growth of Yahoo! in Chapter 4 of this
book, but basically it was developed as a searchable web directory. Yahoo! guaranteed the quality
of the websites it listed because they were (and still are) accepted or rejected by human editors.
The advantage of directories, as well as their guaranteed quality, was that users could also read
a title and description of the site they were about to visit, making it easier to make a choice to
visit a relevant site.
The first advanced robot, which was developed at the University of Washington, was called
WebCrawler (Figure 1.1). This actually indexed the full text of documents, allowing users to
search through this text, and therefore delivering more relevant search results.
WebCrawler was eventually adopted by America Online (AOL), who purchased the system.
AOL ran the system on its own network of computers, because the strain on the University of
Washington’s computer systems had become too much to bear, and the service would have been
shut down otherwise. WebCrawler was the first search engine that could index the full text of
a page of HTML; before this all a user could search through was the URL and the description
of a web page, but the WebCrawler system represented a huge change in how web robots
worked.
The next two big guns to emerge were Lycos and Infoseek. Lycos had the advantage in the sheer
size of documents that it indexed; it launched on 20 July 1994 with 54 000 documents indexed,
and by January 1995 had indexed 1.5 million. When Infoseek launched it was not original in its
technology, but it sported a user-friendly interface and extra features such as news and a directory,
which won it many fans. In 1999, Disney purchased a 45 per cent stake of Infoseek and integrated
it into its Go.com service (Figure 1.2).
In December 1995 AltaVista came onto the scene and was quickly recognized as the top search
engine due to the speed with which it returned results (Figure 1.3). It was also the first search
engine to use natural language queries, which meant users could type questions in much the
same way as they do with Ask Jeeves today, and the engine would recognize this and not return
irrelevant results. It also allowed users to search newsgroup articles, and gave them search ‘tips’
to help refine their search.
On 20 May 1996 Inktomi Corporation was formed and HotBot was created (Figure 1.4).
Inktomi’s results are now used by a number of major search services. When it was launched
HotBot was hailed as the most powerful search engine, and it gained popularity quickly. HotBot
claimed to be able to index 10 million web pages a day; it would eventually catch up with
itself and re-index the pages it had already indexed, meaning its results would constantly stay up
to date.
Around the same time a new service called MetaCrawler was developed, which searched a
number of different search engines at once (Figure 1.5). This got around the problem, noticed
by many people, of the search engines pulling up completely different results for the same search.
MetaCrawler promised to solve this by forwarding search engine queries to search engines such
as AltaVista, Excite and Infoseek simultaneously, and then returning the most relevant results
possible. Today, MetaCrawler still exists and covers Google, Yahoo! Search, MSN Search, Ask
Jeeves, About, MIVA, LookSmart and others to get its results.
By mid-1999, search sites had begun using the intelligence of web surfers to improve the quality of
search results. This was done through monitoring clicks. The DirectHit search engine introduced
a special new technology that watched which sites surfers chose, and the sites that were chosen
regularly and consistently for a particular keyword rose to the top of the listings for that keyword.
This technology is now in general use throughout the major search engines (Figure 1.6).
Next, Google was launched at the end of 1998 (Figure 1.7). Google has grown to become the
most popular search engine in existence, mainly owing to its ease of use, the number of pages it
indexes, and the relevancy of its results. Google introduced a new way of ranking sites, through
link analysis – which means that sites with more links to and from them rank higher. You can
read more about Google in Chapter 4 of this book.
Another relatively new search engine is WiseNut (Figure 1.8). This site was launched in September
2001 and was hailed as the successor to Google. WiseNut places a lot of emphasis on link analysis
to ensure accurate and relevant results. Although the search engine is impressive it hasn’t managed
to displace any of the major players in the scene, but it is still worth a look. It is covered in
more depth in Chapter 4 and can be found at www.wisenut.com.
More recently we have seen the launch of Yahoo! Search, as a direct competitor to Google.
Yahoo! bought Inktomi in 2002 and in 2004 developed its own web crawler, Yahoo! Slurp.
Yahoo! offers a comprehensive search package, combining the power of their directory with
their web crawler search results, and now provides a viable alternative to using Google. MSN
Search is the search engine for the MSN portal site. Previously it had used databases from other
vendors including Inktomi, LookSmart, and Yahoo! but, as of 1 February 2005, it began using its
own unique database. MSN offers a simple interface like Google’s, and is trying to catch Google
and Yahoo!
Other notable landmarks that will be discussed later in the book include the launch of LookSmart
in October 1996, the Open Directory in June 1998 and, in April 1997, Ask Jeeves, which
was intended to create a unique user experience emphasizing an intuitive easy-to-use system.
Figure 1.5 The MetaCrawler website ( ©2003 InfoSpace, Inc. All rights reserved. Reprinted with permission of
InfoSpace, Inc.)
Also launched around this time was GoTo, later to be called Overture, which was the first
pay-per-click search engine (see Chapter 9).
There we have it, a brief history of search engines. Some have been missed out, of course, but the
ones covered here show the major developments in the technology, and serve as an introduction
to the main topics that are covered in a lot more detail later in this book.
Search engines work by compiling a database of information gathered from web pages. This database links page content with keywords and URLs, and is then able
to return results depending on what keywords or search terms a web surfer enters as search
criteria.
Our research shows that around 80 per cent of websites are found through search engines. This
makes it clear why companies want to come up first in a listing when a web surfer performs a
related search. People use search engines to find specific content, whether a company’s website
or their favourite particular recipe. What you need to do through your website SEO is ensure
that you make it easy for surfers to find your site, by ranking highly in search engines, being
listed in directories, and having relevant links to and from your site across the World Wide Web.
Essentially, you are trying to make your website search engine-friendly.
Search engines have become extremely important to the average web user, and research shows
that around eight in ten web users regularly use search engines on the Web. The Pew Internet
Project Data Memo (which can be found at www.pewinternet.org), released in 2004, reveals
some extremely compelling statistics. It states that more than one in four (or about 33 million)
adults use a search engine on a daily basis in the USA, and that 84 per cent of American Internet
users have used an online search engine to find information on the Web (Figure 1.7 shows the familiar Google homepage, reproduced with permission). The report states that
‘search engines are the most popular way to locate a variety of types of information online’.
The only online activity to be more popular than using a search engine is sending and receiving
emails. Some other statistics that the report revealed were:
• College graduates are more likely to use a search engine on a typical day (39 per cent, compared
to 20 per cent of high school graduates).
• Internet users who have been online for three or more years are also heavy search engine users
(39 per cent on a typical day, compared to 14 per cent of those who gained access in the last
six months).
• Men are more likely than women to use a search engine on a typical day (33 per cent, compared
to 25 per cent of women).
• On any given day online, more than half those using the Internet use search engines. And
more than two-thirds of Internet users say they use search engines at least a couple of times
per week.
• 87 per cent of search engine users say they find the information they want most of the time
when they use search engines.
If you are not convinced already of the importance of SEO as part of the eMarketing mix, here
are some more interesting statistics:
• The NPD Group, a research group specializing in consumer purchasing and behaviour study,
has shown that search engine positions are around two to three times more effective for
generating sales than banner ads (http://www.overture.com/d/about/advertisers/slab.jhtml).
• 81 per cent of UK users find the websites they are looking for through search engines (Source:
UK Internet User Monitor. Forrester Research Inc., June 2000).
• According to a report published by the NPD Group, 92 per cent of online consumers use
search engines to shop and/or purchase online.
• A study conducted by IMT Strategies found that search engines are the number one way (46
per cent) by which people find websites; random surfing and word-of-mouth were ranked
equal second (20 per cent each).
What do people search for?

Services such as Wordtracker (www.wordtracker.com) let you see how many people are searching for particular terms across various search engines, and the terms that are doing the best overall.
Just to give you an idea of some results, here is a list taken from www.wordtracker.com of the
top twenty ranking searches across the top metasearch engines on the Internet (including the
Excite and MetaCrawler search engines) on 25 February 2007:
5027 myspace
4944 google
3852 yahoo
3772 ebay
3711 anna nicole smith
3498 games play
3469 britney spears
3361 myspace.com
3043 akon
2881 antonella barba
This is interesting reading – particularly the fact that people are actually searching on search
engines for ‘Google’ and ‘Yahoo!’. This goes to show that even if web surfers know a company’s
name (in the case of the 1411 searches for ‘Yahoo.com’ they knew practically the whole web
address), they will still search for it on a search engine. These searchers were using one particular
search engine in order to find another. If Google and Yahoo!, therefore, do not have good search
engine positioning, then they will lose a lot of users who cannot find their site in searches from
other engines. The same will, of course, happen to your site if it is not listed.
So how important is a top ranking? To answer this, you need to put yourself into the position of a searcher. When searchers are
confronted with a page of results, their immediate reaction is to look down that list and then
stop looking when they see a relevant site. No major studies exist regarding the importance of
top ranking, but common sense dictates that searchers will visit the first two or three relevant
sites found rather than trawling through pages of search results to find your site listed at position
298. Our own research shows that around 50 per cent of search engine users expect to find the
answer to their query on the first page, or within the top ten search engine results. Another
20 per cent revealed that they would not go past the second page of search results to find the
site they were looking for. Therefore, if your website is not ranked towards the top you will
essentially be invisible to most search engine users. Most search engine software uses both the
position and the frequency of keywords to work out the website ranking order – so a web
page with a high frequency of keywords towards the beginning will appear higher on the listing
than one with a low frequency of keywords further down in the text. Another major factor
that is taken into account is link popularity. All these topics are covered in more detail in
Chapter 3.
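Purely as an illustration of the kind of rule just described (real ranking algorithms are proprietary and far more complex than this), a hypothetical score might combine keyword frequency, how early the keyword appears, and a simple link popularity count:

    # A hypothetical scoring rule combining the three factors mentioned above:
    # keyword frequency, keyword position and link popularity. The weights and
    # example pages are invented; real algorithms are not public.
    def score(page_text, inbound_links, keyword):
        words = page_text.lower().split()
        keyword = keyword.lower()
        frequency = words.count(keyword)
        position_boost = 0.0
        if keyword in words:
            # Earlier occurrences count for more: the first word gets the biggest boost.
            position_boost = 1.0 / (1 + words.index(keyword))
        return frequency + position_boost + 0.5 * inbound_links

    page_a = 'search optimization guide with search tips near the top'
    page_b = 'a long marketing article that only mentions search once at the end'
    print(score(page_a, inbound_links=3, keyword='search'))   # ranks higher
    print(score(page_b, inbound_links=1, keyword='search'))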
Today’s search engine promotion requires a multifaceted approach. To achieve a site’s full
potential, site promotion must incorporate target audience research and analysis, competitor
analysis, pay-per-click optimization, and professional copywriting. SEO also requires a sharp eye
and an ear to the ground; search engine technology is constantly changing, so you will need to
keep up with the changes and reassess your search engine strategy accordingly.
Specialist marketing firms, like Sprite Interactive, live and breathe search engine marketing and
understand fully what it takes to generate traffic for your site and to achieve a top ranking. By
investing in the services of one of the many highly skilled SEO consultants available, you can
reap considerable rewards, but you need to have the knowledge to choose the company that is
right for you. There are a number of companies who will use underhand tactics to attempt to
promote your site, or who will not promote your site well at all. You should start with the basics
when you approach an SEO company. Ask the consultant to explain the difference between a
directory and a search engine (which you, of course, will know after reading this book). Then
ask what type of approach will be taken when the company optimizes your site – which should
be done within the site’s existing structure. SEO consultants should be able to explain to you
how the different search engines find their content, and have a good working knowledge of web
design and development – including HTML and Flash. You should be able to ask them questions
about the site architecture (see Chapter 7) and expect answers, as this information is essential to
any SEO campaign.
Credible SEO consultants should outline a plan where they will spend time working with you to
develop the relevant site keywords and phrases that you expect people to use when searching for
you. Consultants should also be skilled in writing quality concise copy. Building link popularity
for your site is another important service provided by SEO consultants, as it will boost your
ranking on certain search engines – in a nutshell, you should make sure any links you exchange
with other sites are relevant and that the consultant does not use automated linking software
(see Chapter 3). Be very wary of consultants who advocate ‘spamming’ techniques, such as using
hidden text on your web pages or submitting your site multiple times over a short period of time.
They will only be found out by the search engine in question, and thus run the risk of getting
your site banned altogether. Legitimate SEO consultants will work well within the rules set by
the search engines and will keep up to date with these rules through industry sources.
An investment in professional SEO consultancy is likely to be cheaper than one month of a print
advertising campaign. For your investment your site will be optimized across three to five key
phrases. Your contract will probably last from six months to a year, as it will take this long for
the optimization to take full effect. Expect your chosen SEO consultants to be able reliably to
inform you about the latest rates on all the pay-for-placement engines. If you choose correctly,
your SEO consultant can save you a considerable amount of time and effort, and will generate
quality targeted traffic for your site.
Watch out for companies that offer guarantees against rankings achieved. Many of these are pretty
worthless and generally have a number of ‘let-out’ clauses. There is no guarantee of success, but
there are ways to greatly increase the odds of being ranked highly. The main factor in measuring
the success of an SEO campaign is the increase in traffic to your website.
You need to ask yourself a few questions when choosing an SEO professional. Is it the consultant’s
job to increase your sales? Is the consultant there to increase your traffic, or just to get you a high
ranking? Most SEO professionals would agree that they are there to get their client’s site ranked
highly, and many will state up front that this is their main aim; however, generally speaking the
first two outcomes will follow as a knock-on effect of having a highly ranked site. What happens
if this is not the case? The client will often assume that high rankings will immediately result
in extra traffic and additional sales, but in some cases this does not happen, and the finger of
blame is pointed. So who is to blame? The answer will lie in what the original agreement and
expectations were between the SEO consultant and the client.
There are a number of reasons why sales or traffic might not increase, and these may be the
fault of either the SEO company or the client. For example, it would be the SEO company’s
fault if the wrong keywords were targeted. A client’s website may be listed highly but for the
wrong keywords and search terms, and therefore would not generate any relevant traffic, or any
traffic at all. So make sure you agree on what keywords you are going to use first, to avoid any
conflicts later on. There is no real excuse for an SEO professional to target the wrong keywords,
especially after having consulted you and doing the necessary research.
There are two immediate ways in which the client could be in the wrong. First, the client may
decide that they know best, fail to pay attention to the SEO advice offered, and choose unrelated
keywords for the website. It is up to the client to follow the advice of the SEO consultant.
Second, a client may have a badly designed site, which does not convert visitors into sales; an
SEO consultant can advise on this, but in the end it is down to the client to act and to commission
a site redesign.
It’s important to know exactly what you’ll be getting from your SEO company right from the
start, so here is a checklist of questions to ask a potential SEO consultant:
1 How long have you been providing search engine optimization services?
2 Are you an individual consultant, or are you part of a team?
3 How long have you and your team been online?
4 What types of websites do you not promote?
6 Can you describe and/or produce recent successful campaign results?
7 Do you have website design experience?
8 What are your opinions with regard to best practices for the SEO industry, and how do you
try to keep to these?
9 How many search engine optimization campaigns have you been involved with? What
was your role for those projects? How many are still active? How many are inactive? If
inactive, why?
10 Are there any guarantees for top search engine positions? (The answer to this question will
depend on whether or not you choose a pay-per-click program; see Chapter 9 for more
information.)
11 Do you have experience managing bid management campaigns?
12 What strategies would you use to increase our website’s link popularity?
13 Explain to me how Google’s PageRank software works, and how you could increase our
website’s rating. (The answer to this will involve building quality inbound links to your
website.)
14 How would you orchestrate a links popularity campaign?
15 What changes can we expect you to make to our website to improve our positioning in the
search engines?
16 Will there be changes in the coding of our website to make it rank better?
17 What type of reporting will you provide us with, and how often?
This checklist provides a useful starting point for you when approaching an SEO professional. At
Sprite we make sure that all the consultants can answer these questions, and more, whenever they
are approached for new SEO business. Most importantly, however, if you choose to use SEO
professionals, be patient with them. You need to remember that SEO is a long-term process, and
it will take around six months before you have any real measure of success. If you are not happy
with the results after this time, then it is probably time to move on. Appendix A provides an
example SEO presentation; although this is more of an internal presentation, it will give you an
idea of some of the issues you should be looking out for.
White hat and black hat SEO

Search engines publish submission guidelines setting out what they consider acceptable optimization practice. Some practitioners follow these guidelines closely, while others try to bypass them without the search engine knowing, with varying levels of success. As these
guidelines are not generally written as a set of rules, they can be open to interpretation – an
important point to note.
A technique is ‘white hat’ when it conforms to the submission guidelines set out by a search
engine and contains no kind of deception in order to artificially gain higher rankings. White hat
SEO is about creating a compelling user experience and making the content easily accessible to
search engine spiders, with no tricks involved.
‘Black hat’ SEO techniques are efforts to try to trick search engines into ranking a site higher
than it should be. There are many black hat techniques, but the more common ones are ‘hidden
text’ that a site user cannot see but a search engine spider can, or ‘cloaking’, which involves
serving one page up for search engine spiders and another page up for site visitors. Search engines
have and will penalize and even ban sites they find using these techniques; one recent example
occurred in February 2006, when Google removed the BMW Germany site from its listings for
use of doorway pages.
White hat search engine techniques present a holistic view of search engine optimization – the
search engines are viewed as a necessary part of the whole web marketing mix – whereas many
black hat practitioners tend to see search engines as an enemy to be fought in order to get higher
listings. When using black hat SEO the content on a page is developed solely with the search
engines in mind. Humans are not supposed to see the black hat content on a page (such as hidden links and text). The content may be incomprehensible to humans, and if they do see it
then their experience of using the site will be considerably diminished. White hat techniques
produce content for both the search engines and the site user, usually focusing primarily on
creating relevant interesting content that is also keyword-rich for the search engine spider. Even
without the presence of a search engine, white hat pages will still be relevant.
Another area of concern should be your domains. There is always the risk of a search engine
removing your domain from their listings, due to a change in algorithm or some other related
cause, but in general by following white hat techniques you can reduce this risk. Black hat
techniques, on the other hand, will positively increase the risk. Many black hat practitioners view
domains as disposable, which can be especially hazardous if they are working on your primary
domain name. Black hat techniques may get you quick results, but these are more often than
not short-term gains, as the domains are quickly banned from the search engine indexes. White
hat techniques on the other hand will generally take longer to implement and be ingested by the
search engines, but they will provide you with a long-term stable platform for your website.
So the question is: how do I make sure I am following white hat techniques and search engine
guidelines? The one point to bear in mind is to make sure your site and its content makes sense
to humans! That is all you need to do to follow white hat guidelines. The only time you should
really have to consult search engine guidelines is if you are working on an element of your site
that is not related to the user experience, such as META tags, code placement and site submission.
Natural traffic
If you are going to use an agency for SEO, then you will also need to tap into your site’s natural
traffic. Your natural traffic is web traffic that will develop outside of the optimization services
provided by an SEO company. It is not traffic that is ‘directed’ to your site by good search engine
ranking and positioning; it is traffic that will find your site in other ways, such as through printed
advertising or from people who share an interest in your website. You need to bear this in mind
throughout your SEO process, as it is part of the full marketing mix that will result in quality
traffic for your website. Make sure your print advertising (and any other promotional material
for that matter) features your web address in a prominent position. Target relevant publications
with your advertisements, and make sure that any groups that share an interest in your website
are well informed.
If you want to track the success of a print campaign, one technique you can use is to feature an
alternative URL; you can then track the amount of hits to this URL, which will tell you how
successful the print ad or campaign has been. Tapping into your site’s natural traffic may take
more thought and planning than just optimizing your site and hoping that people will find it by
searching for it, but the hits that you will receive from ‘natural’ traffic will be of a higher quality,
and will be more likely to spend longer on your site than those coming from search engine results
alone. Another way to increase your ‘natural’ traffic is by building your site’s link popularity (see
Chapter 3).
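Returning to the alternative-URL idea above, counting those hits can be as simple as scanning your web server’s access log. The sketch below is only illustrative: the log filename and the ‘/print-offer’ path are invented, and your server’s log format may differ.

    # Count visits to a campaign-specific URL in a web server access log.
    # '/print-offer' and 'access.log' are invented examples.
    hits = 0
    with open('access.log') as log:
        for line in log:
            if '/print-offer' in line:
                hits += 1
    print(hits, 'visits came via the print campaign URL')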
In conclusion
This chapter has been designed as a basic introduction to some of the concepts surrounding SEO.
It is clear from reading the statistics quoted that getting listed on search engines is essential to
promote your website effectively, and that ranking highly is essential if you want your site to be
noticed by surfers performing searches. If you do choose to use an SEO consultancy, then be
sure to follow the guidelines outlined above, and read this book first to give you the knowledge
to approach an agency confidently and make sure you are able to get the most out of them.
Remember that SEO is a long-term process; it cannot happen overnight, and is something that
you need to commit to fully to get the most out of.
Chapter 2
How people search
Power searching
Before commencing a search engine marketing campaign it is important to understand how
the search engines work and become a power searcher yourself. This will give you important
background knowledge into the area you are entering. Power searching means using all the
tricks at your disposal in your chosen search engine to get the most relevant results for you.
Modern search engines are generally pretty good on relevancy, but some pages that contain the information you’re after are not particularly well optimized and can only be found by using power search techniques. Learning power searching will also give you a
good background on what works and what doesn’t when optimizing your site.
It is worthwhile starting with the basics. The most important rule of any searching is that the
more specific your search is, the more likely you are to find what you want. Try asking Google ‘where do I download drivers for my new Motu sound card for Windows XP’; more often than not, this technique delivers relevant results. Here is a very brief summary of basic search
engine terms.
The + symbol
The + symbol lets you make sure that the pages you find contain all the words you enter. If you
wanted to find pages that have references to both Brad Pitt and Angelina Jolie you could use the
following query:
+Pitt +Jolie
You can string as many words together as you like, and this technique is especially useful for
narrowing down results when you have too many to check through.
The − symbol
The − symbol simply lets you find pages that have one word on them but not another. If you
wanted to find a page with Brad Pitt but not Angelina Jolie then you would simply type:
Pitt −Jolie
Again, you can use this technique to filter your results as much as you like and it is useful to
focus your results when you get too many unrelated pages.
Quotation marks
You can use quotation marks around a phrase to be sure that that phrase appears as you have
typed it on the page, so a search for Ben Salter may return results with the two words appearing
separately on the page; if you typed in ‘Ben Salter’ then you would be guaranteed to return that
exact phrase which would lead to much more relevant results. Another example would be the
book title ‘Marketing Through Search Optimization’; without the quotation marks you would
be more likely to be presented with SEO consultancy websites, but with the quotation marks
the book title is much more likely to appear as one of the top search results.
These symbols can be added to each other in any way you please, and you can create some quite
elaborate search strings with them. For the most part these are the only ways most search engine
users will enhance their results – Boolean commands are not covered here as on the whole they
are not widely used by a typical search engine user.
Most search engines also offer a number of more advanced options, including the following:

• Match Any – this lets you find pages that contain any of your search terms. Usually when this
is selected the search engine will first display results with both terms.
• Match All – this is similar to the + command and makes the search engine return results that
include all the terms you have specified.
• Exclude – similar to the − command, this lets you exclude words from the search if you don’t
want them to appear in the results that are returned.
• Site Search – this is a powerful feature that lets you control what sites are included or excluded
in a search. If you wanted to see all the pages in the Sprite Interactive website you could type:
site:www.sprite.net
This would return all the pages in the search engine’s index for www.sprite.net. This is a useful
tool to see what pages from your site have been indexed and what versions of the page are in the
search engine directory, and whether it has picked up any recent updates you have done.
You can also add other terms onto the end of the search query to see pages from that site that have specific content; for example, ‘site:www.sprite.net search engine marketing’ returns the search engine marketing page from the Sprite Interactive website. You can also
use all the other search terms (+, −, ‘’) to refine your search further.
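If you ever want to run these refined queries programmatically, for instance when checking your own pages, the operators can simply be combined in the query string before it is URL-encoded. The sketch below is illustrative only: the query is an invented example, and the only thing assumed about Google’s URL format is the standard ‘q’ parameter.

    # Building a combined power-search query as a URL (illustrative sketch).
    # The query mixes the site:, quoted-phrase and exclusion operators above.
    from urllib.parse import quote_plus

    query = 'site:www.sprite.net "search engine marketing" -jobs'
    url = 'https://www.google.com/search?q=' + quote_plus(query)
    print(url)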
Title search
If you want to find pages that just mention certain terms in the title you can use the ‘allintitle:’
command and the resulting pages will be restricted to those containing just the terms you specified
in the title. Again, this is useful for finding certain pages in a site to see if they have been indexed.
You can also use ‘intitle:’ to return results that have just the first term in the title rather than all the search terms; for example, ‘intitle:sprite interactive’ would return pages with ‘Sprite’ in the title and ‘Interactive’ anywhere on the page.
Info search
This is a great way to find out more information on your, or your competitors’, sites. It returns
all of the information Google has on the site you search for. If you typed ‘info:www.sprite.net’
you are given a set of options from Google, including links to Google’s cached version of the page, to pages that are similar to it, and to pages that link to it.
Link search
Perhaps the most useful power search tool for the SEO professional, this lets you see all the sites
linking into a particular URL; the search query would look like ‘link:www.sprite.net’. This is
a great way to see who is linking into your competitor sites to see if you can also benefit from
those links. It also lets you see if Google has indexed pages linking to your site – if it hasn’t then
you can submit them to Google yourself.
Personalization
The major search engines are now looking into personalized search, the main players currently
being Google, Yahoo!, MSN and Amazon’s a9.com service. Microsoft is seen as having the
advantage due to its ability to access the files on your computer (for PC users). The concept of
personal search is that the more a search engine knows about your likes and dislikes, your history,
your search patterns and your interests, the better search results it can provide you with. Not
only can personal search base results on the terms you enter in the search query, it can also use
your personal info to work out what you really mean by those terms and suggest other results
that may be relevant or interesting to you.
Many commentators have cited personalization as the ‘Holy Grail’ for search engines, and the
search engines are certainly ‘in bed’ with the concept. It is easy to see that the more data they can
collect on their users, the more they can target their results, and also the more they can charge
for targeted advertising. Here are a couple of examples of simple searches that can ‘benefit’ from
personalization: if you are searching for ‘beatles’ are you after the band or the insect? If you search
for ‘rock’ are you interested in music or geology, and so on. Of course, using the information
provided at the start of this chapter you could narrow your search engine results down to the
same as these particular personalized search results.
Google Homepage, My Yahoo! and My MSN also offer personalized versions of their homepages,
with services like email, chat and calendars readily available.
The essence of personalized homepages is a technology called Really Simple Syndication (RSS). This allows content to be distributed through the Web very efficiently, so a news organization like the BBC will use RSS to plug their headlines into any site also using the technology.
RSS can be used for a wide range of content, like weather reports, star signs, traffic and road
information, and so on. RSS feeds can be seen on the Google homepage screenshot (Figure 2.1).
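As a small illustration of how easy RSS is to consume, the sketch below reads a feed using the third-party feedparser library (installed with ‘pip install feedparser’). The BBC feed URL is just an example and may change.

    # Reading an RSS feed with feedparser (illustrative sketch).
    # The feed URL is an example only and may change over time.
    import feedparser

    feed = feedparser.parse('http://feeds.bbci.co.uk/news/rss.xml')
    print(feed.feed.title)             # the feed's own title
    for entry in feed.entries[:5]:     # first five headlines
        print(entry.title, entry.link)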
The search engines can use the information a user decides to display on their personal homepage
to then create the user profile. This profile enables them to serve up more relevant advertising.
One of the more negative features of personalized search is that once a search engine thinks
it knows what you want to be searching for it will narrow your results, thus narrowing the
amount of information you can access. Though getting narrow search results can be a good
thing, searching is also about broadening your mind, and search results regularly lead users off on
tangents into information that they may not necessarily have considered. Is ending this a good
thing? Should there be an option to turn personalization off?
Another problem that people seem concerned about is privacy. What is to stop search engine
operators tracking everything a user does on the Internet? Though there is clearly going to be
an issue of privacy with any kind of personalization, this may be less of an issue than many have
made it out to be. The search engines can track users anonymously, setting a cookie in your
browser that simply contains an ID that gives you a profile, without having to enter anything
that will identify you as an individual. All they need to know is that you like the Beatles; they
don’t need your name or address. This system is being developed by the major search engines,
most notably Google.
There is another important side effect of personalization that is only just starting to be realized, and that is the effect it will have on the SEO industry. If users can filter results personally, then the results returned for a given search will depend on who is searching: what profile do they have? Where are they located? This could lead to whole new profiles being developed for different sets of search engine users, and personal profiling would become commonplace. You would no longer be tracking your site for a search on a particular keyword; you would be tracking your site by a particular user demographic (London IT worker, 25–30, for example). So this could lead to the job of the SEO consultant becoming more complicated, but if you are creating useful, relevant
sites aimed at people rather than search engines, you will be well on the way to benefiting from
personalized search.
Mobile search
With the increased use of the Web on mobile devices, the major search engines are now providing
support for mobile devices. The mobile Internet is developing in much the same way as the
Internet developed. In the early days of the Internet, users were restricted to content selected for them by portals such as Netscape and CompuServe. These are mirrored by the mobile operator portals today (Vodafone, Orange and Three, to name a few), which carry a very small selection of the mobile
content available, but are the first stop for many mobile users. Some of these operators have even
put a walled garden around their portal, so users cannot access any content beyond what the operator has chosen for them (as some of the early Internet portals did). As the Internet developed, and
the depth of content developed, the portal sites were unable to provide the necessary coverage,
and search engines such as Google and AltaVista provided a way for users to find this content.
This is now happening in the mobile space, as the search engines are starting to tap into the huge
amount of mobile content available that cannot be found on the operator portals.
Google provides search interfaces for devices such as Palm PDAs, i-mode phones and WAP and
xHTML enabled devices. Google also supports facilities that let users use their numeric keypad
to enter search terms and keywords. Yahoo! provides a directory of WAP-enabled sites and
delivers personalized data to mobile devices, such as sports scores, news and entertainment and
travel information, as well as the ability to use Yahoo! email and messenger on your device.
MSN’s strategy focuses on Pocket PC and smartphone devices, which have Windows software installed on them and deliver web content through Pocket MSN.
Research from Bango (www.bango.com) indicates that the most popular mobile content
services are:
• Real time searches for local landmarks, such as restaurants, ATMs, specific shops.
• Predefined search, which can pull up preset data on what is available in a local area.
Local search needs to be quick and accurate for it to be successful. For more detail on mobile
search and the mobile Internet see Chapters 6 and 8.
Social networks
A basic social network is a collection of people you know and stay in contact with. You swap ideas
and information, and recommend friends and services to each other. This leads your network to
grow. Sharing in a social network is based on trust; if the recommendation is unbiased and is
from a known source then you are more likely to trust it than not. The best recommendations
are therefore those that come from unbiased and trustworthy sources.
In the past few years social networks on the Web have become some of the most popular sites
around. Sites like Myspace, YouTube, Friendster and Linkedin have all become big business, and
there are ways you can benefit from their popularity and user-base. The key to all these networks
is the sharing of ideas and information, and the way they open this up to web users who would previously have needed web development knowledge to do what these sites now let them do: share photos, publish journals and create their own personal web pages.
Social networks open great new opportunities for viral and word-of-mouth marketing, and
provide huge marketing potential for your business. They make viral marketing and word-of-
mouth marketing much easier than before. The best use of social networks is not to make money
‘directly’ off them, but to harness their marketing potential and to use them to market your
own business. Each social network has its own unique language: YouTube is for uploading and
commenting on videos; Myspace is for finding interesting new ‘friends’ and leaving comments
for them.
Today’s SEO practitioner understands that standard manipulation of websites only achieves results in the most non-competitive of markets. To profit from the Web you must embrace the socially connected aspect
of the Internet and make it work for you. This is partly covered through link building, but also
through tapping into social networks to bring traffic to your sites.
Social media optimization (SMO) is a term coined by the marketer Rohit Bhargava, who described the concept as follows:

The concept behind SMO is simple: implement changes to optimize a site so that
it is more easily linked to, more highly visible in social media searches on custom
search engines, and more frequently included in relevant posts on blogs, podcasts and
vlogs.
This quote encompasses all the social aspects of the Web. SMO understands that people are the
Internet, and optimizes your website and your promotional content for people as well as for the
search engines. SMO feeds into SEO – the links you build will benefit your site’s organic search
engine results and will naturally increase the profile of your site and content, and drive targeted
traffic through a whole new network that does not rely on the major search engines.
Bhargava, Jeremiah Owyang and Loren Baker have identified 13 rules between them to help
guide thinking when conducting SMO.
Having a large social network is what will turn your site content into ‘linkable’ content and
increase its ‘link factor’; for more on linking see Chapter 3. It is important to build links, but
it is also important to build the network that will give you the links into the future. If you
want something to spread virally it has to have a network to be seeded into. If your network
takes it on and the content is good, then it might spread into other networks, which might then link to you too. As well as understanding the raw elements of SEO, such as site architecture,
keyword choice and keyword placement, SEO today is very much about making the social
Web work for you, as it will build the profile of your site in a way you never could on
your own.
Weblogs
A weblog (blog) is a personal website where the site owner makes regular entries, which are
presented to the viewer in reverse chronological order. Blogs are often a commentary on a
specific subject, such as politics, food or films, and some take the form of a personal diary, where
the author records their thoughts on a range of subjects. Blogs can feature text and graphics, and
links to other blogs and other websites. ‘Blogging’ is also a term that means to add an entry or a
comment to a blog.
Blogs evolved from the concept of the online diary, where writers would keep an online account
of their personal lives. Early blogs were usually manually updated sections of websites but with
the advance of technology and the introduction of blogging software, such as Blogger and
LiveJournal, the availability of blogging has been opened up to a much wider, less technically
minded audience. Blogging had a slow start, but it rapidly gained popularity. The site Xanga,
which was launched in 1996, featured only 100 diaries by 1997; this had leapt to 50 000 000 by
December 2005.
Blogging makes building your own simple web page easy. It also makes linking to other web
pages easy, with tools such as:
• Permalinks – these are URLs that point to a specific blog entry even after the entry has passed from the front page into the blog archives.
• Blogrolls – these are collections of links to other weblogs. They are often found on the front
page of a blog, on the sidebar.
• TrackBacks – these are ways for webmasters (blog owners) to be informed when someone
links to one of their articles.
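In markup terms these tools are very simple. The sketch below, which uses invented URLs purely for illustration, shows a permalink to a single entry and a small blogroll of the kind most blogging software generates automatically.

    <!-- Permalink: a stable address for one specific entry (URL invented for illustration) -->
    <a href="http://www.example-blog.com/archives/2007/05/my-entry-title.html">Permalink</a>

    <!-- Blogroll: a sidebar list of links to other weblogs you recommend -->
    <ul class="blogroll">
      <li><a href="http://www.first-example-blog.com/">First Example Blog</a></li>
      <li><a href="http://www.second-example-blog.com/">Second Example Blog</a></li>
    </ul>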
Blogs and blog posts are naturally search engine-friendly, as they are full of keywords and links. They use CSS (Cascading Style Sheets) for presentation and generally have very clean HTML formatting. Optimizing a blog is very similar to optimizing a web page, although the results may look different depending on which blogging software you use. There are, however, a number of simple rules you can follow that will boost your blog’s ranking and perhaps rank it higher than many conventional websites. Once you have a blog ranked highly you can use its positioning to link to your websites; there is more on using a blog to increase your link factor in Chapter 3.
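To make the ‘clean HTML’ point concrete, a search engine-friendly blog post page boils down to markup along the lines of the sketch below; the title, keywords and URLs are invented for the example, and any real blogging package will produce its own variation on this.

    <html>
    <head>
      <!-- Keyword-focused title and description for an individual post page -->
      <title>Healthy Eating Tips | Example Nutrition Blog</title>
      <meta name="description" content="Practical healthy eating tips and recipes from the Example Nutrition Blog." />
      <link rel="stylesheet" type="text/css" href="/styles/blog.css" />
    </head>
    <body>
      <!-- One clear heading carrying the post's main keywords -->
      <h1>Healthy Eating Tips</h1>
      <p>Post content goes here, written around the phrases you want to be found for,
         with links out to related posts and sites.</p>
      <!-- The permalink gives other sites a stable, keyword-rich URL to point at -->
      <p><a href="http://www.example-nutrition-blog.com/2007/05/healthy-eating-tips.html">Permalink to this post</a></p>
    </body>
    </html>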
Chapter 3
Linking strategies and free listings
So you’ve built a great site and you have great content; all you now need is an audience.
However, there is a problem: there are around 100 million web pages out there and well
over a million unique domains. Of these, the top 10 per cent receive around 90 per cent of
the overall traffic. The Web is a very tough place for small sites to get exposure, particularly
those without a large budget for promotion. In this chapter we’ll look at how you can build
a presence for your site and increase your traffic, even if you have little or no site promotion
budget.
The first method discussed here of getting your site listed for free is one that should be avoided,
and one that is readily available across the Web. This is the mass submission service. There are
a number of companies offering this; just type ‘free search engine submission’ into Google and
this will become evident (Figure 3.1).
When you sign up for one of these services you will generally pay a low monthly fee and in
return the company will submit your site to what they describe as ‘thousands’ of search engines.
Don’t be fooled! Some services even offer submission to as many as 250 000 engines, although
there are not 250 000 search engines in existence.

Figure 3.1 The results of a search on Google for ‘free search engine submission’; notice the sponsored links (reproduced with permission)

Many of these companies will take your money and then generate spam email, which will give you no benefit and clog up your inbox. You only
need to concentrate on the top eight to ten search engines. Many of the top search engines will
not accept automatic submission of this kind and, even if your site does get listed, if it has not
been optimized in the first place it is very unlikely that it will rank well. When you next come
across a service that offers guaranteed top ten positions or submissions within a certain time (say
ten or fifteen days), it is worth remembering a few things. First, consider the guarantee itself: if eleven people sign up for guaranteed top ten positions, what happens then? Logic dictates that it will be first come, first served, which is not very fair on the eleventh person, who has also been guaranteed a top ten placement. In any case, submitting your URL to search engines
does not guarantee anything (see Chapter 4).
If you are going to use a mass submission tool, then use a recognized one such as Submit It. This
engine is so good and came so highly recommended that Microsoft bought it, and it is now a
part of their bCentral suite of site promotion tools. The service is not free, however, and there
is a charge to submit your site. For this, Submit It promises to get you a prominent listing on a
number of top search engines, including:
• Google
• AltaVista
• Yahoo!.
Building links
Another great way to get traffic to your site for free is to have other sites link to it. This is
one of the most powerful tools you can use to promote your site. It’s the online equivalent
of word-of-mouth advertising and, just like word-of-mouth, it’s the most effective way to get
new business.
It’s like one of your neighbours recommending a good plumber; a recommendation carries more
weight than if a person just stumbles across your website using a search engine – and, for the
purposes of this chapter, it can also be free or come at a very low cost. One of the best ways to
find sites to approach to link to yours is to use the search results for terms that are important for
your site. Do a search at Google for phrases that you wish to be ranked first for, and then treat
all the top listed sites as potential linking partners. Some of the sites listed may be competitors,
and you will not realistically be able to trade links with them, but there should be a number of
sites that do not sell competing products; the only thing you are competing with them for is the same search terms.
Now you need to visit the sites that are not competitors and see how it would be possible for
your site to get a link from them. If they have a links page, this is an obvious place for your
site to be linked from; if they have an ezine that users subscribe to then you could place a text
link in this, maybe in return for promotion on your own site or for a small fee. An example of
this in action would be a nutritionist who has written a book on ‘living better through healthy
eating’; these words would therefore be very important to the nutritionist on a search engine.
Instead of trying to rank first for these results, he or she could try to exchange links with those
sites listed first – which would mostly be stores advertising vitamins and nutritionist services. The
technique is a cost-effective way of getting your site to the top, and is known as ‘piggybacking’.
You are using the fact that some sites will be ranked highly for a specific search phrase, and will
not compete with you. It can take you a very long time to get to these positions, especially if
you have little or no budget, so the next best solution is clearly to have a link from their site
onto yours. In time, if you have links from the top ten sites for a particular term or phrase you
will receive a lot of traffic, since those sites already attract plenty of visitors as a result of their high listings. However, it is not quite that easy, and you have to work out why these sites would want to give you a link.
In any case, it’s the research that goes into analysing sites and identifying how you can link
from them in the first place that makes all the difference. You need to seek solutions where both parties win; that way you can secure a cheap and plentiful supply of valuable traffic for your site. You need to take the time to visit any sites you are asking for links from, then find
out who runs the sites and send a polite email – addressing them by name so it is clear that
your email has not come from a piece of link-generating software. Tell them your name and
demonstrate that you have seen their site and taken the time to analyse it. Explain to them
why you think a link exchange makes sense and, if they have a links page already, let them
know you have been to see it and ask for a link from it. When you send another webmaster
a link request or reciprocal linking offer, you should also state what you would like your link
to say. It is suggested that you include a ready-made piece of HTML code in your email, along the lines of the snippet below (the URL and link text are placeholders, to be replaced with your own):
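    <!-- Paste-ready link code for the other webmaster; swap in your own address and link text -->
    <a href="http://www.your-site.com/">Descriptive link text containing your main keywords</a>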
Finally, you should offer to talk by telephone. This may seem like a lot of effort, but it works –
and is the only way that does. The goal is to leave zero doubt in the site owner’s mind that you
are a real person who has been to the site and taken the time to evaluate it and find out where a
link can fit in.
A site’s link factor is determined by the elements that encourage other sites to link to it, which in turn inspire users to visit it more often and earn it more overall coverage. At the top end of the link-factor scale are sites such as the British
Library website, which contains links throughout the site to vast amounts of data as well as a
number of off-site links; there are also hundreds, possibly thousands, of websites that link into
the British Library. This is because it has extremely rich content that is organized efficiently and
with a lot of care.
Sites with a low link factor include those that have little or no relevant content, or rely on
databases or a lot of Flash content. This is not to say that Flash content or databases are a bad
thing, but these do reduce a site’s link factor. In the case of Flash content, the text in a movie
is all pre-rendered and a movie exists under only one URL. Database-driven content constantly
changes, so there are no static pages to link to, and URLs generated by a database can change with every page load, which makes it impossible for others to link reliably to these pages. In some cases a database may be absolutely necessary to organize a large amount of information, but when considering whether to implement one on your site you should bear this major factor in mind. For example, a magazine with a number of articles on the Web can have a very high link factor; however, if the articles are served from a database that link factor suddenly drops, because a direct link to an individual article will not work.
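To illustrate the point with invented addresses: the first link below points at a database-driven page whose URL carries a session identifier, so it breaks for anyone who follows it later, while the second is a stable, static-style URL that other sites can safely link to.

    <!-- Database-driven URL: the session ID changes on every visit, so deep links soon break -->
    <a href="http://www.example-magazine.com/article.asp?id=412&amp;sessionid=8F3KQ29">Healthy eating article</a>

    <!-- Static-style URL: the same address works for every visitor and can be linked to or bookmarked -->
    <a href="http://www.example-magazine.com/articles/healthy-eating.html">Healthy eating article</a>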
The main issue in increasing your link factor will always be the quality and relevancy of content
and the architecture to support it. For every website there are many other sites or venues (such
as search engines, directories, web guides, discussion lists, online editors, and so on) that will link
to it, and your challenge is to identify these and contact them. As mentioned above, building
links to your site can be a great free or low-cost exercise to increase your traffic and, as we will
see below, your site’s ranking.
A good way to get people to notice your site, thus increasing its link factor, is to use guerrilla
marketing techniques. There are a number of tactics and methods that can be used to increase
the visibility of your website, but if you are operating on a small budget you need to remember
one major factor when using these techniques: set realistic goals and keep to them. You can
still achieve big results with a small budget, but this has to be done in a realistic and clear
manner, otherwise you can end up concentrating all your resources on one area only to find
that it is not working. Make sure that you analyse the media related to your industry or trade,
and learn about which trade shows to attend and who the respected figures are, etc.; you
need to know all about your target audience, and this will give you the knowledge to create
targeted campaigns that really appeal to them. One great guerrilla tactic for promoting your
site is using newsgroups and chat forums. Most email clients now include a newsreader that
will allow you to access thousands of newsgroups on a range of topics. You need to subscribe
to these in order to contribute, and you can subscribe to as many different newsgroups as
you wish.
You should be aware that each newsgroup is governed by a strict set of rules that generally
apply to the posting of messages on the board and to members’ conduct. Make sure you read these rules before posting; if you break the rules of a particular group you run the risk of being banned. Using newsgroups as a marketing tool is much the same as using a web forum
(Figure 3.2), so you can take these lessons and use them elsewhere. The main aim is to attract
more visitors to your site, and in turn more places that link to your site. The overall aim of this
type of guerrilla marketing is to increase your site’s link factor.
The first rule to remember when using newsgroups as a marketing tool is that you should never
post ‘in your face’ postings or adverts, or any kind of classified sales adverts. You will almost
certainly get ‘flamed’ for this, i.e. other users of the newsgroup will post abusive replies to you.
It may even get you banned. The way to approach newsgroup marketing is to get involved in
the discussions, and gradually to become seen as an informative contributor. You should offer
advice and tips on problems to the other contributors and become an accepted member of the
group. If you offer well thought-out advice, then people will naturally want to find out more
about you – which will involve visiting your website. Many newsgroups allow you to have a
signature, which is usually a small graphic or message that you attach to the bottom of your posts;
make sure you include your web address in here, as any time you post people will see the link
and some of them may follow it.
Here are a few tips on conducting yourself in newsgroups and discussion forums:
• Before you post to a newsgroup, spend a bit of time reading others’ posts so you can become
comfortable with the posting style of the group.
• Read the rules before you begin to post.
• Never post ‘in your face’ postings or adverts, or any kind of classified sales adverts.
• Don’t post the same message to multiple newsgroups; this will probably be noticed by
somebody.
• Make sure you use a signature file if you are given the opportunity, and keep it short and to
the point.
• Don’t post messages that have nothing to do with the topic of the newsgroup.
There are also a number of things you can do on your own site to increase its link factor and bring visitors back:
1 Email to a friend link. Make it easy for people to send pages of your site to their friends by putting a link at the bottom of each page to recommend it to a friend (a simple sketch of such a link appears after this list). You can take this one step further by having e-cards related to your product or website, which site users can send to a friend with a personalized message. You can also collect users’ email addresses using this technique, and these can then be used to send out targeted ezines.
2 Offer free tools. You can try offering to your users a free trial version of your product if you
are a software company, or perhaps free games or useful software for download. Make sure
that whatever you offer is useful to your audience, and that it has links to your website
contained within the software. You could perhaps create a high quality screensaver, featuring
your company or products, for download. This has the extra benefit of advertising your site
on the computer screen of whoever installs the screensaver.
3 Newsletter/ezine. Newsletters and ezines are great tools when used effectively. You need to
offer an incentive for site users to subscribe, such as exclusive promotions and special offers,
or the chance to be kept fully up to date with the latest site developments – make them
think they will be the first to know. Your newsletter should be delivered to subscribers’ inboxes regularly, and should always carry relevant, useful content. Make sure you provide
a number of special offers and promotions relating to your products that the user cannot
find anywhere else. To take part in the special offers from your ezine a user should have to
visit a page on your website; this will increase hits to your site and essentially increase the
number of links you have pointing to you. At Sprite we have integrated successful ezines
into a number of our clients’ sites, and use them to promote our range of mobile software
(Figure 3.3).
4 Fresh content. This is one of the most important considerations. Make sure that your site always
offers relevant and fresh content, as this will encourage users to return to it. If the content
you are offering is topical or exclusive news that they might not find anywhere else, such as
breaking industry news, then this will encourage them further.
5 Link page. Create a page on your site that showcases or recommends your favourite websites,
and offer reciprocal links to other webmasters from this page. It seems simple, but having a
clear links page will make people far more likely to approach you for links, and in turn link to
your site.
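As a follow-up to point 1 above, here is a minimal sketch of an ‘email to a friend’ link; it uses a plain mailto: address with a pre-filled subject and body, and the page address and wording are placeholders. A server-side recommend-a-friend form would do the same job while also letting you capture the sender’s email address for your ezine list.

    <!-- 'Send this page to a friend' link; the URL and wording are placeholders for your own -->
    <a href="mailto:?subject=Thought%20you%20might%20like%20this%20page&amp;body=Have%20a%20look%20at%20http://www.your-site.com/this-page.html">
      Email this page to a friend
    </a>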