A Guide for your first steps in SEO

  1. Introduction - The basics on search engines and directories
  2. Optimization for search indexes (Google)
  3. Links and text
    1. The links
    2. The text
  4. Making your site useful to the visitors and visible to the search engines
    1. Limiting Multimedia
    2. Using text, not graphics!
    3. Clear titles
    4. Providing different routes
    5. Different things to do when creating a web page

The term "search engine" is usually used to describe both crawler-based search engines and human-edited web directories. Search engine optimization involves achieving the highest practical position or ranking in the natural, or organic, listings on the search engine results pages after a specific combination of keywords (or key phrase) has been typed in. The position or ranking depends on an algorithm each search engine uses to match relevant site page content with the key phrase entered. There is no charge for these listings to be displayed, nor when a link relevant to the site is clicked.
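The matching described above can be sketched as a toy scoring function. This is only an illustration of the principle, not any real engine's algorithm: each page is scored by how often the words of the key phrase appear in its text, and the results are listed highest score first. The URLs and page texts are invented examples.

```python
import re

def rank_pages(pages, key_phrase):
    """Return page URLs ordered by a naive relevance score.

    pages: dict mapping URL -> page text
    key_phrase: the words typed into the search box
    """
    terms = key_phrase.lower().split()
    scores = {}
    for url, text in pages.items():
        words = re.findall(r"[a-z0-9]+", text.lower())
        # score = how many times each query term appears on the page
        scores[url] = sum(words.count(t) for t in terms)
    # highest score first, as on a results page
    return sorted(scores, key=scores.get, reverse=True)

pages = {
    "paperblanks.com/diaries": "diaries and notebooks, hand-bound diaries",
    "paperblanks.com/pens": "pens and refills",
}
print(rank_pages(pages, "diaries notebooks"))
# the diaries page matches the key phrase, so it is listed first
```

Real engines weigh many more signals than raw term counts (links, page structure, freshness), which is exactly what the rest of this guide tries to influence.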

In this assignment, we are going to explain why search engines and directories are important and how we can optimize our ranking. We will take as an example the stationery brand Paperblanks (notebooks, diaries, guestbooks and address books). In the first part, we're going to cover the basics of search engines and directories and focus on the importance of keywords. In the second part, we'll have a look at the links and the text on our website. To finish, we'll dedicate the last part to technical aspects that will help make our website more user-friendly and easier for the search engines to crawl.

Search index companies (such as Google) own thousands of computers that run software known as spiders or robots to grab web pages and read the information on each page; complex algorithms then index this information.
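The spider-and-index process above can be sketched in a few lines. This is a minimal illustration, assuming a tiny in-memory "web" (a dict of URL to HTML) instead of real HTTP fetches: starting from a seed URL, the spider follows `<a href>` links it finds and builds an inverted index mapping each word to the URLs containing it. All URLs and page contents are invented.

```python
import re
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Collects outgoing links and visible text from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links, self.text = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)
    def handle_data(self, data):
        self.text.append(data)

def crawl(web, seed):
    """web: dict url -> html, a stand-in for the live web."""
    index, queue, seen = {}, [seed], set()
    while queue:
        url = queue.pop(0)
        if url in seen or url not in web:
            continue
        seen.add(url)
        parser = PageParser()
        parser.feed(web[url])
        # index every word on the page under this URL
        for word in re.findall(r"[a-z0-9]+", " ".join(parser.text).lower()):
            index.setdefault(word, set()).add(url)
        queue.extend(parser.links)  # the spider follows the links it found
    return index

web = {
    "/home": '<p>Paperblanks notebooks</p><a href="/diaries">diaries</a>',
    "/diaries": "<p>hand-bound diaries</p>",
}
index = crawl(web, "/home")
print(sorted(index["diaries"]))  # both pages mention "diaries"
```

Note that the spider only reaches `/diaries` because `/home` links to it; a page nobody links to is invisible to this kind of crawler, which is why links matter so much for SEO.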
