Organic SEO

1. General information on search engines
1.1 History of search engine development
1.2 General principles of search engine operation
2. Internal ranking factors
2.1 Text formatting of web pages
2.1.1 Amount of text on a page
2.1.2 Number of keywords on a page
2.1.3 Keyword density
2.1.4 Placement of keywords on a page
2.1.5 Stylistic markup of the text
2.1.6 The TITLE tag
2.1.7 Keywords in link text
2.1.8 The ALT attributes of images
2.1.9 The Description meta tag
2.1.10 The Keywords meta tag
2.2 Site structure
2.2.1 Number of pages on a site
2.2.2 The navigation menu
2.2.3 Keywords in the page name
2.2.4 Avoid subdirectories
2.2.5 One page – one key phrase
2.2.6 The main page of a site
2.3 Common errors
2.3.1 Graphic header
2.3.2 Graphic navigation menu
2.3.3 Navigation through scripts
2.3.4 Session identifiers
2.3.5 Redirects
2.3.6 Hidden text
2.3.7 One-pixel links
3. External ranking factors
3.1 Why external links to a site are taken into account
3.2 Importance of links
3.3 Link text
3.4 Relevance of referring pages
4. Site indexing
5. Keyword selection
5.1 Initial choice of keywords
5.2 High-frequency and low-frequency queries
5.3 Estimating the competition level of search queries
5.4 Successive refinement of search queries
6. Miscellaneous information on search engines
6.1 Tips, assumptions, observations
6.2 Creating correct content
6.3 Choosing a domain and hosting
6.4 Changing a site's address


This course is intended for authors and site owners who wish to study search engine optimisation and promotion of their resources in more detail. It is aimed mainly at beginners, although an experienced webmaster will, I hope, also find something new in it. Many articles on search optimisation can be found on the Internet; this textbook attempts to unite all that information into a single, consecutive course.
The information presented in this textbook can be divided into several parts:
– clear, concrete recommendations and practical guidance;
– theoretical information which, in our opinion, any SEO specialist should possess;
– tips, observations and recommendations based on experience, the study of various materials, etc.

1. General information on search engines

1.1 History of search engine development

In the initial stage of the Internet's development, the number of its users was small and the volume of accessible information relatively modest. In most cases, access to the Internet was limited to the employees of universities and laboratories, and the network as a whole was used for scientific purposes. At that time, the problem of searching for information on the Internet was far less pressing than it is now.
One of the first ways of organising access to the network's information resources was the creation of site catalogues, in which links to resources were grouped by topic. The first such project was Yahoo, which opened in April 1994. After the number of sites in the Yahoo catalogue had grown considerably, the ability to search within the catalogue was added. This was, of course, not yet a search engine in the full sense, as the search was limited to the resources listed in the catalogue rather than all the resources on the Internet.
Link catalogues were widely used in the past but have now practically lost their popularity. The reason is simple: even modern catalogues containing a large number of resources cover only a very small part of the Internet. The largest catalogue of the network, DMOZ (the Open Directory Project), contains information on about 5 million resources, while the Google index consists of more than 8 billion documents.
The first full-fledged search engine was the WebCrawler project, which appeared in 1994.
In 1995 the search engines Lycos and AltaVista appeared. The latter was for many years the leader in the field of Internet search.
In 1997 Sergey Brin and Larry Page created Google as part of a research project at Stanford University. Today Google is the most popular search engine in the world.
At present there are three main international search engines – Google, Yahoo and MSN Search – with their own indexes and search algorithms. Most other search engines (of which there are many) use the results of these three in one form or another. For example, AOL search uses the Google index, while AltaVista, Lycos and AllTheWeb use the Yahoo index.

1.2 General principles of search engine operation

A search engine consists of the following basic components:
Spider – a browser-like program that downloads web pages.
Crawler – a program that automatically follows all the links found on a page.
Indexer – a program that analyses the web pages downloaded by the spiders.
Database – the storehouse of downloaded and processed pages.
Search engine results engine – the system that extracts search results from the database.
Web server – the server that handles interaction between the user and the other components of the search engine.
The exact implementation of these search mechanisms may vary (for example, the Spider+Crawler+Indexer combination may be implemented as a single program that downloads known web pages, analyses them and follows links to search for new resources), but the general features described here are common to all search engines.
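The Spider+Crawler+Indexer pipeline described above can be sketched in a few lines of Python. This is only a toy illustration of the idea, not how a real engine is built: the "web" here is an in-memory dictionary standing in for real HTTP downloads, the link and word extraction use naive regular expressions, and the "database" is a plain inverted index in memory.

```python
import re

# A tiny "web" of pages, standing in for real HTTP downloads (assumption:
# a real spider would fetch these pages over the network).
FAKE_WEB = {
    "http://example.com/": '<title>Home</title><a href="http://example.com/a">A</a>',
    "http://example.com/a": '<title>Page A</title><a href="http://example.com/">home</a>',
}

def spider(url):
    """Download a page's raw HTML (here: look it up in FAKE_WEB)."""
    return FAKE_WEB.get(url, "")

def crawler(html):
    """Extract all links found on a page."""
    return re.findall(r'href="([^"]+)"', html)

def indexer(url, html, database):
    """Analyse the page: strip tags, store each word in an inverted index."""
    text = re.sub(r"<[^>]+>", " ", html)
    for word in re.findall(r"\w+", text.lower()):
        database.setdefault(word, set()).add(url)

def build_index(start_url):
    database, queue, seen = {}, [start_url], set()
    while queue:                         # crawl until no new pages are found
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        html = spider(url)
        indexer(url, html, database)
        queue.extend(crawler(html))      # follow links to discover new pages
    return database

index = build_index("http://example.com/")
# The "results engine" can now answer queries from the database:
print(sorted(index.get("home", set())))
```

The results engine of a real search engine would then rank the pages returned for a query; here a query simply returns the set of pages containing the word.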
Spider. The spider is a program that downloads web pages in the same way as a user's browser. The difference is that the browser displays the information contained on the page (text, graphics, etc.), while the spider has no visual component and works directly with the page's HTML text (you can select "view source" in your browser to see the "raw" HTML).
Crawler. The crawler extracts all the links present on a page. Its task is to determine where the spider should go next, based on those links or on a predefined list of addresses. By following the links it finds, the crawler discovers new documents still unknown to the search engine.
Indexer. The indexer breaks a page into its components and analyses them. Various page elements are identified and analysed: the text, headings, structural and style features, special service HTML tags, and so on.
Database. The database is the storehouse of all the data that the search engine downloads and analyses. It is sometimes called the search engine index.
Search engine results engine. The results engine performs the ranking of pages. It decides which pages satisfy the user's query and in what order they should be sorted, according to the search engine's ranking algorithms. This information is the most valuable and interesting for us: it is this component of the search engine that every optimiser works against when trying to improve a site's positions in the SERP, so later we will consider in detail all the factors that influence the ranking of results.
Web server. As a rule, the server hosts an HTML page with an input field where the user can enter a search term. The web server is also responsible for delivering the results to the user in the form of an HTML page.

2. Internal ranking factors

All the factors influencing a site's position in search engine results can be divided into external and internal. Internal ranking factors are those under the control of the site owner (text, markup, etc.).

2.1 Text formatting of web pages

2.1.1 Amount of text on a page

Search engines value sites rich in informational content, so in general you should aim to increase the amount of text on a site.
Pages containing 500-3,000 words, or 2-20 KB of text (2,000 to 20,000 characters), can be considered optimal.
A page consisting of only a few sentences has less chance of reaching the top of the search results.
In addition, more text on a page increases the page's visibility in search engines through rare or incidental search phrases, which in some cases can bring a decent inflow of visitors.

2.1.2 Number of keywords on a page

Keywords (or phrases) should occur in the text at least 3-4 times. The upper limit depends on the total size of the page: the larger it is, the more repetitions are acceptable.
Search phrases, that is, combinations of several keywords, deserve separate consideration. The best results are observed when the phrase occurs in the text several times as a phrase (i.e., all the words together in the right order) and, in addition, the individual words of the phrase appear in the text several times on their own. There should also be some difference (imbalance) between the number of occurrences of each of the words making up the phrase.
Let's consider an example. Suppose we optimise a page for the phrase "dvd player". A good variant: the phrase "dvd player" occurs in the text 10 times; in addition, the word "dvd" occurs separately 7 more times and the word "player" 5 more times. All the figures in this example are arbitrary, but they illustrate the general idea.
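The occurrence pattern described above is easy to check mechanically. A minimal Python sketch (the sample text and the target phrase are invented for illustration; in practice you would run this over the page's full text):

```python
import re

def occurrence_stats(text, phrase):
    """Count how often a phrase occurs verbatim, and how often each of
    its words occurs on its own (i.e. outside the full phrase)."""
    words = phrase.lower().split()
    tokens = re.findall(r"\w+", text.lower())
    n = len(words)
    # count verbatim occurrences of the whole phrase
    phrase_hits = sum(tokens[i:i + n] == words for i in range(len(tokens) - n + 1))
    # count each word's total occurrences, then subtract those inside the phrase
    separate = {w: tokens.count(w) - phrase_hits for w in words}
    return phrase_hits, separate

text = "This dvd player is a fine player. Buy the dvd player and spare dvd discs."
hits, separate = occurrence_stats(text, "dvd player")
print(hits, separate)   # whole-phrase hits, plus standalone counts per word
```

On the sample text the phrase occurs twice as a whole, and each of its words once more on its own, which is exactly the kind of imbalance described above.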

2.1.3 Keyword density

The keyword density of a page is the relative frequency of a word in the text, measured as a percentage. For example, if a given word occurs 5 times on a page of 100 words, its density is 5%. Too low a density means the search engine will not give the word its due weight. Too high a density may trigger the search engine's spam filter (that is, the page will be artificially lowered in the search results because of the excessively frequent use of a key phrase).
A keyword density of 5-7% is considered optimal. For phrases consisting of several words, calculate the total density of all the keywords making up the phrase and make sure it stays within the specified limits.
Practice shows that a keyword density above 7-8%, while it does not lead to any negative consequences, in most cases has no particular benefit either.
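The density calculation itself is simple arithmetic. A small sketch (the sample text is contrived so the numbers are easy to verify by hand; for a multi-word phrase the densities of its component words are summed, as described above):

```python
import re

def keyword_density(text, *keywords):
    """Total density (%) of the given keywords in the text."""
    tokens = re.findall(r"\w+", text.lower())
    hits = sum(tokens.count(k.lower()) for k in keywords)
    return 100.0 * hits / len(tokens)

# Contrived 100-word sample: 10 occurrences each of "dvd" and "player".
text = "cheap dvd player reviews best dvd player deals " * 5 + "filler word " * 30
d = keyword_density(text, "dvd", "player")
print(d)   # 20.0 - well above the 5-7% range recommended in the text
```

A real page with density this high would risk the spam filter mentioned above; the function simply makes the check repeatable.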

2.1.4 Placement of keywords on a page

A very short rule: the closer a keyword or phrase is to the beginning of the document, the more weight it carries in the eyes of the search engine.

2.1.5 Stylistic markup of the text

Search engines attach special significance to text that is emphasised on the page in one way or another. The following recommendations can be given:
– use keywords in headings (text marked up with "H" tags, in particular "h1" and "h2"). Nowadays CSS makes it possible to redefine the appearance of text marked up with these tags, so "H" tags carry less weight than they used to; nevertheless, they should not be neglected;
– emphasise keywords in bold (not throughout the whole text, of course, but doing so 2-3 times on a page does no harm). For this purpose it is recommended to use the "strong" tag rather than the more traditional "B" (bold) tag.

2.1.6 The TITLE tag

One of the most important tags, to which search engines attach great significance. Keywords should be used in the TITLE tag.
In addition, the link to your site in the search engine results will contain the text from the TITLE tag, so it is, to a certain extent, the page's calling card.
Search engine visitors reach your site through this link, so the TITLE tag should not only contain keywords but also be informative and attractive.
As a rule, 50-80 characters of the TITLE tag appear in the search results, so it is advisable to limit the length of the title accordingly.
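The length limit is easy to check automatically during site work. A trivial sketch (the cut-off of 80 characters is an assumption taken from the range quoted above; the exact truncation point varies by search engine):

```python
def fits_serp(title, limit=80):
    """True if a TITLE of this length would likely be shown untruncated
    (80 chars is the upper end of the 50-80 range quoted above)."""
    return len(title) <= limit

def truncate_for_preview(title, limit=80):
    """Preview how an over-long title would look when cut off."""
    return title if fits_serp(title, limit) else title[:limit - 1].rstrip() + "..."

t = "Cheap DVD players - reviews, comparisons and the best online prices " * 3
print(fits_serp("Cheap DVD players"))   # True: short enough
print(fits_serp(t))                     # False: far beyond 80 characters
```

Running such a check over every page's title is a cheap way to catch headings that will be cut off in the results.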

2.1.7 Keywords in link text

Another very simple rule: use keywords in the text of the outgoing links on your pages (both links to other internal pages of your site and links to other resources on the network); this can give you a small advantage in ranking.

2.1.8 The ALT attributes of images

Any image on a page has a special attribute, the "alternative text", set in the ALT attribute of the image tag. This text is displayed on screen if the image could not be downloaded or if the display of images is disabled in the browser.
Search engines record the value of the ALT attribute when analysing (indexing) a page, but do not use it when ranking search results.
At present it is reliably known that Google takes into account the ALT text of those images that serve as links to other pages; all other ALT attributes are ignored. There are no exact data for the other search engines, but something similar may be assumed.
On the whole, the advice is this: using keywords in ALT attributes is possible and worthwhile, although it is not of fundamental importance.

2.1.9 The Description meta tag

The Description meta tag is specially intended for describing a page. It does not influence ranking in any way but is nevertheless very important. Many search engines (notably the largest, Google) display the information from this tag in their search results if the tag is present on the page and its content matches both the content of the page and the search query.
It can be said with confidence that a high place in the search results does not always guarantee a large number of visitors. If your competitors' descriptions in the results are more attractive than your site's, search engine users will choose them rather than your resource.
Competent writing of the Description meta tag is therefore of great importance. The description should be brief but informative and attractive, and contain the keywords characteristic of the page.

2.1.10 The Keywords meta tag

This meta tag was originally intended for specifying a page's keywords. However, it is now almost never used by search engines.
Nevertheless, it is worth filling in this tag "just in case". When doing so, adhere to the following rule: add only those keywords that are actually present on the page.

2.2 Site structure

2.2.1 Number of pages on a site

The general rule: the more, the better. Increasing the number of pages on a site improves its visibility in search engines.
In addition, the gradual addition of new informational materials to a site is perceived by search engines as site development, which can bring additional benefits in ranking.
So try to place more information on the site: news, press releases, articles, useful tips and so on.

2.2.2 The navigation menu

As a rule, any site has a navigation menu. Use keywords in the menu links; this will give additional weight to the pages those links lead to.

2.2.3 Keywords in the page name

There is an opinion that using keywords in the name of a page's HTML file can have a positive effect on its place in the search results. Naturally, this applies only to English-language queries.

2.2.4 Avoid subdirectories

If your site has a moderate number of pages (a few dozen), it is better to keep them in the site's root directory. Search engines consider such pages more important.

2.2.5 One page – one key phrase

Try to optimise each page for its own key phrase. Sometimes 2-3 related phrases can be chosen, but a single page should not be optimised for 5-10 phrases at once; most likely there will be no result at all.

2.2.6 The main page of a site

Optimise the main page of the site (its domain name, index.html) for the word combinations most important to you. This page has the greatest chance of reaching the top of the search results.
In my observation, the main page of a site can account for up to 30-40% of total search traffic.

2.3 Common errors

2.3.1 Graphic header

Very often a site design uses a graphic header (banner): a picture spanning the full width of the page, containing, as a rule, the company logo, the name and some other information.
Don't do this! The top part of the page is a very valuable place where the most important keywords can be placed. In the case of a graphic image, this place is wasted.
In some cases quite absurd situations occur: the header contains textual information, but for the sake of greater visual appeal it is rendered as a picture (and, accordingly, the rendered text cannot be taken into account by search engines).
It is better to use a combined variant: the graphic logo at the top of the page is present but does not occupy its full width, and the remaining space holds a text heading with keywords.

2.3.2 Graphic navigation menu

The situation is similar to the previous point: the internal links of your site should also contain keywords, as this gives an additional benefit in ranking. If, for the sake of greater appeal, the navigation menu is rendered as images, search engines cannot take the link text into account.
If there is no way to avoid a graphic menu, at the very least supply all the images with correct ALT attributes.

2.3.3 Navigation through scripts

In some cases site navigation is implemented through scripts. It must be understood that search engines cannot read and execute scripts. Thus a link created through a script will be inaccessible to the search engine, and the search robot will not follow it.
In such cases, be sure to duplicate the links in the usual way, so that navigation of the site is accessible to everyone: both your visitors and the robots of the search engines.

2.3.4 Session identifiers

Some sites use session identifiers: each visitor, on entering the site, receives a unique parameter &session_id= which is appended to the address of every page visited.
Using a session identifier makes it more convenient to collect statistics on visitor behaviour and can serve some other purposes.
However, from the point of view of a search robot, a page with a new address is a new page. On each visit to the site the search robot will receive a new session identifier and, visiting the same pages as before, will perceive them as new pages of the site.
Strictly speaking, search engines have algorithms for "gluing together" mirrors and pages with identical content, so sites using session identifiers will still be indexed. However, the indexing of such sites is harder and in some cases can go wrong. The use of session identifiers on a site is therefore not recommended.
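One common remedy is to serve canonical URLs with the session parameter stripped, so the same page always has the same address. A sketch using only the Python standard library (the parameter name session_id matches the example above; real sites may use a different name):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="session_id"):
    """Return the canonical URL with the session parameter removed."""
    parts = urlsplit(url)
    # keep every query parameter except the session identifier
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

canon = strip_session_id("http://example.com/page.html?session_id=ab12&lang=en")
print(canon)   # http://example.com/page.html?lang=en
```

Applied on the server side (or at least to the URLs given to robots), this ensures that revisits by a search robot map onto the pages it has already seen.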

2.3.5 Redirects

Redirects complicate the analysis of a site by search robots. Do not use redirects unless you have clear reasons to.

2.3.6 Hidden text

The last two points are not so much errors as deliberate deception of search engines, but they still need to be mentioned.
Using hidden text (the text colour matches the background colour, for example white on white) allows you to "stuff" a page with the necessary keywords without disturbing its logic or design. Such text is invisible to visitors but is perfectly readable by search robots.
The use of such "grey" optimisation methods can lead to the site being banned, that is, forcibly excluded from the search engine's index (database).

2.3.7 One-pixel links

The use of graphic link images sized 1×1 pixel (that is, effectively invisible to the visitor) is also perceived by search engines as attempted deception and can likewise lead to the site being banned.

3. External ranking factors

3.1 Why external links to a site are taken into account

As can be seen from the previous section, almost all the factors influencing ranking are under the control of the page's author. This makes it impossible for the search engine to distinguish a genuinely high-quality document from a page created specifically for a given search phrase, or even a page generated by a robot and carrying no useful information at all. Therefore one of the key factors in ranking pages is the analysis of external links to each page being evaluated. It is the one factor beyond the control of the site's author.
It is logical to assume that the more external links a site has, the more interest it holds for visitors. If the owners of other sites on the network have placed links to the resource being evaluated, it means they consider that resource to be of sufficient quality. Using this criterion, the search engine can decide how much weight to give to each document.
Thus there are two main factors by which the pages in the search engine's database are sorted when results are delivered: relevance (that is, how closely the page relates to the subject of the query, i.e. the factors described in the previous section) and the number and quality of external links. The latter factor is also known as link citation, link popularity or the citation index.

3.2 Importance of links

It is easy to see that a simple count of external links does not give us enough information to evaluate a site. Obviously, a link from one site can mean much more than a link from another (say, from some personal homepage at /~myhomepage.html), so the popularity of sites cannot be compared by the number of external links alone: the importance of those links must also be taken into account.
To evaluate the number and quality of external links to a site, search engines introduce the concept of a citation index.
The citation index (CI) is a general term for the numerical indicators that estimate the popularity of a resource, that is, an absolute measure of a page's importance. Each search engine uses its own algorithms to calculate its citation index; as a rule, these values are not published anywhere.
Besides the ordinary citation index, which is an absolute indicator (a specific number), there is the weighted citation index, a relative value that shows the popularity of a given page in comparison with the popularity of other pages on the Internet. The term "weighted citation index" (WCI) is usually used in relation to the Yandex search engine.
A detailed description of citation indexes and the algorithms for calculating them will be presented in the following sections.
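Although each engine's exact algorithm is unpublished, the best-known published example of such a relative link measure is Google's PageRank, which can be sketched as a simple iteration: each page's weight is shared among the pages it links to. The toy link graph and the damping factor 0.85 below are illustrative assumptions, not any engine's real data:

```python
def weighted_citation(links, damping=0.85, iterations=50):
    """PageRank-style iteration: each page's score is split evenly
    among the pages it links to, plus a uniform 'random jump' term."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outgoing in links.items():
            for q in outgoing:
                new[q] += damping * rank[p] / len(outgoing)
        rank = new
    return rank

# Toy link graph: C is linked to by both A and B, so it ends up heaviest.
links = {"A": ["C"], "B": ["C"], "C": ["A"]}
rank = weighted_citation(links)
print(max(rank, key=rank.get))   # C
```

Note how the measure is relative: C outranks A not because of raw link counts alone but because the weight of its referrers flows into it, which is exactly the idea behind a "weighted" citation index.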

3.3 Link text

Great weight in the ranking of search results is given to the text of external links to a site.
The link text (also called the anchor text) is the text between the "A" and "/A" tags, that is, the text you can click on in a browser to go to a new page.
If the link text contains the necessary keywords, the search engine treats it as an additional and very important recommendation: confirmation that the site really contains valuable information relevant to the subject of the search query.

3.4 Relevance of referring pages

Besides the link text, the overall informational content of the referring page is also taken into account.
Example. Suppose we are promoting a resource selling cars. In this case a link from a site about car repair will mean much more than a similar link from a site about gardening. The first link comes from a thematically related resource and will therefore be valued more highly by the search engine.

4. Site indexing

Before a site can appear in search results, it must be indexed by the search engine. Indexing means that a search robot has visited your site, analysed it and entered the information into the search engine's database.
If a page has been entered into the search engine's index, it can be shown in search results. If a page is not in the index, the search engine knows nothing about it and therefore cannot use its information in any way.
Most medium-sized sites (that is, sites containing a few dozen to a few hundred pages) usually have no problems being indexed correctly by search engines. However, there are a number of points worth keeping in mind when working on a site.
A search engine can learn about a newly created site in two ways:
– manual submission of the site's address through the search engine's submission form. In this case you inform the search engine about the new site, and its address joins the indexing queue. Only the main page of the site needs to be submitted; the rest will be found by the search robot by following links;
– letting the search robot find the site on its own. If your new resource has at least one external link from a resource already indexed by the search engine, the search robot will soon visit and index your site by itself. In most cases this variant is recommended: obtain a few external links to the site and simply wait for the robot to arrive. Manually submitting the site can even lengthen the waiting time.
The time needed to index a site usually ranges from 2-3 days to 2 weeks, depending on the search engine. Google indexes sites fastest of all.
Try to make the site friendly to search robots. To do so, take the following factors into account:
– try to make every page of your site reachable by links from the main page in no more than 3 clicks. If the site's structure does not allow this, create a so-called sitemap, which makes it possible to satisfy this rule;
– do not repeat the common errors: session identifiers complicate indexing, and if you use navigation through scripts, be sure to duplicate the links in the usual way, since search engines cannot read scripts (these and other errors are discussed in more detail in chapter 2.3);
– remember that search engines index no more than 100-200 KB of text per page. For larger pages, only the beginning (the first 100-200 KB) will be indexed. Hence the rule: do not use pages larger than 100 KB if you want them indexed completely.
The behaviour of search robots can be controlled with a robots.txt file, in which you can explicitly allow or forbid the indexing of particular pages.
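For example, a robots.txt forbidding robots to visit two sections of a site might look like the text below, and Python's standard urllib.robotparser can check it the same way a polite robot would (the disallowed paths are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt forbidding two hypothetical sections of the site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /print/
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

ok = parser.can_fetch("*", "http://example.com/articles/seo.html")
blocked = parser.can_fetch("*", "http://example.com/print/seo.html")
print(ok, blocked)   # True False
```

Well-behaved search robots fetch /robots.txt before crawling and apply exactly these rules, so the file is the standard way to keep duplicate or service pages out of the index.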
Search engine databases are constantly updated; records in the database may change, disappear and reappear, so the number of indexed pages of your site may vary from time to time.
One of the most frequent reasons a page disappears from the index is server unavailability: the search robot could not reach the site when it attempted to index it. Once the server is working again, the site should reappear in the index after a while.
Note also that the more external links your site has, the faster it is reindexed.
You can track the indexing process by analysing the server log files, in which all visits by search robots are recorded. In the corresponding section we will describe in detail the programs that make this possible.

5. Keyword selection

5.1 Initial choice of keywords

Keyword selection is the first step in building a site. By the time the texts for the site are being prepared, the set of keywords should already be known.
To determine keywords, first of all use the services offered by the search engines themselves (for English-language sites, for example, the Google Ads keyword service mentioned below).
When using these services, remember that their data can differ greatly from the real picture. When using the Google Ads service, also bear in mind that it shows not the expected number of queries but the expected number of ad impressions for the given phrase. Since search engine visitors often view more than one page of results, the real number of queries is necessarily lower than the number of ad impressions for the same query.
The Google search engine itself does not give information on query frequency.
Once the list of keywords has been roughly determined, you can analyse your competitors to find out which key phrases they target; you may well learn something new.

5.2 High-frequency and low-frequency queries

In site optimisation two strategies can be distinguished: optimising for a small number of highly competitive keywords, or for a large number of low-competition ones. In practice the two are usually combined.
The drawback of high-frequency queries is, as a rule, the high level of competition for them. A young site cannot always rise to the top for such queries.
For low-frequency queries, a mere mention of the right word combination on a page, or minimal text optimisation, is often sufficient. Under certain conditions low-frequency queries can bring a very decent amount of search traffic.
The purpose of most commercial sites is to sell goods or services, or to earn from their visitors in some other way. This must be taken into account in search optimisation and keyword selection. You should aim to attract targeted visitors to the site (that is, visitors ready to buy the goods or services on offer) rather than simply a large number of visitors.
Example. The query "monitor" is far more popular and far more competitive than the query "monitor samsung 710N" (the exact model name). However, for a monitor seller the second visitor is much more valuable and much easier to obtain, because the competition level for the second query is low. This is one more possible distinction between high-frequency and low-frequency queries to take into account.

5.3 Estimating the competition level of search queries

Once the set of keywords is roughly known, the core set of words to be targeted by optimisation must be determined.
Low-frequency queries are, for obvious reasons, set aside at once (temporarily). In the previous section we described the advantages of low-frequency queries; however, precisely because they are low-frequency, they require no special optimisation, so we do not consider them in this section.
For very popular phrases the level of competition is usually very high, so you must assess your site's capabilities realistically. To estimate the competition level, calculate a number of indicators for the top ten sites in the search results:

– the average number of external links to those sites, according to various search engines;
– additional parameters:

– the number of pages on the Internet containing the given search term (in other words, the number of search results);
– the number of pages on the Internet containing an exact match of the given phrase (as in a so-called exact-phrase, quoted search).
These additional parameters help to estimate indirectly how difficult it will be to bring a site to the top for the given phrase.
Besides the parameters described, you can also check how many of the sites in the results are present in the major directories, such as the DMOZ and Yahoo catalogues.
Analysing all the above parameters and comparing them with those of your own site will let you predict fairly accurately the prospects of bringing your site to the top for the given phrase.
Having estimated the competition level for all the selected phrases, you can choose a number of reasonably popular phrases with an acceptable level of competition, on which the main effort of promotion and optimisation will be concentrated.

5.4 Successive refinement of search queries

As already it was spoken above, services of search engines give often very inexact information. Therefore to define ideal for your site a set of keywords from the first it is possible seldom enough.
After your site certain steps on its promotion are imposed and made, in your hands there is an additional statistics on keywords: you know a rating of your site in delivery of search engines on this or that phrase and know also number of calling on your site on this phrase.
Owning this information it is possible to define successful and unsuccessful phrases accurately enough. Frequently it is even not necessary to wait, that the site left in топ on estimated phrases in all search engines – enough one or two.
Example: suppose your site has reached first place in the search engine Rambler for a given phrase, while it does not yet appear in Yandex or Google results for that phrase. Knowing the percentage split of visitors arriving at your site from the various search engines, you can already predict the approximate traffic for this phrase and decide whether it suits your site or not.
Besides identifying unsuccessful phrases, you can find new successful ones. For example, you may notice that some phrase which was never promoted brings quite good traffic, even though your site is only on page 2 or 3 of the results for it.
Thus you obtain a new, refined set of keywords. You can then start reorganising the site – changing texts to target the more successful word combinations, creating new pages for the newly found phrases, and so on. After a while you can arrive at the best set of keywords for your site and substantially increase the search traffic.
A few more tips. According to statistics, the main page of a site accounts for 30–50% of all search traffic. It is the most visible page in the search engines and has the most external links, so the main page should be optimised for the most popular and competitive queries. Each page of the site should be optimised for 1–2 main word combinations (and, possibly, for a number of low-frequency queries). This increases the chances of reaching the top of the search engines for the chosen phrases.
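The Rambler example above can be sketched numerically: if you know the share of your visitors that each engine typically contributes, one engine's traffic lets you project the phrase's total potential. The engine shares below are invented for illustration.

```python
# Projecting total search traffic for a phrase from a single engine's data,
# as in the example above: the site is #1 in Rambler but not yet ranked in
# Yandex or Google. The shares below are illustrative assumptions, not
# real market figures.

engine_share = {"Rambler": 0.20, "Yandex": 0.45, "Google": 0.35}

def projected_total_visits(visits, engine):
    """Given `visits` per day from `engine` alone, estimate what the phrase
    could bring once the site ranks similarly in all engines."""
    return visits / engine_share[engine]

# 40 visits/day from Rambler alone, with an assumed 20% share, suggests
# roughly 200 visits/day across all engines for this phrase.
print(round(projected_total_visits(40, "Rambler")))
```

If the projected figure is too small to justify the effort, the phrase can be dropped before promotion work in the other engines even begins.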

6 Various information on search engines

6.1 Tips, assumptions, observations

This chapter presents information gathered from the analysis of various articles: opinions of web professionals, practical observations, and so on. This information is neither exact nor verified – these are only assumptions and ideas, though interesting ones. Treat the data in this section not as precise guidance but as food for thought.

Outgoing links. Link to authoritative resources in your subject area, using the appropriate keywords. Search engines value links to other resources on the same topic;
Outgoing links. Do not link to FFA (free-for-all) sites or other sites excluded from the search engine's index. This can lower your own site's rating;
Outgoing links. A page should not contain more than 50–100 outgoing links. Exceeding this number does not lower the page's rating, but links beyond it will not be counted by the search engine;
External site-wide links, that is, links placed on every page of a site. It is believed that search engines view such links negatively and do not count them in ranking. There is also another opinion that this applies only to large sites with thousands of pages;
Ideal keyword density. This question comes up very often. The answer is that there is no ideal keyword density; rather, it differs for each query, that is, it is calculated by the search engine dynamically, depending on the search term. Our advice is to analyse the top sites in the search results, which lets you estimate the situation approximately;
Site age. Search engines prefer older sites as more stable;
Site updates. Search engines prefer developing sites, that is, sites to which new information and new pages are periodically added;
Domain zone. Preference is given to sites in certain zones that can be registered only by the corresponding organisations; such sites therefore enjoy more trust;
Search engines track what percentage of visitors return to the search results after visiting a given site from the results page. A high percentage of returns indicates off-topic content, and such a page is demoted in the results;
Search engines track how often a given link in the search results is clicked. If a link is rarely clicked, the page is considered uninteresting and is demoted in the rating;
Use synonyms and related forms of keywords; search engines will value this;
Too rapid growth in the number of external links is perceived by search engines as artificial promotion and leads to a drop in rating. This is a very disputable claim, above all because such a method could be used to lower competitors' ratings;
Google does not count external links if they are located on the same (or related) hosts, that is, on pages whose IP addresses belong to the same range. This opinion most likely arose because Google mentioned the idea in its patents. However, Google employees have stated that no restrictions based on the IP addresses of external links are imposed, and there is no reason not to trust them;
Search engines check the information about the domain owner. Accordingly, links between sites belonging to the same owner carry less weight than ordinary links. This information is presented in a patent;
The term for which the domain is registered. The longer the term, the more preference is given to the site;
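The "ideal keyword density" point above can be made concrete: rather than aiming at a fixed figure, measure the density on pages that already rank for your query and use that as a guide. A minimal sketch (the tokenisation rule and the sample text are illustrative):

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that match `keyword` (case-insensitive).
    Uses a simple \\w+ tokeniser; real pages would first be stripped of HTML."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# Made-up page text: 3 occurrences of "orchid" out of 13 words.
page = "Orchid care is easy. Water your orchid weekly and keep the orchid warm."
print(f"{keyword_density(page, 'orchid'):.1%}")
```

Running this over the top ten pages for a query gives a realistic density range for that particular query, in line with the advice above.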

6.2 Creation of a correct content

Content (the information filling of a site) plays a major role in site promotion. There are many reasons for this, which we discuss in this chapter, along with advice on how to fill a site with information correctly.

Content uniqueness. Search engines value new information that has not been published anywhere before. Therefore, when creating a site, rely on your own texts. A site built on other people's materials has much smaller chances of reaching the top of the search engines; as a rule, the original source always ranks higher in the results;
When creating a site, do not forget that it is made first of all for visitors, not for search engines. Bringing a visitor to the site is only the first and not the hardest step. Keeping the visitor on the site and turning them into a buyer – that is the real challenge, and it can be achieved only by filling the site with competent information that interests the reader;
Try to update the information on the site regularly and add new pages: search engines value developing sites. Besides, more text means more visitors. Write articles on your site's subject, publish visitors' feedback, create a forum for discussing your project (the last only if the site's traffic will support an active forum). Interesting content is the key to attracting interested visitors;
A site created for people rather than for search engines has every chance of getting into important directories such as DMOZ, Yandex and others;
An interesting thematic site has much greater chances of receiving links, mentions, reviews and so on from other sites on the same topic. Such reviews can themselves bring a decent flow of visitors; in addition, external links from thematic resources will be valued by search engines.
In conclusion, one more piece of advice. As the saying goes, the shoemaker should make boots, and texts should be written by a journalist or a technical writer. If you manage to create engaging materials for your site yourself, very good; however, most of us lack the special skill of writing attractive texts, and in that case it is better to entrust this part of the job to professionals. It is a more expensive option, but in the long run it will pay for itself.

6.3 Choice of the domain and a hosting

Nowadays anyone can create a page on the Internet, and no expense is required for it. There are companies offering free hosting that will host your page in exchange for the right to show their advertising on it. Many Internet providers will also give you space on their server if you are their client. However, all these options have very serious drawbacks, so when creating a commercial project you should treat these questions more responsibly.
First of all, you should buy your own domain. It gives you the following advantages:

A project without its own domain is perceived as something ephemeral. Indeed, why should anyone trust a resource whose owners are not ready to spend even a symbolic sum on creating a minimal image? Publishing free materials on such resources is acceptable, but an attempt to build a commercial project without your own domain is almost always doomed to failure;
Your own domain gives you freedom in choosing a host. If your current company ceases to suit you, you can move the site to another, more convenient or faster platform at any moment.
When choosing a domain, keep the following points in mind:
Try to choose a domain name that is easy to remember and whose pronunciation and spelling are unambiguous;
For promoting international projects, domains in the corresponding international zones are the most suitable; domains from other zones can also be used, although this option is less preferable;
For promoting national projects, you should always take a domain in the corresponding national zone (.ru for Russian-language projects, .de for German ones, etc.);
In the case of bilingual (or multilingual) sites, it is worth allocating a separate domain for each language. National search engines value this approach more highly than subsections in various languages on the main site.
Domain registration costs (depending on the registrar and the zone) $10–20 per year.
When choosing a host, take the following factors into account:

Access speed;
Server availability (uptime);
Traffic cost per gigabyte and the amount of prepaid traffic;
It is desirable that the server be located in the same geographical region as the majority of your visitors;
Hosting for small projects costs around $5–10 per month.
When choosing a domain and a host, avoid "free" offers. Hosting companies can often be seen offering free domains to their clients. As a rule, the domain in such cases is registered not to you but to the company; that is, the actual owner of the domain is your hosting provider. As a result, you will be unable to change hosting for your project, or you will be forced to buy back your own, already promoted domain. In most cases it is also worth following the rule of not registering domains through a hosting company, as this can complicate a possible move of the site to another host (even though you are the full owner of the domain).

6.4 Change of the address of a site

Sometimes, for one reason or another, a project may need to change its address. Some resources start on free hosting with a free address, grow into full commercial projects, and require a move to their own domain. In other cases, a better name is found for the project. In any such situation, the question arises of how to move the site to the new address correctly.
Our advice here is as follows: create a new site at the new address with new, unique content, and place prominent links to the new resource on the old site so that visitors can move to your new site; however, do not remove the old site and its contents entirely.
With this approach you can receive search visitors to both the new and the old resource. You also gain the opportunity to cover additional topics and keywords that would be difficult to target within a single resource.
Moving a project to a new address is a difficult and not very pleasant task (since in any case promotion of the new address has to start practically from zero); however, if the move is necessary, you should extract the maximum benefit from it.

That’s it – we hope you have enjoyed this short manual on search engine optimisation.