Search Engines and Effective Search Engine Optimization
Search Engines and Effective Search Engine Optimization. You have completed your website, and now it is time to get it listed with the search engines. However, if you believe that all you need to do is build a website and visitors will simply come, you are dreaming. Promoting a site effectively takes time, attention, regular effort, and a small investment of money. I hope the following advice will help you succeed in driving traffic to your site. First, I would like to draw a clear distinction between crawler-based search engines and human-edited directories, which are often confused. With directories, you need not worry about your site's position in their search results, because human visitors seldom search there. But being listed in their index is essential.
Getting listed with them helps crawler-based engines find your site, and it may also help the site rank better, thanks to the link popularity these directories pass on to you. For instance, Google's directory is nothing more than a copy of the Open Directory Project (DMOZ) catalog. It is therefore quite important to be listed with DMOZ in order to get your site into Google's SERPs.
In the past, the market was dominated by so-called 'words-on-the-page' ranking systems: the more times a word was repeated on a page, the higher the page ranked. The major crawler-based search engines ranked pages based on where and how often the search terms appeared on them. Now engines such as Google give more weight to pages that are linked from sites with similar content. That is another reason why getting listed in directories helps: it makes it easier for an engine to determine the topic of your site. The major directories are the Open Directory (DMOZ), Yahoo! and LookSmart. Getting a site listed in DMOZ can be quite frustrating.
We know that being listed will probably help our Google ranking, but getting in can take a long time. Every site and page added to the directory must be reviewed manually before it is included. At the time of writing, DMOZ reported having more than 60 thousand volunteer editors, yet this number is misleading: it is the total number of editors the project has had since it began. In truth they do not have that many editors, nor anywhere near that many.
Nevertheless, the directory keeps growing and expanding its services. It now reports listing more than four million sites and has recently launched a new service, called Thumbshots, that lets you preview links before clicking. Yahoo!, for its part, charges USD 299.00 for express inclusion in its directory within seven days; otherwise you will have to wait from two to eight weeks.
In my opinion, it is not worth paying $299 to get into the index: you can get into Yahoo! for free through Google, because Yahoo!'s results are currently powered by Google. Optimize for Google, and you are optimizing for Yahoo! as well. This situation is likely to change now that Yahoo! purchased Inktomi last year and announced plans to develop it and use it as its primary search engine. As for LookSmart, the company is clearly losing ground.
After LookSmart lost its largest search portal client, MSN, two more businesses, Inktomi and Sprinks, announced that they would not renew their contracts with the company. Search engines constantly change their algorithms, and there are thousands of enthusiasts worldwide who track these changes and publish their findings.
Still, there are some general rules; by following them you can achieve real success without constantly rewriting your content or tracking the latest algorithm changes. But first, let's look at what search engines 'want' to see on your pages. The first rule of thumb: the Web is a textual environment, so the first thing you must care about is content.
Yes, attractive graphic design is very important and Flash is cool, but only for human visitors. Search engines do not 'see' the design, and you somehow need to make search engines 'like' your site so that they will drive people to it. You have to compromise. Most search engines cannot scan Flash objects, so you either need to get rid of site navigation implemented in Flash, or duplicate it with plain HTML. Some search engines, AllTheWeb being the first, claim that they can now find links in Flash; still, it is always better to be on the safe side. Moreover, Flash does not provide any textual content, which is vital for estimating the relevance of a page. So how do you optimize your pages so that search engines will 'like' them? Several issues arise in this context.
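The point about duplicating Flash navigation in plain HTML can be illustrated with a toy sketch. This is not how any real spider is implemented; it only shows, using Python's standard `html.parser`, that a text-based crawler can pick up ordinary `<a href>` links and visible text, while an embedded Flash object contributes nothing it can follow or index:

```python
# Toy sketch of how a text-only spider sees a page: it collects
# <a href> targets and visible text; the Flash <object> yields neither.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect link targets and visible text, as a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

page = """
<html><body>
  <object data="nav.swf" type="application/x-shockwave-flash"></object>
  <a href="/products.html">Products</a>
  <a href="/contact.html">Contact us</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)       # ['/products.html', '/contact.html']
print(parser.text_parts)  # ['Products', 'Contact us']
```

The `nav.swf` object is simply invisible to this kind of parser, which is why the article recommends an HTML duplicate of any Flash navigation.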
The first is that webmasters are often at a loss: are keywords a per-page or a per-site thing? The answer is obvious. Since it can be difficult to tell what the topic of a whole site is, especially if it touches on many topics, keywords should definitely be selected on a per-page basis. I have seen hundreds of pages containing heaps of keywords in their meta tags, in the hope that they would rank highly for those keywords. This is complete rubbish. The meta keywords tag has almost entirely lost its importance for ranking. For example, Google does not support it at all, although Inktomi (which powers MSN and will soon be used by Yahoo!), Teoma and some other engines do. To get a high ranking, you need to put your keywords in the actual text of the page.
Moreover, one search term per page is preferable, although two is not bad either. The next question is: how do you choose appropriate keywords? Be careful, because if you initially choose the wrong keywords, or words no one is likely to search for, your efforts will go down the drain. There are a few online services and programs that can help you compile a list of keywords for your site. For example, Overture's Inventory (inventory.overture.com/d/searchinventory/suggestion/) is a service with which you can find the most popular words related to your keyword. WordTracker (WordTracker.com) offers a similar service, but unlike Overture's, it is a paid one. You can also use the software package NetPromoter (www.Net-Promoter.com), which finds sites that are similar to yours but rank higher, scans their pages and meta tags, and extracts related keywords.
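The kind of meta-tag scanning described for NetPromoter can be sketched in a few lines. This is a hypothetical illustration, not NetPromoter's actual code: given the HTML of a higher-ranking competitor's page, it pulls the comma-separated terms out of the meta keywords tag so you can add them to your candidate list:

```python
# Illustrative sketch: harvest candidate keywords from a page's
# meta keywords tag, the way tools like NetPromoter are described
# as doing. Real tools also scan titles, headers and body text.
from html.parser import HTMLParser

class MetaKeywordReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "keywords":
                content = d.get("content", "") or ""
                self.keywords += [k.strip() for k in content.split(",")
                                  if k.strip()]

competitor_page = """
<html><head>
<meta name="keywords" content="computers in Ohio, used computers, hardware">
</head><body>...</body></html>
"""

reader = MetaKeywordReader()
reader.feed(competitor_page)
print(reader.keywords)  # ['computers in Ohio', 'used computers', 'hardware']
```

Since meta keywords are plain text chosen by the competing webmaster, treat the harvested terms only as suggestions to verify against a service like Overture's Inventory.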
In addition, you can use an ordinary thesaurus. It would be ridiculous to try to compete for generic words like 'computers', 'software' or 'programs' if you sell computers or software through your site, because the competition is too severe. Try to be precise. Go regional if you sell your products in a specific area: you have a much better chance of ranking higher if you compete for 'computers in Ohio' than just for 'computers'. Keyword density is another thing that must be evaluated when composing a page. It is a percentage measure of how many times a keyword is repeated within the text of the page. For example, if a page contains 100 words and ten of them are 'computer', then 'computer' is said to have a 10% keyword density. There are web programs that can measure keyword density for single words or for groups of words, such as 'new computers in Ohio'. A normal keyword density for a single page is about 2-6%, which is roughly the repetition rate of a keyword in normal speech.
Some experts even say that the normal figure can be as high as 16 percent, but I have never seen pages with such a high keyword density; I think such a page would look more like search engine spam. This article is dedicated to search engine optimization: take a pencil and count the number of repetitions of that phrase on the page, and you will not get beyond 2-6 percent for these words. By the way, to find out the keyword density on the pages of your site you can use any of numerous online services and programs, such as KeywordDensity.com (www.keyworddensity.com/), a free online service, Ranks.nl (www.ranks.nl/), GRKda (www.grsoftware.net/search_engines/software/grkda.html) and some others.
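The density calculation described above is simple enough to script yourself. Here is a minimal sketch, my own illustration rather than the formula used by any of the tools named above, that counts how often a keyword phrase occurs and divides the words it occupies by the total word count:

```python
# Keyword density: percentage of the page's words taken up by the phrase.
import re

def keyword_density(text: str, phrase: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    # Count occurrences of the phrase as a word sequence (overlaps allowed).
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

# The example from the text: 100 words, 10 of them 'computer' -> 10%.
page_text = " ".join(["computer"] * 10 + ["word"] * 90)
print(keyword_density(page_text, "computer"))  # 10.0
```

A result in the 2-6% range is what the article calls normal; anything approaching double digits is drifting toward what an engine may treat as spam.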
After choosing the keywords, you need to place them correctly. There is an order of importance among the structural elements of a page, and how you build your content from top to bottom is critical when writing copy for the web. Search engines try to deliver the information most relevant to your request, so the principles here are as follows. People typically scan information when they perform searches. Elements that are bolded, in a different color, or set off in bulleted or numbered lists are usually the things we notice first. This matters because if visitors do not see the terms they are looking for, they move on to other websites to find the information they are scanning for. The same applies to search engines: Google gives much weight to words placed in link text, the Title tag, bolded text and image alt text. Make sure there are keywords at the beginning of the page. Placed there, especially in header tags (H1-H6), they will be treated as words introducing the content of the page, and hence as more important than ordinary text.
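A placement audit along these lines can be automated. The following is a hypothetical checker, not a reproduction of any real tool or of Google's weighting, that reports whether a search term appears in the elements the paragraph above singles out: the Title tag, header tags and image alt text:

```python
# Hypothetical on-page audit: record which high-weight elements
# (title, headers, alt text) contain the chosen search term.
from html.parser import HTMLParser

class PlacementChecker(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self._stack = []       # open tags enclosing the current text
        self.found_in = set()

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt") or ""
            if self.keyword in alt.lower():
                self.found_in.add("alt")
            return  # void element, never pushed on the stack
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self.keyword in data.lower():
            for tag in self._stack:
                if tag == "title":
                    self.found_in.add("title")
                elif tag in ("h1", "h2", "h3"):
                    self.found_in.add("header")

page = """
<html><head><title>Used computers in Ohio</title></head>
<body><h1>Computers in Ohio</h1>
<img src="shop.jpg" alt="computers in Ohio storefront">
<p>We sell hardware.</p></body></html>
"""

checker = PlacementChecker("computers in ohio")
checker.feed(page)
print(sorted(checker.found_in))  # ['alt', 'header', 'title']
```

An empty or near-empty `found_in` set would signal that the page's most heavily weighted elements do not carry the term you are trying to rank for.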
There are so-called on-page and off-page factors that influence page importance. Among off-page factors, links are the most important. Search engines like Google attribute link text to the target page; they treat it as an important element of that page. Much has been said about link popularity in Google. Last year American bloggers tried a trick which was later called a 'Google bomb'.
They placed links on their pages containing the phrase 'miserable failure' and pointing to the official White House biography of George W. Bush, which caused the biography of the US president to rank No. 1 for the query 'miserable failure'. The idea soon became so popular that the first four positions are now occupied by George Bush, Jimmy Carter, Michael Moore and Hillary Clinton. Make sure that each page contains at least one outgoing link: pages with no links are called 'dangling pages', and these are something you should avoid. Probably the most important on-page element is the Title tag. The text of this tag is displayed as the title of the page in search results. Make sure the page's search term is included in this tag. There is nothing wrong with repeating it twice, just make sure it reads well. Think of a newspaper's front page.
Equipped with just a handful of words, its headlines make you want to start reading a story. It goes without saying that each page's Title tag should be different from the Title tags on the site's other pages. I have already mentioned header tags. The main problem with using them is their size: the H1 tag, for example, may look quite imposing next to the other elements on the page. But it is easily controlled with Cascading Style Sheets, which let you adjust both the size and the font. You can even try a little trick on search engines: define the font color of these headers to match the background color, the latter being composed of monochromatic images. This can actually be done with any element of the page.
But do not try submitting such pages to directories: people have a better chance of noticing the fraud than search robots do. There are several programs that can help you analyze and optimize your pages for search engines. The abovementioned NetPromoter (www.Net-Promoter.com) has a module called Page Analyzer, a utility that analyzes your pages either by Google's criteria or by general search engine criteria, and also measures keyword density. Web Position Gold (www.WebPosition.com), the most popular such program in the USA, contains the Page Critic module, which does approximately the same. AddWeb (addweb.com/) is another popular program for search engine optimization and submission, which has recently released its latest version. Certainly, experienced webmasters can easily do without these programs, but for a novice they can be of great help, especially for understanding the indexing process and tracking ranking changes. Search engine software also suits larger companies that manage many websites and wish to maintain an in-house marketing team.
An individual or a group can be trained in the use of the software and in the basic skills of search engine marketing. It can also be a good option for a small or medium business that cannot afford to employ dedicated SEO personnel. A good SEO campaign will actually cost you nothing in the long run, because it will translate into increased exposure and increased site traffic, which often brings an increase in revenue.