20 Things You Need to Know Before Optimizing a Website

Posted by damnedviper

By Kalena Jordan

One of the most important aspects of a search engine optimization project is also one of the most overlooked – preparation! There are some important steps to take in advance of optimizing your site that will make sure your SEO is successful. You can see the pre-requisites for Search Engine Optimization here.

Next, you need to establish the project requirements so you can tailor the SEO campaign to your or your client's exact needs. For those of you servicing clients, this information is often required before you can quote accurately.

To determine your project requirements, you need to have the following questions answered:

1) What technology was used to build the site? (e.g. Flash, PHP, frames, ColdFusion, JavaScript, flat HTML etc.)

2) What are the file extensions of the pages? (e.g. .htm, .php, .cfm etc.)

3) Does the site contain database-driven content? If so, will the URLs contain query strings (a "?" followed by parameters, e.g. www.site.com/longpagename?source=123444fgge3212), or does the site use URL-rewriting workarounds to remove the query strings? The latter is more search engine friendly.

4) Are there at least 250 words of text on the home page and other pages to be optimized?

5) How does the navigation work? Does it use text links, graphical links or JavaScript drop-down menus?

6) Approximately how many pages does the site contain? How many of these will be optimized?

7) Does the site have a site map, or will it require one? Has an XML sitemap been submitted to Google Sitemaps?

8) What is the current link popularity of the site?

9) What is the approximate Google PageRank of the site? Would it benefit from link building?

10) Do I have the ability to edit the source code directly, or will I need to hand over the optimized code to programmers for integration?

11) Do I have permission to alter the visible content of the site?

12) What are the products/services that the site promotes? (e.g. widgets, mobile phones, hire cars etc.)

13) What are the site's geographical target markets? Are they global? Country specific? State specific? Town specific?

14) What are the site's demographic target markets? (e.g. young urban females, working mothers, single parents etc.)

15) What are 20 search keywords or phrases that I think my/my client's target markets will use to find the site in the search engines?

16) Who are my/my client's major competitors online? What are their URLs? What keywords are they targeting?

17) Who are the stakeholders of this site? How will I report to them?

18) Do I have access to site traffic logs or statistics to enable me to track visitor activity during the campaign? Specifically, what visitor activity will I be tracking?

19) How do I plan to track my or my client's conversion trends and ranking improvements in the search engines?

20) What are my/my client's expectations for the optimization project? Are they realistic?
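The query-string workaround raised in question 3 is usually a server rewrite rule that maps a clean, static-looking URL onto the underlying dynamic script. A minimal sketch for Apache (the paths and parameter name are illustrative, and it assumes mod_rewrite is available):

```apache
RewriteEngine On
# Serve /widgets/123 from the real dynamic script, so spiders
# never encounter the "?" query string directly.
RewriteRule ^widgets/([0-9]+)$ /product.php?source=$1 [L]
```

With a rule like this in place, internal links can point at the clean URLs while the dynamic page continues to do the work behind the scenes.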

Answers to the first 10 questions above will determine the complexity of optimization required. For example, if the site pages currently have little text on them, you know you'll need to integrate more text to make the site compatible with search engines and include adequate target keywords. If the site currently uses frames, you will need to rebuild the pages without frames or add <noframes> content to make sure the site can be indexed, and so on.

This initial analysis will help you to scope the time and costs involved in advance. For those of you optimizing client sites, obtaining accurate answers to these questions BEFORE quoting is absolutely crucial; otherwise you can find yourself in the middle of a project that you have severely under-quoted.

The remaining questions establish in advance the who, what, where, when, why and how of the optimization project. This will help you determine the most logical keywords and phrases to target, as well as which search engines to submit the site to.

For those of you optimizing web sites for a living, you might consider developing a questionnaire that you can give clients to complete to ensure you tailor the web site optimization to their exact needs.


Pre-Requisites for Search Engine Optimization


Before you start any search engine optimization campaign, whether it's for your site or that belonging to a client, you need to answer the following questions:

1) What is the overall motivation for optimizing this site? What do I (or they) hope to achieve? e.g. more sales, more subscribers, more traffic, more publicity etc.

2) What is the time-frame for this project?

3) What is the budget for this project?

4) Who will be responsible for this project? Will it be a joint or solo effort? Will it be run entirely in-house or outsourced?

Answering these questions will help you to build a framework for your SEO project and establish limitations for the size and scope of the campaign.

Search Engine Compatibility Review

A Search Engine Compatibility Review consists of a detailed overview and analysis of a site's search engine compatibility in terms of HTML design, page extensions, link popularity, title and META tags, body text, target keywords, ALT attributes on images, page load time and other design elements that can impact search engine indexing.


Check your site for Spam


Spam is an overused word. Describing everything from luncheon "meat" foodstuffs to the flood of advertising email we all receive each day, the simple four-letter word is as widely used as many other deceptively descriptive four-letter words. Suffice it to say that spam is not counted amongst the most honourable words in the English language. The IT world has forced English to evolve very quickly as old words are applied to new ideas, and "spam" has become synonymous with words such as "junk" and "waste". In the search engine optimization field, spam describes deceptive techniques used to try to fool search engine spiders. This section outlines a few of the more frequent spam techniques used by webmasters or unscrupulous SEO practitioners. Check your websites for any incidents of spam and do your best to get rid of them, as they could seriously damage your search engine rankings.

Hidden Text
Some webmasters hide text by making it the same colour as the background of the page. The thinking is that a collection of keywords can be put on the page that is "invisible" to human visitors but will still be perceived and recorded by search engine spiders. This technique actually worked on first-generation search engines, but it was so heavily exploited by the adult entertainment industry that search tools now treat it as spam, punishable by banning. The easiest way to find hidden text on a site is to press Ctrl + A, or hold down the left mouse button and drag across the entire body of the page. This highlights all text and images on the page, including any text that is the same colour as the background. If you find hidden text on your site, delete it or make it visible by changing the text colour.

Off Topic or Redundant Text
Websites exist to spread messages about specific things, be they products, people or politics. If your site uses text that does not relate to the topic of the site in any way, it will likely be considered spam, and if this type of spam appears on your site too often, your site will likely be banned.

Keyword Repetition

Any excessive repetition of specific words in the keywords meta tag will be considered spam and will likely cause a website to rapidly lose position in search engine indices. A common rule of thumb is to use the same word a maximum of three times in the Keywords tag.

Body Text
While there is no common rule for the frequency of keywords in body text, each word used on your site must be relevant to the topic of your site and must be used in proper context. In other words, you can't just dump a mass of keywords in the lower section of your website and hope search engines reward your efforts with high rankings. Unless each word has a good reason for being there, it will likely be considered spam.
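As a rough self-check on repetition, you can count how often each target keyword appears relative to the total word count of your body text. This is a sketch only; no engine publishes its actual thresholds, and it handles single-word keywords only:

```python
import re
from collections import Counter

def keyword_density(text, keywords):
    """Return {keyword: (count, percent of total words)} for a block of
    body text.  A rough self-audit, not any engine's real algorithm."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words) or 1          # avoid division by zero on empty text
    counts = Counter(words)
    return {kw: (counts[kw.lower()],
                 round(100.0 * counts[kw.lower()] / total, 1))
            for kw in keywords}

body = ("Our used BMW dealership sells certified used cars. "
        "Browse used BMW listings or book a BMW test drive today.")
print(keyword_density(body, ["BMW", "used"]))
```

If one keyword accounts for a double-digit percentage of the words on the page, the copy probably reads as unnaturally as it will look to a spider.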

Transparent or Hidden Links

Many search engine spiders work by following all links found on any page they come across. A technique that once produced strong results was to hide links behind a tiny, invisible image, typically a 1 pixel by 1 pixel dot. This technique simply doesn't work anymore and will likely get a site using it banned for spamming. Unfortunately, it is a bit more difficult to find unless you know what you're looking for. The easiest way is to examine the source code of your page (in most browsers, via the View menu's Source option) and use the Find feature to search for the following: width="1" height="1" border="0". If the search turns anything up, take a close look at it, and if it is a link to another URL, remove it immediately.
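If you would rather scan for such dots programmatically, here is a quick sketch. It is regex-based and assumes fairly ordinary markup; a thorough audit would use a real HTML parser:

```python
import re

def find_tiny_images(html):
    """Flag <img> tags declared 1 pixel wide and 1 pixel high -- the
    classic hidden-link dot described above."""
    suspects = []
    for tag in re.findall(r"<img\b[^>]*>", html, re.IGNORECASE):
        width = re.search(r'width\s*=\s*["\']?(\d+)', tag, re.IGNORECASE)
        height = re.search(r'height\s*=\s*["\']?(\d+)', tag, re.IGNORECASE)
        if width and height and width.group(1) == "1" and height.group(1) == "1":
            suspects.append(tag)
    return suspects

# The pattern from the Find example above, wrapped in a link:
page = '<a href="http://example.com/"><img src="dot.gif" width="1" height="1" border="0"></a>'
print(find_tiny_images(page))
```

Any tag this flags deserves a manual look; the dangerous case is a tiny image sitting inside an anchor that points to another URL.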

Pages Generated by Mechanical Means

There are dozens of instant page creators on the market today. Most of them are junk, producing pages that are easily spotted by search engine spiders. More often than not, an instant solution will get you in trouble, as search engines are deluged with sites that offer no real new information and exist only to manipulate search engine listings. Website design is becoming so simple that mechanically generated sites are unnecessary.


Cloaking

Cloaking is a technique that offers one set of data to search engine spiders and another set to live human users. Cloaking is a highly advanced technique and is not likely to be in use without the knowledge of the website owner or webmaster.

Duplicate Content

With the advent of affiliate marketing schemes, there are tens of thousands of sites with duplicate content out there. Because they are in the business of providing accurate and relevant results, search engines do not like sites that repeat what another site has already said. If your site is one of ten thousand duplicates, chances are you're not going to rank well anyway, especially since your site will not remain in the search engine's database for very long.

You may also see the article 15 Methods of SEO Spam


7 Steps - 10 Minute Search Engine Optimization


Here I have listed seven really useful tips which are excerpts of articles written by Ross Dunn.

1. Spot Check for Spam

Before optimization begins it is crucial that your page is devoid of any 'tricks' that a search engine may construe as spam. You may not even know they are there! Some web designers working from out-of-date information add hidden text because they think it will help your rankings. This is entirely untrue and can cause extreme problems for your search engine visibility. To check for hidden text, press Ctrl + A, or hold down the left mouse button at the very top of your page and drag down to the bottom. If there is any hidden text (normally text that matches the color of your page background) it will appear once highlighted. Second, search for unnecessary repetition of keywords on your page. Do you see instances of 5 or 6 keywords being repeated in a row anywhere? If so, remove them.

There are other forms of spam that you should be aware of that are not covered within this article. For more information, see our resource article: " Check your site for Spam".

2. Polish & Optimize Text

The text on your home page is crucial to maintaining the attention of fresh viewers. In fact, copy is so important that many companies prefer to have a professional copywriter create the content. Since this is not an option for many, you have to keep your text engaging as well as smartly optimized, presenting an obvious topic to visiting search engine spiders and human visitors alike. To do this, keep your mind focused on the keyword(s) you have chosen to target while you write the content for the page, and implement the keywords within the text without sacrificing the true intent of the information: to engage and retain your viewers.

Important Note: Often the first 25-30 words of your home page are what each search engine will use for the description of your web site. Try to utilize your target keyword(s) within this area but be certain the resulting sentence is legible and descriptive.

3. Optimize the Title

The title of the home page can be optimized quite simply: place the keyword or keyword phrase that you have chosen to target first, then finish with the name of your company or web site. For example:

"BMW Car Sales - MyBMW.com"

The search engines place a great deal of weight on the title of your home page, and the keywords within your title will likely be chosen as the title of your listing on the search engines. The title also plays a huge role in the relevance of your site: the topic denoted within your title will be compared to the content within the body of the page, and if the topic and content match, you will have a much better chance of obtaining higher targeted placements.
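In the page source, that keyword-first pattern is simply the title element in the document head (the company name here is the article's own example):

```html
<head>
  <!-- Target phrase first, company/site name last -->
  <title>BMW Car Sales - MyBMW.com</title>
</head>
```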

4. META Tags

Save yourself the time of creating a Keywords META tag; it is mainly ignored by search engines today. The fact is that the Keywords tag was abused too heavily by spammers and now carries little or no weight due to its suspect nature. The Description tag, however, is just as important as ever and should be carefully crafted for each web page.
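A hand-crafted Description tag is a single line in the page's head; the wording below is purely illustrative:

```html
<meta name="description"
      content="BMW car sales, accessories and service from MyBMW.com, your certified local BMW dealer.">
```

Since many engines display this text under your listing, write it as a short sales pitch rather than a keyword list.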

For some META commands, please see the article SEO: Meta Commands for SE spiders.

5. Navigation: Provide Clear Paths

Is the navigation within your site entirely graphical or script-driven? If so, I recommend that you create a textual link menu at the bottom of your page. For added punch (and this is a big hit) make each link count by using the keyword/phrase that best describes the target page. For example:

BMW Accessories | Used BMW for Sale | More About MyBMW.com

versus these less powerful links:

Accessories | Used Cars | About Us
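In the page source, such a footer menu is nothing more than plain HTML anchors with keyword-rich anchor text (the URLs here are hypothetical):

```html
<p>
  <a href="/accessories.htm">BMW Accessories</a> |
  <a href="/used.htm">Used BMW for Sale</a> |
  <a href="/about.htm">More About MyBMW.com</a>
</p>
```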

6. Test and Spell-Check

Before you upload your site I recommend the following course of action:

Establish a Baseline: Do you have traffic reporting for your website? (Most do.) If so, I highly recommend taking a snapshot of your latest traffic and visitor-behavior statistics to give yourself a baseline; after all, you won't know just how successful your promotions have been without something to compare them to, such as whether visitors are staying longer or leaving sooner.

Preview the page and spell-check it; then, once you are certain everything is properly balanced, upload it to your server.

7. Hurry Up and Wait

Paid submission to search engines has largely gone the way of the dodo these past few years. In our experience, the best way to get your site listed is to publish an article or press release and have it syndicated by a press release dissemination service such as PRWeb.com. Syndication will bring you some instant traffic and likely some free backlinks, which will earn your website a visit from many of the more worthy search engine spiders.


15 Dirty Tricks Used by Spammers


by Jim Hedger

Please note: I do not encourage, endorse or suggest the use of any of the techniques listed here. We don't use them, and our clients' sites continue to rank well at Google, Yahoo, MSN and Ask. It is also worth noting that Google has been the dominant search engine for almost five years; most of these spammy tricks evolved in order to game Google and might not apply to the other engines. If you get caught, your site might be banned from the search listings.

1. Cloaking

Also known as "stealthing", cloaking is a technique that involves serving one set of information to known search engine spiders or agents while displaying a different set of information on documents viewed by general visitors. While there are unique situations in which cloaking might be considered ethical, it is never required in the day-to-day practice of SEO. This is especially true after the Jagger algorithm update at Google, which uses document and link histories as important ranking factors.

2. IP Delivery
IP delivery is a simple form of cloaking in which a unique set of information is served based on the IP address the query originated from. IP addresses known to belong to search engines are served one set of information, while unrecognized IP addresses (assumed to be live visitors) are served another.

3. Leader Pages
Leader pages are a series of similar documents, each designed to meet the requirements of a different search engine algorithm. This is one of the original SEO tricks, dating back to the earliest days of search when there were almost a dozen leading search engines sorting fewer than a billion documents. The major search engines consider it spam because they see multiple incidents of what is virtually the same document. Aside from that, the technique is no longer practical, as search engines consider a far wider range of factors than the arrangement or density of keywords in individual documents.

4. Mini-Site networks
Designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm, mini-site networks were very much like leader pages except they tended to be much bigger. The establishment of a mini-site network involved the creation of several topic or product related sites all linking back to a central sales site. Each mini-site would have its own keyword enriched URL and be designed to meet specific requirements of each major search engine. Often they could be enlarged by adding information from leader pages. By weaving webs of links between mini-sites, an artificial link-density was created that could heavily influence Google's perception of the importance of the main site.

In the summer of 2004, Google penalized several prominent SEO and SEM firms for using this technique by banning their entire client lists.

5. Link Farms
Link farms emerged as free-for-all link depositories when webmasters learned how heavily incoming links influenced Google. Google, in turn, quickly devalued and eventually eliminated the PR value it assigned to pages with an inordinate collection or number of links. Nevertheless, link farms persist as uninformed webmasters and unethical SEO firms continue to use them.

6. Blog and/or Forum Spam
Blogs and forums are amazing and essential communication technologies, both of which are used heavily in the daily conduct of our business. As with other Internet based media, blogs and forum posts are easily and often proliferated. In some cases, blogs and certain forums also have established high PR values for their documents. These two factors make them targets of unethical SEOs looking for high-PR links back to their websites or those of their clients. Google in particular has clamped down on Blog and Forum abuse.

7. Keyword Stuffing
At one time, search engines were limited to sorting and ranking sites based on the number of keywords found on those documents. That limitation led webmasters to put keywords everywhere they possibly could. When Google emerged and incoming links became a factor, some even went as far as using keyword stuffing of anchor text.

The most common continuing example of keyword stuffing can be found near the bottom of far too many sites in circulation.

8. Hidden Text
It is amazing that some webmasters and SEOs continue to use hidden text as a technique but, as evidenced by the number of sites we find it on, a lot of folks still use it. They shouldn't.

There are two types of hidden text. The first is text that is coloured the same shade as the background thus rendering it invisible to human visitors but not to search spiders. The second is text that is hidden behind images or under document layers. Search engines tend to dislike both forms and have been known to devalue documents containing incidents of hidden text.

9. Useless Meta Tags
Most meta tags are absolutely useless. The unethical part is that some SEO firms actually charge for the creation and insertion of meta tags. There seems to be a meta tag for virtually every possible factor, but for the most part they are not considered by search spiders.

All other identifying or clarifying information should be visible on a contact page or included in the footers of each page.

10. Misuse of Directories
Directories, unlike other search indexes, tend to be sorted by human hands. Search engines traditionally gave links from directories a bit of extra weight by considering them links from trusted authorities. A practice of spamming directories emerged as some SEOs and webmasters hunted for valuable links to improve their rankings. Search engines have since tended to devalue links from most directories. Some SEOs continue to charge directory submission fees.

11. Hidden Tags
There are a number of different sorts of tags used by browsers or website designers to perform a variety of functions: comment tags, style tags, alt attributes, noframes tags, and http-equiv tags. For example, the alt attribute is used by screen readers for the blind to describe visual images. Inserting keywords into these tags was a technique used by a number of SEOs in previous years. Though some continue to misuse these tags, the practice overall appears to be receding.

12. Organic Site Submissions
One of the most unethical things a service-based business can do is charge clients for a service they don't really need. Charging for, or even claiming, submissions to the major search engines is an example. Search engine spiders are advanced enough that they no longer require site submissions to find information; they find new documents by following links. Site submission services or SEO firms that charge clients a single penny for submission to Google, Yahoo, MSN or Ask Jeeves are radically and unethically overcharging those clients.

13. Email Spam
Placing a URL inside a "call-to-action" email continues to be a widely used form of search marketing spam. With the advent of desktop search appliances, email spam has actually increased. StepForth does not use email to promote your website in any way.

14. Redirect Spam
There are several ways to use redirects to fool a search engine or even hijack traffic destined for another website. Whether the method used is a 301, a 302, a meta refresh or a JavaScript redirect, the end result is search engine spam.

15. Misuse of Web 2.0 Formats (e.g. wikis, social networking and social tagging)
An emerging form of SEO spam is found in the misuse of user-input media formats such as Wikipedia. Like blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open range for SEO spam technicians. Many of these exploits may even find short-term success, though it is only a matter of time before measures are taken to devalue the efforts.


Troubleshoot Dropped Search Engine Rankings


Are you baffled by a recent drop in your search engine rankings? Do you know where to start to get a handle on what the problem might be and how to remedy it? One option is to use search engine forums as a resource; they are full of questions from people who have experienced similar situations and are a great place to find an answer or two. But let's say you really want to get to the bottom of the problem and you want to do it yourself. The following are some of the first steps StepForth takes when evaluating dropped rankings.

Retrace Your Steps

Write a list of everything that anyone has done to your site within the past 3 weeks. Now look for anything that could have negatively impacted your content, site structure, or the reliability of your URLs. Once you write down the course of events the answer might pop right out at you. Here are some common situational culprits:

  • You just moved your website to a different hosting provider: did your site experience much, if any, downtime during the switchover? Quality hosting companies will let you set up your site on their servers before the switch takes place so that downtime is minimized, if not eliminated entirely. If a search engine happened to visit your site while it was down, there is a small chance your rankings would be negatively affected, but only for a short period; once the search engine re-indexes your website, everything should be back in order.

  • The structure of your site has permanently changed: did you redirect the traffic from the old URLs to the new URLs using a 301 redirect? If not, you should. A 301 redirect is a permanent redirect that tells any visiting search engine to permanently update its index to reflect the new site structure.

  • Contact your hosting company to check if your server has had any downtime recently. In most cases search engines will not drop your rankings if they visit your site once while it is offline; however, if this happens consistently, your rankings can fall. If your hosting company states that downtime has occurred, then you have at least one possible answer for your ranking woes. As long as your site is now reliably online and has not been offline for an extended period (days or weeks), the rankings should reappear as your site is re-indexed. There may be a notable drop in rankings but, in most cases, they will return to pre-incident status.
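For the 301 redirect described above, Apache servers typically carry the rules in an .htaccess file. A sketch with placeholder paths:

```apache
# Redirect one moved page permanently:
Redirect 301 /old-page.htm http://www.site.com/new-page.htm

# Or redirect an entire renamed directory, preserving the trailing path:
RedirectMatch 301 ^/old-dir/(.*)$ http://www.site.com/new-dir/$1
```

Other server platforms have their own mechanisms; the essential point is that the response code must be 301 (permanent), not 302 (temporary).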

Check Your Content

Is all of your textual content up to date? It is amazing just how quickly a website's rankings can drop when someone accidentally overwrites optimized pages with older, non-optimized pages. Check the content and if you find old content, just overwrite it with the newer content and wait for the search engines to come back and re-index your website; Google and Yahoo are likely to come back within a week or even a day.

Check Your Server Headers

When a search engine spider requests a page, your server answers first with HTTP headers, and those headers tell the spider how to treat the page. As a result, we like to verify that no incorrect, unusual or unnecessary commands are stashed in your site's headers. We use the free SEO Consultants Check Server Headers Tool to review headers and take action if required, but others are freely available as well.
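You can also fetch and interpret the headers yourself with a small script along these lines (a sketch; the status-code notes are general guidance, not any engine's documented policy):

```python
import http.client

def fetch_status(host, path="/"):
    """HEAD-request a page and return its status code and headers --
    roughly what an online header-checker reports (needs network access)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    status, headers = resp.status, dict(resp.getheaders())
    conn.close()
    return status, headers

def explain_status(code):
    """Rough interpretation of the codes that matter most for rankings."""
    if code == 200:
        return "OK - page served normally"
    if code == 301:
        return "permanent redirect - engines transfer the listing to the new URL"
    if code in (302, 307):
        return "temporary redirect - the old URL normally stays in the index"
    if code == 404:
        return "not found - the page will eventually be dropped"
    if 500 <= code < 600:
        return "server error - repeated errors can look like downtime"
    return "uncommon response - worth investigating"

print(explain_status(301))
```

If your home page unexpectedly answers with anything other than 200 (or a deliberate 301), that is a strong lead on a ranking drop.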

Search Engine Webmaster Tool Resources

If you have not already done so, I strongly recommend claiming your website on Yahoo Site Explorer, Google Webmaster Central and Live Search Webmaster Center. Each of these fine resources provides extremely useful feedback (from the respective search engine's perspective) for site owners, such as:

  • Whether your site is currently banned. If you are, in some cases they will tell you why.
  • Notes on any impediments the search engine has experienced when trying to index your website.
  • Who is linking to your website.
  • Which pages are the most popular on your website.
  • Which keywords lead the most traffic to your website.

In addition, these free webmaster resources allow you to submit an XML sitemap of your website so that you can ensure no pages are missed when the search engines index your website.
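A sitemap in the standard XML format is quite small; a minimal example (the URLs and date are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.site.com/</loc>
    <lastmod>2008-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.site.com/products.htm</loc>
  </url>
</urlset>
```

One `<url>` entry per page is enough; the optional `<lastmod>` date helps engines decide what to re-crawl.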

Search Your Site for SPAM

It is possible that your site has been "lucky" enough not to have been penalized until now for certain content transgressions. You see, search engines don't always catch SPAM right away. In fact, I occasionally find myself shaking my head in disbelief when I see blatantly spammy sites appearing in the top 10 search results. Your site may not be entirely spammy, but all it takes is for one transgression to come to light for a search engine to penalize your rankings. What SPAM is and how to identify it is an article unto itself; for a rundown of common techniques, see the earlier post "Check your site for Spam".

Contact an SEO or Request a Forum Review

If you haven't found a reasonable answer after following the instructions above, I would recommend either contacting a reputable SEO company for advice or posting your ranking problems publicly on a popular search marketing forum such as Webmaster World. There are a lot of people on forums who can be incredibly helpful and may have an answer for you. But a word to the wise: make sure the person providing advice has a solid reputation. I strongly recommend reviewing a number of their previous posts and Googling them to ensure they have suitable experience to provide advice – unless of course you have a 'no duh' moment where their advice makes perfect sense.

See the original article here


SEO: Meta Commands for SE spiders


Nearly all search engines utilize spiders (which are also known by their original name, robots) to go out and scour the web looking for web pages. These search engine spiders then bring the data back to be indexed by the engine.

Since roughly 1996, individual meta commands have existed that can be used on individual web pages to modify how these search engine spiders behave. The most useful of these commands are fairly universal and respected by almost all search engines. What follows is a list of some of the more popular spider commands and instances in which you might want to use them.

<meta name="robots" content="index">

This meta command is one of the most common ones used - and it is also the least necessary. It tells search engine spiders to come on in and put the page in their index. However, all search engines do this by default anyway. Basically, if you want to put it in there for fun, be my guest, but this command is not giving you any special treatment. All search engines are going to index your page, unless you specifically tell them otherwise.

<meta name="robots" content="follow">

The follow command is different from the index command. It basically requests that the search engine spiders follow the links that are on a particular page. Again, however, this piece of code is completely unnecessary because all search engines are going to follow the links on a page, unless otherwise directed.

<meta name="robots" content="noindex">

The noindex command, the opposite of the index command, tells search engine spiders not to index the content of a page. It's important to note however that search engine spiders will still follow the links on a page that uses only this command.

When not used for legitimate purposes, this tag can put you at risk of penalization by most, if not all, search engines, because it can be used to keep link-filled pages out of the search results, hiding them from searchers while spiders still follow the links they contain.

There are however some legitimate uses for the noindex command. For example, if you have a dynamic site and you've created static pages to replace some of your dynamic pages, which can make them easier for search engine spiders to access, you could put a noindex tag on the dynamic version.

As Google mentions in its Webmaster Help Center:

"Consider creating static copies of dynamic pages. Although the Google index includes dynamic pages, they comprise a small portion of our index. If you suspect that your dynamically generated pages (such as URLs containing question marks) are causing problems for our crawler, you might create static copies of these pages."

In cases like these, it is acceptable to use the noindex command on the dynamic version of the page so that your content will not be treated as duplicate. You are not tricking the search engines; you're just redirecting them.

<meta name="robots" content="nofollow">

This tag tells search engine spiders that it's OK to index and list a page, but that they shouldn't follow any of the links on it. This can be useful if, for example, some partners requested a link on your site that you felt obligated to give, but you wanted to hold onto as much PageRank as possible. Whether that's fair is between you and your own personal god, but you could add the nofollow attribute to the meta tags of a partners page and, in effect, pass none of your PageRank to the sites you link to. The nofollow command tells the search engines that this is the end of the line.

<meta name="robots" content="noindex,nofollow">

Obviously, noindex and nofollow are powerful tags - and in combination, they can make a page and the subsequent pages to which it links invisible to nearly all search engines. This combination command tells search engine spiders, "Do not read this page; do not follow any of the links on this page; do not include this page in your index."

This command has its beneficial uses. For example, it can be placed on pages on a site that have duplicate content for legitimate reasons. A website might have both a page for the United States and a page for England that cover the same product with exactly the same content. However, nearly all search engines would see this as duplicate content and could devalue both pages. So placing this command on one of them means that search engine spiders will walk on by and you won't be penalized.

<meta name="robots" content="noarchive">

Finally, almost all search engines today, including Google and Yahoo, offer a cached version of a page alongside its listing, a snapshot of what the page looked like when it was last crawled. The noarchive tag tells spiders not to store that snapshot, which is useful when content on your site is timely and you don't want people to be able to pull up an old version later.

For example, a business might run a one-time special that has a ridiculously low price to drum up some business while things are slow. The business will want to be able to shut that sale down as soon as sales are back up to a solid level. However, it is conceivable that someone could click on the cached version of the business's site, see the old deal that was out there, and insist on getting it for themselves. By using the noarchive tag, you are telling search engine spiders, in effect, "This page is subject to frequent changes, and I don't want my visitors to have access to some of this content at a later time."


The commands discussed above are just a few of the ones in existence, and new ones are being added frequently. While nearly all search engines support these commands, there are still some that don't. The ones in this article, however, are fairly universally understood by search engine spiders, no matter from where they originate.
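To make the behavior of these directives concrete, here is a minimal sketch, in Python, of how a crawler might read the robots meta tag and decide whether to index a page, follow its links, and cache a snapshot. This is purely illustrative (no search engine's actual code) and the function names are my own.

```python
# Illustrative sketch: interpreting <meta name="robots" content="..."> the way
# the article describes. Not any search engine's real implementation.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            for token in (attrs.get("content") or "").split(","):
                self.directives.add(token.strip().lower())


def crawl_policy(html):
    """Return the spider's decisions for a page. Defaults are permissive,
    which is why bare "index" and "follow" commands are redundant."""
    parser = RobotsMetaParser()
    parser.feed(html)
    d = parser.directives
    return {
        "index": "noindex" not in d,      # index unless told otherwise
        "follow": "nofollow" not in d,    # follow links unless told otherwise
        "archive": "noarchive" not in d,  # cache a snapshot unless told otherwise
    }


page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(crawl_policy(page))  # {'index': False, 'follow': False, 'archive': True}
```

Note that a page with no robots meta tag at all gets the same treatment as one with "index,follow": everything defaults to on, exactly as the article explains.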

Read More...... If you like the post then please vote on any Social Media Network.

Google's Knol - Wikipedia's Rival

Posted by damnedviper | | 0 comments »

By Mikhail Tuknov

There's a new kid on the online block named Knol and even this early in the development stage, some people are already predicting that it could bring about yet another significant change to the way we share information on the Internet.

Knol is a new Web service being developed by Google that is meant to serve as a virtual storehouse of knowledge on the Internet. With content being contributed by various experts on different topics, it will behave much in the same way that Wikipedia does currently. In fact many industry experts have made the suggestion that Knol is set to become a direct competitor to Wikipedia and other similar types of web sites.


Google is of course the go-to site as far as search engines go, being by far the most popular search engine today. If Knol is as successful in drawing a widespread following as its developers hope, it could mark Google's transition from a search engine into a company that creates and publishes Web content.

Some industry observers warn that one problem that could potentially arise is that Google's objectivity in presenting search results could be compromised.

Knol – the name of which is derived from the word "knowledge" – is being developed to allow people to create Web pages on virtually any topic. When completed, it will include several features that will allow its users the ability to perform a number of tasks, such as submitting comments, rating individual web pages and suggesting changes.

We mentioned earlier in this article that Knol has been compared to Wikipedia by many industry analysts. While there are in fact many similarities between the two web services, the main difference is that Wikipedia allows virtually anyone to edit an entry while Knol only allows the author of each particular "knol," – which is what the individual pages in the service will be called – to do so. This means that the same topics could have many different authors with sometimes contrasting – or even competing – points of view.

Google has stated that the main thrust of the Knol project is to focus attention on authors who have sufficient expertise on particular topics. As Udi Manber, Google's vice president for engineering, wrote in the Google corporate blog recently, the Internet has evolved largely without a standardized means of highlighting the author's name on each web article. He goes on to say that the company believes that knowing who wrote a particular article will considerably help users make better use of the Internet and its content.

Manber also stated that another important goal of Knol was to cover a wide range of topics, from the various sciences to health concerns to history. Eventually they hope to have Knol become the first stop for research on any topic. Today it is Wikipedia that provides that function and its web pages show up at the top of the results page of Google and many other search engines more often than not.

Some in the industry have suggested that this latest move by Google is driven by the unprecedented growth of knowledge-resource web sites such as Wikipedia, and that Google feels the need to have a strong presence in that particular area.

Wikipedia is by no means the only web site that offers that type of service. Many other companies have taken slightly different approaches in functioning as knowledge repositories on various topics on the Internet. These services include Squidoo, Yahoo Answers, About.com and Mahalo.

In spite of the widespread popularity of these services – as well as the existence of many free tools that allow experts and regular people the means by which they can share their knowledge online – Manber said that Google feels that it is still not easy enough for the average user to do those things.

Interestingly, considering all the hype and excitement currently surrounding the news of Knol's existence, Google has refrained from discussing the project beyond these initial details, and has even said that it is still an experimental project at this time. This means that, just like many other Google tests that never saw the light of day, Knol could end up never being released publicly at all.

As for Wikipedia, site founder Jimmy Wales has downplayed his site's comparison with Knol, saying that while Wikipedia's goal is utmost objectivity in its content, with each individual article being the sum total of the collective knowledge of its various authors, Knol's model will likely result in highly opinionated and possibly even contradictory articles on even the simplest of topics.

Another important distinction is that Wikipedia is a strictly non-profit web site that does not carry any type of advertising, while Knol is a decidedly more commercial venture, with its content authors earning revenue from any Google ads on their site.

Editor's Note: Currently, Knol is accessible by Google invitation only. Some additional information on Knol can be found at:

Google Blog


84 Beautiful Wordpress Themes You (Probably) Haven’t Seen

Posted by damnedviper | | 0 comments »

Sometimes it’s just like searching for a needle in a haystack: if you’ve ever googled for free, quality Wordpress Themes, you know exactly what we’re talking about. Most designers love to create Wordpress themes, so they can demonstrate the quality of their work and add some fresh work to their portfolios.

However, most Wordpress themes are either used too often (the Kubrick theme, now K2, is definitely overused) or simply have nothing to offer, particularly if you are looking for a free, impressive and professional design. Sometimes, though, the search is worth it. In the gallery below you’ll find 84 free Wordpress Themes you probably haven’t seen yet. All themes offer quality, elegance and a user-friendly interface.

1. Redoable 1.0 [ Preview ]

2. Iceburgg [ Preview ]

3. Soxnest [ Preview ]

4. Gridlock [ Preview ]

5. Fresh Theme [ Preview ]

6. Deep Red Theme [ Preview ]

7. Time Manager [ Preview ]

8. Fluid Solution [ Preview ]

9. Intra Blog [ Preview ]

10. XV [ Preview ]

11. Wave [ Preview ]

12. PlainBox V1.0 [ Preview ]

13. Hemingway [ Preview ]

14. Hemingway for Wordpress [ Preview ]

15. Hemingway Reloaded [ Preview ]

16. Hemingway Pearled [ Preview, currently offline ]

17. U4 [ Preview ]

18. Rounded Blue V2 [ Preview ]

19. Freshy 1.0 [ Preview ]

20. Fresh 1.0 [ WP 2.0, Preview ]

21. I feel dirty [ Preview ]

22. Neo Sapien [ Preview ]

23. Drunk Loser [ Preview ]

24. ChipShot [ Preview ]

25. Kiss Theme [ Preview ]

26. Organique [ Preview ]

27. Theme Leia [ Preview ]

28. Sirius [ Preview ]

29. Pink-Kupy 1.0 [ Preview ]

30. Free FU [ Preview ]

31. GR [ Preview ]

32. Bastard Theme [ Preview ]

33. Beast-Blog v.2.0 [ Preview ]

34. Foliage, Foliage Mod [ Preview ]

35. Peaceful Rush [ Preview ]

36. Digg 3 Columns [ Preview ]

37. Contaminated [ Preview ]

38. Greenery [ Preview ]

39. Mollio [ Preview ]

40. (Not so) Fresh [ Preview ]

41. Aalglatt V1.0 [ WP 2.0, Preview, based on Green Marinée ]

42. Blue Moon 1.0 [ WP 2.0, Preview ]

43. Light 1.0 [ WP 2.0, Preview ]

44. Fall Season [ Preview ]

45. Mental Disorder

46. Orange Sky [ WP 2.0, Download ]

47. Blue Kino [ Preview ]

48. StripedPlus 1.0 [ Preview ]

49. TerraFirma [ Preview ]

50. Brajeshwar v 7.0 [ Preview ]

51. Quadruple Blue [ Preview ]

52. Subtle [ Preview ]

53. Dream On [ Preview ]

54. Glossy Blue [ Preview ]

55. Unwakeable 1.2 [ Preview ]

56. The Hobbit [ Preview ]

57. Shantia [ Preview ]

58. Indigo [ Preview ]

59. Andharra [ Preview ]

60. Spreeksel [ Preview ]

61. 2813 [ Preview ]

62. Stripes Theme [ Preview ]

63. Js Theme [ Preview ]

64. Wonderwall Daily Misery [ Preview ]

65. Qwilm [ Preview ]

66. Don’t Touch This

67. Minima Plus [ Preview ]

68. Alternate 0 [ Preview ]

69. Squible [ Preview ]

70. Pinky and The Brain Theme [ Preview, currently offline ]

71. Balance [ Preview ]

72. 5thirtyone v2 [ Preview ]

73. Japanese Cherry Blossom [ Preview ]

74. BloxPress (AJAXified + Prototype + Scriptalicious) [ Preview ]

75. Hoofeiv 3 [ Preview ]

76. Bosco [ Preview ]

77. Darlanas [ Preview ]

78. Seo Adsense Wordpress Theme [ Preview ]

79. Nonzero [ Preview ]

80. So Suechtig [ Preview ]

81. Blue Zinfandel Wordpress Theme [ Preview ]

82. Frequency [ Preview ]

83. Vertigo Enhanced [ Preview ]

84. Lunarpages Green [ Preview ]

View the Original Article here

