1. Three tips to compete with established websites on Google

Old websites have a big advantage over new ones. Websites that launched several years ago had far fewer competitors, and it was much easier for them to get high rankings in that environment.

Because these websites have been around for a long time, search engines trust them more, and they have had plenty of time to attract inbound links. If you have a new website, it will be difficult to compete with these established sites.


It is possible to compete with the big players if you have a new website. Here are three tips that will help you to compete with established websites:

Tip 1: Use less popular versions of your target keywords

It's nearly impossible to get high rankings for competitive keywords. Of course, these keywords should be included on your web pages but you should also optimize your web pages for less popular keywords that are not targeted by your competitors.

If your competitors target the keyword "pizza new york" then you might optimize one of your pages for the less popular variation "pizza newyork". Even if your version of the keyword gets only a fraction of the searches that the original keyword gets, you will still get more visitors than before.

It is much better to rank #1 for a slightly less popular keyword (you will get some visitors) than to rank #60 for a very popular keyword (you will get no visitors).

Tip 2: Optimize your web pages for as many keyword variations as possible

Repeat tip 1 with as many keywords as possible. For example, you might use "best pizza in new york", "recommended pizza restaurant new york", etc. Even if each keyword delivers only some visitors, you will get many visitors with all keywords combined.

Another advantage of getting high rankings for many of these long keywords is that your website becomes relevant to the topic as a whole. The more high rankings your website has for keywords that are relevant to a topic, the easier it becomes to get high rankings for more competitive keywords.

Optimize different pages of your website for different keywords and optimize as many web pages as possible. People who search for long keywords are usually people who really care about the subject. These people are more likely to buy something and they might also link to your site.

Tip 3: Create web pages just for getting links

Some web pages on your website should generate revenue, others should help you to get inbound links. The more inbound links your website has, the easier it is to get high rankings.

Create web pages that solve web searchers' problems. Write "how to" articles about a very specific topic. These web pages often attract many inbound links. Of course, you should also actively build inbound links.

Getting high rankings for a new website is not difficult if you take the right approach. Do not attack the big players directly. Start with many related keywords and then proceed to the more competitive keywords.

Focus on both optimizing your web pages and getting good inbound links. Your website must have well-optimized content and good inbound links if you want to get high rankings on Google.

2. Facts of the week

Google publishes some results of its eye-tracking studies

"Based on eye-tracking studies, we know that people tend to scan the search results in order. They start from the first result and continue down the list until they find a result they consider helpful and click it — or until they decide to refine their query."



Google tests a new interface for Google Suggest

"[Google runs] an experiment that simplifies the interface for Google Suggest and adds Google's search buttons below the list of suggestions. [...] This is not the only experiment for Google Suggest: other changes include the direct access to the top result for navigational queries, direct answers and suggestions from your search history."



Google tests a new branded local one box in the search results

"[Google tests] a new Universal Local Search Result; the Branded Local One Box. The new Universal Result appears to only show in Firefox, for regional brand related searches. It requires that the searcher is in the same general area as the business."



On-line retailers winning Internet search battle

"The results show that on-line retailers have a very strong presence, representing well over 30% of the listings shown [...] With consumers' search activity growing to more than 10 billion searches each month and Forrester Research reporting that 24% of all off-line purchases are influenced by the Internet, the importance of a strong search presence is clear."



Critics: new Google app gives abusers too much latitude

"As it stands right now, Latitude could be a gift to stalkers, prying employers, jealous partners and obsessive friends. [...] For instance, the group said, a phone left in a repair shop could be secretly enabled. Or someone could give another a Latitude-enabled phone as a gift."



Search engine newslets

  • AdSense testing larger font in ad units.
  • MSN/Live Search adds a fourth ad to the top of the search results.
  • Pay per post: Google uses every trick to beat Yahoo in Japan.
  • Google accused of invisibly deleting blog posts on the RIAA's say-so.
  • Is Google the next victim of creative destruction?
  • There's still room for Google killers, study says.

1. Google warning: is your site abused through redirects?

Google recently wrote in one of its official blogs that spammers can take advantage of your website without ever setting foot on your server. They do this by abusing open redirects.

What are open redirects?

Many websites use links that redirect their website visitors to another page. Some redirects are left open to any arbitrary destination. These redirects can be abused by spammers to trick web surfers and search engines into following links that seem to be pointing to your website although they redirect to a spammy website.

That means that people who think they are visiting your website will be redirected to highly questionable web pages that might contain adult content, viruses, malware or phishing attempts.

Which redirects on your website could be abused?

Spammers are very inventive. According to Google, they have managed to abuse redirects on a wide range of websites, including those of large, well-known companies and of small local government agencies.

For example, the following redirection types can be abused:

  1. Scripts that redirect users to a file on the server can be abused by spammers. The links on your website could look like this:

    http://www.example.com/download.php?url=http://www...
    http://www.example.com/get/pdf/?http://www...

  2. Site search result pages with automatic redirect options. If the result pages of your internal site search feature contain a URL variable that sends your website visitors to other pages, spammers might be able to exploit it:

    http://www.example.com/search?q=keyword&page=1&url=...

  3. Affiliate tracking links. Affiliate tracking links often allow people to direct website visitors to other pages. Spammers might enter their own URLs in the tracking links. Example:

    http://www.example.com/track.php?affid=123&url=...

  4. Proxy pages. Proxy sites send people through to other websites and they can be abused by spammers:

    http://myproxy.example.com/?url...

  5. Interstitial pages. Some websites show an interstitial page when users leave the site to let them know that the content of the linked page is not under the site's control. These URLs usually look like this:

    http://www.example.com/redirect/http://www...
    http://www.example.com/out?http://www...
    http://www.example.com/cgi-bin/redirect.cgi?http://www...

How to find out if your website is abused

Even if you find none of the URLs above on your website, your site may still have open redirects. Do the following to check whether your website is being abused by spammers:

  1. Do a site: search on Google

    Go to Google.com and search for "site:yourdomain.com". Replace yourdomain.com with your own domain name. If you see web pages that have nothing to do with your website, it's likely that someone is exploiting a security hole on your website.

  2. Check your web server logs for URL parameters like "=http:" or "=//". If your redirection URLs get a lot of traffic, this could also be caused by spammers.

  3. If you get user complaints about content or malware that you know cannot be found on your website then your website users might have seen your URL before they were redirected to the malware site.
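
The log check in step 2 can be scripted. A minimal sketch in Python; the parameter names (url, u, goto, dest) and the sample log lines are made up for illustration, so adapt the pattern to the redirect parameters your own site actually uses:

```python
import re

# Requests that contain a redirect parameter pointing to a full URL
# (e.g. "url=http://..." or "url=//evil.example") are suspicious.
REDIRECT_PARAM = re.compile(r"[?&](?:url|u|goto|dest)=(?:https?:|//)", re.IGNORECASE)

def suspicious_requests(log_lines):
    """Return the access-log lines whose request carries an open-redirect-style parameter."""
    return [line for line in log_lines if REDIRECT_PARAM.search(line)]

# Hypothetical common-format access log entries:
sample_log = [
    '1.2.3.4 - - [10/Feb/2009] "GET /index.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Feb/2009] "GET /out?url=http://spam.example/ HTTP/1.1" 302 0',
    '9.9.9.9 - - [10/Feb/2009] "GET /track.php?affid=123&url=//bad.example HTTP/1.1" 302 0',
]
```

Running `suspicious_requests(sample_log)` would flag the second and third entries, which is exactly the "=http:" and "=//" pattern Google suggests looking for.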

What you can do to protect your website

It's not easy to make sure that your redirects aren't exploited, because an open redirect is not a bug or a security flaw in itself. Still, there are some things that you can do to protect your website:

  1. Check the referrer. Your redirect scripts should only work if they are accessed from another web page of your website. The redirect script should not work if the user accesses the script directly or comes from a search engine.

  2. If possible, make sure that the script can only redirect to web pages and files that are on your own website. You could use a whitelist of allowed destination domains.

  3. Use the robots.txt file of your website to exclude search engines from the redirect scripts on your website. That will make your website less attractive for hackers.

  4. Add a signature or a checksum to your redirect links so that only you can use the script.
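
The referrer check (1), the whitelist (2) and the signature (4) from the list above can be combined in one small redirect handler. A minimal sketch in Python; the secret key, the whitelisted hosts and the function names are hypothetical, and a real handler would live inside your web framework:

```python
import hashlib
import hmac
from urllib.parse import urlparse

SECRET_KEY = b"change-me"  # hypothetical secret; keep it private
ALLOWED_HOSTS = {"www.example.com", "files.example.com"}  # hypothetical whitelist

def sign(url):
    """Signature you embed in your own redirect links (protection 4)."""
    return hmac.new(SECRET_KEY, url.encode(), hashlib.sha256).hexdigest()

def safe_redirect_target(url, signature, referrer):
    """Return the URL if every check passes, otherwise None (reject the redirect)."""
    # Protection 1: only accept requests that come from our own pages.
    if urlparse(referrer or "").hostname not in ALLOWED_HOSTS:
        return None
    # Protection 2: only redirect to whitelisted destinations.
    if urlparse(url).hostname not in ALLOWED_HOSTS:
        return None
    # Protection 4: the link must carry a valid signature.
    if not hmac.compare_digest(signature, sign(url)):
        return None
    return url
```

With the signature check in place, a spammer cannot substitute his own destination URL, because he cannot compute a matching signature without the secret key.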

Open redirect abuse is a big issue for Google right now. If you secure your scripts, spammers will move over to other websites and leave your website alone.




2. Search engine news and articles of the week

Google, Yahoo, and Microsoft announce support for a new link element

"Carpe diem on any duplicate content worries: we now support a format that allows you to publicly specify your preferred version of a URL.

If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version."



Webmasters observe a drop in traffic from Yahoo's ad network

Many Yahoo advertisers reported in an online discussion forum that traffic from Yahoo's Search Marketing Network has decreased. Some webmasters receive only about 50% of the traffic that they normally get from Yahoo.



The downfall of geo modifiers

"Geo targeting, browser location awareness, and other tools have helped searchers receive relevant results (mostly sponsored). Google has released a search update where it prompts the user for a city or zip whenever it detects a local search, then displays local results. This has increased overall local search traffic and increased Onebox traffic."



How Google decides to pull the plug

"Google recently set the blogosphere abuzz by announcing that it was pulling the plug on several products.

The victims included Lively, a virtual world that was Google’s answer to Second Life; Dodgeball [...], Catalog Search [...] and Notebook [...]. Google also said it would stop actively developing Jaiku, a microblogging service similar to Twitter."



Google testing searchwiki on AdWords

"Google seems to be testing part of SearchWiki, Google's way of promoting and removing results, to be spotted in the AdWords or sponsored ads section of the Google search results. Some people are noticing the X icon, which allows searchers to delete results from Google, in the sponsored listings."



Search engine newslets

  • "1234567890 Day" Google logo appears for brief time.
  • Google changes its policies for ringtone ads.
  • Suggesting sites to DMOZ: finding the correct category.
  • Yahoo Search service will have variety of commercial models.
  • Microsoft adCenter Desktop beta is now available.
  • An update on Yahoo's homepage testing.
  • Yahoo! gets partial victory in Akaushi keyword lawsuit.
  • Russia's biggest search engine Yandex launches Yandex Answers.



1. How long does it take to get top rankings on Google?

Many people who start a website think that it is possible to get high rankings on Google within a few days. Unfortunately, this is not possible. Competition on the Internet is fierce and there are several factors that influence how long it takes until Google lists your website.

1. How old is your website?

If you have a brand new website then you have to wait. You can submit your website to Google but Google will only index your website if other websites link to your site.

In addition, you have to prove that your website is not spammy. Google has several filters for new websites and you have to earn Google's trust before your website can get lasting high rankings. A new website can get good rankings for less competitive keywords but it usually takes about 6 months to gain the minimum level of trust that is necessary to get high rankings.

2. How optimized was your website before?

If you have an old and established website that was blocking search engine robots due to a broken robots.txt file or a bad website navigation then it can be relatively easy to show up in search engines.

If you remove the factors that keep search engine robots from your web pages then search engines will list your website relatively quickly. Of course, this doesn't work if you have a new site.

3. How many inbound links does your website have?

If you have an old website that has very few links then it will take longer to get high rankings on Google. If your website has many inbound links, then Google will pick up the optimized pages on your website much quicker. The more quality links your website has, the quicker your optimized web pages will show up in Google's results.

4. Which keywords do you target?

This is a very important factor! The more competitive your keyword is, the longer you will have to wait to get high rankings and the more links and optimized pages you need. Start with multiple word keywords that are related to your business and then proceed to the more competitive keywords when your website has good content and inbound links.

5. Who are your competitors?

If the websites that are ranked in the top 10 results for your keyword all have thousands of inbound links and more than a thousand pages, then it's not likely that your website will get into the top 10 results with 10 inbound links and 20 pages. You can either wait for a long time until you get top 10 rankings for that keyword (i.e. until you have a similar number of pages and inbound links) or you can start with other keywords.

How many days, weeks or months does it take exactly?

Provided that your website has good inbound links and optimized web pages, you can get high rankings on Google within a few months if you have a brand new site and choose a very specific keyword that consists of several words. Old and established sites usually need some weeks for such a keyword.

If you target industry keywords, which usually consist of two or more words, brand new sites usually need six months to a year to get high rankings. An established site might get the same result within 3 months.

Highly competitive one word keywords usually require thousands of good inbound links. A brand new website can need several years to get high rankings for such a competitive keyword and even established sites can sometimes need more than a year.

High rankings on Google take some time. You have to optimize your web pages and you have to get good inbound links. Without these two factors, it is not possible to get high rankings on Google.

The number of inbound links is a very important factor. If you have optimized web pages but no links, then you have to build these links and you have to wait longer. If your website has a lot of inbound links then you will get high rankings quickly if you optimize your pages.


2. Facts of the week

Google stops passing anchor text through certain 301 redirects

"In the past if the old-page.html page was deleted and redirected to the homepage then the homepage would also start ranking for that query. Now this doesn't seem to be happening, in effect the link is still (perhaps) passing PageRank but it isn't passing anchor text."



Google flags whole Internet as malware

"We're not quite sure what’s going on, but a couple of minutes ago any search result from Google started being flagged as malware with a message stating 'This site may harm your computer'. Including Google’s own websites as you can see above."

Editor's note: the problem has been solved in the meantime.



Microhoo: what might have been

"A year ago Sunday, on February 1, 2008, Microsoft Chief Executive Steve Ballmer told the world his company wanted to buy Yahoo. Despite months of discussions, the deal never materialized, distressing many Yahoo shareholders and hastening Yahoo's replacement of CEO Jerry Yang with Carol Bartz. But what if Yang had gotten up on the other side of the bed one day a year ago and led his company to accept the offer?"



A deep dive into the ocean in Google Earth

"Now, with the new version of Google Earth, people can see within a few hours what it has taken me a lifetime to understand. Anyone can fly to Hawaii vicariously and see the real Hawaiian islands, not just the mountain tops that poke through the ocean’s surface. You can swim with whales, inspect coral reefs, or see the impacts of destructive fishing."



Google: "We're not doing a good job with structured data"

"Google's Alon Halevy admitted that the search giant has 'not been doing a good job' presenting the structured data found on the web to its users. [...] Halevy was referring to the databases of the 'deep web' - those internet resources that sit behind forms and site-specific search boxes, unable to be indexed through passive means."



Search engine newslets

  • Google Video searches being poisoned.
  • Gmail had problems with its spam filter.
  • FAQ on Microsoft adCenter conversion tracking.
  • DMOZ: Kids & Teens directory.
  • 7 new languages in Google Translate.
  • Yahoo! reports fourth quarter and full year 2008 financial results.



1. The art of writing a link exchange request that is not spam

Link exchange requests have a very bad reputation because many people send mass link exchange requests that are nothing more than spam.

However, if you want to get links from authority sites, you have to send link exchange requests. The secret of successful requests is to write link exchange messages that aren't spam.

Get the reader's attention

If you want people to read your link exchange message, it should not sound like the dozens of other emails that the webmaster receives. Use the recipient's real name throughout your message. If your message starts with "Dear webmaster" then it's likely that it will be moved to the trash. Be personal. Write a personal message for each recipient.

Use a real email address with a real name as the sender address. Free email addresses and email addresses with numbers look spammy.

Get the reader's interest

"What's in it for me?" You should answer that question as soon as possible in your email.

Do not list features; focus on benefits. Nobody is really interested in the great features of your website. Tell the recipient how your website solves his problems or the problems of his readers.

Appeal to the ego of the recipient. Tell the recipient that you list only 10 websites in this special category and that he is in that top 10 list. It's easier to convince people if you appeal to their ego.

Create desire

The recipient of your link exchange message won't react to your request if you don't create desire.

To create desire out of interest, you can tell the recipient that you downloaded his ebook, subscribed to his newsletter or RSS feed, etc. The other webmaster is more likely to reciprocate if you have done something in advance.

Give the recipient the feeling that he's special. Include your phone number in your email message. Summarize why the recipient benefits from linking to you.

Make action as easy as possible

Don't make the recipient think. Make the linking process as easy as possible by giving him copy/paste HTML code with your link information. Of course, the code should only be an option and you should not insist on it. If the webmaster wants to use other code to link to your site, that's okay.

Further information on how to write link exchange messages can be found in our free SEO book. If your website needs more inbound links, take a look at this tool.

Back to table of contents - Visit Axandra.com

2. Facts of the week

Google's Matt Cutts: detecting Google bombs

"We do two different things — both of them algorithmic — to handle Googlebombs: detect Googlebombs and then mitigate their impact. The second algorithm (mitigating the impact of Googlebombs) is always running in our productionized systems.

The first algorithm (detecting Googlebombs) has to process our entire web index, so in most typical cases we tend not to run that algorithm every single time we crawl new web data."



Google testing site favicons in search results

"Out of the blue, there was a favicon next to each listing in SERPs for every 'site:' search for a period of about 30 mins. [...] We do a ton of tests every year, so I wouldn't be surprised to accidentally stumble across something like this."



Search engine ChaCha is raising another $30 million

"In case you've never heard of ChaCha, it's essentially a search engine that lets users ask questions to a real person, called a 'search guide', via the web, text message or a mobile website (answers are only provided by mobile).

We've called it a dumb idea in the past, and unscalable on numerous occasions, but it's not the only startup that's taking a crack at a human-powered Q&A service."



Search start-ups won't do much to stop Google habit

"The competitors may be aiming at Google, but don't expect them to take it down. Those weapons of mass search destruction, such as 'natural language search,' 'semantic search' and 'social search,' will just bounce off Google's Teflon engine."



Google plans to make PCs history

"Google is to launch a service that would enable users to access their personal computer from any internet connection, according to industry reports. But campaigners warn that it would give the online behemoth unprecedented control over individuals' personal data."


1. How to feed, indulge and guide Google's robot

Many people think that Google searches the Internet when you perform a search on Google. That's not the case.

Google does not search the Internet when you search

Google uses a so-called robot to surf the Internet. This robot is a simple software program that parses all web pages that it finds on the Internet and then stores the information it finds in Google's database. When you search on Google, you're actually searching the database that has been collected by that robot.

If you want to get high rankings on Google, you must make sure that Google's robot finds the right information on your website and that the robot writes the right information about your website in Google's database.
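
The robot-and-database model described above can be illustrated with a toy indexer built on Python's standard library. This is only a sketch of the idea; the page content is inlined instead of being fetched over the network, and a real crawler stores far more than the title and links:

```python
from html.parser import HTMLParser

class TitleAndLinkParser(HTMLParser):
    """Collect the page title and all outgoing links, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A simplified "database": URL -> information the robot extracted.
index = {}

def index_page(url, html):
    parser = TitleAndLinkParser()
    parser.feed(html)
    index[url] = {"title": parser.title, "links": parser.links}

index_page("http://www.example.com/",
           "<html><head><title>Pizza in New York</title></head>"
           "<body><a href='/menu.html'>Menu</a></body></html>")
```

A search then runs against `index`, not against the live web page, which is why the robot must be able to read the right information from your HTML.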

1. Feed the robot: optimize more than one web page

It's not enough to optimize your home page. You must optimize each page of your website individually. Optimize different pages of your website for different but related keywords so that Google's robot sees that your website is relevant to the topic. The more pages you optimize, the better.

It takes some time to optimize your pages individually for Google but the results are worth the effort. There are no shortcuts to high rankings on Google. If someone promises you a quick solution, be very skeptical.

2. Indulge the robot: optimize the structure of your web page elements

Google's robot does not see web pages as you can see them in your browser. Google's robot sees the plain HTML code and it has to get all information from that code.

For that reason, you have to make sure that the HTML code of your pages contains everything in the right places so that Google's robot can write the right information to Google's database.

A single web page has many elements that can be read by Google's robot: The title tag, meta tags, headline tags, links, keywords in the body text, etc. These elements must come in the right order and they must contain your keyword in the right density if you want to get high rankings on Google for that keyword.
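
Keyword density, as mentioned above, is simply the share of words on a page that are your keyword. A minimal sketch, handling single-word keywords only:

```python
def keyword_density(body_text, keyword):
    """Fraction of words in the body text that match the keyword (case-insensitive)."""
    words = body_text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

For example, `keyword_density("pizza in new york best pizza", "pizza")` is 2/6, i.e. about 33%.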

You should give Google's robot exactly what it wants. If you want to find out if your web page structure is alright, you can analyze your web pages with IBP's Top 10 Optimizer.

3. Guide the robot: optimize the structure of your whole website

In addition to the structure of your web pages, the structure of your whole website influences the rankings of your web pages on Google as well.

A very important aspect is the structure of your website navigation and the internal links. Your website should have easy-to-follow text links to every page that you want search engines to see.

If your website has a poor design or if it does not link to all pages of your site, then Google's robot will skip these pages. If you design your website in Flash or if you put most of your web page content in images then Google's robot won't be able to read most of your content.

If you make it as easy as possible for Google's robot to index your web pages then you will get the best possible rankings. Optimizing your web pages takes some work but it will help you to get high rankings on Google, more customers and more sales.


2. Search engine news and articles of the week

Google widened lead in search in December

"Google expanded its usage share in the U.S. search-engine market last month when it handled a whopping 72.1 percent of all queries, up from 65.9 percent in December 2007 [...] The other three main search engines all lost share. Yahoo came in a very distant second with 17.8 percent [...] Microsoft was third with 5.6 percent [...] Ask.com ranked fourth with 3.4 percent [...]"



Microsoft to put Live Search on Dell computers and Verizon cellphones

"The five-year deal with nation's second-largest carrier, and the three-year deal with Dell, come hand in hand and just in time to stop further erosion of Microsoft's four year old search engine [...]"



Google, now with 58% more ads

"Google led the competition during the fourth quarter with 58% growth in the average number of ads it showed on the first search results page per keyword (4.01 in 4Q vs. 2.54 in 3Q). Google ran an average of 4.84 ads per keyword in December 2007 [...]"



Yahoo!'s new controversial advertising terms and conditions

"Yahoo has sparked controversy in the search engine marketing world by taking control of search advertising campaigns... without advertisers consent. Included in new 'terms and conditions' of their Search Marketing program, Yahoo! have given themselves the right to change ads and keywords of their advertisers."



Does Google penalize site wide Webmaster Tools accounts?

"There is a webmaster [who] claimed his sites are clean and he complies with Google's terms of service. But they were all penalized soon after adding them to the same Google Webmaster Tools account."


1. Website checklist for 2009: is your website ready for the new year?

The new year 2009 might be a tough year for many businesses, but if you do things correctly, it can be a very successful year for you.


Before you start with new things, you should make sure that your current website is up-to-date. The following checklist will help you:

Step 1: Check your company information

Does your about page draw a current picture of your company? If you have a staff listing on your website, is it up-to-date?

Check these pages as well as the copyright notice and the privacy policy of your website to make sure that your web pages don't look outdated.

Step 2: Check your contact information

Does your website list your current phone and fax numbers? Are the mailing and email addresses listed on your website correct? You'll lose customers if your contact information is outdated.

You should also check the email addresses that you use on your website. Are help@yourdomain.com, info@yourdomain.com, order@yourdomain.com, etc. redirected to the correct recipient? Send test email messages to all addresses that are listed on your website.

Many businesses have such strong spam filters that many legitimate customer email messages don't reach them.

If you have contact forms on your website, make sure that they work and that they are easy to use. If someone doesn't enter a correct email address in your contact form, does the error message make sense?

Step 3: Check your auto-responders

Do you send automated confirmation messages when someone sends you an email message? Does your shopping cart send email messages after an order?

Check the text of your automated messages to make sure that it says what you want to say and that it contains current information.

Step 4: Check the links on your website

The older your website is and the more pages your website has, the more likely it is that it contains some broken links. For that reason, you should regularly check the links on your website.

You can find a free link checker in the free demo of IBP. Download the free IBP demo version and select the link checker under IBP > Tools > Broken Link Checker.
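
A basic broken-link check can also be scripted with the Python standard library. A minimal sketch, assuming the links to check have already been collected into a list; note that some servers reject HEAD requests, so a more careful checker would fall back to GET:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def classify_status(status_code):
    """Treat every HTTP status below 400 as a working link."""
    return "ok" if status_code < 400 else "broken"

def check_links(urls, timeout=10):
    """Return a dict mapping each URL to 'ok', 'broken' or an error description."""
    results = {}
    for url in urls:
        try:
            # A HEAD request is enough to get the status code
            # without downloading the whole page.
            with urlopen(Request(url, method="HEAD"), timeout=timeout) as response:
                results[url] = classify_status(response.status)
        except HTTPError as error:
            results[url] = classify_status(error.code)   # e.g. 404 -> "broken"
        except URLError as error:
            results[url] = f"error: {error.reason}"      # DNS failure, refused connection, ...
    return results
```

Run such a check regularly; the older and larger your website is, the more broken links accumulate.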

Step 5: Check if your web pages are ready for Google

If you want to get high rankings on Google, you must make sure that it's easy for Google to parse your web pages. Can Google easily find all of your web pages? Do you target the right keywords on your web pages? Are all elements of your web pages optimized for Google? You can check this with IBP's Top 10 Optimizer.

The economic times are tough but it also depends on you whether 2009 will be a successful year for your website or not. If you do the right things, chances are that your website will help you to get more customers this year.


2. Search engine news and articles of the week

Google's search results have changed

Webmasters in several online forums noticed changes in Google's result pages. Many websites now have fewer Sitelinks in Google's search results. Websites that disable the back button in the browser seem to have better rankings now. Other webmasters have noticed that their websites lost several positions in Google search results.



Live Search begins crawling JavaScript with MSNBot-Media

"[A member of an online forum] noticed that one of Live Search's bots was crawling through his JavaScript. The bot is named MSNBOT-MEDIA and he noticed that it was accessing JavaScript files and AJAX functions."



Google updates the PageRank displayed in Google's toolbar

"Google recently did a toolbar PageRank update. It’s pretty much done now. If you want more info, I've answered questions about PageRank and the Google Toolbar in the past."



Is Google's culture grab unstoppable?

"As such a dominant player in the online world, Google will now occupy a unique gateway position that, if abused, could easily create a de facto monopoly. A situation where competition is removed from the market place by placing the keys in the hands of one company cannot, ultimately, be good for the consumer. This is a bridge too far."



Marissa Mayer on the future of Google

"We think it's really important to move beyond just keywords and allow people to ask questions, and maybe access things more easily from their mobile phone [...] We're also looking at how to weave new media into it and how we can bring books, videos and news right into the search experience. And then there are various pieces of personalisation."



