After four weeks in existence, this site still hasn't been indexed by Google. Naturally, I've been looking around like a madman for WordPress 2.0 SEO tips to help speed the process. There's a lot of good stuff out there, and here are some of the tips I've discovered.



1. Write keyword rich post titles - it almost goes without saying, but the post title is the most important part of the blog post for many reasons. From your reader’s perspective, a descriptive and compelling title helps them decide if your post is worth reading or not. From an SEO perspective, think about the keywords or phrases people might type into a search box to find your post 2 months from now, and use those words or phrases in your post title. For maximum benefit, try to avoid titles that are cute, clever or cryptic.


2. Make your post titles live links - many of the WP themes already do this, but if yours doesn't, you can add the necessary code pretty easily. In your Main Index Template and Page Template, find the code for the post title. It'll probably look like this:


<h2><?php the_title(); ?></h2>


To link your post titles, you'll want to replace that code with the following:


<h2><a href="<?php the_permalink(); ?>" rel="bookmark" title="Permanent Link to <?php the_title(); ?>"><?php the_title(); ?></a></h2>


3. Optimize your permalinks - the default WP 2.0 installation displays permalinks this way: http://www.savvysolo.com/?p=123. A more search-engine-friendly permalink includes the post title in the link, like this: http://www.savvysolo.com/2006/01/16/keyword-rich-post-title/. This is a simple change to make. In your WP admin panel, click on the "Options" tab, then the "Permalinks" sub-tab, and choose the option just below the "Default" permalink option. See the WordPress Codex for more on permalinks.
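If your setup calls for entering the structure by hand instead, the date-and-name format shown above corresponds to a custom permalink structure built from WordPress's standard structure tags, something like this (a sketch; adjust to taste):

/%year%/%monthnum%/%day%/%postname%/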


4. Optimize your page titles - According to most SEO experts, the page title tag is one of the most important tags on your page. In most WP themes, you'll find the page title tag in the Header Template, and the default version usually looks like this:


<title><?php bloginfo('name'); ?><?php wp_title(); ?></title>


The best tweak I've seen to optimize this tag comes from Stephan Spencer, who suggests using the following code for the page title tag:


<title><?php if (is_home()) { print "whatever title I want to have on my blog home page."; } else { wp_title(' '); print " : "; bloginfo('name'); } ?></title>


This tweak does a couple different things. First, it allows you to use a more descriptive, keyword-rich title for your blog’s main page. And second, for the individual post pages, it will use the title of your post as the page title, which is another reason to follow the advice I mentioned in #1 above.


5. Use the related posts plugin - Alex Malov’s Related Entries plugin is a neat little addition to any WP blog. Not only does it enhance your blog’s usability, but it also helps create a dense link structure throughout your site, which makes it easier for the search engine spiders to find and index older blog posts.


6. Use the Google sitemap plugin - According to Google:


“Google Sitemaps is an easy way for you to submit all your URLs to the Google index and get detailed reports about the visibility of your pages on Google.”


Wordpress user Arne Brachold has made the sitemap creation and submission process fairly simple with his Google Sitemap plugin, which was recently updated for WP 2.0. This plugin will create a sitemap for you and submit it to Google.


Additionally, Elliot Kosmicki offers a script that will convert your Google sitemap into a Yahoo compliant sitemap that you can submit to Yahoo.


7. Add meta keyword tags and Technorati tags to your posts automatically - although it's questionable whether meta keywords still carry much SEO value, they can't hurt. As for Technorati tags, they have little impact on SEO, but they can help increase your traffic directly from Technorati, so it's a good idea to use them. Rick Boakes created the Autometa plugin, which will add both tags to your posts automatically.


Another simple and effective Technorati tag generator I’ve used is Broobles’ Simple Tags plugin.
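If you would rather not run a plugin just for the meta keywords part, here is a minimal hand-rolled sketch you could drop into your Header Template. It simply builds the keywords list from the post's category names, and it assumes a WordPress version where get_the_category() accepts a post ID; Autometa itself is smarter and mines the post text:

<?php
// Sketch only: print a meta keywords tag on single-post pages,
// built from the post's category names.
if (is_single()) {
    global $posts;
    if (!empty($posts)) {
        $keywords = array();
        foreach (get_the_category($posts[0]->ID) as $category) {
            $keywords[] = $category->cat_name;
        }
        if (!empty($keywords)) {
            echo '<meta name="keywords" content="'
                . htmlspecialchars(implode(', ', $keywords)) . '" />' . "\n";
        }
    }
}
?>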


8. Validate your code - Use the W3C Markup Validation Service to ensure the code behind your pages is clean and valid. If you find your pages have errors in the code, hire or make friends with a savvy web developer to help you clean it up. See Google’s Information for Webmasters for more info on creating a technically valid site.


If you have further SEO tips for Wordpress blogs, I’d love to hear about them in the comments.

WebProNews has a short summary of Amanda Watlington's tips for search engine optimization of your RSS feeds:
1. Subscribe to your own feed and claim it on the blog search engine Technorati
2. Focus your feed with a keyword theme
3. Use keywords in the title tag; keep it under 100 characters
4. Most feed readers display feeds alphabetically, so title yours accordingly
5. Write description tags as if for a directory; keep them under 500 characters
6. Use full paths on links and unique URLs for each item (see the sample feed item after these lists)
7. Provide email updates for the non-techies
8. Offer an HTML version of your feed
9. For branding, add logo and images to your feed
Now, let's add some tips from Stephan Spencer and continue with the numbering:
10. Full text, not summaries
11. 20 or MORE items (not just 10)
12. Multiple feeds (by category, latest comments, comments by post)
13. Keyword-rich item [title]
14. Your brand name in the item [title]
15. Your most important keyword in the site [title] container
16. Compelling site [description]
17. Don't put tracking codes into the URLs (e.g. &source=rss)
18. An RSS feed that contains enclosures (i.e. podcasts) can get into additional RSS directories & engines
And to round this off, a summary of my own tips [part 2 here] for using RSS to drive traffic to your site:
19. Get your RSS content (proactively) syndicated on other relevant websites [just the headlines and summaries of course]
20. Submit your RSS feeds to all the RSS search engines and directories
21. Use RSS to add relevant third-party content [again, just headlines and summaries] to your website to gain additional SE weight for your keywords
22. Use RSS to deliver all of your frequently updated content, not just for your latest blog posts
23. Whenever the content in your feed changes, ping the most important search engines and directories [yes, you don't need a blog for this]
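To make a few of these tips concrete (keyword-rich titles, your brand in the title, directory-style descriptions under 500 characters, full paths and unique URLs per item), here is a bare-bones RSS 2.0 item sketch; the URLs and wording are placeholders:

<item>
  <title>Keyword Rich Post Title | Your Brand</title>
  <link>http://www.example.com/2006/01/16/keyword-rich-post-title/</link>
  <guid>http://www.example.com/2006/01/16/keyword-rich-post-title/</guid>
  <description>A focused, directory-style summary of the post, well under 500 characters.</description>
</item>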

Learn how this single SEO secret can get you on the first page of Google.

Let me start by asking you a question: What's the number-one business killer on the internet? The answer is obvious, but many people miss it. The answer is: not being found on the first page of Google.
I refer to Google often, since the search engine accounts for approximately 50 percent of search traffic. In February 2007, Google sites garnered 47.5 percent of the U.S. search market, with Yahoo! coming in second place at a distant 28.1 percent, according to comScore. Preparing your SEO strategy around Google makes your plan work for other search engines, as well.
Write Away
So what's the number-one tip for search engine ranking? Articles. With articles, even a brand new domain--one stuck in what some call "Google sandbox hell"--can get into the search index quickly. Many sites are spidered or crawled, but not indexed, a major problem for new sites.


Let's first take a look at recent content strategies you can use to write stronger articles. If you're not familiar with the next wave in quality content scoring and relevancy for search engines, you need to understand a little bit about search engine technology.
Google contains more than 100 algorithms that make it the world's most popular search engine. One of those is PageRank, a complex voting system I'll cover in a future article. Another important secret, which has been around for a while, but not utilized by most webmasters, is latent semantic indexing.
"Context" is the new buzzword for SEO in 2007. While you should still write good, natural, user-friendly and relevant web copy, using some simple LSI techniques can elevate your search engine ranking.
With LSI, an engine tries to determine what a page is about without specifically matching the search term text. It looks at the document collection as a whole and examines which other documents contain some of the same words. In simple terms, this means that as you write and link to and from other pages and sites, search engines using LSI will look at words and phrases that are contextually related and try to figure out what you're writing about. So, if you're writing about bait, poles, lures and tackle, you're probably addressing fishing.
If you want to rank higher in the search engines, you should write content and build link profiles with supportive text and anchor text (links) that follow this "theme" approach. To find related keywords and phrases, use a keyword research tool like the Google AdWords Keyword Tool. First, type your key terms into the Google search engine and pick the first site that comes up. Then go to the Google AdWords Keyword Tool, click the "Site-Related Keywords" tab and paste the URL there. Study the results and use groups of related keywords, with links, on your page to develop strong on-page factors.
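This is not LSI itself, which the engines compute across enormous document collections, but you can at least sanity-check that a page actually uses a cluster of related terms. A rough sketch along those lines, with a placeholder URL and a hand-picked term list:

<?php
// Sketch: check which terms from a hand-picked "theme" cluster
// actually appear in a page's visible text.
$theme_terms = array('bait', 'poles', 'lures', 'tackle', 'fishing');

// Requires allow_url_fopen for remote URLs; point it at a local file otherwise.
$html = file_get_contents('http://www.example.com/');
$page_text = strtolower(strip_tags($html === false ? '' : $html));

foreach ($theme_terms as $term) {
    $count = substr_count($page_text, strtolower($term));
    echo $term . ': ' . ($count > 0 ? "found ($count)" : 'missing') . "\n";
}
?>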
Distributing Your Article
Writing an article that is topically related to your business and then submitting it to article directories like ezinearticles.com, goarticles.com and buzzle.com will pay off big.
Imagine the effect of getting a link from the Los Angeles Times or The New York Times. There isn't a sure-fire formula for achieving this, but providing quality articles and adding your link in the resource box of the article will allow search engines to find and index you faster. If the content is interesting and newsworthy, journalists may start calling.
Yes, this could be a lot of work. But one option is to find a service that can distribute your articles for little or no cost. I like isnare.com; for a few dollars you can get your own distribution credits. Once approved, they'll submit your article to hundreds of directories. Watch your server logs for traffic and spider bots; you'll see domains and search engine referrers very quickly.
Continue these efforts by writing press releases using similar distribution mechanisms. I use services like prweb.com and marketwire.com. SEO firms have developed a complete marketplace for the SEO compliance of press releases. They clearly understand the power of submitting and distributing content and press releases. If you don't know how to write a press release, hire somebody to help you. You can go to sites like elance.com and guru.com to have something written for $50.
If you haven't pursued these simple tactics for your SEO strategy, you've been missing out on important traffic and business. There are a number of resources for traffic acquisition and how-to books on building an internet presence. Outside of articles, press releases, SEO and pay-per-click, there are social networks, blogs, paid links, affiliate marketing, paid advertising, viral marketing, co-registration and banner ads--certainly enough to keep you busy for a while.
Jon Rognerud is a recognized authority on the subject of search engine optimization and has spent more than 15 years developing websites and marketing solutions at companies like Overture and Yahoo!. His website, www.microsaw.com, provides a wealth of informative articles, resources and complimentary e-mail courses on everything you'll ever need to know about SEO and search marketing.

Between them, Yahoo and MSN account for around 20% of all UK internet searches. Tackling Yahoo and MSN search engine optimisation requires a different technique from the one used for Google.
Yahoo's Optimisation Factors
Yahoo Optimisation or Yahoo SEO is about improving your website's "organic" ranking for key search terms on Yahoo. Yahoo is more "on page" driven than either MSN or Google, the two other major search engines. By "on page", we mean that when Yahoo produces its results it relies on information it collects about a site from sources on the target website.
1. Meta Tag Title
This is easily the most important meta tag. Very few sites with weak meta tag titles appear in the top rankings of Yahoo. My recommended approach is an interlocking phrase of 5 or 6 words adding up to no more than 60 characters including spaces.
2. Meta Tag Description
The meta tag description is less important than the title for ranking purposes. However, the tag normally shows up in the results pages, so it is seen and may be acted upon by potential website visitors.
3. Meta Tag Keywords
These have little direct role in search engine ranking. Where they are useful is as a device for testing on page alignment. See below.
4. Page or Site Content
The content or subject of a site is very important to Yahoo and MSN. Content for this purpose means words in general and headings in particular. It is important that this content is echoed in the meta tag title and description.
5. On page Alignment
On page alignment of the meta tags and content is a key component of success with Yahoo rankings. The meta tags on their own will not work (a short illustration of aligned tags and copy follows below). Simply Clicks can carry out a quick and simple test to provide the information needed to improve rankings.
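For illustration only, with a made-up business and wording, here is what aligned tags and copy might look like for a hypothetical site selling handmade oak furniture:

<title>Handmade Oak Furniture - Bespoke Oak Tables and Chairs</title>
<meta name="description" content="Bespoke handmade oak furniture. Oak dining tables, chairs and dressers made to order in the UK." />
<meta name="keywords" content="handmade oak furniture, oak dining tables, bespoke oak chairs" />

<h1>Handmade Oak Furniture Made to Order</h1>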
Yahoo's Off Page Optimisation Factors
Yahoo, like all other search engines, measures the number and nature of inbound links (or IBLs) into a site from other websites. It appears, however, that Yahoo places less emphasis on these links than the other major engines. The most important off page factor appears to be inclusion in the Yahoo directory. However, this costs $299.
Simply Clicks is a Yahoo and MSN optimisation specialist. If you would like further information about Simply Clicks' Yahoo and MSN optimisation services, please e-mail using the link on the right.

Google is the UK's most important search engine, generating an approximate 75% market share of all search engine traffic. Google search engine optimisation (Google SEO) is focused on improving your website's "organic" ranking for key search terms on Google.
Google.co.uk is Google's local engine for the UK. It aggressively filters out what it considers to be non-UK based websites.
Google's Optimisation Factors
Although it follows similar principles, Google's ranking algorithm functions in a different way from those of the other major search engines. This is mainly because it is more "off page" biased than either MSN or Yahoo, the two other major search engines. By "off page", we mean that when Google produces its results, it relies more heavily on information it collects from sources other than the target website. This information is largely collected by analysing the links that point at a website.
1. Inbound Links - IBLs
Google, like all other search engines, measures the number and nature of inbound links (or IBLs) into a site from other websites. Google places a greater emphasis on these links than the other engines. More than any other factor, the number, quality and origination of these links is what determines the ranking of your site in the search engine results pages (SERPs).
2. Google PageRank™
Google operates a patented page ranking system called Google PageRank™. This measures the status of a page in terms of the value or PR of its inbound links. Values are calculated out of 10. A new page with no inbound links will be ranked at 0/10. An established, highly authoritative page will be ranked 10/10. Most established commercial website pages feature in a range from 3/10 to 5/10. For more detailed information and links on PageRank, go to Wikipedia, the web's public encyclopedia.
3. Anchor Text
Anchor text refers to the highlighted text contained within a hyperlink. Associated with the anchor text is the descriptive text which surrounds the link. Inbound links to your website contain information that Google uses to determine whether your website matches a specific search.
4. Page or Site Content
This is the content or subject of your page or site. Google determines the content of your website by looking at three principal sources: the title and other meta tags of your website; the page, paragraph and body text shown on a page; and the nature of the inbound links to a page.
5. Google Updates
Google periodically goes through updates. These may be minor updates, such as a recalibration of PageRank scores, or they can be quite major, such as the Florida, Jagger and Big Daddy updates. Updates lead to a flurry of activity from the SEO community, but from experience I'd suggest that a more circumspect approach should be taken. Updates only seem to have a major and permanent impact on websites that attempt to spoof the rankings. Solid, ethical SEO has little to fear.
Google Search Engine Optimisation Advice
The Google website has a section on Google webmaster guidelines. In addition there is a specific section advising webmasters on how to go about legitimate search engine optimisation.
Google Webmaster Tools
Google Webmaster Tools is a suite of tools for improving the organic performance of your website. Amongst other things it provides feedback about the standing of your pages and a range of objective performance measures.
Google Sitemaps
Google Sitemaps is a way of improving the visibility of your website and helping Google map all of its pages.
Google Advanced Search Operators
Google has a series of advanced search operators. These operators, if placed before a search query, will provide specific information about a site, or restrict a search to using specific information. Examples include Link: to generate information about a website's back links, Site: to see how many pages of a website have been indexed by Google and Allinanchor: to restrict a search to sites with the designated anchor text.
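For illustration, with a placeholder domain, the operators are typed straight into the Google search box like this:

link:www.example.com
site:www.example.com
allinanchor:seo training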
Google search engine optimisation is a rapidly evolving field. Google is constantly tuning its algorithm in order to outfox the SEO community.

Search Engine Optimisation is the ultimate way to attract targeted traffic to your website. Simply Clicks is a UK SEO specialist. We improve your organic rankings on Google, Yahoo, MSN Live Search and many other search engines.
Effective search engine optimisation ensures that your website is visible against the keyword searches that are most valuable for delivering profitable business. We have an acute understanding of the SEO requirements of the major search engines, how they rank websites and how rankings translate into real business.
Six Key Components of SEO
Simply Clicks is a UK-based SEO specialist. The objective of our search engine optimisation programmes is to secure top organic search rankings against keyword search terms within the major search engines of Google, Yahoo and MSN. Based on our extensive experience Simply Clicks has identified 6 key components to a successful approach to SEO.
Keyword Research and Selection
Correct use of Meta Tags
Quality Page Content
Simple Website Design
Topic Related Inbound Links (IBLs)
Business Strategy "Fit"
1. Keyword Selection
Keyword research and selection is the first stage of SEO. You should select phrases that are both relevant to your business and likely to be used by your prospective customers. Keyword research should distinguish between generic words and phrases that have large search volumes and more selective phrases that have a higher propensity to convert into paying customers.
2. Meta Tags
For search purposes, these come in three main varieties: Title, Description and Keyword meta tags. The first is by far the most critical, the second is highly relevant, and the third, although diminishing in importance, is still relevant for controlling and testing content. There is much debate regarding the value of meta tags. Used properly, they can still play a valuable role in organic search.
3. Page Content
Content refers to the visible text on your site. This is what your site visitors actually read. Write content that your site visitors will find informative and persuasive. Writing specifically for search engines is to be avoided. The ultimate goal is to attract prospects and convert them into customers.
4. Links
Links connect your site to others on the Internet. They also place your site in context. All search engines, but particularly Google, place a premium on topic relevant, high quality inbound links (IBLs). Google optimisation is more concerned with managing inbound links than almost any other factor. A link building programme should focus on using the right anchor text and seeking out relevant linking partners.
5. Website Design
This refers to the visual or technical features that may either work for or against your site's visibility. There are some technical features that may enhance your site's attractiveness. Other technical features, although aesthetically pleasing, may well damage search engine visibility. A key component of website design is navigation. Visitors to your site need clear pointers to help them find relevant content. A key aid to navigation for both human and search engine visitors is a clear and concise sitemap.
6. Your Business
Search engine optimisation exists to provide your website with traffic producing visitors. These visitors should be potential customers. Ultimately it is additional sales and margins that govern the success of SEO.
SEO Research
Simply Clicks has been running a test across 4 sites to establish where organic traffic originates. An up to date graphical illustration of this data can be found by clicking SEO Ranking Research.
Free SEO Book
Simply Clicks has written a Free SEO Book that serves as an introduction to SEO. Just click SEO Book for a PDF version.
Google Optimisation
Simply Clicks has written a separate page on Google optimisation. This places specific emphasis on Google.co.uk. Google SEO is critical as the search engine accounts for approximately 75% of UK searches.
Simply Clicks has been involved in over 30 SEO projects for a number of multinational and UK based companies. If you have a specific SEO need or would like further information about Simply Clicks search engine optimisation services, please e-mail your requirements.

Simply Clicks offers an intensive 2 day SEO training course. The training is individual, highly practical and designed to provide the skills and techniques needed to manage your own search engine optimisation campaigns. Our unique approach to SEO training involves hands-on analysis and testing of the search performance of your own and competitive websites. Organised one to one or in very small groups, the training is tailored to meet your exact individual or corporate SEO needs. During the course you will practice, learn and apply the latest and most effective SEO methods and tools.

SEO Training Course Outline Programme

Our SEO training integrates the theoretical principles and practical applications of search engine optimisation. As a guide, the course uses Simply Clicks' own four-stage approach to search engine optimisation, which we call Search Engine Marketing Over-Site. This gives a clear step by step structure to delivering your own expert in-house SEO.

During the course you will interrogate and deconstruct your own and competitor websites, uncovering SEO strengths, weaknesses and vulnerabilities. The Simply Clicks SEO training course is constantly updated to reflect the changes brought about by Google updates.

The SEO training course covers the key optimisation factors for each of the 3 main search engines - Google, Yahoo and MSN.

  1. Search Engine Marketing Strategy
  2. SEO Tools and Techniques
  3. Keyword Research, Website Analysis and Website Analytics
  4. Organic Search and Search Engine Optimisation
  5. Link Building and Building a Web Presence
  6. Understanding Search Engines and their Algorithms
  7. Utilising Pay Per Click Data for SEO
  8. Monitoring SEO Performance
  9. Illicit SEO Techniques
  10. Operating an Integrated SEO Campaign


Link Building For SEO

Within the field of SEO, link building for off page presence is a critical process. Without an acute understanding of the subject, much link building activity could be wasted effort. What are the critical components of a good link? Topic relevance, Google PageRank® or anchor text? Simply Clicks can take you through the complete link building process.

Google SEO Training

Google is the UK's dominant search engine, with a current market share of approximately 75%. Although many of the principles of SEO can be applied to all three major search engines, throughout the course we pay special attention to the particular requirements of Google. The course is largely UK based, so we explore the crucial differences between Google.com and Google.co.uk.

SEO Training - Course Benefits

Having completed the SEO course, you will have a comprehensive understanding of search engine optimisation. You will be fully equipped to begin the process of optimising your own website and at the same time be able to expose and exploit the vulnerabilities in competitor sites and strategies. As a result, you will be able to immediately generate additional and more cost effective traffic for your business. Each participant receives a 20,000 word SEO training manual for ongoing reference.

Who These Training Courses Are For

Our SEO training is primarily designed for marketing, advertising and Internet practitioners. A reasonable level of computer literacy is assumed. Various levels of ability can be accommodated within small groups of no more than four people. This course can be adapted as a strategic briefing at managing or marketing director level.

Course Delivery & Costs

The preferred location for course delivery is on-site in a training room at your premises. The SEO course can be arranged at external locations, where additional expenses may be incurred. We are happy to travel throughout the UK and have delivered the course in mainland Europe. Each course participant will need a modern, broadband-connected PC.

The cost of a two day training course, tailored to your market category and exact business training needs, is £995 for one person, £1,350 for 2 people and £1,600 for 3 people. The SEO training can be combined with an on-site search engine marketing consultation. Please ask for a quote for larger classes or to discuss specific projects.

Search engine optimisation is a rapidly evolving online marketing discipline. If you require your SEO training course to be tailored to emphasise any particular issue of search engine optimisation, we will accommodate your specific needs.


Reuters is running a story on Yahoo CEO Terry Semel and how happy he is over the implementation of the new Yahoo Search Marketing ‘Panama’ platform. According to Reuters, Semel recently spoke at an AdAge conference saying that Yahoo will be showing some “very exciting numbers” in its Q1 2007 earnings report.


Panama's new ranking technology was launched on February 5th, and more changes to the system, which should benefit Yahoo, its Search Marketing division and its Yahoo Publisher Network division, are expected to roll out worldwide.


Alluding to Yahoo's competition with the Google juggernaut, Semel added, "We have said from the beginning and we say it clearly right now, again, that our intention is to close the gap and Panama is doing a great job."


“I’m totally all smiles,” Semel added. “We are very excited and very happy and I’m smiling broadly.”


Yahoo advertising now serves the most relevant and high quality link advertisements in search results and contextually across their Yahoo network. Formerly, Yahoo was serving the ads of the highest bidders first in the old Overture interface, even if those ads were not of the highest quality.


Last month, a comScore Networks report showed Panama had increased the number of people who click on links that pay Yahoo.


The report showed that the number of Yahoo search users who clicked on ads — known as the click-through rate — rose 5 percent in the first week after the new system’s debut and 9 percent in the second week.

Yahoo! Inc. has appointed Reggie Davis as Vice President of Marketplace Quality, where Davis will serve as the company's first senior executive dedicated to continually enhancing the quality of Yahoo!'s display and search listings marketplaces.


As click fraud debates continue to get more and more heated, it is a good move by Yahoo to assign Davis this new role, where he will serve as much as an internal watchdog as a talking head and public figure representing Yahoo Search Marketing and its quality standards, especially in times of controversy.


One of Davis’s first duties was addressing the amount of fraudulent clicks that Yahoo tracks and filters.


Greg Sterling adds:


Yahoo disclosed that an average of between 12% and 15% of clicks were filtered or not charged to search marketers. Davis was careful to explain that this was not a click fraud number, which was a smaller figure.


Davis said his appointment to a VP level position was part of a larger organizational initiative and commitment to bring more transparency to the issue of click quality and be much more open and proactive with search marketers to address their concerns and generally enhance customer relations. As a long-term approach to the marketplace Yahoo recognizes this is much better than managing litigation after the fact.



From the Press Release:


Davis is responsible for developing and executing a strategy aimed at driving more rapid innovation, greater transparency and faster delivery of product and service enhancements to build an even higher quality advertising network for Yahoo!’s customers.


Davis will hire a dedicated staff to manage across all of Yahoo!’s cross-functional quality teams and ensure that customer input is integrated into all efforts to address click fraud, traffic quality, network placement and other marketplace quality issues. Davis and his team will also be responsible for increasing Yahoo!’s dialogue with advertisers and publishers on quality related matters.


In addition to developing Yahoo’s longer-term marketplace quality strategy, Davis has already begun working closely with the Yahoo! product teams to drive several enhancements aimed at providing greater visibility and control to Yahoo!’s search advertisers this year, including: quality-based pricing, which is designed to ensure that traffic is priced in a manner that is consistent with the quality it delivers to advertisers; domain-level blocking, which allows advertisers to identify individual domains from which they do not wish to receive traffic; automated advertiser inquiry submission processes and greater detail around advertiser and publisher adjustments.

On Tuesday, Google announced that they are launching a pay-per-action ad model for AdSense ads, which comes with its own new form of Google ad placement, Google Text Link Ads: "You can create text, image, or text link ads for your pay-per-action campaign.


Text links are hyperlinked brief text descriptions that take on the characteristics of a publisher’s page. Publishers can place them in line with other text to better blend the ad and promote your product. For example, you might see the following text link embedded in a publisher’s recommendatory text: “Widgets are fun! I encourage all my friends to Buy a high-quality widget today.” (Mousing over the link will display “Ads by Google” to identify these as pay-per-action ads).


Though the maximum length of a text link is 90 characters, we've found that shorter links perform better because they allow the publisher to use the link in more places on her/his site and in different contexts. The maximum length is 90 characters, but less than 5 words is best. Even better, just use your brand name to offer maximum flexibility to the publisher.


Barry Schwartz has a nice rundown at Search Engine Roundtable on the new Google 'Text Link Ads' format for the Pay-Per-Action referral ads and includes this screenshot of a link ad in action:


Google Text Link Ads


What intrigues me is Google’s use of the phrase “text link ads.” Although they have not, to the best of my knowledge, officially named the ad unit Google Text Link Ads, they have used the term in their initial description of the unit and the term seems to be catching on among the search blogs in the industry.


Of course, there is a company called Text Link Ads which has been established for years, does a lot of promotion, and continues to grow within the SEO and search marketing business as a legitimate and proven form of online advertising. I can't help but think that Google's use of the term Text Link Ads will cause some major confusion around the Internet marketing community, and perhaps harm either the Text Link Ads company or Google themselves.


In my opinion, Google should use a totally different label for these new advertisement options, such as embedded ads, keyword ads or even link ads: something less generic and not related to an existing company that is, when it comes to online advertising, possibly a Google rival.


The timing is also questionable, as Text Link Ads is expanding its services into Blog Reviews, RSS feed links and other text ad oriented offerings.


Your thoughts?

A screenshot of the Google Pay Per Action 'Conversion Tracking Setup' page has been uploaded to Flickr by a very generous Google advertiser.


Google Pay Per Action


In the capture, we see the setup page for the campaign which includes action tracking options for Sale, Purchase, Lead Generation or Other.


Statistics tracked are Bids, Total Conversions (interestingly, Google is tracking all conversions and not just Google PPA-driven ones), CPA Conversions, and the CPA cost.


Google Pay Per Action Tracking


Hattip to Barry Schwartz who found this on Flickr and posted about it at Search Engine Roundtable.

Last week, Google announced the beta test of their new pay-per-action (PPA) advertising program. In addition, Google mentioned that a new ad format, the text link ad (which will presumably be similar to the current referral text link unit), would be available. This new ad format, along with the new PPA (or affiliate) model, will open up new potential for bloggers to make money from their blogs.


There is a big difference between Google's new CPA ads and their traditional, context-based AdSense ads. According to the AdSense program's policy, a publisher is not allowed to call attention to the ads or encourage their site's readers to click on them, since the advertiser pays on a per-click basis. If a blogger writes, "It's been a tough month, please be kind enough to click on a few Google ads to help make ends meet," there's a very good chance said blogger will find their AdSense account suspended. But with the new CPA ads, since the advertiser only pays when a specific action is taken by the user, a publisher is allowed to encourage or draw attention to the ads.


Google’s existing ad units are meant to act as supplements to the content of a site; for a blog, you can place an AdSense unit above, below, to the side, or even in the middle of a blog post, but the unit physically looks separate from the content.


Google Text Link Ads
The new text-link ad unit, however, can be embedded in the middle of a post, even in the middle of a sentence. Bloggers will be able to link to offers, much as they can now link to Amazon products using Amazon's affiliate links, within the content of their post. And since there are no restrictions on calling attention to an ad, bloggers will be able to encourage their readers to click on the ad and check out the offer.


An interesting technical question that arises from this is how these links will work when included in a syndication feed. Like Google's other AdSense ad units, the text links will likely be snippets of Javascript code that a publisher embeds into their site.


Javascript will work fine in a browser on a normal web page, but when that same Javascript is exported in the blog's syndicated feed and rendered in an aggregator such as Bloglines or Google Reader, which strip out Javascript elements for security reasons, those links will not appear.


Google may address this issue by implementing the links in a form other than Javascript, similar to how their AdSense for Feeds units are implemented, or by providing alternate code that will work in environments where Javascript is not available. It will be important for bloggers, especially those that publish full content and have significant numbers of readers via their feed, to make sure the new text-link format works properly in a feed-based environment.


It will also be interesting to see how this new format is embraced by those that were critical of blog-based text-link marketplace services such as PayPerPost; the Google CPA/text-link model is not quite the same as paying bloggers to post, but it is similar in that it provides incentive for a large group of users to focus on a specific product. If a new ad for Company X is released, and thousands of bloggers begin posting about Company X, Company X's site will likely receive an influx of plain-old links (in addition to the Google text-link ad units) from many blogs, increasing its visibility in search engines, memetrackers and other sites/programs that follow trends in the blogosphere.


Google takes a negative view towards paid links, and memetrackers such as TailRank have banned sites that participate in PayPerPost, as these practices artificially promote stories/sites that are not necessarily interesting in and of themselves.


Will a similar approach be taken to sites that utilize Google’s text-link ad units? A larger question is: What potential ways could the text-link, or CPA model in general, be creatively used/abused by those looking to get attention and traffic to their sites?


Google’s new Cost-Per-Action model opens up many new potential opportunities for bloggers, but also raises a host of questions. It will be interesting to see how things unfold as the program comes out of beta.


-


Greg Gershman is the co-founder of BlogDigger, an independently owned blog search engine which offers local blog search.

Google AdWords Quality Score may be giving more critical evaluation to campaigns which are consolidated to one domain.


If that domain has had issues in the past with Google results, unethical SEO, cloaking, or other techniques which may cast it in a negative light in the eyes of Google, hosting the landing pages on such a domain may bring down its Quality Score.


From a Digital Point thread:


Clearly there is a domain name Quality Score that is affecting us. It seems when using individual domain names the success or failure of an individual ad group/keyword does not affect the others, but when it all goes to the same domain name it seems tied into each other.


Of course, if your domain is clean and Google considers it an authority, hosting AdWords landing pages on that domain can be beneficial.


Peter Da Vanzo adds:


It’s not so much that the domain name itself is tagged with authority, but if the domain contains enough pages that are deemed authoritative, then it can be assumed that other pages on the domain are also highly likely to be authoritative, even if the authority of those pages hasn’t been individually calculated yet.


It's like trusting that someone who has given you good advice in the past is likely to give you good advice in the present. That assumption may not be true, but it is a reasonable assumption to make, especially if you are unable to check credibility by other means at the time.


Google AdWords is not the only paid search vehicle to look at the quality of a domain. The new Yahoo Search Marketing ‘Panama’ Quality Index also enters site history and behavior into its equation.



According to the Quality Index patent and conversations with Yahoo, Yahoo Search Marketing's 'Quality Ranking' could look at these factors as being the most important:



  • Bid or Click Price

  • Domain’s historical click-through data

  • Conversion data

  • Organic Rankings of the Domain

  • Whether the query word/phrase appears in the title

  • Keyword in URL

  • Keywords on Landing Page

  • Possibly other Keyphrases in Campaign

  • Comparison with other sites advertising for the same key term

Yahoo! recently finalized the consolidation of all their individual affiliate programs at Commission Junction into a single one. The Yahoo! Search Marketing affiliate program was one of them. Although the old YSM program was closed at CJ, old affiliate links and banners are still working, but without tracking commissions (free traffic for Yahoo!, way to go). This flaw itself has nothing to do with Yahoo!, but is a questionable and known "feature" of Commission Junction.


I don't want to rant (again) about this, but it exposed a flaw in the code of the YSM landing page, which is not only embarrassing but probably also causes the folks at the Yahoo! customer service department to start believing in the existence of parallel universes.


I can only imagine what must go through the head of a CS rep faced with claims by new YSM advertisers who swear by the life of their mother that Yahoo! promised but never provided them with the advertised amount of free clicks. Not advertised on another website or in an old magazine, but on the YSM sign-up page itself.




The problem is a flaw in the code of the landing page script located at searchmarketing.yahoo.com/arp/sponsoredsearch_v2.php.


Affiliate links redirect to that script with a number of URL parameters, including the affiliate ID and two parameters which we will now examine in more detail.


The "o" parameter is used to pass the coupon code that grants the customer the discount on to the sign-up script. The old coupon code that was good for $50 in credits was USCJ17, for example (o=USCJ17). It was replaced with the new coupon code USCJ16, which is good for only $25 in click credits (o=USCJ16).


The other parameter is “b”, which contains the discount amount. b=50 would be a $50.00 discount for example.


The value of "o" is not validated by the script whatsoever, and "b" can be any amount Yahoo! seems to offer as a discount. The page shows $0 if the amount does not seem to be right. 100 (= a $100 discount) does not work, for example, but 75 ($75) seems to be a valid promotion amount, because it is accepted as a value.


Check out this fake URL and see for yourself what Yahoo!’s own website is telling the visitor:


http://searchmarketing.yahoo.com/arp/sponsoredsearch_v2.php?o=GO-GOOGLE-ADWORDS&b=75


Note: The URL is fake, no discount will be granted!


Here is a screen shot, because I don't expect the link to keep working the way it does today for much longer.


Fake GO-GOOGLE-ADWORDS Coupon Code for YSM


Advice to Yahoo!: Tell one of your developers to add a check for the coupon code (URL parameter "o") and return an error if it is an invalid or expired coupon (yes, show two different messages to avoid customer service issues and confusion).


While you verify the validity of the coupon code, also pull the actual discount amount that the customer gets with the coupon from the database and ignore the "b" parameter altogether.
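In other words, something along these lines on the landing page. This is only a rough sketch; the lookup_coupon, show_error and show_signup_page functions are made up for illustration and are not Yahoo!'s actual code:

<?php
// Sketch: validate the coupon code server-side and ignore the "b" parameter.
$coupon = isset($_GET['o']) ? trim($_GET['o']) : '';

$offer = lookup_coupon($coupon); // hypothetical database lookup returning amount and expiry

if ($offer === null) {
    show_error('This coupon code is not valid.');
} elseif ($offer['expired']) {
    show_error('This coupon code has expired.');
} else {
    // Use the discount amount stored in the database, never the "b" URL parameter.
    show_signup_page($offer['amount']);
}
?>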


I have already sent the affiliate management team of the Yahoo! affiliate program an email about this flaw and about the old, but seemingly still working, YSM promo banners and links. I also told the AM about my blog post here at SEJ. Blogs tend to expedite response times from internet companies from time to time. :)


Carsten Cumbrowski

Cumbrowski.com, Internet Marketing Resources Portal. Pay-Per-Click Search Engine Offers and free click credits, SEM Resources and more.

Dan London of AdwordsEditor.blogspot.com sent me a screen shot of the new Google AdWords Account Snapshot Beta which looks very similar to Yahoo Search Marketing’s Panama admin.


Google AdWords Account Snapshot


Yahoo Search Marketing Dashboard


Similarities include:



  • Personalized advertiser alerts

  • Campaign forecasting in chart, graph form

  • More account transparency

  • Focus on traditional marketing metrics like CPM


Not to say that one company is copying the other, but Yahoo Search Marketing’s new interface is more marketer friendly, which seems to have prompted Google to create a more user friendly campaign “Snapshot” of their own.

At the end of last week, Google acquired a large number of domains around the phrases "Claim Your Content", "Claim My Content" and "Claim Our Content":


CLAIMYOURCONTENT

CLAIM-YOUR-CONTENT

CLAIMMYCONTENT

CLAIM-MY-CONTENT

CLAIMOURCONTENT

CLAIM-OUR-CONTENT


Registered TLDs: .COM, .NET, .ORG

Country specific TLDs: .FR, .DE, .CH, .CO.UK etc.


WWWCLAIMYOURCONTENT

WWWCLAIM-YOUR-CONTENT


Registered TLDs: .COM, .ORG, .NET


Not registered were domains for:


WWWCLAIMMYCONTENT

WWWCLAIM-MY-CONTENT

WWWCLAIMOURCONTENT

WWWCLAIM-OUR-CONTENT


This implies that ClaimYourContent.* will be used as the primary domain.


Garrett Rogers from the Googling Google blog speculates that the domains could be used to offer webmasters a tool to fight scrapers and others who steal content from their websites.


Sam Harrelson from CostPerNews.com speculates that this might be an attempt to allow users to claim (and thereby easily monetize) content from the wide variety of content-producing platforms.


An effective system to fight content theft and scraping would be great.


Webmasters today fight an uphill battle against content theft, especially against scrapers. Scraper sites are literally sites that show "scraped" content from other sources, like SERPs, RSS feeds, blogs and other websites.


The scraper "mashes up" and "scrambles" the content as well as he can to circumvent the search engines' duplicate content filters. Only as much as absolutely necessary is done on the site, which usually consists of thousands of auto-generated pages. Nothing is done by hand, because the poorly converting pages that litter all the engines' indexes are only profitable if you generate a lot of them.


While using tools like CopyScape to find duplicate content can be helpful, and vehicles like the federal Digital Millennium Copyright Act (DMCA) might work against individual cases of content theft by other webmasters with a real website, those methods are pretty much worthless against scrapers, who produce websites using your content faster than you can act on them, not to mention the problem of finding out the identity of the scrapers to send a DMCA notice to.


You can also send a DMCA notice to the search engines every time a scraper site with your content appears in the SERPs, but that can turn into a full-time job in itself.


The most effective tool available to webmasters today against scrapers that pull content directly from your website is to identify their scraper scripts and block them from accessing your site.


Those scripts are basically "bad robots" that ignore the robots.txt exclusion protocol and robots meta tags. David Naylor has provided information, and also source code, showing how to identify and block bad robots on his blog.
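Just to illustrate the general idea (this is not David Naylor's code, only a minimal sketch with a made-up user agent list; determined scrapers also fake their user agents, so checks on IP ranges and behaviour are usually needed as well):

<?php
// Sketch: refuse requests from known bad user agents before rendering the page.
$bad_agents = array('BadBot', 'ContentGrabber', 'EmailSiphon'); // placeholder names

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($bad_agents as $bad) {
    if (stripos($ua, $bad) !== false) {
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied.');
    }
}
?>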


This method does not help you if scrapers use the content of your RSS feed. The only thing you can do there is to not make full articles available in your feeds, but only a brief summary or the first 100-200 characters of the post, with a "more" link to the full article on your website.


Anything Google would come up with to solve or at least reduce those problems would be helpful, but if that is what those domains might be used for, I would like to know how they would solve problems like:



  • Verify that sites that claim content as their own are actually the rightful owners of the content

  • Prevent scrapers or rogue webmasters who steal content from claiming content from others as their own

  • Allow content owners to white-list sites that do have permission to re-purpose some of their content (press releases, free to re-print articles etc.)


This is a very complicated subject and a hot one at the same time. I think it would already be a good start if webmasters had a way to tell the search engines when their site's content gets suppressed or removed from the SERPs due to a duplicate content penalty or filter caused by content theft. This would especially help new domains, which are the most likely to become victims of this because of the lack of trust compared to older domains (the Google Sandbox effect).


A scraper who acquires an old domain to put up somebody else's content will most likely be considered the content owner by the search engines, and the original content owner gets penalized or filtered out.


I guess we will have to wait a bit longer to see what Google will be using the newly registered domains for. But that does not stop people from speculating. Google might get some new and useful ideas from what people speculate.


Cheers!


Carsten Cumbrowski

Cumbrowski.com, internet marketing resources, including duplicate content and legal resources and much more.


Quick Update (for everybody who does not read comments): ClaimYourContent appears to be the name of YouTube’s copyright protection service. See here. Thanks Pete for pointing that out. However, I hope that Google does not stop there. The mentioned scraper issues are unresolved and options should be considered to find a solution for them.

If you do not recognize the obvious grammatical error in the title, you MUST read this article; I wrote it specifically for you!


If you only have five seconds to spare, just remember the three C’s of optimized copy: be concise, correct and credible.


In 2003, Kathy Kiely published an article in USA Today discussing the impact of bloggers on presidential campaigns. Despite the subject matter, Kathy’s points still apply today, “Many bloggers are not professional journalists. Few have editors. Most make no pretense of objectivity.” I believe many search engine and social media marketers fall into the same category and as a result our industry, clients, their customers and the search engine results suffer.


The Bottom Line


People either do not like or do not have the time to read poorly written content. Therefore, whether you are writing an optimized web page, linkbait, blog entry or press release, it is essential that you write with concise language, correct grammar and credibility. Besides increasing readership, you will make the search engines very happy.


Let me take a moment to state that my mother was an English teacher. Growing up under her watchful eye I rebelled by misusing commas and tenses. Despite being grammatically challenged I managed to become a competent writer, which I attribute to being opinionated and following these simple rules:



  1. Be concise: answer what, where, and when early in the content. Well-written titles and opening lines that stick to the facts have a stronger response rate from both humans and search engines.

  2. Be correct: use correct grammar and spelling. If there is only one thing I want you to remember from this article it is that you should read your content out loud, preferably from a printed document. I am a huge environmentalist, but the mistakes spell check does not catch are easily heard and seen when you remove yourself from the computer. As for grammar, try to use familiar words in short, declarative sentences.

  3. Be credible: get the facts straight. Wikipedia is NOT a reliable source for unbiased information. Neither are most blogs that quickly turn news into a bad game of telephone. If you are writing about a subject, go straight to the source. In the age of backlink nepotism, many search engine marketers are losing sight of how to write as the authority. This is not high school: do your own homework or fall into an ever-growing sea of noise. Or worse yet, get removed from feed subscriptions!


Do these points sound like common sense? They should; they are the basic pillars of journalism. I have left out some of the biggies: humor, creativity, newsworthiness, etc. For a nice summary read Michael Gray's post, Top 12 Ways To Win Friends & Write Magnetic Headlines, at Search Engine Land.


Fortunately, when it comes to search engine optimization, marketers have free license to bend those rules for the sake of keyword integration. Flora Fair, an accomplished journalist turned optimized copywriter states, “Though journalistic and SEO writing are similar, one key difference is the challenge of integrating keyword phrases for searchability. You sometimes have to bend the strict rules of grammar applied to journalism and make the copy sound as natural as possible. But whether you’re writing for the news or for the Web, the ultimate goal is to tell a compelling story that conveys information. Including optimized keywords adds another factor to the process.”


Marketers can also bend the rules for the sake of creativity or branding, but I will never forget my mother’s caution, “you must understand the rules before you can break them.”


Class dismissed.

For those of you not yet aware, Google is currently updating the PageRank they are displaying in their toolbar. Each update causes a stir among the SEO community and webmasters trying to get their websites to the top of the Google Rankings.
What Is PageRank?
Without getting into too much detail, PageRank is essentially a score out of ten as to the "value" of your site in comparison to other websites on the Internet. It is based on two primary factors: the number of links you have pointing to your website and the value of the links pointing to your website. The value is calculated based on the PageRank of the page linking to you and, debatably, the relevancy of the page linking to you (there is no hard evidence that I have seen to back up the relevancy factor in regards to PageRank; however, it is definitely a factor in your overall ranking).
If you are interested in more information on PageRank you would do well to visit the many forums and articles on the topic and also visit Google’s own description on their website at http://www.google.com/technology/ where they give a brief description of the technology.
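For readers who like to see the mechanics, the simplified formula published in the original PageRank paper captures both factors neatly. Here d is a damping factor, commonly quoted as 0.85, T1 to Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

The toolbar value is widely believed to be a roughly logarithmic 0-10 representation of this underlying score, which is one reason gaining each additional point gets harder.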
What’s New?
The most current PageRank update will undoubtedly cause a larger stir than usual in that many sites have shown drops in their visible PageRank while at the same time showing significant increases in their backlinks. This fact reveals that one of three things has occurred in this latest update:
1. Google has raised the bar on PageRank, making it more difficult to attain a high level, or
2. The way they are displaying their backlinks has changed, or
3. The way they calculate the value of an incoming link has changed.
Any of these is possible, and each has been noted in the past as something they are willing to do. Additionally, it is possible for all three to occur at the same time.
As we don't like to use clients as examples, I will use the Beanstalk site, its backlink counts and PageRank changes as the meter by which the following conclusions are drawn; however, this information was attained by looking at a number of client websites and their competitors.
Google Raising The Bar To Lower Yours
In the past few PageRank updates it has become quite apparent that Google is continuously raising the bar on PageRank. In their defense, with all of the reciprocal link building, link renting, etc. going on, this was a natural reaction to the growing number of high PageRank sites that attained those ranks simply by building or buying hundreds or thousands of links.
There is no doubt that this is a factor in the changes in this current update. If your site has maintained its PageRank, and the PageRanks of your second-level pages, then you have done well in holding steady, and if your competitors have not been as diligent, their positions will slip.
New Backlink Calculations
I mention this one only to bring to light that it is a possibility for your future consideration during other updates. The Beanstalk website went from 750 shown backlinks on Google to 864. It should be noted that Google does not show all backlinks (if you want a more accurate backlink count, go to Yahoo! and enter "link:http://www.yourdomain.com", and don't forget the http://).
When the Beanstalk site showed 750 backlinks on Google, we were showing around 12,000 on Yahoo! (about 6.5% showing on Google). The Beanstalk site is now showing 864 on Google and 15,500 on Yahoo! (about 5.6%). If anything, then, Google is showing a smaller proportion of links than before, which rules out the possibility that a drop in PageRank is really a decrease in links being masked by a larger number being displayed.
In short, while which backlinks Google chooses to display has certainly changed over time, it does not appear to be a major factor in this update. If you see an increase in your site’s backlink count during this update, you undoubtedly have an increased number of links.
The Value Of Links
Separate from the number of links you have is their value. This appears to be an area of significant change in this update. Areas that appear to have reduced value in regards to affecting PageRank are:
1. Multiple links from the same site or run-of-site links
Intelligent and relevant reciprocal links do not seem to have been penalized, probably due to the increased relevancy factor. If you reduce the value of irrelevant links and raise the value of relevant ones then there is no need to penalize reciprocal links as, done incorrectly, they will penalize themselves.
2. Links with text around them that indicate they are purchased such as “Partners”, “Advertising”, etc.
Google has and is actively trying to reduce the value of paid links. This appears to have been moderately successful where there is clear indication that the link is paid for.
3. Links from sites that hold little relevancy (this factor is based on educated speculation)
The relevancy factor appears to have become more important. Links from sites with content related to yours are showing positive results, while sites with larger numbers of less relevant links are showing drops in PageRank.
What Does This Mean?
For those of you who have been proactive in your link building and focused on relevant sites using the Google Directory, searches, or a tool like PR Prowler, it means “stay the course”. Those of you who have been building or buying links based only on PageRank, with little concern for their location or how they are presented, will need to adjust your link building efforts accordingly.
What Do I Do - My PageRank Dropped ?!!?
The first thing not to do is panic. Take a deep breath: PageRank is one of dozens of factors Google uses to determine the ranking of your page; it is not the only thing. Now, visit your main competitors’ sites - there’s a good chance you’ll see that they too dropped in PageRank. The plus side to these kinds of updates is that they’re universal. It’s not as if Google has it in for you specifically, so when they do an update, the positive and negative impact is felt by all.
Now, if you’ve noticed that everyone around you has stayed the same or increased in PageRank, try to remember this: there’s nothing you can do about where you’re currently positioned in regards to PageRank, and it will probably be another 3 months before Google updates the public PageRank again, so … start building some good quality (high relevancy, solid PageRank) links and work toward an increase in the next update.
Panicking now won’t help, intelligent reaction will.
What Happens Now?
Traditionally, the search engine results begin to fluctuate 3 to 7 days after the new PageRank values become visible. This does not have to be the case, as Google has had these numbers all along, but it has worked this way in the majority of cases in recent history. So monitor your search engine positions over the next week or two and watch for changes. Try to hold back on making major changes to your site during this time, as the final positions often differ from those that can be viewed during the shuffling. In a couple of weeks’ time, evaluate where you stand and tweak your site as necessary, but don’t spend too much time on that … you have a solid link building effort to undertake.

The Top 10 SEO Tips contained in this tutorial were created after years of experimentation with half a dozen websites, mostly those of my clients who insisted they should be on the first page of the search engine results pages. I hope you find these SEO tips valuable. For an in-depth study of everything I know about search engine optimization, please read my eBook.


SEO Tip #1: Find the Best Keywords


It would be a waste of your time to optimize your website for keywords that are not even being searched for. Therefore you should invest some energy into finding the best keywords. There are several SEO tools available on the Internet to help you find the best keywords. Tip: Don’t be deceived by organizations that require you to register first. The two most popular resources are WordTracker and Yahoo!. Because Yahoo! has a man-made database that truncates plurals, I prefer to use WordTracker (WT).


Below is a screenshot from WT that shows the results you’ll get when doing a query for “putter”. Notice that “golf putters” has the highest search volume with 100 searches in the last 24 hours, yet there are over 100,000 websites to compete against. Using the tool’s Keyword Effectiveness Index (KEI), you’ll be able to see that “custom putter” would have a better chance at higher ranking, since there are only 2,640 competing.


SEO Search Results
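For reference, the commonly quoted formula behind KEI is popularity squared divided by the number of competing pages (WordTracker's exact weighting may differ, so treat this as an illustration). Using the numbers above:

KEI("golf putters") = 100 x 100 / 100,000 = 0.1

A phrase with the same daily search volume but only 2,640 competing pages would score roughly 3.8, which is why the less crowded phrase is the more attractive target.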


Here’s a key part of the top 10 SEO tips: When using any SEO tool for doing keyword research, start by keeping your searches ambiguous, as we did in the example above for “putter”. The results will always return suggestions, sometimes surprising ones that you may not have thought of.


You can get less comprehensive results by using DigitalPoint’s Keyword Suggestion Tool. This SEO tool will give you a summary of information without the KEI. Personally, I like to know how many people are competing before I design a web page.


SEO Tip #2: Discover Your Competitors


It’s a fact, and one of my top 10 SEO tips, that search engines analyze incoming links to your website as part of their ranking criteria. Knowing how many incoming links your competitors have will give you a fantastic edge. Of course, you still have to discover your competitors before you can analyze them.


My tool of choice is SEO Elite, which digs through the major search engines by keyword to not only tell you who your competitors are but also provide an in-depth analysis of each one. The analysis includes extremely important linking criteria (super SEO tips) such as:



  • Competitor rank in the Search Engines

  • Number of incoming links

  • What keywords are in the title of linking page

  • % of links containing keywords in the link text

  • The PageRank of linking pages

  • The Alexa traffic ranking information


Here is a screenshot of their SEO software that shows the search results and the module that has the email functionality:


SEO Software Screenshot


Stats such as these play a critical part in determining what your website will need to compete in the Internet marketing arena. SEO Elite also offers you the ability to see who the website owner is and even send emails to all websites discovered to have quality link potential.
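If you don't own a dedicated tool yet, you can approximate part of this research by hand with ordinary search operators; the domain and keyword below are hypothetical:

link:www.example-competitor.com

allintitle:"golf putters"

The first returns a sample of pages linking to a competitor, and the second shows pages competing with your keyword in their title. Tools like SEO Elite essentially automate and cross-reference these lookups.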


SEO Tip #3: Optimize Your Title


The Title and META tags should be different on every page of your website if you wish for most search engines to store and list them in the search results. We SEO experts have experimented with these two pieces of code to reach an accepted conclusion about how best to use them. Don’t click off this site until you’ve read the tips below to see what I’ve discovered works best for search engine optimization.


Optimizing Your Website Title


There are different theories about how long your Title should be. Since Google only displays the first 66 or so characters (with spaces), my Top 10 SEO Tips for the title would be to keep it under 66 characters and relevant to the content on the page. However, some may argue that the value of the homepage title may warrant additional search term inclusion.


Bar none, the most important of the top 10 SEO tips involves your keywords. If you wish to be on the first page of the search results, you must include your keywords in your Title tag, preferably before all other words in the Title. There’s no need to repeat your keywords in the Title; that’s interpreted as spam by the search engines. Here is an example of a good Title:


SEO Tips: Search Engine Ranking


Notice the symbol between the keyword phrases. These are the types of SEO tips your competitors don’t want you to know about. For whatever reason, this little symbol has helped top SEOs (myself included) target more than one keyword per page. I still recommend starting with one keyword phrase per page, but if you’re aggressive and think you can compete against a website targeting just one, by all means go for it.
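For instance, a page going after both of the putter phrases from the keyword example earlier might use something like this (a hypothetical page, kept under the 66-character mark):

<title>Golf Putters : Custom Putters for Serious Players</title>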


SEO Tip #4: Optimize Your META Tags


META tags are hidden code read only by search engine webcrawlers (also called spiders). They live within the HEAD section of a web page. There are actually 4 very important META tags you need to worry about. META tags specifying who the author is and what the site is about really aren’t important to the search engines that matter the most (i.e., Google). The META tags you need to be the most concerned about are:



  1. robots

  2. content-type

  3. description

  4. keywords


Sequencing of these tags may be extremely important. I say “may” because SEO is mostly hypothesis due to the changing algorithms of the search engines. Even though the W3C states that tag attributes do not have to be in any particular sequence, I’ve noticed a significant difference when I have the tags and attributes in the order described here. The only deviation from the list above is that the Title tag should come after content-type and before description.
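Put together, a HEAD section in the order just described might look like the following sketch (the titles and content values are placeholders, not recommendations for your site):

<head>
<meta name="robots" content="index, follow" />
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<title>Main Keyword Phrase : Secondary Phrase</title>
<meta name="description" content="Main keyword phrase followed by a short statement about your page." />
<meta name="keywords" content="main keyword phrase, secondary phrase" />
</head>

Each of these tags is covered individually below.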


The robots META tag tells the various search engine spiders whether or not you’d like them to crawl through your web page, as well as where to start in their crawling activity. Strictly speaking, this tag is nearly redundant so long as you use a robots.txt file, but it’s not too hard to see why it can still be important. Here is the syntax:


<meta name="robots" content="index, follow" />


You can change the “index” to “noindex” and the “follow” to “nofollow” if you do not want your website to be indexed. Though, I have no idea why you wouldn’t want to be indexed.


Content-type is important to complex search engines like Google. This tag tells the spider what type of page you are posting, which helps the search engine categorize the listing. It also shows that you are following the World Wide Web Consortium (W3C) guidelines, which could be an indication of a site being “optimized”. Here is the syntax used on this page:


<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />


The description META tag is the text that will be displayed under your title on the results page. See the OC Internet Advertising example above. There’s also a lot of controversy about the number of characters you should have in this tag. I’ve seen sites with a paragraph in their description listed in the top results, so I don’t think this tag has very much weight.


However, if you want the listing to look clear and to the point, my Top 10 SEO Tips for this META tag would be to keep it under 150 characters and to not repeat your keywords more than 3 times. It may be a coincidence, but I’ve also noticed ranking improvements when I put my keywords at the beginning of the description. Here’s the syntax:


<meta name="description" content="your_keywords_here followed by a statement about your product, service or organization." />


The last important META tag is the keywords META tag, which recently lost a lot of weight in Google’s search engine algorithm. Even so, this tag is still important to many other search engines and should not be ignored. Based on my experience with this tag, you can have approximately 800 characters in it (including spaces).


SEO Tip: if you repeat your keywords more than 3 times, it can be a pretty good indication to the search engine that you are trying to spam their search results. Also, don’t waste your time including keywords that aren’t used in the BODY section of your website; that could be seen as another spam technique. Here’s the syntax used on this Top 10 SEO Tips page:


<meta name="keywords" content="top 10 seo tips, what is seo, resources, seo software, seo ebook, search engine optimization" />


SEO Tip #5: Use Headings


In college and some high schools, essays are written using a standard guideline created by the Modern Language Association (MLA). These guidelines cover how to write your cover page, title, and paragraphs, how to cite references, and so on. On the Web, we follow the W3C’s guidelines as well as commonly accepted “best practices” for organizing a web page.


Headings play an important role in organizing information, so be sure to include at least H1-H3 when assembling your page. Using Cascading Style Sheets (CSS), I was able to make the H1 at the top of this page more appealing. Here’s a piece of code you can pop into your page’s HEAD section:


<style type="text/css">
h1 { font-size: 18px; }
h2 { font-size: 16px; }
h3 { font-size: 14px; }
</style>


Since a page full of headings would look just plain silly, my SEO tip would be to fill in the blank space with paragraphs, ordered and unordered lists, images, and other content. Try to get at least 400+ words on each page.
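As a rough sketch, an optimized page body might be organized like this (the headings and copy are placeholders carried over from the putter example):

<h1>Golf Putters</h1>
<p>Opening paragraph that works the main keyword phrase in naturally.</p>
<h2>Custom Putters</h2>
<p>Supporting copy, lists and images related to the sub-topic.</p>
<h3>Putter Grips and Accessories</h3>
<p>More supporting copy to help reach the 400+ word mark.</p>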


SEO Tip #6: Use Title and ALT Attributes


More often than not, web addresses (URLs) do not contain the topic of the page. For example, the URL www.myspace.com says nothing about being a place to make friends, whereas a site like www.placetomakefriends.com would tell Google right away that the site being pointed to is about making friends. So, to be more specific about where our links point, we add a title attribute and include our keywords.


Using the Title Attribute is a direct method of telling the search engines about the relevance of the link. It’s also a W3C standard for making your page accessible to disabled people. In other words, blind folks can navigate through your website using a special browser that reads Title and ALT attributes. The syntax is:


<a href="http://www.top10seotips.com/seo_software.htm" title="SEO Software">SEO Software</a>


The ALT Attribute is used for the same reasons as the Title Attribute, but is specifically for describing an image to the search engine and to the visually disabled. Here’s how you would use ALT in an IMG tag:


<img src="http://top10seotips.com/img/image01.jpg" alt="Top 10 SEO Tips">


SEO Tip #7: Nomenclatures


Whenever possible, you should save your images, media, and web pages with the keywords in the file names. For example, if your keyword phrase is “golf putters”, you’ll want to save the images used on that page as golf-putters-01.jpg or golf_putters_01.jpg (either will work). It’s not confirmed, but many SEOs have experienced improvement in ranking by renaming images and media.


More important is your web page’s filename, since many search engines now allow users to query using “inurl:” searches. Your filename for the golf putters page could be golf-putters.html or golf_putters.html. Anytime there is an opportunity to display or present content, do your best to ensure the content has the keywords in the filename (as well as a Title or ALT attribute).
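Once the page is indexed, you can check how that filename surfaces with a query like the following (the filename is, of course, hypothetical):

inurl:golf-putters

The same query is also a quick way to gauge how many competing pages already use your keywords in their URLs.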


SEO Tip #8: Create a Site Map Page


PageRank is relative and shared throughout a website by a unique voting system created by Google. I could spend two days trying to explain how PageRank works, but what it comes down to is having efficient navigation throughout your site. That’s where a site map page comes in. Since every page on the website is linked from the site map, it allows webcrawlers (and users) to quickly and easily find content. This is one of my favorites of the top 10 SEO tips.


It used to take 4 clicks to get to a product page at www.questinc.com. By creating a site map, users and search engines can now access any page on the site with only two clicks. The PageRank of these deep pages went from 0 to 2 in about 3 months, and the ranking went from virtually nonexistent to #1 almost across the board for nearly 2,000 pages on their site.


SEO Tip 8 - Create a Site Map - Example


Feel free to search Google for any of the terms on this catalog page, such as MITSUBISHI Monitor Repair. See how powerful a site map can truly be.
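If you're hand-coding a site map page, at its simplest it's just a categorized list of plain text links; the URLs below are hypothetical:

<h1>Site Map</h1>
<ul>
<li><a href="http://www.example.com/" title="Home">Home</a></li>
<li><a href="http://www.example.com/golf-putters.html" title="Golf Putters">Golf Putters</a></li>
<li><a href="http://www.example.com/custom-putters.html" title="Custom Putters">Custom Putters</a></li>
</ul>

Note that the Title attributes double as the accessibility and relevance hints discussed in Tip #6.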


SEO Tip #9: Include a robots.txt File


By far the easiest of the top 10 SEO tips as it relates to search engine optimization is to include a robots.txt file at the root of your website. Open up a text editor such as Notepad, type “User-agent: *” and, on the line below it, an empty “Disallow:” directive. Then save the file as robots.txt and upload it to the root directory of your domain. This short record tells any spider that hits your website, “please feel free to crawl every page of my website”.


Here’s one of my best top 10 SEO tips: Because the search engine analyzes everything it indexes to determine what your website is all about, it might be a good idea to block folders and files that have nothing to do with the content we want to be analyzed. You can disallow unrelated files to be read by adding “Disallow: /folder_name/” or “Disallow: /filename.html”. Here is an example of the robots.txt file on this site:


SEO Tip 9 - Create Robots.txt File - Example
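In plain text, a robots.txt along those lines might look like this; the blocked folders are just examples of the kind of unrelated content you may want to keep out of the index:

User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/
Disallow: /print-versions/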


SEO Tip #10: Install a sitemap.xml for Google


Though you may feel like it is impossible to get listed high in Google’s search engine results pages, believe it or not, that isn’t Google’s intention. They simply want to ensure that their viewers get the most relevant results possible. In fact, they’ve even created a program just for webmasters to help ensure that your pages get cached in their index as quickly as possible. They call the program Google Sitemaps. In this tool, you’ll also find a great new linking tool to help discover who is linking to your website.


For Google, these two pieces of the top 10 SEO tips would be to read the tutorial entitled How Do I Create a Sitemap File and to create your own. To view the one for this website, simply right-click this SEO Tips Sitemap.xml file and save it to your desktop. Then open the file with a text editor such as Notepad.


Effective 11/06, Google, Yahoo!, and MSN will be using one standard for sitemaps. Below is a snippet of the standard code as listed at Sitemaps.org. Optional fields are lastmod, changefreq, and priority.


<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.example.com/</loc>
<lastmod>2005-01-01</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
</urlset>

The equivalent of the sitemap.xml file for Yahoo! is the urllist.txt. Technically you can call the file whatever you want, but all it really contains is a list of every page on your website. Here’s a screenshot of my urllist.txt:


Example of a Yahoo! urllist.txt File
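If you'd rather build it by hand, a urllist.txt is simply one fully qualified URL per line, along these lines (hypothetical URLs):

http://www.example.com/
http://www.example.com/golf-putters.html
http://www.example.com/custom-putters.html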


SEO BONUS Tip: Validate Your Code


There are several ways to validate the accuracy of your website’s source code. The four most important, in my opinion, are validating your search engine optimization, HTML, and CSS, and ensuring that you have no broken links or images.


Start by analyzing broken links; the W3C offers a free tool to validate them. If you have a lot of links on your website, this could take a while.


Next, revisit the W3C to analyze HTML and CSS. Here is a link to the W3C’s HTML Validation Tool and to their CSS Validation Tool.


The final step in the last of my Top 10 SEO Tips is to validate your search engine optimization. Without having to purchase software, the best online tool I’ve used is ScrubTheWeb’s Analyze Your HTML tool. STW has built an extremely extensive online application that you’ll wonder how you’ve lived without.


One of my favorite features of STW’s SEO Tool is their attempt to mimic a search engine. In other words, the results of the analysis will show you (theoretically) how search engine spiders may see the website.


Author: Steven Wiideman

I’m starting the SEO Training section with some SEO tips to increase traffic to your blog, specifically self-hosted Wordpress blogs. Whatever its purpose, I’m sure you’d want others to know about your blog, especially after spending so much time and effort writing - I know I would.


Before we begin, some of you may ask why you should even bother listening to a blogger with barely 2 weeks of experience with Wordpress - good question!


The answer is that I always consider the marketing aspect when embarking on any online project. Since I’m not technically trained and can’t create fancy websites, I need to rely on my marketing skills to have an edge over my competition.


For example, I took over a forum in November last year without a clue about how the IPB forum software works, but I managed to grow the community from less than 2,000 members to almost 9,000 members in 5 months - 350% growth! The Vista Forums is one of the biggest unofficial Windows Vista forums today, receiving close to 7,000 visitors/day, and it ranks #1 on Google for “windows vista forum” and “vista forum”.


Now that we have gotten that out of the way, here are my SEO recommendations for Wordpress blogs:


1) Put Your Post In Only 1 Category


SEO Wordpress Tips - Categories

Make it easier for the search engines to find your posts by putting them into only 1 category.


2) Make Use Of The “Read More” Feature


SEO Wordpress Tips - Read More

Make use of the “More” tag - leave the first paragraph on the main page and the rest of the content on the post page. This helps prevent duplicate content and keeps you out of Google’s Supplemental Index.
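In the post editor's code view, the split point is just an HTML comment dropped in after your opening paragraph; the surrounding copy here is a placeholder:

The first paragraph, which summarizes the post and appears on the main page.
<!--more-->
The rest of the post, which only appears on the individual post page.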


3) Related Posts


SEO Wordpress Tips - Related Posts Plugin

There’s a Wordpress plugin called “Contextual Related Posts” that automatically displays 5 of your previous related posts below the comment box. It’s an old (2005) plugin that works by simply activating it in your Wordpress Dashboard. With related posts you get more inner links, exposure for older posts and stickiness on your blog.


4) Create Attractive Titles


SEO Wordpress Tips - Titles

Getting more exposure is not enough; you also need to get people to want to click on your link. Think of a creative title that can get potential readers interested, e.g. “How to cheat Ping.sg” (see image above).


5) Customize Your Permalinks


SEO Wordpress Tips - Permalinks

Wordpress creates boring URLs for your posts by default, e.g.

www.larrylim.net/seo-online-marketing/?p=5

You can customize the permalinks to embed your title (with keywords) into the URL, e.g.

www.larrylim.net/seo-online-marketing/chief-marketing-officer-2007/5/

Simply set your preference in your Wordpress Dashboard and add some commands into the .htaccess file.
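As a rough sketch: the custom permalink structure that produces URLs like the one above would be something along the lines of /%postname%/%post_id%/ (the exact tags depend on your preference), and on an Apache server Wordpress writes - or asks you to paste - a mod_rewrite block like this into the .htaccess file:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress

If your blog lives in a sub-directory, the RewriteBase and the final rewrite target change accordingly.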


6) Have Different Meta Titles

There’s a Wordpress plugin called “SEO Title Tag” that lets you set a different meta title for every post, including the static pages and categories. A meta title is the title that appears in your browser’s title bar when viewing a page, and it’s also the title shown for your pages in search engine results. Unfortunately, this plugin is still in beta and was showing MySQL errors for my Wordpress version 2.1.2.


Update: I’ve managed to find a solution to the problem here


7) Create Description And Keyword Meta Tags


SEO Wordpress Tips - Meta Tags

You can also set your meta description and keyword tags for all your posts, static pages and category pages using a Wordpress plugin called “Add-Meta-Tags“. These meta tags are used by search engines when ranking your blog pages in search results. By default, the first full sentence is automatically used as your meta description, while the category is used for your meta keywords. There’s an option to set them manually by using custom fields in Wordpress when writing a post.


8) Create Technorati Tags For Each Post


SEO Wordpress Tips - Technorati Tags

There’s a Wordpress plugin called “Bunny’s Technorati Tags” that lets you manually insert tags for your posts. This helps increase your presence and gain more exposure on the Technorati blog search engine. While we’re at it, you should probably add “Action Buttons” to your posts - those cute little buttons you see on other blogs. They give your readers a convenient way of adding your posts to social networking sites like Digg.com and Del.icio.us. I’d advise using a maximum of 5 buttons, because any more than that may confuse your readers and create ugly clutter.


9) Generate A Sitemap


Wordpress SEO Tips - Sitemaps

Next, make sure to generate an XML sitemap so that the search engines know which pages to crawl - you can use a Wordpress plugin called “Google Sitemaps” to do this automatically. Then inform Google and Yahoo about the location of your sitemap.


10) Use A Pinging Service

Blog pinging services like Pingoat help you ping, or notify, a number of blog aggregators and blog search engines, e.g. Technorati and Syndic8, whenever you update your blog. This helps increase your blog’s presence on the Internet and creates more avenues for visitors to find you.
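If you prefer to automate this, Wordpress can ping for you: under Options → Writing → Update Services you can list XML-RPC ping endpoints, one per line. The two below are the endpoints these services have commonly published, but double-check each service's current ping URL before relying on it:

http://rpc.pingomatic.com/
http://rpc.technorati.com/rpc/ping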


Note that all the SEO recommendations above are specific to the Wordpress blog and do not include general SEO strategies like keyword analysis and linkbuilding - we’ll keep that for future lessons. ;)


If you appreciate my effort, feel free to:

- add me to your Technorati favourites using the button on the left.

- blog about this article.

- add me to your blogroll if you’re a blogger.

- join my community at MyBlogLog.


Thanks for reading,