About michaelcdonc1

Michael Carrington Bio: Search engine optimizer with over 15 years of hands-on experience bringing large client programs, some with as many as 100 websites, to the first page and top of the listings at Google, Yahoo and MSN. Fifteen years of experience developing high-ranking organic placements that bring global leads to local businesses from local end users, along with an extensive track record in online sales and lead generation for local businesses. As a senior-level, results-driven professional with an exceptional record of building and ranking web programs, I have created and managed high-ranking websites by translating internet marketing theory into solid, proven results. I am a creative architectural strategist, bringing together multimedia marketing, new business development, and branding across multiple digital channels. Having worked for companies in the US, Canada and Mexico, I have been successful in acquiring additional market share, global and local, by creating cross-web campaigns in foreign languages. My long tenure as an ODP (dmoz.org) editor and my Google certification as an SEO and Pay Per Click consultant give me extended knowledge of the search engines’ back-end processes. I am experienced in social marketing theory and its applications, and my peers consider me an expert in all things pertaining to SEO, web marketing, and Google, Yahoo and Bing algorithm changes.

How Website Structure & Information Architecture Should Mirror Your Business Goals

Thomas is the CEO of a major corporation. He had supervised a recent website redesign, loved the snazzy new look and the bells and whistles created by a talented graphic designer – but he was calling me for help with a problem.

His beautiful new website wasn’t getting many visitors!

“Why don’t people want to visit our lovely website?” Thomas wailed, genuinely puzzled that the results of his intensive efforts weren’t as rosy as he had expected. To me, as a strategic SEO consultant, the reasons were glaringly obvious… but I had to soften the impact and gently explain what had gone wrong.

Together, we quickly checked the site’s ranking on Google for his top 50 keywords. The site wasn’t anywhere in the top 10 results for any of them. Or even the top 20.

You see, the not-so-apparent reason for the ‘failed’ website was the lack of something essential both for higher search engine rankings and for the kind of visitor experience that converts a prospect into a customer.

What’s that, you ask?

Thomas’s new website, though visually appealing and technology-rich, was sorely lacking in a well-planned information architecture and website structure.

But what is “information architecture”? And how does “website structure” differ from design?

A formal definition of “information architecture” would likely put you to sleep! So let’s simply call it the art of organizing and labeling website content, and bringing design and architecture principles to bear on it.

To understand this better, we’ll look at the skeleton of a website, shorn of flesh and skin, stripped down to the basic fundamentals of what shapes and strengthens it – from within.

Basic Concepts Of Information Architecture
In medical school, trainees begin by learning human anatomy. Knowing what makes up the body helps them understand (and later treat) the diseases that affect it.

At the heart of understanding website structure, and planning your strategy for information architecture, lies a need to know about terms like semantic search, latent semantic indexing, knowledge graph, and SEO automation.

Semantic search is an attempt to improve search accuracy by predicting the intent of a searcher. The shift from blindly matching keywords typed into a search box against a massive database, to a more “intelligent” form of search that attempts to understand what those words actually mean to the user, has serious implications for strategic SEO for many business owners.

Latent Semantic Indexing is an indexing and retrieval method designed to identify patterns in the relationships between terms and concepts within a body of text.

By providing context for each search term or phrase, it helps ensure that a search for ‘Apple’ computers will retrieve pages mentioning iMac or iPad, while a search for ‘Apple’ fruit will pull a different set of results about gardening and growing apples.
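
To make this concrete, here is a minimal sketch of LSI-style retrieval using scikit-learn. The four-document corpus and the two ‘apple’ queries are invented for illustration; real systems index millions of pages, but the mechanics – TF-IDF vectors reduced to a low-dimensional “concept” space with truncated SVD, then ranked by cosine similarity – are the same idea in miniature.

```python
# A minimal LSI sketch: map documents and queries into a shared
# low-dimensional "concept" space, then rank by cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus: two "Apple the company" pages, two apple-growing pages.
docs = [
    "apple imac and ipad computers for creative professionals",
    "buy the new apple ipad tablet computer online",
    "growing apple trees in your garden for sweet fruit",
    "apple orchard gardening tips for a healthy fruit harvest",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)

# Truncated SVD over the term-document matrix is the core of LSI:
# it groups terms that co-occur (ipad/computer vs. garden/fruit).
lsi = TruncatedSVD(n_components=2, random_state=0)
doc_concepts = lsi.fit_transform(doc_vectors)

for query in ["apple computers", "apple fruit"]:
    q = lsi.transform(vectorizer.transform([query]))
    scores = cosine_similarity(q, doc_concepts)[0]
    best = scores.argmax()
    print(f"{query!r} -> doc {best}: {docs[best]}")
```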

The “knowledge graph” is made up of collated information that will help search services like Google deliver more than just a list of 10 websites, and provide contextual information that solves users’ problems better (even when those problems are not explicitly voiced by the user)!

The implications are clear. Keywords are open to being manipulated. User intent cannot be gamed so easily.

To survive a search engine war fought on the battlefield of semantic search, your business must deeply understand the psychology of your collective market, and then provide specific and meaningful answers to their problems, doubts and insecurities in the form of optimized Web pages that are simultaneously designed to rank well… and also fit into the bigger context of your overall business goals.

At first glance, this seems a daunting challenge. But it’s really straightforward if you proceed with a rational plan rooted in strategy, founded on information architecture principles and framed upon a solid website structure.

Before we explore these elements in greater depth, I’d like to make something clear.

This Is Not A Fight Between Designers & SEO Experts!
Traditionally, these two camps have been at loggerheads. Designers feel SEO ruins their carefully planned look and feel. SEO hotshots complain that higher ranking is sacrificed on the altar of a prettier website.

Yes, it is possible for a design-obsessed structure to wreak havoc with a site’s SEO. It’s also possible for a website driven entirely by SEO to destroy a brand or ruin sales potential. With planning and high quality implementation, the strengths of both specialties can be harnessed to offer a business incredible synergy.

Exploring how this happy union can be achieved is the goal of this report.

Today, any successful website needs:

•SEO (to drive relevant, quality traffic that is looking to buy),
•usability (to manage and convert these visitors into paying customers), and
•the ability to synergize both to work in concert, building your brand and growing your business.
Information Architecture & Getting Inside Your Prospect’s Mind
Too often, businesses structure their corporate website around the business’s internal organization. This is usually out of sync with clients’ needs, and it costs the business money.

Your ideal prospect visits your website to see if you’ll help find solutions to her problems – not to read a self-serving brochure about your business.

Keeping this in mind, your information architecture must be based on the best ways to serve your visitor, based on an intimate understanding of ‘user logic’.

Let’s take a hypothetical case of a young couple planning a holiday to Norway. She looks at him and says, “Let’s stay at this hotel in Oslo, honey!”

And with that initial spark of desire, the journey of online exploration begins. They type the name of a hotel (or maybe just “Oslo hotel”) into Google and click the Search button.

Will they find your hotel’s website ranked on the front page?

Findability is only the first step. The title and description of your listing must answer their specific question – “Where should we stay on our trip to Oslo?” If you win the ‘click’, it delivers a prospective guest to your hotel’s website.

Now on your landing page, the couple wants more information. About their stay. About your facilities. Your pricing. Room availability. Tourism assistance. And more.

If your landing page copy and content match their desire for knowledge and satisfy their needs, you’ll create trust and boost your chance of getting a sale.

This logical sequence – desire, findability, information, trust – is more or less constant across industries and niches. In one form or another, it exists in your field too. And your business website must match the flow, tap into the conversation that’s going on inside your prospect’s head, and join it to engage, inform, entertain and convince.

Before getting into the nitty-gritty of content hierarchy and website structure that will help create this trusting relationship with prospects, I’ll take a step back to address another overlooked facet of the strategic SEO process.

Internal Link Structure & Information Architecture
Think about information architecture in the same light as planning and building a house. You would draw up a blueprint, then lay a firm foundation, construct a framework, and only then add the layers that turn the scaffolding into a full-fledged building.

Constructing an SEO optimized website that is strategically designed to fulfill the business goals of your enterprise follows essentially the same process.

When done correctly, a website’s information architecture offers context for your content, presents it in a manner that is friendly to search engine spiders yet easy for human visitors to navigate, and is ideally set up so that any section is reachable in 3 clicks – or fewer.
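
The “3 clicks” idea is easy to verify mechanically. Here is a minimal sketch – with an invented link map standing in for a real crawl of your site – that treats the site as a directed graph of internal links and uses breadth-first search to flag pages more than three clicks from the home page.

```python
# Minimal 3-click-rule check: breadth-first search over the
# internal link graph, measuring click depth from the home page.
from collections import deque

# Hypothetical internal link map: page -> pages it links to.
site = {
    "/": ["/shoes/", "/belts/"],
    "/shoes/": ["/shoes/red/", "/shoes/black/"],
    "/shoes/red/": ["/shoes/red/pumps/"],
    "/shoes/red/pumps/": [],
    "/shoes/black/": [],
    "/belts/": ["/belts/red/"],
    "/belts/red/": [],
}

def click_depths(links, home="/"):
    """Return the minimum number of clicks from `home` to each page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for page, depth in sorted(click_depths(site).items(), key=lambda kv: kv[1]):
    warning = "  <-- deeper than 3 clicks" if depth > 3 else ""
    print(f"{depth} clicks: {page}{warning}")
```

Run against a real crawl, any page flagged by this check is a candidate for better internal linking.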

The Myth Of “Home Page Focus”
A simple, logical website structure (like the one I’ve described above), based upon the user intent behind search keyword phrases, will turn every category, sub-category and topic page into a “home page”. This is awesome, because:

•Your visitor will click fewer links (remember the 3 click rule?) to reach other sections of your website – something every usability expert and designer intuitively values, and website owners must consider seriously since it impacts the way Web search works.
•You have less need for ongoing SEO to improve and/or defend rankings, and can focus it instead on growing your business with scalable solutions that last longer.
•You’ll become more authoritative on each level of your URL structure, as new topic pages added into your silo will bring additional value to the pages higher up in the hierarchy because of your strategic internal linking.
•You’ll have the freedom to sculpt PageRank and pass link value to handpicked relevant pages or topics outside the silo. For example, if you sell red shoes, you could link to related items like red belts (which may reside in another silo) and achieve higher sales conversions.
•You can control and direct the way search engine spiders and Web crawlers find, interpret and understand your URLs before indexing them.
•The strategic use of navigational breadcrumb links lets users zoom in for a close-up, or zoom out for broader context (see the breadcrumb sketch below).
•Such logical structuring is far less vulnerable to future algorithm changes and shifts.
•Each level in the URL structure hierarchy becomes “almost a business or niche” in itself. Visitors get a great first impression about your business when they land on such a page, and will view your site as a place to go when they need help, knowing they’ll be able to easily find other related choices to select from. This boosts your image and builds your brand.
•It is easier to get links from other niche blogs, forums and social networks. External links pointing to a sub-category page bring link value, leading crawlers to your site from relevant ‘authority’ sites that might have already established trust. If you woke up one morning and search engines no longer existed, these sources of traffic would still be valuable.
Achieving the technical elements of SEO is easy, even with free platforms like Magento and WordPress. Combining the elements of SEO and design into a single coherent strategy is what increases sales. A silo structure for Web content is not about keyword stuffing; it has nothing to do with spamming, and your intention in siloing your content shouldn’t just be to get more traffic. Your SEO goal is ultimately to maximize your business and profits.
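
As promised above, here is a minimal sketch of breadcrumbs derived directly from a silo-style URL path. The segment-to-label map is hypothetical – a real site would pull page titles from its CMS – but it shows the principle: the URL hierarchy itself drives the navigation, so every page automatically links back up through its silo.

```python
# Minimal sketch: derive a breadcrumb trail from a silo-style URL,
# so each level of the hierarchy links back up to its parent.

# Hypothetical mapping from URL segments to human-readable labels;
# a real site would look these up in its CMS.
LABELS = {"shoes": "Shoes", "red": "Red Shoes", "pumps": "Red Pumps"}

def breadcrumbs(path):
    """Turn '/shoes/red/pumps/' into a list of (label, url) pairs."""
    trail = [("Home", "/")]
    url = ""
    for segment in path.strip("/").split("/"):
        url += "/" + segment
        trail.append((LABELS.get(segment, segment.title()), url + "/"))
    return trail

for label, url in breadcrumbs("/shoes/red/pumps/"):
    print(f"{label} -> {url}")
# Home -> /
# Shoes -> /shoes/
# Red Shoes -> /shoes/red/
# Red Pumps -> /shoes/red/pumps/
```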

Layer On Design – But Only At The End!
With the framework of your content website solidly in place, and a silo layout combined with good URL structure defined in consultation with an SEO specialist, you can now team up with a usability expert and a good designer to build a user-friendly, information-rich, self-sustaining website.

•Your site will now become the best salesperson in your organization, working day and night to generate leads and close sales, while serving as a brand manager too.
•The silo structure upon which it is based will order your content so that users can easily find what they are looking for, just as they would locate books in a library. This brings order out of chaos.
•Each time you add fresh content or a new product to your catalog or store, the carefully planned URL structure will build internal links site-wide to other pages in the category, and up one level in the silo.
•Your information architecture will ensure that link value is passed along effectively and will maximize crawlability by search engine spiders.
•You won’t be stuck with time-consuming SEO efforts on an ongoing basis. All new content added to the site automatically fits into its optimized structure, resulting in “auto-pilot SEO” as you enjoy content growth.
•Your website structure and layout will help search engines define context and theme on a very granular level.
But this happy result requires a preparatory SEO strategy because, if not done correctly, it can land you in a nightmare of duplicate content issues. It is not something you can splash on top at the end, like chocolate syrup on an ice-cream sundae! You must take these steps well ahead of the site-building effort so that everything works together in synergy to explode the impact on your business.

5 Reasons to Diversify Your Search Strategy with PPC Advertising

By Elisa Gabbert
July 18, 2012

Yesterday we published the results of a study showing how sponsored advertisements on Google (PPC ads) are taking over territory previously reserved for organic listings, AKA “free clicks.” This is both good news and bad news for marketers. On the plus side, Google continues to roll out more and better types of search advertising to help marketers target their customers. On the negative side, you (obviously) have to pay for those clicks.

But the fact is, organic clicks aren’t really “free” either – gone are the days when it was relatively easy to rank on the first page in Google for your target keywords. Given the increasing costs and complications involved with SEO, it’s important to diversify your marketing channels. You can’t rely on organic search alone for traffic and leads – you never know when the next big algorithm update is going to murder your rankings.

Here are five reasons to shift some of the time and budget you spend on SEO to PPC.

#1: For Commercial Queries, Paid Clicks Outnumber Organic Clicks By Nearly 2 to 1

Organic clicks still account for more clicks overall in the search results – but different types of keywords have different value to businesses. For search queries that show high commercial intent – i.e., they indicate that the person searching wants to purchase something – more and more of the page (85% of above-the-fold pixels!) is devoted to sponsored listings. The organic results for transactional keywords like “best email software” or “waterproof digital camera” are mostly pushed below the fold. The top 3 ad spots for a commercial query take 41% of the clicks, and Product Listing Ads take another 20%. Overall, sponsored results account for 65% of clicks on these keywords, compared to 35% for organic results.

#2: Google’s Sponsored Ad Formats Keep Getting Better

You have minimal control over how your organic search listings appear in Google. (For example, they’ve recently started applying new titles, when they think they can serve up a better one than the title you put on the page.) But you have lots of attractive choices when it comes to ad types. Here are just a few of the ad options that Google now offers:

Mega Site Links: This huge ad format offers up to 10 additional places to click, greatly increasing your chances of presenting a relevant link.

Remarketing: Remarketing or retargeting allows you to track site visitors with a cookie and chase them around the Web, displaying relevant banner ads until they click and convert.

Social Ad Extensions: With social extensions you can display who has +1’d your site, lending credibility and potential name recognition – it also makes your ad look less like an ad (see below).

#3: About Half Your Audience Can’t Tell the Difference Between Paid and Organic Search

A lot of people think that “nobody clicks on Google ads.” And it’s true that eye tracking studies suggest most people ignore the sponsored ads in the right column. However, one study showed that about half of people don’t recognize the ads above the search results as ads – in other words, they couldn’t tell the difference between the organic and paid results.

Top ads get clicked first whether paid or organic

If users don’t know your ad is an ad, they can’t be suspicious of its intent – and why should they be, if it gives them what they want? Secure one of those coveted positions above the organic results for a commercial query, and you’ll take the lion’s share of clicks without sacrificing users’ trust.

#4: SEO Is a Full-Time Job – Or Several Full-Time Jobs

As the number of sites competing for rankings has sky-rocketed, Google’s algorithms have gotten more and more complex, and it’s become much harder to achieve – and maintain – high rankings in the organic results. Where in the past businesses could get away with hiring a single SEO point person (usually a pretty junior position), now it often requires a full team to develop and execute on an SEO strategy (a content writer, a link builder, etc.). We believe that PPC – once your campaigns are set up and running – requires significantly less time to manage. According to Perry Marshall, author of The Ultimate Guide to Google AdWords, “if you focus on the areas that bring the most traffic, I find that once you find a rhythm, you can really do this with a few minutes a day, at most a few hours a week, and that’s with a large campaign with a $10,000+ spend per month.”

#5: Algorithm Updates Don’t Affect Your PPC

Google’s rolling algorithm updates ensure that SEO gets harder and more confusing over time. The Panda and Penguin updates in particular have targeted the kinds of “optimizations” that tended to work for site owners and marketers in the past. The only way to find out if Google thinks your SEO techniques are over the line (AKA “over-optimization”) is to take a hit on rankings, and then scramble to figure out – and fix – what you’ve been doing wrong. Google does suspend AdWords accounts on occasion, sometimes without clear reason, but in PPC you’re much less likely to experience major flux or drop-offs in rankings and traffic due to changes on Google’s end.

These are all good reasons to re-allocate some of your marketing budget to PPC, if you’ve been depending on SEO for traffic and lead generation. We would never advocate giving up on SEO – you won’t hear us saying “SEO is dead” anytime soon. But strive for a balance between your search marketing channels, and you can minimize the damage incurred as SEO gets incrementally harder.

Another step to reward high-quality sites

(Cross-posted on the Webmaster Central Blog)

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition:

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.

Posted by Matt Cutts, Distinguished Engineer

Smartphones Outsell PCs in 2011

It’s finally happened – and now we have the data to confirm what we’ve long suspected intuitively. The year 2011 marked the first time more smartphones than PCs were sold.

That’s the word from market research firm Canalys, which just released figures that show that total annual global shipments of smartphones exceeded those of client PCs (including pads) for the first time.

Vendors shipped close to 489 million smartphones in 2011, compared to 415 million PCs. Smartphone shipments increased by 63% over the previous year, compared to 15% growth in PC shipments.

Canalys includes pad or tablet computers in its PC category calculation, and this was the growth area in PCs. Pad shipments grew by 274% over the past year and accounted for 15% of all client PC shipments. Desktop shipments grew by only 2% over the past year, and notebooks by 7%.

Googler Tips on Building a Better Post-Panda Site

In recent months the Google team has been especially focused on helping people find high-quality sites in Google’s search results. The “Panda” algorithm change has improved rankings for a large number of high-quality websites, so most of you reading have nothing to be concerned about. However, for the sites that may have been affected by Panda, we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what you think Google’s current ranking algorithms or signals are. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?
Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content. The recent “Panda” change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the “quality” of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren’t disclosing the actual ranking signals used in our algorithms because we don’t want folks to game our search results; but if you want to step into Google’s mindset, the questions below provide some guidance on how we’ve been looking at the issue:

•Would you trust the information presented in this article?
•Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
•Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
•Would you be comfortable giving your credit card information to this site?
•Does this article have spelling, stylistic, or factual errors?
•Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
•Does the article provide original content or information, original reporting, original research, or original analysis?
•Does the page provide substantial value when compared to other pages in search results?
•How much quality control is done on content?
•Does the article describe both sides of a story?
•Is the site a recognized authority on its topic?
•Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
•Was the article edited well, or does it appear sloppy or hastily produced?
•For a health related query, would you trust information from this site?
•Would you recognize this site as an authoritative source when mentioned by name?
•Does this article provide a complete or comprehensive description of the topic?
•Does this article contain insightful analysis or interesting information that is beyond obvious?
•Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
•Does this article have an excessive amount of ads that distract from or interfere with the main content?
•Would you expect to see this article in a printed magazine, encyclopedia or book?
•Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
•Are the pages produced with great care and attention to detail vs. less attention to detail?
•Would users complain when they see pages from this site?
Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do
We’ve been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you’ve been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Written by Amit Singhal, Google Fellow

Link Building: New Dimensions

Local Leads Online Marketing algo experts

Local leads Online Marketing Algorithm Experts

Much has been said about the importance of a natural backlink profile since Google released the Penguin algorithm update. Is it possible to actively build links and still have a natural backlink profile?

Real link building is more important than ever

The importance of link building hasn’t decreased with Google’s Penguin update. Actually, link building has become more important than before. The difference is that link spamming doesn’t work anymore.

Blasting your link out to thousands of sites and relying on fully automated backlink networks have lost their power. Given that these links were never useful to real Internet users, it’s remarkable that Google took so long to devalue that type of backlink.

Link building is not about manipulating search rankings (surprise)

Higher search engine rankings are a natural by-product if you build links correctly. The main purpose of link building (and SEO in general) is to get targeted visitors to your website.

Backlinks can help you with the following:

  1. Backlinks can increase the search engine rankings of your site
  2. Backlinks can help you get targeted visitors
  3. Backlinks can help your company’s reputation

If you focus your link building activities on items 2 and 3, then item 1 will come naturally.

Link building can lead to a natural backlink profile

The reason why some people think that link building leads to an unnatural backlink profile is that they confuse link building with link spamming. SEO is not spamming and the whole SEO industry has suffered a lot from people using the term ‘SEO’ to sell their spamming products.

The key to a natural backlink profile is a simple question: “Could the link exist without asking for it?” Many people don’t link to your site simply because they don’t know about it, or because they don’t have the time to research links on a topic.

Help the people who would naturally link to your site by informing them about link-worthy content on your site that is related to theirs. Asking the right people for the right reason often produces much better results than the scattershot approach used by many companies.

For the foreseeable future, good backlinks will remain one of the leading factors in Google’s ranking algorithms.

Last Big Panda Roll-Out: A Definitive Exposition

2/27/12

In February 2012 we have many improvements to celebrate. With 40 changes reported, that marks a new record for our monthly series on search quality. Most of the updates rolled out earlier this month, and a handful are actually rolling out today and tomorrow. We continue to improve many of our systems, including related searches, sitelinks, autocomplete, UI elements, indexing, synonyms, SafeSearch and more. Each individual change is subtle and important, and over time they add up to a radically improved search engine.

Here’s the list for February:

•More coverage for related searches. [launch codename “Fuzhou”] This launch brings in a new data source to help generate the “Searches related to” section, increasing coverage significantly so the feature will appear for more queries. This section contains search queries that can help you refine what you’re searching for.
•Tweak to categorizer for expanded sitelinks. [launch codename “Snippy”, project codename “Megasitelinks”] This improvement adjusts a signal we use to try and identify duplicate snippets. We were applying a categorizer that wasn’t performing well for our expanded sitelinks, so we’ve stopped applying the categorizer in those cases. The result is more relevant sitelinks.
•Less duplication in expanded sitelinks. [launch codename “thanksgiving”, project codename “Megasitelinks”] We’ve adjusted signals to reduce duplication in the snippets for expanded sitelinks. Now we generate relevant snippets based more on the page content and less on the query.
•More consistent thumbnail sizes on results page. We’ve adjusted the thumbnail size for most image content appearing on the results page, providing a more consistent experience across result types, and also across mobile and tablet. The new sizes apply to rich snippet results for recipes and applications, movie posters, shopping results, book results, news results and more.
•More locally relevant predictions in YouTube. [project codename “Suggest”] We’ve improved the ranking for predictions in YouTube to provide more locally relevant queries. For example, for the query [lady gaga in ] performed on the US version of YouTube, we might predict [lady gaga in times square], but for the same search performed on the Indian version of YouTube, we might predict [lady gaga in India].
•More accurate detection of official pages. [launch codename “WRE”] We’ve made an adjustment to how we detect official pages to make more accurate identifications. The result is that many pages that were previously misidentified as official will no longer be.
•Refreshed per-URL country information. [Launch codename “longdew”, project codename “country-id data refresh”] We updated the country associations for URLs to use more recent data.
•Expand the size of our images index in Universal Search. [launch codename “terra”, project codename “Images Universal”] We launched a change to expand the corpus of results for which we show images in Universal Search. This is especially helpful to give more relevant images on a larger set of searches.
•Minor tuning of autocomplete policy algorithms. [project codename “Suggest”] We have a narrow set of policies for autocomplete for offensive and inappropriate terms. This improvement continues to refine the algorithms we use to implement these policies.
•“Site:” query update [launch codename “Semicolon”, project codename “Dice”] This change improves the ranking for queries using the “site:” operator by increasing the diversity of results.
•Improved detection for SafeSearch in Image Search. [launch codename “Michandro”, project codename “SafeSearch”] This change improves our signals for detecting adult content in Image Search, aligning the signals more closely with the signals we use for our other search results.
•Interval based history tracking for indexing. [project codename “Intervals”] This improvement changes the signals we use in document tracking algorithms.
•Improvements to foreign language synonyms. [launch codename “floating context synonyms”, project codename “Synonyms”] This change applies an improvement we previously launched for English to all other languages. The net impact is that you’ll more often find relevant pages that include synonyms for your query terms.
•Disabling two old fresh query classifiers. [launch codename “Mango”, project codename “Freshness”] As search evolves and new signals and classifiers are applied to rank search results, sometimes old algorithms get outdated. This improvement disables two old classifiers related to query freshness.
•More organized search results for Google Korea. [launch codename “smoothieking”, project codename “Sokoban4”] This significant improvement to search in Korea better organizes the search results into sections for news, blogs and homepages.
•Fresher images. [launch codename “tumeric”] We’ve adjusted our signals for surfacing fresh images. Now we can more often surface fresh images when they appear on the web.
•Update to the Google bar. [project codename “Kennedy”] We continue to iterate in our efforts to deliver a beautifully simple experience across Google products, and as part of that this month we made further adjustments to the Google bar. The biggest change is that we’ve replaced the drop-down Google menu in the November redesign with a consistent and expanded set of links running across the top of the page.
•Adding three new languages to classifier related to error pages. [launch codename “PNI”, project codename “Soft404”] We have signals designed to detect crypto 404 pages (also known as “soft 404s”), pages that return valid text to a browser but where the text contains only error messages, such as “Page not found.” It’s rare that a user will be looking for such a page, so it’s important we be able to detect them. This change extends a particular classifier to Portuguese, Dutch and Italian. (A small sketch of the soft-404 idea appears after this list.)
•Improvements to travel-related searches. [launch codename “nesehorn”] We’ve made improvements to triggering for a variety of flight-related search queries. These changes improve the user experience for our Flight Search feature with users getting more accurate flight results.
•Data refresh for related searches signal. [launch codename “Chicago”, project codename “Related Search”] One of the many signals we look at to generate the “Searches related to” section is the queries users type in succession. If users very often search for [apple] right after [banana], that’s a sign the two might be related. This update refreshes the model we use to generate these refinements, leading to more relevant queries to try.
•International launch of shopping rich snippets. [project codename “rich snippets”] Shopping rich snippets help you more quickly identify which sites are likely to have the most relevant product for your needs, highlighting product prices, availability, ratings and review counts. This month we expanded shopping rich snippets globally (they were previously only available in the US, Japan and Germany).
•Improvements to Korean spelling. This launch improves spelling corrections when the user performs a Korean query in the wrong keyboard mode (also known as an “IME”, or input method editor). Specifically, this change helps users who mistakenly enter Hangul queries in Latin mode or vice-versa.
•Improvements to freshness. [launch codename “iotfreshweb”, project codename “Freshness”] We’ve applied new signals which help us surface fresh content in our results even more quickly than before.
•Web History in 20 new countries. With Web History, you can browse and search over your search history and webpages you’ve visited. You will also get personalized search results that are more relevant to you, based on what you’ve searched for and which sites you’ve visited in the past. In order to deliver more relevant and personalized search results, we’ve launched Web History in Malaysia, Pakistan, Philippines, Morocco, Belarus, Kazakhstan, Estonia, Kuwait, Iraq, Sri Lanka, Tunisia, Nigeria, Lebanon, Luxembourg, Bosnia and Herzegovina, Azerbaijan, Jamaica, Trinidad and Tobago, Republic of Moldova, and Ghana. Web History is turned on only for people who have a Google Account and previously enabled Web History.
•Improved snippets for video channels. Some search results are links to channels with many different videos, whether on mtv.com, Hulu or YouTube. We’ve had a feature for a while now that displays snippets for these results including direct links to the videos in the channel, and this improvement increases quality and expands coverage of these rich “decorated” snippets. We’ve also made some improvements to our backends used to generate the snippets.
•Improvements to ranking for local search results. [launch codename “Venice”] This change improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.
•Improvements to English spell correction. [launch codename “Kamehameha”] This change improves spelling correction quality in English, especially for rare queries, by making one of our scoring functions more accurate.
•Improvements to coverage of News Universal. [launch codename “final destination”] We’ve fixed a bug that caused News Universal results not to appear in cases when our testing indicates they’d be very useful.
•Consolidation of signals for spiking topics. [launch codename “news deserving score”, project codename “Freshness”] We use a number of signals to detect when a new topic is spiking in popularity. This change consolidates some of the signals so we can rely on signals we can compute in realtime, rather than signals that need to be processed offline. This eliminates redundancy in our systems and helps to ensure we can continue to detect spiking topics as quickly as possible.
•Better triggering for Turkish weather search feature. [launch codename “hava”] We’ve tuned the signals we use to decide when to present Turkish users with the weather search feature. The result is that we’re able to provide our users with the weather forecast right on the results page with more frequency and accuracy.
•Visual refresh to account settings page. We completed a visual refresh of the account settings page, making the page more consistent with the rest of our constantly evolving design.
•Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.
•Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.
•SafeSearch update. We have updated how we deal with adult content, making it more accurate and robust. Now, irrelevant adult content is less likely to show up for many queries.
•Spam update. In the process of investigating some potential spam, we found and fixed some weaknesses in our spam protections.
•Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.
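
As a footnote to the soft-404 item above: the general idea is easy to sketch from the outside. The heuristic below – a page that answers HTTP 200 but whose body is short and dominated by error phrases – is an invented illustration of what such a classifier looks for, not Google’s actual implementation.

```python
# Invented heuristic for spotting "soft 404" pages: the server
# returns HTTP 200, but the visible text is really an error message.
import re
import requests

ERROR_PHRASES = [
    "page not found", "404", "no longer available",
    "does not exist", "nothing found",
]

def looks_like_soft_404(url):
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        return False  # a real 404 is not a *soft* 404
    text = re.sub(r"<[^>]+>", " ", resp.text).lower()  # crude tag strip
    hits = sum(text.count(phrase) for phrase in ERROR_PHRASES)
    # A short page dominated by error phrases is probably a soft 404.
    return len(text.split()) < 200 and hits >= 2

print(looks_like_soft_404("https://example.com/some-missing-page"))
```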

Domain names are a crucial element for capturing clicks and conversions from search results

A new study from Microsoft Research confirms what most SEOs have known for years—that domain names are a crucial element for capturing clicks and conversions from search results. Unlike what’s been published in most search marketing forums, however, this research was not focused on SEO techniques or search engine ranking algorithms, but rather on observed searcher behavior, offering insights about how people actually respond to what’s presented to them in search results.

The results of this research present a good news/bad news scenario for search marketers. The good news: If you have a credible, trusted domain name, you’ve got an advantage, as searchers really do pay attention to the URL in search results before deciding to click. And this is true regardless of the position of the URL on a search result page.

The bad news, of course, is that it’s more difficult these days to acquire “credible” domains now that most single or even double word domains are in use or reserved. Add confounding factors such as personalization, Google changing its core algorithm more than 500 times a year, and the fact that most searchers don’t move beyond the first or second page of results and you’ve got a major headache for most SEOs.

Nonetheless, the study is worth a close read for anyone wanting to understand more about how to capture the attention and clicks of searchers, thanks to its wealth of data generated by observing real people and their search behavior. Probably the most significant conclusion from the study:

Surprisingly, we find that despite changes in the overall distribution of surfaced domains, there has not been a comparable shift in the distribution of clicked domains. Users seem to have learned the landscape of the internet and their click behavior has thus become more predictable over time.

In other words, even if search result rankings change due to factors like personalization or algorithmic tweaks, searchers don’t seem to care. They’re now demonstrating a clear preference for credibility and trustworthiness in a domain name over simple position on a search result page. This is the strongest evidence I’ve yet seen that an obsession with ranking is not only futile, it completely ignores the reality of how your site attracts users.

Key takeaway for bosses/clients: rank really doesn’t matter if you’ve got a quality (trustworthy) domain name.

The study also has merit for anyone doing paid search, and considering what display URL is most appropriate for an ad. While advertisers are always limited to a display URL that corresponds with a top-level domain, the additional keywords shown in the display URL may be crucial in getting searchers to click. Also, even if searchers don’t have favorable “domain bias” for your main site, it may be possible to secure another more favorably-perceived domain for your paid search campaigns that serves as a microsite that ultimately funnels searchers into your main domain.

The report is thick with math and numerous citations to related work, but it is well worth the effort for anyone involved in competitive search marketing.

Jill Whalen’s Top Ten Questions She’d Ask a New Client

Here’s a selection of some of the questions I ask and why they’re important to the overall SEO process:

1. What web analytics program do you use, and can we have access to it?

Web analytics are the key to measuring the current level of SEO success (or lack thereof). They’re also the key to determining whether any future SEO implementation is helping to bring more targeted traffic. Therefore, it’s critical for me to have access to this information regardless of the level of SEO service I’m providing. If you use Google Analytics (GoAn), it’s very simple to add new users to the account and in most cases it’s fine to provide report-only access (rather than admin). Along with GoAn, I also ask for access to the client’s Google Webmaster Tools (GWMT) account. These days, if you have GoAn access, you can usually add the same website to your GWMT account as well, which makes the process easier.

2. What’s the purpose of your site and who is your target audience?

This is a seemingly simple question, yet it often stumps many clients. Some of them will cop out: “Well, the purpose of our site is to sell our product.” And your target audience? “Umm … anyone with a credit card?” Not very helpful. If you don’t have a good handle on who the people are who are buying your products, how will your SEO consultant help you bring those people to your website? An SEO consultant needs to have a clear picture of who you are because everything we do hinges upon this — from the keyword research to deciding what type of content needs to be written, to how you might want to attack social media marketing. If you’re an SEO consultant, I urge you to push for deep answers to this question.

3. Are there any other domains or sites that you own or control, or that you used to use instead of the current domain? (Please list them all.)

This information is important so I can assess any duplicate content issues. I need to know whether that other site I found that is using nearly the same content as yours is owned by you, or if someone scraped yours. I also need to know if you’re using multiple domains as an SEO strategy (so I can smack you!). I added this one to my questionnaire when I kept finding doorway domains or other sites that my clients *forgot* to tell me about. Even those who really do forget or who purposely don’t tell me about their additional domains aren’t getting away with anything. I usually end up finding them during my website audit process. So if you’re a client, do us both a favor and come clean from the start. This will save us all some time down the line! (And I was just kidding about smacking you :)!)

4. What have you done so far (if anything) about optimizing your site?

My favorite answer to this is “nothing,” because that means we’re starting with a clean slate and have nowhere to go but up! But most clients these days have done at least some rudimentary SEO. While I can usually spot any on-page optimization, it’s helpful to hear it from you. Sometimes, the things clients say they’ve done (e.g., created keyword-rich Title tags) don’t actually seem to be done when I look for them. That tells me that your idea of SEO and mine may be quite different, and it’s good to know this up front. It’s also good to know if you have already been through a string of SEOs and what each of them did to the site during their tenure.

5. Is there anything that you may have done that the search engines may not have liked regarding previous optimization efforts for your site?

This one is sort of an addendum to the last one for those who may have *forgotten* to tell me any bad or spammy things they (or a previous SEO) may have done. While they may have not mentioned anything spammy in the last question, this gives them the opportunity to add anything that they weren’t quite sure was on the up-and-up. Very often, the client may think something was bad or caused problems, when it’s actually innocuous. Other times, there can be a big mess to sort out — e.g., all kinds of paid-for spammy-anchor-text links. As an SEO it’s helpful to know right away where to focus my efforts.

6. List the websites of your three biggest competitors. Why do you feel they compete with your site?

I like this question more for the second part than the first. It’s always interesting to see why people think another company or site is their competitor. Very often, the only reason is that the other site shows up in the search results for the keyword phrase that the client wants to rank for! While that may make it a competitor, it also may not. It may simply mean that you’re shooting for the wrong keyword phrases. It’s also very helpful to look at competitor sites to see how they’re set up and whether they seem to have done much in the way of SEO.

7. What do you feel is your most unique selling proposition (USP)? Why would these clients come to you as opposed to anyone else who offers the same or similar products and services? What’s different or better about your product or service?

Hat tip to Karon Thackston for these questions, because they are ones she always asks before doing any copywriting for a website. Along with who your target audience is, these are some of the most important questions for any client to think about and answer. Sometimes a client will have a great grasp of this and provide lots of valuable information, but more often, the best they can come up with is that they are “more friendly” than their competitors. In today’s competitive marketplace and search results (especially since Google’s Panda Update), it’s critical to be able to differentiate your products and services from the rest. And even those who have an excellent grasp of this don’t always make it clear to the users of their website, which is something that will need to be fixed.

8. After a potential customer visits your site, what specifically do you want them to do?

This is a wonderful way to understand what the various conversion points of your website are. If your only answer is “Make a sale,” then you likely need to add some other smaller conversion points, such as signing up for a newsletter or updates, following you on social media, filling out a contact form, calling you, etc. As an SEO you need to know what all of these points are so that you can make sure that the client’s web analytics are set up to correctly capture all the conversions, and that the website is properly leading people to complete those conversions.

9. Do you have social media accounts (e.g., Twitter, Facebook, Google+) and if so, what are your user names?

This is important to see if and how they’re using social media. If they’re not using it at all, as an SEO, you must determine whether they should be. If they are using it, a quick review of their accounts will show you exactly how they’re using it. For instance, you’d want to look at whether they are simply tweeting out links to their own content via an automated feed, or if they are also interacting with their audience. This will help you devise an appropriate social media marketing strategy for them down the line.

10. Is there anything else you may have that you think will provide a more complete picture of your site?

It’s always a good idea to have a final, open-ended question such as this in case the client forgot to tell you anything within their previous answers. You may learn all kinds of things that you would not have otherwise learned without asking this question.

Those are the most important ones that should get you started. While you can ask all these in person or on the phone, I find it extremely helpful to have it all in writing. It also provides the client with the opportunity to think about their answers and get additional input from others within the company, as necessary.

New Google Patent – Categories For Additional Exigent Keyword Rankings

Imagine that Google assigns categories to every webpage or website that it visits. You can see categories like those for sites in Google’s local search. Now imagine that Google has looked through how frequently certain keywords appear on the pages of those websites, how often those pages rank for certain query terms in search results, and user data associated with those pages.

One of my local supermarkets has a sushi bar, and they may even note that on their website, but the keyword phrase [sushi bar] is more often found on, and associated with, documents in the category “Japanese Restaurants” – based upon how often that phrase tends to show up on Japanese restaurant sites, and how frequently Japanese restaurant sites tend to show up in search results for that phrase.

Since Google can make a strong statistical association between the query [sushi bar] and documents that fall into the category “Japanese restaurants,” it’s possible that the search engine might boost pages categorized as “Japanese restaurants” in the results for a search on [sushi bar]. My supermarket’s sushi bar page might not get the same boost.

That’s something that a Google patent granted earlier this week tells us.

The patent presents this idea of creating categories for sites and associating keywords with those categories, boosting sites in the rankings when they are both relevant for those query terms and fall within those categories, within the context of local search. But the patent tells us that this process can be used in other kinds of searches as well.

Keywords associated with document categories
Invented by Tomoyuki Nanno, Michael Riley, and Gaku Ueda
Assigned to Google
US Patent 7,996,393
Granted August 9, 2011
Filed: September 28, 2007

Abstract

A system extracts a pair that includes a keyword candidate and information associated with a document from multiple documents, and calculates a frequency that the keyword candidate appears in search queries and a frequency that the pair appears in the multiple documents. The system also determines whether the keyword candidate is a keyword for a category based on the calculated frequencies, and associates the keyword with the document if the keyword candidate is the keyword for the category.
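
Reading between the lines of that abstract, the core computation is a pair of frequency counts. The toy sketch below – with invented pages, categories and a query – shows one plausible way the association between a keyword candidate and a category could be scored; the patent does not disclose Google’s actual scoring.

```python
# Toy sketch of the patent's idea: associate a keyword candidate with
# document categories by counting how often the (keyword, category)
# pair occurs, relative to the keyword's overall frequency.
from collections import Counter

# Invented corpus: (category, page text) pairs.
pages = [
    ("japanese_restaurant", "fresh sushi bar with sashimi and ramen"),
    ("japanese_restaurant", "our sushi bar serves nigiri daily"),
    ("japanese_restaurant", "tempura udon and a small sushi bar"),
    ("supermarket", "bakery deli pharmacy and a sushi bar inside"),
    ("supermarket", "weekly grocery deals on produce and dairy"),
]

def category_affinity(keyword):
    """Fraction of pages mentioning `keyword` that fall in each category."""
    per_category = Counter(cat for cat, text in pages if keyword in text)
    total = sum(per_category.values())
    return {cat: count / total for cat, count in per_category.items()}

# [sushi bar] is far more strongly associated with japanese_restaurant
# (0.75) than with supermarket (0.25), so pages in that category could
# be boosted for the query [sushi bar].
print(category_affinity("sushi bar"))
```
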
If you have access to Google’s Webmaster Tools for a website, the section on “Keywords” shows you the “most common keywords Google found when crawling your site,” along with a warning that those should “reflect the subject matter of your site.” Another section of Webmaster Tools shows the queries that your site receives visitors for, how many impressions and clickthroughs from search results that your pages receive, and an average ranking for your pages in those results. An additional section of the Google tools shows the anchor text most often used to link to your site.

If you were to take all of that information that Google provides for your site and try to guess at a category or categories that Google might assign to it, could you? It’s possible that Google is using that kind of information, and more, to determine how your site should be categorized. Of course, Google would also be looking at other sites for information such as the frequency of keywords used on their pages and the queries they are found for, both to create those categories and to see how well your site fits into one or more of them.

Of course, if you verify your business in Google Maps, you can enter categories for your business, but Google may suggest and include other categories as well. For instance, Google insists on including “Website Designer” as a category for my site even though that’s not a category that I’ve ever submitted to them.

And while this patent discusses how it might be applied to local search, it could just as easily be applied to Web search, and the patent provides a long list of category types that it might apply to websites, extending well beyond business types.