The Internet Has Changed the Sales Cycle: Buyers Have Evolved, and Acquiring Customers Has Changed

The advent of the internet and social media has changed how buyers shop. Prior to the internet’s explosive growth, when buyers needed a product they would turn to a local salesperson and request brochures and other information from which to make their decision. The sales team was the sole source of information for a purchase and was involved early in the process; decisions were made from the information that sales force provided. Today, buyers no longer need to engage with a sales force to get the information they need for the early stages of the buying cycle. They arrive at the point of purchase fully informed by online research, having learned about the seller’s products and services, product reputation, and history without ever talking to the sales force. They have compared comparable product offerings and pricing before ever seeking to make a purchase. They determine who has the best value proposition, and they now control the purchasing process.

In this monumental shift, control of the buying cycle has moved from the seller to the buyer. Buyers no longer need informed sales forces to guide them. In fact, if we engage a sales force with the buyer too early, we risk losing the sale entirely.

Success in the buying cycle is now all about marketing intimacy with your prospective demographic. Understanding your target audience, targeting your message, and talking to them about the product offering in the language they know and understand is more critical than ever. Focusing on the right channels and creating the right memorable communication experiences is the new closing modality that will resonate and convert.

(This new internet buyer is highly knowledgeable about the product offering and has thereby created a more challenging environment for marketers who seek to drive high-quality leads to their sales team. Generating quality leads is about being ever present: finding the right people at the right time.)

SEOs and Designers: The Ultimate Struggle to Dislike Each Other

SEOs are a unique bunch, in case you haven’t noticed. We have real OCD tendencies (and, for some of us, it’s full-blown OCD), we just know better than you (about everything), we are intensely, over-the-top goal-driven beings, we are scientists-in-a-vacuum, we are data-miners and data-hogs, we are behavioral futurists, and we make the web a better place for you and yours (for a price).

SEO and Designers Do Battle

As a caveat, the good ones are like that. It’s definitely hard being an SEO, as so eloquently pointed out by Portent’s own George Freitag in “Why Web Professionals Hate SEOs”. George, as a fellow SEO, you know that we are equal-opportunity haters of every other web professional too. They may hate us, but we hate them just as much. And, really, turnabout is fair play.

After all, if designers and developers had their way, the web would be nothing but a bunch of huge, sexy images with millions of nested DIVs blended with functionality so overwhelming and powerful that it would virtually alienate anyone over the age of 35. No one would ever use the web, but the artists would have their canvas, right?
As a writer myself (by education and training), I’m all for “Ars Gratia Artis”, but that’s successful with consumers about 0% of the time. And, I won’t make excuses for Crap Hat SEOs; they’ve read three blog posts and follow “only the best search professionals on Twitter”.

Unfortunately, it simply comes with the territory of a profession that has expanded exponentially over the last half-decade. Professional SEOs will always be lumped in with the Crap Hats and have to fight their battles and over-reaches.

The Brave New World Isn’t So Brave. Or New.
Google’s Panda and Penguin updates didn’t change the game. They just made the rules a lot more evident to those who thought there were no rules. The SEOs who knew the score long before Google had to put down a heavy hand have come through unscathed nearly two years later.

Designers and developers just kept on polluting the web with “beautiful, functional” brand sites. But when the Piper comes calling, and accounts are at stake, it is really strange how those same developers and designers are suddenly “eager” to work with SEOs.

SEO is the Thing
Unless you’re Coca-Cola or another one of the Fortune 50, unbranded search is how people are going to find you. Sorry, but them’s the facts.
So, while designers and developers have thousands of things to contemplate (how to make the brand logo “pop” or make a cart add products), SEOs are contemplating the strategy and schema, from content to information architecture to technical foundation to link graphs and the multitude of things each encapsulates.

Each and every one of those things matters in the quest for findability. Is there any sense in making a brand logo visible if no one is going to be there to see it? Or a cart that will make you a ham sandwich after you finish the purchase, if no one is there to use it? Instead of treating SEOs as a nuisance who are only trying to impede your progress, you might actually find that we can make your job easier in the end.

Jack of All, Master of None
The fact that I can’t write an entire application in JavaScript doesn’t mean that I don’t know enough to be dangerous. And it also doesn’t mean that I can’t see that the Charlie-Foxtrot of spaghetti code you’re writing is going to make my life a nightmare in two months.

The countless duplicate pages you create every single time a user applies your product filters; the way you wrapped all those reviews in an iframe instead of taking the extra hour or two to use the API; that time you mistyped a page name and just created a new one. Yeah, I found that too.
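
Spotting filter-driven duplicates doesn’t require deep engineering knowledge. As a rough illustration (in Python, with made-up parameter names), filter parameters that don’t change a page’s core content spawn duplicate URLs, and normalizing them shows how many variants collapse into one canonical page:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical filter-style parameters that don't change a page's core content.
FILTER_PARAMS = {"color", "size", "sort", "page_view"}

def canonical_url(url):
    """Strip filter-style query parameters so duplicate URL variants
    collapse to a single canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

variants = [
    "https://example.com/shoes?color=red&sort=price",
    "https://example.com/shoes?size=9",
    "https://example.com/shoes",
]
# All three variants normalize to the same canonical page.
print({canonical_url(u) for u in variants})
```

In practice, the same normalization logic is what a rel=canonical tag or parameter-handling rule expresses to search engines.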

Those things have a big effect on a site’s authority and trust. And don’t let the fact that I can’t create a CSS sprite on my own fool you into thinking I don’t understand cascading style sheets and how they work.

SEOs (the good ones) know enough to intervene when something looks like a disaster-in-the-making that will end up as a dumpster fire for search in two months. You may not care, but just remember, we’re the ones accountable for “success” at the end of the day. You know, the “hard metrics” clients need before renewing an engagement.
Here again, professional SEOs know exactly what we are asking for when we ask for it. Most of us actually hate asking you to do it, because we know how much time is involved and the rework we’re causing. But, in the end, we’re helping you build a better site for the consumer, and a site that gets found in the SERPs.

We Don’t Know The Entire Algorithm, But We Know Enough
My guess is that not even Matt Cutts is clued into the ENTIRE Google algorithm, and that guy is the head of Google’s webspam team and its (un)official SEO spokesman.
Being a jackass and pointing out that I don’t know the entire algorithm simply tells me you’re a defensive person (and possibly a lazy one too). We know enough of the algorithm’s factors, and whether each has a positive or negative effect on a website, that the changes we ask for have a purpose.

If not knowing the entire algorithm costs an SEO credibility in the eyes of a designer, then the same logic applies to a designer not having full knowledge of Nielsen’s heuristic evaluation. Moreover, I’ve never heard anyone say, “’Cause the algorithm, dude!” It just doesn’t happen. And if it does happen, point me in the direction of said person(s); we’ll have a chat.

SEOs will fully explain these things to you if you have the time. Seriously. Ask a question about why or how something is going to affect the website or search engine indexation. Any SEO (the good ones) will be more than happy to walk you through why it matters and why it should be the way we need it.

How Designers, Developers, and Copywriters Can Ease the Pain
Short of locking you in a room and playing professor for a few days straight, the best way, as George points out, is to communicate. That word makes it sound easy and effortless, but it’s anything but. When you’re under deadlines and shuffling between projects, ain’t nobody got time for that.

The real key is to sit down next to that SEO (that’s right, put down the email and step away) and let them walk you through it. Sure, it’s 10-20 minutes out of your day, but when you can see it from their point of view as they walk through your code or design, it breaks down barriers that email can’t. It’s that human element. It may not solve the issue, but it will help both parties gain a healthy respect for one another.

And that’s the first step to compromise. SEOs, as much as we know we need it our way because it’s right, we have to be willing to step down off the soapbox. You have to be willing to concede some things in order to get the bigger win for search. You may not be able to get that direct 301 you need on duplicate content, but with communication, explanation, and compromise, you can convince that developer to work in a rel=canonical.
SEOs need to be able to compromise, just as developers, designers, and copywriters need to learn to compromise. Because at the end of the day, it’s about creating wins for the client, not for our own egos.

Written by Anthony Verre

It takes at least 10,000 hours (roughly 5 years) of dedicated practice in a given field or area of expertise for a person to become truly “expert”.

Having years (and thousands of hours) of dedicated focus and practice within a specific niche is obviously highly valuable and gives a person a unique, proprietary perspective on that niche (and usually highly valuable expertise). But how many hours is “enough” to achieve expert status? One take on the subject that we (and many others) found interesting is Malcolm Gladwell’s in his book Outliers, where he popularized the theory that 10,000 hours of dedicated practice in a given field or area of expertise allows a person to become truly “expert”. At 40 hours a week, 50 weeks a year, that works out to about five years of full-time focus.

The 10,000-hour rule: hours of practice needed to claim expertise

Google’s Latest Patent: An Algorithm to Trap Spammers, and Maybe White Hat SEOs Too

Google’s Webmaster Guidelines highlight a number of practices that the search engine warns against: things someone might engage in to try to boost their rankings in ways intended to mislead the search engine. The guidelines start with the following warning:

Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google’s partner sites.

A Google patent granted this week describes a few ways in which the search engine might respond when it believes there’s a possibility that such practices might be taking place on a page, where they might lead to the rankings of pages being improved in those search results. The following image from the patent shows how search results might be reordered based upon such rank modifying spam:

Google Rank Modifying Spam Chart of Ranking Changes

Those practices, referred to in the patent as “rank-modifying spamming techniques,” may involve techniques such as:

• Keyword stuffing,
• Invisible text,
• Tiny text,
• Page redirects,
• Meta tags stuffing, and
• Link-based manipulation.

While the patent contains definitions of these practices, I’d recommend reading the definitions for those quality guidelines over on the Google help pages, which go into much more detail. What’s really interesting about this patent isn’t that Google is taking steps to try to keep people from manipulating search results, but rather the possible steps it might take while doing so.

The patent is:

Ranking documents
Invented by Ross Koningstein
Assigned to Google
US Patent 8,244,722
Granted August 14, 2012
Filed: January 5, 2010

Abstract

A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.

When Google believes that such techniques are being applied to a page, it might respond to them in ways that the person engaging in spamming might not expect. Rather than outright increasing the rankings of those pages, or removing them from search results, Google might respond with what the patent refers to as a time-based “rank transition function.”

The rank transition function provides confusing indications of the impact on rank in response to rank-modifying spamming activities. The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.

Let’s imagine that you have a page in Google’s index, and you work to improve the quality of the content on that page and acquire a number of links to it, and those activities cause the page to improve in rankings for certain query terms. The ranking of that page before the changes would be referred to as the “old rank,” and the ranking afterward is referred to as the “target rank.” Your changes might be the result of legitimate modifications to your page. A page where techniques like keyword stuffing or hidden text has been applied might also potentially climb in rankings as well, with an old rank and a higher target rank.

The rank transition function I referred to above may create a “transition rank” involving the old rank and the target rank for a page.

During the transition from the old rank to the target rank, the transition rank might cause:

• a time-based delay response,
• a negative response,
• a random response, and/or
• an unexpected response.

For example, rather than immediately raising the rank of a page when there have been modifications to it and/or to the links pointing to it, Google might wait a while and even cause the page’s rankings to decline initially before they rise. Or the page might increase in rankings initially, but on a much smaller scale than the person making the changes expected.
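
As a rough sketch of the idea (not Google’s actual implementation; every name and number below is invented for illustration), a time-based transition function might hold the old rank during a delay window, then drift toward the target rank with random jitter:

```python
import random

def transition_rank(old_rank, target_rank, t, delay=0.3, noise=2, rng=None):
    """Hypothetical sketch of a time-based rank transition function.

    `t` runs from 0.0 (change detected) to 1.0 (transition complete).
    Before `delay` has elapsed, the old rank is simply held (the patent's
    time-based delay response); afterwards the rank drifts toward the
    target with random jitter, which can briefly push it the *wrong*
    way, i.e. the negative or unexpected response that may bait a
    spammer into reverting or doubling down on their changes."""
    rng = rng or random.Random()
    if t >= 1.0:
        return target_rank              # transition complete: settle at target
    if t < delay:
        return old_rank                 # delay window: nothing visible yet
    progress = (t - delay) / (1.0 - delay)
    interpolated = old_rank + (target_rank - old_rank) * progress
    return max(1, round(interpolated + rng.uniform(-noise, noise)))
```

Calling this over several time steps produces exactly the confusing signal the patent describes: nothing at first, then noisy movement that only eventually settles at the target rank.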

The search engine may monitor the changes to that page and to links pointing to the page to see what type of response there is to that unusual activity. For instance, if someone stuffs a page full of keywords, instead of the page improving in rankings for certain queries, it might instead drop in rankings. If the person responsible for the page then comes along and removes those extra keywords, it’s an indication that some kind of rank modifying spamming was going on.

So why use these types of transition functions?

For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero.

Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
The rank transition function might impact one specific document, or it might have a broader impact over “the server on which the document is hosted, or a set of documents that share a similar trait (e.g., the same author (e.g., a signature in the document), design elements (e.g., layout, images, etc.), etc.)”

If someone sees a small gain from keyword stuffing or some other activity that goes against Google’s guidelines, they might engage in similar additional changes to a site, such as adding more keywords or hidden text. If they see a decrease, they might make other changes, including reverting a page to its original form.

If there’s a suspicion that spamming might be going on, but not enough evidence to positively identify it, the page involved might be subjected to fluctuations and extreme changes in ranking to try to get a spammer to attempt some kind of corrective action. If that corrective action helps in a spam determination, then the page, “site, domain, and/or contributing links” might be designated as spam.

How Website Structure & Information Architecture Should Mirror Your Business Goals

Thomas is the CEO of a major corporation. He had supervised a recent website redesign project and loved the snazzy new look, with bells and whistles created by a talented graphic designer – but he was calling me to help with a problem.

His beautiful new website wasn’t getting many visitors!

“Why don’t people want to visit our lovely website?” Thomas wailed, genuinely puzzled that the results of his intensive efforts weren’t as rosy as he had expected. As a strategic SEO consultant, I found the reasons glaringly obvious… but I had to soften the impact and gently explain what went wrong.

Together, we quickly checked the site’s ranking on Google for his top 50 keywords. They weren’t anywhere in the top 10 results. Or even 20.

You see, the not-so-apparent reason for the ‘failed’ website was the lack of something essential both for higher search engine rankings and for enhancing the visitor experience that converts a prospect into a customer.

What’s that, you ask?

Thomas’s new website, though visually appealing and technology-rich, was sorely lacking a well-planned information architecture and website structure.

But what is “information architecture”? And how does “website structure” differ from design?

A formal definition of “information architecture” would likely put you to sleep! So let’s simply call it the art of organizing and labeling website content, and bringing design and architecture principles to bear on it.

To understand this better, we’ll look at the skeleton of a website, shorn of flesh and skin, stripped down to the basic fundamentals of what shapes and strengthens it – from within.

Basic Concepts Of Information Architecture
In medical school, trainees begin by learning about human anatomy. Knowing what makes up the body helps understand (and later treat) diseases that affect it.

At the heart of understanding website structure, and planning your strategy for information architecture, lies a need to know about terms like semantic search, latent semantic indexing, knowledge graph, and SEO automation.

Semantic search is an attempt to improve search accuracy by predicting the intent of a searcher. The shift from blindly matching keywords typed into a search box against a massive database, to a more “intelligent” form of search that attempts to understand what those words actually mean to the user, has serious implications on strategic SEO for many business owners.

Latent Semantic Indexing is an indexing and retrieval method that was designed to identify patterns in the relationship between terms and concepts within any text.

By providing unique context for each search term or phrase, it ensures that a search for ‘Apple’ computers will retrieve pages mentioning the iMac or iPad, while a search for ‘Apple’ fruit will pull a different set of results about gardening and growing apples.

The “knowledge graph” is made up of collated information that will help search services like Google deliver more than just a list of 10 websites, and provide contextual information that solves users’ problems better (even when those problems are not explicitly voiced by the user)!

The implications are clear. Keywords are open to being manipulated. User intent cannot be gamed so easily.

To survive a search engine war fought on the battlefield of semantic search, your business must deeply understand the psychology of your collective market, and then provide specific and meaningful answers to their problems, doubts and insecurities in the form of optimized Web pages that are simultaneously designed to rank well… and also fit into the bigger context of your overall business goals.

At first glance, this seems a daunting challenge. But it’s really straightforward if you proceed with a rational plan rooted in strategy, founded on information architecture principles and framed upon a solid website structure.

Before we explore these elements in greater depth, I’d like to make something clear.

This Is Not A Fight Between Designers & SEO Experts!
Traditionally, these two camps have been at loggerheads. Designers feel SEO ruins their carefully planned look and feel. SEO hotshots complain that higher ranking is sacrificed on the altar of a prettier website.

Yes, it is possible for a design-obsessed structure to wreak havoc with a site’s SEO. It’s also possible for a website driven entirely by SEO to destroy a brand or ruin sales potential. With planning and high quality implementation, the strengths of both specialties can be harnessed to offer a business incredible synergy.

Exploring how this happy union can be achieved is the goal of this report.

Today, any successful website needs:

• SEO (to drive relevant, quality traffic that is looking to buy),
• usability (to manage and convert these visitors into paying customers), and
• the ability to synergize both to work in concert, building your brand and growing your business.

Information Architecture & Getting Inside Your Prospect’s Mind
Too often, businesses structure their corporate website based upon the business’ organization. This is often out of sync with a client’s needs, causing the business to lose money.

Your ideal prospect visits your website to see if you’ll help find solutions to her problems – not to read a self-serving brochure about your business.

Keeping this in mind, your information architecture must be based on the best ways to serve your visitor, based on an intimate understanding of ‘user logic’.

Let’s take a hypothetical case of a young couple planning a holiday to Norway. She looks at him and says, “Let’s stay at this hotel in Oslo, honey!”

And with that initial spark of desire, the journey of online exploration begins. They type the name of a hotel (or maybe just “Oslo hotel”) into Google and click the Search button.

Will they find your hotel’s website ranked on the front page?

Findability is only the first step. The title and description of your listing must address their specific problem – Where to stay on our trip to Oslo? If you win the ‘click’, that delivers a prospective guest to your hotel’s website.

Now on your landing page, the couple wants more information. About their stay. About your facilities. Your pricing. Room availability. Tourism assistance. And more.

If your landing page copy and content matches their desire for knowledge and satisfies their needs, you’ll create trust and boost your chance of getting a sale.

This logical sequence – desire, findability, information, trust – is more or less constant across industries and niches. In one form or another, it exists in your field too. And your business website must match the flow, tap into the conversation that’s going on inside your prospect’s head, and join it to engage, inform, entertain and convince.

Before getting into the nitty gritty of content hierarchy and website structure that will help create this trusting relationship with prospects, I’ll take a step back to address another overlooked facet of the strategic SEO process.

Internal Link Structure & Information Architecture
Think about information architecture in the same light as planning and building a house. You would draw up a blueprint, then lay a firm foundation, construct a framework, and only then add the layers that turn the scaffolding into a full-fledged building.

Constructing an SEO optimized website that is strategically designed to fulfill the business goals of your enterprise follows essentially the same process.

When done correctly, a website’s information architecture can offer context for your content, present it in a manner that is friendly to search engine spiders yet easy for human visitors to navigate, and ideally give access to any section in three clicks or fewer.
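
The three-click goal is easy to audit mechanically. A minimal sketch, assuming a hypothetical site graph, computes the minimum click depth of every page with a breadth-first search:

```python
from collections import deque

# Hypothetical site graph: page -> pages it links to.
site = {
    "home":             ["shoes", "belts", "about"],
    "shoes":            ["red-shoes", "blue-shoes"],
    "red-shoes":        ["red-shoes-size-9"],
    "blue-shoes":       [],
    "belts":            ["red-belts"],
    "red-belts":        [],
    "about":            [],
    "red-shoes-size-9": [],
}

def click_depths(graph, start="home"):
    """Breadth-first search giving the minimum number of clicks
    needed to reach every page from the home page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

depths = click_depths(site)
# Pages violating the 3-click rule (none, in this toy layout).
too_deep = [page for page, d in depths.items() if d > 3]
```

Running the same traversal over a real crawl of your site flags any section that has drifted out of reach.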

The Myth Of “Home Page Focus”
Very simple, logical website structure (like I’ve explained before) that is based upon a user’s intent behind search keyword phrases will turn every category, sub-category and topic page into a “home page”. This is awesome, because:

• Your visitor will click fewer links (remember the 3-click rule?) to reach other sections of your website – something every usability expert and designer intuitively values, and website owners must consider seriously since it impacts the way Web search works.
• You have less need for ongoing SEO to improve and/or defend rankings, and can focus instead on growing your business with scalable solutions that last longer.
• You’ll become more authoritative at each level of your URL structure, as new topic pages added to your silo bring additional value to the pages higher up in the hierarchy because of your strategic internal linking.
• You’ll have the freedom to sculpt PageRank and pass link value to handpicked relevant pages or topics outside the silo. For example, if you sell red shoes, you could link to related items like red belts (which may reside in another silo) and achieve higher sales conversions.
• You can control and direct the way search engine spiders and Web crawlers find, interpret and understand your URLs before indexing them.
• The strategic use of navigational breadcrumb links lets users zoom in to get a close-up, or zoom out for a broader context.
• Such logical structuring is far less vulnerable to future algorithm changes and shifts.
• Each level in the URL structure hierarchy becomes “almost a business or niche” in itself. Visitors get a great first impression of your business when they land on such a page, and will view your site as a place to go when they need help, knowing they’ll easily find other related choices to select from. This boosts your image and builds your brand.
• It is easier to get links from other niche blogs, forums and social networks. External links pointing to a sub-category page bring link value, leading crawlers to your site from relevant ‘authority’ sites that may have already established trust. If you woke up one morning and search engines no longer existed, these sources of traffic would still be valuable.

Achieving the technical elements of SEO is easy, even using free tools like Magento and WordPress. Combining elements of SEO and design into the best possible strategy will increase sales. A silo structure for Web content is not about keyword stuffing; it has nothing to do with spamming, and your intention behind siloing your content shouldn’t just be to get more traffic. Your SEO goal is ultimately to maximize your business and profits.
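
As a small illustration of siloing (with a made-up product taxonomy), hierarchical URLs and breadcrumb trails fall straight out of the category structure:

```python
# Hypothetical silo taxonomy: category -> sub-category -> topic pages.
SILOS = {
    "shoes": {"red-shoes": ["suede-red-pumps", "red-running-shoes"]},
    "belts": {"red-belts": ["leather-red-belt"]},
}

def silo_url(category, subcategory=None, topic=None):
    """Build a hierarchical URL where each level nests inside its silo."""
    parts = [p for p in (category, subcategory, topic) if p]
    return "/" + "/".join(parts) + "/"

def breadcrumbs(category, subcategory=None, topic=None):
    """Breadcrumb trail: every ancestor level gets its own link,
    letting users zoom out to a broader context."""
    trail, parts = [], []
    for p in (category, subcategory, topic):
        if p:
            parts.append(p)
            trail.append("/" + "/".join(parts) + "/")
    return trail

# Every topic page in the taxonomy gets a URL that encodes its silo.
all_urls = [silo_url(c, s, t)
            for c, subs in SILOS.items()
            for s, topics in subs.items()
            for t in topics]
```

Because each URL carries its full ancestry, a new topic page automatically links upward through its breadcrumb trail, which is the internal-linking behavior the silo approach relies on.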

Layer On Design – But Only At The End!
With the framework of your content website solidly in place, and a silo layout combined with good URL structure defined in consultation with an SEO specialist, you can now team up with a usability expert and a good designer to build a user-friendly, information-rich, self-sustaining website.

• Your site will now become the best salesperson in your organization, working day and night to generate leads and close sales, while serving as a brand manager too.
• The silo structure upon which it is based will order your content in a way that makes it easy for users to find what they are looking for, just as it is easy to locate books in a library. This brings order out of chaos.
• Each time you add fresh content or include a new product in your catalog or store, the carefully planned URL structure will build internal links site-wide to other pages in the category, and up one level in the silo.
• Your information architecture will ensure that link value is passed along effectively and that the site stays maximally crawlable by search engine spiders.
• You won’t be stuck with time-consuming SEO efforts on an ongoing basis. All new content added to the site automatically fits into its optimized structure, resulting in “auto-pilot SEO” as you enjoy content growth.
• Your website structure and layout will help search engines define context and theme on a very granular level.

But this happy result requires a preparatory SEO strategy because, if not done correctly, it can land you in a nightmare of duplicate content issues. It is not something you can splash on top, like chocolate syrup on an ice-cream sundae! You must take these steps well ahead of the site building effort, in order to have everything working together in synergy and multiply the impact on your business.

5 Reasons to Diversify Your Search Strategy with PPC Advertising

By Elisa Gabbert
July 18, 2012

Yesterday we published the results of a study showing how sponsored advertisements on Google (PPC ads) are taking over territory previously reserved for organic listings, AKA “free clicks.” This is both good news and bad news for marketers. On the plus side, Google continues to roll out more and better types of search advertising to help marketers target their customers. On the negative side, you (obviously) have to pay for those clicks.

But the fact is, organic clicks aren’t really “free” either – gone are the days when it was relatively easy to rank on the first page in Google for your target keywords. Given the increasing costs and complications involved with SEO, it’s important to diversify your marketing channels. You can’t rely on organic search alone for traffic and leads – you never know when the next big algorithm update is going to murder your rankings.

Here are five reasons to shift some of the time and budget you spend on SEO to PPC.

#1: For Commercial Queries, Paid Clicks Outnumber Organic Clicks By Nearly 2 to 1

Organic clicks still account for more clicks overall in the search results – but different types of keywords have different value to businesses. For search queries that show high commercial intent – i.e., they indicate that the person searching wants to purchase something – more and more of the page (85% of above-the-fold pixels!) is devoted to sponsored listings. The organic results for transactional keywords like “best email software” or “waterproof digital camera” are mostly pushed below the fold. The top 3 ad spots for a commercial query take 41% of the clicks, and Product Listing Ads take another 20%. Overall, sponsored results account for 65% of clicks on these keywords, compared to 35% for organic results.

#2: Google’s Sponsored Ad Formats Keep Getting Better

You have minimal control over how your organic search listings appear in Google. (For example, they’ve recently started applying new titles, when they think they can serve up a better one than the title you put on the page.) But you have lots of attractive choices when it comes to ad types. Here are just a few of the ad options that Google now offers:

Mega Site Links: This huge ad format offers up to 10 additional places to click, greatly increasing your chances of presenting a relevant link.

Remarketing: Remarketing or retargeting allows you to track site visitors with a cookie and chase them around the Web, displaying relevant banner ads until they click and convert.

Social Ad Extensions: With social extensions you can display who has +1’d your site, lending credibility and potential name recognition – it also makes your ad look less like an ad (see below).

#3: About Half Your Audience Can’t Tell the Difference Between Paid and Organic Search

A lot of people think that “nobody clicks on Google ads.” And it’s true that eye tracking studies suggest most people ignore the sponsored ads in the right column. However, one study showed that about half of people don’t recognize the ads above the search results as ads – in other words, they couldn’t tell the difference between the organic and paid results.

Top ads get clicked first whether paid or organic

If users don’t know your ad is an ad, they can’t be suspicious of its intent – and why should they be, if it gives them what they want? If you secure one of those coveted positions above the organic results for a commercial query, you’ll take the lion’s share of clicks without sacrificing trust with users.

#4: SEO Is a Full-Time Job – Or Several Full-Time Jobs

As the number of sites competing for rankings has skyrocketed, Google’s algorithms have gotten more and more complex, and it’s become much harder to achieve – and maintain – high rankings in the organic results. Where in the past businesses could get away with hiring a single SEO point person (usually a pretty junior position), it now often requires a full team to develop and execute on an SEO strategy (a content writer, a link builder, etc.). We believe that PPC – once your campaigns are set up and running – requires significantly less time to manage. According to Perry Marshall, author of The Ultimate Guide to Google AdWords, “if you focus on the areas that bring the most traffic, I find that once you find a rhythm, you can really do this with a few minutes a day, at most a few hours a week, and that’s with a large campaign with a $10,000+ spend per month.”

#5: Algorithm Updates Don’t Affect Your PPC

Google’s rolling algorithm updates ensure that SEO gets harder and more confusing over time. The Panda and Penguin updates in particular have targeted the kinds of “optimizations” that tended to work for site owners and marketers in the past. Often the only way to find out if Google thinks your SEO techniques are over the line (AKA “over-optimization”) is to take a hit on rankings, and then scramble to figure out – and fix – what you’ve been doing wrong. Google does suspend AdWords accounts on occasion, sometimes without clear reason, but in PPC you’re much less likely to experience major flux or drop-offs in rankings and traffic due to changes on Google’s end.

These are all good reasons to re-allocate some of your marketing budget to PPC, if you’ve been depending on SEO for traffic and lead generation. We would never advocate giving up on SEO – you won’t hear us saying “SEO is dead” anytime soon. But strive for a balance between your search marketing channels, and you can minimize the damage incurred as SEO gets incrementally harder.

Another step to reward high-quality sites

(Cross-posted on the Webmaster Central Blog)

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change: if you try to read the text aloud, you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition.

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.

Posted by Matt Cutts, Distinguished Engineer

Domain names are a crucial element for capturing clicks and conversions from search results

A new study from Microsoft Research confirms what most SEOs have known for years—that domain names are a crucial element for capturing clicks and conversions from search results. Unlike what’s been published in most search marketing forums, however, this research was not focused on SEO techniques or search engine ranking algorithms, but rather on observed searcher behavior, offering insights about how people actually respond to what’s presented to them in search results.

The results of this research present a good news/bad news scenario for search marketers. The good news: If you have a credible, trusted domain name, you’ve got an advantage, as searchers really do pay attention to the URL in search results before deciding to click. And this is true regardless of the position of the URL on a search result page.

The bad news, of course, is that it’s more difficult these days to acquire “credible” domains now that most single or even double word domains are in use or reserved. Add confounding factors such as personalization, Google changing its core algorithm more than 500 times a year, and the fact that most searchers don’t move beyond the first or second page of results and you’ve got a major headache for most SEOs.

Nonetheless, the study is worth a close read for anyone wanting to understand more about how to capture the attention and clicks of searchers, thanks to its wealth of data generated by observing real people and their search behavior. Probably the most significant conclusion from the study:

Surprisingly, we find that despite changes in the overall distribution of surfaced domains, there has not been a comparable shift in the distribution of clicked domains. Users seem to have learned the landscape of the internet and their click behavior has thus become more predictable over time.

In other words, even if search result rankings change due to factors like personalization or algorithmic tweaks, searchers don’t seem to care. They’re now demonstrating a clear preference for credibility and trustworthiness in a domain name over simple ranking on a search result page. This is the strongest evidence I’ve seen yet that an obsession with ranking is not only futile, it completely ignores the reality of how your site attracts users.

Key takeaway for bosses/clients: rank really doesn’t matter if you’ve got a quality (trustworthy) domain name.

The study also has merit for anyone doing paid search, and considering what display URL is most appropriate for an ad. While advertisers are always limited to a display URL that corresponds with a top-level domain, the additional keywords shown in the display URL may be crucial in getting searchers to click. Also, even if searchers don’t have favorable “domain bias” for your main site, it may be possible to secure another more favorably-perceived domain for your paid search campaigns that serves as a microsite that ultimately funnels searchers into your main domain.

The report is thick with math and numerous citations to related work, but it is well worth the effort for anyone involved in competitive search marketing.

Jill Whalen’s Top Ten Questions She’d Ask a New Client

Here’s a selection of some of the questions I ask and why they’re important to the overall SEO process:

1. What web analytics program do you use, and can we have access to it?

Web analytics are the key to measuring the current level of SEO success (or lack thereof). They’re also the key to determining whether any future SEO implementation is helping to bring more targeted traffic. Therefore, it’s critical for me to have access to this information regardless of the level of SEO service I’m providing. If you use Google Analytics (GoAn), it’s very simple to add new users to the account and in most cases it’s fine to provide report-only access (rather than admin). Along with GoAn, I also ask for access to the client’s Google Webmaster Tools (GWMT) account. These days, if you have GoAn access, you can usually add the same website to your GWMT account as well, which makes the process easier.

2. What’s the purpose of your site and who is your target audience?

This is a seemingly simple question, yet it often stumps many clients. Some of them will cop out: “Well, the purpose of our site is to sell our product.” And your target audience? “Umm … anyone with a credit card?” Not very helpful. If you don’t have a good handle on who the people are who are buying your products, how will your SEO consultant help you bring those people to your website? An SEO consultant needs to have a clear picture of who you are because everything we do hinges upon this — from the keyword research to deciding what type of content needs to be written, to how you might want to attack social media marketing. If you’re an SEO consultant, I urge you to push for deep answers to this question.

3. Are there any other domains or sites that you own or control, or that you used to use instead of the current domain? (Please list them all.)

This information is important so I can assess any duplicate content issues. I need to know whether that other site I found that is using nearly the same content as yours is owned by you, or if someone scraped yours. I also need to know if you’re using multiple domains as an SEO strategy (so I can smack you!). I added this one to my questionnaire when I kept finding doorway domains or other sites that my clients *forgot* to tell me about. Even those who really do forget or who purposely don’t tell me about their additional domains aren’t getting away with anything. I usually end up finding them during my website audit process. So if you’re a client, do us both a favor and come clean from the start. This will save us all some time down the line! (And I was just kidding about smacking you :)!)

4. What have you done so far (if anything) about optimizing your site?

My favorite answer to this is “nothing” because that means we’re starting with a clean slate and have nowhere to go but up! But most clients these days have done at least some rudimentary SEO. While I can usually spot any on-page optimization, it’s helpful to hear it from you. Sometimes, the things clients say they’ve done (e.g., created keyword-rich Title tags) don’t actually seem to be done when I look for them. That tells me that your idea of SEO and mine may be quite different, and it’s good to know this up front. It’s also good to know if you have already been through a string of SEOs and what each of them has done to the site during their tenure.

5. Is there anything that you may have done that the search engines may not have liked regarding previous optimization efforts for your site?

This one is sort of an addendum to the last one for those who may have *forgotten* to tell me any bad or spammy things they (or a previous SEO) may have done. While they may not have mentioned anything spammy in the last question, this gives them the opportunity to add anything that they weren’t quite sure was on the up-and-up. Very often, the client may think something was bad or caused problems, when it’s actually innocuous. Other times, there can be a big mess to sort out — e.g., all kinds of paid-for spammy-anchor-text links. As an SEO, it’s helpful to know right away where to focus my efforts.

6. List the websites of your three biggest competitors. Why do you feel they compete with your site?

I like this question more for the second part than the first. It’s always interesting to see why people think another company or site is their competitor. Very often, the only reason people think so is that the other site shows up in the search results for the keyword phrase that the client wants to show up for! While that may make them your competitor, it also may not. It may simply mean that you’re shooting for the wrong keyword phrases. It’s also very helpful to look at competitor sites to see how they’re set up and whether they seem to have done much in the way of SEO.

7. What do you feel is your most unique selling proposition (USP)? Why would these clients come to you as opposed to anyone else who offers the same or similar products and services? What’s different or better about your product or service?

Hat tip to Karon Thackston for these questions, because they are ones she always asks before doing any copywriting for a website. Along with who your target audience is, these are some of the most important questions for any client to think about and answer. Sometimes a client will have a great grasp of this and provide lots of valuable information, but more often, the best they can come up with is that they are “more friendly” than their competitors. In today’s competitive marketplace and search results (especially since Google’s Panda Update), it’s critical to be able to differentiate your products and services from the rest. And even those who have an excellent grasp of this don’t always make it clear to the users of their website, which is something that will need to be fixed.

8. After a potential customer visits your site, what specifically do you want them to do?

This is a wonderful way to understand what the various conversion points of your website are. If your only answer is “Make a sale,” then you likely need to add some other smaller conversion points, such as signing up for a newsletter or updates, following you on social media, filling out a contact form, calling you, etc. As an SEO you need to know what all of these points are so that you can make sure that the client’s web analytics are set up to correctly capture all the conversions, and that the website is properly leading people to complete those conversions.

9. Do you have social media accounts (e.g., Twitter, Facebook, Google+) and if so, what are your user names?

This is important to see if and how they’re using social media. If they’re not using it at all, as an SEO, you must determine whether they should be. If they are using it, a quick review of their accounts will show you exactly how they’re using it. For instance, you’d want to look at whether they are simply tweeting out links to their own content via an automated feed, or if they are also interacting with their audience. This will help you devise an appropriate social media marketing strategy for them down the line.

10. Is there anything else you may have that you think will provide a more complete picture of your site?

It’s always a good idea to have a final, open-ended question such as this in case the client forgot to tell you anything within their previous answers. You may learn all kinds of things that you would not have otherwise learned without asking this question.

Those are the most important ones that should get you started. While you can ask all these in person or on the phone, I find it extremely helpful to have it all in writing. It also provides the client with the opportunity to think about their answers and get additional input from others within the company, as necessary.