Google's 2015 Ranking Indicators: What's Moving Up and What's Moving Down

The factors Google put on the list as being more important in 2015 are:

  • User experience metrics (all of them)
  • Shorter title tags
  • Original content
  • Engaging content that provides an answer, teaches, informs, is useful, delights
  • Original images
  • Quality site design
  • Descriptive meta description

The factors he would go so far as to call deprecated (as in, no longer used by Google) are:

  • Keywords
  • Focus on longtail phrases
  • Focus on ranking for specific keyword phrases
  • Lean code

The long tail does seem to be dead; it is now more about the whole site experience and broader keywords than about chasing a hyper-specific phrase like "big blue pineapple chair."

Others suggested that responsive (mobile-friendly) design and schema markup belong on the list as well. I'd agree with that, but there is a lot more discussion around what should and should not be on these lists.

The Internet Has Changed the Sales Cycle: Buyers Have EVOLVED, and Acquiring Customers Has Changed!

The advent of the internet and social media has changed how buyers shop. Prior to the internet's explosive growth, a buyer who needed a product would turn to a local salesperson and request brochures and other information from which to make a decision. Sales teams were the sole source of purchase information and were involved early in the process; decisions were made from the information that sales force provided. Today, buyers no longer need to engage with a sales force to get the information they need in the early stages of the buying cycle. Buyers now arrive at the point of purchase fully informed by online research, having learned about the seller's products and services, reputation, and history without ever talking to the sales force. They have compared comparable product offerings and pricing before ever seeking to make a purchase. They determine who has the best value proposition, and they now control the purchasing process.

This monumental shift has moved control of the buying cycle from the seller to the buyer. Buyers no longer need informed sales forces to guide them. In fact, if we engage a sales force with the buyer too early, we risk losing the sale entirely.

Success in the buying cycle is now all about marketing intimacy with your prospective demographic. Understanding your target audience, targeting your message, and talking to them about the product offering in the language they know and understand is more critical than ever. Focusing on the right channels and creating the right memorable communication experiences is the new closing modality that will resonate and convert.

(This new internet buyer is highly knowledgeable about the product offering, which has created a more challenging environment for marketers who seek to drive high-quality leads to their sales team. Generating quality leads is about being ever present and finding the right people at the right time.)

SEOs' and Designers' Ultimate Struggle to Dislike Each Other

SEOs are a unique bunch, in case you haven't noticed. We have real OCD tendencies (and, for some of us, it's full-blown OCD), we just know better than you (about everything), we are intensely, over-the-top goal-driven beings, we are scientists-in-a-vacuum, we are data-miners and data-hogs, we are behavioral futurists, and we make the web a better place for you and yours (for a price).

SEO and Designers Do Battle

As a caveat, the good ones are like that. It's definitely hard being an SEO, as so eloquently pointed out by Portent's own George Freitag in "Why Web Professionals Hate SEOs". George, as a fellow SEO, you know that we are equal-opportunity haters on every other web professional too. They may hate us, but we hate them just as much. And, really, turnabout is fair play.

After all, if designers and developers had their way, the web would be nothing but a bunch of huge, sexy images with millions of nested DIVs blended with functionality so overwhelming and powerful that it would virtually alienate anyone over the age of 35. No one would ever use the web, but the artists would have their canvas, right?
As a writer myself (by education and training), I’m all for “Ars Gratia Artis”, but that’s successful with consumers about 0% of the time. And, I won’t make excuses for Crap Hat SEOs; they’ve read three blog posts and follow “only the best search professionals on Twitter”.

Unfortunately, it simply comes with the territory of a profession that expanded exponentially over the last half-decade. Professional SEOs will always be lumped in with the Crap Hats and have to fight their battles and over-reaches.

The Brave New World Isn’t So Brave. Or New.
Google's Panda and Penguin didn't change the game. They just made the rules a lot more evident to those who thought there were no rules. Those SEOs who knew the score long before Google had to put down a heavy hand have come through unscathed nearly two years later.

Designers and developers just kept on polluting the web with "beautiful, functional" brand sites. But when the Piper comes calling and accounts are at stake, it is really strange how those same developers and designers suddenly become "eager" to work with SEOs.

SEO is the Thing
Unless you’re Coca-Cola or another one of the Fortune 50, unbranded search is how people are going to find you. Sorry, but them’s the facts.
So, while designers and developers have thousands of things to contemplate (how to make the brand logo “pop” or make a cart add products), SEOs are contemplating the strategy and schema, from content to information architecture to technical foundation to link graphs and the multitude of things each encapsulates.

Each and every one of those things matter in the quest for findability. Is there any sense in making a brand logo visible if no one is going to be there to see it? Or, a cart that will make you a ham sandwich after you finish the purchase, if no one is there to use it? Instead of treating SEOs as a nuisance and as someone who’s only trying to impede your progress, you might actually find that we can make your job easier in the end.

Jack of All, Master of None
The fact that I can't write extensive JavaScript from scratch doesn't mean that I don't know enough to be dangerous. And it also doesn't mean that I can't see the Charlie-Foxtrot of spaghetti code you're writing that's going to make my life a nightmare in two months.

The countless duplicate pages you're creating every single time a user applies your product filters, or how you wrapped all those reviews in an iframe instead of taking the extra hour or two to use the API, or that new page you accidentally created by mistyping the page name. Yeah, I found that too.

Those things have a big effect on a site's authority and trust. And don't let the fact that I can't create a CSS sprite on my own fool you into thinking I don't know what a cascading style sheet is or how it works.

SEOs (the good ones) know enough to intervene when something looks like a disaster-in-the-making that will end up as a dumpster-fire for search in two months. You may not care, but just remember, we're the ones accountable for "success" at the end of the day. You know, the "hard metrics" that clients need before renewing an engagement.
Here again, professional SEOs know exactly what we are asking for when we ask for it. Most of us actually hate asking you to do it because we know how much time and rework we're causing. But, in the end, we're helping you build a better site for the consumer, and a site that gets found in the SERPs.

We Don’t Know The Entire Algorithm, But We Know Enough
My guess is that not even Matt Cutts is clued into the ENTIRE Google algorithm, and that guy is the head of Google's webspam team and its (un)official Google-SEO spokesman.
Being a jackass about the fact that I don't know the entire algorithm simply tells me you're a defensive person (and possibly a lazy one too). We know enough of the algorithm's factors, and whether each has a positive or negative effect on a website, that the changes we ask for have a purpose.

If not knowing the entire algorithm costs an SEO credibility in the eyes of a designer, then the same standard applies to a designer who doesn't have full knowledge of Nielsen's heuristic evaluation. Moreover, I've never heard anyone say, "'Cause the algorithm, dude!" It just doesn't happen. And if it does happen, point me in the direction of said person(s); we'll have a chat.

SEOs will fully explain these things to you if you have the time. Seriously. Ask a question about why or how this is going to affect the website or search engine indexation. Any SEO (the good ones) will be more than happy to walk you through why it matters and why it should be the way we need it.

How Designers, Developers, and Copywriters Can Ease the Pain
Short of locking you in a room and playing professor for a few days straight, the best way, as George points out, is to communicate. That word makes it sound easy and effortless, but it’s anything but. When you’re under deadlines and shuffling between projects, ain’t nobody got time for that.

The real key is to sit down next to that SEO (that’s right, put down the email and step away) and let them walk you through it. Sure, it’s 10-20 minutes out of your day, but when you can see it from their point of view as they walk through your code or design, it breaks down barriers that email can’t. It’s that human element. It may not solve the issue, but it will help both parties gain a healthy respect for one another.

And that's the first step toward compromise. SEOs, as much as we know we need it our way because it's right, have to be willing to step down off the soapbox. You have to be willing to concede some things in order to get the bigger win for search. You may not be able to get that direct 301 you need on duplicate content, but with communication, explanation, and compromise, you can convince that developer to work in a rel=canonical instead.
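
To make that last point concrete, here is a minimal sketch of the two options, assuming a small Flask app with invented routes and URLs (none of this comes from the article): a direct 301 redirect where the duplicate URL can go away entirely, and a rel=canonical tag where a filtered page still has to render.

```python
# Minimal sketch, assuming Flask and hypothetical routes/URLs: two ways to
# resolve duplicate pages. Use a 301 redirect when the duplicate can disappear,
# or a rel=canonical tag when the filtered page must keep rendering.
from flask import Flask, redirect, render_template_string, request

app = Flask(__name__)

CANONICAL = "https://www.example.com/shoes/"  # hypothetical canonical URL


@app.route("/shoes/old-red-shoes")
def moved_page():
    # Preferred fix where possible: a permanent (301) redirect to the canonical URL.
    return redirect(CANONICAL, code=301)


@app.route("/shoes/")
def filtered_listing():
    # Filter parameters (?color=red&sort=price) create near-duplicate pages.
    # When those pages must still render, point search engines at the canonical.
    html = """<html><head>
        <link rel="canonical" href="{{ canonical }}">
      </head><body>Shoes filtered by {{ args }}</body></html>"""
    return render_template_string(html, canonical=CANONICAL, args=dict(request.args))
```

Either approach consolidates the duplicate signals onto one URL; the 301 is stronger, but the canonical tag is often the easier sell to a busy developer.
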
SEOs need to be able to compromise, just as developers, designers, and copywriters need to learn to compromise. Because at the end of the day, it's about creating wins for the client, not for our own egos.

Written by Anthony Verre
Hate Web Professionals (Mostly)

Google's Latest Granted Patent: An Algorithm to Trap Spammers, and Maybe White Hat SEOs Too

Google's Webmaster Guidelines highlight a number of practices that the search engine warns against: practices someone might engage in to try to boost their rankings in ways intended to mislead the search engine. The guidelines start with the following warning:

Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google’s partner sites.

A Google patent granted this week describes a few ways the search engine might respond when it believes such practices might be taking place on a page in an attempt to improve its rankings. The following image from the patent shows how search results might be reordered based upon such rank-modifying spam:

Google Rank Modifying Spam Chart of Ranking Changes

Those practices, referred to in the patent as “rank-modifying spamming techniques,” may involve techniques such as:

• Keyword stuffing,
• Invisible text,
• Tiny text,
• Page redirects,
• Meta tags stuffing, and
• Link-based manipulation.

While the patent contains definitions of these practices, I'd recommend reading the definitions for those quality guidelines over on the Google help pages which go into much more detail. What's really interesting about this patent isn't that Google is taking steps to try to keep people from manipulating search results, but rather the possible steps they might take while doing so.

The patent is:

Ranking documents
Invented by Ross Koningstein
Assigned to Google
US Patent 8,244,722
Granted August 14, 2012
Filed: January 5, 2010

Abstract

A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.

When Google believes that such techniques are being applied to a page, it might respond to them in ways that the person engaging in spamming might not expect. Rather than outright increasing the rankings of those pages, or removing them from search results, Google might respond with what the patent refers to as a time-based “rank transition function.”

The rank transition function provides confusing indications of the impact on rank in response to rank-modifying spamming activities. The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.

Let’s imagine that you have a page in Google’s index, and you work to improve the quality of the content on that page and acquire a number of links to it, and those activities cause the page to improve in rankings for certain query terms. The ranking of that page before the changes would be referred to as the “old rank,” and the ranking afterward is referred to as the “target rank.” Your changes might be the result of legitimate modifications to your page. A page where techniques like keyword stuffing or hidden text has been applied might also potentially climb in rankings as well, with an old rank and a higher target rank.

The rank transition function I referred to above may create a “transition rank” involving the old rank and the target rank for a page.

During the transition from the old rank to the target rank, the transition rank might cause:

• a time-based delay response,
• a negative response,
• a random response, and/or
• an unexpected response

For example, rather than just immediately raise the rank of a page when there have been some modifications to it, and/or to the links pointing to the page, Google might wait for a while and even cause the rankings of a page to decline initially before it rises. Or the page might increase in rankings initially, but to a much smaller scale than the person making the changes might have expected.
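
As a thought experiment, here is a hedged sketch of what such a rank transition function could look like. The function, parameter names, and numbers are invented for illustration; the patent does not publish a formula.

```python
import random

def transition_rank(old_rank, target_rank, day, invert_days=10, delay_days=30):
    """Invented illustration of a time-based rank transition (not from the patent).

    Instead of moving a page straight from old_rank to target_rank, the visible
    response can be temporarily negative, delayed, and partly random.
    Lower rank numbers mean better positions.
    """
    if day < invert_days:
        # Initial "negative response": move the page the opposite way.
        return old_rank - 0.5 * (target_rank - old_rank)
    if day < delay_days:
        # "Time-based delay": hold near the old rank, with a random wobble.
        return old_rank + random.uniform(-1.0, 1.0)
    # After the transition period, settle at the target rank.
    return float(target_rank)

# Example: a page whose changes "deserve" a move from position 40 to position 10.
for day in (0, 5, 15, 45):
    print(day, round(transition_rank(40, 10, day), 1))
```

If the site owner reverts their changes while the rank is artificially depressed, that reaction itself becomes a signal, which is the monitoring behavior described next.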

The search engine may monitor the changes to that page and to links pointing to the page to see what type of response there is to that unusual activity. For instance, if someone stuffs a page full of keywords, instead of the page improving in rankings for certain queries, it might instead drop in rankings. If the person responsible for the page then comes along and removes those extra keywords, it’s an indication that some kind of rank modifying spamming was going on.

So why use these types of transition functions?

For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero.

Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
The rank transition function might impact one specific document, or it might have a broader impact over “the server on which the document is hosted, or a set of documents that share a similar trait (e.g., the same author (e.g., a signature in the document), design elements (e.g., layout, images, etc.), etc.)”

If someone sees a small gain from keyword stuffing or some other activity that goes against Google's guidelines, they might engage in similar additional changes to the site, such as adding more keywords or hidden text. If they see a decrease, they might make other changes, including reverting the page to its original form.

If there's a suspicion that spamming might be going on, but not enough to positively identify it, the page involved might be subjected to fluctuations and extreme changes in ranking to try to get a spammer to attempt some kind of corrective action. If that corrective action helps in a spam determination, then the page, "site, domain, and/or contributing links" might be designated as spam.

How Website Structure & Information Architecture Should Mirror Your Business Goals

Thomas is the CEO of a major corporation. He had supervised a recent website redesign project, loved the snazzy new look with bells and whistles created by a talented graphics designer – but was calling me to help with a problem.

His beautiful new website wasn’t getting many visitors!

"Why don't people want to visit our lovely website?" Thomas wailed, genuinely puzzled that the results of his intensive efforts weren't as rosy as he had expected. As a strategic SEO consultant, I found the reasons glaringly obvious... but I had to soften the impact and gently explain what went wrong.

Together, we quickly checked the site’s ranking on Google for his top 50 keywords. They weren’t anywhere in the top 10 results. Or even 20.

You see, the not-so-apparent reason for the 'failed' website was the lack of something essential both for higher search engine rankings and for the kind of visitor experience that converts a prospect into a customer.

What’s that, you ask?

Thomas’s new website, though visually appealing and technology-rich, was sorely lacking in a well planned information architecture and website structure.

But what is “information architecture”? And how does “website structure” differ from design?

A formal definition of “information architecture” would likely put you to sleep! So let’s simply call it the art of organizing and labeling website content, and bringing design and architecture principles to bear on it.

To understand this better, we’ll look at the skeleton of a website, shorn of flesh and skin, stripped down to the basic fundamentals of what shapes and strengthens it – from within.

Basic Concepts Of Information Architecture
In medical school, trainees begin by learning about human anatomy. Knowing what makes up the body helps understand (and later treat) diseases that affect it.

At the heart of understanding website structure, and planning your strategy for information architecture, lies a need to know about terms like semantic search, latent semantic indexing, knowledge graph, and SEO automation.

Semantic search is an attempt to improve search accuracy by predicting the intent of a searcher. The shift from blindly matching keywords typed into a search box against a massive database, to a more “intelligent” form of search that attempts to understand what those words actually mean to the user, has serious implications on strategic SEO for many business owners.

Latent Semantic Indexing is an indexing and retrieval method that was designed to identify patterns in the relationship between terms and concepts within any text.

By providing unique context for each search term or phrase, it helps ensure that a search for 'Apple' computers retrieves pages about the iMac or iPad, while a search for 'Apple' the fruit pulls a different set of results about gardening and growing apples.
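
For a feel of how that grouping works, here is a toy latent semantic analysis sketch, assuming scikit-learn is available; the documents, query, and component count are all invented for illustration.

```python
# Toy LSI/LSA sketch (assumes scikit-learn): co-occurring terms are folded into
# a small number of latent "concepts", so a query can match documents that share
# its meaning rather than only its exact keywords.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "apple imac and ipad computers for the office",
    "macbook and imac reviews for apple fans",
    "growing apple trees in the garden",
    "how to plant apple trees and harvest the fruit",
]

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)                      # term-document matrix
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_concepts = lsa.fit_transform(X)                # documents in concept space

query = lsa.transform(tfidf.transform(["apple imac"]))
print(cosine_similarity(query, doc_concepts))      # the computer-themed docs should score highest
```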

The “knowledge graph” is made up of collated information that will help search services like Google deliver more than just a list of 10 websites, and provide contextual information that solves users’ problems better (even when those problems are not explicitly voiced by the user)!

The implications are clear. Keywords are open to being manipulated. User intent cannot be gamed so easily.

To survive a search engine war fought on the battlefield of semantic search, your business must deeply understand the psychology of your collective market, and then provide specific and meaningful answers to their problems, doubts and insecurities in the form of optimized Web pages that are simultaneously designed to rank well… and also fit into the bigger context of your overall business goals.

At first glance, this seems a daunting challenge. But it’s really straightforward if you proceed with a rational plan rooted in strategy, founded on information architecture principles and framed upon a solid website structure.

Before we explore these elements in greater depth, I’d like to make something clear.

This Is Not A Fight Between Designers & SEO Experts!
Traditionally, these two camps have been at loggerheads. Designers feel SEO ruins their carefully planned look and feel. SEO hotshots complain that higher ranking is sacrificed on the altar of a prettier website.

Yes, it is possible for a design-obsessed structure to wreak havoc with a site’s SEO. It’s also possible for a website driven entirely by SEO to destroy a brand or ruin sales potential. With planning and high quality implementation, the strengths of both specialties can be harnessed to offer a business incredible synergy.

Exploring how this happy union can be achieved is the goal of this report.

Today, any successful website needs:

• SEO (to drive relevant, quality traffic that is looking to buy),
• usability (to manage and convert these visitors into paying customers), and
• the ability to synergize both to work in concert, building your brand and growing your business.

Information Architecture & Getting Inside Your Prospect's Mind
Too often, businesses structure their corporate website based upon the business’ organization. This is often out of sync with a client’s needs, causing the business to lose money.

Your ideal prospect visits your website to see if you’ll help find solutions to her problems – not to read a self-serving brochure about your business.

Keeping this in mind, your information architecture must be based on the best ways to serve your visitor, based on an intimate understanding of ‘user logic’.

Let’s take a hypothetical case of a young couple planning a holiday to Norway. She looks at him and says, “Let’s stay at this hotel in Oslo, honey!”

And with that initial spark of desire, the journey of online exploration begins. They type the name of a hotel (or maybe just “Oslo hotel”) into Google and click the Search button.

Will they find your hotel’s website ranked on the front page?

Findability is only the first step. The title and description of your listing must address their specific problem – Where to stay on our trip to Oslo? If you win the ‘click’, that delivers a prospective guest to your hotel’s website.

Now on your landing page, the couple wants more information. About their stay. About your facilities. Your pricing. Room availability. Tourism assistance. And more.

If your landing page copy and content matches their desire for knowledge and satisfies their needs, you’ll create trust and boost your chance of getting a sale.

This logical sequence – desire, findability, information, trust – is more or less constant across industries and niches. In one form or another, it exists in your field too. And your business website must match the flow, tap into the conversation that’s going on inside your prospect’s head, and join it to engage, inform, entertain and convince.

Before getting into the nitty gritty of content hierarchy and website structure that will help create this trusting relationship with prospects, I’ll take a step back to address another overlooked facet of the strategic SEO process.

Internal Link Structure & Information Architecture
Think about information architecture in the same light as planning and building a house. You would draw up a blueprint, then lay a firm foundation, construct a framework, and only then add on the layers that turn the scaffolding into a full fledged building.

Constructing an SEO optimized website that is strategically designed to fulfill the business goals of your enterprise follows essentially the same process.

When done correctly, a website's information architecture can offer context for your content, present it in a manner that is friendly to search engine spiders yet easy for human visitors to navigate, and ideally ensure that any section is reachable in 3 clicks or fewer.
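
One way to sanity-check that last point is to walk the internal link graph from the home page and measure click depth. Here is a small sketch using an invented site graph; a real check would crawl the site or read its sitemap.

```python
# Breadth-first search over a hypothetical internal link graph: the depth of each
# page is the minimum number of clicks needed to reach it from the home page.
from collections import deque

site_links = {
    "/": ["/shoes/", "/belts/", "/about/"],
    "/shoes/": ["/shoes/red/", "/shoes/blue/"],
    "/shoes/red/": ["/shoes/red/size-9/"],
    "/shoes/blue/": [],
    "/shoes/red/size-9/": [],
    "/belts/": ["/belts/red/"],
    "/belts/red/": [],
    "/about/": [],
}

def click_depths(links, start="/"):
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(site_links)
print(depths)
print("Deeper than 3 clicks:", [p for p, d in depths.items() if d > 3])
```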

The Myth Of “Home Page Focus”
Very simple, logical website structure (like I’ve explained before) that is based upon a user’s intent behind search keyword phrases will turn every category, sub-category and topic page into a “home page”. This is awesome, because:

• Your visitor will click fewer links (remember the 3 click rule?) to reach other sections of your website – something every usability expert and designer intuitively values, and something website owners must consider seriously since it impacts the way Web search works.
• You have less need for ongoing SEO to improve and/or defend rankings, and can focus instead on growing your business with scalable solutions that last longer.
• You'll become more authoritative at each level of your URL structure, as new topic pages added to your silo bring additional value to the pages higher up in the hierarchy through your strategic internal linking.
• You'll have the freedom to sculpt PageRank and pass link value to handpicked relevant pages or topics outside the silo. For example, if you sell red shoes, you could link to related items like red belts (which may reside in another silo) and achieve higher sales conversions.
• You can control and direct the way search engine spiders and Web crawlers find, interpret and understand your URLs before indexing them.
• The strategic use of navigational breadcrumb links lets users zoom in to get a close-up, or zoom out for a broader context.
• Such logical structuring is far less vulnerable to future algorithm changes and shifts.
• Each level in the URL structure hierarchy becomes "almost a business or niche" in itself. Visitors get a great first impression of your business when they land on such a page, and will view your site as a place to go when they need help, knowing they'll be able to easily find other related choices to select from. This boosts your image and builds your brand.
• It is easier to get links from other niche blogs, forums and social networks. External links pointing to a sub-category page bring link value, leading crawlers to your site from relevant 'authority' sites that might have already established trust. If you woke up one morning and search engines no longer existed, these sources of traffic would still be valuable.

Achieving the technical elements of SEO is easy even using free tools like Magento and WordPress. Combining elements of SEO and design into the best possible strategy will increase sales. A silo structure for Web content is not just about keyword stuffing. This has nothing to do with spamming, and your intention behind siloing your content shouldn’t just be to get more traffic. Your SEO goal is ultimately to maximize your business and profits.

Layer On Design – But Only At The End!
With the framework of your content website solidly in place, and a silo layout combined with good URL structure defined in consultation with an SEO specialist, you can now team up with a usability expert and a good designer to build a user-friendly, information-rich, self-sustaining website.

• Your site will now become the best salesperson in your organization, working day and night to generate leads and close sales, while serving as a brand manager too.
• The silo structure upon which it is based will order your content in a way that makes it easy for users to find what they are looking for, just as it is easy to locate books in a library. This brings order out of chaos.
• Each time you add fresh content or a new product to your catalog or store, the carefully planned URL structure will build site-wide internal links to other pages in the category, and up one level in the silo.
• Your information architecture will ensure that link value is passed along effectively and that search engine spiders can crawl the site to its full depth.
• You won't be stuck with time-consuming SEO efforts on an ongoing basis. All new content added to the site automatically fits into its optimized structure, resulting in "auto-pilot SEO" as your content grows.
• Your website structure and layout will help search engines define context and theme at a very granular level.

But this happy result requires a preparatory SEO strategy because, if not done correctly, it can land you in trouble with a nightmare of duplicate content issues. It is not something you can plan to splash on top, like chocolate syrup on an ice-cream sundae! You must take these steps well ahead of the site building effort, in order to have everything working together in synergy to explode the impact on your business.
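
To illustrate the silo-linking idea from the list above, here is a hedged sketch with an invented URL scheme: given a page's URL, it derives the parent category one level up and the sibling pages in the same category, which a template could then render as internal links.

```python
# Hypothetical silo helper: parent and sibling URLs are inferred purely from the
# URL hierarchy, so every new page automatically links up one level and across
# its own category.
PAGES = [
    "/shoes/", "/shoes/red/", "/shoes/red/size-9/", "/shoes/red/size-10/",
    "/shoes/blue/", "/belts/", "/belts/red/",
]

def silo_links(page, pages=PAGES):
    parts = page.strip("/").split("/")
    depth = len(parts)
    parent = "/" + "/".join(parts[:-1]) + "/" if depth > 1 else "/"
    siblings = [
        p for p in pages
        if p != page and p.startswith(parent) and len(p.strip("/").split("/")) == depth
    ]
    return parent, siblings

print(silo_links("/shoes/red/size-9/"))  # ('/shoes/red/', ['/shoes/red/size-10/'])
print(silo_links("/shoes/red/"))         # ('/shoes/', ['/shoes/blue/'])
```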

5 Reasons to Diversify Your Search Strategy with PPC Advertising

By Elisa Gabbert
July 18, 2012

Yesterday we published the results of a study showing how sponsored advertisements on Google (PPC ads) are taking over territory previously reserved for organic listings, AKA “free clicks.” This is both good news and bad news for marketers. On the plus side, Google continues to roll out more and better types of search advertising to help marketers target their customers. On the negative side, you (obviously) have to pay for those clicks.

But the fact is, organic clicks aren’t really “free” either – gone are the days when it was relatively easy to rank on the first page in Google for your target keywords. Given the increasing costs and complications involved with SEO, it’s important to diversify your marketing channels. You can’t rely on organic search alone for traffic and leads – you never know when the next big algorithm update is going to murder your rankings.

Here are five reasons to shift some of the time and budget you spend on SEO to PPC.

#1: For Commercial Queries, Paid Clicks Outnumber Organic Clicks By Nearly 2 to 1

Organic clicks still account for more clicks overall in the search results – but different types of keywords have different value to businesses. For search queries that show high commercial intent – i.e., they indicate that the person searching wants to purchase something – more and more of the page (85% of above-the-fold pixels!) is devoted to sponsored listings. The organic results for transactional keywords like "best email software" or "waterproof digital camera" are mostly pushed below the fold. The top 3 ad spots for a commercial query take 41% of the clicks, and Product Listing Ads take another 20%. Overall, sponsored results account for 65% of clicks on these keywords, compared to 35% for organic results.

#2: Google’s Sponsored Ad Formats Keep Getting Better

You have minimal control over how your organic search listings appear in Google. (For example, they’ve recently started applying new titles, when they think they can serve up a better one than the title you put on the page.) But you have lots of attractive choices when it comes to ad types. Here are just a few of the ad options that Google now offers:

• Mega Site Links: This huge ad format offers up to 10 additional places to click, greatly increasing your chances of presenting a relevant link.
• Remarketing: Remarketing or retargeting allows you to track site visitors with a cookie and chase them around the Web, displaying relevant banner ads until they click and convert.
• Social Ad Extensions: With social extensions you can display who has +1'd your site, lending credibility and potential name recognition – it also makes your ad look less like an ad (see below).

#3: About Half Your Audience Can’t Tell the Difference Between Paid and Organic Search

A lot of people think that “nobody clicks on Google ads.” And it’s true that eye tracking studies suggest most people ignore the sponsored ads in the right column. However, one study showed that about half of people don’t recognize the ads above the search results as ads – in other words, they couldn’t tell the difference between the organic and paid results.

Top ads get clicked first whether paid or organic

If users don't know your ad is an ad, they can't be suspicious of its intent – and why should they be, if it gives them what they want? Secure one of those coveted positions above the organic results for a commercial query, and you'll take the lion's share of clicks without sacrificing users' trust.

#4: SEO Is a Full-Time Job – Or Several Full-Time Jobs

As the number of sites competing for rankings has sky-rocketed, Google’s algorithms have gotten more and more complex, and it’s become much harder to achieve – and maintain – high rankings in the organic results. Where in the past businesses could get away with hiring a single SEO point person (usually a pretty junior position), now it often requires a full team to develop and execute on an SEO strategy (a content writer, a link builder, etc.). We believe that PPC – once your campaigns are set up and running – requires significantly less time to manage. According to Perry Marshall, author of The Ultimate Guide to Google AdWords, “if you focus on the areas that bring the most traffic, I find that once you find a rhythm, you can really do this with a few minutes a day, at most a few hours a week, and that’s with a large campaign with a $10,000+ spend per month.”

#5: Algorithm Updates Don’t Affect Your PPC

Google's rolling algorithm updates ensure that SEO gets harder and more confusing over time. The Panda and Penguin updates in particular have addressed the kind of "optimizations" that have tended to work for site owners and marketers in the past. The only way to find out if Google thinks your SEO techniques are over the line (AKA "over-optimization") is to take a hit on rankings, and then scramble to figure out – and fix – what you've been doing wrong. Google does suspend AdWords accounts on occasion, sometimes without clear reason, but in PPC you're much less likely to experience major flux or drop-offs in rankings and traffic due to changes on Google's end.

These are all good reasons to re-allocate some of your marketing budget to PPC, if you’ve been depending on SEO for traffic and lead generation. We would never advocate giving up on SEO – you won’t hear us saying “SEO is dead” anytime soon. But strive for a balance between your search marketing channels, and you can minimize the damage incurred as SEO gets incrementally harder.

Another step to reward high-quality sites

(Cross-posted on the Webmaster Central Blog)

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition:

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.

Posted by Matt Cutts, Distinguished Engineer

Googler Tips on Building a Better Post Panda Site

In recent months the Google team has been especially focused on helping people find high-quality sites in Google's search results. The "Panda" algorithm change has improved rankings for a large number of high-quality websites, so most of you reading this have nothing to be concerned about. However, for the sites that may have been affected by Panda, we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what they think are Google’s current ranking algorithms or signals. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?
Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content. The recent “Panda” change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the “quality” of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren’t disclosing the actual ranking signals used in our algorithms because we don’t want folks to game our search results; but if you want to step into Google’s mindset, the questions below provide some guidance on how we’ve been looking at the issue:

• Would you trust the information presented in this article?
• Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
• Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
• Would you be comfortable giving your credit card information to this site?
• Does this article have spelling, stylistic, or factual errors?
• Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
• Does the article provide original content or information, original reporting, original research, or original analysis?
• Does the page provide substantial value when compared to other pages in search results?
• How much quality control is done on content?
• Does the article describe both sides of a story?
• Is the site a recognized authority on its topic?
• Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don't get as much attention or care?
• Was the article edited well, or does it appear sloppy or hastily produced?
• For a health related query, would you trust information from this site?
• Would you recognize this site as an authoritative source when mentioned by name?
• Does this article provide a complete or comprehensive description of the topic?
• Does this article contain insightful analysis or interesting information that is beyond obvious?
• Is this the sort of page you'd want to bookmark, share with a friend, or recommend?
• Does this article have an excessive amount of ads that distract from or interfere with the main content?
• Would you expect to see this article in a printed magazine, encyclopedia or book?
• Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
• Are the pages produced with great care and attention to detail vs. less attention to detail?
• Would users complain when they see pages from this site?

Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do
We’ve been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you’ve been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Written by Amit Singhal, Google Fellow