The Internet Has Changed the Sales Cycle: Buyers Have Evolved, and Acquiring Customers Has Changed

The advent of the internet and social media has changed how buyers shop. Before the internet's explosive growth, a buyer who needed a product would turn to a local salesperson and request brochures and other information on which to base a decision. The sales team was the sole source of purchase information and was involved early in the process; decisions were made from what that sales force provided. Today, buyers no longer need to engage a sales force to get the information they need in the early stages of the buying cycle. They arrive at the point of purchase fully informed by online research, having learned about the seller's products and services, reputation, and history without ever talking to the sales force. They have compared competing offerings and pricing before ever seeking to make a purchase, determined who has the best value proposition, and now control the purchasing process.

Control of the buying cycle has shifted monumentally from the seller to the buyer. Buyers no longer need an informed sales force to guide them. In fact, engaging the sales force with a buyer too early risks losing the sale entirely.

Success in the buying cycle is now all about marketing intimacy with your prospective demographic. Understanding your target audience, targeting your message, and speaking to buyers in the language in which they know and understand the product is more critical than ever. Focusing on the right channels and creating memorable communication experiences is the new closing modality that resonates and converts.

(This new internet buyer is highly knowledgeable about the product offering, which creates a more challenging environment for marketers who want to drive high-quality leads to their sales teams. Generating quality leads is about being ever present and reaching the right people at the right time.)

Common mistakes in smartphone sites

Here are some common mistakes we see on smartphone-optimized websites and how to avoid them.

Unplayable videos
Many videos are not playable on smartphone devices, either because they require software or device capabilities that smartphones do not support, or because of licensing constraints. We recommend using standard HTML5 tags to include videos and avoiding formats, such as Flash, that are not supported by all mobile devices (a sketch follows below).

Regardless, do your best to give smartphone users a good experience by making sure videos are playable on as many devices as possible. Also consider making a transcript of the video available on all devices, as that may better serve your smartphone users.
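
For example, here is a minimal sketch (file paths are hypothetical) of HTML5 video markup with multiple sources and a transcript link as the fallback, built as a string in TypeScript rather than embedded via Flash:

```typescript
// A minimal sketch (paths hypothetical): HTML5 <video> markup with multiple
// sources and a transcript link as the fallback, instead of a Flash embed.
const videoHtml: string = `
  <video controls preload="metadata" width="640">
    <source src="/media/product-demo.mp4"  type="video/mp4">
    <source src="/media/product-demo.webm" type="video/webm">
    <!-- Shown by browsers that cannot play HTML5 video -->
    <p>
      This browser cannot play the video.
      <a href="/media/product-demo-transcript.html">Read the transcript</a> instead.
    </p>
  </video>`;

console.log(videoHtml); // e.g. inject into a page template on the server
```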

Faulty redirects
Many sites have dedicated smartphone-optimized pages and redirect smartphone users based on the user-agent. A common error is to redirect a user trying to access a URL on the desktop site to an irrelevant URL on the smartphone site.

Some common examples:

•Your desktop site’s server is configured to redirect smartphone users to the smartphone site’s homepage, regardless of which URL they originally requested, even when the mobile site has a page equivalent to the desktop page they asked for.

•Desktop URLs with dynamically generated content and URL parameters that don’t map well to the equivalent mobile URL. For example, a user looking for a train timetable on a specific date on the desktop site gets redirected to the general timetable search page on the smartphone-optimized site instead of the results for that specific search.

We recommend that you configure redirection correctly whenever an equivalent smartphone URL exists, so that users end up on the page they were looking for (see the sketch after this list).

•Redirecting some mobile devices but not others. For example, a site may redirect only Android users to the mobile site and not redirect iPhone or Windows Phone users.
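
A minimal sketch of such a redirect, assuming an Express/Node server and a hypothetical m.example.com mobile site: detect all smartphone user-agents rather than a single platform, redirect only when an equivalent mobile page exists, and preserve the path and query string so the user lands on the page they asked for.

```typescript
import express from "express";

const app = express();
const SMARTPHONE_UA = /Android.+Mobile|iPhone|Windows Phone/i;

// Hypothetical check; in practice this might consult a route table or sitemap.
function mobileEquivalentExists(path: string): boolean {
  return ["/", "/timetable", "/products"].includes(path);
}

app.use((req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  if (SMARTPHONE_UA.test(ua) && mobileEquivalentExists(req.path)) {
    // e.g. /timetable?date=2013-04-23 -> https://m.example.com/timetable?date=2013-04-23
    return res.redirect(302, "https://m.example.com" + req.originalUrl);
  }
  next(); // no equivalent mobile page: keep serving the desktop URL
});

app.listen(3000);
```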

Smartphone-only 404s
Some sites serve content to desktop users accessing a URL but show an error page to smartphone users.

To ensure the best user experience:

•If you recognize a user is visiting a desktop page from a mobile device and you have an equivalent smartphone-friendly page at a different URL, redirect them to that URL instead of serving a 404 or a soft 404 page. Also make sure that the smartphone-friendly page itself is not an error page.

•If your content is not available in a smartphone-friendly format, serve the desktop page instead. Showing the content the user was looking for is a much better experience than showing an error page (see the sketch after this list).
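
Complementing the redirect sketch above, here is a minimal, hypothetical sketch of that fallback behavior: redirect smartphone users to an equivalent mobile URL when one exists, and otherwise fall through to the desktop page rather than returning a 404 or soft 404.

```typescript
import express from "express";

const app = express();

// Hypothetical mapping of desktop paths to smartphone-friendly equivalents.
const mobileEquivalents: Record<string, string> = {
  "/products/widget-42": "https://m.example.com/products/widget-42",
};

app.use((req, res, next) => {
  const isSmartphone = /Android.+Mobile|iPhone|Windows Phone/i.test(
    req.headers["user-agent"] ?? ""
  );
  if (!isSmartphone) return next();

  const mobileUrl = mobileEquivalents[req.path];
  if (mobileUrl) return res.redirect(302, mobileUrl); // equivalent page exists
  next(); // no equivalent: serve the desktop content, never an error page
});

app.listen(3000);
```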

App download interstitials
Many webmasters promote their site’s apps to their web visitors. There are many ways to implement this, some of which may cause indexing issues for smartphone-optimized content and others that may be too disruptive to the visitor’s use of the site.

Based on these considerations, we recommend using a simple banner to promote your app inline with the page’s content (a sketch follows the list below). This banner can be implemented using:

•Native browser and operating system support, such as Smart App Banners for Safari on iOS 6.

•An HTML image, similar to a typical small advert, that links to the correct app store for download.
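
A minimal sketch of both options (the app ID, package name, and image path are placeholders): a Smart App Banner meta tag for Safari on iOS 6+, plus a simple inline image banner that links to an app store, instead of a full-page interstitial that hides the content.

```typescript
const appPromotionMarkup: string = `
  <!-- Rendered by Safari on iOS 6+ as a dismissible Smart App Banner -->
  <meta name="apple-itunes-app" content="app-id=123456789">

  <!-- Simple inline banner, shown like a small advert within the page content -->
  <a href="https://play.google.com/store/apps/details?id=com.example.app">
    <img src="/images/get-our-app.png" alt="Get our app" width="320" height="50">
  </a>`;

console.log(appPromotionMarkup); // e.g. inject into the page template server-side
```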

Irrelevant cross-linking
A common practice when a website serves users on separate smartphone-optimized URLs is to have links to the desktop-optimized version, and likewise a link from the desktop page to the smartphone page. A common error is to have those links point to an irrelevant page, such as the smartphone pages linking to the desktop site’s homepage.

If you add such links, be sure they point to the correct equivalent page.
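
For instance, a minimal sketch (domains hypothetical) that derives the cross-link from the page the user is actually on, so a "View desktop site" link points to the equivalent desktop URL rather than the desktop homepage:

```typescript
// m.example.com/products/widget-42 -> www.example.com/products/widget-42
function desktopUrlFor(mobilePath: string): string {
  return "https://www.example.com" + mobilePath;
}

const viewDesktopLink = `<a href="${desktopUrlFor("/products/widget-42")}">View desktop site</a>`;
console.log(viewDesktopLink);
```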

Page speed
Optimizing a page’s loading time on smartphones is particularly important given the characteristics of the mobile data networks smartphones are connected to. A good starting point is the page speed documentation on Google Developers.
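
As a minimal sketch, assuming an Express/Node stack and the "compression" npm package, two common first steps are gzip-compressing text responses and letting browsers cache static assets:

```typescript
import express from "express";
import compression from "compression";

const app = express();

app.use(compression());              // gzip/deflate HTML, CSS, and JS responses
app.use(express.static("public", {
  maxAge: "30d",                     // long-lived Cache-Control for images, CSS, JS
}));

app.listen(3000);
```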

Google Updates AdWords Trademark Policy to Be the Same Worldwide

Google has made a policy revision that applies to complaints we receive regarding the use of trademarks as keywords. Starting on 23 April 2013, keywords that were restricted as a result of a trademark investigation will no longer be restricted in China, Hong Kong, Macau, Taiwan, Australia, New Zealand, South Korea and Brazil.

While we will not prevent the use of trademarks as keywords in the affected regions, trademark owners will still be able to complain about the use of their trademark in ad text.

How does the revised policy affect which ads can be shown?
Google will no longer prevent advertisers from selecting a third party’s trademark as a keyword in ads targeting these regions.

Why did Google change its trademark policy?
Google’s goal is to provide our users with the most relevant information, whether from search results or advertisements, and we believe users benefit from having more choice. Our policy aims to balance the interests of users, advertisers and trademark owners, so we will continue to investigate trademark complaints concerning use of trademarks in ad text. In addition, this change means that the AdWords policy on trademarks as keywords is now harmonised throughout the world. A consistent policy and user experience worldwide benefits users, advertisers and trademark owners alike.

Does this policy change impact the usage of trademarks in ad text?
No. This policy change relates to the use of trademarked terms as keywords.

Who is affected by the policy change?
Google’s revised trademark policy applies to trademarks held in China, Hong Kong, Macau, Taiwan, Australia, New Zealand, South Korea and Brazil. This policy is already in effect in all other regions throughout the world. Please consult our existing trademark policy for more information.

What will happen to existing trademark complaints?
Starting on 23 April 2013, keywords that were restricted as a result of a trademark complaint and investigation will no longer be restricted in the affected regions. If you have an existing complaint on file that includes both keywords and ad text in one of the affected regions, we will continue to restrict use of your trademark in ad text.

Will Google respond to trademark complaints in the affected regions?
Yes. Trademark owners will still be able to submit complaints about the use of their trademarks in ad text in the affected regions.

What are your plans to extend this policy to additional regions?
We do not restrict trademarks as keywords in any other regions. This policy change in China, Hong Kong, Macau, Taiwan, Australia, New Zealand, South Korea and Brazil brings these countries in line with our trademark keyword policy in the rest of the world.

Will trademark terms in my account start triggering ads?
Keywords that were restricted as a result of a trademark investigation may begin triggering your ads in the affected regions, starting on 23 April 2013. If you do not want your ads to run on certain keywords, you can remove those keywords from your campaigns or add them as negative keywords.

Does this mean that I can now use trademark terms as keywords?
Google is not in a position to make recommendations regarding the use of terms corresponding to trademarks. If you have further questions, we encourage you to contact your legal counsel and consult the AdWords Terms and Conditions.

How do I change the list of those authorised to use my trademark in ad text?
If you would like to edit the list of authorised users of your trademark in your current trademark complaint, please send us a revised list. Learn more about our trademark authorisation procedure.

Who should I contact if I have further questions about this policy change?
You can email any questions you might have about the policy change to trademark-policy-revision@google.com.

Mar 21, 2013 – Google AdWords Policy

SEOs’ and Designers’ Ultimate Struggle to Dislike Each Other

SEOs are a unique bunch, in case you haven’t noticed. We have real OCD tendencies (and, for some of us, it’s full-blown OCD), we just know better than you (about everything), we are intensely, over-the-top goal-driven beings, we are scientists-in-a-vacuum, we are data-miners and data-hogs, we are behavioral futurists, and we make the web a better place for you and yours (for a price).

SEOs and Designers Do Battle

As a caveat, the good ones are like that. It’s definitely hard being an SEO, as so eloquently pointed out by Portent’s own George Freitag in “Why Web Professionals Hate SEOs”. George, as a fellow SEO, you know that we are equal-opportunity haters of every other web professional too. They may hate us, but we hate them just as much. And, really, turnabout is fair play.

After all, if designers and developers had their way, the web would be nothing but a bunch of huge, sexy images with millions of nested DIVs blended with functionality so overwhelming and powerful that it would virtually alienate anyone over the age of 35. No one would ever use the web, but the artists would have their canvas, right?
As a writer myself (by education and training), I’m all for “Ars Gratia Artis”, but that’s successful with consumers about 0% of the time. And, I won’t make excuses for Crap Hat SEOs; they’ve read three blog posts and follow “only the best search professionals on Twitter”.

Unfortunately, it simply comes with the territory of a profession that has expanded exponentially over the last half-decade. Professional SEOs will always be lumped in with them and have to fight their battles and clean up their over-reaches.

The Brave New World Isn’t So Brave. Or New.
Google’s Panda and Penguin didn’t change the game. They just made the rules a lot more evident to those who thought there were no rules. The SEOs who knew the score long before Google had to put down a heavy hand have come through unscathed nearly two years later.

Designers and developers just kept on polluting the web with “beautiful, functional” brand sites. But when the Piper comes calling and accounts are at stake, it is really strange how those same developers and designers are suddenly “eager” to work with SEOs.

SEO is the Thing
Unless you’re Coca-Cola or another one of the Fortune 50, unbranded search is how people are going to find you. Sorry, but them’s the facts.
So, while designers and developers have thousands of things to contemplate (how to make the brand logo “pop” or make a cart add products), SEOs are contemplating the strategy and schema, from content to information architecture to technical foundation to link graphs and the multitude of things each encapsulates.

Each and every one of those things matter in the quest for findability. Is there any sense in making a brand logo visible if no one is going to be there to see it? Or, a cart that will make you a ham sandwich after you finish the purchase, if no one is there to use it? Instead of treating SEOs as a nuisance and as someone who’s only trying to impede your progress, you might actually find that we can make your job easier in the end.

Jack of All, Master of None
The fact that I can’t write an entire array or code extensively in JavaScript doesn’t mean that I don’t know enough to be dangerous. And it doesn’t mean that I can’t see the Charlie-Foxtrot of spaghetti code you’re writing that’s going to make my life a nightmare in two months.

The countless duplicate pages you’re creating every single time a user uses your product filters, or how you wrapped all those reviews in an iframe instead of taking the extra hour or two to use the API, or that page you created by mistyping the page name. Yeah, I found that too.

Those things have a big effect on a site’s authority and trust. Also, don’t let the fact that I can’t create a CSS sprite on my own fool you into thinking I don’t know how to create a cascading style sheet or how they work.

SEOs (the good ones) know enough to intervene when something looks like a disaster-in-the-making that will end up as a dumpster-fire for search in two months. You may not care, but just remember, we’re the ones who are accountable for “success” at the end of the day. You know, the “hard metrics” that clients need before renewing an engagement.

Here again, professional SEOs know exactly what we are asking for when we ask for it. Most will actually hate asking you to do it because we know how much time is involved and the rework we’re causing. But, in the end, we’re helping you to build a better site for the consumer and a site that gets found in the SERPs.

We Don’t Know The Entire Algorithm, But We Know Enough
My guess is that not even Matt Cutts is clued into the ENTIRE Google algorithm, and that guy is the Head of Search Spam at Google and (un)official Google-SEO spokesman.
Being a jackass and claiming that I don’t know the entire algorithm simply tells me you’re a defensive person (and possibly a lazy one too). We know enough of the algorithm’s factors, and whether they have a positive or negative effect on a website, that the changes we ask for have a purpose.

If not knowing the entire algorithm costs an SEO credibility in the eyes of a designer, then the same could be said of a designer who doesn’t have full knowledge of Nielsen’s heuristic evaluation. Moreover, I’ve never heard anyone say, “’Cause the algorithm, dude!” It just doesn’t happen. And if it does happen, point me in the direction of said person(s); we’ll have a chat.

SEOs will fully explain these things to you if you have the time. Seriously. Ask a question about why or how this is going to affect the website or search engine indexation. Any SEO (the good ones) will be more than happy to walk you through why it matters and why it should be the way we need it.

How Designers, Developers, and Copywriters Can Ease the Pain
Short of locking you in a room and playing professor for a few days straight, the best way, as George points out, is to communicate. That word makes it sound easy and effortless, but it’s anything but. When you’re under deadlines and shuffling between projects, ain’t nobody got time for that.

The real key is to sit down next to that SEO (that’s right, put down the email and step away) and let them walk you through it. Sure, it’s 10-20 minutes out of your day, but when you can see it from their point of view as they walk through your code or design, it breaks down barriers that email can’t. It’s that human element. It may not solve the issue, but it will help both parties gain a healthy respect for one another.

And that’s the first step to compromise. SEOs, as much as we know we need it our way because it’s right, we have to be willing to step down off the soapbox. You have to be willing to capitulate on some things in order to get the bigger win for search. You may not be able to get that direct 301 you need on duplicate content, but with communication, explanation, and compromise, you can convince that developer to work in a rel=canonical (sketched below).

SEOs need to be able to compromise, just as developers, designers, and copywriters need to learn to compromise. Because at the end of the day, it’s about creating wins for the client, not for our own egos.
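
A minimal sketch of the two options mentioned above (URLs hypothetical, an Express server assumed): a direct 301 from a duplicate URL to the preferred one, or, when a redirect isn’t feasible, a rel=canonical element the developer can drop into the duplicate page’s head.

```typescript
import express from "express";

const app = express();

// Option 1: direct 301 redirect from the duplicate URL to the canonical URL.
app.get("/old-category/widget-42", (_req, res) => {
  res.redirect(301, "/products/widget-42");
});

// Option 2: rel=canonical on the duplicate (e.g. filtered) URL, pointing search
// engines at the preferred version while the page itself keeps serving users.
const canonicalTag = `<link rel="canonical" href="https://www.example.com/products/widget-42">`;
console.log(canonicalTag);

app.listen(3000);
```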

Written by Anthony Verre
SEOs Hate Web Professionals (Mostly)

It takes at least 10,000 hours (roughly 5 years) of dedicated practice in a given field or area of expertise for a person to become truly “expert”.

Having years (and thousands of hours) of dedicated focus and practice within a specific niche is obviously highly valuable and allows a person to have a unique, proprietary perspective on that niche (and usually highly valuable expertise). But how many hours is “enough” to achieve expertise status? One take on the subject we (and many others) found interesting was Malcolm Gladwell’s in the book Outliers where he popularized the theory that 10,000 hours of dedicated practice in a given field or area of expertise allows a person to become truly “expert”.

The 10,000-hour rule: the hours of practice needed to claim expertise

5 Reasons to Diversify Your Search Strategy with PPC Advertising

By Elisa Gabbert
July 18, 2012

Yesterday we published the results of a study showing how sponsored advertisements on Google (PPC ads) are taking over territory previously reserved for organic listings, AKA “free clicks.” This is both good news and bad news for marketers. On the plus side, Google continues to roll out more and better types of search advertising to help marketers target their customers. On the negative side, you (obviously) have to pay for those clicks.

But the fact is, organic clicks aren’t really “free” either – gone are the days when it was relatively easy to rank on the first page in Google for your target keywords. Given the increasing costs and complications involved with SEO, it’s important to diversify your marketing channels. You can’t rely on organic search alone for traffic and leads – you never know when the next big algorithm update is going to murder your rankings.

Here are five reasons to shift some of the time and budget you spend on SEO to PPC.

#1: For Commercial Queries, Paid Clicks Outnumber Organic Clicks By Nearly 2 to 1

Organic clicks still account for more clicks overall in the search results – but different types of keywords have different value to businesses. For search queries that show high commercial intent – i.e., they indicate that the person searching wants to purchase something – more and more of the page (85% of above-the-fold pixels!) is devoted to sponsored listings. The organic results for transactional keywords like “best email software” or “waterproof digital camera” are mostly pushed below the fold. The top 3 ad spots for a commercial query take 41% of the clicks, and Product Listing Ads take another 20%. Overall, sponsored results account for 65% of clicks on these keywords, compared to 35% for organic results.

#2: Google’s Sponsored Ad Formats Keep Getting Better

You have minimal control over how your organic search listings appear in Google. (For example, they’ve recently started applying new titles, when they think they can serve up a better one than the title you put on the page.) But you have lots of attractive choices when it comes to ad types. Here are just a few of the ad options that Google now offers:

•Mega Site Links: This huge ad format offers up to 10 additional places to click, greatly increasing your chances of presenting a relevant link.

•Remarketing: Remarketing or retargeting allows you to track site visitors with a cookie and chase them around the Web, displaying relevant banner ads until they click and convert.

•Social Ad Extensions: With social extensions you can display who has +1’d your site, lending credibility and potential name recognition – it also makes your ad look less like an ad (see below).

#3: About Half Your Audience Can’t Tell the Difference Between Paid and Organic Search

A lot of people think that “nobody clicks on Google ads.” And it’s true that eye tracking studies suggest most people ignore the sponsored ads in the right column. However, one study showed that about half of people don’t recognize the ads above the search results as ads – in other words, they couldn’t tell the difference between the organic and paid results.

Top ads get clicked first, whether paid or organic

If users don’t know your ad is an ad, they can’t be suspicious of its intent – and why should they be, if it gives them what they want? Secure one of those coveted positions above the organic results for a commercial query, and you’ll take the lion’s share of clicks without sacrificing trust with users.

#4: SEO Is a Full-Time Job – Or Several Full-Time Jobs

As the number of sites competing for rankings has sky-rocketed, Google’s algorithms have gotten more and more complex, and it’s become much harder to achieve – and maintain – high rankings in the organic results. Where in the past businesses could get away with hiring a single SEO point person (usually a pretty junior position), now it often requires a full team to develop and execute on an SEO strategy (a content writer, a link builder, etc.). We believe that PPC – once your campaigns are set up and running – requires significantly less time to manage. According to Perry Marshall, author of The Ultimate Guide to Google AdWords, “if you focus on the areas that bring the most traffic, I find that once you find a rhythm, you can really do this with a few minutes a day, at most a few hours a week, and that’s with a large campaign with a $10,000+ spend per month.”

#5: Algorithm Updates Don’t Affect Your PPC

Google’s rolling algorithm updates ensure that SEO gets harder and more confusing over time. The Panda and Penguin updates in particular have targeted the kinds of “optimizations” that tended to work for site owners and marketers in the past. The only way to find out if Google thinks your SEO techniques are over the line (AKA “over-optimization”) is to take a hit on rankings, and then scramble to figure out – and fix – what you’ve been doing wrong. Google does suspend AdWords accounts on occasion, sometimes without clear reason, but in PPC you’re much less likely to experience major flux or drop-offs in rankings and traffic due to changes on Google’s end.

These are all good reasons to re-allocate some of your marketing budget to PPC, if you’ve been depending on SEO for traffic and lead generation. We would never advocate giving up on SEO – you won’t hear us saying “SEO is dead” anytime soon. But strive for a balance between your search marketing channels, and you can minimize the damage incurred as SEO gets incrementally harder.

Another step to reward high-quality sites


(Cross-posted on the Webmaster Central Blog)

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Of course, most sites affected by this change aren’t so blatant. A typical example is a site with unusual linking patterns: if you try to read its text aloud, you discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition.

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.

Posted by Matt Cutts, Distinguished Engineer

Googler Tips on Building a Better Post-Panda Site

In recent months the Google team has been especially focused on helping people find high-quality sites in Google’s search results. The “Panda” algorithm change has improved rankings for a large number of high-quality websites, so most of you reading this have nothing to be concerned about. However, for the sites that may have been affected by Panda, we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what you think Google’s current ranking algorithms or signals are. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?
Our site quality algorithms are aimed at helping people find “high-quality” sites by reducing the rankings of low-quality content. The recent “Panda” change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the “quality” of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.

Of course, we aren’t disclosing the actual ranking signals used in our algorithms because we don’t want folks to game our search results; but if you want to step into Google’s mindset, the questions below provide some guidance on how we’ve been looking at the issue:

•Would you trust the information presented in this article?
•Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
•Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
•Would you be comfortable giving your credit card information to this site?
•Does this article have spelling, stylistic, or factual errors?
•Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
•Does the article provide original content or information, original reporting, original research, or original analysis?
•Does the page provide substantial value when compared to other pages in search results?
•How much quality control is done on content?
•Does the article describe both sides of a story?
•Is the site a recognized authority on its topic?
•Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
•Was the article edited well, or does it appear sloppy or hastily produced?
•For a health related query, would you trust information from this site?
•Would you recognize this site as an authoritative source when mentioned by name?
•Does this article provide a complete or comprehensive description of the topic?
•Does this article contain insightful analysis or interesting information that is beyond obvious?
•Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
•Does this article have an excessive amount of ads that distract from or interfere with the main content?
•Would you expect to see this article in a printed magazine, encyclopedia or book?
•Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
•Are the pages produced with great care and attention to detail vs. less attention to detail?
•Would users complain when they see pages from this site?
Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do
We’ve been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you’ve been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Written by Amit Singhal, Google Fellow

New Dimensions in Link Building


Much has been said about the importance of a natural backlink profile since Google released the Penguin algorithm update. Is it possible to actively build links and still have a natural backlink profile?

Real link building is more important than ever

The importance of link building hasn’t decreased with Google’s Penguin update. Actually, link building has become more important than before. The difference is that link spamming doesn’t work anymore.

Blasting your link out to thousands of sites and relying on fully automated backlink networks have lost their power. Given that these links were never useful to real Internet users, it’s remarkable that Google took so long to devalue that type of backlink.

Link building is not about manipulating search rankings (surprise)

Higher search engine rankings are a natural by-product if you build links correctly. The main purpose of link building (and SEO in general) is to get targeted visitors to your website.

Backlinks can help you with the following:

  1. backlinks can increase the search engine rankings of your site
  2. backlinks can help you to get targeted visitors
  3. backlinks can help your company reputation

If you focus your link building activities on items 2 and 3, then item 1 will come naturally.

Link building can lead to a natural backlink profile

The reason why some people think that link building leads to an unnatural backlink profile is that they confuse link building with link spamming. SEO is not spamming and the whole SEO industry has suffered a lot from people using the term ‘SEO’ to sell their spamming products.

The key to a natural backlink profile is a simple question: “Could the link exist without asking for it?” Many people don’t link to your site simply because they don’t know about it or because they don’t have the time to research links about a topic.

Help these people who would naturally link to your site by informing them about linkworthy content on your site that is related to their site. Asking the right people for the right reason often leads to much better results than the scattershot approach that is used by many companies.

For the foreseeable future, good backlinks will remain one of the leading factors in Google’s ranking algorithms.