Google’s Relationship With Affiliates

Love:
–          Affiliate marketers were some of Google’s earliest advertisers. By matching commercial offers with keywords and attracting merchants to search marketing, affiliates helped build AdWords as the largest online advertising portal in the world.
–          Google utilizes the affiliate marketing model via comparison ads in the finance vertical. Google has also suggested they might move to a price-per-booking model with hotels.
–          Google invested in VigLink, an affiliate program that pays site owners for generating targeted traffic to merchant sites.
–          Google invested in WhaleShark Media, the self-proclaimed “world’s leading marketplace for coupons and deals”.
–          Google bought out BeatThatQuote.com, a UK financial comparison site.
–          For large merchants like Amazon, Google runs product search affiliate ads.
–          Google is testing the insertion of pre-filled lead generation forms directly into the search results.
–          Google’s AdSense program is the largest affiliate network, paying publishers to syndicate AdSense ads across the web. In addition, Google runs another affiliate network named “Google Affiliate Network”.
–          The Google quality rater document that was leaked earlier this year highlights the following website classifications as being legitimate for affiliate sites:
  1. Price comparison
  2. Coupons
  3. Original in-depth editorial reviews
Hate:
–          In Q4 of 2009, Google banned 30,000+ affiliate AdWords accounts without any warning or justification.
–          Ex post facto: Some affiliates are not able to restore their AdWords accounts without improving “landing page quality” on 3rd-party sites the affiliates never controlled, for ads they ran years ago, before Google’s policies changed.
These affiliate advertisers have, in effect, a lifetime ban because they don’t have control over the web properties.
–          Linking through to certain products on certain affiliate networks can lead to an AdWords account being automatically banned.
Google does not disclose which networks are problematic, thus leaving affiliates guessing as to which products they can and cannot promote.
–          Affiliate = unnecessary
At Affiliate Summit in 2010, Google’s Frederick Vallaeys stated that affiliates were largely “just an unnecessary step in the sales funnel”. He went on to say that Google prefers affiliates link directly to merchant websites.
–          Google has taken several efforts to remove affiliate marketers from the organic search results:
  1. Google strengthened duplicate content filters to eliminate feed-driven affiliate sites.
  2. Google banned many affiliate-run sites, calling them “doorway pages”.
  3. In some cases, Google later inserted their own vertical search listings in these same search results, thus removing affiliates from the search results so Google could generate more revenue.
–          Google’s quality rater documents explicitly instruct raters to designate affiliate sites in the travel niche as spam, even if they’d otherwise get a high rating for usefulness and quality.
–          Google’s reconsideration request form states, “In general, sites that directly profit from traffic (e.g. search engine optimization, Affiliate programs, etc.) may need to provide more evidence of good faith before a site will be reconsidered.”
–          Eric Schmidt frequently highlights the power of the web’s ability to allow you to merge without merging.
However, when an author linked to his own book’s page on Amazon via an affiliate link, his AdWords account was disabled. Google also sells ebooks and offers their own affiliate program!

Why Should SEOs Care about Google+?

It’s easy for a lot of Internet users to write off Google+: it’s another social platform for sharing in a world in which we already have plenty of ways to share. We have Facebook for pictures of our dogs; Twitter for news; LinkedIn for networking; Instagram for more pictures of our dogs; and Vine, Pinterest, and other platforms making a play to gather more users.
With over 500 million registered users, why aren’t more people actively using Google+? It’s no secret that Google+ consistently falls short in terms of active usage, especially compared to its main competitor, Facebook. According to Gigya blogger Victor White, Google+ accounts for only two percent of social sharing when compared to other major sharing platforms.
But, come on, it’s Google. Google makes the rules when it comes to all things Internet, so here’s the real question: should you play by its rules and get active on Google+, or continue to exclude yourself from the game by confining your meaningful contributions to Twitter, Facebook, and LinkedIn?

Google+, built for SEO

The popular Moz post “Amazing Correlation Between Google +1s and Higher Search Rankings” caused a major controversy with its revelation that, after Page Authority, Google +1s are more highly correlated with search rankings than any other factor. While correlation does not imply causation, can Google+ activity actually help increase page rank?
Google Webspam team head Matt Cutts was quick to deny that +1s are used directly in Google’s algorithm. While this may be true, mounting evidence does suggest that Google+ posts surpass other social platforms in terms of SEO benefits. Put simply, if someone who follows you on Google+ sees a post or a link to your website in Google search results, they can +1 that post and your search rankings can go up as a result.
So, if a link on Google+ could potentially be judged equivalent to any other type of editorial link, why aren’t we using it?
There are often rumblings of Google+ chasing our other favored social platforms for user attention. HootSuite CEO Ryan Holmes says as much in his post “Why Google+ Is Sneaking up on Facebook,” though he also admits that of Google+’s 500 million registered users, only 135 million are active users. As Google+ attempts to gain precedence as the source for all things content distribution, why aren’t individuals and brands leveraging their accounts?

Convince me to care about Google+

The most obvious reason to use Google+, if you don’t already, is that it’s integrated into every Google service. Just let that sink in: Google+ is connected to Gmail, YouTube, Google Drive, AdWords, Blogger, and Google Search. If that’s not enough evidence for you, according to BL Ochman of Social Media Today, Google cofounder Larry Page spelled it out for us when he said that “if you ignore Google+, Google Search will ignore you.”
So, how is the leader in user intelligence not the leader in user experience?
This is what continues to stump the Internet world. We chase Google day in and day out, creating content to align with the most recent Google algorithm updates; yet, Google+ continues to stump us. In January of 2012, Google+ account registration became mandatory for new users of Google services such as Gmail or Blogger. This would account for the huge number of accounts but the shockingly low amount of engagement. Furthermore, in any comparison with Facebook, Google+ is the underdog.

But, ultimately, Cutts says:

“If you make compelling content, people will link to it, like it, share it on Facebook, +1 it, etc. But that doesn’t mean that Google is using those signals in our ranking. Rather than chasing +1s of content, your time is much better spent making great content.”
There’s no way to know exactly how +1s affect search results; however, +1s are a vote of confidence for online content. Forget about causation or direct impact, though. A vote of confidence on Google+ has got to mean something. All signs point to Google+ as beneficial; yet, I personally can’t pinpoint the practical use of it – what need does it fill that isn’t already filled by our other social media platforms, and is there a concrete advantage to playing by Google’s rules even on Google+?
For the other side of the Google+ argument, look for Lindsey Paholski’s post next week about applications of Google+ and how to make the most of the platform.

Google’s Matt Cutts SEO Advice On Unavailable E-Commerce Products

In a video, Google’s Matt Cutts answered what webmasters and site owners should do about out-of-stock products on their e-commerce sites.
Matt Cutts basically said it depends on the size of the e-commerce site. He broke it down into three sizes: small sites with tens of pages, medium sites with thousands of pages and massive sites with hundreds of thousands of pages or more.

Small E-Commerce Sites

Small sites that sell items such as handmade furniture should, when showcasing a product that is out of stock, link to related products. This way, the customer can see that the owner can make or design something like the item displayed while also seeing other products that are in stock and can be purchased today.
Of course, it may make sense to add a manufacturing time next to the items that are out of stock.

Medium E-Commerce Sites

For the normal, medium-sized e-commerce site that sells thousands of products, some of which are out of stock, the site owner should return a 404 – page not found – for the products that are out of stock.
That is unless you know the date that the products will come back in inventory. If you know when the products will come back in inventory, inform the customer on the site and let them choose if they want to order it for later delivery or not.
Otherwise, 404 the page, because it can be frustrating for a customer to land on a product page for something they cannot buy.

Large E-Commerce Sites

For really large e-commerce sites with hundreds of thousands of pages, such as Craigslist, you should set the date the page will expire using the unavailable_after meta tag. This way, when the product is added, you can immediately set when that product page will expire based on an auction date or a go-stale date.
This information is treated as a removal request: it will take about a day after the removal date passes for the page to disappear from the search results. Google currently only supports unavailable_after for Google web search results.
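As a sketch, the tag goes in the page’s head section; the date below is an invented placeholder:

```html
<!-- Hypothetical product page: tell Googlebot to drop this page
     from search results after the listing's expiry date -->
<meta name="googlebot" content="unavailable_after: 25-Aug-2014 15:00:00 EST">
```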

12 Ways to Increase Traffic From Google Without Building Links

Link building is hard, but it’s not the only way to make traffic gains in Google’s search results.
When I first started SEO, building links wasn’t my strong suit. Writing outreach emails terrified me, and I had little experience creating killer content. Instead, I focused on the easy wins.
While off-page factors like links typically weigh more heavily than on-page efforts in Google’s search results, SEOs today have a number of levers to pull in order to gain increased search traffic without ever building a link.
For experienced SEOs, many of these are established practices, but even the most optimized sites can improve in at least one or more of these areas.

1. In-depth articles

According to the MozCast Feature Graph, 6% of Google search results contain In-depth articles. While this doesn’t seem like a huge number, the articles that qualify can see a significant increase in traffic. Anecdotally, we’ve heard reports of traffic increasing up to 10% after inclusion.
By adding a few signals to your HTML, your high quality content could qualify to appear. The markup suggested by Google includes:
  • Schema.org Article markup (NewsArticle works too)
  • Google+ Authorship
  • Pagination and canonicalization best practices
  • Logo markup
  • First click free – for paywall content
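A minimal sketch of the first item on the list, using schema.org microdata; the headline, image, and date are invented examples:

```html
<!-- Hypothetical article page marked up with schema.org Article microdata -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example Headline</h1>
  <img itemprop="image" src="lead-image.jpg" alt="Lead image">
  <time itemprop="datePublished" datetime="2014-01-15">January 15, 2014</time>
  <div itemprop="articleBody">
    <p>Article text…</p>
  </div>
</article>
```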
While Google seems to favor authoritative news sites for In-depth Article inclusion, most sites that might qualify don’t have the proper semantic markup implemented.

2. Improving user satisfaction

Can you improve your Google rankings by improving the onsite experience of your visitors?
In many ways the answer is “yes,” and the experience of several SEOs hints that the effect may be larger than we realize.
We know that Google’s Panda algorithm punishes “low-quality” websites. We also know that Google likely measures satisfaction as users click on search results.

“… Google could see how satisfied users were. … The best sign of their happiness was the “long click” – this occurred when someone went to a search result, ideally the top one, and did not return.”

-Steven Levy, from his excellent book In the Plex

The idea is called pogosticking, or return-to-SERP, and if you can reduce it by keeping satisfied visitors on your site (or at least not returning to Google to look for the answer somewhere else), many SEOs believe Google will reward you with higher positions in search results.
Tim Grice of Branded3 reports a saying they have at their SEO agency:

“If you have enough links to be in the top 5, you have enough links to be position 1″

While we have no direct evidence of pogosticking in Google’s search results, we’ve seen enough patents, interviews, and analysis to believe it’s possibly one of the most underutilized techniques in SEO today.

3. Rich snippets from structured data

Google constantly expands the types of rich snippets it shows in search results, including events, songs, videos, and breadcrumbs.
The first time I heard about structured data was from a presentation by Matthew Brown at MozCon in 2011. Matthew now works at Moz, and I’m happy to glean from his expertise. His Schema 101 presentation below is well worth studying.
Schema and Open Graph 101 – SMX Munich from Matthew Brown
If you’re just getting started, check out this amazingly helpful Guide to Generating Rich Snippets from the folks at SEOgadget.
Two of our favorite types of markup for increasing clicks are videos and authorship, so we’ll discuss each below.

4. Video optimization

Pixel for pixel, video snippets capture more search real estate than any other type of rich snippet, even more than authorship photos. Studies show our eyes go straight to them.
Eye-Tracking Google SERPs – 5 Tales of Pizza
Unlike author photos, video snippets are often easier to display and don’t require connecting a Google+ account.
Video snippets generally require creating a video XML sitemap and adding schema.org video markup.
To simplify things, many third party services will take care of the technical details for you. Here at Moz we use Wistia, which creates a sitemap and adds schema.org markup automatically.
Pro tip: Both schema.org and XML sitemaps allow you to define the video thumbnail that appears in search results. As the thumbnail highly influences clicks, choose wisely.
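A minimal sketch of the schema.org video markup described above, with invented names and URLs (a video XML sitemap entry carries the same fields):

```html
<!-- Hypothetical video embed with schema.org VideoObject microdata;
     all names and URLs are placeholders -->
<div itemscope itemtype="http://schema.org/VideoObject">
  <h2 itemprop="name">How to Make Pizza</h2>
  <!-- The thumbnail that may appear next to the search result -->
  <meta itemprop="thumbnailUrl" content="http://www.example.com/pizza-thumb.jpg">
  <meta itemprop="embedUrl" content="http://www.example.com/player?id=123">
  <p itemprop="description">A short walkthrough of making pizza at home.</p>
</div>
```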

5. Google authorship

Scoring the coveted author photo in Google search results doesn’t guarantee more clicks, but getting the right photo can help your click-through rate in many results.
What makes a good author photo? While there are no rules, I’ve personally tested and studied hundreds of photos and found certain factors help:
  • Use a real face, not a company logo, cartoon or icon
  • High contrast colors. Because the photo is small, you want it to stand out with good separation between the background and foreground.
  • Audience targeted. For example, young Disney fans are probably less likely to click on an old guy in a suit who looks like a financial adviser.
Google recently got more selective about the author photos it chooses to show, but if you implement authorship correctly you may find yourself in the 20% (according to MozCast) of all search results that include author photos.
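For reference, authorship is implemented by linking each page to a Google+ profile (the profile ID below is a placeholder), with the profile linking back to the site in its “Contributor to” section:

```html
<!-- Hypothetical authorship link in the page <head>;
     the Google+ profile ID is a made-up example -->
<link rel="author" href="https://plus.google.com/112345678901234567890">
```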

6. Improving site speed

Improving site speed not only improves visitor satisfaction (see point #2), but it may also have a direct influence on your search rankings. In fact, site speed is one of the few ranking factors Google has confirmed.
One of the interesting things we learned this year, with help from the folks at Zoompf, is that actual page load speed may be far less important than Time to First Byte (TTFB). TTFB is the amount of time it takes a server to first respond to a request.
As important as page speed is for desktop search, Google considers it even more important for mobile devices. Think about the last time you waited for a page to load on your cell phone with a weak signal.

“Optimizing a page’s loading time on smartphones is particularly important given the characteristics of mobile data networks smartphones are connected to.”

– Google Developers

Suggested tool: PageSpeed Insights

7. Smartphone SEO

Aside from speed, if your website isn’t configured properly for smartphones, it probably results in lower Google search results for mobile queries. Google confirms that smartphone errors may result in lower mobile rankings.
What is a smartphone error? It could include:
  • Redirecting visitors to the wrong mobile URL
  • Embedding a video that doesn’t play on a particular phone (Flash video on an iPhone, for example)
  • Pop-ups that aren’t easily closed on mobile
  • Buttons or fonts that are too small on a mobile device
Google recommends making your site responsive, but many of the top brands in the world, including Apple.com, don’t have responsive sites. Regardless, a good mobile experience is imperative.
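A responsive configuration starts with a viewport declaration and CSS media queries; this is a minimal sketch, and the 480px breakpoint is an arbitrary example:

```html
<!-- Hypothetical responsive setup: declare the viewport,
     then stack the sidebar under the main content on narrow screens -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  @media (max-width: 480px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```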

8. Expanding your international audience

Does your website have traffic potential outside your existing country and/or language?
Our international experts like Aleyda Solis know this well, but folks inside the United States have been slow to target specific languages and countries with SEO.
Oftentimes, the opportunities for appearing in international search results are greater than those within your own borders, and the competition is sometimes less. To see if it’s worth your while to make an investment, check out this International SEO Checklist by Aleyda (who is also a mobile SEO expert—it’s so unfair!)

9. Social annotations with Google+

When you share content on Facebook and Twitter, your network basically sees it only when they are looking at Facebook and Twitter.
On the other hand, when you share content on Google+, your network can see it every time they search Google.
Google’s own research shows that users fixate on social annotations, even when presented with videos and other types of rich snippets.
The easiest way to take advantage of this is to expand your Google+ network and share good content regularly and often. Rand Fishkin elegantly explains how to use Google+ to appear in the top of Google results every time.
Additionally, content shared through Google+ often ranks in regular search results, visible to everyone on the web, regardless of their social connections.

10. Snippet optimization

This goes back to basic meta tag and title tag optimization, but it’s a good practice to keep in mind.
In the past two years, Google changed the maximum length of title tags so that it’s no longer dependent on the number of characters, but on the number of pixels used, generally around 500 pixels in length. This keeps changing as Google tests new layouts.
Because 500 pixels is difficult to gauge when writing most titles, the best advice is still to keep your titles between 60-80 characters, or use an online snippet optimization tool to find your ideal title tag length.
Google also updated its advice on meta descriptions, further clarifying that duplicate meta descriptions are not a good idea. Matt Cutts tells us that if you can’t make your descriptions unique for each page, it’s better to have none at all.

“You can either have a unique meta tag description, or you can choose to have no meta tag description.”

Google’s Matt Cutts

Given that duplicate meta descriptions are one of the few HTML recommendation flags in Webmaster Tools, does this indicate Google treats repetitive meta descriptions as a negative ranking factor? Hmmm….
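The advice above boils down to something like the following head fragment; the title and description are invented examples:

```html
<!-- Hypothetical snippet-friendly <head>: a title within the 60-80
     character guideline and a meta description unique to this page -->
<title>Handmade Oak Desks – Custom Furniture | Example Co.</title>
<meta name="description"
      content="Browse our handmade oak desks, built to order and shipped within two weeks.">
```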

11. Updating fresh content

Websites that stop earning new links often lose ground in Google search results. At the same time, sites that never add new content or let their pages go stale can also fall out of favor.
Freshening your content doesn’t guarantee a rankings boost, but for certain types of queries it definitely helps. Google scores freshness in different ways, which may include:
  • Inception date
  • The amount (%) your content changes
  • How often you update your content
  • How many new pages you create over time
  • Changes to important content (homepage text) vs. unimportant content (footer links)
Recommended reading:  The Ingredient for tasty SEO

12. Ongoing on-page SEO

The factors listed here only scratch the surface of earning more real estate in search results. Issues such as indexing, crawling, canonicalization, duplicate content, site architecture, keyword research, internal linking, image optimization and 1,000 other things can move ranking mountains.
The job of the Technical SEO becomes more complex each year, but we also have more opportunities now than ever.
It’s easy to think nothing is new in SEO, or that SEO is easy, or that Google will simply figure out our sites. Nothing is further from reality.
The truth is, we have work to do.

Google’s 2014 Redesign: Before and After

Over the past few months, Google has been testing a redesign of both their overall SERP format and their AdWords blocks. In the past day or two, it appears that they’ve rolled these changes out to a large part of their audience. While we still have a chance to grab before and after versions of the SERPs, I thought it would be worth a quick stroll down memory lane and a look at the future of Google.

I. Basic search result

Let’s start with a pretty basic search result, a query for [pygmalion]. Here’s the before and after:
The title font in the new version is slightly bigger, and Google has done away with the underlining. Interestingly, the source URL is actually a little smaller. The snippet and mini-links seem to have remained the same.

II. Expanded site-links

Here’s a #1 result with expanded site-links. The query is [carolina place mall]:
Like the main result, site-links are also getting the larger title font without underlines. This example also clearly shows that some title tags will get cut off with the new, larger font. This could impact click-through rates, so you may want to consider shorter titles going forward (at least for critical pages).
Notice the faint horizontal divider at the bottom. This sets the expanded #1 result apart from the rest of the SERP. These horizontal dividers are used frequently in the new design, and I strongly believe that they are a move toward a more card-like look (akin to mobile, Google+, and Google Now).

III. Image vertical results

This is what the new image vertical results look like. The query is [roger williams university]:
The new format has the new font, plus a fairly pronounced “More images…” link. Again, the vertical results are separated (above and below) by a horizontal divider. The images themselves appear to be formatted the same.

IV. News vertical results

Here’s a query for [wtop traffic], showing the redesigned news vertical results. Note that these were captured on different days, so the actual articles have changed—the count/layout are equivalent, though:
All articles links are using the larger font (with the same implications for length/wrapping). Like image vertical results, news results get a top and bottom divider. In general, you can see that almost every type of result is taking up significantly more vertical space.

V. Local pack results

Here’s a 3-pack of local results, for the query [lands end] and focused on San Diego, CA:
Larger font, no underlines, horizontal dividers—you know the drill. Note the lighter-gray text on the actual location information (address and phone).

VI. In-depth articles

Here’s a look at Google’s newest vertical, in-depth articles. The query is [palm oil]:
The redesign pretty much follows the pattern of the other verticals. Note that the actual header font—”In-depth articles”—is a bit smaller and slightly grayed out.
Google has been testing many variations of in-depth articles, and all of them suggest that this expanded format may be replaced with something more Spartan. Here’s a recent test (this is not live, and this design will likely change), for the query [foreclosure]:
While this test format follows the rules of the redesign, it is in every other way dramatically different from Google’s current treatment of in-depth articles. Note that this test version appeared in the “#2” slot (right after the first organic result), whereas current in-depth article blocks usually appear at or near the end of page 1. Expect in-depth articles to get a major overhaul in the next few months.

VII. Video thumbnails

In 2014, video results are really more of an enhancement than an actual vertical. Here’s a quick before and after for the query [wild kratts]:
This is essentially just an organic result, with a bit of information and a thumbnail added—the general layout and thumbnail characteristics have remained the same. This is also true of authorship results and review snippets—the title and URL fonts have changed, but the general layout, thumbnail size, etc. seem to all be the same.

VIII. AdWords (top)

On top of the general design change, Google has been testing a new AdWords format for months—these may be rolling out together, but the tests themselves have been separate. Here’s a reasonably complex AdWords block from the top of a query for [keens]:
In addition to the larger, non-underlined titles and horizontal divider, the colored background is gone, and a yellow [Ad] box appears next to each individual ad. The “Ads related to…” text has been removed as well.

IX. AdWords (right)

The AdWords block in the right-hand column has also changed, but the difference is a bit less dramatic. Here’s the same query ([keens]):
There’s just one yellow [Ads] label for the entire block, and there’s no change to the background (because the old version didn’t have a colored background). The new fonts do expand the titles significantly and increase the vertical area of the total ad space.
Note that the AdWords block on the bottom of the left-hand column looks very similar to the redesigned top AdWords block. Other SERP elements, including the knowledge panel, answer boxes, paid shopping, and carousels seem to have been unaffected by the redesign (so far).

It’s in the cards

Back in November, I predicted that Google would move toward a more card-like format in 2014. While my future SERP concepts were heavily influenced by mobile and Google Now and are more extreme than the current redesign, don’t overlook the way Google is using dividers to separate out SERP elements. As mobile and tablet use proliferates, and new devices like Glass come into play, Google wants to have SERPs that they can easily mix-and-match, providing whatever combination is most relevant for each device and situation. For now, desktop remains a fixed, two-column format, but Google’s design decisions are being driven more and more by mobile devices, and the future is in individual information elements that can be easily rearranged.
To see this idea in action, here’s a local (Chicago suburbs) search for [starbucks]. Notice how the dividers separate the expanded top ad, the expanded #1 result, a local 3-pack, a news box, and, finally, the rest of the organic results:
While a horizontal line might not seem like a big change, Google is clearly working to carve up the SERP into units that can potentially be mixed and matched. Also note where “#2” is on this page. As simple as they may seem, these design changes are redefining organic results.

Do you like it?

Trick question—no one cares. Sorry, that was a bit harsh, but here’s the reality: Google has been testing this for months across what are probably millions of unique visitors. A few dozen marketers complaining about the new design is not going to sway their decision. At this point, the decision is 98% made, and it’s made based on Google’s goals and Google’s data. The best you can do is try to assess how these changes impact your bottom line and adjust accordingly. Don’t waste your time shouting at the wind.
One final note: While this redesign seems to be rolling out, Google has not officially confirmed the change and it may still be in testing (albeit widespread testing). I wanted to put together a post while we could still compare and contrast the before and after versions, but this design could still change over the next few days, weeks, or months.

Google: Sites Penalized for Long-Term Spam Tactics Might Never Recover

If you have a website that has been spamming for years, and you’re now attempting to clean it up to get back in Google’s good graces, you probably have a harder time ahead of you than the average site.
According to Google’s Distinguished Engineer Matt Cutts, longtime spamming sites will find it more difficult to get back into Google’s search results.
Marie Haynes of HisWebMarketing, and a Search Engine Watch contributor, has been trying to clear a penalty for one client’s site but has had some difficulties despite submitting a reconsideration request. It seemed the issue was that her client had been spamming for so long, and to such a degree, that traditional link removal methods weren’t strong enough.
“… So I worry that you haven’t truly gotten through to your client, who shows signs of long-standing, mass, deliberate spam,” Cutts tweeted, later adding, “Just want to make sure they have a clear-eyed view of the hole they dug for themselves over the years.”
Haynes then asked whether it would be possible to get the penalty lifted for the site at all.
“It’s possible (as John Mueller points out), but it could be quite difficult to undo all the spam across the years,” Cutts tweeted.
Mueller has mentioned before, and recently reconfirmed, that sites could have such a bad reputation, and be so deeply into spam that it’s next to impossible to dig out of the Google penalty hole:
It’s never a decision to make lightly, but there can be situations where a website has built up so many problems, that it may appear easier or faster to start over with a fresh & new website, rather than to try to fix all of those problems individually. This isn’t an easy way to get past problems that have been built up over the years, it’s a lot of work to create a new website, even if you already know the business area.
If you feel you’re in this situation, make sure to get advice from friends & other people that you trust (including webmaster communities where you trust their opinions) before doing anything drastic!
That said, payday loans has been one of those spaces that has been full of spam for years and years, rivaling spam in areas such as prescription drugs and gambling sites.
“But then really, prior to this year, how many payday loans companies do you know of that ranked without using spam?” Haynes said. “I’m not saying that what they did was right, but it’s what they felt they needed to do in order to rank.” And Haynes is definitely right.
When faced with such a serious long-term spam problem, and considering the above comments from Google, you have to seriously consider whether it’s best to start fresh on a new domain.
Cutts also confirmed something that many webmasters, particularly spammers, have suspected for years: a spammy website can contaminate other websites that share the same address and company info.
Cutts pointed out that the site shared the same business information as other payday loan sites, particularly spamming ones. Here are Cutts’ tweets:
@Marie_Haynes e.g. notice that http://www.kwikcash.co.uk/ has the same address and the same company registration number. (source)
@Marie_Haynes and make sure to press your client about exactly how many “quick case” sites they own, because it appears to be several. (source)
“As Matt pointed out, they have a history of creating unnatural links that goes back a few years. Google also has concerns with the fact that they have more than one business operating from the same address,” Haynes said. “This doesn’t mean that everyone who runs multiple businesses from their home or one address needs to be afraid that Google is going to penalize their site. But, it’s possible that this is a factor that Google uses when trying to determine the validity of a payday loans site.”
Webmasters dealing with a spam issue should separate any related sites where possible. That includes details such as matching affiliate codes, contact information, and WHOIS records.

Up-Close @ SMX West: Time To Think About Life Beyond Google

With algorithm updates happening more frequently than ever and Google continuously working to keep searchers on the result page, companies need to make a conscious effort to broaden their marketing efforts and move beyond Google.
In the “Life Beyond Google: Diversifying Your Efforts” session at SMX West 2014, the speakers showed us how to do just that. While there were a ton of great tips, three main themes stood out:

Focus on the Customer

Our customers are consuming content in a variety of ways that don’t include search engines and it’s up to us as marketers to give them the content they want, where they want it.
According to Ted Ives, Owner of Coconut Headphones, this really boils down to the notion of “What are we going to communicate to whom, and how?” To figure out the answer to those questions, there are three things to focus on:
  • Message
  • Channel
  • Audience
By understanding your audience, you can start crafting your message and identifying the channels to place that content.
Ives suggests starting with a content marketing process. Decide on what type of content you’ll be creating, where you’ll be promoting it, and how you can make the content creation process routine. Be sure to design the process with realistic expectations. After all, it doesn’t make sense to come up with a process that can’t actually be executed by your team.
Once you have that process in place, start by creating one core message that can then be used in different ways. Ives gave a great example of creating a whitepaper and then repurposing it into a webinar, podcast, blog post series, newsletter, tradeshow presentation and more.
Don’t think you have the resources for all of that? Ives pointed out that you only need your thought leaders involved in the original piece. Once you have that piece to build off of, utilize your junior people to create the next phases of content.
The best part of this process is you have now created a number of content pieces that can be shared across multiple platforms with multiple audiences.
By using this method, you naturally diversify your marketing.

Build Relationships With Influencers

With the advent of social media there is a huge opportunity to expand your reach beyond Google. However, just throwing some content up on social media networks isn’t going to do it.
Eric Enge, Owner of Stone Temple Consulting, says the key to success beyond Google is establishing brand authority and reaching influencers. The company you keep defines you even on the web, so it’s important to align yourself with the right people. When an algorithm update occurs or Google stops showing your site, the relationships you have built in other places are going to be what’s sending you traffic.
Building relationships with influencers is easier said than done, right? You don’t just suddenly become best friends with people you meet online. Relationships take time, and higher-value relationships take more effort. You have to get to know them, interact with them, offer them value and build up trust. Enge says to climb the ladder one step at a time: get in front of them often, go deeper than “check this out,” and give them something they haven’t seen.
He also notes that content is key. Great content is “a gift to social media” and will be the backbone of your success so make sure your social and content strategies are aligned. Know the type of content that does well on each platform and know the type of content your influencers (and audience) like on each platform. For example, images tend to do much better on social than a standard link. Create images that add to your content and can be shared on Twitter or Google+.
Brands must become authorities and influencers are the key to this.

Act Like The Stock Market

As mentioned earlier, we live in an unpredictable time when it comes to Google. Updates are occurring often and having major effects, sites are being pushed below the fold, and keyword data is disappearing into [not provided]. What can you do about this?
Joshua Moody, Lead Enterprise Digital Marketer, 97th Floor, says to act like the stock market. Mitigate risk by diversifying your traffic sources.
Start by going social and thinking visually. Social postcards, or micrographics, are short powerful graphics that can be shared all over the place. Moody says that they are using them for a number of clients and they share extremely well on social. They can be pinned, used as Twitter cards, or integrated into Open Graph tags for Facebook. People will also display them on their own sites, creating links to the main site.
SlideShare can also add social value. An old piece of content sitting around can be repurposed into a SlideShare and promoted on your blog, Twitter or LinkedIn.
Moody also recommends becoming a part of communities that already have traffic. BuzzFeed, for example, had 941 million pageviews in the past 30 days. Imagine getting even a part of that. Apparently, you can: the BuzzFeed community allows anyone to sign up and start posting. If your post gets enough traffic and shares, it may be promoted to the home page, which is where the real results come from, according to Moody. Even 5% of total views can translate into huge amounts of referral traffic. The key is to create robust, picture/gif-heavy posts.
Becoming part of different communities and placing your brand in a variety of places decreases your risk for disaster and opens you up to new audiences.

Google Working On A Softer & Gentler Panda Algorithm To Help Small Businesses

Google’s head of search spam, Matt Cutts, announced at Search Marketing Expo that his search team is working on the “next generation” Panda update, one that would appear to many as being softer.
Cutts explained that this new Panda update should have a direct impact on helping small businesses do better.
One Googler on his team is specifically working on ways to help small web sites and businesses do better in the Google search results. This next generation update to Panda is one specific algorithmic change that should have a positive impact on the smaller businesses.
Matt Cutts didn’t say when the new update will come out, only that his team is currently working on it. My feeling is that launch is still a ways off, maybe two to three months at best, but that is just my gut.
This would not be the first time Google released a softer Panda update. They did a softer update to the Panda algorithm possibly in July of last year.
Now Panda is more of a monthly rolling update and Google is unlikely to confirm future Panda updates.
Learn more about Google Panda updates.

Google Penalizes Two German Link Networks; One Being efamous

Google’s head of search spam, Matt Cutts, announced on Twitter today that they have penalized and taken action on two additional German link networks. He mentioned one by name: efamous.
efamous looks like a typical ad network, but I guess they passed PageRank through their link network, which is against Google’s guidelines. Cutts also said they went after another “German agency network,” but did not disclose which one that was.
A month ago, Google penalized a German agency and their clients for unnatural links. Matt Cutts has specifically been giving Germany warnings about their use of links to manipulate the Google search results.
Google recently gave the same warning to Italian and Spanish SEOs as well as Polish link networks.

Shopping Comparison: TheFind Finds Lowest Price More Often Than Google, Bing, Others

In December Danny Sullivan sought to determine which of the major shopping search engines found the best prices for common products. The shopping engines compared in Danny’s articles were Nextag, Shopzilla, PriceGrabber, Google, Bing and TheFind:


TheFind’s CEO Siva Kumar saw these articles and developed an internal tool to compare his site to these and other online shopping competitors.

TheFind later decided to make that internal comparison tool public facing so that anyone could try it and compare product search results across sites. Clicking any of the branded links above the main results opens a window showing the same query on the selected site.
TheFind contacted us saying that it had been running tests of its results against the shopping engines discussed in Danny’s original articles. It looked at 50 queries in six shopping categories, using the same approach and methodology pursued in Danny’s articles.
The questions asked in TheFind’s research were:
  • Which engine offers the lowest price on the same item?
  • How many stores are returned for a given result?
  • How often are product reviews, offline stores, coupons and other enhanced content featured?
Source: TheFind
The chart above reflects the percentage of times each engine had the lowest price for the 50 sample queries. TheFind said that 78 percent of the time it found the lowest price. By comparison, Google had the lowest price 42 percent of the time. Yahoo performed worst with the lowest price in only 8 percent of queries.
These figures sum to more than 100 percent because multiple engines may have “tied” in any given case. In other words, TheFind, Google and Shopping.com might all have found the best available price on the same item.
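To see how ties push the totals over 100 percent, here is a minimal sketch with made-up prices (not TheFind’s actual data or methodology): every engine matching a query’s lowest price is credited with a “win,” so one query can count toward several engines at once.

```python
# Hypothetical prices for 3 queries across 3 engines (illustrative only).
prices = {
    "q1": {"TheFind": 10.0, "Google": 10.0, "Bing": 12.0},
    "q2": {"TheFind": 8.0,  "Google": 9.0,  "Bing": 9.5},
    "q3": {"TheFind": 5.0,  "Google": 5.0,  "Bing": 5.0},
}

wins = {engine: 0 for engine in ["TheFind", "Google", "Bing"]}
for quotes in prices.values():
    lowest = min(quotes.values())
    # Every engine that matches the lowest price counts as a win,
    # so a single query can credit several engines at once.
    for engine, price in quotes.items():
        if price == lowest:
            wins[engine] += 1

pct = {e: 100 * w / len(prices) for e, w in wins.items()}
print(pct)  # the percentages here sum to 200, not 100
```

With these toy numbers TheFind wins all three queries, Google two, and Bing one, so the per-engine percentages add up to 200 percent even though each query has exactly one lowest price.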
Source: TheFind
The flip side of the lowest-price comparison is one involving “price premiums.” The chart above reflects how much users will pay above the lowest available price by using each shopping engine. (As the overall price winner TheFind used itself as the baseline.)
Because Bing and Yahoo had the lowest prices in only 14 percent and 8 percent of cases, for example, users who buy through them are likely to pay a premium: 25 and 31 percent more on average respectively.
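TheFind didn’t publish the exact formula behind its premium figures, but a plausible way to compute such a number is to average each query’s markup over the lowest available price. A sketch with invented prices (again, not TheFind’s data):

```python
# Hypothetical example: per-query lowest price found anywhere,
# versus the price the engine being measured actually showed.
lowest = [100.0, 40.0, 20.0]   # best price across all engines
shown  = [130.0, 52.0, 26.0]   # price on the engine being measured

# Average markup over the lowest available price, per query.
premium = sum((s - l) / l for s, l in zip(shown, lowest)) / len(lowest)
print(f"{premium:.0%}")  # prints "30%"
```

Under this reading, a “25 percent premium” means that across the sample queries, the engine’s prices averaged 25 percent above the best price available anywhere.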
One of the explanations that TheFind offers for its performance vs. the others is the fact that it indexes more stores and product sources. The chart below reflects the average number of sources found for the 50 queries examined.
On average TheFind offered 65 stores or sources for each of the 50 queries. Shopzilla by comparison offered an average of five.
TheFind also explored additional or enhanced content in search results. The company looked at availability of coupons, product reviews and local stores that carried the subject products. Across the board TheFind outperformed its competitors.
Source: TheFind
It’s important to reiterate that the data and tests above were all performed and provided by TheFind; we did not independently verify the accuracy of these results. However, I spoke to the company at length about its testing and methodology, and I don’t believe the company selected the queries based on a known outcome or otherwise artificially skewed the results.
The fact that TheFind’s comparison tool, Compare.TheFind.com, is publicly available allows skeptics and anyone else (including competitors) to explore and verify the results. Let us know if you do and find anything that contradicts the data above.