Search Engine Optimization and Marketing

A well-known view on search engines and search engine marketing and optimization


Tuesday, December 31, 2013

To Survive You Must Give Google Your Structured Content & Data?

Greg Niland, aka GoodROI, posted a thread at WebmasterWorld arguing that if you do not hand over your data and content to Google in a structured format, then you will die a slow death.
Greg said:
Some of you are probably thinking that using schema or HTML 5 simply makes it easier for the all-evil Google to steal your content. I'll be honest, you do have a valid concern, but if you ignore it, it will not go away. Your competition will do it and Google will get the content from them and give them the traffic. You need to figure out a new strategy for your site to deal with the current situation so you can incorporate these best practices and still profit. Maybe something like putting the best content behind a paid wall? Get creative!
Holding on to old techniques and technology is not a smart idea. It is like insisting on only selling printed newspapers and ignoring the fact that the world has changed and now people like to get their news online. Change is not fun but staying up to date on best practices is vital to success.
There are many SEOs and webmasters out there who don't want to give Google all their data only to see it used in the knowledge graph without any link credit or potential to make money. But if your competitors do it, you will lose out anyway.

It is a hard call to make. You have data, the data is awesome, and Google doesn't have access to it. Do you give it to Google in a structured format, such as we discussed with searching within apps, or do you withhold it?
Is there a win-win solution to this or is the only winner here Google?

Do you need to give Google your content in a structured format for your site to rank? If so, how long until Google replaces your site with the knowledge graph? Are there ways to benefit without handing over the keys to your business?
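For readers wondering what handing over "structured content" actually looks like, here is a minimal, hypothetical schema.org microdata sketch for a product page (the product name and values are made up, not from Greg's thread):

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Acme Anvil</span>
      <img itemprop="image" src="/images/acme-anvil.jpg" alt="Acme Anvil">
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <span itemprop="price">49.99</span>
        <meta itemprop="priceCurrency" content="USD">
      </div>
    </div>

Markup like this is exactly the trade-off Greg describes: you become eligible for richer treatment in the search results, and Google gets your data in a form it can easily reuse.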

These are the decisions SEOs are making in 2013 and 2014.

by Barry Schwartz

Google: It's Hard To Recognize Your Large Photos Because Of Your JavaScript Links

A webmaster is upset that his images aren't being indexed from his site. He posted a complaint in the Google Webmaster Help forums, in which we learn two things:

(1) This webmaster was blocking Google from indexing those images with his robots.txt file. This actually happens way too often.

(2) Sometimes using JavaScript for the "View Larger Image" link can make it hard for Google to index those images. Google does a pretty good job with JavaScript, but why not make it easier? Google's John Mueller said, "It seems that we're not picking up that these URLs are actually not pages but rather images -- and images without a landing page on top of that. One way to let us know about the actual connection is to use a direct link to the image (using target=_blank is fine, if you want to open them up in a new window/tab)."

So if you want your images indexed, first make sure you allow Google to index them and then make sure the crawlers can process the links to them.
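As a rough sketch of the second fix (example.com and the file names are hypothetical), swap the JavaScript-only link for a direct link to the image file, per John's advice:

    <!-- Instead of a JavaScript-only link: -->
    <a href="#" onclick="showLargeImage('photo-123'); return false;">View Larger Image</a>

    <!-- Link straight to the image file: -->
    <a href="http://www.example.com/images/photo-123-large.jpg" target="_blank">View Larger Image</a>

And of course the first fix is simply making sure robots.txt does not disallow the directory those image files live in.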

by Barry Schwartz

Thursday, December 26, 2013

When You Have Bad Links Impacting Your Google Rankings, Which Do You Remove?

I see this all the time: a forum thread where a webmaster knows his rankings are suffering in Google because he was hit by Penguin over a lot of really bad, manipulative links. A WebmasterWorld thread sums up the predicament such a webmaster is in.

(1) They hired an SEO company.
(2) That SEO company ranked them well for years.
(3) Then Penguin smashed their links.
(4) They no longer rank as well.
(5) They are upset with the SEO company.
(6) They need to figure out how to rank again.
(7) Removing the links is the only option.
(8) But removing the links that produced their initial good rankings won't help them rank immediately.

In this thread, the site owner sums it up as:
1) What is the sure proof way to make sure a link is 100% bad?
2) I don't want to remove all links cause I am worried my site will drop even more. I'm sure there are some semi-good links that might be helping.
3) After submitting disavow file, typically how long does it take to recover? We have two sites, one seems to be under penguin and panda updates and the other received a manual penalty for certain bad links for certain terms.
It is sad, indeed. But you need to disavow the links, that is for sure. Those links are not helping you and they are now hurting you. Remove the hurt. Then get people to link to you because they want to link to you.
But which links should you remove? Which links are actually hurting you? That is the hard question. One SEO offered his advice:
the best advice I think I can give is to disavow the "obviously bad" links, but keep the ones you think are "grey" or "borderline" and see if you recover -- Basically, in your situation, meaning you don't "know" what's good and what's bad for sure, I'd "start with the obviously bad" and then "keep going" if necessary.
Of course, there are tools, like Link Detox, Majestic SEO, Ahrefs, Moz and others. But we are assuming you have the tools already, or that you manually go through all your links within Google Webmaster Tools. And when you disavow, make sure to disavow at the domain level.
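For reference, a domain-level disavow file is just a plain text upload; a minimal sketch with hypothetical domains might look like this:

    # Paid directory links found in Webmaster Tools; owners contacted, no response
    domain:spammy-directory-example.com
    domain:cheap-links-example.net
    # Individual URLs work too, but domain-level entries catch every page on the domain
    http://www.another-example.org/paid-links-page.html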

by Barry Schwartz

Tuesday, December 24, 2013

Matt Cutts: Google's Hummingbird Algorithm Affects 90% Of Searches

I keep coming back to episode 227 of TWiG; in the video, Matt Cutts talks about the Hummingbird algorithm at exactly one hour and twenty minutes in. He spent maybe a minute or so on it.

Matt Cutts said that the Hummingbird algorithm actually affects 90% of all searches, but only to a small degree. So while Panda may have impacted 10% or so of queries and Penguin closer to 3% or so, Hummingbird impacted 90% -- though, again, only to a degree that users should not notice.

Here is the snippet of what Matt Cutts said:
Hummingbird affects 90% of all searches but usually to a small degree because we are saying, this particular document isn't really about what the user is searching for.
I know Google has told us searchers and SEOs should not have noticed any impact to rankings and traffic based on this algorithm update. But we suspected it may have impacted some.
With such a large footprint, 90%, it had to have some impact.

by Barry Schwartz

Google Fetch As Googlebot Won't Crawl When Site Is Too Slow

A Google Webmaster Help thread has one webmaster complaining his site isn't being crawled by Google and isn't showing up in the search results. The reason: his site can't handle Googlebot crawling it.

The site is pretty static and basic but the server is a cheap or free host that can't handle much activity. So Googlebot can't crawl it without taking down the site and thus stays away until it can get through to crawl it without negatively impacting the site.

The interesting thing is that if you use the Fetch As Googlebot feature when this is the case, it will fail as well. So you can actually somewhat diagnose a major site speed issue with Fetch as Googlebot.

John Mueller from Google said:
Looking at your site, I do see that we'd like to crawl more from the server, but we're holding back because we think the server might not be able to handle the load. This is the reason why the Fetch as Google requests aren't making it through. In particular, we're seeing a fairly high response-time for URLs from the server, which often signals that the server is pretty busy even without us crawling.
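If you want to sanity-check this yourself, one rough way (example.com is a placeholder) is to time a request from the command line and see whether responses are consistently slow:

    curl -s -o /dev/null -w "Total time: %{time_total}s\n" http://www.example.com/

Consistently high response times are the kind of signal John describes; the fix is faster hosting, not a Webmaster Tools setting.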
 by Barry Schwartz

Friday, December 20, 2013

Google's URL/Content Removal Tool Now A Wizard

Google has updated their URL removal tool to make it easier and smarter to remove content specifically from third-party web sites.

Google told us at Search Engine Land specifically that the tool is smarter by analyzing the content of the URL you submitted and letting you know what options you have based on the details of the cache result, search results and the actual live page.

You can remove complete pages if the page is actually not live or blocked to spiders. You can remove content from a page if the content shows in the Google cache and is not live on the page.

Here are some screenshots:
Google's URL/Content Removal Tool Wizard (click for full size)
Now, it may even work on soft 404s, so be careful. As a WebmasterWorld moderator said:
This makes technical SEO aspect even more important, so that your site is not exposed to URL removal requests from the third party.

by Barry Schwartz

Matt Cutts: Google Tries To Minimize Search Updates Before Holidays

I am not sure if you noticed, but all the Google search results tracking tools flared up yesterday reporting major changes in the Google search results. You can see for yourself at MozCast, SERPs.com, SERP Metrics and Algoroo. But I personally saw very little chatter about any changes in the search results outside of the major authorship reduction, and even that is somewhat questionable.

Matt Cutts, Google's lead guy here, responded to an angry tweet from a website owner, saying Google doesn't do major updates before holidays.

So are those tracking tools wrong? Is Matt Cutts lying? Is he not aware of something that is going on? Did Google not do an algorithm update but some user interface change is messing up the tracking tools?

Again, I don't see signs of an update based on webmasters complaining about ranking changes.

Did you notice changes in your traffic or rankings?

by Barry Schwartz

Thursday, December 19, 2013

Google: We Don't Control Content On The Web

I spotted a thread in the Google Webmaster Help forums asking a common question about removing content from Google's search results. The response was even more interesting.

Google's Eric Kuan, from the search quality team, said that Google does not control the content on the web.

He wrote:
Google doesn't control the contents of the web, so before you submit a URL removal request, the content on the page has to be removed. There are some exceptions that pertain to personal information that could cause harm. You can find more information about those exceptions here: https://support.google.com/websearch/answer/2744324.
True, Google cannot control what people publish on the internet but they are the gateway to that content.
I bet many SEOs and webmasters would argue with Google's definition of "control" here.

by Barry Schwartz

Google's Matt Cutts: Our Algorithms Try To Break Black Hat SEOs' Spirits

 A couple weeks ago, Google's Matt Cutts was on This Week in Google (TWiG) and on episode 227 Matt had some interesting things to say. He said that Google specifically tries to break the spirits of black hat SEOs.
At about an hour and 25 minutes into the video, Matt said:
If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.

So in short, Google doesn't just look to prevent money from going to spammers; they look to break their spirits.

by Barry Schwartz

Wednesday, December 18, 2013

Google: Duplicate Content Pollutes 25-30% Of The Web

We all know Google's take on duplicate content: it is not spammy, and it is not something they penalize unless you are being completely manipulative. We also know that if you have exactly the same content as someone else, it might be hard for your content to rank in Google, because Google shows only one of the duplicative pages.

The new news is that Matt Cutts of Google said that somewhere between 25% and 30% of the web is duplicative. Meaning over one-quarter of the pages, content, files, images, etc., on the web are replicated. Can you imagine?

It might seem like a lot to a normal user. It probably doesn't seem like a lot to you and me, because as soon as I publish this story, 15 other sites will scrape it and post it on their own sites. So this one piece of content will be duplicated 15 times or so.

But as a user, because search engines cluster and show one of the duplicative pages, we don't realize that so much of the web is duplicative.

by Barry Schwartz

Tuesday, December 17, 2013

Google: It's Not About How Much Or How Many...

Google's John Mueller gave the typical Google line in this Google Webmaster Help thread: quality over quantity. He wrote:
I wouldn't worry about the technicalities - how much you can write about how many topics - and instead just make sure that the content which you're creating is really of the highest quality possible. This isn't something you should do for search engines, it's really something you're probably already doing. You almost certainly know yourself where the content that you're creating is thin, and where it's really significant, unique, high-quality and compelling to users. The awesome stuff is what you should be aiming for -- and from my point of view, you should be doing that independent of any algorithms (real or not).
It isn't about how much or how many; it is about the highest quality possible.
Truth is, I am sure many sites have writing quotas, be it one story a week, a story a day, or several stories a day. Heck, I have this artificial number in my head that I need to push out around five stories here a day and several on Search Engine Land, but the truth is, I really don't. I just like the number. I've done more and I've done less, and truthfully, I've written stories I probably would not have written because I wanted to publish at least four stories on a specific day. I probably should not have, but quotas, we all have them, as part of our job or as a number sitting in our head: budgets, yearly revenue goals, weight loss. Heck, I am sure Google has its own spam and quality quota metrics. We all have them.

But should we step back and say it isn't about how much or how many?

by Barry Schwartz

Thursday, December 12, 2013

Google's Matt Cutts Agrees, Guest Blogging Is Getting Out Of Hand

So Matt Cutts released another video yesterday and this is at least the fourth video on the topic of guest blogging.

The deal is, Google is saying guest blogging as a whole is getting spammier by the day. As good things get abused, over time, those good things turn into bad things.

Matt said guest blogging is "growing" in terms of spam and abuse. So he laid out basic "do nots" that everyone who reads here already knows:
  • Don’t make guest blogging your only link building strategy
  • Don’t send out thousands of mass emails offering to guest blog to random sites
  • Don’t use the same guest article on two different sites
  • Don’t take one article and spin it many times




by Barry Schwartz

Google Panda Impacting Your Mega Site? Use Sitemaps To Recover?

A WebmasterWorld thread has discussion around how to determine which sections or categories of your web site are impacted by Google's Panda algorithm.

Panda has not gone away, and sites hit by the algorithm are still struggling to rank in Google. New sites are hit monthly, and some sites are released from its grip as well.
The thread talks about one technique for large sites to determine which sections of their sites are impacted by Panda.

The concept is to use XML Sitemaps and break the sitemap files into a logical structure of the web site. Then, once Google processes all the files, Google will quickly show you how many pages within each sitemap file were indexed or not indexed.

One webmaster explained how he went about this technique:
The sector I was performing in allowed me to create dozens of sitemaps of 100 pages each. No reason why any of the pages should not be indexed. I found some sitemaps with 0 listed, then others from 25 up to the full 100. I then discovered trends, i.e. pages with similar title tags and URLs (the on-page content was considerably different, which is why I did not remove them initially). I then did different experiments with each sitemap group, until I saw a recovery, then applied the solutions across the board.
The question I have, and I am not certain of the answer: I thought that with sites impacted by Panda, the pages are indexed but simply don't rank as well. Meaning, if a page is not indexed, that is more of a crawl budget and PageRank issue (or a server setup issue) than a Panda issue. With Panda, the content has to be indexed for Google to decide not to rank it well. Am I wrong?
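For anyone wanting to try the technique, a minimal sketch (assuming a hypothetical example.com split into product, article and category sections) is a sitemap index pointing at one sitemap per section, so Webmaster Tools reports indexation counts section by section:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-articles.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
    </sitemapindex>

Sections whose sitemaps show unusually low indexed counts are the first candidates for the kind of experiments described in the quote above.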

by Barry Schwartz 

Google On How To Make Better Mobile Web Sites

Maile Ohye, a Google developer lead who has worked on mobile and webmaster topics for a long time, shared on Google+ that she was fed up.

Well, fed up with seeing "so many bad mobile websites" that she decided to do something about it. What did she do? She published documentation in the form of videos and a checklist named Checklist for mobile website improvement.

Maile said, "A girl can only take so many bad mobile websites before she's compelled to publish documentation."

The document goes through technical, marketing and conversion issues related to mobile web sites.

Mobile, as you all know, is increasingly important these days. So you should spend the time reading this resource.

by Barry Schwartz

Wednesday, December 11, 2013

Google's Matt Cutts: Disavow Links Aggressively

Google's Matt Cutts released another video; this one tries to answer how to remove a penalty for bad link building done over a period of time.

Specifically, if you know you hired a bad SEO who built bad links during a specific date range, how do you reverse the actions taken on your site for those bad links?

In short, Matt said, just disavow them or maybe even all the links built between the date range you hired the SEO company. He said if you had good links prior, then the easiest thing would be to disavow links acquired during the time you hired that SEO company.

Matt again said to use the disavow tool very aggressively, as we covered in our Google: Use The Disavow Tool Like A Machete & Not Fine-Toothed Comb.

He also sidesteps the question of Interflora only being penalized for 11 days.

by Barry Schwartz

Saturday, December 7, 2013

Google Shocks Webmasters By Updating Toolbar PageRank Before Year End

Early this morning, Google pushed out a Google Toolbar PageRank update. This update shocked webmasters because no one expected it, at least not in 2013.

As you may remember, the last Toolbar PageRank update was over 10 months ago, on February 4th. As I said at the six-month point, it was unusual for Google not to push out a PageRank update quarterly. Then Matt Cutts, Google's head of search spam, told us there wouldn't be a PageRank update before 2014 - or at least he thought so.

Today, December 6th, the Google Toolbar PageRank values have been updated. I guess the upper management, executives, or Larry Page, didn't want PageRank to go away after all.
This makes me sad, as I am sure it makes many Googlers sad. Why? Because SEOs and webmasters obsess over Toolbar PageRank, to a fault. And as I have said time after time:
Despite PageRank still being part of the algorithm, SEOs know that toolbar PageRank is often outdated and not that useful. In addition, there are many other factors in the algorithm that may or may not be as important as PageRank.
Anyway, I do hope your green pixel dust improved since February.

by Barry Schwartz 

Friday, December 6, 2013

Google's Matt Cutts: We Don't Like Content Stitching

Google's Matt Cutts posted a video yesterday explaining that "stitching content" is a bad idea. In short, stitching content is when you take snippets from various articles across the web and place them on a single page, even if you link to the sources.

Matt explains there is a difference when you write a summary on a topic using sources: you aren't simply copying and pasting those sources, you are summarizing them and explaining them in more detail. He cites Wikipedia as a good example of doing this.

But a bad example would be just copying and pasting quotes, whether or not you add links to those sources.
I joked on Google+: isn't Google theoretically doing this with their knowledge graph? They are basically taking snippets of content and putting them together in a box, and heck, they aren't even citing the source.

Truth is, no one likes to read these types of stitched pages - that is indeed true. But the knowledge graph is user friendly and useful.

by Barry Schwartz 

Thursday, December 5, 2013

Google: ccTLD & Wildcard Subdomain Strategies Won't Give You An "Unnatural Advantage"

Google's John Mueller responded to a detailed technical question regarding ccTLDs, wildcard subdomains and the likelihood of it being seen as doorway pages in a Stack Exchange thread.
In short, the webmaster wants to create websites that would target city names around the world, whereby the ccTLD for each country would be used. Here is the mapping they were considering:
www.website.de for let's say Germany and www.website.cn for China and the sub-domains will be www.berlin.website.de and www.beijing.website.cn, as an example.
Google's John Mueller responded that this would not be a good idea.
Let me share John's full response since it can be a bit technical:
My first thought when reading the question was that this is going to be a case for the web-spam team. Please don't create tons of sites that are essentially doorway pages. Also, using wildcard subdomains (assuming the idea is to map them to cities after DNS resolution) make it extremely hard to determine how those URLs should be crawled. Additionally, I think it's important to mention that a site using this kind of URL structure won't see any unnatural advantage in search. Search engines are just as good at handling URL parameters, there's no need to make it look like a website focused on [cityname],[countryname] when it's essentially just a part of the same website. Unless you have very good reasons to do this outside of web-search, I would recommend simplifying things significantly.
For geotargeting, using a ccTLD is a good way to let users & search engines know about your target audience. For Google, you can also use a gTLD (even the same one for all your sites) and work with subdirectories or subdomains to apply geotargeting there too. That saves you from having to get & maintain all those ccTLDs.
For multilingual content, at least from Google's point of view, the URL is irrelevant as long as it's unique. Use whatever URL structure works for you (and look into using hreflang where it makes sense).
Venturing overseas with content, especially with travel-destination-style sites, can be very tricky. This is especially true in the Panda world.
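If you take John's suggested route of one gTLD with subdirectories plus hreflang, the annotations for hypothetical German and Chinese city pages might look roughly like this (URLs are illustrative, not from the thread):

    <link rel="alternate" hreflang="de-DE" href="http://www.example.com/de/berlin/" />
    <link rel="alternate" hreflang="zh-CN" href="http://www.example.com/cn/beijing/" />
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />

Each page in the set carries the full list of alternates, including itself, and geotargeting for the subdirectories can then be set in Webmaster Tools.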

by Barry Schwartz

Wednesday, December 4, 2013

Google: Our Algorithms Don't Stand Still, Neither Should You

I see these threads all the time: a webmaster complaining their rankings suddenly dropped for no reason.
Well, Google's John Mueller responded to one of these, potentially with good news but also Google-like advice.
In a Google Webmaster Help thread John said:
Fluctuations like this can be normal - the algorithms don't stand still and use many signals to review & determine the relevance of pages & sites. Looking at it just now, it seems that it may just have come back again :). That said, working to improve the overall quality of a website is always a good thing. If this is currently one of your busier times, then it might make sense to do that a bit afterwards -- but I'd definitely think about what you could do to get feedback from users (customers or not) already.
I checked the site's rankings and it was not back to number 9 for me. But maybe John saw it back at 9. Either way, the advice is typical advice from Google but important.
In short, Google's algorithms don't stand still, neither should you.
You should keep working on your site, making it better for users and hopefully Google will appreciate those efforts as the algorithms update.

by Barry Schwartz

Tuesday, December 3, 2013

Blocking GoogleBot By Accident? No Long Term Damage, Google Said.

Did you accidentally block Google from accessing your web site? Google says you don't have to worry: while there may be short-term damage, you should recover. Google's John Mueller said it should not result in "long-term damage."
Here is how John put it:
From our point of view, once we're able to recrawl and reprocess your URLs, they'll re-appear in our search results. There's generally no long-term damage caused by an outage like this, but it might take a bit of time for things to get back to "normal" again (with the caveat that our algorithms change over time, so the current "normal" may not be the same state as it was before).
So if your hosting company has issues or you slip up and block Google, once Google can crawl again, in the long run, you should be okay.
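The classic way sites "slip up" is a blanket block left over from staging; a hypothetical robots.txt like this keeps every crawler out:

    # Accidentally left in place after launch; blocks all crawlers
    User-agent: *
    Disallow: /

Remove the Disallow line (or the whole block) and, as John says, the URLs should reappear once Google recrawls them.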

by Barry Schwartz

Saturday, November 30, 2013

Did Google Update On Thanksgiving Or Traffic Down Due To Holiday?

A WebmasterWorld thread, plus dozens of threads at the Google Webmaster Help forums, have webmasters complaining about both ranking changes and traffic drops from Google's search engine. The traffic drop would make sense, being that it is a holiday and traffic normally drops on US holidays. But ranking changes cannot really be explained by a holiday.

Maybe all the Black Friday deals and all those new pages trying to compete on top-selling product categories are popping up and Google needs to rank them properly? I am not sure.

One webmaster said, "I'm seeing what looks like a rollback today. Many verticals have returned to their former positions."
Another said, "We've been hit again, main keywords dropping even further."
Some can't figure out if it is a ranking issue or a traffic issue.
Google has said they won't update major algorithms again during the Christmas holiday shopping season but who knows.

It is not unusual for webmasters to spark up during Thanksgiving about ranking issues. We've seen it a couple of times in the past, at least.

The Google tracking tools are not yet updated, with the exception of Algoroo, but I'll keep checking Mozcast, SERPs.com and SERP Metrics to see if they agree. Update: it seems as if the tools do not agree.

by Barry Schwartz

Google Tells Webmaster: You'll Have To Earn Our Trust Again

A Google Webmaster Help thread has a story of a site owner who is trying to disavow and remove all the bad links pointing to his site. The interesting part is that even when he does, will it help his site's rankings?
The goal of this webmaster is simply to remove the manual action, but Google's John Mueller tells him he also has algorithmic trust issues.
John said:
looking at your site's history, it looks like you've done quite a bit for quite some time, and it looks like our algorithms have picked up on that too. So while resolving the manual action is a good way to start, you need to keep in mind that it can possibly take quite some time for our algorithms to regain trust in your site even after that.
I see this happening a lot: webmasters aim to remove the manual action and do, but then the rankings don't improve. The reason is likely due to algorithmic actions taken on the site.
That being said, it is interesting to see how Google words it.
The algorithms seem to have lost trust over time. The manual action is a "good way to start" but the algorithms need to "regain trust" in the site for there to be an improvement - which may take some time.

by Barry Schwartz

Wednesday, November 27, 2013

Google's Matt Cutts: Why Google Used To Only Crawl 100 Links Per Page

Google's Matt Cutts posted a video answering the old question: can Google crawl more than 100 links on a specific page?

Of course they can. They said so in 2008 and there are plenty of sites with hundreds of links on a page that are not penalized by Google.

Google said they have no specific rule, but if the number of links you have on your page looks spammy, then Google reserves the right to take action.

Of course, many SEOs are scratching their heads and asking - really? Did Matt Cutts really need to go on record about this? Is this the state of our SEO industry?

by Barry Schwartz

Tuesday, November 26, 2013

M-Dot Domains Need To Be Verified Separately In Google Webmaster Tools

In Google Webmaster Help there is a straightforward question and answer about how to handle M-dot (i.e. m.domain.com) sites in Google Webmaster Tools. In short, an M-dot is a separate site and should be verified separately in Google Webmaster Tools.
Zineb, a Google Webmaster support representative, answered each question:
(Q) Having verified domain ie.http://www.domain.com in GWT do I need to separately verify it's mobile version ie. m.domain.com? If so which method is the best for mobile?
(A) Yes. You need to verify both URLs in Webmaster Tools. Regarding verification methods, it depends on what you prefer :)
Furthermore, she answered a question about a separate mobile domain.
(Q) Also if the mobile website will have different URL ie. m.domainmobile.com how will that affect the verification (obviously having verified http://www.domain.com wouldn't help). 
(A) I don't see why it would affect the verification. Make sure to add the bidirectional annotation to both your sites (mobile and desktop) to help our algorithms understand the relationship between your desktop and mobile pages.
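The bidirectional annotation Zineb mentions is the standard pair of tags Google documents for separate mobile URLs; a sketch for a hypothetical page would be:

    <!-- On the desktop page, http://www.example.com/page : -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">

    <!-- On the mobile page, http://m.example.com/page : -->
    <link rel="canonical" href="http://www.example.com/page">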

by Barry Schwartz 

Saturday, November 23, 2013

Matt Cutts Asking For Google Webmaster Tools Feedback

This is your opportunity to give Google your two cents on the features you want and the issues you have with Google Webmaster Tools.

I blew up about it the other day, asking in a nice way for Google to add a feature named automated actions to Google Webmaster Tools. Will it happen? I think it is more complex than I am making it out to be, so I doubt it. But if you agree, make sure Google and Matt Cutts know you want this feature.
If you don't want it, fine. Just make sure you tell Matt Cutts and his team what features you want.
Matt posted the request on his personal blog asking you to leave comments with the features or changes you want made to Webmaster Tools. He said they will use it for brainstorming and possibly build some of those features in 2014 - but not guaranteed.

Matt's blog only has 125 comments or so, so go flood it with ideas.

by Barry Schwartz

Google's Search Results Rocky This Week...

This week has been a mess for Google's search results, despite Google telling me nothing is going on.
Some of the tools sparked up on November 14th, although I didn't see matching webmaster chatter at the time.

This whole week, I've been seeing a lot of sporadic complaints in both WebmasterWorld and Google Webmaster Help forums. Typically, these sporadic reports, at least I think, mean Google targeted a link network and some sites were majorly impacted by it. It also may be a weird Google bug. But not necessarily a Google update. Of course, it could be that Panda was updated, which Google stopped confirming.

It is hard to tell.

The tools are all lighting up over the past few days. Moz has warmer than normal temperature the past couple days, SERPs.com has higher volatility numbers than normal, SERP Metrics shows higher flux than normal and Algoroo is in the red the past couple days. Something seems to be up.

Is it a major algorithm update - I don't think so. If I had to guess, maybe Panda was rerun or maybe Google squashed some sort of link network.

I am honestly not sure but Google is indeed a bit rocky the past few days.

by Barry Schwartz

Friday, November 22, 2013

8 Reasons To Use The Google Disavow Tool By Matt Cutts

Matt Cutts posted a video answering the question "Should I use the disavow tool even if there's not a manual action on my site?"

The short answer is yes. But when?
Matt Cutts offers eight situations in which you might use the disavow tool. They include:

(1) When you get a manual action, of course.
(2) Webmasters won't remove the bad links to your site or want to charge you to remove them.
(3) You are worried about negative SEO.
(4) You see links pointing to your site you do not want to be associated with.
(5) You saw a link bomb attack and are afraid it might hurt your site.
(6) You are afraid someone will submit a spam report about you.
(7) You see your rankings dropped and you think it has to do with an algorithm Google ran, i.e. Penguin algorithm.
(8) You can’t sleep at night because of some of the links you have.

by Barry Schwartz 

Thursday, November 21, 2013

Here Is Why Google Forced Me To Take Down My Site...

A long-time WebmasterWorld member posted in the WebmasterWorld forums that he had had enough: he closed down his site, which did well in the search results, because it all became too much.

What became too much? The constant flood of requests asking for links to be taken off his site.

He said most of the links were nofollowed, but despite telling the people who asked for the links to be removed exactly that, they simply didn't understand and demanded the links be completely removed.
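For context, a nofollowed author link like the ones on his site looks roughly like this (the URL is hypothetical), and Google does not pass PageRank through it, which is why the removal demands made no sense:

    <a href="http://www.example-author-site.com/" rel="nofollow">Author's Website</a>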

Here is what the webmaster said:
We created an article website 8-9 years ago. It was allowed the author could post a link to their website or related page to the article. All the links from day 1 were nofollow. Since the recent updates from G we have been swamped with link removal request. I had a prepared email letting these newbees know the links were nofollow and had no effect on their problem. Well some of the (most of them) don't seem to understand this, so I took a look at the website and determined it was of no value to our company and our future. The website did very well in the serps but with the content I had taken off adsense to protect our interest, so really there was no viable income.
Best option kill it. Shame but I just don't have the time to worry with all the emails and the website making the company 0.
A shame indeed.
He didn't have the heart to charge for link removals and decided to just kill the site.
I don't know if I would have done that. Heck, even this site gets link removal requests and I am sure you guys do also.

by Barry Schwartz

Google Can Tell Us If Our Sites Are Impacted By Panda Or Penguin But They Don't

I spotted a very interesting comment by Google's head of search spam, Matt Cutts, at Hacker News. The obvious part of the comment is that Matt Cutts outs thecupcakeblog.com as being negatively impacted by the Panda algorithm.

More shocking to me, based on my conversations with Google search quality people, is that Google clearly knows whether or not a site is impacted by a given algorithm.

In August, Google launched the manual action viewer, mostly to appease webmasters who want to know if their site has a penalty or action. But this only covers manual penalties, issued by Google representatives with the click of a button.
It does not include details if the site was hurt by an algorithm such as Panda or Penguin or others.
I want Google to release an "automated action viewer" to show how much your site is impacted by algorithmic updates, although I imagine it would be hard to build.

I thought that all sites are impacted by all algorithms on some level, just some more than others. But when it comes to Penguin or Panda, it doesn't appear to be a matter of degree; it appears to be all or nothing.
When Matt Cutts says, "looking at the site in question, it has been affected by Google's Panda," it makes me scratch my head. Does Google have a backend tool of their own to see this? If so, can they dumb it down a bit and add it to Webmaster Tools?

It pains me to see so many sites struggling to figure out what is hurting their sites in Google when nothing shows up in the manual actions viewer.

Don't get me wrong... Google's transparency over the years has grown tremendously. But this one thing would be gold for most small webmasters who are lost and being told by "SEO experts" or companies things that may not be true. I see so many webmasters chasing their tails - it pains me.

by Barry Schwartz 

Wednesday, November 20, 2013

Google Bug? Increase In Webmaster Tools DNS Errors?

Since November 14th, I've been seeing a large uptick in complaints about Google Webmaster Tools reporting DNS issues with sites, leading to Google not being able to index those sites fully.
There is a thread at WebmasterWorld and dozens at Google Webmaster Help.

In April, Google had a bug where they sent out false DNS errors to many webmasters and told everyone to ignore them. I am not sure if this is an issue with Google or something else.

The strange part is that it coincides with reports of ranking changes that I don't see myself.
I am going to try to get word from Google about the DNS issue and see if it was a bug and/or if it may be in any way related to the changes some tools noticed in regards to ranking fluctuations on November 14th.
Forum discussion at WebmasterWorld and dozens at Google Webmaster Help.

Update: Google told me, "we're not seeing anything unusual on our end." :-)

by Barry Schwartz

Tuesday, November 19, 2013

Was There A Google Update On November 14th?

Some of the automated tracking tools for detecting shifts in Google's search results and thus showing evidence of a Google update went haywire on Thursday, November 14th.

Mozcast reported 102 degrees, which means a major change in the search results. SERPs.com also showed higher than normal changes, as did SERP Metrics. But Algoroo and DigitalPoint's ranking tool (see right sidebar) show very little change.

I normally see a large spike in chatter (as you know I like to call it) in the online discussion forums and social media areas when Google makes major ranking changes. But I honestly saw very little chatter.

The ongoing WebmasterWorld thread has maybe one or two people asking if there was an update around November 15th. The other forums are pretty dead around that date. In fact, the Google Webmaster Help forums have even less chatter than normal on those dates. If there was a major update, I would have seen tons of threads and complaints, but I did not, nor did I see any delayed complaints over the weekend.

So what is going on? I am not sure. Moz posted more details on Google+ saying:
On 11/14, the top 10 domains held 15.4% of the page 1 positions we track - on 11/15, it jumped to 15.9%. Wikipedia, Amazon, and eBay all gained in the 3-5% range.
Dr. Pete, who analyzes this stuff, wrote at the end:
Unfortunately, there's no clear pattern, and webmaster chatter has been relatively normal. I'm waiting on some of the other 3rd-party weather station to see if they confirm. If anyone saw unusual changes to their rankings, please leave a comment.
The truth is, normally the chatter and analytics tools do match - here they do not, for some reason.
Did you notice changes last Thursday? If so, let us know in the comments.

by Barry Schwartz

Saturday, November 16, 2013

Did Matt Cutts Just Call Himself A Link Spammer?

Yesterday I reported that Google says using keyword-rich user names in comments can be considered spammy and against Google's policies, potentially being a link scheme. Of course that caused a lot of comments from the community, but is it fair?

Remember GoogleGuy? GoogleGuy was the Googler who spent years helping webmasters under the alias GoogleGuy, his identity unknown at the time. He posted at WebmasterWorld a lot, but also quite a bit in comments and other areas. Later, we learned GoogleGuy was really Matt Cutts. What happened to GoogleGuy? Well, he stopped posting years ago.

His last post was in 2008, on this thread, and he has not posted since. You can see GoogleGuy's profile over here.

Note that the name is kind of a keyword-rich name for Google, and the profile does link to google.com.

GoogleGuy Profile

Now we know GoogleGuy didn't go around with the intent of boosting Google.com's link profile, so the intent was not there.

But we also know that back in the day, it was common place to use aliases instead of real names.
So why go so strong against people who still are old school?
Is Matt calling himself, aka GoogleGuy, a link spammer?
Forum discussion at WebmasterWorld & Google+.

Update: To be clear, I don't think Matt Cutts is a link spammer. The purpose of this post is to convey most of link spam is about the intent of the link and anchor text. Matt's intent with GoogleGuy was not about spamming Google, obviously.

by Barry Schwartz 

Google's Matt Cutts: Using Keyword Rich Words As Your Comment Name Can Be Spam

As someone who manages a content site with comments, some of the comments I get here bother me. I dislike seeing comments from people who use their company name and/or keyword-rich anchor text instead of their real name. I know the name hyperlinks to your site, but those links do not count, so it looks (1) spammy and (2) like you don't know SEO, because the links don't count anyway.



That being said, Matt Cutts, Google's head of search spam, published a video yesterday telling webmasters who leave comments to "worry about" two types of comments:

(1) Using keyword rich anchor text or your company name, as opposed to your real personal name when commenting.

(2) Using commenting as your primary link building strategy.
If you are doing these, then Google may consider it a link scheme and may take action.

by Barry Schwartz

Wednesday, November 13, 2013

Google Disavow Links Communication Confusion

Over a year ago, Google launched their disavow link tool and webmasters and SEOs have been all over it since. But every now and then, confusion arises.

There are two Google Webmaster Help threads from webmasters confused by the messaging they received after they uploaded their disavow files to Google.

One webmaster said right after they submitted the file, Google returned this response:
You successfully uploaded a disavow links file (www.tucsonadventruedogranch.com_google domain and link disavowal file 11022013.txt) containing 135 domains and 1 URLs.
The next day, Google showed this message:
The file containing disavowed links to http://www.tucsonadventuredogranch.com/ has been updated. If this is unexpected, it may have been updated by another site owner. For more information, visit the Disavow links https://www.google.com/webmasters/tools/disavow-links?siteUrl=http://www.tucsonadventuredogranch.com/ page in Webmaster Tools. Details: You successfully uploaded a disavow links file () containing 0 domains and 0 URLs.
So one day it is 135 domains and 1 URL, and the next day zero?
Another webmaster asked why they are showing 0 URLs and uploaded this screenshot:
Google Disavow Links Communication Confusion
I suspect the issue here is that the disavow files, in both cases, have syntax errors. Google has tried to communicate about common disavow mistakes.
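As an illustration of the kind of syntax problem I am guessing at (not confirmed from either thread), bare host names are not valid entries, while domain: lines and full URLs are:

    # Likely rejected: bare host names without a domain: prefix or URL scheme
    www.spammy-example.com

    # Valid entries
    domain:spammy-example.com
    http://www.another-example.org/bad-page.html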

Is this a webmaster issue, Google issue or communication confusion?

by Barry Schwartz

Tuesday, November 12, 2013

Google's Matt Cutts: Show Me The Spam Or...

A Hacker News thread cites a story I wrote at Search Engine Land named Long-Time SEO Jill Whalen Moves On, Praises Google For Rewarding Content More, which I also covered over here. In short, the webmasters at Hacker News are mocking my summary of why Jill is retiring from SEO:
The tricks to beat and spam Google, Whalen said, no longer work as well.
Many of these webmasters find that comical, and it may be fair; I've seen plenty of spam examples this year. But I find it interesting that when these webmasters are asked for examples, they go hush.

Matt Cutts responded in the thread saying, "if you'd be interested in sharing any specifics (companies, keywords, people, etc.) regarding people spamming by leaving millions of links gumming up the web, my team would be happy to investigate in more detail." But nothing timely and solid was given.

He said it again later on in the thread, "We're pretty good at dissecting various techniques, spam networks, etc. But I always like to hear feedback from different places to see how we're doing, uncover new pockets of spam, etc."

Heck, even Ryan Moulton chimed in asking "If you can remember some example queries I'd love to pass them on to the team."
But nothing.

Frustrating all around, don't you think?

by Barry Schwartz

Friday, November 8, 2013

SEO: No Option But To Buy Links; Google's Cutts: That's A Bad Strategy

Over at Hacker News there are some complaints about the Google SEO Starter Guide (PDF). So Google's head of search spam, Matt Cutts, jumped in to try to understand what is wrong.

That isn't the fun part. The fun part comes later in the conversation, when one person said what is wrong with the guide is that it doesn't mention how you need to buy links.

The webmaster said, "Yeah, this is all good and everything. But to get to the top in a competitive market, there is usually very little option but to buy high quality links."

In which, Matt Cutts responded, "That's a really bad long-term strategy."
I am sure the comments on this post will be fun.

by Barry Schwartz

Thursday, November 7, 2013

Reports Of A Google Update Between November 1st & 5th

I have been seeing some chatter at WebmasterWorld and other sources about a possible update that has been pushed out slowly from late on October 31st through today.

If you read the thread, you'll see some people complaining about declines in rankings between the dates of November 1st and today. You will also see people excited about increases in rankings.

Most of the Google tracking tools don't show heavy fluctuations, with the exception of SERPs.com, which seems to have some bug. For the most part, they show some activity on October 31st and November 1st, quieting down on the 2nd and 3rd, then a spike back up on November 4th and today, along with the chatter.

Truth is, I am not sure I'd classify this as a significant update but let's see if Google announces anything.

by Barry Schwartz

New Google Penalty For "Image Mismatch"

There is a new Google manual action you need to be aware of, named "image mismatch." It is issued when the images on your site don't match what Google is indexing and displaying in its search results.

An anonymous reader sent me a screenshot of the notification he received for this on his site (click for full size).

It reads:
Images from this site are displayed differently on Google search results pages than they are when displayed on this site.
Is this a form of cloaking, or something less intentional, like hotlink protection techniques meant to prevent or discourage people from stealing your images?
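As a hypothetical example of the "less intentional" case, an Apache hotlink-protection rule that swaps in a placeholder for visitors arriving from Google produces exactly this kind of mismatch (a sketch, not taken from the reader's site):

    RewriteEngine On
    # Serve a placeholder when the referer is a Google results page
    RewriteCond %{HTTP_REFERER} google\. [NC]
    RewriteCond %{REQUEST_URI} !placeholder\.jpg
    RewriteRule \.(jpe?g|png|gif)$ /images/placeholder.jpg [L]

Serving the same image to everyone, including visitors coming from Google Images, avoids the mismatch.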

Have you seen this manual action?

by Barry Schwartz

Tuesday, November 5, 2013

You Can Disable Google SSL & Pass Search Referrer Queries

I reported this at Search Engine Land on Friday and to be honest, it is not a feature most of you will benefit from, but if you want, you can force Google's SSL off and search insecurely and thus pass your query data to webmasters by using google.com/webhp?nord=1 instead of google.com.


When you go to google.com/webhp?nord=1 the SSL version of Google will be dropped. The other way to remove SSL from Google is to do it at the network level.

The benefits are limited in that you need to specifically do this yourself. Of course, you can build a browser extension to automate this but since 99.999% of people won't do this, it won't help you get search query data. But it is good to know you can do this, if you do want to disable SSL on Google.

Disable Google SSL
So either go to google.com/webhp?nord=1, or add the "nord=1" parameter to a Google search URL while replacing "https" with "http".
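For example (the query is made up), the non-SSL homepage and a non-SSL search URL with the parameter appended would look like this:

    http://www.google.com/webhp?nord=1
    http://www.google.com/search?q=blue+widgets&nord=1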

by Barry Schwartz

Google Cites Disavowed Links As Bad Link Examples

There are two different threads at Google Webmaster Help where sites with Google manual actions received responses back from Google citing bad link examples, including links already in the site's disavow file.

As you can imagine, this can be very confusing for webmasters. The webmaster disavowed the bad links, then Google tells the webmaster his site still has a penalty because of the links he already disavowed.

Another webmaster wrote, "The above URL's have all been disavowed at a domain level in the most recent disavow file uploaded. Given this, how is then that these URL's are still being flagged?"
Yea, that is confusion 101.

This is somewhat related to our story named Google's Link Report Now Much More Representative Of Your Links To Your Site, where Google improved their link report because the example URLs provided were not enough.

But here, this is a way more confusing situation for webmasters. They disavow links and then Google tells them they have a manual penalty even with the links disavowed.

by Barry Schwartz

Friday, November 1, 2013

Awesome Google Webmaster Tools Security Issues

Google announced a new outstanding feature for Google Webmaster Tools named Security Issues. This new dashboard helps you quickly see what security issues your site has, how to find the source of the issue, and how to repair your Google results after you fix it.
Mariya Moeva from the Google team explained in a Google Webmaster Help thread that the tool aims to "help webmasters whose site has been hacked pinpoint the issues and recover easily."
Here is the overview screen showing various forms of malware and hacked spam issues a site may have (click for full size).
Then this screen pinpoints the issue at the source, helping you find the issue quickly on your site (click for full size).
Finally, click "Request A Review" to tell Google you fixed the issue and they should put your search results back to normal (click for full size).
John Mueller from Google who deals with these webmaster issues all the time said on Google+, "here's a fantastic update in Webmaster Tools to help you at least get the search-side of things back on track easier."

by Barry Schwartz

Wednesday, October 30, 2013

Google's Matt Cutts: More Pages Doesn't Directly Influence Your Search Rankings

Yesterday, Google's Matt Cutts posted a video saying that the number of pages does not directly impact your search rankings.

There may be a side effect to a site having more pages. Typically, sites with more pages have two benefits:

(1) They have more opportunity to rank for a more diverse set of keyword phrases, assuming your pages are targeting more keywords.

(2) Typically, larger sites have more links to their site and thus may have higher PageRank, which Google uses to set your crawl depth and for ranking.

But the number of pages on a specific site doesn't have a direct impact on your rankings.

by Barry Schwartz

Tuesday, October 29, 2013

Google Removes Your Manual Penalty But Your Rankings Won't Improve

I see threads like this Google Webmaster Help thread all the time. In summary, they say: my manual penalty was revoked by Google after I spent the time cleaning up my mistakes, but days, months, or even years later, my rankings, traffic and thus sales have not improved at all.

Often, I'll see people get all excited after Google removes a manual action within Google Webmaster Tools, only to find that excitement was false hope.

In early September, I talked about this in a poll I ran asking Does A Manual Action Removal Impact Google Rankings? We have almost two-hundred responses and the sad results are in.

53% said their rankings never improved, even after a year. 12% said they saw a ranking improvement within days, 14% said within a month, 8% said within 3 months, 7% within 6 months and 7% within a year. But 53% said never.

I didn't ask if they saw a full recovery. If I had, I suspect that 53% number would jump to 90% or higher.

Google Manual Penalty Removal Poll

Often, when it is a link penalty, the removal of the penalty doesn't help much. Why? Because those links that once counted, no longer do and thus the rankings will not return until you garner new, quality links.

But with content or other spam issues, why no recovery?

Have you seen the same? Do you agree with this poll?

by Barry Schwartz

Google Closing Authorship Project?

AJ Kohn, someone who focuses more on authorship and rich snippets than most SEOs I know, wrote a story named Authorship Is Dead, Long Live Authorship.

In that story, he describes why he thinks the classic Authorship Project is slowly being closed down. In short, he thinks that because classic authorship is opt-in, it isn't easy to scale to the size Google needs. So Google uses other methods to extract authorship and rich-snippet-like data from sources to show the richer data in the search results.

So what he means is that what you see in the search results, the authorship display, is not dead. But the model of requiring webmasters to mark up their content, and Google deciding whether to trust that markup, may be dying.
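For context, the classic opt-in markup AJ is referring to is a rel=author link from the content to a Google+ profile (the name and profile URL here are placeholders), with the profile linking back to the site as a contributor:

    <a href="https://plus.google.com/100000000000000000000" rel="author">Jane Author</a>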

by Barry Schwartz

SEOs Get Ready For 15% Reduction In Google Rich Snippets

As many of you know, Matt Cutts, Google's head of search spam, announced at PubCon that you should expect a 15% reduction in the amount of rich snippets and authorship displayed in the Google search results.

There are several threads, including one at WebmasterWorld, where webmasters and SEOs are preparing for the reduction in the richer display in the Google search results. Many noticed increases in click-through rates, and thus higher traffic and sales, because of the richer display. But if your site doesn't make the cut and you are part of the 15% reduction, how will you recoup?

To me, it makes sense. Why show rich snippets for all sites that have the markup? Google should reserve them for sites they deem authoritative enough to display them. There are tons of spammy sites marked up with it, let alone sites abusing the markup and spamming rich snippets.

A year ago, it seemed Google began cutting back on displaying rich snippets. This is all while making it easier for sites to mark up their pages. Heck, my authorship dropped out for several months and eventually returned - likely due to a bug on Google's part.

by Barry Schwartz

Saturday, October 26, 2013

Google: Your Disavow File May Not Be Accepted If...

Google's Eric Kuan said in a Google Webmaster Help thread that even if you use a disavow file to deal with bad links, Google may not process it if they don't see you making a serious manual attempt at removing those links.

Other reasons why the disavow tool may not work for you include:
  • Double check your disavow file and make sure the link you mentioned is properly disavowed. You can download your latest Disavow file by going to the Disavow tool, clicking "Disavow Links", and clicking "Download".
  • Documentation is really good and can help when reviewing your site for reconsideration. Make sure that when you send in documentation that it's accessible, though. It looks like https://docs.google.com/file/d/0B_RnM_AdwlOzRm55V3JKNVNoMDQ/edit?usp=drive_web hasn't been shared properly.
This is not new, but it is important to remind our readers.

by Barry Schwartz

Friday, October 25, 2013

Google Penalty On WWW Revoked But Remains On Non-WWW

A WebmasterWorld thread has an interesting discussion around a webmaster who said his manual penalty was revoked on his WWW version but remains on his non-WWW version.

The non WWW is 301 redirected to the WWW and the WWW is set as the preferred domain in Webmaster Tools.

Have you ever seen a case like this, where penalties on the WWW and non-WWW versions are not kept in sync properly?

by Barry Schwartz

Thursday, October 24, 2013

Google Matt Cutts Hints At Upcoming SEO Changes At PubCon

One of the highlights of the year for me is Matt Cutts' talk at PubCon. So not being there, 6,000+ miles away, was unfortunate. It was the first PubCon I've missed in, well, I can't remember how long.
Anyway, there was some excellent note-taking of his presentation, plus PubCon streamed it live, so I saw a glimpse of it. I wanted to pull out the highlights and the most important things Matt shared, specific to topics SEOs should be concerned with for the end of this year and 2014.
First, the live blogging I took this from includes Bruce Clay, Search Engine Journal, Pole Vault Media, Search Mojo, Search Engine Land and Google+ via Brian Patterson. Of course, there is a ton of Twitter action from the event.
  • Black Hat Hacking will be a core topic Google's web spam team will focus on. They hope to go after the hard core hacking tactics and reduce that impact on the search results.
  • Child pornography will be blocked significantly in all countries, not just the US.
  • Toolbar PageRank may or may not be fixed. The "pipeline" to export the quarterly data broke and Google has no immediate plans to fix it.
  • Mobile is going to be a key area in 2014, so pay attention to it.
  • Autocomplete: look into adding requestAutocomplete to your forms.
  • Top Ad Heavy Algo will get refreshed in a big way, see the previous update over here.
  • Authorship 15% Reduction - Google is going to get picky on who they show authorship rich snippets for. So don't be surprised if yours goes away, look for ways to make yourself more authoritative. This includes rich snippets for your site...
  • Better at JavaScript: Google is getting better at processing JavaScript, so don't be surprised by what Google gets into.
by Barry Schwartz

Wednesday, October 23, 2013

Rare: Google Admits A Search Result Is Wrong

It is rare for Google to admit they are wrong, at least with search quality. That is part of the broader questioning of whether the quality of Google's product these days lives up to the Google stock price... In any event, Google's Ryan Moulton admitted Google has a bug with, at least, one of their search results.

Ryan Moulton has been with Google's search quality team for over seven years. He is big into Hacker News and has been responding to Google complaints there for years.

One such complaint resurfaced on the topic of ranking w3schools.com above MDN resources. This is when Ryan famously responded that Google ranks the higher quality content, which is not necessarily the more useful content. The topic came back up at Hacker News and this time, he admitted there was a bug.

If you search for [html iframe element scripting], you get an MDN result first, but it points to the wrong answer. It is followed by three w3schools results that are also "wrong, or completely useless."

by Barry Schwartz

Thursday, October 10, 2013

SEOs Adapt To Google's Hummingbird Algorithm

As you know, Google announced their Hummingbird algorithm about a month after it launched, claiming no one noticed and no one should notice. We think we did notice, but no one outside of Google can confirm that, and Google won't.
That being said, clearly the search results are different since the launch of Hummingbird and SEOs will likely need to adapt.
Some forward thinking SEOs and webmasters are already thinking up what the end game for Google is with Hummingbird and how to adapt their sites to fit that box.
A WebmasterWorld thread has some really interesting conversation around what some believe the key difference is before and after Hummingbird.

Unique Content versus Useful Content

While unique content is more of a Google Panda thing, useful content, although it matters for Panda too, is maybe more of a Hummingbird thing.
Google understands searchers' queries differently with Hummingbird than it did before, so how can the search results not change? How can you, as a webmaster, change your content to make it more useful, while keeping it unique, to encourage Google to show your site over your competitors'?

Don't optimize for keywords, optimize for a satisfied customer from stage one of the buying cycle to the end. Is it that easy? What if you don't offer all the stages? Well, I assume that is not exactly the point.
Robert believes this will eventually lead to search results that are "less a collection of content farms and more a collection of pages created with the user genuinely in mind." I am not 100% confident.
Keep in mind, this is just one theory of many and for the most part, the search results did not change that much compared to let's say Penguin 2.1.

by Barry Schwartz