Search Engine Optimization and Marketing

A well-known view on search engines and search engine marketing and optimization


Saturday, November 30, 2013

Did Google Update On Thanksgiving Or Traffic Down Due To Holiday?

A WebmasterWorld thread, plus dozens of threads at the Google Webmaster Help forums, have webmasters complaining about both ranking changes and traffic drops from Google's search engine. The traffic drop would make sense, given that it is a holiday and on US holidays traffic normally drops. But ranking changes cannot really be explained by a holiday.

Maybe all the Black Friday deals and all those new pages trying to compete on top selling product categories are popping up, and Google needs to rank them properly? I am not sure.

One webmaster said, "I'm seeing what looks like a rollback today. Many verticals have returned to their former positions."
Another said, "We've been hit again, main keywords dropping even further."
Some can't figure out whether it is a ranking issue or a traffic issue.
Google has said they won't update major algorithms during the Christmas holiday shopping season, but who knows.

It is not unusual for webmasters to spark up during Thanksgiving about ranking issues. We've seen it a couple of times in the past, at least.

The Google tracking tools are not yet updated, with the exception of Algoroo, but I'll keep checking Mozcast, SERPs.com and SERP Metrics to see if they agree. Update: it seems the tools do not agree.

by Barry Schwartz

Google Tells Webmaster: You'll Have To Earn Our Trust Again

A Google Webmaster Help thread has a story of a site owner trying to disavow and remove all the bad links pointing to his site. The interesting part: even when he does, will it help with his site's rankings?
The goal of this webmaster is simply to remove the manual action, but Google's John Mueller tells him he also has algorithmic trust issues.
John said:
looking at your site's history, it looks like you've done quite a bit for quite some time, and it looks like our algorithms have picked up on that too. So while resolving the manual action is a good way to start, you need to keep in mind that it can possibly take quite some time for our algorithms to regain trust in your site even after that.
I see this happening a lot: webmasters aim to remove the manual action, and do, but then the rankings don't improve. The reason is likely due to algorithmic actions taken on the site.
That being said, it is interesting to see how Google words it.
The algorithms seem to have lost trust over time. The manual action is a "good way to start" but the algorithms need to "regain trust" in the site for there to be an improvement - which may take some time.

by Barry Schwartz

Wednesday, November 27, 2013

Google's Matt Cutts: Why Google Used To Only Crawl 100 Links Per Page

Google's Matt Cutts posted a video answering the old question: can Google crawl more than 100 links on a specific page?

Of course they can. They said so in 2008 and there are plenty of sites with hundreds of links on a page that are not penalized by Google.

Google said they have no specific rule, but if the number of links you have on your page looks spammy, then Google reserves the right to take action.

Of course, many SEOs are scratching their heads and asking - really? Did Matt Cutts really need to go on record about this? Is this the state of our SEO industry?

by Barry Schwartz

Tuesday, November 26, 2013

M-Dot Domains Need To Be Verified Separately In Google Webmaster Tools

In Google Webmaster Help there is a straightforward question and answer about how to handle M-dot (i.e., m.domain.com) sites in Google Webmaster Tools. In short, an M-dot is a separate site and should be verified separately in Google Webmaster Tools.
Zineb, a Google Webmaster support representative answered each question:
(Q) Having verified domain ie. http://www.domain.com in GWT do I need to separately verify it's mobile version ie. m.domain.com? If so which method is the best for mobile? 
(A) Yes. You need to verify both URLs in Webmaster Tools. Regarding verification methods, it depends on what you prefer :)
Furthermore, she answered the question on a separate mobile domain.
(Q) Also if the mobile website will have different URL ie. m.domainmobile.com how will that affect the verification (obviously having verified http://www.domain.com wouldn't help). 
(A) I don't see why it would affect the verification. Make sure to add the bidirectional annotation to both your sites (mobile and desktop) to help our algorithms understand the relationship between your desktop and mobile pages.
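For reference, the bidirectional annotation Zineb mentions pairs a rel="alternate" tag on the desktop page with a rel="canonical" tag pointing back from the mobile page. A minimal sketch, using hypothetical example.com URLs, that generates the two tags:

```python
# Sketch of the bidirectional desktop/mobile annotation Google documents
# for separate mobile URLs. The desktop page points to its m-dot
# equivalent with rel="alternate"; the mobile page points back with
# rel="canonical". URLs here are hypothetical.

def desktop_annotation(mobile_url: str) -> str:
    """Tag to place in the <head> of the desktop page."""
    return ('<link rel="alternate" '
            'media="only screen and (max-width: 640px)" '
            f'href="{mobile_url}">')

def mobile_annotation(desktop_url: str) -> str:
    """Tag to place in the <head> of the m-dot page."""
    return f'<link rel="canonical" href="{desktop_url}">'

print(desktop_annotation("http://m.example.com/page"))
print(mobile_annotation("http://www.example.com/page"))
```

Once both pages carry their half of the annotation, Google's crawlers can tie the two versions together even when they live on different hosts.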

by Barry Schwartz 

Saturday, November 23, 2013

Matt Cutts Asking For Google Webmaster Tools Feedback

This is your opportunity to give Google your two cents on the features and issues you have with Google Webmaster Tools.

I blew up about it the other day, when I asked in a nice way for Google to add a feature named automated actions to Google Webmaster Tools. Will it happen? I think it is more complex than I am making it out to be, so I doubt it. But if you agree, make sure Google and Matt Cutts know you want this feature.
If you don't want it, fine. Just make sure you tell Matt Cutts and his team what features you want.
Matt posted the request on his personal blog asking you to leave comments with the features or changes you want made to Webmaster Tools. He said they will use it for brainstorming and possibly build some of those features in 2014 - but not guaranteed.

Matt's blog only has 125 comments or so, so go flood it with ideas.

by Barry Schwartz

Google's Search Results Rocky This Week...

This week has been a mess for Google's search results, despite Google denying to me that anything is going on.
Some tools spiked on November 14th, though I didn't see corresponding webmaster chatter.

This whole week, I've been seeing a lot of sporadic complaints in both WebmasterWorld and Google Webmaster Help forums. Typically, these sporadic reports, at least I think, mean Google targeted a link network and some sites were majorly impacted by it. It also may be a weird Google bug. But not necessarily a Google update. Of course, it could be that Panda was updated, which Google stopped confirming.

It is hard to tell.

The tools are all lighting up over the past few days. Moz has shown a warmer than normal temperature the past couple of days, SERPs.com has higher volatility numbers than normal, SERP Metrics shows higher flux than normal, and Algoroo is in the red the past couple of days. Something seems to be up.

Is it a major algorithm update? I don't think so. If I had to guess, maybe Panda was rerun or maybe Google squashed some sort of link network.

I am honestly not sure but Google is indeed a bit rocky the past few days.

by Barry Schwartz

Friday, November 22, 2013

8 Reasons To Use The Google Disavow Tool By Matt Cutts

Matt Cutts posted a video answering the question "Should I use the disavow tool even if there's not a manual action on my site?"

The short answer is yes. But when?
Matt Cutts offers 8 times when to use the disavow tool. They include:

(1) When you get a manual action, of course.
(2) Webmasters won't remove the bad links to your site or want to charge you to remove them.
(3) You are worried about negative SEO.
(4) You see links pointing to your site you do not want to be associated with.
(5) You saw a link bomb attack and are afraid it might hurt your site.
(6) You are afraid someone will submit a spam report about you.
(7) You see your rankings dropped and you think it has to do with an algorithm Google ran, e.g. the Penguin algorithm.
(8) You can’t sleep at night because of some of the links you have.

by Barry Schwartz 

Thursday, November 21, 2013

Here Is Why Google Forced Me To Take Down My Site...

An old WebmasterWorld member posted in the WebmasterWorld forums that he had enough: he closed down his site, which did well in the search results, because it was getting to be too much.

What was getting to be too much? The flood of link removal requests, asking for links to be taken off his site.

He said most of the links were nofollowed, but despite telling the people who asked for removals exactly that, they simply didn't understand and demanded the links be completely removed.

Here is what the webmaster said:
We created an article website 8-9 years ago. It was allowed the author could post a link to their website or related page to the article. All the links from day 1 were nofollow. Since the recent updates from G we have been swamped with link removal request. I had a prepared email letting these newbees know the links were nofollow and had no effect on their problem. Well some of the (most of them) don't seem to understand this, so I took a look at the website and determined it was of no value to our company and our future. The website did very well in the serps but with the content I had taken off adsense to protect our interest, so really there was no viable income.
Best option kill it. Shame but I just don't have the time to worry with all the emails and the website making the company 0.
A shame indeed.
He didn't have the heart to charge for link removals and decided to just kill the site.
I don't know if I would have done that. Heck, even this site gets link removal requests and I am sure you guys do also.

by Barry Schwartz

Google Can Tell Us If Our Sites Are Impacted By Panda Or Penguin But They Don't

I spotted a very interesting comment by Google's head of search spam, Matt Cutts, at Hacker News. The obvious part of the comment is that Matt Cutts outs thecupcakeblog.com as being impacted in a negative way by the Panda algorithm.

More shocking to me, based on my conversations with Google search quality people, is that Google clearly knows whether a site is impacted by an algorithm or not.

In August, Google launched the manual action viewer, mostly to appease webmasters who want to know if their site has a penalty or action. But this only covers manual penalties, issued by Google representatives with the click of a button.
It does not include details if the site was hurt by an algorithm such as Panda or Penguin or others.
I want Google to release an "automated action viewer" to show how much your site is impacted by algorithmic updates. It is probably harder to build than I imagine.

I thought that all sites are impacted by all algorithms on some level, just some more than others. But when it comes to Penguin or Panda, it doesn't appear to be a matter of degree; it appears to be all or nothing.
By Matt Cutts saying, "looking at the site in question, it has been affected by Google's Panda," it makes me scratch my head. Does Google have a backend tool for themselves to see this? If so, can they dumb it down a bit to add to Webmaster Tools?

It pains me to see so many sites struggling to figure out what is hurting their sites in Google when nothing shows up in the manual actions viewer.

Don't get me wrong... Google's transparency over the years has grown tremendously. But this one thing would be gold for most small webmasters who are lost and being told by "SEO experts" or companies things that may not be true. I see so many webmasters chasing their tails - it pains me.

by Barry Schwartz 

Wednesday, November 20, 2013

Google Bug? Increase In Webmaster Tools DNS Errors?

Since November 14th, I've been seeing a large uptick in complaints about Google Webmaster Tools reporting DNS issues with sites, leading to Google not being able to index sites fully.
There is a thread at WebmasterWorld and dozens at Google Webmaster Help.

In April, Google had a bug where they sent out false DNS errors to many webmasters and told everyone to ignore them. I am not sure if this is an issue with Google or something else.

The strange part is that it coincides with reports of ranking changes that I don't see myself.
I am going to try to get word from Google about the DNS issue and see if it was a bug and/or if it may be in any way related to the changes some tools noticed in regards to ranking fluctuations on November 14th.
Forum discussion at WebmasterWorld and dozens at Google Webmaster Help.

Update: Google told me, "we're not seeing anything unusual on our end." :-)

by Barry Schwartz

Tuesday, November 19, 2013

Was There A Google Update On November 14th?

Some of the automated tracking tools for detecting shifts in Google's search results and thus showing evidence of a Google update went haywire on Thursday, November 14th.

Mozcast reported 102 degrees, which means a major change in the search results. SERPs.com also showed higher than normal changes, as did SERP Metrics. But Algoroo and DigitalPoint's rankings (see right side bar) show very little change.

I normally see a large spike in chatter (as you know I like to call it) in the online discussion forums and social media areas when Google makes major ranking changes. But I honestly saw very little chatter.

The ongoing WebmasterWorld thread has maybe one or two people asking if there was an update around November 15th. The other forums are pretty dead around that date. In fact, the Google Webmaster Help forums have even less chatter than normal on those dates. If there was a major update, I would have seen tons of threads and complaints, but I did not, nor did I see any delayed complaints over the weekend.

So what is going on? I am not sure. Moz posted more details on Google+ saying:
On 11/14, the top 10 domains held 15.4% of the page 1 positions we track - on 11/15, it jumped to 15.9%. Wikipedia, Amazon, and eBay all gained in the 3-5% range.
Dr. Pete, who analyzes this stuff wrote at the end:
Unfortunately, there's no clear pattern, and webmaster chatter has been relatively normal. I'm waiting on some of the other 3rd-party weather station to see if they confirm. If anyone saw unusual changes to their rankings, please leave a comment.
The truth is, normally the chatter and analytics tools do match - here they do not, for some reason.
Did you notice changes last Thursday? If so, let us know in the comments.

by Barry Schwartz

Saturday, November 16, 2013

Did Matt Cutts Just Call Himself A Link Spammer?

Yesterday I reported that Google says using keyword rich user names in comments can be considered spammy and against Google's policies, potentially being a link scheme. Of course, that caused a lot of comments from the community, but is it fair?

Remember GoogleGuy? GoogleGuy was the Googler who spent years helping webmasters under that alias. He posted at WebmasterWorld a lot, but also in comments and other areas. Later, we learned GoogleGuy was really Matt Cutts. What happened to GoogleGuy? Well, he stopped posting years ago.

His last post was in 2008 on this thread and he has not posted since. You can see GoogleGuy's profile over here.

Note, the name is kind of a keyword rich name for Google, and the profile does link to google.com.


Now we know GoogleGuy didn't go around with the intent of boosting Google.com's link profile, so the intent was not there.

But we also know that back in the day, it was commonplace to use aliases instead of real names.
So why go so strong against people who still are old school?
Is Matt calling himself, aka GoogleGuy, a link spammer?
Forum discussion at WebmasterWorld & Google+.

Update: To be clear, I don't think Matt Cutts is a link spammer. The purpose of this post is to convey most of link spam is about the intent of the link and anchor text. Matt's intent with GoogleGuy was not about spamming Google, obviously.

by Barry Schwartz 

Google's Matt Cutts: Using Keyword Rich Words As Your Comment Name Can Be Spam

As someone who manages a content site with comments, there are things that bother me about some of the comments I get here. I dislike it when commenters use their company name and/or keyword rich anchor text instead of a real name. I know the name hyperlinks to your site, but the links do not count - so it looks (1) spammy and (2) like you don't know SEO, because the links don't count anyway.



That being said, Matt Cutts, Google's head of search spam, published a video yesterday telling webmasters who leave comments to "worry about" two types of comments:

(1) Using keyword rich anchor text or your company name, as opposed to your real personal name when commenting.

(2) Using commenting as your primary link building strategy.
If you are doing these, then Google may consider it a link scheme and may take action.

by Barry Schwartz

Wednesday, November 13, 2013

Google Disavow Links Communication Confusion

Over a year ago, Google launched their disavow link tool and webmasters and SEOs have been all over it since. But every now and then, confusion arises.

There are two Google Webmaster Help threads that are confused by the messaging they received after they uploaded their disavow file to Google.

One webmaster said right after they submitted the file, Google returned this response:
You successfully uploaded a disavow links file (www.tucsonadventruedogranch.com_google domain and link disavowal file 11022013.txt) containing 135 domains and 1 URLs.
The next day, Google showed this message:
The file containing disavowed links to http://www.tucsonadventuredogranch.com/ has been updated. If this is unexpected, it may have been updated by another site owner. For more information, visit the Disavow links https://www.google.com/webmasters/tools/disavow-links?siteUrl=http://www.tucsonadventuredogranch.com/ page in Webmaster Tools. Details: You successfully uploaded a disavow links file () containing 0 domains and 0 URLs.
So one day it's 135 domains and 1 URL, and the next, zero?
Another webmaster asked why they are showing 0 URLs and uploaded this screen shot:
Google Disavow Links Communication Confusion
I suspect the issue here is that the disavow files, in both cases, have syntax errors. Google has tried to communicate about common disavow file mistakes.
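One way to catch those syntax errors before uploading: Google's documented disavow format accepts only "domain:" entries, full URLs, and "#" comment lines, one per line. A minimal sketch of a pre-upload check, with hypothetical file contents:

```python
# A minimal disavow-file syntax check, based on the format Google
# documents: one entry per line, either "domain:example.com" or a full
# URL, with "#" comment lines allowed. Contents here are hypothetical.

def check_disavow_lines(lines):
    """Return (domain_count, url_count, bad_line_numbers)."""
    domains, urls, bad = 0, 0, []
    for n, raw in enumerate(lines, 1):
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # blank lines and comments are ignored
        if line.startswith("domain:"):
            domains += 1
        elif line.startswith("http://") or line.startswith("https://"):
            urls += 1
        else:
            bad.append(n)  # likely to be rejected or skipped by Google

    return domains, urls, bad

sample = [
    "# links I want Google to ignore",
    "domain:spammysite.example",
    "http://badlinks.example/page.html",
    "www.no-scheme.example/oops",  # missing http:// -- a common mistake
]
print(check_disavow_lines(sample))  # → (1, 1, [4])
```

A file full of scheme-less URLs or stray text would parse as "0 domains and 0 URLs," which matches the confirmation messages these webmasters saw.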

Is this a webmaster issue, Google issue or communication confusion?

by Barry Schwartz

Tuesday, November 12, 2013

Google's Matt Cutts: Show Me The Spam Or...

A Hacker News thread cites a story I wrote at Search Engine Land named Long-Time SEO Jill Whalen Moves On, Praises Google For Rewarding Content More, which I also wrote about over here. In short, the webmasters at Hacker News are mocking my summary of why Jill is retiring from SEO:
The tricks to beat and spam Google, Whalen said, no longer work as well.
Many of these webmasters find that comical, which may be true, and I've seen plenty of spam examples this year. But I find it interesting that when webmasters are asked for examples, they go hush.

Matt Cutts responded in the thread saying, "if you'd be interested in sharing any specifics (companies, keywords, people, etc.) regarding people spamming by leaving millions of links gumming up the web, my team would be happy to investigate in more detail." But nothing timely and solid was given.

He said it again later on in the thread, "We're pretty good at dissecting various techniques, spam networks, etc. But I always like to hear feedback from different places to see how we're doing, uncover new pockets of spam, etc."

Heck, even Ryan Moulton chimed in asking "If you can remember some example queries I'd love to pass them on to the team."
But nothing.

Frustrating all around, don't you think?

by Barry Schwartz

Friday, November 8, 2013

SEO: No Option But To Buy Links; Google's Cutts: That's A Bad Strategy

Over at Hacker News there are some complaints about the Google SEO Starter Guide (PDF). So Google's head of search spam, Matt Cutts, jumped in to try to understand what is wrong.

That isn't the fun part; the fun part comes later on in the conversation, when one person said what's wrong with it is that it doesn't mention how you need to buy links.

The webmaster said, "Yeah, this is all good and everything. But to get to the top in a competitive market, there is usually very little option but to buy high quality links."

To which Matt Cutts responded, "That's a really bad long-term strategy."
I am sure the comments on this post will be fun.

by Barry Schwartz

Thursday, November 7, 2013

Reports Of A Google Update Between November 1st & 5th

I have been seeing some chatter at WebmasterWorld and other sources of a possible update that has been pushed out slowly from late October 31st through today.

If you read the thread, you'll see some people complaining about declines in rankings between November 1st and today. You will also see people excited about increases in rankings.

Most of the Google tracking tools don't show heavy fluctuations, with the exception of SERPs.com, which seems to have some bug. For the most part, they show some activity on October 31st and November 1st, quieting down on the 2nd and 3rd, with a spike back up on November 4th and today, along with the chatter.

Truth is, I am not sure I'd classify this as a significant update but let's see if Google announces anything.

by Barry Schwartz

New Google Penalty For "Image Mismatch"

There is a new Google manual action that you need to be aware of named "image mismatch." This manual action is issued when the images on your site don't match what Google is indexing and displaying in its search results.

An anonymous reader sent me a screen shot of the notification he received for this on his site:




It reads:
Images from this site are displayed differently on Google search results pages than they are when displayed on this site.
Is this a form of cloaking, or something less intentional, like using hotlink protection techniques to prevent or discourage users from stealing your images?

Have you seen this manual action?

by Barry Schwartz

Tuesday, November 5, 2013

You Can Disable Google SSL & Pass Search Referrer Queries

I reported this at Search Engine Land on Friday and, to be honest, it is not a feature most of you will benefit from. But if you want, you can force Google's SSL off, search insecurely, and thus pass your query data to webmasters by using google.com/webhp?nord=1 instead of google.com.


When you go to google.com/webhp?nord=1 the SSL version of Google will be dropped. The other way to remove SSL from Google is to do it at the network level.

The benefits are limited in that you need to specifically do this yourself. Of course, you can build a browser extension to automate this but since 99.999% of people won't do this, it won't help you get search query data. But it is good to know you can do this, if you do want to disable SSL on Google.

So either go to google.com/webhp?nord=1 or append "nord=1" as a query parameter to a Google search URL, while replacing "https" with "http".
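For illustration, here is a small helper that rewrites a Google search URL into its non-SSL nord form; the function name is my own, but the scheme swap and the nord=1 parameter are exactly the steps described above:

```python
# Sketch: turning a standard Google search URL into its non-SSL "nord"
# form, i.e. switching https to http and appending nord=1 as a query
# parameter (after any existing parameters).
from urllib.parse import urlsplit, urlunsplit

def nord_url(url: str) -> str:
    parts = urlsplit(url)
    # Append nord=1, using "&" only if a query string already exists.
    query = parts.query + ("&" if parts.query else "") + "nord=1"
    return urlunsplit(("http", parts.netloc, parts.path, query,
                       parts.fragment))

print(nord_url("https://www.google.com/search?q=seo"))
# → http://www.google.com/search?q=seo&nord=1
print(nord_url("https://www.google.com/webhp"))
# → http://www.google.com/webhp?nord=1
```

As noted above, this only passes referrer query data for searches you rewrite yourself, so its practical benefit to site owners is close to nil.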

by Barry Schwartz

Google Cites Disavowed Links As Bad Link Examples

There are two different threads at Google Webmaster Help where sites with Google manual actions received responses back from Google citing bad link examples, including links already in the site's disavow file.

As you can imagine, this can be very confusing for webmasters. The webmaster disavowed the bad links, then Google tells the webmaster his site still has a penalty because of the links he already disavowed.

Another webmaster wrote, "The above URL's have all been disavowed at a domain level in the most recent disavow file uploaded. Given this, how is then that these URL's are still being flagged?"
Yea, that is confusion 101.

This is somewhat related to our story named Google's Link Report Now Much More Representative Of Your Links To Your Site, where Google improved their link report because the example URLs provided were not enough.

But here, this is a way more confusing situation for webmasters. They disavow links and then Google tells them they have a manual penalty even with the links disavowed.

by Barry Schwartz

Friday, November 1, 2013

Awesome Google Webmaster Tools Security Issues

Google announced a new outstanding feature for Google Webmaster Tools named Security Issues. This new dashboard helps you quickly learn what security issues your site has, how to find the source of the issue, and how to repair your Google results after you fix the issue.
Mariya Moeva from the Google team explained in a Google Webmaster Help thread that the tool aims to "help webmasters whose site has been hacked pinpoint the issues and recover easily."
Here is the overview screen showing various forms of malware and hacked spam issues a site may have:
Then this screen pinpoints the issue at the source, helping you find the issue quickly on your site:
Finally, click "Request A Review" to tell Google you fixed the issue and they should put your search results back to normal.
John Mueller from Google who deals with these webmaster issues all the time said on Google+, "here's a fantastic update in Webmaster Tools to help you at least get the search-side of things back on track easier."

by Barry Schwartz