Search Engine Optimization and Marketing

A well-known view on search engines and search engine marketing and optimization

Tuesday, December 31, 2013

To Survive You Must Give Google Your Structured Content & Data?

Greg Niland, aka GoodROI, posted a thread at WebmasterWorld arguing that if you do not hand over your data and content to Google in a structured format, then you will die a slow death.
Greg said:
Some of you are probably thinking that using schema or HTML 5 simply makes it easier for the all-evil Google to steal your content. I'll be honest, you do have a valid concern, but if you ignore it, it will not go away. Your competition will do it and Google will get the content from them and give them the traffic. You need to figure out a new strategy for your site to deal with the current situation so you can incorporate these best practices and still profit. Maybe something like putting the best content behind a paid wall? Get creative!
Holding on to old techniques and technology is not a smart idea. It is like insisting on only selling printed newspapers and ignoring the fact that the world has changed and now people like to get their news online. Change is not fun but staying up to date on best practices is vital to success.
There are many SEOs and webmasters out there who don't want to give Google all their data and then let Google use it in the knowledge graph without any link credit or potential to make money. But the reality is that if your competitors do it, you will lose out anyway.
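To make the "structured format" part concrete, here is a minimal sketch of what handing Google structured data actually looks like: a schema.org Product snippet in JSON-LD embedded in a page's HTML. The product name, description and price here are hypothetical, purely for illustration:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "description": "A sample product used only to illustrate structured markup.",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD"
      }
    }
    </script>

Once markup like this is in place, Google can read the product data directly off the page, which is exactly the trade-off Greg is describing.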

It is a hard call to make. You have data, the data is awesome, and Google doesn't have access to it. Do you give it to Google in a structured format, such as we discussed with searching within apps, or do you withhold it?
Is there a win-win solution to this or is the only winner here Google?

Do you need to give Google your content in a structured format for your site to rank? If so, how long until Google replaces your site with the knowledge graph? Are there ways to benefit without handing over the keys to your business?

These are the decisions SEOs are making in 2013 and 2014.

by Barry Schwartz

Google: It's Hard To Recognize Your Large Photos Because Of Your JavaScript Links

A webmaster is upset that the images on his site aren't being indexed. He posted a complaint in the Google Webmaster Help forums, from which we learn two things:

(1) This webmaster was blocking Google from indexing those images with his robots.txt file. This actually happens way too often.

(2) Sometimes using JavaScript for the "View Larger Image" link can make it hard for Google to index those images. Google does a pretty good job with JavaScript, but why not make it easier? Google's John Mueller said, "It seems that we're not picking up that these URLs are actually not pages but rather images -- and images without a landing page on top of that. One way to let us know about the actual connection is to use a direct link to the image (using target=_blank is fine, if you want to open them up in a new window/tab)."

So if you want your images indexed, first make sure you allow Google to index them and then make sure the crawlers can process the links to them.
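As a rough sketch of the difference John is describing, compare a "View Larger Image" link that only fires JavaScript with one that links directly to the image file (the file names here are made up):

    <!-- Harder for Google: the large image URL exists only inside a script handler -->
    <a href="#" onclick="openViewer('photo-1234'); return false;">View Larger Image</a>

    <!-- Easier for Google: a plain link pointing straight at the image file -->
    <a href="/images/photo-1234-large.jpg" target="_blank">View Larger Image</a>

The second version still opens in a new tab, but the crawler can follow the href and see that it leads to an image - assuming, of course, that robots.txt isn't blocking the /images/ directory in the first place.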

by Barry Schwartz

Thursday, December 26, 2013

When You Have Bad Links Impacting Your Google Rankings, Which Do You Remove?

I see this all the time: a forum thread where a webmaster knows his rankings are suffering in Google because Penguin hit him over a lot of really bad, manipulative links. A WebmasterWorld thread sums up the predicament such a webmaster is in.

(1) They hired an SEO company
(2) That SEO company ranked them well for years
(3) Then Penguin smashed their links
(4) They no longer rank as well
(5) They are upset with the SEO company
(6) They need to figure out how to rank again
(7) Removing the links is their only option
(8) But removing the links that were behind their initial good rankings won't help them rank immediately

In this thread, the site owner sums it up as:
1) What is the sure proof way to make sure a link is 100% bad?
2) I don't want to remove all links cause I am worried my site will drop even more. I'm sure there are some semi-good links that might be helping.
3) After submitting disavow file, typically how long does it take to recover? We have two sites, one seems to be under penguin and panda updates and the other received a manual penalty for certain bad links for certain terms.
It is sad, indeed. But you need to disavow the links, that is for sure. Those links are not helping you and they are now hurting you. Remove the hurt. Then get people to link to you because they want to link to you.
But which links should you remove? Which links are actually hurting you? That is the hard question. One SEO offered his advice:
the best advice I think I can give is to disavow the "obviously bad" links, but keep the ones you think are "grey" or "borderline" and see if you recover -- Basically, in your situation, meaning you don't "know" what's good and what's bad for sure, I'd "start with the obviously bad" and then "keep going" if necessary.
Of course, there are tools, like Link Detox, Majestic SEO, Ahrefs, Moz and others. But that assumes you already have the tools or that you manually go through all your links within Google Webmaster Tools. And when you disavow, make sure to disavow at the domain level.
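If you end up building a disavow file, here is a minimal sketch of the format Google's tool accepts: one entry per line, "domain:" entries to disavow a whole domain, and lines starting with # treated as comments. The domains below are placeholders:

    # Links built by the old SEO campaign; site owners did not respond to removal requests
    domain:spammy-directory-example.com
    domain:paid-links-example.net
    # Individual URLs can be listed too, but domain-level entries are usually the safer call
    http://www.example.org/forum/profile.php?id=12345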

by Barry Schwartz

Tuesday, December 24, 2013

Matt Cutts: Google's Hummingbird Algorithm Affects 90% Of Searches

I keep coming back to episode 227 of TWiG, where Matt Cutts talked about the Hummingbird algorithm exactly one hour and twenty minutes into the video. He spent maybe a minute or so talking about it.

Matt Cutts said that the Hummingbird algorithm actually affects 90% of all searches, but only to a small degree. So while Panda may have impacted 10% or so of queries and Penguin closer to 3%, Hummingbird impacted 90%. But Matt Cutts said it was only to a small degree, one that users should not notice.

Here is the snippet of what Matt Cutts said:
Hummingbird affects 90% of all searches but usually to a small degree because we are saying, this particular document isn't really about what the user is searching for.
I know Google has told us searchers and SEOs should not have noticed any impact to rankings and traffic based on this algorithm update. But we suspected it may have impacted some.
With such a large footprint, 90% of searches, it had to have had some impact.

by Barry Schwartz

Google Fetch As Googlebot Won't Crawl When Site Is Too Slow

A Google Webmaster Help thread has one webmaster complaining that his site isn't being crawled by Google and isn't showing up in the search results. The reason: his site can't handle Googlebot crawling it.

The site is pretty static and basic, but the server is a cheap or free host that can't handle much activity. Googlebot can't crawl it without taking down the site, so it stays away until it can get through without negatively impacting the site.

The interesting thing is that if you use the Fetch As Googlebot feature when this is the case, it will fail as well. So you can actually somewhat diagnose a major site speed issue with Fetch as Googlebot.

John Mueller from Google said:
Looking at your site, I do see that we'd like to crawl more from the server, but we're holding back because we think the server might not be able to handle the load. This is the reason why the Fetch as Google requests aren't making it through. In particular, we're seeing a fairly high response-time for URLs from the server, which often signals that the server is pretty busy even without us crawling.
 by Barry Schwartz

Friday, December 20, 2013

Google's URL/Content Removal Tool Now A Wizard

Google has updated their URL removal tool to make it easier and smarter to remove content specifically from third-party web sites.

Google told us at Search Engine Land specifically that the tool is smarter by analyzing the content of the URL you submitted and letting you know what options you have based on the details of the cache result, search results and the actual live page.

You can remove complete pages if the page is actually no longer live or is blocked from spiders. You can remove content from a page if the content shows in the Google cache but is no longer on the live page.

Here are some screenshots:
[Screenshots: Google's URL/Content Removal Tool wizard - click for full size]
Now, it may even work on soft-404s, so be careful. As WebmasterWorld's moderator said:
This makes the technical SEO aspect even more important, so that your site is not exposed to URL removal requests from third parties.
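For context, a soft 404 is a page that tells visitors something is missing while the server still answers with a 200 OK status; a genuinely gone page should answer with a 404 or 410 instead. A simplified, hypothetical sketch of the two responses:

    # Soft 404: the content is gone, but the server still says everything is fine
    GET /discontinued-product HTTP/1.1
    HTTP/1.1 200 OK
    (body: "Sorry, this product could not be found.")

    # Proper handling: the server returns a real 404 (or 410 Gone) status
    GET /discontinued-product HTTP/1.1
    HTTP/1.1 404 Not Found

Keeping your status codes honest is the kind of technical hygiene the moderator is talking about.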

by Barry Schwartz

Matt Cutts: Google Tries To Minimize Search Updates Before Holidays

I am not sure if you noticed, but all the Google search results tracking tools flared up yesterday reporting major changes in the Google search results. You can see for yourself at MozCast, SERPs.com, SERP Metrics and Algoroo. But I personally saw very little chatter about any changes in the search results outside of a major authorship reduction, and even that is somewhat questionable.

Matt Cutts, Google's lead guy here, responded to an angry tweet from a website owner, saying Google doesn't do major updates before holidays.

So are those tracking tools wrong? Is Matt Cutts lying? Is he not aware of something that is going on? Did Google not do an algorithm update but some user interface change is messing up the tracking tools?

Again, I don't see signs of an update based on webmasters complaining about ranking changes.

Did you notice changes in your traffic or rankings?

by Barry Schwartz

Thursday, December 19, 2013

Google: We Don't Control Content On The Web

I spotted a thread in the Google Webmaster Help forums asking a common question about getting content removed from Google. The response was even more interesting.

Google's Eric Kuan, from the search quality team, said that Google does not control the content on the web.

He wrote:
Google doesn't control the contents of the web, so before you submit a URL removal request, the content on the page has to be removed. There are some exceptions that pertain to personal information that could cause harm. You can find more information about those exceptions here: https://support.google.com/websearch/answer/2744324.
True, Google cannot control what people publish on the internet but they are the gateway to that content.
I bet many SEOs and webmasters would argue with Google's definition of "control" here.

by Barry Schwartz

Google's Matt Cutts: Our Algorithms Try To Break Black Hat SEOs' Spirits

 A couple weeks ago, Google's Matt Cutts was on This Week in Google (TWiG) and on episode 227 Matt had some interesting things to say. He said that Google specifically tries to break the spirits of black hat SEOs.
At about an hour and 25 minutes into the video, Matt said:
If you want to stop spam, the most straightforward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.

So in short, Google doesn't just look to prevent money from going to spammers; they also look to break their spirits.

by Barry Schwartz

Wednesday, December 18, 2013

Google: Duplicate Content Pollutes 25-30% Of The Web

We all know Google's take on duplicate content: it is not spammy, and it is not something they penalize unless you are being completely evil. We also know that if you have exactly the same content as someone else, it might be hard for your content to rank in Google, because Google shows only one of the duplicative pages.

The news is that Matt Cutts of Google said that somewhere between 25% and 30% of the web is duplicative. Meaning over one-quarter of the pages, content, files, images, etc., on the web are replicated. Can you imagine?

It might seem like a lot to a normal user. It probably doesn't seem like a lot to you and me, because as soon as I publish this story, 15 other sites will scrape it and post it on their sites. So this one piece of content will be duplicated 15 times or so.

But as a user, because search engines cluster and show one of the duplicative pages, we don't realize that so much of the web is duplicative.
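The post doesn't get into it, but the usual way a publisher tries to influence which copy of a duplicated page gets shown is the rel=canonical link element; a minimal sketch with a hypothetical URL:

    <!-- Placed in the <head> of duplicate or near-duplicate versions of the page -->
    <link rel="canonical" href="http://www.example.com/original-article" />

Scrapers obviously won't add that for you, but across your own site (print versions, tracking parameters, near-duplicates) it tells Google which version to keep when it clusters the copies.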

by Barry Schwartz

Tuesday, December 17, 2013

Google: It's Not About How Much Or How Many...

Google's John Mueller gave the typical Google line in this Google Webmaster Help thread - quality over quantity. He wrote:
I wouldn't worry about the technicalities - how much you can write about how many topics - and instead just make sure that the content which you're creating is really of the highest quality possible. This isn't something you should do for search engines, it's really something you're probably already doing. You almost certainly know yourself where the content that you're creating is thin, and where it's really significant, unique, high-quality and compelling to users. The awesome stuff is what you should be aiming for -- and from my point of view, you should be doing that independent of any algorithms (real or not).
It isn't about how much or how many - it is about the highest quality possible.
Truth is, I am sure many sites have writing quotas, be it having to write one story a week, a story a day, or several stories a day. Heck, I have this artificial number in my head that I need to push out around 5 stories here a day and several on Search Engine Land, but truthfully, I really don't. I just like the number. I've done more or less, and I've written stories that I probably would not have written because I wanted to publish at least four stories on a specific day. I probably should not have, but quotas - we all have them, as part of our job or as a number sitting in our head, be it budgets, yearly revenue goals or weight loss. And heck, I am sure Google has its own spam and quality quota metrics. We all have them.

But should we step back and say it isn't about how much or how many?

by Barry Schwartz

Thursday, December 12, 2013

Google's Matt Cutts Agrees, Guest Blogging Is Getting Out Of Hand

So Matt Cutts released another video yesterday and this is at least the fourth video on the topic of guest blogging.

The deal is, Google is saying guest blogging as a whole is getting spammier by the day. As good things get abused over time, those good things turn into bad things.

Matt said guest blogging is "growing" in terms of spam and abuse. So he laid out basic "do nots" that everyone who reads here already knows:
  • Don’t make guest blogging your only link building strategy
  • Don’t send out thousands of mass emails offering to guest blog to random sites
  • Don’t use the same guest article on two different sites
  • Don’t take one article and spin it many times

by Barry Schwartz

Google Panda Impacting Your Mega Site? Use Sitemaps To Recover?

A WebmasterWorld thread has discussion around how to determine which sections or categories of your web site are impacted by Google's Panda algorithm.

Panda has not gone away, and sites are still suffering from poor rankings in Google after being hit by the Panda algorithm. New sites are hit monthly, and some sites are released in some way from its grip as well.
The thread talks about one technique for large sites to determine which sections of their sites are impacted by Panda.

The concept is to use XML Sitemaps and break the sitemap files into a logical structure matching the web site. Then, once Google processes all the files, Google will quickly show you how many pages within each sitemap file were indexed or not indexed.
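As a sketch of the setup, think of a sitemap index that points at one small sitemap per logical section of the site, so Webmaster Tools reports indexed counts section by section; the file names and URLs here are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>http://www.example.com/sitemap-category-shoes.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-category-hats.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-blog-2013.xml</loc></sitemap>
    </sitemapindex>

If the shoes sitemap shows 100 of 100 pages indexed while the hats sitemap shows 20 of 100, you have a strong hint about where to start digging.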

One webmaster explained how he went about this technique:
The sector I was performing in allowed me to create dozens of sitemaps of 100 pages each. There was no reason why any of the pages should not be indexed. I found some sitemaps with 0 listed, then others from 25 up to the full 100. I then discovered trends, i.e. pages with similar title tags and URLs (the on-page content was considerably different, which is why I did not remove them initially). I then did different experiments with each sitemap group until I saw a recovery, then applied the solutions across the board.
The question I have, and I am not certain of the answer: I thought that for sites impacted by Panda, the pages are indexed but they don't rank as well. Meaning, if a page is not indexed, that is more of an issue with crawl budget and PageRank (or server setup issues) than with Panda. With Panda, the content has to be indexed for Google to know not to rank it well. Am I wrong?

by Barry Schwartz 

Google On How To Make Better Mobile Web Sites

Maile Ohye, a Google developer lead who has worked on mobile and webmaster topics for a long time, shared on Google+ that she was fed up.

Well, fed up with seeing "so many bad mobile websites," she decided to do something about it. What did she do? She published documentation in the form of videos and a checklist named Checklist for mobile website improvement.

Maile said, "A girl can only take so many bad mobile websites before she's compelled to publish documentation."

The document goes through technical issues, marketing issues and conversion issues related to mobile web sites.

Mobile, as you all know, is increasingly important these days. So you should spend the time reading this resource.

by Barry Schwartz

Wednesday, December 11, 2013

Google's Matt Cutts: Disavow Links Aggressively

Google's Matt Cutts released another video; this one tries to answer how to remove a penalty for bad link building done over a period of time.

Specifically, if you know you hired a bad SEO who built bad links within a specific date range, how do you reverse the actions taken on your site for those bad links?

In short, Matt said, just disavow them or maybe even all the links built between the date range you hired the SEO company. He said if you had good links prior, then the easiest thing would be to disavow links acquired during the time you hired that SEO company.

Matt again said to use the disavow tool very aggressively, as we covered in our post Google: Use The Disavow Tool Like A Machete & Not Fine-Toothed Comb.

He also sidesteps the question on Interflora only being penalized for 11 days.

by Barry Schwartz

Saturday, December 7, 2013

Google Shocks Webmasters By Updating Toolbar PageRank Before Year End

Early this morning, Google pushed out a Google Toolbar PageRank update. This update shocked webmasters because no one expected it, at least not in 2013.

As you may remember, the last Toolbar PageRank update was over 10 months ago, on February 4th. As I said at the six-month point, it was unusual for Google not to push out a PageRank update quarterly. Then Matt Cutts, Google's head of search spam, told us there wouldn't be a PageRank update before 2014 - or at least he thought so.

Today, December 6th, the Google Toolbar PageRank values have been updated. I guess the upper management, executives, or Larry Page, didn't want PageRank to go away after all.
This makes me sad, as I am sure it makes many Googlers sad. Why? Because SEOs and webmasters obsess over Toolbar PageRank, and that is partly Google's own fault. And as I said time after time:
Despite PageRank still being part of the algorithm, SEOs know that toolbar PageRank is often outdated and not that useful. In addition, there are many other factors in the algorithm that may or may not be as important as PageRank.
Anyway, I do hope your green pixel dust has improved since February.

by Barry Schwartz 

Friday, December 6, 2013

Google's Matt Cutts: We Don't Like Content Stitching

Google's Matt Cutts posted a video yesterday explaining that "stitching content" is a bad idea. In short, stitching content is when you take snippets from various articles across the web and place them on a single page, even if you link to the sources.

Matt explains there is a difference when you write a summary on a topic using sources: you aren't simply copying and pasting those sources, you are summarizing them and explaining them in more detail. He cites Wikipedia as a good example of doing this.

But a bad example would be simply copying and pasting quotes, whether you add links to those sources or not.
I joked on Google+: isn't Google theoretically doing this with their knowledge graph? Basically taking snippets of content and putting them together in a box, and heck, they aren't even citing the source.

Truth is, no one likes to read these types of stitched pages - that is indeed true. But the knowledge graph is user friendly and useful.

by Barry Schwartz 

Thursday, December 5, 2013

Google: ccTLD & Wildcard Subdomain Strategies Won't Give You An "Unnatural Advantage"

Google's John Mueller responded to a detailed technical question regarding ccTLDs, wildcard subdomains and the likelihood of it being seen as doorway pages in a Stack Exchange thread.
In short, the webmaster wants to create websites targeting city names around the world, whereby the ccTLD for each country would be used. Here is the mapping they were considering:
www.website.de for, let's say, Germany and www.website.cn for China, with subdomains such as www.berlin.website.de and www.beijing.website.cn.
Google's John Mueller responded that this would not be a good idea.
Let me share John's full response since it can be a bit technical:
My first thought when reading the question was that this is going to be a case for the web-spam team. Please don't create tons of sites that are essentially doorway pages. Also, using wildcard subdomains (assuming the idea is to map them to cities after DNS resolution) make it extremely hard to determine how those URLs should be crawled.
Additionally, I think it's important to mention that a site using this kind of URL structure won't see any unnatural advantage in search. Search engines are just as good at handling URL parameters, there's no need to make it look like a website focused on [cityname],[countryname] when it's essentially just a part of the same website. Unless you have very good reasons to do this outside of web-search, I would recommend simplifying things significantly.
For geotargeting, using a ccTLD is a good way to let users & search engines know about your target audience. For Google, you can also use a gTLD (even the same one for all your sites) and work with subdirectories or subdomains to apply geotargeting there too. That saves you from having to get & maintain all those ccTLDs.
For multilingual content, at least from Google's point of view, the URL is irrelevant as long as it's unique. Use whatever URL structure works for you (and look into using hreflang where it makes sense).
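For the hreflang suggestion at the end of John's answer, here is a minimal sketch using the question's own example domains; each country version lists all of its alternates, including itself:

    <!-- In the <head> of the German homepage (www.website.de) -->
    <link rel="alternate" hreflang="de-DE" href="http://www.website.de/" />
    <link rel="alternate" hreflang="zh-CN" href="http://www.website.cn/" />

The Chinese homepage would carry the same pair of annotations pointing back at both versions.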
Venturing overseas with content, especially with travel-destination type sites, can be very tricky. That is doubly true in the Panda world.

by Barry Schwartz

Wednesday, December 4, 2013

Google: Our Algorithms Don't Stand Still, Neither Should You

I see these threads all the time: a webmaster complaining their rankings suddenly dropped for no reason.
Well, Google's John Mueller responded to one of these, potentially with good news but also Google-like advice.
In a Google Webmaster Help thread John said:
Fluctuations like this can be normal - the algorithms don't stand still and use many signals to review & determine the relevance of pages & sites. Looking at it just now, it seems that it may just have come back again :). That said, working to improve the overall quality of a website is always a good thing. If this is currently one of your busier times, then it might make sense to do that a bit afterwards -- but I'd definitely think about what you could do to get feedback from users (customers or not) already.
I checked the site's rankings and it was not back to number 9 for me. But maybe John saw it back at 9. Either way, the advice is typical advice from Google but important.
In short, Google's algorithms don't stand still, and neither should you.
You should keep working on your site, making it better for users and hopefully Google will appreciate those efforts as the algorithms update.

by Barry Schwartz

Tuesday, December 3, 2013

Blocking GoogleBot By Accident? No Long Term Damage, Google Said.

Did you accidentally block Google from accessing your web site? Google says you don't have to worry: while there may be short-term damage caused by it, you should recover. Google's John Mueller said it should not result in "long term damage."
Here is how John put it:
From our point of view, once we're able to recrawl and reprocess your URLs, they'll re-appear in our search results. There's generally no long-term damage caused by an outage like this, but it might take a bit of time for things to get back to "normal" again (with the caveat that our algorithms change over time, so the current "normal" may not be the same state as it was before).
So if your hosting company has issues or you slip up and block Google, once Google can crawl again, in the long run, you should be okay.
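The most common way sites block Google by accident is a leftover blanket disallow in robots.txt, often carried over from a staging setup; a hypothetical before-and-after sketch:

    # Accidental: blocks every crawler from the entire site
    User-agent: *
    Disallow: /

    # Corrected: allow crawling, keeping only a private area blocked
    User-agent: *
    Disallow: /staging/

Once the block is lifted, as John says, Google recrawls and the pages come back, though it may take a little while.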

by Barry Schwartz