Search Engine Optimization and Marketing

A well-known view on search engines and search engine marketing and optimization


Wednesday, January 29, 2014

Google's Matt Cutts On When Old Sites No Longer Rank Well


Yesterday, Matt Cutts of Google released another video, this one answering why an old site that has always ranked well no longer ranks as well these days.



The question posed was, "How can an older site maintain its ranking over time?"



Matt said that some old sites that have been ranking well for years don't change anything. They keep the 15-year-old template, they don't add any new features or content; they just leave it. Meanwhile, new sites and competitors come into the mix with fresher designs, better user experiences, and new ideas and features.

Eventually, customers start to leave the old site and go to the new one because it offers a better user experience.



Google acts the same way: if you don't continuously improve your site, why should Google continue to rank it well?



Matt urges owners of older domains to take a fresh look at their sites, or else people may leave.



by Barry Schwartz

Saturday, January 18, 2014

Hold Your SEO Client Hostage With Negative SEO Threats

A WebmasterWorld thread has a story of a webmaster whose client received a phone call from someone with an Indian accent threatening to use negative SEO on the client's site if he doesn't pay up. A form of search pirates, if you will.

This is what the webmaster wrote:
So I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn't pay him £10 per month to NOT do this. What the hell...
This is not new; we've covered stories like this before: threats to use the disavow tool against you (which you can't actually do), threats of negative SEO, and much more.

What can you do? Well, you can monitor your new links in Google Webmaster Tools and, for anything that looks bad, add it to your disavow file. It shouldn't cost much or take too much time to do that.

Of course, Google has said time and time again that while negative SEO is hard, it is possible. That is why they came out with the disavow link tool.

So track your new links in Google Webmaster Tools and disavow the bad new links.
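If it helps to picture that workflow, here is a minimal sketch (the file name, URLs, and domains are placeholders, not from the thread) of turning a list of flagged links into a disavow file in the documented format of one URL per line, "domain:" entries for whole domains, and "#" comments:

```python
# Hypothetical helper: write a disavow file from links flagged while
# reviewing new links in Google Webmaster Tools. Format per Google's docs:
# "#" starts a comment, "domain:example.com" disavows a whole domain,
# and a full URL disavows a single page.
bad_domains = ["spam-farm.example"]
bad_urls = [
    "http://spammy-directory.example/page-about-you.html",
    "http://link-network.example/another-page",
]

with open("disavow.txt", "w") as f:
    f.write("# Links flagged after reviewing new links in Webmaster Tools\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
    for url in bad_urls:
        f.write(f"{url}\n")
```

You would then upload disavow.txt through the disavow links tool in Google Webmaster Tools.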

by Barry Schwartz

Friday, January 17, 2014

Google's Matt Cutts: We Don't Have Different Algorithms For Different Ranking Positions

Google's Matt Cutts released a video yesterday answering the question: does Google have different algorithms for different positions in the search results?

That is, do positions one through three use one ranking algorithm, positions four through six another, and so on?

The answer is no, at least for the organic web search results. Of course, the ads have their own algorithms, as do the local results, the videos and so on - but the web results do not work that way.

by Barry Schwartz

Wednesday, January 8, 2014

Google: Your Rankings Dropped Because Of Links, Not Hummingbird

A Google Webmaster Help thread has the owner of an office furniture site complaining that his Google rankings dropped after the Hummingbird update was pushed out.

Of course, you and I know that sounds like a stretch, given that we do not know the exact rollout dates of the Hummingbird algorithm. But it is nice to see Google representatives saying so too.

Zineb Ait Bahajji from Google said it wasn't the Hummingbird algorithm but rather links: not just any links, but over-optimized, spammy links.

Here is what Zineb wrote:
The Hummingbird update does probably not have anything to do with your website's traffic loss. The primary issue here is that you are using certain over-optimization techniques on your site that are in violation of our quality guidelines, especially when it comes to building unnatural and spammy links.
Interesting that she calls it "over-optimization techniques" - don't you think?

by Barry Schwartz

Tuesday, January 7, 2014

Google On What To Do If You Can't Have 100% Unique Content

Google has said from time to time that having some unique content, even if it is only a few sentences, may be enough.

So when Gary Illyes from Google responded to a thread at Google Webmaster Help, his response stood out.
He wrote, "You want to keep in mind that the users (and also our algorithms) prefer unique and compelling content." But if you can't, then what?

Gary wrote that, at a minimum, you must have elements that are unique:
If having 100% unique content is not possible, the webmaster should make sure to have elements in the pages that are unique and valuable for the users, give a good reason for them to visit the site.
There are plenty of sites that do not have 100% unique content, including this one (clearly, since I quote threads). But adding value on top of that content by aggregating it, adding context, and building tools around it - those can be the "elements" that are "unique and valuable."

by Barry Schwartz

Social Optimization: Your Content Or Yourself?

There is an interesting conversation going on at Cre8asite Forums about your clients participating in social networks and the importance of that.
The given is that social is growing in importance as it relates to search rankings, either indirectly or, in the future, directly.
The bigger question in the thread is, do you yourself need to be active in social or does your content need to be socially appealing? The easy answer is both.

But as EGOL said in the thread, not everyone who writes outstanding content is able to participate in social networks. People are uninterested, too busy, confused or otherwise unable to be part of networks like Facebook, Twitter, Google+, LinkedIn, and others.

Does that leave them at a disadvantage, or can the content make up for it?

EGOL wrote:
Although I don't embrace social with my time my website gets a lot of traffic from Facebook, Stumble, Pintrest and especially the TIL part of Reddit. My visitors take my content to social for me because they want to share it with others. It works nicely. Other people do the jobs that I don't want to do and I don't have to pay them. That's how things are supposed to work IMO. It is more genuine to let it happen that way rather than hiring a shill to do it for you.
Indeed, but if there is a conversation going on at a social network and you are not there, that can be an issue. Of course, building your own authority outside of your own web site helps too: as you share content, it spreads through the network naturally.

What is your take on this?

by Barry Schwartz

Saturday, January 4, 2014

Google: Can't Crawl Your Robots.txt Then We Stop Crawling Your Site

Did you know that if Google cannot crawl your robots.txt file, it will stop crawling your whole site?

This doesn't mean you need to have a robots.txt file; you can simply not have one. But if you do have one, and Google knows you do but cannot access it, then Google will stop crawling your site.

Google's Eric Kuan said this in a Google Webmaster Help thread. He wrote:
If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file. If this isn't happening frequently, then it's probably a one off issue you won't need to worry about. If it's happening frequently or if you're worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
This also doesn't mean you can't block your robots.txt file from showing up in the search results; you can. But be careful with that.

In short, if your robots.txt file doesn't return either a 200 or a 404 response code, then you have an issue.
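If you want to catch this before Google does, here is a minimal sketch of that check (the check_robots helper and example.com host are illustrative, not from the thread): a 200 or 404 is fine, anything else is worth raising with your host.

```python
# Check how a site's robots.txt responds, following the rule above:
# 200 (readable) or 404 (no file) are fine; other statuses or an
# unreachable server risk Google pausing its crawl of the site.
import urllib.request
import urllib.error

def check_robots(host):  # hypothetical helper, not a Google API
    url = f"https://{host}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.getcode()
    except urllib.error.HTTPError as e:
        status = e.code  # 4xx/5xx responses land here
    except urllib.error.URLError as e:
        return f"{url}: unreachable ({e.reason}) - crawling may stop"
    if status in (200, 404):
        return f"{url}: HTTP {status} - fine"
    return f"{url}: HTTP {status} - fix this before crawling stops"

print(check_robots("example.com"))
```

Running something like this from a daily monitor, alongside the crawl errors report in Google Webmaster Tools, gives you an early warning.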

by Barry Schwartz

UserInteraction Schema For SEO & Social?

A WebmasterWorld thread has a webmaster who asked if Google will give him a boost because one piece of content on his site hit over 1,000 Facebook likes.

It is one of those weird questions that makes you nod your head sideways. Why?

(1) Google said they don't use Facebook likes or even Google +1s directly in their ranking algorithm. Heck, Google doesn't have access to Facebook data for the most part.

(2) However, if you really have content that is organically loved and liked by that many people, it is natural for a lot of people to link to it and share it, and thus Google will pick up on other signals and likely rank that content well.

That being said, should you better mark up your pages so Google can easily pick up the Facebook Likes on a specific page?

There is a Schema.org type named UserInteraction that lets you mark up social-related activity such as Likes, Checkins, Tweets, Visits, and so forth.
Of course, you need a way to automate this rather than manually entering the data into your HTML code. So you either need a good social tracking system or to integrate directly with each API from Twitter, Google+, Facebook, Foursquare, Analytics, etc.
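As a rough illustration only (the property names follow the old interactionCount style and should be checked against the current Schema.org documentation; the article details and counts are made up), here is a small sketch of generating that markup from counts you have already pulled from the social APIs:

```python
# Sketch: build a JSON-LD block for an article whose social counts were
# fetched elsewhere (e.g. from the Facebook and Twitter APIs). Property
# names use the old interactionCount / UserInteraction convention and
# should be verified against the current Schema.org docs before use.
import json

def social_jsonld(title, url, counts):
    """counts is a dict like {"UserLikes": 1000, "UserTweets": 250}."""
    data = {
        "@context": "http://schema.org",
        "@type": "Article",
        "headline": title,
        "url": url,
        "interactionCount": [f"{kind}:{n}" for kind, n in counts.items()],
    }
    return '<script type="application/ld+json">\n%s\n</script>' % json.dumps(data, indent=2)

print(social_jsonld(
    "Example post",
    "http://www.example.com/example-post",
    {"UserLikes": 1000, "UserTweets": 250},
))
```

The point is less the exact syntax than the pipeline: pull the counts automatically, regenerate the markup, and keep it in sync with the page.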

Do you add this schema data to your site? If not, will you?

by Barry Schwartz

Friday, January 3, 2014

Google: You Must Wait A Few Weeks To Submit A New Reconsideration Request

We know Google doesn't like it when you submit a new reconsideration request a day or two after getting a rejection response to your previous one. They want to see you put in effort after you get that response.

So now, it seems Google is telling webmasters who get reconsideration request responses that they need to wait a "few weeks" before submitting a new one. If they don't listen, Google will ignore their submissions.
A Google Webmaster Help thread has one example, where Darren Jamieson posted the response he received for his site. The new section is:
Removing links takes time. Due to the large volume of requests we receive, and to give you a better chance of your next reconsideration request being successful, we won't review another request from this site for a few weeks from now. We recommend that you take the necessary time to remove unnatural backlinks to your site, and then file another reconsideration request.
It seems like more and more reconsideration request rejection replies have this message in them.

Hat tip to Jon Hogg from iprospect.co.uk for sending me this thread.

Wednesday, January 1, 2014

Google Not Indexing Your Sitemap URLs? Might Be A Canonical Issue

A Google Webmaster Help thread has a webmaster all upset that Google shows that they have indexed none of the URLs they submitted via the XML Sitemap file. Obviously, this can be concerning to any webmaster.

The thing is, you need to be careful which URLs you put in the Sitemap file you submit. If your Sitemap lists the non-www versions of your URLs while your preferred domain is the www version, or vice versa, Google may be very literal and report that it didn't index any of those non-www URLs.

Google's Zineb said in the thread:
All the URLs listed in your XML Sitemap are non www. and they all permanently redirect to the www. version (which is your preferred domain). That explains why these specific URLs are not indexed. In order to fix that, you'll need to specify the right URLs with www., resubmit your Sitemap and wait for it to be processed again.
So technically, it is an "easy fix" and the site is indeed being indexed. But a report like this can be scary to see in Google Webmaster Tools.
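A quick way to spot this yourself, as a sketch (the Sitemap URL and preferred host below are placeholders for your own), is to compare each Sitemap entry's host against the preferred domain you set in Google Webmaster Tools:

```python
# Sketch: flag Sitemap URLs whose host doesn't match the preferred
# (canonical) host - the www vs non-www mismatch described above.
import urllib.request
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_URL = "http://www.example.com/sitemap.xml"   # placeholder
PREFERRED_HOST = "www.example.com"                   # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    host = urlparse(url).netloc
    if host != PREFERRED_HOST:
        print(f"Host mismatch: {url} (expected {PREFERRED_HOST})")
```

If every URL it flags redirects to the preferred version, regenerating the Sitemap with the right host and resubmitting it, as Zineb suggests, should clear the report.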

by Barry Schwartz