Search Engine Optimization and Marketing

A well-known view on search engines and search engine marketing and optimization


Tuesday, August 12, 2014

Google Algorithm Update Around August 8th 2014

I am seeing significant signs of a real Google update, both in terms of summer weekend forum chatter at WebmasterWorld, via the automated tracking tools, and via emails sent to me with screen shots.

The update seems to have started around August 8th or 9th and has continued through the weekend.

Here are some quotes from the WebmasterWorld thread:
I concur with the above observations. Seeing a huge drop in traffic volume and conversions, but spidering is going ballistic!
I join samwest, with big drops, hours with -95% traffic with followed spurts of traffic. Overall traffic is at about 60%. Bounce is near 100%. And yes we changed a little bit the navigation but this is about 8 weeks ago. I know of two others seeing the same. It feels like a new panda.
Here are screen shots from the various automated tools at MozCast, SERPS.com, SERP Metrics and Algoroo:

[Screenshots: MozCast, SERPS.com, SERP Metrics, Algoroo]

Alex, a reader, also emailed me screen shots, limiting his analytics to show Google Organic referrals only. He showed how his site was hit by the January 9th update and recovered on August 8th:

Drop: [screenshot]

Lift: [screenshot]

And his Webmaster Tools data: [screenshot]

Alex also feels this is a major Panda refresh.
I will try to get Google to comment on the update and will keep you posted.

The interesting part: the forums are not lighting up as much as I would expect. But again, this is a summer weekend update.

Update: Google won't comment on this update.

by Barry Schwartz

Thursday, July 10, 2014

SEO: Harder To Know If Human Error Caused Ranking Issues Vs Algorithms


When it comes to ranking issues in Google, there are two main reasons why your rankings can drop. The first is you made a technical mistake on your own site, such as blocking Google from crawling you. The second is Google decided through algorithms or manual actions that your site shouldn't rank as well anymore.

A WebmasterWorld thread has SEOs debating whether it is harder to know if your site was hurt by an algorithm, i.e., Google, versus your own human error.

Google has so many algorithms running, possibly well over two changes per day to the search results. Because of that, is it impossible to know whether a drop is a human issue versus an algorithm? There can be technical SEO mistakes on the site, but even with those, are they the reason your site tanked?

A senior member at WebmasterWorld asked:
Is it time for us to quit simply discounting any statement about a change made yesterday possibly having an effect on rankings today, or not?
The truth is, probably not. You need to look at it on a case-by-case basis. Of course, when I track algorithm changes and Google updates, I look for patterns. If there are a ton of complaints, then you can assume (never a safe word) that Google did release some sort of update that hit you and others. But you cannot always make that assumption, because a technical issue may hit you the same day.

Anyway, this is what you hire SEO firms for. To analyze these things and make suggestions and corrections to things you can control (i.e. your technical site) and things you do not necessarily control.

by Barry Schwartz

Wednesday, June 11, 2014

My Site Is Boring. How Do I Get Links?

The topic of getting links to boring web sites, or web sites that compete with dozens or hundreds of other web sites, is not new. If you sell widgets and there are hundreds of other sites that sell the same widgets, then how do you drive up interest to get people to link to your site?

A WebmasterWorld thread has one person even asking if doing link bait type stories will change the topicality of his site and cause his rankings to go down.

So if he creates an article about his topic but it is about "avoiding" his topic, is that going to hurt him? This is what he wrote:
Site about widgets. Widgets are boring. Very few people link to sites about widgets. Created page on site, example.com/how-to-avoid-crimes-involving-widgets

Good link bait this page, lots of people link to it, page gets more links than the home page and all other pages combined.

Is this going to change how the search engines see the site, topically speaking. Will this site go from being classified as a site about widgets to a site about "how to avoid crimes involving widgets" because of this link bait?
In my opinion, no - this is a good idea. As long as the article is about his widgets, even if it is anti his widgets, it is still the same topic.

I get these phone calls and emails all the time... The site owner complains that no one will link to them because they are selling boring stuff.

I try my best to explain that you need to get out of your box and think bigger.

I give examples of how my boring web development company (scary link, but I vouch, he is a good guy) has drummed up a ton of real links through crazy ideas for mobile apps. Heck, we even built software for a seemingly boring niche: Jewish iPhone apps, Jewish Android apps and even Jewish Google Glass software. Believe it or not, it drives tons of traffic and links.

This helps us with a few things:

(1) Showcase our talents

(2) Help a community we are passionate about

(3) Get buzz about our company, which drives links as well.

Look how much attention JewGlass received and trust me, we are working up new concepts all the time.

So if my boring web development company can use a boring niche to make buzz and get links, I bet your company can do the same.

Come up with a wild software idea, be it an app, glass software, web site tool, a nice piece of content, or something else. Do it, take the risk and let's see if it pays off.

So yes, boring sites have a tough time getting links. For a boring site to get links, you need to create some life on the site.

by Barry Schwartz

Thursday, May 22, 2014

Google Panda 4.0 Now Rolling Out; A New Panda Algorithm

Last night was pretty wild, with Google confirming an update over the weekend targeting spammy queries and also Google's Matt Cutts posting on Twitter that Panda 4.0 was released.

This is a new Panda algorithm, not one of the refreshes we've seen almost monthly; it is significant enough for Google to name it 4.0, which signals an update to the algorithm itself.

Google's Matt Cutts told us at Search Engine Land that this Panda 4.0 update impacts ~7.5% of English queries to a degree that a regular user might notice.

Also, when I spoke to Matt Cutts, he made it sound like this update may appear gentler for some sites, but it does lay the groundwork for future changes in the direction of a softer and gentler Panda algorithm.

This began rolling out yesterday and is unrelated to the Google Spam Algorithm 2.0 released over the weekend and unrelated to anything Penguin.

So no, I was not crazy to expect something big over the weekend and throughout the month and this week. Even though Google said nothing was going on, we've been seeing signs of major changes and ranking shifts all throughout the month. I suspect those were tests for both this Google Spam Algorithm version 2.0 and the Panda 4.0 release.

Google stopped confirming Panda updates last year and then started doing rolling updates, but that doesn't mean they haven't done work on fine-tuning the algorithm. They softened it recently and there have been monthly refreshes, some larger than others, but it is hard to know if a given change was Panda or something else.

Over the next few days, we are going to need to look at our analytics and isolate if we were impacted by one of these algorithms. I'll ask you to fill out a poll in the near future and share those results.

by Barry Schwartz

Saturday, May 10, 2014

Google Preparing Large Update? Shifts On May 2nd & 7th

I mentioned there was some chatter about a Google update on May 2nd but now we are seeing even more chatter and signs of flux in the Google search results on May 7th.

These signs of shifts in rankings and spikes in crawl rates often, but not always, are early warning signs of a major Google update happening in the near future.

Here are flux/volatility charts from MozCast, SERPS.com, SearchMetrics and Algoroo:

All show similar patterns for the most part.

The chatter in the forums, including the ongoing WebmasterWorld thread, follows these patterns as well.

I did ask Matt Cutts if there was an update, but have received no response yet.

I would not be surprised if Google launches something big soon on the algorithm side. It seems they are testing something big.

by Barry Schwartz

Wednesday, April 30, 2014

Google's Matt Cutts: Anticipate The Query To Better Control Titles In Google

Google's Matt Cutts posted a video explaining why and when Google may use something other than your title tag for the search results title snippet.

Matt Cutts suggested that it is best for you to try to anticipate what the user will search for when crafting your title tags. When you do that, and the title matches the query, Google will likely show your title tag.

Google uses three criteria when determining if they should use your title tag:

(1) It is "relatively" short.
(2) It has a good description of the page and "ideally" the site that the page is on.
(3) It is relevant to the query.

If you fail on these criteria, then Google may use (1) content on your page, (2) the anchor text of links pointing to the page, and/or (3) the Open Directory Project.
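As a rough illustration of how you might audit your own titles against these criteria, here is a quick Python sketch. The 60-character cutoff and the simple term-overlap test are my own illustrative choices, not anything Google has published:

```python
# Rough title tag audit inspired by the criteria above. The length limit
# and the term-overlap check are illustrative guesses, not Google's rules.
def audit_title(title, query, max_length=60):
    return {
        "relatively short": len(title) <= max_length,
        "relevant to the query": all(
            term in title.lower() for term in query.lower().split()
        ),
    }

title = "Blue Widgets - Acme Widget Co"  # hypothetical title tag
for query in ("blue widgets", "discount widgets"):
    print(query, "->", audit_title(title, query))
```

The second query fails the relevance check, which is the point of Matt's advice: a title that anticipates what the user will actually type is more likely to be shown as-is.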

by Barry Schwartz

Wednesday, April 23, 2014

Schema.org Actions: What Is It?

On Friday, I reported at Search Engine Land that Schema.org launched Actions, a new markup that describes actions that have taken place in the past [past actions] or could take place in the future [potential actions].

Note, Schema.org is the centralized organization, backed by Google, Bing, Yahoo and other search engines, for standardizing markup that search engines use to better understand text and content on a web page; it is often used for rich snippets.

That being said, I was curious how Google and Bing might use them. Google wouldn't tell me anything, but Bing did give me some details. Bing emailed me answers to my questions:
(Q) Can you explain this a bit better in terms of use cases?
(A) The Action vocabulary is intended to be used primarily for describing actions that have taken place in the past [past actions] or could take place in the future [potential actions]. Let's assume Barry shared an MSN article on Facebook yesterday. This is an example of a past action. Facebook might use schema.org to describe the action by indicating that Jason is the subject (agent) of the action, the action verb is sharing, and the object of the action is an MSN article. Now let's say MSN wanted to expose the ability for applications to programmatically share an article on their website. This would be an example of a potential action. MSN might use schema.org to describe the potential action by indicating the action verb is 'sharing' and that you can perform this action by calling a specific URL.

(Q) How may Bing use this in the search results and is it being used now?

(A) Bing currently uses a draft version of the Actions vocabulary to power the recently released App Linking service. You can learn more about that via the Bing Dev Center and associated MSDN documentation. In addition to App Linking, there are a number of ways in which we might use the vocabulary to power new experiences in Bing and other Microsoft products. Unfortunately there are no definitive plans we can share at this time.

Note that Bing uses other schema.org vocabularies to power its rich web result captions as well. More information on that product is available in the Bing Webmaster Tools.

(Q) What are the goals here for webmasters?

(A) The primary goal of schema.org has always been to provide webmasters with a common vocabulary for use in describing their data. The new Actions vocabulary, especially the terms associated with potential actions, extends this goal to include describing services as well. By providing these descriptions, search engines like Bing and other applications that consume them can leverage the associated information to expose the data and services in a relevant and useful way.
Here are the various definitions allowed in Actions:
  • actionStatus: Indicates the current disposition of the Action.
  • agent: The direct performer or driver of the action (animate or inanimate). e.g. *John* wrote a book.
  • endTime: When the Action was performed: end time. This is for actions that span a period of time. e.g. John wrote a book from January to *December*.
  • instrument: The object that helped the agent perform the action. e.g. John wrote a book with *a pen*.
  • location: The location of the event, organization or action.
  • object: The object upon which the action is carried out, whose state is kept intact or changed. Also known as the semantic roles patient, affected or undergoer (which change their state) or theme (which doesn't). e.g. John read *a book*.
  • participant: Other co-agents that participated in the action indirectly. e.g. John wrote a book with *Steve*.
  • result: The result produced in the action. e.g. John wrote *a book*.
  • startTime: When the Action was performed: start time. This is for actions that span a period of time. e.g. John wrote a book from *January* to December.
  • target: Indicates a target EntryPoint for an Action.
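To make the vocabulary concrete, here is a minimal sketch of what a past-action description might look like as JSON-LD, built in Python. The property names come from the list above; the ShareAction type and all the example values are my own hypothetical choices, and neither Google nor Bing has confirmed consuming markup exactly like this:

```python
import json

# Sketch of a past action using the Actions vocabulary above. The property
# names (agent, object, startTime, actionStatus) come from the schema.org
# list; the ShareAction type and all values are hypothetical examples.
past_action = {
    "@context": "http://schema.org",
    "@type": "ShareAction",  # the action verb: sharing
    "agent": {"@type": "Person", "name": "Barry"},
    "object": {"@type": "Article", "url": "http://example.com/article"},
    "startTime": "2014-04-22T10:00:00Z",
    "actionStatus": "CompletedActionStatus",
}

# This JSON would typically be embedded in a page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(past_action, indent=2))
```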
by Barry Schwartz

75% Of SEOs Want Yahoo To Return To Search

A couple of months ago, new rumors surfaced about Yahoo making a return to search.

With those rumors, we asked if you think Yahoo should indeed make that effort or give up now. With over 200 responses to our poll, the majority would like to see Yahoo return to search: 75% of you said yes, Yahoo should get back into search. Only 18% said no, they should not, and 7% don't care either way.

I suspect the 75% said yes so that there would be more diversity and competition in the search space. Or maybe they hate Microsoft?

Here is the pie chart, with more purple in it: [poll results pie chart]

by

Friday, April 18, 2014

Google's Cutts On Big SEO Myths Are...

Google's Matt Cutts released a video on the topic of some of the largest SEO myths out today.

He broke it down into two categories:

(1) Ads and their influence on organic results.

(2) Quick fixes to break Google's algorithm.

On the ads front, Matt Cutts said there are two myths: (1) if you buy ads, your organic rankings will go higher; (2) if you don't buy ads, your rankings will go higher. He also added that people think Google makes changes to its organic results to drive more people to buy ads. All of this is untrue and a myth, according to Matt Cutts.

On the quick fix end, Matt said there is too much "group think" in the forums, particularly black hat forums. He said he sees this all the time: for a couple of months one person will say tactic X works awesome, then a few months later tactic Y, and so on. For example, someone might say article directories work, then later guest blogging, then later link wheels, and this process goes on and on. Also, someone might say a specific tool works very well. Matt said the truth is, if someone found a loophole, they wouldn't sell it as an ebook or software product; they'd use it themselves for as long as possible before others catch on.

by Barry Schwartz

Tuesday, April 15, 2014

Google Panda Refresh Or Softer Panda Update?

Let me start off by saying that I normally would wait another 24 hours before posting this because the chatter is so new, but I will be offline tomorrow and the day after and I am going with what I see now.

It seems, based on the very early chatter, that a Panda refresh started late last night or this morning. Some are asking if it is version two of the softer Panda update that Google's Matt Cutts promised.

The ongoing WebmasterWorld thread has posts from late yesterday and early this morning with questions about changes at Google that seem to relate to sites impacted by the Panda algorithm.

One webmaster said:
Ok, I see a bad sign of another silent update. Lowest traffic in the last 5 years. It looks like a Panda reiteration.
Another webmaster said this is the opposite of the "softer" update they were expecting:
My figures and search results seems very similar to the pre-soft-Panda...
A senior member agreed:
@Mentant, I can confim your observation. Lowest traffic ever. All our main keys are gone, replace by brands that do not have anything in common with the search string. For sure this is a Panda. I think they took an old one and let it go through the index. Google keeps going their way to destroy all ecom except amazon/ebay. There is no sign of "Leveing" or even Panda being softer.
So the questions we have now:

(1) Is Google pushing out an update?

(2) If so, was it just a typical monthly Panda refresh?

(3) Or was it the softer Panda update that seems harder for many?

Forum discussion at WebmasterWorld.

Update: Google told me on the record there was no update:
Just checked with the team and there is nothing we're aware of on this. Thanks for reaching out.
by Barry Schwartz

Friday, April 4, 2014

Matt Cutts: PageRank Not Popularity, Topical PageRank Helps & Authority On Subjects Coming


In yesterday's video from Google's Matt Cutts, Matt summarized three topics that aren't necessarily new but are often confused by folks in the webmaster and SEO community.

In short, he said:

(1) PageRank is not a measure of popularity
(2) Topical PageRank may help determine if a specific site is a good match for a specific query
(3) Google has algorithm changes they are working on to improve their understanding of who is an authority

On PageRank not being a measure of popularity, that makes sense to us but not to all. Matt's example is key: porn sites are way more popular than government sites in terms of traffic and usage, but few people link to porn sites, and thus their PageRank is not as high as that of government sites, which many people link to.

On topical PageRank, that also makes a lot of sense and most SEOs get this. This is why anchor text was/is so important and likely more important than raw links. I won't go into this more, I am tired.

Finally, Matt mentioned again that Google is working on algorithms to better determine who is an authority on a given subject.
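To make the popularity-versus-links point concrete, here is a minimal textbook PageRank power iteration in Python. This is the classic published algorithm, not Google's actual production system, and the link graph is hypothetical: scores come only from who links to whom, so a heavily visited page with few inbound links still ends up with a low score.

```python
# Textbook PageRank power iteration (the classic algorithm, not Google's
# production system). Scores depend only on link structure, not traffic.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for out in outlinks:
                    new_rank[out] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Hypothetical graph: "popular.example" gets lots of visits but almost
# no inbound links, so its PageRank stays low despite its traffic.
links = {
    "gov.example": ["news.example", "blog.example"],
    "news.example": ["gov.example"],
    "blog.example": ["gov.example"],
    "popular.example": ["gov.example"],
}
print(pagerank(links))
```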

by Barry Schwartz

Wednesday, March 26, 2014

Google's Matt Cutts On Telling If Your Site Was Hit By Algorithm

The truth is, for an experienced SEO, this video sheds nothing new on the question of determining whether your site was hit by an algorithm.

In short, the best way to tell if you were hit by a Google algorithm such as Panda or Penguin is to check your analytics and see if you had a major dive in traffic from Google on a specific day. If so, write down that date, go to our Google updates section here, and see if the date corresponds with anything reported. If not, then you are out of luck. Well, not exactly.

Matt describes three reasons why a ranking drop might occur:

(1) Manual Actions
(2) Crawling Errors or Issues
(3) Algorithmic Penalty

(1) Manual actions show a notification in Google Webmaster Tools, so it is clear cut, Matt said.

(2) Crawl errors also are likely to show in Google Webmaster Tools, often clear cut also.

(3) Algorithmic penalties are not thought of as penalties; they are algorithms for ranking. General quality and other algorithms determine rankings, so it is hard to tell if an algorithm is hurting you. But Google will communicate large-scale algorithm changes, such as Panda or Penguin. They will tell you on what date they ran; this way you can check the date and see if that algorithm had an impact on your site.
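As a sketch of the "write down that date" advice, here is a small Python routine that flags days where Google organic traffic fell sharply versus the trailing average, then checks each date against a list of known updates. The traffic numbers, the 40% threshold, and the update list are all hypothetical and illustrative:

```python
from datetime import date, timedelta

# Flag days where Google organic traffic dropped sharply versus the
# trailing average, then compare against known update dates. The numbers,
# threshold, and update list are illustrative, not official.
known_updates = {
    date(2014, 2, 6): "Page Layout refresh",
    date(2014, 5, 20): "Panda 4.0 announced",
}

def find_dives(daily, window=7, threshold=0.4):
    """daily: list of (date, sessions) pairs, oldest first."""
    dives = []
    for i in range(window, len(daily)):
        avg = sum(v for _, v in daily[i - window:i]) / window
        day, value = daily[i]
        if avg > 0 and value < avg * (1 - threshold):
            dives.append((day, value, avg))
    return dives

# Hypothetical daily Google organic sessions.
daily = [(date(2014, 5, 10) + timedelta(days=i), 1000) for i in range(10)]
daily.append((date(2014, 5, 20), 450))

for day, value, avg in find_dives(daily):
    note = known_updates.get(day, "no known update; check your own site changes")
    print(f"{day}: {value} sessions vs {avg:.0f} average -> {note}")
```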

But as you improve your site and the algorithms run, your rankings can improve.

At WebmasterWorld, GoodROI, the administrator, said:
For people (especially newbies) having trouble making money online they should remember most things are interconnected. For example if you publish poor content it will lead to weak link development because no one likes linking to poor content. There are ripple effects when working on different parts of your site.
by Barry Schwartz



Saturday, March 22, 2014

Lots Of Google Search Activity: Likely Manual Vs. Algorithmic

The ongoing WebmasterWorld thread that tracks Google activity has webmasters and SEOs asking if there are algorithmic updates going on, either Panda or Penguin. I can say Google has not told us about any update, but there have been a heck of a lot of penalties, manual actions and widespread manual targeting over the past couple of weeks.

For example, this week we had a blog network penalized, impacting many of the publishers associated with that network. Last week, Google took action on link networks in Spain, Germany and Italy. Earlier, Google also targeted networks in Poland and France.

Google has clearly stepped up their activity in a very manual way, targeting networks that use links to manipulate Google's rankings. This is both on a large scale, going after larger networks, as well as the daily smaller scale manual actions.

So there may have been a Panda refresh, since those are pushed out roughly monthly. But if Google had pushed out that softer Panda update, Google would have likely told us.

I may be wrong, but most of the tracking tools seem pretty flat, with the exception of one.

by Barry Schwartz

Saturday, March 15, 2014

Google's Matt Cutts: New Softer Panda Update Coming Soon

At SMX West, Matt Cutts of Google announced they are actively working on the "next generation" Panda update that will "soften" the algorithm.

Matt specifically said this is aimed at helping small businesses that may be impacted by the Panda algorithm. There was no date given on its release but he made it clear, this will be a bigger update that will make Panda less of an impact on certain sites.

[Photo: a Google Glass vignette view from Danny Sullivan while on stage, showing my Search Engine Land story on the topic.]

Didn't Google already soften the Panda algorithm? Yes. They announced it last May and rolled it out in July, but only 18% of sites recovered fully with that last softer Panda update.

I assume when it rolls out, Matt Cutts will let us know, unlike other Panda refreshes.

by Barry Schwartz



Monday, March 10, 2014

Google: Keep URL Length Shorter Than 2,000 Characters

SEOs obsess about the smallest things, even how long is too long for a URL.

A Google Webmaster Help thread has SEOs and webmasters asking how long a URL can be. Google actually answered the question.

John Mueller of Google said that while there is "no theoretical length limit" and URLs can go on forever, Google recommends you keep them under 2,000 characters. Google's John Mueller wrote:
As far as I know, there's no theoretical length limit, but we recommend keeping URLs shorter than 2000 characters to keep things manageable.
It is interesting because DoubleClick, a Google company, maxes out at 2,000 characters in a URL. It also seems the GET method maxes out at about 2,000 characters for a URL, and Internet Explorer can't go much beyond 2,000.

If you are unsure if 2,000 characters in URLs is too much, this is what it looks like:

http://www.seroundtable.com/google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219-google-url-characters-18219123.html

 So it isn't easy going all the way to 2,000 characters in a URL.
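If you want to check your own URLs against that guideline, here is a quick Python sketch that reads a sitemap and flags anything over 2,000 characters. The sitemap location is a hypothetical example; the namespace is the standard sitemaps.org schema:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Flag sitemap URLs longer than the ~2,000 character guideline John
# Mueller mentions above. The sitemap location is hypothetical.
LIMIT = 2000
SITEMAP = "http://www.example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    if len(url) > LIMIT:
        print(f"{len(url)} chars: {url[:80]}...")
```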

by Barry Schwartz

Wednesday, March 5, 2014

Got A Google Penalty? Should You Start A New Site?

As Google penalties become more transparent, recovering from them seems to get harder. Even when you do recover, the rankings don't always return.

In a recent column named When The Best SEO Move Is To Kill The Site, Eric Ward concluded that "in almost two-thirds of the cases I advised that the best move was to kill the site." This was in the context of unnatural link penalties or Penguin-related issues.

The question is, is that true? Is it often easier to kill off the site?

Matt Cutts has said time and time again that digging yourself out of a spam hole is often harder than starting fresh.

Also, now that we know penalties may follow you to your new domain if you don't start a truly new web site, the decision to kill off a site is even more costly and time-consuming.

If it was as simple as copying your site to a new domain name, switching might make sense more of the time. But if you need to rewrite your content and redo your CMS and design, then it can take a long, long time.

Google's John Mueller posted on Google+ a comment about Eric Ward's article saying:
It's never a decision to make lightly, but there can be situations where a website has built up so many problems, that it may appear easier or faster to start over with a fresh & new website, rather than to try to fix all of those problems individually. This isn't an easy way to get past problems that have been built up over the years, it's a lot of work to create a new website, even if you already know the business area. If you feel you're in this situation, make sure to get advice from friends & other people that you trust (including webmaster communities where you trust their opinions) before doing anything drastic!
In a Google Webmaster Help thread, John Mueller advised someone in a hole that if he goes the new-site route, he should start fresh. John wrote:
If you're creating a new website, and don't want to be associated with the old one, I'd strongly recommend really making a *new* website and not just moving the content to a different domain. You don't need to wait for anything in a case like this -- it's fine to remove (or block) the old website, and to create a really new one elsewhere at the same time.
So making the decision to start new is not easy. If it was me, I'd go in this order:

(1) Try removing the bad links
(2) Submit a reconsideration request
(3) Repeat this a few times until it is successful
(4) Wait two months for traffic to change
(5) If no traffic change then start a new site

Of course, it is not always this black and white, and the specific situation might change the solution. Perhaps you put a ton of money into your brand name and can't go elsewhere. Or there are investors you need to worry about. Or you simply can't make a new site.

It is a shame to have to deal with this stuff.

by Barry Schwartz


Friday, February 28, 2014

Ex-Googlers Explain Why They Won't Give Up The Secrets To Ranking In Google

I spotted a thread at Black Hat World asking why we have not seen a former Google employee who worked in search quality sell their knowledge or publish the secret sauce on what it takes to rank well quickly and get rich fast. Or why haven't these former Googlers exploited their knowledge themselves to rank well in Google and make a quick buck?

I decided to ask some former Googlers who worked on the search quality team at Google. What they said may be interesting to you. This is all captured in a Google+ thread.

Pedro Dias, someone at the higher levels of the Search Quality Analyst team, worked at Google for about five years before leaving to move closer to home, I believe. He gave several reasons why he has personally not given up the secret sauce.
  • Because SQ Googlers hate spam and anything related to cheating or manipulation... We have seen the darkest of the web... Really;
  • Because this would undermine the trust between all my current and ex-colleagues;
  • Because we prefer to have it long term, than to work for the "get rich quick";
  • Because we don't see SEO as "gaming the system"... Despite what many
    SEOs say. We have our own vision, we prefer to think like Search
    Engineers and help businesses understand Search rather than selling
    magic formulas and torching them.
  • Because we signed an NDA, although it doesn't count as much as the points above.
  • And sometimes because we like to know stuff that others don't and keep it like that :P
I grilled him a bit more, because I really wonder if they do know the secret sauce. I asked Pedro, "Do you have deep dark secrets that you can use to exploit the algo and rank #1 for [viagra] or something that competitive?" He responded:
Again, I would prefer not to go into such details mentioning that I know X, Y or Z... I feel fortunate to have been part of a select group and touched some very important and exclusive areas, that's all I can say...
I guess once a Googler, always a Googler. Kaspar Szymanski, who worked as a Search Quality Strategist and was at Google for about seven years, chimed in also. He told me:
Just like my clients value expertise, they equally value integrity.
Fili Wiese, who also worked at Google for about seven years, not just on the search quality team but also the ad quality team, kept it short as well: "I agree with Pedro Dias and Kaspar Szymanski."

Again, I still wonder if Google has anything that one of these former Googlers could truly exploit to get rich quickly. Yeah, I am sure there are some things that would work in the short term, and Googlers and non-Googlers alike can figure those out. But long term, you still need to know the fundamentals.

It just happens that all of these Googlers quoted here act in somewhat of an SEO and search consultant role now.

by Barry Schwartz

Google's Matt Cutts: Content Clarity Over Technical Content

There is an excellent video from Google's Matt Cutts on the question: should I focus on clarity or jargon when writing content?
The short answer is to focus on clarity over jargon.
Matt explains that in most cases, having content that most people understand is way more important than having all the scientific and technical jargon about the topic you are covering. If you can't explain the topic to a novice, then the reader likely won't be able to understand your content.
Best case, start off explaining it in simple terms and get more technical as you go. But if you had to pick, it seems Matt is saying content clarity is more important than detailed technical and scientific content, in most cases.
by Barry Schwartz

Saturday, February 22, 2014

Google's Matt Cutts Stops By WebmasterWorld After Three Years Of Silence


The last time Matt Cutts of Google posted something at WebmasterWorld was over three years ago, on January 4, 2011. He broke his silence there yesterday, posting in a private member-only WebmasterWorld thread.

Why does this matter? I am not sure, but he was the infamous GoogleGuy at WebmasterWorld for years and years. Not posting there, which was like a home to Cutts, for three years is a pretty long time. Of course, Matt is busy helping webmasters at a larger scale with his videos and blog posts.

What did Matt post? I won't share all the details of the thread exactly, since it is a private thread. But I will say a large site (which was unnamed) accidentally blocked themselves with a robots.txt file and was looking for ways to get reindexed quickly. Matt suggested using the Fetch as Googlebot feature to expedite things.

Here is a screen cap of what Matt wrote: [screen capture]

So the tip is useful in that we know Fetch as GoogleBot can expedite some indexing.



by Barry Schwartz

Thursday, February 20, 2014

Signs Of More Google Algorithm Updates Early This Week


We reported about a possible Google update around February 13th, and I do believe there was something rather large on that date. But now there is more chatter about a possible update early this week, this past Sunday and Monday, at Google.

The ongoing WebmasterWorld thread has a lot of discussion from webmasters about a possible update. Here are some recent comments:

Has anybody noticed a drop in their rankings yesterday or today specifically?
Yes, but could just be the continual state of flux I see. Though Mozcast reports activity, so you may be on to something.
Another update happened 100%. Expect a bumpy few days. If you recover, great. If not nothing you can do. Check your money terms, see if the organics have all but disappeared below ads & image results. My money is they have used Google Shopping data to spot the money terms. If your site relies on that term kiss goodbye to organic traffic.
Now, some of the tracking tools show higher-than-normal activity on those days, including Mozcast and Algoroo, but SERPs.com and SERP Metrics seem pretty flat.

Did you notice ranking changes and referral changes from Google earlier this week?



by Barry Schwartz

Friday, February 14, 2014

Google February Update: Possibly A Google Panda Refresh?


There is a tremendous amount of chatter going on about Google updating over the past couple days or so at WebmasterWorld.

On February 6th we had the Page Layout algorithm update but that didn't cause much of a fuss in the SEO forums. But something over the past couple days is.

Moderator Travelin Cat said:

I'm following about 25 client sites and all but 3 had a huge jump in traffic on the 11th. Some doubled their traffic. Hoping this is a trend going forward. Also, these are all in the U.S., on the West Coast.
There are a lot of people who agree and are noticing changes between the 11th and today.

Mozcast shows heavy activity earlier on but not on the 11th; however, it has not updated today yet, so who knows. SERPs.com also shows a steady high-volatility pattern. SERP Metrics is off the charts on February 12th, and Algoroo also shows a lot of high activity.

Plus, we have a lot of chatter at WebmasterWorld and the other Google forums.

Have you noticed a change in rankings at Google on February 11th through 13th? Some are suspecting it might be the monthly Panda refresh.



by Barry Schwartz

Wednesday, February 12, 2014

Google's Page Layout Algorithm Updated For Third Time


Yesterday, Google's Matt Cutts announced on Twitter that they have pushed out a refresh to the Google Page Layout algorithm on or around February 6th.

He did not specify how much this impacted the search results, but based on my analysis of the SEO community, it had a very small impact on most SEOs.

This would be the third update to the page layout algorithm. Since it is just a refresh, it doesn't mean Google updated the algorithm, but rather reran it and updated its index. So I doubt Google would classify it as version 3.0; 1.2, maybe.


What does it look like when a site gets hit by this? Here is a screen shot from WebmasterWorld of someone who was hit: [screenshot]



Again, very few seem to have been impacted by it, but there are some, and they are complaining in the forum threads I link to below.



by Barry Schwartz

Tuesday, February 11, 2014

Is Google Misleading Us? If So, How?


You've all seen it: parking signs so confusing that sometimes you wonder, is the local government trying to mislead you...



Webmasters are asking themselves the same question about Google. With all the blog posts, videos and guidelines, is Google setting SEOs and webmasters up for disaster one way or another?



Last year, we asked if you felt Google was lying to you, and only 10% said no. Over 55% said Google is lying to us, and 31% said Google lies to us sometimes.



The post has 150+ comments with examples and debates.



The new WebmasterWorld thread asks in a more subtle way: "What Official Google SEO Advice is Misleading or Misunderstood?"



Greg Niland started off saying that guest blogging is one of those examples:

For example, Matt Cutts has recently said that if you were using guest blog posts "you should probably stop". This led many SEO people to start assuming that Google will consider every guest blog post to be bad. Matt Cutts had to clarify his initial statement and say that some guest blog posts, when done in a relevant and professional manner, can be good for your online business.
I am sure you all have many examples - do share.



by Barry Schwartz

Thursday, February 6, 2014

Google To Bring It On German Link Spammers


Yesterday, Google posted a stern warning on the German Google Webmaster blog, telling webmasters that they will be penalized for unnatural links.



Matt Cutts then tweeted the same warning, saying "A reminder (in German) that paid links that pass PageRank violate our guidelines." This comes the week after Google took action on a French link network, where Matt also dropped a hint about German link networks getting hit as well.



The question is, when will German SEOs take notice of this? Which networks will this impact exactly? How much of an impact will this have on SEOs?



Former Googler Pedro Dias tweeted that he thinks this is going to target "German newspapers that sell links." But I am hearing from others that these are much larger than just newspapers: massive link networks in Germany.



Anyway, as I said before, Google likes to issue warning after warning, then notices, and then ranking impacts. Why? To break their spirits.



by Barry Schwartz

Google Sends Manual Actions For Rich Snippet Spam & Spammy Structured Markup


Rich snippet spam has been an issue since rich snippets came out, and Google eventually added a tool to report rich snippet spam. Then Google recently reduced the number of rich snippets showing in the search results.



It seems like Google is now sending out notifications to those who have been spammy with their rich snippets.

One webmaster posted a notification he received in the Google Webmaster Help forums of a manual action he received for "Spammy structured markup."



Here is the text of that notification:

Spammy structured markup
Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Rich Snippet Quality guidelines.
This is the first time I have seen a webmaster report getting a manual action sent to them for spammy structured markup.



by Barry Schwartz 


Tuesday, February 4, 2014

DMOZ Drops Over 1 Million Sites From Directory?

Did you notice that DMOZ, one of the oldest and largest human-crafted web directories, has removed over 1 million sites and 10,000 editors from their directory?

A DigitalPoint Forums thread first noticed it. If you look at the live site now, you will see 4,261,763 sites, 89,252 editors and over 1,019,865 categories listed in the footer. But if you go to the Wayback Machine archive, you will see 5,310,345 sites, 99,997 editors and over 1,019,508 categories.

Here are screen shots:

NOW: [footer screenshot]

OLD: [footer screenshot]

As you can see, DMOZ dropped about 1 million sites from their directory and 10,000 editors. There was no announcement about this, so I am not sure if this is just a glitch in the footer.

They did, however, post a rare blog post announcing a new feature for reporting listings. Of course, most of you don't bother with DMOZ listings anymore anyway, but still, it is interesting to see 1 million sites just vanish from DMOZ.

by Barry Schwartz

Saturday, February 1, 2014

Can You Rank In Google Without Content?


A WebmasterWorld thread has a webmaster whose site doesn't have any real content. It is basically statistical downloads and specifications, downloadable as PDFs or ZIP files.



Can you rank web pages with no content at all in Google?



A good example of a page that ranks without having the exact words on it is the Adobe Reader page which ranks for [click here].



But what about a page with almost no content? Is it possible to rank on anchor text alone?



Yes, but it has to be for very obscure and non-competitive words.



by Barry Schwartz

Wednesday, January 29, 2014

Google's Matt Cutts On When Old Sites No Longer Rank Well


Yesterday, Matt Cutts of Google released another video, this one answering why an old site that always ranked well, no longer ranks as well these days.



The question posed was, "How can an older site maintain its ranking over time?"



Matt said that some old sites that have been ranking well for years don't change anything. They leave the 15-year-old template; they don't add any new features or content. They just leave it. Meanwhile, other new sites and competitors come into the mix with fresher designs, better user experiences, new ideas and features.

Eventually, customers start to leave the old site and go to the new site because it is a better user experience.



Google acts the same way: if you don't continuously improve your site, why should Google continue to rank it well?



Matt urges owners of older domains to take a fresh look at their sites, or else people may leave.



by Barry Schwartz

Saturday, January 18, 2014

Hold Your SEO Client Hostage With Negative SEO Threats

A WebmasterWorld thread has a story of a webmaster who received a phone call from someone with an Indian accent threatening to use negative SEO on their site if they don't pay up. A form of search pirates, if you will.

This is what the webmaster's client said:
So I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn't pay him £10 per month to NOT do this. What the hell...
This is not new; we've covered stories like this before: threats to use the disavow tool against you (which you can't do), threats of negative SEO and much more.

What can you do? Well, you can monitor your new links in Google Webmaster Tools and add anything that looks bad to your disavow link file. It shouldn't cost much or take too much time to do that.

Of course, Google has said time and time again that while negative SEO is hard, it is possible, which is why they came out with the disavow link tool.

So track your new links in Google Webmaster Tools and disavow the bad new links.
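As a sketch of that workflow, here is a tiny Python script that writes a disavow file from links you have flagged. The one-entry-per-line format with "domain:" prefixes and "#" comments is Google's documented disavow file format; the flagged links and domains themselves are hypothetical:

```python
# Build a disavow file from links flagged while monitoring new links in
# Webmaster Tools. The file format (one URL or "domain:" entry per line,
# "#" for comments) is Google's documented disavow format; the example
# links and domains below are hypothetical.
bad_urls = [
    "http://spammy-directory.example/widgets-page",
    "http://link-network.example/our-partners",
]
bad_domains = [
    "paid-links.example",
]

with open("disavow.txt", "w") as f:
    f.write("# Flagged after reviewing new links on 2014-01-18\n")
    for url in bad_urls:
        f.write(url + "\n")
    for domain in bad_domains:
        f.write("domain:" + domain + "\n")

print(open("disavow.txt").read())
```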

 by Barry Schwartz

Friday, January 17, 2014

Google's Matt Cutts: We Don't Have Different Algorithms For Different Ranking Positions

Google's Matt Cutts released a video yesterday answering the question: does Google have different algorithms for different positions in the search results?

So do positions one through three use one ranking algorithm, positions four through six another, and so on?

The answer is no, at least for the organic web search results. Of course, the ads have different algorithms, the local results have different algorithms, the videos and so on, but the web results do not work that way.

by Barry Schwartz

Wednesday, January 8, 2014

Google: Your Rankings Dropped Because Of Links, Not Hummingbird

A Google Webmaster Help thread has an office furniture site complaining that his Google rankings dropped after the Hummingbird update was pushed out.

Of course, you and I know that claim sounds a bit much, being that we do not know the exact rollout dates of the Hummingbird algorithm. But it is nice to see Google representatives saying so too.

Zineb Ait Bahajji from Google said it wasn't the Hummingbird algorithm but rather links; not just any links, but over-optimized links.

Here is what Zineb wrote:
The Hummingbird update does probably not have anything to do with your website's traffic loss. The primary issue here is that you are using certain over-optimization techniques on your site that are in violation of our quality guidelines, especially when it comes to building unnatural and spammy links.
Interesting she calls it "over-optimization techniques" - don't you think?

by Barry Schwartz

Tuesday, January 7, 2014

Google On What To Do If You Can't Have 100% Unique Content

Google has said from time to time that having some unique content, even just a few sentences, may be enough.

So when Gary Illyes from Google responded to a thread at Google Webmaster Help, his response stood out.
He wrote, "You want to keep in mind that the users (and also our algorithms) prefer unique and compelling content." But if you can't, then what?

Gary wrote that you must have elements that are unique.
If having 100% unique content is not possible, the webmaster should make sure to have elements in the pages that are unique and valuable for the users, give a good reason for them to visit the site.
There are plenty of sites that do not have 100% unique content, including this one (clearly, since I quote threads). But adding value on top of that content by aggregating it, or adding context and tools on top of it, can be the "element(s)" that are "unique and valuable."

by Barry Schwartz

Social Optimization: Your Content Or Yourself?

There is an interesting conversation going on at Cre8asite Forums about your clients participating in social networks and the importance of that.
The given is that social is growing in importance as it relates to search rankings, either indirectly or, in the future, directly.
The bigger question in the thread is: do you yourself need to be active in social, or does your content need to be socially appealing? The easy answer is both.

But as EGOL said in the thread, not everyone who writes outstanding content is able to participate in social networks. People are uninterested, too busy, confused, or otherwise unable to be part of the social networks, i.e. Facebook, Twitter, Google+, LinkedIn and others.

Does that leave them at a disadvantage, or can the content make up for it?

EGOL wrote:
Although I don't embrace social with my time my website gets a lot of traffic from Facebook, Stumble, Pintrest and especially the TIL part of Reddit. My visitors take my content to social for me because they want to share it with others. It works nicely. Other people do the jobs that I don't want to do and I don't have to pay them. That's how things are supposed to work IMO. It is more genuine to let it happen that way rather than hiring a shill to do it for you.
Indeed, but if there is a conversation going on at the social network and you are not there, that can be an issue. Of course, building your own authority outside of your own web site also helps: as you share content, it spreads through the network, and that also is natural.

What is your take on this?

by Barry Schwartz

Saturday, January 4, 2014

Google: Can't Crawl Your Robots.txt Then We Stop Crawling Your Site

Did you know that if Google cannot crawl your robots.txt file, that it will stop crawling your whole site?

This doesn't mean you need to have a robots.txt file; you can simply not have one. But if you do have one, and Google knows you do but cannot access it, then Google will stop crawling your site.

Google's Eric Kuan said this in a Google Webmaster Help thread. He wrote:
If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file. If this isn't happening frequently, then it's probably a one off issue you won't need to worry about. If it's happening frequently or if you're worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
This also doesn't mean you can't block your robots.txt from showing up in the search results, you can. But be careful with that.

In short, if your robots.txt file doesn't return either a 200 or 404 response code, then you have an issue.
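Here is a minimal Python sketch of that check: fetch /robots.txt and make sure it answers with a 200 or a 404, per Eric Kuan's explanation above. The site URL is a hypothetical example:

```python
import urllib.error
import urllib.request

# Per the rule above: a robots.txt returning anything other than 200
# (file served) or 404 (no file) can cause Google to stop crawling.
def check_robots(site):
    url = site.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url) as resp:
            status = resp.status
    except urllib.error.HTTPError as e:
        status = e.code
    if status in (200, 404):
        print(f"{url} -> {status}: fine")
    else:
        print(f"{url} -> {status}: crawling may stall")

check_robots("http://www.example.com")  # hypothetical site
```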

by Barry Schwartz