Google Buzz Reduces The Noise? My Irish Arse It Does!

I listened yesterday and started to believe the hype about how Google Buzz was going to make my social circle more relevant, reduce the noise, filter out the crap and generally right the wrongs of the universe.

Why then, Google, have I woken up this morning to 1,000+ new items in my Google Reader, shared by people I am following on Google Buzz?

[Image: Google Reader clogged by Google Buzz shares. Caption: “Google Buzz = Less Noise?”]

This is a perfect example of what Sara was talking about in her post yesterday when discussing Google’s need to have a single shared address book, or set of contacts, across all their services. Google doesn’t get that sometimes you don’t want to follow the same people everywhere.

For example, I had my Google Reader set up (prior to Buzz) just the way I like it, but now, on top of the items from the 300+ blogs and news sources I check each day, I’ve got all the shares from many of the same people I am already subscribed to. What gives? I did not subscribe to their shares in Google Reader!

That’s like saying that because I subscribe to your blog, I have to get all your tweets as well!

Sorry, Google, but this is a major fail: you’ve just massively increased the noise-to-signal ratio I have to deal with!

Has Google Gone A Step Too Far By Forcing Buzz On Users?

I know Google desperately wants people to use Google Buzz, but as Steven Hodson pointed out, growth of the network will always be limited by the fact that you have to have a Gmail account in order to use the service.

Google Buzz is a clever trap, but a trap all the same. It is the hunk of cheese to get more people using Gmail which in turn locks users into Google even more.

Google, in its attempt to ensure adoption, has taken the kind of step that hasn’t been seen since the Microsoft of the ’90s and actually forced all Gmail account holders into being users, whether they want to be or not.

As Mark Davidson said on Facebook earlier tonight:

I’m not sure why but I’m bothered by Buzz. I don’t like it, I don’t want it. But I have it. Sure, I don’t have to use it. I think maybe it’s because for the first time, Google has forced a web tool on me. I’ve been using Gmail since 2004. If I love Gmail, and I do, I’m forced to have Buzz. I’m about 10 minutes away from …re-installing MS Office so I can use Gmail as a relay for Outlook again. You know what I like? Choice. That’s what got me using Google web tools in the first place. Today is the first day I’ve ever viewed Google in the same light as I viewed Microsoft in the mid-nineties.

It’s this kind of move that could result in a temporary boost for Gmail and Google’s other services, but as I’ve already seen tonight, people are complaining about things like unremovable messages in their inboxes, and of course the pre-existing faults caused by Google’s need for a universal address book, which won’t let you delete contacts from one Google service without deleting them from all of them.

This forced use could ultimately be detrimental to Gmail, as users who don’t want an intrusive social network clogging up their inbox choose to go to less crowded and more traditional email systems. Yes, you can stop Buzz features from appearing in your inbox after the fact, but once you are in, you’re in.

As I’ve said before: the right tool for the right job. Morphing Gmail into a full-featured social network may stop it being a productivity tool and turn it into another Facebook/Twitter timewaster.


Google Hasn’t Built A Twitter Killer: Google Buzz

FYI: I started writing this before the Google announcement today and finished it while watching the event live. As a product, GBuzz looks great.

Google doesn’t want to build, nor is it trying to build, a “Twitter killer”. What Google wants is your information, in order to better target advertising.

Every time you send an email, update your status, chat on Google Talk, share an item on Google Reader, post a video to YouTube or a picture to Picasa, Google gets a little more information about you that it can use to better tune the advertising you see, in the hope that you will click on one of those ads.

Information like a status update disappears as soon as it is used, but by offering timelines where you can see the status updates of your friends, Google can keep you in its service a little longer, and in front of its advertising a little longer.

By associating Facebook and Twitter IDs with your Gmail contacts, Google learns a little more about both you and your contacts. They have already announced that you will be able to import RSS streams from other services and connect with Twitter, and there was a hint of Facebook Connect. It was driven home that the platform will be made as open as possible, and it was noted that if you tweet and it gets imported into GBuzz, then it could end up on somebody else’s recommended list.

GBuzz is shaping up to be a wonderful-looking social experience, but it boils down to being a content-gathering service. It will allow you to pump information into Google from all your other services, consume information there, and ultimately spend more time in front of Google advertisements, while providing Google with the information necessary to fine-tune those advertisements to you and your friends.

At the end of the day, it all boils down to gathering more information about you in order to target advertising. They don’t need to build a Twitter killer. Twitter and Facebook already do a great job creating the information that Google wants.

What they have built, however, is a marvelous-looking social experience for those people who already use Gmail and Google’s mobile services, and given them the means to pour information about themselves and others into Google’s servers.

SourceForge: Nobody Is Asking Why Now?

[Image: SourceForge with its hands tied. Caption: “Bound by the law?”]

SourceForge is now blocking access to its sites from Cuba, Iran, North Korea, Sudan, and Syria.

Since 2003, the SourceForge.net Terms and Conditions of Use have prohibited certain persons from receiving services pursuant to U.S. laws, including, without limitations, the Denied Persons List and the Entity List, and other lists issued by the U.S. Department of Commerce, Bureau of Industry and Security. The specific list of sanctions that affect our users concern the transfer and export of certain technology to foreign persons and governments on the sanctions list. This means users residing in countries on the United States Office of Foreign Assets Control (OFAC) sanction list, including Cuba, Iran, North Korea, Sudan, and Syria, may not post content to, or access content available through, SourceForge.net. Last week, SourceForge.net began automatic blocking of certain IP addresses to enforce those conditions of use.

In all the commentary I am seeing, nobody has asked the very simplest and perhaps most obvious of questions: Why now?

These terms have been in place for nearly 7 years now. (The Entity List has existed since 1997.)

Let’s forget for a minute that one hell of a lot of the software hosted by SourceForge is developed with the help of, or even entirely by, people living outside the U.S.

Let’s also bear in mind that SourceForge has claimed this is because of the “transfer and export of certain technology” to foreign persons and governments on the sanctions list, yet it doesn’t give any details about what this technology is.

Surely everything on SourceForge can’t contain dangerous technology? Why not just restrict the programs which contain those technologies?

Not to mention the fact that any idiot, never mind some evil-axis human-overlord wannabe, could use a proxy or Tor to get past the IP filtering!

Or is there something more at play here?

Google and China, perhaps? Did the U.S. government pay SourceForge a visit and “politely” remind them that these laws exist? Maybe because the government wants to show that it is willing to enforce its laws, and to send a subtle hint to China that the hacking of U.S. companies and the theft of their I.P. might get them added to these lists?

I find it very hard to believe that the guys at SourceForge have had a sudden moment of conscience and, out of the blue, decided to comply with laws that have existed for almost 12 years, and with their own terms and conditions, which they have ignored for the past 7 years.

Quality Versus Quantity in Twitter Shares

[Image: Snowed under]

I’ve been looking at my Twitter feed lately and have noticed that it is becoming rather crowded. There is a heck of a lot of “noise” on my stream.

With all the links that I share on a daily basis, coupled with content from my Google Reader shares and our humor site Daily Shite, I can see why some of my followers would be starting to feel a little overwhelmed by the sheer volume of links that go through my stream.

That’s not even mentioning my ordinary day-to-day tweets with all you fine people.

With that in mind, I’ve decided to experiment for a bit: I’ll stop automatically feeding my GReader shares into my Twitter stream and be a bit more picky about what I share. (Btw, if you want to get everything I share on Google Reader, you can always subscribe to my shares directly.)

What it means for my dear Twitter followers is that instead of being flooded with 100+ shares a day, you should now see between 6 and 20 more carefully considered shares a day. Less quantity but more quality, if you will.

I’ll still be putting the Daily Shite feed through my Twitter stream (it’s only 6 posts a day, after all), but overall share activity on my feed will drop dramatically in favor of quality.

The signal-to-noise ratio is going to improve considerably.

You may finally be able to see my own tweets in amongst all the other content 😉

This is where I turn to you guys. Do you prefer I send all my shares out, or do you think things will be better with more considered sharing?

Crawled To Death – Why You NEED To Use Google Webmaster Tools

[Image: PaulOFlaherty.com crawl stats]

I thought it was bad enough when Tim Burton and a poorly configured WP Super Cache teamed up to kill our sites last month, but I never thought that Google would be responsible for killing 2 of my sites in 1 day.

Yesterday, two of our sites simultaneously started throwing 500 errors every time we tried to access them. Last month we had a similar problem, but it happened across 3 of our sites, due to a poor cache configuration while Daily Shite was having a massive traffic spike. That issue was resolved and steps were taken to ensure it didn’t happen again. I knew we weren’t having any unusually high traffic, so my first thought (after checking that the cache was working properly) was that the MySQL server was on the fritz.

How wrong I was.

After some work, and some back and forth with the ever-helpful DreamHost, it was determined that the culprit was an incredibly industrious Googlebot (66.249.67.198) which had taken it upon itself to go ape and crawl our sites with the zeal of an axe-murdering maniac at a naive virgin convention.

If this is happening to your site there are ways and means to stop Google from hammering your site into oblivion.

The first and easiest is to disallow all bots from crawling your site by adding the following to your robots.txt file, which goes in the root of your site:

User-agent: *
Disallow: /

That will work, but the problem is that it may also prevent your pages from turning up in the search engines over time, as your pages will not be indexed.

What you really want to do is slow down the crawling to reduce the load, by placing something similar to this in the robots.txt (lots more information on robots.txt here):

User-agent: *
Crawl-Delay: 3600
Disallow:

The “Crawl-Delay” directive will slow down most good bots, such as the crawler used by Bing, but Google ignores it and will continue to hammer your site regardless. That is why it is imperative that you sign your site up for Google Webmaster Tools today (it’s 100% free).
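
In the meantime, one stopgap (my suggestion, not something Google recommends) is to temporarily shut out Googlebot by name with its own robots.txt stanza:

User-agent: Googlebot
Disallow: /

Remove that stanza as soon as you’ve dialed the crawl rate down in Webmaster Tools, or your pages will start dropping out of Google’s index.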

Once you’ve signed up with Google Webmaster Tools, you verify your site either by adding a meta tag to the header of your site’s pages or by uploading a specific HTML file to your server, which Google then checks to confirm your ownership of the site.
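
For illustration, the meta tag method looks something like this, placed inside the <head> of your home page (the content value below is a placeholder; Google generates the real token for you):

<meta name="google-site-verification" content="your-token-goes-here" />

The HTML file method works the same way: Google gives you a specifically named file to drop into the root of your site.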

Once your site is verified, you can then go in and lower the crawl rate to reduce the load, get your site back up, and give yourself some working room.

[Image: PaulOFlaherty.com Webmaster Tools settings screen]

The reason you need to sign up now, besides being able to register a sitemap to improve crawling and SEO, is that if your site is being hammered and your server resources are stretched to the limit, then Google won’t be able to verify the site: you’ll probably be serving cached files without the meta tag in place, or your site will crash trying to generate a new page with the tag in it. Either way, not good!

No matter what kind of load your site is under, the HTML file verification should work. But it didn’t for us yesterday when we attempted to verify Daily Shite (PaulOFlaherty.com was already verified), as Google kept throwing an error.

[Image: Google Webmaster Tools error]

If we’d had all of our sites registered with Google Webmaster Tools, we would have been able to reduce the crawl rate across the board straight away, stopped that Googlebot in its rampant tracks, and had considerably less downtime.

O’Flaherty Episode #13 – Hail to the King!


Alec and I talk about Google, FeedBurner, privacy issues and Bruce Campbell?

Links

Credit

Download Podcast MP3: O’Flaherty #13 (10.75 MB, 0:24:32)