Posted on 09/23/2010 8:43:39 AM PDT by ShadowAce
Google is doing it. Facebook is doing it. Yahoo is doing it. Microsoft is doing it. And soon Twitter will be doing it.
We're talking about the apparent need of every web service out there to add an intermediate step that samples what we click on before sending us on to our real destination. This has been going on for a long time and is slowly building into something of a redirect hell on the Web.
And it has a price.
There's already plenty of redirect overhead in places where you don't really think about it. For example, Google passes search-result clicks through its own servers, and Facebook does the same with outgoing links.

And so on, and so on, and so on.
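At the HTTP level, each of these intermediate hops is just a response with a 3xx status code and a Location header, and the browser has to complete a full request/response round trip before it even learns the next address. A minimal sketch of what a client extracts from such a hop (the URL is made up):

```python
def parse_redirect(raw_response: str):
    """Extract (status_code, location) from a raw HTTP response.

    Returns (status, None) if the response is not a redirect.
    """
    head, _, _ = raw_response.partition("\r\n\r\n")
    lines = head.split("\r\n")
    status = int(lines[0].split()[1])  # e.g. "HTTP/1.1 301 Moved Permanently"
    location = None
    if 300 <= status < 400:
        for line in lines[1:]:
            name, _, value = line.partition(":")
            if name.strip().lower() == "location":
                location = value.strip()
    return status, location

# A typical tracking hop: the server answers with a redirect, and the
# browser must make another full round trip to follow it.
response = (
    "HTTP/1.1 301 Moved Permanently\r\n"
    "Location: http://example.com/real-article\r\n"
    "Content-Length: 0\r\n"
    "\r\n"
)
print(parse_redirect(response))  # → (301, 'http://example.com/real-article')
```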
This is, of course, because Google, Facebook and other online companies like to keep track of clicks and how their users behave. Knowledge is a true resource for these companies. It can help them improve their service, it can help them monetize the service more efficiently, and in many cases the actual data itself is worth money. Ultimately this click tracking can also be good for end users, especially if it allows a service to improve its quality.
But
If it were just one extra intermediary step, that might have been acceptable. But if you look around, you'll start to discover more and more layering of these redirects, with different services taking a bite of the click data on the way to the real target. You know, the one the user actually wants to get to.
It can quickly get out of hand. We've seen scenarios where an outgoing link on, for example, Facebook will first redirect you via a Facebook server, then via a URL shortener (for example, Bit.ly), which in turn redirects to a longer URL that itself results in several additional redirects before you FINALLY reach the target. It's not uncommon to see three or more layers of redirects via different sites that, from the perspective of the user, are pure overhead.
The problem is that this overhead isn't free. It adds time to reaching your target, and it adds more links (literally!) to the chain, each of which can break or slow down. It can even make sites appear to be down when they aren't, because something along the way broke.
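To make that concrete, here is a small simulation, with no real network involved and an invented redirect table, of what a client does when following such a chain: every hop is an extra round trip, and a chain that folds back on itself never gets anywhere.

```python
def follow_chain(start, redirects, max_hops=10):
    """Follow a chain of redirects, returning (final_url, hops_taken).

    `redirects` maps a URL to the URL it redirects to; a URL absent
    from the map is the final destination. Raises on a loop or on a
    chain longer than max_hops.
    """
    url, hops, seen = start, 0, {start}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise RuntimeError("too many redirects or a loop at " + url)
        seen.add(url)
    return url, hops

# Hypothetical chain like the one described above:
# social site -> URL shortener -> publisher's redirector -> article.
chain = {
    "http://social.example/l/abc": "http://short.example/xyz",
    "http://short.example/xyz": "http://news.example/r/123",
    "http://news.example/r/123": "http://news.example/full-story",
}
print(follow_chain("http://social.example/l/abc", chain))
# → ('http://news.example/full-story', 3)
```

Three hops here means three extra request/response round trips before the user sees anything, and a failure at any one of them strands the user short of the destination.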
And it looks like this practice is only getting more and more prevalent on the Web.
Do you remember the wave of URL shorteners that arrived when Twitter started to get popular? That's where our story begins.
Twitter first used the already established TinyURL.com as its default URL shortener. It was an ideal match for Twitter and its 140-character message limit.
Then came Bit.ly and a host of other URL shorteners that also wanted to ride on the coattails of Twitter's growing success. Bit.ly soon succeeded in replacing TinyURL as the default URL shortener for Twitter. As a result, Bit.ly got its hands on a wealth of data: a big share of all outgoing links on Twitter and how popular those links were, since it could track every single click.
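The reason a shortener sees this data is structural: every click has to come back to its server to resolve the short code, so counting clicks is a one-line side effect of serving the redirect. A toy in-memory sketch (the class and its storage are invented for illustration):

```python
import string

class Shortener:
    """Toy in-memory URL shortener that counts clicks as it redirects."""

    ALPHABET = string.ascii_lowercase + string.digits

    def __init__(self):
        self.urls = {}    # code -> long URL
        self.clicks = {}  # code -> number of times resolved
        self._next = 0

    def shorten(self, long_url):
        # Encode a counter in base 36 to get a compact code like "b9".
        n, code = self._next, ""
        self._next += 1
        while True:
            n, r = divmod(n, len(self.ALPHABET))
            code = self.ALPHABET[r] + code
            if n == 0:
                break
        self.urls[code] = long_url
        self.clicks[code] = 0
        return code

    def resolve(self, code):
        # The redirect itself is where the tracking happens: the
        # service learns about the click before the user moves on.
        self.clicks[code] += 1
        return self.urls[code]

s = Shortener()
code = s.shorten("http://example.com/some/very/long/path")
s.resolve(code)
s.resolve(code)
print(code, s.clicks[code])  # → a 2
```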
It was only a matter of time before Twitter wanted that data for itself. And why wouldn't it? In doing so, it gains full control over the infrastructure its service runs on, more information about what Twitter's users like to click on, and so on. So, not long ago, Twitter created its own URL shortener, t.co. In Twitter's case this makes perfect sense.
That is all well and good, but now comes the really interesting part, the part most relevant to this article: by the end of the year, Twitter will start to funnel ALL links through its URL shortener, even links already shortened by other services such as Bit.ly or Google's Goo.gl. By funneling all clicks through its own servers first, Twitter will gain intimate knowledge of how its service is used and of its users. It gets full control over the quality of its service. This is a good thing for Twitter.
But what happens when everyone wants a piece of the pie? Redirect after redirect after redirect before we arrive at our destination? Yes, that's exactly what happens, and you'll have to live with the overhead.
Here's an example of what link sharing could look like once Twitter starts to funnel all clicks through its own service:
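The arithmetic behind the overhead is easy to sketch. Assuming, purely for illustration, that each redirect costs one HTTP round trip of about 100 ms and each new hostname in the chain costs a DNS lookup of about 50 ms:

```python
# Illustrative latency estimate for a multi-hop redirect chain.
# The hop list and the timings are assumptions, not measurements.
RTT_MS = 100  # one HTTP request/response round trip
DNS_MS = 50   # one DNS lookup for a hostname not yet resolved

hops = ["t.co", "bit.ly", "feedproxy.example", "news.example"]

dns_cost = DNS_MS * len(set(hops))    # each distinct host needs a lookup
http_cost = RTT_MS * (len(hops) - 1)  # every hop before the last is a redirect
print(dns_cost + http_cost)  # → 500 (ms of pure overhead before the page loads)
```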
It makes your head spin, doesn't it?
About a year ago we wrote an article about the potential drawbacks of URL shorteners, and it applies perfectly to this more general scenario with multiple redirects between sites. The performance, security and privacy implications of those redirects are the same.
We strongly suspect that the path we currently see Twitter going down is a sign of things to come from many other web services out there who may not already be doing this. (I.e., sampling and logging the clicks before sending them on, not necessarily using URL shorteners.)
And even when the main services don't do this, more in-between, third-party services like the various URL shorteners show up all the time. Just the other day, anti-virus maker McAfee announced the beta of McAf.ee, a "safe" URL shortener. It may be great, who knows, but in light of what we've told you in this article it's difficult not to think: yet another layer of redirects.
Is this really where the Web is headed? Do we want it to?
So what’s the difference between that and this when you follow an outside link from Free Republic:
“You have selected a link to an external website. In a few moments you will be redirected to http://royal.pingdom.com/2010/09/22/is-the-web-heading-toward-redirect-hell/”
I have always figured there was some technical reason why that step is needed.
I'm not sure why there is that small wait with FR. I don't think there is a purely technical reason for it.
Perhaps John Robinson can enlighten us?
Add this to “targeted ad” hell.
A few months ago I checked the price of a certain vacuum cleaner on Amazon. I am still finding ads for that vac popping up on unrelated and diverse sites I visit. Creepy and annoying.
Same thing happened when someone using my computer clicked on a Crocs ad. I still get pages that load with Crocs ads.
I don't see any ads.
So use Scroogle (Google's results without their busybodying) or Ixquick, or just hover to get the link and paste it in the URL bar of a new tab. FORCE them to mind their own business. Starve the beast.
For the technical aspects, it helps the source gather statistics on which sites are being visited.
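That statistics-gathering step can be tiny. A sketch of an outbound-click tracker written as a toy WSGI app, with an invented parameter name and log: it records the destination and then answers with a 302 so the browser carries on.

```python
from urllib.parse import parse_qs

click_log = []  # in a real service this would be a database or log file

def tracker_app(environ, start_response):
    """WSGI app: log the outbound destination, then redirect to it."""
    query = parse_qs(environ.get("QUERY_STRING", ""))
    target = query.get("url", ["/"])[0]
    click_log.append(target)  # the tracking side effect
    start_response("302 Found", [("Location", target)])
    return [b""]

# Simulate a click on /out?url=http://example.com/article
status_line = {}
def fake_start_response(status, headers):
    status_line["status"] = status
    status_line["headers"] = dict(headers)

tracker_app({"QUERY_STRING": "url=http://example.com/article"}, fake_start_response)
print(status_line["status"], click_log)
# → 302 Found ['http://example.com/article']
```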
I’m not sure about redirect hell, but it does seem like FreeRepublic is turning into blog hell. I mean this not as a criticism of you specifically, but it seems the forum is now dominated by personal bloggers posting “articles” from their own blog. There’s nothing inherently wrong with this, I suppose, but they used to be vanities, and could be evaluated as such.
Yup. I've been using Ixquick.
BTW--I have never heard of "Pingdom" before seeing this.
Deny them your esssssssence!
nosy-parkers and how to get rid of them bump
Yes, bloggers have become the bane of FR. Especially those who think their every thought or rehashed version of an old news story belongs under “news” and not “bloggers” and worse, they think their precious thoughts need to be excerpted down to a half sentence.
That’s where Pimpbusters comes in.....
What I’m curious about is, what does the author want? Is this one of those “there ought to be a law” things? Because you need to be careful what you wish for.
I hope it wasn't a "there oughta be a law" thing. I'm for fewer laws as it is.
“what does the author want?”
To avoid five different aggregating companies slowing down each click we take when using the internet I would think.
Yes, I need to do that. I am also so tired of every site I log into having 10,000 things going on in the margins that drive me crazy: videos, flashing symbols, blah blah blah.
I want my brain stimulated by ideas, not mindless bells and whistles.
I knew this was going to happen when the web turned away from Mosaic.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.