Web 5.5

A long and interesting critique at Abstract Dynamics of the changing nature of privilege, control and access to the web that “web 2.0” seems to be creating.

What really separates the “Web 2.0” from the “web” is the professionalism, the striation between the insiders and the users. When the web first started, any motivated individual with an internet connection could join in the building. HTML took an hour or two to learn, and anyone could build. In the Web 2.0 they don’t talk about anyone building sites, they talk about anyone publishing content. What’s left unsaid is that when doing so they’ll probably be using someone else’s software. Blogger, TypePad, or if they are a bit more technical, maybe WordPress or Movable Type. It might be getting easier to publish, but it’s getting harder and harder to build the publishing tools. What’s emerging is a power relationship, the insiders who build the technology and the outsiders who just use it.

He’s also tired of the Web2.0 monicker:

Are the internet hypelords getting a bit tired? There’s this funny whiff of déjà vu that comes along with the latest and greatest buzzword: Web 2.0. Web 2.0? Wasn’t that like 1995? Don’t they remember that Business 2.0 magazine? Or remember how all the big companies have stopped using version numbers for software and instead hired professional marketers to make even blander and more confusing names? I hear “Web 2.0” and immediately smell yet another hit off the dotcom crackpipe…

Personally, I’m now just going to be referring to Web5.5.

It has a whiff of the crufty, featuritis-ridden midlife of mainstream applications (Quark, WordPerfect, etc.), which renders it pleasingly mundane and irrevocably intertwined with the workaday world.

Web 5.5 comes with a couple of giant manuals in binders and a little plastic overlay to put above your function keys.

It’s been 10 years between Web1.0 and Web2.0 – so expect Web5.5 sometime around 2035.

Along with space elevators.

—-
Update: a response to the AD essay by Michal Migurski

Things I dearly wish I had today

A partial list:

  • An understandable guide to the intricacies of aliasadm
  • A ‘delete all from spam folder’ button in gmail
  • The phone number of the gmail product manager to:
    (a) ask for a ‘delete all from spam folder’ button in gmail
    (b) ask them to delete all 380849 spam emails from my spam folder for me so I don’t have to do it 100 at a time, sidestepping the resulting 7616 clicks necessary to do so (see the quick sum below)
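(For the curious, the 7616 is just back-of-the-envelope arithmetic. A minimal sketch, assuming two clicks per 100-message page, one to select everything and one to delete; the assumptions are mine, nothing here is gmail’s actual interface.)

    # Back-of-the-envelope sketch of the click count above, assuming two
    # clicks per 100-message page (select all, then delete). Not gmail code.
    SPAM_TOTAL = 380849
    PAGE_SIZE = 100
    CLICKS_PER_PAGE = 2

    full_pages = SPAM_TOTAL // PAGE_SIZE     # 3808 full pages of spam
    clicks = full_pages * CLICKS_PER_PAGE    # 7616 clicks, plus a few more for the last partial page
    print(full_pages, clicks)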

I’ve mailed gmail support already, before you ask.

Argh.

“In our wiki”

I had a random Friday afternoon thoughtfart while listening to Paul Morley/Strictly Kev’s 1hr remix of ‘Raiding the 20th Century’.

Listening to Morley’s* cultural history of the cut-up on top of Kev’s sonic critique made me think how cool it would be to hear Melvyn Bragg and the "In Our Time" gang’s Thursday morning ruminations on, for instance, Machiavelli – cut-and-pasted over mashed-up madrigals.

Putting this fancy to one side for one minute… it made me think of other superlayered participatory critique and knowledge construction – the Wikipedia.

If there were a transcript of "In Our Time" (is there?) why couldn’t that be munged with wikipedia like Stefan did with BBC news… and what if new nodes were then being formed by Melvyn, his guests and his audience – together, for everyone, every week, and cross-referenced to a unique cultural contextual product – the audio broadcast.
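(To make that fancy slightly more concrete, here’s a very rough sketch of what ‘munging’ a transcript against wikipedia might look like. The filenames and the local list of article titles are hypothetical, and this is emphatically not Stefan’s actual code.)

    # Rough sketch: scan an episode transcript for phrases that match
    # Wikipedia article titles and print links. The input files are
    # hypothetical placeholders; this is not Stefan's BBC News hack.
    import re

    transcript = open("in_our_time_machiavelli.txt").read()
    titles = {line.strip() for line in open("wikipedia_titles.txt") if line.strip()}

    def wikify(text, titles, max_words=3):
        """Find runs of up to max_words words that match a Wikipedia article title."""
        words = re.findall(r"[A-Za-z][A-Za-z'\-]*", text)
        found = set()
        for i in range(len(words)):
            for n in range(max_words, 0, -1):
                candidate = " ".join(words[i:i + n])
                if candidate in titles:
                    found.add(candidate)
                    break
        return found

    for topic in sorted(wikify(transcript, titles)):
        print(topic, "->", "http://en.wikipedia.org/wiki/" + topic.replace(" ", "_"))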

The mp3 of "In Our Time" sliding into the public domain and onto the Internet Archive’s servers, every Thursday rippling through the noösphere, reinvigorating the debate in the wikipedia, renewing collective knowledge.

"In Our Time" is great ‘campfire’ stuff – you have The Melv as the semi-naive interlocutor and trusted guide, the experts as authority to be understood and questioned… but it’s only 30 minutes and 4 people… what about scaling it way out into the wikinow?

How good would that be??!!!!

Of course a first step, a sheltered cove, would be to set up "In Our Time" with their own wiki for Neal Stephenson Baroque Cycle / Pepys diary-style annotations of the transcript and mp3…

The Melv’s own multimedia mash’d up many-to-many mp3 meme machine.

—-
Update: over the weekend, Matt Biddulph showed another example of how powerful mixing BBC web content with web-wide systems might be: with del.icio.us tags extending BBC Radio 3’s content. Fantastic stuff.
—-

p.s. from a Bio of Morley found at pulp.net:
"Morley
earns a farthing every time Charlie’s Angels, Full Throttle is shown or
trailed, owing to his contribution as a member of the Art of Noise to
Firestarter by the Prodigy, which features a sample of the Art of
Noise’s Beat Box, used in the film. The pennies are mounting up."

Heaven is other people

to paraphrase Sartre… or their digital detritus; their links and pictures are heaven anyway.

Caterina kindly quoted something I said two years ago now (although it feels longer, now I am far from London, and not making web apps) about “social software”: that it’s software that is better when there are other people ‘there’, inhabiting it.

This is certainly the case with del.icio.us: although it’s useful as a personal linkdump and a lightweight way to spool things to the web, I am really missing the other people there in my inbox.

I’d hoped it would reappear over Christmas, but Santa didn’t get my wishlist. Ah well. Hopefully Joshua had a good break and can find the time soon to fix it.

Heaven is other people, and great social software temporarily without them is purgatory.

BBC News Online meets Wikipedia

A very nice hack by Stefan Magdalinski, which cross-references BBC News stories against the Wikipedia, and also against blog entries on the subject matter retrieved from Technorati.

For instance, viewing this story on the happy news that Michael Howard is not thought to be prime-ministerial material returns wikipedia entries on Michael Howard, Tony Blair, the Conservative Party*, the Liberal Democrats*, and the BBC!

A great proof of concept. Although BBC News have recently introduced their pretty good ‘Google News’-like Newstracker to some of their pages, I believe that introducing clearly delineated links to blogs and the wikipedia would be far more engaging.

(* the links go to ‘disambiguation pages’ due to there being multiple political parties by these names around the globe)

—-
UPDATE: Stefan exposes the thinking/ideology behind the hack, and its source code, here

Consilience

Alex Wright on the wikipedia / autonomy!=authority thing:

“What irks me about some of the dialogue to date is an assumption (usually implied) that networked systems are somehow inherently more “fair” than top-down systems. Democracy, like unregulated free markets, are no guarantee of fairness. And while networked systems surely give users more opportunity for input, they also abide by power laws which, though perhaps ineluctable, are neither equal nor fair (especially insofar as they favor early adopters). Top-down systems, while seemingly authoritarian, may paradoxically do a better job of defending the interests of the individual. Just as mob rule is no way to run a country, so purely democratic classifications could lead to groupthink, favoring conformity and marginalizing dissent.

But again, I don’t believe that top-down and bottom-up systems necessarily have to stand in opposition; the two models may ultimately prove consilient.”

Amen.

“Fast [to iterate] at the bottom, slow [to consolidate] at the top” to paraphrase Alex quoting Kevin Kelly.

This, however, does seem to be the überpattern of wikipedia afforded by its structure, as demonstrated by Historyflow, with some catastrophe and punctuated equilibrium thrown in.

“(Medium-)Fast at the bottom, slow at the top” was the principle behind iCan’s information architecture, enabling campaigners to say exactly what it was they were campaigning for, while giving casual browsers a way in that had some stability and common currency of meaning at the top levels.
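(A hypothetical sketch of that split, just to show its shape: a small, slow-changing set of top-level categories, with fast-moving campaigner-supplied terms underneath. An illustration only, not iCan’s actual data model.)

    # Sketch of "fast at the bottom, slow at the top": a small, slow-changing
    # set of top-level categories for casual browsers, plus free-form,
    # fast-moving terms supplied by campaigners. Hypothetical, not iCan's code.
    from collections import defaultdict

    TOP_LEVEL = {"health", "transport", "education", "environment"}  # slow to change

    campaigns_by_term = defaultdict(set)  # user-supplied vocabulary, changes constantly

    def file_campaign(name, category, terms):
        if category not in TOP_LEVEL:
            raise ValueError("unknown top-level category: " + category)
        for term in terms:
            campaigns_by_term[term.lower()].add(name)

    file_campaign("Save the no. 73 bus", "transport", ["buses", "route 73", "TfL"])
    file_campaign("Safer cycle lanes", "transport", ["cycling", "TfL"])

    # Shared bottom-up terms emerge where campaigns overlap:
    print(sorted(campaigns_by_term["tfl"]))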

Neologism alert – after all this talk of ‘folksonomies’ can I say information arcology yet?

Heh.

The personalisation panacea

Hopefully put to rest by this report.

“Instead of implementing personalization strategies, the report suggests, companies should concentrate on the basics, such as making their sites easy to search and navigate.

“Given flexible, usable navigation and search, Web site visitors will be more satisfied with their experiences and will find fewer barriers to the profitable behavior sought by site operators,” according to the report published Tuesday. “In fact, good navigation can replace personalization in most cases.”

The report criticized personalization as not only ineffective, but surprisingly costly.

Personalizing a site was more than twice as likely to result in finding visitors who would never pay for anything, as it was to attract paying customers, Jupiter’s study found.

Operating a personalized Web site cost more than four times as much as operating a “comparable dynamic site,” Jupiter found. The report said costs came primarily from the human effort needed to measure results, manage rules and optimize the system, on top of the licensing costs for personalization software from vendors such as Broadvision, Epiphany, Teradata, IBM and ATG.

Stymieing personalization campaigns is consumers’ deep-seated suspicion of Web sites that try to extract information from them, the report found.”

An ex-colleague of mine, Matt Karas, has steadily held this belief (that personalisation projects never achieve a return on the investment) for the seven or eight years I’ve known him. Although I reckon that smart reactive personalisation can make a service a lot more pleasurable and easy to use [especially for attention-impoverished interfaces], I think pretty much the same now.

Hopefully, senior managers and the like who can only be convinced of something when they read it in a Jupiter report will now think again about the ROI of big content management and personalisation projects that don’t address tough basic service / business design questions but opt for a technological solution.

» C|net: Report slams Web personalization [via Christina]

Hold the frontpage!

A story in this morning’s MediaGuardian on whether changes in web-user behaviours mean that designs (and redesigns) of “portal”-type homepages for ISPs or other online services still serve a purpose.

There’s little analysis in this story of what might be changing user behaviour: links being exchanged over IM networks or email, the omnipotence of Google, personal weblogs as gateways to the web, or even the burgeoning use of RSS.

More evidence for the argument to “turn your website inside out” – © Simon Waldman.

» MediaGuardian: The front-page dilemma