Until very recently, in the field I work in – social media – there was an orthodoxy. It went like this: If somebody follows you on Twitter, follow them back. Leading the charge were social media luminaries like Chris Brogan and Darren Rowse (better known as ProBlogger). Even veteran tech bloggers like Robert Scoble were you-follow-me-I-follow-you-back kinda guys.
Now, many of these social media gurus don’t themselves know where this orthodoxy comes from. Many to this day confuse it with some form of politeness. Odd, if you think about it. It’s true that Twitter is not a strong-tie social network where Dunbar’s number – which claims that most people can maintain only about 150 strong relationships – would typically apply.
Even in Twitter’s more promiscuous interest graph (where you follow people because of what they have to say, rather than because you know them), no person can seriously pay full attention to more than a couple of hundred people – let alone 70,000 or more. People who study and immerse themselves in media should surely know that human attention is finite?
Recently Darren Rowse, who blogs about blogging, tried to articulate – just before he unfollowed everybody to start afresh – why he was following nearly 80,000 people.
“1. to reciprocate – I’ve always felt strange about having people follow me and not following them back. Perhaps this is my ‘people pleaser’ trait in action on the web.
2. to open up DM conversations – I used to use Direct Messaging as much (if not more than) as ‘Replies’ on Twitter. Following people in large numbers opened up the opportunity for this.”
Not very convincing reasons, I have to say, but not uncommon either. To quote the Slovenian intellectual Slavoj Žižek, I suspect Rowse and many others had, when deciding to follow so many people, fallen prey to not knowing what they know. Which is to Žižek exactly what ideology is: the inability to realise that we hold a belief, and that what we believe in shapes our actions.
Is there an ideology at work even in deciding whom to follow on Twitter? Very much so.
So what exactly is this ideology? The World Wide Web was barely two years old when Richard Barbrook and Andy Cameron recognised it and gave it a name: the Californian Ideology, coined in the summer of 1995. An ideology which Barbrook and Cameron claimed was a heady and odd mix of 60s-inspired libertarianism and counter-culture on the one hand, and on the other a dogmatic belief it hitched from the Reaganite 80s – the pervasive conviction of the last two decades that government should not interfere in markets or business.
“This new faith has emerged from a bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley… This amalgamation of opposites has been achieved through a profound faith in the emancipatory potential of the new information technologies. In the digital utopia, everybody will be both hip and rich.”
The part of the ideology that made people follow thousands and which we will interrogate here was rather commendable. It was the belief that through the Internet we would make for a more egalitarian world.
So this little essay is about lots of things. But primarily it’s an essay on how some of the great hopes placed on the shoulders of our new media are becoming unstuck. And in particular it’s about the credo that digital media will remake our societies to be hierarchy free. That dream may not be as default a setting as we thought it was.
So where does this belief system come from? Barbrook and Cameron point out that the roots of this ideology can also be found in the writing of the sage of modern media, Marshall McLuhan.
McLuhan claimed that new technologies would empower individuals at the expense of established hierarchies like big business and big government. We were moving into the age of the cosy global village:
‘By electricity, we everywhere resume person-to-person relations as if on the smallest village scale. It is a relation in depth, and without delegation of functions or powers… Dialogue supersedes the lecture.’
Already in McLuhan’s writing we can see some of today’s recurring themes in digital and social media take shape, ideas such as:
- Person-to-person vs One-to-many (broadcast)
- Small scale (think village or community) vs Big
- Dialogue (conversation, anyone?) vs Lecture
Many of the generation of theorists after McLuhan were, if not singing from the same sheet, at least remixing his thoughts explicitly and combining them with elements of others’ theories.
The hippie ethos of freedom, independence, respect and connectedness – resting on a profound dislike of straight, regimented society and its large, cold, ossified hierarchies – these were also the values of the Internet’s first theorists and ideologues. It’s no coincidence. These pioneering thinkers of the Internet were a tight-knit community themselves who knew each other well, often not only from their shared interest and work in computing and media, but also from the first counter-cultural hippie happenings.
One of these institutions was The WELL, a pioneering online community and arguably one of the most influential in shaping ideas about what the Internet should be. The founder of The WELL was one Stewart Brand, a fascinating man who managed to combine a love of mind-altering substances with sharp business instincts. Brand was fascinated by do-it-yourself culture, and published counter-cultural magazines which were read widely by people who mattered. The best known, The Whole Earth Catalogue, tried to provide people with all the information and tools they needed to live life outside of the mainstream. Steve Jobs would later describe it as:
“…one of the bibles of my generation…. It was sort of like Google in paperback form, 35 years before Google came along. It was idealistic and overflowing with neat tools and great notions.”
Kevin Kelly, a contributor to Brand’s Catalogue, member of The WELL and friend of Brand, said in 2008 that the Catalogue had been an Internet prototype – media made by people for themselves – and a precursor of social media:
“The Whole Earth Catalogue was a great example of user-generated content, without advertising, before the Internet…. No topic was too esoteric, no degree of enthusiasm too ardent, no amateur expertise too uncertified to be included… This I am sure about: it is no coincidence that the Whole Earth Catalogues disappeared as soon as the web and blogs arrived. Everything the Whole Earth Catalogues did, the web does better.”
But already there were indications of an inherent tension in Brand’s ideas. On the one hand, Brand was writing that “a realm of intimate, personal power” was developing. That was obvious. But through his environmental activism it became increasingly clear that he thought environmental solutions would have to be communal. Neither Brand nor his fellow travellers ever explicitly addressed the tension between the personal freedom he espoused and the we-are-all-in-this-together solutions he sought.
Be that as it may, cherished establishment concepts, like copyright, were no longer safe. Brand’s words became a rallying cry of the open-source software movement and beyond: “information wants to be free…”.
Other denizens of Brand’s The WELL included John Perry Barlow, lyricist of the psychedelic band The Grateful Dead, but also libertarian, anarchist and founder of the Electronic Frontier Foundation (EFF).
In 1996 Barlow wrote a dramatic document titled A Declaration of the Independence of Cyberspace, modelled on the assertive style of the United States’ own Declaration of Independence. It remains today a very good snapshot of the world view of the Californian ideologues: technologically determinist, anti-government and Utopian.
Where McLuhan had talked about “dialogue”, Barlow reached for a word that would later become social media jargon like few others – “conversation” – when he warned governments to butt out of Cyberspace:
“You have not engaged in our great and gathering conversation, nor did you create the wealth of our marketplaces.”
The use of “conversation” was no gimmick; for Barlow, like McLuhan, this term was used in opposition to one-way, top-down broadcast media (or ‘lectures’ as McLuhan called them).
The Internet was not only different, said Barlow, it was a better, more egalitarian world. A world where everybody would be heard, regardless of their status in meat space:
“We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth.”
Brand’s Catalogue contributor Kevin Kelly, who was also the first editor of Wired magazine and a futurist who has had a more profound influence than most, deserves a closer look.
His book Out of Control: The new biology of machines has not only shaped many techies’ ideas on software development, but also the thoughts of those in business and even brand management.
An end to hierarchy was clear to Kelly. He railed against governments’ and corporations’ large mainframe computers, and believed the coming personal computing age would herald unprecedented freedom from these old authorities.
Echoing the thoughts of others like McLuhan, Kelly believed every medium had intrinsic features with wider social consequences.
“A blackboard encourages repeated modification, erasure, casual thinking, spontaneity. A quill pen on writing paper demands care, attention to grammar, tidiness, controlled thinking. A printed page solicits rewritten drafts, proofing, introspection, editing.”
The new intrinsic feature of hypertext, where one text links to another, he thought, was cooperation. But this was not the harbinger of an intimate global village. The vision was far more radical than that.
You see, the other key metaphor for Kelly was the network. We came from a world where we were atomised individuals, but through the network, ‘the icon of the 21st century’, we were changing and literally coming together to become one being. Just like a swarm of bees acting together as if they are one organism in a collective intelligence or ‘hive mind’.
Under Kelly’s auspices Wired magazine published many articles pushing ideas such as how the Internet advances James Lovelock’s Gaia theory of the earth as an organism – and that the Internet is in fact the Earth clothing itself with a brain.
If we are constituent parts of one being, one organism, where does that leave “me”? Kelly intimated that in the post-industrial age even egos would be put on the back burner:
‘The industrial icon of a grand central or a hidden “I am” becomes hollow. Distributed, headless, emergent wholeness becomes the social ideal.’
We were a networked organism, and egos along with hierarchy and authority were things of the past.
We can now add a few more familiar concepts to the canon of digital dogma:
- Out of control vs Control
- Networks (leaderless) vs Hierarchy
- Everybody vs Central Authority
- The Whole vs Me, me, me
Openness, equality, freedom in its architecture
This wasn’t just idle talk. These Utopian ideas had already found their way into the very fabric of the Internet: into the technical architecture of how the Internet transferred information (the TCP/IP protocols) and even into its user-interface design (UX).
As many readers of this essay will already know, the absence of a central control system was the very raison d’être of the Internet. It was built in such a way that it would not have to rely on a command-and-control hierarchy. Unlike the old telecommunications network, it was supposed to withstand parts of it being taken out or going down in the event of a nuclear war.
Other tools and services built on top of the Internet protocols would follow the open, interactive, peer-to-peer, non-hierarchical design.
Email, the first Internet communications tool to become ubiquitous, was always designed to allow two-way communication. Once you sent somebody a message, they had the means to send a message back.
For those not lucky enough to be part of The WELL, the first sense of Internet community most early adopters experienced was on the service called Usenet. Usenet was a distributed discussion system, a hybrid between email and a web forum. Once you subscribed and posted, everybody else in that group would see your message. Many communities were established and nourished on this architecture.
Internet Relay Chat (IRC), a group chat tool still popular with hacker groups like Anonymous, to this day does not require users to register, thus making it open to all and hard to identify users.
Early instant messaging systems (the precursor to AOL’s AIM, Microsoft’s MSN Messenger and Skype), almost always required reciprocal agreement between users to become each other’s ‘contacts’ or ‘buddies’, allowing ‘contacts’ to be able to exchange instant two-way messages. This symmetrical mechanism was the forefather of the now familiar ‘friending’, widely adopted by early social networking sites, from Friends Reunited and MySpace to Orkut and later, of course, the behemoth that is Facebook.
These were profoundly trusting and open technologies, where everybody had access to everybody. These services were designed for our better selves. ‘Real’ identities were not required, and when things did go wrong, excluding anti-social users was difficult. As such, trolls were tolerated as a necessary evil, and good norms of behaviour – ‘Netiquette’ – reigned supreme.
But the open and non-hierarchical tools and services were not inevitable outcomes of the inherent nature of the technologies. No. They were the result of often benign design decisions by techies beholden to the Utopian Californian Ideology.
But nobody noticed when things started to change. One important new communications service did not have this mutual peer-to-peer feature: web-logs. With their relatively understated start as mere online diaries, few realised the power of blogs until the turn of the century.
Blogs were different to what had gone before, in an important way. They were primarily designed for one person to share their thoughts with an audience of unlimited potential. This and complementary technologies, like RSS which allowed people to subscribe to a feed of blog updates, set in motion a chain of events. Broadcasting possibilities were now built into the functionality and UX of web services. All that was needed was talent and graft to build a following. Because of this, the Internet was set to take a different course.
Funny, because in 1999, when blogs started to get noticed, early bloggers like Doc Searls, supported by the likes of Dave Winer (the inventor of RSS), came up with The Cluetrain Manifesto. If blogs were to herald a change in the way we saw the Net, Cluetrain was the apogee of the original way we viewed it. Here was the evidence of a clear link between the earlier digital thinkers and today’s social media marketers. The Manifesto starts with the phrase:
A powerful global conversation has begun. Through the Internet, people are discovering and inventing new ways to share relevant knowledge with blinding speed. As a direct result, markets are getting smarter – and getting smarter faster than most companies.
These markets are conversations. Their members communicate in language that is natural, open, honest, direct, funny and often shocking. Whether explaining or complaining, joking or serious, the human voice is unmistakably genuine. It can’t be faked.
Many of the buzzwords and articles of faith we recognise in social media can be found in Cluetrain: authenticity, openness, conversations, community, trust (and loss of control), markets and, of course, one new one – sharing. Thesis number seven simply proclaims:
Hyperlinks subvert hierarchy.
One of the first and most visible hierarchies to be subverted was the mainstream professional media. But until quite recently it seemed like mainstream media was oblivious to Cluetrain and the massive changes it claimed were occurring online.
Exasperated by the ignorance, Jay Rosen, a journalism professor who got the import of social media, published a bolshie essay on his blog in 2006. The People Formerly Known as the Audience proclaimed the arrival of everybody as media, hailed people connecting directly with each other and bypassing professional gatekeepers, and asserted that the many new voices were not a problem:
“The people formerly known as the audience wish to inform media people of our existence, and of a shift in power that goes with the platform shift you’ve all heard about.
Think of passengers on your ship who got a boat of their own. The writing readers. The viewers who picked up a camera. The formerly atomized listeners who with modest effort can connect with each other and gain the means to speak – to the world, as it were.
Now we understand that met with ringing statements like these many media people want to cry out in the name of reason herself: If all would speak who shall be left to listen? Can you at least tell us that?
The people formerly known as the audience do not believe this problem – too many speakers! – is our problem.
The essay was an instant hit, and was shared widely on other blogs. Dave Winer enthused:
The brains are in what we used to call the audience. No more looking up to the ivory tower for all fulfillment. Thank god we don’t all have to be as beautiful as Farah Fawcett and Christopher Reeve. Everyone gets to sing. Users and developers party together.
Forward to (parts of) the past
More than 10 years after Cluetrain – from the glamorous Rumi Neely of Fashion Toast to the pugnacious Michael Arrington (founder of TechCrunch) – it’s obvious that some bloggers have become megastars.
It’s important at this juncture to point out that the point of breaking the hierarchy was never a nihilistic hatred of authority. The promise of the Internet was precisely that talented people and meritorious businesses would stand out and shine regardless of whether they operated out of garages or lacked multimillion-dollar advertising budgets.
So when Google claims its PageRank algorithm is a democratic system – where hyperlinks from one page to another are taken as votes for that page’s quality – the underlying idea is that the best content and services will float to the top of search rankings. A meritocracy par excellence. The same goes for services like Twitter: it has liberated the meritorious. Twittering writer Mat Johnson says it levels the playing field:
“I’ve never had a single ad for any of my novels, had a movie made or been given a big budget push by a publisher… Usually, they just throw my book out to reviewers and hope it floats. Twitter lets me hijack the promotion plane, sidestep the literary establishment and connect directly to my current and potential audience… It’s a meritocracy; if you’re interesting, you get followed.”
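The “links as votes” idea behind PageRank can be sketched in a few lines. This is a simplified illustration, not Google’s actual implementation; the page names are hypothetical, and only the damping factor of 0.85 comes from the original PageRank paper:

```python
# A minimal sketch of "links as votes": a page's rank is the share of
# votes (weighted links) flowing into it, computed by simple iteration.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # each outlink is a "vote" for its target
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A tiny hypothetical blogosphere: two blogs link to blog_c, which links back to blog_a.
links = {
    "blog_a": ["blog_c"],
    "blog_b": ["blog_c"],
    "blog_c": ["blog_a"],
}
ranks = pagerank(links)
# blog_c, with two inbound "votes", ends up with the highest rank
assert ranks["blog_c"] > ranks["blog_a"] > ranks["blog_b"]
```

Merit, in this model, is nothing more than the pattern of who links to whom – which is exactly why, as we will see, it can be gamed by those who already have links.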
So a talented blogger like Arrington was bound to make a name for himself and beat mainstream publishers at their own game.
So far, so good.
But what many have been slow to realise is that this meritocracy is not only subverting the old hierarchies. The digital utopia is in fact borrowing some traits from the world the pioneering thinkers wanted us to leave behind. Is it really still true that too many speakers are not a problem as Jay Rosen said?
@wildebees A very interesting essay. Thanks. But I didn’t say too many speakers could never be a problem. It’s just not TPFKATA’s problem.
— Jay Rosen(@jayrosen_nyu) January 15, 2012
Note I added this response from Jay Rosen.
Blogs were the first symptom. But it was only with the appearance of Twitter – with its affordance of letting people follow you without you having to follow back – that the fact that social status had gained a bridgehead online was thrown into sharp relief.
The following/follower counts sit prominently in Twitter’s user interface, impossible to ignore, and they are often the first thing people look at. In a flash, celebrities and bloggers outside the tech and social media bubble had amassed audiences of thousands while following almost no-one. These relationships weren’t peer-to-peer. They looked more like audiences than communities.
It was only the true believers of Cluetrain that held out.
Until 2010, when the dam wall cracked, and then mid-2011, when it burst. The ever smart and genuinely media-savvy Robert Scoble was one of the first believers to realise he had been drinking the Kool-Aid and using Twitter wrongly. He had been following over 100,000 people but had a Damascus-road conversion. Scoble realised his experience of Twitter was awful: more spam, and many more messages in which he was simply not interested.
So he developed a new exclusive following philosophy: Only follow better people to get more (and better quality) followers.
It’s a big job to unfollow that many people on Twitter, and Scoble had a script written to do it for him. So impressed was he with the result and his new experience of Twitter that he told Chris Brogan in a blog post that Brogan was doing Twitter wrong.
Brogan famously spent 80% of his time replying to every tweet he got because it was, he said – with reference to the Zulu greeting Sawubona (literally: I see you) – “a way of recognising or seeing” another person. Brogan countered Scoble with the standard social media dogma: in this “seeing” of a person, social media was magically different from one-way, top-down mainstream media and very different from traditional advertising. But Scoble did not let up; the “conversations” were getting in the way:
“I can’t find his [Brogan´s] good blogs and videos. Why? Because he does so many conversations. Look at his Twitter home page. All you see is @replies. This is what makes Brogan Brogan, because he’s going to answer you no matter how popular he gets.”
Not for long, though. After a couple of months, Brogan shocked many when he announced Operation Unfollow. Many of those following him reacted emotionally. Others like the Sales Lion – who had been dumped by Brogan himself – blogged a half-hearted defence:
“Excuse me? At what point does a blogger/social media icon lose his/her rights to be normal, experiment, and possibly be wrong on occasion? If ‘John the Farmer’ unfollows 20 people to shake up his stream, does anyone say anything? Nope, nada. But let an ‘A-lister’ hurt a few feelings and then everyone wants to make a judgement.”
As the storm raged, Brogan, seemingly oblivious to his previous pronouncements on “seeing” people, tried to make sense of the furore:
“I think it’s because they somehow see me following them as some kind of endorsement. I don’t know. Maybe a validation? “If @chrisbrogan thinks my tweets are worth following…” but that’s just it. When you follow 131,000 people, you don’t see any tweets.”
Too many voices had become a problem.
After Brogan’s Operation Unfollow, Darren Rowse and many more social media big shots followed suit and mass-unfollowed people on their networks. Interestingly though, none of them suffered a significant drop in their follower numbers. What was obvious for some time was now reflected in the follow/follower numbers. There was a new hierarchy in town.
So is this the end of the story? A merely amusing tale of how our beliefs blind even smart people to obvious truths?
No, it goes deeper than that. Digital media is genuinely disruptive. It is different from media that has gone before. It has lowered barriers to entry, and it is incontrovertible that everybody can now be a publisher.
Leaderless revolutions and hierarchies of the future
Much has been made of the leaderless nature of 2011’s uprisings across the world. Intellectuals and pundits, from the BBC’s Paul Mason to former diplomats like Carne Ross to John Perry Barlow himself, have commented on how this success was partly due to the networked, non-hierarchical nature of social media. This made it harder to combat protestors by taking out their leaders.
Famously and fiercely leaderless, the hacker group Anonymous was in on the action all over the globe, from Tunisia to San Francisco.
Even the mighty News Corp was brought to heel when people used social media to vent and to agitate against its advertisers – something politicians on their own had been unable to do. Said Paul Mason:
“Six months ago, in the context of Tunisia and Egypt, I wrote that the social media networks had made “all propaganda instantly flammable”. It was an understatement: complex and multifaceted media empires that do much more than propaganda, and which command the respect and loyalty of millions of readers, are now also flammable…
But the most important fact is: not for the first time in 2011, the network has defeated the hierarchy.”
So is this not proof that McLuhan, Brand, Barlow and Kelly were right? Partly I think the answer is yes. But it’s more complicated than that.
Earlier this year, in response to the Egyptian revolt, sociologist Zeynep Tufekci wrote a piece warning against over-optimism and simplistic Utopian wishful thinking. It does not follow from the use of digital media in the revolt that hierarchy is a thing of the past for the Egyptian revolutionaries.
“A fact little understood but pertinent to this discussion, however, is that relatively flat networks can quickly generate hierarchical structures even without any attempt at a power grab by emergent leaders or by any organizational, coordinated action. In fact, this often occurs through a perfectly natural process, known as preferential attachment, which is very common to social and other kinds of networks...”
Tufekci goes on to explain with reference to Twitter how the meritocratic nature of social media is actually the basis of new hierarchies:
“Meritorious growth: In this model, the better, the more relevant, the more informative your tweets, the more followers you get. Surely, there is a lot of this going on. While this sounds good, it brings us to the next question: how will people know your tweets are so good? One mechanism, of course, is retweets. The number of retweets, however, may depend on how many followers you have to catch and retweet your posts in the first place. This means that those who have a large number of followers end up with an advantage even in terms of being recognized as meritorious.”
In other words it is possible that even those without merit could get higher follower numbers if they already have an installed base.
“In almost all human processes, already having a high status makes it easier to claim and re-entrench that high status. Thus, not only will more people see your tweets, they will see you as having the mark of approval of the community as expressed in your follower count.”
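Preferential attachment is easy to demonstrate with a toy simulation; the mechanism and numbers below are illustrative only, not drawn from Tufekci’s piece. Each new arrival follows one existing user, chosen with probability proportional to that user’s current follower count:

```python
# A toy "rich get richer" follow network: every newcomer follows one
# existing user, weighted by followers + 1 (the +1 lets unknowns be picked).
import random

def simulate_follow_network(n_users, seed=42):
    rng = random.Random(seed)
    followers = [0]  # follower count per user; start with a single user
    for _ in range(1, n_users):
        weights = [f + 1 for f in followers]          # preferential attachment
        chosen = rng.choices(range(len(followers)), weights=weights)[0]
        followers[chosen] += 1
        followers.append(0)                           # newcomer starts with none
    return followers

counts = simulate_follow_network(5000)
# How much of the total attention does the top 1% of users capture?
top_share = sum(sorted(counts, reverse=True)[:50]) / sum(counts)
print(f"Top 1% of users hold {top_share:.0%} of all follows")
```

Even though every user joins by exactly the same rule, the early and already-popular accounts end up hoarding a grossly disproportionate share of all follows: a hierarchy emerging from a perfectly flat process, with no power grab by anyone.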
The Egyptian Influence Network just after the February uprising. Notice that the larger dots, users with more followers, even then were generally grouped closer together. A new elite.
Interestingly, Robert Scoble recently complained of preferential attachment framed in a different way – the Unearned Follow. Since the rise of Twitter, a plethora of services (SoundCloud, Instagram, Quora) have sprung up allowing users to publish media easily to an audience that can follow them without being followed back.
Importantly, these services are also using existing networks like Twitter as a basis to recommend whom a user should follow, often based on popularity on Twitter – and, in some cases, even do it for you – building preferential attachment into the very fabric of our new user experiences. Scoble laments the passing of meritocracy with it.
“Did I really earn your follow on these new services like I did on the older services like Twitter or Facebook? Yeah, you could say that I didn’t earn them there, either, since I had hundreds of thousands of readers on my blog before I started tweeting, but on Twitter you needed to manually “follow” me and other people on the system. On the newer systems they automatically follow people based on their Twitter popularity.
This presents a distorted picture of who is putting the most effort into the system.
Do you see the problems this causes?”
The future is what we make it
Hierarchy was not part of the original vision for the Net. If a new hierarchy is forming, how many of the other dearly held beliefs of the digital set are on less sure ground? So, to conclude, a few thoughts from me on shibboleths that now stand on shakier ground:
Person to person
Broadcasting is not going away. In fact, stats show that social media is boosting live TV viewing in particular. Services like Klout, which purport to measure influence in media, are flourishing. It makes sense: if some people are more equal than others, knowing who they are is a valuable resource. This week social media thought leader Brian Solis wrote in a post titled 2012: The Year for Digital Darwinism:
Digital influence is becoming prominent in social networks, turning everyday consumers into new influentials. As a result, a new customer hierarchy is developing forcing businesses to identify and engage to those who rank higher than others.
Small is limited. Brands have realised they still need massive reach (a large audience) to flog their products. So in social media, new techniques that are not person-to-person are entering the marketing playbook – techniques like getting celebrities to market your wares: so-called marketing to an audience with an audience.
And some old techniques are migrating to social networks. Facebook advertising is hugely popular and successful.
Openness and Out of Control
Not only is the wildly popular iOS a closed platform; Google negotiated a particularly pernicious deal killing off mobile Net Neutrality in the US, whereby mobile networks can exclude services – usually competitors’ traffic – from their networks. Similarly, mobile users in Europe can’t access services like Skype. As mobile networks strain to cope with the increase in data, watch them increasingly campaign against Net Neutrality as a way to protect profit margins.
People like Cory Doctorow warn that there’s a war on general-purpose computing coming – and that without such computing we lose the DIY freedom and power Brand spoke of. And cloud-based services are contributing to the shift in power from us to service providers, making it easier for central authorities to control the Net. And let’s not even mention China.
The entertainment industry is doing its utmost to introduce laws that will make Internet companies liable for policing the Internet for any copyright infringement. Clay Shirky warns that this requirement to police will push up the cost of providing user-generated content services to the extent that they become uneconomical.
Community and Conversation
While it would be an oversimplification to call what businesses are doing in social media just broadcasting, for the most part it’s not community either. Some are spotting trends of brands and bloggers switching off the comments on their websites because of the volume and quality of what they have to deal with. Big is in fact the death of community, even for individuals. As Gina Trapani points out in an excellent post, a large following creates all kinds of problems, not least of which is remembering to share with the people you care about. For brands selling fast-moving consumer goods (like toothpaste), and for business-to-business brands, creating real communities using social media is very hard. Repeated studies show that most people Like brands on Facebook in order to get discounts.
Authenticity and Meritocracy
Once some people and brands have huge audiences, the pressure to promote only good stuff will fade. Watch for people promoting things not because they are best, but because they are close to them or, worse, because they have received money for it. Watch for social media services promoting mediocre celebrities to follow in order to attract users. With the development of personalised search results, preferential attachment could migrate even into Google Search. There are already signs of this.
And how’s this for a thought experiment: should you employ an SEO expert, or a person – who might be clueless about SEO – who is followed by thousands on Google Plus? Bear in mind that every page this person +1s will feature higher in Google search rankings for their thousands of followers.
In John Perry Barlow’s Declaration of the Independence of Cyberspace he assertively claims that online we choose our own identities, and implies that they are probably different from our ‘real’ identities. The idea that who we are in the nooks and crannies of the Net is not tied to our ‘real world’ identity – and that we may have a few of them – is indeed yet another article of faith of the Californian Ideologues. Yet more and more services are turning to Facebook as an identity provider, because Facebook’s insistence on the use of real identities combats anti-social behaviour. Not only that, but Zeynep Tufekci has argued convincingly that people using their everyday identities on Facebook was a powerful force in helping the revolutions against autocrats last year – and that it would not have been so had they stayed anonymous. The idea of a bunch of ordinary people together, in the open, being more effective than the lone anonymous hacker is not only a bitter pill to swallow, but should make us rethink many assumptions.
The wisdom of markets
Our Californian Ideologues always seemed to forget one simple fact: the Internet was founded on taxpayers’ money. The explosion of information technology and the deregulation of the 80s have not led to massive improvements in productivity in the US, or to high-quality jobs. In fact the US has experienced relative decline, and inequality is at unprecedented levels.
In a startling post just this week, Rick Bookstaber points out why the middle classes are disappearing and inequality is increasing – and it is not because there are no jobs. The jobs are being outsourced – to us! But we are not being paid for them:
“The jobs are moving from the producer to the consumer side of the ledger. And some of that work comes as the guise of entertainment. How much of your work is being done as you do your e-mails and surf the web, keep yourselves busy with your apps as you commute to work? So it is not only that computers are replacing workers, they are turning consumers into unpaid workers.”
We sure are not all hip and rich in the digital Utopia.
So all of these digital credos have to be re-examined more critically. They don’t all currently apply, or don’t apply evenly in all circumstances. So what to do?
We need to realise that these ideals and norms won’t necessarily result from the technology by itself. If we indeed think some of them are preferable to how our industrial age worked, we will need to argue, design, program and perhaps even fight for them.