Opinion: Oscar Wilde once said "everything popular is wrong." Digg's relaunch gives us an opportunity to find out if he was right or not.
There's no doubt that Digg.com has become massively popular in a relatively short time, rising as a news source to rival The New York Times in readership.
Originally designed as a tech news site, Digg's new version, launched June 26, will include categories for world and business news, online video, science, gaming and entertainment.
The new version also includes a number of filtering features to allow users to see what their friends are digging, the ability to switch between headlines and stories, and some limited AJAX functionality to help streamline the user experience.
Digg's model of allowing any user to submit stories and then ranking them based on how many votes ("diggs") they get has been widely hailed as the poster child for "Web 2.0," that somewhat vague meme that's been sweeping the tech world over the past year.
Its democratic model of user-generated (and promoted) content is an appealing one, allowing anyone to become an "editor" and users to become participants in an ever-expanding Web of content.
The model's attractiveness can be seen in the growing number of "Web 2.0" startups such as flickr.com, YouTube, del.icio.us, reddit.com, and tailrank.com that combine user-submitted content with user-contributed taxonomies (tags), community commentary and popularity features.
The promise with these sites is that by allowing anyone to contribute and comment, the community will produce something that's better than the sum of its parts and certainly better than old-school publications run by a tightly-controlled cabal of editors and writers.
But does it work?
A recent analysis of Digg's homepage posts makes me wonder.
The analysis found that "66 percent of all stories posted on digg that made it to the homepage are pushed there by the same 60 people."
Yup: 66 percent of the content is being controlled by the same 60 people! Rather than a site where the community as a whole decides what gets seen, Digg is really one where a small group controls nearly everything.
Instead of a grand sweeping experiment in the democratization of content, what Digg seems to be showing us is yet another example of what economists call "The Tragedy of the Commons," where group incentives (a wide variety of interesting content) clash with individual incentives (popularity, recognition and validation).
In "traditional" publications where content is created and edited by a limited number of preselected people, the overall viewpoint and quality are assured by those who have a real incentive (mainly financial) to provide quality.
Readers select publications because they've been able to identify ones they like based on factors such as the quality of the information, the agenda of the publication (and its editorial board), the timeliness of the information and the exclusivity of the content.
In the best publications the barriers to entry for content contributors are relatively high, and the incentives for the publications to maintain standards of quality are high, too: if you're going to make money, you're going to need subscribers and advertising revenue.
The value of the publication to the readership is that they can trust it to provide content of value that's been "pre-filtered," so that readers can find important information in the limited time available to them.
The viability of the publication depends on providing value over the long term, building its brand through quality content.
On the other hand, social news sites have few such incentives. Instead, the incentives of the publication are to drive traffic and the incentives of the contributors are to achieve popularity.
What occurs is that the "editorial agenda" of a site driven by popularity becomes that of those who have the most time on their hands and the strongest drive for recognition.
It's a cycle that feeds on itself, as the most motivated posters strive to achieve recognition by using language that drives reaction (witness the number of posts with "awesome" or "amazing" in the title on Digg), recruiting their "friends" as readers, and recruiting others to "digg" their stories (as well as "digging" their own efforts off-site).
It's not a new problem, but it's one that needs to be examined critically as more and more online publishers work community features into their sites (such as the new Netscape).
Rather than blissfully running into the "Web 2.0" promised land, we should examine how to take the best features of sites like Digg (a wide net of eyeballs searching for stories, a multiplicity of viewpoints, a diversity of content and a platform for sharing media) and combine them with what we've learned about community interaction, publication and social software.
Here are some suggestions:
1. Provide barriers to entry: One of the major reasons so many dot-coms failed was that it was just too easy to start an online business; likewise, problems arise on sites like Digg because anyone can contribute on equal footing with everyone else.
Providing "barriers" such as provisional status for new contributors (perhaps tagging their posts with their "provisional" status until they've proven that they can be good members of the community) would eliminate site spamming and would encourage responsible community behavior.
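The provisional-status idea could look something like the following sketch. This is a hypothetical model, not Digg's actual system; the `Contributor` class, `label_post` helper and promotion threshold are all assumptions made for illustration.

```python
from dataclasses import dataclass

# Assumed rule: a contributor stays "provisional" until five of their
# submissions have been accepted by the community.
PROMOTION_THRESHOLD = 5

@dataclass
class Contributor:
    name: str
    accepted_posts: int = 0

    @property
    def provisional(self) -> bool:
        return self.accepted_posts < PROMOTION_THRESHOLD

def label_post(author: Contributor, title: str) -> str:
    """Tag posts from unproven members so readers can weigh them accordingly."""
    prefix = "[provisional] " if author.provisional else ""
    return prefix + title
```

Under this scheme a newcomer's submissions carry a visible "[provisional]" tag until they've proven themselves, after which the tag silently disappears.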
2. Predefined content taxonomies: While folksonomies enabled by tagging help users find content, they can also cause their own problems when left unchecked.
Deceptive, incorrect or misspelled tags can all create problems. Just as search engines have had to fight search spammers, Web 2.0 sites that use tagging need to combat "tag spammers" (and just plain bad taggers) by developing predefined taxonomies of tags that users can select from rather than allowing open tagging.
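A minimal sketch of what validating submissions against a predefined taxonomy might look like; the tag list here is invented for illustration, and a real site would draw it from its editorial category list rather than hard-coding it.

```python
# Hypothetical predefined taxonomy of allowed tags.
APPROVED_TAGS = {"world", "business", "video", "science", "gaming", "entertainment"}

def validate_tags(submitted):
    """Split user-submitted tags into accepted and rejected lists.

    Matching is case-insensitive; anything outside the taxonomy (a
    misspelling or deliberate tag spam) is rejected rather than indexed.
    """
    accepted = [t for t in submitted if t.lower() in APPROVED_TAGS]
    rejected = [t for t in submitted if t.lower() not in APPROVED_TAGS]
    return accepted, rejected
```

A misspelled tag like "scince" simply never enters the index, so it can't fragment the taxonomy the way open tagging allows.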
3. Put more humans in the editorial loop: Sites like Digg would benefit from what Clay Shirky calls "meta-moderation" where site editors work to cull bad posts from the site.
This, of course, can cause its own set of problems, but editorial filtering can help eliminate social effects that actually hurt the site. Netscape's new site combines editorially-generated content with user-contributed content.
4. Eliminate ownership: Removing the ego incentive by not posting usernames with content would eliminate many of the negative social effects of people seeking their own popularity.
Of course, this might also chill posting because individual efforts aren't being recognized, but it would also assure that those contributing are doing so for the group, not for their own benefit.
5. Provide better reporting systems for identifying "best of" and "worst of" content: Craigslist does a good job of this, and sites like Slashdot have used similar features to help users block out content from problem contributors.
Users should be able to both "digg" good content and identify problem content.
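The two signals could be combined in something like the following sketch, loosely modeled on Slashdot-style moderation; the visibility rule and its margin are assumptions, not any site's actual algorithm.

```python
class Story:
    """A submitted story that users can either promote ("digg") or report."""

    HIDE_MARGIN = 3  # assumed: hide once reports outnumber diggs by this much

    def __init__(self, title: str):
        self.title = title
        self.diggs = 0
        self.reports = 0

    def digg(self) -> None:
        self.diggs += 1

    def report(self) -> None:
        self.reports += 1

    @property
    def visible(self) -> bool:
        # Reports and diggs offset each other, so a story with genuine
        # community support survives a handful of spurious reports.
        return self.reports - self.diggs < self.HIDE_MARGIN
```

The offsetting design matters: a pure report counter would let a small clique bury anything, while netting reports against diggs means only content the community genuinely rejects gets hidden.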
There are a lot of benefits to socially-generated content, but communities have problems that can't be ignored. By looking critically at current practices, examining problems inherent in communities, and looking at some old tools that have worked in the past, we can create publications that are better than anything that's come before.
We just need to keep our eyes open so that Web 2.0 doesn't (sorry, I can't resist the pun) digg its own grave.