Wikipedia / Trust


Is Wikipedia really open for anyone to edit?

[Openness (category)]

Increasingly the answer looks to be no, as they add more levels of control and bureaucracy...

Of course the advertised advantages of more control are that there will be less abuse, vandalism, etc., but one has to be skeptical of anything that takes away freedom and replaces it with more control: the risk of abuse by the few wielding that control can be great...

I don't have enough evidence yet to know if my doubts are justified, but I'm keeping my eyes open...

Wikimedia quality control

http://quality.wikimedia.org/

http://meta.wikimedia.org/wiki/Wikiquality

http://en.wikipedia.org/wiki/Wikipedia:Flagged_revisions/Quality_versions. Retrieved on 2007-05-11 11:18.

... The vision is ambitious: to make Wikipedia a reliable source, and requires that we shift our focus from quantity to quality.[2] These proposals are modest beginnings towards that goal. ...

[...] we can evaluate by the feature article review process, whether after a month or so of editing, the article has actually improved or deteriorated.[4] If the editing has improved the article, then the new version would be promoted to "quality". ...

Pages that are under constant attack from POV-pushers tend to be defended by well-meaning editors and admins, who are sometimes quick to revert edits. Some controversial pages are so heavily guarded that unless an edit is made by one of the regular contributors to the article, it may simply be reverted for no other reason than that. Constantly having to defend an article from heavy POV-pushing makes it much harder to assume good faith, especially towards editors that suggest similar changes to those previously insisted upon by notorious POV-pushers. Although we manage to write excellent articles on controversial topics, we pay a heavy price for this: we create a combative rather than collaborative editing atmosphere.
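Purely to make the proposal concrete, here is a minimal sketch of such a flagged/quality-revision workflow, under a simplified model I'm assuming (edits land as drafts; a reviewer can mark a revision "sighted" or promote it to "quality"; readers see the best reviewed version). The class and method names are my own illustration, not MediaWiki's actual FlaggedRevs code:

    # Hypothetical sketch of a flagged/quality-revision workflow (not MediaWiki code).
    from dataclasses import dataclass, field

    @dataclass
    class Revision:
        author: str
        text: str
        flag: str = "draft"              # "draft" -> "sighted" -> "quality"

    @dataclass
    class Page:
        revisions: list = field(default_factory=list)

        def edit(self, author, text):
            # every edit creates a new draft revision; nothing is hidden or lost
            self.revisions.append(Revision(author, text))

        def review(self, index, flag):
            # a reviewer either confirms a revision is not vandalism ("sighted")
            # or, after the longer review process, promotes it to "quality"
            assert flag in ("sighted", "quality")
            self.revisions[index].flag = flag

        def shown_to_readers(self, prefer="quality"):
            # readers get the newest revision carrying the preferred flag,
            # falling back to the latest draft if nothing has been reviewed yet
            for rev in reversed(self.revisions):
                if rev.flag == prefer:
                    return rev
            return self.revisions[-1]

    page = Page()
    page.edit("newcomer", "first draft")
    page.edit("newcomer", "improved draft")
    page.review(1, "sighted")                              # reviewer: not vandalism
    print(page.shown_to_readers(prefer="sighted").text)    # -> "improved draft"

Even in this toy form, the concern raised below is visible: whoever controls review() decides which revision readers actually see.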

It sounds good on paper. But "quality" is subjective. And if that power is put in the wrong hands, the authority that should be used simply to reject changes that fall short of editorial quality could just as easily be misused to reject changes that those power-users don't agree with!

http://technology.newscientist.com/channel/tech/mg19526226.200-wikipedia-20--now-with-added-trust.html. Retrieved on 2007-05-11 11:18.

... It is a problem that dogs every Wikipedia entry. Because anyone can edit any entry at any time, users do not generally know if they are looking at a carefully researched article, one that has had errors mischievously inserted, or a piece written by someone pushing their own agenda. As a result, although Wikipedia has grown in size and reputation since its launch in 2001 - around 7 per cent of all internet users now visit the site on any given day - its information continues to be treated cautiously.

That could be about to change. Over the past few years, a series of measures aimed at reducing the threat of vandalism and boosting public confidence in Wikipedia have been developed. Last month a project designed independently of Wikipedia, called WikiScanner, allowed people to work out what the motivations behind certain entries might be by revealing which people or organisations the contributions were made by (see "Who's behind the entries?"). Meanwhile the Wikimedia Foundation, the charity that oversees the online encyclopedia, now says it is poised to trial a host of new trust-based capabilities. ...

The shift is a dramatic one for the encyclopedia. For now, edits to an entry can be made by any user and appear immediately to all readers. In the new version, only edits made by a separate class of "trusted" users will be instantly implemented. To earn this trusted status, users will have to show some commitment to Wikipedia, by making 30 edits in 30 days, say. Other users will have to wait until a trusted editor has given the article a brief look, enough to confirm that the edit is not vandalism, before their changes can be viewed by readers.

This is sure to ease some readers' doubts. Most malicious edits involve crude acts of vandalism, such as the deletion of large chunks of text. Now such changes will rarely make it into articles. These benefits will come at a price, though. New users could be deterred from participating, since they will lose the gratification that comes from seeing their edit instantly implemented. That could reduce the number of editors as well as creating a class system that divides frequent users from readers. The trusted editors, likely to number around 2000, may also find that articles are being changed too fast for them to monitor.

Not all versions of the encyclopedia will follow this route, says Erik Möller of the Wikimedia Foundation. While editors on the German version are happy with a hierarchy of contributors, the English editors favour a more egalitarian approach. So English readers are likely to continue to see the latest version of an entry, with a page that has been certified as vandalism-free by trusted editors available via a link. ...

As well as relying on trusted editors, Wikipedia's upgrade will involve automatically awarding trust ratings to chunks of text within a certain article. Möller says the new system is due to be incorporated into Wikipedia within the next two months, as an option for the different language communities. The software that will do this, created by Luca de Alfaro and colleagues at the University of California, Santa Cruz, starts by assigning each Wikipedia contributor a trust rating using the encyclopedia's vast log of edits, which records every change to every article and the editor involved. Contributors whose edits tend to remain in place are awarded high trust ratings; those whose changes are quickly altered get a low score.

The rationale is that if a change is useful and accurate, it is likely to remain intact during subsequent edits, but if it is inaccurate or malicious, it is likely to be changed. Therefore, users who make long-lasting edits are likely to be trustworthy. [...] ...

Once all contributors are rated, the software then uses this information to rate chunks of text. If whole entries have been contributed by one person and left unchanged, the text inherits the rating of that person. If text has been edited several times, then its rating is calculated using the ratings of all contributors. If a modification to an entry leaves a particular chunk unchanged, that chunk will get a high rating. The system runs the risk of penalising editors who tackle malicious changes by correcting them, because the corrections are often quickly changed back to the malicious version by the vandals. To try to minimise this, the drop in an editor's rating that occurs when their edit is changed will depend upon the rating of the other editor involved.

Once all text has been rated, the software colour-codes it, with darker shades for lower ratings. Readers will then have the option of clicking through to a colour-coded page, allowing them to immediately judge which parts of an entry to trust. Automation introduces challenges, however. New editors could get put off when they see their text flagged as questionable by default. A high rating may also become an end in itself, leading people to come up with ways to get their text rated highly without necessarily enhancing its quality. Although de Alfaro won't publish the ratings, Wikipedia's log is public, so anyone with a copy of the algorithm could publish the results.
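To make the mechanism easier to picture, here is a rough sketch of the kind of edit-survival reputation and colour-coding scheme the article describes. This is only my reading of the description above, not de Alfaro's actual algorithm; the weights, data shapes, and colour bands are assumptions:

    # Toy model of edit-survival reputation and text colour-coding (illustrative only).
    from collections import defaultdict

    def compute_reputations(history):
        """history: time-ordered revisions as (editor, kept_authors, replaced_authors),
        listing the original authors of text the editor left intact or overwrote.
        Surviving text raises its author's rating; replaced text lowers it, weighted
        by the rating of the editor who replaced it."""
        rep = defaultdict(lambda: 1.0)                    # everyone starts neutral
        for editor, kept, replaced in history:
            for author in kept:
                if author != editor:
                    rep[author] += 0.05 * rep[editor]     # text survived: reward
            for author in replaced:
                if author != editor:
                    rep[author] -= 0.10 * rep[editor]     # text overwritten: penalise
        return rep

    def chunk_trust(chunk_authors, rep):
        # a chunk of text inherits a rating combined from everyone who touched it
        return sum(rep[a] for a in chunk_authors) / len(chunk_authors)

    def shade(trust, lo=0.0, hi=2.0):
        # darker shades for lower ratings, as in the colour-coded reader view
        band = min(max((trust - lo) / (hi - lo), 0.0), 1.0)
        return ["darkest", "dark", "light", "pale", "none"][int(band * 4)]

    history = [
        ("alice",   [], []),            # alice writes the article
        ("bob",     ["alice"], []),     # bob edits and keeps alice's text
        ("mallory", [], ["alice"]),     # mallory blanks a section
        ("bob",     [], ["mallory"]),   # bob reverts the vandalism
    ]
    rep = compute_reputations(history)
    print({name: round(r, 2) for name, r in rep.items()})
    print(shade(chunk_trust(["alice", "bob"], rep)))

Note that the penalty for being overwritten scales with the overwriting editor's own rating, which is the mitigation the article mentions for editors who revert vandals and then get re-reverted.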

Who's behind the entries?

Wikipedia has a two-pronged plan for reducing vandalism and inaccuracies on its site, but an independent website launched last month might discourage another kind of bad behaviour: agenda-driven edits.

WikiScanner allows people to find out which organisations are behind contributions to Wikipedia entries by taking the IP address of the computer that submitted the entry, which Wikipedia makes public, and looking it up in a second database that links organisations to their IP addresses.
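The mechanism is essentially a join between two public data sets. A toy version, with made-up IP ranges and organisation names (WikiScanner's real database and code are not shown here), might look like this:

    # Toy WikiScanner-style lookup: map an anonymous edit's IP to an organisation.
    import ipaddress

    # hypothetical org-to-IP-range table (the real tool used a large IP-ownership DB)
    ORG_RANGES = {
        "Example Corp": ["203.0.113.0/24"],
        "Example University": ["198.51.100.0/25"],
    }

    def org_for_ip(ip):
        addr = ipaddress.ip_address(ip)
        for org, blocks in ORG_RANGES.items():
            if any(addr in ipaddress.ip_network(block) for block in blocks):
                return org
        return None

    # anonymous edits as Wikipedia publishes them: (article, editor IP, edit summary)
    edits = [
        ("Some Article", "203.0.113.45", "removed criticism section"),
        ("Other Article", "192.0.2.7", "fixed typo"),
    ]

    for article, ip, summary in edits:
        org = org_for_ip(ip)
        if org:
            print(f"{article}: edit from {ip} appears to come from {org}: {summary}")

The only inputs are data Wikipedia already publishes (the IP address of each anonymous edit) plus a database linking organisations to their IP ranges; no special access is needed.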

The site has already revealed that staff at Fox News Network cut sections of an article criticising the channel's correspondents and that someone at Diebold, which manufactures voting machines, removed paragraphs questioning the machines' reliability.

The site's influence could go further, says Wikipedia co-founder Jimmy Wales. A similar system could be created that displays the name of someone's organisation as they are submitting their edits, warning them in real time that it will be clear, to anyone who wants to know, who they are. That might make them think twice about trying to distort an entry.

Tyler,

Then again, just because someone has a strong viewpoint and is interested in defending/explaining/spreading it doesn't necessarily mean they should be barred from editing. You shouldn't penalize people for having strong opinions, only perhaps if they let those opinions lead them to edit an article into something biased, one-sided, and non-neutral. People are bound to disagree, but who's to say whether a particular edit was "agenda-driven" or not, any more than that those who seek to reverse such a change aren't simply driven by the polar-opposite agenda? Opinionated people of all stripes should be encouraged to work together. If a consensus cannot be reached, then at least each side could be allowed to explain/defend its particular viewpoint in part of the article. After all, the people most qualified to defend a particular viewpoint may be precisely those who have an "agenda" to push it. Being opinionated should qualify them to participate in the editing/discourse, not disqualify them.

Ed H. Chi,

Rather than automating the reputation system through analysis of the edits, what might be more profitable is to display the social activities of the editors --- make their actions more transparent to the community, so that social norms can be more deeply ingrained into the community. WikiDashboard ( http://wikidashboard.parc.com ) does exactly this. With more social transparency, one might hope that readers would have ways to make up their own minds about the appropriateness of the information.
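For comparison with the automated approach above, here is a tiny sketch of what such a dashboard could compute from the public edit history: just a per-editor activity summary for an article, left for the reader to interpret. The log format is an assumption for illustration; WikiDashboard's own implementation is not reproduced here:

    # Toy "social transparency" view: summarise who has been editing an article.
    from collections import Counter

    def activity_dashboard(edit_log, article):
        """edit_log: iterable of (article, editor, timestamp) tuples (assumed format)."""
        per_editor = Counter(editor for a, editor, _ in edit_log if a == article)
        total = sum(per_editor.values()) or 1
        for editor, n in per_editor.most_common():
            bar = "#" * round(20 * n / total)
            print(f"{editor:15} {n:4} edits  {bar}")

    activity_dashboard(
        [("Example", "alice", "2007-05-01"),
         ("Example", "alice", "2007-05-03"),
         ("Example", "bob",   "2007-05-04"),
         ("Other",   "carol", "2007-05-04")],
        "Example",
    )

Nothing here scores anyone; the point of the approach is that the raw activity is shown and the judgement is left to the reader.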


http://en.wikipedia.org/wiki/Wikipedia:Flagged_revisions

http://en.wikipedia.org/wiki/Wikipedia:Why_stable_versions

http://en.wikipedia.org/wiki/Wikipedia:Stable_versions

http://en.wikipedia.org/wiki/Wikipedia:Static_version

http://www.foxnews.com/wires/2006Aug04/0,4670,Wikimania,00.html



Can that be used for censorship? To promote a bias/agenda? Is it already?

One could easily use the guise of "quality control" to silence those one disagrees with...


Putting opposing viewpoints side-by-side

http://wikidashboard.parc.com/. Retrieved on 2007-05-11 11:18.

The idea is that if we provide social transparency and enable attribution of work to individual workers in Wikipedia, then this will eventually result in increased credibility and trust in the page content, and therefore higher levels of trust in Wikipedia. ... Because the information is out there for anyone to examine and to question, incorrect information can be fixed and two disputed points of view can be listed side-by-side. In fact, this is precisely the academic process for ascertaining the truth. Scholars publish papers so that theories can be put forth and debated, facts can be examined, and ideas challenged. Without publication and without social transparency of attribution of ideas and facts to individual researchers, there would be no scientific progress.
