Tuesday, April 14, 2009

Two published studies of WikiDashboard show that transparency impacts perceived trustworthiness

First Study: CSCW2008 (Best Note Award!) [1]

At the CSCW2008 conference about four months ago, we published a user study, conducted on Amazon's Mechanical Turk, showing how dashboards affect users' perceptions of the trustworthiness of Wikipedia articles.

In that experiment, we designed a visualization that aggregates a number of trust-relevant metrics from the history of a Wikipedia article, and then created nearly identical versions of the dashboard in which only a few elements were changed.



We developed high-trust and low-trust versions of the visualization by manipulating the following metrics (see the sketch after this list for how such metrics might be computed):
• Percentage of words contributed by anonymous users. Anonymous users with low edit counts are often responsible for spam and vandalism [1].
• Whether the last edit was made by an anonymous user or by an established user with a large number of prior edits.
• Stability of the content (measured by the number of changed words) over the last day, month, and year.
• Past editing activity. Displayed in graphical form were the number of article edits (blue), the number of edits made to the article's discussion page (yellow), and the number of reverts made to either page (red). The high-trust and low-trust graphs were mirror images of each other, showing either early high stability with more recent low stability, or vice versa.
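
For readers curious about the mechanics, here is a minimal Python sketch of how metrics like these might be computed from an article's revision history. It is an illustration only, not the pipeline used in the papers: the revision-record fields (user, is_anon, timestamp, words_changed) are hypothetical, and real word-level attribution requires diffing successive revision texts.

    from datetime import datetime, timedelta

    def trust_metrics(revisions, now):
        # revisions: list of dicts, oldest first. Each record is assumed
        # to carry `is_anon`, `timestamp`, and `words_changed` fields;
        # deriving `words_changed` in practice requires diffing the text
        # of successive revisions.
        total_words = sum(r["words_changed"] for r in revisions)
        anon_words = sum(r["words_changed"] for r in revisions
                         if r["is_anon"])

        def churn(window):
            # Words changed within a recent time window
            # (a rough proxy for content stability).
            cutoff = now - window
            return sum(r["words_changed"] for r in revisions
                       if r["timestamp"] >= cutoff)

        return {
            "pct_words_anonymous": 100.0 * anon_words / max(total_words, 1),
            "last_editor_anonymous": revisions[-1]["is_anon"],
            "words_changed_last_day": churn(timedelta(days=1)),
            "words_changed_last_month": churn(timedelta(days=30)),
            "words_changed_last_year": churn(timedelta(days=365)),
        }

    # Toy example:
    revs = [
        {"user": "Alice", "is_anon": False,
         "timestamp": datetime(2008, 6, 1), "words_changed": 900},
        {"user": "192.0.2.7", "is_anon": True,
         "timestamp": datetime(2009, 4, 13), "words_changed": 40},
    ]
    print(trust_metrics(revs, now=datetime(2009, 4, 14)))

In the experiment itself, the high-trust and low-trust dashboards simply displayed favorable or unfavorable values for these metrics while the article content was held constant.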

We also included a baseline condition in which no visualization was used at all.



The results with Mechanical Turk users show that surfacing trust-relevant information had a dramatic impact on perceived trustworthiness while holding the content itself constant. The effect was robust and unaffected by the quality or the degree of controversy of the page, and it worked in both directions: the high-trust condition increased perceived trustworthiness above the baseline, and the low-trust condition decreased it below the baseline. This result is obviously very encouraging for folks who are keeping score on the effects of transparency on trust.

These results suggest that the widespread distrust of wikis and other mutable social collaborative systems may be reduced by providing users with transparency into the stability of content and the history of contributors.

Second Study: CHI2009 Lab Study [2]

In this second study, a lab experiment, we extended the first study by allowing users to fully interact with the live version of WikiDashboard, which provided visualizations of the actual article and editor histories. Moreover, we used questions from prior credibility research to assess a larger set of trust metrics for both the WikiDashboard condition and the plain Wikipedia interface with no visualizations. Another experimental factor was whether the article had been independently identified as being of questionable credibility by the Wikipedia community (via the WikiProject Rational Skepticism page on Wikipedia).

Interestingly, the results here are consistent with the first study: users who saw WikiDashboard increased their credibility judgments of articles in both the Skeptical and the Non-Skeptical categories.

In summary, both studies suggest that presenting more transparent information increases the credibility of an article, whether it is controversial/skeptical or not. This is logical: if you're buying a car, you expect to see the vehicle's full history along with the price. If you only had the price, you'd be less likely to deal with that particular dealer. Transparency breeds trust.

Given the prevalent skepticism around Wikipedia's content, the studies suggest that presenting a transparent visualization of an article's authoring history can boost its credibility. This further suggests that some people distrust Wikipedia simply because they want a better understanding of how the content came to be.

References:

[1] Kittur, A., Suh, B., and Chi, E. H. 2008. Can you ever trust a Wiki? Impacting perceived trustworthiness in Wikipedia. In Proceedings of the ACM 2008 Conference on Computer Supported Cooperative Work (San Diego, CA, USA, November 08 - 12, 2008). CSCW '08. ACM, New York, NY, 477-480. DOI= http://doi.acm.org/10.1145/1460563.1460639

[2] Pirolli, P., Wollny, E., and Suh, B. 2009. So you know you're getting the best possible information: a tool that increases Wikipedia credibility. In Proceedings of the 27th International Conference on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI '09. ACM, New York, NY, 1505-1508. DOI= http://doi.acm.org/10.1145/1518701.1518929
