Executive Summary and Introduction
The purpose of this experiment was to test a hypothesis that sites built in Flash are less usable than similar sites built in HTML. Luckily for me, Tiffany.com has gone to the trouble of making an "enhanced" version of their site in Flash that is a close copy of the HTML version. If you don't have the time to read through all the boring details, here's a quick summary: The Flash version of Tiffany.com scored lower than the HTML version of Tiffany.com in every objective measure, and was rated inferior by testers in 11 out of 12 subjective criteria. This was Reagan vs. Mondale, LA cops vs. Rodney King, rock vs. scissors.
You're probably asking yourself, "This test is obviously biased. This guy wrote an article called 'Flash is Evil.' Gimme a break." You'd be right: I am biased against Flash, especially when it's used to build e-commerce sites like Tiffany.com. However, in an effort to somewhat diminish that criticism, I had a neutral, objective third party observe one of the testing sessions. I encourage you to read my post-test interview with him in Appendix A, "Interview with Bob LinDell."
Because the test was conducted after work with a promise of only hors d'oeuvres and a can of Coke (read: no cash payment), I was only able to talk eight co-workers into performing the test. Hence, there were not enough testers to create a control group. Four people tested the HTML site and four people tested the Flash site. None of the testers were told that this was a Flash vs. HTML test; they were simply presented with a site and asked to complete the test.
Borrowing heavily from User Interface Engineering's web site testing methodology, I created four questions: (i.) a simple fact question, (ii.) a judgement question, (iii.) a comparison of fact question, and (iv.) a comparison of judgement question.
In order to create some sort of quantitative measure comparing the two sites, the simple fact question and the comparison of fact question were timed, and testers were told this before the questions were asked. For the judgement question and the comparison of judgement question, I asked the testers to think out loud as they tried to answer.
Finally, because we were testing an e-commerce site, I created an e-commerce-related task (adding items to a shopping cart) that also was timed.
The Post-Test Survey
After the testers answered the questions and completed the task, I gave them a post-test survey. The survey also borrows from UIE's testing methods. It asks testers to rate the site on 12 different criteria, from 1 (very unsatisfied) to 5 (very satisfied). This is the most interesting part of this study, so don't leave until you've read the survey questions and results in the Detailed Test Results section of this report.
Test Limitations
This was an informal test conducted after work with co-workers I know to varying degrees.
There were only eight testers; four tested the Flash site and four tested the HTML site. It is possible that the four testers who tested the Flash site are all less web-savvy than the four who tested the HTML site. They might all be less intelligent. They might subjectively grade things lower than those in the HTML group. They might have all just been having a bad day.
The Flash version of Tiffany.com and the HTML version of Tiffany.com are very close, but they are not exactly the same design. It is possible that the slight differences between the designs, and not the fact that one site was built in Flash and the other in HTML, were the cause of the lopsided test results.
If rash, irritation, redness, or swelling develops, discontinue reading.