Re: usability lab on DevConf in Brno

On Thu, 2012-12-13 at 16:33 -0500, Chris Lumens wrote:
> I think we only have one camera setup here, 

The portable lab I have has 3 cameras, but it can only record one stream
at a time, so the extra cameras would do us no good.

Having audio of the test would be very useful, though; I can produce
transcripts from it if the sessions are in English.

> which would make it
> difficult to film multiple people installing at the same time.  However
> that's probably a limitation we will have to live with.  Thus, we should
> probably have a small number of computers so we can get one observer for
> each person doing the install.  The observer can take detailed notes on
> what the installer is doing and where they get hung up.

I can put together a worksheet that could be printed out and handed out
to observers to make it easier for them to write up their observations.

> As for specific needs, I think answering that first means we need to
> decide what we want out of this.  Are we trying to get as broad a base
> of testing of anaconda as possible, or are we trying to get a larger
> sample size of people doing similar things?  If the former, we will want
> specific setups (pre-installed Windows, pre-installed other Linux, etc.)
> on VMs.  If the latter, the setup doesn't matter all that much and we
> could just use blank disks on VMs.

My intern and I will be putting together a usability test plan
(hopefully with everyone's help; we'll start the discussions here and in
#anaconda as we begin that work), and it should be ready in time for all
of this. I have two categories of tasks I'd like included in the test
plan:

1) Testing users' general ability to install Fedora across scenarios
typical for Fedora target users (on a clean bare-metal machine, on a VM,
alongside pre-installed Windows, alongside another pre-installed Linux,
and on a Mac)

2) Testing of a set of tasks specific to the custom partitioning UI

We could focus on tasks from just one of those two categories for
DevConf if we want. I think it'll be a more technical audience, so it
might be worth focusing on the custom partitioning UI tasks? We could
have tasks like, 'Here's a diagram showing a specific disk/partition
layout. Re-create this using Anaconda.' And we could have a set of
these, with some layouts using LVM, some using BTRFS, etc.

I think we should emphasize coverage of features over having different
users repeat the same task, if possible. We'll get a lot more useful
data, covering more areas for improvement, that way. (When you have
multiple people doing the same tasks, returns typically diminish sharply
after about 5 or 6 users; even if only 1 user goes through a particular
set of tasks in the test plan, that 1 user will uncover a good chunk of
the most egregious problems.)
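For anyone curious where that 5-or-6-users rule of thumb comes from, it
falls out of a common problem-discovery model: if each user uncovers a
given problem with probability L, then n users are expected to find
1 - (1 - L)^n of the problems. A quick sketch (L = 0.31 is the
often-cited average, just an assumption here, not something we've
measured for Anaconda):

```python
# Sketch of the usability-testing diminishing-returns model:
# expected fraction of problems found by n users is 1 - (1 - L)^n,
# where L is the chance a single user hits any given problem.
# L = 0.31 is an assumed, commonly cited average -- not measured data.

def problems_found(n_users, L=0.31):
    """Expected fraction of usability problems uncovered by n_users."""
    return 1 - (1 - L) ** n_users

for n in (1, 3, 5, 6, 10):
    print(f"{n:2d} users: {problems_found(n):.0%} of problems found")
```

With L = 0.31, one user already finds about a third of the problems and
five or six find the large majority, so spreading extra participants
across different task sets buys more coverage than repeating the same
set.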