Showing posts with label WTANZ. Show all posts

Monday, May 10, 2010

Rorschach, the power of visualization and software testing?

Introduction
I blogged about my experience in Weekend Testing, where I used Astra Site Manager to create a site map: WTANZ02: Same Language, different sites and places. In that post Shrini Kulkarni challenged me to expand on how to use this as a test strategy.

When you look at the images posted there, you might notice that the images look a bit like spots/stains.

Rorschach test
Thinking about spots/stains and deriving information from them immediately reminds me of the Rorschach test.


From Wikipedia: Rorschach test: "(also known as the Rorschach inkblot test or simply the Inkblot test) is a psychological test in which subjects' perceptions of inkblots are recorded and then analyzed using psychological interpretation, complex scientifically derived algorithms, or both."


Below you see an example of a Rorschach image. Are you able to read this picture? Are you able to assign functionality to areas? Do you see bugs?

Image saved from Wikipedia: http://en.wikipedia.org/wiki/File:Rorschach_blot_01.jpg

Based primarily on their perception of these spots, subjects are asked what they experience, how, and why. What does the spot tell you?

Testing spots

Below you see the two images I obtained from "testing" the two websites, as stated in the challenge from WTANZ02: Same Language, different sites and places.


Just tell me: what do you see?

Image 1


Image 2

Depending on how you look at the images, you might identify some shapes. Perhaps you only see dots or animals. Perhaps you see bugs.


The strategy
Defining a strategy is a challenge in itself. Writing about it and sharing your idea is even more of a challenge. Writing about it and trying to come up with a heuristic is more challenging still for me, as this is quite new to me. So bear with me, support me, and make me teach you, as I can also learn from you.


First steps
I suggest first defining the approach based on patterns. Ask what the image itself can tell you and what information you need to define the approach.


Imaging: Create a map of the website/functionality to define a certain landscape.
Defocus: Don't approach the image as a system; approach it as a painting. Approach it differently: what else do you see? Use your imagination.
Interpret: Can you tell a story about what you see (colours, lines, drawings, etc.) and argue for it?
Density: Is there a structure that represents the first impression you had?
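The "Imaging" step can be sketched in code. The snippet below is a minimal illustration of the idea, not the tool I used; the page names and links are made up. It turns a set of crawled links into an adjacency list and prints it as a small landscape to interpret:

```python
from collections import defaultdict

# Hypothetical crawl results: (from_page, to_page) link pairs.
# In practice these would come from a crawler or a site-mapping tool.
links = [
    ("home", "about"),
    ("home", "data"),
    ("data", "education"),
    ("data", "health"),
    ("about", "contact"),
]

def build_map(links):
    """Turn link pairs into an adjacency list: page -> linked pages."""
    site_map = defaultdict(list)
    for src, dst in links:
        site_map[src].append(dst)
    return site_map

def print_landscape(site_map, root, indent=0):
    """Print the map as an indented tree: our 'inkblot' to look at."""
    print("  " * indent + root)
    for child in site_map.get(root, []):
        print_landscape(site_map, child, indent + 1)

site_map = build_map(links)
print_landscape(site_map, "home")
```

Even this tiny landscape invites the "Defocus" and "Interpret" steps: does the indentation suggest depth, clutter, or empty areas?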


Next steps
Once you have an overview of what the system could look like, you can play with the following components.


Complexity: Is there some kind of structure? Are there lots of nodes, and are you distracted by them?
Number of objects: Are there so many objects visible that you cannot zoom in without missing details?
Environments: Can the map also be used to identify other systems/secure areas?
Risk areas: Can you point out areas of risk in the map based on "important" functionality?
Process: Is there an order in the structure which might also support a process?
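Some of these components can be made concrete with simple numbers. The sketch below is my own illustration (the site map and page names are invented): it derives rough "complexity" indicators from an adjacency map, such as node count, link count, the busiest page, and the click depth:

```python
from collections import deque

# Hypothetical site map: page -> pages it links to.
site_map = {
    "home": ["about", "data", "search"],
    "data": ["education", "health", "search"],
    "about": ["contact"],
    "search": [],
    "education": [],
    "health": [],
    "contact": [],
}

def complexity_metrics(site_map):
    """Rough indicators: how many nodes/links, and the busiest page."""
    nodes = set(site_map)
    for targets in site_map.values():
        nodes.update(targets)
    link_count = sum(len(t) for t in site_map.values())
    busiest = max(site_map, key=lambda p: len(site_map[p]))
    return len(nodes), link_count, busiest

def depth(site_map, root):
    """Breadth-first depth: clicks needed to reach the deepest page."""
    seen, queue, deepest = {root}, deque([(root, 0)]), 0
    while queue:
        page, d = queue.popleft()
        deepest = max(deepest, d)
        for nxt in site_map.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return deepest

print(complexity_metrics(site_map))  # → (7, 7, 'home')
print(depth(site_map, "home"))       # → 2
```

A high fan-out page or a surprisingly deep branch would be a candidate "risk area" to explore first.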


Other steps
Looking back at the previous actions, I hope I have provided some additional ideas on how images of a website structure can support defining a test approach. I believe that by looking at images or structures in a different way, you may come up with other concepts and thoughts that support your test approach. The next step could be adapting the newly gained view into your test process. Based on this information you can define alternative test cases, or perhaps a product risk analysis.

It might help to bring some creativity back into testing.

Monday, May 3, 2010

WTANZ02: Same language, different sites and places

Weekend testing on the other side of the world

At least it is for me, and there were some benefits. The Weekend Testing chapter in Australia and New Zealand (WTANZ) had its second session. As it was raining and still early in the Netherlands (just 8 PM), I asked to participate. Since I was almost an hour late, I had less time available to work on the mission as provided.

Here's our mission: exploratory testing of how easy it is to get data in different formats about education in the United States and the United Kingdom from http://data.gov/ and http://data.gov.uk/.

The Participants were:
Marlena Compton (facilitator)
Ajay Balamurugadas
Allmas Mullah
Dhara Sapu
Oliver Erlewein
Jaswinder Kaur Nagi (aka Jassi)
Keis

Approach:
As mentioned, I joined too late, so I set myself another challenge: instead of following the mission, could I gather enough information to be able to start next time? As in normal life, you are faced with situations where an approach has to be defined with little time and information available.

As I understood from the discussion and debriefing, the websites ought to be similar, and with a similar objective. To understand more about the sites, I decided to find out about their objectives and compare them. I also visually checked the structure of each site based on the menu items.
Next to that, the tone of voice was important for me to learn more about the audience.

While checking, I scrolled a bit through the menus and decided to use an old tool called Astra Site Manager, which was developed by Mercury (now HP). Although this tool is not flawless, it provided the information I was looking for: how complex is the site?
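Astra Site Manager did the crawling for me, but the core idea, extracting the links from each page to build a map, can be sketched with the Python standard library alone. This is my own illustration, not how the tool works, and the HTML fragment is made up:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Made-up page fragment standing in for a fetched page.
page = '<p><a href="/education">Education</a> <a href="/health">Health</a></p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['/education', '/health']
```

Repeating this per page, and following each collected link, is essentially what a site-mapping tool does.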


Some Results
Website map of http://www.data.gov/ created with Astra Site Manager






Website map of http://data.gov.uk/ created with Astra Site Manager




If you compare the images, you will notice some differences in structure. I think a map like this is usable to identify and pinpoint areas where risk can be found. If an area contains some risk, you might come up with other exploring questions, such as: "if user data is used, how does it flow through other screens?"

As a result of running this tool I came up with some unreliable metrics, like the number of URLs:
the UK site counted over 5961 URLs and the US site over 4903 URLs.

If I combine these numbers with the goal of the sites, sharing information with the public, then I question: how will the public be able to find valuable information if the amount exceeds their expectations? How will the public be able to find the right information? The chance of finding some information is high due to the high number of links; the chance that it is the correct information depends on how the search engine works. When will the result be the best and most reliable result?

Looking at the technology: the US site just uses icons for Facebook and Twitter, while the UK site explains what they do. Does this mean that the audience is different?

What I also noticed when running the tool is the variety of file formats that can be downloaded: .xls, .csv, .pdf, .txt and .xml. There are also no naming conventions in the documents, nor in the web pages and directories.
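The spread of file formats can be checked mechanically. The sketch below is illustrative only; the URLs are invented, not taken from either site. It tallies the extensions of downloadable links using only the standard library:

```python
from collections import Counter
from urllib.parse import urlparse
import os

# Hypothetical download links found while crawling.
urls = [
    "http://example.gov/data/schools.xls",
    "http://example.gov/data/Schools2009.csv",
    "http://example.gov/docs/report.pdf",
    "http://example.gov/docs/readme.txt",
    "http://example.gov/data/schools.xml",
    "http://example.gov/data/SCHOOLS_FINAL.csv",
]

def extension_tally(urls):
    """Count file extensions (lower-cased) to see the format spread."""
    counts = Counter()
    for url in urls:
        path = urlparse(url).path            # strip scheme/host/query
        ext = os.path.splitext(path)[1].lower()
        if ext:
            counts[ext] += 1
    return counts

print(extension_tally(urls))
```

A wide tally, or wildly inconsistent file names like the ones above, is itself a small signal about the site's quality.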

The discussion
The round-up was interesting: everyone shared their experience and wondered whether they had met the mission. Some found their way using Google for information; others came up with a well-articulated approach. I learned from this session as well, and I hope others did too.

Lessons learned
- Comparing different websites: decide which will be your "oracle" and why
- Tone of voice differs and tells you something about the expected audience
- Question the value of information when it is offered in huge quantities, and what the chance is that the right information will be found
- Creating a map can be useful to pinpoint risk areas and pinpoint value for the users
- The use of file names and their consistency can tell you something about the quality of the site, or at least the chance of errors
- A huge number of web pages might result in a higher chance of failure; why are these kinds of websites this huge?

For more information see:
Website: http://weekendtesting.com/ or follow them
on Twitter Weekend Testing: http://twitter.com/weekendtesting