Saturday, June 25, 2011

A day at Test Automation Day 2011

On June 23rd 2011 I had the opportunity to attend Test Automation Day 2011, since Squerist, the company I work for, was a founding partner of the event. I started my career in automated testing, and although I now think more about how test processes can add value to an organization, I still have a weak spot for automated testing. This made me curious about what to expect.

I could only attend in the afternoon due to other obligations, so for the morning I have to rely on the experiences of other people. I entered the exhibition area just before lunch and watched the crowd entering the room. Normally I recognize some people from past conferences; this time I saw new faces with one thing in common: passion for their profession.
I talked with a few of them, and they had heard new and interesting things about how certain tools might fit into their business. They had already gotten some value back for attending. Others told more or less the same story. Isn't it always this way?

I attended 3 keynotes: Martin Gijssen (De kracht van open source testtools — The power of open source test tools), Mark Fewster (Experience Driven Test Automation) and Scott Barber (Automating Performance Tests: Tips to Maximize Value and Minimize Effort).

Somehow I missed another presentation in between; instead I had a great talk with other attendees.

What I remember about Martin's presentation is that he was able to speak about open source tools without passing judgment on what we should do. In general it depends: sometimes open source tools are a good fit, sometimes not. Personally I had expected more information about how and when to use them, and about the pitfalls, instead of a focus on a framework. Then again, some introduction was needed, and when time is limited you cannot tell the whole story. What stuck with me was the importance of a framework and a test architecture, and that keyword-driven testing is still being kept alive.

About Mark Fewster: I like this guy. If you sit far back in the room and don't have the full picture, he bears some resemblance to David Letterman, though only in his best days. (I hope he is not offended by this; I enjoyed listening to and watching him.) With passion and wit he stood there, speaking convincingly about the experiences he has collected in test automation. Based on these he shared his view of the future with us. One of the things in that future was the maturing of tools that support test techniques. I think he might be right: more tools will be coming than ever before.
I think another road into the future is the need for tools to focus on the human aspects.

So I think we have 2 paths here: one path of tools that will support more techniques and move towards the developer who builds the scripts for testing, and another of tools that will support the "key users" in expressing what they want to do, or did, with their testing.
What the future will bring, we will never know. At least he made me look forward to his book, in which he has collected the experiences of others.

The last keynote was given by Scott Barber; I had never seen him before. Wow, he is an artist in how he sells his story. He had me all the way. Although the talk was mainly focused on performance testing, he sold me on the idea that it is quite different from functional testing. More complex. And perhaps equally fun :-)
Along the way he gave us 10 tips for free to focus on while setting up your performance test. I think they are also generally applicable to all the tests we perform.

Top 10 performance test tips by Scott Barber on Twitter:
10. Data design
9. Variance
8. Object orientation
7. Iteration/agile
6. Error detection
5. Human validation
4. Model production
3. Reverse validation
2. Tool-driven design
1. Value first

For details about his tips, just contact him. I'm still looking forward to receiving his presentation, as there was too much information to write down. And he has quite a story to tell. (Afterwards I spoke with him, and he is willing to tell you about his experiences.)

Some of the eye-openers/lessons I remember:
1. Performance testing is also for developers; let them help you.
2. Provide information about performance on a continuous basis, not only as a big bang.
3. Think about release management, test management and infrastructure before you start measuring performance.
4. In performance testing too, numbers by themselves don't say much. Perhaps start presenting your measurements with happy faces and sad faces.
Although I was never directly involved in performance testing, in the past I indirectly picked up some of these lessons and was involved with some performance issues as well.


I think this was a good conference; for a first edition, the organizers can be proud of it. A lot of information was shared, and good presentations (and some less good ones) were given. What was missing was a short break after the 2 keynotes and between the "business cases"; as a result there was less room to share experiences about the presentations. As at all conferences, you hear some old wine in new bottles, or whatever the saying is. I strongly believe in the skill of learning from the information you get and reading between the lines.
I had some good fun that day and was pleasantly surprised. It triggered my thoughts and extended some of my perceptions, views and ideas.
