Sunday, March 28, 2010
WeekendTesting EWT11: Giving back is not giving up
A new weekend, new chances, new lessons
This weekend I participated in another session of the European Chapter of Weekend Testing. For more details see EWT11: "To secure the area". The objective this time was testing a financial application with a focus on security.
This time we had a good guest facilitator, Anne-Marie Charrett, who guided us through the mission and this weekend event. Part of the fun of weekend testing is that you get to know different people. See it as a first step. Imagine you participate for the first time; you meet people you have perhaps heard of but never worked with. The next time you participate, you remember those names. Imagine what can happen when you meet them in real life, for instance at a conference. Normally they would be just some of those unknown fellow testers sharing the profession. Now you have something else in common, which might make it easier to find each other. I hope to meet them some time, some day; until then, I will meet them at weekend testing. Perhaps you too?
This time the following persons attended the European weekend testing session:
- Anna Baik (facilitator)
- Markus Gärtner (facilitator)
- Anne-Marie Charrett (facilitator)
- Anuradha
- Ajay Balamurugadas
- Markus Deibel
- Jaswinder Kaur Nagi (just found out that this is "Jassi" :) )
- Maik Nogens
- Thomas Ponnet
- Ravisuriya
- Jeroen Rosink (that's me)
Was it Fun?
As with every weekend, you have to decide: can I participate, will I, and should I? Although I knew I might not be able to attend the whole session, I decided to participate.
This weekend I was not able to make the application work; I didn't succeed in getting past the registration form. Some others did, though, and some faced the same "errors" I did.
Lessons Learned
# Lesson 1: For me it is an error, for the organization/developer it is a warning message
This is a mistake we often make. We see a message shown to inform us what is wrong, and we cannot continue. Based on all kinds of rules, such as business rules, we are not allowed to continue. The message actually tells us what to do: contact the helpdesk. That makes it look like an informational message.
If you look at it with respect to the situation, it might also be an error message, as that message stopped me from continuing. The situation was that I was allowed to use the software: I was provided with a license key I was allowed to use. I was also under time pressure: testing within an hour. Another thing was that I was not supposed to call the helpdesk, so the message was not meant for me.
As you see, a message can be more than just informational. It is also an error on multiple levels:
1. it was shown falsely, as I ought to have a valid license key
2. it bothered me, as I couldn't continue my task
3. the information provided was incorrect, as it told me to contact the helpdesk while I was not supposed to.
# Lesson 2: Security is not only within the application
One of the objectives for this session was to learn more about the security of the application. Although I didn't get far enough to use the application, chances are that the security of my PC (the test environment) blocked me from using the app.
Thinking about it, this made me draw the conclusion that we often focus on the security within the Application Under Test (AUT). During testing we should also consider the security settings of the (test) environment. For some testers this has nothing to do with security and more with authorization. In that case you might consider the impact of the settings of the users' environment on security. Perhaps add checks to your application to verify that the security settings are correct? Or inform the users/application managers under which conditions the application should be used?
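To make that last suggestion a bit more concrete, here is a minimal sketch (my own illustration, not part of the session; the host name and checks are hypothetical): an application could verify a few environment preconditions at start-up and report which one fails, instead of hiding the cause behind a generic message.

```python
# A minimal sketch: check a few hypothetical environment preconditions
# at start-up and report which one fails, instead of only saying
# "contact the helpdesk".
import socket
import tempfile

def registration_server_reachable(host="licensing.example.com", port=443, timeout=3):
    # Hypothetical licensing host; a real application would know its own.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def temp_dir_writable():
    # A writable temp directory as a rough proxy for restrictive local policies.
    try:
        with tempfile.TemporaryFile():
            return True
    except OSError:
        return False

checks = {
    "registration server reachable": registration_server_reachable(),
    "temp directory writable": temp_dir_writable(),
}
for name, ok in checks.items():
    print(("OK   " if ok else "FAIL "), name)
if not all(checks.values()):
    print("The environment may be blocking the application; "
          "review these settings before blaming the app or the user.")
```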
# Lesson 3: Giving back an assignment is not a shame, but it should be done with consideration
There is a thin border between giving back and giving up, and that border should be crossed carefully. Giving back an assignment is the last thing you can or should do, as you are admitting you are not able to do the task. Not doing your task can be taken as not skilled enough, not committed enough, not dedicated enough, or just not enough.
So be careful when giving back; come with arguments showing you did not give up.
I gave back the mission after I found arguments for myself to stop. I didn't give up, as I stayed with the team as long as it was possible for me.
Arguments were:
- I tried several options on my PC to get past the registration form, without success
- I tried to obtain help and other license keys, without success
- I tried to get help from fellow testers; their support didn't work out either
- I checked the time left for the mission: with less time available, even if I managed to get in, testing would make less sense
- I would do better to see what lessons I could learn from this (learning is another mission of weekend testing for me :)) and spend my time on that
- I did not want to spend more of others' valuable time by asking for help.
During the session I formally gave back the mission. I believe it is important to say this instead of keeping silent, so others are aware you stopped. For me the session was not a failure; I learned from it.
# Lesson 4: On every edge there is something to learn
Although I didn't follow the mission as I intended to, I took the time to think about the lessons I could learn. This confirmed for me that you can, should, and must continue learning, and not only by looking at the expected outcome. You might have to take a step back and see what the actual outcome could teach you. If you believe there was nothing to learn, you might need to spend more time. On every corner, edge, and situation there are lessons to learn.
WeekendTesting
For those who also want to be challenged by challenging themselves: take part in one of the weekend testing sessions and teach yourself!
For more information see:
Website: http://weekendtesting.com/
Or follow them on Twitter:
Weekend Testing: http://twitter.com/weekendtesting
European Weekend Testing: http://twitter.com/europetesters
Posted by Jeroen Rosink at 10:04 AM 0 comments
Labels: EWT, Testing in General, Weekend testing
Thursday, March 25, 2010
Model Based Testing using your mind
Model Based Testing seems to be one of the trends currently alive in "the land of the testers".
You find information about this topic more and more often on the internet, at seminars, conferences and webinars, and in blogs and magazines. So it must be interesting, and it must be new, as a lot of people are speaking about it.
I already wrote a bit about this topic earlier, just to share my thoughts: 6th June 2009: Model Based Testing was this new?
This time I was triggered by a posting by Ewald Roodenrijs, First model, than build. He gives the advice to model first and then start building. This seems good advice. Why not use this approach if it helps communication within your team?
A few weeks ago I also attended a session of the European chapter of Weekend Testers: Weekend Testing EWT08: Kept sticking in testing. During this session Michael Bolton also suggested building a model before starting to test. In the context of the weekend testing challenge it was more about building a model in your mind than about formal paperwork.
A week later, during the next session of EWT (European Weekend Testing), I tried modelling first instead of starting to test right away. I made up a model of the system, of how I thought the application would behave, based on the information provided in the documentation. To model it I used the SFDPOT heuristic (Structure, Function, Data, Platform, Operations, and Time):
- http://www.satisfice.com/articles/sfdpo.shtml
- http://www.developsense.com/articles/2005-10-ElementalModels.pdf
For me this worked out well. I learned to think about the software in another way than just functions, test scripts which must pass, etc.
If model based testing means defining models and using them before you start testing, then I was also doing some model based testing. I only skipped a lot of the formal work.
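As an illustration of how informal such a model can stay (a sketch of my own, not something produced during the session; the entries are made-up examples), the SFDPOT notes could be captured as simply as this:

```python
# A sketch: SFDPOT notes captured as plain data. The example entries are made up.
sfdpot_model = {
    "Structure": ["single executable?", "configuration files, plug-ins?"],
    "Function": ["main features as described in the documentation"],
    "Data": ["input formats, boundary values, stored settings"],
    "Platform": ["operating systems, dependencies, test environment"],
    "Operations": ["typical users, typical workflows, possible misuse"],
    "Time": ["time zones, timeouts, concurrency, history of the product"],
}

def open_points(model):
    """Return every note that is still phrased as a question."""
    return [(dimension, note)
            for dimension, notes in model.items()
            for note in notes
            if note.strip().endswith("?")]

for dimension, note in open_points(sfdpot_model):
    print(f"{dimension}: {note}")
```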
The next step would be translating my thoughts to a user, tester, developer, or whoever is interested. Here a decision can be made: will I use formal modelling techniques which fit the modelling tools, or will I use a heuristic model (hey, also a model!)?
I'm sure there are a lot of advantages to a formal model based testing approach. At this moment I personally prefer using common sense and focusing on the right questions, guided by a heuristic approach like the ones linked above, instead of implementing model based testing with a big bang.
A former teacher once told me: "If you can avoid automation, then do it; once you have started there is no way back!" Is this also the case with model based testing?
Another question that just popped up in my mind: why are we spending so much time on understanding and creating models, and not on trying to learn how our own mind works and how it can support us at work? Is this the meaning of learning to learn and continuing with it?
Posted by Jeroen Rosink at 10:13 AM 2 comments
Labels: Ideas, Testing in General, Weekend testing
Wednesday, March 24, 2010
Thinking: You also have different types of testers
The initiation
On the 18th of March 2010 I posted an article related to the way people think called Are you a Sequential or a Spatial tester?
Based on that article, Shrini Kulkarni (Thinking Tester) replied to that posting with some interesting thoughts. He suggests that the words sequential and spatial are chosen incorrectly. Perhaps the sources I used are also mixing up the terms. His suggestion is to use alternative words like "Holistic vs Analytical thinking".
On Twitter
On Twitter we continued exchanging our views, partly triggered by me:
http://twitter.com/JeroenRo/status/10766445585
@shrinik Thanks for inspiring me with your thoughts related to sequential/spatial would you start with getting definitions right or picture?
http://twitter.com/shrinik/status/10767188470
@JeroenRo To get you started: think of "simultaneous/parallel" as complementary to "sequential" as opposed to "spatial". (1/2)
http://twitter.com/shrinik/status/10767230383
@JeroenRo I wonder what would be complementary or opposite to "Spatial". I hope "analytical" is not the right one (2/2)
http://twitter.com/JeroenRo/status/10767725525
@shrinik I don't think analytical is complimentary/opposite to spatial.Analytical might be an opposite characteristic of spatial thinking1/2
http://twitter.com/JeroenRo/status/10767773838
@shrinik in different professions we use the same words with different meanings, it can be used as part of another (2/2)
http://twitter.com/shrinik/status/10767843209
@JeroenRo This whole left-right brain thinking theory (left: looks at parts, right: Looks at wholes) - is a heuristic.
http://twitter.com/shrinik/status/10767881592
@JeroenRo If spatial = holistic then 'analytic; would be a right opposite. This fits well with left brain-right brain - model (heuristic)
Further investigation on terms
Based on the "discussion" (or is this nowadays called tweetcussion?) I had the feeling that we were talking about the same, only the terms are used differently to place the meaning in a different perspective. To understand more I did a short research on analytic vs holistic thinking.
This led me to the two modes of thought described by Ulric Neisser, 1963.
Here the reference is made that where others use the term analytical, U. Neisser uses the term sequential. It seems that several definitions are in use.
Another article was tweeted to me (sorry, I couldn't find the person who guided me towards it): GSU Master Teacher Program: On Learning Styles. This article briefly explains the Myers-Briggs Type Indicator (MBTI) from the Myers-Briggs Foundation.
Types:
Extraversion (E) versus Introversion (I)
Sensing (S) versus Intuition (N)
Thinking (T) versus Feeling (F)
Judging (J) versus Perceptive (P)
Based on these differences they classified "The 16 personality types". One important phrase on the Myers-Briggs Type Indicator (MBTI) page is:
"All types are equal: The goal of knowing about personality type is to understand and appreciate differences between people. As all types are equal, there is no best type."
Other thoughts and resources
Another posting related to differences in thinking is written by Markus Gärtner, called Turn off the beamer. He also suggested that I read James Bach's book Secrets of a Buccaneer-Scholar. As far as I understand, it supports a different approach than schools provide now. (I still need to obtain this book real soon!)
If people think differently, then this has to affect the way of testing. This should result in different types of testers. The portal for testers called Software Testing Club provides a free e-book related to this, Tester Types: The E-Book, in which different types are briefly explained.
My intentions
With my previous posting I tried to start thinking, and to make others think, about how our hemispheres influence our way of working. We should not immediately judge people on their view of testing. Some people tend to think in terms of issues/defects, others in terms of value to deliver. We have testers who think in test processes and testers who build their thoughts on images of processes of which testing is just a part.
As the Myers-Briggs Foundation claims: there are different types of personalities and they are all equal. If different types exist and we accept that, perhaps we should also spend more time understanding our fellow testers. Try to learn from other visions which might not be yours from the start. We can learn from each other. It can result in accepting other ideas or in providing arguments for your own ideas.
Thinking in test scripts is not always good. Creating a structured test process should not always be the goal, nor is finding issues always the primary goal.
I suggest:
- Try to understand your way of thinking;
- Try to learn from others by finding out how they think and see the world;
- Try to see if your ideas are accepted by others when you consider these different kinds of thinking;
- Learn from each other not in terms of what is said, but of what is thought (see the deeper meaning :));
- Try to look beyond the defined borders of behaviour
Back to the initial posting
Referring back to my initial posting, Are you a Sequential or a Spatial tester?, I might have mixed up the words. Looking back, I used the words that are used on those sites, which seem to me a good resource to support the idea that people think differently, in combination with what is taught at our schools. I believe the chance of knowing testers who think more spatially/visually/intuitively/holistically is larger than the opposite.
In my experience, using familiar words also makes us place the definitions and terms in a certain perspective, and we avoid thinking about them in more detail. Perhaps that is why using terms like spatial/visual is sometimes better.
You might check the mentioned sites again and see the strengths of and differences between the ways of thinking. You will notice that certain sequential strengths are good for testers, and sometimes the visual strengths are. This means that people are complex, and so are testers. We cannot capture our way of working, our profession, in models like ISTQB, TMap, Model Based Testing, Test Driven Development or whatever. Let us also approach those models with the personality we have, and stop claiming that this or that model is the best cure for everything. People are valuable and should be supported to deliver value. In my opinion this cannot be done by teaching them models and forcing them to work accordingly. They should be supported to understand models and approaches and to make them fit their way of thinking. As part of a team, testers should also try to understand how models can be used by others. To start with this, try to find out how others think, without judging.
I know there is much more to say about this. The different types of thinking and personality made me spend some time thinking about it. I hope you, reader, are also triggered to think a bit further and build your own thoughts.
Posted by Jeroen Rosink at 6:02 AM 2 comments
Labels: Ideas, Testing in General
Thursday, March 18, 2010
Are you a Sequential or a Spatial tester?
Introduction to a new way of thinking
A few months ago I was introduced to the concept of spatial thinking. I was never aware of this kind of difference in thinking. Sometimes you hear that there are people who think mainly with their left hemisphere and others who think mainly with their right hemisphere. (Yes, this is an open door, so let me state it before I read it in the comments: sometimes you also meet people who don't think, or who start to avoid thinking.)
Sometimes you hear people claim that testers think differently, that they need a different mindset. I believe it is not a problem to think differently, as long as you are open and aware that the human brain can work differently for each person. This shapes their vision of and awareness about testing.
Differences in thinking
The difference in thinking is explained on Gifted Development Center: "The left hemisphere is sequential, analytical, and time-oriented. The right hemisphere perceives the whole, synthesizes, and apprehends movement in space. We only have two hemispheres, and we are doing an excellent job teaching one of them."
If this is true, then it is probable that our teachers, instructors and coaches also taught us to approach our testing knowledge only in a sequential, analytical and time-oriented way.
From: Visual-Spatial Thinking
Spatial and sequential thinking are two different mental organisations that affect the way people view the world. Sequential thinking is step by step linear thinking over time, while spatial thinking is an holistic system where all knowledge is interconnected in space.
Strengths and weaknesses of spatial thinking
To understand the possibilities of spatial thinking in testing, you might need to see what the strengths and weaknesses of spatial thinking are. Below you will find a short listing of possible strengths and weaknesses. Keep in mind that no person is a visual-spatial thinker for the full 100%.
From Strengths & Weaknesses of Gifted Visual-Spatial Learners Source: Dr Linda Kreger Silverman Ph.D. Director of the Gifted Development Center Denver USA
Differences in Sequential and Spatial thinking
A comparison between the two is explained in Auditory-Sequential vs Visual-Spatial thinking: http://www.gifteddevelopment.com/Visual_Spatial_Learner/vsl.htm (also see the image below). When looking at it, you might be forced to speak out your preferred choice. You might even suggest that auditory-sequential thinkers are more appropriate as testers because they are stronger in:
- Has auditory strengths
- Is an analytical thinker
- Has good auditory short-term memory
On the site Research on the Visual-Spatial Learner, research by Dr. Silverman is published providing the following figures (I'm not in a position to check these figures and take them as proven results):
We have high confidence (over 80%) that:
- At least one-third are strongly visual-spatial
- One-fifth are strongly auditory-sequential
- The remainder are a balance of both learning styles
Of that remainder (who are not strongly visual-spatial nor strongly auditory-sequential):
- Another 30% show a slight preference for the visual-spatial learning style
- Another 15% show a slight preference for the auditory-sequential learning style
This leads to the conclusion: "This means that more than 60% of the students in a regular classroom learn best with visual-spatial presentations and the rest learn best with auditory-sequential methods."
Sequential or Spatial Testers
Looking at the differences, strengths and weaknesses, I have the feeling that the way people think has an impact on their approach and skills in testing. A person is never fully a left or right thinker. Instead of seeking the disadvantages of a way of thinking, we might accept it and see how it can strengthen a test process. If you accept this, you might take the next step: how to use those skills. Before you can do this, you have to identify the type of thinker/tester.
Imagine the possibilities if you are aware of the way people think when communicating. How about test scripts/results, requirements? Which tester would be better at preparing an overview, and who has strengths in detailed testing?
I see some opportunities in this topic and will take some time to see where this way of thinking can lead me.
For more information you also might check:
How to spot a spatial by Betty Maxwell, M.A., which aims to "provide some clues to help you recognize those picture thinkers lurking in your environment."
The power of visual thinking by Lesley K Sword, Director, Gifted and Creative Services Australia 2005
Posted by Jeroen Rosink at 3:01 PM 4 comments
Labels: Ideas, Test Methods
Sunday, March 14, 2010
Weekend testing EWT09: Add value to a mission
Another session of EWT
This weekend I again participated in a weekend testing session, EWT09: The Imperial Strikes Back. You might say that you have something better to do during the weekend. Sure, we all have. To some people it might look like just testing another application in your own free time. Others might see it as a waste of time. There are also people who don't understand it. And fortunately there are also people who share the same interest.
What value EWT has for me
So far I have attended three sessions, and each time I'm having fun and learning. As the title tells you, it takes place in the weekend, and sometimes it doesn't fit within your daily schedule. This time I tried to make it fit. Next to getting to know more people, I also use these sessions to learn more about how they approach testing, how I approach testing, and what I can learn from that.
In the last session, Michael Bolton pointed me towards building a model before you start testing. Besides the mission of this EWT session, this was also a personal mission for me.
The initial mission
Before the start, the following mission was stated:
"You are moving from lovely Europe with measurements based on the metrics system to the US with imperial units. Test Converber v2.2.1 (http://www.xyntec.com/converber.htm) for usability in all the situations you may face. Report back test scenarios for usability testing"
Looking at this mission, it was about thinking up scenarios for testing usability. Basically I could fulfil the mission just by writing down scenarios. I have to admit, I didn't add scenarios to the bug repository. I failed there. Was this session of less value for me?
My lessons learned
Of course not. I chose my own way in this EWT and used the mission as a set of boundaries. Boundaries should also be judged. I chose not to write down scenarios up front; I have done that in the past. I chose to build myself a model, define questions, and think about what usability means to me.
I know that we often think we are testing usability while we are actually checking functionality, asking "Is the currency converted correctly?" instead of questions like "Is the output readable for me?"
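To make that difference concrete, here is a small sketch of my own (it assumes a hypothetical metres_to_feet() helper, not Converber's real interface): the functional question can be automated as a check, the usability question cannot.

```python
# A sketch with a hypothetical conversion helper: the functional question
# can be automated, the usability question cannot.
def metres_to_feet(metres):
    return metres * 3.2808399  # 1 metre is roughly 3.2808399 feet

def test_conversion_is_correct():
    assert abs(metres_to_feet(1.0) - 3.2808399) < 1e-6
    assert abs(metres_to_feet(2.0) - 6.5616798) < 1e-6

test_conversion_is_correct()
print("Functional check passed.")
# "Is the output readable for me?" is a different kind of question:
# number of decimals shown, visibility of the unit label, whether the
# value fits the field. No assert can answer that for the user.
```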
Based on the questions, I used the application to give me some guidelines for the type of tour I would follow. This resulted in unwritten scenarios.
The questions I started with:
- Is the documentation helpful and readable?
- Is the menu clear, understandable and intuitive?
- Will the application fit the screen?
- Does it deal with international settings?
- Can you get lost in this tool?
- Will it give value to me?
As mentioned, I also made up my own mission. I believe I am allowed to, because it is my free time and I still kept to the original mission: define scenarios. For me, the questions mentioned in the discussion were more or less the scenarios. Another way of writing down scenarios were the bugs I added to the repository; based on the type of issues, you might get a picture of the route I followed through the system.
Add value to a mission
Perhaps one of the main thoughts here is that you have to judge for yourself whether you are skilled enough to fulfil a mission. If not, perhaps you can define your own goals to gain those skills. If you are able to add value to a mission and you are able to explain this, you should also be able to add value for a customer. I can say that I learned from this session again.
Posted by Jeroen Rosink at 9:50 AM 0 comments
Labels: EWT, Weekend testing
Sunday, March 7, 2010
Weekend testing EWT08: Kept sticking in testing
The start of EWT08
This weekend I managed to attend another session, facilitated by Anna and Markus on behalf of the European chapter of Weekend Testing. The application to test this time was Wiki on a Stick. Part of the mission was using SFDPOT.
For more information see:
http://www.satisfice.com/articles/sfdpo.shtml
http://www.developsense.com/articles/2005-10-ElementalModels.pdf
Also check out the page EWT08: Wiki in your pocket for a transcript of the session.
That leaves me just another bit of advertisement: pointing you towards the weblogs of the testers who attended:
Anna Baik
Ajay Balamurugadas
Michael Bolton
Tony Bruce
Anne-Marie Charrett
Markus Gärtner
The Mission
For me this session was another great one. Up front you never know what you can learn. This time I tried to be better prepared: have the bug repository running, the screen capture tool in place, the connection to my virtual machine running Debian up, and information about heuristics ready; I even read a bit about them again. I tried to prepare which questions I could ask with respect to SFDPOT.
My preparation
Like the other sessions I attended online, it started with an introduction and testing in the first hour and a wrap-up in the second hour. This time the preparation saved me some time. Unfortunately, and that is how life goes, the start was different than expected. I expected a tool I had to install; instead it was a sort of webpage which just had to be stored somewhere and could be used immediately. I was a bit distracted by this and dropped the idea of testing on two platforms.
It took me some time to figure out what type and kind of application it was, while "touring" around the application to get a quick view of what the edges of the application were and what would be interesting to investigate further.
Retrospective
Afterwards I noticed it was hard to stick to an approach based on the SFDPOT heuristic. I noted that next time I should start by defining the questions I want information about. For me this is a different way of thinking than I am used to, such as an approach conforming to TMap or ISTQB, where most of the time you start with a well defined process. Keep in mind that this will not work in EWT, as the time is limited to an hour.
I believe that by defining questions you can explain a bit about coverage. In this case the explanation is more like picturing/telling what you have covered and what not, in terms of the questions, instead of how many cases you executed and what that coverage would be.
I explained to the group that it was hard to stick to the initial idea and that one of the pitfalls is starting to test immediately. Michael explained that it is better to first build a model of the system. I like that idea. You can build a model based on the documentation. I think reading the documentation with respect to the heuristic model can help you define your questions.
Value of documentation and model
Building an initial model based on documentation is, in my opinion, a good way to start. It looks similar to other methods: review intensively, define all objects, prepare test cases, and then start testing.
Of course, everything is more or less based on the Deming cycle: Plan-Do-Check-Act. Only, a lot of approaches are made so robust and formal that you lose time on documentation and strive only for perfection, instead of aiming for a good mix of results within the given conditions.
The goal of documentation is mainly to provide information. Based on this information, developers and testers define their approach for building and testing. If the information is weak, the chance of mistakes is bigger. One option is to formalize a process of reviewing and adjusting the documentation down to the details, to narrow the perception and vision of the people who have to build and test it.
Starting with reviewing is an approach that might work when time is unlimited. A mistake can be to collect all information, judge it, derive test cases from it, and sell an advice based on the test results.
How about first building a draft model of the system and the environment? You can document this, make a rough picture of it, or just keep it in your mind. Based on this you define the boundaries of where to look. If information is missing, you ask the owners/stakeholders. Keep in mind the time you have. For example, during EWT08 the program manager was also online, but we had only one hour available, and instead of claiming and demanding information we tried to find our own way. We could have asked a lot of questions, which would have extended beyond the hour of testing.
This would be another way of receiving information about the system.
Information about the system
It seems that information about the system under test is important; the time window is also a leading factor, in combination with the quality of the information. It turned out that the available information for this application was harder to get. Some testers paid thorough attention to that and made remarks about the documentation.
Looking back it turns out that information was available in several ways:
- online manual
- in the tool itself (written information)
- the tool itself (functionality)
- people online (the project manager)
- people online (fellow testers)
- bug repository
- information regarding heuristics like SFDPOT
The information needed differed per person. My approach was focused more on functionality, and where I found something interesting I tried to find the information within the tool itself. As already mentioned, this approach distracted me.
A lesson learned
Lesson learned here: create a model of the system and see what kind of borders you find. Check this with respect to the environment. Next time I would explicitly ask myself questions like "How will it work when used from a USB stick?". I would build a framework of initial questions using SFDPOT, refine the questions during the tour, and afterwards validate whether the answers are enough to make a statement. That statement can also be that too little information was gained.
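As a sketch of what such a framework of questions could look like (my own illustration, not something we used in the session; the questions and answers are made-up examples): keep the questions per SFDPOT dimension, mark them answered during the tour, and afterwards the open ones show where the information was too thin to make a statement.

```python
# A sketch: a question framework per SFDPOT dimension, refined during the
# tour and reviewed afterwards. Questions and answers are invented examples.
class QuestionFramework:
    def __init__(self):
        self.answers = {}  # (dimension, question) -> answer text or None

    def ask(self, dimension, question):
        self.answers.setdefault((dimension, question), None)

    def answer(self, dimension, question, text):
        self.answers[(dimension, question)] = text

    def report(self):
        open_items = [key for key, value in self.answers.items() if value is None]
        answered = len(self.answers) - len(open_items)
        print(f"Answered {answered} of {len(self.answers)} questions.")
        for dimension, question in open_items:
            print(f"Still open - {dimension}: {question}")

framework = QuestionFramework()
framework.ask("Platform", "How does it behave when run from a USB stick?")
framework.ask("Structure", "Is everything contained in one HTML file?")
framework.answer("Structure", "Is everything contained in one HTML file?",
                 "Content, style and code appear to live in a single file.")
framework.report()
```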
I believe we testers should be careful in providing judgements about systems. When do you know you have spent enough time and gained enough information? The challenge is that we are asked to provide objective advice, although the information is mostly interpreted through subjective experience.
Posted by Jeroen Rosink at 9:40 AM 1 comments
Labels: EWT, Test Methods