Wednesday, April 30, 2014
TaaS blog post featured on ThoughtWorks Insights
My presentation on "Automate Your Tests across Platform, OS, Technologies with TaaS" is featured on the ThoughtWorks Insights page.
Monday, April 21, 2014
What are the criteria for determining success for Test Automation?
Test Automation is often thought of as a silver bullet that will solve the team's testing problems. As a result, there is a heavy investment of time, money, and people in building an automated suite of tests of different types, in the hope that it will solve all problems.
Is that really the case? We know the theory of what is required to make Test Automation successful for a team. I want to know, from a practical perspective and with context, what worked (or did not work) for you.
I am currently writing about "Building an Enterprise-class Test Automation Framework", and am gathering experience reports based on what others in the industry have seen.
I am looking for people to share stories from their current / past experiences of full or limited success with Test Automation, answering the questions below (at a minimum):
Context:
- What is the product under test like? (small / med / large / enterprise) (web / desktop / mobile / etc.)
- How long is the Test Automation framework envisioned to be used? (a few months, a year or two, more than a few years, etc.)
- What is the team (complete and test automation) size?
- Is the testing team co-located or distributed?
- What are the tools / technologies used for testing?
- Are the skills and capabilities uniform for the team members?
- Is the domain a factor in determining the success criteria?
Framework related:
- What are the factors determining the success / failure of Test Automation implementation?
- What worked for you?
- What did not work as well?
- What could have been different in the above to make Test Automation a success story?
- What are the enablers in your opinion to make Test Automation successful?
- What are the blockers / anchors in your opinion that prevented Test Automation from being successful?
- Does it matter whether the team follows a Waterfall methodology or an Agile methodology?
Thursday, April 10, 2014
Sample test automation framework using cucumber-jvm
I wanted to learn and experiment with cucumber-jvm. My approach was to think of a real **complex scenario that needs to be automated, and then build a cucumber-jvm-based framework to achieve the following goals:
- Learn how cucumber-jvm works
- Create a bare-bone framework with all basic requirements that can be reused
So, without further ado, I introduce to you the cucumber-jvm-sample Test Automation Framework, hosted on github.
The following functionality is implemented in this framework:
- Tests specified using cucumber-jvm
- Build tool: Gradle
- Programming language: Groovy (for Gradle) and Java
- Test Data Management: samples using data specified in feature files, AND using data from separate JSON files
- Browser automation: Using WebDriver for browser interaction
- Web Service automation: Using cxf library to generate client code from web service WSDL files, and invoke methods on the same
- Take screenshots on demand and save on disk
- Integrated cucumber-reports to get 'pretty' and 'meaningful' reports from test execution
- Using the Apache logger to store test logs in files (and also report to the console)
- Using AspectJ bytecode injection to automatically log the test trace to a file. Also creates a separate benchmarks file to track the time taken by each method. This information can be analyzed separately in other tools like Excel to identify patterns of test execution.
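To illustrate the "data specified in feature files" approach from the list above, here is a minimal, hypothetical scenario sketch in Gherkin - the feature name, step wording, and example data are my own illustrative assumptions, not the actual contents of the cucumber-jvm-sample repository:

```gherkin
Feature: Google search

  # Data-driven scenario: the test data lives in the Examples table
  # of the feature file itself, one of the two data-management styles
  # the framework supports (the other being separate JSON files).
  Scenario Outline: Search returns results for a query
    Given I am on the Google search page
    When I search for "<query>"
    Then the results page title should contain "<query>"

    Examples:
      | query     |
      | cucumber  |
      | webdriver |
```

Each row of the Examples table runs the scenario once, so adding a test case is a one-line change to the feature file rather than a code change.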
Feel free to fork and use this framework on your projects. If there are any other features you think are important to have in a Test Automation Framework, let me know. Even better would be to submit pull requests with those changes, which I will take a look at and accept if it makes sense.
** Pun intended :) The complex test I am talking about is a simple search using google search.
Tuesday, April 8, 2014
Disruptive Testing - An interview with Matt Heusser
Read an interview with Matt Heusser to learn what he thinks about Lean Software Testing, Scaling Agile, Testing Metrics, and other thought-provoking questions.
Friday, March 28, 2014
WAAT Java v1.5.1 released today
After a long time, and with a lot of encouragement from collaborators and users of WAAT, I have finally updated WAAT (Java) and made a new release today.
You can get this new version - v1.5.1 directly from the project's dist directory.
Once I get some feedback, I will also update WAAT-ruby with these changes.
Here is the list of changes in WAAT_v1.5.1:
Changes in v1.5.1
- Engine.isExpectedTagPresentInActualTagList in the Engine class is now public
- Updated Engine to work without creating a testData.xml file, by directly sending the expectedSectionList for tags
- Added a new method: Engine.verifyWebAnalyticsData(String actionName, ArrayList expectedSectionList, String[] urlPatterns, int minimumNumberOfPackets)
- Added an empty constructor to Section.java to prevent a marshalling error
- Support for fragmented packets
- Updated Engine to support pattern comparison, instead of String contains
Thanks.
Thursday, February 27, 2014
Automate across Platform, OS, Technologies with TaaS
[Updated - link to slides, audio, experience report added]
The talk at Agile India 2014 went really well. A few things happened:
- My talk was the 2nd-last talk on a Saturday. My hopes of having a decent-sized audience were low. But I was very pleasantly surprised to see the room almost full.
- Usually in conferences I have spoken at, the ratio of technical / hands-on people vs. leads / managers is around 20:80. In this case, that ratio was almost inverted: about 70-80% of the audience were technical / hands-on people.
- Due to the more technical audience, there were great questions all along the way - which resulted in me not being able to finish on time ... I actually went over by 5-6 minutes, and even then had to really rush through the last few sections of my presentation and was not able to do a complete demo.
- Almost everyone was able to relate to the challenges of integration test automation, the concept and the problem TaaS solves - which was a great validation for me!
- Unfortunately I had to rush to the airport immediately after the talk, which prevented me from networking and talking more specifics with folks after the talk. Hopefully some of them will be contacting me to know more about TaaS!
The slides from the talk are available here on slideshare. You can download the audio recording of the talk from here. The pictures and video should be available from the Agile India 2014 site soon. I will update the links when they become available.
--------
After what seems to be a long time, I am speaking in Agile India 2014 in Bangalore on "Automate across Platform, OS, Technologies with TaaS".
I am changing the format of the talk this time and am hoping to do a good demo instead of just showing code snippets. Hopefully no unpleasant surprises crop up in the demo!!
Other than that, really looking forward to interacting with a lot of fellow-enthusiasts at the conference.
Slides and experience report to follow soon.
Friday, November 22, 2013
Organising vodQA
My blog post on "Organizing a successful meetup - Tips from vodQA" is now available on the ThoughtWorks Insights page.
Monday, October 21, 2013
BDT in Colombo Selenium Meetup
[UPDATED again] Feedback and pictures from the virtual session on BDT
[UPDATED]
The slides and audio + slide recording have now been uploaded.
I will be talking virtually and remotely about "Building the 'right' regression suite using Behavior Driven Testing (BDT)" in Colombo's Selenium Meetup on Wednesday, 23rd October 2013 at 6pm IST.
If you are interested in joining virtually, let me know, and if possible, I will get you a virtual seat in the meetup.
Saturday, October 12, 2013
vodQA Pune - Faster | Smarter | Reliable schedule announced
A very impressive and engrossing schedule for vodQA Pune scheduled for Saturday, 19th October 2013 at ThoughtWorks, Pune has now been announced. See the event page for more details.
I am going to be talking about "Real-time Trend and Failure Analysis using Test Trend Analyzer (TTA)"
Sunday, October 6, 2013
Offshore Testing on Agile Projects
Anand Bagmar
Reality of organizations
Organizations are now spread across the world. With this spread, having distributed teams is a reality. The reasons could be a combination of various factors, including:
- Globalization
- Cost
- 24x7 availability
- Team size
- Mergers and Acquisitions
- Talent
The Agile software methodology describes various principles for approaching software development, and there are various practices that can be applied to achieve these principles. The choice of practices is very significant in ensuring the success of the project. Some of the parameters to consider, in no particular order, are:
- Skillset on the team
- Capability on the team
- Delivery objectives
- Distributed teams
- Working with partners / vendors?
- Organization security / policy constraints
- Tools for collaboration
- Time overlap between teams
- Mindset of team members
- Communication
- Test Automation
- Project collaboration tools
- Testing tools
- Continuous Integration
** The above list is from a Software Testing perspective.
This post is about the practices we implemented as a team for an offshore testing project.
Case Study - A quick introduction
An enterprise had a B2B product providing an online version of a physically conducted auction for selling used vehicles, in real time and at high speed. Typical participation in this auction is by an auctioneer, multiple sellers, and potentially hundreds of buyers. Each sale can have up to 500 vehicles. Each vehicle gets sold / skipped in under 30 seconds - with multiple buyers potentially bidding on it at the same time. Key business rules: only 1 bid per buyer, no consecutive bids by the same buyer.
Analysis and Development was happening across 3 locations - 2 teams in the US, and 1 team in Brazil. Only Testing was happening from Pune, India.
George Bernard Shaw said: "Success does not consist in never making mistakes but in never making the same one a second time."
We took that to heart very sincerely. We applied all our learning and experience in picking the practices that would make us succeed. We consciously sought to be creative and innovative, and applied out-of-the-box thinking to how we approached testing (in terms of strategy, process, tools, and techniques) for this unique, interesting, and extremely challenging application, ensuring we did not go down the same path again.
Challenges
We had to overcome many challenges on this project.
- Challenge in creating a common DSL that will be understood by ALL parties - i.e. Clients / Business / BAs / PMs / Devs / QAs
- All examples / forums discuss trivial problems, whereas we had a lot of data and complex business scenarios to take care of.
- Cucumber / Capybara / WebDriver / Ruby do not offer an easy way to do concurrent / parallel testing
- We needed to simulate "n" participants at a time in our manual + automated tests, interacting with the sale / auction
- A typical sale / auction can contain 60-500 buyers, 1-x sellers, and 1 auctioneer. The sale / auction can contain anywhere from 50-1000 vehicles to sell. There can be multiple sales going on in parallel. So how do we test these scenarios effectively?
- Data creation / usage is a huge problem (e.g. a production subset snapshot is > 10GB compressed, and a refresh takes a long time too)
- Getting a local environment in Pune to continue working effectively - all pairing stations / environment machines use RHEL Server 6.0 and are auto-configured using Puppet. These machines are registered to the client account on the Red Hat Satellite Server.
- Communication challenge - we were working from 10K miles away, with a time difference of 9.5 / 10.5 hours (depending on DST), which meant almost zero overlap with the distributed team. To add to that complexity, our BA was in another city in the US - so another time difference to take care of.
- End-to-end performance / load testing was not even a part of this scope - but something we were very wary of in terms of what can go wrong at that scale.
- We needed to be agile - i.e. testing stories and functionality in the same iteration.
All the above-mentioned problems meant we had to come up with our own unique way of tackling the testing.
Our principles - our North Star
We stuck to a few guiding principles as our North Star:
- Keep it simple
- We know the goal, so evolve the framework - don't start building everything from step 1
- Keep sharing the approach / strategy / issues faced on regular basis with all concerned parties and make this a TEAM challenge instead of a Test team problem!
- Don't try to automate everything
- Keep test code clean
The End Result
At the end of the journey, here are some highlights from the offshore testing project:
- Tests were specified in form of user journeys following the Behavior Driven Testing (BDT) philosophy – specified in Cucumber.
- Created a custom test framework (Cucumber, Capybara, WebDriver) that tests a real-time auction - in a very deterministic fashion.
- We had 65-70 tests in the form of user journeys that cover the full automated regression for the product.
- Our regression completed in less than 30 minutes.
- We had no manual tests to be executed as part of regression.
- All tests (=user journeys) are documented directly in Cucumber scenarios and are automated
- Anything that is not part of the user journeys is pushed down to the dev team to automate (or we try to write automation at that lower level)
- Created a ‘special’ long-running test suite that simulates a real sale with 400 vehicles, >100 buyers, 2 sellers, and an auctioneer.
- Created special concurrent (high-speed parallel) tests that ensure that even at the highest possible load, the system behaves correctly
- Since there was no separate performance and load test strategy, created special utilities in the automation framework, to benchmark “key” actions.
- No separate documentation or test cases were ever written / maintained - and we never missed them.
- A separate special sanity test runs in production after each deployment, to ensure all the integration points are set up properly
- Changed our work timings (for most team members) from 12pm - 9pm IST to get some more overlap, and remote pairing time with onsite team.
- Set up an ice-cream meter for those who came late to standup.
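To give a feel for the user-journey style of BDT specification described above, here is a hypothetical, simplified Cucumber scenario for the auction domain - the step wording, names, and numbers are illustrative assumptions based on the business rules mentioned earlier, not the project's actual scenarios:

```gherkin
Feature: Buyer participates in a live sale

  # One "journey" covers an end-to-end business flow, rather than a
  # single isolated function - this keeps the regression suite small
  # (65-70 journeys) while still exercising the real rules of the sale.
  Scenario: Buyer wins a vehicle in a running sale
    Given a sale is in progress with 3 vehicles and 2 registered buyers
    And the auctioneer opens bidding on the first vehicle
    When buyer "B1" places a bid
    And no other buyer outbids "B1" before bidding closes
    Then the first vehicle is marked as sold to "B1"
    And buyer "B1" cannot place two consecutive bids on the next vehicle
```

The last step encodes one of the key business rules (no consecutive bids by the same buyer) directly in the journey, so the rule is both documented and regression-tested in one place.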
Innovations and Customizations
Necessity breeds innovation! This was so true on this project. Below are the different areas and specifics of the customizations we made in our framework.
Dartboard
Created a custom board, the "Dartboard", to quickly visualize the testing status in the iteration. See this post for more details: "Dartboard - Are you on track?"
TaaS
To automate the last mile of Integration Testing between different applications, we created an open-source product - TaaS. This provides a platform / OS / tool / technology / language agnostic way of automating the integrated tests between applications.
Base premise for TaaS:
Enterprise-sized organizations have multiple products under their belt. The technology stack used for each of these products is usually different - for various reasons.
Most such organizations would like to have a common Test Automation solution across these products, in an effort to standardize the test automation framework.
However, this is not a good idea! If products in the same organization can be built using different / varied technology stacks, then why should you impose this restriction on the Test Automation environment?
Each product should be tested using the tools and technologies that are "right" for it.
"TaaS" is a product that allows you to achieve the "correct" way of doing Test Automation.
WAAT - Web Analytics Automation Testing Framework
I had created the WAAT framework for Java and Ruby in 2010/2011. However, this framework had a limitation - it did not work with products that are configured to work only in https mode.
For one of the applications, we needed to do testing for WebTrends reporting. Since this application worked only in https mode, I created a new plugin for WAAT - JS Sniffer - that can work with https-only applications. See my blog for more details about WAAT.