Wednesday, October 2, 2013

Real-time Trend and Failure Analysis using Test Trend Analyzer (TTA)


Anand Bagmar

Summary

Organizations run long-lived products and programs. They need to understand the health of those products and projects at a glance, instead of having a team of people frantically collating information to get a sense of the quality of the products they support.

TTA is an open source product that becomes the source of information to give you real-time insight into the health of your product portfolio, using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking.

The Dream

The statement "I have a dream" is a very famous quote by American activist Martin Luther King Jr.

I resonate very closely with that. Here is why and how ...

Sometime in 2011, I had a dream ... a vision about a product that can help those working in large organizations understand the health of their products / projects at a quick glance, instead of having a team of people manually scrambling frantically to collate and collect the information needed to get a sense of quality about the products they support….

I called this dream Test Trend Analyzer - TTA.

What is TTA?

In a nutshell, given the various types of Test Automation done in your organization, TTA is a product that stores and parses the test run results, and then displays various Trend Analysis charts and also does Test Failure Analysis for you. Based on the context of the product under test, the viewer can then make more meaning of the data presented and, more importantly, take meaningful actions / next steps.

Why do I need Trend Analysis of the test results?

Automation (Unit / Integration / Functional / etc.) is a key factor in ensuring the success, quality and time-to-market for products.


Since automated tests are executed via CI (Continuous Integration), a lot of trend analysis and test failure analysis is already done by the CI tool itself.

However, CI's ability to do this is limited, for the following reasons:
  • The typical archival duration in CI is in the range of 15-45 days.
  • Trends can be seen only after grouping relevant jobs in the CI tool.
  • It is difficult to group all related product jobs in CI because of the sheer volume of tests.
  • Grouping jobs becomes even more challenging as the number of products / projects / vendors or partners / environments / etc. grows.
  • The projects / products are long running (many months to years). It is not practical to archive the results for such durations in CI.

I have seen first-hand, in real scenarios, many of the use cases listed below, where we need a different kind of product to solve some Testing-specific problems:
  • A Business Manager / Test Director overseeing the development of multiple products in the organization may want to see the overall health of all the products in his / her portfolio, in real time.
  • A Product Owner / PM / Test Manager overseeing the development / testing of a specific product in the organization may want to see the overall health of that product, in real time.
  • Individual team members (Tech Leads / QAs / Developers / etc.) want to do quick test failure analysis in order to decide the correct priority of the next set of tasks.

Vision for TTA

With the above considerations in mind, I came up with the following vision statement for TTA:
A single point, visual solution to gauge the health of your product portfolio using Test Automation results, by means of:
  • Trends
  • Failure analysis
And providing:
  • Drill-down reports
  • Customizable reports
So that:
  • Different stakeholders can get a single-click view of the health status and potential issues
  • A project team can decide whether automation is proving useful
  • Data collation and trending are automated, avoiding manual data aggregation and interpretation
With the stakeholders being:
  • QA Directors / Managers / Leads / hands-on testers
  • Developers
  • Tech-Ops

How does TTA work?

TTA is developed as a standalone Ruby on Rails (RoR) product, with MySQL as its database. You will need to install TTA (instructions are available on the TTA github wiki) on an independent (virtual) machine.

TTA is a decoupled product. It does not depend on any specific CI (Continuous Integration) Tool, programming language, test framework, etc.

CI jobs typically call some build tool - for example ant, maven, gradle, etc. The command called by the CI job does the test setup and then executes the tests. After execution, the results are sent back to CI, and the test run is complete.

After the test execution is completed, to integrate automatic reporting of results into TTA, we need to:
  • Zip the log folder, and
  • Send the results to TTA, along with test meta-data information
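As a purely illustrative sketch of those two steps (the /upload endpoint path, the form-field names, and the publish_results helper below are my assumptions for illustration, not TTA's actual API - consult the TTA github wiki for the real upload mechanism), a CI job's post-execution step might look like:

```python
"""Illustrative sketch of a CI post-test step for TTA.

Assumptions: the /upload endpoint and the form fields (project,
category, results) are hypothetical -- see the TTA github wiki
for the actual upload API.
"""
import shutil


def publish_results(log_dir, tta_url, project, category):
    # Step 1: zip the log folder into test_results.zip
    archive = shutil.make_archive("test_results", "zip", log_dir)

    # Step 2: build the command a CI job would run to send the
    # archived results along with test meta-data.
    cmd = [
        "curl", "-s",
        "-F", "project=" + project,
        "-F", "category=" + category,
        "-F", "results=@" + archive,
        tta_url + "/upload",
    ]
    return archive, cmd
```

In a real pipeline, the CI job would execute the returned command (for example via subprocess.run(cmd)) once the TTA instance's actual upload endpoint is known.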

Current set of Features for TTA

  • Test Pyramid view (/pyramid) - to see how your project's automation effort aligns with the Test Automation philosophy
  • Comparative Analysis view (/comparative_analysis) - to see the trend of your test automation results over a period of time, and whether any patterns emerge
  • Failure Analysis view (/defect_analysis) - to make better meaning of the test failures, and to help you prioritize which failures should be fixed first
  • Integration of external dashboards (add from the /admin page; see the integration on the /home page) - to integrate different existing dashboards into TTA, making it a one-stop place for all Testing-related information. Example: you can integrate your defect reports from Mingle / Jira / etc., or your Continuous Integration (CI) dashboard from Go / Jenkins / Hudson / Bamboo / etc.
  • Test Execution Trend - to see the benchmarking of a specific test's execution over a time period
  • Compare test runs (/compare_runs) - to compare specific test runs:
      • what are the common failures
      • what are the unique failures
      • what failed on date 1, but passed on date 2
      • what failed on date 2, but passed on date 1
  • Upload Test Run Data manually (/upload) - to manually upload test data in case you have not uploaded it to TTA automatically, but still want to use TTA
  • TTA Statistics Page (/stats) - to see the usage of TTA by different projects / teams in your organization

Refer to my blog or the TTA github wiki for more information, including screenshots of TTA.

Current state

TTA is available as an open-source product via github. There are a couple of clients (internal to ThoughtWorks, and external) using TTA in their projects.

How can you contribute?

Given that we have implemented only a few basic features right now, and there are many more in the backlog, here is how you can help:
  • Suggest new ideas / features that will help make TTA better
  • Use TTA on your project and provide feedback
  • More importantly, help implement these features

Contact information

Contact Anand Bagmar (anand.bagmar@thoughtworks.com / abagmar@gmail.com) for more information about TTA.

Tuesday, October 1, 2013

Evolution of vodQA - The Software Tester's Conference



Anand Bagmar


The birth of vodQA


Back in 2009 / 2010, I was looking to attend and speak at Testing Conferences, especially ones near my hometown, Pune.



My search results were very disappointing for the following reasons:

  • The conferences were not catered towards hands-on Testers; they were catered to Leads / Managers / Directors.
  • The speakers spoke mostly about process, and about very high-level “things” related to Software Testing.
  • There was a significant cost to attend the conferences - which meant that anyone really interested in attending to learn more had to think multiple times before doing so.


This was clearly not working for me, as well as other like-minded people I was interacting with.



Thanks to my colleagues in ThoughtWorks, we decided to do something about this.



In June 2010, we decided to start our own conference, hosted by ThoughtWorks, with the objective of “purely sharing our learning with the community, and also learn from the community”. This was the birth of “vodQA – value oriented discussion about Quality Analysis”.



To start anything, especially external / public facing, we needed to plan it well, and the first part of the planning is – what are we trying to achieve?

vodQA – Mission Statement


To make vodQA valuable to us and other attendees, we came up with some guiding principles. These principles have kept evolving naturally over time.

  • vodQA is a practitioners' / hands-on conference.
  • Anyone interested in the aspect and practice of Software Testing, regardless of role or organization, should be able to attend vodQA.
  • Any topic related to Software Testing (manual / exploratory / techniques / tools / technologies / process) is welcome.
  • At ThoughtWorks, the Quality of the Software we build is of prime importance. To cater to that, we have a very strong Testing capability, skills and practices within the organization. Using vodQA as a platform, we want to share our learning with the Testing Community, so they can learn from what has worked well, or not so well, based on real experiences.
  • Though we have “good” Testing practices at ThoughtWorks, there are many more practices we can learn from the experiences of other organizations and individuals. The vodQA platform should be able to provide new learning from the industry for ThoughtWorkers, and for others who attend.
  • In order to keep learning from what we do, and to keep doing better in the future, we take feedback from attendees and speakers. Also, with a retrospective immediately after the event, we attempt to do the best for everyone participating in the next vodQA.
  • vodQA is a community event. Each community hosting vodQA should have the flexibility to structure the event in the way they see best suited for the attendees.
  • Each vodQA should have a specific theme. This allows proper expectation setting from attendees, and also invites specific themed-topics from internal (ThoughtWorkers) and external speakers.

How do we connect with like-minded people?


We started by using the database of people who had expressed interest in working with / being associated with ThoughtWorks, and spread word of the event by word-of-mouth.



Then we started using the power of Facebook (vodQA group), LinkedIn (vodQA group) and Twitter (#vodqa) to spread the word further, and to bring the community together between events, to share and collaborate with each other.

The journey so far


The first vodQA event in June 2010 at ThoughtWorks, Pune was a huge success. As a result, we immediately planned and hosted another vodQA in Oct 2010, again at ThoughtWorks, Pune.



The feedback and retrospectives provided us great motivation and inputs. As a result, we decided to make vodQA an India level conference.



As we took vodQA to other ThoughtWorks India offices beyond Pune (Bangalore, Chennai, Gurgaon), the respective communities planned and executed the event the way they felt best.



To date, we have hosted the following vodQA events:



  • vodQA – The Testing Spirit (Pune, Jun-10)
  • vodQA (Pune, Oct-10)
  • vodQA (Pune, Mar-11)
  • vodQA – Agile Testing for Enterprises (Gurgaon, Dec-11)
  • vodQA – Continuous Testing for Total Quality Assurance (Chennai, Jan-12)
  • vodQA – Agile Testing for Teams and Enterprises (Bangalore, Feb-12)
  • vodQA – Testing and Beyond (Pune, Mar-12)
  • vodQA – NCR (Gurgaon, Jun-12)
  • vodQA Geek Night – Behavior Driven Testing (BDT) workshop (Pune, Jul-12)
  • vodQA – The ABCs of Testing: Automation, Big Data Analytics, Cloud Testing (Bangalore, Sep-12)
  • vodQA – Going Beyond the Usual (Pune, Oct-12)
  • vodQA Geek Night – Test Automation workshop (Pune, Apr-13)
  • vodQA – Get, Set, Test (Bangalore, May-13)
  • vodQA – Served on Mobile (Chennai, Jul-13)
  • vodQA – Agility In Mobility (Gurgaon, Jul-13)
  • *vodQA – Smarter | Faster | Reliable (Pune, Oct-13)
  • *vodQA – Selenium for Beginners (Gurgaon, Oct-13)



*Upcoming events



Some interesting quotes from attendees:

  • Liked the participation, interactive approach of the organizers and the passion and energy to make sure that audience goes satisfied
  • A Saturday well spent @ThoughtWorks #vodQA. Got a chance to meet some highly motivated & inspired ThoughtWorkers & industry professionals!
  • This is our QA family, and we want to see it grow



Overall, we have had 800+ external (non-ThoughtWorker) participants and 75+ talks / workshops / games / sessions across all vodQA events. A decent percentage of Developers / Business Analysts / Software Coaches have attended and spoken, though the majority of attendees have been Software Testers.


I am very happy to see vodQA become popular and valuable to all passionate about Software Testing, all across India.


My dream is to take vodQA across the world as a community-driven, Testing practitioners' conference!

Thursday, September 26, 2013

Test Execution Trend feature added in TTA (0.5.7)

A new version of Test Trend Analyzer (TTA) is now available on github.

These are the new features in it:

  • Test Execution Trend - to see the benchmarking of specific test execution over a time period
  • Failure Analysis View
    • Modified view to show failures for specific Test category along with specific test run
    • Modified view to show failures for All test categories with latest test run
  • Minor Refactoring for Compare Runs view
  • Query optimisations + minor defect fixes.
For those who missed the earlier announcement - a summary of TTA's features is available here, and the source code is available on github.


PS: There is a lot of work (features / UI) planned for TTA. If you wish to contribute, we would love your help!!