
Thursday, September 26, 2013

Test Execution Trend feature added in TTA (0.5.7)

A new version of Test Trend Analyzer (TTA) is now available on github.

These are the new features in it:

  • Test Execution Trend - to benchmark the execution of a specific test over a period of time (see the sketch after this list)
  • Failure Analysis View
    • Modified view to show failures for a specific test category along with a specific test run
    • Modified view to show failures for all test categories with the latest test run
  • Minor Refactoring for Compare Runs view
  • Query optimisations + minor defect fixes.
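
To illustrate the idea behind the Test Execution Trend (a sketch only, not TTA's actual code), here is a minimal Ruby snippet that trends the execution time of one test across successive runs using a simple moving average; the duration data is hypothetical.

    # A minimal sketch (not TTA's actual code): trend the execution time
    # of one test across successive runs using a simple moving average.
    def moving_average(durations, window = 3)
      durations.each_cons(window).map { |slice| (slice.sum / window.to_f).round(2) }
    end

    # Hypothetical durations (in seconds) for a single test over six runs.
    durations = [12.1, 11.8, 12.4, 15.9, 16.2, 16.5]
    puts moving_average(durations).inspect
    # => [12.1, 13.37, 14.83, 16.2]

A rising tail in the averages (as in the hypothetical output above) is exactly the kind of slowdown the trend view is meant to surface.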
For those who missed the earlier announcement: a summary of TTA's features is available here, and the source code is available on github.


PS: There is a lot of work (features / UI) planned for TTA. If you wish to contribute, we would love your help!

Saturday, September 7, 2013

TTA (0.5.4) feature information and screenshots

Here are some screenshots from the current functionality supported by TTA v0.5.4.

See the TTA wiki page or contact me for more information on TTA.


Screenshot captions:
  • Manual upload of Test data
  • Test Pyramid
  • Comparative Analysis
  • Compare different test runs
  • Test Failure Analysis
  • Integrate External Dashboards with TTA
  • TTA Statistics page

Saturday, August 31, 2013

TTA - closer to becoming a unified dashboard for Test Trends and Status

Test Trend Analyzer (TTA) went live again today! We are now at version 0.5.4.

In the past week we have made a bunch of improvements and added new features, notably Integration of External Dashboards, which takes TTA closer to being the central dashboard for testing status across the organization.
 
A couple of projects within ThoughtWorks are using it, as are some other organizations.

Here is the list of new features:
  • Integration of external dashboards (add from the /admin page, see the integration on the /home page) - this allows you to integrate different existing dashboards into TTA, making it a one-stop place for seeing all testing-related information. Example: you can integrate your defect reports from Mingle / Jira / etc., or your specific Continuous Integration (CI) dashboard from Go / Jenkins / Hudson / Bamboo / etc.
  • Compare test runs (/compare_runs) - to compare two specific test runs (see the sketch after this list):
    • what are the common failures,
    • what are the unique failures,
    • what failed on date 1, but passed on date 2,
    • what failed on date 2, but passed on date 1.
  • TTA Statistics Page (/stats) - to see how much different projects / teams in your organization use TTA
  • Fixes + minor UI modifications
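
As a rough illustration of what the compare-runs view computes (hypothetical test names, not TTA's actual schema), plain Ruby array operations already capture the comparisons listed above:

    # Hypothetical failure lists from two test runs.
    run1_failures = %w[login_test checkout_test search_test]  # run on date 1
    run2_failures = %w[checkout_test profile_test]            # run on date 2

    common   = run1_failures & run2_failures  # common failures
    only_in1 = run1_failures - run2_failures  # failed on date 1, passed on date 2
    only_in2 = run2_failures - run1_failures  # failed on date 2, passed on date 1

    puts "Common failures:  #{common.inspect}"    # ["checkout_test"]
    puts "Unique to date 1: #{only_in1.inspect}"  # ["login_test", "search_test"]
    puts "Unique to date 2: #{only_in2.inspect}"  # ["profile_test"]

(The set differences assume each test actually ran on both dates.)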
Features available for some time:
  • Test Pyramid view (/pyramid) - to see how your project's automation effort aligns with the Test Automation philosophy (see the sketch after this list)
  • Comparative Analysis view (/comparative_analysis) - to see the trend of your test automation results over a period of time, and whether any patterns emerge
  • Failure Analysis view (/defect_analysis) - to make better sense of the test failures, and help you prioritize which failures should be fixed first.
  • Upload Test Run Data manually (/upload) - to manually upload test data in case you have not set up automatic uploads to TTA but still want to use it
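
To make the Test Pyramid check concrete (a sketch with made-up counts, not TTA's actual logic), the essential rule is that each layer has more tests than the layer above it:

    # Hypothetical test counts per layer, base of the pyramid first.
    counts = { unit: 850, integration: 220, ui: 40 }

    # A healthy pyramid: every layer has more tests than the layer above it.
    healthy = counts.values.each_cons(2).all? { |lower, upper| lower > upper }

    puts healthy ? "Pyramid shape looks healthy" : "Pyramid is inverted somewhere"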
 
More information about features, how to use TTA, etc. can be found on this blog, TTA-github, TTA-github-wiki, or by contacting me.

Tuesday, April 16, 2013

Dartboard - Are you on track?

In Agile projects, we use swim-lanes to track the status of the card life-cycle in an iteration. Unfortunately, swim-lanes depict a sequential work-flow: something has to come first, second, ... last. That plants the thought in a lot of minds that what is first is indeed first, and what is last, well, is last in the scheme of things.


That depicts testing being done towards the end - which is very anti-agile!

Testing starts off way before development is completed on any card. See the "Agile QA Process" for one way to do Testing on Agile projects.



On Agile projects, the reality is that testing goes on in some fashion or the other right from the beginning. To help bring that visibility into the work-stream, I tried creating a grid (physically on a board, and also in Mingle) - with the rows representing the state of testing in each swim lane ... but that too was not as appealing as the image shown below. I chose to call it the Dartboard.

Dartboard - Are you on track?


Some explanation on how we used this:
Each triangle can represent one of your individual swim lanes. From a testing perspective, we chose to club "Ready for Dev", "In Dev" and "In UI" together in the same category.

There is a specific "In Testing" triangle, because there is some amount of work that definitely needs to be done from a testing perspective AFTER development and BA signoff are complete.

The RED triangle means the card is blocked from ALL perspectives in the iteration.

The GREEN triangle means the card is complete from ALL perspectives (analysis, dev, testing - manual + automation).

In each of the triangles (except RED and GREEN), the colored bands mean specific things:
BLACK band = Testing not yet started
RED band = Testing is blocked (e.g., waiting for additional information)
BLUE band = Testing is in progress. Could be identifying test cases, doing manual / exploratory testing, setting up test data, automation, etc.

As the card moves between triangles, the testing state of each card is clearly visible.

Add to this a simple timeline at the top / bottom of the card to indicate where you are in the iteration, and a very quick glance at the dartboard tells you whether your "state of panic" is justified.

Some swim lane states we have used:
> Backlog / Spillover?
> Ready for Dev / In Dev / In UI
> Ready for BA Signoff / In BA Signoff
> Ready for Test
> In Test / In Integrated Test
> Done
> Blocked

We also tracked each type of card separately:
Defects = RED cards / stickies
Story cards = Blue / Yellow cards / stickies
Tech cards = White cards / stickies

Friday, March 29, 2013

Introducing Test Trend Analyzer (TTA)

The statement "I have a dream" is a very famous quote by American activist Martin Luther King Jr.

I resonate very closely with that. Here is why and how ...

Almost 2 years ago, I had a dream ... a vision of a product that could help those working in large organizations understand the health of their products / projects at a quick glance, instead of having a team of people frantically scrambling to collect and collate the information needed to get a sense of the quality of the products they support. I chose to call this product Test Trend Analyzer (TTA).

Given that Automation is a key factor in ensuring the success, quality and time-to-market for products, I took that as a baseline requirement and came up with a basic high-level diagram for TTA:

TTA - first diagram
Finally, a couple of months ago, I found a bunch of passionate people who also had some time to try and implement this tool.

We came up with this vision for TTA:


TTA Vision

Accordingly, we planned, prioritized, spiked, built some prototypes, did demos and showcases, and got a basic usable product completed.
It is an open-source project, available on github, with more information on the github wiki.

The technology stack is Ruby on Rails (RoR) with a MySQL database.
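
For the curious, this is roughly what pointing a Rails app at MySQL looks like (an illustrative snippet with made-up connection details; the actual configuration lives in the TTA repository):

    # Illustrative only - hypothetical host / database name / credentials.
    require "active_record"

    ActiveRecord::Base.establish_connection(
      adapter:  "mysql2",
      host:     "localhost",
      database: "tta_development",
      username: "root",
      password: ""
    )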

TTA falls in the Big Data + Visualization space - specific to Testing.

Watch this space for more information about TTA and the currently supported features. Email me if you need more specific information, or have questions on how you can use TTA, etc.

How can you contribute?

Given that we have implemented only a few basic features right now, and there are many more in the backlog, here is how you can help:
  • Suggest new ideas / features that will help make TTA better
  • Use TTA on your project and provide feedback
  • More importantly, help in implementing these features.