I spoke at the DevOps Summit on 8th Oct in Bangalore on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".
The conversations during and after this talk with veterans from across the Software Industry, spanning many different domains, reaffirmed my belief that I need to spend more time taking TTA to the next level and making it a more robust, feature-rich product.
Below are the details of the talk:
Abstract
In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?
What is the health of your product portfolio at any point in time? Can you identify patterns over a period of time that help you make better decisions and improve the quality of your product(s)? Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.
However, in an organization with multiple products in its portfolio, how can you get collated quality / health information from all the products, quickly and in real time? Or, for a large program of work with various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get consolidated quality / health information for the whole program?
In such cases, how can you:
- figure out Trends / Patterns in the quality,
- do meaningful Comparative Analysis (say, between the quality of the last release vs the next release),
- do quick Failure Analysis and prioritize the fixing of issues in an efficient fashion, and
- do quick Functional Performance Benchmarking?
At present, this needs to be done manually. Learn an effective way to answer the above questions with TTA (Test Trend Analyzer), an open-source product.
TTA gives you real-time, visual insights into the health of the product portfolio using Test Automation results. This allows teams to take decisions about deploying the product to the next environment using actual data points, instead of relying on 'gut feel'.
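As a rough illustration of the kind of trend analysis described above, here is a minimal sketch (not TTA's actual implementation) that aggregates test-run results into a per-build pass-rate trend and flags drops in quality. The build names and result fields are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TestRun:
    build: str        # e.g. a CI build number or release tag (hypothetical)
    passed: int       # number of passing tests in that run
    failed: int       # number of failing tests in that run

def pass_rate_trend(runs):
    """Compute the pass rate per build and flag builds where quality dropped."""
    trend = []
    previous_rate = None
    for run in runs:
        total = run.passed + run.failed
        rate = run.passed / total if total else 0.0
        dropped = previous_rate is not None and rate < previous_rate
        trend.append((run.build, round(rate * 100, 1), "DROP" if dropped else "OK"))
        previous_rate = rate
    return trend

# Example: compare the last release's build against the current ones
runs = [
    TestRun("rel-1.0-b42", passed=180, failed=20),
    TestRun("rel-1.1-b07", passed=175, failed=35),
    TestRun("rel-1.1-b09", passed=205, failed=5),
]
for build, rate, status in pass_rate_trend(runs):
    print(f"{build}: {rate}% passing [{status}]")
```

A dashboard like TTA does this kind of aggregation continuously across many products and test suites, rather than for a single hand-fed list of runs.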
The Agile Manifesto was published in 2001. It took the software industry a good few years to truly understand what the manifesto means, and the principles behind it. However, choosing and implementing the right set of practices to get the true value from working the Agile way has been the biggest challenge for most!
While Agile has now gone mainstream, and as we get better at the development practices of being Agile, Testing has still been lagging behind in most cases. A lot of teams are still working in a staggered fashion - with testing following only after development is completed.
In this workshop, we will learn and share various principles and practices which teams should adopt to be successful in testing on Agile projects.
Agenda :
What is Agile testing? - Learn what it means to Test on Agile Projects
Effective strategies for Distributed Testing - Learn practices that help bridge the Distributed Testing gap!
Test Automation in Agile Projects - Why? What? How? - Why is Test Automation important, and how do we implement a good, robust, scalable and maintainable Test Automation framework?
Build the "right" regression suite using Behavior Driven Testing (BDT) - Behavior Driven Testing (BDT) is an evolved way of thinking about Testing. It helps in identifying the 'correct' scenarios, in the form of user journeys, to build a good and effective (manual & automated) regression suite that validates the Business Goals (see the sketch after this agenda).
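To make the user-journey idea concrete, here is a minimal, hypothetical sketch of a BDT-style regression test written with pytest: one test walks an end-to-end business flow instead of asserting on individual screens in isolation. The FakeShopSession class and the journey itself are invented stand-ins, not material from the workshop.

```python
class FakeShopSession:
    """Stand-in for a real UI/API driver, just to keep the sketch self-contained."""
    def __init__(self):
        self.cart = []

    def search(self, term):
        return [f"{term}-item-1", f"{term}-item-2"]

    def add_to_cart(self, item):
        self.cart.append(item)

    def checkout(self):
        return {"status": "confirmed", "items": len(self.cart)}

def test_guest_user_can_search_add_to_cart_and_checkout():
    # Given a guest user browsing the shop
    session = FakeShopSession()
    # When they search for a product and add the first result to the cart
    results = session.search("headphones")
    session.add_to_cart(results[0])
    # And they complete the checkout
    order = session.checkout()
    # Then the order is confirmed with the expected number of items
    assert order["status"] == "confirmed"
    assert order["items"] == 1
```

The point of the style is that the test name and steps read as a business-level user journey, which is what makes it a candidate for the regression suite.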
Key Learnings for participants in this workshop :
Understand the Agile Testing Manifesto
Learn the Testing practices and activities essential for teams to adopt when working in the Agile way
Discover techniques to do effective testing in distributed teams
Find out how Automation plays a crucial role in Agile projects
Learn how to build a good, robust, scalable and maintainable Functional Automation framework
Learn, by practice, how to identify the right types of tests to automate as UI functional tests - to get quick and effective feedback
Pre-requisites :
Basic working knowledge and understanding of Agile
I spoke at the Agile Testing Alliance Global Gathering on 8th Oct in Bangalore on "The What, Why and How of Web Analytics Testing". This talk was my take on explaining a very important, yet quite ignored, aspect of Product / Application Development - Web Analytics. Below is the abstract of the talk, followed by slides and video from the talk.
The most used and heard-about buzzwords in the Software Industry today are … IoT and Big Data! With IoT, given a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which, if analyzed and used correctly, can feed into creating more opportunities and increased value propositions. There are 2 types of analysis that one needs to think about:
1. How is the end-user interacting with the product? This gives some level of understanding into how to re-position and focus on the true value-add features of the product.
2. With the huge volume of data being generated by end-user interactions, and the data being captured by all devices in the food chain of the offering, it is important to identify patterns from what has happened, and find new product / value opportunities based on usage patterns.
Learn what Web Analytics is, why it is important, and see some techniques for testing it manually and also automating that validation.
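One possible way to automate that validation (a minimal sketch, not necessarily the approach shown in the talk) is to capture the browser's network traffic as a HAR file, via a proxy or the browser's dev tools, and then assert that the expected analytics beacons were fired with the expected parameters. The endpoint, parameter names, and file name below are assumptions for illustration.

```python
import json
from urllib.parse import urlparse, parse_qs

def analytics_beacons(har_path, host_fragment="google-analytics.com"):
    """Extract analytics requests (and their query parameters) from a HAR capture."""
    with open(har_path) as f:
        har = json.load(f)
    beacons = []
    for entry in har["log"]["entries"]:
        url = entry["request"]["url"]
        if host_fragment in urlparse(url).netloc:
            beacons.append(parse_qs(urlparse(url).query))
    return beacons

def test_add_to_cart_fires_expected_event():
    # 'add_to_cart.har' is a hypothetical capture recorded while performing
    # the "add to cart" action in the application under test.
    beacons = analytics_beacons("add_to_cart.har")
    # Assert that at least one beacon reported the expected event category/action.
    # The 'ec'/'ea' parameter names assume classic Google Analytics 'collect' hits;
    # other analytics tools use different parameters.
    assert any(
        b.get("ec") == ["cart"] and b.get("ea") == ["add"]
        for b in beacons
    ), "Expected an 'add to cart' analytics event to be fired"
```

The same pattern works for manual testing too: perform the user action, inspect the captured traffic, and check that the right tags fired with the right values.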