
Monday, November 23, 2015

TTA in Discuss Agile Day Pune

I spoke in Discuss Agile Day, Pune on 22nd Nov on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

Below are the details of the talk:

Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over a period of time to make better decisions and improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.

However, in an organization having multiple products in its portfolio, how can you get the collated quality / health information from all the products, quickly and in real-time? Or, for a large program of work, which has various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get the consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say, between the quality of the last release vs. the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open source product.

TTA gives you real-time and visual insights into the health of the product portfolio using the Test Automation results. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

Slides from the talk


Wednesday, October 14, 2015

Good Trends for TTA in DevOps Summit

I spoke in DevOps Summit on 8th Oct in Bangalore on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

The conversations during and after this talk with various veterans in the Software Industry, across different domains, reinforced my belief that I need to spend more time taking TTA to the next level and making it a more robust and feature-rich product.

Below are the details of the talk:


Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over a period of time to make better decisions and improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.

However, in an organization having multiple products in its portfolio, how can you get the collated quality / health information from all the products, quickly and in real-time? Or, for a large program of work, which has various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get the consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say, between the quality of the last release vs. the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open source product.

TTA gives you real-time and visual insights into the health of the product portfolio using the Test Automation results. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

Slides from the talk


Video from the talk

 

 


Monday, October 12, 2015

Web Analytics and the new kids in town!

I spoke in Agile Testing Alliance Global Gathering on 8th Oct in Bangalore on "The What, Why and How of Web Analytics Testing".

This talk was my take on explaining a very important, yet often ignored, aspect of Product / Application Development - Web Analytics. Below is the abstract of the talk, followed by slides and video from the talk.


Topic: The What, Why & How of Web Analytics Testing

Learning Objectives:

The most used and most heard-about buzzwords in the Software Industry today are … IoT and Big Data!

With IoT, and a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which, if analyzed and used correctly, can feed into creating more opportunities and increased value propositions.

There are 2 types of analysis that one needs to think about.
1. How is the end-user interacting with the product? This gives some insight into how to re-position the product and focus on its true value-add features.
2. With the huge volume of data being generated by end-user interactions, and captured by all devices in the food-chain of the offering, it is important to identify patterns from what has happened, and find new product / value opportunities based on usage patterns.

Learn what Web Analytics is, why it is important, and see some techniques for testing it manually and also automating that validation.
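To make the "how" concrete, here is a minimal, hypothetical sketch of one way such validation can be automated: capture the page load as a HAR file (exported from the browser's dev tools or a proxy) and assert that the expected analytics beacon fired with the right parameters. The file name, tracking ID and endpoint below are assumptions for illustration, not material from the talk.

import json
from urllib.parse import urlparse, parse_qs

# Hypothetical inputs - adjust to your analytics vendor and page under test.
HAR_FILE = "homepage.har"                      # HAR captured for the page load
BEACON_HOST = "www.google-analytics.com"       # analytics endpoint to look for
EXPECTED_TRACKING_ID = "UA-12345-6"            # assumed tracking ID

def find_analytics_beacons(har_path, beacon_host):
    """Return the query parameters of every request sent to the analytics host."""
    with open(har_path) as f:
        har = json.load(f)
    beacons = []
    for entry in har["log"]["entries"]:
        parsed = urlparse(entry["request"]["url"])
        if parsed.netloc == beacon_host:
            beacons.append(parse_qs(parsed.query))
    return beacons

if __name__ == "__main__":
    beacons = find_analytics_beacons(HAR_FILE, BEACON_HOST)
    assert beacons, "No analytics beacon was fired on page load"
    # 'tid' is the Google Analytics Measurement Protocol tracking-ID parameter;
    # adjust the parameter name for your analytics vendor.
    assert any(b.get("tid") == [EXPECTED_TRACKING_ID] for b in beacons), \
        "Analytics beacon fired, but not with the expected tracking ID"
    print("Analytics validation passed: %d beacon(s) captured" % len(beacons))

The same check can run in CI after a Selenium run that records traffic through a proxy, so every build verifies that analytics tags are still firing.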


Slides from the talk

Wednesday, September 30, 2015

vodQA - Quest for Quality & Taste of Mobile in October

There are 2 vodQA editions coming your way in October 2015.

Quest for Quality - ThoughtWorks, Gurgaon - Saturday, 17th October, 2015

ThoughtWorks Gurgaon is happy to announce 8th edition of vodQA happening on 17th October, 2015 at TW Gurgaon office.

The theme for this edition is "Quest For Quality", i.e. anything and everything related to quality. Speaker registrations are now open; we request you all to submit your talks here.
 

Taste of Mobile - ThoughtWorks, Chennai - Saturday, 31st October, 2015

Welcoming speakers for the 6th edition of vodQA Chennai.
This edition of vodQA will give you the 'TASTE OF MOBILE'. We will discuss new practices and ideas in the field of mobile testing.
You could present your thoughts as a hands-on workshop, a short lightning talk (~10 mins), or a detailed presentation (~30 mins); submit your talks here.

Saturday, August 22, 2015

Patterns in Test Automation

I spoke in vodQA Hyderabad on Sat, 22nd August 2015 about Patterns in Test Automation - Frameworks, Data & Locators.

The slides are available on SlideShare:


The video is available on YouTube:



Abstract

Building a Test Automation Framework is easy - there are so many resources / guides / blogs / etc. available to help you get started and help solve the issues you get along the journey.

However, building a "good" Test Automation Framework is not very easy. There are a lot of principles and practices you need to use, in the right context, with a good set of skills required to make the Test Automation Framework maintainable, scalable and reusable.

Design Patterns play a big role in helping achieve this goal of building a good and robust framework.

In this talk, we will discuss, and see examples of, various types of patterns you can use for:

  • Building your Test Automation Framework
  • Test Data Management
  • Locators / IDs (for finding / interacting with elements in the browser / app)
Using these patterns, you will be able to build a good framework that will help keep your tests running fast and reliably in your CI / CD setup!
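As a hedged, minimal illustration of the kind of patterns in scope (illustrative names only, not the code from the talk), the sketch below keeps locators in one place inside a Page Object and uses a small builder-style object for test data, assuming the Selenium Python bindings and a hypothetical registration page:

from selenium import webdriver
from selenium.webdriver.common.by import By

class User:
    """Test data builder: tests describe only what differs from the default."""
    def __init__(self, name="default-user", email="default@example.com"):
        self.name = name
        self.email = email

    def with_email(self, email):
        return User(name=self.name, email=email)

class RegistrationPage:
    """Page Object: locators live in one place, tests never see raw selectors."""
    NAME_FIELD = (By.ID, "name")                          # hypothetical locators
    EMAIL_FIELD = (By.ID, "email")
    SUBMIT_BUTTON = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver, base_url="https://example.com"):
        self.driver = driver
        self.base_url = base_url

    def open(self):
        self.driver.get(self.base_url + "/register")
        return self

    def register(self, user):
        self.driver.find_element(*self.NAME_FIELD).send_keys(user.name)
        self.driver.find_element(*self.EMAIL_FIELD).send_keys(user.email)
        self.driver.find_element(*self.SUBMIT_BUTTON).click()
        return self

if __name__ == "__main__":
    driver = webdriver.Chrome()                           # assumes chromedriver on PATH
    try:
        user = User().with_email("pattern-demo@example.com")
        RegistrationPage(driver).open().register(user)
    finally:
        driver.quit()

If a locator changes, only the Page Object changes; if a new data variation is needed, only the builder call in the test changes.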

Learning outcome


  • Patterns for building Test Automation Framework
  • Patterns for Test Data Management, with pros and cons of each
  • Patterns for managing locators / IDs for interaction with UI



Thursday, August 6, 2015

Agile Testing - Metrics can be fun too!

Metrics are meaningless unless seen in the right context. In this case, my "right" context is purely a "feel-good-factor".

In April 2011, I published the "Agile QA Process" paper on SlideShare. I am very happy to see it has received over 30000 views and has been downloaded over 1400 times!

On a similar note, I created a mindmap for Test Insane - titled "Agile QA - Capabilities & Skills". That also seems to be hitting a good note - with almost 1000 views in under 25 days!

So in this case - Metrics are fun! I don't mind this ego boost to continue writing more, and sharing more!

Thursday, July 16, 2015

Automating Look & Feel, along with Functional validation

Typically we automate the validation of functionalities - using various options and approaches based on the Test Automation Pyramid. However, there are various reasons why one may want to automate the Look & Feel of their application (web / native).

This post is not about if or when we should automate the Look & Feel aspects of the product. This post assumes you have done the discussions, evaluations and validations required to reach the decision that, yes, we need to automate the Look & Feel of the product under test. 

The next question is - how? What are the alternatives of accomplishing this type of automation?

If you need basic validation, there is an open-source alternative - you can do Visual Testing and Validations using PhantomCSS. Vishnu & Shridhar spoke about this in vodQA Pune - Innovations in Testing. See the slides and videos from their awesome session.

If you need more flexibility and functionality from the Look & Feel automation, and want to do this cross-browser / on multiple devices, then I recommend you look at the Applitools Eyes product. I have seen a couple of demos from Moshe and Adam about this product - and they were kind enough to record one of the demos and allow me to share it with all - you can see the demo here on YouTube.



Applitools has a free account for individuals, with, of course, certain limitations in usage (not functionality). You can sign up for your account here and try it out.
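For readers curious about the core idea behind such visual checks, here is a hedged, minimal sketch - not PhantomCSS or Applitools code - of the basic approach: capture a screenshot, compare it pixel-wise against an approved baseline, and flag any difference. The URL and file names are assumptions; both images must be the same size.

from selenium import webdriver
from PIL import Image, ImageChops   # Pillow

BASELINE = "baseline_home.png"       # approved reference screenshot (assumed to exist)
CURRENT = "current_home.png"

driver = webdriver.Chrome()          # assumes chromedriver on PATH
try:
    driver.get("https://example.com")            # hypothetical page under test
    driver.save_screenshot(CURRENT)
finally:
    driver.quit()

baseline = Image.open(BASELINE).convert("RGB")
current = Image.open(CURRENT).convert("RGB")

# ImageChops.difference gives a per-pixel diff; getbbox() is None when identical.
diff_box = ImageChops.difference(baseline, current).getbbox()
if diff_box is None:
    print("Look & Feel unchanged")
else:
    print("Visual difference detected in region:", diff_box)

Tools like PhantomCSS and Applitools Eyes build on this idea with tolerance thresholds, region ignoring, and cross-browser / cross-device baselines, which is why they are worth evaluating beyond a raw pixel diff.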

Tuesday, July 14, 2015

Client-side Performance Testing Workshop in TechJam, 13th August 2015

I am conducting a Client-side Performance Testing workshop in TechJam on Thursday, 13th August 2015.

You can register for the same from the TechJam page.


Abstract

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.
Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity - using WebPageTest (private instance setup), and experiment with yslow - as a low-cost, programmatic alternative to WebPageTest.
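As a hedged illustration of the "low-cost, programmatic" idea (not the exact workshop material), the sketch below shells out to PhantomJS running yslow.js and fails the run if the overall score drops below a threshold. It assumes PhantomJS is on the PATH, yslow.js has been downloaded locally, and that the 'o' field in yslow's basic JSON report is the overall score - verify these against your yslow version.

import json
import subprocess
import sys

URL = "https://example.com"     # hypothetical page under test
THRESHOLD = 80                  # assumed minimum acceptable overall score (0-100)

# yslow's PhantomJS runner prints a JSON report to stdout when asked for JSON output.
result = subprocess.run(
    ["phantomjs", "yslow.js", "--info", "basic", "--format", "json", URL],
    capture_output=True, text=True, check=True)

report = json.loads(result.stdout)
overall_score = report.get("o")   # assumed to be the overall score in the basic report
print("Overall yslow score for %s: %s" % (URL, overall_score))

if overall_score is None or overall_score < THRESHOLD:
    sys.exit("Client-side performance below threshold (%s < %s)" % (overall_score, THRESHOLD))

Wired into a CI job, a check like this turns client-side performance from a one-off measurement into a guard that runs on every build.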

Expected Learnings

  1. What Performance Testing and Performance Engineering are.
  2. Hands-on experience with some open-source tools to monitor, measure and automate Client-side Performance Testing.
  3. Examples / code walk-through of some ways to automate Client-side Performance Testing.

Prerequisites

  1. Participants are required to bring their own laptop for this workshop.
  2. Also, please install PhantomJS on your machine (http://phantomjs.org/download.html).

Thursday, June 18, 2015

Enabling CD & TTA in Discuss Agile 2015

I had the opportunity to speak in Discuss Agile Delhi 2015 on 2 topics.

Here are the details on the same:


Enabling Continuous Delivery (CD) in Enterprises with Testing

Abstract:

The key objective of any organization is to provide / derive value from the products / services it offers. To achieve this, it needs to be able to deliver its offerings in the quickest time possible, and with good quality!
In such a fast moving environment, CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury!
There are various practices that organizations need to implement to enable CD. Changes in requirements (a reality in all projects) need to be managed better. Along with this, processes and practices need to be tuned based on the team's capability, skills and distribution.

Video:




Slides:






To Deploy, or Not to Deploy? Decide using Test Trend Analyzer (TTA)

Abstract:

The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
In order for these organizations to understand the quality / health of their products at a quick glance, a team of people typically scrambles to collate and collect the information needed to get a sense of the quality of the products they support. All of this is done manually.
So in this fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment or not?
Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.
The next set of questions are:
    •    How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
    •    How can you know if the product is ready to go 'live'?
    •    What is the health of your product portfolio at any point in time?
    •    Can you identify patterns and do quick analysis of the test results over a period of time to help with root-cause analysis of issues, and make better decisions to improve the quality of your product(s)?
The current set of tools are limited and fail to give the holistic picture of quality and health, across the life-cycle of the products.
The solution - TTA - Test Trend Analyzer
TTA is an open source product that becomes the source of information to give you real-time and visual insights into the health of the product portfolio using the Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.
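To illustrate the kind of trend and comparative analysis such a dashboard surfaces - a hedged sketch of the idea only, not TTA's actual implementation - per-build test results can be collapsed into pass-rate trends and a simple promotion gate. The build names and numbers below are made up.

# Hypothetical per-build test results: (build, passed, failed, skipped)
runs = [
    ("build-101", 950, 40, 10),
    ("build-102", 930, 65, 5),
    ("build-103", 980, 15, 5),
]

def pass_rate(passed, failed, skipped):
    executed = passed + failed
    return 100.0 * passed / executed if executed else 0.0

trend = [(build, pass_rate(p, f, s)) for build, p, f, s in runs]

for build, rate in trend:
    print("%s  pass rate: %5.1f%%" % (build, rate))

# A simple deploy gate: promote only if the latest run is healthy and not regressing.
latest_build, latest_rate = trend[-1]
previous_rate = trend[-2][1]
ready = latest_rate >= 95.0 and latest_rate >= previous_rate
print("Promote %s to next environment? %s" % (latest_build, "yes" if ready else "no"))

The value of a tool like TTA is doing this aggregation continuously, across many products and test layers, and presenting it visually rather than as ad-hoc scripts.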

Slides:



Thursday, May 28, 2015

vodQA Pune - Innovations in Testing

vodQA Update - Agenda + Slides + Videos


Here is an update of the vodQA that went by at supersonic speed!


We had an intense and action-packed vodQA in ThoughtWorks, Pune on Saturday, 6th June 2015 - with the theme - Innovations in Testing!

Here are some highlights from the event:
  • You can find the details of the agenda + links to slides & videos from here or here.
  • After record-breaking attendee registrations (~500), we frantically closed off registrations. Based on historic attendance trends, this meant around 140-180 people would show up. 135 attendees made it to vodQA - the first person reaching the office at 8.30am, when the event was supposed to start at 10am! That is enthusiasm!
  • We had 45+ speaker submissions (and had to reject more because registrations had already closed). After speaking to all submitters, and a lot of dry-runs and feedback, we eventually selected 6 talks, 4 lightning talks, and 4 workshops from this massive list.
  • We were unfortunately able to select only 2 external speakers (purely based on content + relevance to the theme). One of these speakers travelled all the way from Ahmedabad to Pune on his own to deliver a Lightning Talk.
  • We had a few ThoughtWorkers travelling from Bangalore (2 speakers + 1 attendee) and 1 (speaker) from Gurgaon.
  • We had around 30-40 ThoughtWorkers participating in the conference. 
  • No event in the office can be possible without the amazing support from our Admin team + support staff!
  • Overall - we had around 200 people in the office on a Saturday!
  • For the first time, we did a live broadcast of all the talks + lightning talks (NO workshops). This allowed people to connect with vodQA as it happened. Also - usually the last and most cumbersome part of post-event processing - uploading videos - was now the first thing that was completed. See the videos here on YouTube. This update got delayed because we still have to get the link to the slides :(
  • We celebrated the 5th Birthday of vodQA!
  • Even though most projects in TW Pune are running at 120+% delivery speed, we were able to pull this off amazingly well! This can only happen when individuals believe in what they are contributing towards. Thank you all!
  • We wrapped up most of the post-event activities (office-cleanup, retro, post-vodQA dinner and now this update email) within 5 days of the vodQA day - another record by itself!
  • Some pictures are attached with this email.
You can see the tweets and comments in the vodQA group on facebook

Again, A HUGE THANKS to ALL those who participated in any way!

On behalf of the vodQA team + all the volunteers!










-----------------------------------------------------------------------------------------------------------------------------------------

[UPDATE]

Detailed agenda, with expected learnings and speaker information, is available here (http://vodqa-pune.weebly.com/agenda.html) for vodQA Pune - Innovations in Testing.

NOTE:
- Each workshop has a limited # of seats.
- Registration for workshops will be done at the Attendee Registration Desk between 9am-10am on vodQA day.
- Registration will be on a first-come, first-choice basis.
- See each talk / workshop details (below) for pre-requisites, if any.


----------------

vodQA is back in ThoughtWorks, Pune on Saturday, 6th June 2015. This time the theme is - "Innovations in Testing".

We got a record number of submissions from aspiring speakers and a HUGE number of attendee registrations. Selecting 12-14 talks from this list was no small task - we had to make a lot of tough decisions.

The agenda is now published (see here - http://vodqa-pune.weebly.com/agenda.html) and we are looking forward to a rocking vodQA!

Monday, May 11, 2015

vodQA Geek Night in ThoughtWorks, Hyderabad - Client-side Performance Testing Workshop

I am conducting a workshop on "Client-side Performance Testing" in vodQA Geek Night, ThoughtWorks, Hyderabad from 6.30pm-8pm IST on Thursday, 14th May, 2015.

Visit this page to register!

Abstract of the workshop:

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.

Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity - using WebPageTest (private instance setup), and experiment with yslow - as a low-cost, programmatic alternative to WebPageTest.

Venue:

ThoughtWorks Technologies (India) Pvt Ltd.
3rd Floor, Apurupa Silpi,
Beside H.P. Petrol Bunk (KFC Building),
Gachibowli,
Hyderabad - 500032, India 

Saturday, April 25, 2015

Push the Envelope at vodQA, Bangalore

[UPDATED - Slides added]

Yet another vodQA begins today, Saturday, 25th April 2015 - this time at ThoughtWorks, Bangalore. The theme for this vodQA is - "Push the Envelope". The detailed agenda can be found here.


I conducted a workshop on "Client-side Performance Testing" in vodQA Bangalore. 


Abstract of the workshop:



In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.



Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity - using WebPageTest (private instance setup), and experiment with yslow - as a low-cost, programmatic alternative to WebPageTest.

Here are the slides used in the workshop:

Thursday, February 19, 2015

Experiences from webinar on "Build the 'right' regression suite using Behavior Driven Testing (BDT)"

I did a webinar on how to "Build the 'right' regression suite using Behavior Driven Testing (BDT)" for uTest Community Testers on 18th Feb 2015 (2pm ET).

The recording of the webinar is available here on the uTest site (http://university.utest.com/recorded-webinar-build-the-right-regression-suite-using-behavior-driven-testing-bdt/).

The slides I used in the webinar can be seen below, or are available on SlideShare.




Here are some of my experiences from the webinar:
  • It was very difficult to do this webinar - from a timing perspective. It was scheduled from 2-3pm ET (which meant it was 12.30-1.30am IST). I could feel the fatigue in my voice when I heard the recording. I just hope the attendees did not catch that, and that it did not affect the effective delivery of the content.
  • There were over 50 attendees in the webinar. Though I finished my content in about 38-40 minutes, the remaining 20 minutes were not sufficient to go through the questions. The questions themselves were very good, and thought-provoking for me.
  • A webinar is a great way to create content and deliver it without a break - as a study material / course content. The challenge and the pressure is on the speaker to ensure that the flow is proper, and the session is well planned and structured. Here, there are no opportunities to tweak the content on the fly based on attendee comments / questions / body language.
  • That said, I always find it much more challenging to do a webinar compared to a talk. Reason - in a talk, I can see the audience. This is a HUGE advantage. I can tell from their facial expressions and body language whether what I am saying makes sense. I can have many interactions with them to make them more involved in the content - and make the session about them, instead of me just talking. I can spend more time on certain content, while skipping over some - depending on their comfort levels.

Wednesday, January 28, 2015

vodQA Cocktail - early in 2015

As we get ready for Celebrating Selenium's 10 year journey in vodQA Hyderabad, ThoughtWorks Chennai is ready to take vodQA to the next level on Saturday, 21st February, 2015 with an interesting Cocktail of topics related to Software Testing.

Register here as a speaker for vodQA Chennai, or here as an attendee.

Wednesday, December 17, 2014

Testing in the Medical domain

I had the opportunity recently to do some testing, though for a very short time, in the Medical domain - something that I have always aspired to. I learnt a lot in this time and have gained a lot of appreciation for people working in such mission-critical domains. 

Some of these experiences have been published here as "A Humbling Experience in Oncology Treatment Testing" on ThoughtWorks Insights. Looking forward to your comments and feedback on the same.

Saturday, November 22, 2014

To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA) in AgilePune 2014

I spoke on the topic - "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" in Agile Pune, 2014.

The slides from the talk are available here, and the video is available here.



 

Below is some information about the content.


The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
In order for these organizations to understand the quality / health of their products at a quick glance, a team of people typically scrambles to collate and collect the information needed to get a sense of the quality of the products they support. All of this is done manually.


So in this fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment or not?


Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.

The next set of questions are:
  • How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
  • How can you know if the product is ready to go 'live'?
  • What is the health of your product portfolio at any point in time?
  • Can you identify patterns and do quick analysis of the test results over a period of time to help with root-cause analysis of issues, and make better decisions to improve the quality of your product(s)?
The current set of tools are limited and fail to give the holistic picture of quality and health, across the life-cycle of the products.
 

The solution - TTA - Test Trend Analyzer
 
TTA is an open source product that becomes the source of information to give you real-time and visual insights into the health of the product portfolio using the Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.
 
There are 2 sets of audiences who will benefit from TTA:
1. Management - who want to know, in real time, the latest state of test execution trends across their product portfolios / projects. They can also use the data in the trend analysis views to make more informed decisions about which products / projects need more or less focus. Views like the Test Pyramid View and Comparative Analysis help in looking at results over a period of time, and using them as data points to identify trends.

 
2. Team Members (developers / testers) - who want to do quick test failure analysis to get to the root cause as quickly as possible. Some of the views - like Compare Runs, Failure Analysis, and Test Execution Trend - help the team on a day-to-day basis.
 
NOTE: TTA does not claim to give answers to the potential problems. It gives a visual representation of test execution results in different formats which allow team members / management to have more focussed conversations based on data points.

Some pictures from the talk ... (Thanks to Shirish)








Sunday, September 7, 2014

Perils of Page-Object Pattern

I spoke at Selenium Conference (SeConf 2014) in Bangalore on 5th September, 2014 on "The Perils of Page-Object Pattern".

The Page-Object pattern is very commonly used when implementing automation frameworks. However, as the scale of the framework grows, there is a limit to how much reusability really happens. It becomes inherently difficult to separate the test intent from the business domain.
 

I want to talk about this problem, and the solution I have been using - the Business Layer Page-Object pattern - which has helped me keep my code DRY.
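A hedged, minimal sketch of what such layering can look like (illustrative names and locators, not the code from the talk): page objects know only about the UI, a business layer composes them into domain operations, and the test speaks the business language.

from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """Page Object: knows only about UI structure (hypothetical locators)."""
    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "login").click()

class CartPage:
    """Another Page Object, for the cart screen."""
    def __init__(self, driver):
        self.driver = driver

    def add_item(self, item_name):
        self.driver.find_element(By.LINK_TEXT, item_name).click()
        self.driver.find_element(By.ID, "add-to-cart").click()

class ShoppingWorkflow:
    """Business layer: expresses intent in domain terms, reusing page objects."""
    def __init__(self, driver):
        self.driver = driver

    def buy_as_registered_user(self, username, password, item_name):
        self.driver.get("https://example.com/login")    # hypothetical URL
        LoginPage(self.driver).login(username, password)
        CartPage(self.driver).add_item(item_name)

if __name__ == "__main__":
    driver = webdriver.Chrome()                          # assumes chromedriver on PATH
    try:
        # The test reads as business intent, not as a sequence of UI interactions.
        ShoppingWorkflow(driver).buy_as_registered_user("demo-user", "secret", "Blue T-Shirt")
    finally:
        driver.quit()

Because tests depend only on the business layer, UI churn stays contained in the page objects and the business vocabulary stays reusable across tests.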

The slides from the talk are available here. The video is available here.

Video taken by a professional:


Video taken from my laptop:


Slides:




If you want to see other slides and videos from SeConf, see the SeConf schedule page.