Typically we automate the validation of functionality - using various options and approaches based on the Test Automation Pyramid. However, there are various reasons why one may want to automate the Look & Feel of their application (web / native).

This post is not about if or when we should automate the Look & Feel aspects of the product. This post assumes you have done the discussions, evaluations and validations required to reach the decision that, yes, we need to automate the Look & Feel of the product under test. The next question is - how? What are the alternatives for accomplishing this type of automation?

If you need basic validation, there is an open-source alternative - you can do Visual Testing and Validations using PhantomCSS. Vishnu & Shridhar spoke about this in vodQA Pune - Innovations in Testing. See the slides and videos from their awesome session.

If you need more flexibility and functionality from the Look & Feel automation, and want to do this cross-browser / on multiple devices, then I recommend you look at the Applitools Eyes product. I have seen a couple of demos from Moshe and Adam about this product - and they were kind enough to record one of the demos and allow me to share it with all - you can see the demo here on YouTube.
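At its core, visual testing as done by tools like PhantomCSS comes down to: capture a baseline screenshot, compare later captures against it, and fail when the difference crosses a tolerance. Here is a minimal Python sketch of that idea - screenshots are modeled as simple pixel grids purely for illustration; real tools compare rendered PNGs:

```python
def pixel_diff_ratio(baseline, current):
    """Fraction of pixels that differ between two equally-sized screenshots.

    Screenshots are modeled as lists of rows of pixel values; real
    visual-testing tools diff actual rendered images instead.
    """
    if len(baseline) != len(current) or len(baseline[0]) != len(current[0]):
        raise ValueError("screenshot dimensions differ")
    total = len(baseline) * len(baseline[0])
    differing = sum(
        1
        for row_b, row_c in zip(baseline, current)
        for px_b, px_c in zip(row_b, row_c)
        if px_b != px_c
    )
    return differing / total


def looks_the_same(baseline, current, tolerance=0.05):
    """Pass the visual check if at most `tolerance` of the pixels changed."""
    return pixel_diff_ratio(baseline, current) <= tolerance


# A 2x2 "screenshot" with one changed pixel: a 25% difference.
baseline = [[0, 0], [0, 0]]
current = [[0, 0], [0, 1]]
print(pixel_diff_ratio(baseline, current))  # 0.25
print(looks_the_same(baseline, current))    # False - 0.25 exceeds the 0.05 tolerance
```

The interesting decisions in real tools are all about that tolerance: anti-aliasing, font rendering and dynamic content make exact pixel equality too strict in practice.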
Applitools has a free account for individuals, with, of course, certain limitations on usage (not functionality). You can sign-up for your account here and try it out.
I am conducting a Client-side Performance Testing workshop in TechJam on Thursday, 13th August 2015. You can register for the same from the TechJam page.
Abstract
In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.
Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing of this activity - using WebPageTest (private instance setup), and experiment with YSlow as a low-cost, programmatic alternative to WebPageTest.
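Once a tool like YSlow or WebPageTest can emit machine-readable results, the automation step usually boils down to checking those numbers against a performance budget in CI. A minimal Python sketch of such a check - the result fields and thresholds below are illustrative stand-ins, not any tool's actual output schema:

```python
# A hedged sketch of a client-side performance budget check. The budget
# values and the shape of `sample_result` are hypothetical; a real script
# would parse the JSON emitted by your measurement tool.

PERFORMANCE_BUDGET = {
    "overall_score": 80,     # minimum acceptable grade (0-100)
    "page_weight_kb": 1024,  # maximum total page weight
    "http_requests": 60,     # maximum number of HTTP requests
}


def check_budget(result, budget=PERFORMANCE_BUDGET):
    """Return a list of budget violations; an empty list means the page passes."""
    violations = []
    if result["overall_score"] < budget["overall_score"]:
        violations.append(
            "score %d is below the budget of %d"
            % (result["overall_score"], budget["overall_score"])
        )
    if result["page_weight_kb"] > budget["page_weight_kb"]:
        violations.append(
            "page weight %dKB exceeds the budget of %dKB"
            % (result["page_weight_kb"], budget["page_weight_kb"])
        )
    if result["http_requests"] > budget["http_requests"]:
        violations.append(
            "%d HTTP requests exceed the budget of %d"
            % (result["http_requests"], budget["http_requests"])
        )
    return violations


# Hand-made stand-in for one measurement run.
sample_result = {"overall_score": 72, "page_weight_kb": 1400, "http_requests": 45}
for violation in check_budget(sample_result):
    print(violation)
```

Wired into a CI job, a non-empty violation list fails the build - which is what turns measuring client-side performance into testing it.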
Expected Learnings
What Performance Testing and Performance Engineering are.
Hands-on experience with some open-source tools to monitor, measure and automate Client-side Performance Testing.
Examples / code walk-throughs of some ways to automate Client-side Performance Testing.
Prerequisites
Participants are required to bring their own laptop for this workshop.
[UPDATE] I think I will include some interesting Innovation Games to demonstrate the need for collaboration and more effective ways of working. Or it could be something more specific to Testing on Agile projects. So many options ...

---------------------------------

As part of Unicom's World Conference Next Generation Testing, I am doing a 1-day workshop on "Agile Testing" on Friday, 24th July 2015 in Bangalore. You can register for the workshop from here, or contact me for more information. Below are the details of the workshop.
Agile Testing
Abstract
The Agile Manifesto was published in 2001. It took the software industry a good few years to truly understand what the manifesto means, and the principles behind it. However, choosing and implementing the right set of practices to get the true value from working the Agile way has been the biggest challenge for most!
While Agile has now gone mainstream, and we are getting better at the development practices of being Agile, Testing has still been lagging behind in most cases. A lot of teams are still working in a staggered fashion - with testing following after development completes.
In this workshop, we will learn and share various principles and practices which teams should adopt to be successful in testing on Agile projects.
Agenda
What is Agile testing? - Learn what it means to test on Agile projects
Effective strategies for Distributed Testing - Learn practices that help bridge the Distributed Testing gap!
Test Automation in Agile Projects - Why? What? How? - Why is Test Automation important, and how do we implement a good, robust, scalable and maintainable Test Automation framework?
Build the "right" regression suite using Behavior Driven Testing (BDT) - Behavior Driven Testing (BDT) is an evolved way of thinking about Testing. It helps in identifying the 'correct' scenarios, in the form of user journeys, to build a good and effective (manual & automated) regression suite that validates the Business Goals.
Key Learnings for participants in this workshop
Understand the Agile Testing Manifesto
Learn the essential Testing practices and activities teams need to adopt for an Agile way of working
Discover techniques to do effective testing in distributed teams
Find out how Automation plays a crucial role in Agile projects
Learn how to build a good, robust, scalable and maintainable Functional Automation framework
Learn, by practice, how to identify the right types of tests to automate as UI functional tests - to get quick and effective feedback
Pre-requisites
At least a basic working knowledge and understanding of Agile
I had the opportunity to speak in Discuss Agile Delhi 2015 on 2 topics. Here are the details on the same:
Enabling Continuous Delivery (CD) in Enterprises with Testing
Abstract:
The key objective of any organization is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
There are various practices that organizations need to implement to enable CD. Changes in requirements (a reality in all projects) need to be managed better. Along with this, processes and practices need to be tuned based on the team's capability, skills and distribution.
To Deploy, or Not to Deploy? Decide using Test Trend Analyzer (TTA)
Abstract:
The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
In order for these organizations to understand the quality / health of their products at a quick glance, typically a team of people scrambles to manually collate and collect the information needed to get a sense of the quality of the products they support.
So in a fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide if the product is ready to be deployed to the next environment or not?
Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.
The next set of questions are:
• How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
• How can you know if the product is ready to go 'live'?
• What is the health of your product portfolio at any point in time?
• Can you identify patterns and do quick analysis of the test results to help in root-cause analysis for issues that have happened over a period of time, to make better decisions to improve the quality of your product(s)?
The current set of tools is limited and fails to give a holistic picture of quality and health across the life-cycle of the products.
The solution - TTA - Test Trend Analyzer
TTA is an open-source product that becomes the source of information giving you real-time, visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.
Here is an update of the vodQA that went by at supersonic speed!
We had an intense and action-packed vodQA in ThoughtWorks, Pune on Saturday, 6th June 2015 - with the theme - Innovations in Testing!
Here are some highlights from the event:
You can find the details of the agenda + links to slides & videos from here or here.
After record-breaking attendee registrations (~500), we frantically closed off registrations. This meant around 140-180 people would show up based on historic attendance trends. 135 attendees made it to vodQA - the first person reached the office at 8.30am, when the event was only supposed to start at 10am! That is enthusiasm!
We had 45+ speaker submissions (and we had to reject more submissions because the registrations had already closed). After speaking to all submitters, and a lot of dry-runs and feedback, we eventually selected 6 talks, 4 lightning talks and 4 workshops from this massive list.
We were unfortunately able to select only 2 external speakers (but it was purely based on the content + relevance to the theme). One of these speakers travelled all the way from Ahmedabad to Pune on his own to deliver a Lightning Talk.
We had a few ThoughtWorkers travelling from Bangalore (2 speakers + 1 attendee) and 1 (speaker) from Gurgaon.
We had around 30-40 ThoughtWorkers participating in the conference.
No event in the office can be possible without the amazing support from our Admin team + support staff!
Overall - we had around 200 people in the office on a Saturday!
For the first time, we did a live broadcast of all the talks + lightning talks (NO workshops). This allowed people to connect with vodQA as it happened. Also - what is usually the last and most cumbersome post-event task - uploading videos - was the first thing completed this time. See the videos here on YouTube. This update got delayed because we still have to get the link to the slides :(
We celebrated the 5th Birthday of vodQA!
Even though most projects in TW Pune are running at 120+% delivery speed, we were able to pull this off amazingly well! This can only happen when individuals believe in what they are contributing towards. Thank you all!
We wrapped up most of the post-event activities (office-cleanup, retro, post-vodQA dinner and now this update email) within 5 days of the vodQA day - another record by itself!
Detailed agenda, with expected learnings and speaker information, available here (http://vodqa-pune.weebly.com/agenda.html) for vodQA Pune - Innovations in Testing.
NOTE:
- Each workshop has a limited # of seats.
- Registration for workshops will be done at the Attendee Registration Desk between 9am-10am on vodQA day.
- Registration will be on a first-come-first-choice basis.
- See each talk / workshop's details (below) for pre-requisites, if any.
---------------- vodQA is back in ThoughtWorks, Pune on Saturday, 6th June 2015. This time the theme is - "Innovations in Testing". We got a record number of submissions from wannabe speakers and a HUGE number of attendee registrations. Selecting 12-14 talks from this list was no small task - we had to take a lot of tough decisions. The agenda is now published (see here - http://vodqa-pune.weebly.com/agenda.html) and we are looking forward to a very rocking vodQA!
I spoke in a conference recently on "What is Agile Testing? How does Automation help?"
Abstract
Agile Methodology is not new. Many organisations / teams have already adopted the Agile way of Software Development or are on the enablement journey for the same.
What does this mean for Testing? There is no doubt that the Testing approach and mindset also needs to change to be in tune with the Agile Development methodology.
Learn what it means to test on Agile projects. Also, learn how the Test Automation approach needs to change for the team to be successful!
As part of this conference, I also did an interview with Saket Bansal and Atulya Mishra on - The Role of Automation in Testing.
This was an interesting, virtual interview - where interested people had asked questions during registration, and also a lot of questions came up during the interview.
Below is the video recording of the interview.
I also referenced some slides when speaking about some specific topics. Those can be seen below, or directly from slideshare.
I am conducting a workshop on "Client-side Performance Testing" in vodQA Geek Night, ThoughtWorks, Hyderabad from 6.30pm-8pm IST on Thursday, 14th May, 2015. Visit this page to register! Abstract of the workshop:
In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.
Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing of this activity - using WebPageTest (private instance setup), and experiment with YSlow as a low-cost, programmatic alternative to WebPageTest.
Venue:
ThoughtWorks Technologies (India) Pvt Ltd. 3rd Floor, Apurupa Silpi, Beside H.P. Petrol Bunk (KFC Building), Gachibowli, Hyderabad - 500032, India
[UPDATED - Slides added]
Yet another vodQA begins today, Saturday, 25th April 2015 - this time at ThoughtWorks, Bangalore. The theme for this vodQA is - "Push the Envelope". The detail agenda can be found here.
I conducted a workshop on "Client-side Performance Testing" in vodQA Bangalore. Abstract of the workshop:
In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.
Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing of this activity - using WebPageTest (private instance setup), and experiment with YSlow as a low-cost, programmatic alternative to WebPageTest.
I have been very busy of late .... and am enjoying it too! I am learning and doing a lot of interesting things in the Performance Testing / Engineering domain. I had no idea there are so many types of caching, and that there would be a need to do various different types of Monitoring for availability, client-side performance testing, Real User Monitoring, Server-side load testing and more ... it is a lot of fun being part of this aspect of Testing. That said, I am equally excited about 2 talks coming up at the end of March 2015:
The key objective of Organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality! In such a fast-moving environment, CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury!
There are various practices that Organizations and Enterprises need to implement to enable CD. Testing (automation) is one of the important practices that needs to be setup correctly for CD to be successful.
Testing in Organizations on the CD journey is tricky and requires a lot of discipline, rigor and hard work. In Enterprises, the Testing complexity and challenges increase exponentially.
In this session, I am sharing my vision of the Test Strategy required to make an Enterprise's journey on the path of implementing CD successful.
Behavior Driven Testing (BDT) is a way of thinking. It helps in identifying the 'correct' scenarios, in the form of user journeys, to build a good and effective (manual & automated) regression suite that validates the Business Goals. We will learn about BDT, do some hands-on exercises in the form of workshops to understand the concept better, and also touch upon some potential tools that can be used.
Learning outcomes
Understand Behavior Driven Testing (BDT)
Learn how to build a good and valuable regression suite for the product under test
Learn different styles of identifying / writing your scenarios that will validate the expected Business Functionality
See how automating the tests identified using the BDT approach effectively automates your Business Functionality
Understand the advantages of identifying Regression tests using the BDT approach
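A BDT-identified regression test reads as a user journey expressed in business terms, not as a pile of isolated UI actions. Here is a minimal Python sketch of that style - the journey class and its step names are hypothetical; in a real suite each step would drive the UI (or API) through your automation framework:

```python
class ShopperJourney:
    """Hypothetical business-level steps for an online-shopping journey.
    In a real suite each step would delegate to your automation framework
    instead of mutating plain Python state."""

    def __init__(self):
        self.cart = []
        self.order_placed = False

    def searches_for(self, product):
        self.last_search = product
        return self  # returning self lets steps chain into one journey

    def adds_to_cart(self, product):
        self.cart.append(product)
        return self

    def checks_out(self):
        if self.cart:
            self.order_placed = True
        return self


def test_guest_can_buy_a_book():
    # One scenario = one user journey validating a business goal,
    # not a collection of isolated page interactions.
    shopper = ShopperJourney()
    shopper.searches_for("Agile Testing").adds_to_cart("Agile Testing").checks_out()
    assert shopper.order_placed


test_guest_can_buy_a_book()
```

The point of the style is that the scenario names and steps stay meaningful to the business even as the underlying pages and locators change.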
It was very difficult to do this webinar - from a timing perspective. It was scheduled from 2-3pm ET (which meant it was 12.30-1.30am IST). I could feel the fatigue in my voice when I heard the recording. I just hope the attendees did not catch that, and that it did not affect the effective delivery of the content.
There were over 50 attendees in the webinar. Though I finished my content in about 38-40 minutes, the remaining 20 minutes were not sufficient to go through the questions. The questions themselves were very good, and thought-provoking for me.
A webinar is a great way to create content and deliver it without a break - as study material / course content. The challenge and the pressure is on the speaker to ensure that the flow is proper, and the session is well planned and structured. Here, there are no opportunities to tweak the content on the fly based on attendee comments / questions / body language.
That said, I always find it much more challenging to do a webinar compared to a talk. Reason - in a talk, I can see the audience. This is a HUGE advantage. I can understand from their facial expressions, body language if what I am saying makes sense or not. I can have many interactions with them to make them more involved in the content - and make the session about them, instead of me just talking. I can spend more time on certain content, while skipping over some - depending on their comfort levels.
ThoughtWorks, Hyderabad is proud to host its first vodQA - also the first vodQA of 2015 - and to kick off the 10 Years of Selenium celebration. This event will be held on Saturday, 31st Jan 2015. Look at the agenda of this vodQA and register soon. Given that we have mostly workshops in this vodQA, seats are going to be limited! Here is the address and directions to the ThoughtWorks office.
UPDATE:
Slides for my talk on the "Future of Testing, Test Automation and the Quality Analyst" are now available here:
I had the opportunity recently to do some testing, though for a very short time, in the Medical domain - something that I have always aspired to. I learnt a lot in this time and have gained a lot of appreciation for people working in such mission-critical domains. Some of these experiences have been published as "A Humbling Experience in Oncology Treatment Testing" on ThoughtWorks Insights. Looking forward to your comments and feedback on the same.
I spoke on the topic - "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" in Agile Pune, 2014. The slides from the talk are available here, and the video is available here.
Below is some information about the content.
The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!

In order for these organizations to understand the quality / health of their products at a quick glance, typically a team of people scrambles to manually collate and collect the information needed to get a sense of the quality of the products they support.

So in a fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide if the product is ready to be deployed to the next environment or not?

Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.

The next set of questions are:
• How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
• How can you know if the product is ready to go 'live'?
• What is the health of your product portfolio at any point in time?
• Can you identify patterns and do quick analysis of the test results to help in root-cause analysis for issues that have happened over a period of time, to make better decisions to improve the quality of your product(s)?

The current set of tools is limited and fails to give a holistic picture of quality and health across the life-cycle of the products.

The solution - TTA - Test Trend Analyzer

TTA is an open-source product that becomes the source of information giving you real-time, visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on promoting the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

There are 2 sets of audience who will benefit from TTA:
1. Management - who want to know in real time the latest state of test execution trends across their product portfolios / projects. They can also use the data represented in the trend analysis views to make more informed decisions on which products / projects need more or less focus. Views like the Test Pyramid View and Comparative Analysis help in looking at results over a period of time, and using that as a data point to identify trends.
2. Team Members (developers / testers) - who want to do quick test failure analysis to get to the root cause as quickly as possible. Some of the views - like Compare Runs, Failure Analysis and Test Execution Trend - help the team on a day-to-day basis.

NOTE: TTA does not claim to give answers to the potential problems. It gives a visual representation of test execution results in different formats, which allows team members / management to have more focussed conversations based on data points.
Some pictures from the talk ... (Thanks to Shirish)
Selenium has been around for over a decade now. ThoughtWorks has published an eBook on the occasion - titled - "Perspectives on Agile Software Testing". This eBook is available for free download. I have written a chapter in the eBook - "Is Selenium Finely Aged Wine?" An excerpt of this chapter is also published as a blog post on utest.com. You can find that here.
Inspired by Selenium's 10th Birthday Celebration, a bunch of ThoughtWorkers have compiled an anthology of essays on testing approaches, tools and culture by testers, for testers. This anthology of essays is available as an ebook, titled "Perspectives on Agile Software Testing", which is now available for download from here on the ThoughtWorks site. A simple registration, and you will be able to download the ebook. Here are the contents of the ebook:
Enjoy the read, and looking forward to your feedback.
[UPDATE] - The event was a great success - despite the rain gods trying to dissuade participants from joining in. For those who missed it, or for those who want to revisit the talks, the videos have been uploaded and are available here on YouTube.

[UPDATE] - Latest count - >350 interested attendees for listening to speakers delivering 6 talks, 3 lightning talks and attending 3 workshops. Not to forget the fun and networking with a highly charged audience at the ThoughtWorks, Pune office. Be there, or be left out! :)
I am very happy to write that the next vodQA is scheduled in ThoughtWorks, Pune on 15th November 2014. The theme this time around is "Breaking Boundaries". You can register as an attendee here, and register as a speaker here. You can submit more than one topic for speaker registration - just email vodqa-pune@thoughtworks.com with details on the topics.
I spoke at Selenium Conference (SeConf 2014) in Bangalore on 5th September, 2014 on "The Perils of Page-Object Pattern". Page-Object pattern is very commonly used when implementing Automation frameworks. However, as the scale of the framework grows, there is a limitation on how much re-usability really happens. It inherently becomes very difficult to separate the test intent from the business domain.
I talked about this problem, and the solution I have been using - the Business Layer - Page-Object pattern, which has helped me keep my code DRY. The slides from the talk are available here. The video is available here.
Video taken by a professional:
Video taken from my laptop:
Slides:
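The shape of the Business Layer - Page-Object split can be sketched as follows. This is an illustrative Python sketch, not code from the talk: the class and method names are hypothetical, and a plain list stands in for a real browser driver. Page objects know only about their page; the business layer composes them into domain-level actions that tests call:

```python
class LoginPage:
    """Page object: knows only the login page's structure,
    nothing about the business domain."""

    def __init__(self, driver):
        self.driver = driver  # a plain list standing in for a real driver

    def log_in(self, user, password):
        self.driver.append(("login", user))
        return HomePage(self.driver)  # navigation returns the next page object


class HomePage:
    """Page object for the landing page after login."""

    def __init__(self, driver):
        self.driver = driver

    def open_orders(self):
        self.driver.append(("open_orders",))
        return self


class OrderWorkflow:
    """Business layer: expresses domain intent ("view my orders") by
    composing page objects, so tests never touch pages directly."""

    def __init__(self, driver):
        self.driver = driver

    def view_my_orders(self, user, password):
        home = LoginPage(self.driver).log_in(user, password)
        home.open_orders()
        return self.driver
```

A test then reads `OrderWorkflow(driver).view_my_orders("pat", "secret")`: if a page's markup changes, only that page object changes; if the business flow changes, only the workflow changes - which is where the reuse the talk describes comes from.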
I started this talk by stating that I am going to prove that "A Triangle = A Pentagon".
A Triangle == A Pentagon??
I am happy to say that I was able to prove that "A Triangle IS A Pentagon" - in fact, I left reasonable doubt in the audience's mind that "A Triangle CAN BE an n-dimensional Polygon". Confused? How is this related to Continuous Delivery (CD), or Testing? See the slides and the video from the talk to know more.
This topic is also available on ThoughtWorks Insights. Below are some pictures from the conference.