
Tuesday, March 19, 2019

Collaboration - A Taboo!

At Agile India 2019 in Bangalore, as part of the Agile Mindset theme, I played a tweaked version of the Taboo game, turning it into a game about collaboration.

Abstract: 

When one has fun at work, work becomes fun. However, daily pressures, metrics, KPIs and what not have dissolved the fun and made work drudgery in various ways.

This creates stress for individuals; within and across teams there is mistrust, unnecessary competition, blame, finger-pointing ...

What better way to learn, and re-learn, the basics of life, work and team-work than to play a game, have fun, and see how life and work should indeed be treated as a game, enjoying the journey along the way? Only then can people truly succeed, and so can organisations.

Here, we will play a game – “Collaboration - A Taboo!” – where you will 

  • Re-learn collaboration techniques via a game! 
  • Take away learnings applicable to individuals and teams, in small or big organisations
  • Re-live your childhood while playing this game

Be prepared for a twist which will leave you thinking!

Slides:



Thursday, February 14, 2019

Talks and workshops in Agile India 2019


In the upcoming Agile India 2019 in Bangalore, I will be speaking about:






If you have not yet registered, you can use this code to get a discount on your registration - anand-10di$c-agile 

In addition, there are some great pre- and post-conference workshops as well. I will be participating in the "Facilitating for Effective Collaboration...One Nudge at a Time" workshop, conducted by Deborah Hartmann Preuss and Ellen Grove.


This is going to be one amazing conference to learn, network and share ideas and experiences. See you there!



Monday, February 11, 2019

Test Automation in the World of AI and ML

My article on "Test Automation in the World of AI & ML" was recently published on InfoQ.


Here are the key takeaways mentioned in the article -

  • There are many criteria to be considered before building a framework / selecting tools for Functional Test Automation
  • It is very important to prioritise the framework / tool capabilities needed for the software-under-test
  • A good, scalable Test Automation Framework that provides fast and reliable feedback to the team enables collaboration and CI/CD
  • Debugging / RCA (root cause analysis) and support for the libraries / tools used are an afterthought in most cases. Do not fall into that trap.
  • There are some promising commercial tools that fit seamlessly in the Agile way of working. Depending on the complete context, these tools may be a good choice over building your own framework for Functional Automation.

You can read the full article here.

Looking forward to comments on the same!



Friday, October 12, 2018

Conference season is here - talks, workshops, travelling, networking!

September & October 2018 make for a busy conference season for me.

On 27th September, I played a game - "Collaboration - A Taboo!" - at ATA GTR 2018 with an audience of 100+ people. There was absolute chaos in the game - a lot of it self-inflicted ... and thankfully, exactly what I wanted it to be. So much fun, energy and enthusiasm in the room meant no one was feeling drowsy in the post-lunch session! 

Typically I play this game over 45 minutes to an hour. At ATA GTR 2018, though, I had only 30 minutes to play the game and add my own twist on top of it. But never have I ever taken more than the allocated time - and I managed to achieve the objectives of the game within those 30 minutes as well.

Below are some pictures from the game.




Then on 28th September, I spoke on "Measuring Consumer Quality - The Missing Feedback Loop" at StepIn's PSTC 2018. Slides from that talk can be found here.

In October, I will be off to Agile & Automation Days in Krakow, Poland. Here I will be speaking about "Measuring Consumer Quality - The Missing Feedback Loop" and also conducting "Analytics Rebooted - A Workshop". See the detailed schedule here.

Then I fly directly to Arlington, VA to participate in STPCon Fall 2018. Here I will be conducting 2 workshops - "Analytics Rebooted - A Workshop" and "Practical Agile Testing Workshop". I am also speaking about "Measuring Consumer Quality - The Missing Feedback Loop".

Will share experiences from these conferences soon!


Monday, March 14, 2016

Protractor for Angular apps?

I have already asked these questions in the vodQA group on LinkedIn, but thought I would repeat them here as well, in case someone else reads this and has some thoughts.

I am experimenting (again) with Protractor for automation against Angular-based web-apps. This time around, my comfort with JavaScript is better (by a couple of percent more than before), so I am better prepared for this challenge.

That said, I am interested in knowing a few things on this:

  • Has anyone in the group worked with protractor recently? 
  • What has been your experiences in working with it? 
  • Which roles are involved in the automation implementation, execution and maintenance? 
  • What are the typical utilities you built in this framework?
  • How have you been modelling your page-object pattern in JS / Protractor-based frameworks? Or is there a better set of patterns for JS that should be used?
  • How did you build your page objects? How did you build and manage the composition / nesting of pages? Did a page's methods return the appropriate next page object? (See the sketch at the end of this list for the kind of structure I mean.)
  • How many tests exist in your framework? 
  • Do you run your tests in parallel?
  • Do your tests run in CI? If yes, which driver do you use? The Protractor site discourages the use of PhantomJS. 
  • Would it be possible to share some (non-confidential) examples of how you built your Page Objects? How are your specs written? Any example of that possible to see?
  • Did anyone manage to run their tests against Safari / IE11 as well?
  • What about soft asserts? Did you implement this?
  • I saw a strange issue when running my test against Chrome - I got an "element is not clickable at xxx coordinates" error. However, the same test ran fine against Firefox and PhantomJS. Has anyone seen this before?
  • Given that the Protractor site does not recommend the PhantomJS driver, has anyone used Xvfb for running tests in CI?
  • What reporters do you use?
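
Since several of the questions above are about page objects, here is a minimal sketch of the structure I have in mind - each action returns the page object the user lands on next, so specs read as journeys. The page names and selectors are made up for illustration, and it assumes the standard globals (browser, element, by, protractor) that the Protractor runner provides:

```javascript
// pages.js - illustrative only; page names and selectors are hypothetical.
// Assumes the Protractor globals (browser, element, by, protractor) set up by the runner.
const EC = protractor.ExpectedConditions;

class DashboardPage {
  constructor() {
    this.greeting = element(by.css('.greeting'));
  }
}

class LoginPage {
  constructor() {
    this.username = element(by.model('credentials.username'));
    this.password = element(by.model('credentials.password'));
    this.loginButton = element(by.buttonText('Log in'));
  }

  open() {
    browser.get('/login');
    return this;
  }

  // Each action returns the page the user lands on next,
  // so a spec can chain a journey across page objects.
  login(user, pass) {
    this.username.sendKeys(user);
    this.password.sendKeys(pass);
    // Waiting for clickability also helps with the intermittent
    // "element is not clickable at point (x, y)" failures seen on Chrome.
    browser.wait(EC.elementToBeClickable(this.loginButton), 5000);
    this.loginButton.click();
    return new DashboardPage();
  }
}

module.exports = { LoginPage, DashboardPage };
```

A spec could then chain a journey like new LoginPage().open().login('user', 'secret') and assert on the returned DashboardPage.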

Tuesday, January 12, 2016

The story of a 'small' vodQA ending up being 'x-large'

We are extremely happy to start the new year with a YASV (Yet Another Successful vodQA) event, this time with the theme "Agile Testing Workshop", conducted on 9th January 2016 at the ThoughtWorks Pune office.

Why the theme - "Agile Testing Workshop"?

Over the past few years, after having worked on numerous projects, interacted with a lot of clients (and their partners / vendors), and gained insights from speaking with individuals & teams at conferences & organizations, we (the vodQA Pune team) realized that a sizeable portion of the Software (testing) Industry lacks a good understanding of Agile and of effective Testing on Agile projects / teams.

So, we decided to conduct the next vodQA in Pune - focussed on Agile Testing to answer questions like - "What is Agile and what does it mean to Test on Agile projects / teams?"

Highlights

  • When we started planning for this edition of vodQA, the plan was to keep it very lean - in planning, execution and participation as well. For this, we planned to keep this vodQA 'small'. Little did we realize it would end up being a patiala peg.
  • What started out as an event aimed at 30 attendees soon shot up to 180+ RSVPs on Facebook, then 160+ confirmations, and eventually 85+ attendees on the day. Including ThoughtWorkers, we (again) crossed 100 people at vodQA Pune! There went a lot of our 'being lean' out of the window!
  • This event was completely driven by the Facebook group (from announcements to registrations to updates).
  • We had quite a few attendees travel from outside Pune for vodQA (e.g. Mumbai, Nagpur)
  • This was one of the most vocal, enthusiastic and interactive audiences vodQA Pune has seen. They shared their experiences and asked a lot of questions as well.
  • True to our objective for this vodQA, we ensured there was sufficient time between sessions / workshops to facilitate discussions and answer specific questions from the attendees.
  • We had impromptu fishbowl discussion on certain Parking Lot questions.
  • After the first session of the day (Agile Game), the attendees celebrated (it was over) by bursting the balloons - early Diwali some would say … :)
  • A huge shoutout to the organisers, who were constantly tweaking their execution methods in the days before the event as our expected turnout gradually rose from 30 to 100+.

Agenda and Slides

Topic | By | Slides
Welcome note | Anand Bagmar |
Agile Game | Abhay Dalvi, Vardhan Bhatt & Vikrant Chauhan |
Tea break | |
What is Agile Testing? | Amit Gundiyal & Prasad Kalgutkar | http://www.slideshare.net/vodqanite/what-is-agile-testing-56891493
Effective Strategies for Distributed Testing | Preeti Mishra | http://www.slideshare.net/vodqanite/strategies-for-distributed-testing
Lunch | |
Testing the Mysterious Sphere | Anjali Wadhwa, Ashwini Ingle & Preeti Mishra | http://www.slideshare.net/vodqanite/testing-the-mysterious-sphere
Break | |
Test Automation - Principles, Practices | Vardhan Bhatt & Vikrant Chauhan | http://www.slideshare.net/vodqanite/lessons-learnt-from-test-automation-principles-practices
Tea + Snacks break | |
Patterns in Test Automation (Framework + Data) | Anand Bagmar | http://www.slideshare.net/abagmar/patterns-in-test-automation

Feedback

  • Overall workshop was wonderful. Presentation and content was good. Helpful to understand and implement in our current process.
  • Agile testing game taught us to focus more on quality than quantity & take feedback as soon as possible from the PO
  • Though I am not working in Agile env currently, I understood whole session and got to learn something.

The always rocking vodQA Pune team!!
vodQA Pune team

Wednesday, October 14, 2015

Good Trends for TTA in DevOps Summit

I spoke at the DevOps Summit on 8th October in Bangalore on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

The conversations during and after this talk with veterans from across the Software Industry, spanning various domains, reinforced my belief that I need to spend more time taking TTA to the next level and making it a more robust, feature-rich product.

Below are the details of the talk:


Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over a period of time to make better decisions and improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables the team to get quick feedback about the health of the product-under-test.

However, in an organization having multiple products in its portfolio, how can you get the collated quality / health information from all the products, quickly and in real-time? Or, for a large program of work, which has various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get the consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say between the quality of last release Vs the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open source product.

TTA gives you real-time and visual insights into the health of the product portfolio using the Test Automation results. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

Slides from the talk


Video from the talk

 

 


Thursday, May 28, 2015

vodQA Pune - Innovations in Testing

vodQA Update - Agenda + Slides + Videos


Here is an update on the vodQA that went by at supersonic speed!


We had an intense and action-packed vodQA in ThoughtWorks, Pune on Saturday, 6th June 2015 - with the theme - Innovations in Testing!

Here are some highlights from the event:
  • You can find the details of the agenda + links to slides & videos from here or here.
  • After record-breaking attendee registrations (~500), we frantically closed off registrations. Based on historic attendance trends, this meant around 140-180 people would show up. 135 attendees made it to vodQA - the first person reached the office at 8.30am, when the event was only supposed to start at 10am! That is enthusiasm!
  • We had 45+ speaker submissions (and had to reject more submissions because the registrations had already closed). After speaking to all the submitters, and after a lot of dry-runs and feedback, we eventually selected 6 talks, 4 lightning talks and 4 workshops from this massive list.
  • We were unfortunately able to select only 2 external speakers (purely based on the content + relevance to the theme). One of these speakers travelled all the way from Ahmedabad to Pune on his own to deliver a Lightning Talk.
  • We had a few ThoughtWorkers travelling from Bangalore (2 speakers + 1 attendee) and 1 speaker from Gurgaon.
  • We had around 30-40 ThoughtWorkers participating in the conference. 
  • No event in the office can be possible without the amazing support from our Admin team + support staff!
  • Overall - we had around 200 people in the office on a Saturday!
  • For the first time, we did a live broadcast of all the talks + lightning talks (no workshops). This allowed people to connect with vodQA as it happened. Also, what is usually the last and most cumbersome part of post-event processing - uploading videos - was the first thing completed this time. See the videos here on YouTube. This update got delayed because we still have to get the links to the slides :(
  • We celebrated the 5th Birthday of vodQA!
  • Even though most projects in TW Pune are running at 120+% delivery speed, we were able to pull this off amazingly well! This can only happen when individuals believe in what they are contributing towards. Thank you all!
  • We wrapped up most of the post-event activities (office-cleanup, retro, post-vodQA dinner and now this update email) within 5 days of the vodQA day - another record by itself!
  • Some pictures are attached with this email.
You can see the tweets and comments in the vodQA group on facebook

Again, A HUGE THANKS to ALL those who participated in any way!

On behalf of the vodQA team + all the volunteers!










-----------------------------------------------------------------------------------------------------------------------------------------

[UPDATE]

The detailed agenda, with expected learnings and speaker information, is available here (http://vodqa-pune.weebly.com/agenda.html) for vodQA Pune - Innovations in Testing.

NOTE:
- Each workshop has a limited number of seats.
- Registration for workshops will be done at the Attendee Registration Desk between 9am and 10am on vodQA day.
- Registration will be on a first-come, first-choice basis.
- See each talk / workshop details (below) for pre-requisites, if any.


----------------

vodQA is back in ThoughtWorks, Pune on Saturday, 6th June 2015. This time the theme is - "Innovations in Testing".

We got a record number of submissions from wannabe speakers and a HUGE number of attendee registrations. Selecting 12-14 talks from this list was no small task, and we had to make a lot of tough decisions.

The agenda is now published (see here - http://vodqa-pune.weebly.com/agenda.html) and we are looking forward to a rocking vodQA!

Wednesday, May 20, 2015

What is Agile Testing? How does Automation help?

I spoke at a conference recently on "What is Agile Testing? How does Automation help?"

Abstract

Agile Methodology is not new. Many organisations / teams have already adopted the Agile way of Software Development or are on the enablement journey towards it. 

What does this mean for Testing? There is no doubt that the Testing approach and mindset also need to change to be in tune with the Agile Development methodology. 

Learn what it means to Test on Agile projects. Also, learn how the Test Automation approach needs to change for the team to be successful!

Video

Slides



Here I am on stage, in the Main Hall, in front of 150+ people, delivering the talk - What is Agile Testing? How does Automation help?:



Tuesday, May 19, 2015

Role of Automation in Testing

I am speaking at the Discuss Agile 2015 conference on 13-14 June 2015 on the following topics - 

As part of this conference, I also did an interview with Saket Bansal and Atulya Mishra on - The Role of Automation in Testing.

This was an interesting virtual interview - interested people had asked questions during registration, and a lot more questions came up during the interview itself.

Below is the video recording of the interview. 


I also referenced some slides when speaking about some specific topics. Those can be seen below, or directly from slideshare.




Saturday, April 25, 2015

Push the Envelope at vodQA, Bangalore

[UPDATED - Slides added]

Yet another vodQA begins today, Saturday, 25th April 2015 - this time at ThoughtWorks, Bangalore. The theme for this vodQA is "Push the Envelope". The detailed agenda can be found here.


I conducted a workshop on "Client-side Performance Testing" in vodQA Bangalore. 


Abstract of the workshop:



In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance and how to put that in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.



Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity using WebPageTest (with a private instance setup), and experiment with yslow as a low-cost, programmatic alternative to WebPageTest.
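
As a taste of what that automation can look like, below is a rough sketch of kicking off a WebPageTest run from a Node.js script against its REST endpoint (runtest.php with f=json). The response field names are from memory and should be verified against your WebPageTest version, the API key environment variable is hypothetical, and a private instance would simply use its own host:

```javascript
// wpt-smoke.js - a rough, illustrative sketch; not production code.
// Assumes the WebPageTest REST API (runtest.php with f=json) and an API key for the public instance.
const https = require('https');

const WPT_HOST = 'www.webpagetest.org';      // or the host of a private WebPageTest instance
const API_KEY = process.env.WPT_API_KEY;     // hypothetical environment variable
const TARGET_URL = 'https://example.com/';   // page whose client-side performance we want to measure

function getJson(requestPath) {
  return new Promise((resolve, reject) => {
    https.get({ host: WPT_HOST, path: requestPath }, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => {
        try { resolve(JSON.parse(body)); } catch (e) { reject(e); }
      });
    }).on('error', reject);
  });
}

// Submit the test; polling the returned results URL (not shown here) would give
// metrics such as load time, speed index and first/repeat view differences (caching!).
getJson('/runtest.php?url=' + encodeURIComponent(TARGET_URL) + '&k=' + API_KEY + '&f=json')
  .then((response) => {
    console.log('Submitted WebPageTest run:', response.data.testId);
    console.log('Poll JSON results at:', response.data.jsonUrl);
  })
  .catch((err) => console.error('WebPageTest submission failed:', err));
```

Polling for the results with the returned test id (not shown) gives the first-view / repeat-view numbers that make the caching discussion concrete.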

Here are the slides used in the workshop:

Thursday, March 19, 2015

Enabling CD & BDT in March 2015

I have been very busy of late ... and am enjoying it too! I am learning and doing a lot of interesting things in the Performance Testing / Engineering domain. I had no idea there were so many types of caching, or that there would be a need for so many different types of monitoring: availability, client-side performance testing, Real User Monitoring, server-side load testing and more ... it is a lot of fun being part of this aspect of Testing.

That said, I am equally excited about 2 talks coming up at the end of March 2015:

Enabling CD (Continuous Delivery) in Enterprises with Testing 

- at Agile India 2015, on Friday, 27th March 2015 in Bangalore.


Abstract

The key objective of Organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
In such a fast-moving environment, CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury!

There are various practices that Organizations and Enterprises need to implement to enable CD. Testing (automation) is one of the important practices that needs to be setup correctly for CD to be successful.

Testing in Organizations on the CD journey is tricky and requires a lot of discipline, rigor and hard work. In Enterprises, the Testing complexity and challenges increase exponentially.

In this session, I am sharing my vision of the Test Strategy required to make an Enterprise's journey of implementing CD successful.



Build the 'right' regression suite using Behavior Driven Testing (BDT) - a Workshop

- at vodQA Gurgaon, on Saturday, 28th March 2015 at ThoughtWorks, Gurgaon.


Abstract

Behavior Driven Testing (BDT) is a way of thinking. It helps in identifying the 'correct' scenarios, in the form of user journeys, to build a good and effective (manual & automated) regression suite that validates the Business Goals. We will learn about BDT, do some hands-on exercises in the form of workshops to understand the concept better, and also touch upon some potential tools that can be used (see the illustrative sketch after the learning outcomes below).

Learning outcomes

  • Understand Behavior Driven Testing (BDT)
  • Learn how to build a good and valuable regression suite for the product under test
  • Learn different styles of identifying / writing scenarios that will validate the expected Business Functionality
  • See how automating the tests identified using the BDT approach automates your Business Functionality checks
  • Understand the advantages of identifying Regression tests using the BDT approach
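
As a small illustration of the 'user journey' idea referenced above: a journey-level regression scenario covers an end-to-end business flow rather than many fine-grained UI checks. The sketch below shows what step definitions for such a scenario might look like with cucumber-js; the feature wording, step text and the @cucumber/cucumber package are my assumptions here, not material from the workshop:

```javascript
// journey.steps.js - illustrative only; assumes the @cucumber/cucumber package.
// A BDT-style scenario describes one business journey, e.g.:
//   Scenario: A new customer searches for a book, buys it, and gets an order confirmation
// instead of separate low-level scenarios (login works, search works, cart works, ...).
const assert = require('assert');
const { Given, When, Then } = require('@cucumber/cucumber');

Given('a new customer on the home page', function () {
  this.cart = [];                 // in a real suite this would launch and drive a browser
});

When('she searches for {string} and buys the first result', function (searchTerm) {
  this.cart.push(searchTerm);     // placeholder for search + add-to-cart + checkout steps
});

Then('she receives an order confirmation', function () {
  assert.strictEqual(this.cart.length, 1);
});
```

The point is the scope of the scenario - one business journey - rather than the tooling; the same idea applies whether the steps drive a browser, an API, or both.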

Thursday, February 19, 2015

Experiences from webinar on "Build the 'right' regression suite using Behavior Driven Testing (BDT)"

I did a webinar on how to "Build the 'right' regression suite using Behavior Driven Testing (BDT)" for uTest Community Testers on 18th Feb 2015 (2pm ET).

The recording of the webinar is available here on the uTest site (http://university.utest.com/recorded-webinar-build-the-right-regression-suite-using-behavior-driven-testing-bdt/).

The slides I used in the webinar can be seen below, or are available on SlideShare.




Here are some of my experiences from the webinar:
  • It was very difficult to do this webinar - from a timing perspective. It was scheduled from 2-3pm ET (which meant it was 12.30-1.30am IST). I could feel the fatigue in my voice when I heard the recording. I just hope the attendees did not catch that, and that it did not affect the effective delivery of the content.
  • There were over 50 attendees in the webinar. Though I finished my content in about 38-40 minutes, the remaining 20 minutes were not sufficient to go through the questions. The questions themselves were very good, and thought-provoking for me.
  • A webinar is a great way to create content and deliver it without a break - as a study material / course content. The challenge and the pressure is on the speaker to ensure that the flow is proper, and the session is well planned and structured. Here, there are no opportunities to tweak the content on the fly based on attendee comments / questions / body language.
  • That said, I always find it much more challenging to do a webinar compared to a talk. The reason: in a talk, I can see the audience. This is a HUGE advantage. I can tell from their facial expressions and body language whether what I am saying makes sense or not. I can have many interactions with them to make them more involved in the content - and make the session about them, instead of me just talking. I can spend more time on certain content, while skipping over some - depending on their comfort levels. 

Wednesday, January 28, 2015

vodQA Cocktail - early in 2015

As we get ready to celebrate Selenium's 10-year journey at vodQA Hyderabad, ThoughtWorks Chennai is ready to take vodQA to the next level on Saturday, 21st February 2015, with an interesting Cocktail of topics related to Software Testing.

Register here as a speaker for vodQA Chennai, or here as an attendee.

Wednesday, December 24, 2014

Disruptive Testing with Julian Harty

As part of the Disruptive Testing series, the last interview of 2014 - with Julian Harty - is now available here (http://www.thoughtworks.com/insights/blog/disruptive-testing-part-8-julian-harty) as a video interview. The transcript has also been published.

Also look at ThoughtWorks Insights for other great articles on a variety of topics and themes.

Wednesday, December 17, 2014

Testing in the Medical domain

I had the opportunity recently to do some testing, though for a very short time, in the Medical domain - something that I have always aspired to. I learnt a lot in this time and have gained a lot of appreciation for people working in such mission-critical domains. 

Some of these experiences have been published here as "A Humbling Experience in Oncology Treatment Testing" on ThoughtWorks Insights. Looking forward to your comments and feedback on the same.

Saturday, November 22, 2014

To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA) in AgilePune 2014

I spoke on the topic "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" at Agile Pune 2014.

The slides from the talk are available here, and the video is available here.



 

Below is some information about the content.


The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
In order for these organizations to understand the quality / health of their products at a quick glance, a team of people typically scrambles to collate and collect the information needed to get a sense of the quality of the products they support. All of this is done manually.


So in a fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment or not?


Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.

The next set of questions are:
  • How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
  • How can you know if the product is ready to go 'live'?
  • What is the health of your product portfolio at any point in time?
  • Can you identify patterns and quickly analyse test results to help with root-cause analysis of issues over a period of time, and make better decisions to improve the quality of your product(s)?
The current set of tools is limited and fails to give a holistic picture of quality and health across the life-cycle of the products.
 

The solution - TTA - Test Trend Analyzer
 
TTA is an open source product that becomes the source of information giving you real-time and visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel' based decisions.
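
TTA itself does the heavy lifting, but the underlying idea of turning raw test-automation results into a trend is simple to illustrate. The toy sketch below (not TTA code; the file layout is hypothetical) scans JUnit-style XML result files and prints a pass rate per run, which is the kind of data point the trend views are built on:

```javascript
// trend-sketch.js - a toy illustration, not part of TTA.
// Assumes a directory per test run containing JUnit-style XML reports (hypothetical layout).
const fs = require('fs');
const path = require('path');

const RESULTS_DIR = 'test-results';   // e.g. test-results/run-2014-11-20/*.xml

function passRate(runDir) {
  let total = 0;
  let failed = 0;
  for (const file of fs.readdirSync(runDir).filter((f) => f.endsWith('.xml'))) {
    const xml = fs.readFileSync(path.join(runDir, file), 'utf8');
    // Crude counting is enough for a sketch; a real tool would parse the XML properly.
    total += (xml.match(/<testcase\b/g) || []).length;
    failed += (xml.match(/<failure\b/g) || []).length + (xml.match(/<error\b/g) || []).length;
  }
  return total === 0 ? 0 : ((total - failed) / total) * 100;
}

// One line per run: the raw material for a test execution trend.
for (const run of fs.readdirSync(RESULTS_DIR).sort()) {
  console.log(run + ': ' + passRate(path.join(RESULTS_DIR, run)).toFixed(1) + '% passing');
}
```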
 
There are two audiences who will benefit from TTA:
1. Management - who want to know, in real time, the latest state of test execution trends across their product portfolios / projects. They can also use the data in the trend analysis views to make more informed decisions on which products / projects need more or less of their focus. Views like the Test Pyramid View and Comparative Analysis help in looking at results over a period of time and using them as data points to identify trends.

2. Team Members (developers / testers) - who want to do quick test failure analysis and get to the root cause as quickly as possible. Views like Compare Runs, Failure Analysis and Test Execution Trend help the team on a day-to-day basis.
 
NOTE: TTA does not claim to give answers to the potential problems. It gives a visual representation of test execution results in different formats which allow team members / management to have more focussed conversations based on data points.

Some pictures from the talk ... (Thanks to Shirish)








Monday, November 10, 2014

Perspectives on Agile Software Testing

Inspired by Selenium's 10th Birthday Celebration, a bunch of ThoughtWorkers have compiled an anthology of essays on testing approaches, tools and culture by testers for testers.  
This anthology is available as an ebook titled "Perspectives on Agile Software Testing", which can be downloaded from here on the ThoughtWorks site. A simple registration, and you will be able to download the ebook. 

Here are the contents of the ebook:












Enjoy the read; looking forward to your feedback.