Showing posts with label performance.

Friday, April 21, 2017

Introducing MAD LAB - for Mobile Automation

For the past few months I have been heads-down stabilising my Real-Device Mobile Test Lab - which we now call MAD LAB (Mobile Automation Devices LAB).

For those who may not recollect, see my past posts for reference -

My colleagues and I have put in a lot of effort in setting up MAD LAB, and have now added a lot of rich features that make running tests, seeing the results and making sense of them easier. 
  • All infrastructure management is now implemented in Groovy (instead of Gradle, as shared earlier).
  • Actual test implementation is done in cucumber-jvm / Java.

List of features currently implemented:
  • Device management (selection, cleanup, app install and uninstall)
  • Parallel test execution (at Cucumber scenario level), maximising device utilisation
  • Appium server management
  • ADB utilities 
  • Managing periodic ADB server disconnects (see the sketch below)
  • Custom reporting using cucumber-reports
  • Video recording of each scenario and embedding in the custom reports
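
To give a flavour of the ADB utilities and disconnect handling: a rough Java sketch (our actual infrastructure code is in Groovy; the class and method names here are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only - MAD LAB's real infrastructure code is in Groovy.
public class AdbUtils {

    // Lists serial numbers of the devices currently visible to ADB.
    public static List<String> attachedDevices() throws Exception {
        Process p = new ProcessBuilder("adb", "devices").start();
        List<String> serials = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Device rows look like "<serial>\tdevice" after the header line.
                if (line.endsWith("\tdevice")) {
                    serials.add(line.split("\t")[0]);
                }
            }
        }
        p.waitFor();
        return serials;
    }

    // Handles periodic ADB server disconnects: restart the server and
    // retry with a backoff until devices reappear.
    public static List<String> devicesWithRetry(int maxAttempts) throws Exception {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            List<String> devices = attachedDevices();
            if (!devices.isEmpty()) {
                return devices;
            }
            new ProcessBuilder("adb", "kill-server").start().waitFor();
            new ProcessBuilder("adb", "start-server").start().waitFor();
            Thread.sleep(2000L * attempt); // back off between attempts
        }
        throw new IllegalStateException("No devices visible to ADB after " + maxAttempts + " attempts");
    }
}
```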

Contents of MAD LAB:
  • 1 Mac Mini - running various Jenkins Agents
  • 2 Powered USB hubs
  • 8 Android devices

Here are some pictures from the setup.








There are many more features, in various stages of implementation, being added to make MAD LAB more powerful.

Sneak peek into what's coming:
  • Analytics Testing
  • Trend and Failure Analysis 
  • iOS
  • Web
  • A transformed MAD LAB

Finding MAD LAB interesting? Some very interesting changes are coming soon - watch out for my next blog post. 

Want to contribute and be part of this journey? Even better! Reach out to me!

Thursday, March 16, 2017

Workshop - Client-Side Performance Testing at STPCon 2017

I conducted a 4-hour workshop on Client-Side Performance Testing at STPCon 2017 on 15th March 2017.


Workshop Abstract

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing.
Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.
Lastly – just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity – using WebPageTest (private instance setup), and experiment with yslow – as a low-cost, programmatic alternative to WebPageTest.
We will also look at the different dimensions of Client-side Performance Testing for native mobile applications.
PS: This workshop will be a combination of presentation and hands-on activities, with lots of discussion throughout. You should bring your laptop with you.

Workshop Takeaways:

  • Understand the difference between Performance Testing and Performance Engineering.
  • Hands-on experience of some open-source tools to monitor, measure and automate Client-side Performance Testing.
  • Examples / code walk-through of some ways to automate Client-side Performance Testing (a minimal sketch follows below).
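
For instance, a run can be kicked off against a WebPageTest instance via its standard REST endpoint; a minimal sketch (the instance URL and API key are placeholders):

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Minimal sketch: submit a run to a (private) WebPageTest instance.
// The instance URL and API key below are placeholders.
public class WptTrigger {
    public static void main(String[] args) throws Exception {
        String instance = "http://wpt.example.com";
        String pageUnderTest = URLEncoder.encode("https://www.example.com", StandardCharsets.UTF_8);

        // runtest.php is WebPageTest's standard submission endpoint; f=json asks for JSON back.
        String submitUrl = instance + "/runtest.php?url=" + pageUnderTest + "&f=json&k=YOUR_API_KEY";

        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> response = client.send(
                HttpRequest.newBuilder(URI.create(submitUrl)).GET().build(),
                HttpResponse.BodyHandlers.ofString());

        // The JSON response carries data.testId; poll jsonResult.php?test=<id>
        // until the run completes, then assert on metrics such as loadTime,
        // start render or fully-loaded time.
        System.out.println(response.body());
    }
}
```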

Slides


Client-Side Performance Testing from Anand Bagmar

Some pictures from the workshop 

(Thanks Mike Lyles and Curtis Stuehrenberg for the pictures)





Friday, December 2, 2016

A new beginning - entertainment on mobile

After 7+ years, I finally took the heavy step and moved out of ThoughtWorks.

The past 7+ years have been awesome. I had loads of fun, learnt many new things, made a lot of friends and found inspiration and guidance from a lot of mentors.

Thank you ThoughtWorks and ThoughtWorkers! Wouldn’t have been who I am today without you and you all will always be a huge part of me!

Taking the decision was tougher than I thought it would be ... but new challenges were waiting for me, and the time had come.

On 1st December, 2016, I started my next stint as "Director - Quality" at Vuclip, Inc for the Viu product. You can also find us via the PlayStore or AppStore.

Day 1 at Vuclip, barring the first 2 hours of paperwork, was getting right into action. With the planning for 2017 in full swing, there was no time to settle in - I had to hit the ground running.

The charter starting Day 1 for me was:


  • Define & execute test strategy for Viu - for multiple platforms, for multiple regions & partners
  • Build a team to help execute the above (see the section below on what I am looking out for)
  • In scope - functional testing, automation, performance, analytics, benchmarking, infrastructure, tooling, etc.
  • Out of scope - nothing

And so the fun has begun.

So, here is what I need to learn and execute immediately (I am looking forward to suggestions, links and feedback on how you have done it in the past):

  • Is it worth setting up a mobile lab (real devices + simulators) in-house, or using external services for running automated tests and exploratory tests? 
  • If the latter, what services have you used in the past? What have the results been?
  • Is it possible (and worthwhile) to automate the checks for memory / processor / battery usage when running tests against the native app (on Android & iOS)? A rough sketch of one approach follows after this list.
  • How to do native app performance testing (client-side) for Android & iOS?
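
The rough sketch mentioned above - sampling adb's dumpsys output while the functional tests run (the package name is a placeholder):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Rough sketch: sample an Android app's memory footprint via adb while tests run.
// The package name below is a placeholder.
public class MemorySampler {
    public static void main(String[] args) throws Exception {
        String appPackage = "com.example.app";
        for (int sample = 0; sample < 10; sample++) {
            Process p = new ProcessBuilder(
                    "adb", "shell", "dumpsys", "meminfo", appPackage).start();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    // The "TOTAL" row carries the overall PSS figure in kB.
                    if (line.trim().startsWith("TOTAL")) {
                        System.out.println("sample " + sample + ": " + line.trim());
                    }
                }
            }
            p.waitFor();
            Thread.sleep(5000); // sample every 5 seconds
        }
    }
}
```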


Also, I am looking to build a strong testing team, with team members having the following primary skills & capabilities -
  • Open-minded, quick learner
  • A good Testing-mindset
  • Mobile Testing experience (non-automated + test automation)
  • Performance Testing (client-side & server-side)

Contact me if you are interested in being part of my team to work on this challenging product.

Tuesday, September 13, 2016

Slides from vodQA Pune - Less Talk, Only Action! now available

vodQA-Pune - Less Talk, Only Action! was held on Saturday, 27th Aug 2016, 8.30am - 5.30pm at ThoughtWorks, Pune.

Agenda



Abstracts with Slides

1. Automating Web Analytics - Why? How?

Do you know –

  • What is Web Analytics? How does Web Analytics work?
  • Why is it important? How to test Web Analytics?
  • How can we ensure correct data is sent to the Web Analytics server, every time, for all the actions?

Attend this workshop to learn ‘What is Web Analytics?’ and why it is an extremely important aspect of Software Development & Testing for your product / service to succeed!

We will share some techniques for testing Web Analytics - in a non-automated way - and why that is very challenging and error-prone.

We will learn, via hands-on activity, about WAAT - Web Analytics Automation Testing Framework (https://essenceoftesting.blogspot.com/search/label/waat) - an open-source solution, to automate validation of correct information / tags being sent to the Web Analytic server for different user actions as part of your regular Selenium-WebDriver Test Automation Framework.

Lastly, we will see how the impact of Analytics has changed dramatically with more adoption and spread of IoT (Internet of Things) and Big Data, and what we need to do to be part of the change, if not influencers of change!
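
To make the core idea concrete (this is a generic sketch of the underlying approach, not WAAT's actual API): route the browser through a proxy, capture the outgoing requests, and assert that the expected tag was fired at the analytics server. The analytics endpoint and parameter below are placeholders, and the sketch assumes BrowserMob Proxy alongside Selenium-WebDriver:

```java
import net.lightbody.bmp.BrowserMobProxy;
import net.lightbody.bmp.BrowserMobProxyServer;
import net.lightbody.bmp.client.ClientUtil;
import net.lightbody.bmp.core.har.HarEntry;
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;

// Generic sketch of the idea behind WAAT (not WAAT's actual API):
// capture the browser's network traffic and assert on the requests
// fired at the analytics server.
public class AnalyticsCheck {
    public static void main(String[] args) {
        BrowserMobProxy proxy = new BrowserMobProxyServer();
        proxy.start(0);
        Proxy seleniumProxy = ClientUtil.createSeleniumProxy(proxy);

        WebDriver driver = new FirefoxDriver(
                new FirefoxOptions().setProxy(seleniumProxy));
        proxy.newHar("analytics");

        driver.get("https://www.example.com"); // placeholder page under test

        boolean tagFired = proxy.getHar().getLog().getEntries().stream()
                .map(HarEntry::getRequest)
                // Placeholder analytics endpoint and expected tag parameter.
                .anyMatch(req -> req.getUrl().contains("analytics.example.com")
                        && req.getUrl().contains("event=page_view"));

        System.out.println("Analytics tag fired: " + tagFired);
        driver.quit();
        proxy.stop();
    }
}
```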

Slides: automating-web-analytics


2. Performance testing with Gatling for Beginners

Gatling is a server-side performance testing tool. This workshop aims at giving an introduction to Gatling and facilitating participants in writing their first performance tests using Gatling.
  • Brief intro to Gatling
  • Using Postman to check the stub server (created using Mountebank for the workshop)
  • Write a sample test using Gatling (pre-set-up machines are provided) - a flavour of such a test follows below
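
The workshop itself used Gatling's (then Scala-based) DSL; purely as a flavour, here is an equivalent minimal simulation in Gatling's newer Java DSL (the stub URL, endpoint and load numbers are placeholders):

```java
import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;

import io.gatling.javaapi.core.ScenarioBuilder;
import io.gatling.javaapi.core.Simulation;
import io.gatling.javaapi.http.HttpProtocolBuilder;

// Minimal Gatling simulation hitting a stub server (e.g. one created
// with mountebank). URL and user counts are placeholders.
public class StubServerSimulation extends Simulation {

    HttpProtocolBuilder httpProtocol =
            http.baseUrl("http://localhost:4545"); // placeholder imposter port

    ScenarioBuilder scn = scenario("First performance test")
            .exec(http("get products")
                    .get("/products")
                    .check(status().is(200)));

    {
        // Ramp 20 virtual users over 10 seconds against the stub.
        setUp(scn.injectOpen(rampUsers(20).during(10)))
                .protocols(httpProtocol);
    }
}
```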

Slides: gatling-performance-workshop

3. Game of Test Automation

We are going to use a game to work out the WHY, WHAT and HOW of test automation within the context of consumers, application, skillset, mindset, etc.

4. Security Testing - Operation Vijay

These are the days of the web. Businesses are moving their applications online for their customers to use.

Many of these applications contain critical customer data, such as credit card details, personal details and so on. This data is very valuable; if it falls into the wrong hands, the consequences can be disastrous.

Attacks on these systems can destroy the trust that customers have in the business, and can cause great losses to the customers as well as the business.

The motivations behind attacks differ - earning money, gaining popularity, damaging a competitor company, etc.

No matter what the intention of the attack is, we need to develop safe applications and we need to know the various vulnerabilities and the consequences of our decisions when we develop applications.

Similarly, we should be aware of the various vulnerabilities before we test the applications, so that we can try to exploit them during the testing phase and ensure better quality and safer applications.

Slides: security-testing-operation-vijay

5. Automate your Mobile tests with Appium

  • Introduction to Appium.
  • Appium design for Android and iOS
  • How to locate elements on Android and iOS applications (using the Inspector).
  • Hands-on code snippet for Android and iOS (WordPress as the sample app)
  • Generating reports (ExtentReports)
  • Evolve the code snippet written above into a Page Object framework (a minimal starting sketch follows below).
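
Here is that starting sketch (the app path and locator are placeholders; it assumes a local Appium 1.x server and a recent java-client):

```java
import io.appium.java_client.android.AndroidDriver;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;

import java.net.URL;

// Minimal Appium sketch for Android; app path and locator are placeholders.
public class FirstAppiumTest {
    public static void main(String[] args) throws Exception {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("platformName", "Android");
        caps.setCapability("deviceName", "Android Emulator");
        caps.setCapability("app", "/path/to/wordpress.apk"); // sample app from the workshop

        // Default Appium 1.x server endpoint.
        AndroidDriver driver = new AndroidDriver(
                new URL("http://127.0.0.1:4723/wd/hub"), caps);

        // Locate an element (found beforehand via the Inspector) and interact.
        WebElement loginButton = driver.findElement(
                By.id("org.wordpress.android:id/login_button")); // placeholder id
        loginButton.click();

        driver.quit();
    }
}
```
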
Slides: mobile-automation-using-appium

6. Increase Automation to REST

  • What are web services and why do we use them?
  • How to test a web service in multiple ways?
  • Increased familiarity with automation

Tools used: REST client, Postman (with alternatives mentioned), Unirest (Java) and Requests (Python). A minimal Unirest sketch follows below.
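
Here is that sketch, using the open-source Unirest Java client (the URL is a placeholder for the workshop's service):

```java
import kong.unirest.HttpResponse;
import kong.unirest.JsonNode;
import kong.unirest.Unirest;

// Minimal Unirest (Java) sketch of exercising a REST endpoint;
// the URL is a placeholder.
public class RestApiCheck {
    public static void main(String[] args) {
        HttpResponse<JsonNode> response =
                Unirest.get("https://api.example.com/users/1")
                       .header("Accept", "application/json")
                       .asJson();

        // Assert on the status code and inspect the JSON body.
        System.out.println("Status: " + response.getStatus());
        System.out.println("Body:   " + response.getBody());
    }
}
```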

Slides: increase-automation-to-rest, api-webservice-setup-instructions

7. Let's cook Cucumber

In this workshop we will be covering:
  • Advantages of BDD, through a Cucumber example.
  • Framework setup along with Java and Selenium.
  • Writing one real-world end-to-end test case (a step-definition sketch follows below).
  • If time permits, we will cover basic refactoring.
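
Here is that sketch (the site URL and locators are placeholders):

```java
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.Assert.assertTrue;

// Step definitions for a feature like:
//   Scenario: Search for a product
//     Given I am on the home page
//     When I search for "cucumber"
//     Then I see search results
// Site URL and locators are placeholders.
public class SearchSteps {
    private WebDriver driver;

    @Given("I am on the home page")
    public void iAmOnTheHomePage() {
        driver = new ChromeDriver();
        driver.get("https://shop.example.com");
    }

    @When("I search for {string}")
    public void iSearchFor(String term) {
        driver.findElement(By.name("q")).sendKeys(term);
        driver.findElement(By.name("q")).submit();
    }

    @Then("I see search results")
    public void iSeeSearchResults() {
        assertTrue(driver.findElements(By.cssSelector(".result")).size() > 0);
        driver.quit();
    }
}
```
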
Slides: lets-cook-cucumber
 

8. Mobile Automation using Espresso

Imagine a situation where every commit spits out a build that can be deployed to production with confidence. In today's startup era, this can be a huge boost to business, as it reduces the time to market. UI Automation for mobile apps, be it native or hybrid, has long been painful. But with mature frameworks coming up, and Google/Apple realizing the importance of such tools, UI Automation is gaining traction in the mobile space. 

This talk is about understanding the what and why of Espresso, along with automating a simple scenario using Espresso (a sketch follows below).
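
As a flavour of Espresso's style, here is a minimal sketch of such a scenario (the activity and view ids belong to a hypothetical app under test):

```java
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.action.ViewActions.typeText;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static androidx.test.espresso.matcher.ViewMatchers.withText;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import org.junit.Rule;
import org.junit.Test;

// Simple Espresso scenario. LoginActivity and the R.id values come from
// the (hypothetical) app under test.
public class LoginTest {

    @Rule
    public ActivityScenarioRule<LoginActivity> rule =
            new ActivityScenarioRule<>(LoginActivity.class);

    @Test
    public void userCanLogIn() {
        onView(withId(R.id.username)).perform(typeText("demo"));
        onView(withId(R.id.login_button)).perform(click());
        onView(withId(R.id.welcome_message)).check(matches(withText("Welcome, demo!")));
    }
}
```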

Slides: getting-high-on-espresso

Thursday, August 6, 2015

Agile Testing - Metrics can be fun too!

Metrics are meaningless unless seen in the right context. In this case, my "right" context is purely a "feel-good-factor". 

In April 2011, I published the "Agile QA Process" paper on SlideShare. I am very happy to see it has received over 30000 views and has been downloaded over 1400 times!

On a similar note, I created a mindmap for Test Insane - titled - "Agile QA - Capabilities & Skills". That also seems to be hitting a good note - with almost 1000 views in under 25 days!

So in this case - Metrics are fun! I don't mind this ego boost to continue writing more, and sharing more!

Thursday, May 28, 2015

vodQA Pune - Innovations in Testing

vodQA Update - Agenda + Slides + Videos


Here is an update of the vodQA that went by at supersonic speed!


We had an intense and action-packed vodQA in ThoughtWorks, Pune on Saturday, 6th June 2015 - with the theme - Innovations in Testing!

Here are some highlights from the event:
  • You can find the details of the agenda + links to slides & videos from here or here.
  • After record-breaking attendee registrations (~500), we frantically closed off registrations. This meant around 140-180 people would show up, based on historic attendance trends. 135 attendees made it to vodQA - the first person reaching the office at 8.30am, when the event was supposed to start at 10am! That is enthusiasm!
  • We had 45+ speaker submissions (and had to reject more submissions because registrations had already closed). After speaking to all submitters, and a lot of dry-runs and feedback, we eventually selected 6 talks, 4 lightning talks and 4 workshops from this massive list.
  • We were unfortunately able to select only 2 external speakers (purely based on the content + relevance to the theme). One of these speakers travelled all the way from Ahmedabad to Pune on his own to deliver a Lightning Talk.
  • We had a few ThoughtWorkers travelling from Bangalore (2 speakers + 1 attendee) and 1 (speaker) from Gurgaon.
  • We had around 30-40 ThoughtWorkers participating in the conference. 
  • No event in the office would be possible without the amazing support from our Admin team + support staff!
  • Overall - we had around 200 people in the office on a Saturday!
  • For the first time, we did a live broadcast of all the talks + lightning talks (NO workshops). This allowed people to connect with vodQA as it happened. Also - usually the last and most cumbersome part of post-event processing - uploading videos - was now the first thing that was completed. See the videos here on YouTube. This update got delayed because we still have to get the link to the slides :(
  • We celebrated the 5th Birthday of vodQA!
  • Even though most projects in TW Pune are running at 120+% delivery speed, we were able to pull this off amazingly well! This can only happen when individuals believe in what they are contributing towards. Thank you all!
  • We wrapped up most of the post-event activities (office-cleanup, retro, post-vodQA dinner and now this update email) within 5 days of the vodQA day - another record by itself!
  • Some pictures are attached with this email.
You can see the tweets and comments in the vodQA group on Facebook.

Again, A HUGE THANKS to ALL those who participated in any way!

On behalf of the vodQA team + all the volunteers!










-----------------------------------------------------------------------------------------------------------------------------------------

[UPDATE]

Detailed agenda, with expected learnings and speaker information, is available here (http://vodqa-pune.weebly.com/agenda.html) for vodQA Pune - Innovations in Testing.

NOTE:
- Each workshop has a limited # of seats.
- Registration for workshops will be done at the Attendee Registration Desk between 9am-10am on vodQA day.
- Registration will be on a first-come-first-choice basis.
- See each talk / workshop details (below) for pre-requisites, if any.


----------------

vodQA is back in ThoughtWorks, Pune on Saturday, 6th June 2015. This time the theme is - "Innovations in Testing".

We got a record number of submissions from wannabe speakers, and a HUGE number of attendee registrations. Selecting 12-14 talks from this list was no small task - we had to make a lot of tough decisions.

The agenda is now published (see here - http://vodqa-pune.weebly.com/agenda.html) and we are looking forward to a rocking vodQA!

Wednesday, May 20, 2015

What is Agile Testing? How does Automation help?

I spoke at a conference recently on "What is Agile Testing? How does Automation help?"

Abstract

Agile Methodology is not new. Many organisations / teams have already adopted the Agile way of Software Development, or are on the enablement journey towards it. 

What does this mean for Testing? There is no doubt that the Testing approach and mindset also need to change to be in tune with the Agile Development methodology. 

Learn what it means to Test on Agile Projects. Also, learn how the Test Automation approach needs to change for the team to be successful!

Video

Slides



Here I am on the stage, in the Main Hall, in front of 150+ people, delivering the talk - What is Agile Testing? How does Automation help?:



Tuesday, May 19, 2015

Role of Automation in Testing

I am speaking at the Discuss Agile 2015 conference on 13-14 June 2015 on the following topics - 

As part of this conference, I also did an interview with Saket Bansal and Atulya Mishra on - The Role of Automation in Testing.

This was an interesting, virtual interview - interested people had asked questions during registration, and a lot more questions came up during the interview itself.

Below is the video recording of the interview. 


I also referenced some slides when speaking about some specific topics. Those can be seen below, or directly from slideshare.




Monday, May 11, 2015

vodQA Geek Night in ThoughtWorks, Hyderabad - Client-side Performance Testing Workshop

I am conducting a workshop on "Client-side Performance Testing" in vodQA Geek Night, ThoughtWorks, Hyderabad from 6.30pm-8pm IST on Thursday, 14th May, 2015.

Visit this page to register!

Abstract of the workshop:

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.

Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity - using WebPageTest (private instance setup), and experiment with yslow - as a low-cost, programmatic alternative to WebPageTest.

Venue:

ThoughtWorks Technologies (India) Pvt Ltd.
3rd Floor, Apurupa Silpi,
Beside H.P. Petrol Bunk (KFC Building),
Gachibowli,
Hyderabad - 500032, India 

Saturday, April 25, 2015

Push the Envelope at vodQA, Bangalore

[UPDATED - Slides added]

Yet another vodQA begins today, Saturday, 25th April 2015 - this time at ThoughtWorks, Bangalore. The theme for this vodQA is - "Push the Envelope". The detailed agenda can be found here.


I conducted a workshop on "Client-side Performance Testing" in vodQA Bangalore. 


Abstract of the workshop:



In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance, the good & the bad! We will then experiment with some tools like WebPageTest and Page Speed to understand how to measure client-side performance.



Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate the testing for this activity - using WebPageTest (private instance setup), and experiment with yslow - as a low-cost, programmatic alternative to WebPageTest.

Here are the slides used in the workshop:

Thursday, March 19, 2015

Enabling CD & BDT in March 2015

I have been very busy of late ... and am enjoying it too! I am learning and doing a lot of interesting things in the Performance Testing / Engineering domain. I had no idea there are so many types of caching, and that there would be a need for so many different types of Monitoring - availability, client-side performance testing, Real User Monitoring, server-side load testing and more ... it is a lot of fun being part of this aspect of Testing.

That said, I am equally excited about 2 talks coming up at the end of March 2015:

Enabling CD (Continuous Delivery) in Enterprises with Testing 

- at Agile India 2015, on Friday, 27th March 2015 in Bangalore.


Abstract

The key objective of Organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and at good quality!
In such a fast moving environment, CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury!

There are various practices that Organizations and Enterprises need to implement to enable CD. Testing (automation) is one of the important practices that needs to be set up correctly for CD to be successful.

Testing in Organizations on the CD journey is tricky and requires a lot of discipline, rigor and hard work. In Enterprises, the Testing complexity and challenges increase exponentially.

In this session, I am sharing my vision of the Test Strategy required to make an Enterprise's journey on the path of implementing CD successful.



Build the 'right' regression suite using Behavior Driven Testing (BDT) - a Workshop

- at vodQA Gurgaon, on Saturday, 28th March 2015 at ThoughtWorks, Gurgaon.


Abstract

Behavior Driven Testing (BDT) is a way of thinking. It helps in identifying the 'correct' scenarios, in the form of user journeys, to build a good and effective (manual & automated) regression suite that validates the Business Goals. We will learn about BDT, do some hands-on exercises in workshop form to understand the concept better, and also touch upon some potential tools that can be used.

Learning outcomes

  • Understand Behavior Driven Testing (BDT)
  • Learn how to build a good and valuable regression suite for the product under test
  • Learn different styles of identifying / writing your scenarios that will validate the expected Business Functionality
  • Learn how automating the tests identified using the BDT approach automates validation of your Business Functionality
  • Advantages of identifying Regression tests using the BDT approach

Wednesday, January 28, 2015

vodQA Cocktail - early in 2015

As we get ready for Celebrating Selenium's 10 year journey in vodQA Hyderabad, ThoughtWorks Chennai is ready to take vodQA to the next level on Saturday, 21st February, 2015 with an interesting Cocktail of topics related to Software Testing.

Register here as a speaker for vodQA Chennai, or here as an attendee.

Monday, June 23, 2014

To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)

[UPDATE: 18th July 2014] I spoke on the same topic - "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" at Unicom's World Conference on Next Generation Testing in Bangalore on 18th July 2014. The slides are available here and the video is available here. In this talk, I also gave a demo of TTA.
 
I spoke at 3 conferences last week about "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)".

You can find the slides here and the videos here:

Here is the abstract of the talk:

The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and at good quality. To understand the quality / health of their products at a quick glance, a team of people typically scrambles to manually collate and collect the information needed to get a sense of the quality of the products they support. So, in a fast-moving environment where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment or not? Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.
 

The next set of questions is:
  • How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
  • How can you know if the product is ready to go 'live'?
  • What is the health of your product portfolio at any point in time?
  • Can you identify patterns and quickly analyse test results for issues that have happened over a period of time, to help with root-cause analysis and make better decisions to improve the quality of your product(s)?
 

The current set of tools is limited, and fails to give a holistic picture of quality and health across the life-cycle of the products.
 

The solution - TTA - Test Trend Analyzer
 

TTA is an open-source product that becomes the source of information to give you real-time, visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

Sunday, October 6, 2013

Offshore Testing on Agile Projects



Reality of organizations

Organizations are now spread across the world. With this spread, having distributed teams is a reality. Reasons could be a combination of various factors, including:

  • Globalization
  • Cost
  • 24x7 availability
  • Team size
  • Mergers and Acquisitions
  • Talent

The Agile Software methodology talks about various principles to approach Software Development. There are various practices that can be applied to achieve these principles. 

The choice of practices is very significant and important in ensuring the success of the project. Some of the parameters to consider, in no particular order, are:

  • Skillset on the team
  • Capability on the team
  • Delivery objectives
  • Distributed teams
  • Working with partners / vendors
  • Organization security / policy constraints
  • Tools for collaboration
  • Time overlap between teams
  • Mindset of team members
  • Communication
  • Test Automation
  • Project Collaboration Tools
  • Testing Tools
  • Continuous Integration

** The above list is from a Software Testing perspective.

This post is about what practices we implemented as a team for an offshore testing project.

Case Study - A quick introduction

An enterprise had a B2B product providing an online version of a physically conducted auction for selling used-vehicles, in real-time and at high-speed. Typical participation in this auction is by an auctioneer, multiple sellers, and potentially hundreds of buyers. Each sale can have up to 500 vehicles. Each vehicle gets sold / skipped in under 30 seconds - with multiple buyers potentially bidding on it at the same time. Key business rules: only 1 bid per buyer, no consecutive bids by the same buyer.

Analysis and Development was happening across 3 locations – 2 teams in the US, and 1 team in Brazil. Only Testing was happening from Pune, India.

“Success does not consist in never making mistakes but in never making the same one a second time.”

We took that to heart, very sincerely. We applied all our learning and experience in picking the practices that would make us succeed. We consciously sought to be creative and innovative, and applied out-of-the-box thinking to how we approached testing (in terms of strategy, process, tools and techniques) for this unique, interesting and extremely challenging application - ensuring we did not go down the same path again.

Challenges

We had to overcome many challenges on this project:
  • Creating a common DSL that would be understood by ALL parties - i.e. Clients / Business / BAs / PMs / Devs / QAs
  • All examples / forums use trivial problems, whereas we had a lot of data and complex business scenarios to take care of.
  • Cucumber / Capybara / WebDriver / Ruby do not offer an easy way to do concurrency / parallel testing (see the sketch below).
  • We needed to simulate, in our manual + automated tests, "n" participants at a time interacting with the sale / auction.
  • A typical sale / auction can contain 60-500 buyers, 1-x sellers and 1 auctioneer, with anywhere from 50-1000 vehicles to sell. There can be multiple sales going on in parallel. So how do we test these scenarios effectively?
  • Data creation / usage was a huge problem (e.g. a production subset snapshot is >10GB compressed, and a refresh takes a long time too).
  • Getting a local environment in Pune to continue working effectively - all pairing stations / environment machines use RHEL Server 6.0 and are auto-configured using Puppet. These machines are registered to the Client account on the Red Hat Satellite Server.
  • Communication - we were working from 10K miles away, with a time difference of 9.5 / 10.5 hours (depending on DST), which meant almost zero overlap with the distributed team. To add to that complexity, our BA was in another city in the US - so another time difference to take care of.
  • End-to-end Performance / Load testing was not even part of this scope - but something we were very wary of, in terms of what can go wrong at that scale.
  • We needed to be agile - i.e. test stories and functionality in the same iteration.

All the above-mentioned problems meant we had to come up with our own unique way of tackling the testing.
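
For example, the essence of how we tackled the concurrency challenge was to hold "n" simulated buyers at a barrier and release them at the same instant. Here is a language-neutral sketch of that idea in Java (our real framework was Ruby / Capybara; placeBid() is a placeholder for driving one buyer's session):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of firing n truly simultaneous bids and checking the business rule
// that at most one bid can be accepted at a given ask price.
public class ConcurrentBidTest {

    // Placeholder: drive one buyer's session and report whether the bid was accepted.
    static boolean placeBid(int buyerId) {
        return false;
    }

    public static void main(String[] args) throws Exception {
        int buyers = 10;
        CountDownLatch ready = new CountDownLatch(buyers);
        CountDownLatch go = new CountDownLatch(1);
        ExecutorService pool = Executors.newFixedThreadPool(buyers);

        List<Future<Boolean>> results = new ArrayList<>();
        for (int i = 0; i < buyers; i++) {
            final int buyerId = i;
            results.add(pool.submit(() -> {
                ready.countDown();
                go.await();               // every thread blocks here ...
                return placeBid(buyerId); // ... and all bids fire together
            }));
        }

        ready.await();  // wait until every buyer thread is poised
        go.countDown(); // release them all at the same instant

        long accepted = 0;
        for (Future<Boolean> result : results) {
            if (result.get()) {
                accepted++;
            }
        }
        pool.shutdown();
        System.out.println("Accepted bids: " + accepted + " (at most 1 expected)");
    }
}
```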

Our principles - our North Star

We stuck to a few guiding principles as our North Star:
  • Keep it simple
  • We know the goal, so evolve the framework - don't start building everything from step 1
  • Keep sharing the approach / strategy / issues faced on a regular basis with all concerned parties, and make this a TEAM challenge instead of a Test-team problem!
  • Don't try to automate everything
  • Keep test code clean

The End Result

At the end of the journey, here are some interesting outcomes from the offshore testing project:
  • Tests were specified in the form of user journeys, following the Behavior Driven Testing (BDT) philosophy - written in Cucumber.
  • Created a custom test framework (Cucumber, Capybara, WebDriver) that tests a real-time auction - in a very deterministic fashion.
  • We had 65-70 tests in the form of user journeys that cover the full automated regression for the product.
  • Our regression completed in less than 30 minutes.
  • We had no manual tests to be executed as part of regression.
  • All tests (= user journeys) are documented directly as Cucumber scenarios and are automated.
  • Anything that is not part of the user journeys is pushed down to the dev team to automate (or we try to write automation at that lower level).
  • Created a 'special' long-running test suite that simulates a real sale with 400 vehicles, >100 buyers, 2 sellers and an auctioneer.
  • Created special concurrent (high-speed parallel) tests that ensure that even at the highest possible load, the system behaves correctly.
  • Since there was no separate performance and load test strategy, created special utilities in the automation framework to benchmark "key" actions (a sketch of such a utility follows after this list).
  • No separate documentation or test cases were ever written / maintained - and we never missed them.
  • A separate, special sanity test runs in production after deployment, to ensure all the integration points are set up properly.
  • Changed our work timings (for most team members) to 12pm - 9pm IST, to get more overlap and remote-pairing time with the onsite team.
  • Set up an ice-cream meter - for those who came late to standup.
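
Here is an illustrative sketch of that kind of benchmarking utility: time a "key" action and fail when it exceeds a budget (the names and the 2-second budget are made up for the example):

```java
import java.util.function.Supplier;

// Illustrative sketch of a functional-performance benchmarking helper.
public class ActionBenchmark {

    // Runs the action, prints how long it took, and fails if it blows the budget.
    public static <T> T timed(String actionName, long budgetMillis, Supplier<T> action) {
        long start = System.nanoTime();
        T result = action.get();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        System.out.printf("%s took %d ms%n", actionName, elapsedMillis);
        if (elapsedMillis > budgetMillis) {
            throw new AssertionError(actionName + " exceeded budget: "
                    + elapsedMillis + "ms > " + budgetMillis + "ms");
        }
        return result;
    }

    public static void main(String[] args) {
        // Usage: wrap a "key" user action (placeholder body here).
        String outcome = timed("load sale page", 2000, () -> {
            // ... drive the browser through the key action ...
            return "ok";
        });
        System.out.println(outcome);
    }
}
```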

Innovations and Customizations

Necessity breeds innovation! This was so true in this project.

Below are the different areas and specifics of the customization we did in our framework.

Dartboard

Created a custom board, the "Dartboard", to quickly visualize the testing status in the Iteration. See this post for more details: "Dartboard - Are you on track?"

TaaS

To automate the last mile of Integration Testing between different applications, we created an open-source product - TaaS. This provides a platform / OS / tool / technology / language-agnostic way of automating the Integration Tests between applications.

Base premise for TaaS:

Enterprise-sized organizations have multiple products under their belt. The technology stack used for each of the products is usually different - for various reasons.

Most such organizations like to have a common Test Automation solution across these products, in an effort to standardize the test automation framework.

However, this is not a good idea! If products in the same organization can be built using different / varied technology stacks, then why should you impose this restriction on the Test Automation environment?

Each product should be tested using the tools and technologies that are “right” for it.

TaaS” is a product that allows you do achieve the “correct” way of doing Test Automation.

See my blog for all information related to TaaS.

WAAT - Web Analytics Automation Testing Framework

I had created the WAAT framework for Java and Ruby in 2010/2011. However, this framework had a limitation - it did not work with products that are configured to work only in HTTPS mode.

For one of the applications, we needed to test WebTrends reporting. Since this application worked only in HTTPS mode, I created a new plugin for WAAT - JS Sniffer - that can work with HTTPS-only applications. See my blog for more details about WAAT.