
Thursday, August 18, 2016

Agenda published for vodQA Pune - Less Talk, Only Action!

The agenda for the upcoming vodQA Pune - Less Talk, Only Action! is now published.


NOTE: For this event, participants will need to bring their laptops (and chargers).


When? 

Saturday, 27th Aug 2016, 8.30am - 5.30pm


Where? 

ThoughtWorks, Pune 
6th Floor, Binarius Building, 
Beside Sales Tax Office, Shastrinagar, Yerawada 
Pune, Maharashtra 411006 


Map: 

http://goo.gl/maps/KfuJG


What?

vodQA Pune - Less Talk, Only Action! 
- Registration - 8.30am- 9.15am
- Welcome note - 9.15am


Topics covered: 

This vodQA will focus on hands-on activities. We are setting up a number of workshops for participants; please watch for details and the prerequisites required for each workshop.

We will publish the detailed agenda very soon in the vodQA groups on 
Facebook (https://www.facebook.com/groups/vodqa/) and 
LinkedIn (https://www.linkedin.com/groups/4281359)


Tuesday, July 5, 2016

Any WAAT (Web Analytics Automation Testing Framework) users out there?

It has been over 2 years since the last update to WAAT - Java or Ruby. Over the years, I have received a lot of thoughts / feedback from users of WAAT about where it helps and what challenges exist.

Also, given the widespread IoT & Big Data based work going on around the world, (Web) Analytics now plays a much bigger role in helping businesses make better decisions.

WAAT (again) fits into the grand scheme of things very nicely - as a framework to automate the validation of correct reporting of tags to any Web Analytics solution provider.

Hence, it's a no-brainer for me - it is high time I worked on some of the feedback and limitations of WAAT to make it usable again!

At the recently concluded Selenium Conference 2016 in Bangalore, India, I got an idea of how to overcome a lot of the challenges (listed below) and pain points in using WAAT.


What's next?

Implementing my new idea does mean a couple of things:

  • The existing plugins have limited use - and need to be removed.
  • A new plugin would need to be created - which may mean a different set of APIs, and also a different way of specifying the test data.

Questions for you

Before I go ahead making these changes - I would like to get answers to the below questions (please add your answers directly in the comments):
  • Is anyone currently using WAAT? If yes - 
    • Which version (Java / Ruby)?
    • Which plugin?
    • Using HTTP / HTTPS?
    • Which Web Analytics solution are you using? (ex: Google Analytics, WebTrends, etc.)
  • Would you be interested in using the new WAAT? If yes - 
    • Which language? Java / Ruby / JavaScript / Python / etc?
  • Would you like to contribute to implementing this new WAAT? If yes - contact me! :)
-----------------------------------------------------------------------------------------------------------------

Current plugins available in WAAT:

  • Omniture Debugger (WAAT-Java)
    • Pros:
      • OS independent
      • Run using regular-test-user 
    • Cons:
      • Browser dependent - need to implement ScriptRunner for the UI-driver in use
      • Web-Analytic solution dependent - only for Adobe Marketing Cloud / Omniture SiteCatalyst
  • HTTPSniffer (WAAT-Java, WAAT-Ruby)
    • Pros
      • Web-Analytic solution independent
      • Browser independent
      • UI-Driver independent
    • Cons
      • 3rd party libraries are OS dependent
      • HTTPS is not supported out-of-the-box
      • Tests must be run as "root"
  • JSSniffer (WAAT-Java, WAAT-Ruby)
    • Pros
      • Web-Analytic solution independent
      • Browser independent
      • HTTPS supported out-of-the-box
      • No 3rd party library dependency
    • Cons
      • Need to write JavaScript to get the URL from the browser context (see the sketch below)
      • UI-Driver dependent
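
The JSSniffer approach works by executing JavaScript in the browser context to capture the request URLs. Purely as an illustration of that idea (this is not WAAT's actual implementation - it uses Selenium's JavascriptExecutor with the browser's Performance API, and the names are mine):

    import java.util.List;

    import org.openqa.selenium.JavascriptExecutor;
    import org.openqa.selenium.WebDriver;

    public class AnalyticsUrlSniffer {

        // Pull the URLs of all resources the page has requested so far,
        // using the browser's Performance API via the UI driver.
        @SuppressWarnings("unchecked")
        public static List<String> capturedResourceUrls(WebDriver driver) {
            return (List<String>) ((JavascriptExecutor) driver).executeScript(
                "return window.performance.getEntriesByType('resource')"
              + ".map(function(e) { return e.name; });");
        }

        // Check whether any captured URL targets the analytics endpoint,
        // e.g. a WebTrends or SiteCatalyst collection server.
        public static boolean analyticsCallMade(WebDriver driver, String endpointPattern) {
            for (String url : capturedResourceUrls(driver)) {
                if (url.matches(endpointPattern)) {
                    return true;
                }
            }
            return false;
        }
    }

Because the URLs are read from inside the browser, this works equally well over HTTPS and needs no OS-dependent 3rd party libraries - which is exactly why JSSniffer avoids the HTTPSniffer cons above.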
-----------------------------------------------------------------------------------------------------------------

Saturday, July 2, 2016

What is Web Analytics and how to Test it?

vodQA returned - this time with the theme "Testing Heuristics" - at ThoughtWorks, Hyderabad on 2nd July 2016.

Here I spoke about - "The What, Why and How of Web Analytics Testing". 

Abstract

Learn what Web Analytics is, why it is important, and see some techniques for testing it manually and also automating that validation. But just knowing about Web Analytics is not sufficient for business now.
There are new kids in town - IoT and Big Data - two of the most used and heard-of buzzwords in the Software Industry!

With IoT, with a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which if analyzed and used correctly, can feed into creating more opportunities and increased value propositions.

There are 2 types of analysis that one needs to think about.
1. How is the end-user interacting with the product? This will give some level of understanding into how to re-position and focus on the true value add features for the product.
2. With the huge volume of data being generated by the end-user interactions, and the data being captured by all devices in the food-chain of the offering, it is important to identify patterns from what has happened, and find out new product / value opportunities based on usage patterns.

Slides


Video

PS: Apologies for the video quality - I am not seen very clearly - but the slides are bright & clear, and so is the audio - so the important aspects are covered!




Pictures









Feedback


My Takeaway & Learning

- The attendees did not have much exposure to Web Analytics and how it works. I should spend more time speaking about that
- I should spend more time on challenges and potential solutions related to Big Data & IoT
- A lot of people are interested in WAAT - that could be a separate, more detailed discussion

Thursday, June 30, 2016

Learnings from Selenium Conference 2016, Bangalore

The value one gets from attending any conference / training / meetup / etc. depends on various aspects, some of which are mentioned below (in no particular order):

  • Individual skills & capabilities
  • Past experiences
  • Existing knowledge / information / expertise on the subject 
  • Open mindedness
  • Willingness to learn
  • Current work (tools & tech stack, challenges, risks, priorities, backlog, tech debt, team members, etc.)

The above aspects definitely played a part in my takeaways from the recently concluded Selenium Conference 2016 in Bangalore as well.

Here are my key takeaways, which I am going to work on learning more about, or implementing in the near future - special thanks to Dave Haeffner, Marcus Merrell, Simon Stewart and Bret Pettichord for helping me find these takeaways as part of various conversations during these few days.


  • Related to Protractor
    • Use Proxy Server in tests (Protractor framework) to capture HAR file on specific actions (AJAX calls) - and capture performance metrics from the same
    • Read and experiment with the Marionette driver for Firefox - maybe it helps me overcome some of my challenges with Firefox & Maps in CI environment (headless using xvfb)
    • Remove "phantomJS" as a supported browser from my framework by ensuring headless tests work with Chrome & Firefox using xvfb
    • Highlight element when running tests before taking screenshots - will help in debugging
    • Experiment with different loggers & reporters - Allure, Winston logger
    • Better "promise" handling in framework to keep abstraction layers sane
  • Revive WAAT - Web Analytics Automation Testing Framework - create a new plugin using the Proxy Server approach (a sketch follows this list). Also remove the Omniture Debugger and HTTPSniffer plugins.
  • Refocus energy on TTA - Test Trend Analyzer.
  • Keep vodQA going strong - it's a good community initiative
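
To make the proxy-server idea concrete: a minimal sketch, assuming the BrowserMob Proxy library with Selenium WebDriver (the flow and URL patterns are illustrative - this is not the eventual WAAT plugin):

    import java.util.List;
    import java.util.stream.Collectors;

    import net.lightbody.bmp.BrowserMobProxy;
    import net.lightbody.bmp.BrowserMobProxyServer;
    import net.lightbody.bmp.client.ClientUtil;
    import net.lightbody.bmp.core.har.Har;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.remote.CapabilityType;
    import org.openqa.selenium.remote.DesiredCapabilities;

    public class ProxyBasedAnalyticsCapture {

        public static void main(String[] args) {
            // Start an embedded proxy and route the browser through it.
            BrowserMobProxy proxy = new BrowserMobProxyServer();
            proxy.start(0);

            DesiredCapabilities capabilities = new DesiredCapabilities();
            capabilities.setCapability(CapabilityType.PROXY, ClientUtil.createSeleniumProxy(proxy));
            WebDriver driver = new ChromeDriver(capabilities);

            // Record a HAR (HTTP Archive) for the action under test.
            proxy.newHar("home-page-load");
            driver.get("https://example.com"); // placeholder URL

            // Filter the captured traffic down to the analytics calls.
            Har har = proxy.getHar();
            List<String> analyticsUrls = har.getLog().getEntries().stream()
                    .map(entry -> entry.getRequest().getUrl())
                    .filter(url -> url.matches(".*webtrends.*|.*omtrdc.*")) // illustrative patterns
                    .collect(Collectors.toList());

            // A WAAT-style plugin would verify the expected tags / parameters
            // against these captured URLs.
            System.out.println("Captured analytics calls: " + analyticsUrls);

            driver.quit();
            proxy.stop();
        }
    }

Since the proxy sits outside the browser, this approach is browser and UI-driver independent, and (with the proxy's MITM support) can also capture HTTPS traffic - the pain points of the existing plugins.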

See you all in Selenium Conference UK in November 2016!


Monday, October 12, 2015

Web Analytics and the new kids in town!

I spoke in Agile Testing Alliance Global Gathering on 8th Oct in Bangalore on "The What, Why and How of Web Analytics Testing".

This talk was my take on explaining a very important, yet quite ignored, aspect of Product / Application Development - Web Analytics. Below is the abstract of the talk, followed by slides and video from the talk.


Topic: The What, Why & How of Web Analytics Testing

Learning Objectives:

The most used and heard-about buzzwords in the Software Industry today are … IoT and Big Data!

With IoT, with a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which if analyzed and used correctly, can feed into creating more opportunities and increased value propositions.

There are 2 types of analysis that one needs to think about.
1. How is the end-user interacting with the product? This will give some level of understanding into how to re-position and focus on the true value add features for the product.
2. With the huge volume of data being generated by the end-user interactions, and the data being captured by all devices in the food-chain of the offering, it is important to identify patterns from what has happened, and find out new product / value opportunities based on usage patterns.

Learn what Web Analytics is, why it is important, and see some techniques for testing it manually and also automating that validation.


Slides from the talk

Monday, May 19, 2014

WAAT at StarEast2014

I was speaking about "Build the 'right' regression suite using Behavior Driven Testing (BDT)" at StarEast 2014 and met Marcus Merrell, who was speaking about "Automated Analytics Testing with Open Source Tools". Deep into our conversation, I figured out that he had used WAAT; a few others at the table were also aware of it and had used it before. Felt great!

Friday, March 28, 2014

WAAT Java v1.5.1 released today

After a long time, and with a lot of push from collaborators and users of WAAT, I have finally updated WAAT (Java) and made a new release today.

You can get this new version - v1.5.1 directly from the project's dist directory.

Once I get some feedback, I will also update WAAT-ruby with these changes.

Here is the list of changes in WAAT_v1.5.1:

Changes in v1.5.1

  • Engine.isExpectedTagPresentInActualTagList in the Engine class is now public
  • Updated Engine to work without creating a testData.xml file, by directly passing the expectedSectionList for the tags. Added a new method:
        Engine.verifyWebAnalyticsData(String actionName, ArrayList<Section> expectedSectionList, String[] urlPatterns, int minimumNumberOfPackets)
  • Added an empty constructor to Section.java to prevent a marshalling error
  • Support for fragmented packets
  • Updated Engine to support Pattern comparison, instead of String contains
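
For illustration, a test using the new method could look roughly like the sketch below. This is a hedged example: Engine and Section are WAAT classes, but how you construct and populate them depends on the plugin you configure, so the setup here is a placeholder - only the verifyWebAnalyticsData signature comes from the change list above.

    import java.util.ArrayList;

    public class HomePageAnalyticsTest {

        // Engine and Section come from the WAAT jar; how the engine is
        // instantiated depends on the configured plugin (placeholder here).
        public void verifyHomePageTags(Engine engine, Section expectedHomePageTags) {
            // v1.5.1: build the expected sections in code - no testData.xml file needed.
            ArrayList<Section> expectedSectionList = new ArrayList<Section>();
            expectedSectionList.add(expectedHomePageTags);

            // v1.5.1: URL patterns are compared as Patterns, not via String contains.
            String[] urlPatterns = {".*WebTrends.*", ".*omtrdc\\.net.*"}; // illustrative

            // Verify that at least 1 captured packet matches a pattern and
            // carries the expected tags.
            engine.verifyWebAnalyticsData("HomePageLoad", expectedSectionList, urlPatterns, 1);
        }
    }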

Do let me know if you see any problems / issues with this update.

Thanks.

Sunday, October 6, 2013

Offshore Testing on Agile Projects



Reality of organizations

Organizations are now spread across the world. With this spread, having distributed teams is a reality. Reasons could be a combination of various factors, including:

Globalization
Cost
24x7 availability
Team size
Mergers and Acquisitions
Talent

The Agile Software methodology talks about various principles to approach Software Development. There are various practices that can be applied to achieve these principles. 

The choice of practices is very significant in ensuring the success of the project. Some of the parameters to consider, in no particular order, are:

Skillset on the team
Capability on the team
Delivery objectives
Distributed teams
Working with partners / vendors
Organization Security / policy constraints
Tools for collaboration
Time overlap between teams
Mindset of team members
Communication
Test Automation
Project Collaboration Tools
Testing Tools
Continuous Integration

** The above list is from a Software Testing perspective.

This post is about what practices we implemented as a team for an offshore testing project.

Case Study - A quick introduction

An enterprise had a B2B product providing an online version of a physically conducted auction for selling used-vehicles, in real-time and at high-speed. Typical participation in this auction is by an auctioneer, multiple sellers, and potentially hundreds of buyers. Each sale can have up to 500 vehicles. Each vehicle gets sold / skipped in under 30 seconds - with multiple buyers potentially bidding on it at the same time. Key business rules: only 1 bid per buyer, no consecutive bids by the same buyer.

Analysis and Development was happening across 3 locations – 2 teams in the US, and 1 team in Brazil. Only Testing was happening from Pune, India.

“Success does not consist in never making mistakes but in never making the same one a second time.”

We took that to heart, very sincerely. We applied all our learnings and experience in picking the practices that would make us succeed. We consciously sought to be creative and innovative, and applied out-of-the-box thinking to how we approached testing (in terms of strategy, process, tools, techniques) for this unique, interesting and extremely challenging application - ensuring we did not go down the same path again.

Challenges

We had to overcome many challenges on this project.
  • Creating a common DSL that would be understood by ALL parties - i.e. Clients / Business / BAs / PMs / Devs / QAs
  • All examples / forums talk about trivial problems - whereas we had a lot of data and complex business scenarios to take care of
  • Cucumber / Capybara / WebDriver / Ruby do not offer an easy way to do concurrency / parallel testing
  • We needed to simulate, in our manual + automated tests, "n" participants at a time interacting with the sale / auction
  • A typical sale / auction can contain 60-500 buyers, 1-x sellers, and 1 auctioneer. The sale / auction can contain anywhere from 50-1000 vehicles to sell. There can be multiple sales going on in parallel. So how do we test these scenarios effectively? (See the concurrency sketch after this list.)
  • Data creation / usage was a huge problem (ex: a production subset snapshot is >10GB compressed in size, and a refresh takes a long time too)
  • Getting a local environment in Pune to continue working effectively - all pairing stations / environment machines use RHEL Server 6.0 and are auto-configured using Puppet. These machines are registered to the client account on the RedHat Satellite Server.
  • Communication - we were working from 10K miles away, with a time difference of 9.5 / 10.5 hours (depending on DST), which meant almost 0 overlap with the distributed team. To add to that complexity, our BA was in another city in the US - so another time difference to take care of.
  • End-to-end Performance / Load testing was not even part of our scope - but something we were very wary of, in terms of what can go wrong at that scale
  • We needed to be agile - i.e. test stories and functionality in the same iteration
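
Our framework was built in Cucumber / Capybara / Ruby; purely to illustrate the "n concurrent participants" problem in a self-contained way, here is a sketch in Java (all names are illustrative) that holds all simulated buyers at a barrier and releases them together, so their bids genuinely collide:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ConcurrentBiddingSimulator {

        public static void main(String[] args) throws Exception {
            int numberOfBuyers = 100; // a sale can have 60-500 buyers
            CountDownLatch startSignal = new CountDownLatch(1);
            ExecutorService pool = Executors.newFixedThreadPool(numberOfBuyers);
            List<Future<Boolean>> bids = new ArrayList<>();

            for (int i = 0; i < numberOfBuyers; i++) {
                final String buyerId = "buyer-" + i;
                bids.add(pool.submit(() -> {
                    startSignal.await();      // hold every buyer at the barrier
                    return placeBid(buyerId); // all bids fire near-simultaneously
                }));
            }

            startSignal.countDown(); // release all buyers at once

            int acceptedBids = 0;
            for (Future<Boolean> bid : bids) {
                if (bid.get()) {
                    acceptedBids++;
                }
            }
            pool.shutdown();

            System.out.println("Accepted bids: " + acceptedBids);
        }

        // Placeholder: in the real suite this would drive the application
        // (via UI or API) and report whether this buyer's bid was accepted.
        private static boolean placeBid(String buyerId) {
            return Math.random() < 0.01; // stub
        }
    }

The real tests would then assert that the business rules - only 1 bid per buyer, and no consecutive bids by the same buyer - held even under this load.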

All the above-mentioned problems meant we had to come up with our own unique way of tackling the testing.

Our principles - our North Star

We stuck to a few guiding principles as our North Star:
  • Keep it simple
  • We know the goal, so evolve the framework - don't start building everything from step 1
  • Keep sharing the approach / strategy / issues faced on regular basis with all concerned parties and make this a TEAM challenge instead of a Test team problem!
  • Don't try to automate everything
  • Keep test code clean

The End Result

At the end of the journey, here are some of the highlights of the offshore testing project:
  • Tests were specified in the form of user journeys following the Behavior Driven Testing (BDT) philosophy – specified in Cucumber.
  • Created a custom test framework (Cucumber, Capybara, WebDriver) that tests a real-time auction - in a very deterministic fashion.
  • We had 65-70 tests in the form of user journeys that covered the full automated regression for the product.
  • Our regression completed in less than 30 minutes.
  • We had no manual tests to execute as part of regression.
  • All tests (= user journeys) were documented directly in Cucumber scenarios and were automated.
  • Anything that was not part of the user journeys was pushed down to the dev team to automate (or we wrote automation at that lower level).
  • Created a 'special' long-running test suite that simulates a real sale with 400 vehicles, >100 buyers, 2 sellers and an auctioneer.
  • Created special concurrent (high-speed parallel) tests that ensure, even at the highest possible load, the system behaves correctly.
  • Since there was no separate performance and load test strategy, created special utilities in the automation framework to benchmark "key" actions (a sketch follows this list).
  • No separate documentation or test cases were ever written / maintained - and we never missed them.
  • A separate, special sanity test runs in production after deployment is done, to ensure all the integration points are set up properly.
  • Changed our work timings (for most team members) to 12pm - 9pm IST to get more overlap and remote-pairing time with the onsite team.
  • Set up an ice-cream meter - for those who came late to standup.
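
The benchmarking utilities mentioned above lived in our Ruby test framework; purely as an illustration of the idea (the names here are mine, not the project's), a minimal timing helper could look like this in Java:

    import java.util.concurrent.TimeUnit;

    public class ActionBenchmark {

        // Times a "key" action (e.g. place bid, vehicle sold) and fails
        // the test if it exceeds the agreed threshold for that action.
        public static void benchmark(String actionName, long thresholdMillis, Runnable action) {
            long start = System.nanoTime();
            action.run();
            long elapsedMillis = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);

            System.out.println(actionName + " took " + elapsedMillis + "ms");
            if (elapsedMillis > thresholdMillis) {
                throw new AssertionError(actionName + " exceeded threshold: "
                        + elapsedMillis + "ms > " + thresholdMillis + "ms");
            }
        }
    }

Wrapping every key action in such a helper means each regression run doubles up as a lightweight performance benchmark.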

Innovations and Customizations

Necessity breeds innovation! This was so true in this project.

Below are the different areas and specifics of the customizations we did in our framework.

Dartboard

Created a custom board “Dartboard” to quickly visualize the testing status in the Iteration. See this post for more details: “Dartboard – Are you on track?”

TaaS

To automate the last mile of Integration Testing between different applications, we created an open-source product – TaaS. It provides a platform / OS / tool / technology / language agnostic way of automating the integration tests between applications.

Base premise for TaaS:

Enterprise-sized organizations have multiple products under their belt. The technology stack used for each product is usually different – for various reasons.

Most of such organizations like to have a common Test Automation solution across these products in an effort to standardize the test automation framework.

However, this is not a good idea! If products in the same organization can be built using different / varied technology stacks, then why should you impose this restriction on the Test Automation environment?

Each product should be tested using the tools and technologies that are “right” for it.

“TaaS” is a product that allows you to achieve the “correct” way of doing Test Automation.

See my blog for all information related to TaaS.

WAAT - Web Analytics Automation Testing Framework

I had created the WAAT framework for Java and Ruby in 2010/2011. However, this framework had a limitation - it did not work with products that are configured to work only in HTTPS mode.

For one of the applications, we needed to test the WebTrends reporting. Since this application worked only in HTTPS mode, I created a new plugin for WAAT - JSSniffer - that can work with HTTPS-only applications. See my blog for more details about WAAT.