
Friday, November 30, 2018

Recording from webinar on The Missing Feedback Loop now available

On 21st Nov, TestCraft.io hosted me for a webinar where I spoke about "The Missing Feedback Loop - The Tools, Techniques, and Automation to Solve It".

You can get the recording from here (https://hubs.ly/H0fBDN50).






Wednesday, November 28, 2018

A blog about my blog

At the risk of this ending up becoming a recursion, I want to share some interesting statistics about my blog - essenceoftesting.blogspot.com - and the content shared on my SlideShare account.




Here are some charts from the Analytics of my SlideShare and blog:
  • Top content from my SlideShare
  • Blog overview
  • Audience
  • Popular posts
  • Referrers


Monday, November 5, 2018

Upcoming webinar - The Missing Feedback Loop

I am very excited to share that I will be conducting a webinar, hosted by testcraft.io, on "The Missing Feedback Loop - The Tools, Techniques, and Automation to Solve It".

You can register for the webinar from here (https://hubs.ly/H0fp4by0).





Date & Time:
Wednesday, November 21, 2018 at 02:00 PM New York (EST), 11:00 AM San Francisco (PST) and 08:00 PM Amsterdam (UTC+1)


Thursday, September 6, 2018

Some good examples of Data Science, AI & ML

Following up on my earlier post, ODSC - Data Science, AI, ML - Hype, or Reality?, I thought it would be good to also share some examples of the good work happening in the field.

Here are some of the examples I got to hear about at the ODSC conference, most of which are available to the common person:
  • Amazing work done in the complex field of Speech recognition 
    • Why complex? Think about languages, dialects, multiple simultaneous conversations, different speeds of talking, etc.
  • Text to speech
    • Ex: This is especially helpful for people with disabilities
  • Speech to Text
    • Ex: Alexa, Google Voice, and similar applications
  • Traffic control / Routes / Navigation
    • Ex: Google Maps
  • Recommendation engines
    • Ex: eCommerce products
  • Preventive maintenance
    • Many advanced vehicles have a number of sensors that can alert the driver / car manufacturer about potential issues coming up, or about service due for the vehicle
  • Autonomous vehicles
    • Ex: Self driving vehicles
    • Ex: Optimizing cab scheduling / routing - there was a good session on how OLA manages its complexity in scheduling and routing, which is very applicable to the eCommerce, Aviation, and Hotel industries, etc.
    • I also recently saw a video of a Volvo truck driver getting out of the truck on difficult terrain and walking in front of it, controlling its movement using a game-like controller
  • Medical equipment / gadgets for preventive / alerting health-care products

Also, Dr. Ravi Mehrotra from IDeaS made a very powerful statement in his keynote - one that I loved!

He said - "The best way to learn is to forget what is not important."

This statement resonates a lot with what I think: one needs to forget what is not (as) important, in order to focus on and prioritize what is important and can add value.

This is especially true for Testers to keep in mind!


Monday, September 3, 2018

ODSC - Data Science, AI, ML - Hype, or Reality?

I got a chance to attend ODSC India, held in Bangalore on 31st Aug / 1st Sept. For those who don't know, ODSC is the largest Applied Data Science and AI conference, and it was conducted in India for the first time this year.

I was very excited to attend this for a couple of reasons:

  • I was attending a conference after a long time (i.e. one where I was not speaking). So this was going to be a pure learning and discovery expedition for me.
  • Data Science / AI / ML have become huge buzzwords in the industry. I had some opinions about them - but those were based on limited knowledge / understanding. I was hungry to learn some specifics behind these buzzwords.


Since I was going to travel to Bangalore for ODSC anyway, I also decided to participate in the pre-conference workshop - Advanced Data Analysis, Dashboards And Visualization. I thought it would be interesting to learn about the What, Why and How of the techniques of Data Analysis, Dashboards and Visualization - which would help me as I rebuild / extend TTA (Test Trend Analyzer). Though the workshop was good, it focused completely on Tableau as a tool and unfortunately did not meet my objectives / expectations. That said, there is another tool I came across at the conference - KNIME - which seems interesting and which I am going to try out.

The conference itself was good though. I attended a lot of sessions and had a lot of hallway conversations with many interesting people. As is typical when attending a conference, I liked some sessions better than others - some were amazing, some were mediocre.

Here is my unstructured assessment of what I now think, based on what I heard and discussed:
  • The advanced mathematics learnt in college has real applications in data science. So if children / kids ask why they should study Statistics - here is an answer!
  • Creating data models without Business Context will not work. If it does, you have been lucky :)
  • There are some interesting case studies and success stories of AI & ML. But these are the same success stories that have been around for quite some time. All the other "noise" around AI & ML so far seems to be hype.
  • There is a lot of value in understanding historical data better. Based on that understanding, there can be opportunities to forecast the future. There is a huge risk in doing this forecasting IF the % of uncertainty is not included as part of it - and yet it is very easily ignored. (See the short sketch after this list.)
  • Understanding of Neural Networks, computing, and algorithms is essential to building intelligent solutions for complex problems.
  • It is not sufficient to get better / accurate prediction results. Being able to explain how and why those results are better / same / worse is equally important. In many cases, this would be a regulatory requirement.
  • Data Science is the "art" & "science" of understanding data better. To do this, we need to first cleanse / prep the data, simplify it using various techniques, and learn techniques to visualize the data.
  • There is a "grammar of graphics" and a "grammar of interactive graphics" - which helps in thinking about data visualization.
  • Deploying these AI / ML solutions to production is not a trivial task - mainly because of the high computing power and huge volume of data processing required to make them production ready. This is a huge opportunity for the general Software Development / Testing / DevOps community to solve problems faced by data scientists / people in the Data Science / AI / ML domain.
  • With data privacy laws rightly becoming stricter, you need to be careful and use only legally obtained sample datasets for analyzing / training the data models - else there are going to be huge penalties for the companies involved. (This is in reference to GDPR, and similar laws coming up in the USA and also India.)
  • Earlier, only PhD holders were considered qualified to work on Data Science. Nowadays, the trend is to give relevant training to interns, have them work on these problems, and then get the results validated / explained by the PhD specialists.
  • In a nutshell - Data Science, AI and ML use specialized tools and technologies to solve different problems. People / organizations were doing these activities before the buzzwords were coined / became popular.
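
To make that point about uncertainty concrete, here is a small, hypothetical sketch (my own illustration, not from any conference session): it fits a simple linear trend to made-up historical numbers and reports the forecast together with a rough prediction interval, so the uncertainty stays visible instead of being ignored.

```python
# A minimal sketch: fit a linear trend to (made-up) monthly order counts
# and report the forecast together with a rough ~95% prediction interval,
# instead of a bare point estimate.
import numpy as np

# Hypothetical historical data: orders per month for the last 12 months.
months = np.arange(12)
orders = np.array([120, 132, 128, 140, 151, 149, 160, 172, 168, 181, 190, 187])

# Fit a simple linear trend (ordinary least squares via polyfit).
slope, intercept = np.polyfit(months, orders, deg=1)

# The spread of the residuals tells us how wrong the trend line typically is.
residuals = orders - (slope * months + intercept)
sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters

# Forecast the next month, with a rough +/- 1.96 * sigma interval.
next_month = 12
point_forecast = slope * next_month + intercept
low, high = point_forecast - 1.96 * sigma, point_forecast + 1.96 * sigma

print(f"Forecast for month {next_month}: {point_forecast:.0f} orders "
      f"(roughly between {low:.0f} and {high:.0f})")
```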
So, what is my core takeaway from this?
  • As with any new buzzword, there is interesting work happening in Data Science, AI & ML - but the majority of those claiming to be in the field are just creating and riding the hype!

That said, I want to do the following:
  • Find opportunities to investigate and understand Data Science + AI + ML in more detail.
  • Understand the skills and capabilities required, from a software developer + QA role perspective, to contribute more effectively in solving these newer problem statements
  • Learn Python / R
  • Experiment with various tools / libraries related to data visualization (a tiny example of the kind of experiment I have in mind is sketched below)
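
As a starting point for that last item, here is the kind of tiny experiment I have in mind - a hypothetical sketch using pandas and matplotlib with made-up numbers, plotting the sort of test-result trend that TTA is meant to surface.

```python
# A tiny, hypothetical visualization experiment: plot the pass-rate trend
# of a test suite over a few builds using pandas + matplotlib.
import pandas as pd
import matplotlib.pyplot as plt

# Made-up build results; in practice these would come from CI.
results = pd.DataFrame({
    "build": [101, 102, 103, 104, 105, 106],
    "passed": [220, 225, 218, 230, 231, 234],
    "failed": [15, 10, 17, 5, 4, 1],
})
results["pass_rate"] = results["passed"] / (results["passed"] + results["failed"])

# Simple trend chart of the pass rate per build.
ax = results.plot(x="build", y="pass_rate", marker="o", legend=False)
ax.set_ylabel("pass rate")
ax.set_title("Test pass-rate trend across builds")
plt.tight_layout()
plt.savefig("pass_rate_trend.png")  # or plt.show() when running interactively
```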


Saturday, March 17, 2018

Measuring Consumer Quality - The Missing Feedback Loop

I spoke in vodQA at ThoughtWorks, Pune on "Measuring Consumer Quality - the Missing Feedback Loop". 

This talk addresses the why and how from my earlier blog post on "Understanding, Measuring and Building Consumer Quality". I recommend you read that first, before going through the slides and video for this talk.


Abstract:

How to build a good quality product is not a new topic. Proper usage of methodologies, processes, practices, collaboration techniques can yield amazing results for the team, the organisation, and for the end-users of your product.

While there is a lot of emphasis on the processes and practices side, one aspect that is still spoken about only "loosely" is the feedback loop from your end-users back into making better decisions.

So, what is this feedback loop? Is it a myth? How do you measure it? Is there a "magic" formula to understand the data received? How do you add value to your product using this data?

In this interactive session, we will use a case study of a B2C entertainment-domain product (having millions of consumers) as an example to understand and also answer the following questions:

  • The importance of knowing your Consumers 
  • How do you know your product is working well? 
  • How do you know your Consumers are engaged with your product? 
  • Can you draw inferences and patterns from the data to reach the point of being able to make predictions about Consumer behaviour, before making any code change? 
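
To give a flavour of the kind of analysis the abstract talks about, here is a small, hypothetical sketch (not the actual case-study implementation): it rolls up raw consumer events into a simple daily engagement signal that can be tracked release over release.

```python
# A hypothetical sketch: turn raw consumer events into a simple daily
# engagement signal (active consumers and average playbacks per consumer).
from collections import defaultdict
from datetime import date

# Made-up raw events; in a real product these would come from analytics pipelines.
events = [
    {"consumer_id": "c1", "event": "playback_started", "day": date(2018, 3, 10)},
    {"consumer_id": "c1", "event": "playback_started", "day": date(2018, 3, 10)},
    {"consumer_id": "c2", "event": "playback_started", "day": date(2018, 3, 10)},
    {"consumer_id": "c2", "event": "app_opened",       "day": date(2018, 3, 11)},
    {"consumer_id": "c3", "event": "playback_started", "day": date(2018, 3, 11)},
]

# Group playback events per day and per consumer.
playbacks_per_day = defaultdict(lambda: defaultdict(int))
for e in events:
    if e["event"] == "playback_started":
        playbacks_per_day[e["day"]][e["consumer_id"]] += 1

# Daily engagement: how many consumers played something, and how often on average.
for day in sorted(playbacks_per_day):
    per_consumer = playbacks_per_day[day]
    active = len(per_consumer)
    avg = sum(per_consumer.values()) / active
    print(f"{day}: {active} active consumers, {avg:.1f} playbacks per active consumer")
```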

Video:


Slides can be found here.

Pictures:



Friday, March 17, 2017

Patterns in Test Automation Framework at STPCon

I spoke about Patterns of a "good" Test Automation Framework at STPCon 2017. Here are the details from the talk.


Abstract

Building a Test Automation Framework is easy – there are so many resources / guides / blogs / etc. available to help you get started and help solve the issues you get along the journey.
However, building a “good” Test Automation Framework is not very easy. There are a lot of principles and practices you need to use, in the right context, with a good set of skills required to make the Test Automation Framework maintainable, scalable and reusable.
Design Patterns play a big role in helping achieve this goal of building a good and robust framework.
In this talk, we will talk about, and see examples of, various types of patterns you can use for:
  • Building your Test Automation Framework
  • Test Data Management
Using these patterns, you will be able to build a good framework that will help keep your tests running fast and reliably in your CI / CD setup!
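
As one concrete illustration of the kind of pattern covered in the talk, here is a minimal Page Object sketch in Python with Selenium. The page, URL and locators are hypothetical; the point is that tests express intent while locators and WebDriver calls stay encapsulated in page classes.

```python
# A minimal Page Object sketch (hypothetical page and locators): the test
# expresses intent ("log in as ..."), while locators and Selenium calls
# stay encapsulated in the page classes.
from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginPage:
    URL = "https://example.com/login"  # hypothetical URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def log_in_as(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "login-button").click()
        return HomePage(self.driver)


class HomePage:
    def __init__(self, driver):
        self.driver = driver

    def greeting(self):
        return self.driver.find_element(By.CSS_SELECTOR, ".greeting").text


def test_valid_login():
    driver = webdriver.Chrome()
    try:
        home = LoginPage(driver).open().log_in_as("anand", "secret")
        assert "Welcome" in home.greeting()
    finally:
        driver.quit()
```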

Session Takeaways:


  • Patterns for building a Test Automation Framework.
  • Patterns for Test Data Management, with the pros and cons of each.

Slides



Pictures




Tuesday, July 19, 2016

Slides & Video from Test Data - Food for Test Automation Framework

As posted earlier, I spoke about "Test Data - Food for your Test Automation Framework!" at Selenium Conference 2016 in Bangalore on 24th June 2016, in front of a packed and very interactive audience. What a great time it was!

The video of that session is finally available! I have updated the original post, and have also linked the slides & video in this post.

Video



Slides



Saturday, June 25, 2016

Test Data - Food for Test Automation Framework at Selenium Conference 2016

[Updated - Pictures added, Video added]

I spoke about "Test Data - Food for your Test Automation Framework!" at Selenium Conference 2016 in Bangalore on 24th July 2016 in front of a packed and a very interactive audience. What a great time it was!

Here is some information about the session - 

Abstract

Building a Test Automation Framework is easy - there are so many resources / guides / blogs / etc. available to help you get started and help solve the issues you get along the journey.

Teams already building 1000s of tests of various types - UI, web service-based, integration, unit, etc. - are proof of that.

However, building a "good" Test Automation Framework is not very easy. There are a lot of principles and practices you need to use, in the right context, with a good set of skills required to make the Test Automation Framework maintainable, scalable and reusable.
In this talk, we will focus on one of the critical aspects and patterns in building the Test Automation framework - Test Data!

We will look at different data patterns and options, and techniques for how to create, manage, use, and reuse Test Data in a way that keeps the tests running in a reliable and deterministic way. We will also discuss what questions to ask, and what things to think about, when selecting your approach for Test Data!

This discussion will be applicable to any type of Test Automation (web / mobile / desktop), but we will focus primarily on UI automation frameworks, e.g. using Selenium.
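
As a small illustration of one such pattern, here is a hypothetical Test Data builder sketch in Python (not taken from the talk itself): each test spells out only the data it cares about, while defaults and unique values keep the tests independent and deterministic.

```python
# A hypothetical "test data builder" sketch: tests state only what they care
# about; defaults and unique values keep tests independent and repeatable.
import itertools
from dataclasses import dataclass

_unique = itertools.count(1)


@dataclass
class User:
    email: str
    role: str
    is_active: bool


class UserBuilder:
    def __init__(self):
        # Sensible defaults; a unique email avoids collisions between tests.
        self._email = f"user{next(_unique)}@test.example"
        self._role = "customer"
        self._is_active = True

    def with_role(self, role):
        self._role = role
        return self

    def inactive(self):
        self._is_active = False
        return self

    def build(self):
        return User(email=self._email, role=self._role, is_active=self._is_active)


# Each test spells out only the detail that matters to it.
admin = UserBuilder().with_role("admin").build()
blocked_user = UserBuilder().inactive().build()
print(admin)
print(blocked_user)
```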

Slides

Video


Pictures