Google Test Automation Conference 2016 – When top players talk about Automated Testing

Posted by Tobias Spöcker on December 5, 2016

[Image: the Golden Gate Bridge shrouded in fog]

This image perfectly captures my first feelings about the conference, but let me go into a bit more detail first.
I signed up for the Google Testing Blog more than a year ago. There I found a lot of interesting and useful reading about the world of automated testing. When I later got an email from Google informing me that they would be holding a conference, I was not entirely sure whether I should apply. Is it relevant for me? Am I experienced enough to contribute? Well, what’s the worst that could happen? So I applied, and in the end I did not regret it.

GTAC facts:
It is by invite only! There is no way to purchase a ticket, but that also means it is free.
It’s a rather small conference with about 300 participants.
The conference is quite short, at only two days with one track.
Google makes sure participants come from all over the world with different levels of experience.

After my application in spring, it took quite a while until Google provided more information. The announcement of the selected attendees was even postponed from June to July, but that made me all the happier when I received the invitation mail from the conference organizers. And that brings me back to the picture. The mail simply stated that I had been selected and that I should confirm by joining a Google group set up for the conference. Beyond that, not a lot of information was flowing in the beginning. I was therefore super excited to go on this adventure, curious what it would really look like behind the imaginary fog.

You might imagine that a conference held by such a big player is completely organized upfront and everything is set in stone months before. Well, it is still an information technology company in the end, right ;)?
With the conference coming closer, more and more details were published and I got more and more excited. When the schedule was finally out, I was really surprised to discover how broad the spectrum of the talks was. They ranged from hardware-related testing of telepresence robots (Sheldon Cooper approved!) to the wretched topic of flaky test results to testing audio quality.

The Google Test Automation Conference concluded just 2 weeks ago, so let me wrap up what I experienced there. I would like to highlight two very interesting and inspiring talks.

Flaky tests?!?

So wait a second, did somebody just say flaky tests? Who has never heard of them, the archenemy of every tester? This was one of the topics I was really hoping would be addressed, and it was, by a Google delegate no less: Atif Memon, a computer science professor from the University of Maryland. It was a very good talk, although I felt it lacked a clear goal or direction at the end.

In short, they try to identify up front which tests will be flaky by analyzing the statistics of their test-run data, and believe me, they have plenty of that data, with 3.5 million tests that take them only about 20 minutes to run. With the results they can later decide whether tests identified as flaky should be rewritten or deleted, so they can rely on the actual test results. Sounds pretty neat, huh? The research and the work they are putting into this topic is not finished, so expect more about it on the Google Testing Blog.
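The talk stayed at the level of statistics rather than code, but the core signal is easy to illustrate: a test that both passes and fails on the same, unchanged code revision is a flaky-test candidate. Below is a minimal sketch of that idea in Python; the record format, the min_runs threshold and the function name are my own assumptions for illustration, not Google's actual analysis pipeline.

```python
from collections import defaultdict

def find_flaky_candidates(test_runs, min_runs=5):
    """Flag tests that both passed and failed on the same, unchanged code revision."""
    # Group outcomes per (test, revision) pair.
    outcomes = defaultdict(list)
    for test_name, revision, passed in test_runs:
        outcomes[(test_name, revision)].append(passed)

    candidates = set()
    for (test_name, revision), results in outcomes.items():
        # Enough reruns on one revision to judge, and at least one disagreement.
        if len(results) >= min_runs and len(set(results)) > 1:
            candidates.add(test_name)
    return candidates

# Toy usage example:
runs = (
    [("LoginTest", "abc123", True)] * 3
    + [("LoginTest", "abc123", False)] * 2   # same revision, mixed outcomes -> flaky candidate
    + [("SearchTest", "abc123", True)] * 5   # consistently green -> not flagged
)
print(find_flaky_candidates(runs))           # {'LoginTest'}
```

In practice such a signal would be combined with much richer data (timing, resource usage, rerun policies), which is exactly the kind of statistical work the talk hinted at.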


GTAC facts:
Everybody knows Google really takes care of its employees, but they also take care of their guests. Every day started with a decent breakfast buffet, and at lunch you could choose between five differently themed restaurants.

How to test dozens of systems?

Another presentation I’d like to mention was given by folks from the BBC (British Broadcasting Corporation). They gave a nice talk on Scale vs. Value. Jitesh Gosai and David Buckhurst shared their journey of maturing their testing department to include a separate Test Tool Infrastructure team. With mobile platforms becoming more and more prominent and devices being very diverse nowadays, this team ensures all kinds of setups are available for testing at any point in time. This includes mobile phones, televisions, PCs, and accessibility equipment (e.g. for the hearing impaired). Because this is really important to their business, they also introduced an Events Monitoring team that keeps an eye on the health and availability of the system.

They also use heuristics, something most testers will have heard of already: simple strategies for solving a problem. One of them is PUMA testing, which focuses on core functionality:

P – Prove Core Functionality – Are all basic systems doing what they are supposed to?
U – Understood by All – Can anyone easily understand the test results without special training?
M – Mandatory – It’s mandatory that all core functionality is tested and working.
A – Automated – Automation is implemented and designed to provide fast feedback.

Let’s recap

All in all, it was a really cool conference with so many interesting and unique topics from all kinds of speakers. I really liked the broad variety of subjects and how they were presented, but that is not all such an event provides. It was really nice to meet so many people from around the globe who are equally enthusiastic about the field of automated testing. It was very enjoyable to get together with these guys and have some nice chats about problems we currently face at our companies and share some tips and tricks that could possibly solve these issues.


GTAC facts:
More than 1,400 people applied for GTAC, with around 300 participants selected. Out of 208 speaker proposals, around 20 speakers were asked to present. Google even provided a dozen scholarships to applicants who otherwise could not have made it to the conference.

I would like to personally thank the Google folks and my employer, TOPdesk, for giving me the opportunity to learn so many new things and to broaden my horizons with this special experience. To sum up, my expectations were exceeded and the conference was a blast. If you ever have the chance to go there, you should! If you cannot, I would recommend at least watching all the recordings on the GTAC website. One thing I learned from the event is that we are not the only ones suffering from flaky tests. I also found out that we are on the right track with identifying, addressing and prioritizing them. There seems to be no holy grail for fixing this issue.

Google fun fact:
The first thing I implemented at TOPdesk after the conference is something Google started around eight years ago: “Testing on the Toilet”. Everybody knows the situation: you either sit on the toilet using your smartphone, or you end up reading the ingredients or instructions of a bathroom cleaner. Well, Google made use of this and put printed papers about testing topics on the walls next to the toilets.


At the end of the conference they announced where the next GTAC will be. It’s time for Europe again: GTAC 2017 will be held in London! Hope to see some of you there; I am definitely going to apply again!

About the author: Tobias Spöcker

Scrum Master and QA Engineer at TOPdesk. Always looking for ways to improve, personally and within the organisation. Also football (not handegg) enthusiast, climber and music lover!
