Tester Shares His Experience from the Software Testing World Cup 2014

Tester and uTest Blog contributor Daniel Knott was a participant in last week’s Europe preliminary round of the Software Testing World Cup (STWC) 2014, and recently blogged about the experience on his own blog. Be sure to check out more on the STWC, which is currently in full swing.

[Last Friday], the preliminary Software Testing World Cup competition took place for Europe. To summarize it in one sentence: it was awesome, and a great experience for software testers.

The software under test was a sales tool. Our goal was to test the application on as many mobile devices as possible, across different screen sizes, for usability, functionality, and design. Load and performance testing were out of scope for the session, and security testing had a low priority.

I was part of a team distributed across Europe: one member was in Barcelona, one in Hamburg, one in Dusseldorf, and one in Wiesbaden. We organized ourselves via Google Docs and a Google Hangout session during the competition. One person (myself) listened to the live STWC YouTube channel, where the judges were answering questions from the teams, and relayed the important information to the rest of the team. Each of us had a specific area to focus on: usability, functionality, design, and some security testing.

We tested the application on iPad Minis, iPhones and on different Android devices. In total, we filed 38 bugs in the provided defect management tool. 15 of the filed bugs were critical. As an example, we were able to access sensitive data, snapshots and account settings belonging to one of my team members. Beyond that, there were lots of cross-site scripting problems in the application. 9 of the filed bugs had a high priority. Here, for example, it was very easy to trigger internal server errors on the application backend by entering special characters into the input fields. 11 bugs had medium severity, and 3 had low severity.
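The cross-site scripting checks described above boil down to a simple pattern: feed special characters and script fragments into an input field, and see whether the application reflects them back unescaped. Below is a minimal, hypothetical sketch of that idea (not the team’s actual harness, and not the real sales-tool backend); the `vulnerable_render` and `safe_render` functions are stand-ins simulating an application response.

```python
import html

# A few classic probe payloads: script injection, attribute breakout,
# and a quote-heavy string that also tends to trip up naive backends.
PAYLOADS = [
    "<script>alert(1)</script>",
    '"><img src=x onerror=alert(1)>',
    "' OR '1'='1",
]

def is_reflected_unescaped(response_body: str, payload: str) -> bool:
    """Flag a potential reflected XSS: the raw payload appears verbatim."""
    return payload in response_body

# Stand-in for a vulnerable backend: echoes user input verbatim.
def vulnerable_render(user_input: str) -> str:
    return f"<p>You searched for: {user_input}</p>"

# Stand-in for a fixed backend: HTML-escapes input before rendering.
def safe_render(user_input: str) -> str:
    return f"<p>You searched for: {html.escape(user_input)}</p>"

for p in PAYLOADS:
    assert is_reflected_unescaped(vulnerable_render(p), p)      # finding!
    assert not is_reflected_unescaped(safe_render(p), p)        # escaped, OK
```

In a real session the rendered output would of course come from HTTP responses of the application under test rather than from local functions, but the reflected-payload check itself stays the same.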

During the test session, we talked a lot about the current status of each team member to get an impression of the application and its problems. While testing the application, everyone on the team was very focused, but nonetheless, we had lots of fun and were laughing a lot.

Time flew by very fast, and in the last 30 minutes we focused on our test report. We described our testing approach, our test setup, our findings, and a recommendation for the application (and that recommendation was not so good).

In total, 3169 bugs were filed by 250 teams, for an average of roughly 12.7 bugs per team.

Overall, we had a good feeling about the test session and were very happy with our results. Right before 9 p.m., we sent our bug reports and the final test report to the judges. And then the STWC 2014 was over! It was fun and awesome to work on a distributed team and to test an application under time pressure! We are really looking forward to the results, and we hope to be among the final teams, of course. Thanks to the judges and everyone who was involved with planning this event. Happy testing!

Daniel Knott has been in software development and testing since 2008, working for companies including IBM, Accenture, XING and AOE. He is currently a Software Test Manager at AOE GmbH, where he is responsible for test management and automation in mobile and Web projects. He is also a frequent speaker at various Agile conferences. You can find him over at his blog or on Twitter @dnlkntt.

Comments

  1. Marek Langhans says

    Very well written, but the software under test for Europe preliminary wasn’t the defect management tool (Agile Manager) but a completely different software, some Sales Tool. Did you actually test the Agile Manager? :)
