Why Are Testers Uninterested in Upgrading Their Skill Sets?

“The only type of testing that I can do is manual testing.”
“Test automation is very important, but I am too busy now to learn something new.”
“Test automation is useful, but I’ll learn it when I need it.”
“I am interested in test automation, but I don’t know any programming and it will take a long time to learn it.”
“I want to learn test automation, but my employer does not have any training programs.”

Have you ever heard any of these stories? I have, not just once but many times, about test automation, load testing, and web service testing.

Most of the testers I know say in one way or another that they would like to learn more about their profession but, “not now, maybe later, when conditions are better, when they need the new skills in their job, when their employer pays for their training, when someone trains them for free, when they are less busy, etc.” The list goes on.

Continue Reading

Google Test Automation Conference: Video From Days 1 & 2

The Google Test Automation Conference (GTAC) is an annual test automation conference hosted by Google, bringing together engineers to discuss advances in test automation and the field of test engineering.

GTAC 2014 was held just a few weeks ago at Google’s Kirkland office (Washington State, US), and we’re happy to present video of talks from both days of the conference.

If 15-plus hours of video below just isn’t enough, be sure to also check out all of our Automation courses available at uTest University today.

Testing Tool Showdown: liteCam HD vs. Mobizen

Clear, to-the-point bug reports that are backed up with solid evidence are a must for testers when it comes to communicating with developers and getting to the root cause of issues quickly.

And that evidence comes in the form of attachments, which add to a bug report by offering proof of the bug’s existence, enabling the customer or developer to reproduce and quickly rectify the issue at hand.

But with all of the options out there, we wanted to single out a couple that could get testers started, so we turned to two popular screen recording tools from our uTest Tool Reviews: liteCam and Mobizen.

Continue Reading

Latest Testing in the Pub Podcast Takes on Weekend Testing Europe

Steve Janaway and team are back for more pub pints over software testing discussion, in the latest Testing in the Pub podcast.

In Episode 13, UK-based software testers Amy Phillips and Neil Studd talk Weekend Testing Europe, the European chapter of Weekend Testing, which they relaunched in 2014.

Weekend Testing is a program that aims to facilitate peer-to-peer learning through monthly Skype testing sessions. If you’ll also recall, uTest contributor Michael Larsen is a founding member of the Americas chapter of the program.

Continue Reading

Join uTest for a #uTestChat Twitter Chat on Friday

This Friday, November 21st, we are excited for you to join @uTest on Twitter for #uTestChat starting at 1:00 p.m. EST. It’s time to huddle around the virtual water cooler and connect with your fellow software testers as we chat about all things software testing.

Have a question about furthering your career or breaking into a new testing type? How to write a great bug report? What’s the best testing tool for the job? Bring your thoughts and opinions, and get ready to spend some time connecting with the testing community.

What is a Twitter chat?

A Twitter chat (or tweet chat) is a live, real-time conversation between a group of people on Twitter. The group follows one hashtag (#uTestChat) and your moderators from the Community Management team (Linda and Ryan) will keep the discussion moving.

Continue Reading

Meet the uTesters: Michael Solomon

Michael Solomon is a Silver-rated tester on paid projects at uTest, hailing from the United States (New York). When he’s not testing software, Michael works as a freelance sound man in TV. You can visit some of his work over at his website.

Be sure to follow Michael’s profile on uTest as well so you can stay up to date with his activity in the community!

uTest: Android or iOS?

Michael: iOS! I have had an iPhone since the first one came out, and I think I have owned every model since the very first one. I do have a Samsung Galaxy S4 for testing purposes. While the Android platform has become easier for me to understand, I most definitely prefer iOS and its ability to sync seamlessly with my MacBook Air, Calendars, iMessage, etc.

Continue Reading

Selenium: 10 Years Later and Still Going Strong

In the field of testing technologies, it isn’t very often that we see a tool survive and grow richer over the course of a decade. Selenium recently marked its 10th anniversary, and this article takes a look at the ecosystem that Selenium has nurtured.

Agile and Selenium

The Agile Manifesto has been around longer than Selenium, and more teams are looking toward the agile form of software development to reduce their feedback cycles and practice continuous delivery. One of the practices that teams need to do well when working the agile way is test automation.

Test automation may seem easy, but for it to be truly effective the team needs a lot of discipline: defining its process, choosing the right tools and technologies to give quick feedback by running various types of tests (smoke, regression, etc.), and allowing the test automation to evolve and scale.
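As a minimal sketch of the idea, not something from the article, here is one way a team might separate fast smoke checks from deeper regression checks so the quick-feedback suite can run on every commit. The test case, the `login` helper, and the suite names are all hypothetical, using only Python’s standard `unittest` module:

```python
import unittest

class TestLogin(unittest.TestCase):
    """Hypothetical checks for a made-up login feature."""

    def test_login_page_loads(self):
        # Smoke test: a fast sanity check that the page model looks right.
        page = {"title": "Login", "fields": ["user", "password"]}
        self.assertEqual(page["title"], "Login")
        self.assertIn("password", page["fields"])

    def test_login_rejects_bad_password(self):
        # Regression test: deeper behavior worth running in the full suite.
        def login(user, password):
            return user == "alice" and password == "s3cret"
        self.assertFalse(login("alice", "wrong"))
        self.assertTrue(login("alice", "s3cret"))

def smoke_suite():
    """Collect only the fast smoke tests for a quick feedback run."""
    suite = unittest.TestSuite()
    suite.addTest(TestLogin("test_login_page_loads"))
    return suite

if __name__ == "__main__":
    # Run just the smoke suite; CI could run the full TestLogin class nightly.
    unittest.TextTestRunner(verbosity=2).run(smoke_suite())
```

The same split scales to Selenium-driven browser tests: keep the smoke suite small enough to finish within the iteration, and let the broader regression suite run on a schedule.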

That said, even today, completing test automation in the same iteration as development is a huge challenge for most teams. These challenges are aggravated, and become more apparent, when test automation relies on tools and technologies that are difficult to keep in sync with a rapidly changing product.

It was 2004 when Jason Huggins, while testing an internal application at ThoughtWorks, created JavaScriptTestRunner, a tool that changed the way browser-based testing is done. It then evolved into “Selenium,” which was subsequently made open source.

Continue Reading

Safety Language in Software Testing: Why It’s Not OK to Deal in Absolutes

Of course this has been tested. This is definitely working as it should be.

How many times has a tester or developer uttered these words, only to have them come back to haunt them? Or worse, lose credibility? As a tester, it seems like a no-brainer to use CYA language in your everyday work. Heck, one just has to look to prolific software tester James Bach’s recent talk at CAST to figure that out (“I’m reluctant to say ‘pass.’ I’d rather say, I don’t see a problem as far as I can tell”).

But is “safety language,” such as ‘it seems to be’ versus ‘it is,’ something that should be part of every tester’s skillset? Gold-rated tester on paid projects and Forums Moderator Milos Dedijer seems to think so, as a discussion topic that recently cropped up in the uTest Forums shows:

Some time ago, I had an argument with my team lead about my use of safety language. I tend to use it in any argument, and I believe that it’s a good practice. I don’t use it in my factual reports, but I do use it frequently in my descriptive reports. For example, if I say that a set of steps has been executed I don’t use safety language to report results, but if I say that a certain feature has been tested I use safety language almost all of the time. Using safety language to preserve uncertainty appears to be one of the skills a tester must have.

Continue Reading

Applause Announces the Ovation Awards: Vote for Your Favorite Apps

As testers working with hundreds of apps each year, you probably already have a good idea which ones stand out in the pack. Now’s your chance to let that be known to the world.

360-degree App Quality company Applause is excited to announce The Ovation Awards, the only app awards that measure what brands and developers truly seek: the love and loyalty of users and experts.

We’re looking to you not only as testers, but as app users, to vote for your favorite apps from a list of 200 finalists across 10 categories (and across both iOS and Android). We have a panel of expert judges who will be poring over your selections and making their decisions. Here’s the timeline of the awards:

  • Public voting (Nov 12 – Dec 14): Vote for your favorite apps. Vote for just one, or vote for 20 (one per category per OS) from our pool of 200 finalists. This is a big part of what our panel of expert judges will consider.
  • Judging: Our panel includes accomplished mobile engineers, journalists, CEOs and others who understand apps inside and out. Oh, and that means testers, too. You may recognize long-time uTesters Lena Houser, Allyson Burk and Michael Larsen, who are also on our esteemed panel! The judges will look at YOUR votes, as well as the analytics used by our in-house team of data scientists to help decide the 200 finalists, in order to choose the winners across 10 categories and the overall grand prize winner for each operating system.
  • Winners (announced January 14, 2015): The winner for each category and OS will be announced, as will the grand-prize, overall winners for iOS and Android.

Let your voice ring loud and clear. Be sure to vote today for your favorite apps in the Ovation Awards!

Authors in Testing Q&A With Mobile Tester Daniel Knott

Daniel Knott has been in software development and testing since 2008, working for companies including IBM, Accenture, XING and AOE. He is currently a Software Test Manager at AOE GmbH in Germany, where he is responsible for test management and automation in mobile and Web projects. He is also a frequent speaker at various Agile conferences and now a published author. You can find him over at his blog or on Twitter @dnlkntt.

In this uTest interview, Daniel explains the biggest mobile testing pain points that come up in his user groups, and gives us a preview of his recently released book, Hands-On Mobile App Testing. At the conclusion of the interview, you’ll also receive a link to an exclusive discount for the purchase of the book.

uTest: You’ve been involved in software testing in general, but what specifically drew you into mobile testing?

Daniel Knott: Back in 2011 when I was working at XING AG in Hamburg as a software tester for web applications, I had the chance to switch to the XING mobile team to establish the QA processes. Working on this team was a great experience. I had the chance to build up a test automation framework for Android and iOS from scratch and establish a mobile testing process. I was also free to try several things out to find the right tools and workflow for my company and the development environment. This time and experience was just awesome and convinced me to focus on the mobile world.

Continue Reading