Does “Quality” Come From Testing?

Okay, call this a bait and switch if you will, but the bottom line is you cannot test quality into an application. So if you can’t test quality into an app, do you then build it into an app? Or perhaps the more pertinent question is, ‘who contributes more to app quality – software developers or software testers?’ Playing with dynamite here, I know…

Let’s begin with a simple fact – developers are the ones who “create” software defects in the first place. To be fair, they don’t knowingly create buggy software, but that’s the widely accepted norm – we’re human after all. However, when bugs are discovered after the product launches, testers are typically singled out and blamed. Why?

Part of the reason is the misnomer that QA stands for “quality assurance.” Do QA professionals truly assure the quality of a product, or do they assist in delivering high-quality products (as Jon Bach has suggested)? So if you’re a tester by trade, I sympathize with you. On the one hand, buggy software leads to job security. On the other hand, you are constantly on the hot seat and looking over your shoulder, wondering when and where the next bug will surface. But instead of despairing over these details, testers should rise to the challenge.

Here are a few examples of how testers can lead the quality initiative:

Continue Reading

Testers: Is it Time to Reinvent the Wheel?

Our latest guest post comes from Jim “JR” Harris, Principal Engineer and Owner of Arrowhead Computer Consulting, and one of the most entertaining tester bloggers out there (you’ll see what I mean shortly). You can find more of his writings at qatechtips.blogspot.com. In this post, he addresses why the value created by testers is not always fully recognized in the world of business. Enjoy!

In the October issue of the uTest newsletter, Matt Johnston led off with the title “Are Testers the next Endangered Species” – and I blew my stack!  Now don’t get me wrong; it’s not like I was furious or anything like that, but I will admit that I did bite the heads off of about a dozen or so thick framing nails before I could compose a coherent reply.

And I let him have it – with both guns blazing! – eager to defend the honor and integrity of those of us in the Software QA community.

“Oh, it’s the idiots in Management who don’t recognize the need for quality software!”

“Those idiots in Marketing ALWAYS leave us with too much to do and too short a time-line to do it!”

“If the developers would send us software releases that were at least testable; we wouldn’t be in this bind all the time!”

Now Matt has a sick and twisted sense of humor, not unlike my own.  So instead of getting offended, he offered me the chance to express MY views on his bully pulpit.  “Ok Einstein, you’re so smart?  YOU write the next one!”  No he really didn’t say that, but his invitation was clear:  Put up or shut up.

Continue Reading

Survey Says…Software Testers ROCK

I recently came across this article, Personality Traits in Software Engineering, a research survey assessing the major personality traits of software testers and developers. Turns out — and I’m not at all surprised, having met so many testers in our community — software testers rock! Here’s how the scores break down:

Tester Scores
Neuroticism: Low
Extraversion: Medium
Conscientiousness: Medium
Openness To Experience: High
Cognitive Capability: High
Agreeableness: High

According to Anne-Marie Charrett in her blog, Maverick Tester, “On average we [testers] are an agreeable bunch of people, open to experience (see below) with a high cognitive capability. A hearty clap on the back fellow testers, we all knew we were pretty special.”

I couldn’t agree more! So, yes, this is simply a feel-good blog for all those testers out there with a case of the Mondays. Give yourselves a hand. And Happy Monday!

Apple, iPhone 4 Bugs and Why Companies Need to Stop Ignoring Testers

Everyone wants to know what Apple’s going to say at their big press conference in a couple of hours. Will the iPhone 4 bugs prompt them to issue a recall? Will they send users a plastic case that supposedly solves the reception problems? Will they try to fix the defects with a software patch?  Will they say they’re sorry and that this will never happen again? Will they tell NY Senator Chuck Schumer to suck an egg?

We’ll have to wait and find out. But here’s one thing they’re NOT likely to say (but they should): “We should have listened to our testers!”

[Update: See this TechCrunch story for a roundup of the press conference]

One of the biggest pet peeves among testers and engineers (or anyone involved in quality assurance of technology) is not being taken seriously when a serious issue is uncovered. For most companies, it’s generally a cross-site scripting vulnerability, an SQL injection or a browser compatibility flaw in the UI.  For the iPhone 4, it was an antenna issue. As it turns out, many top executives – including Steve Jobs himself – were repeatedly warned about the “death grip” well in advance of the product’s release. These warnings from respected internal sources were either ignored or not taken seriously. They should have listened to their testers.
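To make one of those bug classes concrete, here is a minimal, hypothetical sketch (in Python, using the standard sqlite3 module; the table and queries are invented purely for illustration) of the classic SQL injection flaw a tester would flag:

```python
import sqlite3

# Hypothetical example of the kind of serious issue testers report:
# building SQL by pasting user input into the query string.
def find_user_unsafe(conn, name):
    # Vulnerable: the input is concatenated straight into the SQL.
    query = "SELECT id FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(conn, name):
    # Parameterized query: the driver handles escaping.
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "' OR '1'='1"                      # a classic injection probe
print(len(find_user_unsafe(conn, payload)))  # prints 2: every row leaks
print(len(find_user_safe(conn, payload)))    # prints 0: no user has that literal name
```

A tester who reports that the first query returns every user for that input has found exactly the kind of flaw described above; whether anyone acts on the report is the part testers don’t control.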

But what should testers do when they find themselves in this situation? According to Bill Ricardi, they should report the bug and move on. A member of the uTest community, Bill gave his advice on this matter as part of our Guest Blogger series, writing:

You won’t always see eye to eye with the client. What you consider a critical bug, they might see as a non-issue (or worse, a ‘feature’). What you call a major security flaw, they might consider such a remote possibility that it doesn’t even deserve a mention.

You might ask how you bridge such a gap between your level of testing and the client’s level of acceptance and understanding of product integrity and the testing process in general. The answer is simple:

You don’t.

Continue Reading

News Flash: Developers Love Testing…Almost As Much As Doing Taxes

Is there anything worse than doing your taxes? Apparently, for most software developers the answer is ‘yes’… testing software. And this issue is costing companies…often to the tune of hundreds of thousands of dollars for some of the more severe bugs.

Most tech execs and CTO types will tell you that having your developers test their own code — and not investing in proper testing resources — is a recipe for disaster. And yet inexplicably, some companies still go down this path.

We’ve all heard the numerous arguments before:  developers are too valuable to spend their time testing (eg: an engineer in Boston comes fully loaded at $120k); developers make lousy testers; these two separate functions should each be left to the experts.

Well Chris Matyszczyk from CNET (@ChrisMatyszczyk) says that, in addition to those arguments, there’s another reason to invest in proper testing:  most developers would rather do their taxes than test code.  And he’s got the stats to back it up:

Developers seem to be increasingly bugged by the agony of ill-tested software. All but 11 percent of the respondents cited either design defects, problems in test execution, or simply insufficient time spent on testing on all platforms and targets. And 58 percent named the latter two as the greatest evils.

More than half declared that the last significant software bug cost their companies an average of $250,000. So now, even I, a regressive in so many ways, see just how painful developers’ lives really are.

However, this research doesn’t seem to account for all the depths and nuances of pain. It gave respondents the option of choosing only the dentist, the fender-bender, or the taxes when expressing their dissatisfaction on, say, discovering that management won’t be investing in proper software testing or that sorting out the bugs is down to the developer.

I’d love to hear from the testers and the developers out there.  Should developers be responsible for their own QA?  Should they own a small part of testing (eg: unit testing only)?  Or should development and testing be separated and left to the specialists?  What say you, oh wise ones?

Version 3.0 – A Better, Faster, More Powerful uTest

Check out uTest 3.0

[Screenshots: the new sign-in page, customer welcome screen, and test cycle list]

After months of hard work, we’re excited to announce version 3.0 of our testing platform.  This is much more than a simple refresh or a minor upgrade — it’s a full-blown rewrite, from the UI design and cool new features back through the code to a new, open architecture.  And it’s all based upon feedback from customers and testers, and everything we’ve learned from 1,000+ test cycles in the past 18 months.

Our goal was to produce a faster, more usable, more powerful experience for uTest customers and testers alike.

So what’s new?

  • User Interface: Our new UI was designed to provide a simpler, richer, more interactive experience with more intuitive navigation from bug-to-bug or test-cycle-to-test-cycle
  • Improved Infrastructure: Our new architecture enables better scalability and, more importantly to you, is designed to provide improved performance (with faster page load times) around the world
  • API and Integration: Our new and open APIs will enable us to rapidly expand our offering and better integrate with our customers, partners and third-party app developers

What’s in it for you?  If you’re a customer, keep reading.  To check out what’s new for testers, you can skip down to the tester features.

Customer Features
Once our customers get past our sleek new UI, we hope they notice the great new features and changes we’ve included.  Two features are really important.

First, we’ve built a new test cycle wizard.  Creating a clear, concise test cycle is the most important thing a customer can do to ensure success, but there are a lot of details required to make a test cycle perfect.  Our new wizard makes this process easy, making sure each part of the test cycle is well-documented.

Second, we’ve greatly enhanced our tester rating system.  Our platform will evaluate the past performance of each tester based upon activity levels (# of test cycles participated in, # of bugs reported, # of test cases completed) and the quality of that activity (bug approval %) and, once we have enough data points, we’ll assign them a rating.  Testers who are in the top 20% of all rated testers will receive a gold, silver, or bronze rating.

Some of the other new features we’ve added in v3.0:

[Screenshots: the sign-in page and the advanced coverage report]

  • Resizable Columns – On any table or list, all columns can be re-sized and re-arranged by customers using a simple drag-and-drop
  • Tester Messenger – Enhanced tester messaging tools help you communicate with testers to get what you need from each bug report
  • Testing Coverage Reports – Be confident that you’ve covered every corner of your testing matrix with our new coverage reports. You’ll know what’s been tested and what still needs to be tested in a single informative report
  • Easier Filtering – All test cycles and bugs can be easily filtered to find exactly the information you need
  • Smart CSV Exporting – Export a CSV file with just the information you need about your bugs or your test cycles
  • Multiple Users – Add multiple members of your testing organization easily with our new account management tools
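The coverage-report idea boils down to simple set arithmetic over the testing matrix: the full cross of environments, minus the combinations already exercised. A hypothetical sketch (the browsers, operating systems, and tested pairs below are invented for illustration, not uTest data):

```python
from itertools import product

# Hypothetical testing matrix: every browser/OS combination in scope.
browsers = ["IE6", "IE8", "Firefox", "Chrome"]
systems = ["Windows XP", "Windows 7", "OS X"]

# Combinations already covered by completed test cycles (invented data).
tested = {("IE6", "Windows XP"), ("Firefox", "OS X"), ("Chrome", "Windows 7")}

matrix = set(product(browsers, systems))   # all 12 combinations
untested = sorted(matrix - tested)         # what still needs testing

print("%d/%d combinations covered" % (len(tested), len(matrix)))
for browser, system in untested:
    print("still needs testing: %s on %s" % (browser, system))
```

The report a customer sees is just a nicer rendering of this difference: covered pairs in one column, gaps in the other.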

Tester Features
Besides the new UI, we’ve added a lot of cool new features for our uTester community.  However, two features are really huge for testers.  The first is our new test cycle wizard, which helps testers review test cycles more easily and completely; now when a tester checks out a test cycle, they can tell uTest whether or not they intend to participate.

Our second big feature is a new & improved tester ratings system.  In the past, testers were rated with a star system based upon subjective customer feedback.  Starting today, active testers will earn a rating and a “Rated” badge, showing that they are participating in test cycles and contributing their expertise to the uTest marketplace.  Testers in the top 20% of our rated testers will receive a gold, silver, or bronze badge instead.  Testers who are not active, or for whom there is not enough data, will not receive a badge, and new testers will receive a “New” badge.

A tester’s rating will depend on many factors including total activity and recent activity (# of cycles participated in, # of bugs reported, # of test cases completed), as well as the quality of that participation (bug approval %, test case approval %, and accuracy in specifying bug types and severity).
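The exact formula isn’t published, but the factors above suggest a quality-weighted activity score. Here is a hypothetical sketch; the weights, the sample testers, and the single “Gold” top tier are my own illustrative assumptions, not uTest’s actual system:

```python
# Hypothetical rating: activity counts weighted together, scaled by
# quality (bug approval %). Weights are invented for illustration.
def rating(cycles, bugs, test_cases, bug_approval_pct):
    activity = cycles * 5 + bugs * 2 + test_cases * 1
    quality = bug_approval_pct / 100.0
    return activity * quality

# Invented testers with invented activity histories.
testers = {
    "alice": rating(cycles=12, bugs=40, test_cases=30, bug_approval_pct=90),
    "bob": rating(cycles=3, bugs=10, test_cases=5, bug_approval_pct=60),
    "carol": rating(cycles=8, bugs=25, test_cases=20, bug_approval_pct=85),
}

# Top 20% of rated testers get a medal badge; the rest are simply "Rated".
ranked = sorted(testers, key=testers.get, reverse=True)
cutoff = max(1, round(len(ranked) * 0.20))
badges = {name: ("Gold" if i < cutoff else "Rated")
          for i, name in enumerate(ranked)}
```

The key design point survives the simplification: quantity alone can’t earn a medal, because a low approval percentage scales the whole score down.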

Other great new features include:

[Screenshots: the sign-in page and the Tester Messenger]

  • Resizable Columns – For any table or list, all columns can be re-sized and re-arranged using a simple drag-and-drop.  For example, a tester can move the test cycle name to be in the first column with a width that displays the entire test cycle name
  • Tester Messenger – Enhanced communication tools between testers, uTest project management and customers
  • Easier Filtering – All test cycles and bugs are easily filtered.  For example, testers may filter bugs by “My Bugs Only,” status, severity, and by test cycle; even for closed test cycles
  • Clearer Test Cycles – All test cycles are formatted in a more digestible way to help testers more easily and completely review each test cycle
  • Improved Attachments – The bug attachment size limit has been raised from 5MB to 10MB (with a limit of three attachments per bug), and more file types are permitted for upload
  • Smarter Emails – Cleaner, more informative notification emails from the uTest platform

Wrap-up
Our engineers and product managers have worked countless hours putting together version 3.0, but we also want to thank our customers and testers for their countless ideas and insights.  As we continue to improve and refine our platform, we are always open to thoughts and ideas about how to make our product better.

Have a great idea for our future product releases?  Testers should join our testing forums and check out our Platform Feedback section. Customers can contact their project manager or drop us a line.

IE6 — The Zombie Browser That Can’t Be Killed

Developers have long awaited the death of Internet Explorer 6; web heavyweights like Google, Facebook, Reddit, Justin.tv and Digg have all announced expiration dates for their support of IE6; Microsoft has been steering users away from IE6 for more than a year.  And last week, a funeral was held for the outdated browser, which was two parts wake and one part wish.  Even Microsoft joined in the fun, sending a card to the festivities.

So what will it take to kill the undead browser once and for all?  Well, it’s worth noting — and shocking — that IE6 still drives nearly 20 percent of all web access from beyond the grave.

How is this possible?  What outdated Luddite segment of web users is still stuck in 2001?  Well, the prime culprits are large enterprises like Intel, which bemoan the cost and complexity of upgrading thousands of employees and legacy apps that were built specifically for IE6.  So while the web citizenry has moved on and is ready to pull the plug, for developers (and testers) IE6 will continue to be part of the web app testing matrix for much longer than any of us would like to believe.

Just to further illustrate the insanity of IE6’s continued survival, here are a few other things that were going on in 2001:

Continue Reading

Users Use; And Testers Test

VentureBeat has an interesting article about eBay’s announcement that they’re going to tap into their user base to test new features — a kind of opt-in, ongoing beta program for new features.  The title for this article:

eBay to Use Crowdsourcing to Test New Features, Starting with Streamlined Search

Those who know me well know that defending the purity of the term “crowdsourcing” against misuse is a pet cause of mine (e.g. meet-ups are not crowdsourcing; online polls are not crowdsourcing; asking your Twitter followers a question is not crowdsourcing). But don’t worry… this won’t be another rant about the importance of definitions and how critical labels are.  Well, at least not about the word “crowdsourcing”.

Continue Reading

Another Community Milestone: 160 Countries!


Just noticed something new and cool when I hit our home page tonight — the uTest community is now operating in 160 countries around the globe.

What’s that mean?  How many countries are there?  Well, depending upon who you ask (the United Nations, the US State Department, the World Almanac, etc.), there are between 189 and 195 countries on planet Earth.

So recruiting professional testers from 160 different countries, and getting them to profile their testing experience, demographic information, hardware and software, is no small feat.  Anyway, we just wanted to point out that the world’s largest marketplace for software testing services just got a little bigger!

Thanks to our testers from every corner of the globe for making our community so vibrant and diverse.

Giving Thanks

Here in the U.S., the end of November is marked by Thanksgiving.  This is a time of family, friends, feasts and football (not futbol).  So while much of the uTest crew is taking a well-deserved four-day weekend, I wanted to express our sincere gratitude to the entire uTest universe — customers, testers, investors, partners and media.

Without you and your passion for contributing to a bug-free world, uTest would not be where it is today.  Ultimately, a business such as ours (discovering and eliminating defects, pay-for-performance, collaborative, reputation-driven) would not exist if it weren’t for your collective desire to make things better than they were yesterday.

And for that, I am truly thankful.  What are you thankful for this season?