Testing Roundtable: What Do You Like Most About Testing?

Let’s face it: Testing isn’t always fun. There are missed deadlines, missed bugs, stubborn developers, office politics and – well – you get the idea. Despite these pains, however, most people in testing truly love the work they do. But what do they like most about testing? To find out (and to brighten your day), I decided to make that the topic of this quarter’s Testing Roundtable discussion. Check out some great answers below from Jerry Weinberg, Scott Barber, Matt Heusser, Michael Cooper, Pradeep Soundararajan, Steve Vance and Peter Shih. Enjoy!

*********************

Gerald Weinberg, Author and Consultant

I like my software to do what I bought it for, and not do other things. Without testing, that won’t happen.

If you’re asking what I like most about *doing* testing, well, I like the software I help produce to be good at doing what people buy it for, but I think that’s not the answer you’re looking for. (You should be, though, because that feeling of pride in one’s work is essential to a successful profession.)

As for the actual work of testing, I like the intellectual challenge most. While testing, I feel like, say, Sherlock Holmes – and nobody has to be murdered (usually).

*********************

Scott Barber, CTO at PerfTestPlus

I like the diversity of it. Take a recent week, for example. I was helping one client devise a performance testing strategy for a database growing at a rate of 1TB per month, supporting an application that lets M.D.s, medical test labs, and pharmacies share relevant patient information, prescriptions, lab results and more, essentially in real time. I was working with a small team to figure out how to performance test a web-based voting application for a national (not North American) election that reasonably expects to need to securely and reliably process over 1 million votes per hour. I paired with a complete stranger to test a desktop application, using screen and voice capture tools to document our testing and report defects. And I was testing a “teach programming to kids” application with my son.

But what I *really* like is the virtual impossibility of it all. Complete testing is not practically possible, and balancing that reality against time, budget, technology, market, and human factors – with a host of unknowns that feels bigger than the knowns – is the most fascinatingly challenging puzzle I’ve ever actively tried to solve. It’s a puzzle that always keeps me on my toes and always keeps me actively studying new things: from new technologies, to human psychology, to organizational management, to whatever industry my current client is in. For a person who loves to learn, loves to make a difference, is motivated by seemingly impossible challenges, gets bored easily, yet doesn’t want to be looking for a new career every 3 months, I simply can’t think of a field that is a better fit.

*********************

Matt Heusser, Writer and Consultant

Two decades ago I was a military cadet in the Civil Air Patrol, and I vividly remember a poem over our commander’s desk:

“We the willing, led by the unknowing, have been doing the impossible, for the ungrateful.  We have been doing so much for so long for so little that we are now qualified to do anything for nothing.”

While the spirit of that poem was a little passive-aggressive, I have to say, I was inspired by the content, this idea of doing the impossible under tough constraints.

In some ways, I see this in software testing. From an infinite set of possible tests, we need to derive the most powerful ones. We need to figure out what to test right now, what to do quickly, and what to automate. We need to figure out what the results of those tests tell us, and give answers that stand up to scrutiny.

I call this the “Great Game of Testing,” and I don’t think it’s too much of a stretch to say that I am in software testing “For Love Of The Game.”

*********************

Continue Reading

Bug-Hunting Techniques From Matt Heusser

When it comes to finding defects in web applications, timing is everything. A bug found in production, for instance, will generally complicate matters more than a bug found in pre-production. This is also true on uTest projects, where testing is sometimes compressed into a shorter-than-ideal time frame.

So to help you find the right bugs at the right time, we suggest checking out the latest article from our old friend Matt Heusser. In Seven Ways to Find Software Defects Before They Hit Production, Matt shares some valuable tips on ways to reinvigorate your testing. Here are three techniques that I think uTesters will find especially useful. Enjoy!

Technique 1: Quick Attacks

If you have little or no prior knowledge of a system, you don’t know its requirements, so formal techniques to transform the requirements into tests won’t help. Instead, you might attack the system, looking to send it into a state of panic by filling in the wrong thing.

If a field is required, leave it blank. If the user interface implies a workflow, try to take a different route. If the input field is clearly supposed to be a number, try typing a word, or try typing a number too large for the system to handle. If you must use numbers, figure out whether the system expects a whole number (an integer), and use a decimal-point number instead. If you must use words, try using the CharMap application in Windows (Start > Run > charmap) and select some special characters that are outside the standard characters accessed directly by keys on your keyboard.
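
To make that concrete, here’s a minimal sketch (ours, not from Matt’s article) of how those quick attacks might be bundled into a parametrized test. The validate_quantity() function is a hypothetical stand-in for whatever field handler you’re attacking, not a real API:

```python
import pytest


def validate_quantity(raw: str) -> int:
    """Hypothetical stand-in for the server-side check on a numeric field."""
    value = int(raw)  # blanks, words, and decimals all raise ValueError here
    if value <= 0 or value > 10_000:
        raise ValueError("quantity out of range")
    return value


# Each case is one "wrong thing" to fill in, straight from the attack list above.
@pytest.mark.parametrize("hostile_input", [
    "",                       # required field left blank
    "ten",                    # a word where a number is expected
    "3.5",                    # a decimal where a whole number is expected
    "99999999999999999999",   # a number too large for the system to handle
    "£€±",                    # special characters from CharMap, not the keyboard
])
def test_quick_attack_is_rejected_cleanly(hostile_input):
    # A clean rejection (ValueError) is acceptable; a crash or a silent
    # acceptance is exactly the kind of panic the attack is hunting for.
    with pytest.raises(ValueError):
        validate_quantity(hostile_input)
```

Run it with pytest; any input that slips through or blows up the stand-in in an unexpected way is your bug report waiting to happen.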

Technique 3: Common Failure Modes

Remember when the Web was young and you tried to order a book or something from a website, but nothing seemed to happen? You clicked the order button again. If you were lucky, two books showed up in your shopping cart; if you were unlucky, they showed up on your doorstep. That failure mode was a kind of common problem—one that happened a lot, and we learned to test for it. Eventually, programmers got wise and improved their code, and this attack became less effective, but it points to something: Platforms often have the same bug coming up again and again.
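
If you’d like to turn that classic double-click failure mode into a repeatable check, here’s a rough sketch under stated assumptions: the ShoppingCart class and its client-token de-duplication are hypothetical stand-ins we made up for illustration, not any real storefront’s code.

```python
import uuid


class ShoppingCart:
    """Hypothetical cart backend that de-duplicates submissions by client token."""

    def __init__(self):
        self.orders = {}

    def submit(self, client_token: str, items: list) -> str:
        # A well-behaved backend records the order only once per token,
        # no matter how many times the order button gets clicked.
        if client_token not in self.orders:
            self.orders[client_token] = list(items)
        return client_token


def test_double_submit_creates_one_order():
    cart = ShoppingCart()
    token = str(uuid.uuid4())      # generated once when the order page renders
    cart.submit(token, ["book"])
    cart.submit(token, ["book"])   # the impatient second click
    assert len(cart.orders) == 1   # only one book shows up on the doorstep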

Continue Reading

#STPCon Interviews – Matt Heusser

Next up in our STPCon 2011 video series is Matt Heusser – one of the most popular figures in the testing universe. Matt is currently the Editor of STP Collaborative, in addition to being an active member of the board for the Association for Software Testing.

In this short interview, he explains the ideas behind what he calls “complete testing.” Good stuff. Take a look:

Want to see more interviews from STPCon? Check out the full list here.

The Book On Software Testing

Ok, so it’s one of many books on the topic of testing. Still, how many crowdsourcing companies can honestly say that their community includes a published author (not to mention a top journalist in their space and former Testing The Limits guest)? Well, that’s the case here at uTest with Matt Heusser (@mheusser).

Anyway, this is a worthwhile read for two reasons:

  1. Heusser knows of which he speaks, as he’s not just a pundit. This guy has lived it, running QA organizations within large and small companies.
  2. Whether they’ll admit it or not, the TCO (total cost of ownership) of testing is a major concern for tech execs in all industries. So any book that tackles that issue head-on is both ambitious and timely.

For those who purchase How to Reduce the Cost of Software Testing, drop us a note and let us know what you think. We’ll publish your comments in a follow-up post.

Also, we’ll see if we can wrangle an interview with the brains behind the book (Heusser and Govind Kulkarni) in the next week or so. Want us to grill them on anything in particular? Drop us a comment and we’ll put ‘em on the spot.

Mobile App Testing: This Is Different!

We’ve spent the last year (that’s 253 posts on mobileapptesting.com) explaining to the world that mobile is an entirely different animal than its web and desktop cousins. Whether the differences are in OS, browser, screen size or GUI – you name it, we’ve covered it. Extensively.

Yet this concept is…well, just a concept…until it’s experienced first-hand. Matt Heusser, one of the very best testing writers out there, recently shared some details on his (first?) foray into mobile app testing with SearchSoftwareQuality. His account covers screen-size discrepancy, the expanding device matrix, GUI issues and other testing topics we all know and love.

I was particularly drawn to his “ah-ha” moment in the second paragraph (emphasis added). Take a look:

So there I was, on my iPod Touch, trying to get to a list of users whose name started with the letter “I.” It worked great on the simulator with a mouse, but with the actual iPod, my finger was too fat to click the single line of pixels.

Suddenly it hit me: This is different. Sure, all of the old GUI rules apply, but suddenly we have a new set of ways the application can fail. This tip provides a quick set of guidelines to consider, primarily for Web-based mobile applications, but much of it applies to native applications as well.

Screen real estate

You might use a mobile device just like a regular 1024×768 pixel application, but your users probably won’t. Try to actually use the application on a number of devices — just use it. You’ll likely come away suggesting a mobile interface, perhaps an automatic re-direct on login when your application senses a mobile device. Even then, you’ll want to explore the application in a number of devices, looking for usability problems.
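
As a side note from us (this isn’t part of Matt’s tip), the “automatic re-direct on login when your application senses a mobile device” idea can be sketched with a simple User-Agent check. The MOBILE_HINTS list and login_redirect_target() below are invented for illustration; a real application would more likely lean on a proper device-detection library.

```python
# Keywords that hint the request came from a mobile browser (illustrative only).
MOBILE_HINTS = ("iphone", "ipod", "android", "blackberry", "windows phone")


def login_redirect_target(user_agent: str) -> str:
    """Return the post-login URL, steering mobile browsers to a mobile interface."""
    ua = user_agent.lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return "/m/home"   # trimmed-down mobile interface
    return "/home"         # full desktop-style interface


# Usage: an iPod Touch browser gets sent to the mobile interface.
assert login_redirect_target(
    "Mozilla/5.0 (iPod; U; CPU iPhone OS 4_3 like Mac OS X)"
) == "/m/home"
```

The testing angle remains the same either way: exercise the redirect on as many real devices as you can, because the simulator-with-a-mouse experience will not surface the fat-finger problems Matt describes.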

Continue Reading

Testing the Limits with Lanette Creamer – Part II

In part II of our Testing the Limits interview with Lanette Creamer – aka “Testy Redhead” – we cover the need for Exploratory Testing; Matt Heusser and the “rebel alliance” of testing; how to create a popular testing blog; her stance on tester certifications and more from the wide world of QA. Catch up by reading part I, and when you’re done with this one, go check out part III.

uTest: In one of your recent blog posts, you mention Elisabeth Hendrickson’s STAREAST declaration that “Exploratory Testing Is Not Optional.” Why did this statement resonate so strongly for you? Do you think all testing managers should follow Hendrickson’s lead?

LC: As a frequent conference attendee and enthusiastic reader of testing blogs, I’ve seen many ideas about how to improve testing. I’ve been through countless industry trends, such as borrowing manufacturing ideas, extensive measuring schemes, and repeated attempts to automate all testing. The bottom line is that exploratory testing works in practice. Not for a few months or years – it works to find important bugs year after year, no matter what other quality trends are happening. It works well side by side with automated checks and manufacturing ideas, and it can be used with session-based test management to provide measures and metrics if needed. It is the one constant a tester can go back to and find bugs that impact the user experience. It is the meat in my testing sandwich. (My pun-filled humor is the cheese.) To hear Elisabeth acknowledge the importance of exploratory testing in public shows me that agile testing is about more than just automation. Agile testing can be about a balanced approach to overall quality. It resonated with me strongly because it makes me hopeful for the future of testing on agile teams.

I think test managers should evangelize and defend what works well in practice on their teams. My experience has been that exploratory testing is generally undervalued considering how effective and practical it is.

uTest: What’s the deal with this “rebel alliance” thing we’ve been hearing so much about? It sounds subversive – we want in! Seriously though, what’s the mission of this group? Please explain it to our un-initiated readers.

Continue Reading

Top 20 Software Testing Tweeps

According to Twitter co-founder Biz Stone, Twitter now has 105,779,710 registered users—and is adding 300,000 new users a day. Attempting to weed through all of the fluff can be daunting! So, if you’re interested in jumping into the Twittersphere or are just looking to follow the leading journalists and thinkers in software testing today, check out our “Top 20 Software Testing Tweeps” list below (in no particular order)!

  1. James Bach — @jamesmarcusbach
  2. Michael Bolton — @michaelbolton
  3. Testing At The Edge Of Chaos (Matt Heusser) — @mheusser
  4. Tester Tested! (Pradeep Soundararajan) — @testertested
  5. StickyMinds.com (Better Software Mag) — @StickyMinds
  6. SearchSoftwareQuality.com (Yvette Francino) — @yvettef or @SoftwareTestTT
  7. Google Testing Blog (Copeland/Whittaker) — @copelandpatrick or @googletesting
  8. Testy Redhead (Lanette Creamer) — @lanettecream
  9. Test Obsessed (Elisabeth Hendrickson) — @testobsessed
  10. SD Times — @sdtimes
  11. Jon Bach — @jbtestpilot
  12. Software Test & Performance Mag — @STPCollab
  13. Software Testing Club (Rosie Sherry) — @rosiesherry or @testingclub
  14. Lisa Crispin — @lisacrispin
  15. Fred Beringer — @fredberinger
  16. uTest (shameless plug! ;-)) — @uTest
  17. Weekend Testing (Santhosh/Parimala/Ajay) — @weekendtesting
  18. Santhosh Tuppad — @santhoshst
  19. Ajay Balamurugadas — @ajay184f
  20. Parimala Shankariah — @curioustester

Update! Thanks for everyone’s recommendations. Here are a few we missed: @sbarber, @QualityFrog, @dailytestingtip, @sdelesie, @Rob_Lambert, @chris_mcmahon, @hexawise, @marlenac, @shrinik, @sbharath1012, @sellib, @TestingNews.

Please feel free to add any active Tweeps you think we may have missed in the comments! We welcome your recommendations.

Weekend Update (for software testers)

Although software testing doesn’t take weekends off, our blogging team does (most of the time). So, in an effort to tide you over until Monday morning, here are a few testing-related stories – each well deserving of a weekend read. Enjoy!

Tester Professionalism
From “Uncle Bob’s” post on Sapient Testing: “It seems to me that James (Bach) is attempting to define “professionalism” as it applies to testing. A professional tester does not blindly follow a test plan. A professional tester does not simply write test plans that reflect the stated requirements. Rather a professional tester takes responsibility for interpreting the requirements with intelligence. He tests, not only the system, but also (and more importantly) the assumptions of the programmers, and specifiers.

I like this view. I like it a lot. I like the fact that testers are seeking professionalism in the same way that developers are. I like the fact that testing is becoming a craft, and that people like James are passionate about that craft. There may yet be hope for our industry!”

Podcast: Matt Heusser Explains the “Rebel Alliance”
From SearchSoftwareQuality.com: “No one wants to eat a bagel alone.” This is the underlying principle behind the formation of “The Rebel Alliance,” a team of STAREAST-bound testers and developers who will attend the conference as a group. “The intention is to make everyone comfortable, introduce ourselves to the minds that we respect in software and expand our networks,” says Heusser. This kind of collaborative effort also carries over into the session Heusser will be presenting at STAREAST, which covers the creation and service of SocialText, where Heusser is employed as a tester.

Continue Reading

uTest Blog Abuzz With Hive Award Win @ SXSW

Last week, we found out that our humble little Software Testing Blog won the Hive Award at SXSW as the top business software blog (here’s the slideshow and the PDF report). We’re honored to make this prestigious list, along with brands we love such as HowStuffWorks, Nokia, Nike, HBO and About.com.

Part of the reason this blog has been so successful in the past year is how infrequently we talk about ourselves (ugh, boring). Well, I’m allowing myself to break that rule briefly so I can thank the people who have made our blog what it is today.

  • Our in-house team (Stanton, Mike, Jenny and Peter) for their tireless efforts and talented writing about everything from mobile apps to social media to software testing to crowdsourcing trends.
  • Our guest bloggers from the uTest community who have written passionately about everything from mobile testing to QA in agile environments to the evolving roles of testers.
  • Our Testing The Limits guests (including James Whittaker, Matt Heusser, James Bach, Michael Bolton and Jon Bach) who have not only tolerated our wide range of questions — from the insightful to the inane — but joined in with good humor, wit, eloquence and intellect.

I’ll end this little Oscar speech before the orchestra starts playing me off stage. Suffice it to say, we love writing for you; we’ll keep scouring every corner of the world (virtual and physical) for fresh topics and angles about anything related to software; and we’ll keep reminding ourselves why we’ve had this success: we write stuff that you seem to enjoy reading. We now return you to your regularly scheduled programming.

Testing the Limits With Jon Bach – Part I

After Twitter-stalking him, making some harassing phone calls and sending threatening letters, Jon Bach (@jbtestpilot) cheerily agreed to take part in our Testing the Limits series. Much like his brother, Jon has a remarkable understanding of software testing – both in theory and in practice. Having worked for companies like Quardev, LexisNexis, HP and Microsoft, Jon is also a blogger, author and software testing consultant. An expert, in the truest sense of the term.

In the first installment of our two-part interview, we get Jon’s thoughts on sibling rivalry; the blame spiral of software development; the emergence of “agile-fall”;  testing at a startup vs. testing in the enterprise; John Schneider as Jon Bach and more.

uTest: A few months back, we asked your buddy Andy Muns who’d win a fight between you and your brother (this was a big debate in the uTest office). He said you would win, hands down. Would he be right? And since you and your brother seem to share the same testing philosophy, what do you think the fight would be about?

JB: It’s hard to fight with someone who stayed in their room for most of our childhood.  He was either reading or doing science experiments with a microscope or the chemistry set.  It got worse when we got the TRS-80 in 1980.  In fact, that’s probably the last time we fought — over who got computer time next.  My memory may be fuzzy, but just when it came to blows, he programmed a user name and password dialog? Something clever like that. Now it’s better just to learn from him and do my best to keep up — but that’s true for all younger brothers, I think.

As for modern-day fighting, sponsor me for a testing certification and let’s see what he’d do.

uTest: Say you’re named grand poobah of the QA universe… what’s your first decree?

JB: Effective today, “Quality Assurance” is now “Quality Assistance”.

(Try it.  Watch what happens when you start using it.)

Continue Reading