Tag Archives | software testing

Testing the Limits with Michael Bolton – Part III

In the third and final part of the Michael Bolton trilogy, we cover advice for new testers, his hypothetical banishment from Software Land, the blogs he reads and more. Did you miss our earlier interviews? Here’s Part I and Part II.

uTest: Hypothetical: You’ve been banished from testing – nay, ALL software-related activities – for the rest of your days. What will you do to earn a living? What hobbies would you pick up to fill the intellectual void?

MB: Who knows? For fun, I’d keep playing mandolin, probably. Teach, maybe. Write. I’ve worked in theatre stage management, been a book-keeper, tended bar, worked in a comedy club. In high school I worked in mail rooms during the summer. Whatever I’ve picked up in life, it was because something needed to be done and I was there to do it. If it didn’t seem like much at first, I started to learn about it quickly. When you invest a little bit of effort to figure out your job, you learn how to make it faster and better and more interesting. It turns into this great feedback loop. Any job can be more fun when you set out to master it.

uTest: Tell our testing community something about you that your most avid readers don’t know.

MB: While walking through the woods on an island near Vancouver recently, I found myself being quiet and brief, which I like from time to time. Practically nobody knows that.

Lots of people probably don’t know how much I’m eager to help people out. All of my work—courses, articles, conference presentations, this interview—comes with lifetime free technical support. Have a question? Just ask. I might not answer right away—supporting the family with paying work takes precedence over supporting the community—but I’ve never knowingly turned anybody down, so if I don’t answer right away, be persistent. James Bach makes the same offer, by the way. We’ve found that it’s a great way not only to help people, but also to explore problems and come up with solutions and learn things that can help our clients.

uTest: If you were talking to a newbie tester, what advice would you give to set their professional journey off on the right foot? How about for a 10-year veteran tester?

Continue Reading →

All Circuits Are Currently Busy — A Look Back 20 Years After AT&T Network Crash

Bug-iversary Alert! Tomorrow is the 20-year anniversary of the “crash” of the AT&T Long Distance Network. On January 15, 1990, faulty software was installed on the AT&T Electronic Switching System (Number 4 ESS): a one-line bug incapacitated the entire system, disabling switches throughout half the network.

In what is known as one of the most serious telecom bugs in history, more than 75 million calls failed to connect over nine hours, at an estimated loss of $60 million.

Dennis Burke of California Polytechnic said it best: “The Jan. 1990 incident showed how bugs in self-healing software can bring down healthy systems, and the difficulty of detecting obscure load- and time-dependent defects in software.”

Speaking of “load defects,” AT&T – after signing up to be the exclusive U.S. provider of iPhone service – has recently come under fire in Businessweek and elsewhere for the quality of its network coverage.

In light of this bug-iversary, I can’t help but wonder whether more testing should have been done before AT&T took on the massive data demands of modern 3G smartphones. What do you think?

Continue Reading

Announcing The 2009 “uTester of the Year” Awards

Today, we announced the results of our 2009 uTester of the Year Awards. Our community is full of professional testers, which made the judging incredibly tough (I can’t believe how much the bar has been raised for testers over the course of 2009). This awards program, however, was designed to recognize those few testers whose testing skills, attention to detail and consistently excellent performance stood out.

The winners were selected by our community management team and project managers, and were based upon testers’ performance across several hundred test cycles for web, desktop and mobile applications.

Brian Rock from Austin, Texas was named the overall uTester of the Year.  Brian joined uTest early in 2009 and brings 10+ years of software engineering experience to our community. Over the course of the year, Brian earned MVT (Most Valuable Tester) awards on multiple test cycles and also wrote a popular uTest guest blog post, “Software Testers: The Eyes of the Battlefield.”  He consistently reports excellent bugs, communicates with customers extremely well, and is very engaged in uTest projects.  Brian had this to say about his experience with uTest:

“Working with uTest challenges me to learn new applications and to solve new testing problems on different products every week,” said Brian Rock. “This keeps things fresh and exciting, and opens my eyes to see systems holistically and keep my defect localization skills sharp. This is what I enjoy most about working with uTest, and I am honored to be among this elite group of testers.”

The complete list of winners is available after the jump:
Continue Reading →

Testing The Limits — 2009’s Top Posts

After we re-launched our brand in May, we decided that the uTest blog needed to be more than just uTest employees talking about uTest events, uTest awards and the uTest community (see how repetitive that gets?).

Writing witty, thought-provoking content is really hard.  And we’re pretty lazy, but fortunately we know some extremely smart & funny people.  So we invented the Testing The Limits series, in which we interview leaders from the worlds of testing, software, entrepreneurship and crowdsourcing.

We’re immensely grateful to these talented, busy people, and we have much more planned for the Testing The Limits series in 2010.  But before we flip the calendar, these posts from this year are worth another look:

June: James Whittaker – Author, Professor and Testing Evangelist at Google

July: Rosie Sherry — Founder of the UK-based Software Testing Club

August: Andrew Muns — President of Software Test & Performance

September: Jack Margo — SVP of Internet Operations of Developer Shed

October: Jon Winsor — Author, Crowdsourcing Expert, and Founder of Victors & Spoils

November: Matt Heusser — Software Testing Author, Professor and Testing Manager

December: James Bach — Software Testing Author, Teacher and Speaker

We have some great guests and ideas lined up for 2010, including software execs, QA thought leaders, and famous journalists & authors.  As always, the goal of Testing The Limits will be to inform, to entertain, and above all else, to help our readers get to know these thought leaders who are worth following and listening to.

Have a suggestion for a future Testing The Limits guest?  Drop us a note or tell us in the comments section.

Continue Reading

Our Guest Blogger Series: 2009 Year in Review

As a way to extract the collective wisdom of the uTest community, we decided to experiment with a Guest Blogger program beginning in April. To say that it’s been a success would be an understatement, but we’ll say it anyway (the page views don’t lie!). Having covered a wide range of topics – including mobile app testing, tester overconfidence, security testing and more – the series has become a big hit within the community, and a great way for testers to get published in front of a large audience.

Here are some of the highlights from our 2009 guest blogger program. Stay tuned for an even bigger 2010!

Who is the User? – by Lucia Maldonado:  In what ways is software similar to architecture? And how can this help steer testers in the right direction? In this post, Lucia Maldonado takes an in-depth look at user accessibility standards, and offers a number of essential tips for testers in this field.

Security Testing Tips (from a Bug Battle Winner) – by Bernard Shai Lelchuck:  When it comes to security testing, few can match the expertise of Bernard Shai Lelchuck – one of uTest’s first (and finest) QA professionals. In this post, Bernard covers the basic methods of security testing, including tips for information gathering, logical attacks and injection attacks. Oh, and here’s Part II.

Respect the Defect: Advice That Will Change the Perception of Testing – by Joseph Ours:  Testers need to reconsider the way they report bugs – this was the position taken by Joseph Ours in his first (and hopefully not last) uTest blog post. Challenging testers to demonstrate their value by writing more clearly about the bugs they uncover (among other tactics), Joseph has sparked an interesting debate among our community. Visit the comments section to see for yourself.

Step Away from the Simulator: Putting Mobile Applications Into a Tester’s Hands – by Brad Sellick:  What makes mobile testing different from conventional software testing? For one, the simulators and emulators are far less reliable. In this post, uTester Brad Sellick – a self-made expert on mobile app testing and development – explains the dangers of relying on these tools while performing mobile app testing.

What You Need to Know About Writing Effective Test Cases – by Valerie Dale:  Despite all evidence to the contrary, test case design is often seen as work with no real value – a remedial task with no significant ROI. One would think that with the added pressures to launch a quality product on schedule, test case design and planning would be a top priority. It’s not. At best, there is minimal attention paid to the practice. At worst, it’s non-existent. In this post, Valerie Dale makes a great defense of this beleaguered practice.

Your Overconfidence is Your Weakness: Lessons from a “Crash Specialist” – by Pradeep Soundararajan:  In our most popular guest post to date, noted blogger Pradeep Soundararajan explains why finding lots and lots of bugs isn’t necessarily a good thing. Reliving his days as a “crash specialist,” Pradeep examines how a tester’s ego can get in the way of their objectivity. His advice is as funny as it is useful. Simply put: a must read.

Software Testers: The “Eyes of the Battlefield” – by Brian Rock:  Our testers come from all sorts of backgrounds, including the armed forces. Brian Rock – a former Sgt. on a Combat Arms Forward Recon Team in the U.S. Army – is a great example. In this post, Brian analogizes testers to cavalry scouts: they are the “eyes of the battlefield.” Advocating exploratory software testing (especially for those in the uTest community), this post will make you rethink the role of testers.

You’re a Professional Mobile Tester (you just don’t know it yet) – by Bernard Shai Lelchuck:  As the title would imply, this post makes the case that anyone with a mobile phone and an inquisitive mind can become a successful mobile tester. It worked for Bernard Shai Lelchuck! Here Bernard explains the rise of mobile applications, how he himself broke into the field and some basic tips for those who would like to get started in this growing (and highly lucrative) field.

Question the Connection: Tips for Diagnosing User Login Failures – by Sherry Chukpa:  Forget the sweeping generalizations about software testing “best practices.” This post by uTester Sherry Chukpa gets right to the point on a very specific issue: user login failures. If you’ve ever been pitted against this problem in the testing lab, Sherry feels your pain, and has some invaluable advice for you as you move forward.

It’s been a great year, with some terrific insights into the world of testing, but our Guest Blogger program is just getting started. So if you have an opinion to express, a tip to share or a bone to pick, we’re always eager to share the thoughts of our tester community. Email us your ideas at marketing@utest.com.

Continue Reading

Exploratory Software Testing: A Follow-Up Q&A with James Whittaker

Last week, uTest hosted a webinar on exploratory software testing with James Whittaker.  We received a fantastic response from the 250+ attendees, and we couldn’t get to all the questions before our time was up.  Luckily, James was kind enough to sift through a stack of the remaining questions and provide answers to several that jumped out at him.

Also, remember that we’re handpicking five webinar attendees to receive a free copy of his new book on exploratory testing, signed by James.

Q: When making a tour specific to your own application domain, doesn’t that become what is usually called a test scenario? How do you see tours being different from scenarios?

A: Great question, and I cover this in my book. Chapter 4 deals with “Tours” and chapter 5 deals with “Scenarios.” In a nutshell, I see scenarios as more prescriptive than tours. Tours are meant as general guidance; scenarios, at least in my mind, are more specific. A tour specifies goals and an approach to coming up with test cases; a scenario actually provides an outline of the test cases. Tours leave much more of the actual test case to be constructed as you test. A scenario, in other words, has less variation.

But don’t get caught up in semantics. It’s a continuum of detail really. At one end of the spectrum are fully detailed test cases, at the other is ad hoc testing. Scripts, scenarios, tours, patterns … they all fall somewhere in between.
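
To make that continuum concrete, here’s a minimal Python sketch (the `ShoppingCart` app and every name in it are hypothetical, invented purely for illustration): the scenario prescribes exact steps and data, while the tour fixes only the goal and an invariant, leaving the concrete steps to vary from run to run.

```python
import random

# Hypothetical app under test -- illustrative only, not a real API.
class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    def remove(self, item):
        self.items.remove(item)

    def total(self):
        return sum(price for _, price in self.items)

# A *scenario*: a prescriptive outline with fixed steps and fixed data.
def scenario_checkout_two_items():
    cart = ShoppingCart()
    cart.add(("book", 10.00))
    cart.add(("pen", 2.50))
    assert cart.total() == 12.50

# A *tour*: only the goal is fixed ("exercise add/remove in any order");
# the concrete steps are chosen as you go, and an invariant is checked
# instead of one fixed expected value.
def tour_add_remove(seed):
    rng = random.Random(seed)
    cart = ShoppingCart()
    expected = []
    for _ in range(rng.randint(3, 10)):
        if expected and rng.random() < 0.4:
            item = rng.choice(expected)
            expected.remove(item)
            cart.remove(item)
        else:
            item = (f"item{rng.randint(0, 99)}", rng.randint(1, 50))
            expected.append(item)
            cart.add(item)
    assert cart.total() == sum(price for _, price in expected)

scenario_checkout_two_items()
for seed in range(20):  # one tour, many distinct test cases
    tour_add_remove(seed)
```

Each seed produces a different walk through the same intent, which is what lets a single tour stand in for many scripted cases.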

Q: What is the difference between exploratory analysis and exploratory execution?

A: I don’t like introducing new terms – testing has too many of them already – so I will talk about the concepts here rather than reinforce these exact names. The thought processes that go into exploratory testing are generally considered something you do while you are executing test cases, but this only works with manual testing. In Google’s case, we do substantial test automation, and once the test code starts running, your chance of introducing exploration is pretty much gone. The automation will execute your test case with brute force and little flexibility.

With automation, you have to do your exploratory thinking up front, and this is where we came up with the idea of exploratory analysis. Simply put, the idea is to run the Tours in your head and let your thinking inspire your automation. The best example we have within Google is one Rajat Dewan presented at STAR East and explained on the Google Testing Blog.

Q: Do you have some tips on how to keep testing fresh and new when there is release after release? (To avoid people getting bored and always testing the same things and missing new issues.)

A: In fact, I do. I think this very problem is what I was trying to tackle with the tours. But your question gives me the opportunity to clarify this intent. Static test cases might be fun to come up with, and fun the first couple of times you run them, but running them build after build and release after release not only gets dull, it also introduces the pesticide paradox. The reality of the situation is that test cases, as specific physical entities, are too low-level. They specify a precise sequence of user actions. Tours are a higher-level concept: they specify purpose and intent and remain flexible on specific input sequences. In this manner, a single tour represents any number of test cases.

Now the secret is finding the balance. Some test cases are really important as they once found a bug or they represent an important user-initiated scenario. We want to run these no matter how bored we get. But beyond that, tours allow us more flexibility to increase coverage around the specific test cases and supply the variation that will keep our heads from exploding in boredom.
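
That balance can be sketched in code. In this hypothetical Python example (the URL normalizer and all names are illustrative, not something from the interview), the pinned cases run unchanged every build, while a tour supplies fresh variation around the same intent:

```python
import random

# Hypothetical system under test: a URL normalizer. Illustrative only.
def normalize(url):
    url = url.strip().lower()
    if not url.startswith(("http://", "https://")):
        url = "http://" + url
    return url.rstrip("/")

# Pinned cases: they once found a bug, or represent key user flows.
# These run every build, unchanged, no matter how bored we get.
PINNED = [
    ("Example.COM/", "http://example.com"),
    ("  https://a.b/c/ ", "https://a.b/c"),
]

def run_pinned():
    for raw, expected in PINNED:
        assert normalize(raw) == expected, raw

# Tour: the intent is "vary whitespace, case, scheme, trailing slashes";
# the concrete inputs differ each run, so coverage grows instead of
# going stale (the pesticide paradox).
def run_tour(seed, n=100):
    rng = random.Random(seed)
    for _ in range(n):
        host = "".join(rng.choice("abC.") for _ in range(rng.randint(3, 8)))
        raw = rng.choice(["", "http://", "HTTPS://"]) + host + "/" * rng.randint(0, 3)
        raw = " " * rng.randint(0, 2) + raw + " " * rng.randint(0, 2)
        out = normalize(raw)
        # Invariants, not fixed expectations.
        assert out == out.lower()
        assert out.startswith(("http://", "https://"))
        assert not out.endswith("/")

run_pinned()
run_tour(seed=42)
```

The pinned list supplies the "no matter what" regression checks; the tour supplies the variation around them.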

Q: Can you offer any advice for a developer to be a better partner in the testing process?

A: Indeed. But I want to point out that your question is asking for advice to devs, not about what test can do to help this partnership (which is the harder answer, so I thank you for that).

I manage a dozen or so projects from cloud to client to back end data center stuff. Some of these have great developer participation and some less so. The devs who are great partners are very involved in testing. They review and provide feedback on our test plans and designs. They become concerned fairly often about whether we are doing a good enough job in test (I mistrust anyone who trusts me and my team too much). They try to steer testers to areas of the product not covered by dev-penned unit tests. They fret more over us finding very few bugs than when we find a lot of bugs (think about that one a moment). They show great interest in what our automation is doing and like to suggest new manual test cases. When they find a bug, they take the time to show it to us instead of just checking in a fix. They invite us to give presentations during all hands and engineering reviews and they take the time to share credit with us when the team succeeds.

I like this question. Maybe I’ll keep thinking about it and make my answer into a paper.

Q: Have you found that your tours work well or help in cases where requirements are sporadic, vaguely defined or non-existent?

A: Having never worked on any other type of project, I can say with some confidence that, yes, they work quite well.

Sincere thanks to James for a great presentation, and to all the attendees for some excellent questions.  We always enjoy seeing discussions about testing elevated to a strategic level, and seeing so much passion and interest in the subject.  Rest assured, we’ll be scheduling more webinars in the coming year.  In the meantime, you can find a library of free resources about software testing on our site, including eBooks, whitepapers and recorded webinars.  Have other questions for James?  Have suggested topics for future uTest webinars?  Drop us a comment and let your voice be heard!

Continue Reading

uTest Wins Top Innovator Award @ New England Venture Summit

I’m proud to share with you that uTest took home the Top Innovator Award at the New England Venture Summit by youngStartup Ventures last week. The award recognizes cutting-edge companies driving the future of innovation in tech, life sciences and clean-tech sectors, and we’re excited to be among them.

NEVS Top Innovator

As one of the winners, Doron was invited to present at the exclusive Summit, where a select group of 450 entrepreneurs and investors gathered to be the first to meet the next wave of forward-looking companies.

After giving his presentation about uTest’s on-demand testing model and his entrepreneurial journey, Doron also walked away with the top honor for Best Presenter at the event!

Between the Bug Battle results, the Whittaker webinar and this prestigious honor, it was a busy week around the halls of uTest.

Continue Reading

Exploratory Software Testing Webinar with James Whittaker — December 10th


Attention uTest Community and prospective uTesters: If you haven’t registered for tomorrow’s free webinar (December 10th from 1pm to 2pm ET) on Exploratory Software Testing, please click here to reserve a spot.  It’s a hot ticket, with more than 300 testers from around the world already registered to attend.

Many of you have expressed interest in additional resources to help sharpen your testing skills, so this is a great opportunity to attend a free webinar with James Whittaker. He will discuss topics from his new book on Exploratory Software Testing. Additionally, we will be handing out five free copies to attendees (signed by James) – winners will be announced at the end of the webinar.

Hope to see you there!

Continue Reading

Another Community Milestone: 160 Countries!


Just noticed something new and cool when I hit our home page tonight — the uTest community is now operating in 160 countries around the globe.

What’s that mean?  How many countries are there?  Well, depending upon who you ask (the United Nations, the US State Department, the World Almanac, etc.), there are between 189 and 195 countries on planet Earth.

So recruiting professional testers from 160 different countries – and getting them to profile their testing experience, demographic information, hardware and software – is no small feat.  Anyway, we just wanted to point out that the world’s largest marketplace for software testing services just got a little bigger!

Thanks to our testers from every corner of the globe for making our community so vibrant and diverse.

Continue Reading

Media Wrap-Up From Our Latest Trip To The Valley

uTest was on fire at Under the Radar Mobility this year. I think Under the Radar said it best!

For anyone looking to deploy an app across multiple mobile platforms and a gazillion different handsets, one massive problem awaits them: QA. uTest solves this problem with an army of testers across the world. Crowdsourced QA… Problem solved. (Click here to see Doron’s presentation.)

And that’s not all! Doron was able to connect with multiple partners, prospects and top media outlets, including Mashable’s Ben Parr (@benparr), IntoMobile.com editor-in-chief Will Park (@willpark) and ReadWriteWeb’s Dana Oshiro (@suzyperplexus). He also participated in a couple of great video interviews with bnetTV’s Michelle Sklar (@bnettv) and GoMo News’ Cian O’Sullivan (@gomonews), which are posted below!

Take a peek at the video interviews below to learn more:

Doron Reuveni, CEO of uTest, speaks with bnetTV.com at the Under the Radar event.

Continue Reading →

Testing The Limits With Matt Heusser (part 1)

In this month’s installment of “Testing The Limits”, we sit down with Matt Heusser (@mheusser) — prolific blogger for STPCollaborative, thought leader and testing extraordinaire.  We’ll discuss the state of software testing, SpeedGeeking, the role of chaos in testing software, and the lack of fistfights at STPCon 2009.

uTest:  We loved the SpeedGeeking session you led at STPCon, so we’re going to flip it on you – If you had just five minutes to teach, motivate or inspire the uTest audience about software testing, what would you say?
MH: Well, I’d start by asking the audience what they are doing today – what their greatest pain point or opportunity is – and asking what options they see to improve. Most of the time, I hear that testing is “too slow” or “the bottleneck” or something like that.

So I suggest taking two weeks and actually measuring how the team is spending its time. Oh, not for reporting – it is very important that the team stop the time tracking after two weeks and not hand individual metrics to management for evaluation. Instead, we want to use the numbers for improvement. For example, many of the people I talk to can spend 80% of their time or more in meetings, working on documentation, working on compliance activities, doing email, and so on. That only leaves 20% of the time to test! Just pushing those numbers from 80/20 to 60/40 will double the amount of time the team spends actually doing testing.

Another thing to look at is the amount of time spent trying to reproduce defects, document defects, file bug reports, “verify” fixes, and so on. We think of these activities as testing, and they can take a substantial chunk of that 20% – but they are really accidental. That’s not a testing bottleneck – it is a development bottleneck. If test can work with development to improve the quality of the software prior to code complete, that will improve the speed of the whole system. Realizing this, and having a little bit of data to “prove” it, can help the entire system improve.

So if I had five minutes, I would say start by measuring how you spend your time, and ask yourself if this is the best use of your time and what can change. Sometimes, the big boss will say “no, we absolutely need you to fill out all seven pages of documentation per test run”, and you can say “ok.”  Six months from now, when someone asks why the big project is late, you can point out that the business made an explicit decision to pay the full price of defined process. You presented options and those were not accepted.

That won’t save this project — but it might save the next.  It also turns out that actually testing tends to be much more fulfilling than documentation and compliance activities. Who could have guessed?

uTest: Lots of contrasting opinions at last month’s STP Conference. While there were no fist fights (that we heard about anyway), what did you see as the most contentious issue? And where do you fall on this issue?

Continue Reading →