Guest Post: 3 Reasons Why You’re Not Advancing in Your Testing Career

You’ve mastered the technical skills, but why aren’t you advancing in your QA career? Joel Montvelisky, a tester, test manager and QA consultant with over 15 years of experience in the field of Quality Assurance, tackles this question in the following guest post.  You can read more about Montvelisky’s views on testing, agile, training and more on his blog – QA Intelligence.

******

While I was presenting at a recent training session, one of the testers asked me which skills I thought were most important for a tester who wants to advance in their career.

As I had my “mentoring hat” on, I immediately asked the whole group what they thought the most important skills for testers were.  They started throwing out all sorts of ideas, like analytical thinking, the ability to “read” code, knowledge of web and mobile technologies, automation, an eye for detail, etc.

I guess I should have expected that. Working as we do around programmers and engineers, the testers focused only on the hard and technical skills.  And I don’t blame them either, since these skills are incredibly important. In fact, I even wrote a post about being a technical tester in my QA Blog.

But in a sense, this is also one of the biggest mistakes you can make as a tester; to focus solely on the technical skills and not develop the “softer” skills that are also vital to performing your work well.

If you look at the testers who are the most valued by your company, in addition to technical skills, they also possess a number of other skills that help them to contribute more and to distinguish themselves and their work.  These are also the testers with greater chances of advancing to higher managerial positions.

Wouldn’t you want to be one of those people and have those opportunities yourself?

Well, I can’t sell you a potion for that, but I can explain the softer skills that are helping these most valuable testers, and how you can develop them too.

There are many soft skills a tester can acquire or develop, but after interacting with a large number of testing teams and development organizations, I’ve identified three skills that I believe are the most important for a tester. These 3 skills are:

  1. Communication skills
  2. Political skills
  3. Customer-facing skills

Communication skills

Communication skills are the single most important set of skills a tester should have – even more important than any technical skill you can think of.

By communication, I don’t mean only the ability to write a clear test case or an informative bug; it goes much further than that.

Communication starts with the ability to listen to what other people are saying and to translate that information into action.

In a QA Blog post I published in the past called “Of Testers & Soldiers,” I likened the role of testing to that of an army intelligence officer, whose task is to seek information from multiple, scattered sources and then piece it all together in order to map out a complete plan of risks and actions for his commanders or managers.  This can only be done if you are able to both listen to and process information quickly and effectively.

An additional aspect of communication is the ability to transmit your message clearly and in a way that will encourage your audience to listen to what you have to say.  As an old colleague of mine once told me, “We are all salespeople.  Some of us sell cars, others sell ideas.”


Guest Post: Windows 8 Design – Where Will It Take Us?

Inge de Bleecker has been working as a user interface expert specializing in mobile for the past 12 years. Here’s her take on Windows 8 and why users may or may not ultimately embrace the usability change.

I recently was fortunate enough to attend a series of Windows 8 training sessions for user experience (UX) designers. Attending the sessions gave me valuable insight into Windows 8 design.

I am not here to give you a full list of the good, bad, and ugly of Windows 8. A quick Google search will reveal a number of links to articles that already cover that. What I am wondering about is what the Windows 8 user experience over time will evoke in its users.

The goal behind any good user experience has always been to provide the user with an interface that is easy and intuitive to use. With the advent of the iPhone, the interface also had to be fun to use. Usability and user delight are now top priorities.

During the Windows 8 training sessions, it quickly became clear that the Windows 8 user experience team has put a lot of thought into all aspects of user experience design. Based on that, they have created a vast number of rules; rules about what can be displayed on an app’s landing page, what needs to be stuck in menus, where menus reside on the page, the look and feel of page elements, and so on. There are so many rules that it is a bit overwhelming.

The good thing about rules is that they make it much easier to design an app: just follow the rules, and the end result will be passable at worst.

Consistent use of rules also leads to uniformity. For users over time, uniformity breeds familiarity. And familiarity promotes ease of use. To some extent.

While familiarity is one of the principles we strive for in design, discoverability is another. I found that some of the Windows 8 design rules result in a lack of discoverability. For example, one of the rules states that actions should not be displayed on-screen, but instead provided in a menu (hidden by default) that can be opened by swiping. While users can become familiar with this and learn that they may need to look for something in a hidden menu, it impedes discoverability. Yet usability testing shows time and again that users want actions at their fingertips, on-screen.

This example brings about some questions: Will familiarity in time overcome the discoverability issue and result in an experience that users find easy and intuitive? Will it make for an enjoyable experience?

My exposure to Windows 8 so far has definitely made me think. I am curious to see how users over time will rate the ease of use and enjoyability of Windows 8 applications. I’ll report my findings in this blog once I’ve gathered enough data. Stay tuned.

Guest Post: Why Functional Testing is Important

Lucas Dargis is a software testing consultant. He has led the testing efforts of mission-critical and flagship projects for several global companies. He specializes in the development and implementation of testing strategies.

There is an age-old expression that says “You only have one chance to make a first impression.” This is a hard truth in today’s world of instant gratification. If your product fails to deliver the first time, your customers will simply move on to the next thing. In-the-wild functional testing, as provided at uTest, is similar to a dress rehearsal for your application. Your application is exposed to a group of people who accurately represent your potential user base. They can identify and report the issues (that would have negatively impacted your customer’s first impression) before your customer ever has the chance to see them.

A functional tester has the ability to evaluate individual features of an application. They are familiar with typical application behavior and have the skills needed to look objectively at a feature and see what’s wrong.

Perhaps even more valuable is a functional tester who is able to analyze individual pieces of an application within the context of the entire application. A functional tester looks at a particular item, identifies integration points between that item and other parts of the application, and then formulates a plan of how to inspect those touch points. Applications are usually weakest in places where different parts come together. A strong functional tester knows this and knows how to exploit those weaknesses to identify any lurking bugs.

Functional testing will only be successful if an organization’s underlying quality fundamentals are solid and everyone clearly understands how testing helps achieve the goals of the business. Functional testing is only one of many activities that collectively comprise a comprehensive testing strategy. Depending on the needs and expectations of your company, different testing activities such as performance, load, and security testing should be considered. Functional testing differs from other types of testing in that it most closely reflects the experience of the users. While performance affects the experience and security issues add risk to the experience – how the application functions IS the experience.

Guest Post: uTest – My First 100 Cycles

Lucas Dargis joined the uTester community in March 2012 with an eye toward expanding his knowledge base and gaining widespread experience. To help him stay on track, Lucas set three goals for himself. Today’s guest post will tell you how he achieved those goals ahead of schedule and what keeps him coming back to uTest. You can learn more about Lucas by visiting his uTest Profile or his blog.

********

So I’m a little late, I’m actually at 138 cycles, but I wanted to give an update on my uTest experience now that I’ve got 100 cycles under my belt.

Accomplishments

When I first signed up with uTest I set a few goals. I really had no idea how realistic they were, but you have to at least have something to shoot for, right?

By the end of 2012 (9 months from when I started) I wanted to:

  1. Earn my gold badge in Functional testing
  2. Become a TTL (Test Team Lead)
  3. Develop a strong reputation within the uTest community

Gold badge

I got my functional badge within 30 days of my first test cycle. At first this was actually a disappointment. I was really looking forward to the challenge of having to work hard for that badge.


Calling All Software Experts: Share Your Knowledge

Do you have a great software-related story or experience you’re itching to share? Submitting a guest post is a great opportunity to share your thoughts and gain exposure. If you have a blog post that covers software development, mobile apps, web apps or real world testing you should consider submitting to blog@utest.com.

Being a crowdsourced software testing company, we aim to share insight from the people who know software best. No one knows software better than the developers, testers and end users. From development or testing experts who share their biggest challenges or thoughts on agile development, to freelance tech writers who experience the software themselves – no matter what your software background, we’d love to hear from you. So what should you write about, and how do you submit a post?

Depending on your topic, your guest post may be published on one of several uTest blogs, including this one.

All posts should be original content. To review the guidelines for submitting a post, or to get ideas for what to write about, please visit the Guest Blog Post Submission page.

 

Guest Post: How to Make Sense (and Use) of User Feedback

Every company wants user feedback – lots of it. They want the good, the bad and even the baffling. But how do they make sense of it all? In this guest post, Ally Howell explains how the team at PowerInbox incorporates user feedback into their product.

How to get the most from your users: let them take the wheel

Face it: the country would not have gotten very far without the results of the Industrial Revolution. A hundred and fifty years of innovation have led to today’s extensive take on revolutionizing technology. The technology sector’s exponential growth has made it entirely dependent on the people it serves.

User testing and customer satisfaction are essential to proper and auspicious product development. In this day and age, maintaining a relationship with your users encourages sustainable customer loyalty and community expansion. Most successful companies operate on a straightforward “no consumer, no product” mentality. They put their target demographic in the driver’s seat. In reality, this may well be the only way to ensure that you deliver a product that is in high demand and has the potential to sustain ideal growth rates.

Let’s break it down now.

Step one: The more the better! Collect collect collect.

This is one of the most important steps, so at PowerInbox we make sure we are reaching out to our users directly (in doing so, we personalize the outreach messages and avoid sending mass emails – this increases the influx of replies). Likewise, composing the right questions, and following up once you receive the responses you’re looking for, will drive innovation and improvement and build the strong relationships you’re after.

Getting it all started can be the toughest part, but the rewards are worth it. An additional benefit? This initiative fuels conversion rates through the power of personalization and recommendation, and it demonstrates how valuable the customer’s role is to product growth. So, the advice here: reach out to a generous number of users, anticipate a response rate of roughly 50%, and get as many emails out there as you can so the feedback pool accumulates a substantial amount of data to analyze.

Step two: Sift and toss

At this point, obtaining the feedback may seem like the easy part! Staying on top of the inbound questions, comments, and concerns can be a daunting task, but as long as each user response is stored as it comes in, the benefits of gathering thoughts and impressions show up immediately. Not to mention how amazing it feels when positive feedback and encouraging notes begin to roll in! We’ve embraced this feedback tactic and recently heard some wonderful news from a dedicated user, Nihit, that made the whole team smile: “Definitely! Its mind blowing. Just loved this development from you and your team.” Granted, there are going to be a lot of “interesting” responses on top of these, but more often than not, what users have to offer determines the product’s direction and success.


Guest Post: The Complete Test Coverage Myth

It is hard to know when enough is enough in software testing. Some of the most common mistakes occur when terms like “complete test coverage” get thrown around in the wrong context. This guest post is from Lee Hawkins, the quality architect for Quest Software. Lee has enjoyed over 15 years in the IT industry, including a short stint as an artificial intelligence developer before making the leap into software testing in 1999. He is a proud exponent of context-driven testing, and his areas of interest include exploratory testing and the challenges of testing in an agile world. The following post looks at the complete test coverage myth and explains why claims of complete testing should be treated with skepticism.

********

This blog post was inspired by recent discussions in a LinkedIn group about whether writing test cases is a waste of time. While that topic is subject matter enough for a few blog posts in its own right, one dangerous response kept appearing: the claim that “complete test coverage” can be achieved by using a requirements traceability matrix (RTM).

A simple definition of an RTM might be “a table that traces a requirement to the tests that are needed to verify that the requirement has been met.” The common use of this matrix seems to be in suggesting that once all of the identified tests have been performed against all of the identified requirements, then we have “complete test coverage.”

Terms such as “complete test coverage” are dangerous, especially when communicated to project stakeholders and business decision makers, for they automatically set the testers up for blame when something goes wrong in production. The stakeholders hear “complete test coverage” and think “the testers have tested everything, so we’re good to go.”

The reality is that all your RTM tells you is that you’ve done *some* testing of all the listed requirements. We can never test everything, even for one requirement. We can never exhaustively list out all of the requirements either. So, you’ve actually performed less than complete testing against a less than complete list of requirements – that doesn’t sound like “complete test coverage” to me!
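To make that point concrete, here is a minimal, hypothetical sketch of what an RTM really measures. All requirement and test names below are invented for illustration: even "100%" on this metric only says that each listed requirement has *some* test, not that any requirement has been tested exhaustively, and it says nothing about requirements that were never listed.

```python
# A toy RTM: requirement IDs mapped to the tests written for them.
# Names are purely illustrative, not from any real project.
rtm = {
    "REQ-1": ["test_login_valid", "test_login_invalid_password"],
    "REQ-2": ["test_export_csv"],
    "REQ-3": [],  # requirement identified, but no tests written yet
}

def rtm_coverage(rtm):
    """Fraction of listed requirements that have at least one test."""
    covered = sum(1 for tests in rtm.values() if tests)
    return covered / len(rtm)

# 2 of the 3 listed requirements have at least one test.
print(f"RTM coverage: {rtm_coverage(rtm):.0%}")
```

Note what the metric cannot see: how thoroughly each requirement was exercised, and whether the requirement list itself is complete.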

As testers, we need to accept the fact that complete coverage is unattainable and certainly stop reporting to any stakeholders or decision makers that we have achieved it. What we do need to clearly communicate is what risks remain based on the (incomplete) test coverage we have achieved, so that business decisions can be made based on risk rather than on false assumptions that no risks remain due to “complete testing.”

 

Guest Post: The Past, Present and Future of Apphance

As you’ve probably seen by now, uTest acquired Apphance in a seven-figure deal. You can get more details about the deal in this blog post from uTest CEO Doron Reuveni. You can also read more about the product itself in this subsequent post from Stanton Champion. But in this post, we wanted to share the origins of Apphance from Jakub Lipiński, the CEO and co-founder of Polidea, Apphance’s parent company. Enjoy!

********

In order to diagnose what’s gone wrong with mobile apps, developers need specific information. The problem is that this information is extremely hard to find and even harder to communicate.

Consider the following conversation (sadly, based on an actual call):

Developer: I’ve got a bug saying that the app doesn’t look good on your device. Could you give me a little more information about what kind of phone you have please?
Customer: It’s white.
D: I actually meant what’s the model?
C: Aaaah!  Ummm, it’s an HTC.
D: Ok, what kind of HTC?
C: Ummm…Desire.
D: Cool, and what version of the firmware are you running?
C: Firmware? What do you mean? Is that like hardware?
D: Hmmm, never mind. Could you send me a screenshot of the problem?
C: Okay… sure, but I’m not quite sure how to do it…
D: Let me explain: Install Java JDK. Install Android OEM USB Drivers. Install Android SDK. Reboot your computer. Connect the phone to the desktop. Launch the Dalvik debugger. Go to Device->Screen capture. Make the screenshot. Save it to disk. Compose a new email and attach the file. It’s that easy.
C: …Nevermind, it’s fine.

The point: devs and users rarely speak the same language. This causes misunderstanding and frustration. It also keeps developers from improving their mobile apps, because the loop between users and engineers rarely closes.

This brings me to the origins of Apphance. At Polidea, we envisioned a mobile quality tool that would gather instant (and accurate) information from testers and end users about their app experience. We wanted it to be fully automated; to provide the necessary information developers need to diagnose the issues and allow testers to clearly indicate what they would like to improve.  We scoured the mobile tools landscape for this type of functionality and came up empty.

And after several months of fruitless searching, we went ahead and created a suite of embedded software libraries that solved our own pain points. These libraries automatically send all the information about the device (maker, model, OS and version) to the server where the developers can view them.
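The paragraph above describes libraries that automatically collect device details and send them to a server. As a rough sketch of the idea (the field names, values, and function below are invented for illustration and are not Apphance’s actual wire format or API), such a library might assemble a device report like this:

```python
import json

def build_device_report(maker, model, os_name, os_version):
    # Assemble the kind of device report a crash/feedback library
    # might send to its server. On a real device these values would
    # come from platform APIs (e.g. Build.MODEL on Android) rather
    # than being passed in by hand.
    return {
        "maker": maker,
        "model": model,
        "os": os_name,
        "os_version": os_version,
    }

# The HTC Desire from the support call above, reported automatically
# instead of being extracted from the user question by question.
report = build_device_report("HTC", "Desire", "Android", "2.2")
payload = json.dumps(report)  # serialized for upload to the server
```

Automating this one step removes the entire painful dialogue quoted earlier: the developer gets maker, model, OS, and version without the user ever knowing what firmware is.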


Guest Post: How Banks Can Master Software Testing

Don Hamlin is a retired software development manager who spent his career developing and maintaining custom software for banks and other financial organizations. Though he’s been out of the game for a few years now, Don is willing to bet the same issues he encountered during his years of development are still looming in the slow-changing financial industry. In today’s guest blog, Don walks us through one of the most common bottlenecks he encountered and how individuals and companies can work together to overcome the challenge.

************************

Large banks do present challenges. I retired from banking in 2004, so this could be outdated, but I doubt it. Banks are very compartmentalized.  Security … at least six people; project managers … oh yeah; developers … galore. So how do you break into this gang of hundreds, all located in different parts of the country?

If there is one thing that is true of projects in general, it’s that they nearly always run behind schedule. That’s true of most organizations, and definitely true of banks. Banks are great at producing enterprise projects, but not good at being adaptable, flexible, or on time. Augmenting testing would be a true asset for any bank and would help recover from project slippage. Documentation of test cases and bugs would be mandatory – banks were (and I am betting still are) crazy for documentation. Being able to solve this problem in a way banks can understand and are comfortable with will not only help the organization, it’s your way in.

During the great Y2K conversion (yes, it was a long time ago!) I outsourced some of our coding needs to Bangalore. I thought it would be a good thing: the work was drudgery, no one wanted to do it, and it kept getting postponed. Well, the same programmers on my staff who had complained about having to do the work then whined about my outsourcing the job. Lesson learned: never ask a developer if they want outside help!

In my mind (then and now) success will come with finding a manager who is forward thinking AND has the authority to authorize outside help, especially when it comes to testing. Banks realize there is a lot of available help, but not everyone within the organization has the authority to act on it. Having someone in place who can recognize when help is needed AND has the authority to act on that need is key. If you want to be that person, do your research on who is truly the right person to contact to gain entry, don’t just pick a random name from within a large bank.  Internalize the greatest catch line an outside consultant used on me, “I am here to make you look good!”

Testing Usability within an Agile Framework

We’re back with another usability testing guest post from uTest UX expert Inge De Bleecker. Last year Inge gave us a rundown of the basics of UX testing and exploratory UX testing. Now Inge, a user experience architect by trade, is back to share her tale of transitioning to an Agile process after some 15-odd years in the business.

To learn more about Inge, visit her LinkedIn profile or check out her blog.

***********************

Lean UX: User Experience on the Fast Track

A few years ago, the company I worked for ‘went Agile’. As a user experience designer, I was scratching my head. The discipline of user experience was just gaining recognition as a valid discipline, and I had spent a good bit of time educating my colleagues on how user experience activities fit in their waterfall process. Now I was facing a whole different set of challenges trying to fit user experience into the Agile process.

One of the main concerns I had with Agile was the lack of upfront planning time. In order to build a product that users want to use, it is important to understand who the target user is, and how they expect to interact with the product. Armed with that insight, an overall information structure and blueprint for the user interface is then created. All this preliminary work takes time and effort, and has to be completed before diving into the details and actual development. Fortunately, most development teams I worked with struggled with similar upfront time constraints to get their tools running and their preliminary planning completed. That bought us some much appreciated extra time.

Very soon, I came to love one particular aspect of Agile: the short sprints introduced the perfect opportunity to be iterative. Design is inherently an iterative activity, and Agile supports that very well. I quickly got into the habit of reviewing sprint deliverables, throwing in a quick usability test at times, and submitting a list of change suggestions for consideration in the next sprint planning meeting. Such quick turnaround time for making changes to a product interface had never been possible before.

Over the past few years, many organizations have moved to some version of Agile, and many user experience designers have had to adapt. The result on the user experience side is now called ‘Agile UX’ or ‘lean UX’. Some view the lean UX movement as a ‘revolution’, for better or for worse. Proponents prefer to call it an ‘evolution’. Whichever way you look at it, two interesting changes came out of this:

1. A wider acceptance of ‘quick’ work that allows and forces us to focus on results instead of process and methodology: there is no time to write elaborate plans and 60-page result reports, no time to draw pretty pixel-perfect pictures and exhaustive wireframes. Remote and unmoderated usability testing methods are becoming more widely accepted as one of the tools in the user experience toolbox.

2. Tools to enable a quick turnaround: Previously, user experience designers were forced to master complex drawing programs to create their deliverables. Those programs were not all that well-suited to the work and were time-consuming to learn and use. Today, we have a choice of tools that are low cost, have a shallow learning curve and that allow us to create rough (and not so rough) visualizations that get the ideas across.

To be clear, there is no substitute for thorough planning and clear communication of ideas and feedback. We should never use Agile as an excuse to skim over the essentials of good user experience design. As long as we keep this in mind, Agile and the lean UX evolution have helped us greatly in building more successful products.