It takes seven seconds to form a first impression of someone you meet – can the same be said for a product? If it’s as true for smart devices as it is for people, then ensuring a positive onboarding experience is just as critical to product success as a smile and a firm handshake.
While the idea of “first impressions” lives more with Marketing and Design, it is still heavily influenced by Quality and Engineering. Product setup – and its impact on the rest of the product experience – contributes greatly to customer satisfaction. 33% of consumers complain of difficulty setting up connected devices. Even so, onboarding is often placed low on the ladder of priorities leading up to release, because it is not a “primary” feature.
Is Setup that important to the customer experience? If so, how do you prioritize the onboarding experience in your customer tests to streamline it for your target market? Here at Centercode, we have the opportunity to manage tests for hundreds of the latest products each year and continually learn from them. With that in mind, let’s dive into this topic with a hypothetical scenario that could apply – and has applied – to many different kinds of products.
Stark Industries’ Streaming Device
Stark Industries plans to release their first line of voice-capable streaming devices. They’ve already run an Alpha Test and ironed out the bulk of the connectivity, stability, and interoperability issues that were the focus of the previous test phase. With stability as a non-issue, they’re confident this product will be a hit.
After the first week of Beta Testing, however, the team receives some unexpected feedback around setup. Testers report that the packaging was very difficult to open, that the app seemed disconnected from the hardware experience, and that configuring the device was confusing and frustrating. Though testers were generally satisfied with the device’s performance post-setup, this poor first experience dragged down Net Promoter Score (NPS) and the product’s overall evaluation.
This would have been a costly surprise had it happened after release. Instead, the Stark Industries product team is able to analyze the key pain points of the experience and correct them well ahead of launch.
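For reference, the NPS metric mentioned above is a standard survey measure: the percentage of promoters (testers rating 9–10 on a 0–10 “how likely are you to recommend this product?” scale) minus the percentage of detractors (ratings 0–6), yielding a score from -100 to +100. A minimal sketch of the calculation, with made-up example ratings:

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8) count only
    toward the total number of responses. Result ranges from -100 to +100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical ratings gathered after a rocky setup experience:
beta_scores = [9, 10, 7, 6, 5, 8, 9, 3, 6, 10]
print(nps(beta_scores))  # 4 promoters, 4 detractors out of 10 -> NPS of 0
```

Because detractors subtract one-for-one from promoters, even a minority of frustrated testers can pull an otherwise solid score down sharply – which is exactly what a painful setup experience tends to do.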
How Onboarding Affects Customer Experience
How much does onboarding contribute to overall customer satisfaction? Although it varies from product to product, we have some concerning statistics from the growing connected market. Studies show that customers spend about 1.5 hours working through setup, and that 22% of consumers give up and return smart products. The high volume of returns from difficulty setting up means it’s in the best interest of your product to get it right the first time around.
Though setup is a relatively small part of a product’s overall lifetime, it sets the tone for your consumer’s experience early in its use. Imagine if Stark Industries had gone to market anyway, without addressing onboarding. About 78% of real-world users would have bumbled through setup; that’s strike one. What if those users then ran into another hiccup somewhere else? A good chunk of them would likely write the whole product off as a dud and return it. A competitive marketplace isn’t baseball – you rarely get three chances to get it right.
Digging Into First Impressions with Beta Testing
The Beta phase needs to take a fresh representation of your true market through a guided product “experience” tour – starting with setup. Then it needs to measure testers’ perception of each experience to see how much they enjoyed it. This effective Beta process answers the question, “Will customers like my product?” Unlike Alpha Testing, which focuses on validating stability, Beta Testing focuses on validating satisfaction – how well your product’s features help users accomplish tasks, whether those features perform as expected, and whether they fit users’ needs and interests.
The Beta phase is the best time to test first experiences with your product. Your marketing, publication, and design teams are often still working on the product interface, packaging, and documentation during Alpha. Beta is the first time that a true representation of your product is available all together – as close to the real-life experience as you’ll get prior to launch.
Here are three things to consider when evaluating setup in your Beta Test.
Make Sure It Works
Although we’ve mentioned it a few times already, it’s worth repeating: running a small Alpha Test (around 30 testers, for 2 weeks) ensures that you don’t have any blocking issues (critical bugs that completely disrupt product usage) in the real world. Beta Testing an unstable product is really just running an expensive Alpha Test – it’s impossible for testers to accurately evaluate their product experience when they’re reacting to technical issues. It’s amazing how often our Managed Services Team facilitates tests where this step alone would have saved the company thousands of dollars in time and resources.
Choose Topics at the Right Level
During testing, you want to engage your testers with product experiences and have them report their observations both meaningfully and consistently. Assigning tasks or scenarios that are too low-level (e.g., “Press the green connect button”) will inhibit them from experiencing your product naturally. On the other hand, topics that are too high-level (e.g., “Please use the product”) won’t reveal the key drivers behind the topic you need to focus on. For a Beta Test, common experiences to measure include Unboxing, Hardware Setup, Mobile Setup, and Account Creation.
As an additional note, you don’t have to ask testers explicitly about their “initial impressions.” You should be using a fresh set of eyes for your Beta Test (if at all possible); make a point to gather feedback within the first few days, and you’ll know it reflects their initial impressions.
Don’t Overwork Your Testers
Make sure that the first week of testing is primarily focused on product setup and the initial experiences you’d like testers to evaluate. Although it is tempting, don’t ask your testers to do more than 3-4 topics a week. While they’re likely to complete them if you’ve set appropriate expectations, the more you ask of them at once, the less accurate their answers will be.
Striking an appropriate balance of activities and tasks is key to maintaining high participation. A steady level of engagement throughout the entire test is best. As a general rule, aim for 1-2 hours of activities per week. This will take your testers slightly outside of their day-to-day routine without overwhelming them.
Collecting feedback about the onboarding experience directly from your target market enables you to make nuanced, customer-driven recommendations that will increase overall product satisfaction. While it’s possible that a rocky start will smooth out in the end, devoting attention to making it easy for your customers to interact with your product from the very start shows concern for their experience. As the saying goes, “You never get a second chance to make a first impression.”
That positive first experience earns brand loyalty, lowers return rates, and distinguishes your product from its competitors post-launch.
To start increasing customer satisfaction with effective Beta Test planning, preparation and execution strategies, don’t miss our webinar, Achieving High Customer Satisfaction Through Beta Testing!
Register Now to Learn Highly Effective Beta Testing Best Practices