A few years ago I had the privilege of working with an amazing startup team in Boston. From the start we were getting great traction. Unfortunately, as our website traffic increased, the users we were leaving behind became harder and harder to ignore. What started as a rare event turned into thousands of potential customers bouncing unexpectedly. It turns out that small problems at scale become big problems.
Unfortunately, it was really hard for us to pinpoint exactly why people were bouncing, and there weren't any tools that could give us the level of insight we wanted. So we set out to create our own tool internally. Piece by piece we built a system that could tell us where people were struggling. It worked beautifully and quickly identified problem areas in our website. Sadly, it was never made publicly available for other companies to use. Such is the fate of most internal tools.
New projects, same problems
Eventually I left the team, and over the next three years I worked on various projects at enterprise tech companies. With each new project it became increasingly clear that no one else seemed to have a good way to improve their UX. On top of this, when I attempted to discuss building something similar, I was often met with disbelief. That disbelief fascinated me. These companies took their UX very seriously. Engineering teams were plugged into the design process and listening to users directly. There were cross-team meetings where customer complaints and feature requests were reviewed to refine and improve product roadmaps. These teams spent a ton of time focusing on improving their user experiences, yet something was missing. Data.
Despite all of the cross-team planning and product design meetings, they lacked any sort of data to validate which changes would have the most value. As a result, the products struggled to reach the performance metrics that were important to management. The teams would rack their brains over which part of the design or user flow was causing their users to bounce. Since they didn't have data, my first question was often, "Have we performed an audit to make sure it's optimized against a baseline standard?" The answer was always no. They were reviewing the designs regularly, but they weren't really measuring them against anything. There was no way to know for sure whether changes were actually going to impact the experience, let alone improve it. Just gut feelings and hopes that we were right.
UX shouldn't be a guessing game
Creating great user experiences shouldn't be a guessing game. You also shouldn't rely on people to keep track of every potential issue with a website. They might be experts, but they aren't machines. People just don't do well keeping track of huge amounts of information. It's actually one of the "Laws of UX".
With Look-see, you don't have to worry about how you're tracking and measuring user experiences. We help you identify issues early, track progress, and even collaborate across teams.