How UI impacts metrics to provide a better UX

Sometimes we believe that in order for a digital product to work, we need to make major changes, but the truth is that small details can make a big difference. Even adjusting a couple of pixels can have a tremendous impact on the metrics we evaluate!

Today, we want to share our experience with Soomba, a recently launched project where we learned firsthand the importance of measuring, testing, and verifying: even a design we consider incredible can always be improved.

Soomba is a 100% digital radio station with unique playlists created by DJs without the help of algorithms. These playlists draw on the DJs' expertise, which gives them special value at a time when artificial intelligence is present at every step of our journey.

In this project, our challenge was to identify pain points, offer solutions, refresh the look and feel, and make changes without the need for significant development modifications. However, we won't go into detail about how we arrived at those pain points. If you want to learn more about this topic, we recommend reading our article, "How to Conduct a Discovery Without Losing Your Mind."

Returning to Soomba Radio: after going through the discovery process and having all aspects clear, we got to work and focused on improving the experience through the user interface (UI). We were convinced it would be a success right from the start: new colors, carefully designed information architecture, modern typography, and positive feedback from both the client and other teams at EGO. Here's the interesting part: we ran an A/B test comparing the initial design with our version, but... it didn't work out as expected!
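
For readers curious about the mechanics, here is a minimal sketch of how a front-end A/B split like this can be wired up. The bucketing logic, function names, and user IDs below are our own illustration, not Soomba's actual setup; the key idea is that hashing a stable user ID keeps each visitor in the same variant across sessions.

```typescript
// Hypothetical A/B bucketing: deterministically assign each visitor
// to "control" (the initial design) or "variant" (the redesign).
type Variant = "control" | "variant";

function hashString(input: string): number {
  // FNV-1a: a simple, stable string hash (same input, same output).
  let hash = 2166136261;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return hash >>> 0; // coerce to an unsigned 32-bit integer
}

function assignVariant(userId: string): Variant {
  // 50/50 split: the hash's parity decides the bucket.
  return hashString(userId) % 2 === 0 ? "control" : "variant";
}

// The same user always lands in the same bucket across sessions.
console.log(assignVariant("user-1234"));
```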

So, what happened? It turns out that sometimes our intuition can deceive us, which is why testing our ideas and not taking them for granted is crucial.

When we realized that our variant was "losing," we gathered designers, developers, project managers, and advisors for a roundtable discussion to evaluate both the metrics and the design as a whole. We noticed that, on smaller screens, our variant wasn't displaying the play button for the main playlists. Additionally, when analyzing the heatmaps, we discovered that users were trying to click on the playlist boxes instead of clicking the play button itself.

The change we made was minimal: we hid a sticky bar so that the playlist boxes always appeared complete, regardless of screen size, and we made the boxes themselves clickable, as in the sketch below.
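
To make the fix concrete, here is a minimal sketch of the clickable-box behavior. The class names (.playlist-card, .play-button) are hypothetical, since we are not reproducing Soomba's real markup; the idea is simply that a click anywhere on a card is forwarded to its play button.

```typescript
// Hypothetical markup: each playlist box is a ".playlist-card" element
// containing a ".play-button". Clicking anywhere on the card plays it.
document.querySelectorAll<HTMLElement>(".playlist-card").forEach((card) => {
  card.addEventListener("click", (event) => {
    const playButton = card.querySelector<HTMLButtonElement>(".play-button");
    // Forward the click unless it already landed on the button itself,
    // which would otherwise trigger playback twice.
    if (playButton && !playButton.contains(event.target as Node)) {
      playButton.click();
    }
  });
});
```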

The improvements were significant:
  • Our variant showed a 2.4% increase in conversion rate compared to the original design.
  • The bounce rate decreased significantly, going from 86.75% in the initial design to 9.87% in our variant.
  • Users spent far more time on our variant than on the initial design: 16 minutes and 43 seconds versus 6 minutes and 50 seconds. (A quick way to sanity-check lifts like these is sketched below.)
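
Before declaring a winner, it's worth confirming that a lift like this is statistically significant rather than noise. Below is a minimal two-proportion z-test sketch; the visitor and conversion counts are purely illustrative assumptions, since we are not publishing Soomba's raw traffic figures.

```typescript
// Two-proportion z-test for conversion rates (illustrative numbers only).
function zTest(convA: number, totalA: number, convB: number, totalB: number): number {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se; // |z| > 1.96 ≈ significant at p < 0.05
}

// Hypothetical sample: 10,000 visitors per arm, 5.0% vs 7.4% conversion.
const z = zTest(500, 10000, 740, 10000);
console.log(z.toFixed(2)); // well above 1.96, so the lift is significant
```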

The lesson of the day is that we should not underestimate the power of details in design. Sometimes, a couple of pixels can make the difference between a frustrating experience and one that captivates users.

In conclusion, don't be afraid to constantly test, measure, and improve your design. Even the smallest changes can have a significant impact on metrics and user experience.
