Imagine you are a manager monitoring your sales team's performance using the graph below. Your task is to identify whether the deals your reps have closed met certain company and team goals. Now imagine having to do this manually while cross-referencing your notebook with your sales analytics tool.
 
Does your head hurt yet? Because mine did when I learned that this is how managers had to monitor and measure their team's performance against team and company standards.
Background
Revenue managers routinely compare numerous metrics across data visualizations throughout their day to monitor sales representative performance. 

In conversations with the product manager, stakeholders identified the need to quickly and efficiently evaluate rep performance against company and team standards.

As the UX Designer, I shaped how benchmark lines are configured and displayed across graphs so revenue managers can easily identify and measure key rep performance metrics.
Project Kickoff and Scoping
This project started with a brief goal description from the product backlog. The goal seemed straightforward, so I began by getting as close to the customer as possible, collaborating with the Product Manager.

I gathered customers' pain points and the PM's early design thinking to scope the final solution. With this information, I set out to learn how competitors' products might use benchmark lines.

I didn't get far into my analysis before I was blocked by company demos and product licensing, so I switched to learning more about the concept of benchmarking. I completed early mockups of what the benchmark lines could look like on a graph. 

Benchmark Lines V1

Benchmark Lines Overlapping

Learning through Iteration
Designing quick iterations helped me discover the first challenge: handling overlapping benchmark lines for narrow windows of performance. Because of this, I had to rethink how the labels were displayed and how to differentiate the benchmark lines from the graph.

Through a few iterations, I found that centering the label and changing the line to a dashed pattern was a good start toward making these lines distinct from the graph. With further feedback from my Design Director, I adjusted the opacity to push the benchmarks into the background.
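To make that resting style concrete, here's a minimal sketch of how it could translate to an SVG-based chart. The function, parameter names, and styling values are hypothetical, not our production code:

```typescript
// Minimal sketch of the resting benchmark style on an SVG chart.
// All names and values here are illustrative assumptions.
const SVG_NS = "http://www.w3.org/2000/svg";

interface Benchmark {
  label: string; // e.g. "Team Goal"
  value: number; // metric value the line sits at
}

function renderBenchmark(
  svg: SVGSVGElement,
  benchmark: Benchmark,
  yForValue: (value: number) => number, // chart's value-to-pixel scale
  chartWidth: number
): void {
  const y = yForValue(benchmark.value);

  // Dashed pattern plus reduced opacity pushes the line behind the data series.
  const line = document.createElementNS(SVG_NS, "line");
  line.setAttribute("x1", "0");
  line.setAttribute("x2", String(chartWidth));
  line.setAttribute("y1", String(y));
  line.setAttribute("y2", String(y));
  line.setAttribute("stroke", "#6b7280");
  line.setAttribute("stroke-dasharray", "6 4");
  line.setAttribute("opacity", "0.5");

  // Centering the label keeps stacked benchmarks readable in narrow windows.
  const label = document.createElementNS(SVG_NS, "text");
  label.setAttribute("x", String(chartWidth / 2));
  label.setAttribute("y", String(y - 4));
  label.setAttribute("text-anchor", "middle");
  label.setAttribute("opacity", "0.7");
  label.textContent = benchmark.label;

  svg.append(line, label);
}
```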

The last thing I implemented into the design was the interaction needed to see the values of the benchmarks and to show which line was being viewed on hover. This interaction used progressive disclosure to reveal important information and took the guesswork away from the user.
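That hover behavior could be sketched like this, again with hypothetical names, assuming the line and label elements from the sketch above:

```typescript
// Sketch of the hover-based progressive disclosure: the hovered benchmark
// comes forward and reveals its exact value; on leave it recedes again.
function attachHoverDisclosure(
  line: SVGLineElement,
  label: SVGTextElement,
  benchmark: { label: string; value: number }
): void {
  line.addEventListener("mouseenter", () => {
    // Bring the hovered benchmark to the foreground...
    line.setAttribute("opacity", "1");
    label.setAttribute("opacity", "1");
    // ...and disclose the value only when the user asks for it.
    label.textContent = `${benchmark.label}: ${benchmark.value}`;
  });

  line.addEventListener("mouseleave", () => {
    // Restore the resting, backgrounded state.
    line.setAttribute("opacity", "0.5");
    label.setAttribute("opacity", "0.7");
    label.textContent = benchmark.label;
  });
}
```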
Updated Benchmarks V2
Leaning into Existing Patterns
The graph was straightforward, but I still had to think about how these lines would be managed. To configure the benchmarks, I distilled the functionality into adding, removing, labeling, and editing the line values.

I used design system components to add this configuration to the settings. Because the visualization section already existed and I knew which components supported the needed inputs, I could focus on the layout.
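Conceptually, that configuration boils down to a small data model with the four operations above. Here's a hypothetical TypeScript sketch of that model, not the actual implementation:

```typescript
// Sketch of the configuration model behind the settings panel, covering
// add, remove, label, and edit. Shape and function names are assumptions.
interface BenchmarkConfig {
  id: string;
  label: string;
  value: number;
}

function addBenchmark(
  benchmarks: BenchmarkConfig[],
  label: string,
  value: number
): BenchmarkConfig[] {
  // crypto.randomUUID() is available in modern browsers and Node.
  return [...benchmarks, { id: crypto.randomUUID(), label, value }];
}

function removeBenchmark(
  benchmarks: BenchmarkConfig[],
  id: string
): BenchmarkConfig[] {
  return benchmarks.filter((b) => b.id !== id);
}

// Relabeling and editing the value share one update path.
function updateBenchmark(
  benchmarks: BenchmarkConfig[],
  id: string,
  changes: Partial<Pick<BenchmarkConfig, "label" | "value">>
): BenchmarkConfig[] {
  return benchmarks.map((b) => (b.id === id ? { ...b, ...changes } : b));
}
```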
Outcomes
Lessons:
• Using existing components made it easier for engineers to build the feature.
• Collaboration brought forth considerations that improved the overall design.

Achievements:
• Enabled users to configure multiple benchmark lines 
• Improved the experience for managers evaluating rep performance
Post Launch
There was not much post-launch monitoring beyond the product manager following up with customers to ensure they were satisfied with and understood the feature.

If I could go back, I would instrument the feature with FullStory Analytics to understand how people use it and what issues they run into while configuring benchmark lines. That insight would help me continuously improve the feature and guide its expansion to other types of graphs and charts.
