Enterprise UX · One Identity

User View

"The best feature in the last 5 years" — redesigning user risk visibility for a product used by 80 of the Fortune 100.

Role

Product Designer

Team

1 Product Designer, 30 Engineers, 2 Product Managers

Tools

Figma


Your design is this product's best feature in the last 5 years. The User interface has gone way beyond... I am seriously impressed with your design skills and can't wait to see them make it into the product.

Richard Hosgood

Principal Presales Engineer at One Identity North America — 11 years in the industry

The Problem

Auditors couldn't see individual user risk

Safeguard is critical to maintaining privileged access security for enterprise customers. Competitors gave administrators and auditors visibility into individual users — we didn't, which hurt the product's competitiveness and potential sales.

Problem Statement: When auditors want to see a user's risk or activity, they have to spend a long time running multiple searches, making it difficult to find high-risk users with excessive privileges. This increases the chance that potential security incidents will be missed.

Design principles

  • Highlight abnormalities quickly — surface risk without requiring manual investigation.
  • Facilitate a flexible investigation — let auditors work across multiple users and time periods simultaneously.
  • Future-proof the design — accommodate features and data sources that don't exist yet.

Key Decisions

Three decisions that shaped the design

1. Integrating into the existing page, not building a new one

PMs and engineers first recommended creating a new page. I chose instead to explore integrating the user view into our existing sessions page, since moving between sessions, users, and a specific user's sessions would be more seamless on one page. This proved to be the right call — it enabled component reuse and a more cohesive investigation workflow.

This decision was informed by usability studies of the existing sessions page: 75% of participants did not find the page at all, and 25% thought it was a list of searches. The page clearly needed redesigning — building a separate new page would have left this broken experience in place.

Analysis of the existing sessions page — usability studies revealed major findability issues
Exploring how to integrate the user view into the existing sessions page

2. Tabbed navigation for seamless investigation

I added tabs rather than a toggle or a new page. Opening a user adds a new tab, allowing auditors to investigate multiple users simultaneously, save investigation progress, and return to Sessions and Users pages. This fulfilled our design principle of enabling flexible investigation.

I also decided to apply the date/time picker only to the currently viewed tab — so changing the date range while investigating User X wouldn't lose your progress on User Y. The date picker was moved inside the page, with motion design used to highlight changes between tabs.

3. Removing features we couldn't justify

PMs requested global user information cards. After exploration, I decided we didn't have enough information to determine what they should be. Rather than guessing, I removed them and recorded it for future user testing. Saying “no” to features without evidence was a key part of maintaining design quality.

Process

Design thinking in a 30-engineer team

I created a design process diagram to help integrate design thinking into the large engineering team. The process moved through Understand, Observe, Define, Ideate, Prototype, and Test phases — with engineers involved from the ideation stage onwards.

W+H Questions & Workshops

I ran workshops with the Product Manager, sales, and support to capture the problem using W+H questions (Who, What, Where, When, Why, How). This structured approach revealed uncertainties and starting points that would have been missed in ad-hoc discussions.

Jobs to Be Done with MoSCoW Voting

By meeting with PMs, Sales, and Support in a workshop, I captured customer tasks in a structured way that revealed hidden requirements. I added MoSCoW voting to understand which features might struggle to get stakeholder support internally — not just what users needed, but what the organisation would actually fund.

Jobs to Be Done with MoSCoW voting from cross-functional workshops

Red Routes

I organised a workshop to prioritise Jobs to Be Done into a Red Routes diagram to identify the hierarchy. Engineers were involved at this stage to flag technical limitations — this was vital as it identified several tasks that were impossible due to technical debt, time, and resource constraints.

Red Routes diagram identifying task priorities with engineering input

Design Challenges

Solving complex interaction problems

Design Challenge #1

How do we switch between filtering and searching on all screen sizes with a large search field — without shifting content?

After investigating usage data, I discovered filter mode was rarely used. I moved the filter option to an icon above the table, creating a clean experience that solved small-screen issues while maintaining a large search field.

Exploring search/filter behaviour across all screen sizes

Design Challenge #2

How can we highlight user activity and risk, allowing users to quickly see increases and decreases?

  • Activity heatmap with clear labels for active status and last active time.
  • Risk graph to quickly identify trends, categorised into three risk levels.
  • Recent sessions of interest for a quick glance at the last risky sessions.

Design Challenge #3

How can we use tabbed navigation to seamlessly navigate between Sessions, Users, and a specific User?

Opening a user creates a new tab, allowing multiple simultaneous investigations. However, this created a tab overflow problem. I used motion design and moved the date picker inside the page to free up tab space.

Motion design concept for managing tab overflow

Design Challenge #4

How can we enable users to easily zoom into a time period using the visual graphs?

I added a hover state across multiple charts simultaneously, with click-and-drag to update the page's date range. A notification was added to prevent accidental changes and educate users about the feature.

Simultaneous hover state and click-drag time zoom interaction

Design–Engineering Collaboration

Shaping the scoring algorithm through design

Aggregating session-based baselines into an accurate user score was problematic — consistent risky behaviour was often averaged out, failing to trigger high scores. I worked with engineers to define the scoring approach, which directly shaped how the UI would communicate risk:

  • Weighted score aggregation — comparing users against their peers, with more weight given to recent and frequent sessions.
  • Trend analysis — detecting gradual increases in risky behaviour that individual sessions would miss.
  • Anomaly detection — using machine learning to identify unusual patterns across multiple sessions.

These technical decisions directly shaped the design — the risk graph, activity heatmap, and baseline statistics all needed to communicate the output of these algorithms in a way that was immediately actionable for auditors.
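The averaging problem and the weighted fix described above can be sketched in a few lines of Python. This is an illustrative model only: the function names, the exponential-decay half-life, and the three risk-level thresholds are assumptions made for the sketch, not Safeguard's actual scoring algorithm.

```python
from datetime import datetime, timedelta

def user_risk_score(sessions, now, half_life_days=7.0):
    """Aggregate per-session risk scores (0-100) into one user score.

    Recent sessions are weighted more heavily via exponential decay, so
    sustained recent risky behaviour is not averaged away by old,
    low-risk sessions. All constants here are illustrative.
    """
    if not sessions:
        return 0.0
    weighted_sum = 0.0
    weight_total = 0.0
    for started_at, risk in sessions:  # (session start time, risk score)
        age_days = (now - started_at).total_seconds() / 86400
        weight = 0.5 ** (age_days / half_life_days)  # halves every 7 days
        weighted_sum += weight * risk
        weight_total += weight
    return weighted_sum / weight_total

def risk_level(score):
    """Bucket a 0-100 score into three levels, as on the risk graph."""
    if score >= 70:
        return "high"
    if score >= 40:
        return "medium"
    return "low"

now = datetime(2024, 1, 15)
sessions = [
    (now - timedelta(days=30), 20),  # old, low-risk session
    (now - timedelta(days=2), 85),   # recent, high-risk session
    (now - timedelta(days=1), 80),   # recent, high-risk session
]
score = user_risk_score(sessions, now)
```

In this toy example a plain average of the three sessions (about 62) would sit in the medium band, while recency weighting pushes the score above 70 into the high band — exactly the "consistent risky behaviour averaged out" failure the weighted aggregation was meant to fix.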

Testing & Iteration

Refining through expert review

With Safeguard being an on-premises product, defining success quantitatively was challenging. We relied on subject matter expert reviews, user research before and after release, and measuring changes in sales trends.

The feature was tested internally by Subject Matter Experts in sales, pre-sales, and support. Based on their feedback:

  • Removed the “session time probability chart” due to algorithm limitations.
  • Added “Baseline statistics” based on Sales and Support's feedback.
  • Created variations of the activity component to handle all possible date ranges within back-end limitations.

Before: Initial activity component
After: Notification added, refined layout
The refined final design with baseline statistics

The Outcome

The biggest product update in years

User View shipped and was tested with real customers. It was described as the most significant feature update the product had seen in years, receiving widespread praise from customers, sales, and pre-sales teams. Richard Hosgood's testimonial at the top of this page reflects the internal response — a feature that went “way beyond” what the team expected.

Beyond the feature itself, the project established the design process for a 30-engineer team — W+H workshops, Jobs to Be Done with MoSCoW voting, Red Routes, and design challenge documentation became the standard approach for subsequent features.
