The Story: Leveraging user research to guide an impactful redesign that satisfied the needs of multiple target personas.
The Company: Evergage (now Salesforce Interaction Studio) was a cloud-based B2B personalization and customer data platform that used data and machine learning to provide cross-channel (web, mobile, email, and third-party) personalized business solutions to its clients.
The Problems: Feature creep, startup growing pains, persona evolution, and user pain.
*Note: This story will dive into the campaign list redesign. You will see effects from the information architecture restructure and navigation redesign.
Evergage was experiencing a common startup challenge: feature creep. The technology and solution were strong, and the platform was rapidly growing to meet client and market needs and remain a top competitor in the marketing tech space. The user base and number of personas grew and evolved with the platform. This combination created many user problems and pain points, which consequently threatened client retention and future sales.
Yet, the straw that broke the camel’s back was that we needed to expand our campaign offering of mobile, web, third party, and bulk email to include triggered email and full support of email campaigns. Yes, a lot was going on at this time #startuplife.
The CTO (aka Product Manager and Head of Product) with guidance from the team, decided on the general solutions to fix these problems.
Lead Designer & Researcher
Product Team (Myself, UX Director, and CTO - Product Manager and Head of Product), Engineering Team (20)
Design thinking, visual design, whiteboarding, script development, user interviews, user validation tests, project management
Axure, Google Docs, GoToMeeting
My team and I collaborated to identify the user problems and pain through feedback and user, competitive, and market research. Ask me about defining the users and problems.
We defined the technical, business, and feature requirements, and identified the target users, common use cases, and open questions. As the design lead, I created the feature doc that held this information and was responsible for its upkeep and for communicating updates to the rest of the team.
#1. Better campaign management: Users had difficulty managing their campaigns and could not do so successfully in the platform.
Solution: Improve the campaign list screen where all the campaigns lived.
#2. Expose useful data: Evergage had a ton of critical campaign data that was hidden or dispersed throughout the platform. Users needed that data in a central location.
Solution: Add data columns to the table on the campaign list screen.
#3. Expose critical campaign details: Users spent too much time developing, testing, and troubleshooting campaigns because the campaign details that were important to those tasks were hidden or in different spots.
Solution: Put the campaign details in a more efficient location.
Persona: Campaign Developers
#4. Improve the UI: The UI was inconsistent and messy (pixels were off, too much space, hierarchy issues, etc). Users and prospects needed a cleaner interface to feel more confident and comfortable using the platform.
Solution: Redesign the framework, make the styles consistent, and clean up the interface.
We needed to talk to the users. These assumptions weren't strong enough to move forward, and we needed to better understand the different users before I started designing. I advocated for user research, and given the importance of the project, I was able to get a few weeks for a user study.
My Role: Lead UX Researcher
Purpose of study: To validate the three user problems and better understand our users' campaign goals and how they interact with the campaign list screen.
Method: User Interview over GoToMeeting
Timeline: 1 week
Users: 6 (power users across industries and use cases)
I worked with the CTO and Customer Success to select users, worked with the CS team and clients to schedule the calls, determined the goals of the study and method, moderated the sessions, analyzed the findings, and added the findings to the user, team, and company profiles for future projects and insights.
The three user problems were validated. I learned a ton from the study, including these major takeaways:
1) Each of the problems was critical to a different persona, which highlighted the larger problem: this screen needed to be designed to accomplish their different goals.
2) I identified patterns and common needs for each of the personas.
3) Strategy and organization ranged from team to team. Some teams organized their campaigns based on the campaign developer, the type of promotion, the intent of the campaign, or the campaign status. The level of organization ranged from very little to a flawless system. Some teams heavily depended on the folder system while others used third party systems like Google Sheets and begged for a tagging system.
Solve the three initial user problems, but do so within the added parameters and with the ability for the screen to be customizable.
1) Better campaign management - provide different ways to organize campaigns, because needs vary across teams.
2) Expose the data - include all revenue data, conversions, and click-through rates.
3) Expose campaign details - include campaign rules and history.
4) Improve the UI
Better Campaign Management
I included both subfolders and an additional tagging system with the intent to test and see what resonated with the users. Tags were frequently requested by clients and the team had talked about adding them into the system. This was a good opportunity to test if they helped this management problem.
Customization, Expose the Data, and Expose Campaign Details
At the start of the project, the CTO had the idea that this could be solved by toggling the columns between a data view and a campaign detail view. This solution would be able to be implemented "quickly" by the engineering team, which is good because our resources and time were limited.
I reviewed the findings and even though there were three personas, this solution could be viable. The campaign developers would use the campaign detail view, the analysts would use the data view, and the managers would toggle between both.
I tried out two variations where the columns changed based on a toggle or sliding columns (outlined in red).
More information needed to be added to the columns, so to create space I moved the details panel from the right side to the bottom, and during a team discussion we decided to create collapsible folders.
4) Improve the UI
I cleaned up the columns to create as much space as possible, because more information and data needed to be exposed and the columns were the focal point. The rest of the UI improvements would happen incrementally, and then at the end once the layout was determined.
Note: This project was closely tied to another feature enhancement that involved the detail panel (covered). There are other artifacts from that feature mixed into this screen, for example the rocket ship icon.
I reviewed the designs, checked to make sure they satisfied all the user goals and solved the problems, and looked at the bigger picture of the platform. I realized that my design solution wasn't going to work.
This would suffice for the current moment, but wouldn't work long term because it wasn't scalable.
This meant that just two options to toggle the columns wouldn't be enough. This screen needed to seem like a different workspace for the three different personas with the potential for more variations.
I needed a solution that allowed for more options and gave users the ability to create new workspaces. At that point I went back to the whiteboard, researched similar UI structures, and looked for inspiration, which I found in the custom workspaces and custom boards in Jira and YouTrack.
My new solution was to allow for multiple workspaces (column changes for now). I added a dropdown to control the view, and each view would have different column settings, which would satisfy the different persona needs. The idea started out as completely customizable.
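The "views" concept can be sketched as a simple data model: each view is a named column configuration aimed at a persona, and switching views just swaps the active column set. This is an illustrative sketch only; the type names, labels, and column fields are my assumptions, not Evergage's actual data model.

```typescript
// Hypothetical column identifiers for the campaign list table.
type Column = "name" | "status" | "revenue" | "conversions" | "ctr" | "rules" | "history";

// A view pairs a dropdown label with the columns it exposes.
interface CampaignView {
  label: string;
  columns: Column[];
}

// Example built-in views, one per persona goal (names are illustrative).
const builtInViews: CampaignView[] = [
  { label: "Campaign Details", columns: ["name", "status", "rules", "history"] },
  { label: "Performance", columns: ["name", "revenue", "conversions", "ctr"] },
];

// Selecting a view from the dropdown resolves to its column set.
function columnsFor(views: CampaignView[], label: string): Column[] {
  const view = views.find((v) => v.label === label);
  return view ? view.columns : [];
}
```

Modeling views as data rather than hard-coding two toggle states is what leaves the door open to user-created workspaces later: adding a view is appending to the list, not adding UI logic.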
I believed that creating views was the best solution to let the different personas achieve their goals on this screen. I presented my idea to the CTO and UX Director and leveraged the user insights to gain buy-in and approval. The main hesitation was that this would take double the engineering time to build, but the benefits I outlined outweighed the cost and I was given approval to move forward.
I moved forward with my new solution and tackled the rest of the design goals, including the following:
This would be an impactful change and the next steps were to test the new direction with the users.
My Role: Lead UX Researcher
Purpose of study: To validate the designs. Learn how the users would interact with the new designs, identify any problem areas, and understand what worked and what didn’t.
Method: User test with high fidelity interactive mockup (Axure). I asked the participants to complete a few tasks, answer a few questions and then I left time for an open dialogue.
Timeline: 1 week
Users: 6 (same users as user interviews)
Overall, the tests went well. The two main takeaways were the following:
1) The clients were excited about the changes especially the views, search bar, and added data.
They told me how much the changes would improve their productivity.
With the positive feedback on these features, I moved forward with scoping the feature. I discussed technical restrictions with the CTO and made some changes. For the first version the views couldn't be customizable, but we decided to provide four built-in options based on the user research. We also needed to limit the statistics filters to two and remove the added details on hover.
2) There were mixed feelings about adding a tagging system. Some clients were ecstatic about the tags, but others were strongly against them, saying they would add unnecessary clutter.
I considered this feedback in the context of the rest of the project. We were already making a ton of large changes, and since people are generally reluctant to embrace big change, adding a feature we knew some users would receive negatively wasn't wise. Tags could be added in the future, after further investigation, when we had more time to design the feature to really work.
I took the feedback and created the final designs.
Major Changes from the previous designs
The engineers were closely involved in this redesign and were already building the framework as I was designing; the handoff was this final design iteration. We had weekly design meetings with the lead front-end engineer so we could address any issues or design challenges.
This was a major change, and I wanted to ensure the users were on board with the entire redesign. The change was released on a separate engineering server, and through 8 user tests I was able to determine that it was a positive change overall.