Improving the information architecture of the Smart Pension member app

In this article we explain how we used tree testing to measure and improve the effectiveness of our information architecture

By Joe Russell

26/8/2022

Information architecture involves making content and features easier for users to find. It has nothing to do with how a design looks aesthetically - it’s all about language, labels and taxonomies. 

In recent years the practice of information architecture has fallen out of fashion, which is a shame as you can’t design something successfully without it.  If a user can’t find a feature, it’s game over - the feature may as well not exist as far as they’re concerned. 

We were recently invited to help out with the redesign of our platform’s scheme member web app.  It was straightforward enough to come up with a new information architecture - but how could we be sure that the new design was better than the old one?

Tree testing as a research method

Tree testing is a type of quantitative research where you invite users to carry out tasks using your navigational system. You record when they complete the tasks successfully or not, and then do some data analysis to work out if there are any problems in your navigation.
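To make the mechanics concrete, here's a minimal sketch in Python of the data a tree test boils down to: a navigation tree, a task with a correct destination, and a simple success check per participant. The tree, labels and paths below are hypothetical examples, not our actual navigation or Treejack's data model.

```python
# A minimal sketch of the data behind a tree test. The tree, task and
# participant paths below are hypothetical, not our real navigation.

navigation_tree = {
    "Home": {
        "Account": {"Personal details": {}, "Change password": {}},
        "Contributions": {"Payment history": {}, "Change contributions": {}},
        "Menu": {"Letters": {}, "Help": {}},
    }
}

task = {
    "prompt": "You want to change your password. Find the place to do this.",
    "correct_destinations": [("Home", "Account", "Change password")],
}

def is_success(chosen_path, task):
    """A participant succeeds if the path they finish on is a correct destination."""
    return tuple(chosen_path) in set(task["correct_destinations"])

# Two hypothetical participants: one finds it, one gives up in the wrong place.
print(is_success(["Home", "Account", "Change password"], task))  # True
print(is_success(["Home", "Menu", "Help"], task))                # False
```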

Like all research methods, tree testing has some limitations. It abstracts away your app or website into just a navigation tree. In that way it’s very artificial, but it also, usefully, forces you to focus on one thing - the information architecture.

We used a tool called Treejack, which made the tree testing pretty easy.

A screenshot of the Treejack website showing a task for the user to complete.

Treejack provides a recruitment service where they source participants from a provider called Cint. It's much cheaper than conventional lab user research, which can cost about $100-200 per person all-in; with Treejack, the cost for us was just $10 per person. We recruited 200 participants: 100 tested the old design and 100 tested the new design. We got the results within about 4-5 hours, which was impressively fast.

One of the downsides of recruiting members of the public to do paid tasks online is that some people will fill in any old nonsense just to get to the end, which means you get "noise" in the data. To mitigate this, we excluded anyone who abandoned the tasks part-way through, and anyone who took less than 90 seconds to complete them all.
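As a rough illustration of that filtering step, here's a minimal sketch in pandas. The column names are assumptions about what an export might look like, not Treejack's actual schema.

```python
import pandas as pd

# Hypothetical export: one row per participant, with completion and timing info.
# These column names are assumptions, not Treejack's actual schema.
participants = pd.DataFrame({
    "participant_id": [1, 2, 3, 4],
    "completed_all_tasks": [True, True, False, True],
    "total_time_seconds": [240, 75, 310, 180],
})

# Exclude anyone who abandoned part-way through, and anyone who rushed
# (less than 90 seconds to do all the tasks).
clean = participants[
    participants["completed_all_tasks"]
    & (participants["total_time_seconds"] >= 90)
]

print(f"Kept {len(clean)} of {len(participants)} participants")
```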

Our findings

If you’re interested in doing some tree testing yourself, you might like to take a look at our raw data. Here are the raw findings from study 1 (using the old navigation) and from study 2 (using the new navigation). In each study, 100 participants were given 13 different tasks (like “You want to change your password. Find the place to do this.”). When we looked at the average task success rate, we were really happy with what we found.

Average success rates: Old design 44%. New design 75%.

After an initial round of self-congratulatory high fives, we realised that although the averages looked good, there was more under the surface that we needed to pay attention to. Some of the tasks didn’t show any improvement from old to new. Here are some of the most interesting findings.
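Breaking the results down per task is straightforward once you have one row per participant per task. Here's a minimal sketch in pandas, again with made-up column names and values rather than our real export.

```python
import pandas as pd

# Hypothetical tidy results: one row per participant per task, per study.
# Task names and values here are made up for illustration.
results = pd.DataFrame({
    "study":   ["old", "old", "new", "new", "old", "new"],
    "task":    ["Messages", "Messages", "Messages", "Messages", "Opt out", "Opt out"],
    "success": [True, False, True, True, False, True],
})

# Success rate for each task in each study, side by side (as percentages).
per_task = (
    results.groupby(["task", "study"])["success"]
    .mean()
    .unstack("study")
    .mul(100)
    .round(1)
)
print(per_task)

# The overall averages can hide tasks that didn't improve, so check each row.
print(per_task.mean())
```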

Finding 1: Burger menus are effective… at hiding things

While now a commonly used and recognised design pattern, burger menus are usually a compromise – somewhere to put navigation that doesn’t fit. Designers know this, but this particular test showed they can also have a really negative impact on findability. This gets worse when you have multiple menus with ambiguous contents. We call this the “lucky dip” problem. Each menu is like a bucket with mystery contents. You don’t want to force users to search through every bucket to find what they’re looking for. It’s time-consuming, and they might find it so frustrating that they just give up.


You can see this principle at work in the video above. Users were given the task “You want to find out if you have any messages from your pension provider. Find the place to do this”. In the old navigation, the user had to open an overflow menu (called “Menu”) in order to find the item that contained their messages. To make it even harder, that item was called “Letters” - an old-fashioned term from UK pension regulations that’s rarely used in apps or websites. As a result, only 42% of participants found it. In the new navigation structure we sensibly decided not to have an overflow menu, and instead we just listed all the features on the home page. This meant that users only had to glance down the list of items to see “Inbox: unread messages (2)”. This made it easy to see, and the new label made it easy to understand - so this navigation structure got a 90% success rate.

Finding 2: Live excerpts of dynamic content can really help

In the new navigation, we added little excerpts of real data from the user’s account to clarify the labels. For example, retirement age is a very short string of characters (“65 years old”), so it made sense for us to just put it in the navigation label rather than forcing users to click something to find out. In some cases, this extra snippet of information really seemed to help with findability. For example, we gave participants the task “You want to find out what percentage of your salary you currently put into your pension. Find the place to do this”. You can see the two navigation designs in the video below. Participants struggled with the old navigation, with only 10% of them getting it right. Conversely, in the new navigation, 80% got it right.

Finding 3: Sometimes “technically correct” is the worst kind of correct

The pension industry has lots of archaic terminology from laws and regulations. For example, in the old navigation, we had an item labelled “Manage membership” which allowed people to take a break from putting money into their pensions. The phrase “manage membership” is a bit strange, but it’s technically correct in terms of UK pension terminology. 

In the new design we got rid of it and changed it to “Stop paying in: cease membership of your pension scheme”. We hoped that “Stop paying in” was good plain English. It turns out we were only partly right. 

We gave participants the task “Imagine you have debt problems and you want to have a break from paying into your pension for a while. Find the place to do this.” You can see the two navigation designs in the video below. It turned out that the task success rate for the old design was 4%, while it was 60% for the new design.


This was puzzling - the new design was a clear winner, but at 60% the score still wasn’t that good. Luckily, another piece of research gave us a useful insight into the likely reason why. The search traffic on our help centre website shows that a lot of people search for the term “opt out”, while hardly anyone searches for “cease membership”. In our test, we think people might have been looking for “opt out” and not finding it, even though technically speaking it’s not the right term. In official pension terminology, “opt out” is something you can only do in the first 3 months of joining. After that you can’t “opt out”, but you can “cease membership”, which is similar but not quite the same. Confusing, isn’t it?

So in our next test, we’re going to change the label to “Stop paying in: opt out or cease membership”. At least this way, if people are looking for “opt out”, they’ll be able to see it. Then, on the next page we can explain their options more clearly.

Conclusions

Before we did the tree test research, we were confident that the information architecture in the new design was better than the old design. When the data came in, we realised that while it was better overall, there were still some areas that needed more work. It was incredibly useful to have quantitative data from 200 people to show us where to focus our efforts.  

A card in the app, listing the funds the user is invested in.

As we analysed the data, we started to notice the shortcomings of tree testing. In our new design, the user interface has various things that are intended to help users with findability: icons, cards, explanatory text, help, a chatbot and so on. Tree testing doesn’t acknowledge the existence of any of that - so it isn’t a replacement for other research methods like qualitative user research or analytics. That said, it was amazing in the way it gave us quantitative findability data so quickly and easily.

We’ve decided that tree testing deserves a place in our research toolbox. Having read this article, maybe you’ll feel the same.


CONTRIBUTORS

Written by Joe Russell

Edited by Harry Brignull and Max Roche

With thanks to the Smart Innovation Team

