The Digital Lighthouse: the role of inclusive design in government services (and beyond)

User research is the linchpin of the core services we rely on in every part of our lives.

A fundamental part of this work is inclusive design (also known as universal design): ensuring all products and services work for everybody, regardless of age, ability or circumstance.

In the latest episode of The Digital Lighthouse, Softwire’s Callum Bates, a Research Principal, delves into all the moving pieces involved in creating exceptional experiences for everyone.

Callum talks about our work in government projects to expand on:

  • The UK government’s world-leading Service Standard for measuring inclusivity
  • When to do “quick and dirty” research vs. more comprehensive research
  • Leveraging all areas of a service to avoid redesigning from scratch
  • The role of AI in the user-centered design (UCD) process

Check out the illuminating discussion below:

About our guest

Callum Bates

Research Principal, Softwire

Callum Bates is a Research Principal at Softwire. After completing degrees in business management and then service design innovation, Callum briefly worked in marketing, before deciding to follow his passion for understanding human behaviour and helping people solve problems.

Join the podcast!

A podcast for CTOs and other tech leaders looking to navigate their organisation through endless disruption and turbulence. Packed full of real-life stories and insights, this podcast serves as your guiding light – wherever you are in your journey and whatever challenges you face.

We’re always looking for new and interesting stories. Got something you want the world to know? Want to join a thriving community of tech experts and aficionados? Simply drop our host, Zoe Cunningham, a line and let’s get your story out there.

Transcript

[Zoe Cunningham]: Hello, and welcome to The Digital Lighthouse. I’m Zoe Cunningham. On The Digital Lighthouse, we get inspiration from tech leaders to help us shine a light through turbulent times.

We believe that if you have a lighthouse, you can harness the power of the storm. Today, I am delighted to welcome Callum Bates, who is a research principal here at Softwire. Callum, could I ask you to introduce yourself and tell us a little bit about what you do at Softwire?

[Callum Bates]: Yeah, sure. I am a user researcher by training, and I specialize in user-centered design, or design thinking, particularly working on government projects. I support our government teams to ensure that we’re delivering against the Service Standard and, ultimately, making sure the services we deliver meet the needs of users.

[Zoe]: Fantastic. Can you explain a little bit more about what it means to be a user researcher?

[Callum]: Yeah, so essentially, the work we do is driven by making sure that the services we deliver align with user needs. Ultimately, if a user has a challenge, problem, or issue, we want to know about it so we can fix it and reduce the number of pain points users may experience while using the service, whether that’s using a train app or registering to vote. We’ve worked with a wide range of services, applying user research to ensure we identify and refine these needs.

[Zoe]: Cool. So there are a lot of specialist roles nowadays within tech and development and design teams. I suppose one way of putting it is that you’re the interface to the users, ensuring that products aren’t designed in a silo by engineers. You’re there to make sure the people who will use the product are part of the design process.

[Callum]: Yeah, absolutely. We talk about it in three terms: desirability, viability, and feasibility. As you mentioned, desirability is very much about understanding user needs, engaging with users, and going out and speaking to people through moderated research sessions to unpack and understand those needs.

Then, from a usability standpoint, we go through trialing and testing a service alongside users. Viability is about understanding the business drivers, which some of my colleagues handle. Lastly, feasibility is about what’s technically possible. So, we’re part of a three-pronged approach, if you like.

[Zoe]: Fantastic. Okay, with that as our understanding of the basics, what does it mean for design to be inclusive?

[Callum]: Yeah, as I mentioned, I specialize in government projects, and often when we’re delivering services, one of the key things is ensuring people can use them. Inclusive design, also called universal design, is essentially about making a service usable by everyone, regardless of their age, ability, or circumstances. The general approach is to make sure users can do what they need to. For example, registering to vote is a fundamental part of our democracy, and ensuring people can use both the online and offline service for this is crucial.

Booking train tickets and navigating through the country—these things need to be accessible and inclusive so people can take part in society. The reality of delivering this on a specific project is variable, especially given the diversity of service users. For example, in the voter authority certificate space, enabling people to apply for voter ID involves understanding how changes might disproportionately affect certain groups. We needed to engage with older people, people who are homeless, and others from specific communities, like the Roma community, to ensure we design services that meet these varied needs.

[Zoe]: Every time I talk about inclusivity, I think, “Oh my gosh, yeah, of course—I wouldn’t have thought of that.” You mentioned it’s also known as universal design. For government services, it’s crucial that they’re usable by everyone, not just a majority. We often think of the elderly as lacking digital access, but in a country with a diverse population, it’s reassuring to hear so much thought goes into making services work for everyone. So, what does inclusive design mean for working on government tech projects? How is it measured?

[Callum]: Most government tech projects are assessed based on something called the Service Standard. This is widely used across government, and it’s world-leading—many other governments now use a version of this. It includes 14 points that cover aspects like efficacy and quality of service output, with elements focused on inclusive design. A few key points include understanding users and their needs, solving a whole problem for users, and ensuring everyone can use the service.

We’re assessed at different points in the agile process, where a peer review team, often from another department, reviews our progress. They assess whether we’re ready to move to the next project phase. This peer review stretches us, offering opportunities to improve the service. The assessment outcome is either “met” or “unmet,” which dictates if we can proceed to the next stage, like from alpha to beta.

Inclusive design assessment also involves the Web Content Accessibility Guidelines (WCAG), where an independent auditor reviews the service using accessibility tools and screen readers. This appraisal shows how effective our inclusive design is. Peer reviews cover various things, including methodology and usability testing, and they can suggest further improvements. Although the Service Standard is specific to government, we at Softwire try to apply these principles across all projects, whether it’s LNER or integrated wheelchair reservations. It’s simply an effective approach to user-centered design.

[Zoe]: There’s an opportunity to use the expertise and the work that’s gone into creating those standards. The peer review process sounds great—having a system where you’re constantly reviewing against these standards is obviously a fantastic part of the process. But a lot of work went into deciding what should and shouldn’t be in those standards.

[Callum]: Yeah, absolutely. They’ve been honed and refined over time. We’re probably on the fifth iteration of these standards, really getting to the core of what makes a great, inclusive service for citizens.

[Zoe]: We’re talking about one set of standards here. Do they vary depending on the type of government service you’re developing? Are there some types where it’s more important, or is it a single standard for everyone?

[Callum]: So, it is one single standard, but I’d say emphasis will be placed on certain areas more than others in particular instances. So, for example, especially if we’re looking at the delivery of voting-related services, it’s absolutely critical that we are engaging with a diverse sample of users from across those spectrums. Understanding users and their needs—there will be a lot of emphasis placed on that. Whereas, potentially in other instances, especially when looking at more internally focused services, the emphasis might be on a whole range of other requirements, including making sure that the code is open-source, for example, so that it can be replicated and used across government as a means of increasing efficiencies.

So, yes, I’d say broadly each service is assessed on these principles, but potentially, emphasis might be placed on one more than others, and potentially at different phases as well.

[Zoe]: Because it must make a difference when there are other avenues. My background is as a technologist, so I’m used to thinking about the website or the piece of tech as the service. But actually, the service is broader than that, and inclusive design is about there being a way for people to access a service, which might mean that it’s not always appropriate to use the digital service. There could also be a telephone service, a paper service, or some other way to access it. So, that must make a difference to how you apply the standards. And also, I suppose, what are the consequences and redress? It seems that if someone is unable to access a train, the company could provide redress or compensation. But if someone is unable to vote, that’s a bit more serious.

[Callum]: Yeah, absolutely. Part of that is the emphasis we place on user research. In some instances, it could be okay to do kind of “quick and dirty” user research, in the sense of doing a couple of rounds of iteration to get us to a certain level of understanding. Whereas, in other instances, as you mentioned, we want to make sure that we’re covering all our bases. For example, if a user, for one reason or another, can’t use the citizen-facing service to register to vote, there are other avenues available.

What we’ll aim to do is work with the providers of those other avenues, like local authorities and their contact centers, to understand the usual process that a user would go through if they can’t access the digital service. They may access the phone details for their local authority, for example. So, we’re trying to make sure that process is as seamless and intuitive as possible to maximize the number of people who are able to, in one way or another, do what they need to—in this instance, register to vote.

We’ll do research both with citizens and, on the other side, with experts who engage with people daily. We aim to understand where other pain points might exist, ones we wouldn’t uncover if we only spoke to citizens. So, for instance, we’ll engage with people in contact centers, understand their processes, and review their internal systems to solve this as a whole problem rather than looking at one thing in isolation, like the online service. This is crucial, especially for services that millions of people are going to use, because there are so many different edge cases to cover.

[Zoe]: Yeah, absolutely. The two important points here are addressing it as a whole service and also making sure you make use of work that’s been done elsewhere regarding how best to support people—what works and what doesn’t—and building that in, rather than redesigning everything from scratch.

[Callum]: Yes, and part of our foundation at Softwire is to build out and leverage that understanding. So, for example, when we look at how we’ve done usability or accessibility testing for LNER to understand and address pain points, we ask: how can we leverage that knowledge and insight into what will work in that context and apply it elsewhere? This way, we can build on collective learning and ensure that, for clients, we’re delivering real impact and able to deliver at pace.

[Zoe]: Interesting! What we’re discussing here is a very thorough and comprehensive form of design that ensures a service is suitable for everyone. It sounds amazing, and you’d surely want to apply it in all circumstances. Is that correct, or are there situations where this approach might not be appropriate for one reason or another?

[Callum]: Yes, that’s the gold standard. As I mentioned, through those service assessments, we are scrutinized to the nth degree on larger mass-scale projects. These projects are, in some respects, slow burns; they take months to develop, and we go through multiple rounds of iteration, engaging with thousands of users. There might even be trials, for example.

However, there are instances where we need to work differently to deliver more quickly. The world seems to be moving faster, and for us as practitioners, the question is how to move quickly enough within this existing paradigm. COVID is a great example. Having worked within the civil service during that time, I saw that government processes weren’t particularly nimble in handling an extraordinary reset of conditions. So, part of our work at Softwire is exploring other ways of delivering to meet briefs faster. Sometimes, a service is better than no service, especially in instances of societal challenge.

The other side of this is that the world is getting more complex. Over the past 10 years, we’ve become very good at delivering transactional services—like providing data to receive a passport. But moving forward, we’re tackling more complex problems, and that requires a broader suite of tools and skills to deliver success on systemic levels. These problems come with additional constraints and challenges. For example, working across multiple government departments, each with different requirements, policies, and processes. We face unknown unknowns in some instances, and we need to pivot as needed. Negotiating these constraints takes time and requires discussions with many stakeholders to figure out the most effective route forward.

And finally, one other challenge is when we work with a policy team who has set up the broad framework within which a service will operate. Policy is government-mandated, and it’s up to us to deliver a service that meets that mandate. In some cases, many service requirements, especially around inclusivity, haven’t been thought through fully, so as we try to deliver the service, we encounter challenges or constraints within the policy.

So, a classic example of this would be content design. A user trying to understand a piece of legislation is going to find it very difficult, so it’s the content design capability’s role to make sure that’s outlined in plain English. But in some instances, the legal framework and the law require that it be stated in a certain way, which makes it difficult for users to complete what they need to as they go through a service.

[Zoe]: Such a good point. Essentially, the way that politics works within this, or the way government works in this country, is that, like you say, there’s a mandate for a certain thing to happen. We’ve been assuming and discussing this as though it’s neutral, and we then just, you know, apply fantastic service design to make it accessible to everyone. But actually, if there’s an issue in that from the start—and I think the complexity point is so important, right?—then I think there are three things you’ve highlighted that are happening all the time. Our standards are constantly rising as technology can deliver more; we expect it to deliver more, and we expect it to meet higher standards, such as being universal and delivering for everyone.

But we also expect things to be quicker, even as complexity is increasing almost exponentially. The amount of work needed to deal with that is also increasing. So, I think that balancing those things and considering that—on one hand, I think this is great, and we should do it for everything, but then you remember the agile maxim: if it takes too long to build something, it doesn’t matter how thorough you are because by the time you’ve built it, the world has moved on, and it’s out of date. So, there’s a real challenge in balancing all of that together.

[Callum]: Yeah, absolutely. One thing that’s really important, pulling it back to users, is that a user doesn’t care about those constraints. A user just wants to get from point A to point B; they don’t understand, quite often, all the work that goes into trying to resolve this and make it as intuitive and seamless as possible. They just see, “I can’t answer this question because I don’t understand what you’re asking me.” That is our role, ultimately, as user researchers in this instance: to bring that knowledge, that know-how, and that insight in.

I think, yeah, the way that you described it was right. You know, we’re building on this foundation of policy, and quite often, the design of the service is viewed as just an add-on we need to do at the end. Whereas, in an ideal world, we should be bringing that knowledge and know-how into the process early on so that we can design policy along the lines of the Service Standard and make sure we co-design the policy and the actual design of the service at the same time.

[Zoe]: Yeah, that’s an exciting way to think about the future, actually, because we forget how new so many of these areas are compared to the mechanisms of government, which have been around in this country for quite a long time. Hopefully, going forward, we will see more integration of that kind. So, just to finish, looking forward again, what do you think are the kinds of changes we can make in how we implement this to help speed it up, like you’ve just said, and also maybe encourage innovation?

[Callum]: I think the first one is looking to things like artificial intelligence. It’s been discussed a lot, and rightly so. I think that is a real opportunity area, but it also comes with a number of different caveats. So, we’ll likely be hesitant to apply some of this stuff, and that makes absolute sense because we need to make sure that we’re applying it in the right way and that there are effective frameworks and all the rest in place.

But I think there are a whole range of different opportunities for something like artificial intelligence to have an impact in terms of delivering more innovation, better outcomes, and at a quicker pace. Putting the framework and foundations in is really important.

An example of that would be the UCD process, or the user-centered design process. Where are there opportunities to speed up the parts of that process that are quite burdensome and heavy? This can include everything from the recruitment of participants to the processing of data to make sense of it as we go. We’ve collected a bunch of user research: how do we make sense of it and use it to deliver actionable outcomes?

All of that, I think, is ripe for this new technology, and for technologies to come, to speed up that process. We need to make sure we understand where the human sits within this, both as users and as the professionals and practitioners who will use it. So, it’s not necessarily about replacing user researchers or designers, for example, but about building on that to see how we add value as practitioners and supplement this new, growing technology.

More broadly, there’s a discussion happening around how we build on this agile process we have within government currently: discovery, alpha, beta, and live. Can we combine project phases, for example? We see a lot of combined discoveries and alphas or combined alphas and betas. How do we move beyond this paradigm of distinct projects into embedding other approaches or methods?

Ten years ago, it was all about lean startup, for example, or Google design sprints. Where are the next processes and tools we can use to deliver in a different way, one that carries with it all the benefits of the Service Standard: understanding user needs, and so on?

It’s not that we want to throw the baby out with the bathwater, but rather to continue iterating, building on this, and working with our government partners to do so. Part of this is identifying which government departments have reached a level of maturity, in terms of user-centered design and digital capability, where they can start doing things slightly differently for the benefit of both them, in terms of reduced costs, and users, in terms of better outcomes and outputs.

[Zoe]: Yeah, such a fascinating discussion! I feel I could talk about this all day. But thank you so much, Callum, for coming on and sharing your insights and expertise with us.

[Callum]: Thank you very much. It’s been great to be here.