

https://defencedigital.blog.gov.uk/2021/03/12/the-application-of-user-research-in-defence/

The Application of User Research in Defence

Categories: Military Digitisation, User Research

In the second of a two-part series, one of the trailblazers of user research (UR) in Defence, Dr Silvia Grant, shares some of her insights from applying user research with the Armed Forces.

The previous post on User Research in Defence explored the driving principles behind the Defence Digital Service (DDS) approach to UR in Defence. This article details some of the practical opportunities and challenges for User-Centred Design (UCD) in the space of security and Defence.

Working with the Armed Forces

Defence is a little unusual, in that the majority of our digital services are provided to our own workforce as users, specifically the Armed Forces. This is to help them execute operations effectively, to work efficiently, and to remove administrative friction from their working lives. To do user-centred design effectively in Defence, it’s essential to take some time to understand the Armed Forces: who they are, where they come from, and how they work.

[Image: two male service personnel looking at a laptop screen]

Covering that in any detail is a little outside the scope of this blog post, but there are some tips we can share from experience of working with the Forces.

I can certainly attest to their qualities of leadership, teamwork, integrity, courage, dedication and selflessness. I also admire their ability to cut straight through to what’s important, and to strike the right balance between being people-oriented and getting the job done. A robust sense of humour, founded on respect, is an essential part of their kit-bag, and it should be part of yours too when, as a ‘civvy’ (civilian), you work alongside them.

They are a very diverse workforce, from recent school-leavers to experienced and highly qualified specialists. As well as the three main Services (Royal Navy, British Army, RAF), there are many different tribes, trades and cap-badges. They are dispersed geographically around the UK and beyond, and they may have very different access to IT; most do not spend all day working at a desk or in front of a computer!

The military are naturally rank-conscious and may adapt their behaviour (for example, moderating their answers to questions in a UR interview) if someone more senior is in the room.

As users, they have an inherent understanding (and typically, personal experience) of the challenges faced by those on the front line. Of course, in Defence ‘the front line’ is a very literal thing – it is not a euphemism for direct-to-public service delivery.

Service Personnel are not at all fazed by new-fangled ‘agile’ practices. The principles of iterative delivery, constant testing and adjusting, ongoing planning and always working to a fast, regular drumbeat are common to how the Forces work on operations (and always have done).

The Forces tend to ‘get’ agile quickly and instinctively. They are also great supporters of agile delivery, and of getting things done quickly and efficiently. Despite the pressure on their time, this means they also ‘get’ the value of research sessions straight away, and are generous in sharing valuable insights.

As with a lot of user research recruitment, reaching our Forces users for digital research is not always straightforward. Given the sheer number of users, their rank systems, divisions, postings, and fragmented access to official IT, we often have to work through internal networks to reach our desired audience, which can take weeks. When delivering a 6- or 8-week discovery, this can affect outcomes.

Their positioning on the digital inclusion scale is also typically very wide, stretching from 4 to 9+ on the spectrum, which makes researching and testing design solutions even more important. More work is needed to understand the digital inclusion needs of the Armed Forces, and to build a consolidated view of what these might look like.

Similarly, accessibility is an under-researched topic within Defence, making it a priority for engagement, as the accessibility needs of the Armed Forces are unique to their battlespace scenarios. Work is under way to understand and map these needs on individual projects, and learning will be shared accordingly, in line with the military ‘lessons identified’ approach.

Making the Service Manual work for Defence 

The introduction of the Government Digital Service’s Service Manual heralded a revolution in user-led practice across government digital services. Its revised 14-point Service Standard is the cornerstone of external-facing, and increasingly internal-facing, government systems, platforms and services.

The Navy Digital Service (NDS) has recently kickstarted service assessments in the Royal Navy, embedding the Service Standard into the culture of digital delivery teams in the RN and adopting a learn-by-doing approach. This approach also accommodates some of the unique user needs evident in Defence, which sometimes require amended or tailored digital assurance measures. Particularly in the battlespace, it is essential to design systems which strike the right balance between intuitiveness and operational complexity, and this sometimes requires devising a bespoke framework of digital assessment criteria on top of the existing Government Digital Service guidelines.

The National Health Service has approached its digital services in a similar way, adding 3 extra points to the existing 14 points of the GOV.UK Service Standard in its tailored NHS service standard. Accessibility is an area of particular focus for us, as we tackle military-specific accessibility challenges such as using systems on board ships and aircraft. We will share progress in this space as internal consultations help us refine and articulate the points most relevant to Defence.

Working at SECRET

Due to the nature of security in Defence, some of the work our users perform is classified as SECRET or above. What does this mean for user research? Two things. First, testing SECRET platforms, services and tools means that researchers work to stringent security standards, which rules out some traditional UR methods, such as unmoderated usability testing, remote interviewing, screen sharing, and any information sharing that does not take place via a SECRET terminal. Second, users face challenges in reconciling their SECRET work, IT, tools and commitments with their regular OFFICIAL work. This represents an opportunity to explore a degree of synchronisation for our users.

In effect, rather than being hampered by work processes at different security classifications, user research can be, and is, deployed to tackle some of the internal challenges of working at SECRET, with an impact on wider security recommendations.

Get in touch

If you are interested in working in the user research field in Defence, please get in touch, and we’ll let you know when we advertise new roles. You can also join the conversation on Twitter and follow DDS at @Digital4Defence.


1 comment

  1. Comment by Ella Botting posted on

    Incredibly well written, Silvia, and certainly reflects my own experiences conducting user research within the Royal Navy. It's interesting that our users face the same issues when trying to synchronise work across OFFICIAL and SECRET as we do when trying to conduct research. Looking forward to continuing to work with you.
