

Kickstarting service assessments in the Royal Navy

For those working in digital across government, the Service Standard and assessment process will be familiar as a unifying vision of what good looks like for the design, build and running of services. The Government Digital Service (GDS) formally brought a digital standard for services into being in 2014, having published it a year earlier in beta. During this beta phase, GDS carried out over 50 informal assessments to get a broad sense of how services would fare and learn how the Service Standard works for teams in practice.

Taking an agile approach to adopting the Service Standard

The Royal Navy’s digital transformation plan includes adopting the GOV.UK Service Standard as a way of supporting consistent delivery of good outcomes from its services, with multidisciplinary teams applying agile and user-centred design methods to solve problems. The plan also sets out the intent to create a service assessment process for the Royal Navy, bringing the same level of scrutiny to its services, many of which are for Royal Navy users. Recognising that this would be a great opportunity for us to learn about current practice in digital delivery, we got started with a similarly agile approach to the one GDS took.

The team and service

We put together a small team who collectively bring a wealth of experience of working with the Service Standard, both as assessed teams and as assessors in other central government departments. That combined experience allowed us to create a lightweight process to begin assessments with, learning fast and iterating as we go. The assessments to date have been detached from formal governance, which is giving us useful practical insight into when it will be appropriate to introduce it.

More recently, we have also started thinking about our work as one of the services offered by Navy Digital Services (NDS). This service aims to:

  • help embed the Service Standard, its ethos and intent into the culture of Royal Navy digital delivery teams and the wider organisation
  • assure the work of digital delivery teams to return consistently good outcomes to the Royal Navy and its service users
  • encourage open and collaborative approaches to identifying and solving problems
  • learn from and adapt to the nuances and challenges of digital delivery in a Defence context

The assessment

To get going, we created an assessment backlog of in-flight projects in Navy Digital Services, prioritising those approaching the end of a delivery phase and working with them on assessments. Our plan for early assessment panels is to retain a core team of panellists so we can continually build on what we learn, while also rotating in experienced panellists from across government. That external influence brings unbiased objectivity both to the teams being assessed and to our new assessment process itself. At the time of writing we have completed 4 assessments, 3 of which included panellists who have worked, or currently work, at GDS and other central government departments. With more assessments queued up in the near term, we’re hoping to continue this approach, using networks such as cross-government Slack channels to find experienced panellists to take part.

The NELSON logo alongside a screenshot of the GOV.UK service assessments page

We’re supporting the full lifecycle

The GOV.UK framework assesses services at the alpha, beta and live phases. Given the importance of discovery, we think there is benefit in supporting teams to do well by the Service Standard through the full project lifecycle. We’re doing two things to support this approach:

Firstly, we recognise the importance of Service Owners, not only in facilitating the short-term activities of teams but also in realising the benefits of that work over the long term. We are establishing a close relationship with Service Owners in the Royal Navy so that they become ambassadors for the Service Standard and so that the teams they work with are guided by that cross-pollination. Navy Digital Services now has 6 military Service Owners. We’re already benefiting from really positive engagement with them and look forward to continuing that as the Service Owner community grows. We plan to increasingly involve military personnel and civil servants in the assessment process to aid knowledge transfer and spread digital culture. One vehicle we’ll be looking to use for this is the great work going on in the NELSON Digital Academy, an initiative to help military personnel transition to working in digital.

Secondly, we’re creating a discovery support service and a discovery exit check to help teams start well and understand the true purpose and value of this sometimes misunderstood phase. These provide a sounding board throughout discovery, letting teams check ideas and queries and get independent support when assumptions about what’s to be delivered need constructive challenge. The aim is to ensure teams start an alpha phase in the best possible place, with a well-defined and well-evidenced problem statement. We have several teams engaged with us in discovery at present and it’s proving to be a productive and educational experience for all.

Learn by doing

We’re already learning a great deal about the service we provide, what works and what needs refinement. Some of that learning has come from our own analysis in sessions such as post-assessment retros; some has come from observers and the feedback we have requested from teams. We’re keen to maintain this feedback loop and iterate as we go, ensuring this service provides effective assurance to the Royal Navy in ways that can also be shared across the MOD. We will share some of the things we’ve learned from our retros and feedback in another Defence Digital post for anyone else interested in service assessments.

Even at this early stage, it’s apparent that service assessments, the Service Standard and a learn-by-doing approach have great potential to bring value to Defence. Together they provide a well-established framework for delivering services that perform well for users and, in turn, measurably return benefits to the organisation.

There’s lots ahead to explore and by operating this service we’re also learning about the unknowns, some of which include:

  • if, how and why we might involve other forms of assurance in the service assessment process, for example, Defence standards
  • how we might scale service assessments to cover the whole of the Royal Navy and join up with other parts of the MOD
  • how we might provide an effective pathway to train new assessors from the military and civil service
  • how the Royal Navy can better incorporate a service-oriented approach, giving teams the support they need to tackle problems horizontally across organisational boundaries

As we continue to operate our assessment service we’ll be exploring these issues and many more. If you’re involved in internal assessments in a government department we’d love to hear about how it works for you and what you’ve learnt along the way.

1 comment

  1. Comment by Mike Arama posted on

    As one of the services assessed, we got a huge benefit from the work Russell and James have done and will deliver a better service as a result.
