5 Tips for Designing Early Careers Assessment Activities


In January, I introduced four questions you can use to review whether your early careers assessment process is working – but what steps should you take to ensure you can answer these questions positively?

Here are five top tips…

Tip 1: Ensure a Robust Design Process

We have boiled down the best practice guidance from various professional bodies to create our eight-stage design cycle. If you are missing any of these stages, you increase your chances of seeing problems in your candidate pipeline:

  1. Scoping: Before you start designing, confirm the objectives, the exact scope of work, and the criteria against which you will define success.
  2. Research: Spend significant time with a diverse range of stakeholders to really understand the roles you are assessing and the organisation you are recruiting people into. This will include business unit leaders, local HRBPs, hiring managers, and previous incumbents in the role.
  3. Storyboarding and concept sign-off: Once the research is complete, play back the findings from the job analysis and share potential creative concepts with a group of stakeholders (ideally the ones who contributed to the research phase).
  4. Assessment design: Build several review points into the initial design process so that a range of people are kept up to date with progress, can see how the agreed concepts are coming to life, and can shape the design as you go.
  5. Validation trial and feedback: Once you are confident in the initial assessment design, trial all materials with subject matter experts and previous role incumbents to check that good scores on the assessment predict potential and performance. Again, invite a range of participants to play the roles of candidates and assessors; they can then feed into any improvements before the materials are finalised.
  6. Finalising design: Finalise all technology builds at this stage, giving a range of people access to trial the assessment experience. This user acceptance testing will help you ensure compatibility across devices, accessibility software, and firewalls.
  7. Training and roll out: Once ready to roll the assessments out, agree review and decision points to take stock of whether the assessment process is meeting your success criteria. This will ensure candidates are progressed through the pipeline fairly, in a way that meets your objectives.
  8. Continual improvement: Use the assessment data and feedback from your stakeholders and candidates to continually improve the process and ensure it is meeting the objectives set out in the “scoping” phase. Update and refresh content as required.

Tip 2: Ensure Diversity of Thought in the Design Process

The benefits of diversity in organisations are well established. Diversity of thought, when designing assessment processes, is just as important. It is crucial that a range of people feed into how an assessment method is designed.

At Amberjack, we follow a ‘Do, Check, Review’ methodology to ensure a range of approaches feed into the design process. The “Do” is done by the individual with the best understanding of the role and the organisation being hired into. The “Check” is done by a peer, ensuring the design meets our best practice standards. The “Review” is then done by me, as a Chartered Occupational Psychologist.

As well as the designers, you need to ensure as much diversity as possible in the contributors to the eight-stage design cycle set out in Tip 1.

Tip 3: Have an Agreed, Objective Structure for Reviews

When we are assessing candidates, it is critical to assess them against agreed criteria, using an assessment framework. This ensures the candidates who perform the best are demonstrating the behaviours needed in the role being assessed.

The same principle can, and should, be applied to assessment design. At Amberjack, we use indicators, like those you would see in candidate assessment, to review our assessment design and ensure it is performing in the way it needs to. Here, for example, are the indicators we use when designing assessment centre activities to check that the content is fair and that we are doing all we reasonably can to avoid adverse impact on any group of candidates.
Each pairing is rated on a scale of 1 (matches the negative indicator) to 5 (matches the positive indicator):

| Negative indicators | Positive indicators |
| --- | --- |
| Language is over-complicated to the extent that the assessment has become merely an assessment of verbal reasoning by default. | Uses plain English to ensure the assessment is assessing the criteria it needs to rather than over-weighting verbal reasoning. |
| Uses colloquial phrases or jargon which may favour certain demographics. | Avoids colloquial phrases or jargon which may favour certain demographics. |
| The design and/or review process is being managed by a homogenous group. | The design and/or review process is being managed by a diverse group. |
| The exercise favours people who have had certain experiences or privileges unrelated to job performance. | The exercise measures potential rather than rewarding people who have had experiences or privileges unrelated to job performance. |
| The exercise is not compliant with the principles agreed with our neurodiversity task force. | The exercise is compliant with the principles agreed with our neurodiversity task force. |
| Fictitious materials (org charts, images of people, video content) could lead certain minority groups to feel isolated. | Fictitious materials (org charts, images of people, video content) represent a diverse workforce. |
| Assessment indicators are not compliant with the standards set out in the “AJ Assessment Team Indicator Standards” document. | Assessment indicators are compliant with the standards set out in the “AJ Assessment Team Indicator Standards” document. |
| The exercise would create issues for individuals with certain requirements (e.g. no alternative formats for video content). | The exercise does not introduce any obvious barriers that would exclude people (e.g. video content is provided in multiple formats). |
When reviewing your process, we encourage you to build a set of criteria that you can hold your assessment accountable against. Set a minimum benchmark you would be happy with, just as you do when assessing candidates, and don’t settle for an assessment, or make it live, until your process meets these minimum standards.
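To make that discipline concrete, here is a minimal sketch of what holding a design to a minimum benchmark could look like if you recorded reviewers’ ratings digitally. The indicator names, scores, and benchmark of 4 are illustrative placeholders, not Amberjack’s actual standards.

```python
# Minimal sketch: holding an assessment design accountable against review
# criteria before it goes live. Indicator names, ratings, and the benchmark
# of 4 are illustrative placeholders, not Amberjack's actual standards.

MINIMUM_BENCHMARK = 4  # lowest acceptable score on the 1-5 scale

# Reviewer ratings for one assessment centre activity
# (1 = matches the negative indicator, 5 = matches the positive indicator).
review_ratings = {
    "plain_english": 5,
    "free_of_jargon": 4,
    "diverse_review_group": 4,
    "measures_potential_not_privilege": 3,
    "neurodiversity_principles": 5,
    "representative_fictitious_materials": 4,
    "indicator_standards_compliance": 5,
    "accessible_formats": 4,
}

def ready_to_go_live(ratings: dict[str, int], benchmark: int = MINIMUM_BENCHMARK) -> bool:
    """Return True only if every criterion meets the minimum benchmark."""
    failures = {name: score for name, score in ratings.items() if score < benchmark}
    for name, score in failures.items():
        print(f"Below benchmark: {name} scored {score} (minimum {benchmark})")
    return not failures

if __name__ == "__main__":
    if ready_to_go_live(review_ratings):
        print("Design meets the minimum standards - ready for sign-off.")
    else:
        print("Design needs rework before going live.")
```

The detail matters less than the principle: every criterion has to clear the bar before the assessment is signed off or made live.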

Tip 4: Bring in External Expertise Where Needed

As with all aspects of working life, when designing assessment processes it is important to be aware of your level of competence and to acknowledge where external help is required. Whilst the individuals designing your processes may be experts in assessment design, it is unlikely they are also experts in, for example, every type of accessibility.

When designing our model of potential and method of assessing potential, we were aware that to achieve our vision (to enable a world where people are hired and progressed based on their future potential, rather than past experience or privilege) we needed to ensure our approach was compatible with a range of needs.

Here is a link to a blog written by Nancy Doyle, an Organisational Psychologist specialising in neurodiversity, talking about how we collaborated with her and colleagues to build our approach to assessment.

There are also several organisations with subject matter experts in a range of accessibility requirements who can evaluate processes and even certify that they adhere to certain accessibility standards. This is well worth exploring.

Tip 5: If You Identify Issues in Trialling, or in Live Data, Form a Hypothesis as to Why and Test It

If you have followed a robust process for assessment design, and you are still seeing issues, a deeper dive is needed into what could be going on. There are often insights that can be gleaned from the data itself and from the wealth of academic literature and industry best practice looking at assessment processes.

This deep dive will often highlight things that could be improved. Our recommendations on using data to improve assessment processes are covered in a recent webinar, which you can access here.
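As a simple illustration of the kind of insight that can be gleaned from the data itself, the sketch below compares pass rates between two candidate groups at a single assessment stage and applies the four-fifths (80%) rule of thumb as a first screen for possible adverse impact. The group names and figures are invented, and a low impact ratio is a prompt to form and test a hypothesis rather than proof of unfairness.

```python
# Illustrative sketch of one common deep-dive check: comparing pass rates
# between candidate groups at a given assessment stage and flagging possible
# adverse impact using the four-fifths (80%) rule of thumb.
# All group names and figures are invented for illustration.

applicants = {"Group A": 400, "Group B": 250}   # candidates assessed, by group
passed     = {"Group A": 120, "Group B":  55}   # candidates progressed, by group

pass_rates = {group: passed[group] / applicants[group] for group in applicants}
highest_rate = max(pass_rates.values())

for group, rate in pass_rates.items():
    impact_ratio = rate / highest_rate
    flag = "review further" if impact_ratio < 0.8 else "ok"
    print(f"{group}: pass rate {rate:.1%}, impact ratio {impact_ratio:.2f} ({flag})")

# A ratio below 0.8 does not prove the assessment is unfair, but it is a
# signal to form a hypothesis (for example, over-weighted verbal reasoning)
# and test it against the exercise content and scoring data.
```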