Here are five top tips…
Tip 1: Ensure a Robust Design Process
We have boiled down the best practice guidance from various professional bodies to create our eight-stage design cycle. If you are missing any of these stages, you increase your chances of seeing problems in your candidate pipeline:
- Scoping: Before you start designing, confirm the objectives, the exact scope of work, and the criteria against which you will define success.
- Research: Spend significant time with a diverse range of stakeholders to really understand the roles you are assessing and the organisation you are recruiting people into. This will include business unit leaders, local HRBPs, hiring managers, and previous incumbents in the role.
- Storyboarding and concept sign-off: Once the research is complete, play back findings from the job analysis and share potential creative concepts with a group of stakeholders (ideally those who contributed to the research phase).
- Assessment design: Build several review points into the initial design process so that a range of people are kept up to date with progress, can see how the agreed concepts are coming to life, and can shape the design as it develops.
- Validation trial and feedback: Once you are confident in the initial assessment design, trial all materials with subject matter experts and previous role incumbents to check that good scores on the assessment predict potential and performance. Again, invite a range of participants to play the roles of candidates and assessors. They can then feed into any improvements before the materials are finalised.
- Finalising design: Finalise all technology builds at this stage, giving a range of people access to trial the assessment experience. This user acceptance testing will help you ensure compatibility across devices, accessibility software, and firewalls.
- Training and roll-out: Once you are ready to roll out the assessments, agree review and decision points to take stock of whether the assessment process is meeting your success criteria. This will ensure candidates are progressed through the pipeline fairly, in a way that meets your objectives.
- Continual improvement: Use the assessment data and feedback from your stakeholders and candidates to continually improve the process and ensure it is meeting the objectives set out in the “scoping” phase. Update and refresh content as required.
Tip 2: Ensure Diversity of Thought in the Design Process
At Amberjack, we follow a “Do, Check, Review” methodology to ensure a range of approaches feed into the design process. The “Do” is done by the individual with the best understanding of the role and the organisation being hired into. The “Check” is done by a peer, who ensures the design meets our best practice standards. The “Review” is then done by me, as a Chartered Occupational Psychologist.
As well as the designers, you need to ensure as much diversity as possible in the contributors to the eight-stage design cycle set out in Tip 1.
Tip 3: Have an Agreed, Objective Structure for Reviews
The same structured, indicator-based approach used in candidate assessment can, and should, be applied to assessment design itself. At Amberjack, we use indicators, just as you would in candidate assessment, to review our assessment design and ensure it is performing the way it needs to. Here, for example, are the indicators we use in assessment centre activity design to ensure the content is fair and we are doing all we reasonably can to avoid adverse impact on any group of candidates.
| Negative indicators | 1 | 2 | 3 | 4 | 5 | Positive indicators |
|---|---|---|---|---|---|---|
| Language is over-complicated to the extent that the assessment has become merely an assessment of verbal reasoning by default. | | | | | | Uses plain English to ensure the assessment is assessing the criteria it needs to rather than over-weighting verbal reasoning. |
| Uses colloquial phrases or jargon which may favour certain demographics. | | | | | | Avoids colloquial phrases or jargon which may favour certain demographics. |
| The design and/or review process is being managed by a homogenous group. | | | | | | The design and/or review process is being managed by a diverse group. |
| The exercise favours people who have had certain experiences or privileges unrelated to job performance. | | | | | | The exercise measures potential rather than rewarding people who have had experiences or privileges unrelated to job performance. |
| The exercise is not compliant with the principles agreed with our neurodiversity task force. | | | | | | The exercise is compliant with the principles agreed with our neurodiversity task force. |
| Fictitious information (org charts, images of people, video content) could lead certain minority groups to feel isolated. | | | | | | Fictitious information (org charts, images of people, video content) represents a diverse workforce. |
| Assessment indicators are not compliant with the standards set out in the “AJ Assessment Team Indicator Standards” document. | | | | | | Assessment indicators are compliant with the standards set out in the “AJ Assessment Team Indicator Standards” document. |
| Exercise would create issues for individuals with certain requirements (e.g. no alternative formats for video content). | | | | | | Exercise does not introduce any obvious issues that would exclude people (e.g. video content required for an exercise is provided in multiple formats). |
Tip 4: Bring in External Expertise Where Needed
As with all aspects of working life, when designing assessment processes it is important to be aware of your level of competence and acknowledge where external help is required. While the individuals designing your processes may be experts in assessment design, it is unlikely they are also experts in, for example, all types of accessibility.
When designing our model of potential and method of assessing potential, we were aware that to achieve our vision (to enable a world where people are hired and progressed based on their future potential, rather than past experience or privilege) we needed to ensure our approach was compatible with a range of needs.
Here is a link to a blog written by Nancy Doyle, an Organisational Psychologist specialising in neurodiversity, talking about how we collaborated with her and colleagues to build our approach to assessment.
There are also several organisations with subject matter experts across a range of accessibility requirements who can evaluate processes and even certify that they adhere to certain accessibility standards. This is well worth exploring.
Tip 5: If You Identify Issues in Trialling, or in Live Data, Form a Hypothesis as to Why and Test It
This deep dive will often highlight things that could be improved. Our recommendations on using data to improve assessment processes are covered in a recent webinar, which you can access here.