Developing a skills assessment tool for specialist jobs

Strategic HR Review

ISSN: 1475-4398

Article publication date: 13 April 2012


Citation

Schroeder, H. (2012), "Developing a skills assessment tool for specialist jobs", Strategic HR Review, Vol. 11 No. 3. https://doi.org/10.1108/shr.2012.37211caa.008

Publisher: Emerald Group Publishing Limited

Copyright © 2012, Emerald Group Publishing Limited


Article Type: HR at work. From: Strategic HR Review, Volume 11, Issue 3

Short case studies and research papers that demonstrate best practice in HR

Harold Schroeder, President of Schroeder & Schroeder Inc

Al Little, Manager of Business Applications for the City of Hamilton

HR managers are often required to assess the skills of candidates for specialist jobs. Knowledge tests have traditionally been an important component of the assessment process, but these are too often focused exclusively on the technical aspects or “science” of specialist work. In practice, most jobs require a mix of science and “art” skills, including the ability to use techniques and tools relevant to an area of work, as well as the softer skills involved in interacting with other people in order to achieve work objectives.

This article discusses the development of an “art- and science-based” assessment tool for use in the recruitment of IT specialists for the City of Hamilton in Canada. As a result of departmental restructuring, job roles were being redefined and a major recruitment exercise was planned. In this context, in January 2011, Schroeder & Schroeder Inc was commissioned to develop an assessment tool consisting of questions and answers for use in the pre-screening of candidates for 17 positions in the newly created Business Applications and Service Desk sections. In developing this assessment tool it proved necessary to address a number of key considerations, as described below.

Determining scope and focus

The first step in the process was to identify the skills and knowledge that were important to performance in each of the job positions, so that questions and answers could be formulated to examine the candidates’ knowledge and abilities in each of these areas. The client had pre-defined the broad skill categories within which question sets were to be developed, which included a number of science skill areas such as technical troubleshooting, software development and desktop support, and a generic art category defined as “people skills.” However, these categories were too broad to form the basis of assessment items, and it was necessary to break them down into more detailed skill and knowledge requirements.

To do this, we used the draft job descriptions available for each position, which listed duties as well as qualifications, encompassing both science (formal qualifications and knowledge of specific technical tools and techniques) and art (e.g. problem-solving skills, leadership and the ability to use good judgment).

Many duties and qualifications were found to be common to a range of job positions, so the job descriptions were synthesized to arrive at a single list of duties and qualifications for which question and answer sets were required.
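The synthesis described above, collapsing overlapping job descriptions into a single deduplicated list of duties and qualifications, can be sketched in a few lines of code. The job titles and skill entries below are invented for illustration; the actual draft descriptions covered 17 positions:

```python
# Hypothetical draft job descriptions: each maps a position to its
# listed duties and qualifications (entries invented for illustration).
job_descriptions = {
    "Service Desk Analyst": [
        "technical troubleshooting", "desktop support", "people skills"],
    "Business Applications Developer": [
        "software development", "technical troubleshooting", "problem solving"],
    "Senior Applications Analyst": [
        "software development", "leadership", "people skills"],
}

def synthesize(jobs):
    """Merge per-job requirement lists into one deduplicated, sorted list."""
    merged = set()
    for requirements in jobs.values():
        merged.update(requirements)
    return sorted(merged)

question_areas = synthesize(job_descriptions)
print(question_areas)
```

Each entry in the merged list then becomes a candidate area for which a question and answer set is developed once, rather than separately for every position that shares it.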

Taking into account varying proficiency levels

The assessment tool was to be used for the pre-screening of job candidates at entry, intermediate and advanced level. We needed to ensure, therefore, that the question and answer sets could be used for assessing candidates with different levels of experience and knowledge. We approached this in two different ways.

First, we developed different questions and answers to reflect the types of work typically associated with each level. For example, questions directed at entry-level candidates, whose work would be highly supervised, were designed to investigate understanding of key IT terms or concepts, while questions directed at intermediate-level candidates included hypothetical situations that examined their ability to independently apply their knowledge in real-life situations. In formulating advanced-level questions we considered the types of demands on staff at this level and included questions designed to assess their ability to discuss IT issues effectively with senior executives.

The second method was to adapt the design of questions to make them progressively harder for more advanced candidates to answer. This was achieved, for example, by increasing the complexity involved in choosing a combination of right answers in a multiple choice question.

Format and style of questions

Since the assessment tool was intended for the pre-screening of large numbers of candidates, a major priority was to ensure that questions were in a format that would be suitable for self-completion and would generate data that could be analyzed quickly and easily. For many questions, therefore, we used a pre-coded range of possible responses from which the candidate could choose. This meant that the correct answers could simply be recorded in a marking scheme for HR officers to use, or incorporated in an automated computer program that would simply generate an overall score for each candidate, or a test report tailored to the needs of the organization.

Questions in this format included, for example, those with a simple binary response design (e.g. True/False, or a choice between two possible answers), or a multiple choice design (with a choice of one or a combination of answers). These types of questions are particularly useful when testing candidates for entry level specialist jobs, in which the main objective is to test familiarity with and understanding of basic technical concepts. They can also be designed to assess intermediate or advanced skills, for example by asking the candidate to choose the most appropriate response to a hypothetical situation that involves specialist knowledge or related art skills, or one that requires knowledge usually only held by more experienced specialists. Such questions can also be made progressively harder by designing response options that are more difficult to differentiate between.
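A pre-coded answer key of this kind lends itself to straightforward automated scoring. The minimal sketch below (question IDs, options and keys are invented for illustration) treats single-answer, binary and combination multiple-choice items uniformly by storing each correct answer as a set; a combination question is credited only when the candidate selects exactly the right set of options:

```python
# Hypothetical marking scheme: each question ID maps to the set of
# correct option labels (a combination question has several).
answer_key = {
    "Q1": {"A"},        # single correct answer
    "Q2": {"True"},     # binary True/False item
    "Q3": {"B", "D"},   # combination: both options must be chosen
}

def score_candidate(responses, key):
    """Count questions answered exactly correctly (partial credit not given)."""
    return sum(
        1 for qid, correct in key.items()
        if set(responses.get(qid, ())) == correct
    )

# A candidate who misses one option of the combination question Q3:
candidate = {"Q1": ["A"], "Q2": ["True"], "Q3": ["B"]}
print(score_candidate(candidate, answer_key))  # -> 2
```

An overall score computed this way could feed either a simple marking sheet for HR officers or an automated test report; whether to award partial credit on combination questions is a design choice the exact-match rule here deliberately avoids.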

Developing question and answer content

Our project team mainly consisted of IT professionals with relevant specialist knowledge, who were able to understand the nature of jobs for which we had been asked to design assessment items, and compile questions and answers that were job-relevant and appropriately pitched to the right level of knowledge and experience. A large component of the work consisted of research to identify authoritative sources from which question and answer content could be drawn; these included professional bodies of knowledge, textbooks and specialist websites. One of the main roles of our subject experts was to verify the accuracy and reliability of these sources.

We also worked closely with and sought feedback from senior specialists at the City’s IT department on draft question and answer sets before they were finalized for use. This was crucial not only as a further quality control mechanism but also to ensure that the question database adequately incorporated any important organizational-specific requirements of the job roles.

A new recruitment model

The systematic application of the above stages of work resulted in an assessment tool that the City has valued in its IT staff recruitment exercise. Previously, skills assessments and testing were left to individual hiring supervisors and managers to conduct, resulting in inconsistent and sometimes ineffective recruitment that focused mainly on hard technical skills alone. The tool also represents a good practice example of an art- and science-based instrument that can be readily adapted by the City’s HR department to assess candidates for jobs in other specialist and functional areas. These methods have been successfully used within the last six months, although it is still too early for definitive quantitative measures.

About the authors

Harold Schroeder is the President of Schroeder & Schroeder Inc, a firm of experienced professional program and project managers, management consultants, and corporate managers focused on providing transformation management consulting services to private and public sector organizations. He has over a quarter century of experience consulting to boards, executives and senior management. Schroeder is a Fellow Certified Management Consultant (FCMC), a Project Management Professional (PMP), a Certified Health Executive (CHE), and a Certified HR Professional (CHRP).

Al Little is the Manager of Business Applications for the City of Hamilton. He has over 27 years’ experience in the surveying, mapping, Geographic Information Systems (GIS), and information technology fields, 25 of which have been in a municipal government environment. He holds a diploma from Seneca College as a cartographic technician, a bachelor of environmental studies degree from the University of Waterloo, and a certificate in managing information technology projects from George Washington University (School of Business and Public Management). Al Little can be contacted at: al.little@hamilton.ca
