Can students be taught to articulate employability skills?

Jill Tomasson Goodwin (Department of Communication Arts, University of Waterloo, Waterloo, Canada)
Joslin Goh (Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, Canada)
Stephanie Verkoeyen (Department of Geography, University of Waterloo, Waterloo, Canada)
Katherine Lithgow (Centre for Teaching Excellence, University of Waterloo, Waterloo, Canada)

Education + Training

ISSN: 0040-0912

Article publication date: 5 April 2019

Issue publication date: 15 May 2019

Abstract

Purpose

The purpose of this paper is to report on research findings from a teaching and learning intervention that explored whether undergraduate university students can be taught to articulate their employability skills effectively to prospective employers and to retain this ability post-course.

Design/methodology/approach

The study included 3,400 students in 44 courses at a large Canadian university. Stage 1 involved a course-level teaching and learning intervention with the experimental student group, which received employability skills articulation instruction. Stage 2 involved an online survey administered six months post-course to the experimental group and the control group. Both groups responded to two randomly generated questions using the Situation/Task, Actions, Result (STAR) format, a format that employers commonly rely on to assess job candidates’ employability skills. The researchers compared the survey responses from the experimental and control groups.

Findings

Survey results demonstrate that previous exposure to the STAR format was the only significant factor affecting students’ skills articulation ability. Year of study and program (co-operative or non-co-operative) did not influence articulation.

Practical implications

The findings suggest that universities should integrate institution-wide, course-level employability skills articulation assignments for students in all years of study and programs (co-op and non-co-op).

Originality/value

This research is novel because its study design combines practical, instructional design with empirical research of significant scope (institution-wide) and participant size (3,400 students), contributing quantitative evidence to the employability skills articulation discussion. By surveying students six months post-course, the study captures whether articulation instruction can be recalled, an ability of particular relevance for career preparedness.

Citation

Tomasson Goodwin, J., Goh, J., Verkoeyen, S. and Lithgow, K. (2019), "Can students be taught to articulate employability skills?", Education + Training, Vol. 61 No. 4, pp. 445-460. https://doi.org/10.1108/ET-08-2018-0186

Publisher

Emerald Publishing Limited

Copyright © 2019, Jill Tomasson Goodwin, Joslin Goh, Stephanie Verkoeyen and Katherine Lithgow

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

Over the last decade, employers have become increasingly vocal about employability skills: the set of transferable skills characterized as the higher-order thinking skills and personal attributes that employees need to succeed in a work environment (Dacre Pool and Sewell, 2007; Lowden et al., 2011). According to recent reports, the top 10 employability skills that employers identify for undergraduates include oral and written communication, leadership, teamwork, conflict management, initiative, responsibility, decision making, problem solving and critical thinking (Hart Research Associates, 2013; Drummond and Rosenbluth, 2015). Research suggests that employability skills are highly valued, and that many employers rank them above degree designation or university reputation (Finch et al., 2012). Governments have taken note of this trend, too, in some cases insisting that postsecondary funding be tied in part to preparing graduates for the workforce. Canada, Australia and the UK, for example, have made funding partly contingent upon the twinned “demonstrable graduate outcomes” of disciplinary competence and employability skills (Bridgstock, 2009).

Despite these employer calls and supporting government efforts, many employers report a gap between the skills they are looking for and the skills job candidates have. Surveying over 1,500 Ontario employers, Stuckey and Munro (2013) report that “[o]ver 70 per cent [of employers] said that there are gaps in critical thinking and problem-solving skills. Nearly half also said that they are seeing insufficient oral communication (46 per cent) and literacy skills (42 per cent) in the workforce” (p. 26). Researchers offer various explanations for this shortfall. Some research studies support employers’ perceptions, pointing to a “skills gap” between the employability skills that graduates possess and the requirements of prospective employers (Boden and Nedeva, 2010; Jackson and Chapman, 2012). Other research studies argue that new graduates may possess the desired employability skills, but are not aware that they have them (Jackson, 2013; Strachan, 2016). And yet other research points out that if employers are forced to glean employability skills from students’ content knowledge, they will often misperceive a skills gap, when, in fact, the problem is more “a failure on the part of universities to talk to students about the skill development inherent in their education” (Harrison, 2017, p. 6). Still other research extends the scope of this responsibility, arguing that universities need to help students not only to become more aware of their skills, but once aware, to better articulate these skills to others. These researchers postulate that the skills gap is better characterized as an “articulation of skills” gap (Joy et al., 2013).

This study aims to contribute to the “articulation of skills” gap discussion. Specifically, the study tests whether undergraduate university students, regardless of year of study or program, can be taught to articulate their employability skills effectively to a specific audience – prospective employers – and to retain this ability post-course. To do so, we adopted a two-stage study design that combines a teaching and learning intervention and a post-course survey. This research is organized as follows. First, we briefly review the existing literature on students’ employability skills awareness and articulation, focusing on two studies that deployed course-level interventions to research the articulation of employability skills developed through curricular experience. Second, we describe the development and administration of a standardized course assignment to teach articulation, and the post-course survey instrument to test the ability to articulate. Third, we report our quantitative findings across two student cohorts: the experimental group that received articulation instruction in 22 courses and the control group, drawn from the previous cohort of the same courses, that did not. Finally, we expand on the findings derived from the data analysis and present noteworthy implications of the research for universities. Overall, this study provides quantitative data aimed at advancing our understanding of how all students can learn to articulate employability skills, retain this ability over time and explain how they might transfer these skills to new situations.

Literature review

Within the literature on employability skills, the topic of skills awareness has been the focus of much discussion in the last 15 years (e.g. Knight and Yorke, 2003; Jiang and Alexakis, 2017). Of the many studies, several identify and track significant academic and professional consequences, ranging from students being unable to communicate their qualities to prospective employers (Jones et al., 2010), to being unaware of how their course skills apply to the workplace (Luk et al., 2014), to rating the importance of employability skills higher than their self-assessed competency in them (Chan et al., 2017).

To address the skills awareness gap, and to remedy these and other consequences, other studies advocate that universities embed employability indicators, such as employability skills, into curricula (e.g. Gunn and Kafmann, 2011; Pegg et al., 2012). Some of these studies extrapolate from their findings. For example, Finch et al. (2013) argue that students are more likely to acquire employability skills when these are explicitly integrated into program goals, and conclude that “learning outcomes linked to soft-skills development should take priority in the development of both academic programmes (e.g. degrees or majors) and specific courses within these programmes” (p. 696).

These studies provide important groundwork. However, they do not address a key issue for universities: if programs and courses embed employability skills into learning outcomes, when do the students have the opportunity to practice articulating these skills as a part of their academic experience? While some work on employability skills articulation has focused on co-curricular experiences such as work-integrated learning (e.g. Pretti and Fannon, 2018), there is a dearth of literature on curricular-based practice. In our literature search, only two studies emerged. Both make important contributions to the discussion of employability skills articulation; that is, they model how researchers can develop study designs that combine course-based activities and research data to investigate student employability skills articulation.

In the more recent study, Joy et al. (2015) enlisted instructors in different courses to integrate skills awareness discussions and have students write reflections about course-based employability skills. The Career Integrated Learning Project at Memorial University (Newfoundland, Canada), led by researchers Rhonda Joy, Rob Shea and Karen Youden-Walsh, had instructors identify employability skills inherent in their coursework assignments, and integrate the identified skills into their course syllabus descriptions (Joy et al., 2015). During the semester, the researchers facilitated three in-class discussions about the “graduating student competencies” (employability skills) embedded in such course activities as group work, presentations and research. Students completed written reflections connecting coursework with competencies to interrogate their personal level of competency achievement. Administering in-class surveys to 450 students, the researchers report that 72 percent of students found the reflection exercise helpful, supported by some anecdotal evidence that students later articulated these skill sets during employment interviews.

In the second, older study, Brumm et al. (2006) reported on a long-standing program at Iowa State University, which targeted four courses inside a single program and had students practice the industry-standardized Situation/Task, Actions, Result (STAR) format to articulate their employability skills. First-year Agriculture and Biosystems engineering students were required to write reflections about their “professional competencies” (employability skills), selected from the engineering program outcomes assessment. They followed the set formulation of “STAR,” the acronym for the structured manner of responding to a behavior-based employment interview question by describing a specific Situation or Task, Action (taken to meet the situational challenge) and its Result. Across four first-year courses, students wrote, spoke and received feedback about their STAR responses to help prepare for co-op, internship or summer employment interviews. Moreover, throughout their studies, students were expected to reflect, add to and update these responses in an ePortfolio. Anecdotal evidence indicates that upper-year students were more confident than students who did not have exposure to STAR-format reflections (Brumm et al., 2006).

Our research contributes to the discussion of employability skills articulation practice by extending the work of these two studies, particularly by widening the scope of the study population (campus-wide), the design of the intervention (for all students, years and majors) and the design of the research study (quantitative and qualitative, large sample, comparison group and post-course measurement). Specifically, to test a transdisciplinary, campus-wide student population, this study systematically includes courses from all university faculties on campus (Joy et al. (2015) include different courses, but not cross-campus; Brumm et al. (2006) include four courses from one program). To test a course-level intervention integrated campus-wide, for all students, years of study and majors, this study developed (1) standardized teaching and learning materials (2) that connected employability skills articulation (3) to course-level outcomes and activities (4) using an employer-sanctioned reflection format (STAR) (Joy et al., 2015 include intervention features 1 and 3; Brumm et al., 2006 include features 1, 2 and 4). Finally, to gather comparative qualitative and quantitative data from a large sample over time, this study compared a control and an experimental group totalling 3,400 students, six months post-course (neither Joy et al., 2015 nor Brumm et al., 2006 include these protocols).

Methodology

This study extends an earlier pilot study of one course (47 students) that members of the research team ran using the same research parameters: a teaching and learning intervention, followed by a six-month post-course survey. The pilot study found that students in the experimental group provided significantly stronger evidence than the control group of a retained ability to articulate and to transfer learning (p<0.05), as measured by the ability to “sharpen,” “deepen” and “transfer.” Further information about the pilot study can be found at https://uwaterloo.ca/centre-for-teaching-excellence/teaching-awards-and-grants/grants/learning-innovation-and-teaching-enhancement-grants/descriptions-funded-lite-grant-projects/eportfolios-career-reflection-and-competency-integration.

With these findings, we set up this study to test the pilot study’s generalizability to a larger cohort of students (44 courses; 3,400 students). Specifically, in this study, we hypothesized the following:

H1.

The students’ year of study does not impact their ability to articulate employability skills.

H2.

The students’ program (co-op or non-co-op) does not impact their ability to articulate employability skills.

H3.

The experimental group can articulate their employability skills better than the control group.

Stage 1: teaching and learning intervention

Stage 1 of this study involved a teaching and learning intervention designed by the research team for the experimental student group. For further information about the teaching and learning intervention, see Tomasson Goodwin and Lithgow (2018).

Teaching intervention

From all six university faculties, 17 instructors across 22 courses volunteered to integrate a written STAR-reflection assignment, iterated four times (including one draft peer review) over the 12-week semester. To provide teaching support before the semester, researchers assisted instructors to develop employability-skills statements in their syllabi and learning outcomes; to select three existing course activities to tie to the three STAR-reflection assignments; and to revise their marking schemes to include grades for the three assignments. To provide learning support, and to maintain consistency of instructional delivery during the semester, instructors provided students with a set of standardized course materials created by the researchers. These materials included in-class activity supports (see https://uwaterloo.ca/centre-for-teaching-excellence/support/integrative-learning/watcv/watcv-course-integration-instructors), reflection assignment templates, student materials including model reflections (see https://uwaterloo.ca/centre-for-teaching-excellence/resources/integrative-learning/eportfolios/examples-student-eportfolios) and an interactive, custom-designed rubric to provide feedback and to assess the reflections and accompanying ePortfolio (see https://watcv.uwaterloo.ca/rubricdemo/).

Learning intervention

The learning intervention was designed to close the skills articulation gap between students and employers by teaching students to master a response template – STAR – currently used by employers worldwide in the “behavior-based” style of employment interview; in other words, to have students practice articulating their employability skills with a structure and language that employers themselves use. Behavior-based interviewing emerged from industrial-organizational psychology (Janz et al., 1986; Green et al., 1993), has been validated by psychometric studies (Pulakos and Schmitt, 1995) and has been refined by 30 years of research into effective interviewing techniques (Levashina et al., 2014). Specifically, behavior-based interviews proceed from the premise that past behaviors (or actions) are the best predictor of future behavior. Interviewers assess whether a candidate’s past behaviors align with the job description by using STAR-focused questions to prompt the candidate to describe their past behavior in specific situations.

As a learning intervention in this study, the vehicle of written reflection is, equally, an educationally transformative act; that is, learning takes place when learners can understand their experiences in a personally meaningful manner, often through interactions with others (Vygotsky, 1978; Rodgers, 2002). At the beginning of the 12-week semester, instructors introduced employability skills through a standardized table of employability skills and behaviors, and introduced articulation practice through three reflection assignments, each using a standardized STAR-reflection template. After choosing a single employability skill and accompanying behavior from the table, students composed a STAR reflection describing how they deployed that skill and behavior while completing an associated course activity. The students received both formative and summative feedback through a standardized rubric, designed by the researchers and enhanced by an interactive grading scheme.

Of note, the assessment rubric was designed to serve various instructional and research functions. First, it ensured consistency of feedback both to students over the four iterations of the reflection assignment and to students across the 22 courses (reinforced by researcher-led inter-rater reliability marking training for instructors and teaching assistants). Second, it helped students become more aware of non-academic audiences: the rubric feedback form ties assigned marks to a probable employer response. A D-level grade, for example, is associated with “We received your application,” while an A-level grade is associated with “We’ve put you on the shortlist.” Third, rubric feedback pushed students to practice composing effective responses to the three structural components of a STAR response. Separately, in the research survey, these components were tagged as the three measures of articulation: the ability to “sharpen” the situational details of the Situation/Task section; to “deepen” the number and range of proactive reactions of the Action section; and to “transfer” lessons learned in the Results section to new situations.

Stage 2: employability skills articulation survey

A census-style survey was used to provide a more accurate and complete picture of students across the university and to increase confidence in the results. A total of 3,398 students enrolled in the 44 courses of the study received the survey six months post-course: 1,682 students in the experimental group courses and 1,716 students in the control group courses. (Note: individual courses varied considerably in size and delivery style, ranging from studio-style courses of 20 students to large lecture-style courses of 350 students.) Of these 3,398 students, 1,048 (31 percent) accessed the survey (605 control, 443 experimental), with 6 declining to participate. To ensure the sample represented the university student population, the sample and the population were compared on two variables: year of study and program (co-op or non-co-op).

To summarize, Table I outlines the study procedure and timeline: Stage 1 (over two years) and Stage 2 (survey).

Measures

The survey integrated several measures. To ensure the comparability of STAR-format reflection instruction inside the experimental student cohort (i.e. exposure in only one course) and inside the control student cohort (i.e. no exposure), the survey filtered out ineligible students. To account for differences in background and skills experience, the survey gathered demographic information, post-course articulation opportunities and self-assessed skills performance achievement (on a four-point Likert scale, measuring beginning, developing, accomplished and exemplary levels). Lastly, to test students’ skills articulation ability, the survey presented students with two randomized STAR-format reflection questions. These questions focused on two of the ten employability skills identified in the self-assessment.

Data analysis

Logistic regression was used to evaluate whether students could articulate their employability skills (dependent variable). Specifically, a logistic regression model was run for each dependent variable – sharpen, deepen and transfer – that constituted articulation ability. The independent variables for each model were year of study, program (co-op or non-co-op) and course cohort.
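For readers who want the model written out, the binary logistic regression implied by this description can be expressed as follows. This is our reconstruction from the variables named above; the coefficient symbols are illustrative and do not appear in the paper:

\[
\operatorname{logit}\,\Pr(Y_{i}=1) \;=\; \beta_{0} \;+\; \beta_{1}\,\mathrm{Year}_{i} \;+\; \beta_{2}\,\mathrm{Program}_{i} \;+\; \beta_{3}\,\mathrm{Cohort}_{i}
\]

where, for student i, Y is one of the three binary outcomes (sharpen, deepen or transfer) and Year, Program and Cohort are indicators for upper-year standing, co-op enrollment and experimental (STAR-instructed) group membership, respectively; a separate model is fit for each outcome.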

Before the logistic regression analyses were performed, we ensured the comparability of the students with respect to self-assessed skills performance and post-course articulation opportunities. First, we established comparability in employability skills articulation along four lines: level of confidence in employability skills, year of study (lower or upper year), program (co-op or non-co-op) and course cohort (experimental/STAR-instructed or control/non-STAR-instructed). Proportion tests established that more than half the sample rated their skills as accomplished or exemplary. Fisher’s exact test (with Bonferroni correction) was used to test for skill-level differences by year of study, program and course cohort. By year of study, in all but two skills (leadership and critical thinking), upper-year students rated their employability skill levels significantly higher than lower-year students did. By program, co-op and non-co-op students rated their levels similarly (as accomplished or exemplary). And by course cohort, in all but one skill (decision making), experimental and control group students rated their levels similarly high. As the second method of validating comparability, we also considered students’ opportunities to articulate their employability skills. Based on the students’ responses, neither the experimental nor the control group had significantly more students who had opportunities to practice their articulation skills verbally (χ2(1, N=840)=1.41, p=0.23) or in writing (χ2(1, N=840)=0.05, p=0.83).
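As an illustration of how these comparability checks could be run, the sketch below reproduces the chi-square test for practice opportunities and the Bonferroni-corrected Fisher’s exact tests for self-rated skill levels. The paper does not specify its analysis software; the file and column names here (cohort, verbal_practice, upper_year, skill_*) are hypothetical.

```python
# Illustrative sketch only: software and column names are assumptions, not the authors' code.
import pandas as pd
from scipy.stats import chi2_contingency, fisher_exact

df = pd.read_csv("survey_responses.csv")  # hypothetical file of eligible respondents

# Chi-square test: did the cohorts differ in post-course opportunities to practice
# articulation verbally? (An analogous test would be run for written practice.)
verbal_table = pd.crosstab(df["cohort"], df["verbal_practice"])  # cohort: control/experimental
chi2, p, dof, _ = chi2_contingency(verbal_table)
print(f"Verbal practice: chi2({dof}, N={len(df)}) = {chi2:.2f}, p = {p:.2f}")

# Fisher's exact tests, Bonferroni-corrected across the ten skills: compare the proportion
# of high self-ratings between upper- and lower-year students. (Analogous comparisons
# would be run by program and by course cohort.)
skill_columns = [c for c in df.columns if c.startswith("skill_")]
alpha = 0.05 / len(skill_columns)  # Bonferroni-adjusted threshold
for skill in skill_columns:
    high_rating = df[skill].isin(["accomplished", "exemplary"])
    table = pd.crosstab(df["upper_year"], high_rating)
    _, p = fisher_exact(table)
    verdict = "significant" if p < alpha else "not significant"
    print(f"{skill}: p = {p:.3f} ({verdict} at the Bonferroni-adjusted level)")
```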

Qualitative analysis

Before applying logistic regression, two of the researchers independently read and coded each of the three parts of the anonymized STAR-format response (as shown in Figure 1). To maintain consistency between the assessments of in-course and survey STAR-format responses, researchers used the in-course assessment rubric to guide the survey coding criteria. They resolved discrepancies between coding scores by referring to the assessment rubric and discussing the scores.

As illustrated in Figure 1, students completed the three constructs that comprise a STAR response – “Situation or Task,” “Actions” and “Results” – each in a discrete text box. This layout allowed the researchers to export the anonymized data as separate cells, allowing for separate analysis and coding of each construct.

For coding purposes, each construct was independently assigned a score of 0, 1 or 2 to indicate the degree to which the student articulated their employability skills: by identifying a Situation or Task in which they used the skill (the dependent variable “sharpen”); by providing details of the Actions they took that demonstrated the skill (the dependent variable “deepen”); and by connecting the Results of the situation to future situations (the dependent variable “transfer”). A score of 0 was assigned if there was no evidence, 1 if there was some evidence, and 2 if there was strong evidence of the predetermined variable (sharpen, deepen or transfer).

Quantitative analysis

In the logistic regression models, the scores (0, 1, 2) from the qualitative coding of the dependent variables (sharpen, deepen and transfer) became the raw data, such that 0 indicates not showing the ability to articulate, whereas 1 and 2 indicate showing the ability to articulate. Because the comparability of the students had been established through self-assessed skills performance achievement and post-course articulation opportunities, any difference found in the logistic regression models can be attributed to STAR-reflection articulation instruction.
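A minimal sketch of this recoding and modeling step, assuming the coded survey data sit in a flat table with one row per student, might look as follows. The paper does not name its software; statsmodels is used purely for illustration, and the column names are hypothetical.

```python
# Illustrative sketch only: recode the 0/1/2 qualitative scores to binary outcomes
# and fit one logistic regression per outcome. Column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("coded_responses.csv")  # hypothetical coded survey data

for outcome in ["sharpen", "deepen", "transfer"]:
    # 0 = no evidence of articulation; 1 or 2 = some or strong evidence
    df[outcome + "_bin"] = (df[outcome] > 0).astype(int)

    # Independent variables: year of study (lower/upper), program (co-op/non-co-op)
    # and course cohort (control/experimental), treated as categorical predictors
    model = smf.logit(outcome + "_bin ~ C(year) + C(program) + C(cohort)", data=df).fit()
    print(model.summary())
```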

Results

In this section, we outline the statistics model, report the quantitative results and connect the results to the study hypotheses; in the Discussion section, we will explore the implications of these results.

Both the control and experimental groups had similar non-completion rates. Of the 855 eligible students who completed the close-ended portion of the survey questions, 508 did not attempt the open-ended portion (the two STAR-format responses) and exited the survey at that point: 59 percent of the 483 control group students and 60 percent of the 372 experimental group students did not attempt a response (or provided invalid responses). For students who completed two responses, we recorded the higher of the two scores for each response segment (sharpen, deepen and transfer), reasoning that students might not spend time or effort to answer both, and that students who showed some ability to articulate could do so consistently. To verify this consistency, we looked at the responses of the 225 students who provided two responses. More than half of them demonstrated the ability to sharpen (77 percent), deepen (63 percent) and transfer (57 percent) in both their responses. Specifically, using a one-sided one-sample proportion test, we found that the students’ consistency in articulation did not happen by chance: χ2(1, N=225)=66.15, p=0.001 (sharpen), χ2(1, N=225)=13.94, p=0.001 (deepen) and χ2(1, N=225)=3.48, p=0.03 (transfer). Given this finding, the higher score was retained for subsequent analysis.
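The consistency check described above can be illustrated with a one-sided, one-sample proportion test of whether more than half of the 225 double-responders showed the same ability in both responses. The counts below are reconstructed approximately from the percentages reported in the text, and the z-test used here is an asymptotic equivalent of the chi-square form the paper reports.

```python
# Illustrative sketch only: one-sided test that the proportion of consistent
# responders exceeds 0.5. Counts are approximated from the reported percentages.
from statsmodels.stats.proportion import proportions_ztest

n = 225  # students who provided two STAR-format responses
consistent = {
    "sharpen": round(0.77 * n),   # ~77 percent consistent in both responses
    "deepen": round(0.63 * n),    # ~63 percent consistent
    "transfer": round(0.57 * n),  # ~57 percent consistent
}

for ability, count in consistent.items():
    z, p = proportions_ztest(count, n, value=0.5, alternative="larger")
    print(f"{ability}: z = {z:.2f}, one-sided p = {p:.4f}")
```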

A logistic regression model was run for each of the dependent variables (sharpen, deepen and transfer) to investigate the effect of year of study, program (co-op and non-co-op) and course cohort (experimental or control) on whether students could articulate their employability skills.

Ability to sharpen

The data suggest that both upper-year and lower-year students (β=0.95, t(329)=1.12, p=0.26), and both co-op and non-co-op students (β=−0.98, t(329)=−1.27, p=0.20), are equally likely to show the ability to sharpen. These results support H1 (no articulation difference by year of study) and H2 (no articulation difference between co-op and non-co-op). However, there is strong evidence that the experimental group students are more likely to be able to sharpen their articulation (β=1.40, t(329)=−2.76, p=0.01), which supports H3 (articulation difference by treatment group). Of the experimental group, 96.6 percent could sharpen compared to 85.6 percent of the control group.

Ability to deepen

As with the ability to sharpen, the data suggest that both upper-year and lower-year students (β=−15.98, t(326)=−0.02, p=0.99), and both co-op and non-co-op students (β=0.04, t(326)=−0.07, p=0.95), are equally likely to show the ability to deepen. These results, again, support H1 and H2. However, there is strong evidence that the experimental group students are more likely to be able to deepen their articulation (β=1.55, t(326)=1.94, p=0.05), which supports H3. Of the experimental group, 90.5 percent could deepen, significantly more than the 72.4 percent of the control group.

Ability to transfer

As with the abilities to sharpen and deepen, the data suggest that both upper-year and lower-year students (β=1.06, t(327)=1.63, p=0.10), and co-op and non-co-op students (β=−0.13, t(327)=−0.26, p=0.79), are equally likely to show the ability to transfer. These results also support H1 and H2. However, there is strong evidence that the experimental group students are more likely to be able to transfer their articulation (β=2.20, t(327)=3.19, p=0.001), which supports H3. Of the experimental group, 51.7 percent could transfer, significantly more than the 28.6 percent of the control group. Of note, fewer students in both groups were able to articulate through transfer than through sharpening or deepening.

Ability to sharpen, deepen and transfer

Figure 2 breaks down, by percentage, which students could articulate by grouping them into their program designation (co-op and non-co-op) and course cohort.

The non-co-op/non-STAR-instructed (control) student group is least proficient in articulating their employability skills, followed by the co-op/non-STAR-instructed (control) student group. The co-op/STAR-instructed (experimental) and the non-co-op/STAR-instructed (experimental) student groups are almost equally proficient.

Discussion

In this study, we found that students’ ability to articulate their employability skills, as measured by sharpen, deepen and transfer:

  1. is not affected by students’ year of study;

  2. is not affected by students’ program (co-op or non-co-op); and

  3. is affected by students’ enrollment in courses that required the STAR-reflection assignment.

Our data indicate that both lower-year and upper-year students could provide STAR-format responses equally well, taking into account their academic background (co-op and non-co-op) and articulation instruction (experimental and control group). While the upper-year students have taken more academic courses than the lower-year students, we speculate that regular academic courses alone do not influence these students’ ability to articulate their employability skills.

Our data also indicate that both co-op and non-co-op students could provide STAR-format responses equally well. This finding was surprising, given that co-op students often receive coaching from academic support units to prepare them for co-op employment interviews, coaching that non-co-op students may not receive. Several factors may have contributed to this non-discrepancy. First, co-op students are generally not required to enroll in employment interview coaching, and so may not be exposed to the STAR format as part of behavior-based interview training. Second, universities do not have enough personnel to give students the practice with, and feedback on, STAR-format responses needed to ensure mastery. Likewise, because most employers do not provide co-op students with feedback on their employability skills articulation, students have little or no awareness of what to practice for future interviews. Numerous studies show that practice for mastery requires both regular repetition and feedback (see Hattie and Timperley, 2007). Third, while some co-op students may be able to articulate their employability skills verbally using the STAR format, they may find writing out STAR responses more challenging (Biber et al., 2002). Finally, coaching may not provide the kind of feedback that students value in the short term, compared to feedback that is assessed as part of a course and rewarded with a numeric grade (see Smith and Gorard, 2005).

As well, our data show that experimental (STAR-instructed) group students could provide better STAR-format responses than control (non STAR-instructed) group students. To account for this difference in articulation ability, we offer several differences between the instructional experiences of the two student groups. First, students become aware that employability skills and their university coursework are connected (as outlined in their syllabi, course content and STAR-reflection assignments). Second, students learn about the STAR format itself (its rationale, use in hiring and relevance to coursework). Third, students practice writing STAR reflections three times over their 12-week term (see Karpicke and Bauernschmidt (2011) for learning benefits of assignment spacing), and are rewarded for their effort (6 to 30 percent of the final course grade). Fourth, students receive both instructor and peer feedback through a customized assessment rubric designed for both formative and summative feedback (see Bransford et al., 2000). And finally, students mount their STAR reflections inside a career ePortfolio, meant for prospective employers to review. By contrast, control group students receive none of these instructional experiences.

To illustrate the effect of this instruction, we can compare two written student responses from the survey, shown in Table II. Both students answered the question, “In the last six months, describe a situation where you successfully communicated your opinion, verbally, to others.” Both students entered a three-part answer (see Table II) that followed the STAR format by responding to the writing prompts “describe the situation you were in,” “describe the actions that you took” and “describe the outcome of your actions.” Note that these two responses are for illustration purposes: most students, in both the control and experimental groups, provided responses that were, overall, better than the weak response and not as good as the strong response.

Across the three parts of each answer in these two responses, we note several differences. For example, the number of full sentences (1 vs 3) and their length (7 vs between 15 and 30 words) differ. This basic difference is one of the two critical indices of a respondent’s ability to articulate employability skills as measured by sharpening (identifying a skills-prompting situation), deepening (providing details of actions that demonstrated skill) and transferring (connecting details of situational results to bridge to future situations). As well, note that only the “strong response” student could reflect upon, and articulate an understanding of, the skill that s/he could transfer: “This taught me that by giving examples I can back up my points.”

The second critical index of a respondent’s ability to articulate is the use of the personal pronoun “I” paired with verbs in the active voice. This subject/verb combination denotes mastery of the baseline requirement of skills articulation in the STAR format: the ability to describe a past situation that prompted the use of employability skills. The strong response contains eight such subject/verb expressions, distributed evenly across the response, demonstrating a clear and compelling mastery of the STAR format: “I made picnic tables,” “I had to share with him my opinion” (situation); “I explained to my supervisor,” “I told him specifically,” “I even took him,” “I communicated this to my supervisor” (actions); “I accomplished my goal,” “I can back up my points” (results).

Despite these differences in instructional effect, fewer students in both the experimental and control groups could articulate through transfer. This finding is not surprising. Of the three measured variables, the act of transferring requires students to understand, at a minimum, two practical notions. First, the notion of simultaneity: students need to recognize that in mastering disciplinary content knowledge at university, they simultaneously engage in such employability skills as problem solving, communication and teamwork. Second, the notion of ubiquity: students need to recognize that they engage the same employability skills across a range of situations, from coursework to extracurricular activities to the workplace, and that when they practice these skills in one, they can improve them across all situations. Understanding these two ideas, students can become aware that the key to articulating their employability skills to others is extrapolating, or transferring, the lessons learned about the skills from one situation to other situations, including their future use. However, because it takes time to understand these ideas, we speculate that one course assignment is not enough for students to become aware of, and articulate confidently, this kind of reflexive, and extensive, understanding.

Implications for universities

To answer the question posed in the title of this paper, “Can students be taught to articulate employability skills?”, the study data demonstrate that students in all years of study and in a wide range of academic majors, both co-op and non-co-op, supported by regular practice and feedback in a course setting, can articulate their employability skills long after the course intervention has finished. This study, then, points to one significant implication for universities: to help students articulate their employability skills to prospective employers, universities should support the integration of employability skills assignments into academic courses.

For universities willing to commit overtly to employability skills articulation, what support might be considered? At the institutional level, universities could adopt policies to encourage employability skills articulation programming across the campus, through such vehicles as strategic planning documents. Equally, universities could direct funding to centralized academic support units, such as teaching and learning, co-op and career, ePortfolio or student success offices. Funding could support teaching and learning centers in working with department chairs and instructors on such issues as the culture shift from disciplinary content (teaching paradigm) to content plus employability skills articulation (learning paradigm), or on program and instructional design for skills articulation. Likewise, universities could direct research funding into such areas as integrating interventions in courses, into program-level curricular mapping and planning, or across curricular and co-curricular experiences, such as co-op. Because co-op offices, for example, cannot sustain the individualized and iterative instruction and feedback about skills articulation that the course level is structurally designed to provide, universities are positioned to support productive synergies between the two and, simultaneously, to better meet their own organizational mandates, such as undergraduate degree-level expectations (Ontario Universities Council on Quality Assurance, 2010).

Besides policy and funding, universities could also support their personnel in these development efforts. Through such vehicles as recognition awards, special development secondments and course releases, universities could recognize and reward the administrators and faculty members who commit themselves to developing, implementing and testing program- and course-level initiatives. At the program level, for example, administrators could plan for and test the regularized spacing of articulation instruction – in one or two courses per year throughout a student’s degree – to support ePortfolio showcasing of employability skills articulation across course, extracurricular and workplace experiences. At the course level, instructors could research, introduce and test syllabi and course assignments, customizable instructional materials, standardized articulation materials and a consistent assessment rubric to contribute to ongoing efforts at the program and university levels.

This study has demonstrated that co-op and non-co-op students in all years benefitted from STAR instruction at the course level and retained the ability to articulate their employability skills over time. This last finding is encouraging for students who are not enrolled in co-operative education. It also challenges the assumption that co-operative education, by itself, provides sufficient resources to help students to articulate employability skills that bridge between academic and workplace environments.

Limitations

There are three limitations that may have affected this study: time on task, composition of the survey respondents and employability skills tested.

First, because the study’s online survey could not track the length of time participants spent writing their STAR responses, researchers do not know, for example, whether the control group students attempted to respond, and then deleted partial responses and abandoned the task or whether they simply did not attempt the task at all. Likewise, the researchers do not know whether the experimental group students spent 5, 15 or 30 min on writing their responses. Future researchers, therefore, may wish to establish how quickly students can articulate their skills as another measure of mastery. This additional time on task information may help future researchers to understand, for example, how readily a whole STAR reflection can be written (ease of articulation generally) or how readily each section of a STAR reflection can be written to uncover where students might benefit from further instruction (refinement of learning intervention). Likewise, future researchers could track over the semester whether time on task decreased with each iteration of reflection writing to establish the most efficacious number of articulation exercises, and test this study’s assumption of three iterations (Guskey, 2007).

Second, this study did not consider variance across faculties or majors because, as a reflection template, the STAR reflection tests discipline-agnostic employability skills. Consequently, while this study’s survey respondents included students from all years in courses campus-wide, future researchers may wish to extend this study by ensuring representation across whichever variables they wish to study. For example, researchers who wish to compare faculties or programs, or even majors within a faculty, would need to use different sampling techniques, such as stratified sampling, to ensure equal representation from each faculty and academic program.

Third, while this study is based on a compiled list of ten of the most common employability skills promised by universities and requested by employers (Munro et al., 2014; Ontario Universities Council on Quality Assurance, 2010), employability skills lists themselves evolve and change (World Economic Forum, 2018). Future researchers may wish to extend this study’s findings by using different employability skills for STAR reflections.

Future research

While this study clearly demonstrates the benefits of practicing the STAR format in writing, students also need to transfer their written mastery to the oral setting of face-to-face job interviews with employers. Thus, future research can extend this element of the study in two ways: first, by examining what students need in order to transpose written STAR responses into oral presentation, and second, by including the input of employers, the real-world stakeholders, to help further bridge the articulation of skills gap. Together, these extensions would provide students with a more authentic experience.

Conclusion

This study makes an original contribution because its design combines practical instructional design with empirical research of significant scope (institution-wide) and participant size (3,400 students), and therefore contributes quantitative evidence to the employability skills articulation discussion. By surveying students six months post-course, the study captures whether articulation instruction can be recalled, an ability of particular relevance for career preparedness. Likewise, our research can help to answer other, larger calls for university accountability, such as that of Harvey Weingarten, who laments that postsecondary institutions have “little information about whether students arrive on campus with these [employability] skills or to what degree they acquired them through their studies, because we don’t test for them” (Weingarten, 2016). Most broadly, our study can also help to meet the underlying rationale of postsecondary education, which assumes that “knowledge, skills, and attitudes learned in this setting will be recalled accurately, and will be used in some other context at some time in the future” (Halpern and Hakel, 2003, p. 38).

Figures

Figure 1: Example survey page layout of STAR construct with dependent variables

Figure 2: Breakdown of percentages of students who could articulate by program and course cohort

Table I: Study process and timeline

Year 1: control group
Teaching intervention: No
Learning intervention: No
Six-month post-course survey: Yes

Year 2: experimental group
Teaching intervention (instructor workshops): Yes
 Integrating skills into syllabi and learning outcomes
 Administering standardized course materials for STAR reflection
 Grading with assessment rubric (inter-rater reliability)
Learning intervention (graded student assignments): Yes
 STAR reflection 1
 STAR reflection 2A (peer assessment of draft)
 STAR reflection 2B
 STAR reflection 3
Six-month post-course survey: Yes

Year 2: control and experimental groups
Data analysis and report comparing STAR-reflection articulation from the six-month post-course surveys

Table II: Employability skills articulation example (oral communication)

Sharpen
Writing prompt: Describe the situation you were in
Weak response: I successfully communicated to my coworker about a client
Strong response: I was a team lead at a factory making outdoor patio sets from recycled plastic, I made picnic tables. My supervisor asked me if it was possible to complete a certain color of picnic table by the end of the day. I had to share with him my opinion on the matter i.e. if we had the right lumber for it, if it was achievable

Deepen
Writing prompt: Describe the action(s) you took to successfully communicate your opinion
Weak response: Verbally described my opinion on the matter
Strong response: I explained to my supervisor that it was not possible due to the lack of lumber to build it. I told him specifically what pieces of lumber we were missing and I even took him to where the lumber is stored to show him that we did not have it. After talking to the extruder operators who make the lumber I had a clearer picture of when the color needed would be ready and I communicated this to my supervisor

Transfer
Writing prompt: Describe the outcome of your action(s), e.g. how did the situation end? What did you accomplish? What did you learn?
Weak response: My coworker agreed with me
Strong response: The outcome of my action was that my supervisor agreed with me and trusted me to make sure the picnic table was done as soon as possible. I accomplished my goal of convincing my supervisor that this was an unachievable task at the moment. This taught me that by giving examples I can back up my points

References

Biber, D., Conrad, S., Reppen, R., Byrd, P. and Helt, M. (2002), “Speaking and writing in the university: a multidimensional comparison”, TESOL Quarterly, Vol. 36 No. 1, pp. 9-48.

Boden, R. and Nedeva, M. (2010), “Employing discourse: universities and graduate ‘employability’”, Journal of Education Policy, Vol. 25 No. 1, pp. 37-54.

Bransford, J.D., Brown, A.L. and Cocking, R.R. (Eds) (2000), How People Learn: Brain, Mind, Experience, and School, National Academy Press, Washington, DC.

Bridgstock, R. (2009), “The graduate attributes we’ve overlooked: enhancing graduate employability through career management skills”, Higher Education Research & Development, Vol. 28 No. 1, pp. 31-44.

Brumm, T.J., Mickelson, S.K. and White, P.N. (2006), “Integrating behavioral-based interviewing into the curricula”, NACTA Journal, Vol. 50 No. 2, pp. 28-31.

Chan, C.K.Y., Zhao, Y. and Luk, L.Y.Y. (2017), “A validated and reliable instrument investigating engineering students’ perceptions of competency in generic skills”, Journal of Engineering Education, Vol. 106 No. 2, pp. 299-325.

Dacre Pool, L. and Sewell, P. (2007), “The key to employability: developing a practical model of graduate employability”, Education + Training, Vol. 49 No. 4, pp. 277-289.

Drummond, D. and Rosenbluth, E.K. (2015), “Competencies can bridge the interests of business and universities”, Working Paper No. 2015-02, University of Ottawa, Ottawa, September, available at: https://ruor.uottawa.ca/bitstream/10393/33200/4/EPRI%20Working%20Paper%202015-02.pdf (accessed February 21, 2019).

Finch, D., Nadeau, J. and O’Reilly, N. (2012), “The future of marketing education: a practitioner’s perspective”, Journal of Marketing Education, Vol. 35 No. 1, pp. 54-67.

Finch, D.J., Hamilton, L.K., Baldwin, R. and Zehner, M. (2013), “An exploratory study of factors affecting undergraduate employability”, Education + Training, Vol. 55 No. 7, pp. 681-704.

Green, P.C., Alter, P. and Carr, A.F. (1993), “Development of standard anchors for scoring generic past-behaviour questions in structured interviews”, International Journal of Selection and Assessment, Vol. 1 No. 4, pp. 203-212.

Gunn, V. and Kafmann, K. (2011), “Employability and the austerity decade”, available at: www.enhancementthemes.ac.uk/docs/ethemes/graduates-for-the-21st-century/employability-and-the-austerity-decade.pdf?sfvrsn=a63df981_8 (accessed February 21, 2019).

Guskey, T.R. (2007), “Closing achievement gaps: revisiting Benjamin S. Bloom’s ‘learning for mastery’”, Journal of Advanced Academics, Vol. 19 No. 1, pp. 8-31.

Halpern, D.F. and Hakel, M.D. (2003), “Applying the science of learning to the university and beyond: teaching for long-term retention and transfer”, Change: The Magazine of Higher Learning, Vol. 35 No. 4, pp. 36-41.

Harrison, A. (2017), Skills, Competencies and Credentials, Higher Education Quality Council of Ontario, Ontario, available at: www.heqco.ca/SiteCollectionDocuments/Formatted_Skills%20Competencies%20and%20Credentials.pdf (accessed February 21, 2019).

Hart Research Associates (2013), “It takes more than a major: employer priorities for college learning and student success”, Liberal Education, Vol. 99 No. 2, pp. 22-26, available at: www.aacu.org/publications-research/periodicals/it-takes-more-major-employer-priorities-college-learning-and (accessed February 21, 2019).

Hattie, J. and Timperley, H. (2007), “The power of feedback”, Review of Educational Research, Vol. 77 No. 1, pp. 81-112.

Jackson, D. (2013), “Student perceptions of the importance of employability skill provision in business undergraduate programs”, Journal of Education for Business, Vol. 88 No. 5, pp. 271-279.

Jackson, D. and Chapman, E. (2012), “Non-technical skill gaps in Australian business graduates”, Education + Training, Vol. 54 Nos 2/3, pp. 95-113.

Janz, T., Hellervik, L. and Gilmore, D. (1986), Behavior Description Interviewing, Allyn and Bacon Publishers, Boston, MA.

Jiang, L. and Alexakis, G. (2017), “Comparing students’ and managers’ perceptions of essential entry-level management competencies in the hospitality industry: an empirical study”, Journal of Hospitality, Leisure, Sport & Tourism Education, Vol. 20 No. 1, pp. 32-46.

Jones, M., McIntyre, J. and Naylor, S. (2010), “Are physiotherapy students adequately prepared to successfully gain employment?”, Physiotherapy, Vol. 96 No. 2, pp. 169-175.

Joy, R., Shea, R. and Youden-Walsh, K. (2013), “Advancing career integrated learning at Memorial”, paper presented at Cannexus, Ottawa, January 28–30.

Joy, R., Shea, R. and Youden-Walsh, K. (2015), “Meeting the challenge of work and life using a career integrated learning approach”, Proceedings of the Atlantic Universities’ Teaching Showcase, Vol. 19, pp. 76-79.

Karpicke, J.D. and Bauernschmidt, A. (2011), “Spaced retrieval: absolute spacing enhances learning regardless of relative spacing”, Journal of Experimental Psychology: Learning, Memory, and Cognition, Vol. 37 No. 5, pp. 1250-1257.

Knight, P.T. and Yorke, M. (2003), “Employability and good learning in higher education”, Teaching in Higher Education, Vol. 8 No. 1, pp. 3-16.

Levashina, J., Hartwell, C.J., Morgeson, F.P. and Campion, M.A. (2014), “The structured employment interview: narrative and quantitative review of the research literature”, Personnel Psychology, Vol. 67 No. 1, pp. 241-293.

Lowden, K., Hall, S., Elliot, D. and Lewin, J. (2011), “Employers’ perceptions of the employability skills of new graduates”, research report, Edge Foundation, London, available at: www.educationandemployers.org/wp-content/uploads/2014/06/employability_skills_as_pdf_-_final_online_version.pdf (accessed February 21, 2019).

Luk, L., Ho, R., Yeung, C. and Chan, C. (2014), “Engineering undergraduates’ perception of transferable skills in Hong Kong”, IN-TED 2014 Proceedings of the 8th International Technology, Education and Development Conference in Valencia, IATED, Spain, March 10–12, pp. 796-802.

Munro, D., MacLaine, C. and Stuckey, J. (2014), “Skills-where are we today? The state of skills and PSE in Canada”, research report, Conference Board of Canada, Ottawa, November, available at: http://nacc.ca/wp-content/uploads/2014/11/6603_skills-whereareweat-rpt.pdf (accessed February 21, 2019).

Ontario Universities Council on Quality Assurance (2010), “Appendix 1: OCAV’s undergraduate and graduate degree level expectations”, available at: http://oucqa.ca/framework/appendix-1/ (accessed February 21, 2019).

Pegg, A., Waldock, J., Hendy-Isaac, S. and Lawton, R. (2012), Pedagogy for Employability, The Higher Education Academy, Heslington, available at: www.heacademy.ac.uk/system/files/pedagogy_for_employability_update_2012.pdf (accessed February 21, 2019).

Pretti, T.J. and Fannon, A. (2018), “Skills articulation and work-integrated learning”, in Deller, F., Pichette, J. and Watkins, E.K. (Eds), Driving Academic Quality: Lessons from Ontario’s Skills Assessment Projects, Higher Education Quality Council of Ontario, Toronto, pp. 107-122.

Pulakos, E.D. and Schmitt, N. (1995), “Experience-based and situational interview questions: studies of validity”, Personnel Psychology, Vol. 48 No. 2, pp. 289-308.

Rodgers, C. (2002), “Defining reflection: another look at John Dewey and reflective thinking”, Teachers College Record, Vol. 104 No. 4, pp. 842-866.

Smith, E. and Gorard, S. (2005), “‘They don’t give us our marks’: the role of formative feedback in student progress”, Assessment in Education: Principles, Policy & Practice, Vol. 12 No. 1, pp. 21-38.

Strachan, L. (2016), “Teaching employability skills through simulation games”, Journal of Pedagogic Development, Vol. 6 No. 2, pp. 8-17.

Stuckey, J. and Munro, D. (2013), “The need to make skills work: the cost of Ontario’s skills gap”, research report, Conference Board of Canada, Ottawa, June, available at: www.collegesontario.org/Need_to_Make_Skills_Work_Report_June_2013.pdf (accessed February 21, 2019).

Tomasson Goodwin, J. and Lithgow, K. (2018), “ePortfolio, professional identity, and twenty-first century employability skill”, in Eynon, B. and Gambino, L.M. (Eds), Catalyst in Action: Case Studies of High-Impact ePortfolio Practice, Stylus Publishing, Sterling, VA, pp. 154-171.

Vygotsky, L.S. (1978), Mind in Society: The Development of Higher Psychological Processes, Harvard University Press, Cambridge, MA.

Weingarten, H.P. (2016), “Postsecondary education and jobs: it’s a question of skills”, available at: http://blog-en.heqco.ca/2016/11/harvey-p-weingarten-postsecondary-education-and-jobs-its-a-question-of-skills/ (accessed February 21, 2019).

World Economic Forum (2018), “The future of jobs report 2018”, available at: www.weforum.org/reports/the-future-of-jobs-report-2018 (accessed February 21, 2019).

Acknowledgements

The authors wish to acknowledge the collaboration of the 17 colleagues who integrated into their courses the teaching and learning intervention that forms the basis of the quantitative data in this study. The authors would also like to thank their colleague Jennifer Roberts-Smith for her seminal work on the assessment rubric, which the authors relied on to assess student survey responses.

Corresponding author

Jill Tomasson Goodwin can be contacted at: jtomasso@uwaterloo.ca
