Human Skills as Essential Skills: Preparing Job Seekers Who Were Skilled through Alternative Routes for Inclusion in the Future Economy


Article citation: 2021 EPIC Proceedings pp 32–42, ISSN 1559-8918, https://www.epicpeople.org/epic


INTRODUCTION

Pundits, policy-makers, and ordinary people alike have recognized that the landscape of industries and work has changed rapidly, and that coming advances in technology and automation will eliminate many jobs and fundamentally change others (Lamb 2016). Since the 1990s, the Government of Canada has worked to define the Essential Skills (ES) broadly necessary for people in the workplace to fulfill their personal and economic potential throughout their lives. In anticipation of these shifts, policy-makers must urgently address the question: how might we effectively deliver programs that upskill and re-train adults to be ready to take on new roles and careers?

Research suggests that having a place in the economy of the future will increasingly rely on people having, and being able to learn, “soft” skills such as communication and collaboration (Heckman et al. 2012; Berget et al. 2017; Conference Board of Canada 2020). Soft skills are shown to be the most transferable skills across jobs and play a significant role in people’s ability to keep jobs (Rudolph et al. 2017; OECD 2020). In a world where machines and computing are increasingly sophisticated, this emphasis on the importance and relevance of social-emotional skills signals that participation in the labour market will increasingly rely on the skills that make us uniquely capable as humans.

In recognition of this, in 2019, The Government of Canada refreshed its Essential Skills framework, now named “Skills for Success,” to include the social-emotional skills of communication, collaboration, creativity, and adaptability alongside the pre-existing hard skills of literacy, numeracy, and digital literacy (OLES, 2020).

Blueprint, a Toronto-based research organization that works with policy-makers to generate evidence on which services and programs work for Canadians, was awarded a contract to implement and evaluate a jobseeker upskilling program rooted in the new Essential Skills framework. Blueprint partnered with Douglas College, a specialist in Essential Skills curriculum, and WorkBC, the Province of British Columbia’s Employment Services program. Douglas College developed a 6-week Essential Skills program called “Amplify,” with a plan to deliver the model to ~1,500 people in the WorkBC system over three years, through 2023. The goal of the project was to demonstrate how a short-term Essential Skills program could help jobseekers advance into post-secondary training and job placements that lead to long-term, sustainable employment.

RESEARCH APPROACH

Innovation in the public sector often calls for iteration, but in practice the nature of grant funding often requires programs to set a delivery plan and budget from the outset, define anticipated outcomes, and generate evidence on those pre-defined outcomes using quantitative measures of impact. Like most large-scale public sector demonstration projects, the original research approach was to focus purely on putting a model into the field and conducting an implementation and impact assessment of that model. However, although they are the gold standard for many publicly-funded projects, impact assessments such as Randomized Controlled Trials (RCTs) are expensive; they are ‘one and done,’ meaning their findings are taken at face value by decision-makers; and they often have trouble pinpointing a nuanced ‘why’ behind the outcomes they measure, making it challenging to know what to address in future delivery (Pearce et al. 2014; NESTA 2018; OECD 2017). As the field of evidence generation for public policy continues to evolve, there has been a recognition that “RCTs or quasi-experiments may work well when there is a simple intervention that can be tested. However, rarely do we have such simple interventions. In the complex world of social policy, it’s unlikely that your programme is the necessary or sufficient condition for success. It’s likely to be just one factor among many, part of a ‘causal package’” (OECD 2017).

In recognition of this, Blueprint worked with the funder early on to expand the project to incorporate testing and iterating the model after the initial roll-out, in order to anticipate and mitigate potential challenges and increase the program’s chances of success. With this goal in mind, the Blueprint team focused on understanding whether the design was working, for whom, and in what contexts before starting to assess impact. The team also built a defined “refresh” period into the work-plan, during which design iterations could be made and tested before a stabilized model was re-deployed and measured for impact. Scoping the project to include this more comprehensive approach to evidence generation was a significant shift from how demonstration projects to inform policy choices have historically been carried out (Pearce et al. 2014; OECD 2017).

The Amplify program that was studied serves a wide range of jobseekers with diverse backgrounds: adults who worked in the service or hospitality industry for decades and were recently laid off due to the COVID-19 pandemic; single mothers, many of them survivors of violence, seeking to re-enter the workforce; immigrants to Canada with large gaps in their early educational history; individuals with chronic illnesses and disabilities who feel they have finally managed their condition enough to re-enter the workforce; and many more. Participants also varied widely in their skill levels: some had used programs like Microsoft Excel daily in their former jobs, while others were learning to use a computer for the very first time.

To answer the question of how the Amplify program could best set these jobseekers up for success, the design researchers built an approach to understand both the experience of participating in the program and how Amplify fit into jobseekers’ overall employment journeys. The team agreed to expand the original project scope in two key ways. First, instead of collecting data only from Amplify jobseekers, the research team would also conduct research with WorkBC case managers and the Amplify instructors/delivery team. Second, instead of focusing solely on the in-program experience, the research would seek to understand jobseekers’ lives, barriers, needs, and emotions, as well as case managers’ overall roles and longer client histories.

With this broadened approach, the research team designed two phases of work to precede the impact assessment. Phase 1 followed Amplify’s evolution over three cohorts at two delivery sites (a total of 6 classes). This phase culminated in a set of participatory co-design sessions with the implementation team, in which insights were shared back and redesign opportunity areas were identified. Phase 2 will begin after Amplify is iterated and re-deployed. In Phase 2, the design researchers will again interview jobseekers, case managers, and implementers, following two cohorts at four delivery sites (a total of 8 classes) to understand whether the iterations improved experience and outcomes. Once the Amplify model is stabilized, the design researchers will pass the torch for the quantitative impact assessment to begin.

Ethnographic In-Depth Interviews (IDIs) were conducted before and after each cohort of the program with 2-3 jobseekers who volunteered, 2-3 case managers who referred their clients, and all Amplify facilitators. When more jobseekers volunteered than were needed, selection was based on ensuring diversity across gender, Essential Skill level, employment barriers, and newcomer status.

Phase 1 research was conducted from October 2020 to May 2021 and resulted in 56 interviews. Phase 2 will begin in September 2021, end in May 2023, and is anticipated to result in a further ~80 interviews. This case study details the findings and outcomes from Phase 1 of the project.

Two aspects of the research approach were key in helping to develop a rich body of evidence: longitudinal IDIs and early triangulation with real-time quantitative data.

Longitudinal IDIs

The longitudinal design of the research plan allowed us to understand the evolution of participants’ experiences and outcomes.

For the interviews with jobseeker participants, the researchers used participatory activities. Specific examples include a social network map and an activity entitled ‘the work I do,’ which captured how participants met needs such as housing, health care, and family responsibilities through a frame of agency and action rather than dependency and shame. Through these activities the researchers captured jobseekers’ motivations, how they balanced their responsibilities at home and outside of the program, their networks of social support, and their experiences with employment services thus far. The interviews with caseworkers and facilitators used semi-structured protocols that probed their decision-making processes for whom to refer to the program, their perceptions of the value of Essential Skills programming along a jobseeker’s employment journey, and their definitions of success for clients coming out of the program.

Several caseworkers made referrals for multiple cohorts, which allowed the team to follow how they refined their understanding of which clients stood to benefit most from the program, and how they began to form relationships with Amplify facilitators to better strategize and serve their clients. Likewise, as facilitators delivered successive cohorts, the pre- and post-interviews conducted throughout allowed the research team to understand how facilitators changed delivery based on the needs, barriers, and social dynamics of the jobseekers, and to chart how they developed increasingly close working relationships with caseworkers. We detail our insights in the following section.

Quantitative data on student outcomes

The research team triangulated the ethnographic research findings with real-time quantitative data on jobseekers’ outcomes. As part of the program’s impact measurement plan, several hard and soft skills assessments are administered at the start and end of each Amplify cohort to track change. The assessments measure jobseekers’ reading, numeracy, and digital literacy skills. Additionally, multiple assessments measure soft skills such as collaboration and communication, including a group task through which instructors observe and rate student behaviors and interactions.

The research team also felt it would be beneficial to collect demographic information on jobseekers enrolled in Amplify as a whole, and developed an end-of-program survey to capture that information alongside jobseekers’ opinions on their experience of the program. The survey allowed the research team to understand how demographic differences between the delivery sites shaped varying perspectives on the program. For example, it revealed that one delivery site consistently had cohorts of jobseekers who were older and more likely to speak English as a second language, an important contextual factor when analyzing particular experiences shared about challenges in the classroom.

KEY FINDINGS AND TAKEAWAYS

The ethnographic research expanded the team’s understanding of what it means to prepare someone for a new career in multiple ways. First, while the program aimed to improve people’s skills, participants felt the most meaningful gain was the change they saw in themselves and their potential. Second, while the program was designed as a standalone intervention to prepare people either for post-secondary education or for directly entering the job market, many participants left with remaining Essential Skills gaps or ambiguity about where to go next. Last, the desire for the program to be an efficient model that could flexibly serve a diverse group of jobseekers created tensions that were challenging to balance when delivering the curriculum to people with a range of needs and skill levels.

Developing self-confidence matters

A key aspect of the program’s definition of success at the outset was measurable improvement of skills, using a variety of assessments administered at the start and end of the program. However, early analysis of the assessment scores showed only modest average improvement across all skills. Based on the assessments alone, jobseekers were generally making minimal progress by participating in the program; some participants’ post-test scores even declined.

Yet a common theme across participants was that the assessment results did not reflect the amount of progress they had made. There is extensive literature on barriers to accurate assessment, such as test anxiety and testing environments (Lu & Sireci 2007; Cassady & Johnson 2002), and some of those barriers appeared in situations recounted to the researchers. But more important to the jobseekers we interviewed, the assessments could not represent the full meaning of what they got out of the program and how it impacted their lives. What mattered was that the Amplify program changed how they saw themselves. After the program, jobseekers felt more confident in their ability to learn and to do well in a structured school environment, and, when thinking about their overall employment journey, felt a groundbreaking sense of “I can do it.” Completing Amplify helped jobseekers shift from feeling scared by or avoidant of their next steps to feeling energized and motivated. Caseworkers likewise noted that in follow-up conversations after the program, their clients seemed transformed: they spoke more assertively, described themselves and their capabilities in more hopeful and positive ways, and were more diligent and proactive about moving forward on the steps in their action plans. For many caseworkers, these were significant changes in clients they had been working with for years. Stories as simple as “My client has sent me an email for the first time” were shared as revelatory.

Participants, caseworkers, and facilitators viewed success as a set of visible indicators of progress that they described as a dramatic increase in “self-confidence.” Self-confidence was the word most often used to describe new behaviors, skills gains, and attitudes, and it was held up as a critical ingredient in clients’ ability to take on the challenge of further upskilling and to manage the fear that comes with changing careers.

A desire to link the program more deeply into the larger employment journey

Designed as a 6-week program, Amplify was intended to be a quick and intensive on-ramp that would increase jobseekers’ Essential Skills enough to enable them to move on to further training or into sustained employment. However, at the end of the program, many jobseekers shared that their next step would primarily be to “continue to practice” the skills they were taught. While they felt mentally and emotionally ready to move forward, they still needed to develop their skills over a longer period than 6 weeks. For some, the program pushed them to learn how to use a computer for the first time. However, there are few other structured programs in the WorkBC system through which clients can continue to improve their skills. Jobseekers with low skill levels in particular shared that they felt in limbo after Amplify ended. Individuals typically only have the option of continuing their learning online in self-guided modules or finding free public programs, such as at a library. Yet because of their low skill levels, these jobseekers often do not have the tools and capabilities to learn without structured guidance and support. As one jobseeker optimistically asked, “Is there an Amplify 2?”

The question of “what next?” after Amplify came up more often as the program evolved. Caseworkers increasingly shared how much they valued facilitators’ feedback on a client’s performance to help them plan and determine next steps. Caseworkers felt that facilitators were a key resource for deeper learning about their clients: facilitators often spent more cumulative time with clients, observing their skills and how they interacted in training environments, than caseworkers ever had. Completing Amplify was perceived by both jobseekers and caseworkers to increase clarity on jobseekers’ future plans. Many jobseekers felt that gaining a greater sense of their skills and abilities should either validate a desired path forward or help them course-correct toward more practical options. However, those who did not come into the program with a strong idea of where to go next sometimes expressed feeling frustrated and ‘back at square one.’ Clients and caseworkers alike wanted the Amplify program to tie in more closely to the overall process of career navigation, and wanted more interventions waiting at the end of the program to support the interstitial space between ‘getting started’ and ‘being ready to train or work.’

Systems-level pressures shape who is referred

As the research team followed the experiences of facilitators who taught successive cohorts, it became clear that the composition of each cohort played a big role in what could be taught and covered in each iteration of the program. One facilitator described the experience of teaching each day as feeling like “surgery,” constantly trying to be precise about what parts to cut and what to include. Many participants felt that the program’s design was stretched to serve everyone, and that there was not enough time for the learning to be as deep as they needed it to be. Participants believed they could have gotten more out of a program that grouped them with more similar peers.

From the engagement with facilitators and caseworkers, the research team came to understand how and why this outcome was normalized. Because the Amplify program is delivered through WorkBC, the Province of British Columbia’s employment services system, it inherits the system’s incentives and challenges. WorkBC currently operates on a pay-for-performance model in which employment service centers receive government funding based on the enrollment and completion of services. This creates pressure to serve enough clients per year and to fill cohorts of jobseekers at each delivery site in order to pay staff wages, which facilitators shared discourages them from being selective about whom to admit to the program.

Further, eligibility limitations for other programs within the employment system meant that some clients were referred to Amplify for reasons beyond improving their skills. Most commonly, jobseekers seeking funding for further post-secondary education were referred to the program as a way for case managers to bolster the case for approval of a client’s training package. Additionally, some jobseekers were ineligible for other programs within the employment services system (e.g., immigrants to Canada who have obtained Canadian citizenship are ineligible for free language training programs) and so were enrolled in Amplify even though it might not be the best fit for the skills they needed to focus on.

Finally, because past resources for Essential Skills programs in BC have been inconsistent, many caseworkers did not have a strong grasp of the function and role of Essential Skills programs. Many caseworkers thus perceived “Essential Skills” as fuzzy and felt they needed more clarity on the program’s contents, support with explaining the program’s benefits, and time to make appropriate referrals. As a result, each cohort of the Amplify program reliably became a “catch-all” for jobseekers with a very wide range of skill levels and very diverse needs.


