
ESS Methodological Report

2016 Employer Satisfaction Survey Methodology Report
May 2017

Report prepared for: The Australian Government Department of Education and Training
Report prepared by: Sonia Whiteley, Rebecca Bricknall, Daniela Iarossi, Jayde Grisdale
The Social Research Centre
Level 9, 277 William Street
MELBOURNE VIC. 3000
Tel: (613) 9236 8500
Fax: (613) 9602 5422
Email: [email protected]

Contents

1. Introduction
   1.1. Overview
   1.2. Background to the Employer Satisfaction Survey
   1.3. About the ESS
2. Overview of the ESS
   2.1. Summary
   2.2. Mitigating potential sources of error
3. Survey Establishment
   3.1. Approach to the ESS
   3.2. The Employer Satisfaction Questionnaire (ESQ)
   3.3. External Engagement
   3.4. CATI survey
   3.5. Online Survey
4. Sampling
   4.1. Approach to sample creation
   4.2. Sample maximisation activities
   4.3. Characteristics of available sample records
5. Data collection
   5.1. Fieldwork overview
   5.2. ESS helpdesk
   5.3. Progress reporting
6. Data Processing
   6.1. Definition of the analytic unit
   6.2. Data cleaning and preparation
   6.3. Response rate
   6.4. Item level non-response
   6.5. Respondent characteristics
7. Summary of issues for future surveys
Glossary
Appendix 1 ESQ
Appendix 2 ESQ Screenshots
Appendix 3 ESS Recruitment Versions
Appendix 4 Item Level Non-Response
Appendix 5 Participating Institutions

List of figures

Figure 1 CATI completes by number of times contacted
Figure 2 Response rate across provider type and collection period

List of tables

Table 1 ESS project overview
Table 2 Potential sources of survey error relevant to the ESS
Table 3 Key ESS schedule dates
Table 4 ESQ module themes
Table 5 GAS-E items
Table 6 CATI completes by day of week
Table 7 Graduate response to request for ESS supervisor details
Table 8 Graduate reasons for refusal of ESS supervisor details
Table 9 Recruitment text trial and contact information provision rate
Table 10 Supervisor detail collection refusal conversion emails
Table 11 Supervisor information collection outcomes
Table 12 Requested contact from the Social Research Centre
Table 13 Entered invalid supervisor details that required CATI follow up
Table 14 Graduate demographic profile
Table 15 Graduate higher education characteristics
Table 16 Broad field of education of graduates
Table 17 Workflow across email and CATI
Table 18 ESS dates by mode
Table 19 ESS sample outcomes
Table 20 Reason for supervisor contact
Table 21 Items coded and action taken
Table 22 Workflow across email and CATI
Table 23 Sample utilisation
Table 24 CATI outcomes
Table 25 Email outcomes
Table 26 Item non-response
Table 27 Graduate demographic profile
Table 28 Graduate higher education characteristics
Table 29 Broad field of education

List of abbreviations

AAGE  Australian Association of Graduate Employers
AAIR  Australasian Association for Institutional Research
ABS  Australian Bureau of Statistics
AGS  Australian Graduate Survey
AITSL  Australian Institute for Teaching and School Leadership
AMSRS  Australian Market and Social Research Society
ATSI  Aboriginal and Torres Strait Islander
CATI  Computer Assisted Telephone Interviewing
ESS  Employer Satisfaction Survey
ESQ  Employer Satisfaction Questionnaire
GOS  Graduate Outcomes Survey
GOQ  Graduate Outcomes Questionnaire
HEIMS  Higher Education Information Management System
ISO  International Standards Organisation
NAGCAS  National Association of Graduate Careers Advisory Services
NHMRC  National Health and Medical Research Council
QA  Quality Assurance
QILT  Quality Indicators for Learning and Teaching
TSE  Total Survey Error
WRC  Workplace Research Centre

1. Introduction

1.1. Overview

This report documents the technical aspects of the 2016 Employer Satisfaction Survey (ESS), conducted on behalf of the Australian Government Department of Education and Training (the Department).
It seeks to:

• document and review the survey processes
• consolidate project documentation and assorted reports generated throughout the survey period
• review methodological and operational procedures with a view to informing future survey iterations.

The appendices attached to this report contain core survey administration materials, including the questionnaire and the invitation and reminder communications.

1.2. Background to the Employer Satisfaction Survey

The ESS is the newest and most innovative component of the Quality Indicators for Learning and Teaching (QILT) program. While there has been a national higher education graduate survey for more than thirty years, individual institutions have struggled to implement their own employer surveys, let alone consider collecting national feedback at an employer level.

The Employer Satisfaction Survey was originally developed and pilot tested by the Workplace Research Centre (WRC) at the University of Sydney. It was designed to measure employer perceptions of the readiness of graduates to enter the workplace. Based on feedback and observations from the 2013 ESS Pilot, it was apparent that there were operational and conceptual issues that still required attention. The main operational issue to be overcome related to 'building' an appropriate sample pool of employer contact details to support robust survey estimates. Graduates are wary about sharing their employer's email address or telephone number, primarily due to concerns that their supervisor will be commenting on their work performance rather than providing feedback on the extent to which their higher education studies have prepared them for employment. The 2013 ESS Pilot also found that while the Employer Satisfaction Questionnaire (ESQ) displayed a number of useful properties, additional work was required to refine and improve the instrument.

To address the identified operational and conceptual issues, two additional pilot tests were undertaken in 2014 and 2015. The two pilot test phases for the ESS comprised a:

• methodological pilot: to test different approaches (drawing upon learnings from the 2013 ESS Pilot) to asking Australian Graduate Survey (AGS) respondents to provide their supervisor contact details for the ESS
• content pilot: to review the ESS item wording and coverage against current policy and thinking.

Improvements were made to the ESQ and the research methodology, and these were implemented as part of the first national rollout of the ESS in November 2015.

1.3. About the ESS

The ESS is administered under the QILT survey program, commissioned by the Australian Government Department of Education and Training, using sample generated through the Graduate Outcomes Survey (GOS). The ESS involved two rounds of data collection with supervisors of recent graduates, commencing in November and May each year. A small supplementary round was conducted in February to support higher education institutions with trimester calendars. The ESS is open to graduate employers from universities and non-university higher education institutions (NUHEIs). Data is aggregated and reported after the completion of the May round.

2. Overview of the ESS

2.1. Summary

Table 1 contains an overview of the main elements of the 2016 ESS.
Table 1 ESS project overview

| | Total | November 2015 | February 2016 | May 2016 |
| Number of supervisors approached1 | 6,451 | 1,724 | 177 | 4,550 |
| Number of completed surveys | 3,061 | 765 | 75 | 2,221 |
| Overall response rate | 47.5% | 44.4% | 42.4% | 48.8% |
| Data collection period | 2015–2016 | November 2015 – February 2016 | February – April 2016 | May – July 2016 |
| Data collection mode | Online and CATI | | | |
| Analytic unit | Supervisor | | | |

1 Excludes opt outs and out of scope surveys.

2.2. Mitigating potential sources of error

SRC approaches quality assurance for survey research from a Total Survey Error (TSE) paradigm1. This approach is further informed by the quality dimensions outlined in the ABS' Data Quality Framework (i.e. Relevance, Timeliness, Accuracy, Coherence, Interpretability and Accessibility). The TSE approach identifies all potential sources of error in the design, collection, processing and analysis of survey data and provides a theoretical and practical framework for optimising survey quality within given design parameters. This understanding of Total Survey Error has enabled SRC to design a Total Survey Quality Framework which allows us to address known quality issues at every stage of the survey cycle.

The main sources of error affecting survey accuracy include specification errors (e.g. misinterpretation or misunderstanding of project aims and objectives), sampling frame errors and omissions (e.g. gaps, biases and inaccuracies in the sampling frame), sampling error (e.g. biases in the respondent selection or sub-sampling routines), measurement error (e.g. questionnaire design errors, interviewer errors, respondent errors), non-response error (both unit-level and item-level non-response) and data processing errors (e.g. errors in data editing, coding, weighting or the creation of data files or tables). SRC has had an accredited ISO Quality Assurance scheme in place for over five years which addresses each of these points in the survey cycle. Our QA system documents all responsibilities, authorities, procedures and the corrective action process. All key junctures of the project process are covered by the quality system, with a particular focus on project scoping (to reduce specification error), pre-field checks (to reduce measurement error as a result of questionnaire design), output auditing (to reduce data processing errors) and project review (consistent with a continuous improvement approach towards limiting total survey error).

Additionally, this research has been undertaken in accordance with the Privacy Act (1988) and the Australian Privacy Principles contained therein, the Privacy (Market and Social Research) Code 2014, the Australian Market and Social Research Society's (AMSRS) Code of Professional Practice, and ISO 20252 standards.

The ESS presents a range of challenges from a TSE perspective. The most significant relates to the sample frame, as there is no known way of accessing a consolidated list of supervisors of recent graduates from Australian higher education institutions. Previous surveys have relied on general surveys of employers and asked generic questions about graduate preparation and employability. The ESS seeks to make a direct link between the graduate, the institution and the employer, which adds an unprecedented level of complexity to the creation of the sample for the ESS.

1 Paul B. Biemer, Total Survey Error: Design, Implementation and Evaluation. Public Opinion Quarterly, Volume 75 No 5, Special Issue, 2010.
With that in mind, a number of areas were identified as potential sources of error to be remediated, and the TSE framework was aligned with current practices within the Social Research Centre, as outlined in Table 2.

Table 2 Potential sources of survey error relevant to the ESS

Coverage error – in-scope population inaccurately represented in the sample frame
Context: There is no known sample frame for supervisors of recent graduates. The ESS relies on graduates completing the GOS to generate the sample frame for graduate supervisors. Current analyses show that there are areas of over and under coverage, with some graduates more likely to provide contact details than others.
Mitigation strategy: The sample frame will become more representative of the population as graduates become more used to providing contact information for their supervisors. Further investigation is required to determine if there are specific strategies that, for example, encourage those who have completed an undergraduate qualification to support the recruitment of supervisors.

Sampling error – incorrect supervisors selected to participate
Context: All supervisors nominated by graduates are eligible to participate. Screening questions confirm that the employer is the graduate's current supervisor.
Mitigation strategy: Overall, out-of-scope supervisors constituted less than 1.5 per cent of the available sample. This indicates that employers nominated into the ESS sample are the current supervisors of the graduates.

Non-response error – unit level non-response
Context: Length of the survey or a general lack of engagement may result in unacceptably high levels of non-response by supervisors.
Mitigation strategy: The survey is intentionally less than ten minutes in length and can be easily stopped and resumed if the supervisor is interrupted. Supervisor engagement strategies will be developed to support future collections when the first round of ESS data becomes publicly available.

Non-response error – item-level non-response
Context: Employers may skip items that they feel are irrelevant, unimportant or too sensitive.
Mitigation strategy: Item level non-response was minimised by ensuring that all questions were directly relevant to graduate supervisors.

Validity – questionnaire fails to measure the relevant constructs
Context: The survey is relatively new and has only been tested in a limited number of contexts.
Mitigation strategy: Attitudinal scales were validated using Rasch modelling and factor analysis, which suggested that the scales and the majority of items are functioning as expected across the complete range of discipline areas.

Measurement error – poor questionnaire design
Context: The layout or structure of the questionnaire could lead to inaccurate or incomplete responses.
Mitigation strategy: The ESQ underwent cognitive testing, a trial in 2015, and was also independently reviewed by the QILT Working Group.

Measurement error – mode of data collection
Mitigation strategy: The ESS can be completed via telephone or online to provide maximum flexibility and minimise burden on supervisors. Mode effects will be modelled to assess whether any bias is introduced in relation to key reporting metrics.

Processing error – inadequate validation checks
Context: With any new survey that does not have established procedures regarding data production, there is the possibility of introducing error.
Mitigation strategy: Core data files are independently validated as the data is extracted from the data collection system, when the data is cleaned and finalised, by the research team, and by the Department prior to distribution.

Processing error – coding errors or inconsistent coding of open-ended responses
Context: There are a number of detailed items in the ESS relevant to the labour force that require accurate coding.
Mitigation strategy: Items were coded to ABS approved code frames such as ANZSCO and ANZSIC for occupation and industry, where possible. Existing ISO procedures ensured that all coding was executed consistently with a very low error rate.

3. Survey Establishment

3.1. Approach to the ESS

The ESS is administered in parallel with the GOS. The first collection period for the 2016 ESS reporting year took place in November 2015 and the second in May 2016. A supplementary round occurred in February 2016 to accommodate a small number of institutions that offer trimester rather than semester study periods. February 2016 outcomes were combined with the November 2015 round throughout this report, with the exception of main fieldwork dates and technical aspects, which are listed separately. Computer Assisted Telephone Interviewing (CATI) is the primary mode of collection for the ESS, with online collection a secondary mode, and, unlike the GOS and the Student Experience Survey (SES), completed telephone surveys are included in the nationally reported data. Key schedule dates for the 2016 ESS are outlined in Table 3.

Table 3 Key ESS schedule dates

| Schedule milestone | November collection | February collection | May collection |
| ESS Webinar | 20 April 2015 | – | – |
| ESS survey open | 9 November 2015 | 17 February 2016 | 6 May 2016 |
| ESS survey closed | 19 February 2016 | 22 April 2016 | 22 July 2016 |

3.2. The Employer Satisfaction Questionnaire (ESQ)

3.2.1. Development of the ESQ

The 2013 ESS Pilot noted that while the ESQ was informed by a review of the literature, the instrument was largely based on items from an employer survey conducted by one institution. As such, it was recommended that additional work should be undertaken to improve and refine the instrument. More specifically:

• factor analysis suggested some sets of items did not cluster in a manner that was sufficient to create a coherent scale
• not all items seemed to be regarded as equally important by employers from all fields of education
• further cognitive testing of the clusters and the items was required.

Data from the 2013 ESS Pilot was used to examine the psychometric properties of the six sets of graduate attributes that were thought to be regarded as important by employers. Findings from this supplementary analysis of the data indicated that:

• the Graduate Attributes Scale – Employer (GAS-E) functions adequately as an overall scale, but there was little evidence that the sub-scales (sets of attributes) formed distinct, cohesive sets of items
• the GAS-E sub-scales suffer from a lack of measurement precision due to the absence of items targeted at employers who exhibited high levels of satisfaction
• the four-point rating scale performed adequately; however, there were large gaps between some of the categories, suggesting that a mid-point was required to support employers to provide more accurate ratings
• most items met the assumptions of the Rasch model; however, they could be improved through review and rewording, particularly in relation to those items that were answered unpredictably.

These issues were addressed and tested in the 2015 pilot test of the ESQ.
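By way of illustration, the factor-analytic check described above can be sketched in a few lines. This is a minimal sketch only: the input file, the item-label prefix and the number of factors are assumptions made for the example, not details of the pilot analysis, and Rasch modelling would be carried out separately with specialised software.

```python
# Illustrative sketch: do the GAS-E items cluster into coherent sub-scales?
# Assumes a CSV of pilot item responses (one row per employer); file and
# column names are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

responses = pd.read_csv("ess_pilot_items.csv")
items = [c for c in responses.columns if c.startswith("eg")]   # GAS-E item columns

# Fit an exploratory factor model with one factor per hypothesised attribute set
# (five used here purely for illustration).
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
fa.fit(responses[items].dropna())

# Items that load strongly on several factors, or on none, are evidence that
# the sub-scales do not form distinct, cohesive sets of items.
loadings = pd.DataFrame(fa.components_.T, index=items,
                        columns=[f"factor_{i + 1}" for i in range(5)])
print(loadings.round(2))
```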
The current instrument includes an updated GAS-E with improved measurement precision and a revised rating scale for all attitudinal items.

3.2.2. Operationalising the ESQ

In keeping with the other surveys in the QILT suite, the core design of the ESQ was modular, to support a flexible and responsive approach to future implementations of the surveys. Modules can be modified or retired without unduly impacting on the overall structure or flow of the ESQ. Table 4 outlines the thematic areas of the six ESQ modules.

Table 4 ESQ module themes

| Module | Themes |
| Module A | Introduction and screening |
| Module B | Overall graduate preparation |
| Module C | Graduate Attributes Scale – Employer |
| Module D | Emerging policy issues |
| Module E | Discipline or institution specific issues |
| Module F | Close |

The content of each of the ESS modules is outlined below.

Module A: Introduction and screening
This module confirmed the supervisor's role in relation to the graduate to ensure that they are in scope for the ESS. Additional information was collected regarding their awareness of the graduate's qualification and awarding institution, as well as the occupations of the graduate and the supervisor.

Module B: Overall graduate preparation
In this section, supervisors were asked about the importance of the qualification in order for the graduate to do their job, whether the qualification was a requirement for the position, and the ways in which the qualification prepared the graduate for employment. Based on their experience with the graduate, supervisors also indicated the extent to which they would consider hiring another graduate from the same institution. Two open-ended items were included in this module:

• the main ways the institution prepared the graduate for their current employment, and
• the main ways the institution could have better prepared them for their current employment.

Module C: Graduate Attributes Scale – Employer
The Graduate Attributes Scale – Employer (GAS-E) measures the extent to which supervisors agreed the graduate was prepared for employment across each of the GAS-E domains (see Section 3.2.3). Three GAS-E subscales are also administered to graduates as part of the GOS and form the basis for the Graduate Attributes Scale – Graduate (GAS-G).

Module D: Emerging policy issues
Module D is a placeholder for any questions the Australian Government wishes to ask of graduate supervisors. This data is not available in national data files.

Module E: Discipline and institution specific issues
Module E is a placeholder for any discipline (e.g. engineering, accountancy) or institution specific issues relevant to graduate supervisors. This data is not available in national data files.

Module F: Close
This module asks whether supervisors would like feedback about the survey and whether they wish their institution to be acknowledged on the QILT website as a supporter of the survey.

3.2.3. Graduate Attributes Scale – Employer

The basis for the Graduate Attributes Scale – Employer (GAS-E) was developed as part of the original 2013/14 trial of the Employer Satisfaction Survey. The project team synthesised frameworks relevant to the skills of university graduates and identified a number of general graduate attributes. The GAS-E has been designed to assess common rather than specific graduate attributes, within a limited workplace context. The items were further tested and refined during a 2015 trial of the instrument.
Five graduate attribute domains have been identified:

• Foundation skills – general literacy, numeracy and communication skills and the ability to investigate and integrate knowledge.
• Adaptive skills – the ability to innovate, adapt and apply skills/knowledge and work independently.
• Collaborative skills – teamwork and interpersonal skills.
• Technical and professional skills – application of professional and technical knowledge and standards.
• Employability and enterprise skills – the ability to perform and innovate in the workplace.

Table 5 lists the items associated with each GAS-E domain.

Table 5 GAS-E items

| Domain | GAS-E item labels | # of items |
| Foundation skills | egfound1 to egfound8 | 8 |
| Adaptive skills and attributes | egadapt1 to egadapt6 | 6 |
| Teamwork and interpersonal skills | egcollb1 to egcollb5 | 5 |
| Technical and professional skills | egtech1 to egtech6 | 6 |
| Employability and enterprise skills | egemply1 to egemply8 | 8 |

3.3. External Engagement

3.3.1. Promotion of the ESS

Promoting the ESS involved reaching out to both graduates and employers through institutions and various peak bodies. Institutions were supplied with flyers to hand out at graduation ceremonies, encouraging graduates to provide their supervisor details when responding to the GOS. A number of peak bodies, including AAGE, AITSL and NAGCAS, were also made aware of the ESS through conferences, webinars and meetings. There was no explicit advertising of the ESS, due to the small proportion of graduates supplying supervisor details and concerns about diluting branding around the GOS, which was being heavily promoted at that time.

3.3.2. Participating Institutions

Employed graduates of institutions that took part in the 2016 GOS were eligible to provide contact information for their supervisors if they were not self-employed or working in a family business. As such, the participating institutions for the ESS were identical to the participating institutions for the GOS. The November round GOS included graduates from 37 Table A institutions, three Table B institutions and 32 NUHEIs. The May 2016 round included graduates from 37 Table A institutions, three Table B institutions and 52 NUHEIs. Two Table A institutions, one Table B institution and one NUHEI participated in the supplementary February 2016 round of the GOS. As the collection of ESS sample relied on employed graduates giving supervisor details, some institutions with small sample sizes did not have any graduates supply valid supervisor details, so participation in the GOS did not necessarily mean an institution had the opportunity to participate in the ESS. For a full list of institutions that had graduates provide valid supervisor details and the supervisor complete the ESS, refer to Appendix 5.

3.3.3. Privacy and confidentiality

The ESS is conducted within the ethical guidelines laid out in the Australian Code for the Responsible Conduct of Research2.
All data collection for the ESS was undertaken in accordance with ISO 20252 standards, the AMSRS code of practice, the Privacy (Market and Social Research) Code, and the Australian Privacy Act. All third-party data collected in the GOS was entered into the ESS workflow within two weeks of receipt, in accordance with the Privacy Code. This delay was due to the need to clean the sample provided, upload it to the central database, and allocate appropriate resources within the telephone workflow.

The Social Research Centre had a number of measures in place to ensure the privacy of individuals and compliance with both Commonwealth and industry standards on secure data storage and management, including limited access to information for project team members and any third-party vendors, multi-level passwords on all electronic storage systems, the use of a secure file exchange to distribute sample and data files between the Social Research Centre and institutions, and ensuring that all servers used in the execution of the online survey complied with Australian privacy standards. In terms of online privacy management, the Social Research Centre deployed commercial UTM (Unified Threat Management) devices to protect the internal network from the public network. These devices provided firewall protection, intrusion protection, virus scanning, online content filtering and management of multiple WAN connections. Additionally, all Social Research Centre staff involved in the 2016 ESS (including helpline operators) entered into a project-specific Deed of Confidentiality.

2 National Health and Medical Research Council and Universities Australia, 2007, www.nhmrc.gov.au/guidelinespublications/r39

3.3.4. Institution Specific Items

In keeping with QILT survey processes, institutions were able to add additional questions. In 2016, one institution added an additional question to the ESS.

3.4. CATI survey

The CATI ESS survey was administered in an identical format to the online ESS. Interviewers had an interfacing script at the front and back ends of the survey which allowed call outcomes to be categorised. Once agreement to complete the survey was established, the interviewers initiated the online survey. The non-mandatory nature of the ESQ items allowed responses to items to be skipped if requested by the supervisor.

3.4.1. Call procedures

Call procedures for telephone non-response follow-up for the 2016 ESS featured:

• call attempts placed over different days of the working week and times of day
• placing a second call attempt to 'fax/modem' and 'number disconnected' outcomes (given that there are occasionally issues with internet connections and problems at the exchange)
• providing login details if supervisors preferred to complete online, rather than complete a telephone interview.

The majority of CATI surveys were completed earlier in the week (see Table 6). There was a very small number of requests for weekend appointments.

Table 6 CATI completes by day of week

| Day of week | % of CATI completes |
| Monday | 20.6 |
| Tuesday | 22.2 |
| Wednesday | 21.0 |
| Thursday | 19.5 |
| Friday | 15.4 |
| Saturday/Sunday | 1.2 |

Figure 1 shows that the majority of CATI completes occurred within the first five call attempts (77.3%), in particular on attempts one (18.8%) and two (22.8%).
A further one in five completes required more than five calls before the supervisor participated in the ESS, indicating the need for an extensive CATI workflow when approaching employers.

Figure 1 CATI completes by number of times contacted
[Bar chart: percentage of CATI completes by number of times tried, from one to 10+.]

3.4.2. Interviewer team briefing and quality control

All interviewers selected to work on the ESS attended a comprehensive briefing session delivered by the SRC project management team. Briefings were conducted on 4 November 2015, 9 February 2016 and 1 June 2016. The briefing covered the following aspects:

• survey context and background
• survey procedures (sample management protocols, response rate maximisation procedures)
• privacy and confidentiality issues
• a detailed examination of the survey questionnaire, with a focus on ensuring the uniform interpretation of questions and response frames, and addressing item-specific data quality issues
• targeted refusal aversion techniques
• strategies to maintain co-operation (i.e. minimise mid-survey terminations)
• approaches to get past 'gatekeepers' (e.g. receptionists)
• comprehensive practice interviewing and role play
• a review of key data quality issues.

During the November 2015/February 2016 collection period, 20 interviewers worked on the project, with a core team of five undertaking the majority of calls. During the May 2016 collection period, 25 interviewers were briefed, with a core team of six undertaking the majority of calls. Validations were undertaken by remote monitoring, in accordance with ISO 20252 procedures.

3.5. Online Survey

The online survey could be accessed by clicking on the link in the email invitation or email reminders. Unlike the SES and GOS, due to the limited ESS sample frame there was no option to start the survey via the QILT website. Online survey presentation was informed by Australian Bureau of Statistics standards, accessibility guidelines and other relevant resources, with standard features including:

• mobile device optimisation
• sequencing controls
• input controls and internal logic checks
• use of a progress bar
• tailored error messages, as appropriate
• no vertical scrolling required, with long statement batteries split over several screens, as necessary
• recording panels for free text responses commensurate with the level of detail required in the response
• 'saving' with progression to the next screen
• capacity to save and return to finish at another time, resuming at the last question completed.

A copy of the generic survey instrument (i.e. excluding any institution specific items) and screenshots of the survey are included in Appendices 1 and 2.

4. Sampling

4.1. Approach to sample creation

The in-scope population for the ESS was supervisors of employed higher education graduates (excluding those who were self-employed or working in a family business) who completed the GOS. The Graduate Outcomes Questionnaire (GOQ) was used to create this sample frame through a recruitment module requesting supervisor contact details. We implemented a number of non-response maximisation activities to achieve the highest possible participation rate for the GOS, in order to maximise the number of ESS sample records. The ESS recruitment module in the GOQ contains a set of items aimed at graduates who were in paid employment in the week prior to their completion of the GOS.
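As a simple illustration of this in-scope rule, the routing condition for the recruitment module might be expressed as follows. The field names are hypothetical and stand in for the relevant GOQ employment items.

```python
# Sketch of the in-scope condition for the ESS recruitment module:
# graduates in paid employment in the week prior to completing the GOS,
# excluding the self-employed and those working in a family business.
def show_recruitment_module(gos_record: dict) -> bool:
    """Return True if the graduate should be asked for supervisor contact details."""
    return (
        gos_record.get("employed_last_week", False)
        and not gos_record.get("self_employed", False)
        and not gos_record.get("family_business", False)
    )

print(show_recruitment_module(
    {"employed_last_week": True, "self_employed": False, "family_business": False}
))  # True -> recruitment module is displayed
```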
Recruitment outcomes are shown in Table 7.

Table 7 Graduate response to request for ESS supervisor details

| Response to request | November (n=18,380) n (%) | February (n=1,681) n (%) | May (n=61,918) n (%) |
| I can provide their work contact details | 938 (5.1) | 100 (5.9) | 5,780 (9.3) |
| I can provide their contact information but I wish to log out of the survey and check their details first | 164 (0.9) | 3 (0.2) | 347 (0.6) |
| I can provide their contact information but I would like you to call me | 315 (1.7) | 23 (1.4) | 1,060 (1.7) |
| I do not wish to provide my supervisor's details* | 13,889 (75.6) | 1,271 (75.6) | 33,947 (54.8) |
| I would like more information before I provide my supervisor's details | – | – | 6,033 (9.7) |
| Skipped module | 285 (1.6) | 32 (1.9) | 4,882 (7.9) |
| Partial complete (did not complete up to ESS module) | 2,789 (15.2) | 252 (15.0) | 9,869 (15.9) |

* Includes graduates who were employed but opted not to respond to the recruitment module.

Previous pilot surveys and cognitive testing have shown that graduates are reluctant to provide information about their supervisors, particularly if they are concerned about their work performance being assessed in the early stages of their employment, or feel that their employment is not related to their qualification. To overcome these objections, the recruitment text was designed to alleviate these concerns, and a reason-for-refusal question was included at the end of the module to quantitatively measure any other objections to detail collection. All graduates who opted not to provide details were asked the refusal question (excluding graduates in employment who skipped the module). The most common reason given was concern that the supervisor did not have enough time to participate, followed by the explanation that the graduate's current job was only temporary or casual (see Table 8). There was a large proportion of other reasons for refusal across all three rounds, ranging between 17.7 and 20.0 per cent. These included graduates who were on secondment or on rotation between different areas of the business, as well as senior managers who did not feel the survey was relevant to their role.

Table 8 Graduate reasons for refusal of ESS supervisor details

| Reason for ESS refusal | November n (%) | February n (%) | May n (%) |
| I'm concerned that my supervisor does not have enough time | 5,424 (39.1) | 543 (42.4) | 13,329 (39.0) |
| My job is temporary only / casual only | 3,120 (22.5) | 226 (17.6) | 7,913 (23.2) |
| I do not have a direct supervisor | 1,349 (9.7) | 145 (11.3) | 3,721 (10.9) |
| I'm concerned about my supervisor's response | 626 (4.5) | 49 (3.8) | 1,468 (4.3) |
| I do not know the email address of my supervisor | 328 (2.4) | 29 (2.3) | 925 (2.7) |
| My supervisor does not have an email address | 34 (0.2) | 2 (0.2) | 82 (0.2) |
| Other (please provide details) | 2,820 (20.3) | 256 (20.0) | 6,038 (17.7) |
| I CAN provide their work email | 90 (0.7) | 7 (0.6) | 218 (0.6) |
| No reason provided | 98 (0.7) | 24 (1.9) | 461 (1.3) |

Our approach to maximising ESS recruitment involved a broad range of activities designed to ensure the quality and accuracy of recruitment, as described in the following section.

4.2. Sample maximisation activities

4.2.1. 'Recruitment' text modification

During the May GOS collection period, several versions of the ESS recruitment question were tested early in the fieldwork period to explore which was most effective at collecting supervisor details. Feedback from respondents indicated that the original recruitment text was too wordy, so a short text version was created.
There were also indications that respondents required reassurance about the nature of the questions their supervisor would be asked and the way in which the results would be reported. The three versions of the recruitment item randomly displayed to respondents were:

• Shortened text: summarised text explaining the ESS and the reasons graduates should give supervisor details, with much of the messaging reduced to dot points.
• Link to ESQ: included a link to a simplified version of the ESQ so the graduate could view the questions asked and be assured it was not a review of their on-the-job performance.
• Link to results: included a link to an example of the aggregated results reported, to assure confidentiality of responses.

As shown in Table 9, all three versions of the text elicited similar rates of detail provision, and a version of the question that included all three elements was introduced mid-way through fieldwork. A trial of the response frame was also introduced at this time, which removed the 'I don't want to provide my supervisor's contact information' option (more detail in Section 4.2.3).

Table 9 Recruitment text trial and contact information provision rate

| | Short text (n=5,762) | Link to ESQ (n=5,758) | Link to results (n=5,678) |
| Number of details provided | 465 | 438 | 410 |
| % of details provided | 5.8 | 5.5 | 5.1 |

This change to the response frame resulted in a large increase in the number of contact details provided; however, there was also a corresponding increase in the amount of review time required to confirm that the information supplied was useable. Of the 2,853 contact details collected from the adapted response frame, 42.9 per cent were accepted into the ESS workflow (41.5% of all respondents). This was a large increase from earlier in fieldwork, where 56.1 per cent of details were accepted, but from a very small proportion of the population (10.3%). See Appendix 3 for the example results, simplified questionnaire and ESS Fact Sheet.

4.2.2. Refusal conversion

In the November 2015 collection period, one university agreed to take part in a refusal conversion CATI workflow. Some institutions approached to participate in this process were reluctant to agree, as they were concerned that this activity was too intrusive when graduates had already chosen not to provide details. The process involved calling graduates who indicated that they were concerned their supervisor did not have enough time to complete the survey. A team of senior interviewers with experience in refusal conversion contacted these graduates to address commonly identified barriers to the provision of supervisor details. Generally, the refusal conversion approach consisted of assuring the graduate that:

• the ESS is a short survey
• the ESS was definitely not a review of the graduate's on-the-job performance
• employers are encouraged to take part even if the graduate's role is not related to the course they recently completed.

Of the 233 graduates included in this refusal conversion, 12.4 per cent provided their supervisor details during the call (n=29). Due to the low level of supervisor detail collection, the refusal conversion CATI workflow ended after two weeks in field.

During the May 2016 collection period, a refusal conversion email was trialled. This involved emailing a subset of those who chose not to provide supervisor details with specific messages relevant to their reason for refusal (shown in Table 10).
These reasons were selected as the most common objections inhibiting graduates from providing supervisor contact information. The email contained a link that took graduates directly to the supervisor details collection question in the main survey. The emails were not very effective, with only 0.2 per cent of graduates giving their supervisor details in response to the email (55 graduates out of 23,845 messaged). Due to this low level of response, it was not possible to further explore the effectiveness of the different messages outlined in Table 10.

Table 10 Supervisor detail collection refusal conversion emails

• I'm concerned that my supervisor does not have enough time — The online survey only takes 7 minutes to complete, and we know from previous research that employers are happy to be given the chance to complete the survey: it gives them the chance to provide their feedback on how an institution has prepared their employee for the workplace.
• My job is temporary only / casual only — Even if your work is not directly related to or impacted by the course you completed, we'd still like to hear from your supervisor. The survey is quite generic and not necessarily about specific or technical skills acquired through your course; it's more about workplace skills and how your institution prepared you for the workforce.
• I'm concerned about my supervisor's response — Please be assured that this survey is quite generic and is about workplace skills and how your institution prepared you for the workforce; it is not a review of your performance in any way.
• I do not know the email address of my supervisor — In order to give your employer an opportunity to provide feedback about how your institution and course prepared you for the workforce, we'd like to offer them the chance to complete the survey over the phone.

4.2.3. Response option modification

The response frame for the recruitment question originally contained a refusal option that stated 'I do not wish to provide my supervisor's details'. When this option was offered to graduates, close to 90 per cent selected it (see Table 11). Based on this observation and feedback from a follow-up question exploring the reason for refusal, it was confirmed that many graduates were opting to provide a 'soft refusal': for example, they thought their supervisor would not have the time to participate, or they did not know their supervisor's email address. As a result, and given that these questions were not mandatory, the refusal option was removed midway through the May 2016 data collection period, in order to reduce the number of graduates opting out of this process. The response frame included a new option, 'I would like more information before I provide my supervisor's details'. Selecting this option provided the graduate with further information on the purpose of the ESS and a link to more information (such as why the survey is important, the types of questions that will be asked and how the results will be reported). The respondent was then provided with the option to supply supervisor details, log out and check their supervisor's details, or ask a question prior to providing contact information.
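The impact of this change can be illustrated with a short, purely indicative calculation based on the May figures reported in Table 11 below. The percentages are recomputed from the counts, so they may differ from the published figures by rounding, and the z-statistic is an illustrative addition rather than part of the reported methodology.

```python
# Sketch: compare the usable-detail provision rate with and without the
# refusal option, using the May 2016 counts from Table 11 (illustrative only).
from math import sqrt

def provision_rate(accepted: int, total: int) -> float:
    return accepted / total

with_refusal = provision_rate(2_208, 38_263)      # refusal option available
without_refusal = provision_rate(1_223, 2_945)    # refusal option removed

# Pooled two-proportion z-statistic, purely for illustration.
pooled = (2_208 + 1_223) / (38_263 + 2_945)
se = sqrt(pooled * (1 - pooled) * (1 / 38_263 + 1 / 2_945))
z = (without_refusal - with_refusal) / se
print(f"{with_refusal:.1%} vs {without_refusal:.1%}, z = {z:.1f}")
```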
With the refusal option removed, 41.5 per cent of in-scope graduates provided useable supervisor details, with the remaining 58.4 per cent providing details that were inaccurate or incomplete. While the removal of the option to refuse to give details resulted in more details being rejected as invalid, the substantial increase in useable sample records suggests that the omission of the 'easy out' response option was warranted.

Table 11 Supervisor information collection outcomes

| | May, refusal option removed (n=2,945) | May, refusal option available (n=38,263) | Nov/Feb, refusal option available (n=20,142) |
| Refused to give details | – | 34,388 (89.9%) | 16,761 (83.2%) |
| Details entered by graduate accepted | 1,223 (41.5%) | 2,208 (5.7%) | 2,128 (10.6%) |
| Details inaccurate/incomplete | 1,772 (58.4%) | 1,725 (4.5%) | 1,253 (6.2%) |

4.2.4. Additional telephone workflows

Requested contact
Feedback from focus groups during the ESS pilot phase suggested that the option of direct contact from the QILT Helpdesk Team could provide more personalised reassurance regarding graduate concerns about providing supervisor contact information. To address this issue, the response option 'I can provide their contact information but I would like you to call me' was added to the response frame. During all GOS collection periods, graduates who selected this option were entered into a telephone workflow. Interviewers contacted these graduates to discuss any concerns they had about providing supervisor details. Overall, 17.4 per cent of those who requested contact provided valid supervisor details during this follow-up phone call (see Table 12).

Table 12 Requested contact from the Social Research Centre

| | Overall (n=3,221) | Nov/Feb (n=1,236) | May (n=1,985) |
| Number who provided valid supervisor details | 561 | 252 | 309 |
| % who provided valid supervisor details | 17.4 | 20.4 | 15.6 |

Entered invalid details
All supervisor details collected went through a verification process before they were entered into the online or Computer Assisted Telephone Interviewing (CATI) workflows. Each record was manually checked to confirm that valid details had been provided (the phone number was not missing digits or an area code and the email address contained a domain name) and that no offensive or obviously fake names had been entered. Where invalid details had been provided, these records were entered into a separate telephone workflow in which an interviewer called the graduate and asked them to confirm the correct details. Valid supervisor details were collected for 22 per cent of those entered into this workflow.

In an effort to maximise recruitment of supervisors for the ESS, graduates who had chosen to provide their supervisor's contact details but provided invalid information were followed up by telephone two days after completing the GOS. The number of invalid contact details (i.e. email address incorrect) is shown in Table 13.

Table 13 Entered invalid supervisor details that required CATI follow up

| | Overall (n=413) | Nov/Feb (n=177) | May (n=236) |
| Number who provided updated supervisor details | 91 | 42 | 49 |
| % who provided updated supervisor details | 22.0 | 23.7 | 20.7 |

For cases where invalid details had been given and it was obvious the graduate did not want to give details (e.g. 'Do not wish to disclose' entered in the supervisor name field), or no phone number was available, the details were rejected and not entered into the telephone workflow.
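The verification and triage logic described in this section can be summarised in a short sketch. The rules, patterns and thresholds below are simplifications of the manual review for the purpose of illustration, not the production checks.

```python
# Illustrative triage of one supervisor sample record: accept useable details,
# call the graduate back to correct invalid details, or reject obvious refusals.
import re

REFUSAL_NAMES = {"do not wish to disclose", "n/a", "none", "unknown"}

def triage_supervisor_details(name: str, email: str, phone: str,
                              graduate_phone: str) -> str:
    """Return 'accept', 'graduate_follow_up' or 'reject'."""
    name_ok = name.strip().lower() not in REFUSAL_NAMES
    email_ok = bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email.strip()))
    digits = re.sub(r"\D", "", phone)
    phone_ok = len(digits) >= 8          # rough check for missing digits or area code

    if not name_ok:                      # obvious refusal entered as a name
        return "reject"
    if email_ok or phone_ok:             # useable contact details for the supervisor
        return "accept"
    if graduate_phone:                   # invalid details: confirm with the graduate
        return "graduate_follow_up"
    return "reject"

print(triage_supervisor_details(
    "Jane Citizen", "jane.citizen@example", "9111 22", "0400 000 000"
))  # graduate_follow_up
```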
4.3. Characteristics of available sample records

In total, 6,451 graduates provided accurate and complete contact information for their supervisors in the 2016 GOS. Table 14 compares the characteristics of the total population of graduates who were in scope for the GOS with the characteristics of those who were in employment. Respondents in employment were more likely than the in-scope population to be female, over 30 years of age and to speak English at home.

Table 14 Graduate demographic profile

| | In-scope graduates n (%) | In employment n (%) |
| Gender: Male | 110,473 (42.1) | 33,963 (38.1) |
| Gender: Female | 152,046 (57.9) | 55,205 (61.9) |
| Age: 30 years or under | 201,936 (76.9) | 62,127 (69.5) |
| Age: Over 30 years | 60,546 (23.1) | 27,220 (30.5) |
| Aboriginal and Torres Strait Islander: Indigenous | 2,008 (0.8) | 712 (0.8) |
| Aboriginal and Torres Strait Islander: Non-Indigenous | 259,840 (99.2) | 88,459 (99.2) |
| Main language spoken at home: English | 209,599 (79.9) | 78,008 (87.3) |
| Main language spoken at home: Language other than English | 52,887 (20.1) | 11,341 (12.7) |
| Disability: Reported disability | 10,096 (3.8) | 3,615 (4.1) |
| Disability: No reported disability | 252,163 (96.2) | 85,554 (95.9) |

Table 15 illustrates that university and NUHEI graduates were equally likely to be employed; however, age continues to be a strong predictor of participation, with postgraduates and external/distance graduates, who tend to be older, more likely to be employed.

Table 15 Graduate higher education characteristics

| | In-scope graduates n (%) | In employment n (%) |
| Institution type: University | 251,655 (95.9) | 85,944 (96.2) |
| Institution type: NUHEI | 10,831 (4.1) | 3,405 (3.8) |
| Study mode: Internal/mixed | 229,331 (88.0) | 74,632 (84.0) |
| Study mode: External/distance | 32,517 (12.0) | 14,539 (16.0) |
| Course level: Undergraduate | 155,893 (59.4) | 49,958 (55.9) |
| Course level: Postgraduate coursework | 97,514 (37.2) | 34,716 (38.9) |
| Course level: Postgraduate research | 9,079 (3.5) | 4,675 (5.2) |

As Table 16 demonstrates, graduates from health were more likely than graduates from other fields to report that they were employed after completing their qualification.

Table 16 Broad field of education of graduates

| Broad field of education | In-scope graduates n (%) | In employment n (%) |
| Natural & Physical Sciences | 20,469 (7.8) | 6,576 (7.4) |
| Information Technology | 9,633 (3.7) | 3,072 (3.4) |
| Engineering & Related Technologies | 16,638 (6.3) | 5,368 (6.0) |
| Architecture & Building | 6,202 (2.4) | 2,017 (2.3) |
| Agriculture & Environmental Studies | 3,847 (1.5) | 1,500 (1.7) |
| Health | 45,382 (17.3) | 18,021 (20.2) |
| Education | 26,343 (10.0) | 10,282 (11.5) |
| Management & Commerce | 65,516 (25.0) | 17,487 (19.6) |
| Society & Culture | 51,819 (19.7) | 18,834 (21.1) |
| Creative Arts | 16,589 (6.3) | 5,250 (5.9) |
| Food, Hospitality & Personal Services | 48 (0.0) | 9 (0.0) |

5. Data collection

5.1. Fieldwork overview

The 2016 ESS was a dual-mode survey, with both online and CATI workflows offered to supervisors, on the understanding that many supervisors would not participate in the survey without CATI follow-up. The online workflow was used as the first point of contact where an email address was provided, on the assumptions that supervisors would appreciate receiving survey information in writing and that a reasonable proportion would prefer to have the opportunity to complete the survey in their own time. Where only a phone number was available, the supervisor entered the CATI workflow. If a supervisor did not respond to the email invitation or reminders, they were then entered into the CATI follow-up workflow, provided a phone number had been given. If a graduate supplied an email address for their supervisor and the email bounced, the supervisor was transferred into the CATI workflow immediately.
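A minimal sketch of this allocation logic is shown below. The function and field names are illustrative, and the handling of bounced records without a phone number is an assumption made to complete the example.

```python
# Sketch of the initial mode allocation and hard-bounce handling described above.
def initial_workflow(record: dict) -> str:
    """Assign a supervisor record to its first workflow."""
    return "online" if record.get("email") else "cati"

def after_hard_bounce(record: dict) -> str:
    """A hard-bounced invitation moves the record to CATI immediately, where a
    phone number is available; otherwise no further contact is attempted here
    (an assumption for this sketch)."""
    return "cati" if record.get("phone") else "no_further_contact"

print(initial_workflow({"email": "supervisor@example.com", "phone": "03 9000 0000"}))  # online
print(after_hard_bounce({"email": "supervisor@example.com", "phone": ""}))  # no_further_contact
```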
Table 17 shows the proportion of supervisors approached through each workflow, with a much higher proportion receiving an email invitation (72.1% for November and 90.4% for May) than entering the CATI workflow initially. Between seven and nine per cent of emails sent were hard bounces, and these records were transferred over to CATI where phone numbers were available. A substantial proportion of supervisors initially allocated to the online workflow were transferred to CATI, ranging from 31 per cent to 56 per cent across the rounds.

Table 17 Workflow across email and CATI

| | November (n=1,881) | February (n=208) | May (n=4,793) |
| Initial workflow: Online invitation | 1,357 | 180 | 4,332 |
| % sent an invitation email | 72.1 | 86.5 | 90.4 |
| % hard bounce* | 6.9 | 9.1 | 8.5 |
| Initial workflow: CATI approach | 524 | 28 | 461 |
| % without email entered into CATI workflow | 28.9 | 13.5 | 9.6 |
| Total workflow: Online workflow only | 773 | 89 | 1,642 |
| % sent email only | 41.1 | 42.8 | 34.3 |
| Total workflow: Total CATI approach | 1,108 | 119 | 3,149 |
| % total entered into CATI workflow | 58.9 | 57.2 | 65.7 |
| % total transferred to CATI | 31.0 | 43.8 | 56.1 |

* As a proportion of online workflow sample.

The ESS instrument was programmed in SPSS Dimensions to ease data capture and facilitate the seamless use of CATI. This approach also supported the development and deployment of the live reporting module. Dates for each ESS data collection round are shown in Table 18.

Table 18 ESS dates by mode

| | November | February | May |
| Online data collection | 8 Nov 2015 – 19 Feb 2016 | 9 Feb – 22 Apr 2016 | 5 May – 22 Jul 2016 |
| CATI data collection | 12 Nov 2015 – 19 Feb 2016 | 3 Feb – 22 Apr 2016 | 5 May – 22 Jul 2016 |

5.1.1. Initial approach and follow-up strategy

As detailed above, dual methodologies were utilised in the 2016 ESS response maximisation effort, with online and CATI workflows established to support supervisor participation. The online workflow was activated as the primary workflow for all records with a valid email address. Both workflows followed a predefined structure, with the online workflow consisting of an initial invitation sent the next working day and a reminder email sent to non-responders four working days later (i.e. if the invitation email was sent on a Monday, the reminder email would be sent on Friday). If a supervisor had not responded to the emails and a phone number had also been provided, they were then approached via CATI. In the November and February collection periods, non-responders were entered into CATI five working days after non-response to the reminder email. For the May collection period, supervisors were entered into CATI two working days after non-response to the reminder email. If a record only had a valid telephone number for the supervisor, it was entered into the CATI workflow on the first working day following the provision of the contact details. Table 19 provides the overall outcomes from the ESS sample.

Table 19 ESS sample outcomes

| | November (n=1,881) | February (n=208) | May (n=4,793) |
| Complete | 765 | 75 | 2,221 |
| Online out of scope | 22 | 5 | 70 |
| Online opt out | 14 | – | 50 |
| No response/partial complete | 959 | 102 | 2,329 |
| CATI out of scope | 34 | 3 | 33 |
| CATI opt out | 87 | 23 | 90 |

All emails were ESS-branded, HTML-enabled and included a hyperlink directly to the online survey, as well as the helpdesk email address and a dedicated 1800 telephone number. Supervisors were able to unsubscribe by clicking a link in the footer of the email.
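The nominal contact schedule described above can be sketched as follows, assuming the supervisor does not respond at any step. The dates and helper names are illustrative; numpy's business-day functions are used for the working-day arithmetic.

```python
# Sketch of the follow-up schedule: invitation the next working day after
# details are received, reminder four working days later, and CATI entry
# two working days after a non-responded reminder (May round timing).
import numpy as np

def follow_up_schedule(details_received: str, may_round: bool = True) -> dict:
    invite = np.busday_offset(details_received, 1, roll="forward")
    reminder = np.busday_offset(invite, 4)
    cati_lag = 2 if may_round else 5      # Nov/Feb rounds used five working days
    cati_entry = np.busday_offset(reminder, cati_lag)
    return {"invite": invite, "reminder": reminder, "cati_entry": cati_entry}

print(follow_up_schedule("2016-05-09"))
```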
Supervisors who had completed a survey, those who had opted out of the survey, and those who had been disqualified from participating were removed from any further follow-up activity.

Additional online workflow activities

In the May 2016 round, a third ‘Last chance reminder’ email was trialled. It was sent to over 2,000 supervisors, letting them know that the survey was closing soon and that this was their last chance to have their say. Of the 2,020 supervisors sent this last chance email, 437 opened the survey (21.6% of total sent) and 234 went on to complete online (7.6% of total completes).

During the May round an issue occurred with the link in the invitation email, which sent respondents who clicked on the link to an error page. As soon as it was identified, the issue was fixed and an apology email, containing a new working link, was sent to the 54 affected supervisors.

5.2. ESS helpdesk

An ESS helpdesk was established to respond to supervisor queries via telephone or email. The 1800 number was also available to international supervisors (with an international dialling code) and remained operational for the duration of the fieldwork period. The helpdesk was staffed during standard business hours, and all out-of-hours callers were routed to a voicemail service, with calls returned within one business day.

The ESS helpdesk team were briefed on the ESS background, procedures and questionnaire to enable them to answer a wide range of queries. To further support the helpdesk, a database was made available to the team to enable them to look up caller information and survey links, as well as providing a method for logging all contacts.

The helpdesk received 20 phone calls and 37 emails from supervisors, with the primary reasons for contact being problems with the email link, confirming they had already completed the survey, and opting out or making an appointment. The reasons for contact are shown in Table 20.

Table 20 Reason for supervisor contact
Reason for contact n %
Problems with URL / access / login 14 24.6
Already completed the survey 12 21.1
Opt out 11 19.3
CATI appointment 10 17.5
Requested general survey information 7 12.3
Other 3 5.3

All opt-outs were removed from the reminder and CATI sample on a regular basis to avoid further reminder emails or calls to these sample members. Sample contact details were also updated before each reminder for those requesting an update to their details. Members of the QILT team were responsible for monitoring the ESS inbox and responded as appropriate to queries and complaints.

5.3. Progress reporting

The Department was provided with access to a bespoke ‘live’ online reporting module which provided an overview of supervisor detail collection rates for each institution and the total participation rates for all institutions. Results were provided in real time and included the number of graduates in scope to provide details, the number of details actually collected and the participation rates of supervisors (including partials, out of scopes and opt outs).

6. Data Processing

6.1. Definition of the analytic unit

The analytic unit for the ESS is the course or major. An ESQ was defined as valid and complete if the supervisor had provided a valid response to the equalimp, ecrsprep and ehire questions.
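As a minimal sketch of the completeness rule above, assuming item-level missing values have already been recoded to None (the production missing-value conventions differ):

```python
REQUIRED_ITEMS = ("equalimp", "ecrsprep", "ehire")   # items that define a valid and complete ESQ

def is_valid_complete(esq_record: dict) -> bool:
    """True if the supervisor gave a valid (non-missing) response to all three required items."""
    return all(esq_record.get(item) is not None for item in REQUIRED_ITEMS)

# A record missing ehire is treated as a partial rather than a complete
print(is_valid_complete({"equalimp": 5, "ecrsprep": 4, "ehire": None}))   # False
```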
The ESS data file contains one record for each of the graduate’s courses or majors, to a maximum of two. Supervisors appear twice in the file if the graduate they supervised either completed a single degree with two majors, or a double degree. If a graduate had completed a single degree with two majors, the second major is included in the ESS data file but not included in analysis.

6.2. Data cleaning and preparation

All items in the body of the questionnaire were re-filtered to their respective bases to ensure there were no errant responses.

6.2.1. Data cleaning

Data collected was consolidated and cleaning routines applied, such as:
• recoding value labels where required
• re-coding of ‘no answers’ to the missing values conventions
• cleaning of supervisor name and coding of occupation and further study field of education
• spell checking and light cleaning of email addresses, business names, eprep, ebetter and ‘other’ specify responses.

6.2.2. Coding

All coding was undertaken by experienced, fully briefed coders, accustomed to working with standard Australian Bureau of Statistics code frames. Coding was validated in accordance with ISO 20252 procedures. Under these procedures, ten per cent of responses are validated for coding accuracy and must achieve a minimum accuracy of 90 per cent. If the accuracy of coding is found to be less than this, another batch is extracted for further validation. The proportion of the second and subsequent batches is dependent on the degree of inaccuracy found in the previous validation iterations. This process continues until the desired accuracy level is reached within each batch.

Table 21 Items coded and action taken
Occupation: Occupation was coded using the Australian and New Zealand Standard Classification of Occupations (ANZSCO, Version 1.2, 2013, Australian Bureau of Statistics catalogue number 1220.0) at the six-digit level.
Location of employment: For graduates working overseas, country of employment was coded using the Standard Australian Classification of Countries (SACC, Second edition, Australian Bureau of Statistics catalogue number 1269.0). Postcodes of employment, for graduates working in Australia, were manually applied by look-up.

6.3. Response rate

The overall response rate was 3,061 completed surveys from a sample size of 6,451, which translates to 47.5 per cent.

Figure 2 Response rate across provider type and collection period
[Figure: dual-axis chart of number of in-scope sample (University 6,268; NUHEI 183; Nov-15 1,724; Feb-16 177; May-16 4,550) and response rate (47.2%, 55.7%, 44.4%, 42.4%, 48.8%) by provider type and collection period.]

The supervisors of university graduates completed 2,959 ESS surveys (47.2% response rate), while supervisors of NUHEI graduates achieved 102 completes (55.7%). Response rates were similar across the collection periods, with November achieving 765 completes (44.4%), February 75 completes (42.4%) and May 2,221 completed surveys (48.8%).

6.3.1. Survey mode

Over two in five supervisors who completed the ESS (43.9%) elected to complete it online, supporting the case for a dual mode design. The highest proportion of online completes occurred during the May collection period at 49.1 per cent (see Table 22). The CATI workflow contributed over half of the ESS surveys, indicating that CATI is an important workflow to maintain to ensure that response rates continue to improve.
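The response rates quoted in this section are simple ratios of completed surveys to in-scope sample. The sketch below reproduces that arithmetic from the counts reported above and in Figure 2; it is a worked check rather than the production calculation.

```python
# Completes and in-scope sample, as reported in Section 6.3 and Figure 2
completes = {"University": 2959, "NUHEI": 102, "Nov-15": 765, "Feb-16": 75, "May-16": 2221}
in_scope  = {"University": 6268, "NUHEI": 183, "Nov-15": 1724, "Feb-16": 177, "May-16": 4550}

print(f"Overall response rate: {3061 / 6451:.1%}")          # 47.5%

for group, n_complete in completes.items():
    print(f"{group}: {n_complete / in_scope[group]:.1%}")    # 47.2%, 55.7%, 44.4%, 42.4%, 48.8%
```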
Table 22 Workflow across email and CATI November February May n=1,724 n=177 n=4,550 % sent an invitation email 72.9 88.7 90.6 % online complete* 12.8 17.5 23.7 % total completes 29.2 41.3 49.1 % total entered into CATI workflow 56.0 50.3 65.1 % CATI complete* 31.4 24.9 24.9 % total completes 70.8 58.7 50.9 Online workflow CATI workflow *As a proportion of total in-scope sample 24 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 6.3.2. Analysis of call outcomes Table 23 shows call outcomes, with a total of 18,990 calls placed to 4,400 sample records to achieve 1,717 CATI completions, equating to an interview every 3.8 calls. The average number of calls made to each sample record was 4.3. Table 23 Sample utilisation Overall n Nov/Feb May Selection % n Selection % Selection % n Total CATI selections called 4,400 100.0 1,236 100.0 3,164 100.0 Domestic phone numbers called 4,044 91.9 1,113 90.0 2,931 92.6 356 8.1 123 10.0 233 7.4 Total interviews achieved 1,717 39.0 586 47.4 1,131 35.7 Domestic interviews 1,678 38.1 565 45.7 1,113 35.2 39 0.9 21 1.7 18 0.6 International phone numbers called International interviews Total calls placed 18,990 6,928 12,062 Average call attempts per sample record initiated 4.3 5.6 3.8 Average call attempts per interview 3.8 4.6 3.5 12.8 14.3 12.0 Average interview duration (minutes) 6.3.3. CATI outcomes The proportion of unusable sample (disconnected numbers, incoming call restrictions or fax machine/modems) was low at 2.3 per cent. In total, telephone contact was made with 60.6 per cent of supervisors, with 63.3 per cent of these converting to completions. A further 9.8 per cent of contacts indicated they would prefer to complete the survey online in their own time. Table 24 on the next page contains all CATI outcomes by data collection round. 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 25 Table 24 CATI outcomes Overall n Total numbers initiated % 4,400 Nov/Feb n % 1,236 May n % 3,164 Number disconnected 58 1.3 24 1.9 34 1.1 Incoming call restrictions 17 0.4 5 0.4 12 0.4 Fax machine / modem 24 0.5 5 0.4 19 0.6 Subtotal unusable (as % sample initiated) 99 2.3 34 2.8 65 2.1 4,301 97.8 1,202 97.2 3,099 97.9 22 0.5 4 0.3 18 0.6 Answering machine 734 16.7 144 11.7 590 18.6 No answer 604 13.7 117 9.5 487 15.4 1,360 30.9 265 21.4 1,095 34.6 Claims to have done survey online 93 2.1 56 4.5 37 1.2 Language difficulty 41 0.9 16 1.3 25 0.8 101 2.3 47 3.8 54 1.7 40 0.9 36 2.9 4 0.1 275 6.3 155 12.5 120 3.8 1,687 38.3 577 46.7 1110 35.1 207 4.7 2 0.2 205 6.5 85 1.9 37 3.0 48 1.5 Named person not known 257 5.8 22 1.8 235 7.4 Refusal 113 2.6 30 2.4 83 2.6 Would prefer to do the survey online 262 6.0 81 6.6 181 5.7 55 1.3 33 1.3 22 1.3 2,666 60.6 782 63.3 1,884 59.5 Eligible numbers (as % sample initiated) No Contact Engaged Subtotal no contact (as % sample initiated) Out of scope Not a supervisor of graduate Completed online since initiated in CATI Subtotal out of scope (as % sample initiated) Contacts Completed interviews1 Appointment Selected respondent away duration Terminated Midway in the Survey Subtotal in-scope contacts (as % sample initiated) 1 CATI completes will not equal ESSanalysis=Graduate and SurveyMode=CATI due to midway terminations that were pulled into 'Complete' status. 6.3.4. Email response analysis Table 25 shows that on average, one in five supervisors opened the invitation email, with over half clicking the survey link contained in the email. 
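Table 25 below reports opens, bounces and opt-outs as a proportion of emails sent, and link clicks as a proportion of emails opened (see its footnotes). The short worked example below applies those conventions to the February invitation column of the table, purely as an illustration.

```python
# February invitation column of Table 25, used purely as a worked example
sent, opened, clicked = 180, 34, 28
soft_bounce, hard_bounce, unopened = 2, 19, 97

def rate(part: int, whole: int) -> str:
    return f"{part / whole:.1%}"

print("Opened:", rate(opened, sent))              # 18.9% of total sent
print("Link clicked:", rate(clicked, opened))     # 82.4% of emails opened (footnote 2)
print("Soft bounce:", rate(soft_bounce, sent))    # 1.1%, retried on the next send
print("Hard bounce:", rate(hard_bounce, sent))    # 10.6%, removed from subsequent sends
print("Unopened:", rate(unopened, sent))          # 53.9%
```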
26 2016 Employer Satisfaction Survey Prepared by the Social Research Centre Table 25 Email outcomes November February Reminder Reminder Invitation 1 Invitation 1 Total Sent Invitation May Reminder 11 Reminder 2 1,357 1,007 180 138 4,332 3,663 2,074 278 213 34 32 941 1,119 446 20.5 21.2 18.9 23.2 21.7 30.5 21.5 164 106 28 10 520 0 236 60.0 49.8 82.4 31.3 55.5 - 52.9 Soft Bounced3 15 9 2 2 50 44 21 % of total sent 1.1 0.9 1.1 1.5 1.1 1.2 1.0 129 30 19 1 407 46 27 9.5 3.0 10.6 0.7 9.5 1.3 1.4 6 6 0 1 28 22 11 % of total sent 0.4 0.6 - 0.7 0.6 0.6 0.5 Unopened 765 643 97 92 2,385 2,432 1,333 % of total sent 56.4 63.9 53.9 66.7 55.0 66.4 64.3 Opened % of total sent Survey Link clicked2 % of total opened Hard Bounced4 % of total sent Opt out 1 Due to software error survey clicks were not recorded for May Reminder 1 2 Percentage calculated from opened 3 Soft bounce denotes occasions when the email cannot get through at the time of the send, commonly due to an inbox having reached capacity or the mail server being temporarily down. These emails were reattempted on the next send. 4 Hard bounce denotes a permanently unusable email address, commonly the result of a non-existent domain or disabled mailbox. These emails were not included in subsequent sends. 6.4. Item level non-response Item-level non-response represents the proportion of respondents who have skipped past a question without submitting an answer. Average item level non-response across the entire survey was 1.1 out of 47 questions, which is very low. The items with the highest levels of non-response were questions asking about the occupation and duties involved in the graduate and supervisors’ occupations, with eempduty skipped by 3.6 per cent of supervisors, egrdduty skipped by 2.6 per cent of supervisors and eempocc skipped by 2.8 per cent of supervisors. Within the Graduate Attributes Scale statements, egfound3, egcollab5 and egemply6 had highest rates of non-response. Item level non-response for all items can be found in Appendix 4. Table 26 Item non-response ESS item 2016 non-response (%) Overall Screening & Confirmation Module 1.5 eempduty 3.6 eempocc 2.8 egrdduty 2.6 Graduate Attributes Scale Module 1.1 egfound3 1.9 egcollab5 1.7 egemply6 1.6 Average item non-response rate 1.1 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 27 6.5. Respondent characteristics 6.5.1. ESS In-Scope and completes Table 27 contains the profile of graduates in employment, those who went onto provide supervisor details, and those with supervisors who participated in the ESS. Older graduates were more likely to provide supervisor details, and while female graduates and those who speak English at home are dominant demographics, there was a noticeable increase in the proportion of male graduates, Indigenous graduates and those who spoke a language other than English at home providing their supervisor details. There were some differences in supervisor response between demographic groups, with supervisors of female graduates, those aged over 30, Indigenous graduates and English speaking graduates more likely to respond. 
Table 27 Graduate demographic profile In employment Provided supervisor details n % Supervisor participated n % n % Male 33,963 38.1 2,898 45.0 1,270 41.5 Female 55,205 61.9 3,546 55.0 1,790 58.5 30 years or under 62,127 69.5 3,698 57.3 1,596 52.1 Over 30 years 27,220 30.5 2,751 42.7 1,465 47.9 712 0.8 79 1.2 52 1.7 88,459 99.2 6,365 98.8 3,008 98.3 English 78,008 87.3 5,443 84.4 2,730 89.2 Language other than English 11,341 12.7 1,006 15.6 331 10.8 3,615 4.1 249 3.9 128 4.2 85,554 95.9 6,195 96.1 2,932 95.8 Gender Age Aboriginal and Torres Strait Islander Indigenous Non-Indigenous Main language spoken at home Disability Reported disability No reported disability Table 28 demonstrates that university and NUHEI graduates were equally likely to provide supervisor details and receive a response from their supervisors. Age continues to be a strong predictor of participation, with postgraduates and external/distance graduates, who tend to be older, more likely to provide supervisor details. Supervisors of external/distance graduates are particularly likely to participate in the ESS. 28 2016 Employer Satisfaction Survey Prepared by the Social Research Centre Table 28 Graduate higher education characteristics In employment n % 85,944 96.2 3,405 Internal/mixed External/distance Supervisor details provided n Supervisor participated % n % 6,266 97.2 2,959 96.7 3.8 183 2.8 102 3.3 74,632 84.0 5,159 80.1 2,328 76.1 14,539 16.0 1,285 19.9 732 23.9 Undergraduate 49,958 55.9 2,986 46.3 1,440 47.0 Postgraduate coursework 34,716 38.9 2,919 45.3 1,394 45.5 4,675 5.2 544 8.4 227 7.4 Institution type University NUHEI Study mode Course level Postgraduate research Graduates from a number of fields of education were reluctant to provide supervisor details, particularly those from society and culture (see Table 29 below). Graduates from health were more likely to provide supervisor details, as were education graduates, with their supervisors also more likely to participate in the ESS. Table 29 Broad field of education In employment Supervisor details provided Supervisor participated n % n % n % Natural & Physical Sciences 6,576 7.4 382 5.9 182 5.9 Information Technology 3,072 3.4 231 3.6 103 3.4 Engineering & Related Technologies 5,368 6.0 481 7.5 201 6.6 Architecture & Building 2,017 2.3 144 2.2 63 2.1 Agriculture & Environmental Studies 1,500 1.7 131 2.0 62 2.0 Health 18,021 20.2 1,485 23.0 713 23.3 Education 10,282 11.5 922 14.3 491 16.0 Management & Commerce 17,487 19.6 1,346 20.9 541 17.7 Society & Culture 18,834 21.1 1,083 16.8 558 18.2 5,250 5.9 242 3.8 124 4.1 9 0.0 2 0.0 1 0.0 Creative Arts Food, Hospitality & Personal Services 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 29 Summary of issues for future surveys 7. The 2015/16 implementation of the ESS was a successful proof of concept that there is potential for a national survey of supervisors of recent graduates from higher education institutions. Key achievements included: • a centralised approach to supervisor recruitment • a flexible and responsive approach to data collection that minimises the burden on employers • the largest national survey of graduate supervisors, to date. Improving overall Total Survey Error (TSE) is the core focus of our commitment to continuous improvement across the QILT research program. Mitigating potential sources of errors of representation and measurement error are key considerations for future surveys. 
The dual workflow approach has improved representation in the ESS, particularly as supervisor details are often incomplete, which restricts some supervisors to a single workflow. It is our recommendation that dual workflows continue to be offered to supervisors.

In terms of reducing TSE, attention will be focused on improving recruitment for the ESS. Using the GOS to recruit potential supervisors means that the accuracy of the ESS frame is constrained by any errors present in the establishment or conduct of that survey. TSE mitigation efforts for the GOS have, helpfully, ensured that there are comparatively low levels of coverage, sampling and non-response errors. The main barrier to a representative and effective ESS is coverage error, as recent graduates are not yet comfortable providing contact information for their current supervisors. The removal of the refusal option in the ESS recruitment module in the GOS has gone some way towards mitigating this, and we recommend this be retained at least for the short term.

The most common reason reported for not providing this information continues to be that graduates feel their supervisor does not have enough time to participate. Despite efforts to inform graduates that employers indicate that they do have enough time, and that they want to participate in the ESS, this sentiment has remained unchanged across the GOS trial and the GOS 2016 collections. We believe that this feedback is actually a ‘soft refusal’ to make this contact information available rather than a legitimate barrier. Making the ESS results on the QILT website as visible as possible to graduates who are completing the GOS has the potential to put some minds at ease. Particularly in the early iterations of the ESS, the information will be very highly aggregated and should clearly demonstrate that only grouped results are being reported.

Early contact has been made with the National Association of Graduate Careers Advisory Services, who are keen to encourage completion of the GOS and the employer contact module with students at their institutions. This will support activities aimed at making students and graduates feel familiar and comfortable with providing information across all of the QILT surveys. Contact has also been made with a number of professional associations with a view to enlisting support around the promotion of the GOS and the ESS to recent graduates and graduate supervisors. It is still very early days for this type of promotional activity, as these groups have indicated that they would like to see what is being published by the Department before they commit to a course of action. Discussions with professional associations and registering bodies will commence in early 2017, after the GOS and the ESS data have been made available on the QILT website.

Glossary

Appointment: When a participant called during CATI cannot complete the survey at that time and elects to be called back at another time.
CATI: Computer Assisted Telephone Interview. Refers to when the survey is completed over the telephone by a participant.
CATI outcomes: The status of the participant after they are called for CATI.
Cognitive testing: A qualitative test of the survey with potential respondents, with the aim of ensuring that each of the questionnaire items was easily understood, relevant, and captured valid and reliable data.
Complaint (as an email outcome): A ‘complaint’ is recorded as an email outcome when a supervisor clicks the ‘This is Spam’ or ‘Junk Mail’ button in their email client instead of clicking the unsubscribe link in the email.
Field of Education: Ensures courses, specialisations and units of study with the same or similar occupational emphasis are classified to the same field of education. There are 12 broad fields and 71 narrow fields of education.
Hard bounce: A hard bounce occurs when an email is sent to a permanently unusable email address, commonly the result of a non-existent domain or disabled mailbox. These emails are removed from subsequent sends.
Majors: The principal field of study the student completed. Graduates can have up to two majors per course.
Opted out: Supervisors who said they did not want to complete the survey.
Out of Scope: Supervisors who noted that they had never supervised the graduate.
Refusal Conversion: Attempting to convert a graduate who has previously indicated they do not want to provide their supervisor’s details into providing those details. Depending on the reason given for not wanting to provide supervisor details, graduates were targeted with a specific message to overcome any barriers they may have had.
Respondent refusal: When the survey participant chooses not to participate in the survey during CATI.
Specialisation code: A code which identifies a specialisation within a course for which the academic requirements have been, or will be, completed by a student or a group of students.
Soft bounce: A soft bounce occurs when an email cannot get through to the supervisor’s inbox at the time of the send, commonly because the inbox has reached capacity or the mail server is temporarily down. Emails that soft bounce are sent again in the next send.
Study area: Similar courses are grouped into either 21 or 45 different study areas.
Table A and Table B institutions: Providers listed in Table A are approved for all Australian Government grants under the Higher Education Support Act 2003 (HESA) and their students can receive all forms of assistance. Providers listed in Table B are eligible for some grants for particular purposes. Table B providers can offer FEE-HELP assistance to their students. Table B providers approved for National Priority places can also offer HECS-HELP assistance.

Appendix 1 ESQ

Employer Satisfaction Questionnaire (ESQ)
Australian Government Department of Education and Training

EMPLOYER SATISFACTION QUESTIONNAIRE
SUMMARY OF KEY VARIABLES .......... 1
Module A: Screening and confirmation .......... 2
Module B: Overall graduate preparation .......... 4
Module C: Graduate attributes scale (GAS-E) .......... 6
Module D: Emerging policy issues .......... 8
Module E: Institution specific issues ..........
8 Module F: Close ...................................................................................................................................... 8 SUMMARY OF KEY VARIABLES Questionnaire Variable name Brief description Detailed description (if applicable) UniqueESSID Employer ID SRC assigned ID To identify supervisor in sample UniqueGOSID Graduate ID SRC assigned ID in GOS sample To match back to graduate E403 Graduate’s first name Sourced from GOS sample Throughout survey E402 Graduate’s last name Sourced from GOS sample Module A E306C Graduate’s institution Sourced from GOS sample Throughout survey Qualfinal Graduate’s qualification Sourced from GOS output FinalCourseA/FinalCourseB Throughout survey supemail Supervisor email address Sourced from GOS output Module F Double Double degree flag The Social Research Centre Key use points 1 = Yes Module B 2 = No Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training 2 Module A: Screening and confirmation *(ALL) confirmO Thank you for agreeing to take part in the Employer Satisfaction Survey. This is an important survey conducted by the Social Research Centre on behalf of The Department of Education and Training. The information gathered from you will contribute to positive changes in Australian higher education by providing valuable data about graduates’ general ability, technical skills and work readiness. Most people take approximately 7 minutes to complete all the questions. If you need to take a break, you can press the ‘PAUSE’ button and close your browser. You can come back to the survey at any time and continue from where you stopped. Please do not use the browser BACK button to go back to a previous question. Please press the 'Next' button below to continue [PROGRAMMER:. Text if ‘PAUSE’ is pressed should read ‘Thanks for your time so far. You can come back to complete your survey at any time before January 30.’] [PROGRAMMER: Only QS1 to be mandatory, all other questions are optional.] First we have a few questions about your role and <E403> <E402>’s role, so we can understand your relationship to <E403>. *(ALL) SUPERVISOR RELATIONSHIP QS1 Just to check, do you currently supervise <E403>? By supervisor, we mean a person who has the authority to direct someone to do certain tasks and who has a good idea of the work that the person does in their job. 1. 2. 3. Yes No, but I used to be their supervisor No, I have never been their supervisor (GO TO TERM) ----------------------------------------------------- PAGE BREAK ---------------------------------------------------*(CURRENT OR PREVIOUS SUPERVISOR) SUPERVISOR RELATIONSHIP DURATION QS2 And how long IF QS1=1: <have you been>/IF QS1=2: <were you> <E403>’s supervisor? 1. 2. 3. 4. Less than 1 month At least 1 month but less than 3 months At least 3 months but less than 1 year 1 year or more ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*PROGRAMMER NOTE: ALL REFERS TO CURRENT OR PREVIOUS SUPERVISORS *(ALL) AWARENESS OF INSTITUTION QS3 Before today, were you aware that <E403> completed a qualification from <E306C>? 1. Yes 2. 
No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------- The Social Research Centre Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training *(ALL) AWARENESS OF INSTITUTION QS4 And, before today, were you aware that the qualification <E403> completed was a <qualfinal>? 1. Yes 2. No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) GRADUATE’S OCCUPATION QS5 How would you describe <E403>’s occupation? (VERBATIM RESPONSE TEXT BOX) ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) GRADUATE TASKS QS6 What are the main tasks that they usually perform in their job? (VERBATIM RESPONSE TEXT BOX) ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) EMPLOYER OCCUPATION QS7 How would you describe your main PAID occupation? (VERBATIM RESPONSE TEXT BOX) ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) EMPLOYER DUTIES QS8 What are the main tasks that you usually perform in this job? (VERBATIM RESPONSE TEXT BOX) The Social Research Centre 3 Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training Module B: Overall graduate preparation The next set of questions asks about the skills and attributes you think are important for recent graduates to have when coming into your organisation. Please answer them in relation to the job currently performed by <E403>. *(Double = 1) STUDENT GRADUATED WITH DOUBLE DEGREE QSPREOP1 We understand that <E403> graduated from <E306C> with a <qualfinal>. Please answer the following questions based on both qualifications in general. ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) FORMAL REQUIREMENT QOP1 Is a <qualfinal> or similar qualification a formal requirement for <E403> to do their job? 1. Yes 2. No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) IMPORTANCE OF QUALIFICATION QOP2 To what extent is it important for <E403> to have a <qualfinal> or similar qualification to being able to do the job well? Is it… 1. 2. 3. 4. 5. Not at all important Not that important Fairly important Important Very important ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) OVERALL PREPARATION QOP3 Overall, how well did <E403>’s <qualfinal> prepare him/her for their job? 1. Not at all 2. Not well 3. Well 4. Very well 5. Don’t know / Unsure ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) OPEN (POSITIVE) QOP4 What are the MAIN ways that <E306C> prepared <E403> for employment? (VERBATIM RESPONSE TEXT BOX) 1. Don’t know / Unsure ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) OPEN (IMPROVE) QOP5 And what are the MAIN ways that <E306C> could have better prepared <E403> for employment? (VERBATIM RESPONSE TEXT BOX) 1. 
Don’t know / Unsure ----------------------------------------------------- PAGE BREAK ----------------------------------------------------- The Social Research Centre 4 Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training *(ALL) OVERALL RATING QS11 Based on your experience with <E403>, how likely are you to consider hiring another <qualfinal> graduate from <E403>, if you had a relevant vacancy? Would you say 1. 2. 3. 4. 5. 6. Very unlikely to consider Unlikely to consider Neither unlikely nor likely to consider Likely to consider Very likely to consider Don’t know / Unsure ----------------------------------------------------- PAGE BREAK ----------------------------------------------------- The Social Research Centre 5 Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training 6 Module C: Graduate attributes scale (GAS-E) The following questions ask about specific skills and attributes that may be important for employees to have in your organisation. GAS For each skill or attribute, to what extent do you agree or disagree that <E403>’s <qualfinal> from <E306C> prepared them for their job? If the skill is not required by <E403> in their role, you can answer “Not applicable”. Foundation skills 1. Oral communication skills 2. Written communication skills 3. Working with numbers 4. Ability to develop relevant knowledge 5. Ability to develop relevant skills 6. Ability to solve problems 7. Ability to integrate knowledge 8. Ability to think independently about problems ----------------------------------------------------- PAGE BREAK ----------------------------------------------------Adaptive skills and attributes 9. Broad background knowledge 10. Ability to develop innovative ideas 11. Ability to identify new opportunities 12. Ability to adapt knowledge to different contexts 13. Ability to apply skills in different contexts 14. Capacity to work independently ----------------------------------------------------- PAGE BREAK ----------------------------------------------------Teamwork and interpersonal skills 15. Working well in a team 16. Getting on well with others in the workplace 17. Working collaboratively with colleagues to complete tasks 18. Understanding different points of view 19. Ability to interact with co-workers from different or multi-cultural backgrounds ----------------------------------------------------- PAGE BREAK ----------------------------------------------------Technical and professional skills 20. Applying professional knowledge to job tasks 21. Using technology effectively 22. Applying technical skills in the workplace 23. Maintaining professional standards 24. Observing ethical standards 25. Using research skills to gather evidence The Social Research Centre Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training ----------------------------------------------------- PAGE BREAK ----------------------------------------------------Employability and enterprise skills 26. Ability to work under pressure 27. Capacity to be flexible in the workplace 28. Ability to meet deadlines 29. Understanding the nature of your business or organisation 30. Demonstrating leadership skills 31. Demonstrating management skills 32. Taking responsibility for personal professional development 33. Demonstrating initiative in the workplace RESPONSE FRAME 1. Strongly disagree 2. Disagree 3. Neither disagree nor agree 4. Agree 5. 
Strongly agree 9 Not applicable The Social Research Centre 7 Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training 8 Module D: Emerging policy issues Module E: Institution specific issues Module F: Close Thank you for your assistance with this survey. We would like to provide some feedback to participants about the outcomes of the study. We anticipate finishing the study in the first half of 2016 *(ALL) RESULTS FEEDBACK C1 Would you like to receive a one page summary of the outcomes of the study? 1. 2. Yes No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) SURVEY FEEDBACK C3 Would you like to be notified when the national data is released on the Quality Indicators for Learning and Teaching (QILT) website? 1. 2. Yes No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------*(ALL) ACKNOWLEDGEMENT C4 Would you like your organisation to be acknowledged on the QILT website for supporting this important research? If you are unsure please select yes, as you will be able to opt out of this during our follow up with you. 1. 2. Yes No ----------------------------------------------------- PAGE BREAK ----------------------------------------------------- PROGRAMMER: PREPOPULATE EMAIL ADDRESS FROM SAMPLE IF C1=1 OR C2=1 OR C4=1 *(WOULD LIKE SUMMARY, FEEDBACK OR ACKNOWLEDGEMENT) (CONFIRM) C2 Can we confirm the best email address to contact you on? SUPERVISOR EMAIL 1. My email address is <supemail> [DISPLAY IF SUPEMAIL≠BLANK] 2. The best email address to contact me on is: (ALLOW EMAIL ENTRY) [DISPLAY IF SUPEMAIL≠BLANK] 2. My email address is (ALLOW EMAIL ENTRY) [DISPLAY IF SUPEMAIL=BLANK] IF C4=1 *(WOULD LIKE ACKNOWLEDGEMENT) BUSINESS NAME CONFIRM C5 So that we can properly acknowledge your business on the QILT website, can you please confirm your business name as you would like it to appear on the site? 3. My businesses name is: (ALLOW BUSINESS NAME ENTRY) The Social Research Centre Employer Satisfaction Questionnaire (ESQ) Australian Government Department of Education and Training END Thank you for your time today and support in ensuring that graduates complete their qualifications well equipped to meet the needs of organisations like yours. ----------------------------------------------------- NEW PAGE ----------------------------------------------------QS1=3 (TERMINATED - NOT SUPERVISOR OF GRADUATE) TERM Thank you for your willingness to complete the Employer Satisfaction Survey (ESS). You have indicated that you are not the supervisor of <E403>. If you incorrectly selected this option or your workplace still wishes to take part with another supervisory person please call The Social Research Centre’s helpdesk on 1800 023 040. You can also email us at [email protected]. SUBMIT BUTTON LINKS TO: http://www.qilt.edu.au/about-this-site/employer-satisfaction-survey(ess) The Social Research Centre 9 Appendix 2 ESQ Screenshots 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 33 Appendix 3 34 ESS Recruitment Versions 2016 Employer Satisfaction Survey Prepared by the Social Research Centre ESS FACT SHEET TEXT WHAT is the ESS about? The Employer Satisfaction Survey (ESS) provides the only national measure of the extent to which higher education institutions in Australia are meeting employer needs. 
Specifically, this survey gathers employer feedback on the extent to which students are being taught the right mix of generic and technical skills to be prepared for the workforce. The research aims to ensure that institutions are responsive to labour market and industry needs. If you would like to see the types of questions that are asked, please click here. WHY is the ESS important? The ESS is an Australian Government Department of Education and Training initiative which is carried out by The Social Research Centre as part of Quality Indicators for Learning and Teaching (QILT) suite of surveys. The ESS provides employers and industry with an opportunity to provide feedback and input into the ongoing improvement of higher education. All information provided will remain completely confidential and will not identify you in any way. While the Social Research Centre will collect personal information, your supervisors’ responses are not linked to your name. Please be assured that their responses are completely confidential. If you would like to see how the results may be presented please click here. HOW can my supervisor take part? Once we collect your supervisors details, the Social Research Centre will contact your supervisor to ask them to go online and complete the survey. The survey will take approximately 7 minutes to complete. It can be completed at a time convenient to your supervisor, they log back in at a later date if they are unable to finish the survey in one sitting. If you have any further questions about your participation in this survey, please contact the Social Research Centre on 1800 055 818 (a free call) or via email at [email protected]. EMPLOYER SATISFACTION SURVEY RESULTS EXAMPLE Supervisors who would be fairly confident or very confident recommending a similar graduate 94.4% 94.7% 92.6% 92.4% 88.1% Total Institution A Institution B Institution C Insitution D * Please note that this data is from the Pilot Employer Satisfaction Survey conducted in 2013-14 and is being used to provide an example of how the Employer Satisfaction Survey 2016 results may look. For a copy of the Pilot Employer Satisfaction Survey 2013-14 report please click here. SUMMARISED EMPLOYER SATISFACTION SURVEY Confirming occupation details • • • • How would you describe <Graduate name>’s occupation? What are the main tasks that they usually perform in their job? How would you describe your main PAID occupation? What are the main tasks that you usually perform in this job? Overall preparation for work • • • • • • Is a <Graduate’s qualification> or similar qualification a formal requirement for <Graduate name> to do their job? To what extent is it important for <Graduate name> to have a <Graduate’s qualification> or similar qualification to being able to do the job well? Is it… Overall, how well did <Graduate name>’s <Graduate’s qualification> prepare him/her for their job? What are the MAIN ways that <Graduate’s institution> prepared <Graduate name> for employment? And what are the MAIN ways that <Graduate’s institution> could have better prepared <Graduate name> for employment? Based on your experience with <Graduate name>, how likely are you to consider hiring another <Graduate’s qualification> graduate from <Graduate name>, if you had a relevant vacancy? Graduate attributes For each skill or attribute below, to what extent do you agree or disagree that <Graduate name>’s <Graduate’s qualification> from <Graduate’s institution> prepared them for their job? 
If the skill is not required by <Graduate name> in their role, you can answer “Not applicable”. Foundation skills • Oral communication skills • Written communication skills • Working with numbers • Ability to develop relevant knowledge • Ability to develop relevant skills • Ability to solve problems • Ability to integrate knowledge • Ability to think independently about problems Adaptive skills and attributes • Broad background knowledge • Ability to develop innovative ideas • Ability to identify new opportunities • Ability to adapt knowledge to different contexts • Ability to apply skills in different contexts • Capacity to work independently Teamwork and interpersonal skills • Working well in a team • Getting on well with others in the workplace • Working collaboratively with colleagues to complete tasks • Understanding different points of view • Ability to interact with co-workers from different or multi-cultural backgrounds Technical and professional skills • Applying professional knowledge to job tasks • Using technology effectively • Applying technical skills in the workplace • Maintaining professional standards • Observing ethical standards • Using research skills to gather evidence Employability and enterprise skills • Ability to work under pressure • Capacity to be flexible in the workplace • Ability to meet deadlines • Understanding the nature of your business or organisation • Demonstrating leadership skills • Demonstrating management skills • Taking responsibility for personal professional development • Demonstrating initiative in the workplace Appendix 4 Item Level Non-Response 2016 Employer Satisfaction Survey Prepared by the Social Research Centre 35 ESS item 2016 nonresponse (%) ESS item 2016 nonresponse (%) Screener and confirmation 1.5 Graduate Attributes Scale (cont) esuper 0.0 egemply1 1.0 esuptime 0.2 egemply2 1.0 eknwinst 0.1 egemply3 0.9 eknwqual 0.4 egemply4 0.8 egrdocc 1.9 egemply5 1.3 egrdduty 2.6 egemply6 1.6 eempocc 2.8 egemply7 0.9 eempduty 3.6 egemply8 1.0 Average item non-response rate 1.1 Overall Graduate Preparation 0.9 eformreq 0.6 equalimp 0.3 ecrsprep 0.5 eprep 1.0 ebetter 2.3 ehire 0.6 Graduate Attributes Scale 1.1 egfound1 0.9 egfound2 1.0 egfound3 1.9 egfound4 0.7 egfound5 0.9 egfound6 0.6 egfound7 0.7 egfound8 0.6 egadapt1 1.0 egadapt2 1.4 egadapt3 1.2 egadapt4 0.9 egadapt5 1.1 egadapt6 0.8 egcollab1 1.0 egcollab2 1.0 egcollab3 1.0 egcollab4 1.0 egcollab5 1.7 egtech1 0.9 egtech2 1.0 egtech3 1.2 egtech4 0.8 egtech5 1.2 egtech6 1.4 Appendix 5 36 Participating Institutions 2016 Employer Satisfaction Survey Prepared by the Social Research Centre Institution Academy of Design Australia Macleay College Academy of Information Technology Macquarie University Adelaide College of Divinity Melbourne Institute of Technology Alphacrucis College Melbourne Polytechnic Asia Pacific International College Monash University Australian Catholic University Montessori World Education Institute (Australia) Australian College of Applied Psychology (Navitas Institute) Morling College Australian College of Physical Education Murdoch Institute of Technology Australian College of Theology Murdoch University Australian Institute of Business National Art School Avondale College of Higher Education Navitas Professional Institute Blue Mountains International Hotel Management School Photography Studies College (Melbourne) Bond University Queensland University of Technology Box Hill Institute RMIT University Cambridge International College Raffles College of Design and Commerce 
Central Queensland University SAE Institute and Qantm College Charles Darwin University SP Jain School of Management Charles Sturt University Southern Cross University Christian Heritage College Swinburne University of Technology Curtin University of Technology Sydney Institute of Traditional Chinese Medicine Deakin University TAFE NSW Eastern College Australia TAFE Queensland Edith Cowan University TAFE SA Endeavour College Tabor College of Higher Education Excelsia College The Australian National University 60 Federation University Australia The Cairnmillar Institute School Flinders University The College of Law Griffith University The Tax Institute Holmes Institute The University of Adelaide Holmesglen Institute The University of Melbourne International College of Management, Sydney The University of Notre Dame Australia James Cook University The University of Queensland Kaplan Business School The University of Sydney Kaplan Higher Education Pty Ltd trading as Murdoch Institute of Technology The University of Western Australia La Trobe University University of Canberra MIECAT University of Divinity Institution University of New England University of Wollongong University of New South Wales University of the Sunshine Coast University of Newcastle Victoria University University of South Australia Western Sydney University University of Southern Queensland Whitehouse Institute University of Tasmania William Angliss Institute University of Technology, Sydney