Please contact the managers on the details below with any queries.
Angela Haley
Forum Manager
Leishman Associates
P: 03 6234 7844
E: angelah@laevents.com.au
Ms. Tessy Jolly, Mr. Ken Yul Lee
University of Sydney, Cubane Consulting
9.3 – Beyond Measure: How the University of Sydney uses UniForum data to analyse feedback and set targets in transformative projects, Tasman Room C, November 13, 2019, 10:20 AM – 11:00 AM
Biography:
Tessy is the Associate Director, Institutional Planning at the University of Sydney’s Institutional Analytics and Planning unit. She is responsible for overseeing and facilitating the University’s student fees, load and course planning processes and cross-institutional research, including UniForum. She also advises on and manages the University’s government reporting requirements and relationships, as well as Commonwealth funding agreements.
Tessy has over 25 years of experience in leadership and operational roles in student load planning, treasury, investments, finance and accounting. She has spent the last 12 years in the higher education sector.
Tessy holds an MBA and is a member of CPA Australia. She holds a bachelor’s degree in commerce, majoring in accounting.
The University of Sydney is currently undertaking a multi-year professional services transformation program targeting multiple functions. For any transformative project, efficiency in delivery and longevity of introduced changes are key. If academics and professional staff using impacted services do not find them effective over time, the sustainability of changes delivered through the service model will be undermined, resulting in pressure to revert to old ways. Therefore, delivering excellent service with efficiency is of utmost priority to the program. Ensuring this goal is achieved requires the University to develop a robust fact base on how services are performing, identify the most important areas for improvement and track progress against set goals.
The presentation will focus on how the University of Sydney uses data to inform this improvement journey. This is achieved by first measuring how the University fares in delivery efficiency, and then how staff feel about services provided, captured by the UniForum dataset. This process has provided access to detailed data that can be benchmarked against other institutions in the Higher Education sector. By comparing results to others in the sector, the University of Sydney has been able to identify delivery costs to the University, and critical pain points and service gaps that matter most to academic and professional staff.
In order to build support for change across the wider University, the unit responsible, Institutional Analytics and Planning, has been engaging service divisions with their efficiency and effectiveness results. This includes working sessions with functional teams that involve communicating headline results, as well as live exploration of more granular data using interactive data visualisation tools. These sessions have identified issues in critical processes, systems, infrastructure and people that are impacting staff views on effectiveness as well as efficiency matters. The results enable us to pinpoint key areas to focus for each activity, ensuring improvement efforts are well targeted.
By linking these improvements to recent high-profile systems and process transformation initiatives, and sharing these success stories widely, the University is able to support future change initiatives effectively and demonstrate that feedback is not only heard, but actioned.
Ms Kubra Chambers, Ms Tanya Gupta
The University of Sydney
6.1 – Unlearning the future: Developing predictive capabilities for institutional researchers, Tasman Room A, November 12, 2019, 4:20 PM – 5:00 PM
Biography:
Kubra has over 25 years of experience in leadership and operational roles in finance, internal audit and information and communications technology. She has spent the last 20 years in the university sector.
As Director of the Institutional Analytics and Planning team at the University of Sydney, Kubra is responsible for overseeing, implementing and advising on the University’s business intelligence and data analytics strategy and projects. She is also responsible for facilitating the University’s student fee, load and course planning processes and its government reporting and relationships.
She holds a Master of Commerce majoring in Accounting and a bachelor’s degree in marketing.
In the past few years, predictive analytics has seen a significant rise in the higher education space. But soothsayers, beware – predictive analytics is a time- and capital-consuming exercise that must be grounded in solving a critical institutional problem. When it works, though, it creates an impactful synergy between man and machine.
The University of Sydney had such a need – our previous capability in forecasting student enrolments, load and revenue was broad, static and opaque. In 2018, to address this need, we created a new tool, Smart Predictive Insights for Revenue and Enrolment (SPIRE), using Oracle PBCS, which allows us to see a picture of the University’s enrolment lifecycle that is reflective of real-time fluctuations; detailed to course-level trends; and transparent in predictions of up to ten years. And over time, it continues to learn and improve. SPIRE provides a plethora of cross-institutional insights for decision making: revenue forecasts, recruitment trends, classroom and campus enrolment predictions, international mix and many more.
However, like all crystal-ball solutions, execution can quickly turn shambolic if the foretelling is not grounded in data derived from a single source of truth. It is therefore crucial that SPIRE is powered by data and institutional researchers that capture the historical and real-time enrolment lifecycle effectively, and that have the agility to switch levers depending on the stage of the lifecycle.
This is where the symbiosis of man and machine is critical – our analysts are at once the teacher and student. The data sourced by our analysts then feeds into predictive outcomes from SPIRE, which in return allows for better decision-making and better utilisation of university assets and resources. Presently, SPIRE is a new tool that we expect will become provident over time, letting us gain a truer glimpse into the University’s future.
This session will look into how SPIRE solved a critical problem for the University. But more importantly, it will delve into how the institutional researchers that power the tool continue to improve its predictive abilities through their mutualistic relationship.
Mrs Karin Conradie
Edith Cowan University
7.2 – Using QILT data for benchmarking at ECU, Tasman Room B, November 13, 2019, 8:50 AM – 9:30 AM
Biography:
Mrs. Karin Conradie:
Karin is a Senior Analyst in the Surveys team in the Strategic Governance Services Centre at ECU. Karin has been working in the Higher Education sector for 15 years. She has been involved with government data submissions, survey instrument design, developing systems for data reporting, and recently, Tableau dashboard development. Prior to working for ECU, she was a software engineer in the Defence industry for 15 years, working on various leading-edge technology projects. She has a strong IT background in software design, programming and database administration.
Mr Danny Collins, Mr Stuart Terry
International Division at Watermark
11.2 – Boost Response Rates & Better Connect Your Student Survey Data with an All-in-One Course/Unit Evaluation & Survey Solution, Tasman Room B, November 13, 2019, 12:10 PM – 12:50 PM
Biography:
Stuart Terry – Stuart Terry is the Organisational Researcher at Otago Polytechnic, New Zealand and leads a team who gather, analyse and report feedback from current students, graduates and staff. Over recent years, Stuart has developed online systems to enable both students and teaching staff to better engage in meaningful feedback and evaluation processes. In addition, all staff are supported to participate in reflective evaluative practices which are developmentally focused. These processes include annual staff engagement surveys, peer-to-peer reflection and individual 360 feedback for leaders and teams at Otago Polytechnic and for a number of other organisations. In 2014 Stuart established Proud@OP for LGBTiQ+ staff and led the institution’s successful application for the nationally recognised Rainbow Tick diversity and inclusion certification.
Stuart holds a Bachelor of Commerce (Management and Marketing) and Postgraduate Diploma in Commerce from the University of Otago and is currently undertaking Doctor of Professional Practice studies.
Danny Collins – Danny Collins has extensive experience delivering SaaS solutions in Europe, Asia, and the Middle East. His passion for higher education brought him to Watermark in 2019, where he directs international sales. Collins is currently based in London.
Your university is committed to quality assurance and informed decision making, but your current process and tools may make it hard to efficiently capture data and share actionable reports. Here are some important questions to consider:
In this presentation, we will share why so many institutions have entrusted these critical processes to EvaluationKIT by Watermark, a purpose-built software system locally hosted in Australia. You’ll discover key benefits of our solution, from great response rates to streamlined administration to robust reporting and integrations with key campus systems.
EvaluationKIT helps you gain deeper insights while providing a more cost-effective way to improve the course/unit evaluation and survey process for learners, faculty, and administration.
Andrew Painter, Cliff Ashford
Altis Consulting, Monash University
11.1 – Agility vs Standardisation in Data and Analytics, Tasman Room A, November 13, 2019, 12:10 PM – 12:50 PM
If you want to know how to speed up the delivery of your data and analytics projects, and ultimately achieve greatness for yourself and your organisation, Cliff Ashford (Monash University) and Andy Painter (Altis Consulting) will draw on their combined 60+ years of experience in the data and analytics space to show you what works and what doesn’t. Cliff and Andy will be covering:
Bring your questions as well!
Mr. John Stanley, Dr. Serge Herzog
University of Hawaii-West Oahu, University of Nevada at Reno
8.3 – Leveraging ‘Big Data’ and BI Visualization to Enhance Performance Benchmarking, Tasman Room C, November 13, 2019, 9:35 AM – 10:15 AM
Biography:
John Stanley is the Director of Institutional Research at the University of Hawai‘i – West Oahu and a Research Fellow at the WASC Senior College and University Commission in the United States. He has published institutional research articles and has instructed workshops on using analytics to improve student outcomes at regional and national conferences. He was awarded best presenter at the 2012 California-AIR Conference and best new presenter at the 2018 Australia-AIR Conference.
Dr. Serge Herzog is the Director, Institutional Analysis at the University of Nevada, Reno. He has been the editor or co-editor of several New Directions for Institutional Research (Jossey-Bass) volumes, and his research has appeared in the Journal of Engineering Education, Research in Higher Education, University Business Magazine, Campus Technology, and Chronicle of Higher Education. Dr. Herzog is a Research Fellow with the WASC Senior College and University Commission (WSCUC).
Abstract:
In the age of performance assessment and accountability, institutional researchers are increasingly asked to conduct comparative analyses and engage in benchmarking activities. Administrators, accrediting bodies, and state and local governing boards are among those known to ask for benchmark data at a moment’s notice. This presentation shows how two IR offices are addressing the need for a performance assessment tool that offers campus stakeholders transparent, ready-access to institutional comparative data. Combining multiple national datasets (i.e., IPEDS, Student Financial Aid, College Scorecard), the presenters demonstrate a range of highly visual and interactive benchmark reports in the areas of enrollment persistence, graduation rates, post-graduate employment and social mobility, and student loan repayment. Results are demonstrated using a cloud-based business intelligence tool.
Engaging in benchmarking activities is a common practice for universities. Benchmark data can be used to inform such varied purposes as enrollment management decisions, identifying areas of weakness, and justifying policies like tuition pricing and faculty salary adjustments. Further, many accrediting bodies require institutions to report measurable progress in relationship to peer institutions. This presentation demonstrates how two IR offices in the United States are addressing the need for a performance assessment tool that offers campus stakeholders ready-access to institutional benchmarking data. The session will begin by quickly reviewing several public no-cost datasets available to institutional researchers for the purposes of benchmarking institutional performance (i.e., IPEDS, Student Financial Aid, College Scorecard).
Objective/Purpose:
The presenters will demonstrate how they combined and transformed these data into highly customized online interactive reports that are helping to support and inform strategic planning, institutional benchmarking, accreditation, and performance assessment for key stakeholders at their universities. Covering over 1,700 higher education institutions, the presentation demonstrates how to establish institutional benchmarks for performance assessment that take into account institutional control, mission, size, selectivity and other pertinent attributes for comparative analysis.
While the presentation draws on the experience of two U.S. universities to develop more meaningful performance metrics for accountability assessment, the presentation is styled to inform the efforts at other institutions that wish to deploy transparent, ready-access reporting tools for institutional comparison in higher education.
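For readers who want to explore these public datasets themselves, the sketch below shows one way the combination step could look in R. It is an illustration under stated assumptions, not the presenters’ actual pipeline: the file names and the size_category and completion_rate variables are invented, though UNITID is the genuine IPEDS institution key.

```r
# A minimal sketch (illustrative file and variable names): join
# institution-level extracts on the IPEDS UNITID key, then rank each
# institution within a peer group defined by shared attributes.
library(dplyr)
library(readr)

ipeds     <- read_csv("ipeds_characteristics.csv")  # UNITID, control, size_category, ...
scorecard <- read_csv("college_scorecard.csv")      # UNITID, completion_rate, ...

benchmark <- ipeds %>%
  inner_join(scorecard, by = "UNITID") %>%
  group_by(control, size_category) %>%              # peers share control and size
  mutate(completion_pctile = percent_rank(completion_rate)) %>%
  ungroup()

# Each institution's percentile within its peer group is then ready to
# feed an interactive BI report.
```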
Dr Elvira Fonacier
University of Technology Sydney
4.2 – Increasing response rates in the Student Experience Surveys (SES) – how we did it!, Tasman Room B, November 12, 2019, 2:20 PM – 3:00 PM
Biography:
Dr Elvira Fonacier is the Rankings Program Manager at UTS, who also currently has oversight of the Survey Management Team in the Planning & Quality Unit (PQU). As Rankings Manager, Elvira manages UTS’ data submissions to the major international university rankings schemes, and provides strategic advice to the Senior Executives on rankings performance and results. How she wears two rather distinct hats is a longish story; more importantly, Elvira was an academic for many years and is therefore familiar with student surveys, how they may affect (or not) the student learning journey in higher education, and how their results are utilised. Elvira’s main focus in Survey Management at UTS is to improve the community’s engagement with, and participation in, all surveys administered (both UTS- and government-sponsored), to improve their delivery and administration, and to improve the way survey results are analysed and utilised to improve the student learning experience.
The Student Experience Survey (SES) collects information that helps both higher education institutions and the government improve teaching and learning outcomes, and reports on multiple facets of the student experience. Australian universities participate in the annual SES exercise in the hope that they will be able to not only learn from their own feedback outcomes about the student learning experience, but also benchmark themselves against other participating universities. One important facet of the SES is student engagement and response to the survey itself, which is quite concerning for the many universities that find themselves near the bottom of the national response rate tables reported by the Social Research Centre (SRC).
The University of Technology Sydney (UTS) recorded a slightly below average overall quality of educational experience for undergraduates in the SES in 2015-2016. Its response rate then dropped significantly in 2017, to 24%, which the SRC recorded as a “low response rate”; this was worrying to the institution. With low response rates it is easy to assume that survey results do not reflect a true picture of the experience of students, and are therefore a less reliable basis for identifying areas that need improvement. In addition, the pooling methodology used by the Government to publicly report survey results on the QILT website gives higher weighting to results from years with higher response rates.
This presentation discusses a case study of an institution that recognised the low response rates of its students to the SES as both an issue and a challenge, and in turn acted not only to improve student engagement with the SES, but more importantly to improve the culture around giving feedback and following through to “close the loop”. The presentation will describe the various approaches and strategies that were employed to improve student response rates to the SES, and their resulting effects.
Ms Alex Sieniarski
Australian Catholic University
9.1 – Forecasting student load, Tasman Room A, November 13, 2019, 10:20 AM – 11:00 AM
Biography:
Alex Sieniarski
Alex is National Manager, Analytics within the Office of Planning and Strategic Management at Australian Catholic University. Alex has experience in data science, business intelligence, statistics, analytics and design. Leading a team of five, she is responsible for university-wide statistical analysis services, enrolment forecasting, rankings optimisation, Government reporting, funding estimates, surveys and institutional research.
Australian Catholic University (ACU) is unique in that it has eight campuses in five different States, each with its own student cohort demographics and differing course offerings. Student load forecasting is a proactive process that assists in determining which courses will be offered at each campus and how many offers will be made to prospective students each year. Ultimately, university budget income and expenditure and human resource planning are based on student load forecasts. Load forecasting requires knowledge of past and future course demand, competitor and internal strategic analysis, and an understanding of the industry’s local and global conditions. Inaccurate forecasts, whether they underestimate or overestimate, incur costs and reduce the efficiency of university operations.
This presentation will explain a number of different student load forecasting methodologies, including Holt-Winters, ARIMA and Centred Moving Average (CMA) forecasting of time series. It outlines the challenges involved in choosing a method for load forecasting, taking into consideration the large number of internal and external factors that influence the number of students admitted to university each year. The presentation also notes that because some external factors, such as labour market forecasts and changes in Government policy, are difficult to quantify and integrate into a simple model, a forecast may be atypical and not match past experience.
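To make the named techniques concrete, here is a minimal R sketch using base R functions on an invented annual EFTSL series; it illustrates Holt-Winters, ARIMA and a centred moving average, and is not ACU’s actual model.

```r
# A minimal sketch (invented figures): three approaches to forecasting an
# annual student load series in base R.
load <- ts(c(9800, 10150, 10420, 10900, 11300, 11210, 11650, 12040),
           start = 2012, frequency = 1)

# Holt-Winters exponential smoothing (no seasonal term for annual data)
hw_fit <- HoltWinters(load, gamma = FALSE)
hw_fc  <- predict(hw_fit, n.ahead = 3)

# ARIMA: a simple (1,1,0) specification; in practice the order would be
# chosen from ACF/PACF inspection or an information criterion
arima_fit <- arima(load, order = c(1, 1, 0))
arima_fc  <- predict(arima_fit, n.ahead = 3)

# Centred moving average (CMA) smoother for inspecting the trend
cma <- stats::filter(load, rep(1/3, 3), sides = 2)

print(hw_fc)
print(arima_fc$pred)
```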
Mr Luke Havelberg
Flinders University
12.3 – Transforming your planning capability with cloud and agile, Tasman Room C, November 13, 2019, 1:50 PM – 2:30 PM
The technology and business processes that underpin the Flinders Planning & Analytics team needed to be transformed to enable the team to support the delivery of the university’s new strategic plan, “The 2025 Agenda”. The transformation has a two-pronged approach, combining cloud and agile to change every aspect of how the team works. While the transformation is still early in its implementation, there is plenty to talk about and share… What is the transformation? What has been delivered so far? What have been the big challenges? And what have been the big successes?
Ms Lifen Sudirjo, Mr Pablo Munguia
RMIT University
12.2 – Understanding Four Years of feedback from RMIT Vocational Education (VE) Learners and Employers, Tasman Room B, November 13, 2019, 1:50 PM – 2:30 PM
Biography:
Lifen Sudirjo is a data analyst in RMIT University Learning Analytics, managing VE and HE student and graduate surveys. Prior to her employment in Australia, she was a Statistics for Business lecturer and was involved in a few collaborative research projects at Maranatha Christian University in Indonesia. Her research reports were published in the local research magazine.
Pablo Munguia is an associate professor in learning analytics and marine biology within the Education Portfolio at RMIT University. His marine biology research focuses on behavioural ecology, climate change and community ecology of shallow water environments. His learning analytics research focuses on metric design, behavioural modeling, and student-information interactions at multiple scales. He is a recipient of the Best Teacher Award from the University of Adelaide and is a Fulbright fellow. He has held several editorial roles across major journals and currently is Editor in Chief of J. Exp. Mar. Biol. Ecol. and Handling Editor of Oecologia.
The Australian Council for Educational Research (ACER) in support of the Australian Quality Training Framework (AQTF) has developed a set of surveys to help Registered Training Organisations (RTOs) in collecting and using feedback from learners and employers. These surveys measure the current training quality to help identify enhancement needs and fulfil mandatory annual reporting requirements.
The two surveys are the Learner Questionnaire (LQ) and the Employer Questionnaire (EQ). The LQ measures learners’ satisfaction with the quality of training, engagement and perception of competency development. The EQ measures employer feedback about the quality and outcomes of vocational education and training, and about the responsiveness of the training organisation.
RMIT has been participating in the two surveys since 2009. Here, we use RMIT’s LQ and EQ data from 2015 to 2018 to address two objectives concerning learners’ and employers’ perspectives and experiences with RMIT as a training organisation:
First, we were interested in understanding learner and employer satisfaction with the training quality provided by RMIT.
Second, we wanted to know learners’ and employers’ perceptions of how the training has prepared learners to be ready for work. We then analyse the two surveys to better understand the strengths, gaps and actions that can help improve the training for the learners in support of industry.
Ms Rosie Williams
University of Melbourne
12.1 – University of Melbourne’s Strategic Planning Cycle: Integrating Strategy and Planning, Tasman Room A, November 13, 2019, 1:50 PM – 2:30 PM
Biography:
Rosie has worked in the higher education sector for over 15 years in a variety of roles. Rosie started her career in the UK working in research administration and data analytics before moving to Australia in 2013. In her current role Rosie is responsible for supporting the effective delivery of the integrated strategic planning framework for the University of Melbourne, providing analytics and advice to support senior leaders in their decision making, and conducting institutional research including competitive positioning. She has a Master’s in Social Research and a Bachelor of Philosophy.
Strategy helps an organisation determine its longer-term goals and objectives, and the actions and associated allocation of resources required to achieve those goals. An effective strategic planning cycle ensures decision-making is underpinned by a consideration of both internal and external drivers impacting the University and the sector more broadly, and is supported by robust analysis and planning.
The University of Melbourne’s (UOM) integrated strategic framework considers strategy, risk, planning and performance at both a whole-of-university and divisional level (Faculty/College, Chancellery portfolio, central services, etc.). The role of the central planning team is to coordinate the planning cycle to ensure there is alignment between University objectives and divisional priorities. UOM is currently in the process of developing a new university strategy and is actively considering further amendments to the process and cycle as we begin to implement the new strategy from next year.
This case study will discuss the following aspects of the UOM strategic planning cycle:
Learning outcomes:
Attendees will gain an understanding of UOM’s Strategic planning framework, including:
Mr Phil Aungles
Australian Government Department of Education
11.3 – A multivariate analysis of graduate employment outcomes, Tasman Room C, November 13, 2019, 12:10 PM – 12:50 PM
Biography:
Phil Aungles currently works in the Department of Education with responsibility for managing the Quality Indicators for Learning and Teaching (QILT) initiative. He has a long-standing interest in educational measurement. His current focus is the examination of higher education outcomes; prior to that he had responsibility for the development of the precursor to NAPLAN within the Schools area of the department.
This presentation combines data from the Graduate Outcomes Survey with enrolment data from the Higher Education Student Statistics collection to analyse graduate employment outcomes. A multivariate analysis allows full examination of the impact of a range of student characteristics and labour market factors on graduate employment outcomes.
Ms Zaneta Park, Ms Rossana Couto-Mason
Massey University
10.3 – The magic of WordR – Automatic Production of Hundreds of Customised Word Documents using R, Tasman Room C, November 13, 2019, 11:25 AM – 12:05 PM
Biography:
Zaneta is the Senior Analyst of Institutional Research at Massey University, Palmerston North, New Zealand. She has worked with educational data for many years, having worked for seven years at the Ministry of Education in NZ before moving more recently to Massey University. Zaneta is super-excited about the new interactive, visualization tools that are now available, focusing most recently on Power BI. She has also used R for over a decade, and loves automating processes where possible. Zaneta lives on an acre of land on the outskirts of Palmerston North with her hubby, Mike, and has two grown-up daughters (plus seven cats and two dogs).
In these exciting times where a multitude of interactive, visualization tools are available with ever-growing functionality, is there ever a time when plain vanilla ice-cream should still be on the menu? In this presentation, we look at when easily digestible, familiar Word documents might be able to complement your results. But we add a streak of salted caramel by showing how you can produce literally hundreds of personalized Word documents in a few minutes, by using R Code – in particular, by using the WordR package. We recently used this approach to successfully produce individualized Word documents summarizing course performance, for each course in our university. The presentation finishes with a discussion of points to consider when deciding which type of ice cream to offer, and why some people might prefer vanilla whilst others like mint chocolate chip, and then others prefer a medley of flavours…
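As a taste of the batch pattern, here is a minimal R sketch. The template file, field names and course data are invented for illustration; we assume WordR’s renderInlineCode() call, which evaluates inline `r ...` fields in a Word template, can see variables defined in the calling session.

```r
# A minimal sketch: produce one personalised Word document per course.
# 'course_template.docx' is assumed to contain inline R fields such as
# `r course_name` and `r round(pass_rate, 1)`; renderInlineCode() evaluates
# them and writes the completed document. All names are illustrative.
library(WordR)

course_stats <- data.frame(
  code      = c("ABC101", "XYZ202"),
  name      = c("Intro to Analytics", "Survey Methods"),
  pass_rate = c(87.4, 91.2),
  stringsAsFactors = FALSE
)

for (i in seq_len(nrow(course_stats))) {
  course_name <- course_stats$name[i]   # referenced by the template's inline fields
  pass_rate   <- course_stats$pass_rate[i]
  renderInlineCode("course_template.docx",
                   paste0("report_", course_stats$code[i], ".docx"))
}
```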
Ms Chandrama Acharya
Macquarie University
10.2 – Do increased reminders of online surveys have an impact on the data quality? An exploratory study on the SES, Tasman Room B, November 13, 2019, 11:25 AM – 12:05 PM
Biography:
Chandrama works as Manager, Surveys at Macquarie University, managing the operation of national surveys, like QILT, and internal and other ad-hoc surveys. Chandrama has worked in the higher education sector in Australia and overseas over the past 20 years. She is also responsible for the analysis and reporting of student experience and graduate outcome data. She has a background in marketing research, international business, statistics and research on higher education issues, and has published extensively in a number of international journals. Chandrama also provides expert advice to the University community regarding best practices for institutional surveys. She is involved in the efficient use of survey data for benchmarking and other business processes of the University.
It is becoming increasingly difficult to receive a sizable number of survey responses when students are faced with several surveys each semester. A number of strategies have been implemented to increase the response rates of the QILT surveys. As a result, a 12.7 percentage point increase in overall response rate was observed for the SES between 2017 and 2018. Across universities, this increase ranged between 0.7 and 24.4 percentage points. While some universities recorded an increase in satisfaction in all SES focus areas, satisfaction dropped at other universities in 2018.
Response rates may consequently be a potential source of bias. The results from a survey with substantially high non-response may be misleading and non-representative of the whole population. Past research observes that non-responders to surveys are less likely to be satisfied than people who respond. Thus, it is relevant to ask (i) what satisfaction level is expressed by respondents who replied only after a few reminders, and (ii) whether satisfaction expressed by respondents at the earlier stage of a survey is higher than that of those who replied after a few reminders.
This study explores these questions to make sense of a drop in undergraduate students’ satisfaction at Macquarie University in all 2018 SES focus areas despite a reported 14.3 percentage point increase in response rate. This study also investigates whether there is any significant difference in response patterns after each follow-up communication.
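Question (ii) above reduces to a simple group comparison; a hedged R sketch with simulated satisfaction scores (not Macquarie’s SES records) might look like this.

```r
# A minimal sketch (simulated data): compare mean satisfaction for
# respondents who replied to the initial invitation against those who
# replied only after reminders.
set.seed(42)
survey <- data.frame(
  wave         = rep(c("initial", "after_reminders"), times = c(300, 200)),
  satisfaction = c(rnorm(300, mean = 4.0, sd = 0.6),   # early responders
                   rnorm(200, mean = 3.7, sd = 0.6))   # late responders
)

# A two-sample t-test on the wave contrast addresses question (ii)
t.test(satisfaction ~ wave, data = survey)

# Response patterns after each follow-up communication could similarly be
# compared with a chi-squared test on counts per wave.
```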
Mr Amir Rouhi
RMIT
10.1 – Institutional Planning Utilising Markovian-based Model, Tasman Room A, November 13, 2019, 11:25 AM – 12:05 PM
Biography:
Amir H. Rouhi received a PhD in information retrieval from RMIT University in 2018. He works in the Analytics & Insight team at the same university, where he applies his analytics skills to institutional sector data analysis. His publications in the education sector include a paper at SEAAIR 2017, “Vector-based Models for Educational Institution Shape Analysis”, and a presentation at AAIR 2017, “New Angles to Analyse Student Load Distribution Pattern”. In his research, he introduced an institutional shape analysis method utilising Cosine similarity.
He also holds a bachelor’s degree in Software Engineering and a master’s degree in Artificial Intelligence.
The education sector is a complex, multidimensional environment, impacted by numerous internal and external factors. Planning in such a speculative environment demands appropriate tools, especially when forecasting and modeling the future is necessary.
Predictive analytics can help executives to identify the likelihood of future outcomes for their institutions based on past and current data, while also considering internal and external influencing factors. Such analysis can utilize a number of approaches, varying from simple statistical techniques, data mining and predictive modeling tools to advanced machine learning algorithms. Selecting an appropriate yet effective model for two samples of institutional planning is the goal of the current paper.
The Markov chain is a well-known technique for predicting stochastic time-series data and is the technique used in the current research. The suggested model is a homogeneous Markov chain applied to modeling course enrolment.
Generating the transition matrix is the core concept of the model for this application. To achieve this, analyzing the historical data to identify all the possible valid transitional states is the first essential phase. Calculating the transitional probabilities among all the states is the second major and sensitive phase. The rest computes the likelihood of possible future states by implementing different scenarios, tweaking the elements of the primary transition matrix and analyzing the results.
Beyond its ability to forecast stochastic processes, another advantage of the homogeneous Markov model is its simplicity of implementation.
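As a concrete illustration of the phases described above, a minimal R sketch (with invented counts, not RMIT’s data) can estimate the transition matrix from observed year-to-year states and project a cohort forward.

```r
# A minimal sketch (invented counts): estimate a homogeneous Markov
# transition matrix from year-to-year course-enrolment states, then
# project the current cohort forward.
states <- c("Enrolled", "Completed", "Withdrawn")

# Counts of observed transitions between consecutive years (rows = from)
counts <- matrix(c(700, 200, 100,
                     0, 300,   0,
                     0,   0, 150),
                 nrow = 3, byrow = TRUE,
                 dimnames = list(states, states))

# Row-normalise counts into transition probabilities; Completed and
# Withdrawn are absorbing states
P <- counts / rowSums(counts)

# Project a starting state vector two years ahead: v2 = v0 %*% P %*% P
v0 <- c(Enrolled = 1000, Completed = 0, Withdrawn = 0)
v1 <- v0 %*% P
v2 <- v1 %*% P
print(round(v2))

# Scenario analysis, as described above, would tweak elements of P
# (e.g. a lower withdrawal probability) and re-run the projection.
```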
Professor James Smith, Ms Kim Robertson
Menzies School of Health Research, Charles Darwin University
9.2 – “They don’t just tick and flick”: Indigenous viewpoints about evaluation in Indigenous higher education in Australia, Tasman Room B, November 13, 2019, 10:20 AM – 11:00 AM
Biography:
Professor James Smith is the Father Frank Flynn Fellow (Harm Minimisation) at Menzies School of Health Research. Prior to this role he was the Co-Lead of the Indigenous Leadership and Evaluation Network and 2017 Equity Fellow in the Office of Pro Vice-Chancellor Indigenous Leadership at Charles Darwin University. His Equity Fellowship was funded by the National Centre for Student Equity in Higher Education and focused on strengthening evaluation in Indigenous higher education contexts in Australia. The majority of his research straddles Indigenous health and education contexts. Professor Smith has adjunct appointments with the University of Sydney, Curtin University and the University of Saskatchewan.
Growing Indigenous participation and success in higher education is a key priority. Recent academic scholarship has reinforced the importance of strengthening evaluation in Indigenous higher education contexts in Australia to achieve this goal. This has paralleled national and global commentary about the importance of Indigenous data sovereignty within Indigenous affairs policy and program settings, including the current Indigenous Evaluation Strategy work being led by the Productivity Commission. In this presentation we draw on in-depth interviews with 24 Indigenous scholars from all state and territory jurisdictions across Australia to describe evaluation in higher education from an Indigenous standpoint. The research subsequently privileges Indigenous voices and identifies enablers and drivers likely to strengthen evaluation of Indigenous success in higher education contexts. These include growing Indigenous leadership; increasing funding and resources; investing in strategy development; leading innovative policy development, implementation and reform; investing in cultural transformation and quality improvement; addressing white privilege and power; improving Indigenous student outcomes; valuing Indigenous knowledges and prioritising Indigenous epistemologies; incentivising cultural competence; embracing political challenges as opportunities; promoting cultural standards and accreditation; reframing curricula to explicitly incorporate Indigenous knowledges and practices; investing in an Indigenous workforce; and recognising sovereign rights. We discuss these findings in the context of three primary domains of control – Indigenous control, Government control and University control. In doing so, we unpack the socio-political complexities of negotiating evaluation work specific to Indigenous success in higher education in Australia. We will discuss the implications of, and necessity for, institutional reforms in Indigenous-focused evaluation.
Mr Gimwah Sng, Mr Shane Smith
Social Research Centre
8.2 – Graduate employment destination – exploring the relationships between study area, occupation and industry of employment, Tasman Room B, November 13, 2019, 9:35 AM – 10:15 AM
Biography:
Gimwah Sng, Senior Data Scientist, has been working in the QILT program at the Social Research Centre (SRC) since 2016. Prior to this he spent 3 years working for the Australian Government, managing and working with various datasets for analysis and reporting. He has a Master of Science (Statistics) degree and is keenly interested in using data science tools to explore and improve.
Shane joined the SRC as a Data Scientist in the QILT Team after graduating from a Master of Analytics. Shane’s interests include analysing geospatial data, demographic data, and developing a Bayesian model to improve his footy tipping.
The SRC is the independent administrator of Quality Indicators for Learning and Teaching on behalf of the Australian Government Department of Education and Training.
Higher education is an investment. Data from the GOS indicate that approximately 70% of graduates are employed full-time, four to six months after completion of their studies, and this proportion increases to approximately 90% three years later (as shown in the GOS-L).
Full-time employment is defined in the GOS and GOS-L as working 35 hours or more per week in the week prior to survey. It says nothing about the nature of the work or the industry, how many jobs comprised the 35 hours, or whether the work is related to the area of study. These are all very important factors in the perception of the return on investment in higher education, especially from the graduate’s perspective.
Overseas studies (as quoted here) have looked at approaches to classifying graduate roles, including work by Elias and Purcell (2013), which classified roles based on occupational skill descriptions, and Green and Henseke (2014), who applied a statistical model to labour force survey data in their classifier. Both studies were conducted for occupations in the United Kingdom, rather than in the Australian context, and neither used a GOS-style graduate survey. The labour force survey data used by Green and Henseke was large; however, the number of respondents recorded in the GOS data is much larger and may provide for a more detailed analysis.
This analysis will explore common ANZSIC/ANZSCO employment destinations by ASCED course, and the extent to which these destinations engage the skills of graduates, using GOS data from 2016 to 2018. Data from the SPOQ module within the GOS and GOS-L will be used to triangulate the labour force data in the survey.
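As a simple illustration of the destination exploration described above, one could cross-tabulate study area against occupation group in R; the records below are invented, and GOS microdata is not reproduced here.

```r
# A minimal sketch (invented records): count graduate destinations by
# ASCED study area and ANZSCO major occupation group.
grads <- data.frame(
  asced_area   = c("Health", "Health", "IT", "IT", "IT", "Education"),
  anzsco_major = c("Professionals", "Community and Personal Service Workers",
                   "Professionals", "Professionals",
                   "Clerical and Administrative Workers", "Professionals")
)

# Cross-tabulation of study area by occupation destination
table(grads$asced_area, grads$anzsco_major)

# Row proportions show, for each study area, the share of graduates
# entering each occupation group
prop.table(table(grads$asced_area, grads$anzsco_major), margin = 1)
```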
Mr Dean Ward
Edith Cowan University
8.1 – Diversity and Equity 2.0 – Changing the way we view students from Broad Sociological Groups to Detailed Individual Intersections, Tasman Room A, November 13, 2019, 9:35 AM – 10:15 AM
Biography:
Dean is Strategic Information Manager at ECU where he leads a team responsible for government data submissions, strategic analysis, forecasting and decision support.
He has around 40 years’ experience in strategic analysis across a broad range of sectors and industries, from public utilities and gas and oil extractive industries to, since 2002, Higher Education.
Over his career, Dean has applied quantitative approaches and utilised current technologies to optimise outcomes and support evidence-based decision making in the context of real-world dynamics and uncertainties. He is a Fellow of the Governance Institute of Australia, the Institute of Chartered Secretaries and Administrators (UK), and CPA Australia. He holds a Master of Business (Distinction) and a Bachelor of Commerce (UWA).
When the framework for the collection of Australian Higher Education Diversity and Equity information was constructed, it was innovative and arguably ahead of most nations. In the intervening 30 years, the data collection has not evolved with the changing Australian social and cultural milieu, nor with sector needs. The sector continues to utilise the original variables, with no uniform approach towards a more contemporaneous and modernised collection.
This presentation will show how ECU has addressed these issues through a modernised collection that meets its needs.
Mrs Nathalie Henning, Mr Joshua Soo
University of Tasmania, AlphaBeta Advisors
7.3 – The long road of organisational reformation – how to shift from HiPPOism to a data-driven and evidence-based culture, Tasman Room C, November 13, 2019, 8:50 AM – 9:30 AM
Biography:
Nathalie joined UTAS’s strategy team in 2018. As an expert in data analysis, Nathalie sees the disconnect between data analysis and decision making as one of the key issues facing modern organisations. Nathalie is passionate about bridging this gap and engaging people with data. Prior to her role with the University, Nathalie acted as a data translator in several organisations across Australia and Switzerland.
Nathalie holds a MSc in Business Administration from the University of Bern and a BA in Art History and Economics from the University of Basel.
Joshua is an Engagement Manager with AlphaBeta Advisors with more than six years of experience. He uses advanced data analytics and consistently pioneers bespoke innovative economic and financial modelling to help education, infrastructure, and aviation clients solve some of the country’s most complex problems.
He graduated with a Bachelor of Actuarial Studies with First Class Honours and University Medal from ANU.
In a modern organisation, information and data need to be democratised across all levels so that decision makers can easily interrogate data and make decisions based on objective evidence. The University of Tasmania’s Student Profile Model, which is the focus of this presentation, combines student, demographic, and industry insights. This model is utilised by decision makers across the university, playing a key part in the University’s process of organisational reformation.
The key complaint in organisations when it comes to data is lack of accessibility, particularly given the demands to make decisions within short time frames. This inevitably leads to an environment in which decision making is based on opinion rather than fact. The challenge is not that institutions lack data, but that they face large amounts of unstructured and unorganised data.
Succeeding in the higher education sector over the next 10 to 15 years will depend strongly on universities having a comprehensive understanding of their medium- to long-term student populations, as well as being able to analyse and quantify the impact of the risks the sector will be facing: for instance, a potential decline in Chinese international students or, for the University of Tasmania more specifically, population decline in its core market. Within the university sector, however, student data is duplicated, spread across different systems or spreadsheets, incomplete, inaccurate and out of date. The real value of institutional data can only be realised if it can be easily accessed and transformed from its raw state into actionable insights.
Over the past 12 months, the University of Tasmania, with outside advice from Joshua Soo, has undertaken in-depth analysis of its enrolment data in conjunction with Tasmania’s population and industry forecasts, and developed the Student Profile Model, which provides a consolidated view of the University’s medium- to long-term student population, inclusive of their locations, the curriculum they will undertake, and the mode of delivery. This presentation will showcase how the Student Profile Model has helped guide the University through two major transformation projects and played an integral part in the University’s long-term strategic planning process.
Mr Stuart Terry
Otago Polytechnic
7.1 – Using data to support diversity and inclusion, Tasman Room A, November 13, 2019, 8:50 AM – 9:30 AM
Biography:
Stuart Terry is the Organisational Researcher at Otago Polytechnic, New Zealand and leads a team who gather, analyse and report feedback from current students, graduates and staff. Over recent years, Stuart has developed online systems to enable both students and teaching staff to better engage in meaningful feedback and evaluation processes. In addition, all staff are supported to participate in reflective evaluative practices which are developmentally focused. These processes include annual staff engagement surveys, peer-to-peer reflection and individual 360 feedback for leaders and teams at Otago Polytechnic and for a number of other organisations. In 2014 Stuart established Proud@OP for LGBTiQ+ staff and led the institution’s successful application for the nationally recognised Rainbow Tick diversity and inclusion certification.
Stuart holds a Bachelor of Commerce (Management and Marketing) and Postgraduate Diploma in Commerce from the University of Otago and is currently undertaking Doctor of Professional Practice studies.
Since the turn of the 21st century, we have witnessed a rise in the visibility of the lesbian, gay, bisexual, transgender, intersex, queer (LGBTiQ+) community. However, while diversity and inclusion are terms widely used and referenced in organisational statements, it is only recently that organisations and institutional researchers have focused attention on workplace issues facing LGBTiQ+ employees.
A truly diverse and inclusive environment is directly linked to enhanced performance and strengthened reputation. It is not only members of the LGBTiQ+ community who are interested in diversity and inclusion. Evidence shows that more and more job applicants review an organisation’s approach to diversity and inclusion as a consideration when applying for jobs. Creating an environment where people are able to bring their whole selves to work is a key factor in being able to attract and retain the best people.
In 2016 Otago Polytechnic became the first Institute of Technology and Polytechnic (ITP) and second tertiary education institution in New Zealand to receive the nationally recognised Rainbow Tick. The Rainbow Tick is an accreditation designed to demonstrate LGBTiQ+ inclusive practices and service delivery in the workplace. In 2019 the institution received a nationally recognised Rainbow Excellence Award for Programme, Policy and Practice.
The institutional data collected formed the basis on which Otago Polytechnic was able to clearly demonstrate measurable improvements in LGBTiQ+ inclusion and staff wellbeing on a sustained basis. Action plans informed by the analysis of the data provided evidence that OP had inclusive practices to support the diversity of staff, earning national recognition as a great workplace for all staff.
The presentation discusses:
Dr Robert Dalitz, Mr Liam Hooker
University of Canberra
6.3 – Who needs QILT? A business intelligence journey, Tasman Room C, November 12, 2019, 4:20 PM – 5:00 PM
Biography:
Liam has worked in Business Intelligence in Higher Education since 2012, joining the Planning and Analytics team as a Business Intelligence Developer at the University of Canberra (UC) in 2016. Liam has utilised tools for data analytics and visualisation such as SAP BusinessObjects, SAP Crystal Reports, Power BI, SQL, R and Excel. Liam worked as the Project Manager overseeing and developing an end-to-end BI solution for the GOS and SES. Liam has a Master’s in Business Informatics and is also undertaking postgraduate studies in Data Science.
The Quality Indicators for Learning and Teaching (QILT) surveys are a valuable source of information for higher education institutions, providing feedback on student experience and learning outcomes. By analysing this data, institutions can implement strategies to curtail and reverse undesirable student and graduate results and strive for excellence in the multiple areas that the QILT surveys address.
This project aims to turn isolated, ad hoc analytics into a repeatable BI solution that can be consumed by a range of users with differing roles at multiple levels, including the Vice-Chancellor, Business and Academic Executives, Faculty Managers, Course Convenors, Administrators, Student Support Officers, among others.
The Planning and Analytics team at the University of Canberra (UC) has taken the initiative, with informal support from senior management, to build an end-to-end BI solution for the Student Experience Survey (SES) and Graduate Outcomes Survey (GOS).
This presentation details the journey from:
Mr Shane Compton, Dr Benjamin Phillips, Dr Paul Lavrakas
The Social Research Centre
6.2 – Using loss framing to optimise response in the context of the Graduate Outcomes Survey – an experimental trial, Tasman Room B, November 12, 2019, 4:20 PM – 5:00 PM
Biography:
Shane is an applied social policy researcher with 17 years of consulting experience in Australian Government and research agency positions. His career has been spent in research environments with a focus on corporate reporting indicators or involving compliance with regulatory policy. At the Social Research Centre, Shane has end-to-end advisory responsibility for research studies within the health and vocational outcomes; and service experience spaces.
With a mindset of actively seeking to maximise the value of publicly funded research, Shane has a strong desire to collaboratively advance social policy for the Australian Government and social research community. Shane also has considerable experience with researching topics of a sensitive subject matter and with potentially vulnerable or dependent audiences.
Shane has a BSc (Psych) and a Master’s Degree in Applied Science. He is also a member of the Australian Market and Social Research Society with Qualified Practicing Market Researcher (QPMR) accreditation.
Background:
Loss framing is a way of presenting requests in terms of potential losses rather than benefits to take advantage of the human psyche’s tendency to weight losses more heavily than rewards (prospect theory; Kahneman and Tversky [1979]).
The Graduate Outcomes Survey (GOS) is part of the Quality Indicators for Learning and Teaching (QILT) suite of surveys – the only source of national data on graduate experiences with higher education in Australia. The final part of the GOS asks graduates to provide the contact details for their supervisor to participate in the Employer Satisfaction Survey (ESS). The ESS collects the insights and perceptions of employers to help understand how well higher education is equipping graduates for the workforce. Of all employed graduates in the 2018 GOS, 10.7 per cent provided sufficient contact details to approach supervisors as part of the ESS. Of these employers, however, 52.0 per cent subsequently completed the ESS. Maximising graduate consent to providing supervisor contact details is important for minimising coverage error. This presentation will also review other experimental trials of loss framing in email communications for QILT studies.
Methodology:
This randomised controlled trial tested the presentation of a loss-framed statement to graduates on the importance of providing their supervisor’s contact details against a control condition (where this statement was not provided), under a systematic random sample. The loss-framed statement read:
“Without your supervisor’s input, results from this survey will be less useful to policy makers. The government uses input from graduates and employers to understand how well higher education institutions are preparing graduates for the workforce”.
Result and discussion:
While analysis is ongoing, the results have demonstrated a significant effect of loss framing when the respondent has already committed effort to the survey process. Loss framing and the use of ‘help’/‘harm’ type text, however, does not always work, especially when the respondent has not already invested time and effort. This knowledge is useful in understanding the appropriate use of this technique in survey-based research to maximise response and minimise non-response error.
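For readers interested in the analysis itself, the trial’s primary contrast reduces to a two-proportion comparison; a hedged R sketch with hypothetical counts (not the trial’s actual results) is below.

```r
# A minimal sketch (hypothetical counts): compare the rate of graduates
# providing supervisor contact details under the loss-framed statement
# versus the control condition.
consented <- c(loss_frame = 620, control = 540)   # graduates providing details
invited   <- c(loss_frame = 5000, control = 5000) # graduates in each arm

# Two-sample test for equality of proportions
prop.test(consented, invited)
```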
Dr Stewart Craig
Monash University
5.3 – Automation of Routine Reporting with R Markdown, Tasman Room C, November 12, 2019, 3:35 PM – 4:15 PM
Biography:
Stewart Craig is a Performance Analyst with the University Planning and Statistics team at Monash University. He completed a PhD in cognitive psychology at The University of Western Australia in 2012 before moving to Melbourne to work as an analyst in a research and consulting company. He joined Monash in early 2018 where he undertakes reporting and modelling to help monitor Monash’s performance and inform strategic planning. Stewart has a keen interest in analysis methods and software and how these can be used to improve analytics within universities.
As with all higher education institutions, the data and analytics teams throughout Monash University prepare a range of regular reports to monitor performance across the university. Many of these reports have historically been routine and time consuming to prepare. As such, the University Planning and Statistics team at Monash have begun automating the production of routine reports using R Markdown documents. R Markdown documents allow report text and R analysis code to be contained in a single place. The documents can be easily exported to a range of static formats, including Word, HTML and PDF. R Markdown can also be used to create dashboards and other interactive HTML documents. In addition to saving time in the production of such reports, R Markdown helps ensure that research is reproducible, allowing the analysis and reports to be re-created with the click of a button. Likewise, R Markdown helps to simplify the process of refreshing reports and reduces errors introduced from manually entering data into text documents. In the session, we present case studies demonstrating how R Markdown documents have been used to improve reporting at Monash University and show how users can start creating their own R Markdown documents.
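A minimal sketch of the pattern, assuming a parameterised template report.Rmd (the file and parameter names are illustrative, not Monash’s actual reports):

```r
# A minimal sketch: re-render one parameterised R Markdown template per
# faculty. 'report.Rmd' is assumed to declare
#   params:
#     faculty: NULL
# in its YAML header and to reference params$faculty in text and chunks.
library(rmarkdown)

faculties <- c("Arts", "Engineering", "Medicine")

for (f in faculties) {
  render("report.Rmd",
         output_format = "word_document",
         output_file   = paste0("performance_", f, ".docx"),
         params        = list(faculty = f))
}
```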
Mr Rintaro Ohno
Tohoku University
5.1 – Strategic Planning via Quality Assurance, Tasman Room A, November 12, 2019, 3:35 PM – 4:15 PM
Biography:
Rintaro Ohno has studied Physics and Mathematics at Würzburg University in Germany, received his Ph.D. in Information Sciences at Tohoku University in Japan, and is currently Senior Assistant Professor at the Strategic Planning Office at Tohoku University. Although he specializes in Complex Analysis and Geometric Function Theory, he taught English and German for freshman classes and provides a wide, interdisciplinary perspective on Institutional Research and related projects.
In recent years, governmental funding for higher education in Japan has shifted its rationale more and more toward key performance indicators and the achievement of objective, measurable figures. And although Japan is still considered one of the top performers in higher education, spending on higher education as a share of GDP or total government expenditure is significantly lower than the OECD average, resulting in stiff competition for funds among institutions. In order to adapt to the evolving situation and improve the strategic planning process, Tohoku University renewed its procedures for faculty evaluation and quality assurance.
The presentation will briefly describe the current circumstances of universities in Japan and discuss the importance of institutional research and the process of quality assurance for improved strategic planning. The focus will be on the process of providing incentives, as well as a sound understanding of the general status of departments and individual members, to compose the optimal roadmap for university projects.
Dr Claire McLean, Dr Alexander Loebbert
Central Queensland University
4.3 – Modelling student behaviour: How to be ethical when using student data in analytics, Tasman Room C, November 12, 2019, 2:20 PM – 3:00 PM
Biography:
Claire McLean is a senior postdoctoral research fellow at Central Queensland University. Her current research concerns predictive modelling of student behavior using data analytics. She is also interested in data privacy and data ethics and the methods by which these can be incorporated into modern educational policy.
Data is a valuable commodity in the modern world. Universities collect and store a vast volume of data on their students, and this can provide an invaluable data set to researchers who aim to use analytics to understand which factors can influence a successful university experience. In this presentation we discuss the ethical implications of using large data sets of previous students to predict student behaviours, such as whether they may require additional academic support. We discuss this through the lens of the Big HEPPP data analytics project, in which Central Queensland University has invested some of its Australian Higher Education Participation and Partnerships Program (HEPPP) funding. The research element of this initiative focuses on using analytics to determine whether, based on past data, we can predict any differences in the university experience of low socioeconomic status (SES) students as compared to their higher SES peers.
As part of this initiative we carried out a review of the existing role of data in higher education policy formation and considered the ethical implications of introducing predictive models into this space. We noted potential risks to student privacy and considered the mechanisms with which we could reduce any potential risk of students in data sets being identifiable. We also considered that using large data sets to study the effects of a single demographic such as SES may lead to low SES students being treated as a more homogenised group than they are, so that opportunities to tailor support to individuals may be missed.
Here we present a synopsis of the potential ethical considerations in using data analytics to support low SES students, and equitable access to education in general. We discuss best practices to minimise these risks and ensure that universities use data in a way that does not contradict their duty of care to their students. We also consider the role learning analytics may play in future education policy in Australia.
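One concrete mechanism of the kind mentioned above is a minimum cell-size check before any data release; a hedged R sketch with simulated data (not CQU’s actual safeguards) is shown below.

```r
# A minimal sketch (simulated data): flag demographic cells smaller than k,
# which carry a re-identification risk and may need suppression or
# aggregation before analysis or release.
set.seed(1)
k <- 5
students <- data.frame(
  ses    = sample(c("low", "medium", "high"), 500, replace = TRUE),
  campus = sample(c("A", "B", "C", "D"), 500, replace = TRUE)
)

# Count students in each ses-by-campus cell
cell_sizes <- aggregate(list(n = rep(1, nrow(students))),
                        by = students[c("ses", "campus")], FUN = sum)

subset(cell_sizes, n < k)   # cells that fail the k-anonymity threshold
```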
Ms Bronwen Loden, Mr Xichao Wang
University of Tasmania
4.1 – Translating EFTSL into applicants: making targets meaningful for marketing, Tasman Room A, November 12, 2019, 2:20 PM – 3:00 PM
Biography:
Bronwen Loden:
Bronwen currently works as a Data Analyst in the Market Research team at the University of Tasmania. She has spent the last 10 years working in tertiary education marketing and over this time has made a steady transition from traditional marketer to data-driven marketer. Bronwen is currently completing a Master of Business Analytics with Deakin University.
Xichao Wang:
Xichao worked in supply chain areas for a few years and undertook research regarding Neural Networks. Xichao is now a member of Business Intelligence at the University of Tasmania and loves using machine learning techniques to solve real-world business problems.
Universities set enrolment targets in terms of equivalent full-time study load (EFTSL) and their marketing departments are tasked with recruiting enough students to meet these targets.
There can be a significant time lag in the application cycle between when a prospective student first submits an application and when they are made an offer, accept it, and subsequently enrol and have EFTSL attributed to them. Because of this time lag, application numbers are the key success metric for marketing managers while a campaign is in market. If marketing managers wait for EFTSL figures, they won't know whether they've been successful until it's too late to do anything about it.
At the University of Tasmania, the Marketing department had a variety of methods for approximating the number of applications required to reach EFTSL targets, but none was consistent or reliable, and all were time-consuming to produce manually.
Working with the Market Research team as their key stakeholder, the Business Intelligence Unit developed a predictive model in Alteryx to standardise and automate this process. The model is divided into two parts: first, it calculates the conversion rate and measures the model's accuracy; then, it connects to the target EFTSL figure to forecast the applicants needed. This creates an individual target for each course, which can be rolled up to overarching study-level, College, and University-wide targets. The model takes into consideration not just the applicant's first-preference course but also the highest preference offered, so it can adjust once offers have been made and provide more accurate targets as the application cycle progresses.
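As a rough illustration of the two-part logic described above, a minimal pandas sketch follows. It assumes hypothetical column names and a simplified history table; the production model is an Alteryx workflow and also incorporates the preference-adjustment logic.

```python
# Illustrative sketch of the two-part model described above. The production
# model is built in Alteryx; this pandas version uses hypothetical column
# names and omits the preference-adjustment step for brevity.
import pandas as pd

def applicant_targets(history: pd.DataFrame, targets: pd.DataFrame) -> pd.DataFrame:
    """Estimate the applicants needed per course from historical conversion.

    history: one row per past applicant, with columns
             ['course', 'enrolled' (bool), 'eftsl' (EFTSL if enrolled, else 0)]
    targets: one row per course, with columns ['course', 'target_eftsl']
    """
    per_course = history.groupby("course").agg(
        applicants=("enrolled", "size"),
        enrolments=("enrolled", "sum"),
        eftsl=("eftsl", "sum"),
    ).reset_index()
    # Part 1: historical conversion rate and EFTSL yield per enrolment.
    per_course["conversion_rate"] = per_course["enrolments"] / per_course["applicants"]
    per_course["eftsl_per_enrolment"] = per_course["eftsl"] / per_course["enrolments"]
    # Part 2: applicants needed = target EFTSL / (yield x conversion rate).
    out = targets.merge(per_course, on="course")
    out["applicants_needed"] = (
        out["target_eftsl"] / (out["eftsl_per_enrolment"] * out["conversion_rate"])
    ).round()
    return out[["course", "target_eftsl", "applicants_needed"]]
```

Course-level figures produced this way can then be summed to study-level, College and University-wide totals, as the abstract describes.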
This presentation will cover the methodology of the predictive model, the challenges overcome along the way, and the real-time reporting tools developed to make this data meaningful for the Marketing department.
Incorporating the output from this model into UTAS' Business Intelligence platform gave Marketing Managers access to rich data that provides applicant targets and monitors progress in meeting them. Future plans include enhancements to the model as more data becomes available.
Miss Kate Bramich
University of Tasmania
3.3 – How an in-house Market Research function enables data-driven decisions, Tasman Room C, November 12, 2019, 1:35 PM – 2:15 PM
Biography:
Kate Bramich:
Kate is the Market Research Manager at the University of Tasmania. She has worked as a market researcher for 17 years in Australia and England across multiple facets of consumer research, government research, FMCG/retail measurement and media measurement, in both qualitative and quantitative research. Most of this time was spent agency-side with global market research agencies Nielsen and Ipsos. Kate was born and bred in Tassie. She returned home to raise her daughter in 2015 and has worked for the University of Tasmania for the past two years.
Alisa Ward
Having worked in the Higher Education sector for more than a decade, Alisa has multi-disciplinary experience in tertiary marketing from the perspective of a designer, communicator and creative marketing professional. She now works as a Market Analyst in the Market Research team at the University of Tasmania and has an ever-growing interest in data and strategy.
The role of the University of Tasmania's Office of Marketing is to develop marketing programmes and strategy to grow the University's reputation and build the national and international attractiveness of its programmes.
In mid-2017 the University's Marketing portfolio was restructured, moving its focus from student load alone to developing the brand and building reputation and standing. The last quarter of 2017 saw the development of a central marketing team with an in-house market research function, as part of a strategy to apply consumer insights to marketing and communication strategies and help revise our goals, programmes and image.
Over the last two years our Market Research team has instituted a number of performance monitors, giving us the ability to derive critical insights into market behaviour through research, identify opportunities, and provide strategic guidance on an ongoing basis to ensure we remain competitive.
In this presentation we will talk about how Market Research has enabled the University to move to a data-driven mindset, the projects that helped achieve this, and the challenges we encountered along the way.
Dr Robert Dalitz, Mrs Poorni Apoutou
University of Canberra
3.2 – Linking student satisfaction with teaching units to overall course satisfaction: An exploratory analysis, Tasman Room B, November 12, 2019, 1:35 PM – 2:15 PM
Biography:
Rob has worked in the higher education sector for 17 years, mainly in research, analytical, and survey management positions. He is Senior Information Analyst and Survey Manager for the University of Canberra. Previously Rob worked for Universities Australia as their data analyst and coordinated the DVC-Corporate group, during which time he developed and managed the implementation of the UA data sharing agreement. Rob provides analysis and advice on a range of issues as well as managing the survey function at UC.
The relationship between satisfaction with individual units and overall student satisfaction at the course level is important but not well understood. Understanding the drivers of overall satisfaction can help improve the student experience. Typically, teaching quality is the factor most highly correlated with overall satisfaction. Consequently, improving our understanding of the relationship between teaching (occurring in individual units) and students' overall satisfaction with the course they are enrolled in can provide important insights into improving the student experience. The University of Canberra (UC) uses the Interface Student Experience Questionnaire (ISEQ) to gauge student satisfaction with the individual units they undertake. Overall satisfaction is measured at the course level through the Student Experience Survey (SES) Quality of Overall Educational Experience (QOEE) question. We compared students' unit satisfaction to their overall satisfaction with their courses. Different ways of measuring unit satisfaction, and different measures from the ISEQ, were assessed against the QOEE scores for courses. Initial findings indicate that satisfaction with the teaching in units taken in first semester is correlated with QOEE from the SES, run early in second semester. However, the strength of the relationship varies depending on the type of unit and its relationship to the course. Despite the correlation between unit and course satisfaction, there is large variation between average unit satisfaction and QOEE. In particular, low overall satisfaction from the SES is only weakly associated with unit satisfaction.
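For readers interested in the mechanics, the sketch below illustrates the basic form of such a comparison: a Pearson correlation between each student's average ISEQ unit satisfaction and their course-level QOEE response. It is a minimal illustration with hypothetical column names, not UC's actual analysis:

```python
# Illustrative sketch of the comparison described above: correlating each
# student's average semester-1 unit (ISEQ) satisfaction with their course-
# level QOEE response from the SES. Column names are hypothetical.
import pandas as pd

def unit_vs_course_satisfaction(iseq: pd.DataFrame, ses: pd.DataFrame) -> float:
    """Pearson correlation between average unit satisfaction and QOEE.

    iseq: ['student_id', 'unit', 'satisfaction']  (one row per unit response)
    ses:  ['student_id', 'qoee']                  (one row per SES response)
    """
    avg_unit = (
        iseq.groupby("student_id", as_index=False)["satisfaction"]
            .mean()
            .rename(columns={"satisfaction": "avg_unit_satisfaction"})
    )
    merged = ses.merge(avg_unit, on="student_id", how="inner")
    return merged["avg_unit_satisfaction"].corr(merged["qoee"])
```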
Mr Mike Seah, Ms Nadine Kiriwan
Curtin University
3.1 – Transforming the monitoring of international student progression at Curtin University, Tasman Room A, November 12, 2019, 1:35 PM – 2:15 PM
Biography:
Mike holds a Bachelor of Commerce with Honours and a Master of Commerce (Accounting) from Curtin University. He is a member of CPA Australia and a mentor in the CPA program. Mike has more than 16 years of work experience in the tertiary education sector, particularly in the areas of accounting information systems, student finance management, student administration, data analytics and student legislative compliance. He was an academic and sessional academic at the School of Accounting from 2003 to 2017, teaching management accounting and financial modelling at both undergraduate and postgraduate levels.
Background
The Confirmation of Enrolment (COE) is an official document issued to international students by universities in Australia. The COE is required by the Department of Home Affairs for the purpose of applying for a student visa.
International students who are unable to complete their course by the date specified on their current COE, due to compassionate or compelling circumstances, require a new COE in order to renew their student visa a few weeks before it expires.
Historically, students were responsible for monitoring their visa expiry date and were required to submit a COE extension request through an online portal before their visa expired.
Opportunities for Improvement
Curtin University saw an opportunity to create a better experience for our international students and to improve staff management of the COE extension process through a more proactive approach, using the available PRISMS data and the data stored in our Student Management System (SMS). The amalgamation of PRISMS and SMS data will not only be used for tracking COE expiry but will also be useful for monitoring international student progression. It will also facilitate a more proactive approach to monitoring 'at-risk' students, as per Standard 8 of the National Code of Practice set by the Department of Education. The biggest impact of this change will be seen in earlier intervention for 'at-risk' students and the potential retention of those who may otherwise choose to transfer to another institution.
How we achieved this
PRISMS and SMS data are merged to identify students whose COEs end in the months leading up to visa expiry. The output file is then filtered and forwarded to the respective teaching areas, which assess the remaining credits required for completion and the revised expected course completion date. COE extensions are then processed, and students are notified and provided with their new COE.
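A minimal sketch of this merge step is shown below, assuming hypothetical field names for the PRISMS and SMS extracts; Curtin's actual implementation is not specified in the abstract:

```python
# Illustrative sketch of the merge step described above. Field names are
# hypothetical; real PRISMS and SMS extracts will differ.
import pandas as pd

def coes_nearing_expiry(prisms: pd.DataFrame, sms: pd.DataFrame,
                        months_ahead: int = 3) -> pd.DataFrame:
    """Flag students whose COE end date falls within the coming months.

    prisms: ['student_id', 'coe_end_date', 'visa_expiry_date']
    sms:    ['student_id', 'course', 'credits_remaining', 'teaching_area']
    """
    merged = prisms.merge(sms, on="student_id", how="inner")
    merged["coe_end_date"] = pd.to_datetime(merged["coe_end_date"])
    cutoff = pd.Timestamp.today() + pd.DateOffset(months=months_ahead)
    due = merged[merged["coe_end_date"] <= cutoff]
    # Sort by teaching area so each area can be sent only its own students.
    return due.sort_values(["teaching_area", "coe_end_date"])
```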
Benefits:
Dr Cassandra Saunders, Dr Michelle Ye
University of Tasmania
2.3 – Utilisation of QILT data to enhance the student experience: From big data to big impact, Tasman Room C, November 12, 2019, 12:05 PM – 12:45 PM
Biography:
Dr Saunders worked in academia in the Faculty of Health for a number of years before crossing over to the 'dark side' of institutional research in 2012. Cassandra is now a member of the Business Intelligence, Surveys and Analytics team at the University of Tasmania, where she manages all of the core institutional surveys, particularly the data analysis and reporting. She loves immersing herself in a good spreadsheet and has embraced her inner data nerd.
Michelle is a Business Intelligence Analyst in the Strategy, Planning and Performance Unit of the University of Tasmania, where she analyses data sourced from various systems and develops reports and dashboards to inform decision making. She has a PhD in Information Technology, with research interests including power and political behaviours in information system implementations, social influence, resistance to change, group behaviours, business process management, and organisational decision making and problem solving.
Business intelligence (BI) is a broad term describing a wide range of strategies and technologies that gather, store, access and analyse operational data to ensure business success. It provides timely, competitive information to enable better insights for operational and strategic decision making. In the last decade, BI and analytics have become increasingly important in the business of higher education (HE). Impacts from the external environment and dynamic internal change are making it increasingly critical for HE institutions to take advantage of all available information.
In line with this, the BI team at the University of Tasmania has developed a University Management Information, Analysis and Planning (UMAP) portal using IBM® Cognos® software that brings together many data sources into a single, consolidated online environment, in an easily accessible format, to support the University's strategic reporting requirements. The recent amalgamation of the BI team with the Survey Team has provided renewed impetus to incorporate the University's key survey data into the UMAP portal and has led to the development of highly interactive dashboards and reports for the Student Experience Survey (SES) and the Graduate Outcomes Survey (GOS).
A key component of this presentation will be to showcase the methodological approach used to develop the SES and GOS dashboards and reports and provide a detailed demonstration of their functionality and interactivity.
Incorporation of the QILT survey data into the University's BI capabilities has significantly increased the capacity to report on the rich survey data available and allows the data to be 'sliced and diced' from a multitude of new and diverse perspectives, which, in turn, has revealed nuances in the survey results that were previously undetected. It has also provided a significantly more detailed level of business knowledge about the longitudinal student experience, which is actively being used by Colleges and Schools to inform strategic plans, marketing and recruitment.
Mr David Romanowski, Ms Rebecca McMartin, Ms Chandrama Acharya
Macquarie University
2.2 – Visualising insights from the qualitative student surveys, Tasman Room B, November 12, 2019, 12:05 PM – 12:45 PM
Biography:
David works as a Business Intelligence and Data Analyst within the Business Intelligence & Reporting team at Macquarie University. He has worked in the higher education sector over the past 5 years and has further developed his skills in data visualisation. Currently he is involved in a number of different reporting projects, including text analytics.
Chandrama works as Manager, Surveys at Macquarie University, managing the operation of national surveys such as QILT, as well as internal and other ad-hoc surveys. Chandrama has worked in the higher education sector in Australia and overseas over the past 20 years. She is also responsible for the analysis and reporting of student experience and graduate outcome data. She has a background in marketing research, international business, statistics and research on higher education issues, and has published extensively in a number of international journals.
Follow Macquarie University’s exploratory journey with data visualisation and text analysis using qualitative Student Experience Survey (SES) comments.
The recent focus on the student experience at Macquarie University (MQ) has prioritised the analysis of the 36,000+ comments from previous SES data. Insights have led to the publication of a Business Intelligence dashboard for accessibility and improved decision-making.
The MQ Business Intelligence & Reporting (BIR) team used IBM SPSS Text Analytics software to explore student comments and to continually refine the library package. Alteryx was then used to link the qualitative data with quantitative data and build a Tableau visualisation dashboard for key stakeholders.
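As an indication of what the linking step involves, the sketch below reproduces its basic shape in pandas; MQ's production pipeline uses Alteryx as described above, and the column names and theme coding here are hypothetical:

```python
# Illustrative sketch of the linking step described above (performed in
# Alteryx at MQ; column names and the theme coding here are hypothetical).
import pandas as pd

def link_themes_to_scores(themes: pd.DataFrame, ses: pd.DataFrame) -> pd.DataFrame:
    """Join text-analytics theme codes onto quantitative SES records.

    themes: ['response_id', 'theme']  (one row per theme found in a comment)
    ses:    ['response_id', 'qoee', 'course']
    """
    linked = themes.merge(ses, on="response_id", how="left")
    # Average QOEE per theme shows which themes co-occur with low or
    # high overall satisfaction, ready for a dashboard extract.
    return (
        linked.groupby("theme")
              .agg(responses=("response_id", "nunique"), avg_qoee=("qoee", "mean"))
              .sort_values("avg_qoee")
    )
```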
This presentation will cover the BIR team’s journey from the start to where they are now and the challenges that they have faced. What did they learn and what might they do differently if they had to start over?
Mrs Penny Szybiak
Charles Darwin University
2.1 – Enabling versus Sub-Bachelor, Tasman Room A, November 12, 2019, 12:05 PM – 12:45 PM
Biography:
Penny Szybiak is the Director of Planning and Performance at Charles Darwin University (CDU) and is also a student in the Master of Public Policy at CDU. Her team is responsible for administering and reporting on the MyView survey of students' teaching and learning experience. Her team is also responsible for providing CDU's community of policy and program managers with the evidence base required to measure and formulate programs that improve student outcomes at the University. She has more than 10 years' experience in the higher education sector in an institutional research capacity, and is a member of the Executive Committee of the Australasian Association of Institutional Researchers (AAIR).
Government policy regarding pathway courses, such as enabling and sub-bachelor courses, has been in a state of inertia since the introduction of the demand-driven funding system in 2012. Places have been largely capped at both sector and institutional levels since then, with only small adjustments made. Because of this, there is a high level of over- and under-enrolment against allocated places at individual institutions, as well as a net over-enrolment across the sector against the number of funded places available. This signals that neither the current policy for funding pathway places nor the method for allocating them to universities is meeting the need.
To understand how this policy could better cater for the pathway needs of domestic students, the sector needs a clearer picture of the specific roles these two types of programs play, in terms of the types of students they cater for and the reasons students enrol, as well as how efficient and effective the programs are in delivering on their intended purpose, relative to one another.
This presentation will compare and contrast the role, efficiency and effectiveness of sub-bachelor and enabling courses through a sector lens, and then in more detail using Charles Darwin University as a case study. The aim is to help inform government policy surrounding the amount and application of funding for pathway programs across the university sector.
Ms Andrea Jeffreys
La Trobe University
1.1 – A case study in the application of project management methodology to manage risk and complexity in multi-disciplinary course transformation, Tasman Room A, November 12, 2019, 11:20 AM – 12:00 PM
Biography:
Andrea Jeffreys has worked in higher education for the past twenty years and is currently Program Director in the Projects and Business Transformation Office at La Trobe University. Andrea has broad institutional research experience through senior positions held in the College of Science Health and Engineering at La Trobe University and the Health, Science and Business and Law Faculties at Deakin University. In these roles she has managed portfolios including strategy, planning, curriculum, governance and research. Andrea has always focused her career on roles with an analytical and strategic emphasis, and before commencing in higher education she held several Market Analyst, Business Analyst and Data Analyst roles in various sectors including consumer goods, retail, publishing, health and energy. Andrea holds a Bachelor of Science from Deakin University and a Master of Marketing from Monash University.
In 2018, La Trobe University embarked on a major piece of curriculum reform to transform the course offerings in eleven health disciplines. The courses in these disciplines are of considerable strategic importance to the University, generating a significant proportion of revenue each year. They are high-demand and high-ATAR courses offered at La Trobe University's metropolitan and regional campuses through the College of Science Health and Engineering. The decision to change the course offerings was driven by the need to meet compliance obligations, but it was recognised that doing so involved considerable risk. This session will present the application of project management methodology, through a defined program, to manage the transformation of the health courses from their current to their planned future state. A program of work was initiated under the direction of a Program Director and a Program Coordinator Academic to plan and manage the course redesign and associated course approval documentation for each of the disciplines, as well as to coordinate and oversee the various divisional administrative responsibilities in preparation for admission, enrolment and teaching of the new courses. The established program provided a coordinated framework and analytical approach, effectively connecting all parts of the University and ensuring all internal and external compliance obligations were satisfied. It enabled the disciplines to be managed as discrete but related projects and provided extensive communication and support for the large number of internal and external stakeholders affected by the changes. The session will outline the methodology applied, and the benefits, challenges and lessons learned over the past eighteen months, from the commencement to the conclusion of the program.
Mrs Eva Seidel
Flinders University
1.3 – Transforming the way our Planning and Analytical Services team works through Agile adoption and Business Partnership, Tasman Room C, November 12, 2019, 11:20 AM – 12:00 PM
Biography:
Eva has been working in Planning and Analytical Services at Flinders for almost 9 years. She specialises in statistical analysis and loves getting her hands on data. Eva also enjoys speaking with people across the business to understand their needs and discover ways in which she and the wider team can use data and information to solve new problems. Eva has attended many AAIR forums, has presented at a few and has found them to be valuable information sharing opportunities. She hopes you enjoy the talk.
Until recently, Planning and Analytical Services (PAS) was an underutilised data and information service. We provided great data, but major stakeholders just weren't engaging with our outputs in the way we hoped, and they expected more from us. They wanted more data, easier access and more visualisations, but we just couldn't meet their needs. We had a problem.
With a new Associate Director leading our team, we set ourselves a challenge to change the way we work, to better meet our stakeholders' needs. We wanted to be more efficient, collaborative and effective, and to work towards genuine continuous improvement, with the aim of maximising the value we deliver to our major stakeholders.
Two of the changes we made were the full adoption of Agile across the PAS team and the rollout of PAS Business Partnership across our major stakeholders. This presentation will discuss the details of these two changes, how we rolled them out and why they are important.
While these two pieces of change are relatively new and not yet complete, we have travelled the journey long enough to share our learnings and experiences with the AAIR community.
Ms Sonia Whiteley
Maven & Edge
1.2 – Unpacking the Student Experience: an end-to-end model, Tasman Room B, November 12, 2019, 11:20 AM – 12:00 PM
Biography:
Sonia Whiteley is formally trained as a research methodologist and program evaluator and has worked as an applied social policy researcher for more than 25 years. She specialises in educational research with a particular interest in graduate outcomes, student retention, social and emotional development of young people and school readiness.
One of her key strengths is the design, creation and governance of large-scale, often longitudinal, Minimum and Research Data sets from administrative, transactional and survey collections. Notable national initiatives where Sonia has led the data collection include the Dynamics of Income Support Recipients Study (DAISES), Child Support Reform Study, Evaluation of the National Disability Insurance Scheme and the Quality Indicators for Learning and Teaching (QILT).
Sonia is a passionate supporter of institutions and professionals who seek to improve the student experience.
The student experience of higher education starts well before enrolment. Online research, discussions with friends and family, endless reading of review websites and, for a lucky few, professional advice from careers counsellors all contribute to the formation of initial perceptions of what it will be like to study a specific course or program at an institution.
Unfortunately, entry surveys of what commencing students expect from their student experience are rarely undertaken. As such, institutions have no way of knowing whether these expectations are realistic, or whether the student has selected a course that will give them the best chance of meeting those expectations. It is also the case that subsequent surveys of the student experience are entirely decontextualised, as there is no baseline reference point for the individual's engagement with, evaluation of, and ultimately exit from the institution.
With the re-introduction of Performance-Based Funding in 2020, it is now critical that institutions have deep insight into all aspects of the student experience to ensure that attrition is minimised, satisfaction is optimised, equity groups are appropriately supported and, as far as possible, graduates transition effectively to employment. A more holistic, real-time understanding of the student experience is required, creating multiple opportunities for institutions to intervene before a student expresses dissatisfaction or prematurely exits.
A four-part model of the student experience is proposed, addressing each of the stages in the student life-cycle.
The paper focuses specifically on how to use surveys to bridge the gap between the experience universities aspire to provide and the experience students receive.