Title: The SEIT Teaching Support Team for Transition to Online Teaching
Introduction
The UNSW Canberra School of Engineering and Information Technology (SEIT) Teaching Support Team (TST) was founded in 2018 to carry out quality assurance of SEIT courses. The TST consists of academics from within SEIT, who review courses against the SEIT Quality Teaching Framework (QTF).
The QTF was developed with reference to the literature on best practice in teaching in higher education, and with extensive consultation with the school.
Each review consists of three stages: self-review by the convener; review by a TST member; and reporting and agreement on an action plan for development of the course. The development plan is led by the convener unless the course fails to meet the required baseline in the QTF, in which case the action plan is specified by the TST reviewer.
After a series of pilot course reviews was run in 2018 and 2019, the review process was ready to be rolled out in full in S1 2020. The full process involves the review of every course (almost 200 in total) once every 2 years.
It is worth emphasising that the TST consists of SEIT academics and that the QTF was developed in extensive consultation with the school. While the review process is rigorous, it is collaborative, and the TST is a part of SEIT – every SEIT academic is expected to engage with the TST and to serve on it at some point.
In March 2020, the TST for S1 was recruited, conveners had completed their self-reviews, and courses were assigned for review. Then face to face classes were cancelled, almost overnight, in response to the coronavirus pandemic. The TST was asked by the school executive to repurpose itself to support and monitor the transition of all undergraduate courses from face to face to online – effective immediately.
Aims for 2020
Prior to 2020, SEIT undergraduate courses were taught almost exclusively face to face, with close to 100% attendance at classes. There is a close relationship between UNSW Canberra and the Australian Defence Force Academy (ADFA), and Trainee Officers and serving officers are required by the Australian Defence Force (ADF) to attend class. Moodle sites were used primarily to provide additional resources and for submission of assignments.
The aim of the TST for 2020 was to ensure that every undergraduate course offered by SEIT was moved from face to face to fully online, with a minimum of disruption and to the best standard possible given the time frame, and in full compliance with all emerging UNSW, UNSW Canberra and SEIT policies. Hence there was both a support and monitoring role.
Progress / Outcomes / Next steps
The TST was rapidly expanded from 17 members to 30 (approximately one third of SEIT academics), and each of the 54 undergraduate courses offered by SEIT in S1 was assigned a TST contact. Three expert teams were formed – Assessment, Content and Engagement (“to make the transition ACE”) – to support the TST and all SEIT academics. These expert teams were largely drawn from SEIT academics with expertise in postgraduate teaching; in contrast to the undergraduate courses, SEIT postgraduate courses are largely taught online.
As policies and procedures evolved, the TST provided a link between SEIT executive and course conveners, supporting conveners to implement policy as it evolved, and monitoring compliance.
The TST also provided an initial triage service for conveners. Any problem a convener encountered was first raised with their TST contact. If the contact could not solve it, the problem was presented to the whole TST via the Teams site, by email or in a meeting. If the problem was still not solved, it was passed to the relevant expert team. If the problem remained unsolved (and this was rare), it was passed either to the DHoST for policy interpretation issues or to Technology Enhanced Learning Services (TELS) for technical problems. In this way, TELS was able to focus on campus-wide projects and issues, and the SEIT executive was able to focus on policy development without having to explain it multiple times, while still being assured that it was being implemented.
Within two weeks, every course had a revised assessment plan which had been checked for compliance with the new policies and procedures (e.g. S/F grading only, fully online, no timed exams of less than 24 hours' duration, all CLOs to be demonstrated for an overall S). Each course was being delivered online, with all “classes” available asynchronously as well as synchronously where possible. Each course convener was in contact with their students and encouraging them to engage fully in the online course. Every lab component was being redesigned for online delivery.
In the end, every S1 course was delivered successfully, and only a single Course Learning Outcome, for one course, could not be achieved – and that was in a workshop competency course involving the use of tools and machinery. Everything else was successfully moved online. The S1 2020 myExperience data support the claim that the transition was effective: the scores for both course satisfaction and teacher satisfaction increased over the 2019 scores.
The biggest challenge, in the end, was maintaining student engagement. On a campus where attendance is typically at or close to 100%, attendance at synchronous online classes dropped to between 50% and 80%. This was a huge change for lecturers and students, and as of S2 we are still struggling to maintain student engagement.
In S2, TST support was scaled back and monitoring largely ceased, as most conveners had taught online in S1. Conveners were surveyed, and those who wanted TST support were assigned a contact. At this stage, all S2 courses are expected to be completed successfully.
Looking ahead to 2021, it is anticipated that the TST will return to its original role of providing quality assurance via course reviews. However, we will continue to provide a personal TST contact for new conveners, and for those who would like additional support. The sense of community and shared endeavour that was fostered in 2020 is something we want to maintain into the future.
***
Understanding and Addressing Gender Differences on STEM Exams
In Collaboration with:
- Dr Praveen Pathak and Dr Prashant Joshi, Homi Bhabha Centre for Science Education, Mumbai
- Dr Matthew Verdon and Dr Alix Verdon, Australian Science Olympiads Program
- Dr David Low and Umairia Malik (PhD candidate), PEMS, UNSW Canberra
- A/Prof Elizabeth Angstmann, Physics, UNSW Sydney
Introduction: Women are under-represented in STEM degrees and careers worldwide. There are many reasons for this, including less exposure to science and mathematics at a young age, differences in the way teachers interact with girls and boys, lower self-efficacy and stereotype threat, and the expectations of peers and family. There are also cultural issues within science that can make it less welcoming for women than for men. Performance gaps are an important contributing factor too: even if a girl wants to pursue STEM, she cannot do so if she does not meet performance requirements.
While girls perform better than boys in both STEM and non-STEM subjects at school, girls under-perform relative to boys on competitive examinations, including the Australian Science Olympiad Examinations, the Indian National Standard Examinations and university entrance examinations in India. In many countries, including the two most populous nations on Earth, India and China, it is not school grades but these external competitive examinations that determine university access.
This under-performance on competitive examinations contributes in two ways to the low participation rate of women in STEM. First, and directly, girls who score poorly compared with their male counterparts are less likely to be able to access a STEM degree, and hence a STEM career. Second, and indirectly, the poor performance of girls on these examinations affects the self-esteem and self-efficacy not only of the girls themselves, but also of younger peers, who may note the lack of success of girls in accessing degrees and competitive programs.
Aims: We have been exploring the gender differences in performance on competitive examinations in Australia and India, as well as our own UNSW examinations and tests.
We have three aims:
- Understand the reasons why girls under-perform on STEM exams, particularly competitive examinations.
- Find ways to write exams that are fit for purpose (they select students appropriately and are logistically “doable”) but have less gender bias.
- Design teaching interventions to better help girls with problematic content and improve their exam technique.
Outcomes: To date we have analysed student response data from 9 years of Australian Science Olympiad Physics Examinations (approx. 9000 students) and identified the characteristics of questions that produce large gender gaps.
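A per-question gender gap of this kind is commonly measured as the difference in the proportion of correct responses (the "facility") between boys and girls. The sketch below is purely illustrative: the records and field names are hypothetical, not the actual Olympiad dataset or our analysis pipeline.

```python
from collections import defaultdict

# Hypothetical response records: (question_id, gender, correct).
# The real analysis used ~9000 students' Olympiad responses.
responses = [
    ("Q1", "M", True), ("Q1", "F", True), ("Q1", "M", False), ("Q1", "F", False),
    ("Q2", "M", True), ("Q2", "M", True), ("Q2", "F", False), ("Q2", "F", False),
]

# counts[question][gender] = [number correct, number attempted]
counts = defaultdict(lambda: {"M": [0, 0], "F": [0, 0]})
for qid, gender, correct in responses:
    counts[qid][gender][0] += int(correct)
    counts[qid][gender][1] += 1

# Gender gap = male facility minus female facility, per question
for qid, c in sorted(counts.items()):
    gap = c["M"][0] / c["M"][1] - c["F"][0] / c["F"][1]
    print(f"{qid}: gap = {gap:+.2f}")
```

Questions with gaps far from zero are the ones flagged for closer inspection and possible rewriting.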
We have field-tested revised versions of questions with UNSW students, creating questions with reduced bias that still test the same content.
We have also examined existing UNSW physics tests for gender bias, exploring tests on a question by question basis.
In parallel with this, we have developed a teaching intervention that is simple, cheap and quick that assists girls with one particularly problematic concept area (projectile motion).
Currently we are working with large data sets (more than 300,000 students) from the Indian National Standard Examinations. These examinations have the same format as the Indian university entrance examinations, and show the same large gender gaps in achievement. For example, in the Physics exam, the ratio of the male median mark to the female median mark is 2.2. Even in the Biology exam, where the marks are much more similar, the ratio is 1.4. The variability of boys’ marks is also greater, meaning that there are more boys at the top end of the mark distribution.
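The summary statistics above are straightforward to compute. As an illustrative sketch only (the score distributions below are synthetic, chosen to mimic the qualitative pattern described, and are not the actual examination records):

```python
import random
import statistics

random.seed(1)

# Synthetic scores only -- NOT the National Standard Examination data.
# Boys are given a higher centre and a larger spread to mimic the
# pattern described in the text.
male_scores = [random.gauss(44, 20) for _ in range(1000)]
female_scores = [random.gauss(20, 14) for _ in range(1000)]

# Ratio of male median mark to female median mark
median_ratio = statistics.median(male_scores) / statistics.median(female_scores)

# Greater variability among boys places more boys in the top tail
male_sd = statistics.stdev(male_scores)
female_sd = statistics.stdev(female_scores)

print(f"median ratio: {median_ratio:.1f}")
print(f"male sd: {male_sd:.1f}, female sd: {female_sd:.1f}")
```

Comparing medians rather than means, and spreads rather than centres alone, is what reveals that the over-representation of boys at the top of the distribution comes from both a higher centre and a wider spread.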
Girls’ under-performance stems from a combination of less content knowledge or lower ability, poorer test-taking strategy – most likely due to less preparation, including less access to coaching – and lower confidence. At the high-achieving end, the main reason for the gender gap is lower confidence. The format of the examinations – multiple choice (MCQ) with negative marking – is a significant factor. The result is that very few girls proceed beyond this examination. Unfortunately, this is also the case for the university entrance examinations taken by millions of students annually, for which these exams provide a model.
Next steps: In collaboration with the Indian Science Olympiads Program we hope to develop less biased tests for that program. By publishing our findings we hope to raise awareness of the role that competitive entrance examinations, particularly MCQ exams with negative marking, play in maintaining the gender gap in participation in STEM degrees and careers.
At UNSW, we plan to continue developing teaching interventions in physics that reduce the performance gap, and devise better tests. We will continue to promote these practices to secondary and tertiary educators via workshops and publications.
Acknowledgements
We are grateful for the support of a UNSW Canberra Rector Funded Visiting Fellowship, which has enabled the collaboration with the Homi Bhabha Centre for Science Education (HBCSE). I am also grateful for the hospitality of the HBCSE in hosting me for a return visit to maintain the collaboration.