Prove your humanity

If you’re anything like me, it would be fair to assume that your inbox recently became home to a handful of emails requesting, or rather pleading, for you to complete the new trial of midsemester Pulse Checks for one or more of your units. It would also be fair to assume that these emails were promptly forgotten or relegated to the ‘out of sight, out of mind’ safety of your email’s deleted folder. Now that the results of these evaluations are back, have we as students missed a vital opportunity to shape our studies?

Every student’s learning style is vastly different from the next, so it stands to reason that we all require something different from our educators. This is precisely what this semester’s first trial of the midsemester Pulse Check strives to address. Unlike end-of-semester surveys, which aim to improve a unit for the next cohort to take it, Pulse Checks give your tutors, lecturers, and coordinators the chance to adjust their teaching style to fit you in real time, so you can see the changes you need, in your units and in your grades, whilst you study. A main goal of the Pulse Check is to foster open communication between the university and its students about what is required for an improved learning experience. Why, then, did a whopping 92.1% of us choose to ignore the opportunity to complete a survey and reap its rewards for the rest of the semester?

Twelve first-year units from the spectrum of business and law, science and engineering, health sciences, and humanities were selected for the first trial of the Pulse Check system. Across the board, response rates were very low. On average, only 7.9% of students provided a response: out of 10,410 eligible students, just 818 interacted with the Pulse Check. The unit with the highest rate of engagement was NPSC1003: Integrating Indigenous Science and STEM, and even there the response rate peaked at 15.8%. The Faculty of Business and Law recorded the lowest response rate with ACCT1002: Financial Decision Making, where only 3.2% of enrolled students participated in the Pulse Check.

Faculty                  Unit                                                      Eligible   Responses   Response Rate
Business & Law           ACCT1002 Financial Decision Making                        753        24          3.2%
                         BLAW1002 Markets and Legal Frameworks                     910        30          3.3%
                         MGMT1003 Strategic Career Design                          848        35          4.1%
Science & Engineering    COMP1005 Fundamentals of Programming                      656        50          7.6%
                         NPSC1003 Integrating Indigenous Science and STEM          525        83          15.8%
                         PRRE1003 Resources, Processes and Materials Engineering   393        25          6.4%
Health Sciences          CMHL1000 Foundations for Professional Health Practice     1,992      187         9.4%
                         HUMB1000 Human Structure and Function                     2,108      238         11.3%
                         PSYT3000 Abnormal Psychology                              293        14          4.8%
Humanities               COMS1003 Culture to Cultures                              396        18          4.5%
                         COMS1010 Academic and Professional Communications         672        58          8.6%
                         EDUC1022 Child Development for Educators                  864        56          6.5%
Total                                                                              10,410     818         7.9%

It isn’t all bad, though. Compared with end-of-semester evaluations, where some students take the opportunity to vent their frustration with teaching staff by submitting hostile responses, the Curtin Student Guild found that Pulse Check feedback was entirely respectful, constructive, and polite. That said, end-of-semester evaluations typically have much higher response rates, owing to greater levels of promotion coupled with the option for respondents to be entered into a draw to win prizes. Whilst at first glance this may seem positive, the incentive of rewards means those surveys are commonly filled out carelessly as students rush their feedback in order to earn entries in the prize pool. Since no prizes were offered for the Pulse Check, its responses avoided this problem. End-of-semester evaluations have also historically been skewed by students who submit overly positive responses to units they did not actually enjoy, with the mindset that if they had to suffer through a difficult unit, future students should have to as well.

Unit coordinator Carol Igglesden says that whilst previously implemented evaluation systems have led to many changes in her units, it can be difficult to act on all the feedback offered, given that responses often contradict one another. “As you can imagine, we often get conflicting ideas presented to us. [For example] … I like working with my peers; group work sucks; the class is too long; there is not enough time.” Despite this, she notes that “when a common issue emerges, or it is [a] universally good-for-all idea, we act on it.”

In the coming months, the Guild plans to meet with staff and students to discuss the effectiveness of the Pulse Check system and examine how the feedback acquired was implemented in the units involved. It will also investigate why the response rates were so low. A second trial of the Pulse Check will run in Semester 2 of this year and will include second- and third-year units to compare engagement levels across cohorts. When that round of Pulse Checks is emailed out, why not take five minutes to fill out the survey, personalise the way your units are taught, and get the most out of your semester?