Duke Education says its course uses ‘chat-style delivery’ to deliver classes on writing emails and identifying hazards but does not utilise artificial intelligence
A training organisation co-run by a vice-president of the Collingwood football club has been criticised for using chatbots to help teach a course to adult job seekers.
Duke Education, a registered training organisation (RTO), offers a certificate III in community services. A chatbot takes students through some of the coursework, such as how to write an email and recognise hazard signs.
Duke Education, co-run by Collingwood co-vice-president and former AFL player Paul Licuria, offers courses for unemployed Australians.
Workforce Australia providers cover the cost of training for participants through the Employment Fund, a pool of taxpayer funding used to place job seekers into training and to buy work-related items such as clothing.
Sarah*, whose real name has been withheld, was put in the certificate III course by her job provider, MatchWorks, after she said she wanted to study social work.
“It was a chatbot-based learning system,” she claimed.
In the first week of the course, Sarah was put in a chat room with classmates and taught how to message them, she said. The online course had pre-determined answers for students to choose.
“It doesn’t really matter what answer I picked,” Sarah said. “You can’t ever have your own input in the class, it is always just to agree with the other classmates or with the teacher.”
Screenshots shared with Guardian Australia show the chatbot teaching students how to write an email, including starting “with a friendly and professional tone” and explaining the email’s purpose before ending with a “positive and encouraging note”.
Sarah lives in Perth, has two casual jobs in youth work and runs children’s programs. Her casual work is not currently enough to live off, so her income is supplemented by jobseeker payments.
Sarah said her job provider pushed her to do the course. “I’ve written emails before,” she said. “It felt kind of irrelevant. I thought it was going to be more driven towards community work.”
“I did ask them if I could do a different course but they were adamant that they already used the funding and that switching without any reasonable cause wouldn’t be possible to fund for.”
In other parts of the course, which runs for six months, students are taught what hazard signs mean, including “first aid” and “danger”.
In a previous version of its student handbook, Duke Education spruiked its use of AI.
“Utilising AI-driven conversational learning, our trainers and assessors monitor each student’s progress, providing valuable guidance and feedback,” the handbook said. “Additionally, AI facilitates realistic simulated scenarios, allowing students to apply theoretical knowledge with the ongoing support of our dedicated trainers and assessors.”
However, in a statement to Guardian Australia, a spokesperson said no AI was used to run the course. When asked about the handbook, Duke Education provided a new version without mention of AI.
“The course is not delivered via artificial intelligence,” the spokesperson said. “While chat-style delivery is used, it is fully developed and structured by qualified learning designers and trainers to meet national competency standards. Students also receive learner guides, access interactive quizzes, and participate in one-on-one sessions with real trainers and assessors.”
The organisation also defended the course material: “Workplace skills such as professional email writing and hazard identification are core competencies in community services.
“These units are nationally endorsed and assessed through contextualised, practical tasks that mirror real work environments, such as communicating with support agencies or identifying risks in client homes or service settings.”
“Whilst the conversational learning part of the experience has responses that are multiple-choice, learners are encouraged to reflect and think critically throughout,” the spokesperson said, adding that all formal assessments in the course involved “a combination of multi-choice, written or verbal responses, marked by a qualified assessor, so students can demonstrate their knowledge in their own words”.
The Antipoverty Centre’s Jeremy Poxon said: “All throughout the employment services system, companies look to extract the maximum amount of money.
“It sadly comes as no surprise to see an RTO using chatbots … the minister should come clean about how much public money is being used on these AI chatbot courses.
“The employment fund should be used to help the poorest people in the country afford vital things they need.”
MatchWorks said the course content was a matter for Duke Education. “MatchWorks ensures a tailored and personalised service to each participant based on their individual needs, circumstances and choices,” a spokesperson said.
“MatchWorks encourages all participants to provide feedback about the education services they are receiving to ensure their needs are being met.”
Duke Education’s spokesperson said “concerns about the broader employment services system do not relate to individual training providers like Duke”.
Licuria was contacted for comment.
Source: www.theguardian.com