When Kathryn Sullivan walked into the Commonwealth Bank of Australia (CBA) for the last time this July, she thought she would be leaving with decades of loyalty and experience behind her, not the memory of being replaced by a machine she had helped train.

After 25 years at the bank, including five years in its customer messaging team, Sullivan was told her role was no longer needed. The decision blindsided her. “We knew that messaging would eventually be sent offshore, but never in my wildest dream did I expect to be made redundant after 25 years with the company,” she said. “Inadvertently, I was training a chatbot that took my job.”
Training the tool that replaced her
Sullivan’s last role at CBA involved developing scripts and testing chatbot responses for the bank’s in-house AI system, called Bumblebee.
When customers put questions to the bot and it couldn’t respond, it was her team that stepped in with answers.
She had assumed her role in building the chatbot would end with her redeployment somewhere else in the company. Instead, she was given just one hour’s notice before being called into a meeting and told her services were no longer required.

“They ghosted me for eight business days before they answered any of my questions,” she said, recalling the lack of communication that followed the redundancy notice.
The bank attributed its decision to make 45 call centre roles redundant to the new AI tool, which it said had reduced call volumes by 2,000 a week. It marked the first time an Australian employer had openly linked job cuts directly to artificial intelligence.
A reversal — but not a resolution
The decision didn’t last long. Just a month later, under pressure from the Finance Sector Union (FSU) and a case before the Fair Work Commission, CBA backflipped and offered the sacked employees their jobs back.
A spokesperson admitted the bank’s first assessment had been wrong. “Our initial assessment that 45 roles were not required did not adequately consider all relevant business considerations and this error meant the roles were not redundant,” the statement said. “We have apologised to the employees concerned and acknowledge we should have been more thorough.”

Workers were invited either to stay in their current roles, seek redeployment, or accept voluntary redundancy.
But Sullivan says the options on the table aren’t clear or suitable. Most positions remain out of reach because of a six-month hiring freeze, and the one role offered to her was not viable. “I’m still not receiving clear communication from the company about my future,” she said.

FSU national assistant secretary Nicole McPherson said the union suspects CBA may have used AI as cover for other motives. “It looks to us that the people who were being selected were high-dependency workers – people who have high needs for sick leave, due to personal illness or injury, or use a lot of carer leave. Some of them are also domestic violence survivors,” she said. “It just seems like a very high proportion of them are in that category.”

Other former CBA employees agree AI played a role in their departure. Dhanushi Jayatileka, a product manager who lost her role in July, said the bank was increasingly outsourcing lower-grade work offshore and relying on AI tools to replicate what she had previously done.