The ICC has introduced new software as part of a social media moderation program for the Women’s T20 World Cup, aimed at protecting the cricket community from “toxic content” and supporting the mental well-being of individuals. The initiative seeks to create a “safer, kinder, and healthier online community” for the sport.
In a statement released on the opening day of the tournament, the ICC emphasized that the program’s goal is to “promote a more positive and inclusive online experience for teams and players,” with over 60 players already opting in.
The ICC has partnered with UK-based company GoBubble, which uses a combination of artificial intelligence and human moderation to monitor comments on social media platforms such as Facebook, Instagram, and YouTube. The technology will detect and hide harmful content such as hate speech, harassment, and misogyny, providing a safer space for fans to engage with the tournament. Players who opt in will have harmful comments automatically hidden from their social media accounts.
“We are committed to fostering a positive and inclusive environment for all participants and fans of the ICC Women’s T20 World Cup. It’s been encouraging to see so many players and teams embrace this initiative,” said Finn Bradshaw, ICC’s head of digital.
South African wicketkeeper Sinalo Jafta expressed her support for the initiative, noting that social media protection is crucial for players, particularly during high-stress tournaments like the World Cup. “After a loss or victory, there’s always some degrading comment, and this protection allows players to share their lives without fear of judgment or criticism,” she said.
The 10-team tournament begins on Thursday in Sharjah, with Bangladesh facing Scotland in the first match at 2 PM local time (10 AM GMT), followed by Pakistan taking on Asia Cup champions Sri Lanka at 6 PM local time in Dubai. The final will be held in Dubai on October 20.