Raheem Sterling says social media platforms must “show real leadership and take proper action” to combat online abuse against footballers, following a new study covering the Project Restart period.
The study, conducted over the six-week period of last season’s Project Restart, which followed the coronavirus shutdown, showed that more than 3,000 explicitly abusive messages were sent publicly via Twitter to 44 high-profile players currently or formerly involved in English football.
Forty-three per cent of the Premier League players in the study (13 out of 30) experienced targeted and explicitly racist abuse, while three players who called out racism during the period – Manchester City and England forward Sterling, Crystal Palace forward Wilfried Zaha and Wycombe’s Adebayo Akinfenwa – received 50 per cent of all the abuse.
Sterling said far more needed to be done to tackle the problem: “I don’t know how many times I need to say this, but football and the social media platforms need to step up, show real leadership and take proper action in tackling online abuse.
“The technology is there to make a difference, but I’m increasingly questioning if there is the will.”
The study, which analysed a total of 825,515 messages sent to the players, was commissioned by the Professional Footballers’ Association Charity, carried out by data scientists at Signify and supported by anti-discrimination group Kick It Out.
It also found that 29 per cent of the abuse came in emoji form, which the study commissioners described as a “glaring oversight” in the algorithms used by Twitter to spot hateful content.
Twitter has been approached for comment regarding the study by the PA news agency.
Earlier this month, it was confirmed that the social media giant was in partnership with Kick It Out over its ‘Take A Stand’ initiative. At that time, Twitter said it proactively removed more than one in two tweets deemed hateful from its platform.
Facebook’s vice-president for northern Europe, Steve Hatch, said that between April and June action had been taken against 22.5 million pieces of content and that 94.5 per cent of that was detected and removed proactively by Facebook, rather than by users reporting it.
Akinfenwa said: “As someone who has experienced online abuse first-hand and spoken to team-mates who have experienced the same, I can say that players don’t want warm words of comfort from football’s authorities and social media giants; we want action.
“The time for talking has passed, we now need action by those who can make a difference.”
The report authors say the data demonstrates the need for football’s stakeholders to work together on the funding of a system to proactively monitor online abuse using artificial intelligence.
It also calls for more work to be done to ensure there are “real world consequences” for online abusers such as prosecutions and stadium bans, and for more pressure to be brought to bear on social media companies to act proactively and strongly against abuse.
Additional reporting by PA.