The UK government has said it could legislate to impose age checks on users of dating apps, following an investigation into the use of dating apps by minors published by The Sunday Times yesterday.

The newspaper found that more than 30 cases of child rape had been the subject of police investigations linked to the use of dating apps, including Grindr and Tinder, since 2015. In one reported case, a 13-year-old boy with a profile on Grindr had been raped or abused by at least 21 men.

The Sunday Times also uncovered 60 further cases of child sex offences linked to the use of online dating services, including grooming, kidnapping and violent assault, according to the BBC, which covered the report.

The youngest victim was reportedly just eight years old. The newspaper obtained the data via freedom of information requests to UK police forces.

Responding to the Sunday Times' investigation, a Tinder spokesperson told the BBC that the company uses both manual and automated tools, and spends "millions of dollars each year", to prevent and remove underage users and other inappropriate behavior, saying it does not want minors on the platform.

Grindr also responded to the report, providing the Times with a statement saying: "Any account of sexual abuse or other illegal behavior is troubling to us as well as a clear violation of our terms of service. Our team is constantly working to improve our digital and human screening tools to prevent and remove improper underage use of our app."

We have also contacted the companies with additional questions.

Jeremy Wright, the UK's Secretary of State for Digital, Culture, Media and Sport (DCMS), called the investigation "truly shocking", describing it as further evidence that "online tech companies must do more to protect children."

He also suggested the government could expand forthcoming age-verification checks for accessing pornography to include dating apps, saying he would write to the dating app companies to ask "what measures they have in place to keep children safe from harm, including verifying their age".

"If I am not satisfied with their response, I reserve the right to take other measures," he added.

Under the Digital Economy Act, age-verification checks for viewing online pornography are due to come into force in the UK in April.

These age checks, which are highly controversial on privacy grounds, given that they entail creating a database of adult identities linked to porn-viewing habits, were also driven by concern about children's exposure to graphic content online.

Last year the UK government also pledged to legislate on social media safety, although it has yet to set out its policy plans in detail. A white paper is said to be imminent.

A parliamentary committee that reported last week urged the government to place a legal "duty of care" on platforms to protect children.

It also called for more robust age-verification systems. So there is at least a possibility that some types of social media content could be age-restricted in the country in the future.

Last month, the BBC reported on the death of a 14-year-old schoolgirl who killed herself in 2017 after being exposed to self-harm imagery on Instagram.

Following that report, Instagram's boss met with Wright and UK Health Secretary Matt Hancock to discuss concerns about the impact of suicide-related content on the platform.

After the meeting, Instagram announced last week that it would ban graphic images of self-harm.

Earlier that week, the company had responded to the public outcry by saying it would no longer allow suicide-related content to be promoted via its recommendation algorithms or surfaced via hashtags.

Also last week, the government's chief medical advisers called for a code of conduct for social media platforms to protect vulnerable users.

The medical experts also called for greater transparency from the platform giants to support public-interest research into the potential mental-health impacts of their services.