Image Analyzer comments on illegal underage images discovered on OnlyFans
Proactive prevention of image uploads beats age verification
Investigative reporters from BBC News have demonstrated how easily the ‘enhanced’ age verification checks on pornographic image-sharing site OnlyFans can be flouted. The reporters found that images and videos were being posted by children as young as 13, some of whom have told ChildLine that they were coerced into sharing nude and explicit images on the site. This constitutes illegal content.
The US National Center for Missing and Exploited Children has also been alerted to OnlyFans videos linked to missing children.
UK police child protection lead, Chief Constable Simon Bailey, told the BBC that OnlyFans, “is not doing enough to put in place the safeguards that prevent children exploiting the opportunity to generate money, but also for children to be exploited.”
Speaking to the BBC, NSPCC’s head of policy for child safety online, Andy Burrows, said, “OnlyFans underlines why it’s so important that we see regulation so that it’s no longer a choice for platforms about whether and how they protect children using their services.”
Cris Pikes, CEO of Image Analyzer, a member of the Online Safety Tech Industry Association, commented, “The numerous instances of explicit underage content posted to OnlyFans demonstrate why interactive sites need to apply proactive content inspection technology to prevent illegal images being uploaded, rather than relying on age verification alone.”
The Financial Times has reported that OnlyFans recorded revenues of £281 million last year after subscribers spent £1.7 billion on the site.
“When the Online Safety Bill passes into law, OnlyFans, and any other online platforms that allow users to upload images and videos, will have to put proper measures in place to remove illegal and harmful content, or risk being fined 10% of global revenues, and ultimately suspension of access to their services in the UK,” says Pikes.
About Image Analyzer
Image Analyzer provides artificial intelligence-based content moderation technology for image, video and streaming media, including live-streamed footage uploaded by users. Its technology helps organizations minimize their corporate legal risk exposure caused by employees or users abusing their digital platform access to share harmful visual material. Image Analyzer’s technology has been designed to identify visual risks in milliseconds, including illegal content, and images and videos that are deemed harmful to users, especially children and vulnerable adults.
The company is a member of the Online Safety Tech Industry Association (OSTIA).
Founded in 2005, Image Analyzer holds various patents across multiple countries under the Patent Cooperation Treaty. Its worldwide customers typically include large technology and cybersecurity vendors, digital platform providers, digital forensic solution vendors, online community operators, and education technology providers that integrate its AI technology into their own solutions.
For further information please visit: https://www.image-analyzer.com
References:
BBC News, ‘The children selling explicit videos on OnlyFans,’ 27th May 2021, https://www.bbc.co.uk/news/uk-57255983
Financial Times, ‘OnlyFans feels the lockdown love as transactions hit £1.7bn,’ 26th April 2021
Gov.UK, ‘Draft Online Safety Bill’, 12th May 2021 https://www.gov.uk/government/publications/draft-online-safety-bill