A key element of child protection is to amplify the participation of the business sector as a partner in the process. This is particularly challenging given the expanding mass of child sexual exploitation and abuse materials in a world of digitalization, algorithms and Artificial Intelligence. A recent UN-backed report on the impact of AI on child sexual exploitation notes that the Cyber Tipline, which receives reports of CSAM, logged some 36 million reports of suspected child sexual exploitation in 2023. Traditional forms of exploitation, such as photos of children exploited in prostitution, pornography and human trafficking, are now coupled with newer forms such as suggestive texts used to groom children and livestreamed abuse.
Sextortion, namely blackmail threatening children with exposure of sexual materials involving them, is also a worrying phenomenon in the expansive crucible of online scams and deception. The advent of AI complicates the scenario. It can “nudify” ordinary pictures of children and adults by converting them into sexualized images, whether through a text prompt that generates a new abusive image or by manipulating an existing image into a sexualized one.
There then arises the question: What are the costs and benefits for the sector of cooperating more effectively? The preferred entry point is the UN’s Guiding Principles on Business and Human Rights, which underline the duty of the business sector to respect human rights, especially by means of due diligence measures, and to share with the State the duty to remedy harm to the target groups. The State itself is under a duty to protect human rights, such as by adopting good laws.
There are at least two approaches to the due diligence advocated here. On the one hand, voluntary measures on the part of the business sector can be promoted. Voluntary action from the sector is based on self-regulation (“soft law”) such as Business Codes of Conduct, Community Standards, Terms of Service or contractual terms linked with them. These can be coupled with Oversight Boards, content-moderation personnel, and technological tools and filters to block illegal and harmful content. The sector is encouraged to produce due diligence reports to monitor harm prevention and mitigation, to act on “notice and take down” requests to delete such content, and to make available a range of remedies.
On the other hand, the State or a regional organization to which it belongs (such as the European Union) might seek to impose stricter measures by means of legislation (“hard law”), such as online safety and child protection laws, the Digital Services Act or AI-related regulations, with mandatory reporting by the business sector to oversight mechanisms for the purpose of transparency, subject to hefty fines in the case of non-compliance. Australia’s online safety law, in effect since 2022, follows this approach. There might also be a mix of the hard law and soft law approaches.
How expensive is it for the business sector to be involved in such measures? Disaggregation of the various measures shows that the expenses vary and need not be high. For example, to include a provision on child protection in the contractual terms of service does not necessarily impose additional cost. Paradoxically, the cost might be incurred by the consumer who clicks “Agree” without reading the terms of the contract! This implies that contractual terms need to be more reader-friendly and child-sensitive.
How expensive is the adoption of Codes of Conduct or Community Standards with provisions on child protection, and how costly are an Oversight Board and the content moderators who help to screen and take down illegal and harmful content? Interestingly, one big digital platform gave about $150 million to its oversight board a couple of years ago to help oversee the materials flowing on the internet and moderate abusive content; the platform’s own resources, however, run into the billions. As for individual content moderators, this work is often sub-contracted to technicians in developing countries to save costs, and a key consideration is the psychological support these workers may need to mitigate the impact of reviewing a plethora of abusive materials.
With regard to requests to take down objectionable content as part of the “notice and take down” channel, these should cost the platform or the complainant nothing (or very little); the request can come from the user to the platform, or from the platform to the person or group responsible for creating or uploading the content.
How costly are the technological tools, such as filters, that help to identify such materials and eliminate them? Some are free of charge as open source, while others can be costly. They range from “hash matching” filters, which identify previously detected abusive material, to language-processing tools that flag sextortion. Testing whether an AI system can be made to produce offensive material is known as “red teaming”, while tools for labeling or watermarking AI-generated material, for the purpose of transparency, are offered free to some developers.
As for due diligence reports, the cost of preparation by experts ranges from several thousand dollars for small platforms to substantial sums for the mega-platforms. However, failure to submit such a report under hard law regimes may lead to very large fines. The bottom line is that child protection measures need to be integrated into digital platforms and operations through a variety of ways and means.
This is not necessarily expensive, as some of the facilities are free, e.g., various filters and technological tools available as open source. Bigger operations may need to satisfy higher standards, such as mandatory reporting. Smaller operations may need assistance and incentives that the bigger operations can offer through peer cooperation. Where soft law (such as Community Standards) does not work or has a poor impact, hard law (such as an online safety law, with mandatory reporting and related remedies) may be required. Multi-stakeholder action, including the business sector, is thus essential for child protection that is age-appropriate and gender-sensitive.
Intrinsically, it makes good business sense for the sector to integrate child protection into its platforms: it is the right thing to do, it builds responsible consumerism, and it prevents harm that could otherwise result in escalating accountability.
Vitit Muntarbhorn
Vitit Muntarbhorn is a professor emeritus at Chulalongkorn University. He was formerly UN Special Rapporteur on the Sale of Children. The views expressed here are the writer’s own. — Ed.
(Asia News Network)
