Self-generated content was the most prevalent form of child sex-abuse material (CSAM) in Ireland, accounting for a quarter of cases reported last year, the State’s internet watchdog has said.
The term “self-generated CSAM” refers to material most likely created by the child depicted, such as photographs shared voluntarily or under coercion.
Data published by hotline.ie, the Irish Internet Hotline, on Wednesday points to a “self-generated content crisis”, with a 166 per cent increase in such material during 2024.
Hotline is Ireland’s reporting platform for illegal online content. Overseen by the Department of Justice and working in conjunction with An Garda Síochána, the agency identifies, analyses and removes problematic material.
Hotline’s annual report revealed that 11,505 submissions made to its portal for reporting illegal online material were of self-generated CSAM.
Overall, it processed 44,955 CSAM reports in 2024, a 55 per cent increase on the previous year. CSAM accounted for most online issues reported overall, with a record total of 53,441, up 32 per cent from 2023.
Ninety-two per cent of victims depicted were girls and 5 per cent were boys; 3 per cent of images depicted both. Pre-pubescent children featured in most CSAM reported, with 56 per cent of cases depicting children aged between four and 12.
Online forums were found to be the dominant distribution channel for CSAM, at 63.5 per cent. Such forums make removal “more complex and urgent”, Hotline said.
Images remain the most common format, representing 90.3 per cent of content, while videos accounted for 9.4 per cent of reports.
Data on illegal global content-hosting – in terms of domains – found Ireland to be among the jurisdictions with the lowest incidence rate, with 11 instances confirmed last year.
The Netherlands emerged as the primary hosting location with 13,508 instances of illegal content identified, followed by Hong Kong (7,012), Germany (4,529), and Vietnam (4,053).
The Irish Internet Hotline undertook a pilot initiative in 2024 to tackle child sexual exploitation material (CSEM) – content that may not meet the legal threshold of CSAM but which raises “serious ethical and dignity concerns”. The pilot resulted in a 94 per cent removal rate.
The report detailed other online threats on the rise, including financial scams targeting Irish residents, which had risen by 51 per cent in 2024. There were also 134 fraudulent websites identified, 79 per cent of which were removed.
Reports of intimate image abuse declined, suggesting an early impact from deterrent campaigns. There were 908 reports of racism and xenophobia online last year through the portal.
“Some of these figures might appear shocking or overwhelming, but they are also an indication of what’s possible,” Hotline chief executive Mick Moran said.
“The online realities reflect the society we live in, and we must face the troubling parts of that in a systematic and collaborative way. There is no silver bullet, no panacea, just hard and sometimes gruelling work.”