In short, the post discusses the actions of a volunteer Wikimedia Commons administrator with regard to two series of alleged child pornography images uploaded by two newly registered users, and calls for him to resign his adminship on Commons (or have it removed).
An interesting aspect of the blog post that DC touches upon (and, unfortunately, gets a bit wrong) is how potential child pornography is handled on Commons in general. As a Commons oversighter, let me describe our policy and clear up some of the confusion (see disclaimer). On Commons, a file can be removed from public view at three different levels:
- admin-level deletion: “deleted” files are only visible to administrators and users with the “delete” user right;
- suppression: suppressed files are only visible to oversighters and users with the “suppressrevision” user right — they are not visible to regular administrators and normal users;
- actual deletion: performed by users with direct access to the servers; the files are permanently removed and gone for good.
Since there are over 270 administrators on Commons, the usual admin-level deletion is not exactly the perfect way of removing delicate content from public view. That’s where suppression comes in, as the number of users who are able to see suppressed content is much lower than that of administrators (currently 46 people outside the Wikimedia Foundation staff: 1 founder, 40 stewards, and 5 local oversighters).
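To make the difference between these levels a bit more concrete, here is a minimal sketch (my own illustration, not an official tool) of how you could check whether a given account holds the “delete” or “suppressrevision” rights through the standard MediaWiki action API; the account name is just a placeholder.

```python
# Minimal sketch: check which of the rights mentioned above an account holds on
# Commons, via the standard MediaWiki action API. "ExampleUser" is a placeholder.
import requests

API = "https://commons.wikimedia.org/w/api.php"
HEADERS = {"User-Agent": "rights-check-sketch/0.1 (example)"}

def get_user_rights(username):
    """Return the set of user rights held by the given account."""
    params = {
        "action": "query",
        "list": "users",
        "ususers": username,
        "usprop": "groups|rights",
        "format": "json",
    }
    data = requests.get(API, params=params, headers=HEADERS).json()
    return set(data["query"]["users"][0].get("rights", []))

rights = get_user_rights("ExampleUser")
print("can perform admin-level deletion:", "delete" in rights)
print("can suppress content:", "suppressrevision" in rights)
```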
It should be mentioned here that child pornography is illegal in the United States under federal law (see 18 USC § 1466A, 18 USC § 2251, 18 USC § 2252, 18 USC § 2252A, and 18 USC § 2260), and as far as I am aware, the Wikimedia Foundation is legally obliged to report any such incidents to appropriate U.S. authorities. However, I am unaware of any laws that require regular Internet users to report potential child pornography to anyone, and Wikimedia community members are only asked to “delete it and notify the WMF” (which seems quite reasonable).
As DC criticises the way that potential child pornography removal currently works on Commons, let me repeat this: volunteer administrators are not required to “delete” such content from Commons, and even if they do, it is not a perfect solution to the problem. Potentially illegal content should be reported directly to the WMF legal team, who then take appropriate measures and, when necessary, forward the case to the National Center for Missing & Exploited Children.
However, since the WMF operates during limited working hours in the Pacific Time Zone, it might take some time to hear back from their legal team. Another way to have potentially illegal images suppressed as soon as possible is therefore to contact the volunteer oversighters and the WMF legal team at the same time; from a technical point of view, there is no difference between the two, as both are only able to suppress the content.
Here’s how the procedure works in practice when pictures are brought to the attention of volunteer oversighters:
- The report is reviewed, and if there is any chance that the pictures could be considered child pornography under US law (see above), they are suppressed, with “better safe than sorry” being the general rule (the sketch after this list shows roughly what this step looks like on the technical side);
- The uploader is blocked locally (in some cases also globally locked), and checked with the CheckUser tool;
- The case is reported to the WMF, who then take other necessary measures and forward the incident to appropriate U.S. authorities as described above.
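For the technically curious, here is a rough sketch of what the suppression and blocking steps could look like when expressed as MediaWiki action API calls (the relevant API modules exist in newer versions of MediaWiki). The file title, version identifier and user names are placeholders, the session is assumed to be logged in with the appropriate rights, and in practice oversighters simply use the regular web interface rather than scripts like this.

```python
# Rough sketch only: the suppression and block steps as MediaWiki action API calls.
# All titles, IDs and user names are placeholders; the session is assumed to be
# logged in as an oversighter.
import requests

API = "https://commons.wikimedia.org/w/api.php"
session = requests.Session()  # assumed to already hold valid oversighter credentials

def csrf_token():
    """Fetch a CSRF token, required for write actions."""
    r = session.get(API, params={"action": "query", "meta": "tokens",
                                 "type": "csrf", "format": "json"})
    return r.json()["query"]["tokens"]["csrftoken"]

# Step 1: suppress the uploaded file version so that not even regular
# administrators can see it (this is what the "suppressrevision" right allows).
session.post(API, data={
    "action": "revisiondelete",
    "type": "oldimage",                   # old versions of a file page
    "target": "File:Example.jpg",         # placeholder file title
    "ids": "20130502120000!Example.jpg",  # placeholder file version identifier
    "hide": "content|comment|user",
    "suppress": "yes",                    # oversight-level removal, not plain deletion
    "reason": "Potentially illegal content; reported to the WMF",
    "token": csrf_token(),
    "format": "json",
})

# Step 2: block the uploader locally (a global lock is a separate, steward-level action).
session.post(API, data={
    "action": "block",
    "user": "ExampleUploader",            # placeholder user name
    "expiry": "infinite",
    "reason": "Uploading potentially illegal content",
    "nocreate": "1",
    "autoblock": "1",
    "token": csrf_token(),
    "format": "json",
})
```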
There’s nothing secret about how we deal with potential child pornography, and in my opinion it’s quite a good and scalable solution to a very delicate problem, though I do see a lot of room for improvement; for instance, adding a simple “report abuse” link to every Commons page might be a good first step, so that we don’t require people to actually e-mail us to report content that they find illegal.
Since you managed to reach this point, let me share a true secret now. I don’t have any statistics about child pornography reports on other media repository websites, such as Pinterest, Instagram or Flickr (just to name the three most popular ones), so the general Commons data will have to suffice instead:
- the average upload rate on Commons between May 2, 2012 and May 2, 2013 was about 11,292 files per day;
- the number of pictures that were suppressed during this period as potentially abusive was in the double digits.
That’s our scale: a double-digit number out of over 4,100,000 pictures uploaded to Wikimedia Commons between May 2, 2012 and May 2, 2013 (11,292 files per day over 365 days works out to roughly 4.12 million files). Not that many, but we can definitely improve the way we deal with them.
- ^ This is true of all wikis that run MediaWiki 1.16.0 or higher, not just the Wikimedia ones.
- ^ The FBI website has more information on the subject, but I’m unsure about the origins of the term.
- ^ Of course, Wikimedia Foundation developers have the technical ability to remove pictures from the servers, which they do — but the legal team itself is not able to do that.
The views expressed in this blog post are solely those of the author, and do not necessarily reflect the views of the other oversighters, the Wikimedia Foundation, or other organizations with which the author is or might be associated.