Nov 30, 2018

After my recent post about moderation on social and community applications, I realized that my comments focused almost exclusively on moderating public posts (or posts to a largely public feed). But what about private 1-1 messages, private groups (whose updates may or may not appear in a feed on the main page), and other, more “private” (relative to the rest of the site) forms of posting? Would the sort of tools and guidelines I called for work there, or would something else be needed, and if so, what?

The first place where the moderation tools I wrote about wind up not applying is with user-level blocking, muting, etc. Using these tools should be solely at the discretion of the individual user, which means not having to justify the decision. These tools exist to let users tune their experience with the app (although consumer-side content filtering would be a better tool for that), as well as to deal with targeted harassment. Forcing users to deal with spam, harassment, and general jerky behavior by routing everything through app-wide moderators just slams the moderators, slows down resolution time, and forces users to justify why they don't want to be connected to someone, which is something they don't owe anyone an explanation for.
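To make that concrete, here's a minimal sketch in TypeScript of what user-level moderation state might look like. The names (like `UserModerationList`) are hypothetical, not from any real app's API. The key property is that blocking or muting takes effect immediately, with no moderator in the loop and no justification recorded:

```typescript
type UserId = string;
type UserAction = "block" | "mute";

// Hypothetical per-user moderation state; purely at the user's discretion.
class UserModerationList {
  private blocked = new Set<UserId>();
  private muted = new Set<UserId>();

  // Takes effect immediately; no review queue, no justification stored.
  apply(action: UserAction, target: UserId): void {
    (action === "block" ? this.blocked : this.muted).add(target);
  }

  // Blocked users can't interact with this user at all.
  canContact(sender: UserId): boolean {
    return !this.blocked.has(sender);
  }

  // Muted users can still post; this user just doesn't see it.
  shouldDisplay(author: UserId): boolean {
    return !this.blocked.has(author) && !this.muted.has(author);
  }
}

// Usage: the decision is the user's alone; nothing is escalated or reviewed.
const myList = new UserModerationList();
myList.apply("block", "spammer-123");
myList.apply("mute", "loud-acquaintance-456");
```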

The other consideration I had was about user groups on social applications. These groups are semi-private, and may have rules beyond the application-wide ones (e.g. "stay on the group's topic," or additional behaviors that are considered unacceptable). For a group to function successfully, you can't have the level of absolute discretion that we can allow for individual users. So groups share the app's need for a clear set of rules and moderation, but may also need to enforce their own additions. Probably the optimal solution is to make the site's moderation tools extendable, so group moderators can plug in their supplemental rules and enforce them following a similar process to the parent app itself (while all reports about violating app-wide rules still go to the app team), as in the sketch below.
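Here's one way that extendability might look, sketched in TypeScript. The `Rule` interface and the rule IDs are hypothetical, and the `violates` checks are deliberately toy examples:

```typescript
// A moderation rule, whether app-wide or group-specific.
interface Rule {
  id: string;
  description: string;
  violates(content: string): boolean;
}

// App-wide rules apply everywhere, including inside groups.
const appRules: Rule[] = [
  {
    id: "app/no-doxxing",
    description: "No posting other people's personal information",
    violates: (content) => /\b\d{3}-\d{2}-\d{4}\b/.test(content), // toy check
  },
];

// Group moderators plug in supplemental rules; they can add restrictions,
// but never remove or override the app-wide rules.
class GroupModeration {
  constructor(private groupRules: Rule[]) {}

  findViolations(content: string): Rule[] {
    return [...appRules, ...this.groupRules].filter((r) => r.violates(content));
  }
}

// Usage: a group that additionally requires posts to stay on topic.
const cookingGroup = new GroupModeration([
  {
    id: "group/on-topic",
    description: "Posts must be about cooking",
    violates: (content) => !content.toLowerCase().includes("recipe"), // toy check
  },
]);
```

The design choice that matters is the concatenation in `findViolations`: group rules extend the app's rule set rather than replacing it.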

Another concern I thought of was "Should semi-private groups be able to operate with fewer rules than the app they're hosted on?" In other words, can groups ignore or waive some of the app-wide content rules? As tempting as it is to say "let groups do whatever they want," for an app's rules to be seen as legitimate, they need to be enforced universally. Anything else creates a sense of illegitimacy around the app: content that violates the rules triggers consequences for the poster in some places but not others. Situations like that lead to perceptions of bias, whether or not the perception is justified. This is why all reports about content violating the app's rules should be handled by the app team, so those rules are enforced consistently app-wide; a routing sketch follows below.
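Continuing the sketch above (with the same hypothetical `app/` and `group/` rule-ID prefixes), routing might look like this: any report that cites an app-wide rule goes to the app's moderation team, and only reports citing group-specific rules go to group moderators:

```typescript
type Destination = "app-team" | "group-moderators";

interface Report {
  ruleId: string;   // e.g. "app/no-doxxing" or "group/on-topic"
  groupId?: string; // set when the content was posted inside a group
}

function routeReport(report: Report): Destination {
  // App-wide rules are enforced consistently everywhere, so those reports
  // never get delegated to group moderators, even for content in a group.
  if (report.ruleId.startsWith("app/")) {
    return "app-team";
  }
  return "group-moderators";
}
```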

Social apps that let users form groups face additional moderation challenges. I think the best way to address them is to give group administrators the same type of moderation tools the app uses, in a way that lets them plug in group-specific rules. Group administrators should not be in charge of handling reports that content violates the app's rules. The other big question is individual user-level moderation (e.g. blocking, muting, etc.); I think these tools are important, but they don't need the same sort of formal process or justification. Moderation is still a hard problem, and the addition of user groups doesn't make anything easier, but it's still solvable.
