User-Curated Content Verification System Concept #1555
Comments
Moderation on any social platform is an absolutely awful experience for everyone involved. The sheer amount of shit the moderators have to see and click 'no, people should not see this' on is insane, and anyone who has worked such a moderation job will tell you just how much godawful shit they see every day. I think I recall a story about a police investigator who was shocked after hearing that moderators see even more gruesome gore than they do on a daily basis. My point is that nobody in their right mind would subject themselves to this willingly for very long, and the well of people who would participate will dry up real quick.

It sucks so massively hard that this is the reality social places face when they allow anyone to upload anything. People will be horrible if they can't empathize with the people they reach across the screen, and I'm not sure how to 'fix' it. Moderation is just the brute-force system that has worked so far, but when you start scaling things up you just get more and more shit to filter and need more people. It's not sustainable. The best I can think of is having a really strict AI-driven base level of moderation, then easing up on it a little as the user gains trust, reputation, or something similar. Maybe this will cause 'companies' or something to appear that have one account uploading stuff from a larger playerbase, kind of just shifting the moderation burden to the side a bit.

However, I think making people talk with other people before uploading something will drastically decrease the number of stunts people are willing to pull; once you have some level of extra trust or rep, then people should start looking at your appeals. Yes, this very much could discourage people from trying to make things if their stuff is not allowed at first, but the balance between 'everything is allowed, go nuts' and 'we'll track you down irl and send cops' is very hard to find, and you'll always have edge cases where people are mad. The ability to ask a 'company' to look at your thing and still get it uploaded could mitigate this somewhat. …
|
I completely understand; it is true that user and even moderator morale is a very big factor in whether a system like this can exist, and I hadn't thought of it from that perspective before, thank you for that. Nobody wants to look at terrible stuff all day, you're completely right about that. …
|
I feel like reporting content and users is a much faster process with far fewer steps. Moderation already does this for public content, and anything that harasses or breaks the TOS should just be reported to be removed permanently. You can also block content from annoying users, or block the users themselves, and you won't see them or their content. I don't really see this as something that is necessary, imo. …
|
It is true that reporting and blocking content is easier, I agree, but for me the problem remains that it is still possible for virtually anyone to easily upload content that can be used to harass others. Moderation has a very easy time dealing with this right now, but in a future where thousands upon thousands of users could be joining ChilloutVR every day, moderation might not be able to keep up, and in a game where children could literally be co-existing with adults, to me at least, that is unacceptable. …
|
I’d like to preface this post by stating that the situation regarding harassment on virtual reality platforms has only gotten worse over the years. In games like VRChat and even ChilloutVR, it is still way too easy for any user to abuse content creation for their own benefit, which is unacceptable. I’m making this post in the hope that we can develop a system that makes ChilloutVR a safe place for children and adults. For the purposes of this concept, let’s imagine that ChilloutVR has become exceedingly popular within the virtual reality space and has attracted thousands of passionate users, so scale is not an issue at all. This concept is meant to fit that reality; something like this isn't necessary at the current size of the ChilloutVR community. If you have any thoughts or suggestions, please feel free to leave a comment!
The developers can only do so much moderation of user-generated content, to the point where it is incredibly easy for anyone to upload inappropriate content and do whatever they want with it. Whether it's stripping naked, swinging around a bunch of dildos, or walking around as a .jpg of a real-life murder scene, you name it, people will do it. Because of this, I would like to introduce the concept of user-curated content verification - I know what you are going to say, but please hear me out!!! Don't worry, I have no intention of turning ChilloutVR into an ID-locked restriction zone either. I’d rather eat dirt than do that.
A user-curated content verification system could work like this:
ChilloutVR would invite various users to participate in content verification. If they accept, they will be given a new tab on their home page. How exactly the moderation team would go about inviting users would be completely up to them, though I would imagine it would be based on factors like playtime, content uploads, user reports, etc. From here on, I will refer to users who have accepted the invite as "helpers". Helpers will be given a disclaimer that the content provided to them is unverified, may contain NSFW/NSFL material, and may cause other side effects such as game crashes.
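Purely as an illustration of those factors, a first-pass screen for invite eligibility might look like the sketch below; the thresholds and field names are entirely made up by me, and the moderation team would still hand-pick the actual invitees:

```python
# Hypothetical invite screen based on the factors mentioned above
# (playtime, uploads, reports); all thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class UserStats:
    user_id: str
    playtime_hours: float
    content_uploads: int
    open_reports_against: int

def eligible_for_helper_invite(u: UserStats) -> bool:
    """Very rough screen before the moderation team hand-picks invitees."""
    return (u.playtime_hours >= 100
            and u.content_uploads >= 5
            and u.open_reports_against == 0)
```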
When you upload or update a piece of content - say an avatar or a prop - it will be tagged as Unverified. Unverified content cannot be viewed by default, but this can easily be overridden via ChilloutVR's game settings at any time. If a user does not wish for their content piece to be reviewed by helpers, for privacy reasons, they can opt out during the CCK upload process - but their content piece will then be restricted to private, and can only be viewed by themselves and by other users who accept their share requests.
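To make that flow concrete, here is a rough sketch of how the visibility rule could be evaluated; the statuses, field names, and the show_unverified setting are placeholders I made up, not anything ChilloutVR actually has:

```python
# Illustrative sketch of the visibility decision; names are assumptions only.
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    UNVERIFIED = auto()
    VERIFIED = auto()
    PRIVATE = auto()   # creator opted out of helper review during CCK upload

@dataclass
class ContentPiece:
    owner_id: str
    status: Status
    shared_with: set[str]  # users who accepted a share request

def can_view(content: ContentPiece, viewer_id: str, show_unverified: bool) -> bool:
    """Decide whether a given viewer should see this content piece."""
    if content.status is Status.PRIVATE:
        # Private content: only the creator and accepted share recipients.
        return viewer_id == content.owner_id or viewer_id in content.shared_with
    if content.status is Status.UNVERIFIED:
        # Hidden by default; the viewer can opt in via game settings.
        return show_unverified or viewer_id == content.owner_id
    return True  # verified content is visible to everyone
```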
Every time a piece of content is marked as unverified and its creator has allowed it to be viewed by helpers, it would be sent through the system to helpers who have chosen to participate. Helpers will receive a daily batch of (however many) content pieces. This list can be refreshed at any time if the helper wishes to do so. The system will not provide any information about who created the content being reviewed; it will be entirely anonymous from the perspective of the helper.
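Again purely as a sketch, the daily batch assembly could look something like this; the batch size, field names, and the idea of tracking what a helper has already reviewed are my own assumptions:

```python
# Illustrative sketch of anonymized batch assignment; field names and
# the batch size are made up, not a real ChilloutVR API.
import random
from dataclasses import dataclass, field

@dataclass
class PendingContent:
    content_id: str
    owner_id: str
    tags: list[str] = field(default_factory=list)

def build_daily_batch(pending: list[PendingContent], helper_id: str,
                      already_reviewed: set[str], batch_size: int = 10) -> list[dict]:
    """Pick a shuffled batch for one helper, hiding who created each piece."""
    candidates = [c for c in pending
                  if c.owner_id != helper_id            # never review your own uploads
                  and c.content_id not in already_reviewed]
    random.shuffle(candidates)
    # Only the id and the creator-supplied tags are exposed to the helper.
    return [{"content_id": c.content_id, "tags": c.tags} for c in candidates[:batch_size]]
```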
The system would require that helpers be in a private instance before reviewing content, with no exceptions. Once helpers are inside a private instance, they may equip or spawn the content pieces they wish to review. Once they begin reviewing, the system would have them run through a checklist per content piece that would look something like this:
#1 Does the content match the provided tags? Y/N
#2 Does the content violate ChilloutVR's TOS? Y/N (If yes, please select all that apply: NSFW Underage, Harassment, Crasher, etc)
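Each completed checklist could then boil down to a tiny record like the one sketched below; the field names and violation labels are just examples of mine, not a real format:

```python
# Rough sketch of a single helper review submission; the violation
# categories mirror the checklist above and are examples only.
from dataclasses import dataclass, field

@dataclass
class ReviewSubmission:
    content_id: str
    helper_id: str
    tags_match: bool                      # Q1: does the content match its tags?
    violates_tos: bool                    # Q2: does it break the TOS?
    violation_types: list[str] = field(default_factory=list)  # e.g. ["Crasher"]
```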
In this scenario, content requests come and go frequently, and there is a large group of sustained helpers willing to help out the game's moderation. Using a pendulum-based approval system, content pieces that receive consistent answers would be judged accordingly: if the majority of helpers indicate that the content matches the tags provided and does not violate ChilloutVR's TOS, then it becomes a verified content piece. Every time a piece of content is updated, the system will need to verify it again (the system would do its best not to run helpers through the same uploaded content each time) - just so users cannot trick the system by maliciously changing their content after having it verified. Helpers who are statistically shown to have high verification rates (i.e. their answers usually agree with the final verdict) will be rewarded with a fun little "Helper!" badge, just to show off how awesome they are. On the other hand, helpers who aren't being awesome and intentionally misuse the system will end up with low verification rates relative to the majority, and will eventually have their helper rights revoked, including their awesome badge.
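Here is a minimal sketch of how that majority vote and the helper "verification rate" could be computed; the minimum review count and the revocation threshold are numbers I made up for illustration:

```python
# Minimal sketch of the majority-vote ("pendulum") approval idea and the
# helper accuracy score it implies; thresholds and names are assumptions.
def resolve_content(reviews: list[dict], min_reviews: int = 5) -> str | None:
    """Return 'verified', 'rejected', or None if not enough reviews yet."""
    if len(reviews) < min_reviews:
        return None
    approvals = sum(1 for r in reviews if r["tags_match"] and not r["violates_tos"])
    return "verified" if approvals > len(reviews) / 2 else "rejected"

def helper_accuracy(helper_votes: list[tuple[str, str]]) -> float:
    """Fraction of a helper's votes that agreed with the final majority outcome."""
    if not helper_votes:
        return 1.0
    agreed = sum(1 for vote, outcome in helper_votes if vote == outcome)
    return agreed / len(helper_votes)

# A helper whose accuracy drops below some cutoff (say 0.6) could have their
# helper rights, and the badge, revoked; the exact threshold is up to the team.
```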
I think one factor that a lot of people forget when it comes to gaming communities is that they will do great things without ever asking the developers for a single cent. Games like Team Fortress 2 have been kept running almost entirely by the community for YEARS without any true developer support! Moderation in games like these is indeed really hard to tackle, and the only real solution is just a lot of treadmill work, right? Exactly!!! That's exactly my point! Why not have the community help push through the treadmill work? The moderation team will still be in charge of this type of content regardless of whether a verification system exists or not, but a system that helps take the pressure off their shoulders could prove effective if done in the right way. A system that automatically marks content as unverified would halt malicious users whose only incentive is to quickly upload content to harass others in public lobbies, since a toggle-able switch would be given to every user to hide potentially unwanted content.
It is currently WAY too easy for malicious users to spread their influence via content creation on ANY platform, and I believe that a toggle-able verification system (if done right) would give ChilloutVR that little extra layer of security to help ensure a safer and more pleasurable experience for everyone.
Feel free to let me know what you all think of this! The issue has only gotten worse as time goes on and I believe it is a discussion that needs to happen more until we find the best solution, for the sake of ChilloutVR's future.
…