To ensure that players from all walks of life can safely create and share wonderful, emotional connections here in the world of Sky, we prioritize the safety and protection of the Sky community and continually seek out new ways to improve and expand these efforts. 

We endeavor to ensure that every player feels empowered and safe in our community, and in that spirit we want to share an update on how community moderation is handled in Sky.

Community Empowerment and Protection

Community moderation begins with the Code of Conduct and Community Rules & Guidelines, which all players agree to abide by when first starting to play Sky. These are the “dos and don’ts” of Sky and provide every player with guidance and direction on what behaviors are acceptable. 

All player chat and creations are passed through strict, family-friendly chat filters. These filters block risky or harmful chat messages within milliseconds to ensure a seamless and safe chat experience. Because they must work instantaneously, their accuracy may not always be perfect, so we continually update them based on community feedback to ensure that non-violating chat is allowed through.

Every player is also equipped with controls to mute or block other players, as well as in-game reporting features to help flag inappropriate behavior for investigation. 


Player Report Investigations, Account Actions, and Appeals

Upon receiving a player report, our Player Protection team conducts a thorough manual investigation to assess the report's validity and determine whether any corrective actions are needed. These initial actions typically involve issuing an in-game behavioral reminder to the player or implementing a temporary account mute.

In cases where a player continues to engage in violating behavior despite prior warnings or temporary account actions, enforcement may escalate to a permanent mute, or to a temporary or permanent account ban, following a final manual review of the account's history and recent offenses.

That said, certain behaviors in Sky are treated with the utmost seriousness, and if an investigation confirms their occurrence, we take swift action in the form of a permanent account ban to ensure the safety and well-being of our community.

Zero-tolerance behaviors include, but are not limited to, the act or threat of disclosing another player's personally identifiable information and the exploitation or endangerment of minors. Any reported or detected illegal online harms such as these are handled with extreme urgency and care, and we will always comply fully with law enforcement investigations.

All accounts subject to permanent mutes or bans first undergo careful manual review by our Player Protection team to ensure the accuracy and appropriateness of the actions taken.

Any player who has received an action on their account can appeal through our in-game ticket support system or our web-based ticket support form. Our Sky Player Support team carefully reviews every appeal to determine whether the account action is eligible to be overturned.

Real-Time Proactive Detections and Moderation 

Beginning in 2024, we are rolling out an important addition to our suite of moderation tools to help scale our player protection efforts: a real-time proactive moderation system, built on highly accurate contextual machine learning with comprehensive language coverage, that will begin issuing in-game behavioral reminders to players when it detects behavioral violations.

Following this initial rollout, the real-time moderation system will gradually incorporate temporary account mutes for players who continue to engage in violating behavior despite previous behavioral reminders. Initially, these mutes will be brief, starting with 1-hour durations, and the durations will increase if inappropriate behavior persists. If enough violations are detected on an account, a manual human review is triggered to determine whether a permanent mute is required for the safety of the Sky community.
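For readers curious about the mechanics, the escalation ladder described above (reminder first, then temporary mutes of increasing length, then a hand-off to human review) can be sketched roughly as follows. This is a minimal illustration, not our actual implementation: only the 1-hour starting duration comes from this post, and the later durations, the review threshold, and all names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical values: only the 1-hour starting duration is stated above;
# the longer durations and the review threshold are illustrative.
MUTE_DURATIONS_HOURS = [1, 24, 72]
MANUAL_REVIEW_THRESHOLD = 5

@dataclass
class AccountRecord:
    violations: int = 0
    reminded: bool = False

def next_action(record: AccountRecord) -> str:
    """Return the next moderation step for a newly detected violation."""
    record.violations += 1
    if record.violations >= MANUAL_REVIEW_THRESHOLD:
        # Enough violations: escalate to a human moderator for review.
        return "manual_review"
    if not record.reminded:
        # First offense: an in-game behavioral reminder, no mute yet.
        record.reminded = True
        return "behavioral_reminder"
    # Subsequent offenses: temporary mutes of increasing duration.
    step = min(record.violations - 2, len(MUTE_DURATIONS_HOURS) - 1)
    return f"temporary_mute_{MUTE_DURATIONS_HOURS[step]}h"
```

The key design point is that automated steps stop short of permanent actions: once the threshold is reached, the sketch defers to a person, mirroring the manual-review requirement described above.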

All permanent account actions, such as permanent mutes or account bans, will continue to be issued only after careful, manual investigations by our team.

Please note an important distinction between our chat filters and our real-time proactive detection and moderation system: the chat filter operates instantaneously, within milliseconds, to provide an immediate decision on whether a chat message should be blocked. This ensures as seamless a chat experience in Sky as possible.

Meanwhile, our real-time proactive detection and moderation system uses more robust contextual machine learning and operates in seconds, giving the models time to process the entire context of a conversation and make a more accurate determination of whether a violation has occurred. This helps prevent detection errors while still catching violations in as close to real time as possible, so the system can automatically issue an appropriate action or initiate a human moderator review.
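The two-tier design described above can be sketched as a tiny pipeline: a fast synchronous filter decides whether a message is shown at all, and a slower context-aware check runs afterward to flag violations. Everything in this sketch is hypothetical: the blocklist, the stand-in "contextual" rule, and all function names are illustrative placeholders, not our production filters or models.

```python
# Hypothetical stand-in for the real family-friendly filter's term list.
BLOCKLIST = {"badword"}

def fast_filter(message: str) -> bool:
    """Millisecond-scale check: True means the message may be shown."""
    return not any(term in message.lower() for term in BLOCKLIST)

def contextual_check(conversation: list[str]) -> bool:
    """Slower, context-aware check over the whole conversation.
    A real system would call an ML model here; this stand-in simply
    flags repeated hostile messages to show why context matters."""
    hostile = sum(1 for m in conversation if "stupid" in m.lower())
    return hostile >= 2  # one insult might be banter; repetition is flagged

def moderate(conversation: list[str], new_message: str) -> tuple[bool, bool]:
    """Return (shown, flagged_for_action) for a new message."""
    shown = fast_filter(new_message)          # immediate decision
    history = conversation + ([new_message] if shown else [])
    flagged = contextual_check(history)       # runs after display
    return shown, flagged
```

The point of the split is latency: the display decision cannot wait for a full-context model, but enforcement decisions can, which is why the second tier can afford to be more accurate.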

Putting it all Together

From the considerate and fair handling of player reports and appeals, to the introduction of contextual machine learning AI models to drive more proactive and scaled moderation in Sky, our commitment to nurturing a positive and safe gaming environment remains at the forefront of our hearts and minds. 

In our journey towards fostering a safe and inclusive community in Sky, we are all responsible for the culture we create together, and we appreciate your input and feedback as we move forward.

As we embark on these next steps in our collective endeavor, remember that every action we take is guided by our dedication to creating a space where everyone feels valued, respected, and free to enjoy their time with us here in Sky with child-like wonder and awe.