AI is a tool to support human creativity and responsibility, not replace it.
- Open source principles
- Efficiency, particularly when funded with public money and adding accessibility features
- Quality and maintainability of code
- Workload on maintainers and contributors, and avoiding barriers to contribution
- Humans are responsible for code and other assets; AI is a tool
The world has changed. We believe that open source projects should be able to benefit from generative AI tools, which have largely been trained on open source content, and we need to do so in a way that is aligned with the goals of the project.
At this point we are not able to accept large contributions from fully automated contributors or contributors who are not known to the project.
Fully automated contributions refer to cases where AI has been given a general task (for example, tools like OpenClaw) and produces code without meaningful human involvement.
If you are new to the project, you are very welcome. Please start with a small contribution that can be easily reviewed and gradually build up your contribution profile. You can also introduce yourself on our community Discord: Flock XR Community
If you are planning a large contribution, please raise an issue first so we can discuss the approach before significant work is undertaken.
Code generated by AI with substantial human involvement is considered for acceptance when it is in service of these principles and is clearly marked to indicate which AI tool was used and in what capacity.
The contributor is fully responsible for all submitted code, regardless of whether AI tools were used.
Code submitted to the Flock XR project should be understood by the committer and substantially authored by them. This can include prompting an LLM to produce the final code, provided that significant human input was involved. Human input can include knowledge of the existing code base, as well as the research and design work that went into crafting and refining a prompt. We recognise that with a project like Flock XR, a lot of work goes into deciding which code should exist, how functionality should be designed and how it should integrate with the existing architecture.
Contributors must ensure that no third-party copyrighted material is included in submitted code. Contributors and reviewers should take additional care when considering code that is likely to be similar to existing code. Increasingly, tools provide features that help contributors identify third-party code in their output, along with its licence, so they can determine whether it is permitted for use.
We don’t have a specific list of tools that can be used. Instead we ask that you focus on the goals and principles of this policy.
AI contributions should be clearly marked with the tool used. This can be done by listing the AI as a co-author or by tagging the contribution, whichever is convenient for the workflow supported by the tool. This should be included in the pull request description.
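As an illustration, the widely supported co-author convention can be used to mark the tool in a commit message. This is a sketch only; the tool name and email address below are placeholders, not a recommendation of a specific tool:

```text
Add keyboard navigation to the scene inspector

AI was used to draft the initial event-handler boilerplate,
which was then reviewed and reworked by hand.

Co-authored-by: ExampleAI Assistant <noreply@example.com>
```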
Pull requests should include a description of how AI was used to support the completion of the task. This could include a summary of prompts and which tasks were supported by AI. We respect the privacy of contributors and do not want to create an unnecessarily large environmental footprint; a full prompt history is therefore not required.
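For example, a pull request description might include a short AI-usage section along these lines (all details here are illustrative, not a required template):

```markdown
## AI usage

- Tool: ExampleAI (code completion and chat)
- Used for: drafting unit tests and suggesting a refactor of the
  input-handling module
- Human work: API design, integration with the existing scene
  architecture, final review and manual testing of all changes
```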
The above rules apply to non-code content including documentation, images and 3D models.
We recognise that some potential contributors may wish to write code entirely by hand, without AI assistance. This is acceptable provided that it does not conflict with the efficient use of public funds or the delivery of key features (particularly accessibility), or where the contribution serves the contributor's learning goals.
This policy will be reviewed regularly as the generative AI landscape changes.
This policy is influenced by the NLnet Generative AI Policy (we're grateful that they listened to feedback as the open source community navigates this unprecedented change in how we develop software), the measured approach of the Apache Generative Tooling Guidelines, and the ideas around community and human maintainers in the Ghostty AI Policy.