Scargo: Transpile a subset of Python to Argo YAML #6234
Seanny123 started this conversation in Show and tell
-
@terrytangyuan this project was influenced by your work with Couler and Argo Dataflows. I plan to keep learning from your work and hope this project has some ideas worth your consideration.
-
Can I leave you two to chat about this? I’d love to see more 🐍
-
Argo is amazing software. It makes orchestrating distributed workflows way easier than before. However, I do find it difficult to define workflows in YAML with so little editor support. The scientists I work with at ProteinQure have an even harder time prototyping workflows and testing them locally.
This motivated me to create Scargo, a subset of Python that can be transpiled to Argo YAML. It lets a scientist prototype and test a workflow locally in plain Python, then transpile it to run on Argo.
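To make the transpilation idea concrete, here is a minimal, self-contained sketch of turning a Python description of workflow steps into an Argo `Workflow` manifest. To be clear, this is not Scargo's actual API; the function name `to_argo_workflow` and the step-tuple format are illustrative assumptions, and only the Argo manifest fields (`apiVersion`, `kind`, `spec.templates`, etc.) come from Argo itself. See the repo's examples for the real interface.

```python
# Hypothetical sketch (not Scargo's real API): build an Argo Workflow
# manifest from a list of (step_name, image, command) tuples.
import json


def to_argo_workflow(name, steps):
    """Return an Argo Workflow spec (as a plain dict) running steps in sequence."""
    # One container template per step.
    templates = [
        {
            "name": step_name,
            "container": {"image": image, "command": list(command)},
        }
        for step_name, image, command in steps
    ]
    # An entrypoint template that runs each step one after another.
    entrypoint = {
        "name": "main",
        "steps": [[{"name": s, "template": s}] for s, _, _ in steps],
    }
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": f"{name}-"},
        "spec": {"entrypoint": "main", "templates": [entrypoint] + templates},
    }


if __name__ == "__main__":
    workflow = to_argo_workflow(
        "hello",
        [
            ("say-hello", "alpine:3.18", ["echo", "hello"]),
            ("say-bye", "alpine:3.18", ["echo", "bye"]),
        ],
    )
    # JSON is valid YAML, so this output can be submitted to Argo as-is.
    print(json.dumps(workflow, indent=2))
```

Printing the dict as JSON keeps the sketch dependency-free: since JSON is a subset of YAML, the output is already a valid Argo manifest without pulling in a YAML library.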
ProteinQure has decided to open-source the prototype while it is still actively being developed. However, please consider it less of an established product and more of an ambitious demo.
The best way to understand Scargo is to look at the examples. Note that the YAML output may look a bit unusual and may not run on your cluster as-is, since our cluster is configured to reduce boilerplate. However, adapting this output to other cluster configurations should be possible in the future.
Please let me know if you have any comments or questions, especially if you're trying to run the examples under a different cluster configuration. I'm not sure what other people's cluster configs look like.