As a general rule of thumb, once an SDK becomes generally available, we follow Semantic Versioning 2 and bump the major version on breaking changes. We try to make as few of those major versions as possible to reduce the maintenance burden on consumers. As for integration tests, have a look at the raptor repository under this organization. We don't publish the dashboards it produces, but we use it to know how well aligned our snippets and SDK generation logic are and whether any regressions happen. Raptor has supported Go for a couple of months now, and one exercise for the GA will be to increase the pass rate to a satisfactory level. The challenge right now is that our team is understaffed, which is why we have pushed back the GA of the Go SDK. We don't have any timeline to share at this point. I hope this clarifies a few things.
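For consumers, the SemVer 2 policy means breaking changes should only arrive with a new major version, which Go modules surface directly in the import path. Here is a minimal sketch of pinning an application to one major line of the SDK; the module path `github.com/example/example-sdk-go` and version numbers are hypothetical, not the actual SDK's:

```
// go.mod (sketch) — pins the app to the v2 major line of the SDK.
// Under SemVer 2, v2.x.y upgrades should not break this build;
// a breaking change would ship as a new major, e.g. .../v3.
module example.com/myapp

go 1.21

require github.com/example/example-sdk-go/v2 v2.3.1
```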
Do you know the schedule for feature releases or where it is listed?
A follow-up: I've observed a lot of library/versioning breaks in the last few months. Does the current planning/scheduling set aside time for intended breaking changes as part of these refactors?
Are there regression tests available for the SDK that quickly identify which capabilities have been affected from version to version?