
Conversation

@jhanca-robotecai
Contributor

This is a first draft of the RFC to solve the problem of versioning Gems and Engine.

The main goal of the document is to start the conversation. Although it proposes some solutions, we should discuss the optimal solution. The most important thing is to have a procedure and commit to it. If the procedure is faulty, it can be changed; if there is no procedure (or nobody sticks to it), it causes a mess.

- _MINOR_ version changes indicate new features that are backward-compatible.
- _PATCH_ version changes indicate backward-compatible bug fixes.

The versioning information is updated by the developers manually. It will be automated in the future. The _stabilization_ branch takes the version number from the HEAD of the _development_ branch at the branch cutoff, and the _development_ branch should get an immediate update of the _MINOR_ version.


I think this is one of the cornerstone changes we have to make.

I don't think dev necessarily needs an immediate update (it will naturally move forward anyway), but I would not object to it.

Any other decisions we make or processes we create basically have to deal with the above reality, that is, that released versions and dev versions are on the "same version timeline" now.


I agree. I'm not convinced an immediate bump is necessary, but if there is an argument for it, I think that it is fine.


The versioning information is updated by the developers manually. It will be automated in the future. The _stabilization_ branch takes the version number from the HEAD of the _development_ branch at the branch cutoff, and the _development_ branch should get an immediate update of the _MINOR_ version.

It is important to note that there is one more RFC that describes the versioning strategy for the Engine itself, [rfc-core-2022-05-31-engine-versioning](https://github.com/o3de/sig-core/blob/main/rfcs/rfc-core-2022-05-31-engine-versioning.md), which describes the versioning in the format of `YEAR.MONTH.RELEASE`, e.g. `25.05.1` for the first patch (point-release) version of the Engine with a planned release date in May 2025.


I think this is actually our saving grace, since this is what is shown to users as the "name of the release" rather than the version, and it lets us make deep changes or jumps to the internal versioning without it looking like we're making a big jump. That is, even though the internal release version might go from 2 to 4 (or the dev version might), it won't represent an outward-facing big jump from a PR perspective, as it will still be 25.x to 26.x or whatever...


@byrcolin byrcolin Aug 27, 2025


After reading the rfc-core-2022-05-31-engine-versioning document: it starts off by talking about an "advertised/named version", which is what we now call a "release version" or "release name" (which I prefer, because it is a string, not a version). It rightly says this is inadequate to coordinate the matching of other objects, and suggests adding an "internal version", which is what we call an "object version", to do that. I agree... and as this is an old document, we already have one. All o3de object files (engine.json, project.json, gem.json, template.json, repo.json, restricted.json) have a "version" field, which is the object version, though it is not currently used/enforced except for some minor checks, done by Project Manager, that can produce a warning. It says the object version will not use the same schema as the release version; this is correct, it follows standard semver, i.e. major.minor.patch (1.0.0, 1.2.1, 2.0.1, etc.).

It is less clear what major, minor, and patch mean. I would suggest we define them to mean something very clear, like:

  • Major number increment (major+1.minor+0.patch+0) = "Completely new object. No expectation of any backward compatibility with previous versions whatsoever. An upgrade process MAY exist, but is not guaranteed at all."
  • Minor number increment (major+0.minor+1.patch+0) = "A significant new feature has been added or an API has changed; an upgrade process MAY be needed to get previously working source code/assets to work again."
  • Patch number increment (major+0.minor+0.patch+1) = "Compilation change; compatibility with the previous version is guaranteed. Contains nothing that would break compatibility: no new features, only bug fixes to the current feature set. No upgrade process is needed; source code using the previous API and/or assets will work without changes."
  • No increment (major+0.minor+0.patch+0) = "No compilation change; possibly a resource-only change (like an icon change), comment change, documentation change, or metadata change that doesn't affect compilation. Absolute backward compatibility with the previous version is guaranteed; no upgrade process is needed, and previous source code using the API and/or assets will work without changes."
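The bump semantics above can be sketched as a small classifier. This is an illustrative sketch, not o3de code; `bump_kind` and `parse_semver` are hypothetical names.

```python
# Sketch: classify the difference between two object versions according to
# the bump semantics proposed above. Function names are illustrative only.

def parse_semver(text: str) -> tuple[int, int, int]:
    major, minor, patch = (int(part) for part in text.split("."))
    return major, minor, patch

def bump_kind(old: str, new: str) -> str:
    o, n = parse_semver(old), parse_semver(new)
    if n[0] != o[0]:
        return "major"   # no backward-compatibility expectation
    if n[1] != o[1]:
        return "minor"   # new feature / API change; upgrade MAY be needed
    if n[2] != o[2]:
        return "patch"   # bug fixes only; no upgrade process needed
    return "none"        # resource/comment/metadata-only change

print(bump_kind("1.0.0", "2.0.0"))  # major
print(bump_kind("1.2.1", "1.3.0"))  # minor
```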

For specifying compatibility/dependability with other object(s), semantically it more or less follows the Python notation: object_name[OPTIONAL(=|<|>|>=|<=|~=)major.minor.patch OPTIONAL[(<|<=)major.minor.patch]]
As an example, suppose an object expresses compatibility/dependence on the Achievement Gem.
Valid:

  • "Achievement" (with no version) meaning the highest version of the Achievement Gem found that can satisfy all constraints; in cmake, without EXACT and no range args.
  • "Achievement=1.0.0" meaning the Achievement Gem version 1.0.0, and only 1.0.0, can satisfy the compatibility; in cmake, EXACT with no range args.
  • "Achievement<1.0.0" meaning the highest version found that can satisfy all constraints that is less than but not including 1.0.0; in cmake, [0.0.0...<1.0.0] or [...<1.0.0]. (Very uncommon, may never see this, but supported.)
  • "Achievement>1.0.0" meaning the highest version found that can satisfy all constraints that is greater than but not including 1.0.0; in cmake, [>1.0.0...]. (Very uncommon, may never see this, but supported.)
  • "Achievement<=1.0.0" meaning the highest Achievement Gem version found that can satisfy all constraints, less than or equal to 1.0.0; in cmake, [...1.0.0] or [0.0.0...1.0.0]. (Very uncommon, may never see this, but supported.)
  • "Achievement>=1.0.0" meaning the highest Achievement Gem version found that can satisfy all constraints, greater than or equal to 1.0.0; in cmake, [1.0.0...]. (This is probably the most common compatibility expression we will see on objects, because it marks the start of compatibility without a known upper limit. It will not change until a compatibility break is known; then it should be updated to have the upper bound BEFORE updating the version to the next open-ended compatibility.)
  • "Achievement~=1.0.0" meaning the highest Achievement Gem version found that can satisfy all constraints, bounded by the next minor version; in cmake, [1.0.0...<1.1.0]. (This may be somewhat common, as it describes an implicit upper bound at the next minor version change, i.e. compatible until the next minor version change.) NOTE: we don't support other forms like "Achievement~=1", which would imply a major-version upper bound [1.0.0...<2.0.0], or "Achievement~=1.0", which would imply a next-patch bound [1.0.0...<1.0.1].
  • "Achievement>=1.0.0<2.5.0" meaning the highest Achievement Gem version found that can satisfy all constraints, greater than or equal to 1.0.0 but less than 2.5.0. (This should be another very common expression, as it expresses a known upper bound that is NOT major-bound, and is frankly clearer to read.)
  • "Achievement<1.0.0<2.5.0" meaning the highest Achievement Gem version found that can satisfy all constraints, less than 1.0.0; the <2.5.0 is superfluous, as 1.0.0 is already an upper bound, though it is technically not invalid, as both constraints can be satisfied. (This should be a very uncommon expression, but it is technically supported.)
  • "Achievement<=1.0.0<=2.5.0" meaning the highest Achievement Gem version found that can satisfy all constraints, less than or equal to 1.0.0; again, the <=2.5.0 is superfluous, as 1.0.0 is already an upper bound, though it is technically not invalid, as both constraints can be satisfied. (This should be a very uncommon expression, but it is technically supported.)
  • "Achievement>=1.0.0<=2.5.0" meaning the highest Achievement Gem version found that can satisfy all constraints, greater than or equal to 1.0.0 and less than or equal to 2.5.0. (This should be far less common, as the upper bound includes the version; sometimes you see this written as >=1.0.0<=1.9999.9999 when they should really have just put <2.0.0. Hopefully uncommon, but technically supported.)
Invalid:

  • "Achievement<0.0.0" versions are non-negative and cannot be less than 0.0.0, so this can never be satisfied.
  • "Achievement>1.0.0<1.0.0" versions cannot be simultaneously lower than and greater than a specific version. This includes >= and <= variants.
  • "Achievement>1.0.0<0.5.0" versions cannot be simultaneously greater than a lower bound and less than a bound below that lower bound. This includes >= and <= variants.
  • "Achievement~=1.0.0<1.0.0" versions with an implicit upper bound combined with an explicit upper bound less than the lower bound. This includes <= variants.
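A toy parser/checker for this specifier grammar might look as follows. This is a sketch under the grammar described above; `parse_spec` and `satisfies` are hypothetical names, not the o3de or CMake implementation.

```python
import re

# Sketch of a parser/checker for the specifier grammar described above.
# Hypothetical helper names; not the actual o3de or CMake implementation.

_SPEC = re.compile(r"(~=|>=|<=|=|>|<)(\d+\.\d+\.\d+)")

def parse_spec(spec: str):
    """Split 'Achievement>=1.0.0<2.5.0' into a name and a constraint list."""
    first = re.search(r"(~=|>=|<=|=|>|<)", spec)
    if first is None:
        return spec, []                      # bare name: any version is acceptable
    name = spec[:first.start()]
    constraints = [(op, tuple(int(x) for x in ver.split(".")))
                   for op, ver in _SPEC.findall(spec)]
    return name, constraints

def satisfies(version: str, constraints) -> bool:
    v = tuple(int(x) for x in version.split("."))
    for op, bound in constraints:
        if op == "=" and v != bound: return False
        if op == ">" and not v > bound: return False
        if op == "<" and not v < bound: return False
        if op == ">=" and not v >= bound: return False
        if op == "<=" and not v <= bound: return False
        if op == "~=":  # compatible release: [X.Y.Z, X.(Y+1).0)
            if not (v >= bound and v < (bound[0], bound[1] + 1, 0)):
                return False
    return True

name, cons = parse_spec("Achievement>=1.0.0<2.5.0")
print(name, satisfies("1.7.3", cons), satisfies("2.5.0", cons))
```

Selecting the "highest version found that can satisfy all constraints" is then a matter of filtering the available versions through `satisfies` and taking the maximum.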

The document resumes: "It can be used to detect the need for upgrades to compatible gems, assets and other data schemas after new change to the engine is pulled or installed."
Technically yes, but only because the engine is an object itself, just as any other o3de object, it is not exclusive to an engine object, this applies to all objects.

Solving Dependencies
It is not enough that all dependencies EXIST; they also all have to be compatible. Say object A 1.0.0 requires object B=1.0.0 and C=1.0.0, and then C requires B=2.0.0, which can never be satisfied; or there is a longer circular chain of dependencies A->B->C->D->F->G->A (oops).
If exact matching, i.e. "o3de_object=", is used, this may happen often. If we use ranges, preferably open-ended ranges, it should happen far less often: A 1.0.0 requires B>=1.0.0 and C>=1.0.0, which finds B 1.0.0, B 1.5.0, B 1.6.0, C 1.2.0, and C 1.3.0. Each of these has its own dependencies: C 1.2.0 requires B>=1.5.0 and C 1.3.0 requires B<1.6.0. B 1.5.0 requires D>3.0... B 1.6.0 requires D>3.5, and so on...
These dependencies CAN be solved by choosing a specific version, e.g. B=1.5.0, EVEN IF B=1.6.0 is present. This is why the list above says "highest version found that can satisfy all constraints". Solving all these constraints is not trivial, but there is open source software, Resolvelib, that can do it for us in the same manner in which CMAKE does it internally. Project Manager already uses it to determine the compatibility of a set of objects.
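To make the backtracking concrete, here is a toy resolver reproducing the example above (the real work would be done by Resolvelib; the data and function names here are purely illustrative). Picking C 1.3.0 forces B<1.6.0, so the solver settles on B 1.5.0 even though B 1.6.0 exists.

```python
# A toy backtracking resolver illustrating the example above. This is the
# kind of constraint solving Resolvelib does for real; data is hypothetical.

AVAILABLE = {
    "B": ["1.0.0", "1.5.0", "1.6.0"],
    "C": ["1.2.0", "1.3.0"],
}
# (name, version) -> list of (dep_name, predicate on a version tuple)
DEPS = {
    ("C", "1.2.0"): [("B", lambda v: v >= (1, 5, 0))],
    ("C", "1.3.0"): [("B", lambda v: v < (1, 6, 0))],
}

def vt(s):  # "1.5.0" -> (1, 5, 0) for easy comparison
    return tuple(int(x) for x in s.split("."))

def resolve(requirements, chosen=None):
    """requirements: list of (name, predicate); returns {name: version} or None."""
    chosen = dict(chosen or {})
    if not requirements:
        return chosen
    (name, pred), rest = requirements[0], requirements[1:]
    if name in chosen:  # already picked: the pick must satisfy this constraint too
        return resolve(rest, chosen) if pred(vt(chosen[name])) else None
    # try candidates highest-first, recursing into their own dependencies
    for ver in sorted(AVAILABLE[name], key=vt, reverse=True):
        if not pred(vt(ver)):
            continue
        sub = DEPS.get((name, ver), [])
        result = resolve(rest + sub, {**chosen, name: ver})
        if result is not None:
            return result
    return None  # backtrack

# A requires C>=1.0.0 and B>=1.0.0
print(resolve([("C", lambda v: v >= (1, 0, 0)),
               ("B", lambda v: v >= (1, 0, 0))]))
# -> {'C': '1.3.0', 'B': '1.5.0'}  (B 1.6.0 is skipped because C 1.3.0 forbids it)
```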

The document continues: "All of the changes that need to be made for a project to support the latest version of the engine can be determined based on the version difference from the project's last upgrade. This system can also be used to predict whether a gem is compatible with the engine, as gems can provide a range of compatible engines versions."
Again, technically yes, but this is not exclusive to gem objects. And yes, technically CMAKE will only care about the versioning of potential source code objects like engine, project, gem, and restricted objects; the compatibility of other objects, such as repo and template objects, does not matter for compilation.

This document goes on to state: "A new field will be added to engine.json at the root the o3de directory called 'EngineVersion'. The field 'O3DEVersion' already exists but is used for the advertised/named version number. This field will be used to track the current version of engine. To ensure this field is only updated when it should it will be it has been added to the .codeowners under sig-core to request a review from the group."
This is incorrect. "EngineVersion" should not be a thing; it seems it was supposed to be the ultimate parent version for all child objects... We should NOT do that. The engine object has its own "version" field, as do all objects, and the children DO NOT inherit the ultimate parent's "EngineVersion" or "version" field. I think the author was simply unaware that all objects had a "version" field; this document was created 5/31/2022, and I believe the "version" field was already present by then. I could be wrong, but I'm pretty sure it was. In any case, each object has its own version and does not share the ultimate parent's version.

Object Version Releases
Currently we do not support gem binary releases, but we should. We could support gem source releases today without too much trouble: a GHA script could be added to each gem and executed to create a zip of the source files, which would be available on GitHub releases. Currently, in order to create a binary of a gem, you have to include that gem from an engine object or project object (or from a gem the project or engine object already uses), as engines and projects are currently the only roots from which CMAKE can configure/build. We could theoretically grab the artifacts from one of these builds and create a binary release of sorts, but the goal here should be the ability to configure and build gems directly and independently.

And to be clear, NOT ALL GEMS SHOULD NEED A BINARY or SOURCE RELEASE. I think only root-level gems, like gems that describe a system, should. For instance, Atom is a tree of gems. I'm not sure it makes a whole lot of sense to have a release of the Atom RHI gem, i.e. to do this for every gem inside Atom. Though technically you could separate them all and it would work, I think keeping them all in one "Atom release" might make more sense.

Also, projects currently have a game "Gem" inside them, and though I think this is probably unnecessary and not useful, it would not make sense to have a release of the project and a separate release of the game gem inside the project. (Personally, I think a project having a permanent game "Gem" is kind of an abuse of the object system, as the project's game "Gem" will never be reused in other projects. I would like to see the game "Gem" removed from projects in the future, as I think it muddies the water for what we are actually trying to achieve here, but some might feel otherwise.)

Proposal: Specifying what CMAKE should do/which release to use:
Schema 2.0.0 will have the ability to specify releases, so CMAKE will know about them. Once we have the ability to independently configure and build a source and binary release of a gem, we need a way to tell CMAKE which to use. Project Manager would be able to detect/warn and even satisfy all such dependencies by downloading either the git repo at a certain branch/commit, OR the source release zip, OR the binary release zip, OR any combination of them; so how do we encode which one we want CMAKE to use? We would have to come up with a way to tell CMAKE what combination we wanted, or what would be acceptable and in what order. For example, if we want to use the binary release of the Achievement gem at a certain version or range of versions, we could tack a suffix onto the specification and imply a preferred order, maybe:

  • "Achievement>=1.5.0<2.10|BINARY" means we must use the binary release; it must be found/local, or error. In practice this might never be used...
  • "Achievement>=1.5.0<2.10|SOURCE" means we must use the source release; it must be found/local, or error. In practice this might never be used...
  • "Achievement>=1.5.0<2.10|GIT" means we must use git at a certain branch/tag; it must be found/local, or error. In practice this might never be used...
  • "Achievement>=1.5.0<2.10|SOURCE|FETCH" means we must use the source release; CMAKE should fetch/download it if it is not found/local, and if it is not available, error...
  • "Achievement>=1.5.0<2.10|BINARY|SOURCE" means we must use either the binary or source release, preferring the binary release; if neither is found/local, error...
  • "Achievement>=1.5.0<2.10|SOURCE|BINARY|FETCH" meaning prefer the source release, fall back to the binary release, then fall back to CMAKE fetching/downloading the source release, then the binary release; if none are available, error... So order is important...
  • "Achievement>=1.5.0<2.10|BINARY|SOURCE|GIT|FETCH" meaning prefer the binary release if available, fall back to the source release if available, fall back to git at a certain branch/tag if available, then fall back to CMAKE trying to fetch/download the binary release, then the source release, then clone/shallow-clone(?) the git repo at a certain branch/tag; if none are available, error... I imagine this might be the most common...

I think this would cover usage with fairly fine-grained control and be open-ended enough to succeed most of the time if we defaulted to |BINARY|SOURCE|GIT|FETCH when no specifier is present. Seems reasonable...
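Splitting such a specifier into its version constraints and an ordered release-kind preference list is straightforward. A minimal sketch, assuming the kinds and default order proposed above (this is not an existing CMake or o3de feature):

```python
# Sketch: split the proposed specifier into version constraints and an ordered
# release-kind preference list. Names and the default order come from the
# proposal above, not from any existing implementation.

KINDS = {"BINARY", "SOURCE", "GIT", "FETCH"}
DEFAULT_ORDER = ["BINARY", "SOURCE", "GIT", "FETCH"]

def parse_release_spec(spec: str):
    parts = spec.split("|")
    version_part, kinds = parts[0], parts[1:]
    unknown = [k for k in kinds if k not in KINDS]
    if unknown:
        raise ValueError(f"unknown release kind(s): {unknown}")
    return version_part, (kinds or DEFAULT_ORDER)  # order encodes preference

print(parse_release_spec("Achievement>=1.5.0<2.10|BINARY|SOURCE|FETCH"))
print(parse_release_spec("Achievement>=1.5.0<2.10"))  # falls back to the default order
```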

@nick-l-o3de

nick-l-o3de commented Aug 8, 2025

The main points of current contention or further discussion, I think, are the following. Some of these we just need to brainstorm, or examine how other modular projects have solved them... or wish they had :)

Working backwards from use cases

Game / Sim studio users

The main reason for versioning at all, IMO, is so that users can understand and choose which versions of their gems and engine to use. Please consider looking at all of this through the eyes of a studio member working with O3DE, whose goal is not "making the greatest modular system" but "making my game" or "making my simulation".

What this means is that very likely, they will be exposed primarily to the actual released official main versions, publicly "named" 26.05.0, 26.10.1, and the like (this is just the name; it is not the actual internal API Major.Minor.Patch version of the software that will be used in dependency language files).

More specifically, the studio is unlikely to live on development unless they are willing to accept that risk and have a good reason to do so. From their perspective, when a new engine version is released, they are almost certainly going to have an individual or small team create a branch internally for their own game, to upgrade it to the new release (unless they intend to freeze and never take a release). That individual or small team is basically the only one exposed to any downsides of our versioning scheme and is essentially the primary user for it. Even the "engine team" of a game studio likely works on their fork of the engine, never on development upstream. We want the integrator person to have as frictionless of an experience as possible.

Middleware developer (Cesium, Kythera, PopcornFX, etc)

These users are selling a product (or developing a free one) that integrates with O3DE. Their day-to-day experience is likely: getting notified that a new release is coming and that stabilization has begun, grabbing stabilization as a branch, making a new branch/release of their gem targeting that upcoming release (either before or after it has been released), and then releasing that new version of their product that targets the new version of the engine.

In terms of versioning, these customers almost certainly target the "vanilla" version of the engine (that is, just the core engine gems that ship with the engine, and the engine core itself) monolithically, and test and make sure their product works against the upcoming release (or just after release). They don't care about development or patch versions; they target major and minor revs, and are only exposed to the frozen stabilization branch and to released major and minor versions.

Engine developers (not part of game team)

Individuals who as a hobby or job work primarily on a fork of actual upstream development. These are our contributors. We want their experience to be as frictionless as possible in terms of contributing new code and generate as few conflicts and round trips between maintainers and contributors and as few reruns of AR as possible given the constraints.

Automation

While we do need some amount of automation present, we can't really automate the decision as to whether a change is in fact an API change, a breaking change, or a patch, human eyes need to do that (for now). This implies that bumping version numbers in the object json files has to be done by the submitter of the PR due to the nature of GIT, and naturally implies that we're going to have to deal with a lot of cases where the submitter forgot to do this, the commenter asks them to do it, and it incurs an extra round of PR back and forth. The issue with that is simply that any further back and forth comes at a "abandon the PR" cost for non-paid contributors. If there's anything we can do to reduce the number of times that happens, it would be ideal.

WORKING GROUP: Please recommend here anything you can think of. I can think of a few, like:

  • updating our PR submission guidelines and even our PR template to have a "did you change a public API to add, remove, change a function? Did you remember to bump the version number in the appropriate object?"
  • Not bumping the development major or minor version more than once per release. It would seem weird not to bump a revision when changing an API if it has already been bumped since the last release, but consider the user roles above:
    • The game team is only exposed to major release versions anyway. It doesn't matter to them that next release FooGem went from 1.4.xxx to 1.5.xxx instead of 1.4.xxx to 1.51.xxxx, just that it bumped.
    • The development user is always on the bleeding edge anyway; they expect to be very close to latest on any source they compile or depend on. Ultimately, you only need to differentiate between two different versions of the same gem between releases in dev if for some reason you are targeting a gem for development use only instead of a release, which seems super unlikely, and in which case you can always pin it to a patch release number, too.

Conflicts incurred by version bumps

Breaking changes are not very common (or SHOULD not be very common; it's almost always possible to avoid them, but not always). The major version thus really rarely needs to be bumped. In fact, I would assume we'd bump it just once per release, given the above.
The minor revision is also less common but can happen. If we follow the "only once" rule, we can reduce the conflicts and problems even further.
The patch revision is hugely problematic. If we require a bump of every json file on every revision of every PR, we will almost instantaneously cause conflicts on every PR. I think we need a better solution than requiring every PR to be covered in conflicting JSON file edits. Any suggestions? We have some tools available to us; for example, the last commit in a git repo is a hash, but git can also trivially return a monotonically increasing integer for any commit on a given branch, without a human. This means the patch version could actually be tied to something like that automatically (without embedding it in the JSON files), or "stamped" in as part of the actual release. If you got the code from git, it's that number; if you got it from a zip, it's been stamped in. This would imply the 'patch number' of all gems in a repo increases whenever that repo is modified, but nobody should actually be declaring a dependency on anything but the major and minor revision anyway.
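The "stamping" idea can be sketched as follows. In a live checkout the count could come from `git rev-list --count HEAD`; here it is passed in as a parameter so the sketch stays self-contained. `stamped_version` is a hypothetical name, not an existing tool.

```python
# Sketch of the 'stamping' idea: derive the PATCH component from a
# monotonically increasing commit count instead of storing it in gem.json.
# In a real checkout the count could come from `git rev-list --count HEAD`.

def stamped_version(declared: str, commit_count: int) -> str:
    """Combine a human-maintained MAJOR.MINOR with an automatic PATCH."""
    major, minor, _ = declared.split(".")   # the declared patch is ignored
    return f"{major}.{minor}.{commit_count}"

# gem.json says "1.4.0"; the repo is at commit #5123 on this branch
print(stamped_version("1.4.0", 5123))  # -> 1.4.5123
```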

Versioning the gems off the engine version

This was a recommendation in the RFC.
That is, if a gem is part of the engine, it has the same version number as the engine, automatically. To be even broader, if an object is part of another object (a gem in a project, for example), it can opt in, I assume, by setting its version number to 0.0.0, to assume the version of the object it is inside, or opt out by using an explicit version number in its json file.
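The opt-in mechanic could be as simple as the following sketch (the `0.0.0` sentinel is this proposal's convention, and `effective_version` is a hypothetical helper, not existing o3de code):

```python
# Sketch of the opt-in inheritance described above: a child object whose
# "version" is "0.0.0" assumes its parent's version. Illustrative only.

def effective_version(child_json: dict, parent_json: dict) -> str:
    child_version = child_json.get("version", "0.0.0")
    if child_version == "0.0.0":          # opt in: inherit from the parent
        return parent_json["version"]
    return child_version                  # opt out: an explicit version wins

engine = {"version": "4.2.0"}
print(effective_version({"gem_name": "PhysX", "version": "0.0.0"}, engine))   # 4.2.0
print(effective_version({"gem_name": "Custom", "version": "1.3.0"}, engine))  # 1.3.0
```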

Pros

  • A lot less friction for engine submissions, not every PR has to be rejected with a "you forgot to do x"
  • We currently release the engine monolithically, and the gems inside it monolithically; they are part of the engine. This makes the versioning system (optionally, since gems can choose explicit versioning) reflect reality, while still allowing them, at some point in the future, to be extracted out into their own object by moving them and setting their initial version to 1 + the original engine version.
  • It reflects how you would use the engine: when you (as the implementor in a game studio) grab a new release, you generally upgrade all the gems that are in the engine, all at once. It's super unlikely that you want to use engine 25.10 with physx gem 22.04, and it's probably not going to work anyway; you'd have to have the other engine installed, and that other gem is going to assume it's part of the 22.04 engine, since it's monolithic.
  • You don't have to update other objects either, such as templates.

Cons

  • It fails the "purity" test for a modular system; that is, you are having the parent of an object describe its children. (Is this real, though? The engine and its gems are monolithic currently; you can't distribute them separately until you actually separate them, which you can do. It's a two-way door as long as it's optional.)
  • Other cons?

Upgrade experience

What's it like when you upgrade to a new version?

Engine upgrader role in a game studio

This person certainly wants to just "upgrade to latest". Not just the engine, but all the gems that the engine ships with, as well as any other 3rd party gems that are released "for that version of the engine."
The best outcome for them is that they create that new branch of their game (to use to test upgrading and such), get the new engine, point their project at the new engine and/or run conversion scripts or whatever, and then fix things and get it working if need be.
The worst outcome for them is that they end up in some weird versioning hell where individual core engine gems need individual upgrades, or they have to go over the list of 100 gems their game uses, 90 of which are from the core engine, and manually update the version of each gem to "latest", or suffer some weird situation where they're using a new engine but with gems from an old one.

Engine developer on development

Our testbed projects (automated testing, templates, etc.) should always point at the latest core gems. We'd really like to avoid a situation where not only does every PR have to update a flood of json files, but it also has to update a flood of json files in every template. This implies that there has to be a way to make them all use "the latest version" somehow, without modifying them.

Middleware developer

It's the same story. They probably have test projects or example projects they use for their middleware, so they're likely in the same seat as the game dev, just exposed to stabilization potentially before a release.

Scenarios to avoid

Nightmare scenario 1

Every change requires touching a ton of JSON files. A user fixes a small problem in FooGem, and has to update a buggy function in AzCore to fix it too. No API is changed. PR contents:

   AzCore/Code/Container.cpp
   engine.json - bumping patch version
   Gems/FooGem/gem.json - bumping patch version
   Templates/DefaultProject/Template/project.json - bumping patch version of FooGem and required engine ver
   Templates/ScriptOnlyProject/Template/project.json - bumping patch version of FooGem and required engine ver
Other repositories:
    O3DE-Extras - MultiplayerSample template - bumping patch version of FooGem and required engine ver

Meanwhile, another user fixes a small problem in BarGem, and generates PR Conflicts that must be manually resolved for almost every file above.

Nightmare scenario 2

User is the "person who upgrades their existing game project to the new version for a studio."

Person downloads the latest engine, or builds it locally; ultimately, they arrive at an install package or engine source drop.

They compile, or get, the project manager, and open it.
They select their project and "change engine version" (?)

Great so far.

Nightmare begins:
They have to now go into their project gems and individually pick and choose the right version for every gem, some of which came with the engine.


The current versioning strategy is described in detail in [RFC #44](https://github.com/o3de/sig-core/issues/44). Most of that document focuses on the implementation. A summary of the procedures can be found below.

Versioning and dependency information is currently stored inside `engine.json`, `gem.json`, and `project.json` files. We use the [Semantic Versioning](https://semver.org/) scheme, which consists of three components: _MAJOR_, _MINOR_, and _PATCH_.


This should say that all O3DE object files have a version field.

- _MINOR_ version changes indicate new features that are backward-compatible.
- _PATCH_ version changes indicate backward-compatible bug fixes.

The versioning information is updated by the developers manually. It will be automated in the future. The _stabilization_ branch takes the version number from the HEAD of the _development_ branch at the branch cutoff, and the _development_ branch should get an immediate update of the _MINOR_ version.


I'm not sure it can be automated in the future. This is a human process, and I would like to hear more about ideas to automate it. The problem is that the version is contained in a file which is itself under source control, the serial nature of which is not controllable. Developer A branches, makes a change, correctly determines the version should be bumped to 1.0.1, and their PR is ready to merge; but before they can merge, B makes a change, correctly determines that the version should be changed to 1.1.0, and merges. Then A merges, and the version is back to 1.0.1. If we automate the AR to fail when the version isn't greater than the current dev version, this still has a gaping hole in time, because the second PR got in first. We could have many PRs in the queue, and this would be all over the place. Also, the AR is a snapshot in time: A passes right now because B isn't merged yet; B also passes because A hasn't merged yet; C also passes... and so on. So you can have any number of unmerged PRs at any one time, and once they pass, they don't get un-passed when someone else merges ahead of them.
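The naive AR gate being discussed can be sketched in a few lines; the point is that it is only a snapshot against the base at check time (`ar_version_check` is a hypothetical name, not an existing AR step):

```python
# Sketch of the naive AR gate discussed above: require the PR's version to be
# greater than the target branch's. As noted, this is only a snapshot in time;
# two open PRs can both pass against the same base and still collide on merge.

def vt(s: str) -> tuple:
    return tuple(int(x) for x in s.split("."))

def ar_version_check(base_version: str, pr_version: str) -> bool:
    return vt(pr_version) > vt(base_version)

print(ar_version_check("1.0.0", "1.0.1"))  # A's PR passes against base 1.0.0
print(ar_version_check("1.0.0", "1.1.0"))  # B's PR also passes against the same base
# ...but after B merges (base becomes 1.1.0), A's 1.0.1 would regress it:
print(ar_version_check("1.1.0", "1.0.1"))  # False
```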

- Keep the versions of the core Gems (e.g. `Atom`) unchanged for the upcoming release (2510).
- Bump the Engine version to `4.3.0` on the _development_ branch.
- Apply the selected versioning strategy described in this document on the _development_ branch for the future release (in 2026).


@byrcolin byrcolin Aug 27, 2025


engine.json

{
    "engine_name": "o3de",
    "O3DEVersion": "0.1.0.0",
    "O3DEBuildNumber": 0,
    "display_version": "00.00",
    "version": "4.2.0",
    "api_versions": {
        "editor": "1.0.0",
        "framework": "1.2.1",
        "launcher": "1.0.0",
        "tools": "1.1.0"
    },
    "file_version": 2,
    "copyright_year": 2023,
    "build": 0,
...

It appears that "EngineVersion" was converted to use the engine object's version; that is fine. (o3de/o3de#14082)
"O3DEVersion" I believe is the release version? But it does not seem to be set to 25.01 or whatever... Although I'm not sure why that would be in the engine object; it should be external to the engine object.
"display_version" I don't know what this is... if it has anything to do with an o3de release, it should be external.
Just a note here: in schema 2.0.0 there will be a "releases" section for every object. To be clear, this is not an o3de release; it is a release of that object only, if it ever has one. A release of an object can be source and/or binary, which we currently do not do individually; the same goes for projects and gems, which currently cannot be built independently, but probably will be eventually.
"file_version" I don't know what that is... Perhaps this is $schemaVersion, from before $schemaVersion existed? If so, it should be removed.
"O3DEBuildNumber" and "build" I don't know what these are... Maybe they have to do with a release? In which case they should be external.
"api_versions" I think this is meant to relate the engine version (not the object version, or the release version) to editor, framework, launcher, and tools release versions? (Not object versions... right?) So something like "this engine version is compatible with editor release x"... I'm not sure why this is here. This was added by alexpete, and the commit message just says:
"Convert engine.json 'O3DEVersion' to be 'version' and 'display_version' (o3de/o3de#14082)

  • Update engine.json version field and related code"

So there is no explanation of why this was added or what it does, and it should probably be removed?

When the engine is branched for stability for a release, the same incremental approach should be observed on both the dev and stability branches, independently of each other; that is OK. When stability is merged back into dev, ALL version differences should be compared, NOT just the engine version: the highest wins, and that version should be patch-incremented for the merge. This way dev always moves forward, dev is ALWAYS ahead of stability after a merge back, and of course is ALWAYS ahead of main, even if only by a patch increment.
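The merge-back rule described above (highest version wins, then a patch increment so dev always ends up strictly ahead) can be sketched roughly like this; the function names and the tuple-based semver comparison are illustrative, not an existing o3de tool:

```python
# Hypothetical sketch of the merge-back rule: compare an object's version on
# dev and on the stabilization branch, take the highest, then bump the patch
# so dev always ends up strictly ahead after the merge.

def parse_semver(version: str) -> tuple[int, int, int]:
    major, minor, patch = (int(part) for part in version.split("."))
    return (major, minor, patch)

def merged_version(dev_version: str, stab_version: str) -> str:
    # Highest version wins (tuple comparison is lexicographic)...
    major, minor, patch = max(parse_semver(dev_version), parse_semver(stab_version))
    # ...and the merge patch-increments it so dev moves forward.
    return f"{major}.{minor}.{patch + 1}"

print(merged_version("4.2.1", "4.2.3"))  # -> 4.2.4
```

In a real merge this comparison would run over every object's manifest, not just the engine's, per the comment above.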

@nick-l-o3de nick-l-o3de Aug 29, 2025

Yea, I don't think we should have all these multiple versions for a single object. I'd be happy if we had at most 2:

Version friendly name (display_version): 25.10, or Shiny Sphere, or Rolling Cube, or whatever. For making users smile and for making it easier to connect "what version are you on so we can help you troubleshoot" to a date/time. Not used for anything but display and marketing.

Actual version number (version): 4.1.232, semver. Used for actual code and dependency management.

@byrcolin byrcolin Sep 5, 2025

I'm pretty sure all objects should have one version and one only. Anything object-related for a release should be in the new "releases" section of the json (incoming in schema 2.0.0). Any "build"/"release" info we still need should be external to all these objects. Unfortunately, the root object of the o3de/o3de repo is currently the main o3de engine object, so any "root" level file would technically be part of the engine object. That is unfortunate, but for now I would recommend a "release.json" or "build.json" in the root to contain this information rather than putting it in engine.json.

Hopefully in the future we can refactor this repo into a standard o3de repo format like o3de-extras, where the "root" is an o3de "repo" object: just a container with no code, meant to gather related objects, and a more appropriate place for such info. If we used the standard format, that would open a bunch of possibilities, like multiple engine objects. It would look maybe something like this. (I know this is a bit of a tangent; just spitballing ideas. This has been on my mind and I want to write them down, put them out there, and see what people think...)
https://github.com/o3de/o3de.git

/repo.json
/Engines/o3de/engine.json   <-maybe this one is what we now call the engine object. good for compiling/testing (maybe most if not all the dependencies in this engine are GIT/FETCH? could be interesting...) 

(just thinking out loud here... not sure of this yet... but interesting)-------------
/Engines/fps/engine.json   <-maybe this one is a good starting place for example FPS game engines?
/Engines/flight-sim/engine.json  <-maybe this one is a good starting place for an example flight sims engines?
-------------------------------------------------------------------------------------
                                 
(Some other ideas)----------------------------------------------------------------
/Engines/source/engine.json   <-maybe this ALL dependencies are SOURCE|FETCH
/Engines/binary/engine.json   <-maybe this is ALL dependencies are BINARY|FETCH
/Engines/git/engine.json <- maybe this is ALL dependencies are GIT|FETCH
...
--------------------------------------------------------------------------------------
/Projects/AutomatedTesting/project.json <-project meant for AR and uses the Engines/o3de engine object...
/Projects/Multiplayer/project.json   <-maybe this an example use of the FPS engine
/Projects/FlightSim/project.json    <-maybe this an example use of the flight-sim engine
/Projects/...
/Gems/Achievement/gem.json
/Gems/....
/Templates/DefaultGem/template.json
/Templates/...

I think if we do this reorg, we should consider moving to a master repo with a submodules implementation. Then the o3de/o3de repository would be a master repository with NO CODE AT ALL, only a list of submodules. Using the same format as above:

/Engines/o3de             ---> https://github.com/o3de/engine.git
/Engines/source           ---> https://github.com/o3de/engine-source.git
/Engines/binary            ---> https://github.com/o3de/engine-binary.git
/Engines/fps                 ---> https://github.com/o3de/engine-fps.git
/Engines/flight-sim      ---> https://github.com/o3de/engine-flight-sim.git
...
/Gems/Achievement    ---> https://github.com/o3de/gem-achievement.git
/Gems/Atom                ---> https://github.com/o3de/gem-atom.git

(OR better...)
/Repos/Atom               ---> https://github.com/o3de/repo-atom.git
...

If we did the same reorg for Atom, since Atom is a large system and not just a single gem, and turned it into a standard o3de "repo" object like the above, then the atom repository would look like:

repo.json
/Projects/atom-test/project.json
/Projects/atom-viewer/project.json
/Projects/atom-experimental-lighting/project.json
...
/Gems/atom/gem.json           <We don't have to flatten the tree either, could have nested gems, just fine>
/Gems/atom/RHI/gem.json
/Gems/atom/RHI/whatever/gem.json
/Gems/atom/RPI/gem.json
/Gems/atom/RPI/whatever/gem.json

I would prefer this because it would keep all o3de objects related to Atom, like projects and gems, in one repo, away from the engine.
This repo is itself a large system, and I would even recommend it be a master repository as well, and maybe not all submodules... maybe it really contains only the atom code and all the rest are just sub-modules...

repo.json     <--- This file is in the master repo
/Projects/atom-test              ---> https://github.com/o3de/atom-test.git
/Projects/atom-viewer             ---> https://github.com/o3de/atom-viewer.git
/Projects/atom-experimental-lighting             ---> https://github.com/o3de/atom-experimental-lighting.git
...
/Gems/atom/gem.json           <--- Maybe these files ARE in the master repo... or another sub-module... Whatever sig/graphics-audio wants...
/Gems/atom/RHI/gem.json 
/Gems/atom/whatever/gem.json 
/Gems/atom/RPI/gem.json

Nice and neat... I like that better. Perhaps most of the <...shudders... I caught myself saying "engine gems"> :) should be in o3de repos like this... even if the repo, at present, would only contain:

/repo.json
/Gems/Achievement/gem.json

It would give each system room to grow a bit more organically...

A project like atom-test would have a dependency on an engine, maybe the "source" engine object:

atom-test/project.json
{
...
"dependencies": {
    "engines": [
        "source>=1.1<2.0|SOURCE|FETCH"
    ],
    "gems": [
        ... <any gems it depends on> ...
    ]
}
}

And the "source" engine object would just have a gem dependency on atom, and let cmake resolve the best version for us.

source/engine.json
{
...
"dependencies": {
    "engines": [
    ],
    "gems": [
        "atom>=1.5<2.0|SOURCE|FETCH"
    ]
}
}
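For illustration only, the spitballed dependency-string format used in these snippets (name, version constraints, then |KIND|MODE) could be parsed along these lines; the grammar is guessed from the two examples above and is not an existing o3de format:

```python
import re

# Rough parse of the hypothetical dependency-string format from the snippets
# above, e.g. "atom>=1.5<2.0|SOURCE|FETCH". Guessed grammar:
#   <name><zero-or-more version constraints>|<KIND>|<MODE>
DEP_RE = re.compile(r"^([A-Za-z0-9_-]+)((?:(?:>=|<=|>|<|==)[0-9.]+)*)\|([A-Z]+)\|([A-Z]+)$")

def parse_dependency(spec: str) -> dict:
    match = DEP_RE.match(spec)
    if not match:
        raise ValueError(f"unrecognized dependency spec: {spec}")
    name, constraints, kind, mode = match.groups()
    # Split the constraint run into (operator, version) pairs.
    ranges = re.findall(r"(>=|<=|>|<|==)([0-9.]+)", constraints)
    return {"name": name, "constraints": ranges, "kind": kind, "mode": mode}

print(parse_dependency("atom>=1.5<2.0|SOURCE|FETCH"))
# -> {'name': 'atom', 'constraints': [('>=', '1.5'), ('<', '2.0')], 'kind': 'SOURCE', 'mode': 'FETCH'}
```

A real resolver would then feed the constraint pairs into whatever version-selection logic (cmake or otherwise) picks the highest compatible version.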

- It provides a clear path for the next release without introducing additional complexity.

#### Drawbacks
- It means a bump from `2.4.0` to `4.2.0` for the Engine version between the releases.
@byrcolin byrcolin Aug 27, 2025

I have no problem just jumping from 2.4.0 to 4.2.0 for the initial, let's-get-this-going commit... It's a new starting point; we will make a note in the release notes of why we did what we did. Should be fine.

I have thought it over and I cannot find any problems with the increment-on-release strategy I talked about above... I'm pretty convinced at this point that declaring the bump on commit (with a DCO-like AR check looking for the tag in the commit comment), combined with performing the actual version bump only at RELEASE, is the way to go, until someone can point out a flaw in the idea and articulate a better, easier strategy.

- It does not address the issues with the current versioning strategy for Gems.

### Remove the version number from the core Gems
Use the Engine version as the version number for all core Gems instead of a standalone versioning. This way much of the complexity of versioning is removed, making it easier to manage for the community. This could be automated, with a script that updates the version number of all core Gems to match the Engine version whenever the Engine version is updated.
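The synchronization script mentioned above could be as simple as the following sketch; it assumes the current engine.json/gem.json layout and "version" fields, and is an illustration of the proposal rather than an existing or endorsed tool:

```python
import json
from pathlib import Path

# Hypothetical sketch of the proposal: walk the Gems/ tree and rewrite each
# gem.json "version" field to match engine.json's "version". Paths and field
# names follow the current o3de layout; treat this as an illustration only.

def sync_gem_versions(engine_root: Path) -> None:
    engine_version = json.loads((engine_root / "engine.json").read_text())["version"]
    for gem_json in (engine_root / "Gems").rglob("gem.json"):
        data = json.loads(gem_json.read_text())
        data["version"] = engine_version
        gem_json.write_text(json.dumps(data, indent=4) + "\n")
```

This would run whenever the engine version is updated, e.g. as part of the release tooling.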
@byrcolin byrcolin Aug 27, 2025

The more I think about this... NUCLEAR NO. DON'T DO THIS... this will only make adding or removing versioned objects to or from the core near impossible. PLEASE NO.
They are NOT all the same version. We NEED to think about these objects as just coincidentally living as children inside a parent object. Parent/child relationships HAVE NO bearing on object version. NONE. It is just a convenience.


#### Benefits
- It simplifies the versioning scheme by having a single version number for all core Gems.
@byrcolin byrcolin Aug 27, 2025

I cannot stress enough that there should be no such concept as "Core Gems" or "Engine Gems" conferring ANY special significance on them. Treat them as if they were git submodules (because that is what they should probably be, in a repo object like o3de-extras); the only reason they are included in the engine object at all is convenience for the user.


#### Drawbacks
- It removes the ability to track changes in core Gems between Engine releases.
- It is not clear to all developers which Gems are core Gems and which are not (there used to be a plan to move all Gems that are not core to the `o3de-extras` repository, but it was not implemented).


Moving non-core gems to the extras repo is an ongoing effort.


### Bump the version number of the Gem only when doing a release
For all Gems that are **NOT** core Gems, I propose to bump the version number only when doing a release. This means that the version number is updated only when creating a new _stabilization_ branch from the _development_ branch; only for Gems that have changes since the last release. The version number is updated based on the changes made in the Gem, following the Semantic Versioning scheme.
@byrcolin byrcolin Aug 27, 2025

There should be no difference between so-called "Core Gems" and any other external gem.

We cannot control what external gems do for their versioning, except to suggest to them the same best practices that we ourselves follow for o3de objects.

(My recommendation)
The frequency of version bumps is theoretically independent of a release. HOWEVER, let's consider that we ONLY bump the version on RELEASE, BUT we require every commit to declare what bump its author THINKS is necessary for THEIR change. We add a check like the DCO test we currently have: it looks in the commit for one of "MAJOR VERSION INCREMENT", "MINOR VERSION INCREMENT", "PATCH VERSION INCREMENT", or "NO VERSION INCREMENT", and the AR fails if none is present, just as it does when the DCO is missing. This way everyone determines the bump for themselves, and it can be PR-reviewed by others. We DO NOT ACTUALLY increment on ANY commit, so we would NEVER have commit conflicts due to version, which means anyone can make any change they want in any order.
Then, when the stability branch is made, a script (or a person) looks at all the commit messages for those tags and chooses which one prevails, because ultimately the version ONLY means anything to the RELEASE. Say the script or person finds 5 commits for an object: 4 [PATCH VERSION INCREMENT] and one [MINOR VERSION INCREMENT]. It can easily determine that the highest of these is [MINOR VERSION INCREMENT], look at the previous RELEASE version, perform a minor version bump, and update all canonical references to that object. For example, the Achievement gem, previously at 1.4.4, becomes 1.5.0, and any reference in the engine can be updated from Achievement>=1.4.0 to Achievement>=1.5.0.
If the highest was only a patch increment, the new version would be 1.4.5, and we wouldn't really need to update anything, because we use open-ended >= constraints: Achievement>=1.4.0 still works fine, as 1.4.5 and 1.4.6 are >= 1.4.0, and the resolver picks the highest compatible version it finds.
Best of all, perhaps, we would merge cleanly to main, as versions only ever increase, and we could cleanly merge main back to dev because we aren't bumping dev either. It would require no automation except the DCO-like addition to the AR, and that's pretty easy.
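The release-time evaluation step described here (find the highest bump tag among the commits since the last release and apply it once to the previous release version) might look like this sketch; the tag strings come from the comment above, everything else is illustrative:

```python
# Sketch of the release-time step: collect bump tags from commit messages
# since the last release and apply only the highest one to the previous
# release version. The tag strings follow the DCO-like convention proposed
# above; the function and ranking are illustrative.

BUMP_RANK = {
    "NO VERSION INCREMENT": 0,
    "PATCH VERSION INCREMENT": 1,
    "MINOR VERSION INCREMENT": 2,
    "MAJOR VERSION INCREMENT": 3,
}

def next_release_version(previous: str, commit_messages: list[str]) -> str:
    # The highest declared bump among all commits prevails.
    highest = max(
        (rank for msg in commit_messages for tag, rank in BUMP_RANK.items() if tag in msg),
        default=0,
    )
    major, minor, patch = (int(p) for p in previous.split("."))
    if highest == 3:
        return f"{major + 1}.0.0"
    if highest == 2:
        return f"{major}.{minor + 1}.0"
    if highest == 1:
        return f"{major}.{minor}.{patch + 1}"
    return previous

messages = [
    "Fix crash [PATCH VERSION INCREMENT]",
    "Add API [MINOR VERSION INCREMENT]",
    "Fix typo [PATCH VERSION INCREMENT]",
]
print(next_release_version("1.4.4", messages))  # -> 1.5.0
```

This reproduces the Achievement example above: four patch tags and one minor tag take 1.4.4 to 1.5.0.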

@nick-l-o3de nick-l-o3de Sep 23, 2025

That could possibly work, although I'm not sure about all the GitHub scripting for this. I'll put it on my todo list to see if there's another GitHub project doing similar things, or a similar script...

Another option would be to have the first one to change a gem bump the revision if necessary; if the revision has already been bumped since the last release, we don't bump it again until the next release, with major taking precedence over minor.

Then, though, it becomes a tracking issue... we'd have to know for a given PR whether it needs to bump the rev and whether the rev has already been bumped since the last release, and that's also going to require a script or something (and may incur minor conflicts), unless there's an easy way to tell. I guess GitHub tools make it really easy to see the last change made to a specific file at least (i.e., the gem.json), so it would be fairly trivial to check: git blame the json file, and it will immediately show when the version was last bumped and what it was before.

Not sure of the pros and cons of either, though. It does mean the version bumps would land in development as soon as they were relevant, giving them a longer time to settle in there and reveal any other issues. Always worried about making a big bump during stabilization.

@nick-l-o3de nick-l-o3de Sep 23, 2025

TBH, once we decide that versioning only really needs to move for release purposes (once per cycle instead of on every single commit), it becomes a lot less conflict-prone and opens up more options to script it, update it, or work around it.

@nick-l-o3de

nick-l-o3de commented Aug 29, 2025

How do we reconcile and get everyone on the same page here? I know we intend to start a working group, but maybe we can at least enumerate the common ground covered so far...

As far as I can tell, that includes

  1. We should unify the two different versions asap; that is, when we make a release of objects, the version numbers in the release should be a snapshot of the versions of the objects in their repo at that moment. It shouldn't be a strange, separately hand-maintained version track where the engine object is 4.2.0 and then, when we cut a release branch, it suddenly becomes 2.1.0 for the same object.
  2. Objects should only have a version field, and maybe a display version field for something like the engine (?). Only the version field is involved in dependency computation.

I think a powerful razor here would be to decide on a few common tasks and roles from which people will be interacting with this engine, and thus its versioning scheme, and look at our decisions through the lens of what they would mean to those people.

Some of the roles would be largely unaffected by our choices and decisions, but some may be severely impacted, for better or worse...

I propose:

  1. Explorer role: engine evaluator / curious user who downloads the installer and plays with it. Probably wants to get extra gems from other repos. May make their own local gems.
  2. Integration game engineer role: someone whose team is already on a version and whose job it is to get the new version (when, or slightly before, it's released) and upgrade their game onto it in a branch, before merging that back into their studio's main line for everyone else.
  3. Middleware developer role, gem maker: wants to publish gems on their own repos that people can easily get. Wants people to be able to get new versions of their gem, as well as new gems, when the engine revs. May be canonical in the extras repo, or might just be on their own server.
  4. Middleware developer role, platform maker: consoles, etc. Probably under NDA.
  5. O3DE development contributor role: someone who makes PRs targeted at development or stabilization. Deep in the trenches. This includes docs people working on o3de.org, although I don't know if they'd be impacted much by any of these decisions.
  6. Non-integration artist role on some game team: they wait for their integration engineer to give the all clear, then just get latest.

Am I missing a role here?

Once we have roles, we can think about how our decisions would impact their workflows. I really prefer working back from roles, that is, from the suffering we may cause to humans, rather than working backwards from the purity of the technology :)

In addition to roles, I concur that once we have the following two building blocks:

  1. a strong way of versioning things and declaring dependencies
  2. a strong automated repo system that lets you automatically get gems of the correct versions from external repos, including prebuilt ones for script-only mode, while following the versioning scheme

... then it opens the door for gems, even what we might consider core gems, to be entirely disconnected from the release cycle of the engine and kept in different repos. At that point, there is no core gem: when you create a project, either manually or from a template, the gems it has selected are what you get, from wherever they live, at that moment.

However, I think we should look at this through a short-term/long-term lens: we're probably not going to get building block 2 until we solve this versioning issue, so we have to solve it without 2 being in place. For the time being, we should consider how it works for those roles in the current workflows, with an eye towards not making it worse in the future.

@byrcolin

byrcolin commented Sep 6, 2025

Strategy for long-term release branches, including LTS branches

Long Term Branch: Main will have a tag for it, and a branch of Main is created at the tagged commit.
Long Term Release: a Long Term Branch created at a release tag. This branch will be protected and ONLY updated with Security Updates / Critical Fixes.
Security Updates: when a security exploit is found in code and needs to be addressed for customer safety. NOT ALL Security Updates will be applicable to all previous releases, because some may be for code that didn't exist in that release.
Critical Fixes: a fix to a really bad bug that was not found at the time of release. NOT ALL Critical Fixes will be applicable to all previous releases, because some may be for code that didn't exist in that release.
Support Window: the period of time a Long Term Release branch will receive applicable Security Updates and Critical Fixes.
Interim Release: a Long Term Release created approximately every 6 months. These are non-LTS releases and have a Support Window of 1 year. They use the naming convention Release_<release_name>, i.e. "Release_25.06".
Long Term Service Release (LTS): every fourth release, i.e. once every 2 years, a release is declared LTS, which gives it a 2-year Support Window. They use the naming convention Release_<release_name>LTS, i.e. "Release_25.06LTS".

Customer Usage Expectation
Most customers may start out developing on the current release. If that release is not an LTS release, they will most likely upgrade to the next LTS release, as that will come fairly early in their development cycle; a product cycle is usually about 2 years. This is why we set the Support Window for Interim releases at 1 year and for LTS releases at 2 years. That should cover the vast majority of customers' needs.

How this works:
We begin the release process and we create the stability branch from Development -> Stability_24.06
We stabilize Stability_24.06. While this is going on Development -> continues uninterrupted.
When Stability_24.06 is ready we merge to -> Main and tag it 24.06
We merge Stability_24.06 back to -> Development
Delete Stability_24.06
We branch Main to Release_24.06 and protect it like Main

Rinse and repeat for 24.12
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_24.12)
Release_24.06
Release_24.12

A security commit occurs on Development
We merge the security commit from Development to Main
We merge the security commit from Main to Release_24.06
We merge the security commit from Main to Release_24.12

Rinse and repeat for 25.06LTS
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_25.06LTS)
Release_24.06
Release_24.12
Release_25.06LTS

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We merge the security commit from Main to Release_24.12
We merge the security commit from Main to Release_25.06LTS

Rinse and repeat for 25.12
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_25.12)
Release_24.06
Release_24.12
Release_25.06LTS
Release_25.12

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We SKIP Release_24.12 as 1 year support window is up
We merge the security commit from Main to Release_25.06LTS
We merge the security commit from Main to Release_25.12

Rinse and repeat for 26.06
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_26.06)
Release_24.06
Release_24.12
Release_25.06LTS
Release_25.12
Release_26.06

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We SKIP Release_24.12 as 1 year support window is up
We merge the security commit from Main to Release_25.06LTS
We merge the security commit from Main to Release_25.12
We merge the security commit from Main to Release_26.06

Rinse and repeat for 26.12
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_26.12)
Release_24.06
Release_24.12
Release_25.06LTS
Release_25.12
Release_26.06
Release_26.12

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We SKIP Release_24.12 as 1 year support window is up
We merge the security commit from Main to Release_25.06LTS
We SKIP Release_25.12 as 1 year support window is up
We merge the security commit from Main to Release_26.06
We merge the security commit from Main to Release_26.12

Rinse and repeat for 27.06LTS
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_27.06LTS)
Release_24.06
Release_24.12
Release_25.06LTS
Release_25.12
Release_26.06
Release_26.12
Release_27.06LTS

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We SKIP Release_24.12 as 1 year support window is up
We merge the security commit from Main to Release_25.06LTS
We SKIP Release_25.12 as 1 year support window is up
We SKIP Release_26.06 as 1 year support window is up
We merge the security commit from Main to Release_26.12
We merge the security commit from Main to Release_27.06LTS

Rinse and repeat for 27.12
Now we have bleeding edge Development
Most recent release Main (equivalent to Release_27.12)
Release_24.06
Release_24.12
Release_25.06LTS
Release_25.12
Release_26.06
Release_26.12
Release_27.06LTS
Release_27.12

A security commit occurs on Development
We merge the security commit from Development to Main
We SKIP Release_24.06 as 1 year support window is up
We SKIP Release_24.12 as 1 year support window is up
We SKIP Release_25.06LTS as 2 year support window is up
We SKIP Release_25.12 as 1 year support window is up
We SKIP Release_26.06 as 1 year support window is up
We SKIP Release_26.12 as 1 year support window is up
We merge the security commit from Main to Release_27.06LTS
We merge the security commit from Main to Release_27.12

and so on...
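The skip/merge decisions in the walkthrough above all follow from a single rule: interim releases get security merges for 1 year, LTS releases for 2. A rough sketch, assuming YY.MM release names and simple whole-month arithmetic (the helper names are illustrative):

```python
# Sketch of the support-window rule from the walkthrough: interim releases
# receive security merges for 12 months, LTS releases for 24 months.
# Release names are assumed to be "YY.MM" strings.

def months_since(release_name: str, now_year: int, now_month: int) -> int:
    year, month = (int(p) for p in release_name.split("."))
    return (now_year - (2000 + year)) * 12 + (now_month - month)

def in_support_window(release_name: str, is_lts: bool, now_year: int, now_month: int) -> bool:
    window_months = 24 if is_lts else 12
    return months_since(release_name, now_year, now_month) <= window_months

releases = [("24.06", False), ("24.12", False), ("25.06", True), ("25.12", False)]
# Which branches receive a security merge in June 2026?
targets = [name for name, lts in releases if in_support_window(name, lts, 2026, 6)]
print(targets)  # -> ['25.06', '25.12']
```

This matches the 26.06-era step in the walkthrough: 24.06 and 24.12 are skipped, while 25.06LTS and 25.12 still receive the merge.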

All updates to Releases flow in one direction only, from Development -> Main -> Releases

Do we ever remove Long Term Release Branches? What about long term LTS branches?
We could... but we should keep them around a good long time. We could remove Interim Releases after 2 years, and LTS after 4 years? If we do, then we must also remove any mention of them from all the object jsons. Removing them can only cause potential problems later on, and they don't take much space in the repository, so there may be a good argument for never removing them.

Extended support branches
We can allow people to make their own branch of release branches even if they are out of the support window, but we DO NOT maintain them; they do. (With the possible exception of certain-level partners of the foundation, for whom we might extend the window... maybe...)

So lets say XYZ company is using Release_25.06LTS to make their game.
Release_25.06LTS is now out of its 2 year support window.
They contact us and tell us they are using this branch for their product, it's not yet ready, and they need extended support.
We branch Release_25.06LTS -> Release_25.06LTS_extended. This is to indicate to everyone this is a special case.
"_extended" branches also ONLY get Security Updates like the Release branches.

Someone may want a version of a release branch or extended branch, in or out of the support window, that they want a new feature from Development back-ported to. Say XYZ company needs the new particle system in Development ported back to Release_25.06LTS_extended. You CANNOT apply non-Security-Update or non-Critical-Fix changes to releases or extended releases, so they can branch Release_25.06LTS_extended -> Release_25.06LTS_extended_particle and cherry-pick the particle system into that branch themselves.

Ok, that's pretty much my take on it; poke holes in it... :)
