Commit 8c5d3fb

Docs: added CONTRIBUTING.md and DEVELOPING.md

These expand on the steps necessary to contribute a spec to this repo, or use/extend the CI validation scripts.

Signed-off-by: Andrew Helwer <[email protected]>

1 parent ddef39b

File tree

4 files changed: +249 −59 lines


.github/scripts/record_model_state_space.py (+11 −5)

```diff
@@ -14,6 +14,7 @@
 parser.add_argument('--tlapm_lib_path', help='Path to the TLA+ proof manager module directory; .tla files should be in this directory', required=True)
 parser.add_argument('--community_modules_jar_path', help='Path to the CommunityModules-deps.jar file', required=True)
 parser.add_argument('--manifest_path', help='Path to the tlaplus/examples manifest.json file', required=True)
+parser.add_argument('--enable_assertions', help='Enable Java assertions (pass -enableassertions to JVM)', action='store_true')
 args = parser.parse_args()
 
 logging.basicConfig(level=logging.INFO)
@@ -23,19 +24,24 @@
 community_jar_path = normpath(args.community_modules_jar_path)
 manifest_path = normpath(args.manifest_path)
 examples_root = dirname(manifest_path)
+enable_assertions = args.enable_assertions
 
-def check_model(module_path, model):
-    module_path = tla_utils.from_cwd(examples_root, module_path)
+def check_model(module, model):
+    module_path = tla_utils.from_cwd(examples_root, module['path'])
     model_path = tla_utils.from_cwd(examples_root, model['path'])
     logging.info(model_path)
     hard_timeout_in_seconds = 60
     tlc_result = tla_utils.check_model(
         tools_jar_path,
+        '.',
         module_path,
         model_path,
         tlapm_lib_path,
         community_jar_path,
         model['mode'],
+        module['features'],
+        model['features'],
+        enable_assertions,
         hard_timeout_in_seconds
     )
     match tlc_result:
@@ -66,7 +72,7 @@ def check_model(module_path, model):
 manifest = tla_utils.load_json(manifest_path)
 small_models = sorted(
     [
-        (module['path'], model, tla_utils.parse_timespan(model['runtime']))
+        (module, model, tla_utils.parse_timespan(model['runtime']))
         for spec in manifest['specifications']
         for module in spec['modules']
         for model in module['models']
@@ -84,8 +90,8 @@ def check_model(module_path, model):
     reverse=True
 )
 
-for module_path, model, _ in small_models:
-    success = check_model(module_path, model)
+for module, model, _ in small_models:
+    success = check_model(module, model)
     if not success:
         exit(1)
 tla_utils.write_json(manifest, manifest_path)
```

CONTRIBUTING.md (+85)
# Contributing a Spec

Do you have a TLA⁺ specification you'd like to share with the community?
Your contribution would be greatly appreciated!
In addition to serving as an example, your spec will function as a powerful real-world test case for TLA⁺ language tooling.
The spec will also enjoy basic maintenance over the years, ensuring it remains functional & relevant as new features are added to TLA⁺ and its tools.

To ensure a high bar of quality, all specs in this repo are subject to automated continuous integration tests.
This means the spec contribution process has some amount of configuration & overhead, which can occasionally bring frustration.
If you are willing to push through this yourself that is greatly appreciated, but if it becomes an undue obstacle you can also add your spec directory to the [`.ciignore`](.ciignore) file to exclude your spec from validation; maintainers can then integrate it into the CI system at a later date.

Licensing your contributed specs under MIT is most appreciated, for consistency; however, other compatible permissive licenses are accepted.

## Basic Contribution Steps

1. Fork this repository and create a new directory under `specifications` with the name of your spec.
1. Place all TLA⁺ files in the directory, along with their `.cfg` model files.
1. You are encouraged to include at least one model that completes execution in less than 10 seconds, and (if possible) a model that fails in an interesting way - for example illustrating a system design you previously attempted that was found unsound by TLC.
1. Ensure the name of each `.cfg` file matches the `.tla` file it models, or has its name as a prefix; for example `SpecName.tla` can have the models `SpecName.cfg` and `SpecNameLiveness.cfg`.
   This makes it clear which model files are associated with which spec.
1. Include a `README.md` in the spec directory explaining the significance of the spec, with links to any relevant websites or publications, or integrate this info as comments in the spec itself.
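Following the naming convention above, a contributed spec directory might be laid out like this (file names illustrative, reusing the `SpecName` example):

```
specifications/SpecName/
├── README.md             # significance of the spec, links to publications
├── SpecName.tla          # the spec module itself
├── SpecName.cfg          # model whose name matches the module
└── SpecNameLiveness.cfg  # additional model, using the module name as a prefix
```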

At this point, you have the option of adding your spec directory to the [`.ciignore`](.ciignore) file to simply contribute the spec files & exclude them from automated validation.
Contribution volume is low enough that maintainers will be able to onboard your spec to the validation process eventually.
However, if you are willing to put in the additional work of onboarding your spec to continuous integration yourself, follow the steps below.

## Adding Spec to Continuous Integration

All specs & models in this repository are subject to many validation checks; for a full listing, see [`DEVELOPING.md`](DEVELOPING.md).
While many of these checks concern basic properties like whether the spec parses properly and its models do not crash, they also check whether the spec correctly reflects metadata stored about it in [`manifest.json`](manifest.json) and a table in [`README.md`](README.md).

The central file containing metadata about all specs is [`manifest.json`](manifest.json).
You will need to add an entry to it for your spec & model files.
This can either be done manually (by looking at existing examples) or largely automated using the instructions below; note that if done manually, you are very likely to miss tagging your spec with some feature flags detected & enforced by the CI.
Before submitting your changes to run in the CI, you can quickly check your [`manifest.json`](manifest.json) against the schema [`manifest-schema.json`](manifest-schema.json) at https://www.jsonschemavalidator.net/.
Steps:
1. Ensure you have Python 3.11+ installed; open a terminal in the root of this repository.
1. It is considered best practice to create & initialize a Python virtual environment so the required packages are not installed globally; run `python -m venv .` then `source ./bin/activate` on Linux & macOS or `.\Scripts\activate.bat` on Windows (run `deactivate` to exit).
1. Run `pip install -r .github/scripts/requirements.txt`.
1. Run `python .github/scripts/generate_manifest.py` to auto-generate your spec entry in [`manifest.json`](manifest.json) with as many details as possible.
1. Locate your spec's entry in the [`manifest.json`](manifest.json) file and ensure the following fields are filled out:
   - Spec title: an appropriate title for your specification
   - Spec description: a short description of your specification
   - Spec sources: links relevant to the source of the spec (papers, blog posts, repositories)
   - Spec authors: a list of people who authored the spec
   - Spec tags:
     - `"beginner"` if your spec is appropriate for TLA⁺ newcomers
   - Model runtime: approximate model runtime on an ordinary workstation, in `"HH:MM:SS"` format
   - Model size:
     - `"small"` if the model can be executed in less than 10 seconds
     - `"medium"` if the model can be executed in less than five minutes
     - `"large"` if the model takes more than five minutes to execute
   - Model mode:
     - `"exhaustive search"` by default
     - `{"simulate": {"traceCount": N}}` for a simulation model
     - `"generate"` for model trace generation
     - `"symbolic"` for models checked using Apalache
   - Model result:
     - `"success"` if the model completes successfully
     - `"assumption failure"` if the model violates an assumption
     - `"safety failure"` if the model violates an invariant
     - `"liveness failure"` if the model fails to satisfy a liveness property
     - `"deadlock failure"` if the model encounters deadlock
   - (Optional) Model state count info: distinct states, total states, and state depth
     - These are all individually optional and only valid if your model uses exhaustive search and results in success
     - Recording these turns your model into a powerful regression test for TLC
   - Other fields are auto-generated by the script; if you are adding an entry manually, ensure their values are present and correct (see other entries or the [`manifest-schema.json`](manifest-schema.json) file)
1. Add your spec somewhere in the **topmost** table (not the bottom one; don't get them mixed up!) in [`README.md`](README.md); this must have:
   - The spec name as a link to the spec's directory, matching the name in [`manifest.json`](manifest.json)
   - A comma-separated list of authors, which must also match the list of authors in [`manifest.json`](manifest.json)
   - A checkmark indicating whether the spec is appropriate for beginners
     - Checked IFF (if and only if) `beginner` is present in the `tags` field of your spec in [`manifest.json`](manifest.json)
   - A checkmark indicating whether the spec contains a formal proof
     - Checked IFF a `proof` tag is present in the `features` field of at least one module under your spec in [`manifest.json`](manifest.json)
   - A checkmark indicating whether the spec contains PlusCal
     - Checked IFF a `pluscal` tag is present in the `features` field of at least one module under your spec in [`manifest.json`](manifest.json)
   - A checkmark indicating whether the spec contains a TLC-checkable model
     - Checked IFF `exhaustive search`, `generate`, or `simulate` tags are present in the `mode` field of at least one model under your spec in [`manifest.json`](manifest.json)
   - A checkmark indicating whether the spec contains an Apalache-checkable model
     - Checked IFF a `symbolic` tag is present in the `mode` field of at least one model under your spec in [`manifest.json`](manifest.json)
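Putting the manifest fields above together, an entry might look something like the sketch below. The key names and nesting here are illustrative assumptions only (though keys such as `path`, `modules`, `models`, `features`, `mode`, and `runtime` appear in the validation scripts); consult [`manifest-schema.json`](manifest-schema.json) and existing entries for the authoritative structure.

```json
{
  "path": "specifications/SpecName",
  "title": "Example Spec",
  "description": "A short description of the specification",
  "sources": ["https://example.com/paper"],
  "authors": ["Jane Doe"],
  "tags": ["beginner"],
  "modules": [
    {
      "path": "specifications/SpecName/SpecName.tla",
      "features": ["pluscal"],
      "models": [
        {
          "path": "specifications/SpecName/SpecName.cfg",
          "runtime": "00:00:05",
          "size": "small",
          "mode": "exhaustive search",
          "result": "success"
        }
      ]
    }
  ]
}
```

Again, this fragment is a hypothetical sketch, not a copy-paste template; the generated entry from `generate_manifest.py` is the real starting point.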

At this point you can open a pull request, and the CI will run against your spec, alerting you to any details that you missed.
These scripts can be run locally for a faster development loop; see [`DEVELOPING.md`](DEVELOPING.md) for details.
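As part of that local loop, the schema check mentioned above can also be done in a few lines of Python rather than pasting into https://www.jsonschemavalidator.net/. This is a sketch assuming the third-party `jsonschema` package is installed (`pip install jsonschema`); the inline schema and entries are toy stand-ins for the real `manifest-schema.json` and your spec's entry.

```python
# Sketch: validate a manifest-style entry locally. Assumes the third-party
# `jsonschema` package; the schema below is a toy stand-in, NOT the real
# manifest-schema.json. In real use, load both JSON files with json.load.
from jsonschema import Draft7Validator

# Illustrative stand-in schema: an entry must have a title and a modules list.
toy_schema = {
    "type": "object",
    "required": ["title", "modules"],
    "properties": {
        "title": {"type": "string"},
        "modules": {"type": "array"},
    },
}

def validation_errors(entry, schema=toy_schema):
    """Return a human-readable message for every schema violation in entry."""
    return [error.message for error in Draft7Validator(schema).iter_errors(entry)]

print(validation_errors({"title": "ExampleSpec", "modules": []}))  # no errors: []
print(validation_errors({"title": 42}))  # wrong type for title, modules missing
```

Running the same idea against the real files (schema plus your edited `manifest.json`) gives you the CI's schema feedback before you push.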
