Host SCALE docs minipage on GitHub Pages #539

Open: wants to merge 3 commits into base `gh-pages`
78 changes: 78 additions & 0 deletions .github/workflows/hugo.yaml
@@ -0,0 +1,78 @@
```yaml
# Sample workflow for building and deploying a Hugo site to GitHub Pages
name: Deploy Hugo site to Pages

on:
  # Runs on pushes targeting the gh-pages branch
  push:
    branches:
      - gh-pages

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
  contents: read
  pages: write
  id-token: write

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
  group: "pages"
  cancel-in-progress: false

# Default to bash
defaults:
  run:
    shell: bash

jobs:
  # Build job
  build:
    runs-on: ubuntu-latest
    env:
      HUGO_VERSION: 0.111.3
    steps:
      - name: Install Hugo CLI
        run: |
          wget -O ${{ runner.temp }}/hugo.deb https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-amd64.deb \
          && sudo dpkg -i ${{ runner.temp }}/hugo.deb
      - name: Install Dart Sass Embedded
        run: sudo snap install dart-sass-embedded
      - name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: recursive
          fetch-depth: 0
      - name: Setup Pages
        id: pages
        uses: actions/configure-pages@v3
      - name: Install Node.js dependencies
        run: "[[ -f package-lock.json || -f npm-shrinkwrap.json ]] && npm ci || true"
      - name: Build with Hugo
        env:
          # For maximum backward compatibility with Hugo modules
          HUGO_ENVIRONMENT: production
          HUGO_ENV: production
        run: |
          hugo \
            --gc \
            --minify \
            --baseURL "${{ steps.pages.outputs.base_url }}/"
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v1
        with:
          path: ./public

  # Deployment job
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v2
```
6 changes: 6 additions & 0 deletions archetypes/default.md
@@ -0,0 +1,6 @@
```md
---
title: "{{ replace .Name "-" " " | title }}"
date: {{ .Date }}
draft: true
---
```

Binary file added assets/.DS_Store
14 changes: 14 additions & 0 deletions config.toml
@@ -0,0 +1,14 @@
```toml
baseURL = 'https://paritytech.github.io/parity-scale-codec/'
languageCode = 'en-us'
title = 'SCALE'
theme = 'hugo-book-master'
enableGitInfo = true

[params]
  BookLogo = 'parity-logo-square.png'
  BookSection = '*'

[markup]
  [markup.tableOfContents]
    endLevel = 3
    startLevel = 1
  [markup.goldmark.renderer]
    unsafe = true
```
Binary file added content/.DS_Store
76 changes: 76 additions & 0 deletions content/_index.md
@@ -0,0 +1,76 @@
---
title: "Index"
weight: 1
# bookFlatSection: false
# bookToc: true
# bookHidden: false
# bookCollapseSection: false
# bookComments: false
# bookSearchExclude: false
---

![logo](logo.png)
{{< hint info >}}
**SCALE** (**S**imple **C**oncatenated **A**ggregate **L**ittle-**E**ndian) is the data format for types used in the Parity Substrate framework. It is a lightweight format that allows efficient encoding and decoding, making it highly suitable for resource-constrained execution environments like blockchain runtimes and low-power, low-memory devices.
{{< /hint >}}

Welcome to the technical documentation of the [Rust implementation](https://github.com/paritytech/parity-scale-codec) of Parity's SCALE codec. This page is intended to serve as an introduction to SCALE for those new to Substrate development. For more detailed, low-level information about the `parity-scale-codec` Rust crate, please visit the corresponding [docs.rs page](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/).

This page is divided into the following sections:
- **Encode**: This section provides a practical introduction to how SCALE is used to encode types in Rust, complete with examples. It is recommended to read this section before proceeding to the Decode section.
- **Decode**: This section explains how to decode SCALE-encoded data and addresses common misconceptions and challenges related to SCALE's non-descriptive nature.
- **Use in Substrate**: This section outlines how SCALE is utilized in Substrate development and showcases common patterns.
- **Specification**: This section offers a brief overview of the SCALE encoding process.
- **SCALE crates**: This section provides a high-level overview of the various available SCALE Rust crates and their uses.

SCALE is non-descriptive. This means that the encoding context, which includes knowledge of the types and data structures, must be known separately at both the encoding and decoding ends. The encoded data does not include this contextual information. Consider the following comparison between SCALE and JSON to understand what this means in practice.
{{< tabs "SCALEvsJSON" >}}
{{< tab "SCALE" >}}
```rust
use parity_scale_codec::Encode;

#[derive(Encode)]
struct Example {
    number: u8,
    is_cool: bool,
    optional: Option<u32>,
}

fn main() {
    let my_struct = Example {
        number: 42,
        is_cool: true,
        optional: Some(69),
    };
    // 7 bytes: 42 (u8) | 1 (bool true) | 1 (Some tag) + 69 as a little-endian u32
    println!("{:?}", my_struct.encode());
    println!("{:?}", my_struct.encode().len());
}
[42, 1, 1, 69, 0, 0, 0]
7
```
{{< /tab >}}
{{< tab "JSON" >}}
```rust
use serde::Serialize;

#[derive(Serialize)]
struct Example {
    number: u8,
    is_cool: bool,
    optional: Option<u32>,
}

fn main() {
    let my_struct = Example {
        number: 42,
        is_cool: true,
        optional: Some(69),
    };
    // 42 bytes: the JSON text spells out every field name alongside its value
    println!("{:?}", serde_json::to_string(&my_struct).unwrap());
    println!("{:?}", serde_json::to_string(&my_struct).unwrap().len());
}
"{\"number\":42,\"is_cool\":true,\"optional\":69}"
42
```
{{< /tab >}}
{{< /tabs >}}
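To make the contrast concrete, here is a small sketch (the tuple type is chosen purely for illustration, not taken from the crate's documentation): the same seven SCALE bytes decode cleanly into a completely different type, because the encoding carries no type information.
```rust
use parity_scale_codec::Decode;

fn main() {
    // The bytes produced by the SCALE example above.
    let bytes = vec![42u8, 1, 1, 69, 0, 0, 0];
    // Interpreted as a (u8, u16, u32) tuple instead of `Example`,
    // decoding still succeeds but yields entirely different values:
    println!("{:?}", <(u8, u16, u32)>::decode(&mut &bytes[..]));
    // Ok((42, 257, 69))
}
```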
Binary file added content/_index/logo.png
85 changes: 85 additions & 0 deletions content/decode.md
@@ -0,0 +1,85 @@
---
title: "Decode"
weight: 3
# bookFlatSection: false
# bookToc: true
# bookHidden: false
# bookCollapseSection: false
# bookComments: false
# bookSearchExclude: false
math: true
---


# 1. Decoding
Since SCALE is non-descriptive, the decoder must already know the target type: the raw bytes carry no metadata, so the same bytes can be decoded into several different types. The available `decode` entry points also differ in how strictly they treat leftover or missing input, as the following example shows.

```rust
use parity_scale_codec::{Encode, Decode, DecodeAll};

fn main() {
    let array = [0u8, 1u8, 2u8, 3u8];
    let value: u32 = 50462976; // 0x03020100: the same little-endian bytes as `array`

    println!("{:02x?}", array.encode());
    println!("{:02x?}", value.encode());
    // The two encodings are identical, so the bytes alone cannot reveal the type.
    println!("{:?}", u32::decode(&mut &array.encode()[..]));
    // `decode` tolerates trailing bytes, so decoding a u16 also "succeeds"...
    println!("{:?}", u16::decode(&mut &array.encode()[..]));
    // ...while `decode_all` insists the input is fully consumed,
    println!("{:?}", u16::decode_all(&mut &array.encode()[..]));
    // and u64 fails outright: four bytes are not enough.
    println!("{:?}", u64::decode(&mut &array.encode()[..]));
}
[00, 01, 02, 03]
[00, 01, 02, 03]
Ok(50462976)
Ok(256)
Err(Error { cause: None, desc: "Input buffer has still data left after decoding!" })
Err(Error { cause: None, desc: "Not enough data to fill buffer" })
```
## 1.1 Depth Limit
The more complex the target type, the more computational resources decoding consumes, and recursively nested types can recurse deeply on untrusted input. In general you should prefer `decode_with_depth_limit`; Substrate uses a limit of `256`.

```rust
use parity_scale_codec_derive::{Encode, Decode};
use parity_scale_codec::{Encode, Decode, DecodeLimit};

#[derive(Encode, Decode, Debug)]
enum Example {
    First,
    Second(Box<Self>),
}

fn main() {
    // Five nested `Second` variants wrapping a final `First`:
    let bytes = vec![1, 1, 1, 1, 1, 0];
    println!("{:?}", Example::decode(&mut &bytes[..]));
    println!("{:?}", Example::decode_with_depth_limit(10, &mut &bytes[..]));
    // A limit of 3 is exceeded by the five levels of nesting:
    println!("{:?}", Example::decode_with_depth_limit(3, &mut &bytes[..]));
}
Ok(Second(Second(Second(Second(Second(First))))))
Ok(Second(Second(Second(Second(Second(First))))))
Err(Error { cause: Some(Error { cause: Some(Error { cause: Some(Error { cause: Some(Error { cause: None, desc: "Maximum recursion depth reached when decoding" }), desc: "Could not decode `Example::Second.0`" }), desc: "Could not decode `Example::Second.0`" }), desc: "Could not decode `Example::Second.0`" }), desc: "Could not decode `Example::Second.0`" })
```
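To see where those six bytes come from, here is a small sketch (reusing the `Example` enum and imports from the block above; `nested` is a helper introduced here for illustration): each `Second` wrapper contributes its variant tag `1`, and the innermost `First` contributes `0`.
```rust
// Continues the example above (same imports and `Example` enum).
fn nested(depth: usize) -> Example {
    let mut value = Example::First;
    for _ in 0..depth {
        value = Example::Second(Box::new(value));
    }
    value
}

fn main() {
    // Five `Second` tags followed by one `First` tag:
    assert_eq!(nested(5).encode(), vec![1u8, 1, 1, 1, 1, 0]);
}
```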

## 1.2 When One-to-One Decoding Fails: `BTreeSet`

SCALE is intended to be a one-to-one encoding, meaning the decoding process should return the exact data that was initially encoded. However, a notable exception occurs when using a `BTreeSet`.

In Rust, a `BTreeSet` is a set data structure implemented as a B-tree, which keeps its elements sorted. This ordering is an internal detail that usually doesn't concern users directly, but it comes into play when encoding and then decoding data with SCALE. Consider the following example:
```rust
use parity_scale_codec::{Encode, Decode, alloc::collections::BTreeSet};

fn main() {
    let vector = vec![4u8, 3u8, 2u8, 1u8, 0u8];
    let vector_encoded = vector.encode();
    let btree = BTreeSet::<u8>::decode(&mut &vector_encoded[..]).unwrap();
    let btree_encoded = btree.encode();

    // 0x14 is the compact-encoded length prefix (5 << 2);
    // the set has re-sorted the five elements.
    println!("{:02x?}", vector_encoded);
    println!("{:02x?}", btree_encoded);
}
[14, 04, 03, 02, 01, 00]
[14, 00, 01, 02, 03, 04]
```
In this code, a vector of numbers is encoded and then decoded into a `BTreeSet`. When the resulting `BTreeSet` is encoded again, the output differs from the original encoded vector: the `BTreeSet` sorts its elements upon decoding, so they are serialized in a different order.

It is essential to be aware of this behavior when using `BTreeSet` and similar datatypes in your Substrate code. SCALE encoding/decoding aims to be one-to-one, but the automatic sorting performed by `BTreeSet` breaks this expectation. This is not a failure of SCALE but a property of the `BTreeSet` type itself.
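If a byte-for-byte round trip matters, one option is to normalize the data before encoding. Below is a minimal sketch, assuming duplicates may be dropped; sorting (and deduplicating) the vector makes its encoding coincide with the `BTreeSet` encoding.
```rust
use parity_scale_codec::{Encode, Decode, alloc::collections::BTreeSet};

fn main() {
    let mut vector = vec![4u8, 3u8, 2u8, 1u8, 0u8];
    // Normalize: sorted, unique elements match BTreeSet's iteration order.
    vector.sort();
    vector.dedup();

    let encoded = vector.encode();
    let btree = BTreeSet::<u8>::decode(&mut &encoded[..]).unwrap();
    assert_eq!(encoded, btree.encode()); // the round trip now holds
}
```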
