Substrate Runtime Developer Academy Course

Scope

Understand the Polkadot ecosystem and the Substrate codebase. Learn blockchain design principles. Learn the Polkadot.js SDK. Launch your first blockchain and build a Web3 front-end DApp.

ICP = Industry Connect Points; you gain points by asking or answering questions on Q-Hub (Question Hub).

Useful Links:

For the course:

Introduction

Some Foundations

Substrate is a framework for building blockchain platforms.

There are many programs to help new projects:

  • Parity Substrate Builder Program

  • Web3 Foundation Grant Program

  • Polkadot/Kusama on chain Treasury

  • Multiple Accelerators and Incubators

A Substrate node consists of 2 parts:

  • Runtime:

    • on-chain state transition logic

  • Client:

    • All off-chain components that are able to execute the runtime

    • Storage

      • RocksDB -> the default DB used by many blockchain projects

      • ParityDB -> still at an early stage but more optimised for Substrate

    • Consensus (PoW and PoS are provided)

      • BABE & GRANDPA

    • P2P network (communication between Substrate nodes)

    • RPC (enables clients/wallets to interact with the blockchain)

    • Runtime Executor (used to validate and produce blocks; called by block authoring or block import)

Ed25519/Sr25519 keys are used to sign transactions

Substrate is designed by the Web3 Foundation and developed by Parity Technologies.

Polkadot Ecosystem:

  • Relay chain: Polkadot, Kusama. Responsible for security, consensus and cross-chain interoperability

  • Parachain/Parathread: sovereign chains. A parachain needs to lease a slot from the relay chain; a parathread pays relay-chain tokens on a per-block basis to produce blocks

  • Bridge: allows communication with external blockchains like Ethereum and Bitcoin

3 aspects:

  • Performance

  • Innovation:

    • Module oriented

    • WASM Sandbox

    • FRAME is the runtime-building framework provided by Substrate, used to build modules (pallets)

    • ink! is a WASM smart-contract framework: contracts are written in Rust and deployed on a Substrate blockchain

    • It is possible to deploy Solidity smart contracts on Substrate

    • Reuse of existing third-party libraries (like the Zcash libraries)

    • Off-chain workers (e.g. for oracles)

Polkadot Ecosystem:

  • a network of purpose-built blockchains

  • an internet of blockchains

  • a public, permissionless blockchain network, developed by the Web3 Foundation and Parity

  • PoC 2 is the predecessor of Substrate

  • Relay Chain: Kusama, Polkadot

    • Responsible for security, consensus and parachain coordination

  • Parachain: an independent blockchain connected to the relay chain

  • Parathread -> pay per usage: pay tokens on the relay chain to have blocks validated

  • Bridges: Ethereum, Bitcoin

  • Roles: validators are part of the relay chain; collator nodes are part of the parachain only.

Files in a Substrate Project:

Some Concepts

An extrinsic is a piece of information that comes from outside the chain and is included in a block. Extrinsics fall into three categories: inherents, signed transactions, and unsigned transactions. "Extrinsic" is Substrate jargon for a call from outside of the chain. Most of the time they are transactions, and for now it is fine to think of them as transactions. Dispatchable calls are defined in the decl_module! macro.

A dispatchable call is a function that a blockchain user can call as part of an extrinsic.

Inherents are pieces of information that are not signed and are only inserted into a block by the block author. They are not gossiped on the network or stored in the transaction queue. An example is the timestamp added to the block by its author.

Signed transactions contain the signature of the account that issued the transaction; that account stands to pay a fee to have the transaction included on chain.

Use unsigned transactions with care, as their validation logic can be difficult.

SignedExtension is a trait by which a transaction can be extended with additional data or logic. Signed extensions are used anywhere you want some information about a transaction prior to execution. This is heavily used in the transaction queue.

The transaction pool contains all transactions (signed and unsigned) broadcasted to the network that have been received and validated by the local node.

Weights represent the limited time that your blockchain has to validate a block. This includes computational cycles and storage I/O.

The System pallet is responsible for accumulating the weight of each block as it gets executed and making sure that it does not exceed the limit. The Transaction Payment pallet is responsible for interpreting these weights and deducting fees based upon them. The weighing function is part of the runtime, so it can be upgraded if needed.

The runtime of a blockchain is the business logic that defines its behavior. In Substrate-based chains, the runtime is referred to as the "state transition function"; it is where Substrate developers define the storage items that are used to represent the blockchain's state as well as the functions that allow blockchain users to make changes to this state.

Substrate uses runtimes that are built as WebAssembly (Wasm) bytecode. Substrate also defines the core primitives that the runtime must implement. The core Substrate codebase ships with FRAME, Parity's system for Substrate runtime development. FRAME defines additional runtime primitives and provides a framework that makes it easy to construct a runtime by composing modules, called "pallets". Each pallet encapsulates domain-specific logic that is expressed as a set of storage items, events, errors and dispatchable functions.

!!! Keep in mind that FRAME is not the only system for Substrate runtime development. !!!

The Substrate runtime is composed with a set of primitive types that are expected by the rest of the Substrate framework.

Core Primitives:

  • Hash

  • DigestItem

  • Digest

  • Extrinsic

  • Header

  • Block

  • BlockNumber

FRAME Primitives:

  • Call

  • Origin

  • Hashing (like BLAKE2)

  • AccountId

  • Event

  • Version

All other pallets depend on the System library as the basis of your Substrate runtime.

Pallet Structure

FRAME pallets exist in 2 versions today (v1 and v2).

Skeleton of a v1 pallet (composed of 5 sections):

  • Imports and dependencies

  • runtime config traits

  • runtime events

  • runtime storage

  • pallet declaration (the module struct exported by that pallet)

Skeleton of a v2 pallet (composed of 7 sections; a minimal sketch follows this list):

  • Imports and dependencies

  • declaration of pallet types

  • runtime config traits

  • runtime storage

  • runtime event

  • hooks (runtime lifecycle callbacks, e.g. on block initialization/finalization and off-chain workers)

  • extrinsics (dispatchable functions callable from outside the chain)
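To make the 7 sections concrete, here is a minimal, hedged sketch of a FRAME v2 pallet (attribute-macro style, roughly as in the Substrate 3.0 node template); the pallet name, storage item and call are illustrative, not part of the course material:

#[frame_support::pallet]
pub mod pallet {
    // 1. Imports and dependencies
    use frame_support::pallet_prelude::*;
    use frame_system::pallet_prelude::*;

    // 2. Declaration of the pallet type
    #[pallet::pallet]
    #[pallet::generate_store(pub(super) trait Store)]
    pub struct Pallet<T>(_);

    // 3. Runtime config trait
    #[pallet::config]
    pub trait Config: frame_system::Config {
        type Event: From<Event<Self>> + IsType<<Self as frame_system::Config>::Event>;
    }

    // 4. Runtime storage
    #[pallet::storage]
    #[pallet::getter(fn something)]
    pub type Something<T> = StorageValue<_, u32>;

    // 5. Runtime events
    #[pallet::event]
    #[pallet::generate_deposit(pub(super) fn deposit_event)]
    pub enum Event<T: Config> {
        SomethingStored(u32, T::AccountId),
    }

    // 6. Hooks (block lifecycle callbacks, off-chain workers, ...)
    #[pallet::hooks]
    impl<T: Config> Hooks<BlockNumberFor<T>> for Pallet<T> {}

    // 7. Extrinsics (dispatchable calls)
    #[pallet::call]
    impl<T: Config> Pallet<T> {
        #[pallet::weight(10_000)]
        pub fn do_something(origin: OriginFor<T>, value: u32) -> DispatchResultWithPostInfo {
            let who = ensure_signed(origin)?;
            <Something<T>>::put(value);
            Self::deposit_event(Event::SomethingStored(value, who));
            Ok(().into())
        }
    }
}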

Substrate Macros

Find more info on: https://substrate.dev/docs/en/knowledgebase/runtime/macros

Substrate uses Rust macros to aggregate the logic derived from pallets that are implemented for a runtime. These runtime macros allow developers to focus on runtime logic rather than encoding and decoding on-chain variables or writing extensive blocks of code to achieve basic blockchain fundamentals.

Macros are lines of code that write code.

Overview of Substrate macros:

  • macros in the FRAME support library

  • macros in the Substrate system library

FRAME v1 Declarative Macros (a compact example follows this list):

  • decl_storage!

  • decl_event!

  • decl_error!

  • decl_module!
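A compact, illustrative sketch (not from the course) of how these four FRAME v1 macros fit together in one pallet; the names ValueStore, StoredValue and store_value are made up:

#![cfg_attr(not(feature = "std"), no_std)]
use frame_support::{decl_error, decl_event, decl_module, decl_storage, dispatch, ensure};
use frame_system::ensure_signed;

pub trait Config: frame_system::Config {
    type Event: From<Event<Self>> + Into<<Self as frame_system::Config>::Event>;
}

decl_storage! {
    trait Store for Module<T: Config> as ValueStore {
        // A simple storage value, readable via the generated getter
        StoredValue get(fn stored_value): Option<u32>;
    }
}

decl_event! {
    pub enum Event<T> where AccountId = <T as frame_system::Config>::AccountId {
        ValueStored(u32, AccountId),
    }
}

decl_error! {
    pub enum Error for Module<T: Config> {
        ValueAlreadySet,
    }
}

decl_module! {
    pub struct Module<T: Config> for enum Call where origin: T::Origin {
        type Error = Error<T>;
        fn deposit_event() = default;

        #[weight = 10_000]
        pub fn store_value(origin, value: u32) -> dispatch::DispatchResult {
            let who = ensure_signed(origin)?;
            ensure!(StoredValue::get().is_none(), Error::<T>::ValueAlreadySet);
            StoredValue::put(value);
            Self::deposit_event(RawEvent::ValueStored(value, who));
            Ok(())
        }
    }
}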

FRAME v2 Macros: attribute macros such as #[frame_support::pallet], #[pallet::config], #[pallet::storage], #[pallet::event], #[pallet::hooks] and #[pallet::call] (see the v2 skeleton above).

Additional FRAME Macros:

  • construct_runtime!: constructs the Substrate runtime and integrates the various pallets into it.

  • parameter_types!: To declare parameter types to be assigned to pallet configurable trait associated types during runtime construction.

  • ...

Substrate system library macros:

  • impl_runtime_apis!: This macro generates the API implementations for the client side through the RuntimeApi and RuntimeApiImpl struct types.

  • app_crypto!: To specify the cryptographic key pairs and their signature algorithms that are to be managed by a pallet.

To understand how macros expand -> https://github.com/dtolnay/cargo-expand

// Install it
$ sudo cargo install cargo-expand
// If not present, install rustfmt
$ rustup component add rustfmt

// Then run the following
// SKIP_WASM_BUILD=1 avoids compiling the WASM runtime
$ sudo SKIP_WASM_BUILD=1 cargo expand -p pallet-template > template.rs
// The dummy-WASM commands here under were not working
$ BUILD_DUMMY_WASM_BINARY=1 cargo expand -p node-template-runtime > runtime.rs
$ BUILD_DUMMY_WASM_BINARY=1 cargo expand -p pallet-template > template.rs

Runtime Metadata:

Blockchains that are built on Substrate expose metadata in order to make it easier to interact with them. This metadata is organised by the pallets that compose your blockchain. For each pallet, the metadata provides information about the storage items, extrinsic calls, events, constants, and errors that it exposes. Substrate automatically generates this metadata for you and makes it available through RPC calls.

Runtime Storage:

Runtime storage allows you to store data in your blockchain that is persisted between blocks and can be accessed from within your runtime logic.

Storage info: https://substrate.dev/docs/en/knowledgebase/advanced/storage

Substrate uses a simple key-value data store implemented as a database-backed, modified Merkle tree. All of Substrate's higher-level storage abstractions are built on top of this simple key-value store. Substrate implements its storage database with RocksDB, a persistent key-value store for fast storage environments. It also supports an experimental ParityDB.

(see https://rocksdb.org)

Blockchains that are built with Substrate expose a remote procedure call (RPC) server that can be used to query runtime storage.

Visual Studio Code Setup for Rust

You can install the following extensions to start using Rust in VS Code:

  • CodeLLDB

  • rust-analyzer (deactivate the official Rust extension, it conflicts)

  • Even Better TOML

  • Rust Test Explorer (View and run your Rust tests in the Sidebar of Visual Studio Code)

Install Substrate on Mac

Follow the procedure below if you have a new MacBook Pro with an M1 CPU: https://vikiival.medium.com/run-substrate-on-apple-m1-a2699743fae8

https://substrate.dev/docs/en/tutorials/create-your-first-substrate-chain/setup

// Process to install Substrate Node Template on Mac
// Step 1: install those tools/lib
$ brew install python@3.9 protobuf llvm cmake openssl

// Step 2: Use the Rust nightly version
// To verify it:
$ rustup show // shows the versions present and the default one

// Step 3: Download the Substrate Node Template from Git
$ git clone -b v3.0.0 --depth 1 https://github.com/substrate-developer-hub/substrate-node-template

// Step 4: Make rocksdb compliant with M1
// rocksdb is not supported by default on Mac M1, so we will use a version specially compiled for M1
# Edit Cargo.toml
[patch.crates-io]
librocksdb-sys = { git = "https://github.com/hdevalence/rust-rocksdb", branch = "master" }

# Edit Cargo.lock and replace libc with this (i.e. use version 0.2.81 of libc)
[[package]]
name = "libc"
version = "0.2.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1482821306169ec4d07f6aca392a4681f66c75c9918aa49641a2595db64053cb"

// It also works with libc 0.2.94 (no issue), so updating Cargo.lock is not mandatory
// Step 5: Compile the code
cargo build --release

// Step 6: Launch the node
# Run a temporary node in development mode
$ ./target/release/node-template --dev --tmp
// Recently another issue appeared on the latest version as well:
// In the WASM build, there is a rip register issue in place of the pc (pointer) on the ARM architecture.
// Solution: https://github.com/substrate-developer-hub/substrate-node-template/issues/179#issuecomment-844489133
// Process to install the Front-End Template
# Clone the frontend template from github
$ git clone -b v3.0.0 --depth 1 https://github.com/substrate-developer-hub/substrate-front-end-template

# Install the dependencies
cd substrate-front-end-template
$ yarn install

# Launch it:
$ yarn start

Install tools:

Install the node-template:

git clone https://github.com/substrate-developer-hub/substrate-node-template

To run a local node-template and the front-end:

// Launch the Substrate network (from the substrate-node-template directory):
$ ./target/release/node-template --dev --tmp

// Then launch the Front-End:
$ yarn start

// When transferring funds:
// 1 unit = 1000000000000 (this is the minimum to use)

// To see all the options of the node-template:
$ ./target/release/node-template --help

Run a multi-node Substrate network:

Follow the steps here: https://substrate.dev/docs/en/tutorials/start-a-private-network/alicebob

# Start Alice's node
./target/release/node-template \
  --base-path /tmp/alice \
  --chain local \
  --alice \
  --port 30333 \
  --ws-port 9945 \
  --rpc-port 9933 \
  --node-key 0000000000000000000000000000000000000000000000000000000000000001 \
  --telemetry-url 'wss://telemetry.polkadot.io/submit/ 0' \
  --validator
  
 // Use the Polkadot.js Apps UI
 https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9945#/explorer

 # Start Bob's node
  ./target/release/node-template \
  --base-path /tmp/bob \
  --chain local \
  --bob \
  --port 30334 \
  --ws-port 9946 \
  --rpc-port 9934 \
  --telemetry-url 'wss://telemetry.polkadot.io/submit/ 0' \
  --validator \
  --bootnodes /ip4/127.0.0.1/tcp/30333/p2p/12D3KooWEyoppNCUx8Yx66oV9fJnriXwCcXwDDUA2kj6vnc6iDEp 

Then you can transfer 10 units from Alice to Ferdie.

Some Tips & Tricks on Substrate Node

// How to change the blocktime
// Go on ..runtime/src/lib.rs (edit it)
/// Change this to adjust the block time.
pub const MILLISECS_PER_BLOCK: u64 = 6000;
// update this 6000 ms to 2000 ms -> 2 sec per block

// Inside a module, if you want to use all the pub struct of your project
use super::*;

// If you want to run specific tests on a package composed of modules
$ sudo cargo test --package pallet-kitties
// You can even run a specific test
// --exact means run just that test
// --nocapture to be sure to display any println! present in the test, otherwise you would not see it
$ sudo cargo test --package pallet-kitties --lib -- tests::can_create --exact --nocapture

// To Debug on Visual Studio, you can add the extension CodeLLDB
// Then on the left side you have to configure the run and debug

If you want to display variable content during a test, you can do the following: in the Rust code, use println!("My value is: {:#?}", my_val);. Then launch the test with the command below. It is important to add --exact (to target the test) and --nocapture (so the println! output is displayed).

sudo cargo test --package pallet-kitties -- --nocapture --exact

Dependency Injection Design Pattern

Why?

  • Ensure code is modular and abstracted

  • Reduce coupling between modules so that they are independent

  • Maximise reusability

  • The core functions depend on interfaces/traits instead of specific implementations/modules (see the sketch after this list)
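A small, self-contained Rust sketch of the idea (plain Rust, not FRAME; all names are illustrative): the core logic depends only on a trait, and a concrete implementation is injected from outside, just as a runtime injects concrete types into a pallet's Config trait:

// The interface the core logic depends on.
trait Currency {
    fn transfer(&self, from: &str, to: &str, amount: u128) -> Result<(), String>;
}

// Core logic: generic over any Currency implementation, never a concrete one.
struct Auction<C: Currency> {
    currency: C,
}

impl<C: Currency> Auction<C> {
    fn settle(&self, winner: &str, seller: &str, price: u128) -> Result<(), String> {
        self.currency.transfer(winner, seller, price)
    }
}

// A concrete implementation, injected at construction time
// (in FRAME this happens in the runtime's `impl pallet::Config for Runtime`).
struct InMemoryCurrency;
impl Currency for InMemoryCurrency {
    fn transfer(&self, from: &str, to: &str, amount: u128) -> Result<(), String> {
        println!("{} -> {}: {}", from, to, amount);
        Ok(())
    }
}

fn main() {
    let auction = Auction { currency: InMemoryCurrency };
    auction.settle("alice", "bob", 100).unwrap();
}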

Substrate Metadata

// You can get substrate metadata by calling RPC method
// Take the Polkadot.js interface: https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/rpc
// Connected to your local substrate running node
// Go to developer/rpc, select state -> take getMetadata(at)
// You get the runtime metadata of the blockchain

The metadata is encoded using the SCALE codec (SCALE = Simple Concatenated Aggregate Little-Endian). Polkadot.js automatically decodes it from SCALE into JSON. The metadata includes all the details of the runtime and the pallets, including:

  • storage

  • events

  • calls

  • constants

  • errors

After errors comes the index, which describes the pallets: for each one you can see the storage items, events, calls, constants and errors it has. This corresponds to what has been defined in the construct_runtime! section of lib.rs in the runtime package.

In your Google Chrome browser, you can see the JSON-RPC calls that Polkadot.js uses under the hood to call the local Substrate node.

-> In the Google Chrome menu: View -> Developer -> Inspect Elements. Then go to the Network tab, select the 127.0.0.1 connection, open the Messages tab and you will see all the requests and their answers from the Substrate node. If you don't see traffic, just press "Cmd+R" on your Mac.

If you get errors from the Polkadot.js interface, you need to go to Settings -> Developer and enter the custom types you have created, because Polkadot.js is not aware of them.

Onchain storage format

On-chain storage is a key-value DB. How the keys and values are encoded is handled by Substrate.

The sp_io (Substrate primitives I/O) Rust crate; check: https://crates.parity.io/sp_io/index.html and its storage module: https://crates.parity.io/sp_io/storage/index.html

You can see the code by going to substrate -> primitives -> io; check the storage module (sp_io::storage), where you find the following primitive methods:

  • get/set/clear/exists

  • start_transaction/rollback_transaction/commit_transaction

  • clear_prefix (delete all the keys with a given prefix)

  • next_key

  • append (append to a storage item without decoding and re-encoding everything)

These are low-level primitives; normally we don't interact with them directly, we use https://crates.parity.io/frame_support/macro.decl_storage.html
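For illustration, a hedged FRAME v1 sketch (all names are made up) of how such storage items are declared with decl_storage!; the module prefix is the name after "as", the storage prefix is the item name, and the hashers chosen here are the ones discussed below:

decl_storage! {
    trait Store for Module<T: Config> as KittiesStore {
        // Storage value: raw key = twox128("KittiesStore") ++ twox128("NextKittyId")
        NextKittyId get(fn next_kitty_id): u32;
        // Storage double map: owner account and kitty index, each with its own hasher
        Kitties get(fn kitties):
            double_map hasher(blake2_128_concat) T::AccountId, hasher(twox_64_concat) u32
            => Option<[u8; 16]>;
    }
}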

The on-chain storage format follows these rules/principles (a key-derivation sketch follows the list):

  • Storage item

    • Module Prefix

    • Storage Prefix

  • Storage type

    • Storage value

      • Key = twox128(module-prefix) ++ twox128(storage-prefix)

    • Storage map

      • Key = twox128(module-prefix) ++ twox128(storage-prefix) ++ hasher(encode(key))

    • Storage double map

      • Key = twox128(module-prefix) ++ twox128(storage-prefix) ++ hasher1(encode(key1)) ++ hasher2(encode(key2))

  • What are the storage hashers:

    • identity (no hashing done; use this only if the key has already been hashed or does not need hashing)

    • twox_64_concat (fast, but only to be used if the input is trusted; not secure enough otherwise)

    • blake2_128_concat (still fast but slower than twox, and more secure)

    • Deprecated storage hashers (old Kusama): twox_128, twox_256, blake2_128, blake2_256
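A minimal standalone sketch of the first rule (it assumes the sp-core and hex crates; Timestamp/Now is the well-known storage value used in the deep-dive article linked below):

use sp_core::hashing::twox_128;

fn main() {
    // Storage value key = twox128(module-prefix) ++ twox128(storage-prefix)
    let mut key = Vec::new();
    key.extend_from_slice(&twox_128(b"Timestamp")); // module prefix
    key.extend_from_slice(&twox_128(b"Now"));       // storage prefix
    // Paste this hex key into the Polkadot.js "Raw storage" query to read Timestamp.Now directly.
    println!("0x{}", hex::encode(key));
}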

You can play with the following Polkadot.js interface to query the storage of a Substrate network:

  • If you want to find the timestamp of the last block, go to https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/chainstate, then select the now() method at the current block and execute it. You can use Google Chrome in dev mode (Inspector, Network, WSS socket, select 127.0.0.1) to see the query sent to Substrate and find the raw storage key used to get the value. You can copy that raw key, then query it in the Polkadot.js "Raw storage" menu of the chain state page, and you should find the little-endian (SCALE) encoded version of the value.

This link is a very good deep dive on storage: https://www.shawntabrizi.com/substrate/substrate-storage-deep-dive/

SCALE CODEC

The codec used to serialize keys and values into bytes. It is also the codec used for the transaction and block formats. You can find more info at: https://substrate.dev/docs/en/knowledgebase/advanced/codec + https://substrate.dev/rustdocs/v3.0.0/parity_scale_codec/index.html

The SCALE (Simple Concatenated Aggregate Little-Endian) codec is a lightweight, efficient, binary serialization and deserialization codec. It is inspired by the Serde framework (https://serde.rs), a Rust library that can serialize and deserialize many data formats/structures, but SCALE is a more lightweight design.

SCALE is the opposite of JSON: it is a lightweight, binary format that is not human readable. It is not self-describing, so you need to know the schema to decode it; the encoded value does not embed the information needed to decode it. As a developer you therefore have to be careful that the schema is communicated correctly between the Rust code and the SDK.
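A small sketch of encoding and decoding with SCALE (it assumes the parity-scale-codec crate with its derive feature and the hex crate; the Kitty struct here is illustrative). Note that the decoder must already know the schema:

use parity_scale_codec::{Decode, Encode};

#[derive(Encode, Decode, Debug, PartialEq)]
struct Kitty {
    dna: [u8; 16],
    price: u128,
}

fn main() {
    let kitty = Kitty { dna: [7u8; 16], price: 1_000_000_000_000 };

    // Encode to compact little-endian bytes: no field names, no type info embedded.
    let bytes = kitty.encode();
    println!("encoded: 0x{}", hex::encode(&bytes));

    // Decode: this only works because we tell the codec it is reading a `Kitty`.
    let decoded = Kitty::decode(&mut &bytes[..]).expect("valid SCALE bytes");
    assert_eq!(decoded, kitty);
}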

ORML (Open Runtime Module Library)

There is an open-source community of runtime modules: https://github.com/open-web3-stack/open-runtime-module-library

There is also an NFT Module used by Acala: https://github.com/AcalaNetwork/Acala/issues/416

Rust Code Styling

You can check at https://github.com/rust-dev-tools/fmt-rfcs/blob/master/guide/guide.md

Origin::signed()

In Substrate you have the notion of origin. The runtime origin is used by dispatchable functions to check where a call has come from. More info on https://substrate.dev/docs/en/knowledgebase/runtime/origin
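A tiny illustrative sketch (it assumes the frame-system crate): the runtime origin wraps frame_system::RawOrigin, whose three basic variants are what ensure_signed / ensure_root / ensure_none check inside a dispatchable:

use frame_system::RawOrigin;

fn describe(origin: RawOrigin<u64>) -> &'static str {
    match origin {
        RawOrigin::Root => "call dispatched from Root (e.g. sudo or governance)",
        RawOrigin::Signed(_) => "call signed by an account (Origin::signed(account))",
        RawOrigin::None => "unsigned call (e.g. an inherent)",
    }
}

fn main() {
    println!("{}", describe(RawOrigin::Signed(1)));
    println!("{}", describe(RawOrigin::Root));
    println!("{}", describe(RawOrigin::None));
}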

Substrate Architecture Overview

  • Runtime

  • FRAME: Framework for Runtime Aggregation of Modularized Entities. This is a set of tools and modules to develop a Substrate runtime. The runtime modules are the pallets.

  • Pallet = a standardised shipping container; the general name for the runtime crates/modules in Rust. A pallet contains: a config trait, storage, calls, errors, events, the module, origin, benchmarks.

  • The FRAME Executive module acts as the orchestration layer for the runtime. It dispatches incoming extrinsic calls to the respective pallets in the runtime.

  • The FRAME Support library is a collection of Rust macros, types, traits, and functions that simplify the development of Substrate pallets. The support macros expand at compile time to generate code that is used by the runtime and reduce boilerplate code for the most common components of a pallet.

Substrate Macros

A pallet is implemented with several macros:

  • decl_module

  • decl_storage

  • decl_event

  • decl_error

The runtime is assembled with another macro:

  • construct_runtime!

Deploying a Pallet on a Substrate Node - Nicks Pallet Sample

Import the Nicks Pallet

// Deploy a new pallet on a Substrate node FRAME runtime
// You need to edit two files: runtime/src/lib.rs and runtime/Cargo.toml.

// Step 1: Import the pallet-nicks crate in your runtime's Cargo.toml file
// There you find the list of all the dependencies your runtime has
// Step 1.1: Add this line to Cargo.toml
[dependencies]
#--snip--
pallet-nicks = { default-features = false, version = '3.0.0' }

// Step 1.2: Add the standard feature of pallet-nicks
[features]
default = ["std"]
std = [
    #--snip--
    'pallet-nicks/std',
    #--snip--
]

// Step 2: Verify that the new dependencies resolve correctly
$ sudo cargo check -p node-template-runtime

Configure the Nicks Pallet

Every pallet has a component called Config that is used for configuration. This component is a Rust "trait"; traits in Rust are similar to interfaces in languages such as C++, Java and Go. FRAME developers must implement this trait for each pallet they would like to include in a runtime in order to configure that pallet with the parameters and types that it needs from the outer runtime.

The Nicks pallet source code: https://github.com/paritytech/substrate/tree/v3.0.0/frame/nicks

If we look at the Nicks pallet in detail, we know it has:

  • Module Storage: Because it uses the decl_storage! macro.

  • Module Events: Because it uses the decl_event! macro. You will notice that in the case of the Nicks pallet, the Event keyword is parameterized with respect to a type, T; this is because at least one of the events defined by the Nicks pallet depends on a type that is configured with the Config configuration trait.

  • Callable Functions: Because it has dispatchable functions in the decl_module! macro.

  • The Module type from the decl_module! macro.

// Configure/program the Nicks pallet (coding part here)

// Step 1: Implement the Config interface for the Nicks pallet
// This implementation consists of two parts: a parameter_types! block where 
// constant values are defined and an impl block where the types and values 
// defined by the Config interface are configured.
// Add the following code to your runtime, under runtime/src/lib.rs
/// Add this code block to your template for Nicks:
parameter_types! {
    // Choose a fee that incentivizes desirable behavior.
    pub const NickReservationFee: u128 = 100;
    pub const MinNickLength: usize = 8;
    // Maximum bounds on storage are important to secure your chain.
    pub const MaxNickLength: usize = 32;
}

impl pallet_nicks::Config for Runtime {
    // The Balances pallet implements the ReservableCurrency trait.
    // https://substrate.dev/rustdocs/v3.0.0/pallet_balances/index.html#implementations-2
    type Currency = pallet_balances::Module<Runtime>;

    // Use the NickReservationFee from the parameter_types block.
    type ReservationFee = NickReservationFee;

    // No action is taken when deposits are forfeited.
    type Slashed = ();

    // Configure the FRAME System Root origin as the Nick pallet admin.
    // https://substrate.dev/rustdocs/v3.0.0/frame_system/enum.RawOrigin.html#variant.Root
    type ForceOrigin = frame_system::EnsureRoot<AccountId>;

    // Use the MinNickLength from the parameter_types block.
    type MinLength = MinNickLength;

    // Use the MaxNickLength from the parameter_types block.
    type MaxLength = MaxNickLength;

    // The ubiquitous event type.
    type Event = Event;
}

// Step 2: Next, we need to add the Nicks pallet to the construct_runtime! macro

construct_runtime!(
    pub enum Runtime where
        Block = Block,
        NodeBlock = opaque::Block,
        UncheckedExtrinsic = UncheckedExtrinsic
    {
        /* --snip-- */

        /*** Add This Line ***/
        Nicks: pallet_nicks::{Module, Call, Storage, Event<T>},
    }
);

Note that not all pallets will expose all of these runtime types, and some may expose more! You should always look at the documentation or source code of a pallet to determine which of these types you need to expose.

Compile the node with the new nick pallet defined

$ cargo build --release

// If compilation errors occur, you may need to update the Rust stable and nightly toolchains for WASM

$ rustup update
$ rustup update nightly
$ rustup target add wasm32-unknown-unknown --toolchain nightly

Deploy a Pallet from Scratch - Kitties

This link is also a good example of creating a pallet in its own crate: https://substrate.dev/docs/en/tutorials/create-a-pallet/; on newer versions of the node template, this is the Template pallet (the new test pallet).

// Create a new kitties folder under ..substrate-node-template/pallets/kitties

// Under the kitties pallet, create the Cargo.toml file that contains all the Rust metadata
// Cargo.toml
[package]
description = 'FRAME pallet kitties for testing.'
edition = '2018'  # latest Rust edition
name = 'pallet-kitties'
authors = ["Marc-Antoine Lemaire"]
version = '0.1.0'

# alias "parity-scale-codec" to "codec"
[dependencies.codec]
default-features = false
features = ['derive']
package = 'parity-scale-codec'
version = '2.0.0'

[dependencies]
frame-support = { default-features = false, version = '3.0.0' }
frame-system = { default-features = false, version = '3.0.0' }

[features]
default = ['std']
std = [
    'codec/std',
    'frame-support/std',
    'frame-system/std',
]

// Create the entry point of the Rust project: lib.rs
// ...substrate-node-template/pallets/kitties/src/lib.rs
// lib.rs:

#![cfg_attr(not(feature = "std"), no_std)] // std is not supported in the WASM runtime
use frame_support::{decl_module, decl_storage, decl_event, decl_error, dispatch, traits::Get};

pub trait Config: frame_system::Config {
}

decl_module! {
	pub struct Module<T: Config> for enum Call where origin: T::Origin {
    }
}

// Last step is to add the kitties pallet to the runtime
// Go to ...substrate-node-template/runtime/Cargo.toml
// Add the kitties pallet to the dependencies
# local dependencies
pallet-kitties = { path = '../pallets/kitties', default-features = false }

// Add the kitties pallet to the features
[features]
std = [
...
     'pallet-kitties/std',
...

// Then update the runtime ...substrate-node-template/runtime/src/lib.rs
// Configure FRAME pallets to include in runtime.
// Create the runtime by composing the FRAME pallets that were previously configured.
construct_runtime!(
...
Kitties: pallet_kitties::{Module},
...

// Then verify and compile
// You can check that the Rust project of the kitties pallet compiles
// cd ..substrate-node-template/pallets/kitties
$ sudo SKIP_WASM_BUILD=1 cargo check
// Verify that the new dependencies resolve correctly

$ sudo cargo check -p node-template-runtime

// Compile it
$ sudo cargo build --release

// Then run it
./target/release/node-template --dev --tmp

// Then launch the Front-End:
$ yarn start
// check if you see the kitties pallet

On the front end, you will need to provide the type definitions of your custom types (this plays the role of a smart contract's ABI). Substrate comes with a powerful metadata system; however, it does not include the definitions of custom types, so we need to provide them ourselves. You can use the following interface: (https://polkadot.js.org/apps/#/settings/developer)

// Go on https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/settings/developer
// copy/paste the Kitty custom type:
{
  "Kitty": "[u8;16]"
}
// Refresh

// Make a transaction to insert a new kitty for Alice
// https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics
// Select kitties extrinsic, use alice and submit transaction

// Go to explorer to see the transaction
https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/explorer

// Go to chain state to query the blockchain
https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/chainstate
// Select kitties

For more information on the polkadot.js API: https://github.com/polkadot-js/api

Explore Substrate Framework

Full Substrate project is on https://github.com/paritytech/substrate

Perform a runtime upgrade by pushing WASM code to the blockchain

Steps to do it:

  • Step 1: Bump spec_version: increment spec_version by one in runtime/src/lib.rs. Just increment the value by one if you want the WASM runtime to be upgraded.

pub const VERSION: RuntimeVersion = RuntimeVersion {
	spec_name: create_runtime_str!("node-template"), // <--- This is the blockchain runtime name
	// Should be a unique name: kusama, polkadot...
	impl_name: create_runtime_str!("node-template"), // <--- different client implementations could have different names (e.g. geth vs parity on Ethereum)
	authoring_version: 1, // <--- block production code version
	spec_version: 100, // <--- update that value, increment by one
	impl_version: 1, // <--- patch version linked to the implementation
	apis: RUNTIME_API_VERSIONS, // <--- used internally by the runtime API (no need to change)
	transaction_version: 1, // <--- used by wallets to detect a breaking change in the pallet indices of the runtime or in the call ordering
// in construct_runtime! of Substrate, if you insert a line in the middle you will break it, because the indices of the modules will change
// If you add a new extrinsic to a pallet module, add it at the end, otherwise you break the indexing
};
  • Step 2: Build the WASM file

Any time you compile a runtime, it compiles to the native platform format and also to the WASM format. The portable WASM build of the node runtime is located under the project folder at:

substrate-node-template/target/debug/wbuild/node-template-runtime/target/wasm32-unknown-unknown/release -> node_template_runtime.wasm

$ WASM_TARGET_DIRECTORY="$(pwd)" make build-full
// pwd = current working directory
// This command simply launches cargo build
// so you can run directly
$ sudo cargo build
// if it errors, run this first
// sudo cargo clean
// Specify the version of the toolchain to be used for WASM
$ sudo WASM_BUILD_TOOLCHAIN=nightly cargo build
  • Step 3: Use the Sudo pallet to perform the runtime upgrade

To see the running runtime version, go to https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/rpc

  • At the top left, you can see the spec version number

  • You can also go to the Developer menu -> RPC -> state -> getRuntimeVersion()

Publish a transaction on the blockchain via the Polkadot.js tool and upload the WASM version of the new runtime. Go to https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/sudo

Select system -> setCode, select file upload and drag and drop the WASM code, then select "with weight override" and put 1, then submit the transaction (with the Alice account).

Then perform the checks described above to verify that the spec version has incremented and that your new pallets are now present.

What is pallet-sudo

Sudo is no longer enabled on Polkadot; they use the Democracy pallet.

For pallet-sudo details, see substrate/frame/sudo/src/lib.rs

Off Chain Worker & Pallet Versioning

Off-chain workers are an innovative feature running on validator nodes to bridge off-chain data and on-chain storage.

Examples:

  • Polkadot and Kusama use off-chain workers to run the Sequential Phragmén algorithm for validator selection

  • To run non-deterministic tasks

  • To run a PoW miner

  • For tasks requiring an HSM

    • Sign some data with an HSM

  • To perform HTTPS requests to external web sites and anchor some results on-chain -> i.e. implement an oracle

The off-chain worker module doc: https://crates.parity.io/sp_runtime/offchain/index.html

Some technical deep-dive aspects (a sketch of the offchain_worker hook follows):

  • it is triggered every time a block is imported (not while blocks are syncing)
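A hedged FRAME v1 sketch (illustrative, assuming frame_support::debug as available in Substrate 3.0) of where the off-chain worker hook lives: it is just a special function in decl_module! that the node runs off-chain after each imported block:

use frame_support::debug;

decl_module! {
    pub struct Module<T: Config> for enum Call where origin: T::Origin {
        fn offchain_worker(block_number: T::BlockNumber) {
            // Runs on the node, outside consensus: it can do HTTP requests, sign data with an HSM, etc.
            // Results must be pushed back on-chain via a signed or unsigned transaction.
            debug::info!("Off-chain worker running at block {:?}", block_number);
        }
    }
}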

Pallet versioning and Storage Migration

The TrieDB used for on-chain storage is equivalent to NoSQL storage (a schemaless key-value store). The schema is defined in the code, not in the storage. It is just a bunch of bytes; data types are not managed at the TrieDB level, the calling code has to manage that aspect.

Automated pallet versioning is available from version 3.x (using the version in the Cargo.toml file), so pallet versions are now unified.

Smart Contract Versus Runtime Pallet

Runtime Pallet:

  • More Flexible

  • You don't even need to charge transaction fees for your state transitions if you don't want to; you can set up custom whitelists

  • Full access to everything; you can call all the functions of all the other pallets

  • It is up to the developers to ensure that storage usage cannot be abused

  • Benchmarking to measure weights and ensure that the block time stays reasonable

  • Requires a runtime upgrade and approval by governance, which is a big drawback -> to deploy a new pallet you need the approval of the participants, which can take time

  • You need to maintain your own blockchain

Smart Contract:

  • Limited by the platform

  • Sandboxed execution environment (this adds a lot of complexity and limitations), so storage can be more secure

  • Performance penalty, as we need to assess each operation's cost and storage usage to calculate gas metering

  • State rent

  • Deployment can be permissionless

pallet-contracts is a pallet that implements a WASM smart-contract execution environment: https://crates.parity.io/pallet_contracts/index.html

We need to use ink!, a Rust contract development framework developed by Parity, for smart contracts to be deployed on pallet-contracts: https://substrate.dev/substrate-contracts-workshop/#/0/setup
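As an illustration, here is the canonical "flipper" contract in ink! (ink! 3.x attribute syntax, roughly as in the contracts workshop linked above); it stores a single bool and exposes two messages:

#![cfg_attr(not(feature = "std"), no_std)]

use ink_lang as ink;

#[ink::contract]
mod flipper {
    #[ink(storage)]
    pub struct Flipper {
        value: bool,
    }

    impl Flipper {
        // Constructor: run once at deployment time
        #[ink(constructor)]
        pub fn new(init_value: bool) -> Self {
            Self { value: init_value }
        }

        // Message: a callable contract method that mutates state
        #[ink(message)]
        pub fn flip(&mut self) {
            self.value = !self.value;
        }

        // Message: a read-only query
        #[ink(message)]
        pub fn get(&self) -> bool {
            self.value
        }
    }
}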

pallet-contracts versus the EVM (Ethereum Virtual Machine)

pallet-contracts

  • WASM bytecode

  • You can use any language that compiles to WASM to write your smart contracts

  • Rust support via ink!

  • AssemblyScript support via the new tool ask!

  • Compile Solidity projects to WASM using the Solang project

  • Limited tooling, such as Redspot

  • New ecosystem and community; everything is very new

EVM

  • Its own special bytecode

  • Use Solidity or any language that compiles to EVM bytecode

  • Many mature dev libraries and tools

  • Large ecosystem and community

  • Extra tooling required to integrate with Substrate

The big differences between Ethereum/Solidity and Substrate/Rust:

  • The EVM uses a 160-bit address format while Substrate uses 256 bits

  • The signing algorithms are not the same

  • The SDKs are different

  • The RPCs are different

Frontier is the pallet that makes it possible to use existing Ethereum tools and smart contracts on a Substrate node: https://github.com/paritytech/frontier

An ethers.js-like wrapper around polkadot.js (the Bodhi project): https://github.com/w3f/Open-Grants-Program/blob/master/applications/project_bodhi.md

How to become a parachain

Cumulus (https://github.com/paritytech/cumulus) is a set of tools to turn your own independent, existing Substrate-based blockchain network into a parachain.

Security is managed by the validators of the relay chain. The relay chain supports only a limited number of parachain slots. As an estimate, one parachain requires 10 validators. For the moment Polkadot has 300 validators and has a program to push the number of validators to 1000, which means about 100 parachain slots. Kusama already has 700 validators. Since the number of slots is limited, parachain slot auctions are used to lease slots on Polkadot/Kusama.

There is also the parathread approach, a pay-as-you-go model: you bid relay-chain tokens for block inclusion on the relay chain. You compete with others to get your blocks included; the more you pay, the bigger the chance you will win the bid.

For each parachain slot, there is a set of validator nodes on the relay chain. Those nodes communicate with the parachain's collator nodes, which are full nodes that create new blocks for the parachain and communicate with the relay-chain validators to request inclusion of their blocks in the relay chain.

The relay chain uses BABE/GRANDPA consensus and the parachain uses the Cumulus components:

  • Cumulus-Consensus: each parachain node internally runs a relay-chain node to verify, follow and finalize parachain blocks (you cannot finalize a parachain block if the relay chain has not included it)

  • Cumulus-Runtime: provides parachain runtime validation capabilities and proof-generation routines, so the relay chain can validate the parachain. It also implements the validate_block API that is used by validators to validate parachain blocks

  • Cumulus-Collator: implements the parachain block production functionality

A malicious collator cannot cause issues; it is really a malicious validator that could forge blocks, not the collator.

There is a relay-chain testnet, Rococo, for the moment. Rococo currently runs four test system parachains (Statemint, Tick, Trick, and Track), as well as several externally developed parachains.

Each parachain is a shard of Polkadot, with unique state transition rules. Parachains have separate economies, governance mechanisms, and users. Because of the interface that Polkadot provides, the Relay Chain validators can guarantee that each parachain follows its unique rules and can pass messages between shards in a trust-free environment. Here is a very good article explaining how blocks are created and validated in the relay-chain/parachain environment: https://medium.com/polkadot-network/the-path-of-a-parachain-block-47d05765d7a

The Relay Chain provides the security and validity guarantees, so parachains are not subject to normal blockchain attack scenarios, like a 51% attack. Relay Chain validators will reject invalid blocks, so a parachain only needs a single honest collator to submit blocks.

Validators use the random outputs that BABE generates — the same ones that assign block production slots — to determine which parachain to validate next. Once a validator knows its new parachain, it finds collators from that parachain to establish connections with. Each validator assigned to a parachain will import the correct state transition function to validate that parachain. Now that collators and validators share a connection and common logic, a collator can send a block to one of the validators.

By separating block production (BABE) from finality (GRANDPA), Polkadot can perform extra validity checks after a block is produced but before it is finalized. Polkadot has special actors, fishermen, who patrol Relay Chain blocks for invalid candidate receipts.

After enough secondary checks have been performed on all the candidate receipts within a block, validators can finally vote for that block (and by extension, all previous blocks) in GRANDPA. Once it has more than two-thirds of pre-commits, the block is in the finalized chain. All of the availability and validity checks should take place in less than one minute from the time a block is authored to the time it is finalized.

See also the following:

XCMP & XCM

XCMP = Cross-Chain Message Passing:

  • It enables arbitrary message passing between the relay chain and parachains

  • It is the relay chain that guarantees that messages are delivered

  • The current dev phase is: only vertical message passing

    • Up (UMP = Upward Message Passing): from parachain -> relay chain

    • Down (DMP = Downward Message Passing): from relay chain -> parachain

    • At this dev stage, parachains cannot yet send messages directly to other parachains

  • HRMP (Horizontal Relay-routed Message Passing): with this current mechanism, a parachain can send messages to another parachain via the relay chain -> later this will be direct parachain -> parachain communication -> no more bottleneck

XCM = Polkadot Cross-Consensus Message Format:

  • Asynchronous

  • Absolute (the message is guaranteed to be delivered)

  • Asymmetric

  • Agnostic

Types of messages in XCM:

  • WithdrawAsset

  • ReserveAssetDeposit (e.g. when transferring BTC to Ethereum)

  • TeleportAsset (like when transferring USDT created on one platform and ported to another one)

  • Transact

Asset instructions:

  • DepositAsset

  • InitiateReserveWithdraw

  • Remark

MultiLocation (can be a blockchain, a smart contract, a user)

MultiAsset (defines a token; can be an NFT, ERC-721-like, or an FT, ERC-20-like)

XCM is the format of the messages used by XCMP: https://github.com/paritytech/xcm-format That draft implementation is already used in Polkadot/Cumulus.

Deploying Smart Contracts on Substrate

See -> https://substrate.dev/substrate-contracts-workshop/#/

Why become a parachain?

See how the contracts pallet works: https://substrate.dev/docs/en/knowledgebase/smart-contracts/contracts-pallet

Using the Rococo testnet.

Future of Polkadot and the Substrate Ecosystem

Some issues:

Future of Polkadot:

Contribute to the Ecosystem:

Bootstrap your project (where you can get help):

Web3:

    • Web1.0: Static Web 1990-2000 (Read only)

      • Linking static content together across the Internet

    • Web2.0: Dynamic Read/Write (Social Media was the big part)

      • Linking programs to that content and building rich applications across all devices

    • Web3.0: Web2.0 + trust (verifiability is built into the core layer at the foundation)

      • We are linking content and programs directly to each other, bypassing intermediary organisations and gaining public verifiability

Installing Substrate on lighter hardware

Video Training Received:

Exercises of the course

See https://github.com/SubstrateDevAcademy

I have cloned all my exercise results to https://github.com/IPConvergence/substrateDev-Exervices

In order to run the code, there is an issue in a file for WASM; the procedure to fix it is:

Solution:

When you run the code, please run the following command:

"./target/release/node-template --dev --tmp", don't forget the tmp

// To make it work in Polkadot.js, go to Settings -> Developer and declare the custom types:
{
  "KittyIndex": "u32",
  "KittyIndexOf": "u32",
  "Kitty": "[u8; 16]",
  "ClassId": "u32",
  "TokenId": "u32"
}

Writing Some Tests

// Evaluate an expression, assert it returns an expected Err value and that runtime storage has not been mutated (i.e. the expression is a no-operation)
assert_noop!(expression_to_assert, expected_error_expression)
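A hedged sketch of a full test using these assertion macros (it assumes the mock runtime, the new_test_ext() externalities builder and a KittiesModule::create extrinsic as in the course exercises):

use frame_support::{assert_noop, assert_ok};
use sp_runtime::DispatchError;

#[test]
fn create_works_and_unsigned_origin_is_a_noop() {
    new_test_ext().execute_with(|| {
        // A signed origin succeeds and mutates storage.
        assert_ok!(KittiesModule::create(Origin::signed(1)));
        // An unsigned origin fails; assert_noop! also checks that storage was NOT mutated.
        assert_noop!(KittiesModule::create(Origin::none()), DispatchError::BadOrigin);
    });
}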
