# Polkadot Developer Documentation (LLMS Format) This file contains documentation for Polkadot (https://polkadot.network). Polkadot unites the world's innovators and changemakers, building and using the most transformative apps and blockchains. Access tools, guides, and resources to quickly start building custom chains, deploying smart contracts, and creating dApps. It is intended for use with large language models (LLMs) to support developers working with Polkadot. The content includes selected pages from the official docs, organized by section. This file includes documentation related to the product: Smart Contracts ## AI Prompt Template You are an AI developer assistant for Polkadot (https://polkadot.network). Your task is to assist developers in understanding and using the product described in this file. - Provide accurate answers based on the included documentation. - Do not assume undocumented features, behaviors, or APIs. - If unsure, respond with "Not specified in the documentation." ## List of doc pages: Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/block-explorers.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/connect-to-polkadot.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/dev-environments/hardhat.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/dev-environments/remix.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/faqs.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/libraries/ethers-js.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/libraries/viem.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/libraries/wagmi.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/libraries/web3-js.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/libraries/web3-py.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/local-development-node.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/overview.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/precompiles/interact-with-precompiles.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/precompiles/xcm-precompile.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/develop/smart-contracts/wallets.md [type: develop] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/tutorials/smart-contracts/deploy-erc20.md [type: tutorials] Doc-Page: 
https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/tutorials/smart-contracts/deploy-nft.md [type: tutorials] Doc-Page: https://raw.githubusercontent.com/polkadot-developers/polkadot-docs/refs/heads/main/tutorials/smart-contracts/launch-your-first-project/create-contracts.md [type: tutorials] ## Full content for each doc page Doc-Content: https://docs.polkadot.com/develop/smart-contracts/block-explorers/ --- BEGIN CONTENT --- --- title: Block Explorers description: Access PolkaVM explorers like Subscan, BlockScout, and Routescan to track transactions, analyze contracts, and view on-chain data from smart contracts. categories: Smart Contracts, Tooling --- # Block Explorers !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Block explorers serve as comprehensive blockchain analytics platforms that provide access to on-chain data. These web applications function as search engines for blockchain networks, allowing users to query, visualize, and analyze blockchain data in real time through intuitive interfaces. ## Core Functionality These block explorers provide essential capabilities for interacting with smart contracts in Polkadot Hub: - **Transaction tracking** - monitor transaction status, confirmations, fees, and metadata - **Address analysis** - view account balances, transaction history, and associated contracts - **Block information** - examine block contents - **Smart contract interaction** - review contract code, verification status, and interaction history - **Token tracking** - monitor ERC-20, ERC-721, and other token standards with transfer history and holder analytics - **Network statistics** - access metrics on transaction volume, gas usage, and other network parameters ## Available Block Explorers The following block explorers are available for PolkaVM smart contracts, providing specialized tools for monitoring and analyzing contract activity within the Polkadot ecosystem: ### BlockScout BlockScout is an open-source explorer platform with a user-friendly interface adapted for PolkaVM contracts. It excels at detailed contract analytics and provides developers with comprehensive API access. - [Polkadot Hub TestNet BlockScout](https://blockscout-passet-hub.parity-testnet.parity.io/){target=\_blank} ![](/images/develop/smart-contracts/block-explorers/block-explorers-2.webp) --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/connect-to-polkadot/ --- BEGIN CONTENT --- --- title: Connect to Polkadot description: Explore how to connect to Polkadot Hub, configure your wallet, and obtain test tokens for developing and testing smart contracts. categories: Smart Contracts --- # Connect to Polkadot !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**.
## Connect to Polkadot Hub TestNet
For more information about how to connect to Polkadot Hub, please check the [Wallets for Polkadot Hub](/develop/smart-contracts/wallets/){target=\_blank} guide. ## Network Details Developers can leverage smart contracts across diverse networks, from TestNets to MainNet. This section outlines the network specifications and connection details for each environment. === "Polkadot Hub TestNet" Network name ```text Polkadot Hub TestNet ``` --- Currency symbol ```text PAS ``` --- Chain ID ```text 420420422 ``` --- RPC URL ```text https://testnet-passet-hub-eth-rpc.polkadot.io ``` --- Block explorer URL ```text https://blockscout-passet-hub.parity-testnet.parity.io/ ``` ## Test Tokens You will need testnet tokens to perform transactions and engage with smart contracts on any chain. Here's how to obtain Paseo (PAS) tokens for testing purposes: 1. Navigate to the [Polkadot Faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}. If the desired network is not already selected, choose it from the Network drop-down 2. Copy your address linked to the TestNet and paste it into the designated field ![](/images/develop/smart-contracts/connect-to-polkadot/connect-to-polkadot-1.webp) 3. Click the **Get Some PASs** button to request free test PAS tokens. These tokens will be sent to your wallet shortly ![](/images/develop/smart-contracts/connect-to-polkadot/connect-to-polkadot-2.webp) Now that you have obtained PAS tokens in your wallet, you’re ready to deploy and interact with smart contracts on Polkadot Hub TestNet! These tokens will allow you to pay for gas fees when executing transactions, deploying contracts, and testing your dApp functionality in a secure testnet environment. ## Where to Go Next For your next steps, explore the various smart contract guides demonstrating how to use and integrate different tools and development environments into your workflow.
- Guide __Deploy your first contract with Remix__ --- Explore the smart contract development and deployment process on Polkadot Hub using the Remix IDE. [:octicons-arrow-right-24: Build with Remix IDE](/develop/smart-contracts/dev-environments/remix/) - Guide __Interact with the blockchain with viem__ --- Use viem, a TypeScript library for Ethereum-compatible chains, to deploy and interact with smart contracts on Polkadot Hub. [:octicons-arrow-right-24: Build with viem](/develop/smart-contracts/libraries/viem/)
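As an alternative to adding the network manually through a wallet's UI, the TestNet parameters from the Network Details section above can also be registered programmatically with an [EIP-3085](https://eips.ethereum.org/EIPS/eip-3085){target=\_blank} `wallet_addEthereumChain` request. The snippet below is a minimal sketch that assumes a browser context with an injected `window.ethereum` provider (for example, MetaMask); the currency decimals follow the value used elsewhere in this file.

```javascript
// Ask the injected wallet to add Polkadot Hub TestNet (EIP-3085).
async function addPolkadotHubTestNet() {
  await window.ethereum.request({
    method: 'wallet_addEthereumChain',
    params: [
      {
        chainId: '0x190f1b46', // 420420422 in hexadecimal
        chainName: 'Polkadot Hub TestNet',
        nativeCurrency: { name: 'PAS', symbol: 'PAS', decimals: 18 },
        rpcUrls: ['https://testnet-passet-hub-eth-rpc.polkadot.io'],
        blockExplorerUrls: [
          'https://blockscout-passet-hub.parity-testnet.parity.io/',
        ],
      },
    ],
  });
}
```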
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/dev-environments/hardhat/ --- BEGIN CONTENT --- --- title: Use Hardhat with Polkadot Hub description: Learn how to create, compile, test, and deploy smart contracts on Polkadot Hub using Hardhat, a powerful development environment for blockchain developers. categories: Smart Contracts, Tooling --- # Hardhat !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**.
- :octicons-code-16:{ .lg .middle } __Test and Deploy with Hardhat__ --- Master Solidity smart contract development with Hardhat. Learn testing, deployment, and network interaction in one comprehensive tutorial.
[:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat){target=\_blank}
!!! note "Contracts Code Blob Size Disclaimer" The maximum contract code blob size on Polkadot Hub networks is _100 kilobytes_, significantly larger than Ethereum’s EVM limit of 24 kilobytes. For detailed comparisons and migration guidelines, see the [EVM vs. PolkaVM](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/#current-memory-limits){target=\_blank} documentation page. ## Overview Hardhat is a robust development environment for Ethereum-compatible chains that makes smart contract development more efficient. This guide walks you through the essentials of using Hardhat to create, compile, test, and deploy smart contracts on Polkadot Hub. ## Prerequisites Before getting started, ensure you have: - [Node.js](https://nodejs.org/){target=\_blank} (v16.0.0 or later) and npm installed - Basic understanding of Solidity programming - Some PAS test tokens to cover transaction fees (easily obtainable from the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}). To learn how to get test tokens, check out the [Test Tokens](/develop/smart-contracts/connect-to-polkadot#test-tokens){target=\_blank} section ## Set Up Hardhat 1. Create a new directory for your project and navigate into it: ```bash mkdir hardhat-example cd hardhat-example ``` 2. Initialize a new npm project: ```bash npm init -y ``` 3. To interact with Polkadot, Hardhat requires the following plugin to compile contracts to PolkaVM bytecode and to spawn a local node compatible with PolkaVM: ```bash npm install --save-dev @parity/hardhat-polkadot@0.1.8 ``` 4. Create a Hardhat project: ```bash npx hardhat-polkadot init ``` Select **Create a JavaScript project** when prompted and follow the instructions. After that, your project will be created with three main folders: - **`contracts`** - where your Solidity smart contracts live - **`test`** - contains your test files that validate contract functionality - **`ignition`** - deployment modules for safely deploying your contracts to various networks 5. Add the following folders to the `.gitignore` file if they are not already there: ```bash echo '/artifacts-pvm' >> .gitignore echo '/cache-pvm' >> .gitignore echo '/ignition/deployments/' >> .gitignore ``` 6. Finish the setup by installing all the dependencies: ```bash npm install ``` !!! note This last step is needed to set up the `hardhat-polkadot` plugin. It will install the `@parity/hardhat-polkadot` package and all its dependencies. In the future, the plugin will handle this automatically. ## Compile Your Contract The plugin will compile your Solidity contracts for Solidity versions `0.8.0` and higher to be PolkaVM compatible. When compiling your contract, there are two ways to configure your compilation process: - **npm compiler** - uses library [@parity/resolc](https://www.npmjs.com/package/@parity/resolc){target=\_blank} for simplicity and ease of use - **Binary compiler** - uses your local `resolc` binary directly for more control and configuration options To compile your project, follow these instructions: 1. 
Modify your Hardhat configuration file to specify which compilation process you will be using and activate the `polkavm` flag in the Hardhat network: === "npm Configuration" ```javascript title="hardhat.config.js" hl_lines="9-11 14" // hardhat.config.js require('@nomicfoundation/hardhat-toolbox'); require('@parity/hardhat-polkadot'); /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { solidity: '0.8.28', resolc: { compilerSource: 'npm', }, networks: { hardhat: { polkavm: true, }, }, }; ``` === "Binary Configuration" ```javascript title="hardhat.config.js" hl_lines="9-14 17" // hardhat.config.js require('@nomicfoundation/hardhat-toolbox'); require('@parity/hardhat-polkadot'); /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { solidity: '0.8.28', resolc: { compilerSource: 'binary', settings: { compilerPath: 'INSERT_PATH_TO_RESOLC_COMPILER', }, }, networks: { hardhat: { polkavm: true, }, }, }; ``` For the binary configuration, replace `INSERT_PATH_TO_RESOLC_COMPILER` with the proper path to the binary. To obtain the binary, check the [releases](https://github.com/paritytech/revive/releases){target=\_blank} section of the `resolc` compiler, and download the latest version. The default settings used can be found in the [`constants.ts`](https://github.com/paritytech/hardhat-polkadot/blob/v0.1.5/packages/hardhat-polkadot-resolc/src/constants.ts#L8-L23){target=\_blank} file of the `hardhat-polkadot` source code. You can change them according to your project needs. Generally, the recommended settings for optimized outputs are the following: ```javascript title="hardhat.config.js" hl_lines="4-10" resolc: { ... settings: { optimizer: { enabled: true, parameters: 'z', fallbackOz: true, runs: 200, }, standardJson: true, }, ... } ``` You can check the [`ResolcConfig`](https://github.com/paritytech/hardhat-polkadot/blob/v0.1.5/packages/hardhat-polkadot-resolc/src/types.ts#L26){target=\_blank} for more information about compilation settings. 2. Compile the contract with Hardhat: ```bash npx hardhat compile ``` 3. After successful compilation, you'll see the artifacts generated in the `artifacts-pvm` directory: ```bash ls artifacts-pvm/contracts/*.sol/ ``` You should see JSON files containing the contract ABI and bytecode of the contracts you compiled. ## Set Up a Testing Environment Hardhat allows you to spin up a local testing environment to test and validate your smart contract functionalities before deploying to live networks. The `hardhat-polkadot` plugin provides the possibility to spin up a local node with an ETH-RPC adapter for running local tests. For complete isolation and control over the testing environment, you can configure Hardhat to work with a fresh local Substrate node. This approach is ideal when you want to test in a clean environment without any existing state or when you need specific node configurations. Configure a local node setup by adding the node binary path along with the ETH-RPC adapter path: ```javascript title="hardhat.config.js" hl_lines="12-20" // hardhat.config.js require('@nomicfoundation/hardhat-toolbox'); require('@parity/hardhat-polkadot'); /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { ... 
networks: { hardhat: { polkavm: true, nodeConfig: { nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE', rpcPort: 8000, dev: true, }, adapterConfig: { adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER', dev: true, }, }, }, }; ``` Replace `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` with the actual paths to your compiled binaries. The `dev: true` flag configures both the node and adapter for development mode. To obtain these binaries, check the [Installation](/develop/smart-contracts/local-development-node#install-the-substrate-node-and-eth-rpc-adapter){target=\_blank} section on the Local Development Node page. Once configured, start your chosen testing environment with: ```bash npx hardhat node ``` This command will launch the local Substrate node (or a forked network, if you have configured one) along with the ETH-RPC adapter, providing you with a complete testing environment ready for contract deployment and interaction. By default, the Substrate node will be running on `localhost:8000` and the ETH-RPC adapter on `localhost:8545`. The output will be something like this:
npx hardhat node
Starting server at 127.0.0.1:8000 ../bin/substrate-node --rpc-port=8000 --dev Starting the Eth RPC Adapter at 127.0.0.1:8545 ../bin/eth-rpc --node-rpc-url=ws://localhost:8000 --dev 2025-05-29 13:00:32 Running in --dev mode, RPC CORS has been disabled. 2025-05-29 13:00:32 Running in --dev mode, RPC CORS has been disabled. 2025-05-29 13:00:32 🌐 Connecting to node at: ws://localhost:8000 ... 2025-05-29 13:00:32 Substrate Node 2025-05-29 13:00:32 ✌️ version 3.0.0-dev-f73c228b7a1 2025-05-29 13:00:32 ❤️ by Parity Technologies <admin@parity.io>, 2017-2025 2025-05-29 13:00:32 📋 Chain specification: Development 2025-05-29 13:00:32 🏷 Node name: electric-activity-4221 2025-05-29 13:00:32 👤 Role: AUTHORITY 2025-05-29 13:00:32 💾 Database: RocksDb at /var/folders/f4/7rdt2m9d7j361dm453cpggbm0000gn/T/substrateOaoecu/chains/dev/db/full 2025-05-29 13:00:36 [0] 💸 generated 1 npos voters, 1 from validators and 0 nominators ...
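Once both processes are running, you can quickly verify that the ETH-RPC adapter is reachable before moving on to tests. The following check is illustrative and assumes the default adapter port (`8545`) shown above:

```bash
# Query the latest block number through the local ETH-RPC adapter
curl -s -X POST http://localhost:8545 \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'
```

A JSON response containing a `result` field confirms the adapter is up; the exact block number will vary.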
## Test Your Contract When testing your contract, be aware that [`@nomicfoundation/hardhat-toolbox/network-helpers`](https://hardhat.org/hardhat-network-helpers/docs/overview){target=\_blank} is not fully compatible with Polkadot Hub's available RPCs. Specifically, Hardhat-only helpers like `time` and `loadFixture` may not work due to missing RPC calls in the node. For more details, refer to the [Compatibility](https://github.com/paritytech/hardhat-polkadot/tree/main/packages/hardhat-polkadot-node#compatibility){target=\_blank} section in the `hardhat-polkadot` docs. You should avoid using helpers like `time` and `loadFixture` when writing tests. To run your test: 1. Update the `hardhat.config.js` file according to the [Set Up a Testing Environment](#set-up-a-testing-environment) section 2. Execute the following command to run your tests: ```bash npx hardhat test ``` ## Deploy to a Local Node Before deploying to a live network, you can deploy your contract to a local node using [Ignition](https://hardhat.org/ignition/docs/getting-started#overview){target=\_blank} modules: 1. Update the Hardhat configuration file to add the local network as a target for local deployment: ```javascript title="hardhat.config.js" hl_lines="13-16" // hardhat.config.js require('@nomicfoundation/hardhat-toolbox'); require('@parity/hardhat-polkadot'); /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { ... networks: { hardhat: { ... }, localNode: { polkavm: true, url: `http://127.0.0.1:8545`, }, }, }; ``` 2. Start a local node: ```bash npx hardhat node ``` This command will spawn a local Substrate node along with the ETH-RPC adapter. 3. In a new terminal window, deploy the contract using Ignition: ```bash npx hardhat ignition deploy ./ignition/modules/MyToken.js --network localNode ``` ## Deploy to a Live Network After testing your contract locally, you can deploy it to a live network. This guide will use the Polkadot Hub TestNet as the target network. Here's how to configure and deploy: 1. Fund your deployment account with enough tokens to cover gas fees. In this case, the needed tokens are PAS (on Polkadot Hub TestNet). You can use the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank} to obtain testing tokens. 2. Export your private key and save it in your Hardhat environment: ```bash npx hardhat vars set PRIVATE_KEY "INSERT_PRIVATE_KEY" ``` Replace `INSERT_PRIVATE_KEY` with your actual private key. For further details on exporting your private key, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\_blank}. !!! warning Never reveal your private key; otherwise, anyone with access to it can control your wallet and steal your funds. Store it securely and never share it publicly or commit it to version control systems. 3. Check that your private key has been set up successfully by running: ```bash npx hardhat vars get PRIVATE_KEY ``` 4. Update your Hardhat configuration file with network settings for the Polkadot network you want to target: ```javascript title="hardhat.config.js" hl_lines="18-22" // hardhat.config.js require('@nomicfoundation/hardhat-toolbox'); require('@parity/hardhat-polkadot'); const { vars } = require('hardhat/config'); /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { ... networks: { hardhat: { ... }, localNode: { ... 
}, polkadotHubTestnet: { polkavm: true, url: 'https://testnet-passet-hub-eth-rpc.polkadot.io', accounts: [vars.get('PRIVATE_KEY')], }, }, }; ``` 5. Deploy your contract using Ignition: ```bash npx hardhat ignition deploy ./ignition/modules/MyToken.js --network polkadotHubTestnet ``` ## Interact with Your Contract Once deployed, you can create a script to interact with your contract. To do so, create a file called `scripts/interact.js` and add some logic to interact with the contract. For example, for the default `MyToken.sol` contract, you can use the following script, which attaches to the contract at its deployed address, reads the token's name, symbol, and total supply, and then logs the deployer's token balance. ```javascript title="interact.js" const hre = require('hardhat'); async function main() { // Get the contract factory const MyToken = await hre.ethers.getContractFactory('MyToken'); // Replace with your deployed contract address const contractAddress = 'INSERT_CONTRACT_ADDRESS'; // Attach to existing contract const token = await MyToken.attach(contractAddress); // Get signers const [deployer] = await hre.ethers.getSigners(); // Read contract state const name = await token.name(); const symbol = await token.symbol(); const totalSupply = await token.totalSupply(); const balance = await token.balanceOf(deployer.address); console.log(`Token: ${name} (${symbol})`); console.log( `Total Supply: ${hre.ethers.formatUnits(totalSupply, 18)} tokens`, ); console.log( `Deployer Balance: ${hre.ethers.formatUnits(balance, 18)} tokens`, ); } main().catch((error) => { console.error(error); process.exitCode = 1; }); ``` Run your interaction script: ```bash npx hardhat run scripts/interact.js --network polkadotHubTestnet ``` ## Where to Go Next Hardhat provides a powerful environment for developing, testing, and deploying smart contracts on Polkadot Hub. Its plugin architecture allows seamless integration with PolkaVM through the `hardhat-polkadot` plugin. Explore more about smart contracts through these resources:
- Guide __Smart Contracts on Polkadot__ --- Dive into advanced smart contract concepts. [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/) - External __Hardhat Documentation__ --- Learn more about Hardhat's advanced features and best practices. [:octicons-arrow-right-24: Get Started](https://hardhat.org/docs){target=\_blank} - External __OpenZeppelin Contracts__ --- Test your skills by deploying contracts with prebuilt templates. [:octicons-arrow-right-24: Get Started](https://www.openzeppelin.com/solidity-contracts){target=\_blank}
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/dev-environments/remix/ --- BEGIN CONTENT --- --- title: Use the Polkadot Remix IDE description: Explore the smart contract development and deployment process on Asset Hub using Remix IDE, a visual IDE for blockchain developers. categories: Smart Contracts, Tooling --- # Remix IDE !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**.
- :octicons-code-16:{ .lg .middle } __Deploy NFTs Using Remix IDE__ --- Mint your NFT on Polkadot's Asset Hub. Use PolkaVM and OpenZeppelin to bring your digital asset to life with Polkadot Remix IDE.
[:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/deploy-nft){target=\_blank} - :octicons-code-16:{ .lg .middle } __Deploy ERC20s Using Remix IDE__ --- Mint your custom ERC-20 token on Polkadot's Asset Hub. Leverage PolkaVM and Polkadot Remix IDE to bring your blockchain project to life.
[:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/deploy-erc20){target=\_blank}
!!! warning The Polkadot Remix IDE's contract compilation functionality is currently limited to Google Chrome. Alternative browsers are not recommended for this task. ## Overview Remix IDE is a robust browser-based development environment for smart contracts. This guide will walk you through the essentials of the [Polkadot Remix IDE](https://remix.polkadot.io/){target=\_blank} to understand the processes of compiling, developing, and deploying smart contracts on Asset Hub. ## Prerequisites Before getting started, ensure you have: - A web browser with the [Talisman](https://talisman.xyz/){target=\_blank} extension installed - Basic understanding of Solidity programming - Some WND test tokens to cover transaction fees (easily obtainable from the [Polkadot faucet](https://faucet.polkadot.io/westend?parachain=1000){target=\_blank}) ## Accessing Remix IDE Navigate to [https://remix.polkadot.io/](https://remix.polkadot.io/){target=\_blank}. The interface will load with a default workspace containing sample contracts. ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-1.webp) In this interface, you can access a file explorer, edit your code, interact with various plugins for development, and use a terminal. ## Creating a New Contract To create a new contract using the Polkadot Remix IDE, you can follow these steps: 1. Select the **Create a new file** button in the `contracts` folder ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-2.webp) 2. Name your file with a `.sol` extension, in this case, `Counter.sol` ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-3.webp) 3. Write your Solidity code in the editor. You can use the following code as an example: ???- "Counter.sol" ```solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract Counter { int256 private count; function increment() public { count += 1; } function decrement() public { count -= 1; } function getCount() public view returns (int256) { return count; } } ``` ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-4.webp) ## Compiling Your Contract 1. To compile your contract, you need to: 1. Navigate to the **Solidity Compiler** tab (third icon in the left sidebar) 2. Select **Compile** or use `Ctrl+S` ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-5.webp) !!! note Compilation errors and warnings appear in the terminal panel at the bottom of the screen. 2. After compiling your contract, you can navigate to the **File Explorer** tab (first icon in the left sidebar) and check that: 1. The `artifact` folder is present 2. The `Counter_metadata.json` and the `Counter.json` files have been generated ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-6.webp) ## Deploying Contracts 1. To deploy your contract, you need to: 1. Navigate to the **Deploy & Run Transactions** tab (fourth icon in the left sidebar) 2. Click the **Environment** dropdown 3. Select **Customize this list** ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-7.webp) 2. Enable the **Injected Provider - Talisman** option ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-8.webp) 3. Click the **Environment** dropdown again and select **Injected Provider - Talisman** ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-9.webp) 4. 
Click the **Deploy** button and then click **Approve** in the Talisman wallet popup ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-10.webp) 5. Once your contract is deployed successfully, you will see the following output in the Remix terminal: ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-11.webp) ## Interacting with Contracts Once deployed, your contract appears in the **Deployed/Unpinned Contracts** section: 1. Expand the contract to view available methods ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-12.webp) !!! tip Pin your frequently used contracts to the **Pinned Contracts** section for easy access. 2. To interact with the contract, you can select any of the exposed methods ![](/images/develop/smart-contracts/evm-toolkit/dev-environments/remix/remix-13.webp) In this way, you can interact with your deployed contract by reading its state or writing to it. The button color indicates the type of interaction available: - **Red** - modifies state and is payable - **Orange** - modifies state only - **Blue** - reads state ## Where to Go Next The Polkadot Remix IDE offers an environment for developing, compiling, and deploying smart contracts on Asset Hub. Its intuitive interface allows developers to easily write Solidity code, compile contracts, and interact with them directly in the browser. Explore more about smart contracts through these resources:
- Guide __Smart Contracts on Polkadot__ --- Dive into advanced smart contract concepts. [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/) - External __OpenZeppelin Contracts__ --- Test your skills by deploying simple contracts with prebuilt templates. [:octicons-arrow-right-24: Get Started](https://www.openzeppelin.com/solidity-contracts){target=\_blank}
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/faqs/ --- BEGIN CONTENT --- --- title: Polkadot Hub Smart Contract FAQs description: Find answers to common questions about smart contract development, deployment, and compatibility in the Polkadot Hub ecosystem. categories: Smart Contracts --- # Smart Contracts FAQs !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. !!! note For a list of known incompatibilities, please refer to the [Solidity and Yul IR translation incompatibilities](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/#solidity-and-yul-ir-translation-incompatibilities){target=\_blank} section. ## General Questions ### What are the different types of smart contracts I can build on Polkadot? Polkadot supports three main smart contract environments: 1. **PolkaVM contracts**: Available on Polkadot Hub, using a RISC-V-based virtual machine with Solidity compatibility. 2. **EVM contracts**: Available on parachains like Moonbeam, Astar, and Acala via the Frontier framework. 3. **Wasm contracts**: Using ink! (Rust-based) or Solidity via Solang compiler. ### Should I build a smart contract or a parachain? Choose smart contracts if: - You want to deploy quickly without managing consensus. - Your application fits within existing chain functionality. - You prefer familiar development tools (Ethereum ecosystem). - You need to interact with other contracts easily. Choose a parachain if: - You need custom logic that doesn't fit smart contract limitations. - You want full control over governance and upgrades. - You require specialized consensus mechanisms. - You need optimized fee structures. ### What's the difference between Polkadot Hub smart contracts and other EVM chains? Polkadot Hub contracts run on [PolkaVM](/polkadot-protocol/smart-contract-basics/polkavm-design){target=\_blank} instead of EVM: - **Performance**: RISC-V register-based architecture vs. stack-based EVM. - **Resource metering**: Three dimensions (`ref_time`, `proof_size`, `storage_deposit`) vs. single gas metric. - **Memory management**: Hard memory limits per contract vs. gas-based soft limits. - **Account system**: Polkadot's 32-byte accounts with automatic 20-byte address conversion. ## Development Environment ### Can I use my existing Ethereum development tools? Yes, check out the [Wallets](/develop/smart-contracts/wallets){target=\_blank} page, the [Development Environments](/develop/smart-contracts/dev-environments/){target=\_blank}, and the [Libraries](/develop/smart-contracts/libraries/){target=\_blank} sections for more information. ### How do I set up local development? Check the [Local Development Node](/develop/smart-contracts/local-development-node){target=\_blank} page for further instructions. ### What networks are available for testing and deployment? - **Local Development**: Kitchensink node with Ethereum RPC proxy. - **TestNets**: Polkadot Hub TestNet. ## Technical Implementation ### How do Ethereum addresses work on Polkadot? Polkadot uses a [dual-address system](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm#account-management-comparison){target=\_blank}: - _20-byte Ethereum addresses_ are padded with `0xEE` bytes to create 32-byte Polkadot accounts. - _32-byte Polkadot accounts_ can register mappings to 20-byte addresses. - _Automatic conversion_ happens behind the scenes. - _MetaMask compatibility_ is maintained through the mapping system. 
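To make the padding rule concrete, the snippet below shows the 20-byte to 32-byte direction described above. It is illustrative only: the runtime performs this mapping automatically, the helper name is hypothetical, and the suffix layout shown here should be treated as a sketch rather than a normative definition.

```javascript
// Illustrative helper: pad a 20-byte Ethereum address with 0xEE bytes
// to visualize the corresponding 32-byte Polkadot account identifier.
// You never need to do this yourself; the conversion happens on-chain.
function toPolkadotAccountHex(ethAddress) {
  const hex = ethAddress.toLowerCase().replace(/^0x/, '');
  if (hex.length !== 40) {
    throw new Error('Expected a 20-byte (40 hex character) address');
  }
  return '0x' + hex + 'ee'.repeat(12); // 20 bytes + 12 bytes of 0xEE = 32 bytes
}

console.log(toPolkadotAccountHex('0x1234567890abcdef1234567890abcdef12345678'));
// -> 0x1234567890abcdef1234567890abcdef12345678eeeeeeeeeeeeeeeeeeeeeeee
```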
### What are the key differences in the gas model? PolkaVM uses three resource dimensions: - **`ref_time`**: Computational time (similar to traditional gas). - **`proof_size`**: State proof size for validator verification. - **`storage_deposit`**: Refundable deposit for state storage. Key implications: - Gas values are dynamically scaled based on performance benchmarks. - Cross-contract calls don't respect gas limits (use reentrancy protection). - Storage costs are separate from execution costs. ### How does contract deployment work? PolkaVM deployment differs from EVM: - _Code must be pre-uploaded_ to the chain before instantiation. - _Factory contracts_ need modification to work with pre-uploaded code hashes. - _Two-step process_: Upload code, then instantiate contracts. - _Runtime code generation_ is not supported. ### Which Solidity features are not supported? Limited support for: - **`EXTCODECOPY`**: Only works in constructor code. - **Runtime code modification**: Use on-chain constructors instead. - **Gas stipends**: `address.send()` and `address.transfer()` don't provide reentrancy protection. Unsupported operations: - `pc`, `extcodecopy`, `selfdestruct` - `blobhash`, `blobbasefee` (blob-related operations) ### How do I handle the existential deposit requirement? What it means: - Accounts need a minimum balance, also known as an existential deposit (ED), to remain active. - Accounts below this threshold are automatically deleted. How it's handled: - _Balance queries_ via Ethereum RPC automatically deduct the ED. - _New account transfers_ automatically include ED with transaction fees. - _Contract-to-contract transfers_ draw the ED from the transaction signer, not the sending contract. ## Migration and Compatibility ### Can I migrate my existing Ethereum contracts? Most contracts work without changes: - Standard ERC-20, ERC-721, ERC-1155 tokens. - DeFi protocols and DEXs. - DAOs and governance contracts. May need modifications: - Factory contracts that create other contracts at runtime. - Contracts using `EXTCODECOPY` for runtime code manipulation. - Contracts relying on gas stipends for reentrancy protection. ## Troubleshooting ### Why are my gas calculations different? PolkaVM uses dynamic gas scaling: - Gas values reflect actual performance benchmarks. - Don't hardcode gas values; use flexible calculations. - Cross-contract calls ignore gas limits; implement proper access controls. ### I deployed a contract with MetaMask and got a `code size` error - why? The latest MetaMask update affects the extension’s ability to deploy large contracts. Check the [Wallets](/develop/smart-contracts/wallets){target=\_blank} page for more details. ### I found a bug, where can I log it? Please log any bugs in the [`contract-issues`](https://github.com/paritytech/contract-issues/issues){target=\_blank} repository so developers are aware of them and can address them. ## Known Issues ### Runtime Behavior - **`creationCode` returns hash instead of bytecode**: The Solidity keyword returns a `keccak256` hash rather than the actual creation bytecode. - [Issue #45](https://github.com/paritytech/contract-issues/issues/45){target=\_blank} - **Non-deterministic gas usage**: Gas consumption varies slightly for identical transactions. - [Issue #49](https://github.com/paritytech/contract-issues/issues/49){target=\_blank} - **Precompiles not recognized**: Precompile addresses return `Contract not found` error. 
- [Issue #111](https://github.com/paritytech/contract-issues/issues/111){target=\_blank} ### Development Tools - **`hardhat-polkadot` plugin compilation issues**: Plugin interferes with standard `npx hardhat compile` command. - [Issue #44](https://github.com/paritytech/contract-issues/issues/44){target=\_blank} ### Contract Patterns - **Minimal proxy (EIP-1167) deployment fails**: Standard proxy contracts cannot be deployed on PolkaVM. - [Issue #86](https://github.com/paritytech/contract-issues/issues/86){target=\_blank} ### Compilation - **`SDIV` opcode crash**: Compiler crashes with `Unsupported SDIV` assertion failure. - [Issue #342](https://github.com/paritytech/revive/issues/342){target=\_blank} --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/libraries/ethers-js/ --- BEGIN CONTENT --- --- title: Deploy Contracts to Polkadot Hub with Ethers.js description: Learn how to interact with Polkadot Hub using Ethers.js, from compiling and deploying Solidity contracts to interacting with deployed smart contracts. categories: Smart Contracts, Tooling --- # Ethers.js !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction [Ethers.js](https://docs.ethers.org/v6/){target=\_blank} is a lightweight library that enables interaction with Ethereum Virtual Machine (EVM)-compatible blockchains through JavaScript. Ethers is widely used as a toolkit to establish connections and read and write blockchain data. This article demonstrates using Ethers.js to interact with and deploy smart contracts on Polkadot Hub. This guide is intended for developers who are familiar with JavaScript and want to interact with Polkadot Hub using Ethers.js. ## Prerequisites Before getting started, ensure you have the following installed: - **Node.js** - v22.13.1 or later, check the [Node.js installation guide](https://nodejs.org/en/download/current/){target=\_blank} - **npm** - v6.13.4 or later (comes bundled with Node.js) - **Solidity** - this guide uses Solidity `^0.8.9` for smart contract development ## Project Structure This project organizes contracts, scripts, and compiled artifacts for easy development and deployment. ```text title="Ethers.js Polkadot Hub" ethers-project ├── contracts │ ├── Storage.sol ├── scripts │ ├── connectToProvider.js │ ├── fetchLastBlock.js │ ├── compile.js │ ├── deploy.js │ ├── checkStorage.js ├── abis │ ├── Storage.json ├── artifacts │ ├── Storage.polkavm ├── contract-address.json ├── node_modules/ ├── package.json ├── package-lock.json └── README.md ``` ## Set Up the Project To start working with Ethers.js, create a new folder and initialize your project by running the following commands in your terminal: ```bash mkdir ethers-project cd ethers-project npm init -y ``` ## Install Dependencies Next, run the following command to install the Ethers.js library: ```bash npm install ethers ``` ## Set Up the Ethers.js Provider A [`Provider`](https://docs.ethers.org/v6/api/providers/#Provider){target=\_blank} is an abstraction of a connection to the Ethereum network, allowing you to query blockchain data and send transactions. It serves as a bridge between your application and the blockchain. To interact with Polkadot Hub, you must set up an Ethers.js provider. 
This provider connects to a blockchain node, allowing you to query blockchain data and interact with smart contracts. In the root of your project, create a file named `connectToProvider.js` and add the following code: ```js title="scripts/connectToProvider.js" const { JsonRpcProvider } = require('ethers'); const createProvider = (rpcUrl, chainId, chainName) => { const provider = new JsonRpcProvider(rpcUrl, { chainId: chainId, name: chainName, }); return provider; }; const PROVIDER_RPC = { rpc: 'INSERT_RPC_URL', chainId: 'INSERT_CHAIN_ID', name: 'INSERT_CHAIN_NAME', }; createProvider(PROVIDER_RPC.rpc, PROVIDER_RPC.chainId, PROVIDER_RPC.name); ``` !!! note Replace `INSERT_RPC_URL`, `INSERT_CHAIN_ID`, and `INSERT_CHAIN_NAME` with the appropriate values. For example, to connect to Polkadot Hub TestNet's Ethereum RPC instance, you can use the following parameters: ```js const PROVIDER_RPC = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet' }; ``` To connect to the provider, execute: ```bash node connectToProvider ``` With the provider set up, you can start querying the blockchain. For instance, to fetch the latest block number: ??? code "Fetch Last Block code" ```js title="scripts/fetchLastBlock.js" const { JsonRpcProvider } = require('ethers'); const createProvider = (rpcUrl, chainId, chainName) => { const provider = new JsonRpcProvider(rpcUrl, { chainId: chainId, name: chainName, }); return provider; }; const PROVIDER_RPC = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet', }; const main = async () => { try { const provider = createProvider( PROVIDER_RPC.rpc, PROVIDER_RPC.chainId, PROVIDER_RPC.name, ); const latestBlock = await provider.getBlockNumber(); console.log(`Latest block: ${latestBlock}`); } catch (error) { console.error('Error connecting to Polkadot Hub TestNet: ' + error.message); } }; main(); ``` ## Compile Contracts !!! note "Contracts Code Blob Size Disclaimer" The maximum contract code blob size on Polkadot Hub networks is _100 kilobytes_, significantly larger than Ethereum’s EVM limit of 24 kilobytes. For detailed comparisons and migration guidelines, see the [EVM vs. PolkaVM](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/#current-memory-limits){target=\_blank} documentation page. The `revive` compiler transforms Solidity smart contracts into [PolkaVM](/develop/smart-contracts/overview#native-smart-contracts){target=\_blank} bytecode for deployment on Polkadot Hub. Revive's Ethereum RPC interface allows you to use familiar tools like Ethers.js and MetaMask to interact with contracts. ### Install the Revive Library The [`@parity/resolc`](https://www.npmjs.com/package/@parity/resolc){target=\_blank} library will compile your Solidity code for deployment on Polkadot Hub. Run the following command in your terminal to install the library: ```bash npm install --save-dev @parity/resolc ``` This guide uses `@parity/resolc` version `{{ dependencies.javascript_packages.resolc.version }}`. ### Sample Storage Smart Contract This example demonstrates compiling a `Storage.sol` Solidity contract for deployment to Polkadot Hub. The contract's functionality stores a number and permits users to update it with a new value. ```solidity title="contracts/Storage.sol" //SPDX-License-Identifier: MIT // Solidity files have to start with this pragma. // It will be used by the Solidity compiler to validate its version. 
pragma solidity ^0.8.9; contract Storage { // Public state variable to store a number uint256 public storedNumber; /** * Updates the stored number. * * The `public` modifier allows anyone to call this function. * * @param _newNumber - The new value to store. */ function setNumber(uint256 _newNumber) public { storedNumber = _newNumber; } } ``` ### Compile the Smart Contract To compile this contract, use the following script: ```js title="scripts/compile.js" const { compile } = require('@parity/resolc'); const { readFileSync, writeFileSync } = require('fs'); const { basename, join } = require('path'); const compileContract = async (solidityFilePath, outputDir) => { try { // Read the Solidity file const source = readFileSync(solidityFilePath, 'utf8'); // Construct the input object for the compiler const input = { [basename(solidityFilePath)]: { content: source }, }; console.log(`Compiling contract: ${basename(solidityFilePath)}...`); // Compile the contract const out = await compile(input); for (const contracts of Object.values(out.contracts)) { for (const [name, contract] of Object.entries(contracts)) { console.log(`Compiled contract: ${name}`); // Write the ABI const abiPath = join(outputDir, `${name}.json`); writeFileSync(abiPath, JSON.stringify(contract.abi, null, 2)); console.log(`ABI saved to ${abiPath}`); // Write the bytecode const bytecodePath = join(outputDir, `${name}.polkavm`); writeFileSync( bytecodePath, Buffer.from(contract.evm.bytecode.object, 'hex'), ); console.log(`Bytecode saved to ${bytecodePath}`); } } } catch (error) { console.error('Error compiling contracts:', error); } }; const solidityFilePath = join(__dirname, '../contracts/Storage.sol'); const outputDir = join(__dirname, '../contracts'); compileContract(solidityFilePath, outputDir); ``` !!! note The script above is tailored to the `Storage.sol` contract. It can be adjusted for other contracts by changing the file name or modifying the ABI and bytecode paths. The ABI (Application Binary Interface) is a JSON representation of your contract's functions, events, and their parameters. It serves as the interface between your JavaScript code and the deployed smart contract, allowing your application to know how to format function calls and interpret returned data. Execute the script above by running: ```bash node compile ``` After executing the script, the Solidity contract will be compiled into the required PolkaVM bytecode format. The ABI and bytecode will be saved into files with `.json` and `.polkavm` extensions, respectively. You can now proceed with deploying the contract to Polkadot Hub, as outlined in the next section. ## Deploy the Compiled Contract To deploy your compiled contract to Polkadot Hub, you'll need a wallet with a private key to sign the deployment transaction. You can create a `deploy.js` script in the root of your project to achieve this. The deployment script can be divided into key components: 1. Set up the required imports and utilities: ```js title="scripts/deploy.js" // Deploy an EVM-compatible smart contract using ethers.js const { writeFileSync, existsSync, readFileSync } = require('fs'); const { join } = require('path'); const { ethers, JsonRpcProvider } = require('ethers'); const codegenDir = join(__dirname); ``` 2. 
Create a provider to connect to Polkadot Hub: ```js title="scripts/deploy.js" // Creates an Ethereum provider with specified RPC URL and chain details const createProvider = (rpcUrl, chainId, chainName) => { const provider = new JsonRpcProvider(rpcUrl, { chainId: chainId, name: chainName, }); return provider; }; ``` 3. Set up functions to read contract artifacts: ```js title="scripts/deploy.js" // Reads and parses the ABI file for a given contract const getAbi = (contractName) => { try { return JSON.parse( readFileSync(join(codegenDir, `${contractName}.json`), 'utf8'), ); } catch (error) { console.error( `Could not find ABI for contract ${contractName}:`, error.message, ); throw error; } }; // Reads the compiled bytecode for a given contract const getByteCode = (contractName) => { try { const bytecodePath = join( codegenDir, '../contracts', `${contractName}.polkavm`, ); return `0x${readFileSync(bytecodePath).toString('hex')}`; } catch (error) { console.error( `Could not find bytecode for contract ${contractName}:`, error.message, ); throw error; } }; ``` 4. Create the main deployment function: ```js title="scripts/deploy.js" const deployContract = async (contractName, mnemonic, providerConfig) => { console.log(`Deploying ${contractName}...`); try { // Step 1: Set up provider and wallet const provider = createProvider( providerConfig.rpc, providerConfig.chainId, providerConfig.name, ); const walletMnemonic = ethers.Wallet.fromPhrase(mnemonic); const wallet = walletMnemonic.connect(provider); // Step 2: Create and deploy the contract const factory = new ethers.ContractFactory( getAbi(contractName), getByteCode(contractName), wallet, ); const contract = await factory.deploy(); await contract.waitForDeployment(); // Step 3: Save deployment information const address = await contract.getAddress(); console.log(`Contract ${contractName} deployed at: ${address}`); const addressesFile = join(codegenDir, 'contract-address.json'); const addresses = existsSync(addressesFile) ? JSON.parse(readFileSync(addressesFile, 'utf8')) : {}; addresses[contractName] = address; writeFileSync(addressesFile, JSON.stringify(addresses, null, 2), 'utf8'); } catch (error) { console.error(`Failed to deploy contract ${contractName}:`, error); } }; ``` 5. Configure and execute the deployment: ```js title="scripts/deploy.js" const providerConfig = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet', }; const mnemonic = 'INSERT_MNEMONIC'; deployContract('Storage', mnemonic, providerConfig); ``` !!! note A mnemonic (seed phrase) is a series of words that can generate multiple private keys and their corresponding addresses. It's used here to derive the wallet that will sign and pay for the deployment transaction. **Always keep your mnemonic secure and never share it publicly**. Ensure to replace the `INSERT_MNEMONIC` placeholder with your actual mnemonic. ??? 
code "View complete script" ```js title="scripts/deploy.js" // Deploy an EVM-compatible smart contract using ethers.js const { writeFileSync, existsSync, readFileSync } = require('fs'); const { join } = require('path'); const { ethers, JsonRpcProvider } = require('ethers'); const codegenDir = join(__dirname); // Creates an Ethereum provider with specified RPC URL and chain details const createProvider = (rpcUrl, chainId, chainName) => { const provider = new JsonRpcProvider(rpcUrl, { chainId: chainId, name: chainName, }); return provider; }; // Reads and parses the ABI file for a given contract const getAbi = (contractName) => { try { return JSON.parse( readFileSync(join(codegenDir, `${contractName}.json`), 'utf8'), ); } catch (error) { console.error( `Could not find ABI for contract ${contractName}:`, error.message, ); throw error; } }; // Reads the compiled bytecode for a given contract const getByteCode = (contractName) => { try { const bytecodePath = join( codegenDir, '../contracts', `${contractName}.polkavm`, ); return `0x${readFileSync(bytecodePath).toString('hex')}`; } catch (error) { console.error( `Could not find bytecode for contract ${contractName}:`, error.message, ); throw error; } }; const deployContract = async (contractName, mnemonic, providerConfig) => { console.log(`Deploying ${contractName}...`); try { // Step 1: Set up provider and wallet const provider = createProvider( providerConfig.rpc, providerConfig.chainId, providerConfig.name, ); const walletMnemonic = ethers.Wallet.fromPhrase(mnemonic); const wallet = walletMnemonic.connect(provider); // Step 2: Create and deploy the contract const factory = new ethers.ContractFactory( getAbi(contractName), getByteCode(contractName), wallet, ); const contract = await factory.deploy(); await contract.waitForDeployment(); // Step 3: Save deployment information const address = await contract.getAddress(); console.log(`Contract ${contractName} deployed at: ${address}`); const addressesFile = join(codegenDir, 'contract-address.json'); const addresses = existsSync(addressesFile) ? JSON.parse(readFileSync(addressesFile, 'utf8')) : {}; addresses[contractName] = address; writeFileSync(addressesFile, JSON.stringify(addresses, null, 2), 'utf8'); } catch (error) { console.error(`Failed to deploy contract ${contractName}:`, error); } }; const providerConfig = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet', }; const mnemonic = 'INSERT_MNEMONIC'; deployContract('Storage', mnemonic, providerConfig); ``` To run the script, execute the following command: ```bash node deploy ``` After running this script, your contract will be deployed to Polkadot Hub, and its address will be saved in `contract-address.json` within your project directory. You can use this address for future contract interactions. ## Interact with the Contract Once the contract is deployed, you can interact with it by calling its functions. 
For example, to set a number, read it back, and then update it to double that value, you can create a file named `checkStorage.js` in the root of your project and add the following code: ```js title="scripts/checkStorage.js" const { ethers } = require('ethers'); const { readFileSync } = require('fs'); const { join } = require('path'); const createProvider = (providerConfig) => { return new ethers.JsonRpcProvider(providerConfig.rpc, { chainId: providerConfig.chainId, name: providerConfig.name, }); }; const createWallet = (mnemonic, provider) => { return ethers.Wallet.fromPhrase(mnemonic).connect(provider); }; const loadContractAbi = (contractName, directory = __dirname) => { const contractPath = join(directory, `${contractName}.json`); const contractJson = JSON.parse(readFileSync(contractPath, 'utf8')); return contractJson.abi || contractJson; // Depending on JSON structure }; const createContract = (contractAddress, abi, wallet) => { return new ethers.Contract(contractAddress, abi, wallet); }; const interactWithStorageContract = async ( contractName, contractAddress, mnemonic, providerConfig, numberToSet, ) => { try { console.log(`Setting new number in Storage contract: ${numberToSet}`); // Create provider and wallet const provider = createProvider(providerConfig); const wallet = createWallet(mnemonic, provider); // Load the contract ABI and create the contract instance const abi = loadContractAbi(contractName); const contract = createContract(contractAddress, abi, wallet); // Send a transaction to set the stored number const tx1 = await contract.setNumber(numberToSet); await tx1.wait(); // Wait for the transaction to be mined console.log(`Number successfully set to ${numberToSet}`); // Retrieve the updated number const storedNumber = await contract.storedNumber(); console.log(`Retrieved stored number:`, storedNumber.toString()); // Send a transaction to double the stored number const tx2 = await contract.setNumber(numberToSet * 2); await tx2.wait(); // Wait for the transaction to be mined console.log(`Number successfully set to ${numberToSet * 2}`); // Retrieve the updated number const updatedNumber = await contract.storedNumber(); console.log(`Retrieved stored number:`, updatedNumber.toString()); } catch (error) { console.error('Error interacting with Storage contract:', error.message); } }; const providerConfig = { name: 'asset-hub-smart-contracts', rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, }; const mnemonic = 'INSERT_MNEMONIC'; const contractName = 'Storage'; const contractAddress = 'INSERT_CONTRACT_ADDRESS'; const newNumber = 42; interactWithStorageContract( contractName, contractAddress, mnemonic, providerConfig, newNumber, ); ``` Ensure you replace the `INSERT_MNEMONIC` and `INSERT_CONTRACT_ADDRESS` placeholders with actual values. Also, ensure the contract ABI file (`Storage.json`) is correctly referenced. 
To interact with the contract, run: ```bash node checkStorage ``` ## Where to Go Next Now that you have the foundational knowledge to use Ethers.js with Polkadot Hub, you can: - **Dive into Ethers.js utilities** - discover additional Ethers.js features, such as wallet management, signing messages, etc. - **Implement batch transactions** - use Ethers.js to execute batch transactions for efficient multi-step contract interactions - **Build scalable applications** - combine Ethers.js with frameworks like [`Next.js`](https://nextjs.org/docs){target=\_blank} or [`Node.js`](https://nodejs.org/en){target=\_blank} to create full-stack decentralized applications (dApps) --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/libraries/viem/ --- BEGIN CONTENT --- --- title: viem for Polkadot Hub Smart Contracts description: This guide covers deploying and interacting with contracts on Polkadot Hub using viem, a TypeScript library for Ethereum-compatible chains. categories: Smart Contracts, Tooling --- # viem !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction [viem](https://viem.sh/){target=\_blank} is a lightweight TypeScript library designed for interacting with Ethereum-compatible blockchains. This comprehensive guide will walk you through using viem to interact with and deploy smart contracts to Polkadot Hub. ## Prerequisites Before getting started, ensure you have the following installed: - **Node.js** - v22.13.1 or later, check the [Node.js installation guide](https://nodejs.org/en/download/current/){target=\_blank} - **npm** - v6.13.4 or later (comes bundled with Node.js) - **Solidity** - this guide uses Solidity `^0.8.9` for smart contract development ## Project Structure This project organizes contracts, scripts, and compiled artifacts for easy development and deployment. ```text viem-project/ β”œβ”€β”€ package.json β”œβ”€β”€ tsconfig.json β”œβ”€β”€ src/ β”‚ β”œβ”€β”€ chainConfig.ts β”‚ β”œβ”€β”€ createClient.ts β”‚ β”œβ”€β”€ createWallet.ts β”‚ β”œβ”€β”€ compile.ts β”‚ β”œβ”€β”€ deploy.ts β”‚ └── interact.ts β”œβ”€β”€ contracts/ β”‚ └── Storage.sol └── artifacts/ β”œβ”€β”€ Storage.json └── Storage.polkavm ``` ## Set Up the Project First, create a new folder and initialize your project: ```bash mkdir viem-project cd viem-project npm init -y ``` ## Install Dependencies Install viem along with other necessary dependencies, including [@parity/resolc](https://www.npmjs.com/package/@parity/resolc){target=\_blank}, which enables you to compile smart contracts to [PolkaVM](/polkadot-protocol/smart-contract-basics/polkavm-design/#polkavm){target=\_blank} bytecode: ```bash # Install viem and resolc npm install viem @parity/resolc # Install TypeScript and development dependencies npm install --save-dev typescript ts-node @types/node ``` ## Initialize Project Initialize a TypeScript project by running the following command: ```bash npx tsc --init ``` Add the following scripts to your `package.json` file to enable running TypeScript files: ```json { "scripts": { "client": "ts-node src/createClient.ts", "compile": "ts-node src/compile.ts", "deploy": "ts-node src/deploy.ts", "interact": "ts-node src/interact.ts" } } ``` Create a directory for your TypeScript source files: ```bash mkdir src ``` ## Set Up the Chain Configuration The first step is to set up the chain configuration.
Create a new file at `src/chainConfig.ts`: ```typescript title="src/chainConfig.ts" import { http } from 'viem'; export const TRANSPORT = http('INSERT_RPC_URL'); // Configure the Polkadot Hub chain export const POLKADOT_HUB = { id: INSERT_CHAIN_ID, name: 'INSERT_CHAIN_NAME', network: 'INSERT_NETWORK_NAME', nativeCurrency: { decimals: INSERT_CHAIN_DECIMALS, name: 'INSERT_CURRENCY_NAME', symbol: 'INSERT_CURRENCY_SYMBOL', }, rpcUrls: { default: { http: ['INSERT_RPC_URL'], }, }, } as const; ``` Ensure to replace `INSERT_RPC_URL`, `INSERT_CHAIN_ID`, `INSERT_CHAIN_NAME`, `INSERT_NETWORK_NAME`, `INSERT_CHAIN_DECIMALS`, `INSERT_CURRENCY_NAME`, and `INSERT_CURRENCY_SYMBOL` with the proper values. Check the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot){target=\_blank} page for more information on the possible values. ## Set Up the viem Client To interact with the chain, you need to create a client that is used solely for reading data. To accomplish this, create a new file at `src/createClient.ts`: ```typescript title="src/createClient.ts" import { createPublicClient, createWalletClient, http } from 'viem'; const transport = http('INSERT_RPC_URL'); // Configure the Polkadot Hub chain const assetHub = { id: INSERT_CHAIN_ID, name: 'INSERT_CHAIN_NAME', network: 'INSERT_NETWORK_NAME', nativeCurrency: { decimals: INSERT_CHAIN_DECIMALS, name: 'INSERT_CURRENCY_NAME', symbol: 'INSERT_CURRENCY_SYMBOL', }, rpcUrls: { default: { http: ['INSERT_RPC_URL'], }, }, } as const; // Create a public client for reading data export const publicClient = createPublicClient({ chain: assetHub, transport, }); ``` After setting up the [Public Client](https://viem.sh/docs/clients/public#public-client){target=\_blank}, you can begin querying the blockchain. Here's an example of fetching the latest block number: ??? code "Fetch Last Block code" ```js title="src/fetchLastBlock.ts" import { createPublicClient, http } from 'viem'; const transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io'); // Configure the Polkadot Hub chain const polkadotHubTestnet = { id: 420420422, name: 'Polkadot Hub TestNet', network: 'polkadot-hub-testnet', nativeCurrency: { decimals: 18, name: 'PAS', symbol: 'PAS', }, rpcUrls: { default: { http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], }, }, } as const; // Create a public client for reading data export const publicClient = createPublicClient({ chain: polkadotHubTestnet, transport, }); const main = async () => { try { const block = await publicClient.getBlock(); console.log('Last block: ' + block.number.toString()); } catch (error: unknown) { console.error('Error connecting to Polkadot Hub TestNet: ' + error); } }; main(); ``` ## Set Up a Wallet In case you need to sign transactions, you will need to instantiate a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\_blank} object within your project. 
To do so, create `src/createWallet.ts`: ```typescript title="src/createWallet.ts" import { privateKeyToAccount } from 'viem/accounts'; import { createWalletClient, http } from 'viem'; const transport = http('INSERT_RPC_URL'); // Configure the Polkadot Hub chain const assetHub = { id: INSERT_CHAIN_ID, name: 'INSERT_CHAIN_NAME', network: 'INSERT_NETWORK_NAME', nativeCurrency: { decimals: INSERT_CHAIN_DECIMALS, name: 'INSERT_CURRENCY_NAME', symbol: 'INSERT_CURRENCY_SYMBOL', }, rpcUrls: { default: { http: ['INSERT_RPC_URL'], }, public: { http: ['INSERT_RPC_URL'], }, }, } as const; // Create a wallet client for writing data export const createWallet = (privateKey: `0x${string}`) => { const account = privateKeyToAccount(privateKey); return createWalletClient({ account, chain: assetHub, transport, }); }; ``` !!!note The wallet you import with your private key must have sufficient funds to pay for transaction fees when deploying contracts or interacting with them. Make sure to fund your wallet with the appropriate native tokens for the network you're connecting to. ## Sample Smart Contract This example demonstrates compiling a `Storage.sol` Solidity contract for deployment to Polkadot Hub. The contract's functionality stores a number and permits users to update it with a new value. ```bash mkdir contracts artifacts ``` You can use the following contract to interact with the blockchain. Paste the following contract in `contracts/Storage.sol`: ```solidity title="contracts/Storage.sol" //SPDX-License-Identifier: MIT // Solidity files have to start with this pragma. // It will be used by the Solidity compiler to validate its version. pragma solidity ^0.8.9; contract Storage { // Public state variable to store a number uint256 public storedNumber; /** * Updates the stored number. * * The `public` modifier allows anyone to call this function. * * @param _newNumber - The new value to store. */ function setNumber(uint256 _newNumber) public { storedNumber = _newNumber; } } ``` ## Compile the Contract !!! note "Contracts Code Blob Size Disclaimer" The maximum contract code blob size on Polkadot Hub networks is _100 kilobytes_, significantly larger than Ethereum’s EVM limit of 24 kilobytes. For detailed comparisons and migration guidelines, see the [EVM vs. PolkaVM](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/#current-memory-limits){target=\_blank} documentation page. 
Create a new file at `src/compile.ts` for handling contract compilation: ```typescript title="src/compile.ts" import { compile } from '@parity/resolc'; import { readFileSync, writeFileSync } from 'fs'; import { basename, join } from 'path'; const compileContract = async ( solidityFilePath: string, outputDir: string ): Promise<void> => { try { // Read the Solidity file const source: string = readFileSync(solidityFilePath, 'utf8'); // Construct the input object for the compiler const input: Record<string, { content: string }> = { [basename(solidityFilePath)]: { content: source }, }; console.log(`Compiling contract: ${basename(solidityFilePath)}...`); // Compile the contract const out = await compile(input); for (const contracts of Object.values(out.contracts)) { for (const [name, contract] of Object.entries(contracts)) { console.log(`Compiled contract: ${name}`); // Write the ABI const abiPath = join(outputDir, `${name}.json`); writeFileSync(abiPath, JSON.stringify(contract.abi, null, 2)); console.log(`ABI saved to ${abiPath}`); // Write the bytecode if ( contract.evm && contract.evm.bytecode && contract.evm.bytecode.object ) { const bytecodePath = join(outputDir, `${name}.polkavm`); writeFileSync( bytecodePath, Buffer.from(contract.evm.bytecode.object, 'hex') ); console.log(`Bytecode saved to ${bytecodePath}`); } else { console.warn(`No bytecode found for contract: ${name}`); } } } } catch (error) { console.error('Error compiling contracts:', error); } }; const solidityFilePath: string = './contracts/Storage.sol'; const outputDir: string = './artifacts/'; compileContract(solidityFilePath, outputDir); ``` To compile your contract: ```bash npm run compile ``` After executing this script, you will see the compilation results including the generated `Storage.json` (containing the contract's ABI) and `Storage.polkavm` (containing the compiled bytecode) files in the `artifacts` folder. These files contain all the necessary information for deploying and interacting with your smart contract on Polkadot Hub. ## Deploy the Contract Create a new file at `src/deploy.ts` for handling contract deployment: ```typescript title="src/deploy.ts" import { readFileSync } from 'fs'; import { join } from 'path'; import { createWallet } from './createWallet'; import { publicClient } from './createClient'; const deployContract = async ( contractName: string, privateKey: `0x${string}` ) => { try { console.log(`Deploying ${contractName}...`); // Read contract artifacts const abi = JSON.parse( readFileSync( join(__dirname, '../artifacts', `${contractName}.json`), 'utf8' ) ); const bytecode = `0x${readFileSync( join(__dirname, '../artifacts', `${contractName}.polkavm`) ).toString('hex')}` as `0x${string}`; // Create wallet const wallet = createWallet(privateKey); // Deploy contract const hash = await wallet.deployContract({ abi, bytecode, args: [], // Add constructor arguments if needed }); // Wait for deployment const receipt = await publicClient.waitForTransactionReceipt({ hash }); const contractAddress = receipt.contractAddress; console.log(`Contract deployed at: ${contractAddress}`); return contractAddress; } catch (error) { console.error('Deployment failed:', error); throw error; } }; const privateKey = 'INSERT_PRIVATE_KEY'; deployContract('Storage', privateKey); ``` Ensure to replace `INSERT_PRIVATE_KEY` with the proper value. For further details on private key exportation, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\_blank}. !!!
warning Never commit or share your private key. Exposed keys can lead to immediate theft of all associated funds. Use environment variables instead. To deploy, run the following command: ```bash npm run deploy ``` If everything is successful, you will see the address of your deployed contract displayed in the terminal. This address is unique to your contract on the network you defined in the chain configuration, and you'll need it for any future interactions with your contract. ## Interact with the Contract Create a new file at `src/interact.ts` for interacting with your deployed contract: ```typescript title="src/interact.ts" import { publicClient } from './createClient'; import { createWallet } from './createWallet'; import { readFileSync } from 'fs'; const STORAGE_ABI = JSON.parse( readFileSync('./artifacts/Storage.json', 'utf8') ); const interactWithStorage = async ( contractAddress: `0x${string}`, privateKey: `0x${string}` ) => { try { const wallet = createWallet(privateKey); const currentNumber = await publicClient.readContract({ address: contractAddress, abi: STORAGE_ABI, functionName: 'storedNumber', args: [], }); console.log(`Stored number: ${currentNumber}`); const newNumber = BigInt(42); const { request } = await publicClient.simulateContract({ address: contractAddress, abi: STORAGE_ABI, functionName: 'setNumber', args: [newNumber], account: wallet.account, }); const hash = await wallet.writeContract(request); await publicClient.waitForTransactionReceipt({ hash }); console.log(`Number updated to ${newNumber}`); const updatedNumber = await publicClient.readContract({ address: contractAddress, abi: STORAGE_ABI, functionName: 'storedNumber', args: [], }); console.log('Updated stored number:', updatedNumber); } catch (error) { console.error('Interaction failed:', error); } }; const PRIVATE_KEY = 'INSERT_PRIVATE_KEY'; const CONTRACT_ADDRESS = 'INSERT_CONTRACT_ADDRESS'; interactWithStorage(CONTRACT_ADDRESS, PRIVATE_KEY); ``` Ensure to replace `INSERT_PRIVATE_KEY` and `INSERT_CONTRACT_ADDRESS` with the proper values. To interact with the contract: ```bash npm run interact ``` Following a successful interaction, you will see the stored value before and after the transaction. The output will show the initial stored number (0 if you haven't modified it yet), confirm when the transaction to set the number to 42 is complete, and then display the updated stored number value. This demonstrates both reading from and writing to your smart contract. ## Where to Go Next Now that you have the foundation for using viem with Polkadot Hub, consider exploring:
- External __Advanced viem Features__ --- Explore viem's documentation:
  • [:octicons-arrow-right-24: Multicall](https://viem.sh/docs/contract/multicall#multicall){target=\_blank}
  • [:octicons-arrow-right-24: Batch transactions](https://viem.sh/docs/clients/transports/http#batch-json-rpc){target=\_blank}
  • [:octicons-arrow-right-24: Custom actions](https://viem.sh/docs/clients/custom#extending-with-actions-or-configuration){target=\_blank}
- External __Test Frameworks__ --- Integrate viem with the following frameworks for comprehensive testing:
  • [:octicons-arrow-right-24: Hardhat](https://hardhat.org/){target=\_blank}
  • [:octicons-arrow-right-24: Foundry](https://book.getfoundry.sh/){target=\_blank}
- External __Event Handling__ --- Learn how to subscribe to and process contract events (see the sketch after this list):
  • [:octicons-arrow-right-24: Event subscription](https://viem.sh/docs/actions/public/watchEvent#watchevent){target=\_blank}
- External __Building dApps__ --- Combine viem with the following technologies to create full-stack applications:
  • [:octicons-arrow-right-24: Next.js](https://nextjs.org/docs){target=\_blank}
  • [:octicons-arrow-right-24: Node.js](https://nodejs.org/en){target=\_blank}
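As a companion to the Event Handling resources above, here is a minimal sketch of subscribing to contract logs with viem's `watchEvent` action. It reuses the `publicClient` from `src/createClient.ts`; the `src/watchEvents.ts` file name and the `INSERT_CONTRACT_ADDRESS` placeholder are illustrative assumptions rather than part of the project set up earlier, and since the `Storage` contract in this guide emits no events, point the watcher at a contract that does.

```typescript title="src/watchEvents.ts"
import { publicClient } from './createClient';

// Illustrative placeholder; replace with the address of a contract that emits events
const CONTRACT_ADDRESS = 'INSERT_CONTRACT_ADDRESS' as `0x${string}`;

// Subscribe to all logs emitted by the contract. viem polls the RPC endpoint
// and invokes onLogs whenever new logs are found.
const unwatch = publicClient.watchEvent({
  address: CONTRACT_ADDRESS,
  onLogs: (logs) => {
    for (const log of logs) {
      console.log('New log in tx:', log.transactionHash, 'topics:', log.topics);
    }
  },
  onError: (error) => console.error('Watcher error:', error),
});

// Stop watching after one minute
setTimeout(() => unwatch(), 60_000);
```

You could run this sketch with `npx ts-node src/watchEvents.ts`; adding a dedicated script to `package.json` follows the same pattern as the existing `deploy` and `interact` entries.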
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/libraries/wagmi/ --- BEGIN CONTENT --- --- title: Wagmi for Polkadot Hub Smart Contracts description: Learn how to use Wagmi React Hooks to fetch and interact with smart contracts on Polkadot Hub for seamless dApp integration. categories: Smart Contracts, Tooling --- # Wagmi !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction [Wagmi](https://wagmi.sh/){target=\_blank} is a collection of [React Hooks](https://wagmi.sh/react/api/hooks){target=\_blank} for interacting with Ethereum-compatible blockchains, focusing on developer experience, feature richness, and reliability. This guide demonstrates how to use Wagmi to interact with and deploy smart contracts to Polkadot Hub, providing a seamless frontend integration for your dApps. ## Set Up the Project To start working with Wagmi, create a new React project and initialize it by running the following commands in your terminal: ```bash # Create a new React project using Next.js npx create-next-app@latest wagmi-asset-hub cd wagmi-asset-hub ``` ## Install Dependencies Install Wagmi and its peer dependencies: ```bash # Install Wagmi and its dependencies npm install wagmi viem @tanstack/react-query ``` ## Configure Wagmi for Polkadot Hub Create a configuration file to initialize Wagmi with Polkadot Hub. In your project, create a file named `src/lib/wagmi.ts` and add the code below. Be sure to replace `INSERT_RPC_URL`, `INSERT_CHAIN_ID`, `INSERT_CHAIN_NAME`, `INSERT_NETWORK_NAME`, `INSERT_CHAIN_DECIMALS`, `INSERT_CURRENCY_NAME`, and `INSERT_CURRENCY_SYMBOL` with your specific values. ```typescript title="src/lib/wagmi.ts" import { http, createConfig } from 'wagmi' // Configure the Polkadot Hub chain const assetHub = { id: INSERT_CHAIN_ID, name: 'INSERT_CHAIN_NAME', network: 'INSERT_NETWORK_NAME', nativeCurrency: { decimals: INSERT_CHAIN_DECIMALS, name: 'INSERT_CURRENCY_NAME', symbol: 'INSERT_CURRENCY_SYMBOL', }, rpcUrls: { default: { http: ['INSERT_RPC_URL'], }, }, } as const; // Create Wagmi config export const config = createConfig({ chains: [assetHub], transports: { [assetHub.id]: http(), }, }) ``` ??? code "Example Polkadot Hub TestNet Configuration" ```typescript title="src/lib/wagmi.ts" import { http, createConfig } from 'wagmi'; // Configure the Polkadot Hub chain const assetHub = { id: 420420422, name: 'polkadot-hub-testnet', network: 'polkadot-hub-testnet', nativeCurrency: { decimals: 18, name: 'PAS', symbol: 'PAS', }, rpcUrls: { default: { http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], }, }, } as const; // Create wagmi config export const config = createConfig({ chains: [assetHub], transports: { [assetHub.id]: http(), }, }); ``` ## Set Up the Wagmi Provider To enable Wagmi in your React application, you need to wrap your app with the [`WagmiProvider`](https://wagmi.sh/react/api/WagmiProvider#wagmiprovider){target=\_blank}. 
Update your `app/layout.tsx` file (for Next.js app router) with the following code: ```typescript title="app/layout.tsx" // For app router (src/app/layout.tsx) "use client"; import { WagmiProvider } from "wagmi"; import { QueryClient, QueryClientProvider } from "@tanstack/react-query"; import { config } from "./lib/wagmi"; // Create a query client const queryClient = new QueryClient(); export default function RootLayout({ children, }: { children: React.ReactNode; }) { return ( {children} ); } ``` !!!note If you are using a Next.js pages router, you should modify the `src/pages/_app.tsx` instead. ## Connect a Wallet Create a component to connect wallets to your dApp. Create a file named `app/components/ConnectWallet.tsx`: ```typescript title="app/components/ConnectWallet.tsx" "use client"; import React from "react"; import { useConnect, useAccount, useDisconnect } from "wagmi"; import { injected } from "wagmi/connectors"; export function ConnectWallet() { const { connect } = useConnect(); const { address, isConnected } = useAccount(); const { disconnect } = useDisconnect(); if (isConnected) { return (
Connected to {address}
); } return ( ); } ``` This component uses the following React hooks: - [**`useConnect`**](https://wagmi.sh/react/api/hooks/useConnect#useconnect){target=\_blank} - provides functions and state for connecting the user's wallet to your dApp. The `connect` function initiates the connection flow with the specified connector - [**`useDisconnect`**](https://wagmi.sh/react/api/hooks/useDisconnect#usedisconnect){target=\_blank} - provides a function to disconnect the currently connected wallet - [**`useAccount`**](https://wagmi.sh/react/api/hooks/useAccount#useaccount){target=\_blank} - returns data about the connected account, including the address and connection status ## Fetch Blockchain Data Wagmi provides various hooks to fetch blockchain data. Here's an example component that demonstrates some of these hooks: ```typescript title="app/components/BlockchainInfo.tsx" "use client"; import { useBlockNumber, useBalance, useAccount } from "wagmi"; export function BlockchainInfo() { const { address } = useAccount(); // Get the latest block number const { data: blockNumber } = useBlockNumber({ watch: true }); // Get balance for the connected wallet const { data: balance } = useBalance({ address, }); return (

Blockchain Information

Current Block: {blockNumber?.toString() || "Loading..."}

{address && balance && (

Balance:{" "} {( BigInt(balance.value) / BigInt(10 ** balance.decimals) ).toLocaleString()}{" "} {balance.symbol}

)}
); } ``` This component uses the following React hooks: - [**`useBlockNumber`**](https://wagmi.sh/react/api/hooks/useBlockNumber#useBlockNumber){target=\_blank} - fetches the current block number of the connected chain. The `watch` parameter enables real-time updates when new blocks are mined - [**`useBalance`**](https://wagmi.sh/react/api/hooks/useBalance#useBalance){target=\_blank} - retrieves the native token balance for a specified address, including value, symbol, and decimals information ## Interact with Deployed Contract This guide uses a simple Storage contract already deployed to the Polkadot Hub TestNet (`0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`). The code of that contract is: ??? code "Storage.sol" ```solidity title="Storage.sol" //SPDX-License-Identifier: MIT // Solidity files have to start with this pragma. // It will be used by the Solidity compiler to validate its version. pragma solidity ^0.8.9; contract Storage { // Public state variable to store a number uint256 public storedNumber; /** * Updates the stored number. * * The `public` modifier allows anyone to call this function. * * @param _newNumber - The new value to store. */ function setNumber(uint256 _newNumber) public { storedNumber = _newNumber; } } ``` Create a component to interact with your deployed contract. Create a file named `app/components/StorageContract.tsx`: ```typescript title="app/components/StorageContract.tsx" "use client"; import { useState } from "react"; import { useReadContract, useWriteContract, useWaitForTransactionReceipt, } from "wagmi"; const CONTRACT_ADDRESS = "0xabBd46Ef74b88E8B1CDa49BeFb5057710443Fd29" as `0x${string}`; export function StorageContract() { const [number, setNumber] = useState("42"); // Contract ABI (should match your compiled contract) const abi = [ { inputs: [], name: "storedNumber", outputs: [{ internalType: "uint256", name: "", type: "uint256" }], stateMutability: "view", type: "function", }, { inputs: [ { internalType: "uint256", name: "_newNumber", type: "uint256" }, ], name: "setNumber", outputs: [], stateMutability: "nonpayable", type: "function", }, ]; // Read the current stored number const { data: storedNumber, refetch } = useReadContract({ address: CONTRACT_ADDRESS, abi, functionName: "storedNumber", }); // Write to the contract const { writeContract, data: hash, error, isPending } = useWriteContract(); // Wait for transaction to be mined const { isLoading: isConfirming, isSuccess: isConfirmed } = useWaitForTransactionReceipt({ hash, }); const handleSetNumber = () => { writeContract({ address: CONTRACT_ADDRESS, abi, functionName: "setNumber", args: [BigInt(number)], }); }; return (

Storage Contract Interaction

Contract Address: {CONTRACT_ADDRESS}

Current Stored Number: {storedNumber?.toString() || "Loading..."}

setNumber(e.target.value)} disabled={isPending || isConfirming} />
{error &&
Error: {error.message}
} {isConfirmed && (
Successfully updated!{" "}
)}
); } ``` This component demonstrates how to interact with a smart contract using Wagmi's hooks: - [**`useReadContract`**](https://wagmi.sh/react/api/hooks/useReadContract#useReadContract){target=\_blank} - calls a read-only function on your smart contract to retrieve data without modifying the blockchain state - [**`useWriteContract`**](https://wagmi.sh/react/api/hooks/useWriteContract#useWriteContract){target=\_blank} - calls a state-modifying function on your smart contract, which requires a transaction to be signed and sent - [**`useWaitForTransactionReceipt`**](https://wagmi.sh/react/api/hooks/useWaitForTransactionReceipt#useWaitForTransactionReceipt){target=\_blank} - tracks the status of a transaction after it's been submitted, allowing you to know when it's been confirmed The component also includes proper state handling to: - Show the current value stored in the contract - Allow users to input a new value - Display transaction status (pending, confirming, or completed) - Handle errors - Provide feedback when a transaction is successful ## Integrate Components Update your main page to combine all the components. Create or update the file `src/app/page.tsx`: ```typescript title="src/app/page.tsx" "use client"; import { BlockchainInfo } from "./components/BlockchainInfo"; import { ConnectWallet } from "./components/ConnectWallet"; import { StorageContract } from "./components/StorageContract"; import { useAccount } from "wagmi"; export default function Home() { const { isConnected } = useAccount(); return (

Wagmi - Polkadot Hub Smart Contracts

{isConnected ? : Connect your wallet} {isConnected ? : Connect your wallet}
); } ``` ## Where to Go Next Now that you have the foundational knowledge to use Wagmi with Polkadot Hub, consider exploring:
- External __Advanced Wagmi__ --- Explore Wagmi's advanced features:
  • [:octicons-arrow-right-24: Watch Contract Events](https://wagmi.sh/core/api/actions/watchContractEvent#eventname){target=\_blank}
  • [:octicons-arrow-right-24: Different Transports](https://wagmi.sh/react/api/transports){target=\_blank}
  • [:octicons-arrow-right-24: Actions](https://wagmi.sh/react/api/actions){target=\_blank}
- External __Wallet Integration__ --- Connect your dApp with popular wallet providers (see the sketch after this list):
  • [:octicons-arrow-right-24: MetaMask](https://wagmi.sh/core/api/connectors/metaMask){target=\_blank}
  • [:octicons-arrow-right-24: WalletConnect](https://wagmi.sh/core/api/connectors/walletConnect){target=\_blank}
  • [:octicons-arrow-right-24: Coinbase Wallet](https://wagmi.sh/core/api/connectors/coinbaseWallet){target=\_blank}
- External __Testing & Development__ --- Enhance your development workflow:
  • [:octicons-arrow-right-24: Test Suite](https://wagmi.sh/dev/contributing#_6-running-the-test-suite){target=\_blank}
  • [:octicons-arrow-right-24: Dev Playground](https://wagmi.sh/dev/contributing#_5-running-the-dev-playgrounds){target=\_blank}
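To complement the wallet integration links above, the following is a minimal sketch of how the `createConfig` call in `src/lib/wagmi.ts` could be extended with additional connectors. The chain values reuse the example Polkadot Hub TestNet configuration shown earlier; the `INSERT_WALLETCONNECT_PROJECT_ID` placeholder is an assumption for illustration, since WalletConnect requires a project ID that you obtain separately.

```typescript title="src/lib/wagmi.ts (connectors sketch)"
import { http, createConfig } from 'wagmi';
import { injected, metaMask, walletConnect } from 'wagmi/connectors';

// Same example chain definition used earlier in this guide
const assetHub = {
  id: 420420422,
  name: 'polkadot-hub-testnet',
  network: 'polkadot-hub-testnet',
  nativeCurrency: { decimals: 18, name: 'PAS', symbol: 'PAS' },
  rpcUrls: {
    default: { http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'] },
  },
} as const;

// Register several connectors so users can pick their preferred wallet
export const config = createConfig({
  chains: [assetHub],
  connectors: [
    injected(),
    metaMask(),
    walletConnect({ projectId: 'INSERT_WALLETCONNECT_PROJECT_ID' }),
  ],
  transports: {
    [assetHub.id]: http(),
  },
});
```

With connectors registered in the config, the `useConnect` hook from the earlier `ConnectWallet` component can offer each of them as a connection option.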
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/libraries/web3-js/ --- BEGIN CONTENT --- --- title: Web3.js description: Learn how to interact with Polkadot Hub using Web3.js, deploying Solidity contracts, and interacting with deployed smart contracts. categories: Smart Contracts, Tooling --- # Web3.js !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. !!! warning Web3.js has been [sunset](https://blog.chainsafe.io/web3-js-sunset/){target=\_blank}. You can find guides on using [Ethers.js](/develop/smart-contracts/libraries/ethers-js){target=\_blank} and [viem](/develop/smart-contracts/libraries/viem){target=\_blank} in the [Libraries](/develop/smart-contracts/libraries/){target=\_blank} section. ## Introduction Interacting with blockchains typically requires an interface between your application and the network. [Web3.js](https://web3js.readthedocs.io/){target=\_blank} offers this interface through a comprehensive collection of libraries, facilitating seamless interaction with the nodes using HTTP or WebSocket protocols. This guide illustrates how to utilize Web3.js specifically for interactions with Polkadot Hub. This guide is intended for developers who are familiar with JavaScript and want to interact with the Polkadot Hub using Web3.js. ## Prerequisites Before getting started, ensure you have the following installed: - **Node.js** - v22.13.1 or later, check the [Node.js installation guide](https://nodejs.org/en/download/current/){target=\_blank} - **npm** - v6.13.4 or later (comes bundled with Node.js) - **Solidity** - this guide uses Solidity `^0.8.9` for smart contract development ## Project Structure This project organizes contracts, scripts, and compiled artifacts for easy development and deployment. ```text title="Web3.js Polkadot Hub" web3js-project β”œβ”€β”€ contracts β”‚ β”œβ”€β”€ Storage.sol β”œβ”€β”€ scripts β”‚ β”œβ”€β”€ connectToProvider.js β”‚ β”œβ”€β”€ fetchLastBlock.js β”‚ β”œβ”€β”€ compile.js β”‚ β”œβ”€β”€ deploy.js β”‚ β”œβ”€β”€ updateStorage.js β”œβ”€β”€ abis β”‚ β”œβ”€β”€ Storage.json β”œβ”€β”€ artifacts β”‚ β”œβ”€β”€ Storage.polkavm β”œβ”€β”€ node_modules/ β”œβ”€β”€ package.json β”œβ”€β”€ package-lock.json └── README.md ``` ## Set Up the Project To start working with Web3.js, begin by initializing your project: ```bash npm init -y ``` ## Install Dependencies Next, install the Web3.js library: ```bash npm install web3 ``` This guide uses `web3` version `{{ dependencies.javascript_packages.web3_js.version }}`. ## Set Up the Web3 Provider The provider configuration is the foundation of any Web3.js application. The following example establishes a connection to Polkadot Hub. To use the example script, replace `INSERT_RPC_URL`, `INSERT_CHAIN_ID`, and `INSERT_CHAIN_NAME` with the appropriate values. 
The provider connection script should look something like this: ```javascript title="scripts/connectToProvider.js" const { Web3 } = require('web3'); const createProvider = (rpcUrl) => { const web3 = new Web3(rpcUrl); return web3; }; const PROVIDER_RPC = { rpc: 'INSERT_RPC_URL', chainId: 'INSERT_CHAIN_ID', name: 'INSERT_CHAIN_NAME', }; createProvider(PROVIDER_RPC.rpc); ``` For example, for the Polkadot Hub TestNet, use these specific connection parameters: ```js const PROVIDER_RPC = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet' }; ``` With the Web3 provider set up, you can start querying the blockchain. For instance, to fetch the latest block number of the chain, you can use the following code snippet: ???+ code "View complete script" ```javascript title="scripts/fetchLastBlock.js" const { Web3 } = require('web3'); const createProvider = (rpcUrl) => { const web3 = new Web3(rpcUrl); return web3; }; const PROVIDER_RPC = { rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io', chainId: 420420422, name: 'polkadot-hub-testnet', }; const main = async () => { try { const web3 = createProvider(PROVIDER_RPC.rpc); const latestBlock = await web3.eth.getBlockNumber(); console.log('Last block: ' + latestBlock); } catch (error) { console.error('Error connecting to Polkadot Hub TestNet: ' + error.message); } }; main(); ``` ## Compile Contracts !!! note "Contracts Code Blob Size Disclaimer" The maximum contract code blob size on Polkadot Hub networks is _100 kilobytes_, significantly larger than Ethereum’s EVM limit of 24 kilobytes. For detailed comparisons and migration guidelines, see the [EVM vs. PolkaVM](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/#current-memory-limits){target=\_blank} documentation page. Polkadot Hub requires contracts to be compiled to [PolkaVM](/polkadot-protocol/smart-contract-basics/polkavm-design/){target=\_blank} bytecode. This is achieved using the [`revive`](https://github.com/paritytech/revive/tree/v0.2.0/js/resolc){target=\_blank} compiler. Install the [`@parity/resolc`](https://github.com/paritytech/revive){target=\_blank} library as a development dependency: ```bash npm install --save-dev @parity/resolc ``` This guide uses `@parity/resolc` version `{{ dependencies.javascript_packages.resolc.version }}`. Here's a simple storage contract that you can use to follow the process: ```solidity title="contracts/Storage.sol" //SPDX-License-Identifier: MIT pragma solidity ^0.8.9; contract Storage { // Public state variable to store a number uint256 public storedNumber; /** * Updates the stored number. * * The `public` modifier allows anyone to call this function. * * @param _newNumber - The new value to store. 
*/ function setNumber(uint256 _newNumber) public { storedNumber = _newNumber; } } ``` With that, you can now create a `compile.js` snippet that transforms your solidity code into PolkaVM bytecode: ```javascript title="scripts/compile.js" const { compile } = require('@parity/resolc'); const { readFileSync, writeFileSync } = require('fs'); const { basename, join } = require('path'); const compileContract = async (solidityFilePath, outputDir) => { try { // Read the Solidity file const source = readFileSync(solidityFilePath, 'utf8'); // Construct the input object for the compiler const input = { [basename(solidityFilePath)]: { content: source }, }; console.log(`Compiling contract: ${basename(solidityFilePath)}...`); // Compile the contract const out = await compile(input); for (const contracts of Object.values(out.contracts)) { for (const [name, contract] of Object.entries(contracts)) { console.log(`Compiled contract: ${name}`); // Write the ABI const abiPath = join(outputDir, `${name}.json`); writeFileSync(abiPath, JSON.stringify(contract.abi, null, 2)); console.log(`ABI saved to ${abiPath}`); // Write the bytecode const bytecodePath = join(outputDir, `${name}.polkavm`); writeFileSync( bytecodePath, Buffer.from(contract.evm.bytecode.object, 'hex'), ); console.log(`Bytecode saved to ${bytecodePath}`); } } } catch (error) { console.error('Error compiling contracts:', error); } }; const solidityFilePath = './Storage.sol'; const outputDir = '.'; compileContract(solidityFilePath, outputDir); ``` To compile your contract, simply run the following command: ```bash node compile ``` After compilation, you'll have two key files: an ABI (`.json`) file, which provides a JSON interface describing the contract's functions and how to interact with it, and a bytecode (`.polkavm`) file, which contains the low-level machine code executable on PolkaVM that represents the compiled smart contract ready for blockchain deployment. ## Contract Deployment To deploy your compiled contract to Polkadot Hub using Web3.js, you'll need an account with a private key to sign the deployment transaction. The deployment process is exactly the same as for any Ethereum-compatible chain, involving creating a contract instance, estimating gas, and sending a deployment transaction. 
Here's how to deploy the contract, ensure replacing the `INSERT_RPC_URL`, `INSERT_PRIVATE_KEY`, and `INSERT_CONTRACT_NAME` with the appropriate values: ```javascript title="scripts/deploy.js" import { readFileSync } from 'fs'; import { Web3 } from 'web3'; const getAbi = (contractName) => { try { return JSON.parse(readFileSync(`${contractName}.json`), 'utf8'); } catch (error) { console.error( `❌ Could not find ABI for contract ${contractName}:`, error.message ); throw error; } }; const getByteCode = (contractName) => { try { return `0x${readFileSync(`${contractName}.polkavm`).toString('hex')}`; } catch (error) { console.error( `❌ Could not find bytecode for contract ${contractName}:`, error.message ); throw error; } }; export const deploy = async (config) => { try { // Initialize Web3 with RPC URL const web3 = new Web3(config.rpcUrl); // Prepare account const account = web3.eth.accounts.privateKeyToAccount(config.privateKey); web3.eth.accounts.wallet.add(account); // Load abi const abi = getAbi('Storage'); // Create contract instance const contract = new web3.eth.Contract(abi); // Prepare deployment const deployTransaction = contract.deploy({ data: getByteCode('Storage'), arguments: [], // Add constructor arguments if needed }); // Estimate gas const gasEstimate = await deployTransaction.estimateGas({ from: account.address, }); // Get current gas price const gasPrice = await web3.eth.getGasPrice(); // Send deployment transaction const deployedContract = await deployTransaction.send({ from: account.address, gas: gasEstimate, gasPrice: gasPrice, }); // Log and return contract details console.log(`Contract deployed at: ${deployedContract.options.address}`); return deployedContract; } catch (error) { console.error('Deployment failed:', error); throw error; } }; // Example usage const deploymentConfig = { rpcUrl: 'INSERT_RPC_URL', privateKey: 'INSERT_PRIVATE_KEY', contractName: 'INSERT_CONTRACT_NAME', }; deploy(deploymentConfig) .then((contract) => console.log('Deployment successful')) .catch((error) => console.error('Deployment error')); ``` For further details on private key exportation, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\_blank}. To deploy your contract, run the following command: ```bash node deploy ``` ## Interact with the Contract Once deployed, you can interact with your contract using Web3.js methods. 
Here's how to set a number and read it back, ensure replacing `INSERT_RPC_URL`, `INSERT_PRIVATE_KEY`, and `INSERT_CONTRACT_ADDRESS` with the appropriate values: ```javascript title="scripts/updateStorage.js" import { readFileSync } from 'fs'; import { Web3 } from 'web3'; const getAbi = (contractName) => { try { return JSON.parse(readFileSync(`${contractName}.json`), 'utf8'); } catch (error) { console.error( `❌ Could not find ABI for contract ${contractName}:`, error.message ); throw error; } }; const updateStorage = async (config) => { try { // Initialize Web3 with RPC URL const web3 = new Web3(config.rpcUrl); // Prepare account const account = web3.eth.accounts.privateKeyToAccount(config.privateKey); web3.eth.accounts.wallet.add(account); // Load abi const abi = getAbi('Storage'); // Create contract instance const contract = new web3.eth.Contract(abi, config.contractAddress); // Get initial value const initialValue = await contract.methods.storedNumber().call(); console.log('Current stored value:', initialValue); // Prepare transaction const updateTransaction = contract.methods.setNumber(1); // Estimate gas const gasEstimate = await updateTransaction.estimateGas({ from: account.address, }); // Get current gas price const gasPrice = await web3.eth.getGasPrice(); // Send update transaction const receipt = await updateTransaction.send({ from: account.address, gas: gasEstimate, gasPrice: gasPrice, }); // Log transaction details console.log(`Transaction hash: ${receipt.transactionHash}`); // Get updated value const newValue = await contract.methods.storedNumber().call(); console.log('New stored value:', newValue); return receipt; } catch (error) { console.error('Update failed:', error); throw error; } }; // Example usage const config = { rpcUrl: 'INSERT_RPC_URL', privateKey: 'INSERT_PRIVATE_KEY', contractAddress: 'INSERT_CONTRACT_ADDRESS', }; updateStorage(config) .then((receipt) => console.log('Update successful')) .catch((error) => console.error('Update error')); ``` To execute the logic above, run: ```bash node updateStorage ``` ## Where to Go Next Now that you’ve learned how to use Web3.js with Polkadot Hub, explore more advanced topics: - Utilize Web3.js utilities – learn about additional [Web3.js](https://docs.web3js.org/){target=\_blank} features such as signing transactions, managing wallets, and subscribing to events - Build full-stack dApps – [integrate Web3.js](https://docs.web3js.org/guides/dapps/intermediate-dapp){target=\_blank} with different libraries and frameworks to build decentralized web applications --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/libraries/web3-py/ --- BEGIN CONTENT --- --- title: Web3.py description: Learn how to interact with Polkadot Hub using the Web3 python library, deploying Solidity contracts, and interacting with deployed smart contracts. categories: Smart Contracts, Tooling --- # Web3.py !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Interacting with blockchains typically requires an interface between your application and the network. [Web3.py](https://web3py.readthedocs.io/en/stable/index.html){target=\_blank} offers this interface through a collection of libraries, facilitating seamless interaction with the nodes using HTTP or WebSocket protocols. This guide illustrates how to utilize Web3.py for interactions with Polkadot Hub. ## Set Up the Project 1. 
To start working with Web3.py, begin by initializing your project: ``` mkdir web3py-project cd web3py-project ``` 2. Create and activate a virtual environment for your project: ``` python -m venv venv source venv/bin/activate ``` 3. Next, install the Web3.py library: ``` pip install web3 ``` ## Set Up the Web3 Provider The [provider](https://web3py.readthedocs.io/en/stable/providers.html){target=\_blank} configuration is the foundation of any Web3.py application. The following example establishes a connection to Polkadot Hub. Follow these steps to use the provider configuration: 1. Replace `INSERT_RPC_URL` with the appropriate value. For instance, to connect to Polkadot Hub TestNet, use the following parameter: ```python PROVIDER_RPC = 'https://testnet-passet-hub-eth-rpc.polkadot.io' ``` The provider connection script should look something like this: ```python title="connect_to_provider.py" from web3 import Web3 def create_provider(rpc_url): web3 = Web3(Web3.HTTPProvider(rpc_url)) return web3 PROVIDER_RPC = 'INSERT_RPC_URL' create_provider(PROVIDER_RPC) ``` 1. With the Web3 provider set up, start querying the blockchain. For instance, you can use the following code snippet to fetch the latest block number of the chain: ```python title="fetch_last_block.py" def main(): try: web3 = create_provider(PROVIDER_RPC) latest_block = web3.eth.block_number print('Last block: ' + str(latest_block)) except Exception as error: print('Error connecting to Polkadot Hub TestNet: ' + str(error)) if __name__ == "__main__": main() ``` ??? code "View complete script" ```python title="fetch_last_block.py" from web3 import Web3 def create_provider(rpc_url): web3 = Web3(Web3.HTTPProvider(rpc_url)) return web3 PROVIDER_RPC = 'https://testnet-passet-hub-eth-rpc.polkadot.io' def main(): try: web3 = create_provider(PROVIDER_RPC) latest_block = web3.eth.block_number print('Last block: ' + str(latest_block)) except Exception as error: print('Error connecting to Polkadot Hub TestNet: ' + str(error)) if __name__ == "__main__": main() ``` ## Contract Deployment Before deploying your contracts, make sure you've compiled them and obtained two key files: - An ABI (.json) file, which provides a JSON interface describing the contract's functions and how to interact with it - A bytecode (.polkavm) file, which contains the low-level machine code executable on [PolkaVM](/polkadot-protocol/smart-contract-basics/polkavm-design#polkavm){target=\_blank} that represents the compiled smart contract ready for blockchain deployment To follow this guide, you can use the following solidity contract as an example: ```solidity title="Storage.sol" //SPDX-License-Identifier: MIT // Solidity files have to start with this pragma. // It will be used by the Solidity compiler to validate its version. pragma solidity ^0.8.9; contract Storage { // Public state variable to store a number uint256 public storedNumber; /** * Updates the stored number. * * The `public` modifier allows anyone to call this function. * * @param _newNumber - The new value to store. */ function setNumber(uint256 _newNumber) public { storedNumber = _newNumber; } } ``` To deploy your compiled contract to Polkadot Hub using Web3.py, you'll need an account with a private key to sign the deployment transaction. The deployment process is exactly the same as for any Ethereum-compatible chain, involving creating a contract instance, estimating gas, and sending a deployment transaction. Here's how to deploy the contract. 
ReplaceΒ `INSERT_RPC_URL`Β andΒ `INSERT_PRIVATE_KEY` with the appropriate values: ```python title="deploy.py" from web3 import Web3 import json def get_abi(contract_name): try: with open(f"{contract_name}.json", 'r') as file: return json.load(file) except Exception as error: print(f"❌ Could not find ABI for contract {contract_name}: {error}") raise error def get_bytecode(contract_name): try: with open(f"{contract_name}.polkavm", 'rb') as file: return '0x' + file.read().hex() except Exception as error: print(f"❌ Could not find bytecode for contract {contract_name}: {error}") raise error async def deploy(config): try: # Initialize Web3 with RPC URL web3 = Web3(Web3.HTTPProvider(config["rpc_url"])) # Prepare account account = web3.eth.account.from_key(config["private_key"]) print(f"address: {account.address}") # Load ABI abi = get_abi('Storage') # Create contract instance contract = web3.eth.contract(abi=abi, bytecode=get_bytecode('Storage')) # Get current nonce nonce = web3.eth.get_transaction_count(account.address) # Prepare deployment transaction transaction = { 'from': account.address, 'nonce': nonce, } # Build and sign transaction construct_txn = contract.constructor().build_transaction(transaction) signed_txn = web3.eth.account.sign_transaction(construct_txn, private_key=config["private_key"]) # Send transaction tx_hash = web3.eth.send_raw_transaction(signed_txn.raw_transaction) print(f"Transaction hash: {tx_hash.hex()}") # Wait for transaction receipt tx_receipt = web3.eth.wait_for_transaction_receipt(tx_hash) contract_address = tx_receipt.contractAddress # Log and return contract details print(f"Contract deployed at: {contract_address}") return web3.eth.contract(address=contract_address, abi=abi) except Exception as error: print('Deployment failed:', error) raise error if __name__ == "__main__": # Example usage import asyncio deployment_config = { "rpc_url": "INSERT_RPC_URL", "private_key": "INSERT_PRIVATE_KEY", } asyncio.run(deploy(deployment_config)) ``` !!!warning Never commit or share your private key. Exposed keys can lead to immediate theft of all associated funds. Use environment variables instead. ## Interact with the Contract After deployment, interact with your contract using Web3.py methods. The example below demonstrates how to set and retrieve a number. 
Be sure to replace the `INSERT_RPC_URL`, `INSERT_PRIVATE_KEY`, and `INSERT_CONTRACT_ADDRESS` placeholders with your specific values: ```python title="update_storage.py" from web3 import Web3 import json def get_abi(contract_name): try: with open(f"{contract_name}.json", 'r') as file: return json.load(file) except Exception as error: print(f"❌ Could not find ABI for contract {contract_name}: {error}") raise error async def update_storage(config): try: # Initialize Web3 with RPC URL web3 = Web3(Web3.HTTPProvider(config["rpc_url"])) # Prepare account account = web3.eth.account.from_key(config["private_key"]) # Load ABI abi = get_abi('Storage') # Create contract instance contract = web3.eth.contract(address=config["contract_address"], abi=abi) # Get initial value initial_value = contract.functions.storedNumber().call() print('Current stored value:', initial_value) # Get current nonce nonce = web3.eth.get_transaction_count(account.address) # Prepare transaction transaction = contract.functions.setNumber(1).build_transaction({ 'from': account.address, 'nonce': nonce }) # Sign transaction signed_txn = web3.eth.account.sign_transaction(transaction, private_key=config["private_key"]) # Send transaction tx_hash = web3.eth.send_raw_transaction(signed_txn.raw_transaction) print(f"Transaction hash: {tx_hash.hex()}") # Wait for receipt receipt = web3.eth.wait_for_transaction_receipt(tx_hash) # Get updated value new_value = contract.functions.storedNumber().call() print('New stored value:', new_value) return receipt except Exception as error: print('Update failed:', error) raise error if __name__ == "__main__": # Example usage import asyncio config = { "rpc_url": "INSERT_RPC_URL", "private_key": "INSERT_PRIVATE_KEY", "contract_address": "INSERT_CONTRACT_ADDRESS", } asyncio.run(update_storage(config)) ``` ## Where to Go Next Now that you have the foundation for using Web3.py with Polkadot Hub, consider exploring:
- External __Advanced Web3.py Features__ --- Explore Web3.py's documentation:
  • [:octicons-arrow-right-24: Middleware](https://web3py.readthedocs.io/en/stable/middleware.html){target=\_blank}
  • [:octicons-arrow-right-24: Filters & Events](https://web3py.readthedocs.io/en/stable/filters.html){target=\_blank}
  • [:octicons-arrow-right-24: ENS](https://web3py.readthedocs.io/en/stable/ens_overview.html){target=\_blank}
- External __Testing Frameworks__ --- Integrate Web3.py with Python testing frameworks:
  • [:octicons-arrow-right-24: Pytest](https://docs.pytest.org/){target=\_blank}
  • [:octicons-arrow-right-24: Brownie](https://eth-brownie.readthedocs.io/){target=\_blank}
- External __Transaction Management__ --- Learn advanced transaction handling:
  • [:octicons-arrow-right-24: Gas Strategies](https://web3py.readthedocs.io/en/stable/gas_price.html){target=\_blank}
  • [:octicons-arrow-right-24: Account Management](https://web3py.readthedocs.io/en/stable/web3.eth.account.html){target=\_blank}
- External __Building dApps__ --- Combine Web3.py with these frameworks to create full-stack applications:
  • [:octicons-arrow-right-24: Flask](https://flask.palletsprojects.com/){target=\_blank}
  • [:octicons-arrow-right-24: Django](https://www.djangoproject.com/){target=\_blank}
  • [:octicons-arrow-right-24: FastAPI](https://fastapi.tiangolo.com/){target=\_blank}
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/local-development-node/ --- BEGIN CONTENT --- --- title: Local Development Node description: Follow this step-by-step guide to install a Substrate node and ETH-RPC adapter for smart contract development in a local environment. categories: Smart Contracts --- # Local Development Node !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction A local development node provides an isolated blockchain environment where you can deploy, test, and debug smart contracts without incurring network fees or waiting for block confirmations. This guide demonstrates how to set up a local Polkadot SDK-based node with smart contract capabilities. By the end of this guide, you'll have: - A running Substrate node with smart contract support - An ETH-RPC adapter for Ethereum-compatible tooling integration accessible at `http://localhost:8545` ## Prerequisites Before getting started, ensure you have done the following: - Completed the [Install Polkadot SDK Dependencies](/develop/parachains/install-polkadot-sdk/){target=\_blank} guide and successfully installed [Rust](https://www.rust-lang.org/){target=\_blank} and the required packages to set up your development environment ## Install the Substrate Node and ETH-RPC Adapter The Polkadot SDK repository contains both the [Substrate node](https://github.com/paritytech/polkadot-sdk/tree/master/substrate/bin/node){target=\_blank} implementation and the [ETH-RPC adapter](https://github.com/paritytech/polkadot-sdk/tree/master/substrate/frame/revive/rpc){target=\_blank} required for Ethereum compatibility. Start by cloning the repository and navigating to the project directory: ```bash git clone -b {{dependencies.repositories.polkadot_sdk_contracts_node.version}} https://github.com/paritytech/polkadot-sdk.git cd polkadot-sdk ``` Next, you need to compile the two essential components for your development environment. The Substrate node provides the core blockchain runtime with smart contract support, while the ETH-RPC adapter enables Ethereum JSON-RPC compatibility for existing tooling: ```bash cargo build --bin substrate-node --release cargo build -p pallet-revive-eth-rpc --bin eth-rpc --release ``` The compilation process may take some time depending on your system specifications, potentially up to 30 minutes. Release builds are optimized for performance but take longer to compile than debug builds. After successful compilation, you can verify the binaries are available in the `target/release` directory: - **Substrate node path** - `polkadot-sdk/target/release/substrate-node` - **ETH-RPC adapter path** - `polkadot-sdk/target/release/eth-rpc` ## Run the Local Node With the binaries compiled, you can now start your local development environment. The setup requires running two processes. Start the Substrate node first, which will initialize a local blockchain with the `dev` chain specification. This configuration includes `pallet-revive` for smart contract functionality and uses pre-funded development accounts for testing: ```bash ./target/release/substrate-node --dev ``` The node will begin producing blocks immediately and display initialization logs:
./target/release/substrate-node --dev
2025-05-29 10:42:35 Substrate Node 2025-05-29 10:42:35 ✌️ version 3.0.0-dev-38b7581fc04 2025-05-29 10:42:35 ❀️ by Parity Technologies <admin@parity.io>, 2017-2025 2025-05-29 10:42:35 πŸ“‹ Chain specification: Development 2025-05-29 10:42:35 🏷 Node name: annoyed-aunt-3163 2025-05-29 10:42:35 πŸ‘€ Role: AUTHORITY 2025-05-29 10:42:35 πŸ’Ύ Database: RocksDb at /var/folders/x0/xl_kjddj3ql3bx7752yr09hc0000gn/T/substrate2P85EF/chains/dev/db/full 2025-05-29 10:42:40 πŸ”¨ Initializing Genesis block/state (state: 0xfc05…482e, header-hash: 0x1ae1…b8b4) 2025-05-29 10:42:40 Creating transaction pool txpool_type=SingleState ready=Limit { count: 8192, total_bytes: 20971520 } future=Limit { count: 819, total_bytes: 2097152 } 2025-05-29 10:42:40 πŸ‘΄ Loading GRANDPA authority set from genesis on what appears to be first startup. 2025-05-29 10:42:40 πŸ‘Ά Creating empty BABE epoch changes on what appears to be first startup. 2025-05-29 10:42:40 Using default protocol ID "sup" because none is configured in the chain specs 2025-05-29 10:42:40 🏷 Local node identity is: 12D3KooWAH8fgJv3hce7Yv4yKG4YXQiRqESFu6755DBnfZQU8Znm 2025-05-29 10:42:40 Running libp2p network backend 2025-05-29 10:42:40 local_peer_id=12D3KooWAH8fgJv3hce7Yv4yKG4YXQiRqESFu6755DBnfZQU8Znm 2025-05-29 10:42:40 πŸ’» Operating system: macos 2025-05-29 10:42:40 πŸ’» CPU architecture: aarch64 2025-05-29 10:42:40 πŸ“¦ Highest known block at #0 2025-05-29 10:42:40 Error binding to '127.0.0.1:9615': Os { code: 48, kind: AddrInUse, message: "Address already in use" } 2025-05-29 10:42:40 Running JSON-RPC server: addr=127.0.0.1:63333,[::1]:63334 2025-05-29 10:42:40 🏁 CPU single core score: 1.24 GiBs, parallelism score: 1.08 GiBs with expected cores: 8 2025-05-29 10:42:40 🏁 Memory score: 49.42 GiBs 2025-05-29 10:42:40 🏁 Disk score (seq. writes): 1.91 GiBs 2025-05-29 10:42:40 🏁 Disk score (rand. writes): 529.02 MiBs 2025-05-29 10:42:40 πŸ‘Ά Starting BABE Authorship worker 2025-05-29 10:42:40 πŸ₯© BEEFY gadget waiting for BEEFY pallet to become available... 2025-05-29 10:42:40 Failed to trigger bootstrap: No known peers. 2025-05-29 10:42:42 πŸ™Œ Starting consensus session on top of parent 0x1ae19030b13592b5e6fd326f26efc7b31a4f588303d348ef89ae9ebca613b8b4 (#0) 2025-05-29 10:42:42 🎁 Prepared block for proposing at 1 (5 ms) hash: 0xe046f22307fba58a3bd0cc21b1a057843d4342da8876fd44aba206f124528df0; parent_hash: 0x1ae1…b8b4; end: NoMoreTransactions; extrinsics_count: 2 2025-05-29 10:42:42 πŸ”– Pre-sealed block for proposal at 1. Hash now 0xa88d36087e7bf8ee59c1b17e0003092accf131ff8353a620410d7283657ce36a, previously 0xe046f22307fba58a3bd0cc21b1a057843d4342da8876fd44aba206f124528df0. 2025-05-29 10:42:42 πŸ‘Ά New epoch 0 launching at block 0xa88d…e36a (block slot 582842054 >= start slot 582842054). 2025-05-29 10:42:42 πŸ‘Ά Next epoch starts at slot 582842254 2025-05-29 10:42:42 πŸ† Imported #1 (0x1ae1…b8b4 β†’ 0xa88d…e36a)
For debugging purposes or to monitor low-level operations, you can enable detailed logging by setting environment variables before running the command: ```bash RUST_LOG="error,evm=debug,sc_rpc_server=info,runtime::revive=debug" ./target/release/substrate-node --dev ``` Once the Substrate node is running, open a new terminal window and start the ETH-RPC adapter. This component translates Ethereum JSON-RPC calls into Substrate-compatible requests, allowing you to use familiar Ethereum tools like MetaMask, Hardhat, or Ethers.js: ```bash ./target/release/eth-rpc --dev ``` You should see logs indicating that the adapter is ready to accept connections:
./target/release/eth-rpc --dev
2025-05-29 10:48:48 Running in --dev mode, RPC CORS has been disabled. 2025-05-29 10:48:48 Running in --dev mode, RPC CORS has been disabled. 2025-05-29 10:48:48 🌐 Connecting to node at: ws://127.0.0.1:9944 ... 2025-05-29 10:48:48 🌟 Connected to node at: ws://127.0.0.1:9944 2025-05-29 10:48:48 πŸ’Ύ Using in-memory database, keeping only 256 blocks in memory 2025-05-29 10:48:48 〽️ Prometheus exporter started at 127.0.0.1:9616 2025-05-29 10:48:48 Running JSON-RPC server: addr=127.0.0.1:8545,[::1]:8545 2025-05-29 10:48:48 πŸ”Œ Subscribing to new blocks (BestBlocks) 2025-05-29 10:48:48 πŸ”Œ Subscribing to new blocks (FinalizedBlocks)
Similar to the Substrate node, you can enable detailed logging for the ETH-RPC adapter to troubleshoot issues: ```bash RUST_LOG="info,eth-rpc=debug" ./target/release/eth-rpc --dev ``` Your local development environment is now active and accessible at `http://localhost:8545`. This endpoint accepts standard Ethereum JSON-RPC requests, enabling seamless integration with existing Ethereum development tools and workflows. You can connect wallets, deploy contracts using Remix or Hardhat, and interact with your smart contracts as you would on any Ethereum-compatible network. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/overview/ --- BEGIN CONTENT --- --- title: Smart Contracts Overview description: Learn about smart contract development capabilities in the Polkadot ecosystem, either by leveraging Polkadot Hub or other alternatives. categories: Basics, Smart Contracts --- # Smart Contracts on Polkadot !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Polkadot offers developers multiple approaches to building and deploying smart contracts within its ecosystem. As a multi-chain network designed for interoperability, Polkadot provides various environments optimized for different developer preferences and application requirements. From native smart contract support on Polkadot Hub to specialized parachain environments, developers can choose the platform that best suits their technical needs while benefiting from Polkadot's shared security model and cross-chain messaging capabilities. Whether you're looking for Ethereum compatibility through EVM-based parachains like [Moonbeam](https://docs.moonbeam.network/){target=\_blank}, [Astar](https://docs.astar.network/){target=\_blank}, and [Acala](https://evmdocs.acala.network/){target=\_blank} or prefer PolkaVM-based development with [ink!](https://use.ink/docs/v6/){target=\_blank}, the Polkadot ecosystem accommodates a range of diverse developers. These guides explore the diverse smart contract options available in the Polkadot ecosystem, helping developers understand the unique advantages of each approach and make informed decisions about where to deploy their decentralized applications. ## Native Smart Contracts ### Introduction Polkadot Hub enables smart contract deployment and execution through PolkaVM, a cutting-edge virtual machine designed specifically for the Polkadot ecosystem. This native integration allows developers to deploy smart contracts directly on Polkadot's system chain while maintaining compatibility with Ethereum development tools and workflows. ### Smart Contract Development The smart contract platform on Polkadot Hub combines _Polkadot's robust security and scalability_ with the extensive Ethereum development ecosystem. Developers can utilize familiar Ethereum libraries for contract interactions and leverage industry-standard development environments for writing and testing smart contracts. Polkadot Hub provides _full Ethereum JSON-RPC API compatibility_, ensuring seamless integration with existing development tools and services. This compatibility enables developers to maintain their preferred workflows while building on Polkadot's native infrastructure. ### Technical Architecture PolkaVM, the underlying virtual machine, utilizes a RISC-V-based register architecture _optimized for the Polkadot ecosystem_. 
This design choice offers several advantages: - Enhanced performance for smart contract execution. - Improved gas efficiency for complex operations. - Native compatibility with Polkadot's runtime environment. - Optimized storage and state management. ### Development Tools and Resources Polkadot Hub supports a comprehensive suite of development tools familiar to Ethereum developers. The platform integrates with popular development frameworks, testing environments, and deployment tools. Key features include: - Contract development in Solidity or Rust. - Support for standard Ethereum development libraries. - Integration with widely used development environments. - Access to blockchain explorers and indexing solutions. - Compatibility with contract monitoring and management tools. ### Cross-Chain Capabilities Smart contracts deployed on Polkadot Hub can leverage Polkadot's [cross-consensus messaging (XCM) protocol](/develop/interoperability/intro-to-xcm/){target=\_blank} to seamlessly _transfer tokens and call functions on other blockchain networks_ within the Polkadot ecosystem, all without complex bridging infrastructure or third-party solutions. For further reference, check the [Interoperability](/develop/interoperability/){target=\_blank} section. ### Use Cases Polkadot Hub's smart contract platform is suitable for a wide range of applications: - DeFi protocols leveraging _cross-chain capabilities_. - NFT platforms utilizing Polkadot's native token standards. - Governance systems integrated with Polkadot's democracy mechanisms. - Cross-chain bridges and asset management solutions. ## Other Smart Contract Environments Beyond Polkadot Hub's native PolkaVM support, the ecosystem offers two main alternatives for smart contract development: - **EVM-compatible parachains**: Provide access to Ethereum's extensive developer ecosystem, smart contract portability, and established tooling like Hardhat, Remix, Foundry, and OpenZeppelin. The main options include Moonbeam (the first full Ethereum-compatible parachain serving as an interoperability hub), Astar (featuring dual VM support for both EVM and WebAssembly contracts), and Acala (DeFi-focused with enhanced Acala EVM+ offering advanced DeFi primitives). - **Rust (ink!)**: ink! is a Rust-based framework that can compile to PolkaVM. It uses [`#[ink(...)]`](https://use.ink/docs/v6/macros-attributes/){target=\_blank} attribute macros to create Polkadot SDK-compatible PolkaVM bytecode, offering strong memory safety from Rust, an advanced type system, high-performance PolkaVM execution, and platform independence with sandboxed security. Each environment provides unique advantages based on developer preferences and application requirements. ## Where to Go Next Developers can use their existing Ethereum development tools and connect to Polkadot Hub's RPC endpoints. The platform's Ethereum compatibility layer ensures a smooth transition for teams already building on Ethereum-compatible chains. Subsequent sections of this guide provide detailed information about specific development tools, advanced features, and best practices for building on Polkadot Hub.
- **Libraries** - Explore essential libraries to optimize smart contract development and interaction. [Reference](/develop/smart-contracts/libraries/)
- **Dev Environments** - Set up your development environment for seamless contract deployment and testing. [Reference](/develop/smart-contracts/dev-environments/)
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/precompiles/interact-with-precompiles/ --- BEGIN CONTENT --- --- title: Interact with Precompiles description: Learn how to interact with Polkadot Hub’s precompiles from Solidity to access native, low-level functions like hashing, pairing, EC ops, etc. categories: Smart Contracts --- # Interact with Precompiles !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Precompiles offer Polkadot Hub developers access to high-performance native functions directly from their smart contracts. Each precompile has a specific address and accepts a particular input data format. When called correctly, they execute optimized, native implementations of commonly used functions much more efficiently than equivalent contract-based implementations. This guide demonstrates how to interact with each standard precompile available in Polkadot Hub through Solidity smart contracts. ## Basic Precompile Interaction Pattern All precompiles follow a similar interaction pattern: ```solidity // Generic pattern for calling precompiles function callPrecompile(address precompileAddress, bytes memory input) internal returns (bool success, bytes memory result) { // Direct low-level call to the precompile address (success, result) = precompileAddress.call(input); // Ensure the call was successful require(success, "Precompile call failed"); return (success, result); } ``` Feel free to check the [`precompiles-hardhat`](https://github.com/polkadot-developers/polkavm-hardhat-examples/tree/v0.0.3/precompiles-hardhat){target=\_blank} repository to check all the precompiles examples. The repository contains a set of example contracts and test files demonstrating how to interact with each precompile in Polkadot Hub. Now, let's explore how to use each precompile available in Polkadot Hub. ## ECRecover (0x01) ECRecover recovers an Ethereum address associated with the public key used to sign a message. ```solidity title="ECRecover.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract ECRecoverExample { event ECRecovered(bytes result); // Address of the ECRecover precompile address constant EC_RECOVER_ADDRESS = address(0x01); bytes public result; function callECRecover(bytes calldata input) public { bool success; bytes memory resultInMemory; (success, resultInMemory) = EC_RECOVER_ADDRESS.call{value: 0}(input); if (success) { emit ECRecovered(resultInMemory); } result = resultInMemory; } function getRecoveredAddress() public view returns (address) { require(result.length == 32, "Invalid result length"); return address(uint160(uint256(bytes32(result)))); } } ``` To interact with the ECRecover precompile, you can deploy the `ECRecoverExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment. The `callECRecover` function takes a 128-byte input combining the message `hash`, `v`, `r`, and `s` signature values. Check this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ECRecover.js){target=\_blank} that shows how to format this input and verify that the recovered address matches the expected result. ## SHA-256 (0x02) The SHA-256 precompile computes the SHA-256 hash of the input data. 
```solidity title="SHA256.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract SHA256Example { event SHA256Called(bytes result); // Address of the SHA256 precompile address constant SHA256_PRECOMPILE = address(0x02); bytes public result; function callH256(bytes calldata input) public { bool success; bytes memory resultInMemory; (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input); if (success) { emit SHA256Called(resultInMemory); } result = resultInMemory; } } ``` To use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callH256` with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\_blank}, which shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from the [crypto](https://www.npmjs.com/package/crypto-js){target=\_blank} module. ## RIPEMD-160 (0x03) The RIPEMD-160 precompile computes the RIPEMD-160 hash of the input data. ```solidity title="RIPEMD160.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract RIPEMD160Example { // RIPEMD-160 precompile address address constant RIPEMD160_PRECOMPILE = address(0x03); bytes32 public result; event RIPEMD160Called(bytes32 result); function calculateRIPEMD160(bytes calldata input) public returns (bytes32) { (bool success, bytes memory returnData) = RIPEMD160_PRECOMPILE.call( input ); require(success, "RIPEMD-160 precompile call failed"); // Load the full 32-byte word returned by the precompile bytes32 fullHash; assembly { fullHash := mload(add(returnData, 32)) } result = fullHash; emit RIPEMD160Called(fullHash); return fullHash; } } ``` To use it, you can deploy the `RIPEMD160Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `calculateRIPEMD160` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/RIPEMD160.js){target=\_blank} shows how to hash a UTF-8 string, pad the 20-byte result to 32 bytes, and verify it against the expected output. ## Identity (Data Copy) (0x04) The Identity precompile simply returns the input data as output. While seemingly trivial, it can be useful for testing and certain specialized scenarios. ```solidity title="Identity.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract IdentityExample { event IdentityCalled(bytes result); // Address of the Identity precompile address constant IDENTITY_PRECOMPILE = address(0x04); bytes public result; function callIdentity(bytes calldata input) public { bool success; bytes memory resultInMemory; (success, resultInMemory) = IDENTITY_PRECOMPILE.call(input); if (success) { emit IdentityCalled(resultInMemory); } result = resultInMemory; } } ``` To use it, you can deploy the `IdentityExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callIdentity` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Identity.js){target=\_blank} shows how to pass input data and verify that the precompile returns it unchanged.
## Modular Exponentiation (0x05) The ModExp precompile performs modular exponentiation, which is an operation commonly needed in cryptographic algorithms. ```solidity title="ModExp.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract ModExpExample { address constant MODEXP_ADDRESS = address(0x05); function modularExponentiation( bytes memory base, bytes memory exponent, bytes memory modulus ) public view returns (bytes memory) { bytes memory input = abi.encodePacked( toBytes32(base.length), toBytes32(exponent.length), toBytes32(modulus.length), base, exponent, modulus ); (bool success, bytes memory result) = MODEXP_ADDRESS.staticcall(input); require(success, "ModExp precompile call failed"); return result; } function toBytes32(uint256 value) internal pure returns (bytes32) { return bytes32(value); } } ``` To use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `modularExponentiation` with encoded `base`, `exponent`, and `modulus` bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ModExp.js){target=\_blank} shows how to test modular exponentiation like (4 ** 13) % 497 = 445. ## BN128 Addition (0x06) The BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations. ```solidity title="BN128Add.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.20; contract BN128AddExample { address constant BN128_ADD_PRECOMPILE = address(0x06); event BN128Added(uint256 x3, uint256 y3); uint256 public resultX; uint256 public resultY; function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public { bytes memory input = abi.encodePacked( bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2) ); bool success; bytes memory output; (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input); require(success, "BN128Add precompile call failed"); require(output.length == 64, "Invalid output length"); (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256)); resultX = x3; resultY = y3; emit BN128Added(x3, y3); } } ``` To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\_blank} demonstrates a valid curve addition and checks the result against known expected values. ## BN128 Scalar Multiplication (0x07) The BN128Mul precompile performs scalar multiplication on the alt_bn128 curve. 
```solidity title="BN128Mul.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract BN128MulExample { // Precompile address for BN128Mul address constant BN128_MUL_ADDRESS = address(0x07); bytes public result; // Performs scalar multiplication of a point on the alt_bn128 curve function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public { // Format: [x, y, scalar] - each 32 bytes bytes memory input = abi.encodePacked( bytes32(x1), bytes32(y1), bytes32(scalar) ); (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{ value: 0 }(input); require(success, "BN128Mul precompile call failed"); result = resultInMemory; } // Helper to decode result from `result` storage function getResult() public view returns (uint256 x2, uint256 y2) { bytes memory tempResult = result; require(tempResult.length >= 64, "Invalid result length"); assembly { x2 := mload(add(tempResult, 32)) y2 := mload(add(tempResult, 64)) } } } ``` To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`. ## BN128 Pairing Check (0x08) The BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification. ```solidity title="BN128Pairing.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract BN128PairingExample { // Precompile address for BN128Pairing address constant BN128_PAIRING_ADDRESS = address(0x08); bytes public result; // Performs a pairing check on the alt_bn128 curve function bn128Pairing(bytes memory input) public { // Call the precompile (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS .call{value: 0}(input); require(success, "BN128Pairing precompile call failed"); result = resultInMemory; } // Helper function to decode the result from `result` storage function getResult() public view returns (bool isValid) { bytes memory tempResult = result; require(tempResult.length == 32, "Invalid result length"); uint256 output; assembly { output := mload(add(tempResult, 32)) } isValid = (output == 1); } } ``` You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\_blank}, which contains working examples of these pairing checks. ## Blake2F (0x09) The Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.
```solidity title="Blake2F.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; contract Blake2FExample { // Precompile address for Blake2F address constant BLAKE2F_ADDRESS = address(0x09); bytes public result; function blake2F(bytes memory input) public { // Input must be exactly 213 bytes require(input.length == 213, "Invalid input length - must be 213 bytes"); // Call the precompile (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{ value: 0 }(input); require(success, "Blake2F precompile call failed"); result = resultInMemory; } // Helper function to decode the result from `result` storage function getResult() public view returns (bytes32[8] memory output) { bytes memory tempResult = result; require(tempResult.length == 64, "Invalid result length"); for (uint i = 0; i < 8; i++) { assembly { mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i)))) } } } // Helper function to create Blake2F input from parameters function createBlake2FInput( uint32 rounds, bytes32[8] memory h, bytes32[16] memory m, bytes8[2] memory t, bool f ) public pure returns (bytes memory) { // Start with rounds (4 bytes, big-endian) bytes memory input = abi.encodePacked(rounds); // Add state vector h (8 * 8 = 64 bytes, taking the first 8 bytes of each bytes32) for (uint i = 0; i < 8; i++) { input = abi.encodePacked(input, bytes8(h[i])); } // Add message block m (16 * 8 = 128 bytes) // Blake2F expects 64-bit words in little-endian byte order, so pass words that are already little-endian for (uint i = 0; i < 16; i++) { // Take only the first 8 bytes of each bytes32 bytes8 word = bytes8(m[i]); input = abi.encodePacked(input, word); } // Add offset counters t (2 * 8 = 16 bytes) input = abi.encodePacked(input, t[0], t[1]); // Add final block flag (1 byte) input = abi.encodePacked(input, f ? bytes1(0x01) : bytes1(0x00)); return input; } // Simplified function that works with raw hex input function blake2FFromHex(string memory hexInput) public { bytes memory input = hexStringToBytes(hexInput); blake2F(input); } // Helper function to convert hex string to bytes function hexStringToBytes(string memory hexString) public pure returns (bytes memory) { bytes memory hexBytes = bytes(hexString); require(hexBytes.length % 2 == 0, "Invalid hex string length"); bytes memory result = new bytes(hexBytes.length / 2); for (uint i = 0; i < hexBytes.length / 2; i++) { result[i] = bytes1( (hexCharToByte(hexBytes[2 * i]) << 4) | hexCharToByte(hexBytes[2 * i + 1]) ); } return result; } function hexCharToByte(bytes1 char) internal pure returns (uint8) { uint8 c = uint8(char); if (c >= 48 && c <= 57) return c - 48; // 0-9 if (c >= 65 && c <= 70) return c - 55; // A-F if (c >= 97 && c <= 102) return c - 87; // a-f revert("Invalid hex character"); } } ``` To use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `blake2F` with a properly formatted 213-byte input containing the rounds, state vector, message block, offset counters, and final block flag (the `createBlake2FInput` helper assembles this input from its parameters). This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors. ## Conclusion Precompiles in Polkadot Hub provide efficient, native implementations of cryptographic functions and other commonly used operations.
By understanding how to interact with these precompiles from your Solidity contracts, you can build more efficient and feature-rich applications on the Polkadot ecosystem. The examples provided in this guide demonstrate the basic patterns for interacting with each precompile. Developers can adapt these patterns to their specific use cases, leveraging the performance benefits of native implementations while maintaining the flexibility of smart contract development. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/precompiles/xcm-precompile/ --- BEGIN CONTENT --- --- title: Interact with the XCM Precompile description: Learn how to use the XCM precompile to send cross-chain messages, execute XCM instructions, and estimate costs from your smart contracts. categories: Smart Contracts --- # XCM Precompile ## Introduction The [XCM (Cross-Consensus Message)](/develop/interoperability/intro-to-xcm){target=\_blank} precompile enables Polkadot Hub developers to access XCM functionality directly from their smart contracts using a Solidity interface. Located at the fixed address `0x00000000000000000000000000000000000a0000`, the XCM precompile offers three primary functions: - **`execute`**: for local XCM execution - **`send`**: for cross-chain message transmission - **`weighMessage`**: for cost estimation This guide demonstrates how to interact with the XCM precompile through Solidity smart contracts using [Remix IDE](/develop/smart-contracts/dev-environments/remix){target=\_blank}. !!!note The XCM precompile provides the barebones XCM functionality. While it provides a lot of flexibility, it doesn't provide abstractions to hide away XCM details. These have to be built on top. ## Precompile Interface The XCM precompile implements the `IXcm` interface, which defines the structure for interacting with XCM functionality. The source code for the interface is as follows: ```solidity title="IXcm.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.20; /// @dev The on-chain address of the XCM (Cross-Consensus Messaging) precompile. address constant XCM_PRECOMPILE_ADDRESS = address(0xA0000); /// @title XCM Precompile Interface /// @notice A low-level interface for interacting with `pallet_xcm`. /// It forwards calls directly to the corresponding dispatchable functions, /// providing access to XCM execution and message passing. /// @dev Documentation: /// @dev - XCM: https://docs.polkadot.com/develop/interoperability /// @dev - SCALE codec: https://docs.polkadot.com/polkadot-protocol/parachain-basics/data-encoding /// @dev - Weights: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/fees/#transactions-weights-and-fees interface IXcm { /// @notice Weight v2 used for measurement for an XCM execution struct Weight { /// @custom:property The computational time used to execute some logic based on reference hardware. uint64 refTime; /// @custom:property The size of the proof needed to execute some logic. uint64 proofSize; } /// @notice Executes an XCM message locally on the current chain with the caller's origin. /// @dev Internally calls `pallet_xcm::execute`. /// @param message A SCALE-encoded Versioned XCM message. /// @param weight The maximum allowed `Weight` for execution. /// @dev Call @custom:function weighMessage(message) to ensure sufficient weight allocation. function execute(bytes calldata message, Weight calldata weight) external; /// @notice Sends an XCM message to another parachain or consensus system. 
/// @dev Internally calls `pallet_xcm::send`. /// @param destination SCALE-encoded destination MultiLocation. /// @param message SCALE-encoded Versioned XCM message. function send(bytes calldata destination, bytes calldata message) external; /// @notice Estimates the `Weight` required to execute a given XCM message. /// @param message SCALE-encoded Versioned XCM message to analyze. /// @return weight Struct containing estimated `refTime` and `proofSize`. function weighMessage(bytes calldata message) external view returns (Weight memory weight); } ``` The interface defines a `Weight` struct that represents the computational cost of XCM operations. Weight has two components: - **`refTime`**: computational time on reference hardware - **`proofSize`**: the size of the proof required for execution All XCM messages must be encoded using the [SCALE codec](/polkadot-protocol/parachain-basics/data-encoding/#data-encoding){target=\_blank}, Polkadot's standard serialization format. For further information, check the [`precompiles/IXcm.sol`](https://github.com/paritytech/polkadot-sdk/blob/cb629d46ebf00aa65624013a61f9c69ebf02b0b4/polkadot/xcm/pallet-xcm/src/precompiles/IXcm.sol){target=\_blank} file present in `pallet-xcm`. ## Interact with the XCM Precompile To interact with the XCM precompile, you can use the precompile interface directly in [Remix IDE](/develop/smart-contracts/dev-environments/remix/){target=\_blank}: 1. Create a new file called `IXcm.sol` in Remix. 2. Copy and paste the `IXcm` interface code into the file. 3. Compile the interface by selecting the **Compile IXcm.sol** button or pressing **Ctrl+S**: ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-01.webp) 4. In the **Deploy & Run Transactions** tab, select the `IXcm` interface from the contract dropdown. 5. Enter the precompile address `0x00000000000000000000000000000000000a0000` in the **At Address** input field. 6. Select the **At Address** button to connect to the precompile. ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-02.webp) 7. Once connected, you can use the Remix interface to interact with the XCM precompile's `execute`, `send`, and `weighMessage` functions. ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-03.webp) The main entrypoint of the precompile is the `execute` function. However, it's necessary to first call `weighMessage` to fill in the required parameters. ### Weigh a Message The `weighMessage` function estimates the computational cost required to execute an XCM message. This estimate is crucial for understanding the resources needed before actually executing or sending a message. To test this functionality in Remix, you can call `weighMessage` with a SCALE-encoded XCM message. For example, you can use the following encoded XCM message: ```text title="encoded-xcm-message-example" 0x050c000401000003008c86471301000003008c8647000d010101000000010100368e8759910dab756d344995f1d3c79374ca8f70066d3a709e48029f6bf0ee7e ``` ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-04.webp) This encoded message represents a sequence of XCM instructions: - **[Withdraw Asset](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#withdrawasset){target=\_blank}**: This instruction removes assets from the local chain's sovereign account or the caller's account, making them available for use in subsequent XCM instructions.
- **[Buy Execution](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#buyexecution){target=\_blank}**: This instruction purchases execution time on the destination chain using the withdrawn assets, ensuring the message can be processed. - **[Deposit Asset](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#depositasset){target=\_blank}**: This instruction deposits the remaining assets into a specified account on the destination chain after execution costs have been deducted. This encoded message is provided as an example. You can craft your own XCM message tailored to your specific use case as needed. The function returns a `Weight` struct containing `refTime` and `proofSize` values, which indicate the estimated computational cost of executing this message. If successful, after calling the `weighMessage` function, you should see the `refTime` and `proofSize` of the message: ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-05.webp) !!!note You can find many more examples of XCMs in this [gist](https://gist.github.com/franciscoaguirre/a6dea0c55e81faba65bedf700033a1a2){target=\_blank}, which connects to the Polkadot Hub TestNet. ### Execute a Message The `execute` function runs an XCM message locally using the caller's origin. This function is the main entrypoint to cross-chain interactions. Follow these steps to execute a message: 1. Call `weighMessage` with your message to get the required weight. 2. Pass the same message bytes and the weight obtained from the previous step to `execute`. For example, using the same message from the weighing example, you would call `execute` with: - `message`: The encoded XCM message bytes. - `weight`: The `Weight` struct returned from `weighMessage`. You can use the [papi console](https://dev.papi.how/extrinsics#networkId=localhost&endpoint=wss%3A%2F%2Ftestnet-passet-hub.polkadot.io&data=0x1f03050c000401000003008c86471301000003008c8647000d010101000000010100368e8759910dab756d344995f1d3c79374ca8f70066d3a709e48029f6bf0ee7e0750c61e2901daad0600){target=\_blank} to examine the complete extrinsic structure for this operation. 3. On Remix, click on the **Transact** button to execute the XCM message: ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-06.webp) If successful, you will see the following output in the Remix terminal: ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-07.webp) Additionally, you can verify that the execution of this specific message was successful by checking that the beneficiary account associated with the XCM message has received the funds accordingly. ### Send a Message While most cross-chain operations can be performed via `execute`, `send` is sometimes necessary, for example, when opening HRMP channels. To send a message: 1. Prepare your destination location encoded in XCM format. 2. Prepare your XCM message (similar to the execute example). 3. Call `send` with both parameters. The destination parameter must be encoded according to XCM's location format, specifying the target parachain or consensus system. The message parameter contains the XCM instructions to be executed on the destination chain. Unlike `execute`, the `send` function doesn't require a weight parameter since the destination chain will handle execution costs according to its fee structure.
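Putting these pieces together, the following sketch is an illustrative contract (not one of the official examples; the contract name and import path are assumptions) that combines `weighMessage` and `execute` in a single transaction, assuming the `IXcm.sol` interface shown above is saved alongside it. The next section expands on this cross-contract pattern.

```solidity title="XcmQuickstart.sol"
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Illustrative only: assumes the IXcm interface shown above is saved next to this file as IXcm.sol.
import {IXcm, XCM_PRECOMPILE_ADDRESS} from "./IXcm.sol";

contract XcmQuickstart {
    /// Weighs a SCALE-encoded XCM message, then executes it locally with the caller's origin.
    function weighAndExecute(bytes calldata message) external {
        IXcm xcm = IXcm(XCM_PRECOMPILE_ADDRESS);

        // Estimate the weight the message needs...
        IXcm.Weight memory weight = xcm.weighMessage(message);

        // ...and execute it locally, using that estimate as the weight limit.
        xcm.execute(message, weight);
    }
}
```

Because both calls happen in one transaction, the weight passed to `execute` always matches the estimate for the exact message being executed; the trade-off is the extra gas spent on the `weighMessage` call.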
## Cross Contract Calls Beyond direct interaction and wrapper contracts, you can integrate XCM functionality directly into your existing smart contracts by inheriting from or importing the `IXcm` interface. This approach enables you to embed cross-chain capabilities into your application logic seamlessly. Whether you're building DeFi protocols, governance systems, or any application requiring cross-chain coordination, you can incorporate XCM calls directly within your contract's functions. ## Conclusion The XCM precompile provides a simple yet powerful interface for cross-chain interactions within the Polkadot ecosystem and beyond. By building and executing XCM programs, developers can build cross-chain applications that leverage the full potential of Polkadot's interoperability features. ## Next steps Head to the Polkadot Hub TestNet and start playing around with the precompile using Hardhat or Foundry. You can use PAPI to build XCM programs and test them with Chopsticks. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/wallets/ --- BEGIN CONTENT --- --- title: Wallets for Polkadot Hub description: Comprehensive guide to connecting and managing wallets for Polkadot Hub, covering step-by-step instructions for interacting with the ecosystem. categories: Smart Contracts, Tooling --- # Wallets for Polkadot Hub !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Connecting a compatible wallet is the first essential step for interacting with the Polkadot Hub ecosystem. This guide explores wallet options that support both Substrate and Ethereum compatible layers, enabling transactions and smart contract interactions. Whether you're a developer testing on Polkadot Hub or a user accessing the MainNet, understanding wallet configuration is crucial for accessing the full range of Polkadot Hub's capabilities. ## Connect Your Wallet ### MetaMask [MetaMask](https://metamask.io/){target=\_blank} is a popular wallet for interacting with Ethereum-compatible chains. It allows users to connect to test networks that support Ethereum-based smart contracts. However, it's important to emphasize that MetaMask primarily facilitates interactions with smart contracts, giving users access to various chain functionalities. To get started with MetaMask, you need to install the [MetaMask extension](https://metamask.io/download/){target=\_blank} and add it to the browser. Once you install MetaMask, you can set up a new wallet and securely store your seed phrase. This phrase is crucial for recovery in case you lose access. For example, to connect to the Polkadot Hub TestNet via MetaMask, you need to follow these steps: 1. Open the MetaMask extension and click on the network icon to switch to the Polkadot Hub TestNet. ![](/images/develop/smart-contracts/wallets/wallets-1.webp){: .browser-extension} 2. Click on the **Add a custom network** button. ![](/images/develop/smart-contracts/wallets/wallets-2.webp){: .browser-extension} 3. Complete the necessary fields, then click the **Save** button (refer to the [Networks](/develop/smart-contracts/connect-to-polkadot#networks-details){target=\_blank} section for copy and paste parameters). ![](/images/develop/smart-contracts/wallets/wallets-3.webp){: .browser-extension} 4. Click on **Polkadot Hub TestNet** to switch the network. 
![](/images/develop/smart-contracts/wallets/wallets-4.webp){: .browser-extension} The steps in the preceding section can be used to connect to any chain by modifying the network specification and endpoint parameters. ### SubWallet [SubWallet](https://www.subwallet.app/){target=\_blank} is a popular non-custodial wallet solution for Polkadot and Ethereum ecosystems. It offers seamless integration with Polkadot SDK-based networks while maintaining Ethereum compatibility, making the wallet an ideal choice for users and developers to interact with Polkadot Hub. SubWallet now fully supports the [Polkadot Hub TestNet](/polkadot-protocol/smart-contract-basics/networks/#test-networks){target=\_blank} where developers can deploy and interact with Ethereum-compatible, Solidity smart contracts. You can easily view and manage your Paseo native token (PAS) using the Ethereum RPC endpoint (Passet Hub EVM) or the Substrate node RPC endpoint (passet-hub). ??? code "Polkadot Hub TestNet" You can see support here for Polkadot Hub's TestNet. The **Passet Hub EVM** network uses an ETH RPC endpoint, and the **passet-hub** uses a Substrate endpoint. The ETH RPC endpoint will let you send transactions that follow an ETH format, while the Substrate endpoint will follow a Substrate transaction format. Note the PAS token, which is the native token of the Polkadot Hub TestNet. ![](/images/develop/smart-contracts/wallets/subwallet-PAS.webp){: .browser-extension} To connect to Polkadot Hub TestNet using SubWallet, follow these steps: 1. Install the [SubWallet browser extension](https://chromewebstore.google.com/detail/subwallet-polkadot-wallet/onhogfjeacnfoofkfgppdlbmlmnplgbn?hl=en){target=\_blank} and set up your wallet by following the on-screen instructions, or refer to our [step-by-step guide](https://docs.subwallet.app/main/extension-user-guide/getting-started/install-subwallet){target=\_blank} for assistance. 2. After setting up your wallet, click the List icon at the top left corner of the extension window to open **Settings**. ![](/images/develop/smart-contracts/wallets/subwallet-01.webp){: .browser-extension} 3. Scroll down and select **Manage networks**. ![](/images/develop/smart-contracts/wallets/subwallet-02.webp){: .browser-extension} 4. In the Manage network screen, either scroll down or type in the search bar to find the networks. Once done, enable the toggle next to the network name. ![](/images/develop/smart-contracts/wallets/subwallet-03.webp){: .browser-extension} You are now ready to use SubWallet to interact with [Polkadot Hub TestNet](/develop/smart-contracts/connect-to-polkadot/#networks-details){target=\_blank} seamlessly! ![](/images/develop/smart-contracts/wallets/subwallet-04.webp){: .browser-extension} ### Talisman [Talisman](https://talisman.xyz/){target=\_blank} is a specialized wallet for the Polkadot ecosystem that supports both Substrate and EVM accounts, making it an excellent choice for Polkadot Hub interactions. Talisman offers a more integrated experience for Polkadot-based chains while still providing Ethereum compatibility. To use Talisman with Polkadot Hub TestNet: 1. Install the [Talisman extension](https://talisman.xyz/download){target=\_blank} and set up your wallet by following the on-screen instructions. 2. Once installed, click on the Talisman icon in your browser extensions and click on the **Settings** button: ![](/images/develop/smart-contracts/wallets/wallets-5.webp){: .browser-extension} 3. Click the button **All settings**. 
![](/images/develop/smart-contracts/wallets/wallets-6.webp){: .browser-extension} 4. Go to the **Networks & Tokens** section. ![](/images/develop/smart-contracts/wallets/wallets-7.webp) 5. Click the **Manage networks** button. ![](/images/develop/smart-contracts/wallets/wallets-8.webp) 6. Click the **+ Add network** button. ![](/images/develop/smart-contracts/wallets/wallets-9.webp) 7. Fill in the form with the required parameters and click the **Add network** button. ![](/images/develop/smart-contracts/wallets/wallets-10.webp) 8. After that, you can switch to the Polkadot Hub TestNet by clicking on the network icon and selecting **Polkadot Hub TestNet**. ![](/images/develop/smart-contracts/wallets/wallets-11.webp) After selecting the network, Talisman will automatically configure the necessary RPC URL and chain ID for you. You can now use Talisman to interact with the Polkadot Hub TestNet. ## Conclusion Choosing the right wallet for Polkadot Hub interactions depends on your specific requirements and familiarity with different interfaces. MetaMask provides a familiar entry point for developers with Ethereum experience, while Talisman offers deeper integration with Polkadot's unique features and native support for both EVM and Substrate accounts. By properly configuring your wallet connection, you gain access to the full spectrum of Polkadot Hub's capabilities. !!!info Remember to always verify network parameters when connecting to ensure a secure and reliable connection to the Polkadot ecosystem. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/deploy-erc20/ --- BEGIN CONTENT --- --- title: Deploy an ERC-20 to Polkadot Hub description: Deploy an ERC-20 token on Polkadot Hub using PolkaVM. This guide covers contract creation, compilation, deployment, and interaction via Polkadot Remix IDE. tutorial_badge: Beginner categories: Basics, dApps, Smart Contracts --- # Deploy an ERC-20 to Polkadot Hub !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction [ERC-20](https://eips.ethereum.org/EIPS/eip-20){target=\_blank} tokens are fungible tokens commonly used for creating cryptocurrencies, governance tokens, and staking mechanisms. Polkadot Hub enables easy token deployment with Ethereum-compatible smart contracts via PolkaVM. This tutorial covers deploying an ERC-20 contract on the Polkadot Hub TestNet using [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank}, a web-based development tool. [OpenZeppelin's ERC-20 contracts]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/token/ERC20){target=\_blank} are used for security and compliance. ## Prerequisites Before starting, make sure you have: - [MetaMask](https://metamask.io/){target=\_blank} installed and connected to Polkadot Hub. For detailed instructions, see the [Connect Your Wallet](/develop/smart-contracts/wallets){target=\_blank} section - A funded account with some PAS tokens (you can get them from the [Polkadot Faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}). To learn how to get test tokens, check out the [Test Tokens](/develop/smart-contracts/connect-to-polkadot#test-tokens){target=\_blank} section - Basic understanding of Solidity and fungible tokens ## Create the ERC-20 Contract To create the ERC-20 contract, you can follow the steps below: 1. 
Navigate to the [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank} 2. Click in the **Create new file** button under the **contracts** folder, and name your contract as `MyToken.sol` ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-1.webp) 3. Now, paste the following ERC-20 contract code into the editor ```solidity title="MyToken.sol" // SPDX-License-Identifier: MIT // Compatible with OpenZeppelin Contracts ^5.0.0 pragma solidity ^0.8.22; import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol"; import {Ownable} from "@openzeppelin/contracts/access/Ownable.sol"; contract MyToken is ERC20, Ownable { constructor(address initialOwner) ERC20("MyToken", "MTK") Ownable(initialOwner) {} function mint(address to, uint256 amount) public onlyOwner { _mint(to, amount); } } ``` The key components of the code above are: - Contract imports - [**`ERC20.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/token/ERC20/ERC20.sol){target=\_blank} - the base contract for fungible tokens, implementing core functionality like transfers, approvals, and balance tracking - [**`Ownable.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/access/Ownable.sol){target=\_blank} - provides basic authorization control, ensuring only the contract owner can mint new tokens - Constructor parameters - **`initialOwner`** - sets the address that will have administrative rights over the contract - **`"MyToken"`** - the full name of your token - **`"MTK"`** - the symbol representing your token in wallets and exchanges - Key functions - **`mint(address to, uint256 amount)`** - allows the contract owner to create new tokens for any address. The amount should include 18 decimals (e.g., 1 token = 1000000000000000000) - Inherited [Standard ERC-20](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/){target=\_blank} functions: - **`transfer(address recipient, uint256 amount)`** - sends a specified amount of tokens to another address - **`approve(address spender, uint256 amount)`** - grants permission for another address to spend a specific number of tokens on behalf of the token owner - **`transferFrom(address sender, address recipient, uint256 amount)`** - transfers tokens from one address to another, if previously approved - **`balanceOf(address account)`** - returns the token balance of a specific address - **`allowance(address owner, address spender)`** - checks how many tokens an address is allowed to spend on behalf of another address !!! tip Use the [OpenZeppelin Contracts Wizard](https://wizard.openzeppelin.com/){target=\_blank} to quickly generate customized smart contracts. Simply configure your contract, copy the generated code, and paste it into Polkadot Remix IDE for deployment. Below is an example of an ERC-20 token contract created with it: ![Screenshot of the OpenZeppelin Contracts Wizard showing an ERC-20 contract configuration.](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-2.webp) ## Compile the Contract The compilation transforms your Solidity source code into bytecode that can be deployed on the blockchain. During this process, the compiler checks your contract for syntax errors, ensures type safety, and generates the machine-readable instructions needed for blockchain execution. To compile your contract, follow the instructions below: 1. 
Select the **Solidity Compiler** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-3.webp) 2. Click the **Compile MyToken.sol** button ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-4.webp) 3. If the compilation succeeded, you'll see a green checkmark indicating success in the **Solidity Compiler** icon ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-5.webp) ## Deploy the Contract Deployment is the process of publishing your compiled smart contract to the blockchain, making it permanently available for interaction. During deployment, you'll create a new instance of your contract on the blockchain, which involves: 1. Select the **Deploy & Run Transactions** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-6.webp) 2. Configure the deployment settings 1. From the **ENVIRONMENT** dropdown, select **Injected Provider - Talisman** (check the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\_blank} section of the Remix IDE guide for more details) 2. From the **ACCOUNT** dropdown, select the account you want to use for the deploy ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-7.webp) 3. Configure the contract parameters 1. Enter the address that will own the deployed token contract 2. Click the **Deploy** button to initiate the deployment ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-8.webp) 4. Talisman will pop up - review the transaction details. Click **Approve** to deploy your contract ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-9.webp){: .browser-extension} If the deployment process succeeded, you will see the transaction details in the terminal, including the contract address and deployment transaction hash: ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-10.webp) ## Interact with Your ERC-20 Contract Once deployed, you can interact with your contract through Remix: 1. Find your contract under **Deployed/Unpinned Contracts**, and click it to expand the available methods ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-11.webp) 2. To mint new tokens: 1. Click in the contract to expand its associated methods 2. Expand the **mint** function 3. Enter: - The recipient address - The amount (remember to add 18 zeros for 1 whole token) 4. Click **Transact** ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-12.webp) 3. Click **Approve** to confirm the transaction in the Talisman popup ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-13.webp){: .browser-extension} If the transaction succeeds, you will see the following output in the terminal: ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-14.webp) Other common functions you can use: - **`balanceOf(address)`** - check token balance of any address - **`transfer(address to, uint256 amount)`** - send tokens to another address - **`approve(address spender, uint256 amount)`** - allow another address to spend your tokens Feel free to explore and interact with the contract's other functions using the same approach - selecting the method, providing any required parameters, and confirming the transaction through Talisman when needed. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/deploy-nft/ --- BEGIN CONTENT --- --- title: Deploy an NFT to Polkadot Hub description: Deploy an NFT on Polkadot Hub using PolkaVM and OpenZeppelin. 
Learn how to compile, deploy, and interact with your contract using Polkadot Remix IDE. tutorial_badge: Beginner categories: Basics, dApps, Smart Contracts --- # Deploy an NFT to Polkadot Hub !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Non-Fungible Tokens (NFTs) represent unique digital assets commonly used for digital art, collectibles, gaming, and identity verification. Polkadot Hub supports Ethereum-compatible smart contracts through PolkaVM, enabling straightforward NFT deployment. This tutorial guides you through deploying an [ERC-721](https://eips.ethereum.org/EIPS/eip-721){target=\_blank} NFT contract on the Polkadot Hub TestNet using the [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank}, a web-based development environment. To ensure security and standard compliance, it uses [OpenZeppelin's NFT contracts]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}){target=\_blank} implementation. ## Prerequisites Before starting, make sure you have: - [Talisman](https://talisman.xyz/){target=\_blank} installed and connected to the Polkadot Hub TestNet. Check the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot/){target=\_blank} guide for more information - A funded account with some PAS tokens (you can get them from the [Faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}, noting that the faucet imposes a daily token limit, which may require multiple requests to obtain sufficient funds for testing) - Basic understanding of Solidity and NFTs, see the [Solidity Basics](https://soliditylang.org/){target=\_blank} and the [NFT Overview](https://ethereum.org/en/nft/){target=\_blank} guides for more details ## Create the NFT Contract To create the NFT contract, you can follow the steps below: 1. Navigate to the [Polkadot Remix IDE](https://remix.polkadot.io/){target=\_blank} 2. Click in the **Create new file** button under the **contracts** folder, and name your contract as `MyNFT.sol` ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-1.webp) 3. 
Now, paste the following NFT contract code into the editor ```solidity title="MyNFT.sol" // SPDX-License-Identifier: MIT // Compatible with OpenZeppelin Contracts ^5.0.0 pragma solidity ^0.8.22; import {ERC721} from "@openzeppelin/contracts/token/ERC721/ERC721.sol"; import {Ownable} from "@openzeppelin/contracts/access/Ownable.sol"; contract MyToken is ERC721, Ownable { uint256 private _nextTokenId; constructor(address initialOwner) ERC721("MyToken", "MTK") Ownable(initialOwner) {} function safeMint(address to) public onlyOwner { uint256 tokenId = _nextTokenId++; _safeMint(to, tokenId); } } ``` The key components of the code above are: - Contract imports - [**`ERC721.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/token/ERC721/ERC721.sol){target=\_blank} - the base contract for non-fungible tokens, implementing core NFT functionality like transfers and approvals - [**`Ownable.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/access/Ownable.sol){target=\_blank} - provides basic authorization control, ensuring only the contract owner can mint new tokens - Constructor parameters - **`initialOwner`** - sets the address that will have administrative rights over the contract - **`"MyToken"`** - the full name of your NFT collection - **`"MTK"`** - the symbol representing your token in wallets and marketplaces - Key functions - [**`_safeMint(to, tokenId)`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/token/ERC721/ERC721.sol#L304){target=\_blank} - an internal function from `ERC721` that safely mints new tokens. It includes checks to ensure the recipient can handle `ERC721` tokens, with the `_nextTokenId` mechanism automatically generating unique sequential token IDs and the `onlyOwner` modifier restricting minting rights to the contract owner - Inherited [Standard ERC721](https://ethereum.org/en/developers/docs/standards/tokens/erc-721/){target=\_blank} functions provide a standardized set of methods that enable interoperability across different platforms, wallets, and marketplaces, ensuring that your NFT can be easily transferred, traded, and managed by any system that supports the `ERC721` standard: - **`transferFrom(address from, address to, uint256 tokenId)`** - transfers a specific NFT from one address to another - **`safeTransferFrom(address from, address to, uint256 tokenId)`** - safely transfers an NFT, including additional checks to prevent loss - **`approve(address to, uint256 tokenId)`** - grants permission for another address to transfer a specific NFT - **`setApprovalForAll(address operator, bool approved)`** - allows an address to manage all of the owner's NFTs - **`balanceOf(address owner)`** - returns the number of NFTs owned by a specific address - **`ownerOf(uint256 tokenId)`** - returns the current owner of a specific NFT !!! tip Use the [OpenZeppelin Contracts Wizard](https://wizard.openzeppelin.com/){target=\_blank} to generate customized smart contracts quickly. Simply configure your contract, copy the generated code, and paste it into Polkadot Remix IDE for deployment. 
Below is an example of an ERC-721 token contract created with it: ![Screenshot of the OpenZeppelin Contracts Wizard showing an ERC-721 contract configuration.](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-2.webp) ## Compile the Contract Compilation is the stage that converts your Solidity source code into bytecode suitable for deployment on the blockchain. Throughout this process, the compiler examines your contract for syntax errors, verifies type safety, and produces machine-readable instructions for execution on the blockchain. 1. Select the **Solidity Compiler** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-3.webp) 2. Click the **Compile MyNFT.sol** button ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-4.webp) 3. If the compilation succeeds, a green checkmark indicating success appears on the **Solidity Compiler** icon ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-5.webp) ## Deploy the Contract Deployment is the process of uploading your compiled smart contract to the blockchain, allowing for interaction. To instantiate your contract on the blockchain: 1. Select the **Deploy & Run Transactions** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-6.webp) 2. Configure the deployment settings 1. From the **ENVIRONMENT** dropdown, select **Injected Provider - Talisman** (check the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\_blank} section of the Remix IDE guide for more details) 2. From the **ACCOUNT** dropdown, select the account you want to use for the deployment ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-7.webp) 3. Configure the contract parameters 1. Enter the address that will own the deployed NFT 2. Click the **Deploy** button to initiate the deployment ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-8.webp) 4. Talisman will pop up - review the transaction details. Click **Approve** to deploy your contract ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-9.webp){: .browser-extension} Deploying this contract requires paying gas fees in PAS tokens on the Polkadot Hub TestNet. Ensure your Talisman account is funded with sufficient PAS tokens from the faucet before confirming the transaction; check the [Test Tokens](/develop/smart-contracts/connect-to-polkadot/#test-tokens){target=\_blank} section for more information. Gas fees cover the computational resources needed to deploy and execute the smart contract on the blockchain. If the deployment process succeeds, you will see the following output in the terminal: ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-10.webp) ## Interact with Your NFT Contract Once deployed, you can interact with your contract through Remix: 1. Find your contract under **Deployed/Unpinned Contracts**, and click it to expand the available methods for the contract ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-11.webp) 2. To mint an NFT: 1. Click on the contract to expand its associated methods 2. Expand the **safeMint** function 3. Enter the recipient address 4. Click **Transact** ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-12.webp) 3.
Click **Approve** to confirm the transaction in the Talisman popup ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-13.webp){: .browser-extension} If the transaction is successful, the terminal will display the following output, which details information about the transaction, including the transaction hash, the block number, the associated logs, and so on. ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-14.webp) Feel free to explore and interact with the contract's other functions using the same approach - selecting the method, providing any required parameters, and confirming the transaction through Talisman when needed. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/launch-your-first-project/create-contracts/ --- BEGIN CONTENT --- --- title: Create a Smart Contract description: Learn how to write a basic smart contract using just a text editor. This guide covers creating and preparing a contract for deployment on Polkadot Hub. tutorial_badge: Beginner categories: Basics, Smart Contracts --- # Create a Smart Contract !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Creating [smart contracts](/develop/smart-contracts/overview/){target=\_blank} is fundamental to blockchain development. While many frameworks and tools are available, understanding how to write a contract from scratch with just a text editor is essential knowledge. This tutorial will guide you through creating a basic smart contract that can be used with other tutorials for deployment and integration on Polkadot Hub. To understand how smart contracts work in Polkadot Hub, check the [Smart Contract Basics](/polkadot-protocol/smart-contract-basics/){target=\_blank} guide for more information. ## Prerequisites Before starting, make sure you have: - A text editor of your choice ([VS Code](https://code.visualstudio.com/){target=\_blank}, [Sublime Text](https://www.sublimetext.com/){target=\_blank}, etc.) - Basic understanding of programming concepts - Familiarity with the Solidity programming language syntax. For further references, check the official [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\_blank} ## Understanding Smart Contract Structure Let's explore the following components before building the contract: - [**SPDX license identifier**](https://docs.soliditylang.org/en/v0.6.8/layout-of-source-files.html){target=\_blank} - a standardized way to declare the license under which your code is released. This helps with legal compliance and is required by the Solidity compiler to avoid warnings - **Pragma directive** - specifies which version of the Solidity compiler should be used for your contract - **Contract declaration** - similar to a class in object-oriented programming, it defines the boundaries of your smart contract - **State variables** - data stored directly in the contract that persists between function calls. These represent the contract's "state" on the blockchain - **Functions** - executable code that can read or modify the contract's state variables - **Events** - notification mechanisms that applications can subscribe to in order to track blockchain changes ## Create the Smart Contract In this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners.
It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes. This contract will: - Store a number - Allow updating the stored number - Emit an event when the number changes To build the smart contract, follow the steps below: 1. Create a new file named `Storage.sol` 2. Add the SPDX license identifier at the top of the file: ```solidity // SPDX-License-Identifier: MIT ``` This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings. 3. Specify the Solidity version: ```solidity pragma solidity ^0.8.28; ``` The caret `^` means "this version or any compatible newer version." This helps ensure your contract compiles correctly with the intended compiler features. 4. Create the contract structure: ```solidity contract Storage { // Contract code will go here } ``` This defines a contract named "Storage", similar to how you would define a class in other programming languages. 5. Add the state variables and event: ```solidity contract Storage { // State variable to store a number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); } ``` Here, you're defining: - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data 6. Add the getter and setter functions: ```solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.28; contract Storage { // State variable to store our number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); // Function to store a new number function store(uint256 newNumber) public { number = newNumber; emit NumberChanged(newNumber); } // Function to retrieve the stored number function retrieve() public view returns (uint256) { return number; } } ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.28; contract Storage { // State variable to store our number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); // Function to store a new number function store(uint256 newNumber) public { number = newNumber; emit NumberChanged(newNumber); } // Function to retrieve the stored number function retrieve() public view returns (uint256) { return number; } } ``` ## Understanding the Code Let's break down the key components of the contract: - **State Variable** - `uint256 private number` - a private variable that can only be accessed through the contract's functions - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. 
"Private" in Solidity doesn't mean the data is encrypted or truly hidden - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations) - **Event** - `event NumberChanged(uint256 newNumber)` - emitted when the stored number changes - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications - Unlike state variables, events cannot be read by smart contracts, only by external applications - Events are much more gas-efficient than storing data when you only need to notify external systems of changes - **Functions** - `store(uint256 newNumber)` - updates the stored number and emits an event - This function changes the state of the contract and requires a transaction to execute - The `emit` keyword is used to trigger the defined event - `retrieve()` - returns the current stored number - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state - View functions don't require a transaction and don't cost gas when called externally For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed. This basic contract serves as a foundation for learning smart contract development. Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment. For more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\_blank}. ## Where to Go Next
- Tutorial __Test and Deploy with Hardhat__ --- Learn how to test and deploy the smart contract you created by using Hardhat. [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)
--- END CONTENT --- ## Basics Concepts [shared: true] The following section contains foundational documentation shared across all Polkadot products. It describes the architecture and infrastructure that serve as the backbone for all integrations built with Polkadot. This includes developing and deploying Solidity smart contracts on the Polkadot Hub, building, customizing, deploying, and maintaining a parachain, and integrating with the Polkadot ecosystem. This context is provided to help understand how the system works under the hood, but responses should stay focused on the specific product unless the user explicitly asks about the general architecture. --- ## List of shared concept pages: ## Full content for shared concepts: Doc-Content: https://docs.polkadot.com/develop/interoperability/intro-to-xcm/ --- BEGIN CONTENT --- --- title: Introduction to XCM description: Unlock blockchain interoperability with XCM, Polkadot's Cross-Consensus Messaging format for cross-chain interactions. categories: Basics, Polkadot Protocol --- # Introduction to XCM ## Introduction Polkadot's unique value lies in its ability to enable interoperability between parachains and other blockchain systems. At the core of this capability is XCM (Cross-Consensus Messaging), a flexible messaging format that facilitates communication and collaboration between independent consensus systems. With XCM, one chain can send intents to another, fostering a more interconnected ecosystem. Although it was developed specifically for Polkadot, XCM is a universal format, usable in any blockchain environment. This guide provides an overview of XCM's core principles, design, and functionality, alongside practical examples of its implementation. ## Messaging Format XCM is not a protocol but a standardized [messaging format](https://github.com/polkadot-fellows/xcm-format){target=\_blank}. It defines the structure and behavior of messages but does not handle their delivery. This separation allows developers to focus on crafting instructions for target systems without worrying about transmission mechanics. XCM messages are intent-driven, outlining desired actions for the receiving blockchain to consider and potentially alter its state. These messages do not directly execute changes; instead, they rely on the host chain's environment to interpret and implement them. By utilizing asynchronous composability, XCM facilitates efficient execution where messages can be processed independently of their original order, similar to how RESTful services handle HTTP requests without requiring sequential processing. ## The Four Principles of XCM XCM adheres to four guiding principles that ensure robust and reliable communication across consensus systems: - **Asynchronous** - XCM messages operate independently of sender acknowledgment, avoiding delays due to blocked processes - **Absolute** - XCM messages are guaranteed to be delivered and interpreted accurately, in order, and in a timely manner. Once a message is sent, one can be sure it will be processed as intended - **Asymmetric** - XCM messages follow the 'fire and forget' paradigm, meaning no automatic feedback is provided to the sender. Any results must be communicated separately to the sender with an additional message back to the origin - **Agnostic** - XCM operates independently of the specific consensus mechanisms, making it compatible across diverse systems These principles guarantee that XCM provides a reliable framework for cross-chain communication, even in complex environments.
## The XCM Tech Stack ![Diagram of the XCM tech stack](/images/develop/interoperability/intro-to-xcm/intro-to-xcm-01.webp) The XCM tech stack is designed to facilitate seamless interoperable communication between chains that reside within the Polkadot ecosystem. XCM can be used to express the meaning of the messages over each of the communication channels. ## Core Functionalities of XCM XCM enhances cross-consensus communication by introducing several powerful features: - **Programmability** - supports dynamic message handling, allowing for more comprehensive use cases. Includes branching logic, safe dispatches for version checks, and asset operations like NFT management - **Functional Multichain Decomposition** - enables mechanisms such as remote asset locking, asset namespacing, and inter-chain state referencing, with contextual message identification - **Bridging** - establishes a universal reference framework for multi-hop setups, connecting disparate systems like Ethereum and Bitcoin with the Polkadot relay chain acting as a universal location The standardized format for messages allows parachains to handle tasks like user balances, governance, and staking, freeing the Polkadot relay chain to focus on shared security. These features make XCM indispensable for implementing scalable and interoperable blockchain applications. ## XCM Example The following is a simplified XCM message demonstrating a token transfer from Alice to Bob on the same chain (ParaA). ```rust let message = Xcm(vec![ WithdrawAsset((Here, amount).into()), BuyExecution { fees: (Here, amount).into(), weight_limit: WeightLimit::Unlimited }, DepositAsset { assets: All.into(), beneficiary: MultiLocation { parents: 0, interior: Junction::AccountId32 { network: None, id: BOB.clone().into() }.into(), }.into() } ]); ``` The message consists of three instructions described as follows: - [**WithdrawAsset**](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#withdrawasset){target=\_blank} - transfers a specified number of tokens from Alice's account to a holding register ```rust WithdrawAsset((Here, amount).into()), ``` - `Here` - the native parachain token - `amount` - the number of tokens that are transferred The first instruction takes as an input the MultiAsset that should be withdrawn. The MultiAsset describes the native parachain token with the `Here` keyword. The `amount` parameter is the number of tokens that are transferred. The withdrawal account depends on the origin of the message. In this example the origin of the message is Alice. The `WithdrawAsset` instruction moves `amount` number of native tokens from Alice's account into the holding register. 
- [**BuyExecution**](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#buyexecution){target=\_blank} - allocates fees to cover the execution [weight](/polkadot-protocol/glossary/#weight){target=\_blank} of the XCM instructions ```rust BuyExecution { fees: (Here, amount).into(), weight_limit: WeightLimit::Unlimited }, ``` - `fees` - describes the asset in the holding register that should be used to pay for the weight - `weight_limit` - defines the maximum amount of weight that can be bought - [**DepositAsset**](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#depositasset){target=\_blank} - moves the remaining tokens from the holding register to Bob's account ```rust DepositAsset { assets: All.into(), beneficiary: MultiLocation { parents: 0, interior: Junction::AccountId32 { network: None, id: BOB.clone().into() }.into(), }.into() } ``` - `All` - the wildcard for the asset(s) to be deposited. In this case, all assets in the holding register should be deposited This step-by-step process showcases how XCM enables precise state changes within a blockchain system. You can find a complete XCM message example in the [XCM repository](https://github.com/paritytech/xcm-docs/blob/main/examples/src/0_first_look/mod.rs){target=\_blank}. ## Overview XCM revolutionizes cross-chain communication by enabling use cases such as: - Token transfers between blockchains - Asset locking for cross-chain smart contract interactions - Remote execution of functions on other blockchains These functionalities empower developers to build innovative, multi-chain applications, leveraging the strengths of various blockchain networks. To stay updated on XCM's evolving format or contribute, visit the [XCM repository](https://github.com/paritytech/xcm-docs/blob/main/examples/src/0_first_look/mod.rs){target=\_blank}. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/interoperability/send-messages/ --- BEGIN CONTENT --- --- title: Send XCM Messages description: Send cross-chain messages using XCM, Polkadot's Cross-Consensus Messaging format, designed to support secure communication between chains. categories: Basics, Polkadot Protocol --- # Send XCM Messages ## Introduction One of the core FRAME pallets that enables parachains to engage in cross-chain communication using the Cross-Consensus Message (XCM) format is [`pallet-xcm`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/index.html){target=\_blank}. It facilitates the sending, execution, and management of XCM messages, thereby allowing parachains to interact with other chains within the ecosystem. Additionally, `pallet-xcm`, also referred to as the XCM pallet, supports essential operations like asset transfers, version negotiation, and message routing. This page provides a detailed overview of the XCM pallet's key features, its primary roles in XCM operations, and the main extrinsics it offers. Whether aiming to execute XCM messages locally or send them to external chains, this guide covers the foundational concepts and practical applications you need to know. ## XCM Frame Pallet Overview The [`pallet-xcm`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/index.html){target=\_blank} provides a set of pre-defined, commonly used [XCVM programs](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#12-the-xcvm){target=\_blank} in the form of a [set of extrinsics](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/dispatchables/index.html){target=\_blank}.
This pallet provides some [default implementations](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html#implementations){target=\_blank} for traits required by [`XcmConfig`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm_benchmarks/trait.Config.html#associatedtype.XcmConfig){target=\_blank}. The [XCM executor](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/struct.XcmExecutor.html){target=\_blank} is also included as an associated type within the pallet's configuration. For further details about the XCM configuration, see the [XCM Configuration](/develop/interoperability/xcm-config/){target=\_blank} page. Where the [XCM format](https://github.com/polkadot-fellows/xcm-format){target=\_blank} defines a set of instructions used to construct XCVM programs, `pallet-xcm` defines a set of extrinsics that can be utilized to build XCVM programs, targeting either the local chain or external chains. The `pallet-xcm` functionality is divided into three categories: - **Primitive** - dispatchable functions to execute XCM locally - **High-level** - functions for asset transfers between chains - **Version negotiation-specific** - functions for managing XCM version compatibility ### Key Roles of the XCM Pallet The XCM pallet plays a central role in managing cross-chain messages, with its primary responsibilities including: - **Execute XCM messages** - interacts with the XCM executor to validate and execute messages, adhering to predefined security and filter criteria - **Send messages across chains** - allows authorized origins to send XCM messages, enabling controlled cross-chain communication - **Reserve-based transfers and teleports** - supports asset movement between chains, governed by filters that restrict operations to authorized origins - **XCM version negotiation** - ensures compatibility by selecting the appropriate XCM version for inter-chain communication - **Asset trapping and recovery** - manages trapped assets, enabling safe reallocation or recovery when issues occur during cross-chain transfers - **Support for XCVM operations** - oversees state and configuration requirements necessary for executing cross-consensus programs within the XCVM framework ## Primary Extrinsics of the XCM Pallet This page will highlight the two **Primary Primitive Calls** responsible for sending and executing XCVM programs as dispatchable functions within the pallet. ### Execute The [`execute`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute){target=\_blank} call directly interacts with the XCM executor, allowing for the execution of XCM messages originating from a locally signed origin. The executor validates the message, ensuring it complies with any configured barriers or filters before executing. Once validated, the message is executed locally, and an event is emitted to indicate the result: whether the message was fully executed or only partially completed. Execution is capped by a maximum weight ([`max_weight`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute.field.max_weight){target=\_blank}); if the required weight exceeds this limit, the message will not be executed. ```rust pub fn execute( message: Box<VersionedXcm<<T as Config>::RuntimeCall>>, max_weight: Weight, ) ``` For further details about the `execute` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\_blank}.
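To make the shape of such a dispatch more concrete, the following sketch (not taken from the pallet documentation) shows how a signed origin might execute the token-transfer program from the introduction locally. It assumes a test or runtime context where the pallet instance is named `PolkadotXcm`, and `ALICE`, `BOB`, and `amount` are placeholder values.

```rust
// Sketch only: `PolkadotXcm`, `ALICE`, `BOB`, and `amount` are assumed names
// for a test or runtime context; adjust them to your own runtime.
let message = Xcm(vec![
    // Move `amount` native tokens from the origin's account into the holding register.
    WithdrawAsset((Here, amount).into()),
    // Pay for the weight used by this program with the withdrawn tokens.
    BuyExecution {
        fees: (Here, amount).into(),
        weight_limit: WeightLimit::Unlimited,
    },
    // Deposit whatever remains in the holding register into Bob's account.
    DepositAsset {
        assets: All.into(),
        beneficiary: Junction::AccountId32 { network: None, id: BOB.into() }.into(),
    },
]);

// Dispatch the program locally; it will not be executed if it needs more than `max_weight`.
assert_ok!(PolkadotXcm::execute(
    RuntimeOrigin::signed(ALICE),
    Box::new(VersionedXcm::from(message)),
    Weight::from_parts(2_000_000_000, 64 * 1024),
));
```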
!!!warning Partial execution of messages may occur depending on the constraints or barriers applied. ### Send The [`send`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.send){target=\_blank} call enables XCM messages to be sent to a specified destination. This could be a parachain, smart contract, or any external system governed by consensus. Unlike the execute call, the message is not executed locally but is transported to the destination chain for processing. The destination is defined using a [Location](https://paritytech.github.io/polkadot-sdk/master/xcm_docs/glossary/index.html#location){target=\_blank}, which describes the target chain or system. This ensures precise delivery through the configured XCM transport mechanism. ```rust pub fn send( dest: Box<VersionedLocation>, message: Box<VersionedXcm<<T as Config>::RuntimeCall>>, ) ``` For further information about the `send` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\_blank}. ## XCM Router The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/trait.Config.html#associatedtype.XcmRouter){target=\_blank} is a critical component the XCM pallet requires to facilitate sending XCM messages. It defines where messages can be sent and determines the appropriate XCM transport protocol for the operation. For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.network/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust pub type XcmRouter = WithUniqueTopic<( // Only one router so far - use DMP to communicate with child parachains. ChildParachainRouter<Runtime, XcmPallet, PriceForChildParachainDelivery>, )>; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/interoperability/test-and-debug/ --- BEGIN CONTENT --- --- title: Testing and Debugging description: Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution. categories: Basics, Polkadot Protocol --- # Testing and Debugging ## Introduction Cross-Consensus Messaging (XCM) is a core feature of the Polkadot ecosystem, enabling communication between parachains, relay chains, and system chains. To ensure the reliability of XCM-powered blockchains, thorough testing and debugging are essential before production deployment. This guide covers the XCM Emulator, a tool designed to facilitate onboarding and testing for developers. Use the emulator if: - A live runtime is not yet available - Extensive configuration adjustments are needed, as emulated chains differ from live networks - Rust-based tests are preferred for automation and integration For scenarios where real blockchain state is required, [Chopsticks](/tutorials/polkadot-sdk/testing/fork-live-chains/#xcm-testing){target=\_blank} allows testing with any client compatible with Polkadot SDK-based chains. ## XCM Emulator Setting up a live network with multiple interconnected parachains for XCM testing can be complex and resource-intensive.
The [`xcm-emulator`](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator){target=\_blank} is a tool designed to simulate the execution of XCM programs using predefined runtime configurations. These configurations include those utilized by live networks like Kusama, Polkadot, and Asset Hub. This tool enables testing of cross-chain message passing, providing a way to verify outcomes, weights, and side effects efficiently. It achieves this by utilizing mocked runtimes for both the relay chain and connected parachains, enabling developers to focus on message logic and configuration without needing a live network. The `xcm-emulator` relies on transport layer pallets. However, the messages do not leverage the same messaging infrastructure as live networks since the transport mechanism is mocked. Additionally, consensus-related events are not covered, such as disputes and staking events. Parachains should use end-to-end (E2E) tests to validate these events. ### Advantages and Limitations The XCM Emulator provides both advantages and limitations when testing cross-chain communication in simulated environments. - **Advantages**: - **Interactive debugging** - offers tracing capabilities similar to EVM, enabling detailed analysis of issues - **Runtime composability** - facilitates testing and integration of multiple runtime components - **Immediate feedback** - supports Test-Driven Development (TDD) by providing rapid test results - **Seamless integration testing** - simplifies the process of testing new runtime versions in an isolated environment - **Limitations**: - **Simplified emulation** - always assumes message delivery, which may not mimic real-world network behavior - **Dependency challenges** - requires careful management of dependency versions and patching. Refer to the [Cargo dependency documentation](https://doc.rust-lang.org/cargo/reference/overriding-dependencies.html){target=\_blank} - **Compilation overhead** - testing environments can be resource-intensive, requiring frequent compilation updates ### How Does It Work? The `xcm-emulator` provides macros for defining a mocked testing environment. Check all the existing macros and functionality in the [XCM Emulator source code](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator/src/lib.rs){target=\_blank}. The most important macros are: - [**`decl_test_relay_chains`**](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator/src/lib.rs#L355){target=\_blank} - defines runtime and configuration for the relay chains. Example: ```rust // Westend declaration decl_test_relay_chains! { #[api_version(11)] pub struct Westend { genesis = genesis::genesis(), on_init = (), runtime = westend_runtime, core = { SovereignAccountOf: westend_runtime::xcm_config::LocationConverter, }, pallets = { XcmPallet: westend_runtime::XcmPallet, Sudo: westend_runtime::Sudo, Balances: westend_runtime::Balances, Treasury: westend_runtime::Treasury, AssetRate: westend_runtime::AssetRate, Hrmp: westend_runtime::Hrmp, Identity: westend_runtime::Identity, IdentityMigrator: westend_runtime::IdentityMigrator, } }, } ``` - [**`decl_test_parachains`**](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator/src/lib.rs#L590){target=\_blank} - defines runtime and configuration for the parachains. 
Example: ```rust // AssetHubWestend Parachain declaration decl_test_parachains! { pub struct AssetHubWestend { genesis = genesis::genesis(), on_init = { asset_hub_westend_runtime::AuraExt::on_initialize(1); }, runtime = asset_hub_westend_runtime, core = { XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, ParachainInfo: asset_hub_westend_runtime::ParachainInfo, MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, }, pallets = { PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, Balances: asset_hub_westend_runtime::Balances, Assets: asset_hub_westend_runtime::Assets, ForeignAssets: asset_hub_westend_runtime::ForeignAssets, PoolAssets: asset_hub_westend_runtime::PoolAssets, AssetConversion: asset_hub_westend_runtime::AssetConversion, } }, } ``` - [**`decl_test_bridges`**](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator/src/lib.rs#L1178){target=\_blank} - creates bridges between chains, specifying the source, target, and message handler. Example: ```rust decl_test_bridges! { pub struct RococoWestendMockBridge { source = BridgeHubRococoPara, target = BridgeHubWestendPara, handler = RococoWestendMessageHandler }, pub struct WestendRococoMockBridge { source = BridgeHubWestendPara, target = BridgeHubRococoPara, handler = WestendRococoMessageHandler } } ``` - [**`decl_test_networks`**](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/xcm/xcm-emulator/src/lib.rs#L916){target=\_blank} - defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust decl_test_networks! { pub struct WestendMockNet { relay_chain = Westend, parachains = vec![ AssetHubWestend, BridgeHubWestend, CollectivesWestend, CoretimeWestend, PeopleWestend, PenpalA, PenpalB, ], bridge = () }, } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. This framework enables thorough testing of runtime and cross-chain interactions, enabling developers to effectively design, test, and optimize cross-chain functionality. To see a complete example of implementing and executing tests, refer to the [integration tests](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/cumulus/parachains/integration-tests/emulated){target=\_blank} in the Polkadot SDK repository. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/interoperability/xcm-channels/ --- BEGIN CONTENT --- --- title: XCM Channels description: Learn how Polkadot's cross-consensus messaging (XCM) channels connect parachains, facilitating communication and blockchain interaction. categories: Basics, Polkadot Protocol --- # XCM Channels ## Introduction Polkadot is designed to enable interoperability between its connected parachains. At the core of this interoperability is the [Cross-Consensus Message Format (XCM)](/develop/interoperability/intro-to-xcm/){target=\_blank}, a standard language that allows parachains to communicate and interact with each other. 
The network-layer protocol responsible for delivering XCM-formatted messages between parachains is the Cross-Chain Message Passing (XCMP) protocol. XCMP maintains messaging queues on the relay chain, serving as a bridge to facilitate cross-chain interactions. As XCMP is still under development, Polkadot has implemented a temporary alternative called Horizontal Relay-routed Message Passing (HRMP). HRMP offers the same interface and functionality as the planned XCMP, but with a crucial difference: it stores all messages directly in the relay chain's storage, which is more resource-intensive. Once XCMP is fully implemented, HRMP will be deprecated in favor of the native XCMP protocol. XCMP will offer a more efficient and scalable solution for cross-chain message passing, as it will not require the relay chain to store all the messages. ## Establishing HRMP Channels To enable communication between parachains using the HRMP protocol, the parachains must explicitly establish communication channels by registering them on the relay chain. Downward and upward channels from and to the relay chain are implicitly available, meaning they do not need to be explicitly opened. Opening an HRMP channel requires the parachains involved to make a deposit on the relay chain. This deposit serves a specific purpose: it covers the costs associated with using the relay chain's storage for the message queues linked to the channel. The amount of this deposit varies based on parameters defined by the specific relay chain being used. ### Relay Chain Parameters Each Polkadot relay chain has a set of configurable parameters that control the behavior of the message channels between parachains. These parameters include `hrmpSenderDeposit`, `hrmpRecipientDeposit`, `hrmpChannelMaxMessageSize`, `hrmpChannelMaxCapacity`, and more. When a parachain wants to open a new channel, it must consider these parameter values to ensure the channel is configured correctly. To view the current values of these parameters in the Polkadot network: 1. Visit [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fpolkadot.api.onfinality.io%2Fpublic-ws#/explorer), navigate to the **Developer** dropdown and select the **Chain state** option ![](/images/develop/interoperability/xcm-channels/xcm-channels-1.webp) 2. Query the chain configuration parameters. The result will display the current settings for all the Polkadot network parameters, including the HRMP channel settings 1. Select **`configuration`** 2. Choose the **`activeConfig()`** call 3. Click the **+** button to execute the query 4. Check the chain configuration ![](/images/develop/interoperability/xcm-channels/xcm-channels-2.webp) ### Dispatching Extrinsics Establishing new HRMP channels between parachains requires dispatching specific extrinsic calls on the Polkadot relay chain from the parachain's origin. The most straightforward approach is to implement the channel opening logic off-chain, then use the XCM pallet's `send` extrinsic to submit the necessary instructions to the relay chain. However, the ability to send arbitrary programs through the `Transact` instruction in XCM is typically restricted to privileged origins, such as the `sudo` pallet or governance mechanisms.
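To illustrate the off-chain approach described above, the following sketch (an illustrative example, not code from a relay chain runtime) shows the general shape of an XCM program a parachain might send to the relay chain to request a channel. Here, `encoded_hrmp_call` stands for the SCALE-encoded `hrmp_init_open_channel` call of the target relay chain, `fee_amount` and the weight values are placeholders, and the exact `Transact` field names vary between XCM versions.

```rust
// Sketch only: `fee_amount`, `encoded_hrmp_call`, and the weight values are
// placeholders and must match the target relay chain's runtime and XCM version.
let open_channel_program = Xcm(vec![
    // Withdraw relay chain tokens from the parachain's sovereign account to pay fees.
    WithdrawAsset((Here, fee_amount).into()),
    BuyExecution {
        fees: (Here, fee_amount).into(),
        weight_limit: WeightLimit::Unlimited,
    },
    // Ask the relay chain to dispatch the encoded `hrmp_init_open_channel` call
    // on behalf of the parachain's origin.
    Transact {
        origin_kind: OriginKind::Native,
        require_weight_at_most: Weight::from_parts(1_000_000_000, 200_000),
        call: encoded_hrmp_call.into(),
    },
]);
```

A privileged origin on the parachain would then submit this program to the relay chain (its parent location) through the XCM pallet's `send` extrinsic, using one of the mechanisms listed below.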
Parachain developers have a few options for triggering the required extrinsic calls from their parachain's origin, depending on the configuration and access controls defined: - **Sudo** - if the parachain has a `sudo` pallet configured, the sudo key holder can use the sudo extrinsic to dispatch the necessary channel opening calls - **Governance** - the parachain's governance system, such as a council or OpenGov, can be used to authorize the channel opening calls - **Privileged accounts** - the parachain may have other designated privileged accounts that are allowed to dispatch the HRMP channel opening extrinsics ## Where to Go Next Explore the following tutorials for detailed, step-by-step guidance on setting up cross-chain communication channels in Polkadot:
- Tutorial __Opening HRMP Channels Between Parachains__ --- Learn how to open HRMP channels between parachains on Polkadot. Discover the step-by-step process for establishing uni- and bidirectional communication. [:octicons-arrow-right-24: Reference](/tutorials/interoperability/xcm-channels/para-to-para/) - Tutorial __Opening HRMP Channels with System Parachains__ --- Learn how to open HRMP channels with Polkadot system parachains. Discover the process for establishing bi-directional communication using a single XCM message. [:octicons-arrow-right-24: Reference](/tutorials/interoperability/xcm-channels/para-to-system/)
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/networks/ --- BEGIN CONTENT --- --- title: Networks description: Explore the Polkadot ecosystem networks and learn the unique purposes of each, tailored for blockchain innovation, testing, and enterprise-grade solutions. template: root-subdirectory-page.html categories: Basics, Networks --- # Networks ## Introduction The Polkadot ecosystem consists of multiple networks designed to support different stages of blockchain development, from main networks to test networks. Each network serves a unique purpose, providing developers with flexible environments for building, testing, and deploying blockchain applications. This section includes essential network information such as RPC endpoints, currency symbols and decimals, and how to acquire TestNet tokens for the Polkadot ecosystem of networks. ## Production Networks ### Polkadot Polkadot is the primary production blockchain network for high-stakes, enterprise-grade applications. Polkadot MainNet has been running since May 2020 and has implementations in various programming languages ranging from Rust to JavaScript. === "Network Details" **Currency symbol** - `DOT` --- **Currency decimals** - 10 --- **Block explorer** - [Polkadot Subscan](https://polkadot.subscan.io/){target=\_blank} === "RPC Endpoints" Blockops ``` wss://polkadot-public-rpc.blockops.network/ws ``` --- Dwellir ``` wss://polkadot-rpc.dwellir.com ``` --- Dwellir Tunisia ``` wss://polkadot-rpc-tn.dwellir.com ``` --- IBP1 ``` wss://rpc.ibp.network/polkadot ``` --- IBP2 ``` wss://polkadot.dotters.network ``` --- LuckyFriday ``` wss://rpc-polkadot.luckyfriday.io ``` --- OnFinality ``` wss://polkadot.api.onfinality.io/public-ws ``` --- RadiumBlock ``` wss://polkadot.public.curie.radiumblock.co/ws ``` --- RockX ``` wss://rockx-dot.w3node.com/polka-public-dot/ws ``` --- Stakeworld ``` wss://dot-rpc.stakeworld.io ``` --- SubQuery ``` wss://polkadot.rpc.subquery.network/public/ws ``` --- Light client ``` light://substrate-connect/polkadot ``` ### Kusama Kusama is a network built as a risk-taking, fast-moving "canary in the coal mine" for its cousin Polkadot. As it is built on top of the same infrastructure, Kusama often acts as a final testing ground for new features before they are launched on Polkadot. Unlike true TestNets, however, the Kusama KSM native token does have economic value. This incentive encourages participants to maintain this robust and performant structure for the benefit of the community. === "Network Details" **Currency symbol** - `KSM` --- **Currency decimals** - 12 --- **Block explorer** - [Kusama Subscan](https://kusama.subscan.io/){target=\_blank} === "RPC Endpoints" Dwellir ``` wss://kusama-rpc.dwellir.com ``` --- Dwellir Tunisia ``` wss://kusama-rpc-tn.dwellir.com ``` --- IBP1 ``` wss://rpc.ibp.network/kusama ``` --- IBP2 ``` wss://kusama.dotters.network ``` --- LuckyFriday ``` wss://rpc-kusama.luckyfriday.io ``` --- OnFinality ``` wss://kusama.api.onfinality.io/public-ws ``` --- RadiumBlock ``` wss://kusama.public.curie.radiumblock.co/ws ``` --- RockX ``` wss://rockx-ksm.w3node.com/polka-public-ksm/ws ``` --- Stakeworld ``` wss://rockx-ksm.w3node.com/polka-public-ksm/ws ``` --- Light client ``` light://substrate-connect/kusama ``` ## Test Networks ### Westend Westend is the primary test network that mirrors Polkadot's functionality for protocol-level feature development. As a true TestNet, the WND native token intentionally does not have any economic value.
Use the faucet information in the following section to obtain WND tokens. === "Network Information" **Currency symbol** - `WND` --- **Currency decimals** - 12 --- **Block explorer** - [Westend Subscan](https://westend.subscan.io/){target=\_blank} --- **Faucet** - [Official Westend faucet](https://faucet.polkadot.io/westend){target=\_blank} === "RPC Endpoints" Dwellir ``` wss://westend-rpc.dwellir.com ``` --- Dwellir Tunisia ``` wss://westend-rpc-tn.dwellir.com ``` --- IBP1 ``` wss://rpc.ibp.network/westend ``` --- IBP2 ``` wss://westend.dotters.network ``` --- OnFinality ``` wss://westend.api.onfinality.io/public-ws ``` --- Parity ``` wss://westend-rpc.polkadot.io ``` --- Light client ``` light://substrate-connect/westend ``` ### Paseo Paseo is a decentralised, community-run, stable testnet for parachain and dapp developers to build and test their applications. Unlike Westend, Paseo is not intended for protocol-level testing. As a true TestNet, the PAS native token intentionally does not have any economic value. Use the faucet information in the following section to obtain PAS tokens. === "Network Information" **RPC URL** ``` wss://paseo.rpc.amforc.com ``` --- **Currency symbol** - `PAS` --- **Currency decimals** - 10 --- **Block explorer** - [Paseo Subscan](https://paseo.subscan.io/){target=\_blank} --- **Faucet** - [Official Paseo faucet](https://faucet.polkadot.io/){target=\_blank} === "RPC Endpoints" Amforc ``` wss://paseo.rpc.amforc.com ``` --- Dwellir ``` wss://paseo-rpc.dwellir.com ``` --- IBP1 ``` wss://rpc.ibp.network/paseo ``` --- IBP2 ``` wss://paseo.dotters.network ``` --- StakeWorld ``` wss://pas-rpc.stakeworld.io ``` ## Additional Resources - [**Polkadot Fellowship runtimes repository**](https://github.com/polkadot-fellows/runtimes){target=\_blank} - find a collection of runtimes for Polkadot, Kusama, and their system-parachains as maintained by the community via the [Polkadot Technical Fellowship](https://wiki.polkadot.network/learn/learn-polkadot-technical-fellowship/){target=\_blank} --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/parachains/customize-parachain/overview/ --- BEGIN CONTENT --- --- title: Overview of FRAME description: Learn how Polkadot SDK's FRAME framework simplifies blockchain development with modular pallets and support libraries for efficient runtime design. categories: Basics, Parachains --- # Overview ## Introduction The runtime is the heart of any Polkadot SDK-based blockchain, handling the essential logic that governs state changes and transaction processing. With Polkadot SDK's [FRAME (Framework for Runtime Aggregation of Modularized Entities)](/polkadot-protocol/glossary/#frame-framework-for-runtime-aggregation-of-modularized-entities){target=\_blank}, developers gain access to a powerful suite of tools for building custom blockchain runtimes. FRAME offers a modular architecture, featuring reusable pallets and support libraries, to streamline development. This guide provides an overview of FRAME, its core components like pallets and system libraries, and demonstrates how to compose a runtime tailored to your specific blockchain use case. Whether you're integrating pre-built modules or designing custom logic, FRAME equips you with the tools to create scalable, feature-rich blockchains.
## FRAME Runtime Architecture The following diagram illustrates how FRAME components integrate into the runtime: ![](/images/develop/parachains/customize-parachain/overview/frame-overview-1.webp) All transactions sent to the runtime are handled by the `frame_executive` pallet, which dispatches them to the appropriate pallet for execution. These runtime modules contain the logic for specific blockchain features. The `frame_system` module provides core functions, while `frame_support` libraries offer useful tools to simplify pallet development. Together, these components form the backbone of a FRAME-based blockchain's runtime. ### Pallets Pallets are modular components within the FRAME ecosystem that encapsulate specific blockchain functionalities. These modules offer customizable business logic for various use cases and features that can be integrated into a runtime. Developers have the flexibility to implement any desired behavior in the core logic of the blockchain, such as: - Exposing new transactions - Storing information - Enforcing business rules Pallets also include necessary wiring code to ensure proper integration and functionality within the runtime. FRAME provides a range of [pre-built pallets](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/substrate/frame){target=\_blank} for standard and common blockchain functionalities, including consensus algorithms, staking mechanisms, governance systems, and more. These pre-existing pallets serve as building blocks or templates, which developers can use as-is, modify, or reference when creating custom functionalities. #### Pallet Structure Polkadot SDK heavily utilizes Rust macros, allowing developers to focus on specific functional requirements when writing pallets instead of dealing with technicalities and scaffolding code. A typical pallet skeleton looks like this: ```rust pub use pallet::*; #[frame_support::pallet] pub mod pallet { use frame_support::pallet_prelude::*; use frame_system::pallet_prelude::*; #[pallet::pallet] #[pallet::generate_store(pub(super) trait Store)] pub struct Pallet(_); #[pallet::config] // snip #[pallet::event] // snip #[pallet::error] // snip #[pallet::storage] // snip #[pallet::call] // snip } ``` All pallets, including custom ones, can implement these attribute macros: - **`#[frame_support::pallet]`** - marks the module as usable in the runtime - **`#[pallet::pallet]`** - applied to a structure used to retrieve module information easily - **`#[pallet::config]`** - defines the configuration for the pallet's data types - **`#[pallet::event]`** - defines events to provide additional information to users - **`#[pallet::error]`** - lists possible errors in an enum to be returned upon unsuccessful execution - **`#[pallet::storage]`** - defines elements to be persisted in storage - **`#[pallet::call]`** - defines functions exposed as transactions, allowing dispatch to the runtime These macros are applied as attributes to Rust modules, functions, structures, enums, and types and serve as the core components of a pallet. They enable the pallet to be built and added to the runtime, exposing the custom logic to the outer world. For a comprehensive guide on these and additional macros, see the [`pallet_macros`](https://paritytech.github.io/polkadot-sdk/master/frame_support/pallet_macros/index.html){target=\_blank} section in the Polkadot SDK documentation.
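To see how those attribute macros fit together in practice, here is a minimal, hypothetical pallet, a simplified sketch rather than production code, that persists a single value, exposes one dispatchable call, and emits an event; the weight is a hard-coded placeholder and benchmarking is omitted.

```rust
pub use pallet::*;

#[frame_support::pallet]
pub mod pallet {
    use frame_support::pallet_prelude::*;
    use frame_system::pallet_prelude::*;

    #[pallet::pallet]
    pub struct Pallet<T>(_);

    /// Configuration: ties the pallet's events to the runtime's event type.
    #[pallet::config]
    pub trait Config: frame_system::Config {
        type RuntimeEvent: From<Event<Self>> + IsType<<Self as frame_system::Config>::RuntimeEvent>;
    }

    /// A single `u32` value persisted in on-chain storage.
    #[pallet::storage]
    pub type StoredValue<T> = StorageValue<_, u32, ValueQuery>;

    #[pallet::event]
    #[pallet::generate_deposit(pub(super) fn deposit_event)]
    pub enum Event<T: Config> {
        /// The stored value was updated.
        ValueStored { value: u32 },
    }

    #[pallet::error]
    pub enum Error<T> {
        /// The submitted value exceeds the allowed maximum.
        ValueTooLarge,
    }

    #[pallet::call]
    impl<T: Config> Pallet<T> {
        /// A transaction that updates the stored value and emits an event.
        #[pallet::call_index(0)]
        #[pallet::weight(Weight::from_parts(10_000, 0))]
        pub fn store_value(origin: OriginFor<T>, value: u32) -> DispatchResult {
            // Only signed accounts may call this; the caller pays the transaction fee.
            let _who = ensure_signed(origin)?;
            if value > 1_000 {
                return Err(Error::<T>::ValueTooLarge.into());
            }
            StoredValue::<T>::put(value);
            Self::deposit_event(Event::ValueStored { value });
            Ok(())
        }
    }
}
```

In a runtime, such a pallet would then be registered alongside the pre-built pallets when the runtime is composed, as described in the next sections.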
### Support Libraries In addition to purpose-specific pallets, FRAME offers services and core libraries that facilitate composing and interacting with the runtime: - [**`frame_system` pallet**](https://paritytech.github.io/polkadot-sdk/master/frame_system/index.html){target=\_blank} - provides low-level types, storage, and functions for the runtime - [**`frame_executive` pallet**](https://paritytech.github.io/polkadot-sdk/master/frame_executive/index.html){target=\_blank} - orchestrates the execution of incoming function calls to the respective pallets in the runtime - [**`frame_support` crate**](https://paritytech.github.io/polkadot-sdk/master/frame_support/index.html){target=\_blank} - is a collection of Rust macros, types, traits, and modules that simplify the development of Substrate pallets - [**`frame_benchmarking` crate**](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/trait.Benchmark.html){target=\_blank} - contains common runtime patterns for benchmarking and testing purposes ## Compose a Runtime with Pallets The Polkadot SDK allows developers to construct a runtime by combining various pallets, both built-in and custom-made. This modular approach enables the creation of unique blockchain behaviors tailored to specific requirements. The following diagram illustrates the process of selecting and combining FRAME pallets to compose a runtime: ![](/images/develop/parachains/customize-parachain/overview/frame-overview-2.webp) This modular design allows developers to: - Rapidly prototype blockchain systems - Easily add or remove features by including or excluding pallets - Customize blockchain behavior without rebuilding core components - Leverage tested and optimized code from built-in pallets ## Starting from Templates Using pre-built templates is an efficient way to begin building a custom blockchain. Templates provide a foundational setup with pre-configured modules, letting developers avoid starting from scratch and instead focus on customization. Depending on your project's goals, whether you want a simple test chain, a standalone chain, or a parachain that integrates with Polkadot's relay chains, there are templates designed to suit different levels of complexity and scalability. ### Solochain Templates Solochain templates are designed for developers who want to create standalone blockchains that operate independently without connecting to a relay chain: - [**`minimal-template`**](https://github.com/paritytech/polkadot-sdk/tree/master/templates/minimal){target=\_blank} - includes only the essential components necessary for a functioning blockchain. It's ideal for developers who want to gain familiarity with blockchain basics and test simple customizations before scaling up - [**`solochain-template`**](https://github.com/paritytech/polkadot-sdk/tree/master/templates/solochain){target=\_blank} - provides a foundation for creating standalone blockchains with moderate features, including a simple consensus mechanism and several core FRAME pallets.
It's a solid starting point for developers who want a fully functional chain that doesn't depend on a relay chain ### Parachain Templates Parachain templates are specifically designed for chains that will connect to and interact with relay chains in the Polkadot ecosystem: - [**`parachain-template`**](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\_blank} - designed for connecting to relay chains like Polkadot, Kusama, or Paseo, this template enables a chain to operate as a parachain. For projects aiming to integrate with Polkadot's ecosystem, this template offers a great starting point - [**`OpenZeppelin`**](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main){target=\_blank} - offers two flexible starting points: - The [`generic-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/generic-template){target=\_blank} provides a minimal setup with essential pallets and secure defaults, creating a reliable foundation for custom blockchain development - The [`evm-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/evm-template){target=\_blank} enables EVM compatibility, allowing developers to migrate Solidity contracts and EVM-based dApps. This template is ideal for Ethereum developers looking to leverage Substrate's capabilities Choosing a suitable template depends on your project's unique requirements, level of customization, and integration needs. Starting from a template speeds up development and lets you focus on implementing your chain's unique features rather than the foundational blockchain setup. ## Where to Go Next For more detailed information on implementing this process, refer to the following sections: - [Add a Pallet to Your Runtime](/develop/parachains/customize-parachain/add-existing-pallets/) - [Create a Custom Pallet](/develop/parachains/customize-parachain/make-custom-pallet/) --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/parachains/install-polkadot-sdk/ --- BEGIN CONTENT --- --- title: Install Polkadot SDK Dependencies description: Install everything you need to begin working with Substrate-based blockchains and the Polkadot SDK, the framework for building blockchains. categories: Basics, Tooling --- # Install Polkadot SDK Dependencies This guide provides step-by-step instructions for installing the dependencies you need to work with Polkadot SDK-based chains on macOS, Linux, and Windows. Follow the appropriate section for your operating system to ensure all necessary tools are installed and configured properly. ## macOS You can install Rust and set up a Substrate development environment on Apple macOS computers with Intel or Apple M1 processors. ### Before You Begin Before you install Rust and set up your development environment on macOS, verify that your computer meets the following basic requirements: - Operating system version is 10.7 Lion or later - Processor speed of at least 2 GHz. Note that 3 GHz is recommended - Memory of at least 8 GB RAM. Note that 16 GB is recommended - Storage of at least 10 GB of available space - Broadband Internet connection #### Install Homebrew In most cases, you should use Homebrew to install and manage packages on macOS computers. If you don't already have Homebrew installed on your local computer, you should download and install it before continuing. To install Homebrew: 1. Open the Terminal application 2.
Download and install Homebrew by running the following command: ```bash /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)" ``` 3. Verify Homebrew has been successfully installed by running the following command: ```bash brew --version ``` The command displays output similar to the following:
brew --version Homebrew 4.3.15
#### Support for Apple Silicon Protobuf must be installed before the build process can begin. To install it, run the following command: ```bash brew install protobuf ``` ### Install Required Packages and Rust Because the blockchain requires standard cryptography to support the generation of public/private key pairs and the validation of transaction signatures, you must also have a package that provides cryptography, such as `openssl`. To install `openssl` and the Rust toolchain on macOS: 1. Open the Terminal application 2. Ensure you have an updated version of Homebrew by running the following command: ```bash brew update ``` 3. Install the `openssl` package by running the following command: ```bash brew install openssl ``` 4. Download the `rustup` installation program and use it to install Rust by running the following command: ```bash curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` 5. Follow the prompts displayed to proceed with a default installation 6. Update your current shell to include Cargo by running the following command: ```bash source ~/.cargo/env ``` 7. Configure the Rust toolchain to default to the latest stable version by running the following commands: ```bash rustup default stable rustup update rustup target add wasm32-unknown-unknown rustup component add rust-src ``` 8. [Verify your installation](#verifying-installation) 9. Install `cmake` using the following command: ```bash brew install cmake ``` ## Linux Rust supports most Linux distributions. Depending on the specific distribution and version of the operating system you use, you might need to add some software dependencies to your environment. In general, your development environment should include a linker or C-compatible compiler, such as `clang` and an appropriate integrated development environment (IDE). ### Before You Begin {: #before-you-begin-linux } Check the documentation for your operating system for information about the installed packages and how to download and install any additional packages you might need. For example, if you use Ubuntu, you can use the Ubuntu Advanced Packaging Tool (`apt`) to install the `build-essential` package: ```bash sudo apt install build-essential ``` At a minimum, you need the following packages before you install Rust: ```text clang curl git make ``` Because the blockchain requires standard cryptography to support the generation of public/private key pairs and the validation of transaction signatures, you must also have a package that provides cryptography, such as `libssl-dev` or `openssl-devel`. ### Install Required Packages and Rust {: #install-required-packages-and-rust-linux } To install the Rust toolchain on Linux: 1. Open a terminal shell 2. Check the packages you have installed on the local computer by running an appropriate package management command for your Linux distribution 3. 
Add any package dependencies you are missing to your local development environment by running the appropriate package management command for your Linux distribution: === "Ubuntu" ```bash sudo apt install --assume-yes git clang curl libssl-dev protobuf-compiler ``` === "Debian" ```sh sudo apt install --assume-yes git clang curl libssl-dev llvm libudev-dev make protobuf-compiler ``` === "Arch" ```sh pacman -Syu --needed --noconfirm curl git clang make protobuf ``` === "Fedora" ```sh sudo dnf update sudo dnf install clang curl git openssl-devel make protobuf-compiler ``` === "OpenSUSE" ```sh sudo zypper install clang curl git openssl-devel llvm-devel libudev-devel make protobuf ``` Remember that different distributions might use different package managers and bundle packages in different ways. For example, depending on your installation selections, Ubuntu Desktop and Ubuntu Server might have different packages and different requirements. However, the packages listed in the command-line examples are applicable for many common Linux distributions, including Debian, Linux Mint, MX Linux, and Elementary OS. 4. Download the `rustup` installation program and use it to install Rust by running the following command: ```bash curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` 5. Follow the prompts displayed to proceed with a default installation 6. Update your current shell to include Cargo by running the following command: ```bash source $HOME/.cargo/env ``` 7. Verify your installation by running the following command: ```bash rustc --version ``` 8. Configure the Rust toolchain to default to the latest stable version by running the following commands: ```bash rustup default stable rustup update rustup target add wasm32-unknown-unknown rustup component add rust-src ``` 9. [Verify your installation](#verifying-installation) ## Windows (WSL) In general, UNIX-based operating systemsβ€”like macOS or Linuxβ€”provide a better development environment for building Substrate-based blockchains. However, suppose your local computer uses Microsoft Windows instead of a UNIX-based operating system. In that case, you can configure it with additional software to make it a suitable development environment for building Substrate-based blockchains. To prepare a development environment on a Microsoft Windows computer, you can use Windows Subsystem for Linux (WSL) to emulate a UNIX operating environment. ### Before You Begin {: #before-you-begin-windows } Before installing on Microsoft Windows, verify the following basic requirements: - You have a computer running a supported Microsoft Windows operating system: - **For Windows desktop** - you must be running Microsoft Windows 10, version 2004 or later, or Microsoft Windows 11 to install WSL - **For Windows server** - you must be running Microsoft Windows Server 2019, or later, to install WSL on a server operating system - You have good internet connection and access to a shell terminal on your local computer ### Set Up Windows Subsystem for Linux WSL enables you to emulate a Linux environment on a computer that uses the Windows operating system. The primary advantage of this approach for Substrate development is that you can use all of the code and command-line examples as described in the Substrate documentation. For example, you can run common commandsβ€”such as `ls` and `ps`β€”unmodified. By using WSL, you can avoid configuring a virtual machine image or a dual-boot operating system. To prepare a development environment using WSL: 1. 
Check your Windows version and build number to see if WSL is enabled by default. If you have Microsoft Windows 10, version 2004 (Build 19041 and higher), or Microsoft Windows 11, WSL is available by default and you can continue to the next step. If you have an older version of Microsoft Windows installed, see the [WSL manual installation steps for older versions](https://learn.microsoft.com/en-us/windows/wsl/install-manual){target=\_blank}. If you are installing on an older version of Microsoft Windows, you can download and install WSL 2 if your computer has Windows 10, version 1903 or higher 2. Select **Windows PowerShell** or **Command Prompt** from the **Start** menu, right-click, then **Run as administrator** 3. In the PowerShell or Command Prompt terminal, run the following command: ```bash wsl --install ``` This command enables the required WSL 2 components that are part of the Windows operating system, downloads the latest Linux kernel, and installs the Ubuntu Linux distribution by default. If you want to review the other Linux distributions available, run the following command: ```bash wsl --list --online ``` 4. After the distribution is downloaded, close the terminal 5. Click the **Start** menu, select **Shut down or sign out**, then click **Restart** to restart the computer. Restarting the computer is required to start the installation of the Linux distribution. It can take a few minutes for the installation to complete after you restart. For more information about setting up WSL as a development environment, see the [Set up a WSL development environment](https://learn.microsoft.com/en-us/windows/wsl/setup/environment){target=\_blank} docs ### Install Required Packages and Rust {: #install-required-packages-and-rust-windows } To install the Rust toolchain on WSL: 1. Click the **Start** menu, then select **Ubuntu** 2. Type a UNIX user name to create a user account 3. Type a password for your UNIX user, then retype the password to confirm it 4. Download the latest updates for the Ubuntu distribution using the Ubuntu Advanced Packaging Tool (`apt`) by running the following command: ```bash sudo apt update ``` 5. Add the required packages for the Ubuntu distribution by running the following command: ```bash sudo apt install --assume-yes git clang curl libssl-dev llvm libudev-dev make protobuf-compiler ``` 6. Download the `rustup` installation program and use it to install Rust for the Ubuntu distribution by running the following command: ```bash curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` 7. Follow the prompts displayed to proceed with a default installation 8. Update your current shell to include Cargo by running the following command: ```bash source ~/.cargo/env ``` 9. Verify your installation by running the following command: ```bash rustc --version ``` 10. Configure the Rust toolchain to use the latest stable version as the default toolchain by running the following commands: ```bash rustup default stable rustup update rustup target add wasm32-unknown-unknown rustup component add rust-src ``` 11. [Verify your installation](#verifying-installation) ## Verifying Installation Verify the configuration of your development environment by running the following command: ```bash rustup show ``` The command displays output similar to the following:
rustup show ...
active toolchain ---------------- name: stable-aarch64-apple-darwin active because: it's the default toolchain installed targets: aarch64-apple-darwin wasm32-unknown-unknown
## Where to Go Next - [Parachain Zero to Hero Tutorials](/tutorials/polkadot-sdk/parachains/zero-to-hero/){target=\_blank} - a series of step-by-step guides to building, testing, and deploying custom pallets and runtimes using the Polkadot SDK --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/parachains/intro-polkadot-sdk/ --- BEGIN CONTENT --- --- title: Introduction to Polkadot SDK description: Learn about the Polkadot SDK, a robust developer toolkit for building custom blockchains. Explore its components and how it powers the Polkadot protocol. categories: Basics, Tooling --- # Introduction to Polkadot SDK ## Introduction The [Polkadot SDK](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}){target=\_blank} is a powerful and versatile developer kit designed to facilitate building on the Polkadot network. It provides the necessary components for creating custom blockchains, parachains, generalized rollups, and more. Written in the Rust programming language, it puts security and robustness at the forefront of its design. Whether you're building a standalone chain or deploying a parachain on Polkadot, this SDK equips developers with the libraries and tools needed to manage runtime logic, compile the codebase, and utilize core features like staking, governance, and Cross-Consensus Messaging (XCM). It also provides a means for building generalized peer-to-peer systems beyond blockchains. The Polkadot SDK houses the following overall functionality: - Networking and peer-to-peer communication (powered by [Libp2p](/polkadot-protocol/glossary#libp2p){target=\_blank}) - Consensus protocols, such as [BABE](/polkadot-protocol/glossary#blind-assignment-of-blockchain-extension-babe){target=\_blank}, [GRANDPA](/polkadot-protocol/glossary#grandpa){target=\_blank}, or [Aura](/polkadot-protocol/glossary#authority-round-aura){target=\_blank} - Cryptography - The ability to create portable Wasm runtimes - A selection of pre-built modules, called [pallets](/polkadot-protocol/glossary#pallet){target=\_blank} - Benchmarking and testing suites For an in-depth look at the monorepo, see the [Polkadot SDK Rust documentation](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/index.html){target=\_blank}. 
## Polkadot SDK Overview The Polkadot SDK is composed of five major components: ![](/images/develop/parachains/intro-polkadot-sdk/intro-polkadot-sdk-1.webp) - [**Substrate**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/substrate/index.html){target=\_blank} - a set of libraries and primitives for building blockchains - [**FRAME**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/frame_runtime/index.html){target=\_blank} - a blockchain development framework built on top of Substrate - [**Cumulus**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/cumulus/index.html){target=\_blank} - a set of libraries and pallets to add parachain capabilities to a Substrate/FRAME runtime - [**XCM (Cross Consensus Messaging)**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/xcm/index.html){target=\_blank} - the primary format for conveying messages between parachains - [**Polkadot**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/polkadot/index.html){target=\_blank} - the node implementation for the Polkadot protocol ### Substrate Substrate is a Software Development Kit (SDK) that uses Rust-based libraries and tools to enable you to build application-specific blockchains from modular and extensible components. Application-specific blockchains built with Substrate can run as standalone services or in parallel with other chains to take advantage of the shared security provided by the Polkadot ecosystem. Substrate includes default implementations of the core components of the blockchain infrastructure to allow you to focus on the application logic. Every blockchain platform relies on a decentralized network of computersβ€”called nodesβ€”that communicate with each other about transactions and blocks. In general, a node in this context is the software running on the connected devices rather than the physical or virtual machine in the network. As software, Substrate-based nodes consist of two main parts with separate responsibilities: - **Client** - services to handle network and blockchain infrastructure activity - Native binary - Executes the Wasm runtime - Manages components like database, networking, mempool, consensus, and others - Also known as "Host" - **Runtime** - business logic for state transitions - Application logic - Compiled to [Wasm](https://webassembly.org/){target=\_blank} - Stored as a part of the chain state - Also known as State Transition Function (STF) ```mermaid %%{init: {'flowchart': {'padding': 25, 'nodeSpacing': 10, 'rankSpacing': 50}}}%% graph TB %% Define comprehensive styles classDef titleStyle font-size:30px,font-weight:bold,stroke-width:2px,padding:20px subgraph sg1[Substrate Node] %% Add invisible spacer with increased height spacer[ ] style spacer height:2px,opacity:0 B[Wasm Runtime - STF] I[RuntimeCall Executor] subgraph sg2[Client] direction TB C[Network and Blockchain
Infrastructure Services] end I -.-> B end %% Apply comprehensive styles class sg1 titleStyle ``` ### FRAME FRAME provides the core modular and extensible components that make the Substrate SDK flexible and adaptable to different use cases. FRAME includes Rust-based libraries that simplify the development of application-specific logic. Most of the functionality that FRAME provides takes the form of plug-in modules called [pallets](/polkadot-protocol/glossary#pallet){target=\_blank} that you can add and configure to suit your requirements for a custom runtime. ```mermaid graph LR subgraph SP["Runtime"] direction LR Timestamp ~~~ Aura ~~~ GRANDPA Balances ~~~ TransactionPayment ~~~ Sudo subgraph Timestamp["Timestamp"] SS1[Custom Config] end subgraph Aura["Aura"] SS2[Custom Config] end subgraph GRANDPA["GRANDPA"] SS3[Custom Config] end subgraph Balances["Balances"] SS4[Custom Config] end subgraph TransactionPayment["Transaction Payment"] SS5[Custom Config] end subgraph Sudo["Sudo"] SS6[Custom Config] end style Timestamp stroke:#FF69B4 style Aura stroke:#FF69B4 style GRANDPA stroke:#FF69B4 style Balances stroke:#FF69B4 style TransactionPayment stroke:#FF69B4 style Sudo stroke:#FF69B4 style SS1 stroke-dasharray: 5 style SS2 stroke-dasharray: 5 style SS3 stroke-dasharray: 5 style SS4 stroke-dasharray: 5 style SS5 stroke-dasharray: 5 style SS6 stroke-dasharray: 5 end subgraph AP["FRAME Pallets"] direction LR A1[Aura]~~~A2[BABE]~~~A3[GRANDPA]~~~A4[Transaction\nPayment] B1[Identity]~~~B2[Balances]~~~B3[Sudo]~~~B4[EVM] C1[Timestamp]~~~C2[Assets]~~~C3[Contracts]~~~C4[and more...] end AP --> SP ``` ### Cumulus Cumulus provides utilities and libraries to turn FRAME-based runtimes into runtimes that can be a parachain on Polkadot. Cumulus runtimes are still FRAME runtimes but contain the necessary functionality that allows that runtime to become a parachain on a relay chain. ## Why Use Polkadot SDK? Using the Polkadot SDK, you can build application-specific blockchains without the complexity of building a blockchain from scratch or the limitations of building on a general-purpose blockchain. You can focus on crafting the business logic that makes your chain unique and innovative with the additional benefits of flexibility, upgradeability, open-source licensing, and cross-consensus interoperability. ## Create a Custom Blockchain Using the SDK Before starting your blockchain development journey, you'll need to decide whether you want to build a standalone chain or a parachain that connects to the Polkadot network. Each path has its considerations and requirements. Once you've made this decision, follow these development stages: ```mermaid graph LR A[Install the Polkadot SDK] --> B[Build the Chain] B --> C[Deploy the Chain] ``` 1. [**Install the Polkadot SDK**](/develop/parachains/install-polkadot-sdk/) - set up your development environment with all necessary dependencies and tools 2. [**Build the chain**](/develop/parachains/customize-parachain) - learn how to create and customize your blockchain's runtime, configure pallets, and implement your chain's unique features 3. [**Deploy the chain**](/develop/parachains/deployment) - follow the steps to launch your blockchain, whether as a standalone network or as a parachain on Polkadot Each stage is covered in detail in its respective guide, walking you through the process from initial setup to final deployment. 
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/overview/ --- BEGIN CONTENT --- --- title: Smart Contracts Overview description: Learn about smart contract development capabilities in the Polkadot ecosystem, either by leveraging Polkadot Hub or other alternatives. categories: Basics, Smart Contracts --- # Smart Contracts on Polkadot !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Polkadot offers developers multiple approaches to building and deploying smart contracts within its ecosystem. As a multi-chain network designed for interoperability, Polkadot provides various environments optimized for different developer preferences and application requirements. From native smart contract support on Polkadot Hub to specialized parachain environments, developers can choose the platform that best suits their technical needs while benefiting from Polkadot's shared security model and cross-chain messaging capabilities. Whether you're looking for Ethereum compatibility through EVM-based parachains like [Moonbeam](https://docs.moonbeam.network/){target=\_blank}, [Astar](https://docs.astar.network/){target=\_blank}, and [Acala](https://evmdocs.acala.network/){target=\_blank} or prefer PolkaVM-based development with [ink!](https://use.ink/docs/v6/){target=\_blank}, the Polkadot ecosystem accommodates a range of diverse developers. These guides explore the diverse smart contract options available in the Polkadot ecosystem, helping developers understand the unique advantages of each approach and make informed decisions about where to deploy their decentralized applications. ## Native Smart Contracts ### Introduction Polkadot Hub enables smart contract deployment and execution through PolkaVM, a cutting-edge virtual machine designed specifically for the Polkadot ecosystem. This native integration allows developers to deploy smart contracts directly on Polkadot's system chain while maintaining compatibility with Ethereum development tools and workflows. ### Smart Contract Development The smart contract platform on Polkadot Hub combines _Polkadot's robust security and scalability_ with the extensive Ethereum development ecosystem. Developers can utilize familiar Ethereum libraries for contract interactions and leverage industry-standard development environments for writing and testing smart contracts. Polkadot Hub provides _full Ethereum JSON-RPC API compatibility_, ensuring seamless integration with existing development tools and services. This compatibility enables developers to maintain their preferred workflows while building on Polkadot's native infrastructure. ### Technical Architecture PolkaVM, the underlying virtual machine, utilizes a RISC-V-based register architecture _optimized for the Polkadot ecosystem_. This design choice offers several advantages: - Enhanced performance for smart contract execution. - Improved gas efficiency for complex operations. - Native compatibility with Polkadot's runtime environment. - Optimized storage and state management. ### Development Tools and Resources Polkadot Hub supports a comprehensive suite of development tools familiar to Ethereum developers. The platform integrates with popular development frameworks, testing environments, and deployment tools. Key features include: - Contract development in Solidity or Rust. - Support for standard Ethereum development libraries. 
- Integration with widely used development environments. - Access to blockchain explorers and indexing solutions. - Compatibility with contract monitoring and management tools. ### Cross-Chain Capabilities Smart contracts deployed on Polkadot Hub can leverage Polkadot's [cross-consensus messaging (XCM) protocol](/develop/interoperability/intro-to-xcm/){target=\_blank} to seamlessly _transfer tokens and call functions on other blockchain networks_ within the Polkadot ecosystem, all without complex bridging infrastructure or third-party solutions. For further reference, check the [Interoperability](/develop/interoperability/){target=\_blank} section. ### Use Cases Polkadot Hub's smart contract platform is suitable for a wide range of applications: - DeFi protocols leveraging _cross-chain capabilities_. - NFT platforms utilizing Polkadot's native token standards. - Governance systems integrated with Polkadot's democracy mechanisms. - Cross-chain bridges and asset management solutions. ## Other Smart Contract Environments Beyond Polkadot Hub's native PolkaVM support, the ecosystem offers two main alternatives for smart contract development: - **EVM-compatible parachains**: Provide access to Ethereum's extensive developer ecosystem, smart contract portability, and established tooling like Hardhat, Remix, Foundry, and OpenZeppelin. The main options include Moonbeam (the first full Ethereum-compatible parachain serving as an interoperability hub), Astar (featuring dual VM support for both EVM and WebAssembly contracts), and Acala (DeFi-focused with enhanced Acala EVM+ offering advanced DeFi primitives). - **Rust (ink!)**: ink! is a Rust-based framework that can compile to PolkaVM. It uses [`#[ink(...)]`](https://use.ink/docs/v6/macros-attributes/){target=\_blank} attribute macros to create Polkadot SDK-compatible PolkaVM bytecode, offering strong memory safety from Rust, an advanced type system, high-performance PolkaVM execution, and platform independence with sandboxed security. Each environment provides unique advantages based on developer preferences and application requirements. ## Where to Go Next Developers can use their existing Ethereum development tools and connect to Polkadot Hub's RPC endpoints. The platform's Ethereum compatibility layer ensures a smooth transition for teams already building on Ethereum-compatible chains. Subsequent sections of this guide provide detailed information about specific development tools, advanced features, and best practices for building on Polkadot Hub.
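As a minimal illustration of that compatibility, the following sketch connects an existing Ethereum tool, ethers.js, to a Polkadot Hub JSON-RPC endpoint and reads basic chain data. The `INSERT_RPC_URL` and `INSERT_ADDRESS` values are placeholders, not documented endpoints; substitute the endpoint and account for the network you are targeting (see the Connect to Polkadot and Libraries pages for exact values).

```ts
import { ethers } from 'ethers';

async function main() {
  // Placeholder for a Polkadot Hub Ethereum JSON-RPC endpoint.
  const provider = new ethers.JsonRpcProvider('INSERT_RPC_URL');

  // Standard Ethereum JSON-RPC calls work unchanged against the compatibility layer.
  const network = await provider.getNetwork();
  const blockNumber = await provider.getBlockNumber();
  const balance = await provider.getBalance('INSERT_ADDRESS');

  console.log(`Chain ID: ${network.chainId}`);
  console.log(`Latest block: ${blockNumber}`);
  console.log(`Balance: ${ethers.formatEther(balance)}`);
}

main().catch(console.error);
```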
- Guide __Libraries__ --- Explore essential libraries to optimize smart contract development and interaction. [:octicons-arrow-right-24: Reference](/develop/smart-contracts/libraries/) - Guide __Dev Environments__ --- Set up your development environment for seamless contract deployment and testing. [:octicons-arrow-right-24: Reference](/develop/smart-contracts/dev-environments/)
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/toolkit/interoperability/asset-transfer-api/overview/ --- BEGIN CONTENT --- --- title: Asset Transfer API description: Asset Transfer API is a library that simplifies the transfer of assets for Polkadot SDK-based chains. It provides methods for cross-chain and local transfers. categories: Basics, Tooling, Dapps --- # Asset Transfer API ## Introduction [Asset Transfer API](https://github.com/paritytech/asset-transfer-api){target=\_blank}, a tool developed and maintained by [Parity](https://www.parity.io/){target=\_blank}, is a specialized library designed to streamline asset transfers for Polkadot SDK-based blockchains. This API provides a simplified set of methods for users to: - Execute asset transfers to other parachains or locally within the same chain - Facilitate transactions involving system parachains like Asset Hub (Polkadot and Kusama) Using this API, developers can manage asset transfers more efficiently, reducing the complexity of cross-chain transactions and enabling smoother operations within the ecosystem. For additional support and information, please reach out through [GitHub Issues](https://github.com/paritytech/asset-transfer-api/issues){target=\_blank}. ## Prerequisites Before you begin, ensure you have the following installed: - [Node.js](https://nodejs.org/en/){target=\_blank} (recommended version 21 or greater) - Package manager - [npm](https://www.npmjs.com/){target=\_blank} should be installed with Node.js by default. Alternatively, you can use other package managers like [Yarn](https://yarnpkg.com/){target=\_blank} This documentation covers version `{{dependencies.javascript_packages.asset_transfer_api.version}}` of Asset Transfer API. ## Install Asset Transfer API To use `asset-transfer-api`, you need a TypeScript project. If you don't have one, you can create a new one: 1. Create a new directory for your project: ```bash mkdir my-asset-transfer-project \ && cd my-asset-transfer-project ``` 2. Initialize a new TypeScript project: ```bash npm init -y \ && npm install typescript ts-node @types/node --save-dev \ && npx tsc --init ``` Once you have a project set up, you can install the `asset-transfer-api` package. Run the following command to install the package: ```bash npm install @substrate/asset-transfer-api@{{dependencies.javascript_packages.asset_transfer_api.version}} ``` ## Set Up Asset Transfer API To initialize the Asset Transfer API, you need three key components: - A Polkadot.js API instance - The `specName` of the chain - The XCM version to use ### Using Helper Function from Library Leverage the `constructApiPromise` helper function provided by the library for the simplest setup process. It not only constructs a Polkadot.js `ApiPromise` but also automatically retrieves the chain's `specName` and fetches a safe XCM version. By using this function, developers can significantly reduce boilerplate code and potential configuration errors, making the initial setup both quicker and more robust. ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'INSERT_WEBSOCKET_URL', ); const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion); // Your code using assetsApi goes here } main(); ``` !!!warning The code example is enclosed in an async main function to provide the necessary asynchronous context. 
However, you can use the code directly if you're already working within an async environment. The key is to ensure you're in an async context when working with these asynchronous operations, regardless of your specific setup. ## Asset Transfer API Reference For detailed information on the Asset Transfer API, including available methods, data types, and functionalities, refer to the [Asset Transfer API Reference](/develop/toolkit/interoperability/asset-transfer-api/reference){target=\_blank} section. This resource provides in-depth explanations and technical specifications to help you integrate and utilize the API effectively. ## Examples ### Relay to System Parachain Transfer This example demonstrates how to initiate a cross-chain token transfer from a relay chain to a system parachain. Specifically, 1 WND will be transferred from a Westend (relay chain) account to a Westmint (system parachain) account. ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://westend-rpc.polkadot.io', ); const assetApi = new AssetTransferApi(api, specName, safeXcmVersion); let callInfo; try { callInfo = await assetApi.createTransferTransaction( '1000', '5EWNeodpcQ6iYibJ3jmWVe85nsok1EDG8Kk3aFg8ZzpfY1qX', ['WND'], ['1000000000000'], { format: 'call', xcmVersion: safeXcmVersion, }, ); console.log(`Call data:\n${JSON.stringify(callInfo, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } const decoded = assetApi.decodeExtrinsic(callInfo.tx, 'call'); console.log(`\nDecoded tx:\n${JSON.stringify(JSON.parse(decoded), null, 4)}`); } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` After running the script, you'll see the following output in the terminal, which shows the call data for the cross-chain transfer and its decoded extrinsic details:
ts-node relayToSystem.ts
Call data: { "origin": "westend", "dest": "westmint", "direction": "RelayToSystem", "xcmVersion": 3, "method": "transferAssets", "format": "call", "tx": "0x630b03000100a10f03000101006c0c32faf970eacb2d4d8e538ac0dab3642492561a1be6f241c645876c056c1d030400000000070010a5d4e80000000000" } Decoded tx: { "args": { "dest": { "V3": { "parents": "0", "interior": { "X1": { "Parachain": "1,000" } } } }, "beneficiary": { "V3": { "parents": "0", "interior": { "X1": { "AccountId32": { "network": null, "id": "0x6c0c32faf970eacb2d4d8e538ac0dab3642492561a1be6f241c645876c056c1d" } } } } }, "assets": { "V3": [ { "id": { "Concrete": { "parents": "0", "interior": "Here" } }, "fun": { "Fungible": "1,000,000,000,000" } } ] }, "fee_asset_item": "0", "weight_limit": "Unlimited" }, "method": "transferAssets", "section": "xcmPallet" }
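If you want to go beyond inspecting call data and actually submit the transfer above, the library can also return a signable extrinsic. The sketch below assumes the `format: 'submittable'` option and a funded development account identified by a placeholder URI; double-check the option and return types against the Asset Transfer API reference before relying on this, and never hard-code real seeds.

```ts
import {
  AssetTransferApi,
  constructApiPromise,
} from '@substrate/asset-transfer-api';
import { Keyring } from '@polkadot/api';

async function main() {
  const { api, specName, safeXcmVersion } = await constructApiPromise(
    'wss://westend-rpc.polkadot.io',
  );
  const assetApi = new AssetTransferApi(api, specName, safeXcmVersion);

  // Request a submittable extrinsic instead of hex call data (assumed 'submittable' format).
  const callInfo = await assetApi.createTransferTransaction(
    '1000',
    '5EWNeodpcQ6iYibJ3jmWVe85nsok1EDG8Kk3aFg8ZzpfY1qX',
    ['WND'],
    ['1000000000000'],
    { format: 'submittable', xcmVersion: safeXcmVersion },
  );

  // Sign and send with a development account (placeholder URI for illustration only).
  const keyring = new Keyring({ type: 'sr25519' });
  const sender = keyring.addFromUri('INSERT_SEED_OR_DEV_URI');
  const hash = await callInfo.tx.signAndSend(sender);
  console.log(`Submitted with hash ${hash.toHex()}`);
}

main()
  .catch(console.error)
  .finally(() => process.exit());
```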
### Local Parachain Transfer The following example demonstrates a local GLMR transfer within Moonbeam, using the `balances` pallet. It transfers 1 GLMR token from one account to another account, where both the sender and recipient accounts are located on the same parachain. ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://wss.api.moonbeam.network', ); const assetApi = new AssetTransferApi(api, specName, safeXcmVersion); let callInfo; try { callInfo = await assetApi.createTransferTransaction( '2004', '0xF977814e90dA44bFA03b6295A0616a897441aceC', [], ['1000000000000000000'], { format: 'call', keepAlive: true, }, ); console.log(`Call data:\n${JSON.stringify(callInfo, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } const decoded = assetApi.decodeExtrinsic(callInfo.tx, 'call'); console.log(`\nDecoded tx:\n${JSON.stringify(JSON.parse(decoded), null, 4)}`); } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` Upon executing this script, the terminal will display the following output, illustrating the encoded extrinsic for the cross-chain message and its corresponding decoded format:
ts-node localParachainTx.ts
Call data: { "origin": "moonbeam", "dest": "moonbeam", "direction": "local", "xcmVersion": null, "method": "balances::transferKeepAlive", "format": "call", "tx": "0x0a03f977814e90da44bfa03b6295a0616a897441acec821a0600" } Decoded tx: { "args": { "dest": "0xF977814e90dA44bFA03b6295A0616a897441aceC", "value": "1,000,000,000,000,000,000" }, "method": "transferKeepAlive", "section": "balances" }
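Before submitting a transfer like the local one above, it is often useful to estimate its fee. The sketch below assumes the library exposes a `fetchFeeInfo` helper that accepts the constructed transaction and its format; confirm the exact method name and signature in the Asset Transfer API reference before using it.

```ts
import {
  AssetTransferApi,
  constructApiPromise,
} from '@substrate/asset-transfer-api';

async function main() {
  const { api, specName, safeXcmVersion } = await constructApiPromise(
    'wss://wss.api.moonbeam.network',
  );
  const assetApi = new AssetTransferApi(api, specName, safeXcmVersion);

  // Build the same local GLMR transfer as above, in 'call' format.
  const callInfo = await assetApi.createTransferTransaction(
    '2004',
    '0xF977814e90dA44bFA03b6295A0616a897441aceC',
    [],
    ['1000000000000000000'],
    { format: 'call', keepAlive: true },
  );

  // Assumed helper: returns dispatch information (including the partial fee)
  // for the constructed transaction; check the reference for the exact signature.
  const feeInfo = await assetApi.fetchFeeInfo(callInfo.tx, 'call');
  console.log(`Fee info:\n${JSON.stringify(feeInfo, null, 2)}`);
}

main()
  .catch(console.error)
  .finally(() => process.exit());
```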
### Parachain to Parachain Transfer This example demonstrates creating a cross-chain asset transfer between two parachains. It shows how to send vMOVR and vBNC from a Moonriver account to a Bifrost Kusama account using the safe XCM version. It connects to Moonriver, initializes the API, and uses the `createTransferTransaction` method to prepare a transaction. ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://moonriver.public.blastapi.io', ); const assetApi = new AssetTransferApi(api, specName, safeXcmVersion); let callInfo; try { callInfo = await assetApi.createTransferTransaction( '2001', '0xc4db7bcb733e117c0b34ac96354b10d47e84a006b9e7e66a229d174e8ff2a063', ['vMOVR', '72145018963825376852137222787619937732'], ['1000000', '10000000000'], { format: 'call', xcmVersion: safeXcmVersion, }, ); console.log(`Call data:\n${JSON.stringify(callInfo, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } const decoded = assetApi.decodeExtrinsic(callInfo.tx, 'call'); console.log(`\nDecoded tx:\n${JSON.stringify(JSON.parse(decoded), null, 4)}`); } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` After running this script, you'll see the following output in your terminal. This output presents the encoded extrinsic for the cross-chain message, along with its decoded format, providing a clear view of the transaction details.
ts-node paraToPara.ts
Call data: { "origin": "moonriver", "dest": "bifrost", "direction": "ParaToPara", "xcmVersion": 2, "method": "transferMultiassets", "format": "call", "tx": "0x6a05010800010200451f06080101000700e40b540200010200451f0608010a0002093d000000000001010200451f0100c4db7bcb733e117c0b34ac96354b10d47e84a006b9e7e66a229d174e8ff2a06300" } Decoded tx: { "args": { "assets": { "V2": [ { "id": { "Concrete": { "parents": "1", "interior": { "X2": [ { "Parachain": "2,001" }, { "GeneralKey": "0x0101" } ] } } }, "fun": { "Fungible": "10,000,000,000" } }, { "id": { "Concrete": { "parents": "1", "interior": { "X2": [ { "Parachain": "2,001" }, { "GeneralKey": "0x010a" } ] } } }, "fun": { "Fungible": "1,000,000" } } ] }, "fee_item": "0", "dest": { "V2": { "parents": "1", "interior": { "X2": [ { "Parachain": "2,001" }, { "AccountId32": { "network": "Any", "id": "0xc4db7bcb733e117c0b34ac96354b10d47e84a006b9e7e66a229d174e8ff2a063" } } ] } } }, "dest_weight_limit": "Unlimited" }, "method": "transferMultiassets", "section": "xTokens" }
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/toolkit/interoperability/xcm-tools/ --- BEGIN CONTENT --- --- title: XCM Tools description: Explore essential XCM tools across Polkadot, crafted to enhance cross-chain functionality and integration within the ecosystem. categories: Basics, Tooling, Dapps --- # XCM Tools ## Introduction As described in the [Interoperability](/develop/interoperability){target=\_blank} section, XCM (Cross-Consensus Messaging) is a protocol used in the Polkadot and Kusama ecosystems to enable communication and interaction between chains. It facilitates cross-chain communication, allowing assets, data, and messages to flow seamlessly across the ecosystem. As XCM is central to enabling communication between blockchains, developers need robust tools to help interact with, build, and test XCM messages. Several XCM tools simplify working with the protocol by providing libraries, frameworks, and utilities that enhance the development process, ensuring that applications built within the Polkadot ecosystem can efficiently use cross-chain functionalities. ## Popular XCM Tools ### Moonsong Labs XCM Tools [Moonsong Labs XCM Tools](https://github.com/Moonsong-Labs/xcm-tools){target=\_blank} provides a collection of scripts for managing and testing XCM operations between Polkadot SDK-based runtimes. These tools allow performing tasks like asset registration, channel setup, and XCM initialization. Key features include: - **Asset registration** - registers assets, setting units per second (up-front fees), and configuring error (revert) codes - **XCM initializer** - initializes XCM, sets default XCM versions, and configures revert codes for XCM-related precompiles - **HRMP manipulator** - manages HRMP channel actions, including opening, accepting, or closing channels - **XCM-Transactor-Info-Setter** - configures transactor information, including extra weight and fee settings - **Decode XCM** - decodes XCM messages on the relay chain or parachains to help interpret cross-chain communication To get started, clone the repository and install the required dependencies: ```bash git clone https://github.com/Moonsong-Labs/xcm-tools && cd xcm-tools && yarn install ``` For a full overview of each script, visit the [scripts](https://github.com/Moonsong-Labs/xcm-tools/tree/main/scripts){target=\_blank} directory or refer to the [official documentation](https://github.com/Moonsong-Labs/xcm-tools/blob/main/README.md){target=\_blank} on GitHub. ### ParaSpell [ParaSpell](https://paraspell.xyz/){target=\_blank} is a collection of open-source XCM tools designed to streamline cross-chain asset transfers and interactions within the Polkadot and Kusama ecosystems. It equips developers with an intuitive interface to manage and optimize XCM-based functionalities. 
Some key points included by ParaSpell are: - [**XCM SDK**](https://paraspell.xyz/#xcm-sdk){target=\_blank} - provides a unified layer to incorporate XCM into decentralized applications, simplifying complex cross-chain interactions - [**XCM API**](https://paraspell.xyz/#xcm-api){target=\_blank} - offers an efficient, package-free approach to integrating XCM functionality while offloading heavy computing tasks, minimizing costs and improving application performance - [**XCM router**](https://paraspell.xyz/#xcm-router){target=\_blank} - enables cross-chain asset swaps in a single command, allowing developers to send one asset type (such as DOT on Polkadot) and receive a different asset on another chain (like ASTR on Astar) - [**XCM analyser**](https://paraspell.xyz/#xcm-analyser){target=\_blank} - decodes and translates complex XCM multilocation data into readable information, supporting easier troubleshooting and debugging - [**XCM visualizator**](https://paraspell.xyz/#xcm-visualizator){target=\_blank} - a tool designed to give developers a clear, interactive view of XCM activity across the Polkadot ecosystem, providing insights into cross-chain communication flow ParaSpell's tools make it simple for developers to build, test, and deploy cross-chain solutions without needing extensive knowledge of the XCM protocol. With features like message composition, decoding, and practical utility functions for parachain interactions, ParaSpell is especially useful for debugging and optimizing cross-chain communications. ### Astar XCM Tools The [Astar parachain](https://github.com/AstarNetwork/Astar/tree/master){target=\_blank} offers a crate with a set of utilities for interacting with the XCM protocol. The [xcm-tools](https://github.com/AstarNetwork/Astar/tree/master/bin/xcm-tools){target=\_blank} crate provides a straightforward method for users to locate a sovereign account or calculate an XC20 asset ID. Some commands included by the xcm-tools crate allow users to perform the following tasks: - **Sovereign accounts** - obtain the sovereign account address for any parachain, either on the Relay Chain or for sibling parachains, using a simple command - **XC20 EVM addresses** - generate XC20-compatible Ethereum addresses for assets by entering the asset ID, making it easy to integrate assets across Ethereum-compatible environments - **Remote accounts** - retrieve remote account addresses needed for multi-location compatibility, using flexible options to specify account types and parachain IDs To start using these tools, clone the [Astar repository](https://github.com/AstarNetwork/Astar){target=\_blank} and compile the xcm-tools package: ```bash git clone https://github.com/AstarNetwork/Astar && cd Astar && cargo build --release -p xcm-tools ``` After compiling, verify the setup with the following command: ```bash ./target/release/xcm-tools --help ``` For more details on using Astar xcm-tools, consult the [official documentation](https://docs.astar.network/docs/learn/interoperability/xcm/integration/tools/){target=\_blank}. ### Chopsticks The Chopsticks library provides XCM functionality for testing XCM messages across networks, enabling you to fork multiple parachains along with a relay chain. For further details, see the [Chopsticks documentation](/tutorials/polkadot-sdk/testing/fork-live-chains/){target=\_blank} about XCM. 
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/ --- BEGIN CONTENT --- --- title: Add Pallets to the Runtime description: Add pallets to your runtime for custom functionality. Learn to configure and integrate pallets in Polkadot SDK-based blockchains. tutorial_badge: Beginner categories: Basics, Parachains --- # Add Pallets to the Runtime ## Introduction In previous tutorials, you learned how to [create a custom pallet](/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/){target=\_blank} and [test it](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-unit-testing/){target=\_blank}. The next step is to include this pallet in your runtime, integrating it into the core logic of your blockchain. This tutorial will guide you through adding two pallets to your runtime: the custom pallet you previously developed and the [utility pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/index.html){target=\_blank}. This standard Polkadot SDK pallet provides powerful dispatch functionality. The utility pallet offers, for example, batch dispatch, a stateless operation that enables executing multiple calls in a single transaction. ## Add the Pallets as Dependencies First, you'll update the runtime's `Cargo.toml` file to include the utility pallet and your custom pallet as dependencies for the runtime. Follow these steps: 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add the pallets with the following lines: ```toml hl_lines="3-4" title="Cargo.toml" [dependencies] ... pallet-utility = { version = "39.0.0", default-features = false } custom-pallet = { path = "../pallets/custom-pallet", default-features = false } ``` 2. In the `[features]` section, add the pallets to the `std` feature list: ```toml hl_lines="5-6" title="Cargo.toml" [features] default = ["std"] std = [ ... "pallet-utility/std", "custom-pallet/std", ] ``` 3. Save the changes and close the `Cargo.toml` file ### Update the Runtime Configuration Configure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets: 1. Add the `OriginCaller` import: ```rust title="mod.rs" hl_lines="2" // Local module imports use super::OriginCaller; ... ``` 2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file: ```rust title="mod.rs" hl_lines="7-25" ... impl pallet_parachain_template::Config for Runtime { type RuntimeEvent = RuntimeEvent; type WeightInfo = pallet_parachain_template::weights::SubstrateWeight<Runtime>; } // Configure utility pallet. impl pallet_utility::Config for Runtime { type RuntimeEvent = RuntimeEvent; type RuntimeCall = RuntimeCall; type PalletsOrigin = OriginCaller; type WeightInfo = pallet_utility::weights::SubstrateWeight<Runtime>; } // Define counter max value runtime constant. parameter_types! { pub const CounterMaxValue: u32 = 500; } // Configure custom pallet. impl custom_pallet::Config for Runtime { type RuntimeEvent = RuntimeEvent; type CounterMaxValue = CounterMaxValue; } ``` 3. Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets: ```rust hl_lines="8-12" title="lib.rs" mod runtime { #[runtime::runtime] #[runtime::derive( ...
)] pub struct Runtime; #[runtime::pallet_index(51)] pub type Utility = pallet_utility; #[runtime::pallet_index(52)] pub type CustomPallet = custom_pallet; } ``` ## Recompile the Runtime After adding and configuring your pallets in the runtime, the next step is to ensure everything is set up correctly. To do this, recompile the runtime with the following command (make sure you're in the project's root directory): ```bash cargo build --release ``` This command ensures the runtime compiles without errors, validates the pallet configurations, and prepares the build for subsequent testing or deployment. ## Run Your Chain Locally Launch your parachain locally and start producing blocks: !!!tip Generated chain TestNet specifications include development accounts "Alice" and "Bob." These accounts are pre-funded with native parachain currency, allowing you to sign and send TestNet transactions. Take a look at the [Polkadot.js Accounts section](https://polkadot.js.org/apps/#/accounts){target=\_blank} to view the development accounts for your chain. 1. Create a new chain specification file with the updated runtime: ```bash chain-spec-builder create -t development \ --relay-chain paseo \ --para-id 1000 \ --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \ named-preset development ``` 2. Start the omni node with the generated chain specification: ```bash polkadot-omni-node --chain ./chain_spec.json --dev ``` 3. Verify you can interact with the new pallets using the [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics){target=\_blank} interface (a scripted alternative using the Polkadot.js API is sketched after the links below). Navigate to the **Extrinsics** tab and check that you can see both pallets: - Utility pallet ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-1.webp) - Custom pallet ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-2.webp) ## Where to Go Next
- Tutorial __Deploy on Paseo TestNet__ --- Deploy your Polkadot SDK blockchain on Paseo! Follow this step-by-step guide for a seamless journey to a successful TestNet deployment. [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/deploy-to-testnet/) - Tutorial __Pallet Benchmarking (Optional)__ --- Discover how to measure extrinsic costs and assign precise weights to optimize your pallet for accurate fees and runtime performance. [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-benchmarking/)
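As a scripted alternative to the Polkadot.js Apps verification step described above, the following sketch uses the Polkadot.js API against the local node. The WebSocket port, the `utility` and `customPallet` section names (derived from the runtime metadata of the runtime configured in this tutorial), and the development account are assumptions; adjust them to match your chain.

```ts
import { ApiPromise, WsProvider, Keyring } from '@polkadot/api';

async function main() {
  // Connect to the local omni node started earlier (default WebSocket port assumed).
  const api = await ApiPromise.create({
    provider: new WsProvider('ws://127.0.0.1:9944'),
  });

  // Use the pre-funded development account "Alice".
  const keyring = new Keyring({ type: 'sr25519' });
  const alice = keyring.addFromUri('//Alice');

  // Batch two increments of the custom counter pallet through the utility pallet.
  // Section and method names are taken from the runtime metadata; verify them in Apps.
  const batch = api.tx.utility.batch([
    api.tx.customPallet.increment(2),
    api.tx.customPallet.increment(3),
  ]);

  await batch.signAndSend(alice, ({ status }) => {
    if (status.isInBlock) {
      console.log(`Batch included in block ${status.asInBlock.toHex()}`);
      process.exit(0);
    }
  });
}

main().catch(console.error);
```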
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/ --- BEGIN CONTENT --- --- title: Build a Custom Pallet description: Learn how to build a custom pallet for Polkadot SDK-based blockchains with this step-by-step guide. Create and configure a simple counter pallet from scratch. tutorial_badge: Beginner categories: Basics, Parachains --- # Build a Custom Pallet ## Introduction In Polkadot SDK-based blockchains, runtime functionality is built through modular components called [pallets](/polkadot-protocol/glossary#pallet){target=\_blank}. These pallets are Rust-based runtime modules created using [FRAME (Framework for Runtime Aggregation of Modular Entities)](/develop/parachains/customize-parachain/overview/){target=\_blank}, a powerful library that simplifies blockchain development by providing specialized macros and standardized patterns for building blockchain logic. A pallet encapsulates a specific set of blockchain functionalities, such as managing token balances, implementing governance mechanisms, or creating custom state transitions. In this tutorial, you'll learn how to create a custom pallet from scratch. You will develop a simple counter pallet with the following features: - Users can increment and decrement a counter - Only a [root origin](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Origin.html#variant.Root){target=\_blank} can set an arbitrary counter value ## Prerequisites You'll use the [Polkadot SDK Parachain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\_blank} created in the [Set Up a Template](/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/){target=\_blank} tutorial. ## Create a New Project In this tutorial, you'll build a custom pallet from scratch to demonstrate the complete workflow, rather than starting with the pre-built `pallet-template`. The first step is to create a new Rust package for your pallet: 1. Navigate to the `pallets` directory in your workspace: ```bash cd pallets ``` 2. Create a new Rust library project for your custom pallet by running the following command: ```bash cargo new --lib custom-pallet ``` 3. Enter the new project directory: ```bash cd custom-pallet ``` 4. Ensure the project was created successfully by checking its structure. The file layout should resemble the following: ``` custom-pallet ├── Cargo.toml └── src └── lib.rs ``` If the files are in place, your project setup is complete, and you're ready to start building your custom pallet. ## Add Dependencies To build and integrate your custom pallet into a Polkadot SDK-based runtime, you must add specific dependencies to the `Cargo.toml` file of your pallet's project. These dependencies provide essential modules and features required for pallet development. Since your custom pallet is part of a workspace that includes other components, such as the runtime, the configuration must align with the workspace structure. Follow the steps below to set up your `Cargo.toml` file properly: 1. Open your `Cargo.toml` file 2. Add the required dependencies in the `[dependencies]` section: ```toml [dependencies] codec = { features = ["derive"], workspace = true } scale-info = { features = ["derive"], workspace = true } frame = { features = ["experimental", "runtime"], workspace = true } ``` 3.
Enable `std` features: ```toml [features] default = ["std"] std = ["codec/std", "frame/std", "scale-info/std"] ``` The final `Cargo.toml` file should resemble the following: ??? code "Cargo.toml" ```toml [package] name = "custom-pallet" version = "0.1.0" license.workspace = true authors.workspace = true homepage.workspace = true repository.workspace = true edition.workspace = true [dependencies] codec = { features = ["derive"], workspace = true } scale-info = { features = ["derive"], workspace = true } frame = { features = ["experimental", "runtime"], workspace = true } [features] default = ["std"] std = ["codec/std", "frame/std", "scale-info/std"] runtime-benchmarks = ["frame/runtime-benchmarks"] ``` ## Implement the Pallet Logic In this section, you will construct the core structure of your custom pallet, starting with setting up its basic scaffold. This scaffold acts as the foundation, enabling you to later add functionality such as storage items, events, errors, and dispatchable calls. ### Add Scaffold Pallet Structure You now have the bare minimum of package dependencies that your pallet requires specified in the `Cargo.toml` file. The next step is to prepare the scaffolding for your new pallet. 1. Open `src/lib.rs` in a text editor and delete all the content 2. Prepare the scaffolding for the pallet by adding the following: ```rust title="lib.rs" #![cfg_attr(not(feature = "std"), no_std)] pub use pallet::*; #[frame::pallet] pub mod pallet { use super::*; use frame::prelude::*; #[pallet::pallet] pub struct Pallet(_); // Configuration trait for the pallet. #[pallet::config] pub trait Config: frame_system::Config { // Defines the event type for the pallet. } } ``` 3. Verify that it compiles by running the following command: ```bash cargo build --package custom-pallet ``` ### Pallet Configuration Implementing the `#[pallet::config]` macro is mandatory and sets the module's dependency on other modules and the types and values specified by the runtime-specific settings. In this step, you will configure two essential components that are critical for the pallet's functionality: - **`RuntimeEvent`** - since this pallet emits events, the [`RuntimeEvent`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html#associatedtype.RuntimeEvent){target=\_blank} type is required to handle them. This ensures that events generated by the pallet can be correctly processed and interpreted by the runtime - **`CounterMaxValue`** - a constant that sets an upper limit on the value of the counter, ensuring that the counter remains within a predefined range Add the following `Config` trait definition to your pallet: ```rust title="lib.rs" #[pallet::config] pub trait Config: frame_system::Config { // Defines the event type for the pallet. type RuntimeEvent: From<Event<Self>> + IsType<<Self as frame_system::Config>::RuntimeEvent>; // Defines the maximum value the counter can hold. #[pallet::constant] type CounterMaxValue: Get<u32>; } ``` ### Add Events Events allow the pallet to communicate with the outside world by emitting signals when specific actions occur. These events are critical for transparency, debugging, and integration with external systems such as UIs or monitoring tools. Below are the events defined for this pallet: - **`CounterValueSet`** - is emitted when the counter is explicitly set to a new value. This event includes the counter's updated value - **`CounterIncremented`** - is emitted after a successful increment operation.
It includes: - The new counter value - The account responsible for the increment - The amount by which the counter was incremented - **`CounterDecremented`** - is emitted after a successful decrement operation. It includes: - The new counter value - The account responsible for the decrement - The amount by which the counter was decremented Define the events in the pallet as follows: ```rust title="lib.rs" #[pallet::event] #[pallet::generate_deposit(pub(super) fn deposit_event)] pub enum Event<T: Config> { /// The counter value has been set to a new value by Root. CounterValueSet { /// The new value set. counter_value: u32, }, /// A user has successfully incremented the counter. CounterIncremented { /// The new value set. counter_value: u32, /// The account who incremented the counter. who: T::AccountId, /// The amount by which the counter was incremented. incremented_amount: u32, }, /// A user has successfully decremented the counter. CounterDecremented { /// The new value set. counter_value: u32, /// The account who decremented the counter. who: T::AccountId, /// The amount by which the counter was decremented. decremented_amount: u32, }, } ``` ### Add Storage Items Storage items are used to manage the pallet's state. This pallet defines two items to handle the counter's state and user interactions: - **`CounterValue`** - a single storage value that keeps track of the current value of the counter. This value is the core state variable manipulated by the pallet's functions - **`UserInteractions`** - a storage map that tracks the number of times each account interacts with the counter Define the storage items as follows: ```rust title="lib.rs" #[pallet::storage] pub type CounterValue<T> = StorageValue<_, u32>; /// Storage map to track the number of interactions performed by each account. #[pallet::storage] pub type UserInteractions<T: Config> = StorageMap<_, Twox64Concat, T::AccountId, u32>; ``` ### Implement Custom Errors The `#[pallet::error]` macro defines a custom `Error` enum to handle specific failure conditions within the pallet. Errors help provide meaningful feedback to users and external systems when an extrinsic cannot be completed successfully. They are critical for maintaining the pallet's clarity and robustness. To add custom errors, use the `#[pallet::error]` macro to define the `Error` enum. Each variant represents a unique error that the pallet can emit, and these errors should align with the logic and constraints of the pallet. Add the following errors to the pallet: ```rust title="lib.rs" #[pallet::error] pub enum Error<T> { /// The counter value exceeds the maximum allowed value. CounterValueExceedsMax, /// The counter value cannot be decremented below zero. CounterValueBelowZero, /// Overflow occurred in the counter. CounterOverflow, /// Overflow occurred in user interactions. UserInteractionOverflow, } ``` ### Implement Calls The `#[pallet::call]` macro defines the dispatchable functions (or calls) the pallet exposes. These functions allow users or the runtime to interact with the pallet's logic and state. Each call includes comprehensive validations, modifies the state, and optionally emits events to signal successful execution. The structure of the dispatchable calls in this pallet is as follows: ```rust title="lib.rs" #[pallet::call] impl<T: Config> Pallet<T> { /// Set the value of the counter. /// /// The dispatch origin of this call must be _Root_. /// /// - `new_value`: The new value to set for the counter. /// /// Emits `CounterValueSet` event when successful.
    #[pallet::call_index(0)]
    #[pallet::weight(0)]
    pub fn set_counter_value(origin: OriginFor<T>, new_value: u32) -> DispatchResult {
    }

    /// Increment the counter by a specified amount.
    ///
    /// This function can be called by any signed account.
    ///
    /// - `amount_to_increment`: The amount by which to increment the counter.
    ///
    /// Emits `CounterIncremented` event when successful.
    #[pallet::call_index(1)]
    #[pallet::weight(0)]
    pub fn increment(origin: OriginFor<T>, amount_to_increment: u32) -> DispatchResult {
    }

    /// Decrement the counter by a specified amount.
    ///
    /// This function can be called by any signed account.
    ///
    /// - `amount_to_decrement`: The amount by which to decrement the counter.
    ///
    /// Emits `CounterDecremented` event when successful.
    #[pallet::call_index(2)]
    #[pallet::weight(0)]
    pub fn decrement(origin: OriginFor<T>, amount_to_decrement: u32) -> DispatchResult {
    }
}
```

Expand the following items to view the implementations of each dispatchable call in this pallet.

???- code "set_counter_value(origin: OriginFor<T>, new_value: u32) -> DispatchResult"

    This call sets the counter to a specific value. It is restricted to the Root origin, meaning it can only be invoked by privileged users or entities.

    - **Parameters**:
        - `new_value` - the value to set the counter to
    - **Validations**:
        - The new value must not exceed the maximum allowed counter value (`CounterMaxValue`)
    - **Behavior**:
        - Updates the `CounterValue` storage item
        - Emits a `CounterValueSet` event on success

    ```rust title="lib.rs"
    /// Set the value of the counter.
    ///
    /// The dispatch origin of this call must be _Root_.
    ///
    /// - `new_value`: The new value to set for the counter.
    ///
    /// Emits `CounterValueSet` event when successful.
    #[pallet::call_index(0)]
    #[pallet::weight(0)]
    pub fn set_counter_value(origin: OriginFor<T>, new_value: u32) -> DispatchResult {
        ensure_root(origin)?;

        ensure!(
            new_value <= T::CounterMaxValue::get(),
            Error::<T>::CounterValueExceedsMax
        );

        CounterValue::<T>::put(new_value);

        Self::deposit_event(Event::<T>::CounterValueSet {
            counter_value: new_value,
        });

        Ok(())
    }
    ```

???- code "increment(origin: OriginFor<T>, amount_to_increment: u32) -> DispatchResult"

    This call increments the counter by a specified amount. It is accessible to any signed account.

    - **Parameters**:
        - `amount_to_increment` - the amount to add to the counter
    - **Validations**:
        - Prevents overflow during the addition
        - Ensures the resulting counter value does not exceed `CounterMaxValue`
    - **Behavior**:
        - Updates the `CounterValue` storage item
        - Tracks the number of interactions by the user in the `UserInteractions` storage map
        - Emits a `CounterIncremented` event on success

    ```rust title="lib.rs"
    /// Increment the counter by a specified amount.
    ///
    /// This function can be called by any signed account.
    ///
    /// - `amount_to_increment`: The amount by which to increment the counter.
    ///
    /// Emits `CounterIncremented` event when successful.
    #[pallet::call_index(1)]
    #[pallet::weight(0)]
    pub fn increment(origin: OriginFor<T>, amount_to_increment: u32) -> DispatchResult {
        let who = ensure_signed(origin)?;

        let current_value = CounterValue::<T>::get().unwrap_or(0);

        let new_value = current_value
            .checked_add(amount_to_increment)
            .ok_or(Error::<T>::CounterOverflow)?;

        ensure!(
            new_value <= T::CounterMaxValue::get(),
            Error::<T>::CounterValueExceedsMax
        );

        CounterValue::<T>::put(new_value);

        UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
            let new_interactions = interactions
                .unwrap_or(0)
                .checked_add(1)
                .ok_or(Error::<T>::UserInteractionOverflow)?;
            *interactions = Some(new_interactions); // Store the new value.

            Ok(())
        })?;

        Self::deposit_event(Event::<T>::CounterIncremented {
            counter_value: new_value,
            who,
            incremented_amount: amount_to_increment,
        });

        Ok(())
    }
    ```

???- code "decrement(origin: OriginFor<T>, amount_to_decrement: u32) -> DispatchResult"

    This call decrements the counter by a specified amount. It is accessible to any signed account.

    - **Parameters**:
        - `amount_to_decrement` - the amount to subtract from the counter
    - **Validations**:
        - Prevents underflow during the subtraction
        - Ensures the counter does not drop below zero
    - **Behavior**:
        - Updates the `CounterValue` storage item
        - Tracks the number of interactions by the user in the `UserInteractions` storage map
        - Emits a `CounterDecremented` event on success

    ```rust title="lib.rs"
    /// Decrement the counter by a specified amount.
    ///
    /// This function can be called by any signed account.
    ///
    /// - `amount_to_decrement`: The amount by which to decrement the counter.
    ///
    /// Emits `CounterDecremented` event when successful.
    #[pallet::call_index(2)]
    #[pallet::weight(0)]
    pub fn decrement(origin: OriginFor<T>, amount_to_decrement: u32) -> DispatchResult {
        let who = ensure_signed(origin)?;

        let current_value = CounterValue::<T>::get().unwrap_or(0);

        let new_value = current_value
            .checked_sub(amount_to_decrement)
            .ok_or(Error::<T>::CounterValueBelowZero)?;

        CounterValue::<T>::put(new_value);

        UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
            let new_interactions = interactions
                .unwrap_or(0)
                .checked_add(1)
                .ok_or(Error::<T>::UserInteractionOverflow)?;
            *interactions = Some(new_interactions); // Store the new value.

            Ok(())
        })?;

        Self::deposit_event(Event::<T>::CounterDecremented {
            counter_value: new_value,
            who,
            decremented_amount: amount_to_decrement,
        });

        Ok(())
    }
    ```

## Verify Compilation

After implementing all the pallet components, verifying that the code still compiles successfully is crucial. Run the following command in your terminal to ensure there are no errors:

```bash
cargo build --package custom-pallet
```

If you encounter any errors or warnings, carefully review your code to resolve the issues. Once the build completes without errors, your pallet implementation is ready.

## Key Takeaways

In this tutorial, you learned how to create a custom pallet by defining storage, implementing errors, adding dispatchable calls, and emitting events. These are the foundational building blocks for developing robust Polkadot SDK-based blockchain logic.

Expand the following item to review this implementation and the complete pallet code.

???- code "src/lib.rs"

    ```rust title="lib.rs"
    #![cfg_attr(not(feature = "std"), no_std)]

    pub use pallet::*;

    #[frame::pallet]
    pub mod pallet {
        use super::*;
        use frame::prelude::*;

        #[pallet::pallet]
        pub struct Pallet<T>(_);

        // Configuration trait for the pallet.
        #[pallet::config]
        pub trait Config: frame_system::Config {
            // Defines the event type for the pallet.
            type RuntimeEvent: From<Event<Self>> + IsType<<Self as frame_system::Config>::RuntimeEvent>;

            // Defines the maximum value the counter can hold.
            #[pallet::constant]
            type CounterMaxValue: Get<u32>;
        }

        #[pallet::event]
        #[pallet::generate_deposit(pub(super) fn deposit_event)]
        pub enum Event<T: Config> {
            /// The counter value has been set to a new value by Root.
            CounterValueSet {
                /// The new value set.
                counter_value: u32,
            },
            /// A user has successfully incremented the counter.
            CounterIncremented {
                /// The new value set.
                counter_value: u32,
                /// The account who incremented the counter.
                who: T::AccountId,
                /// The amount by which the counter was incremented.
                incremented_amount: u32,
            },
            /// A user has successfully decremented the counter.
            CounterDecremented {
                /// The new value set.
                counter_value: u32,
                /// The account who decremented the counter.
                who: T::AccountId,
                /// The amount by which the counter was decremented.
                decremented_amount: u32,
            },
        }

        /// Storage for the current value of the counter.
        #[pallet::storage]
        pub type CounterValue<T> = StorageValue<_, u32>;

        /// Storage map to track the number of interactions performed by each account.
        #[pallet::storage]
        pub type UserInteractions<T: Config> = StorageMap<_, Twox64Concat, T::AccountId, u32>;

        #[pallet::error]
        pub enum Error<T> {
            /// The counter value exceeds the maximum allowed value.
            CounterValueExceedsMax,
            /// The counter value cannot be decremented below zero.
            CounterValueBelowZero,
            /// Overflow occurred in the counter.
            CounterOverflow,
            /// Overflow occurred in user interactions.
            UserInteractionOverflow,
        }

        #[pallet::call]
        impl<T: Config> Pallet<T> {
            /// Set the value of the counter.
            ///
            /// The dispatch origin of this call must be _Root_.
            ///
            /// - `new_value`: The new value to set for the counter.
            ///
            /// Emits `CounterValueSet` event when successful.
            #[pallet::call_index(0)]
            #[pallet::weight(0)]
            pub fn set_counter_value(origin: OriginFor<T>, new_value: u32) -> DispatchResult {
                ensure_root(origin)?;

                ensure!(
                    new_value <= T::CounterMaxValue::get(),
                    Error::<T>::CounterValueExceedsMax
                );

                CounterValue::<T>::put(new_value);

                Self::deposit_event(Event::<T>::CounterValueSet {
                    counter_value: new_value,
                });

                Ok(())
            }

            /// Increment the counter by a specified amount.
            ///
            /// This function can be called by any signed account.
            ///
            /// - `amount_to_increment`: The amount by which to increment the counter.
            ///
            /// Emits `CounterIncremented` event when successful.
            #[pallet::call_index(1)]
            #[pallet::weight(0)]
            pub fn increment(origin: OriginFor<T>, amount_to_increment: u32) -> DispatchResult {
                let who = ensure_signed(origin)?;

                let current_value = CounterValue::<T>::get().unwrap_or(0);

                let new_value = current_value
                    .checked_add(amount_to_increment)
                    .ok_or(Error::<T>::CounterOverflow)?;

                ensure!(
                    new_value <= T::CounterMaxValue::get(),
                    Error::<T>::CounterValueExceedsMax
                );

                CounterValue::<T>::put(new_value);

                UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
                    let new_interactions = interactions
                        .unwrap_or(0)
                        .checked_add(1)
                        .ok_or(Error::<T>::UserInteractionOverflow)?;
                    *interactions = Some(new_interactions); // Store the new value.

                    Ok(())
                })?;

                Self::deposit_event(Event::<T>::CounterIncremented {
                    counter_value: new_value,
                    who,
                    incremented_amount: amount_to_increment,
                });

                Ok(())
            }

            /// Decrement the counter by a specified amount.
            ///
            /// This function can be called by any signed account.
            ///
            /// - `amount_to_decrement`: The amount by which to decrement the counter.
            ///
            /// Emits `CounterDecremented` event when successful.
            #[pallet::call_index(2)]
            #[pallet::weight(0)]
            pub fn decrement(origin: OriginFor<T>, amount_to_decrement: u32) -> DispatchResult {
                let who = ensure_signed(origin)?;

                let current_value = CounterValue::<T>::get().unwrap_or(0);

                let new_value = current_value
                    .checked_sub(amount_to_decrement)
                    .ok_or(Error::<T>::CounterValueBelowZero)?;

                CounterValue::<T>::put(new_value);

                UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
                    let new_interactions = interactions
                        .unwrap_or(0)
                        .checked_add(1)
                        .ok_or(Error::<T>::UserInteractionOverflow)?;
                    *interactions = Some(new_interactions); // Store the new value.

                    Ok(())
                })?;

                Self::deposit_event(Event::<T>::CounterDecremented {
                    counter_value: new_value,
                    who,
                    decremented_amount: amount_to_decrement,
                });

                Ok(())
            }
        }
    }
    ```

    The finished version of this same file, which additionally includes a license header and the weights, tests, and benchmarking scaffolding that are covered outside this tutorial, looks like the following:

    ```rust title="lib.rs"
    // This file is part of 'custom-pallet'.

    // SPDX-License-Identifier: MIT-0

    // Permission is hereby granted, free of charge, to any person obtaining a copy
    // of this software and associated documentation files (the "Software"), to deal
    // in the Software without restriction, including without limitation the rights
    // to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
    // copies of the Software, and to permit persons to whom the Software is
    // furnished to do so.
    //
    // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    // IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    // FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
    // AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    // LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
    // OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
    // SOFTWARE.

    #![cfg_attr(not(feature = "std"), no_std)]

    pub use pallet::*;

    #[cfg(test)]
    mod mock;

    #[cfg(test)]
    mod tests;

    #[cfg(feature = "runtime-benchmarks")]
    mod benchmarking;

    pub mod weights;
    use crate::weights::WeightInfo;

    #[frame::pallet]
    pub mod pallet {
        use super::*;
        use frame::prelude::*;

        #[pallet::pallet]
        pub struct Pallet<T>(_);

        // Configuration trait for the pallet.
        #[pallet::config]
        pub trait Config: frame_system::Config {
            // Defines the event type for the pallet.
            type RuntimeEvent: From<Event<Self>> + IsType<<Self as frame_system::Config>::RuntimeEvent>;

            // Defines the maximum value the counter can hold.
            #[pallet::constant]
            type CounterMaxValue: Get<u32>;

            /// A type representing the weights required by the dispatchables of this pallet.
            type WeightInfo: WeightInfo;
        }

        #[pallet::event]
        #[pallet::generate_deposit(pub(super) fn deposit_event)]
        pub enum Event<T: Config> {
            /// The counter value has been set to a new value by Root.
            CounterValueSet {
                /// The new value set.
                counter_value: u32,
            },
            /// A user has successfully incremented the counter.
            CounterIncremented {
                /// The new value set.
                counter_value: u32,
                /// The account who incremented the counter.
                who: T::AccountId,
                /// The amount by which the counter was incremented.
                incremented_amount: u32,
            },
            /// A user has successfully decremented the counter.
            CounterDecremented {
                /// The new value set.
                counter_value: u32,
                /// The account who decremented the counter.
                who: T::AccountId,
                /// The amount by which the counter was decremented.
                decremented_amount: u32,
            },
        }

        /// Storage for the current value of the counter.
        #[pallet::storage]
        pub type CounterValue<T> = StorageValue<_, u32>;

        /// Storage map to track the number of interactions performed by each account.
        #[pallet::storage]
        pub type UserInteractions<T: Config> = StorageMap<_, Twox64Concat, T::AccountId, u32>;

        #[pallet::error]
        pub enum Error<T> {
            /// The counter value exceeds the maximum allowed value.
            CounterValueExceedsMax,
            /// The counter value cannot be decremented below zero.
            CounterValueBelowZero,
            /// Overflow occurred in the counter.
            CounterOverflow,
            /// Overflow occurred in user interactions.
            UserInteractionOverflow,
        }

        #[pallet::call]
        impl<T: Config> Pallet<T> {
            /// Set the value of the counter.
            ///
            /// The dispatch origin of this call must be _Root_.
            ///
            /// - `new_value`: The new value to set for the counter.
            ///
            /// Emits `CounterValueSet` event when successful.
            #[pallet::call_index(0)]
            #[pallet::weight(T::WeightInfo::set_counter_value())]
            pub fn set_counter_value(origin: OriginFor<T>, new_value: u32) -> DispatchResult {
                ensure_root(origin)?;

                ensure!(
                    new_value <= T::CounterMaxValue::get(),
                    Error::<T>::CounterValueExceedsMax
                );

                CounterValue::<T>::put(new_value);

                Self::deposit_event(Event::<T>::CounterValueSet {
                    counter_value: new_value,
                });

                Ok(())
            }

            /// Increment the counter by a specified amount.
            ///
            /// This function can be called by any signed account.
            ///
            /// - `amount_to_increment`: The amount by which to increment the counter.
            ///
            /// Emits `CounterIncremented` event when successful.
            #[pallet::call_index(1)]
            #[pallet::weight(T::WeightInfo::increment())]
            pub fn increment(origin: OriginFor<T>, amount_to_increment: u32) -> DispatchResult {
                let who = ensure_signed(origin)?;

                let current_value = CounterValue::<T>::get().unwrap_or(0);

                let new_value = current_value
                    .checked_add(amount_to_increment)
                    .ok_or(Error::<T>::CounterOverflow)?;

                ensure!(
                    new_value <= T::CounterMaxValue::get(),
                    Error::<T>::CounterValueExceedsMax
                );

                CounterValue::<T>::put(new_value);

                UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
                    let new_interactions = interactions
                        .unwrap_or(0)
                        .checked_add(1)
                        .ok_or(Error::<T>::UserInteractionOverflow)?;
                    *interactions = Some(new_interactions); // Store the new value.

                    Ok(())
                })?;

                Self::deposit_event(Event::<T>::CounterIncremented {
                    counter_value: new_value,
                    who,
                    incremented_amount: amount_to_increment,
                });

                Ok(())
            }

            /// Decrement the counter by a specified amount.
            ///
            /// This function can be called by any signed account.
            ///
            /// - `amount_to_decrement`: The amount by which to decrement the counter.
            ///
            /// Emits `CounterDecremented` event when successful.
            #[pallet::call_index(2)]
            #[pallet::weight(T::WeightInfo::decrement())]
            pub fn decrement(origin: OriginFor<T>, amount_to_decrement: u32) -> DispatchResult {
                let who = ensure_signed(origin)?;

                let current_value = CounterValue::<T>::get().unwrap_or(0);

                let new_value = current_value
                    .checked_sub(amount_to_decrement)
                    .ok_or(Error::<T>::CounterValueBelowZero)?;

                CounterValue::<T>::put(new_value);

                UserInteractions::<T>::try_mutate(&who, |interactions| -> Result<_, Error<T>> {
                    let new_interactions = interactions
                        .unwrap_or(0)
                        .checked_add(1)
                        .ok_or(Error::<T>::UserInteractionOverflow)?;
                    *interactions = Some(new_interactions); // Store the new value.

                    Ok(())
                })?;

                Self::deposit_event(Event::<T>::CounterDecremented {
                    counter_value: new_value,
                    who,
                    decremented_amount: amount_to_decrement,
                });

                Ok(())
            }
        }
    }
    ```

## Where to Go Next
- Tutorial __Pallet Unit Testing__ --- Learn to write effective unit tests for Polkadot SDK pallets! Use a custom pallet as a practical example in this comprehensive guide. [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-unit-testing/)
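Before moving on, keep in mind that a runtime must implement this pallet's `Config` trait before the pallet can be used. The following is a minimal, hypothetical sketch of that wiring, not part of this tutorial: the `Runtime` type, the `custom_pallet` crate path, and the value chosen for `CounterMaxValue` are illustrative assumptions, and the pallet still has to be registered in the runtime's pallet list.

```rust
// Hypothetical runtime-side wiring for the counter pallet.
// `Runtime` is the runtime type of your runtime crate, and `custom_pallet`
// is assumed to be the crate name used in this tutorial.
impl custom_pallet::Config for Runtime {
    // Reuse the runtime-wide event type.
    type RuntimeEvent = RuntimeEvent;
    // Cap the counter at 100 for this example; choose a bound that suits your chain.
    // `ConstU32` implements `Get<u32>` and is available from the FRAME prelude.
    type CounterMaxValue = ConstU32<100>;
}
```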
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/ --- BEGIN CONTENT --- --- title: Set Up a Template description: Learn to compile and run a local parachain node using Polkadot SDK. Launch, run, and interact with a pre-configured runtime template. tutorial_badge: Beginner categories: Basics, Parachains --- # Set Up a Template ## Introduction [Polkadot SDK](https://github.com/paritytech/polkadot-sdk){target=\_blank} offers a versatile and extensible blockchain development framework, enabling you to create custom blockchains tailored to your specific application or business requirements. This tutorial guides you through compiling and running a parachain node using the [Polkadot SDK Parachain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\_blank}. The parachain template provides a pre-configured, functional runtime you can use in your local development environment. It includes several key components, such as user accounts and account balances. These predefined elements allow you to experiment with common blockchain operations without requiring initial template modifications. In this tutorial, you will: - Build and start a local parachain node using the node template - Explore how to use a front-end interface to: - View information about blockchain activity - Submit a transaction By the end of this tutorial, you'll have a working local parachain and understand how to interact with it, setting the foundation for further customization and development. ## Prerequisites Before getting started, ensure you have done the following: - Completed the [Install Polkadot SDK Dependencies](/develop/parachains/install-polkadot-sdk/){target=\_blank} guide and successfully installed [Rust](https://www.rust-lang.org/){target=\_blank} and the required packages to set up your development environment For this tutorial series, you need to use Rust `1.86`. Newer versions of the compiler may not work with this parachain template version. Run the following commands to set up the correct Rust version: ```bash rustup default 1.86 rustup target add wasm32-unknown-unknown --toolchain 1.86 rustup component add rust-src --toolchain 1.86 ``` ## Utility Tools This tutorial requires two essential tools: - [**Chain spec builder**](https://crates.io/crates/staging-chain-spec-builder/{{dependencies.crates.chain_spec_builder.version}}){target=\_blank} - is a Polkadot SDK utility for generating chain specifications. Refer to the [Generate Chain Specs](/develop/parachains/deployment/generate-chain-specs/){target=\_blank} documentation for detailed usage. Install it by executing the following command: ```bash cargo install --locked staging-chain-spec-builder@{{dependencies.crates.chain_spec_builder.version}} ``` This installs the `chain-spec-builder` binary. - [**Polkadot Omni Node**](https://crates.io/crates/polkadot-omni-node/{{dependencies.crates.polkadot_omni_node.version}}){target=\_blank} - is a white-labeled binary, released as a part of Polkadot SDK, that can act as the collator of a parachain in production, with all the related auxiliary functionalities that a normal collator node has: RPC server, archiving state, etc. Moreover, it can also run the wasm blob of the parachain locally for testing and development. 
To install it, run the following command: ```bash cargo install --locked polkadot-omni-node@{{dependencies.crates.polkadot_omni_node.version}} ``` This installs the `polkadot-omni-node` binary. ## Compile the Runtime The [Polkadot SDK Parachain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\_blank} provides a ready-to-use development environment for building using the [Polkadot SDK](https://github.com/paritytech/polkadot-sdk){target=\_blank}. Follow these steps to compile the runtime: 1. Clone the template repository: ```bash git clone -b stable2412 https://github.com/paritytech/polkadot-sdk-parachain-template.git parachain-template ``` 2. Navigate into the project directory: ```bash cd parachain-template ``` 3. Compile the runtime: ```bash cargo build --release --locked ``` !!!tip Initial compilation may take several minutes, depending on your machine specifications. Use the `--release` flag for improved runtime performance compared to the default `--debug` build. If you need to troubleshoot issues, the `--debug` build provides better diagnostics. For production deployments, consider using a dedicated [`--profile production`](https://github.com/paritytech/polkadot-sdk-parachain-template/blob/v0.0.4/Cargo.toml#L42-L45){target=\_blank} flag - this can provide an additional 15-30% performance improvement over the standard `--release` profile. 4. Upon successful compilation, you should see output similar to:
cargo build --release --locked ... Finished `release` profile [optimized] target(s) in 1.79s
## Start the Local Chain After successfully compiling your runtime, you can spin up a local chain and produce blocks. This process will start your local parachain and allow you to interact with it. You'll first need to generate a chain specification that defines your network's identity, initial connections, and genesis state. This specification provides the foundational configuration for how your nodes connect and what initial state they agree upon. Once it is in place, you can run the chain. Follow these steps to launch your node in development mode: 1. Generate the chain specification file of your parachain: ```bash chain-spec-builder create -t development \ --relay-chain paseo \ --para-id 1000 \ --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \ named-preset development ``` 2. Start the omni node with the generated chain spec. You'll start it in development mode (without a relay chain config), producing and finalizing blocks: ```bash polkadot-omni-node --chain ./chain_spec.json --dev ``` The `--dev` option does the following: - Deletes all active data (keys, blockchain database, networking information) when stopped - Ensures a clean working state each time you restart the node 3. Verify that your node is running by reviewing the terminal output. You should see something similar to:
polkadot-omni-node --chain ./chain_spec.json --dev
2024-12-12 12:44:02 polkadot-omni-node 2024-12-12 12:44:02 ✌️ version 0.1.0-da2dd9b7737 2024-12-12 12:44:02 ❀️ by Parity Technologies admin@parity.io, 2017-2024 2024-12-12 12:44:02 πŸ“‹ Chain specification: Custom 2024-12-12 12:44:02 🏷 Node name: grieving-drum-1926 2024-12-12 12:44:02 πŸ‘€ Role: AUTHORITY 2024-12-12 12:44:02 πŸ’Ύ Database: RocksDb at /var/folders/x0/xl_kjddj3ql3bx7752yr09hc0000gn/T/substrateoUrZMQ/chains/custom/db/full 2024-12-12 12:44:03 [Parachain] assembling new collators for new session 0 at #0 2024-12-12 12:44:03 [Parachain] assembling new collators for new session 1 at #0 2024-12-12 12:44:03 [Parachain] πŸ”¨ Initializing Genesis block/state (state: 0xa6f8…5b46, header-hash: 0x0579…2153) 2024-12-12 12:44:03 [Parachain] creating SingleState txpool Limit { count: 8192, total_bytes: 20971520 }/Limit { count: 819, total_bytes: 2097152 }. 2024-12-12 12:44:03 [Parachain] Using default protocol ID "sup" because none is configured in the chain specs 2024-12-12 12:44:03 [Parachain] 🏷 Local node identity is: 12D3KooWCSXy6rBuJVsn5mx8uyNqkdfNfFzEbToi4hR31v3PwdgX 2024-12-12 12:44:03 [Parachain] Running libp2p network backend 2024-12-12 12:44:03 [Parachain] πŸ’» Operating system: macos 2024-12-12 12:44:03 [Parachain] πŸ’» CPU architecture: aarch64 2024-12-12 12:44:03 [Parachain] πŸ“¦ Highest known block at #0 2024-12-12 12:44:03 [Parachain] 〽️ Prometheus exporter started at 127.0.0.1:9615 2024-12-12 12:44:03 [Parachain] Running JSON-RPC server: addr=127.0.0.1:9944,[::1]:9944 2024-12-12 12:44:06 [Parachain] πŸ™Œ Starting consensus session on top of parent 0x05794f9adcdaa23a5edd335e8310637d3a7e6e9393f2b0794af7d3e219f62153 (#0) 2024-12-12 12:44:06 [Parachain] 🎁 Prepared block for proposing at 1 (2 ms) hash: 0x6fbea46711e9b38bab8e7877071423cd03feab03d3f4a0d578a03ab42dcee34b; parent_hash: 0x0579…2153; end: NoMoreTransactions; extrinsics_count: 2 2024-12-12 12:44:06 [Parachain] πŸ† Imported #1 (0x0579…2153 β†’ 0x6fbe…e34b) ...
4. Confirm that your blockchain is producing new blocks by checking if the number after `finalized` is increasing
... 2024-12-12 12:49:20 [Parachain] πŸ’€ Idle (0 peers), best: #1 (0x6fbe…e34b), finalized #1 (0x6fbe…e34b), ⬇ 0 ⬆ 0 ... 2024-12-12 12:49:25 [Parachain] πŸ’€ Idle (0 peers), best: #3 (0x7543…bcfc), finalized #3 (0x7543…bcfc), ⬇ 0 ⬆ 0 ... 2024-12-12 12:49:30 [Parachain] πŸ’€ Idle (0 peers), best: #4 (0x0478…8d63), finalized #4 (0x0478…8d63), ⬇ 0 ⬆ 0 ...
The details of the log output will be explored in a later tutorial. For now, knowing that your node is running and producing blocks is sufficient. ## Interact with the Node When running the template node, it's accessible by default at `ws://localhost:9944`. To interact with your node using the [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\_blank} interface, follow these steps: 1. Open [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\_blank} in your web browser and click the network icon (which should be the Polkadot logo) in the top left corner as shown in the image below: ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/set-up-a-template-1.webp) 2. Connect to your local node: 1. Scroll to the bottom and select **Development** 2. Choose **Custom** 3. Enter `ws://localhost:9944` in the input field 4. Click the **Switch** button ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/set-up-a-template-2.webp) 3. Verify connection: - Once connected, you should see **parachain-template-runtime** in the top left corner - The interface will display information about your local blockchain ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/set-up-a-template-3.webp) You are now connected to your local node and can now interact with it through the Polkadot.js Apps interface. This tool enables you to explore blocks, execute transactions, and interact with your blockchain's features. For in-depth guidance on using the interface effectively, refer to the [Polkadot.js Guides](https://wiki.polkadot.network/general/polkadotjs/){target=\_blank} available on the Polkadot Wiki. ## Stop the Node When you're done exploring your local node, you can stop it to remove any state changes you've made. Since you started the node with the `--dev` option, stopping the node will purge all persistent block data, allowing you to start fresh the next time. To stop the local node: 1. Return to the terminal window where the node output is displayed 2. Press `Control-C` to stop the running process 3. Verify that your terminal returns to the prompt in the `parachain-template` directory ## Where to Go Next
- Tutorial __Build a Custom Pallet__ --- Build your own custom pallet for Polkadot SDK-based blockchains! Follow this step-by-step guide to create and configure a simple counter pallet from scratch. [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/)
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-asset/ --- BEGIN CONTENT --- --- title: Register a Local Asset description: Comprehensive guide to registering a local asset on the Asset Hub system parachain, including step-by-step instructions. tutorial_badge: Beginner categories: Basics, dApps --- # Register a Local Asset on Asset Hub ## Introduction As detailed in the [Asset Hub Overview](/polkadot-protocol/architecture/system-chains/asset-hub){target=\_blank} page, Asset Hub accommodates two types of assets: local and foreign. Local assets are those that were created in Asset Hub and are identifiable by an integer ID. On the other hand, foreign assets originate from a sibling parachain and are identified by a Multilocation. This guide will take you through the steps of registering a local asset on the Asset Hub parachain. ## Prerequisites Before you begin, ensure you have access to the [Polkadot.js Apps](https://polkadot.js.org/apps/){target=\_blank} interface and a funded wallet with DOT or KSM. - For Polkadot Asset Hub, you would need a deposit of 10 DOT and around 0.201 DOT for the metadata - For Kusama Asset Hub, the deposit is 0.1 KSM and around 0.000669 KSM for the metadata You need to ensure that your Asset Hub account balance is a bit more than the sum of those two deposits, which should seamlessly account for the required deposits and transaction fees. ## Steps to Register a Local Asset To register a local asset on the Asset Hub parachain, follow these steps: 1. Open the [Polkadot.js Apps](https://polkadot.js.org/apps/){target=\_blank} interface and connect to the Asset Hub parachain using the network selector in the top left corner - You may prefer to test local asset registration on TestNet before registering the asset on a MainNet hub. If you still need to set up a local testing environment, review the [Environment setup](#test-setup-environment) section for instructions. Once the local environment is set up, connect to the Local Node (Chopsticks) available on `ws://127.0.0.1:8000` - For the live network, connect to the **Asset Hub** parachain. Either Polkadot or Kusama Asset Hub can be selected from the dropdown list, choosing the desired RPC provider 2. Click on the **Network** tab on the top navigation bar and select **Assets** from the dropdown list ![Access to Asset Hub through Polkadot.JS](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-1.webp) 3. Now, you need to examine all the registered asset IDs. This step is crucial to ensure that the asset ID you are about to register is unique. Asset IDs are displayed in the **assets** column ![Asset IDs on Asset Hub](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-2.webp) 4. Once you have confirmed that the asset ID is unique, click on the **Create** button on the top right corner of the page ![Create a new asset](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-3.webp) 5. Fill in the required fields in the **Create Asset** form: 1. **creator account** - the account to be used for creating this asset and setting up the initial metadata 2. **asset name** - the descriptive name of the asset you are registering 3. **asset symbol** - the symbol that will be used to represent the asset 4. 
**asset decimals** - the number of decimal places for this token, with a maximum of 20 allowed through the user interface 5. **minimum balance** - the minimum balance for the asset. This is specified in the units and decimals as requested 6. **asset ID** - the selected id for the asset. This should not match an already-existing asset id 7. Click on the **Next** button ![Create Asset Form](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-4.webp) 6. Choose the accounts for the roles listed below: 1. **admin account** - the account designated for continuous administration of the token 2. **issuer account** - the account that will be used for issuing this token 3. **freezer account** - the account that will be used for performing token freezing operations 4. Click on the **Create** button ![Admin, Issuer, Freezer accounts](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-5.webp) 7. Click on the **Sign and Submit** button to complete the asset registration process ![Sign and Submit](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-6.webp) ## Verify Asset Registration After completing these steps, the asset will be successfully registered. You can now view your asset listed on the [**Assets**](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fasset-hub-polkadot-rpc.dwellir.com#/assets){target=\_blank} section of the Polkadot.js Apps interface. ![Asset listed on Polkadot.js Apps](/images/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-assets/register-a-local-asset-7.webp) !!! tip Take into consideration that the **Assets** section’s link may differ depending on the network you are using. For the local environment, enter `ws://127.0.0.1:8000` into the **Custom Endpoint** field. In this way, you have successfully registered a local asset on the Asset Hub parachain. For an in-depth explanation about Asset Hub and its features, see the [Asset Hub](/tutorials/polkadot-sdk/system-chains/asset-hub/asset-conversion/){target=\_blank} entry in the Polkadot Wiki. ## Test Setup Environment You can set up a local parachain environment to test the asset registration process before deploying it on the live network. This guide uses Chopsticks to simulate that process. For further information on chopsticks usage, refer to the [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started){target=\_blank} documentation. To set up a test environment, execute the following command: ```bash npx @acala-network/chopsticks \ --config=https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/polkadot-asset-hub.yml ``` The above command will spawn a lazy fork of Polkadot Asset Hub with the latest block data from the network. If you need to test Kusama Asset Hub, replace `polkadot-asset-hub.yml` with `kusama-asset-hub.yml` in the command. An Asset Hub instance is now running locally, and you can proceed with the asset registration process. Note that the local registration process does not differ from the live network process. Once you have a successful TestNet transaction, you can use the same steps to register the asset on MainNet. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/testing/fork-live-chains/ --- BEGIN CONTENT --- --- title: Fork a Chain with Chopsticks description: Learn how to fork live Polkadot SDK chains with Chopsticks. 
Configure forks, replay blocks, test XCM, and interact programmatically or via UI. tutorial_badge: Beginner categories: Basics, dApps, Tooling --- # Fork a Chain with Chopsticks ## Introduction Chopsticks is an innovative tool that simplifies the process of forking live Polkadot SDK chains. This guide provides step-by-step instructions to configure and fork chains, enabling developers to: - Replay blocks for state analysis - Test cross-chain messaging (XCM) - Simulate blockchain environments for debugging and experimentation With support for both configuration files and CLI commands, Chopsticks offers flexibility for diverse development workflows. Whether you're testing locally or exploring complex blockchain scenarios, Chopsticks empowers developers to gain deeper insights and accelerate application development. Chopsticks uses the [Smoldot](https://github.com/smol-dot/smoldot){target=\_blank} light client, which does not support calls made through the Ethereum JSON-RPC. As a result, you can't fork your chain using Chopsticks and then interact with it using tools like MetaMask. For additional support and information, please reach out through [GitHub Issues](https://github.com/AcalaNetwork/chopsticks/issues){target=\_blank}. ## Prerequisites To follow this tutorial, ensure you have completed the following: - **Installed Chopsticks** - if you still need to do so, see the [Install Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/#install-chopsticks){target=\_blank} guide for assistance - **Reviewed** [**Configure Chopsticks**](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/#configure-chopsticks){target=\_blank} - and understand how forked chains are configured ## Configuration File To run Chopsticks using a configuration file, utilize the `--config` flag. You can use a raw GitHub URL, a path to a local file, or simply the chain's name. The following commands all look different but they use the `polkadot` configuration in the same way: === "GitHub URL" ```bash npx @acala-network/chopsticks \ --config=https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/polkadot.yml ``` === "Local File Path" ```bash npx @acala-network/chopsticks --config=configs/polkadot.yml ``` === "Chain Name" ```bash npx @acala-network/chopsticks --config=polkadot ``` Regardless of which method you choose from the preceding examples, you'll see an output similar to the following:
npx @acala-network/chopsticks --config=polkadot
[18:38:26.155] INFO: Loading config file https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/polkadot.yml app: "chopsticks" chopsticks::executor TRACE: Calling Metadata_metadata chopsticks::executor TRACE: Completed Metadata_metadata [18:38:28.186] INFO: Polkadot RPC listening on port 8000 app: "chopsticks"
If using a file path, make sure you've downloaded the [Polkadot configuration file](https://github.com/AcalaNetwork/chopsticks/blob/master/configs/polkadot.yml){target=\_blank}, or have created your own. ## Create a Fork Once you've configured Chopsticks, use the following command to fork Polkadot at block 100: ```bash npx @acala-network/chopsticks \ --endpoint wss://polkadot-rpc.dwellir.com \ --block 100 ``` If the fork is successful, you will see log output similar to the earlier example, ending with the Polkadot RPC server listening on port 8000. Access the running Chopsticks fork using the default address. ```bash ws://localhost:8000 ``` ## Interact with a Fork You can interact with the forked chain using various [libraries](/develop/toolkit/#libraries){target=\_blank} such as [Polkadot.js](https://polkadot.js.org/docs/){target=\_blank} and its user interface, [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\_blank}. ### Use Polkadot.js Apps To interact with Chopsticks via the hosted user interface, visit [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\_blank} and follow these steps: 1. Select the network icon in the top left corner ![](/images/tutorials/polkadot-sdk/testing/fork-live-chains/chopsticks-1.webp) 2. Scroll to the bottom and select **Development** 3. Choose **Custom** 4. Enter `ws://localhost:8000` in the input field 5. Select the **Switch** button ![](/images/tutorials/polkadot-sdk/testing/fork-live-chains/chopsticks-2.webp) You should now be connected to your local fork and can interact with it as you would with a real chain. ### Use Polkadot.js Library For programmatic interaction, you can use the Polkadot.js library. The following is a basic example: ```js import { ApiPromise, WsProvider } from '@polkadot/api'; async function connectToFork() { const wsProvider = new WsProvider('ws://localhost:8000'); const api = await ApiPromise.create({ provider: wsProvider }); await api.isReady; // Now you can use 'api' to interact with your fork console.log(`Connected to chain: ${await api.rpc.system.chain()}`); } connectToFork(); ``` ## Replay Blocks Chopsticks allows you to replay specific blocks from a chain, which is useful for debugging and analyzing state changes. You can use the parameters in the [Configuration](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/#configure-chopsticks){target=\_blank} section to set up the chain configuration, and then use the `run-block` subcommand with the following additional options: - `output-path` - path to print output - `html` - generate HTML with storage diff - `open` - open generated HTML For example, the command to replay block 1000 from Polkadot and save the output to a JSON file would be as follows: ```bash npx @acala-network/chopsticks run-block \ --endpoint wss://polkadot-rpc.dwellir.com \ --output-path ./polkadot-output.json \ --block 1000 ``` ???
code "polkadot-output.json" ```json { "Call": { "result": "0xba754e7478944d07a1f7e914422b4d973b0855abeb6f81138fdca35beb474b44a10f6fc59a4d90c3b78e38fac100fc6adc6f9e69a07565ec8abce6165bd0d24078cc7bf34f450a2cc7faacc1fa1e244b959f0ed65437f44208876e1e5eefbf8dd34c040642414245b501030100000083e2cc0f00000000d889565422338aa58c0fd8ebac32234149c7ce1f22ac2447a02ef059b58d4430ca96ba18fbf27d06fe92ec86d8b348ef42f6d34435c791b952018d0a82cae40decfe5faf56203d88fdedee7b25f04b63f41f23da88c76c876db5c264dad2f70c", "storageDiff": [ [ "0x0b76934f4cc08dee01012d059e1b83eebbd108c4899964f707fdaffb82636065", "0x00" ], [ "0x1cb6f36e027abb2091cfb5110ab5087f0323475657e0890fbdbf66fb24b4649e", null ], [ "0x1cb6f36e027abb2091cfb5110ab5087f06155b3cd9a8c9e5e9a23fd5dc13a5ed", "0x83e2cc0f00000000" ], [ "0x1cb6f36e027abb2091cfb5110ab5087ffa92de910a7ce2bd58e99729c69727c1", null ], [ "0x26aa394eea5630e07c48ae0c9558cef702a5c1b19ab7a04f536c519aca4983ac", null ], [ "0x26aa394eea5630e07c48ae0c9558cef70a98fdbe9ce6c55837576c60c7af3850", "0x02000000" ], [ "0x26aa394eea5630e07c48ae0c9558cef734abf5cb34d6244378cddbf18e849d96", "0xc03b86ae010000000000000000000000" ], [ "0x26aa394eea5630e07c48ae0c9558cef780d41e5e16056765bc8461851072c9d7", "0x080000000000000080e36a09000000000200000001000000000000ca9a3b00000000020000" ], [ "0x26aa394eea5630e07c48ae0c9558cef78a42f33323cb5ced3b44dd825fda9fcc", null ], [ "0x26aa394eea5630e07c48ae0c9558cef799e7f93fc6a98f0874fd057f111c4d2d", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7a44704b568d21667356a5a050c118746d366e7fe86e06375e7030000", "0xba754e7478944d07a1f7e914422b4d973b0855abeb6f81138fdca35beb474b44" ], [ "0x26aa394eea5630e07c48ae0c9558cef7a86da5a932684f199539836fcb8c886f", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7b06c3320c6ac196d813442e270868d63", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7bdc0bd303e9855813aa8a30d4efc5112", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7df1daeb8986837f21cc5d17596bb78d15153cb1f00942ff401000000", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7df1daeb8986837f21cc5d17596bb78d1b4def25cfda6ef3a00000000", null ], [ "0x26aa394eea5630e07c48ae0c9558cef7ff553b5a9862a516939d82b3d3d8661a", null ], [ "0x2b06af9719ac64d755623cda8ddd9b94b1c371ded9e9c565e89ba783c4d5f5f9b4def25cfda6ef3a000000006f3d6b177c8acbd8dc9974cdb3cebfac4d31333c30865ff66c35c1bf898df5c5dd2924d3280e7201", "0x9b000000" ], ["0x3a65787472696e7369635f696e646578", null], [ "0x3f1467a096bcd71a5b6a0c8155e208103f2edf3bdf381debe331ab7446addfdc", "0x550057381efedcffffffffffffffffff" ], [ "0x3fba98689ebed1138735e0e7a5a790ab0f41321f75df7ea5127be2db4983c8b2", "0x00" ], [ "0x3fba98689ebed1138735e0e7a5a790ab21a5051453bd3ae7ed269190f4653f3b", "0x080000" ], [ "0x3fba98689ebed1138735e0e7a5a790abb984cfb497221deefcefb70073dcaac1", "0x00" ], [ "0x5f3e4907f716ac89b6347d15ececedca80cc6574281671b299c1727d7ac68cabb4def25cfda6ef3a00000000", "0x204e0000183887050ecff59f58658b3df63a16d03a00f92890f1517f48c2f6ccd215e5450e380e00005809fd84af6483070acbb92378e3498dbc02fb47f8e97f006bb83f60d7b2b15d980d000082104c22c383925323bf209d771dec6e1388285abe22c22d50de968467e0bb6ce00b000088ee494d719d68a18aade04903839ea37b6be99552ceceb530674b237afa9166480d0000dc9974cdb3cebfac4d31333c30865ff66c35c1bf898df5c5dd2924d3280e72011c0c0000e240d12c7ad07bb0e7785ee6837095ddeebb7aef84d6ed7ea87da197805b343a0c0d0000" ], [ "0xae394d879ddf7f99595bc0dd36e355b5bbd108c4899964f707fdaffb82636065", null ], [ "0xbd2a529379475088d3e29a918cd478721a39ec767bd5269111e6492a1675702a", 
"0x4501407565175cfbb5dca18a71e2433f838a3d946ef532c7bff041685db1a7c13d74252fffe343a960ef84b15187ea0276687d8cb3168aeea5202ea6d651cb646517102b81ff629ee6122430db98f2cadf09db7f298b49589b265dae833900f24baa8fb358d87e12f3e9f7986a9bf920c2fb48ce29886199646d2d12c6472952519463e80b411adef7e422a1595f1c1af4b5dd9b30996fba31fa6a30bd94d2022d6b35c8bc5a8a51161d47980bf4873e01d15afc364f8939a6ce5a09454ab7f2dd53bf4ee59f2c418e85aa6eb764ad218d0097fb656900c3bdd859771858f87bf7f06fc9b6db154e65d50d28e8b2374898f4f519517cd0bedc05814e0f5297dc04beb307b296a93cc14d53afb122769dfd402166568d8912a4dff9c2b1d4b6b34d811b40e5f3763e5f3ab5cd1da60d75c0ff3c12bcef3639f5f792a85709a29b752ffd1233c2ccae88ed3364843e2fa92bdb49021ee36b36c7cdc91b3e9ad32b9216082b6a2728fccd191a5cd43896f7e98460859ca59afbf7c7d93cd48da96866f983f5ff8e9ace6f47ee3e6c6edb074f578efbfb0907673ebca82a7e1805bc5c01cd2fa5a563777feeb84181654b7b738847c8e48d4f575c435ad798aec01631e03cf30fe94016752b5f087f05adf1713910767b7b0e6521013be5370776471191641c282fdfe7b7ccf3b2b100a83085cd3af2b0ad4ab3479448e71fc44ff987ec3a26be48161974b507fb3bc8ad23838f2d0c54c9685de67dc6256e71e739e9802d0e6e3b456f6dca75600bc04a19b3cc1605784f46595bfb10d5e077ce9602ae3820436166aa1905a7686b31a32d6809686462bc9591c0bc82d9e49825e5c68352d76f1ac6e527d8ac02db3213815080afad4c2ecb95b0386e3e9ab13d4f538771dac70d3059bd75a33d0b9b581ec33bb16d0e944355d4718daccb35553012adfcdacb1c5200a2aec3756f6ad5a2beffd30018c439c1b0c4c0f86dbf19d0ad59b1c9efb7fe90906febdb9001af1e7e15101089c1ab648b199a40794d30fe387894db25e614b23e833291a604d07eec2ade461b9b139d51f9b7e88475f16d6d23de6fe7831cc1dbba0da5efb22e3b26cd2732f45a2f9a5d52b6d6eaa38782357d9ae374132d647ef60816d5c98e6959f8858cfa674c8b0d340a8f607a68398a91b3a965585cc91e46d600b1310b8f59c65b7c19e9d14864a83c4ad6fa4ba1f75bba754e7478944d07a1f7e914422b4d973b0855abeb6f81138fdca35beb474b44c7736fc3ab2969878810153aa3c93fc08c99c478ed1bb57f647d3eb02f25cee122c70424643f4b106a7643acaa630a5c4ac39364c3cb14453055170c01b44e8b1ef007c7727494411958932ae8b3e0f80d67eec8e94dd2ff7bbe8c9e51ba7e27d50bd9f52cbaf9742edecb6c8af1aaf3e7c31542f7d946b52e0c37d194b3dd13c3fddd39db0749755c7044b3db1143a027ad428345d930afcefc0d03c3a0217147900bdea1f5830d826f7e75ecd1c4e2bc8fd7de3b35c6409acae1b2215e9e4fd7e360d6825dc712cbf9d87ae0fd4b349b624d19254e74331d66a39657da81e73d7b13adc1e5efa8efd65aa32c1a0a0315913166a590ae551c395c476116156cf9d872fd863893edb41774f33438161f9b973e3043f819d087ba18a0f1965e189012496b691f342f7618fa9db74e8089d4486c8bd1993efd30ff119976f5cc0558e29b417115f60fd8897e13b6de1a48fbeee38ed812fd267ae25bffea0caa71c09309899b34235676d5573a8c3cf994a3d7f0a5dbd57ab614c6caf2afa2e1a860c6307d6d9341884f1b16ef22945863335bb4af56e5ef5e239a55dbd449a4d4d3555c8a3ec5bd3260f88cabca88385fe57920d2d2dfc5d70812a8934af5691da5b91206e29df60065a94a0a8178d118f1f7baf768d934337f570f5ec68427506391f51ab4802c666cc1749a84b5773b948fcbe460534ed0e8d48a15c149d27d67deb8ea637c4cc28240ee829c386366a0b1d6a275763100da95374e46528a0adefd4510c38c77871e66aeda6b6bfd629d32af9b2fad36d392a1de23a683b7afd13d1e3d45dad97c740106a71ee308d8d0f94f6771164158c6cd3715e72ccfbc49a9cc49f21ead8a3c5795d64e95c15348c6bf8571478650192e52e96dd58f95ec2c0fb4f2ccc05b0ab749197db8d6d1c6de07d6e8cb2620d5c308881d1059b50ffef3947c273eaed7e56c73848e0809c4bd93619edd9fd08c8c5c88d5f230a55d2c6a354e5dd94440e7b5bf99326cf4a112fe843e7efdea56e97af845761d98f40ed2447bd04a424976fcf0fe0a0c72b97619f85cf431fe4c3aa6b3a4f61df8bc1179c11e77783bfedb7d374bd1668d0969333cb518bd20add8329462f2c9a9f04d150d60413fdd27271586405fd85048481fc2ae25b6826cb2c947e4231dc7b9a0d02a9a03f88460bced3fef5d78f732684bd218a1954a4acfc237d79ccf397913ab6864cd8a07e275b82a8a72
520624738368d1c5f7e0eaa2b445cf6159f2081d3483618f7fc7b16ec4e6e4d67ab5541bcda0ca1af40efd77ef8653e223191448631a8108c5e50e340cd405767ecf932c1015aa8856b834143dc81fa0e8b9d1d8c32278fca390f2ff08181df0b74e2d13c9b7b1d85543416a0dae3a77530b9cd1366213fcf3cd12a9cd3ae0a006d6b29b5ffc5cdc1ab24343e2ab882abfd719892fca5bf2134731332c5d3bef6c6e4013d84a853cb03d972146b655f0f8541bcd36c3c0c8a775bb606edfe50d07a5047fd0fe01eb125e83673930bc89e91609fd6dfe97132679374d3de4a0b3db8d3f76f31bed53e247da591401d508d65f9ee01d3511ee70e3644f3ab5d333ca7dbf737fe75217b4582d50d98b5d59098ea11627b7ed3e3e6ee3012eadd326cf74ec77192e98619427eb0591e949bf314db0fb932ed8be58258fb4f08e0ccd2cd18b997fb5cf50c90d5df66a9f3bb203bd22061956128b800e0157528d45c7f7208c65d0592ad846a711fa3c5601d81bb318a45cc1313b122d4361a7d7a954645b04667ff3f81d3366109772a41f66ece09eb93130abe04f2a51bb30e767dd37ec6ee6a342a4969b8b342f841193f4f6a9f0fac4611bc31b6cab1d25262feb31db0b8889b6f8d78be23f033994f2d3e18e00f3b0218101e1a7082782aa3680efc8502e1536c30c8c336b06ae936e2bcf9bbfb20dd514ed2867c03d4f44954867c97db35677d30760f37622b85089cc5d182a89e29ab0c6b9ef18138b16ab91d59c2312884172afa4874e6989172014168d3ed8db3d9522d6cbd631d581d166787c93209bec845d112e0cbd825f6df8b64363411270921837cfb2f9e7f2e74cdb9cd0d2b02058e5efd9583e2651239654b887ea36ce9537c392fc5dfca8c5a0facbe95b87dfc4232f229bd12e67937d32b7ffae2e837687d2d292c08ff6194a2256b17254748857c7e3c871c3fff380115e6f7faf435a430edf9f8a589f6711720cfc5cec6c8d0d94886a39bb9ac6c50b2e8ef6cf860415192ca4c1c3aaa97d36394021a62164d5a63975bcd84b8e6d74f361c17101e3808b4d8c31d1ee1a5cf3a2feda1ca2c0fd5a50edc9d95e09fb5158c9f9b0eb5e2c90a47deb0459cea593201ae7597e2e9245aa5848680f546256f3" ], [ "0xd57bce545fb382c34570e5dfbf338f5e326d21bc67a4b34023d577585d72bfd7", null ], [ "0xd57bce545fb382c34570e5dfbf338f5ea36180b5cfb9f6541f8849df92a6ec93", "0x00" ], [ "0xd57bce545fb382c34570e5dfbf338f5ebddf84c5eb23e6f53af725880d8ffe90", null ], [ "0xd5c41b52a371aa36c9254ce34324f2a53b996bb988ea8ee15bad3ffd2f68dbda", "0x00" ], [ "0xf0c365c3cf59d671eb72da0e7a4113c49f1f0515f462cdcf84e0f1d6045dfcbb", "0x50defc5172010000" ], [ "0xf0c365c3cf59d671eb72da0e7a4113c4bbd108c4899964f707fdaffb82636065", null ], [ "0xf68f425cf5645aacb2ae59b51baed90420d49a14a763e1cbc887acd097f92014", "0x9501800300008203000082030000840300008503000086030000870300008703000089030000890300008b0300008b0300008d0300008d0300008f0300008f0300009103000092030000920300009403000094030000960300009603000098030000990300009a0300009b0300009b0300009d0300009d0300009f0300009f030000a1030000a2030000a3030000a4030000a5030000a6030000a6030000a8030000a8030000aa030000ab030000ac030000ad030000ae030000af030000b0030000b1030000b1030000b3030000b3030000b5030000b6030000b7030000b8030000b9030000ba030000ba030000bc030000bc030000be030000be030000c0030000c1030000c2030000c2030000c4030000c5030000c5030000c7030000c7030000c9030000c9030000cb030000cc030000cd030000ce030000cf030000d0030000d0030000d2030000d2030000d4030000d4030000d6030000d7030000d8030000d9030000da030000db030000db030000dd030000dd030000df030000e0030000e1030000e2030000e3030000e4030000e4030000" ], [ "0xf68f425cf5645aacb2ae59b51baed9049b58374218f48eaf5bc23b7b3e7cf08a", "0xb3030000" ], [ "0xf68f425cf5645aacb2ae59b51baed904b97380ce5f4e70fbf9d6b5866eb59527", 
"0x9501800300008203000082030000840300008503000086030000870300008703000089030000890300008b0300008b0300008d0300008d0300008f0300008f0300009103000092030000920300009403000094030000960300009603000098030000990300009a0300009b0300009b0300009d0300009d0300009f0300009f030000a1030000a2030000a3030000a4030000a5030000a6030000a6030000a8030000a8030000aa030000ab030000ac030000ad030000ae030000af030000b0030000b1030000b1030000b3030000b3030000b5030000b6030000b7030000b8030000b9030000ba030000ba030000bc030000bc030000be030000be030000c0030000c1030000c2030000c2030000c4030000c5030000c5030000c7030000c7030000c9030000c9030000cb030000cc030000cd030000ce030000cf030000d0030000d0030000d2030000d2030000d4030000d4030000d6030000d7030000d8030000d9030000da030000db030000db030000dd030000dd030000df030000e0030000e1030000e2030000e3030000e4030000e4030000" ] ], "offchainStorageDiff": [], "runtimeLogs": [] } } ``` ## XCM Testing To test XCM (Cross-Consensus Messaging) messages between networks, you can fork multiple parachains and a relay chain locally using Chopsticks. - `relaychain` - relay chain config file - `parachain` - parachain config file For example, to fork Moonbeam, Astar, and Polkadot enabling XCM between them, you can use the following command: ```bash npx @acala-network/chopsticks xcm \ --r polkadot \ --p moonbeam \ --p astar ``` After running it, you should see output similar to the following:
npx @acala-network/chopsticks xcm \ --r polkadot \ --p moonbeam \ --p astar
[13:46:07.901] INFO: Loading config file https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/moonbeam.yml app: "chopsticks" [13:46:12.631] INFO: Moonbeam RPC listening on port 8000 app: "chopsticks" [13:46:12.632] INFO: Loading config file https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/astar.yml app: "chopsticks" chopsticks::executor TRACE: Calling Metadata_metadata chopsticks::executor TRACE: Completed Metadata_metadata [13:46:23.669] INFO: Astar RPC listening on port 8001 app: "chopsticks" [13:46:25.144] INFO (xcm): Connected parachains [2004,2006] app: "chopsticks" [13:46:25.144] INFO: Loading config file https://raw.githubusercontent.com/AcalaNetwork/chopsticks/master/configs/polkadot.yml app: "chopsticks" chopsticks::executor TRACE: Calling Metadata_metadata chopsticks::executor TRACE: Completed Metadata_metadata [13:46:53.320] INFO: Polkadot RPC listening on port 8002 app: "chopsticks" [13:46:54.038] INFO (xcm): Connected relaychain 'Polkadot' with parachain 'Moonbeam' app: "chopsticks" [13:46:55.028] INFO (xcm): Connected relaychain 'Polkadot' with parachain 'Astar' app: "chopsticks"
Now you can interact with your forked chains using the ports specified in the output. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/polkadot-sdk/testing/spawn-basic-chain/ --- BEGIN CONTENT --- --- title: Spawn a Basic Chain with Zombienet description: Learn to spawn, connect to and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging. tutorial_badge: Beginner categories: Basics, dApps, Tooling --- # Spawn a Basic Chain with Zombienet ## Introduction Zombienet simplifies blockchain development by enabling developers to create temporary, customizable networks for testing and validation. These ephemeral chains are ideal for experimenting with configurations, debugging applications, and validating functionality in a controlled environment. In this guide, you'll learn how to define a basic network configuration file, spawn a blockchain network using Zombienet's CLI, and interact with nodes and monitor network activity using tools like Polkadot.js Apps and Prometheus By the end of this tutorial, you'll be equipped to deploy and test your own blockchain networks, paving the way for more advanced setups and use cases. ## Prerequisites To successfully complete this tutorial, you must ensure you've first: - [Installed Zombienet](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/#install-zombienet){target=\_blank}. This tutorial requires Zombienet version `{{ dependencies.repositories.zombienet.version }}`. Verify that you're using the specified version to ensure compatibility with the instructions. - Reviewed the information in [Configure Zombienet](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/#configure-zombienet){target=\_blank} and understand how to customize a spawned network ## Set Up Local Provider In this tutorial, you will use the Zombienet [local provider](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/#local-provider){target=\_blank} (also called native provider) that enables you to run nodes as local processes in your development environment. You must have the necessary binaries installed (such as `polkadot` and `polkadot-parachain`) to spin up your network successfully. To install the required binaries, use the following Zombienet CLI command: ```bash zombienet setup polkadot polkadot-parachain ``` This command downloads the following binaries: - `polkadot` - `polkadot-execute-worker` - `polkadot-parachain` - `polkadot-prepare-worker` Finally, add these binaries to your PATH environment variable to ensure Zombienet can locate them when spawning the network. For example, you can move the binaries to a directory in your PATH, such as `/usr/local/bin`: ```bash sudo mv ./polkadot ./polkadot-execute-worker ./polkadot-parachain ./polkadot-prepare-worker /usr/local/bin ``` ## Define the Network Zombienet uses a [configuration file](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/#configuration-files){target=\_blank} to define the ephemeral network that will be spawned. Follow these steps to create and define the configuration file: 1. Create a file named `spawn-a-basic-network.toml` ```bash touch spawn-a-basic-network.toml ``` 2. 
Add the following code to the file you just created: ```toml title="spawn-a-basic-network.toml" [settings] timeout = 120 [relaychain] [[relaychain.nodes]] name = "alice" validator = true [[relaychain.nodes]] name = "bob" validator = true [[parachains]] id = 100 [parachains.collator] name = "collator01" ``` This configuration file defines a network with the following chains: - **relaychain** - with two nodes named `alice` and `bob` - **parachain** - with a collator named `collator01` Settings also defines a timeout of 120 seconds for the network to be ready. ## Spawn the Network To spawn the network, run the following command: ```bash zombienet -p native spawn spawn-a-basic-network.toml ``` This command will spawn the network defined in the `spawn-a-basic-network.toml` configuration file. The `-p native` flag specifies that the network will be spawned using the native provider. If successful, you will see the following output:
```
zombienet -p native spawn spawn-a-basic-network.toml

Network launched 🚀🚀

Namespace: zombie-75a01b93c92d571f6198a67bcb380fcd
Provider: native

Node Information
Name: alice
Direct Link: https://polkadot.js.org/apps/?rpc=ws://127.0.0.1:55308#explorer
Prometheus Link: http://127.0.0.1:55310/metrics
Log Cmd: tail -f /tmp/zombie-794af21178672e1ff32c612c3c7408dc_-2397036-6717MXDxcS55/alice.log

Node Information
Name: bob
Direct Link: https://polkadot.js.org/apps/?rpc=ws://127.0.0.1:55312#explorer
Prometheus Link: http://127.0.0.1:50634/metrics
Log Cmd: tail -f /tmp/zombie-794af21178672e1ff32c612c3c7408dc_-2397036-6717MXDxcS55/bob.log

Node Information
Name: collator01
Direct Link: https://polkadot.js.org/apps/?rpc=ws://127.0.0.1:55316#explorer
Prometheus Link: http://127.0.0.1:55318/metrics
Log Cmd: tail -f /tmp/zombie-794af21178672e1ff32c612c3c7408dc_-2397036-6717MXDxcS55/collator01.log

Parachain ID: 100
ChainSpec Path: /tmp/zombie-794af21178672e1ff32c612c3c7408dc_-2397036-6717MXDxcS55/100-rococo-local.json
```
!!! note If the IPs and ports aren't explicitly defined in the configuration file, they may change each time the network is started, causing the links provided in the output to differ from the example. ## Interact with the Spawned Network After the network is launched, you can interact with it using [Polkadot.js Apps](https://polkadot.js.org/apps/){target=\_blank}. To do so, open your browser and use the provided links listed by the output as `Direct Link`. ### Connect to the Nodes Use the [55308 port address](https://polkadot.js.org/apps/?rpc=ws://127.0.0.1:55308#explorer){target=\_blank} to interact with the same `alice` node used for this tutorial. Ports can change from spawn to spawn so be sure to locate the link in the output when spawning your own node to ensure you are accessing the correct port. If you want to interact with the nodes more programmatically, you can also use the [Polkadot.js API](https://polkadot.js.org/docs/api/){target=\_blank}. For example, the following code snippet shows how to connect to the `alice` node using the Polkadot.js API and log some information about the chain and node: ```typescript import { ApiPromise, WsProvider } from '@polkadot/api'; async function main() { const wsProvider = new WsProvider('ws://127.0.0.1:55308'); const api = await ApiPromise.create({ provider: wsProvider }); // Retrieve the chain & node information via rpc calls const [chain, nodeName, nodeVersion] = await Promise.all([ api.rpc.system.chain(), api.rpc.system.name(), api.rpc.system.version(), ]); console.log( `You are connected to chain ${chain} using ${nodeName} v${nodeVersion}` ); } main() .catch(console.error) .finally(() => process.exit()); ``` Both methods allow you to interact easily with the network and its nodes. ### Check Metrics You can also check the metrics of the nodes by accessing the links provided in the output as `Prometheus Link`. [Prometheus](https://prometheus.io/){target=\_blank} is a monitoring and alerting toolkit that collects metrics from the nodes. By accessing the provided links, you can see the metrics of the nodes in a web interface. So, for example, the following image shows the Prometheus metrics for Bob's node from the Zombienet test: ![](/images/tutorials/polkadot-sdk/testing/spawn-basic-chain/spawn-basic-network-01.webp) ### Check Logs To view individual node logs, locate the `Log Cmd` command in Zombienet's startup output. For example, to see what the alice node is doing, find the log command that references `alice.log` in its file path. Note that Zombienet will show you the correct path for your instance when it starts up, so use that path rather than copying from the below example: ```bash tail -f /tmp/zombie-794af21178672e1ff32c612c3c7408dc_-2397036-6717MXDxcS55/alice.log ``` After running this command, you will see the logs of the `alice` node in real-time, which can be useful for debugging purposes. The logs of the `bob` and `collator01` nodes can be checked similarly. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/deploy-erc20/ --- BEGIN CONTENT --- --- title: Deploy an ERC-20 to Polkadot Hub description: Deploy an ERC-20 token on Polkadot Hub using PolkaVM. This guide covers contract creation, compilation, deployment, and interaction via Polkadot Remix IDE. tutorial_badge: Beginner categories: Basics, dApps, Smart Contracts --- # Deploy an ERC-20 to Polkadot Hub !!! 
smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction [ERC-20](https://eips.ethereum.org/EIPS/eip-20){target=\_blank} tokens are fungible tokens commonly used for creating cryptocurrencies, governance tokens, and staking mechanisms. Polkadot Hub enables easy token deployment with Ethereum-compatible smart contracts via PolkaVM. This tutorial covers deploying an ERC-20 contract on the Polkadot Hub TestNet using [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank}, a web-based development tool. [OpenZeppelin's ERC-20 contracts]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/token/ERC20){target=\_blank} are used for security and compliance. ## Prerequisites Before starting, make sure you have: - [MetaMask](https://metamask.io/){target=\_blank} installed and connected to Polkadot Hub. For detailed instructions, see the [Connect Your Wallet](/develop/smart-contracts/wallets){target=\_blank} section - A funded account with some PAS tokens (you can get them from the [Polkadot Faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}). To learn how to get test tokens, check out the [Test Tokens](/develop/smart-contracts/connect-to-polkadot#test-tokens){target=\_blank} section - Basic understanding of Solidity and fungible tokens ## Create the ERC-20 Contract To create the ERC-20 contract, you can follow the steps below: 1. Navigate to the [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank} 2. Click in the **Create new file** button under the **contracts** folder, and name your contract as `MyToken.sol` ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-1.webp) 3. Now, paste the following ERC-20 contract code into the editor ```solidity title="MyToken.sol" // SPDX-License-Identifier: MIT // Compatible with OpenZeppelin Contracts ^5.0.0 pragma solidity ^0.8.22; import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol"; import {Ownable} from "@openzeppelin/contracts/access/Ownable.sol"; contract MyToken is ERC20, Ownable { constructor(address initialOwner) ERC20("MyToken", "MTK") Ownable(initialOwner) {} function mint(address to, uint256 amount) public onlyOwner { _mint(to, amount); } } ``` The key components of the code above are: - Contract imports - [**`ERC20.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/token/ERC20/ERC20.sol){target=\_blank} - the base contract for fungible tokens, implementing core functionality like transfers, approvals, and balance tracking - [**`Ownable.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}/contracts/access/Ownable.sol){target=\_blank} - provides basic authorization control, ensuring only the contract owner can mint new tokens - Constructor parameters - **`initialOwner`** - sets the address that will have administrative rights over the contract - **`"MyToken"`** - the full name of your token - **`"MTK"`** - the symbol representing your token in wallets and exchanges - Key functions - **`mint(address to, uint256 amount)`** - allows the contract owner to create new tokens for any address. 
The amount should include 18 decimals (e.g., 1 token = 1000000000000000000) - Inherited [Standard ERC-20](https://ethereum.org/en/developers/docs/standards/tokens/erc-20/){target=\_blank} functions: - **`transfer(address recipient, uint256 amount)`** - sends a specified amount of tokens to another address - **`approve(address spender, uint256 amount)`** - grants permission for another address to spend a specific number of tokens on behalf of the token owner - **`transferFrom(address sender, address recipient, uint256 amount)`** - transfers tokens from one address to another, if previously approved - **`balanceOf(address account)`** - returns the token balance of a specific address - **`allowance(address owner, address spender)`** - checks how many tokens an address is allowed to spend on behalf of another address !!! tip Use the [OpenZeppelin Contracts Wizard](https://wizard.openzeppelin.com/){target=\_blank} to quickly generate customized smart contracts. Simply configure your contract, copy the generated code, and paste it into Polkadot Remix IDE for deployment. Below is an example of an ERC-20 token contract created with it: ![Screenshot of the OpenZeppelin Contracts Wizard showing an ERC-20 contract configuration.](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-2.webp) ## Compile the Contract The compilation transforms your Solidity source code into bytecode that can be deployed on the blockchain. During this process, the compiler checks your contract for syntax errors, ensures type safety, and generates the machine-readable instructions needed for blockchain execution. To compile your contract, follow the instructions below: 1. Select the **Solidity Compiler** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-3.webp) 2. Click the **Compile MyToken.sol** button ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-4.webp) 3. If the compilation succeeded, you'll see a green checkmark indicating success in the **Solidity Compiler** icon ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-5.webp) ## Deploy the Contract Deployment is the process of publishing your compiled smart contract to the blockchain, making it permanently available for interaction. During deployment, you'll create a new instance of your contract on the blockchain, which involves: 1. Select the **Deploy & Run Transactions** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-6.webp) 2. Configure the deployment settings 1. From the **ENVIRONMENT** dropdown, select **Injected Provider - Talisman** (check the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\_blank} section of the Remix IDE guide for more details) 2. From the **ACCOUNT** dropdown, select the account you want to use for the deploy ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-7.webp) 3. Configure the contract parameters 1. Enter the address that will own the deployed token contract 2. Click the **Deploy** button to initiate the deployment ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-8.webp) 4. Talisman will pop up - review the transaction details. 
Click **Approve** to deploy your contract ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-9.webp){: .browser-extension} If the deployment process succeeded, you will see the transaction details in the terminal, including the contract address and deployment transaction hash: ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-10.webp) ## Interact with Your ERC-20 Contract Once deployed, you can interact with your contract through Remix: 1. Find your contract under **Deployed/Unpinned Contracts**, and click it to expand the available methods ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-11.webp) 2. To mint new tokens: 1. Click in the contract to expand its associated methods 2. Expand the **mint** function 3. Enter: - The recipient address - The amount (remember to add 18 zeros for 1 whole token) 4. Click **Transact** ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-12.webp) 3. Click **Approve** to confirm the transaction in the Talisman popup ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-13.webp){: .browser-extension} If the transaction succeeds, you will see the following output in the terminal: ![](/images/tutorials/smart-contracts/deploy-erc20/deploy-erc20-14.webp) Other common functions you can use: - **`balanceOf(address)`** - check token balance of any address - **`transfer(address to, uint256 amount)`** - send tokens to another address - **`approve(address spender, uint256 amount)`** - allow another address to spend your tokens Feel free to explore and interact with the contract's other functions using the same approach - selecting the method, providing any required parameters, and confirming the transaction through Talisman when needed. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/deploy-nft/ --- BEGIN CONTENT --- --- title: Deploy an NFT to Polkadot Hub description: Deploy an NFT on Polkadot Hub using PolkaVM and OpenZeppelin. Learn how to compile, deploy, and interact with your contract using Polkadot Remix IDE. tutorial_badge: Beginner categories: Basics, dApps, Smart Contracts --- # Deploy an NFT to Polkadot Hub !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Non-Fungible Tokens (NFTs) represent unique digital assets commonly used for digital art, collectibles, gaming, and identity verification. Polkadot Hub supports Ethereum-compatible smart contracts through PolkaVM, enabling straightforward NFT deployment. This tutorial guides you through deploying an [ERC-721](https://eips.ethereum.org/EIPS/eip-721){target=\_blank} NFT contract on the Polkadot Hub TestNet using the [Polkadot Remix IDE](https://remix.polkadot.io){target=\_blank}, a web-based development environment. To ensure security and standard compliance, it uses [OpenZeppelin's NFT contracts]({{ dependencies.repositories.open_zeppelin_contracts.repository_url}}/tree/{{ dependencies.repositories.open_zeppelin_contracts.version}}){target=\_blank} implementation. ## Prerequisites Before starting, make sure you have: - [Talisman](https://talisman.xyz/){target=\_blank} installed and connected to the Polkadot Hub TestNet. 
Check the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot/){target=\_blank} guide for more information - A funded account with some PAS tokens (you can get them from the [Faucet](https://faucet.polkadot.io/?parachain=1111){target=\_blank}, noting that the faucet imposes a daily token limit, which may require multiple requests to obtain sufficient funds for testing) - Basic understanding of Solidity and NFTs, see the [Solidity Basics](https://soliditylang.org/){target=\_blank} and the [NFT Overview](https://ethereum.org/en/nft/){target=\_blank} guides for more details ## Create the NFT Contract To create the NFT contract, you can follow the steps below: 1. Navigate to the [Polkadot Remix IDE](https://remix.polkadot.io/){target=\_blank} 2. Click in the **Create new file** button under the **contracts** folder, and name your contract as `MyNFT.sol` ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-1.webp) 3. Now, paste the following NFT contract code into the editor ```solidity title="MyNFT.sol" // SPDX-License-Identifier: MIT // Compatible with OpenZeppelin Contracts ^5.0.0 pragma solidity ^0.8.22; import {ERC721} from "@openzeppelin/contracts/token/ERC721/ERC721.sol"; import {Ownable} from "@openzeppelin/contracts/access/Ownable.sol"; contract MyToken is ERC721, Ownable { uint256 private _nextTokenId; constructor(address initialOwner) ERC721("MyToken", "MTK") Ownable(initialOwner) {} function safeMint(address to) public onlyOwner { uint256 tokenId = _nextTokenId++; _safeMint(to, tokenId); } } ``` The key components of the code above are: - Contract imports - [**`ERC721.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/token/ERC721/ERC721.sol){target=\_blank} - the base contract for non-fungible tokens, implementing core NFT functionality like transfers and approvals - [**`Ownable.sol`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/access/Ownable.sol){target=\_blank} - provides basic authorization control, ensuring only the contract owner can mint new tokens - Constructor parameters - **`initialOwner`** - sets the address that will have administrative rights over the contract - **`"MyToken"`** - the full name of your NFT collection - **`"MTK"`** - the symbol representing your token in wallets and marketplaces - Key functions - [**`_safeMint(to, tokenId)`**]({{ dependencies.repositories.open_zeppelin_contracts.repository_url }}/blob/{{ dependencies.repositories.open_zeppelin_contracts.version }}/contracts/token/ERC721/ERC721.sol#L304){target=\_blank} - an internal function from `ERC721` that safely mints new tokens. 
It includes checks to ensure the recipient can handle `ERC721` tokens, with the `_nextTokenId` mechanism automatically generating unique sequential token IDs and the `onlyOwner` modifier restricting minting rights to the contract owner - Inherited [Standard ERC721](https://ethereum.org/en/developers/docs/standards/tokens/erc-721/){target=\_blank} functions provide a standardized set of methods that enable interoperability across different platforms, wallets, and marketplaces, ensuring that your NFT can be easily transferred, traded, and managed by any system that supports the `ERC721` standard: - **`transferFrom(address from, address to, uint256 tokenId)`** - transfers a specific NFT from one address to another - **`safeTransferFrom(address from, address to, uint256 tokenId)`** - safely transfers an NFT, including additional checks to prevent loss - **`approve(address to, uint256 tokenId)`** - grants permission for another address to transfer a specific NFT - **`setApprovalForAll(address operator, bool approved)`** - allows an address to manage all of the owner's NFTs - **`balanceOf(address owner)`** - returns the number of NFTs owned by a specific address - **`ownerOf(uint256 tokenId)`** - returns the current owner of a specific NFT !!! tip Use the [OpenZeppelin Contracts Wizard](https://wizard.openzeppelin.com/){target=\_blank} to generate customized smart contracts quickly. Simply configure your contract, copy the generated code, and paste it into Polkadot Remix IDE for deployment. Below is an example of an ERC-721 token contract created with it: ![Screenshot of the OpenZeppelin Contracts Wizard showing an ERC-721 contract configuration.](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-2.webp) ## Compile the Contract Compilation is a stage that converts your Solidity source code into bytecode suitable for deployment on the blockchain. Throughout this process, the compiler examines your contract for syntax errors, verifies type safety, and produces machine-readable instructions for execution on the blockchain. 1. Select the **Solidity Compiler** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-3.webp) 2. Click in the **Compile MyNFT.sol** button ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-4.webp) 3. If the compilation succeeded, you can see a green checkmark indicating success in the **Solidity Compiler** icon ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-5.webp) ## Deploy the Contract Deployment is the process of uploading your compiled smart contract to the blockchain, allowing for interaction. During deployment, you will instantiate your contract on the blockchain, which involves: 1. Select the **Deploy & Run Transactions** plugin from the left panel ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-6.webp) 2. Configure the deployment settings 1. From the **ENVIRONMENT** dropdown, select **Injected Provider - Talisman** (check the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\_blank} section of the Remix IDE guide for more details) 2. From the **ACCOUNT** dropdown, select the account you want to use for the deploy ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-7.webp) 3. Configure the contract parameters 1. Enter the address that will own the deployed NFT. 2. Click the **Deploy** button to initiate the deployment ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-8.webp) 4. 
Talisman will pop up - review the transaction details. Click **Approve** to deploy your contract ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-9.webp){: .browser-extension} Deploying this contract requires paying gas fees in PAS tokens on the Polkadot Hub TestNet. Ensure your Talisman account is funded with sufficient PAS tokens from the faucet before confirming the transaction. Check the [Test Tokens](/develop/smart-contracts/connect-to-polkadot/#test-tokens){target=\_blank} section for more information. Gas fees cover the computational resources needed to deploy and execute the smart contract on the blockchain. If the deployment process succeeded, you will see the following output in the terminal: ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-10.webp) ## Interact with Your NFT Contract Once deployed, you can interact with your contract through Remix: 1. Find your contract under **Deployed/Unpinned Contracts**, and click it to expand the available methods for the contract ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-11.webp) 2. To mint an NFT 1. Click on the contract to expand its associated methods 2. Expand the **safeMint** function 3. Enter the recipient address 4. Click **Transact** ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-12.webp) 3. Click **Approve** to confirm the transaction in the Talisman popup ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-13.webp){: .browser-extension} If the transaction is successful, the terminal will display the following output, which details the information about the transaction, including the transaction hash, the block number, the associated logs, and so on. ![](/images/tutorials/smart-contracts/deploy-nft/deploy-nft-14.webp) Feel free to explore and interact with the contract's other functions using the same approach - selecting the method, providing any required parameters, and confirming the transaction through Talisman when needed. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/tutorials/smart-contracts/launch-your-first-project/create-contracts/ --- BEGIN CONTENT --- --- title: Create a Smart Contract description: Learn how to write a basic smart contract using just a text editor. This guide covers creating and preparing a contract for deployment on Polkadot Hub. tutorial_badge: Beginner categories: Basics, Smart Contracts --- # Create a Smart Contract !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Creating [smart contracts](/develop/smart-contracts/overview/){target=\_blank} is fundamental to blockchain development. While many frameworks and tools are available, understanding how to write a contract from scratch with just a text editor is essential knowledge. This tutorial will guide you through creating a basic smart contract that can be used with other tutorials for deployment and integration on Polkadot Hub. To understand how smart contracts work in Polkadot Hub, check the [Smart Contract Basics](/polkadot-protocol/smart-contract-basics/){target=\_blank} guide for more information. ## Prerequisites Before starting, make sure you have: - A text editor of your choice ([VS Code](https://code.visualstudio.com/){target=\_blank}, [Sublime Text](https://www.sublimetext.com/){target=\_blank}, etc.) - Basic understanding of programming concepts - Familiarity with the Solidity programming language syntax.
For further references, check the official [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\_blank} ## Understanding Smart Contract Structure Let's explore these components before building the contract: - [**SPDX license identifier**](https://docs.soliditylang.org/en/v0.6.8/layout-of-source-files.html){target=\_blank} - a standardized way to declare the license under which your code is released. This helps with legal compliance and is required by the Solidity compiler to avoid warnings - **Pragma directive** - specifies which version of Solidity compiler should be used for your contract - **Contract declaration** - similar to a class in object-oriented programming, it defines the boundaries of your smart contract - **State variables** - data stored directly in the contract that persists between function calls. These represent the contract's "state" on the blockchain - **Functions** - executable code that can read or modify the contract's state variables - **Events** - notification mechanisms that applications can subscribe to in order to track blockchain changes ## Create the Smart Contract In this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners. It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes. This contract will: - Store a number - Allow updating the stored number - Emit an event when the number changes To build the smart contract, follow the steps below: 1. Create a new file named `Storage.sol` 2. Add the SPDX license identifier at the top of the file: ```solidity // SPDX-License-Identifier: MIT ``` This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings. 3. Specify the Solidity version: ```solidity pragma solidity ^0.8.28; ``` The caret `^` means "this version or any compatible newer version." This helps ensure your contract compiles correctly with the intended compiler features. 4. Create the contract structure: ```solidity contract Storage { // Contract code will go here } ``` This defines a contract named "Storage", similar to how you would define a class in other programming languages. 5. Add the state variables and event: ```solidity contract Storage { // State variable to store a number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); } ``` Here, you're defining: - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data 6. 
Add the getter and setter functions: ```solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.28; contract Storage { // State variable to store our number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); // Function to store a new number function store(uint256 newNumber) public { number = newNumber; emit NumberChanged(newNumber); } // Function to retrieve the stored number function retrieve() public view returns (uint256) { return number; } } ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" // SPDX-License-Identifier: MIT pragma solidity ^0.8.28; contract Storage { // State variable to store our number uint256 private number; // Event to notify when the number changes event NumberChanged(uint256 newNumber); // Function to store a new number function store(uint256 newNumber) public { number = newNumber; emit NumberChanged(newNumber); } // Function to retrieve the stored number function retrieve() public view returns (uint256) { return number; } } ``` ## Understanding the Code Let's break down the key components of the contract: - **State Variable** - `uint256 private number` - a private variable that can only be accessed through the contract's functions - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. "Private" in Solidity doesn't mean the data is encrypted or truly hidden - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations) - **Event** - `event NumberChanged(uint256 newNumber)` - emitted when the stored number changes - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications - Unlike state variables, events cannot be read by smart contracts, only by external applications - Events are much more gas-efficient than storing data when you only need to notify external systems of changes - **Functions** - `store(uint256 newNumber)` - updates the stored number and emits an event - This function changes the state of the contract and requires a transaction to execute - The `emit` keyword is used to trigger the defined event - `retrieve()` - returns the current stored number - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state - View functions don't require a transaction and don't cost gas when called externally For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed. This basic contract serves as a foundation for learning smart contract development. Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment. For more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\_blank}. ## Where to Go Next
- Tutorial __Test and Deploy with Hardhat__ --- Learn how to test and deploy the smart contract you created by using Hardhat. [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)
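To see how the read/write distinction described in this tutorial plays out off-chain, below is a minimal, hypothetical sketch using [Ethers.js](/develop/smart-contracts/libraries/ethers-js/){target=\_blank}. The RPC URL, private key, and contract address are placeholder assumptions you would replace with your own values; `retrieve()` is a free read-only call, while `store()` submits a transaction that costs gas and emits the `NumberChanged` event.

```typescript
import { ethers } from 'ethers';

// Placeholder values - replace with your own endpoint, key, and deployed address
const RPC_URL = 'https://testnet-passet-hub-eth-rpc.polkadot.io'; // assumed Polkadot Hub TestNet endpoint
const PRIVATE_KEY = 'INSERT_PRIVATE_KEY';
const CONTRACT_ADDRESS = 'INSERT_CONTRACT_ADDRESS';

// Minimal ABI covering only the functions and event used here
const ABI = [
  'function retrieve() view returns (uint256)',
  'function store(uint256 newNumber)',
  'event NumberChanged(uint256 newNumber)',
];

async function main() {
  const provider = new ethers.JsonRpcProvider(RPC_URL);
  const wallet = new ethers.Wallet(PRIVATE_KEY, provider);
  const storage = new ethers.Contract(CONTRACT_ADDRESS, ABI, wallet);

  // Read-only call: no transaction is sent and no gas is spent
  console.log('Current value:', await storage.retrieve());

  // State-changing call: submits a transaction and emits NumberChanged
  const tx = await storage.store(42n);
  await tx.wait();
  console.log('Updated value:', await storage.retrieve());
}

main().catch(console.error);
```

This sketch assumes the contract has already been deployed (for example, via the Hardhat tutorial linked above) and that the account behind `PRIVATE_KEY` holds enough PAS to cover the `store` transaction.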
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/architecture/parachains/overview/ --- BEGIN CONTENT --- --- title: Overview description: Learn about the role, functionality, and implementation of parachains as a developer in the wider Polkadot architecture. categories: Basics, Polkadot Protocol, Parachains --- ## Introduction A [_parachain_](/polkadot-protocol/glossary#parachain){target=\_blank} is a coherent, application-specific blockchain that derives security from its respective relay chain. Parachains on Polkadot are each their own separate, fully functioning blockchain. The primary difference between a parachain and a regular, "solo" blockchain is that the relay chain verifies the state of all parachains that are connected to it. In many ways, parachains can be thought of as a ["cynical" rollup](#cryptoeconomic-security-elves-protocol), as the crypto-economic protocol used (ELVES) assumes the worst-case scenario, rather than the typical optimistic approach that many roll-up mechanisms take. Once enough validators attest that a block is valid, the probability of that block being valid is high. As each parachain's state is validated by the relay chain, the relay chain represents the collective state of all parachains. ```mermaid flowchart TB subgraph "Relay Chain" RC[Relay Chain Validators] State[Collective State Validation] end PA[Parachain A] PB[Parachain B] PC[Parachain C] RC -->|Validate State| PA RC -->|Validate State| PB RC -->|Validate State| PC State -->|Represents Collective\nParachain State| RC note["ELVES Protocol:\n- Crypto-economic security\n- Assumes worst-case scenario\n- High probability validation"] ``` ## Coherent Systems Coherency refers to the degree of synchronization, consistency, and interoperability between different components or chains within a system. It encompasses the internal coherence of individual chains and the external coherence between chains regarding how they interact. A single-state machine like Ethereum is very coherent, as all of its components (smart contracts, dApps/applications, staking, consensus) operate within a single environment with the downside of less scalability. Multi-protocol state machines, such as Polkadot, offer less coherency due to their sharded nature but more scalability due to the parallelization of their architecture. Parachains are coherent, as they are self-contained environments with domain-specific functionality. ## Flexible Ecosystem Parachains enable parallelization of different services within the same network. However, unlike most layer two rollups, parachains don't suffer the same interoperability pitfalls that most rollups do. [Cross-Consensus Messaging (XCM)](/develop/interoperability/intro-to-xcm/){target=\_blank} provides a common communication format for each parachain and can be configured to allow a parachain to communicate with just the relay chain or certain parachains. The diagram below highlights the flexibility of the Polkadot ecosystem, where each parachain specializes in a distinct domain. This example illustrates how parachains, like DeFi and GameFi, leverage XCM for cross-chain operations such as asset transfers and credential verification.
```mermaid flowchart TB subgraph "Polkadot Relay Chain" RC[Relay Chain\nCross-Consensus\nRouting] end subgraph "Parachain Ecosystem" direction TB DeFi[DeFi Parachain\nFinancial Services] GameFi[GameFi Parachain\nGaming Ecosystem] NFT[NFT Parachain\nDigital Collectibles] Identity[Identity Parachain\nUser Verification] end DeFi <-->|XCM: Asset Transfer| GameFi GameFi <-->|XCM: Token Exchange| NFT Identity <-->|XCM: Credential Verification| DeFi RC -->|Validate & Route XCM| DeFi RC -->|Validate & Route XCM| GameFi RC -->|Validate & Route XCM| NFT RC -->|Validate & Route XCM| Identity note["XCM Features:\n- Standardized Messaging\n- Cross-Chain Interactions\n- Secure Asset/Data Transfer"] ``` Most parachains are built using the Polkadot SDK, which provides all the tools to create a fully functioning parachain. However, it is possible to construct a parachain that can inherit the security of the relay chain as long as it implements the correct mechanisms expected by the relay chain. ## State Transition Functions (Runtimes) Determinism is a fundamental property where given the same input, a system will consistently produce identical outputs. In blockchain systems, this predictable behavior is essential for state machines, which are algorithms that transition between different states based on specific inputs to generate a new state. At their core, parachains, like most blockchains, are deterministic, finite-state machines that are often backed by game theory and economics. The previous state of the parachain, combined with external input in the form of [extrinsics](/polkadot-protocol/glossary#extrinsic){target=\_blank}, allows the state machine to progress forward, one block at a time. ```mermaid stateDiagram-v2 direction LR [*] --> StateA : Initial State StateA --> STF : Extrinsics/Transactions STF --> StateB : Deterministic Transformation StateB --> [*] : New State ``` The primary driver of this progression is the state transition function (STF), commonly referred to as a runtime. Each time a block is submitted, it represents the next proposed state for a parachain. By applying the state transition function to the previous state and including a new block that contains the proposed changes in the form of a list of extrinsics/transactions, the runtime defines exactly how the parachain is to advance from state A to state B (see the conceptual sketch at the end of this page). The STF in a Polkadot SDK-based chain is compiled to Wasm and uploaded on the relay chain. This STF is crucial for the relay chain to validate the state changes coming from the parachain, as it is used to ensure that all proposed state transitions are happening correctly as part of the validation process. For more information on the Wasm meta protocol that powers runtimes, see the [WASM Meta Protocol](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/wasm_meta_protocol/index.html){target=\_blank} in the Polkadot SDK Rust Docs. ## Shared Security: Validated by the Relay Chain The relay chain provides a layer of economic security for its parachains. Parachains submit proof of validation (PoV) data to the relay chain for validation through [collators](/polkadot-protocol/glossary/#collator), upon which the relay chain's validators ensure the validity of this data in accordance with the STF for that particular parachain. In other words, the consensus for a parachain follows the relay chain.
While parachains choose how a block is authored, what it contains, and who authors it, the relay chain ultimately provides finality and consensus for those blocks. For more information about the parachain and relay chain validation process, see the [Parachains' Protocol Overview: Protocols' Summary](https://wiki.polkadot.network/learn/learn-parachains-protocol/#protocols-summary){target=\_blank} entry in the Polkadot Wiki. Parachains need at least one honest collator to submit PoV data to the relay chain. Without this, the parachain can't progress. The mechanisms that facilitate this are found in the Cumulus portion of the Polkadot SDK, some of which are found in the [`cumulus_pallet_parachain_system`](https://paritytech.github.io/polkadot-sdk/master/cumulus_pallet_parachain_system/index.html){target=\_blank} pallet. ### Cryptoeconomic Security: ELVES Protocol The [ELVES (Economic Last Validation Enforcement System)](https://eprint.iacr.org/2024/961){target=\_blank} protocol forms the foundation of Polkadot's cryptoeconomic security model. ELVES assumes a worst-case scenario by enforcing strict validation rules before any state transitions are finalized. Unlike optimistic approaches that rely on post-facto dispute resolution, ELVES ensures that validators collectively confirm the validity of a block before it becomes part of the parachain's state. Validators are incentivized through staking and penalized for malicious or erroneous actions, ensuring adherence to the protocol. This approach minimizes the probability of invalid states being propagated across the network, providing robust security for parachains. ## Interoperability Polkadot's interoperability framework allows parachains to communicate with each other, fostering a diverse ecosystem of interconnected blockchains. Through [Cross-Consensus Messaging (XCM)](/develop/interoperability/intro-to-xcm/){target=\_blank}, parachains can transfer assets, share data, and invoke functionalities on other chains securely. This standardized messaging protocol ensures that parachains can interact with the relay chain and each other, supporting efficient cross-chain operations. The XCM protocol mitigates common interoperability challenges in isolated blockchain networks, such as fragmented ecosystems and limited collaboration. By enabling decentralized applications to leverage resources and functionality across parachains, Polkadot promotes a scalable, cooperative blockchain environment that benefits all participants. ## Where to Go Next For further information about the consensus protocol used by parachains, see the [Consensus](/polkadot-protocol/architecture/parachains/consensus/) page.
- Learn __Consensus__ --- Understand how the blocks authored by parachain collators are secured by the relay chain validators and how the parachain transactions achieve finality. [:octicons-arrow-right-24: Reference](/polkadot-protocol/architecture/parachains/consensus/)
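As a purely conceptual illustration of the state transition function described above (this is not the Polkadot SDK API; real runtimes are compiled to Wasm and validated by the relay chain), the idea can be sketched as a deterministic fold of a block's extrinsics over the previous state:

```typescript
// Conceptual sketch only: the names and types below are illustrative, not Polkadot SDK APIs.
type State = Record<string, unknown>;
type Extrinsic = (state: State) => State;

// The state transition function (runtime) deterministically folds a block's
// extrinsics over the previous state to produce the next state.
function stateTransition(previousState: State, extrinsics: Extrinsic[]): State {
  return extrinsics.reduce((state, applyExtrinsic) => applyExtrinsic(state), previousState);
}

// Given the same previous state and the same extrinsics, the result is always identical.
const stateB = stateTransition({ balance: 100 }, [
  (state) => ({ ...state, balance: (state.balance as number) - 10 }),
]);
console.log(stateB); // { balance: 90 }
```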
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/architecture/polkadot-chain/overview/ --- BEGIN CONTENT --- --- title: Overview of the Polkadot Relay Chain description: Explore Polkadot's core architecture, including its multi-chain vision, shared security, and the DOT token's governance and staking roles. categories: Basics, Polkadot Protocol, Parachains --- # Overview ## Introduction Polkadot is a next-generation blockchain protocol designed to support a multi-chain future by enabling secure communication and interoperability between different blockchains. Built as a Layer-0 protocol, Polkadot introduces innovations like application-specific Layer-1 chains ([parachains](/polkadot-protocol/architecture/parachains/){target=\_blank}), shared security through [Nominated Proof of Stake (NPoS)](/polkadot-protocol/glossary/#nominated-proof-of-stake-npos){target=\_blank}, and cross-chain interactions via its native [Cross-Consensus Messaging Format (XCM)](/develop/interoperability/intro-to-xcm/){target=\_blank}. This guide covers key aspects of Polkadot's architecture, including its high-level protocol structure, blockspace commoditization, and the role of its native token, DOT, in governance, staking, and resource allocation. ## Polkadot 1.0 Polkadot 1.0 represents the state of Polkadot as of 2023, coinciding with the release of [Polkadot runtime v1.0.0](https://github.com/paritytech/polkadot/releases/tag/v1.0.0){target=\_blank}. This section will focus on Polkadot 1.0, along with philosophical insights into network resilience and blockspace. As a Layer-0 blockchain, Polkadot contributes to the multi-chain vision through several key innovations and initiatives, including: - **Application-specific Layer-1 blockchains (parachains)** - Polkadot's sharded network allows for parallel transaction processing, with shards that can have unique state transition functions, enabling custom-built L1 chains optimized for specific applications - **Shared security and scalability** - L1 chains connected to Polkadot benefit from its [Nominated Proof of Stake (NPoS)](/polkadot-protocol/architecture/polkadot-chain/pos-consensus/#nominated-proof-of-stake){target=\_blank} system, providing security out-of-the-box without the need to bootstrap their own - **Secure interoperability** - Polkadot's native interoperability enables seamless data and value exchange between parachains. This interoperability can also be used outside of the ecosystem for bridging with external networks - **Resilient infrastructure** - decentralized and scalable, Polkadot ensures ongoing support for development and community initiatives via its on-chain [treasury](https://wiki.polkadot.network/learn/learn-polkadot-opengov-treasury/){target=\_blank} and governance - **Rapid L1 development** - the [Polkadot SDK](/develop/parachains/intro-polkadot-sdk/){target=\_blank} allows fast, flexible creation and deployment of Layer-1 chains - **Cultivating the next generation of Web3 developers** - Polkadot supports the growth of Web3 core developers through initiatives such as: - [Polkadot Blockchain Academy](https://polkadot.com/blockchain-academy){target=\_blank} - [Polkadot Alpha Program](https://polkadot.com/alpha-program){target=\_blank} - [EdX courses](https://www.edx.org/school/web3x){target=\_blank} - Rust and Substrate courses (coming soon) ### High-Level Architecture Polkadot features a chain that serves as the central component of the system.
This chain is depicted as a ring encircled by several parachains that are connected to it. According to Polkadot's design, any blockchain that can compile to WebAssembly (Wasm) and adheres to the Parachains Protocol becomes a parachain on the Polkadot network. Here's a high-level overview of the Polkadot protocol architecture: ![](/images/polkadot-protocol/architecture/polkadot-chain/overview/overview-1.webp) Parachains propose blocks to Polkadot validators, who check for availability and validity before finalizing them. With the relay chain providing security, collators (full nodes of parachains) can focus on their tasks without needing strong incentives. The [Cross-Consensus Messaging Format (XCM)](/develop/interoperability/intro-to-xcm/){target=\_blank} allows parachains to exchange messages freely, leveraging the chain's security for trust-free communication. In order to interact with chains that want to use their own finalization process (e.g., Bitcoin), Polkadot has [bridges](/polkadot-protocol/parachain-basics/interoperability/#bridges-connecting-external-networks){target=\_blank} that offer two-way compatibility, meaning that transactions can be made between different parachains. ### Polkadot's Additional Functionalities Historically, obtaining core slots on the Polkadot chain relied upon crowdloans and auctions. Chain cores were leased through auctions for three-month periods, up to a maximum of two years. Crowdloans enabled users to securely lend funds to teams for lease deposits in exchange for pre-sale tokens, which was the only way to access slots on Polkadot 1.0. Auctions are now deprecated in favor of [coretime](/polkadot-protocol/architecture/system-chains/coretime/){target=\_blank}. Additionally, the chain handles [staking](https://wiki.polkadot.network/learn/learn-staking/){target=\_blank}, [accounts](/polkadot-protocol/basics/accounts/){target=\_blank}, balances, and [governance](/polkadot-protocol/onchain-governance/){target=\_blank}. #### Agile Coretime The new and more efficient way of obtaining a core on Polkadot is to go through the process of purchasing coretime. [Agile coretime](/polkadot-protocol/architecture/polkadot-chain/agile-coretime/){target=\_blank} improves the efficient use of Polkadot's network resources and offers economic flexibility for developers, extending Polkadot's capabilities far beyond the original vision outlined in the [whitepaper](https://polkadot.com/papers/Polkadot-whitepaper.pdf){target=\_blank}. It enables parachains to purchase monthly "bulk" allocations of coretime (the time allocated for utilizing a core, measured in Polkadot relay chain blocks), ensuring heavy-duty parachains that can author a block every six seconds with [Asynchronous Backing](https://wiki.polkadot.network/learn/learn-async-backing/#asynchronous-backing){target=\_blank} can reliably renew their coretime each month. Although six-second block times are now the default, parachains have the option of producing blocks less frequently. Renewal orders are prioritized over new orders, offering stability against price fluctuations and helping parachains budget more effectively for project costs.
### Polkadot's Resilience Decentralization is a vital component of blockchain networks, but it comes with trade-offs: - An overly decentralized network may face challenges in reaching consensus and require significant energy to operate - Also, a network that achieves consensus quickly risks centralization, making it easier to manipulate or attack A network should be decentralized enough to prevent manipulative or malicious influence. In this sense, decentralization is a tool for achieving resilience. Polkadot 1.0 currently achieves resilience through several strategies: - **Nominated Proof of Stake (NPoS)** - ensures that the stake per validator is maximized and evenly distributed among validators - **Decentralized nodes** - designed to encourage operators to join the network. This program aims to expand and diversify the validators in the ecosystem who aim to become independent of the program during their term. Feel free to explore more about the program on the official [Decentralized Nodes](https://nodes.web3.foundation/){target=\_blank} page - **On-chain treasury and governance** - known as [OpenGov](/polkadot-protocol/onchain-governance/overview/){target=\_blank}, this system allows every decision to be made through public referenda, enabling any token holder to cast a vote ### Polkadot's Blockspace Polkadot 1.0's design allows for the commoditization of blockspace. Blockspace is a blockchain's capacity to finalize and commit operations, encompassing its security, computing, and storage capabilities. Its characteristics can vary across different blockchains, affecting security, flexibility, and availability. - **Security** - measures the robustness of blockspace in Proof of Stake (PoS) networks linked to the stake locked on validator nodes, the variance in stake among validators, and the total number of validators. It also considers social centralization (how many validators are owned by single operators) and physical centralization (how many validators run on the same service provider) - **Flexibility** - reflects the functionalities and types of data that can be stored, with high-quality data essential to avoid bottlenecks in critical processes - **Availability** - indicates how easily users can access blockspace. It should be easily accessible, allowing diverse business models to thrive, ideally regulated by a marketplace based on demand and supplemented by options for "second-hand" blockspace Polkadot is built on core blockspace principles, but there's room for improvement. Tasks like balance transfers, staking, and governance are managed on the relay chain. Delegating these responsibilities to [system chains](/polkadot-protocol/architecture/system-chains/){target=\_blank} could enhance flexibility and allow the relay chain to concentrate on providing shared security and interoperability. For more information about blockspace, watch [Robert Habermeier's interview](https://www.youtube.com/watch?v=e1vISppPwe4){target=\_blank} or read his [technical blog post](https://www.rob.tech/blog/polkadot-blockspace-over-blockchains/){target=\_blank}. ## DOT Token DOT is the native token of the Polkadot network, much like BTC for Bitcoin and Ether for the Ethereum blockchain. DOT has 10 decimals, uses the Planck base unit, and has a balance type of `u128`. The same is true for Kusama's KSM token with the exception of having 12 decimals. ### Redenomination of DOT Polkadot conducted a community poll, which ended on 27 July 2020 at block 888,888, to decide whether to redenominate the DOT token.
The stakeholders chose to redenominate the token, changing the value of 1 DOT from 1e12 plancks to 1e10 plancks. Importantly, this did not affect the network's total number of base units (plancks); it only affected how a single DOT is represented. The redenomination became effective 72 hours after transfers were enabled, occurring at block 1,248,328 on 21 August 2020 around 16:50 UTC. ### The Planck Unit The smallest unit of account balance on Polkadot SDK-based blockchains (such as Polkadot and Kusama) is called _Planck_, named after the Planck length, the smallest measurable distance in the physical universe. Similar to how BTC's smallest unit is the Satoshi and ETH's is the Wei, Polkadot's native token DOT equals 1e10 Planck, while Kusama's native token KSM equals 1e12 Planck. ### Uses for DOT DOT serves three primary functions within the Polkadot network: - **Governance** - it is used to participate in the governance of the network - **Staking** - DOT is staked to support the network's operation and security - **Buying coretime** - used to purchase coretime in bulk or on demand and access the chain to benefit from Polkadot's security and interoperability Additionally, DOT can serve as a transferable token. For example, DOT, held in the treasury, can be allocated to teams developing projects that benefit the Polkadot ecosystem. ## JAM and the Road Ahead The Join-Accumulate Machine (JAM) represents a transformative redesign of Polkadot's core architecture, envisioned as the successor to the current relay chain. Unlike traditional blockchain architectures, JAM introduces a unique computational model that processes work through two primary functions: - **Join** - handles data integration - **Accumulate** - folds computations into the chain's state JAM removes many of the opinions and constraints of the current relay chain while maintaining its core security properties. Expected improvements include: - **Permissionless code execution** - JAM is designed to be more generic and flexible, allowing for permissionless code execution through services that can be deployed without governance approval - **More effective block time utilization** - JAM's efficient pipeline processing model places the prior state root in block headers instead of the posterior state root, enabling more effective utilization of block time for computations This architectural evolution promises to enhance Polkadot's scalability and flexibility while maintaining robust security guarantees. JAM is planned to be rolled out to Polkadot as a single, complete upgrade rather than a stream of smaller updates. This approach seeks to minimize the developer overhead required to address any breaking changes. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/architecture/system-chains/overview/ --- BEGIN CONTENT --- --- title: Overview of Polkadot's System Chains description: Discover how system parachains enhance Polkadot's scalability and performance by offloading tasks like governance, asset management, and bridging from the relay chain. categories: Basics, Polkadot Protocol --- ## Introduction Polkadot's relay chain is designed to secure parachains and facilitate seamless inter-chain communication. However, resource-intensive tasks like governance, asset management, and bridging are more efficiently handled by system parachains. These specialized chains offload functionality from the relay chain, leveraging Polkadot's parallel execution model to improve performance and scalability.
By distributing key functionalities across system parachains, Polkadot can maximize its relay chain's blockspace for its core purpose of securing and validating parachains. This guide will explore how system parachains operate within Polkadot and Kusama, detailing their critical roles in network governance, asset management, and bridging. You'll learn about the currently deployed system parachains, their unique functions, and how they enhance Polkadot's decentralized ecosystem. ## System Chains System parachains contain core Polkadot protocol features, but in parachains rather than the relay chain. Execution cores for system chains are allocated via network [governance](/polkadot-protocol/onchain-governance/overview/){target=\_blank} rather than purchasing coretime on a marketplace. System parachains defer to on-chain governance to manage their upgrades and other sensitive actions as they do not have native tokens or governance systems separate from DOT or KSM. It is not uncommon to see a system parachain implemented specifically to manage network governance. !!!note You may see system parachains called common good parachains in articles and discussions. This nomenclature caused confusion as the network evolved, so system parachains is preferred. For more details on this evolution, review this [parachains forum discussion](https://forum.polkadot.network/t/polkadot-protocol-and-common-good-parachains/866){target=\_blank}. ## Existing System Chains ```mermaid --- title: System Parachains at a Glance --- flowchart TB subgraph POLKADOT["Polkadot"] direction LR PAH["Polkadot Asset Hub"] PCOL["Polkadot Collectives"] PBH["Polkadot Bridge Hub"] PPC["Polkadot People Chain"] PCC["Polkadot Coretime Chain"] end subgraph KUSAMA["Kusama"] direction LR KAH["Kusama Asset Hub"] KBH["Kusama Bridge Hub"] KPC["Kusama People Chain"] KCC["Kusama Coretime Chain"] E["Encointer"] end ``` All system parachains are on both Polkadot and Kusama with the following exceptions: - [**Collectives**](#collectives) - only on Polkadot - [**Encointer**](#encointer) - only on Kusama ### Asset Hub The [Asset Hub](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/cumulus#asset-hub-){target=\_blank} is an asset portal for the entire network. It helps asset creators, such as reserve-backed stablecoin issuers, track the total issuance of an asset in the network, including amounts transferred to other parachains. It also serves as the hub where asset creators can perform on-chain operations, such as minting and burning, to manage their assets effectively. This asset management logic is encoded directly in the runtime of the chain rather than in smart contracts. The efficiency of executing logic in a parachain allows for fees and deposits that are about 1/10th of what is required on the relay chain. These low fees mean that the Asset Hub is well suited for handling the frequent transactions required when managing balances, transfers, and on-chain assets. The Asset Hub also supports non-fungible assets (NFTs) via the [Uniques pallet](https://polkadot.js.org/docs/substrate/extrinsics#uniques){target=\_blank} and [NFTs pallet](https://polkadot.js.org/docs/substrate/extrinsics#nfts){target=\_blank}. For more information about NFTs, see the Polkadot Wiki section on [NFT Pallets](https://wiki.polkadot.network/learn/learn-nft-pallets/){target=\_blank}. 
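To make the Asset Hub's on-chain asset management more concrete, here is a minimal, hypothetical sketch that reads an asset's details and metadata from the `assets` pallet using the [Polkadot.js API](https://polkadot.js.org/docs/api/){target=\_blank}. The WebSocket endpoint and the asset ID are assumptions for illustration; substitute the provider and asset you are interested in.

```typescript
import { ApiPromise, WsProvider } from '@polkadot/api';

async function main() {
  // Assumed public Polkadot Asset Hub endpoint - replace with your preferred provider
  const api = await ApiPromise.create({
    provider: new WsProvider('wss://polkadot-asset-hub-rpc.polkadot.io'),
  });

  const assetId = 1984; // assumed example asset ID

  // The assets pallet stores per-asset details (issuance, owner, status) and metadata (name, symbol, decimals)
  const [details, metadata] = await Promise.all([
    api.query.assets.asset(assetId),
    api.query.assets.metadata(assetId),
  ]);

  console.log('Asset details:', details.toHuman());
  console.log('Asset metadata:', metadata.toHuman());

  await api.disconnect();
}

main().catch(console.error);
```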
### Collectives The Polkadot Collectives parachain was added in [Referendum 81](https://polkadot.polkassembly.io/referendum/81){target=\_blank} and exists on Polkadot but not on Kusama. The Collectives chain hosts on-chain collectives that serve the Polkadot network, including the following: - [**Polkadot Alliance**](https://polkadot.polkassembly.io/referendum/94){target=\_blank} - provides a set of ethics and standards for the community to follow. Includes an on-chain means to call out bad actors - [**Polkadot Technical Fellowship**](https://wiki.polkadot.network/learn/learn-polkadot-technical-fellowship/){target=\_blank} - a rules-based social organization to support and incentivize highly-skilled developers to contribute to the technical stability, security, and progress of the network These on-chain collectives will play essential roles in the future of network stewardship and decentralized governance. Networks can use a bridge hub to help them act as collectives and express their legislative voices as single opinions within other networks. ### Bridge Hub Before parachains, the only way to design a bridge was to put the logic onto the relay chain. Since both networks now support parachains and the isolation they provide, each network can have a parachain dedicated to bridges. The Bridge Hub system parachain operates on the relay chain and is responsible for facilitating bridges to the wider Web3 space. It contains the required bridge [pallets](/polkadot-protocol/glossary/#pallet){target=\_blank} in its runtime, which enable trustless bridging with other blockchain networks like Polkadot, Kusama, and Ethereum. The Bridge Hub uses the native token of the relay chain. See the [Bridge Hub](/polkadot-protocol/architecture/system-chains/bridge-hub/){target=\_blank} documentation for additional information. ### People Chain The People Chain provides a naming system that allows users to manage and verify their account [identity](https://wiki.polkadot.network/learn/learn-identity/){target=\_blank}. ### Coretime Chain The Coretime system chain lets users buy coretime to access Polkadot's computation. [Coretime marketplaces](https://wiki.polkadot.network/learn/learn-guides-coretime-marketplaces/){target=\_blank} run on top of the Coretime chain. Visit [Introduction to Agile Coretime](https://wiki.polkadot.network/learn/learn-agile-coretime/#introduction-to-agile-coretime){target=\_blank} in the Polkadot Wiki for more information. Kusama does not use the Collectives system chain. Instead, Kusama relies on the Encointer system chain, which provides Sybil resistance as a service to the entire Kusama ecosystem. ### Encointer [Encointer](https://encointer.org/encointer-for-web3/){target=\_blank} is a blockchain platform for self-sovereign ID and a global [universal basic income (UBI)](https://book.encointer.org/economics-ubi.html){target=\_blank}. The Encointer protocol uses a novel Proof of Personhood (PoP) system to create unique identities and resist Sybil attacks. PoP is based on the notion that a person can only be in one place at any given time. Encointer offers a framework that allows for any group of real people to create, distribute, and use their own digital community tokens. Participants are requested to attend physical key-signing ceremonies with small groups of random people at randomized locations. These local meetings are part of one global signing ceremony occurring at the same time.
Participants use the Encointer wallet app to participate in these ceremonies and manage local community currencies. Referendums marking key Encointer adoption milestones include: - [**Referendum 158 - Register Encointer As a Common Good Chain**](https://kusama.polkassembly.io/referendum/158){target=\_blank} - registered Encointer as the second system parachain on Kusama's network - [**Referendum 187 - Encointer Runtime Upgrade to Full Functionality**](https://kusama.polkassembly.io/referendum/187){target=\_blank} - introduced a runtime upgrade bringing governance and full functionality for communities to use the protocol To learn more about Encointer, see the official [Encointer book](https://book.encointer.org/introduction.html){target=\_blank} or watch an [Encointer ceremony](https://www.youtube.com/watch?v=tcgpCCYBqko){target=\_blank} in action. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/onchain-governance/overview/ --- BEGIN CONTENT --- --- title: On-Chain Governance Overview description: Discover Polkadot’s cutting-edge OpenGov system, enabling transparent, decentralized decision-making through direct democracy and flexible governance tracks. categories: Basics, Polkadot Protocol --- # On-Chain Governance ## Introduction Polkadot’s governance system exemplifies decentralized decision-making, empowering its community of stakeholders to shape the network’s future through active participation. The latest evolution, OpenGov, builds on Polkadot’s foundation by providing a more inclusive and efficient governance model. This guide will explain the principles and structure of OpenGov and walk you through its key components, such as Origins, Tracks, and Delegation. You will learn about improvements over earlier governance systems, including streamlined voting processes and enhanced stakeholder participation. With OpenGov, Polkadot achieves a flexible, scalable, and democratic governance framework that allows multiple proposals to proceed simultaneously, ensuring the network evolves in alignment with its community's needs. ## Governance Evolution Polkadot’s governance journey began with [Governance V1](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#governance-summary){target=\_blank}, a system that proved effective in managing treasury funds and protocol upgrades. However, it faced limitations, such as: - Slow voting cycles, causing delays in decision-making - Inflexibility in handling multiple referendums, restricting scalability To address these challenges, Polkadot introduced OpenGov, a governance model designed for greater inclusivity, efficiency, and scalability. OpenGov replaces the centralized structures of Governance V1, such as the Council and Technical Committee, with a fully decentralized and dynamic framework. For a full comparison of the historic and current governance models, visit the [Gov1 vs. Polkadot OpenGov](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#gov1-vs-polkadot-opengov){target=\_blank} section of the Polkadot Wiki. ## OpenGov Key Features OpenGov transforms Polkadot’s governance into a decentralized, stakeholder-driven model, eliminating centralized decision-making bodies like the Council. 
Key enhancements include: - **Decentralization** - shifts all decision-making power to the public, ensuring a more democratic process - **Enhanced delegation** - allows users to delegate their votes to trusted experts across specific governance tracks - **Simultaneous referendums** - multiple proposals can progress at once, enabling faster decision-making - **Polkadot Technical Fellowship** - a broad, community-driven group replacing the centralized Technical Committee This new system ensures Polkadot governance remains agile and inclusive, even as the ecosystem grows. ## Origins and Tracks In OpenGov, origins and tracks are central to managing proposals and votes. - **Origin** - determines the authority level of a proposal (e.g., Treasury, Root) which decides the track of all referendums from that origin - **Track** - define the procedural flow of a proposal, such as voting duration, approval thresholds, and enactment timelines Developers must be aware that referendums from different origins and tracks will take varying amounts of time to reach approval and enactment. The [Polkadot Technical Fellowship](https://wiki.polkadot.network/learn/learn-polkadot-technical-fellowship/){target=\_blank} has the option to shorten this timeline by whitelisting a proposal and allowing it to be enacted through the [Whitelist Caller](https://wiki.polkadot.network/learn/learn-polkadot-opengov-origins/#whitelisted-caller){target=\_blank} origin. Visit [Origins and Tracks Info](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#origins-and-tracks){target=\_blank} for details on current origins and tracks, associated terminology, and parameters. ## Referendums In OpenGov, anyone can submit a referendum, fostering an open and participatory system. The timeline for a referendum depends on the privilege level of the origin with more significant changes offering more time for community voting and participation before enactment. The timeline for an individual referendum includes four distinct periods: - **Lead-in** - a minimum amount of time to allow for community participation, available room in the origin, and payment of the decision deposit. Voting is open during this period - **Decision** - voting continues - **Confirmation** - referendum must meet [approval and support](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#approval-and-support){target=\_blank} criteria during entire period to avoid rejection - **Enactment** - changes approved by the referendum are executed ### Vote on Referendums Voters can vote with their tokens on each referendum. Polkadot uses a voluntary token locking mechanism, called conviction voting, as a way for voters to increase their voting power. A token holder signals they have a stronger preference for approving a proposal based upon their willingness to lock up tokens. Longer voluntary token locks are seen as a signal of continual approval and translate to increased voting weight. See [Voting on a Referendum](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\_blank} for a deeper look at conviction voting and related token locks. ### Delegate Voting Power The OpenGov system also supports multi-role delegations, allowing token holders to assign their voting power on different tracks to entities with expertise in those areas. 
For example, if a token holder lacks the technical knowledge to evaluate proposals on the [Root track](https://wiki.polkadot.network/learn/learn-polkadot-opengov-origins/#root){target=\_blank}, they can delegate their voting power for that track to an expert they trust to vote in the best interest of the network. This ensures informed decision-making across tracks while maintaining flexibility for token holders. Visit [Multirole Delegation](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#multirole-delegation){target=\_blank} for more details on delegating voting power. ### Cancel a Referendum Polkadot OpenGov has two origins for rejecting ongoing referendums: - [**Referendum Canceller**](https://wiki.polkadot.network/learn/learn-polkadot-opengov-origins/#referendum-canceller){target=\_blank} - cancels an active referendum when non-malicious errors occur and refunds the deposits to the originators - [**Referendum Killer**](https://wiki.polkadot.network/learn/learn-polkadot-opengov-origins/#referendum-killer){target=\_blank} - used for urgent, malicious cases; this origin instantly terminates an active referendum and slashes deposits See [Cancelling, Killing, and Blacklisting](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#cancelling-killing--blacklisting){target=\_blank} for additional information on rejecting referendums. ## Additional Resources - [**Democracy pallet**](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/substrate/frame/democracy/src){target=\_blank} - handles administration of general stakeholder voting - [**Gov2: Polkadot’s Next Generation of Decentralised Governance**](https://medium.com/polkadot-network/gov2-polkadots-next-generation-of-decentralised-governance-4d9ef657d11b){target=\_blank} - Medium article by Gavin Wood - [**Polkadot Direction**](https://matrix.to/#/#Polkadot-Direction:parity.io){target=\_blank} - Matrix Element client - [**Polkassembly**](https://polkadot.polkassembly.io/){target=\_blank} - OpenGov dashboard and UI - [**Polkadot.js Apps Governance**](https://polkadot.js.org/apps/#/referenda){target=\_blank} - overview of active referendums --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/accounts/ --- BEGIN CONTENT --- --- title: Polkadot SDK Accounts description: Learn about account structures, balances, and address formats in the Polkadot SDK, including how to manage lifecycle, references, and balances. categories: Basics, Polkadot Protocol --- # Accounts ## Introduction In the Polkadot SDK, accounts are essential for managing identity, transactions, and governance on the network. Understanding these components is critical for seamless development and operation on the network, whether you're building or interacting with Polkadot-based chains. This page will guide you through the essential aspects of accounts, including their data structure, balance types, reference counters, and address formats. You’ll learn how accounts are managed within the runtime, how balances are categorized, and how addresses are encoded and validated. ## Account Data Structure Accounts are foundational to any blockchain, and the Polkadot SDK provides a flexible management system. This section explains how the Polkadot SDK defines accounts and manages their lifecycle through data structures within the runtime.
### Account The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Account.html){target=\_blank} is a storage map within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank} that links an account ID to its corresponding data. This structure is fundamental for mapping account-related information within the chain. The code snippet below shows how accounts are defined: ```rs /// The full account information for a particular account ID. #[pallet::storage] #[pallet::getter(fn account)] pub type Account<T: Config> = StorageMap< _, Blake2_128Concat, T::AccountId, AccountInfo<T::Nonce, T::AccountData>, ValueQuery, >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). The `StorageMap` consists of the following parameters: - **`_`** - used in macro expansion and acts as a placeholder for the storage prefix type. Tells the macro to insert the default prefix during expansion - **`Blake2_128Concat`** - the hashing function applied to keys in the storage map - **`T::AccountId`** - represents the key type, which corresponds to the account’s unique ID - **`AccountInfo<T::Nonce, T::AccountData>`** - the value type stored in the map. For each account ID, the map stores an `AccountInfo` struct containing: - **`T::Nonce`** - a nonce for the account, which is incremented with each transaction to ensure transaction uniqueness - **`T::AccountData`** - custom account data defined by the runtime configuration, which could include balances, locked funds, or other relevant information - **`ValueQuery`** - defines how queries to the storage map behave when no value is found; returns a default value instead of `None` For a detailed explanation of storage maps, see the [`StorageMap`](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/struct.StorageMap.html){target=\_blank} entry in the Rust docs. ### Account Info The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs /// Information of an account. #[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] pub struct AccountInfo<Nonce, AccountData> { /// The number of transactions this account has sent. pub nonce: Nonce, /// The number of other modules that currently depend on this account's existence. The account /// cannot be reaped until this is zero. pub consumers: RefCount, /// The number of other modules that allow this account to exist. The account may not be reaped /// until this and `sufficients` are both zero. pub providers: RefCount, /// The number of modules that allow this account to exist for their own purposes only. The /// account may not be reaped until this and `providers` are both zero. pub sufficients: RefCount, /// The additional data that belongs to this account. Used to store the balance(s) in a lot of /// chains.
pub data: AccountData, } ``` The `AccountInfo` structure includes the following components: - **`nonce`** - tracks the number of transactions initiated by the account, which ensures transaction uniqueness and prevents replay attacks - **`consumers`** - counts how many other modules or pallets rely on this account’s existence. The account cannot be removed from the chain (reaped) until this count reaches zero - **`providers`** - tracks how many modules permit this account’s existence. An account can only be reaped once both `providers` and `sufficients` are zero - **`sufficients`** - represents the number of modules that allow the account to exist for internal purposes, independent of any other modules - **`AccountData`** - a flexible data structure that can be customized in the runtime configuration, usually containing balances or other user-specific data This structure helps manage an account's state and prevents its premature removal while it is still referenced by other on-chain data or modules. The [`AccountInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.AccountInfo.html){target=\_blank} structure can vary as long as it satisfies the trait bounds defined by the `AccountData` associated type in the [`frame-system::pallet::Config`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html){target=\_blank} trait. ### Account Reference Counters Polkadot SDK uses reference counters to track an account’s dependencies across different runtime modules. These counters ensure that accounts remain active while data is associated with them. The reference counters include: - **`consumers`** - prevents account removal while other pallets still rely on the account - **`providers`** - ensures an account is active before other pallets store data related to it - **`sufficients`** - indicates the account’s independence, ensuring it can exist even without a native token balance, such as when holding sufficient alternative assets #### Providers Reference Counters The `providers` counter ensures that an account is ready to be depended upon by other runtime modules. For example, it is incremented when an account has a balance above the existential deposit, which marks the account as active. The system requires this reference counter to be greater than zero for the `consumers` counter to be incremented, ensuring the account is stable before any dependencies are added. #### Consumers Reference Counters The `consumers` counter ensures that the account cannot be reaped until all references to it across the runtime have been removed. This check prevents the accidental deletion of accounts that still have active on-chain data. It is the user’s responsibility to clear out any data from other runtime modules if they wish to remove their account and reclaim their existential deposit. #### Sufficients Reference Counter The `sufficients` counter tracks accounts that can exist independently without relying on a native account balance. This is useful for accounts holding other types of assets, like tokens, without needing a minimum balance in the native token. For instance, the [Assets pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_assets/index.html){target=\_blank}, may increment this counter for an account holding sufficient tokens. #### Account Deactivation In Polkadot SDK-based chains, an account is deactivated when its reference counters (such as `providers`, `consumers`, and `sufficient`) reach zero. 
These counters ensure the account remains active as long as other runtime modules or pallets reference it. When all dependencies are cleared and the counters drop to zero, the account becomes deactivated and may be removed from the chain (reaped). This is particularly important in Polkadot SDK-based blockchains, where accounts with balances below the existential deposit threshold are pruned from storage to conserve state resources. Each pallet that references an account has cleanup functions that decrement these counters when the pallet no longer depends on the account. Once these counters reach zero, the account is marked for deactivation. #### Updating Counters The Polkadot SDK provides runtime developers with various methods to manage account lifecycle events, such as deactivation or incrementing reference counters. These methods ensure that accounts cannot be reaped while still in use. The following helper functions manage these counters: - **`inc_consumers()`** - increments the `consumer` reference counter for an account, signaling that another pallet depends on it - **`dec_consumers()`** - decrements the `consumer` reference counter, signaling that a pallet no longer relies on the account - **`inc_providers()`** - increments the `provider` reference counter, ensuring the account remains active - **`dec_providers()`** - decrements the `provider` reference counter, allowing for account deactivation when no longer in use - **`inc_sufficients()`** - increments the `sufficient` reference counter for accounts that hold sufficient assets - **`dec_sufficients()`** - decrements the `sufficient` reference counter To ensure proper account cleanup and lifecycle management, a corresponding decrement should be made for each increment action. The `System` pallet offers three query functions to assist developers in tracking account states: - [**`can_inc_consumer()`**](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_inc_consumer){target=\_blank} - checks if the account can safely increment the consumer reference - [**`can_dec_provider()`**](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_dec_provider){target=\_blank} - ensures that no consumers exist before allowing the decrement of the provider counter - [**`is_provider_required()`**](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.is_provider_required){target=\_blank} - verifies whether the account still has any active consumer references This modular and flexible system of reference counters tightly controls the lifecycle of accounts in Polkadot SDK-based blockchains, preventing the accidental removal or retention of unneeded accounts. You can refer to the [System pallet Rust docs](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html){target=\_blank} for more details. ## Account Balance Types In the Polkadot ecosystem, account balances are categorized into different types based on how the funds are utilized and their availability. These balance types determine the actions that can be performed, such as transferring tokens, paying transaction fees, or participating in governance activities. Understanding these balance types helps developers manage user accounts and implement balance-dependent logic. !!! 
note "A more efficient distribution of account balance types is in development" Soon, pallets in the Polkadot SDK will implement the [`Fungible` trait](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\_blank} (see the [tracking issue](https://github.com/paritytech/polkadot-sdk/issues/226){target=\_blank} for more details). For example, the [`transaction-storage`](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_storage/index.html){target=\_blank} pallet changed the implementation of the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/currency/index.html){target=\_blank} trait (see the [Refactor transaction storage pallet to use fungible traits](https://github.com/paritytech/polkadot-sdk/pull/1800){target=\_blank} PR for further details): ```rust type BalanceOf = <::Currency as Currency<::AccountId>>::Balance; ``` To the [`Fungible`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\_blank} trait: ```rust type BalanceOf = <::Currency as FnInspect<::AccountId>>::Balance; ``` This update will enable more efficient use of account balances, allowing the free balance to be utilized for on-chain activities such as setting proxies and managing identities. ### Balance Types The five main balance types are: - **Free balance** - represents the total tokens available to the account for any on-chain activity, including staking, governance, and voting. However, it may not be fully spendable or transferrable if portions of it are locked or reserved - **Locked balance** - portions of the free balance that cannot be spent or transferred because they are tied up in specific activities like [staking](https://wiki.polkadot.network/learn/learn-staking/#nominating-validators){target=\_blank}, [vesting](https://wiki.polkadot.network/learn/learn-guides-transfers/#vested-transfers-with-the-polkadot-js-ui){target=\_blank}, or participating in [governance](https://wiki.polkadot.network/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\_blank}. While the tokens remain part of the free balance, they are non-transferable for the duration of the lock - **Reserved balance** - funds locked by specific system actions, such as setting up an [identity](https://wiki.polkadot.network/learn/learn-identity/){target=\_blank}, creating [proxies](https://wiki.polkadot.network/learn/learn-proxies/){target=\_blank}, or submitting [deposits for governance proposals](https://wiki.polkadot.network/learn/learn-guides-polkadot-opengov/#claiming-opengov-deposits){target=\_blank}. These tokens are not part of the free balance and cannot be spent unless they are unreserved - **Spendable balance** - the portion of the free balance that is available for immediate spending or transfers. It is calculated by subtracting the maximum of locked or reserved amounts from the free balance, ensuring that existential deposit limits are met - **Untouchable balance** - funds that cannot be directly spent or transferred but may still be utilized for on-chain activities, such as governance participation or staking. These tokens are typically tied to certain actions or locked for a specific period The spendable balance is calculated as follows: ```text spendable = free - max(locked - reserved, ED) ``` Here, `free`, `locked`, and `reserved` are defined above. 
The `ED` represents the [existential deposit](https://wiki.polkadot.network/learn/learn-accounts/#existential-deposit-and-reaping){target=\_blank}, the minimum balance required to keep an account active and prevent it from being reaped. You may find you can't see all balance types when looking at your account via a wallet. Wallet providers often display only spendable, locked, and reserved balances. ### Locks Locks are applied to an account's free balance, preventing that portion from being spent or transferred. Locks are automatically placed when an account participates in specific on-chain activities, such as staking or governance. Although multiple locks may be applied simultaneously, they do not stack. Instead, the largest lock determines the total amount of locked tokens. Locks follow these basic rules: - If different locks apply to varying amounts, the largest lock amount takes precedence - If multiple locks apply to the same amount, the lock with the longest duration governs when the balance can be unlocked #### Locks Example Consider an example where an account has 80 DOT locked for both staking and governance purposes like so: - 80 DOT is staked with a 28-day lock period - 24 DOT is locked for governance with a 1x conviction and a 7-day lock period - 4 DOT is locked for governance with a 6x conviction and a 224-day lock period In this case, the total locked amount is 80 DOT because only the largest lock (80 DOT from staking) governs the locked balance. These 80 DOT will be released at different times based on the lock durations. In this example, the 24 DOT locked for governance will be released first since the shortest lock period is seven days. The 80 DOT stake with a 28-day lock period is released next. Now, all that remains locked is the 4 DOT for governance. After 224 days, all 80 DOT (minus the existential deposit) will be free and transferrable. ![Illustration of Lock Example](/images/polkadot-protocol/parachain-basics/accounts/locks-example-2.webp) #### Edge Cases for Locks In scenarios where multiple convictions and lock periods are active, the lock duration and amount are determined by the longest period and largest amount. For example, if you delegate with different convictions and attempt to undelegate during an active lock period, the lock may be extended for the full amount of tokens. For a detailed discussion on edge case lock behavior, see this [Stack Exchange post](https://substrate.stackexchange.com/questions/5067/delegating-and-undelegating-during-the-lock-period-extends-it-for-the-initial-am){target=\_blank}. ### Balance Types on Polkadot.js Polkadot.js provides a user-friendly interface for managing and visualizing various account balances on Polkadot and Kusama networks. When interacting with Polkadot.js, you will encounter multiple balance types that are critical for understanding how your funds are distributed and restricted. This section explains how different balances are displayed in the Polkadot.js UI and what each type represents. ![](/images/polkadot-protocol/parachain-basics/accounts/account-balance-types-1.webp) The most common balance types displayed on Polkadot.js are: - **Total balance** - the total number of tokens available in the account. This includes all tokens, whether they are transferable, locked, reserved, or vested. However, the total balance does not always reflect what can be spent immediately. In this example, the total balance is 0.6274 KSM - **Transferrable balance** - shows how many tokens are immediately available for transfer. 
It is calculated by subtracting the locked and reserved balances from the total balance. For example, if an account has a total balance of 0.6274 KSM and a transferrable balance of 0.0106 KSM, only the latter amount can be sent or spent freely - **Vested balance** - tokens that are allocated to the account but released according to a specific schedule. Vested tokens remain locked and cannot be transferred until fully vested. For example, an account with a vested balance of 0.2500 KSM means that this amount is owned but not yet transferable - **Locked balance** - tokens that are temporarily restricted from being transferred or spent. These locks typically result from participating in staking, governance, or vested transfers. In Polkadot.js, locked balances do not stack; only the largest lock is applied. For instance, if an account has 0.5500 KSM locked for governance and staking, the locked balance would display 0.5500 KSM, not the sum of all locked amounts - **Reserved balance** - refers to tokens locked for specific on-chain actions, such as setting an identity, creating a proxy, or making governance deposits. Reserved tokens are not part of the free balance, but can be freed by performing certain actions. For example, removing an identity would unreserve those funds - **Bonded balance** - the tokens locked for staking purposes. Bonded tokens are not transferrable until they are unbonded after the unbonding period - **Redeemable balance** - the number of tokens that have completed the unbonding period and are ready to be unlocked and transferred again. For example, if an account has a redeemable balance of 0.1000 KSM, those tokens are now available for spending - **Democracy balance** - reflects the number of tokens locked for governance activities, such as voting on referenda. These tokens are locked for the duration of the governance action and are only released after the lock period ends By understanding these balance types and their implications, developers and users can better manage their funds and engage with on-chain activities more effectively. ## Address Formats The SS58 address format is a core component of the Polkadot SDK that enables accounts to be uniquely identified across Polkadot-based networks. This format is a modified version of Bitcoin's Base58Check encoding, specifically designed to accommodate the multi-chain nature of the Polkadot ecosystem. SS58 encoding allows each chain to define its own set of addresses while maintaining compatibility and checksum validation for security. ### Basic Format SS58 addresses consist of three main components: ```text base58encode(concat(<address-type>,
<address>, <checksum>)) ``` - **Address type** - a byte or set of bytes that define the network (or chain) for which the address is intended. This ensures that addresses are unique across different Polkadot SDK-based chains - **Address** - the public key of the account encoded as bytes - **Checksum** - a hash-based checksum which ensures that addresses are valid and unaltered. The checksum is derived from the concatenated address type and address components, ensuring integrity The encoding process transforms the concatenated components into a Base58 string, providing a compact and human-readable format that avoids easily confused characters (e.g., zero '0', capital 'O', lowercase 'l'). This encoding function ([`encode`](https://docs.rs/bs58/latest/bs58/fn.encode.html){target=\_blank}) is implemented exactly as defined in Bitcoin and IPFS specifications, using the same alphabet as both implementations. For more details about the SS58 address format implementation, see the [`Ss58Codec`](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Ss58Codec.html){target=\_blank} trait in the Rust Docs. ### Address Type The address type defines how an address is interpreted and to which network it belongs. Polkadot SDK uses different prefixes to distinguish between various chains and address formats: - **Address types `0-63`** - simple addresses, commonly used for network identifiers - **Address types `64-127`** - full addresses that support a wider range of network identifiers - **Address types `128-255`** - reserved for future address format extensions For example, Polkadot’s main network uses an address type of 0, while Kusama uses 2. This ensures that addresses can be used without confusion between networks. The address type is always encoded as part of the SS58 address, making it easy to quickly identify the network. Refer to the [SS58 registry](https://github.com/paritytech/ss58-registry){target=\_blank} for the canonical listing of all address type identifiers and how they map to Polkadot SDK-based networks. ### Address Length SS58 addresses can have different lengths depending on the specific format. Address lengths range from as short as 3 to 35 bytes, depending on the complexity of the address and network requirements. This flexibility allows SS58 addresses to adapt to different chains while providing a secure encoding mechanism. | Total | Type | Raw account | Checksum | |-------|------|-------------|----------| | 3 | 1 | 1 | 1 | | 4 | 1 | 2 | 1 | | 5 | 1 | 2 | 2 | | 6 | 1 | 4 | 1 | | 7 | 1 | 4 | 2 | | 8 | 1 | 4 | 3 | | 9 | 1 | 4 | 4 | | 10 | 1 | 8 | 1 | | 11 | 1 | 8 | 2 | | 12 | 1 | 8 | 3 | | 13 | 1 | 8 | 4 | | 14 | 1 | 8 | 5 | | 15 | 1 | 8 | 6 | | 16 | 1 | 8 | 7 | | 17 | 1 | 8 | 8 | | 35 | 1 | 32 | 2 | SS58 addresses also support different payload sizes, allowing a flexible range of account identifiers. ### Checksum Types A checksum is applied to validate SS58 addresses. Polkadot SDK uses a Blake2b-512 hash function to calculate the checksum, which is appended to the address before encoding. The checksum length can vary depending on the address format (e.g., 1-byte, 2-byte, or longer), providing varying levels of validation strength. The checksum ensures that an address is not modified or corrupted, adding an extra layer of security for account management. ### Validating Addresses SS58 addresses can be validated using the subkey command-line interface or the Polkadot.js API. These tools help ensure an address is correctly formatted and valid for the intended network.
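To make the address-type prefix concrete, the sketch below re-encodes a single public key under several well-known prefixes using the same `@polkadot/keyring` helpers shown in the validation example later in this section. The starting address is a placeholder to substitute with a real one:

```js
// Re-encode one public key with different SS58 network prefixes
const { decodeAddress, encodeAddress } = require('@polkadot/keyring');

const anyAddress = 'INSERT_SS58_ADDRESS'; // placeholder address to decode
const publicKey = decodeAddress(anyAddress); // raw 32-byte public key

console.log(encodeAddress(publicKey, 0)); // Polkadot (prefix 0, addresses start with 1)
console.log(encodeAddress(publicKey, 2)); // Kusama (prefix 2)
console.log(encodeAddress(publicKey, 42)); // generic Substrate (prefix 42, starts with 5)
```

Because only the network prefix and checksum change, all three strings refer to the same underlying account.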
The following sections will provide an overview of how validation works with these tools. #### Using Subkey [Subkey](https://paritytech.github.io/polkadot-sdk/master/subkey/index.html){target=\_blank} is a CLI tool provided by Polkadot SDK for generating and managing keys. It can inspect and validate SS58 addresses. The `inspect` command gets a public key and an SS58 address from the provided secret URI. The basic syntax for the `subkey inspect` command is: ```bash subkey inspect [flags] [options] uri ``` For the `uri` command-line argument, you can specify the secret seed phrase, a hex-encoded private key, or an SS58 address. If the input is a valid address, the `subkey` program displays the corresponding hex-encoded public key, account identifier, and SS58 addresses. For example, to inspect the public keys derived from a secret seed phrase, you can run a command similar to the following: ```bash subkey inspect "caution juice atom organ advance problem want pledge someone senior holiday very" ``` The command displays output similar to the following:
subkey inspect "caution juice atom organ advance problem want pledge someone senior holiday very" Secret phrase `caution juice atom organ advance problem want pledge someone senior holiday very` is account: Secret seed: 0xc8fa03532fb22ee1f7f6908b9c02b4e72483f0dbd66e4cd456b8f34c6230b849 Public key (hex): 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746 Public key (SS58): 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR Account ID: 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746 SS58 Address: 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR
The `subkey` program assumes an address is based on a public/private key pair. If you inspect an address, the command returns the 32-byte account identifier. However, not all addresses in Polkadot SDK-based networks are based on keys. Depending on the command-line options you specify and the input you provided, the command output might also display the network for which the address has been encoded. For example: ```bash subkey inspect "12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU" ``` The command displays output similar to the following:
subkey inspect "12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU" Public Key URI `12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU` is account: Network ID/Version: polkadot Public key (hex): 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a Account ID: 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a Public key (SS58): 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU SS58 Address: 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU
#### Using Polkadot.js API To verify an address in JavaScript or TypeScript projects, you can use the functions built into the [Polkadot.js API](https://polkadot.js.org/docs/){target=\_blank}. For example: ```js // Import Polkadot.js API dependencies const { decodeAddress, encodeAddress } = require('@polkadot/keyring'); const { hexToU8a, isHex } = require('@polkadot/util'); // Specify an address to test. const address = 'INSERT_ADDRESS_TO_TEST'; // Check address const isValidSubstrateAddress = () => { try { encodeAddress(isHex(address) ? hexToU8a(address) : decodeAddress(address)); return true; } catch (error) { return false; } }; // Query result const isValid = isValidSubstrateAddress(); console.log(isValid); ``` If the function returns `true`, the specified address is a valid address. #### Other SS58 Implementations Support for encoding and decoding Polkadot SDK SS58 addresses has been implemented in several other languages and libraries. - **Crystal** - [`wyhaines/base58.cr`](https://github.com/wyhaines/base58.cr){target=\_blank} - **Go** - [`itering/subscan-plugin`](https://github.com/itering/subscan-plugin){target=\_blank} - **Python** - [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\_blank} - **TypeScript** - [`subsquid/squid-sdk`](https://github.com/subsquid/squid-sdk){target=\_blank} --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/blocks/ --- BEGIN CONTENT --- --- title: Blocks description: Understand how blocks are produced, validated, and imported in Polkadot SDK-based blockchains, covering initialization, finalization, and authoring processes. categories: Basics, Polkadot Protocol --- # Blocks ## Introduction In the Polkadot SDK, blocks are fundamental to the functioning of the blockchain, serving as containers for [transactions](/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/){target=\_blank} and changes to the chain's state. Blocks consist of headers and an array of transactions, ensuring the integrity and validity of operations on the network. This guide explores the essential components of a block, the process of block production, and how blocks are validated and imported across the network. By understanding these concepts, developers can better grasp how blockchains maintain security, consistency, and performance within the Polkadot ecosystem. ## What is a Block? In the Polkadot SDK, a block is a fundamental unit that encapsulates both the header and an array of transactions. The block header includes critical metadata to ensure the integrity and sequence of the blockchain. Here's a breakdown of its components: - **Block height** - indicates the number of blocks created in the chain so far - **Parent hash** - the hash of the previous block, providing a link to maintain the blockchain's immutability - **Transaction root** - cryptographic digest summarizing all transactions in the block - **State root** - a cryptographic digest representing the post-execution state - **Digest** - additional information that can be attached to a block, such as consensus-related messages Each transaction is part of a series that is executed according to the runtime's rules. The transaction root is a cryptographic digest of this series, which prevents alterations and enables succinct verification by light clients. 
This verification process allows light clients to confirm whether a transaction exists in a block using only the block header, without downloading the entire block. ## Block Production When an authoring node is authorized to create a new block, it selects transactions from the transaction queue based on priority. This step, known as block production, relies heavily on the executive module to manage the initialization and finalization of blocks. The process is summarized as follows: ### Initialize Block The block initialization process begins with a series of function calls that prepare the block for transaction execution: 1. **Call `on_initialize`** - the executive module calls the [`on_initialize`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_initialize){target=\_blank} hook from the system pallet and other runtime pallets to prepare for the block's transactions 2. **Coordinate runtime calls** - coordinates function calls in the order defined by the transaction queue 3. **Verify information** - once [`on_initialize`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_initialize){target=\_blank} functions are executed, the executive module checks the parent hash in the block header and the trie root to verify information is consistent ### Finalize Block Once transactions are processed, the block must be finalized before being broadcast to the network. The finalization steps are as follows: 1. **Call `on_finalize`** - the executive module calls the [`on_finalize`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_finalize){target=\_blank} hooks in each pallet to ensure any remaining state updates or checks are completed before the block is sealed and published 2. **Verify information** - the block's digest and storage root in the header are checked against the initialized block to ensure consistency 3. **Call `on_idle`** - the [`on_idle`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_idle){target=\_blank} hook is triggered to process any remaining tasks using the leftover weight from the block ## Block Authoring and Import Once the block is finalized, it is gossiped to other nodes in the network. Nodes follow this procedure: 1. **Receive transactions** - the authoring node collects transactions from the network 2. **Validate** - transactions are checked for validity 3. **Queue** - valid transactions are placed in the transaction pool for execution 4. **Execute** - state changes are made as the transactions are executed 5. **Publish** - the finalized block is broadcast to the network ### Block Import Queue After a block is published, other nodes on the network can import it into their chain state. The block import queue is part of the outer node in every Polkadot SDK-based node and ensures incoming blocks are valid before adding them to the node's state. In most cases, you don't need to know details about how transactions are gossiped or how other nodes on the network import blocks.
The following traits are relevant, however, if you plan to write any custom consensus logic or want a deeper dive into the block import queue: - [**`ImportQueue`**](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/import_queue/trait.ImportQueue.html){target=\_blank} - the trait that defines the block import queue - [**`Link`**](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/import_queue/trait.Link.html){target=\_blank} - the trait that defines the link between the block import queue and the network - [**`BasicQueue`**](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/import_queue/struct.BasicQueue.html){target=\_blank} - a basic implementation of the block import queue - [**`Verifier`**](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/import_queue/trait.Verifier.html){target=\_blank} - the trait that defines the block verifier - [**`BlockImport`**](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/block_import/trait.BlockImport.html){target=\_blank} - the trait that defines the block import process These traits govern how blocks are validated and imported across the network, ensuring consistency and security. ## Additional Resources To learn more about the block structure in the Polkadot SDK runtime, see the [`Block` reference](https://paritytech.github.io/polkadot-sdk/master/sp_runtime/traits/trait.Block.html){target=\_blank} entry in the Rust Docs. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/fees/ --- BEGIN CONTENT --- --- title: Transactions Weights and Fees description: Overview of transaction weights and fees in Polkadot SDK chains, detailing how fees are calculated using a defined formula and runtime specifics. categories: Basics, Polkadot Protocol --- # Transactions Weights and Fees ## Introduction When transactions are executed, or data is stored on-chain, the activity changes the chain's state and consumes blockchain resources. Because the resources available to a blockchain are limited, managing how operations on-chain consume them is important. In addition to being limited in practical terms, such as storage capacity, blockchain resources represent a potential attack vector for malicious users. For example, a malicious user might attempt to overload the network with messages to stop the network from producing new blocks. To protect blockchain resources from being drained or overloaded, you need to manage how they are made available and how they are consumed. The resources to be aware of include: - Memory usage - Storage input and output - Computation - Transaction and block size - State database size The Polkadot SDK provides block authors with several ways to manage access to resources and to prevent individual components of the chain from consuming too much of any single resource. Two of the most important mechanisms available to block authors are weights and transaction fees. [Weights](/polkadot-protocol/glossary/#weight){target=\_blank} manage the time it takes to validate a block and characterize the time it takes to execute the calls in the block's body. By controlling the execution time a block can consume, weights set limits on storage input, output, and computation. Some of the weight allowed for a block is consumed as part of the block's initialization and finalization. The weight might also be used to execute mandatory inherent extrinsic calls.
To help ensure blocks don’t consume too much execution time and prevent malicious users from overloading the system with unnecessary calls, weights are combined withΒ transaction fees. [Transaction fees](/polkadot-protocol/basics/blocks-transactions-fees/transactions/#transaction-fees){target=\_blank} provide an economic incentive to limit execution time, computation, and the number of calls required to perform operations. Transaction fees are also used to make the blockchain economically sustainable because they are typically applied to transactions initiated by users and deducted before a transaction request is executed. ## How Fees are Calculated The final fee for a transaction is calculated using the following parameters: - **`base fee`** - this is the minimum amount a user pays for a transaction. It is declared aΒ base weightΒ in the runtime and converted to a fee using theΒ [`WeightToFee`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.WeightToFee){target=\_blank}Β conversion - **`weight fee`** - a fee proportional to the execution time (input and output and computation) that a transaction consumes - **`length fee`** - a fee proportional to the encoded length of the transaction - **`tip`** - an optional tip to increase the transaction’s priority, giving it a higher chance to be included in the transaction queue The base fee and proportional weight and length fees constitute theΒ inclusion fee. The inclusion fee is the minimum fee that must be available for a transaction to be included in a block. ```text inclusion fee = base fee + weight fee + length fee ``` Transaction fees are withdrawn before the transaction is executed. After the transaction is executed, the weight can be adjusted to reflect the resources used. If a transaction uses fewer resources than expected, the transaction fee is corrected, and the adjusted transaction fee is deposited. ## Using the Transaction Payment Pallet TheΒ [Transaction Payment pallet](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/substrate/frame/transaction-payment){target=\_blank}Β provides the basic logic for calculating the inclusion fee. You can also use the Transaction Payment pallet to: - Convert a weight value into a deductible fee based on a currency type usingΒ [`Config::WeightToFee`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.WeightToFee){target=\_blank} - Update the fee for the next block by defining a multiplier based on the chain’s final state at the end of the previous block usingΒ [`Config::FeeMultiplierUpdate`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.FeeMultiplierUpdate){target=\_blank} - Manage the withdrawal, refund, and deposit of transaction fees usingΒ [`Config::OnChargeTransaction`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.OnChargeTransaction){target=\_blank} You can learn more about these configuration traits in theΒ [Transaction PaymentΒ documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_payment/index.html){target=\_blank}. 
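As a purely illustrative numeric sketch of how the base, weight, length, and tip components described above combine (the amounts are invented and do not correspond to any runtime's real constants):

```js
// Hypothetical amounts in plancks, chosen only to show the arithmetic
const baseFee = 1_000_000_000n; // flat charge derived from the base weight
const weightFee = 650_000_000n; // proportional to the call's execution weight
const lengthFee = 120_000_000n; // per-byte fee times the encoded transaction length
const tip = 50_000_000n; // optional priority tip

const inclusionFee = baseFee + weightFee + lengthFee; // minimum required for inclusion
const finalFee = inclusionFee + tip;
console.log(inclusionFee, finalFee); // 1770000000n 1820000000n
```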
### Understanding the Inclusion Fee The formula for calculating the inclusion fee is as follows: ```text inclusion_fee = base_fee + length_fee + [targeted_fee_adjustment * weight_fee] ``` And then, for calculating the final fee: ```text final_fee = inclusion_fee + tip ``` In the first formula, theΒ `targeted_fee_adjustment`Β is a multiplier that can tune the final fee based on the network’s congestion. - TheΒ `base_fee`Β derived from the base weight covers inclusion overhead like signature verification - TheΒ `length_fee`Β is a per-byte fee that is multiplied by the length of the encoded extrinsic - TheΒ `weight_fee`Β fee is calculated using two parameters: - TheΒ `ExtrinsicBaseWeight`Β that is declared in the runtime and applies to all extrinsics - TheΒ `#[pallet::weight]`Β annotation that accounts for an extrinsic's complexity To convert the weight to `Currency`, the runtime must define a `WeightToFee` struct that implements a conversion function, [`Convert`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/struct.Pallet.html#method.weight_to_fee){target=\_blank}. Note that the extrinsic sender is charged the inclusion fee before the extrinsic is invoked. The fee is deducted from the sender's balance even if the transaction fails upon execution. ### Accounts with an Insufficient Balance If an account does not have a sufficient balance to pay the inclusion fee and remain aliveβ€”that is, enough to pay the inclusion fee and maintain the minimumΒ existential depositβ€”then you should ensure the transaction is canceled so that no fee is deducted and the transaction does not begin execution. The Polkadot SDK doesn't enforce this rollback behavior. However, this scenario would be rare because the transaction queue and block-making logic perform checks to prevent it before adding an extrinsic to a block. ### Fee Multipliers The inclusion fee formula always results in the same fee for the same input. However, weight can be dynamic andβ€”based on howΒ [`WeightToFee`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.WeightToFee){target=\_blank}Β is definedβ€”the final fee can include some degree of variability. The Transaction Payment pallet provides theΒ [`FeeMultiplierUpdate`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/pallet/trait.Config.html#associatedtype.FeeMultiplierUpdate){target=\_blank}Β configurable parameter to account for this variability. The Polkadot network inspires the default update function and implements a targeted adjustment in which a target saturation level of block weight is defined. If the previous block is more saturated, the fees increase slightly. Similarly, if the last block has fewer transactions than the target, fees are decreased by a small amount. For more information about fee multiplier adjustments, see theΒ [Web3 Research Page](https://research.web3.foundation/Polkadot/overview/token-economics#relay-chain-transaction-fees-and-per-block-transaction-limits){target=\_blank}. ## Transactions with Special Requirements Inclusion fees must be computable before execution and can only represent fixed logic. Some transactions warrant limiting resources with other strategies. For example: - Bonds are a type of fee that might be returned or slashed after some on-chain event. For example, you might want to require users to place a bond to participate in a vote. 
The bond might then be returned at the end of the referendum or slashed if the voter attempted malicious behavior - Deposits are fees that might be returned later. For example, you might require users to pay a deposit to execute an operation that uses storage. The user’s deposit could be returned if a subsequent operation frees up storage - Burn operations are used to pay for a transaction based on its internal logic. For example, a transaction might burn funds from the sender if the transaction creates new storage items to pay for the increased state size - Limits enable you to enforce constant or configurable limits on specific operations. For example, the default [Staking pallet](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/substrate/frame/staking){target=\_blank} only allows nominators to nominate 16 validators to limit the complexity of the validator election process It is important to note that if you query the chain for a transaction fee, it only returns the inclusion fee. ## Default Weight Annotations All dispatchable functions in the Polkadot SDK must specify a weight. The way of doing that is using the annotation-based system that lets you combine fixed values for database read/write weight and/or fixed values based on benchmarks. The most basic example would look like this: ```rust #[pallet::weight(100_000)] fn my_dispatchable() { // ... } ``` Note that theΒ [`ExtrinsicBaseWeight`](https://crates.parity.io/frame_support/weights/constants/struct.ExtrinsicBaseWeight.html){target=\_blank}Β is automatically added to the declared weight to account for the costs of simply including an empty extrinsic into a block. ### Weights and Database Read/Write Operations To make weight annotations independent of the deployed database backend, they are defined as a constant and then used in the annotations when expressing database accesses performed by the dispatchable: ```rust #[pallet::weight(T::DbWeight::get().reads_writes(1, 2) + 20_000)] fn my_dispatchable() { // ... } ``` This dispatchable allows one database to read and two to write, in addition to other things that add the additional 20,000. Database access is generally every time a value declared inside theΒ [`#[pallet::storage]`](https://paritytech.github.io/polkadot-sdk/master/frame_support/pallet_macros/attr.storage.html){target=\_blank}Β block is accessed. However, unique accesses are counted because after a value is accessed, it is cached, and reaccessing it does not result in a database operation. That is: - Multiple reads of the exact value count as one read - Multiple writes of the exact value count as one write - Multiple reads of the same value, followed by a write to that value, count as one read and one write - A write followed by a read-only counts as one write ### Dispatch Classes Dispatches are broken into three classes: - Normal - Operational - Mandatory If a dispatch is not defined asΒ `Operational`Β orΒ `Mandatory`Β in the weight annotation, the dispatch is identified asΒ `Normal`Β by default. You can specify that the dispatchable uses another class like this: ```rust #[pallet::dispatch((DispatchClass::Operational))] fn my_dispatchable() { // ... } ``` This tuple notation also allows you to specify a final argument determining whether the user is charged based on the annotated weight. If you don't specify otherwise,Β `Pays::Yes`Β is assumed: ```rust #[pallet::dispatch(DispatchClass::Normal, Pays::No)] fn my_dispatchable() { // ... 
### Dispatch Classes

Dispatches are broken into three classes:

- Normal
- Operational
- Mandatory

If a dispatch is not defined as `Operational` or `Mandatory` in the weight annotation, the dispatch is identified as `Normal` by default. You can specify that the dispatchable uses another class like this:

```rust
#[pallet::weight((100_000, DispatchClass::Operational))]
fn my_dispatchable() {
    // ...
}
```

This tuple notation also allows you to specify a final argument determining whether the user is charged based on the annotated weight. If you don't specify otherwise, `Pays::Yes` is assumed:

```rust
#[pallet::weight((100_000, DispatchClass::Normal, Pays::No))]
fn my_dispatchable() {
    // ...
}
```

#### Normal Dispatches

Dispatches in this class represent normal user-triggered transactions. These types of dispatches only consume a portion of a block's total weight limit. For information about the maximum portion of a block that can be consumed for normal dispatches, see [`AvailableBlockRatio`](https://paritytech.github.io/polkadot-sdk/master/frame_system/limits/struct.BlockLength.html){target=\_blank}. Normal dispatches are sent to the transaction pool.

#### Operational Dispatches

Unlike normal dispatches, which represent the usage of network capabilities, operational dispatches are those that provide network capabilities. Operational dispatches can consume the entire weight limit of a block. They are not bound by the [`AvailableBlockRatio`](https://paritytech.github.io/polkadot-sdk/master/frame_system/limits/struct.BlockLength.html){target=\_blank}. Dispatches in this class are given maximum priority and are exempt from paying the [`length_fee`](https://docs.rs/pallet-transaction-payment/latest/pallet_transaction_payment/){target=\_blank}.

#### Mandatory Dispatches

Mandatory dispatches are included in a block even if they cause the block to surpass its weight limit. You can only use the mandatory dispatch class for inherent transactions that the block author submits. This dispatch class is intended to represent functions in the block validation process. Because these dispatches are always included in a block regardless of the function weight, the validation process must prevent malicious nodes from abusing the function to craft valid but impossibly heavy blocks. You can typically accomplish this by ensuring that:

- The operation performed is always light
- The operation can only be included in a block once

To make it more difficult for malicious nodes to abuse mandatory dispatches, they cannot be included in blocks that return errors. This dispatch class serves the assumption that it is better to allow an overweight block to be created than not to allow any block to be created at all.

### Dynamic Weights

In addition to purely fixed weights and constants, the weight calculation can consider the input arguments of a dispatchable. The weight should be trivially computable from the input arguments with some basic arithmetic:

```rust
use frame_support::{
    dispatch::{DispatchClass, Pays},
    weights::Weight,
};

// `User` is a placeholder element type.
#[pallet::weight(FunctionOf(
    |args: (&Vec<User>,)| args.0.len().saturating_mul(10_000),
    DispatchClass::Normal,
    Pays::Yes,
))]
fn handle_users(origin, calls: Vec<User>) {
    // Do something per user
}
```

## Post Dispatch Weight Correction

Depending on the execution logic, a dispatchable function might consume less weight than was prescribed pre-dispatch. To correct weight, the function declares a different return type and returns its actual weight:

```rust
#[pallet::weight(10_000 + 500_000_000)]
fn expensive_or_cheap(input: u64) -> DispatchResultWithPostInfo {
    let was_heavy = do_calculation(input);

    if was_heavy {
        // None means "no correction" from the weight annotation.
        Ok(None.into())
    } else {
        // Return the actual weight consumed.
        Ok(Some(10_000).into())
    }
}
```
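The corrected weight and the fee flag can also be reported together. The following is a minimal sketch, assuming the usual `frame_support::dispatch` types (`PostDispatchInfo`, `Pays`) are in scope and that a hypothetical `do_calculation` helper returns the weight actually consumed; whether waiving the fee with `Pays::No` is appropriate depends on the pallet's design.

```rust
#[pallet::weight(500_000_000)]
fn maybe_free(input: u64) -> DispatchResultWithPostInfo {
    // Hypothetical helper that returns the weight actually consumed.
    let actual = do_calculation(input);

    // PostDispatchInfo lets the call report both the weight it really used
    // and whether the sender should be charged for it.
    Ok(PostDispatchInfo {
        actual_weight: Some(actual),
        pays_fee: Pays::No,
    })
}
```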
## Custom Fees

You can also define custom fee systems through custom weight functions or inclusion fee functions.

### Custom Weights

Instead of using the default weight annotations, you can create a custom weight calculation type using the weights module. The custom weight calculation type must implement the following traits:

- [`WeighData`](https://crates.parity.io/frame_support/weights/trait.WeighData.html){target=\_blank} to determine the weight of the dispatch
- [`ClassifyDispatch`](https://crates.parity.io/frame_support/weights/trait.ClassifyDispatch.html){target=\_blank} to determine the class of the dispatch
- [`PaysFee`](https://crates.parity.io/frame_support/weights/trait.PaysFee.html){target=\_blank} to determine whether the sender of the dispatch pays fees

The Polkadot SDK then bundles the output information of the three traits into the [`DispatchInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_support/dispatch/struct.DispatchInfo.html){target=\_blank} struct and provides it by implementing the [`GetDispatchInfo`](https://docs.rs/frame-support/latest/frame_support/dispatch/trait.GetDispatchInfo.html){target=\_blank} trait for all `Call` variants and opaque extrinsic types. This is used internally by the System and Executive modules.

`ClassifyDispatch`, `WeighData`, and `PaysFee` are generic over `T`, which gets resolved into the tuple of all dispatch arguments except for the origin. The following example illustrates a struct that calculates the weight as `m * len(args)`, where `m` is a given multiplier and `args` is the concatenated tuple of all dispatch arguments. In this example, the dispatch class is `Operational` if the transaction has more than 100 bytes of length in arguments and will pay fees if the encoded length exceeds 10 bytes.

```rust
struct LenWeight(u32);

impl<T: Encode> WeighData<T> for LenWeight {
    fn weigh_data(&self, target: T) -> Weight {
        let multiplier = self.0;
        let encoded_len = target.encode().len() as u32;
        multiplier * encoded_len
    }
}

impl<T: Encode> ClassifyDispatch<T> for LenWeight {
    fn classify_dispatch(&self, target: T) -> DispatchClass {
        let encoded_len = target.encode().len() as u32;
        if encoded_len > 100 {
            DispatchClass::Operational
        } else {
            DispatchClass::Normal
        }
    }
}

impl<T: Encode> PaysFee<T> for LenWeight {
    fn pays_fee(&self, target: T) -> Pays {
        let encoded_len = target.encode().len() as u32;
        if encoded_len > 10 {
            Pays::Yes
        } else {
            Pays::No
        }
    }
}
```

A weight calculator function can also be coerced to the final type of the argument instead of defining it as a vague type that can be encoded. The code would roughly look like this:

```rust
struct CustomWeight;

impl WeighData<(&u32, &u64)> for CustomWeight {
    fn weigh_data(&self, target: (&u32, &u64)) -> Weight {
        ...
    }
}

// Given a dispatch:
#[pallet::call]
impl<T: Config<I>, I: 'static> Pallet<T, I> {
    #[pallet::weight(CustomWeight)]
    fn foo(a: u32, b: u64) {
        ...
    }
}
```

In this example, the `CustomWeight` can only be used in conjunction with a dispatch with a particular signature `(u32, u64)`, as opposed to `LenWeight`, which can be used with anything because there aren't any assumptions about `T`.

#### Custom Inclusion Fee

The following example illustrates how to customize your inclusion fee. You must configure the appropriate associated types in the respective module.

```rust
// Assume this is the balance type
type Balance = u64;

// Assume we want all the weights to have a `100 + 2 * w` conversion to fees
struct CustomWeightToFee;
impl Convert<Weight, Balance> for CustomWeightToFee {
    fn convert(w: Weight) -> Balance {
        let a = Balance::from(100);
        let b = Balance::from(2);
        let w = Balance::from(w);
        a + b * w
    }
}

parameter_types! {
    pub const ExtrinsicBaseWeight: Weight = 10_000_000;
}

impl frame_system::Config for Runtime {
    type ExtrinsicBaseWeight = ExtrinsicBaseWeight;
}
parameter_types! {
    pub const TransactionByteFee: Balance = 10;
}

impl transaction_payment::Config for Runtime {
    type TransactionByteFee = TransactionByteFee;
    type WeightToFee = CustomWeightToFee;
    // Assumes a `TargetBlockFullness` parameter (a `Get<Fixed128>` impl) is
    // defined elsewhere in the runtime.
    type FeeMultiplierUpdate = TargetedFeeAdjustment<TargetBlockFullness>;
}

struct TargetedFeeAdjustment<T>(sp_std::marker::PhantomData<T>);
impl<T: Get<Fixed128>> Convert<Fixed128, Fixed128> for TargetedFeeAdjustment<T> {
    fn convert(multiplier: Fixed128) -> Fixed128 {
        // Don't change anything. Put any fee update info here.
        multiplier
    }
}
```

## Additional Resources

You now know the weight system, how it affects transaction fee computation, and how to specify weights for your dispatchable calls. The next step is determining the correct weight for your dispatchable operations. You can use Substrate benchmarking functions and `frame-benchmarking` calls to test your functions with different parameters and empirically determine the proper weight in their worst-case scenarios.

- [Benchmark](/develop/parachains/testing/benchmarking/)
- [`SignedExtension`](https://paritytech.github.io/polkadot-sdk/master/sp_runtime/traits/trait.SignedExtension.html){target=\_blank}
- [Custom weights for the Example pallet](https://github.com/paritytech/polkadot-sdk/blob/{{dependencies.repositories.polkadot_sdk.version}}/substrate/frame/examples/basic/src/weights.rs){target=\_blank}
- [Web3 Foundation Research](https://research.web3.foundation/Polkadot/overview/token-economics#relay-chain-transaction-fees-and-per-block-transaction-limits){target=\_blank}

--- END CONTENT ---

Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/ --- BEGIN CONTENT ---

--- title: Transactions description: Learn how to construct, submit, and validate transactions in the Polkadot SDK, covering signed, unsigned, and inherent types of transactions. categories: Basics, Polkadot Protocol ---

# Transactions

## Introduction

Transactions are essential components of blockchain networks, enabling state changes and the execution of key operations. In the Polkadot SDK, transactions, often called extrinsics, come in multiple forms, including signed, unsigned, and inherent transactions. This guide walks you through the different transaction types and how they're formatted, validated, and processed within the Polkadot ecosystem. You'll also learn how to customize transaction formats and construct transactions for FRAME-based runtimes, ensuring a complete understanding of how transactions are built and executed in Polkadot SDK-based chains.

## What Is a Transaction?

In the Polkadot SDK, transactions represent operations that modify the chain's state, bundled into blocks for execution. The term extrinsic is often used to refer to any data that originates outside the runtime and is included in the chain. While other blockchain systems typically refer to these operations as "transactions," the Polkadot SDK adopts the broader term "extrinsic" to capture the wide variety of data types that can be added to a block. There are three primary types of transactions (extrinsics) in the Polkadot SDK:

- **Signed transactions** - signed by the submitting account, often carrying transaction fees
- **Unsigned transactions** - submitted without a signature, often requiring custom validation logic
- **Inherent transactions** - typically inserted directly into blocks by block authoring nodes, without gossiping between peers

Each type serves a distinct purpose, and understanding when and how to use each is key to efficiently working with the Polkadot SDK.
### Signed Transactions Signed transactions require an account's signature and typically involve submitting a request to execute a runtime call. The signature serves as a form of cryptographic proof that the sender has authorized the action, using their private key. These transactions often involve a transaction fee to cover the cost of execution and incentivize block producers. Signed transactions are the most common type of transaction and are integral to user-driven actions, such as token transfers. For instance, when you transfer tokens from one account to another, the sending account must sign the transaction to authorize the operation. For example, the [`pallet_balances::Call::transfer_allow_death`](https://paritytech.github.io/polkadot-sdk/master/pallet_balances/pallet/struct.Pallet.html#method.transfer_allow_death){target=\_blank} extrinsic in the Balances pallet allows you to transfer tokens. Since your account initiates this transaction, your account key is used to sign it. You'll also be responsible for paying the associated transaction fee, with the option to include an additional tip to incentivize faster inclusion in the block. ### Unsigned Transactions Unsigned transactions do not require a signature or account-specific data from the sender. Unlike signed transactions, they do not come with any form of economic deterrent, such as fees, which makes them susceptible to spam or replay attacks. Custom validation logic must be implemented to mitigate these risks and ensure these transactions are secure. Unsigned transactions typically involve scenarios where including a fee or signature is unnecessary or counterproductive. However, due to the absence of fees, they require careful validation to protect the network. For example, [`pallet_im_online::Call::heartbeat`](https://paritytech.github.io/polkadot-sdk/master/pallet_im_online/pallet/struct.Pallet.html#method.heartbeat){target=\_blank} extrinsic allows validators to send a heartbeat signal, indicating they are active. Since only validators can make this call, the logic embedded in the transaction ensures that the sender is a validator, making the need for a signature or fee redundant. Unsigned transactions are more resource-intensive than signed ones because custom validation is required, but they play a crucial role in certain operational scenarios, especially when regular user accounts aren't involved. ### Inherent Transactions Inherent transactions are a specialized type of unsigned transaction that is used primarily for block authoring. Unlike signed or other unsigned transactions, inherent transactions are added directly by block producers and are not broadcasted to the network or stored in the transaction queue. They don't require signatures or the usual validation steps and are generally used to insert system-critical data directly into blocks. A key example of an inherent transaction is inserting a timestamp into each block. The [`pallet_timestamp::Call::now`](https://paritytech.github.io/polkadot-sdk/master/pallet_timestamp/pallet/struct.Pallet.html#method.now-1){target=\_blank} extrinsic allows block authors to include the current time in the block they are producing. Since the block producer adds this information, there is no need for transaction validation, like signature verification. The validation in this case is done indirectly by the validators, who check whether the timestamp is within an acceptable range before finalizing the block. 
Another example is the [`paras_inherent::Call::enter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_parachains/paras_inherent/pallet/struct.Pallet.html#method.enter){target=\_blank} extrinsic, which enables parachain collator nodes to send validation data to the relay chain. This inherent transaction ensures that the necessary parachain data is included in each block without the overhead of gossiped transactions. Inherent transactions serve a critical role in block authoring by allowing important operational data to be added directly to the chain without needing the validation processes required for standard transactions. ## Transaction Formats Understanding the structure of signed and unsigned transactions is crucial for developers building on Polkadot SDK-based chains. Whether you're optimizing transaction processing, customizing formats, or interacting with the transaction pool, knowing the format of extrinsics, Polkadot's term for transactions, is essential. ### Types of Transaction Formats In Polkadot SDK-based chains, extrinsics can fall into three main categories: - **Unchecked extrinsics** - typically used for signed transactions that require validation. They contain a signature and additional data, such as a nonce and information for fee calculation. Unchecked extrinsics are named as such because they require validation checks before being accepted into the transaction pool - **Checked extrinsics** - typically used for inherent extrinsics (unsigned transactions); these don't require signature verification. Instead, they carry information such as where the extrinsic originates and any additional data required for the block authoring process - **Opaque extrinsics** - used when the format of an extrinsic is not yet fully committed or finalized. They are still decodable, but their structure can be flexible depending on the context ### Signed Transaction Data Structure A signed transaction typically includes the following components: - **Signature** - verifies the authenticity of the transaction sender - **Call** - the actual function or method call the transaction is requesting (for example, transferring funds) - **Nonce** - tracks the number of prior transactions sent from the account, helping to prevent replay attacks - **Tip** - an optional incentive to prioritize the transaction in block inclusion - **Additional data** - includes details such as spec version, block hash, and genesis hash to ensure the transaction is valid within the correct runtime and chain context Here's a simplified breakdown of how signed transactions are typically constructed in a Polkadot SDK runtime: ``` code + + ``` Each part of the signed transaction has a purpose, ensuring the transaction's authenticity and context within the blockchain. ### Signed Extensions Polkadot SDK also provides the concept of [signed extensions](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/signed_extensions/index.html){target=\_blank}, which allow developers to extend extrinsics with additional data or validation logic before they are included in a block. The [`SignedExtension`](https://paritytech.github.io/try-runtime-cli/sp_runtime/traits/trait.SignedExtension.html){target=\_blank} set helps enforce custom rules or protections, such as ensuring the transaction's validity or calculating priority. The transaction queue regularly calls signed extensions to verify a transaction's validity before placing it in the ready queue. 
This safeguard ensures transactions won't fail in a block. Signed extensions are commonly used to enforce validation logic and protect the transaction pool from spam and replay attacks. In FRAME, a signed extension can hold any of the following types by default: - [**`AccountId`**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_frame/runtime/types_common/type.AccountId.html){target=\_blank} - to encode the sender's identity - [**`Call`**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_frame/traits/trait.SignedExtension.html#associatedtype.Call){target=\_blank} - to encode the pallet call to be dispatched. This data is used to calculate transaction fees - [**`AdditionalSigned`**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_frame/traits/trait.SignedExtension.html#associatedtype.AdditionalSigned){target=\_blank} - to handle any additional data to go into the signed payload allowing you to attach any custom logic prior to dispatching a transaction - [**`Pre`**](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_frame/traits/trait.SignedExtension.html#associatedtype.Pre){target=\_blank} - to encode the information that can be passed from before a call is dispatched to after it gets dispatched Signed extensions can enforce checks like: - [**`CheckSpecVersion`**](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/extensions/check_spec_version.rs.html){target=\_blank} - ensures the transaction is compatible with the runtime's current version - [**`CheckWeight`**](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.CheckWeight.html){target=\_blank} - calculates the weight (or computational cost) of the transaction, ensuring the block doesn't exceed the maximum allowed weight These extensions are critical in the transaction lifecycle, ensuring that only valid and prioritized transactions are processed. ## Transaction Construction Building transactions in the Polkadot SDK involves constructing a payload that can be verified, signed, and submitted for inclusion in a block. Each runtime in the Polkadot SDK has its own rules for validating and executing transactions, but there are common patterns for constructing a signed transaction. ### Construct a Signed Transaction A signed transaction in the Polkadot SDK includes various pieces of data to ensure security, prevent replay attacks, and prioritize processing. Here's an overview of how to construct one: 1. **Construct the unsigned payload** - gather the necessary information for the call, including: - **Pallet index** - identifies the pallet where the runtime function resides - **Function index** - specifies the particular function to call in the pallet - **Parameters** - any additional arguments required by the function call 2. **Create a signing payload** - once the unsigned payload is ready, additional data must be included: - **Transaction nonce** - unique identifier to prevent replay attacks - **Era information** - defines how long the transaction is valid before it's dropped from the pool - **Block hash** - ensures the transaction doesn't execute on the wrong chain or fork 3. **Sign the payload** - using the sender's private key, sign the payload to ensure that the transaction can only be executed by the account holder 4. **Serialize the signed payload** - once signed, the transaction must be serialized into a binary format, ensuring the data is compact and easy to transmit over the network 5. 
**Submit the serialized transaction** - finally, submit the serialized transaction to the network, where it will enter the transaction pool and wait for processing by an authoring node The following is an example of how a signed transaction might look: ``` rust node_runtime::UncheckedExtrinsic::new_signed( function.clone(), // some call sp_runtime::AccountId32::from(sender.public()).into(), // some sending account node_runtime::Signature::Sr25519(signature.clone()), // the account's signature extra.clone(), // the signed extensions ) ``` ### Transaction Encoding Before a transaction is sent to the network, it is serialized and encoded using a structured encoding process that ensures consistency and prevents tampering: - `[1]` - compact encoded length in bytes of the entire transaction - `[2]` - aΒ u8Β containing 1 byte to indicate whether the transaction is signed or unsigned (1 bit) and the encoded transaction version ID (7 bits) - `[3]` - if signed, this field contains an account ID, an SR25519 signature, and some extra data - `[4]` - encoded call data, including pallet and function indices and any required arguments This encoded format ensures consistency and efficiency in processing transactions across the network. By adhering to this format, applications can construct valid transactions and pass them to the network for execution. To learn more about how compact encoding works usingΒ SCALE, see the [SCALE Codec](https://github.com/paritytech/parity-scale-codec){target=\_blank} README on GitHub. ### Customize Transaction Construction Although the basic steps for constructing transactions are consistent across Polkadot SDK-based chains, developers can customize transaction formats and validation rules. For example: - **Custom pallets** - you can define new pallets with custom function calls, each with its own parameters and validation logic - **Signed extensions** - developers can implement custom extensions that modify how transactions are prioritized, validated, or included in blocks By leveraging Polkadot SDK's modular design, developers can create highly specialized transaction logic tailored to their chain's needs. ## Lifecycle of a Transaction In the Polkadot SDK, transactions are often referred to as extrinsics because the data in transactions originates outside of the runtime. These transactions contain data that initiates changes to the chain state. The most common type of extrinsic is a signed transaction, which is cryptographically verified and typically incurs a fee. This section focuses on how signed transactions are processed, validated, and ultimately included in a block. ### Define Transaction Properties The Polkadot SDK runtime defines key transaction properties, such as: - **Transaction validity** - ensures the transaction meets all runtime requirements - **Signed or unsigned** - identifies whether a transaction needs to be signed by an account - **State changes** - determines how the transaction modifies the state of the chain Pallets, which compose the runtime's logic, define the specific transactions that your chain supports. When a user submits a transaction, such as a token transfer, it becomes a signed transaction, verified by the user's account signature. If the account has enough funds to cover fees, the transaction is executed, and the chain's state is updated accordingly. ### Process on a Block Authoring Node In Polkadot SDK-based networks, some nodes are authorized to author blocks. These nodes validate and process transactions. 
When a transaction is sent to a node that can produce blocks, it undergoes a lifecycle that involves several stages, including validation and execution. Non-authoring nodes gossip the transaction across the network until an authoring node receives it. The following diagram illustrates the lifecycle of a transaction that's submitted to a network and processed by an authoring node. ![Transaction lifecycle diagram](/images/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/transaction-lifecycle-1.webp) ### Validate and Queue Once a transaction reaches an authoring node, it undergoes an initial validation process to ensure it meets specific conditions defined in the runtime. This validation includes checks for: - **Correct nonce** - ensures the transaction is sequentially valid for the account - **Sufficient funds** - confirms the account can cover any associated transaction fees - **Signature validity** - verifies that the sender's signature matches the transaction data After these checks, valid transactions are placed in the transaction pool, where they are queued for inclusion in a block. The transaction pool regularly re-validates queued transactions to ensure they remain valid before being processed. To reach consensus, two-thirds of the nodes must agree on the order of the transactions executed and the resulting state change. Transactions are validated and queued on the local node in a transaction pool to prepare for consensus. #### Transaction Pool The transaction pool is responsible for managing valid transactions. It ensures that only transactions that pass initial validity checks are queued. Transactions that fail validation, expire, or become invalid for other reasons are removed from the pool. The transaction pool organizes transactions into two queues: - **Ready queue** - transactions that are valid and ready to be included in a block - **Future queue** - transactions that are not yet valid but could be in the future, such as transactions with a nonce too high for the current state Details on how the transaction pool validates transactions, including fee and signature handling, can be found in the [`validate_transaction`](https://paritytech.github.io/polkadot-sdk/master/sp_transaction_pool/runtime_api/trait.TaggedTransactionQueue.html#method.validate_transaction){target=\_blank} method. #### Invalid Transactions If a transaction is invalid, for example, due to an invalid signature or insufficient funds, it is rejected and won't be added to the block. Invalid transactions might be rejected for reasons such as: - The transaction has already been included in a block - The transaction's signature does not match the sender - The transaction is too large to fit in the current block ### Transaction Ordering and Priority When a node is selected as the next block author, it prioritizes transactions based on weight, length, and tip amount. The goal is to fill the block with high-priority transactions without exceeding its maximum size or computational limits. Transactions are ordered as follows: - **Inherents first** - inherent transactions, such as block timestamp updates, are always placed first - **Nonce-based ordering** - transactions from the same account are ordered by their nonce - **Fee-based ordering** - among transactions with the same nonce or priority level, those with higher fees are prioritized ### Transaction Execution Once a block author selects transactions from the pool, the transactions are executed in priority order. 
As each transaction is processed, the state changes are written directly to the chain's storage. It's important to note that these changes are not cached, meaning a failed transaction won't revert earlier state changes, which could leave the block in an inconsistent state. Events are also written to storage. Runtime logic should not emit an event before performing the associated actions. If the associated transaction fails after the event was emitted, the event will not revert. ## Transaction Mortality Transactions in the network can be configured as either mortal (with expiration) or immortal (without expiration). Every transaction payload contains a block checkpoint (reference block number and hash) and an era/validity period that determines how many blocks after the checkpoint the transaction remains valid. When a transaction is submitted, the network validates it against these parameters. If the transaction is not included in a block within the specified validity window, it is automatically removed from the transaction queue. - **Mortal transactions**: have a finite lifespan and will expire after a specified number of blocks. For example, a transaction with a block checkpoint of 1000 and a validity period of 64 blocks will be valid from blocks 1000 to 1064. - **Immortal transactions**: never expire and remain valid indefinitely. To create an immortal transaction, set the block checkpoint to 0 (genesis block), use the genesis hash as a reference, and set the validity period to 0. However, immortal transactions pose significant security risks through replay attacks. If an account is reaped (balance drops to zero, account removed) and later re-funded, malicious actors can replay old immortal transactions. The blockchain maintains only a limited number of prior block hashes for reference validation, called `BlockHashCount`. If your validity period exceeds `BlockHashCount`, the effective validity period becomes the minimum of your specified period and the block hash count. ## Unique Identifiers for Extrinsics Transaction hashes are **not unique identifiers** in Polkadot SDK-based chains. Key differences from traditional blockchains: - Transaction hashes serve only as fingerprints of transaction information - Multiple valid transactions can share the same hash - Hash uniqueness assumptions lead to serious issues For example, when an account is reaped (removed due to insufficient balance) and later recreated, it resets to nonce 0, allowing identical transactions to be valid at different points: | Block | Extrinsic Index | Hash | Origin | Nonce | Call | Result | |-------|----------------|------|-----------|-------|---------------------|-------------------------------| | 100 | 0 | 0x01 | Account A | 0 | Transfer 5 DOT to B | Account A reaped | | 150 | 5 | 0x02 | Account B | 4 | Transfer 7 DOT to A | Account A created (nonce = 0) | | 200 | 2 | 0x01 | Account A | 0 | Transfer 5 DOT to B | Successful transaction | Notice that blocks 100 and 200 contain transactions with identical hashes (0x01) but are completely different, valid operations occurring at different times. Additional complexity comes from Polkadot SDK's origin abstraction. Origins can represent collectives, governance bodies, or other non-account entities that don't maintain nonces like regular accounts and might dispatch identical calls multiple times with the same hash values. Each execution occurs in different chain states with different results. 
The correct way to uniquely identify an extrinsic on a Polkadot SDK-based chain is to use the block ID (height or hash) and the extrinsic index. Since the Polkadot SDK defines blocks as headers plus ordered arrays of extrinsics, the index position within a canonical block provides guaranteed uniqueness. ## Additional Resources For a video overview of the lifecycle of transactions and the types of transactions that exist, see the [Transaction lifecycle](https://www.youtube.com/watch?v=3pfM0GOp02c){target=\_blank} seminar from Parity Tech. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/chain-data/ --- BEGIN CONTENT --- --- title: Chain Data description: Learn how to expose and utilize chain data for blockchain applications. Discover runtime metadata, RPC APIs, and tools for efficient development. categories: Basics, Polkadot Protocol --- # Chain Data ## Introduction Understanding and leveraging on-chain data is a fundamental aspect of blockchain development. Whether you're building frontend applications or backend systems, accessing and decoding runtime metadata is vital to interacting with the blockchain. This guide introduces you to the tools and processes for generating and retrieving metadata, explains its role in application development, and outlines the additional APIs available for interacting with a Polkadot node. By mastering these components, you can ensure seamless communication between your applications and the blockchain. ## Application Development You might not be directly involved in building frontend applications as a blockchain developer. However, most applications that run on a blockchain require some form of frontend or user-facing client to enable users or other programs to access and modify the data that the blockchain stores. For example, you might develop a browser-based, mobile, or desktop application that allows users to submit transactions, post articles, view their assets, or track previous activity. The backend for that application is configured in the runtime logic for your blockchain, but the frontend client makes the runtime features accessible to your users. For your custom chain to be useful to others, you'll need to provide a client application that allows users to view, interact with, or update information that the blockchain keeps track of. In this article, you'll learn how to expose information about your runtime so that client applications can use it, see examples of the information exposed, and explore tools and libraries that use it. ## Understand Metadata Polkadot SDK-based blockchain networks are designed to expose their runtime information, allowing developers to learn granular details regarding pallets, RPC calls, and runtime APIs. The metadata also exposes their related documentation. The chain's metadata is [SCALE-encoded](/polkadot-protocol/basics/data-encoding/){target=\_blank}, allowing for the development of browser-based, mobile, or desktop applications to support the chain's runtime upgrades seamlessly. It is also possible to develop applications compatible with multiple Polkadot SDK-based chains simultaneously. ## Expose Runtime Information as Metadata To interact with a node or the state of the blockchain, you need to know how to connect to the chain and access the exposed runtime features. This interaction involves a Remote Procedure Call (RPC) through a node endpoint address, commonly through a secure web socket connection. 
An application developer typically needs to know the contents of the runtime logic, including the following details: - Version of the runtime the application is connecting to - Supported APIs - Implemented pallets - Defined functions and corresponding type signatures - Defined custom types - Exposed parameters users can set As the Polkadot SDK is modular and provides a composable framework for building blockchains, there are limitless opportunities to customize the schema of properties. Each runtime can be configured with its properties, including function calls and types, which can be changed over time with runtime upgrades. The Polkadot SDK enables you to generate the runtime metadata schema to capture information unique to a runtime. The metadata for a runtime describes the pallets in use and types defined for a specific runtime version. The metadata includes information about each pallet's storage items, functions, events, errors, and constants. The metadata also provides type definitions for any custom types included in the runtime. Metadata provides a complete inventory of a chain's runtime. It is key to enabling client applications to interact with the node, parse responses, and correctly format message payloads sent back to that chain. ## Generate Metadata To efficiently use the blockchain's networking resources and minimize the data transmitted over the network, the metadata schema is encoded using the [Parity SCALE Codec](https://github.com/paritytech/parity-scale-codec?tab=readme-ov-file#parity-scale-codec){target=\_blank}. This encoding is done automatically through the [`scale-info`](https://docs.rs/scale-info/latest/scale_info/){target=\_blank}crate. At a high level, generating the metadata involves the following steps: 1. The pallets in the runtime logic expose callable functions, types, parameters, and documentation that need to be encoded in the metadata 2. The `scale-info` crate collects type information for the pallets in the runtime, builds a registry of the pallets that exist in a particular runtime, and the relevant types for each pallet in the registry. The type information is detailed enough to enable encoding and decoding for every type 3. The [`frame-metadata`](https://github.com/paritytech/frame-metadata){target=\_blank} crate describes the structure of the runtime based on the registry provided by the `scale-info` crate 4. Nodes provide the RPC method `state_getMetadata` to return a complete description of all the types in the current runtime as a hex-encoded vector of SCALE-encoded bytes ## Retrieve Runtime Metadata The type information provided by the metadata enables applications to communicate with nodes using different runtime versions and across chains that expose different calls, events, types, and storage items. The metadata also allows libraries to generate a substantial portion of the code needed to communicate with a given node, enabling libraries like [`subxt`](https://github.com/paritytech/subxt){target=\_blank} to generate frontend interfaces that are specific to a target chain. ### Use Polkadot.js Visit the [Polkadot.js Portal](https://polkadot.js.org/apps/#/rpc){target=\_blank} and select the **Developer** dropdown in the top banner. Select **RPC Calls** to make the call to request metadata. Follow these steps to make the RPC call: 1. Select **state** as the endpoint to call 2. Select **`getMetadata(at)`** as the method to call 3. 
Click **Submit RPC call** to submit the call and return the metadata in JSON format ### Use Curl You can fetch the metadata for the network by calling the node's RPC endpoint. This request returns the metadata in bytes rather than human-readable JSON: ```sh curl -H "Content-Type: application/json" \ -d '{"id":1, "jsonrpc":"2.0", "method": "state_getMetadata"}' \ https://rpc.polkadot.io ``` ### Use Subxt [`subxt`](https://github.com/paritytech/subxt){target=\_blank} may also be used to fetch the metadata of any data in a human-readable JSON format: ```sh subxt metadata --url wss://rpc.polkadot.io --format json > spec.json ``` Another option is to use the [`subxt` explorer web UI](https://paritytech.github.io/subxt-explorer/#/){target=\_blank}. ## Client Applications and Metadata The metadata exposes the expected way to decode each type, meaning applications can send, retrieve, and process application information without manual encoding and decoding. Client applications must use the [SCALE codec library](https://github.com/paritytech/parity-scale-codec?tab=readme-ov-file#parity-scale-codec){target=\_blank} to encode and decode RPC payloads to use the metadata. Client applications use the metadata to interact with the node, parse responses, and format message payloads sent to the node. ## Metadata Format Although the SCALE-encoded bytes can be decoded using the `frame-metadata` and [`parity-scale-codec`](https://github.com/paritytech/parity-scale-codec){target=\_blank} libraries, there are other tools, such as `subxt` and the Polkadot-JS API, that can convert the raw data to human-readable JSON format. The types and type definitions included in the metadata returned by the `state_getMetadata` RPC call depend on the runtime's metadata version. In general, the metadata includes the following information: - A constant identifying the file as containing metadata - The version of the metadata format used in the runtime - Type definitions for all types used in the runtime and generated by the `scale-info` crate - Pallet information for the pallets included in the runtime in the order that they are defined in the `construct_runtime` macro !!!tip Depending on the frontend library used (such as the [Polkadot API](https://papi.how/){target=\_blank}), they may format the metadata differently than the raw format shown. The following example illustrates a condensed and annotated section of metadata decoded and converted to JSON: ```json [ 1635018093, { "V14": { "types": { "types": [{}] }, "pallets": [{}], "extrinsic": { "ty": 126, "version": 4, "signed_extensions": [{}] }, "ty": 141 } } ] ``` The constant `1635018093` is a magic number that identifies the file as a metadata file. The rest of the metadata is divided into the `types`, `pallets`, and `extrinsic` sections: - The `types` section contains an index of the types and information about each type's type signature - The `pallets` section contains information about each pallet in the runtime - The `extrinsic` section describes the type identifier and transaction format version that the runtime uses Different extrinsic versions can have varying formats, especially when considering [signed transactions](/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/#signed-transactions){target=\_blank}. 
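The magic number shown above is easy to sanity-check: `1635018093` is the ASCII string `meta` read as a little-endian 32-bit integer. The following standalone snippet verifies this; it uses only the Rust standard library and is independent of any Polkadot tooling.

```rust
fn main() {
    let magic: u32 = 1_635_018_093; // 0x6174656D
    // Interpreted as little-endian bytes, the value spells out "meta".
    let bytes = magic.to_le_bytes();
    assert_eq!(&bytes, b"meta");
    println!("0x{magic:08X} -> {}", String::from_utf8_lossy(&bytes));
}
```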
### Pallets The following is a condensed and annotated example of metadata for a single element in the `pallets` array (the [`sudo`](https://paritytech.github.io/polkadot-sdk/master/pallet_sudo/index.html){target=\_blank} pallet): ```json { "name": "Sudo", "storage": { "prefix": "Sudo", "entries": [ { "name": "Key", "modifier": "Optional", "ty": { "Plain": 0 }, "default": [0], "docs": ["The `AccountId` of the sudo key."] } ] }, "calls": { "ty": 117 }, "event": { "ty": 42 }, "constants": [], "error": { "ty": 124 }, "index": 8 } ``` Every element metadata contains the name of the pallet it represents and information about its storage, calls, events, and errors. You can look up details about the definition of the calls, events, and errors by viewing the type index identifier. The type index identifier is the `u32` integer used to access the type information for that item. For example, the type index identifier for calls in the Sudo pallet is 117. If you view information for that type identifier in the `types` section of the metadata, it provides information about the available calls, including the documentation for each call. For example, the following is a condensed excerpt of the calls for the Sudo pallet: ```json { "id": 117, "type": { "path": ["pallet_sudo", "pallet", "Call"], "params": [ { "name": "T", "type": null } ], "def": { "variant": { "variants": [ { "name": "sudo", "fields": [ { "name": "call", "type": 114, "typeName": "Box<::RuntimeCall>" } ], "index": 0, "docs": [ "Authenticates sudo key, dispatches a function call with `Root` origin" ] }, { "name": "sudo_unchecked_weight", "fields": [ { "name": "call", "type": 114, "typeName": "Box<::RuntimeCall>" }, { "name": "weight", "type": 8, "typeName": "Weight" } ], "index": 1, "docs": [ "Authenticates sudo key, dispatches a function call with `Root` origin" ] }, { "name": "set_key", "fields": [ { "name": "new", "type": 103, "typeName": "AccountIdLookupOf" } ], "index": 2, "docs": [ "Authenticates current sudo key, sets the given AccountId (`new`) as the new sudo" ] }, { "name": "sudo_as", "fields": [ { "name": "who", "type": 103, "typeName": "AccountIdLookupOf" }, { "name": "call", "type": 114, "typeName": "Box<::RuntimeCall>" } ], "index": 3, "docs": [ "Authenticates sudo key, dispatches a function call with `Signed` origin from a given account" ] } ] } } } } ``` For each field, you can access type information and metadata for the following: - **Storage metadata** - provides the information required to enable applications to get information for specific storage items - **Call metadata** - includes information about the runtime calls defined by the `#[pallet]` macro including call names, arguments and documentation - **Event metadata** - provides the metadata generated by the `#[pallet::event]` macro, including the name, arguments, and documentation for each pallet event - **Constants metadata** - provides metadata generated by the `#[pallet::constant]` macro, including the name, type, and hex-encoded value of the constant - **Error metadata** - provides metadata generated by the `#[pallet::error]` macro, including the name and documentation for each pallet error !!!tip Type identifiers change from time to time, so you should avoid relying on specific type identifiers in your applications. ### Extrinsic The runtime generates extrinsic metadata and provides useful information about transaction format. When decoded, the metadata contains the transaction version and the list of signed extensions. 
For example: ```json { "extrinsic": { "ty": 126, "version": 4, "signed_extensions": [ { "identifier": "CheckNonZeroSender", "ty": 132, "additional_signed": 41 }, { "identifier": "CheckSpecVersion", "ty": 133, "additional_signed": 4 }, { "identifier": "CheckTxVersion", "ty": 134, "additional_signed": 4 }, { "identifier": "CheckGenesis", "ty": 135, "additional_signed": 11 }, { "identifier": "CheckMortality", "ty": 136, "additional_signed": 11 }, { "identifier": "CheckNonce", "ty": 138, "additional_signed": 41 }, { "identifier": "CheckWeight", "ty": 139, "additional_signed": 41 }, { "identifier": "ChargeTransactionPayment", "ty": 140, "additional_signed": 41 } ] }, "ty": 141 } ``` The type system is [composite](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/frame_runtime_types/index.html){target=\_blank}, meaning each type identifier contains a reference to a specific type or to another type identifier that provides information about the associated primitive types. For example, you can encode the `BitVec` type, but to decode it properly, you must know the types used for the `Order` and `Store` types. To find type information for `Order` and `Store`, you can use the path in the decoded JSON to locate their type identifiers. ## Included RPC APIs A standard node comes with the following APIs to interact with a node: - [**`AuthorApiServer`**](https://paritytech.github.io/polkadot-sdk/master/sc_rpc/author/trait.AuthorApiServer.html){target=\_blank} - make calls into a full node, including authoring extrinsics and verifying session keys - [**`ChainApiServer`**](https://paritytech.github.io/polkadot-sdk/master/sc_rpc/chain/trait.ChainApiServer.html){target=\_blank} - retrieve block header and finality information - [**`OffchainApiServer`**](https://paritytech.github.io/polkadot-sdk/master/sc_rpc/offchain/trait.OffchainApiServer.html){target=\_blank} - make RPC calls for off-chain workers - [**`StateApiServer`**](https://paritytech.github.io/polkadot-sdk/master/sc_rpc/state/trait.StateApiServer.html){target=\_blank} - query information about on-chain state such as runtime version, storage items, and proofs - [**`SystemApiServer`**](https://paritytech.github.io/polkadot-sdk/master/sc_rpc/system/trait.SystemApiServer.html){target=\_blank} - retrieve information about network state, such as connected peers and node roles ## Additional Resources The following tools can help you locate and decode metadata: - [Subxt Explorer](https://paritytech.github.io/subxt-explorer/#/){target=\_blank} - [Metadata Portal πŸŒ—](https://github.com/paritytech/metadata-portal){target=\_blank} - [De[code] Sub[strate]](https://github.com/paritytech/desub){target=\_blank} --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/cryptography/ --- BEGIN CONTENT --- --- title: Cryptography description: A concise guide to cryptography in blockchain, covering hash functions, encryption types, digital signatures, and elliptic curve applications. categories: Basics, Polkadot Protocol --- # Cryptography ## Introduction Cryptography forms the backbone of blockchain technology, providing the mathematical verifiability crucial for consensus systems, data integrity, and user security. While a deep understanding of the underlying mathematical processes isn't necessary for most blockchain developers, grasping the fundamental applications of cryptography is essential. 
This page comprehensively overviews cryptographic implementations used across Polkadot SDK-based chains and the broader blockchain ecosystem. ## Hash Functions Hash functions are fundamental to blockchain technology, creating a unique digital fingerprint for any piece of data, including simple text, images, or any other form of file. They map input data of any size to a fixed-size output (typically 32 bytes) using complex mathematical operations. Hashing is used to verify data integrity, create digital signatures, and provide a secure way to store passwords. This form of mapping is known as the ["pigeonhole principle,"](https://en.wikipedia.org/wiki/Pigeonhole_principle){target=\_blank} it is primarily implemented to efficiently and verifiably identify data from large sets. ### Key Properties of Hash Functions - **Deterministic** - the same input always produces the same output - **Quick computation** - it's easy to calculate the hash value for any given input - **Pre-image resistance** - it's infeasible to generate the input data from its hash - **Small changes in input yield large changes in output** - known as the ["avalanche effect"](https://en.wikipedia.org/wiki/Avalanche_effect){target=\_blank} - **Collision resistance** - the probabilities are extremely low to find two different inputs with the same hash ### Blake2 The Polkadot SDK utilizes Blake2, a state-of-the-art hashing method that offers: - Equal or greater security compared to [SHA-2](https://en.wikipedia.org/wiki/SHA-2){target=\_blank} - Significantly faster performance than other algorithms These properties make Blake2 ideal for blockchain systems, reducing sync times for new nodes and lowering the resources required for validation. For detailed technical specifications about Blake2, see the [official Blake2 paper](https://www.blake2.net/blake2.pdf){target=\_blank}. ## Types of Cryptography There are two different ways that cryptographic algorithms are implemented: symmetric cryptography and asymmetric cryptography. ### Symmetric Cryptography Symmetric encryption is a branch of cryptography that isn't based on one-way functions, unlike asymmetric cryptography. It uses the same cryptographic key to encrypt plain text and decrypt the resulting ciphertext. Symmetric cryptography is a type of encryption that has been used throughout history, such as the Enigma Cipher and the Caesar Cipher. It is still widely used today and can be found in Web2 and Web3 applications alike. There is only one single key, and a recipient must also have access to it to access the contained information. #### Advantages {: #symmetric-advantages } - Fast and efficient for large amounts of data - Requires less computational power #### Disadvantages {: #symmetric-disadvantages } - Key distribution can be challenging - Scalability issues in systems with many users ### Asymmetric Cryptography Asymmetric encryption is a type of cryptography that uses two different keys, known as a keypair: a public key, used to encrypt plain text, and a private counterpart, used to decrypt the ciphertext. The public key encrypts a fixed-length message that can only be decrypted with the recipient's private key and, sometimes, a set password. The public key can be used to cryptographically verify that the corresponding private key was used to create a piece of data without compromising the private key, such as with digital signatures. This has obvious implications for identity, ownership, and properties and is used in many different protocols across Web2 and Web3. 
#### Advantages {: #asymmetric-advantages } - Solves the key distribution problem - Enables digital signatures and secure key exchange #### Disadvantages {: #asymmetric-disadvantages } - Slower than symmetric encryption - Requires more computational resources ### Trade-offs and Compromises Symmetric cryptography is faster and requires fewer bits in the key to achieve the same level of security that asymmetric cryptography provides. However, it requires a shared secret before communication can occur, which poses issues to its integrity and a potential compromise point. On the other hand, asymmetric cryptography doesn't require the secret to be shared ahead of time, allowing for far better end-user security. Hybrid symmetric and asymmetric cryptography is often used to overcome the engineering issues of asymmetric cryptography, as it is slower and requires more bits in the key to achieve the same level of security. It encrypts a key and then uses the comparatively lightweight symmetric cipher to do the "heavy lifting" with the message. ## Digital Signatures Digital signatures are a way of verifying the authenticity of a document or message using asymmetric keypairs. They are used to ensure that a sender or signer's document or message hasn't been tampered with in transit, and for recipients to verify that the data is accurate and from the expected sender. Signing digital signatures only requires a low-level understanding of mathematics and cryptography. For a conceptual example -- when signing a check, it is expected that it cannot be cashed multiple times. This isn't a feature of the signature system but rather the check serialization system. The bank will check that the serial number on the check hasn't already been used. Digital signatures essentially combine these two concepts, allowing the signature to provide the serialization via a unique cryptographic fingerprint that cannot be reproduced. Unlike pen-and-paper signatures, knowledge of a digital signature cannot be used to create other signatures. Digital signatures are often used in bureaucratic processes, as they are more secure than simply scanning in a signature and pasting it onto a document. Polkadot SDK provides multiple different cryptographic schemes and is generic so that it can support anything that implements the [`Pair` trait](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Pair.html){target=\_blank}. ### Example of Creating a Digital Signature The process of creating and verifying a digital signature involves several steps: 1. The sender creates a hash of the message 2. The hash is encrypted using the sender's private key, creating the signature 3. The message and signature are sent to the recipient 4. The recipient decrypts the signature using the sender's public key 5. The recipient hashes the received message and compares it to the decrypted hash If the hashes match, the signature is valid, confirming the message's integrity and the sender's identity. ## Elliptic Curve Blockchain technology requires the ability to have multiple keys creating a signature for block proposal and validation. To this end, Elliptic Curve Digital Signature Algorithm (ECDSA) and Schnorr signatures are two of the most commonly used methods. While ECDSA is a far simpler implementation, Schnorr signatures are more efficient when it comes to multi-signatures. 
Schnorr signatures bring some noticeable features over the ECDSA/EdDSA schemes: - It is better for hierarchical deterministic key derivations - It allows for native multi-signature through [signature aggregation](https://bitcoincore.org/en/2017/03/23/schnorr-signature-aggregation/){target=\_blank} - It is generally more resistant to misuse One sacrifice that is made when using Schnorr signatures over ECDSA is that both require 64 bytes, but only ECDSA signatures communicate their public key. ### Various Implementations - [ECDSA](https://en.wikipedia.org/wiki/Elliptic_Curve_Digital_Signature_Algorithm){target=\_blank} - Polkadot SDK provides an ECDSA signature scheme using the [secp256k1](https://en.bitcoin.it/wiki/Secp256k1){target=\_blank} curve. This is the same cryptographic algorithm used to secure [Bitcoin](https://en.wikipedia.org/wiki/Bitcoin){target=\_blank} and [Ethereum](https://en.wikipedia.org/wiki/Ethereum){target=\_blank} - [Ed25519](https://en.wikipedia.org/wiki/EdDSA#Ed25519){target=\_blank} - is an EdDSA signature scheme using [Curve25519](https://en.wikipedia.org/wiki/Curve25519){target=\_blank}. It is carefully engineered at several levels of design and implementation to achieve very high speeds without compromising security - [SR25519](https://research.web3.foundation/Polkadot/security/keys/accounts-more){target=\_blank} - is based on the same underlying curve as Ed25519. However, it uses Schnorr signatures instead of the EdDSA scheme --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/data-encoding/ --- BEGIN CONTENT --- --- title: Data Encoding description: SCALE codec enables fast, efficient data encoding, ideal for resource-constrained environments like Wasm, supporting custom types and compact encoding. categories: Basics, Polkadot Protocol --- # Data Encoding ## Introduction The Polkadot SDK uses a lightweight and efficient encoding/decoding mechanism to optimize data transmission across the network. This mechanism, known as the _SCALE_ codec, is used for serializing and deserializing data. The SCALE codec enables communication between the runtime and the outer node. This mechanism is designed for high-performance, copy-free data encoding and decoding in resource-constrained environments like the Polkadot SDK [Wasm runtime](/develop/parachains/deployment/build-deterministic-runtime/#introduction){target=\_blank}. It is not self-describing, meaning the decoding context must fully know the encoded data types. Parity's libraries utilize the [`parity-scale-codec`](https://github.com/paritytech/parity-scale-codec){target=\_blank} crate (a Rust implementation of the SCALE codec) to handle encoding and decoding for interactions between RPCs and the runtime. The `codec` mechanism is ideal for Polkadot SDK-based chains because: - It is lightweight compared to generic serialization frameworks like [`serde`](https://serde.rs/){target=\_blank}, which add unnecessary bulk to binaries - It doesn’t rely on Rust’s `libstd`, making it compatible with `no_std` environments like Wasm runtime - It integrates seamlessly with Rust, allowing easy derivation of encoding and decoding logic for new types using `#[derive(Encode, Decode)]` Defining a custom encoding scheme in the Polkadot SDK-based chains, rather than using an existing Rust codec library, is crucial for enabling cross-platform and multi-language support. 
## SCALE Codec The codec is implemented using the following traits: - [`Encode`](#encode) - [`Decode`](#decode) - [`CompactAs`](#compactas) - [`HasCompact`](#hascompact) - [`EncodeLike`](#encodelike) ### Encode The [`Encode`](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/trait.Encode.html){target=\_blank} trait handles data encoding into SCALE format and includes the following key functions: - **`size_hint(&self) -> usize`** - estimates the number of bytes required for encoding to prevent multiple memory allocations. This should be inexpensive and avoid complex operations. Optional if the size isn’t known - **`encode_to<T: Output + ?Sized>(&self, dest: &mut T)`** - encodes the data, appending it to a destination buffer - **`encode(&self) -> Vec<u8>`** - encodes the data and returns it as a byte vector - **`using_encoded<R, F: FnOnce(&[u8]) -> R>(&self, f: F) -> R`** - encodes the data and passes it to a closure, returning the result - **`encoded_size(&self) -> usize`** - calculates the encoded size. Should be used when the encoded data isn’t required !!!tip For best performance, value types should override `using_encoded`, and allocating types should override `encode_to`. It's recommended to implement `size_hint` for all types where possible. ### Decode The [`Decode`](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/trait.Decode.html){target=\_blank} trait handles decoding SCALE-encoded data back into the appropriate types: - **`fn decode<I: Input>(value: &mut I) -> Result<Self, Error>`** - decodes data from the SCALE format, returning an error if decoding fails ### CompactAs The [`CompactAs`](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/trait.CompactAs.html){target=\_blank} trait wraps custom types for compact encoding: - **`encode_as(&self) -> &Self::As`** - encodes the type as a compact type - **`decode_from(_: Self::As) -> Result<Self, Error>`** - decodes from a compact encoded type ### HasCompact The [`HasCompact`](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/trait.HasCompact.html){target=\_blank} trait indicates a type supports compact encoding. ### EncodeLike The [`EncodeLike`](https://docs.rs/parity-scale-codec/latest/parity_scale_codec/trait.EncodeLike.html){target=\_blank} trait is used to ensure multiple types that encode similarly are accepted by the same function. When using `derive`, it is automatically implemented. ### Data Types The table below outlines how the Rust implementation of the Parity SCALE codec encodes different data types. | Type | Description | Example SCALE Decoded Value | SCALE Encoded Value | |------|-------------|-----------------------------|---------------------| | Boolean | Boolean values are encoded using the least significant bit of a single byte. | `false` / `true` | `0x00` / `0x01` | | Compact/general integers | A "compact" or general integer encoding is sufficient for encoding large integers (up to 2^536) and is more efficient at encoding most values than the fixed-width version.
| `unsigned integer 0` / `unsigned integer 1` / `unsigned integer 42` / `unsigned integer 69` / `unsigned integer 65535` / `BigInt(100000000000000)` | `0x00` / `0x04` / `0xa8` / `0x1501` / `0xfeff0300` / `0x0b00407a10f35a` | | Enumerations (tagged-unions) | A fixed number of variants, each mutually exclusive and potentially implying a further value or series of values. Encoded as the first byte identifying the index of the variant that the value is. Any further bytes are used to encode any data that the variant implies. Thus, no more than 256 variants are supported. | `Int(42)` and `Bool(true)` where `enum IntOrBool { Int(u8), Bool(bool) }` | `0x002a` and `0x0101` | | Fixed-width integers | Basic integers are encoded using a fixed-width little-endian (LE) format. | `signed 8-bit integer 69` / `unsigned 16-bit integer 42` / `unsigned 32-bit integer 16777215` | `0x45` / `0x2a00` / `0xffffff00` | | Options | One or zero values of a particular type. | `Some` / `None` | `0x01` followed by the encoded value / `0x00` | | Results | Results are commonly used enumerations which indicate whether certain operations were successful or unsuccessful. | `Ok(42)` / `Err(false)` | `0x002a` / `0x0100` | | Strings | Strings are vectors of bytes (`Vec<u8>`) containing a valid UTF-8 sequence. | | | | Structs | For structures, the values are named, but that is irrelevant for the encoding (names are ignored - only order matters). | `SortedVecAsc::from([3, 5, 2, 8])` | `[3, 2, 5, 8]` | | Tuples | A fixed-size series of values, each with a possibly different but predetermined and fixed type. This is simply the concatenation of each encoded value. | Tuple of compact unsigned integer and boolean: `(3, false)` | `0x0c00` | | Vectors (lists, series, sets) | A collection of same-typed values is encoded, prefixed with a compact encoding of the number of items, followed by each item's encoding concatenated in turn. | Vector of unsigned `16`-bit integers: `[4, 8, 15, 16, 23, 42]` | `0x18040008000f00100017002a00` | ## Encode and Decode Rust Trait Implementations Here's how the `Encode` and `Decode` traits are implemented:

```rust
use parity_scale_codec::{Encode, Decode};

#[derive(Debug, PartialEq, Encode, Decode)]
enum EnumType {
    #[codec(index = 15)]
    A,
    B(u32, u64),
    C { a: u32, b: u64 },
}

let a = EnumType::A;
let b = EnumType::B(1, 2);
let c = EnumType::C { a: 1, b: 2 };

a.using_encoded(|ref slice| {
    assert_eq!(slice, &b"\x0f");
});

b.using_encoded(|ref slice| {
    assert_eq!(slice, &b"\x01\x01\0\0\0\x02\0\0\0\0\0\0\0");
});

c.using_encoded(|ref slice| {
    assert_eq!(slice, &b"\x02\x01\0\0\0\x02\0\0\0\0\0\0\0");
});

let mut da: &[u8] = b"\x0f";
assert_eq!(EnumType::decode(&mut da).ok(), Some(a));

let mut db: &[u8] = b"\x01\x01\0\0\0\x02\0\0\0\0\0\0\0";
assert_eq!(EnumType::decode(&mut db).ok(), Some(b));

let mut dc: &[u8] = b"\x02\x01\0\0\0\x02\0\0\0\0\0\0\0";
assert_eq!(EnumType::decode(&mut dc).ok(), Some(c));

let mut dz: &[u8] = &[0];
assert_eq!(EnumType::decode(&mut dz).ok(), None);
```

## SCALE Codec Libraries Several SCALE codec implementations are available in various languages.
Here's a list of them: - **AssemblyScript** - [`LimeChain/as-scale-codec`](https://github.com/LimeChain/as-scale-codec){target=\_blank} - **C** - [`MatthewDarnell/cScale`](https://github.com/MatthewDarnell/cScale){target=\_blank} - **C++** - [`qdrvm/scale-codec-cpp`](https://github.com/qdrvm/scale-codec-cpp){target=\_blank} - **JavaScript** - [`polkadot-js/api`](https://github.com/polkadot-js/api){target=\_blank} - **Dart** - [`leonardocustodio/polkadart`](https://github.com/leonardocustodio/polkadart){target=\_blank} - **Haskell** - [`airalab/hs-web3`](https://github.com/airalab/hs-web3/tree/master/packages/scale){target=\_blank} - **Golang** - [`itering/scale.go`](https://github.com/itering/scale.go){target=\_blank} - **Java** - [`splix/polkaj`](https://github.com/splix/polkaj){target=\_blank} - **Python** - [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\_blank} - **Ruby** - [` wuminzhe/scale_rb`](https://github.com/wuminzhe/scale_rb){target=\_blank} - **TypeScript** - [`parity-scale-codec-ts`](https://github.com/tjjfvi/subshape){target=\_blank}, [`scale-ts`](https://github.com/unstoppablejs/unstoppablejs/tree/main/packages/scale-ts#scale-ts){target=\_blank}, [`soramitsu/scale-codec-js-library`](https://github.com/soramitsu/scale-codec-js-library){target=\_blank}, [`subsquid/scale-codec`](https://github.com/subsquid/squid-sdk/tree/master/substrate/scale-codec){target=\_blank} --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/interoperability/ --- BEGIN CONTENT --- --- title: Interoperability description: Explore the importance of interoperability in the Polkadot ecosystem, covering XCM, bridges, and cross-chain communication. categories: Basics, Polkadot Protocol --- # Interoperability ## Introduction Interoperability lies at the heart of the Polkadot ecosystem, enabling communication and collaboration across a diverse range of blockchains. By bridging the gaps between parachains, relay chains, and even external networks, Polkadot unlocks the potential for truly decentralized applications, efficient resource sharing, and scalable solutions. Polkadot’s design ensures that blockchains can transcend their individual limitations by working together as part of a unified system. This cooperative architecture is what sets Polkadot apart in the blockchain landscape. ## Why Interoperability Matters The blockchain ecosystem is inherently fragmented. Different blockchains excel in specialized domains such as finance, gaming, or supply chain management, but these chains function in isolation without interoperability. This lack of connectivity stifles the broader utility of blockchain technology. Interoperability solves this problem by enabling blockchains to: - **Collaborate across networks** - chains can interact to share assets, functionality, and data, creating synergies that amplify their individual strengths - **Achieve greater scalability** - specialized chains can offload tasks to others, optimizing performance and resource utilization - **Expand use-case potential** - cross-chain applications can leverage features from multiple blockchains, unlocking novel user experiences and solutions In the Polkadot ecosystem, interoperability transforms a collection of isolated chains into a cohesive, efficient network, pushing the boundaries of what blockchains can achieve together. 
## Key Mechanisms for Interoperability At the core of Polkadot's cross-chain collaboration are foundational technologies designed to break down barriers between networks. These mechanisms empower blockchains to communicate, share resources, and operate as a cohesive ecosystem. ### Cross-Consensus Messaging (XCM): The Backbone of Communication Polkadot's Cross-Consensus Messaging (XCM) is the standard framework for interaction between parachains, relay chains, and, eventually, external blockchains. XCM provides a trustless, secure messaging format for exchanging assets, sharing data, and executing cross-chain operations. Through XCM, decentralized applications can: - Transfer tokens and other assets across chains - Coordinate complex workflows that span multiple blockchains - Enable seamless user experiences where underlying blockchain differences are invisible XCM exemplifies Polkadot’s commitment to creating a robust and interoperable ecosystem. For further information about XCM, check the [Introduction to XCM](/develop/interoperability/intro-to-xcm/){target=\_blank} article. ### Bridges: Connecting External Networks While XCM enables interoperability within the Polkadot ecosystem, bridges extend this functionality to external blockchains such as Ethereum and Bitcoin. By connecting these networks, bridges allow Polkadot-based chains to access external liquidity, additional functionalities, and broader user bases. With bridges, developers and users gain the ability to: - Integrate external assets into Polkadot-based applications - Combine the strengths of Polkadot’s scalability with the liquidity of other networks - Facilitate genuine multi-chain applications that transcend ecosystem boundaries For more information about bridges in the Polkadot ecosystem, see the [Bridge Hub](/polkadot-protocol/architecture/system-chains/bridge-hub/){target=\_blank} guide. ## The Polkadot Advantage Polkadot was purpose-built for interoperability. Unlike networks that add interoperability as an afterthought, Polkadot integrates it as a fundamental design principle. This approach offers several distinct advantages: - **Developer empowerment** - Polkadot’s interoperability tools allow developers to build applications that leverage multiple chains’ capabilities without added complexity - **Enhanced ecosystem collaboration** - chains in Polkadot can focus on their unique strengths while contributing to the ecosystem’s overall growth - **Future-proofing blockchain** - by enabling seamless communication, Polkadot ensures its ecosystem can adapt to evolving demands and technologies ## Looking Ahead Polkadot’s vision of interoperability extends beyond technical functionality, representing a shift towards a more collaborative blockchain landscape. By enabling chains to work together, Polkadot fosters innovation, efficiency, and accessibility, paving the way for a decentralized future where blockchains are not isolated competitors but interconnected collaborators. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/networks/ --- BEGIN CONTENT --- --- title: Networks description: Explore Polkadot's testing and production networks, including Westend, Kusama, and Paseo, for efficient development, deployment, and testing. categories: Basics, Polkadot Protocol, Networks --- # Networks ## Introduction The Polkadot ecosystem is built on a robust set of networks designed to enable secure and scalable development.
Whether you are testing new features or deploying to live production, Polkadot offers several layers of networks tailored for each stage of the development process. From local environments to experimental networks like Kusama and community-run TestNets such as Paseo, developers can thoroughly test, iterate, and validate their applications. This guide will introduce you to Polkadot's various networks and explain how they fit into the development workflow. ## Network Overview Polkadot's development process is structured to ensure new features and upgrades are rigorously tested before being deployed on live production networks. The progression follows a well-defined path, starting from local environments and advancing through TestNets, ultimately reaching the Polkadot MainNet. The diagram below outlines the typical progression of the Polkadot development cycle: ``` mermaid flowchart LR id1[Local] --> id2[Westend] --> id4[Kusama] --> id5[Polkadot] id1[Local] --> id3[Paseo] --> id5[Polkadot] ``` This flow ensures developers can thoroughly test and iterate without risking real tokens or affecting production networks. Testing tools like [Chopsticks](#chopsticks) and various TestNets make it easier to experiment safely before releasing to production. A typical journey through the Polkadot core protocol development process might look like this: 1. **Local development node** - development starts in a local environment, where developers can create, test, and iterate on upgrades or new features using a local development node. This stage allows rapid experimentation in an isolated setup without any external dependencies 2. **Westend** - after testing locally, upgrades are deployed to [Westend](#westend), Polkadot's primary TestNet. Westend simulates real-world conditions without using real tokens, making it the ideal place for rigorous feature testing before moving on to production networks 3. **Kusama** - once features have passed extensive testing on Westend, they move to Kusama, Polkadot's experimental and fast-moving "canary" network. Kusama operates as a high-fidelity testing ground with actual economic incentives, giving developers insights into how their features will perform in a real-world environment 4. **Polkadot** - after passing tests on Westend and Kusama, features are considered ready for deployment to Polkadot, the live production network In addition, parachain developers can leverage local TestNets like [Zombienet](#zombienet) and deploy upgrades on parachain TestNets. 5. **Paseo** - For parachain and dApp developers, Paseo serves as a community-run TestNet that mirrors Polkadot's runtime. Like Westend for core protocol development, Paseo provides a testing ground for parachain development without affecting live networks !!!note The Rococo TestNet deprecation date was October 14, 2024. Teams should use Westend for Polkadot protocol and feature testing and Paseo for chain development-related testing. ## Polkadot Development Networks Development and testing are crucial to building robust dApps and parachains and performing network upgrades within the Polkadot ecosystem. To achieve this, developers can leverage various networks and tools that provide a risk-free environment for experimentation and validation before deploying features to live networks. These networks help avoid the costs and risks associated with real tokens, enabling testing for functionalities like governance, cross-chain messaging, and runtime upgrades. 
## Kusama Network Kusama is the experimental version of Polkadot, designed for developers who want to move quickly and test their applications in a real-world environment with economic incentives. Kusama serves as a production-grade testing ground where developers can deploy features and upgrades with the pressure of game theory and economics in mind. It mirrors Polkadot but operates as a more flexible space for innovation. The native token for Kusama is KSM. For more information about KSM, visit the [Native Assets](https://wiki.polkadot.network/learn/learn-dot/#kusama){target=\_blank} page. ## Test Networks The following test networks provide controlled environments for testing upgrades and new features. TestNet tokens are available from the [Polkadot faucet](https://faucet.polkadot.io/){target=\_blank}. ### Westend Westend is Polkadot's primary permanent TestNet. Unlike temporary test networks, Westend is not reset to the genesis block, making it an ongoing environment for testing Polkadot core features. Managed by Parity Technologies, Westend ensures that developers can test features in a real-world simulation without using actual tokens. The native token for Westend is WND. More details about WND can be found on the [Native Assets](https://wiki.polkadot.network/learn/learn-dot/#getting-tokens-on-the-westend-testnet){target=\_blank} page. ### Paseo [Paseo](https://github.com/paseo-network){target=\_blank} is a community-managed TestNet designed for parachain and dApp developers. It mirrors Polkadot's runtime and is maintained by Polkadot community members. Paseo provides a dedicated space for parachain developers to test their applications in a Polkadot-like environment without the risks associated with live networks. The native token for Paseo is PAS. Additional information on PAS is available on the [Native Assets](https://wiki.polkadot.network/learn/learn-dot/#getting-tokens-on-the-paseo-testnet){target=\_blank} page. ## Local Test Networks Local test networks are an essential part of the development cycle for blockchain developers using the Polkadot SDK. They allow for fast, iterative testing in controlled, private environments without connecting to public TestNets. Developers can quickly spin up local instances to experiment, debug, and validate their code before deploying to larger TestNets like Westend or Paseo. Two key tools for local network testing are Zombienet and Chopsticks. ### Zombienet [Zombienet](https://github.com/paritytech/zombienet){target=\_blank} is a flexible testing framework for Polkadot SDK-based blockchains. It enables developers to create and manage ephemeral, short-lived networks. This feature makes Zombienet particularly useful for quick iterations, as it allows you to run multiple local networks concurrently, mimicking different runtime conditions. Whether you're developing a parachain or testing your custom blockchain logic, Zombienet gives you the tools to automate local testing. Key features of Zombienet include: - Creating dynamic, local networks with different configurations - Running parachains and relay chains in a simulated environment - Efficient testing of network components like cross-chain messaging and governance Zombienet is ideal for developers looking to test quickly and thoroughly before moving to more resource-intensive public TestNets. 
### Chopsticks [Chopsticks](https://github.com/AcalaNetwork/chopsticks){target=\_blank} is a tool designed to create forks of Polkadot SDK-based blockchains, allowing developers to interact with network forks as part of their testing process. This capability makes Chopsticks a powerful option for testing upgrades, runtime changes, or cross-chain applications in a forked network environment. Key features of Chopsticks include: - Forking live Polkadot SDK-based blockchains for isolated testing - Simulating cross-chain messages in a private, controlled setup - Debugging network behavior by interacting with the fork in real-time Chopsticks provides a controlled environment for developers to safely explore the effects of runtime changes. It ensures that network behavior is tested and verified before upgrades are deployed to live networks. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/node-and-runtime/ --- BEGIN CONTENT --- --- title: Node and Runtime description: Learn how Polkadot SDK-based nodes function, how the client and runtime are separated, and how they communicate using SCALE-encoded data. categories: Basics, Polkadot Protocol --- # Node and Runtime ## Introduction Every blockchain platform relies on a decentralized network of computers, called nodes, that communicate with each other about transactions and blocks. In this context, a node refers to the software running on the connected devices rather than the physical or virtual machines in the network. Polkadot SDK-based nodes consist of two main components, each with distinct responsibilities: the client (also called node) and the runtime. If the system were a monolithic protocol, any modification would require updating the entire system. Instead, Polkadot achieves true upgradeability by defining an immutable meta-protocol (the client) and a protocol (the runtime) that can be upgraded independently. This separation gives the [Polkadot Relay Chain](/polkadot-protocol/architecture/polkadot-chain){target=\_blank} and all connected [parachains](/polkadot-protocol/architecture/parachains){target=\_blank} an evolutionary advantage over other blockchain platforms. ## Architectural Principles The Polkadot SDK-based blockchain architecture is fundamentally built on two distinct yet interconnected components: - **Client (Meta-protocol)** - Handles the foundational infrastructure of the blockchain - Manages runtime execution, networking, consensus, and other off-chain components - Provides an immutable base layer that ensures network stability - Upgradable only through hard forks - **Runtime (Protocol)** - Defines the blockchain's state transition logic - Determines the specific rules and behaviors of the blockchain - Compiled to WebAssembly (Wasm) for platform-independent execution - Capable of being upgraded without network-wide forking ### Advantages of this Architecture - **Forkless upgrades** - runtime can be updated without disrupting the entire network - **Modularity** - clear separation allows independent development of client and runtime - **Flexibility** - enables rapid iteration and evolution of blockchain logic - **Performance** - WebAssembly compilation provides efficient, cross-platform execution ## Node (Client) The node, also known as the client, is the core component responsible for executing the Wasm runtime and orchestrating various essential blockchain components. 
It ensures the correct execution of the state transition function and manages multiple critical subsystems, including: - **Wasm execution** - runs the blockchain runtime, which defines the state transition rules - **Database management** - stores blockchain data - **Networking** - facilitates peer-to-peer communication, block propagation, and transaction gossiping - **Transaction pool (Mempool)** - manages pending transactions before they are included in a block - **Consensus mechanism** - ensures agreement on the blockchain state across nodes - **RPC services** - provides external interfaces for applications and users to interact with the node ## Runtime The runtime is more than just a set of rules. It's the fundamental logic engine that defines a blockchain's entire behavior. In Polkadot SDK-based blockchains, the runtime represents a complete, self-contained description of the blockchain's state transition function. ### Characteristics The runtime is distinguished by three key characteristics: - **Business logic** - defines the complete application-specific blockchain behavior - **WebAssembly compilation** - ensures platform-independent, secure execution - **On-chain storage** - stored within the blockchain's state, allowing dynamic updates ### Key Functions The runtime performs several critical functions, such as: - Define state transition rules - Implement blockchain-specific logic - Manage account interactions - Control transaction processing - Define governance mechanisms - Handle custom pallets and modules ## Communication Between Node and Runtime The client and runtime communicate exclusively using [SCALE-encoded](/polkadot-protocol/parachain-basics/data-encoding){target=\_blank} communication. This ensures efficient and compact data exchange between the two components. ### Runtime APIs The Runtime API consists of well-defined functions and constants a client assumes are implemented in the Runtime Wasm blob. These APIs enable the client to interact with the runtime to execute blockchain operations and retrieve information. The client invokes these APIs to: - Build, execute, and finalize blocks - Access metadata - Access consensus related information - Handle transaction execution ### Host Functions During execution, the runtime can access certain external client functionalities via host functions. The specific functions the client exposes allow the runtime to perform operations outside the WebAssembly domain. Host functions enable the runtime to: - Perform cryptographic operations - Access the current blockchain state - Handle storage modifications - Allocate memory --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/parachain-basics/randomness/ --- BEGIN CONTENT --- --- title: Randomness description: Explore the importance of randomness in PoS blockchains, focusing on Polkadot’s VRF-based approach to ensure fairness and security in validator selection. categories: Basics, Polkadot Protocol --- # Randomness ## Introduction Randomness is crucial in Proof of Stake (PoS) blockchains to ensure a fair and unpredictable distribution of validator duties. However, computers are inherently deterministic, meaning the same input always produces the same output. What we typically refer to as "random" numbers on a computer are actually pseudo-random. 
These numbers rely on an initial "seed," which can come from external sources like [atmospheric noise](https://www.random.org/randomness/){target=\_blank}, [heart rates](https://mdpi.altmetric.com/details/47574324){target=\_blank}, or even [lava lamps](https://en.wikipedia.org/wiki/Lavarand){target=\_blank}. While this may seem random, given the same "seed," the same sequence of numbers will always be generated. In a global blockchain network, relying on real-world entropy for randomness isn’t feasible because these inputs vary by time and location. If nodes use different inputs, blockchains can fork. Hence, real-world randomness isn't suitable for use as a seed in blockchain systems. Currently, two primary methods for generating randomness in blockchains are used: [`RANDAO`](#randao) and [`VRF`](#vrf) (Verifiable Random Function). Polkadot adopts the `VRF` approach for its randomness. ## VRF A Verifiable Random Function (VRF) is a cryptographic function that generates a random number and proof that ensures the submitter produced the number. This proof allows anyone to verify the validity of the random number. Polkadot's VRF is similar to the one used in [**Ouroboros Praos**](https://eprint.iacr.org/2017/573.pdf){target=\_blank}, which secures randomness for block production in systems like [BABE](/polkadot-protocol/architecture/polkadot-chain/pos-consensus/#block-production-babe){target=\_blank} (Polkadot’s block production mechanism). The key difference is that Polkadot's VRF doesn’t rely on a central clock, avoiding the issue of whose clock to trust. Instead, it uses its own past results and slot numbers to simulate time and determine future outcomes. ### How VRF Works Slots on Polkadot are discrete units of time, each lasting six seconds, and can potentially hold a block. Multiple slots form an epoch, with 2400 slots making up one four-hour epoch. In each slot, validators execute a "die roll" using a VRF. The VRF uses three inputs: 1. A "secret key," unique to each validator, is used for the die roll 2. An epoch randomness value, derived from the hash of VRF outputs from blocks two epochs ago (N-2), so past randomness influences the current epoch (N) 3. The current slot number This process helps maintain fair randomness across the network. Here is a graphical representation: ![](/images/polkadot-protocol/parachain-basics/blocks-transactions-fees/randomness/slots-epochs.webp) The VRF produces two outputs: a result (the random number) and a proof (verifying that the number was generated correctly). The result is checked by the validator against a protocol threshold. If it's below the threshold, the validator becomes a candidate for block production in that slot. The validator then attempts to create a block, submitting it along with the `PROOF` and `RESULT`. So, VRF can be expressed like: `(RESULT, PROOF) = VRF(SECRET, EPOCH_RANDOMNESS_VALUE, CURRENT_SLOT_NUMBER)` Put simply, performing a "VRF roll" generates a random number along with proof that the number was genuinely produced and not arbitrarily chosen. After executing the VRF, the `RESULT` is compared to a protocol-defined `THRESHOLD`. If the `RESULT` is below the `THRESHOLD`, the validator becomes a valid candidate to propose a block for that slot. Otherwise, the validator skips the slot. As a result, there may be multiple validators eligible to propose a block for a slot.
In this case, the block accepted by other nodes will prevail, provided it is on the chain with the latest finalized block as determined by the GRANDPA finality gadget. It's also possible for no block producers to be available for a slot, in which case the AURA consensus takes over. AURA is a fallback mechanism that randomly selects a validator to produce a block, running in parallel with BABE and only stepping in when no block producers exist for a slot. Otherwise, it remains inactive. Because validators roll independently, no block candidates may appear in some slots if all roll numbers are above the threshold. To see how this issue is resolved and how Polkadot keeps block times close to constant, see the [PoS Consensus](/polkadot-protocol/architecture/polkadot-chain/pos-consensus/){target=\_blank} page of this documentation. ## RANDAO An alternative on-chain randomness method is Ethereum's RANDAO, where validators perform thousands of hashes on a seed and publish the final hash during a round. The collective input from all validators forms the random number, and as long as one honest validator participates, the randomness is secure. To enhance security, RANDAO can optionally be combined with a Verifiable Delay Function (VDF), ensuring that randomness can't be predicted or manipulated during computation. For more information about RANDAO, see the [Randomness - RANDAO](https://eth2book.info/capella/part2/building_blocks/randomness/){target=\_blank} section of the Upgrading Ethereum documentation. ## VDFs Verifiable Delay Functions (VDFs) are time-bound computations that, even on parallel computers, take a set amount of time to complete. They produce a unique result that can be quickly verified publicly. When combined with RANDAO, feeding RANDAO's output into a VDF introduces a delay that nullifies an attacker's chance to influence the randomness. However, a VDF likely requires specialized ASIC devices to run separately from standard nodes. !!!warning While only one is needed to secure the system, and they will be open-source and inexpensive, running VDF devices involves significant costs without direct incentives, adding friction for blockchain users. ## Additional Resources For more information about the reasoning for choices made along with proofs, see Polkadot's research on blockchain randomness and sortition in the [Block production](https://research.web3.foundation/Polkadot/protocols/block-production){target=\_blank} entry of the Polkadot Wiki. For a discussion with Web3 Foundation researchers about when and under what conditions Polkadot's randomness can be utilized, see the [Discussion on Randomness used in Polkadot](https://github.com/use-ink/ink/issues/57){target=\_blank} issue on GitHub. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/accounts/ --- BEGIN CONTENT --- --- title: Accounts in Asset Hub Smart Contracts description: Bridges Ethereum's 20-byte addresses with Polkadot's 32-byte accounts, enabling seamless interaction while maintaining compatibility with Ethereum tooling. categories: Basics, Polkadot Protocol --- # Accounts on Asset Hub Smart Contracts !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Asset Hub natively utilizes Polkadot's 32-byte account system while providing interoperability with Ethereum's 20-byte addresses through an automatic conversion system.
When interacting with smart contracts: - Ethereum-compatible wallets (like MetaMask) can use their familiar 20-byte addresses. - Polkadot accounts continue using their native 32-byte format. - The Asset Hub chain automatically handles conversion between the two formats behind the scenes: - 20-byte Ethereum addresses are padded with `0xEE` bytes to create valid 32-byte Polkadot accounts. - 32-byte Polkadot accounts can optionally register a mapping to a 20-byte address for Ethereum compatibility. This dual-format approach enables Asset Hub to maintain compatibility with Ethereum tooling while fully integrating with the Polkadot ecosystem. ## Address Types and Mappings The platform handles two distinct address formats: - [Ethereum-style addresses (20 bytes)](https://ethereum.org/en/developers/docs/accounts/#account-creation){target=\_blank} - [Polkadot native account IDs (32 bytes)](https://wiki.polkadot.network/docs/build-protocol-info#addresses){target=\_blank} ### Ethereum to Polkadot Mapping The [`AccountId32Mapper`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/struct.AccountId32Mapper.html){target=\_blank} implementation in [`pallet_revive`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/index.html){target=\_blank} handles the core address conversion logic. For converting a 20-byte Ethereum address to a 32-byte Polkadot address, the pallet uses a simple concatenation approach: - [**Core mechanism**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.to_fallback_account_id){target=\_blank} : takes a 20-byte Ethereum address and extends it to 32 bytes by adding twelve `0xEE` bytes at the end. The key benefits of this approach are: - Fully reversible, allowing a smooth conversion back to the Ethereum format. - Provides clear identification of Ethereum-controlled accounts through the `0xEE` suffix pattern. - Maintains cryptographic security with a `2^96` difficulty for pattern reproduction. ### Polkadot to Ethereum Mapping The conversion from 32-byte Polkadot accounts to 20-byte Ethereum addresses is more complex than the reverse direction due to the lossy nature of the conversion. The [`AccountId32Mapper`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/struct.AccountId32Mapper.html){target=\_blank} handles this through two distinct approaches: - **For Ethereum-derived accounts** : The system uses the [`is_eth_derived`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/fn.is_eth_derived.html){target=\_blank} function to detect accounts that were originally Ethereum addresses (identified by the `0xEE` suffix pattern). For these accounts, the conversion strips the last 12 bytes to recover the original 20-byte Ethereum address. - **For native Polkadot accounts** : Since these accounts utilize the whole 32-byte space and weren't derived from Ethereum addresses, direct truncation would result in lost information. Instead, the system: 1. Hashes the entire 32-byte account using Keccak-256. 2. Takes the last 20 bytes of the hash to create the Ethereum address. This ensures a deterministic mapping while avoiding simple truncation. The conversion process is implemented through the [`to_address`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.to_address){target=\_blank} function, which automatically detects the account type and applies the appropriate conversion method.
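The following is a minimal, self-contained sketch of the conversion rules just described. It is illustrative only (the authoritative logic lives in `pallet_revive`'s `AccountId32Mapper`), and it assumes the `sha3` crate for Keccak-256.

```rust
// Illustrative sketch only; the authoritative implementation is
// pallet_revive's AccountId32Mapper.
use sha3::{Digest, Keccak256};

/// 20-byte Ethereum address -> 32-byte account ID: append twelve 0xEE bytes.
fn to_fallback_account_id(eth: [u8; 20]) -> [u8; 32] {
    let mut out = [0xEE_u8; 32];
    out[..20].copy_from_slice(&eth);
    out
}

/// An account is "Ethereum-derived" if it carries the 0xEE suffix pattern.
fn is_eth_derived(account: &[u8; 32]) -> bool {
    account[20..] == [0xEE_u8; 12]
}

/// 32-byte account ID -> 20-byte address: strip the suffix for Ethereum-derived
/// accounts, otherwise take the last 20 bytes of the Keccak-256 hash.
fn to_address(account: &[u8; 32]) -> [u8; 20] {
    let mut out = [0u8; 20];
    if is_eth_derived(account) {
        out.copy_from_slice(&account[..20]);
    } else {
        let hash = Keccak256::digest(account);
        out.copy_from_slice(&hash[12..]);
    }
    out
}

fn main() {
    let eth = [0x11_u8; 20];
    let account = to_fallback_account_id(eth);
    // Round trip: an Ethereum-derived account maps back to the original address.
    assert_eq!(to_address(&account), eth);
}
```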
**Stateful Mapping for Reversibility** : Since the conversion from 32-byte to 20-byte addresses is inherently lossy, the system provides an optional stateful mapping through the [`OriginalAccount`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/pallet/storage_types/struct.OriginalAccount.html){target=\_blank} storage. When a Polkadot account registers a mapping (via the [`map`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.map){target=\_blank} function), the system stores the original 32-byte account ID, enabling the [`to_account_id`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.to_account_id){target=\_blank} function to recover the exact original account rather than falling back to a default conversion. ## Account Registration The registration process is implemented through the [`map`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.map){target=\_blank} function. This process involves: - Checking if the account is already mapped. - Calculating and collecting required deposits based on data size. - Storing the address suffix for future reference. - Managing the currency holds for security. ## Fallback Accounts The fallback mechanism is integrated into the [`to_account_id`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.to_account_id){target=\_blank} function. It provides a safety net for address conversion by: - First, attempting to retrieve stored mapping data. - Falling back to the default conversion method if no mapping exists. - Maintaining consistency in address representation. ## Contract Address Generation The system supports two methods for generating contract addresses: - [**CREATE1 method**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/fn.create1.html){target=\_blank}: - Uses the deployer address and nonce. - Generates deterministic addresses for standard contract deployment. - [**CREATE2 method**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/fn.create2.html){target=\_blank}: - Uses the deployer address, initialization code, input data, and salt. - Enables predictable address generation for advanced use cases. ## Security Considerations The address mapping system maintains security through several design choices evident in the implementation: - The stateless mapping requires no privileged operations, as shown in the [`to_fallback_account_id`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/trait.AddressMapper.html#tymethod.to_fallback_account_id){target=\_blank} implementation. - The stateful mapping requires a deposit managed through the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/pallet/trait.Config.html#associatedtype.Currency){target=\_blank} trait. - Mapping operations are protected against common errors through explicit checks. - The system prevents double-mapping through the [`ensure!(!Self::is_mapped(account_id))`](https://github.com/paritytech/polkadot-sdk/blob/stable2412/substrate/frame/revive/src/address.rs#L125){target=\_blank} check. All source code references are from the [`address.rs`](https://github.com/paritytech/polkadot-sdk/blob/stable2412/substrate/frame/revive/src/address.rs){target=\_blank} file in the Revive pallet of the Polkadot SDK repository. 
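To make the fallback behavior described in the Fallback Accounts section concrete, here is a small, hypothetical sketch of the lookup order (stored mapping first, deterministic `0xEE`-padded conversion second). The names and types are illustrative only and do not reflect the actual `pallet_revive` API.

```rust
// Hypothetical illustration of the fallback lookup order; `stored_mapping`
// stands in for the OriginalAccount storage item described above.
fn to_account_id(eth: [u8; 20], stored_mapping: Option<[u8; 32]>) -> [u8; 32] {
    stored_mapping.unwrap_or_else(|| {
        // Fallback: the deterministic conversion, twelve 0xEE bytes appended.
        let mut out = [0xEE_u8; 32];
        out[..20].copy_from_slice(&eth);
        out
    })
}

fn main() {
    let unmapped = to_account_id([0xAB; 20], None);
    assert_eq!(&unmapped[20..], &[0xEE; 12]);
}
```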
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/blocks-transactions-fees/ --- BEGIN CONTENT --- --- title: Blocks, Transactions and Fees for Asset Hub Smart Contracts description: Explore how Asset Hub smart contracts handle blocks, transactions, and fees with EVM compatibility, supporting various Ethereum transaction types. categories: Basics, Polkadot Protocol --- # Blocks, Transactions, and Fees !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Asset Hub smart contracts operate within the Polkadot ecosystem using the [`pallet_revive`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/){target=\_blank} implementation, which provides EVM compatibility. While many aspects of blocks and transactions are inherited from the underlying parachain architecture, there are specific considerations and mechanisms unique to smart contract operations on Asset Hub. ## Smart Contract Blocks Smart contract blocks in Asset Hub follow the same fundamental structure as parachain blocks, inheriting all standard parachain block components. The `pallet_revive` implementation maintains this consistency while adding necessary [EVM-specific features](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm){target=\_blank}. For detailed implementation specifics, the [`Block`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm/struct.Block.html){target=\_blank} struct in `pallet_revive` demonstrates how parachain and smart contract block implementations align. ## Smart Contract Transactions Asset Hub implements a sophisticated transaction system that supports various transaction types and formats, encompassing both traditional parachain operations and EVM-specific interactions. ### EVM Transaction Types The system provides a fundamental [`eth_transact`](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/pallet/dispatchables/fn.eth_transact.html){target=\_blank} interface for processing raw EVM transactions dispatched through [Ethereum JSON-RPC APIs](/develop/smart-contracts/json-rpc-apis/){target=\_blank}. This interface acts as a wrapper for Ethereum transactions, requiring an encoded signed transaction payload, though it cannot be dispatched directly. Building upon this foundation, the system supports multiple transaction formats to accommodate different use cases and optimization needs: - [**Legacy transactions**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm/struct.TransactionLegacyUnsigned.html){target=\_blank} - the original Ethereum transaction format, providing basic transfer and contract interaction capabilities. These transactions use a simple pricing mechanism and are supported for backward compatibility - [**EIP-1559 transactions**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm/struct.Transaction1559Unsigned.html){target=\_blank} - an improved transaction format that introduces a more predictable fee mechanism with base fee and priority fee components. 
This format helps optimize gas fee estimation and network congestion management - [**EIP-2930 transactions**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm/struct.Transaction2930Unsigned.html){target=\_blank} - introduces access lists to optimize gas costs for contract interactions by pre-declaring accessed addresses and storage slots - [**EIP-4844 transactions**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/evm/struct.Transaction4844Unsigned.html){target=\_blank} - implements blob-carrying transactions, designed to optimize Layer 2 scaling solutions by providing dedicated space for roll-up data Each transaction type can exist in both signed and unsigned states, with appropriate validation and processing mechanisms for each. ## Fees and Gas Asset Hub implements a sophisticated resource management system that combines parachain transaction fees with EVM gas mechanics, providing both Ethereum compatibility and enhanced features. ### Gas Model Overview Gas serves as the fundamental unit for measuring computational costs, with each network operation consuming a specified amount. This implementation maintains compatibility with Ethereum's approach while adding parachain-specific optimizations. - **Dynamic gas scaling** - Asset Hub implements a dynamic pricing mechanism that reflects actual execution performance. This results in: - More efficient pricing for computational instructions relative to I/O operations - Better correlation between gas costs and actual resource consumption - Need for developers to implement flexible gas calculation rather than hardcoding values - **Multi-dimensional resource metering** - Asset Hub extends beyond the traditional single-metric gas model to track three distinct resources: - `ref_time` (computation time) - Functions as traditional gas equivalent - Measures actual computational resource usage - Primary metric for basic operation costs - `proof_size` (verification overhead) - Tracks state proof size required for validator verification - Helps manage consensus-related resource consumption - Important for cross-chain operations - `storage_deposit` (state management) - Manages blockchain state growth - Implements a deposit-based system for long-term storage - Refundable when storage is freed These resources can be limited at both transaction and contract levels, similar to Ethereum's gas limits. For more information, check the [Gas Model](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm#gas-model){target=\_blank} section in the [EVM vs PolkaVM](/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/){target=\_blank} article. ### Fee Components - **Base fees** - Storage deposit for contract deployment - Minimum transaction fee for network access - Network maintenance costs - **Execution fees** - Computed based on gas consumption - Converted to native currency using network-defined rates - Reflects actual computational resource usage - **Storage fees** - Deposit for long-term storage usage - Refundable when storage is freed - Helps prevent state bloat ### Gas Calculation and Conversion The system maintains precise conversion mechanisms between: - Substrate weights and EVM gas units - Native currency and gas costs - Different resource metrics within the multi-dimensional model This ensures accurate fee calculation while maintaining compatibility with existing Ethereum tools and workflows. 
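As a practical illustration of this Ethereum compatibility, the sketch below requests a gas estimate through the standard `eth_estimateGas` JSON-RPC method. The endpoint URL and addresses are placeholders, and the `reqwest` (with the `blocking` and `json` features) and `serde_json` crates are assumed.

```rust
// Illustrative only: a standard Ethereum JSON-RPC call against an Asset Hub
// endpoint. Replace INSERT_RPC_URL with a real RPC URL; addresses are placeholders.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let request = json!({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_estimateGas",
        "params": [{
            "from": "0x1111111111111111111111111111111111111111",
            "to":   "0x2222222222222222222222222222222222222222",
            "value": "0x0"
        }]
    });

    let response: serde_json::Value = reqwest::blocking::Client::new()
        .post("INSERT_RPC_URL")
        .json(&request)
        .send()?
        .json()?;

    // The returned estimate is a single gas value that already reflects the
    // underlying multi-dimensional resource accounting.
    println!("estimated gas: {}", response["result"]);
    Ok(())
}
```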
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/evm-vs-polkavm/ --- BEGIN CONTENT --- --- title: EVM vs PolkaVM description: Compares EVM and PolkaVM, highlighting key architectural differences, gas models, memory management, and account handling while ensuring Solidity compatibility. categories: Basics, Polkadot Protocol --- # EVM vs PolkaVM !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction While [PolkaVM](/polkadot-protocol/smart-contract-basics/polkavm-design/){target=\_blank} strives for maximum Ethereum compatibility, several fundamental design decisions create necessary divergences from the [EVM](https://ethereum.org/en/developers/docs/evm/){target=\_blank}. These differences represent trade-offs that enhance performance and resource management while maintaining accessibility for Solidity developers. ## Core Virtual Machine Architecture The most significant departure from Ethereum comes from PolkaVM's foundation itself. Rather than implementing the EVM, PolkaVM utilizes a RISC-V instruction set. For most Solidity developers, this architectural change remains transparent thanks to the [Revive compiler's](https://github.com/paritytech/revive){target=\_blank} complete Solidity support, including inline assembler functionality. ```mermaid graph TD subgraph "Ethereum Path" EthCompile["Standard Solidity Compiler"] --> EVM_Bytecode["EVM Bytecode"] EVM_Bytecode --> EVM["Stack-based EVM"] EVM --> EthExecution["Contract Execution"] end subgraph "PolkaVM Path" ReviveCompile["Revive Compiler"] --> RISCV_Bytecode["RISC-V Format Bytecode"] RISCV_Bytecode --> PolkaVM["RISC-V Based PolkaVM"] PolkaVM --> PolkaExecution["Contract Execution"] end EthExecution -.-> DifferencesNote["Key Differences: - Instruction Set Architecture - Bytecode Format - Runtime Behavior"] PolkaExecution -.-> DifferencesNote ``` However, this architectural difference becomes relevant in specific scenarios. Tools that attempt to download and inspect contract bytecode will fail, as they expect EVM bytecode rather than PolkaVM's RISC-V format. Most applications typically pass bytecode as an opaque blob, making this a non-issue for standard use cases. This primarily affects contracts using [`EXTCODECOPY`](https://www.evm.codes/?fork=cancun#3c){target=\_blank} to manipulate code at runtime. A contract encounters problems specifically when it uses `EXTCODECOPY` to copy contract code into memory and then attempts to mutate it. This pattern is not possible in standard Solidity and requires dropping down to YUL assembly. An example would be a factory contract written in assembly that constructs and instantiates new contracts by generating code at runtime. Such contracts are rare in practice. PolkaVM offers an elegant alternative through its [on-chain constructors](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/pallet/struct.Pallet.html#method.bare_instantiate){target=\_blank}, enabling contract instantiation without runtime code modification, making this pattern unnecessary. This architectural difference also impacts how contract deployment works more broadly, as discussed in the [Contract Deployment](#contract-deployment) section. 
### High-Level Architecture Comparison | Feature | Ethereum Virtual Machine (EVM) | PolkaVM | | :---: | :---: | :---: | | **Instruction Set** | Stack-based architecture | RISC-V instruction set | | **Bytecode Format** | EVM bytecode | RISC-V format | | **Contract Size Limit** | 24KB code size limit | Contract-specific memory limits | | **Compiler** | Solidity Compiler | Revive Compiler | | **Inline Assembly** | Supported | Supported with the compatibility layer | | **Code Introspection** | Supported via [`EXTCODECOPY`](https://www.evm.codes/?fork=cancun#3c){target=\_blank} | Limited support, alternative via on-chain constructors | | **Resource Metering** | Single gas metric | Multi-dimensional | | **Runtime Code Modification** | Supported | Limited, with alternatives | | **Contract Instantiation** | Standard deployment | On-chain constructors for flexible instantiation | ## Gas Model Ethereum's resource model relies on a single metric: [gas](https://ethereum.org/en/developers/docs/gas/#what-is-gas){target=\_blank}, which serves as the universal unit for measuring computational costs. Each operation on the network consumes a specific amount of gas. Most platforms aiming for Ethereum compatibility typically adopt identical gas values to ensure seamless integration. The significant changes PolkaVM makes to Ethereum's gas model are outlined in the following sections. ### Dynamic Gas Value Scaling Instead of adhering to Ethereum's fixed gas values, PolkaVM implements benchmark-based pricing that better reflects its improved execution performance. This makes instructions cheaper relative to I/O-bound operations but requires developers to avoid hardcoding gas values, particularly in cross-contract calls. ### Multi-Dimensional Resource Metering Moving beyond Ethereum's single gas metric, PolkaVM meters three distinct resources: - **`ref_time`** - Equivalent to traditional gas, measuring computation time. - **`proof_size`** - Tracks state proof size for validator verification. - **`storage_deposit`** - Manages state bloat through a deposit system. All three resources can be limited at the transaction level, just like gas on Ethereum. The [Ethereum RPC proxy](https://github.com/paritytech/polkadot-sdk/tree/master/substrate/frame/revive/rpc){target=\_blank} maps all three dimensions into the single gas dimension, ensuring everything behaves as expected for users. These resources can also be limited when making cross-contract calls, which is essential for security when interacting with untrusted contracts. However, Solidity only allows specifying `gas_limit` for cross-contract calls. The `gas_limit` is most similar to Polkadot's `ref_time_limit`, but the Revive compiler doesn't impose any `gas_limit` on cross-contract calls, for two key reasons: - **Semantic differences** - `gas_limit` and `ref_time_limit` are not semantically identical; blindly passing EVM gas as `ref_time_limit` can lead to unexpected behavior. - **Incomplete protection** - The other two resources (`proof_size` and `storage_deposit`) would remain uncapped anyway, making it insufficient to prevent malicious callees from performing DoS attacks. When resources are "uncapped" in cross-contract calls, they remain constrained by transaction-specified limits, preventing abuse of the transaction signer. !!!
note The runtime will provide a special precompile, allowing cross-contract calls with limits specified for all weight dimensions in the future. All gas-related opcodes like [`GAS`](https://www.evm.codes/?fork=cancun#5a){target=\_blank} or [`GAS_LIMIT`](https://www.evm.codes/?fork=cancun#45){target=\_blank} return only the `ref_time` value as it's the closest match to traditional gas. Extended APIs will be provided through precompiles to make full use of all resources, including cross-contract calls with all three resources specified. ## Memory Management The EVM and the PolkaVM take fundamentally different approaches to memory constraints: | Feature | Ethereum Virtual Machine (EVM) | PolkaVM | | :----------------------: | :---------------------------------------: | :--------------------------------------------: | | **Memory Constraints** | Indirect control via gas costs | Hard memory limits per contract | | **Cost Model** | Increasing gas curve with allocation size | Fixed costs separated from execution gas | | **Memory Limits** | Soft limits through prohibitive gas costs | Hard fixed limits per contract | | **Pricing Efficiency** | Potential overcharging for memory | More efficient through separation of concerns | | **Contract Nesting** | Limited by available gas | Limited by constant memory per contract | | **Memory Metering** | Dynamic based on total allocation | Static limits per contract instance | | **Future Improvements** | Incremental gas cost updates | Potential dynamic metering for deeper nesting | | **Cross-Contract Calls** | Handled through gas forwarding | Requires careful boundary limit implementation | The architecture establishes a constant memory limit per contract, which is the basis for calculating maximum contract nesting depth. This calculation assumes worst-case memory usage for each nested contract, resulting in a straightforward but conservative limit that operates independently of actual memory consumption. Future iterations may introduce dynamic memory metering, allowing deeper nesting depths for contracts with smaller memory footprints. However, such an enhancement would require careful implementation of cross-contract boundary limits before API stabilization, as it would introduce an additional resource metric to the system. ### Current Memory Limits The following table depicts memory-related limits at the time of writing: | Limit | Maximum | | :----------------------------------------: | :-------------: | | Call stack depth | 5 | | Event topics | 4 | | Event data payload size (including topics) | 416 bytes | | Storage value size | 416 bytes | | Transient storage variables | 128 uint values | | Immutable variables | 16 uint values | | Contract code blob size | ~100 kilobytes | !!! note Limits might be increased in the future. To guarantee existing contracts work as expected, limits will never be decreased. 
## Account Management - Existential Deposit Ethereum and Polkadot handle account persistence differently, affecting state management and contract interactions: ### Account Management Comparison | Feature | Ethereum Approach | PolkaVM/Polkadot Approach | | :-----------------------: | :---------------------------------------------------: | :----------------------------------------------------: | | **Account Persistence** | Accounts persist indefinitely, even with zero balance | Requires existential deposit (ED) to maintain account | | **Minimum Balance** | None | ED required | | **Account Deletion** | Accounts remain in state | Accounts below ED are automatically deleted | | **Contract Accounts** | Exist indefinitely | Must maintain ED | | **Balance Reporting** | Reports full balance | Reports ED-adjusted balance via Ethereum RPC | | **New Account Transfers** | Standard transfer | Includes ED automatically with extra fee cost | | **Contract-to-Contract** | Direct transfers | ED drawn from transaction signer, not sending contract | | **State Management** | Potential bloat from zero-balance accounts | Optimized with auto-deletion of dust accounts | This difference introduces potential compatibility challenges for Ethereum-based contracts and tools, particularly wallets. To mitigate this, PolkaVM implements several transparent adjustments: - Balance queries via Ethereum RPC automatically deduct the ED, ensuring reported balances match spendable amounts. - Account balance checks through EVM opcodes reflect the ED-adjusted balance. - Transfers to new accounts automatically include the ED (`x + ED`), with the extra cost incorporated into transaction fees. - Contract-to-contract transfers handle ED requirements by: - Drawing ED from the transaction signer instead of the sending contract. - Keeping transfer amounts transparent for contract logic. - Treating ED like other storage deposit costs. This approach ensures that Ethereum contracts work without modifications while maintaining Polkadot's optimized state management. ## Contract Deployment For most users deploying contracts (like ERC-20 tokens), contract deployment works seamlessly without requiring special steps. However, when using advanced patterns like factory contracts that dynamically create other contracts at runtime, you'll need to understand PolkaVM's unique deployment model. In the PolkaVM, contract deployment follows a fundamentally different model from EVM. The EVM allows contracts to be deployed with a single transaction, where the contract code is bundled with the deployment transaction. In contrast, PolkaVM has a different process for contract instantiation. - **Code must be pre-uploaded** - Unlike EVM, where contract code is bundled within the deploying contract, PolkaVM requires all contract bytecode to be uploaded to the chain before instantiation. - **Factory pattern limitations** - The common EVM pattern, where contracts dynamically create other contracts, will fail with a `CodeNotFound` error unless the dependent contract code was previously uploaded. - **Separate upload and instantiation** - This creates a two-step process where developers must first upload all contract code, then instantiate relationships between contracts. This architecture impacts several common EVM patterns and requires developers to adapt their deployment strategies accordingly. 
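A minimal sketch of the factory pattern under this model is shown below. The `Child` and `ChildFactory` contracts are hypothetical, and the sketch assumes the child's code blob has already been uploaded to the chain:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical child contract. On PolkaVM its code blob must already exist
// on-chain before the factory below instantiates it; otherwise instantiation
// fails with a CodeNotFound error.
contract Child {
    address public owner;

    constructor(address _owner) {
        owner = _owner;
    }
}

// Hypothetical factory. The `new` keyword is translated to reference the
// child's code hash rather than embedding its bytecode in the factory.
contract ChildFactory {
    event ChildCreated(address child);

    function createChild() external returns (address) {
        Child child = new Child(msg.sender);
        emit ChildCreated(address(child));
        return address(child);
    }
}
```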
_Factory contracts must be modified to work with pre-uploaded code rather than embedding bytecode_, and runtime code generation is not supported due to PolkaVM's RISC-V bytecode format. The specific behavior of contract creation opcodes is detailed in the [YUL IR Translation](#yul-function-translation-differences) section. When migrating EVM projects to PolkaVM, developers should identify all contracts that will be instantiated at runtime and ensure they are pre-uploaded to the chain before any instantiation attempts. ## Solidity and YUL IR Translation Incompatibilities While PolkaVM maintains high-level compatibility with Solidity, several low-level differences exist in the translation of YUL IR and specific Solidity constructs. These differences are particularly relevant for developers working with assembly code or utilizing advanced contract patterns. ### Contract Code Structure PolkaVM's contract runtime does not differentiate between runtime code and deploy (constructor) code. Instead, both are emitted into a single PolkaVM contract code blob and live on-chain. Therefore, in EVM terminology, the deploy code equals the runtime code. For most standard Solidity contracts, this is transparent. However, if you are analyzing raw bytecode or building tools that expect separate deploy and runtime sections, you'll need to adjust for this unified structure. In the constructor code, the `codesize` instruction returns the call data size instead of the actual code blob size, which differs from standard EVM behavior. If your constructor logic uses `codesize` to inspect the deployed contract's size (e.g., for self-validation or specific deployment patterns), it will return an incorrect value on PolkaVM. Re-evaluate such logic or use alternative methods to achieve your goal. ### Solidity-Specific Differences Solidity constructs behave differently under PolkaVM: - **`address.creationCode`** - Returns the bytecode keccak256 hash instead of the actual creation code, reflecting PolkaVM's hash-based code referencing system. - If your contract relies on `address.creationCode` to verify or interact with the full raw bytecode of a newly deployed contract, this will not work as expected. You will receive a hash, not the code itself. This typically affects highly specialized factory contracts or introspection tools. ### YUL Function Translation Differences The following YUL functions exhibit notable behavioral differences in PolkaVM: - **Memory Operations:** - **`mload`, `mstore`, `msize`, `mcopy`** - PolkaVM preserves memory layout but implements several constraints: - EVM linear heap memory is emulated using a fixed 64KB byte buffer, limiting maximum contract memory usage. - Accessing memory offsets larger than the buffer size traps the contract with an `OutOfBound` error. - Compiler optimizations may eliminate unused memory operations, potentially causing `msize` to differ from EVM behavior. For Solidity developers, the compiler generally handles memory efficiently within this 64KB limit. However, if you are writing low-level YUL assembly and perform direct memory manipulations, you must respect the 64KB buffer limit. Attempting to access memory outside this range will cause your transaction to revert. Be aware that `msize` might not always reflect the exact EVM behavior if compiler optimizations occur.
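To make the 64KB heap constraint concrete, the following minimal sketch (a hypothetical helper, not from this documentation) keeps all manual memory access near the free memory pointer, comfortably inside the buffer:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract ScratchSpace {
    // Hashes two words using scratch space taken from the free memory pointer.
    // The offsets used here stay far below the 64KB heap buffer; pointing
    // `ptr` beyond that buffer would trap the contract with an OutOfBound error.
    function hashPair(bytes32 a, bytes32 b) external pure returns (bytes32 result) {
        assembly {
            let ptr := mload(0x40) // free memory pointer, well within 64KB here
            mstore(ptr, a)
            mstore(add(ptr, 0x20), b)
            result := keccak256(ptr, 0x40)
        }
    }
}
```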
- **Call Data Operations:** - **`calldataload`, `calldatacopy`** - In constructor code, the offset parameter is ignored and these functions always return `0`, diverging from EVM behavior where call data represents constructor arguments. - If your constructor logic in YUL assembly attempts to read constructor arguments using `calldataload` or `calldatacopy` with specific offsets, this will not yield the expected constructor arguments. Instead, these functions will return zeroed values. Standard Solidity constructors are handled correctly by the compiler, but manual YUL assembly for constructor argument parsing will need adjustment. - **Code Operations:** - **`codecopy`** - Only supported within constructor code, reflecting PolkaVM's different approach to code handling and the unified code blob structure. - If your contracts use `codecopy` (e.g., for self-modifying code or inspecting other contracts' runtime bytecode) outside of the constructor, this will not be supported and will likely result in a compile-time error or runtime trap. This implies that patterns like dynamically generating or modifying contract code at runtime are not directly feasible with `codecopy` on PolkaVM. - **Control Flow:** - **`invalid`** - Traps the contract execution but does not consume remaining gas, unlike EVM where it consumes all available gas. - While `invalid` still reverts the transaction, the difference in gas consumption could subtly affect very specific error handling or gas accounting patterns that rely on `invalid` to consume all remaining gas. For most error scenarios, `revert()` is the standard and recommended practice. - **Cross-Contract Calls:** - **`call`, `delegatecall`, `staticcall`** - These functions ignore supplied gas limits and forward all remaining resources due to PolkaVM's multi-dimensional resource model. This creates important security implications: - Contract authors must implement reentrancy protection since gas stipends don't provide protection. - The compiler detects `address payable.{send,transfer}` patterns and disables call reentrancy as a protective heuristic. - Using `address payable.{send,transfer}` is already deprecated; PolkaVM will provide dedicated precompiles for safe balance transfers. The traditional EVM pattern of limiting gas in cross-contract calls (especially with the 2300 gas stipend for send/transfer) does not provide reentrancy protection on PolkaVM. Developers must explicitly implement reentrancy guards (e.g., using a reentrancy lock mutex) in their Solidity code when making external calls to untrusted contracts. Relying on gas limits alone for reentrancy prevention is unsafe and will lead to vulnerabilities on PolkaVM (a minimal guard sketch is shown below). !!! warning The 2300 gas stipend that is provided by solc for `address payable.{send, transfer}` calls offers no reentrancy protection in PolkaVM. While the compiler attempts to detect and mitigate this pattern, developers should avoid these deprecated functions. - **Contract Creation:** - **`create`, `create2`** - Contract instantiation works fundamentally differently in PolkaVM. Instead of supplying deploy code concatenated with constructor arguments, the runtime expects: 1. A buffer containing the code hash to deploy. 2. The constructor arguments buffer. PolkaVM translates `dataoffset` and `datasize` instructions to handle contract hashes instead of contract code, enabling seamless use of the `new` keyword in Solidity. However, this translation may fail for contracts creating other contracts within `assembly` blocks.
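Here is the reentrancy point above made concrete before continuing with contract creation. The guard below is a minimal, hypothetical sketch; libraries such as OpenZeppelin's `ReentrancyGuard` provide an equivalent modifier:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract GuardedVault {
    mapping(address => uint256) public balances;
    bool private locked;

    // Explicit reentrancy lock: on PolkaVM, gas stipends and per-call gas
    // limits do not stop the callee from re-entering this contract.
    modifier nonReentrant() {
        require(!locked, "reentrant call");
        locked = true;
        _;
        locked = false;
    }

    function deposit() external payable {
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint256 amount) external nonReentrant {
        require(balances[msg.sender] >= amount, "insufficient balance");
        balances[msg.sender] -= amount;

        // Effects are applied before the interaction, and the nonReentrant
        // modifier blocks re-entry while the external call is in flight.
        (bool ok, ) = msg.sender.call{value: amount}("");
        require(ok, "transfer failed");
    }
}
```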
If you use the Solidity `new` keyword to deploy contracts, the Revive compiler handles this transparently. However, if you are creating contracts manually in YUL assembly using `create` or `create2` opcodes, you must provide the code hash of the contract to be deployed, not its raw bytecode. Attempting to pass raw bytecode will fail. This fundamentally changes how manual contract creation is performed in assembly. !!! warning Avoid using `create` family opcodes for manual deployment crafting in `assembly` blocks. This pattern is discouraged due to translation complexity and offers no gas savings benefits in PolkaVM. - **Data Operations:** - **`dataoffset`** - Returns the contract hash instead of code offset, aligning with PolkaVM's hash-based code referencing. - **`datasize`** - Returns the constant contract hash size (32 bytes) rather than variable code size. These changes are primarily relevant for low-level YUL assembly developers who are trying to inspect or manipulate contract code directly. `dataoffset` will provide a hash, not a memory offset to the code, and `datasize` will always be 32 bytes (the size of a hash). This reinforces that direct manipulation of contract bytecode at runtime, as might be done in some EVM patterns, is not supported. - **Resource Queries:** - **`gas`, `gaslimit`** - Return only the `ref_time` component of PolkaVM's multi-dimensional weight system, providing the closest analog to traditional gas measurements. - While `gas` and `gaslimit` still provide a useful metric, consider that they represent `ref_time` (computation time) only. If your contract logic depends on precise knowledge of other resource costs (like `proof_size` or `storage_deposit`), you won't get that information from these opcodes. You'll need to use future precompiles for full multi-dimensional resource queries. - **Blockchain State:** - **`prevrandao`, `difficulty`** - Both translate to a constant value of `2500000000000000`, as PolkaVM doesn't implement Ethereum's difficulty adjustment or randomness mechanisms. - If your Solidity contract relies on `block.difficulty` (or its equivalent YUL opcode `difficulty`) for randomness generation or any logic tied to Ethereum's proof-of-work difficulty, this will not provide true randomness on PolkaVM. The value will always be constant. Developers needing on-chain randomness should utilize Polkadot's native randomness sources or dedicated VRF (Verifiable Random Function) solutions if available. ### Unsupported Operations Several EVM operations are not supported in PolkaVM and produce compile-time errors: - **`pc`, `extcodecopy`** - These operations are EVM-specific and have no equivalent functionality in PolkaVM's RISC-V architecture. - Any Solidity contracts that utilize inline assembly to interact with `pc` (program counter) or `extcodecopy` will fail to compile or behave unexpectedly. This means patterns involving introspection of the current execution location or copying external contract bytecode at runtime are not supported. - **`blobhash`, `blobbasefee`** - Related to Ethereum's rollup model and blob data handling, these operations are unnecessary given Polkadot's superior rollup architecture. - If you are porting contracts designed for Ethereum's EIP-4844 (proto-danksharding) and rely on these blob-related opcodes, they will not be available on PolkaVM. - **`extcodecopy`, `selfdestruct`** - These deprecated operations are not supported and generate compile-time errors. 
- The `selfdestruct` opcode, which allowed contracts to remove themselves from the blockchain, is not supported. Contracts cannot be self-destroyed on PolkaVM. This affects contract upgradeability patterns that rely on self-destruction and redeployment. Similarly, `extcodecopy` is unsupported, impacting contracts that intend to inspect or copy the bytecode of other deployed contracts. ### Compilation Pipeline Considerations PolkaVM processes YUL IR exclusively, meaning all contracts exhibit behavior consistent with Solidity's `via-ir` compilation mode. Developers familiar with the legacy compilation pipeline should expect [IR-based codegen behavior](https://docs.soliditylang.org/en/latest/ir-breaking-changes.html){target=\_blank} when working with PolkaVM contracts. If you've previously worked with older Solidity compilers that did not use the `via-ir` pipeline by default, you might observe subtle differences in compiled bytecode size or gas usage. It's recommended to familiarize yourself with Solidity's IR-based codegen behavior, as this is the standard for PolkaVM. ### Memory Pointer Limitations YUL functions accepting memory buffer offset pointers or size arguments are limited by PolkaVM's 32-bit pointer size. Supplying values above `2^32-1` will trap the contract immediately. The Solidity compiler typically generates valid memory references, making this primarily a concern for low-level assembly code. For standard Solidity development, this limitation is unlikely to be hit as the compiler handles memory addresses correctly within typical contract sizes. However, if you are writing extremely large contracts using YUL assembly that manually and extensively manipulate memory addresses, ensure that your memory offsets and sizes do not exceed PolkaVM's **fixed 64KB memory limit per contract**. While the YUL functions might accept 32-bit pointers (up to 2^32-1), attempting to access memory beyond the allocated 64KB buffer will trap the contract immediately. These incompatibilities reflect the fundamental architectural differences between EVM and PolkaVM while maintaining high-level Solidity compatibility. Most developers using standard Solidity patterns will encounter no issues, but those working with assembly code or advanced contract patterns should carefully review these differences during migration. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/networks/ --- BEGIN CONTENT --- --- title: Networks for Polkadot Hub Smart Contracts description: Explore the available networks for smart contract development on Polkadot Hub, including Westend Hub, Kusama Hub, and Polkadot Hub. categories: Basics, Polkadot Protocol --- # Networks !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Polkadot Hub provides smart contract functionality across multiple networks to facilitate smart contract development in the Polkadot ecosystem. Whether you're testing new contracts or deploying to production, Polkadot Hub offers several network environments tailored for each stage of development. Developers can thoroughly test, iterate, and validate their smart contracts from local testing environments to production networks like Polkadot Hub. This guide will introduce you to the current and upcoming networks available for smart contract development and explain how they fit into the development workflow. 
## Network Overview Smart contract development on Polkadot Hub follows a structured process to ensure rigorous testing of new contracts and upgrades before deployment on production networks. Development progresses through a well-defined path, beginning with local environments, advancing through TestNets, and ultimately reaching MainNets. The diagram below illustrates this progression: ``` mermaid flowchart LR id1[Local Polkadot Hub] --> id2[TestNet Polkadot Hub] --> id4[MainNet Polkadot Hub] ``` This progression ensures developers can thoroughly test and iterate their smart contracts without risking real tokens or affecting production networks. A typical development journey consists of three main stages: 1. **Local Development** - Developers start in a local environment to create, test, and iterate on smart contracts - Provides rapid experimentation in an isolated setup without external dependencies 2. **TestNet Development** - Contracts move to TestNets like Westend Hub and Passet Hub - Enables testing in simulated real-world conditions without using real tokens 3. **Production Deployment** - Final deployment to MainNets like Kusama Hub and Polkadot Hub - Represents the live environment where contracts interact with real economic value ## Local Development The local development environment is crucial for smart contract development on Polkadot Hub. It provides developers a controlled space for rapid testing and iteration before moving to public networks. The local setup consists of several key components: - [**Kitchensink node**](https://paritytech.github.io/polkadot-sdk/master/kitchensink_runtime/index.html){target=\_blank} - a local node that can be run for development and testing. It includes logging capabilities for debugging contract execution and provides a pre-configured development environment with pre-funded accounts for testing purposes - [**Ethereum RPC proxy**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive_eth_rpc/index.html){target=\_blank} - bridges Ethereum-compatible tools with the Polkadot SDK-based network. It enables seamless integration with popular development tools like MetaMask and Remix IDE. The purpose of this component is to translate Ethereum RPC calls into Substrate format ## Test Networks The following test networks provide controlled environments for testing smart contracts. TestNet tokens are available from the [Polkadot faucet](https://faucet.polkadot.io/){target=\_blank}. They provide a stable environment for testing your contracts without using real tokens. ``` mermaid flowchart TB id1[Polkadot Hub TestNets] --> id2[Passet Hub] id1[Polkadot Hub TestNets] --> id3[Westend Hub] ``` ### Passet Hub The Passet Hub will be a community-managed TestNet designed specifically for smart contract development. It will mirror Asset Hub's runtime and provide developers with an additional environment for testing their contracts before deployment to production networks. ### Westend Hub Westend Hub is the TestNet for smart contract development and its cutting-edge features. The network maintains the same features and capabilities as the production Polkadot Hub, and also incorporates the latest features developed by core developers. ## Production Networks The MainNet environments represent the final destination for thoroughly tested and validated smart contracts, where they operate with real economic value and serve actual users. 
``` mermaid flowchart TB id1[Polkadot Hub MainNets] --> id2[Polkadot Hub] id1[Polkadot Hub MainNets] --> id3[Kusama Hub] ``` ### Polkadot Hub Polkadot Hub is the primary production network for deploying smart contracts in the Polkadot ecosystem. It provides a secure and stable environment for running smart contracts with real economic value. The network supports PolkaVM-compatible contracts written in Solidity or Rust, maintaining compatibility with Ethereum-based development tools. ### Kusama Hub Kusama Hub is the canary version of Polkadot Hub. It is designed for developers who want to move quickly and test their smart contracts in a real-world environment with economic incentives. It provides a more flexible space for innovation while maintaining the same core functionality as Polkadot Hub. --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/overview/ --- BEGIN CONTENT --- --- title: Smart Contracts Basics Overview description: Learn how developers can build smart contracts on Polkadot by leveraging either Wasm/ink! or EVM contracts across many parachains. categories: Basics, Polkadot Protocol --- # An Overview of the Smart Contract Landscape on Polkadot !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Polkadot is designed to support an ecosystem of parachains, rather than hosting smart contracts directly. Developers aiming to build smart contract applications on Polkadot rely on parachains within the ecosystem that provide smart contract functionality. This guide outlines the primary approaches to developing smart contracts in the Polkadot ecosystem: - **PolkaVM-compatible contracts** - which support Solidity and any language that compiles down to RISC-V while maintaining compatibility with Ethereum based tools - **EVM-compatible contracts** - which support languages like [Solidity](https://soliditylang.org/){target=\_blank} and [Vyper](https://vyperlang.org/){target=\_blank}, offering compatibility with popular Ethereum tools and wallets - **Wasm-based smart contracts** - using [ink!](https://use.ink/){target=\_blank}, a Rust-based embedded domain-specific language (eDSL), enabling developers to leverage Rust’s safety and tooling You'll explore the key differences between these development paths, along with considerations for parachain developers integrating smart contract functionality. !!!note "Parachain Developer?" If you are a parachain developer looking to add smart contract functionality to your chain, please refer to the [Add Smart Contract Functionality](/develop/parachains/customize-parachain/add-smart-contract-functionality/){target=\_blank} page, which covers both Wasm and EVM-based contract implementations. ## Smart Contracts Versus Parachains A smart contract is a program that executes specific logic isolated to the chain on which it is being executed. All the logic executed is bound to the same state transition rules determined by the underlying virtual machine (VM). Consequently, smart contracts are more streamlined to develop, and programs can easily interact with each other through similar interfaces. ``` mermaid flowchart LR subgraph A[Chain State] direction LR B["Program Logic and Storage
(Smart Contract)"] C["Tx Relevant Storage"] end A --> D[[Virtual Machine]] E[Transaction] --> D D --> F[(New State)] D --> G[Execution Logs] style A fill:#ffffff,stroke:#000000,stroke-width:1px ``` In addition, because smart contracts are programs that execute on top of existing chains, teams don't have to think about the underlying consensus they are built on. These strengths do come with certain limitations. Some smart contracts environments, like EVM, tend to be immutable by default. Developers have developed different [proxy strategies](https://blog.openzeppelin.com/proxy-patterns){target=\_blank} to be able to upgrade smart contracts over time. The typical pattern relies on a proxy contract which holds the program storage forwarding a call to an implementation contract where the execution logic resides. Smart contract upgrades require changing the implementation contract while retaining the same storage structure, necessitating careful planning. Another downside is that smart contracts often follow a gas metering model, where program execution is associated with a given unit and a marketplace is set up to pay for such an execution unit. This fee system is often very rigid, and some complex flows, like account abstraction, have been developed to circumvent this problem. In contrast, parachains can create their own custom logics (known as pallets or modules), and combine them as the state transition function (STF or runtime) thanks to the modularity provided by the [Polkadot-SDK](https://github.com/paritytech/polkadot-sdk/){target=\_blank}. The different pallets within the parachain runtime can give developers a lot of flexibility when building applications on top of it. ``` mermaid flowchart LR A[(Chain State)] --> B[["STF
[Pallet 1]
[Pallet 2]
...
[Pallet N]"]] C[Transaction
Targeting Pallet 2] --> B B --> E[(New State)] B --> F[Execution Logs] ``` Parachains inherently offer features such as logic upgradeability, flexible transaction fee mechanisms, and chain abstraction logic. More so, by using Polkadot, parachains can benefit from robust consensus guarantees with little engineering overhead. To read more about the differences between smart contracts and parachain runtimes, see the [Runtime vs. Smart Contracts](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/runtime_vs_smart_contract/index.html){target=\_blank} section of the Polkadot SDK Rust docs. For a more in-depth discussion about choosing between runtime development and smart contract development, see the Stack Overflow post on [building a Polkadot SDK runtime versus a smart contract](https://stackoverflow.com/a/56041305){target=\_blank}. ## Building a Smart Contract The Polkadot SDK supports multiple smart contract execution environments: - **PolkaVM** - a cutting-edge virtual machine tailored to optimize smart contract execution on Polkadot. Unlike traditional EVMs, PolkaVM is built with a [RISC-V-based register architecture](https://en.wikipedia.org/wiki/RISC-V){target=\_blank} for increased performance and scalability - **EVM** - through [Frontier](https://github.com/polkadot-evm/frontier){target=\_blank}. It consists of a full Ethereum JSON RPC compatible client, an Ethereum emulation layer, and a [Rust-based EVM](https://github.com/rust-ethereum/evm){target=\_blank}. This is used by chains like [Acala](https://acala.network/){target=\_blank}, [Astar](https://astar.network/){target=\_blank}, [Moonbeam](https://moonbeam.network){target=\_blank} and more - **Wasm** - [ink!](https://use.ink/){target=\_blank} is a domain-specific language (DSL) for Rust smart contract development that uses the [Contracts pallet](https://github.com/paritytech/polkadot-sdk/blob/master/substrate/frame/contracts/){target=\_blank} with [`cargo-contract`](https://github.com/use-ink/cargo-contract){target=\_blank} serving as the compiler to WebAssembly. Wasm contracts can be used by chains like [Astar](https://astar.network/){target=\_blank} ### PolkaVM Contracts A component of the Asset Hub parachain, PolkaVM helps enable the deployment of Solidity-based smart contracts directly on Asset Hub. Learn more about how this cutting edge virtual machine facilitates using familiar Ethereum-compatible contracts and tools with Asset Hub by visiting the [Native Smart Contracts](/develop/smart-contracts/overview#native-smart-contracts){target=\_blank} guide. ### EVM Contracts The [Frontier](https://github.com/polkadot-evm/frontier){target=\_blank} project provides a set of modules that enables a Polkadot SDK-based chain to run an Ethereum emulation layer that allows the execution of EVM smart contracts natively with the same API/RPC interface. [Ethereum addresses (ECDSA)](https://ethereum.org/en/glossary/#address){target=\_blank} can also be mapped directly to and from the Polkadot SDK's SS58 scheme from existing accounts. Moreover, you can modify Polkadot SDK to use the ECDSA signature scheme directly to avoid any mapping. 
At a high level, [Frontier](https://github.com/polkadot-evm/frontier){target=\_blank} is composed of three main components: - [**Ethereum Client**](https://github.com/polkadot-evm/frontier/tree/master/client){target=\_blank} - an Ethereum JSON RPC compliant client that allows any request coming from an Ethereum tool, such as [Remix](https://remix.ethereum.org/){target=\_blank}, [Hardhat](https://hardhat.org/){target=\_blank} or [Foundry](https://getfoundry.sh/){target=\_blank}, to be admitted by the network - [**Pallet Ethereum**](https://docs.rs/pallet-ethereum/latest/pallet_ethereum/){target=\_blank} - a block emulation and Ethereum transaction validation layer that works jointly with the Ethereum client to ensure compatibility with Ethereum tools - [**Pallet EVM**](https://docs.rs/pallet-evm/latest/pallet_evm/){target=\_blank} - access layer to the [Rust-based EVM](https://github.com/rust-ethereum/evm){target=\_blank}, enabling the execution of EVM smart contract logic natively The following diagram illustrates a high-level overview of the path an EVM transaction follows when using this configuration: ``` mermaid flowchart TD A[Users and Devs] -->|Send Tx| B[Frontier RPC Ext] subgraph C[Pallet Ethereum] D[Validate Tx] E[Send
Valid Tx] end B -->|Interact with| C D --> E subgraph F[Pallet EVM] G[Rust EVM] end I[(Current EVM
Emulated State)] H[Smart Contract
Solidity, Vyper...] <-->|Compiled to EVM
Bytecode| I C --> F I --> F F --> J[(New Ethereum
Emulated State)] F --> K[Execution Logs] style C fill:#ffffff,stroke:#000000,stroke-width:1px style F fill:#ffffff,stroke:#000000,stroke-width:1px ``` Although it seems complex, users and developers are abstracted of that complexity, and tools can easily interact with the parachain as they would with any other Ethereum-compatible environment. The Rust EVM is capable of executing regular [EVM bytecode](https://www.ethervm.io/){target=\_blank}. Consequently, any language that compiles to EVM bytecode can be used to create programs that the parachain can execute. ### Wasm Contracts The [`pallet_contracts`](https://docs.rs/pallet-contracts/latest/pallet_contracts/index.html#contracts-pallet){target=\_blank} provides the execution environment for Wasm-based smart contracts. Consequently, any smart contract language that compiles to Wasm can be executed in a parachain that enables this module. At the time of writing there are two main languages that can be used for Wasm programs: - [**ink!**](https://use.ink/){target=\_blank} - a Rust-based language that compiles to Wasm. It allows developers to inherit all its safety guarantees and use normal Rust tooling, being the dedicated domain-specific language - **Solidity** - can be compiled to Wasm via the [Solang](https://github.com/hyperledger-solang/solang/){target=\_blank} compiler. Consequently, developers can write Solidity 0.8 smart contracts that can be executed as Wasm programs in parachains The following diagram illustrates a high-level overview of the path a transaction follows when using [`pallet_contracts`](https://docs.rs/pallet-contracts/latest/pallet_contracts/index.html#contracts-pallet){target=\_blank}: ``` mermaid flowchart TD subgraph A[Wasm Bytecode API] C[Pallet Contracts] end B[Users and Devs] -- Interact with ---> A D[(Current State)] E[Smart Contract
ink!, Solidity...] <-->|Compiled to Wasm
Bytecode| D D --> A A --> F[(New State)] A --> G[Execution Logs] style A fill:#ffffff,stroke:#000000,stroke-width:1px ``` --- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/smart-contract-basics/polkavm-design/ --- BEGIN CONTENT --- --- title: PolkaVM Design description: Discover PolkaVM, a high-performance smart contract VM for Polkadot, enabling Ethereum compatibility via pallet_revive, Solidity support & optimized execution. categories: Basics, Polkadot Protocol --- # PolkaVM Design !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction The Asset Hub smart contracts solution includes multiple components to ensure Ethereum compatibility and high performance. Its architecture allows for integration with current Ethereum tools, while its innovative virtual machine design enhances performance characteristics. ## PolkaVM [**PolkaVM**](https://github.com/paritytech/polkavm){target=\_blank} is a custom virtual machine optimized for performance with [RISC-V-based](https://en.wikipedia.org/wiki/RISC-V){target=\_blank} architecture, supporting Solidity and additional high-performance languages. It serves as the core execution environment, integrated directly within the runtime. It features: - An efficient interpreter for immediate code execution - A planned JIT compiler for optimized performance - Dual-mode execution capability, allowing selection of the most appropriate backend for specific workloads - Optimized performance for short-running contract calls through the interpreter The interpreter remains particularly beneficial for contracts with minimal code execution, as it eliminates JIT compilation overhead and enables immediate code execution through lazy interpretation. ## Architecture The smart contract solution consists of the following key components that work together to enable Ethereum compatibility on Polkadot-based chains: ### Pallet Revive [**`pallet_revive`**](https://paritytech.github.io/polkadot-sdk/master/pallet_revive/index.html){target=\_blank} is a runtime module that executes smart contracts by adding extrinsics, runtime APIs, and logic to convert Ethereum-style transactions into formats compatible with Polkadot SDK-based blockchains. It processes Ethereum-style transactions through the following workflow: ```mermaid sequenceDiagram participant User as User/dApp participant Proxy as Ethereum JSON RPC Proxy participant Chain as Blockchain Node participant Pallet as pallet_revive User->>Proxy: Submit Ethereum Transaction Proxy->>Chain: Repackage as Polkadot Compatible Transaction Chain->>Pallet: Process Transaction Pallet->>Pallet: Decode Ethereum Transaction Pallet->>Pallet: Execute Contract via PolkaVM Pallet->>Chain: Return Results Chain->>Proxy: Forward Results Proxy->>User: Return Ethereum-compatible Response ``` This proxy-based approach eliminates the need for node binary modifications, maintaining compatibility across different client implementations. Preserving the original Ethereum transaction payload simplifies adapting existing tools, which can continue processing familiar transaction formats. 
### PolkaVM Design Fundamentals PolkaVM introduces two fundamental architectural differences compared to the Ethereum Virtual Machine (EVM): ```mermaid flowchart TB subgraph "EVM Architecture" EVMStack[Stack-Based] EVM256[256-bit Word Size] end subgraph "PolkaVM Architecture" PVMReg[Register-Based] PVM64[64-bit Word Size] end ``` - **Register-based design** - PolkaVM utilizes a RISC-V register-based approach. This design: - Employs a finite set of registers for argument passing instead of an infinite stack - Facilitates efficient translation to underlying hardware architectures - Optimizes register allocation through careful register count selection - Enables simple 1:1 mapping to x86-64 instruction sets - Reduces compilation complexity through strategic register limitation - Improves overall execution performance through hardware-aligned design - **64-bit word size** - PolkaVM operates with a 64-bit word size as follows: - Enables direct hardware-supported arithmetic operations - Maintains compatibility with Solidity's 256-bit operations through YUL translation - Allows integration of performance-critical components written in lower-level languages - Optimizes computation-intensive operations through native word size alignment - Reduces overhead for operations not requiring extended precision - Facilitates efficient integration with modern CPU architectures ## Compilation Process When compiling a Solidity smart contract, the code passes through the following stages: ```mermaid flowchart LR Dev[Developer] --> |Solidity\nSource\nCode| Solc subgraph "Compilation Process" direction LR Solc[solc] --> |YUL\nIR| Revive Revive[Revive Compiler] --> |LLVM\nIR| LLVM LLVM[LLVM\nOptimizer] --> |RISC-V ELF\nShared Object| PVMLinker end PVMLinker[PVM Linker] --> PVM[PVM Blob\nwith Metadata] ``` The compilation process integrates several specialized components: 1. **Solc** - the standard Ethereum Solidity compiler that translates Solidity source code to [YUL IR](https://docs.soliditylang.org/en/latest/yul.html){target=\_blank} 2. **Revive Compiler** - takes YUL IR and transforms it to [LLVM IR](https://llvm.org/){target=\_blank} 3. **LLVM** - a compiler infrastructure that optimizes the code and generates RISC-V ELF objects 4. **PVM linker** - links the RISC-V ELF object into a final PolkaVM blob with metadata --- END CONTENT --- ## Reference Concepts [shared: true] The following section contains reference material for Polkadot. While it may not be required for all use cases, it offers a deeper technical layer for advanced development work. --- ## List of shared concept pages: ## Full content for shared concepts: Doc-Content: https://docs.polkadot.com/develop/interoperability/xcm-config/ --- BEGIN CONTENT --- --- title: XCM Config description: Learn how the XCM Executor configuration works for your custom Polkadot SDK-based runtime with detailed guidance and references. categories: Reference, Polkadot Protocol --- # XCM Config ## Introduction The [XCM executor](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/index.html){target=\_blank} is a crucial component responsible for interpreting and executing XCM messages (XCMs) with Polkadot SDK-based chains. It processes and manages XCM instructions, ensuring they are executed correctly and in sequentially. 
Adhering to the [Cross-Consensus Virtual Machine (XCVM) specification](https://paritytech.github.io/xcm-docs/overview/xcvm.html#the-xcvm){target=\_blank}, the XCM executor can be customized or replaced with an alternative that also complies with the [XCVM standards](https://github.com/polkadot-fellows/xcm-format?tab=readme-ov-file#12-the-xcvm){target=\_blank}. The `XcmExecutor` is not a pallet but a struct parameterized by a `Config` trait. The `Config` trait is the inner configuration, parameterizing the outer `XcmExecutor` struct. Both configurations are set up within the runtime. The executor is highly configurable, with the [XCM builder](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/index.html){target=\_blank} offering building blocks to tailor the configuration to specific needs. While they serve as a foundation, users can easily create custom blocks to suit unique configurations. Users can also create their building blocks to address unique needs. This article examines the XCM configuration process, explains each configurable item, and provides examples of the tools and types available to help customize these settings. ## XCM Executor Configuration The `Config` trait defines the XCM executor’s configuration, which requires several associated types. Each type has specific trait bounds that the concrete implementation must fulfill. Some types, such as `RuntimeCall`, come with a default implementation in most cases, while others use the unit type `()` as the default. For many of these types, selecting the appropriate implementation carefully is crucial. Predefined solutions and building blocks can be adapted to your specific needs. These solutions can be found in the [`xcm-builder`](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/polkadot/xcm/xcm-builder){target=\_blank} folder. Each type is explained below, along with an overview of some of its implementations: ```rust pub trait Config { type RuntimeCall: Parameter + Dispatchable + GetDispatchInfo; type XcmSender: SendXcm; type AssetTransactor: TransactAsset; type OriginConverter: ConvertOrigin<::RuntimeOrigin>; type IsReserve: ContainsPair; type IsTeleporter: ContainsPair; type Aliasers: ContainsPair; type UniversalLocation: Get; type Barrier: ShouldExecute; type Weigher: WeightBounds; type Trader: WeightTrader; type ResponseHandler: OnResponse; type AssetTrap: DropAssets; type AssetClaims: ClaimAssets; type AssetLocker: AssetLock; type AssetExchanger: AssetExchange; type SubscriptionService: VersionChangeNotifier; type PalletInstancesInfo: PalletsInfoAccess; type MaxAssetsIntoHolding: Get; type FeeManager: FeeManager; type MessageExporter: ExportXcm; type UniversalAliases: Contains<(MultiLocation, Junction)>; type CallDispatcher: CallDispatcher; type SafeCallFilter: Contains; type TransactionalProcessor: ProcessTransaction; type HrmpNewChannelOpenRequestHandler: HandleHrmpNewChannelOpenRequest; type HrmpChannelAcceptedHandler: HandleHrmpChannelAccepted; type HrmpChannelClosingHandler: HandleHrmpChannelClosing; type XcmRecorder: RecordXcm; } ``` ## Config Items Each configuration item is explained below, detailing the associated type’s purpose and role in the XCM executor. Many of these types have predefined solutions available in the `xcm-builder`. 
Therefore, the available configuration items are: - [**`RuntimeCall`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.RuntimeCall){target=\_blank} - defines the runtime's callable functions, created via the [`frame::runtime`](https://paritytech.github.io/polkadot-sdk/master/frame_support/attr.runtime.html){target=\_blank} macro. It represents an enum listing the callable functions of all implemented pallets ```rust type RuntimeCall: Parameter + Dispatchable + GetDispatchInfo ``` The associated traits signify: - `Parameter` - ensures the type is encodable, decodable, and usable as a parameter - `Dispatchable` - indicates it can be executed in the runtime - `GetDispatchInfo` - provides weight details, determining how long execution takes - [**`XcmSender`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.XcmSender){target=\_blank} - implements the [`SendXcm`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v4/trait.SendXcm.html){target=\_blank} trait, specifying how the executor sends XCMs using transport layers (e.g., UMP for relay chains or XCMP for sibling chains). If a runtime lacks certain transport layers, such as [HRMP](https://wiki.polkadot.network/learn/learn-xcm-transport/#hrmp-xcmp-lite){target=\_blank} (or [XCMP](https://wiki.polkadot.network/learn/learn-xcm-transport/#xcmp-cross-consensus-message-passing-design-summary){target=\_blank}) ```rust type XcmSender: SendXcm; ``` - [**`AssetTransactor`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.AssetTransactor){target=\_blank} - implements the [`TransactAsset`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.TransactAsset.html){target=\_blank} trait, handling the conversion and transfer of MultiAssets between accounts or registers. It can be configured to support native tokens, fungibles, and non-fungibles or multiple tokens using pre-defined adapters like [`FungibleAdapter`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.FungibleAdapter.html){target=\_blank} or custom solutions ```rust type AssetTransactor: TransactAsset; ``` - [**`OriginConverter`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.OriginConverter){target=\_blank} - implements the [`ConvertOrigin`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.ConvertOrigin.html){target=\_blank} trait to map `MultiLocation` origins to `RuntimeOrigin`. Multiple implementations can be combined, and [`OriginKind`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/test_utils/enum.OriginKind.html){target=\_blank} is used to resolve conflicts. Pre-defined converters like [`SovereignSignedViaLocation`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.SovereignSignedViaLocation.html){target=\_blank} and [`SignedAccountId32AsNative`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.SignedAccountId32AsNative.html){target=\_blank} handle sovereign and local accounts respectively ```rust type OriginConverter: ConvertOrigin<::RuntimeOrigin>; ``` - [**`IsReserve`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.IsReserve){target=\_blank} - specifies trusted `` pairs for depositing reserve assets. 
Using the unit type `()` blocks reserve deposits. The [`NativeAsset`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.NativeAsset.html){target=\_blank} struct is an example of a reserve implementation ```rust type IsReserve: ContainsPair; ``` - [**`IsTeleporter`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.IsTeleporter){target=\_blank} - defines trusted `` pairs for teleporting assets to the chain. Using `()` blocks the [`ReceiveTeleportedAssets`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/test_utils/enum.Instruction.html#variant.ReceiveTeleportedAsset){target=\_blank} instruction. The [`NativeAsset`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.NativeAsset.html){target=\_blank} struct can act as an implementation ```rust type IsTeleporter: ContainsPair; ``` - [**`Aliasers`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.Aliasers){target=\_blank} - a list of `(Origin, Target)` pairs enabling each `Origin` to be replaced with its corresponding `Target` ```rust type Aliasers: ContainsPair; ``` - [**`UniversalLocation`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.UniversalLocation){target=\_blank} - specifies the runtime's location in the consensus universe ```rust type UniversalLocation: Get; ``` - Some examples are: - `X1(GlobalConsensus(NetworkId::Polkadot))` for Polkadot - `X1(GlobalConsensus(NetworkId::Kusama))` for Kusama - `X2(GlobalConsensus(NetworkId::Polkadot), Parachain(1000))` for Statemint - [**`Barrier`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.Barrier){target=\_blank} - implements the [`ShouldExecute`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.ShouldExecute.html){target=\_blank} trait, functioning as a firewall for XCM execution. Multiple barriers can be combined in a tuple, where execution halts if one succeeds ```rust type Barrier: ShouldExecute; ``` - [**`Weigher`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.Weigher){target=\_blank} - calculates the weight of XCMs and instructions, enforcing limits and refunding unused weight. Common solutions include [`FixedWeightBounds`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.FixedWeightBounds.html){target=\_blank}, which uses a base weight and limits on instructions ```rust type Weigher: WeightBounds; ``` - [**`Trader`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.Trader){target=\_blank} - manages asset-based weight purchases and refunds for `BuyExecution` instructions. The [`UsingComponents`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_builder/struct.UsingComponents.html){target=\_blank} trader is a common implementation ```rust type Trader: WeightTrader; ``` - [**`ResponseHandler`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.ResponseHandler){target=\_blank} - handles `QueryResponse` instructions, implementing the [`OnResponse`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.OnResponse.html){target=\_blank} trait. 
FRAME systems typically use the pallet-xcm implementation ```rust type ResponseHandler: OnResponse; ``` - [**`AssetTrap`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.AssetTrap){target=\_blank} - handles leftover assets in the holding register after XCM execution, allowing them to be claimed via `ClaimAsset`. If unsupported, assets are burned ```rust type AssetTrap: DropAssets; ``` - [**`AssetClaims`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.AssetClaims){target=\_blank} - facilitates the claiming of trapped assets during the execution of the `ClaimAsset` instruction. Commonly implemented via pallet-xcm ```rust type AssetClaims: ClaimAssets; ``` - [**`AssetLocker`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.AssetLocker){target=\_blank} - handles the locking and unlocking of assets. Can be omitted using `()` if asset locking is unnecessary ```rust type AssetLocker: AssetLock; ``` - [**`AssetExchanger`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.AssetExchanger){target=\_blank} - implements the [`AssetExchange`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.AssetExchange.html){target=\_blank} trait to manage asset exchanges during the `ExchangeAsset` instruction. The unit type `()` disables this functionality ```rust type AssetExchanger: AssetExchange; ``` - [**`SubscriptionService`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.SubscriptionService){target=\_blank} - manages `(Un)SubscribeVersion` instructions and returns the XCM version via `QueryResponse`. Typically implemented by pallet-xcm ```rust type SubscriptionService: VersionChangeNotifier; ``` - [**`PalletInstancesInfo`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.PalletInstancesInfo){target=\_blank} - provides runtime pallet information for `QueryPallet` and `ExpectPallet` instructions. FRAME-specific systems often use this, or it can be disabled with `()` ```rust type PalletInstancesInfo: PalletsInfoAccess; ``` - [**`MaxAssetsIntoHolding`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.MaxAssetsIntoHolding){target=\_blank} - limits the number of assets in the [Holding register](https://wiki.polkadot.network/learn/learn-xcm/#holding-register){target=\_blank}. At most, twice this limit can be held under worst-case conditions ```rust type MaxAssetsIntoHolding: Get; ``` - [**`FeeManager`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.FeeManager){target=\_blank} - manages fees for XCM instructions, determining whether fees should be paid, waived, or handled in specific ways. Fees can be waived entirely using `()` ```rust type FeeManager: FeeManager; ``` - [**`MessageExporter`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.MessageExporter){target=\_blank} - implements the [`ExportXcm`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.ExportXcm.html){target=\_blank} trait, enabling XCMs export to other consensus systems. It can spoof origins for use in bridges. 
Use `()` to disable exporting ```rust type MessageExporter: ExportXcm; ``` - [**`UniversalAliases`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.UniversalAliases){target=\_blank} - lists origin locations and universal junctions allowed to elevate themselves in the `UniversalOrigin` instruction. Using `Nothing` prevents origin aliasing ```rust type UniversalAliases: Contains<(MultiLocation, Junction)>; ``` - [**`CallDispatcher`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.CallDispatcher){target=\_blank} - dispatches calls from the `Transact` instruction, adapting the origin or modifying the call as needed. Can default to `RuntimeCall` ```rust type CallDispatcher: CallDispatcher; ``` - [**`SafeCallFilter`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.SafeCallFilter){target=\_blank} - whitelists calls permitted in the `Transact` instruction. Using `Everything` allows all calls, though this is temporary until proof size weights are accounted for ```rust type SafeCallFilter: Contains; ``` - [**`TransactionalProcessor`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.TransactionalProcessor){target=\_blank} - implements the [`ProccessTransaction`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/traits/trait.ProcessTransaction.html){target=\_blank} trait. It ensures that XCM instructions are executed atomically, meaning they either fully succeed or fully fail without any partial effects. This type allows for non-transactional XCM instruction processing by setting the `()` type ```rust type TransactionalProcessor: ProcessTransaction; ``` - [**`HrmpNewChannelOpenRequestHandler`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.HrmpNewChannelOpenRequestHandler){target=\_blank} - enables optional logic execution in response to the `HrmpNewChannelOpenRequest` XCM notification ```rust type HrmpNewChannelOpenRequestHandler: HandleHrmpNewChannelOpenRequest; ``` - [**`HrmpChannelAcceptedHandler`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.HrmpChannelAcceptedHandler){target=\_blank} - enables optional logic execution in response to the `HrmpChannelAccepted` XCM notification ```rust type HrmpChannelAcceptedHandler: HandleHrmpChannelAccepted; ``` - [**`HrmpChannelClosingHandler`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.HrmpChannelClosingHandler){target=\_blank} - enables optional logic execution in response to the `HrmpChannelClosing` XCM notification ```rust type HrmpChannelClosingHandler: HandleHrmpChannelClosing; ``` - [**`XcmRecorder`**](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/trait.Config.html#associatedtype.XcmRecorder){target=\_blank} - allows tracking of the most recently executed XCM, primarily for use with dry-run runtime APIs ```rust type XcmRecorder: RecordXcm; ``` ### Inner Config The `Config` trait underpins the `XcmExecutor`, defining its core behavior through associated types for asset handling, XCM processing, and permission management. 
These types are categorized as follows: - **Handlers** - manage XCMs sending, asset transactions, and special notifications - **Filters** - define trusted combinations, origin substitutions, and execution barriers - **Converters** - handle origin conversion for call execution - **Accessors** - provide weight determination and pallet information - **Constants** - specify universal locations and asset limits - **Common Configs** - include shared settings like `RuntimeCall` The following diagram outlines this categorization: ```mermaid flowchart LR A[Inner Config] --> B[Handlers] A --> C[Filters] A --> D[Converters] A --> E[Accessors] A --> F[Constants] A --> G[Common Configs] B --> H[XcmSender] B --> I[AssetTransactor] B --> J[Trader] B --> K[ResponseHandler] B --> L[AssetTrap] B --> M[AssetLocker] B --> N[AssetExchanger] B --> O[AssetClaims] B --> P[SubscriptionService] B --> Q[FeeManager] B --> R[MessageExporter] B --> S[CallDispatcher] B --> T[HrmpNewChannelOpenRequestHandler] B --> U[HrmpChannelAcceptedHandler] B --> V[HrmpChannelClosingHandler] C --> W[IsReserve] C --> X[IsTeleporter] C --> Y[Aliasers] C --> Z[Barrier] C --> AA[UniversalAliases] C --> AB[SafeCallFilter] D --> AC[OriginConverter] E --> AD[Weigher] E --> AE[PalletInstancesInfo] F --> AF[UniversalLocation] F --> AG[MaxAssetsIntoHolding] G --> AH[RuntimeCall] ``` ### Outer Config The `XcmExecutor` struct extends the functionality of the inner config by introducing fields for execution context, asset handling, error tracking, and operational management. For further details, see the documentation for [`XcmExecutor`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm_executor/struct.XcmExecutor.html#impl-XcmExecutor%3CConfig%3E){target=\_blank}. ## Multiple Implementations Some associated types in the `Config` trait are highly configurable and may have multiple implementations (e.g., Barrier). These implementations are organized into a tuple `(impl_1, impl_2, ..., impl_n)`, and the execution follows a sequential order. Each item in the tuple is evaluated individually, each being checked to see if it fails. If an item passes (e.g., returns `Ok` or `true`), the execution stops, and the remaining items are not evaluated. The following example of the `Barrier` type demonstrates how this grouping operates (understanding each item in the tuple is unnecessary for this explanation). In the following example, the system will first check the `TakeWeightCredit` type when evaluating the barrier. If it fails, it will check `AllowTopLevelPaidExecutionFrom`, and so on, until one of them returns a positive result. If all checks fail, a Barrier error will be triggered. ```rust pub type Barrier = ( TakeWeightCredit, AllowTopLevelPaidExecutionFrom, AllowKnownQueryResponses, AllowSubscriptionsFrom, ); pub struct XcmConfig; impl xcm_executor::Config for XcmConfig { ... type Barrier = Barrier; ... } ``` --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/interoperability/xcm-runtime-apis/ --- BEGIN CONTENT --- --- title: XCM Runtime APIs description: Learn about XCM Runtime APIs in Polkadot for cross-chain communication. Explore the APIs to simulate and test XCM messages before execution on the network. categories: Reference, Polkadot Protocol --- # XCM Runtime APIs ## Introduction Runtime APIs allow node-side code to extract information from the runtime state. 
While simple storage access retrieves stored values directly, runtime APIs enable arbitrary computation, making them a powerful tool for interacting with the chain's state. Unlike direct storage access, runtime APIs can derive values from storage based on arguments or perform computations that don't require storage access. For example, a runtime API might expose a formula for fee calculation, using only the provided arguments as inputs rather than fetching data from storage. In general, runtime APIs are used for: - Accessing a storage item - Retrieving a bundle of related storage items - Deriving a value from storage based on arguments - Exposing formulas for complex computational calculations This section will teach you about specific runtime APIs that support XCM processing and manipulation. ## Dry Run API The [Dry-run API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/dry_run/trait.DryRunApi.html){target=\_blank}, given an extrinsic, or an XCM program, returns its effects: - Execution result - Local XCM (in the case of an extrinsic) - Forwarded XCMs - List of events This API can be used independently for dry-running, double-checking, or testing. However, it mainly shines when used with the [XCM Payment API](#xcm-payment-api), since the latter can only estimate fees once you know the specific XCM you want to execute or send. ### Dry Run Call This API allows a dry run of any extrinsic, returning whether it fails or succeeds, as well as the local XCM and the remote XCM messages sent to other chains. ```rust fn dry_run_call(origin: OriginCaller, call: Call) -> Result<CallDryRunEffects<Event>, Error>; ``` ??? interface "Input parameters" `origin` ++"OriginCaller"++ ++"required"++ The origin used for executing the transaction. --- `call` ++"Call"++ ++"required"++ The extrinsic to be executed. --- ??? interface "Output parameters" ++"Result<CallDryRunEffects<Event>, Error>"++ Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects. ??? child "Type `CallDryRunEffects`" `execution_result` ++"DispatchResultWithPostInfo"++ The result of executing the extrinsic. --- `emitted_events` ++"Vec<Event>"++ The list of events fired by the extrinsic. --- `local_xcm` ++"Option<VersionedXcm<()>>"++ The local XCM that was attempted to be executed, if any. --- `forwarded_xcms` ++"Vec<(VersionedLocation, Vec<VersionedXcm<()>>)>"++ The list of XCMs that were queued for sending. ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed --- ??? interface "Example" This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.network/docs/learn/xcm/journey/transfers-reserve){target=\_blank} mechanism. Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network. Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.
***Usage with PAPI*** ```js import { paseo } from '@polkadot-api/descriptors'; import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; import { PolkadotRuntimeOriginCaller, XcmVersionedLocation, XcmVersionedAssets, XcmV3Junction, XcmV3Junctions, XcmV3WeightLimit, XcmV3MultiassetFungibility, XcmV3MultiassetAssetId, } from '@polkadot-api/descriptors'; import { DispatchRawOrigin } from '@polkadot-api/descriptors'; import { Binary } from 'polkadot-api'; import { ss58Decode } from '@polkadot-labs/hdkd-helpers'; // Connect to the Paseo relay chain const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')), ); const paseoApi = client.getTypedApi(paseo); const popParaID = 4001; const userAddress = 'INSERT_USER_ADDRESS'; const userPublicKey = ss58Decode(userAddress)[0]; const idBeneficiary = Binary.fromBytes(userPublicKey); // Define the origin caller // This is a regular signed account owned by a user let origin = PolkadotRuntimeOriginCaller.system( DispatchRawOrigin.Signed(userAddress), ); // Define a transaction to transfer assets from Paseo to Pop Network using a Reserve Transfer const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({ dest: XcmVersionedLocation.V3({ parents: 0, interior: XcmV3Junctions.X1( XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain ), }), beneficiary: XcmVersionedLocation.V3({ parents: 0, interior: XcmV3Junctions.X1( XcmV3Junction.AccountId32({ // Beneficiary address on Pop Network network: undefined, id: idBeneficiary, }), ), }), assets: XcmVersionedAssets.V3([ { id: XcmV3MultiassetAssetId.Concrete({ parents: 0, interior: XcmV3Junctions.Here(), // Native asset from the sender. In this case PAS }), fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer }, ]), fee_asset_item: 0, // Asset used to pay transaction fees weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction }); // Execute the dry run call to simulate the transaction const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call( origin, tx.decodedCall, ); // Extract the data from the dry run result const { execution_result: executionResult, emitted_events: emittedEvents, local_xcm: localXcm, forwarded_xcms: forwardedXcms, } = dryRunResult.value; // Extract the XCM generated by this call const xcmsToPop = forwardedXcms.find( ([location, _]) => location.type === 'V4' && location.value.parents === 0 && location.value.interior.type === 'X1' && location.value.interior.value.type === 'Parachain' && location.value.interior.value.value === popParaID, // Pop network's ParaID ); const destination = xcmsToPop[0]; const remoteXcm = xcmsToPop[1][0]; // Print the results const resultObject = { execution_result: executionResult, emitted_events: emittedEvents, local_xcm: localXcm, destination: destination, remote_xcm: remoteXcm, }; console.dir(resultObject, { depth: null }); client.destroy(); ``` ***Output***
    {
      execution_result: {
        success: true,
        value: {
          actual_weight: undefined,
          pays_fee: { type: 'Yes', value: undefined }
        }
      },
      emitted_events: [
        {
          type: 'Balances',
          value: {
            type: 'Transfer',
            value: {
              from: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',
              to: '13YMK2ePPKQeW7ynqLozB65WYjMnNgffQ9uR4AzyGmqnKeLq',
              amount: 120000000000n
            }
          }
        },
        {
          type: 'Balances',
          value: { type: 'Issued', value: { amount: 0n } }
        },
        {
          type: 'XcmPallet',
          value: {
            type: 'Attempted',
            value: {
              outcome: {
                type: 'Complete',
                value: { used: { ref_time: 251861000n, proof_size: 6196n } }
              }
            }
          }
        },
        {
          type: 'Balances',
          value: {
            type: 'Burned',
            value: {
              who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',
              amount: 397000000n
            }
          }
        },
        {
          type: 'Balances',
          value: {
            type: 'Minted',
            value: {
              who: '13UVJyLnbVp9RBZYFwFGyDvVd1y27Tt8tkntv6Q7JVPhFsTB',
              amount: 397000000n
            }
          }
        },
        {
          type: 'XcmPallet',
          value: {
            type: 'FeesPaid',
            value: {
              paying: {
                parents: 0,
                interior: {
                  type: 'X1',
                  value: {
                    type: 'AccountId32',
                    value: {
                      network: { type: 'Polkadot', value: undefined },
                      id: FixedSizeBinary {
                        asText: [Function (anonymous)],
                        asHex: [Function (anonymous)],
                        asOpaqueHex: [Function (anonymous)],
                        asBytes: [Function (anonymous)],
                        asOpaqueBytes: [Function (anonymous)]
                      }
                    }
                  }
                }
              },
              fees: [
                {
                  id: {
                    parents: 0,
                    interior: { type: 'Here', value: undefined }
                  },
                  fun: { type: 'Fungible', value: 397000000n }
                }
              ]
            }
          }
        },
        {
          type: 'XcmPallet',
          value: {
            type: 'Sent',
            value: {
              origin: {
                parents: 0,
                interior: {
                  type: 'X1',
                  value: {
                    type: 'AccountId32',
                    value: {
                      network: { type: 'Polkadot', value: undefined },
                      id: FixedSizeBinary {
                        asText: [Function (anonymous)],
                        asHex: [Function (anonymous)],
                        asOpaqueHex: [Function (anonymous)],
                        asBytes: [Function (anonymous)],
                        asOpaqueBytes: [Function (anonymous)]
                      }
                    }
                  }
                }
              },
              destination: {
                parents: 0,
                interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }
              },
              message: [
                {
                  type: 'ReserveAssetDeposited',
                  value: [
                    {
                      id: {
                        parents: 1,
                        interior: { type: 'Here', value: undefined }
                      },
                      fun: { type: 'Fungible', value: 120000000000n }
                    }
                  ]
                },
                { type: 'ClearOrigin', value: undefined },
                {
                  type: 'BuyExecution',
                  value: {
                    fees: {
                      id: {
                        parents: 1,
                        interior: { type: 'Here', value: undefined }
                      },
                      fun: { type: 'Fungible', value: 120000000000n }
                    },
                    weight_limit: { type: 'Unlimited', value: undefined }
                  }
                },
                {
                  type: 'DepositAsset',
                  value: {
                    assets: {
                      type: 'Wild',
                      value: { type: 'AllCounted', value: 1 }
                    },
                    beneficiary: {
                      parents: 0,
                      interior: {
                        type: 'X1',
                        value: {
                          type: 'AccountId32',
                          value: {
                            network: undefined,
                            id: FixedSizeBinary {
                              asText: [Function (anonymous)],
                              asHex: [Function (anonymous)],
                              asOpaqueHex: [Function (anonymous)],
                              asBytes: [Function (anonymous)],
                              asOpaqueBytes: [Function (anonymous)]
                            }
                          }
                        }
                      }
                    }
                  }
                }
              ],
              message_id: FixedSizeBinary {
                asText: [Function (anonymous)],
                asHex: [Function (anonymous)],
                asOpaqueHex: [Function (anonymous)],
                asBytes: [Function (anonymous)],
                asOpaqueBytes: [Function (anonymous)]
              }
            }
          }
        }
      ],
      local_xcm: undefined,
      destination: {
        type: 'V4',
        value: {
          parents: 0,
          interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }
        }
      },
      remote_xcm: {
        type: 'V3',
        value: [
          {
            type: 'ReserveAssetDeposited',
            value: [
              {
                id: {
                  type: 'Concrete',
                  value: {
                    parents: 1,
                    interior: { type: 'Here', value: undefined }
                  }
                },
                fun: { type: 'Fungible', value: 120000000000n }
              }
            ]
          },
          { type: 'ClearOrigin', value: undefined },
          {
            type: 'BuyExecution',
            value: {
              fees: {
                id: {
                  type: 'Concrete',
                  value: {
                    parents: 1,
                    interior: { type: 'Here', value: undefined }
                  }
                },
                fun: { type: 'Fungible', value: 120000000000n }
              },
              weight_limit: { type: 'Unlimited', value: undefined }
            }
          },
          {
            type: 'DepositAsset',
            value: {
              assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },
              beneficiary: {
                parents: 0,
                interior: {
                  type: 'X1',
                  value: {
                    type: 'AccountId32',
                    value: {
                      network: undefined,
                      id: FixedSizeBinary {
                        asText: [Function (anonymous)],
                        asHex: [Function (anonymous)],
                        asOpaqueHex: [Function (anonymous)],
                        asBytes: [Function (anonymous)],
                        asOpaqueBytes: [Function (anonymous)]
                      }
                    }
                  }
                }
              }
            }
          },
          {
            type: 'SetTopic',
            value: FixedSizeBinary {
              asText: [Function (anonymous)],
              asHex: [Function (anonymous)],
              asOpaqueHex: [Function (anonymous)],
              asBytes: [Function (anonymous)],
              asOpaqueBytes: [Function (anonymous)]
            }
          }
        ]
      }
    }      
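Because the dry run only simulates the call, a natural next step is to gate the real submission on the simulated outcome. The following is a minimal sketch that continues the script above: it reuses `dryRunResult` and `tx` from that example, and the signer setup is assumed rather than shown.

```js
// Minimal sketch, continuing the example above: only sign and submit the
// transfer if both the dry run itself and the simulated dispatch succeeded.
const dryRunOk =
  dryRunResult.success && dryRunResult.value.execution_result.success;

if (!dryRunOk) {
  console.error('Dry run failed; not submitting the transaction.');
} else {
  // With a signer available, the same `tx` could then be submitted, e.g.:
  // const txResult = await tx.signAndSubmit(signer);
  console.log('Dry run succeeded; the transfer is safe to sign and submit.');
}
```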
  
--- ### Dry Run XCM This API allows the direct dry run of an XCM message instead of an extrinsic, checks if it will execute successfully, and determines what other XCM messages will be forwarded to other chains. ```rust fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm<Call>) -> Result<XcmDryRunEffects<Event>, Error>; ``` ??? interface "Input parameters" `origin_location` ++"VersionedLocation"++ ++"required"++ The location of the origin that will execute the XCM message. --- `xcm` ++"VersionedXcm"++ ++"required"++ A versioned XCM message. --- ??? interface "Output parameters" ++"Result<XcmDryRunEffects<Event>, Error>"++ Effects of dry-running an XCM message. If an error occurs, it is returned instead of the effects. ??? child "Type `XcmDryRunEffects`" `execution_result` ++"Outcome"++ The result of executing the XCM message. --- `emitted_events` ++"Vec<Event>"++ The list of events fired during execution. --- `forwarded_xcms` ++"Vec<(VersionedLocation, Vec<VersionedXcm<()>>)>"++ The list of XCMs that were queued for sending. ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed --- ??? interface "Example" This example demonstrates how to simulate a [teleport asset transfer](https://wiki.polkadot.network/docs/learn/xcm/journey/transfers-teleport){target=\_blank} from the Paseo network to the Paseo Asset Hub parachain. The code shows how to test and verify the received XCM message's behavior in the destination chain through a dry run on the live network. Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script. ***Usage with PAPI*** ```js import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; import { XcmVersionedXcm, paseoAssetHub, XcmVersionedLocation, XcmV3Junction, XcmV3Junctions, XcmV3WeightLimit, XcmV3MultiassetFungibility, XcmV3MultiassetAssetId, XcmV3Instruction, XcmV3MultiassetMultiAssetFilter, XcmV3MultiassetWildMultiAsset, } from '@polkadot-api/descriptors'; import { Binary } from 'polkadot-api'; import { ss58Decode } from '@polkadot-labs/hdkd-helpers'; // Connect to Paseo Asset Hub const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')), ); const paseoAssetHubApi = client.getTypedApi(paseoAssetHub); const userAddress = 'INSERT_USER_ADDRESS'; const userPublicKey = ss58Decode(userAddress)[0]; const idBeneficiary = Binary.fromBytes(userPublicKey); // Define the origin const origin = XcmVersionedLocation.V3({ parents: 1, interior: XcmV3Junctions.Here(), }); // Define an XCM message coming from the Paseo relay chain to Asset Hub to teleport some tokens const xcm = XcmVersionedXcm.V3([ XcmV3Instruction.ReceiveTeleportedAsset([ { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(12000000000n), }, ]), XcmV3Instruction.ClearOrigin(), XcmV3Instruction.BuyExecution({ fees: { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)), }, weight_limit: XcmV3WeightLimit.Unlimited(), }), XcmV3Instruction.DepositAsset({ assets: XcmV3MultiassetMultiAssetFilter.Wild( XcmV3MultiassetWildMultiAsset.All(), ), beneficiary: { parents: 0, interior: XcmV3Junctions.X1( XcmV3Junction.AccountId32({ network: undefined, id: idBeneficiary, }), ), }, }),
]); // Execute dry run xcm const dryRunResult = await paseoAssetHubApi.apis.DryRunApi.dry_run_xcm( origin, xcm, ); // Print the results console.dir(dryRunResult.value, { depth: null }); client.destroy(); ``` ***Output***
    {
      execution_result: {
        type: 'Complete',
        value: { used: { ref_time: 15574200000n, proof_size: 359300n } }
      },
      emitted_events: [
        {
          type: 'System',
          value: {
            type: 'NewAccount',
            value: { account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET' }
          }
        },
        {
          type: 'Balances',
          value: {
            type: 'Endowed',
            value: {
              account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',
              free_balance: 10203500000n
            }
          }
        },
        {
          type: 'Balances',
          value: {
            type: 'Minted',
            value: {
              who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',
              amount: 10203500000n
            }
          }
        },
        {
          type: 'Balances',
          value: { type: 'Issued', value: { amount: 1796500000n } }
        },
        {
          type: 'Balances',
          value: {
            type: 'Deposit',
            value: {
              who: '13UVJyLgBASGhE2ok3TvxUfaQBGUt88JCcdYjHvUhvQkFTTx',
              amount: 1796500000n
            }
          }
        }
      ],
      forwarded_xcms: [
        [
          {
            type: 'V4',
            value: { parents: 1, interior: { type: 'Here', value: undefined } }
          },
          []
        ]
      ]
    }
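For `dry_run_xcm`, the `execution_result` is the XCM outcome shown above rather than a dispatch result, so a script would typically check for the `Complete` variant before trusting the forwarded messages. A minimal sketch, continuing the example above (it reuses `dryRunResult` from the previous snippet):

```js
// Minimal sketch, continuing the example above: confirm the message completed
// and read the weight it used plus any XCMs it would forward onward.
const outcome = dryRunResult.value.execution_result;

if (outcome.type === 'Complete') {
  console.log('XCM executed fully. Weight used:', outcome.value.used);
  console.log('Forwarded XCMs:', dryRunResult.value.forwarded_xcms);
} else {
  console.error('XCM did not complete:', outcome);
}
```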
  
--- ## XCM Payment API The [XCM Payment API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/fees/trait.XcmPaymentApi.html){target=\_blank} provides a standardized way to determine the costs and payment options for executing XCM messages. Specifically, it enables clients to: - Retrieve the [weight](/polkadot-protocol/glossary/#weight) required to execute an XCM message - Obtain a list of acceptable `AssetIds` for paying execution fees - Calculate the cost of the weight in a specified `AssetId` - Estimate the fees for XCM message delivery This API eliminates the need for clients to guess execution fees or identify acceptable assets manually. Instead, clients can query the list of supported asset IDs formatted according to the XCM version they understand. With this information, they can weigh the XCM program they intend to execute and convert the computed weight into its cost using one of the acceptable assets. To use the API effectively, the client must already know the XCM program to be executed and the chains involved in the program's execution. ### Query Acceptable Payment Assets Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version. ```rust fn query_acceptable_payment_assets(xcm_version: Version) -> Result<Vec<VersionedAssetId>, Error>; ``` ??? interface "Input parameters" `xcm_version` ++"Version"++ ++"required"++ Specifies the XCM version that will be used to send the XCM message. --- ??? interface "Output parameters" ++"Result<Vec<VersionedAssetId>, Error>"++ A list of acceptable payment assets. Each asset is provided in a versioned format (`VersionedAssetId`) that matches the specified XCM version. If an error occurs, it is returned instead of the asset list. ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed - **`WeightNotComputable`** - XCM message weight calculation failed - **`UnhandledXcmVersion`** - XCM version not able to be handled - **`AssetNotFound`** - the given asset is not handled as a fee asset - **`Unroutable`** - destination is known to be unroutable --- ??? interface "Example" This example demonstrates how to query the acceptable payment assets for executing XCM messages on the Paseo Asset Hub network using XCM version 3. ***Usage with PAPI*** ```js import { paseoAssetHub } from '@polkadot-api/descriptors'; import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; // Connect to Paseo Asset Hub const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')), ); const paseoAssetHubApi = client.getTypedApi(paseoAssetHub); // Define the XCM version to use const xcmVersion = 3; // Execute the runtime call to query the assets const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_acceptable_payment_assets( xcmVersion, ); // Print the assets console.dir(result.value, { depth: null }); client.destroy(); ``` ***Output***
    [
      {
        type: 'V3',
        value: {
          type: 'Concrete',
          value: { parents: 1, interior: { type: 'Here', value: undefined } }
        }
      }
    ]
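The entries returned are versioned asset IDs, so they can be passed directly to the other payment queries. A minimal sketch, continuing the example above (here the single entry corresponds to the relay chain's native token, PAS):

```js
// Minimal sketch, continuing the example above: `result` comes from
// query_acceptable_payment_assets. Keep the first acceptable asset around to
// use as the fee asset in later XcmPaymentApi calls.
const acceptableAssets = result.value;
const feeAsset = acceptableAssets[0]; // { type: 'V3', value: { type: 'Concrete', ... } }
console.log('Paying XCM fees with:', feeAsset);
```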
  
--- ### Query XCM Weight Calculates the weight required to execute a given XCM message. It is useful for estimating the execution cost of a cross-chain message in the destination chain before sending it. ```rust fn query_xcm_weight(message: VersionedXcm<()>) -> Result<Weight, Error>; ``` ??? interface "Input parameters" `message` ++"VersionedXcm<()>"++ ++"required"++ A versioned XCM message whose execution weight is being queried. --- ??? interface "Output parameters" ++"Result<Weight, Error>"++ The calculated weight required to execute the provided XCM message. If the calculation fails, an error is returned instead. ??? child "Type `Weight`" `ref_time` ++"u64"++ The weight of computational time used based on some reference hardware. --- `proof_size` ++"u64"++ The weight of storage space used by proof of validity. --- ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed - **`WeightNotComputable`** - XCM message weight calculation failed - **`UnhandledXcmVersion`** - XCM version not able to be handled - **`AssetNotFound`** - the given asset is not handled as a fee asset - **`Unroutable`** - destination is known to be unroutable --- ??? interface "Example" This example demonstrates how to calculate the weight needed to execute a [teleport transfer](https://wiki.polkadot.network/docs/learn/xcm/journey/transfers-teleport){target=\_blank} from the Paseo network to the Paseo Asset Hub parachain using the XCM Payment API. The result shows the required weight in terms of reference time and proof size needed in the destination chain. Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script. ***Usage with PAPI*** ```js import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; import { XcmVersionedXcm, paseoAssetHub, XcmV3Junction, XcmV3Junctions, XcmV3WeightLimit, XcmV3MultiassetFungibility, XcmV3MultiassetAssetId, XcmV3Instruction, XcmV3MultiassetMultiAssetFilter, XcmV3MultiassetWildMultiAsset, } from '@polkadot-api/descriptors'; import { Binary } from 'polkadot-api'; import { ss58Decode } from '@polkadot-labs/hdkd-helpers'; // Connect to Paseo Asset Hub const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')), ); const paseoAssetHubApi = client.getTypedApi(paseoAssetHub); const userAddress = 'INSERT_USER_ADDRESS'; const userPublicKey = ss58Decode(userAddress)[0]; const idBeneficiary = Binary.fromBytes(userPublicKey); // Define an XCM message coming from the Paseo relay chain to Asset Hub to teleport some tokens const xcm = XcmVersionedXcm.V3([ XcmV3Instruction.ReceiveTeleportedAsset([ { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(12000000000n), }, ]), XcmV3Instruction.ClearOrigin(), XcmV3Instruction.BuyExecution({ fees: { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)), }, weight_limit: XcmV3WeightLimit.Unlimited(), }), XcmV3Instruction.DepositAsset({ assets: XcmV3MultiassetMultiAssetFilter.Wild( XcmV3MultiassetWildMultiAsset.All(), ), beneficiary: { parents: 0, interior: XcmV3Junctions.X1( XcmV3Junction.AccountId32({ network: undefined, id: idBeneficiary, }), ), }, }), ]); // Execute the query weight runtime call
const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm); // Print the results console.dir(result.value, { depth: null }); client.destroy(); ``` ***Output***
{ ref_time: 15574200000n, proof_size: 359300n }
--- ### Query Weight to Asset Fee Converts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset. ```rust fn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result<u128, Error>; ``` ??? interface "Input parameters" `weight` ++"Weight"++ ++"required"++ The execution weight to be converted into a fee. ??? child "Type `Weight`" `ref_time` ++"u64"++ The weight of computational time used based on some reference hardware. --- `proof_size` ++"u64"++ The weight of storage space used by proof of validity. --- --- `asset` ++"VersionedAssetId"++ ++"required"++ The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime. --- ??? interface "Output parameters" ++"Result<u128, Error>"++ The fee needed to pay for the execution for the given `AssetId`. ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed - **`WeightNotComputable`** - XCM message weight calculation failed - **`UnhandledXcmVersion`** - XCM version not able to be handled - **`AssetNotFound`** - the given asset is not handled as a fee asset - **`Unroutable`** - destination is known to be unroutable --- ??? interface "Example" This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub. ***Usage with PAPI*** ```js import { paseoAssetHub } from '@polkadot-api/descriptors'; import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; // Connect to Paseo Asset Hub const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')), ); const paseoAssetHubApi = client.getTypedApi(paseoAssetHub); // Define the weight to convert to fee const weight = { ref_time: 15574200000n, proof_size: 359300n }; // Define the versioned asset id const versionedAssetId = { type: 'V4', value: { parents: 1, interior: { type: 'Here', value: undefined } }, }; // Execute the runtime call to convert the weight to fee const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee( weight, versionedAssetId, ); // Print the fee console.dir(result.value, { depth: null }); client.destroy(); ``` ***Output***
1796500000n
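Together with the previous call, this completes the fee-estimation flow described in the introduction: weigh the XCM first, then convert that weight into a fee in an acceptable asset. The following is a minimal sketch combining the two previous examples; it reuses `paseoAssetHubApi` and the `xcm` message built in the Query XCM Weight example, and the versioned asset ID is the same PAS asset used above.

```js
// Minimal sketch, continuing the examples above: estimate the execution fee of
// `xcm` on Paseo Asset Hub, denominated in the relay chain's native asset (PAS).
const weightResult =
  await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm);

const feeResult =
  await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(
    weightResult.value, // { ref_time, proof_size }
    {
      type: 'V4',
      value: { parents: 1, interior: { type: 'Here', value: undefined } },
    },
  );

console.log('Estimated execution fee (plancks):', feeResult.value);
```

For this particular message, the returned fee (`1796500000n`) matches the `Deposit` amount seen in the `dry_run_xcm` output earlier on this page.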
--- ### Query Delivery Fees Retrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain. ```rust fn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result<VersionedAssets, Error>; ``` ??? interface "Input parameters" `destination` ++"VersionedLocation"++ ++"required"++ The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms. --- `message` ++"VersionedXcm<()>"++ ++"required"++ The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost. --- ??? interface "Output parameters" ++"Result<VersionedAssets, Error>"++ The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead. ??? child "Type `Error`" Enum: - **`Unimplemented`** - an API part is unsupported - **`VersionedConversionFailed`** - converting a versioned data structure from one version to another failed - **`WeightNotComputable`** - XCM message weight calculation failed - **`UnhandledXcmVersion`** - XCM version not able to be handled - **`AssetNotFound`** - the given asset is not handled as a fee asset - **`Unroutable`** - destination is known to be unroutable --- ??? interface "Example" This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub. Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script. ***Usage with PAPI*** ```js import { createClient } from 'polkadot-api'; import { getWsProvider } from 'polkadot-api/ws-provider/web'; import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat'; import { XcmVersionedXcm, paseo, XcmVersionedLocation, XcmV3Junction, XcmV3Junctions, XcmV3WeightLimit, XcmV3MultiassetFungibility, XcmV3MultiassetAssetId, XcmV3Instruction, XcmV3MultiassetMultiAssetFilter, XcmV3MultiassetWildMultiAsset, } from '@polkadot-api/descriptors'; import { Binary } from 'polkadot-api'; import { ss58Decode } from '@polkadot-labs/hdkd-helpers'; const client = createClient( withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')), ); const paseoApi = client.getTypedApi(paseo); const paseoAssetHubParaID = 1000; const userAddress = 'INSERT_USER_ADDRESS'; const userPublicKey = ss58Decode(userAddress)[0]; const idBeneficiary = Binary.fromBytes(userPublicKey); // Define the destination const destination = XcmVersionedLocation.V3({ parents: 0, interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)), }); // Define the XCM message that will be sent to the destination const xcm = XcmVersionedXcm.V3([ XcmV3Instruction.ReceiveTeleportedAsset([ { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(12000000000n), }, ]), XcmV3Instruction.ClearOrigin(), XcmV3Instruction.BuyExecution({ fees: { id: XcmV3MultiassetAssetId.Concrete({ parents: 1, interior: XcmV3Junctions.Here(), }), fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)), }, weight_limit: XcmV3WeightLimit.Unlimited(), }), XcmV3Instruction.DepositAsset({ assets: XcmV3MultiassetMultiAssetFilter.Wild( XcmV3MultiassetWildMultiAsset.All(), ), beneficiary: { parents: 0, interior: XcmV3Junctions.X1( XcmV3Junction.AccountId32({ network: undefined, id: idBeneficiary, }), ), }, }), ]); // Execute the
query delivery fees runtime call const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees( destination, xcm, ); // Print the results console.dir(result.value, { depth: null }); client.destroy(); ``` ***Output***
    {
      type: 'V3',
      value: [
        {
          id: {
            type: 'Concrete',
            value: { parents: 0, interior: { type: 'Here', value: undefined } }
          },
          fun: { type: 'Fungible', value: 396000000n }
        }
      ]
    }
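Delivery fees are charged for transporting the message and are separate from the execution fees estimated by the other payment queries, so a rough end-to-end estimate adds the two. A minimal sketch, continuing the example above; the execution fee value is taken from the Query Weight to Asset Fee example, and both amounts are denominated in PAS here.

```js
// Minimal sketch, continuing the example above: `result` comes from
// query_delivery_fees. Add the delivery fee to the execution fee estimated
// earlier to get a rough total cost for this message.
const deliveryFee = result.value.value[0].fun.value; // 396000000n in the output above
const executionFee = 1796500000n; // from the Query Weight to Asset Fee example

console.log('Rough total fee estimate (plancks):', deliveryFee + executionFee);
```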
  
--- --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/smart-contracts/json-rpc-apis/ --- BEGIN CONTENT --- --- title: JSON-RPC APIs description: JSON-RPC APIs guide for Polkadot Hub, covering supported methods, parameters, and examples for interacting with the chain. categories: Reference --- # JSON-RPC APIs !!! smartcontract "PolkaVM Preview Release" PolkaVM smart contracts with Ethereum compatibility are in **early-stage development and may be unstable or incomplete**. ## Introduction Polkadot Hub provides Ethereum compatibility through its JSON-RPC interface, allowing developers to interact with the chain using familiar Ethereum tooling and methods. This document outlines the supported [Ethereum JSON-RPC methods](https://ethereum.org/en/developers/docs/apis/json-rpc/#json-rpc-methods){target=\_blank} and provides examples of how to use them. This guide uses the Polkadot Hub TestNet endpoint: ```text https://testnet-passet-hub-eth-rpc.polkadot.io ``` ## Available Methods ### eth_accounts Returns a list of addresses owned by the client. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_accounts){target=\_blank}. **Parameters**: None **Example**: ```bash title="eth_accounts" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_accounts", "params":[], "id":1 }' ``` --- ### eth_blockNumber Returns the number of the most recent block. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_blocknumber){target=\_blank}. **Parameters**: None **Example**: ```bash title="eth_blockNumber" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_blockNumber", "params":[], "id":1 }' ``` --- ### eth_call Executes a new message call immediately without creating a transaction. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_call){target=\_blank}. **Parameters**: - `transaction` ++"object"++ - the transaction call object: - `to` ++"string"++ - recipient address of the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `data` ++"string"++ - hash of the method signature and encoded parameters. Must be a [data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `from` ++"string"++ - (optional) sender's address for the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `gas` ++"string"++ - (optional) gas limit to execute the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `gasPrice` ++"string"++ - (optional) gas price per unit of gas. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `value` ++"string"++ - (optional) value in wei to send with the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) block tag or block number to execute the call at. 
Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} **Example**: ```bash title="eth_call" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_call", "params":[{ "to": "INSERT_RECIPIENT_ADDRESS", "data": "INSERT_ENCODED_CALL" }, "INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_RECIPIENT_ADDRESS`, `INSERT_ENCODED_CALL`, and `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_chainId Returns the chain ID used for signing transactions. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_chainid){target=\_blank}. **Parameters**: None **Example**: ```bash title="eth_chainId" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_chainId", "params":[], "id":1 }' ``` --- ### eth_estimateGas Estimates gas required for a transaction. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_estimategas){target=\_blank}. **Parameters**: - `transaction` ++"object"++ - the transaction call object: - `to` ++"string"++ - recipient address of the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `data` ++"string"++ - hash of the method signature and encoded parameters. Must be a [data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `from` ++"string"++ - (optional) sender's address for the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `gas` ++"string"++ - (optional) gas limit to execute the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `gasPrice` ++"string"++ - (optional) gas price per unit of gas. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `value` ++"string"++ - (optional) value in wei to send with the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) block tag or block number to execute the call at. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} **Example**: ```bash title="eth_estimateGas" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_estimateGas", "params":[{ "to": "INSERT_RECIPIENT_ADDRESS", "data": "INSERT_ENCODED_FUNCTION_CALL" }], "id":1 }' ``` Ensure to replace the `INSERT_RECIPIENT_ADDRESS` and `INSERT_ENCODED_CALL` with the proper values. --- ### eth_gasPrice Returns the current gas price in Wei. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gasprice){target=\_blank}. 
**Parameters**: None **Example**: ```bash title="eth_gasPrice" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_gasPrice", "params":[], "id":1 }' ``` --- ### eth_getBalance Returns the balance of a given address. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getbalance){target=\_blank}. **Parameters**: - `address` ++"string"++ - address to query balance. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} **Example**: ```bash title="eth_getBalance" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getBalance", "params":["INSERT_ADDRESS", "INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_ADDRESS` and `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_getBlockByHash Returns information about a block by its hash. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getblockbyhash){target=\_blank}. **Parameters**: - `blockHash` ++"string"++ – the hash of the block to retrieve. Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `fullTransactions` ++"boolean"++ – if `true`, returns full transaction details; if `false`, returns only transaction hashes **Example**: ```bash title="eth_getBlockByHash" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getBlockByHash", "params":["INSERT_BLOCK_HASH", INSERT_BOOLEAN], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_HASH` and `INSERT_BOOLEAN` with the proper values. --- ### eth_getBlockByNumber Returns information about a block by its number. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getblockbynumber){target=\_blank}. **Parameters**: - `blockValue` ++"string"++ - (optional) the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `fullTransactions` ++"boolean"++ – if `true`, returns full transaction details; if `false`, returns only transaction hashes **Example**: ```bash title="eth_getBlockByNumber" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getBlockByNumber", "params":["INSERT_BLOCK_VALUE", INSERT_BOOLEAN], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_VALUE` and `INSERT_BOOLEAN` with the proper values. --- ### eth_getBlockTransactionCountByNumber Returns the number of transactions in a block from a block number. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getblocktransactioncountbynumber){target=\_blank}. **Parameters**: - `blockValue` ++"string"++ - the block value to be fetched. 
Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} **Example**: ```bash title="eth_getBlockTransactionCountByNumber" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getBlockTransactionCountByNumber", "params":["INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_getBlockTransactionCountByHash Returns the number of transactions in a block from a block hash. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getblocktransactioncountbyhash){target=\_blank}. **Parameters**: - `blockHash` ++"string"++ – the hash of the block to retrieve. Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string **Example**: ```bash title="eth_getBlockTransactionCountByHash" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getBlockTransactionCountByHash", "params":["INSERT_BLOCK_HASH"], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_HASH` with the proper values. --- ### eth_getCode Returns the code at a given address. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getcode){target=\_blank}. **Parameters**: - `address` ++"string"++ - contract or account address to query code. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block) **Example**: ```bash title="eth_getCode" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getCode", "params":["INSERT_ADDRESS", "INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_ADDRESS` and `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_getLogs Returns an array of all logs matching a given filter object. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getlogs){target=\_blank}. **Parameters**: - `filter` ++"object"++ - the filter object: - `fromBlock` ++"string"++ - (optional) block number or tag to start from. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `toBlock` ++"string"++ - (optional) block number or tag to end at. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `address` ++"string" or "array of strings"++ - (optional) contract address or a list of addresses from which to get logs. 
Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `topics` ++"array of strings"++ - (optional) array of topics for filtering logs. Each topic can be a single [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string or an array of such strings (meaning OR). - `blockhash` ++"string"++ - (optional) hash of a specific block. Cannot be used with `fromBlock` or `toBlock`. Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string **Example**: ```bash title="eth_getLogs" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getLogs", "params":[{ "fromBlock": "latest", "toBlock": "latest" }], "id":1 }' ``` --- ### eth_getStorageAt Returns the value from a storage position at a given address. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_getstorageat){target=\_blank}. **Parameters**: - `address` ++"string"++ - contract or account address to query code. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `storageKey` ++"string"++ - position in storage to retrieve data from. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block) **Example**: ```bash title="eth_getStorageAt" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getStorageAt", "params":["INSERT_ADDRESS", "INSERT_STORAGE_KEY", "INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_ADDRESS`, `INSERT_STORAGE_KEY`, and `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_getTransactionCount Returns the number of transactions sent from an address (nonce). [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gettransactioncount){target=\_blank}. **Parameters**: - `address` ++"string"++ - address to query balance. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block) **Example**: ```bash title="eth_getTransactionCount" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getTransactionCount", "params":["INSERT_ADDRESS", "INSERT_BLOCK_VALUE"], "id":1 }' ``` Ensure to replace the `INSERT_ADDRESS` and `INSERT_BLOCK_VALUE` with the proper values. --- ### eth_getTransactionByHash Returns information about a transaction by its hash. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gettransactionbyhash){target=\_blank}. **Parameters**: - `transactionHash` ++"string"++ - the hash of the transaction. 
Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string **Example**: ```bash title="eth_getTransactionByHash" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getTransactionByHash", "params":["INSERT_TRANSACTION_HASH"], "id":1 }' ``` Ensure to replace the `INSERT_TRANSACTION_HASH` with the proper values. --- ### eth_getTransactionByBlockNumberAndIndex Returns information about a transaction by block number and transaction index. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gettransactionbyblocknumberandindex){target=\_blank}. **Parameters**: - `blockValue` ++"string"++ - the block value to be fetched. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `transactionIndex` ++"string"++ - the index of the transaction in the block. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string **Example**: ```bash title="eth_getTransactionByBlockNumberAndIndex" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getTransactionByBlockNumberAndIndex", "params":["INSERT_BLOCK_VALUE", "INSERT_TRANSACTION_INDEX"], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_VALUE` and `INSERT_TRANSACTION_INDEX` with the proper values. --- ### eth_getTransactionByBlockHashAndIndex Returns information about a transaction by block hash and transaction index. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gettransactionbyblockhashandindex){target=\_blank}. **Parameters**: - `blockHash` ++"string"++ – the hash of the block. Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `transactionIndex` ++"string"++ - the index of the transaction in the block. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string **Example**: ```bash title="eth_getTransactionByBlockHashAndIndex" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getTransactionByBlockHashAndIndex", "params":["INSERT_BLOCK_HASH", "INSERT_TRANSACTION_INDEX"], "id":1 }' ``` Ensure to replace the `INSERT_BLOCK_HASH` and `INSERT_TRANSACTION_INDEX` with the proper values. --- ### eth_getTransactionReceipt Returns the receipt of a transaction by transaction hash. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_gettransactionreceipt){target=\_blank}. **Parameters**: - `transactionHash` ++"string"++ - the hash of the transaction. Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string **Example**: ```bash title="eth_getTransactionReceipt" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_getTransactionReceipt", "params":["INSERT_TRANSACTION_HASH"], "id":1 }' ``` Ensure to replace the `INSERT_TRANSACTION_HASH` with the proper values. 
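For programmatic access, the same JSON-RPC request can be issued from any HTTP client. The following is a rough TypeScript sketch (not part of the original reference) that fetches a receipt with `fetch` and surfaces failures through the JSON-RPC `error` field described in the error handling section below; the transaction hash placeholder is hypothetical, and the same pattern applies to every method on this page:

```ts
// Minimal sketch: issue the same JSON-RPC request with fetch (Node.js 18+ or a browser).
// INSERT_TRANSACTION_HASH is a placeholder, as in the curl examples above.
const RPC_URL = 'https://testnet-passet-hub-eth-rpc.polkadot.io';

async function getTransactionReceipt(txHash: string) {
  const response = await fetch(RPC_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      jsonrpc: '2.0',
      method: 'eth_getTransactionReceipt',
      params: [txHash],
      id: 1,
    }),
  });

  const json = await response.json();
  // JSON-RPC reports failures in the `error` field, not via the HTTP status code.
  if (json.error) {
    throw new Error(`RPC error ${json.error.code}: ${json.error.message}`);
  }
  // `result` is null while the transaction is not yet included in a block.
  return json.result;
}

getTransactionReceipt('INSERT_TRANSACTION_HASH')
  .then((receipt) => console.log(receipt))
  .catch((err) => console.error(err));
```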
--- ### eth_maxPriorityFeePerGas Returns an estimate of the current priority fee per gas, in Wei, to be included in a block. **Parameters**: None **Example**: ```bash title="eth_maxPriorityFeePerGas" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_maxPriorityFeePerGas", "params":[], "id":1 }' ``` --- ### eth_sendRawTransaction Submits a raw transaction. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_sendrawtransaction){target=\_blank}. **Parameters**: - `callData` ++"string"++ - signed transaction data. Must be a [data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string **Example**: ```bash title="eth_sendRawTransaction" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_sendRawTransaction", "params":["INSERT_CALL_DATA"], "id":1 }' ``` Ensure to replace the `INSERT_CALL_DATA` with the proper values. --- ### eth_sendTransaction Creates and sends a new transaction. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_sendtransaction){target=\_blank}. **Parameters**: - `transaction` ++"object"++ - the transaction object: - `from` ++"string"++ - address sending the transaction. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `to` ++"string"++ - (optional) recipient address. No need to provide this value when deploying a contract. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `gas` ++"string"++ - (optional, default: `90000`) gas limit for execution. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `gasPrice` ++"string"++ - (optional) gas price per unit. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `value` ++"string"++ - (optional) amount of Ether to send. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `data` ++"string"++ - (optional) contract bytecode or encoded method call. Must be a [data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `nonce` ++"string"++ - (optional) transaction nonce. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string **Example**: ```bash title="eth_sendTransaction" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_sendTransaction", "params":[{ "from": "INSERT_SENDER_ADDRESS", "to": "INSERT_RECIPIENT_ADDRESS", "gas": "INSERT_GAS_LIMIT", "gasPrice": "INSERT_GAS_PRICE", "value": "INSERT_VALUE", "input": "INSERT_INPUT_DATA", "nonce": "INSERT_NONCE" }], "id":1 }' ``` Ensure to replace the `INSERT_SENDER_ADDRESS`, `INSERT_RECIPIENT_ADDRESS`, `INSERT_GAS_LIMIT`, `INSERT_GAS_PRICE`, `INSERT_VALUE`, `INSERT_INPUT_DATA`, and `INSERT_NONCE` with the proper values. --- ### eth_syncing Returns an object with syncing data or `false` if not syncing. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#eth_syncing){target=\_blank}. 
**Parameters**: None **Example**: ```bash title="eth_syncing" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"eth_syncing", "params":[], "id":1 }' ``` --- ### net_listening Returns `true` if the client is currently listening for network connections, otherwise `false`. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#net_listening){target=\_blank}. **Parameters**: None **Example**: ```bash title="net_listening" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"net_listening", "params":[], "id":1 }' ``` --- ### net_peerCount Returns the number of peers currently connected to the client. **Parameters**: None **Example**: ```bash title="net_peerCount" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"net_peerCount", "params":[], "id":1 }' ``` --- ### net_version Returns the current network ID as a string. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#net_version){target=\_blank}. **Parameters**: None **Example**: ```bash title="net_version" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"net_version", "params":[], "id":1 }' ``` --- ### system_health Returns information about the health of the system. **Parameters**: None **Example**: ```bash title="system_health" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"system_health", "params":[], "id":1 }' ``` --- ### web3_clientVersion Returns the current client version. [Reference](https://ethereum.org/en/developers/docs/apis/json-rpc/#web3_clientversion){target=\_blank}. **Parameters**: None **Example**: ```bash title="web3_clientVersion" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"web3_clientVersion", "params":[], "id":1 }' ``` --- ### debug_traceBlockByNumber Traces a block's execution by its number and returns a detailed execution trace for each transaction. **Parameters**: - `blockValue` ++"string"++ - the block number or tag to trace. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `options` ++"object"++ - (optional) an object containing tracer options: - `tracer` ++"string"++ - the name of the tracer to use (e.g., "callTracer", "opTracer"). - Other tracer-specific options may be supported. **Example**: ```bash title="debug_traceBlockByNumber" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"debug_traceBlockByNumber", "params":["INSERT_BLOCK_VALUE", {"tracer": "callTracer"}], "id":1 }' ``` Ensure to replace `INSERT_BLOCK_VALUE` with a proper block number if needed. --- ### debug_traceTransaction Traces the execution of a single transaction by its hash and returns a detailed execution trace. **Parameters**: - `transactionHash` ++"string"++ - the hash of the transaction to trace. 
Must be a [32 byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `options` ++"object"++ - (optional) an object containing tracer options (e.g., `tracer: "callTracer"`). **Example**: ```bash title="debug_traceTransaction" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"debug_traceTransaction", "params":["INSERT_TRANSACTION_HASH", {"tracer": "callTracer"}], "id":1 }' ``` Ensure to replace the `INSERT_TRANSACTION_HASH` with the proper value. --- ### debug_traceCall Executes a new message call and returns a detailed execution trace without creating a transaction on the blockchain. **Parameters**: - `transaction` ++"object"++ - the transaction call object, similar to `eth_call` parameters: - `to` ++"string"++ - recipient address of the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `data` ++"string"++ - hash of the method signature and encoded parameters. Must be a [data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `from` ++"string"++ - (optional) sender's address for the call. Must be a [20-byte data](https://ethereum.org/en/developers/docs/apis/json-rpc/#unformatted-data-encoding){target=\_blank} string - `gas` ++"string"++ - (optional) gas limit to execute the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `gasPrice` ++"string"++ - (optional) gas price per unit of gas. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `value` ++"string"++ - (optional) value in wei to send with the call. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string - `blockValue` ++"string"++ - (optional) block tag or block number to execute the call at. Must be a [quantity](https://ethereum.org/en/developers/docs/apis/json-rpc/#quantities-encoding){target=\_blank} string or a [default block parameter](https://ethereum.org/en/developers/docs/apis/json-rpc/#default-block){target=\_blank} - `options` ++"object"++ - (optional) an object containing tracer options (e.g., `tracer: "callTracer"`). **Example**: ```bash title="debug_traceCall" curl -X POST https://testnet-passet-hub-eth-rpc.polkadot.io \ -H "Content-Type: application/json" \ --data '{ "jsonrpc":"2.0", "method":"debug_traceCall", "params":[{ "from": "INSERT_SENDER_ADDRESS", "to": "INSERT_RECIPIENT_ADDRESS", "data": "INSERT_ENCODED_CALL" }, "INSERT_BLOCK_VALUE", {"tracer": "callTracer"}], "id":1 }' ``` Ensure to replace the `INSERT_SENDER_ADDRESS`, `INSERT_RECIPIENT_ADDRESS`, `INSERT_ENCODED_CALL`, and `INSERT_BLOCK_VALUE` with the proper value. --- ## Response Format All responses follow the standard JSON-RPC 2.0 format: ```json { "jsonrpc": "2.0", "id": 1, "result": ... 
// The return value varies by method } ``` ## Error Handling If an error occurs, the response will include an error object: ```json { "jsonrpc": "2.0", "id": 1, "error": { "code": -32000, "message": "Error message here" } } ``` --- END CONTENT --- Doc-Content: https://docs.polkadot.com/develop/toolkit/interoperability/asset-transfer-api/reference/ --- BEGIN CONTENT --- --- title: Asset Transfer API Reference description: Explore the Asset Transfer API Reference for comprehensive details on methods, data types, and functionalities. Essential for cross-chain asset transfers. categories: Reference, Dapps --- # Asset Transfer API Reference
- :octicons-download-16:{ .lg .middle } __Install the Asset Transfer API__ --- Learn how to install [`asset-transfer-api`](https://github.com/paritytech/asset-transfer-api){target=\_blank} into a new or existing project.
[:octicons-arrow-right-24: Get started](/develop/toolkit/interoperability/asset-transfer-api/overview/#install-asset-transfer-api){target=\_blank} - :octicons-code-16:{ .lg .middle } __Dive in with a tutorial__ --- Ready to start coding? Follow along with a step-by-step tutorial.
[:octicons-arrow-right-24: How to use the Asset Transfer API](/develop/toolkit/interoperability/asset-transfer-api/overview/#examples)
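The class and method reference below assumes an instantiated `AssetTransferApi`. As a minimal construction sketch, assuming the `@substrate/asset-transfer-api` package is installed and using an illustrative public Westend WebSocket endpoint:

```ts
import {
  AssetTransferApi,
  constructApiPromise,
} from '@substrate/asset-transfer-api';

async function main() {
  // constructApiPromise opens the connection and returns the pieces the
  // AssetTransferApi constructor expects: the ApiPromise, the chain's specName,
  // and a safe XCM version to fall back on.
  const { api, specName, safeXcmVersion } = await constructApiPromise(
    'wss://westend-rpc.polkadot.io',
  );
  const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion);

  // ...call the methods documented below on assetsApi...
}

main().catch(console.error);
```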

## Asset Transfer API Class Holds open an API connection to a specified chain within the `ApiPromise` to help construct transactions for assets and estimate fees. For a more in-depth explanation of the Asset Transfer API class structure, check the [source code](https://github.com/paritytech/asset-transfer-api/blob/{{dependencies.repositories.asset_transfer_api.version}}/src/AssetTransferApi.ts#L128){target=\_blank}. ### Methods #### Create Transfer Transaction Generates an XCM transaction for transferring assets between chains. It simplifies the process by inferring what type of transaction is required given the inputs, ensuring that the assets are valid, and that the transaction details are correctly formatted. After obtaining the transaction, you must handle the signing and submission process separately. ```ts public async createTransferTransaction<T extends Format>( destChainId: string, destAddr: string, assetIds: string[], amounts: string[], opts: TransferArgsOpts<T> = {}, ): Promise<TxResult<T>> { ``` ??? interface "Request parameters" `destChainId` ++"string"++ ++"required"++ ID of the destination chain (`'0'` for relay chain, other values for parachains). --- `destAddr` ++"string"++ ++"required"++ Address of the recipient account on the destination chain. --- `assetIds` ++"string[]"++ ++"required"++ Array of asset IDs to be transferred. When asset IDs are provided, the API dynamically selects the appropriate pallet for the current chain to handle these specific assets. If the array is empty, the API defaults to using the `balances` pallet. --- `amounts` ++"string[]"++ ++"required"++ Array of amounts corresponding to each asset in `assetIds`. --- `opts` ++"TransferArgsOpts"++ Options for customizing the transfer transaction. These options allow you to specify the transaction format, fee payment details, weight limits, XCM versions, and more. ??? child "Show more" `format` ++"T extends Format"++ Specifies the format for returning a transaction. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` --- `paysWithFeeOrigin` ++"string"++ The Asset ID to pay fees on the current common good parachain. The defaults are as follows: - Polkadot Asset Hub - `'DOT'` - Kusama Asset Hub - `'KSM'` --- `paysWithFeeDest` ++"string"++ Asset ID to pay fees on the destination parachain. --- `weightLimit` ++"{ refTime?: string, proofSize?: string }"++ Custom weight limit option. If not provided, it will default to unlimited. --- `xcmVersion` ++"number"++ Sets the XCM version for message construction. If this is not present a supported version will be queried, and if there is no supported version a safe version will be queried. --- `keepAlive` ++"boolean"++ Enables `transferKeepAlive` for local asset transfers. For creating local asset transfers, if `true` this will allow for a `transferKeepAlive` as opposed to a `transfer`. --- `transferLiquidToken` ++"boolean"++ Declares if this will transfer liquidity tokens. Default is `false`. --- `assetTransferType` ++"string"++ The XCM transfer type used to transfer assets. The `AssetTransferType` type defines the possible values for this parameter. ??? child "Type `AssetTransferType`" ```ts export type AssetTransferType = LocalReserve | DestinationReserve | Teleport | RemoteReserve; ``` !!! note To use the `assetTransferType` parameter, which is a string, you should use the `AssetTransferType` type as if each of its variants are strings. For example: `assetTransferType = 'LocalReserve'`. 
--- `remoteReserveAssetTransferTypeLocation` ++"string"++ The remote reserve location for the XCM transfer. Should be provided when specifying an `assetTransferType` of `RemoteReserve`. --- `feesTransferType` ++"string"++ XCM TransferType used to pay fees for XCM transfer. The `AssetTransferType` type defines the possible values for this parameter. ??? child "Type `AssetTransferType`" ```ts export type AssetTransferType = LocalReserve | DestinationReserve | Teleport | RemoteReserve; ``` !!! note To use the `feesTransferType` parameter, which is a string, you should use the `AssetTransferType` type as if each of its variants are strings. For example: `feesTransferType = 'LocalReserve'`. --- `remoteReserveFeesTransferTypeLocation` ++"string"++ The remote reserve location for the XCM transfer fees. Should be provided when specifying a `feesTransferType` of `RemoteReserve`. --- `customXcmOnDest` ++"string"++ A custom XCM message to be executed on the destination chain. Should be provided if a custom XCM message is needed after transferring assets. Defaults to: ```bash Xcm(vec![DepositAsset { assets: Wild(AllCounted(assets.len())), beneficiary }]) ``` ??? interface "Response parameters" ++"Promise<TxResult<T>>"++ A promise containing the result of constructing the transaction. ??? child "Show more" `dest` ++"string"++ The destination `specName` of the transaction. --- `origin` ++"string"++ The origin `specName` of the transaction. --- `format` ++"Format | 'local'"++ The format type the transaction is outputted in. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` --- `xcmVersion` ++"number | null"++ The XCM version that was used to construct the transaction. --- `direction` ++"Direction | 'local'"++ The direction of the cross-chain transfer. ??? child "Enum `Direction` values" `Local` Local transaction. --- `SystemToPara` System parachain to parachain. --- `SystemToRelay` System parachain to Relay chain. --- `SystemToSystem` System parachain to System parachain. --- `SystemToBridge` System parachain to an external `GlobalConsensus` chain. --- `ParaToPara` Parachain to Parachain. --- `ParaToRelay` Parachain to Relay chain. --- `ParaToSystem` Parachain to System parachain. --- `RelayToSystem` Relay chain to System Parachain. --- `RelayToPara` Relay chain to Parachain. --- `RelayToBridge` Relay chain to an external `GlobalConsensus` chain. `method` ++"Methods"++ The method used in the transaction. ??? child "Type `Methods`" ```ts export type Methods = | LocalTransferTypes | 'transferAssets' | 'transferAssetsUsingTypeAndThen' | 'limitedReserveTransferAssets' | 'limitedTeleportAssets' | 'transferMultiasset' | 'transferMultiassets' | 'transferMultiassetWithFee' | 'claimAssets'; ``` ??? child "Type `LocalTransferTypes`" ```ts export type LocalTransferTypes = | 'assets::transfer' | 'assets::transferKeepAlive' | 'assets::transferAll' | 'foreignAssets::transfer' | 'foreignAssets::transferKeepAlive' | 'foreignAssets::transferAll' | 'balances::transfer' | 'balances::transferKeepAlive' | 'balances::transferAll' | 'poolAssets::transfer' | 'poolAssets::transferKeepAlive' | 'poolAssets::transferAll' | 'tokens::transfer' | 'tokens::transferKeepAlive' | 'tokens::transferAll'; ``` --- `tx` ++"ConstructedFormat<T>"++ The constructed transaction. ??? child "Type `ConstructedFormat`" ```ts export type ConstructedFormat<T> = T extends 'payload' ? GenericExtrinsicPayload : T extends 'call' ? `0x${string}` : T extends 'submittable' ? 
SubmittableExtrinsic<'promise', ISubmittableResult> : never; ``` The `ConstructedFormat` type is a conditional type that returns a specific type based on the value of the TxResult `format` field. - **Payload format** - if the format field is set to `'payload'`, the `ConstructedFormat` type will return a [`GenericExtrinsicPayload`](https://github.com/polkadot-js/api/blob/v15.8.1/packages/types/src/extrinsic/ExtrinsicPayload.ts#L87){target=\_blank} - **Call format** - if the format field is set to `'call'`, the `ConstructedFormat` type will return a hexadecimal string (`0x${string}`). This is the encoded representation of the extrinsic call - **Submittable format** - if the format field is set to `'submittable'`, the `ConstructedFormat` type will return a [`SubmittableExtrinsic`](https://github.com/polkadot-js/api/blob/v15.8.1/packages/api-base/src/types/submittable.ts#L56){target=\_blank}. This is a Polkadot.js type that represents a transaction that can be submitted to the blockchain ??? interface "Example" ***Request*** ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://wss.api.moonbeam.network', ); const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion); let callInfo; try { callInfo = await assetsApi.createTransferTransaction( '2004', '0xF977814e90dA44bFA03b6295A0616a897441aceC', [], ['1000000000000000000'], { format: 'call', keepAlive: true, }, ); console.log(`Call data:\n${JSON.stringify(callInfo, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` ***Response***
Call data: { "origin": "moonbeam", "dest": "moonbeam", "direction": "local", "xcmVersion": null, "method": "balances::transferKeepAlive", "format": "call", "tx": "0x0a03f977814e90da44bfa03b6295a0616a897441acec821a0600" }
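As noted above, signing and submission are handled separately from transaction construction. A minimal sketch of that step, assuming the transaction is instead built with `format: 'submittable'`, signed with the well-known `//Alice` development account, and sent to an illustrative Westend endpoint; the destination address is a placeholder, and the signer's key type must match the target chain:

```ts
import {
  AssetTransferApi,
  constructApiPromise,
} from '@substrate/asset-transfer-api';
import { Keyring } from '@polkadot/keyring';
import { cryptoWaitReady } from '@polkadot/util-crypto';

async function main() {
  await cryptoWaitReady();

  const { api, specName, safeXcmVersion } = await constructApiPromise(
    'wss://westend-rpc.polkadot.io',
  );
  const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion);

  // 'submittable' returns a SubmittableExtrinsic that can be signed and sent directly.
  const callInfo = await assetsApi.createTransferTransaction(
    '0',                   // '0' targets the relay chain
    'INSERT_DEST_ADDRESS', // placeholder recipient address
    [],                    // empty assetIds defaults to the balances pallet (native token)
    ['1000000000000'],
    { format: 'submittable', keepAlive: true },
  );

  const keyring = new Keyring({ type: 'sr25519' });
  const signer = keyring.addFromUri('//Alice'); // well-known development account

  const unsub = await callInfo.tx.signAndSend(signer, ({ status }) => {
    console.log(`Transaction status: ${status.type}`);
    if (status.isFinalized) {
      unsub();
    }
  });
}

main().catch(console.error);
```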
#### Claim Assets Creates a local XCM transaction to retrieve trapped assets. This function can be used to claim assets either locally on a system parachain, on the relay chain, or on any chain that supports the `claimAssets` runtime call. ```ts public async claimAssets<T extends Format>( assetIds: string[], amounts: string[], beneficiary: string, opts: TransferArgsOpts<T>, ): Promise<TxResult<T>> { ``` ??? interface "Request parameters" `assetIds` ++"string[]"++ ++"required"++ Array of asset IDs to be claimed from the `AssetTrap`. --- `amounts` ++"string[]"++ ++"required"++ Array of amounts corresponding to each asset in `assetIds`. --- `beneficiary` ++"string"++ ++"required"++ Address of the account to receive the trapped assets. --- `opts` ++"TransferArgsOpts"++ Options for customizing the claim assets transaction. These options allow you to specify the transaction format, fee payment details, weight limits, XCM versions, and more. ??? child "Show more" `format` ++"T extends Format"++ Specifies the format for returning a transaction. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` --- `paysWithFeeOrigin` ++"string"++ The Asset ID to pay fees on the current common good parachain. The defaults are as follows: - Polkadot Asset Hub - `'DOT'` - Kusama Asset Hub - `'KSM'` --- `paysWithFeeDest` ++"string"++ Asset ID to pay fees on the destination parachain. --- `weightLimit` ++"{ refTime?: string, proofSize?: string }"++ Custom weight limit option. If not provided, it will default to unlimited. --- `xcmVersion` ++"number"++ Sets the XCM version for message construction. If this is not present a supported version will be queried, and if there is no supported version a safe version will be queried. --- `keepAlive` ++"boolean"++ Enables `transferKeepAlive` for local asset transfers. For creating local asset transfers, if `true` this will allow for a `transferKeepAlive` as opposed to a `transfer`. --- `transferLiquidToken` ++"boolean"++ Declares if this will transfer liquidity tokens. Default is `false`. --- `assetTransferType` ++"string"++ The XCM transfer type used to transfer assets. The `AssetTransferType` type defines the possible values for this parameter. ??? child "Type `AssetTransferType`" ```ts export type AssetTransferType = LocalReserve | DestinationReserve | Teleport | RemoteReserve; ``` !!! note To use the `assetTransferType` parameter, which is a string, you should use the `AssetTransferType` type as if each of its variants are strings. For example: `assetTransferType = 'LocalReserve'`. --- `remoteReserveAssetTransferTypeLocation` ++"string"++ The remote reserve location for the XCM transfer. Should be provided when specifying an `assetTransferType` of `RemoteReserve`. --- `feesTransferType` ++"string"++ XCM TransferType used to pay fees for XCM transfer. The `AssetTransferType` type defines the possible values for this parameter. ??? child "Type `AssetTransferType`" ```ts export type AssetTransferType = LocalReserve | DestinationReserve | Teleport | RemoteReserve; ``` !!! note To use the `feesTransferType` parameter, which is a string, you should use the `AssetTransferType` type as if each of its variants are strings. For example: `feesTransferType = 'LocalReserve'`. --- `remoteReserveFeesTransferTypeLocation` ++"string"++ The remote reserve location for the XCM transfer fees. Should be provided when specifying a `feesTransferType` of `RemoteReserve`. --- `customXcmOnDest` ++"string"++ A custom XCM message to be executed on the destination chain. 
Should be provided if a custom XCM message is needed after transferring assets. Defaults to: ```bash Xcm(vec![DepositAsset { assets: Wild(AllCounted(assets.len())), beneficiary }]) ``` ??? interface "Response parameters" ++"Promise<TxResult<T>>"++ A promise containing the result of constructing the transaction. ??? child "Show more" `dest` ++"string"++ The destination `specName` of the transaction. --- `origin` ++"string"++ The origin `specName` of the transaction. --- `format` ++"Format | 'local'"++ The format type the transaction is outputted in. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` --- `xcmVersion` ++"number | null"++ The XCM version that was used to construct the transaction. --- `direction` ++"Direction | 'local'"++ The direction of the cross-chain transfer. ??? child "Enum `Direction` values" `Local` Local transaction. --- `SystemToPara` System parachain to parachain. --- `SystemToRelay` System parachain to Relay chain. --- `SystemToSystem` System parachain to System parachain. --- `SystemToBridge` System parachain to an external `GlobalConsensus` chain. --- `ParaToPara` Parachain to Parachain. --- `ParaToRelay` Parachain to Relay chain. --- `ParaToSystem` Parachain to System parachain. --- `RelayToSystem` Relay chain to System Parachain. --- `RelayToPara` Relay chain to Parachain. --- `RelayToBridge` Relay chain to an external `GlobalConsensus` chain. `method` ++"Methods"++ The method used in the transaction. ??? child "Type `Methods`" ```ts export type Methods = | LocalTransferTypes | 'transferAssets' | 'transferAssetsUsingTypeAndThen' | 'limitedReserveTransferAssets' | 'limitedTeleportAssets' | 'transferMultiasset' | 'transferMultiassets' | 'transferMultiassetWithFee' | 'claimAssets'; ``` ??? child "Type `LocalTransferTypes`" ```ts export type LocalTransferTypes = | 'assets::transfer' | 'assets::transferKeepAlive' | 'assets::transferAll' | 'foreignAssets::transfer' | 'foreignAssets::transferKeepAlive' | 'foreignAssets::transferAll' | 'balances::transfer' | 'balances::transferKeepAlive' | 'balances::transferAll' | 'poolAssets::transfer' | 'poolAssets::transferKeepAlive' | 'poolAssets::transferAll' | 'tokens::transfer' | 'tokens::transferKeepAlive' | 'tokens::transferAll'; ``` --- `tx` ++"ConstructedFormat<T>"++ The constructed transaction. ??? child "Type `ConstructedFormat`" ```ts export type ConstructedFormat<T> = T extends 'payload' ? GenericExtrinsicPayload : T extends 'call' ? `0x${string}` : T extends 'submittable' ? SubmittableExtrinsic<'promise', ISubmittableResult> : never; ``` The `ConstructedFormat` type is a conditional type that returns a specific type based on the value of the TxResult `format` field. - **Payload format** - if the format field is set to `'payload'`, the `ConstructedFormat` type will return a [`GenericExtrinsicPayload`](https://github.com/polkadot-js/api/blob/v15.8.1/packages/types/src/extrinsic/ExtrinsicPayload.ts#L87){target=\_blank} - **Call format** - if the format field is set to `'call'`, the `ConstructedFormat` type will return a hexadecimal string (`0x${string}`). This is the encoded representation of the extrinsic call - **Submittable format** - if the format field is set to `'submittable'`, the `ConstructedFormat` type will return a [`SubmittableExtrinsic`](https://github.com/polkadot-js/api/blob/v15.8.1/packages/api-base/src/types/submittable.ts#L56){target=\_blank}. This is a Polkadot.js type that represents a transaction that can be submitted to the blockchain ??? 
interface "Example" ***Request*** ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://westend-rpc.polkadot.io', ); const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion); let callInfo; try { callInfo = await assetsApi.claimAssets( [ `{"parents":"0","interior":{"X2":[{"PalletInstance":"50"},{"GeneralIndex":"1984"}]}}`, ], ['1000000000000'], '0xf5d5714c084c112843aca74f8c498da06cc5a2d63153b825189baa51043b1f0b', { format: 'call', xcmVersion: 2, }, ); console.log(`Call data:\n${JSON.stringify(callInfo, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` ***Response***
Call data: { "origin": "0", "dest": "westend", "direction": "local", "xcmVersion": 2, "method": "claimAssets", "format": "call", "tx": "0x630c0104000002043205011f00070010a5d4e80100010100f5d5714c084c112843aca74f8c498da06cc5a2d63153b825189baa51043b1f0b" }
#### Decode Extrinsic Decodes the hex of an extrinsic into a readable string format. ```ts public decodeExtrinsic<T extends Format>(encodedTransaction: string, format: T): string { ``` ??? interface "Request parameters" `encodedTransaction` ++"string"++ ++"required"++ A hex encoded extrinsic. --- `format` ++"T extends Format"++ ++"required"++ Specifies the format for returning a transaction. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` ??? interface "Response parameters" ++"string"++ Decoded extrinsic in readable string format. ??? interface "Example" ***Request*** ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://wss.api.moonbeam.network', ); const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion); const encodedExt = '0x0a03f977814e90da44bfa03b6295a0616a897441acec821a0600'; try { const decodedExt = assetsApi.decodeExtrinsic(encodedExt, 'call'); console.log( `Decoded tx:\n ${JSON.stringify(JSON.parse(decodedExt), null, 4)}`, ); } catch (e) { console.error(e); throw Error(e as string); } } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` ***Response***
Decoded tx: { "args": { "dest": "0xF977814e90dA44bFA03b6295A0616a897441aceC", "value": "100,000" }, "method": "transferKeepAlive", "section": "balances" }
#### Fetch Fee Info Fetches estimated fee information for an extrinsic. ```ts public async fetchFeeInfo<T extends Format>( tx: ConstructedFormat<T>, format: T, ): Promise<RuntimeDispatchInfo | RuntimeDispatchInfoV1 | null> { ``` ??? interface "Request parameters" `tx` ++"ConstructedFormat<T>"++ ++"required"++ The constructed transaction. ??? child "Type `ConstructedFormat`" ```ts export type ConstructedFormat<T> = T extends 'payload' ? GenericExtrinsicPayload : T extends 'call' ? `0x${string}` : T extends 'submittable' ? SubmittableExtrinsic<'promise', ISubmittableResult> : never; ``` The `ConstructedFormat` type is a conditional type that returns a specific type based on the value of the TxResult `format` field. - **Payload format** - if the format field is set to `'payload'`, the `ConstructedFormat` type will return a [`GenericExtrinsicPayload`](https://github.com/polkadot-js/api/blob/{{ dependencies.javascript_packages.asset_transfer_api.polkadot_js_api_version}}/packages/types/src/extrinsic/ExtrinsicPayload.ts#L87){target=\_blank} - **Call format** - if the format field is set to `'call'`, the `ConstructedFormat` type will return a hexadecimal string (`0x${string}`). This is the encoded representation of the extrinsic call - **Submittable format** - if the format field is set to `'submittable'`, the `ConstructedFormat` type will return a [`SubmittableExtrinsic`](https://github.com/polkadot-js/api/blob/{{dependencies.javascript_packages.asset_transfer_api.polkadot_js_api_version}}/packages/api-base/src/types/submittable.ts#L56){target=\_blank}. This is a Polkadot.js type that represents a transaction that can be submitted to the blockchain --- `format` ++"T extends Format"++ ++"required"++ Specifies the format for returning a transaction. ??? child "Type `Format`" ```ts export type Format = 'payload' | 'call' | 'submittable'; ``` ??? interface "Response parameters" ++"Promise"++ A promise containing the estimated fee information for the provided extrinsic. ??? child "Type `RuntimeDispatchInfo`" ```ts export interface RuntimeDispatchInfo extends Struct { readonly weight: Weight; readonly class: DispatchClass; readonly partialFee: Balance; } ``` For more information on the underlying types and fields of `RuntimeDispatchInfo`, check the [`RuntimeDispatchInfo`](https://github.com/polkadot-js/api/blob/{{ dependencies.javascript_packages.asset_transfer_api.polkadot_js_api_version}}/packages/types/src/interfaces/payment/types.ts#L21){target=\_blank} source code. ??? child "Type `RuntimeDispatchInfoV1`" ```ts export interface RuntimeDispatchInfoV1 extends Struct { readonly weight: WeightV1; readonly class: DispatchClass; readonly partialFee: Balance; } ``` For more information on the underlying types and fields of `RuntimeDispatchInfoV1`, check the [`RuntimeDispatchInfoV1`](https://github.com/polkadot-js/api/blob/{{dependencies.javascript_packages.asset_transfer_api.polkadot_js_api_version}}/packages/types/src/interfaces/payment/types.ts#L28){target=\_blank} source code. ??? 
interface "Example" ***Request*** ```ts import { AssetTransferApi, constructApiPromise, } from '@substrate/asset-transfer-api'; async function main() { const { api, specName, safeXcmVersion } = await constructApiPromise( 'wss://wss.api.moonbeam.network', ); const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion); const encodedExt = '0x0a03f977814e90da44bfa03b6295a0616a897441acec821a0600'; try { const decodedExt = await assetsApi.fetchFeeInfo(encodedExt, 'call'); console.log(`Fee info:\n${JSON.stringify(decodedExt, null, 4)}`); } catch (e) { console.error(e); throw Error(e as string); } } main() .catch((err) => console.error(err)) .finally(() => process.exit()); ``` ***Response***
Fee info: { "weight": { "refTime": 163777000, "proofSize": 3581 }, "class": "Normal", "partialFee": 0 }
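A common pattern is to estimate fees before submitting a transfer: build the transaction in `'call'` format and pass the encoded call to `fetchFeeInfo`. A minimal sketch under the same illustrative assumptions as the earlier examples (Westend endpoint, placeholder destination address):

```ts
import {
  AssetTransferApi,
  constructApiPromise,
} from '@substrate/asset-transfer-api';

async function main() {
  const { api, specName, safeXcmVersion } = await constructApiPromise(
    'wss://westend-rpc.polkadot.io',
  );
  const assetsApi = new AssetTransferApi(api, specName, safeXcmVersion);

  // Build the transfer as an encoded call first...
  const callInfo = await assetsApi.createTransferTransaction(
    '0',                   // '0' targets the relay chain
    'INSERT_DEST_ADDRESS', // placeholder recipient address
    [],                    // empty assetIds defaults to the balances pallet
    ['1000000000000'],
    { format: 'call' },
  );

  // ...then ask the node for the estimated dispatch info, including partialFee.
  const feeInfo = await assetsApi.fetchFeeInfo(callInfo.tx, 'call');
  console.log(`Estimated partial fee: ${feeInfo?.partialFee.toString()}`);
}

main()
  .catch(console.error)
  .finally(() => process.exit());
```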
--- END CONTENT --- Doc-Content: https://docs.polkadot.com/polkadot-protocol/glossary/ --- BEGIN CONTENT --- --- title: Glossary description: Glossary of terms used within the Polkadot ecosystem, Polkadot SDK, its subsequent libraries, and other relevant Web3 terminology. template: root-subdirectory-page.html categories: Reference --- # Glossary Key definitions, concepts, and terminology specific to the Polkadot ecosystem are included here. Additional glossaries from around the ecosystem you might find helpful: - [Polkadot Wiki Glossary](https://wiki.polkadot.network/general/glossary/){target=\_blank} - [Polkadot SDK Glossary](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/glossary/index.html){target=\_blank} ## Authority The role in a blockchain that can participate in consensus mechanisms. - [GRANDPA](#grandpa) - the authorities vote on chains they consider final - [Blind Assignment of Blockchain Extension](#blind-assignment-of-blockchain-extension-babe) (BABE) - the authorities are also [block authors](#block-author) Authority sets can be used as a basis for consensus mechanisms such as the [Nominated Proof of Stake (NPoS)](#nominated-proof-of-stake-npos) protocol. ## Authority Round (Aura) A deterministic [consensus](#consensus) protocol where block production is limited to a rotating list of [authorities](#authority) that take turns creating blocks. In authority round (Aura) consensus, most online authorities are assumed to be honest. It is often used in combination withΒ [GRANDPA](#grandpa)Β as aΒ [hybrid consensus](#hybrid-consensus)Β protocol. Learn more by reading the official [Aura consensus algorithm](https://openethereum.github.io/Aura){target=\_blank} wiki article. ## Blind Assignment of Blockchain Extension (BABE) A [block authoring](#block-author) protocol similar to [Aura](#authority-round-aura), except [authorities](#authority) win [slots](#slot) based on a Verifiable Random Function (VRF) instead of the round-robin selection method. The winning authority can select a chain and submit a new block. Learn more by reading the official Web3 Foundation [BABE research document](https://research.web3.foundation/Polkadot/protocols/block-production/Babe){target=\_blank}. ## Block Author The node responsible for the creation of a block, also called _block producers_. In a Proof of Work (PoW) blockchain, these nodes are called _miners_. ## Byzantine Fault Tolerance (BFT) The ability of a distributed computer network to remain operational if a certain proportion of its nodes or [authorities](#authority) are defective or behaving maliciously. A distributed network is typically considered Byzantine fault tolerant if it can remain functional, with up to one-third of nodes assumed to be defective, offline, actively malicious, and part of a coordinated attack. ### Byzantine Failure The loss of a network service due to node failures that exceed the proportion of nodes required to reach consensus. ### Practical Byzantine Fault Tolerance (pBFT) An early approach to Byzantine fault tolerance (BFT), practical Byzantine fault tolerance (pBFT) systems tolerate Byzantine behavior from up to one-third of participants. The communication overhead for such systems is `O(nΒ²)`, where `n` is the number of nodes (participants) in the system. ### Preimage A preimage is the data that is input into a hash function to calculate a hash. 
Since a hash function is a [one-way function](https://en.wikipedia.org/wiki/One-way_function){target=\_blank}, the output, the hash, cannot be used to reveal the input, the preimage. ## Call In the context of pallets containing functions to be dispatched to the runtime, `Call` is an enumeration data type that describes the functions that can be dispatched with one variant per pallet. A `Call` represents a [dispatch](#dispatchable) data structure object. ## Chain Specification A chain specification file defines the properties required to run a node in an active or new Polkadot SDK-built network. It often contains the initial genesis runtime code, network properties (such as the network's name), the initial state for some pallets, and the boot node list. The chain specification file makes it easy to use a single Polkadot SDK codebase as the foundation for multiple independently configured chains. ## Collator An [author](#block-author) of a [parachain](#parachain) network. They aren't [authorities](#authority) in themselves, as they require a [relay chain](#relay-chain) to coordinate [consensus](#consensus). More details are found on the [Polkadot Collator Wiki](https://wiki.polkadot.network/learn/learn-collator/){target=\_blank}. ## Collective Most often used to refer to an instance of the Collective pallet on Polkadot SDK-based networks such as [Kusama](#kusama) or [Polkadot](#polkadot) if the Collective pallet is part of the FRAME-based runtime for the network. ## Consensus Consensus is the process blockchain nodes use to agree on a chain's canonical fork. It is composed of [authorship](#block-author), finality, and [fork-choice rule](#fork-choice-rulestrategy). In the Polkadot ecosystem, these three components are usually separate and the term consensus often refers specifically to authorship. See also [hybrid consensus](#hybrid-consensus). ## Consensus Algorithm Ensures a set of [actors](#authority)β€”who don't necessarily trust each otherβ€”can reach an agreement about the state as the result of some computation. Most consensus algorithms assume that up to one-third of the actors or nodes can be [Byzantine fault tolerant](#byzantine-fault-tolerance-bft). Consensus algorithms are generally concerned with ensuring two properties: - **Safety** - indicating that all honest nodes eventually agreed on the state of the chain - **Liveness** - indicating the ability of the chain to keep progressing ## Consensus Engine The node subsystem responsible for consensus tasks. For detailed information about the consensus strategies of the [Polkadot](#polkadot) network, see the [Polkadot Consensus](/polkadot-protocol/architecture/polkadot-chain/pos-consensus/){target=\_blank} blog series. See also [hybrid consensus](#hybrid-consensus). ## Coretime The time allocated for utilizing a core, measured in relay chain blocks. There are two types of coretime: *on-demand* and *bulk*. On-demand coretime refers to coretime acquired through bidding in near real-time for the validation of a single parachain block on one of the cores reserved specifically for on-demand orders. They are available as an on-demand coretime pool. Set of cores that are available on-demand. Cores reserved through bulk coretime could also be made available in the on-demand coretime pool, in parts or in entirety. Bulk coretime is a fixed duration of continuous coretime represented by an NFT that can be split, shared, or resold. 
It is managed by the [Broker pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_broker/index.html){target=\_blank}. ## Development Phrase A [mnemonic phrase](https://en.wikipedia.org/wiki/Mnemonic#For_numerical_sequences_and_mathematical_operations){target=\_blank} that is intentionally made public. Well-known development accounts, such as Alice, Bob, Charlie, Dave, Eve, and Ferdie, are generated from the same secret phrase: ``` bottom drive obey lake curtain smoke basket hold race lonely fit walk ``` Many tools in the Polkadot SDK ecosystem, such as [`subkey`](https://github.com/paritytech/polkadot-sdk/tree/{{dependencies.repositories.polkadot_sdk.version}}/substrate/bin/utils/subkey){target=\_blank}, allow you to implicitly specify an account using a derivation path such as `//Alice`. ## Digest An extensible field of the [block header](#header) that encodes information needed by several actors in a blockchain network, including: - [Light clients](#light-client) for chain synchronization - Consensus engines for block verification - The runtime itself, in the case of pre-runtime digests ## Dispatchable Function objects that act as the entry points in FRAME [pallets](#pallet). Internal or external entities can call them to interact with the blockchain’s state. They are a core aspect of the runtime logic, handling [transactions](#transaction) and other state-changing operations. ## Events A means of recording that some particular [state](#state) transition happened. In the context of [FRAME](#frame-framework-for-runtime-aggregation-of-modularized-entities), events are composable data types that each [pallet](#pallet) can individually define. Events in FRAME are implemented as a set of transient storage items inspected immediately after a block has been executed and reset during block initialization. ## Executor A means of executing a function call in a given [runtime](#runtime) with a set of dependencies. There are two orchestration engines in Polkadot SDK, _WebAssembly_ and _native_. - The _native executor_ uses a natively compiled runtime embedded in the node to execute calls. This is a performance optimization available to up-to-date nodes - The _WebAssembly executor_ uses a [Wasm](#webassembly-wasm) binary and a Wasm interpreter to execute calls. The binary is guaranteed to be up-to-date regardless of the version of the blockchain node because it is persisted in the [state](#state) of the Polkadot SDK-based chain ## Existential Deposit The minimum balance an account is allowed to have in the [Balances pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_balances/index.html){target=\_blank}. Accounts cannot be created with a balance less than the existential deposit amount. If an account balance drops below this amount, the Balances pallet uses [a FRAME System API](https://paritytech.github.io/substrate/master/frame_system/pallet/struct.Pallet.html#method.dec_ref){target=\_blank} to drop its references to that account. If the Balances pallet reference to an account is dropped, the account can be [reaped](https://paritytech.github.io/substrate/master/frame_system/pallet/struct.Pallet.html#method.allow_death){target=\_blank}. ## Extrinsic A general term for data that originates outside the runtime, is included in a block, and leads to some action. This includes user-initiated transactions and inherent transactions placed into the block by the block builder. 
It is a SCALE-encoded array typically consisting of a version number, signature, and varying data types indicating the resulting runtime function to be called. Extrinsics can take two forms: [inherents](#inherent-transactions) and [transactions](#transaction). For more technical details, see the [Polkadot spec](https://spec.polkadot.network/id-extrinsics){target=\_blank}. ## Fork Choice Rule/Strategy A fork choice rule or strategy helps determine which chain is valid when reconciling several network forks. A common fork choice rule is the [longest chain](https://paritytech.github.io/polkadot-sdk/master/sc_consensus/struct.LongestChain.html){target=\_blank}, in which the chain with the most blocks is selected. ## FRAME (Framework for Runtime Aggregation of Modularized Entities) Enables developers to create blockchain [runtime](#runtime) environments from a modular set of components called [pallets](#pallet). It utilizes a set of procedural macros to construct runtimes. [Visit the Polkadot SDK docs for more details on FRAME.](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/polkadot_sdk/frame_runtime/index.html){target=\_blank} ## Full Node A node that prunes historical states, keeping only recently finalized block states to reduce storage needs. Full nodes provide current chain state access and allow direct submission and validation of [extrinsics](#extrinsic), maintaining network decentralization. ## Genesis Configuration A mechanism for specifying the initial state of a blockchain. By convention, this initial state or first block is commonly referred to as the genesis state or genesis block. The genesis configuration for Polkadot SDK-based chains is accomplished by way of a [chain specification](#chain-specification) file. ## GRANDPA A deterministic finality mechanism for blockchains that is implemented in the [Rust](https://www.rust-lang.org/){target=\_blank} programming language. The [formal specification](https://github.com/w3f/consensus/blob/master/pdf/grandpa-old.pdf){target=\_blank} is maintained by the [Web3 Foundation](https://web3.foundation/){target=\_blank}. ## Header A structure that aggregates the information used to summarize a block. Primarily, it consists of cryptographic information used by [light clients](#light-client) to get minimally secure but very efficient chain synchronization. ## Hybrid Consensus A blockchain consensus protocol that consists of independent or loosely coupled mechanisms for [block production](#block-author) and finality. Hybrid consensus allows the chain to grow as fast as probabilistic consensus protocols, such as [Aura](#authority-round-aura), while maintaining the same level of security as deterministic finality consensus protocols, such as [GRANDPA](#grandpa). ## Inherent Transactions A special type of unsigned transaction, referred to as _inherents_, that enables a block authoring node to insert information that doesn't require validation directly into a block. Only the block-authoring node that calls the inherent transaction function can insert data into its block. In general, validators assume the data inserted using an inherent transaction is valid and reasonable even if it can't be deterministically verified. ## JSON-RPC A stateless, lightweight remote procedure call protocol encoded in JavaScript Object Notation (JSON). JSON-RPC provides a standard way to call functions on a remote system by using JSON. 
For Polkadot SDK, this protocol is implemented through the [Parity JSON-RPC](https://github.com/paritytech/jsonrpc){target=\_blank} crate. ## Keystore A subsystem for managing keys for the purpose of producing new blocks. ## Kusama [Kusama](https://kusama.network/){target=\_blank} is a Polkadot SDK-based blockchain that implements a design similar to the [Polkadot](#polkadot) network. Kusama is a [canary](https://en.wiktionary.org/wiki/canary_in_a_coal_mine){target=\_blank} network and is referred to as [Polkadot's "wild cousin."](https://wiki.polkadot.network/learn/learn-comparisons-kusama/){target=\_blank} As a canary network, Kusama is expected to be more stable than a test network like [Westend](#westend) but less stable than a production network like [Polkadot](#polkadot). Kusama is controlled by its network participants and is intended to be stable enough to encourage meaningful experimentation. ## libp2p A peer-to-peer networking stack that allows the use of many transport mechanisms, including WebSockets (usable in a web browser). Polkadot SDK uses the [Rust implementation](https://github.com/libp2p/rust-libp2p){target=\_blank} of the `libp2p` networking stack. ## Light Client A type of blockchain node that doesn't store the [chain state](#state) or produce blocks. A light client can verify cryptographic primitives and provides a [remote procedure call (RPC)](https://en.wikipedia.org/wiki/Remote_procedure_call){target=\_blank} server, enabling blockchain users to interact with the network. ## Metadata Data that provides information about one or more aspects of a system. The metadata that exposes information about a Polkadot SDK blockchain enables you to interact with that system. ## Nominated Proof of Stake (NPoS) A method for determining [validators](#validator) or _[authorities](#authority)_ based on a willingness to commit their stake to the proper functioning of one or more block-producing nodes. ## Oracle An entity that connects a blockchain to a non-blockchain data source. Oracles enable the blockchain to access and act upon information from existing data sources and incorporate data from non-blockchain systems and services. ## Origin A [FRAME](#frame-framework-for-runtime-aggregation-of-modularized-entities) primitive that identifies the source of a [dispatched](#dispatchable) function call into the [runtime](#runtime). The FRAME System pallet defines three built-in [origins](#origin). As a [pallet](#pallet) developer, you can also define custom origins, such as those defined by the [Collective pallet](https://paritytech.github.io/substrate/master/pallet_collective/enum.RawOrigin.html){target=\_blank}. ## Pallet A module that can be used to extend the capabilities of a [FRAME](#frame-framework-for-runtime-aggregation-of-modularized-entities)-based [runtime](#runtime). Pallets bundle domain-specific logic with runtime primitives like [events](#events) and [storage items](#storage-item). ## Parachain A parachain is a blockchain that derives shared infrastructure and security from a _[relay chain](#relay-chain)_. You can learn more about parachains on the [Polkadot Wiki](https://wiki.polkadot.network/docs/en/learn-parachains){target=\_blank}. ## Paseo Paseo TestNet provisions testing on Polkadot's "production" runtime, which means less chance of feature or code mismatch when developing parachain apps. 
Specifically, after the [Polkadot Technical fellowship](https://wiki.polkadot.network/learn/learn-polkadot-technical-fellowship/){target=\_blank} proposes a runtime upgrade for Polkadot, this TestNet is updated, giving a period where the TestNet will be ahead of Polkadot to allow for testing. ## Polkadot The [Polkadot network](https://polkadot.com/){target=\_blank} is a blockchain that serves as the central hub of a heterogeneous blockchain network. It serves the role of the [relay chain](#relay-chain) and provides shared infrastructure and security to support [parachains](#parachain). ## Polkadot Cloud Polkadot Cloud is a platform for deploying resilient, customizable and scalable Web3 applications through Polkadot's functionality. It encompasses the wider Polkadot network infrastructure and security layer where parachains operate. The platform enables users to launch Ethereum-compatible chains, build specialized blockchains, and flexibly manage computing resources through on-demand or bulk coretime purchases. Initially launched with basic parachain functionality, Polkadot Cloud has evolved to offer enhanced flexibility with features like coretime, elastic scaling, and async backing for improved performance. ## Polkadot Hub Polkadot Hub is a Layer 1 platform that serves as the primary entry point to the Polkadot ecosystem, providing essential functionality without requiring parachain deployment. It offers core services including smart contracts, identity management, staking, governance, and interoperability with other ecosystems, making it simple and fast for both builders and users to get started in Web3. ## PolkaVM PolkaVM is a custom virtual machine optimized for performance, leveraging a RISC-V-based architecture to support Solidity and any language that compiles to RISC-V. It is specifically designed for the Polkadot ecosystem, enabling smart contract deployment and execution. ## Relay Chain Relay chains are blockchains that provide shared infrastructure and security to the [parachains](#parachain) in the network. In addition to providing [consensus](#consensus) capabilities, relay chains allow parachains to communicate and exchange digital assets without needing to trust one another. ## Rococo A [parachain](#parachain) test network for the Polkadot network. The [Rococo](#rococo) network is a Polkadot SDK-based blockchain with an October 14, 2024 deprecation date. Development teams are encouraged to use the Paseo TestNet instead. ## Runtime The runtime represents the [state transition function](#state-transition-function-stf) for a blockchain. In Polkadot SDK, the runtime is stored as a [Wasm](#webassembly-wasm) binary in the chain state. The Runtime is stored under a unique state key and can be modified during the execution of the state transition function. ## Slot A fixed, equal interval of time used by consensus engines such as [Aura](#authority-round-aura) and [BABE](#blind-assignment-of-blockchain-extension-babe). In each slot, a subset of [authorities](#authority) is permitted, or obliged, to [author](#block-author) a block. ## Sovereign Account The unique account identifier for each chain in the relay chain ecosystem. It is often used in cross-consensus (XCM) interactions to sign XCM messages sent to the relay chain or other chains in the ecosystem. The sovereign account for each chain is a root-level account that can only be accessed using the Sudo pallet or through governance. 
The account identifier is calculated by concatenating the Blake2 hash of a specific text string and the registered parachain identifier.

## SS58 Address Format

A public key address based on the Bitcoin [`Base-58-check`](https://en.bitcoin.it/wiki/Base58Check_encoding){target=\_blank} encoding. Each Polkadot SDK SS58 address uses a `base-58` encoded value to identify a specific account on a specific Polkadot SDK-based chain. The [canonical `ss58-registry`](https://github.com/paritytech/ss58-registry){target=\_blank} provides additional details about the address format used by different Polkadot SDK-based chains, including the network prefix and website used for different networks.
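As a rough illustration of the format, the sketch below derives an SS58 address for a 32-byte public key and a simple one-byte network prefix. This is a hedged sketch, not the canonical implementation: it assumes the third-party `bs58` and `blake2` crates, and the `SS58PRE` checksum context and two-byte checksum reflect the common convention for 32-byte account IDs. Consult the `ss58-registry` for the authoritative details.

```rust
// Hedged sketch of SS58 encoding for a simple (single-byte) network prefix.
// Assumes the `bs58` and `blake2` crates; not the canonical implementation.
use blake2::{Blake2b512, Digest};

fn ss58_encode(network_prefix: u8, public_key: &[u8; 32]) -> String {
    // Payload: the network prefix byte followed by the raw public key.
    let mut payload = vec![network_prefix];
    payload.extend_from_slice(public_key);

    // Checksum: the first two bytes of a Blake2b-512 hash over the
    // "SS58PRE" context string plus the payload.
    let mut hasher = Blake2b512::new();
    hasher.update(b"SS58PRE");
    hasher.update(&payload);
    let checksum = hasher.finalize();
    payload.extend_from_slice(&checksum[..2]);

    // The address is the base-58 encoding of payload || checksum.
    bs58::encode(payload).into_string()
}

fn main() {
    // Prefix 0 is registered for Polkadot in the ss58-registry; other chains
    // use other prefixes.
    let address = ss58_encode(0, &[0u8; 32]);
    println!("{address}");
}
```

Changing only the network prefix yields a different address string for the same public key, which is why one account key appears with different addresses on different Polkadot SDK-based chains.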
## State Transition Function (STF)

The logic of a blockchain that determines how the state changes when a block is processed. In Polkadot SDK, the state transition function is effectively equivalent to the [runtime](#runtime).

## Storage Item

[FRAME](#frame-framework-for-runtime-aggregation-of-modularized-entities) primitives that provide type-safe data persistence capabilities to the [runtime](#runtime). Learn more in the [storage items](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/index.html){target=\_blank} reference document in the Polkadot SDK.

## Substrate

A flexible framework for building modular, efficient, and upgradeable blockchains. Substrate is written in the [Rust](https://www.rust-lang.org/){target=\_blank} programming language and is maintained by [Parity Technologies](https://www.parity.io/){target=\_blank}.

## Transaction

An [extrinsic](#extrinsic) that includes a signature that can be used to verify the account authorizing it inherently or via [signed extensions](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/signed_extensions/index.html){target=\_blank}.

## Transaction Era

A definable period expressed as a range of block numbers during which a transaction can be included in a block. Transaction eras are used to protect against transaction replay attacks if an account is reaped and its replay-protecting nonce is reset to zero.

## Trie (Patricia Merkle Tree)

A data structure used to represent sets of key-value pairs that enables the items in the data set to be stored and retrieved using a cryptographic hash. Because incremental changes to the data set result in a new hash, retrieving data is efficient even if the data set is very large. With this data structure, you can also prove whether the data set includes any particular key-value pair without access to the entire data set. In Polkadot SDK-based blockchains, state is stored in a trie data structure that supports the efficient creation of incremental digests. This trie is exposed to the [runtime](#runtime) as [a simple key/value map](#storage-item) where both keys and values can be arbitrary byte arrays.

## Validator

A validator is a node that participates in the consensus mechanism of the network. Its roles include block production, transaction validation, network integrity, and security maintenance.

## WebAssembly (Wasm)

An execution architecture that allows for the efficient, platform-neutral expression of deterministic, machine-executable logic. [Wasm](https://webassembly.org/){target=\_blank} can be compiled from many languages, including the [Rust](https://www.rust-lang.org/){target=\_blank} programming language. Polkadot SDK-based chains use a Wasm binary to provide portable [runtimes](#runtime) that can be included as part of the chain's state.

## Weight

A convention used in Polkadot SDK-based blockchains to measure and manage the time it takes to validate a block. Polkadot SDK defines one unit of weight as one picosecond of execution time on reference hardware. The maximum block weight should be equivalent to one-third of the target block time, with an allocation of one-third each for:

- Block construction
- Network propagation
- Import and verification

For example, with a 6-second target block time, this rule gives a maximum block weight of roughly two seconds of execution on reference hardware, or 2 × 10¹² weight units.

By defining weights, you can trade off the number of transactions per second and the hardware required to maintain the target block time appropriate for your use case. Weights are defined in the runtime, meaning you can tune them using runtime updates to keep up with hardware and software improvements.

## Westend

Westend is a Parity-maintained, Polkadot SDK-based blockchain that serves as a test network for the [Polkadot](#polkadot) network.
--- END CONTENT ---