Key-Value Store

Purpose

This documentation explains how to use substreams-sink-kv to write data from your existing Substreams into a key-value store and serve it back through Connect-Web/gRPC.

Overview

substreams-sink-kv works by reading the output of a specially designed Substreams module (usually named kv_out) that produces data as a protobuf-encoded sf.substreams.sink.kv.v1.KVOperations message.

The data is written to a key-value store. The currently supported stores are Badger, Google Cloud Bigtable, and TiKV.

A Connect-Web interface makes the data available directly from the substreams-sink-kv process. Alternatively, you can consume the data directly from your key-value store.

Requirements

  • An existing Substreams (including substreams.yaml and Rust code) that you want to instrument for substreams-sink-kv.

  • A key-value store where you want to send your data (a local Badger file can be used for development).

  • Knowledge of Substreams development (start here).

  • A working Rust installation and compiler.

Installation

Install substreams-sink-kv from the pre-built binaries published on the project's GitHub releases page (or build it from source), and make sure the binary is reachable through your PATH.

Instrumenting your Substreams

Assumptions

The following instructions assume that you are instrumenting substreams-eth-block-meta, which contains:

  • A store store_block_meta_end, defined as shown in the first sketch below

  • An eth.block_meta.v1.BlockMeta protobuf structure, shown in the second sketch below
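A simplified sketch of the store module, reconstructed from the substreams-eth-block-meta repository; block_to_block_meta is a helper from that project, and the exact store type may differ in your version:

```rust
use substreams::store::{StoreSet, StoreSetProto};
use substreams_ethereum::pb::eth::v2 as eth;

use crate::pb::block_meta::BlockMeta;

#[substreams::handlers::store]
pub fn store_block_meta_end(blk: eth::Block, s: StoreSetProto<BlockMeta>) {
    // 'block_to_block_meta' is a helper from substreams-eth-block-meta that
    // extracts the timestamp and a BlockMeta message from the raw block.
    let (timestamp, meta) = block_to_block_meta(blk);

    // Record this block as the last one seen for its month and its day.
    s.set(meta.number, format!("month:last:{}", timestamp.format("%Y%m")), &meta);
    s.set(meta.number, format!("day:last:{}", timestamp.format("%Y%m%d")), &meta);
}
```

And a sketch of the protobuf structure; the field layout is an assumption based on the published eth.block_meta.v1 package:

```protobuf
syntax = "proto3";

package eth.block_meta.v1;

import "google/protobuf/timestamp.proto";

message BlockMeta {
  uint64 number = 1;
  bytes hash = 2;
  bytes parent_hash = 3;
  google.protobuf.Timestamp timestamp = 4;
}
```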

Note The substreams-eth-block-meta repository is already instrumented for sink-kv; the changes proposed here are a simplified version of what has been implemented there. Please adjust the proposed code to your own Substreams.

Import the Cargo module

  1. Add the substreams-sink-kv crate to your Cargo.toml:
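For example (the version number is illustrative; check crates.io for the current release):

```toml
[dependencies]
substreams-sink-kv = "0.1"
```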

  2. Add a kv_out map module definition to your substreams.yaml:
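A sketch of the module definition, assuming your store module is named store_block_meta_end as above:

```yaml
modules:
  - name: kv_out
    kind: map
    inputs:
      - store: store_block_meta_end
        mode: deltas
    output:
      type: proto:sf.substreams.sink.kv.v1.KVOperations
```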

  3. Add a kv_out public function to your src/lib.rs:
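A minimal sketch; the KvOperations import path may differ slightly between versions of the substreams-sink-kv crate:

```rust
use substreams::errors::Error;
use substreams::store::{self, DeltaProto};
use substreams_sink_kv::pb::kv::KvOperations;

use crate::kv;
use crate::pb::block_meta::BlockMeta;

#[substreams::handlers::map]
pub fn kv_out(deltas: store::Deltas<DeltaProto<BlockMeta>>) -> Result<KvOperations, Error> {
    // Accumulate the key-value operations derived from the store deltas.
    let mut kv_ops: KvOperations = Default::default();
    kv::process_deltas(&mut kv_ops, deltas);
    Ok(kv_ops)
}
```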

  4. Add the kv::process_deltas transformation function referenced in the previous snippet:
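A sketch of the transformation, assuming prost-generated bindings for BlockMeta and the push_new/push_delete helpers exposed by the crate; adapt the key handling to your own data:

```rust
// src/kv.rs
use substreams::pb::substreams::store_delta::Operation;
use substreams::store::{self, DeltaProto};
use substreams_sink_kv::pb::kv::KvOperations;

use crate::pb::block_meta::BlockMeta;

pub fn process_deltas(ops: &mut KvOperations, deltas: store::Deltas<DeltaProto<BlockMeta>>) {
    use prost::Message;

    for delta in deltas.deltas {
        match delta.operation {
            // Creations and updates both become a write of the new value.
            Operation::Create | Operation::Update => {
                ops.push_new(delta.key, delta.new_value.encode_to_vec(), delta.ordinal);
            }
            // Deletions remove the key from the store.
            Operation::Delete => ops.push_delete(delta.key, delta.ordinal),
            x => panic!("unsupported operation: {:?}", x),
        }
    }
}
```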

Test your substreams

  1. Compile your updated Rust code to WebAssembly:
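Substreams modules are compiled to a WASM target:

```bash
cargo build --release --target wasm32-unknown-unknown
```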

  2. Run it with the substreams CLI directly:
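For example, against a StreamingFast endpoint (the block range is illustrative):

```bash
substreams run -e mainnet.eth.streamingfast.io:443 substreams.yaml kv_out \
  --start-block 1000000 --stop-block +1
```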

Note To connect to a public StreamingFast substreams endpoint, you will need an authentication token, follow this guide to obtain one.

  3. Run it through substreams-sink-kv:
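A sketch of the invocation, writing to a local Badger file; confirm the exact argument order with substreams-sink-kv run --help on your version:

```bash
substreams-sink-kv run \
  "badger3://$(pwd)/badger_data.db" \
  mainnet.eth.streamingfast.io:443 \
  substreams.yaml \
  kv_out
```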

The sink connects to the endpoint and begins writing key-value operations to the store, logging its progress as it runs.

Note This writes the data to a local folder ./badger_data.db/ in Badger format. You can run rm -rf ./badger_data.db between tests to clean up all existing data.

  4. Look at the stored data

You can scan the whole dataset using the 'Scan' command:
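The exact subcommand syntax may vary between versions; something along these lines (verify with --help):

```bash
substreams-sink-kv scan "badger3://$(pwd)/badger_data.db" --limit 3
```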

You can look at data by key prefix:
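For example, to list the per-month entries written by store_block_meta_end (again, this invocation is an assumption; verify the subcommand on your version):

```bash
substreams-sink-kv read prefix "badger3://$(pwd)/badger_data.db" "month:" --limit 2
```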

Consume the key-value data from a web-page using Connect-Web

The Connect-Web library allows you to quickly bootstrap a web-based client for your key-value store.

Requirements

You will need Node.js and npm installed to build and run the web example.

Start from our example for substreams-eth-block-meta

You can check out and run our connect-web-example like this:
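Assuming a standard npm workflow for the example (the dev script name comes from the example's package.json):

```bash
git clone https://github.com/streamingfast/substreams-sink-kv
cd substreams-sink-kv/connect-web-example
npm install
npm run dev
```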

Then, enter a key in the text box. The app currently only decodes eth.block_meta.v1.BlockMeta values, so any other value will be displayed as a hex-encoded string.

To decode the values of your own data structures, add your .proto files under proto/ and generate the corresponding bindings like this:
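The example wires code generation through an npm script; the script name here is an assumption, check the example's package.json:

```bash
npm run buf:generate
```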

npm will echo the underlying buf invocation while it generates the bindings into the example's source tree.

Then, modify the decoding code in src/App.tsx to replace the BlockMeta-specific logic with your own type, as sketched below.
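A hypothetical sketch using the protobuf-es API; MyData and its import path stand in for whatever bindings buf generated from your .proto files:

```typescript
// Hypothetical: 'MyData' and its path depend on your generated bindings.
import { MyData } from "./pb/my_data_pb";

// Before, the example decoded every value as BlockMeta:
//   const meta = BlockMeta.fromBinary(value);
//
// After, decode with your own message type instead:
function decodeValue(value: Uint8Array): string {
  const decoded = MyData.fromBinary(value);
  return decoded.toJsonString({ prettySpaces: 2 });
}
```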

Bootstrap your own application

If you want to start with an empty application, you can follow these instructions.

Sending to a production key-value store

Until now, we have used the Badger database as a store for simplicity. However, substreams-sink-kv also supports TiKV and Google Cloud Bigtable, addressed through DSNs like these:

  • tikv://pd0,pd1,pd2:2379?prefix=namespace_prefix

  • bigkv://project.instance/namespace-prefix?createTables=true
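Switching stores only changes the DSN argument; for example, a sketch of the same invocation pointed at a TiKV cluster (cluster addresses and prefix are illustrative):

```bash
substreams-sink-kv run \
  "tikv://pd0,pd1,pd2:2379?prefix=blockmeta" \
  mainnet.eth.streamingfast.io:443 \
  substreams.yaml \
  kv_out
```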

See kvdb for more details.

Conclusion and review

Routing data extracted from the blockchain with Substreams is powerful and useful. Key-value stores are not the only type of sink that Substreams data can be piped into. Review the core Substreams sinks documentation for additional information on other sink types and sinking strategies.
