Using Database Changes

If you need more control over the tables and the data that you store in the database, creating a db_out module is the best option.

You will create a new module, db_out, which maps the output of your Substreams to the DatabaseChanges data model, a format that the SQL sink understands.
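
As a sketch of what such a module can look like in Rust, the example below builds DatabaseChanges with the Tables helper from the substreams-database-change crate. The MyEvents input type, the events table, and its columns are placeholders for your own output type and schema:

use substreams::errors::Error;
use substreams_database_change::pb::database::DatabaseChanges;
use substreams_database_change::tables::Tables;

#[substreams::handlers::map]
fn db_out(events: MyEvents) -> Result<DatabaseChanges, Error> {
    // Accumulate one row per event; table and column names must match schema.sql.
    let mut tables = Tables::new();
    for (index, event) in events.events.into_iter().enumerate() {
        tables
            // The row key must be unique; here it combines the transaction hash
            // with the event index.
            .create_row("events", format!("{}-{}", event.tx_hash, index))
            .set("wallet", event.wallet)
            .set("amount", event.amount);
    }
    Ok(tables.to_database_changes())
}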

Running the Sink

To index a db_out module, you must run two commands: substreams-sink-sql setup, which creates the necessary tables from a given schema.sql file, and substreams-sink-sql run, which streams the data into those tables.

substreams-sink-sql setup <DSN> <SUBSTREAMS_PACKAGE>
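
Once setup has created the tables, the run command takes the same arguments and starts streaming data into them:

substreams-sink-sql run <DSN> <SUBSTREAMS_PACKAGE>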

The substreams.yaml file of your package must contain the sink configuration:

sink:
  module: db_out
  type: sf.substreams.sink.sql.v1.Service
  config:
    engine: postgres
    schema: schema.sql
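
The schema.sql file declares the tables that db_out writes to. A minimal sketch, assuming the hypothetical events table from the Rust example above (a real schema lists every table the module populates), might look like this:

CREATE TABLE IF NOT EXISTS events (
    id     TEXT NOT NULL,   -- receives the row key set in db_out
    wallet TEXT,
    amount NUMERIC,
    PRIMARY KEY (id)
);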

Example: Pump.Fun

Suppose you want to dump all the Pump.Fun data, decoded with its IDL, into your database.

Clone the Pump.Fun Substreams GitHub repository.

Inspect the Project

  • Inspect the substreams.yaml file:

  1. The map_program_data module maps a Solana Block to the different instructions and events of the Pump.Fun IDL.

  2. The db_out module maps the output of map_program_data to DatabaseChanges, a format that the SQL sink can understand.

  3. The sink section defines the SQL sink configuration. In this example, the sink maps db_out to the tables of the database. When using a db_out module, you must specify a schema.sql file.

Run the Sink

To run the sink, you will need a Postgres database. You can use a Docker container to spin one up on your computer.
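
For example, the following command starts a disposable Postgres container (the user, password, and database name are arbitrary placeholders; choose your own):

docker run -d --name postgres \
  -e POSTGRES_USER=dev-node \
  -e POSTGRES_PASSWORD=insecure-change-me-in-prod \
  -e POSTGRES_DB=dev-node \
  -p 5432:5432 \
  postgres:16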

  • Define the DSN string, which contains the credentials and address of the database.

  • Run substreams-sink-sql setup with the DSN to create the necessary tables.

  • Run substreams-sink-sql run to start streaming data into those tables (see the example sequence after this list).
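
A command sequence along these lines should work. The DSN shown (psql:// scheme, sslmode=disable for a local non-TLS instance) and the credentials assume the Postgres container started above; adjust them to your own setup, and point the commands at the package's substreams.yaml manifest or .spkg file:

export DSN="psql://dev-node:insecure-change-me-in-prod@localhost:5432/dev-node?sslmode=disable"

substreams-sink-sql setup "$DSN" substreams.yaml
substreams-sink-sql run "$DSN" substreams.yaml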
