
kubo_loadtesting

A load testing framework for IPFS Kubo APIs, built with Vegeta.
It helps benchmark and validate the performance of critical IPFS API endpoints such as:

  • /api/v0/add
  • /api/v0/cat
  • /api/v0/pin/add
  • /api/v0/pin/rm
  • /api/v0/name/publish
  • /api/v0/name/resolve

Each test generates structured metrics, raw results, and HTML plots for visualization.


Features

  • Runs load tests with configurable RPS (requests per second) and duration.
  • Automatically saves:
    • Raw Vegeta results (results.bin)
    • Metrics in JSON (metrics.json)
    • HTML latency plots (plot.html)
  • Persists generated CIDs and IPNS names across tests.
  • Supports chained workflows (e.g. add → cat → pin → unpin → publish → resolve).
  • Tracks IPNS propagation time until success or timeout.

Prerequisites

  • Go toolchain (the framework is written in Go)
  • IPFS Kubo installed and available on your PATH (the daemon under test)

Installation

Clone and build:
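For example (the repository URL below is a placeholder — substitute your actual clone source; `go build ./...` assumes a standard Go module layout):

```shell
# Clone the repository and compile all packages in the module.
git clone <repository-url> kubo_loadtesting
cd kubo_loadtesting
go build ./...
```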


Configuration

All test parameters are set via the config package. Example config values include:

  • RPS (requests per second)

  • Test duration

  • Payload size (for /add)

  • Sleep intervals (for chained tests like IPNS publish → resolve)

You can edit config/config.go or provide your own config struct.
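As a sketch, the struct in config/config.go might look roughly like this — the field names and defaults below are illustrative guesses, not the actual definitions:

```go
package config

import "time"

// Config holds the tunable load-test parameters described above.
// Field names here are illustrative; see config/config.go for the real ones.
type Config struct {
	RPS         int           // target request rate per second
	Duration    time.Duration // how long each Vegeta attack runs
	PayloadSize int           // size in bytes of random files sent to /api/v0/add
	Sleep       time.Duration // pause between chained steps (e.g. publish → resolve)
}

// Default returns a conservative baseline for local testing.
func Default() Config {
	return Config{
		RPS:         50,
		Duration:    30 * time.Second,
		PayloadSize: 1 << 20, // 1 MiB
		Sleep:       5 * time.Second,
	}
}
```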


Configure the IPFS nodes

Run the initialization and configuration:
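The repository's exact setup target is not shown here; a manual equivalent using the stock `ipfs` CLI, with the repositories and ports described below, might look like:

```shell
# Base node: own repo, Swarm 4003, API 5003, Gateway 8082.
IPFS_PATH=~/.ipfs_node_base ipfs init
IPFS_PATH=~/.ipfs_node_base ipfs config --json Addresses.Swarm '["/ip4/0.0.0.0/tcp/4003"]'
IPFS_PATH=~/.ipfs_node_base ipfs config Addresses.API /ip4/127.0.0.1/tcp/5003
IPFS_PATH=~/.ipfs_node_base ipfs config Addresses.Gateway /ip4/127.0.0.1/tcp/8082

# Resolver node: own repo, Swarm 4002, API 5002, Gateway 8081.
IPFS_PATH=~/.ipfs_node_resolver ipfs init
IPFS_PATH=~/.ipfs_node_resolver ipfs config --json Addresses.Swarm '["/ip4/0.0.0.0/tcp/4002"]'
IPFS_PATH=~/.ipfs_node_resolver ipfs config Addresses.API /ip4/127.0.0.1/tcp/5002
IPFS_PATH=~/.ipfs_node_resolver ipfs config Addresses.Gateway /ip4/127.0.0.1/tcp/8081
```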

This sets up two IPFS nodes with separate repositories and ports:

  • Base node → repository: ~/.ipfs_node_base, ports: 4003 (Swarm), 5003 (API), 8082 (Gateway)
  • Resolver node → repository: ~/.ipfs_node_resolver, ports: 4002 (Swarm), 5002 (API), 8081 (Gateway)

Start both nodes simultaneously:
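A manual equivalent (the repo likely wraps this in a Makefile target) is to launch each daemon against its own repository:

```shell
# Start both daemons in the background, each pointing at its own repo.
IPFS_PATH=~/.ipfs_node_base ipfs daemon &
IPFS_PATH=~/.ipfs_node_resolver ipfs daemon &
```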

This runs both IPFS daemons in the background on your local machine.


Running tests

Run the main test runner:
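The exact command depends on the repository layout and is not shown here; assuming a single main package at the module root, it would be something like:

```shell
# Hypothetical invocation; the actual entry point or Makefile target may differ.
go run .
```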

This will execute all configured tests sequentially:

  • /api/v0/add → generates random files and saves CIDs to added_cids.json
  • /api/v0/cat → reads CIDs from added_cids.json and benchmarks retrieval
  • /api/v0/pin/add → pins files by CID
  • /api/v0/pin/rm → unpins files after a delay
  • /api/v0/name/publish → publishes CIDs via IPNS, measures propagation
  • /api/v0/name/resolve → resolves IPNS names from the saved file
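These endpoints can also be exercised by hand as a smoke test. Assuming the base node's API port from the setup above (5003), a single /api/v0/add request looks like:

```shell
# Send one small file to the base node's add endpoint.
# Kubo's HTTP RPC API requires POST.
echo "hello" > /tmp/sample.txt
curl -s -X POST -F "file=@/tmp/sample.txt" "http://127.0.0.1:5003/api/v0/add"
```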


Output

Each test automatically saves structured results into files:

  • Raw results (results_<test>.bin) — detailed request/response data
  • Metrics (metrics_<test>.json) — JSON with rates, success %, latencies
  • Plots (plot_<test>.html) — interactive HTML latency/throughput graphs

Additionally:

  • CIDs generated by /add are saved to added_cids.json
  • IPNS names generated by /publish are saved to published_names.json
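The raw .bin files can also be inspected directly with the Vegeta CLI. For example, for a test named "add" (filenames follow the results_<test>.bin pattern above):

```shell
# Print a text summary of a raw result file, and regenerate its HTML plot.
vegeta report results_add.bin
vegeta plot results_add.bin > plot_add.html
```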


Example Output


Roadmap