Platform

XRPL Deep search platform is a data processing pipeline that stores XRPL transactions in a distributed search and analytics platform. The platform enables simple data retrieval, aggregation, and discovery of trends and patterns in the XRPL.

What is Deep search?

XRP Ledger exploration tools and APIs available today, such as rippled, clio, and various explorer APIs, provide access to ledger data based on an object's primary key, such as ledger index, transaction hash, account address, NFT id, or object id. This project aims to provide deeper search capability, such as filtering transactions by source/destination tags, range queries over payment amounts, aggregate volumes, and much more. This is enabled by indexing all properties of the transactions in an analytics engine.
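
For illustration, once transactions are indexed, a destination tag filter can be expressed as an Elasticsearch query (see Querying data below for connection details). This is a minimal sketch: the TransactionType and DestinationTag field names are assumed to follow standard XRPL transaction JSON and may differ from the actual index mapping.

source .env
curl -k -u elastic:$ELASTICSEARCH_PASSWORD \
-H 'Content-type: application/json' \
-XPOST 'https://localhost:9200/platform.transactions/_search' \
-d '{"query":{"bool":{"filter":[{"term":{"TransactionType":"Payment"}},{"term":{"DestinationTag":12345}}]}}}' | \
jq .hits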

Requirements

  1. Apache Kafka
  2. rippled
  3. Access to full history rippled (if backfilling older ledgers)
  4. Elasticsearch

Architecture

(Diagram: Search platform architecture)

Installation

  1. This project is known to run on Linux and macOS. This README lists the steps to run the service on CentOS.

  2. Install Elasticsearch via this guide: https://www.elastic.co/guide/en/elasticsearch/reference/current/rpm.html

dnf install --enablerepo=elasticsearch elasticsearch
systemctl daemon-reload
systemctl enable elasticsearch.service
systemctl start elasticsearch.service

The Elasticsearch installer prints the instance's default password for the elastic user. Note this password, as it is required in later steps (ELASTICSEARCH_PASSWORD).
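
To verify that the instance is up, or to reset the password if the initial output was lost, the standard Elasticsearch 8.x tools can be used (default RPM install paths assumed):

# Verify the node responds over TLS
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic https://localhost:9200
# Reset the elastic user's password if it was not noted
/usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic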

  3. Install Docker via this guide: https://docs.docker.com/engine/install/centos/

  4. Configure Docker to run as non-root

usermod -aG docker non-root-user
systemctl restart docker
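
Group membership takes effect on the next login. As a quick check, the following should run without sudo (newgrp applies the new group to the current shell):

newgrp docker
docker ps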
  5. Install Zookeeper and Kafka
docker compose up -d
  6. Install Go via this guide: https://go.dev/doc/install

  7. Build deep search platform

dnf install make
git clone git@github.com:xrpscan/platform.git
cd platform
make
  8. Create the environment file and update the settings within it
cp .env.example .env
  9. Copy the fingerprint output of the following command into the .env file as ELASTICSEARCH_FINGERPRINT="xxxxxx"
openssl x509 -fingerprint -sha256 -noout -in /etc/elasticsearch/certs/http_ca.crt | sed s/://g
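
After this step, the .env file should contain at least the two Elasticsearch settings referenced above. The values below are placeholders; the remaining settings follow .env.example:

ELASTICSEARCH_PASSWORD="<password printed during installation>"
ELASTICSEARCH_FINGERPRINT="xxxxxx"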
  10. Create Kafka topics
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-ledgers
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-transactions
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-validations
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-manifests
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-peerstatus
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-consensus
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-server
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-default
docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --create --if-not-exists --topic xrpl-platform-tx
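
To confirm that all nine topics exist, list them with the same kafka-topics tool:

docker exec kafka-broker1 kafka-topics --bootstrap-server kafka-broker1:9092 --list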
  11. Create Elasticsearch indexes
./bin/platform-cli init -elasticsearch -shards 8 -replicas 0
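
The newly created indexes can be verified with the _cat API. This assumes the indexes share the platform. prefix seen in the Querying data section:

source .env
curl -k -u elastic:$ELASTICSEARCH_PASSWORD 'https://localhost:9200/_cat/indices/platform.*?v'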

Running the service

  1. Index new ledgers
./bin/platform-server
  2. Backfill old ledgers
./bin/platform-cli backfill -verbose -from 82000000 -to 82999999
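
Large backfills can be split into smaller ranges. A minimal sketch, assuming the chunks are safe to run back to back:

# Backfill ledgers 80,000,000 through 82,999,999 in chunks of 1,000,000
for start in $(seq 80000000 1000000 82000000); do
  ./bin/platform-cli backfill -verbose -from $start -to $((start + 999999))
done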

Monitoring the service

The service ships with a command named eps that can be used to print Elasticsearch index statistics. Open the file cmd/eps/eps and update the ES_ENV_FILE variable so that it points to your platform .env file.

vi cmd/eps/eps  # Update ES_ENV_FILE variable
cp cmd/eps/eps /path/to/your/bin
eps

Querying data

Deep search platform will provide easy-to-use APIs for querying XRPL transaction data in a future release. For now, data can be queried by connecting to Elasticsearch directly:

source .env
curl -k -u elastic:$ELASTICSEARCH_PASSWORD \
-H 'Content-type: application/json' \
-XPOST 'https://localhost:9200/platform.transactions/_search' \
-d '{"query":{"term":{"ctid":"C511CC0400850000" }}}' | \
jq .hits
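
Aggregations work the same way. The sketch below counts transactions per type; it assumes TransactionType is indexed as a keyword field, which may differ from the actual mapping:

curl -k -u elastic:$ELASTICSEARCH_PASSWORD \
-H 'Content-type: application/json' \
-XPOST 'https://localhost:9200/platform.transactions/_search' \
-d '{"size":0,"aggs":{"by_type":{"terms":{"field":"TransactionType"}}}}' | \
jq .aggregations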

Maintenance

Over time, the XRPL protocol may receive updates via the amendment process. New amendments may add fields to the transaction, ledger, or validation objects. When this happens, the Elasticsearch index templates need to be updated:

./bin/platform-cli init -elasticsearch -force

References

Ledger Stream - xrpl.org

Developer notes

Updating Models

When a new amendment adds or removes fields from the transaction object, review the following files and update them as necessary:

  • models/transaction.go
  • config/mapping/transaction.go
  • models/currency.go (for Currency fields)

Known issues

Reporting bugs

Please create a new issue in the Platform issue tracker.
