redis-performance/geo-bench

This repository contains a set of scripts and tools for running benchmarks on vanilla Redis GEO commands and RediSearch, a search and query engine for Redis.

The benchmarks in this repository cover a range of common Redis GEO and RediSearch operations, such as indexing, searching, and querying data.

The results of the benchmarks can be used to compare the performance of different Redis configurations, to gain insights into the behavior of these tools, and to identify potential bottlenecks or areas for optimization.
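
For context, here is roughly what the two command families being compared look like (a minimal sketch; the key, index, and field names below are illustrative, not the ones geo-bench uses):

# vanilla Redis GEO: add a point and search around it
redis-cli GEOADD points:example 13.361389 38.115556 "Palermo"
redis-cli GEOSEARCH points:example FROMLONLAT 13.361389 38.115556 BYRADIUS 200 km ASC

# RediSearch: index a GEO field on hashes and query it
redis-cli FT.CREATE idx:example ON HASH PREFIX 1 doc: SCHEMA location GEO
redis-cli HSET doc:1 location "13.361389,38.115556"
redis-cli FT.SEARCH idx:example '@location:[13.361389 38.115556 200 km]'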

Installation

Download standalone binaries (no Golang needed)

If you don't have Go on your machine and just want to use the prebuilt binaries, you can download them from the latest release:

https://github.com/redis-performance/geo-bench/releases/latest

| OS | Arch | Link |
| --- | --- | --- |
| Linux | amd64 (64-bit X86) | geo-bench-linux-amd64 |
| Linux | arm64 (64-bit ARM) | geo-bench-linux-arm64 |
| Darwin | amd64 (64-bit X86) | geo-bench-darwin-amd64 |
| Darwin | arm64 (64-bit ARM) | geo-bench-darwin-arm64 |

Here's a bash snippet to download and try it:

wget -c https://github.com/redis-performance/geo-bench/releases/latest/download/geo-bench-$(uname -mrs | awk '{ print tolower($1) }')-$(dpkg --print-architecture).tar.gz -O - | tar -xz

# give it a try
./geo-bench --help

Installation in a Golang env

The easiest way to get and install the benchmark utility in a Go environment is to fetch it with go get and then build it with make:

# Fetch this repo
go get github.com/redis-performance/geo-bench
cd $GOPATH/src/github.com/redis-performance/geo-bench
make
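
Alternatively, if your Go toolchain supports modules and the module path matches the repository path (an assumption; check the repo's go.mod), you can install the binary directly:

# assumes the module path is github.com/redis-performance/geo-bench with main at the module root
go install github.com/redis-performance/geo-bench@latest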

Try it out

GeoPoints

# get dataset
wget https://s3.us-east-2.amazonaws.com/redis.benchmarks.spec/datasets/geopoint/documents.json.bz2
bzip2 -d documents.json.bz2

# get tool
wget -c https://github.com/redis-performance/geo-bench/releases/latest/download/geo-bench-$(uname -mrs | awk '{ print tolower($1) }')-$(dpkg --print-architecture).tar.gz -O - | tar -xz

# load data
./geo-bench load
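
Once the load finishes, you can sanity-check what was written (assuming geo-bench targeted a local Redis on the default localhost:6379):

# rough sanity check: key count and memory footprint
redis-cli DBSIZE
redis-cli INFO memory | grep used_memory_human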

GeoPolygons

Retrieve input data and tool

# get dataset (around 30GB uncompressed)
wget https://s3.us-east-2.amazonaws.com/redis.benchmarks.spec/datasets/geoshape/polygons.json.bz2
bzip2 -d polygons.json.bz2

# get 100K dataset (around 3GB uncompressed)
wget https://s3.us-east-2.amazonaws.com/redis.benchmarks.spec/datasets/geoshape/polygons.100k.json.bz2
bzip2 -d polygons.100k.json.bz2

# get a simplified version of the 100K polygons
wget https://s3.us-east-2.amazonaws.com/redis.benchmarks.spec/datasets/geoshape/polygons.100k.simplified-threshold-0.001.json.bz2
bzip2 -d polygons.100k.simplified-threshold-0.001.json.bz2

# get tool
wget -c https://github.com/redis-performance/geo-bench/releases/latest/download/geo-bench-$(uname -mrs | awk '{ print tolower($1) }')-$(dpkg --print-architecture).tar.gz -O - | tar -xz

Load data in Redis

# load the first 100K polygons
./geo-bench load --input-type geoshape --input polygons.json -n 100000 --db redisearch-hash
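
To confirm the index was created and populated, you can list the RediSearch indexes and inspect the one geo-bench created (the index name is not assumed here; take it from the FT._LIST output):

# list RediSearch indexes, then check num_docs for the one created by the load
redis-cli FT._LIST
redis-cli FT.INFO <index-name-from-the-list-above>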

Query data in Redis

# send 10K queries of type within
./geo-bench query --db redisearch-hash --input polygons.json -c 50 --input-type geoshape -n 10000 --query-type geoshape-within
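
For reference, a geoshape WITHIN query in RediSearch looks roughly like the following (hypothetical index, field, and polygon; the exact query geo-bench issues may differ):

# WITHIN returns indexed shapes fully contained in the query polygon; geoshape queries need DIALECT 3
redis-cli FT.SEARCH idx:polygons '@geom:[WITHIN $poly]' PARAMS 2 poly 'POLYGON((1 1, 1 100, 100 100, 100 1, 1 1))' DIALECT 3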

Load data in Elasticsearch

Spin up Elastic locally

sudo sysctl -w vm.max_map_count=262144
docker run -p 9200:9200 -p 9300:9300 -e "ELASTIC_PASSWORD=password"  docker.elastic.co/elasticsearch/elasticsearch:8.7.1
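
Before loading, you can confirm the container is up and accepting authenticated requests:

# wait for Elasticsearch to start, then check cluster health
curl -k --user "elastic:password" "https://localhost:9200/_cluster/health?pretty"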

Load the data

# load the first 100K polygons
./geo-bench load --db elasticsearch --input polygons.json -c 50 -n 100000 --input-type geoshape --es.password password --es.bulk.batch.size 100

# confirm you've got the expected doc count
curl -k --user "elastic:password" -X GET "https://localhost:9200/geo/_count?pretty" -H 'Content-Type: application/json'

# check the memory usage of that index
curl -k --user "elastic:password" -X GET "https://localhost:9200/geo/_stats?pretty" -H 'Content-Type: application/json'

Query data in Elasticsearch

# send 10K queries of type within
./geo-bench query --db elasticsearch --input polygons.json -c 50 --input-type geoshape --es.password password -n 10000 --query-type geoshape-within
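
Under the hood this kind of benchmark exercises Elasticsearch's geo_shape query with the within relation; a hand-written equivalent looks roughly like the following (the field name "geometry" and the polygon are illustrative assumptions):

# same style of query issued manually against the "geo" index
curl -k --user "elastic:password" -X GET "https://localhost:9200/geo/_search?pretty" -H 'Content-Type: application/json' -d '
{
  "query": {
    "geo_shape": {
      "geometry": {
        "shape": { "type": "polygon", "coordinates": [[[1,1],[100,1],[100,100],[1,100],[1,1]]] },
        "relation": "within"
      }
    }
  }
}'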