SingleStore Cheat Sheet

Note

This notebook can be run on a Free Starter Workspace. To create a Free Starter Workspace, navigate to Start using the left nav. You can also use your existing Standard or Premium workspace with this notebook.

SingleStore Database Cheat Sheet

List of useful commands for SingleStore SQL and Kai (MongoDB API) operations

Important Notes

SingleStore Core Concepts

  1. Reference tables don't need a SHARD KEY as they are replicated to all nodes

  2. SingleStore supports both rowstore and columnstore (default) table types

  3. Hash indexes are recommended for fast equality lookups on large tables (see the sketch after this list)

  4. JSON operations are optimized for performance in SingleStore

  5. Use Reference tables for lookup data that needs to be available on all nodes
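
To illustrate points 2 and 3, here is a minimal sketch of a rowstore lookup table with a hash index. The table and column names are hypothetical and not part of this notebook's schema.

%%sql
# Hypothetical rowstore table with a HASH index for fast equality lookups
CREATE ROWSTORE TABLE user_lookup (
    user_id BIGINT,
    email VARCHAR(255),
    SHARD KEY (user_id),
    KEY (email) USING HASH
);

# Equality predicate that can use the hash index
SELECT * FROM user_lookup WHERE email = 'someone@example.com';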

Vector Operations Tips

  1. Vector dimensions must be specified at table creation

  2. Normalize vectors to length 1 before inserting them into the database when doing cosine similarity calculations; many embedding models already produce unit-length vectors, so this step is often unnecessary (check your model's documentation). A short example follows this list.

  3. Choose an appropriate index metric (for example, DOT_PRODUCT or EUCLIDEAN_DISTANCE) based on your use case

  4. Vector operations support AI/ML workloads such as semantic search and recommendations

  5. Combine with full-text search for hybrid search capabilities

  6. Available both in SQL and through SingleStore Kai (MongoDB API)
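
As a short illustration of tip 2: when vectors are normalized to length 1, DOT_PRODUCT returns cosine similarity directly. The 3-dimensional table below is a hypothetical sketch, not part of this notebook's schema; real embeddings typically have hundreds or thousands of dimensions.

%%sql
# Hypothetical 3-dimensional example
CREATE TABLE tiny_vectors (
    id BIGINT PRIMARY KEY,
    v VECTOR(3),
    SHARD KEY (id)
);

# These vectors are already unit length (e.g. 0.6^2 + 0.8^2 = 1)
INSERT INTO tiny_vectors VALUES
    (1, '[0.6, 0.8, 0.0]'),
    (2, '[0.0, 1.0, 0.0]');

# For unit-length vectors, DOT_PRODUCT equals cosine similarity
SELECT id, DOT_PRODUCT(v, '[1.0, 0.0, 0.0]') AS cosine_similarity
FROM tiny_vectors
ORDER BY cosine_similarity DESC;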


For the most up-to-date information, refer to the official SingleStore documentation at https://singlestore.com/docs.

Database Operations

In [1]:

%%sql
# Show Databases
SHOW DATABASES;

In [2]:

%%sql
# Create Database
CREATE DATABASE database_name; -- Note: this will not work on the Free Starter Workspace, which allows only one database

In [3]:

%%sql
# Use Database
USE database_name;

In [4]:

%%sql
# Drop Database
DROP DATABASE database_name; -- Use with extreme caution

Table Operations

In [5]:

%%sql
# Create Distributed Table
CREATE TABLE posts (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    title VARCHAR(255),
    body TEXT,
    category VARCHAR(50),
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    SHARD KEY (id)
);

In [6]:

%%sql
# Create Reference Table
CREATE REFERENCE TABLE categories (
    id INT PRIMARY KEY,
    name VARCHAR(50)
    -- No SHARD KEY needed for reference tables
);

In [7]:

%%sql
# Create Columnstore Table
CREATE TABLE analytics (
    id BIGINT,
    event_type VARCHAR(50),
    ts DATETIME,
    data JSON,
    SORT KEY (ts),
    SHARD KEY (id)
);

Table Management

In [8]:

%%sql
# Show tables
SHOW TABLES;

# Describe table
DESCRIBE posts;

# Drop table
DROP TABLE posts;

Data Manipulation

In [9]:

%%sql
# Insert single row
INSERT INTO posts (title, body, category)
VALUES ('Post One', 'Body of post one', 'News');

# Insert multiple rows
INSERT INTO posts (title, body, category) VALUES
    ('Post Two', 'Body of post two', 'Technology'),
    ('Post Three', 'Body of post three', 'News');

In [10]:

%%sql
# Select Data
# Select all rows
SELECT * FROM posts;

# Select specific columns
SELECT title, category FROM posts;

# Select with condition
SELECT * FROM posts WHERE category = 'News';

In [11]:

%%sql
# Update Data
UPDATE posts
SET body = 'Updated body'
WHERE title = 'Post One';

In [12]:

%%sql
# Delete Data
DELETE FROM posts WHERE title = 'Post One';

SingleStore Pipelines

Pipelines ingest data into SingleStore tables from external sources, for example an S3 bucket. The pipeline defined below loads CSV files into a table named SalesData.
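
The target table must exist before the pipeline is created. A minimal sketch of such a table follows; the column names and types here are assumptions for illustration and should be adjusted to match the actual CSV columns.

%%sql
# Hypothetical target table for the pipeline; adjust columns to match the CSV
CREATE TABLE SalesData (
    order_id BIGINT,
    product VARCHAR(100),
    quantity INT,
    sale_date DATETIME,
    SHARD KEY (order_id)
);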

Create Pipeline

In [13]:

%%sql
# Create Pipeline
CREATE PIPELINE SalesData_Pipeline AS
LOAD DATA S3 's3://singlestoreloaddata/SalesData/*.csv'
CONFIG '{ "region": "ap-south-1" }'
INTO TABLE SalesData
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Start Pipeline

In [14]:

%%sql
START PIPELINE SalesData_Pipeline;

Check Pipeline Status

In [15]:

%%sql
SELECT * FROM information_schema.pipelines_files
WHERE pipeline_name = "SalesData_Pipeline";

Stop Pipeline

In [16]:

%%sql
# Syntax: STOP PIPELINE [IF RUNNING] pipeline_name [DETACH];
STOP PIPELINE SalesData_Pipeline;

Drop Pipeline

In [17]:

%%sql
# Syntax: DROP PIPELINE [IF EXISTS] pipeline_name;
DROP PIPELINE IF EXISTS SalesData_Pipeline;

SingleStore Specific Features

JSON Operations

In [18]:

%%sql
# Create table with JSON column
CREATE TABLE json_posts (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    data JSON,
    SHARD KEY (id)
);

# Insert JSON
INSERT INTO json_posts (data)
VALUES ('{"title": "Post One", "tags": ["news", "events"]}');

# Query JSON (JSON_EXTRACT_STRING takes key path arguments)
SELECT JSON_EXTRACT_STRING(data, 'title') AS title
FROM json_posts;

Vector Operations

In [19]:

%%sql
# Create table with vector column
CREATE TABLE embeddings (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    description TEXT,
    embedding VECTOR(1536),  -- Specify vector dimension
    SHARD KEY (id)
);

In [20]:

%%sql
# Create vector index using dot product as the distance metric
ALTER TABLE embeddings ADD VECTOR INDEX idx_embedding (embedding)
INDEX_OPTIONS '{"metric_type": "DOT_PRODUCT"}';

In [21]:

%%sql
# Vector search examples
# Find similar vectors using dot product
SELECT id, description, DOT_PRODUCT(embedding, '[0.1, 0.2, ...]') AS similarity
FROM embeddings
ORDER BY similarity DESC
LIMIT 10;

# Create a full-text index
ALTER TABLE embeddings ADD FULLTEXT USING VERSION 2 fts_idx(description);

# Hybrid search combining full-text and vector search
SELECT id, description,
    DOT_PRODUCT(embedding, '[0.1, 0.2, ...]') AS vector_score,
    MATCH(TABLE embeddings) AGAINST('description:("search terms")') AS text_score
FROM embeddings
WHERE MATCH(TABLE embeddings) AGAINST('description:("search terms")')
ORDER BY (vector_score * 0.7 + text_score * 0.3) DESC;

SingleStore Kai (MongoDB API)

MongoDB Client Connection

mongodb://username:password@hostname:27017/database

In [22]:

# MongoDB-style commands

# Show databases
show dbs

# Use database
use mydb

# Show collections
show collections

# Create collection
db.createCollection('users')

About this Template

Get started with SingleStore quickly using common commands

This Notebook can be run in Shared Tier, Standard and Enterprise deployments.

Tags

starter

License

This Notebook has been released under the Apache 2.0 open source license.
