Image Search with Supabase Vector

In this example we implement image search using the OpenAI CLIP model, which was trained on a wide variety of (image, text) pairs.

We're implementing two methods in the /image_search/main.py file:

  1. The seed method generates embeddings for the images in the images folder and upserts them into a collection in Supabase Vector.
  2. The search method generates an embedding from the search query and performs a vector similarity search against that collection (see the sketch after this list).
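
A rough sketch of how these two methods might look, assuming the vecs Python client for Supabase Vector and the sentence-transformers package. The DB_CONNECTION string, the image_vectors collection name, and the *.jpg glob over the images folder are illustrative assumptions, not necessarily what this repository uses:

    # Sketch of /image_search/main.py, not the exact file from this repo
    from pathlib import Path

    import vecs
    from PIL import Image
    from sentence_transformers import SentenceTransformer

    # Assumed connection string for a locally running Supabase instance
    DB_CONNECTION = "postgresql://postgres:postgres@localhost:54322/postgres"

    # OpenAI CLIP model, downloaded from Hugging Face on first use
    MODEL = SentenceTransformer("clip-ViT-B-32")


    def seed() -> None:
        """Embed every image in ./images and upsert the vectors into Supabase Vector."""
        vx = vecs.create_client(DB_CONNECTION)
        images = vx.get_or_create_collection(name="image_vectors", dimension=512)

        records = []
        for path in Path("./images").glob("*.jpg"):
            # CLIP maps images and text into the same 512-dimensional embedding space
            embedding = MODEL.encode(Image.open(path))
            records.append((path.name, embedding, {"type": "jpg"}))

        images.upsert(records=records)
        images.create_index()  # speeds up subsequent similarity queries


    def search(query: str) -> None:
        """Embed the text query and print the ids of the most similar images."""
        vx = vecs.create_client(DB_CONNECTION)
        images = vx.get_or_create_collection(name="image_vectors", dimension=512)

        query_embedding = MODEL.encode(query)
        print(images.query(data=query_embedding, limit=3))

Because the image and text embeddings share one vector space, the same model instance handles both sides of the search.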

Setup

  • Install poetry: pip install poetry
  • Activate the virtual environment: poetry shell
    • (to leave the venv just run exit)
  • Install app dependencies: poetry install

Run locally

Generate the embeddings and seed the collection

  • poetry run seed

Perform a search

  • poetry run search "bike in front of red brick wall"
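
The poetry run seed and poetry run search commands assume console-script entry points declared under [tool.poetry.scripts] in pyproject.toml. Such entry points are invoked without arguments, so a search wrapper along these lines (the name search_cli and the argv handling are hypothetical) would read the query from the command line:

    # Hypothetical CLI wrapper for: poetry run search "bike in front of red brick wall"
    import sys


    def search_cli() -> None:
        # Poetry console scripts receive no arguments directly,
        # so the query string is taken from the command line.
        if len(sys.argv) < 2:
            sys.exit('usage: poetry run search "<query>"')
        search(sys.argv[1])  # the search() method sketched above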

Run on hosted Supabase project

Attributions

Models

clip-ViT-B-32 via Hugging Face

Images

Images are licensed under the Unsplash License (https://unsplash.com/license) and served via https://picsum.photos/