
OpenAI

The OpenAI API provides access to OpenAI’s models, such as the GPT family for natural language and vision tasks and the text-embedding family for vector representations of text. It lets developers integrate capabilities such as text generation, conversation, summarization, and semantic search into their applications. In this backend it is used in two ways: generating text embeddings and producing chat completions (including image descriptions).
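The examples in this section assume an OpenAI client and a service-role Supabase client are already available. A minimal setup sketch for a Deno environment is shown below; the environment variable names are assumptions, not taken from this codebase.

// Minimal client setup sketch (environment variable names are assumed).
import OpenAI from "npm:openai";
import { createClient } from "npm:@supabase/supabase-js";

const openAi = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });
const serviceRoleSupabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);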


Generating Text Embeddings

OpenAI’s embeddings.create method is used in multiple places to create vector representations of text. These embeddings are stored in Supabase for retrieval and used in downstream operations such as similarity searches.

Example

// Convert the input string into a float vector embedding.
const embeddingRequest = await openAi.embeddings.create({
  model: "text-embedding-3-small",
  input: data,
  encoding_format: "float",
});
  • Purpose: Converts data (a string) into a numerical embedding that can be used for comparisons or storage.
  • Use Case: Storing embeddings in Supabase for reuse (CachedEmbedding table).
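The response object exposes the vector itself under data[0].embedding. A short sketch of reading it (the variable name is illustrative):

// The embedding is a plain array of floats; text-embedding-3-small
// returns 1536 dimensions by default.
const vector: number[] = embeddingRequest.data[0].embedding;
console.log(vector.length); // 1536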

Text Matching and Auto-completion

Embeddings are used with Supabase’s remote procedure calls (RPCs) to perform similarity searches or auto-complete descriptions.

Example

// Similarity search: a Postgres function exposed as a Supabase RPC
// compares the query embedding against stored embeddings.
const response = await serviceRoleSupabase.rpc(
  "auto_complete_description_TimeActivity",
  {
    query_embedding: JSON.stringify(embedding.data[0].embedding),
    match_threshold,
    match_count,
    user_id: userId,
    organization_id: organizationId,
    match_threshold_increment,
  },
);
  • Purpose: Matches user-provided text (search) with stored embeddings to provide suggestions or auto-completed data.
  • Use Case: Searching or filtering data efficiently using embeddings as a search vector.
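A caller-side sketch of the complete flow, combining the embedding call with the RPC; the wrapper name and the threshold defaults are assumptions, not taken from this codebase:

// Hypothetical wrapper: embed the search text, then ask Postgres for matches.
async function autoCompleteDescription(
  search: string,
  userId: string,
  organizationId: string,
) {
  const embedding = await openAi.embeddings.create({
    model: "text-embedding-3-small",
    input: search,
    encoding_format: "float",
  });
  const response = await serviceRoleSupabase.rpc(
    "auto_complete_description_TimeActivity",
    {
      query_embedding: JSON.stringify(embedding.data[0].embedding),
      match_threshold: 0.8, // assumed starting similarity threshold
      match_count: 5, // assumed number of suggestions to return
      user_id: userId,
      organization_id: organizationId,
      match_threshold_increment: 0.05, // assumed relaxation step used by the RPC
    },
  );
  return response.data ?? [];
}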

Image Description via Chat Completion

OpenAI’s chat.completions.create method is used to generate detailed descriptions of images for tagging and search purposes.

Example

// Ask GPT-4o to describe the image at its public URL.
const response = await openAi.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Please describe this image..." },
        { type: "image_url", image_url: { url: url.data.publicUrl } },
      ],
    },
  ],
});
  • Purpose: Converts an image’s content into a descriptive text suitable for tagging or categorization.
  • Use Case: Enabling search functionalities based on image characteristics.
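The generated description is read from the first choice of the response and can then be embedded like any other text so the image becomes searchable. A sketch under those assumptions:

// Pull the model's description out of the chat response.
const description = response.choices[0].message.content ?? "";

// Hypothetical follow-up: embed the description for similarity search.
const descriptionEmbedding = await openAi.embeddings.create({
  model: "text-embedding-3-small",
  input: description,
  encoding_format: "float",
});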

Caching and Updating Embeddings

  • Embeddings are cached in Supabase to avoid redundant API calls.
  • When data is updated, corresponding embeddings are recalculated and stored.

Example

// Return the cached embedding if one already exists for this input.
if (cachedEmbeddingsRequest.data) return cachedEmbeddingsRequest.data.value!;
// Otherwise serialize the fresh embedding and cache it for reuse.
const embedding = JSON.stringify(embeddingRequest.data[0].embedding);
await serviceRoleSupabase.from("CachedEmbedding").insert({ key: data, value: embedding });
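Putting the lookup and insert together, a cache-aside helper might look like the following sketch; the table and columns match the snippet above, while the function name is assumed:

// Hypothetical cache-aside helper around the CachedEmbedding table.
async function getOrCreateEmbedding(data: string): Promise<string> {
  const cached = await serviceRoleSupabase
    .from("CachedEmbedding")
    .select("value")
    .eq("key", data)
    .maybeSingle();
  if (cached.data) return cached.data.value!; // cache hit: skip the API call

  const embeddingRequest = await openAi.embeddings.create({
    model: "text-embedding-3-small",
    input: data,
    encoding_format: "float",
  });
  const embedding = JSON.stringify(embeddingRequest.data[0].embedding);
  await serviceRoleSupabase
    .from("CachedEmbedding")
    .insert({ key: data, value: embedding });
  return embedding;
}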

Integration with Supabase

OpenAI is tightly coupled with Supabase for:

  • Storing embeddings.
  • Executing RPCs for similarity-based operations.
  • Handling updates to associated data (e.g., TimeActivity, Fixture).

Example

// Persist the recalculated embedding on the affected row; the .eq filter
// is illustrative, since an unfiltered update would touch every Fixture row.
const saveResponse = await serviceRoleSupabase
  .from("Fixture")
  .update({ detailEmbedding: JSON.stringify(embedding.data[0].embedding) })
  .eq("id", fixtureId);

Summary of Tools and Concepts

  • OpenAI API: Used for embedding creation and chat completions.
  • Supabase: Acts as the database and serverless function handler for caching embeddings and executing RPCs.
  • Deno: Runtime for serverless function deployment (a minimal handler is sketched after this list).
  • Applications:
    • Text embedding for search.
    • Auto-completion and matching.
    • Image-to-text descriptions for tagging.
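As a sketch of how these pieces come together at deployment time, a minimal Deno handler might look like the following; the request shape is an assumption, and it reuses the autoCompleteDescription wrapper sketched earlier:

// Hypothetical Deno edge function: embeds incoming text and returns matches.
Deno.serve(async (req: Request) => {
  const { search, userId, organizationId } = await req.json();
  const suggestions = await autoCompleteDescription(search, userId, organizationId);
  return new Response(JSON.stringify(suggestions), {
    headers: { "Content-Type": "application/json" },
  });
});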

This integration provides an efficient and scalable foundation for NLP tasks such as similarity search, auto-completion, and image-based tagging.