ai.client #

AIClient Factory

This directory contains the implementation of the AIClient factory, which provides a unified interface for interacting with various Large Language Model (LLM) providers such as Groq and OpenRouter. It leverages the existing OpenAI client infrastructure to abstract away the differences between providers.

File Structure

  • aiclient.v: The main factory and core functions for the AIClient.
  • aiclient_models.v: Defines LLM model enums and their mapping to specific model names and API base URLs.
  • aiclient_llm.v: Handles the initialization of various LLM provider clients.
  • aiclient_embed.v: Provides functions for generating embeddings using the configured LLM models.
  • aiclient_write.v: Implements complex file writing logic, including backup, AI-driven modification, content validation, and retry mechanisms.
  • aiclient_validate.v: Contains placeholder functions for validating different file types (Vlang, Markdown, YAML, JSON).

Usage

To use the AIClient, you first need to initialize it:

import aiclient

mut client := aiclient.new()!

Ensure that the necessary environment variables (GROQKEY and OPENROUTER_API_KEY) are set for the LLM providers.

Environment Variables

  • GROQKEY: API key for Groq.
  • OPENROUTER_API_KEY: API key for OpenRouter.
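
Because new() returns a result (!AIClient), missing keys will surface as an error at initialization time. To fail fast with a clearer message, a small up-front check like the following sketch can help, using V's os module:

import os

for key in ['GROQKEY', 'OPENROUTER_API_KEY'] {
	if os.getenv(key) == '' {
		eprintln('missing environment variable: ${key}')
	}
}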

Dependencies

The Markdown and YAML validators rely on external V modules, which can be installed with:

v install prantlf.yaml
v install markdown

fn llms_init #

fn llms_init() !AIClientLLMs

Initialize all LLM clients
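
As the listing shows, llms_init is exported, so the provider clients can also be constructed directly without going through new(); the same environment variables must be set:

llms := aiclient.llms_init()!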

fn new #

fn new() !AIClient

fn validate_json_content #

fn validate_json_content(path_ pathlib.Path) !string

fn validate_markdown_content #

fn validate_markdown_content(path_ pathlib.Path) !string

fn validate_vlang_content #

fn validate_vlang_content(path pathlib.Path) !string

fn validate_yaml_content #

fn validate_yaml_content(path_ pathlib.Path) !string
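
All four validators are currently placeholders (see aiclient_validate.v). As an illustration of the intended shape, a minimal JSON check could parse the file and return its content, erroring on invalid input. The following sketch assumes pathlib.Path exposes a read() method and uses V's x.json2 parser; it is not the module's actual implementation:

import x.json2

// Sketch only: return the file content if it parses as JSON, else an error.
fn validate_json_content_sketch(mut path pathlib.Path) !string {
	content := path.read()! // assumes pathlib.Path has a read() method
	_ := json2.raw_decode(content)! // raw_decode errors on malformed JSON
	return content
}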

enum LLMEnum #

enum LLMEnum {
	maverick
	qwen
	embed
	llm_120b
	best
	flash
	pro
	morph
	local
}
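
Each variant selects one of the configured clients in AIClientLLMs below. In write_from_prompt, a list of variants gives the fallback order, for example:

models := [LLMEnum.flash, LLMEnum.pro, LLMEnum.best] // tried in order until validation succeeds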

struct AIClient #

@[heap]
struct AIClient {
pub mut:
	llms AIClientLLMs
	// Add other fields as needed
}

fn (AIClient) write_from_prompt #

fn (mut ac AIClient) write_from_prompt(args WritePromptArgs) !

write_from_prompt modifies a file based on AI-generated modification instructions

The process:

  1. Uses the first model to generate modification instructions from the prompt.
  2. Uses the morph model to apply those instructions to the original content.
  3. Validates the result based on file type (.v, .md, .yaml, .json).
  4. On validation failure, retries with the next model in the list.
  5. Restores from backup if all models fail.
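
An illustrative sketch of that loop follows. The helper names generate_instructions, apply_with_morph, and validate_content are hypothetical placeholders, not the module's real internals:

// Sketch only: hypothetical helpers stand in for the module's internals.
fn (mut ac AIClient) write_from_prompt_sketch(args_ WritePromptArgs) ! {
	mut args := args_
	original := args.path.read()! // back up the current content
	for model in args.models {
		// step 1: the selected model turns the prompt into modification instructions
		instructions := ac.generate_instructions(model, args.prompt, original) or { continue }
		// step 2: the morph model applies those instructions to the original content
		modified := ac.apply_with_morph(instructions, original) or { continue }
		args.path.write(modified)!
		// steps 3-4: validate by file type; on failure, move on to the next model
		validate_content(mut args.path) or { continue }
		return
	}
	args.path.write(original)! // step 5: restore the backup
	return error('write_from_prompt: all models failed validation')
}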

struct AIClientLLMs #

struct AIClientLLMs {
pub mut:
	llm_maverick    &openai.OpenAI
	llm_qwen        &openai.OpenAI
	llm_120b        &openai.OpenAI
	llm_best        &openai.OpenAI
	llm_flash       &openai.OpenAI
	llm_pro         &openai.OpenAI
	llm_morph       &openai.OpenAI
	llm_embed       &openai.OpenAI
	llm_local       &openai.OpenAI
	llm_embed_local &openai.OpenAI
}

struct WritePromptArgs #

@[params]
struct WritePromptArgs {
pub mut:
	path          pathlib.Path
	prompt        string
	models        []LLMEnum = [.best]
	temperature   f64       = 0.5
	max_tokens    int       = 16000
	system_prompt string    = 'You are a helpful assistant that modifies files based on user instructions.'
}

WritePromptArgs holds the parameters for the write_from_prompt function.
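
A typical call, using V's named struct args; pathlib.get as the Path constructor is an assumption about the pathlib module in use:

import aiclient
import pathlib

mut client := aiclient.new()!
client.write_from_prompt(
	path:        pathlib.get('docs/readme.md') // assumed Path constructor
	prompt:      'Fix typos and tighten the wording.'
	models:      [.best, .qwen] // fallback order on validation failure
	temperature: 0.2
)!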