Version 0.3.0

Version 0.3.0 is currently being prepared for CRAN. This release represents a major milestone for tidyllm.

The largest changes compared to 0.2.0 are:

New Verb-Based Interface

Each verb and provider combination routes the interaction to provider-specific functions like openai_chat() or claude_chat() that do the work in the background. These functions can also be called directly, offering a more verbose, provider-specific alternative interface.

Old Usage:

llm_message("Hello World") |>
  openai(.model = "gpt-4o")

New Usage:

# Recommended Verb-Based Approach
llm_message("Hello World") |>
  chat(openai(.model = "gpt-4o"))
  
# Or configure a provider object once and reuse it
my_ollama <- ollama(.model = "llama3.2-vision:90B",
                    .ollama_server = "https://ollama.example-server.de",
                    .temperature = 0)

llm_message("Hello World") |>
  chat(my_ollama)

# Alternative approach: use the more verbose provider-specific functions
llm_message("Hello World") |>
  openai_chat(.model = "gpt-4o")
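
The verb-based pattern is not limited to chat. A minimal sketch, assuming an embed() verb that routes to provider-specific embedding functions in the same way (the model name is only illustrative):

# Assumed embed() verb; routes to the configured provider's embedding API
c("Hello World", "Hallo Welt") |>
  embed(openai(.model = "text-embedding-3-small"))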

Backward Compatibility:

Breaking Changes:

Other Major Features:

Improvements:

Version 0.2.7

Major Features
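
Gemini requests can now be grounded in Google Search results via the .grounding_threshold argument of gemini_chat(); roughly, a query is answered with search-grounded results when the model's predicted relevance score exceeds the threshold: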

llm_message("What is tidyllm and who maintains this package?") |>
  gemini_chat(.grounding_threshold = 0.3)

Improvements

Version 0.2.6

Large Refactor of package internals

Breaking Changes

Minor Features

# Upload local media files to the Gemini API
here::here("local_wip", "example.mp3") |> gemini_upload_file()
here::here("local_wip", "legrille.mp4") |> gemini_upload_file()

# Get a tibble of all files currently available on the Gemini API
file_tibble <- gemini_list_files()

# Use the uploaded files in a request
llm_message("What are these two files about?") |>
  gemini_chat(.fileid = file_tibble$name)
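
Files uploaded this way can be removed from the Google servers again with gemini_delete_file() (see the notes for version 0.2.2 below).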

Version 0.2.5

Major Features

Better embedding functions with improved output, error handling, and new documentation. There is a new article on using embeddings with tidyllm, as well as support for embedding models on Azure via azure_openai_embedding().
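
A minimal sketch of an embedding call, assuming azure_openai_embedding() accepts a character vector of texts and that Azure credentials and deployment are configured via the usual environment variables:

# Hypothetical input texts; the function is assumed to return a tibble
# with one embedding vector per input text
c("tidyllm wraps several large language model APIs",
  "Embeddings map texts to numeric vectors") |>
  azure_openai_embedding()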

Breaking Changes

Version 0.2.4

Refinements of the new interface

One disadvantage of the first iteration of the new interface was that all arguments for the provider-specific functions had to be passed through the provider function. This feels unintuitive, because users expect common arguments (e.g., .model, .temperature) to be set directly in main verbs like chat() or send_batch(). Moreover, provider functions don’t expose arguments for autocomplete, making it harder for users to explore options. Therefore, the main API verbs now directly accept common arguments and check them against the available arguments for each API, as in the sketch below.
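
For example, a minimal sketch of the refined call style, with the common arguments .model and .temperature passed directly to the chat() verb:

llm_message("Hello World") |>
  chat(openai(), .model = "gpt-4o", .temperature = 0)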

Bug-fixes

Version 0.2.3

Major Interface Overhaul

tidyllm has introduced a verb-based interface overhaul to provide a more intuitive and flexible user experience. Previously, provider-specific functions like claude(), openai(), and others were used directly for chat-based workflows. Now, these functions primarily serve as provider configuration for general verbs like chat().

Key Changes:

Each verb and provider combination routes the interaction to provider-specific functions like openai_chat() or claude_chat() that do the work in the background. These functions can also be called directly, offering a more verbose, provider-specific alternative interface.

Old Usage:

llm_message("Hello World") |>
  openai(.model = "gpt-4o")

New Usage:

# Recommended Verb-Based Approach
llm_message("Hello World") |>
  chat(openai(.model = "gpt-4o"))
  
# Or configure a provider object once and reuse it
my_ollama <- ollama(.model = "llama3.2-vision:90B",
                    .ollama_server = "https://ollama.example-server.de",
                    .temperature = 0)

llm_message("Hello World") |>
  chat(my_ollama)

# Alternative approach: use the more verbose provider-specific functions
llm_message("Hello World") |>
  openai_chat(.model = "gpt-4o")

Version 0.2.2

Major Features

# Upload a file for use with Gemini
upload_info <- gemini_upload_file("example.mp3")

# Make the file available during a Gemini API call
llm_message("Summarize this speech") |>
  gemini(.fileid = upload_info$name)

# Delete the file from the Google servers
gemini_delete_file(upload_info$name)

Version 0.2.1

Major Features:

conversation <- llm_message("Write a short poem about software development") |>
  claude()

# Get metadata on token usage and model as a tibble
get_metadata(conversation)

# Or print it together with the message
print(conversation, .meta = TRUE)

# Or always print metadata
options(tidyllm_print_metadata = TRUE)

Bug-fixes:

Version 0.2.0

New CRAN release. Largest changes compared to 0.1.0:

Major Features:

Improvements:

Breaking Changes:

Minor Updates and Bug Fixes:

Version 0.1.11

Major Features

Improvements

Version 0.1.10

Breaking Changes

Improvements

Version 0.1.9

Major Features

Breaking Changes

Improvements


Version 0.1.8

Major Features

Improvements


Version 0.1.7

Major Features


Version 0.1.6

Major Features


Version 0.1.5

Major Features

Improvements


Version 0.1.4

Major Features

Improvements


Version 0.1.3

Major Features

Breaking Changes


Version 0.1.2

Improvements


Version 0.1.1

Major Features

Breaking Changes