
aidea

Combine the power of LLMs and R to help guide exploration of a dataset.

DISCLAIMER: This package is a proof of concept and was created for a 2-day hackathon. It's currently just a fun side project. Don't use it for anything serious.

Installation

You can install the development version of aidea from GitHub with:

remotes::install_github("cpsievert/aidea")

Prerequisites

To use this package, you'll also need credentials for the LLM that powers assist().

By default, assist() uses OpenAI, so you'll need to set an environment variable named OPENAI_API_KEY to a key from https://platform.openai.com/account/api-keys.

We recommend setting that variable in your .Renviron file via usethis::edit_r_environ(). See {elmer}'s prerequisites if you plan to use a different model.
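
For example, a minimal sketch of setting the key (assuming you have already generated one; the placeholder value below must be replaced with your actual key):

# Set the key for the current R session only (replace with your real key)
Sys.setenv(OPENAI_API_KEY = "sk-...")

# Or persist it across sessions by adding OPENAI_API_KEY=<your key> to ~/.Renviron
usethis::edit_r_environ()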

Usage

This package currently contains just one function, assist(), which takes a data frame as input and provides a chatbot experience tailored to that dataset:

# Load a dataset
data(diamonds, package = "ggplot2")
# Start the aidea app assistant
aidea::assist(diamonds)

You'll be welcomed with an overview of what's in the data (e.g., interesting summary stats, variable types, etc.) as well as some suggested questions to ask about the data.

(Screenshot: welcome message with dataset overview and suggested questions)

When you ask a question about the data, it'll offer R code to assist in answering that question.

That R code includes an option to run the code in the browser:

(Screenshot: suggested R code with an option to run it)

When clicked, the code runs in a sidebar, and the results are displayed below the interactive code editor.

(Screenshot: code running in a sidebar with results below the editor)

If you're unsure how to interpret the results, press the interpret button. This opens an additional sidebar with an interpretation of the current results:

(Screenshot: interpretation sidebar explaining the current results)

How does it work?

This package uses a combination of {elmer} and {shinychat} to provide the LLM-assisted chatbot experience. It does not send all of your data to the LLM, just basic summary stats (e.g., number of rows/columns) and data characteristics (e.g., variable types). It will, however, send any results you choose to interpret to the LLM. If you are worried about privacy, consider using a local model (e.g., assist(chat = elmer::chat_ollama())) instead of OpenAI.
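
For example, a minimal sketch of pointing assist() at a local model (this assumes a running Ollama server and that a model such as llama3.1 has already been pulled; the model name is only an illustration):

# Chat with a locally hosted model via Ollama instead of OpenAI
chat <- elmer::chat_ollama(model = "llama3.1")
aidea::assist(diamonds, chat = chat)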
