
Chat with AI in RStudio


chattr is a package that enables interaction with large language models (LLMs), such as GitHub Copilot Chat and OpenAI’s GPT 3.5 and 4. The main vehicle is a Shiny application running within the RStudio IDE. Here’s an example of what it looks like running inside the Viewer pane:


Figure 1: chattr’s Shiny app

Although this article highlights chattr’s integration with the RStudio IDE, it is worth mentioning that it also works outside of RStudio, for example in the terminal.

Getting started

To get started, install the package from CRAN and then call the Shiny app using the chattr_app() function:
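A minimal sketch of those two steps:

    # Install from CRAN
    install.packages("chattr")

    # Launch the Shiny app inside the RStudio IDE
    chattr::chattr_app()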

Once the app is open, you can also modify the prompt enhancements that chattr adds to your requests, for example how much information about the data files and data frames in your session is sent along with each question.
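If you prefer to adjust this from code, chattr exposes its session defaults through chattr_defaults(). The arguments below follow the package’s documented options, but treat this as a sketch and confirm the exact names against the help page of your installed version:

    library(chattr)

    # Trim the context that gets appended to every prompt
    chattr_defaults(
      max_data_files  = 0,  # do not list files from the working directory
      max_data_frames = 0   # do not describe in-memory data frames
    )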

Beyond the app

In addition to the Shiny app, chattr offers a couple of other ways to interact with the LLM:

  • Use the chattr() function (see the example after this list).
  • Highlight a question in your script and use it as a prompt.

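For example, a one-off question can be sent straight from the console with chattr(); the question itself is just an illustration:

    library(chattr)

    # Ask a question without opening the Shiny app
    chattr("how do I remove the legend from a ggplot?")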

RStudio Add-ins

chattr comes with two RStudio add-ins:


Screenshot of the chattr add-ins in RStudio

Figure 4: chattr add-ins

You can bind these add-in calls to keyboard shortcuts, making it easy to open the app without having to type the command each time. To find out how to do this, see the keyboard shortcuts section on the official chattr website.

Work with local LLMs

Today, capable open-source models that can run on your laptop are widely available. Instead of integrating with each model individually, chattr works with LlamaGPTJ-chat, a lightweight application that communicates with a variety of local models. At this time, LlamaGPTJ-chat integrates with the following model families:

  • GPT-J (ggml and gpt4all models)
  • LLaMA (ggml Vicuna models from Meta)
  • Mosaic Pretrained Transformers (MPT)

LlamaGPTJ-chat works directly from the terminal. chattr integrates with the application by starting a “hidden” terminal session, where it initializes the selected model and makes it available to chat with.

To get started, you need to install LlamaGPTJ-chat and download a compatible model. More detailed instructions are found here.

chattr looks for the LlamaGPTJ-chat executable and the installed model in a specific folder on your machine. If the installation paths do not match the locations expected by chattr, then LlamaGPT will not appear in the menu. That is OK; you can still access it with chattr_use():
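A sketch of what that call might look like; the provider label follows chattr’s naming for this backend, and the two bracketed paths are placeholders for your own install and model locations (on some versions these paths are set through chattr_defaults() instead):

    library(chattr)

    # Point chattr at the LlamaGPTJ-chat executable and the downloaded model
    chattr_use(
      "llamagpt",
      path  = "[path to the compiled LlamaGPTJ-chat program]",
      model = "[path to the downloaded model file]"
    )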


Feedback welcome

After trying it out, feel free to post your thoughts or issues in the chattr GitHub repository.
