
BChat IntelliJ Plugin: Installation & Usage Guide

balakumar · Senior Software Engineer

BChat seamlessly integrates Large Language Models (LLMs) into your IntelliJ IDEA workflow, boosting your coding productivity and enhancing code quality. This guide provides a comprehensive overview of the plugin's features and how to use them effectively.

Installation

From a JAR File

  1. Download the BChat plugin JAR file.
  2. Open IntelliJ IDEA.
  3. Go to File > Settings (or IntelliJ IDEA > Preferences on macOS).
  4. Select Plugins.
  5. Click the gear icon and choose Install Plugin from Disk...
  6. Navigate to the downloaded BChat JAR file and select it.
  7. Restart IntelliJ IDEA to activate the plugin.

Usage

Accessing BChat

After installation, access the BChat tool window:

  • Menu: View > Tool Windows > BChat
  • Toolbar: Click the BChat icon (a brain with waves) in the toolbar.

Configuring LLM Providers

Before using BChat, configure settings for your chosen LLM providers:

  1. Open BChat Settings:
    • Menu: File > Settings > Tools > BChat Settings (Windows/Linux)
    • Menu: IntelliJ IDEA > Preferences > Tools > BChat Settings (macOS)
  2. Configure Settings:
    • Local LLMs: Provide the URLs for your locally hosted Ollama, LMStudio, or GPT4All instances:
      • Ollama URL: The URL where your Ollama server is running (e.g., http://localhost:11434/)
      • LMStudio URL: The URL of your LMStudio server (e.g., http://localhost:1234/v1/)
      • GPT4All URL: The URL for your GPT4All server (e.g., http://localhost:4891/v1/)
    • Cloud-based LLMs: Enter your API keys for:
      • OpenAI API Key: Your API key from OpenAI.
      • Anthropic API Key: Your API key from Anthropic.
      • Mistral API Key: Your API key from Mistral AI.
      • Groq API Key: Your API key from Groq.
      • DeepInfra API Key: Your API key from DeepInfra.
    • LLM Parameters: Adjust the following:
      • Temperature: Controls the randomness of the LLM's output (higher values produce more varied, "creative" text; lower values are more deterministic).
      • Top-P: Nucleus sampling; limits generation to the most probable tokens whose cumulative probability reaches P (higher values allow more diverse output).
      • Timeout (in seconds): Sets a time limit for LLM responses.
      • Maximum Retries: Specifies the number of attempts if a request fails.
    • Predefined Command Prompts: Customize the prompts for the following commands (more details below):
      • /test: Generate unit tests.
      • /explain: Explain the selected code.
      • /review: Get a code review.
      • /custom: Execute a custom prompt.
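Before pointing BChat at a local server, it helps to confirm the server is actually listening at the URL you entered. A minimal sketch in Python (the health-check paths are assumptions based on each server's standard API — Ollama's `/api/tags` and the OpenAI-compatible `/models` route used by LMStudio and GPT4All — not part of BChat itself):

```python
import urllib.request
import urllib.error

# Default endpoints from BChat's settings, with standard health-check paths
# appended; adjust host/port if your servers run elsewhere.
SERVERS = {
    "Ollama":   "http://localhost:11434/api/tags",
    "LMStudio": "http://localhost:1234/v1/models",
    "GPT4All":  "http://localhost:4891/v1/models",
}

def is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if the server answers the URL with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    for name, url in SERVERS.items():
        print(f"{name}: {'reachable' if is_up(url) else 'not reachable'}")
```

If a server shows as not reachable, start it (or fix the port) before saving the URL in BChat's settings.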

Interacting with LLMs: The Core Workflow

  1. Select Code: Select the code snippet you want the LLM to process.
    • Select text within a file.
    • Add entire files to the prompt context (see "Adding Files to Prompt Context" below).
  2. Choose Provider & Model: In the BChat tool window, select your preferred LLM provider and model from the dropdowns.
  3. Enter Prompt: Type your prompt or use a predefined command:
    • /test: Write unit tests using JUnit for the selected code.
    • /review: Review the code for improvements and potential bugs.
    • /explain: Explain the selected code in a way that a junior developer would understand.
    • /custom: Execute your custom prompt (defined in the Settings).
  4. Submit: Click the Submit button (paper airplane icon) or press Ctrl+Enter (Cmd+Enter on macOS).
  5. View Response: The LLM response will appear in the output panel.
    • Copy Response: Copy the response to the clipboard.
    • Insert Code Snippets: Directly insert code snippets from the response into your editor.
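For a sense of what happens on Submit: requests to OpenAI-compatible providers generally carry your selected code, your instruction, and the parameters from the settings panel. The sketch below assembles such a chat-completions payload (field names follow the public OpenAI API; BChat's actual internals may differ, and the model name and values here are only examples):

```python
import json

# Example values mirroring BChat's LLM Parameters settings.
settings = {"temperature": 0.7, "top_p": 0.9}

def build_chat_request(model: str, code: str, instruction: str) -> str:
    """Assemble an OpenAI-style chat-completions payload as a JSON string."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": f"{instruction}\n\n```\n{code}\n```"},
        ],
        "temperature": settings["temperature"],
        "top_p": settings["top_p"],
    }
    return json.dumps(payload, indent=2)

print(build_chat_request(
    "gpt-4o-mini",
    "int add(int a, int b) { return a + b; }",
    "Write unit tests using JUnit for the selected code.",
))
```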

Adding Files to Prompt Context

  1. Click the Add File button (plus icon) in the BChat tool window. A popup will appear.
  2. Double-click a file from the list to add it.
  3. Added files will be listed in the "Prompt Context Files" panel.
    • Remove Files: Click the close (x) button next to a file entry to remove it.

Adding Code Snippets to Prompt Context

  1. Select a code snippet in your editor.
  2. Right-click and choose Add Snippet to BChat.
  3. The snippet is added to the "Prompt Context Files" panel. BChat treats the snippet as a separate "file" when generating prompts.
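Conceptually, every context entry — a whole file or a pasted snippet — is serialized into the prompt the same way. A hypothetical sketch of that assembly (the delimiter format is invented for illustration; BChat's actual prompt format may differ):

```python
def assemble_context(entries: dict[str, str]) -> str:
    """Join context entries (name -> contents) into one prompt preamble.

    A snippet is just another entry here, matching how BChat treats
    snippets as separate "files" when generating prompts.
    """
    parts = [f"--- {name} ---\n{contents}" for name, contents in entries.items()]
    return "\n\n".join(parts)

context = assemble_context({
    "UserService.java": "public class UserService { /* ... */ }",
    "snippet-1": "if (user == null) throw new IllegalStateException();",
})
print(context)
```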

Generating Commit Messages: A Powerful Feature

BChat can automatically generate commit messages based on your changes:

  1. Stage your changes as you normally would in Git.
  2. Open the "Commit" tool window.
  3. Click the Generate Commit Message button. BChat will:
    • Analyze the Git diff of your staged changes.
    • Generate a concise and informative commit message using the selected LLM.
    • Automatically extract the Jira issue ID from your branch name (if it follows the pattern PROJECT-ISSUE_NUMBER).
    • Prepend the Jira issue ID to the commit message (e.g., JIRA-123: Fix bug in user authentication).

Starting a New Conversation

To start fresh and clear the chat history, click the New Conversation (plus icon) button.

Additional Tips

  • Clear Prompts: Be clear and specific in your prompts to get the best results from the LLM.
  • Experiment: Try different LLM providers and models to find the one that works best for your specific needs and coding style.
  • Quick Settings: Use the Settings (cog icon) button in the BChat tool window to access the plugin configuration quickly.

Contributing

BChat is open source. Contributions and feedback are very welcome! The project repository is on GitHub.