
LLM Connector

Repository Option (Namespace: Grooper.GPT)

Provides integration with OpenAI-compatible large language models (LLMs) for the Grooper repository.

Remarks

The LLM Connector option enables advanced AI-powered features throughout Grooper by integrating one or more OpenAI-compatible large language model (LLM) providers.
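"OpenAI-compatible" means in practice that each provider exposes the standard chat-completions wire format. Purely as an illustration of that format (not Grooper's internal code), a minimal request might look like the sketch below; the base URL, API key, and model name are placeholders.

```python
import requests

BASE_URL = "https://api.openai.com/v1"   # placeholder; any OpenAI-compatible endpoint works
API_KEY = "sk-..."                        # placeholder credential

# Minimal chat-completions request in the OpenAI-compatible wire format.
response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",           # placeholder model name
        "messages": [
            {"role": "system", "content": "You are a document-processing assistant."},
            {"role": "user", "content": "Summarize the attached invoice text."},
        ],
        "temperature": 0.2,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Any service that accepts this request shape, whether a hosted API or a local gateway, can in principle act as a provider.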

Functionality Enabled

The following Grooper features require an LLM Connector option to be configured:

  • AI Productivity Tools: Activates chat-based and generative AI tools in the Grooper UI, such as document summarization, content generation, and context-aware assistance.
  • AI Extract: Enables the AI Extract activity, which uses LLMs to extract structured data from unstructured documents using prompt-driven extraction logic.
  • AI Separate: Powers the AI Separate activity, allowing LLMs to intelligently split documents into logical units based on content and context.
  • Ask AI: Provides the Ask AI tool, enabling users to query LLMs directly from the Grooper UI for ad-hoc questions, explanations, or content generation.
  • LLM Classifier: Supports the LLM Classifier activity, which uses LLMs to classify documents or data elements based on prompts or examples.
  • GPT Embed: Supports the GPT Embed activity, which generates vector embeddings for document text, enabling semantic search and context-aware retrieval (see the embedding sketch after this list).
  • Chat Page: Enables the use of AI Assistants on the Chat Page in the Grooper UI, allowing users to interact with LLMs in a conversational interface for assistance, brainstorming, or workflow support.
  • Other LLM-Powered Features: Any custom modules, scripts, or activities that leverage LLMs for advanced language understanding, transformation, or decision support.
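As a rough illustration of how the vector embeddings behind GPT Embed enable semantic search, the sketch below calls an OpenAI-compatible embeddings endpoint and ranks a few passages against a query by cosine similarity. The endpoint, model name, and sample passages are placeholders; this outlines the general technique rather than Grooper's implementation.

```python
import math
import requests

BASE_URL = "https://api.openai.com/v1"   # placeholder OpenAI-compatible endpoint
API_KEY = "sk-..."                        # placeholder credential

def embed(texts):
    """Return one embedding vector per input string via the /embeddings endpoint."""
    resp = requests.post(
        f"{BASE_URL}/embeddings",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "text-embedding-3-small", "input": texts},  # placeholder model
        timeout=60,
    )
    resp.raise_for_status()
    return [item["embedding"] for item in resp.json()["data"]]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

passages = [
    "Invoice total is $1,200.",
    "Employment agreement effective March 1.",
    "Purchase order for 40 laptops.",
]
query = "Which document talks about a purchase?"

passage_vecs = embed(passages)
query_vec = embed([query])[0]

# Rank passages by similarity to the query: the essence of semantic retrieval.
scores = [cosine(query_vec, v) for v in passage_vecs]
for text, score in sorted(zip(passages, scores), key=lambda pair: pair[1], reverse=True):
    print(f"{score:.3f}  {text}")
```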

Integration Points

  • Grooper UI: Enables AI-powered features in the user interface, including productivity tools, the Chat Page, and context-aware suggestions.
  • Batch Processes & Activities: Makes LLM-based operations available to Grooper activities (such as AI Extract, AI Separate, LLM Classifier), scripts, and custom modules.
  • Repository-Wide Availability: Once the option is configured, all users and processes in the repository can access the enabled LLM features, subject to licensing and permissions.

Configuration

  1. Add one or more LLM Providers to the 'Service Providers' property. Each provider defines a connection to an OpenAI-compatible API or service (see the configuration sketch after these steps).
  2. Set the 'Tools Chat Model' property to specify the default model for UI-based AI tools.
  3. Adjust 'Default Chat Parameters' to control the behavior of LLM completions and responses.
  4. Optionally, configure additional providers or models to support multiple endpoints or specialized use cases.
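The actual properties are set in the Grooper UI, so the exact schema is not reproduced here. As a conceptual sketch only (all field names below are illustrative, not Grooper's property names), the configuration amounts to a list of provider connections plus repository-wide defaults:

```python
# Conceptual picture of the LLM Connector settings; field names are illustrative only.
llm_connector = {
    "service_providers": [
        {
            "name": "primary-openai",                   # illustrative provider label
            "base_url": "https://api.openai.com/v1",    # any OpenAI-compatible endpoint
            "api_key": "sk-...",                        # placeholder credential
        },
        {
            "name": "local-gateway",
            "base_url": "http://llm-gateway.internal/v1",
            "api_key": "not-needed",
        },
    ],
    "tools_chat_model": "gpt-4o-mini",                  # default model for UI-based AI tools
    "default_chat_parameters": {                        # defaults applied to completions
        "temperature": 0.2,
        "top_p": 1.0,
        "max_tokens": 1024,
    },
}

print(len(llm_connector["service_providers"]), "providers configured")
```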

Notes

  • The list of available models is automatically aggregated from all configured providers (see the sketch below).
  • Disabling or removing the LLM Connector option will deactivate all LLM-powered features in Grooper.
  • Licensing may be required to enable certain features or providers.
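The aggregation described in the first note can be pictured as querying each configured provider's models endpoint and merging the results. A minimal sketch, assuming hypothetical provider records like those in the configuration example above:

```python
import requests

def available_models(providers):
    """Collect model ids from every configured provider's /models endpoint and de-duplicate."""
    models = set()
    for provider in providers:
        resp = requests.get(
            f"{provider['base_url']}/models",
            headers={"Authorization": f"Bearer {provider['api_key']}"},
            timeout=30,
        )
        resp.raise_for_status()
        models.update(item["id"] for item in resp.json()["data"])
    return sorted(models)

# Illustrative provider records; real connections are defined in the repository option.
providers = [
    {"base_url": "https://api.openai.com/v1", "api_key": "sk-..."},
    {"base_url": "http://llm-gateway.internal/v1", "api_key": "not-needed"},
]
print(available_models(providers))
```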

Properties

  • Service Providers: One or more LLM Provider connections; each defines a connection to an OpenAI-compatible API or service.
  • Tools Chat Model: The default model used by UI-based AI tools.
  • Default Chat Parameters: Default parameters controlling the behavior of LLM completions and responses.

See Also

Notification