Bring Your Own LLM

Now you can plug your own LLMs directly into Botpress using the LLM interface. This gives you more flexibility and control over your bot's AI-powered actions. Whether you're working with custom models or specialized solutions, you can now easily integrate them into your workflows. This update is perfect for those who want to fine-tune their bot's behaviour for specific use cases or experiment with different LLMs on the fly.

Hugging Face Integration

You can now access Hugging Face's extensive library of models right from the Botpress Hub, making it simple to deploy state-of-the-art NLP models without jumping through extra hoops.

Computed Table Columns

Computed table columns are now live, allowing you to create dynamic, calculated data directly within your tables. Columns can be computed programmatically through code or by AI-powered operations run through an LLM. It's a great way to add more depth to the structured data you have stored in Tables.
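As a rough sketch of the idea (not the actual Botpress Tables API — the row shape and function name below are hypothetical), a code-based computed column is essentially a small function evaluated once per record to derive a new field:

```typescript
// Hypothetical example only: illustrates the concept of a computed column,
// not Botpress's real Tables interface.
interface OrderRow {
  item: string;
  qty: number;
  unitPrice: number;
}

// Derive a "total" column from each row, the way a code-based
// computed column runs a small function per record.
function addTotalColumn(rows: OrderRow[]): (OrderRow & { total: number })[] {
  return rows.map((row) => ({ ...row, total: row.qty * row.unitPrice }));
}

const orders: OrderRow[] = [
  { item: "widget", qty: 2, unitPrice: 5 },
  { item: "gadget", qty: 1, unitPrice: 12 },
];

const withTotals = addTotalColumn(orders);
// withTotals[0].total === 10, withTotals[1].total === 12
```

An AI-powered column works the same way conceptually, except the per-record function is replaced by an LLM prompt evaluated against each row's fields.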

What else is new?

FEATURES & ENHANCEMENTS

๐Ÿ› ๏ธ Autonomous Nodes now provide more descriptive error messages when encountering issues.
๐Ÿ“ LLMzโ€™s truncation logic now accounts for response length, ensuring more accurate outputs.
๐Ÿงฉ Improved the Autonomous Nodeโ€™s handling of variable reading and writing.
โšก Optimized Autonomous Nodes to reduce redundant model calls when previous results are available.

BUG FIXES

โŒจ๏ธ Fixed an issue with premature truncation that caused messages to lose content unexpectedly.
๐Ÿ—บ๏ธ Fixed an issue with sitemap discovery for Knowledge Bases (KBs).
๐Ÿ“• Resolved a problem where LLMz would respond with an empty message.
โŒ Corrected an error that prevented bot deletion under certain conditions.

UI Enhancements and Stability Improvements

🔧 Tooltip visibility in execute code card: Resolved an issue where tooltips were being cut off in the execute code card, ensuring complete visibility of information.

🚀 Studio & bot conversation stability: Implemented extensive stability improvements to enhance the reliability of Studio and bot conversation states.

🚨 Timeout flow override notification: When setting the Inactivity Timeout to 0, a new Toast message now appears alerting you that the Timeout Workflow is disabled.

Integration and Postback Enhancements

📂 Grouped integration actions in cards tray: Actions associated with an integration are now neatly grouped together within the cards tray, improving organization and usability.

🔄 Postback functionality in carousel cards: Added the ability to include postback actions in carousel cards. When a user clicks on the button, the associated text can now be sent as a user message.

Bug Fixes and Improvements

📚 KB flexibility: Fixed an issue where responses without citations couldn't be sent.

❗ Integration setup error messages: Error messages are now displayed directly within the integration setup modal, making it easier to diagnose and resolve integration issues.

⚙️ Webchat v2 improvements: Various enhancements have been made to webchat v2 to improve user experience and performance.

More robust LLM selection

When selecting an LLM, we now provide more robust information, such as which models are best suited to which use cases or modalities. We also offer recommendations, like good general-purpose models or which model to pick if you want to optimize for cost. The expanded menu lets you sort by maximum context window, as well as by the price of input and output tokens.

Rate Autonomous Node responses

You can now rate and provide feedback on the answers generated by Autonomous Nodes. In the emulator, when an Autonomous Node generates a response, you'll be prompted to provide feedback like a rating and any qualitative commentary you have about that response.

What else is new?

Features & enhancements

🧠 We now provide a more robust description of LLMs when selecting them for use in the studio, including tags, use cases, and the ability to sort by input/output token price.
⭐ Provide feedback on Autonomous Node responses directly in the emulator by rating generations.
🛠️ You can now execute integration actions through code, instead of solely through the cards provided by an integration.
🚪 A helpful message is now displayed when trying to enter a Workspace you don't have access to.
📂 You can now fork Workflow Hub cards in a git-like manner.
🎯 We added a small animation to the Inspect button below Autonomous Node responses to make it clear that you can analyze its iteration process.
🎛️ You can now select which Workspace you want to publish Workflows from, for example if you keep separate Workspaces for production and development.
⚙️ We optimized the Table management experience, so you should notice a smoother workflow when working with records.
📊 We made various improvements to the Logs and Issues tabs of the Dashboard, to make them easier to read and understand.

Bug fixes

🕒 Fixed an error where the Timeout flow would ignore choices on capture cards.
🤖 Fixed a bug where the Autonomous Node would ignore instructions provided to an Agent.
🌐 Fixed an issue where browser translators would prevent people from entering the Studio under certain conditions.
🗂️ Fixed the behaviour of the File API when encountering files with invalid encoding.
🔄 We now prevent an issue where Tables would crash upon failure to connect to a database under certain circumstances.
🔍 Fixed a bug where special characters would cause a search query to fail.

Studio Integrations

The Studio is now the default location to install and manage integrations. Integrations can be installed, uninstalled, and configured there, reducing the need to navigate back to the Dashboard when you want to make changes to an integration.

Select LLMs

We've added the ability to select which LLM is used for which task across all actions and cards in the Studio. You can configure them in cards, agents, and other global settings, such as for Autonomous Nodes. You can also set up custom LLM configurations for different actions.

Enabling LLMs can be done through the integrations menu. You can install integrations that allow you to access LLMs across different LLM providers like OpenAI and Anthropic.

WhatsApp OAuth

The WhatsApp integration now supports OAuth, which offers a simplified configuration experience if you're looking to deploy a bot to a Meta-verified business quickly. Manual configuration is still available for complex use cases, but if you want a faster setup, OAuth should work great.

Transcribe Audio

Enabling models that support audio transcription gives you access to the 'Transcribe Audio' card in the Studio. This enables your bot to receive and transcribe audio files, which is currently done through a File URL. We are working to simplify this experience.

Policy Agent

The Policy Agent is a global agent that allows you to set system prompts, like 'do not talk about competitor X or Y'. It's a great option to provide global commands to your bot that can be managed at the conversation or the node level.