Configure an LLM Model as a Perceptor
You can use an LLM as a perceptor to add language and communication capabilities to your agent.
This allows you to create human-like assistants or copilots that contribute natural language capabilities to your agent. Composabl provides several personas for LLM assistants to help structure your agent design:
- The analyst interprets sensor data and passes it to an interface that the user can access, allowing real-time monitoring of conditions and the agent's responses.
- The executive reads external data sources in text and reports information to the agent, such as trends in the business press that could help anticipate demand for a product.
- The plant manager allows operators to communicate directly with the agent and give it instructions based on information that would not otherwise be available in its sensor space.
LLM perceptors can either:
- Output language to the operator about what the agent is doing (for example, the analyst)
- Take in natural language inputs and transform them into information that the decision-making layer of the agent can use (for example, the executive and the plant manager)
Create an LLM Perceptor
Step 1: Create the perceptor shell
From the CLI, when logged in to Composabl, type `composabl perceptor new`. You will be prompted for a location to save your new perceptor, and a new directory containing your perceptor will be created.
This directory will include the `pyproject.toml` file that allows you to publish the perceptor to the UI once it is created.
Step 2: Define the perceptor class
Within the `perceptor.py` file, create the API call and prompt for the LLM.
Analyst Perceptor Code Sample
The analyst displays information to the human user, but doesn't send information to the decision-making layer of the agent, so the perceptor returns 0.
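Below is a minimal sketch of what an analyst perceptor might look like. The perceptor class shape, the `compute` signature, the `OpenAI` client, and the model name are all illustrative assumptions, not the verified Composabl API; check the Composabl SDK reference for the exact base class and method signatures.

```python
# Illustrative sketch only: the perceptor class shape and the LLM
# client shown here are assumptions, not the verified Composabl API.
from openai import OpenAI  # any chat-capable LLM client would work

client = OpenAI()  # reads OPENAI_API_KEY from the environment


class AnalystPerceptor:
    """Summarizes sensor data for the human operator."""

    def __init__(self):
        self.name = "analyst"

    def compute(self, obs_spec, obs):  # signature is an assumption
        # Build a prompt from the current sensor readings.
        prompt = (
            "Summarize these sensor readings for a plant operator, "
            f"in two sentences: {obs}"
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[{"role": "user", "content": prompt}],
        )

        # Surface the summary to the operator's interface
        # (printed here for simplicity).
        print(response.choices[0].message.content)

        # The analyst informs the human, not the agent, so it
        # returns 0 to the decision-making layer.
        return {"analyst": 0}
```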
Executive Code Sample
This sample executive, built for the industrial mixer use case, automatically queries a chemical engineering LLM for advice about control actions to take.
The perceptor returns a recommended action for the decision-making layer of the skill. This becomes a new sensor variable that the skill teacher(s) will take into account when training the agent in simulation.
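A hedged sketch of such an executive perceptor follows. The prompt, the sensor-variable name `executive_recommended_action`, and the LLM call are assumptions for illustration; the published sample code will differ in its details.

```python
# Illustrative sketch only: class shape, prompt, and return
# convention are assumptions, not the verified sample code.
from openai import OpenAI

client = OpenAI()


class ExecutivePerceptor:
    """Queries an LLM for control advice and exposes the answer
    to the agent as a new sensor variable."""

    def __init__(self):
        self.name = "executive"

    def compute(self, obs_spec, obs):  # signature is an assumption
        prompt = (
            "You are a chemical engineering advisor for a continuous "
            "stirred-tank reactor. Given these sensor readings, reply "
            f"with a single recommended setpoint change as a number: {obs}"
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[{"role": "user", "content": prompt}],
        )
        text = response.choices[0].message.content

        # Parse the reply into a number; fall back to 0.0 if the
        # LLM returns something unparseable.
        try:
            recommended = float(text.strip().split()[0])
        except (ValueError, IndexError):
            recommended = 0.0

        # The returned value becomes a new sensor variable that the
        # skill teacher(s) take into account during training.
        return {"executive_recommended_action": recommended}
```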
Examples and Reference
See full code samples and more examples.
Step 3: Filter the Sensor Space
Composabl agents can include text fields in perceptors, but these must be transformed or filtered out in the `teacher.py` file before training with DRL. For any text variables that are not transformed into a different data type, use the `filtered_sensor_space` method of the teacher to remove them.
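As a sketch, filtering might look like the following. Only the `filtered_sensor_space` method name comes from this page; the teacher class shape, the synchronous signature, and the sensor names are assumptions (some SDK versions may define the method as async).

```python
# Illustrative teacher sketch: only the filtered_sensor_space name
# is taken from this page; everything else is assumed.
class MixerTeacher:
    def filtered_sensor_space(self):
        # Keep only numeric sensors for DRL training, dropping any
        # raw text fields produced by LLM perceptors.
        return [
            "temperature",
            "concentration",
            "executive_recommended_action",
        ]
```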
Step 4: Publish the Perceptor
Publish the perceptor to the UI.
Navigate to the folder above your perceptor, then publish it. Select the organization and project to add your perceptor to. Then refresh your Agent Builder Studio to see the perceptor and add it to agents.
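If the CLI mirrors the `composabl perceptor new` pattern, the publish command would likely be `composabl perceptor publish`; this subcommand name is an assumption, so consult the CLI help if it differs.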