Configure an LLM Model as a Perceptor

You can use an LLM as a perceptor to add language and communication capabilities to your agent.

This allows you to create human-like assistants or copilots that contribute natural language capabilities to your agent. Composabl provides several personas for LLM assistants to help structure your agent design.

  • The analyst interprets sensor data and passes it to an interface that the user can access, allowing real-time monitoring of conditions and the agent's responses.

  • The executive reads external data sources in text and reports information to the agent, such as trends in the business press that would help it anticipate demand for a product.

  • The plant manager allows operators to communicate directly with the agent and give it instructions based on information that would not otherwise be available in its sensor space.

LLM perceptors can either:

  1. Output language to the operator about what the agent is doing (e.g., the analyst)

  2. Take in natural language inputs and transform them into information that the decision-making layer of the agent can use (e.g., the executive and the plant manager)

Create an LLM Perceptor

Step 1: Create the perceptor shell

From the CLI, while logged in to Composabl, run composabl perceptor new. You will be prompted for a location to save your new perceptor, and a new directory containing your perceptor will be created.

This directory will include the pyproject.toml file that allows you to publish the perceptor to the UI once it is created.
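The generated pyproject.toml might look something like the minimal sketch below. The name, version, and dependency shown here are placeholders, so check the file the CLI actually created rather than copying this verbatim.

[project]
name = "my-perceptor"        # placeholder: the name of your perceptor package
version = "0.1.0"
description = "An LLM perceptor for a Composabl agent"
dependencies = [
    "composabl-core",        # assumed dependency providing PerceptorImpl
]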

Step 2: Define the perceptor class

Within the perceptor.py file, create the API call and prompt for the LLM.

Analyst Perceptor Code Sample

The analyst displays information to the human user, but doesn't send information to the decision-making layer of the agent, so the perceptor returns 0.

from fake_llm import llm_client
from fake_factory_console import factory_console_client

from composabl_core import PerceptorImpl

class AnalystPerceptor(PerceptorImpl):
    """
    The analyst type that displays information to the human operators but doesn't send any information to the agent.
    """
    def __init__(self, *args, **kwargs):
        # Create the clients for the LLM and for the factory console
        self.llm_client = llm_client()
        self.factory_console_client = factory_console_client()

    async def compute(self, obs_spec, obs):
        # First, ask the LLM for its thoughts on the current state of the plant
        llm_response = self.llm_client.ask(f"You are controlling a CSTR plant, the current state of the plant is {obs}. What are your thoughts on the current state of the plant?")

        # Second, post the LLM's thoughts to the factory console for a human to read
        self.factory_console_client.post(f"The LLM thoughts on the current state of the plant are: {llm_response}")

        # The analyst sends nothing to the decision-making layer, so it returns a constant 0
        return {"analyst_llm": 0}

Executive Code Sample

This executive sample, from the industrial mixer use case, automatically queries a chemical engineering LLM for advice about control actions to take.

The perceptor returns an action that it recommends to the decision-making layer of the agent. This recommendation becomes a new sensor variable that the skill teacher(s) will take into account when training the agent in simulation.

from fake_llm import llm_client
from composabl_core import PerceptorImpl

class ChemicalEngineerPerceptor(PerceptorImpl):
    """
    The perceptor for the text agent
    """
    def __init__(self, *args, **kwargs):
        # Create the client for the LLM
        self.llm_client = llm_client()

    async def compute(self, obs_spec, obs):
        """
        Asks the LLM for its thoughts on the current state of the plant, and returns a recommended action
        """
        llm_response = self.llm_client.ask(f"You are controlling a CSTR plant, the current state of the plant is {obs}. What action do you recommend?")
        # Extract the recommended action from the response (placeholder parsing for this example client)
        llm_action = llm_response.find("action")
        return {"chemical_engineer_llm": llm_action}

Examples and Reference

See full code samples and more examples.

Step 3: Filter the Sensor Space

Composabl agents can include text fields in perceptors, but they must be transformed or filtered out in the teacher.py file before training with DRL. For any text variables that are not transformed into a different data type, use the filtered_sensor_space method of the teacher to remove them.
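As a minimal sketch of this step, the teacher below keeps only numeric sensor variables. It assumes a SkillTeacher base class and uses hypothetical CSTR sensor names, so adapt both to your own teacher.py.

from composabl_core import SkillTeacher

class CSTRTeacher(SkillTeacher):
    # Other required teacher methods (compute_reward, compute_termination, etc.)
    # are omitted here for brevity

    async def filtered_sensor_space(self):
        # List only the sensors DRL should train on; any raw text variable
        # produced by a perceptor is simply left out of this list
        return ["T", "Tc", "Ca", "Cref", "Tref", "chemical_engineer_llm"]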

Step 4: Publish the Perceptor

Publish the perceptor to the UI. First, log in to Composabl:

composabl login

Navigate to the folder above your perceptor. Then publish your perceptor:

composabl perceptor publish foldername

Select the organization and project to add your perceptor to. Then refresh your Agent Builder Studio to see the perceptor and add it to agents.
