Plan-Execute Pattern
Along with the strategy pattern and the perception pattern, the plan-execute pattern is one of the major patterns of machine teaching. In this pattern, the skill agents work together in a skill group, with the first skill agent determining what the action should be and the second skill agent determining how to achieve it.
What makes this agent special is that it combines DRL and MPC, the technologies from the two single-skilled agent systems (the worst performers), to create the best-performing agent.
In this example, the DRL skill agent first uses its powers of learning and experimentation to determine the goal temperature for the cooling jacket — the set point. It then passes this information on to the MPC skill agent, which uses its powers of control and execution to direct the agent on what action to take to achieve the desired temperature.
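To make the division of labor concrete, here is a minimal, framework-agnostic sketch of the pattern in plain Python. The class and method names are illustrative only, not the Composabl API: a planner picks the goal (the set point), then an executor computes an action toward it.

```python
class PlannerSkill:
    """Decides WHAT to do: picks a goal temperature (set point).
    A trained DRL policy plays this role in the real agent."""

    def plan(self, observed_temp: float) -> float:
        # Toy rule standing in for a learned policy.
        return 400.0 if observed_temp > 420.0 else 450.0


class ExecutorSkill:
    """Decides HOW to do it: computes the control action that moves
    the system toward the planner's set point, as MPC would."""

    def execute(self, observed_temp: float, set_point: float) -> float:
        # Simple proportional step standing in for model predictive control.
        gain = 0.1
        return gain * (set_point - observed_temp)


def plan_execute_step(observed_temp: float) -> float:
    """One pass through the pattern: plan first, then execute."""
    set_point = PlannerSkill().plan(observed_temp)
    return ExecutorSkill().execute(observed_temp, set_point)
```

Because the hand-off is always planner first, executor second, no orchestrator has to decide which skill agent acts.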
This tutorial will show you how to publish the MPC controller to the platform using the data science workflow and then use it to create a multi-agent system using the plan-execute pattern.
Remember how the strategy pattern is like a math class where each student solves the problems they are best at, as assigned by the teacher? In the plan-execute pattern, the students work in groups to solve problems together. Let’s say Student A is good at translating word problems into equations, while Student B is good at solving equations. Student A works on each problem first, and then passes it over to Student B, who produces the solution. No teacher is needed here, because the students divide each problem the same way.

Let's get started configuring this agent!
1. Publish the MPC Skill Agent to Your Project
This agent has two skill agents, called `control_full_reaction` and `mpc-skill-group`. We have already created `control_full_reaction` in our project, so we only need to publish `mpc-skill-group` to build this agent in the Agent Builder UI. To publish `mpc-skill-group` to your use case, open your favorite code editor and terminal. In the terminal, navigate to the skills folder of the Industrial Mixer repo and run this command with the Composabl CLI:
```
composabl skill publish mpc-skill-group
```
Return to the agent orchestration studio and refresh the page. The skill agent will appear in the skills menu on the left of your page.
Explore the Code Files
All skill agents, perceptors, and orchestrators have at least two files:

- `pyproject.toml`, a config file with the skill agent's metadata and dependencies.
- A Python file containing the code the skill agent will use. For this skill agent, we use a controller class, with explanations in inline comments.
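As a rough illustration of what a controller class like this can look like (a simplified sketch, not the actual `controller.py` from the repo), an MPC-style controller evaluates candidate actions against a process model and picks the one predicted to land closest to the set point:

```python
class MiniMPCController:
    """Illustrative stand-in for the skill agent's controller class
    (not the repo's actual controller.py). Chooses the action whose
    one-step model prediction lands closest to the set point."""

    def __init__(self, set_point: float):
        self.set_point = set_point
        # Candidate cooling adjustments the controller may apply.
        self.candidate_actions = [-10.0, -5.0, 0.0, 5.0, 10.0]

    def predict(self, temperature: float, action: float) -> float:
        """Toy process model: the action shifts the temperature directly."""
        return temperature + action

    def compute_action(self, temperature: float) -> float:
        """Pick the candidate that minimizes predicted set-point error."""
        return min(
            self.candidate_actions,
            key=lambda a: abs(self.predict(temperature, a) - self.set_point),
        )
```

A real MPC controller optimizes over a multi-step horizon with a calibrated model, but the shape is the same: the set point comes in from the planning skill agent, and the controller searches for the action that best achieves it.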
File Structure

The mpc-skill-group skill agent contains two files: `pyproject.toml` and `controller.py`.
2. Build the Plan-Execute Pattern Agent System in the Agent Orchestrator UI
First, drag the skill agent `control_full_reaction` from the left-hand side of the page to the skill layer. Once it's there, drag over `mpc-skill-group` and make sure that it is dropped below the `control_full_reaction` skill agent, not beside it.

3. Run Your Training Session
We are ready to train the agent system and see the results. Select the cluster you want to use and the number of training cycles; we suggest 50. The skill agents train one at a time. The platform automatically assigns an equal number of training cycles to each skill agent, but you can adjust this, since in some agent system designs some skill agents require more training than others.

4. View Results
When the training has been completed, you can view your results in the training sessions tab in the UI. This will show you information on how well the agent is learning.
Analyzing the Plan-Execute Pattern Agent’s Performance
Conversion rate: 95%
Thermal runaway risk: Very low
We tested this fully trained agent and plotted the results.

This agent is the best performer of the group. Combining two imperfect technologies with Machine Teaching produces much better results than either technology achieves alone.