Abacus AI Workflows Tutorial

A comprehensive guide to building AI-powered workflows using the Abacus AI Python SDK.


Introduction​

Abacus AI Workflows enable you to build sophisticated AI-powered automation systems by connecting Python functions together into intelligent workflows. This tutorial covers three types of workflows:

| Workflow Type | Agent Interface | Use Cases | Key Features |
| --- | --- | --- | --- |
| Autonomous | AUTONOMOUS | Monitoring, scheduled tasks | Loop nodes, automatic execution |
| Form Submit | FORM_SUBMIT | Content generation, data processing | Input schemas, user-triggered |
| Conversational | CONVERSATIONAL | Chatbots, Q&A systems | Memory, context awareness, streaming |

Prerequisites​

Before you begin, ensure you have:

  1. Abacus AI Account

  2. API Key

    • Navigate to your profile settings
    • Generate an API key for authentication (used in the connectivity check after this list)
  3. Python Environment

    • Python 3.8 or higher
    • Install the Abacus AI SDK:
      pip install abacusai
  4. Project Setup

    • Have your project_id ready (found in your Abacus AI project dashboard)
    • For updating existing agents, have your model_id ready
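
Once you have the key, the project ID, and the SDK installed, a quick connectivity check confirms everything is wired up. This is a minimal sketch: 'YOUR_API_KEY' and 'YOUR_PROJECT_ID' are placeholders, and the SDK can also pick the key up from the ABACUS_API_KEY environment variable.

from abacusai import ApiClient

# Authenticate (or omit api_key and rely on the ABACUS_API_KEY environment variable)
client = ApiClient(api_key='YOUR_API_KEY')

# Confirm the project is reachable before building any workflows
project = client.describe_project('YOUR_PROJECT_ID')
print(project.name)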

Library Imports​

Before building workflows, you'll need to import the necessary Abacus AI components at the top of your script:

Required Imports for Workflow Creation​

from abacusai import (
    ApiClient,
    AgentInterface,
    AgentResponse,
    WorkflowGraph,
    WorkflowGraphNode,
    WorkflowNodeInputMapping,
    WorkflowNodeInputSchema,
    WorkflowNodeInputType,
    WorkflowNodeOutputMapping,
    WorkflowNodeOutputSchema,
    WorkflowNodeOutputType,
)

Understanding the AI Workflow Framework​

The Abacus AI Workflow framework transforms Python functions into visual, executable workflows through a structured process:

The Transformation Pipeline​

Python Function → WorkflowGraphNode → WorkflowGraph → Agent

How It Works​

  1. Define Python Functions: Write self-contained functions with all necessary imports
  2. Create Nodes: Transform functions into WorkflowGraphNode objects
  3. Connect Nodes: Define data flow between nodes using input/output mappings
  4. Build Graph: Combine nodes into a WorkflowGraph
  5. Register Agent: Deploy the workflow as an agent with a specific interface type (a compact sketch of all five steps follows)
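
Put together, the five steps look roughly like this. This is a compact, hedged sketch using a single node and the AUTONOMOUS interface; the rest of this tutorial expands each step in detail.

from abacusai import ApiClient, AgentInterface, WorkflowGraph, WorkflowGraphNode

# Step 1: a self-contained function (all imports inside the body; see Core Concepts)
def say_hello():
    from abacusai import AgentResponse
    return AgentResponse(greeting="Hello from the workflow!")

# Steps 2-4: wrap the function in a node, then combine nodes into a graph
hello_node = WorkflowGraphNode(
    name="Say Hello",
    function=say_hello,
    output_mappings=["greeting"],
)
workflow_graph = WorkflowGraph(nodes=[hello_node], specification_type="data_flow")

# Step 5: register the workflow as an agent and wait for it to publish
client = ApiClient()
agent = client.create_agent(
    project_id='YOUR_PROJECT_ID',
    workflow_graph=workflow_graph,
    agent_interface=AgentInterface.AUTONOMOUS,
)
agent.wait_for_publish()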

Key Components​

| Component | Purpose |
| --- | --- |
| WorkflowGraphNode | Represents a single processing step (function) |
| WorkflowGraph | Orchestrates the flow of data between nodes |
| AgentInterface | Defines how users interact with the workflow |
| Input/Output Mappings | Connect data between nodes |
| Input/Output Schemas | Define UI forms and data validation |

Core Concepts​

Self-Contained Functions​

CRITICAL: Each Python function in your workflow must include all imports inside the function body. This is because functions are serialized and executed in isolated environments.

Incorrect (imports outside function):

import requests
from abacusai import ApiClient

def fetch_data():
    client = ApiClient()
    response = requests.get("https://api.example.com")
    return response.json()

Correct (imports inside function):

def fetch_data():
    import requests
    from abacusai import ApiClient, AgentResponse

    client = ApiClient()
    response = requests.get("https://api.example.com")
    return AgentResponse(raw_data=response.json())

AgentResponse​

Functions should return an AgentResponse object to structure output data:

def process_data(input_text):
    from abacusai import AgentResponse

    result = input_text.upper()
    return AgentResponse(
        processed_text=result,
        character_count=len(result)
    )

Functions can also return files to the user. Here is an example that returns a Word document as a downloadable Blob:

# Save document to bytes (doc is assumed to be a python-docx Document; a complete sketch follows below)
doc_buffer = io.BytesIO()
doc.save(doc_buffer)
doc_bytes = doc_buffer.getvalue()

# Return as Blob (file download)
return AgentResponse(
    story_document=Blob(
        doc_bytes,
        "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        filename="document.docx",
    )
)
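
For reference, here is one way the full node function might look. This is a hedged, self-contained sketch that assumes the python-docx package is available in the agent's runtime and that Blob is importable from abacusai, as in the snippet above.

def generate_document(story_text):
    # All imports live inside the function so it stays self-contained
    import io
    from docx import Document  # assumption: python-docx is installed in the runtime
    from abacusai import AgentResponse, Blob

    # Build a simple Word document from the input text
    doc = Document()
    doc.add_heading("Generated Story", level=1)
    doc.add_paragraph(story_text)

    # Save document to bytes
    doc_buffer = io.BytesIO()
    doc.save(doc_buffer)
    doc_bytes = doc_buffer.getvalue()

    # Return as Blob (file download)
    return AgentResponse(
        story_document=Blob(
            doc_bytes,
            "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
            filename="story.docx",
        )
    )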

Node Input/Output Mappings​

Nodes communicate through mapped inputs and outputs:

# Define first node
fetch_node = WorkflowGraphNode(
    name="Fetch Data",
    function=fetch_data,
    output_mappings=["raw_data"]
)

# Define second node that uses first node's output
process_node = WorkflowGraphNode(
    name="Process Data",
    function=process_data,
    input_mappings={
        "input_text": fetch_node.outputs.raw_data  # Connect fetch_node's output to process_node's input
    },
    output_mappings=["processed_text", "character_count"]  # Names match the keys returned in AgentResponse
)
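
The mapping and schema classes imported earlier (WorkflowNodeInputMapping, WorkflowNodeInputSchema, and their output counterparts) give you more explicit control, for example when a FORM_SUBMIT agent should collect a value directly from the user instead of from another node. The sketch below is illustrative only; the exact constructor fields may vary between SDK versions, so check the abacusai reference for your installed version.

# Hedged sketch: an explicit user-supplied input with a form schema
form_node = WorkflowGraphNode(
    name="Process User Text",
    function=process_data,
    input_mappings=[
        # Value typed by the user in the agent's form UI
        WorkflowNodeInputMapping(
            name="input_text",
            variable_type=WorkflowNodeInputType.USER_INPUT,
            is_required=True,
        ),
    ],
    input_schema=WorkflowNodeInputSchema(
        json_schema={
            "type": "object",
            "title": "Process User Text",
            "required": ["input_text"],
            "properties": {
                "input_text": {"type": "string", "title": "Text to process"},
            },
        },
        ui_schema={"input_text": {"ui:widget": "textarea"}},
    ),
    output_mappings=[
        WorkflowNodeOutputMapping(
            name="processed_text",
            variable_type=WorkflowNodeOutputType.STRING,
        ),
    ],
)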

Initialize Functions​

AI workflows can also take an initialize function. This allows you to pre-load data (for instance a feature group) so that you don't need to reload it every time the AI workflow executes.

def initialize_workflow():
    import abacusai
    client = abacusai.ApiClient()
    my_table = client.describe_feature_group_by_table_name("MY_TABLE").load_as_pandas()
    return {"my_table": my_table}

# To use data returned by the initialize function, use this code inside your other nodes:
my_table = client.get_initialized_data()["my_table"]
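
For example, a node function could read the preloaded table like this. A minimal sketch: the "name" column is hypothetical and stands in for whatever columns your feature group actually contains.

def lookup_names():
    # Self-contained: imports live inside the function body
    from abacusai import ApiClient, AgentResponse

    client = ApiClient()
    # Reuse the DataFrame loaded once by initialize_workflow instead of reloading it
    my_table = client.get_initialized_data()["my_table"]
    return AgentResponse(names=my_table["name"].tolist())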

Defining the WorkflowGraph​

Combine your nodes into a workflow graph:

# Choose how users will interact with the agent (see the interface table in the Introduction)
agent_interface = AgentInterface.DEFAULT

# Combine the nodes into a graph; data flows along the input/output mappings defined above
workflow_graph = WorkflowGraph(
    nodes=[fetch_node, process_node],
    specification_type="data_flow"
)

Creating the Agent​

Finally, create the agent using the create_agent method:

from abacusai import ApiClient

client = ApiClient()

# With initialize function
agent = client.create_agent(
    project_id='YOUR_PROJECT_ID',
    workflow_graph=workflow_graph,
    agent_interface=agent_interface,
    initialize_function_name=initialize_workflow.__name__,
    initialize_function_code=initialize_workflow
)

# Without initialize function
agent = client.create_agent(
    project_id='YOUR_PROJECT_ID',
    workflow_graph=workflow_graph,
    agent_interface=agent_interface
)

# Block until the agent is published, then display its details
agent.wait_for_publish()
agent

Note: You can also update existing agents using the update_agent method.
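
As a hedged sketch, an update call mirrors create_agent but targets an existing agent by its model_id (see Prerequisites); verify the keyword arguments against your SDK version.

# Push a revised workflow graph to an existing agent
agent = client.update_agent(
    model_id='YOUR_MODEL_ID',
    workflow_graph=workflow_graph,
    agent_interface=agent_interface,
)
agent.wait_for_publish()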


Next Steps​

Now that you understand the basic components of an AI workflow, continue to the next tutorials in this section for end-to-end examples!