node-red-contrib-ai-agent 0.0.3 → 0.0.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,7 +4,7 @@

 ## Overview

- A powerful AI Agent for Node-RED that enables natural language processing and tool integration. This node allows you to create AI-powered flows with support for function calling and external tool integration.
+ A powerful AI Agent for Node-RED that enables natural language processing with memory and tool integration. This package provides nodes for creating AI-powered flows with conversation context management and extensible tool integration.

 ## ⚠️ Beta Notice

@@ -18,42 +18,165 @@ Your feedback and contributions are highly appreciated!

 ## Features

- - Integration with OpenRouter's AI models
- - Support for function calling and tool execution
- - Easy configuration through Node-RED's interface
- - Conversation context management
- - Extensible with custom tools
+ - **AI Agent Node**: Process messages with AI, maintaining conversation context
+ - **Memory Nodes**:
+   - **In-Memory**: Store conversation context in memory (volatile)
+   - **File-based**: Persist conversation context to disk
+ - **AI Model Node**: Configure AI models and API settings
+ - **Tool Integration**: Extend functionality with custom tools
+ - **Stateless Design**: Memory nodes are stateless, making them more reliable and scalable
+ - **Context Management**: Automatic conversation history management with configurable retention

 ## Getting Started

 1. Install the package via the Node-RED palette manager
 2. Add an AI Model node to configure your OpenRouter API key and model
- 3. Add AI Tool nodes to define custom functions
- 4. Connect to an AI Agent node to process messages
+ 3. (Optional) Add a Memory node (In-Memory or File-based) to manage conversation context
+ 4. (Optional) Add AI Tool nodes to define custom functions or HTTP requests
+ 5. Connect to an AI Agent node to process messages
+ 6. (Optional) Connect more AI Agent nodes to process messages in a chain
+
+ ## Node Types
+
+ ### AI Agent
+ Processes messages using the configured AI model and maintains conversation context through connected memory nodes.
+
+ **Properties:**
+ - **Name**: Display name for the node
+ - **System Prompt**: Initial instructions for the AI
+ - **Response Type**: Format of the response (text or JSON object)
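
A practical note, not from the package docs: if you select the JSON object response type, a small function node downstream of the agent can normalize the reply. The sketch below assumes the agent places its reply on `msg.payload`, which is the usual Node-RED convention but is not confirmed by this README:

```javascript
// Hypothetical post-processing for an AI Agent configured with a JSON response type.
// Assumption: the agent's reply arrives on msg.payload (not stated in the README).
let reply = msg.payload;
if (typeof reply === "string") {
    try {
        reply = JSON.parse(reply);  // parse the JSON text the model returned
    } catch (err) {
        node.error("Agent reply was not valid JSON: " + err.message, msg);
        return null;                // stop the flow on malformed output
    }
}
msg.payload = reply;
return msg;
```
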
+
+ ### Memory (In-Memory)
+ A configuration node that initializes the conversation context in memory. The agent node uses this configuration to manage the conversation context.
+
+ **Properties:**
+ - **Max Items**: Maximum number of conversation turns to keep in memory
+ - **Name**: Display name for the node
+
+ ### Memory (File)
+ A configuration node that initializes the conversation context with file-based persistence. The agent node uses this configuration to manage the conversation context across restarts.
+
+ **Properties:**
+ - **Max Items**: Maximum number of conversation turns to keep
+ - **File Path**: Path to store the conversation history
+ - **Name**: Display name for the node
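
To make **Max Items** concrete, here is a minimal sketch of the trimming behaviour the setting implies. This is illustrative only; the package's actual memory implementation is not shown in this README, and the variable names below are invented:

```javascript
// Illustrative only: what a "Max Items" cap on conversation turns means.
// "history" and "maxItems" are hypothetical names, not the package's internals.
function trimHistory(history, maxItems) {
    // keep only the most recent maxItems turns
    return history.slice(-maxItems);
}

let history = [
    { role: "user", content: "Hello" },
    { role: "assistant", content: "Hi! How can I help?" },
    { role: "user", content: "What's the weather like?" }
];
history = trimHistory(history, 2);  // only the last two turns remain
```
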
+
+ ### AI Model
+ Configures the AI model and API settings.
+
+ **Properties:**
+ - **Model**: The AI model to use (e.g., gpt-4, claude-3-opus)
+ - **API Key**: Your OpenRouter API key
+ - **Name**: Display name for the node
+
+ ### AI Tool Function
+ Creates a JavaScript function tool that can be used by the AI Agent.
+
+ **Properties:**
+ - **Name**: Display name for the node
+ - **Tool Name**: Name of the tool (used by the AI to identify the tool)
+ - **Description**: Description of what the tool does
+ - **Function**: JavaScript code to execute when the tool is called
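
As an illustration of the kind of JavaScript that might go in the **Function** field, here is a sketch for a hypothetical `get_today` tool. The README does not document which variables are exposed to the function body or how results are returned, so treat the `return` convention below as an assumption:

```javascript
// Hypothetical body for an AI Tool Function named "get_today".
// Assumption: the tool returns its result as a plain object.
const now = new Date();
const result = {
    isoDate: now.toISOString().slice(0, 10),                        // e.g. "2024-05-01"
    weekday: now.toLocaleDateString("en-US", { weekday: "long" }),  // e.g. "Wednesday"
    timezoneOffsetMinutes: now.getTimezoneOffset()
};
return result;
```
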
82
+
83
+ ### AI Tool HTTP
84
+ Creates an HTTP request tool that can be used by the AI Agent to make external API calls.
85
+
86
+ **Properties:**
87
+ - **Name**: Display name for the node
88
+ - **Tool Name**: Name of the tool (used by the AI to identify the tool)
89
+ - **Description**: Description of what the tool does
90
+ - **Method**: HTTP method (GET, POST, PUT, DELETE, etc.)
91
+ - **URL**: The URL to make the request to (supports template variables)
92
+ - **Headers**: JSON object of headers to include in the request
93
+ - **Body**: Content to send in the request body
94
+
95
+ **Template Variables:**
96
+ You can use template variables in the URL, headers, and body to reference properties from the input object that the AI provides when calling the tool.
97
+
98
+ Example: `https://api.example.com/users/${input.userId}`
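
Putting the properties above together, a hypothetical HTTP tool that updates a user record might be configured roughly like this. Every value below (tool name, URL, header, and body fields) is invented for illustration; only the `${input.…}` template syntax comes from the README:

```
Tool Name:   update_user_email
Description: Updates a user's email address
Method:      POST
URL:         https://api.example.com/users/${input.userId}
Headers:     { "Content-Type": "application/json" }
Body:        { "email": "${input.newEmail}" }
```
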

- ## Example: Today's Joke
+ ## Example: Basic Usage

- Here's an example flow that tells a joke related to today's date using a custom tool:
+ Here's how to use the AI Agent:

- ![Today's Joke Flow](https://raw.githubusercontent.com/lesichkovm/node-red-contrib-ai-agent/refs/heads/main/snapshots/todays-joke-flow.png "Example flow showing the Today's Joke implementation")
+ 1. Add an AI Agent node to your flow
+ 2. Configure it with an AI Model node
+ 3. (Optional) Add a Memory configuration node if you need conversation context
+ 4. Connect your flow: `[Input] --> [AI Model] --> [AI Agent] --> [Output]`

- ### Flow Output
+ Memory is only required if you need to maintain conversation context between messages or chain multiple agents together. For simple, stateless interactions, you can use the AI Agent without any memory configuration.
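
For the `[Input]` step, a simple way to drive the flow is an inject node followed by a function node that sets the user's prompt. The sketch assumes the agent reads the prompt from `msg.payload`, a common Node-RED convention that this README does not explicitly state:

```javascript
// Hypothetical function node placed before the AI Agent.
// Assumption: the agent takes the user prompt from msg.payload.
msg.payload = "Summarise the last three sensor readings in one sentence.";
return msg;
```
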

- When executed, the flow will generate a joke related to the current date:
+ ## Example: Using AI Tools

- ![Today's Joke Output](https://raw.githubusercontent.com/lesichkovm/node-red-contrib-ai-agent/refs/heads/main/snapshots/todays-joke.png "Example output showing a date-related joke")
+ To extend the AI Agent with custom tools:

- ## Basic Usage
+ 1. Add an AI Tool Function or AI Tool HTTP node to your flow
+ 2. Configure the tool with appropriate settings
+ 3. Connect the tool to the AI Agent: `[Input] --> [AI Model] --> [AI Tool] --> [AI Agent] --> [Output]`

- 1. **AI Model Node**: Configure your AI model and API settings
- 2. **AI Tool Node**: Define custom functions and tools
- 3. **AI Agent Node**: Process messages with AI and tool integration
+ The AI Agent will automatically detect and use the tools in its processing. You can add multiple tools to give the AI Agent different capabilities.
+
+ ### HTTP Tool Example
+
+ ```
+ [AI Tool HTTP]     --> |
+                        |--> [AI Agent] --> [Output]
+ [AI Tool Function] --> |
+ ```
+
+ In this example, the AI Agent has access to both an HTTP tool for making external API calls and a function tool for custom logic.
+
+ ## Example: Chained Agents
+
+ For more complex scenarios, you can chain multiple agents to process messages in sequence:
+
+ 1. Create a Memory node (it initializes the context, which is shared between the agents)
+ 2. Configure Agent 1
+ 3. Configure Agent 2
+ 4. Connect your flow: `[Input] --> [AI Model] --> [AI Memory] --> [Agent 1] --> [Agent 2] --> [Output]`
+
+ Each agent will maintain its own conversation context based on its memory configuration.
+
+ ## Best Practices
+
+ ### Memory Management
+ - Memory nodes are configuration nodes that define how conversation context is managed
+ - Each AI Agent node references a memory configuration
+ - The memory configuration is instantiated once and can be shared between multiple agents
+ - The AI Agent node is responsible for managing and updating the conversation context based on its memory configuration
+ - Memory configurations are particularly useful in chained agent scenarios where different agents need different context handling
+ - Use **In-Memory** configuration for temporary conversations
+ - Use **File-based** configuration for conversations that should persist across restarts
+ - Set appropriate `maxItems` to control context length and memory usage
+
+ ### Error Handling
+ - Always handle errors from the AI Agent node
+ - Check for API key and model configuration errors
+ - Monitor memory usage with large conversation histories
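
One way to handle failures in the flow itself is Node-RED's standard Catch node: wire a Catch node to the AI Agent (this assumes the agent reports failures through Node-RED's error mechanism, which the README does not state) and inspect the error in a function node:

```javascript
// Function node wired after a Catch node.
// msg.error.message and msg.error.source are standard Catch-node properties.
node.warn("AI Agent error from node " + msg.error.source.id + ": " + msg.error.message);

// Hypothetical fallback so downstream nodes still receive a usable payload.
msg.payload = "Sorry, the AI service is currently unavailable.";
return msg;
```
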
+
+ ### Performance
+ - For high-volume applications, consider using a database-backed memory implementation
+ - Be mindful of token usage with large contexts
+ - Use appropriate timeouts for API calls
+
+ ## Advanced: Chaining Agents
+
+ You can chain multiple AI Agents with different memory scopes to create complex conversation flows:
+
+ ```
+ [Input] --> [Memory 1] --> [Agent 1] --> [Agent 2] --> [Memory 2] --> [Agent 3] --> [Agent 4] --> [Output]
+ ```
+
+ This allows for complex conversation flows where different agents handle different aspects of the interaction.

 ## Advanced Features

- - **Tool Integration**: Extend functionality with custom tools
+ - **Tool Integration**: Extend functionality with custom tools (Function and HTTP)
 - **Context Management**: Maintain conversation history
 - **Flexible Configuration**: Customize model parameters and behavior
+ - **Template Variables**: Use dynamic values in HTTP requests

 ## Contributing