a2a-adapter 0.1.0__tar.gz → 0.1.2__tar.gz

This diff compares publicly available package versions as released to their public registry. It is provided for informational purposes only and reflects the packages exactly as they appear in that registry.
@@ -1,15 +1,15 @@
  Metadata-Version: 2.4
  Name: a2a-adapter
- Version: 0.1.0
+ Version: 0.1.2
  Summary: A2A Protocol Adapter SDK for integrating various agent frameworks
  Author-email: HYBRO AI <info@hybro.ai>
- License: MIT
- Project-URL: Homepage, https://github.com/hybro-ai/a2a-adapter
- Project-URL: Documentation, https://github.com/hybro-ai/a2a-adapter#readme
- Project-URL: Repository, https://github.com/hybro-ai/a2a-adapter
+ License: Apache-2.0
+ Project-URL: Homepage, https://github.com/hybroai/a2a-adapter
+ Project-URL: Documentation, https://github.com/hybroai/a2a-adapter#readme
+ Project-URL: Repository, https://github.com/hybroai/a2a-adapter
  Classifier: Development Status :: 3 - Alpha
  Classifier: Intended Audience :: Developers
- Classifier: License :: OSI Approved :: MIT License
+ Classifier: License :: OSI Approved :: Apache Software License
  Classifier: Programming Language :: Python :: 3
  Classifier: Programming Language :: Python :: 3.11
  Classifier: Programming Language :: Python :: 3.12
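The hunk above records the version bump (0.1.0 → 0.1.2), the relicensing from MIT to Apache-2.0, and the updated `hybroai` GitHub URLs. A quick way to verify which metadata an installed copy actually carries is the standard-library `importlib.metadata` module; the snippet below is only an illustrative check (the distribution name comes from the diff, everything else is plain stdlib):

```python
# Minimal sketch: inspect the metadata of an installed a2a-adapter distribution.
# Requires Python 3.8+ and that the package is installed in the current environment.
from importlib.metadata import metadata, version

print(version("a2a-adapter"))        # "0.1.2" after upgrading, "0.1.0" before
meta = metadata("a2a-adapter")       # the PKG-INFO fields shown in this hunk
print(meta["License"])               # "Apache-2.0" in 0.1.2, "MIT" in 0.1.0
print(meta.get_all("Classifier"))    # includes the updated license classifier
print(meta.get_all("Project-URL"))   # the hybroai GitHub URLs
```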
@@ -45,16 +45,25 @@ Dynamic: license-file
 
  # A2A Adapter
 
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
- [![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://www.python.org/downloads/)
+ [![PyPI version](https://badge.fury.io/py/a2a-adapter.svg)](https://badge.fury.io/py/a2a-adapter)
+ [![License: Apache-2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
+ [![Python 3.11+](https://img.shields.io/badge/python-3.11+-blue.svg)](https://www.python.org/downloads/)
+ [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
 
- **Open Source A2A Protocol Adapter SDK for Different Agent Frameworks**
+ **🚀 Open Source A2A Protocol Adapter SDK - Make Any Agent Framework A2A-Compatible in 3 Lines**
 
- A Python SDK that enables seamless integration of various agent frameworks (n8n, CrewAI, LangChain, etc.) with the [A2A (Agent-to-Agent) Protocol](https://github.com/a2a-protocol/a2a-protocol). Build interoperable AI agent systems that can communicate across different platforms and frameworks.
+ A Python SDK that enables seamless integration of various agent frameworks (n8n, CrewAI, LangChain, etc.) with the [A2A (Agent-to-Agent) Protocol](https://github.com/a2aproject/A2A). Build interoperable AI agent systems that can communicate across different platforms and frameworks.
+
+ **✨ Key Benefits:**
+
+ - 🔌 **3-line setup** - Expose any agent as A2A-compliant
+ - 🌐 **Framework agnostic** - Works with n8n, CrewAI, LangChain, and more
+ - 🌊 **Streaming support** - Built-in streaming for real-time responses
+ - 🎯 **Production ready** - Type-safe, well-tested, and actively maintained
 
  ## Features
 
- ✨ **Framework Agnostic**: Integrate n8n workflows, CrewAI crews, LangChain chains, or custom agents
+ ✨ **Framework Agnostic**: Integrate n8n workflows, CrewAI crews, LangChain chains, and more
  🔌 **Simple API**: 3-line setup to expose any agent as A2A-compliant
  🌊 **Streaming Support**: Built-in streaming for LangChain and custom adapters
  🎯 **Type Safe**: Leverages official A2A SDK types
@@ -87,6 +96,14 @@ A Python SDK that enables seamless integration of various agent frameworks (n8n,
 
  See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed design documentation.
 
+ ## Documentation
+
+ - 🚀 Quick Start: [QUICKSTART.md](QUICKSTART.md)
+ - 🧪 Examples: [examples/](examples/)
+ - 🛠 Debug & Advanced Usage: [GETTING_STARTED_DEBUG.md](GETTING_STARTED_DEBUG.md)
+ - 🧠 Architecture: [ARCHITECTURE.md](ARCHITECTURE.md)
+ - 🤝 Contributing: [CONTRIBUTING.md](CONTRIBUTING.md)
+
  ## Installation
 
  ### Basic Installation
@@ -117,39 +134,17 @@ pip install a2a-adapter[all]
  pip install a2a-adapter[dev]
  ```
 
- ## Quick Start
+ ## 🚀 Quick Start
 
- ### 🚀 Easy Start with Examples
+ **Get started in 5 minutes!** See [QUICKSTART.md](QUICKSTART.md) for detailed guide.
 
- For the fastest way to get started, use the included examples:
+ ### Install
 
  ```bash
- # Clone and setup
- git clone <repository>
- cd a2a-adapter
- python -m venv .venv
- source .venv/bin/activate # On Windows: .venv\Scripts\activate
- pip install -e .
-
- # Start an agent
- ./run_agent.sh n8n # N8n workflow agent
- ./run_agent.sh crewai # CrewAI agent
- ./run_agent.sh langchain # LangChain agent
-
- # Stop with Ctrl+C
- ```
-
- **Environment Variables:**
-
- ```bash
- export N8N_WEBHOOK_URL="https://your-n8n.com/webhook/your-workflow"
+ pip install a2a-adapter
  ```
 
- ### 📝 Manual Setup
-
- ### 1. N8n Workflow Agent
-
- Expose an n8n workflow as an A2A agent:
+ ### Your First Agent (3 Lines!)
 
  ```python
  import asyncio
@@ -157,124 +152,66 @@ from a2a_adapter import load_a2a_agent, serve_agent
  from a2a.types import AgentCard
 
  async def main():
- # Load adapter
  adapter = await load_a2a_agent({
  "adapter": "n8n",
- "webhook_url": "https://n8n.example.com/webhook/math",
- "timeout": 30
+ "webhook_url": "https://your-n8n.com/webhook/workflow"
  })
-
- # Define agent card
- card = AgentCard(
- name="Math Agent",
- description="Performs mathematical calculations via n8n"
+ serve_agent(
+ agent_card=AgentCard(name="My Agent", description="..."),
+ adapter=adapter
  )
 
- # Start server
- serve_agent(agent_card=card, adapter=adapter, port=9000)
-
  asyncio.run(main())
  ```
 
- ### 2. CrewAI Agent
-
- Expose a CrewAI crew as an A2A agent:
-
- ```python
- import asyncio
- from crewai import Crew, Agent, Task
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- # Create your crew
- crew = Crew(
- agents=[...],
- tasks=[...],
- verbose=True
- )
+ **That's it!** Your agent is now A2A-compatible and ready to communicate with other A2A agents.
 
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "crewai",
- "crew": crew
- })
+ 👉 **[Read the full Quick Start Guide →](QUICKSTART.md)**
 
- card = AgentCard(
- name="Research Crew",
- description="Multi-agent research team"
- )
+ ## 📖 Usage Examples
 
- serve_agent(agent_card=card, adapter=adapter, port=8001)
+ ### n8n Workflow → A2A Agent
 
- asyncio.run(main())
+ ```python
+ adapter = await load_a2a_agent({
+ "adapter": "n8n",
+ "webhook_url": "https://n8n.example.com/webhook/math"
+ })
  ```
 
- ### 3. LangChain Agent (with Streaming)
-
- Expose a LangChain chain with streaming support:
+ ### CrewAI Crew A2A Agent
 
  ```python
- import asyncio
- from langchain_openai import ChatOpenAI
- from langchain_core.prompts import ChatPromptTemplate
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- # Create chain
- prompt = ChatPromptTemplate.from_messages([
- ("system", "You are a helpful assistant."),
- ("user", "{input}")
- ])
- llm = ChatOpenAI(model="gpt-4o-mini", streaming=True)
- chain = prompt | llm
-
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "langchain",
- "runnable": chain,
- "input_key": "input"
- })
-
- card = AgentCard(
- name="Chat Agent",
- description="Streaming chat agent powered by GPT-4"
- )
-
- serve_agent(agent_card=card, adapter=adapter, port=8002)
-
- asyncio.run(main())
+ adapter = await load_a2a_agent({
+ "adapter": "crewai",
+ "crew": your_crew_instance
+ })
  ```
 
- ### 4. Custom Adapter
-
- Create a custom agent with any async function:
+ ### LangChain Chain → A2A Agent (with Streaming)
 
  ```python
- import asyncio
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- async def my_agent_function(inputs: dict) -> str:
- """Your custom agent logic."""
- message = inputs["message"]
- return f"Echo: {message}"
-
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "callable",
- "callable": my_agent_function
- })
+ adapter = await load_a2a_agent({
+ "adapter": "langchain",
+ "runnable": your_chain,
+ "input_key": "input"
+ })
+ ```
 
- card = AgentCard(
- name="Echo Agent",
- description="Simple echo agent"
- )
+ ### Custom Function → A2A Agent
 
- serve_agent(agent_card=card, adapter=adapter, port=8003)
+ ```python
+ async def my_agent(inputs: dict) -> str:
+ return f"Processed: {inputs['message']}"
 
- asyncio.run(main())
+ adapter = await load_a2a_agent({
+ "adapter": "callable",
+ "callable": my_agent
+ })
  ```
 
+ 📚 **[View all examples →](examples/)**
+
  ## Advanced Usage
 
  ### Custom Adapter Class
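The 0.1.2 usage snippets above are deliberately abbreviated, while the removed 0.1.0 examples carried the full boilerplate. For orientation, here is a sketch of the complete callable example assembled only from the calls that appear in this diff (`load_a2a_agent`, `serve_agent`, and `AgentCard(name=..., description=...)` written exactly as the README itself writes it); the port value is illustrative and mirrors the removed 0.1.0 example:

```python
# Illustrative sketch assembled from the README fragments in this diff;
# not taken verbatim from the package.
import asyncio

from a2a.types import AgentCard
from a2a_adapter import load_a2a_agent, serve_agent


async def my_agent(inputs: dict) -> str:
    # Custom agent logic: echo the incoming message, as in the 0.1.0 example.
    return f"Processed: {inputs['message']}"


async def main():
    # Wrap the plain async function in the "callable" adapter.
    adapter = await load_a2a_agent({
        "adapter": "callable",
        "callable": my_agent,
    })
    # Serve it as an A2A agent; port 8003 mirrors the removed 0.1.0 example.
    serve_agent(
        agent_card=AgentCard(name="Echo Agent", description="Simple echo agent"),
        adapter=adapter,
        port=8003,
    )


asyncio.run(main())
```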
@@ -520,38 +457,42 @@ Check if this adapter supports streaming responses.
 
  ## Framework Support
 
- | Framework | Adapter | Streaming | Status |
- | ------------------- | ----------------------- | ----------- | ---------- |
- | **n8n** | `N8nAgentAdapter` | | ✅ Stable |
- | **CrewAI** | `CrewAIAgentAdapter` | | Stable |
- | **LangChain** | `LangChainAgentAdapter` | | Stable |
- | **Custom Function** | `CallableAgentAdapter` | ✅ Optional | ✅ Stable |
- | **AutoGen** | - | - | 🔜 Planned |
- | **Semantic Kernel** | - | - | 🔜 Planned |
+ | Framework | Adapter | Non-Streaming | Streaming | Status |
+ | ------------- | ----------------------- | ------------- | ---------- | ---------- |
+ | **n8n** | `N8nAgentAdapter` | | 🔜 Planned | ✅ Stable |
+ | **CrewAI** | `CrewAIAgentAdapter` | 🔜 Planned | 🔜 Planned | 🔜 Planned |
+ | **LangChain** | `LangChainAgentAdapter` | 🔜 Planned | 🔜 Planned | 🔜 Planned |
+
+ ## 🤝 Contributing
+
+ We welcome contributions from the community! Whether you're fixing bugs, adding features, or improving documentation, your help makes this project better.
 
- ## Contributing
+ **Ways to contribute:**
 
- We welcome contributions! To add support for a new framework:
+ - 🐛 **Report bugs** - Help us improve by reporting issues
+ - 💡 **Suggest features** - Share your ideas for new adapters or improvements
+ - 🔧 **Add adapters** - Integrate new agent frameworks (AutoGen, Semantic Kernel, etc.)
+ - 📝 **Improve docs** - Make documentation clearer and more helpful
+ - 🧪 **Write tests** - Increase test coverage and reliability
 
- 1. Create `a2a_adapter/integrations/{framework}.py`
- 2. Implement a class extending `BaseAgentAdapter`
- 3. Add to `loader.py` factory function
- 4. Update `integrations/__init__.py`
- 5. Add optional dependency to `pyproject.toml`
- 6. Create an example in `examples/`
- 7. Add tests in `tests/`
- 8. Update this README
+ **Quick start contributing:**
 
- See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed guidance.
+ 1. Fork the repository
+ 2. Create a feature branch (`git checkout -b feature/amazing-feature`)
+ 3. Make your changes
+ 4. Run tests (`pytest`)
+ 5. Submit a pull request
+
+ 📖 **[Read our Contributing Guide →](CONTRIBUTING.md)** for detailed instructions, coding standards, and development setup.
 
  ## Roadmap
 
  - [x] Core adapter abstraction
  - [x] N8n adapter
- - [x] CrewAI adapter
- - [x] LangChain adapter with streaming
- - [x] Callable adapter
- - [x] Comprehensive examples
+ - [ ] CrewAI adapter
+ - [ ] LangChain adapter with streaming
+ - [ ] Callable adapter
+ - [ ] Comprehensive examples
  - [ ] Task support (async execution pattern)
  - [ ] Artifact support (file uploads/downloads)
  - [ ] AutoGen adapter
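The removed 0.1.0 contributing steps still describe the integration pattern: one module per framework under `a2a_adapter/integrations/`, a class extending `BaseAgentAdapter`, and a registration in the `loader.py` factory. The base class's actual interface is not part of this diff, so the sketch below uses hypothetical method names (`invoke`, `supports_streaming`) and an assumed import path, chosen only to mirror the callable `inputs -> str` contract and the "supports streaming responses" check referenced in the API-reference hunk above:

```python
# Hypothetical sketch of a new framework adapter. BaseAgentAdapter's real
# interface and module path are not shown in this diff; the names below are
# placeholders for illustration only.
from a2a_adapter.integrations.base import BaseAgentAdapter  # assumed path


class MyFrameworkAgentAdapter(BaseAgentAdapter):
    """Wraps a hypothetical framework client behind the adapter interface."""

    def __init__(self, client):
        self.client = client

    async def invoke(self, inputs: dict) -> str:
        # Delegate to the wrapped framework, mirroring the callable adapter's
        # `async (inputs: dict) -> str` contract shown in the README.
        return await self.client.run(inputs["message"])

    def supports_streaming(self) -> bool:
        # Corresponds to the "supports streaming responses" check mentioned
        # in the API reference; this sketch does not stream.
        return False
```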
@@ -585,20 +526,39 @@ See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed guidance.
 
  ## License
 
- MIT License - see [LICENSE](LICENSE) file for details.
+ Apache-2.0 License - see [LICENSE](LICENSE) file for details.
 
  ## Credits
 
  Built with ❤️ by [HYBRO AI](https://hybro.ai)
 
- Powered by the [A2A Protocol](https://github.com/a2a-protocol/a2a-protocol)
+ Powered by the [A2A Protocol](https://github.com/a2aproject/A2A)
+
+ ## 💬 Community & Support
+
+ - 📚 **[Full Documentation](README.md)** - Complete API reference and guides
+ - 🚀 **[Quick Start Guide](QUICKSTART.md)** - Get started in 5 minutes
+ - 🏗️ **[Architecture Guide](ARCHITECTURE.md)** - Deep dive into design decisions
+ - 🐛 **[Report Issues](https://github.com/hybroai/a2a-adapter/issues)** - Found a bug? Let us know!
+ - 💬 **[Discussions](https://github.com/hybroai/a2a-adapter/discussions)** - Ask questions and share ideas
+ - 🤝 **[Contributing Guide](CONTRIBUTING.md)** - Want to contribute? Start here!
+
+ ## 📄 License
 
- ## Support
+ This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.
 
- - 📚 [Documentation](https://github.com/hybro-ai/a2a-adapter)
- - 🐛 [Issue Tracker](https://github.com/hybro-ai/a2a-adapter/issues)
- - 💬 [Discussions](https://github.com/hybro-ai/a2a-adapter/discussions)
+ ## 🙏 Acknowledgments
+
+ - Built with ❤️ by [HYBRO AI](https://hybro.ai)
+ - Powered by the [A2A Protocol](https://github.com/a2aproject/A2A)
+ - Thanks to all [contributors](https://github.com/hybroai/a2a-adapter/graphs/contributors) who make this project better!
 
  ---
 
- **Star ⭐ this repo if you find it useful!**
+ <div align="center">
+
+ **⭐ Star this repo if you find it useful! ⭐**
+
+ [⬆ Back to Top](#a2a-adapter)
+
+ </div>
@@ -1,15 +1,24 @@
  # A2A Adapter
 
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
- [![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://www.python.org/downloads/)
+ [![PyPI version](https://badge.fury.io/py/a2a-adapter.svg)](https://badge.fury.io/py/a2a-adapter)
+ [![License: Apache-2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
+ [![Python 3.11+](https://img.shields.io/badge/python-3.11+-blue.svg)](https://www.python.org/downloads/)
+ [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
 
- **Open Source A2A Protocol Adapter SDK for Different Agent Frameworks**
+ **🚀 Open Source A2A Protocol Adapter SDK - Make Any Agent Framework A2A-Compatible in 3 Lines**
 
- A Python SDK that enables seamless integration of various agent frameworks (n8n, CrewAI, LangChain, etc.) with the [A2A (Agent-to-Agent) Protocol](https://github.com/a2a-protocol/a2a-protocol). Build interoperable AI agent systems that can communicate across different platforms and frameworks.
+ A Python SDK that enables seamless integration of various agent frameworks (n8n, CrewAI, LangChain, etc.) with the [A2A (Agent-to-Agent) Protocol](https://github.com/a2aproject/A2A). Build interoperable AI agent systems that can communicate across different platforms and frameworks.
+
+ **✨ Key Benefits:**
+
+ - 🔌 **3-line setup** - Expose any agent as A2A-compliant
+ - 🌐 **Framework agnostic** - Works with n8n, CrewAI, LangChain, and more
+ - 🌊 **Streaming support** - Built-in streaming for real-time responses
+ - 🎯 **Production ready** - Type-safe, well-tested, and actively maintained
 
  ## Features
 
- ✨ **Framework Agnostic**: Integrate n8n workflows, CrewAI crews, LangChain chains, or custom agents
+ ✨ **Framework Agnostic**: Integrate n8n workflows, CrewAI crews, LangChain chains, and more
  🔌 **Simple API**: 3-line setup to expose any agent as A2A-compliant
  🌊 **Streaming Support**: Built-in streaming for LangChain and custom adapters
  🎯 **Type Safe**: Leverages official A2A SDK types
@@ -42,6 +51,14 @@ A Python SDK that enables seamless integration of various agent frameworks (n8n,
 
  See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed design documentation.
 
+ ## Documentation
+
+ - 🚀 Quick Start: [QUICKSTART.md](QUICKSTART.md)
+ - 🧪 Examples: [examples/](examples/)
+ - 🛠 Debug & Advanced Usage: [GETTING_STARTED_DEBUG.md](GETTING_STARTED_DEBUG.md)
+ - 🧠 Architecture: [ARCHITECTURE.md](ARCHITECTURE.md)
+ - 🤝 Contributing: [CONTRIBUTING.md](CONTRIBUTING.md)
+
  ## Installation
 
  ### Basic Installation
@@ -72,39 +89,17 @@ pip install a2a-adapter[all]
  pip install a2a-adapter[dev]
  ```
 
- ## Quick Start
+ ## 🚀 Quick Start
 
- ### 🚀 Easy Start with Examples
+ **Get started in 5 minutes!** See [QUICKSTART.md](QUICKSTART.md) for detailed guide.
 
- For the fastest way to get started, use the included examples:
+ ### Install
 
  ```bash
- # Clone and setup
- git clone <repository>
- cd a2a-adapter
- python -m venv .venv
- source .venv/bin/activate # On Windows: .venv\Scripts\activate
- pip install -e .
-
- # Start an agent
- ./run_agent.sh n8n # N8n workflow agent
- ./run_agent.sh crewai # CrewAI agent
- ./run_agent.sh langchain # LangChain agent
-
- # Stop with Ctrl+C
- ```
-
- **Environment Variables:**
-
- ```bash
- export N8N_WEBHOOK_URL="https://your-n8n.com/webhook/your-workflow"
+ pip install a2a-adapter
  ```
 
- ### 📝 Manual Setup
-
- ### 1. N8n Workflow Agent
-
- Expose an n8n workflow as an A2A agent:
+ ### Your First Agent (3 Lines!)
 
  ```python
  import asyncio
@@ -112,124 +107,66 @@ from a2a_adapter import load_a2a_agent, serve_agent
  from a2a.types import AgentCard
 
  async def main():
- # Load adapter
  adapter = await load_a2a_agent({
  "adapter": "n8n",
- "webhook_url": "https://n8n.example.com/webhook/math",
- "timeout": 30
+ "webhook_url": "https://your-n8n.com/webhook/workflow"
  })
-
- # Define agent card
- card = AgentCard(
- name="Math Agent",
- description="Performs mathematical calculations via n8n"
+ serve_agent(
+ agent_card=AgentCard(name="My Agent", description="..."),
+ adapter=adapter
  )
 
- # Start server
- serve_agent(agent_card=card, adapter=adapter, port=9000)
-
  asyncio.run(main())
  ```
 
- ### 2. CrewAI Agent
-
- Expose a CrewAI crew as an A2A agent:
-
- ```python
- import asyncio
- from crewai import Crew, Agent, Task
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- # Create your crew
- crew = Crew(
- agents=[...],
- tasks=[...],
- verbose=True
- )
+ **That's it!** Your agent is now A2A-compatible and ready to communicate with other A2A agents.
 
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "crewai",
- "crew": crew
- })
+ 👉 **[Read the full Quick Start Guide →](QUICKSTART.md)**
 
- card = AgentCard(
- name="Research Crew",
- description="Multi-agent research team"
- )
+ ## 📖 Usage Examples
 
- serve_agent(agent_card=card, adapter=adapter, port=8001)
+ ### n8n Workflow → A2A Agent
 
- asyncio.run(main())
+ ```python
+ adapter = await load_a2a_agent({
+ "adapter": "n8n",
+ "webhook_url": "https://n8n.example.com/webhook/math"
+ })
  ```
 
- ### 3. LangChain Agent (with Streaming)
-
- Expose a LangChain chain with streaming support:
+ ### CrewAI Crew A2A Agent
 
  ```python
- import asyncio
- from langchain_openai import ChatOpenAI
- from langchain_core.prompts import ChatPromptTemplate
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- # Create chain
- prompt = ChatPromptTemplate.from_messages([
- ("system", "You are a helpful assistant."),
- ("user", "{input}")
- ])
- llm = ChatOpenAI(model="gpt-4o-mini", streaming=True)
- chain = prompt | llm
-
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "langchain",
- "runnable": chain,
- "input_key": "input"
- })
-
- card = AgentCard(
- name="Chat Agent",
- description="Streaming chat agent powered by GPT-4"
- )
-
- serve_agent(agent_card=card, adapter=adapter, port=8002)
-
- asyncio.run(main())
+ adapter = await load_a2a_agent({
+ "adapter": "crewai",
+ "crew": your_crew_instance
+ })
  ```
 
- ### 4. Custom Adapter
-
- Create a custom agent with any async function:
+ ### LangChain Chain → A2A Agent (with Streaming)
 
  ```python
- import asyncio
- from a2a_adapter import load_a2a_agent, serve_agent
- from a2a.types import AgentCard
-
- async def my_agent_function(inputs: dict) -> str:
- """Your custom agent logic."""
- message = inputs["message"]
- return f"Echo: {message}"
-
- async def main():
- adapter = await load_a2a_agent({
- "adapter": "callable",
- "callable": my_agent_function
- })
+ adapter = await load_a2a_agent({
+ "adapter": "langchain",
+ "runnable": your_chain,
+ "input_key": "input"
+ })
+ ```
 
- card = AgentCard(
- name="Echo Agent",
- description="Simple echo agent"
- )
+ ### Custom Function → A2A Agent
 
- serve_agent(agent_card=card, adapter=adapter, port=8003)
+ ```python
+ async def my_agent(inputs: dict) -> str:
+ return f"Processed: {inputs['message']}"
 
- asyncio.run(main())
+ adapter = await load_a2a_agent({
+ "adapter": "callable",
+ "callable": my_agent
+ })
  ```
 
+ 📚 **[View all examples →](examples/)**
+
  ## Advanced Usage
 
  ### Custom Adapter Class
@@ -475,38 +412,42 @@ Check if this adapter supports streaming responses.
 
  ## Framework Support
 
- | Framework | Adapter | Streaming | Status |
- | ------------------- | ----------------------- | ----------- | ---------- |
- | **n8n** | `N8nAgentAdapter` | | ✅ Stable |
- | **CrewAI** | `CrewAIAgentAdapter` | | Stable |
- | **LangChain** | `LangChainAgentAdapter` | | Stable |
- | **Custom Function** | `CallableAgentAdapter` | ✅ Optional | ✅ Stable |
- | **AutoGen** | - | - | 🔜 Planned |
- | **Semantic Kernel** | - | - | 🔜 Planned |
+ | Framework | Adapter | Non-Streaming | Streaming | Status |
+ | ------------- | ----------------------- | ------------- | ---------- | ---------- |
+ | **n8n** | `N8nAgentAdapter` | | 🔜 Planned | ✅ Stable |
+ | **CrewAI** | `CrewAIAgentAdapter` | 🔜 Planned | 🔜 Planned | 🔜 Planned |
+ | **LangChain** | `LangChainAgentAdapter` | 🔜 Planned | 🔜 Planned | 🔜 Planned |
+
+ ## 🤝 Contributing
+
+ We welcome contributions from the community! Whether you're fixing bugs, adding features, or improving documentation, your help makes this project better.
 
- ## Contributing
+ **Ways to contribute:**
 
- We welcome contributions! To add support for a new framework:
+ - 🐛 **Report bugs** - Help us improve by reporting issues
+ - 💡 **Suggest features** - Share your ideas for new adapters or improvements
+ - 🔧 **Add adapters** - Integrate new agent frameworks (AutoGen, Semantic Kernel, etc.)
+ - 📝 **Improve docs** - Make documentation clearer and more helpful
+ - 🧪 **Write tests** - Increase test coverage and reliability
 
- 1. Create `a2a_adapter/integrations/{framework}.py`
- 2. Implement a class extending `BaseAgentAdapter`
- 3. Add to `loader.py` factory function
- 4. Update `integrations/__init__.py`
- 5. Add optional dependency to `pyproject.toml`
- 6. Create an example in `examples/`
- 7. Add tests in `tests/`
- 8. Update this README
+ **Quick start contributing:**
 
- See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed guidance.
+ 1. Fork the repository
+ 2. Create a feature branch (`git checkout -b feature/amazing-feature`)
+ 3. Make your changes
+ 4. Run tests (`pytest`)
+ 5. Submit a pull request
+
+ 📖 **[Read our Contributing Guide →](CONTRIBUTING.md)** for detailed instructions, coding standards, and development setup.
 
  ## Roadmap
 
  - [x] Core adapter abstraction
  - [x] N8n adapter
- - [x] CrewAI adapter
- - [x] LangChain adapter with streaming
- - [x] Callable adapter
- - [x] Comprehensive examples
+ - [ ] CrewAI adapter
+ - [ ] LangChain adapter with streaming
+ - [ ] Callable adapter
+ - [ ] Comprehensive examples
  - [ ] Task support (async execution pattern)
  - [ ] Artifact support (file uploads/downloads)
  - [ ] AutoGen adapter
@@ -540,20 +481,39 @@ See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed guidance.
 
  ## License
 
- MIT License - see [LICENSE](LICENSE) file for details.
+ Apache-2.0 License - see [LICENSE](LICENSE) file for details.
 
  ## Credits
 
  Built with ❤️ by [HYBRO AI](https://hybro.ai)
 
- Powered by the [A2A Protocol](https://github.com/a2a-protocol/a2a-protocol)
+ Powered by the [A2A Protocol](https://github.com/a2aproject/A2A)
+
+ ## 💬 Community & Support
+
+ - 📚 **[Full Documentation](README.md)** - Complete API reference and guides
+ - 🚀 **[Quick Start Guide](QUICKSTART.md)** - Get started in 5 minutes
+ - 🏗️ **[Architecture Guide](ARCHITECTURE.md)** - Deep dive into design decisions
+ - 🐛 **[Report Issues](https://github.com/hybroai/a2a-adapter/issues)** - Found a bug? Let us know!
+ - 💬 **[Discussions](https://github.com/hybroai/a2a-adapter/discussions)** - Ask questions and share ideas
+ - 🤝 **[Contributing Guide](CONTRIBUTING.md)** - Want to contribute? Start here!
+
+ ## 📄 License
 
- ## Support
+ This project is licensed under the Apache License 2.0 - see the [LICENSE](LICENSE) file for details.
 
- - 📚 [Documentation](https://github.com/hybro-ai/a2a-adapter)
- - 🐛 [Issue Tracker](https://github.com/hybro-ai/a2a-adapter/issues)
- - 💬 [Discussions](https://github.com/hybro-ai/a2a-adapter/discussions)
+ ## 🙏 Acknowledgments
+
+ - Built with ❤️ by [HYBRO AI](https://hybro.ai)
+ - Powered by the [A2A Protocol](https://github.com/a2aproject/A2A)
+ - Thanks to all [contributors](https://github.com/hybroai/a2a-adapter/graphs/contributors) who make this project better!
 
  ---
 
- **Star ⭐ this repo if you find it useful!**
+ <div align="center">
+
+ **⭐ Star this repo if you find it useful! ⭐**
+
+ [⬆ Back to Top](#a2a-adapter)
+
+ </div>
@@ -4,10 +4,10 @@ build-backend = "setuptools.build_meta"
 
  [project]
  name = "a2a-adapter"
- version = "0.1.0"
+ version = "0.1.2"
  description = "A2A Protocol Adapter SDK for integrating various agent frameworks"
  readme = "README.md"
- license = {text = "MIT"}
+ license = {text = "Apache-2.0"}
  authors = [
  {name = "HYBRO AI", email = "info@hybro.ai"}
  ]
@@ -15,7 +15,7 @@ requires-python = ">=3.11"
  classifiers = [
  "Development Status :: 3 - Alpha",
  "Intended Audience :: Developers",
- "License :: OSI Approved :: MIT License",
+ "License :: OSI Approved :: Apache Software License",
  "Programming Language :: Python :: 3",
  "Programming Language :: Python :: 3.11",
  "Programming Language :: Python :: 3.12",
@@ -49,9 +49,9 @@ dev = [
  ]
 
  [project.urls]
- Homepage = "https://github.com/hybro-ai/a2a-adapter"
- Documentation = "https://github.com/hybro-ai/a2a-adapter#readme"
- Repository = "https://github.com/hybro-ai/a2a-adapter"
+ Homepage = "https://github.com/hybroai/a2a-adapter"
+ Documentation = "https://github.com/hybroai/a2a-adapter#readme"
+ Repository = "https://github.com/hybroai/a2a-adapter"
 
  [tool.setuptools]
  packages = ["a2a_adapter", "a2a_adapter.integrations"]
File without changes
File without changes