agent-starter-pack 0.2.2__py3-none-any.whl → 0.2.3__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Potentially problematic release: this version of agent-starter-pack might be problematic.
- {agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/METADATA +7 -13
- {agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/RECORD +26 -26
- src/base_template/Makefile +6 -4
- src/base_template/README.md +7 -1
- src/base_template/deployment/README.md +4 -1
- src/base_template/deployment/cd/deploy-to-prod.yaml +3 -3
- src/base_template/deployment/cd/staging.yaml +4 -4
- src/base_template/deployment/ci/pr_checks.yaml +1 -1
- src/base_template/deployment/terraform/build_triggers.tf +3 -0
- src/base_template/pyproject.toml +2 -2
- src/cli/commands/create.py +35 -9
- src/cli/commands/setup_cicd.py +22 -6
- src/data_ingestion/README.md +37 -50
- src/deployment_targets/agent_engine/app/agent_engine_app.py +7 -12
- src/deployment_targets/cloud_run/Dockerfile +1 -1
- src/frontends/streamlit/frontend/utils/stream_handler.py +3 -3
- src/resources/locks/uv-agentic_rag-agent_engine.lock +128 -127
- src/resources/locks/uv-agentic_rag-cloud_run.lock +174 -173
- src/resources/locks/uv-crewai_coding_crew-agent_engine.lock +149 -148
- src/resources/locks/uv-crewai_coding_crew-cloud_run.lock +195 -194
- src/resources/locks/uv-langgraph_base_react-agent_engine.lock +125 -124
- src/resources/locks/uv-langgraph_base_react-cloud_run.lock +171 -170
- src/resources/locks/uv-live_api-cloud_run.lock +162 -161
- {agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/WHEEL +0 -0
- {agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/entry_points.txt +0 -0
- {agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/licenses/LICENSE +0 -0
{agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/METADATA CHANGED
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: agent-starter-pack
-Version: 0.2.2
+Version: 0.2.3
 Summary: CLI tool to create GCP-based AI agent projects from templates
 Author-email: Google LLC <agent-starter-pack@google.com>
 License: Apache-2.0
@@ -9,7 +9,7 @@ Requires-Python: >=3.10
 Requires-Dist: backoff>=2.2.1
 Requires-Dist: click~=8.1.7
 Requires-Dist: cookiecutter~=2.5.0
-Requires-Dist: google-cloud-aiplatform~=1.
+Requires-Dist: google-cloud-aiplatform~=1.87.0
 Requires-Dist: pyyaml~=6.0.1
 Requires-Dist: rich~=13.7.0
 Provides-Extra: jupyter
@@ -33,7 +33,7 @@ It accelerates development by providing a holistic, production-ready solution, a
 
 | ⚡️ Launch | 🧪 Experiment | ✅ Deploy | 🛠️ Customize |
 |---|---|---|---|
-| [Pre-built agent templates](./agents/) (ReAct, RAG, multi-agent, Live Multimodal API). | [Vertex AI evaluation](https://cloud.google.com/vertex-ai/generative-ai/docs/models/evaluation-overview) and an interactive playground. | Production-ready infra with [monitoring
+| [Pre-built agent templates](./agents/) (ReAct, RAG, multi-agent, Live Multimodal API). | [Vertex AI evaluation](https://cloud.google.com/vertex-ai/generative-ai/docs/models/evaluation-overview) and an interactive playground. | Production-ready infra with [monitoring, observability](./docs/observability.md), and [CI/CD](./docs/deployment.md) on [Cloud Run](https://cloud.google.com/run) or [Agent Engine](https://cloud.google.com/vertex-ai/generative-ai/docs/agent-engine/overview). | Extend and customize templates according to your needs. |
 
 ---
 
@@ -94,16 +94,6 @@ This starter pack covers all aspects of Agent development, from prototyping and
 
 ---
 
-#### From `e2e-gen-ai-app-starter-pack` to `agent-starter-pack`
-
-This project represents the next evolution of the [e2e-gen-ai-app-starter-pack](goo.gle/e2e-gen-ai-app-starter-pack). Building on the foundation of the original, we've made significant improvements:
-
-* **Streamlined CLI:** A new command-line interface (`agent-starter-pack`) simplifies project creation, template selection, and deployment.
-* **Expanded Agent Options:** Support for a wider variety of agent frameworks (LangGraph, CrewAI, and the Google GenAI SDK) and deployment targets (including Vertex AI Agent Engine).
-* **Simplified setup**: Integrated gcloud authentication and projects and region configurations
-
----
-
 ## 🔧 Requirements
 
 - Python 3.10+
@@ -127,6 +117,10 @@ See the [documentation](docs/) for more details:
 - **March 6, 2025**: A [120 Minute livestream video demo](https://www.youtube.com/watch?v=yIRIT_EtALs&t=235s) of the new `agent-starter-pack` were we build 3 Agents under 30 minutes!
 - **Oct 29, 2024**: A [20-Minute Video Walkthrough](https://youtu.be/kwRG7cnqSu0) is available, showcasing the previous `agent-starter-pack`.
 
+## Explore More Generative AI Resources
+
+Looking for more examples and resources for Generative AI on Google Cloud? Check out the [GoogleCloudPlatform/generative-ai](https://github.com/GoogleCloudPlatform/generative-ai) repository for notebooks, code samples, and more!
+
 ## Contributing
 
 Contributions are welcome! See the [Contributing Guide](CONTRIBUTING.md).
{agent_starter_pack-0.2.2.dist-info → agent_starter_pack-0.2.3.dist-info}/RECORD CHANGED
@@ -29,17 +29,17 @@ agents/live_api/tests/integration/test_server_e2e.py,sha256=D2VETDIyTD2fQyQ6DXwL
 agents/live_api/tests/load_test/load_test.py,sha256=HHZyfC4gqiQtZVF_CbbxENGgWQccMLpwMv0IdoQ6cbQ,1275
 agents/live_api/tests/unit/test_server.py,sha256=_TjlgQgNkjerIaBGnu8P8_KB8ZlSolDcivALpUOn_Rw,4786
 src/base_template/.gitignore,sha256=mJKTZIcVdAFiIUQicRfPNGUg6WvwcfTEC2xbmAaU34g,2579
-src/base_template/Makefile,sha256
-src/base_template/README.md,sha256=
-src/base_template/pyproject.toml,sha256
+src/base_template/Makefile,sha256=-FuUonze4AtGb0t_-R3L1ckDS3dOKmpXrGUC2fnQxUo,3167
+src/base_template/README.md,sha256=lg5awHyvvii7wuNgR02fhO_gE9LCdMf_ot9Fd7KOHWo,10317
+src/base_template/pyproject.toml,sha256=-fnazs9G4-EL0Ett7bgMP8qtWxWAHB22akcP5_ZCRz8,2813
 src/base_template/app/utils/tracing.py,sha256=JA5xVmWpJqaMiNUcD5irGS7Xii1LO37Dmfmzt3kA0Ys,5715
 src/base_template/app/utils/typing.py,sha256=LXvSescgmqf5V4h5vT5bcpu-6sWqzD6WgrEBGYLg1C0,3287
-src/base_template/deployment/README.md,sha256=
-src/base_template/deployment/cd/deploy-to-prod.yaml,sha256=
-src/base_template/deployment/cd/staging.yaml,sha256=
-src/base_template/deployment/ci/pr_checks.yaml,sha256=
+src/base_template/deployment/README.md,sha256=XjRRcEiyoobi0YEXybfvtaOi9ecNDU2bVKczv0MUBac,5255
+src/base_template/deployment/cd/deploy-to-prod.yaml,sha256=Vzt83DunPF64wVsY96KTGvhxZFHAF5Tze_88KW4O-oo,4297
+src/base_template/deployment/cd/staging.yaml,sha256=URqQPOxEv1ngYM7RoIs-iGMZsDdq6wUdmH953sipK_o,7925
+src/base_template/deployment/ci/pr_checks.yaml,sha256=7jS9HlRfistS4hhUXMF0tc-5m6g6l9s0rGhq_xP8Tnc,1517
 src/base_template/deployment/terraform/apis.tf,sha256=98vqe53RLtFwnQq_9N1widR8J0c1SGqwhCXp_GohITA,1497
-src/base_template/deployment/terraform/build_triggers.tf,sha256=
+src/base_template/deployment/terraform/build_triggers.tf,sha256=_XDFj6kPd21NmEKKRjpqYIa48P8VHCMVG_6thLqrS0Y,6363
 src/base_template/deployment/terraform/iam.tf,sha256=-KrOngRch4gKnPkZy0ybQQq0RW5-TI80tJ-VJLg-AdA,5597
 src/base_template/deployment/terraform/locals.tf,sha256=mrmOigExLk5g734-2VodDj8qQUTBq20e7FgQD0KUF8E,1390
 src/base_template/deployment/terraform/log_sinks.tf,sha256=PP_n3Jr-_EhMy6shVUrx7ztdQic5W3mKPjC0zQSvAjA,2955
@@ -57,8 +57,8 @@ src/base_template/deployment/terraform/dev/vars/env.tfvars,sha256=fySlaivTsrxA6s
 src/base_template/deployment/terraform/vars/env.tfvars,sha256=vasZKhOscuK0yjYrAUDHqbbYj8HV4d_rN87UJ-2ZEBM,1467
 src/base_template/tests/unit/test_utils/test_tracing_exporter.py,sha256=JAb0vIB7wNFPm_kaDaHcxtPKNReypDHjdsMQyzpuePQ,4687
 src/cli/main.py,sha256=pMsSlNwkrFqHUHHA5U-WMZ4QRquaI_F7OXQt6yxuugE,1688
-src/cli/commands/create.py,sha256=
-src/cli/commands/setup_cicd.py,sha256=
+src/cli/commands/create.py,sha256=iDfA1ewmGYAtWsYjNjzDka5eEWANgX1kyBjov0H9_nE,23787
+src/cli/commands/setup_cicd.py,sha256=bZb0cSjYFFprvGK-j7Msb0tTFKMzecyyoj2EKWjHNIg,31281
 src/cli/utils/__init__.py,sha256=_cTmsXGPqOtK0q8UW5164QTltbJRJFR_Efxq_BRL1-o,1311
 src/cli/utils/cicd.py,sha256=tlz50wI0unRosRZsDpZ5Oh5haMJmNjc9dV0-TPWawKE,27443
 src/cli/utils/datastores.py,sha256=gv1V6eDcOEKx4MRNG5C3Y-VfixYq1AzQuaYMLp8QRNo,1058
@@ -66,7 +66,7 @@ src/cli/utils/gcp.py,sha256=IHTLqMCOkDwITQV_7YnNun2b6YIzTWKZCxgaDVobdeQ,4066
 src/cli/utils/logging.py,sha256=0lHe4EPi1A8sOx9xkA7gS4UNl0GsIyp2ahydkkuCzLY,1570
 src/cli/utils/template.py,sha256=kY6K7Nra2hYN1a36TUb2PoLDKwD3Np9rLTMeVBRbsNg,28730
 src/cli/utils/version.py,sha256=F4udQmzniPStqWZFIgnv3Qg3l9non4mfy2An-Oveqmc,2916
-src/data_ingestion/README.md,sha256=
+src/data_ingestion/README.md,sha256=LNxSQoJW9JozK-TbyGQLj5L_MGWNwrfLk6V6RmQ2oBQ,4032
 src/data_ingestion/pyproject.toml,sha256=-1Mf2QB8K70ICQV5UPZDpf-fN3UwEQLVzQyxfakCSTY,445
 src/data_ingestion/uv.lock,sha256=HzSD6_IxS2urt49EefD9MvVxBwxW_bJ-k0XltTDT3Vc,144223
 src/data_ingestion/data_ingestion_pipeline/pipeline.py,sha256=UAS6dxV954yARP0NdDbCf5kzap3NfmoX52eZ2mZtWXs,3331
@@ -74,7 +74,7 @@ src/data_ingestion/data_ingestion_pipeline/submit_pipeline.py,sha256=tWGqL0zUB3V
 src/data_ingestion/data_ingestion_pipeline/components/ingest_data.py,sha256=s5IlAshPbf5kgBTBb-sz0zHqZnIpCzaoM2b2cs7PkJA,10640
 src/data_ingestion/data_ingestion_pipeline/components/process_data.py,sha256=3D4CaUV9pBaIU10MogJMQAR00JPVzqjFHN6eL7WM_5Y,22045
 src/deployment_targets/agent_engine/deployment_metadata.json,sha256=G_t_n-UNrFsBgH1Xrw5-ZqYzUGwZ4X6ipsIm_yiSq-w,72
-src/deployment_targets/agent_engine/app/agent_engine_app.py,sha256=
+src/deployment_targets/agent_engine/app/agent_engine_app.py,sha256=RK8jsVLWWiaEK_hTE5fI3mQga0K-SrqyRbO7oAZ-N1g,9461
 src/deployment_targets/agent_engine/app/utils/gcs.py,sha256=voQNs8sbvLDH0PX3avpn8RFNTq7QWNgXfMWUKU7hBTA,1495
 src/deployment_targets/agent_engine/notebooks/intro_agent_engine.ipynb,sha256=RE6Gp-bu4bHtNBngfkt_CF5emy5YcOWQ5rMYW_Cv8_w,48809
 src/deployment_targets/agent_engine/tests/integration/test_agent_engine_app.py,sha256=maYeJgeZceMgD4TI8ZID6SKAqHPlQuW2CksT7MGile4,3938
@@ -82,7 +82,7 @@ src/deployment_targets/agent_engine/tests/load_test/README.md,sha256=ckP2eu5_HpO
 src/deployment_targets/agent_engine/tests/load_test/load_test.py,sha256=msUkkqsNinCJ8loQHsEeGkQQqQ-Zvv7oAp5ni9vOsWs,3541
 src/deployment_targets/agent_engine/tests/load_test/.results/.placeholder,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 src/deployment_targets/agent_engine/tests/unit/test_dummy.py,sha256=hL-JCmoQUJ4IgluliEo3KkYd47nmiCy3obiPuz_b0sU,712
-src/deployment_targets/cloud_run/Dockerfile,sha256=
+src/deployment_targets/cloud_run/Dockerfile,sha256=vedZX0JJW2bymESqB-0LumnkKyr7hcog8NlR0DX8Q74,843
 src/deployment_targets/cloud_run/uv.lock,sha256=2RYrR89xDS1FHVb-g2JGZq8zZ-rzbm_jRvehVCYuXRQ,887957
 src/deployment_targets/cloud_run/app/server.py,sha256=h6H6xwK_VVlOOfD6sOSbxX3WqRrHHzH0IUbns5VOyoo,3999
 src/deployment_targets/cloud_run/tests/integration/test_server_e2e.py,sha256=KKtdtBTxaAfHQodm4FdyQHa7hBAwWCCda5aEF26-bjk,5978
@@ -134,25 +134,25 @@ src/frontends/streamlit/frontend/utils/chat_utils.py,sha256=Z0OYQu-14_d9tDmH9Z4V
 src/frontends/streamlit/frontend/utils/local_chat_history.py,sha256=9wc8L8j4tk10DBPQdV64kdZvqE1fHxC2esK8szid0l8,4741
 src/frontends/streamlit/frontend/utils/message_editing.py,sha256=YWoPe2KeWMuL3YVTm0am6MK3kzjEIYVmdkdwTQpmGdQ,2263
 src/frontends/streamlit/frontend/utils/multimodal_utils.py,sha256=v6YbCkz_YcnEo-9YvRjwBNt0SzU4M39bYxJGmKk69vE,7313
-src/frontends/streamlit/frontend/utils/stream_handler.py,sha256=
+src/frontends/streamlit/frontend/utils/stream_handler.py,sha256=kRKv7gQDEDsVtJxUsxY5uqCCtR3nQmT1nkZyrQ6Hpsw,12030
 src/frontends/streamlit/frontend/utils/title_summary.py,sha256=B0cadS_KPW-tsbABauI4J681aqjEtuKFDa25e9R1WKc,3030
 src/resources/containers/data_processing/Dockerfile,sha256=VoB9d5yZiiWnqRfWrIq0gGNMzZg-eVy733OgP72ZgO0,950
 src/resources/containers/e2e-tests/Dockerfile,sha256=Q_aTyX_iaFY8j06XZkpMuggJnNO5daiLmmrvqaZHMxw,1611
-src/resources/locks/uv-agentic_rag-agent_engine.lock,sha256=
-src/resources/locks/uv-agentic_rag-cloud_run.lock,sha256=
-src/resources/locks/uv-crewai_coding_crew-agent_engine.lock,sha256=
-src/resources/locks/uv-crewai_coding_crew-cloud_run.lock,sha256=
-src/resources/locks/uv-langgraph_base_react-agent_engine.lock,sha256=
-src/resources/locks/uv-langgraph_base_react-cloud_run.lock,sha256=
-src/resources/locks/uv-live_api-cloud_run.lock,sha256=
+src/resources/locks/uv-agentic_rag-agent_engine.lock,sha256=M4TwcVwII-qHzTE8KgLZM0B9XM_loV1nDCXBuVAG6Ac,615811
+src/resources/locks/uv-agentic_rag-cloud_run.lock,sha256=ng8RMphRb5FoxtG7S9-kEGvolpXDjM5Vx4nykd0yzEY,782807
+src/resources/locks/uv-crewai_coding_crew-agent_engine.lock,sha256=IvLBCezxo8Txr5JO9tHjawVQryWXH-dhXWg6fxtjvJI,712926
+src/resources/locks/uv-crewai_coding_crew-cloud_run.lock,sha256=efnOBWavF8OlBW5AF0WkxkOQpvcb_zt6pN-Dkk9CE94,888350
+src/resources/locks/uv-langgraph_base_react-agent_engine.lock,sha256=XqI00QN9JsplRc5f8oEMVOwIDXPk-h05ZSLurCzUfGo,595857
+src/resources/locks/uv-langgraph_base_react-cloud_run.lock,sha256=iv6yt5pSMKivd-XVjvJwEvuTGzfsMfjSvTLTelv4w5s,758692
+src/resources/locks/uv-live_api-cloud_run.lock,sha256=TeUVt--pNkPrkjb08ek62oJwKKMb70pRSKpza0C-G6M,761749
 src/resources/setup_cicd/cicd_variables.tf,sha256=PMflYe1TzQi63LORHkmeCktTYzXFplJgxffNH4DtuAQ,1244
 src/resources/setup_cicd/github.tf,sha256=scTBgeZlCM74N-pzhVKsnTN0PX9a5GboNl1HN3-LlCM,2791
 src/resources/setup_cicd/providers.tf,sha256=Km4z6IJt7x7PLaa0kyZbBrO2m3lpuIJZFD5jB7QBfF0,1122
 src/utils/generate_locks.py,sha256=xu5IAhGGBPkVQGSJX4kk7_JNDwWJUEaXAHbmaQIohbg,4386
 src/utils/lock_utils.py,sha256=_QdzQtgIbCmJ87s046_i1g966slVNmvr3bJDeHbRQSM,2419
 src/utils/watch_and_rebuild.py,sha256=vP4yIiA7E_lj5sfQdJUl8TXas6V7msDg8XWUutAC05Q,6679
-agent_starter_pack-0.2.
-agent_starter_pack-0.2.
-agent_starter_pack-0.2.
-agent_starter_pack-0.2.
-agent_starter_pack-0.2.
+agent_starter_pack-0.2.3.dist-info/METADATA,sha256=54iuThzxPP5KGanSjYeJ_5S9yuPZMuO5Tn_l0i7ZK0A,7705
+agent_starter_pack-0.2.3.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
+agent_starter_pack-0.2.3.dist-info/entry_points.txt,sha256=U7uCxR7YulIhZ0L8R8Hui0Bsy6J7oyESBeDYJYMrQjA,56
+agent_starter_pack-0.2.3.dist-info/licenses/LICENSE,sha256=xx0jnfkXJvxRnG63LTGOxlggYnIysveWIZ6H3PNdCrQ,11357
+agent_starter_pack-0.2.3.dist-info/RECORD,,
src/base_template/Makefile CHANGED
@@ -1,5 +1,5 @@
 install:
-	@command -v uv >/dev/null 2>&1 || { echo "uv is not installed. Installing uv..."; curl -LsSf https://astral.sh/uv/install.sh | sh; source ~/.bashrc; }
+	@command -v uv >/dev/null 2>&1 || { echo "uv is not installed. Installing uv..."; curl -LsSf https://astral.sh/uv/0.6.12/install.sh | sh; source ~/.bashrc; }
 	uv sync --dev {% if cookiecutter.agent_name != 'live_api' %}--extra streamlit{%- endif %} --extra jupyter --frozen{% if cookiecutter.agent_name == 'live_api' %} && npm --prefix frontend install{%- endif %}
 
 test:
@@ -19,7 +19,9 @@ backend:
 {%- if cookiecutter.deployment_target == 'cloud_run' %}
 	uv run uvicorn app.server:app --host 0.0.0.0 --port 8000 --reload
 {%- elif cookiecutter.deployment_target == 'agent_engine' %}
-	uv export
+	# Export dependencies to requirements file using uv export (preferred method), otherwise fall back to uv pip freeze
+	uv export --no-hashes --no-sources --no-header --no-dev --no-emit-project --no-annotate --frozen > .requirements.txt 2>/dev/null || \
+	uv pip freeze --exclude-editable > .requirements.txt && uv run app/agent_engine_app.py
 {%- endif %}
 
 {% if cookiecutter.deployment_target == 'cloud_run' -%}
@@ -34,8 +36,7 @@ ui:
 setup-dev-env:
 	@if [ -z "$$PROJECT_ID" ]; then echo "Error: PROJECT_ID environment variable is not set"; exit 1; fi
 	(cd deployment/terraform/dev && terraform init && terraform apply --var-file vars/env.tfvars --var dev_project_id=$$PROJECT_ID --auto-approve)
-
-{%- if cookiecutter.data_ingestion%}
+{% if cookiecutter.data_ingestion %}
 data-ingestion:
 	@if [ -z "$$PROJECT_ID" ]; then echo "Error: PROJECT_ID environment variable is not set"; exit 1; fi
 	$(MAKE) install
@@ -48,6 +49,7 @@ data-ingestion:
 {%- elif cookiecutter.datastore_type == "vertex_ai_vector_search" %}
 	--vector-search-index="{{cookiecutter.project_name}}-vector-search" \
 	--vector-search-index-endpoint="{{cookiecutter.project_name}}-vector-search-endpoint" \
+	--vector-search-data-bucket-name="$$PROJECT_ID-{{cookiecutter.project_name}}-vs" \
 {%- endif %}
 	--service-account="{{cookiecutter.project_name}}-rag@$$PROJECT_ID.iam.gserviceaccount.com" \
 	--pipeline-root="gs://$$PROJECT_ID-{{cookiecutter.project_name}}-rag" \
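For context on the Makefile change above, the updated agent_engine `backend` recipe prefers `uv export` and only falls back to `uv pip freeze` if the export fails. A minimal standalone sketch of that fallback pattern (outside the Makefile, assuming a recent `uv` is installed):

```bash
# Sketch of the requirements-export fallback used by the updated backend target.
# If `uv export` fails (for example on an older uv without these flags),
# fall back to `uv pip freeze`.
uv export --no-hashes --no-sources --no-header --no-dev --no-emit-project --no-annotate --frozen > .requirements.txt 2>/dev/null || \
  uv pip freeze --exclude-editable > .requirements.txt
```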
src/base_template/README.md CHANGED
@@ -56,6 +56,10 @@ make install && make playground
 {%- endif %}
 | `make test` | Run unit and integration tests |
 | `make lint` | Run code quality checks (codespell, ruff, mypy) |
+| `make setup-dev-env` | Set up development environment resources using Terraform |
+{%- if cookiecutter.data_ingestion %}
+| `make data-ingestion`| Run data ingestion pipeline in the Dev environment |
+{%- endif %}
 | `uv run jupyter lab` | Launch Jupyter notebook |
 
 For full command options and usage, refer to the [Makefile](Makefile).
@@ -144,12 +148,14 @@ This template follows a "bring your own agent" approach - you focus on your busi
 1. **Prototype:** Build your Generative AI Agent using the intro notebooks in `notebooks/` for guidance. Use Vertex AI Evaluation to assess performance.
 2. **Integrate:** Import your agent into the app by editing `app/agent.py`.
 3. **Test:** Explore your agent functionality using the Streamlit playground with `make playground`. The playground offers features like chat history, user feedback, and various input types, and automatically reloads your agent on code changes.
-4. **Deploy:**
+4. **Deploy:** Set up and initiate the CI/CD pipelines, customizing tests as necessary. Refer to the [deployment section](#deployment) for comprehensive instructions. For streamlined infrastructure deployment, simply run `agent-starter-pack setup-cicd`. Check out the [`agent-starter-pack setup-cicd` CLI command](https://github.com/GoogleCloudPlatform/agent-starter-pack/blob/main/docs/cli/setup_cicd.md). Currently only supporting Github.
 5. **Monitor:** Track performance and gather insights using Cloud Logging, Tracing, and the Looker Studio dashboard to iterate on your application.
 {% endif %}
 
 ## Deployment
 
+> **Note:** For a streamlined one-command deployment of the entire CI/CD pipeline and infrastructure using Terraform, you can use the [`agent-starter-pack setup-cicd` CLI command](https://github.com/GoogleCloudPlatform/agent-starter-pack/blob/main/docs/cli/setup_cicd.md). Currently only supporting Github.
+
 ### Dev Environment
 
 {%- if cookiecutter.deployment_target == 'agent_engine' %}
src/base_template/deployment/README.md CHANGED
@@ -29,6 +29,8 @@ The application leverages [**Terraform**](http://terraform.io) to define and pro
 
 ## Setup
 
+> **Note:** For a streamlined one-command deployment of the entire CI/CD pipeline and infrastructure using Terraform, you can use the [`agent-starter-pack setup-cicd` CLI command](https://github.com/GoogleCloudPlatform/agent-starter-pack/blob/main/docs/cli/setup_cicd.md). Currently only supporting Github.
+
 **Prerequisites:**
 
 1. A set of Google Cloud projects:
@@ -58,6 +60,7 @@ The application leverages [**Terraform**](http://terraform.io) to define and pro
 
 | Variable | Description | Required |
 | ---------------------- | --------------------------------------------------------------- | :------: |
+| project_name | Project name used as a base for resource naming | Yes |
 | prod_project_id | **Production** Google Cloud Project ID for resource deployment. | Yes |
 | staging_project_id | **Staging** Google Cloud Project ID for resource deployment. | Yes |
 | cicd_runner_project_id | Google Cloud Project ID where CI/CD pipelines will execute. | Yes |
@@ -65,7 +68,7 @@
 | host_connection_name | Name of the host connection you created in Cloud Build | Yes |
 | repository_name | Name of the repository you added to Cloud Build | Yes |
 
-Other optional variables include: telemetry and feedback
+Other optional variables may include: telemetry and feedback log filters, service account roles, and for projects requiring data ingestion: pipeline cron schedule, pipeline roles, and datastore-specific configurations.
 
 4. **Deploy Infrastructure with Terraform**
 
src/base_template/deployment/cd/deploy-to-prod.yaml CHANGED
@@ -20,7 +20,7 @@ steps:
   args:
     - -c
     - |
-      cd data_ingestion && pip install uv --user && uv sync --frozen && \
+      cd data_ingestion && pip install uv==0.6.12 --user && uv sync --frozen && \
       uv run python data_ingestion_pipeline/submit_pipeline.py
   env:
     - "PIPELINE_ROOT=${_PIPELINE_GCS_ROOT}"
@@ -75,7 +75,7 @@ steps:
   args:
     - "-c"
    - |
-      pip install uv --user && uv sync --frozen
+      pip install uv==0.6.12 --user && uv sync --frozen
   env:
     - 'PATH=/usr/local/bin:/usr/bin:~/.local/bin'
 
@@ -85,7 +85,7 @@ steps:
   args:
     - "-c"
     - |
-      uv export --no-hashes --no-sources --no-header --no-emit-project --frozen > .requirements.txt
+      uv export --no-hashes --no-sources --no-header --no-dev --no-emit-project --no-annotate --frozen > .requirements.txt
       uv run app/agent_engine_app.py \
         --project ${_PROD_PROJECT_ID} \
         --location ${_REGION} \
src/base_template/deployment/cd/staging.yaml CHANGED
@@ -20,7 +20,7 @@ steps:
   args:
     - -c
    - |
-      cd data_ingestion && pip install uv --user && uv sync --frozen && \
+      cd data_ingestion && pip install uv==0.6.12 --user && uv sync --frozen && \
      uv run python data_ingestion_pipeline/submit_pipeline.py
   env:
     - "PIPELINE_ROOT=${_PIPELINE_GCS_ROOT}"
@@ -108,7 +108,7 @@ steps:
   args:
     - "-c"
     - |
-      pip install uv --user && uv sync --frozen
+      pip install uv==0.6.12 --user && uv sync --frozen
   env:
     - 'PATH=/usr/local/bin:/usr/bin:~/.local/bin'
 
@@ -118,7 +118,7 @@ steps:
   args:
     - "-c"
     - |
-      uv export --no-hashes --no-sources --no-header --no-emit-project --frozen > .requirements.txt
+      uv export --no-hashes --no-sources --no-header --no-dev --no-emit-project --no-annotate --frozen > .requirements.txt
       uv run app/agent_engine_app.py \
        --project ${_STAGING_PROJECT_ID} \
        --location ${_REGION} \
@@ -146,7 +146,7 @@ steps:
      {%- if cookiecutter.deployment_target == 'cloud_run' %}
      export _ID_TOKEN=$(cat id_token.txt)
      export _STAGING_URL=$(cat staging_url.txt)
-      pip install uv --user && uv sync --frozen
+      pip install uv==0.6.12 --user && uv sync --frozen
      {%- elif cookiecutter.deployment_target == 'agent_engine' %}
      export _AUTH_TOKEN=$(cat auth_token.txt)
      {%- endif %}
src/base_template/deployment/terraform/build_triggers.tf CHANGED
@@ -38,6 +38,7 @@ resource "google_cloudbuild_trigger" "pr_checks" {
       "data_ingestion/**",
     {% endif %}
   ]
+  include_build_logs = "INCLUDE_BUILD_LOGS_WITH_STATUS"
   depends_on = [resource.google_project_service.cicd_services, resource.google_project_service.shared_services]
 }
 
@@ -64,6 +65,7 @@ resource "google_cloudbuild_trigger" "cd_pipeline" {
       "deployment/**",
       "uv.lock"
   ]
+  include_build_logs = "INCLUDE_BUILD_LOGS_WITH_STATUS"
   substitutions = {
     _STAGING_PROJECT_ID = var.staging_project_id
     _BUCKET_NAME_LOAD_TEST_RESULTS = resource.google_storage_bucket.bucket_load_test_results.name
@@ -104,6 +106,7 @@ resource "google_cloudbuild_trigger" "deploy_to_prod_pipeline" {
     repository = "projects/${var.cicd_runner_project_id}/locations/${var.region}/connections/${var.host_connection_name}/repositories/${var.repository_name}"
   }
   filename = "deployment/cd/deploy-to-prod.yaml"
+  include_build_logs = "INCLUDE_BUILD_LOGS_WITH_STATUS"
   approval_config {
     approval_required = true
   }
src/base_template/pyproject.toml CHANGED
@@ -14,11 +14,11 @@ dependencies = [
     "traceloop-sdk~=0.38.7",
     "google-cloud-logging~=3.11.4",
 {%- if cookiecutter.deployment_target == 'cloud_run' %}
-    "google-cloud-aiplatform[evaluation]~=1.
+    "google-cloud-aiplatform[evaluation]~=1.87.0",
     "fastapi~=0.115.8",
     "uvicorn~=0.34.0"
 {%- elif cookiecutter.deployment_target == 'agent_engine' %}
-    "google-cloud-aiplatform[evaluation,reasoningengine]~=1.
+    "google-cloud-aiplatform[evaluation,reasoningengine]~=1.87.0"
 {%- endif %}
 ]
 {% if cookiecutter.deployment_target == 'cloud_run' %}
src/cli/commands/create.py CHANGED
@@ -37,6 +37,40 @@ from ..utils.template import (
 console = Console()
 
 
+def normalize_project_name(project_name: str) -> str:
+    """Normalize project name for better compatibility with cloud resources and tools."""
+
+    needs_normalization = (
+        any(char.isupper() for char in project_name) or "_" in project_name
+    )
+
+    if needs_normalization:
+        normalized_name = project_name
+        console.print(
+            "Note: Project names are normalized (lowercase, hyphens only) for better compatibility with cloud resources and tools.",
+            style="dim",
+        )
+        if any(char.isupper() for char in normalized_name):
+            normalized_name = normalized_name.lower()
+            console.print(
+                f"Info: Converting to lowercase for compatibility: '{project_name}' -> '{normalized_name}'",
+                style="bold yellow",
+            )
+
+        if "_" in normalized_name:
+            # Capture the name state before this specific change
+            name_before_hyphenation = normalized_name
+            normalized_name = normalized_name.replace("_", "-")
+            console.print(
+                f"Info: Replacing underscores with hyphens for compatibility: '{name_before_hyphenation}' -> '{normalized_name}'",
+                style="yellow",
+            )
+
+        return normalized_name
+
+    return project_name
+
+
 @click.command()
 @click.pass_context
 @click.argument("project_name")
@@ -110,15 +144,7 @@ def create(
         )
         return
 
-
-    if any(char.isupper() for char in project_name):
-        original_name = project_name
-        project_name = project_name.lower()
-        console.print(
-            f"Warning: Project name '{original_name}' contains uppercase characters. "
-            f"Converting to lowercase: '{project_name}'",
-            style="bold yellow",
-        )
+    project_name = normalize_project_name(project_name)
 
     # Setup debug logging if enabled
     if debug:
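To illustrate the new normalization (the project name and invocation below are hypothetical, and the console wording is paraphrased from the code in the diff above):

```bash
# Hypothetical invocation; `create` now normalizes the supplied name
# instead of only lowercasing it.
agent-starter-pack create My_RAG_Agent
# Expected normalization, per normalize_project_name():
#   'My_RAG_Agent' -> 'my_rag_agent'   (lowercased)
#   'my_rag_agent' -> 'my-rag-agent'   (underscores replaced with hyphens)
```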
src/cli/commands/setup_cicd.py CHANGED
@@ -356,9 +356,11 @@ console = Console()
 
 @click.command()
 @click.option("--dev-project", help="Development project ID")
-@click.option("--staging-project",
-@click.option("--prod-project",
-@click.option(
+@click.option("--staging-project", help="Staging project ID")
+@click.option("--prod-project", help="Production project ID")
+@click.option(
+    "--cicd-project", help="CICD project ID (defaults to prod project if not specified)"
+)
 @click.option("--region", default="us-central1", help="GCP region")
 @click.option("--repository-name", help="Repository name (optional)")
 @click.option(
@@ -402,9 +404,9 @@ console = Console()
 )
 def setup_cicd(
     dev_project: str | None,
-    staging_project: str,
-    prod_project: str,
-    cicd_project: str,
+    staging_project: str | None,
+    prod_project: str | None,
+    cicd_project: str | None,
     region: str,
     repository_name: str | None,
     repository_owner: str | None,
@@ -426,6 +428,20 @@ def setup_cicd(
             "Make sure you are in the folder created by agent-starter-pack."
         )
 
+    # Prompt for staging and prod projects if not provided
+    if staging_project is None:
+        staging_project = click.prompt(
+            "Enter your staging project ID (where tests will be run)", type=str
+        )
+
+    if prod_project is None:
+        prod_project = click.prompt("Enter your production project ID", type=str)
+
+    # If cicd_project is not provided, default to prod_project
+    if cicd_project is None:
+        cicd_project = prod_project
+        console.print(f"Using production project '{prod_project}' for CI/CD resources")
+
     console.print(
         "\n⚠️ WARNING: The setup-cicd command is experimental and may have unexpected behavior.",
         style="bold yellow",
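As a usage sketch of the relaxed options (project IDs below are placeholders): `--staging-project` and `--prod-project` are now prompted for interactively when omitted, and `--cicd-project` falls back to the production project.

```bash
# Fully specified, non-interactive run (placeholder project IDs):
agent-starter-pack setup-cicd \
  --staging-project my-staging-project \
  --prod-project my-prod-project \
  --region us-central1

# Omitting --staging-project / --prod-project now triggers interactive prompts;
# omitting --cicd-project reuses the production project for CI/CD resources.
```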
src/data_ingestion/README.md CHANGED
@@ -1,84 +1,71 @@
+{%- if cookiecutter.datastore_type == "vertex_ai_search" -%}
+{%- set datastore_service_name = "Vertex AI Search" -%}
+{%- elif cookiecutter.datastore_type == "vertex_ai_vector_search" -%}
+{%- set datastore_service_name = "Vertex AI Vector Search" -%}
+{%- else -%}
+{%- set datastore_service_name = "Your Configured Datastore" -%}
+{%- endif -%}
+
 # Data Ingestion Pipeline
 
-This pipeline automates the ingestion of data into
+This pipeline automates the ingestion of data into {{ datastore_service_name }}, streamlining the process of building Retrieval Augmented Generation (RAG) applications.
 
-It orchestrates the complete workflow: loading data, chunking it into manageable segments, generating embeddings using Vertex AI Embeddings, and importing the processed data into your
+It orchestrates the complete workflow: loading data, chunking it into manageable segments, generating embeddings using Vertex AI Embeddings, and importing the processed data into your {{ datastore_service_name }} datastore.
 
 You can trigger the pipeline for an initial data load or schedule it to run periodically, ensuring your search index remains current. Vertex AI Pipelines provides the orchestration and monitoring capabilities for this process.
 
 ## Prerequisites
 
-Before running
+Before running any commands, ensure you have set your Google Cloud Project ID as an environment variable. This variable will be used by the subsequent `make` commands.
 
-
+```bash
+export PROJECT_ID="YOUR_PROJECT_ID"
+```
+Replace `"YOUR_PROJECT_ID"` with your actual Google Cloud Project ID.
 
-
+Now, you can set up the development environment:
 
-
+1. **Set up Dev Environment:** Use the following command from the root of the repository to provision the necessary resources in your development environment using Terraform. This includes deploying a datastore and configuring the required permissions.
 
-
+```bash
+make setup-dev-env
+```
+This command requires `terraform` to be installed and configured.
 
-
+## Running the Data Ingestion Pipeline
 
-
+After setting up the infrastructure using `make setup-dev-env`, you can run the data ingestion pipeline.
 
-
-cd data_ingestion
-```
+> **Note:** The initial pipeline execution might take longer as your project is configured for Vertex AI Pipelines.
 
-**
+**Steps:**
 
-
+**a. Execute the Pipeline:**
+Run the following command from the root of the repository. Ensure the `PROJECT_ID` environment variable is still set in your current shell session (as configured in Prerequisites).
 
 ```bash
-
+make data-ingestion
 ```
 
-
-
-Run the following command to execute the data ingestion pipeline. Replace the placeholder values with your actual project details.
+This command handles installing dependencies (if needed via `make install`) and submits the pipeline job using the configuration derived from your project setup. The specific parameters passed to the underlying script depend on the `datastore_type` selected during project generation:
 {%- if cookiecutter.datastore_type == "vertex_ai_search" %}
-
-PROJECT_ID="YOUR_PROJECT_ID"
-REGION="us-central1"
-DATA_STORE_REGION="us"
-uv run data_ingestion_pipeline/submit_pipeline.py \
-  --project-id=$PROJECT_ID \
-  --region=$REGION \
-  --data-store-region=$DATA_STORE_REGION \
-  --data-store-id="sample-datastore" \
-  --service-account="{{cookiecutter.project_name}}-rag@$PROJECT_ID.iam.gserviceaccount.com" \
-  --pipeline-root="gs://$PROJECT_ID-{{cookiecutter.project_name}}-rag" \
-  --pipeline-name="data-ingestion-pipeline"
-```
+* It will use parameters like `--data-store-id`, `--data-store-region`.
 {%- elif cookiecutter.datastore_type == "vertex_ai_vector_search" %}
-
-PROJECT_ID="YOUR_PROJECT_ID"
-REGION="us-central1"
-VECTOR_SEARCH_INDEX="YOUR_VECTOR_SEARCH_INDEX"
-VECTOR_SEARCH_INDEX_ENDPOINT="YOUR_VECTOR_SEARCH_INDEX_ENDPOINT"
-uv run data_ingestion_pipeline/submit_pipeline.py \
-  --project-id=$PROJECT_ID \
-  --region=$REGION \
-  --vector-search-index=$VECTOR_SEARCH_INDEX \
-  --vector-search-index-endpoint=$VECTOR_SEARCH_INDEX_ENDPOINT \
-  --service-account="{{cookiecutter.project_name}}-rag@$PROJECT_ID.iam.gserviceaccount.com" \
-  --pipeline-root="gs://$PROJECT_ID-{{cookiecutter.project_name}}-rag" \
-  --pipeline-name="data-ingestion-pipeline"
-```
+* It will use parameters like `--vector-search-index`, `--vector-search-index-endpoint`, `--vector-search-data-bucket-name`.
 {%- endif %}
+* Common parameters include `--project-id`, `--region`, `--service-account`, `--pipeline-root`, and `--pipeline-name`.
 
-**
+**b. Pipeline Scheduling:**
 
-The
+The `make data-ingestion` command triggers an immediate pipeline run. For production environments, the underlying `submit_pipeline.py` script also supports scheduling options with flags like `--schedule-only` and `--cron-schedule` for periodic execution.
 
-**
+**c. Monitoring Pipeline Progress:**
 
-The pipeline's configuration and execution status will be printed to the console. For detailed monitoring, use the Vertex AI Pipelines dashboard in the Google Cloud Console.
+The pipeline's configuration and execution status link will be printed to the console upon submission. For detailed monitoring, use the Vertex AI Pipelines dashboard in the Google Cloud Console.
 
 ## Testing Your RAG Application
 
-Once the data ingestion pipeline completes successfully, you can test your RAG application with
+Once the data ingestion pipeline completes successfully, you can test your RAG application with {{ datastore_service_name }}.
 {%- if cookiecutter.datastore_type == "vertex_ai_search" %}
 > **Troubleshooting:** If you encounter the error `"google.api_core.exceptions.InvalidArgument: 400 The embedding field path: embedding not found in schema"` after the initial data ingestion, wait a few minutes and try again. This delay allows Vertex AI Search to fully index the ingested data.
 {%- endif %}