chub-dev 0.1.0 → 0.1.2-beta.0
- package/README.md +55 -0
- package/bin/chub-mcp +2 -0
- package/dist/airtable/docs/database/javascript/DOC.md +1437 -0
- package/dist/airtable/docs/database/python/DOC.md +1735 -0
- package/dist/amplitude/docs/analytics/javascript/DOC.md +1282 -0
- package/dist/amplitude/docs/analytics/python/DOC.md +1199 -0
- package/dist/anthropic/docs/claude-api/javascript/DOC.md +503 -0
- package/dist/anthropic/docs/claude-api/python/DOC.md +389 -0
- package/dist/asana/docs/tasks/DOC.md +1396 -0
- package/dist/assemblyai/docs/transcription/DOC.md +1043 -0
- package/dist/atlassian/docs/confluence/javascript/DOC.md +1347 -0
- package/dist/atlassian/docs/confluence/python/DOC.md +1604 -0
- package/dist/auth0/docs/identity/javascript/DOC.md +968 -0
- package/dist/auth0/docs/identity/python/DOC.md +1199 -0
- package/dist/aws/docs/s3/javascript/DOC.md +1773 -0
- package/dist/aws/docs/s3/python/DOC.md +1807 -0
- package/dist/binance/docs/trading/javascript/DOC.md +1315 -0
- package/dist/binance/docs/trading/python/DOC.md +1454 -0
- package/dist/braintree/docs/gateway/javascript/DOC.md +1278 -0
- package/dist/braintree/docs/gateway/python/DOC.md +1179 -0
- package/dist/chromadb/docs/embeddings-db/javascript/DOC.md +1263 -0
- package/dist/chromadb/docs/embeddings-db/python/DOC.md +1707 -0
- package/dist/clerk/docs/auth/javascript/DOC.md +1220 -0
- package/dist/clerk/docs/auth/python/DOC.md +274 -0
- package/dist/cloudflare/docs/workers/javascript/DOC.md +918 -0
- package/dist/cloudflare/docs/workers/python/DOC.md +994 -0
- package/dist/cockroachdb/docs/distributed-db/DOC.md +1500 -0
- package/dist/cohere/docs/llm/DOC.md +1335 -0
- package/dist/datadog/docs/monitoring/javascript/DOC.md +1740 -0
- package/dist/datadog/docs/monitoring/python/DOC.md +1815 -0
- package/dist/deepgram/docs/speech/javascript/DOC.md +885 -0
- package/dist/deepgram/docs/speech/python/DOC.md +685 -0
- package/dist/deepl/docs/translation/javascript/DOC.md +887 -0
- package/dist/deepl/docs/translation/python/DOC.md +944 -0
- package/dist/deepseek/docs/llm/DOC.md +1220 -0
- package/dist/directus/docs/headless-cms/javascript/DOC.md +1128 -0
- package/dist/directus/docs/headless-cms/python/DOC.md +1276 -0
- package/dist/discord/docs/bot/javascript/DOC.md +1090 -0
- package/dist/discord/docs/bot/python/DOC.md +1130 -0
- package/dist/elasticsearch/docs/search/DOC.md +1634 -0
- package/dist/elevenlabs/docs/text-to-speech/javascript/DOC.md +336 -0
- package/dist/elevenlabs/docs/text-to-speech/python/DOC.md +552 -0
- package/dist/firebase/docs/auth/DOC.md +1015 -0
- package/dist/gemini/docs/genai/javascript/DOC.md +691 -0
- package/dist/gemini/docs/genai/python/DOC.md +555 -0
- package/dist/github/docs/octokit/DOC.md +1560 -0
- package/dist/google/docs/bigquery/javascript/DOC.md +1688 -0
- package/dist/google/docs/bigquery/python/DOC.md +1503 -0
- package/dist/hubspot/docs/crm/javascript/DOC.md +1805 -0
- package/dist/hubspot/docs/crm/python/DOC.md +2033 -0
- package/dist/huggingface/docs/transformers/DOC.md +948 -0
- package/dist/intercom/docs/messaging/javascript/DOC.md +1844 -0
- package/dist/intercom/docs/messaging/python/DOC.md +1797 -0
- package/dist/jira/docs/issues/javascript/DOC.md +1420 -0
- package/dist/jira/docs/issues/python/DOC.md +1492 -0
- package/dist/kafka/docs/streaming/javascript/DOC.md +1671 -0
- package/dist/kafka/docs/streaming/python/DOC.md +1464 -0
- package/dist/landingai-ade/docs/api/DOC.md +620 -0
- package/dist/landingai-ade/docs/sdk/python/DOC.md +489 -0
- package/dist/landingai-ade/docs/sdk/typescript/DOC.md +542 -0
- package/dist/landingai-ade/skills/SKILL.md +489 -0
- package/dist/launchdarkly/docs/feature-flags/javascript/DOC.md +1191 -0
- package/dist/launchdarkly/docs/feature-flags/python/DOC.md +1671 -0
- package/dist/linear/docs/tracker/DOC.md +1554 -0
- package/dist/livekit/docs/realtime/javascript/DOC.md +303 -0
- package/dist/livekit/docs/realtime/python/DOC.md +163 -0
- package/dist/mailchimp/docs/marketing/DOC.md +1420 -0
- package/dist/meilisearch/docs/search/DOC.md +1241 -0
- package/dist/microsoft/docs/onedrive/javascript/DOC.md +1421 -0
- package/dist/microsoft/docs/onedrive/python/DOC.md +1549 -0
- package/dist/mongodb/docs/atlas/DOC.md +2041 -0
- package/dist/notion/docs/workspace-api/javascript/DOC.md +1435 -0
- package/dist/notion/docs/workspace-api/python/DOC.md +1400 -0
- package/dist/okta/docs/identity/javascript/DOC.md +1171 -0
- package/dist/okta/docs/identity/python/DOC.md +1401 -0
- package/dist/openai/docs/chat/javascript/DOC.md +407 -0
- package/dist/openai/docs/chat/python/DOC.md +568 -0
- package/dist/paypal/docs/checkout/DOC.md +278 -0
- package/dist/pinecone/docs/sdk/javascript/DOC.md +984 -0
- package/dist/pinecone/docs/sdk/python/DOC.md +1395 -0
- package/dist/plaid/docs/banking/javascript/DOC.md +1163 -0
- package/dist/plaid/docs/banking/python/DOC.md +1203 -0
- package/dist/playwright-community/skills/login-flows/SKILL.md +108 -0
- package/dist/postmark/docs/transactional-email/DOC.md +1168 -0
- package/dist/prisma/docs/orm/javascript/DOC.md +1419 -0
- package/dist/prisma/docs/orm/python/DOC.md +1317 -0
- package/dist/qdrant/docs/vector-search/javascript/DOC.md +1221 -0
- package/dist/qdrant/docs/vector-search/python/DOC.md +1653 -0
- package/dist/rabbitmq/docs/message-queue/javascript/DOC.md +1193 -0
- package/dist/rabbitmq/docs/message-queue/python/DOC.md +1243 -0
- package/dist/razorpay/docs/payments/javascript/DOC.md +1219 -0
- package/dist/razorpay/docs/payments/python/DOC.md +1330 -0
- package/dist/redis/docs/key-value/javascript/DOC.md +1851 -0
- package/dist/redis/docs/key-value/python/DOC.md +2054 -0
- package/dist/registry.json +2817 -0
- package/dist/replicate/docs/model-hosting/DOC.md +1318 -0
- package/dist/resend/docs/email/DOC.md +1271 -0
- package/dist/salesforce/docs/crm/javascript/DOC.md +1241 -0
- package/dist/salesforce/docs/crm/python/DOC.md +1183 -0
- package/dist/search-index.json +1 -0
- package/dist/sendgrid/docs/email-api/javascript/DOC.md +371 -0
- package/dist/sendgrid/docs/email-api/python/DOC.md +656 -0
- package/dist/sentry/docs/error-tracking/javascript/DOC.md +1073 -0
- package/dist/sentry/docs/error-tracking/python/DOC.md +1309 -0
- package/dist/shopify/docs/storefront/DOC.md +457 -0
- package/dist/slack/docs/workspace/javascript/DOC.md +933 -0
- package/dist/slack/docs/workspace/python/DOC.md +271 -0
- package/dist/square/docs/payments/javascript/DOC.md +1855 -0
- package/dist/square/docs/payments/python/DOC.md +1728 -0
- package/dist/stripe/docs/api/DOC.md +1727 -0
- package/dist/stripe/docs/payments/DOC.md +1726 -0
- package/dist/stytch/docs/auth/javascript/DOC.md +1813 -0
- package/dist/stytch/docs/auth/python/DOC.md +1962 -0
- package/dist/supabase/docs/client/DOC.md +1606 -0
- package/dist/twilio/docs/messaging/python/DOC.md +469 -0
- package/dist/twilio/docs/messaging/typescript/DOC.md +946 -0
- package/dist/vercel/docs/platform/DOC.md +1940 -0
- package/dist/weaviate/docs/vector-db/javascript/DOC.md +1268 -0
- package/dist/weaviate/docs/vector-db/python/DOC.md +1388 -0
- package/dist/zendesk/docs/support/javascript/DOC.md +2150 -0
- package/dist/zendesk/docs/support/python/DOC.md +2297 -0
- package/package.json +22 -6
- package/skills/get-api-docs/SKILL.md +84 -0
- package/src/commands/annotate.js +83 -0
- package/src/commands/build.js +12 -1
- package/src/commands/feedback.js +150 -0
- package/src/commands/get.js +83 -42
- package/src/commands/search.js +7 -0
- package/src/index.js +43 -17
- package/src/lib/analytics.js +90 -0
- package/src/lib/annotations.js +57 -0
- package/src/lib/bm25.js +170 -0
- package/src/lib/cache.js +69 -6
- package/src/lib/config.js +8 -3
- package/src/lib/identity.js +99 -0
- package/src/lib/registry.js +103 -20
- package/src/lib/telemetry.js +86 -0
- package/src/mcp/server.js +177 -0
- package/src/mcp/tools.js +251 -0
@@ -0,0 +1,1688 @@
---
name: bigquery
description: "BigQuery API JavaScript/TypeScript coding guidelines using the official Node.js client library"
metadata:
  languages: "javascript"
  versions: "8.1.1"
  updated-on: "2026-03-02"
  source: maintainer
tags: "google,bigquery,data-warehouse,sql,analytics"
---

# BigQuery API Coding Guidelines (JavaScript/TypeScript)

You are a BigQuery API coding expert. Help me write code using the BigQuery API and the official Node.js client library.

Please follow these guidelines when generating code.

You can find the official SDK documentation and code samples here:
https://cloud.google.com/nodejs/docs/reference/bigquery/latest

## Golden Rule: Use the Correct and Current SDK

Always use the official Google Cloud BigQuery Node.js client library for all BigQuery API interactions. Do not use unofficial or deprecated libraries.

- **Library Name:** Google Cloud BigQuery Node.js Client
- **NPM Package:** `@google-cloud/bigquery`
- **Current Version:** 8.1.1
- **Do NOT use:** `bigquery` (deprecated standalone package)
- **Do NOT use:** Any unofficial BigQuery wrappers

**Installation:**

```bash
npm install @google-cloud/bigquery
```

**Correct Import:**

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
```

or with ES6 modules:

```javascript
import {BigQuery} from '@google-cloud/bigquery';
```

**Incorrect patterns to avoid:**
- `const bigquery = require('bigquery')` - Wrong package
- `import BigQuery from '@google-cloud/bigquery'` - Wrong import syntax (missing curly braces)
- Using any package other than `@google-cloud/bigquery`

## Installation and Setup

### Install the package

```bash
npm install @google-cloud/bigquery
```

### Authentication

BigQuery requires authentication. The client library uses Application Default Credentials (ADC).

**Set up authentication using one of these methods:**

**Option 1: Service Account Key File (Recommended for local development)**

```bash
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/service-account-key.json"
```

**Option 2: Application Default Credentials (Recommended for production)**

When running on Google Cloud (Cloud Functions, Cloud Run, GKE, etc.), authentication happens automatically.

**Option 3: Explicit credentials in code**

```javascript
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'your-project-id',
  keyFilename: '/path/to/service-account-key.json'
});
```

### Environment Variables

```javascript
// Set these environment variables
process.env.GOOGLE_APPLICATION_CREDENTIALS = '/path/to/service-account-key.json';
process.env.GOOGLE_CLOUD_PROJECT = 'your-project-id';
```

## Initialization

### Basic Client Initialization

```javascript
const {BigQuery} = require('@google-cloud/bigquery');

// Uses GOOGLE_APPLICATION_CREDENTIALS and GOOGLE_CLOUD_PROJECT env vars
const bigquery = new BigQuery();
```

### Client with Explicit Project ID

```javascript
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'your-project-id'
});
```

### Client with Explicit Credentials

```javascript
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'your-project-id',
  keyFilename: '/path/to/service-account-key.json'
});
```

### Client with Location

```javascript
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'your-project-id',
  location: 'US' // or 'EU', 'us-central1', etc.
});
```

## Working with Datasets

### Create a Dataset

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createDataset() {
  const datasetId = 'my_new_dataset';

  const [dataset] = await bigquery.createDataset(datasetId);
  console.log(`Dataset ${dataset.id} created.`);
}

createDataset();
```

### Create Dataset with Location

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createDatasetWithLocation() {
  const datasetId = 'my_new_dataset';

  const options = {
    location: 'US'
  };

  const [dataset] = await bigquery.createDataset(datasetId, options);
  console.log(`Dataset ${dataset.id} created in ${dataset.location}.`);
}

createDatasetWithLocation();
```

### List Datasets

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function listDatasets() {
  const [datasets] = await bigquery.getDatasets();

  console.log('Datasets:');
  datasets.forEach(dataset => console.log(dataset.id));
}

listDatasets();
```

### Get Dataset Reference

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

const dataset = bigquery.dataset('my_dataset');
```

### Check if Dataset Exists

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function datasetExists() {
  const dataset = bigquery.dataset('my_dataset');
  const [exists] = await dataset.exists();
  console.log(`Dataset exists: ${exists}`);
}

datasetExists();
```

### Delete a Dataset

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function deleteDataset() {
  const datasetId = 'my_dataset';

  await bigquery.dataset(datasetId).delete({force: true}); // force deletes all tables
  console.log(`Dataset ${datasetId} deleted.`);
}

deleteDataset();
```

## Working with Tables

### Create a Table

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createTable() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const schema = [
    {name: 'name', type: 'STRING', mode: 'REQUIRED'},
    {name: 'age', type: 'INTEGER', mode: 'NULLABLE'},
    {name: 'email', type: 'STRING', mode: 'NULLABLE'}
  ];

  const [table] = await bigquery
    .dataset(datasetId)
    .createTable(tableId, {schema: schema});

  console.log(`Table ${table.id} created.`);
}

createTable();
```

### Create Table with Advanced Schema

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createTableWithAdvancedSchema() {
  const datasetId = 'my_dataset';
  const tableId = 'my_advanced_table';

  const schema = [
    {name: 'id', type: 'INTEGER', mode: 'REQUIRED'},
    {name: 'name', type: 'STRING', mode: 'REQUIRED'},
    {name: 'timestamp', type: 'TIMESTAMP', mode: 'REQUIRED'},
    {name: 'scores', type: 'FLOAT', mode: 'REPEATED'}, // Array field
    {
      name: 'address',
      type: 'RECORD', // Nested/struct field
      mode: 'NULLABLE',
      fields: [
        {name: 'street', type: 'STRING'},
        {name: 'city', type: 'STRING'},
        {name: 'zipcode', type: 'STRING'}
      ]
    }
  ];

  const options = {
    schema: schema,
    location: 'US',
    timePartitioning: {
      type: 'DAY',
      field: 'timestamp'
    }
  };

  const [table] = await bigquery
    .dataset(datasetId)
    .createTable(tableId, options);

  console.log(`Table ${table.id} created with partitioning.`);
}

createTableWithAdvancedSchema();
```

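Schema entries follow the `{name, type, mode}` shape, with `fields` nesting a RECORD and `mode: 'REPEATED'` marking an array. A small local check of rows against such a schema can catch missing REQUIRED fields before any API call; `missingRequired` below is a hypothetical helper sketched for illustration, not part of the SDK:

```javascript
// Walk a BigQuery-style schema array and report any REQUIRED fields
// (including nested RECORD fields) that a row object fails to provide.
function missingRequired(schema, row, prefix = '') {
  const missing = [];
  for (const field of schema) {
    const value = row ? row[field.name] : undefined;
    if (field.mode === 'REQUIRED' && (value === undefined || value === null)) {
      missing.push(prefix + field.name);
    }
    if (field.type === 'RECORD' && value) {
      missing.push(...missingRequired(field.fields, value, prefix + field.name + '.'));
    }
  }
  return missing;
}

const schema = [
  {name: 'id', type: 'INTEGER', mode: 'REQUIRED'},
  {
    name: 'address',
    type: 'RECORD',
    mode: 'NULLABLE',
    fields: [{name: 'city', type: 'STRING', mode: 'REQUIRED'}]
  }
];

console.log(missingRequired(schema, {address: {}})); // -> [ 'id', 'address.city' ]
```

Running this before an insert or load makes schema violations visible locally instead of as row-level API errors.
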
### List Tables

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function listTables() {
  const datasetId = 'my_dataset';

  const [tables] = await bigquery.dataset(datasetId).getTables();

  console.log('Tables:');
  tables.forEach(table => console.log(table.id));
}

listTables();
```

### Get Table Metadata

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function getTableMetadata() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const table = bigquery.dataset(datasetId).table(tableId);
  const [metadata] = await table.getMetadata();

  console.log('Table metadata:');
  console.log(`Schema: ${JSON.stringify(metadata.schema.fields)}`);
  console.log(`Num rows: ${metadata.numRows}`);
  console.log(`Num bytes: ${metadata.numBytes}`);
  console.log(`Created: ${metadata.creationTime}`);
}

getTableMetadata();
```

### Get Table Reference

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

const table = bigquery.dataset('my_dataset').table('my_table');
```

### Check if Table Exists

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function tableExists() {
  const table = bigquery.dataset('my_dataset').table('my_table');
  const [exists] = await table.exists();
  console.log(`Table exists: ${exists}`);
}

tableExists();
```

### Delete a Table

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function deleteTable() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  await bigquery.dataset(datasetId).table(tableId).delete();
  console.log(`Table ${tableId} deleted.`);
}

deleteTable();
```

### Copy a Table

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function copyTable() {
  const srcDatasetId = 'my_dataset';
  const srcTableId = 'my_source_table';
  const destDatasetId = 'my_dataset';
  const destTableId = 'my_destination_table';

  const srcTable = bigquery.dataset(srcDatasetId).table(srcTableId);
  const destTable = bigquery.dataset(destDatasetId).table(destTableId);

  const [job] = await srcTable.copy(destTable);

  console.log(`Job ${job.id} completed. Table copied.`);
}

copyTable();
```

## Querying Data

### Simple Query

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryData() {
  const query = `
    SELECT name, age
    FROM \`my-project.my_dataset.my_table\`
    WHERE age > 25
    ORDER BY age DESC
    LIMIT 10
  `;

  const [rows] = await bigquery.query(query);

  console.log('Rows:');
  rows.forEach(row => console.log(`${row.name}: ${row.age}`));
}

queryData();
```

### Query with Location

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithLocation() {
  const query = 'SELECT name FROM `my-project.my_dataset.my_table` LIMIT 10';

  const options = {
    query: query,
    location: 'US'
  };

  const [rows] = await bigquery.query(options);

  rows.forEach(row => console.log(row.name));
}

queryWithLocation();
```

### Query Public Dataset

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryPublicDataset() {
  const query = `
    SELECT name, SUM(number) as total
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    WHERE state = 'TX'
    GROUP BY name, state
    ORDER BY total DESC
    LIMIT 10
  `;

  const [rows] = await bigquery.query(query);

  console.log('Top 10 names in Texas:');
  rows.forEach(row => console.log(`${row.name}: ${row.total}`));
}

queryPublicDataset();
```

### Query with Parameters (Parameterized Query)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithParameters() {
  const query = `
    SELECT name, age
    FROM \`my-project.my_dataset.my_table\`
    WHERE age > @min_age
    AND name LIKE @name_pattern
    LIMIT @limit
  `;

  const options = {
    query: query,
    params: {
      min_age: 25,
      name_pattern: 'John%',
      limit: 10
    }
  };

  const [rows] = await bigquery.query(options);

  rows.forEach(row => console.log(`${row.name}: ${row.age}`));
}

queryWithParameters();
```

### Query with Typed Parameters

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithTypedParameters() {
  const query = `
    SELECT name, timestamp
    FROM \`my-project.my_dataset.my_table\`
    WHERE timestamp > @start_date
    AND status IN UNNEST(@statuses)
  `;

  const options = {
    query: query,
    params: {
      start_date: '2024-01-01',
      statuses: ['active', 'pending']
    },
    types: {
      start_date: 'DATE',
      statuses: ['STRING']
    }
  };

  const [rows] = await bigquery.query(options);

  rows.forEach(row => console.log(row));
}

queryWithTypedParameters();
```

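Every `@name` placeholder in the SQL must have a matching key in `params`, or the query job fails. A plain-Node sanity check (no SDK involved; `checkParams` is a hypothetical helper, not part of the client library) can flag mismatches before the query is sent:

```javascript
// Compare @named placeholders in a query string against the keys of the
// params object that would be passed to bigquery.query().
function checkParams(sql, params) {
  const placeholders = [...new Set((sql.match(/@\w+/g) || []).map(p => p.slice(1)))];
  return {
    missing: placeholders.filter(name => !(name in params)),
    unused: Object.keys(params).filter(key => !placeholders.includes(key))
  };
}

const sql = 'SELECT name FROM `my_dataset.my_table` WHERE age > @min_age LIMIT @limit';
console.log(checkParams(sql, {min_age: 25}));
// -> { missing: [ 'limit' ], unused: [] }
```

A non-empty `missing` list means the query would be rejected; a non-empty `unused` list usually points at a typo in a placeholder name.
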
### Query with Job Configuration

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithJobConfig() {
  const query = 'SELECT * FROM `my-project.my_dataset.my_table` LIMIT 100';

  const options = {
    query: query,
    location: 'US',
    useLegacySql: false, // Use Standard SQL (default)
    useQueryCache: true,
    maximumBytesBilled: '1000000' // Set query cost limit
  };

  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  const [rows] = await job.getQueryResults();
  console.log(`Rows: ${rows.length}`);
}

queryWithJobConfig();
```

### Dry Run Query (Estimate Query Cost)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function dryRunQuery() {
  const query = 'SELECT * FROM `my-project.my_dataset.my_table`';

  const options = {
    query: query,
    dryRun: true
  };

  const [job] = await bigquery.createQueryJob(options);

  console.log('Query would process:');
  console.log(`${job.metadata.statistics.totalBytesProcessed} bytes`);
}

dryRunQuery();
```

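`totalBytesProcessed` comes back as a string of bytes. To turn a dry run into a rough cost figure, convert it against your on-demand rate; the `PRICE_PER_TIB` value below is an assumption for illustration, so check current BigQuery pricing for your project's region before relying on it:

```javascript
// Convert a dry-run byte count into an estimated on-demand query cost.
// PRICE_PER_TIB is an assumed rate, not an authoritative price.
const PRICE_PER_TIB = 6.25;

function estimateCostUsd(totalBytesProcessed) {
  const bytes = Number(totalBytesProcessed); // the API reports this as a string
  return (bytes / 2 ** 40) * PRICE_PER_TIB;
}

console.log(estimateCostUsd('1099511627776').toFixed(2)); // 1 TiB -> "6.25"
```

Pairing this with `maximumBytesBilled` (shown above) gives both an estimate and a hard ceiling on query spend.
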
### Check Query Job Status

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function checkJobStatus() {
  const query = 'SELECT * FROM `my-project.my_dataset.large_table`';

  const [job] = await bigquery.createQueryJob({query: query});

  console.log(`Job ${job.id} started.`);

  // Wait for job to complete
  const [rows] = await job.getQueryResults();

  // Get job metadata
  const [metadata] = await job.getMetadata();

  console.log('Job statistics:');
  console.log(`State: ${metadata.status.state}`);
  console.log(`Bytes processed: ${metadata.statistics.totalBytesProcessed}`);
  console.log(`Rows: ${rows.length}`);
}

checkJobStatus();
```

## Inserting Data

### Streaming Insert (Single Row)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertSingleRow() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const row = {
    name: 'John Doe',
    age: 30,
    email: 'john@example.com'
  };

  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .insert(row);

  console.log('Row inserted successfully.');
}

insertSingleRow();
```

### Streaming Insert (Multiple Rows)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertMultipleRows() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const rows = [
    {name: 'Tom Smith', age: 28, email: 'tom@example.com'},
    {name: 'Jane Doe', age: 32, email: 'jane@example.com'},
    {name: 'Bob Johnson', age: 45, email: 'bob@example.com'}
  ];

  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .insert(rows);

  console.log(`Inserted ${rows.length} rows successfully.`);
}

insertMultipleRows();
```

### Streaming Insert with Options

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertWithOptions() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const rows = [
    {name: 'Alice', age: 25, email: 'alice@example.com'},
    {name: 'Charlie', age: 35, email: 'charlie@example.com'}
  ];

  const options = {
    skipInvalidRows: true, // Skip rows with errors
    ignoreUnknownValues: true, // Ignore fields not in schema
    raw: false // Use field names, not raw API format
  };

  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .insert(rows, options);

  console.log('Rows inserted with options.');
}

insertWithOptions();
```

### Streaming Insert with Error Handling

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertWithErrorHandling() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const rows = [
    {name: 'Valid User', age: 30, email: 'valid@example.com'},
    {name: 'Invalid User', age: 'not a number', email: 'invalid@example.com'}
  ];

  try {
    await bigquery
      .dataset(datasetId)
      .table(tableId)
      .insert(rows);

    console.log('All rows inserted successfully.');
  } catch (error) {
    if (error.name === 'PartialFailureError') {
      console.error('Some rows failed to insert:');
      error.errors.forEach((err, index) => {
        console.error(`Row ${index}:`, err.errors);
      });
    } else {
      console.error('Insert error:', error);
    }
  }
}

insertWithErrorHandling();
```

|
|
755
|
+
## Loading Data from Files

### Load Data from Local CSV File

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadCSVFromFile() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const filename = '/path/to/file.csv';

  const metadata = {
    sourceFormat: 'CSV',
    skipLeadingRows: 1, // Skip the header row
    autodetect: true,   // Auto-detect the schema
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(filename, metadata);

  console.log(`Job ${job.id} completed. Data loaded.`);
}

loadCSVFromFile();
```

### Load Data from Cloud Storage (CSV)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadCSVFromGCS() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'CSV',
    skipLeadingRows: 1,
    schema: {
      fields: [
        {name: 'name', type: 'STRING'},
        {name: 'age', type: 'INTEGER'},
        {name: 'email', type: 'STRING'}
      ]
    },
    location: 'US',
    writeDisposition: 'WRITE_TRUNCATE' // Overwrite the table
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load('gs://my-bucket/data.csv', metadata);

  console.log(`Job ${job.id} completed.`);

  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    throw errors;
  }
}

loadCSVFromGCS();
```

### Load Data from Cloud Storage (JSON)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadJSONFromGCS() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    autodetect: true,
    location: 'US',
    writeDisposition: 'WRITE_APPEND' // Append to the table
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load('gs://my-bucket/data.json', metadata);

  console.log(`Job ${job.id} completed. JSON data loaded.`);
}

loadJSONFromGCS();
```

### Load Data from Multiple Files

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadFromMultipleFiles() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'CSV',
    skipLeadingRows: 1,
    autodetect: true,
    location: 'US'
  };

  const sourceUris = [
    'gs://my-bucket/data1.csv',
    'gs://my-bucket/data2.csv',
    'gs://my-bucket/data3.csv'
  ];

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(sourceUris, metadata);

  console.log(`Job ${job.id} completed. Multiple files loaded.`);
}

loadFromMultipleFiles();
```

### Load Data from Cloud Storage (Parquet)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function loadParquetFromGCS() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'PARQUET',
    location: 'US',
    writeDisposition: 'WRITE_TRUNCATE'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load('gs://my-bucket/data.parquet', metadata);

  console.log('Parquet data loaded successfully.');
}

loadParquetFromGCS();
```

### Load with createLoadJob

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createLoadJobExample() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'CSV',
    skipLeadingRows: 1,
    schema: {
      fields: [
        {name: 'id', type: 'INTEGER'},
        {name: 'name', type: 'STRING'},
        {name: 'value', type: 'FLOAT'}
      ]
    },
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .createLoadJob('gs://my-bucket/file.csv', metadata);

  console.log(`Load job ${job.id} started.`);

  // Wait for the job to complete
  const [completedJob] = await job.promise();

  console.log(`Job ${completedJob.id} completed.`);
}

createLoadJobExample();
```

## Exporting Data

### Export Table to Cloud Storage (CSV)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function exportTableToGCS() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const bucketName = 'my-bucket';
  const filename = 'export.csv';

  const destination = storage.bucket(bucketName).file(filename);

  const options = {
    format: 'CSV',
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(destination, options);

  console.log(`Job ${job.id} completed. Table exported to gs://${bucketName}/${filename}.`);
}

exportTableToGCS();
```

### Export Table to Cloud Storage (JSON)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function exportTableToJSON() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const bucketName = 'my-bucket';
  const filename = 'export.json';

  const destination = storage.bucket(bucketName).file(filename);

  const options = {
    format: 'JSON',
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(destination, options);

  console.log('Table exported to JSON.');
}

exportTableToJSON();
```

### Export Table with Compression

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function exportTableCompressed() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const bucketName = 'my-bucket';
  const filename = 'export.csv.gz';

  const destination = storage.bucket(bucketName).file(filename);

  const options = {
    format: 'CSV',
    gzip: true,
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(destination, options);

  console.log('Table exported with gzip compression.');
}

exportTableCompressed();
```

### Export to Multiple Files

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function exportToMultipleFiles() {
  const datasetId = 'my_dataset';
  const tableId = 'my_large_table';
  const bucketName = 'my-bucket';

  // Use a wildcard to split the export into multiple files
  const destination = storage.bucket(bucketName).file('export-*.csv');

  const options = {
    format: 'CSV',
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(destination, options);

  console.log('Table exported to multiple files.');
}

exportToMultipleFiles();
```

### Export with createExtractJob

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function createExtractJobExample() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  const bucketName = 'my-bucket';
  const filename = 'extract.csv';

  const destination = storage.bucket(bucketName).file(filename);

  const options = {
    format: 'CSV',
    printHeader: true,
    location: 'US'
  };

  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .createExtractJob(destination, options);

  console.log(`Extract job ${job.id} started.`);

  const [completedJob] = await job.promise();
  console.log(`Job ${completedJob.id} completed.`);
}

createExtractJobExample();
```

## DML Operations (INSERT, UPDATE, DELETE)

### INSERT with DML

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function insertWithDML() {
  const query = `
    INSERT INTO \`my-project.my_dataset.my_table\` (name, age, email)
    VALUES
      ('Alice', 25, 'alice@example.com'),
      ('Bob', 30, 'bob@example.com'),
      ('Charlie', 35, 'charlie@example.com')
  `;

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Rows inserted via DML.');
}

insertWithDML();
```

### UPDATE with DML

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function updateWithDML() {
  const query = `
    UPDATE \`my-project.my_dataset.my_table\`
    SET age = age + 1
    WHERE name = 'Alice'
  `;

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Rows updated via DML.');
}

updateWithDML();
```

### DELETE with DML

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function deleteWithDML() {
  const query = `
    DELETE FROM \`my-project.my_dataset.my_table\`
    WHERE age < 18
  `;

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Rows deleted via DML.');
}

deleteWithDML();
```

### MERGE (Upsert) with DML

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function mergeWithDML() {
  const query = `
    MERGE \`my-project.my_dataset.target_table\` T
    USING \`my-project.my_dataset.source_table\` S
    ON T.id = S.id
    WHEN MATCHED THEN
      UPDATE SET T.name = S.name, T.age = S.age
    WHEN NOT MATCHED THEN
      INSERT (id, name, age) VALUES (S.id, S.name, S.age)
  `;

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Merge operation completed.');
}

mergeWithDML();
```

### TRUNCATE Table

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function truncateTable() {
  const query = 'TRUNCATE TABLE `my-project.my_dataset.my_table`';

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Table truncated.');
}

truncateTable();
```

## Advanced Query Patterns

### Create Table As Select (CTAS)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createTableAsSelect() {
  const query = `
    CREATE OR REPLACE TABLE \`my-project.my_dataset.new_table\` AS
    SELECT name, age
    FROM \`my-project.my_dataset.source_table\`
    WHERE age > 25
  `;

  const [job] = await bigquery.createQueryJob({
    query: query,
    location: 'US'
  });

  await job.getQueryResults();
  console.log('Table created from query.');
}

createTableAsSelect();
```

### Query with Common Table Expressions (CTEs)

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithCTE() {
  const query = `
    WITH filtered_data AS (
      SELECT name, age, email
      FROM \`my-project.my_dataset.my_table\`
      WHERE age > 25
    ),
    aggregated_data AS (
      SELECT COUNT(*) as total
      FROM filtered_data
    )
    SELECT * FROM aggregated_data
  `;

  const [rows] = await bigquery.query(query);

  rows.forEach(row => console.log(row));
}

queryWithCTE();
```

### Query with Window Functions

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithWindowFunctions() {
  const query = `
    SELECT
      name,
      age,
      ROW_NUMBER() OVER (ORDER BY age DESC) as rank,
      AVG(age) OVER () as avg_age
    FROM \`my-project.my_dataset.my_table\`
    ORDER BY rank
    LIMIT 10
  `;

  const [rows] = await bigquery.query(query);

  rows.forEach(row => {
    console.log(`${row.rank}. ${row.name} (${row.age}) - Avg: ${row.avg_age}`);
  });
}

queryWithWindowFunctions();
```

### Query with Arrays and Structs

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryArraysAndStructs() {
  const query = `
    SELECT
      name,
      scores,
      ARRAY_LENGTH(scores) as num_scores,
      address.city,
      address.zipcode
    FROM \`my-project.my_dataset.my_table\`
    WHERE 90 IN UNNEST(scores)
  `;

  const [rows] = await bigquery.query(query);

  rows.forEach(row => console.log(row));
}

queryArraysAndStructs();
```

## Pagination and Iterating Results

### Paginate Query Results

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function paginateResults() {
  const query = 'SELECT * FROM `my-project.my_dataset.large_table`';

  const options = {
    query: query,
    maxResults: 100 // Page size
  };

  const [job] = await bigquery.createQueryJob(options);

  let pageToken = null;
  let pageCount = 0;

  do {
    const [rows, nextQuery, apiResponse] = await job.getQueryResults({
      autoPaginate: false, // Required to page manually with pageToken
      pageToken: pageToken,
      maxResults: 100
    });

    pageCount++;
    console.log(`Page ${pageCount}: ${rows.length} rows`);

    rows.forEach(row => {
      // Process each row
      console.log(row);
    });

    pageToken = apiResponse.pageToken;
  } while (pageToken);

  console.log(`Total pages: ${pageCount}`);
}

paginateResults();
```

### Stream Query Results

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function streamQueryResults() {
  const query = 'SELECT * FROM `my-project.my_dataset.large_table`';

  const [job] = await bigquery.createQueryJob({query: query});

  return new Promise((resolve, reject) => {
    let rowCount = 0;

    job.getQueryResultsStream()
      .on('data', row => {
        rowCount++;
        console.log(row);
      })
      .on('error', reject)
      .on('end', () => {
        console.log(`Processed ${rowCount} rows`);
        resolve();
      });
  });
}

streamQueryResults();
```

## Working with Views

### Create a View

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function createView() {
  const datasetId = 'my_dataset';
  const tableId = 'my_view';

  const query = `
    SELECT name, age
    FROM \`my-project.my_dataset.my_table\`
    WHERE age > 25
  `;

  const options = {
    view: query,
    location: 'US'
  };

  const [view] = await bigquery
    .dataset(datasetId)
    .createTable(tableId, options);

  console.log(`View ${view.id} created.`);
}

createView();
```

### Update a View

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function updateView() {
  const datasetId = 'my_dataset';
  const viewId = 'my_view';

  const newQuery = `
    SELECT name, age, email
    FROM \`my-project.my_dataset.my_table\`
    WHERE age > 30
  `;

  const table = bigquery.dataset(datasetId).table(viewId);

  await table.setMetadata({
    view: newQuery
  });

  console.log('View updated.');
}

updateView();
```

## Error Handling

### Complete Error Handling Example

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryWithErrorHandling() {
  const query = 'SELECT * FROM `my-project.my_dataset.my_table` LIMIT 10';

  try {
    const [rows] = await bigquery.query({
      query: query,
      location: 'US'
    });

    console.log(`Query returned ${rows.length} rows`);
    rows.forEach(row => console.log(row));
  } catch (error) {
    console.error('Error occurred:');
    console.error(`Message: ${error.message}`);

    if (error.code) {
      console.error(`Error code: ${error.code}`);
    }

    if (error.errors) {
      console.error('Detailed errors:');
      error.errors.forEach(err => console.error(err));
    }
  }
}

queryWithErrorHandling();
```

### Handle Partial Insert Errors

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function handlePartialInsertErrors() {
  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const rows = [
    {name: 'Valid', age: 25},
    {name: 'Invalid', age: 'not a number'},
    {name: 'Another Valid', age: 30}
  ];

  try {
    await bigquery
      .dataset(datasetId)
      .table(tableId)
      .insert(rows);

    console.log('All rows inserted successfully.');
  } catch (error) {
    if (error.name === 'PartialFailureError') {
      console.log('Some rows were inserted, but some failed:');

      error.errors.forEach((rowError, index) => {
        console.log(`Row ${index} errors:`);
        rowError.errors.forEach(err => {
          console.log(`  - ${err.reason}: ${err.message}`);
        });
      });
    } else {
      console.error('Complete failure:', error.message);
    }
  }
}

handlePartialInsertErrors();
```

## Working with Jobs

### List Recent Jobs

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function listJobs() {
  const [jobs] = await bigquery.getJobs({
    maxResults: 10,
    allUsers: false
  });

  console.log('Recent jobs:');
  jobs.forEach(job => {
    console.log(`${job.id} - ${job.metadata.status.state}`);
  });
}

listJobs();
```

### Get Job Details

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function getJobDetails(jobId) {
  const job = bigquery.job(jobId);
  const [metadata] = await job.getMetadata();

  console.log('Job details:');
  console.log(`ID: ${metadata.id}`);
  console.log(`State: ${metadata.status.state}`);
  console.log(`Created: ${metadata.statistics.creationTime}`);
  console.log(`Started: ${metadata.statistics.startTime}`);
  console.log(`Ended: ${metadata.statistics.endTime}`);

  if (metadata.statistics.query) {
    console.log(`Bytes processed: ${metadata.statistics.query.totalBytesProcessed}`);
  }
}

getJobDetails('your-job-id');
```

### Cancel a Job

```javascript
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function cancelJob(jobId) {
  const job = bigquery.job(jobId);

  await job.cancel();
  console.log(`Job ${jobId} cancelled.`);
}

cancelJob('your-job-id');
```

## Important Notes

### Streaming Insert Limitations

- Streaming inserts are available immediately for querying, but may take up to 90 minutes to become available for copy, export, or DML operations
- Tables with data in the streaming buffer cannot be modified with UPDATE/DELETE DML statements
- Use batch loading for large data volumes to reduce costs

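Because DML cannot touch rows that are still buffered, one way to guard a DML statement is to check the table's `streamingBuffer` metadata field first. A minimal sketch under that assumption; the dataset/table names are placeholders and `safeToRunDML` is a hypothetical helper, not part of the client API:

```javascript
// Returns true while the table metadata reports a streaming buffer.
// `streamingBuffer` only appears in the table resource while recently
// streamed rows have not yet been flushed to managed storage.
function hasStreamingBuffer(tableMetadata) {
  return Boolean(tableMetadata.streamingBuffer);
}

// Hypothetical guard: resolves to true only when DML is safe to attempt.
async function safeToRunDML(datasetId, tableId) {
  const {BigQuery} = require('@google-cloud/bigquery'); // loaded lazily
  const bigquery = new BigQuery();

  const [metadata] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .getMetadata();

  return !hasStreamingBuffer(metadata);
}
```
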
### DML Quotas

- Maximum 1,000 DML statements per table per day
- Use batch operations when possible to stay within quota limits

### Query Costs

- Queries are billed based on bytes processed
- Use `dryRun: true` to estimate query costs before running
- Use partitioned tables and clustering to reduce query costs
- Set `maximumBytesBilled` to prevent accidentally expensive queries

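The `dryRun` and `maximumBytesBilled` points above can be combined into a small cost guard. A sketch (the table name is a placeholder; a dry-run job is validated and estimated, never executed or billed):

```javascript
// Ask BigQuery to validate the query and report how many bytes a real
// run would scan, without executing anything.
const dryRunOptions = {
  query: 'SELECT name, age FROM `my-project.my_dataset.my_table`',
  dryRun: true,
  location: 'US'
};

async function estimateBytesProcessed() {
  const {BigQuery} = require('@google-cloud/bigquery'); // loaded lazily
  const bigquery = new BigQuery();

  const [job] = await bigquery.createQueryJob(dryRunOptions);
  // Dry-run jobs complete immediately; the estimate is in the statistics.
  return Number(job.metadata.statistics.totalBytesProcessed);
}
```

For the real run, drop `dryRun` and set `maximumBytesBilled` (a string, in bytes) so the job fails fast instead of overspending.
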
### Data Types

Available BigQuery data types:

- **Numeric**: INTEGER (INT64), NUMERIC, BIGNUMERIC, FLOAT (FLOAT64)
- **String**: STRING, BYTES
- **Date/Time**: DATE, DATETIME, TIME, TIMESTAMP
- **Boolean**: BOOLEAN (BOOL)
- **Complex**: ARRAY, STRUCT (RECORD), GEOGRAPHY, JSON

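In the Node.js client these types appear in schema definitions, where ARRAY is expressed as `mode: 'REPEATED'` and STRUCT as `type: 'RECORD'` with nested fields. A sketch with placeholder field names, matching the `scores`/`address` columns queried earlier:

```javascript
// A schema object usable in table creation or load-job metadata.
const schema = {
  fields: [
    {name: 'name', type: 'STRING', mode: 'REQUIRED'},
    {name: 'age', type: 'INTEGER'},
    {name: 'signup_date', type: 'DATE'},
    {name: 'scores', type: 'INTEGER', mode: 'REPEATED'}, // ARRAY<INT64>
    {
      name: 'address', // STRUCT with two nested STRING fields
      type: 'RECORD',
      fields: [
        {name: 'city', type: 'STRING'},
        {name: 'zipcode', type: 'STRING'}
      ]
    }
  ]
};
```
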
### Best Practices

- Always specify location for datasets and queries to avoid cross-region data transfer costs
- Use parameterized queries to improve query caching and prevent SQL injection
- Use batch loading instead of streaming for large data volumes
- Enable query cache when possible to reduce costs
- Use partitioned and clustered tables for better query performance
- Set timeouts and cost limits for queries in production
- Handle errors gracefully, especially for streaming inserts

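The parameterized-query practice looks like the following with named parameters. A sketch; `@min_age` and the table name are placeholders:

```javascript
// Values are bound through `params`, so user input is never concatenated
// into the SQL text, and identical statements can share the query cache.
const parameterizedOptions = {
  query:
    'SELECT name, age FROM `my-project.my_dataset.my_table` ' +
    'WHERE age > @min_age',
  params: {min_age: 25},
  location: 'US'
};

async function runParameterizedQuery() {
  const {BigQuery} = require('@google-cloud/bigquery'); // loaded lazily
  const bigquery = new BigQuery();

  const [rows] = await bigquery.query(parameterizedOptions);
  return rows;
}
```
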
## Useful Links

- Official Documentation: https://cloud.google.com/bigquery/docs
- Node.js Client Reference: https://cloud.google.com/nodejs/docs/reference/bigquery/latest
- BigQuery Pricing: https://cloud.google.com/bigquery/pricing
- SQL Reference: https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax
- Quotas and Limits: https://cloud.google.com/bigquery/quotas