embulk-input-bigquery_extract_files 0.0.9 → 0.0.14

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 55f1e7dc3d4dcd5ebe905f42c289c4034702718d
-  data.tar.gz: eda2e0485a70f8fa63e6f4c292a792bd2af7633a
+  metadata.gz: be7d0d070196d522edcb8bf289e5d5156acfef52
+  data.tar.gz: da58c00653e0a96db45475fc2c8f91012d620c2b
 SHA512:
-  metadata.gz: 30debdbcaed3653c29a66a327d5799c4af50fb1783e20244b8aa8ca539dfbcef3a093d159d5678ae684bee2900af44aa7d884a3dc950cd79ecf3eab92eec652e
-  data.tar.gz: c7f538f8dcb10b1799e6c38d4e0e9c5477da032a13ce3bede7ba03830b98c7f7961b1c87e5d141c7313e2f1781a141b19466a942ab826cfe67bd2c481586c913
+  metadata.gz: 1076f9ac8e7fca9c6ec6e4558310700591c31249d9ea6bd76022ad2787a877f908338df9c64c77f23edb005cfb1f651a449f580affc09b49081acb8bb05b4053
+  data.tar.gz: 5510bc7e0b676b152dcd6ab31bc56ee54a2a5fc8009e93fb1b09c928fd08c3ef1242ce85bfa7a6fb1423cf64998414d1a295def64776a6b43f39f1dff8a2c36b
data/README.md CHANGED
@@ -4,9 +4,9 @@ embulk file input plugin.
 
 - embulk : http://www.embulk.org/docs/
 
-- embulk plugins : http://www.embulk.org/plugins/
+- embulk plugins : https://plugins.embulk.org/
 
-Read files stored in Google Cloud Storage that extracted from Google Cloud Bigquery's table or query result.
+Reads files stored on Google Cloud Storage that extracted from bigquery table or query result
 
 ## Overview
 
@@ -16,9 +16,9 @@ Read files stored in Google Cloud Storage that extracted from Google Cloud Bigqu
 
 ### Detail
 
-Read files stored in Google Cloud Storage, that exported from Google Cloud Bigquery's table or query result.
+Reads files stored on Google Cloud Storage that extracted from bigquery table or query result
 
-Maybe solution for very big data in bigquery.
+Maybe solution for download very big data in bigquery.
 
 If you set **table** config without **query** config,
 then just extract table to Google Cloud Storage.
@@ -26,6 +26,7 @@ then just extract table to Google Cloud Storage.
 If you set **query** config,
 then query result save to temp table and then extracted that temp table to Google Cloud Storage uri.
 see : https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.extract
+
 
 ## Usage
 
@@ -35,6 +36,12 @@ see : https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuratio
 embulk gem install embulk-input-bigquery_extract_files
 ```
 
+### Update plugin (latest version : 0.0.13)
+
+```bash
+embulk gem update embulk-input-bigquery_extract_files
+```
+
 * rubygem url : https://rubygems.org/profiles/jo8937
 
 
@@ -42,6 +49,7 @@ embulk gem install embulk-input-bigquery_extract_files
 
 - **project**: Google Cloud Platform (gcp) project id (string, required)
 - **json_keyfile**: gcp service account's private key with json (string, required)
+- **location**: location of bigquery dataset and temp_dataset. see : https://cloud.google.com/bigquery/docs/locations (Optional) (string, default: `US`)
 - **gcs_uri**: bigquery result saved uri. bucket and path names parsed from this uri. (string, required)
 - **temp_local_path**: extract files download directory in local machine (string, required)
 
@@ -63,6 +71,8 @@ embulk gem install embulk-input-bigquery_extract_files
 
 - **bigquery_job_wait_second**: bigquery job waiting second. (Optional) (string, default: `600`)
 
+- **throw_bigquery_job_wait_timeout**: throw exception when bigquery job waiting second timeout. (Optional) (string, default: `false`)
+
 - **cleanup_gcs_before_executing**: delete all file in gcs temp path before process start (Optional) (string, default: `true`)
 
 - **cleanup_gcs_files**: delete all file in gcs temp path after process end (Optional) (string, default: `false`)
@@ -71,6 +81,8 @@ embulk gem install embulk-input-bigquery_extract_files
 
 - **cleanup_local_temp_files**: delete all file in local temp dir (Optional) (string, default: `true`)
 
+- **direct_download_enabled**: gcs download option. see : https://developers.google.com/api-client-library/java/google-api-java-client/reference/1.19.1/com/google/api/client/googleapis/media/MediaHttpDownloader#setDirectDownloadEnabled(boolean) (Optional) (string, default: `false`)
+
 - **decoders**: embulk java-file-input plugin's default attribute. see : http://www.embulk.org/docs/built-in.html#gzip-decoder-plugin
 - **parser**: embulk java-file-input plugin's default .attribute see : http://www.embulk.org/docs/built-in.html#csv-parser-plugin
 
@@ -161,6 +173,20 @@ out:
 $ ./gradlew gem # -t to watch change of files and rebuild continuously
 ```
 
+
+## Plugin maintenance
+
+for old version user
+
+### Remove plugin specific version
+
+```bash
+embulk gem uninstall embulk-input-bigquery_extract_files --version 0.0.13
+```
+
+* rubygem url : https://rubygems.org/profiles/jo8937
+
+
 # Another choice
 
 This plugin useful for file-input type. but maybe so complicated to use.
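Taken together, the options this release adds to the README (`location`, `throw_bigquery_job_wait_timeout`, `direct_download_enabled`, plus `throw_bigquery_job_includes_error` from the plugin code below) slot into the plugin's existing embulk YAML config. A hedged sketch — the project id, keyfile path, query, bucket, and local path are placeholders, not values from this diff:

```yaml
in:
  type: bigquery_extract_files
  project: my-gcp-project                      # placeholder
  json_keyfile: /path/to/service_account.json  # placeholder
  location: asia-northeast1                    # new in 0.0.14: dataset/temp_dataset location (default US)
  query: 'SELECT * FROM my_dataset.my_table'   # placeholder query
  gcs_uri: gs://my-bucket/embulk-temp/export_*.csv
  temp_local_path: /tmp/embulk-bq-files
  bigquery_job_wait_second: 600
  throw_bigquery_job_wait_timeout: true        # new: raise on timeout instead of only logging
  throw_bigquery_job_includes_error: true      # new: fail when a DONE job carries an errorResult
  direct_download_enabled: false               # new: passed through to MediaHttpDownloader
  parser:
    type: csv
out:
  type: stdout
```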
build.gradle CHANGED
@@ -13,7 +13,7 @@ configurations {
     provided
 }
 
-version = "0.0.9"
+version = "0.0.14"
 
 sourceCompatibility = 1.7
 targetCompatibility = 1.7
@@ -22,11 +22,12 @@ dependencies {
     compile ("org.embulk:embulk-core:0.8.36")
     provided("org.embulk:embulk-core:0.8.36")
 
-    compile "com.google.http-client:google-http-client-jackson2:1.21.0"
+    compile "com.google.guava:guava:16+"
+    compile "com.google.http-client:google-http-client-jackson2:1.28.0"
     //compile ('com.google.apis:google-api-services-bigquery:v2-rev363-1.23.0') {exclude module: "guava-jdk5"}
-    compile ('com.google.apis:google-api-services-bigquery:v2-rev402-1.25.0') {exclude module: "guava-jdk5"}
+    compile ('com.google.apis:google-api-services-bigquery:v2-rev429-1.25.0') {exclude module: "guava-jdk5"}
     //compile ("com.google.apis:google-api-services-storage:v1-rev59-1.21.0") {exclude module: "guava-jdk5"}
-    compile ("com.google.apis:google-api-services-storage:v1-rev136-1.25.0") {exclude module: "guava-jdk5"}
+    compile ("com.google.apis:google-api-services-storage:v1-rev150-1.25.0") {exclude module: "guava-jdk5"}
 
     testCompile "junit:junit:4.+"
     testCompile "org.embulk:embulk-core:0.8.36:tests"
src/main/java/org/embulk/input/bigquery_export_gcs/BigqueryExportGcsFileInputPlugin.java CHANGED
@@ -1,9 +1,6 @@
 package org.embulk.input.bigquery_export_gcs;
 
-import java.io.File;
-import java.io.FileNotFoundException;
-import java.io.IOException;
-import java.io.InputStream;
+import java.io.*;
 import java.nio.file.Path;
 import java.util.List;
 
@@ -29,8 +26,6 @@ import org.slf4j.Logger;
 import com.google.api.services.bigquery.Bigquery;
 import com.google.common.base.Optional;
 
-import io.airlift.slice.RuntimeIOException;
-
 /**
  *
  *
@@ -60,6 +55,10 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     @Config("json_keyfile")
     public String getJsonKeyfile();
 
+    @Config("location")
+    @ConfigDefault("\"US\"")
+    public Optional<String> getLocation();
+
     @Config("dataset")
     @ConfigDefault("null")
     public Optional<String> getDataset();
@@ -71,7 +70,8 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     @Config("query")
     @ConfigDefault("null")
     public Optional<String> getQuery();
-
+    public void setQuery(Optional<String> tempDataset);
+
     @Config("file_format")
     @ConfigDefault("\"CSV\"")
     public Optional<String> getFileFormat();
@@ -123,7 +123,8 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     @Config("bigquery_job_wait_second")
     @ConfigDefault("600")
     public Optional<Integer> getBigqueryJobWaitingSecond();
-
+    public void setBigqueryJobWaitingSecond(Optional<Integer> second);
+
     @Config("cleanup_gcs_files")
     @ConfigDefault("false")
     public boolean getCleanupGcsTempFiles();
@@ -168,9 +169,34 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
 
     public String getWorkId();
     public void setWorkId(String temp);
-
+
+
+    @Config("direct_download_enabled")
+    @ConfigDefault("false")
+    public boolean getDirectDownloadEnabled();
     //public Schema getSchemaConfig();
     //public void setSchameConfig(SchemaConfig schema);
+
+    /**
+     * 2020.11.16 sometimes, bigquery extract job just very slow. not becouse of error. then workflow must throw exception for alert message
+     * @return
+     */
+    @Config("throw_bigquery_job_wait_timeout")
+    @ConfigDefault("false")
+    public boolean getThrowBigqueryJobWaitTimeout();
+    public void setThrowBigqueryJobWaitTimeout(boolean toThrow);
+
+    /**
+     * 2020.11.18 sometime, bigquery job return "DONE" but include errors.
+     * DONE does not mean job success.
+     * https://cloud.google.com/bigquery/docs/running-jobs#bigquery_create_job-java
+     *
+     * @return
+     */
+    @Config("throw_bigquery_job_includes_error")
+    @ConfigDefault("false")
+    public boolean getThrowBigqueryJobIncludesError();
+    public void setThrowBigqueryJobIncludesError(boolean toThrow);
 }
 
 @Override
@@ -194,7 +220,7 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     log.info("create local downlaod path : {}", localPath);
     boolean ok = localPath.mkdirs();
     if(!ok){
-        throw new RuntimeIOException(new IOException("local path create fail : " + localPath));
+        throw new UncheckedIOException (new IOException("local path create fail : " + localPath));
     }
 }
 }
@@ -246,7 +272,7 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     BigqueryExportUtils.executeQueryToDestinationWorkTable(bigquery, task);
 } catch (IOException e) {
     log.error("bigquery io error",e);
-    throw new RuntimeIOException(e);
+    throw new UncheckedIOException(e);
 } catch (InterruptedException e) {
     log.error("bigquery job error",e);
     throw new RuntimeException(e);
@@ -260,7 +286,7 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     return schema;
 } catch (IOException e) {
     log.error("bigquery io error",e);
-    throw new RuntimeIOException(e);
+    throw new UncheckedIOException (e);
 } catch (InterruptedException e) {
     log.error("bigquery job error",e);
     throw new RuntimeException(e);
@@ -276,7 +302,7 @@ public class BigqueryExportGcsFileInputPlugin implements FileInputPlugin
     return BigqueryExportUtils.getFileListFromGcs(task);
 } catch (IOException e) {
     log.error("GCS api call error");
-    throw new RuntimeIOException(e);
+    throw new UncheckedIOException (e);
 }
 
 }
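This release drops `io.airlift.slice.RuntimeIOException` in favor of the JDK's own `java.io.UncheckedIOException` (available since Java 8), removing a dependency on airlift for exception wrapping. A minimal, self-contained sketch of the pattern, assuming nothing beyond the JDK (class and helper names here are illustrative, not from the plugin):

```java
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;

public class UncheckedIoDemo {

    // The pattern the plugin switched to: wrap the checked IOException in
    // java.io.UncheckedIOException instead of io.airlift.slice.RuntimeIOException.
    static void mkdirsOrThrow(File dir) {
        if (!dir.exists() && !dir.mkdirs()) {
            throw new UncheckedIOException(new IOException("local path create fail : " + dir));
        }
    }

    // Demo helper: returns "created" on success, or the wrapped cause's message.
    static String tryCreate(File dir) {
        try {
            mkdirsOrThrow(dir);
            return "created";
        } catch (UncheckedIOException e) {
            return e.getCause().getMessage();
        }
    }

    public static void main(String[] args) throws IOException {
        // A directory cannot be created underneath a regular file,
        // so this exercises the failure branch.
        File regularFile = File.createTempFile("not-a-dir", ".tmp");
        regularFile.deleteOnExit();
        System.out.println(tryCreate(new File(regularFile, "child")));
    }
}
```

Callers can still catch the original `IOException` via `getCause()`, which `RuntimeIOException` also exposed, so the change is behavior-preserving for callers.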
src/main/java/org/embulk/input/bigquery_export_gcs/BigqueryExportUtils.java CHANGED
@@ -1,14 +1,10 @@
 package org.embulk.input.bigquery_export_gcs;
 
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.FileNotFoundException;
-import java.io.FileOutputStream;
-import java.io.IOException;
-import java.io.InputStream;
+import java.io.*;
 import java.math.BigInteger;
 import java.nio.file.FileSystems;
 import java.nio.file.Path;
+import java.util.Collections;
 import java.util.Date;
 import java.util.List;
 import java.util.UUID;
@@ -33,7 +29,7 @@ import com.google.api.client.http.HttpTransport;
 import com.google.api.client.http.javanet.NetHttpTransport;
 import com.google.api.client.json.JsonFactory;
 import com.google.api.client.json.jackson2.JacksonFactory;
-import com.google.api.client.repackaged.com.google.common.base.Strings;
+import com.google.common.base.Strings;
 import com.google.api.services.bigquery.Bigquery;
 import com.google.api.services.bigquery.Bigquery.Jobs.Insert;
 import com.google.api.services.bigquery.Bigquery.Tabledata;
@@ -59,8 +55,6 @@ import com.google.common.base.Optional;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.Lists;
 
-import io.airlift.slice.RuntimeIOException;
-
 /**
  *
  *
@@ -143,7 +137,7 @@ public class BigqueryExportUtils
 
 log.info("query to Table jobId : {} : waiting for job end...",jobId);
 
-Job lastJob = waitForJob(bigquery, task.getProject(), jobId, task.getBigqueryJobWaitingSecond().get());
+Job lastJob = waitForJob(bigquery, task.getProject(), jobId, task.getLocation().get(), task.getBigqueryJobWaitingSecond().get(), task.getThrowBigqueryJobWaitTimeout(), task.getThrowBigqueryJobIncludesError());
 
 log.debug("waiting for job end....... {}", lastJob.toPrettyString());
 }
@@ -222,7 +216,10 @@ public class BigqueryExportUtils
 do {
     objects = listRequest.execute();
     if(objects.getItems() == null){
-        log.error("file not found in gs://{}/{}",bucket,blobName);
+
+        String errorMessage = String.format("file not found in gs://%s/%s",bucket,blobName);
+        log.error(errorMessage);
+
         return builder.build();
     }
     for(StorageObject obj : objects.getItems()){
@@ -339,32 +336,46 @@ public class BigqueryExportUtils
 log.info("extract jobId : {}",jobId);
 log.debug("waiting for job end....... ");
 
-Job lastJob = waitForJob(bigquery, task.getProject(), jobId, task.getBigqueryJobWaitingSecond().get());
+Job lastJob = waitForJob(bigquery, task.getProject(), jobId, task.getLocation().get(), task.getBigqueryJobWaitingSecond().get(), task.getThrowBigqueryJobWaitTimeout(), task.getThrowBigqueryJobIncludesError());
 
 log.info("table extract result : {}",lastJob.toPrettyString());
 
 return embulkSchema;
 }
 
-public static Job waitForJob(Bigquery bigquery, String project, String jobId, int bigqueryJobWaitingSecond) throws IOException, InterruptedException{
+public static Job waitForJob(Bigquery bigquery, String project, String jobId, String location, int bigqueryJobWaitingSecond, boolean exceptionWhenTimeout, boolean exceptionWhenErrorResult) throws IOException, InterruptedException{
 int maxAttempts = bigqueryJobWaitingSecond;
 int initialRetryDelay = 1000; // ms
 Job pollingJob = null;
 log.info("waiting for job end : {}",jobId);
 int tryCnt = 0;
 for (tryCnt=0; tryCnt < maxAttempts; tryCnt++){
-    pollingJob = bigquery.jobs().get(project, jobId).execute();
+    pollingJob = bigquery.jobs().get(project, jobId).setLocation(location).execute();
     String state = pollingJob.getStatus().getState();
     log.debug("Job Status {} : {}",jobId, state);
-
+
+    // 2020-11-18 DONE is not means "no error" then, we must handle it explictly
+    if(exceptionWhenErrorResult){
+        if(pollingJob.getStatus().getErrorResult() != null){
+            throw new IOException(pollingJob.getStatus().getErrorResult().getMessage());
+        }
+    }
+
     if (pollingJob.getStatus().getState().equals("DONE")) {
-        break;
+        break;
     }
-    log.info("waiting {} ... ",tryCnt);
+    log.info("waiting {} ... {} ", tryCnt,state);
     Thread.sleep(initialRetryDelay);
 }
-if(tryCnt + 1 == maxAttempts){
-    log.error("Bigquery Job Waiting exceed : over {} second...", bigqueryJobWaitingSecond);
+if(tryCnt + 1 >= maxAttempts){
+
+    String errorMessage = String.format("Bigquery Job [%s] Waiting timeout : over %s second...", jobId, bigqueryJobWaitingSecond);
+    if(exceptionWhenTimeout){
+        throw new IOException(errorMessage);
+    }else{
+        log.error(errorMessage);
+    }
+
 }
 
 return pollingJob;
@@ -393,7 +404,7 @@ public class BigqueryExportUtils
 log.info("Start download : gs://{}/{} ...to ... {} ",task.getGcsBucket(), file, task.getTempLocalPath());
 
 Storage.Objects.Get getObject = gcs.objects().get(task.getGcsBucket(), file);
-getObject.getMediaHttpDownloader().setDirectDownloadEnabled(true);
+getObject.getMediaHttpDownloader().setDirectDownloadEnabled( task.getDirectDownloadEnabled() );
 
 // return getObject.executeMediaAsInputStream() // direct InputStream ?? I Think this is faster then temp file. but ...
 
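The reworked `waitForJob` is the heart of this release: the poll is now pinned to a `location`, and the two new flags turn silently-logged failures into exceptions. A simplified sketch of that contract, with the BigQuery client replaced by a plain `Supplier` and the 1-second sleep omitted (the `Status` holder and all names are illustrative, and the sketch uses modern Java for brevity even though the plugin itself targets 1.7):

```java
import java.io.IOException;
import java.util.Iterator;
import java.util.List;
import java.util.function.Supplier;

public class WaitForJobSketch {

    /** Illustrative stand-in for the two fields of the job status the plugin inspects. */
    static final class Status {
        final String state;        // e.g. "PENDING" / "RUNNING" / "DONE"
        final String errorResult;  // non-null message means the job failed
        Status(String state, String errorResult) {
            this.state = state;
            this.errorResult = errorResult;
        }
    }

    /**
     * Poll until DONE or maxAttempts is exhausted.
     * exceptionWhenErrorResult: "DONE" does not imply success, so a non-null
     *   errorResult raises instead of being passed through.
     * exceptionWhenTimeout: exhausting the attempts raises instead of only logging.
     */
    static Status waitForJob(Supplier<Status> poll, int maxAttempts,
                             boolean exceptionWhenTimeout,
                             boolean exceptionWhenErrorResult) throws IOException {
        Status status = null;
        boolean done = false;
        for (int tryCnt = 0; tryCnt < maxAttempts; tryCnt++) {
            status = poll.get();
            if (exceptionWhenErrorResult && status.errorResult != null) {
                throw new IOException(status.errorResult);
            }
            if ("DONE".equals(status.state)) {
                done = true;
                break;
            }
            // the real plugin sleeps 1000 ms here between polls
        }
        if (!done && exceptionWhenTimeout) {
            throw new IOException("bigquery job waiting timeout : over " + maxAttempts + " attempts");
        }
        return status;
    }

    public static void main(String[] args) throws IOException {
        Iterator<Status> states = List.of(
                new Status("RUNNING", null),
                new Status("DONE", null)).iterator();
        System.out.println(waitForJob(states::next, 10, true, true).state); // prints DONE
    }
}
```

Note the plugin approximates the timeout check with `tryCnt + 1 >= maxAttempts`, which can also fire when the job completes on the very last attempt; the `done` flag above sidesteps that edge.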
src/test/java/org/embulk/input/bigquery_export_gcs/TestGoogleCloudAccessData.java CHANGED
@@ -4,19 +4,17 @@ import java.io.FileNotFoundException;
 import java.io.IOException;
 import java.io.InputStream;
 
+import com.google.common.base.Optional;
 import org.junit.Test;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import javax.validation.constraints.AssertTrue;
+
 public class TestGoogleCloudAccessData extends UnitTestInitializer
 {
     private static final Logger log = LoggerFactory.getLogger(TestGoogleCloudAccessData.class);
 
-    @Test
-    public void envTest(){
-        log.info("{}",System.getenv("GCP_PROJECT"));
-    }
-
     @Test
     public void testGcsInputStreamOpen() throws FileNotFoundException, IOException
     {
@@ -28,6 +26,34 @@ public class TestGoogleCloudAccessData extends UnitTestInitializer
 
         log.info("file size : {}",org.apache.commons.compress.utils.IOUtils.toByteArray(ins).length);
     }
-
+
+
+    @Test(expected=Exception.class)
+    public void testJobDoneButError() throws FileNotFoundException, IOException
+    {
+        BigqueryExportGcsFileInputPlugin.PluginTask task = config.loadConfig(BigqueryExportGcsFileInputPlugin.PluginTask.class );
+        task.setThrowBigqueryJobWaitTimeout(true);
+        task.setThrowBigqueryJobIncludesError(true);
+        task.setQuery(Optional.of("select a from b"));
+        plugin.executeBigqueryApi(task);
+
+        InputStream ins = BigqueryExportUtils.openInputStream(task, task.getFiles().get(0));
+        log.info("file size : {}",org.apache.commons.compress.utils.IOUtils.toByteArray(ins).length);
+
+    }
+
+
+    @Test(expected=Exception.class)
+    public void testJobWaitTimeout() throws FileNotFoundException, IOException
+    {
+        BigqueryExportGcsFileInputPlugin.PluginTask task = config.loadConfig(BigqueryExportGcsFileInputPlugin.PluginTask.class );
+        task.setThrowBigqueryJobWaitTimeout(true);
+        task.setBigqueryJobWaitingSecond(Optional.of(1));
+        plugin.executeBigqueryApi(task);
+
+        InputStream ins = BigqueryExportUtils.openInputStream(task, task.getFiles().get(0));
+        log.info("file size : {}",org.apache.commons.compress.utils.IOUtils.toByteArray(ins).length);
+
+    }
 
 }
src/test/java/org/embulk/input/bigquery_export_gcs/TestPluginFunctions.java CHANGED
@@ -53,4 +53,5 @@ public class TestPluginFunctions extends UnitTestInitializer
         assertEquals("", "bbb/ccc_", task.getGcsBlobNamePrefix());
     }
 
+
 }
src/test/java/org/embulk/input/bigquery_export_gcs/UnitTestInitializer.java CHANGED
@@ -83,7 +83,8 @@ public class UnitTestInitializer
             config.get(String.class, "temp_local_path")
         );
     }
-
+
+
     @Test
     public void testInitTask(){
         BigqueryExportGcsFileInputPlugin.PluginTask task = config.loadConfig(BigqueryExportGcsFileInputPlugin.PluginTask.class );
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: embulk-input-bigquery_extract_files
 version: !ruby/object:Gem::Version
-  version: 0.0.9
+  version: 0.0.14
 platform: ruby
 authors:
 - jo8937
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2018-08-28 00:00:00.000000000 Z
+date: 2020-11-20 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
@@ -63,21 +63,23 @@ files:
 - src/test/java/org/embulk/input/bigquery_export_gcs/TestGoogleCloudAccessData.java
 - src/test/java/org/embulk/input/bigquery_export_gcs/TestPluginFunctions.java
 - src/test/java/org/embulk/input/bigquery_export_gcs/UnitTestInitializer.java
-- classpath/commons-codec-1.10.jar
-- classpath/commons-logging-1.2.jar
-- classpath/embulk-input-bigquery_extract_files-0.0.9.jar
+- classpath/animal-sniffer-annotations-1.14.jar
+- classpath/checker-compat-qual-2.5.2.jar
+- classpath/embulk-input-bigquery_extract_files-0.0.14.jar
+- classpath/error_prone_annotations-2.1.3.jar
 - classpath/google-api-client-1.25.0.jar
-- classpath/google-api-services-bigquery-v2-rev402-1.25.0.jar
-- classpath/google-api-services-storage-v1-rev136-1.25.0.jar
-- classpath/google-http-client-1.25.0.jar
-- classpath/google-http-client-jackson2-1.25.0.jar
+- classpath/google-api-services-bigquery-v2-rev429-1.25.0.jar
+- classpath/google-api-services-storage-v1-rev150-1.25.0.jar
+- classpath/google-http-client-1.28.0.jar
+- classpath/google-http-client-jackson2-1.28.0.jar
 - classpath/google-oauth-client-1.25.0.jar
-- classpath/guava-20.0.jar
-- classpath/httpclient-4.5.5.jar
-- classpath/httpcore-4.4.9.jar
+- classpath/grpc-context-1.14.0.jar
+- classpath/guava-26.0-android.jar
 - classpath/j2objc-annotations-1.1.jar
 - classpath/jackson-core-2.9.6.jar
 - classpath/jsr305-3.0.2.jar
+- classpath/opencensus-api-0.18.0.jar
+- classpath/opencensus-contrib-http-util-0.18.0.jar
 homepage: https://github.com/jo8937/embulk-input-bigquery_extract_files
 licenses:
 - MIT