google-cloud-bigtable 0.6.1 → 1.0.1

Files changed (61)
  1. checksums.yaml +4 -4
  2. data/AUTHENTICATION.md +4 -26
  3. data/CHANGELOG.md +85 -0
  4. data/CONTRIBUTING.md +1 -1
  5. data/OVERVIEW.md +388 -19
  6. data/lib/google-cloud-bigtable.rb +19 -22
  7. data/lib/google/bigtable/admin/v2/bigtable_table_admin_services_pb.rb +1 -1
  8. data/lib/google/bigtable/v2/bigtable_pb.rb +3 -0
  9. data/lib/google/bigtable/v2/bigtable_services_pb.rb +1 -1
  10. data/lib/google/cloud/bigtable.rb +11 -17
  11. data/lib/google/cloud/bigtable/admin.rb +1 -1
  12. data/lib/google/cloud/bigtable/admin/v2.rb +1 -1
  13. data/lib/google/cloud/bigtable/admin/v2/bigtable_table_admin_client.rb +1 -1
  14. data/lib/google/cloud/bigtable/admin/v2/doc/google/iam/v1/policy.rb +42 -21
  15. data/lib/google/cloud/bigtable/app_profile.rb +162 -96
  16. data/lib/google/cloud/bigtable/app_profile/job.rb +5 -8
  17. data/lib/google/cloud/bigtable/app_profile/list.rb +18 -12
  18. data/lib/google/cloud/bigtable/chunk_processor.rb +24 -36
  19. data/lib/google/cloud/bigtable/cluster.rb +45 -18
  20. data/lib/google/cloud/bigtable/cluster/job.rb +3 -7
  21. data/lib/google/cloud/bigtable/cluster/list.rb +22 -20
  22. data/lib/google/cloud/bigtable/column_family.rb +18 -231
  23. data/lib/google/cloud/bigtable/column_family_map.rb +426 -0
  24. data/lib/google/cloud/bigtable/column_range.rb +15 -7
  25. data/lib/google/cloud/bigtable/convert.rb +12 -4
  26. data/lib/google/cloud/bigtable/errors.rb +4 -1
  27. data/lib/google/cloud/bigtable/gc_rule.rb +188 -69
  28. data/lib/google/cloud/bigtable/instance.rb +209 -189
  29. data/lib/google/cloud/bigtable/instance/cluster_map.rb +17 -13
  30. data/lib/google/cloud/bigtable/instance/job.rb +6 -5
  31. data/lib/google/cloud/bigtable/instance/list.rb +18 -13
  32. data/lib/google/cloud/bigtable/longrunning_job.rb +7 -1
  33. data/lib/google/cloud/bigtable/mutation_entry.rb +36 -39
  34. data/lib/google/cloud/bigtable/mutation_operations.rb +90 -73
  35. data/lib/google/cloud/bigtable/policy.rb +9 -5
  36. data/lib/google/cloud/bigtable/project.rb +87 -196
  37. data/lib/google/cloud/bigtable/read_modify_write_rule.rb +15 -10
  38. data/lib/google/cloud/bigtable/read_operations.rb +42 -59
  39. data/lib/google/cloud/bigtable/routing_policy.rb +172 -0
  40. data/lib/google/cloud/bigtable/row.rb +32 -21
  41. data/lib/google/cloud/bigtable/row_filter.rb +80 -35
  42. data/lib/google/cloud/bigtable/row_filter/chain_filter.rb +119 -68
  43. data/lib/google/cloud/bigtable/row_filter/condition_filter.rb +8 -2
  44. data/lib/google/cloud/bigtable/row_filter/interleave_filter.rb +117 -66
  45. data/lib/google/cloud/bigtable/row_filter/simple_filter.rb +24 -9
  46. data/lib/google/cloud/bigtable/row_range.rb +5 -0
  47. data/lib/google/cloud/bigtable/rows_mutator.rb +14 -21
  48. data/lib/google/cloud/bigtable/rows_reader.rb +23 -18
  49. data/lib/google/cloud/bigtable/sample_row_key.rb +6 -3
  50. data/lib/google/cloud/bigtable/service.rb +200 -253
  51. data/lib/google/cloud/bigtable/status.rb +76 -0
  52. data/lib/google/cloud/bigtable/table.rb +158 -262
  53. data/lib/google/cloud/bigtable/table/cluster_state.rb +17 -6
  54. data/lib/google/cloud/bigtable/table/list.rb +16 -9
  55. data/lib/google/cloud/bigtable/v2.rb +1 -1
  56. data/lib/google/cloud/bigtable/v2/bigtable_client.rb +12 -12
  57. data/lib/google/cloud/bigtable/v2/doc/google/bigtable/v2/bigtable.rb +16 -13
  58. data/lib/google/cloud/bigtable/value_range.rb +19 -13
  59. data/lib/google/cloud/bigtable/version.rb +1 -1
  60. metadata +67 -25
  61. data/lib/google/cloud/bigtable/table/column_family_map.rb +0 -70
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: 3d888a5ae0e03db4d343f1f472df17e34e6749305a18907309afc1e9aba13451
-   data.tar.gz: 921696b2987c12c2480bf8407b9d19c500bdbfd0ff2ab2401a58a3880ca357d5
+   metadata.gz: a043924866764ff0a48e0787f4cf3cda747470cbd81977734aac3ed8ff080cd8
+   data.tar.gz: ec6fdc94f142babade487a79415fba618ebcd3ec83a0ea8d3a8820dc6b3f4eea
  SHA512:
-   metadata.gz: 5f898010312460f67d88380f2252a83b26acfd2e1220c31a09bacc1026c87066ffe184369807fc2fadcf17e49e98cee4ae255aaf771dd54617cdb706ba829f91
-   data.tar.gz: 17358abe5747ae235d77a62cbb4d3e0030939bb254111fbdd2d1dfea8e520681a1dbc4376b9c2a36d0880167864e109b89f4f42317bb1a59a43d748452dff7ec
+   metadata.gz: f67a6084d021869cbbf8065a2d5bec0452ba92d2754832e33f041e06e19d6988037c922118d4a443db7ba741034519c274fd201cc09871ca4da92bbfbcd076bd
+   data.tar.gz: 4b1f0e5b94d521830fd7626b37d1be278d1f34b9e4cb8b95a14a7c9a8ddf17aa5c1697c9ef42f4b2fa31d1c90f8aec7c0103eacd12591a553440144f1414324b
data/AUTHENTICATION.md CHANGED
@@ -55,32 +55,10 @@ code.
 
  ### Google Cloud Platform environments
 
- While running on Google Cloud Platform environments such as Google Compute
- Engine, Google App Engine and Google Kubernetes Engine, no extra work is needed.
- The **Project ID** and **Credentials** and are discovered automatically. Code
- should be written as if already authenticated. Just be sure when you [set up the
- GCE instance][gce-how-to], you add the correct scopes for the APIs you want to
- access. For example:
-
- * **All APIs**
-   * `https://www.googleapis.com/auth/cloud-platform`
-   * `https://www.googleapis.com/auth/cloud-platform.read-only`
- * **BigQuery**
-   * `https://www.googleapis.com/auth/bigquery`
-   * `https://www.googleapis.com/auth/bigquery.insertdata`
- * **Compute Engine**
-   * `https://www.googleapis.com/auth/compute`
- * **Datastore**
-   * `https://www.googleapis.com/auth/datastore`
-   * `https://www.googleapis.com/auth/userinfo.email`
- * **DNS**
-   * `https://www.googleapis.com/auth/ndev.clouddns.readwrite`
- * **Pub/Sub**
-   * `https://www.googleapis.com/auth/pubsub`
- * **Storage**
-   * `https://www.googleapis.com/auth/devstorage.full_control`
-   * `https://www.googleapis.com/auth/devstorage.read_only`
-   * `https://www.googleapis.com/auth/devstorage.read_write`
+ When running on Google Cloud Platform (GCP), including Google Compute Engine (GCE),
+ Google Kubernetes Engine (GKE), Google App Engine (GAE), Google Cloud Functions
+ (GCF) and Cloud Run, the **Project ID** and **Credentials** are discovered
+ automatically. Code should be written as if already authenticated.
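
In practice, code running in any of these GCP environments can construct the client with no explicit project ID or keyfile. A minimal sketch, assuming an existing instance and table named `my-instance` and `my-table`:

```ruby
require "google/cloud/bigtable"

# On GCE, GKE, GAE, GCF, or Cloud Run the project ID and credentials
# are picked up from the environment; no arguments are needed.
bigtable = Google::Cloud::Bigtable.new

table = bigtable.table "my-instance", "my-table"
row   = table.read_row "user-1"
```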
 
  ### Environment Variables
 
data/CHANGELOG.md CHANGED
@@ -1,5 +1,90 @@
  # Release History
 
+ ### 1.0.1 / 2020-01-15
+
+ #### Documentation
+
+ * Update lower-level API documentation
+
+ ### 1.0.0 / 2019-12-03
+
+ #### Documentation
+
+ * Update release level to GA
+ * Add OVERVIEW.md guide with samples
+ * Add sample to README.md
+ * Fix samples and copy edit all in-line documentation
+ * Correct error in lower-level API Table IAM documentation
+ * Update lower-level API documentation to indicate attributes as required
+ * Update low-level IAM Policy documentation
+
+ ### 0.8.0 / 2019-11-01
+
+ #### ⚠ BREAKING CHANGES
+
+ * The following methods now raise Google::Cloud::Error instead of
+   Google::Gax::GaxError and/or GRPC::BadStatus:
+   * Table#mutate_row
+   * Table#read_modify_write_row
+   * Table#check_and_mutate_row
+   * Table#sample_row_keys
+
+ #### Features
+
+ * Raise Google::Cloud::Error from Table#mutate_row, Table#read_modify_write_row,
+   Table#check_and_mutate_row, and Table#sample_row_keys.
+
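
For callers, this breaking change means rescuing the wrapper error class rather than the lower-level gRPC errors. A minimal sketch of the updated error handling, assuming an existing table and a mutation entry built as shown in OVERVIEW.md:

```ruby
require "google/cloud/bigtable"

bigtable = Google::Cloud::Bigtable.new
table = bigtable.table "my-instance", "my-table"

entry = table.new_mutation_entry("user-1").set_cell("cf1", "field1", "XYZ")

begin
  table.mutate_row entry
rescue Google::Cloud::Error => e
  # As of 0.8.0 this replaces rescuing Google::Gax::GaxError / GRPC::BadStatus.
  puts "Write failed: #{e.message}"
end
```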
+ #### Bug Fixes
+
+ * Update minimum runtime dependencies
+
+ #### Documentation
+
+ * Update the list of GCP environments for automatic authentication
+
+ ### 0.7.0 / 2019-10-22
+
+ #### Features
+
+ * Update Table#column_families to yield ColumnFamilyMap for updates.
+   * ColumnFamilyMap now manages ColumnFamily lifecycle.
+ * Add MutationOperations::Response.
+ * Add Bigtable::Status.
+ * Add Bigtable::RoutingPolicy.
+ * Update Ruby dependency to minimum of 2.4.
+
+ #### BREAKING CHANGES
+
+ * Remove ColumnFamily lifecycle methods (create, save, delete, and related class methods).
+   * Replaced by Table#column_families yield block.
+ * Move Google::Cloud::Bigtable::Table::ColumnFamilyMap to Google::Cloud::Bigtable::ColumnFamilyMap.
+   * This should only affect introspection, since the constructor was previously undocumented.
+ * Remove Project#modify_column_families.
+   * Replaced by Table#column_families yield block.
+ * Remove Table#column_family.
+   * Replaced by ColumnFamilyMap lifecycle methods.
+ * Remove Table#modify_column_families.
+   * Replaced by Table#column_families yield block.
+ * Update GcRule#union and #intersection to not return lower-level API types.
+ * Update all return types and parameters associated with AppProfile routing policy to not use lower-level API types.
+   * The new types have exactly the same API as the old types, so this change should only affect type introspection.
+ * Update return types of Chain and Interleave row filters to not use lower-level API types.
+ * Change return type of MutationOperations#mutate_rows from lower-level API types to wrapper types.
+ * Remove private MutationEntry#mutations from documentation.
+ * Update GcRule#max_age to microsecond precision.
+
+ #### Documentation
+
+ * Update sample code.
+ * Update documentation.
+
+ ### 0.6.2 / 2019-10-01
+
+ #### Documentation
+
+ * Fix role string in low-level IAM Policy JSON example
+ * Update low-level IAM Policy class description and sample code
+
  ### 0.6.1 / 2019-09-05
 
  #### Features
data/CONTRIBUTING.md CHANGED
@@ -24,7 +24,7 @@ be able to accept your pull requests.
  In order to use the google-cloud-bigtable console and run the project's tests,
  there is a small amount of setup:
 
- 1. Install Ruby. google-cloud-bigtable requires Ruby 2.3+. You may choose to
+ 1. Install Ruby. google-cloud-bigtable requires Ruby 2.4+. You may choose to
     manage your Ruby and gem installations with [RVM](https://rvm.io/),
     [rbenv](https://github.com/rbenv/rbenv), or
     [chruby](https://github.com/postmodern/chruby).
data/OVERVIEW.md CHANGED
@@ -1,31 +1,400 @@
  # Cloud Bigtable
 
- Ruby Client for Cloud Bigtable API ([Beta](https://github.com/googleapis/google-cloud-ruby#versioning))
+ Cloud Bigtable is a petabyte-scale, fully managed NoSQL database service for
+ large analytical and operational workloads. Ideal for ad tech, fintech, and IoT,
+ Cloud Bigtable offers consistent sub-10ms latency. Replication provides higher
+ availability, higher durability, and resilience in the face of zonal failures.
+ Cloud Bigtable is designed with a storage engine for machine learning
+ applications and provides easy integration with open source big data tools.
 
- [Cloud Bigtable API][Product Documentation]:
- API for reading and writing the contents of Bigtables associated with a
- cloud project.
- - [Product Documentation][]
+ For more information about Cloud Bigtable, read the [Cloud Bigtable
+ Documentation](https://cloud.google.com/bigtable/docs/).
 
- ## Quick Start
- In order to use this library, you first need to go through the following
- steps:
+ The goal of google-cloud is to provide an API that is comfortable to Rubyists.
+ Your authentication credentials are detected automatically in Google Cloud
+ Platform (GCP), including Google Compute Engine (GCE), Google Kubernetes Engine
+ (GKE), Google App Engine (GAE), Google Cloud Functions (GCF) and Cloud Run. In
+ other environments you can configure authentication easily, either directly in
+ your code or via environment variables. Read more about the options for
+ connecting in the {file:AUTHENTICATION.md Authentication Guide}.
 
- 1. [Select or create a Cloud Platform project.](https://console.cloud.google.com/project)
- 2. [Enable billing for your project.](https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project)
- 3. [Enable the Cloud Bigtable API.](https://console.cloud.google.com/apis/library/bigtable.googleapis.com)
- 4. [Setup Authentication.](https://googleapis.dev/ruby/google-cloud-bigtable/latest/file.AUTHENTICATION.html)
+ ## Creating instances and clusters
 
- ### Next Steps
- - Read the [Cloud Bigtable API Product documentation][Product Documentation]
-   to learn more about the product and see How-to Guides.
- - View this [repository's main README](https://github.com/googleapis/google-cloud-ruby/blob/master/README.md)
-   to see the full list of Cloud APIs that we cover.
+ When you first use Cloud Bigtable, you must create an instance, which is an
+ allocation of resources that are used by Cloud Bigtable. When you create an
+ instance, you must specify at least one cluster. Clusters describe where your
+ data is stored and how many nodes are used for your data.
+
+ Use {Google::Cloud::Bigtable::Project#create_instance Project#create_instance}
+ to create an instance. The following example creates a production instance with
+ one cluster and three nodes:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ job = bigtable.create_instance(
+   "my-instance",
+   display_name: "Instance for user data",
+   labels: { "env" => "dev" }
+ ) do |clusters|
+   clusters.add("test-cluster", "us-east1-b", nodes: 3, storage_type: :SSD)
+ end
+
+ job.done? #=> false
+
+ # To block until the operation completes.
+ job.wait_until_done!
+ job.done? #=> true
+
+ if job.error?
+   status = job.error
+ else
+   instance = job.instance
+ end
+ ```
+
+ You can also create a low-cost development instance for development and testing,
+ with performance limited to the equivalent of a one-node cluster. There are no
+ monitoring or throughput guarantees; replication is not available; and the SLA
+ does not apply. When creating a development instance, you do not specify `nodes`
+ for your clusters:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ job = bigtable.create_instance(
+   "my-instance",
+   display_name: "Instance for user data",
+   type: :DEVELOPMENT,
+   labels: { "env" => "dev" }
+ ) do |clusters|
+   clusters.add("test-cluster", "us-east1-b") # nodes not allowed
+ end
+
+ job.done? #=> false
+
+ # Reload job until completion.
+ job.wait_until_done!
+ job.done? #=> true
+
+ if job.error?
+   status = job.error
+ else
+   instance = job.instance
+ end
+ ```
+
+ You can upgrade a development instance to a production instance at any time.
+
+ ## Creating tables
+
+ Cloud Bigtable stores data in massively scalable tables, each of which is a
+ sorted key/value map. The table is composed of rows, each of which typically
+ describes a single entity, and columns, which contain individual values for each
+ row. Each row is indexed by a single row key, and columns that are related to
+ one another are typically grouped together into a column family. Each column is
+ identified by a combination of the column family and a column qualifier, which
+ is a unique name within the column family.
+
+ Each row/column intersection can contain multiple cells, or versions, at
+ different timestamps, providing a record of how the stored data has been altered
+ over time. Cloud Bigtable tables are sparse; if a cell does not contain any
+ data, it does not take up any space.
+
+ Use {Google::Cloud::Bigtable::Project#create_table Project#create_table} or
+ {Google::Cloud::Bigtable::Instance#create_table Instance#create_table} to
+ create a table:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ table = bigtable.create_table("my-instance", "my-table")
+ puts table.name
+ ```
+
+ When you create a table, you may specify the column families to use in the
+ table, as well as a list of row keys that will be used to initially split the
+ table into several tablets (tablets are similar to HBase regions):
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ initial_splits = ["user-00001", "user-100000", "others"]
+ table = bigtable.create_table("my-instance", "my-table", initial_splits: initial_splits) do |cfm|
+   cfm.add('cf1', gc_rule: Google::Cloud::Bigtable::GcRule.max_versions(5))
+   cfm.add('cf2', gc_rule: Google::Cloud::Bigtable::GcRule.max_age(600))
+
+   gc_rule = Google::Cloud::Bigtable::GcRule.union(
+     Google::Cloud::Bigtable::GcRule.max_age(1800),
+     Google::Cloud::Bigtable::GcRule.max_versions(3)
+   )
+   cfm.add('cf3', gc_rule: gc_rule)
+ end
+
+ puts table
+ ```
+
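
The union rule above marks a cell for collection as soon as either condition is met; `GcRule.intersection` (listed alongside `union` in the 0.7.0 changelog) collects a cell only when every condition is met. A minimal sketch, assuming it runs inside the same `cfm` block and using a hypothetical family name:

```ruby
# Hypothetical "cf6": with an intersection rule, a cell is garbage-collected
# only when it matches ALL of the rules -- here, once it is beyond the 5 most
# recent versions AND older than 30 days (max_age takes seconds).
gc_rule = Google::Cloud::Bigtable::GcRule.intersection(
  Google::Cloud::Bigtable::GcRule.max_versions(5),
  Google::Cloud::Bigtable::GcRule.max_age(30 * 24 * 60 * 60)
)
cfm.add "cf6", gc_rule: gc_rule
```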
+ You may also add, update, and delete column families later by passing a block to
+ {Google::Cloud::Bigtable::Table#column_families Table#column_families}:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ table = bigtable.table("my-instance", "my-table", perform_lookup: true)
+
+ table.column_families do |cfm|
+   cfm.add "cf4", gc_rule: Google::Cloud::Bigtable::GcRule.max_age(600)
+   cfm.add "cf5", gc_rule: Google::Cloud::Bigtable::GcRule.max_versions(5)
+
+   rule_1 = Google::Cloud::Bigtable::GcRule.max_versions(3)
+   rule_2 = Google::Cloud::Bigtable::GcRule.max_age(600)
+   rule_union = Google::Cloud::Bigtable::GcRule.union(rule_1, rule_2)
+   cfm.update "cf2", gc_rule: rule_union
+
+   cfm.delete "cf3"
+ end
+
+ puts table.column_families["cf3"] #=> nil
+ ```
+
+ ## Writing data
+
+ The {Google::Cloud::Bigtable::Table Table} class allows you to perform the
+ following types of writes:
+
+ * Simple writes
+ * Increments and appends
+ * Conditional writes
+ * Batch writes
+
+ See [Cloud Bigtable writes](https://cloud.google.com/bigtable/docs/writes) for
+ detailed information about writing data.
+
+ ### Simple writes
+
+ Use {Google::Cloud::Bigtable::Table#mutate_row Table#mutate_row} to make
+ one or more mutations to a single row:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ table = bigtable.table("my-instance", "my-table")
+
+ entry = table.new_mutation_entry("user-1")
+ entry.set_cell(
+   "cf-1",
+   "field-1",
+   "XYZ",
+   timestamp: (Time.now.to_f * 1000000).round(-3) # microseconds
+ ).delete_cells("cf2", "field02")
+
+ table.mutate_row(entry)
+ ```
+
+ ### Increments and appends
+
+ If you want to append data to an existing value or increment an existing numeric
+ value, use
+ {Google::Cloud::Bigtable::Table#read_modify_write_row Table#read_modify_write_row}:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ rule_1 = table.new_read_modify_write_rule("cf", "field01")
+ rule_1.append("append-xyz")
+
+ rule_2 = table.new_read_modify_write_rule("cf", "field01")
+ rule_2.increment(1)
+
+ row = table.read_modify_write_row("user01", [rule_1, rule_2])
+
+ puts row.cells
+ ```
+
+ Do not use `read_modify_write_row` if you are using an app profile that has
+ multi-cluster routing. (See
+ {Google::Cloud::Bigtable::AppProfile#routing_policy AppProfile#routing_policy}.)
+
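
If you need atomic increments and appends, route the application through an app profile that uses single-cluster routing instead. A hedged sketch, assuming the `AppProfile.single_cluster_routing` helper and `Instance#create_app_profile` from this gem's AppProfile documentation, and using a hypothetical profile ID:

```ruby
require "google/cloud/bigtable"

bigtable = Google::Cloud::Bigtable.new
instance = bigtable.instance("my-instance")

# Route every request sent with this profile to a single cluster so that
# read_modify_write_row and check_and_mutate_row remain safe to use.
routing = Google::Cloud::Bigtable::AppProfile.single_cluster_routing(
  "test-cluster",
  allow_transactional_writes: true
)

app_profile = instance.create_app_profile(
  "my-single-cluster-profile", # hypothetical profile ID
  routing,
  description: "Profile for atomic writes"
)
puts app_profile.routing_policy
```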
+ ### Conditional writes
+
+ To check a row for a condition and then, depending on the result, write data to
+ that row, use
+ {Google::Cloud::Bigtable::Table#check_and_mutate_row Table#check_and_mutate_row}:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ predicate_filter = Google::Cloud::Bigtable::RowFilter.key("user-10")
+ on_match_mutations = Google::Cloud::Bigtable::MutationEntry.new
+ on_match_mutations.set_cell(
+   "cf-1",
+   "field-1",
+   "XYZ",
+   timestamp: (Time.now.to_f * 1000000).round(-3) # microseconds
+ ).delete_cells("cf2", "field02")
+
+ otherwise_mutations = Google::Cloud::Bigtable::MutationEntry.new
+ otherwise_mutations.delete_from_family("cf3")
+
+ predicate_matched = table.check_and_mutate_row(
+   "user01",
+   predicate_filter,
+   on_match: on_match_mutations,
+   otherwise: otherwise_mutations
+ )
+
+ if predicate_matched
+   puts "All predicates matched"
+ end
+ ```
+
+ Do not use `check_and_mutate_row` if you are using an app profile that has
+ multi-cluster routing. (See
+ {Google::Cloud::Bigtable::AppProfile#routing_policy AppProfile#routing_policy}.)
+
+ ### Batch writes
+
+ You can write more than one row in a single RPC using
+ {Google::Cloud::Bigtable::Table#mutate_rows Table#mutate_rows}:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+
+ table = bigtable.table("my-instance", "my-table")
+
+ entries = []
+ entries << table.new_mutation_entry("row-1").set_cell("cf1", "field1", "XYZ")
+ entries << table.new_mutation_entry("row-2").set_cell("cf1", "field1", "ABC")
+ responses = table.mutate_rows(entries)
+
+ responses.each do |response|
+   puts response.status.description
+ end
+ ```
+
+ Each entry in the request is atomic, but the request as a whole is not. As shown
+ above, Cloud Bigtable returns a list of responses corresponding to the entries.
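
Because only individual entries are atomic, callers should inspect each response rather than assuming the whole batch landed. A minimal sketch, assuming each response wraps a `google.rpc`-style status whose `code` is zero on success:

```ruby
require "google/cloud/bigtable"

bigtable = Google::Cloud::Bigtable.new
table = bigtable.table("my-instance", "my-table")

entries = []
entries << table.new_mutation_entry("row-1").set_cell("cf1", "field1", "XYZ")
entries << table.new_mutation_entry("row-2").set_cell("cf1", "field1", "ABC")

responses = table.mutate_rows(entries)
responses.each_with_index do |response, i|
  status = response.status
  # Assumption: a zero code means the entry was applied; anything else is a
  # per-entry failure that can be retried on its own.
  puts "Entry #{i} failed: #{status.description}" unless status.code.zero?
end
```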
+
+ ## Reading data
+
+ The {Google::Cloud::Bigtable::Table Table} class also enables you to read data.
+
+ Use {Google::Cloud::Bigtable::Table#read_row Table#read_row} to read a single
+ row by key:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ row = table.read_row("user-1")
+ ```
+
+ If desired, you can apply a filter:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ filter = Google::Cloud::Bigtable::RowFilter.cells_per_row(3)
+
+ row = table.read_row("user-1", filter: filter)
+ ```
+
+ For multiple rows, the
+ {Google::Cloud::Bigtable::Table#read_rows Table#read_rows} method streams back
+ the contents of all requested rows in key order:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ table.read_rows(keys: ["user-1", "user-2"]).each do |row|
+   puts row
+ end
+ ```
+
+ Instead of specifying individual keys (or a range), you can often just use a
+ filter:
+
+ ```ruby
+ require "google/cloud/bigtable"
+
+ bigtable = Google::Cloud::Bigtable.new
+ table = bigtable.table("my-instance", "my-table")
+
+ filter = table.filter.key("user-*")
+ # OR
+ # filter = Google::Cloud::Bigtable::RowFilter.key("user-*")
+
+ table.read_rows(filter: filter).each do |row|
+   puts row
+ end
+ ```
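
Filters can also be combined. A short sketch using `RowFilter.chain`, assuming the chain filter exposes the same predicates shown above (`key`, `cells_per_row`) and applies them in order:

```ruby
require "google/cloud/bigtable"

bigtable = Google::Cloud::Bigtable.new
table = bigtable.table("my-instance", "my-table")

# Match row keys starting with "user-" and keep only the first cell per row.
filter = Google::Cloud::Bigtable::RowFilter.chain.key("user-*").cells_per_row(1)

table.read_rows(filter: filter).each do |row|
  puts row
end
```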
359
+
360
+ ## Deleting rows, tables, and instances
361
+
362
+ Use {Google::Cloud::Bigtable::Table#drop_row_range Table#drop_row_range} to
363
+ delete some or all of the rows in a table:
364
+
365
+ ```ruby
366
+ require "google/cloud/bigtable"
367
+
368
+ bigtable = Google::Cloud::Bigtable.new
369
+
370
+ table = bigtable.table("my-instance", "my-table")
371
+
372
+ # Delete rows using row key prefix.
373
+ table.drop_row_range(row_key_prefix: "user-100")
374
+
375
+ # Delete all data With timeout
376
+ table.drop_row_range(delete_all_data: true, timeout: 120) # 120 seconds.
377
+ ```
378
+
379
+ Delete tables and instances using
380
+ {Google::Cloud::Bigtable::Table#delete Table#delete} and
381
+ {Google::Cloud::Bigtable::Instance#delete Instance#delete}, respectively:
382
+
383
+ ```ruby
384
+ require "google/cloud/bigtable"
385
+
386
+ bigtable = Google::Cloud::Bigtable.new
387
+
388
+ instance = bigtable.instance("my-instance")
389
+ table = instance.table("my-table")
390
+
391
+ table.delete
392
+
393
+ instance.delete
394
+ ```
24
395
 
25
396
  ## Additional information
26
397
 
27
398
  Google Bigtable can be configured to use an emulator or to enable gRPC's
28
399
  logging. To learn more, see the {file:EMULATOR.md Emulator guide} and
29
400
  {file:LOGGING.md Logging guide}.
30
-
31
- [Product Documentation]: https://cloud.google.com/bigtable