delta_sharing 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +7 -0
- data/CHANGELOG.md +16 -0
- data/CODE_OF_CONDUCT.md +84 -0
- data/Gemfile +10 -0
- data/Gemfile.lock +92 -0
- data/LICENSE.txt +21 -0
- data/README.md +93 -0
- data/Rakefile +14 -0
- data/bin/console +15 -0
- data/bin/setup +12 -0
- data/examples/client_example.rb +44 -0
- data/examples/reader_example.rb +33 -0
- data/lib/delta_sharing/client.rb +249 -0
- data/lib/delta_sharing/errors.rb +11 -0
- data/lib/delta_sharing/reader.rb +261 -0
- data/lib/delta_sharing/schema.rb +81 -0
- data/lib/delta_sharing/version.rb +5 -0
- data/lib/delta_sharing.rb +15 -0
- metadata +151 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA256:
  metadata.gz: 25f4e8115e6fa01d01f05c01215125e8776d5ea22951cd0ad60e77eed9dbb434
  data.tar.gz: a3b72291cea43781e9aafad280dade65cf969d9155df137ca5ac3535e268cc64
SHA512:
  metadata.gz: 3c7c64e3e2dfe90e1458d4b153536bb4217be4d202094fffcb1ccc59c9977067b5978067542878e3096e8931d50b3c46c7ad7d97ebfab1205e555f72560ce156
  data.tar.gz: db27a691874474cba4833c0ec9ed4154d7a3d1d193a83cf8bb63c8c6d42b153ed2b8ea1e1b502768866b5d4c37c4420f6f1a72fa4858acb898829960e183d32c
data/CHANGELOG.md
ADDED
@@ -0,0 +1,16 @@
## [Unreleased]

## [0.1.0] - 2025-05-28

### Added
- Initial implementation of Delta Sharing Protocol client
- Support for reading shared Delta Lake tables in Parquet format
- Apache Arrow integration for efficient data processing
- Authentication via bearer tokens and profile files
- Table discovery (list shares, schemas, tables)
- Table data reading with filtering support
- Change data feed support
- Comprehensive test suite with >80% coverage

### Limitations
- Only supports Parquet response format
data/CODE_OF_CONDUCT.md
ADDED
@@ -0,0 +1,84 @@
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or
  advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
  address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at TODO: Write your email address. All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0,
available at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.
data/Gemfile
ADDED
data/Gemfile.lock
ADDED
@@ -0,0 +1,92 @@
PATH
  remote: .
  specs:
    delta_sharing (0.1.0)
      httparty (~> 0.21.0)
      red-arrow (~> 19.0)
      red-arrow-dataset (~> 19.0)
      red-parquet (~> 19.0)

GEM
  remote: https://rubygems.org/
  specs:
    addressable (2.8.7)
      public_suffix (>= 2.0.2, < 7.0)
    ast (2.4.3)
    bigdecimal (3.2.2)
    crack (1.0.0)
      bigdecimal
      rexml
    csv (3.3.5)
    extpp (0.1.1)
    fiddle (1.1.8)
    gio2 (4.2.9)
      fiddle
      gobject-introspection (= 4.2.9)
    glib2 (4.2.9)
      native-package-installer (>= 1.0.3)
      pkg-config (>= 1.3.5)
    gobject-introspection (4.2.9)
      glib2 (= 4.2.9)
    hashdiff (1.2.0)
    httparty (0.21.0)
      mini_mime (>= 1.0.0)
      multi_xml (>= 0.5.2)
    json (2.7.6)
    mini_mime (1.1.5)
    minitest (5.25.4)
    multi_xml (0.6.0)
    native-package-installer (1.1.9)
    parallel (1.24.0)
    parser (3.3.8.0)
      ast (~> 2.4.1)
      racc
    pkg-config (1.6.2)
    public_suffix (5.1.1)
    racc (1.8.1)
    rainbow (3.1.1)
    rake (13.3.0)
    red-arrow (19.0.1)
      bigdecimal (>= 3.1.0)
      csv
      extpp (>= 0.1.1)
      gio2 (>= 4.2.3)
      native-package-installer
      pkg-config
    red-arrow-dataset (19.0.1)
      red-arrow (= 19.0.1)
    red-parquet (19.0.1)
      red-arrow (= 19.0.1)
    regexp_parser (2.10.0)
    rexml (3.4.1)
    rubocop (1.50.2)
      json (~> 2.3)
      parallel (~> 1.10)
      parser (>= 3.2.0.0)
      rainbow (>= 2.2.2, < 4.0)
      regexp_parser (>= 1.8, < 3.0)
      rexml (>= 3.2.5, < 4.0)
      rubocop-ast (>= 1.28.0, < 2.0)
      ruby-progressbar (~> 1.7)
      unicode-display_width (>= 2.4.0, < 3.0)
    rubocop-ast (1.30.0)
      parser (>= 3.2.1.0)
    ruby-progressbar (1.13.0)
    unicode-display_width (2.6.0)
    webmock (3.25.1)
      addressable (>= 2.8.0)
      crack (>= 0.3.2)
      hashdiff (>= 0.4.0, < 2.0.0)

PLATFORMS
  x86_64-linux

DEPENDENCIES
  delta_sharing!
  minitest (~> 5.0)
  rake (~> 13.0)
  rubocop
  webmock (~> 3.18)

BUNDLED WITH
   2.2.32
data/LICENSE.txt
ADDED
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2025 TODO: Write your name

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
data/README.md
ADDED
@@ -0,0 +1,93 @@
# Delta Sharing Ruby Client

A Ruby implementation of the Delta Sharing Protocol for reading shared Delta Lake tables. For now it only supports the Parquet response format. See the TODO list below.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'delta_sharing'
```

And then execute:

    $ bundle install

Or install it yourself as:

    $ gem install delta_sharing

## Usage

### Initialize the Client

```ruby
require 'delta_sharing'

client = DeltaSharing::Client.new('profile.json')
```
or
```ruby
require 'delta_sharing'

client = DeltaSharing::Client.new(endpoint: "https://your-delta-sharing-server/delta-sharing/", bearer_token: "your-bearer-token")
```
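
The `profile.json` passed to the first form is a standard Delta Sharing profile file. As a minimal sketch (the endpoint and token values are placeholders), one way to generate it with the same `endpoint`/`bearerToken` keys the client reads:

```ruby
require 'json'

# Placeholder credentials; the client reads the "endpoint" and "bearerToken"
# keys, and "shareCredentialsVersion" is part of the standard profile format.
profile = {
  'shareCredentialsVersion' => 1,
  'endpoint' => 'https://your-delta-sharing-server/delta-sharing/',
  'bearerToken' => 'your-bearer-token'
}

File.write('profile.json', JSON.pretty_generate(profile))
```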

### List Shares

```ruby
shares = client.list_shares
```

### List Schemas in a Share

```ruby
schemas = client.list_schemas('share_name')
```

### List Tables

```ruby
# List tables in a specific schema
tables = client.list_tables('share_name', 'schema_name')
```

### Read Table

```ruby
client = DeltaSharing::Client.new('profile.json')

# Create a Reader. json_predicate_hints can be a String or a Hash.
# load_as_arrow returns an Arrow::Table.
reader = DeltaSharing::Reader.new(table: "#{share_name}.#{schema_name}.#{table_name}", client: client)
arrow_table = reader.load_as_arrow(limit: 100, json_predicate_hints: '{ "op": "equal", "children": [ { "op": "column", "name": "active", "valueType": "int" }, { "op": "literal", "value": "1", "valueType": "int" } ] }', predicate_hints: ['active = 1'])
```
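
Since `json_predicate_hints` also accepts a Hash (the client serializes it with `JSON.dump` before sending), the same filter can be written as a sketch like this:

```ruby
# Hash form of the same jsonPredicateHints filter; column and values are illustrative.
hints = {
  'op' => 'equal',
  'children' => [
    { 'op' => 'column', 'name' => 'active', 'valueType' => 'int' },
    { 'op' => 'literal', 'value' => '1', 'valueType' => 'int' }
  ]
}

arrow_table = reader.load_as_arrow(limit: 100, json_predicate_hints: hints)
```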

## Current Implementation Status

🚧 **TODO:**
- Implement table data changes reading functionality
- Add support for reading tables in the delta response format
- Add documentation

## Development
- Fork the project.
- Run `bundle install`.
- Make your changes.
- Run `bundle exec rake` to run the tests.
- Create a PR.

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/samssouza/delta-sharing-ruby.

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Code of Conduct

Everyone interacting in the Delta Sharing project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/samssouza/delta-sharing-ruby/blob/main/CODE_OF_CONDUCT.md).
data/Rakefile
ADDED
data/bin/console
ADDED
@@ -0,0 +1,15 @@
#!/usr/bin/env ruby
# frozen_string_literal: true

require 'bundler/setup'
require 'delta_sharing'

# You can add fixtures and/or initialization code here to make experimenting
# with your gem easier. You can also use a different console, if you like.

# (If you use this, don't forget to add pry to your Gemfile!)
# require "pry"
# Pry.start

require 'irb'
IRB.start(__FILE__)
data/bin/setup
ADDED
@@ -0,0 +1,12 @@
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
set -vx

# Install Ruby dependencies
bundle install

# Run any setup tasks
echo "Setup complete! You can now run:"
echo "  bin/console            # for an interactive prompt"
echo "  bundle exec rake test  # to run tests"
data/examples/client_example.rb
ADDED
@@ -0,0 +1,44 @@
#!/usr/bin/env ruby

require 'bundler/setup'
require 'delta_sharing'

# Initialize client with profile file
# The profile file should contain:
# {
#   "shareCredentialsVersion": 1,
#   "endpoint": "https://your-delta-sharing-server/delta-sharing/",
#   "bearerToken": "your-bearer-token"
# }

client = DeltaSharing::Client.new('config.share')

shares = client.list_shares
puts "Shares: #{shares}"

share_name = shares.first[:name]

schemas = client.list_schemas(share_name)
puts "Schemas in share #{share_name}: #{schemas}"

schema_name = schemas.first[:name]

tables = client.list_tables(share_name, schema_name)
puts "Tables in share #{share_name} and schema #{schema_name}: #{tables}"

table_name = tables.first[:name]

# Get table metadata
metadata = client.get_table_metadata(share_name, schema_name, table_name)
puts "Table #{table_name} metadata: #{metadata}"

# Get table version
version = client.get_table_version(share_name, schema_name, table_name)
puts "Table #{table_name} current version: #{version}"

# Read table data
raw_response = client.read_table_data(share_name, schema_name, table_name, limit: 2)
puts "Table #{table_name} response lines: #{raw_response.length}"
raw_response.each_with_index do |line, i|
  puts "  Line #{i}: #{line[0..100]}#{line.length > 100 ? '...' : ''}"
end
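The example above stops at `read_table_data`. The change data feed support listed in the CHANGELOG is exposed on the client as `read_table_changes`, which returns the raw newline-delimited JSON lines of the changes response. A hedged sketch of calling it, reusing the variables from the example above (the version range values are placeholders):

```ruby
# Query the change data feed for a version range; returns raw response lines.
changes_lines = client.read_table_changes(share_name, schema_name, table_name,
                                          starting_version: 0, ending_version: 1)
puts "Change feed lines: #{changes_lines.length}"
```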
data/examples/reader_example.rb
ADDED
@@ -0,0 +1,33 @@
#!/usr/bin/env ruby
# frozen_string_literal: true

require 'bundler/setup'
require 'delta_sharing'

# Example of reading a Delta Sharing table with Apache Arrow

# Initialize the client with a profile file
profile_path = File.join(__dir__, '..', 'config.share')
client = DeltaSharing::Client.new(profile_path)

# List available shares
share_name = client.list_shares.first[:name]
schema_name = client.list_schemas(share_name).first[:name]
table_name = client.list_tables(share_name, schema_name).first[:name]

# Create a Table object
reader = DeltaSharing::Reader.new(table: "#{share_name}.#{schema_name}.#{table_name}", client: client)

# Read the table data as an Arrow::Table
puts 'Reading table data...'
arrow_table = reader.load_as_arrow(limit: 100,
                                   json_predicate_hints: '{ "op": "equal", "children": [ { "op": "column", "name": "active", "valueType": "int" }, { "op": "literal", "value": "1", "valueType": "int" } ] }', predicate_hints: ['active = 1'])
# Display first few rows
if arrow_table.n_rows > 0
  puts 'First 5 rows:'
  arrow_table.slice(0, [5, arrow_table.n_rows].min).each_record.with_index do |record, i|
    puts "  Row #{i}: #{record.to_h}"
  end
else
  puts 'Table is empty'
end
data/lib/delta_sharing/client.rb
ADDED
@@ -0,0 +1,249 @@
# frozen_string_literal: true

module DeltaSharing
  class Client
    attr_reader :profile

    # Initialize with profile file path OR direct credentials
    def initialize(profile_file = nil, endpoint: nil, bearer_token: nil)
      if profile_file && (endpoint || bearer_token)
        raise ArgumentError,
              'Must provide either profile_file path OR both endpoint and bearer_token, not both'
      end

      if profile_file.nil? && (endpoint && bearer_token.nil? || endpoint.nil? && bearer_token)
        raise ArgumentError,
              'Must provide both endpoint and bearer_token'
      end

      if profile_file
        @profile = load_profile_from_file(profile_file)
      elsif endpoint && bearer_token
        @profile = {
          'endpoint' => endpoint,
          'bearerToken' => bearer_token
        }
      end
    end

    # List all shares
    def list_shares
      path = '/shares'
      response = make_request(path)
      shares, next_page_token = parse_shares_response(response.body)
      while next_page_token
        response = make_request(path, params: { nextPageToken: next_page_token })
        shares, next_page_token = parse_shares_response(response.body)
      end
      shares
    end

    # Get share info
    def get_share(share_name)
      path = "/shares/#{share_name}"
      response = make_request(path)
      JSON.parse(response.body)
    end

    # List all schemas in a share
    def list_schemas(share_name)
      path = "/shares/#{share_name}/schemas"
      response = make_request(path)
      schemas, next_page_token = parse_schemas_response(response.body)
      while next_page_token
        response = make_request(path, params: { nextPageToken: next_page_token })
        schemas, next_page_token = parse_schemas_response(response.body)
      end
      schemas
    end

    # List all tables in a share and schema
    def list_tables(share_name, schema_name = nil)
      path = if schema_name
               # List tables in a specific schema
               "/shares/#{share_name}/schemas/#{schema_name}/tables"
             else
               # List all tables in the share (across all schemas)
               "/shares/#{share_name}/all-tables"
             end

      response = make_request(path)
      tables, next_page_token = parse_tables_response(response.body)
      while next_page_token
        response = make_request(path, params: { nextPageToken: next_page_token })
        tables, next_page_token = parse_tables_response(response.body)
      end
      tables
    end

    # Query table version (returned in the delta-table-version response header)
    def get_table_version(share_name, schema_name, table_name)
      path = "/shares/#{share_name}/schemas/#{schema_name}/tables/#{table_name}/version"
      response = make_request(path, method: 'GET')
      response.headers['delta-table-version']
    end

    # Get table metadata
    def get_table_metadata(share_name, schema_name, table_name)
      path = "/shares/#{share_name}/schemas/#{schema_name}/tables/#{table_name}/metadata"
      response = make_request(path)
      parse_metadata_response(response.body)
    end

    # Read table data using POST request
    def read_table_data(share_name, schema_name, table_name, options = {})
      path = "/shares/#{share_name}/schemas/#{schema_name}/tables/#{table_name}/query"
      body = build_query_body(options)
      response = make_request(path, method: 'POST', body: body)
      parse_newline_delimited_json(response.body)
    end

    # Read table changes using a GET request with query parameters
    def read_table_changes(share_name, schema_name, table_name, options = {})
      path = "/shares/#{share_name}/schemas/#{schema_name}/tables/#{table_name}/changes"
      params = build_changes_params(options)
      response = make_request(path, params: params)
      parse_newline_delimited_json(response.body)
    end

    private

    def load_profile_from_file(profile_file)
      raise Error, "Profile file not found: #{profile_file}" unless File.exist?(profile_file)

      content = File.read(profile_file)
      JSON.parse(content)
    rescue JSON::ParserError => e
      raise Error, "Invalid JSON in profile file: #{e.message}"
    end

    def make_request(path, method: 'GET', params: {}, body: nil)
      url = @profile['endpoint'] + path

      options = {
        headers: {
          'Authorization' => "Bearer #{@profile['bearerToken']}",
          'Content-Type' => 'application/json',
          'delta-sharing-capabilities' => 'responseformat=parquet'
        },
        timeout: 300
      }

      # Add query parameters if provided
      unless params.empty?
        # Filter out nil values and convert to strings
        filtered_params = params.compact.transform_values(&:to_s)
        options[:query] = filtered_params
      end

      # Add body for POST requests
      options[:body] = body.to_json unless body.nil?

      response = case method.upcase
                 when 'GET'
                   HTTParty.get(url, options)
                 when 'HEAD'
                   HTTParty.head(url, options)
                 when 'POST'
                   HTTParty.post(url, options)
                 else
                   raise ArgumentError, "Unsupported HTTP method: #{method}"
                 end

      handle_error_response(response, path) unless response.success?

      response
    end

    def build_query_body(options)
      body = {}

      body[:predicateHints] = options[:predicate_hints] if options[:predicate_hints]
      if options[:json_predicate_hints]
        body[:jsonPredicateHints] = if options[:json_predicate_hints].is_a?(Hash)
                                      JSON.dump(options[:json_predicate_hints])
                                    else
                                      options[:json_predicate_hints]
                                    end
      end
      body[:limitHint] = options[:limit] if options[:limit]
      body[:version] = options[:version] if options[:version]

      body
    end

    def build_changes_params(options)
      params = {}

      params[:startingVersion] = options[:starting_version] if options[:starting_version]
      params[:endingVersion] = options[:ending_version] if options[:ending_version]
      params[:startingTimestamp] = options[:starting_timestamp] if options[:starting_timestamp]
      params[:endingTimestamp] = options[:ending_timestamp] if options[:ending_timestamp]

      params
    end

    def parse_newline_delimited_json(response_body)
      lines = response_body.split("\n")
      lines.reject(&:empty?)
    end

    def handle_error_response(response, path)
      case response.code
      when 401, 403
        raise AuthenticationError, "Authentication failed: #{response.code} #{response.message}"
      when 404
        raise TableNotFoundError, "Resource not found: #{path}"
      when 400
        raise ProtocolError, "Bad request: #{response.body}"
      else
        raise Error, "HTTP request failed: #{path} #{response.code} #{response.message}"
      end
    end

    def parse_tables_response(response)
      response = JSON.parse(response)
      tables = (response['items'] || []).map do |table|
        {
          name: table['name'],
          share: table['share'],
          schema: table['schema'],
          share_id: table['shareId'],
          id: table['id']
        }
      end
      [tables, response['nextPageToken']]
    end

    def parse_shares_response(response)
      response = JSON.parse(response)
      shares = (response['items'] || []).map do |share|
        {
          name: share['name'],
          id: share['id']
        }
      end

      [shares, response['nextPageToken']]
    end

    def parse_schemas_response(response)
      response = JSON.parse(response)
      schemas = (response['items'] || []).map do |schema|
        {
          name: schema['name'],
          share: schema['share']
        }
      end
      [schemas, response['nextPageToken']]
    end

    def parse_metadata_response(response)
      parsed_response = parse_newline_delimited_json(response)
      JSON.parse(parsed_response[1])
    rescue JSON::ParserError => e
      # raise ProtocolError, "Invalid schema JSON in metadata: #{e.message}"
      raise e
    end
  end
end
data/lib/delta_sharing/errors.rb
ADDED
@@ -0,0 +1,11 @@
# frozen_string_literal: true

module DeltaSharing
  class Error < StandardError; end
  class AuthenticationError < Error; end
  class TableNotFoundError < Error; end
  class ProtocolError < Error; end
  class NetworkError < Error; end
  class SchemaError < Error; end
  class ParsingError < Error; end
end
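These error classes map onto the HTTP handling in client.rb (401/403 raise AuthenticationError, 404 raises TableNotFoundError, 400 raises ProtocolError, anything else raises Error). A minimal usage sketch, assuming an already configured `client`:

```ruby
# Rescue the specific errors raised by Client#make_request's error handling.
begin
  tables = client.list_tables('share_name', 'schema_name')
rescue DeltaSharing::AuthenticationError => e
  warn "Check the bearer token: #{e.message}"              # 401 / 403
rescue DeltaSharing::TableNotFoundError => e
  warn "Share, schema or table not found: #{e.message}"    # 404
rescue DeltaSharing::ProtocolError => e
  warn "Bad request or malformed response: #{e.message}"   # 400
rescue DeltaSharing::Error => e
  warn "Other Delta Sharing failure: #{e.message}"
end
```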
data/lib/delta_sharing/reader.rb
ADDED
@@ -0,0 +1,261 @@
# frozen_string_literal: true

module DeltaSharing
  class Reader
    attr_reader :table, :client, :share, :schema, :name

    # Table format: <share>.<schema>.<table>
    def initialize(table:, client:, **_options)
      validate_table_format(table)
      @share, @schema, @name = table.split('.')
      @client = client
    end

    # Main reading method
    def load_as_arrow(options = {})
      validate_query_options(options)
      read(options)
    end

    private

    def validate_table_format(table)
      # Pattern: anything.anything.anything (where anything = one or more non-dot characters)
      pattern = /\A[^.]+\.[^.]+\.[^.]+\z/

      return if table.is_a?(String) && table.match?(pattern)

      raise ArgumentError, "Invalid table format. Expected '<share>.<schema>.<table>', got: #{table}"
    end

    def validate_query_options(options)
      validate_predicate_hints(options[:predicate_hints]) if options.key?(:predicate_hints)
      validate_json_predicate_hints(options[:json_predicate_hints]) if options.key?(:json_predicate_hints)
      validate_limit(options[:limit]) if options.key?(:limit)
      validate_version(options[:version]) if options.key?(:version)
      validate_timestamp(options[:timestamp]) if options.key?(:timestamp)
    end

    def validate_predicate_hints(hints)
      return if hints.nil?

      raise ArgumentError, "predicate_hints must be an Array, got #{hints.class}" unless hints.is_a?(Array)

      return if hints.all? { |hint| hint.is_a?(String) }

      raise ArgumentError, 'All predicate hints must be strings'
    end

    def validate_json_predicate_hints(hints)
      return if hints.nil?

      unless hints.is_a?(String) || hints.is_a?(Hash)
        raise ArgumentError, "json_predicate_hints must be a String or Hash, got #{hints.class}"
      end

      # If it's a string, try to parse it as JSON to validate
      return unless hints.is_a?(String)

      begin
        JSON.parse(hints)
      rescue JSON::ParserError => e
        raise ArgumentError, "Invalid JSON in predicate hints: #{e.message}"
      end
    end

    def validate_limit(limit_value)
      return if limit_value.nil?

      return if limit_value.is_a?(Integer) && limit_value >= 0

      raise ArgumentError, "limit must be a non-negative Integer, got #{limit_value}"
    end

    def validate_version(version_value)
      return if version_value.nil?

      return if version_value.is_a?(Integer) && version_value >= 0

      raise ArgumentError, "version must be a non-negative Integer, got #{version_value}"
    end

    def validate_timestamp(timestamp_value)
      return if timestamp_value.nil?

      return if timestamp_value.is_a?(String) || timestamp_value.is_a?(Time) || timestamp_value.is_a?(Integer)

      raise ArgumentError, "timestamp must be a String, Time, or Integer, got #{timestamp_value.class}"
    end

    # Read change data feed
    def changes(starting_version: nil, ending_version: nil, starting_timestamp: nil, ending_timestamp: nil)
      response = client.read_table_changes(share, schema, name, {
                                             starting_version: starting_version,
                                             ending_version: ending_version,
                                             starting_timestamp: starting_timestamp,
                                             ending_timestamp: ending_timestamp
                                           })

      process_read_response(response)
    end

    # Read table data with optional filtering
    def read(limit: nil, predicate_hints: nil, json_predicate_hints: nil, version: nil)
      response = client.read_table_data(share, schema, name, {
                                          limit: limit,
                                          predicate_hints: predicate_hints,
                                          json_predicate_hints: json_predicate_hints,
                                          version: version
                                        })

      process_read_response(response)
    end

    def process_read_response(response_lines)
      metadata_line = nil
      file_lines = []

      # Parse newline-delimited JSON response
      response_lines.each do |line|
        line = line.strip
        next if line.empty?

        begin
          json_obj = JSON.parse(line)

          if json_obj['protocol']
            next
          elsif json_obj['metaData']
            metadata_line = json_obj
          elsif json_obj['file']
            file_lines << json_obj
          end
        rescue JSON::ParserError => e
          raise ProtocolError, "Invalid JSON in response line: #{e.message}"
        end
      end
      @files = file_lines
      @metadata = metadata_line
      arrow_schema = Schema.new(@metadata['metaData']['schemaString']).arrow_schema
      # Download and process Parquet files

      arrow_tables = []
      file_lines.each do |file_obj|
        file_info = file_obj['file']
        arrow_table = download_and_read_parquet(file_info)
        arrow_table = add_partition_columns(arrow_table, arrow_schema, file_info) if file_info['partitionValues']
        arrow_tables << arrow_table if arrow_table
      end

      # Combine all Arrow tables
      @arrow_table = if arrow_tables.length == 1
                       arrow_tables.first
                     else
                       combine_arrow_tables(arrow_tables)
                     end

      @arrow_table
    end

    def download_and_read_parquet(file_info)
      url = file_info['url']

      # Download Parquet file
      parquet_data = download_file(url)

      # Read with Apache Arrow
      read_parquet_data(parquet_data)
    end

    def download_file(url)
      response = HTTParty.get(url)

      unless response.success?
        raise NetworkError,
              "Failed to download file from #{url}: #{response.code} #{response.message}"
      end

      response.body
    end

    def read_parquet_data(data)
      # Create a temporary file to write the Parquet data
      Tempfile.create(['delta_sharing', '.parquet']) do |temp_file|
        temp_file.binmode
        temp_file.write(data)
        temp_file.rewind

        # Read using Apache Arrow
        Arrow::Table.load(temp_file.path, format: :parquet)
      end
    rescue StandardError => e
      raise ProtocolError, "Failed to read Parquet data: #{e.message}"
    end

    def add_partition_columns(arrow_table, schema, file_info)
      partition_values = file_info['partitionValues'] || {}
      return arrow_table if partition_values.empty?

      partition_fields = []
      partition_column_values = []

      schema.fields.each do |field|
        if partition_values.keys.include?(field.name)
          partition_fields << field
          partition_column_values << partition_values[field.name]
        end
      end

      return arrow_table if partition_fields.empty?

      # Create row-oriented data: each row is an array of partition values
      partition_rows = Array.new(arrow_table.n_rows) { partition_column_values.dup }

      partition_schema = Arrow::Schema.new(partition_fields)
      partition_table = Arrow::Table.new(partition_schema, partition_rows)
      arrow_table.merge(partition_table)
    end

    def combine_arrow_tables(tables)
      return tables.first if tables.length == 1

      # Use the first table's schema as the reference for column order
      reference_schema = tables.first.schema
      reference_column_order = reference_schema.fields.map(&:name)

      # Align all tables to match the reference schema
      aligned_tables = tables.map do |table|
        current_column_order = table.schema.fields.map(&:name)

        # Check if reordering is needed
        if current_column_order == reference_column_order
          table # Already in correct order
        else
          # Verify all required columns are present
          missing_columns = reference_column_order - current_column_order
          extra_columns = current_column_order - reference_column_order

          raise ProtocolError, "Table missing required columns: #{missing_columns.join(', ')}" if missing_columns.any?

          if extra_columns.any?
            # Log warning but continue (extra columns will be ignored)
            puts "Warning: Table has extra columns that will be ignored: #{extra_columns.join(', ')}"
          end

          # Reorder columns to match reference schema
          reordered_columns = reference_column_order.map do |column_name|
            table.column(column_name)
          end

          Arrow::Table.new(reference_schema, reordered_columns)
        end
      end

      # Now combine all aligned tables
      record_batches = aligned_tables.flat_map { |t| t.each_record_batch.to_a }
      Arrow::Table.new(reference_schema, record_batches)
    rescue StandardError => e
      raise ProtocolError, "Failed to combine Arrow tables: #{e.message}"
    end
  end
end
data/lib/delta_sharing/schema.rb
ADDED
@@ -0,0 +1,81 @@
# frozen_string_literal: true

module DeltaSharing
  class Schema
    attr_reader :fields, :arrow_schema

    def initialize(schema_string)
      schema_json = JSON.parse(schema_string)
      @fields = schema_json['fields'] || []
      @arrow_schema = build_arrow_schema
    end

    def field_names
      @fields.map { |f| f['name'] }
    end

    def field_index(name)
      @fields.find_index { |f| f['name'] == name }
    end

    def build_arrow_schema
      fields = @fields.map do |field|
        arrow_type = delta_type_to_arrow_type(field['type'])
        Arrow::Field.new(field['name'], arrow_type, field['nullable'])
      end

      Arrow::Schema.new(fields)
    end

    def delta_type_to_arrow_type(delta_type)
      case delta_type
      when 'boolean'
        Arrow::BooleanDataType.new
      when 'byte'
        Arrow::Int8DataType.new
      when 'short'
        Arrow::Int16DataType.new
      when 'integer'
        Arrow::Int32DataType.new
      when 'long'
        Arrow::Int64DataType.new
      when 'float'
        Arrow::FloatDataType.new
      when 'double'
        Arrow::DoubleDataType.new
      when 'string'
        Arrow::StringDataType.new
      when 'binary'
        Arrow::BinaryDataType.new
      when 'date'
        Arrow::Date32DataType.new
      when 'timestamp'
        Arrow::TimestampDataType.new(:micro)
      else
        if delta_type.is_a?(Hash)
          case delta_type['type']
          when 'decimal'
            Arrow::DecimalDataType.new(delta_type['precision'], delta_type['scale'])
          when 'array'
            element_type = delta_type_to_arrow_type(delta_type['elementType'])
            Arrow::ListDataType.new(element_type)
          when 'map'
            key_type = delta_type_to_arrow_type(delta_type['keyType'])
            value_type = delta_type_to_arrow_type(delta_type['valueType'])
            Arrow::MapDataType.new(key_type, value_type)
          when 'struct'
            fields = delta_type['fields'].map do |field|
              field_type = delta_type_to_arrow_type(field['type'])
              Arrow::Field.new(field['name'], field_type, field['nullable'])
            end
            Arrow::StructDataType.new(fields)
          else
            Arrow::StringDataType.new # Fallback to string
          end
        else
          Arrow::StringDataType.new # Fallback to string
        end
      end
    end
  end
end
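`Schema` takes the Delta `schemaString` embedded in the metadata response (the Reader pulls it from `metaData.schemaString`) and maps each Delta field type to an Arrow type. A small sketch with an illustrative, hand-built schema string (not from a real table):

```ruby
require 'json'

# Fields use the name/type/nullable keys that Schema reads from the schemaString.
schema_string = {
  'type' => 'struct',
  'fields' => [
    { 'name' => 'id',     'type' => 'long',    'nullable' => false },
    { 'name' => 'active', 'type' => 'integer', 'nullable' => true },
    { 'name' => 'name',   'type' => 'string',  'nullable' => true }
  ]
}.to_json

schema = DeltaSharing::Schema.new(schema_string)
schema.field_names   #=> ["id", "active", "name"]
schema.arrow_schema  # Arrow::Schema with int64, int32 and string fields
```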
data/lib/delta_sharing.rb
ADDED
@@ -0,0 +1,15 @@
# frozen_string_literal: true

require_relative 'delta_sharing/version'
require_relative 'delta_sharing/errors'
require_relative 'delta_sharing/client'
require_relative 'delta_sharing/reader'
require_relative 'delta_sharing/schema'
require 'arrow'
require 'parquet'
require 'httparty'
require 'tempfile'
require 'json'

module DeltaSharing
end
metadata
ADDED
@@ -0,0 +1,151 @@
--- !ruby/object:Gem::Specification
name: delta_sharing
version: !ruby/object:Gem::Version
  version: 0.1.0
platform: ruby
authors:
- Samuel Souza
autorequire:
bindir: bin
cert_chain: []
date: 2025-06-26 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: httparty
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: 0.21.0
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: 0.21.0
- !ruby/object:Gem::Dependency
  name: red-arrow
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
- !ruby/object:Gem::Dependency
  name: red-arrow-dataset
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
- !ruby/object:Gem::Dependency
  name: red-parquet
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '19.0'
- !ruby/object:Gem::Dependency
  name: minitest
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '5.0'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '5.0'
- !ruby/object:Gem::Dependency
  name: webmock
  requirement: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.18'
  type: :development
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    requirements:
    - - "~>"
      - !ruby/object:Gem::Version
        version: '3.18'
description: A Ruby implementation of the Delta Sharing client for reading shared
  Delta Lake tables
email:
- samuel.ssouza95@gmail.com
executables:
- console
- setup
extensions: []
extra_rdoc_files: []
files:
- CHANGELOG.md
- CODE_OF_CONDUCT.md
- Gemfile
- Gemfile.lock
- LICENSE.txt
- README.md
- Rakefile
- bin/console
- bin/setup
- examples/client_example.rb
- examples/reader_example.rb
- lib/delta_sharing.rb
- lib/delta_sharing/client.rb
- lib/delta_sharing/errors.rb
- lib/delta_sharing/reader.rb
- lib/delta_sharing/schema.rb
- lib/delta_sharing/version.rb
homepage: https://github.com/samssouza/delta-sharing-ruby
licenses:
- MIT
metadata:
  allowed_push_host: https://rubygems.org
  homepage_uri: https://github.com/samssouza/delta-sharing-ruby
  source_code_uri: https://github.com/samssouza/delta-sharing-ruby
  changelog_uri: https://github.com/samssouza/delta-sharing-ruby/blob/main/CHANGELOG.md
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '2.6'
required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubygems_version: 3.0.8
signing_key:
specification_version: 4
summary: Ruby client for Delta Sharing protocol
test_files: []