dataverse 0.1.0
- checksums.yaml +7 -0
- data/.env.template +3 -0
- data/.gitignore +16 -0
- data/.rspec +3 -0
- data/.travis.yml +6 -0
- data/CHANGELOG.md +5 -0
- data/CODE_OF_CONDUCT.md +84 -0
- data/Gemfile +14 -0
- data/Gemfile.lock +62 -0
- data/LICENSE.txt +21 -0
- data/README.md +598 -0
- data/Rakefile +8 -0
- data/bin/console +19 -0
- data/bin/setup +8 -0
- data/dataverse.gemspec +31 -0
- data/lib/dataverse.rb +7 -0
- data/lib/dataverse/base.rb +124 -0
- data/lib/dataverse/dataset.rb +376 -0
- data/lib/dataverse/dataverse.rb +157 -0
- data/lib/dataverse/errors.rb +27 -0
- data/lib/dataverse/version.rb +5 -0
- metadata +80 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
---
SHA256:
  metadata.gz: 218801f555a41f3c5a6b5f780903c8558c5df6d1b149470e649a763bf2f7fd39
  data.tar.gz: 498a782ecc5c745fa4b23c2bae2cd599b73a855dec072e91a1f96a254db9c8be
SHA512:
  metadata.gz: 6cd333801c73346653ca2a9780f9cf99d22bb2d2e343d424dff73e48a6789dd87487309a8d7ae72b905bc6c9512de2f84d12f08e188c8a2d0cf1f18add2344c2
  data.tar.gz: 65ad0e6cf95d2a557195414226da5fa751dde28e3111370beb46c5f24b00b65781ba40183d8b3dbee3fe568eab3520bf355920aa975e215326d20af5695978af
data/.env.template
ADDED
data/.gitignore
ADDED
data/.rspec
ADDED
data/.travis.yml
ADDED
data/CHANGELOG.md
ADDED
data/CODE_OF_CONDUCT.md
ADDED
@@ -0,0 +1,84 @@
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at kris.dekeyser@libis.be. All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0, available at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.
data/Gemfile
ADDED
data/Gemfile.lock
ADDED
@@ -0,0 +1,62 @@
PATH
  remote: .
  specs:
    dataverse (0.1.0)
      rest-client (~> 2.0)

GEM
  remote: https://rubygems.org/
  specs:
    awesome_print (1.9.2)
    coderay (1.1.3)
    diff-lcs (1.4.4)
    domain_name (0.5.20190701)
      unf (>= 0.0.5, < 1.0.0)
    dotenv (2.7.6)
    http-accept (1.7.0)
    http-cookie (1.0.3)
      domain_name (~> 0.5)
    method_source (1.0.0)
    mime-types (3.3.1)
      mime-types-data (~> 3.2015)
    mime-types-data (3.2021.0225)
    netrc (0.11.0)
    pry (0.14.0)
      coderay (~> 1.1)
      method_source (~> 1.0)
    rake (13.0.3)
    rest-client (2.1.0)
      http-accept (>= 1.7.0, < 2.0)
      http-cookie (>= 1.0.2, < 2.0)
      mime-types (>= 1.16, < 4.0)
      netrc (~> 0.8)
    rspec (3.10.0)
      rspec-core (~> 3.10.0)
      rspec-expectations (~> 3.10.0)
      rspec-mocks (~> 3.10.0)
    rspec-core (3.10.1)
      rspec-support (~> 3.10.0)
    rspec-expectations (3.10.1)
      diff-lcs (>= 1.2.0, < 2.0)
      rspec-support (~> 3.10.0)
    rspec-mocks (3.10.2)
      diff-lcs (>= 1.2.0, < 2.0)
      rspec-support (~> 3.10.0)
    rspec-support (3.10.2)
    unf (0.1.4)
      unf_ext
    unf_ext (0.0.7.7)

PLATFORMS
  x86_64-linux

DEPENDENCIES
  awesome_print
  dataverse!
  dotenv
  pry
  rake (~> 13.0)
  rspec (~> 3.0)

BUNDLED WITH
   2.2.15
data/LICENSE.txt
ADDED
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2021 Kris Dekeyser

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
data/README.md
ADDED
@@ -0,0 +1,598 @@
# Dataverse

Ruby API wrapper for Dataverse repositories, built on top of the rest-client gem. To experiment with the code, run `bin/console` for an interactive prompt.
## Installation

Add this line to your application's Gemfile:

```ruby
gem 'dataverse'
```

And then execute:

    $ bundle install

Or install it yourself as:

    $ gem install dataverse
## Usage

This gem wraps the Dataverse API in a set of Ruby classes. You can use the classes to perform the API calls and process the results as Ruby objects. It builds upon the rest-client gem to perform the low-level REST API calls.

In order to configure the API calls, you need to define at least the first two of the following environment variables:

- API_URL: the full URL of the Dataverse repository you want to access. This URL should be given up to and including the '/api' path. Optionally, a version portion can be added to the path.
- API_TOKEN: a token to identify and authorize the user. Note that for some API calls a superuser token may be required.
- RESTCLIENT_LOG: if defined, the REST API calls will be logged to the given file. Set to 'stdout' if you want to log to the screen.
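As a quick sanity check, the variables can be set and verified from plain Ruby before any API call is made. This is an illustrative sketch only: the URL and token below are placeholders, and in a real project you would typically keep them in a `.env` file loaded by the dotenv gem instead of hard-coding them.

```ruby
# Illustrative only: placeholder URL and token, not real credentials.
ENV['API_URL']   = 'https://demo.dataverse.org/api/v1'
ENV['API_TOKEN'] = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'

# Fail fast when a mandatory setting is missing.
%w[API_URL API_TOKEN].each do |key|
  raise "#{key} is not set" if ENV[key].to_s.empty?
end

puts ENV['API_URL']
```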
## Dataverse::Dataverse

The class that captures the API dealing with dataverses.

### Accessing an existing dataverse

You can create a new instance by supplying the id or alias of an existing dataverse to the constructor:

```ruby
Dataverse::Dataverse.id('my_dataverse')
# => #<Dataverse::Dataverse:0x0...>
```

You can pass the value ':root' or use the #root class method if you want to access the root dataverse.

```ruby
Dataverse::Dataverse.id(':root') == Dataverse::Dataverse.root
# => true
```

### Creating a new dataverse

To create a new dataverse, first open an instance for the parent dataverse, then call the #create method on it, supplying either a Hash, a file name or a JSON string.

```ruby
parent_dv = Dataverse::Dataverse.id('parent_dv')
# => #<Dataverse::Dataverse:0x0...>

new_dv = parent_dv.create(name: 'My new dataverse', alias: 'new_dv', ...)
# => #<Dataverse::Dataverse:0x0...>
```

A sample data hash for a new dataverse is provided in:

```ruby
Dataverse::Dataverse::SAMPLE_DATA
# => {:name=>"new dataverse", :alias=>"new_dv", :dataverseContacts=>[
#      {:contactEmail=>"abc@def.org"}], :affiliation=>"My organization",
#      :description=>"My new dataverse", :dataverseType=>"ORGANIZATIONS_INSTITUTIONS"}
```

and the list of valid values for the field 'dataverseType' can be found in:

```ruby
Dataverse::Dataverse::TYPES
# => ["DEPARTMENT", "JOURNALS", "LABORATORY", "ORGANIZATIONS_INSTITUTIONS",
#     "RESEARCHERS", "RESEARCH_GROUP", "RESEARCH_PROJECTS", "TEACHING_COURSES",
#     "UNCATEGORIZED"]
```

All the metadata of an existing dataverse can be retrieved as a Hash with the #rdm_data method:

```ruby
parent_dv.rdm_data
# => {"id"=>5, "alias"=>"parent_dv", ...}
```

The resulting Hash can be saved to a file and used to create a new dataverse:

```ruby
data = parent_dv.rdm_data.dup
data['alias'] = 'new_dv'
filename = 'dataverse.json'
File.open(filename, 'wt') { |f| f.write JSON.pretty_generate(data) }
new_dv = parent_dv.create(filename)
# => #<Dataverse::Dataverse:0x0...>
```

### Deleting a dataverse

```ruby
new_dv.delete
# => {"message" => "Dataverse 15 deleted"}
```

### Publishing a dataverse

```ruby
new_dv.publish
# => "Dataverse 15 published"

new_dv.publish
# => Dataverse::Error: Dataverse new_dv has already been published
```

Note that if a dataverse was already published, the call will raise a Dataverse::Error exception.

### Access dataverse properties

The Dataverse properties can be accessed similarly to a Hash:

```ruby
parent_dv.keys
# => ["id", "alias", "name", "affiliation", "dataverseContacts", "permissionRoot",
#     "description", "dataverseType", "ownerId", "creationDate"]

parent_dv['alias']
# => "parent_dv"

parent_dv.fetch('alias')
# => "parent_dv"
```

Only the above Hash methods are implemented on the Dataverse class. For other Hash operations, you can access the data Hash directly:

```ruby
parent_dv.api_data.select { |k, v| k =~ /^a/ }
# => {"alias" => "parent_dv", "affiliation" => "My organization"}

parent_dv.api_data.values
# => [5, "parent_dv", ...]
```

Note that the data Hash is frozen, and calling methods that change its contents (e.g. #reject! and #delete) will throw a FrozenError exception. If you want to manipulate the Hash, you should create a copy first:

```ruby
parent_dv.api_data['id'] = 123456
# => FrozenError: can't modify a frozen Hash: ...

data = parent_dv.api_data.dup
# => {"id" => 5, "alias" => "parent_dv", ...}

data.delete('id')
# => 5

data['alias'] = 'new_dv'
# => "new_dv"

data
# => {"alias" => "new_dv", ...}
```

The id or alias that was used to instantiate the Dataverse:

```ruby
parent_dv.id
# => "parent_dv"

new_dv.id
# => 15
```

To get the id or alias explicitly, use the Hash methods:

```ruby
parent_dv['id']
# => 5

parent_dv['alias']
# => "parent_dv"
```

### Report the data file size of a Dataverse (in bytes)

```ruby
parent_dv.size
# => 123456789
```
### Browsing

Get an array of child dataverses and datasets:

```ruby
parent_dv.children
# => [#<Dataverse::Dataverse:0x0...>, #<Dataverse::Dataset:0x0...>]
```

Iterate over all child dataverses recursively:

```ruby
parent_dv.each_dataverse do |dv|
  puts dv.id
end
# => 10
# => 15
# ...
```

Iterate over all child datasets recursively:

```ruby
parent_dv.each_dataset do |ds|
  puts ds.size
end
# => 123456
# => 456123
# ...
```
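The recursion pattern behind these iterators can be shown with a stand-alone sketch. The `Node` struct below is a hypothetical stand-in for `Dataverse::Dataverse`, introduced only to illustrate the depth-first traversal; it is not part of the gem.

```ruby
# Hypothetical stand-in for Dataverse::Dataverse, used only to show the
# recursion pattern behind an each_dataverse-style iterator.
Node = Struct.new(:id, :children) do
  def each_dataverse(&block)
    children.each do |child|
      block.call(child)            # visit the child first (pre-order)
      child.each_dataverse(&block) # then recurse into its sub-dataverses
    end
  end
end

root = Node.new(1, [Node.new(10, [Node.new(15, [])]), Node.new(20, [])])
ids = []
root.each_dataverse { |dv| ids << dv.id }
p ids  # => [10, 15, 20]
```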
## Dataverse::Dataset

The class that encapsulates the dataset related API.

### Accessing an existing dataset

A new Dataset instance can be obtained from the parent Dataverse's #children call, or it can be instantiated directly if you know the dataset's id or persistent identifier:

```ruby
ds = parent_dv.children[1]
# => #<Dataverse::Dataset:0x0...>

Dataverse::Dataset.new(25)
# => #<Dataverse::Dataset:0x0...>

Dataverse::Dataset.pid('doi:10.5072/FK2/J8SJZB')
# => #<Dataverse::Dataset:0x0...>
```

### Creating a new dataset

A new dataset can only be created on an existing dataverse. You should supply either a Hash, a file name or a JSON string to the #create_dataset method.

```ruby
ds = parent_dv.create_dataset(
  'datasetVersion' => {
    'metadataBlocks' => {
      'citation' => {
        ...
      }
    }
  }
)
# => #<Dataverse::Dataset:0x0...>
```

All the metadata of an existing dataset required to create a new dataset can be retrieved as a Hash with the #raw_data method:

```ruby
data = ds.raw_data
# => {"datasetVersion" => {"metadataBlocks" => {"citation" => {...}}}}
```

The resulting Hash can be used to create a new dataset, either directly or by saving it to a file:

```ruby
data = ds.raw_data
new_ds = parent_dv.create_dataset(data)
# => #<Dataverse::Dataset:0x0...>

filename = 'dataset.json'
File.open(filename, 'wt') { |f| f.write JSON.pretty_generate(data) }
new_ds = parent_dv.create_dataset(filename)
# => #<Dataverse::Dataset:0x0...>
```
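One way the "Hash, file name or JSON string" argument could be normalized to a plain Hash is sketched below. This is an illustrative helper, not the gem's actual code: `normalize_input` is a hypothetical name, and the dispatch rules are an assumption based on the three input forms described above.

```ruby
require 'json'

# Illustrative helper: turn a Hash, a JSON string, or the name of a JSON
# file into a plain Hash, mirroring the inputs #create_dataset accepts.
def normalize_input(data)
  case data
  when Hash then data
  when String
    if File.exist?(data)
      JSON.parse(File.read(data)) # the string names a JSON file
    else
      JSON.parse(data)            # the string is raw JSON
    end
  else
    raise ArgumentError, 'expected Hash, file name or JSON string'
  end
end

p normalize_input('{"datasetVersion":{}}')  # => {"datasetVersion"=>{}}
```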
### Importing a dataset

The #import_dataset method on a dataverse allows you to import an existing dataset. The dataset should be registered and its persistent identifier should be supplied in the pid argument. The data argument is similar to that of the #create_dataset method.

```ruby
data = 'dataset.json'
pid = 'doi:ZZ7/MOSEISLEYDB94'
ds = parent_dv.import_dataset(data, pid: pid)
# => #<Dataverse::Dataset:0x0...>
```

Optionally, upon importing, you can immediately publish the imported dataset:

```ruby
data = 'dataset.json'
pid = 'doi:ZZ7/MOSEISLEYDB94'
ds = parent_dv.import_dataset(data, pid: pid, publish: true)
ds.versions
# => [:latest, :published, 1.0]
```

If you have DDI data instead of Dataverse JSON, you can import that as well:

```ruby
data = 'dataset_ddi.xml'
pid = 'doi:ZZ7/MOSEISLEYDB94'
ds = parent_dv.import_dataset(data, pid: pid, ddi: true)
# => #<Dataverse::Dataset:0x0...>
```
### Deleting a dataset

```ruby
ds.delete
# => 'Draft version of dataset 53 deleted'
```

Only the draft version of a dataset can be deleted. If only a draft version exists in the repository, the entire dataset will be deleted. Note that the Ruby Dataverse::Dataset object will still exist and will still hold the cached data for any version other than the draft version.

### Access dataset properties

The Dataset properties can be accessed just like with the Dataverse class:

```ruby
ds.keys
# => ["id", "identifier", "persistentUrl", "protocol", "authority", "publisher",
#     "publicationDate", "storageIdentifier", "latestVersion"]

ds['identifier']
# => "FK2/J8SJZB"

ds.fetch('identifier')
# => "FK2/J8SJZB"

ds.api_data.keys
# => ["id", "identifier", "persistentUrl", "protocol", "authority", "publisher",
#     "publicationDate", "storageIdentifier", "latestVersion"]

ds.api_data['identifier']
# => "FK2/J8SJZB"
```

The id or pid of the Dataset:

```ruby
ds.id
# => "25"

ds.pid
# => "doi:10.5072/FK2/J8SJZB"
```

The title and author in the metadata of the latest version:

```ruby
ds.title
# => "My new dataset"

ds.author
# => "Lastname, Firstname"
```

Some timestamps:

```ruby
ds.created
# => 2021-02-11 18:05:46 +0100

ds.updated
# => 2021-02-11 18:34:47 +0100

ds.published
# => 2021-02-11 18:34:47 +0100
```
### Accessing metadata

```ruby
ds.metadata_fields
# => ["title", "alternativeTitle", "alternativeURL", "otherId", "author", ...]

ds.metadata
# => { "title" => "My new dataset",
#      "author" => [
#        {
#          "authorName" => "Lastname, Firstname",
#          "authorIdentifierScheme" => "ORCID",
#          "authorIdentifier" => "0000-0001-2345-6789"
#        }
#      ],
#      ...
#    }
```
### Exporting metadata

```ruby
md_type = 'dataverse_json'
ds.export_metadata(md_type)
# => { ... }

md_type = 'raw'
ds.export_metadata(md_type) == ds.raw_data
# => true

Dataverse::Dataset::MD_TYPES
# => ["rdm", "raw", "schema.org", "OAI_ORE", "dataverse_json",
#     "ddi", "oai_ddi", "dcterms", "oai_dc", "Datacite", "oai_datacite"]

# Note: format types 'ddi' and later are XML formats; the others are JSON-based.

Dataverse::Dataset::MD_TYPES_XML
# => ["ddi", "oai_ddi", "dcterms", "oai_dc", "Datacite", "oai_datacite"]

Dataverse::Dataset::MD_TYPES_JSON
# => ["schema.org", "OAI_ORE", "dataverse_json"]

# Note: JSON metadata will be parsed and returned as a Hash,
# to ease inspection and manipulation of the metadata:
data = ds.export_metadata('schema.org')
# => {"@context"=>"http://schema.org", "@type"=>"Dataset", ...}
```
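The "everything from 'ddi' onwards is XML" rule can be expressed directly over the type list. A small illustrative sketch follows; the array literal is copied from the MD_TYPES output shown above rather than read from the gem, and note that the gem's own MD_TYPES_JSON constant additionally leaves out 'rdm' and 'raw', which have their own access methods.

```ruby
# Copied from the MD_TYPES output shown above (not read from the gem).
MD_TYPES = ['rdm', 'raw', 'schema.org', 'OAI_ORE', 'dataverse_json',
            'ddi', 'oai_ddi', 'dcterms', 'oai_dc', 'Datacite', 'oai_datacite']

# Everything from 'ddi' onwards is an XML format; the rest is Hash/JSON-based.
xml_types  = MD_TYPES.drop_while { |t| t != 'ddi' }
json_types = MD_TYPES - xml_types

p xml_types   # => ["ddi", "oai_ddi", "dcterms", "oai_dc", "Datacite", "oai_datacite"]
p json_types  # => ["rdm", "raw", "schema.org", "OAI_ORE", "dataverse_json"]
```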
If the metadata type is an XML format, the data will be a REXML::Document instance:

```ruby
data = ds.export_metadata('dcterms')
# => <UNDEFINED> ...</>

data.write(indent: 2)
# <?xml version='1.0' encoding='UTF-8'?>
# <metadata xmlns:dcterms='http://purl.org/dc/elements/1.1/' ...>
#   <dcterms:title>My new dataset</dcterms:title>
#   ...
# </metadata>

File.open('dataset_dcterms.xml', 'wt') do |f|
  f.write data
end
```
The 'rdm' metadata format is not one of the officially supported metadata output formats, but a slightly more compact version of the 'dataverse_json' format. It can be accessed directly using the #rdm_data method:

```ruby
ds.rdm_data
# => {"id"=>5, "versionId"=>8, ...,
#     "metadata"=> {"title"=>"My new dataset", ...}}

md_type = 'rdm'
ds.export_metadata(md_type) == ds.rdm_data
# => true
```

The 'raw' metadata format is the format that is required for the creation and import of datasets.

```ruby
data = ds.export_metadata('raw')
# => {"datasetVersion"=>{"id"=>25, ...}}

data == ds.raw_data
# => true

data.dig('datasetVersion', 'files')
# => nil

ds.raw_data(with_files: true).dig('datasetVersion', 'files')
# => [{"description"=>"data file", "label"=>"file.pdf", ...}]
```
### Report the data file size of a Dataset (in bytes)

```ruby
ds.size
# => 123456789
```

### Accessing dataset files

```ruby
ds.files
# => [ { "description"=>"File description",
#        "label"=>"file.pdf",
#        "id"=>16,
#        "persistentId"=>"doi:10.5072/FK2/J8SJZB/2QPLAC",
#        ...
#      }
#    ]
```

To download data files, you can:

1. Download all files:

```ruby
ds.download
# => downloads all files as 'dataset_files.zip' for the latest version

ds.download 'files.zip'
# => use 'files.zip' as target file

ds.download version: '1.0'
# => downloads all files for the version '1.0'
```

2. Download a specific file:

```ruby
# TODO
```

### Accessing dataset versions

```ruby
ds.versions
# => [:latest, :published, :draft, 3.0, 2.0, 1.0]

ds.published_versions
# => [1.0, 2.0, 3.0]

ds.draft_version
# => :draft
# will return nil if there is no draft version

ds.version(:published)
# => 3.0

ds.version(:latest)
# => :draft

ds.version(2)
# => 2.0

ds.version(4)
# => nil
```

Use the #versions method to get a list of valid versions; the #version method resolves a given version name to the version number or the special version :draft. Valid version names are:

- :draft, ':draft' or 'draft' => the draft version, if it exists
- :latest, ':latest' or 'latest' => the latest version: draft if it exists, the last published version otherwise
- :published, ':published', 'published', ':latest-published' or 'latest-published' => the last published version, if it exists
- a number => a specific published version; an integer number n will be interpreted as n.0
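The resolution rules above can be sketched as a small stand-alone resolver. This is illustrative only, not the gem's implementation: `resolve_version` is a hypothetical name, and it takes the published version list and the draft flag as explicit arguments instead of reading the dataset's state.

```ruby
# Illustrative resolver for the version-name rules above (not the gem's code).
# `published` is the list of published version numbers, `draft` a boolean.
def resolve_version(name, published:, draft:)
  case name.to_s.sub(/^:/, '')
  when 'draft'
    draft ? :draft : nil
  when 'latest'
    draft ? :draft : published.max
  when 'published', 'latest-published'
    published.max
  else
    v = Float(name) # an integer n is interpreted as n.0
    published.include?(v) ? v : nil
  end
end

p resolve_version(:latest,    published: [1.0, 2.0], draft: true)   # => :draft
p resolve_version(:published, published: [1.0, 2.0], draft: true)   # => 2.0
p resolve_version(2,          published: [1.0, 2.0], draft: false)  # => 2.0
p resolve_version(4,          published: [1.0, 2.0], draft: false)  # => nil
```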
The following methods take an optional version: argument that allows you to retrieve the data specific to that version: #pid, #title, #author, #updated, #created, #published, #metadata_fields, #rdm_data, #metadata, #files and #download. Most of these methods default to the :latest version if it is omitted. Exceptions to this rule are #rdm_data (:published) and #download (nil, which uses the download API on the dataset level by default).

```ruby
ds.title
# => "My new dataset"

ds.title(version: 1)
# => "Preliminary title"

ds.updated
# => 2021-02-11 18:34:47 +0100

ds.updated(version: 1)
# => 2021-02-03 12:05:13 +0100

ds.published
# => 2021-02-11 18:34:47 +0100

ds.published(version: :draft)
# => nil
# (the :draft version does not have a publication date)
```

Note that in most cases, requesting a non-existent version will throw a Dataverse::VersionError exception. If you want to avoid having to catch the exception in your code, you can use the #version method first to check whether the version is valid.

```ruby
ds.title(version: 8.2)
# => Dataverse::VersionError: Version 8.2 does not exist

ds.title(version: 8.2) if ds.version(8.2)
# => nil
```
## Development

After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/libis/dataverse_api. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/libis/dataverse_api/blob/master/CODE_OF_CONDUCT.md).

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Code of Conduct

Everyone interacting in the Dataverse project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/libis/dataverse_api/blob/master/CODE_OF_CONDUCT.md).