gpt_neox_client 0.1.0
- checksums.yaml +7 -0
- data/CHANGELOG.md +5 -0
- data/CODE_OF_CONDUCT.md +84 -0
- data/LICENSE.txt +21 -0
- data/README.md +68 -0
- data/ext/gpt_neox_client/extconf.rb +25 -0
- data/ext/gpt_neox_client/gpt_neox_client.cpp +316 -0
- data/ext/gpt_neox_client/gpt_neox_client.h +10 -0
- data/ext/gpt_neox_client/src/LICENSE +21 -0
- data/ext/gpt_neox_client/src/common-ggml.cpp +246 -0
- data/ext/gpt_neox_client/src/common-ggml.h +18 -0
- data/ext/gpt_neox_client/src/common.cpp +809 -0
- data/ext/gpt_neox_client/src/common.h +176 -0
- data/ext/gpt_neox_client/src/dr_wav.h +6434 -0
- data/ext/gpt_neox_client/src/ggml/ggml-alloc.c +594 -0
- data/ext/gpt_neox_client/src/ggml/ggml-alloc.h +26 -0
- data/ext/gpt_neox_client/src/ggml/ggml-cuda.cu +6756 -0
- data/ext/gpt_neox_client/src/ggml/ggml-cuda.h +46 -0
- data/ext/gpt_neox_client/src/ggml/ggml-metal.h +85 -0
- data/ext/gpt_neox_client/src/ggml/ggml-metal.m +1195 -0
- data/ext/gpt_neox_client/src/ggml/ggml-metal.metal +2049 -0
- data/ext/gpt_neox_client/src/ggml/ggml-opencl.cpp +1865 -0
- data/ext/gpt_neox_client/src/ggml/ggml-opencl.h +25 -0
- data/ext/gpt_neox_client/src/ggml/ggml.c +20632 -0
- data/ext/gpt_neox_client/src/ggml/ggml.h +1997 -0
- data/ext/gpt_neox_client/src/main.cpp +814 -0
- data/lib/gpt_neox_client/version.rb +7 -0
- data/lib/gpt_neox_client.rb +4 -0
- metadata +75 -0
checksums.yaml
ADDED
@@ -0,0 +1,7 @@
+---
+SHA256:
+  metadata.gz: d77a992f9cbba2e7a91141c859194cc0a200b9c5dd4e58aeedc51482ec75b8c0
+  data.tar.gz: 302c37b125a0875463859b62fb7918b47c38cd521ad444fe758a36ba06e27ade
+SHA512:
+  metadata.gz: a3ab1eb43db87f08e24fb16181d83feeb4b1421fd601165a233cc48eccfd0403de2e1d98042e89e259c843e5c3021e3931d57f4742ee20ae8c7e55cf45f6c0d4
+  data.tar.gz: 88f961e5a901ea5896486b4612ab3ecc8ebcbad12726fd76700bb1248e31c11716ae4a7248592f7657fb056e36af19f466f95860862a4d165fc026df3d4cb04f
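checksums.yaml records the SHA256 and SHA512 digests of the gem's two payload archives, metadata.gz and data.tar.gz. A minimal sketch of verifying them locally; the fetch/untar steps and file names follow standard RubyGems packaging and are not part of this gem:

```ruby
require "digest"
require "yaml"
require "zlib"

# Assumes the packaged gem has been fetched and unpacked in the current directory:
#   gem fetch gpt_neox_client --version 0.1.0
#   tar -xf gpt_neox_client-0.1.0.gem
# which yields metadata.gz, data.tar.gz, and checksums.yaml.gz.
expected = YAML.safe_load(Zlib::GzipReader.open("checksums.yaml.gz", &:read))

%w[metadata.gz data.tar.gz].each do |file|
  actual = Digest::SHA256.file(file).hexdigest
  status = actual == expected.dig("SHA256", file) ? "OK" : "MISMATCH"
  puts "#{file}: #{status}"
end
```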
data/CHANGELOG.md
ADDED
data/CODE_OF_CONDUCT.md
ADDED
@@ -0,0 +1,84 @@
+# Contributor Covenant Code of Conduct
+
+## Our Pledge
+
+We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.
+
+We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.
+
+## Our Standards
+
+Examples of behavior that contributes to a positive environment for our community include:
+
+* Demonstrating empathy and kindness toward other people
+* Being respectful of differing opinions, viewpoints, and experiences
+* Giving and gracefully accepting constructive feedback
+* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
+* Focusing on what is best not just for us as individuals, but for the overall community
+
+Examples of unacceptable behavior include:
+
+* The use of sexualized language or imagery, and sexual attention or
+  advances of any kind
+* Trolling, insulting or derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or email
+  address, without their explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+  professional setting
+
+## Enforcement Responsibilities
+
+Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.
+
+Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.
+
+## Scope
+
+This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.
+
+## Enforcement
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at yoshoku@outlook.com. All complaints will be reviewed and investigated promptly and fairly.
+
+All community leaders are obligated to respect the privacy and security of the reporter of any incident.
+
+## Enforcement Guidelines
+
+Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:
+
+### 1. Correction
+
+**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.
+
+**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.
+
+### 2. Warning
+
+**Community Impact**: A violation through a single incident or series of actions.
+
+**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.
+
+### 3. Temporary Ban
+
+**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.
+
+**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.
+
+### 4. Permanent Ban
+
+**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.
+
+**Consequence**: A permanent ban from any sort of public interaction within the community.
+
+## Attribution
+
+This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0,
+available at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
+
+Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).
+
+[homepage]: https://www.contributor-covenant.org
+
+For answers to common questions about this code of conduct, see the FAQ at
+https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.
data/LICENSE.txt
ADDED
@@ -0,0 +1,21 @@
+The MIT License (MIT)
+
+Copyright (c) 2023 Atsushi Tatsuma
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in
+all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+THE SOFTWARE.
data/README.md
ADDED
@@ -0,0 +1,68 @@
+# GPTNeoXClient
+
+[![Gem Version](https://badge.fury.io/rb/gpt_neox_client.svg)](https://badge.fury.io/rb/gpt_neox_client)
+[![License](https://img.shields.io/badge/License-MIT-yellowgreen.svg)](https://github.com/yoshoku/gpt_neox_client/blob/main/LICENSE.txt)
+[![Documentation](https://img.shields.io/badge/api-reference-blue.svg)](https://yoshoku.github.io/gpt_neox_client/doc/)
+
+gpt_neox_client is a simple client for GPT-NeoX.
+
+## Installation
+
+Install the gem and add to the application's Gemfile by executing:
+
+    $ bundle add gpt_neox_client
+
+If bundler is not being used to manage dependencies, install the gem by executing:
+
+    $ gem install gpt_neox_client
+
+## Usage
+
+GPTNeoXClient supports models in the ggml format, so model files need to be converted to ggml format before use.
+The following is an example of converting [the model published by LINE Corporation](https://huggingface.co/line-corporation/japanese-large-lm-3.6b-instruction-sft):
+
+```sh
+# Clone ggml.
+$ git clone https://github.com/ggerganov/ggml.git
+$ cd ggml
+
+# Install python dependencies.
+$ pip install -U protobuf~=3.20.0
+$ python -m pip install -r requirements.txt
+
+# Convert downloaded model to ggml format.
+$ git clone https://huggingface.co/line-corporation/japanese-large-lm-3.6b-instruction-sft
+$ python examples/gpt-neox/convert-h5-to-ggml.py japanese-large-lm-3.6b-instruction-sft 1
+$ ls japanese-large-lm-3.6b-instruction-sft/ggml-model-f16.bin
+japanese-large-lm-3.6b-instruction-sft/ggml-model-f16.bin
+```
+
+```ruby
+require "gpt_neox_client"
+
+client = GPTNeoXClient.new('japanese-large-lm-3.6b-instruction-sft/ggml-model-f16.bin', seed: 123456789, n_threads: 4)
+puts client.completions(
+  'ユーザー:四国の県名を全て列挙してください。<0x0A>システム:',
+  top_p: 0.9,
+  top_k: 1,
+  temperature: 0.7
+).gsub("<0x0A>", "\n").gsub("</s>", " ")
+#
+# ユーザー:四国の県名を全て列挙してください。
+# システム:徳島県、香川県、愛媛県、高知県
+```
+
+## Contributing
+
+Bug reports and pull requests are welcome on GitHub at https://github.com/yoshoku/gpt_neox_client.
+This project is intended to be a safe, welcoming space for collaboration,
+and contributors are expected to adhere to the [code of conduct](https://github.com/yoshoku/gpt_neox_client/blob/main/CODE_OF_CONDUCT.md).
+
+## License
+
+The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
+
+## Code of Conduct
+
+Everyone interacting in the GPTNeoXClient project's codebases, issue trackers,
+chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/yoshoku/gpt_neox_client/blob/main/CODE_OF_CONDUCT.md).
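The README example post-processes the raw completion by substituting the model's special tokens (`<0x0A>` for newline, `</s>` for end of sequence). As a small illustrative wrapper around the same interface shown above — the `ask` helper and its exact substitutions are an assumption for illustration, not part of the gem:

```ruby
require "gpt_neox_client"

# Hypothetical helper built on the README example: it sends one instruction to
# the Japanese instruction-tuned model using the ユーザー:/システム: prompt
# format from the README and cleans up the special tokens in the reply.
def ask(client, instruction)
  raw = client.completions(
    "ユーザー:#{instruction}<0x0A>システム:",
    top_p: 0.9,
    top_k: 1,
    temperature: 0.7
  )
  raw.gsub("<0x0A>", "\n").gsub("</s>", "").strip
end

client = GPTNeoXClient.new('japanese-large-lm-3.6b-instruction-sft/ggml-model-f16.bin',
                           seed: 123_456_789, n_threads: 4)
puts ask(client, "四国の県名を全て列挙してください。")
```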
data/ext/gpt_neox_client/extconf.rb
ADDED
@@ -0,0 +1,25 @@
+# frozen_string_literal: true
+
+require 'mkmf'
+
+abort 'libstdc++ is not found.' unless have_library('stdc++')
+
+File.open("#{__dir__}/src/main.hpp", 'w') do |fout|
+  File.open("#{__dir__}/src/main.cpp", 'r') do |fin|
+    fin.each_line do |line|
+      break if line.match?(/^int main/)
+
+      fout.puts line
+    end
+  end
+end
+
+$srcs = %w[ggml.c ggml-alloc.c common-ggml.cpp common.cpp gpt_neox_client.cpp]
+$CFLAGS << ' -w -DNDEBUG'
+$CXXFLAGS << ' -std=c++11 -DNDEBUG'
+$INCFLAGS << ' -I$(srcdir)/src'
+$INCFLAGS << ' -I$(srcdir)/src/ggml'
+$VPATH << '$(srcdir)/src'
+$VPATH << '$(srcdir)/src/ggml'
+
+create_makefile('gpt_neox_client/gpt_neox_client')
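extconf.rb copies everything in src/main.cpp above `int main` into src/main.hpp, so that the upstream ggml GPT-NeoX example's model-loading, evaluation, and sampling routines can be reused by the binding, and then generates a Makefile for the listed sources. A hypothetical manual build from a source checkout, mirroring what `gem install` does for an mkmf-based extension (paths and steps are illustrative):

```ruby
# Hypothetical manual build of the native extension from a checkout of the
# gem sources; `gem install gpt_neox_client` performs equivalent steps.
Dir.chdir("ext/gpt_neox_client") do
  system("ruby extconf.rb", exception: true) # writes src/main.hpp and the Makefile
  system("make", exception: true)            # compiles the gpt_neox_client extension
end
```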
data/ext/gpt_neox_client/gpt_neox_client.cpp
ADDED
@@ -0,0 +1,316 @@
+#include "gpt_neox_client.h"
+
+VALUE rb_cGPTNeoXClient;
+VALUE rb_cGPTVocab;
+VALUE rb_cGPTNeoXModel;
+
+class RbGPTNeoXModel {
+public:
+  static VALUE gpt_neox_model_alloc(VALUE self) {
+    gpt_neox_model* ptr = (gpt_neox_model*)ruby_xmalloc(sizeof(gpt_neox_model));
+    new (ptr) gpt_neox_model();
+    return TypedData_Wrap_Struct(self, &gpt_neox_model_type, ptr);
+  }
+
+  static void gpt_neox_model_free(void* ptr) {
+    ((gpt_neox_model*)ptr)->~gpt_neox_model();
+    ruby_xfree(ptr);
+  }
+
+  static size_t gpt_neox_model_size(const void* ptr) {
+    return sizeof(*((gpt_neox_model*)ptr));
+  }
+
+  static gpt_neox_model* get_gpt_neox_model(VALUE self) {
+    gpt_neox_model* ptr = nullptr;
+    TypedData_Get_Struct(self, gpt_neox_model, &gpt_neox_model_type, ptr);
+    return ptr;
+  }
+
+  static void define_class(VALUE outer) {
+    rb_cGPTNeoXModel = rb_define_class_under(outer, "Model", rb_cObject);
+    rb_define_alloc_func(rb_cGPTNeoXModel, gpt_neox_model_alloc);
+  }
+
+private:
+  static const rb_data_type_t gpt_neox_model_type;
+};
+
+const rb_data_type_t RbGPTNeoXModel::gpt_neox_model_type = {
+  "RbGPTNeoXModel",
+  { 0,
+    RbGPTNeoXModel::gpt_neox_model_free,
+    RbGPTNeoXModel::gpt_neox_model_size },
+  0,
+  0,
+  RUBY_TYPED_FREE_IMMEDIATELY
+};
+
+class RbGPTVocab {
+public:
+  static VALUE gpt_vocab_alloc(VALUE self) {
+    gpt_vocab* ptr = (gpt_vocab*)ruby_xmalloc(sizeof(gpt_vocab));
+    new (ptr) gpt_vocab();
+    return TypedData_Wrap_Struct(self, &gpt_vocab_type, ptr);
+  }
+
+  static void gpt_vocab_free(void* ptr) {
+    ((gpt_vocab*)ptr)->~gpt_vocab();
+    ruby_xfree(ptr);
+  }
+
+  static size_t gpt_vocab_size(const void* ptr) {
+    return sizeof(*((gpt_vocab*)ptr));
+  }
+
+  static gpt_vocab* get_gpt_vocab(VALUE self) {
+    gpt_vocab* ptr = nullptr;
+    TypedData_Get_Struct(self, gpt_vocab, &gpt_vocab_type, ptr);
+    return ptr;
+  }
+
+  static void define_class(VALUE outer) {
+    rb_cGPTVocab = rb_define_class_under(outer, "Vocab", rb_cObject);
+    rb_define_alloc_func(rb_cGPTVocab, gpt_vocab_alloc);
+  }
+
+private:
+  static const rb_data_type_t gpt_vocab_type;
+};
+
+const rb_data_type_t RbGPTVocab::gpt_vocab_type = {
+  "RbGPTVocab",
+  { 0,
+    RbGPTVocab::gpt_vocab_free,
+    RbGPTVocab::gpt_vocab_size },
+  0,
+  0,
+  RUBY_TYPED_FREE_IMMEDIATELY
+};
+
+static VALUE gpt_neox_client_initialize(int argc, VALUE* argv, VALUE self) {
+  VALUE kw_args = Qnil;
+  ID kw_table[3] = { rb_intern("path"), rb_intern("seed"), rb_intern("n_threads") };
+  VALUE kw_values[3] = { Qundef, Qundef, Qundef };
+  rb_scan_args(argc, argv, ":", &kw_args);
+  rb_get_kwargs(kw_args, kw_table, 1, 2, kw_values);
+
+  if (!RB_TYPE_P(kw_values[0], T_STRING)) {
+    rb_raise(rb_eArgError, "path must be a String");
+    return Qnil;
+  }
+  if (kw_values[1] != Qundef && !RB_NIL_P(kw_values[1]) && !RB_INTEGER_TYPE_P(kw_values[1])) {
+    rb_raise(rb_eArgError, "seed must be an integer");
+    return Qnil;
+  }
+  if (RB_INTEGER_TYPE_P(kw_values[1]) && NUM2INT(kw_values[1]) < 0) {
+    rb_raise(rb_eArgError, "seed must be an integer greater than or equal to zero");
+    return Qnil;
+  }
+  if (kw_values[2] != Qundef && !RB_NIL_P(kw_values[2]) && !RB_INTEGER_TYPE_P(kw_values[2])) {
+    rb_raise(rb_eArgError, "n_threads must be an integer");
+    return Qnil;
+  }
+  if (RB_INTEGER_TYPE_P(kw_values[2]) && NUM2INT(kw_values[2]) < 1) {
+    rb_raise(rb_eArgError, "n_threads must be a positive integer");
+    return Qnil;
+  }
+
+  std::string path(StringValueCStr(kw_values[0]));
+  std::random_device rnd;
+  const unsigned int seed = RB_INTEGER_TYPE_P(kw_values[1]) ? NUM2INT(kw_values[1]) : rnd();
+  const unsigned int n_threads_ = RB_INTEGER_TYPE_P(kw_values[2]) ? NUM2INT(kw_values[2]) : 1;
+  const unsigned int n_threads = std::min(n_threads_, std::thread::hardware_concurrency());
+
+  rb_iv_set(self, "@path", kw_values[0]);
+  rb_iv_set(self, "@seed", UINT2NUM(seed));
+  rb_iv_set(self, "@n_threads", UINT2NUM(n_threads));
+
+  rb_iv_set(self, "@vocab", rb_funcall(rb_const_get(rb_cGPTNeoXClient, rb_intern("Vocab")), rb_intern("new"), 0));
+  rb_iv_set(self, "@model", rb_funcall(rb_const_get(rb_cGPTNeoXClient, rb_intern("Model")), rb_intern("new"), 0));
+
+  gpt_neox_model* model = RbGPTNeoXModel::get_gpt_neox_model(rb_iv_get(self, "@model"));
+  gpt_vocab* vocab = RbGPTVocab::get_gpt_vocab(rb_iv_get(self, "@vocab"));
+
+  if (!gpt_neox_model_load(path, *model, *vocab)) {
+    rb_raise(rb_eRuntimeError, "failed to load model: %s", path.c_str());
+    return Qnil;
+  }
+
+  return self;
+}
+
+static VALUE gpt_neox_client_completions(int argc, VALUE* argv, VALUE self) {
+  VALUE prompt_ = Qnil;
+  VALUE kw_args = Qnil;
+  rb_scan_args(argc, argv, "1:", &prompt_, &kw_args);
+
+  ID kw_table[7] = { rb_intern("top_k"), rb_intern("top_p"), rb_intern("temperature"),
+                     rb_intern("n_predict"), rb_intern("n_batch"),
+                     rb_intern("repeat_last_n"), rb_intern("repeat_penalty") };
+  VALUE kw_values[7] = { Qundef, Qundef, Qundef, Qundef, Qundef, Qundef, Qundef };
+  rb_get_kwargs(kw_args, kw_table, 0, 7, kw_values);
+
+  if (kw_values[0] != Qundef && !RB_INTEGER_TYPE_P(kw_values[0])) {
+    rb_raise(rb_eArgError, "top_k must be an integer");
+    return Qnil;
+  }
+  if (kw_values[1] != Qundef && !RB_FLOAT_TYPE_P(kw_values[1])) {
+    rb_raise(rb_eArgError, "top_p must be a float");
+    return Qnil;
+  }
+  if (kw_values[2] != Qundef && !RB_FLOAT_TYPE_P(kw_values[2])) {
+    rb_raise(rb_eArgError, "temp must be a float");
+    return Qnil;
+  }
+  if (kw_values[3] != Qundef && !RB_INTEGER_TYPE_P(kw_values[3])) {
+    rb_raise(rb_eArgError, "n_predict must be an integer");
+    return Qnil;
+  }
+  if (kw_values[4] != Qundef && !RB_INTEGER_TYPE_P(kw_values[4])) {
+    rb_raise(rb_eArgError, "n_batch must be an integer");
+    return Qnil;
+  }
+  if (kw_values[5] != Qundef && !RB_INTEGER_TYPE_P(kw_values[5])) {
+    rb_raise(rb_eArgError, "repeat_last_n must be an integer");
+    return Qnil;
+  }
+  if (kw_values[6] != Qundef && !RB_FLOAT_TYPE_P(kw_values[6])) {
+    rb_raise(rb_eArgError, "repeat_penalty must be a float");
+    return Qnil;
+  }
+
+  std::string prompt(StringValueCStr(prompt_));
+  const int top_k = kw_values[0] != Qundef ? NUM2INT(kw_values[0]) : 40;
+  const double top_p = kw_values[1] != Qundef ? NUM2DBL(kw_values[1]) : 0.9;
+  const double temp = kw_values[2] != Qundef ? NUM2DBL(kw_values[2]) : 0.9;
+  const int n_predict_ = kw_values[3] != Qundef ? NUM2INT(kw_values[3]) : 200;
+  const int n_batch = kw_values[4] != Qundef ? NUM2INT(kw_values[4]) : 8;
+  const int repeat_last_n = kw_values[5] != Qundef ? NUM2INT(kw_values[5]) : 64;
+  const float repeat_penalty = kw_values[6] != Qundef ? NUM2DBL(kw_values[6]) : 1.0f;
+
+  gpt_neox_model* model = RbGPTNeoXModel::get_gpt_neox_model(rb_iv_get(self, "@model"));
+  gpt_vocab* vocab = RbGPTVocab::get_gpt_vocab(rb_iv_get(self, "@vocab"));
+
+  std::vector<gpt_vocab::id> embd_inp = gpt_tokenize(*vocab, prompt);
+  const int n_predict = std::min(n_predict_, model->hparams.n_ctx - static_cast<int>(embd_inp.size()));
+
+  const int n_threads = NUM2INT(rb_iv_get(self, "@n_threads"));
+  std::vector<float> logits;
+  size_t mem_per_token = 0;
+  gpt_neox_eval(*model, n_threads, 0, { 0, 1, 2, 3 }, logits, mem_per_token);
+
+  int n_past = 0;
+  int n_consumed = 0;
+  int n_sampled = 0;
+  std::string completions = "";
+  const unsigned int seed = NUM2UINT(rb_iv_get(self, "@seed"));
+  std::mt19937 rng(seed);
+  std::vector<gpt_vocab::id> embd;
+  std::vector<int32_t> last_n_tokens(model->hparams.n_ctx, 0);
+
+  while (n_sampled < n_predict) {
+    if (embd.size() > 0) {
+      if (!gpt_neox_eval(*model, n_threads, n_past, embd, logits, mem_per_token)) {
+        rb_raise(rb_eRuntimeError, "failed to predict.");
+        return Qnil;
+      }
+      n_past += embd.size();
+      embd.clear();
+    }
+
+    if (embd_inp.size() <= n_consumed) {
+      gpt_vocab::id id = gpt_sample_top_k_top_p_repeat(
+        *vocab,
+        logits.data() + (logits.size() - model->hparams.n_vocab),
+        last_n_tokens.data(), last_n_tokens.size(),
+        top_k, top_p, temp,
+        repeat_last_n, repeat_penalty,
+        rng);
+      last_n_tokens.erase(last_n_tokens.begin());
+      last_n_tokens.push_back(id);
+      embd.push_back(id);
+      n_sampled += 1;
+    } else {
+      while (embd_inp.size() > n_consumed) {
+        embd.push_back(embd_inp[n_consumed]);
+        n_consumed += 1;
+        if (embd.size() >= n_batch) break;
+      }
+    }
+
+    for (auto id : embd) completions += vocab->id_to_token[id];
+    if (embd.back() == 0) break;
+  }
+
+  RB_GC_GUARD(prompt_);
+  return rb_utf8_str_new_cstr(completions.c_str());
+}
+
+extern "C" void Init_gpt_neox_client(void) {
+  /**
+   * Document-class: GPTNeoXClient
+   * GPTNeoXClient is a Ruby client for GPT-NeoX.
+   */
+  rb_cGPTNeoXClient = rb_define_class("GPTNeoXClient", rb_cObject);
+  /**
+   * Creates a new GPTNeoXClient.
+   *
+   * @example
+   *   require "gpt_neox_client"
+   *
+   *   client = GPTNeoXClient.new("gpt-neox-f16.bin")
+   *
+   * @overload initialize(path, seed: nil, n_threads: 1)
+   *   @param [String] path The path to the model.
+   *   @param [Integer] seed The seed for random number generation.
+   *   @param [Integer] n_threads The number of threads.
+   *   @return [GPTNeoXClient]
+   */
+  rb_define_method(rb_cGPTNeoXClient, "initialize", RUBY_METHOD_FUNC(gpt_neox_client_initialize), -1);
+  /**
+   * Generates completions.
+   *
+   * @example
+   *   require "gpt_neox_client"
+   *
+   *   client = GPTNeoXClient.new("gpt-neox-f16.bin")
+   *   client.completions("Hello, my name is")
+   *
+   * @overload completions(prompt, top_k: 40, top_p: 0.9, temperature: 0.9, n_predict: 200, n_batch: 8, repeat_last_n: 64, repeat_penalty: 1.0)
+   *   @param [String] prompt The prompt.
+   *   @param [Integer] top_k The number of top tokens to consider for sampling.
+   *   @param [Float] top_p The cumulative probability of top tokens to consider for sampling.
+   *   @param [Float] temperature The temperature of the softmax distribution.
+   *   @param [Integer] n_predict The number of tokens to predict.
+   *   @param [Integer] n_batch The number of tokens to predict at once.
+   *   @param [Integer] repeat_last_n The number of tokens to consider for repeat penalty.
+   *   @param [Float] repeat_penalty The repeat penalty.
+   *   @return [String]
+   */
+  rb_define_method(rb_cGPTNeoXClient, "completions", RUBY_METHOD_FUNC(gpt_neox_client_completions), -1);
+  /**
+   * Returns the path to the model.
+   * @return [String]
+   */
+  rb_define_attr(rb_cGPTNeoXClient, "path", 1, 0);
+  /**
+   * Returns the seed for random number generation.
+   * @return [Integer]
+   */
+  rb_define_attr(rb_cGPTNeoXClient, "seed", 1, 0);
+  /**
+   * Returns the number of threads.
+   * @return [Integer]
+   */
+  rb_define_attr(rb_cGPTNeoXClient, "n_threads", 1, 0);
+
+  /**
+   * @!visibility private
+   */
+  RbGPTVocab::define_class(rb_cGPTNeoXClient);
+  /**
+   * @!visibility private
+   */
+  RbGPTNeoXModel::define_class(rb_cGPTNeoXClient);
+}
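The keyword arguments documented above map directly onto the sampling loop in gpt_neox_client_completions: prompt tokens are fed to gpt_neox_eval in chunks of n_batch, then up to n_predict tokens are drawn with top-k/top-p filtering, temperature, and a repetition penalty over the last repeat_last_n tokens, stopping early when token id 0 is sampled. A usage sketch following the YARD documentation (the model path and parameter values are illustrative, not from the gem):

```ruby
require "gpt_neox_client"

# Illustrative path; any GPT-NeoX model converted to ggml format should work.
client = GPTNeoXClient.new("gpt-neox-f16.bin", seed: 42, n_threads: 4)

# Near-greedy decoding: tiny top_k, low temperature, short output.
puts client.completions("Hello, my name is", top_k: 1, temperature: 0.1, n_predict: 32)

# More exploratory sampling with a mild penalty on recently emitted tokens.
puts client.completions("Hello, my name is",
                        top_k: 40, top_p: 0.9, temperature: 0.9,
                        n_predict: 200, n_batch: 8,
                        repeat_last_n: 64, repeat_penalty: 1.1)
```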
data/ext/gpt_neox_client/src/LICENSE
ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2022 Georgi Gerganov
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.