llamafile 0.1.d → 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.md +35 -33
- data/lib/llamafile/version.rb +1 -1
- data/lib/llamafile.rb +7 -72
- data/llamafile.gemspec +2 -5
- metadata +7 -76
- data/lib/llamafile/llama.rb +0 -67
- data/main.log +0 -599
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 3cb58e041815fd32831d47c2473705472795147c5f331b5244b2c4e86ffff9be
+  data.tar.gz: a32745ecfecc635933b453c30dae6a2c97fdd03e0b9ff284f82ba20b2d99df5b
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 19d7f0b890a7eb6daf894389d7a8f89c139a63b7f88adec34680797e7ffb4099d96cf26b4e2e49d58cc48567e82b872fca8cfa763ec3ceb26ab1e00d6ea4132a
+  data.tar.gz: c178e010127051134d6da9214aa48169496579516285ef462d74f884fbf1bfcd1eb4a06789fc1587b7b800f47ae1e44b16fd42e9720d2bcea4cbb7bd4b85e03a
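The checksums above are plain SHA-256 / SHA-512 hex digests of the two archives packed inside the `.gem` file (`metadata.gz` and `data.tar.gz`). A minimal sketch of how such digests are computed with Ruby's stdlib, assuming the archive bytes are already in memory (`archive_digests` is a hypothetical helper, not part of RubyGems):

```ruby
require 'digest'

# Compute the two digest families that checksums.yaml records
# for each archive inside a .gem file.
def archive_digests(bytes)
  {
    sha256: Digest::SHA256.hexdigest(bytes),
    sha512: Digest::SHA512.hexdigest(bytes)
  }
end

digests = archive_digests("example archive contents")
# A verifier recomputes these from the downloaded .gem and compares
# them against the values stored in checksums.yaml.
```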
data/README.md
CHANGED
@@ -1,37 +1,39 @@
-#
-Because sometimes local llamafile interactions on local hardware with local data is the only option.
+# Llamafile
 
-
-
-
-3. sudo mv lamafile /usr/bin/llama
+TODO: Delete this and the text below, and describe your gem
+
+Welcome to your new gem! In this directory, you'll find the files you need to be able to package up your Ruby library into a gem. Put your Ruby code in the file `lib/llamafile`. To experiment with that code, run `bin/console` for an interactive prompt.
 
 ## Installation
-
-
-
-
-
-
-
-
+
+TODO: Replace `UPDATE_WITH_YOUR_GEM_NAME_IMMEDIATELY_AFTER_RELEASE_TO_RUBYGEMS_ORG` with your gem name right after releasing it to RubyGems.org. Please do not do it earlier due to security reasons. Alternatively, replace this section with instructions to install your gem from git if you don't plan to release to RubyGems.org.
+
+Install the gem and add to the application's Gemfile by executing:
+
+    $ bundle add UPDATE_WITH_YOUR_GEM_NAME_IMMEDIATELY_AFTER_RELEASE_TO_RUBYGEMS_ORG
+
+If bundler is not being used to manage dependencies, install the gem by executing:
+
+    $ gem install UPDATE_WITH_YOUR_GEM_NAME_IMMEDIATELY_AFTER_RELEASE_TO_RUBYGEMS_ORG
+
 ## Usage
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+
+TODO: Write usage instructions here
+
+## Development
+
+After checking out the repo, run `bin/setup` to install dependencies. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
+
+To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).
+
+## Contributing
+
+Bug reports and pull requests are welcome on GitHub at https://github.com/[USERNAME]/llamafile. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/[USERNAME]/llamafile/blob/master/CODE_OF_CONDUCT.md).
+
+## License
+
+The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
+
+## Code of Conduct
+
+Everyone interacting in the Llamafile project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/[USERNAME]/llamafile/blob/master/CODE_OF_CONDUCT.md).
data/lib/llamafile/version.rb
CHANGED
data/lib/llamafile.rb
CHANGED
@@ -1,78 +1,13 @@
 # frozen_string_literal: true
 
-require 'mqtt'
-
 require_relative "llamafile/version"
-
+
 require_relative "llamafile/llama"
-require_relative "llamafile/grammar"
-require_relative "llamafile/prompt"
-require_relative "llamafile/voice"
 
-module
-
-
-
-
-
-end
-def self.prompt h={}
-  LLAMA.post(prompt: PROMPT.make(h))
-end
-@@C = Hash.new { |h,k| h[k] = C.new(k) }
-class C
-  attr_accessor :character, :actor, :example, :format
-  attr_reader :output, :input
-  def initialize k
-    @id = k
-    @input = ""
-    @output = ""
-    @character = %[a helpful and honest personal assistant.]
-    @actor = %[an honest person.]
-    @format = %[Format your response as properly formatted markdown.]
-    @example = %[# this is a heading\nThis is some general text about the heading.\n1. list item one.\n2. list item two.\n3. list item three.]
-    @convo = []
-  end
-  def pre q, a
-    @convo << [ q, a ]
-  end
-  def convo
-    o = []; @convo[-5..-1].each { |e| o << %[User: #{e[0]}\nLlama: #{e[1]}] };
-    return o.join("\n\n")
-  end
-  def prompt
-    %[Llama is #{@character}\nUser is #{@actor}\n#{@format}\nUse this example to guide your response:\n#{@example}\n#{convo}]
-  end
-  def << i
-    chain i
-  end
-  def chain *p
-    [p].flatten.compact.each { |e|
-      @convo << [ i, Lf.prompt(output: prompt, input: i)]
-      @output = @convo[-1][1]
-      @input = e
-    }
-    return @output
-  end
-end
-def self.[] k
-  @@C[k]
-end
-def self.keys
-  @@C.keys
-end
-def self.delete k
-  @@C.delete(k)
-end
-def self.chain *p
-  s = {}
-  [p].flatten.compact.each { |e| o = []; s.each_pair {|k,v| o << %[User: #{k}\nLlama: #{v}] }; s[e] = Lf.prompt(output: o.join("\n"), input: e); }
-  return s
-end
+module Llamafile
+  class Error < StandardError; end
+  def self.llama i
+    Llama.new().process(i)
+  end
+
 end
-
-VOICE.hear = lambda { |tgt, voice, payload| puts %[VOICE #{tgt}: #{voice} #{payload}]; MIND.publish(%[#{tgt}/], %[Thinking like a #{voice}, #{payload}]) }
-MIND.input = lambda { |tgt, payload| puts %[THINK INPUT #{tgt}: #{payload}]; VOICE.hear(tgt, payload); }
-MIND.output = lambda { |tgt, payload| puts %[THINK OUTPUT #{tgt}: #{payload}]; }
-
-MIND.think!
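The rewrite above shrinks `lib/llamafile.rb` to a single entry point: `Llamafile.llama` delegates to `Llama#process`. A minimal self-contained sketch of that public surface; the `Llama` class here is a hypothetical stub standing in for the real one defined in `lib/llamafile/llama.rb`:

```ruby
# frozen_string_literal: true

# Sketch of the 0.1.0 public surface of the gem.
module Llamafile
  class Error < StandardError; end

  # Hypothetical stub: in the gem this class wraps the local
  # llamafile binary and is defined in lib/llamafile/llama.rb.
  class Llama
    def process(input)
      "processed: #{input}" # stand-in for the real llamafile call
    end
  end

  # The single entry point kept in 0.1.0.
  def self.llama(input)
    Llama.new.process(input)
  end
end
```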
data/llamafile.gemspec
CHANGED
@@ -31,11 +31,8 @@ Gem::Specification.new do |spec|
   spec.require_paths = ["lib"]
 
   # Uncomment to register a new dependency of your gem
-  spec.add_dependency "
-
-  spec.add_dependency "mqtt"
-  spec.add_dependency "csv"
-  spec.add_dependency "bigdecimal"
+  # spec.add_dependency "example-gem", "~> 1.0"
+
   # For more information and examples about making a new gem, check out our
   # guide at: https://bundler.io/guides/creating_gem.html
 end
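Each `spec.add_dependency` call in the gemspec becomes a runtime dependency entry in the gem's generated `metadata`; deleting them, as this release does, is what yields `dependencies: []` there. A small demonstration with a hypothetical `demo` spec (the `mqtt` line is illustrative only; 0.1.0 declares no dependencies):

```ruby
require 'rubygems'

# A throwaway spec showing how add_dependency populates the
# runtime dependency list that ends up in the gem metadata.
spec = Gem::Specification.new do |s|
  s.name    = 'demo'
  s.version = '0.1.0'
  s.summary = 'demo gem'
  s.add_dependency 'mqtt' # one metadata entry per call
end

spec.runtime_dependencies.map(&:name) # names of declared runtime deps
```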
metadata
CHANGED
@@ -1,84 +1,15 @@
 --- !ruby/object:Gem::Specification
 name: llamafile
 version: !ruby/object:Gem::Version
-  version: 0.1.
+  version: 0.1.0
 platform: ruby
 authors:
 - Erik Olson
+autorequire:
 bindir: exe
 cert_chain: []
-date:
-dependencies:
-- !ruby/object:Gem::Dependency
-  name: httparty
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-- !ruby/object:Gem::Dependency
-  name: multi_xml
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-- !ruby/object:Gem::Dependency
-  name: mqtt
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-- !ruby/object:Gem::Dependency
-  name: csv
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-- !ruby/object:Gem::Dependency
-  name: bigdecimal
-  requirement: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
-  type: :runtime
-  prerelease: false
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - ">="
-      - !ruby/object:Gem::Version
-        version: '0'
+date: 2024-03-21 00:00:00.000000000 Z
+dependencies: []
 description: Wraps a locally installed llamafile in an a pure ruby object.
 email:
 - xorgnak@gmail.com
@@ -92,10 +23,8 @@ files:
 - README.md
 - Rakefile
 - lib/llamafile.rb
-- lib/llamafile/llama.rb
 - lib/llamafile/version.rb
 - llamafile.gemspec
-- main.log
 - sig/llamafile.rbs
 homepage: https://github.com/xorgnak/llamafile
 licenses:
@@ -104,6 +33,7 @@ metadata:
   homepage_uri: https://github.com/xorgnak/llamafile
   source_code_uri: https://github.com/xorgnak/llamafile
   changelog_uri: https://github.com/xorgnak/llamafile
+post_install_message:
 rdoc_options: []
 require_paths:
 - lib
@@ -118,7 +48,8 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 - !ruby/object:Gem::Version
   version: '0'
 requirements: []
-rubygems_version: 3.
+rubygems_version: 3.3.15
+signing_key:
 specification_version: 4
 summary: llamafile wrapper
 test_files: []
data/lib/llamafile/llama.rb
DELETED
@@ -1,67 +0,0 @@
-require 'httparty'
-module LLAMA
-  DEF = {
-    stream: false,
-    grammar: %[root ::= l+\nl ::= i "\n"\ni ::= [^\n\t{|}]+ [.!?]],
-    n_predict: 2048,
-    n_probs: 0,
-    cache_prompt: true,
-    min_keep: 10,
-    min_p: 0.05,
-    mirostat: 2,
-    mirostat_eta: 0.1,
-    mirostat_tau: 5,
-    repeat_lat_n: 256,
-    repeat_penalty: 2,
-    slot_id: -1,
-    temperature: 0,
-    tfs_z: 1,
-    top_k: 95,
-    top_p: 0.95,
-    typical_p: 1,
-    stop: ['</s>','Llama:','User:']
-  }
-
-  class Llama
-    include HTTParty
-    base_uri ENV['LLAMA']
-    #default_timeout 120
-  end
-  def self.post h={}
-    #puts %[LLAMA IN: #{h}]
-    hh = {
-      headers: {
-        "Content-Type": "application/json",
-        "Connection": 'keep-alive',
-        "Priority": 'u=0'
-      },
-      body: JSON.generate(DEF.merge(h))
-    }
-    #puts %[LLAMA PACK: #{hh}]
-    r = Llama.post('/completion', hh)
-    #puts %[LLAMA CODE: #{r.code}]
-    if r.code == 200
-      return r['content'].gsub(/<.+>/, "").gsub(/\s\s+/, " ").gsub(/\n+/, "\n");
-    else
-      return false
-    end
-  end
-
-  # LLAMA.post prompt: PROMPT.make { system: "system prompt", mode: "output mode", output: "previous output", input: "new input" }
-
-  def self.<< i
-    fiber = Fiber.new do |ii|
-      Fiber.yield LLAMA.post(prompt: PROMPT.make(input: ii))
-    end
-    fiber.resume i
-  end
-
-  def self.if? h={}
-    if LLAMA.post({ n_predict: 4, grammar: GRAMMAR[:bool], prompt: PROMPT.test(h) }) == 'yes'
-      return true
-    else
-      return false
-    end
-  end
-end
-
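The deleted `LLAMA.post` built each `/completion` request by merging caller-supplied options over the `DEF` defaults and serializing the result as JSON. A reduced, self-contained sketch of just that request-body step, using only a subset of the `DEF` keys and no HTTP call (`completion_body` is a hypothetical helper name):

```ruby
require 'json'

# Subset of the deleted DEF hash; in the removed code the full hash
# also carried sampling parameters (mirostat, top_k, top_p, etc.).
DEF = {
  stream: false,
  n_predict: 2048,
  temperature: 0,
  stop: ['</s>', 'Llama:', 'User:']
}.freeze

# Caller keys override the defaults, as DEF.merge(h) did in LLAMA.post.
def completion_body(h = {})
  JSON.generate(DEF.merge(h))
end
```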
data/main.log
DELETED
@@ -1,599 +0,0 @@
|
|
1
|
-
[1711034362] Log start
|
2
|
-
[1711034362] Cmd: /usr/local/bin/llama -m tinyllama-1.1b-chat-v1.0.Q4_0.gguf -c 2048 -b 128 -p "The time is 2024-03-21 09:19:22 -0600. What time is it?
|
3
|
-
"
|
4
|
-
[1711034364] main: seed = 1711034364
|
5
|
-
[1711034364] main: llama backend init
|
6
|
-
[1711034365] main: load the model and apply lora adapter, if any
|
7
|
-
[1711034365] llama_model_loader: loaded meta data with 23 key-value pairs and 201 tensors from tinyllama-1.1b-chat-v1.0.Q4_0.gguf (version GGUF V3 (latest))
|
8
|
-
[1711034365] llama_model_loader: - tensor 0: output.weight q6_K [ 2048, 32000, 1, 1 ]
|
9
|
-
[1711034365] llama_model_loader: - tensor 1: token_embd.weight q4_0 [ 2048, 32000, 1, 1 ]
|
10
|
-
[1711034365] llama_model_loader: - tensor 2: blk.0.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
11
|
-
[1711034365] llama_model_loader: - tensor 3: blk.0.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
12
|
-
[1711034365] llama_model_loader: - tensor 4: blk.0.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
13
|
-
[1711034365] llama_model_loader: - tensor 5: blk.0.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
14
|
-
[1711034365] llama_model_loader: - tensor 6: blk.0.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
15
|
-
[1711034365] llama_model_loader: - tensor 7: blk.0.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
16
|
-
[1711034365] llama_model_loader: - tensor 8: blk.0.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
17
|
-
[1711034365] llama_model_loader: - tensor 9: blk.0.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
18
|
-
[1711034365] llama_model_loader: - tensor 10: blk.0.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
19
|
-
[1711034365] llama_model_loader: - tensor 11: blk.1.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
20
|
-
[1711034365] llama_model_loader: - tensor 12: blk.1.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
21
|
-
[1711034365] llama_model_loader: - tensor 13: blk.1.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
22
|
-
[1711034365] llama_model_loader: - tensor 14: blk.1.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
23
|
-
[1711034365] llama_model_loader: - tensor 15: blk.1.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
24
|
-
[1711034365] llama_model_loader: - tensor 16: blk.1.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
25
|
-
[1711034365] llama_model_loader: - tensor 17: blk.1.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
26
|
-
[1711034365] llama_model_loader: - tensor 18: blk.1.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
27
|
-
[1711034365] llama_model_loader: - tensor 19: blk.1.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
28
|
-
[1711034365] llama_model_loader: - tensor 20: blk.10.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
29
|
-
[1711034365] llama_model_loader: - tensor 21: blk.10.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
30
|
-
[1711034365] llama_model_loader: - tensor 22: blk.10.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
31
|
-
[1711034365] llama_model_loader: - tensor 23: blk.10.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
32
|
-
[1711034365] llama_model_loader: - tensor 24: blk.10.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
33
|
-
[1711034365] llama_model_loader: - tensor 25: blk.10.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
34
|
-
[1711034365] llama_model_loader: - tensor 26: blk.10.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
35
|
-
[1711034365] llama_model_loader: - tensor 27: blk.10.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
36
|
-
[1711034365] llama_model_loader: - tensor 28: blk.10.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
37
|
-
[1711034365] llama_model_loader: - tensor 29: blk.11.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
38
|
-
[1711034365] llama_model_loader: - tensor 30: blk.11.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
39
|
-
[1711034365] llama_model_loader: - tensor 31: blk.11.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
40
|
-
[1711034365] llama_model_loader: - tensor 32: blk.11.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
41
|
-
[1711034365] llama_model_loader: - tensor 33: blk.11.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
42
|
-
[1711034365] llama_model_loader: - tensor 34: blk.11.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
43
|
-
[1711034365] llama_model_loader: - tensor 35: blk.11.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
44
|
-
[1711034365] llama_model_loader: - tensor 36: blk.11.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
45
|
-
[1711034365] llama_model_loader: - tensor 37: blk.11.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
46
|
-
[1711034365] llama_model_loader: - tensor 38: blk.12.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
47
|
-
[1711034365] llama_model_loader: - tensor 39: blk.12.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
48
|
-
[1711034365] llama_model_loader: - tensor 40: blk.12.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
49
|
-
[1711034365] llama_model_loader: - tensor 41: blk.12.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
50
|
-
[1711034365] llama_model_loader: - tensor 42: blk.12.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
51
|
-
[1711034365] llama_model_loader: - tensor 43: blk.12.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
52
|
-
[1711034365] llama_model_loader: - tensor 44: blk.12.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
53
|
-
[1711034365] llama_model_loader: - tensor 45: blk.12.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
54
|
-
[1711034365] llama_model_loader: - tensor 46: blk.12.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
55
|
-
[1711034365] llama_model_loader: - tensor 47: blk.13.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
56
|
-
[1711034365] llama_model_loader: - tensor 48: blk.13.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
57
|
-
[1711034365] llama_model_loader: - tensor 49: blk.13.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
58
|
-
[1711034365] llama_model_loader: - tensor 50: blk.13.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
59
|
-
[1711034365] llama_model_loader: - tensor 51: blk.13.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
60
|
-
[1711034365] llama_model_loader: - tensor 52: blk.13.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
61
|
-
[1711034365] llama_model_loader: - tensor 53: blk.13.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
62
|
-
[1711034365] llama_model_loader: - tensor 54: blk.13.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
63
|
-
[1711034365] llama_model_loader: - tensor 55: blk.13.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
64
|
-
[1711034365] llama_model_loader: - tensor 56: blk.14.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
65
|
-
[1711034365] llama_model_loader: - tensor 57: blk.14.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
66
|
-
[1711034365] llama_model_loader: - tensor 58: blk.14.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
67
|
-
[1711034365] llama_model_loader: - tensor 59: blk.14.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
68
|
-
[1711034365] llama_model_loader: - tensor 60: blk.14.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
69
|
-
[1711034365] llama_model_loader: - tensor 61: blk.14.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
70
|
-
[1711034365] llama_model_loader: - tensor 62: blk.14.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
71
|
-
[1711034365] llama_model_loader: - tensor 63: blk.14.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
72
|
-
[1711034365] llama_model_loader: - tensor 64: blk.14.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
73
|
-
[1711034365] llama_model_loader: - tensor 65: blk.15.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
74
|
-
[1711034365] llama_model_loader: - tensor 66: blk.15.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
75
|
-
[1711034365] llama_model_loader: - tensor 67: blk.15.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
76
|
-
[1711034365] llama_model_loader: - tensor 68: blk.15.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
77
|
-
[1711034365] llama_model_loader: - tensor 69: blk.15.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
78
|
-
[1711034365] llama_model_loader: - tensor 70: blk.15.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
79
|
-
[1711034365] llama_model_loader: - tensor 71: blk.15.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
80
|
-
[1711034365] llama_model_loader: - tensor 72: blk.15.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
81
|
-
[1711034365] llama_model_loader: - tensor 73: blk.15.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
82
|
-
[1711034365] llama_model_loader: - tensor 74: blk.16.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
83
|
-
[1711034365] llama_model_loader: - tensor 75: blk.16.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
84
|
-
[1711034365] llama_model_loader: - tensor 76: blk.16.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
85
|
-
[1711034365] llama_model_loader: - tensor 77: blk.16.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
86
|
-
[1711034365] llama_model_loader: - tensor 78: blk.16.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
87
|
-
[1711034365] llama_model_loader: - tensor 79: blk.16.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
88
|
-
[1711034365] llama_model_loader: - tensor 80: blk.16.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
89
|
-
[1711034365] llama_model_loader: - tensor 81: blk.16.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
90
|
-
[1711034365] llama_model_loader: - tensor 82: blk.16.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
91
|
-
[1711034365] llama_model_loader: - tensor 83: blk.17.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
92
|
-
[1711034365] llama_model_loader: - tensor 84: blk.17.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
93
|
-
[1711034365] llama_model_loader: - tensor 85: blk.17.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
94
|
-
[1711034365] llama_model_loader: - tensor 86: blk.17.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
95
|
-
[1711034365] llama_model_loader: - tensor 87: blk.17.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
96
|
-
[1711034365] llama_model_loader: - tensor 88: blk.17.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
97
|
-
[1711034365] llama_model_loader: - tensor 89: blk.17.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
98
|
-
[1711034365] llama_model_loader: - tensor 90: blk.17.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
99
|
-
[1711034365] llama_model_loader: - tensor 91: blk.17.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
100
|
-
[1711034365] llama_model_loader: - tensor 92: blk.18.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
101
|
-
[1711034365] llama_model_loader: - tensor 93: blk.18.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
102
|
-
[1711034365] llama_model_loader: - tensor 94: blk.18.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
103
|
-
[1711034365] llama_model_loader: - tensor 95: blk.18.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
104
|
-
[1711034365] llama_model_loader: - tensor 96: blk.18.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
105
|
-
[1711034365] llama_model_loader: - tensor 97: blk.18.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
106
|
-
[1711034365] llama_model_loader: - tensor 98: blk.18.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
107
|
-
[1711034365] llama_model_loader: - tensor 99: blk.18.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
108
|
-
[1711034365] llama_model_loader: - tensor 100: blk.18.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
109
|
-
[1711034365] llama_model_loader: - tensor 101: blk.19.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
110
|
-
[1711034365] llama_model_loader: - tensor 102: blk.19.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
111
|
-
[1711034365] llama_model_loader: - tensor 103: blk.19.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
112
|
-
[1711034365] llama_model_loader: - tensor 104: blk.19.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
113
|
-
[1711034365] llama_model_loader: - tensor 105: blk.19.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
114
|
-
[1711034365] llama_model_loader: - tensor 106: blk.19.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
115
|
-
[1711034365] llama_model_loader: - tensor 107: blk.19.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
116
|
-
[1711034365] llama_model_loader: - tensor 108: blk.19.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
117
|
-
[1711034365] llama_model_loader: - tensor 109: blk.19.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
118
|
-
[1711034365] llama_model_loader: - tensor 110: blk.2.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
119
|
-
[1711034365] llama_model_loader: - tensor 111: blk.2.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
120
|
-
[1711034365] llama_model_loader: - tensor 112: blk.2.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
121
|
-
[1711034365] llama_model_loader: - tensor 113: blk.2.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
122
|
-
[1711034365] llama_model_loader: - tensor 114: blk.2.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
123
|
-
[1711034365] llama_model_loader: - tensor 115: blk.2.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
124
|
-
[1711034365] llama_model_loader: - tensor 116: blk.2.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
125
|
-
[1711034365] llama_model_loader: - tensor 117: blk.2.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
126
|
-
[1711034365] llama_model_loader: - tensor 118: blk.2.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
127
|
-
[1711034365] llama_model_loader: - tensor 119: blk.20.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
128
|
-
[1711034365] llama_model_loader: - tensor 120: blk.20.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
129
|
-
[1711034365] llama_model_loader: - tensor 121: blk.20.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
130
|
-
[1711034365] llama_model_loader: - tensor 122: blk.20.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
131
|
-
[1711034365] llama_model_loader: - tensor 123: blk.20.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
132
|
-
[1711034365] llama_model_loader: - tensor 124: blk.20.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
133
|
-
[1711034365] llama_model_loader: - tensor 125: blk.20.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
134
|
-
[1711034365] llama_model_loader: - tensor 126: blk.20.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
135
|
-
[1711034365] llama_model_loader: - tensor 127: blk.20.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
136
|
-
[1711034365] llama_model_loader: - tensor 128: blk.21.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
137
|
-
[1711034365] llama_model_loader: - tensor 129: blk.21.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
138
|
-
[1711034365] llama_model_loader: - tensor 130: blk.21.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
139
|
-
[1711034365] llama_model_loader: - tensor 131: blk.21.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
140
|
-
[1711034365] llama_model_loader: - tensor 132: blk.21.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
141
|
-
[1711034365] llama_model_loader: - tensor 133: blk.21.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
142
|
-
[1711034365] llama_model_loader: - tensor 134: blk.21.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
143
|
-
[1711034365] llama_model_loader: - tensor 135: blk.21.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
144
|
-
[1711034365] llama_model_loader: - tensor 136: blk.21.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
145
|
-
[1711034365] llama_model_loader: - tensor 137: blk.3.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
146
|
-
[1711034365] llama_model_loader: - tensor 138: blk.3.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
147
|
-
[1711034365] llama_model_loader: - tensor 139: blk.3.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
148
|
-
[1711034365] llama_model_loader: - tensor 140: blk.3.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
149
|
-
[1711034365] llama_model_loader: - tensor 141: blk.3.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
150
|
-
[1711034365] llama_model_loader: - tensor 142: blk.3.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
151
|
-
[1711034365] llama_model_loader: - tensor 143: blk.3.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
152
|
-
[1711034365] llama_model_loader: - tensor 144: blk.3.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
153
|
-
[1711034365] llama_model_loader: - tensor 145: blk.3.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
154
|
-
[1711034365] llama_model_loader: - tensor 146: blk.4.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
155
|
-
[1711034365] llama_model_loader: - tensor 147: blk.4.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
156
|
-
[1711034365] llama_model_loader: - tensor 148: blk.4.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
157
|
-
[1711034365] llama_model_loader: - tensor 149: blk.4.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
158
|
-
[1711034365] llama_model_loader: - tensor 150: blk.4.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
159
|
-
[1711034365] llama_model_loader: - tensor 151: blk.4.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
160
|
-
[1711034365] llama_model_loader: - tensor 152: blk.4.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
161
|
-
[1711034365] llama_model_loader: - tensor 153: blk.4.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
162
|
-
[1711034365] llama_model_loader: - tensor 154: blk.4.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
163
|
-
[1711034365] llama_model_loader: - tensor 155: blk.5.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
164
|
-
[1711034365] llama_model_loader: - tensor 156: blk.5.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
165
|
-
[1711034365] llama_model_loader: - tensor 157: blk.5.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
166
|
-
[1711034365] llama_model_loader: - tensor 158: blk.5.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
167
|
-
[1711034365] llama_model_loader: - tensor 159: blk.5.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
168
|
-
[1711034365] llama_model_loader: - tensor 160: blk.5.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
169
|
-
[1711034365] llama_model_loader: - tensor 161: blk.5.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
170
|
-
[1711034365] llama_model_loader: - tensor 162: blk.5.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
171
|
-
[1711034365] llama_model_loader: - tensor 163: blk.5.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
172
|
-
[1711034365] llama_model_loader: - tensor 164: blk.6.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
173
|
-
[1711034365] llama_model_loader: - tensor 165: blk.6.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
174
|
-
[1711034365] llama_model_loader: - tensor 166: blk.6.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
175
|
-
[1711034365] llama_model_loader: - tensor 167: blk.6.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
176
|
-
[1711034365] llama_model_loader: - tensor 168: blk.6.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
177
|
-
[1711034365] llama_model_loader: - tensor 169: blk.6.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
178
|
-
[1711034365] llama_model_loader: - tensor 170: blk.6.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
179
|
-
[1711034365] llama_model_loader: - tensor 171: blk.6.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
180
|
-
[1711034365] llama_model_loader: - tensor 172: blk.6.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
181
|
-
[1711034365] llama_model_loader: - tensor 173: blk.7.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
182
|
-
[1711034365] llama_model_loader: - tensor 174: blk.7.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
183
|
-
[1711034365] llama_model_loader: - tensor 175: blk.7.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
184
|
-
[1711034365] llama_model_loader: - tensor 176: blk.7.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
185
|
-
[1711034365] llama_model_loader: - tensor 177: blk.7.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
186
|
-
[1711034365] llama_model_loader: - tensor 178: blk.7.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
187
|
-
[1711034365] llama_model_loader: - tensor 179: blk.7.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
188
|
-
[1711034365] llama_model_loader: - tensor 180: blk.7.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
189
|
-
[1711034365] llama_model_loader: - tensor 181: blk.7.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
190
|
-
[1711034365] llama_model_loader: - tensor 182: blk.8.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
191
|
-
[1711034365] llama_model_loader: - tensor 183: blk.8.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
192
|
-
[1711034365] llama_model_loader: - tensor 184: blk.8.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
193
|
-
[1711034365] llama_model_loader: - tensor 185: blk.8.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
194
|
-
[1711034365] llama_model_loader: - tensor 186: blk.8.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
195
|
-
[1711034365] llama_model_loader: - tensor 187: blk.8.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
196
|
-
[1711034365] llama_model_loader: - tensor 188: blk.8.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
197
|
-
[1711034365] llama_model_loader: - tensor 189: blk.8.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
198
|
-
[1711034365] llama_model_loader: - tensor 190: blk.8.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
199
|
-
[1711034365] llama_model_loader: - tensor 191: blk.9.attn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
200
|
-
[1711034365] llama_model_loader: - tensor 192: blk.9.ffn_down.weight q4_0 [ 5632, 2048, 1, 1 ]
|
201
|
-
[1711034365] llama_model_loader: - tensor 193: blk.9.ffn_gate.weight q4_0 [ 2048, 5632, 1, 1 ]
|
202
|
-
[1711034365] llama_model_loader: - tensor 194: blk.9.ffn_up.weight q4_0 [ 2048, 5632, 1, 1 ]
|
203
|
-
[1711034365] llama_model_loader: - tensor 195: blk.9.ffn_norm.weight f32 [ 2048, 1, 1, 1 ]
|
204
|
-
[1711034365] llama_model_loader: - tensor 196: blk.9.attn_k.weight q4_0 [ 2048, 256, 1, 1 ]
|
205
|
-
[1711034365] llama_model_loader: - tensor 197: blk.9.attn_output.weight q4_0 [ 2048, 2048, 1, 1 ]
|
206
|
-
[1711034365] llama_model_loader: - tensor 198: blk.9.attn_q.weight q4_0 [ 2048, 2048, 1, 1 ]
|
207
|
-
[1711034365] llama_model_loader: - tensor 199: blk.9.attn_v.weight q4_0 [ 2048, 256, 1, 1 ]
|
208
|
-
[1711034365] llama_model_loader: - tensor 200: output_norm.weight f32 [ 2048, 1, 1, 1 ]
|
209
|
-
[1711034365] llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
|
210
|
-
[1711034365] llama_model_loader: - kv 0: general.architecture str = llama
|
211
|
-
[1711034365] llama_model_loader: - kv 1: general.name str = tinyllama_tinyllama-1.1b-chat-v1.0
|
212
|
-
[1711034365] llama_model_loader: - kv 2: llama.context_length u32 = 2048
|
213
|
-
[1711034365] llama_model_loader: - kv 3: llama.embedding_length u32 = 2048
|
214
|
-
[1711034365] llama_model_loader: - kv 4: llama.block_count u32 = 22
|
215
|
-
[1711034365] llama_model_loader: - kv 5: llama.feed_forward_length u32 = 5632
|
216
|
-
[1711034365] llama_model_loader: - kv 6: llama.rope.dimension_count u32 = 64
|
217
|
-
[1711034365] llama_model_loader: - kv 7: llama.attention.head_count u32 = 32
|
218
|
-
[1711034365] llama_model_loader: - kv 8: llama.attention.head_count_kv u32 = 4
|
219
|
-
[1711034365] llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
|
220
|
-
[1711034365] llama_model_loader: - kv 10: llama.rope.freq_base f32 = 10000.000000
|
221
|
-
[1711034365] llama_model_loader: - kv 11: general.file_type u32 = 2
|
222
|
-
[1711034365] llama_model_loader: - kv 12: tokenizer.ggml.model str = llama
|
223
|
-
[1711034365] llama_model_loader: - kv 13: tokenizer.ggml.tokens arr[str,32000] = ["<unk>", "<s>", "</s>", "<0x00>", "<...
|
224
|
-
[1711034365] llama_model_loader: - kv 14: tokenizer.ggml.scores arr[f32,32000] = [0.000000, 0.000000, 0.000000, 0.0000...
|
225
|
-
[1711034365] llama_model_loader: - kv 15: tokenizer.ggml.token_type arr[i32,32000] = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
|
226
|
-
[1711034365] llama_model_loader: - kv 16: tokenizer.ggml.merges arr[str,61249] = ["▁ t", "e r", "i n", "▁ a", "e n...
|
227
|
-
[1711034365] llama_model_loader: - kv 17: tokenizer.ggml.bos_token_id u32 = 1
|
228
|
-
[1711034365] llama_model_loader: - kv 18: tokenizer.ggml.eos_token_id u32 = 2
|
229
|
-
[1711034365] llama_model_loader: - kv 19: tokenizer.ggml.unknown_token_id u32 = 0
|
230
|
-
[1711034365] llama_model_loader: - kv 20: tokenizer.ggml.padding_token_id u32 = 2
|
231
|
-
[1711034365] llama_model_loader: - kv 21: tokenizer.chat_template str = {% for message in messages %}\n{% if m...
|
232
|
-
[1711034365] llama_model_loader: - kv 22: general.quantization_version u32 = 2
|
233
|
-
[1711034365] llama_model_loader: - type f32: 45 tensors
|
234
|
-
[1711034365] llama_model_loader: - type q4_0: 155 tensors
|
235
|
-
[1711034365] llama_model_loader: - type q6_K: 1 tensors
|
236
|
-
[1711034365] llm_load_vocab: special tokens definition check successful ( 259/32000 ).
|
237
|
-
[1711034365] llm_load_print_meta: format = GGUF V3 (latest)
|
238
|
-
[1711034365] llm_load_print_meta: arch = llama
|
239
|
-
[1711034365] llm_load_print_meta: vocab type = SPM
|
240
|
-
[1711034365] llm_load_print_meta: n_vocab = 32000
|
241
|
-
[1711034365] llm_load_print_meta: n_merges = 0
|
242
|
-
[1711034365] llm_load_print_meta: n_ctx_train = 2048
|
243
|
-
[1711034365] llm_load_print_meta: n_embd = 2048
|
244
|
-
[1711034365] llm_load_print_meta: n_head = 32
|
245
|
-
[1711034365] llm_load_print_meta: n_head_kv = 4
|
246
|
-
[1711034365] llm_load_print_meta: n_layer = 22
|
247
|
-
[1711034365] llm_load_print_meta: n_rot = 64
|
248
|
-
[1711034365] llm_load_print_meta: n_gqa = 8
|
249
|
-
[1711034365] llm_load_print_meta: f_norm_eps = 0.0e+00
|
250
|
-
[1711034365] llm_load_print_meta: f_norm_rms_eps = 1.0e-05
|
251
|
-
[1711034365] llm_load_print_meta: f_clamp_kqv = 0.0e+00
|
252
|
-
[1711034365] llm_load_print_meta: f_max_alibi_bias = 0.0e+00
|
253
|
-
[1711034365] llm_load_print_meta: n_ff = 5632
|
254
|
-
[1711034365] llm_load_print_meta: n_expert = 0
|
255
|
-
[1711034365] llm_load_print_meta: n_expert_used = 0
|
256
|
-
[1711034365] llm_load_print_meta: rope scaling = linear
|
257
|
-
[1711034365] llm_load_print_meta: freq_base_train = 10000.0
|
258
|
-
[1711034365] llm_load_print_meta: freq_scale_train = 1
|
259
|
-
[1711034365] llm_load_print_meta: n_yarn_orig_ctx = 2048
|
260
|
-
[1711034365] llm_load_print_meta: rope_finetuned = unknown
|
261
|
-
[1711034365] llm_load_print_meta: model type = ?B
|
262
|
-
[1711034365] llm_load_print_meta: model ftype = mostly Q4_0
|
263
|
-
[1711034365] llm_load_print_meta: model params = 1.10 B
|
264
|
-
[1711034365] llm_load_print_meta: model size = 606.53 MiB (4.63 BPW)
|
265
|
-
[1711034365] llm_load_print_meta: general.name = tinyllama_tinyllama-1.1b-chat-v1.0
|
266
|
-
[1711034365] llm_load_print_meta: BOS token = 1 '<s>'
|
267
|
-
[1711034365] llm_load_print_meta: EOS token = 2 '</s>'
|
268
|
-
[1711034365] llm_load_print_meta: UNK token = 0 '<unk>'
|
269
|
-
[1711034365] llm_load_print_meta: PAD token = 2 '</s>'
|
270
|
-
[1711034365] llm_load_print_meta: LF token = 13 '<0x0A>'
|
271
|
-
[1711034365] llm_load_tensors: ggml ctx size = 0.08 MiB
|
272
|
-
[1711034365] llm_load_tensors: mem required = 606.61 MiB
|
273
|
-
[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365] .[1711034365]
|
274
|
-
[1711034365] llama_new_context_with_model: n_ctx = 2048
|
275
|
-
[1711034365] llama_new_context_with_model: freq_base = 10000.0
|
276
|
-
[1711034365] llama_new_context_with_model: freq_scale = 1
|
277
|
-
[1711034365] llama_new_context_with_model: KV self size = 44.00 MiB, K (f16): 22.00 MiB, V (f16): 22.00 MiB
|
278
|
-
[1711034365] llama_build_graph: non-view tensors processed: 466/466
|
279
|
-
[1711034365] llama_new_context_with_model: compute buffer total size = 39.31 MiB
|
280
|
-
[1711034365] llama_new_context_with_model: VRAM scratch buffer: 36.00 MiB
|
281
|
-
[1711034365] llama_new_context_with_model: total VRAM used: 36.00 MiB (model: 0.00 MiB, context: 36.00 MiB)
|
282
|
-
[1711034365] warming up the model with an empty run
|
283
|
-
[1711034386] n_ctx: 2048
|
284
|
-
[1711034386]
|
285
|
-
[1711034386] system_info: n_threads = 4 / 4 | AVX = 1 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 |
|
286
|
-
[1711034386] add_bos: 1
|
287
|
-
[1711034386] tokenize the prompt
|
288
|
-
[1711034386] prompt: "The time is 2024-03-21 09:19:22 -0600. What time is it?
|
289
|
-
"
|
290
|
-
[1711034386] tokens: [ '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13 ]
|
291
|
-
[1711034386] recalculate the cached logits (check): embd_inp.empty() false, n_matching_session_tokens 0, embd_inp.size() 36, session_tokens.size() 0, embd_inp.size() 36
|
292
|
-
[1711034386] inp_pfx: [ '':1, ' ':29871, '':13, '':13, '##':2277, '#':29937, ' Inst':2799, 'ruction':4080, ':':29901, '':13, '':13 ]
|
293
|
-
[1711034386] inp_sfx: [ ' ':29871, '':13, '':13, '##':2277, '#':29937, ' Response':13291, ':':29901, '':13, '':13 ]
|
294
|
-
[1711034386] cml_pfx: [ '':1, ' ':29871, '':13, '<':29966, '|':29989, 'im':326, '_':29918, 'start':2962, '|':29989, '>':29958, 'user':1792, '':13 ]
|
295
|
-
[1711034386] cml_sfx: [ ' <':529, '|':29989, 'im':326, '_':29918, 'end':355, '|':29989, '>':29958, '':13, '<':29966, '|':29989, 'im':326, '_':29918, 'start':2962, '|':29989, '>':29958, 'ass':465, 'istant':22137, '':13 ]
|
296
|
-
[1711034386] sampling:
|
297
|
-
repeat_last_n = 64, repeat_penalty = 1.100, frequency_penalty = 0.000, presence_penalty = 0.000
|
298
|
-
top_k = 40, tfs_z = 1.000, top_p = 0.950, min_p = 0.050, typical_p = 1.000, temp = 0.800
|
299
|
-
mirostat = 0, mirostat_lr = 0.100, mirostat_ent = 5.000
|
300
|
-
[1711034386] sampling order:
|
301
|
-
CFG -> Penalties -> top_k -> tfs_z -> typical_p -> top_p -> min_p -> temp
|
302
|
-
[1711034386] generate: n_ctx = 2048, n_batch = 128, n_predict = -1, n_keep = 0
|
303
|
-
[1711034386]
|
304
|
-
|
305
|
-
[1711034386] embd_inp.size(): 36, n_consumed: 0
|
306
|
-
[1711034386] eval: [ '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13 ]
|
307
|
-
[1711034403] n_past = 36
|
308
|
-
[1711034403] sampled token: 3421: 'My'
|
309
|
-
[1711034403] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421 ]
|
310
|
-
[1711034403] n_remain: -2
|
311
|
-
[1711034403] eval: [ 'My':3421 ]
|
312
|
-
[1711034403] n_past = 37
|
313
|
-
[1711034403] sampled token: 931: ' time'
|
314
|
-
[1711034403] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931 ]
|
315
|
-
[1711034403] n_remain: -3
|
316
|
-
[1711034403] eval: [ ' time':931 ]
|
317
|
-
[1711034404] n_past = 38
|
318
|
-
[1711034404] sampled token: 338: ' is'
|
319
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338 ]
|
320
|
-
[1711034404] n_remain: -4
|
321
|
-
[1711034404] eval: [ ' is':338 ]
|
322
|
-
[1711034404] n_past = 39
|
323
|
-
[1711034404] sampled token: 29871: ' '
|
324
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871 ]
|
325
|
-
[1711034404] n_remain: -5
|
326
|
-
[1711034404] eval: [ ' ':29871 ]
|
327
|
-
[1711034404] n_past = 40
|
328
|
-
[1711034404] sampled token: 29906: '2'
|
329
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906 ]
|
330
|
-
[1711034404] n_remain: -6
|
331
|
-
[1711034404] eval: [ '2':29906 ]
|
332
|
-
[1711034404] n_past = 41
|
333
|
-
[1711034404] sampled token: 29900: '0'
|
334
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900 ]
|
335
|
-
[1711034404] n_remain: -7
|
336
|
-
[1711034404] eval: [ '0':29900 ]
|
337
|
-
[1711034404] n_past = 42
|
338
|
-
[1711034404] sampled token: 29906: '2'
|
339
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906 ]
|
340
|
-
[1711034404] n_remain: -8
|
341
|
-
[1711034404] eval: [ '2':29906 ]
|
342
|
-
[1711034404] n_past = 43
|
343
|
-
[1711034404] sampled token: 29946: '4'
|
344
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946 ]
|
345
|
-
[1711034404] n_remain: -9
|
346
|
-
[1711034404] eval: [ '4':29946 ]
|
347
|
-
[1711034404] n_past = 44
|
348
|
-
[1711034404] sampled token: 29899: '-'
|
349
|
-
[1711034404] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899 ]
|
350
|
-
[1711034404] n_remain: -10
|
351
|
-
[1711034404] eval: [ '-':29899 ]
|
352
|
-
[1711034405] n_past = 45
|
353
|
-
[1711034405] sampled token: 29900: '0'
|
354
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900 ]
|
355
|
-
[1711034405] n_remain: -11
|
356
|
-
[1711034405] eval: [ '0':29900 ]
|
357
|
-
[1711034405] n_past = 46
|
358
|
-
[1711034405] sampled token: 29941: '3'
|
359
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941 ]
|
360
|
-
[1711034405] n_remain: -12
|
361
|
-
[1711034405] eval: [ '3':29941 ]
|
362
|
-
[1711034405] n_past = 47
|
363
|
-
[1711034405] sampled token: 29899: '-'
|
364
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899 ]
|
365
|
-
[1711034405] n_remain: -13
|
366
|
-
[1711034405] eval: [ '-':29899 ]
|
367
|
-
[1711034405] n_past = 48
|
368
|
-
[1711034405] sampled token: 29906: '2'
|
369
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906 ]
|
370
|
-
[1711034405] n_remain: -14
|
371
|
-
[1711034405] eval: [ '2':29906 ]
|
372
|
-
[1711034405] n_past = 49
|
373
|
-
[1711034405] sampled token: 29896: '1'
|
374
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896 ]
|
375
|
-
[1711034405] n_remain: -15
|
376
|
-
[1711034405] eval: [ '1':29896 ]
|
377
|
-
[1711034405] n_past = 50
|
378
|
-
[1711034405] sampled token: 29871: ' '
|
379
|
-
[1711034405] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871 ]
|
380
|
-
[1711034405] n_remain: -16
|
381
|
-
[1711034405] eval: [ ' ':29871 ]
|
382
|
-
[1711034406] n_past = 51
|
383
|
-
[1711034406] sampled token: 29900: '0'
|
384
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900 ]
|
385
|
-
[1711034406] n_remain: -17
|
386
|
-
[1711034406] eval: [ '0':29900 ]
|
387
|
-
[1711034406] n_past = 52
|
388
|
-
[1711034406] sampled token: 29929: '9'
|
389
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929 ]
|
390
|
-
[1711034406] n_remain: -18
|
391
|
-
[1711034406] eval: [ '9':29929 ]
|
392
|
-
[1711034406] n_past = 53
|
393
|
-
[1711034406] sampled token: 29901: ':'
|
394
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901 ]
|
395
|
-
[1711034406] n_remain: -19
|
396
|
-
[1711034406] eval: [ ':':29901 ]
|
397
|
-
[1711034406] n_past = 54
|
398
|
-
[1711034406] sampled token: 29896: '1'
|
399
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896 ]
|
400
|
-
[1711034406] n_remain: -20
|
401
|
-
[1711034406] eval: [ '1':29896 ]
|
402
|
-
[1711034406] n_past = 55
|
403
|
-
[1711034406] sampled token: 29929: '9'
|
404
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929 ]
|
405
|
-
[1711034406] n_remain: -21
|
406
|
-
[1711034406] eval: [ '9':29929 ]
|
407
|
-
[1711034406] n_past = 56
|
408
|
-
[1711034406] sampled token: 29901: ':'
|
409
|
-
[1711034406] last: [ '':0, '':0, '':0, '':0, '':0, '':0, '':0, '':1, ' The':450, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901, '2':29906, '2':29906, ' -':448, '0':29900, '6':29953, '0':29900, '0':29900, '.':29889, ' What':1724, ' time':931, ' is':338, ' it':372, '?':29973, '':13, 'My':3421, ' time':931, ' is':338, ' ':29871, '2':29906, '0':29900, '2':29906, '4':29946, '-':29899, '0':29900, '3':29941, '-':29899, '2':29906, '1':29896, ' ':29871, '0':29900, '9':29929, ':':29901, '1':29896, '9':29929, ':':29901 ]
|
410
|
-
[1711034406] n_remain: -22
|
411
|
-
[1711034406] eval: [ ':':29901 ]
|
412
|
-
[1711034407] n_past = 57
|
413
|
-
[1711034407] sampled token: 29906: '2'
|
414
|
-
-[1711034407–1711034413] (removed: per-token sampling trace of the deleted main.log — for each generated token, a rolling `last:` context-window dump plus `n_remain`, `eval:`, `n_past`, and `sampled token` entries, covering the completion "… 2024-03-21 09:19:22 -0600. Based on the passage above, How does the author's time relate to their current location and the date being mentioned in the text?")
-[1711034413] found EOS token
-[1711034413] [end of text]
-[1711034413]
-[1711034413] llama_print_timings: load time = 20864.11 ms
-[1711034413] llama_print_timings: sample time = 63.77 ms / 57 runs ( 1.12 ms per token, 893.80 tokens per second)
-[1711034413] llama_print_timings: prompt eval time = 17429.96 ms / 36 tokens ( 484.17 ms per token, 2.07 tokens per second)
-[1711034413] llama_print_timings: eval time = 9591.44 ms / 56 runs ( 171.28 ms per token, 5.84 tokens per second)
-[1711034413] llama_print_timings: total time = 27180.78 ms
-[1711034413] Log end
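The `llama_print_timings` lines in the removed log derive their per-token figures as `ms / runs` and their throughput as `runs / seconds`. A minimal Ruby sketch (not part of this gem; the `rates` helper is hypothetical) that parses such a line and recomputes both numbers:

```ruby
# Parse one llama.cpp timing line and recompute its derived figures.
# Assumes the "<label> = <ms> ms / <n> runs|tokens" shape seen in the log.
def rates(line)
  m = line.match(%r{=\s*([\d.]+) ms /\s*(\d+)}) or return nil
  ms = m[1].to_f   # total milliseconds for this phase
  n  = m[2].to_i   # number of runs (tokens) in this phase
  { ms_per_token: ms / n, tokens_per_second: n * 1000.0 / ms }
end

r = rates("llama_print_timings: sample time = 63.77 ms / 57 runs")
# 63.77 / 57 ≈ 1.12 ms per token; 57 / 0.06377 s ≈ 893.8 tokens/s,
# matching the figures printed in the log above.
```

The same arithmetic reproduces the prompt-eval line (17429.96 ms / 36 tokens → 484.17 ms per token, 2.07 tokens/s) and the eval line (9591.44 ms / 56 runs → 171.28 ms per token, 5.84 tokens/s).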