async-ollama 0.9.0 → 0.10.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: e3be9b447500c19efa6291703fffde4f69bf15843835ee407b245f32c3afb2d8
-  data.tar.gz: 58021a3c814a34f5e7a905d4fc39af14aa662bdd2fb826e8c02dd62e4ff5639d
+  metadata.gz: b056f00255624f62489fbe6c06a2f2c9a5ec520293b647d0420cff3624b65249
+  data.tar.gz: d3e1ab3b0e78a4c5886e039cc5f81c9e25a89a56f91125e169c9e3c9009187aa
 SHA512:
-  metadata.gz: c0d8201a8e2e1692d78453addd780c662f7ac9e0e0303a27618e855850a04da9d65808bfc21b69f2d8dc9a6d79293eb6cdc01ef3ccfc32939290956c9e350a18
-  data.tar.gz: '08a238788cdcc33a79a26b9f2eaffe842392c8f9d68730b6f3b6edcd18c9c5c3e10455fcb8c6906d173510e1f97683c653501c8237233b4eba3950fd4e207d54'
+  metadata.gz: 8f7fde1e4f5a3a8a6b5d7b39c494a58a97287fff397d909e3d87b2e77b6ef29f1f9a407ca20adfb480757cf08545360161e73fa86fd932b9cf62fdfd07c049ca
+  data.tar.gz: '06974289477bb32e452b89c4ccdcb9122de141216c422c3139ec7402c31cbdd9c5d63aff6ba4d915bff81494dc56e984bc910cc1cb745890d8e8e0b51be6b08f'
checksums.yaml.gz.sig CHANGED
Binary file
@@ -12,7 +12,7 @@ require_relative "pull"
 
 module Async
 	module Ollama
-		MODEL = ENV.fetch("ASYNC_OLLAMA_MODEL", "llama3.1:latest")
+		MODEL = ENV.fetch("ASYNC_OLLAMA_MODEL", "llama3.2:latest")
 		
 		# Represents a connection to the Ollama service, providing methods to generate completions, chat, and list models.
 		class Client < Async::REST::Resource
@@ -23,8 +23,9 @@ module Async
 		SYSTEM_PROMPT = <<~PROMPT
 			You are a precise file transformation tool.
 			When given a file and instructions, you output ONLY the transformed file content.
-			Do not add any explanation, preamble, summary of changes, or markdown code fences.
-			Output the raw file content exactly as it should be written to disk.
+			Do not add any explanation, preamble, summary of changes.
+			Output the raw file content exactly as it should be executed.
+			Do not include markdown code fences in your output.
 			Preserve all existing content unless the instructions explicitly require removal.
 			If a reference template is provided, use it as a structural guide for what to add or how to format new content — do not copy it verbatim or replace existing content with it.
 		PROMPT
@@ -40,16 +41,16 @@ module Async
 		# @parameter instruction [String] What change to make.
 		# @parameter template [String | nil] An optional reference example showing the desired structure.
 		# @returns [String] The transformed file content.
-		def call(content, instruction:, template: nil)
+		def call(content, instruction:, template: nil, model: @model)
 			messages = [
 				{role: "system", content: SYSTEM_PROMPT},
 				{role: "user", content: build_prompt(content, instruction, template)}
 			]
 			
 			Client.open do |client|
-				reply = client.chat(messages, model: @model)
+				reply = client.chat(messages, model: model)
 				
-				reply.response
+				return strip_fences(reply.response)
 			end
 		end
 		
@@ -64,6 +65,18 @@ module Async
 		
 		private
 		
+		def strip_fences(content)
+			content = content.strip
+			
+			# Models frequently wrap output in code fences despite being instructed not to:
+			if match = content.match(/\A```\w*\n(.*\n)```\z/m)
+				Console.warn(self, "Stripping markdown code fences from model output.")
+				content = match[1]
+			end
+			
+			return content
+		end
+		
 		def build_prompt(content, instruction, template = nil)
 			parts = []
 			parts << "## Current file\n\n```\n#{content}\n```"
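The stripping regex in the hunk above is anchored at `\A` and `\z`, so it only fires when the entire response is a single fenced block; fences appearing inside the file body are left alone. A standalone sketch of the same logic, minus the gem's `Console` logging (the `` `{3} `` form is equivalent to three literal backticks):

```ruby
# Standalone sketch of the strip_fences logic above, without Console logging.
# `{3} is equivalent to three literal backticks; /m lets . match newlines.
def strip_fences(content)
	content = content.strip
	
	# Only matches when the WHOLE response is one fenced block:
	if match = content.match(/\A`{3}\w*\n(.*\n)`{3}\z/m)
		content = match[1]
	end
	
	content
end

FENCE = "`" * 3
strip_fences("#{FENCE}ruby\nputs :hello\n#{FENCE}")  # => "puts :hello\n"
strip_fences("plain text")                           # => "plain text"
```

Note that a fence appearing mid-document (for example a Markdown file that legitimately contains code blocks) fails the `\A` anchor and passes through unchanged.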
@@ -5,6 +5,6 @@
 
 module Async
 	module Ollama
-		VERSION = "0.9.0"
+		VERSION = "0.10.0"
 	end
 end
data/readme.md CHANGED
@@ -14,6 +14,12 @@ Please see the [project documentation](https://socketry.github.io/async-ollama/)
 
 Please see the [project releases](https://socketry.github.io/async-ollama/releases/index) for all releases.
 
+### v0.10.0
+
+  - Update default model to `llama3.2`.
+  - Allow model selection for `Async::Ollama::Transform.call(model: "...")`.
+  - Introduce code fence stripping for better handling of code blocks in transformations.
+
 ### v0.9.0
 
   - Add `Async::Ollama::Transform` for intelligent code transformations using Ollama models. This can be used to implement features like automatic code updates based on instructions, while preserving user modifications and existing content.
data/releases.md CHANGED
@@ -1,5 +1,11 @@
 # Releases
 
+## v0.10.0
+
+  - Update default model to `llama3.2`.
+  - Allow model selection for `Async::Ollama::Transform.call(model: "...")`.
+  - Introduce code fence stripping for better handling of code blocks in transformations.
+
 ## v0.9.0
 
 - Add `Async::Ollama::Transform` for intelligent code transformations using Ollama models. This can be used to implement features like automatic code updates based on instructions, while preserving user modifications and existing content.
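The "allow model selection" entry in the changelog above corresponds to the new `model: @model` keyword default on the transform's `call` method. A minimal, self-contained sketch of that pattern (the class and its reply are stand-ins for the real gem, so this runs without an Ollama server):

```ruby
# Hypothetical stand-in for Async::Ollama::Transform, illustrating only the
# keyword-default pattern from this release: `model: @model` falls back to
# the instance default unless the caller overrides it per call.
class Transform
	def initialize(model: "llama3.2:latest")
		@model = model
	end
	
	def call(content, model: @model)
		# The real gem sends a chat request here; a string stands in for the reply.
		"[#{model}] #{content}"
	end
end

transform = Transform.new
transform.call("hello")                    # => "[llama3.2:latest] hello"
transform.call("hello", model: "mistral")  # => "[mistral] hello"
```

Because the default is evaluated against the instance variable, existing callers that never pass `model:` keep their previous behavior.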
data.tar.gz.sig CHANGED
Binary file
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: async-ollama
 version: !ruby/object:Gem::Version
-  version: 0.9.0
+  version: 0.10.0
 platform: ruby
 authors:
 - Samuel Williams
metadata.gz.sig CHANGED
Binary file