clag 0.0.4 → 0.0.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: e1108e8ad1844399f3606f873dd03b65fb4131a130b829ba1332636ffa7bae74
-  data.tar.gz: c736a7bf0cd0e6fc3a6000b288c9b411dc53cf30857b61698bc6b28b82bb7a63
+  metadata.gz: 7567d9817e4d7abc1c6da5ab57e36b99945fe64d3cb9d361d86c2c712e5e41af
+  data.tar.gz: 5425fc8a75d9267aaf341e4b13d82f76dc7dea19549fd2686d84bc667653f5f8
 SHA512:
-  metadata.gz: 36ae1725de2b5d549d2f07f1401724ab82ea1fea9ba8622fb05d154f4008d53638b750a43654c7391c07f953e58d4dd9e6352519fead7b8182e5dbd8694078d8
-  data.tar.gz: 70d477e186eba030431e2e69d6b9cac4d88741d00a946dee73364e07e2e61f178cf39fd0619da059d3a80ff5b631d83bb2d53674cb8a84922ac163a99b9c784d
+  metadata.gz: 8b629c28ad455cc20af1fe172c9dd774f88a980335c5f127fe7ebe6a624688f1ff004103261baedd252a7832beea01be378a2f1995e8c3c55cae7c1522355eb6
+  data.tar.gz: e2327b5d9f0625b0ddf0d084f40372e677d2952becd2ba7f0271ccbc8e234a0dd1e35a0231d2223c31ebdca881409c4d46cad8d72cc05b7dd23e1b0aea76e202
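These published digests can be checked against a local copy of the gem's inner artifacts. A quick sketch using Ruby's standard Digest library; the file path is whatever your local copy of the unpacked `data.tar.gz` happens to be:

```ruby
require "digest"

# Compare a local artifact against the published 0.0.5 SHA256 above.
expected = "5425fc8a75d9267aaf341e4b13d82f76dc7dea19549fd2686d84bc667653f5f8"
actual   = Digest::SHA256.file("data.tar.gz").hexdigest

puts(actual == expected ? "data.tar.gz matches 0.0.5" : "checksum mismatch!")
```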
data/README.md CHANGED
@@ -47,6 +47,15 @@ command with the help of an LLM!
 
 * Select Groq as your preferred LLM by setting CLAG\_LLM=groq in your environment
 
+### Using a Local Model
+
+* Have a model locally from either Ollama or Llamafile with an OpenAI compatible
+  API
+
+* Have the API server running on port 8080
+
+* Select local as your preferred LLM by setting CLAG\_LLM=local in your environment
+
 ## Usage
 
 Currently support one command: "g".
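The new local code path (shown in full in the lib diff below) talks to this server with a placeholder API key. Here is a minimal Ruby sketch to confirm an OpenAI-compatible server is actually answering on port 8080 before pointing clag at it; the endpoint, headers, and `LLaMA_CPP` model name mirror what the gem sends, while the test message is made up:

```ruby
require "httparty"
require "json"

# Smoke-test the local OpenAI-compatible endpoint the README describes.
# "Bearer no-key" matches the placeholder the gem itself sends; local
# llama.cpp-style servers generally ignore the key entirely.
response = HTTParty.post(
  "http://localhost:8080/v1/chat/completions",
  headers: {
    "Authorization" => "Bearer no-key",
    "Content-Type"  => "application/json"
  },
  body: {
    model: "LLaMA_CPP",
    messages: [{ role: "user", content: "Reply with the word ready." }]
  }.to_json
)

puts JSON.parse(response.body).dig("choices", 0, "message", "content")
```

If that prints a completion, setting CLAG\_LLM=local should work as described above.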
data/lib/clag/version.rb CHANGED
@@ -1,3 +1,3 @@
 module Clag
-  VERSION = '0.0.4'
+  VERSION = '0.0.5'
 end
@@ -25,6 +25,8 @@ module Sublayer
         generate_with_claude
       when "groq"
         generate_with_groq
+      when "local"
+        generate_with_local_model
       else
         generate_with_openai
       end
@@ -32,6 +34,54 @@ module Sublayer
 
     private
 
+    def generate_with_local_model
+      system_prompt = <<-PROMPT
+        In this environment you have access to a set of tools you can use to answer the user's question.
+
+        You may call them like this:
+        <function_calls>
+        <invoke>
+        <tool_name>$TOOL_NAME</tool_name>
+        <parameters>
+        <command>value</command>
+        ...
+        </parameters>
+        </invoke>
+        </function_calls>
+
+        Here are the tools available:
+        <tools>
+        #{self.class::OUTPUT_FUNCTION.to_xml}
+        </tools>
+
+        Respond only with valid xml.
+        The entire response should be wrapped in a <response> tag.
+        Any additional information not inside a tool call should go in a <scratch> tag.
+      PROMPT
+
+      response = HTTParty.post(
+        "http://localhost:8080/v1/chat/completions",
+        headers: {
+          "Authorization": "Bearer no-key",
+          "Content-Type": "application/json"
+        },
+        body: {
+          "model": "LLaMA_CPP",
+          "messages": [
+            { "role": "system", "content": system_prompt },
+            { "role": "user", "content": prompt }
+          ]
+        }.to_json
+      )
+
+      text_containing_xml = JSON.parse(response.body).dig("choices", 0, "message", "content")
+      xml = text_containing_xml.match(/\<response\>(.*?)\<\/response\>/m).to_s
+      response_xml = Nokogiri::XML(xml)
+      function_output = response_xml.at_xpath("//parameters/command").children.to_s
+
+      return function_output
+    end
+
     def generate_with_groq
       system_prompt = <<-PROMPT
         In this environment you have access to a set of tools you can use to answer the user's question.
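The parsing at the end of `generate_with_local_model` is the interesting part: the model is told to wrap everything in a `<response>` tag, a regex keeps only that span, and Nokogiri then pulls the generated command out of `//parameters/command`. A standalone sketch of that extraction, using a made-up model reply; `sample_reply` and the `generate_command` tool name are illustrative, not actual model output:

```ruby
require "nokogiri"

# Illustrative stand-in for the "content" field of the model's reply; in the
# gem it comes from JSON.parse(response.body).dig("choices", 0, "message", "content").
sample_reply = <<~TEXT
  Sure, here is the command:
  <response>
    <scratch>The user wants to list all files, including hidden ones.</scratch>
    <function_calls>
      <invoke>
        <tool_name>generate_command</tool_name>
        <parameters>
          <command>ls -la</command>
        </parameters>
      </invoke>
    </function_calls>
  </response>
TEXT

# Same extraction steps as generate_with_local_model:
xml = sample_reply.match(/\<response\>(.*?)\<\/response\>/m).to_s  # keep only <response>...</response>
doc = Nokogiri::XML(xml)
puts doc.at_xpath("//parameters/command").children.to_s            # => ls -la
```

Anything the model emits outside the `<response>` span (or inside `<scratch>`) is discarded, which is what keeps chatty local models from leaking prose into the shell command.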
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: clag
 version: !ruby/object:Gem::Version
-  version: 0.0.4
+  version: 0.0.5
 platform: ruby
 authors:
 - Scott Werner