@jeremysnr/snug 0.1.0 → 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +16 -2
  2. package/package.json +3 -2
package/README.md CHANGED
@@ -1,5 +1,10 @@
  # snug
 
+ [![npm](https://img.shields.io/npm/v/@jeremysnr/snug)](https://www.npmjs.com/package/@jeremysnr/snug)
+ [![bundle size](https://img.shields.io/bundlephobia/minzip/@jeremysnr/snug)](https://bundlephobia.com/package/@jeremysnr/snug)
+ [![zero dependencies](https://img.shields.io/badge/dependencies-0-brightgreen)](https://www.npmjs.com/package/@jeremysnr/snug?activeTab=dependencies)
+ [![license](https://img.shields.io/npm/l/@jeremysnr/snug)](./LICENSE)
+
  **Fit prioritised content into a token budget.**
 
  Every LLM application has the same problem: you have a context window of N tokens and need to fit a system prompt, conversation history, retrieved documents, and tool definitions into it — with space left for the model's reply. Every team writes their own solution from scratch.
@@ -7,7 +12,7 @@ Every LLM application has the same problem: you have a context window of N token
  `snug` is a single function that solves this once.
 
  ```ts
- import { fit } from 'snug';
+ import { fit } from '@jeremysnr/snug';
 
  const { included } = fit(
  [
@@ -27,7 +32,7 @@ Items are selected greedily in descending priority order. The result preserves o
  ## Install
 
  ```
- npm install snug
+ npm install @jeremysnr/snug
  ```
 
  ## API
@@ -103,6 +108,15 @@ If you already have a token count (e.g. from an API usage response), pass it dir
 
  When no tokenizer is supplied, snug falls back to `Math.ceil(text.length / 4)` and prints a warning. This is useful for prototyping but can be off by up to 37% in production.
 
+ ## Ecosystem
+
+ | Package | What it does |
+ |---------|-------------|
+ | `@jeremysnr/snug` | Zero-dependency core — bring your own tokenizer |
+ | [`@jeremysnr/snug-tiktoken`](https://github.com/JeremySNR/snug-tiktoken) | Pre-wired with tiktoken, model-agnostic |
+ | [`@jeremysnr/snug-openai`](https://github.com/JeremySNR/snug-openai) | Accepts OpenAI SDK message arrays directly |
+ | [`@jeremysnr/snug-anthropic`](https://github.com/JeremySNR/snug-anthropic) | Accepts Anthropic SDK message arrays, auto-pairs tool messages |
+
  ## Licence
 
  MIT
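The README quoted above documents a fallback for when no tokenizer is supplied: tokens are approximated as `Math.ceil(text.length / 4)`. A minimal sketch of that heuristic follows; the function name `approxTokens` is illustrative only, not an export of the package:

```ts
// Heuristic described in the README for when no tokenizer is supplied:
// roughly one token per 4 characters, rounded up.
function approxTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// "hello world" is 11 characters, so the estimate is ceil(11 / 4) = 3.
console.log(approxTokens("hello world")); // 3
console.log(approxTokens(""));            // 0
```

As the README notes, this is a prototyping convenience; real tokenizers diverge from the 4-characters-per-token rule, which is why the companion packages wire in tiktoken or SDK-specific counters.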
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@jeremysnr/snug",
- "version": "0.1.0",
+ "version": "0.1.1",
  "description": "Fit prioritised content into a token budget. Zero dependencies.",
  "keywords": [
  "llm",
@@ -27,7 +27,8 @@
  }
  },
  "files": [
- "dist"
+ "dist",
+ "LICENSE"
  ],
  "scripts": {
  "build": "tsup",