writers_room 0.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,361 @@
1
+ # Writer's Room Configuration Guide
2
+
3
+ ## Default Configuration
4
+
5
+ The Writer's Room is pre-configured with the following defaults:
6
+
7
+ - **LLM Provider**: Ollama
8
+ - **Model**: gpt-oss
9
+ - **Ollama URL**: http://localhost:11434
10
+ - **Timeout**: 120 seconds (2 minutes)
11
+
12
+ ## Environment Variables
13
+
14
+ All configuration can be customized using environment variables:
15
+
16
+ ### LLM Provider Configuration
17
+
18
+ | Variable | Default | Description |
19
+ |----------|---------|-------------|
20
+ | `RUBY_LLM_PROVIDER` | `ollama` | LLM provider to use (ollama, openai, anthropic, etc.) |
21
+ | `RUBY_LLM_MODEL` | `gpt-oss` | Model name to use |
22
+ | `OLLAMA_URL` | `http://localhost:11434` | Ollama server URL |
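+ 
+ A minimal sketch of how these three settings typically resolve against their defaults (the actual reads live in `actor.rb`'s `setup_llm`, described under Advanced Configuration below):
+ 
+ ```ruby
+ # Sketch only -- each variable falls back to the documented default.
+ provider = ENV['RUBY_LLM_PROVIDER'] || 'ollama'
+ model    = ENV['RUBY_LLM_MODEL']    || 'gpt-oss'
+ base_url = ENV['OLLAMA_URL']        || 'http://localhost:11434'
+ ```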
23
+
24
+ ### Provider-Specific API Keys
25
+
26
+ | Variable | Required For | Description |
27
+ |----------|-------------|-------------|
28
+ | `OPENAI_API_KEY` | OpenAI | API key for OpenAI models |
29
+ | `ANTHROPIC_API_KEY` | Anthropic | API key for Anthropic Claude models |
30
+
31
+ ### Scene Control
32
+
33
+ | Variable | Default | Description |
34
+ |----------|---------|-------------|
35
+ | `MAX_LINES` | `50` | Maximum dialog lines before scene ends |
36
+ | `DEBUG_ME` | (unset) | Set to `1` to enable debug output |
37
+
38
+ ## Configuration Examples
39
+
40
+ ### Default: Ollama with gpt-oss
41
+
42
+ No configuration needed! Just make sure Ollama is running and the default model is pulled:
43
+
44
+ ```bash
45
+ ollama serve
46
+ ollama pull gpt-oss
47
+ ./run_scene_example.sh
48
+ ```
49
+
50
+ ### Use Different Ollama Model
51
+
52
+ ```bash
53
+ export RUBY_LLM_MODEL="llama2"
54
+ ./director.rb -s scenes/scene_01_gym_wars.yml
55
+ ```
56
+
57
+ ```bash
58
+ export RUBY_LLM_MODEL="mistral"
59
+ ./director.rb -s scenes/scene_02_statistical_anomaly.yml
60
+ ```
61
+
62
+ ### Use Remote Ollama Server
63
+
64
+ ```bash
65
+ export OLLAMA_URL="http://192.168.1.100:11434"
66
+ export RUBY_LLM_MODEL="llama2"
67
+ ./director.rb -s scenes/scene_01_gym_wars.yml
68
+ ```
69
+
70
+ ### Switch to OpenAI
71
+
72
+ ```bash
73
+ export RUBY_LLM_PROVIDER="openai"
74
+ export RUBY_LLM_MODEL="gpt-4"
75
+ export OPENAI_API_KEY="sk-your-key-here"
76
+ ./director.rb -s scenes/scene_01_gym_wars.yml
77
+ ```
78
+
79
+ ### Switch to Anthropic Claude
80
+
81
+ ```bash
82
+ export RUBY_LLM_PROVIDER="anthropic"
83
+ export RUBY_LLM_MODEL="claude-3-5-sonnet-20241022"
84
+ export ANTHROPIC_API_KEY="sk-ant-your-key-here"
85
+ ./director.rb -s scenes/scene_01_gym_wars.yml
86
+ ```
87
+
88
+ ### Enable Debug Mode
89
+
90
+ ```bash
91
+ export DEBUG_ME=1
92
+ ./director.rb -s scenes/scene_01_gym_wars.yml
93
+ ```
94
+
95
+ This will output detailed debug information from the `debug_me` gem, including:
96
+ - Character initialization details
97
+ - LLM configuration (provider, model, URL)
98
+ - Dialog generation prompts
99
+ - Message publishing/receiving
100
+ - Decision-making logic
101
+
102
+ ### Longer Scenes
103
+
104
+ ```bash
105
+ # Allow up to 100 dialog lines
106
+ ./director.rb -s scenes/scene_08_data_dump.yml -l 100
107
+ ```
108
+
109
+ ### Combination Example
110
+
111
+ ```bash
112
+ # Remote Ollama server with custom model and debug mode
113
+ export OLLAMA_URL="http://myserver.local:11434"
114
+ export RUBY_LLM_MODEL="mixtral"
115
+ export DEBUG_ME=1
116
+
117
+ ./director.rb -s scenes/scene_01_gym_wars.yml -l 75
118
+ ```
119
+
120
+ ## Checking Your Configuration
121
+
122
+ ### Before Running
123
+
124
+ Check that services are accessible:
125
+
126
+ ```bash
127
+ # Check Redis
128
+ redis-cli ping
129
+ # Should return: PONG
130
+
131
+ # Check Ollama
132
+ curl http://localhost:11434
133
+ # Should return: "Ollama is running"
134
+
135
+ # List available Ollama models
136
+ ollama list
137
+ # Should show gpt-oss (or your chosen model)
138
+ ```
139
+
140
+ ### During Execution
141
+
142
+ With `DEBUG_ME=1`, you'll see configuration details:
143
+
144
+ ```
145
+ DEBUG: LLM setup complete for Marcus
146
+ provider: "ollama"
147
+ model: "gpt-oss"
148
+ base_url: "http://localhost:11434"
149
+ ```
150
+
151
+ ### Verify Model
152
+
153
+ Test the model directly with Ollama:
154
+
155
+ ```bash
156
+ ollama run gpt-oss "Test message: Hello!"
157
+ ```
158
+
159
+ ## Configuration Persistence
160
+
161
+ ### Session-Based (Temporary)
162
+
163
+ Set environment variables in your terminal session:
164
+
165
+ ```bash
166
+ export RUBY_LLM_MODEL="llama2"
167
+ # Only affects current terminal session
168
+ ```
169
+
170
+ ### Permanent Configuration
171
+
172
+ Add to your shell profile (`~/.bashrc`, `~/.zshrc`, etc.):
173
+
174
+ ```bash
175
+ # Writer's Room Configuration
176
+ export RUBY_LLM_PROVIDER="ollama"
177
+ export RUBY_LLM_MODEL="gpt-oss"
178
+ export OLLAMA_URL="http://localhost:11434"
179
+ ```
180
+
181
+ Then reload:
182
+ ```bash
183
+ source ~/.bashrc # or ~/.zshrc
184
+ ```
185
+
186
+ ### Project-Specific (.env file)
187
+
188
+ Create a `.env` file in the writers_room directory:
189
+
190
+ ```bash
191
+ # .env
192
+ RUBY_LLM_PROVIDER=ollama
193
+ RUBY_LLM_MODEL=gpt-oss
194
+ OLLAMA_URL=http://localhost:11434
195
+ MAX_LINES=50
196
+ ```
197
+
198
+ Then load it before running (use `set -a` so the variables are exported to the `./director.rb` child process; a plain `source` alone leaves them unexported):
+ ```bash
+ set -a          # auto-export variables defined while sourcing
+ source .env
+ set +a
+ ./director.rb -s scenes/scene_01_gym_wars.yml
+ ```
203
+
204
+ **Note**: The `.env` file is in `.gitignore` to keep your configuration private.
205
+
206
+ ## Performance Tuning
207
+
208
+ ### Model Selection Trade-offs
209
+
210
+ | Model Type | Speed | Quality | Memory | Best For |
211
+ |------------|-------|---------|--------|----------|
212
+ | Small (7B) | Fast | Good | Low | Quick iterations, testing |
213
+ | Medium (13B) | Medium | Better | Medium | Development, short scenes |
214
+ | Large (70B+) | Slow | Best | High | Final production, long scenes |
215
+
216
+ ### Recommended Models by Use Case
217
+
218
+ **Quick Testing:**
219
+ ```bash
220
+ export RUBY_LLM_MODEL="tinyllama" # Very fast, basic quality
221
+ ```
222
+
223
+ **Development:**
224
+ ```bash
225
+ export RUBY_LLM_MODEL="gpt-oss" # Default, good balance
226
+ # or
227
+ export RUBY_LLM_MODEL="llama2" # Solid performance
228
+ ```
229
+
230
+ **Production:**
231
+ ```bash
232
+ export RUBY_LLM_MODEL="mixtral" # High quality
233
+ # or
234
+ export RUBY_LLM_PROVIDER="openai"
235
+ export RUBY_LLM_MODEL="gpt-4" # Best quality (paid)
236
+ ```
237
+
238
+ ## Troubleshooting Configuration
239
+
240
+ ### "Connection refused" errors
241
+
242
+ **Check Ollama is running:**
243
+ ```bash
244
+ ps aux | grep ollama
245
+ # Should show ollama serve process
246
+
247
+ # If not running:
248
+ ollama serve
249
+ ```
250
+
251
+ **Check correct URL:**
252
+ ```bash
253
+ curl "${OLLAMA_URL:-http://localhost:11434}"
254
+ # Should connect successfully
255
+ ```
256
+
257
+ ### "Model not found" errors
258
+
259
+ **List installed models:**
260
+ ```bash
261
+ ollama list
262
+ ```
263
+
264
+ **Install missing model:**
265
+ ```bash
266
+ ollama pull gpt-oss
267
+ # or whichever model you're using
268
+ ```
269
+
270
+ ### "Timeout" errors
271
+
272
+ **Increase timeout in actor.rb:**
273
+
274
+ Edit `actor.rb` line 188:
275
+ ```ruby
276
+ timeout: 240 # Increase to 4 minutes
277
+ ```
278
+
279
+ Or use a faster model:
280
+ ```bash
281
+ export RUBY_LLM_MODEL="tinyllama"
282
+ ```
283
+
284
+ ### Debug Configuration Issues
285
+
286
+ **Enable debug mode:**
287
+ ```bash
288
+ DEBUG_ME=1 ./director.rb -s scenes/scene_01_gym_wars.yml
289
+ ```
290
+
291
+ **Check actor logs:**
292
+ ```bash
293
+ tail -f logs/marcus_*.log
294
+ tail -f logs/marcus_*_err.log
295
+ ```
296
+
297
+ **Monitor Redis traffic:**
298
+ ```bash
299
+ redis-cli monitor
300
+ ```
301
+
302
+ ## Advanced Configuration
303
+
304
+ ### Custom RubyLLM Client Options
305
+
306
+ Edit `actor.rb` method `setup_llm` to add custom options:
307
+
308
+ ```ruby
309
+ @llm = RubyLLM::Client.new(
310
+ provider: provider,
311
+ model: model,
312
+ base_url: base_url,
313
+ timeout: 120,
314
+ # Add custom options:
315
+ temperature: 0.7, # Creativity level
316
+ max_tokens: 150, # Response length
317
+ top_p: 0.9, # Nucleus sampling
318
+ # etc.
319
+ )
320
+ ```
321
+
322
+ ### Per-Character Model Selection
323
+
324
+ Modify `Actor#initialize` to accept model override:
325
+
326
+ ```ruby
327
+ def initialize(character_info, model: nil)
328
+ @custom_model = model
329
+ # ... rest of initialization
330
+ end
331
+ ```
332
+
333
+ Then in `setup_llm`:
334
+ ```ruby
335
+ model = @custom_model || ENV['RUBY_LLM_MODEL'] || 'gpt-oss'
336
+ ```
337
+
338
+ Run with different models per character:
339
+ ```bash
340
+ # Would require custom launcher script
341
+ ```
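+ 
+ For example, a hypothetical launcher (illustrative only; the file name, paths, and character/model pairing are assumptions) could start each actor in its own process with its own `RUBY_LLM_MODEL`, relying on the environment fallback in `setup_llm`. Starting the director and the rest of the cast is left out of the sketch:
+ 
+ ```ruby
+ #!/usr/bin/env ruby
+ # launch_mixed_models.rb -- hypothetical helper, not shipped with the gem.
+ scene  = 'scenes/scene_01_gym_wars.yml'
+ models = { 'marcus' => 'mixtral', 'jamie' => 'llama2' }   # character => model (example values)
+ 
+ pids = models.map do |character, model|
+   # Each actor process gets its own RUBY_LLM_MODEL in its environment.
+   spawn({ 'RUBY_LLM_MODEL' => model },
+         './actor.rb', '-c', "characters/#{character}.yml", '-s', scene)
+ end
+ pids.each { |pid| Process.wait(pid) }
+ ```
+ 
+ Because `setup_llm` already falls back to `ENV['RUBY_LLM_MODEL']`, this per-process environment approach works even without the `model:` override shown above.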
342
+
343
+ ## Summary
344
+
345
+ **Minimal Setup (Default):**
346
+ ```bash
347
+ ollama serve
348
+ ollama pull gpt-oss
349
+ redis-server
350
+ ./run_scene_example.sh
351
+ ```
352
+
353
+ **Production Setup:**
354
+ ```bash
355
+ # In ~/.bashrc or ~/.zshrc
356
+ export RUBY_LLM_PROVIDER="ollama"
357
+ export RUBY_LLM_MODEL="mixtral"
358
+ export OLLAMA_URL="http://localhost:11434"
359
+ ```
360
+
361
+ The system is designed to work out-of-the-box with Ollama and gpt-oss, but gives you full flexibility to use any LLM provider supported by RubyLLM.
@@ -0,0 +1,226 @@
1
+ # Project Structure Guide
2
+
3
+ ## Overview
4
+
5
+ The Writer's Room now supports **project-based organization**, allowing multiple independent dialog projects to coexist.
6
+
7
+ ## Directory Layout
8
+
9
+ ```
10
+ writers_room/
11
+ ├── actor.rb # Core actor AI (shared)
12
+ ├── director.rb # Scene orchestrator (shared)
13
+ ├── run_scene_example.sh # Quick launcher (uses teen_play)
14
+ ├── messages/ # SmartMessage types (shared)
15
+ │ ├── dialog_message.rb
16
+ │ ├── scene_control_message.rb
17
+ │ ├── stage_direction_message.rb
18
+ │ └── meta_message.rb
19
+ ├── projects/ # PROJECT-BASED ORGANIZATION
20
+ │ ├── teen_play/ # Example project
21
+ │ │ ├── project.yml # Project metadata (optional)
22
+ │ │ ├── characters/
23
+ │ │ │ ├── marcus.yml
24
+ │ │ │ ├── jamie.yml
25
+ │ │ │ └── ...
26
+ │ │ └── scenes/
27
+ │ │ ├── scene_01_gym_wars.yml
28
+ │ │ └── ...
29
+ │ └── your_project/ # Your custom project
30
+ │ ├── project.yml # Project metadata (optional)
31
+ │ ├── characters/
32
+ │ └── scenes/
33
+ └── logs/ # Actor logs (auto-created)
34
+ ```
35
+
36
+ ## Project Metadata (Optional)
37
+
38
+ Each project can include a `project.yml` file with metadata about the project:
39
+
40
+ ```yaml
41
+ # projects/my_project/project.yml
42
+ title: "My Project Title"
43
+ tagline: "A catchy tagline for your project"
44
+ description: |
45
+ A detailed description of your project, its themes,
46
+ and what it's about.
47
+ genre: Comedy # or Drama, Thriller, etc.
48
+ target_audience: Teenagers and young adults
49
+ setting: Where your story takes place
50
+ themes:
51
+ - Theme 1
52
+ - Theme 2
53
+ ```
54
+
55
+ When present, the launcher (`run_scene_example.sh`) displays the project's title, tagline, and description when that project is selected. This metadata is purely informational and does not affect how scenes are executed.
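+ 
+ If you want to surface the same fields in your own tooling, a minimal sketch (assuming the `project.yml` layout shown above) is:
+ 
+ ```ruby
+ require 'yaml'
+ 
+ # Print the display fields the launcher shows for a selected project.
+ meta = YAML.load_file('projects/teen_play/project.yml')
+ puts meta['title'], meta['tagline'], meta['description']
+ ```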
56
+
57
+ ## Auto-Detection
58
+
59
+ The director **automatically detects** the character directory from the scene file path:
60
+
61
+ ```bash
62
+ # Scene path: projects/teen_play/scenes/scene_01.yml
63
+ # Auto-detects: projects/teen_play/characters/
64
+
65
+ ./director.rb -s projects/teen_play/scenes/scene_01_gym_wars.yml
66
+ # Characters loaded from: projects/teen_play/characters/
67
+ ```
68
+
69
+ ## Creating a New Project
70
+
71
+ ### 1. Create Project Directory Structure
72
+
73
+ ```bash
74
+ mkdir -p projects/my_project/characters
75
+ mkdir -p projects/my_project/scenes
76
+ ```
77
+
78
+ ### 2. Add Project Metadata (Optional)
79
+
80
+ Create a `project.yml` file with your project information:
81
+
82
+ ```yaml
83
+ # projects/my_project/project.yml
84
+ title: "My Awesome Dialog"
85
+ tagline: "A story about..."
86
+ description: |
87
+ Full description of your project
88
+ ```
89
+
90
+ ### 3. Add Character Definitions
91
+
92
+ Create YAML files in `projects/my_project/characters/`:
93
+
94
+ ```yaml
95
+ # projects/my_project/characters/alice.yml
96
+ name: Alice
97
+ age: 25
98
+ personality: |
99
+ Your character description...
100
+ voice_pattern: |
101
+ How this character speaks...
102
+ # ... etc
103
+ ```
104
+
105
+ ### 4. Create Scene Definitions
106
+
107
+ Create YAML files in `projects/my_project/scenes/`:
108
+
109
+ ```yaml
110
+ # projects/my_project/scenes/scene_01.yml
111
+ scene_number: 1
112
+ scene_name: "Opening Scene"
113
+ characters:
114
+ - Alice
115
+ - Bob
116
+ # ... etc
117
+ ```
118
+
119
+ ### 5. Run Your Project
120
+
121
+ ```bash
122
+ ./director.rb -s projects/my_project/scenes/scene_01.yml
123
+ ```
124
+
125
+ The director will automatically find characters in `projects/my_project/characters/`.
126
+
127
+ ## Path Resolution Logic
128
+
129
+ The director uses this logic to find characters:
130
+
131
+ 1. **Explicit override**: If `-c` flag provided, use that path
132
+ 2. **Project pattern**: If scene matches `projects/PROJECT_NAME/scenes/`, use `projects/PROJECT_NAME/characters/`
133
+ 3. **Relative lookup**: Check for `../characters/` relative to scene directory
134
+ 4. **Fallback**: Use `characters/` in current directory
135
+
136
+ ## Running Commands
137
+
138
+ ### Using Auto-Detection (Recommended)
139
+
140
+ ```bash
141
+ ./director.rb -s projects/teen_play/scenes/scene_01_gym_wars.yml
142
+ ```
143
+
144
+ ### Specifying Character Directory Manually
145
+
146
+ ```bash
147
+ ./director.rb -s projects/my_project/scenes/scene_01.yml -c projects/my_project/characters
148
+ ```
149
+
150
+ ### Running Individual Actors
151
+
152
+ ```bash
153
+ ./actor.rb \
154
+ -c projects/my_project/characters/alice.yml \
155
+ -s projects/my_project/scenes/scene_01.yml
156
+ ```
157
+
158
+ ## Example Projects
159
+
160
+ ### Teen Play (Included)
161
+
162
+ ```
163
+ projects/teen_play/
164
+ ├── project.yml # Project metadata
165
+ ├── characters/ # 6 teen characters
166
+ │ ├── marcus.yml
167
+ │ ├── jamie.yml
168
+ │ ├── tyler.yml
169
+ │ ├── alex.yml
170
+ │ ├── benny.yml
171
+ │ └── zoe.yml
172
+ └── scenes/ # 4 complete scenes
173
+ ├── scene_01_gym_wars.yml
174
+ ├── scene_02_statistical_anomaly.yml
175
+ ├── scene_04_equipment_room.yml
176
+ └── scene_08_data_dump.yml
177
+ ```
178
+
179
+ ### Your Custom Project
180
+
181
+ ```
182
+ projects/your_project/
183
+ ├── characters/
184
+ │ ├── character1.yml
185
+ │ └── character2.yml
186
+ └── scenes/
187
+ └── scene_01.yml
188
+ ```
189
+
190
+ ## Benefits of Project Structure
191
+
192
+ ✅ **Multiple Projects**: Run different dialog projects independently
193
+ ✅ **Clean Organization**: Each project is self-contained
194
+ ✅ **Easy Sharing**: Share individual projects as directories
195
+ ✅ **Auto-Detection**: No need to specify character paths
196
+ ✅ **Flexible**: Can still use custom paths with `-c` flag
197
+
198
+ ## Migration from Flat Structure
199
+
200
+ If you have an older setup with flat `characters/` and `scenes/` directories:
201
+
202
+ ```bash
203
+ # Old structure:
204
+ writers_room/characters/*.yml
205
+ writers_room/scenes/*.yml
206
+
207
+ # Migrate to:
208
+ mkdir -p projects/my_project
209
+ mv characters projects/my_project/
210
+ mv scenes projects/my_project/
211
+
212
+ # Update your commands:
213
+ # OLD: ./director.rb -s scenes/scene_01.yml
214
+ # NEW: ./director.rb -s projects/my_project/scenes/scene_01.yml
215
+ ```
216
+
217
+ ## Tips
218
+
219
+ - **One project at a time**: Each scene run uses one project
220
+ - **Share characters**: Copy character YAMLs between projects
221
+ - **Version control**: Each project can have its own git repo
222
+ - **Templates**: Use `teen_play` as a template for new projects
223
+
224
+ ---
225
+
226
+ **Need help?** See `README.md` for full documentation.