olca 0.2.71__tar.gz → 0.2.72__tar.gz
- {olca-0.2.71 → olca-0.2.72}/PKG-INFO +7 -2
- {olca-0.2.71 → olca-0.2.72}/README.md +6 -1
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/PKG-INFO +7 -2
- {olca-0.2.71 → olca-0.2.72}/pyproject.toml +1 -1
- {olca-0.2.71 → olca-0.2.72}/setup.py +1 -1
- {olca-0.2.71 → olca-0.2.72}/LICENSE +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/__init__.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/fusewill_cli.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/fusewill_utils.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/oiv.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/olcacli.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/olcahelper.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/prompts.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/tracing.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca/utils.py +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/SOURCES.txt +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/dependency_links.txt +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/entry_points.txt +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/requires.txt +0 -0
- {olca-0.2.71 → olca-0.2.72}/olca.egg-info/top_level.txt +0 -0
- {olca-0.2.71 → olca-0.2.72}/setup.cfg +0 -0
--- olca-0.2.71/PKG-INFO
+++ olca-0.2.72/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: olca
-Version: 0.2.71
+Version: 0.2.72
 Summary: A Python package for experimental usage of Langchain and Human-in-the-Loop
 Home-page: https://github.com/jgwill/olca
 Author: Jean GUillaume ISabelle
@@ -444,11 +444,16 @@ To initialize `olca`, you need to create a configuration file named `olca.yml`.
 ```yaml
 api_keyname: OPENAI_API_KEY__o450olca241128
 human: true
-model_name: gpt-4o-mini
+model_name: gpt-4o-mini #or bellow:
+model_name: ollama://llama3.1:latest #or with host
+model_name: ollama://llama3.1:latest@mymachine.mydomain.com:11434
 recursion_limit: 300
 system_instructions: You focus on interacting with human and do what they ask. Make sure you dont quit the program.
 temperature: 0.0
 tracing: true
+tracing_providers:
+- langsmith
+- langfuse
 user_input: Look in the file 3act.md and in ./story, we have created a story point by point and we need you to generate the next iteration of the book in the folder ./book. You use what you find in ./story to start the work. Give me your plan to correct or accept.
 ```
 
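The new `model_name` values in this release use a URI-like convention: a plain name selects an OpenAI-style model, while an `ollama://model[@host:port]` value points at a local or remote Ollama server. olca's actual parser is not shown in this diff; the function below is a hypothetical sketch of how such a value could be split into provider, model, and host.

```python
def parse_model_name(value: str):
    """Split an olca.yml model_name value into (provider, model, host).

    Hypothetical sketch: olca's real handling of these values is not
    shown in the diff above; only the value formats are taken from it.
    """
    if "://" not in value:
        # Plain names such as gpt-4o-mini carry no scheme or host.
        return ("openai", value, None)
    scheme, rest = value.split("://", 1)
    if "@" in rest:
        # The model part may itself contain ':tag', so split on the
        # last '@' to separate the host:port suffix.
        model, host = rest.rsplit("@", 1)
    else:
        model, host = rest, None  # default local server
    return (scheme, model, host)

print(parse_model_name("gpt-4o-mini"))
print(parse_model_name("ollama://llama3.1:latest"))
print(parse_model_name("ollama://llama3.1:latest@mymachine.mydomain.com:11434"))
```

Splitting on the last `@` rather than the first keeps model tags like `llama3.1:latest` intact when a host is appended.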
--- olca-0.2.71/README.md
+++ olca-0.2.72/README.md
@@ -73,11 +73,16 @@ To initialize `olca`, you need to create a configuration file named `olca.yml`.
 ```yaml
 api_keyname: OPENAI_API_KEY__o450olca241128
 human: true
-model_name: gpt-4o-mini
+model_name: gpt-4o-mini #or bellow:
+model_name: ollama://llama3.1:latest #or with host
+model_name: ollama://llama3.1:latest@mymachine.mydomain.com:11434
 recursion_limit: 300
 system_instructions: You focus on interacting with human and do what they ask. Make sure you dont quit the program.
 temperature: 0.0
 tracing: true
+tracing_providers:
+- langsmith
+- langfuse
 user_input: Look in the file 3act.md and in ./story, we have created a story point by point and we need you to generate the next iteration of the book in the folder ./book. You use what you find in ./story to start the work. Give me your plan to correct or accept.
 ```
 
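This release also adds a `tracing_providers` list alongside the existing `tracing` flag, naming langsmith and langfuse. How olca wires these up internally is not visible in the diff; the sketch below is one hypothetical way a config loader could dispatch on that list (the environment-variable convention for LangSmith is an assumption, and the langfuse branch is a placeholder).

```python
import os

def setup_tracing(providers):
    """Enable each tracer named under tracing_providers in olca.yml.

    Hypothetical sketch: only the provider names come from the diff;
    the wiring shown here is illustrative, not olca's actual code.
    """
    enabled = []
    for name in providers:
        if name == "langsmith":
            # LangSmith tracing is conventionally switched on via an
            # environment variable read by LangChain (assumption).
            os.environ["LANGCHAIN_TRACING_V2"] = "true"
            enabled.append(name)
        elif name == "langfuse":
            # A real implementation would register the Langfuse SDK's
            # callback handler here; omitted in this sketch.
            enabled.append(name)
        else:
            raise ValueError(f"unknown tracing provider: {name}")
    return enabled

print(setup_tracing(["langsmith", "langfuse"]))
```

Keeping the boolean `tracing` as a master switch and the list as a selector lets existing configs keep working unchanged.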
--- olca-0.2.71/olca.egg-info/PKG-INFO
+++ olca-0.2.72/olca.egg-info/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: olca
-Version: 0.2.71
+Version: 0.2.72
 Summary: A Python package for experimental usage of Langchain and Human-in-the-Loop
 Home-page: https://github.com/jgwill/olca
 Author: Jean GUillaume ISabelle
@@ -444,11 +444,16 @@ To initialize `olca`, you need to create a configuration file named `olca.yml`.
 ```yaml
 api_keyname: OPENAI_API_KEY__o450olca241128
 human: true
-model_name: gpt-4o-mini
+model_name: gpt-4o-mini #or bellow:
+model_name: ollama://llama3.1:latest #or with host
+model_name: ollama://llama3.1:latest@mymachine.mydomain.com:11434
 recursion_limit: 300
 system_instructions: You focus on interacting with human and do what they ask. Make sure you dont quit the program.
 temperature: 0.0
 tracing: true
+tracing_providers:
+- langsmith
+- langfuse
 user_input: Look in the file 3act.md and in ./story, we have created a story point by point and we need you to generate the next iteration of the book in the folder ./book. You use what you find in ./story to start the work. Give me your plan to correct or accept.
 ```
 
--- olca-0.2.71/setup.py
+++ olca-0.2.72/setup.py
@@ -2,7 +2,7 @@ from setuptools import setup, find_packages
 
 setup(
     name='olca',
-    version = "0.2.71",
+    version = "0.2.72",
     author='Jean GUillaume ISabelle',
     author_email='jgi@jgwill.com',
     description='A Python package for experimenting with Langchain agent and interactivity in Terminal modalities.',