interintel 1.0.22 → 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,38 +1,137 @@
- ## INTERINTEL
+ # Interintel
 
- The application `interintel` is a command line interface (CLI) application implemented in Node.js. It essentially is an interactive communication tool between the user and an AI model, only GPTs for now.
+ A CLI for local AI-assisted development. Chat with AI models running on your machine to read, search, edit files, and run commands—without sending your code to the cloud.
 
- Here's a brief overview of the main functionalities, as contained in the index.js file:
+ ## Features
 
- - The application starts an interactive session with the user. It does this by invoking the readline module, which reads user inputs line by line from the terminal.
+ - **Local-first**: Works with Ollama for fully local AI (no data leaves your machine)
+ - **Multi-provider**: Supports Ollama, OpenAI, and Mistral
+ - **AI-driven tools**: AI can read files, search code, edit files, and run commands
+ - **Permission system**: You control what the AI can do
+ - **Fuzzy search**: Natural language queries find `chatCompletion`, `chat_completion`, etc.
 
- -- 'node index.js' will start the app
-
- - The OpenAI's API is accessed using API keys, and the version of AI being used is specified via a config.
+ ## Requirements
 
- ### //writeFile
- - If a user types '//writefile', the application prompts the user to provide a name for the file and then a prompt for the AI. It then automatically generates some text (by communicating with OpenAI's GPT-3 model), writes this to a file and stores it in the application's directory.
-
- ### //readRefs
- - If a user writes '//readRefs', the application reads the content of the specified files in the interintel.config.js (in the current implementation) and uses this as part of the conversation with the AI.
- - So as you work on your project and files change or you move to other parts of the code base you can adjust where interintel points and what is referenced by AI conversation.
+ - Node.js >= 18.0.0
+ - [Ollama](https://ollama.ai/) (for local AI) with a tool-capable model:
+   - `llama3.1:8b` (recommended)
+   - `qwen2.5:7b`
+   - `gpt-oss:20b`
 
- ### everything else
- - For all user inputs outside of these special keywords, the chat conversation is simply updated with the user's message and a call is made to the OpenAI API to generate the AI's response. This is then displayed on the console.
+ ## Installation
 
- The application relies heavily on async-await pattern for handling the asynchronicity associated with reading user inputs and waiting for responses from the OpenAI 'aiChatCompletion' function.
+ ```bash
+ npm install interintel
+ ```
 
- Keep in mind that this is a high level overview and each functionality has its own level of implementation detail.
+ Or clone and run locally:
+ ```bash
+ git clone https://github.com/modern-sapien/interintel.git
+ cd interintel
+ npm install
+ node index.js
+ ```
 
- File Structure
+ ## Configuration
 
- project/
- ├── functions/
- │   ├── chat-functions.js
- │   ├── file-functions.js
- │   ├── openai-functions.js
- │   ├── messageUtils.js
- │   ├── writeFileHandler.js
- │   └── readRefsHandler.js
- ├── interintel.config.js
- └── index.js
+ After install, edit `interintel.config.js` in your project root:
+
+ ```javascript
+ export default {
+   // 'ollama' | 'openai' | 'mistral'
+   aiService: 'ollama',
+
+   // Model name
+   aiVersion: 'llama3.1:8b',
+
+   // API key (only for OpenAI/Mistral)
+   apiKey: process.env.OPENAI_API_KEY || '',
+
+   // System prompt - customize AI behavior
+   systemPrompt: 'You are a direct, focused coding assistant.',
+
+   // Files to load into context
+   filePaths: [],
+ };
+ ```
+
+ ## Usage
+
+ ```bash
+ interintel
+ # or
+ node index.js
+ ```
+
+ Then chat naturally:
+ ```
+ You: What files are in this project?
+ You: Find where fetchUser is defined
+ You: Update the version in package.json to 2.0.0
+ You: Run npm test
+ ```
+
+ ## AI Tools
+
+ The AI has access to these tools:
+
+ | Tool | Description | Permission |
+ |------|-------------|------------|
+ | `read_file` | Read file contents | Allowed |
+ | `list_directory` | List files/folders | Allowed |
+ | `search_directory` | Search code with fuzzy matching | Allowed |
+ | `edit_file` | Modify existing files | Requires permission |
+ | `write_file` | Create new files | Requires permission |
+ | `run_command` | Execute shell commands | Requires permission |
+
+ ## Permissions
+
+ Write and execute operations require your approval:
+
+ ```
+ AI wants to write: src/index.js
+ [y]es (once) / [n]o / [a]lways (save) / [g]lobal (trust all writes):
+ ```
+
+ - `y` - Allow once (session only)
+ - `n` - Deny
+ - `a` - Always allow this directory (saved)
+ - `g` - Trust all write operations (saved)
+
+ View permissions with `//permissions`. Settings saved in `.interintel/interintel.permissions.json`.
+
+ ## Commands
+
+ | Command | Description |
+ |---------|-------------|
+ | `//plan <task>` | Collaborative planning mode - discuss approach before executing |
+ | `//permissions` | Show current permission settings |
+ | `//readrefs` | Load reference files from config |
+ | `//writefile` | AI-assisted file creation |
+ | `exit` | Quit interintel |
+
+ ## File Structure
+
+ ```
+ interintel/
+ ├── index.js           # Entry point
+ ├── setup.js           # Postinstall setup
+ ├── src/
+ │   ├── cli.js         # Main CLI loop
+ │   ├── providers.js   # AI service interface (Ollama/OpenAI/Mistral)
+ │   ├── tools/
+ │   │   ├── definitions.json
+ │   │   ├── executors.js
+ │   │   ├── index.js
+ │   │   └── permissions.js
+ │   └── utils/
+ │       ├── chat.js    # Readline helpers
+ │       ├── files.js   # File operations
+ │       └── writeFileHandler.js
+ └── templates/
+     └── config.js      # Config template
+ ```
+
+ ## License
+
+ Apache-2.0
package/index.js CHANGED
@@ -1,99 +1,6 @@
- import path from 'path';
- import readline from 'readline';
- import dotenv from 'dotenv';
- import colors from 'colors';
- const configPath = path.join(process.cwd(), 'interintel.config.js');
-
- import config from './interintel.config.js';
- import { readSpecificFiles } from './functions/file-functions.js';
- import { askQuestion } from './functions/chat-functions.js';
- import { handleWriteFile } from './functions/handleWriteFile.js';
- import { chatCompletion } from './serviceInterface.js';
-
- dotenv.config();
-
- const rl = readline.createInterface({
-   input: process.stdin,
-   output: process.stdout,
- });
-
- async function main() {
-   let initialContent = await readSpecificFiles(configPath);
-   let messages = [{ role: 'system', content: initialContent }];
-
-   let currentState = null;
-   let promptFileName = '';
-
-   while (true) {
-     const userMessage = await askQuestion(rl, 'You: '.blue.bold);
-     let response = '';
-
-     // Exit condition
-     if (userMessage.toLowerCase() === 'exit') {
-       console.log('Exiting chat...'.bgRed);
-       rl.close();
-       break;
-     }
-
-     if (userMessage.toLowerCase().startsWith('//writefile') && currentState === null) {
-       let result = await handleWriteFile(config, messages, currentState, '');
-       ({ currentState, messages, promptFileName, response } = result); // Update messages array
-       console.log(response.yellow);
-     } else if (currentState === 'awaitingFileName') {
-       ({ currentState, messages, promptFileName, response } = await handleWriteFile(
-         config,
-         messages,
-         currentState,
-         userMessage,
-         promptFileName
-       ));
-       console.log(response.yellow);
-     } else if (currentState === 'awaitingAIprompt') {
-       ({ currentState, messages, promptFileName, response } = await handleWriteFile(
-         config,
-         messages,
-         currentState,
-         userMessage,
-         promptFileName
-       ));
-       console.log(response.yellow);
-     } else if (currentState === null && userMessage.toLowerCase() === '//readrefs') {
-       console.log('System message:'.bgYellow);
-       console.log('Processing //readRefs command...'.yellow);
-
-       let content = readSpecificFiles(configPath);
-       messages.push({
-         role: 'user',
-         content: `please just acknowledge you have read the name and the content of the files I have provided. once you have done this a single time you do not need to do it again. ${content}`,
-       });
-       const completion = await chatCompletion(config.aiService, messages, config.aiVersion);
-
-       let botMessage = '';
-
-       if (config.aiService === 'openai' || config.aiService === 'mistral') {
-         botMessage = completion.choices[0].message.content;
-       } else if (config.aiService === 'ollama') {
-         // Adjust this line based on how Ollama's response is structured
-         botMessage = completion;
-       }
-     } else {
-       // Regular message processing and interaction with GPT model
-       messages.push({ role: 'user', content: userMessage });
-
-       const completion = await chatCompletion(config.aiService, messages, config.aiVersion);
-
-       let botMessage;
-       if (config.aiService === 'openai' || config.aiService === 'mistral') {
-         botMessage = completion.choices[0].message.content;
-       } else if (config.aiService === 'ollama') {
-         // Adjust based on Ollama's response format
-         botMessage = completion; // Example - replace with actual response structure for Ollama
-       }
-
-       console.log(`${config.aiVersion}`.bgGreen, botMessage.green);
-       console.log('----------------'.bgGreen);
-     }
-   }
- }
-
- export { main };
+ #!/usr/bin/env node
+ /**
+  * Interintel - Local AI Development Assistant
+  * Entry point - delegates to src/cli.js
+  */
+ import './src/cli.js';
package/package.json CHANGED
@@ -1,35 +1,49 @@
  {
+   "name": "interintel",
+   "version": "2.0.0",
+   "description": "CLI for local AI-assisted development with OpenAI, Mistral, and Ollama. Chat with AI to read, search, edit files and run commands.",
+   "type": "module",
+   "main": "index.js",
+   "bin": {
+     "interintel": "index.js"
+   },
+   "engines": {
+     "node": ">=18.0.0"
+   },
+   "files": [
+     "index.js",
+     "setup.js",
+     "src/",
+     "templates/"
+   ],
    "dependencies": {
      "@mistralai/mistralai": "^0.0.8",
      "colors": "^1.4.0",
      "dotenv": "^16.3.1",
+     "node-fetch": "^3.3.2",
      "openai": "^4.24.0"
    },
-   "name": "interintel",
-   "description": "The application `Interintel` is a command line interface (CLI) application implemented in Node.js. It essentially is an interactive communication tool between the user and an AI model, only openai models for now.",
-   "version": "1.0.22",
-   "type": "module",
-   "main": "index.js",
-   "directories": {
-     "doc": "docs"
-   },
    "scripts": {
-     "postinstall": "echo 'Running postinstall script...' && node setup.js",
-     "test": ""
+     "postinstall": "node setup.js",
+     "start": "node index.js"
    },
    "repository": {
      "type": "git",
      "url": "git+https://github.com/modern-sapien/inter-intel.git"
    },
    "keywords": [
-     "testing",
-     "development",
      "ai",
-     "debugging",
-     "copilot"
+     "cli",
+     "local-ai",
+     "ollama",
+     "openai",
+     "mistral",
+     "development",
+     "copilot",
+     "assistant"
    ],
    "author": "Modern-Sapien",
-   "license": "SEE LICENSE IN README.md",
+   "license": "Apache-2.0",
    "bugs": {
      "url": "https://github.com/modern-sapien/inter-intel/issues"
    },
package/setup.js CHANGED
@@ -1,46 +1,34 @@
- const fs = require('fs');
- const path = require('path');
- const colors = require('colors')
+ #!/usr/bin/env node
+ import fs from 'fs';
+ import path from 'path';
+ import { fileURLToPath } from 'url';
+ import colors from 'colors';
 
- const configPath = path.join('../../interintel.config.js');
- const templatePath = path.join(__dirname, '/resources/interintel.config.template.js');
+ const __filename = fileURLToPath(import.meta.url);
+ const __dirname = path.dirname(__filename);
 
- const readMePath = path.join('../../interintel/interintelReadMe.md');
- const readMeTemplate = path.join(__dirname, '/README.md');
+ // When installed via npm, process.cwd() is the user's project root
+ // When running locally, it's the interintel directory
+ const projectRoot = process.cwd();
 
- // Creating directory to hold onto assets
- try {
-   fs.mkdirSync('../../interintel')
- } catch (error) {
-   console.log('Error occurred during setup:', error)
- }
-
- // Cloning config outside of node_modules and where root typically is
- try {
-   if (!fs.existsSync(configPath)) {
-     console.log('Config file does not exist, creating...');
-     fs.copyFileSync(templatePath, configPath);
-     console.log('Interintel config created. Please update it with your settings.'.yellow);
+ const configPath = path.join(projectRoot, 'interintel.config.js');
+ const templatePath = path.join(__dirname, 'templates/config.js');
 
-   } else {
-     console.log('Interintel config file already exists.');
-   }
- } catch (error) {
-   console.error('Error occurred during setup:', error);
- }
+ console.log('Interintel setup...'.cyan);
+ console.log(`Project root: ${projectRoot}`.gray);
 
- // Cloning README outside of node_modules and where root typically is
+ // Create config file in user's project root
  try {
-   if (!fs.existsSync(readMePath)) {
-     console.log('Readme file does not exist, creating...');
-     fs.copyFileSync(readMeTemplate, readMePath);
-     console.log('Interintel readme created. Please update it with your settings.'.yellow);
-
-   } else {
-     console.log('Interintel readme file already exists.');
-   }
+   if (!fs.existsSync(configPath)) {
+     console.log('Creating interintel.config.js...'.yellow);
+     fs.copyFileSync(templatePath, configPath);
+     console.log('Config created! Edit interintel.config.js to set your AI service and model.'.green);
+   } else {
+     console.log('interintel.config.js already exists.'.gray);
+   }
  } catch (error) {
-   console.error('Error occurred during setup:', error);
+   // Likely running in a context where we can't write (e.g., global install)
+   console.log('Note: Could not create config file. Create interintel.config.js manually.'.yellow);
  }
 
- console.log("Finished running setup.js script.");
+ console.log('Setup complete. Run "interintel" or "node index.js" to start.'.green);