A simple pipeline to integrate searching, model iteration, and code correction with local Ollama models.
## Overview
This project comprises several modules that work together to streamline the process of:
- Creating isolated virtual environments for safe code execution ([`UserEnvironment`](codeExecution.py)).
- Executing and orchestrating code via [`main.py`](main.py).
- Handling web search queries, model iterations, and task classification with local Ollama models using functions from [`queries.py`](queries.py) and [`search.py`](search.py).
- Managing conversation history in a local SQLite database via [`conversation_store.py`](conversation_store.py).
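The flow described above can be sketched roughly as follows. Every function name here is a hypothetical stand-in for illustration, not the project's actual API (classification and search really live in `queries.py` and `search.py`, and the reply would come from a local Ollama model):

```python
# Rough sketch of the pipeline loop; all names are illustrative stand-ins.

def classify_task(prompt: str) -> str:
    """Stand-in for task classification (handled by queries.py in the project)."""
    return "search" if "latest" in prompt.lower() else "code"

def web_search(prompt: str) -> str:
    """Stand-in for the web-search step (handled by search.py in the project)."""
    return f"search results for: {prompt}"

def run_pipeline(prompt: str, history: list[str]) -> str:
    history.append(prompt)
    task = classify_task(prompt)
    context = web_search(prompt) if task == "search" else ""
    # In the real pipeline, a local Ollama model would generate the reply here.
    reply = f"[{task}] reply using context: {context or 'none'}"
    history.append(reply)
    return reply

history: list[str] = []
print(run_pipeline("What is the latest Python release?", history))
```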
## Project Structure
- **codeExecution.py**: Implements the [`UserEnvironment`](codeExecution.py) class that creates a virtual environment for code execution with basic security measures.
- **main.py**: Serves as the entry point to the pipeline, orchestrating code execution and integrating search and model iterations.
- **queries.py**: Contains functions to perform web search, task classification, and other queries.
- **search.py**: Provides utilities for performing web searches within the pipeline.
- **conversation_store.py**: Manages conversation persistence in a SQLite database under the `data/` folder.
- **debug.py**: Includes debug utilities for troubleshooting.
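The core idea behind `codeExecution.py` (running code inside an isolated virtual environment) can be sketched with the standard library alone. The class and method names below are illustrative only; see the actual [`UserEnvironment`](codeExecution.py) class for the project's real implementation and security measures:

```python
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

class SandboxEnv:
    """Illustrative sketch of an isolated venv for code execution
    (the project's real class is UserEnvironment in codeExecution.py)."""

    def __init__(self) -> None:
        self.root = Path(tempfile.mkdtemp(prefix="sandbox-"))
        # Create the venv; pip is skipped to keep the sketch fast.
        venv.create(self.root, with_pip=False)
        bindir = "Scripts" if sys.platform == "win32" else "bin"
        self.python = self.root / bindir / "python"

    def run(self, code: str, timeout: float = 10.0) -> subprocess.CompletedProcess:
        # Execute code with the venv's interpreter, capturing output.
        # A timeout is a minimal safety measure; real sandboxing needs more.
        return subprocess.run(
            [str(self.python), "-c", code],
            capture_output=True, text=True, timeout=timeout,
        )

env = SandboxEnv()
result = env.run("print(2 + 2)")
print(result.stdout.strip())
```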
## Installation
1. Install the necessary dependencies via [requirements.txt](requirements.txt):
```sh
pip install -r requirements.txt
```
2. Ensure your Python version is compatible with the virtual environment setup (see [codeExecution.py](codeExecution.py)).
## Usage
Run the pipeline by executing the main script:
```sh
python main.py
```
During execution, the project will:
- Pull and stream model updates from Ollama.
- Orchestrate web searches, model queries, and classification tasks.
- Maintain a conversation history for iterative improvements.
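The conversation-history step can be illustrated with a minimal SQLite-backed store. The table name and schema below are assumptions for the sketch, not the actual ones used by [`conversation_store.py`](conversation_store.py):

```python
import sqlite3

class ConversationStore:
    """Minimal sketch of SQLite-backed conversation persistence.
    Schema and names are illustrative, not conversation_store.py's actual ones."""

    def __init__(self, path: str = ":memory:") -> None:
        # The project keeps its database under data/; ":memory:" keeps this sketch self-contained.
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
            "  role TEXT NOT NULL,"
            "  content TEXT NOT NULL)"
        )

    def add(self, role: str, content: str) -> None:
        self.conn.execute(
            "INSERT INTO messages (role, content) VALUES (?, ?)", (role, content)
        )
        self.conn.commit()

    def history(self) -> list[tuple[str, str]]:
        # Messages in insertion order, ready to feed back into the model.
        return self.conn.execute(
            "SELECT role, content FROM messages ORDER BY id"
        ).fetchall()

store = ConversationStore()
store.add("user", "Search for the latest Ollama release.")
store.add("assistant", "Here is what I found...")
print(store.history())
```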