Mirror of https://github.com/ION606/ML-pipeline.git, synced 2026-05-14 21:06:54 +00:00
# ML-pipeline

A simple pipeline to integrate searching, model iteration, and code correction with local Ollama models.
## Overview

This project comprises several modules that work together to streamline the process of:

- Creating isolated virtual environments for safe code execution (`UserEnvironment`).
- Executing and orchestrating code via `main.py`.
- Handling web search queries, model iterations, and task classification with local Ollama models, using functions from `queries.py` and `search.py`.
- Managing conversation history in a local SQLite database via `conversation_store.py`.
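The `UserEnvironment` implementation is not reproduced here, but the isolation idea can be sketched with nothing beyond the standard library: build a throwaway virtual environment and run the untrusted snippet with that environment's interpreter (function name and details below are illustrative, not the project's actual API):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

def run_in_sandbox(code: str) -> str:
    """Illustrative sketch: run a code snippet with the interpreter of a
    throwaway virtual environment, then discard the environment."""
    with tempfile.TemporaryDirectory() as tmp:
        env_dir = Path(tmp) / "env"
        # Create the venv (--without-pip keeps creation fast; no packages needed here).
        subprocess.run(
            [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
            check=True,
        )
        # The venv's interpreter lives under bin/ (Scripts\ on Windows).
        bin_dir = "Scripts" if sys.platform == "win32" else "bin"
        venv_python = env_dir / bin_dir / "python"
        # A timeout is one of the "basic security measures" such a sandbox needs.
        result = subprocess.run(
            [str(venv_python), "-c", code],
            capture_output=True, text=True, timeout=60,
        )
        return result.stdout.strip()
```

A real sandbox would add more controls (resource limits, restricted environment variables), but the venv-per-execution pattern is the core of the approach.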
## Project Structure

- `codeExecution.py`: Implements the `UserEnvironment` class, which creates a virtual environment for code execution with basic security measures.
- `main.py`: Serves as the entry point to the pipeline, orchestrating code execution and integrating search and model iterations.
- `queries.py`: Contains functions for web searches, task classification, and other queries.
- `search.py`: Provides utilities for performing web searches in the pipeline.
- `conversation_store.py`: Manages conversation persistence in a SQLite database under the `data/` folder.
- `debug.py`: Includes debug utilities for troubleshooting.
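The conversation store boils down to an append-only message table. A minimal sketch of that idea, using only `sqlite3` from the standard library (the class name, schema, and default path here are assumptions, not the project's actual code):

```python
import sqlite3
from pathlib import Path

class ConversationStore:
    """Illustrative sketch: append-only conversation history in SQLite."""

    def __init__(self, db_path: str = "data/conversations.db"):
        # Ensure the data/ folder exists before SQLite tries to create the file.
        Path(db_path).parent.mkdir(parents=True, exist_ok=True)
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
            "  role TEXT NOT NULL,"          # e.g. 'user' or 'assistant'
            "  content TEXT NOT NULL,"
            "  created_at TEXT DEFAULT CURRENT_TIMESTAMP)"
        )

    def add(self, role: str, content: str) -> None:
        self.conn.execute(
            "INSERT INTO messages (role, content) VALUES (?, ?)", (role, content)
        )
        self.conn.commit()

    def history(self) -> list[tuple[str, str]]:
        """Return (role, content) pairs in insertion order."""
        cur = self.conn.execute("SELECT role, content FROM messages ORDER BY id")
        return cur.fetchall()
```

Replaying `history()` into each model call is what lets the pipeline iterate on earlier answers.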
## Installation

1. Install the dependencies from `requirements.txt`:

   ```shell
   pip install -r requirements.txt
   ```

2. Ensure your Python version is compatible with the virtual environment setup (see `codeExecution.py`).
## Usage

Run the pipeline by executing the main script:

```shell
python main.py
```
During execution, the project will:
- Pull and stream model updates from Ollama.
- Orchestrate web searches, model queries, and classification tasks.
- Maintain a conversation history for iterative improvements.
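The routing step above depends on task classification. In the pipeline this is model-driven, but the control flow can be sketched with a hypothetical keyword-based stand-in (`classify_task` and its categories are illustrative, not the project's actual logic):

```python
def classify_task(prompt: str) -> str:
    """Hypothetical stand-in for model-based classification: route a prompt
    to 'search', 'code', or 'chat' by simple keyword heuristics."""
    lowered = prompt.lower()
    # Questions about current or external information go to web search.
    if any(k in lowered for k in ("latest", "news", "look up", "search for")):
        return "search"
    # Programming requests go to the code-execution/correction path.
    if any(k in lowered for k in ("write a function", "fix this code", "traceback")):
        return "code"
    # Everything else is handled as plain conversation.
    return "chat"

def dispatch(prompt: str) -> str:
    """Pick a handler based on the classified task."""
    handlers = {
        "search": lambda p: f"[web search] {p}",
        "code": lambda p: f"[code pipeline] {p}",
        "chat": lambda p: f"[chat model] {p}",
    }
    return handlers[classify_task(prompt)](prompt)
```

In the real pipeline each handler would call into `search.py`, `codeExecution.py`, or the Ollama chat model rather than returning a tagged string.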
## License

See LICENSE for details.