In today's world, we interact with our computers through a series of clicks, drags, and memorized commands. We navigate through menus, search for the right application, and perform multi-step processes to achieve a single goal.
What if we could simplify this? What if you could just tell your computer what you want, and it would understand and act?
This is the vision behind ODK.
ODK is not just another application; it's a new way to interact. It's a bridge between your thoughts and your computer's capabilities. It allows you to express your goals in plain, natural language, and watch as your computer brings them to life.
Think of it as a blank canvas where your words become actions. You write down what you need, and the machine handles the "how."
Imagine transforming complex tasks into a single instruction. With ODK, you can:
Organize Your Files Effortlessly:
- "Find every PDF presentation I've downloaded this month, rename them to include today's date, and move them all into my 'Reports' folder."
Automate Tedious Work:
- "Look at this spreadsheet of contacts, find everyone from New York, and create a separate mailing list file for them."
Boost Your Creativity:
- "Take the last 5 images from my 'Designs' folder, resize them for social media, and place a small watermark in the bottom-right corner of each."
Streamline Your Projects:
- "Create a new folder for my project named 'Odyssey', set up the standard subfolders like 'src', 'assets', and 'docs', and initialize a Git repository inside it."
ODK is designed to remove the friction between having an idea and seeing it realized. It's for anyone who has ever thought, "There has to be a faster way to do this."
Our goal is to empower you to work more creatively and efficiently, turning the power of your computer into a natural extension of your own mind.
ODK is built with a focus on local AI models rather than relying on external APIs. This approach ensures privacy and allows you to work offline with small, efficient models.
```
Frontend (HTML/CSS/JS)
        ↓ Tauri IPC
Rust Backend
        ↓ Process Communication
Python AI Engine (Open Interpreter)
```
Technology Stack:
- Frontend: HTML, CSS, and JavaScript with a modern UI
- Backend: Rust with Tauri framework for cross-platform support
- AI Engine: Python with Open Interpreter library
- Models: Supports Ollama local models, OpenAI, Google Gemini, and Anthropic Claude
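To make the model options above concrete, here is a minimal sketch of how the Python engine might configure Open Interpreter for a chosen provider. The `interpreter` attributes (`llm.model`, `llm.api_base`, `offline`, `auto_run`) come from Open Interpreter's documented configuration; the `configure_model` helper and the example model names are illustrative assumptions, not ODK's actual code:

```python
# Illustrative sketch only: how runner.py might select a model.
from interpreter import interpreter

def configure_model(provider: str, model: str) -> None:
    if provider == "ollama":
        # Local model served by Ollama; no API key, no network round-trips.
        interpreter.llm.model = f"ollama/{model}"            # e.g. "ollama/llama3"
        interpreter.llm.api_base = "http://localhost:11434"  # default Ollama endpoint
        interpreter.offline = True
    else:
        # Cloud providers (OpenAI, Gemini, Claude) expect an API key in the
        # environment and a provider-prefixed model string.
        interpreter.llm.model = model                        # e.g. "gpt-4o"
        interpreter.offline = False
```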
- ✅ Cross-Platform: Works on Windows, macOS, and Linux
- ✅ Local AI Support: Automatic detection of Ollama models
- ✅ Multiple AI Providers: OpenAI, Google Gemini, and Anthropic Claude support
- ✅ Clean Interface: Simple input/output design
- ✅ Keyboard Shortcuts: Ctrl+Enter (Cmd+Enter on Mac) to execute
- ✅ Error Handling: Clear error messages and loading states
- ⚠️ Performance: Currently runs slower than optimal (first version)
Prerequisites:

- Node.js (v16 or higher)
- Rust (1.77.2 or newer); run `rustup update` to get the latest
- Python 3 (3.8 or higher)
- Tauri CLI: `npm install -g @tauri-apps/cli`
Installation:

1. Clone the repository and install dependencies:

   ```bash
   git clone [your-repo-url]
   cd odk-shell
   npm install
   ```

2. Install Open Interpreter:

   ```bash
   pip install open-interpreter
   ```

3. Verify Python is available:

   ```bash
   python --version  # or python3 --version
   ```

4. Update Rust if needed:

   ```bash
   rustup update
   rustc --version  # should be 1.77.2 or newer
   ```

5. Run the development server:

   ```bash
   npm run tauri dev
   ```
Usage:

1. Launch the application:

   ```bash
   npm run tauri dev
   ```

2. Select an AI provider:
   - Click the "Choose AI" dropdown
   - Select from the available options (Ollama models recommended for local use)

3. Enter your command. Type natural-language requests like:
   - "List all files in my Documents folder"
   - "Create a backup of my project files"
   - "Find all Python files and show their sizes"

4. Execute:
   - Click the "Run" button, or
   - Press `Ctrl+Enter` (Windows/Linux) or `Cmd+Enter` (Mac)
"Show me the current directory contents"
"Create a new folder called 'test-project'"
"List all .txt files in the current directory"
"What is the current date and time?"
The current version will show:
- The command you entered
- AI model being used
- Results of the execution
- Any error messages if something goes wrong
This is an early version with several limitations:
- Performance: Runs slower than optimal due to Python process spawning (one possible fix is sketched after this list)
- Error Handling: Basic error reporting (improvements planned)
- Security: Uses `auto_run=True` for demonstration (not production-ready)
- Features: Limited to basic command execution
- Documentation: Minimal documentation (expanding)
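One plausible direction for the performance item above (an assumption on our part, not how the current version works) is to keep a single long-lived Python process and stream requests over stdin/stdout, instead of spawning a fresh interpreter for every command:

```python
# Sketch of a persistent runner loop (illustrative, not the current runner.py):
# the Rust backend would write one JSON request per line and read one JSON
# response per line, avoiding a new Python process per command.
import json
import sys

from interpreter import interpreter

for line in sys.stdin:
    request = json.loads(line)
    # display=False suppresses terminal rendering; chat() returns the messages.
    messages = interpreter.chat(request["prompt"], display=False)
    print(json.dumps({"messages": messages}), flush=True)
```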
Project structure:

```
odk-shell/
├── src/                 # Frontend files
│   ├── index.html       # Main UI structure
│   ├── styles.css       # Modern styling
│   └── main.js          # Frontend logic
├── src-tauri/           # Rust backend
│   └── src/
│       ├── main.rs      # Entry point
│       └── lib.rs       # Commands & logic
├── runner.py            # Python AI engine
├── package.json         # Node.js dependencies
└── README.md            # This file
```
To build a production release:

```bash
npm run tauri build
```
Completed:
- Basic UI and communication bridge
- Local model support (Ollama)
- Simple command execution

Planned:
- Enhanced Open Interpreter integration
- User confirmation for commands
- Better error handling and security
- Performance optimizations
- `.odk` file format for conversation history
- Enhanced security and sandboxing
- Plugin system for extensibility
- UI/UX improvements
Important: The current version is for development and testing only. It uses `auto_run=True`, which automatically executes commands without confirmation. Do not use it in production environments without implementing proper security measures:
- Command confirmation dialogs
- Sandboxed execution environment
- User permission management
- Input validation and filtering
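The first two items can start from behavior Open Interpreter already ships: with `auto_run` disabled, it pauses and asks for approval before executing each generated code block. A minimal sketch, where the `safe_chat` wrapper and its deny-list are hypothetical illustrations of input filtering, not ODK code:

```python
# interpreter.auto_run is real Open Interpreter configuration; safe_chat and
# BLOCKED are hypothetical examples of input filtering.
from interpreter import interpreter

interpreter.auto_run = False  # pause for user approval before running any code

BLOCKED = ("rm -rf", "mkfs", "format c:")  # crude deny-list, for illustration only

def safe_chat(prompt: str):
    """Reject obviously destructive input before it reaches the model."""
    if any(pattern in prompt.lower() for pattern in BLOCKED):
        raise ValueError("Refusing potentially destructive command")
    return interpreter.chat(prompt, display=False)
```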
ODK builds on:
- Open Interpreter: The core AI engine that powers ODK
- Tauri: Cross-platform application framework
- Rust: Backend systems programming
- Ollama: Local AI model management (optional)
This is an experimental project. Contributions welcome for:
- Performance improvements
- Security enhancements
- Better error handling
- Cross-platform compatibility
- Documentation improvements
Built with ❤️ using Tauri, Rust, Open Interpreter, and modern web technologies.
Note: This project prioritizes local AI models for privacy and offline functionality. While it supports cloud-based AI providers, the focus is on running small, efficient models locally on your machine.