Commit 54f47a8

mo khan <mo@mokhan.ca>
2025-10-16 15:50:55
chore: prepare 0.2.0 release
tag: v0.2.0
1 parent bd09dda
Changed files (3)
lib/elelem/version.rb
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module Elelem
-  VERSION = "0.1.3"
+  VERSION = "0.2.0"
 end
CHANGELOG.md
@@ -1,5 +1,29 @@
 ## [Unreleased]
 
+## [0.2.0] - 2025-10-15
+
+### Added
+- New `llm-ollama` executable - minimal coding agent with streaming support for Ollama
+- New `llm-openai` executable - minimal coding agent for OpenAI/compatible APIs
+- Memory feature for persistent context storage and retrieval
+- Web fetch tool for retrieving and analyzing web content
+- Streaming responses with real-time token display
+- Visual "thinking" progress indicators with dots during reasoning phase
+
+### Changed
+- **BREAKING**: Migrated from custom Net::HTTP implementation to `net-llm` gem
+- API client now uses `Net::Llm::Ollama` for better reliability and maintainability
+- Removed direct dependencies on `net-http` and `uri` (now transitive through net-llm)
+- Mapped Ollama's `thinking` field to the internal `reasoning` field
+- Mapped Ollama's `done_reason` to the internal `finish_reason`
+- Improved system prompt for better agent behavior
+- Enhanced error handling and logging
+
+### Fixed
+- Response processing for Ollama's native message format
+- Tool argument parsing to handle both string and object formats
+- Safe navigation operator usage to prevent nil errors
+
 ## [0.1.2] - 2025-08-14
 
 ### Fixed
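
The field-mapping and tool-parsing entries in the changelog above amount to a small response-normalization step. A minimal sketch of that idea in Ruby, assuming Ollama's chunk shape (`message`, `thinking`, `done_reason`, `tool_calls`); the method names `normalize_chunk` and `normalize_tool_call` are illustrative, not taken from elelem or net-llm:

```ruby
require "json"

# Map Ollama's native fields onto the internal names the changelog describes:
# `thinking` -> `reasoning`, `done_reason` -> `finish_reason`.
def normalize_chunk(chunk)
  message = chunk["message"] || {}
  {
    content: message["content"],
    reasoning: message["thinking"],
    tool_calls: Array(message["tool_calls"]).map { |call| normalize_tool_call(call) },
    finish_reason: chunk["done_reason"]
  }
end

# Tool arguments may arrive as a JSON string or an already-parsed object;
# safe navigation keeps a missing "function" block from raising on nil.
def normalize_tool_call(call)
  function = call["function"]
  arguments = function&.dig("arguments")
  arguments = JSON.parse(arguments) if arguments.is_a?(String)
  { name: function&.dig("name"), arguments: arguments || {} }
end
```

Under that assumption, a streaming loop would call `normalize_chunk` on each parsed JSON line of Ollama's response before handing it to the rest of the agent.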
Gemfile.lock
@@ -1,7 +1,7 @@
 PATH
   remote: .
   specs:
-    elelem (0.1.3)
+    elelem (0.2.0)
       cli-ui
       erb
       json
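
The streaming and "thinking" indicator items in the Added section describe a display loop. A rough sketch under the same normalized-chunk assumption as above; the `chunks` enumerable and its keys are assumptions here, not the gem's actual interface:

```ruby
# Print a dot per reasoning chunk, then stream content tokens as they arrive.
def render_stream(chunks)
  thinking = false
  chunks.each do |chunk|
    if chunk[:reasoning] && !chunk[:reasoning].empty?
      print "."              # visual "thinking" progress indicator
      thinking = true
    elsif chunk[:content] && !chunk[:content].empty?
      puts if thinking       # finish the dot line before real output
      thinking = false
      print chunk[:content]  # real-time token display
    end
  end
  puts
end
```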