
Anant3008/SysSage


SysSage — Local AI System Assistant (Windows)

SysSage is a privacy-first, on-device assistant for Windows that uses local LLM adapters and OS tooling to answer questions about your system, run diagnostics, automate workflows, and assist with file/process/hardware queries — all while keeping data on your machine.


Why SysSage

  • System troubleshooting, inventory, and personal assistance often require multiple tools, admin rights, or cloud services that expose sensitive data.
  • Developers, power users, and administrators need fast, contextual, and private help about processes, running services, hardware, and files without sending data to the cloud.

Proposed solution

SysSage provides a modular agent framework and a Streamlit UI that connects a suite of OS-level tools (process, file, hardware, browser artifacts) with local LLMs (via llm/ollama_wrapper.py or other adapters). It focuses on local-first intelligence, scripted automation, and extensibility.

Key features

  • Privacy-first local LLM integration — use Ollama (or swap in another local/secure model) so prompts and context can stay on-device.
  • System inspection tools — process, hardware, file, and browser helpers under tools/ (e.g., psutil, pywin32).
  • Agent-based automation — small agents in agents/ can monitor, execute, and report tasks (executor, monitor).
  • Crew runner / orchestration — crew_runner.py coordinates multiple agents for multi-step workflows.
  • Simple Streamlit UI — ui/streamlit_ui.py plus an intent dispatcher (ui/intent_dispatcher.py) for quick interactions and saved histories.
  • Scriptable experiments — assistant.py and simple_approach.py provide examples and starting points for new features.

Quickstart (Windows / PowerShell)

  1. Create and activate a virtual environment:
python -m venv .venv
.\.venv\Scripts\Activate.ps1
  2. Install dependencies:
pip install -r requirements.txt
  3. (Optional) Configure local LLMs — if you plan to use Ollama or another local LLM, install and configure it separately. See llm/ollama_wrapper.py for integration points.
  4. Run the Streamlit UI:
streamlit run .\ui\streamlit_ui.py
  5. Run the agent crew example:
python .\crew_runner.py

Project layout

  • agents/ — agent implementations (e.g., executor_agent.py, monitor_agent.py).
  • llm/ — local LLM adapter(s) and wrappers.
  • tasks/ — task orchestration helpers.
  • tools/ — OS and utility tooling (process, file, hardware, browsing helpers).
  • ui/ — Streamlit UI, intent dispatcher, and front-end wiring.
  • crew_runner.py — example orchestrator that launches agent crews.
  • assistant.py, simple_approach.py — experimental entrypoints and scripts.
  • requirements.txt — pinned Python dependencies.

Architecture (high level)

  1. UI (Streamlit)

    • Collects intents and displays agent outputs. Sends structured requests to the intent dispatcher.
  2. Intent Dispatcher (ui/intent_dispatcher.py)

    • Maps UI intents to agent/task workflows and routes them to the crew runner or direct tools.
  3. Agents & Crew Runner

    • Agents encapsulate capabilities (monitoring, execution, file analysis). crew_runner.py composes agents to run multi-step flows.
  4. Tools

    • Deterministic helpers that perform system actions or data retrieval (psutil wrappers, process scanning, file search, browser artifact parsing).
  5. LLM Adapter

    • llm/ollama_wrapper.py shows how to place contextual prompts and combine deterministic outputs with model reasoning while retaining privacy.
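The dispatcher step in the flow above can be sketched as a simple registry that maps intent names to handler callables. This is an illustrative pattern only — the class and method names here are assumptions, not the actual API of ui/intent_dispatcher.py:

```python
# Minimal sketch of an intent-dispatcher pattern: intents map to handlers.
# Names are hypothetical illustrations, not the real SysSage interfaces.

from typing import Callable, Dict


class IntentDispatcher:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], str]] = {}

    def register(self, intent: str, handler: Callable[[dict], str]) -> None:
        # Each handler is a deterministic tool or an agent entry point.
        self._handlers[intent] = handler

    def dispatch(self, intent: str, payload: dict) -> str:
        handler = self._handlers.get(intent)
        if handler is None:
            return f"unknown intent: {intent}"
        return handler(payload)


# Example: route a "list_processes" intent to a (stubbed) deterministic tool.
dispatcher = IntentDispatcher()
dispatcher.register(
    "list_processes",
    lambda p: f"scanning top {p.get('limit', 10)} processes",
)
print(dispatcher.dispatch("list_processes", {"limit": 5}))
```

In the real system the registered handlers would call into tools/ or hand off to crew_runner.py; the registry shape is what matters.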

Security & Privacy

  • Local-first operation: LLM prompts and context can stay local when using on-device models.
  • Minimal telemetry: default behavior is to avoid external logging of sensitive outputs; add explicit opt-in if remote logging is required.
  • Elevated actions: some tools require admin rights — the UI will surface warnings and request appropriate permissions.

Extending SysSage

  • Add a new tool: create a helper in tools/, write a small wrapper, and register it with an agent.
  • Add a new agent: follow the pattern in agents/ and expose a simple interface (start/stop/handle_intent).
  • Swap LLM provider: implement a new adapter under llm/ that matches the wrapper interface.
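The three extension points above can be sketched together. The interface shape (start/stop/handle_intent, a register_tool helper) follows the pattern described in this README, but the concrete names and signatures are assumptions, not the repository's actual code:

```python
# Hypothetical sketch of the tool + agent extension pattern described above.
# Interface names are illustrative, not the actual SysSage API.

import shutil
from typing import Callable, Dict


def disk_usage_tool(payload: dict) -> dict:
    """A deterministic tool (stdlib only): report disk space for a path."""
    usage = shutil.disk_usage(payload.get("path", "."))
    return {"free_bytes": usage.free, "total_bytes": usage.total}


class SimpleAgent:
    """Follows the start/stop/handle_intent shape suggested above."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[[dict], dict]] = {}
        self.running = False

    def register_tool(self, name: str, tool: Callable[[dict], dict]) -> None:
        self._tools[name] = tool

    def start(self) -> None:
        self.running = True

    def stop(self) -> None:
        self.running = False

    def handle_intent(self, name: str, payload: dict) -> dict:
        if not self.running:
            raise RuntimeError("agent not started")
        return self._tools[name](payload)


agent = SimpleAgent()
agent.register_tool("disk_usage", disk_usage_tool)
agent.start()
report = agent.handle_intent("disk_usage", {"path": "."})
print(report["total_bytes"] > 0)
```

A new LLM adapter would slot in the same way: implement the wrapper interface under llm/ and hand it to agents that need model reasoning.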

Development notes

  • Recommended Python: 3.10+.
  • Tests: add unit tests for tool logic (avoid calling LLMs in unit tests; mock adapters).
  • CI: small lint and test step (GitHub Actions recommended).
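The "mock adapters" advice can look like this in practice. The adapter's generate method is a hypothetical stand-in for whatever llm/ollama_wrapper.py actually exposes — the point is that tool logic is tested offline and deterministically, with no model call:

```python
# Sketch of unit-testing tool logic with a mocked LLM adapter (stdlib only).
# The `generate` method name is an assumed interface, not the real wrapper API.

from unittest.mock import Mock


def summarize_processes(process_names, adapter):
    """Tool logic under test: builds a prompt and delegates to the adapter."""
    prompt = "Summarize these processes: " + ", ".join(sorted(process_names))
    return adapter.generate(prompt)


# In a real test the adapter would be the wrapper from llm/; here it is
# mocked so the test never touches a model server.
mock_adapter = Mock()
mock_adapter.generate.return_value = "3 processes running"

result = summarize_processes(
    ["python.exe", "explorer.exe", "svchost.exe"], mock_adapter
)
assert result == "3 processes running"
mock_adapter.generate.assert_called_once()
print("mock adapter test passed")
```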

Troubleshooting & common notes

  • Streamlit launching issues: ensure your venv is active and streamlit is installed.
  • Missing OS-level permissions: run PowerShell as Administrator for tasks that require system-level access.
  • LLM adapter errors: confirm the local model server is running and the wrapper configuration matches your installation.

Contributing & roadmap

  • Open issues to suggest features or report bugs.
  • Short-term roadmap: improve agent orchestration, add richer browser artifact parsing, and include more robust offline prompt templates.

License

  • Add a LICENSE to the repository if you plan to open-source. MIT or Apache-2.0 are common choices.

Contact

  • Use the repository issue tracker for collaboration and questions.

