Aresto OS

The first AI-native operating system.

The first true agentic operating system, where the execution layer isn't applications but agents. The most secure, most private, most AI-optimized OS ever built.

What “AI-native” actually means.

Most operating systems treat AI as an add-on. A sidebar. An assistant that floats above the real work. Aresto OS takes a fundamentally different approach: agents are the execution layer. The Aresto Daemon runs as a system service — always available, always loaded, always ready. Agents reason, plan, and act autonomously. The shell understands natural language natively. Software doesn't wait for you to click — it works for you.

This isn't about adding a chatbot to Linux. It's about replacing the application model itself. Every layer is hardened for security, optimized for AI workloads, and designed so your data never leaves your machine.

Core components.

Aresto Daemon

The AI engine

A Rust-based system service that manages AI inference. It loads and runs local LLMs (Nemotron Nano 4B by default), routes requests between local and cloud models, manages conversation state, and exposes a Unix socket API that every application can talk to.

Local / Cloud / Auto modes
Model hot-swapping
TensorRT optimization
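The daemon's socket protocol isn't specified above, so the socket path, wire format, and field names in the sketch below are purely illustrative assumptions. It shows one request/response round trip over a Unix socket, with a stand-in thread playing the daemon's role so the example is self-contained:

```rust
use std::io::{BufRead, BufReader, Write};
use std::os::unix::net::{UnixListener, UnixStream};
use std::thread;

/// One request/response round trip over a Unix socket. The listener
/// thread stands in for the Aresto Daemon; the real daemon's path
/// and wire format are not documented here.
fn roundtrip(path: &str, request: &str) -> std::io::Result<String> {
    let _ = std::fs::remove_file(path);
    let listener = UnixListener::bind(path)?;

    // Stand-in "daemon": acknowledges one newline-delimited request.
    let server = thread::spawn(move || -> std::io::Result<()> {
        let (mut stream, _) = listener.accept()?;
        let mut line = String::new();
        BufReader::new(stream.try_clone()?).read_line(&mut line)?;
        // A real daemon would run inference here; we just wrap and echo.
        writeln!(stream, "{{\"status\":\"ok\",\"echo\":{}}}", line.trim())
    });

    // Client side: what the shell or an agent runtime would do.
    let mut client = UnixStream::connect(path)?;
    writeln!(client, "{}", request)?;
    let mut reply = String::new();
    BufReader::new(client).read_line(&mut reply)?;
    server.join().unwrap()?;
    Ok(reply.trim().to_string())
}

fn main() -> std::io::Result<()> {
    let reply = roundtrip("/tmp/aresto-demo.sock", "{\"prompt\":\"list my files\"}")?;
    println!("{}", reply);
    Ok(())
}
```

A real client would hold the connection open and stream tokens as they arrive; the blocking, newline-delimited exchange here only illustrates the transport.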

Aresto Shell

The AI terminal

An AI-native terminal emulator that understands both bash commands and natural language. It renders a GPU-accelerated status bar, connects to the daemon over a Unix socket, and streams AI responses in real time.

Command detection
Streaming AI responses
Built-in system commands
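How the shell actually classifies input isn't described above; the rules and the command allowlist below are illustrative assumptions, sketching a plausible first-pass heuristic for deciding whether a line is a shell command or natural language:

```rust
/// Rough heuristic: does this input look like a shell command rather
/// than natural language? (Illustrative rules only; the real shell's
/// classifier is not documented here.)
fn looks_like_command(input: &str) -> bool {
    let trimmed = input.trim();
    let first = match trimmed.split_whitespace().next() {
        Some(word) => word,
        None => return false,
    };
    // Paths and explicit invocations are commands.
    if first.starts_with('/') || first.starts_with("./") || first.starts_with("~/") {
        return true;
    }
    // Shell syntax is a strong signal.
    if trimmed.contains('|') || trimmed.contains("&&") || trimmed.contains('$') {
        return true;
    }
    // A small allowlist of common binaries and builtins (illustrative).
    const KNOWN: &[&str] = &[
        "ls", "cd", "cat", "grep", "git", "cargo", "make", "ssh", "sudo",
    ];
    KNOWN.contains(&first)
}

fn main() {
    for input in ["ls -la", "git status", "what is eating my disk space?"] {
        println!("{:?} -> command: {}", input, looks_like_command(input));
    }
}
```

In practice a shell would likely combine a cheap check like this with PATH lookup, and fall back to the daemon for ambiguous input.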

.aresto Agents

The agentic execution layer

The fundamental unit of software on Aresto OS isn't an application — it's an agent. A .aresto manifest declares an agent's identity, tools, permissions, and scope. The daemon orchestrates execution. Agents reason, plan, and act autonomously on your behalf. This is what software looks like when AI is the operating system.

Coming soon
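The .aresto format itself is still unreleased, so the schema below is only a guess at what declaring "identity, tools, permissions, and scope" might look like in a manifest; every field name is hypothetical:

```toml
# Hypothetical .aresto manifest; all field names are illustrative only.
[agent]
name = "inbox-triage"
version = "0.1.0"
description = "Sorts and summarizes incoming mail"

[tools]
# Explicitly declared capabilities; nothing is granted by default.
allowed = ["fs.read", "fs.write", "notify"]

[permissions]
paths = ["~/Mail"]   # file access limited to this scope
network = false      # all inference stays on-device

[scope]
triggers = ["on_new_mail"]
```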

The full stack.

From silicon to software, every layer is purpose-built.

Cloud Layer

Cloud inference fallback, OTA updates, app store

Application Layer

Autonomous agents, .aresto agents, native Linux apps

Aresto Layer

Aresto Daemon + Aresto Shell + .aresto Agent Runtime


Desktop Layer

Custom Wayland compositor (smithay + iced/wgpu)

OS Layer

Linux kernel + JetPack (CUDA, TensorRT, drivers)

Hardware Layer

NVIDIA Jetson Orin Nano Super (GPU, CPU, 8GB unified RAM)

Security is the architecture.

Aresto OS is built from the ground up to be the most secure, most private operating system for AI. Every agent runs in a sandboxed environment with explicit permissions. Your data never leaves your machine unless you say so.

Sandboxed Agent Execution

Every .aresto agent runs in an isolated sandbox with explicit permission grants. Agents can only access the tools, files, and APIs you authorize. No ambient authority, no silent data access.
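What "explicit permission grants, no ambient authority" could look like at the API level is sketched below; the types and method names are illustrative assumptions, not the actual runtime interface:

```rust
use std::path::Path;

/// Illustrative permission set for a sandboxed agent. The real .aresto
/// grant model is not documented here; this sketches the deny-by-default,
/// explicit-allow idea described above.
struct AgentGrants {
    allowed_paths: Vec<String>,
    network: bool,
}

impl AgentGrants {
    /// Deny by default: a file is readable only if it sits under an
    /// explicitly granted path prefix (compared component-wise).
    fn may_read(&self, path: &str) -> bool {
        self.allowed_paths
            .iter()
            .any(|prefix| Path::new(path).starts_with(prefix))
    }

    fn may_use_network(&self) -> bool {
        self.network
    }
}

fn main() {
    let grants = AgentGrants {
        allowed_paths: vec!["/home/user/notes".to_string()],
        network: false, // network access is opt-in
    };
    println!("{}", grants.may_read("/home/user/notes/todo.md")); // true
    println!("{}", grants.may_read("/etc/shadow")); // false
    println!("{}", grants.may_use_network()); // false
}
```

The key property is that an unlisted resource yields a denial rather than an error, so an agent holds no authority it wasn't granted.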

Zero Data Exfiltration

All AI inference runs locally on your hardware. Your prompts, documents, and conversations are processed on-device. Network access is opt-in and auditable. Privacy isn't a policy — it's physics.

Hardware-Rooted Trust

Secure boot chain from hardware to agent runtime. Encrypted storage by default. TensorRT-optimized inference that never trades security for speed. Built in Rust for memory safety.