CLI Tools · May 13, 2026 · 3 min read

apfel — On-Device OpenAI-Compatible AI for macOS

apfel exposes Apple Foundation Models as a CLI and local OpenAI-compatible server (no API keys), with streaming, files, and tool calling on macOS 26+.

Agent ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, install contract, metadata JSON, adapter-aware plan, and raw content links so agents can judge fit, risk, and next actions.

Native · 94/100 · Policy: allow
Agent surface: Any MCP/CLI agent
Kind: CLI
Install: Homebrew
Trust: Established
Entrypoint: brew install apfel
Universal CLI install command: npx tokrepo install ba18be09-e80b-5803-989c-e8ffdd99386d
Intro

apfel exposes Apple Foundation Models as a UNIX CLI and a local OpenAI-compatible server so you can run prompts 100% on-device with no API keys. It supports tool calling and a 4096-token context, and is GitHub-verified at 5,329★.

Best for: Mac developers who want a local OpenAI-compatible backend for scripts, SDKs, and agent tools (no cloud keys)

Works with: macOS 26+ Apple Silicon; OpenAI SDKs via base_url; shell pipelines; file-attached prompts

Setup time: 10–20 minutes

Key facts (verified)

  • README shows apfel --serve as a local OpenAI-compatible server at http://localhost:11434/v1.
  • Requirements call out macOS 26 Tahoe+ on Apple Silicon (M1+) with Apple Intelligence enabled.
  • README badge lists version 1.3.3 and context size 4096 tokens.
  • GitHub: 5,329 stars · 205 forks; pushed 2026-05-12 (GitHub API verified).

Main

Use apfel in two production-friendly ways:

  1. CLI for scripts: pipe text in/out, attach files with -f, and request JSON output for automation.
  2. Local server for SDKs/agents: point any OpenAI-compatible client at http://localhost:11434/v1 and keep prompts on-device.
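The server option above can be exercised with nothing but the Python standard library. This is a minimal sketch, assuming `apfel --serve` is running on the default `http://localhost:11434/v1` endpoint from the README; the `"apfel"` model name in the payload is an assumption (check the server's `/v1/models` for the real identifier). No API key header is needed since everything stays on-device.

```python
import json
import urllib.request
from urllib.error import URLError

BASE_URL = "http://localhost:11434/v1"  # default endpoint per the apfel README


def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the local server."""
    payload = {
        "model": "apfel",  # hypothetical model id; query /v1/models to confirm
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},  # no API key required
        method="POST",
    )


req = build_chat_request("Say hello in one word.")
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.loads(resp.read())
        print(body["choices"][0]["message"]["content"])
except URLError:
    # Server not running (or macOS < 26): start `apfel --serve` first.
    print("apfel --serve is not reachable on localhost:11434")
```

Any OpenAI SDK can be pointed at the same endpoint via its `base_url` option, which is the drop-in path the listing describes.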

If you maintain agent tooling, wire up one integration first (CLI or server), then add a smoke test that runs a tiny prompt so failures surface immediately after OS updates.
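The smoke-test idea above can be sketched in a few lines. This is a hypothetical sketch, not part of apfel itself: it shells out to the `apfel` binary with a tiny prompt, relies on the non-zero exit codes the README advertises for automation, and skips cleanly on machines where the tool is not installed.

```python
import shutil
import subprocess


def apfel_smoke_test(prompt: str = "Reply with the single word OK.") -> str:
    """Run a tiny prompt through the apfel CLI; return its reply or a skip note."""
    if shutil.which("apfel") is None:  # e.g. CI runners without macOS 26+
        return "SKIP: apfel not installed"
    result = subprocess.run(
        ["apfel", prompt], capture_output=True, text=True, timeout=60
    )
    if result.returncode != 0:  # README notes apfel sets exit codes for automation
        raise RuntimeError(f"apfel failed: {result.stderr.strip()}")
    return result.stdout.strip()


print(apfel_smoke_test())
```

Running this in CI (or a post-update launchd job) turns a silent model regression after an OS update into an obvious red build.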

README excerpt (verbatim)

apfel

The free AI already on your Mac.

Version 1.3.3 · Swift 6.3+ · macOS 26 Tahoe+ · No Xcode Required · License: MIT · 100% On-Device · Website · #agentswelcome

Apple Silicon Macs ship a built-in LLM via Apple FoundationModels. apfel exposes it as a UNIX tool and a local OpenAI-compatible server. 100% on-device. No API keys, no cloud.

Mode · Command · What you get:

  • UNIX tool: apfel "prompt" or echo "text" | apfel → pipe-friendly answers, file attachments, JSON output, exit codes
  • OpenAI-compatible server: apfel --serve → drop-in local http://localhost:11434/v1 backend for OpenAI SDKs

apfel --chat: interactive REPL.

Tool calling works in all contexts. 4096-token context.

apfel CLI

Requirements & Install

macOS 26 Tahoe+, Apple Silicon (M1+), Apple Intelligence enabled.

brew install apfel

Update:

brew upgrade apfel

Build from source (Command Line Tools with macOS 26.4 SDK / Swift 6.3, no Xcode):

git clone https://github.com/Arthur-Ficial/apfel.git && cd apfel && make install

Nix, same-day tap, Mint, mise, troubleshooting: docs/install.md.


Source & Thanks

Source: https://github.com/Arthur-Ficial/apfel · License: MIT · GitHub stars: 5,329 · forks: 205
