CLI Tools · May 13, 2026 · 3 min read

apfel — On-Device OpenAI-Compatible AI for macOS

apfel exposes Apple Foundation Models as a CLI and local OpenAI-compatible server (no API keys), with streaming, files, and tool calling on macOS 26+.

Agent-ready

This asset can be read and installed directly by agents.

TokRepo exposes a universal CLI command, an install contract, JSON metadata, an adapter-specific plan, and the raw content to help agents judge fit, risk, and next actions.

Native · 94/100 · Policy: allow
Agent surface: any MCP/CLI agent
Type: CLI
Install: brew
Trust: Established
Entry point: brew install apfel
Universal CLI command: npx tokrepo install ba18be09-e80b-5803-989c-e8ffdd99386d
Introduction

apfel exposes Apple Foundation Models as a UNIX CLI and a local OpenAI-compatible server, so you can run prompts 100% on-device with no API keys. It supports tool calling with a 4096-token context; the repository stats (5,329 stars) were verified via the GitHub API.

Best for: Mac developers who want a local OpenAI-compatible backend for scripts, SDKs, and agent tools (no cloud keys)

Works with: macOS 26+ Apple Silicon; OpenAI SDKs via base_url; shell pipelines; file-attached prompts

Setup time: 10–20 minutes

Key facts (verified)

  • README shows apfel --serve as a local OpenAI-compatible server at http://localhost:11434/v1.
  • Requirements call out macOS 26 Tahoe+ on Apple Silicon (M1+) with Apple Intelligence enabled.
  • README badge lists version 1.3.3 and context size 4096 tokens.
  • GitHub: 5,329 stars · 205 forks; pushed 2026-05-12 (GitHub API verified).

Main

Use apfel in two production-friendly ways:

  1. CLI for scripts: pipe text in/out, attach files with -f, and request JSON output for automation.
  2. Local server for SDKs/agents: point any OpenAI-compatible client at http://localhost:11434/v1 and keep prompts on-device.
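For the server mode, here is a minimal sketch using only the Python standard library (assumptions: the server was started with `apfel --serve` and listens at the documented `http://localhost:11434/v1`; the `"default"` model id is a placeholder, since the README does not name a required model id):

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # documented apfel --serve endpoint

def build_payload(prompt: str) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": "default",  # placeholder id; README does not specify one
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(prompt: str) -> str:
    """POST a prompt to the local apfel server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},  # no API key needed
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Any OpenAI SDK can be pointed at the same endpoint via its `base_url` setting instead of hand-rolling the request as above.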

If you maintain agent tooling, start by wiring one integration (CLI or server) and add a smoke test that calls a tiny prompt so failures are obvious after OS updates.
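The smoke-test advice above can be sketched as a small Python helper (assumptions: `apfel` is on PATH after `brew install apfel`; the prompt and timeout are arbitrary choices):

```python
import subprocess

def apfel_smoke_test(prompt: str = "Reply with the single word: pong") -> bool:
    """Run a tiny prompt through the apfel CLI and report success.

    Returns False instead of raising so a CI step can surface the
    failure clearly, e.g. after a macOS update breaks the toolchain.
    """
    try:
        result = subprocess.run(
            ["apfel", prompt],
            capture_output=True, text=True, timeout=60,
        )
    except (FileNotFoundError, subprocess.TimeoutExpired):
        return False  # binary missing or model hung
    # Non-zero exit codes signal failure per the README's UNIX-tool contract.
    return result.returncode == 0 and result.stdout.strip() != ""
```

Wire this into whatever scheduler runs your agent tooling and alert on a `False` result.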

README excerpt (verbatim)

apfel

The free AI already on your Mac.

Version 1.3.3 Swift 6.3+ macOS 26 Tahoe+ No Xcode Required License: MIT 100% On-Device Website #agentswelcome

Apple Silicon Macs ship a built-in LLM via Apple FoundationModels. apfel exposes it as a UNIX tool and a local OpenAI-compatible server. 100% on-device. No API keys, no cloud.

Mode: UNIX tool
  Command: apfel "prompt" / echo "text" | apfel
  What you get: Pipe-friendly answers, file attachments, JSON output, exit codes
Mode: OpenAI-compatible server
  Command: apfel --serve
  What you get: Drop-in local http://localhost:11434/v1 backend for OpenAI SDKs

apfel --chat: interactive REPL.

Tool calling works in all contexts. 4096-token context.

apfel CLI

Requirements & Install

macOS 26 Tahoe+, Apple Silicon (M1+), Apple Intelligence enabled.

brew install apfel

Update:

brew upgrade apfel

Build from source (Command Line Tools with macOS 26.4 SDK / Swift 6.3, no Xcode):

git clone https://github.com/Arthur-Ficial/apfel.git && cd apfel && make install

Nix, same-day tap, Mint, mise, troubleshooting: docs/install.md.


Source and acknowledgments

Source: https://github.com/Arthur-Ficial/apfel · License: MIT · GitHub stars: 5,329 · forks: 205
