# apfel — On-Device OpenAI-Compatible AI for macOS

> apfel exposes Apple Foundation Models as a CLI and local OpenAI-compatible server (no API keys), with streaming, files, and tool calling on macOS 26+.

## Install

```bash
brew install apfel
```

## Quick Use

```bash
# CLI mode
apfel "Summarize this diff in 5 bullets" <(git diff)

# Local OpenAI-compatible server
apfel --serve
```

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")
resp = client.chat.completions.create(
    model="apple-foundationmodel",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```

## Intro

apfel exposes Apple Foundation Models as a UNIX CLI and a local OpenAI-compatible server, so you can run prompts 100% on-device with no API keys. It supports tool calling and a 4096-token context; the GitHub repository has 5,329 stars.

**Best for:** Mac developers who want a local OpenAI-compatible backend for scripts, SDKs, and agent tools (no cloud keys)
**Works with:** macOS 26+ Apple Silicon; OpenAI SDKs via base_url; shell pipelines; file-attached prompts
**Setup time:** 10–20 minutes

### Key facts (verified)

- README shows `apfel --serve` as a local OpenAI-compatible server at `http://localhost:11434/v1`.
- Requirements call out macOS 26 Tahoe+ on Apple Silicon (M1+) with Apple Intelligence enabled.
- README badge lists version 1.3.3 and a 4096-token context.
- GitHub: 5,329 stars · 205 forks; pushed 2026-05-12 (GitHub API verified).

## Main

Use apfel in two production-friendly ways:

1. **CLI for scripts**: pipe text in and out, attach files with `-f`, and request JSON output for automation.
2. **Local server for SDKs/agents**: point any OpenAI-compatible client at `http://localhost:11434/v1` and keep prompts on-device.

If you maintain agent tooling, start by wiring one integration (CLI or server), then add a smoke test that sends a tiny prompt so failures are obvious after OS updates.
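The smoke-test advice above can be sketched as a small client-side check. This is a minimal sketch: the endpoint and model name follow the README's `apfel --serve` defaults, but the helper names (`build_chat_payload`, `extract_reply`, `smoke_test`) are illustrative, not part of apfel.

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # apfel --serve default per the README


def build_chat_payload(prompt, model="apple-foundationmodel"):
    """Build a minimal OpenAI-style chat completion payload."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def extract_reply(response):
    """Pull the assistant text out of an OpenAI-style chat response dict."""
    return response["choices"][0]["message"]["content"]


def smoke_test(prompt="Reply with the single word OK."):
    """Send a tiny prompt to the local server; raises if anything is off."""
    req = urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer unused",  # no real key needed on-device
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    reply = extract_reply(body)
    assert reply.strip(), "empty completion from apfel"
    return reply
```

Run `smoke_test()` from CI or a login hook; a non-zero exit after a macOS update tells you the Foundation Models backend needs attention before your agents do.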
### README excerpt (verbatim)

# apfel

### The free AI already on your Mac.

[![Version 1.3.3](https://img.shields.io/badge/version-1.3.3-blue)](https://github.com/Arthur-Ficial/apfel)
[![Swift 6.3+](https://img.shields.io/badge/Swift-6.3%2B-F05138?logo=swift&logoColor=white)](https://swift.org)
[![macOS 26 Tahoe+](https://img.shields.io/badge/macOS-26%20Tahoe%2B-000000?logo=apple&logoColor=white)](https://developer.apple.com/macos/)
[![No Xcode Required](https://img.shields.io/badge/Xcode-not%20required-orange)](https://developer.apple.com/xcode/resources/)
[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](LICENSE)
[![100% On-Device](https://img.shields.io/badge/inference-100%25%20on--device-green)](https://developer.apple.com/documentation/foundationmodels)
[![Website](https://img.shields.io/badge/web-apfel.franzai.com-16A34A)](https://apfel.franzai.com)
[![#agentswelcome](https://img.shields.io/badge/%23agentswelcome-PRs%20welcome-0066cc?style=for-the-badge&labelColor=0d1117&logo=probot&logoColor=white)](#contributing)

Apple Silicon Macs ship a built-in LLM via [Apple FoundationModels](https://developer.apple.com/documentation/foundationmodels). `apfel` exposes it as a UNIX tool and a local OpenAI-compatible server. 100% on-device. No API keys, no cloud.

| Mode | Command | What you get |
|------|---------|--------------|
| UNIX tool | `apfel "prompt"` / `echo "text" \| apfel` | Pipe-friendly answers, file attachments, JSON output, exit codes |
| OpenAI-compatible server | `apfel --serve` | Drop-in local `http://localhost:11434/v1` backend for OpenAI SDKs |

`apfel --chat` - interactive REPL. Tool calling works in all contexts. 4096-token context.

![apfel CLI](screenshots/cli.png)

## Requirements & Install

macOS 26 Tahoe+, Apple Silicon (M1+), [Apple Intelligence enabled](https://support.apple.com/en-us/121115).
```bash
brew install apfel
```

Update:

```bash
brew upgrade apfel
```

Build from source (Command Line Tools with macOS 26.4 SDK / Swift 6.3, no Xcode):

```bash
git clone https://github.com/Arthur-Ficial/apfel.git && cd apfel && make install
```

Nix, same-day tap, Mint, mise, troubleshooting: [docs/install.md](docs/install.md).

## Quick Start

### UNIX tool

Quote prompts containing `!` in single quotes (zsh/bash history expansion): `apfel 'Hello, Mac!'`.

```bash
# Single prompt
apfel "What is the capital of Austria?"

# Permissive mode - reduces guardrail false positives for creative/long prompts
apfel --permissive "Write a dramatic opening for a thriller novel"

# Stream output
apfel --stream "Write a haiku about code"

# Pipe input
echo "text" | apfel
```

### FAQ

**Q: Is apfel cloud-based?**
A: No; the README describes 100% on-device inference via Apple Foundation Models (no API keys).

**Q: Does it work with OpenAI SDKs?**
A: Yes; run `apfel --serve` and point the client's base_url at `http://localhost:11434/v1`.

**Q: What are the requirements?**
A: macOS 26 Tahoe+ on Apple Silicon with Apple Intelligence enabled (per README).
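The FAQ points OpenAI SDKs at the local server. If `apfel --serve` also implements the standard OpenAI streaming protocol (an assumption: the README advertises `--stream` for the CLI, and `/v1` servers commonly honor `stream=True`), collecting a streamed reply looks like the sketch below. `join_stream` is an illustrative helper, not an apfel API.

```python
def join_stream(chunks):
    """Concatenate content deltas from OpenAI-style streaming chunks.

    Each chunk is a dict shaped like the events an OpenAI-compatible
    server emits: {"choices": [{"delta": {"content": "..."}}]}.
    Role-only and empty deltas are skipped.
    """
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)


# With the openai SDK against a running `apfel --serve` (untested assumption):
#   stream = client.chat.completions.create(
#       model="apple-foundationmodel",
#       messages=[{"role": "user", "content": "Write a haiku about code"}],
#       stream=True,
#   )
#   text = "".join(c.choices[0].delta.content or "" for c in stream)
```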
## Source & Thanks

> Source: https://github.com/Arthur-Ficial/apfel
> License: MIT
> GitHub stars: 5,329 · forks: 205

---

Source: https://tokrepo.com/en/workflows/apfel-on-device-openai-compatible-ai-for-macos
Author: Script Depot