Skills · Apr 8, 2026 · 2 min read

Together AI GPU Clusters Skill for Claude Code

A skill that teaches Claude Code how to use Together AI's GPU cluster API. Provision on-demand and reserved H100, H200, and B200 GPU clusters for large-scale training and inference.

MCP Hub · Community
Quick Use

Use it first, then decide how deep to go.

The command below installs the skill; it is the first thing both you and the agent should copy and run:

npx skills add togethercomputer/skills

What is This Skill?

This skill teaches AI coding agents how to provision and manage GPU clusters on Together AI. Request on-demand or reserved clusters of H100, H200, and B200 GPUs for large-scale model training, distributed inference, and research workloads.

In short: a Together AI GPU Clusters skill for coding agents. Provision H100/H200/B200 GPU clusters on-demand or reserved for large-scale training and distributed inference. Part of the official 12-skill collection.

Best for: Teams needing GPU clusters for training or large-scale inference. Works with: Claude Code, Cursor, Codex CLI.

What the Agent Learns

Provision Cluster

from together import Together

# Requires TOGETHER_API_KEY to be set in the environment.
client = Together()

# Request an 8x H100 cluster billed hourly (no commitment).
cluster = client.clusters.create(
    name="training-cluster",
    gpu_type="h100-80gb",
    gpu_count=8,
    reservation_type="on-demand",
)
print(f"Cluster ID: {cluster.id}")

GPU Options

| GPU  | VRAM  | Interconnect | Best For          |
|------|-------|--------------|-------------------|
| H100 | 80GB  | NVLink       | Standard training |
| H200 | 141GB | NVLink       | Large models      |
| B200 | 192GB | NVLink       | Cutting-edge      |
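As a rough rule of thumb, the table above can be encoded in a small selection helper. Only `h100-80gb` appears in the provisioning example; the H200 and B200 identifiers below are assumptions and should be checked against the API before use.

```python
# Hypothetical mapping from GPU type identifier to VRAM in GB.
# "h100-80gb" matches the provisioning example above; the other two
# identifiers are assumed, not confirmed Together AI values.
GPU_VRAM_GB = {
    "h100-80gb": 80,
    "h200-141gb": 141,  # assumed identifier
    "b200-192gb": 192,  # assumed identifier
}

def pick_gpu(model_vram_gb: float) -> str:
    """Return the smallest GPU whose VRAM fits the given footprint."""
    for gpu, vram in sorted(GPU_VRAM_GB.items(), key=lambda kv: kv[1]):
        if vram >= model_vram_gb:
            return gpu
    raise ValueError(f"No single GPU fits {model_vram_gb} GB; shard the model")

print(pick_gpu(100))  # → h200-141gb (smallest GPU with >= 100 GB VRAM)
```

This only covers single-GPU fit; multi-GPU sharding across a cluster changes the math.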

Reservation Types

| Type      | Billing    | Commitment  |
|-----------|------------|-------------|
| On-demand | Per hour   | None        |
| Reserved  | Discounted | 1-12 months |

Cluster Management

# Monitor utilization
status = client.clusters.retrieve(cluster.id)
# Resize
client.clusters.update(cluster.id, gpu_count=16)
# Release
client.clusters.delete(cluster.id)
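Provisioning is typically asynchronous, so a common pattern is to poll until the cluster is usable. A minimal sketch, assuming the status strings (`"ready"` is not a documented Together AI value); `fetch_status` would wrap a call like `client.clusters.retrieve(cluster.id)`:

```python
import time

def wait_for_ready(fetch_status, timeout_s: int = 600, poll_s: int = 15) -> str:
    """Poll until the cluster reports ready or the timeout expires.

    fetch_status: zero-argument callable returning the cluster's status
    string, e.g. lambda: client.clusters.retrieve(cluster.id).status.
    The "ready" state name is an assumption for illustration.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if fetch_status() == "ready":
            return "ready"
        time.sleep(poll_s)
    raise TimeoutError("cluster did not become ready in time")
```

Decoupling the fetch callable from the loop keeps the polling logic testable without a live cluster.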

FAQ

Q: How many GPUs can I request?
A: From a single GPU up to clusters of 1000+ for large training runs. Contact Together AI for very large allocations.


Source & Thanks

Part of togethercomputer/skills — MIT licensed.
