
GPU VRAM Calculator for LLMs

Estimate the VRAM needed to run any LLM locally.


A GPU VRAM calculator estimates the video RAM required to run a large language model locally. VRAM requirements depend primarily on model size (number of parameters) and numerical precision (quantization level).

  1. VRAM (bytes) = parameters × bytes per parameter
  2. FP32: 4 bytes/param; FP16/BF16: 2 bytes; INT8: 1 byte; INT4: 0.5 bytes
  3. Add ~20% overhead for activations and KV cache
  4. Example: a 7B model at FP16 = 7B × 2 bytes = 14 GB minimum
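The steps above can be sketched in Python. This is a minimal estimate under the page's own assumptions: billions of parameters × bytes per parameter is read directly as decimal gigabytes, and the ~20% overhead factor comes from step 3. The function name `vram_gb` is illustrative, not part of the calculator.

```python
def vram_gb(params_billions: float, bytes_per_param: float, overhead: float = 0.20) -> float:
    """Estimate VRAM in GB: parameters × bytes/param, plus ~20% overhead
    for activations and KV cache (assumption from the formula above)."""
    # 1e9 params × 1 byte = 1e9 bytes ≈ 1 GB (decimal), so billions × bytes = GB
    base_gb = params_billions * bytes_per_param
    return base_gb * (1 + overhead)

# 7B at FP16 (2 bytes/param): 14 GB base, ~16.8 GB with overhead
print(round(vram_gb(7, 2), 1))    # 16.8
# 70B at INT4 (0.5 bytes/param): 35 GB base, ~42 GB with overhead,
# which is why the examples below call for 2× A100 40GB rather than one
print(round(vram_gb(70, 0.5), 1))  # 42.0
```

Note that the headline figures on this page (e.g. "7B at FP16 = 14 GB") are the base cost before the 20% overhead is applied.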
7B params at FP16 = ~14 GB VRAM minimum (RTX 3090 or better)
70B params at INT4 = ~35 GB VRAM (2× A100 40GB)
13B params at INT8 = ~13 GB VRAM (RTX 4090)
| Model Size | FP16 VRAM | INT8 VRAM | INT4 VRAM |
|------------|-----------|-----------|-----------|
| 7B         | 14 GB     | 7 GB      | 3.5 GB    |
| 13B        | 26 GB     | 13 GB     | 6.5 GB    |
| 30B        | 60 GB     | 30 GB     | 15 GB     |
| 70B        | 140 GB    | 70 GB     | 35 GB     |
| 140B       | 280 GB    | 140 GB    | 70 GB     |
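The table's base figures (no overhead) follow directly from the bytes-per-parameter values; as a sketch, each row can be reproduced like this (`PRECISIONS` and `vram_row` are illustrative names, not part of the calculator):

```python
# Bytes per parameter for each precision listed in the table
PRECISIONS = {"FP16": 2, "INT8": 1, "INT4": 0.5}

def vram_row(params_billions: float) -> dict:
    """Base VRAM in GB per precision: billions of params × bytes/param."""
    return {name: params_billions * b for name, b in PRECISIONS.items()}

print(vram_row(70))  # {'FP16': 140, 'INT8': 70, 'INT4': 35.0}
```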


Privacy · Terms · About · © 2025 PrimeCalcPro