FP64 34 TFLOPS
FP64 Tensor Core 67 TFLOPS
FP32 67 TFLOPS
TF32 Tensor Core 989 TFLOPS²
BFLOAT16 Tensor Core 1,979 TFLOPS²
FP16 Tensor Core 1,979 TFLOPS²
FP8 Tensor Core 3,958 TFLOPS²
INT8 Tensor Core 3,958 TOPS²
GPU Memory 141GB
GPU Memory Bandwidth 4.8TB/s
Decoders 7 NVDEC, 7 JPEG
Confidential Computing Supported
Max Thermal Design Power (TDP) Up to 600W (configurable)
Multi-Instance GPUs Up to 7 MIGs @16.5GB each
Form Factor PCIe
Interconnect 2- or 4-way NVIDIA NVLink bridge: 900GB/s; PCIe Gen5: 128GB/s
Server Options NVIDIA MGX™ H200 NVL partner and NVIDIA-Certified Systems with up to 8 GPUs
NVIDIA AI Enterprise Included
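Taken together, the throughput and memory-bandwidth rows give the machine balance: how many operations a kernel must perform per byte fetched from GPU memory before the listed compute peak, rather than the 4.8TB/s of bandwidth, becomes the bottleneck. Below is a minimal sketch of that arithmetic in Python, using only the peak figures from the table above; nothing beyond those numbers is assumed.

```python
# Machine-balance sketch for the peak figures listed in the spec table above.
# All rates are copied verbatim from the table; no other data is assumed.

MEM_BW_TB_S = 4.8            # GPU memory bandwidth, TB/s
PEAKS_TFLOPS = {
    "FP64":             34,
    "FP64 Tensor":      67,
    "FP32":             67,
    "TF32 Tensor":      989,
    "BF16/FP16 Tensor": 1_979,
    "FP8 Tensor":       3_958,   # the INT8 Tensor Core row lists the same figure, in TOPS
}

for name, tflops in PEAKS_TFLOPS.items():
    # Classic roofline balance point: peak FLOP/s divided by peak bytes/s,
    # i.e. the arithmetic intensity needed for the kernel to be compute-bound.
    balance = (tflops * 1e12) / (MEM_BW_TB_S * 1e12)
    print(f"{name:>18}: {balance:7.1f} FLOP/byte to saturate compute")
```

At FP8, for example, that works out to roughly 3,958 / 4.8 ≈ 825 operations per byte; kernels with lower arithmetic intensity are limited by the 4.8TB/s of HBM bandwidth rather than by the Tensor Core peak.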
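The Multi-Instance GPU row means the card can be carved into as many as seven isolated instances. The sketch below shows one way a monitoring script might list whatever MIG instances are already configured, assuming the nvidia-ml-py (pynvml) bindings are installed and an administrator has already enabled and partitioned MIG (for example with nvidia-smi); device index 0 and the GiB formatting are illustrative choices, not part of the datasheet.

```python
# Sketch: list configured MIG instances on the first GPU via NVML.
# Assumes the nvidia-ml-py package ("pip install nvidia-ml-py") and an
# already-configured MIG layout; device index 0 is an arbitrary choice.
import pynvml

pynvml.nvmlInit()
try:
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    current_mode, _pending = pynvml.nvmlDeviceGetMigMode(gpu)

    if current_mode != pynvml.NVML_DEVICE_MIG_ENABLE:
        print("MIG is not enabled on this GPU")
    else:
        max_count = pynvml.nvmlDeviceGetMaxMigDeviceCount(gpu)
        for i in range(max_count):
            try:
                mig = pynvml.nvmlDeviceGetMigDeviceHandleByIndex(gpu, i)
            except pynvml.NVMLError:
                continue  # slot i has no instance configured
            mem = pynvml.nvmlDeviceGetMemoryInfo(mig)
            print(f"MIG instance {i}: {mem.total / 2**30:.1f} GiB total")
finally:
    pynvml.nvmlShutdown()
```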