Technical Specifications

Specification | H100 SXM | H100 NVL
FP64 | 34 teraFLOPS | 30 teraFLOPS
FP64 Tensor Core | 67 teraFLOPS | 60 teraFLOPS
FP32 | 67 teraFLOPS | 60 teraFLOPS
TF32 Tensor Core* | 989 teraFLOPS | 835 teraFLOPS
BFLOAT16 Tensor Core* | 1,979 teraFLOPS | 1,671 teraFLOPS
FP16 Tensor Core* | 1,979 teraFLOPS | 1,671 teraFLOPS
FP8 Tensor Core* | 3,958 teraFLOPS | 3,341 teraFLOPS
INT8 Tensor Core* | 3,958 TOPS | 3,341 TOPS
GPU Memory | 80GB | 94GB
GPU Memory Bandwidth | 3.35TB/s | 3.9TB/s
Decoders | 7 NVDEC, 7 JPEG | 7 NVDEC, 7 JPEG
Max Thermal Design Power (TDP) | Up to 700W (configurable) | 350-400W (configurable)
Multi-Instance GPUs | Up to 7 MIGs @ 10GB each | Up to 7 MIGs @ 12GB each
Form Factor | SXM | PCIe dual-slot air-cooled
Interconnect | NVIDIA NVLink™: 900GB/s; PCIe Gen5: 128GB/s | NVIDIA NVLink: 600GB/s; PCIe Gen5: 128GB/s
Server Options | NVIDIA HGX H100 Partner and NVIDIA-Certified Systems™ with 4 or 8 GPUs; NVIDIA DGX H100 with 8 GPUs | Partner and NVIDIA-Certified Systems with 1–8 GPUs
NVIDIA AI Enterprise | Add-on | Included
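For reference, the sketch below shows one way to check which variant is installed in a server: it queries the CUDA runtime for the device properties that differ between the two columns above (total memory and SM count). This is a minimal example assuming a host with the CUDA toolkit and driver installed; it is not part of the product itself.

```cuda
// Minimal sketch: enumerate CUDA devices and print the properties that
// distinguish the H100 SXM (~80 GB HBM) from the H100 NVL (~94 GB HBM).
// Both report Hopper compute capability 9.0.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA device found\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("GPU %d: %s, CC %d.%d, %.1f GB memory, %d SMs\n",
               dev, prop.name, prop.major, prop.minor,
               prop.totalGlobalMem / 1.0e9, prop.multiProcessorCount);
    }
    return 0;
}
```

Compile with `nvcc` and compare the reported memory size against the GPU Memory row of the table.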
Shipping & Delivery
Delivery Options
- Anywhere
- Visa, Mastercard, and AMEX are accepted

International Delivery
International orders require a UPS, FedEx, or DHL account for us to ship on, or a USA-based freight forwarder for us to ship to.