Nvidia H100 GPU

NVIDIA 8× H100 80GB SXM5 Bundle Specifications

| Specification | H100 SXM | H100 NVL |
| --- | --- | --- |
| FP64 | 34 teraFLOPS | 30 teraFLOPS |
| FP64 Tensor Core | 67 teraFLOPS | 60 teraFLOPS |
| FP32 | 67 teraFLOPS | 60 teraFLOPS |
| TF32 Tensor Core | 989 teraFLOPS | 835 teraFLOPS |
| BFLOAT16 Tensor Core | 1,979 teraFLOPS | 1,671 teraFLOPS |
| FP16 Tensor Core | 1,979 teraFLOPS | 1,671 teraFLOPS |
| FP8 Tensor Core | 3,958 teraFLOPS | 3,341 teraFLOPS |
| INT8 Tensor Core | 3,958 TOPS | 3,341 TOPS |
| GPU Memory | 80GB | 94GB |
| GPU Memory Bandwidth | 3.35TB/s | 3.9TB/s |
| Decoders | 7 NVDEC, 7 JPEG | 7 NVDEC, 7 JPEG |
| Max Thermal Design Power (TDP) | Up to 700W (configurable) | 350–400W (configurable) |
| Multi-Instance GPUs | Up to 7 MIGs @ 10GB each | Up to 7 MIGs @ 12GB each |
| Form Factor | SXM | PCIe, dual-slot, air-cooled |
| Interconnect | NVIDIA NVLink™: 900GB/s; PCIe Gen5: 128GB/s | NVIDIA NVLink: 600GB/s; PCIe Gen5: 128GB/s |
| Server Options | NVIDIA HGX H100 Partner and NVIDIA-Certified Systems™ with 4 or 8 GPUs; NVIDIA DGX H100 with 8 GPUs | Partner and NVIDIA-Certified Systems with 1–8 GPUs |
| NVIDIA AI Enterprise | Add-on | Included |
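The memory, SM, and interconnect rows above can be cross-checked on a delivered system with the CUDA runtime API. The sketch below is a minimal example, assuming the CUDA Toolkit is installed and the GPUs are visible to the driver; the file name and output formatting are illustrative, not part of any NVIDIA tool. It prints each device's name, HBM capacity, and SM count, then reports whether GPU pairs can access each other's memory directly, which is how an 8-GPU NVLink-connected SXM baseboard typically presents itself.

```cpp
// verify_gpus.cu -- hypothetical file name for this sketch.
// Queries the CUDA runtime for properties that correspond to the
// datasheet rows above: device name, memory capacity, SM count,
// and peer-to-peer accessibility between GPU pairs.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices found.\n");
        return 1;
    }

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("GPU %d: %s\n", i, prop.name);
        std::printf("  Memory:     %.1f GB\n",
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        std::printf("  SM count:   %d\n", prop.multiProcessorCount);
        std::printf("  Memory bus: %d-bit\n", prop.memoryBusWidth);
    }

    // Peer access reports whether two GPUs can address each other's
    // memory directly (over NVLink on SXM systems, or PCIe otherwise);
    // it does not distinguish which fabric carries the traffic.
    for (int i = 0; i < count; ++i) {
        for (int j = 0; j < count; ++j) {
            if (i == j) continue;
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, i, j);
            std::printf("GPU %d -> GPU %d peer access: %s\n",
                        i, j, canAccess ? "yes" : "no");
        }
    }
    return 0;
}
```

Build with nvcc (for example, `nvcc verify_gpus.cu -o verify_gpus`) and run on the target node; on an 8× H100 SXM bundle you would expect eight devices, roughly 80GB reported per device, and peer access between every GPU pair.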