Oh, and I'm working on another post about which clouds are best in general. My current recommendations:
If you need a huge number of A100s/H100s: talk to Oracle, FluidStack, or Lambda Labs. Capacity for large quantities is very limited though, especially for H100s, based on a couple of cloud founders/execs I've talked with.
If you need a couple of A100s: FluidStack or Runpod.
If you need 1x H100: FluidStack or Lambda Labs.
If you need cheap 3090s: Tensordock.
If you need Stable Diffusion inference only: Salad.
If you want to play around with templates or are a general hobbyist: Runpod.
All of this assumes you're not tied to, or required by your enterprise to use, a specific large cloud.