
No Man’s Guide to GPUs

By Deepak Joshi | November 29, 2023 12:18 pm

For Stable Diffusion - authored by PeePa, Ceaseless Vibing, Lewington

Who is this for?

You want to run Stable Diffusion, but you still have some questions about GPUs. Buy or rent? Local or remote? Nvidia or AMD? 4 gigabytes of VRAM or 24? This guide has the answers.

Why do I need a GPU?

GPUs are a special kind of computer processor, purpose-built for massively parallel computation. Stable Diffusion can run on either a GPU or the more common CPU, but it is hundreds of times faster on a GPU, so a GPU is highly recommended.

Which GPU should I use?

The right GPU for you depends on what you want to do, and how frequently you want to be doing it. Different use cases demand different amounts of GPU RAM. Below is a handy table which lists different use cases and the amount of RAM that they require.


| Use Case | Minimum GPU VRAM (GB) | Works on AMD | Links |
| --- | --- | --- | --- |
| Just have a poke around | 4 | Yes | Online Interface |
| Install Locally | 4 | Yes | |
| Dreambooth | 8 | Not yet | Colab; Local Install Starting Point |
| Textual Inversion | 6 | Unsure | Colab |
| Deforum Animation | 8 | Yes | Colab; Local Install Tutorial |
| Make Anime | 7 | Yes | Online Interface; Local Install Tutorial |
| Use AUTOMATIC1111 Webui | 4 | Yes | AMD Install |
| Use XFormers | 4 | Not yet | Local Install Tutorial |
| Use with Photoshop | 4 | Yes | Install Tutorial |
| Use with Krita | 4 | Not yet | Install Tutorial |
| Img2Img | 4 | Yes | |
| Text-to-3D | 8 | Not yet | Tutorial |
| Disco Diffusion | 8 | Not yet | Colab; Install Instructions |
| Outpainting | 8 | Unsure | Install Instructions |
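As a quick sanity check, the table above can be encoded as a lookup that tells you which use cases a given card can handle. This is just a sketch (the dictionary keys and function name are our own shorthand; the VRAM numbers are copied from the table):

```python
# Minimum GPU VRAM (GB) per use case, copied from the table above.
MIN_VRAM_GB = {
    "poke around": 4,
    "install locally": 4,
    "dreambooth": 8,
    "textual inversion": 6,
    "deforum animation": 8,
    "anime": 7,
    "automatic1111 webui": 4,
    "xformers": 4,
    "photoshop plugin": 4,
    "krita plugin": 4,
    "img2img": 4,
    "text-to-3d": 8,
    "disco diffusion": 8,
    "outpainting": 8,
}

def supported_use_cases(vram_gb: float) -> list[str]:
    """Return the use cases a card with the given VRAM can handle."""
    return sorted(u for u, need in MIN_VRAM_GB.items() if vram_gb >= need)

# A 6 GB card covers every 4 GB use case plus textual inversion.
print(supported_use_cases(6))
```

So a budget 6 GB card already covers most of the table; the 8 GB tier is where Dreambooth, animation, and 3D open up.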

Buy or Rent?

The benefits of buying your own GPU and installing it on your local computer are:

  1. You have more control over your own setup.
  2. Generating is (probably) slightly faster when you are only making a few images.
  3. You only need to set up once.
  4. You can generate images even when your internet connection is bad.
  5. There are many good programs to help you install and run locally (AUTOMATIC1111's webui, for example).

View Purchasing a GPU for info.

Alternatively, you could simply rent a GPU. How does this work? Well, there are lots of companies out there who maintain racks upon racks of computers with very powerful GPUs already installed. For a fee, you can access one of these computers, set it up for stable diffusion, and start generating right away.

Benefits of renting:

  1. Probably much cheaper unless you are generating a LOT of art.
  2. No need to physically install a GPU.
  3. Less hassle setting up the software needed to run stable diffusion applications, as most services provide this already.
  4. Access to super-huge GPUs.

View Renting Cloud GPUs for info.

Whether you rent or buy, the actual price you will pay depends on the complexity and scale of your applications, so bear in mind the minimum processing power required, and how that might change in the future as new tools, platforms, and components appear.

For instance, it costs $389 for an Nvidia RTX 2060, which is one of the lowest-end GPUs worth buying. By comparison, it costs $0.20 per hour to rent an RTX 3070 (which is slightly faster) on RunPod. This means you would need to generate for 1,945 hours on RunPod before you spent the same amount. If you generated for 3 hours every single day, this would take nearly two years, by which time Nvidia will probably have released a new GPU anyway.
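The break-even arithmetic above is easy to reproduce for your own prices. A minimal sketch, using the example figures from this paragraph ($389 to buy, $0.20/hour to rent):

```python
def break_even_hours(purchase_price: float, rent_per_hour: float) -> float:
    """Hours of rented GPU time that cost as much as buying outright."""
    return purchase_price / rent_per_hour

hours = break_even_hours(389, 0.20)   # break-even: ~1945 hours
days = hours / 3                      # at 3 hours of generating per day
print(f"{hours:.0f} hours, i.e. about {days / 365:.1f} years at 3 h/day")
```

Plug in the current price of whatever card and rental service you are comparing; the conclusion flips only if you generate heavily, every day, for years.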

Renting Cloud GPUs

Renting cloud GPUs is the more accessible option, and you can rent from one of the following services:

  • Free + paid Jupyter-lab based services:
      • Amazon SageMaker (invite only). Sign up for access: apply here.
          • Provides up to 15 GB of storage.
          • 4 hours of runtime per session.
          • 8 hours of total runtime per 24 hours.
      • Google Colab (free + paid). Options include Free, Colab Pro, and Colab Pro+; on the free tier there is no guarantee of resources or availability.
          • Colab Free: offers an environment with a Tesla K80/Tesla T4 GPU. Once you have used all the RAM, GPU access is revoked for 24 hours.
          • Colab Pro: you are usually assigned a Tesla K80/Tesla T4 GPU, with 100 compute units. You can request access to premium GPUs (V100/A100) in the settings. A Tesla T4 session consumes around 1.94 compute units per hour; an A100 session consumes around 10 compute units per hour.
          • Colab Pro+: you are usually assigned a Tesla K80/Tesla T4 GPU, with 500 compute units and the same premium-GPU option. Resources are more reliably available, and background execution is allowed.
      • Gradient (Paperspace):
          • Create workflows and automate them.
          • Persistent notebooks: files created on Google Colab are deleted when the session ends, but files stored in persistent notebooks remain.
          • Access to JupyterLab.
      • Kaggle Notebooks:
          • Loading data is instantaneous, which makes it good for running notebooks against datasets hosted on Kaggle.
          • 30 hours/week of free GPU time.
          • Free access to a Tesla P100 GPU.
  • Dedicated GPU workspaces:
      • CoreWeave
      • RunPod
      • FluidStack
      • Google Cloud Platform
      • Amazon AWS
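The Colab compute-unit rates quoted above translate directly into runtime budgets. A quick sketch (the function name is our own; the burn rates of ~1.94 units/hour for a T4 and ~10 units/hour for an A100 are the figures from this section):

```python
def runtime_hours(units: float, burn_rate_per_hour: float) -> float:
    """How long a compute-unit budget lasts at a given burn rate."""
    return units / burn_rate_per_hour

# Colab Pro's 100 units stretch much further on a T4 than on an A100.
print(f"T4:   {runtime_hours(100, 1.94):.1f} h")
print(f"A100: {runtime_hours(100, 10):.1f} h")
```

In other words, the same Colab Pro budget buys roughly 50 hours on a T4 but only 10 hours on an A100, so only request premium GPUs when a job actually needs the extra VRAM or speed.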

Purchasing a GPU

If you want to buy a GPU, you will first and foremost want to know about AMD vs Nvidia compatibility; view the Compatibility section below for info.

Look here to see a speculative graph comparing GPUs, and make sure your CPU is a good performance match for your chosen GPU. Once that's done, go ahead and set it up in your desired configuration for Stable Diffusion and its various applications, e.g. 2D and 3D image synthesis.

Generally speaking, buyers in this category should look at the newly released RTX 40 series, or the RTX 20 and 30 series, taking the applications and your usage into consideration.

Absolute minimum?

Users who want to run these applications on minimum specs should look at the Nvidia GTX 10 series, or a similar AMD range, with 4 GB of VRAM.

Running locally, AMD or Nvidia GPUs?

Nvidia - Most Compatible

For those of you using Nvidia: CUDA is a powerful compute platform that unlocks many of the features used by Stable Diffusion tools.

AMD - Incompatible with CUDA

As for AMD users, the CUDA toolkit is not available, so you will probably want to stick with PyTorch (via ROCm) or rent cloud GPUs. The following link shows which applications will be available to you:

SD on AMD Compatibility

What Environment is needed to run Stable Diffusion?

In general you will need the following:

  • Python 3 (currently 3.10 and newer) or Anaconda3, depending on the use case
  • Git
  • Visual Studio libraries
  • An editor, such as Notepad++ or Visual Studio Code

For Nvidia: as of writing this guide, you will need at least CUDA Toolkit 11.8 and up-to-date graphics drivers.

For AMD: the applications may be the same, but the setup required may be more complex, as you will need to install your compute platform (ROCm, AMD's answer to CUDA) through PyTorch.

For others: you will need to check whether your GPU or system is capable of running PyTorch.
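To check which of these paths applies on a given machine, a minimal PyTorch probe works: `torch.cuda.is_available()` reports True on both CUDA and ROCm builds. This is a sketch (the function name is our own), assuming only that PyTorch may or may not be installed:

```python
def detect_backend() -> str:
    """Report which Stable Diffusion compute backend this machine can use."""
    try:
        import torch
    except ImportError:
        return "pytorch not installed"
    if torch.cuda.is_available():  # True for both CUDA and ROCm builds
        return f"gpu: {torch.cuda.get_device_name(0)}"
    return "cpu only (expect very slow generation)"

print(detect_backend())
```

Run this before installing any of the webuis above; if it reports "cpu only", fix your drivers or toolkit first, since everything downstream assumes a working GPU backend.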


Deepak Joshi

Content Marketing Specialist at Appy Pie