In r/learnmachinelearning, multiple threads argue beginners should start on CPU and rent GPUs or use Colab when needed, because time spent on drivers and fitting models into small GPUs can outweigh the benefits of owning hardware.
You don't actually need to own or even rent a GPU to get started learning about machine learning.
I've been learning and experimenting with ML mostly using rented GPUs.
The more time you spend fighting to fit models into smaller GPUs, or wrestling with drivers and libraries, the less time you have to actually accomplish your goal.
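The "start on CPU, move to a rented GPU or Colab only when needed" advice is easiest to follow if your code is device-agnostic from the start, so the same script runs unchanged in both places. A minimal sketch of that idea, using only the standard library — the `pick_device` helper is hypothetical, and the presence of `nvidia-smi` is just a cheap proxy; frameworks such as PyTorch expose a proper check (`torch.cuda.is_available()`) when installed:

```python
import shutil


def pick_device() -> str:
    """Return "cuda" if an NVIDIA driver appears to be present, else "cpu".

    Hypothetical helper: looks for the `nvidia-smi` binary on PATH as a
    rough stand-in for GPU availability. On a laptop this returns "cpu";
    on a rented GPU box or a Colab GPU runtime it returns "cuda", so the
    rest of the script never hard-codes a device.
    """
    return "cuda" if shutil.which("nvidia-smi") else "cpu"


device = pick_device()
print(f"Running on: {device}")
```

Keeping the device string in one place like this means moving from local CPU experiments to a rented GPU is a matter of running the same file, not editing it.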