I’m not sure what you mean by ideal. Like, run any model you could ever want? Probably the latest AI chips from Nvidia.
But you can get away with a lot less for smaller models. I have a mid-range AMD card from 4 years ago (I forget the exact model off the top of my head) and can run 8B-parameter text models without issue.
You’d have to go looking for a specific model, but https://huggingface.co/ has a model for nearly anything you’d want. You just have to set up your local machine to run it.
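For a concrete starting point, one common route is Ollama (an assumption on my part, not the only option; it wraps llama.cpp, and the `llama3:8b` tag below is just an example model of that size):

```sh
# Install Ollama (Linux/macOS installer from their site)
curl -fsSL https://ollama.com/install.sh | sh

# Pull an 8B text model (several-GB download) and prompt it locally
ollama pull llama3:8b
ollama run llama3:8b "Write a MySQL query that returns the top 10 customers by total order value."
```

Any GGUF model from Hugging Face can be run the same way once it’s imported; the main constraint is fitting the model in your GPU’s VRAM (or spilling to system RAM, which is slower).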
I’m sorry, I use ChatGPT for writing MySQL queries and DAX formulas, so that would be the use case.