I’ve tried coding with them, and every one I’ve tried fails at anything beyond the really, really basic small functions you write as a newbie, compared to say 4o mini, which can spit out more sensible stuff that actually works.
I’ve tried asking for explanations, and they just regurgitate sentences that can be irrelevant or wrong, or they get stuck in a loop.
So, what can I actually use a small LLM for? Which ones? I ask because I have an old laptop and its GPU can’t really handle anything above 4B in a timely manner. 8B runs at about 1 t/s!
Snippets are a great use.
I use StableCode on my phone as a programming tutor for learning Python. It is outstanding in both speed and accuracy for this task. I have it generate definitions, which I copy and paste into Anki, the flashcard app. Whenever I’m on a bus or an airplane I just start studying. I wish it could also quiz me interactively.
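If you'd rather batch this on a PC instead of copy-pasting by hand, here's a rough sketch of the same idea. It assumes a small model served locally through Ollama's HTTP API; the model tag, prompt, and term list are my own stand-ins, not the actual phone setup:

```python
import json
import urllib.request

# Assumptions: Ollama running at its default local endpoint, and some small
# model already pulled. "stablecode" here is a hypothetical tag; use whatever
# small model you actually have installed.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "stablecode"

TERMS = ["list comprehension", "decorator", "generator"]

def define(term: str) -> str:
    """Ask the local model for a short flashcard-style definition."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": f"In two sentences, define the Python concept: {term}",
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()

# Anki can import plain tab-separated text: front<TAB>back, one card per line.
with open("python_cards.txt", "w", encoding="utf-8") as f:
    for term in TERMS:
        f.write(f"{term}\t{define(term)}\n")
```

Then import python_cards.txt in Anki as a tab-separated file and the term/definition pairs become cards.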
Please be very careful. The Python code it spits out will most likely be outdated and won’t work as well as it should (the code isn’t “thought out” the way it would be if a human wrote it).
If you want to learn, dive in, set yourself tasks, get stuck, and f around.
I know what you mean. All the code generated with AI was loaded with problems. Specifically, it kept hardcoding my API keys into the code instead of using environment variables. But for basic coding concepts it has so far been perfect. Even a 3B model seemingly generates great definitions.
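For anyone who wants the fix spelled out, this is the pattern to ask the model for instead of a hardcoded key. MY_SERVICE_API_KEY is just a made-up variable name for illustration:

```python
import os

# Read the secret from the environment instead of embedding it in the source.
api_key = os.environ.get("MY_SERVICE_API_KEY")
if api_key is None:
    raise RuntimeError(
        "Set MY_SERVICE_API_KEY before running, e.g. "
        "`export MY_SERVICE_API_KEY=...`"
    )

# BAD (what the model kept generating): api_key = "sk-123456..."
headers = {"Authorization": f"Bearer {api_key}"}
print("Key loaded, length:", len(api_key))
```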