Programming. I have a bunch of ideas that are actually useful for my job, but I can’t seem to keep track of the necessary steps to write the code. ChatGPT has helped me create a couple of programs: a Discord bot and a very complex (for me) application that pulls in NASA data and automatically runs it through Stable Diffusion. The code interpreter is amazing… but there’s too much context I’m missing for these things to be truly fun the way I imagine them to be.
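(For anyone curious what a pipeline like that can look like, here’s a minimal sketch of the NASA side, assuming the public APOD endpoint at api.nasa.gov and a placeholder run_stable_diffusion() for whatever image-generation step you use; the actual setup described above may differ.)

```python
# Minimal sketch: fetch NASA's Astronomy Picture of the Day metadata,
# then hand the description off as a Stable Diffusion prompt.
# Assumes the public APOD API (api.nasa.gov); run_stable_diffusion()
# is a placeholder for your own local pipeline.
import requests

APOD_URL = "https://api.nasa.gov/planetary/apod"

def fetch_apod(api_key: str = "DEMO_KEY") -> dict:
    """Return today's APOD entry as a dict (title, explanation, url, ...)."""
    resp = requests.get(APOD_URL, params={"api_key": api_key}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def run_stable_diffusion(prompt: str) -> None:
    """Placeholder: swap in your own diffusers / AUTOMATIC1111 call here."""
    print(f"Would generate an image for prompt: {prompt[:80]}...")

if __name__ == "__main__":
    apod = fetch_apod()
    run_stable_diffusion(apod["title"] + ". " + apod["explanation"])
```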
Depending on your level of experience, it might just take more time and practice. When I was doing my degree, it took two years, an internship, and multiple serious programming courses before I truly felt comfortable programming.
Look up UC Berkeley’s CS61A, B, and C, then start looking deeper into the CS curriculum to find the pieces you’re missing. For me, it took trying to optimise the Linux CPU scheduler for a single-threaded workload, specifically complex assemblies in FreeCAD. That led me to a few lectures on the scheduler in an OS principles course. I’ve done a bunch of little embedded projects but struggle with complexity, and the concepts around a scheduler were exactly what I was missing. There’s a lot of material like this available for free online if you go searching for it specifically.
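(If you want to poke at those scheduler ideas concretely, here’s a small sketch, nothing from the course itself, that pins a CPU-bound worker to one core and bumps its nice value using Python’s stdlib scheduling calls on Linux; watching it in htop makes the single-threaded bottleneck very visible.)

```python
# Small Linux experiment: pin this process to one core and lower its
# priority, then burn CPU so you can watch how the scheduler treats a
# single-threaded workload (e.g. in htop). Uses only the stdlib os module.
import os
import time

def pin_to_core(core: int = 0) -> None:
    """Restrict this process to a single CPU core (Linux-only API)."""
    os.sched_setaffinity(0, {core})  # 0 = current process
    print(f"Affinity now: {os.sched_getaffinity(0)}")

def lower_priority(increment: int = 10) -> None:
    """Raise the nice value so other processes get the core first."""
    new_nice = os.nice(increment)
    print(f"Nice value now: {new_nice}")

def burn_cpu(seconds: float = 10.0) -> None:
    """Single-threaded busy loop to make the scheduling behaviour visible."""
    end = time.monotonic() + seconds
    x = 0
    while time.monotonic() < end:
        x += 1

if __name__ == "__main__":
    pin_to_core(0)
    lower_priority(10)
    burn_cpu(10.0)
```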
If you want to really free yourself, I run an offline Llama 2 70B quantised to 4-bit GGML for coding. It works well for snippets and can do better than 3 tokens per second on a 12th-gen i7 with 64GB of RAM and a 16GB 3080 Ti. It can run at around 2 tokens per second on the CPU alone, but you’ll need a Linux machine with an additional 8GB swap partition just to load the model initially. It takes around 43GB to run after init.
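(I didn’t name the runner above; as one concrete way to do it, here’s a minimal sketch using the llama-cpp-python bindings. The model path, context size, and layer-offload count are placeholders you’d adjust for your own quantised file and hardware, and whether you need a GGML or GGUF file depends on the version of the bindings.)

```python
# Minimal sketch of running a quantised Llama 2 locally for code snippets,
# assuming the llama-cpp-python bindings. model_path, n_ctx, n_threads and
# n_gpu_layers are placeholders to tune for your own model file and hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-70b.Q4_0.gguf",  # placeholder: your 4-bit model file
    n_ctx=4096,                            # context window
    n_threads=12,                          # CPU threads to use
    n_gpu_layers=40,                       # layers to offload to the GPU
)

prompt = "Write a Python function that parses an ISO 8601 date string."
out = llm(prompt, max_tokens=256, temperature=0.2)
print(out["choices"][0]["text"])
```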