I'm not sure I can agree with your analysis and approach. Here's why: there are current open-source models out there, such as Vicuna, which in my admittedly light testing appear to be sufficiently powerful at 13B parameters and run well on capable personal hardware. That means you don't need to go to the cloud to rent a GPU, so costs end up significantly lower when spread over the lifetime of the investment.

Now, I'm not saying you're going to build and train your own AI from scratch, so don't get me wrong there. You can, however, use Python to code against an existing, already-trained model like Vicuna. If you feel that isn't complete freedom and control, then maybe you're right. Still, I think it's better to own your own metal (hardware). For more on my Vicuna example, see: https://medium.com/@michael-mcanally/ai-wow-it-is-possible-to-run-an-open-source-ai-like-chatgpt-for-free-on-your-local-pc-without-even-a2df08806abf
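To give a concrete idea of what "use Python to code against an existing model" can look like: many local runners (llama.cpp's server mode, text-generation-webui, and similar tools) expose an OpenAI-style HTTP API on your own machine. The sketch below is a minimal, hedged example assuming such a server is running locally; the URL, port, model name, and exact response shape are assumptions you'd adjust for your own setup.

```python
import json
import urllib.request

# Assumed local endpoint -- adjust to wherever your local server listens.
LOCAL_URL = "http://localhost:8000/v1/completions"


def build_request(prompt: str, max_tokens: int = 128) -> dict:
    """Assemble a JSON payload for an OpenAI-style completion request."""
    return {
        "model": "vicuna-13b",   # whatever model your local server has loaded
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }


def ask(prompt: str) -> str:
    """Send the prompt to the locally hosted model and return its reply.

    Only works when a local server is actually running at LOCAL_URL.
    """
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape assumed to follow the OpenAI completions format.
    return body["choices"][0]["text"]


if __name__ == "__main__":
    # Build (but don't send) a request, so this runs without a server.
    print(build_request("Why run an LLM locally?"))
```

The point is that none of this touches the cloud: the request goes to your own hardware, and the only ongoing cost is the electricity to run the card you already own.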