Yup, this. If you would like more help we need the code, or at least a minimal reproducible example.
I might be open to the idea, but it would need to be a trustworthy company that doesn’t cancel products left and right. An IDE would be too annoying to switch constantly, so I wouldn’t take that risk.
(What do (you) mean? ( Lisp certainly has its (downsides) and (upsides)))
This seems more focused on commercial license holders. There, paying for your IDE is not that uncommon, and the amount of revenue to be gained is a lot higher. That being said, I always found it a bit odd that JetBrains didn’t make CLion free for non-commercial use as they did with PyCharm/IntelliJ.
I’ll be starting with the Andrej Karpathy neural network series. It might not be reading per se, but I find it high time I actually go through and learn fully how each part of a neural net works together, instead of focusing only on small pieces.
Those certainly also look nice; I hadn’t noticed them.
I have been out of the ML world for a bit (like 6 months, lol…) and I already feel way out of date. It seems I should pick up the Vicuna LLM; I didn’t want to touch LLaMA initially due to the legal problems around it. I thought that would be a problem for a while, and then they went and solved it. Somehow I even missed the news of it, most likely due to the enormous amount of news coming out of the ML world (I might need a model to summarize it). Anyway, thanks for the article, I know what to do this weekend.
The second part has some of this, but not as in depth as I’d like.
Back in the day, before university (around 6 years ago), I got recommended a MOOC (massive open online course) by the University of Helsinki. I used this course to get started with learning to program, and to find out whether it was something for me. It has been some time, and it seems they have updated the course, but I hope it can help you too. Here is the link: https://java-programming.mooc.fi/. It really starts from zero, with setting up the environment, which is nice. It is in Java using the NetBeans IDE, which some would call antique, but in my opinion that does not really matter when starting to learn.
The most interesting part here, I find, is the cost analysis. I was quite surprised to see that the cost to train it on current hardware would be a third of what it cost back when they originally trained it. That is roughly a 3x improvement in a year to a year and a half. I wonder whether this trend will continue.
I wonder whether there even is a way to deal with this collapse. It seems to me that data generated by the model itself will never contain new cases to learn from; it will just keep reinforcing the knowledge already in the model. That is not always a bad thing: we have been doing that kind of reinforcing for a long time under the name of “epochs”. But even with epochs you end up overfitting after a while.
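You can see a cartoon version of this collapse with a toy simulation of my own (not from the article): fit a Gaussian to samples drawn from the previous generation’s fitted Gaussian, with no fresh real data ever entering the loop. The estimated spread tends to drift toward zero over the generations; the numbers here (20 samples, 500 generations) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0   # generation 0: the "real" data distribution
n = 20                 # samples drawn per generation
generations = 500

sigmas = [sigma]
for _ in range(generations):
    # the "model" generates its own training data...
    samples = rng.normal(mu, sigma, size=n)
    # ...and the next generation is fit only on those samples
    mu, sigma = samples.mean(), samples.std(ddof=1)
    sigmas.append(sigma)

print(f"estimated sigma after {generations} generations: {sigmas[-1]:.6f}")
```

Because each refit can only lose information relative to the original distribution, the estimated variance performs a downward-biased random walk and the “model” eventually concentrates on a sliver of the original support, which matches the intuition that self-generated data only reinforces what is already there.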
I am still not sure what to think of this entire thing. It feels like at a certain point someone started playing circus music and forgot to turn it off.