

I think a lot of it is still done by hand, and of course there's also synthetic data distilled from larger models.


I mean, making fuel is one thing
Actually I was thinking more of crude metallurgy and materials processing. You could quite easily get aluminium from lunar regolith, along with tons of silicates. That lets you produce shielding, radiators and the structural elements of solar panels without having to blast every ton of raw material up from Earth. And it's not particularly high-tech stuff either: some furnaces and basic extrusion would go a long way. If you only have to ship the delicate electronics from Earth, you're already saving a lot.


Interestingly, NASA had a plan that sounds at least technically feasible, but it's a multi-decade operation and looks nothing like what the current startups are pitching. Of course you can have your data centers in space, why the fuck not, but a data center sits on top of a lot of boring old infrastructure that nobody's excited to talk about.
It's going to be prohibitive if you have to pay the gravity tax every time you want to move 1 ton of metal, so realistically this kind of high-tech project can't even begin without substantially industrializing the Moon first. Nothing fancy, but you'll need at least some mining and refining, solid trans-lunar logistics routes, and probably housing for a bit of personnel too. At that point the space data center would be dwarfed by its own support system.
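To put a rough number on that gravity tax: this is pure back-of-the-envelope, with a launch price and a lunar-landing multiplier that are both my own assumptions, not real quotes.

```python
# Back-of-the-envelope cost of soft-landing 1 ton of metal on the Moon.
# Both constants below are assumptions for illustration only.
COST_TO_LEO_PER_KG = 1_500   # assumed $/kg to low Earth orbit
LEO_TO_MOON_MULTIPLIER = 4   # assumed: transfer stages + lander multiply the bill

def lunar_delivery_cost(mass_kg: float) -> float:
    """Rough dollars to get mass_kg onto the lunar surface."""
    return mass_kg * COST_TO_LEO_PER_KG * LEO_TO_MOON_MULTIPLIER

print(f"1 t of structure: ${lunar_delivery_cost(1_000):,.0f}")
# -> 1 t of structure: $6,000,000
```

Even with optimistic assumptions, every ton of dumb structural metal shipped from Earth runs into the millions, which is exactly why refining it from regolith on site changes the economics.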


AI datacenters buying up the hardware is why their hosting costs increased, worsening the problem significantly.
I'm sorry but this is bullshit. For basic storage you absolutely don't need a lot of RAM or SSDs; older-gen hard drives are extremely easy to find and very cheap. People have been hypnotized into believing they absolutely, definitely need the latest generation of hardware, without realizing it's useless to them, and that's the only kind of hardware seeing shortages and high prices.
edit: okay, it seems I've overplayed my hand a little here. My personal experience comes from buying used hardware, which still goes pretty cheap, but obviously that's a dicey sell for professional hosting.


The best way to learn to write is to write and have someone critique you. That someone can be an AI; it doesn't change anything about the process, as long as the initial input is your own best effort and the final result is your own edit based on the feedback you received.


That's an excellent point! On that topic, I recently listened to an interview with the founder of EleutherAI, who focuses on training small language models. She said they were able to train a 1B-parameter reasoning model on 50K Wikipedia articles and carefully curated RL traces. The thing can run on your smartphone and is at parity with much larger models trained on trillions of tokens.
She also scoffed at Common Crawl, saying it's mostly cookie banners and porn. Her attitude was basically "no wonder the big labs need to slurp trillions of tokens when the tokens are such low quality". A very interesting approach; if you understand French, I can only recommend the interview.
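For what it's worth, the supervised half of that recipe is pretty mundane with today's tooling. The sketch below is my own guess at the shape of the pipeline, not what the interview described: the model name is just a stand-in ~1B base model, the data file is hypothetical, and the RL part is omitted entirely.

```python
# Hypothetical sketch: supervised fine-tuning of a small model on a
# curated corpus, using standard HuggingFace tooling. Names, paths and
# hyperparameters are placeholders, not the interview's actual setup.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "EleutherAI/pythia-1b"        # stand-in ~1B base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # this tokenizer has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Curated corpus: tens of thousands of high-quality documents instead
# of trillions of crawled tokens. The file path is hypothetical.
dataset = load_dataset("json", data_files="curated_wiki_and_rl_traces.jsonl")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="small-reasoner",
                           per_device_train_batch_size=4,
                           num_train_epochs=3),
    train_dataset=tokenized,
    # mlm=False -> plain causal LM objective with shifted labels
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```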


It's been a while since I used any MS product, but I've got the same feeling with Google products. Weird bugs are accumulating, and at the same time they're cramming every corner with buttons for their new AI integrations, with no explanation of how they're supposed to work. It's a mess; the stuff they add doesn't even respect the original app design, so the products are really starting to look like they're held together with toothpicks and duct tape.


The Revolutions podcast is a gold mine. Have you listened to the latest season? You’re in for a big surprise if you haven’t :)


OK, that's a fair observation. Honestly, my naive guess would be that they simply don't optimize the mainline GPT models for the kind of use case you generally have over the API (tool use, multi-step actions, etc.). They need it to be a perky everyday assistant, not necessarily a reliable worker. Even back on GPT-4 I found it extremely mediocre compared to the Claude models of the same era.
I think that's a more likely explanation than model collapse, which is a really drastic phenomenon. A collapsed model won't just fail tasks at a higher rate; it will spit out garbled text and go completely off the rails, which would be way more noticeable. It would also be weird for the Claude models to keep getting better while they're probably fed roughly the same diet of synthetic data.


The switch you mention (from 4th-gen to 5th-gen GPT) is when they introduced the model router, which created a lot of friction. Basically it tries to answer your question with as cheap a model as possible, so most of the time you won't be getting the flagship 5.2 but a 5.2-mini or 5.2-tiny, which are seriously dumber. This is done to save money, of course, and the only way to guarantee pure 5.2 usage is to go through the API, where you pay for every token.
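Mechanically, a router like that is just a cost-ordered cascade. Here's a hypothetical sketch with made-up prices and thresholds, reusing the model names from above; it is certainly not OpenAI's actual logic:

```python
# Hypothetical cost-first model router: pick the cheapest tier whose
# estimated capability covers the request. Prices and thresholds are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # assumed $, illustration only

# Cheapest first: the router's whole job is to avoid the flagship.
TIERS = [
    Model("5.2-tiny", 0.02),
    Model("5.2-mini", 0.15),
    Model("5.2",      1.25),
]

def route(prompt: str, complexity_score: float) -> Model:
    """Pick a tier from a (hypothetical) learned complexity estimate
    in [0, 1]; most everyday prompts score low and never reach 5.2."""
    if complexity_score < 0.4:
        return TIERS[0]
    if complexity_score < 0.8:
        return TIERS[1]
    return TIERS[2]

print(route("what's the capital of France?", 0.1).name)  # -> 5.2-tiny
```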
There's also a ton of affect and personal bias. Humans are notoriously bad at evaluating others' intelligence, and this is especially true with chatbots, which mimic specific personalities that may or may not mesh well with your own. For example, OpenAI's signature "salesman and bootlicker" personality is grating to me, and I consistently judge the model to be stupider than it is. I've even done a bit of double-blind evaluation on various cognitive tasks to confirm my impression, but the data really didn't agree with me. It's smart, roughly as smart as other models of its generation; it's just fucking insufferable. It's like I see Sam Altman's shit-eating grin every time I read a word from ChatGPT, which is why I stopped using it. That's a property of me, the human, not of GPT, the machine.


I'm sorry, but no: models are definitely not collapsing. They still have a million issues and get stuck in all sorts of local optima, but they are not collapsing in any way. It isn't even known whether collapse can happen in large models, and if it can, it would require months of active effort to generate the toxic data and fine-tune models on it. Nobody is gonna spend that kind of money to shoot themselves in the foot.


Yeah, I remember that Ed article! I don't think the technical details carry over to the newer generation of models, but of course any attempt to compress inference costs can have side effects: either response quality degrades because you're serving dumber models, or you eat re-inference costs when the dumb model shits its pants. In fact the re-inference can get super costly, since dumber models tend to get lost in reasoning loops more easily.
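The break-even is easy to sketch: the cascade only saves money while the cheap model's failure rate stays under the gap between the two prices. All the numbers here are made up:

```python
# Expected cost per request of "cheap model first, flagship only on
# failure" vs going straight to the flagship. Prices are invented.
CHEAP, FLAGSHIP = 0.10, 1.00   # assumed $ per request

def cascade_cost(fail_rate: float) -> float:
    # You always pay for the cheap attempt; on a failure you pay
    # again for the flagship re-inference.
    return CHEAP + fail_rate * FLAGSHIP

for p in (0.2, 0.5, 0.9):
    print(f"fail rate {p:.0%}: ${cascade_cost(p):.2f} vs ${FLAGSHIP:.2f} direct")
# fail rate 20%: $0.30 vs $1.00 direct
# fail rate 50%: $0.60 vs $1.00 direct
# fail rate 90%: $1.00 vs $1.00 direct  <- the saving is gone
```

And this per-request view is generous to the cascade: if the cheap model burns tokens in a reasoning loop before failing, the first term grows too.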


Yeah, that's also something you have to train for. I'm not super aware of the technicals, but model routing is clearly important to the AI companies. I suspect it's part of how they can pretend that "inference is profitable": they're already squeezing it down as much as possible.


I mean, what's the point of merging if you're gonna do it intermittently?


To clarify: model collapse is a hypothetical phenomenon that has only been observed in toy models under extreme circumstances. It is not related in any way to what is happening at OpenAI.
OpenAI made a bunch of product-design choices that basically boil down to "what if we used a cheaper, dumber model to reply to you once in a while?"


I don't know about the other bands, but the bit about Iron Maiden is a real stretch.
I guess if you count the first lineup as the one from their first concert in a bar's basement, alright. But if you go by the first album, Dave Murray was already in the band, and he still is.


Companies that have had all their data on Google/MS servers for 20 years certainly don't care about privacy the way you and I do. But they are definitely realizing that US providers are not the way to go. Baby steps, I guess.


Oh yes. Because Merz builds data centers, and von der Leyen is known for making IT decisions at EU companies. They also happen to be the darlings of private investors.


testing is doubting


Interestingly, Roots was inspired by the first nu-metal albums, especially Korn's debut. That's crazy to me; I would have guessed it was the other way around.