

And yet whenever some achievement is made, the headlines are “Musk achieves great feat”
My personal problem is that I’m so bad at using it
It’s not you, it’s just that Windows is badly designed.
Not at all. It’s not “how likely is the next word to be X”. That wouldn’t be context.
I’m guessing you didn’t watch the video.
I’m not wrong. There’s mountains of research demonstrating that LLMs encode contextual relationships between words during training.
There’s so much more happening beyond “predicting the next word”. This is one of those unfortunate “dumbing down the science communication” things. It was said once and now it’s just repeated non-stop.
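To make the point concrete: "predicting the next word" is conditioned on the *entire* context at once, not just the previous word. Here is a hedged, toy sketch in pure NumPy of a single self-attention step (all names, shapes, and weights are illustrative, not any real model's API), showing that changing an early context token changes the distribution over the next token:

```python
# Toy sketch only: illustrative names/shapes, not a real model.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def next_token_probs(embeddings, W_q, W_k, W_v, W_out):
    """One self-attention step: the next-token distribution depends
    on every token in the context, not just the last one."""
    Q = embeddings @ W_q
    K = embeddings @ W_k
    V = embeddings @ W_v
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (T, T) context mixing
    mixed = attn @ V          # each position blends information from all positions
    return softmax(mixed[-1] @ W_out)  # distribution over the next token

rng = np.random.default_rng(0)
T, d, vocab = 4, 8, 16        # toy sizes: 4-token context, 16-word vocabulary
x = rng.normal(size=(T, d))   # stand-in for token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Wo = rng.normal(size=(d, vocab))

probs = next_token_probs(x, Wq, Wk, Wv, Wo)

# Perturb the FIRST token of the context; the next-token distribution shifts,
# which a pure "last word -> next word" lookup could not do.
x2 = x.copy()
x2[0] += 1.0
probs2 = next_token_probs(x2, Wq, Wk, Wv, Wo)
```

The point of the sketch is only that the output distribution is a function of the whole context window; real models stack many such layers, which is where the learned contextual relationships live.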
If you really want a better understanding, watch this video:
And before your next response starts with “but Apple…”
Their paper has had many holes poked into it already. Also, it’s not a coincidence their paper released just before their WWDC event which had almost zero AI stuff in it. They flopped so hard on AI that they even have class action lawsuits against them for their false advertising. In fact, it turns out that a lot of their AI demos from last year were completely fabricated and didn’t exist as a product when they announced them. Even some top Apple people only learned of those features during the announcements.
Apple’s paper on LLMs is completely biased in their favour.
It just repeats things that approximate what has been said before.
That’s not correct and oversimplifies how LLMs work. I agree with the spirit of what you’re saying, though.
It’s been talked about to death. It’s been analysed to death.
But here’s a very detailed and thorough breakdown:
15 years ago, maybe. But I haven’t had any compatibility issues in many years.
He’s on Floatplane
I’ll never support anyone on that platform. I’ll never do anything to give LTT a cent.
The funny thing about that story, and the part that no one covered after the fact, is that Munich reversed direction again and ultimately did go with Linux and open-source stacks.
This is corporate AI against open source AI.
Show me where I can download Midjourney’s full model to run it locally, and then we can agree to call it “open weights”. Unless the base model and training data are also available, it’s not open source.
It’s not like her international protesting license is limited to only climate change. You can dedicate 3 months out of a year to other causes without having to pay for the dual-protest license.
I moved out of Toronto 4 years ago. Just came here for the weekend and I saw 5 cybertrucks in one day.
It’s a tech-illiterate YouTuber for tech-illiterate people who think they are tech literate
That’s a great way to say it. I usually just call him a “tech entertainer” that real tech people look down on. But I like your version.
The video that really crystallized my opinion of them was their “storage server upgrade” video, where they worked on replacing their horribly, amateurishly configured storage server.
Wendell from Lvl1 and Allan Jude (an OpenZFS maintainer) commented on LTT’s setup and, while they didn’t outright say anything negative, they had nothing good to say either, and their tone heavily implied they thought LTT were posers.
I cannot believe he still has a channel worth anything. I think LTT fans might actually be worse than Swifties.
What I do is just take out the card and plug it into a little USB dongle, which I can plug into either my phone or laptop.
What’s wild to me is that anyone would do it any other way. I’m astounded that this is somehow a “tip”.
Not even 10 years ago it was simply the way to do it.
I mean sure it has a shared history
Wait, you serious?
I think you were projecting with that “you’re insane” comment.
I have no idea what you’re trying to say this time. Maybe have a lie down?
Just tried it. It works.