It’s also the font on the pop-ups, and the AI logo.
There’s a very generic style of Facebook cartoon that these things absolutely nail.


If they’re going to name a memorial center after him, shouldn’t he be dead?
No, you just use a standard technique like word2vec.
Basically words are considered similar (and embedded to nearby locations in a high dimensional space) if they are likely to be used in the same context.
And because slurs are used to indicate that you don’t like someone, they tend to occur in the same kind of context.
So they’re all very similar. This is actual natural language processing being used, but it’s a shit post and the graphics aren’t very clear.
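The rough idea in code, as a minimal sketch using gensim’s Word2Vec on a made-up toy corpus (a real model is trained on enormous amounts of text, so don’t read much into the exact numbers):

```python
# Minimal sketch of the word2vec idea, assuming gensim is installed.
# The corpus is invented purely for illustration.
from gensim.models import Word2Vec

corpus = [
    ["i", "hate", "that", "idiot"],
    ["i", "hate", "that", "moron"],
    ["what", "an", "idiot"],
    ["what", "a", "moron"],
    ["i", "love", "my", "dog"],
    ["i", "love", "my", "cat"],
]

# Words that appear in similar contexts get embedded near each other
# in the vector space.
model = Word2Vec(corpus, vector_size=16, window=2, min_count=1, epochs=200, seed=1)

# The insults should land closer to each other than to "dog",
# because they occur in the same kinds of sentences.
print(model.wv.similarity("idiot", "moron"))
print(model.wv.similarity("idiot", "dog"))
```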


No one uses the word gaol in the UK.
We say prison instead. Occasionally jail, but that’s more common as a verb than as a noun.
Bragging rights, and sleeping better in the knowledge that the devs are being supported.
The serious answer is that it’s often easier for people in a company to buy a license key than it is for them to arrange a donation to the devs. So this is an easy way to make small donations.


Also not to be confused with Tower Bridge, which is the one everyone who doesn’t live in London calls London Bridge.


This message was brought to you by our sponsors Big Crow


You can get a very good idea of what works by just looking for AMD GPU cloud compute.
If it were usable and cheaper, everyone would be offering it. As far as I can see, it’s still experimental and only the smaller players, e.g. IBM and Oracle, are pushing it.


See, if Tesla does well, that proves Musk is a genius and should be paid more. If it does badly, it proves he was unmotivated and should be paid more.


Well, it is trained to copy Musk.
I think you meant to post this to /c/dataisunreadable
I guess it might work if HR don’t know how an LLM works. There aren’t many people who could edit a Word file so that it includes whited-out footnotes.
You’re better off getting a friend to lie for you. They can say they added it while helping you with the formatting and that you knew nothing about it.
Genuinely, this already happens in large companies for related reasons.
The CV is on file, and if HR reprocess it for any reason, e.g. relocation or a change of role, it’s automatic dismissal for dishonesty if they catch a deliberate lie.
No, but they can fire you later even if you’re good at the job.
Then you’re stuck in an even worse position, with a big gap in your CV and no reference.
Unfortunately, this is seen as dishonesty and is grounds for immediate dismissal in a lot of places.
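On the “if they catch it” point: whited-out text isn’t hidden from anything that reads the file programmatically. A minimal sketch, assuming python-docx and a made-up filename, that flags white-coloured runs in the body text (proper footnotes live in a separate part of the .docx, but the same check applies there):

```python
# Minimal sketch, assuming python-docx is installed; the filename is invented.
# Walks every run in the document body and flags text coloured white.
from docx import Document
from docx.shared import RGBColor

doc = Document("candidate_cv.docx")
for para in doc.paragraphs:
    for run in para.runs:
        if run.font.color.rgb == RGBColor(0xFF, 0xFF, 0xFF) and run.text.strip():
            print("hidden text:", run.text)
```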


Right. That’s why they overreact to everything and bring old military equipment on SWAT raids.
They’re much more likely to panic and drive an APC through the crowd or return fire on a mostly unarmed crowd using automatic weapons.
Just ask yourself, “what has Israel done recently?” and remember that US police train with them.
Bottom has infinite density and will collapse into a black hole, killing everyone and destroying the tram and lever.


I live in a walkable European city.
My nearest library is 5 minutes away, there’s a bigger library maybe 20 minutes away, and for anything further I’d take public transport.


Worse, it seems to be Tony Blair.
ID cards were one of Blair’s most unpopular policies, and all the AI-first stuff seems to be coming straight from the Tony Blair Institute.
Nah, you can actually see some of them developing AI psychosis.
https://medium.com/write-a-catalyst/this-prominent-vc-investor-just-had-a-chatgpt-induced-psychosis-on-twitter-heres-what-this-means-197ae5df77f4
You’ve got to understand that most AI execs aren’t technical people; they’re hype men. And LLMs are weirdly good at hype and the illusion of technical correctness. So they don’t have a problem with it.
Sam Altman saying he uses ChatGPT to tell him how to act with his baby is one of the things he’s said that I actually believe. Of course he’s also got a team of nannies he couldn’t be bothered to mention, but the trust in ChatGPT is there.