Don’t look for statistical precision in analogies. That’s why it’s called an analogy, not a calculation.
No, this is the equivalent of writing off calculators if they required as much power as a city block. There are some applications for LLMs, but if they cost this much power, they’re doing far more harm than good.
I’ll take a stab at it.
“Researchers spend $X to see whether poison leaking into the ground gets into our water.”
Yeah, I don’t know why you’re getting so many downvotes. That is what Hitler’s party was called, so technically Poilievre isn’t lying. He’s still a dumbass for seriously stating that position and purposefully spreading confusion for his own benefit, but it’s technically not a lie.
Exactly this, and rightly so. The school’s administration has a moral and legal obligation to do what it can for the safety of its students, and allowing this to continue unchecked violates both of those obligations.
I find it difficult to lay the blame with VSCode when the terminology belongs to git, which (even 7 years ago) was an industry standard technology.
People using tools they don’t understand and plowing ahead through scary warnings will always encounter problems.
I mean, that’s easy enough if you’re trying to catch the ball with your face. Usually that’s not the goal, so you’ll be standing slightly to the side or the object is moving toward your stomach. ;)
Even then, that’s discounting the whole image-analysis part of the equation, which your brain does dozens of times per second with incredible accuracy. Your waste-bin example would still have to differentiate the ball from the background, and that definitely qualifies as a complex algorithm.
ETA: also, closing your hand at the right time does require your brain to know how close the object is, not just that you’ve positioned yourself in its path.
I worked on an industrial robot once, and we parked it such that the middle section of the arm was up above the robot and supposed to be level. I could tell from 50 feet away and a glance that it wasn’t, so we checked. It was off by literally 1 degree.
Degrees are bigger than we think, but also our eyes are incredible instruments.
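To put rough numbers on “degrees are bigger than we think”: a quick back-of-the-envelope check. The segment length here is my assumption (the original story doesn’t give one), but it shows why a 1-degree tilt is visible at a glance.

```python
import math

# Hypothetical numbers: how far does the end of a robot-arm segment
# drop if the joint is off by 1 degree? (Segment length is assumed.)
segment_length_m = 1.5   # assumed length of the "level" middle section
tilt_deg = 1.0           # the 1-degree error from the story

# Deviation of the far end from level: L * sin(theta)
deviation_m = segment_length_m * math.sin(math.radians(tilt_deg))
print(f"{deviation_m * 100:.1f} cm")  # ~2.6 cm of droop across the segment
```

A couple of centimetres of droop against a straight reference edge is well within what the eye can pick out, even from 50 feet away.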
Even if you couldn’t care less about the tastiness of the food entering your face hole, everyone should know enough about cooking to take care of themselves. I don’t care if you have or want a romantic partner; knowing how to feed yourself should be next on the list after being able to bathe and dress yourself as a fundamental life skill.
I agree that LIDAR or radar are better solutions than image recognition. I mean, that’s literally what those technologies are for.
But even then, that’s not enough. LIDAR/radar can’t help it identify its lane in inclement weather, drive well on gravel, and so on. This is the kind of problem where automakers severely downplay the difficulty, and just how much a human driver actually does.
You are making it far simpler than it actually is. Recognizing what a thing is is the essential first problem. Is that a child, a ball, a goose, a pothole, or a shadow that the cameras see? It would be absurd, and an absolute showstopper, if the car stopped for dark shadows.
We take for granted the vast amount that the human brain does in this problem space. The system has to identify and categorize what it’s seeing, otherwise it’s useless.
That leads to my actual opinion on the technology, which is that it’s going to be nearly impossible to have fully autonomous cars on roads as we know them. It’s fine if everything is normal, which is most of the time. But software can’t recognize and correctly react to the thousands of novel situations that can happen.
They should be automating trains instead. (Oh wait, we pretty much did that already.)
Absolutely terrifying… but thank you for the insight.
Yeah, the only way someone is dying in a furnace before feeling pain is if you’re dealing with molten-metal-type temperatures. Not a bakery oven. I’m sure this poor woman experienced excruciating pain for far too long.
Yeah, 100%. This is the town’s fault IMO - not maintaining the markings in the first place (it’s not the contractor’s fault that the old marking is non-existent), and then probably refusing to pay the contractor “extra” to repaint the whole thing.
I grew up in a small town in Canada. We never had any kind of lock down drills.
Even talking about it this way is misleading. An LLM doesn’t “guess” or “catch” anything, because it is not capable of comprehending the meaning of words. It’s a statistical sentence generator; no more, no less.
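A toy illustration of what “statistical sentence generator” means: the bigram sketch below picks each next word purely from observed frequencies. The training text is made up, and real LLMs are vastly more sophisticated models over tokens, but the core loop (sample the next token from a probability distribution, no comprehension involved) is the same idea.

```python
import random
from collections import defaultdict

# Tiny made-up "training corpus" for a bigram model.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# For each word, record every word observed to follow it.
next_words = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    next_words[a].append(b)

random.seed(0)
word = "the"
sentence = [word]
for _ in range(5):
    if word not in next_words:  # dead end: word only appeared last
        break
    word = random.choice(next_words[word])  # pure frequency sampling
    sentence.append(word)
print(" ".join(sentence))  # grammatical-looking output, zero understanding
```

The output often looks plausible precisely because it mimics the statistics of the input, which is the point being made above: fluency is not comprehension.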
Nobody going to mention “The Cask of Amontillado”? Maybe not the most mind-bending example, but the tale of leading a supposed friend to his own horrific murder was not a thing I expected to be reading in school.
I had blocked that one from my memory; I remember now. Thanks. ಠ_ಠ
This article and discussion are specifically about massively upscaling LLMs. Go follow the links and read OpenAI’s CEO literally proposing data centers that require multiple, dedicated grid-scale nuclear reactors.
I’m not sure what your definition of optimization and efficiency is, but that sure as heck does not fit mine.