OP is absolutely mistaken that it’s somehow ableist to hold people to a meeting start time or impose similar “punishments” for lateness, and t3rmit3 has explained why much more eloquently than I could. However, you’ve said something that I can’t let pass without a rebuttal.
perpetual lateness means someone values their time more than they do the commitment and the time of others. period.
[…]
perpetual lateness, though, is a statement, that individual could not give a shit what others needs and responsibilities are
This is making a moral judgment about what you believe is in someone’s mind, and that judgment rests on a false premise. There exists an extremely common mental disorder (so common that some might consider it a form of neurodivergence) that, when left untreated, makes it much harder to do the things you want and are obligated to do. It’s harder to start doing things, it’s harder to stop, it’s harder to focus yet too easy to focus, it’s harder to remember important things, and it’s harder to motivate yourself to do anything you aren’t already doing at any given moment. Anything you have to put effort into motivating yourself to do leaves you with less mental energy for everything else in that category.
The one thing that can usually overcome all of these mental blocks is panic - when you’re actually out of time and Consequences are approaching if you don’t do something RIGHT NOW, you can finally do what you need to do and get something done - later than you wanted, worse than you wanted, more mentally drained, and with plenty of reasons to beat yourself up over it, not that it helps if you do. This is why most people who show up perpetually late do so. They might not let the emotional turmoil show, but if they’re consistently a few minutes late for everything, I can just about promise it’s not because they don’t care.
People who have this disorder and receive prescription medication for it often describe the first dose as like receiving superpowers. The idea that they can decide they want to do something, and then just go do it? Without thinking about it? No buildup? No psyching themselves into it? No roundabout coping strategies? No reorganizing the entire structure of their life to make it happen? No bargaining with the goddamn monkey in their brain that almost never lets them do the rational thing? Wait, normal people don’t have the monkey? They live like this every day, without any expensive pills? Impossible. It couldn’t be that simple. Do they have any idea how lucky they are?
Your misplaced sense of moral superiority is unfortunately quite common, but it’s not going to help these people; it’s going to hurt them. If it’s affecting their life, and it often is, they need treatment and training in how their brain works, not to be told they’re a piece of shit who doesn’t care about others and is choosing to inconvenience everyone else in their life, including themselves. That’s only going to put them in a worse place.
You’re entirely correct, but in theory they could give it a pretty good go; it would just require a lot more computation, developer time, and non-LLM data structures than these companies are willing to spend money on. For any single query, they’d have to gather dozens if not hundreds of separate responses from additional LLM instances spun up on the side, many of them customized for specific subjects, as well as from specialty engines such as Wolfram Alpha for anything that directly requires math.
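To make that concrete, here’s a rough, purely illustrative sketch of what such a fan-out might look like. None of the names below (call_llm, query_wolfram_alpha, looks_like_math) correspond to any real API; they’re placeholders for whatever model endpoints and specialty engines an actual system would wire in.

```python
# Purely illustrative fan-out sketch. call_llm(), query_wolfram_alpha(), and
# looks_like_math() are hypothetical placeholders, not real library calls.
from concurrent.futures import ThreadPoolExecutor

SUBJECT_PROMPTS = {
    "math": "You are a careful mathematics checker.",
    "history": "You are a history fact-checker.",
    "code": "You are a code reviewer.",
}

def call_llm(system_prompt: str, query: str) -> str:
    # Placeholder: a real system would call some model endpoint here.
    return f"[{system_prompt}] draft answer to: {query}"

def query_wolfram_alpha(expression: str) -> str:
    # Placeholder: anything that is actually math goes to a specialty engine.
    return f"computed result for: {expression}"

def looks_like_math(query: str) -> bool:
    # Placeholder heuristic; real routing would be far more involved.
    return any(ch in query for ch in "+-*/=^")

def gather_candidates(query: str) -> list[str]:
    # Fan the query out to many specialized instances in parallel, then hand
    # the pile of candidate responses to later stages of the pipeline.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(call_llm, prompt, query)
                   for prompt in SUBJECT_PROMPTS.values()]
        candidates = [f.result() for f in futures]
    if looks_like_math(query):
        candidates.append(query_wolfram_alpha(query))
    return candidates
```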
LLMs in such a system would be used only as modules in a handcrafted algorithm, each doing exactly the thing it’s good at in a way that’s actually useful. To give an example: if you pass a specific context to an LLM with the right format of instructions and then ask it a yes-or-no question, even very small, lightweight models often give the same answer a human would. In this way, human-readable text can be converted into binary switches for an algorithmic state machine with thousands of branches of pre-written logic.
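A minimal sketch of that idea, assuming a hypothetical ask_llm() wrapper around whatever small model is available (the routing questions and queue names are made up for illustration):

```python
# Minimal sketch of the "LLM as a binary switch" idea. ask_llm() is a
# hypothetical stand-in for a call to whatever small model is available.
def ask_llm(context: str, question: str) -> str:
    # Placeholder: a real call would instruct the model to answer only "yes" or "no".
    return "yes"

def llm_switch(context: str, question: str) -> bool:
    # Reduce free-form text to a boolean the surrounding algorithm can branch on.
    return ask_llm(context, question).strip().lower().startswith("yes")

def route_ticket(ticket_text: str) -> str:
    # Hand-written branching logic; the LLM only flips the switches.
    if llm_switch(ticket_text, "Is the user reporting a billing problem?"):
        if llm_switch(ticket_text, "Is the user asking for a refund?"):
            return "refunds_queue"
        return "billing_queue"
    if llm_switch(ticket_text, "Is this a technical bug report?"):
        return "engineering_queue"
    return "general_support_queue"
```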
Not only would this probably use an even more insane amount of electricity than the current approach of “build a huge LLM and let it handle everything directly”, it would also take much longer to generate responses to novel queries.