

If we had AGI, the number of jobs that would be at risk would be enormous. But these LLMs aren’t it.
They are language models, and until someone can replace that second L with Logic, no amount of layering is going to get us there.
Those layers are basically all the previous AI techniques laid over the top of an LLM, but anyone with a basic understanding of languages can tell you how illogical languages are.
The problem is that these people don't listen to health experts, yet for whatever reason they listened to this one. Clearly it's not about his position of power but about whom they choose to listen to or ignore.