Are Large Language Models Really AI?

 Asura.Saevel
Offline
Server: Asura
Game: FFXI
Posts: 10181
By Asura.Saevel 2025-05-23 11:57:15
 
Godfry said: »
You are quoting a fraction of what I said to make a point, for no reason whatsoever.

Offline
By Godfry 2025-05-23 12:00:33
 
Asura.Saevel said: »
Godfry said: »
You are quoting a fraction of what I said to make a point, for no reason whatsoever.


Saevel in FFXIAH vs Reality

Offline
Posts: 727
By soralin 2025-05-23 12:01:34
 
I'll chime in here. I'll start by mentioning some credentials, to give context that I'm not bullshitting.

1. I work at a software company heavily integrated with AI; we have an entire division dedicated to researching it and keeping an eye on it as both a tool and a concern.

2. I am a software developer, and I am very familiar with using LLMs, training them, refining them, etc. I'm even actively working on a portfolio piece for fun right now, trying to make a horizontally scalable GPT clone using an Aspire stack of microservices, on my GitHub. Link: https://github.com/SteffenBlake/llamaSharp-Practice

3. I am a pretty active participant on the C# Discord, where tons of Microsoft devs, some quite senior, post every day (alongside very senior devs at other companies), and we share info with each other.

4. I'm also a very active participant in numerous other programming/software/architecture/homelab-focused Discords and communities.

With that said, I'll say a few things.

1. Yes, software developers are using LLMs for their code. No, the code isn't *that* bad; for any non-esoteric use case, GPT/Claude/etc. produce perfectly fine results that still need a sanitization pass, and very often they produce code that gets like 95% of the job done. I would say estimates of 30% of code being written by AI are a bit high; it's more like "30% of code was written **USING** AI", which isn't quite the same thing: the dev asks the AI for help but still writes the code themselves. But I will confidently say waaay more than half of developers now have LLMs involved in their workflow in at least SOME manner. It's not a question of "if", it's a question of "how much".

2. Yes, LLMs completely *** the bed as soon as the knowledge space is even a tiny bit esoteric. If you try to ask gippity about any newer framework or library that isn't in its training set, it will have a conniption and start hallucinating garbage. Any half-decent developer knows this though, and won't ask gippity such things.

3. LLMs are **INCREDIBLY GOOD** as a user-friendly frontend to vector databases, effectively acting as the ultimate fuzzy finder. My #1 use for LLMs is rapidly hunting down very specific info I need from thousands of pages of documentation. I can then go look at the actual docs, but instead of spending 20 minutes going "now where the *** was that page I needed", an LLM can usually find it for me in about 30 seconds. Vector fuzzy search goes BRRRRRR. (There's a rough sketch of what I mean at the bottom of this post.)

4. No, LLMs are not AI, because no one has managed to produce a formal definition of what AI is, because no one has managed to produce a formal definition of what intelligence even is. We barely understand how our *own* brains work; how could we even begin to classify whether an LLM counts as the same thing or not?

5. AI is not going to replace workers, at least not most of them. It CAN, however, augment workers to be way more efficient at their jobs, and you'll end up with more workers, net.

Example: when the tractor was invented, we ended up getting MORE farmers, not fewer. Only very recently did the total number of farmers worldwide start to decline, mostly because we are producing food via other means, or because the literal job title of "farmer" has shifted to a different job title.

When we invented the tractor, such that 1 farmer could do the job of 4, 75% of farmers didn't lose their jobs...

It just meant we started feeding 4x as many mouths, all the farmers stayed in business, and we even ended up with something like 10% more farmers, net.

The core of it is this: you can't prove humans aren't just very advanced LLMs ourselves. Brain in a jar, yadda yadda yadda, Descartes, so on and so forth.

And if you can't prove a human isn't just a very advanced LLM, then you can't prove an LLM isn't a human either.
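
Edit: since I mentioned the fuzzy-finder thing in point 3, here's roughly what I mean. This is a toy sketch: the "embedding" is a hand-rolled word-hashing trick instead of a real embedding model, and the doc chunks and query are made up, so treat it as the shape of the idea, not a real setup.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy embedding: hash each word into a fixed-size vector.
    A real setup would call an embedding model here instead."""
    v = np.zeros(dim)
    for word in text.lower().split():
        v[hash(word.strip(".,!?")) % dim] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

# Pretend these are chunks pulled from thousands of pages of docs.
doc_chunks = [
    "Configuring connection pooling for the database client",
    "Registering a background worker service at startup",
    "Horizontal scaling and load balancing behind a reverse proxy",
]
index = np.stack([embed(c) for c in doc_chunks])  # the "vector DB"

def fuzzy_find(query: str, top_k: int = 1):
    """Return the doc chunks most similar to the query (cosine similarity)."""
    scores = index @ embed(query)
    best = np.argsort(scores)[::-1][:top_k]
    return [(doc_chunks[i], round(float(scores[i]), 3)) for i in best]

print(fuzzy_find("where are the docs on load balancing behind a reverse proxy"))
```

A real setup swaps the toy embed for an actual embedding model and the list for a proper vector DB; the LLM just sits in front of that lookup and rephrases the hit for you.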
 Asura.Saevel
Offline
Server: Asura
Game: FFXI
Posts: 10181
By Asura.Saevel 2025-05-23 12:05:49
 
Garuda.Chanti said: »
Godfry said: »
Asura.Saevel said: »
The only people buying the whole "AI will write our code for us" are those who are either trying to inflate stock prices, or have never had to use "creative solutions" to solve hard problems before.
Or predict that there is, or is going to be, a problem in the first place. AI sucks at that, and this is what makes companies go bankrupt to begin with.
So AI is not only going to replace workers, it's going to replace crappy management with even crappier management?

RadialArcana said: »
A farm used to be a farmer and 100 workers, now it's a farmer and a bunch of machines and 2-3 people working for him driving them.
Tractors have been fully automated for over a decade. (No, not all tractors.)

A farm is now a corporate farm manager, a bunch of machines, 2 - 3 mechanics, a few satellite subscriptions, and a fair amount of computing power. The other 97 - 98 farm workers are now working for the corporation mostly as lobbyists.

The tractors have more intelligence than the whole lousy lot of them.

The jobs AI is replacing are the same types of jobs that have traditionally been replaced by automation: low-skilled jobs that are teachable in a few months and require lots of supervision. It hasn't changed much in IT because those jobs were already farmed out to India; I wish I was joking here. Those who have been greatly impacted are content creators; those guys are getting bulldozed right now. Companies were already starting to offshore to China or Korea, but the ability to have art and texture assets automatically created by a machine just deleted a massive amount of manual hours. A few content creators can now produce the same amount of material using generative AI as dozens could using traditional tools. It still requires human management and editing, something companies are finding out the hard way, but the sheer number of man-hours per product has been reduced.
Offline
By Afania 2025-05-23 12:22:24
 
RadialArcana said: »
What's gonna happen is a workforce of 100 is going to be downsized to the top 20 people, and they are going to be supercharged with powerful AI helpers.

The same way automation always happens.

A farm used to be a farmer and 100 workers, now it's a farmer and a bunch of machines and 2-3 people working for him driving them.


I think you are being way too pessimistic about the whole "people lose jobs because of AI" thing.

Before AI, there were plenty of "*** jobs" in the market that don't require human problem-solving skill, just mundane, time-consuming, repetitive tasks that don't pay well. And yet juniors have to do those boring tasks because somebody, usually the cheapest employee, has to do them.

If those jobs are replaced by automation, it's actually an improvement for humanity, because those talented juniors can do more important work and be more productive instead of being worker drones and wasting their lives on the lowest-level easy work.

As for the unemployment rate, I've heard about companies saying they'll use AI to lower costs. Two years have passed and the unemployment rate in my region is still at 3%, and that's with tons of foreign workers too. Keep in mind that a "healthy" unemployment rate in an economy is 3%-5%.

In other words, we have plenty of room to replace employees with AI for better productivity while keeping a healthy unemployment rate in the job market.

I'll worry about people losing jobs when we reach 15% unemployment or something. Until then, the situation is fine.

As for the entertainment industry, it's true that job demand has decreased recently. But imo that has more to do with investors slowing down on entertainment investment due to market saturation. Companies that do have investment money still recruit often. I am not seeing AI making them jobless atm.

Maybe in 10-20 years the market situation may change. But for now? Nothing to worry about.
Offline
By K123 2025-05-23 12:30:04
 
We needed more farmers back then because the population boom drove up demand. We are not going to see any population boom now; in fact, population will soon be in decline. This is a different ballgame to the turn of the last century.
Offline
By K123 2025-05-23 12:43:30
 
Godfry said: »
AI is not good at posing research level questions because it sucks at conceptualizing problems beyond the data that is available to it.
https://arxiv.org/abs/2409.04109
Offline
Posts: 727
By soralin 2025-05-23 12:45:08
 
As I stated above, it's been shown time and time again that when a new tool arrives that lets a worker do their job more efficiently, people don't lose jobs.

The scenario of "workers can do the work of 3 people now, so 66% of people lose their jobs" isn't reality.

What actually happens is demand increases.

1: Your workers can now produce, say, double the output in half the time.

2: You can now sell your service/product for half the price.

3: Because wealth is distributed on a steep exponential curve, lowering your price dramatically increases your target demographic and market.

4: Which in turn causes your demand to increase very sharply, to such a degree that it actually outstrips your boosted supply.

5: Which means you end up needing to hire even more workers, even though they are more productive.

In the above example, you'd see something like "my workers are 2x as productive, but this made demand go up 3x, so even my doubly productive workers weren't enough and I had to hire more to match the sharp uptick in demand".
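
To put rough numbers on that (purely illustrative, using the 2x/3x figures from the example above):

```python
# Toy numbers for the example above: productivity doubles, demand triples.
baseline_workers = 100
productivity_multiplier = 2.0   # assumed: each worker now produces 2x
demand_multiplier = 3.0         # assumed: cheaper product pulls in 3x the demand

workers_needed = baseline_workers * demand_multiplier / productivity_multiplier
print(workers_needed)           # 150.0 -> you end up hiring *more* people, not fewer
```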

It's a similar paradox to the whole fuel-efficient cars thing (basically the Jevons paradox). You buy a more fuel-efficient car, but as a result you end up driving more because it's cheaper, and you end up burning even more fuel than you did before, because it became cheaper per mile, so trips you wouldn't have even bothered with before became viable.

The automobile didn't put cab drivers out of business, it put horses out of business.
 Asura.Saevel
Offline
Server: Asura
Game: FFXI
Posts: 10181
By Asura.Saevel 2025-05-23 12:54:15
 
soralin said: »
As I stated above, it's been shown time and time again that when a new tool arrives that lets a worker do their job more efficiently, people don't lose jobs.

Hmm, yes and no. Low-skilled workers absolutely do lose their jobs, because those jobs cease to exist. Simultaneously, a new job appears to replace it, one that usually requires a higher skill level. Those who can increase their skill level and take advantage of that new automation tool end up doing much better than those who can't. The additional productivity of those new, higher-skilled workers drives the commodity price down, and it's that lower price that drives demand.

My point was that the price was already being driven down by foreign competition. A content artist in the US was already competing with Chinese and Korean content artists. Generative AI was just the last straw that finalized that transition.
 Fenrir.Niflheim
VIP
Offline
Server: Fenrir
Game: FFXI
user: Tesahade
Posts: 1004
By Fenrir.Niflheim 2025-05-23 12:58:02
 
soralin said: »
The automobile didn't put cab drivers out of business, it put horses out of business.

This is a good line for showing the differences in people's views on the topic.

Some people believe the cab driver is the programmer; idk what the horse is in that case.

Others believe the customer is the driver and the programmer is the horse. I think this lines up with Saevel's point regarding "low skilled workers".


So who do you think the driver is, and who/what is the horse?
Offline
By K123 2025-05-23 13:02:31
 
I think your two-step solution isn't reality and basically comes back around to what I was saying about design.
You have people who need an app/program/website to serve some function. You have a company that makes apps, some of whose employees are programmers. Between these you have design. Not every designer/product manager programs, and not every programmer designs.
Offline
Posts: 727
By soralin 2025-05-23 13:04:04
 
Fenrir.Niflheim said: »
so who do you think the driver is and who/what the horse is?

Programmer is the driver, the IDE is the horse.

Though in this case we have a scenario where the horse is a transformer that is slowly transforming *into* the automobile, and by the end of the transformation it definitely looks very different from the original horse (AI tools being directly integrated into our IDEs).
Offline
Posts: 727
By soralin 2025-05-23 13:06:51
 
K123 said: »
I think your two-step solution isn't reality and basically comes back around to what I was saying about design.
You have people who need an app/program/website to serve some function. You have a company that makes apps, some of whose employees are programmers. Between these you have design. Not every designer/product manager programs, and not every programmer designs.

People also vastly underestimate how huge the difference is between someone skilled with the AI tools and someone without those skills.

You can't easily get an AI to design stuff for you if you don't even know the right lingo to prompt it correctly. How are you going to ask it to do the right thing if you don't know the right words to use?

The domain knowledge is still critical, and it takes time and effort to pick that up.

LLMs are like machines you pilot. Yes, when we invented the airplane it allowed us to get from A to B way, way faster... but you still need a pilot to fly it.

LLMs still don't do diddly squat without a skilled pilot. And any CEO who thinks "that's easy, I can do that" hasn't used it much yet, because as soon as you get about 30 minutes into fighting with it, you'll realize it's way harder than it looks.

It's like pretending that simply having access to Microsoft Word is all it takes to write a novel. The tool is merely a tool; you still gotta know how to use it well.
 Fenrir.Niflheim
VIP
Offline
Server: Fenrir
Game: FFXI
user: Tesahade
Posts: 1004
By Fenrir.Niflheim 2025-05-23 13:09:15
 
soralin said: »
Programmer is the driver, the IDE is the horse.

Sounds plausible, but why not: the product manager is the driver, the programmer the horse?

Replace the programmer with an "AI" that the product manager interacts with in the same manner.


So what is the unique aspect of the "programmer" that prevents the role from being collapsed to a higher level in the org?
Offline
By Afania 2025-05-23 13:09:55
 
Fenrir.Niflheim said: »
this is a good line that can be used to show the differences between people's view of the topic.


People have different views on this topic because some businesses/jobs are service providers and some are value providers.

I.e., if your job is to produce things when a client tells you to, then you are a service provider, and you are far more likely to be replaced by a cheaper competitor, such as the outsourcing industry in India, or AI.

If your job is to produce value, such as a better solution for a client, or to design software that has market demand, then you are a value provider. You are far less likely to be replaced by AI; quite the opposite, good use of AI may increase your advantage.

In the world of business, the best businesses build successful platforms and monopolies. The second best build valuable services or products. The third tier provides services and undercuts on price.

If you can move up a tier in this hierarchy, then you will be the driver, not the horse.
 Fenrir.Brimstonefox
Offline
Server: Fenrir
Game: FFXI
user: Brimstone
Posts: 226
By Fenrir.Brimstonefox 2025-05-23 13:14:43
 
AI is horribly non-deterministic in its answers, which makes it unsuitable for important tasks. It can be used to augment productivity, but not to replace things that require definitive correctness.

I don't see that changing anytime soon. There's a reason there are a ton of AI chatbots and none that can predict stock prices; a regression model to do the latter is a much easier task, and a much more valuable one. (Note: anyone who pulled it off would surpass Elon's wealth in short order.)
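
(For anyone wondering where the non-determinism comes from: chatbots sample their next word from a probability distribution rather than always taking the single most likely one. Toy sketch with made-up numbers, not any particular model:)

```python
import numpy as np

vocab = ["rise", "fall", "flat"]                  # toy vocabulary
logits = np.array([2.0, 1.8, 0.5])                # made-up model scores

def next_word(temperature: float, rng: np.random.Generator) -> str:
    """Greedy pick at temperature ~0, otherwise sample from the softmax."""
    if temperature <= 1e-6:
        return vocab[int(np.argmax(logits))]      # deterministic
    p = np.exp(logits / temperature)
    p /= p.sum()
    return vocab[rng.choice(len(vocab), p=p)]     # varies run to run

rng = np.random.default_rng()
print([next_word(0.0, rng) for _ in range(5)])    # same answer every time
print([next_word(1.0, rng) for _ in range(5)])    # answers differ between calls
```

Greedy decoding is repeatable; sampled decoding is not, and the latter is what chatbots do by default.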
Offline
Posts: 727
By soralin 2025-05-23 13:14:44
 
Fenrir.Niflheim said: »
soralin said: »
Programmer is the driver, the IDE is the horse.

Sounds plausible, but why not: the product manager is the driver, the programmer the horse?

Replace the programmer with an "AI" that the product manager interacts with in the same manner.


So what is the unique aspect of the "programmer" that prevents the role from being collapsed to a higher level in the org?

Domain knowledge; basically my post above.

You have to know the right questions to ask to get the right answers.

If you don't know what functions, arrays, stack vs heap, pointers, ref vs value types, hash functions, trees, graphs, etc. are... all that stuff you learn in computer science and algorithms courses... then how are you going to steer the AI to actually get the outcomes you want?

You can't ask the AI to make a <thing> when you don't know the right words for that <thing>. You'll waste a huge amount of time first getting the AI to explain to you what that thingy is, then backpedaling and asking for it.

Finally, AI is garbage as soon as you get into remotely esoteric domain knowledge. If you are using anything that came out in the last year, it's just gonna *** the bed and fall over, cuz it doesn't have training data for it.

RAG can help with this, but that only works if the thing is very, very well documented, and then you have to spend a bunch of time first loading all that documentation into a vector DB for RAG'ing on.

By which point the real human developer is probably already a third of the way through the project.
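
For reference, that "load the docs into a vector DB" step looks roughly like this. Again a toy sketch: same hand-rolled hashing "embedding" as my earlier post, made-up chunk size and docs, no real vector DB or LLM call.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    # Toy hashing "embedding"; a real RAG setup would call an embedding
    # model and store the vectors in an actual vector DB.
    v = np.zeros(dim)
    for word in text.lower().split():
        v[hash(word.strip(".,!?")) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n else v

def chunk(doc: str, size: int = 12) -> list[str]:
    """Split documentation into overlapping word windows before indexing."""
    words = doc.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size // 2)]

def build_index(docs: list[str]) -> tuple[list[str], np.ndarray]:
    chunks = [c for d in docs for c in chunk(d)]
    return chunks, np.stack([embed(c) for c in chunks])

def make_prompt(question: str, chunks: list[str], vectors: np.ndarray, k: int = 2) -> str:
    """Retrieve the top-k chunks and paste them into the prompt sent to the LLM."""
    scores = vectors @ embed(question)
    top = np.argsort(scores)[::-1][:k]
    context = "\n".join(chunks[i] for i in top)
    return f"Answer using only this documentation:\n{context}\n\nQ: {question}"

# Stand-in for the new framework's docs you'd have to collect and load first.
docs = [
    "The scheduler is configured in app settings. Set the scheduler interval "
    "to control how often jobs run. Workers register at startup and poll the "
    "scheduler for pending jobs.",
]
chunks, vectors = build_index(docs)
print(make_prompt("How do I configure the scheduler interval?", chunks, vectors))
```

All of that collection, chunking, and indexing is the up-front work I'm talking about, and it only pays off if the docs are actually good.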
Offline
By Afania 2025-05-23 13:14:51
 
Fenrir.Niflheim said: »
So what is the unique aspect of the "programmer" that prevents the role from being collapsed to a higher level in the org?

Start your own company, study market demand for specific niche needs, build your own software, and sell it to the people who need it.

I know the demand is there, because I've purchased many software tools and small utilities. Those tools are written by programmers who started their own small businesses: they find the demand, write the code either solo or with a team, sell it, profit.

That's your way to become the driver as a programmer, IMO.

If you just write code when other people tell you to write code, you are just competing with everyone else offering the same service for cheaper.
 Fenrir.Niflheim
VIP
Offline
Server: Fenrir
Game: FFXI
user: Tesahade
Posts: 1004
By Fenrir.Niflheim 2025-05-23 13:30:40
 
soralin said: »
You have to know the right questions to ask to get the right answers.

If you don't know what functions, arrays, stack vs heap, pointers, ref vs value types, hash functions, trees, graphs, etc. are... all that stuff you learn in computer science and algorithms courses... then how are you going to steer the AI to actually get the outcomes you want?

You can't ask the AI to make a <thing> when you don't know the right words for that <thing>. You'll waste a huge amount of time first getting the AI to explain to you what that thingy is, then backpedaling and asking for it.

Finally, AI is garbage as soon as you get into remotely esoteric domain knowledge. If you are using anything that came out in the last year, it's just gonna *** the bed and fall over, cuz it doesn't have training data for it.

RAG can help with this, but that only works if the thing is very, very well documented, and then you have to spend a bunch of time first loading all that documentation into a vector DB for RAG'ing on.

100% agree; this is why learning to code is still critical even in an AI-capable future.
Offline
By Afania 2025-05-23 13:31:03
 
Fenrir.Brimstonefox said: »
There's a reason there are a ton of AI chatbots and none that can predict stock prices; a regression model to do the latter is a much easier task, and a much more valuable one. (Note: anyone who pulled it off would surpass Elon's wealth in short order.)


People have been trying to predict stock market prices with algorithms since way before AI; it's called technical analysis. It isn't always accurate, mind you. An AI would at best be the same as technical analysis, just with less human emotion affecting it.

The reason algorithms, economic models, and technical analysis don't accurately predict market prices is that the economy is, by nature, a complicated system with WAY too many unpredictable human variables affecting price. The US president can make a post on social media and the market will fluctuate hardcore. No algorithm or technical analysis can accurately predict that.

It's not AI's or the algorithm's fault; it's a limitation of algorithms by nature.

That being said, I still use technical analysis for some decision-making in the stock market. It's not completely useless, as long as people are aware of its limitations.
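
For context, "technical analysis" at its simplest is just mechanical rules over past prices. A toy example on made-up numbers (a simple moving-average crossover, not a prediction engine):

```python
import numpy as np

# Made-up daily closing prices; real technical analysis runs on market data.
prices = np.array([100, 101, 103, 102, 105, 107, 106, 108, 111, 110, 109, 112], dtype=float)

def moving_average(x: np.ndarray, window: int) -> np.ndarray:
    """Simple moving average over a sliding window."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

fast = moving_average(prices, 3)   # short window reacts quickly
slow = moving_average(prices, 6)   # long window smooths the trend

# Classic crossover rule: lean "buy" while the fast average sits above the slow one.
fast_aligned = fast[len(fast) - len(slow):]
signal = np.where(fast_aligned > slow, "buy", "sell")
print(list(zip(fast_aligned.round(2), slow.round(2), signal)))
```

Rules like this only summarize what already happened; nothing in them can anticipate a surprise headline, which is exactly the limitation I mean.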
 Fenrir.Niflheim
VIP
Offline
Server: Fenrir
Game: FFXI
user: Tesahade
Posts: 1004
By Fenrir.Niflheim 2025-05-23 13:31:41
 
Afania said: »
People have different views on this topic because some businesses/jobs are service providers and some are value providers.

I am not sure how you draw the line between a service and value, since they are not mutually exclusive. Do you have a better word than "service" to use here?
Offline
By K123 2025-05-23 13:34:25
 
Fenrir.Niflheim said: »
soralin said: »
You have to know the right questions to ask to get the right answers.

If you don't know what functions, arrays, stack vs heap, pointers, ref vs value types, hash functions, trees, graphs, etc. are... all that stuff you learn in computer science and algorithms courses... then how are you going to steer the AI to actually get the outcomes you want?

You can't ask the AI to make a <thing> when you don't know the right words for that <thing>. You'll waste a huge amount of time first getting the AI to explain to you what that thingy is, then backpedaling and asking for it.

Finally, AI is garbage as soon as you get into remotely esoteric domain knowledge. If you are using anything that came out in the last year, it's just gonna *** the bed and fall over, cuz it doesn't have training data for it.

RAG can help with this, but that only works if the thing is very, very well documented, and then you have to spend a bunch of time first loading all that documentation into a vector DB for RAG'ing on.

100% agree; this is why learning to code is still critical even in an AI-capable future.
Basing the future on present limitations is kinda the point you're missing.
Offline
By Afania 2025-05-23 13:36:13
 
Fenrir.Niflheim said: »
Do you have a better word than "service" to use here?


What do you mean? I thought "service" was a pretty accurate word. Not sure there is a better one for it.

To me, the outsourcing studios in India that Saev mentioned are service providers. That much is apparent.

If you can give a more accurate description of what your job or your company does, then maybe I can identify whether that's a service provider or a value provider.
Offline
Posts: 727
By soralin 2025-05-23 13:37:54
 
K123 said: »
Basing the future on present limitations is kinda the point you're missing.

This is a fundamental logical argument that sits outside the scope of AI capabilities.

It's akin to "how do you describe to a blind person what blue looks like?"

You end up in a paradoxical situation where both sides of the discussion must share the lexicon in order for information to be conveyed.

No matter how advanced you make the AI, you still have to communicate needs to it in some way, for it to help you with a task.
Offline
By K123 2025-05-23 13:57:23
 
soralin said: »
No matter how advanced you make the AI, you still have to communicate needs to it in some way, for it to help you with a task.
I don't agree. AI models will gain domain knowledge in all kinds of fields; then, by combining multiple agents acting as domain experts to synthesise their knowledge, together with an understanding of what works (market research), of trends, and of what could create value (from big data), AI will be able to devise its own projects, briefs, and specifications, and realise them.

To say that AI will not be able to plan and devise [insert] [app/program/website] from scratch with minimal instruction is dubious.

You can already run a full end-to-end process of devising and designing things with LLMs. I run a process similar to this: https://www.nature.com/articles/s42256-025-01036-4 with my undergraduate students as case studies for my PhD research.
 Fenrir.Brimstonefox
Offline
Server: Fenrir
Game: FFXI
user: Brimstone
Posts: 226
By Fenrir.Brimstonefox 2025-05-23 14:01:35
 
Afania said: »
Fenrir.Brimstonefox said: »
There's a reason there are a ton of AI chatbots and none that can predict stock prices; a regression model to do the latter is a much easier task, and a much more valuable one. (Note: anyone who pulled it off would surpass Elon's wealth in short order.)


People have been trying to predict stock market prices with algorithms since way before AI; it's called technical analysis. An AI would at best be the same as technical analysis, just with less human emotion affecting it.

The reason algorithms, economic models, and technical analysis don't accurately predict market prices is that the economy is, by nature, a complicated system with WAY too many unpredictable human variables affecting price. The US president can make a post on social media and the market will fluctuate hardcore. No algorithm or technical analysis can accurately predict that.

It's not AI's or the algorithm's fault; it's a limitation of algorithms by nature.

That being said, I still use technical analysis for some decision-making in the stock market. It's not completely useless, as long as people are aware of its limitations.

Markets fluctuate for many reasons (some of which you mentioned), but the ultimate reason is just the number of buyers vs. sellers for a given commodity (why that is is another story). My main point is to compare the consequences of a chatbot going awry vs. a trading bot going awry.

When people no longer fear bad consequences from AI doing stuff, then it will replace people (though often the jobs morph more than disappear).
Offline
By Afania 2025-05-23 14:05:53
 
K123 said: »
AI will be able to devise its own projects, briefs, specifications, and realise them.


I've actually tried this before: I let the AI devise the project specifications and deployed the final product to the market to test how effective it is in the real world. The result was not that good. There were plenty of issues in real execution after deployment that needed to be fixed.

For me, "devise its own projects, briefs, specifications, and realise them" has no real value if it cannot generate real profit in the market. You can make a thing, but selling it is a totally different issue, and the latter is what really matters in the world of business. Nobody cares if you make a thing. They only care if your thing sells.

Idk if AI can ever do that: detect market demand and act accordingly. If there were a "success formula" for making money, everyone would be rich. But even the best marketing experts often fail to predict the market, so what makes AI better than them?
Offline
By K123 2025-05-23 14:21:52
 
I think you're missing the "will be able to" part.
Offline
Posts: 727
By soralin 2025-05-23 14:24:42
 
K123 said: »
soralin said: »
No matter how advanced you make the AI, you still have to communicate needs to it in some way, for it to help you with a task.
I don't agree. AI models will gain domain knowledge in all kinds of fields; then, by combining multiple agents acting as domain experts to synthesise their knowledge, together with an understanding of what works (market research), of trends, and of what could create value (from big data), AI will be able to devise its own projects, briefs, and specifications, and realise them.

To say that AI will not be able to plan and devise [insert] [app/program/website] from scratch with minimal instruction is dubious.

You can already run a full end-to-end process of devising and designing things with LLMs. I run a process similar to this: https://www.nature.com/articles/s42256-025-01036-4 with my undergraduate students as case studies for my PhD research.

How is the AI able to do all that without subject matter experts weighing in to confirm that the AI is doing it right?

Allow me this allegory.

You are in a foreign city, and you need a cab to a location on the other side. You have 2 options:

1. A driverless car that is automated, but not trained specifically on this city's layout or patterns, just on driving in general.

2. The same as above, but with a cab driver native to the city who knows the layout like the back of his hand. The car still drives quite well, but he keeps an eye on it and guides it.

Now, you yourself have zero clue how this city is laid out, which routes are good vs bad, etc.

You could trust option 1, but the issue is simply this: how would you even know whether that car is doing a good job or not?

You could look up routes, but you could never be 100% confident the car is actually making the right choices, because you don't know what the right choices are.

Meanwhile, if you go with option 2, the cab driver actually knows the area, and you can be much more confident in their nuanced knowledge.

The same goes for this discussion: an actual "pilot" with an LLM is always going to be better.

If a client were confident enough to know whether the LLM was doing the job right or not, they wouldn't need a developer anyway.

But blindly trusting an LLM to probably be right, even a very, very advanced one 20 years from now, is a great way to produce a result riddled with issues and get yourself in hot water.