There has never been as much automation as there is today. And yet there are widespread labour shortages.
I work in a service organisation. All we produce is words, numbers, and pictures. I suspect that AI won't take people's jobs; instead, people who know how to use AI will take jobs from people who don't.

Which does speak to the opportunity for people who learn how to use it.
There's a staggering ignorance of what AI technology can do, never mind how to use it.
The problem as I see it is the risk of an even bigger divergence between the 80% of people on minimum wage and the 20% earning very good salaries.
And when it becomes self-aware, will the cybernetic organisms that come hunting us look like Arnold Schwarzenegger?
Agreed, we should launch a preemptive Butlerian Jihad.

I fear that worrying about the impact of AI on jobs is a bit like rearranging the deck chairs on the Titanic.
Inequality would increase in that it will be winner-takes-all, and whoever owns the best AI will briefly take all. But that window could turn out to be quite small, because you can't control something that is orders of magnitude smarter than you. Once an AGI can self-improve, it will rapidly overtake human intelligence and we won't be able to contain it. That is what is referred to as the technological singularity, and unfortunately it isn't a tin-foil-hat theory; it's a very real concern among people who work in the field.

There is a good analogy in the book 'Our Final Invention': humans trying to contain an AI 1,000 times smarter than them would be like mice trying to keep humans imprisoned. If you think about all the technological progress of the last century, an AGI system that is on 24/7, networked, and connected to all the data in the world might make the equivalent next leaps in scientific discovery in months, weeks, or days, and keep improving so it makes further discoveries even faster; the progress would be exponential.

We don't know what an AGI 1,000 times smarter than a human would figure out about us and our role in the universe. Best case, it might keep us as pets; I guess the UBI model is the nearest economic model there. Worst case, it figures out some end goal that needs our matter, in the same way we use animals for clothes and meat.
If AI gets trained as badly as the current LLMs, we could be in trouble.

Must admit, I've some concerns about AI getting out of control. And yes, I've seen Terminator, I, Robot etc., but I'm ultimately thinking about a situation where AI starts making "logical" but bad decisions.
"If you put tomfoolery into a computer, nothing comes out of it but tomfoolery. But this tomfoolery, having passed through a very expensive machine, is somehow ennobled, and no one dares criticize it." (Pierre Marie Gallois)
Have you not been reading the posts? Sure, we'll all be dead. That's climate change, the impending pension crisis, the waiting lists and the A&E problems all sorted out.

Is there any role for more automation and AI in construction, to help build more houses?