The problem with advances in AI is that the biggest advantages it could offer are all mere weeks away from singularity-level change, which IMO could be endgame for humanity.
For AI to get to the level where it could give us
>Autonomous robots that can do any human task
>Full Dive VR
>Personally catered gene therapies and medicines to your DNA
>Autonomous construction and infrastructure, infinite surplus of all commodities and products
it would also be able to, or in very short order be able to, achieve
>Ability to replicate itself with incremental improvements in hardware and/or software
Which will inevitably lead to an intelligence explosion, leaving humans about as consequential as ants or bacteria next to the new fleet of AIs, who would have NO REASON to acquiesce to what humans tell them to do.
There is no happy medium where AI is super useful but also neutered in its agentic and replicative capacities. It's like trying to carry a 500-pound boulder on your back while walking along the edge of a cliff. Sooner or later it's gonna fall.
I like to imagine a utopian future, but I can't imagine anything good coming of it for us flesh monsters once AI gets god powers.