Bear Ribs
So you've gone from "AI will need decades to be able to make movies" to "AI doesn't make very good movies." So, like AI itself, progress.

The post being linked demonstrates incredible difficulty with stability in static scenes: the "planets" change almost completely between frames, across a total of thirty seconds of footage. The same problem crops up in Disturbed's Bad Man video and every other AI animation I've seen; they are all hellishly unstable, to the point of being nearly worthless for producing conventional content.
Except the AI Seinfeld, but that goes to such extreme lengths to reduce complexity that it's barely worth considering for end use, and it still has visible consistency errors, like a "box" with an edge that switches from concave to convex mid-scene. That's provided the "clipping" behavior and certain other artifacts aren't the result of using "dolls" to bypass much of the difficulty rather than direct AI generation of footage.
The difference is that we're running out of curve on silicon, and the current methodology has little potential for optimizing the problem at the scale in question, so continued progression is the domain of wholly new technology. Maybe quantum computing will finally leave the "coming soon!" stage it's been stuck in for over a decade, maybe dedicated-architecture magic will bridge the gap with existing manufacturing the way it warped cryptocurrency, maybe a new AI methodology will bypass the technical challenges entirely.
But because we physically can't brute-force it by throwing more transistors at it, it is uncertain. This is "Fusion is 10 years away!" thinking, the same we've heard for the last 50 years: that because visible progress is being made, it must be close to being sold to the end user. Maybe it's 5 years, maybe it's 10, maybe the technical challenges will keep cropping up for the next 50 years like they have for fusion.
The thing about shaving atoms is that computer architecture hasn't significantly changed since the days of Von Neumann in 1945. There hasn't been any need, because there have always been more atoms to shave, and making the numbers go up looks sexier on a business report. Even if we hit the edge of the silicon curve, which, like Peak Oil, I've heard we've been hitting for twenty years straight with no apparent loss of speed, there are massive, massive gains to be had by improving architecture. Improvements to chip architecture built specifically for AI systems began appearing in just the last year or two; this process is already underway.
Artificial Intelligence Is Driving A Silicon Renaissance
More innovation is happening in the semiconductor industry today than at any time since Silicon Valley's earliest days.
www.forbes.com
This isn't "Fusion in 10 years"; this is you telling us fusion will take decades while we're slapping a working primitive reactor into a prototype BattleMech right now. "But it's still primitive" is never a viable criticism of a technology that's still in the process of emerging.