Kat Tsun
- Joined
- 16 June 2013
- Messages
- 1,200
- Reaction score
- 1,531
We're also reaching peak transistors, so computers will likely stagnate over the coming decades, much like internal combustion engines did in the '80s. I suspect a decade from now GANs will still be producing incoherent, semi-dreamlike ramblings rather than coherent sentences. Frightening for horoscope writers but not much else.

There are still orders of magnitude of speedup possible by converting hardware from general purpose to single purpose, even without new scientific development; it just takes more engineering work.

In addition, you're not taking physics, quantum mechanics, and reality into account here. We're already hitting the top end of what solid-state electronics can do, and the most likely prospect for AGIs is quantum computing, which is barely in its infancy right now.
All the thorough, working knowledge of low-level programming that would allow for this is long, long gone, though, and likely never coming back. The days of needing to count bits and manage memory by hand, like on a C64, have been dead and over for at least the past 10 years, if not closer to 20. Modern programmers broadly know only JavaScript, C++, and Python, and this is a problem for writing things that can take full advantage of existing computational architecture. If only because there aren't many new, young people who are well versed in it, this seems very unlikely to occur in general. There's also an expectation that any efficiency problem can be overcome with more muscle, which is not going to hold in the future: we're reaching peak transistor size, and things like qubits and photonic computing are either turning out to be blind alleys or perennially 20 years away, respectively.
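For what "counting bits" actually looked like as a skill: a minimal sketch of Kernighan's classic trick, the sort of bit-level idiom that assembly-era programmers carried around in their heads. The example is illustrative, not tied to any particular platform.

```python
# Kernighan's population count: x & (x - 1) clears the lowest set bit,
# so the loop runs once per SET bit rather than once per bit position.
# This kind of micro-optimization is exactly the lost habit the post
# is lamenting.

def popcount(x):
    """Count the set bits in a non-negative integer."""
    count = 0
    while x:
        x &= x - 1  # clear the lowest set bit
        count += 1
    return count

print(popcount(0b10110))  # 3
```

(Modern Python hides this entirely behind `int.bit_count()`, which rather proves the point about abstraction eating the underlying skill.)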
It might be the case for the odd supercomputer, I guess, at least until the programmer(s) retire, but not for general use. Stuff like compiler creation is still just running off the inertia of slowly aging, close-to-retirement programmers, thanks to the past generation (or two) neglecting very-low-level programming, and about the only place Assembly has left is coding things like street lights and other embedded systems, at least sometimes. There are now Windows Embedded editions, after all, so it's possible street lights in the future might run off Windows or something.
This is caused, ironically, by the increasing availability of computing resources, which allows for the use of large libraries with relatively little (but in absolute terms large) overhead and lets computers simply eat software bloat. The lack of education and the dearth of widespread low-level programming knowledge are another reason computing power is likely to stagnate in the coming decades: people will have neither sufficient ability, nor sufficient availability of teachers, to take full advantage of existing powerful CPUs and whatnot. Even if there are occasional people in every generation who take an autonomous interest in the subject (I recently saw a video of a guy who reverse-engineered a PSOne programming architecture to make a simple tech demo, for instance), they will not be formally recognized for any sort of military purposes unless they happen to become interested in that subject, I suppose, so it's very much reactive.
In other words, it's literally becoming lost knowledge that will have to be rediscovered, and it likely won't be. Which will vex "AI researchers" in some way, I'm sure. OTOH, people have developed algorithms that allow flocks of starlings to be simulated in virtual environments, so AI may just evolve toward being bird-like for small aircraft swarms. Really, the idea that you need decent low-level skills to emulate artificial intelligence in meaningful ways is itself highly dubious, since we were able to make pretty robust AI systems in video games on 15-20 year old hardware, using simple finite state machines.
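The finite-state-machine "AI" mentioned here really is that simple. A toy sketch of the sort of thing a 20-year-old game might run per guard, per frame (the state names and distance thresholds are made up for illustration):

```python
# A minimal game-AI finite state machine: one input (distance to the
# enemy), three states, and purely threshold-based transitions.

class GuardAI:
    """A guard that patrols, chases, and attacks based on enemy distance."""

    def __init__(self, sight_range=10.0, attack_range=2.0):
        self.state = "patrol"
        self.sight_range = sight_range
        self.attack_range = attack_range

    def update(self, enemy_distance):
        # Each tick: read one input, transition, report the new state.
        if enemy_distance <= self.attack_range:
            self.state = "attack"
        elif enemy_distance <= self.sight_range:
            self.state = "chase"
        else:
            self.state = "patrol"
        return self.state

guard = GuardAI()
print(guard.update(15.0))  # patrol
print(guard.update(5.0))   # chase
print(guard.update(1.0))   # attack
```

This runs in a handful of comparisons per tick, which is why it was practical on hardware from two decades ago.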
The "good" news is that AGI already exists: it's called a maid. We'll sooner see the return of indentured domestic servants than robots tbh. The "bad" news is that AGI, at least for robots, is unlikely to be possible at any point in history. It's a bit like nuclear fusion, except we know that nuclear fusion works, because the Sun and thermonuclear weapons do it. So it's not really like nuclear fusion at all.
The other good news, actually good rather than ironic, is that you don't actually need AGI for weapons guidance, obviously.
You just need to tell an autonomous weapon swarm to go attack or defend some particular object, and have it run through a massive library of images and ML-trained databases to identify armored vehicles and infantrymen. Perhaps it can be electrically powered and recharge through wireless power transmission, using a large transmitter near the object, so it can cycle indefinitely through a looping path without running out of juice? Something like a far-field rectenna fed by a microwave wide-area transmitter, hooked up to solar panels or a tiny nuclear reactor, for the purpose of powering and conveying orders to a defensive robotic swarm. Opening a box and letting the bugbots fly around would certainly be less annoying than placing Hornets and Spiders around an object. Maybe. Might be loud.
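As a back-of-envelope sanity check on the far-field rectenna idea, the Friis transmission equation gives the ideal free-space power at the receiver. Every number below is an assumption picked purely for illustration (a 10 kW transmitter, a 30 dBi dish, a 10 dBi rectenna, the 5.8 GHz ISM band, a drone 500 m out), not a design:

```python
# Friis transmission equation: P_r = P_t * G_t * G_r * (lambda / (4*pi*d))^2
# Ideal free-space link; real links lose more to pointing, polarization,
# and rectifier inefficiency.
import math

def friis_received_power(p_tx_w, gain_tx, gain_rx, freq_hz, dist_m):
    """Received power in watts for an ideal free-space link."""
    wavelength = 3.0e8 / freq_hz
    return p_tx_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * dist_m)) ** 2

# Assumed numbers: 10 kW tx, gain 1000 (30 dBi), gain 10 (10 dBi),
# 5.8 GHz, 500 m range.
p_rx = friis_received_power(10e3, 1000, 10, 5.8e9, 500)
print(f"{p_rx * 1e3:.1f} mW at the rectenna")
```

With these assumed numbers the drone collects only a few milliwatts, which is why the post's caveat about a *large* transmitter near the object matters: far-field power transfer over any real distance needs tight beams or short ranges.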
Presumably it can take off as a swarm intellect à la a flock of birds, school of fish, or swarm of ants, all highly robust logical systems that require only simple algorithms and can be replicated on ancient (20-year-old) hardware these days, and kill people you need dead if they get too close. There might also be an attack variant using thermoelectric panels or electrochromic cloth to disguise a field force as a harmless flock of sheep or cows, when in actuality it's a platoon or demi-company of ZTZ-34s or M25 Milleys with attendant infantrymen, but that's just normal weapons development.
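The "simple algorithms" claim holds up: classic boids-style flocking is just three local rules (cohesion, alignment, separation) summed into each agent's velocity. A toy 2D step, with made-up weights and no optimization, assuming at least two boids:

```python
# One tick of boids-style flocking. Each boid steers toward the others'
# centre of mass (cohesion), toward their mean velocity (alignment), and
# away from any boid that gets too close (separation). All weights are
# illustrative, not tuned values.

def flock_step(positions, velocities, dt=0.1,
               w_cohesion=0.01, w_align=0.05, w_sep=0.1, sep_dist=1.0):
    n = len(positions)
    new_vel = []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        # centre of mass and mean velocity of the OTHER boids
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        ax = sum(v[0] for j, v in enumerate(velocities) if j != i) / (n - 1)
        ay = sum(v[1] for j, v in enumerate(velocities) if j != i) / (n - 1)
        vx += w_cohesion * (cx - px) + w_align * (ax - vx)
        vy += w_cohesion * (cy - py) + w_align * (ay - vy)
        # separation: push away from any boid inside sep_dist
        for j in range(n):
            if j != i:
                dx, dy = px - positions[j][0], py - positions[j][1]
                if (dx * dx + dy * dy) ** 0.5 < sep_dist:
                    vx += w_sep * dx
                    vy += w_sep * dy
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

# Two stationary boids 10 units apart drift toward each other.
pos, vel = flock_step([(0.0, 0.0), (10.0, 0.0)], [(0.0, 0.0), (0.0, 0.0)])
print(pos)
```

Everything here is additions, multiplications, and a square root per neighbor pair, well within the budget of decades-old hardware even for large swarms.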