Indeed, these are experimental trials in a controlled environment... However, even if an AI-based combat system advances to a point where it becomes superior to human pilots, it still has to be certified. Usually AEH and SW are considered the items of certification. I'm keen to see how a fully autonomous, entirely AI-based combat system is going to be certified (by humans). Ultimately, someone needs to be responsible/accountable.
Quite, the issue is "define combat". Combat, or rather the role a human pilot performs, is such a wide span of tasks, and yet these trials have to telescope their focus onto very, very narrowly defined tasks to even be comparable.
Being neck deep in sw certification, it is very apparent that the sheer scale of sw required is beyond any level of resources. Now if AI writes its own sw, that fixes that, but as you say, how is that certificated? Nobody is going to back off certification requirements short of existential war, and indeed that burden gets higher every year.
Human pilots are more expensive in terms of time and money to produce
Compared to developing AI that cannot yet fly a GA aircraft iaw the ‘see and avoid’ of the average ‘puddle jumping’ pilot?
Compared to the hordes of other people that make up the pyramid of the air force?
Could we save far more money investing in AI to automate some of their functions? A sqn has what, 30 people on it for every pilot, and that's just the sqn, let alone the rest of the pyramid.
I think aircrew account for 2% of the RAF. (Yet 98% of its leaders…!)
as well as weight in the aircraft. Increasingly, they also likely have longer reaction times compared to an AI.
Likely? Evidence being? Piloting is not predicated on "reaction times" but on decision making: balancing a vast array of factors with very changeable priorities and doing the right thing. Not a strong point of software. I'm deep in the engineering aspects of teams writing sw to deliver the functionality of complex defence avionics, and it is taking millions and years to deliver, not very well, limited functionality that is actually quite repeatable and process-based, let alone deal with the variables of flying / combat air decision making. In a previous existence I was in the back of the jets making decisions on the fly; there I wrote sw to aid the process part of it, but that took a lot of effort and again was very limited vs the scope of me/the pilot as decision maker.
There is a huge post in "why are we trying to replace pilots?" that I've not got time to write now, although given they're the most annoying twats on the planet I'm definitely up for it - I just don't see it as realistic or, sadly, desirable.
But being able to create lots of cheap planes that can fight a conflict as an attritional battle rather than a battle of survival is going to enable a lot of tactics that would probably not be effective with human pilots.
These cheap planes have kinematics, LO and avionics akin to piloted ones, right? How does that make them cheap? If they don't, they just die to something that is better.
On a large high-end fighter, the pilot isn't really imposing that much of a constraint. The constraints are sensors/antennas, integration (software) and engine oil.
In the short term the USAF seems to intend to use them primarily as off-board sensor and weapon platforms under human control. But I think in the 2030s we will see them sent on missions alone, perhaps with a satellite link to ground controllers on a specialized gateway aircraft.
That is extraordinarily optimistic. I think we'll see the same platforms as today in the 2030s and likely the 2040s. Getting FCAS/NGAD into service is going to make F35 look cheap and quick.
The end of manned aircraft was pronounced in 1957, viewed through a very narrow optic that seemed sensible then (to some!). This penchant for AI/unmanned is the same mistake.