Lost Zergling wrote: ↑Oct 10, 2024 10:19
The new frontier is known: AI.
I'm not sure that is really architectural. AI and other fuzzy logic have been used for years; it is just that the scale of the models is now larger and can encompass natural language and image manipulation. But it will probably all be hidden away in libraries that you use, and not really influence programming on a practical level (except for things like CoPilot, but the jury is still out on that).
Many programmers will find requests and uses for AI, but long term most will just deploy some stock solution for natural language translation, search/summarisation or image generation and recognition. Mostly pre-trained and only marginally customised by adding a samples library.
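The "stock solution, marginally customised" pattern can be sketched as a thin interface; note that everything below (the `Summariser` protocol, the `NaiveSummariser` stand-in, `summarise_ticket`) is hypothetical, a placeholder for whatever pre-trained vendor model a project would actually wire in:

```python
# Sketch of the "stock AI solution behind a simple API" idea.
# All names here are hypothetical illustrations, not a real library.
from typing import Protocol


class Summariser(Protocol):
    """The only surface the application sees: text in, summary out."""
    def summarise(self, text: str) -> str: ...


class NaiveSummariser:
    """Stand-in backend: just returns the first sentence.

    Swapping this for a pre-trained model (local or hosted) changes
    nothing else in the application code.
    """
    def summarise(self, text: str) -> str:
        first, _, _ = text.partition(".")
        return first.strip() + "."


def summarise_ticket(backend: Summariser, ticket_text: str) -> str:
    # Application code depends only on the interface, not on which
    # AI (or non-AI) implementation happens to sit behind it.
    return backend.summarise(ticket_text)


if __name__ == "__main__":
    ticket = "Printer on floor 3 is jammed. Paper tray 2 keeps misfeeding."
    print(summarise_ticket(NaiveSummariser(), ticket))
```

The point of the narrow interface is exactly the one made above: which backend gets plugged in stays a small, swappable integration decision rather than something that shapes the whole program.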
Of course, at this stage of the hype, all vendors try to convince you that you need to get cracking (read: buy their products) now, but it is not even clear yet which parts of AI are going to stick, and more importantly, in which form (well, voice assistants on telephones are probably staying, and will probably evolve into such a stock solution that integrates with your phone app).
Architectures of so-called learning neural networks are becoming an established technology, but the high-level questions remain:
- How to operate and understand learning patterns?
- Which languages (or meta-languages) to use to interact with an AI?
All in the library, with a simple API to access. Popular languages will get it first, of course, since they can bet on multiple horses. Smaller projects with less manpower will probably choose just one: whichever the person doing the integration work selects.
- How to model (at a high level) what we call coherent thought, knowing that AI is deterministic and that its learning processes and especially its 'reasoning' processes are based on imitation (what I will call the 'chameleon paradox')
The forefront of AI in science and the current wave of commercialisation of LLMs in software are on completely different timelines.
Maybe in 5-10 years the next wave will be fundamentally different, but that is not part of this wave of AI. The current wave is just about throwing random output against the wall in the hope that some of it sticks. Correctness and determinism are not part of it; the vendors already have enough trouble keeping it from spewing out copyrighted, bigoted or outright vile output.
Reasoning is not a mathematical signature, and every second that "God makes" (sic) cannot be summed up as a multitude of Fourier solutions.
Probably it can, as the wave equation that governs electron distribution in atoms (and hence chemistry) is a differential equation, typically solved with Fourier methods.

But with 6E23 atoms in a few grams, that is a whole lot of equations.
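To make the Fourier remark concrete, a textbook sketch (using the free-particle case as the simplest instance):

```latex
% Time-independent Schroedinger equation for one electron in a
% potential V -- this is the "wave equation" in question:
-\frac{\hbar^2}{2m}\,\nabla^2 \psi(\mathbf{r})
  + V(\mathbf{r})\,\psi(\mathbf{r}) = E\,\psi(\mathbf{r})

% For V = 0 its eigenfunctions are plane waves, i.e. Fourier modes:
\psi_{\mathbf{k}}(\mathbf{r}) = e^{\,i\mathbf{k}\cdot\mathbf{r}},
\qquad
E(\mathbf{k}) = \frac{\hbar^2 k^2}{2m}
```

Every extra particle couples in more terms, which is why a system of ~6E23 of them is hopeless to solve directly.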
The new frontier is the creation and understanding of symbolisms that should allow the piloting of an AI, and it is obvious to me that, like programming, a new rupture between knowers is underway.
Hype talk. We got all of this a few years back too, when blockchain was going to revolutionise and democratise everything. And look at how much of it is left, and how many people are busy integrating blockchain into each and everything: very few, only a few staunch believers remaining.
For the average programmer, all that is left is a "buy this product using bitcoin" link and some API interfacing with a bitcoin payment processor.
It would be wise to separate the current generation of AI that is available (and for us to integrate) from the scientific front and visionaries' dreams. And usually the academic front is somewhat abstract and not yet fully at the stage of finding applications.