A new frontier

General discussion for topics related to the FreeBASIC project or its community.
Lost Zergling
Posts: 601
Joined: Dec 02, 2011 22:51
Location: France

A new frontier

Post by Lost Zergling »

In 1963, a For...Next loop was "high level". In the 2000s, LotusScript introduced home-grown object programming (business-oriented, as befits Big Blue) with support for index manipulation. Then, from 2010 onward, other languages introduced new paradigms: intelligent garbage collectors and business-oriented data structures (Java), new instructions (Rust, etc.), scalability from business into the scientific domain (Python), and ultra-efficient iterators (the C++ family) for building data structures, custom garbage collectors, or interfacing with lower-level scientific libraries.
The new frontier is known: AI. Architectures based on so-called learning neural networks are becoming an established technology, but the high-level questions remain:
- How do we operate and understand learning patterns?
- Which languages (or meta-languages) should we use to interact with an AI?
- How do we model (at a high level) what we call coherent thought, knowing that AI is deterministic and that its learning processes, and especially its 'reasoning' processes, are based on imitation (what I will call the 'chameleon paradox')?
Reasoning is not a mathematical signature, and every second that "God makes" (sic) cannot be summed up as a multitude of Fourier solutions.
The new frontier, if it exists, would be the manipulation of high-level instructions to allow the modelling of pseudo-reasoning. I am not talking about building expert systems or mimetic AI (of the IBM Watson type), but about a language of symbol-oriented macro-commands ('combinable' algorithms?) that could influence the orientation of graphs (or whatever replaces them).
The new frontier is the creation and understanding of symbolisms that should allow the piloting of an AI, and it is obvious to me that, as with programming, a new divide between those in the know is underway.
If the spirit of Basic is to open computer science to the uninitiated, perhaps we should have an experimental low-level AI project on which to experiment with algorithmic grafts driven by an instruction set.
What we lack is the multidisciplinary team, the lab, the means and the right university. :shock: :? :roll:
Any takers? :?: :idea: :arrow:
marcov
Posts: 3490
Joined: Jun 16, 2005 9:45
Location: Netherlands
Contact:

Re: A new frontier

Post by marcov »

Lost Zergling wrote: Oct 10, 2024 10:19 The new frontier is known: AI.
I'm not sure it really is, architecturally. AI and other fuzzy logic have been used for years; it is just that the scale of the models is now larger and can encompass natural language and image manipulation. But it will probably all be hidden away in libraries that you use, and not really influence programming on a practical level (except for things like Copilot, but the jury is still out on that).

Many programmers will find requests and uses for AI, but long term most will just deploy some stock solution for natural language translation, search/summarisation, or image generation and recognition; mostly pre-trained and only marginally customised by adding a sample library.

Of course, at this stage of the hype, all vendors try to convince you that you need to get cracking now (read: buy their products), but it is not even clear yet which parts of AI are going to stick, and more importantly, in which form (well, voice assistants on telephones are probably staying, and will probably evolve into such a stock solution to integrate with your phone app).
Architectures based on so-called learning neural networks are becoming an established technology, but the high-level questions remain:
- How do we operate and understand learning patterns?
- Which languages (or meta-languages) should we use to interact with an AI?
All in the library, with a simple API to access it. Popular languages will get it first, of course, since they can bet on multiple horses. Smaller projects with less manpower will probably choose one: whichever one the person who does the integration work selects :D
- How do we model (at a high level) what we call coherent thought, knowing that AI is deterministic and that its learning processes, and especially its 'reasoning' processes, are based on imitation (what I will call the 'chameleon paradox')?
The forefront of AI in science, and the current wave of commercialisation of LLMs in software are on completely different timelines.

Maybe in 5-10 years the next wave will be fundamentally different, but that is all not part of this wave of AI. The current wave is just about throwing random output against the wall in the hope that it sticks. Correctness and determinism are not part of it; they already have enough trouble keeping it from spewing out copyrighted, bigoted or outright vile output.
Reasoning is not a mathematical signature, and every second that "God makes" (sic) cannot be summed up as a multitude of Fourier solutions.
Probably it can, as the wave equation that governs electron distribution in atoms (and hence chemistry) is a differential equation, typically solved with Fourier methods :-) But with 6e23 atoms in a few grams, that is a whole lot of equations.
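As a tangent to the "typically solved with Fourier" remark, the idea can be sketched in a few lines. A minimal, illustrative Python example (my own sketch, not from the thread; `heat_spectral` is a made-up name) that steps the 1D heat equation on a periodic domain in Fourier space:

```python
import numpy as np

# Illustrative sketch: spectral (Fourier) solution of the 1D heat
# equation u_t = alpha * u_xx on a periodic domain. In Fourier space
# each mode decays independently, so one step is just a multiply.
def heat_spectral(u0, alpha, dt, steps, L=2 * np.pi):
    n = len(u0)
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi  # angular wavenumbers
    u_hat = np.fft.fft(u0)
    decay = np.exp(-alpha * k**2 * dt)          # exact per-step decay per mode
    for _ in range(steps):
        u_hat *= decay
    return np.fft.ifft(u_hat).real

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = heat_spectral(np.sin(3 * x), alpha=0.1, dt=0.01, steps=100)
# A single mode sin(kx) decays analytically as exp(-alpha * k^2 * t),
# here exp(-0.1 * 9 * 1.0); the numerical result matches to roundoff.
print(np.max(np.abs(u - np.exp(-0.9) * np.sin(3 * x))))
```

The point, as above, is not that this scales to 6e23 coupled equations, only that "solved with Fourier" is a real and compact technique.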
The new frontier is the creation and understanding of symbolisms that should allow the piloting of an AI, and it is obvious to me that, as with programming, a new divide between those in the know is underway.
Hype talk. We got all of this a few years back too, about how blockchain would revolutionise and democratise everything. And look how much of it is left, and how many people are busy integrating blockchain into each and everything: very few, only a few staunch believers remaining.

For the average programmer all that is left is a link with "buy this product using bitcoin" and some API interfacing with the bitcoin payment processor.

It would be wise to separate the current generation of AI that is available (and for us to integrate) from the scientific front and visionaries' dreams. And usually the academic front is somewhat abstract and not yet fully at the stage of finding applications.
Lost Zergling
Posts: 601
Joined: Dec 02, 2011 22:51
Location: France

Re: A new frontier

Post by Lost Zergling »

@marcov. A word about blockchain: it is revolutionising the financial system; it is just that it is not visible and that it will take years or even decades, but it is irreversible. I love your remark on Fourier solutions.
I will remain a dreamer, and my fundamental criticism of the "hype" AI lies in the adjective "mimetic", which is the very negation of an evolved language, of a technical reality, and of the fundamental incoherence I observe in the marketing discourse.
Thank you very much for this very appreciable insight. :)
Berkeley
Posts: 92
Joined: Jun 08, 2024 15:03

Re: A new frontier

Post by Berkeley »

First, I have no idea what an AI API should look like, apart from FEED_PICTURE() and FEED_TEXT(). Well, you might need this, but it's not very much to think about. You can directly tell ChatGPT what you want; you don't need a programming language. It might even write source code for you.

And AI/neural networks ARE the future. It's not just hype; it's what will make "universal translators", speech control, self-driving cars, Mr. Datas, computer face recognition and fake video evidence like in "Running Man" possible...
adx
Posts: 11
Joined: Dec 11, 2005 10:18
Location: New Zealand
Contact:

Re: A new frontier

Post by adx »

to some extent
marcov wrote: Oct 10, 2024 11:44
Lost Zergling wrote: Oct 10, 2024 10:19 - How do we model (at a high level) what we call coherent thought, knowing that AI is deterministic and that its learning processes, and especially its 'reasoning' processes, are based on imitation (what I will call the 'chameleon paradox')?
but mostly
Reasoning is not a mathematical signature, and every second that "God makes" (sic) cannot be summed up as a multitude of Fourier solutions.
Probably it can, as the Wave equation that governs electron distribution in atoms (and hence, chemistry) is a differential equation, typically solved with Fourier :-) But with 6E23 atoms in a few grams, that is a whole lot of equations.
We know (or assume we do) that chaos can be responsible for randomness, but then where does nonlinearity come from? Is it an illusion of determinism (as well)?
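The "chaos can be responsible for randomness" point has a classic compact illustration, the logistic map. A minimal Python sketch (my own, purely illustrative, not from the thread):

```python
# The logistic map x -> r*x*(1-x) is fully deterministic, yet for
# r = 4 its orbit looks random, and two nearby starting points
# diverge rapidly (sensitivity to initial conditions, the hallmark
# of chaos).
def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2000001)  # same map, start perturbed by 1e-7
# After ~25 iterations the 1e-7 gap has blown up to order 1:
print(max(abs(x - y) for x, y in zip(a, b)))
```

Determinism and apparent randomness coexist here, which is exactly the tension the question above is poking at.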

more practically
I was here doing some '0 shot lurking', and probably the wrong thread, but wondered if FreeBASIC could be extended to do...
  • functional programming (or a mix like Scala, based on things that work, more than wild dreams)
  • wrap Pytorch (or at least Torch) so tinkering can be done in a familiar language (not that Python's bad)
  • provide more direct support for microcontrollers via C (eg BASCOM-AVR syntax, GCB, Arduino intermediate syntax),
...that's all!
marcov
Posts: 3490
Joined: Jun 16, 2005 9:45
Location: Netherlands
Contact:

Re: A new frontier

Post by marcov »

adx wrote: Oct 23, 2024 2:56
more practically
I was here doing some '0 shot lurking', and probably the wrong thread, but wondered if FreeBASIC could be extended to do...
  • functional programming (or a mix like Scala, based on things that work, more than wild dreams)
  • wrap Pytorch (or at least Torch) so tinkering can be done in a familiar language (not that Python's bad)
  • provide more direct support for microcontrollers via C (eg BASCOM-AVR syntax, GCB, Arduino intermediate syntax),
...that's all!
It is usually a really bad idea to take a language in a radically different direction; there are too many conflicting implementation details. If you want e.g. functional languages, start a new language or dialect.

I don't know enough about PyTorch to comment on that. Microcontroller support is basically a code generator, a few language extensions (dealing with interrupts etc.) and sometimes some binary tools (e.g. binary-to-hex conversion for the device programmer).
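The "binary to hex" tool mentioned here is indeed tiny. A hedged Python sketch (illustrative only; `to_intel_hex` is not an existing FB tool) that turns raw bytes into Intel HEX records, the format most device programmers expect:

```python
# Illustrative sketch: raw bytes -> Intel HEX data records.
# Each record is :LLAAAATT<data>CC where LL = byte count, AAAA =
# address, TT = record type (00 = data), CC = two's-complement
# checksum of all preceding bytes.
def to_intel_hex(data, start_addr=0, rec_len=16):
    lines = []
    for off in range(0, len(data), rec_len):
        chunk = data[off:off + rec_len]
        addr = start_addr + off
        rec = bytes([len(chunk), (addr >> 8) & 0xFF, addr & 0xFF, 0x00]) + chunk
        checksum = (-sum(rec)) & 0xFF
        lines.append(":" + (rec + bytes([checksum])).hex().upper())
    lines.append(":00000001FF")  # end-of-file record
    return "\n".join(lines)

print(to_intel_hex(b"\x12\x34"))
# :020000001234B8
# :00000001FF
```

A real tool would add extended-address records for images over 64 KiB, but this is the core of what such a "binary tool" does.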

Afaik native FB is x86/x86_64 only, and changing that will be hard, but maybe there are possibilities via the LLVM route.
Berkeley
Posts: 92
Joined: Jun 08, 2024 15:03

Re: A new frontier

Post by Berkeley »

marcov wrote: Oct 27, 2024 15:40 It is usually a really bad idea to take a language in a radically different direction; there are too many conflicting implementation details.
Correct. It could turn out well, but it might be so radical that you lose the benefits of the original(s).
marcov wrote: Oct 27, 2024 15:40 If you want e.g. functional languages, start a new language or dialect.
As far as I understand the description, FreeBASIC could already be used in a functional style.
marcov wrote: Oct 27, 2024 15:40 Afaik native FB is x86/x86_64 only, and changing that will be hard, but maybe there are possibilities via the LLVM route.
It's not hard. If it's written in assembler it will be a lot of work, though; C is easier, and in many cases can be recompiled with no changes. The biggest problem is endianness. Intel is little-endian, which is already confusing when using SHL/SHR or otherwise dealing with binary numbers, which are written big-endian in code. But most new CPUs are little-endian anyway. Other points might be the size of INTEGER and, in general, the limitations of data types and the like. But FreeBASIC is still also available for MS-DOS... no new troubles.
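The endianness point is easy to make concrete. A small Python sketch (illustrative, using the standard `struct` module) showing the same 32-bit value laid out little-endian (x86 memory order) versus big-endian (the order the digits appear in source code):

```python
import struct

# The same 32-bit value, packed in the two byte orders.
value = 0x12345678
little = struct.pack("<I", value)  # x86 memory order
big = struct.pack(">I", value)     # "as written in code" order

print(little.hex())  # 78563412 -- least significant byte first
print(big.hex())     # 12345678
print(little == big[::-1])  # True: the byte order simply reverses
```

This is why byte-level tricks (unions, pointer casts, SHL/SHR on memory dumps) are the part of a port that actually needs auditing.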