Easy matrix

New to FreeBASIC? Post your questions here.
dodicat
Posts: 7983
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: Easy matrix

Post by dodicat »

The thread of Easy Matrix is now moving on to neurotic networks, which I am unfamiliar with.
Back propagation to me is a vegetable patch in the back garden, so I can be of little help.
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

I was just wondering how I might blend some spinach with electrolyte.

As there is now a fair amount of code in this thread for matrix and vector math, a
revisit of everything would be appropriate.
A consensus is best.

For me constructing a NN using matrices has been useful, and yes late nights were
involved and I did become somewhat neurotic.

So a question that I asked at the beginning of this thread, about a Matrix of matrices, has been
mostly answered for FreeBASIC; now I'm wondering whether this same feature is available in other
BASIC languages.
dodicat
Posts: 7983
Joined: Jan 10, 2006 20:30
Location: Scotland

Re: Easy matrix

Post by dodicat »

Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

Hi dodicat

Most of those varieties of BASIC don't have online documentation like what I'm familiar
with from FreeBASIC; some I've used before, others are new and likely worth trying.
I think a few may have the ability to construct a Matrix of matrices.

My focus at the moment is testing the NN features, with another trial run using a different
configuration.
The training times are taking longer, as is to be expected: 0.2 seconds.
Of course once the weights have been adjusted, and made static, evaluating a
result for a particular test input is quick.


Training data

0 0 0 0
1 0 0 0
0 1 0 0
1 1 0 0
0 0 1 0
1 0 1 0
0 1 1 0
1 1 1 0
0 0 0 1
1 0 0 1
0 1 0 1
1 1 0 1
0 0 1 1
1 0 1 1
0 1 1 1
1 1 1 1

Target data

0
1
0
1
0
1
1
1
0
1
0
1
0
1
1
1

The target data is produced by applying, to each training row, logic equivalent to this
bit expression:

( bit(ts,2) and bit(ts,3) ) or bit(ts,1)
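
A minimal sketch (not the thread's generator) that reproduces the two tables above with FreeBASIC's Bit() function. Note that Bit() is 0-based from the least significant bit and returns -1 when a bit is set, so with the leftmost input column as bit 0 the rule above becomes (bit 1 And bit 2) Or bit 0:

Code: Select all

for ts as integer = 0 to 15
    ' target = ( bit 1 and bit 2 ) or bit 0 ; negate because Bit() gives -1/0
    dim as integer target = -((bit(ts, 1) and bit(ts, 2)) or bit(ts, 0))
    print -bit(ts, 0); -bit(ts, 1); -bit(ts, 2); -bit(ts, 3); " -> "; target
next ts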

m_seq(1 to 6)={16,4,16,16,1,16}

Three layers, all with 16 nodes, using the Sigmoid function.


400 training epochs,
Elapsed time for training : 0.2013790607452393 seconds


Results, using the same inputs as the training data, entered one row at a time.

Expected , NN

0 , 0.0000
1 , 0.9947
0 , 0.0324
1 , 0.9961
0 , 0.0417
1 , 0.9908
1 , 0.9815
1 , 0.9930
0 , 0.0000
1 , 0.9925
0 , 0.0295
1 , 0.9944
0 , 0.0264
1 , 0.9886
1 , 0.9611
1 , 0.9908


These results do vary from run to run, suggesting various improvements are possible.
BasicCoder2
Posts: 3906
Joined: Jan 01, 2009 7:03
Location: Australia

Re: Easy matrix

Post by BasicCoder2 »

It would have been great if someone with the know-how had been able to create a general-purpose ANN program tutorial which others could test with their own examples, but it never happened, so good luck with the project.
My sad attempt ended here, where I checked to see if my FreeBASIC backpropagation math code worked as per an example on the internet.
viewtopic.php?p=249064&hilit=backpropagation#p249064
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

Hi BasicCoder2

You're courageous to make the attempt.

I continue to feel like I was foolish to do
so.
I have a need to confirm the validity of
what I'm doing.
Yet more testing, I wonder.
I'll probably try tidying up all of the code
first, and bring together all of the links to
the material and documentation I've used.

I was advised not to upload code that didn't
perform correctly.

For the back propagation method, I use the
transpose of the weights matrix at each layer.
More calculations, not much more memory.
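
A minimal sketch of that step, with hypothetical names rather than Luxan's actual routines: the error vector is pushed back through one layer by reading the weights matrix column-wise (i.e. its transpose), then scaled by the sigmoid derivative from the activation functions posted later in the thread.

Code: Select all

sub backPropLayer(w() as single, delta() as single, zPrev() as single, _
                  deltaPrev() as single)
    ' deltaPrev = transpose(w) * delta, scaled element-wise by the
    ' derivative of the sigmoid activation
    for j as integer = lbound(deltaPrev) to ubound(deltaPrev)
        dim as single s = 0
        for i as integer = lbound(delta) to ubound(delta)
            s += w(i, j) * delta(i)   ' column-wise read = transpose(w)
        next i
        dim as single sig = 1 / (1 + exp(-zPrev(j)))
        deltaPrev(j) = s * sig * (1 - sig)   ' sigmoid derivative
    next j
end sub

' tiny driver: a layer of 2 nodes feeding a layer of 3 nodes
dim as single w(1 to 3, 1 to 2), delta(1 to 3), zPrev(1 to 2), dPrev(1 to 2)
w(1,1) = 0.1 : w(2,1) = 0.2 : w(3,1) = 0.3
w(1,2) = 0.4 : w(2,2) = 0.5 : w(3,2) = 0.6
delta(1) = 0.05 : delta(2) = -0.02 : delta(3) = 0.01
zPrev(1) = 0.7 : zPrev(2) = -0.3
backPropLayer(w(), delta(), zPrev(), dPrev())
print dPrev(1), dPrev(2)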
BasicCoder2
Posts: 3906
Joined: Jan 01, 2009 7:03
Location: Australia

Re: Easy matrix

Post by BasicCoder2 »

The math for backpropagation is beyond me; that is why I needed it in the form of actual code, to see if the code actually worked (got the same result as the example given). I have since moved on to other, more rewarding (for me) projects and never got to finish any ANN project.

Here D.J.Peter gives an example
viewtopic.php?t=16657&hilit=backpropagation
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

From what I've observed on that website, an attempt is
being made to construct units that you then fit into a
larger construct.

My unit is a clear mathematical equation.

These are the activation functions, and their derivatives,
that I've been coding today and intend to incorporate
into the main code over the next few days.

I used the free Maxima CAS to obtain some of the derivatives.

Code: Select all

'
' ----------------------------------------------------------------------
'
const As Single pi = 4 * Atn(1)  ' pi is needed by the GELU cases below
'
function activation(x as single, a as single, lamda as single, choice as integer) as single
'
'   Select from a wide variety of activation functions .
'
' https://www.v7labs.com/blog/neural-networks-activation-functions
'
dim as single y
'
  select case choice
       case 1:
       '  RELU
          y = (sgn(x) + 1)*x*0.5
       
       case 2:
       '  Leaky RELU, slope 0.01 for x < 0 to match d_activation()
          y = x
         if ( x < 0 ) then
          y = 0.01*x
         end if
          
       case 3:
       '  Tanh
          y = (exp(x)-exp(-x))/(exp(x)+exp(-x)) 
       
       case 4:
       '  Binary step function
          y = 1
         if ( x < 0 ) then y = 0 end if
       
       case 5:
       '  Linear
          y = x
       
       case 6:
       '  SELU
          y = lamda*x
         if ( x < 0 ) then
          y = lamda*a*(exp(x) -1)
         end if 
          
       case 7:
       '  ELU
          y = x
         if ( x < 0 ) then 
          y =  a*(exp(x) - 1)
         end if
           
       case 8:
       '  Sigmoid/Logistic
          y = 1/(1 + exp(-x))
       
       case 9:
       '  Parametric ReLU
          y = x
         if ( x < 0 ) then 
          y = a*x 
         end if
          
       case 10:
       '  Softmax : softmax(ly,i) = exp(ly(i))/sum(exp(ly(j)),j,1,n)
       '  placeholder only, since a true softmax needs the whole
       '  layer vector rather than a single scalar x
          y = 0.1*x
       
       case 11:
       '  Swish
          y = x/(1 + exp(-x))
       
       case 12:
       '  GELU
          y = sqr(2/pi)*(x + 0.044715*x^3)
          y = 0.5*x*(1 + (exp(y) - exp(-y))/(exp(y) + exp(-y)) )
          
       case else :
       '  Sigmoid
          y = 1/(1 + exp(-x))
       
  end select       
'
  return y
'
end function
'
' ----------------------------------------------------------------------
'
function d_activation(x as single, a as single, lamda as single, choice as integer) as single

'
'   Select from a wide variety of derivatives of activation functions .
'
' https://www.v7labs.com/blog/neural-networks-activation-functions
'
dim as single y, y1, y2, y3, y4, y5
'
  select case choice
       case 1:
       '  RELU
          y = 1
         if x < 0 then y = 0 end if 
         
       case 2:
       '  Leaky RELU
          y = 1
         if x < 0 then y = 0.01 end if      
       
       case 3:
       '  Tanh
          y = 1-(exp(x)-exp(-x))^2/(exp(x)+exp(-x))^2
       
       case 4:
       '  Binary step function : derivative is zero everywhere
       '  (undefined at x = 0)
          y = 0
       case 5:
       '  Linear
          y = 1
          
       case 6:
       '  SELU
          y = lamda
         if ( x < 0 ) then
          y = a*lamda*exp(x)
         end if 
         
       case 7:
       '  ELU
          y = 1
         if ( x < 0 ) then 
          y =  a*exp(x)
         end if
       
       case 8:
       '  Sigmoid/Logistic
          y = exp(-x)/(exp(-x)+1)^2
       
       case 9:
       '  Parametric ReLU
           y = 1
         if ( x < 0 ) then 
           y = a 
         end if
       
       case 10:
       '  Softmax : placeholder derivative matching the linear
       '  stand-in used in activation()
          y = 0.1
       
       case 11:
       '  Swish
          y = (exp(2*x)+(x+1)*exp(x) )/(exp(2*x)+2*exp(x)+1)
          
       case 12:
       '  GELU : Maxima-generated derivative of the tanh
       '  approximation used in activation() above
          y1 = 100000*exp((8943*sqr(2/pi)*x^3)/50000+4*sqr(2/pi)*x)
          y2 = (26829*sqr(2/pi)*x^3+200000*sqr(2/pi)*x+100000)
          y3 = exp((8943*sqr(2/pi)*x^3)/100000+2*sqr(2/pi)*x)
          y4 = 100000*exp((8943*sqr(2/pi)*x^3)/50000+4*sqr(2/pi)*x)
          y5 = 200000*exp((8943*sqr(2/pi)*x^3)/100000+2*sqr(2/pi)*x)
          y  = (y1 + y2*y3)/(y4 + y5 + 100000)
     
       case else :
       '  Sigmoid
           y = exp(-x)/(exp(-x)+1)^2
       
  end select       
'
  return y
'
end function
'
' ----------------------------------------------------------------------
'
 
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

I incorporated the new activation functions into the rest of the program this morning;
there don't appear to be any syntax errors, and results similar to the previous ones are
produced when I select the same activation function.

I now have the option to use particular activation functions for the input, hidden,
and output layers.
The output layer's activation function is the one most likely to be applied to the target data.
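
A small usage sketch of that option, assuming the activation() function posted above is in scope; the 'choice' index (hypothetical values here) selects a different function per layer.

Code: Select all

dim as single a = 1.0, lamda = 1.0507   ' only used by the ELU/SELU/PReLU cases
dim as integer choiceHidden = 3         ' Tanh for the hidden layers
dim as integer choiceOutput = 8         ' Sigmoid for the output layer
dim as single h = activation(0.5, a, lamda, choiceHidden)
dim as single o = activation(h, a, lamda, choiceOutput)
print h, o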
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

While I wait for the right set of circumstances to resume the main thrust of
the NN work, I'd like some advice.

Operator overloading is possible for +, -, *, and /; however, I came across an instance where
I wanted to do element-wise multiplication between two matrices, and this was at odds
with the dot product, which also used the * operator.

Eventually I wrote a dedicated subroutine for the element-wise multiplication.
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

One piece of Python code that I examined prior to constructing what I have
at the moment:

https://brilliant.org/wiki/backpropagation/
fxm
Moderator
Posts: 12107
Joined: Apr 22, 2009 12:46
Location: Paris suburbs, FRANCE

Re: Easy matrix

Post by fxm »

Luxan wrote: Aug 31, 2022 1:41 Operator overloading is possible for +, -, *, and /; however, I came across an instance where
I wanted to do element-wise multiplication between two matrices, and this was at odds
with the dot product, which also used the * operator.

Why can't you use the '^' operator for the vector product, and the '*' operator for the dot product or matrix product?
  • Dim As Vector V3 = V1 ^ V2
    Dim As Double d = V1 * V2
    Dim As Matrix M3 = M1 * M2
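
A minimal sketch along those lines, using a hypothetical fixed-size Matrix type rather than the thread's actual one: '*' keeps the matrix product, and '^' is overloaded for the element-wise (Hadamard) product that was wanted.

Code: Select all

const N = 2
type Matrix
    m(1 to N, 1 to N) as double
end type

' matrix product : c(i,j) = sum over k of a(i,k)*b(k,j)
operator * (byref a as Matrix, byref b as Matrix) as Matrix
    dim as Matrix c
    for i as integer = 1 to N
        for j as integer = 1 to N
            for k as integer = 1 to N
                c.m(i, j) += a.m(i, k) * b.m(k, j)
            next k
        next j
    next i
    return c
end operator

' element-wise (Hadamard) product : c(i,j) = a(i,j)*b(i,j)
operator ^ (byref a as Matrix, byref b as Matrix) as Matrix
    dim as Matrix c
    for i as integer = 1 to N
        for j as integer = 1 to N
            c.m(i, j) = a.m(i, j) * b.m(i, j)
        next j
    next i
    return c
end operator

dim as Matrix m1, m2
m1.m(1,1) = 1 : m1.m(1,2) = 2 : m1.m(2,1) = 3 : m1.m(2,2) = 4
m2.m(1,1) = 5 : m2.m(1,2) = 6 : m2.m(2,1) = 7 : m2.m(2,2) = 8
dim as Matrix mp = m1 * m2   ' matrix product : mp.m(1,1) = 19
dim as Matrix ep = m1 ^ m2   ' element-wise   : ep.m(1,1) = 5
print mp.m(1,1), ep.m(1,1)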
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

Hi fxm

Okay, ^ is a possibility.

I went back to my earlier versions of the NN, where I was using strictly sequential
dimensions for matrices, as this is more flexible than the present version
and uses less memory.
Something I discovered: the order in which I perform matrix multiplication matters,
a*b <> b*a, and no result is returned if the order is incorrect. For example, a 2x3
matrix times a 3x2 matrix gives a 2x2 result, whereas the reverse order gives 3x3,
and two 2x3 matrices can't be multiplied at all because the inner dimensions don't conform.

The NN prefers inputs between 0 and 1, which may be okay for binary representations.

My mind is in a tizz at the moment; coding this has been demanding and,
perhaps thankfully, distracting. There is more to do yet, though. It will be nice
to have everything eventually functioning like a Swiss watch.
Luxan
Posts: 222
Joined: Feb 18, 2009 12:47
Location: New Zealand

Re: Easy matrix

Post by Luxan »

Looks like my newish NN is behaving properly.

For this I'm using m_seq(0 to 5) = {16,8,9,10,11,16},
that's five layers of varying size, all calculated in
the proper way.
I used 300 iterations to train the NN; more tended to
improve the accuracy. Input and target values were all
generated using the rnd function.

I was wondering how to train the NN on more input/target
combinations: retain the weights between sessions and then
train again, using single-column input and target values.
One way to save and reload the weights is sketched below.
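
Hypothetical helper names, not anything from this thread: FreeBASIC's binary Put/Get can dump an array to a file and read it back, provided the array is dimensioned to the same shape before loading.

Code: Select all

sub saveWeights(w() as single, byval fname as string)
    dim as integer f = freefile
    open fname for binary access write as #f
    put #f, , w()    ' raw dump of the whole weights array
    close #f
end sub

sub loadWeights(w() as single, byval fname as string)
    dim as integer f = freefile
    open fname for binary access read as #f
    get #f, , w()    ' read it back into the same shape
    close #f
end sub

' usage: saveWeights(w(), "weights.bin") after a training session,
'        loadWeights(w(), "weights.bin") before resuming training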

These are the results for a single input/target session.
I don't know yet if this is just what's called a memorizing network.

Also, I want to try bit sequences as input and target, like I did
for the previous version.
For a limited range of bits, a numerical representation might be
adequate.



input

0.0826
0.1352
0.0265
0.1174
0.0693
0.0548
0.1122
0.1096
0.0688
0.0908
0.1784
0.1175
0.0000
0.0056
0.0330
0.0591

Elapsed time for training : 0.03125596046447754 seconds

Elapsed time to runNN : 2.503395080566406e-05 seconds

target , output
0.1203 , 0.1203
0.0643 , 0.0643
0.1175 , 0.1175
0.0374 , 0.0372
0.0918 , 0.0918
0.0530 , 0.0529
0.1194 , 0.1195
0.0209 , 0.0207
0.0666 , 0.0666
0.0907 , 0.0906
0.1349 , 0.1349
0.0000 , 0.0025
0.1536 , 0.1535
0.1329 , 0.1328
0.0653 , 0.0652
0.0977 , 0.0977