Neural networks programming with prorealtime

Viewing 15 posts - 61 through 75 (of 127 total)
  • Author
    Posts
  • #83980
    Leo
    Participant
    Veteran

    Hi all,

Here I add a new classifier that satisfies me: it defines a very quick movement and needs only one parameter. When I find the time I will update the Neural Network.

    Kind Regards

    Madrosat thanked this post
    #84453
    Leo
    Participant
    Veteran

    Hi all,

    I am glad to share with you my new version of a neural network.

    I am satisfied (so far) with the performance of the neural network. THE CODE I UPLOAD NOW USES AN OLD VERSION OF THE CLASSIFIER

Here are the changes:

–> Implemented gradient descent in mini-batches of 10 data values.

–> Only 5 epochs per training pass, with a constant update of the learning rate for faster learning (only 5 because if I increase it to 6 I get an infinite-loop error… I think ProRealTime does not have enough precision to compute a proper gradient descent).

–> Implemented the learning process with momentum in order to increase the speed of convergence.

–> The input data comes from an indicator I invented to measure the behaviour of different moving averages.

–> If you scroll down the graph of the indicator (values from -1 to -10), you will see the different values of the weights, biases, cost and error which I use to evaluate and check that the neural network is working correctly.

I hope you like it; please test it and share your results. I am also glad to hear your opinions.

    Cheers
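For reference, the mini-batch and momentum updates described above can be sketched outside ProBuilder. The Python below is an illustrative model only (the function name and sample numbers are mine, not part of the indicator): gradients are accumulated over a batch of 10 samples, averaged, blended with the previous smoothed gradient via the momentum coefficient `betha`, and then used for one descent step with learning rate `ETA`.

```python
# Illustrative sketch (not the ProBuilder code): one weight update with
# mini-batch averaging and momentum, mirroring the rule in the listing:
#   Der = Der/10*(1-betha) + betha*BDer ;  w = w - ETAi*Der

def momentum_step(weight, grad_sum, prev_grad, eta=0.5, betha=0.1, batch=10):
    # Average the accumulated gradient over the mini-batch, then blend
    # it with the previous smoothed gradient (momentum).
    grad = grad_sum / batch * (1 - betha) + betha * prev_grad
    return weight - eta * grad, grad

# Example numbers (hypothetical): accumulated gradient 2.0 over the
# batch, previous smoothed gradient 0.5, current weight 1.0.
w, g = momentum_step(1.0, 2.0, 0.5)
print(w, g)  # w ≈ 0.885, g ≈ 0.23
```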

    // Hyperparameters to be optimized
    
    // ETA=0.5 //known as the learning rate
    // PipsToTrade=25 //pips movement for classifier
    // betha=0.1
    
    /////////////////    CLASSIFIER    /////////////
    
    
    candlerange=average[200](average[100](average[50](abs(close-open))))
    candlesback=round(average[200](round(PipsToTrade*pipsize/candlerange)))
    
    IF barindex >600 then
    //Supports and resistances
    Xcandlesback=round(0.6*candlesback)
    
    highest1=highest[candlesback](high)
    IF highest1 = highest1[Xcandlesback] then
    Re1=highest1
    ENDIF
    IF high > Re1 then
    Re1=high
    ENDIF
    
    lowest1=lowest[candlesback](low)
    IF lowest1 = lowest1[Xcandlesback] then
    S1=lowest1
    ENDIF
    If low < S1 then
    S1=low
    ENDIF
    
    
    //for long trades
    classifierlong=0
    IF close - S1 > PipsToTrade*pipsize+candlerange THEN
    FOR scanL=1 to candlesback*2 DO
    IF classifierlong[scanL]=1 then
    BREAK
    ENDIF
    IF low[scanL]=S1 THEN
    classifierlong=1
    candleentrylong=barindex-scanL
    BREAK
    ENDIF
    NEXT
    ENDIF
    //for short trades
    classifiershort=0
    IF Re1-close > PipsToTrade*pipsize+candlerange THEN
    FOR scanS=1 to candlesback*2 DO
    IF classifiershort[scanS]=1 then
    BREAK
    ENDIF
    IF High[scanS]=Re1 THEN
    classifiershort=1
    candleentryshort=barindex-scanS
    BREAK
    ENDIF
    NEXT
    ENDIF
    ENDIF
    
    
    /////////////////////////  NEURAL NETWORK  ///////////////////
    
    // ...INITIAL VALUES...
    once a11=1
    once a12=-1
    once a13=1
    once a14=-1
    
    once a21=-1
    once a22=1
    once a23=-1
    once a24=1
    
    once a31=1
    once a32=-1
    once a33=1
    once a34=-1
    
    once a41=-1
    once a42=1
    once a43=-1
    once a44=1
    
    once a51=1
    once a52=-1
    once a53=1
    once a54=-1
    
    once a61=-1
    once a62=1
    once a63=-1
    once a64=1
    
    once Fbias1=1
    once Fbias2=1
    once Fbias3=1
    once Fbias4=1
    once Fbias5=1
    once Fbias6=1
    
    once b11=1
    once b12=1
    once b13=1
    once b14=1
    once b15=1
    once b16=1
    
    once b21=1
    once b22=1
    once b23=1
    once b24=1
    once b25=1
    once b26=1
    
    once Obias1=0
    once Obias2=0
    
    // ...DEFINITION OF INPUTS...
    
    //TREND001
    SuperPeriod001=7
    SuperPeriod002=21
    SuperPeriod003=34
    SuperPeriod004=77
    
    SuperRange=pipsize*round((average[200](average[100](average[50](average[20](average[5](round(RANGE/pipsize)))))))/2)*2
    
    Curve001=average[SuperPeriod001](close)
    Main001=average[SuperPeriod001](Curve001)
    Trend001=round(10*(Curve001-Main001)/SuperRange)/10
    
    Curve002=average[SuperPeriod002](close)
    Main002=average[SuperPeriod002](Curve002)
    Trend002=round(10*(Curve002-Main002)/SuperRange)/10
    
    Curve003=average[SuperPeriod003](close)
    Main003=average[SuperPeriod003](Curve003)
    Trend003=round(10*(Curve003-Main003)/SuperRange)/10
    
    Curve004=average[SuperPeriod004](close)
    Main004=average[SuperPeriod004](Curve004)
    Trend004=round(10*(Curve004-Main004)/SuperRange)/10
    
    
    variable1= Trend001 //or to be defined
    variable2= Trend002//or to be defined
    variable3= Trend003 // to be defined
    variable4= Trend004  // to be defined
    
    //   >>>    LEARNING PROCESS  <<<
    // If the classifier has detected a winning trade in the past
    //IF hour > 7 and hour < 21 then
    
    //STORING THE LEARNING DATA
    IF classifierlong=1 or classifiershort=1 THEN
    candleentry0010=candleentry0009
    Y10010=Y10009
    Y20010=Y20009
    candleentry0009=candleentry0008
    Y10009=Y10008
    Y20009=Y20008
    candleentry0008=candleentry0007
    Y10008=Y10007
    Y20008=Y20007
    candleentry0007=candleentry0006
    Y10007=Y10006
    Y20007=Y20006
    candleentry0006=candleentry0005
    Y10006=Y10005
    Y20006=Y20005
    candleentry0005=candleentry0004
    Y10005=Y10004
    Y20005=Y20004
    candleentry0004=candleentry0003
    Y10004=Y10003
    Y20004=Y20003
    candleentry0003=candleentry0002
    Y10003=Y10002
    Y20003=Y20002
    candleentry0002=candleentry0001
    Y10002=Y10001
    Y20002=Y20001
    candleentry0001=max(candleentrylong,candleentryshort)
    Y10001=classifierlong
    Y20001=classifiershort
    ENDIF
    
    once Keta=1
    ETAi=ETA*Keta
    IF BARINDEX > 3000 THEN
    IF classifierlong=1 or classifiershort=1 THEN
    IF hour > 8 and hour < 21 then
    FOR ii=1 to 5 DO //EPOCHS
    //Backing up the old gradient values to implement GRADIENT DESCENT WITH MOMENTUM
    BDerObias1 = DerObias1
    BDerObias2 = DerObias2
    
    BDerb11    = Derb11
    BDerb12    = Derb12
    BDerb13    = Derb13
    BDerb14    = Derb14
    BDerb15    = Derb15
    BDerb16    = Derb16
    
    BDerb21    = Derb21
    BDerb22    = Derb22
    BDerb23    = Derb23
    BDerb24    = Derb24
    BDerb25    = Derb25
    BDerb26    = Derb26
    
    BDerFbias1 = DerFbias1
    BDerFbias2 = DerFbias2
    BDerFbias3 = DerFbias3
    BDerFbias4 = DerFbias4
    BDerFbias5 = DerFbias5
    BDerFbias6 = DerFbias6
    
    BDera11    = Dera11
    BDera12    = Dera12
    BDera13    = Dera13
    BDera14    = Dera14
    
    BDera21    = Dera21
    BDera22    = Dera22
    BDera23    = Dera23
    BDera24    = Dera24
    
    BDera31    = Dera31
    BDera32    = Dera32
    BDera33    = Dera33
    BDera34    = Dera34
    
    BDera41    = Dera41
    BDera42    = Dera42
    BDera43    = Dera43
    BDera44    = Dera44
    
    BDera51    = Dera51
    BDera52    = Dera52
    BDera53    = Dera53
    BDera54    = Dera54
    
    BDera61    = Dera61
    BDera62    = Dera62
    BDera63    = Dera63
    BDera64    = Dera64
    
    //Resetting the error and cost accumulators
    ERROR=0
    COST=0
    //Resetting the gradients so they can be accumulated over the mini-batch
    DerObias1 = 0
    DerObias2 = 0
    
    Derb11    = 0
    Derb12    = 0
    Derb13    = 0
    Derb14    = 0
    Derb15    = 0
    Derb16    = 0
    
    Derb21    = 0
    Derb22    = 0
    Derb23    = 0
    Derb24    = 0
    Derb25    = 0
    Derb26    = 0
    
    
    
    DerFbias1 = 0
    DerFbias2 = 0
    DerFbias3 = 0
    DerFbias4 = 0
    DerFbias5 = 0
    DerFbias6 = 0
    
    Dera11    = 0
    Dera12    = 0
    Dera13    = 0
    Dera14    = 0
    
    Dera21    = 0
    Dera22    = 0
    Dera23    = 0
    Dera24    = 0
    
    Dera31    = 0
    Dera32    = 0
    Dera33    = 0
    Dera34    = 0
    
    Dera41    = 0
    Dera42    = 0
    Dera43    = 0
    Dera44    = 0
    
    Dera51    = 0
    Dera52    = 0
    Dera53    = 0
    Dera54    = 0
    
    Dera61    = 0
    Dera62    = 0
    Dera63    = 0
    Dera64    = 0
    
    
    FOR i=1 to 10 DO // Gradient descent and backpropagation
    IF i = 1 THEN
    candleentry=candleentry0010
    Y1=Y10010
    Y2=Y20010
    ELSIF i = 2 THEN
    candleentry=candleentry0009
    Y1=Y10009
    Y2=Y20009
    ELSIF i = 3 THEN
    candleentry=candleentry0008
    Y1=Y10008
    Y2=Y20008
    ELSIF i = 4 THEN
    candleentry=candleentry0007
    Y1=Y10007
    Y2=Y20007
    ELSIF i = 5 THEN
    candleentry=candleentry0006
    Y1=Y10006
    Y2=Y20006
    ELSIF i = 6 THEN
    candleentry=candleentry0005
    Y1=Y10005
    Y2=Y20005
    ELSIF i = 7 THEN
    candleentry=candleentry0004
    Y1=Y10004
    Y2=Y20004
    ELSIF i = 8 THEN
    candleentry=candleentry0003
    Y1=Y10003
    Y2=Y20003
    ELSIF i = 9 THEN
    candleentry=candleentry0002
    Y1=Y10002
    Y2=Y20002
    ELSIF i = 10 THEN
    candleentry=candleentry0001
    Y1=Y10001
    Y2=Y20001
    ENDIF
    
    //    >>> INPUT FOR NEURONS <<<
    input1=variable1[barindex-candleentry]
    input2=variable2[barindex-candleentry]
    input3=variable3[barindex-candleentry]
    input4=variable4[barindex-candleentry]
    
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
    output1=1/(1+EXP(-1*output1))
    output2=1/(1+EXP(-1*output2))
    
    //   >>> PARTIAL DERIVATIVES OF COST FUNCTION <<<
    //     ... CROSS-ENTROPY AS COST FUNCTION ...
    // COST = - ( Y1*LOG(output1)+(1-Y1)*LOG(1-output1) ) - ( Y2*LOG(output2)+(1-Y2)*LOG(1-output2) )
    
    DerObias1 = (output1-Y1) * 1+  DerObias1
    DerObias2 = (output2-Y2) * 1+  DerObias2
    
    Derb11    = (output1-Y1) * F1+  Derb11
    Derb12    = (output1-Y1) * F2+  Derb12
    Derb13    = (output1-Y1) * F3+  Derb13
    Derb14    = (output1-Y1) * F4+  Derb14
    Derb15    = (output1-Y1) * F5+  Derb15
    Derb16    = (output1-Y1) * F6+  Derb16
    
    Derb21    = (output2-Y2) * F1+  Derb21
    Derb22    = (output2-Y2) * F2+  Derb22
    Derb23    = (output2-Y2) * F3+  Derb23
    Derb24    = (output2-Y2) * F4+  Derb24
    Derb25    = (output2-Y2) * F5+  Derb25
    Derb26    = (output2-Y2) * F6+  Derb26
    
    
    //   >>> PARTIAL DERIVATIVES OF COST FUNCTION (HIDDEN LAYER) <<<
    
    DerFbias1 = (output1-Y1) * b11 * F1*(1-F1) * 1 + (output2-Y2) * b21 * F1*(1-F1) * 1+   DerFbias1
    DerFbias2 = (output1-Y1) * b12 * F2*(1-F2) * 1 + (output2-Y2) * b22 * F2*(1-F2) * 1+   DerFbias2
    DerFbias3 = (output1-Y1) * b13 * F3*(1-F3) * 1 + (output2-Y2) * b23 * F3*(1-F3) * 1+   DerFbias3
    DerFbias4 = (output1-Y1) * b14 * F4*(1-F4) * 1 + (output2-Y2) * b24 * F4*(1-F4) * 1+   DerFbias4
    DerFbias5 = (output1-Y1) * b15 * F5*(1-F5) * 1 + (output2-Y2) * b25 * F5*(1-F5) * 1+   DerFbias5
    DerFbias6 = (output1-Y1) * b16 * F6*(1-F6) * 1 + (output2-Y2) * b26 * F6*(1-F6) * 1+   DerFbias6
    
    Dera11    = (output1-Y1) * b11 * F1*(1-F1) * input1 + (output2-Y2) * b21 * F1*(1-F1) * input1+  Dera11
    Dera12    = (output1-Y1) * b11 * F1*(1-F1) * input2 + (output2-Y2) * b21 * F1*(1-F1) * input2+  Dera12
    Dera13    = (output1-Y1) * b11 * F1*(1-F1) * input3 + (output2-Y2) * b21 * F1*(1-F1) * input3+  Dera13
    Dera14    = (output1-Y1) * b11 * F1*(1-F1) * input4 + (output2-Y2) * b21 * F1*(1-F1) * input4+  Dera14
    
    Dera21    = (output1-Y1) * b12 * F2*(1-F2) * input1 + (output2-Y2) * b22 * F2*(1-F2) * input1+  Dera21
    Dera22    = (output1-Y1) * b12 * F2*(1-F2) * input2 + (output2-Y2) * b22 * F2*(1-F2) * input2+  Dera22
    Dera23    = (output1-Y1) * b12 * F2*(1-F2) * input3 + (output2-Y2) * b22 * F2*(1-F2) * input3+  Dera23
    Dera24    = (output1-Y1) * b12 * F2*(1-F2) * input4 + (output2-Y2) * b22 * F2*(1-F2) * input4+  Dera24
    
    Dera31    = (output1-Y1) * b13 * F3*(1-F3) * input1 + (output2-Y2) * b23 * F3*(1-F3) * input1+  Dera31
    Dera32    = (output1-Y1) * b13 * F3*(1-F3) * input2 + (output2-Y2) * b23 * F3*(1-F3) * input2+  Dera32
    Dera33    = (output1-Y1) * b13 * F3*(1-F3) * input3 + (output2-Y2) * b23 * F3*(1-F3) * input3+  Dera33
    Dera34    = (output1-Y1) * b13 * F3*(1-F3) * input4 + (output2-Y2) * b23 * F3*(1-F3) * input4+  Dera34
    
    Dera41    = (output1-Y1) * b14 * F4*(1-F4) * input1 + (output2-Y2) * b24 * F4*(1-F4) * input1+  Dera41
    Dera42    = (output1-Y1) * b14 * F4*(1-F4) * input2 + (output2-Y2) * b24 * F4*(1-F4) * input2+  Dera42
    Dera43    = (output1-Y1) * b14 * F4*(1-F4) * input3 + (output2-Y2) * b24 * F4*(1-F4) * input3+  Dera43
    Dera44    = (output1-Y1) * b14 * F4*(1-F4) * input4 + (output2-Y2) * b24 * F4*(1-F4) * input4+  Dera44
    
    Dera51    = (output1-Y1) * b15 * F5*(1-F5) * input1 + (output2-Y2) * b25 * F5*(1-F5) * input1+  Dera51
    Dera52    = (output1-Y1) * b15 * F5*(1-F5) * input2 + (output2-Y2) * b25 * F5*(1-F5) * input2+  Dera52
    Dera53    = (output1-Y1) * b15 * F5*(1-F5) * input3 + (output2-Y2) * b25 * F5*(1-F5) * input3+  Dera53
    Dera54    = (output1-Y1) * b15 * F5*(1-F5) * input4 + (output2-Y2) * b25 * F5*(1-F5) * input4+  Dera54
    
    Dera61    = (output1-Y1) * b16 * F6*(1-F6) * input1 + (output2-Y2) * b26 * F6*(1-F6) * input1+  Dera61
    Dera62    = (output1-Y1) * b16 * F6*(1-F6) * input2 + (output2-Y2) * b26 * F6*(1-F6) * input2+  Dera62
    Dera63    = (output1-Y1) * b16 * F6*(1-F6) * input3 + (output2-Y2) * b26 * F6*(1-F6) * input3+  Dera63
    Dera64    = (output1-Y1) * b16 * F6*(1-F6) * input4 + (output2-Y2) * b26 * F6*(1-F6) * input4+  Dera64
    
    ERROR= 0.5*( (output1-Y1)*(output1-Y1) + (output2-Y2)*(output2-Y2) )  +  ERROR
    COST= - ( Y1*LOG(output1)+(1-Y1)*LOG(1-output1) ) - ( Y2*LOG(output2)+(1-Y2)*LOG(1-output2) ) + COST
    
    NEXT
    
    DerObias1 = DerObias1/10*(1-betha) + betha*BDerObias1
    DerObias2 = DerObias2/10*(1-betha) + betha*BDerObias2
    
    Derb11    = Derb11/10*(1-betha) + betha*BDerb11
    Derb12    = Derb12/10*(1-betha) + betha*BDerb12
    Derb13    = Derb13/10*(1-betha) + betha*BDerb13
    Derb14    = Derb14/10*(1-betha) + betha*BDerb14
    Derb15    = Derb15/10*(1-betha) + betha*BDerb15
    Derb16    = Derb16/10*(1-betha) + betha*BDerb16
    
    Derb21    = Derb21/10*(1-betha) + betha*BDerb21
    Derb22    = Derb22/10*(1-betha) + betha*BDerb22
    Derb23    = Derb23/10*(1-betha) + betha*BDerb23
    Derb24    = Derb24/10*(1-betha) + betha*BDerb24
    Derb25    = Derb25/10*(1-betha) + betha*BDerb25
    Derb26    = Derb26/10*(1-betha) + betha*BDerb26
    
    DerFbias1 = DerFbias1/10*(1-betha) + betha*BDerFbias1
    DerFbias2 = DerFbias2/10*(1-betha) + betha*BDerFbias2
    DerFbias3 = DerFbias3/10*(1-betha) + betha*BDerFbias3
    DerFbias4 = DerFbias4/10*(1-betha) + betha*BDerFbias4
    DerFbias5 = DerFbias5/10*(1-betha) + betha*BDerFbias5
    DerFbias6 = DerFbias6/10*(1-betha) + betha*BDerFbias6
    
    Dera11    = Dera11/10*(1-betha) + betha*BDera11
    Dera12    = Dera12/10*(1-betha) + betha*BDera12
    Dera13    = Dera13/10*(1-betha) + betha*BDera13
    Dera14    = Dera14/10*(1-betha) + betha*BDera14
    
    Dera21    = Dera21/10*(1-betha) + betha*BDera21
    Dera22    = Dera22/10*(1-betha) + betha*BDera22
    Dera23    = Dera23/10*(1-betha) + betha*BDera23
    Dera24    = Dera24/10*(1-betha) + betha*BDera24
    
    Dera31    = Dera31/10*(1-betha) + betha*BDera31
    Dera32    = Dera32/10*(1-betha) + betha*BDera32
    Dera33    = Dera33/10*(1-betha) + betha*BDera33
    Dera34    = Dera34/10*(1-betha) + betha*BDera34
    
    Dera41    = Dera41/10*(1-betha) + betha*BDera41
    Dera42    = Dera42/10*(1-betha) + betha*BDera42
    Dera43    = Dera43/10*(1-betha) + betha*BDera43
    Dera44    = Dera44/10*(1-betha) + betha*BDera44
    
    Dera51    = Dera51/10*(1-betha) + betha*BDera51
    Dera52    = Dera52/10*(1-betha) + betha*BDera52
    Dera53    = Dera53/10*(1-betha) + betha*BDera53
    Dera54    = Dera54/10*(1-betha) + betha*BDera54
    
    Dera61    = Dera61/10*(1-betha) + betha*BDera61
    Dera62    = Dera62/10*(1-betha) + betha*BDera62
    Dera63    = Dera63/10*(1-betha) + betha*BDera63
    Dera64    = Dera64/10*(1-betha) + betha*BDera64
    
    //Implementing BackPropagation
    Obias1=Obias1-ETAi*DerObias1
    Obias2=Obias2-ETAi*DerObias2
    
    b11=b11-ETAi*Derb11
    b12=b12-ETAi*Derb12
    b13=b13-ETAi*Derb13
    b14=b14-ETAi*Derb14
    b15=b15-ETAi*Derb15
    b16=b16-ETAi*Derb16
    
    b21=b21-ETAi*Derb21
    b22=b22-ETAi*Derb22
    b23=b23-ETAi*Derb23
    b24=b24-ETAi*Derb24
    b25=b25-ETAi*Derb25
    b26=b26-ETAi*Derb26
    
    Fbias1=Fbias1-ETAi*DerFbias1
    Fbias2=Fbias2-ETAi*DerFbias2
    Fbias3=Fbias3-ETAi*DerFbias3
    Fbias4=Fbias4-ETAi*DerFbias4
    Fbias5=Fbias5-ETAi*DerFbias5
    Fbias6=Fbias6-ETAi*DerFbias6
    
    a11=a11-ETAi*Dera11
    a12=a12-ETAi*Dera12
    a13=a13-ETAi*Dera13
    a14=a14-ETAi*Dera14
    
    a21=a21-ETAi*Dera21
    a22=a22-ETAi*Dera22
    a23=a23-ETAi*Dera23
    a24=a24-ETAi*Dera24
    
    a31=a31-ETAi*Dera31
    a32=a32-ETAi*Dera32
    a33=a33-ETAi*Dera33
    a34=a34-ETAi*Dera34
    
    a41=a41-ETAi*Dera41
    a42=a42-ETAi*Dera42
    a43=a43-ETAi*Dera43
    a44=a44-ETAi*Dera44
    
    a51=a51-ETAi*Dera51
    a52=a52-ETAi*Dera52
    a53=a53-ETAi*Dera53
    a54=a54-ETAi*Dera54
    
    a61=a61-ETAi*Dera61
    a62=a62-ETAi*Dera62
    a63=a63-ETAi*Dera63
    a64=a64-ETAi*Dera64
    
    ERROR=ERROR/10
    ERROR=round(ERROR*1000)/1000
    
    COST = COST/10
    COST= round( COST*1000 ) /1000
    
    DRAWTEXT("C= #COST# , ETAi=#ETAi#", candleentry, -0.15*(ii-1)-0.05, Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("E= #ERROR#", candleentry, -0.15*(ii-1)-0.11, Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a1i: #a11# ; #a12# ; #a13# ; #a14# ; Fb1:#Fbias1#", candleentry, -1-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a2i: #a21# ; #a22# ; #a23# ; #a24# ; Fb2:#Fbias2#", candleentry, -2-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a3i: #a31# ; #a32# ; #a33# ; #a34# ; Fb3:#Fbias3#", candleentry, -3-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a4i: #a41# ; #a42# ; #a43# ; #a44# ; Fb4:#Fbias4#", candleentry, -4-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a5i: #a51# ; #a52# ; #a53# ; #a54# ; Fb5:#Fbias5#", candleentry, -5-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("a6i: #a61# ; #a62# ; #a63# ; #a64# ; Fb6:#Fbias6#", candleentry, -6-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("b1i: #b11# ; #b12# ; #b13# ; #b14# ; #b15# ; #b16# ; Ob1:#Obias1#", candleentry, -7-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    DRAWTEXT("b2i: #b21# ; #b22# ; #b23# ; #b24# ; #b25# ; #b26# ; Ob2:#Obias2#", candleentry, -8-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
    
    //GradientNorm = SQRT(DerObias1*DerObias1 + DerObias2*DerObias2+Derb11*Derb11+Derb12*Derb12+Derb13*Derb13+Derb14*Derb14+Derb15*Derb15+Derb16*Derb16 + Derb21*Derb21+Derb22*Derb22+Derb23*Derb23+Derb24*Derb24+Derb25*Derb25+Derb26*Derb26 + DerFbias1*DerFbias1+DerFbias2*DerFbias2+DerFbias3*DerFbias3+DerFbias4*DerFbias4+DerFbias5*DerFbias5+DerFbias6*DerFbias6 + Dera11*Dera11+Dera12*Dera12+Dera13*Dera13+Dera14*Dera14 + Dera21*Dera21+Dera22*Dera22+Dera23*Dera23+Dera24*Dera24 + Dera31*Dera31+Dera32*Dera32+Dera33*Dera33+Dera34*Dera34 + Dera41*Dera41+Dera42*Dera42+Dera43*Dera43+Dera44*Dera44 + Dera51*Dera51+Dera52*Dera52+Dera53*Dera53+Dera54*Dera54 + Dera61*Dera61+Dera62*Dera62+Dera63*Dera63+Dera64*Dera64)
    //DRAWTEXT("GradientNorm: #GradientNorm#", candleentry, 2-0.1*(ii), Dialog, Bold, 10) COLOURED(50,150,50)
    
    COST2=COST1
    COST1=COST
    
    IF COST1 > COST2 and ii>=2 THEN //ETAi must be reduced
    ETAi=ETAi/1.5
    ELSE //ETAi can be increased
    ETAi=ETAi*(1.5+0.2*ii)
    ENDIF
    COST2=COST1
    
    
    NEXT
    //DRAWTEXT("#candleentry0001#,#candleentry0002#,#candleentry0003#,#candleentry0004#,#candleentry0005#,#candleentry0006#,#candleentry0007#,#candleentry0008#,#candleentry0009#,#candleentry0010#", candleentry, -0.7, Dialog, Bold, 10) COLOURED(50,150,50)
    once error1=ERROR
    error2=error1
    error1=ERROR
    //IF error1 < error2 THEN
    //Keta=Keta+0.05
    //ELSE
    //Keta=Keta-0.05
    //ENDIF
    Keta=Keta*(1+min(0.1,abs((error2-error1)/error1))*SGN(error2-error1))
    
    ENDIF
    ENDIF
    //ENDIF
    
    ///////////////////    NEW PREDICTION  ///////////////////
    //    >>> INPUT NEURONS <<<
    input1=variable1
    input2=variable2
    input3=variable3
    input4=variable4
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
    output1=1/(1+EXP(-1*output1))
    output2=1/(1+EXP(-1*output2))
    ENDIF
    
    return output1 coloured(0,150,0) style(line,1) as "prediction long" , output2 coloured(200,0,0) style(line,1) as "prediction short",classifierlong coloured(0,150,0) STYLE(HISTOGRAM,2) AS "classifier_long" , classifiershort coloured(200,0,0) STYLE(HISTOGRAM,2) AS "classifier_short", 0.5 coloured(0,0,200) as "0.5", 0.6 coloured(0,0,200) as "0.6", 0.7 coloured(0,0,200) as "0.7", 0.8 coloured(0,0,200) as "0.8"
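For anyone who wants to sanity-check the arithmetic off-platform, the prediction pass above (4 inputs, one hidden layer of 6 sigmoid neurons, 2 sigmoid outputs) can be transcribed into plain Python. This is a sketch for verification only; the helper names are mine, and the weights shown are the `once` initial values from the listing, so a trained indicator will produce different outputs.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict(inputs, a, fbias, b, obias):
    # Hidden layer: 6 sigmoid neurons over the 4 inputs (a = 6x4 weights).
    hidden = [sigmoid(sum(a[j][k] * inputs[k] for k in range(4)) + fbias[j])
              for j in range(6)]
    # Output layer: 2 sigmoid neurons (long / short probabilities).
    return [sigmoid(sum(b[o][j] * hidden[j] for j in range(6)) + obias[o])
            for o in range(2)]

# Initial weights copied from the `once` block of the indicator:
# rows of alternating +/-1, hidden biases 1, output weights 1, output biases 0.
a = [[1, -1, 1, -1], [-1, 1, -1, 1], [1, -1, 1, -1],
     [-1, 1, -1, 1], [1, -1, 1, -1], [-1, 1, -1, 1]]
fbias = [1, 1, 1, 1, 1, 1]
b = [[1, 1, 1, 1, 1, 1], [1, 1, 1, 1, 1, 1]]
obias = [0, 0]

out = predict([0.0, 0.0, 0.0, 0.0], a, fbias, b, obias)
# With zero inputs and these symmetric initial weights, both outputs are equal.
```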
    
    swapping, GraHal, Meta Signals Pro and 3 others thanked this post
    #84477
    Meta Signals Pro
    Participant
    Veteran

    Hi Leo,

    I feel like I'm in sci-fi with this indicator ;-) and it is great!

    Thanks a lot for sharing.

    I tried it on stocks but was unable to make it release its prediction! I surely missed something (see picture).

    Should we change something?

    Can you tell us on which asset you were able to get the graph you mentioned in your post?

    Madrosat thanked this post
    #84480
    Leo
    Participant
    Veteran

    Hi.

    Well, first you should know that this is not a universal indicator. It is not as if we both see the same thing when we open it.

    You need to load many bars, say 20,000, to give the indicator enough time to learn. If you open it with 20k bars and I open it with 30k, my indicator has learned more than yours.

    Another parameter is the pips for a hypothetical trade: we look for possible trades and learn what happened at the moment of entering each trade.

    For the prediction, you pick a threshold between 0.5 and 0.999 to decide where your prediction is.

    In any case, there is a lot of work ahead. That's why I share it: alone I cannot do it as fast as I want.

    The neural network itself is working properly (within the limitations of ProRealTime). I have rechecked and rechecked that, corrected and redone stuff, and checked it again. Phew!

    #84483
    GraHal
    Participant
    Master

    Thank you Leo for your ongoing work and for being so kind to share with us!

    I made a System out of your latest NN Code … equity curve attached over 100k bars with spread = 4.

    I have it on Demo Fwd Test from today … I’ll report back on here after 10 trades.

    #84489
    Nicolas
    Keymaster
    Master

    @Kris75

100 units is far from sufficient for the machine to have learned something!

    Meta Signals Pro thanked this post
    #84495
    Meta Signals Pro
    Participant
    Veteran

Thanks Leo for your precautions, and Nicolas for both your answers.

Still blocked here; I get this message with all timeframes.

    Best

    #84497
    Meta Signals Pro
    Participant
    Veteran

    great Grahal, can you share it too?

    #84499
    Leo
    Participant
    Veteran

Kris75: use a higher number of pips per trade, because you loaded it on the daily timeframe.

    Meta Signals Pro thanked this post
    #84502
    Meta Signals Pro
    Participant
    Veteran
    Thanks Leo; I was able to make it work in daily with “pips per trade” = 100. I hope GraHal will share the strategy so I can backtest it. Best, Chris
    #84504
    GraHal
    Participant
    Master
    Hope Grahal will share the strategy so I can backtest it;
    I will share for sure, but I’d prefer the System to do a few trades first. All I did was use output1 as a Long entry and output2 as a Short entry, added TP and SL (rough feel for profit/loss) and optimisation, oh and a simple filter.

    Am I the only one who makes Leo’s NN Indicator code into Systems? I have about 5 or 6 versions running in Demo Forward Test. This thread needs to be kept clean for discussion of Leo’s NN code, so I have shared a few Systems based on Leo’s NN code on my own thread …
    Right full Neural Network now, huge thanks to Leo!
    What we need is cross-fertilisation of ideas, as I’m sure there are better ways (than mine) of using Leo’s code to make profitable Systems. Different TFs and markets even? The wider view? Feedback will likely help Leo with his superlative NN coding. So there’s a challenge @Kris75 … you show me yours and I’ll show you mine? 🙂 I will post the System (based on Leo 1.3.2 code) on the other thread (link above), after I’ve seen at least one trade open and closed. Cheers GraHal
    #84592
    Leo
    Participant
    Veteran
    Forgot to tell you: there is a line at 196. It is the time filter for intraday trading; on daily bars you should delete it.
    #84593
    Leo
    Participant
    Veteran
    Hi all, I posted something interesting here: https://www.prorealcode.com/topic/weekly-seasonality-analysis/#post-84588 What if we do something similar for weekly, and then also for daily? When all 3 predictions show the same signal, we enter on the daily or hourly timeframe… mmm… sexy…
    Vonasi and Rumjacks thanked this post
    #84703
    Rumjacks
    Participant
    Average
    Hi Leo, why not use a special type of recurrent neural network called an LSTM network? A solution to the vanishing gradient problem is to use cells that remember long-term dependencies. Cordially. Sources: http://colah.github.io/posts/2015-08-Understanding-LSTMs/ https://github.com/llSourcell/How-to-Predict-Stock-Prices-Easily-Demo
    #84720
    Leo
    Participant
    Veteran
    Oh man! Sorry to disappoint you. What I posted here is, as far as I know, the first attempt ever at creating artificial intelligence in such a basic language as ProBuilder, with no array support. There is only enough memory to execute 5 epochs per mini-batch; I am not even thinking of making something like a recurrent neural network. I am only starting with a feed-forward neural network! With only 1 hidden layer!!! With only 6 neurons! Do you think my gradient descent is vanishing? Really? I have other problems, but definitely not a vanishing gradient. I will be happy if one of these days I am able to increase the number of neurons… maybe you, Raspoutine, can help us with this endeavour.


Leo @leito Participant
Summary

This topic contains 126 replies,
has 8 voices, and was last updated by MobiusGrey
2 years, 4 months ago.

Topic Details
Forum: ProBuilder: Indicators & Custom Tools
Language: English
Started: 08/18/2018
Status: Active
Attachments: 32 files