Confirmation of Trend using Neural Networks (by kind permission of Leo)

    #79564
    Stefanb
    Participant
    Senior

    My fault, the copy-paste went wrong.

    Like GraHal says, lines 37-46 are duplicated.

     

    Comments on the code otherwise?

    #79578
    GraHal
    Participant
    Master

    Right, a full Neural Network now; huge thanks to Leo!

    v4.0 below, spread = 4.

    Tweaks and variations welcome (better exit strategy?).

    If you post tweaks, please include the full System Code and Performance Stats.

    // Hyperparameters to be optimized
    DEFPARAM CUMULATEORDERS = False
    ETA=1 //known as the learning rate
    candlesback=6 // for the classifier
    ProfitRiskRatio=2 // for the classifier
    spread=0.9 // for the classifier
    
    
    /////////////////    CLASSIFIER    /////////////
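    // The classifier labels past bars in hindsight: a bar up to candlesback
    // bars back is flagged as a winning long setup when the current high
    // exceeds a hypothetical target of ProfitRiskRatio times the entry risk
    // (close minus the stop below that bar's low) and the stop was never
    // touched since; shorts are mirrored. These flags feed the network
    // further below as the training targets Y1/Y2.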
    
    myATR=average[20](range)+std[20](range)
    ExtraStopLoss=MyATR
    //ExtraStopLoss=3*spread*pipsize
    
    //for long trades
    classifierlong=0
    FOR scanL=1 to candlesback DO
    IF classifierlong[scanL]=1 then
    BREAK
    ENDIF
    LongTradeLength=ProfitRiskRatio*(close[scanL]-(low[scanL]-ExtraStopLoss[scanL]))
    IF close[scanL]+LongTradeLength < high-spread*pipsize then
    IF lowest[scanL+1](low) > low[scanL]-ExtraStopLoss[scanL]+spread*pipsize then
    classifierlong=1
    candleentrylong=barindex-scanL
    BREAK
    ENDIF
    ENDIF
    NEXT
    
    //for short trades
    classifiershort=0
    FOR scanS=1 to candlesback DO
    IF classifiershort[scanS]=1 then
    BREAK
    ENDIF
    ShortTradeLength=ProfitRiskRatio*((high[scanS]-close[scanS])+ExtraStopLoss[scanS])
    IF close[scanS]-ShortTradeLength > low+spread*pipsize then
    IF highest[scanS+1](high) < high[scanS]+ExtraStopLoss[scanS]-spread*pipsize then
    classifiershort=1
    candleentryshort=barindex-scanS
    BREAK
    ENDIF
    ENDIF
    NEXT
    
    /////////////////////////  NEURAL NETWORK  ///////////////////
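    // Architecture: 4 inputs -> 6 sigmoid hidden neurons (weights a11..a64,
    // biases Fbias1..Fbias6) -> 2 sigmoid outputs (weights b11..b26, biases
    // Obias1/Obias2). output1 estimates the probability of a long setup,
    // output2 of a short setup. "once" initialises the weights on the first
    // bar only, so the values learned on previous bars are kept.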
    
    // ...INITIAL VALUES...
    once a11=1
    once a12=-1
    once a13=1
    once a14=-1
    
    once a21=1
    once a22=1
    once a23=-1
    once a24=1
    
    once a31=-1
    once a32=1
    once a33=-1
    once a34=1
    
    once a41=1
    once a42=-1
    once a43=1
    once a44=-1
    
    once a51=-1
    once a52=1
    once a53=-1
    once a54=1
    
    once a61=1
    once a62=-1
    once a63=1
    once a64=-1
    
    once Fbias1=0
    once Fbias2=0
    once Fbias3=0
    once Fbias4=0
    once Fbias5=0
    once Fbias6=0
    
    once b11=1
    once b12=-1
    once b13=1
    once b14=-1
    once b15=1
    once b16=-1
    
    once b21=-1
    once b22=1
    once b23=-1
    once b24=1
    once b25=-1
    once b26=1
    
    once Obias1=0
    once Obias2=0
    
    // ...DEFINITION OF INPUTS...
    
    SMA20=average[min(20,barindex)](close)
    SMA200=average[min(200,barindex)](close)
    SMA2400=average[min(2400,barindex)](close) //on a 5-minute timeframe this corresponds to the 200-period SMA on the hourly chart
    
    variable1= RSI[14](close)     // or to be defined
    variable2= (close-SMA20)/SMA20 *100 //or to be defined
    variable3= (SMA20-SMA200)/SMA200 *100 //or to be defined
    variable4= (SMA200-SMA2400)/SMA2400 *100   // to be defined
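    // The four inputs combine momentum (RSI) with the relative spacing of
    // price, SMA20, SMA200 and SMA2400, expressed in percent.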
    
    //   >>>    LEARNING PROCESS  <<<
    // If the classifier has detected a winning trade in the past
    //IF hour > 7 and hour < 21 then
    IF BARINDEX > 2500 THEN
    IF classifierlong=1 or classifiershort=1 THEN
    IF hour > 7 and hour < 21 then
    candleentry=max(candleentrylong,candleentryshort)
    Y1=classifierlong
    Y2=classifiershort
    
    //    >>> INPUT FOR NEURONS <<<
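    // Train on the features as they were at the hindsight entry bar:
    // barindex-candleentry is the offset back to that bar, Y1/Y2 the labels.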
    input1=variable1[barindex-candleentry]
    input2=variable2[barindex-candleentry]
    input3=variable3[barindex-candleentry]
    input4=variable4[barindex-candleentry]
    
    FOR i=1 to 10 DO //  THIS HAS TO BE IMPROVED
    ETAi=ETA - ETA/10*(i-1) //Learning Rate
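    // With ETA=1 the step size decays linearly over the 10 passes:
    // 1.0, 0.9, ..., 0.1, i.e. ten gradient-descent steps on the same example.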
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
    output1=1/(1+EXP(-1*output1))
    output2=1/(1+EXP(-1*output2))
    
    //   >>> PARTIAL DERIVATIVES OF COST FUNCTION <<<
    //     ... CROSS-ENTROPY AS COST FUNCTION ...
    // COST = -( Y1*LOG(output1)+(1-Y1)*LOG(1-output1) ) - ( Y2*LOG(output2)+(1-Y2)*LOG(1-output2) )
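    // For a sigmoid output with cross-entropy the chain rule simplifies to
    // dCOST/dz = output - Y (z being the pre-sigmoid sum), so each output
    // weight's gradient is (output - Y) times its hidden activation and each
    // output bias gradient is just (output - Y).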
    
    DerObias1 = (output1-Y1) * 1
    DerObias2 = (output2-Y2) * 1
    
    Derb11    = (output1-Y1) * F1
    Derb12    = (output1-Y1) * F2
    Derb13    = (output1-Y1) * F3
    Derb14    = (output1-Y1) * F4
    Derb15    = (output1-Y1) * F5
    Derb16    = (output1-Y1) * F6
    
    Derb21    = (output2-Y2) * F1
    Derb22    = (output2-Y2) * F2
    Derb23    = (output2-Y2) * F3
    Derb24    = (output2-Y2) * F4
    Derb25    = (output2-Y2) * F5
    Derb26    = (output2-Y2) * F6
    
    //Implementing BackPropagation
    Obias1=Obias1-ETAi*DerObias1
    Obias2=Obias2-ETAi*DerObias2
    
    b11=b11-ETAi*Derb11
    b12=b12-ETAi*Derb12
    b13=b13-ETAi*Derb13
    b14=b14-ETAi*Derb14
    b15=b15-ETAi*Derb15
    b16=b16-ETAi*Derb16
    
    b21=b21-ETAi*Derb21
    b22=b22-ETAi*Derb22
    b23=b23-ETAi*Derb23
    b24=b24-ETAi*Derb24
    b25=b25-ETAi*Derb25
    b26=b26-ETAi*Derb26
    
    //   >>> PARTIAL DERIVATIVES OF COST FUNCTION (HIDDEN LAYER) <<<
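    // One layer back: each hidden neuron k collects the error from both
    // outputs through its outgoing weights b1k/b2k, scaled by the sigmoid
    // derivative Fk*(1-Fk); multiplying by the relevant input gives the
    // gradient of each first-layer weight, with input=1 for the biases.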
    
    DerFbias1 = (output1-Y1) * b11 * F1*(1-F1) * 1 + (output2-Y2) * b21 * F1*(1-F1) * 1
    DerFbias2 = (output1-Y1) * b12 * F2*(1-F2) * 1 + (output2-Y2) * b22 * F2*(1-F2) * 1
    DerFbias3 = (output1-Y1) * b13 * F3*(1-F3) * 1 + (output2-Y2) * b23 * F3*(1-F3) * 1
    DerFbias4 = (output1-Y1) * b14 * F4*(1-F4) * 1 + (output2-Y2) * b24 * F4*(1-F4) * 1
    DerFbias5 = (output1-Y1) * b15 * F5*(1-F5) * 1 + (output2-Y2) * b25 * F5*(1-F5) * 1
    DerFbias6 = (output1-Y1) * b16 * F6*(1-F6) * 1 + (output2-Y2) * b26 * F6*(1-F6) * 1
    
    Dera11    = (output1-Y1) * b11 * F1*(1-F1) * input1 + (output2-Y2) * b21 * F1*(1-F1) * input1
    Dera12    = (output1-Y1) * b11 * F1*(1-F1) * input2 + (output2-Y2) * b21 * F1*(1-F1) * input2
    Dera13    = (output1-Y1) * b11 * F1*(1-F1) * input3 + (output2-Y2) * b21 * F1*(1-F1) * input3
    Dera14    = (output1-Y1) * b11 * F1*(1-F1) * input4 + (output2-Y2) * b21 * F1*(1-F1) * input4
    
    Dera21    = (output1-Y1) * b12 * F2*(1-F2) * input1 + (output2-Y2) * b22 * F2*(1-F2) * input1
    Dera22    = (output1-Y1) * b12 * F2*(1-F2) * input2 + (output2-Y2) * b22 * F2*(1-F2) * input2
    Dera23    = (output1-Y1) * b12 * F2*(1-F2) * input3 + (output2-Y2) * b22 * F2*(1-F2) * input3
    Dera24    = (output1-Y1) * b12 * F2*(1-F2) * input4 + (output2-Y2) * b22 * F2*(1-F2) * input4
    
    Dera31    = (output1-Y1) * b13 * F3*(1-F3) * input1 + (output2-Y2) * b23 * F3*(1-F3) * input1
    Dera32    = (output1-Y1) * b13 * F3*(1-F3) * input2 + (output2-Y2) * b23 * F3*(1-F3) * input2
    Dera33    = (output1-Y1) * b13 * F3*(1-F3) * input3 + (output2-Y2) * b23 * F3*(1-F3) * input3
    Dera34    = (output1-Y1) * b13 * F3*(1-F3) * input4 + (output2-Y2) * b23 * F3*(1-F3) * input4
    
    Dera41    = (output1-Y1) * b14 * F4*(1-F4) * input1 + (output2-Y2) * b24 * F4*(1-F4) * input1
    Dera42    = (output1-Y1) * b14 * F4*(1-F4) * input2 + (output2-Y2) * b24 * F4*(1-F4) * input2
    Dera43    = (output1-Y1) * b14 * F4*(1-F4) * input3 + (output2-Y2) * b24 * F4*(1-F4) * input3
    Dera44    = (output1-Y1) * b14 * F4*(1-F4) * input4 + (output2-Y2) * b24 * F4*(1-F4) * input4
    
    Dera51    = (output1-Y1) * b15 * F5*(1-F5) * input1 + (output2-Y2) * b25 * F5*(1-F5) * input1
    Dera52    = (output1-Y1) * b15 * F5*(1-F5) * input2 + (output2-Y2) * b25 * F5*(1-F5) * input2
    Dera53    = (output1-Y1) * b15 * F5*(1-F5) * input3 + (output2-Y2) * b25 * F5*(1-F5) * input3
    Dera54    = (output1-Y1) * b15 * F5*(1-F5) * input4 + (output2-Y2) * b25 * F5*(1-F5) * input4
    
    Dera61    = (output1-Y1) * b16 * F6*(1-F6) * input1 + (output2-Y2) * b26 * F6*(1-F6) * input1
    Dera62    = (output1-Y1) * b16 * F6*(1-F6) * input2 + (output2-Y2) * b26 * F6*(1-F6) * input2
    Dera63    = (output1-Y1) * b16 * F6*(1-F6) * input3 + (output2-Y2) * b26 * F6*(1-F6) * input3
    Dera64    = (output1-Y1) * b16 * F6*(1-F6) * input4 + (output2-Y2) * b26 * F6*(1-F6) * input4
    
    //Implementing BackPropagation
    Fbias1=Fbias1-ETAi*DerFbias1
    Fbias2=Fbias2-ETAi*DerFbias2
    Fbias3=Fbias3-ETAi*DerFbias3
    Fbias4=Fbias4-ETAi*DerFbias4
    Fbias5=Fbias5-ETAi*DerFbias5
    Fbias6=Fbias6-ETAi*DerFbias6
    
    a11=a11-ETAi*Dera11
    a12=a12-ETAi*Dera12
    a13=a13-ETAi*Dera13
    a14=a14-ETAi*Dera14
    
    a21=a21-ETAi*Dera21
    a22=a22-ETAi*Dera22
    a23=a23-ETAi*Dera23
    a24=a24-ETAi*Dera24
    
    a31=a31-ETAi*Dera31
    a32=a32-ETAi*Dera32
    a33=a33-ETAi*Dera33
    a34=a34-ETAi*Dera34
    
    a41=a41-ETAi*Dera41
    a42=a42-ETAi*Dera42
    a43=a43-ETAi*Dera43
    a44=a44-ETAi*Dera44
    
    a51=a51-ETAi*Dera51
    a52=a52-ETAi*Dera52
    a53=a53-ETAi*Dera53
    a54=a54-ETAi*Dera54
    
    a61=a61-ETAi*Dera61
    a62=a62-ETAi*Dera62
    a63=a63-ETAi*Dera63
    a64=a64-ETAi*Dera64
    
    //GradientNorm = SQRT(DerObias1*DerObias1 + DerObias2*DerObias2+Derb11*Derb11+Derb12*Derb12+Derb13*Derb13+Derb14*Derb14+Derb15*Derb15+Derb16*Derb16 + Derb21*Derb21+Derb22*Derb22+Derb23*Derb23+Derb24*Derb24+Derb25*Derb25+Derb26*Derb26 + DerFbias1*DerFbias1+DerFbias2*DerFbias2+DerFbias3*DerFbias3+DerFbias4*DerFbias4+DerFbias5*DerFbias5+DerFbias6*DerFbias6 + Dera11*Dera11+Dera12*Dera12+Dera13*Dera13+Dera14*Dera14 + Dera21*Dera21+Dera22*Dera22+Dera23*Dera23+Dera24*Dera24 + Dera31*Dera31+Dera32*Dera32+Dera33*Dera33+Dera34*Dera34 + Dera41*Dera41+Dera42*Dera42+Dera43*Dera43+Dera44*Dera44 + Dera51*Dera51+Dera52*Dera52+Dera53*Dera53+Dera54*Dera54 + Dera61*Dera61+Dera62*Dera62+Dera63*Dera63+Dera64*Dera64)
    
    NEXT
    ENDIF
    ENDIF
    //ENDIF
    
    ///////////////////    NEW PREDICTION  ///////////////////
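    // Forward pass on the current bar (runs once barindex > 2500): the
    // latest weights and the live values of variable1..4 give the long and
    // short probabilities used by the entry rules below.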
    //    >>> INPUT NEURONS <<<
    input1=variable1
    input2=variable2
    input3=variable3
    input4=variable4
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
    output1=1/(1+EXP(-1*output1))
    output2=1/(1+EXP(-1*output2))
    ENDIF
    
    //return output1 as "prediction long", output2 as "prediction short", threshold as "threshold"
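    // Entry rules: long when output1 > 0.9, short when output2 > 0.7 while
    // output1 stays below 0.7; exits are fixed point-based targets and stops
    // per side.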
    
    If Output1 > 0.9  and output2 < 1 Then
    Buy at Market
    Endif
    
    If Output2 > 0.7 and output1 < 0.7 Then
    Sellshort at Market
    endif
    
    If LongOnMarket then
    SET TARGET pPROFIT 390
    SET STOP pLOSS 50
    Endif
    
    If ShortOnMarket then
    SET TARGET pPROFIT 100
    SET STOP pLOSS 60
    Endif
    
    #79581
    Nicolas
    Keymaster
    Master

    @stefanb

    You’re off topic…and over optimized 😉

    @grahal

    Thanks for posting it. Now the topic is full of different stuff. I'll let you manage the versioning 🙂

    #79583
    GraHal
    Participant
    Master

    Ha I think it’s a lost cause!?

    I have tried in the past on several threads to encourage the use of version numbers etc, but while we still have folks making elementary faux pas like posting unformatted code (not using the Insert PRT Code button), I think we have to accept we may never have Configuration Control?

    But I think now this Topic can justifiably be renamed 🙂 …

    Confirmation of Trend using Neural Networks (by kind permission of Leo)

    #79605
    GraHal
    Participant
    Master

    Leo posted a later version of the Neural Network (v1.2), so here is the System based on Leo's latest self-learning neural algorithm.

    Tested with Spread = 4

    #79613
    Leo
    Participant
    Veteran

    Wow GraHal!

    It looks awesome!

    Thanks for testing it in a strategy.

    A lot of work ahead! But such a rustic neural network proves that it works; imagine all the possibilities!

    #80056
    Leo
    Participant
    Veteran

    Hi GraHal,

    I posted a slightly improved version of the neural network… but the inputs are totally different.

    In case you are curious.

    NOTE:

    please add this to your codes:

    DEFPARAM PreLoadBars = 10000

    so that you give the algorithm time to learn.
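    For example, the header of the system above would then start like this (a minimal sketch; 10000 is just the suggested value, adjust it to your timeframe and history):

    // Preload extra historical bars so the network has data to learn from
    DEFPARAM PreLoadBars = 10000
    DEFPARAM CUMULATEORDERS = False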

    Thanks in advance

    As you know, I am coding with one hand while my baby is sleeping in the other one, otherwise she will cry, haha 🙂

    #80060
    GraHal
    Participant
    Master

    my baby is sleeping in the other one

    I recall similar scenarios. It can be frustrating at the time, but it passes all too quickly and then the precious moments will only be in your mind, so enjoy it if you can!? 🙂

    Make sure she doesn't swallow a memory stick full of code or she may end up like Nicolas!! 🙂

