Neural networks programming with ProRealTime

#79003
    Nicolas
    Keymaster
    Master

Discussion about the trading strategy GraHal built from the classifier code can be found by following this link: Long only trading with trend confirmation on the 5-minute timeframe

Inertia thanked this post
#79072
    Leo
    Participant
    Veteran

Hi all, here is the complete classifier.

Now that I have completed it, I realised that the classifier can be whatever you want: for example, fractals with a loop to store the positions of the points, or any other method that detects the exact change of trend (a minimal sketch of the fractal idea follows the code below).

    //Variables:
    //candlesback=7
    //ProfitRiskRatio=2
    //spread=1.5
    
    
//volatility estimate (mean range plus one standard deviation), used as extra room beyond the stop level
myATR=average[20](range)+std[20](range)
ExtraStopLoss=myATR
//ExtraStopLoss=3*spread*pipsize //alternative fixed buffer
    
//for long trades: scan back up to candlesback candles for a winning long setup
classifierlong=0
FOR scanL=1 to candlesback DO
IF classifierlong[scanL]=1 then
BREAK //this candle was already classified on an earlier bar
ENDIF
//distance from entry (close) to target, for a stop placed below the low
LongTradeLength=ProfitRiskRatio*(close[scanL]-(low[scanL]-ExtraStopLoss[scanL]))
//target already exceeded by the current bar's high, net of spread...
IF close[scanL]+LongTradeLength < high-spread*pipsize then
//...and the stop was never hit since the candidate entry candle
IF lowest[scanL+1](low) > low[scanL]-ExtraStopLoss[scanL]+spread*pipsize then
classifierlong=1
candleentrylong=barindex-scanL
BREAK
ENDIF
ENDIF
NEXT
IF classifierlong=1 then
//visualise the classified long: reward segment and risk/reward ellipse
DRAWSEGMENT(candleentrylong,close[barindex-candleentrylong],barindex,close[barindex-candleentrylong]+LongTradeLength) COLOURED(0,150,0)
DRAWELLIPSE(candleentrylong-1,low[barindex-candleentrylong]-ExtraStopLoss,barindex+1,high+ExtraStopLoss) COLOURED(0,150,0)
ENDIF
    
//for short trades: mirror logic of the long scan
classifiershort=0
FOR scanS=1 to candlesback DO
IF classifiershort[scanS]=1 then
BREAK
ENDIF
ShortTradeLength=ProfitRiskRatio*((high[scanS]-close[scanS])+ExtraStopLoss[scanS])
IF close[scanS]-ShortTradeLength > low+spread*pipsize then
IF highest[scanS+1](high) < high[scanS]+ExtraStopLoss[scanS]-spread*pipsize then
classifiershort=1
candleentryshort=barindex-scanS
BREAK
ENDIF
ENDIF
NEXT
    
    IF classifiershort=1 then
    DRAWSEGMENT(candleentryshort,close[barindex-candleentryshort],barindex,close[barindex-candleentryshort]-ShortTradeLength) COLOURED(150,0,0)
    DRAWELLIPSE(candleentryshort-1,high[barindex-candleentryshort]+ExtraStopLoss,barindex+1,low-ExtraStopLoss) COLOURED(150,0,0)
    ENDIF
    
    
    return
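
As a hedged illustration of the fractal idea mentioned above (my sketch, not Leo's code; swinglowprice, swinghighprice and the two bar variables are hypothetical names), here is a minimal swing detector that could stand in for the classifier:

//Minimal fractal-style swing detector (illustrative sketch only).
//A bar is a swing low/high if it is more extreme than the two bars
//on either side; the pattern is confirmed 2 bars after the fact.
if low[2] < low[4] and low[2] < low[3] and low[2] < low[1] and low[2] < low then
swinglowprice = low[2] //price of the confirmed swing low
swinglowbar = barindex-2 //stored position of the point, as Leo suggests
endif
if high[2] > high[4] and high[2] > high[3] and high[2] > high[1] and high[2] > high then
swinghighprice = high[2]
swinghighbar = barindex-2
endif
return swinglowprice as "swing low", swinghighprice as "swing high"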
    
GraHal and Nicolas thanked this post
#79073
    GraHal
    Participant
    Master

    can be found by following this link:

The link has come up with a 404 Not Found for me both yesterday and today.

Hey, thank you so much for sharing your latest work, @Leo … this is exciting new work!

    Cheers

#79135
    Leo
    Participant
    Veteran

    Hi all,

I wrote the main structure of the neural network.

Still a lot of work in progress… for example, figuring out the initial values for the weights and biases. I have added a photo with the scheme.

    /////////////////    CLASSIFIER    /////////////
    
    myATR=average[20](range)+std[20](range)
    ExtraStopLoss=MyATR
    //ExtraStopLoss=3*spread*pipsize
    
    //for long trades
    classifierlong=0
    FOR scanL=1 to candlesback DO
    IF classifierlong[scanL]=1 then
    BREAK
    ENDIF
    LongTradeLength=ProfitRiskRatio*(close[scanL]-(low[scanL]-ExtraStopLoss[scanL]))
    IF close[scanL]+LongTradeLength < high-spread*pipsize then
    IF lowest[scanL+1](low) > low[scanL]-ExtraStopLoss[scanL]+spread*pipsize then
    classifierlong=1
    candleentrylong=barindex-scanL
    BREAK
    ENDIF
    ENDIF
    NEXT
    
    //for short trades
    classifiershort=0
    FOR scanS=1 to candlesback DO
    IF classifiershort[scanS]=1 then
    BREAK
    ENDIF
    ShortTradeLength=ProfitRiskRatio*((high[scanS]-close[scanS])+ExtraStopLoss[scanS])
    IF close[scanS]-ShortTradeLength > low+spread*pipsize then
    IF highest[scanS+1](high) < high[scanS]+ExtraStopLoss[scanS]-spread*pipsize then
    classifiershort=1
    candleentryshort=barindex-scanS
    BREAK
    ENDIF
    ENDIF
    NEXT
    
/////////////////////////  NEURAL NETWORK  ///////////////////
    
    //variable1=     // to be defined
    //variable2=     // to be defined
    //variable3=     // to be defined
    //variable4=     // to be defined
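//network layout: 4 inputs -> 6 sigmoid hidden neurons -> 2 sigmoid outputs
//(4*6+6)+(6*2+2) = 44 weights and biases to initialise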
    
    IF classifierlong=1 or classifiershort=1 THEN
    candleentry=max(candleentrylong,candleentryshort)
    
    //    >>> INPUT NEURONS <<<
    input1=variable1[barindex-candleentry]
    input2=variable2[barindex-candleentry]
    input3=variable3[barindex-candleentry]
    input4=variable4[barindex-candleentry]
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
output1=1/(1+EXP(-1*output1))
output2=1/(1+EXP(-1*output2))
ENDIF
    
    
    
    
#79140
    Nicolas
    Keymaster
    Master

For the weights, I would suggest using the percentage of similarity as a coefficient, like the way I did in this previous version: https://www.prorealcode.com/topic/neural-networks-programming-with-prorealtime/#post-78789

#79150
    Leo
    Participant
    Veteran

For the weights, I would suggest using the percentage of similarity as a coefficient, like the way I did in this previous version: https://www.prorealcode.com/topic/neural-networks-programming-with-prorealtime/#post-78789

But we need 4*6 + 6 + 2*6 + 2 = 44 initial values :s

I am still reading and learning about this topic.
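
As a hedged sketch of how those 44 initial values could be set in ProBuilder (my illustration; the 0.5 values are arbitrary placeholders, not recommendations):

//initialise each weight and bias exactly once, before any learning step
once a11 = 0.5
once a12 = 0.5
once Fbias1 = 0
once b11 = 0.5
once Obias1 = 0
//…and likewise for the remaining parameters, 44 ONCE lines in total,
//or values taken from a statistical study as Nicolas suggests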

#79152
    Nicolas
    Keymaster
    Master

    You are referring to this kind of line: F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1 , right?

I assume that a11 … a14 are different weights that give more or less importance to each of your inputs. That’s how weights are used in neural networks, such as the perceptron for instance. There is no single way to determine how weights are calculated or initialised.

As I said in my previous post, you can set them as a statistical percentage of good values. So, let’s assume that all your inputs (input1 to input4) are boolean variables (0=false, 1=true). For example, if your input1 is a value that gives better accuracy in your previous trade results (and you’ll have to make a study for this, like the one I did in my example), you can weight the input like this:

input1 was true 53% of the time when a classifierlong occurred, so a11=0.53

    F1 = 0.53*1+a12*input2+a13*input3+a14*input4+Fbias1

This is a rough idea of how you can temper an input; a sketch of how such a percentage could be measured is below.
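
A hedged sketch of measuring such a percentage in ProBuilder (my illustration, not Nicolas's code; it assumes it is placed after the classifier code above, with input1 a 0/1 variable; hits and occurrences are hypothetical counters):

//count how often input1 was true when the long classifier fired
if classifierlong=1 then
occurrences = occurrences + 1 //total classified long setups
if input1=1 then
hits = hits + 1 //setups where input1 was also true
endif
endif
//statistical percentage used as the weight, e.g. 0.53
if occurrences > 0 then
a11 = hits / occurrences
endif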

Leo thanked this post
#79179
    Leo
    Participant
    Veteran

    Thanks Nicolas,

Yes indeed. From what I have read so far, the initial values involve a lot of trial and error; others say that with experience one can find good initial values. The actual values are then achieved during the learning process of the neural network.

    Talking about the learning process…

    Imagine that we build our Neural Network with success.

    How do we do the learning process?

Is it the function DEFPARAM PreLoadBars = 50000?

Does it mean that 50k candles are loaded and the code runs through all of those candles? That would be great for the learning process.

#79185
    Nicolas
    Keymaster
    Master

Preloadbars is only used in ProBacktest, and the maximum value you can preload is 10,000 bars.

This is not the case for indicators, where the maximum number of bars loaded is the value shown in the ‘units’ dropdown list.

#79194
    Leo
    Participant
    Veteran

    Sorry Nicolas, I could not fully understand the concept of PreLoadBars = 10000.

At the moment of activating the strategy, does it mean that the program is run 10,000 units of time before barindex=1?

I would like to understand this correctly, because the neural network should learn before taking trading decisions; otherwise we have to “wait” many bars before actually buying or selling… that can be a pain in the neck with this methodology of trading.

Another way is to backtest using the GRAPH function and “try” to export/import all 44 values as initial values for a quicker learning process…

Puff… what if we are eager to create a bigger neural network with hundreds of weights!?

I put a lot of hope in that “preloadbars” function. Am I right? Or naive?

#79195
    Leo
    Participant
    Veteran

    Hi all,

I completed the calculation of the partial derivatives for the learning process of the neural network. I would really appreciate it if someone could confirm them, or correct them if I missed something.
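
For reference, written compactly (my notation: o1, o2 are the two outputs, Fj the hidden activations, xk the inputs), the quantities computed in the code below correspond to

Derb(i,j) = (Yi - oi) * oi*(1 - oi) * Fj
Dera(j,k) = [ (Y1 - o1)*o1*(1 - o1)*b1j + (Y2 - o2)*o2*(1 - o2)*b2j ] * Fj*(1 - Fj) * xk

i.e. the negatives of the partial derivatives dCOST/db and dCOST/da, since dCOST/doi = -(Yi - oi).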

    /////////////////    CLASSIFIER    /////////////
    
    myATR=average[20](range)+std[20](range)
    ExtraStopLoss=MyATR
    //ExtraStopLoss=3*spread*pipsize
    
    //for long trades
    classifierlong=0
    FOR scanL=1 to candlesback DO
    IF classifierlong[scanL]=1 then
    BREAK
    ENDIF
    LongTradeLength=ProfitRiskRatio*(close[scanL]-(low[scanL]-ExtraStopLoss[scanL]))
    IF close[scanL]+LongTradeLength < high-spread*pipsize then
    IF lowest[scanL+1](low) > low[scanL]-ExtraStopLoss[scanL]+spread*pipsize then
    classifierlong=1
    candleentrylong=barindex-scanL
    BREAK
    ENDIF
    ENDIF
    NEXT
    
    //for short trades
    classifiershort=0
    FOR scanS=1 to candlesback DO
    IF classifiershort[scanS]=1 then
    BREAK
    ENDIF
    ShortTradeLength=ProfitRiskRatio*((high[scanS]-close[scanS])+ExtraStopLoss[scanS])
    IF close[scanS]-ShortTradeLength > low+spread*pipsize then
    IF highest[scanS+1](high) < high[scanS]+ExtraStopLoss[scanS]-spread*pipsize then
    classifiershort=1
    candleentryshort=barindex-scanS
    BREAK
    ENDIF
    ENDIF
    NEXT
    
/////////////////////////  NEURAL NETWORK  ///////////////////
    
    //variable1=     // to be defined
    //variable2=     // to be defined
    //variable3=     // to be defined
    //variable4=     // to be defined
    
    IF classifierlong=1 or classifiershort=1 THEN
    candleentry=max(candleentrylong,candleentryshort)
    
    Y1=classifierlong
    Y2=classifiershort
    
    //    >>> INPUT NEURONS <<<
    input1=variable1[barindex-candleentry]
    input2=variable2[barindex-candleentry]
    input3=variable3[barindex-candleentry]
    input4=variable4[barindex-candleentry]
    //   >>> FIRST LAYER OF NEURONS <<<
    F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
    F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
    F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
    F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
    F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
    F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
    F1=1/(1+EXP(-1*F1))
    F2=1/(1+EXP(-1*F2))
    F3=1/(1+EXP(-1*F3))
    F4=1/(1+EXP(-1*F4))
    F5=1/(1+EXP(-1*F5))
    F6=1/(1+EXP(-1*F6))
    //   >>> OUTPUT NEURONS <<<
    output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
    output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
    output1=1/(1+EXP(-1*output1))
    output2=1/(1+EXP(-1*output2))
    
// >>> PARTIAL DERIVATIVES OF COST FUNCTION <<<
// COST = (1/2)* ( (Y1-output1)^2 + (Y2-output2)^2 )
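// Sign convention: dCOST/doutput_i = -(Yi-output_i), so each "Der..." term
// below is the negative gradient; gradient descent therefore updates each
// parameter as w = w + epsilon*Derw.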
    
    DerObias1 = (Y1-output1) * output1*(1-output1) * 1
    DerObias2 = (Y2-output2) * output2*(1-output2) * 1
    
    Derb11    = (Y1-output1) * output1*(1-output1) * F1
    Derb12    = (Y1-output1) * output1*(1-output1) * F2
    Derb13    = (Y1-output1) * output1*(1-output1) * F3
    Derb14    = (Y1-output1) * output1*(1-output1) * F4
    Derb15    = (Y1-output1) * output1*(1-output1) * F5
    Derb16    = (Y1-output1) * output1*(1-output1) * F6
    
    Derb21    = (Y2-output2) * output2*(1-output2) * F1
    Derb22    = (Y2-output2) * output2*(1-output2) * F2
    Derb23    = (Y2-output2) * output2*(1-output2) * F3
    Derb24    = (Y2-output2) * output2*(1-output2) * F4
    Derb25    = (Y2-output2) * output2*(1-output2) * F5
    Derb26    = (Y2-output2) * output2*(1-output2) * F6
    
    DerFbias1 = (Y1-output1) * output1*(1-output1) * b11 * F1*(1-F1) * 1 + (Y2-output2) * output2*(1-output2) * b21 * F1*(1-F1) * 1
    DerFbias2 = (Y1-output1) * output1*(1-output1) * b12 * F2*(1-F2) * 1 + (Y2-output2) * output2*(1-output2) * b22 * F2*(1-F2) * 1
    DerFbias3 = (Y1-output1) * output1*(1-output1) * b13 * F3*(1-F3) * 1 + (Y2-output2) * output2*(1-output2) * b23 * F3*(1-F3) * 1
    DerFbias4 = (Y1-output1) * output1*(1-output1) * b14 * F4*(1-F4) * 1 + (Y2-output2) * output2*(1-output2) * b24 * F4*(1-F4) * 1
    DerFbias5 = (Y1-output1) * output1*(1-output1) * b15 * F5*(1-F5) * 1 + (Y2-output2) * output2*(1-output2) * b25 * F5*(1-F5) * 1
    DerFbias6 = (Y1-output1) * output1*(1-output1) * b16 * F6*(1-F6) * 1 + (Y2-output2) * output2*(1-output2) * b26 * F6*(1-F6) * 1
    
    Dera11    = (Y1-output1) * output1*(1-output1) * b11 * F1*(1-F1) * input1 + (Y2-output2) * output2*(1-output2) * b21 * F1*(1-F1) * input1
    Dera12    = (Y1-output1) * output1*(1-output1) * b11 * F1*(1-F1) * input2 + (Y2-output2) * output2*(1-output2) * b21 * F1*(1-F1) * input2
    Dera13    = (Y1-output1) * output1*(1-output1) * b11 * F1*(1-F1) * input3 + (Y2-output2) * output2*(1-output2) * b21 * F1*(1-F1) * input3
    Dera14    = (Y1-output1) * output1*(1-output1) * b11 * F1*(1-F1) * input4 + (Y2-output2) * output2*(1-output2) * b21 * F1*(1-F1) * input4
    
    Dera21    = (Y1-output1) * output1*(1-output1) * b12 * F2*(1-F2) * input1 + (Y2-output2) * output2*(1-output2) * b22 * F2*(1-F2) * input1
    Dera22    = (Y1-output1) * output1*(1-output1) * b12 * F2*(1-F2) * input2 + (Y2-output2) * output2*(1-output2) * b22 * F2*(1-F2) * input2
    Dera23    = (Y1-output1) * output1*(1-output1) * b12 * F2*(1-F2) * input3 + (Y2-output2) * output2*(1-output2) * b22 * F2*(1-F2) * input3
    Dera24    = (Y1-output1) * output1*(1-output1) * b12 * F2*(1-F2) * input4 + (Y2-output2) * output2*(1-output2) * b22 * F2*(1-F2) * input4
    
    Dera31    = (Y1-output1) * output1*(1-output1) * b13 * F3*(1-F3) * input1 + (Y2-output2) * output2*(1-output2) * b23 * F3*(1-F3) * input1
    Dera32    = (Y1-output1) * output1*(1-output1) * b13 * F3*(1-F3) * input2 + (Y2-output2) * output2*(1-output2) * b23 * F3*(1-F3) * input2
    Dera33    = (Y1-output1) * output1*(1-output1) * b13 * F3*(1-F3) * input3 + (Y2-output2) * output2*(1-output2) * b23 * F3*(1-F3) * input3
    Dera34    = (Y1-output1) * output1*(1-output1) * b13 * F3*(1-F3) * input4 + (Y2-output2) * output2*(1-output2) * b23 * F3*(1-F3) * input4
    
    Dera41    = (Y1-output1) * output1*(1-output1) * b14 * F4*(1-F4) * input1 + (Y2-output2) * output2*(1-output2) * b24 * F4*(1-F4) * input1
    Dera42    = (Y1-output1) * output1*(1-output1) * b14 * F4*(1-F4) * input2 + (Y2-output2) * output2*(1-output2) * b24 * F4*(1-F4) * input2
    Dera43    = (Y1-output1) * output1*(1-output1) * b14 * F4*(1-F4) * input3 + (Y2-output2) * output2*(1-output2) * b24 * F4*(1-F4) * input3
    Dera44    = (Y1-output1) * output1*(1-output1) * b14 * F4*(1-F4) * input4 + (Y2-output2) * output2*(1-output2) * b24 * F4*(1-F4) * input4
    
    Dera51    = (Y1-output1) * output1*(1-output1) * b15 * F5*(1-F5) * input1 + (Y2-output2) * output2*(1-output2) * b25 * F5*(1-F5) * input1
    Dera52    = (Y1-output1) * output1*(1-output1) * b15 * F5*(1-F5) * input2 + (Y2-output2) * output2*(1-output2) * b25 * F5*(1-F5) * input2
    Dera53    = (Y1-output1) * output1*(1-output1) * b15 * F5*(1-F5) * input3 + (Y2-output2) * output2*(1-output2) * b25 * F5*(1-F5) * input3
    Dera54    = (Y1-output1) * output1*(1-output1) * b15 * F5*(1-F5) * input4 + (Y2-output2) * output2*(1-output2) * b25 * F5*(1-F5) * input4
    
    Dera61    = (Y1-output1) * output1*(1-output1) * b16 * F6*(1-F6) * input1 + (Y2-output2) * output2*(1-output2) * b26 * F6*(1-F6) * input1
    Dera62    = (Y1-output1) * output1*(1-output1) * b16 * F6*(1-F6) * input2 + (Y2-output2) * output2*(1-output2) * b26 * F6*(1-F6) * input2
    Dera63    = (Y1-output1) * output1*(1-output1) * b16 * F6*(1-F6) * input3 + (Y2-output2) * output2*(1-output2) * b26 * F6*(1-F6) * input3
    Dera64    = (Y1-output1) * output1*(1-output1) * b16 * F6*(1-F6) * input4 + (Y2-output2) * output2*(1-output2) * b26 * F6*(1-F6) * input4
    
//squared gradient magnitude, accumulated term by term (g2 is a scratch variable)
g2 = DerObias1*DerObias1 + DerObias2*DerObias2
g2 = g2 + Derb11*Derb11 + Derb12*Derb12 + Derb13*Derb13 + Derb14*Derb14 + Derb15*Derb15 + Derb16*Derb16
g2 = g2 + Derb21*Derb21 + Derb22*Derb22 + Derb23*Derb23 + Derb24*Derb24 + Derb25*Derb25 + Derb26*Derb26
g2 = g2 + DerFbias1*DerFbias1 + DerFbias2*DerFbias2 + DerFbias3*DerFbias3 + DerFbias4*DerFbias4 + DerFbias5*DerFbias5 + DerFbias6*DerFbias6
g2 = g2 + Dera11*Dera11 + Dera12*Dera12 + Dera13*Dera13 + Dera14*Dera14
g2 = g2 + Dera21*Dera21 + Dera22*Dera22 + Dera23*Dera23 + Dera24*Dera24
g2 = g2 + Dera31*Dera31 + Dera32*Dera32 + Dera33*Dera33 + Dera34*Dera34
g2 = g2 + Dera41*Dera41 + Dera42*Dera42 + Dera43*Dera43 + Dera44*Dera44
g2 = g2 + Dera51*Dera51 + Dera52*Dera52 + Dera53*Dera53 + Dera54*Dera54
g2 = g2 + Dera61*Dera61 + Dera62*Dera62 + Dera63*Dera63 + Dera64*Dera64
GradientNorm = SQRT(g2)
ENDIF

    
#79197
    Leo
    Participant
    Veteran

Now, every time a new input is generated by the classifier, every weight and bias can be improved (the neural network learns) by using the equation I attached (epsilon is a small value that can be a variable to be optimised in walk-forward testing); a sketch of this update step is below.
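
A minimal sketch of that update step (my illustration of the standard gradient-descent rule, reusing the variable names from the code above; the epsilon value is an arbitrary placeholder):

//gradient-descent update, one step per classifier event
epsilon = 0.1 //learning rate, to be optimised in walk-forward testing
Obias1 = Obias1 + epsilon*DerObias1 //Der... already carries the descent sign
Obias2 = Obias2 + epsilon*DerObias2
b11 = b11 + epsilon*Derb11
Fbias1 = Fbias1 + epsilon*DerFbias1
a11 = a11 + epsilon*Dera11
//…and likewise for the remaining weights and biases (44 parameters in total)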

It would be great if some of you could help me with this project.

In fact, now that I have finished the back-propagation algorithm, we are already very close to creating an indicator for prediction using a neural network.

All I do is apply what I learned here:

    http://neuralnetworksanddeeplearning.com/

#79200
    Nicolas
    Keymaster
    Master

At the moment of activating the strategy, does it mean that the program is run 10,000 units of time before barindex=1?

Just test whether a variable has already been incremented on the first bar:

defparam preloadbars=10000

a = a + 1 //increments once on every executed bar, preloaded ones included

if a = 0 then //never true after the first increment; placeholder order
buy at market
endif

graph a //check the counter's value on the first visible bar

If a >= 10,000 on the first bar of the backtest, then the code has been read and executed ten thousand times (I have a very bad feeling about this! 🙂 )

#79206
    GraHal
    Participant
    Master

It would be great if some of you could help me with this project.

I only wish I could, Leo, and I feel sad that only Nicolas is helping you, but you also need folks with more time than Nicolas can spare.

Your work is groundbreaking as far as most of us are concerned. I did read the reference you posted and broadly understood it, but it’s the coding I struggle with.

    Tomorrow I will see if I can contact @Maz as he is a capable member who sticks in my mind as one who similarly tried to move forward with complex collaborative projects on here.

#79209
    Leo
    Participant
    Veteran

It works!

defparam preloadbars=10000

a = a + 1 //counts every executed bar, preloaded ones included

if a = 10001 then //first visible bar: 10,000 preloaded bars + 1
buy at market
endif

graph a
    

That means it can learn over 10,000 bars, which is very good news, because we do not have to wait for it to learn live.
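
A hedged sketch of what that gating could look like (my illustration; the counter name is hypothetical):

defparam preloadbars=10000
//only take live decisions once the preloaded learning phase is over
bars = bars + 1
if bars > 10000 then
//place the network-based entry logic here, e.g. once output1 > 0.5
endif
graph bars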
