Please help me understand what's going on here

 

The i-Regr indicator (as well as numerous others) calculates the linear regression of X bars and displays it on the screen (green line).


At each new bar the index buffers are cleared and re-written with new values; the indicator repaints (as it should). When it does this all prior regression values are lost.

I use the "slope" of the regression in my EA.

I define the slope like this:

MathAbs(1000000*(fx[0]-fx[1])/Period()) 

Where fx[0] is the regression value at the most recent bar, and fx[1] is the value at the previous bar. The rest of that formula (such as the 1000000* and Period()) is not pertinent to my issue; it is there simply to make the numbers larger and easier to read.
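
Spelled out as a small helper it would be roughly this - just a sketch of the formula above, not code copied from my indicator file, and the function name and parameter names are only illustrative:

// Sketch of the slope definition described above (illustrative only).
// fxCurrent = regression value at the most recent bar, fxPrevious = value one bar earlier.
double RegrSlope(double fxCurrent, double fxPrevious)
  {
   // multiplied by 1000000 and divided by the timeframe in minutes purely to
   // make the number bigger and easier to read - not part of the "real" slope
   return(MathAbs(1000000.0*(fxCurrent-fxPrevious)/Period()));
  }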

Since the indicator clears its index buffers, I had to modify it to create a new index buffer - one that would not be erased - to represent the "slope". Here is the relevant code segment that I changed:

extern int barshift = 0;
double slope[];

... 
// in init()
SetIndexBuffer(3, slope);
i0 = barshift;

...
//in start()
for(n=i0;n<=i0+p-1;n++)
  {
    sum=0;
    for(kk=1;kk<=degree;kk++)
    {
       sum+=x[kk+1]*MathPow(n,kk);
    }
    fx[n]=x[1]+sum;
    
  } 
  slope[barshift] = MathAbs(1000000*(fx[barshift]-fx[barshift+1])/Period()); //Store regression slope in a persistent index buffer

I made an EA for testing purposes that calls this modified i-Regr indicator and prints the slope. Here is the code to that EA:

extern int length = 34;
extern int barshift  = 0;


int start()
  {
  if (TimeCurrent()<1306880399) return(0); // only print during the last bars of the test run
  double Slope =  iCustom(Symbol(),0,"i-RegrS",1,1,length,0,3,barshift); //Call the modified i-Regr indicator with a specified barshift.
  Print(" Slope: ",Slope);
  return(0); 
  } 

I ran this "EA" in the strategy tester and it produced the following output:

//shift = 0
2011.05.31 23:59  showslope EURUSD,M5:  Slope: 30.2633 // <- bar 0
2011.05.31 23:55  showslope EURUSD,M5:  Slope: 30.7395 // <- bar 1
2011.05.31 23:50  showslope EURUSD,M5:  Slope: 30.2577 // <- bar 2
2011.05.31 23:45  showslope EURUSD,M5:  Slope: 29.7591 // <- bar 3
2011.05.31 23:40  showslope EURUSD,M5:  Slope: 29.8319 // <- bar 4
2011.05.31 23:35  showslope EURUSD,M5:  Slope: 29.8039 // <- bar 5

//shift = 1
2011.05.31 23:59  showslope EURUSD,M5:  Slope: 30.2577 // <- bar 2 (*why?* should be `bar 1`, Slope: 30.7395)
2011.05.31 23:55  showslope EURUSD,M5:  Slope: 30.2577 // <- bar 2 
2011.05.31 23:50  showslope EURUSD,M5:  Slope: 29.7591 // <- bar 3
2011.05.31 23:45  showslope EURUSD,M5:  Slope: 29.8319 // <- bar 4
2011.05.31 23:40  showslope EURUSD,M5:  Slope: 29.8039 // <- bar 5
2011.05.31 23:35  showslope EURUSD,M5:  Slope: 29.5742 // <- bar 6

//shift = 2
2011.05.31 23:59  showslope EURUSD,M5:  Slope: 29.7591 // <- bar 3 (*why?* should be `bar 2`, Slope: 30.2577)
2011.05.31 23:55  showslope EURUSD,M5:  Slope: 29.7591 // <- bar 3
2011.05.31 23:50  showslope EURUSD,M5:  Slope: 29.8319 // <- bar 4
2011.05.31 23:45  showslope EURUSD,M5:  Slope: 29.8039 // <- bar 5
2011.05.31 23:40  showslope EURUSD,M5:  Slope: 29.5742 // <- bar 6
2011.05.31 23:35  showslope EURUSD,M5:  Slope: 28.8627 // <- bar 7

 

(...continued)

Aside from the lines flagged with *why?* this produced the right results.

My next step was to produce the same results but with a script that I can drop on the chart instead of using an EA through the strategy tester. To make sure I was using the same data, I ran the strategy tester in visual mode up to the last output and then dropped my script on the strategy tester chart.

This is where I ran into an unexpected error the nature of which I cannot understand.

Here is the script:

extern int length = 34;
extern int barshift  = 0;


int start()
{

  double Slope =  iCustom(Symbol(),0,"i-RegrS",1,1,34,barshift,3,barshift); // Same call as the EA's, but now barshift is also passed into the indicator itself
  Print(" Slope: ",Slope);
  return(0);
}

And the output:

//shift = 0
showslope_script EURUSD,M5:  Slope: 30.2633  // Excellent - that's what I expected...

//shift = 1
showslope_script EURUSD,M5:  Slope: 30.5434 /* Where did this number come from? Should be `30.2577` like in the strategy tester! */

//shift = 2
showslope_script EURUSD,M5:  Slope: 30.0448 /*  Where did this number come from? Should be `29.7591` like in the strategy tester! */

If I use a barshift of 0 (latest bar) within the script, then the proper slope value is returned: 30.2633.

BUT... if I use a barshift greater than 0, then incorrect values are returned! Why does this occur? This has been plaguing me for days.

Any assistance would be GREATLY appreciated!

- Mike

 

In your EA you use this . . .

double Slope =  iCustom(Symbol(),0,"i-RegrS",1,1,length,0,3,barshift); 

In your script you use this . . . .

double Slope =  iCustom(Symbol(),0,"i-RegrS",1,1,34,barshift,3,barshift);

Why the difference ?

 

The only actual difference between those two statements is this:

(...0,3,barshift) >>> (...barshift,3,barshift)

My reasoning went something like this:
Since the strategy tester iterates through every bar sequentially, I can fill up the indicator buffer in the following manner:

 slope[0] = MathAbs(1000000*(fx[0]-fx[1])/Period()); 

And then simply use the `shift` property of iCustom to access the appropriate bar-shift.

However, if I used that same statement in a script, I would only be able to access the most recent bar's calculations. Since I don't have the strategy tester to automatically iterate through the bars, I need a way to bar-shift within the indicator itself. That is why I passed "barshift" instead of "0" in the script.

I know that there must be some faulty logic either in what I just explained or in something closely related. When I substitute the script's iCustom call for the EA's I get those same strange results!
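
To make the two approaches concrete, this is roughly how the slope buffer gets written in each case (paraphrasing my own lines from above, not a new version of the file):

// EA / strategy tester: only the newest slot is written on each bar. History
// builds up because the tester drives the indicator bar by bar, so the EA can
// read older values through the shift parameter of iCustom().
slope[0] = MathAbs(1000000*(fx[0]-fx[1])/Period());

// Script: it runs only once, so I bar-shift inside the indicator instead and
// write the value for the bar I actually asked for.
slope[barshift] = MathAbs(1000000*(fx[barshift]-fx[barshift+1])/Period());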

 

Are you passing the correct number of Externs ? The original i-Regr has 4 externs, did you add a fifth ? It looks like you are only passing 4 ?

extern int degree = 1;
extern double kstd = 2.0;
extern int bars = 48;
extern int shift = 0;

+

extern int barshift = 0;

??
 
I removed the original 4th extern, which was "shift", but that is entirely the wrong kind of shift. I replaced that parameter with the sort of "barshift" that I am talking about. (The `shift` extern that i-Regr has originally simply shifts the buffer to the left or right - not what I want.)
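
So the extern block in my copy is just the original one with "shift" swapped out for "barshift" (typed here from memory, with the original defaults - my modified file is what actually counts):

extern int    degree   = 1;
extern double kstd     = 2.0;
extern int    bars     = 48;
extern int    barshift = 0;   // replaces the original "shift" extern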
 

Does i-Regr reposition during the formation of the current bar ? If it does, that might cause the results to be different when the current bar is included in your calculations (shift of zero), as opposed to only using fixed, completed-bar prices (shift of 1 or more).

I don't have that indicator, but if you look through the code and see that it uses any of the prices that change during the formation of the current bar - in other words, anything other than the Open price - that should tell you it probably does reposition.

Edit: I looked up that indicator, it does use High[] and Low[] but I noticed it also has this code at the beginning of start():

if(prevtime1==iTime(NULL,0,0))
   {
      prevtime1=iTime(NULL,0,0);
      return(0);
   }
   else
      prevtime1=iTime(NULL,0,0);

I think this would mean it only uses High[] and Low[] of the current bar at the moment of the bar's first tick, at which point they are equal to the open price, so it should not reposition. I'm thinking that would mean the indicator will give you a different result for each bar when it is shifted, because then it is using the real High[] and Low[].

I'm not entirely sure about that because I just quickly skimmed over the code, and the variables used make it hard to read.

 
SDC:

Does i-Regr reposition during the formation of the current bar ? If it does, that might cause the results to be different when the current bar is included in your calculations (shift of zero), as opposed to only using fixed, completed-bar prices (shift of 1 or more).

I don't have that indicator, but if you look through the code and see that it uses any of the prices that change during the formation of the current bar - in other words, anything other than the Open price - that should tell you it probably does reposition.

Edit: I looked up that indicator, it does use High[] and Low[] but I noticed it also has this code at the beginning of start():

I think this would mean it only uses High[] and Low[] of the current bar at the moment of the bar's first tick, at which point they are equal to the open price, so it should not reposition. I'm thinking that would mean the indicator will give you a different result for each bar when it is shifted, because then it is using the real High[] and Low[].

I'm not entirely sure about that because I just quickly skimmed over the code, and the variables used make it hard to read.


I am not sure where you found that code, but that is not the indicator that I'm using. The indicator I used only references the Close[x] price and has no code analogous to what you quoted.

Attached is my modification of the i-Regr indicator (where I have added the `slope` index buffer).

Files:
i-regrs.mq4  5 kb
 
An obvious comment, if you use a barshift of 0 then Close[barshift] has a high probability of being incorrect.
 

Hmmm, ok, this is the indicator I thought you meant: https://www.mql5.com/en/code/8436 - but anyway, yes, as Raptor said, Close[0] will introduce the same kinds of repositioning errors as High[0] or Low[0] would. You could change that to Open[0] just for error elimination purposes.
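
Something like this wherever your indicator reads the current bar's price (just a sketch of what I mean - the exact line will look different in your file, price and n here are only placeholder names):

// for error-elimination testing only: the open of the current bar is fixed
// from its first tick, so the regression input cannot change while the bar forms
price = Open[n];    // instead of: price = Close[n];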

If you still get your error when your indicator uses Open[0] then the problem is caused by something else.

 
RaptorUK:
An obvious comment, if you use a barshift of 0 then Close[barshift] has a high probability of being incorrect.

How so?
I should mention that all of this is used in "open prices only" mode! Perhaps this is where the key point lies?

Even if Close[0] is somehow a less accurate representation of the real market, I do not think that should affect the consistency of results.

Anyway - I get those different results even with higher shifts! In fact shift 0 is the only one that worked as intended in both the EA and the script.
