OnTester(), the Genetic Algorithm and return(INIT_PARAMETERS_INCORRECT)

 

Hi, does anybody know a workaround for this disappointing behaviour?

In OnInit() I use:

if (IsPerA == false && ValPerA != 2 ) return(INIT_PARAMETERS_INCORRECT);

to switch the testing of a variable on and off and, when it is switched off, to make the optimizer reject all variations of that variable.
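
In context, the check sits in OnInit() roughly like this (just a sketch; the declarations of IsPerA and ValPerA, with 2 as the default value, are my assumption of how they look):

extern bool   IsPerA  = false;  // switch: test this variable at all?
extern double ValPerA = 2;      // the variable's value; 2 = default

int OnInit() {
   // with the switch off, accept only the default value and reject
   // every other variation the optimizer generates for ValPerA
   if (IsPerA == false && ValPerA != 2) return(INIT_PARAMETERS_INCORRECT);
   return(INIT_SUCCEEDED);
}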

If I disable this return(INIT_PARAMETERS_INCORRECT) in OnInit(), the optimizer in genetic mode performs ~12'000 passes. As each valid pass writes a new line to a CSV file (the cache has been deleted before each start!), I found ~10'000 rows in that CSV file.

If I now enable return(INIT_PARAMETERS_INCORRECT), I see ~19'000 passes (that is, including all skipped parameter sets), but only 4'000 rows in my CSV file.

In the Journal tab I read:

There were 9591 passes done during optimization, 4862 results have been discarded as insignificant

Here it seems that the skipped parameter-sets aren't counted.

But in the Optimization Results and Optimization Graph tabs I can see all the skipped parameter sets, inflating everything.

Finally, the best result achieved is far away from what I get if I disable return(INIT_PARAMETERS_INCORRECT).

It means all the useless and meaningless (and therefore skipped) parameter sets are counted as valid tries by the optimizer, devaluing the whole optimization process.

Does anybody have an idea what to do, other than informing the service desk or not using INIT_PARAMETERS_INCORRECT?

 
Honestly, I didn't understand anything of your problem.
 

Me neither.

The point is to skip bad combinations rather than to process and reject all ticks of that combination.

Are you rejecting all ticks for bad combinations? Otherwise all bets are off.
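
By "rejecting all ticks" I mean something like returning from OnTick() right away for a bad combination, so the pass runs through without doing anything (a rough sketch using the names from your snippet, not your actual code):

void OnTick() {
   // bad combination: the switch is off but the optimizer varied the value anyway
   // -> do nothing on any tick, so the pass finishes quickly with zero trades
   if (IsPerA == false && ValPerA != 2) return;

   // ... normal trading logic for valid combinations ...
}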

Of course skipped sets aren't counted; they were skipped. What's the problem?

 

"Of course skipped sets aren't counted; they were skipped." Sorry but wrong!! Of course they are counted by the Genetic Algorithm other wise I would have got comparable results. But I got only one third valid results: 4'000 instead of 12'000 with a best value far away from the other test!!

BTW: The service desk answered: "Unfortunately we have no estimated time when this behaviour would be fixed."

The situation is: Optimizing in the Strategy Tester.

INIT_PARAMETERS_INCORRECT is used (by me) in OnInit() in this case:

extern bool   UseLimit1Filter =         TRUE;
extern double ValLimit1Filter =         1.6;  /// default value

...

int OnInit() {
   ...
   if(UseLimit1Filter == false && ValLimit1Filter != 1.6) {
      Print("Opti skipped due to Line ", __LINE__, "  ", ValLimit1Filter);
      return(INIT_PARAMETERS_INCORRECT);
   }
   ...
}

void OnTick() {
   ...
   if ( UseLimit1Filter && xVal > ValLimit1Filter ) { ... }
   // else this filter is bypassed
}

This way I want to check whether the filter ValLimit1Filter is needed or not!

But even if UseLimit1Filter is set to FALSE, the Strategy Tester will still vary ValLimit1Filter (for nothing) => it costs time and the evaluation in genetic mode gets worse!

Therefore I want the Strategy Tester to skip any setup with UseLimit1Filter=FALSE and ValLimit1Filter!=1.6 (= the default value). With the filter off, any variation of that value makes the whole setup needless: skipping it saves time and saves evaluation passes for valid setups.
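
For completeness: the skip check compares a double directly against 1.6. If anybody copies the idea, a tolerance-based comparison is safer (just a sketch; DEFAULT_LIMIT1 is only a name for this example, and this changes nothing about the optimizer behaviour):

#define DEFAULT_LIMIT1 1.6   // default value of ValLimit1Filter, named only for this example

int OnInit() {
   // skip every optimizer variation of ValLimit1Filter while the filter is off;
   // compare the doubles with a small tolerance instead of != to avoid rounding surprises
   if(!UseLimit1Filter && MathAbs(ValLimit1Filter - DEFAULT_LIMIT1) > 0.0000001)
      return(INIT_PARAMETERS_INCORRECT);
   return(INIT_SUCCEEDED);
}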

But the Strategy Tester treats the combinations that should be skipped as valid passes with result 0.0; they are counted by the Genetic Algorithm (making it stop earlier) and are shown in the Optimization Results and Optimization Graph. Well, this doesn't improve anything, as the results are blown up with nothing but hot air (as we would say in German)!

Reason: