Hi,

I am experimenting with the Zorro / R bridge. I have the following Lite-C script, which makes calls to R.


Code
#include <r.h>

function run()
{
  set(PLOTNOW+PARAMETERS+LOGFILE);
  BarPeriod = 1440;
  LookBack = 100;
  MaxLong = MaxShort = 1;
  
  if(Init) {
    if(!Rstart())
      return quit("Error - R won't start!");
    Rx("rm(list = ls());"); // clear the workspace
    Rx("library(tseries)"); // load time series library
  }
  
  if(is(LOOKBACK)) return;

  int size = optimize(50,10,100); // window length for the ADF test - the parameter I want trained

  Rset("Data",rev(seriesC(),size),size); // send Close series to R
  Rx("ADF = adf.test(Data)"); // Augmented Dickey-Fuller test
  var adf = Rd("ADF$p.value");
  
  if(adf > 0.6) enterLong();
  if(adf < 0.6) exitLong();
  
  plot("ADF p-value", adf ,NEW,RED); //display p-value
}
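
For reference, the R side of this boils down to roughly the following in a plain R session (the rnorm() vector is only a stand-in for the Close prices that the script transfers with Rset()):

Code
rm(list = ls())         # clear the workspace, as in the Init block
library(tseries)        # provides adf.test()
Data <- rnorm(100)      # placeholder for the Close series sent via Rset()
ADF <- adf.test(Data)   # Augmented Dickey-Fuller test
print(ADF$p.value)      # the p-value the script reads back with Rd()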


When I hit the [Train] button, Zorro just takes the default value (50) and saves it to the .par file; no actual training takes place.
Once the 50 is saved, I can hit [Test] and everything runs fine, but no optimization ever happens.

What am I doing wrong here?

Thanks!
