I would like to have Zorro create a series for an artificial price that starts at 100 at Init and then changes on every run call.

The code looks something like this:

if(Init)
  prices = series(100.);
else
  prices = series(*prices * (100+pcts)/100);

Now this works fine as long as prices is a global variable, but it crashes with Error 111 if it is local, which is what I need.
The reason it crashes, I presume, is that the argument *prices is evaluated before series() is called, and at that point the local pointer is not yet initialized, which puzzles me.

Does anybody have a smart idea how to better code this?