The following type casting issue really caught me off-guard, and I wonder whether this is expected behaviour. It can certainly cause trouble if one is unaware of it (Zorro S 1.96.4).
The following simple script illustrates what type casting produces for Tradesize = 0.01:
function main()
{
	var Tradesize = 0.01;
	var lotstotrade = 100.0*Tradesize;

	int ilots = (int)lotstotrade;        // explicit cast
	int jlots = (int)(100.0*Tradesize);
	int klots = lotstotrade;             // implicit cast
	int llots = 100.0 * Tradesize;
	int mlots = 1000.0 * Tradesize;

	int roundlots = round(lotstotrade,1);
	int ceillots = ceil(lotstotrade);

	printf("\nTradesize %f lotstotrade %.4f ilots %i jlots %i klots %i llots %i mlots %i",Tradesize,lotstotrade,ilots,jlots,klots,llots,mlots);
	printf("\nroundlots %i ceillots %i",roundlots,ceillots);
}
The output will be as follows:
Tradesize 0.010000 lotstotrade 1.0000 ilots 0 jlots 0 klots 0 llots 0 mlots 9
roundlots 1 ceillots 1
Only by using round and ceil do you get the proper value!
ilots, jlots, klots, llots, and also mlots are wrong.
If you set Tradesize = 0.03, all the results are correct.
For the first printf statement you will get:
Tradesize 0.030000 lotstotrade 3.0000 ilots 3 jlots 3 klots 3 llots 3 mlots 30
I assume it is related to how floats/doubles are stored in memory, but is this behaviour really expected?!