Hi,

so I have this jumping code in my player function (I stripped out everything else, as this is enough to show the problem). For some reason I don't understand, I get different jump heights at different framerates. Isn't time_step supposed to scale the values to fit the framerate, or did I get that wrong?

Tested at 30, 60 and 75 fps.

Code:
#define PL_GRAVITY	(-0.35)
#define PL_MAXGRAVITY	(-9)
#define PL_JUMPACC	(8)

void pl_UpdatePlayer(PL_PLAYER* pl)
{
	me = pl.unit;
	
	VECTOR traceto; vec_set(traceto, my.x);
	traceto.z -= 100;
	var traceresult = c_trace(my.x, traceto, IGNORE_ME | IGNORE_PASSABLE | IGNORE_PUSH | USE_BOX);
	DEBUG_VAR(traceresult, 100);

	if ((traceresult == 0) || (traceresult > (9.8))) // falling
	{
		pl.move.gravity.z = maxv(pl.move.gravity.z + (PL_GRAVITY * time_step), PL_MAXGRAVITY * time_step);
	}
	else
	{
		pl.move.gravity.z = 0;

		if (key_space)
			pl.move.gravity.z = PL_JUMPACC*time_step;
	}

	c_move(me, pl.move.movedir, pl.move.gravity, GLIDE);
	
	if ((traceresult < (9.8)) && (traceresult > 0)) my.z += (9.8) - traceresult;
}


Last edited by lemming; 05/13/13 19:21.