The frame time contribution of individual elements is shown in the [F11] panel in ms per frame. The displayed value can be larger than the actual rendering time taken by the engine and hardware, especially in fullscreen mode, where the frame rate is limited by the monitor refresh rate (usually 60 Hz on LCD screens and 70..80 Hz on CRT monitors).
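For reading those panel values, ms per frame and fps are just reciprocals of each other. A minimal sketch (plain C, not an engine function; `ms_to_fps` is a made-up name):

```c
#include <assert.h>

/* Hypothetical helper: convert a per-frame cost in milliseconds
   into the frame rate that this cost alone would allow. */
static double ms_to_fps(double ms_per_frame) {
    return 1000.0 / ms_per_frame; /* 1000 ms in one second */
}
```

So an element that costs 20 ms per frame would by itself cap the frame rate at 50 fps.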
Monitor frequency and fps
In fullscreen mode, DirectX keeps the frame rate in sync with the monitor frequency (between 60 and 80 Hz) in order to avoid tearing artifacts. The screen refresh is artificially delayed to match the moment when the monitor has finished its video cycle. This delay shows up as the 'screen refresh' time in the [F11] panel. Thus you'll never get a higher frame rate than your monitor can display. Instead, your frame rate will be an integer division of the monitor frequency (for instance 60, 30, 20, or 15 fps at a 60 Hz refresh rate). When the frame rate is close to the monitor frequency, a small change in rendering time can cause the frame rate to suddenly jump from 60 to 30 fps, or vice versa.
What I find mildly confusing, though, is that your entity rendering (and code execution) time is low in ms, while only your refresh time is so high. This should result in a much higher frame rate in windowed mode; can you post a screenshot of that, too? Do you use fps_max (and/or fps_min)?