Suppose we have an animation in Flash at 1 fps, where each frame has a frame script that takes 100 ms to run. As far as I know, animation in Flash works as follows:
0ms: Begin executing Frame 1's frame script
100ms: Finish executing Frame 1's frame script
1000ms: Begin rendering Frame 1's content and frame-script output
1050ms: Finish rendering Frame 1's content and frame-script output
1051ms: Begin executing Frame 2's frame script
1151ms: Finish executing Frame 2's frame script
2000ms: Begin rendering Frame 2's content and frame-script output
2050ms: Finish rendering Frame 2's content and frame-script output
2051ms: Begin executing Frame 3's frame script
2151ms: Finish executing Frame 3's frame script
3000ms: Begin rendering Frame 3's content and frame-script output
3050ms: Finish rendering Frame 3's content and frame-script output
...
This workflow is logical, as the frame script executes while waiting for the next screen update. Even if a script takes up to 1000 ms to execute, the rendering will not be delayed and the animation will still run at 1 fps.
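For concreteness, here is a minimal sketch of such a frame script (busyWork() is a made-up stand-in for the 100 ms workload; getTimer() is the real flash.utils timer):

    import flash.utils.getTimer;

    // Hypothetical frame script placed on a timeline keyframe.
    // busyWork() stands in for whatever real work takes ~100 ms.
    function busyWork():void {
        var start:int = getTimer();
        while (getTimer() - start < 100) {
            // busy-wait to simulate 100 ms of script execution
        }
    }

    busyWork();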
However! When programming animation from within AS3, people often use the ENTER_FRAME event, which occurs just before the next screen update. Then, if we have instructions that take 1000 ms to execute, the workflow is as follows:
0ms: Do nothing (waste time!)
1000ms: Begin executing instructions in ENTER_FRAME
2000ms: Finish executing instructions in ENTER_FRAME
2001ms: Begin rendering Frame 1's content and ENTER_FRAME output
2051ms: Finish rendering Frame 1's content and ENTER_FRAME output
2051ms: Do nothing (waste time!), as we have to wait 1000ms from the last rendering until the next one
3000ms: Begin executing instructions in ENTER_FRAME (1000ms after the last rendering)
4000ms: Finish executing instructions in ENTER_FRAME
4001ms: Begin rendering Frame 2's content and ENTER_FRAME output
4051ms: Finish rendering Frame 2's content and ENTER_FRAME output
4051ms: Do nothing (waste time!), as we have to wait 1000ms from the last rendering until the next one
5000ms: Begin executing instructions in ENTER_FRAME (1000ms after the last rendering)
6000ms: Finish executing instructions in ENTER_FRAME
6001ms: Begin rendering Frame 3's content and ENTER_FRAME output
6051ms: Finish rendering Frame 3's content and ENTER_FRAME output
...
As a result, we get 0.5 fps instead of 1 fps! The delays arise because ENTER_FRAME occurs right before the scene is rendered. To me, it would be much more logical if ENTER_FRAME occurred right after rendering the scene, to prepare for the rendering of the next frame.
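In code, the pattern I mean is just an ENTER_FRAME listener; a minimal sketch (heavyUpdate() is a made-up stand-in for the 1000 ms of instructions):

    import flash.events.Event;
    import flash.utils.getTimer;

    // Sketch of the ENTER_FRAME pattern; assumes this runs in a
    // document class or frame script where addEventListener is available.
    addEventListener(Event.ENTER_FRAME, onEnterFrame);

    function onEnterFrame(e:Event):void {
        heavyUpdate(); // runs right before the screen update
    }

    function heavyUpdate():void {
        var start:int = getTimer();
        while (getTimer() - start < 1000) {
            // busy-wait to simulate 1000 ms of animation code
        }
    }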
This is a toy example, and in the real world the rendering would not occur on such a perfect schedule, but the logic is the same. With 15 ms of code to execute each frame (a perfectly normal situation), 60 fps would turn into 30 fps, since by the logic above each frame's script would start at the scheduled render time and push the actual rendering into the next ~16.7 ms slot...
... or not? Is there a flaw in what I am saying?
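One way to check would be to measure the interval between ENTER_FRAME dispatches directly; a quick sketch (if my reasoning holds, it should print intervals of roughly 2000 ms):

    import flash.events.Event;
    import flash.utils.getTimer;

    var last:int = getTimer();
    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        var now:int = getTimer();
        trace("frame interval:", now - last, "ms"); // ~2000 ms would confirm 0.5 fps
        last = now;
        // the 1000 ms workload
        var start:int = getTimer();
        while (getTimer() - start < 1000) {}
    });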
ENTER_FRAME is the beginning of the lifecycle.
Display object lifecycle:
Event.ENTER_FRAME dispatched
Event.ADDED dispatched from children display objects
Event.ADDED_TO_STAGE dispatched from children display objects
Event.FRAME_CONSTRUCTED dispatched
Event.EXIT_FRAME dispatched
Event.RENDER dispatched
Event.REMOVED dispatched from children display objects
Event.REMOVED_FROM_STAGE dispatched from children display objects

What you describe is often referred to as the elastic racetrack, where heavy code execution can delay frame rendering.
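You can observe this order yourself by tracing the events on a Sprite; a sketch (LifecycleLogger is a made-up document class, and note that Event.RENDER is only dispatched if stage.invalidate() has been called):

    package {
        import flash.display.Sprite;
        import flash.events.Event;

        // Hypothetical document class that traces the per-frame event order.
        public class LifecycleLogger extends Sprite {
            public function LifecycleLogger() {
                addEventListener(Event.ENTER_FRAME, log);
                addEventListener(Event.FRAME_CONSTRUCTED, log);
                addEventListener(Event.EXIT_FRAME, log);
                addEventListener(Event.RENDER, log);
                stage.invalidate(); // RENDER only fires after invalidate()
            }

            private function log(e:Event):void {
                trace(e.type);
                if (e.type == Event.EXIT_FRAME) {
                    stage.invalidate(); // request a RENDER event for the next frame too
                }
            }
        }
    }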