==Specialization==
''Found in: <tt>jit/IonBuilder.cpp</tt>''
While building MIR, we attempt to attach type information to certain nodes: for example, the likely inputs and outputs of arithmetic operations, the likely targets of a call, or the likely shapes of objects for property accesses. This information is retrieved from Type Inference or from baseline caches.
Type Inference:
During execution, the types flowing through each site are recorded, and TI reports back the list of output types that were observed. During compilation this list is frozen: it can no longer grow without invalidating the IonScript that was compiled against it.
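The record/freeze/invalidate lifecycle above can be sketched as follows. This is a hypothetical, simplified model (none of these names are the real SpiderMonkey API): types are recorded while code runs, the set is frozen at compile time, and a type observed afterwards that falls outside the frozen set invalidates the compiled script.

```cpp
#include <set>

// Hypothetical sketch of a frozen observed-type set, loosely modeled on
// Type Inference's behavior described above.
enum class Type { Int32, Double, String, Object };

class ObservedTypeSet {
    std::set<Type> observed_;
    bool frozen_ = false;
    bool invalidated_ = false;

  public:
    // Called during execution: record a type flowing through this site.
    void record(Type t) {
        if (!frozen_) {
            observed_.insert(t);
        } else if (observed_.count(t) == 0) {
            // A type outside the frozen set appeared: any IonScript
            // compiled against this set is no longer valid.
            invalidated_ = true;
        }
    }
    // Called during compilation: snapshot the set; it may no longer grow.
    void freeze() { frozen_ = true; }
    bool invalidated() const { return invalidated_; }
    bool has(Type t) const { return observed_.count(t) != 0; }
};
```

Freezing is what makes the information safe to bake into compiled code: the compiler may rely on the set's contents precisely because any later violation triggers invalidation rather than silent misbehavior.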
Baseline Engine:
During execution, the baseline compiler creates specialized stubs. When specializing, we can attach specialized information to those stubs and use these hints to better optimize the MIR nodes. Since they are only hints, we still need to guard at runtime that they hold.
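The hint-plus-guard pattern can be sketched as below. Again, this is an illustrative model, not the real engine code: a baseline stub recorded the object shape it specialized on, the optimized code trusts that hint for a fast slot read, but a runtime guard checks it first and bails out when it no longer holds.

```cpp
#include <cstdint>

// Hypothetical types standing in for engine structures.
struct Object {
    uint32_t shapeId;  // identifies the object's layout
    int slot0;         // first fixed slot
};

struct BaselineStubHint {
    uint32_t expectedShapeId;  // shape the stub specialized on
};

// Specialized property read compiled from the hint. The guard verifies the
// hint still holds; otherwise we signal a bailout instead of reading a slot
// that may mean something else under a different shape.
int getPropSpecialized(const Object& obj, const BaselineStubHint& hint,
                       bool* bailedOut) {
    if (obj.shapeId != hint.expectedShapeId) {  // runtime guard
        *bailedOut = true;
        return 0;  // in a real engine: bail out to baseline/interpreter
    }
    *bailedOut = false;
    return obj.slot0;  // fast path: direct slot access
}
```

The design point is that the guard is cheap (one comparison) while the payoff is skipping a generic property lookup, which is why hints that are usually, but not always, true are still profitable.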
==Respecialization==