In sys.dm_os_memory_cache_entries view, what does original cost mean?

by Stuart Ainsworth   Last Updated January 12, 2018 01:06 AM

I'm looking at a couple of DMVs, trying to figure out programmatically the cost of the cached plans in my SQL Server instance. I'm hoping the original_cost column represents the same concept of cost the optimizer uses to decide whether a query should be parallelized, but I'm not sure. Is that the case?

Edit: below is the script I'm using:

SELECT  t.text
      , p.objtype
      , p.refcounts
      , p.usecounts
      , p.size_in_bytes
      , e.disk_ios_count
      , e.context_switches_count
      , e.original_cost
      , e.current_cost
FROM    sys.dm_exec_cached_plans p
        JOIN sys.dm_os_memory_cache_entries e
          ON p.memory_object_address = e.memory_object_address
        CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
WHERE   p.cacheobjtype = 'Compiled Plan'
ORDER BY p.usecounts DESC;

1 Answer

From the documentation for sys.dm_os_memory_cache_entries, original_cost represents the combined CPU, I/O, and memory cost of the entry; it is the value used both to weigh the plan's worth and to determine how quickly to evict it from the cache. For your use case, though, I suspect current_cost may be more relevant. I'd be curious to see what comes next: how exactly are you going to use this information?
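To see the two columns side by side, here is a sketch that compares original_cost against current_cost for plan cache entries; the CACHESTORE_SQLCP filter (the store for ad hoc and prepared plans) and the cost_decay alias are illustrative choices, so adjust them for your workload:

-- Entries whose current_cost has decayed furthest below original_cost
-- are the most likely eviction candidates on the next cache sweep.
SELECT  e.type
      , e.original_cost
      , e.current_cost
      , e.original_cost - e.current_cost AS cost_decay
      , e.disk_ios_count
      , e.context_switches_count
FROM    sys.dm_os_memory_cache_entries AS e
WHERE   e.type = 'CACHESTORE_SQLCP'
ORDER BY cost_decay DESC;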

Aaron Bertrand
October 23, 2013 21:37
