Thermal interface materials (TIMs) are a critical component for heat dissipation in electronic packaging systems. However, the extent to which a conventional steady-state thermal characterization apparatus can resolve the interfacial thermal resistance across current high-performance interfaces (R_T < 1 mm²⋅K/W) remains unclear. In this work, we quantify the minimum value of R_T that can be measured with such an instrument. We find that, to increase the resolution of the measurement, the thermal resistance through the instrument's reference bars must be minimized relative to R_T, which is practically achieved by reducing the reference bar length. However, we posit that the minimization of reference bar length is limited by the effects of thermal probe intrusion along the primary measurement pathway. Using numerical simulations, we find that the characteristics of the probes and the surrounding filler material can significantly alter the measured temperature along each reference bar. Moreover, we find that probes must be spaced at least 15 diameters apart to maintain a uniform heat flux at the interface, which limits the number of thermal probes that can be used for a given reference bar length. Within practical constraints, the minimum thermal resistance that can be measured with an ideal instrument is found to be 3 mm²⋅K/W. To verify these results, the thermal resistance across an indium heat spring material with an expected thermal contact resistance of ∼1 mm²⋅K/W is experimentally measured and found to differ by more than 100% from manufacturer-reported values.
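The reference-bar measurement underlying this analysis can be sketched as follows: a minimal illustration, assuming the standard steady-state (ASTM D5470-style) procedure in which a linear fit to the probe temperatures along each bar is extrapolated to the interface, and the heat flux is recovered from Fourier's law. The function name, probe positions, bar conductivity, and temperature values below are hypothetical, not taken from the instrument described here.

```python
import numpy as np

K_CU = 390.0  # W/(m*K), assumed conductivity of copper reference bars

def interface_resistance(x_hot, T_hot, x_cold, T_cold, k=K_CU):
    """Fit probe temperatures linearly in each reference bar, extrapolate
    to the interface (x = 0), and return (R_T in mm^2*K/W, mean heat flux
    in W/m^2). x_* are probe distances from the interface in metres."""
    m_h, b_h = np.polyfit(x_hot, T_hot, 1)    # hot bar: T rises away from interface
    m_c, b_c = np.polyfit(x_cold, T_cold, 1)  # cold bar: T falls away from interface
    q = 0.5 * (k * abs(m_h) + k * abs(m_c))   # Fourier's law, averaged over both bars
    dT = b_h - b_c                            # extrapolated interface temperature drop, K
    return dT / q * 1e6, q

# Synthetic probe readings consistent with q = 1e5 W/m^2 and dT = 1 K
x = np.array([0.005, 0.010, 0.015])  # hypothetical probe positions, m
grad = 1e5 / K_CU                    # temperature gradient, K/m
T_h = 60.0 + grad * x
T_c = 59.0 - grad * x
R_T, q = interface_resistance(x, T_h, x, T_c)
print(R_T)  # -> 10.0 mm^2*K/W (= 1 K / 1e5 W/m^2)
```

Note that R_T is inferred entirely from the extrapolated intercepts and the fitted gradients, which is why probe intrusion and filler material that perturb the temperature profile along the bars propagate directly into the measured resistance.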