When job sizes are known, Shortest Remaining Processing Time (SRPT) is an optimal scheduling policy, in a very strong sense, for a single-server queue under general assumptions on the underlying random variables. However, the performance of SRPT is not robust to errors in processing-time estimates. For a popular error model, we characterize the optimal policy using a Gittins index approach and discuss its properties. The implementability of the policy is studied, with the structure of the optimal policy guiding heuristic policies that appear to perform well. Time permitting, issues for multiple-server queues will be highlighted.
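The abstract does not specify the error model or any implementation, so the following is only a minimal illustrative sketch of the robustness issue it raises: a simulation of a preemptive single-server queue running SRPT on estimated sizes. All modeling choices here are assumptions, not the talk's model: Poisson arrivals, exponential true sizes, multiplicative log-normal estimation error, and the convention that a job's estimated remaining work is clipped at zero while it departs only when its true remaining work is exhausted. The function name `srpt_mean_response` and all parameters are hypothetical.

```python
import heapq
import random

def srpt_mean_response(jobs):
    """Preemptive single-server queue serving the job with the smallest
    *estimated* remaining size (SRPT on estimates).

    jobs: list of (arrival_time, true_size, estimated_size), sorted by
    arrival time. A job departs when its TRUE remaining work reaches zero;
    its estimated remaining work is clipped at zero, so an underestimated
    job keeps top priority until it actually finishes.
    Returns the mean response (sojourn) time.
    """
    t, i, n = 0.0, 0, len(jobs)
    heap = []                          # (estimated remaining work, job id)
    rem_true, rem_est, arrived = {}, {}, {}
    total_resp, completed = 0.0, 0
    while i < n or heap:
        if not heap:                   # server idle: jump to the next arrival
            t = max(t, jobs[i][0])
        while i < n and jobs[i][0] <= t:   # admit all arrivals up to time t
            a, s, e = jobs[i]
            rem_true[i], rem_est[i], arrived[i] = s, e, a
            heapq.heappush(heap, (e, i))
            i += 1
        next_arr = jobs[i][0] if i < n else float("inf")
        _, j = heap[0]                 # job with smallest estimated remaining
        # Serve j until it truly finishes or the next arrival preempts it.
        run = min(rem_true[j], next_arr - t)
        t += run
        rem_true[j] -= run
        rem_est[j] = max(0.0, rem_est[j] - run)
        heapq.heappop(heap)
        if rem_true[j] <= 1e-12:       # true completion
            del rem_true[j], rem_est[j]
            total_resp += t - arrived.pop(j)
            completed += 1
        else:                          # preempted: reinsert with updated estimate
            heapq.heappush(heap, (rem_est[j], j))
    return total_resp / completed

if __name__ == "__main__":
    random.seed(1)
    lam, n = 0.7, 50000
    t, jobs = 0.0, []
    for _ in range(n):
        t += random.expovariate(lam)             # Poisson arrivals, rate 0.7
        s = random.expovariate(1.0)              # exponential true sizes, mean 1
        e = s * random.lognormvariate(0.0, 1.0)  # multiplicative log-normal error
        jobs.append((t, s, e))
    exact = srpt_mean_response([(a, s, s) for a, s, _ in jobs])
    noisy = srpt_mean_response(jobs)
    print(f"mean response, exact sizes: {exact:.3f}")
    print(f"mean response, noisy sizes: {noisy:.3f}")
```

Under these assumptions, the gap between the two printed values illustrates the fragility the abstract refers to: with this clipped-estimate convention, a badly underestimated large job reaches an estimated remaining time of zero and then monopolizes the server, delaying many small jobs behind it. This is one commonly cited mechanism, not necessarily the one analyzed in the talk.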