Tarun Chitra

@pinged

Not ignorant at all! There’s a sense in which CoT lets models “back track”: by reprompting, you let the model see errors in its original answer and go back before producing a new one. That gets you out of local minima; but it’s not clear _when_ the model can figure out how (or whether) to backtrack, which in some ways is the mystique of reasoning models.
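
To make the reprompting idea concrete, here’s a minimal sketch of that backtracking loop. It’s not how any particular reasoning model works internally; `generate` is a hypothetical placeholder for whatever model call you use, and the critique/redo prompts are just one way to frame it.

```python
# Minimal sketch of "reprompt to backtrack": generate an answer, ask the model
# to critique it, and if it spots an error, redo the reasoning from that point.

def generate(prompt: str) -> str:
    """Placeholder for any LLM call (API or local model) returning text."""
    raise NotImplementedError

def cot_with_backtracking(question: str, max_rounds: int = 3) -> str:
    # First pass: ordinary chain-of-thought answer.
    answer = generate(f"Think step by step, then answer:\n{question}")
    for _ in range(max_rounds):
        # Ask the model to inspect its own reasoning for the first mistake.
        critique = generate(
            f"Question: {question}\nProposed answer: {answer}\n"
            "If any step is wrong, say BACKTRACK and describe the first error; "
            "otherwise say OK."
        )
        if "BACKTRACK" not in critique:
            break  # no error found, keep the current answer
        # Reprompt with the critique so the model can go back before the
        # faulty step and continue from there -- the "backtrack".
        answer = generate(
            f"Question: {question}\nPrevious attempt: {answer}\n"
            f"Critique: {critique}\n"
            "Redo the reasoning from the first incorrect step onward."
        )
    return answer
```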