LLM Block Null Output Caching

Hi!

We have caching enabled on an LLM block, and the output was null (the request was rate limited and failed). Will the null output be cached?

We do not have errors enabled on this LLM block (since that would erase our cache), but we are very confident that the failure was rate limiting.

Or is the null output excluded from the cache only if the error option on the LLM block is enabled?

We do not cache nulls or any errors!
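For intuition, the policy described here (successful outputs are cached; nulls and errors are never stored, so failed rows can be retried on a later run) can be sketched generically. Everything below — the dict-backed cache, the `llm_fn` callable, and `cached_llm_call` itself — is a hypothetical illustration, not the platform's actual implementation:

```python
def cached_llm_call(prompt, cache, llm_fn):
    """Return a cached result if present; otherwise call the LLM.

    Null and error outputs are deliberately NOT written to the cache,
    so a later rerun will retry those rows instead of reusing a failure.
    """
    if prompt in cache:
        return cache[prompt]
    try:
        result = llm_fn(prompt)
    except Exception:
        return None  # call failed (e.g. rate limited): nothing is cached
    if result is not None:
        cache[prompt] = result  # only successful outputs are stored
    return result
```

Under this sketch, a rate-limited call returns null and leaves the cache untouched, so the next pipeline run hits the model again for that row.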


Great! Regardless of whether errors are enabled?

Yup, regardless of whether errors are enabled.


Following up on this thread: since null outputs are not cached, we were hoping to rerun the pipeline until all rows are computed. However, rerunning does not actually change the number of null outputs from the LLM blocks.

We suspected this was because the input dataset did not change, so the pipeline simply was not re-running. We therefore tried Force Build, but that did not change anything either.

We have checked and the input from those rows does not exceed

What would be the recommended way to iteratively compute the rows that failed?

Normally that should work. If you run one of the failed rows through an LLM separately (just to test), does it succeed or still return null? My first thought is that those rows are still getting rate limited or erroring on their own. If the individual run works, you could pass a subset of the null rows to the LLM to see if that succeeds — that way the results will get stored in the cache.
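The suggestion above — re-running only the still-null subset, in batches, until everything is filled in — can be sketched as a simple retry loop with backoff between passes to ease rate limiting. The function and parameter names (`retry_null_rows`, `llm_fn`, `base_delay`) are illustrative stand-ins, not a real platform API:

```python
import time

def retry_null_rows(rows, outputs, llm_fn, max_attempts=3, base_delay=1.0):
    """Re-run only the rows whose output is still null.

    `rows` and `outputs` are parallel lists; `llm_fn` stands in for the
    per-row LLM call. Each pass retries the remaining null rows, then
    backs off exponentially before the next pass.
    """
    for attempt in range(max_attempts):
        pending = [i for i, out in enumerate(outputs) if out is None]
        if not pending:
            break  # every row has a non-null output
        for i in pending:
            try:
                outputs[i] = llm_fn(rows[i])
            except Exception:
                outputs[i] = None  # still failing; retry on the next pass
        time.sleep(base_delay * 2 ** attempt)  # back off between passes
    return outputs
```

With non-null results being written to the cache on success (as confirmed earlier in the thread), each pass shrinks the set of rows that actually hit the model.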

The individual rows succeed when run separately, so it looks like the problem is not the content of those rows.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.