We have a registered LLM model, and it works well in AIP Threads and Pipeline Builder.
However, when we use that model in an AIP Agent, it fails with the following error:
“Failed to load webhook from webhooks registry”
So I re-tagged the source and created a new webhook version, but the same error still occurs. Full error details:
LLM completion failed due to a remote exception
errorInstanceId: 1e3de28e-7a58-4235-b5c2-0837ed722c6e
errorName: FunctionExecution:ExecutionSystemException
errorCode: INTERNAL
stacktrace: SafeError: Failed to load Webhook from Webhooks Registry.
{Redacted stacktrace: see unsafe logs}
message: Failed to load Webhook from Webhooks Registry.
parameters: {"throwable0_webhookRid":"ri.webhooks.main.webhook.69ce0e3a-2f52-4628-9950-266bfa929d51","throwable0_webhookVersion":"26","throwable0_stacktrace":"SafeError: Failed to load Webhook from Webhooks Registry.\n at new SafeError (/app/node_modules/@foundry/witchcraft-logging-api/dist/args/safeError.js:21:28)\n at /app/node_modules/@foundry/functions-typescript-runtime-lib/dist/services/CachingWebhookIsReadOnlyProvider.js:30:23\n at Array.map (<anonymous>)\n at CachingWebhookIsReadOnlyProvider.getBatch (/app/node_modules/@foundry/functions-typescript-runtime-lib/dist/services/CachingWebhookIsReadOnlyProvider.js:27:32)\n at processTicksAndRejections (node:internal/process/task_queues:96:5)"}