AIP Logic function hits rate limit for 4000 ontology objects

I have a dataset with 20 years of stock price history, with the columns below.

open: Stock opening price
high: Stock highest price of the day
low: Stock lowest price of the day
close: Stock closing price
volume: Total shares traded that day
symbol: The stock ticker symbol (e.g., AAPL, TSLA)
date: The date of the trading day

Roughly, we will have 4,000 ontology objects per stock (200 trading days per year * 20 years = 4k objects).

I ask questions like the two below.

  1. When was the last time the stock dipped more than 10 percent?
  2. How many times has the stock gained more than 5 percent in the last 10 years?

The Calculator tool goes through each and every object and hits the rate limit.

I was able to get the desired output with create_pandas_dataframe_agent from LangChain. Basically, it treats the input as a DataFrame and uses pandas to answer the question, so the LLM never has to iterate over individual objects.
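For context, here is a minimal sketch of that LangChain approach, assuming the history for one symbol is loaded from a file (the file name and model name are just examples, not part of my actual setup):

```python
import pandas as pd
from langchain_openai import ChatOpenAI
from langchain_experimental.agents import create_pandas_dataframe_agent

# Hypothetical: 20 years of daily prices for one symbol,
# with columns open, high, low, close, volume, symbol, date.
df = pd.read_csv("aapl_daily_prices.csv")

# The agent writes and executes pandas code against the whole DataFrame,
# so the model reasons about the question rather than scanning 4k rows itself.
agent = create_pandas_dataframe_agent(
    ChatOpenAI(model="gpt-4o", temperature=0),
    df,
    verbose=True,
    allow_dangerous_code=True,  # may be required depending on langchain_experimental version
)

result = agent.invoke({"input": "When was the last time the stock dipped more than 10 percent?"})
print(result["output"])
```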

I can certainly write a function that uses LangChain and call it from AIP Logic, but then I have to provide a separate set of LLM API keys.

I assume there is a way to achieve this within AIP itself.

The issue seems to be in the setup of your overall logic, though I understand why you have done it this way.

I’ve previously implemented similar functionality by defining a set of discrete tools, prompting the LLM to pick up the intent the user is conveying, routing the request to the right tool, and then parsing the tool’s output back to the user. A rough sketch of what those tools could look like is below.
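A minimal sketch of such discrete tools, assuming the price history is available as a pandas DataFrame with the columns you listed (the function names last_dip_over and count_gains_over are hypothetical, and in AIP Logic these would be published as functions the LLM can route to rather than called directly):

```python
import pandas as pd

def last_dip_over(df: pd.DataFrame, threshold: float = 0.10) -> str:
    """Most recent date on which the close fell more than `threshold` versus the prior day."""
    df = df.sort_values("date")
    daily_return = df["close"].pct_change()
    dips = df.loc[daily_return < -threshold, "date"]
    return str(dips.iloc[-1]) if not dips.empty else "never"

def count_gains_over(df: pd.DataFrame, threshold: float = 0.05, years: int = 10) -> int:
    """Number of days in the last `years` years where the close gained more than `threshold`."""
    df = df.sort_values("date").copy()
    df["date"] = pd.to_datetime(df["date"])
    cutoff = df["date"].max() - pd.DateOffset(years=years)
    recent = df[df["date"] >= cutoff]
    daily_return = recent["close"].pct_change()
    return int((daily_return > threshold).sum())
```

With this pattern, the LLM only decides which tool matches the user's question and with which parameters; the heavy computation over the 4,000 objects happens in deterministic code, so you avoid per-object LLM calls and the associated rate limit.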

How is your Calculator tool configured?