I am ingesting data from a Kafka topic into Foundry Streams, but I don't actually need the near real-time update capabilities offered by Foundry Streams in this particular instance.
Is there a way to ingest data from a Kafka topic on a schedule (e.g. every 5, 15, or 30 mins) into regular datasets using batch compute (w/o ever utilizing Flink)?