I have a Slate application where the user uploads some data; based on this data, some queries run and fetch a huge amount of data. However, the queries take a long time and fail, which is frustrating for the user. Even if the user uploads a small amount of data, the queries might fetch a huge result set and run for a long time.
What could be the possible solutions here?
Is it possible to create a temp table, or can we run the query partially and store intermediate results, then display them to the user once all the results are fetched?
Hello,
I have a few questions that would help me understand your use case better.
- What are you using to upload data in the Slate application? Examples include an Action widget or direct queries to a Foundry service such as Phonograph or Secure Upload.
- How are you displaying the data that you are fetching?
- Have you considered using paging and/or aggregating your data in the data fetch queries, if relevant? (See the sketch below this list.)
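For illustration, a paged fetch could look roughly like the sketch below. The schema, table, and column names here are placeholders, and this assumes your backing database supports LIMIT/OFFSET syntax:

SELECT
  "some_column"
FROM
  "some_schema"."some_table"
ORDER BY
  "some_column"
LIMIT 500 OFFSET 0
-- subsequent pages would increase the OFFSET (500, 1000, ...)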
Best,
Gregorian
Hi,
I am using the file import widget to upload the file.
After the upload, a few queries and functions run, and I display results based on their output.
For example, I upload student IDs and search for them in the query's WHERE condition. This is where it is failing.
Furthermore, the user could upload a file as large as 1 GB unless we tell them to restrict the size.
Lastly, I am already doing paging, but the query itself fails.
Can you share information on why the query is failing? Is it timing out? Do you get some other error?
You can check this using the Chrome Developer Tools in the Network Tab or the Health Dialog inside of Slate.
Yes, it is a timeout.
Request timed out
My query looks like this:
SELECT
  "xx"
FROM
  "foundry_sync"."yy"
WHERE
  "zz" IN (aa, bb, ... around 6,000 values)
These roughly 6,000 values come from the file uploaded by the user, so I need to give the user some limit.
Is there any other way we can approach this problem?
The dataset is huge, and we are just fetching rows based on the WHERE condition.
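For instance, would it help to split the uploaded IDs into smaller batches, run the query once per batch, and combine the results before displaying them? A rough sketch of what I have in mind (the batch size of 500 is just a guess on my part):

-- batch 1: first 500 uploaded IDs
SELECT "xx" FROM "foundry_sync"."yy" WHERE "zz" IN (/* first 500 values */);
-- batch 2: next 500 uploaded IDs
SELECT "xx" FROM "foundry_sync"."yy" WHERE "zz" IN (/* next 500 values */);
-- ... and so on, merging the partial results on the Slate side before display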