[sc-25841] Remove 4MB limit for gRPC message payloads#49
Conversation
Currently our logic chunks data to send over gRPC according to the number of rows of data; this led to large streamsets in the multivalue API erroring out while trying to recv this data on the client side. This PR sets the limit for the client to receive from the server to unlimited for the time being, which will allow arbitrarily-sized streamsets to have successful multivalue queries. In the future we should update our logic to better handle these size limits when sending from the server, but this is a patch fix for now.
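For context, the change amounts to passing a channel option when the client connects. The sketch below shows the standard gRPC channel option involved (`grpc.max_receive_message_length`, where `-1` means unlimited); the target address is a placeholder, and the helper name is hypothetical, not the actual client code in this PR:

```python
# Hypothetical sketch of lifting the client-side 4 MB receive cap.
# The default gRPC limit is 4 MB; -1 removes the limit entirely.
UNLIMITED = -1

def channel_options(max_recv: int = UNLIMITED):
    # Options are passed as (name, value) tuples when the channel is created.
    return [("grpc.max_receive_message_length", max_recv)]

# Usage (requires grpcio; address is a placeholder):
# import grpc
# channel = grpc.insecure_channel("localhost:4410", options=channel_options())
```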
Trying to link ticket [sc-25841]

This pull request has been linked to Shortcut Story #25841: Update grpc message size on the python client to prevent 4MB size limit for large streamset queries.
jleifnf
left a comment
LGTM.
I wonder how having no limit on the send size would affect local querying of the data.
I think there will still be a bit of a limit: we have a similar limit on the send side when inserting data to the server as well.

Our server limits things itself; if we trust our server, then it shouldn't be a problem.
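The server-side cap mentioned here is configured the same way, via options on the server rather than the channel. A minimal sketch, assuming the standard gRPC server options (the 4 MB value and helper name are illustrative, not taken from this repo):

```python
# Hypothetical sketch: the server enforces its own message-size limits,
# independent of whatever the client requests.
FOUR_MB = 4 * 1024 * 1024

def server_options(max_send: int = FOUR_MB, max_recv: int = FOUR_MB):
    return [
        ("grpc.max_send_message_length", max_send),
        ("grpc.max_receive_message_length", max_recv),
    ]

# Usage (requires grpcio):
# import grpc
# from concurrent import futures
# server = grpc.server(futures.ThreadPoolExecutor(max_workers=4),
#                      options=server_options())
```

Because the server applies these limits regardless of the client's settings, removing the client-side receive cap does not mean unbounded payloads reach either end unchecked.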
I tested this for 5000 streams on ni4ai, with a sampling_frequency of 30 for 100 seconds of data. This would error out on the master branch, but successfully returns the data with this PR.

MWE