feat: support setting max_stream_count when fetching query result #2051
Linchin merged 7 commits into googleapis:main
Conversation
    created by the server. If ``max_queue_size`` is :data:`None`, the queue
    size is infinite.

    max_stream_count (Optional[int]):
I think it would be more consistent if we used the same docstring as here. It also mentions the effect of preserve_order (in this case self._preserve_order), which I think we should make clear here.
In this case, _preserve_order is automatically set by parsing the queries, and not a user-facing API. I'll update the docstring to mention that effect.
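The interaction described above can be sketched as a small pure function. This is an illustrative model of the precedence rule discussed in the thread (ordered results force a single stream, otherwise the user's cap applies), not the library's internal API; the function name and the "0 means let the server decide" convention are assumptions for this sketch.

```python
# Hypothetical sketch of how preserve_order interacts with a
# user-supplied max_stream_count. Names are illustrative only.
from typing import Optional


def effective_stream_count(preserve_order: bool, max_stream_count: Optional[int]) -> int:
    """Return the stream count to request from the BigQuery Storage API.

    When results must arrive in query order (preserve_order is True),
    only a single stream can preserve that order, so any user-supplied
    max_stream_count is ignored. Otherwise, None maps to 0, meaning
    "let the server decide", and a positive value caps the streams.
    """
    if preserve_order:
        return 1  # ordered reads require exactly one stream
    return max_stream_count or 0  # 0 = server chooses


assert effective_stream_count(True, 8) == 1
assert effective_stream_count(False, 8) == 8
assert effective_stream_count(False, None) == 0
```

This is why the docstring update matters: without it, a user passing max_stream_count on an ordered query would be surprised that the value has no effect.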
    .. versionadded:: 2.14.0

    max_stream_count (Optional[int]):
Thank you @kien-truong for adding further support for
Hi, the default code path with the default arguments is already covered by the current tests.
@kien-truong sounds good. The mypy test is also failing; could you fix it too? You can run it by running
Force-pushed 31662ad to 0e52722
Allow users to set max_stream_count when fetching results using the BigQuery Storage API with RowIterator's incremental methods:

* to_arrow_iterable
* to_dataframe_iterable
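A caller-side sketch of the new keyword on the incremental methods. Since a real call needs GCP credentials and a completed query, the row iterator and storage client are stubbed with unittest.mock here purely to show the call shape; the table and client objects are placeholders, not a working download.

```python
# Hedged usage sketch: passing max_stream_count to
# RowIterator.to_dataframe_iterable. The iterator is a mock standing
# in for the object returned by QueryJob.result(); a real run would
# need google-cloud-bigquery plus credentials.
from unittest import mock

row_iterator = mock.Mock()
row_iterator.to_dataframe_iterable.return_value = iter([])

# Cap the BigQuery Storage API at 4 read streams for this download.
frames = row_iterator.to_dataframe_iterable(
    bqstorage_client=mock.Mock(),  # placeholder for a BigQueryReadClient
    max_stream_count=4,
)
for frame in frames:
    pass  # each item would be a pandas DataFrame chunk

# Confirm the keyword was forwarded as given.
_, kwargs = row_iterator.to_dataframe_iterable.call_args
assert kwargs["max_stream_count"] == 4
```

to_arrow_iterable takes the same keyword and yields pyarrow RecordBatch objects instead of DataFrames.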
Force-pushed 0e52722 to fb726eb
I have added tests to cover user-provided
Thanks @kien-truong, the PR mostly looks good. I just have a small question regarding ignored coverage for the tests.
Fixes #2030 🦕