feat: Add bpd.options.compute.maximum_result_rows option to limit client data download #1829
Merged
This commit introduces a new compute option, `bigframes.pandas.options.compute.maximum_rows_downloaded`, which sets a limit on the maximum number of rows that can be downloaded to a client machine. When this option is set and a data-downloading operation (e.g., `to_pandas()`, `to_pandas_batches()`) attempts to download more rows than the configured limit, a `bigframes.exceptions.MaximumRowsDownloadedExceeded` exception is raised. This helps prevent out-of-memory (OOM) errors in shared execution environments by giving users a way to control how much data is downloaded to the client.

The limit is checked in both `DirectGbqExecutor` and `BigQueryCachingExecutor`. Unit tests verify the behavior when the limit is not set, set but not exceeded, and set and exceeded, across various DataFrame operations. The docstring for the new option in `ComputeOptions` is written to be comprehensive so that documentation is generated automatically.
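A minimal usage sketch (assumptions: the option is referenced under its final name, `maximum_result_rows`, from the PR title, while the exception name follows the description above and may differ in the released library; the table and threshold are illustrative):

```python
import bigframes.pandas as bpd
import bigframes.exceptions

# Cap how many rows any operation may download to this client process.
bpd.options.compute.maximum_result_rows = 10_000

df = bpd.read_gbq("bigquery-public-data.usa_names.usa_1910_2013")  # illustrative table

try:
    result = df.to_pandas()  # raises if the result exceeds 10,000 rows
except bigframes.exceptions.MaximumRowsDownloadedExceeded:
    # Recover by shrinking the result server-side before downloading.
    result = df.head(1000).to_pandas()
```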
This commit refactors the row-limit check in `DirectGbqExecutor` and `BigQueryCachingExecutor` to use a new shared helper function, `check_row_limit`, located in `bigframes.session.utils`. This reduces code duplication and improves maintainability; the behavior is unchanged.
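A hypothetical sketch of what such a shared helper could look like (the function name and module come from the commit message above; the signature and message text are assumptions, and the final implementation moved this check into `ExecuteResult`, as discussed below):

```python
# Hypothetical helper sketch; not the actual bigframes implementation.
from typing import Optional

import bigframes.exceptions


def check_row_limit(row_count: int, max_rows: Optional[int]) -> None:
    """Raise if an operation would download more rows than the configured limit."""
    if max_rows is not None and row_count > max_rows:
        raise bigframes.exceptions.MaximumRowsDownloadedExceeded(
            f"Downloading {row_count} rows exceeds the configured limit of {max_rows}."
        )
```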
tswast commented on Jun 17, 2025:
import bigframes.core.tree_properties as tree_properties
import bigframes.dtypes
import bigframes.exceptions as bfe
from bigframes.exceptions import MaximumRowsDownloadedExceeded, QueryComplexityError, format_message
Don't import Classes.
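A sketch of the style the comment asks for, assuming the module alias already imported in the diff above (the message text is illustrative): reference members through the alias rather than importing the classes themselves.

```python
import bigframes.exceptions as bfe

# Reference the exception class through the module alias instead of importing it directly:
raise bfe.MaximumRowsDownloadedExceeded("Result exceeds the configured row limit.")
```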
TrevorBergeron (Contributor) commented:

Another, maybe more general, option would be to truncate the Arrow iterator itself, throwing if iteration is attempted past max rows? Ideally we'd be able to add this functionality by only modifying ExecuteResult, and keep the executor itself from taking on another responsibility?
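An illustrative sketch of the iterator-truncation idea, not the actual `ExecuteResult` code: wrap the stream of Arrow record batches and raise once iteration passes the configured maximum.

```python
from typing import Iterator, Optional

import pyarrow as pa


def limit_record_batches(
    batches: Iterator[pa.RecordBatch], max_rows: Optional[int]
) -> Iterator[pa.RecordBatch]:
    """Yield batches unchanged, but fail if more than max_rows would be produced."""
    rows_seen = 0
    for batch in batches:
        rows_seen += batch.num_rows
        if max_rows is not None and rows_seen > max_rows:
            # bigframes raises its own exception type here; RuntimeError keeps
            # this sketch self-contained.
            raise RuntimeError(
                f"Iteration produced more than {max_rows} rows ({rows_seen} seen)."
            )
        yield batch
```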
The PR title was changed from "bpd.options.compute.maximum_rows_downloaded option to limit client data download" to "bpd.options.compute.maximum_result_rows option to limit client data download".
tswast (Collaborator, Author) commented:

Great idea @TrevorBergeron! Pushed this change to ExecuteResult in the latest commit.
TrevorBergeron approved these changes on Jun 18, 2025.
Fixes internal issue 417780501 🦕