Perhaps there is a faster way to construct a dataframe from the results returned by the client library than looping over rows individually?
Note: the client library ends up effectively looping over all rows as well by returning an iterator that does the type conversions / parsing over the actual API results. I imagine some profiling might reveal places where the performance there can also be improved.
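To illustrate the difference, here is a minimal sketch (using plain lists as stand-ins for the parsed rows the iterator yields): growing a DataFrame one row at a time re-allocates on every append, while materializing the rows first and constructing the frame once avoids that.

```python
import pandas as pd

# Stand-in for the already-parsed rows the client library's iterator yields.
rows = [(1, "a"), (2, "b"), (3, "c")]
columns = ["id", "name"]

# Slow pattern: appending to a DataFrame row by row re-allocates each time.
slow = pd.DataFrame(columns=columns)
for row in rows:
    slow.loc[len(slow)] = row

# Faster pattern: collect all rows, then construct the DataFrame once.
fast = pd.DataFrame(rows, columns=columns)
```

Profiling the real iterator would be needed to confirm how much of the cost is construction versus the per-row type conversions themselves.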
P.S. version 0.29.0 of the BigQuery client library (not yet released, as of 2017-12-08) will expose a to_dataframe() method. The actual implementation of this issue may be to just use that method here.
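A rough sketch of how delegating to that method might look, assuming the row iterator grows a `to_dataframe()` method as described (the `rows_to_dataframe` helper and the `_Fake*` classes here are hypothetical stand-ins, not the actual library API):

```python
import pandas as pd

class _FakeRowIterator:
    """Stand-in for the client library's row iterator; the real 0.29.0
    class is assumed to expose a to_dataframe() method."""
    def to_dataframe(self):
        return pd.DataFrame({"x": [1, 2, 3]})

class _FakeQueryJob:
    def result(self):
        return _FakeRowIterator()

def rows_to_dataframe(query_job):
    # Prefer the library's to_dataframe() when available; otherwise fall
    # back to materializing the rows and constructing the frame once.
    result = query_job.result()
    if hasattr(result, "to_dataframe"):
        return result.to_dataframe()
    return pd.DataFrame(list(result))

df = rows_to_dataframe(_FakeQueryJob())
```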
Note: I tried out using to_dataframe() in #112, but there are some issues with indexes, which aren't handled in the google-cloud-bigquery library. More investigation is needed.
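One plausible gap (assuming `to_dataframe()` returns a frame with a default RangeIndex): any `index_col`-style handling would have to be layered on afterwards, e.g. via `set_index`. A minimal sketch:

```python
import pandas as pd

# Stand-in for what to_dataframe() might return: default RangeIndex.
df = pd.DataFrame({"id": [10, 20], "value": ["a", "b"]})

# Index handling applied after the fact, since the library doesn't do it.
indexed = df.set_index("id")
```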
See: #25 (comment)
Implementation reference for the upcoming to_dataframe() method: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/061011d0213f82ca5ccaa9dec0a12713faaa2899/bigquery/google/cloud/bigquery/table.py#L1103-L1123