
Queries currently fail on large datasets #242

Closed
@feczo

Description

Queries currently fail on large datasets (>6000 entities). [1] We have
many cases with well over 30K entities of a given Model that we'd like to
be able to iterate over.

[1] Code snippet & traceback:

>>> query = dataset.query().kind('Account')
>>> print datetime.datetime.utcnow(); all_accts = query.fetch(6000); print datetime.datetime.utcnow()
2014-10-02 09:27:55.235381
2014-10-02 09:28:54.029017

>>> print datetime.datetime.utcnow(); all_accts = query.fetch(7000); print datetime.datetime.utcnow()

2014-10-02 09:29:01.298517
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ehiggins/src/datastore/.env/lib/python2.7/site-packages/gcloud-0.02.2-py2.7.egg/gcloud/datastore/query.py", line 307, in fetch
    query_pb=clone.to_protobuf(), dataset_id=self.dataset().id())
  File "/Users/ehiggins/src/datastore/.env/lib/python2.7/site-packages/gcloud-0.02.2-py2.7.egg/gcloud/datastore/connection.py", line 207, in run_query
    datastore_pb.RunQueryResponse)
  File "/Users/ehiggins/src/datastore/.env/lib/python2.7/site-packages/gcloud-0.02.2-py2.7.egg/gcloud/datastore/connection.py", line 64, in _rpc
    data=request_pb.SerializeToString())
  File "/Users/ehiggins/src/datastore/.env/lib/python2.7/site-packages/gcloud-0.02.2-py2.7.egg/gcloud/datastore/connection.py", line 58, in _request
    raise Exception('Request failed. Error was: %s' % content)
Exception: Request failed. Error was: Internal Error
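Until the library streams large result sets itself, one workaround is to pull entities in fixed-size pages driven by a query cursor, keeping each request well under the ~6000-entity ceiling observed above. The sketch below shows only the pagination pattern; `fetch_page` is a hypothetical stand-in for whatever cursor-based page fetch the client exposes (it is not a real gcloud API), and the demo backend is an in-memory fake so the generator logic can be exercised on its own.

```python
def iterate_all(fetch_page, batch_size=1000):
    """Yield every entity by repeatedly fetching one page at a time.

    fetch_page(cursor, limit) -> (entities, next_cursor) is a hypothetical
    stand-in for the client's cursor-based page fetch; next_cursor is None
    once the result set is exhausted.
    """
    cursor = None
    while True:
        entities, cursor = fetch_page(cursor, batch_size)
        for entity in entities:
            yield entity
        if cursor is None:
            return


def make_fake_backend(total):
    """Build an in-memory fetch_page so the pattern can be demoed offline."""
    data = list(range(total))

    def fetch_page(cursor, limit):
        start = cursor or 0
        page = data[start:start + limit]
        next_cursor = start + limit if start + limit < total else None
        return page, next_cursor

    return fetch_page


# 30K "entities" are consumed 1000 at a time, so no single request
# ever approaches the size that triggered the Internal Error above.
results = list(iterate_all(make_fake_backend(30000), batch_size=1000))
print(len(results))  # 30000
```

With a real client, `fetch_page` would issue one RPC per call and hand back the end cursor from the response; the generator itself never needs to change.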

Metadata

Labels

🚨 This issue needs some love.
api: datastore (Issues related to the Datastore API.)
triage me (I really want to be triaged.)
type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)
