Currently, if `suitcase.mongo_normalized.Serializer` receives an `event_page`, it "unpacks" it into N `event` documents and inserts them separately.

suitcase-mongo/suitcase/mongo_normalized/__init__.py, lines 261 to 270 in 14333df:

```python
# Unpack an EventPage into Events and do the actual insert inside
# the `event` method. (This is the opposite of what DocumentRouter does
# by default.)
event_method = self.event  # Avoid attribute lookup in hot loop.
filled_events = []
for event_doc in event_model.unpack_event_page(doc):
    filled_events.append(event_method(event_doc))
```
This can incur a large amount of latency with Mongo. On the floor we've seen that one `event` takes Tiled ~60 ms to process, but one `event_page` of about a dozen rows takes almost 1000 ms.
It should be possible to do the update as a single MongoDB command that adds N new documents to the `event` collection. This might be as simple as a bulk insert operation. That is: still "unpack" in Python but insert the resulting `event` documents in bulk.
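A minimal sketch of that approach, using PyMongo's `insert_many`. The standalone function and its name `insert_event_page` are hypothetical, and the sketch skips the per-event validation/filling that the real Serializer's `event` method performs:

```python
import event_model
from pymongo.collection import Collection


def insert_event_page(event_collection: Collection, doc: dict) -> None:
    """Unpack an EventPage in Python, then insert the resulting Event
    documents with one bulk command instead of one round trip each."""
    events = list(event_model.unpack_event_page(doc))
    if events:
        # insert_many sends the whole batch in a single command;
        # ordered=False lets the server continue past an individual
        # failure (e.g. a duplicate key) instead of aborting the batch.
        event_collection.insert_many(events, ordered=False)
```

For a ~dozen-row page this turns N round trips into one, so the per-page cost should approach the per-event cost.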
It might be possible to get even fancier and do the unpacking server-side through some kind of aggregation, but I would start by benchmarking the simple thing. My guess is that MongoDB latency >> Python runtime cost of unpacking.
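A quick way to run that benchmark, assuming a local `mongod`; the collection and document shapes here are synthetic stand-ins for a ~dozen-row `event_page`:

```python
import time

import pymongo

client = pymongo.MongoClient()  # assumes a local mongod
col = client["benchmark"]["event"]
docs = [{"seq_num": i, "data": {"x": float(i)}} for i in range(12)]

col.drop()
t0 = time.perf_counter()
for d in docs:
    col.insert_one(dict(d))  # copy: insert_one adds _id to the dict it is given
per_doc = time.perf_counter() - t0

col.drop()
t0 = time.perf_counter()
col.insert_many([dict(d) for d in docs], ordered=False)
bulk = time.perf_counter() - t0

print(f"{len(docs)} x insert_one: {per_doc * 1e3:.1f} ms")
print(f"1 x insert_many:  {bulk * 1e3:.1f} ms")
```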