tire:import not freeing memory properly? #870
Comments
Hmm, this is strange. We've been redoing and fixing that code a couple of times, at some point even … It should use …
Sorry, I'm actually using Mongoid, but this should be using a cursor anyway, and it's not pulling all the documents in in one go: it's hitting the memory limit somewhere around 60% complete each time, as if the documents, once loaded, aren't getting garbage-collected…
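A rough sketch of the distinction being drawn here, assuming a plain Mongoid model (`Article` and `index_document` are illustrative placeholders, not names from Tire):

```ruby
# Iterating a Mongoid criteria streams documents through the MongoDB
# driver's cursor, so only a small window of results is in memory at once
# (unless something else, e.g. an identity map, holds on to them).
Article.all.each do |doc|
  index_document(doc) # placeholder for sending the document to Elasticsearch
end

# Calling #to_a first materializes the whole collection as a Ruby array,
# pulling all of the documents into memory in one go.
Article.all.to_a.each do |doc|
  index_document(doc)
end
```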
Yeah, have a look at the history of that file: https://github.com/karmi/retire/commits/master/lib/tire/model/import.rb; I had the impression it works well with Mongoid now… Can we get any more debug info here?
Hi, can you tell us a little bit more about your system environment? I used this to find out about the Tire/Mongoid interactions:
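(The snippet originally attached to this comment wasn't preserved; below is a minimal sketch of this kind of check, assuming a Mongoid model named `Article` and a batch size of 1,000.)

```ruby
# Page through the collection roughly the way tire:import does and print the
# process RSS after each batch, to see whether memory is actually released.
def rss_mb
  # Resident set size of the current process in MB (Linux/macOS `ps`).
  `ps -o rss= -p #{Process.pid}`.to_i / 1024
end

indexed = 0
Article.all.each_slice(1_000) do |batch|
  # In the real import, the batch would be bulk-indexed into Elasticsearch here.
  indexed += batch.size
  puts "~#{indexed} documents processed, RSS: #{rss_mb} MB"
end
```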
@threez I guess you have enabled Mongoid's identity map. If so, iterating over all documents causes each of them to be stored in memory. Please see my explanation in #884. It would be nice if you could try this change and give us feedback on whether it helps. You can do this by using the gem from my fork: gem 'tire', github: 'Proghat/tire', branch: 'mongoid-identity-map'
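If the identity map does turn out to be the cause, a possible workaround (a sketch only, assuming Mongoid 3.x and an illustrative `Article` model) is to disable it, or to clear it between batches so already-loaded documents can be garbage-collected:

```ruby
# Option 1: disable the identity map globally, e.g. in an initializer.
Mongoid.identity_map_enabled = false

# Option 2: keep it enabled, but clear it after each batch during the import
# so that documents loaded so far can be garbage-collected.
Article.all.each_slice(1_000) do |batch|
  # bulk-index the batch with Tire here
  Mongoid::IdentityMap.clear
end
```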
I've actually finished my contract on the project where we had this problem, but we did have the identity map enabled, and it would make perfect sense if that caches all the loaded documents… Maybe @carvil will see this, as he is on that project.
When I run the import against a collection of ~200,000 documents on Heroku, it runs out of memory and gets killed. If it's paging things and indexing in batches, why does the memory footprint keep growing?
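For context, the import in question is the one run via Tire's bundled rake task; the model name below is a placeholder:

```ruby
# Shell invocation of the bundled task (shown as a comment):
#   bundle exec rake environment tire:import CLASS='Article'
#
# Roughly equivalent to calling the import method on the model directly:
Article.tire.import
```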