Type: Task
Resolution: Done
Affects Version/s: None
Component/s: None
I'm still trying to narrow down exactly what is causing this.
I have an array of Mongoid documents produced like so:
harvestable_tweets << HarvestableTweet.new(
  harvestable_id: kw[:id],
  tweet_id: tweet._id,
  created_at: tweet.created_at
).as_document
When done, this totals 10,801 records.
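(One sanity check worth running on an array like this, before suspecting the server: look for duplicate `_id` values or other unique-key collisions, since those are a common reason an individual document in a batch fails to insert. A sketch with hypothetical in-memory documents standing in for the `as_document` output:)

```ruby
# Hypothetical documents standing in for the as_document output;
# i / 2 deliberately produces duplicate _ids for illustration.
harvestable_tweets = Array.new(10) { |i| { "_id" => i / 2, "tweet_id" => i } }

ids   = harvestable_tweets.map { |d| d["_id"] }
dupes = ids.tally.select { |_, n| n > 1 }.keys  # tally needs Ruby 2.7+

puts "#{harvestable_tweets.size} docs, #{ids.uniq.size} unique _ids"
puts "duplicate _ids: #{dupes.inspect}"
```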
If I batch the insert like so:
HarvestableTweet.collection.insert(harvestable_tweets)
only 5 records are added.
If I loop through the array and insert them individually:
harvestable_tweets.each do |ht|
  HarvestableTweet.collection.insert(ht)
end
all 10,801 tweets are inserted.
If I slice the array and insert in batches of 1,000, I get 9,806 tweets inserted:
harvestable_tweets.each_slice(1000) do |ht|
  HarvestableTweet.collection.insert(ht)
end
I thought I might be hitting the 16MB limit, but each batch is well below it, totaling 1,684,956 bytes.
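(A rough way to sanity-check that size claim without a driver: serialize hypothetical stand-in documents and sum the bytes. JSON byte size is only a proxy for BSON size, but it gives the right order of magnitude:)

```ruby
require "json"
require "time"

# Hypothetical stand-ins for the real documents; field values are fabricated.
docs = Array.new(1_000) do |i|
  {
    "harvestable_id" => i,
    "tweet_id"       => format("%024x", i),   # fake 24-hex-char ObjectId
    "created_at"     => Time.now.utc.iso8601
  }
end

# JSON bytes as a rough proxy for BSON bytes.
batch_bytes = docs.sum { |d| JSON.generate(d).bytesize }
puts "batch ~#{batch_bytes} bytes (16MB limit is #{16 * 1024 * 1024})"
```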
I'm assuming it's hitting an error in the batch, but no errors are being reported.
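(One mechanism consistent with partial batch counts, offered as a sketch rather than a diagnosis: in the legacy MongoDB wire protocol, the server aborts a batch insert at the first failing document unless a continue-on-error flag is set, and with unacknowledged writes nothing is reported back to the client. A pure-Ruby simulation of those semantics, with hypothetical documents and a made-up "bad" document:)

```ruby
# Pure-Ruby simulation of legacy batch-insert semantics (no driver involved).
# Without continue_on_error, the batch stops at the first failing document
# and the rest of that batch is silently dropped.
def simulate_batch_insert(batch, bad_ids, continue_on_error: false)
  inserted = 0
  batch.each do |doc|
    if bad_ids.include?(doc[:id])
      break unless continue_on_error  # abort the remainder of this batch
      next                            # skip just the bad doc
    end
    inserted += 1
  end
  inserted
end

docs = (1..20).map { |i| { id: i } }
bad  = [7]  # pretend doc 7 would violate a unique index

puts simulate_batch_insert(docs, bad)                           # => 6
puts simulate_batch_insert(docs, bad, continue_on_error: true)  # => 19
puts docs.each_slice(5).map { |s| simulate_batch_insert(s, bad) }.inspect  # => [5, 1, 5, 5]
```

Depending on the driver and Mongoid version in use, passing options along the lines of `:continue_on_error` and an acknowledged write concern (e.g. `:safe => true` in the old Ruby driver) to `collection.insert` may both let the batch continue past a bad document and surface the underlying error; the exact option names vary by version, so treat these as assumptions to verify against your driver's docs.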
Any thoughts?