- Type: Task
- Resolution: Done
- Affects Version/s: None
- Component/s: None
revision: b890f60e0e1d6b57900eaafff5c5d77433e03504 branch: 4.0.0-dev
Here's a failing test to get the gist of it:
require 'spec_helper'

class Book
  include Mongoid::Document
  embeds_many :pages

  def num_pages=(num)
    pages.destroy_all
    num.times() { pages.new }
  end
end

class Page
  include Mongoid::Document
  embedded_in :book
end

describe Book do
  it "should only ever have 3 pages" do
    # Create a book with 3 pages
    book = Book.create(num_pages: 3)
    expect(book).to have(3).pages # Pass

    # Fetch the same book and give it
    # 3 new pages.
    same_book = Book.find(book.id)
    same_book.update_attributes(num_pages: 3)
    expect(same_book).to have(3).pages # Pass

    # Fetch it one more time...
    expect(Book.find book.id).to have(3).pages # Uh oh... 6 :\
  end
end
Based on the logs, Mongoid destroys the child objects in memory and pushes the new pages, but never pulls the destroyed ones from the database.
INSERT database=test collection=books documents=[{
"_id"=>"5121d770e454a36825000001",
"pages"=>[
{"_id"=>"5121d770e454a36825000002"},
{"_id"=>"5121d770e454a36825000003"},
{"_id"=>"5121d770e454a36825000004"}
]
}] flags=[] (0.1688ms)
QUERY database=test collection=books selector={
"_id"=>"5121d770e454a36825000001"
} flags=[] limit=0 skip=0 batch_size=nil fields=nil (0.7050ms)
UPDATE database=test collection=books selector={
"_id"=>"5121d770e454a36825000001"
}
update={
"$pushAll"=>{
"pages"=>[
{"_id"=>"5121d770e454a36825000005"},
{"_id"=>"5121d770e454a36825000006"},
{"_id"=>"5121d770e454a36825000007"}
]
}
} flags=[] (0.1609ms)
QUERY database=test collection=books selector={
"_id"=>"5121d770e454a36825000001"
} flags=[] limit=0 skip=0 batch_size=nil fields=nil (0.4468ms)
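Presumably, for the last expectation to pass, the in-memory destroy would also have to reach the database, e.g. as something along the lines of a $pullAll of the original embedded ids. A purely hypothetical update, written in the same format and reusing the ids from the log above (it would have to be a separate operation or a $set of the whole array, since MongoDB rejects $pullAll and $pushAll on the same field in one update):

UPDATE database=test collection=books selector={
  "_id"=>"5121d770e454a36825000001"
}
update={
  "$pullAll"=>{
    "pages"=>[
      {"_id"=>"5121d770e454a36825000002"},
      {"_id"=>"5121d770e454a36825000003"},
      {"_id"=>"5121d770e454a36825000004"}
    ]
  }
} flags=[]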
It is of course possible to move the destroy_all call out of num_pages=, but it's pretty bad to have to remember to call it every time :\
book.pages.destroy_all if params[:book][:num_pages] # sucks
book.update_attributes(params[:book])
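An alternative sketch that at least keeps the knowledge inside the model (untested against this revision, and it may well hit the same persistence path) would be to substitute the whole embedded relation in the writer instead of destroying and rebuilding:

class Book
  include Mongoid::Document
  embeds_many :pages

  def num_pages=(num)
    # Replace the entire embedded collection in one go, in the hope that
    # Mongoid persists it as a substitution of the "pages" array rather
    # than a bare $pushAll of the new documents.
    self.pages = Array.new(num) { Page.new }
  end
end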