Prevent deep_merge from mutating nested hashes #467
Conversation
    else
      value.dup
    end
  end
Is this a pattern from other libraries? I feel like taking the extra overhead of calling dup even on these simple types is OK, because you can't make assumptions about any underlying implementation. It's also unclear whether there's a performance gain in doing a case vs. a dup, so maybe it's just not worth it?
The issue is that on Rubies < 2.4 (I think ... maybe 2.5), these classes raise a `TypeError` when you call `#dup` on them. Since they are all written in C, they need an `allocate` method for `#dup` to work, and they do not have one.
I can roll out the safe dup change and run it on CI if you'd like to see. Otherwise, you can pull it down and try running the test suite after changing the `Hashie::Utils.safe_dup(value)` line to `value.dup`.
I originally went the route of a refinement that added ActiveSupport-like feature detection for `#dup`, but decided that was a lot of overhead for something that can be just as fast with much less code.
What do you think?
Ok, got it. I think what you did makes sense then, I didn't know this.
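For illustration, a case-based helper along the lines discussed above could look roughly like this. The class list and module layout are assumptions made for the sketch, not necessarily what `Hashie::Utils.safe_dup` actually does:

```ruby
module Hashie
  module Utils
    # Duplicate a value, returning known value objects as-is because calling
    # #dup on them raises a TypeError on older Rubies (no allocate method).
    def self.safe_dup(value)
      case value
      when Integer, Float, Symbol, NilClass, TrueClass, FalseClass
        value        # immutable value objects: safe to return unchanged
      else
        value.dup    # everything else gets a shallow copy
      end
    end
  end
end
```

Because those classes are immutable value objects, returning them unchanged preserves the non-mutation guarantee while avoiding the `TypeError` on Rubies that cannot `#dup` them.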
michaelherold left a comment
I'll get the squash in for the changelog once I get your thoughts on the item below.
Force-pushed from 70f3130 to eabd6b4
  # Returns a new hash with +self+ and +other_hash+ merged recursively.
  def deep_merge(other_hash, &block)
-   copy = dup
+   copy = _deep_dup(self)
I was wondering why we create a dup all the time. What is the reason behind this?
The reason for this is the semantics of `#merge` versus `#merge!`: the plain method does not mutate the receiver, while the bang method does.

The `DeepMerge` extension has two ways of merging hashes: a destructive one and a non-destructive one. The `#deep_merge` version should not mutate the original hash or any hash nested within it. The `#deep_merge!` version is free to mutate the receiver.

Without deeply duplicating the values contained within the hash, the invariant of immutability cannot be held for the original hash. In order to preserve that invariant, we need to introduce a method of deeply duplicating the hash.

The trick here is that we cannot rely on a simple call to `Object#dup`. Some classes within the Ruby standard library are not duplicable in particular versions of Ruby. Newer versions of Ruby allow these classes to be "duplicated" in a way that returns the original value. These classes represent value objects, so it is safe to return the original value ... unless the classes are monkey-patched, but that isn't something we can protect against.

This implementation makes a best effort to deeply duplicate an entire hash by relying on these value-object classes being able to return themselves without violating immutability.
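As an illustration of that invariant, a deep-duplication helper could be sketched like this. The method name `_deep_dup` comes from the diff above, but the body here is an assumption about its general shape rather than the exact implementation:

```ruby
# Recursively copy a hash so that merging into the copy can never
# reach back into the original's nested hashes.
def _deep_dup(hash)
  copy = hash.dup
  copy.each do |key, value|
    copy[key] =
      if value.is_a?(::Hash)
        _deep_dup(value)               # recurse into nested hashes
      else
        Hashie::Utils.safe_dup(value)  # copy leaves without raising on value objects
      end
  end
  copy
end
```

With a helper like this, `#deep_merge` can build `copy = _deep_dup(self)` and merge `other_hash` into the copy, leaving the receiver and every hash nested inside it untouched, while `#deep_merge!` remains free to mutate the receiver in place.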
Force-pushed from eabd6b4 to 72d9692
https://build.opensuse.org/request/show/865196 by user coolo + dimstar_suse - updated to version 4.1.0, see installed CHANGELOG.md

## [4.1.0] - 2020-02-01

[4.1.0]: hashie/hashie@v4.0.0...v4.1.0

### Added

* [#499](hashie/hashie#499): Add `Hashie::Extensions::Mash::PermissiveRespondTo` to make specific subclasses of Mash fully respond to messages for use with `SimpleDelegator` - [@michaelherold](https://github.com/michaelherold).

### Fixed

* [#467](hashie/hashie#467): Fixed `DeepMerge#deep_merge` mutating nested values within the receiver - [@michaelherold](https://github.com/michaelherold).
* [#505](hashie/hashie#505): Ensure that `Hashie::Array`s are not deconverted within `Hashie::Mash`es to make `Mash#dig` work properly - [@Michaelhero