angular.copy Very poor performance (large objects) #11099
@Tomino2112 did you try to put this code in and see if it passes all the existing tests? As you've noticed, it is hard to say more without an isolated reproduction scenario which we could profile to see what is actually going on. We are also missing other essential info (e.g. the browser and version used for the tests). In short: please provide an isolated reproduction scenario, otherwise this is not very actionable.
@Tomino2112, it may be interesting to see whether your approach can somehow be used within angular as a speed optimization, but in the meantime: trying to display 10,000 cells on a page at once through a single request is never a good idea. It is a very poor approach from the architecture point of view. @pkozlowski-opensource, I find his research credible enough to suggest that Angular's deep-copy implementation should be reviewed against @Tomino2112's implementation. The performance difference seems to be huge.
Hi guys, @VitalyTomilov, yeah, it's not great to render that many cells. It's actually an edge scenario; usually it's around hundreds of cells, still not great, but that's what you get when you try to render sheets of data online. I have some ideas for rendering on my list, just not enough time to play around with them at the moment. I will prepare a code example so you can see it (I mean angular.copy) in action.
I think the main thing slowing this down is the stack. #11215 helps a bit, mainly by reducing its size (today sub-objects/arrays are pushed twice!). However, I have found that replacing the stack with an ES6 Map helps even more.
I wonder if a shimmed Map would have the same performance benefits as the "native" Map.

That's what I thought :)
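For context on the idea being discussed: a deep copy can use a Map keyed by already-visited source objects to resolve cycles and shared references with constant-time lookups, instead of scanning a stack. A minimal sketch, not Angular's actual implementation (`deepCopy` is a hypothetical name):

```javascript
// Sketch: a deep copy that uses an ES6 Map to remember already-copied
// objects, so cycles and shared references resolve in O(1) instead of
// a linear scan over a visited stack.
function deepCopy(source, seen = new Map()) {
  if (source === null || typeof source !== 'object') return source;
  if (seen.has(source)) return seen.get(source); // cycle or shared reference
  const dest = Array.isArray(source) ? [] : {};
  seen.set(source, dest); // register before recursing, so cycles terminate
  for (const key of Object.keys(source)) {
    dest[key] = deepCopy(source[key], seen);
  }
  return dest;
}

// A self-referencing object copies without infinite recursion:
const a = { name: 'root' };
a.self = a;
const b = deepCopy(a);
console.log(b.self === b); // true: the cycle is preserved in the copy
```

Registering each object in the Map before recursing into it is what makes cycles terminate.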
Maybe it is a bit sinister having just one copy version for both simple data and immutable objects: if one wants a simpler, high-performance copy without any advanced provisions, it is not available. If that is the case, the best approach is to provide two separate copy implementations, either as two separate functions or via an extra parameter.
Using something like this: destination = JSON.parse(JSON.stringify(source)); I copied 10,000 key/value pairs of an object in ~5 milliseconds on my machine.
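The trade-off with that trick is what the JSON round-trip silently drops. A quick sketch of the caveats:

```javascript
// The JSON round-trip copy: fast, but lossy.
const source = {
  n: 42,
  when: new Date(0),
  greet: function () { return 'hi'; }, // function-valued keys are dropped
  missing: undefined                   // undefined-valued keys are dropped
};

const copy = JSON.parse(JSON.stringify(source));

console.log(copy.n);            // 42
console.log(typeof copy.when);  // 'string' (Date is serialized, not cloned)
console.log('greet' in copy);   // false
console.log('missing' in copy); // false
```

So it is a fine strategy for plain JSON-shaped data, but not a general replacement for a deep copy.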
That's a good point, I didn't realize that code doesn't copy over functions or object methods :/
@realityking that's what I was proposing; I'm just not sure if creating that shim is worth it for one use case...
Considering how this would be used, two things I would take into account:
When I tried it out (on top of #11215) it was ~50 LOC and made all modern browsers ~5x faster with my randomly over-complicated test data. I could open a PR if people think this is worth it?
👍 for using Map where supported
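A sketch of what "Map where supported" could look like (a guess at the idea, not Angular's actual code; `createSeenStore` is a hypothetical name): feature-detect the native Map and fall back to a minimal array-backed object with the same three methods the copier needs. The fallback is O(n) per lookup, which is exactly why the shim-performance question above matters.

```javascript
// Feature-detect native Map; otherwise build a minimal O(n) fallback
// exposing only has/get/set, the methods a deep copy needs for its
// "already visited" bookkeeping. (Hypothetical sketch.)
function createSeenStore() {
  if (typeof Map === 'function') return new Map();
  var keys = [], vals = [];
  return {
    has: function (k) { return keys.indexOf(k) !== -1; },
    get: function (k) {
      var i = keys.indexOf(k);
      return i === -1 ? undefined : vals[i];
    },
    set: function (k, v) { keys.push(k); vals.push(v); }
  };
}

// Both implementations behave the same for the copier's purposes:
var store = createSeenStore();
var key = {};
store.set(key, 'copied');
console.log(store.has(key)); // true
console.log(store.get(key)); // 'copied'
```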
@jbedard what about with not so complex data?
As the complexity goes down (fewer objects), using the Map matters less. However, in those simple data cases copy is normally fast enough anyway.
I don't have time for it right now, but if no one else does it by tomorrow I can take a look. What if we did something like the code in the post below, and figured out where that threshold would be exactly? Or would it even be worth it?
Sorry, that code highlighter did not work out lol...
It is not worth having 2 implementations...
Had an issue: when I try to copy a large object it usually copies in ~300ms, but sometimes it can take 18-20s. @jbedard's method helped here.
33c67ce has improved this a bit more, often making copy 1.5-3x faster.
I have tried MANY solutions from all over the internet, but in the end I ended up writing a specific function for copying my object. After all the tests and benchmarks, the result is: if you have large arrays/objects, it's better to write your own copy function tailored to that object.
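As an illustration of that approach (a hypothetical shape and hypothetical names, not the poster's actual code): when the object's structure is known up front, the copier reduces to a handful of direct property reads and slices, with no type checks, no visited stack, and no hash keys.

```javascript
// Sketch: a copier written for one known shape (a hypothetical grid row),
// avoiding generic deep-copy machinery entirely.
function copyRow(row) {
  return {
    id: row.id,
    label: row.label,
    cells: row.cells.slice() // flat array of primitives: slice is enough
  };
}

function copyRows(rows) {
  return rows.map(copyRow);
}

const rows = [{ id: 1, label: 'a', cells: [1, 2, 3] }];
const out = copyRows(rows);
console.log(out[0].cells !== rows[0].cells); // true: a fresh array
```

The cost is that the function silently goes stale if the object's shape changes, which is the usual trade-off of specializing.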
It was just put into master a few days ago so it will be in the next 1.5 release.
@jbedard Ok, thanks for the info. We will try 1.5 then; I'm very curious about the performance difference. I hope there won't be too many breaking changes in 1.5.
Maybe there is some performance regression in 1.4.8; after upgrading alone, copying became slower. From project practice, I think angular's built-in array/object util functions do not fully cover real usage; I have to use lodash (or underscore) anyway. So I think angular should move these helper functions out of the core.
@stanleyxu2005, (as stated before) Angular never intended to provide a replacement for general-purpose utility libraries. Its helper functions were implemented for internal use in the first place. At that point, it seemed like a good idea to expose the utility functions to developers (since they were implemented anyway), so they didn't have to import a whole different library for simple use cases. This actually turned out to be a bad idea: among other things, it's much more difficult to make breaking changes to them to better support the evolving needs of the framework. There's not much we can do now, because removing them would break too many apps. The functions are also used internally, so removing them from the core is not an option (that's their purpose in the first place). So they'll continue to exist, but users should keep in mind that they might not be the best option for their use case.
I just replaced angular.copy with JSON.parse(JSON.stringify()) on an 18 MB tree structure. The speed difference was huge...
One more case study...

```js
let as = [...] // array of ~4000 objects

console.time('angular')
angular.copy(as)
console.timeEnd('angular')
// => 71048.443ms

console.time('lodash')
_.cloneDeep(as)
console.timeEnd('lodash')
// => 521.026ms

console.time('json')
JSON.parse(JSON.stringify(as))
console.timeEnd('json')
// => 92.422ms
```

Please deprecate the public angular.copy API, and add a note on the docs page!
The team should be aware that the source of copy re-creates its inner helper functions on every call.
@e-cloud, is there any evidence that inlining the functions would improve performance?
Nope. What I mean is that every time you call copy, it re-creates those inner functions.
Regarding @coli's and @bcherny's experiments: if the code change is tiny and the performance gain is remarkable, I'd vote +1 to apply this change. This is a gratis improvement proposal for the angular team, seriously.
@gkalpak Improving the performance of frequently used functions is common sense. I'm a bit confused about using angular's functions: if those functions are intended for internal use, it would be better to rename them to make that clear.
@stanleyxu2005
@e-cloud, the question is whether the cost is negligible compared to the other operations involved or not. If there is evidence that we can noticeably improve the performance of copy this way, that would be worth doing.
@e-cloud I tested your idea and it really makes no difference: it's only 3 closures per copy call. If it were 3 per object (and sub-object) it would probably make a difference, but that's not the case.
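To make the point concrete, here is the shape being discussed (a sketch, not Angular's source; `copyA`/`copyB` are hypothetical names): the helper closure is either re-created per top-level call or hoisted to module scope. Because it is a fixed cost per call, not per copied object, the measured difference is negligible.

```javascript
// copyA re-creates its helper closure on every top-level call;
// copyB's helper is created once at module load. Same results either way.
function copyA(source) {
  function recurse(v) {
    if (v === null || typeof v !== 'object') return v;
    const out = Array.isArray(v) ? [] : {};
    for (const k of Object.keys(v)) out[k] = recurse(v[k]);
    return out;
  }
  return recurse(source);
}

function recurseB(v) {
  if (v === null || typeof v !== 'object') return v;
  const out = Array.isArray(v) ? [] : {};
  for (const k of Object.keys(v)) out[k] = recurseB(v[k]);
  return out;
}
function copyB(source) {
  return recurseB(source);
}

const sample = { a: [1, { b: 2 }] };
console.log(JSON.stringify(copyA(sample)) === JSON.stringify(copyB(sample))); // true
```

(Neither sketch handles cycles; it only illustrates where the closures are allocated.)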
@jbedard, did you try it at scale? As I've mentioned before, with only a few calls it makes little difference.
In my use case, I did not see any difference between angular.copy and JSON.parse(JSON.stringify(dataobject)); rather, _.map(dataobject, _.clone) was super fast. The variants compared:

- angular.copy(dataobject)
- JSON.parse(JSON.stringify(dataobject))
- _.map(dataobject, _.clone)
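For context on why that last one is fast: `_.map(dataobject, _.clone)` produces a new array whose elements are shallow, one-level copies; nothing below the first level is duplicated. A rough vanilla-JS equivalent of that behaviour (my sketch, not the commenter's code):

```javascript
// Roughly what _.map(data, _.clone) does: a new array whose elements are
// shallow copies. Fast, but nested objects are still shared with the source.
const data = [{ id: 1, meta: { tag: 'x' } }, { id: 2, meta: { tag: 'y' } }];

const copied = data.map(item => Object.assign({}, item));

console.log(copied[0] !== data[0]);           // true: element was copied
console.log(copied[0].meta === data[0].meta); // true: nested object shared
```

So it is only safe when the nested values never need to be isolated from the source, which is a very different contract from a deep copy.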
Our application loads resources from an API and renders them in a table. The API request takes ~200ms and rendering the table ~140ms, but when you actually look at the page it takes about 6s to display.
I have drilled through the code and realized this was caused by using angular.copy. I need to keep track of the old data, so I copied the whole object to a backup object via angular.copy(). The copying itself took over 2s (~2500ms).
I have searched for better ways to copy objects and stumbled upon this code:
This is my implementation of this: http://stackoverflow.com/a/10729242/820942
Replacing all angular.copy() calls in my code with this hugely improved the webapp overall. The same example above takes ~41ms instead of ~2s.
I should mention that the data in question is pretty large (it renders into a 10,000-cell table).
After a very brief look at the angular.copy code I am aware that there are some added angular goodies ($$hashkey), but still, the difference in performance is just too big.