bmoren opened 8 years ago
Won't the stress be pretty dependent on the content it's duplicating? It'll probably handle millions of paragraph tags super fast, but more complex, nested content, images, etc. will be much slower.
Yes, let's test as many things as possible, although there's likely little we can do about 1,000,000,000 images... I'm more interested in just finding those breaking points so we can articulate them to users.
I tested .replicate() with Chrome's Timeline tools; here's what I got.
.replicate({
  total: x,
  random: true,
  mode: 'once'
})
100 images - 138.3ms
1000 images - 360.2ms
2000 images - 577.3ms
3000 images - 670.6ms
4000 images - 771.5ms
5000 images - 2344.7ms
6000 images - 2801.3ms
7000 images - 3104.4ms
8000 images - 3114.4ms
9000 images - 3176.1ms
10000 images - 3153.5ms
These aren't averages or anything; I'm not sure if there's a more efficient or accurate way to record this data. But it definitely starts chugging at around 5000 images, though that's likely hardware dependent.
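One way to smooth out these one-off numbers is to average several runs. This is a minimal sketch, not part of the library; `benchmark` is a hypothetical helper, and in the browser you'd pass it a closure that calls `.replicate()`.

```javascript
// Hypothetical helper: run fn several times and average the
// wall-clock duration via performance.now() (available in browsers
// and modern Node). The per-run times are kept so outliers are visible.
function benchmark(fn, runs = 5) {
  const times = [];
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    fn();
    times.push(performance.now() - start);
  }
  return times.reduce((a, b) => a + b, 0) / times.length;
}

// Example (assumed call shape, matching the options above):
// const avg = benchmark(() => el.replicate({ total: 5000, random: true, mode: 'once' }));
```

Averaging won't remove the hardware dependence, but it does make the 5000-image jump easier to trust.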
Computer stats:
Mac mini (Late 2014)
Processor 3 GHz Intel Core i7
Memory 16 GB 1600 MHz DDR3
Graphics Intel Iris 1636 MB
maxing out the stress on the functions
e.g.
a for loop to create 100,000,000 elements, then use .populate() in a for loop to iterate through them.
.replicate(), 10,000,000,000 replications at 1ms per replication
etc.
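A breaking-point sweep along those lines could look like this sketch. `makeElements` is a hypothetical stand-in for the real DOM work (which is browser-only); the idea is to ramp up the count and report the first size that blows a time budget.

```javascript
// Stand-in workload: build n lightweight element descriptors.
// In the browser this would be the actual .replicate()/.populate() call.
function makeElements(n) {
  const out = [];
  for (let i = 0; i < n; i++) out.push({ tag: 'p', text: 'item ' + i });
  return out;
}

// Time the workload at increasing sizes and return the first count
// whose elapsed time exceeds budgetMs, or null if none did.
function findBreakingPoint(counts, budgetMs) {
  for (const n of counts) {
    const start = performance.now();
    makeElements(n);
    const elapsed = performance.now() - start;
    console.log(`${n} elements - ${elapsed.toFixed(1)}ms`);
    if (elapsed > budgetMs) return n;
  }
  return null;
}

// Example sweep, echoing the image counts above:
// findBreakingPoint([100, 1000, 5000, 10000, 100000], 1000);
```

That would give a concrete, repeatable number to articulate to users instead of "it starts chugging".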