Should we do a bulk insert operation in MongoDB? Will it block other read/write operations?
I am using MongoDB in our application. We have fairly heavy MongoDB usage, and this is what I am seeing in the mongostat output:
insert/s query/s update/s delete/s getmore/s command/s flushes/s mapped vsize res faults/s locked % idx miss % q t|r|w conn time
0 4 1 0 0 3 1 10396 11347 591 0 0.1 0 0|0|0 70 10:57:28
0 65 31 0 0 35 0 10396 11347 591 1 3.7 0 0|0|0 70 10:57:29
0 76 37 0 0 41 0 10396 11347 591 0 3.5 0 0|0|0 70 10:57:30
0 85 42 0 0 44 0 10396 11347 591 1 4.7 0 0|0|0 70 10:57:33
0 52 25 0 0 29 0 10396 11347 591 0 2.9 0 0|0|0 70 10:57:34
0 26 11 0 0 15 0 10396 11347 591 0 1.1 0 0|0|0 70 10:57:36
0 83 41 0 0 43 0 10396 11347 591 1 4.6 0 0|0|0 69 10:57:37
Regarding the query: in our application, whenever a user logs in, we need to store whatever symbols that user holds (a user might hold anywhere from 20 to 100 symbols).

What is the best way to achieve this insertion:
1. perform an insert operation for each symbol individually, or
2. insert all the symbols at once, like this:

    public void insert(ArrayList<QuoteReportBean> quoteList) {
        DBObject[] totalRecords = new BasicDBObject[quoteList.size()];
        for (int i = 0; i < quoteList.size(); i++) {
            QuoteReportBean reportBean = quoteList.get(i);
            BasicDBObject dbRecord = new BasicDBObject();
            dbRecord.append("cust", reportBean.getCustomerId());
            dbRecord.append("symbol", reportBean.getUniqueSymbol());
            dbRecord.append("access_time", reportBean.getDate());
            totalRecords[i] = dbRecord;
        }
        WriteResult result = coll.insert(totalRecords, WriteConcern.NORMAL);
        logger.info("quoteList " + result.toString());
    }
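For comparison, option 1 would look roughly like the following. This is only a minimal sketch, not part of the original code; it assumes the same `coll` DBCollection, `logger`, and QuoteReportBean used above, and it calls the same insert(DBObject[], WriteConcern) overload once per document:

    public void insertIndividually(ArrayList<QuoteReportBean> quoteList) {
        for (QuoteReportBean reportBean : quoteList) {
            BasicDBObject dbRecord = new BasicDBObject();
            dbRecord.append("cust", reportBean.getCustomerId());
            dbRecord.append("symbol", reportBean.getUniqueSymbol());
            dbRecord.append("access_time", reportBean.getDate());
            // Each call here is its own network round trip and its own write.
            WriteResult result = coll.insert(new DBObject[] { dbRecord }, WriteConcern.NORMAL);
            logger.info("single insert: " + result.toString());
        }
    }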
My concern is that I have set the profiling threshold to 50 ms,

    db.setProfilingLevel(1, 50)

and the insert from the second approach has been recorded in the system.profile collection.
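To read back which operations crossed that 50 ms threshold, a query like the following should work with the same legacy Java driver (a sketch only; it assumes the `coll` and `logger` from the insert code above, and relies on system.profile living in the same database):

    // Sketch: list the ten most recent profiled operations slower than 50 ms.
    DBCollection profile = coll.getDB().getCollection("system.profile");
    DBObject slowOps = new BasicDBObject("millis", new BasicDBObject("$gt", 50));
    DBCursor cursor = profile.find(slowOps)
                             .sort(new BasicDBObject("ts", -1))
                             .limit(10);
    for (DBObject op : cursor) {
        logger.info(op.toString());
    }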
Could you please tell me whether this insert operation might lock out other read/write operations?
Answer: Sending multiple objects to the database in one array is faster than doing a separate insert for each object. It can be an order of magnitude faster, depending on the size of the array and the size of the objects.
The reason you might see the second statement (the batched/array insert) logged as a slow query is that it takes longer than any single insert would, even though it is much faster than all of the single inserts put together.
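To get a feel for the difference, here is a rough timing sketch, not a rigorous benchmark, assuming a local mongod, a throwaway "test.quotes" collection, and the legacy 2.x Java driver (none of which are from the original post):

    import java.util.Date;

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBCollection;
    import com.mongodb.DBObject;
    import com.mongodb.MongoClient;
    import com.mongodb.WriteConcern;

    public class InsertTiming {
        public static void main(String[] args) {
            DBCollection coll = new MongoClient("localhost").getDB("test").getCollection("quotes");
            int n = 100; // roughly the upper end of symbols per user in the question

            // Approach 1: one insert call (one round trip) per document.
            long start = System.currentTimeMillis();
            for (int i = 0; i < n; i++) {
                coll.insert(new DBObject[] { doc(i) }, WriteConcern.NORMAL);
            }
            long individualMs = System.currentTimeMillis() - start;

            // Approach 2: a single insert call carrying the whole array.
            DBObject[] batch = new DBObject[n];
            for (int i = 0; i < n; i++) {
                batch[i] = doc(n + i); // fresh documents so _id values do not collide
            }
            start = System.currentTimeMillis();
            coll.insert(batch, WriteConcern.NORMAL);
            long batchedMs = System.currentTimeMillis() - start;

            System.out.println("individual: " + individualMs + " ms, batched: " + batchedMs + " ms");
        }

        // Builds a small document shaped like the records in the question.
        private static DBObject doc(int i) {
            return new BasicDBObject("cust", "cust-" + i)
                    .append("symbol", "SYM" + i)
                    .append("access_time", new Date());
        }
    }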
Finally, the mongostat output is not particularly dramatic. MongoDB can handle a few thousand operations per second (depending on the operation, of course), and you are not anywhere close to hitting that limit, at least according to the screenshot.