ruby on rails - ActiveRecord#execute vs PG#send_query memory usage for large result sets


Background

I noticed that our worker boxes spike in memory utilization whenever we hit ActiveRecord's #execute() with a large result set (>1M rows of data), but memory stays relatively flat when we connect to PG with a raw connection and use the #send_query method instead. Additionally, memory stays at the same high level after #execute has finished running, which makes it seem like whatever temporary object(s) were created are sticking around and not getting garbage collected.

I'm not assigning the result of #execute or #send_query to a variable in either case; each is just being invoked to execute the SQL passed in (roughly as in the sketch below).
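
For reference, the two code paths look roughly like this (simplified; `sql` stands in for the actual query string, and the real worker code differs):

    # ActiveRecord path -- result is not assigned to anything
    ActiveRecord::Base.connection.execute(sql)

    # Raw PG path -- result is likewise not assigned
    raw_conn = ActiveRecord::Base.connection.raw_connection
    raw_conn.send_query(sql)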

Questions

  1. Is #execute temporarily storing the entire result set in a PG::Result object, while #send_query streams the data without buffering it all at once (see the streaming sketch after this list)?
  2. Why doesn't memory utilization return to normal levels after the SQL has been executed with #execute? Am I responsible for manually freeing the objects created in memory? Shouldn't GC take care of this?
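
For context, the kind of row-by-row streaming I had in mind with the raw connection is sketched below. This is illustrative only, using the pg gem's single-row mode; `sql` and `process` are placeholders, not our actual code:

    conn = ActiveRecord::Base.connection.raw_connection

    # Send the query asynchronously, then ask libpq to hand back
    # one row per PG::Result instead of buffering the whole set.
    conn.send_query(sql)
    conn.set_single_row_mode

    loop do
      result = conn.get_result or break
      result.check                         # raise if the server reported an error
      result.each { |row| process(row) }   # `process` is a placeholder
      result.clear                         # release the result's memory immediately
    end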

I did some digging with ObjectSpace but couldn't pinpoint where the memory bloat was coming from. Any help is appreciated!
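
For what it's worth, the digging was along these lines (a rough sketch run from a Rails console; memory allocated inside libpq may not be fully reflected in these numbers):

    require "objspace"

    GC.start  # give GC a chance to run before measuring

    # How many PG::Result objects are still alive, and how much memory they report
    puts ObjectSpace.each_object(PG::Result).count
    puts ObjectSpace.memsize_of_all(PG::Result)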

