Log message #4193211

# At Username Text
# Jul 15th 2019, 15:55 neon1024 Ahh, thanks!
# Jul 15th 2019, 15:55 ndm `$this->_response`
# Jul 15th 2019, 15:50 neon1024 Using the IntegrationTestTrait, how do I get the response?
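A minimal sketch of how that looks in a test case (the controller and route here are hypothetical; `IntegrationTestTrait` populates `$this->_response` after each dispatched request):

```
<?php
namespace App\Test\TestCase\Controller;

use Cake\TestSuite\IntegrationTestTrait;
use Cake\TestSuite\TestCase;

class ArticlesControllerTest extends TestCase
{
    use IntegrationTestTrait;

    public function testIndex(): void
    {
        // Dispatch a request; the trait stores the result in $this->_response
        $this->get('/articles');
        $this->assertResponseOk();

        // The full response object is then available for direct inspection
        $body = (string)$this->_response->getBody();
        $this->assertNotEmpty($body);
    }
}
```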
# Jul 15th 2019, 15:24 kaliel you're welcome :slightly_smiling_face:
# Jul 15th 2019, 15:24 lubos I'll pick this in the morning with fresh head :slightly_smiling_face: thanks @kaliel for your input! :slightly_smiling_face:
# Jul 15th 2019, 15:23 lubos @kaliel the memory issues are before creating any CSV (when trying to debug `orders` results - just count output)
# Jul 15th 2019, 15:23 kaliel @lubos do you use a plugin or something to output the CSV? I had memory issues too with some of them. Or is the allowed-memory error triggered before the CSV construction?
# Jul 15th 2019, 15:22 lubos to be even more specific, we have `orders` and `leads` data, and the client requires both combined in a CSV file so they can process the data on their side
# Jul 15th 2019, 15:21 lubos @kaliel as you say, except the new collection is not a third table but the CSV output
# Jul 15th 2019, 15:20 kaliel @lubos To be clear, you have to query 2 tables, merge the results, do some processing, then save this new collection (to a third table)?
# Jul 15th 2019, 15:19 lubos I'll see, but for now it seems I'll stick with increasing the memory limit; hopefully `512M` is just fine :slightly_smiling_face:
# Jul 15th 2019, 15:18 lubos and an SQL UNION could be an answer too, but who likes it? :slightly_smiling_face:
# Jul 15th 2019, 15:17 lubos but still, I need to combine two collections (each from a different table); that's why I picked post-processing in the first place
# Jul 15th 2019, 15:17 lubos rather than do post-processing with collections
# Jul 15th 2019, 15:16 lubos so using an endless loop and SQL with `limit (number, from)` is the way to go, I guess :slightly_smiling_face:
# Jul 15th 2019, 15:15 lubos @kaliel but then the 101-200 created dates could be sorted wrongly relative to the first 100 results; I think I'd need to loop SQL queries over a short period in that case :slightly_smiling_face:
# Jul 15th 2019, 15:14 kaliel @lubos you can sortBy('created') the first 100 results, then query for the 101th to 200th, etc...
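A rough sketch of that batching idea against a single, hypothetical `Orders` table; keeping the ordering in SQL means each page of 100 arrives already in global sort order:

```
<?php
use Cake\ORM\TableRegistry;

$orders = TableRegistry::getTableLocator()->get('Orders');

$page = 1;
do {
    // Each query fetches one sorted slice of 100 rows
    $batch = $orders->find()
        ->orderAsc('created')
        ->limit(100)
        ->page($page)
        ->all();

    foreach ($batch as $order) {
        // process one row at a time, e.g. append it to the CSV
    }
    $page++;
} while ($batch->count() === 100);
```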
# Jul 15th 2019, 15:14 lubos is using `512MB` too much? :slightly_smiling_face:
# Jul 15th 2019, 15:13 lubos but it's so much easier to increase the memory limit
# Jul 15th 2019, 15:13 lubos thanks both, I will do
# Jul 15th 2019, 15:13 lubos but I guess I can make sets limited by a short period, e.g. a day or so
# Jul 15th 2019, 15:12 lubos and the output is one CSV file, which would be OK since I can append to the CSV, but I need to sort them first
# Jul 15th 2019, 15:12 lubos well I can chunk them, but I need `sortBy('created')` across all items, and then chunking does not work well
# Jul 15th 2019, 15:11 ricksaccous like @kaliel is suggesting
# Jul 15th 2019, 15:11 ricksaccous do x amount at a time with some delay in between maybe
# Jul 15th 2019, 15:11 kaliel @lubos you can add a "processed" boolean to each item of your collection, and add multiple tasks with the queue plugin
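A minimal sketch of that idea, assuming a hypothetical boolean `processed` column on the table; the queue-plugin wiring is left out, but each run of this slice could be one deferred task:

```
<?php
use Cake\ORM\TableRegistry;

$orders = TableRegistry::getTableLocator()->get('Orders');

// Grab one unprocessed slice; a queued task would run this repeatedly
$batch = $orders->find()
    ->where(['processed' => false])
    ->orderAsc('created')
    ->limit(100)
    ->all();

$done = [];
foreach ($batch as $row) {
    // ... process the row (e.g. append it to the CSV) ...
    $done[] = $row->id;
}

// Mark the whole slice as processed in a single UPDATE
if ($done) {
    $orders->updateAll(['processed' => true], ['id IN' => $done]);
}
```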
# Jul 15th 2019, 15:11 ricksaccous @lubos you can probably chunk them no?
# Jul 15th 2019, 15:10 ricksaccous in one widget
# Jul 15th 2019, 15:10 ricksaccous and permeating data through different templates
# Jul 15th 2019, 15:10 lubos processing a result set of about 9,300 rows, which does not seem like too much, but there are many fields in each row
# Jul 15th 2019, 15:10 ricksaccous I'm having so much trouble with widgets
# Jul 15th 2019, 15:09 lubos @kaliel I can, but I'd rather write memory-efficient scripts
# Jul 15th 2019, 15:08 kaliel @lubos So you can increase `memory_limit`?
# Jul 15th 2019, 15:07 lubos so far I am setting the memory limit with `ini_set('memory_limit', '512M');` and it works fine; hopefully I am not using too much memory :slightly_smiling_face:
# Jul 15th 2019, 15:05 lubos a good way is to use `take` with a `while` loop, but in my case I need to merge 2 collections first, so I am using ``` $collection3 = $collection1 ->append($collection2) ->sortBy('created') ->take(100) ``` but because `$collection1` is too large, it fails due to the memory limit
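One memory-friendly alternative to materializing the merged collection is a classic two-way merge over two already-sorted, unbuffered queries. A sketch assuming hypothetical `Orders` and `Leads` tables and an `export.csv` target (in practice, entity fields may need flattening before `fputcsv`):

```
<?php
use Cake\ORM\TableRegistry;

// Wrap each query in a generator so current()/valid() are side-effect free
$stream = function ($query) {
    foreach ($query->all() as $row) {
        yield $row;
    }
};

$locator = TableRegistry::getTableLocator();

// Both queries sorted by `created`; unbuffered so rows stream from the DB
$orders = $stream($locator->get('Orders')->find()
    ->orderAsc('created')->enableBufferedResults(false));
$leads = $stream($locator->get('Leads')->find()
    ->orderAsc('created')->enableBufferedResults(false));

$fh = fopen('export.csv', 'wb');

// Two-way merge: always emit whichever current row is older, so the CSV
// comes out globally sorted while only two rows are held in memory.
while ($orders->valid() || $leads->valid()) {
    $takeOrder = $orders->valid()
        && (!$leads->valid() || $orders->current()->created <= $leads->current()->created);

    if ($takeOrder) {
        fputcsv($fh, $orders->current()->toArray());
        $orders->next();
    } else {
        fputcsv($fh, $leads->current()->toArray());
        $leads->next();
    }
}
fclose($fh);
```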
# Jul 15th 2019, 15:03 lubos I probably can optimize the SQL (selecting a field list instead of `*`) so the collection is not so large and can fit into memory
# Jul 15th 2019, 15:02 lubos @kaliel well I am already executing this on the CLI
# Jul 15th 2019, 15:01 kaliel @lubos maybe you can try to defer some process, see cakephp-queue plugin : https://www.dereuromark.de/2013/12/22/queue-deferred-execution-in-cakephp/
# Jul 15th 2019, 14:55 lubos Is there any good way to process large `ResultSet` collections without increasing `memory_limit`? I am having an allowed-memory-size error with my result set
# Jul 15th 2019, 13:49 ricksaccous I think I know how to go about it actually
# Jul 15th 2019, 13:48 ricksaccous custom nesting label covers no label