# |
Sep 13th 2019, 16:25 |
louis |
Wow that’s perfect |
# |
Sep 13th 2019, 16:24 |
challgren |
https://sandbox.dereuromark.de/sandbox/queue-examples/scheduling |
# |
Sep 13th 2019, 16:24 |
challgren |
Plus you can see what's been cleaned and what failed cleanup |
# |
Sep 13th 2019, 16:23 |
louis |
I’m going to try it this way! I agree, I never thought about it :expressionless: Thank you @challgren |
# |
Sep 13th 2019, 16:22 |
challgren |
A little bit cleaner IMHO |
# |
Sep 13th 2019, 16:22 |
challgren |
Yeah same idea except the queue handles it |
# |
Sep 13th 2019, 16:21 |
louis |
Okay, in previous projects I was writing a CakePHP shell scheduled with a cron task. This is why I’m looking for a better solution :slightly_smiling_face: |
# |
Sep 13th 2019, 16:20 |
challgren |
That way, if the user wants to redownload it, I'm not hammering the server with another PDF creation/zip process and can quickly serve them the “cached” package |
# |
Sep 13th 2019, 16:19 |
challgren |
Using https://github.com/dereuromark/cakephp-queue |
# |
Sep 13th 2019, 16:19 |
challgren |
But what I do is create my PDFs/zips and then create a queue task to clean them up in an hour |
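A minimal sketch of that delayed-cleanup idea with dereuromark/cakephp-queue; the task name, its base class location, and the not-before option key vary between plugin versions, so treat this as an assumption-laden outline rather than the plugin's exact API:

```php
<?php
// src/Queue/Task/CleanupTmpTask.php (hypothetical name/location; recent plugin
// versions expect tasks here, older ones use Shell tasks instead)
namespace App\Queue\Task;

use Cake\Filesystem\Folder;
use Queue\Queue\Task;

class CleanupTmpTask extends Task
{
    public function run(array $data, int $jobId): void
    {
        // $data['path'] is the uuid-named TMP directory that held the PDFs/zip
        (new Folder($data['path']))->delete();
    }
}
```

Enqueuing right after the zip is built would then look roughly like `$this->loadModel('Queue.QueuedJobs')->createJob('CleanupTmp', ['path' => $tmpDir], ['notBefore' => '+1 hour'])`; the exact config key for the delay depends on the plugin version.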
# |
Sep 13th 2019, 16:18 |
challgren |
Nah, I'm just waking up so the Mt. Dew hasn't kicked in yet |
# |
Sep 13th 2019, 16:18 |
louis |
Should I explain it another way? |
# |
Sep 13th 2019, 16:18 |
challgren |
Oh probably do a queue task |
# |
Sep 13th 2019, 16:18 |
louis |
@challgren Yep, I’m using it ;). Sorry if my question is not well formulated |
# |
Sep 13th 2019, 16:17 |
challgren |
@louis https://github.com/FriendsOfCake/CakePdf is awesome! |
# |
Sep 13th 2019, 16:16 |
louis |
Hello everyone, I’m looking for advice on PDF generation. At the moment I have a method that saves multiple PDF files into a TMP directory (named with a UUID) and then creates an archive of these PDFs. The problem is: what is a good way to remove this directory after serving the file to the user using `response->withFile()`? |
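For context, a minimal sketch of the flow being described, with placeholder names and without the cleanup step the rest of the thread discusses (the controller, uuid directory, and zip name are all illustrative):

```php
<?php
namespace App\Controller;

use Cake\Utility\Text;

class DocumentsController extends AppController
{
    public function download()
    {
        // uuid-named working directory under TMP
        $dir = TMP . Text::uuid() . DS;
        mkdir($dir, 0775, true);

        // ... generate the individual PDF files into $dir (e.g. with CakePdf) ...

        // bundle everything into one archive
        $zipPath = $dir . 'documents.zip';
        $zip = new \ZipArchive();
        $zip->open($zipPath, \ZipArchive::CREATE);
        foreach (glob($dir . '*.pdf') as $pdf) {
            $zip->addFile($pdf, basename($pdf));
        }
        $zip->close();

        // serve the archive; when/how to remove $dir afterwards is the open question
        return $this->response->withFile($zipPath, ['download' => true]);
    }
}
```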
# |
Sep 13th 2019, 16:15 |
ruano84 |
Hi guys |
# |
Sep 13th 2019, 14:35 |
admad |
welcome |
# |
Sep 13th 2019, 14:33 |
felipe.marinho |
Nice, thank you :pray: |
# |
Sep 13th 2019, 14:29 |
admad |
https://dev.mysql.com/doc/refman/8.0/en/memory-storage-engine.html |
# |
Sep 13th 2019, 14:28 |
admad |
also if possible use the 'MEMORY' storage engine for the temp table |
# |
Sep 13th 2019, 14:27 |
felipe.marinho |
Awesome... a temp table and an action to match those 2 tables... I'll try this too... Thank you @admad |
# |
Sep 13th 2019, 14:26 |
admad |
then do whatever processing you need |
# |
Sep 13th 2019, 14:25 |
admad |
if your table columns don't directly match the CSV columns, then first dump the CSV into a temp table with matching columns |
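A rough sketch of that temp-table flow (table and column names are invented for illustration; the MEMORY engine suggestion from above is included):

```php
<?php
// Illustrative only: create a CSV-shaped temp table, fill it, then map it
// into the real table in one statement.
use Cake\Datasource\ConnectionManager;

$conn = ConnectionManager::get('default');

// 1) Temp table whose columns mirror the CSV (MEMORY engine as suggested above)
$conn->execute(
    'CREATE TEMPORARY TABLE csv_import (
        col_a VARCHAR(255),
        col_b VARCHAR(255),
        col_c VARCHAR(255)
    ) ENGINE=MEMORY'
);

// 2) Load the raw CSV rows into csv_import (LOAD DATA INFILE or multi-inserts,
//    as discussed elsewhere in this thread)

// 3) Transform/match into the real table
$conn->execute(
    'INSERT INTO items (name, price, created)
     SELECT col_a, col_b, NOW() FROM csv_import'
);
```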
# |
Sep 13th 2019, 14:24 |
felipe.marinho |
the problem is matching the fields from the CSV to the table columns... |
# |
Sep 13th 2019, 14:23 |
admad |
you should just upload the file and use a shell / command to do the importing in the background |
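A bare-bones CakePHP 3.x-style command for that kind of background import (the command name, argument, and row handling are placeholders):

```php
<?php
// src/Command/ImportCsvCommand.php (hypothetical)
namespace App\Command;

use Cake\Console\Arguments;
use Cake\Console\Command;
use Cake\Console\ConsoleIo;
use Cake\Console\ConsoleOptionParser;

class ImportCsvCommand extends Command
{
    protected function buildOptionParser(ConsoleOptionParser $parser)
    {
        $parser->addArgument('file', [
            'required' => true,
            'help' => 'Path to the uploaded CSV file',
        ]);

        return $parser;
    }

    public function execute(Arguments $args, ConsoleIo $io)
    {
        $handle = fopen($args->getArgument('file'), 'r');
        while (($row = fgetcsv($handle)) !== false) {
            // collect/insert rows here, e.g. multi-inserts or into a temp table
        }
        fclose($handle);

        $io->out('Import finished');
    }
}
```

It could then be run (or kicked off by a queue worker / cron) with something like `bin/cake import_csv /path/to/upload.csv`.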
# |
Sep 13th 2019, 14:23 |
admad |
ideally, importing 500,000 lines of CSV shouldn't even be done in a web request. |
# |
Sep 13th 2019, 14:22 |
felipe.marinho |
Yes, his approach was to use an input type="hidden"... I'll try to upload the CSV, match the fields, and then use the uploaded CSV. |
# |
Sep 13th 2019, 14:22 |
admad |
Sorry, but I am not going to bother reading that :slightly_smiling_face: |
# |
Sep 13th 2019, 14:21 |
felipe.marinho |
I adapted this tutorial: https://quickadminpanel.com/blog/how-to-import-csv-in-laravel-and-choose-matching-fields/ |
# |
Sep 13th 2019, 14:20 |
admad |
if not possible, then use the query builder to do a multi insert |
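Something like this for the query-builder fallback (column names and the `items` table are assumptions):

```php
<?php
// Illustrative multi-row insert with CakePHP's query builder.
use Cake\Datasource\ConnectionManager;

$conn = ConnectionManager::get('default');

// $rows would be the parsed CSV lines; chunking keeps a single statement from
// hitting max_allowed_packet limits on huge files.
foreach (array_chunk($rows, 1000) as $chunk) {
    $query = $conn->newQuery()
        ->insert(['name', 'price'])
        ->into('items');

    foreach ($chunk as $row) {
        $query->values(['name' => $row[0], 'price' => $row[1]]);
    }

    $query->execute();
}
```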
# |
Sep 13th 2019, 14:20 |
admad |
next, don't use ORM methods to insert. If the server allows it, best would be to use the `LOAD DATA INFILE` syntax to directly make MySQL use the CSV file to insert rows |
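Roughly what that looks like from CakePHP (file path and table name are placeholders; `LOCAL` needs local-infile enabled on both the MySQL server and the PHP client):

```php
<?php
// Illustrative LOAD DATA call through the default connection.
use Cake\Datasource\ConnectionManager;

$conn = ConnectionManager::get('default');
$conn->execute(
    "LOAD DATA LOCAL INFILE '/tmp/upload.csv'
     INTO TABLE csv_import
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     IGNORE 1 LINES"
);
```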
# |
Sep 13th 2019, 14:18 |
admad |
also make sure post_max_size and upload_max_filesize are set to large enough values in php.ini to allow large file uploads |
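A quick way to double-check which limits are actually in effect (the values in the comments are just typical defaults):

```php
<?php
// Both limits have to accommodate the CSV: upload_max_filesize caps the file
// itself, post_max_size caps the whole request body.
var_dump(ini_get('upload_max_filesize')); // e.g. "2M"
var_dump(ini_get('post_max_size'));       // e.g. "8M"
```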
# |
Sep 13th 2019, 14:18 |
admad |
Using a hidden input to pass CSV data sounds like a very brittle approach. Use a file input to upload it instead. |
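In a CakePHP template that would look something like this (the `csv_file` field name is an assumption):

```php
<?php
// Illustrative upload form; ['type' => 'file'] makes FormHelper emit the
// multipart/form-data enctype needed for file uploads.
echo $this->Form->create(null, ['type' => 'file']);
echo $this->Form->control('csv_file', ['type' => 'file']);
echo $this->Form->button('Upload');
echo $this->Form->end();
```

The controller can then read it with `$this->request->getData('csv_file')` (an upload array or UploadedFile object, depending on the CakePHP version) and hand the stored path to the import command.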
# |
Sep 13th 2019, 14:17 |
ckjksl |
Locally it will be faster, but are you trying to save 500,000 entities all at once? |
# |
Sep 13th 2019, 14:15 |
felipe.marinho |
I'm using MySQL 5.5; I read about updating the MySQL version, so I'll try to install an updated MariaDB and test it... and maybe later try to create all these entities using an uploaded CSV... I tried to use the file inside the TMP folder, but after the upload, when I used it inside the action, the server had deleted the file... I saved the tmp file path in the input type="hidden" I created first, but the file was deleted :( I think the upload approach is the best option for now... locally it will be faster. |
# |
Sep 13th 2019, 14:11 |
ckjksl |
Maybe instead of trying to create them all at once, you can stagger it so it saves 100 (or whatever number you want) at a time? 1000 seconds isn't very long |
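A sketch of that staggering idea ($entities and the Items table are placeholders):

```php
<?php
// Save in batches of 100 instead of one giant saveMany() call; each saveMany()
// runs in its own transaction by default, so a timeout loses less work.
foreach (array_chunk($entities, 100) as $batch) {
    $this->Items->saveMany($batch);
}
```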
# |
Sep 13th 2019, 14:07 |
felipe.marinho |
@ckjksl The PHP timeout in php.ini is max_execution_time=1000, and it crashes instantaneously, not even a single second lol... Yes, I'm using saveMany to save all that data... |