This is more of an architectural question.
I'm about to code a bunch of Import implementations. They all expect some parameters (e.g. a CSV file) and then take quite some time to process. In my previous project, I sent those Imports to the background with a shell_exec() call and then monitored a logfile in the browser to report on their status. My big hope now is that Laravel takes over here and streamlines all that manual work.
For now, my question is about the class architecture behind this.
My requirements for a bunch of imports are:
- Each Import needs to run as a background process
- Monitor progress in Browser (and logfile)
- Start imports from the console and via HTTP (see the sketch below)
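For the last point, what I imagine is dispatching the same Job from an Artisan command and from a controller, roughly like this (only a rough sketch; RunImport, ImportCsv, ImportController and the uniqid()-based import id are placeholder names I made up):

```php
// app/Console/Commands/RunImport.php – queue the import from the console
namespace App\Console\Commands;

use App\Jobs\ImportCsv;
use Illuminate\Console\Command;
use Illuminate\Foundation\Bus\DispatchesJobs;

class RunImport extends Command
{
    use DispatchesJobs;

    protected $signature = 'import:csv {file}';
    protected $description = 'Queue a CSV import';

    public function handle()
    {
        // Push the import onto the queue instead of running it inline
        $importId = uniqid();
        $this->dispatch(new ImportCsv($this->argument('file'), $importId));
        $this->info("Import {$importId} queued.");
    }
}

// app/Http/Controllers/ImportController.php – queue the same job via HTTP
namespace App\Http\Controllers;

use App\Jobs\ImportCsv;
use Illuminate\Http\Request;

class ImportController extends Controller
{
    public function store(Request $request)
    {
        // Move the uploaded CSV somewhere the queue worker can read it
        $file = $request->file('csv')->move(storage_path('imports'));

        // The base Controller already uses DispatchesJobs
        $importId = uniqid();
        $this->dispatch(new ImportCsv($file->getPathname(), $importId));

        return response()->json(['import' => $importId]);
    }
}
```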
Right now I plan to use a "Job" in Laravel 5.1 to implement the basic Import. What I'm struggling with is the implementation of some kind of "progress bar" and of monitoring the most recent "log messages" in the browser. I do not need a real "live" view via sockets, but it should be possible to regularly poll and update the progress view of a running Import.
- Does anybody have hints on how to implement this progress reporting?
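To make the question a bit more concrete, this is roughly what I have in mind for the progress part, assuming the progress is simply written to the cache and polled over HTTP (again only a sketch; the job name, the import id and the cache key format are invented):

```php
// app/Jobs/ImportCsv.php – the queued import, writing its progress to the cache
namespace App\Jobs;

use Illuminate\Contracts\Bus\SelfHandling;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Cache;

class ImportCsv extends Job implements SelfHandling, ShouldQueue
{
    use InteractsWithQueue, SerializesModels;

    protected $path;
    protected $importId;

    public function __construct($path, $importId)
    {
        $this->path = $path;
        $this->importId = $importId;
    }

    public function handle()
    {
        $lines = file($this->path, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $total = count($lines);

        foreach ($lines as $index => $line) {
            $row = str_getcsv($line);

            // ... actual import of $row goes here ...

            // Remember how far we got, so an HTTP endpoint can report it
            Cache::put("import.{$this->importId}.progress", [
                'processed' => $index + 1,
                'total'     => $total,
            ], 60);
        }
    }
}

// app/Http/routes.php – a small endpoint the progress bar can poll every few seconds
Route::get('imports/{id}/progress', function ($id) {
    return response()->json(
        Cache::get("import.{$id}.progress", ['processed' => 0, 'total' => 0])
    );
});
```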
My approach so far: read the CSV file, push each line as its own queue job and monitor the queue. The log messages could trigger an event that populates a stack of the most recent log messages, see the sketch below. (I may run into race conditions, because some lines may depend on another line having been processed first.)
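For the log messages, I imagine something along these lines, assuming Redis is available for the capped list (ImportLineProcessed and PushImportLogMessage are names I made up; the listener would be registered in the EventServiceProvider):

```php
// app/Events/ImportLineProcessed.php – fired for every imported line
namespace App\Events;

class ImportLineProcessed extends Event
{
    public $importId;
    public $message;

    public function __construct($importId, $message)
    {
        $this->importId = $importId;
        $this->message = $message;
    }
}

// app/Listeners/PushImportLogMessage.php – keeps a capped list of recent messages
namespace App\Listeners;

use App\Events\ImportLineProcessed;
use Illuminate\Support\Facades\Redis;

class PushImportLogMessage
{
    public function handle(ImportLineProcessed $event)
    {
        $key = "import.{$event->importId}.log";

        // Newest message first, keep only the 50 most recent
        Redis::lpush($key, $event->message);
        Redis::ltrim($key, 0, 49);
    }
}
```

Inside the job's loop I would then fire event(new ImportLineProcessed($this->importId, "Imported line {$index}")) and read the last messages back with Redis::lrange($key, 0, 49) in the progress view.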
via Chebli Mohamed