Hello! Is Soundflow aware when a Pro Tools session crashes? In my imagination, Soundflow would execute a workflow when a Pro Tools session crashes, which would upload logs to a central server or even Slack.
- Christian Scheuer @chrscheuer · 2019-03-12 12:20:34.458Z
Hi @ryan!
Great to see you here in the forum.
It is possible to create more complex, manual triggers in the upcoming v2.2 builds. One trigger could periodically check whether a crash log file has recently been added to the appropriate directory, and that file could then be uploaded to the central server / Slack.
Another possibility would be to create a custom trigger to periodically check for the presence of a crash window.
The first one would probably be less taxing on system resources, since it could run at an unobtrusive interval, whereas the second would need a short enough polling interval to catch the crash window before the user closes it.
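Here's a rough sketch of how the second approach could look. Note this is only a sketch: it assumes sf.ui.app(...) exposes the same activeBundleID and window selectors that sf.ui.proTools does, and it guesses 'com.apple.problemreporter' as the bundle ID of the macOS crash reporter; both of those would need to be verified.

//Sketch only - the sf.ui.app() usage and the bundle ID are assumptions to verify
function crashWindowIsOpen() {
    var reporter = sf.ui.app('com.apple.problemreporter');
    if (!reporter.activeBundleID) return false; //Crash reporter isn't running
    return reporter.windows.whoseTitle.contains('Pro Tools').first.exists;
}

sf.engine.runInBackground(function () {
    while (true) {
        sf.engine.checkForCancellation();
        sf.wait({ intervalMs: 2000, executionMode: 'Background' });
        if (crashWindowIsOpen()) {
            log('Pro Tools crash window detected');
            //...upload logs / post to Slack here...
        }
    }
});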
Which one would best fit the workflow you had in mind?
- Ryan Frias @ryan
Hi @chrscheuer!
Glad to be on here.
The first workflow would probably be best. I was thinking of just making a cron job to check for the existence of a new crash log, but it might be cleaner to do it with Soundflow. It's easier to assume that the user has Pro Tools open if Soundflow is open than it is to have a generic cron job always running in the background of a machine.
With that said, does Soundflow have an encrypted data store where users can store tokens/secrets/HTTP endpoints? The tricky part would be syncing these values across workstations and making sure that they never end up in the Soundflow package directory.
Thanks Christian for your thoughts.
Ryan
Christian Scheuer @chrscheuer · 2019-03-12 14:25:46.181Z
I moved the 2nd question to its own thread: https://forum.soundflow.org/-474/how-to-store-encrypted-data-in-soundflow
We can use that to discuss encrypted data flows and how to best implement it.
- Ryan Frias @ryan
Sounds great!
- In reply to @ryan ⬆:
Christian Scheuer @chrscheuer · 2019-03-12 14:38:01.963Z (edited 2019-03-12 14:51:39.417Z)
Here's an example script to sync log files.
//Setup here:
const logDirectory = '~/Library/Logs/Avid';
const syncFilePath = '~/.soundflow/synclogs.json';
const destinationDirectory = '/Volumes/NetworkDrive/CopyToHere';
const howOften = 10000;

function syncFile(path) {
    //Now copy it to the destination
    var filename = path.split('/').slice(-1)[0];
    var destinationPath = destinationDirectory + '/' + filename;
    sf.file.copy({
        sourcePath: path,
        destinationPath: destinationPath,
        executionMode: 'Background'
    });
}

function checkLogFile(syncData, path) {
    var lastSyncDate = syncData.lastSyncDate;
    var fileInfo = sf.file.getFileInfo({ path: path }).fileInfo;
    if (lastSyncDate && fileInfo.creationTime.valueOf() <= lastSyncDate) return;
    syncFile(path);
}

function job() {
    //Read last sync date from file
    var syncData = sf.file.readJson({
        path: syncFilePath,
        onError: 'Continue',
        executionMode: 'Background'
    }).json || {};

    //Get a value of NOW as int
    var now = (new Date).valueOf();

    //Enum directory, start with newest files first
    var paths = sf.file.directoryGetFiles({
        path: logDirectory,
        sortOrder: 'CreationTimeDesc',
        executionMode: 'Background'
    }, 'Could not search Logs directory').paths;

    //Walk through each file
    paths.map(checkLogFile.bind(undefined, syncData));

    //Store our NOW
    var newSyncData = {
        lastSyncDate: now,
    };
    sf.file.writeJson({
        path: syncFilePath,
        json: newSyncData,
        executionMode: 'Background'
    }, 'Could not write sync file to disk');
}

function scheduleJob() {
    sf.engine.runInBackground(function () {
        while (true) {
            sf.engine.checkForCancellation();
            sf.wait({ intervalMs: howOften, executionMode: 'Background' });
            sf.engine.checkForCancellation();
            job();
        }
    });
}

//To run once:
job();

//To schedule cron job
//scheduleJob();
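As a follow-up, here's a minimal sketch of how a Slack notification could be added once a file has been copied. This is only a sketch: it assumes sf.system.exec is available for shelling out to curl, and slackWebhookUrl / notifySlack are hypothetical names you'd provide yourself.

//Sketch only - slackWebhookUrl is a hypothetical incoming-webhook URL you'd create in Slack
const slackWebhookUrl = 'https://hooks.slack.com/services/XXX/YYY/ZZZ';

function notifySlack(filename) {
    var payload = JSON.stringify({ text: 'New Pro Tools crash log synced: ' + filename });
    //Assumes sf.system.exec can shell out to curl on this machine
    sf.system.exec({
        commandLine: "curl -s -X POST -H 'Content-type: application/json' --data '" + payload + "' " + slackWebhookUrl
    });
}

//It could then be called at the end of syncFile(), e.g.:
//    notifySlack(filename);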
Christian Scheuer @chrscheuer · 2019-03-12 14:39:51.007Z
Note: as you can see, this has code to schedule itself. This is one way of running custom background jobs.
We should implement cron triggers that are customizable in the GUI, which would eliminate the need for this custom scheduling code.
The scheduling code requires 2.2.0 preview 7. I'll get you on board with that; let's talk about it in a PM.
Christian Scheuer @chrscheuer · 2019-03-12 14:42:06.722Z
Another note: to check whether PT is running, check that activeBundleID is truthy:
if (sf.ui.proTools.activeBundleID) ...
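For example, the job() in the sync script above could bail out early when Pro Tools isn't running:

function job() {
    //Skip the sync entirely while Pro Tools isn't running
    if (!sf.ui.proTools.activeBundleID) return;

    //...rest of the sync logic from the script above...
}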
- Ryan Frias @ryan
Wow. Fantastic. Thanks Christian!
Christian Scheuer @chrscheuer · 2019-03-12 14:50:42.574Z
The sync algorithm should probably use the last copied file's timestamp instead of a NOW timestamp, since that would be more accurate; there's a sketch of that change at the end of this post.
If you need to test without copying everything each time, the log(...) function is handy; it will create a notification.
I've changed the script so that the I/O operations happen on a background thread, so they don't interfere with normal PT/SF operation.
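Here's a minimal sketch of the timestamp change, building on the script above. The newestCopiedTime bookkeeping is illustrative and not part of the script as posted:

//Sketch only - track the newest creation time among files actually copied,
//and persist that instead of a NOW timestamp
var newestCopiedTime = 0;

function checkLogFile(syncData, path) {
    var lastSyncDate = syncData.lastSyncDate;
    var fileInfo = sf.file.getFileInfo({ path: path }).fileInfo;
    var creationTime = fileInfo.creationTime.valueOf();
    if (lastSyncDate && creationTime <= lastSyncDate) return;

    syncFile(path);
    if (creationTime > newestCopiedTime) newestCopiedTime = creationTime;
    log('Synced crash log: ' + path); //Handy notification while testing
}

//...and in job(), persist the newest copied timestamp instead of NOW,
//falling back to the previous value when nothing was copied:
//    var newSyncData = { lastSyncDate: newestCopiedTime || syncData.lastSyncDate };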