
How to bundle workflow dependencies from Homebrew



Hello!

 

I know I'm missing something basic here. I have a workflow that works great on my computer, and I'd like to finally export it to install on another computer. Because it's just a bash script that references the ffmpeg binary (installed in /usr/local/bin via Homebrew), I copied the ffmpeg binary into the workflow folder and got it to work - on my computer. When I install it on another user's computer I get the error "Library not loaded: /usr/local/Cellar..etc.", which makes sense since that user hasn't installed Homebrew or ffmpeg.

 

What I'm looking for:

Is there a way to bundle everything into the workflow directory so that it can run as a standalone workflow, instead of the user having to install Homebrew and the ffmpeg binary? I've been at this for hours (or days, actually) and haven't been able to find anything on it in the forums or on the web, which makes me think I'm, well, missing something. I'd appreciate any answers or links about this concept.


Make sure to reference the binary in your workflow folder. From your error, it seems that your script looks at "/usr/local/Cellar", but you just have to reference your binary as "./ffmpeg". Think of your workflow folder as the working directory of your script, and you should be good if everything looks at this directory instead of somewhere else on your computer.
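To make the idea concrete, here's a minimal sketch (the stub and file names are stand-ins for the real binary): the script resolves "./ffmpeg" relative to its working directory, so the bundled copy is found no matter where the workflow is installed.

```shell
#!/usr/bin/env bash
# Sketch: reference the binary bundled in the workflow folder by a
# relative path instead of a hard-coded Homebrew path. A tiny stub
# stands in for the real ffmpeg binary here.
workflow_dir=$(mktemp -d)
printf '#!/bin/sh\necho "ffmpeg stub"\n' > "${workflow_dir}/ffmpeg"
chmod +x "${workflow_dir}/ffmpeg"

cd "${workflow_dir}"        # the workflow folder is the working directory
result=$(./ffmpeg)          # relative path resolves to the bundled copy
echo "${result}"            # prints: ffmpeg stub
```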


Hello @msa,

 

When asking for help with a Workflow, please post it so we can properly help you. Debugging can already be hard with access to the code, and it’s considerably harder from just a description.


Depending on the case, bundling dependencies can be extremely easy or a pain, but don't expect to be able to just copy things from a Homebrew installation and have them work. Sometimes there are moving parts to worry about.


I’ve built a script exactly for this, and it supports ffmpeg. Download it to your Workflow’s directory and then call it in your Bash script with something like PATH="$(./_sharedresources 'ffmpeg')". It’ll alter your PATH for that script and you can just call ffmpeg as usual.


In the machines that already have it installed and detectable, it’ll use that. In the ones that don’t, it’ll download it and keep it contained in a place that won’t interfere with other tools.
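For anyone curious about the underlying idea (this is my paraphrase of the general pattern, not the _sharedresources script itself): prepending a directory to PATH makes a plain ffmpeg call later in the script resolve to whatever copy lives there.

```shell
#!/usr/bin/env bash
# Sketch of the PATH-prepend pattern. The deps directory name is
# hypothetical; any directory containing the bundled tool works.
deps_dir="${PWD}/deps"        # hypothetical location for bundled tools
PATH="${deps_dir}:${PATH}"    # earlier PATH entries win lookup

first_entry="${PATH%%:*}"     # the deps dir is now searched first
echo "${first_entry}"
# After this, a bare `ffmpeg` invocation finds the bundled copy
# before any system or Homebrew installation.
```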

11 hours ago, msa said:

When I install it on another user's computer I get the error "Library not loaded: /usr/local/Cellar..etc."

 

That error means the ffmpeg you're using is dynamically linked to other Homebrew libraries on your system. You'll need a statically compiled version if you want it to run on other people's computers.

 

The standard Homebrew version of ffmpeg should work, so you've probably got a custom build. Run brew info ffmpeg and see whether it says "Built from source on…" near the top of the output.
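If you want to check the linkage yourself, otool -L (macOS, from the Xcode command line tools) lists the shared libraries a binary loads; any path under /usr/local/Cellar indicates a Homebrew-linked dylib. A guarded sketch (the binary path is just an example):

```shell
#!/usr/bin/env bash
# Sketch: detect whether a binary depends on Homebrew dylibs.
# `otool -L` prints the shared libraries a Mach-O binary loads.
bin="/usr/local/bin/ffmpeg"     # example path; point at your binary
if ! command -v otool >/dev/null 2>&1; then
  verdict="otool unavailable (not macOS?)"
elif otool -L "${bin}" 2>/dev/null | grep -q '/usr/local/Cellar'; then
  verdict="dynamically linked against Homebrew libraries"
else
  verdict="no Homebrew library references found"
fi
echo "${verdict}"
```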


Thank you so much, everyone! I have the workflow working now thanks to your help. I'll leave the details below to help anyone else who has a similar question.

 

  • Firstly, I apologize for not copying the script here; I wasn't intending to debug the script (since it was working locally) so much as to understand the bundling process. In hindsight, I see it would have helped, and I thank you all for your patience with my shortsightedness.
  • Secondly, this was a great pointer for me toward understanding how to optimize exporting workflows. I've been creating different workflow processes for the better part of a year, and now that I feel comfortable with my bash scripting, I definitely want to put the workflows out there publicly, but I ran into this wall.
  • Lastly, I'm happy to post the script here once I clean up a few parts, in case anyone would like to see it, and I of course welcome their feedback.

 

@GuiB I can confirm this was how I originally started my script and it worked fine - as long as I was running it on my own computer - because although my script was referencing the ffmpeg binary located in the workflow folder, the ffmpeg configuration was set up to reference my own local path and not the workflow's.

 

@deanishe This was what I started brainstorming late last night, and I'm grateful you confirmed it. Although the script @vitor linked to was what worked in the end, I wanted to know exactly how to sort it out manually. In my attempts, I downloaded ffmpeg from source and added it to the workflow folder, but because I already had it installed on my system, it kept using the original "--prefix" configuration, which continued to point to my "/usr/local/Cellar..etc.", and I couldn't figure a way around that. I also didn't want to bloat the workflow with a huge ffmpeg build that others may already have installed, so I opted for @vitor's elegant solution, which works great as it detects whether or not the user already has the ffmpeg binary installed.

 

@vitor the "_sharedresources" script did exactly what I hoped for (and more!). As I move ahead with other workflows that I wish to bundle and share publicly, would you recommend using this script for all dependencies (e.g. Homebrew's "tag" binary), or is it better catered to more complex binaries like ffmpeg and the others referenced in your script?
 


As promised, here is the working code for my workflow script, for anyone interested. I welcome any feedback. The script takes folder/file input, searches it for video files, creates lo-res proxies suitable for importing into an NLE (e.g. Premiere Pro), and organizes the hi-res and lo-res footage into their respective directories. It's not ready for the public yet, as I'll add the ability to specify the proxy width in the next version; it currently hard-codes the proxies at 640px wide.

 

#!/usr/bin/env bash

# SET PATH TO SHARED RESOURCES
PATH="$(./_sharedresources 'ffmpeg')";

# SAVE AND REFORMAT IFS (INTERNAL FIELD SEPARATOR)
OLDIFS="$IFS";
IFS=$(echo -en "\n\t\b");

# GET THE ARRAY OF THE PATH(S) SELECTED.
shopt -s nullglob nocaseglob;
for f in ${selection[*]}; do
	
		# CHECK IF SELECTION IS DIRECTORY
		if [[ -d "$f" ]]; then
			shopt -s nullglob nocaseglob;
		else
			if [[ -f "$f" ]]; then
				shopt -s nocasematch;
			fi
		fi
		videos=$(find "$f" -type f -iname "*.mov" -o -iname "*.mp4" -o -iname "*.m4v" -o -iname "*.mts" -o -iname "*.avi" -o -iname "*.flv" -o -iname "*.wmv" -o -iname "*.mpg");

		# TRANSCODE FOOTAGE
		for v in ${videos[*]}; do
			ffmpeg -i "$v" -b:v 600k -vf scale=640:-1 -ar 22050 -b:a 96k "${v%.*}.PROXY.${v##*.}";
			
			# CREATE DIRECTORIES TO ORGANIZE FOOTAGE
			mkdir ${v%/*}/{lo-res,hi-res};

			# SORT FILES INTO RESPECTIVE SUB DIRECTORIES AND RENAME PROXIES
			# RENAME PROXY VIDEOS TO MATCH HI-RES MASTER FOOTAGE
			for low in ${v%/*}/*PROXY*; do mv "$low" ${v%/*}/lo-res; done
			for proxy in ${v%/*}/lo-res/*PROXY*; do mv "$proxy" "${proxy/.PROXY/}"; done
			for hi in ${v[*]}; do	mv "$hi" ${v%/*}/hi-res; done
		done
	done

# RESTORE $IFS
IFS=$OLDIFS;

 

2 hours ago, msa said:

or is this better catered to more complex binaries like ffmpeg and the others that you reference in your script?

 

It is only catered to the tools referenced in the script. Each tool is distributed differently by their developers, and to be able to provide them in a manner that’ll work for everyone, each needs to be downloaded in their own way.

 

1 hour ago, msa said:

I welcome any feedback.

 

  • Get rid of the OLDIFS lines. Setting IFS in a script will not change it outside the script, so there's no point in saving and restoring it.
  • There's no need for echo to set IFS. Do IFS=$'\n\t\b'.
  • Why are you setting so much in IFS? Typically \n suffices.
  • Don't nest a single if inside an else. Use elif instead.
  • Why are you setting so many (any, actually) shopt options? They seem useless in this script.
  • Get in the habit of quoting what's between ${} to prevent nasty surprises.
  • Don't do $f; do ${f}. Be consistent with the rest of the script.
  • Don't put ; at the end of lines. It's unnecessary, and it's a style no one uses.
  • Install shellcheck in your editor. It'll find even more things.
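Applied to a minimal skeleton (my sketch, not the full workflow), those suggestions look like this: $'…' quoting for IFS, elif instead of a nested if, quoted expansions, and no trailing semicolons.

```shell
#!/usr/bin/env bash
# Skeleton illustrating the feedback above; check_path is a
# hypothetical helper, not part of the original workflow.
IFS=$'\n'

check_path() {
  local f="${1}"
  if [[ -d "${f}" ]]; then
    echo "directory"
  elif [[ -f "${f}" ]]; then    # elif replaces the nested if/else
    echo "file"
  else
    echo "missing"
  fi
}

check_path "/tmp"               # prints: directory
```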


Thanks @vitor, very kind and succinct. Exactly what I was looking for. I had no idea that setting IFS in a script doesn't change anything outside the script, so I was being as safe as possible (that was the intention). The rest of your feedback is excellent, and I'm embarrassed at missing the elif but thankful you caught it. Also, shellcheck: I had never used it before, and it's exactly what I needed. I've picked up a lot today and am looking forward to implementing the changes. Have a great day/night, everyone.

3 minutes ago, msa said:

I had no idea that IFS setting in a script doesn't change anything outside the script so I was being as safe as possible

 

Scripts are run in subshells, and nothing is shared "upwards". That's the difference between running a script and sourcing one. A sourced (imported) script is run in your own script's namespace.

 

You have to worry about resetting IFS if you change it in a function.
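A quick demonstration of the difference (the temp file stands in for any script): running the file in a subshell leaves the parent's IFS alone, while sourcing it changes it.

```shell
#!/usr/bin/env bash
# Demonstration: a variable set in a script does not leak into the
# calling shell, but sourcing the same file runs it in this shell's
# namespace and does change it.
tmp_script=$(mktemp)
printf 'IFS=":"\n' > "${tmp_script}"

bash "${tmp_script}"        # runs in a subshell; parent IFS untouched
first_ifs="$IFS"

. "${tmp_script}"           # sourced; same namespace, IFS changes
second_ifs="$IFS"

[ "${first_ifs}" = "${second_ifs}" ] || echo "sourcing changed IFS"
rm -f "${tmp_script}"
```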


 

On 3/4/2018 at 7:06 PM, deanishe said:

 

Scripts are run in subshells, and nothing is shared "upwards". That's the difference between running a script and sourcing one. A sourced (imported) script is run in your own script's namespace.

 

You have to worry about resetting IFS if you change it in a function.

 

Hi @deanishe, sorry it took me some time to get back to this - duty called. So from what I understand now, all my bash scripts execute in their own subshells, and therefore variables (like the IFS setting) are basically local to that script and won't pollute the parent shell. Is that correct? And if I were to create a function instead of a script, then that runs in the parent shell, correct? I just want to make sure I'm on the right path here. From what I understand, though, subshells execute more slowly and use more processes, so would you recommend I look at an alternative to bash scripts?

 

Thank you both @vitor and @deanishe for taking the time to help with my bash scripting. I'm already seeing improvements, and this forum in particular has been a great model to learn from.

2 minutes ago, msa said:

all my bash scripts will execute in their own subshells and therefore variables (like the IFS setting) are basically local to that script and will not pollute the parent shell is that correct?

 

Yes. Unless you call your script via source (which you’ll likely never do, that’s more for your shell’s configuration files).

 

6 minutes ago, msa said:

If I were to create a function instead of a script however, then that runs in the parent shell correct?

 

Just so we’re clear here: if you create a function inside a script, that function will have the same fate as the variables. It’ll be forgotten when the script ends and won’t affect the shell. But if you define a function outside a script, directly in your shell or by sourcing your shell’s startup files, then it will affect the shell.

 

5 minutes ago, msa said:

Thank you both @vitor and @deanishe for taking the time to help with my bash scripting, I'm already seeing improvements and this forum in particular has served as inspiration to a great model to learn from.

 

You are very welcome! The users that give the most joy to help are the ones that genuinely want to learn.

41 minutes ago, msa said:

If I were to create a function instead of a script however, then that runs in the parent shell correct?

 

Nope. IFS is a global variable (to the running script). If you change it in a function, it’s also changed for the rest of the script.

 

As IFS is so fundamental to the way bash works, *not* restoring it immediately is likely to break something else. 

 

The same applies to your script.

 

@vitor’s point was that there’s no reason to restore it as the last line of a script, as the very next thing bash does is forget all about that script and its variables.
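To see both behaviours side by side, here's a small sketch: a plain assignment inside a function leaks into the rest of the script, while bash's local keeps it contained (function names are made up for the demo).

```shell
#!/usr/bin/env bash
# Demonstration: IFS is global to the running script unless a
# function declares it local.
leaky()     { IFS=":"; }          # plain assignment: leaks out
contained() { local IFS=","; }    # local: scoped to the function

default_ifs="$IFS"
contained
after_contained="$IFS"            # unchanged; local contained it
leaky
after_leaky="$IFS"                # now ":" for the rest of the script

[ "${after_contained}" = "${default_ifs}" ] && echo "local kept IFS intact"
```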

6 minutes ago, deanishe said:

Nope. IFS is a global variable (to the running script). If you change it in a function, it’s also changed for the rest of the script.

Right, that helps too. I should have been clearer with my phrasing, but you both answered it and helped me make sure I don't make that mistake and accidentally change it globally. Restoring IFS was what I started with in my script, but as you and @vitor have pointed out, it was pointless there because it was in a script. I'll do some more digging on this topic on my own just to make sure I'm completely clear on it. Thanks, everyone!

