
msa

Member
  • Posts

    22
  • Joined

  • Last visited

Profile Information

  • Location
    Cusco, Peru

  1. That did the trick. I uninstalled blueutil from Homebrew, ran the workflow, and it downloaded the utility into shared_resources as you mentioned. Great, thanks for the help!
  2. Great workflow. Just wanted to point out that the update (via OneUpdater) didn't seem to connect, for some reason, to the blueutil located in the shared_resources directory, so it broke the workflow. However, I installed blueutil via Homebrew as you recommended and it works again. Not sure why it didn't connect, but I wanted to let you know. I can send you the debug report if you like.
  3. Flow: A Minimal Project Management Workflow

     Flow is an Alfred workflow, written in Bash, that provides a File Action for quick project management. It's a minimal solution for task and project management that stays out of your way so you can stay focused.

     Version 1.1 updates:
     1. Simpler workflow variable declarations. You can now declare your variables with standard path syntax, such as notes=~/Documents/Notes.
     2. The tag CLI tool is now included for those who do not have it installed via Homebrew, so you do not need to install it. If you do have it installed in its default Homebrew location (/usr/local/bin/tag), the workflow will use that version instead of the included one.

     A real-life example: in any given moment of thought, or on receiving an email/message, I want to be able to bring up Alfred and make a selection from things on my desktop to:
     • Apply a simple tag
     • Include a memo
     • Be done with it (I don't even want to see it anymore) until I'm ready to focus on it later.

     This way I hardly lose any time, but more importantly, I don't lose the mental real estate I have already accrued while working on a different project in that moment. The idea is that applying the tag sets a process in motion to migrate the selected item(s) over to your main Projects directory, where you ideally keep them. The tag options are "tomorrow", "shelf", or any day of the week, and the selected items are moved accordingly.

     Also included is a script that processes the items for "today": if you invoke the script with the hotkey "ttoday", it will comb through your Projects directory, look for items tagged with the matching tag for whatever today is (sunday-saturday) or with "tomorrow", and symlink them to the desktop. I find this script works best as a launch utility when booting up.

     Thanks for taking the time to read and give Flow a look. I've included the technical information in the "About this Workflow" pane that should show up when you import it into Alfred. You can download the workflow here: https://cp.sync.com/dl/9a0958280/gauk3st2-pdbjhpjb-purw294c-2usf89s5
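For anyone curious how the "ttoday" pass described above could work under the hood, here is a rough, hypothetical sketch rather than the shipped script. The projects path and the use of the tag CLI's --match mode are assumptions:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a "ttoday" pass: symlink items from the projects
# directory to the Desktop when they carry today's tag (or "tomorrow").
# The projects path and tag names below are assumptions, not Flow's values.
projects=~/Documents/Projects
today="$(date +%A | tr '[:upper:]' '[:lower:]')"   # e.g. "tuesday"

# `tag --match` prints only the paths whose tags match (requires the tag CLI).
if command -v tag >/dev/null 2>&1; then
  tag --match "$today,tomorrow" "$projects"/* | while IFS= read -r item; do
    [ -n "$item" ] && ln -sf "$item" ~/Desktop/
  done
fi
```

Running something like this at login would repopulate the desktop with only the items tagged for the current day.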
  4. Right, that helps too. I should have been clearer with my phrasing, but you both answered the question and helped me make sure I don't make that mistake and accidentally change it globally. This was what I started with in my script, but as @vitor and you have now pointed out, it was redundant because it was in a script. I'll do some more digging on this topic on my own just to make sure I'm super clear on it. Thanks everyone!
  5. Is there a way to mark this question as Solved/Answered?
  6. Got it! Simply brilliant @vitor. So many users here are wonderfully helpful, this is a great community. I love this path.
  7. Hi @deanishe, sorry it took me some time to get back to this - duty called. So, from what I understand now, all my bash scripts execute in their own subshells, and therefore variables (like the IFS setting) are effectively local to that script and will not pollute the parent shell - is that correct? If I were to create a function instead of a script, however, that would run in the parent shell, correct? I just want to make sure I'm on the right path here. From what I understand, though, subshells execute more slowly and use more processes, so would you recommend I look at an alternative to bash scripts? Thank you both, @vitor and @deanishe, for taking the time to help with my bash scripting. I'm already seeing improvements, and this forum in particular has served as a great model to learn from.
  8. Thanks @vitor, very kind and succinct - exactly what I was looking for. I had no idea that an IFS setting in a script doesn't change anything outside the script; I was trying to be as safe as possible. The rest of your feedback is excellent. I'm embarrassed at missing the elif, but thankful you caught it. Also, shellcheck - I had never used it before, and it's exactly what I needed. I've picked up a lot today and I'm looking forward to implementing the changes. Have a great day/night everyone.
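For anyone following this thread later, the point about IFS being local to a script is easy to verify yourself. A minimal demonstration, using a subshell to stand in for a child script (a script runs in its own process, so the effect is the same):

```shell
# Changing IFS inside a subshell (or a separate script, which is its own
# process) does not affect the parent shell's copy of the variable.
before="$IFS"
( IFS=$'\n\t'; : "IFS is changed only inside this subshell" )
[ "$IFS" = "$before" ] && echo "parent IFS unchanged"
# prints "parent IFS unchanged"
```

The same isolation applies to `shopt` settings and `cd`: none of them leak out of a child process.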
  9. As promised, here is the working code from my workflow script, for anyone interested. I welcome any feedback. The script takes folder/file input, searches it for video files, creates lo-res proxies suitable for importing into an NLE (e.g. Premiere Pro), and organizes the hi-res and lo-res footage into their respective directories. It's not ready for the public yet, as I'll add the ability to specify the proxy width in the next version; it currently hard-codes the proxies at 640px wide.

     #!/usr/bin/env bash

     # SET PATH TO THE SHARED RESOURCES (bundled ffmpeg)
     PATH="$(./_sharedresources 'ffmpeg')"

     # SAVE AND RESET IFS (INTERNAL FIELD SEPARATOR) SO FILENAMES WITH SPACES SURVIVE
     OLDIFS="$IFS"
     IFS=$'\n\t'

     shopt -s nullglob nocaseglob

     # LOOP OVER THE PATH(S) SELECTED IN ALFRED (split on newlines by IFS)
     for f in $selection; do
         # FIND VIDEO FILES; THE PARENTHESES GROUP THE -iname TESTS SO THAT
         # -type f APPLIES TO ALL OF THEM
         videos=$(find "$f" -type f \( -iname "*.mov" -o -iname "*.mp4" -o -iname "*.m4v" -o -iname "*.mts" -o -iname "*.avi" -o -iname "*.flv" -o -iname "*.wmv" -o -iname "*.mpg" \))

         # TRANSCODE FOOTAGE TO 640px-WIDE PROXIES
         for v in $videos; do
             ffmpeg -i "$v" -b:v 600k -vf scale=640:-1 -ar 22050 -b:a 96k "${v%.*}.PROXY.${v##*.}"

             # CREATE DIRECTORIES TO ORGANIZE FOOTAGE
             mkdir -p "${v%/*}"/{lo-res,hi-res}

             # MOVE PROXIES INTO lo-res AND RENAME THEM TO MATCH THE HI-RES MASTERS
             for low in "${v%/*}"/*PROXY*; do mv "$low" "${v%/*}/lo-res"; done
             for proxy in "${v%/*}"/lo-res/*PROXY*; do mv "$proxy" "${proxy/.PROXY/}"; done

             # MOVE THE HI-RES MASTER INTO hi-res
             mv "$v" "${v%/*}/hi-res"
         done
     done

     # RESTORE $IFS
     IFS="$OLDIFS"
  10. Thank you so much everyone! I have the workflow working now, thanks to your help. I leave the details below for anyone else with a similar question.

      Firstly, I apologize for not copying the script here; I wasn't intending to debug the script (since it was working locally) so much as to understand the bundling process. In hindsight I see it would have helped, and I thank you all for your patience with my shortsightedness. Secondly, this was a great pointer for understanding how to optimize exported workflows. I've been creating different workflow processes for the better part of a year, and now that I feel comfortable with my bash scripting I definitely want to put the workflows out there publicly, but I ran into this wall. Lastly, I'm happy to post the script here once I clean up a few parts, in case anyone would like to see it; I of course welcome feedback.

      @GuiB I can confirm this was how I originally started my script, and it worked fine - as long as I was running it on my own computer - because although my script was referencing the ffmpeg binary located in the workflow folder, the ffmpeg configuration was set up to reference my own local path and not the workflow's.

      @deanishe This was what I started brainstorming late last night, and I'm grateful you confirmed it. Although the script @vitor linked to was what worked in the end, I wanted to know exactly how to sort it out manually. In my case, I downloaded ffmpeg from source and added it to the workflow folder, but because I already had it installed on my system it kept using the original "--prefix" configuration, which continued to point to my "/usr/local/Cellar..etc", and I couldn't find a way around that. I also didn't want to bloat the workflow with a huge ffmpeg build that others may already have installed, so I opted for @vitor's elegant solution, which works great since it detects whether or not the user already has the ffmpeg binary installed.

      @vitor the "_sharedresources" script did exactly what I hoped for (and more!). As I move ahead with other workflows that I wish to bundle and share publicly, would you recommend using this script for all dependencies (e.g. Homebrew's "tag" binary), or is it better suited to more complex binaries like ffmpeg and the others you reference in your script?
  11. Hello! I know I'm missing something basic here. I have a workflow that works great on my computer, and I'd like to finally export it to install on another computer. Because it's just a bash script that references the ffmpeg binary (installed in /usr/local/bin via Homebrew), I copied the ffmpeg binary into the workflow folder and got it to work - on my computer. When I install it on another user's computer, I get the error "Library not loaded: /usr/local/Cellar..etc.", which makes sense, since that user has installed neither Homebrew nor ffmpeg. What I'm looking for: is there a way to bundle everything into the workflow directory so that it runs as a standalone workflow, instead of the user having to install Homebrew and the ffmpeg binary? I've been at this for hours (days, actually) and haven't been able to find anything on this in the forums or on the web, which makes me think I'm, well, missing something. I'd appreciate any answers or links on this concept.
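The resolution that emerged in the replies above boils down to detecting a usable copy of the dependency before falling back to a bundled one. A minimal sketch of that idea, with an assumed ./bin location; note that simply copying a Homebrew-built ffmpeg will not work, because it is dynamically linked against libraries under /usr/local/Cellar, so a bundled copy needs to be a statically linked build:

```shell
# Resolve ffmpeg: prefer a binary bundled with the workflow (assumed to live
# in ./bin), otherwise fall back to whatever is on the user's PATH.
if [ -x "./bin/ffmpeg" ]; then
  ffmpeg_bin="./bin/ffmpeg"          # bundled copy (must be a static build)
elif command -v ffmpeg >/dev/null 2>&1; then
  ffmpeg_bin="$(command -v ffmpeg)"  # user's own installation
else
  ffmpeg_bin=""                      # neither found; a real helper would download it here
fi
echo "ffmpeg resolved to: ${ffmpeg_bin:-not found}"
```

This is the detect-or-fetch pattern that @vitor's _sharedresources script implements more fully, including the download step for missing binaries.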
  12. Good morning, and thank you so much, Deanishe. I'll get the hang of the forum space, I promise - sorry about that : ). That said, thanks not only for the help but for the direction. Happily, I can at least get ffmpeg recognized now that I better understand the PATH issue. I'll be spending more time with the variables and getting to understand them better. I'm really happy with Alfred and have been wanting to take it up another level for quite some time. Your resources are extremely helpful. Have a great rest of your afternoon! -MSA
  13. Hello everyone! First off, thanks for checking in on this hopefully simple question. I'm trying to write my first simple workflow that takes user input, saves the arguments as variables, and then passes them into a script to run. I've debugged it through Alfred, and all the variables are coming through fine. The problem comes when I try to substitute the variables into the script and run it. Here is the script I'm trying to run:

      ffmpeg -i {var:file-location} -b {var:v-bitrate} -r {var:frame-rate} -ar {var:sample-rate} -ab {var:a-bitrate}k {var:file-location}

      Now, as a first-timer trying to write something like this, I keep getting an error that says "ffmpeg: command not found", even though the same command (without the variables, of course) runs fine in Terminal. So I'm presuming the variables need to be referenced differently when using them in a script - is that correct? Apologies in advance if I'm way off; I've just been wanting to learn how to create this, as I'm getting quite excited about building different workflows for my needs. Any help is truly appreciated. Thanks so much! -- MSA
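For later readers, two separate fixes came out of this thread: extending PATH so the script can find Homebrew's ffmpeg, and referencing workflow variables as environment variables rather than {var:...} placeholders, which Alfred only substitutes in its own text fields, not inside scripts. A hedged sketch of the corrected script, with the variable names assumed for illustration (note the output file should also differ from the input):

```shell
# Alfred's "Run Script" shells start with a minimal PATH, so Homebrew
# binaries like ffmpeg are not found unless the script extends it:
export PATH="/usr/local/bin:$PATH"

# Workflow variables arrive in the script as environment variables; quote
# them so paths with spaces survive. These names are assumptions:
cmd=(ffmpeg -i "$file_location" -b "$v_bitrate" -r "$frame_rate"
     -ar "$sample_rate" -ab "${a_bitrate}k" "$output_file")

# Print the assembled command instead of running it, for inspection:
printf '%s\n' "${cmd[*]}"
```

Building the command in an array and printing it first is a convenient way to check the substitution before letting ffmpeg loose on real files.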