File action w/list of actions


dfay


I recently wanted to create a file action that would then display a list of tags and allow me to select one to tag the selected file.  I was hoping to model it on the Move To... , Open With... , Copy To.... actions that are built into Alfred, but with my list of tags as the options.  But I realized this isn't possible.

 

The same functionality has been requested in some other recent threads:

 

 

 

Basically, the structure would be

 

File action (with selected file) -> embedded List Filter (displayed in Alfred file actions pane, to get second parameter) -> action

 

 

Edited by dfay

Thanks

Yes, that worked well -- rather than use the multiple pathways in your example, I used a single pipeline:

 

file action (as original) -> Arg and Vars to set filesList (as original) -> list filter that passes the desired tag as arg -> Arg and Vars object to concatenate {query} {var:filesList} -> run bash script /opt/local/bin/tag -a $1

 

My OCD wants a native version but this works fine.


1 hour ago, dfay said:

spoke too soon...running into the usual pitfalls with escaping spaces in tags and filenames....

 

Well, yeah. If you connect a File Filter directly to a Run Script, the paths will be in ARGV (e.g. $1, $2 etc.) and you can process them fairly sensibly via $@.


But if you put the File Filter output into a variable, you end up with a single, TAB-separated string, just like if you try to set a workflow variable to an array.

 

You can handle that with a shell script, but it'd probably be smarter to use a language that doesn't treat whitespace as argument delimiters.
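For example, a minimal Python sketch (assuming the variable is named `filesList`, as elsewhere in this thread; the helper name is made up) that splits the tab-separated string without tripping over spaces:

```python
import os

def paths_from_env(name='filesList'):
    """Split Alfred's tab-separated file list without breaking on spaces."""
    value = os.getenv(name, '')
    # Split only on the tab delimiter; spaces inside filenames survive intact
    return [p for p in value.split('\t') if p]
```

Tabs are the only delimiter, so a path like `/tmp/a b.txt` comes through as one element.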

 

Edited by deanishe

@deanishe, yes, there are simpler ways to deal with a tab-separated string, but in this case I don't think it's too much trouble. I'm no expert in shell scripting, but I found a simple way to split a tab-delimited string into an array and expand the array into separate command arguments:

 

Split the string into an array:

IFS=$'\t' read -r -a filesListArr <<< "${filesList}"

Expand the array to use it as multiple command arguments:

./tag -a "$1" "${filesListArr[@]}"
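Put together as one runnable sketch (with a hard-coded `filesList` and `printf` standing in for the `tag` call, so it can be tried outside Alfred):

```shell
#!/bin/bash
# filesList would be set upstream by Alfred; hard-coded here for illustration
filesList=$'/tmp/with space.txt\t/tmp/plain.txt'

# IFS=$'\t' splits on tabs only, so spaces survive; -r keeps backslashes literal
IFS=$'\t' read -r -a filesListArr <<< "${filesList}"

# Each element becomes its own argument; a real script would run
# tag -a "$1" "${filesListArr[@]}" here instead of printf
printf '<%s>\n' "${filesListArr[@]}"
```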

 

@dfay, you were talking about multiple actions, so that's why I used multiple pathways, but yes, in your case it's simpler to use only one. I've made an example that should do what you want, using my solution above to turn the files list into an array.

 

Download: https://nofile.io/f/E11PVYVee0Z/Tag+Files+with+List+Of+Tags.alfredworkflow

Edited by GuiB

58 minutes ago, GuiB said:

I found a simple solution to split a string delimited by tab into an array

 

I'd argue that it's not simple because you have to look that crap up. Every. Single. Time.

 

I mean, why is it $'\t'? What's the $ for?

 

That's what I hate about shell scripting: It's so damn cryptic.
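(For what it's worth, `$'…'` is Bash's ANSI-C quoting: the `$` tells the shell to interpret backslash escapes like `\t` and `\n` inside the quotes, which plain single quotes leave literal:)

```shell
#!/bin/bash
# ANSI-C quoting: inside $'…' the shell expands \t to a real tab character
tab=$'\t'
# Plain single quotes keep the backslash and the t as two literal characters
literal='\t'

printf '%s\n' "${#tab}" "${#literal}"   # lengths: 1 and 2
```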

 

TBH, I think the sensible way to do it is to replace the first Args & Vars with a Run Script that turns ARGV into a proper JSON array and set the variable to that. Tab-delimited filepaths is a kludge, and will die in flames if someone manages to put a tab in a filename.

 

I've no idea what will happen if there's a " or a $ in the filepath. Because shell scripting languages are mental.


"if you put the File Filter output into a variable, you end up with a single, TAB-separated string, just like if you try to set a workflow variable to an array" -- did not know that...which shows how few workflows I've actually worked up since Alfred 3 came out....

 

I agree, the code is not pretty, but it is working for me at the moment with spaces in my tags & filenames.  

 

Maybe we need a feature request for Alfred to produce a JSON array rather than a tab-delimited list.

Edited by dfay

1 hour ago, dfay said:

Maybe we need a feature request for Alfred to produce a JSON array rather than a tab-delimited list.

 

Dunno. I agree that a tab-delimited string isn't ideal (what uses that format?), but JSON is no good at all in a shell script or AppleScript. Alfred's utilities also assume plain text.

 

As explained above, I'd use a script to read the individual paths from ARGV and export them as variables. If the language handling them downstream supports JSON, I'd use that, but for a shell script or AppleScript I think I'd set a sequence of variables, such as $filepath1, $filepath2, $filepath3 etc.
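A sketch of that last idea, using Alfred's `alfredworkflow` JSON output format to set one numbered variable per path from ARGV (the `filepath1`, `filepath2`… names are just the ones suggested above):

```python
import json
import sys

# One variable per path: filepath1, filepath2, ...
variables = {'filepath%d' % (i + 1): path
             for i, path in enumerate(sys.argv[1:])}

# Alfred reads the "alfredworkflow" JSON from stdout and sets the
# variables for downstream workflow objects
sys.stdout.write(json.dumps({'alfredworkflow': {'variables': variables}}))
```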

 

Edited by deanishe

Well here's what I came up with:

 

 

- File Action (unchanged)

- Write the file list to a text file (files.txt)

- List of Actions (unchanged)

- Then a Run Script:

#!/usr/bin/python

import sys
import subprocess

theFiles = open('files.txt', 'r').readlines()

for aFile in theFiles:
	args = ['/opt/local/bin/tag', '-a', sys.argv[1], aFile.replace('\n','')]
	p = subprocess.call(args)

 

In theory, writing to a file might slow things down a tiny bit, but this is a solution that I will be able to come back to in a year and immediately understand again.

 


That will work.

 

Looking at your earlier command, it appears that tag can accept more than one filepath at once. If so, it'd make a lot of sense to only call it once with all the paths instead of once per path:

#!/usr/bin/python

import sys
import subprocess

tag = sys.argv[1]

with open('files.txt') as fp:
    paths = fp.read().strip().split('\n')

cmd = ['/opt/local/bin/tag', '-a', tag] + paths
subprocess.call(cmd)

 

Edited by deanishe

10 hours ago, deanishe said:

I'd argue that it's not simple because you have to look that crap up. Every. Single. Time.

You're right that Bash scripting is often not so intuitive, but I meant that those snippets to split the string into an array or to expand a list into multiple arguments are not too hard to keep around to use again in another workflow. But you're right that the syntax is quite obscure and less intuitive than a string.split('\t').

 

@dfay, your method of writing to a file and parsing it by splitting its lines into an array isn't so different from building a newline-delimited string of file paths and setting an Alfred variable to it (kept in memory instead of going through a file). I've done that in my updated workflow to give you examples in Bash and Python.

 

Also, I've made a third example to play around with the approach @deanishe suggested, using a JSON string.

 

In short (as reference if you prefer not to download the example workflow):

- Put a File Action node

- Connect it to a Script node to create the JSON string from the ARGV:

#!/usr/bin/python
import sys, json

data = []
for i, filePath in enumerate(sys.argv[1:]):
    data.append({ 'filePath'+str(i) : filePath })

sys.stdout.write(json.dumps(data))

- Set a 'filesList' variable from the output with an "Arg and Vars" node, as in my other example

- Then a List Filter node to show the Tags list

- Then a last Script node to process the files using:

#!/usr/bin/python

import sys, subprocess, os, json

tag = sys.argv[1]
pathsJSON = json.loads(os.getenv("filesList",""))

paths = []
for items in pathsJSON:
    for key, value in items.iteritems():
        paths.append(value)

cmd = ['./tag', '-a', tag] + paths
subprocess.call(cmd)

 

Both ways seem to work fine! (file paths delimited by newlines, or built into a JSON string)

 

Here is the workflow to have an example to play with: https://nofile.io/f/mnNgmh60OB2/Tag+Files+with+List+Of+Tags.alfredworkflow

 

 


@GuiB There's no need to use an object in your first script. Just use an array.

import json
import sys

json.dump(sys.argv[1:], sys.stdout)

And in the downstream script, to read it back:

import json
import os
import subprocess
import sys

cmd = ['tag', '-a', sys.argv[1]] + json.loads(os.getenv('filesList'))
subprocess.call(cmd)

If you do have a mapping, and don't need the keys:

paths = data.values()

 


@deanishe, thanks for your input! I like it, short and sweet! ;)

 

I've updated my workflow with your suggestions and simplified the Python scripts in the version that uses a newline-delimited string.

 

For reference here in the forum:

 

To convert the files list into a newline delimited string:

- Connect a Script node to the File Action node with the language set to "Python"

import sys

sys.stdout.write("\n".join(sys.argv[1:]))

- Then set a variable from the output using an Arg and Vars node (the variable is named "filesList" in this example)

- Then the List Filter node to specify the action to perform

- Then reuse the newline-delimited string in another Python Script node to do the tagging:

import sys, subprocess, os

cmd = ['./tag', '-a', sys.argv[1]] + os.getenv('filesList').split("\n")
subprocess.call(cmd)

 

The updated workflow: https://nofile.io/f/S05SkQpvGdk/Tag+Files+with+List+Of+Tags.alfredworkflow

Edited by GuiB

14 hours ago, GuiB said:

 

@dfay, your method of writing to a file and parsing it by splitting its lines into an array isn't so different from building a newline-delimited string of file paths and setting an Alfred variable to it (kept in memory instead of going through a file).

 

I'd bet that Alfred dumping to a file is faster than starting python and loading two modules....


56 minutes ago, dfay said:

I'd bet that Alfred dumping to a file is faster than starting python and loading two modules....

 

On an SSD, definitely. Of course, you have to open and read it again, too…

 

I like the file method. Alfred writes the filepaths one per line, which is much cleaner than a tab-separated string. As long as the element that reads the file is always downstream from one that writes it, you can't get stale data.

 

Personally, I'd probably still use the JSON method simply because the paths are transient data, but in the above examples I completely ignored encoding. JSON strings are unicode, and you have to encode them to UTF-8 before passing them to subprocess.call() if you don't want it to die in flames when given a non-ASCII path:

# ASCII-only
cmd = ['tag', '-a', sys.argv[1]] + json.loads(os.getenv('filesList'))

# Works with forrin stuff
cmd = ['tag', '-a', sys.argv[1]] + [s.encode('utf-8') for s in json.loads(os.getenv('filesList'))]

 

Edited by deanishe
Add example code

1 hour ago, dfay said:

I'd bet that Alfred dumping to a file is faster than starting python and loading two modules....

 

Look inside the workflow for the solution using Bash. This should be faster to load and execute. You're right that Python could be slower to start and load modules, but I often prefer to work in memory instead of manipulating files.

 

If you want a faster script (not sure how much faster it is, if at all; we'd need to do a speed test...), then maybe you're better off staying with Bash...

 

For your Script node connected to the File Action and using the Bash language (convert arguments to a newline delimited string and output):

IFS=$'\n'
echo "${*}"

- Then, same as before, the Arg and Vars (that sets the 'filesList' variable) and the List Filter

- Then, connect to another Script node using Bash to process the files:

IFS=$'\n' filesListArr=($filesList)
./tag -a "$1" "${filesListArr[@]}"

 

Updated workflow with this code for the newline delimited string method (if it's useful to someone):

https://nofile.io/f/jWPWou9CT1D/Tag+Files+with+List+Of+Tags.alfredworkflow


3 minutes ago, dfay said:

I'm sure the time we have committed to this far outweighs the savings for file vs. array vs. json, python vs. bash, etc.

 

Still, I think this is something that could be useful for many kinds of workflows (it's really just about keeping the files list in a consistent form that can be used later in the workflow). So I think it's great to have one or more solutions that we know work, and a place to look back at for reference. :)


  • 9 months later...

@dfay Thanks for sharing the link to the command line tool! 

 

I tried adapting the JSON approach in the workflow above: (1) removing the list filter, and (2) updating the second python script. But I still haven't had any luck.

 

As modified, the workflow looks something like this:

[screenshot: the modified workflow]

 

And, the second python script now reads as follows:

import json, os, subprocess, sys

cmd = ['./tag', '-r \*', json.loads(os.getenv('filesList'))
subprocess.call(cmd)

Any idea what I might have overlooked? I'm a newbie, and assume it's something in the Python script.

 

To make things easier, here's a link to a workflow containing the original process, and the new one outlined above: 

Thanks for your help!

 

Edited by Jasondm007
typo & added link
