About h2ner

h2ner last won the day on September 8 2018
  1. https://github.com/gennaios/alfred-jnana/releases: a port of Gnosis to Go. Setup code for creating the database, etc. is not included, but if you've run Gnosis at least once, move the db to the workflow folder as jnana.db and it should work. I reread your variable docs, deanishe, and figured it out. I didn't have a step at the end that deleted the variable; I had thought that if it wasn't exportable, it would automatically disappear and not show up in the variable list. Perhaps it's there in the wording somewhere. Having added the step of saving a variable through AppleScript, there seems to be a noticeable delay, which may come from the step of retrieving the variable. Where I need a variable, for filtering items according to the open file, that SQLite query is more complex and seems to take at least 20 ms more; it could be both combined that make querying items for the currently open file seem slightly slower than an FTS query on the whole database. It seems Arg and Vars is then not appropriate for use before a Script Filter action? Arg and Vars seems to automatically pass the variable as query or argv to the following Script Filter. Or maybe there's another way to set it up. As the workflow is in general pretty fast in Go, any slight delay is starting to be noticeable.
  2. Hello Andrew, when I tried an Arg and Vars step after Run Script, which as you suggested sets an environment variable from {query}, in the next step, a Script Filter, {query} and argv are also set to the value returned from the Run Script. Perhaps I'm missing something? The Script Filter should run with argv as whatever is typed into the Script Filter, with the variable additionally available. So far, I'm only able to get that with the above-mentioned AppleScript within the Run Script. I make use of the variable only during execution of the workflow (no need for it to be saved afterwards), and am thus not sure if there's another way, or if some alternate code is needed within the Script Filter. I don't remember if it was mentioned in the docs: are variables set permanently, or is there a way to set variables that are available only during script execution and thus aren't saved to the plist? If they are permanent, perhaps there's some step I can add at the end that will remove the variable from the environment.
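In the Script Filter script itself, the two channels are separate: the typed query arrives as argv, while any upstream workflow variable arrives as an environment variable. A minimal sketch (the variable name `open_file` is a hypothetical example, not from the workflow):

```python
import json
import os
import sys

# argv carries whatever the user has typed into the Script Filter.
query = sys.argv[1] if len(sys.argv) > 1 else ""

# Upstream variables are delivered via the environment, not argv.
open_file = os.environ.get("open_file", "")

items = [{
    "title": f"Search: {query}",
    "subtitle": f"Filtering by open file: {open_file or '(none)'}",
    "arg": query,
}]
print(json.dumps({"items": items}))
```

If both argv and the variable show the same value, the Arg and Vars step's Argument field is likely passing the variable through as the new {query} rather than leaving it empty.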
  3. Hello, I was looking at my workflow and eventually figured out why one part is slow. I have a Script Filter that first finds the frontmost app and then gets the open file. I realized that runs on every execution of the Script Filter when it doesn't need to, so I separated it into an intermediary Run Script action. How do I pass that file path along as a variable? I'm not sure if Arg and Vars can be used in this case, with the variable then accessed from the next step, an AppleScript Script Filter. Alternatively, I can save the file path as an environment variable using:

```applescript
set bundleID to (system attribute "alfred_workflow_bundleid")
tell application "Alfred 3"
	set configuration "…" to value quoted form of theFile in workflow bundleID with exportable
end tell
```

but then the variable becomes a permanent setting, saved with the workflow plist. That's unneeded, plus if I didn't want it to always show up as a git change, I'd have to add some filter to ignore that line. Is what is set by Arg and Vars accessible by later AppleScript scripts and Script Filters?
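A session-only alternative, assuming the Run Script action is set to output JSON (supported in recent Alfred 3 releases): emit an object whose `variables` dictionary is exposed to downstream steps as environment variables without ever being written to the workflow plist. A sketch, where the variable name `open_file` and the path are hypothetical:

```python
import json
import sys

def emit(arg, **variables):
    """Print Alfred's Run Script JSON envelope so downstream steps
    receive `variables` as environment variables (session-only,
    never saved to the workflow's plist)."""
    sys.stdout.write(json.dumps({
        "alfredworkflow": {"arg": arg, "variables": variables}
    }))

# Hypothetical: the path of the document open in the frontmost app,
# determined earlier in this Run Script step.
emit("", open_file="/path/to/current.epub")
```

This avoids both the AppleScript round trip and the git-diff noise, since nothing persists after the workflow run.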
  4. I phrased that poorly. Closer to what I was originally thinking: trying to get an idea of the lowest potential total execution time of a script filter. I haven't looked much at the AppleScript in a while but will do. Launching a minimal Python script without imports that returns one result: I'm unsure what the overhead of launching Python is, whether Alfred needs to create a shell environment first, or if there's anything else. Perhaps all of that doesn't add much, and the potential of a script filter to execute and return results, in Python, might be in the dozens of milliseconds? It helps to get an idea: if, say, my script execution time could be reduced by 100 ms, that is perceivable; if it's just 20 ms or so, what percentage of total time might that be? A rough estimate would be nice to know, though now that I've started to port it to Go, it's less important. Still, over time, having an idea of potential speed and trying to reach it is a nice aim. Recently I tried to separate out Peewee's imports of its apsw module and Postgres extensions. The problem was that the main Peewee module imports all drivers if available, rather than on class instantiation, and I had an issue with it breaking a table field type that is extended in its apsw-based SQLite module but seemed to be imported from main Peewee. I reverted since it didn't make any speed difference. Now that I'm more comfortable with SQL, Python, etc., I may try to remove Peewee, though now with Go, who knows. Only in recent weeks have I started to seriously look into SQLite, and I'm pretty happy with it, so I may remove the other code and drivers. Will wait on that and concentrate on Go. As far as SQLite, it seems the load time of the apsw driver itself plus running a query, if I remember correctly, might be around 40 ms. Timing the query within the sqlite3 shell, which rounds to tens of ms, it's about 10 ms for a query that returns one result. Maybe there are compile options for the driver.
All these details are perhaps less important now, though I may try some; future efforts will go toward porting to Go. I'd also like to know SQLite options better, such as pragma statements that might affect queries returning many results, and there was a mention on the sqlite-users mailing list of someone writing a ranking algorithm that incorporates a fair amount of Sphinx's SPH04. Little details I may look into over time. Thanks for the AppleScript tip. As for the error, on a new setup of Alfred with a different macOS user, I was unable to reproduce it. It seems like an AppleScript error; I'll keep looking into it and thinking about what it might be. Of note, and seemingly unrelated: a recent version of calibre (within the last ~2 years) is needed for EPUB use. I had asked, specifically for this workflow, and a parameter was added to the ebook-viewer command line to open a file at a TOC entry by title. Not ideal, since titles might mismatch, but it's a start. I'm unaware of any other EPUB viewer with anything similar or with AppleScript support.
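One rough way to separate query time from startup and import overhead, along the lines discussed above (a sketch with an in-memory stand-in database; a real measurement would open the workflow's own jnana.db, and the shell's `time python3 script.py` minus the in-process figure approximates interpreter startup plus import cost):

```python
import sqlite3
import time

# In-memory stand-in for the workflow database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (title TEXT)")
con.executemany("INSERT INTO docs VALUES (?)",
                [(f"entry {i}",) for i in range(10000)])

# Time only the query, excluding interpreter startup and imports.
start = time.perf_counter()
rows = con.execute(
    "SELECT title FROM docs WHERE title LIKE ? LIMIT 1", ("%9999%",)
).fetchall()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"query: {elapsed_ms:.2f} ms, {len(rows)} row(s)")
```

Comparing this number against the wall-clock time of the whole script gives a sense of how much of the ~40 ms driver figure is load time versus query time.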
  5. Indeed, removing Peewee is a source of more speed. At this point the workflow may be in maintenance mode, with fixes, new features, etc., but no larger recoding efforts. As I've started porting it to Go, at least the most-used part that would benefit most from speed, I use both in combination and will slowly convert the rest to Go as I learn the language, available modules, etc. As for what's taking up the ~150 ms for one script filter, I was curious how such a time compares to the total round-trip time of a Python script filter. As is:
- The full-text relevancy-ranked search is done on an SQLite FTS5 table, sorted by relevancy across 3 columns; the DB is at ~1.2 million rows. This should be a few dozen ms at most. I read some notes the other day about FTS query syntax that would produce the same results but might run slightly faster, and various pragma statements might make a small difference.
- Loading three DB drivers (apsw for SQLite, psycopg2 for PostgreSQL, pymysql for Sphinx) seems to take about two-thirds of execution time: ~100 ms of the total ~150. I wasn't too familiar with SQLite when I began, so I used PostgreSQL/Sphinx. I could remove the code for optionally using a different backend, which should reduce load time, but I'll likely leave it in for those who want it. It was only recently, in getting the workflow mostly good enough for sharing, that I looked into how to configure SQLite; possibly more work could be done there. Overall, Peewee could probably load and parse modules in under 50 ms if just using apsw for SQLite. Total time includes these steps:
- Init the db; create tables if needed; create triggers, if they don't exist, for FTS table population; run SQLite pragma statements. Unsure what the deal is; all of that seems to go quickly.
- Read the cached query string, so bringing up the script filter populates the text field and results with the previous search. The query string is saved on each execution of the script filter; I'm unsure if there's a way to save it only after one has pressed return or selected a filter result. This step, at least the file write, could probably be threaded.
- Image thumbnails are used, if they exist, for filter results. For each result that goes into the filter (searching indexed ebook TOC entries to open a book to a section), a check for an existing thumbnail is done and the icon type is set. Search is limited to 100 results, so 100 checks are done; maybe that doesn't take much time. As the same file can occur multiple times in the results (different sections of the same book), maybe a map could be used to set the icon path once per file. I'm not too strong with such tasks but will look into it; unsure whether doing that before creating workflow items would be faster or slower.
In general, I'm unsure how fast all of that could run in optimal conditions. Under 60 ms? I'm also unsure of the additional overhead of Alfred initiating the script filter, running AppleScript before and after, and parsing and displaying the results. The part I most wanted to be faster was remade in Go; as such, I'll perhaps mostly devote effort there. As it seems mostly ready for others to start using, updated:
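The relevancy-ranked search plus the per-file icon check described above can be sketched like this (assuming a Python build whose SQLite has FTS5 compiled in; table and column names are hypothetical). Memoizing the thumbnail lookup per file means each book's thumbnail is checked once, even if the book appears many times in the 100 results:

```python
import json
import os
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE VIRTUAL TABLE entries USING fts5(file, section, body);
    INSERT INTO entries VALUES
        ('a.epub', 'Intro', 'search engines and ranking'),
        ('a.epub', 'Ch 1',  'full text search basics'),
        ('b.epub', 'Ch 2',  'relevancy ranked search');
""")

# bm25() is FTS5's built-in ranking function; lower is more relevant.
rows = con.execute(
    "SELECT file, section FROM entries WHERE entries MATCH ? "
    "ORDER BY bm25(entries) LIMIT 100", ("search",)
).fetchall()

icon_cache = {}  # one filesystem check per file, not per result

def icon_for(path):
    """Return a cached thumbnail path, falling back to the default icon."""
    if path not in icon_cache:
        thumb = os.path.splitext(path)[0] + ".png"  # hypothetical layout
        icon_cache[path] = thumb if os.path.exists(thumb) else "icon.png"
    return icon_cache[path]

items = [{"title": sec, "subtitle": f, "arg": f,
          "icon": {"path": icon_for(f)}}
         for f, sec in rows]
print(json.dumps({"items": items}))
```

With 100 results drawn from only a handful of books, the cache turns 100 `exists` calls into a few, though as noted the absolute savings may be small.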
  6. Thank you, deanishe. I started my one workflow long ago, when my programming abilities were less than OK. They're perhaps only slightly better now. Working on it here and there over time, I hadn't looked at it much since the beginning. I recently tried to make some improvements and additions to get it close to sharing. As mentioned elsewhere, I used Peewee for ease of use at the start, and to reduce code duplication when configuring different db backends. It takes about 0.09 seconds to load, most of which seems to be the loading of two drivers. Now, after having read up on SQLite, I'm using only that, though I haven't removed the rest of the code. A full-text search over a fairly large database takes about 150 ms total, from script start to returning workflow items. I tried to optimize it and it's maybe as close as it can get; only so much can be done, it seems. Removing Peewee might gain me half that time, unsure. You encouraged me to try Go, and as I was getting back to my workflow, I've been looking into it in recent weeks. The main part of the workflow for which I'm concerned about speed is a script filter doing a full-text search over the entire db. Like with a search engine, I might refine the search terms to narrow down and explore results. Since your reply, I've tried to get just that most-used part working, and got it going earlier today. So nice: ~8–50 ms. And thank you for the wonderful alfred-workflow and awgo.
  7. It seems some here might have an idea. Does anyone have a rough estimate of how long it takes, on a modern system, for a script filter to run a minimal Python script and return results to Alfred? Perhaps somewhere in the hundreds of milliseconds? I'm wondering how much I should try to optimize my workflow; e.g., a reduction in execution time of 0.05 seconds might be perceivable but require too much effort.
  8. Sadly, no AppleScript support. Nor in some other similar app whose name I forget. Any other similar apps? Would be great.
  9. Terrific. I'm used to apps like the IntelliJ IDEs, where one can access any app action and any preference through search. I've seen very few apps offer that; Sublime Text is decent at it. Hope such a thing comes someday.
  10. I'm not sure if there have been any changes or additional features since my first post that might help. To better explain what I'm hoping for: I have a script filter that searches a database and returns results. Relevant results could be at least several dozen and sometimes a hundred or more. Just like with a search engine, I tend to look at several to a dozen or two of the most relevant entries, perhaps refining the search and continuing. Ideally, a script filter could persist its state so that the next time I run it, it doesn't requery the database but is reinvoked with the same query text filled in (in case I wish to edit it), the same filter results list, and the same cursor and scroll position, so that, for example, I could easily scroll down one and choose the next result if there are, say, 100. At some point, the ability to call Alfred from an external script via AppleScript was added. With persistent variables, can I perhaps save the query text as a variable and then have an action before my script filter that calls it with the previously saved query text? My preference is to use the External Trigger mode bound to a hotkey, as, like with a search engine, I may invoke it repeatedly to look at dozens of results. Is it worth adding a toggle so that script filters can persist state across successive invocations? I imagine others may wish to do something similar: use a script filter repeatedly to examine numerous results while refining the search.
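One workaround under current features, sketched below (the cache file name is an assumption; `alfred_workflow_cache` is the directory Alfred exposes to workflow scripts, with a fallback for running outside Alfred): persist the last query to a small cache file on each run of the script filter, and have the hotkey-bound entry point read it back and pass it as the External Trigger's argument so the filter reopens prefilled.

```python
import os

# Alfred provides a per-workflow cache directory via this variable;
# fall back to the current directory when run outside Alfred.
CACHE_DIR = os.environ.get("alfred_workflow_cache", ".")
CACHE_FILE = os.path.join(CACHE_DIR, "last_query.txt")

def save_query(query):
    """Record the query so the next invocation can prefill it."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        f.write(query)

def load_query():
    """Return the previously saved query, or '' on first run."""
    try:
        with open(CACHE_FILE, encoding="utf-8") as f:
            return f.read()
    except FileNotFoundError:
        return ""
```

This restores the query text but not the cursor or scroll position, which would still need support on Alfred's side.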
  11. I realize removing Peewee would give me great gains. If it's something like 0.15–0.2+ seconds to return results for a script filter, there is a visible delay. I'm pretty novice at programming, and my workflow was meant to work out of the box with SQLite but be usable with other databases such as PostgreSQL. It does full-text searches with relevancy ranking over two full-text indices in two tables; after enough records, PostgreSQL FTS became slow, so I added Sphinx. Maybe I could figure it out without an ORM, though there'd be more code, more to maintain, and so forth. I'm not sure I'm ready to do that yet, and maybe some lightweight ORM-like library in Go, though indeed more complicated as you say, may work better. There may be other reasons to switch too. My workflow works well enough for me as is, though performance certainly could be better, and there's a good amount left to do before release, whenever that happens.
  12. Indeed, the SQLite module would be much more responsive. A few years ago I tried to optimize it as much as I could using Peewee and still considered it a bit slow. I've gotten used to it since then, but now revisiting it, maybe an eventual port to Go would be worth it. I'd possibly still use some ORM or ORM-ish library, though it should be better.
  13. The main workflow I use, and perhaps the only one I've spent any significant time on, uses Peewee. I agree native SQLite support might not be worth adding. It's configured to work on install using SQLite, but optionally with PostgreSQL and Sphinx for full-text search. Currently, searching 1 million+ records with Sphinx and timing at the terminal with 'time', a script filter using your alfred-workflow returns results in ~0.15 seconds. I'm a novice at Python, haven't profiled it in a while, haven't done any dev on it in a while, just got back to it, and am unsure whether it's function calls, Peewee initialization, db driver overhead, etc. that make up that time. I'm now looking at porting it to Go for more speed. It's not too bad, but certainly faster is better. In a quick comparison of various ORMs, I found Peewee easier to use and not too bad for writing code that supports various db backends.
  14. Agree with deanishe and was thinking the same: cmd-return to open in Contacts. I didn't know of cmd-O until I found this thread. As the modifiers control and option offer other options, I think cmd is a good choice for perhaps the most often wanted other action.
  15. Perhaps there hasn't been much consideration. Labels/tags are likely to become more important and more widely used; see how they occupy a primary place in Files for iOS. If they were displayed in Alfred, a bit like in Gmail for iOS, right-aligned in a colored box with the label name on the same line as the file path, I don't think that'd be too intrusive.