Everything posted by Luke

  1. I realized that I had a lot of files in my ~/Documents directory, especially since installing Parallels. So I made sure all my projects were in ~/Documents/Projects, then limited find's depth and restricted the match type to directories. Much, much faster! (A sketch of the trimmed-down command follows this list.)
  2. I have a script filter that looks like this:

         #!/bin/bash
         echo "<?xml version=\"1.0\"?>"
         echo "<items>"
         find ~/Documents -name '.git' -path '*{query}*' -exec dirname {} \; | while read -r file
         do
             base=$(basename "$file")
             echo "<item type=\"file\" arg=\"${file}\" uuid=\"${file}\" autocomplete=\"${base}\">"
             echo "<title>${base}</title>"
             echo "<subtitle>${file}</subtitle>"
             echo "<icon type=\"fileicon\">${file}</icon>"
             echo "</item>"
         done
         echo "</items>"

     The purpose of this script is to search all of my projects so that I can open them in Sublime; it looks for .git folders to find them. This works great, but it can be a bit slow because it fires on every keystroke: the results either show up immediately or take 1-3 seconds to appear, most likely because I'm calling find over and over. The results don't change that often, so I figure caching them would help a great deal. Are there any existing solutions for this, or do I have to roll my own? If the latter, is there any way to create a filter that makes an HTTP request? I was thinking I could write a daemon in Go that keeps a cache of the file listing and periodically updates itself. (A rough caching sketch also follows this list.)
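For the speedup described in the first post, the narrowed find call would look something like the sketch below. The -maxdepth value is an assumption: it supposes each repository sits directly under ~/Documents/Projects, so its .git folder is two levels down; adjust it to match the actual layout.

    # Search only the Projects folder, stop descending after the .git level,
    # and match directories only (assumes ~/Documents/Projects/<repo>/.git).
    find ~/Documents/Projects -maxdepth 2 -type d -name '.git' -path '*{query}*' -exec dirname {} \;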
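For the caching question in the second post, one roll-your-own approach is a timestamp-checked cache file, sketched below in the same script-filter style. The cache path and the 300-second refresh interval are arbitrary choices for illustration, not an existing Alfred feature: the slow find runs at most once per interval, and each keystroke only greps the cached listing.

    #!/bin/bash
    # Hypothetical cache location and refresh interval; tune both to taste.
    cache="$HOME/.cache/project-list"
    max_age=300
    mkdir -p "$(dirname "$cache")"

    # Rebuild the cache if it is missing or older than max_age seconds.
    # (stat -f %m is the macOS form for a file's modification time.)
    if [ ! -f "$cache" ] || [ $(( $(date +%s) - $(stat -f %m "$cache") )) -gt "$max_age" ]; then
        find ~/Documents -name '.git' -exec dirname {} \; > "$cache"
    fi

    echo "<?xml version=\"1.0\"?>"
    echo "<items>"
    # Filtering the cached listing is what keeps each keystroke cheap.
    grep -i -- '{query}' "$cache" | while read -r file
    do
        base=$(basename "$file")
        echo "<item type=\"file\" arg=\"${file}\" uuid=\"${file}\" autocomplete=\"${base}\">"
        echo "<title>${base}</title>"
        echo "<subtitle>${file}</subtitle>"
        echo "<icon type=\"fileicon\">${file}</icon>"
        echo "</item>"
    done
    echo "</items>"

This gets the same slow-rebuild, fast-lookup split the Go daemon idea aims at without a separate process; a daemon would still win if the rebuild itself becomes too slow to run inline.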