danluba

Member · Posts: 6

danluba's Achievements

Helping Hand (3/5) · Reputation: 0

  1. @giovanni Thanks for this. It did help. I've managed to quit the main script and leave the background one running, downloading the images. The only problem I have left, and it's a significant one, is that although the script is rerun every second, it doesn't seem to re-render the results, so the images don't show up until Alfred is dismissed and the same search is performed again. Not much use. Any ideas? I'm wondering if it's something to do with the way I'm setting and passing in my environment variables, which I'm using to cache the API response so that I don't have to keep hitting that up. Anyway, I'll keep working on it (the rerun-plus-variables pattern I mean is sketched after this list). Also, interestingly, the process backgrounding seems to work just as well without the nohup command as it does with it. Cutting off the buffers alone seems to do the trick.
  2. @giovanni That would be awesome. Thank you.
  3. @giovanni I'm struggling. I looked at that script. There's a lot of new stuff in there for me. I have a basic idea of what I'm trying to do, but all of my attempts are failing, and I can't really understand why @deanishe is doing the things he's doing. For example, he talks about nohupping (one of the new things for me), and I genned up on what that is, but I can't see where he does it in that script. It's making my head spin. Here's the money shot in the rough code I've written so far, just to try to get my head around things:

         pid = os.fork()
         if pid > 0:
             # return results to alfred on main process
             sys.stdout.write(json.dumps({
                 'rerun': 1,
                 'variables': {'rerun': 'True', 'data': json.dumps(make_result_set(data))},
                 'items': make_result_set(data)
             }))
             sys.exit(0)
         else:
             if sys.argv[3] == 'False':
                 subprocess.Popen(
                     ['nohup', '/usr/bin/python3',
                      '/Users/danluba/PycharmProjects/AlfredMusicSearch/images.py', "&",
                      json.dumps([d['img'] for d in data if d['img'] is not None]),
                      "Bandcamp"],
                     stdin=None, stdout=None, stderr=None
                 )

     That was me trying to nohup my subprocess and disconnect from all my buffers, but I guess it's not working, because the images are not downloading (a cleaned-up version of this backgrounding call is sketched after this list). If there are any pointers that could be given to me, I would be very grateful to receive them.
  4. @giovanni Thanks for your response. Will using rerun actually cause my script to exit while the images download? My understanding was that rerun would still let the script wait for all of its processes to finish, and would then run the script again after the specified time. I can't find anything in either of those links that tells me it will make the script exit. Incidentally, were there supposed to be three links in your reply? Only two actually got linked.
  5. Hi. I'm currently writing a workflow that searches music stores for a query and returns a list of results you can select from to open the corresponding URL. I would like to include the cover art in these results. I have that working now, but it's taking five to ten seconds to download all of the images. My script is written in Python 3.8, and I have the downloads running concurrently using asyncio; my problem is that Alfred waits for all of my subprocesses to finish before it exits the main script. I've spent the best part of the day (or the worst part of the day, depending on how you look at it) trying to get Alfred to exit the script, leave the download threads doing their thing in the background, and display the results, sans images, so that I can have the script rerun periodically and bind to the downloaded images thereafter (a rough version of such a background downloader is sketched after this list). Can someone explain to me how this can be done? How can I get my main script to quit, leaving the background processes to download the images on the sly? Probably the safest thing to do is explain it to me like I'm five years old. Many thanks in advance. Dan
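
For reference, a minimal sketch of the rerun-plus-variables pattern discussed above: the Script Filter stashes the API response in a workflow variable so reruns don't hit the store API again, asks Alfred to re-invoke it every second, and attaches each cover image only once its file has appeared in a cache directory. The cache path and the fetch_results/make_result_set helpers are illustrative stand-ins, not taken from the actual workflow.

    import json
    import os
    import sys

    CACHE_DIR = os.path.expanduser("~/Library/Caches/alfred-music-search")

    def fetch_results(query):
        # Stand-in for the real store API call; returns the fields the items below need.
        return [{"id": "demo", "title": "Result for " + query,
                 "url": "https://example.com", "img": None}]

    def make_result_set(data):
        items = []
        for d in data:
            item = {"title": d["title"], "arg": d["url"]}
            icon = os.path.join(CACHE_DIR, d["id"] + ".jpg")
            if os.path.exists(icon):      # shows up once the background download lands
                item["icon"] = {"path": icon}
            items.append(item)
        return items

    def main():
        query = sys.argv[1] if len(sys.argv) > 1 else ""
        cached = os.environ.get("data")   # set via "variables" on the previous run
        data = json.loads(cached) if cached else fetch_results(query)
        sys.stdout.write(json.dumps({
            "rerun": 1,                   # ask Alfred to run the filter again after 1s
            "variables": {"data": json.dumps(data)},
            "items": make_result_set(data),
        }))

    if __name__ == "__main__":
        main()

On the first run the environment variable is unset, so the API is queried once; on every rerun the cached JSON is reused and only the icon paths change as the image files arrive.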
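Next, a sketch of how the downloader could be started without keeping Alfred waiting, as an alternative to the fork/nohup attempt quoted in post 3. The script path and arguments are placeholders. The parts that matter are pointing the child's standard streams at DEVNULL (the default of None makes the child inherit this script's stdout/stderr, so Alfred keeps waiting until those pipes close) and start_new_session=True, which detaches the child much as nohup would; the literal "&" does nothing here because Popen does not go through a shell.

    import json
    import subprocess

    def spawn_downloader(image_urls, store_name):
        # Kick off the background downloader and return immediately.
        subprocess.Popen(
            ["/usr/bin/python3", "images.py",      # placeholder path to the downloader
             json.dumps(image_urls), store_name],
            stdin=subprocess.DEVNULL,              # don't inherit Alfred's pipes
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            start_new_session=True,                # detach; roughly what nohup was for
        )

    if __name__ == "__main__":
        spawn_downloader(["https://example.com/cover.jpg"], "Bandcamp")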
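Finally, a rough stand-in for the background images.py described in the original post: it downloads all the covers concurrently and writes them into the cache directory the Script Filter checks on its next rerun. It uses urllib in a thread pool via run_in_executor, so it runs on Python 3.8 with no third-party packages; the cache location and the hash-based filenames are assumptions.

    import asyncio
    import hashlib
    import json
    import os
    import sys
    import urllib.request

    CACHE_DIR = os.path.expanduser("~/Library/Caches/alfred-music-search")

    def fetch(url):
        # Download one cover into the cache, keyed by a hash of its URL.
        os.makedirs(CACHE_DIR, exist_ok=True)
        path = os.path.join(CACHE_DIR, hashlib.md5(url.encode()).hexdigest() + ".jpg")
        if not os.path.exists(path):
            urllib.request.urlretrieve(url, path)
        return path

    async def download_all(urls):
        # One thread-pool task per image; gather waits for all of them here,
        # but by this point Alfred is no longer waiting on this process.
        loop = asyncio.get_running_loop()
        await asyncio.gather(*(loop.run_in_executor(None, fetch, u) for u in urls))

    if __name__ == "__main__":
        urls = json.loads(sys.argv[1]) if len(sys.argv) > 1 else []
        asyncio.run(download_all(urls))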