
Joshua Lian

  1. Thanks for the help. A Launch Agent solves my problem. In my case I used neither feedparser nor speedparser, because I have to fetch the pages linked from the RSS items and extract some extra information. So I used beautifulsoup4 together with multi-threaded processing to speed things up, and I also added incremental updates to avoid fetching the same pages multiple times. Since my RSS feed only gets about 10 new posts per day on average, the current workflow works perfectly, and I think performance will be fine as long as there is no burst of new items in the feed.
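The fetch step described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the `update` function, the `seen` set, and the titles-only extraction are assumptions, and to keep the sketch self-contained it parses with the stdlib `html.parser` where the original uses beautifulsoup4.

```python
# Sketch: fetch pages linked from the feed in parallel (multi-threaded),
# skipping URLs already processed (incremental update).
import concurrent.futures
import urllib.request
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collect the contents of the page's <title> tag."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def extract_title(html):
    # Stand-in for the real beautifulsoup4 extraction.
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()


def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def update(urls, seen, fetch=fetch):
    """Fetch only unseen URLs, in parallel, and return {url: title}."""
    new = [u for u in urls if u not in seen]  # incremental: skip cached pages
    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        titles = dict(zip(new, pool.map(lambda u: extract_title(fetch(u)), new)))
    seen.update(new)
    return titles
```

For testing, `fetch` can be stubbed with a function that returns canned HTML, so no network access is needed.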
  2. I am using the Python package "Alfred-Workflow" to write a workflow that manages downloads: it first reads from an RSS source, and then I can send items to aria2c to download. However, when there are many items in the feed, updating takes some time, so I need to cache the results for 20 minutes. I know I can run the update process with "run_in_background", but to start it I still need to trigger the workflow at least once. It would be nice if the update could run in the background every 20 minutes automatically, without being triggered manually. Reading the tutorial, I found the Delay utility block: if I wrap an action with a self-call plus a delay, the whole procedure can be viewed as a scheduled task. However, I haven't found anyone else using this approach, and I wonder whether it has any performance drawbacks.
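As the reply above notes, a macOS Launch Agent gives a truly unattended 20-minute refresh: launchd runs the update script on a fixed interval, independent of Alfred. A minimal sketch of such a plist follows; the label and script path are placeholders, not the actual workflow's values.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique label; hypothetical reverse-DNS name -->
    <key>Label</key>
    <string>com.example.rss-update</string>
    <!-- Command to run: the workflow's update script (placeholder path) -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/python</string>
        <string>/path/to/workflow/update.py</string>
    </array>
    <!-- Run every 20 minutes (1200 seconds) -->
    <key>StartInterval</key>
    <integer>1200</integer>
</dict>
</plist>
```

Saved under `~/Library/LaunchAgents/` and loaded with `launchctl load`, this keeps the cache warm so the workflow itself only ever reads fresh cached results.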