
TumblrBackup

Backup any Tumblr site via Alfred

 

This is a quick workflow that I use on a day-to-day basis. I figured someone else might want to use it too.

 

This workflow allows the user to enter the URL of any Tumblr site (formatted http://x.tumblr.com/); Alfred will then back up all of that site's data to a folder on the user's machine. I find it most useful when I need to mass-download my previous uploads or save my entire Tumblr site locally.
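
Under the hood the workflow drives a copy of wget (it shows up in the crash report later in this thread). A rough command-line equivalent, assuming a site URL and a destination folder of your choosing (both values below are made-up placeholders), would look something like this:

# Mirror a Tumblr site into a local folder (a sketch; the workflow's actual flags may differ)
SITE="http://example.tumblr.com/"
DEST="$HOME/Downloads/tumblr-backup"
wget --mirror --page-requisites --convert-links --no-parent \
     --directory-prefix="$DEST" "$SITE"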

 


 

Small workflow, but useful all the same.

Download HERE


Wonderful workflow! 

 

Too bad it is not working for me. I can save the backup path, but when I try to back my stuff up, it only creates an empty folder.

The notification works fine, though.

 

What path did you set using the backup path command? I don't think the URL should be a problem, but if you could tell me what it was, that would be helpful as well.


I set the path to /Users/User/Downloads.

 

When I add "http:" to the URL, an empty folder named "Downloads" plus the name of the URL is created in /Users/User/.

 

I tested this workflow with this URL: fuckyeahjacqueschirac.tumblr.com.

 

If you entered "http:fuckyeahjacqueschirac.tumblr.com" into the workflow, the script won't recognize it as a URL.

Try entering "http://fuckyeahjacqueschirac.tumblr.com" instead. The two forward slashes after http: are required.
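
Roughly, the script only treats input shaped like a full Tumblr URL as valid. A shell sketch of that kind of check (the variable name and exact patterns here are my illustration, not the script's actual code):

# Accept only URLs of the form http://name.tumblr.com[/...]
INPUT="${1:-http://fuckyeahjacqueschirac.tumblr.com}"
case "$INPUT" in
    http://*.tumblr.com|http://*.tumblr.com/*)
        echo "Recognized as a Tumblr URL: $INPUT" ;;
    *)
        echo "Not recognized as a URL: $INPUT" >&2
        exit 1 ;;
esac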

 

If this doesn't fix your problem, I'll try to stabilize the script some more and post an update.


That was a typo in my post, sorry. I actually wrote http:// .

 

Just retested it.

 

It still creates an empty folder named Downloadsfuckyeahjacqueschirac.tumblr.com in /Users/User/...

 

Oh, this bug is my bad. Try setting your path to "/Users/user/Downloads/", making sure to add the extra forward slash at the end. In the download script I forgot to handle a path without a trailing slash, so the path and the site name get joined with nothing in between. Formatting the path like this should fix your problem.
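
To make the bug concrete, here is a sketch of the broken and fixed joins (illustrative shell, not the script's actual code):

BACKUP_PATH="/Users/user/Downloads"   # as entered, no trailing slash
SITE="fuckyeahjacqueschirac.tumblr.com"

# Buggy join: creates /Users/user/Downloadsfuckyeahjacqueschirac.tumblr.com
mkdir -p "$BACKUP_PATH$SITE"

# Safe join: strip any trailing slash, then add exactly one separator
mkdir -p "${BACKUP_PATH%/}/$SITE"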

 

If not, then I'll push out a new update soon.

3 weeks later...

The same thing is happening to me. I opened the Console app on my Mac, and there is actually a wget crash every time I run this workflow; I assume this is politicus' issue as well. It's easy to verify: open Console, select All Messages in the upper left, and then run the workflow. You should see a new message pop up in the list that says something like 'Saved crash report for wget'. From there, you can open the crash report. Here are the contents of my report:

Process:         wget [25362]
Path:            /Users/USER/*/wget
Identifier:      wget
Version:         0
Code Type:       X86-64 (Native)
Parent Process:  sh [25361]
User ID:         501

Date/Time:       2014-03-20 08:55:09.044 -0700
OS Version:      Mac OS X 10.8.4 (12E55)
Report Version:  10
Sleep/Wake UUID: 61CFEA41-ED5D-41FB-953F-2C86C6813715

Crashed Thread:  0

Exception Type:  EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000002, 0x0000000000000000

Application Specific Information:
dyld: launch, loading dependent libraries

Dyld Error Message:
  Library not loaded: /opt/local/lib/libiconv.2.dylib
  Referenced from: /Users/USER/*/wget
  Reason: Incompatible library version: wget requires version 8.0.0 or later, but libiconv.2.dylib provides version 7.0.0

Binary Images:
       0x1092d5000 -        0x109343fff +wget (0) <73D6481A-1B9F-37E1-A871-D4C64C887238> /Users/USER/*/wget
    0x7fff68ed5000 -     0x7fff68f0993f  dyld (210.2.3) <A40597AA-5529-3337-8C09-D8A014EB1578> /usr/lib/dyld
    0x7fff8c2ff000 -     0x7fff8c3f4fff  libiconv.2.dylib (34) <FEE8B996-EB44-37FA-B96E-D379664DEFE1> /usr/lib/libiconv.2.dylib
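
Reading the report: this copy of wget appears to have been built against MacPorts' libiconv (note the /opt/local/lib path) and requires version 8.0.0, but dyld resolves the library to the system copy at /usr/lib, which only provides 7.0.0, so the binary aborts before wget ever runs. If you want to confirm this on your own machine, otool lists the libraries a binary expects (the wget path below is a placeholder; substitute wherever the workflow keeps its copy):

# Show the dynamic libraries the bundled wget links against
otool -L "/path/to/workflow/wget"
# Show the version the system libiconv actually provides
otool -L /usr/lib/libiconv.2.dylib

A wget built against the system libiconv (or a statically linked one) would sidestep the mismatch.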

