Workflow Request: MAMP to Live Environment WordPress



Hi,

 

I've always wondered how to cut down my work process. It would be cool if there were a way to automatically package the local environment data and then upload it to a specific live site using Alfred.

 

I'm weak at scripting and the like, so it would be great if anyone could point me in the right direction.

 

Currently, I'm thinking of having Alfred do something like this workflow:

 

1) Export the MySQL database from MAMP (local)

2) Save the SQL file to a folder

3) Copy the whole WordPress folder and connect to a particular FTP server (through ForkLift or some other FTP program)

4) Upload the WordPress folder to the remote directory

5) Change the local environment URLs in the SQL file to the appropriate live URLs

6) Import the SQL file into the live site through phpMyAdmin

 

Any help would be great. :) Or is there already a workflow for such a complex task?


I'm not sure all of that is easily possible. It is, but it would be awkward to build.

Basically, what would be needed is:

KEYWORD databaseName FolderLocation FolderLocation FTPAccount FTPLocation URLBefore URLAfter PHPMyAdminLocation


It's just too many pieces to do in one step dynamically, I think. It could work in pieces, with each step of your process as its own separate workflow, but all together I don't think it's possible, unless I'm missing something.


It seems like the easiest way to do this would be to write a config file for each site, then use the mysqldump command-line tool and rsync to update the files in the background. Then you'd have to somehow SSH into the server and do a drop-database-and-import sort of workflow. There are quite a few moving parts here, and each depends on the hosting configuration.
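
For instance, a minimal sketch of the local half, assuming MAMP's bundled mysqldump (path can vary by MAMP version) and with placeholder hostnames, paths, and credentials throughout:

#!/bin/bash
# Hypothetical sketch: dump the local MAMP database, then mirror the files.
# MAMP ships its own MySQL binaries alongside the app.
/Applications/MAMP/Library/bin/mysqldump -u root -proot my_wp_db > /tmp/my_wp_db.sql

# Push the local WordPress folder to the server over SSH. --delete removes
# remote files that no longer exist locally; drop it if that's too aggressive.
rsync -avz --delete ~/Sites/my-wp-site/ user@example.com:/var/www/my-wp-site/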

 

It wouldn't be nearly as good to do it through GUI programs (like ForkLift), as they would have to be launched and would run in the foreground. So native command-line tools that can be backgrounded would be best.

 

So, say you're using a virtual machine that you have root access to. Then it seems like it would be easiest to write a cron job on the server that checks for new files in an upload directory and then runs a script to make a backup of the website, overwrite the database, and drop in the new files. (This, of course, would consume a lot of bandwidth, so it would be better done with some way of tracking changed files, or a database diff sort of thing, depending on the size of the database.) Downside: you have to install a companion "app" on your server, and then it might not work quite as well on different *nix flavors, etc. Slightly different commands, different paths. It's a nightmare.
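
As a very rough sketch of that server-side script (all paths, database names, and credentials are made up; cron would run it every few minutes):

#!/bin/bash
# Hypothetical cron script, e.g. scheduled via:
#   */5 * * * * /home/user/bin/apply-upload.sh
# It waits for a marker file in the upload directory, backs everything up,
# then overwrites the database and drops in the new files.

UPLOAD_DIR=/home/user/upload       # placeholder paths; adjust per host
SITE_DIR=/var/www/my-wp-site
BACKUP_DIR=/home/user/backups

if [ -f "$UPLOAD_DIR/ready.flag" ]; then
    ts=$(date +%Y%m%d%H%M%S)
    # back up the current site and database first
    tar -czf "$BACKUP_DIR/site-$ts.tar.gz" "$SITE_DIR"
    mysqldump -u dbuser -pdbpass my_wp_db > "$BACKUP_DIR/db-$ts.sql"
    # overwrite the database, then move the new files into place
    mysql -u dbuser -pdbpass my_wp_db < "$UPLOAD_DIR/my_wp_db.sql"
    rsync -a "$UPLOAD_DIR/files/" "$SITE_DIR/"
    rm "$UPLOAD_DIR/ready.flag"
fi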

 

If you don't have access to the shell, the ability to write cron scripts, etc., then it becomes much harder. You can connect to MySQL non-locally, but only if your hosting provider allows you to do so, so a full inventory of the firewalls and open ports would need to be considered. It gets even trickier if you're using private keys.
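
A quick way to check whether remote connections are allowed is to try one from your machine (hostname, port, and credentials below are placeholders); if the port is firewalled, the attempt will hang or be refused:

# Connects only if the host exposes MySQL (usually port 3306) externally.
mysql -h db.example.com -P 3306 -u dbuser -p my_wp_db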

 

Another possibility is that a simple little web app could be written in PHP: when it's pinged (probably with a token to verify the request), it would run a PHP script or a Python script or something (depending on what's installed) that would then move the files and import the database. A config file for the server side would need to be generated, but I guess that could be done through a single web script.
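
The local half of that ping could be as simple as a curl call carrying a shared secret (the URL and token here are invented for illustration):

# Trigger the hypothetical deploy endpoint; the server-side script
# verifies the token before doing anything.
curl "https://www.example.com/deploy.php?token=SOME_SHARED_SECRET"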

 

In the config file, however, you'd have to have:

  1. the local MySQL db, user, and pass;
  2. (depending on the setup above) the production MySQL db, user, and pass;
  3. authentication credentials to push the files (login/pass/private key);
  4. the local dir and the remote dir;
  5. the local path to the MySQL binaries (these would probably stay the same for MAMP, but if the program isn't in the Applications folder, then, well, it could break);
  6. the remote server IP;
  7. the local URL and the remote URL.

There might be a few other things that I'm leaving out.
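
As a sketch, such a config file could just be shell variables that a bash script sources (every value below is a placeholder):

# deploy.conf: hypothetical config file, one per site.
LOCAL_DB=my_wp_db
LOCAL_DB_USER=root
LOCAL_DB_PASS=root

REMOTE_DB=my_wp_db
REMOTE_DB_USER=dbuser
REMOTE_DB_PASS=secret

SSH_USER=user
SSH_KEY=~/.ssh/id_rsa                      # credentials for pushing the files

LOCAL_DIR=~/Sites/my-wp-site
REMOTE_DIR=/var/www/my-wp-site

MYSQL_BIN=/Applications/MAMP/Library/bin   # MAMP's bundled MySQL tools

REMOTE_HOST=203.0.113.10
LOCAL_URL=http://localhost:8888/my-wp-site
REMOTE_URL=http://www.example.com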

 

You could probably write an Alfred sub-workflow that would help create the config file, but that wouldn't be too elegant. And needing files stored on a remote server to do the rest of the work isn't so great.

 

---

 

Perhaps another way you could do it would be to combine a GitHub workflow with a server-side script, so that when you update (and push to GitHub), the server pulls everything down from GitHub.
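
On the server side, that could be as small as a cron job that pulls whatever was last pushed (repository location and branch are assumptions):

#!/bin/bash
# Hypothetical server-side cron job: grab the latest push from GitHub.
cd /var/www/my-wp-site && git pull origin master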

 

---

 

So. What's possible? Who is your hosting provider? I do hope you have a virtual machine.

 

Shawn


Wow, it seems a lot harder than I thought.

After giving it some thought, I was thinking maybe I could use ForkLift to integrate with Alfred, because ForkLift has a synclet feature, and a synclet functions as an app that Alfred can open. The synclet would then deploy the files to the server.

But then it probably won't be very dynamic, because you must create keywords for each synclet. My main concern is probably trying to figure out how to make Alfred open a particular file in ST2 and then make changes to the file automatically. Something like Automator? Or can I do it with scripts?

This is the thing that I just can't wrap my head around.

The bad thing about WP is probably how it stores absolute URLs all over the place. :(


There just really are so many moving parts.

 

What's ST2?

 

Automator would again make things not work well at all, because it would take over your computer for a bit. That isn't ideal, as the update should really be a background task that just lets you know when it's done, especially because it could take a few minutes to run everything that's necessary.

 

It would be easiest to write it as a bash script; next easiest would probably be a Python script, followed by a PHP script. Since you're developing in WordPress, I assume that you know at least a few bits of PHP and a tiny bit about how to work with MySQL commands.

 

If you don't, then this could be a great time to learn a few things as they can only help boost your ability to create and curate any site you develop.

 

With each script, you could load a config file that has all the relevant information that I mentioned above.
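
For instance, the top of a bash version might just source the config and use its values (variable names as in the hypothetical config sketched above). One caution worth baking in as a comment: a plain textual URL swap on a WordPress dump is risky.

#!/bin/bash
# Load the per-site settings described above (hypothetical file/variables).
source ~/deploy.conf

# Dump locally with MAMP's bundled tools, then swap the URLs for the live
# site. WARNING: a plain textual swap like this corrupts PHP-serialized
# values in the WordPress database (their stored string lengths no longer
# match), which is why serialization-aware search-and-replace tools exist.
"$MYSQL_BIN/mysqldump" -u "$LOCAL_DB_USER" -p"$LOCAL_DB_PASS" "$LOCAL_DB" > /tmp/dump.sql
sed "s|$LOCAL_URL|$REMOTE_URL|g" /tmp/dump.sql > /tmp/dump-live.sql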

 

I can help you out if you answer a few questions about your hosting provider:

  1. Where are you hosting?
  2. Do you have shell access (access to the command line)?
  3. If yes, can you access commands like "mysqladmin" or "mysql" on the server?
  4. If you don't have shell access, then can you initiate cron jobs?
  5. If you don't have shell access, then does the webserver allow you to use the "shell_exec()" or "exec()" functions in php?
  6. Is this a regular development/production site sort of thing you have going on?
  7. If yes, then is the production site updated with new content, or does all content development happen on the local machine?
  8. Can you use rsync to connect to the server?
  9. Do you have access to git commands on the server?
  10. What webserver is the server running (Apache 2, nginx, lighttpd)?
  11. What version of php is the server running?

I'd say that, generally, dropping a full database and re-uploading it wouldn't be the best way to go about this sort of thing. Instead, using some sort of version control system would be better. This goes for the files as well as for the database: for files, look into GitHub (github.com), and for database versioning, look into the new tool dbv.php (http://dbv.vizuina.com/).

 

Generally, to test some of these things, you should go into the command line (for local, open up the Terminal app on your Mac and try typing in commands like "rsync", "wget", "curl", "awk", etc.).
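
A quick way to check several tools at once: each one that's installed prints its path, and missing ones print nothing.

command -v rsync wget curl awk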

 

To get the info about the webserver, if you don't already have it, upload a PHP file that simply has the contents:

 

<?php
phpinfo();
?>

 

That should give you all relevant information about the webserver (except for the shell access).

 

To test for shell access, try typing "ssh user@domain.com" (ssh will prompt you for the password; it doesn't take user:pass in the address the way FTP URLs do).

 

If you don't have access to anything except FTP, then it's still possible to do everything by writing a PHP script that lives outside of your WordPress installation and, when accessed, goes through a series of operations to do what's listed above. Granted, you'll want to include some sort of authentication to make sure that no one can exploit the script.

 

Let me know, and I can help you develop this workflow. And if you don't know how to do many of these things, then this could be a great time to learn.


I think this isn't necessarily a good fit for a workflow...

 

However, if you are looking for a better way to develop for WordPress, I would offer the following suggestion: use the Beanstalk app and Git. You develop locally, and every time you commit a change, you can have it deployed automatically. You still have to set up the database and the wp-config file on your remote site at the beginning, of course, and there are unfortunately a lot of things that WordPress stores in the database that are impossible to bundle up into files so they can be version controlled, but for theming and function development, it works a charm!
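
With Beanstalk's automatic deployments hooked up to the repository, an ordinary commit and push becomes the deploy step; a typical session might look like this (message and branch made up):

git add -A
git commit -m "Tweak theme header"
git push origin master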

 

Also, this script is very helpful:

 

http://interconnectit.com/products/search-and-replace-for-wordpress-databases/

 

--

 

I would like to have more FTP/SFTP workflows.

