
This workflow lets you search the WordPress developer reference with the keyword "wpdev".





Searching WordPress core made easy. Results come back very fast because the search runs against local files, so no internet connection is needed to search.

The workflow checks every two weeks whether WordPress core has been updated, or you can check manually for updates with the "wpdev update" keyword.


Example of a function search






Last update of this workflow: WordPress 5.5

Download the workflow here: wordpress-developer.alfredworkflow


Follow this workflow on GitHub.


Note: Deprecated functions, hooks, or classes are not included in the search results.


If you encounter wrong results or 404 pages, please post them here or create a GitHub issue. Thanks.

Edited by keesiemeijer
Update for WordPress 5.5

This looks pretty useful to me.


If you want the data to be updated without having to update the workflow each time, you could write a small script, executed by the script filter from time to time (once a week?), that checks a file on GitHub to see whether there is a newer version than the user has; if so, it downloads the newer files in the background.


To do this, you'd probably want to move the data into the appropriate Alfred data directory; and if you do that, it would also be easier for other workflow authors to piggyback off the data.


So, the data directory is usually ~/Library/Application Support/Alfred 2/Workflow Data/BUNDLEID.


The bundle ID is well worth filling in for all workflows because it serves as the unique identifier, which matters for some of the larger interactions (i.e. External Triggers, Packal...).


Since you're using PHP, it might be easiest to do this:


$bundle = 'com.wp.dev.keesiemeijer';
$home = getenv( 'HOME' );
$data = "$home/Library/Application Support/Alfred 2/Workflow Data/$bundle";

if ( ! is_dir( $data ) ) {
  mkdir( $data, 0755, true );
}


That will make sure the data directory exists.


If you want to download the files from Github instead of packaging them with the Workflow, you can do:


// Assuming you have "$data" defined above.

$files = array(
  'functions.json' => 'https://raw.github..../functions.json',
  'classes.json'   => 'https://raw.github..../classes.json',
  // Put them all in here
);

foreach ( $files as $file => $url ) {
  if ( ! file_exists( "$data/$file" ) ) {
    file_put_contents( "$data/$file", file_get_contents( $url ) );
  }
}

Then, if you want to update them, add something like this:


// Assuming $data is defined.

$version = '1';

if ( ! file_exists( "$data/version" ) ) {
  file_put_contents( "$data/version", $version );
}

// Do some caching logic here to check, say, once a week. Store the last
// checked time in a file in the data directory too.
if ( file_get_contents( 'https://raw.github.com/url/to/version/file' ) != file_get_contents( "$data/version" ) ) {
  // do something to download the new files
}
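The "once a week" check mentioned in the comment above could be sketched like this. This is only a sketch: the `last_check` filename and both function names are my own choices, not part of the workflow.

```php
<?php
// Sketch: only hit GitHub if the last check was more than a week ago.
// "last_check" is a hypothetical filename; pick whatever you like.
function should_check_for_updates( $data, $interval = 604800 ) {
  $stamp = "$data/last_check";
  if ( ! file_exists( $stamp ) ) {
    return true; // never checked before
  }
  // Compare the stored Unix timestamp against now.
  return ( time() - (int) file_get_contents( $stamp ) ) > $interval;
}

function mark_checked( $data ) {
  // Record the current Unix timestamp after a successful check.
  file_put_contents( "$data/last_check", time() );
}
```

Call `should_check_for_updates( $data )` at the top of the script filter, and `mark_checked( $data )` after a successful version check, so the network request runs at most once per interval.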

You could make it fancier by using the cURL functions and setting timeouts, error handling, etc., but this should work. Just make sure you add error checking for what happens when you try to fetch the "version" file on GitHub while not connected to the internet.
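For reference, a cURL download with timeouts might look like this. A sketch only: the `fetch_url` name and the timeout values are my own, and it simply returns false on any error (such as being offline), which covers the error-checking case mentioned above.

```php
<?php
// Sketch of a cURL download with timeouts, as suggested above.
// Returns the response body on success, or false on any error.
function fetch_url( $url, $timeout = 10 ) {
  $ch = curl_init( $url );
  curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true ); // return body instead of printing it
  curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true ); // follow redirects
  curl_setopt( $ch, CURLOPT_CONNECTTIMEOUT, 5 );    // give up connecting after 5s
  curl_setopt( $ch, CURLOPT_TIMEOUT, $timeout );    // total transfer timeout
  $body  = curl_exec( $ch );
  $errno = curl_errno( $ch );
  curl_close( $ch );
  return ( 0 === $errno ) ? $body : false;
}
```

You could then swap `file_get_contents( $url )` in the download loop for `fetch_url( $url )` and skip writing the file when it returns false.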


You could save this logic as a separate script and then include it at the top of every script filter (require_once 'update.php';).


There are definitely workflows that do this sort of thing. Some of mine do, but those are usually written in Bash. (The Alfred Bundler does a lot of this.)


Does this help?
