Automatically check links on GitHub pages site

4/4/2017 - programming, tips - Cody Reichert

I'm not good at checking my own code, and I've accidentally pushed broken links and bad HTML on numerous occasions. To solve this, I set up Assertible to automatically check the links and HTML on this blog every time I make a change.

Assertible has test assertions for checking links and HTML validation on a website. These assertions are perfect for this blog; it's a simple static site, heavy on content. I want to make sure that the links stay intact and the markup is accessible.

The Assertible link checker assertion also checks <img>, <script>, <link>, etc. tags in addition to <a> tags - so I get the added benefit of making sure all of the site's assets are available.

Automated checks and monitoring

First, I set up an hourly test schedule to run the assertions and notify me if anything fails. Failure alerts are sent to a Slack channel and email.

But when you think about it, it makes sense to check links immediately after every change to the site. Links may go bad over time, but by far the most common occurrence is when I'm pushing something new. Assertible's GitHub Deployments integration actually works out of the box with GitHub Pages, so all I had to do was install the integration and create an environment called github-pages.

Now, every time I push to the master branch of this repository, GitHub deploys the site to GitHub Pages (automatically), sends a deployment event to Assertible, and Assertible then runs these smoke tests on the website. In addition, the tests are run hourly, and every failure will alert me over Slack or email. Cool, right?

:: @CodyReichert

Chrome Canary: Take mobile device screenshots

6/6/2016 - programming, tips - Cody Reichert

I came across a tweet the other day by @ChromeDevTool that I had to share - super cool.

In the latest Chrome Canary build you can not only view your website on different mobile device sizes, but you can now also view the device and take a screenshot of the whole thing! Here's an example:

*fig 1.2*

So meta

I'll leave you to play with all the different devices and take your own screenshots - go download the latest Chrome Canary build and get started!

:: @CodyReichert

JavaScript: Anything to boolean

6/4/2016 - javascript, tips - Cody Reichert

Double negatives aren't not bad, but the Logical NOT operator is pretty cool. You're probably familiar with it: !. It's a unary operator that returns false if its argument can be coerced to true, and vice versa:

    !true  // false
    !false // true

    // with coercion
    !1  // false
    !0  // true
    ![] // false

And since ! returns a single boolean, we can pass its result to another !, giving us !!:

    !!true  // true
    !!false // false

    // with coercion
    !!1  // true
    !!0  // false
    !![] // true

It helps to know the rules of which values convert to true (truthy) and which to false (falsy).
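Putting those rules in one place, here's a quick sketch (my own example values, not an exhaustive list):

```javascript
// The falsy values in JavaScript -- everything else is truthy
const falsy = [false, 0, '', null, undefined, NaN];
console.log(falsy.map(v => !!v));
// [ false, false, false, false, false, false ]

// Perhaps surprisingly, empty arrays and objects are truthy
const truthy = [[], {}, '0', -1];
console.log(truthy.map(v => !!v));
// [ true, true, true, true ]
```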

Pretty Printing JSON in JavaScript

5/3/2016 - programming, javascript - Cody Reichert

I frequently look up how to pretty print JSON in JavaScript - whether it's for debugging in the console, displaying some output to the user, or any other reason.

JSON.stringify() is usually used to simply convert a JavaScript object to a JSON string for use elsewhere in the application. It also takes two optional parameters: replacer and space.


The replacer argument tells JSON.stringify() to serialize your object in a custom way - e.g., replace all "true" and "false" strings with the boolean versions true or false. You can do a million other things here, but we're not too concerned about this parameter when all we want to do is pretty print.


The space argument, though a little oddly named, tells JSON.stringify() to add spacing to the string output -- like a newline, tab, or single space.

Using this parameter, we can print out our stringified JSON in a readable format:

    JSON.stringify({ a: 1, b: 2 }, null, 2)

    // "{
    //    "a": 1,
    //    "b": 2
    //  }"

And there we go - a nice, readable JSON object as a string. There's more information on the MDN page about what you can pass to this function - so check those out and start pretty printing!

:: Cody Reichert

JSX - The Basics

3/5/2016 - programming, javascript - Cody Reichert

When most people get started with React.js, JSX is the first thing that comes as a surprise. JSX is a templating language inside JavaScript. Remember all those string templates you used to write with (insert frontend framework here)? Well, JSX to the rescue.

What is JSX

JSX is a JavaScript syntax extension that looks similar to XML. You can use a simple JSX syntactic transform with React.

Picture this: instead of writing HTML strings within your code, you use a preprocessor that allows you to write (basically) pure HTML with some compile-time checking. Brilliant! Let's look at the examples from Facebook's Displaying Data post:

Without JSX:

React.createElement('a', {href: ''}, 'Hello!')

With JSX:

<a href="">Hello!</a>

Great! We've improved readability by 10,000%.

Now that we've got a solid introduction ;) to JSX, let's take a look at some common gotchas.

Comments in JSX

It's common to want to write comments within your markup. Inside of JSX, there are a couple of 'rules' to follow:

In between JSX elements, wrap your comments in {}:

    {/** A comment about the component below */}
    <span>Hello, JSX</span>

A bare comment won't work there - JSX treats it as text to render:

    /** A comment about the component below */
    <span>Hello, JSX</span>

Inside of a JSX tag you can use normal comments:

        /** Some comments about the props below */

The braces are only needed between elements, not inside a tag:

        {/** Some comments about the props below */}

Passing Props

JSX also makes passing props (arguments) to other components simple, with HTML-like syntax. If you're not familiar with nesting components and receiving props, you should check out JSX In Depth first.

String Props

Normal string-value props are easy to pass to child components:


Now inside of AwesomeComponent, you have access to a prop that references the value jsxRocks.


Function Props

Passing functions as callbacks is also common practice when writing JSX components. A basic example would look like:

    handleNameChange={data => myAction(data)}

Whoa, let's note a couple of things here:

  • {}. Where did that come from? JSX can take JavaScript expressions as arguments to props. This means that you're not restricted to string props. You can pass functions to children, which makes handling callbacks easier.

    To extend the example, this means we can use any logic we want inside of a component's attributes. Need computed data at render time? No problem:

    conditionalProp={thing === true ? 'thisprop' : 'thatprop'}
  • No quotes? When using expressions as a value to a prop, you don't need to do any fancy string interpolation - React will take care of rendering the result of the expression.

Basic function optimizations

Keep in mind that when you create functions inside of a component's render method (like we're doing with the expressions in props), a new function is created on each re-render.

This is a pretty minor thing for most apps, but as the size of your application grows and you start thinking about performance, minimizing the amount of work on the client can be a good idea.

If we take our two examples from above, they would be better written so that a new function is not created on each render and the class instance can take care of re-computing the value:

class AwesomeComponent extends React.Component {
    conditionalProp() {
        return this.props.thing ? 'prop1' : 'prop2'
    }

    render() {
        // the class method re-computes the value, so no new function is
        // created on each render (<Input> is a stand-in child component)
        return (
            <Input conditionalProp={this.conditionalProp()} />
        )
    }
}
Function reference vs Function unwrapping

When the render method of a component is called, the expressions inside of the JSX are evaluated and their results are used as the props. That brings up two common use cases for using expressions in components:

Using a reference to a function

Sometimes you will need to provide a callback method to a child component - and sometimes you will need to provide the first argument to a function and let the child provide the rest. In those cases, anonymous functions work perfectly:

    conditionalProp={data => myCallback(id, data)}

What we're doing in the above example is called partial function application. In a nutshell, we take a binary function and supply its first argument up front, so the child is left with a unary function to call.

When the render method of the component is called, the expressions are evaluated. With the example above again, what we actually get after the render is:

    handleCallback={data => myCallback(id, data)}

And inside of AwesomeComponent, you can simply call:

    this.props.handleCallback(data)

Using the value of a function

Other times, you want the prop to receive the function's return value, so you call the method at render time:

class AwesomeComponent extends React.Component {
    conditionalProp() {
        return this.props.thing ? 'prop1' : 'prop2'
    }

    render() {
        // call the method so the prop receives its return value
        // (<Input> is a stand-in child component)
        return (
            <Input conditionalProp={this.conditionalProp()} />
        )
    }
}

Boolean Props

Passing boolean properties to JSX components is simple. The only thing you need to know is that they are JavaScript Booleans, not String Booleans. In other words, this is correct:

    <Input disabled={true} />

This is not:

    <Input disabled="true" />

Just a quick note that the second one actually is valid, but the value isn't considered a boolean, it's considered a string.
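The distinction is easy to check in plain JavaScript (a standalone sketch):

```javascript
// A real boolean vs. the string "true"
console.log(typeof true);     // boolean
console.log(typeof 'true');   // string
console.log('true' === true); // false -- not interchangeable

// Note that even the string "false" is truthy!
console.log(!!'false'); // true
```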

Alternative syntax

An alternative for passing true is to simply provide the prop name without a value:

<Input disabled/>

My professional opinion is to always use the full true or false value for the sake of readability.

Final Thoughts

Are there any pieces of JSX where you got caught up when first starting? Let's chat - shoot me a message on Twitter @CodyReichert and I'll add it!

:: Cody Reichert

January 2016 Reading List

1/4/2016 - programming, common lisp - Cody Reichert

One of my goals this year is to spend more time reading. Books on business, startups, psychology, history, etc, are all fair game.

Last month, January, I picked up a few new books. Some I really liked and others were just OK. In no particular order:


Rework - Jason Fried

Reading Rework was a breath of fresh air. Brilliantly put together, blunt no-fluff advice, and wittily accurate illustrations and metaphors made this my #1 book this month.

If you're new to startups, veteran to startups, interested in the culture, or anywhere in between, then go to your local Amazon and pick this up immediately. You won't regret it.

Zero to One - Peter Thiel

Zero to One is filled with interesting insight from one of our startup culture's very own, Peter Thiel. There's no denying that Thiel is a smart businessman - so that alone is reason enough to read anything he writes.

This book really made me think about some tough topics like competing with your competition and how major technological advances occur. Every chapter of this book got me pumped up to build something world-changing.

The Virgin Way - Richard Branson

I didn't have many expectations going into this book - but I came out with nothing but praise for Sir Branson and The Virgin Way. The book is funny, thought-provoking, and riddled with real life anecdotes about Virgin and other mega-corps (Netflix, Microsoft, etc).

I would recommend this to anyone who is interested in Richard Branson, or what it's like running hundreds of businesses. His writing is easy to read and intriguing, it's hard to put down!

After this one, I'll be looking to pick up a couple more of Richard Branson's books.

How to Win Friends and Influence People - Dale Carnegie

A classic. Anyone who has read any of Dale Carnegie's books (and you've probably read this one) will tell you that his writing style is one of a kind and hilariously interesting. How to Win Friends is a must-read (and re-read) for anyone in a leadership role.

I'll keep this one on the shelf and read it again.

In Progress

Lincoln the Unknown - Dale Carnegie

I picked up Lincoln the Unknown after finishing Carnegie's other book, How to Win Friends. Lincoln the Unknown is an interesting tale of Abraham Lincoln's life and upbringing.

I haven't quite finished it, so let's hope for an update next month.

The Hard Thing About Hard Things - Ben Horowitz

At this point in time, the most I've read of this book is the inside cover. Horowitz's book is highly recommended across the industry as a guide for managing a startup.

:: Cody Reichert

cl-ses - Sending Emails from Common Lisp with AWS SES

8/6/2015 - programming, common lisp - Cody Reichert

I've been enjoying using Common Lisp lately. After going through the first half of Practical Common Lisp (highly recommended), I wanted to write some scripts to automate some of my own tasks. We use AWS for most of our infrastructure at SimplyRETS and Reichert Brothers, and run jobs to check how our APIs are holding up. Naturally, I wanted to automate some of that and send out an email when reports are generated.

In case you don't want to read the rest: Here's the code


There are a few AWS libraries out there for Common Lisp [1], but I couldn't find one that supported SES - except for aws-ses. The problem I had with aws-ses is that it only works with LispWorks - which is perfectly fine, but I've been using SBCL and wanted something with a bit more flexibility.

On a side note, if you are using LispWorks - aws-ses is really cool since the author wrote his own hmac, sha1, and base64 algorithms with 0 dependencies.


The app I wanted to send email from was running in SBCL - so I decided a port of aws-ses would be the right thing to do.

I put up cl-ses on GitHub the other day after porting all of the LispWorks-specific functions. Specifically, the use of comm for the TCP connection was the major blocker. I decided against porting comm to sb-bsd-sockets and opted for bringing in Drakma, an awesome HTTP library that has already hashed out the differences between Lisp implementations.

Losing the "no dependency" badge and adding the "1 dependency, but multiple implementations" badge was the best route - especially since there seems to be a lack of any high level email libraries.


I changed up a bit of code in order to get the signing algorithms to work with Drakma's headers, but the library stayed very simple - only exposing one send-email function. Here's all it takes to send an email:

  (cl-ses:send-email :from ""
                     :to ""
                     :subject "Hello from CL-SES"
                     :message "The body of the email message"
                     :aws-access-key "XXXXXX"
                     :aws-secret-key "XXXXXXXXXXX")
send-email returns T if the status was 200 (OK), and NIL otherwise. In the future I'll hopefully have implemented better error reporting.

You can easily send to multiple recipients:

  (cl-ses:send-email :from ""
                     :to ",,"
                     :subject "Hello from CL-SES"
                     :message "The body of the email message"
                     :aws-access-key "XXXXXX"
                     :aws-secret-key "XXXXXXXXXXX")

I'm working on extending this to support more of AWS's features, allow for a lot of the obvious settings (like the AWS region), provide better error handling, and add built-in support for HTML emails - but other than that it's working great so far.


[1] zpb-aws and hh-aws are the two I found.

:: Cody Reichert

Managing Your Dotfiles with GNU Stow

7/1/2015 - programming - Cody Reichert

I see questions and posts almost every day (e.g., One, Two, Three) on the best way to manage configuration files between machines or workplaces. There are a lot of (OK) solutions out there. However, they're all setup-specific bash scripts that aren't very reproducible.

I wanted to write this to hopefully prevent people from maintaining their Makefiles, and to keep their dots pristine.

Welcome GNU Stow

GNU Stow is a symlink-farm manager. Wait, I thought we were talking about dotfiles? Well, we are. To quote the original site:

GNU Stow is a symlink farm manager which takes distinct packages of software and/or data located in separate directories on the filesystem, and makes them appear to be installed in the same place

When it comes to configuration files, this means that we can do things like: create a directory anywhere on our system that imitates the structure of our $HOME directory, and then have Stow symlink them from the imitated directory, to the real $HOME.

Directory Structure

There's a lot of good examples out there, but one of the most common questions I see is still, "Wait, so how does it work?". Hopefully this will clarify.

Create a directory anywhere on your system called dotfiles/, and cd into it. Now pretend for a minute that you're actually in your $HOME directory. Where would you find your .bashrc? Probably some place like this:

     └── ~/
         └── .bashrc

With Stow, you imitate that structure:

     └── dotfiles/
         └── bash/
              └── .bashrc

Not all configuration files live in the top level of your home directory, so what about a program that keeps its config file in $XDG_CONFIG_HOME?

The key to Stow is remembering that you have a subdirectory (e.g., dotfiles/bash) for each program whose configuration files you wish to store. So in essence, we end up imitating our home directory each time we add something new. Here's a bigger example with more nesting:

   ── dotfiles/
      ├── awesomewm/
      │   └── .config/
      │       └── awesome/
      │           └── rc.lua
      ├── bash/
      │   └── .bashrc
      ├── emacs/
      │   └── .emacs.d/
      │       └── init.el
      └── zsh/ # we can name these dirs whatever we want

There's a few different things going on up there, but they all follow the same pattern.

First, note that we can name the first-level directories whatever we want, but it makes sense to call them the name of the program they contain. Second, see how directly under each first-level directory, we start placing files exactly where they should show up in our $HOME directory.

Repeat those steps for any other configuration files you want to Stow away.

Using Stow

Now that the dotfiles folder is set up, we can actually use Stow. Let's start with only the .bashrc from above.

Remove, back up, or rename your original .bashrc (the one that's not in your dotfiles), because we need that name free for Stow.

cd into your new dotfiles directory, and run:

  stow bash

That will create a symlink from your dotfiles repo, to the correct place at home:

  ls -al ~/.bashrc
  # outputs:
  # .bashrc -> dotfiles/bash/.bashrc

The stow command simply takes the name of the directory you wish to symlink. Do the same for the other configurations in your dotfiles repo, and you'll have them all managed.

Advanced Usage

Using Stow in combination with git (or any other VC) is really the best part. It allows you to have your entire configuration on any system in just a matter of seconds.

And when you leave, if you don't want to leave your config files behind, Stow comes with a nice flag to unstow:

  stow -D bash

For a more complex and complete example, you can check out my dotfiles on GitHub.

My hope here is that I'll now be able to point people to this article to help them understand Stow a little better without needing to actually set it up. If you've found any other cool uses for this tool, or use other programs to manage your dotfiles - leave it in the comments!

:: Cody Reichert

Blogging with Emacs and Org Mode

7/6/2015 - emacs, org-mode - Cody Reichert

I've finally got my new blog up! I've been wanting to migrate away from Middleman, a static-site generator written in Ruby. The problem was that posting a new article took too many steps - so I just never did.

I've been eyeing a few solutions for blogging completely from within Emacs. There's some good (and some outdated) software on the wiki to accomplish that.

I finally came across org-page, which seemed to be exactly what I wanted. The documentation was a little terse, but it's a simple setup so I decided to give it a shot.

Here's what I was able to get set up, with a few snippets to accompany the official documentation.

  • Write blogs completely in org-mode (obviously)
  • Publish to GitHub pages (or anywhere you can push static files).
  • One command publish
  • Ability to customize a theme, or write my own
  • Tags, RSS Feed, and all the other blog goodies.
  • Never have to leave Emacs


Org-page is available on MELPA, so the install is simple:

  M-x package-install RET org-page RET

That will give you org-page and a few commands (which is all you need) to create a repo, add a new post, and publish.

For manual installation, see the documentation

Set up a repo

Org-page also handles this for you, with the command op/new-repository. So find a place on your system where you want your blog to live, and run:

M-x op/new-repository RET /path/to/new/blog RET

This sets up a new git repository, with a few pages already laid out for you (an index, an about page, a readme, etc).

Org-page in your init.el

There are few things you'll need to set up in your Emacs config file to get things working correctly. Here's an annotated example:

  (require 'org-page)
  (setq op/repository-directory "~/workspace/play/newblog")
  (setq op/site-domain "")
  (setq op/personal-github-link "")
  (setq op/site-main-title "The One True Blog")
  (setq op/site-sub-title "Emacs, Programming, and Arch Linux")
  (setq op/personal-disqus-shortname "theonetrueblog")

Kelvin used very sane variable names, so most of that should be self-explanatory. Not all of those are required, but if you leave out things like the GitHub Link, it just won't show at all - perfect.

Reload your Emacs config and let's move on

Creating a new post

You'll probably first want to fill out some of the generated pages, like the index and about pages.

Pro-tip: if you remove the index page, it will default to a list of posts, like mine. I prefer that, since there is already an about page.

Once again, org-page has a built-in command to get a new post started. The best thing about it is that it handles the description, file name, post URI, tags, and more, meaning you can get straight to writing articles, not boilerplate.

M-x op/new-post RET

It will run you through a few steps to generate all of those fields for your post, and put your cursor where you can start writing. (GIF: op/new-post example)

Publishing your new blog

Yet again...built in to org-page. And since we're in Emacs, we can make it do whatever we want. First, make sure you set the remote in your blogs git repo:

    git remote add origin
    git remote -v

Org-page has a command op/do-publication. It asks a couple of questions, and compiles the org-mode pages for you. When you set up your repository, org-page created two branches: source and master. This is a good setup for GitHub Pages and probably most other hosts. All of your org files live on the source branch, and org-page will add and commit the compiled files to the master branch.

The questions:

1) Publish all org-files? (y or n)

2) Publish to directory? (original repo if not) (y or n) - this one is particularly useful for sending the compiled files to another directory, which you can serve with a simple HTTP server to quickly view changes while you're writing.

3) Auto-commit to repo? (y or n)

4) Auto-push to repo? (y or n)

The last two are great, because all I need to do is run op/do-publication and the new post is live within a few seconds. That's the Emacs way.

Here's a GIF of how I published this blog, right after I wrote this part.

Other setup

I have a few other snippets for using a custom theme with org-page, and a couple other nice settings I'll share eventually.

My recommendation is to run C-h f on some of the org-page functions; they're documented well.

You can also view my org-page setup on Github.

:: Cody Reichert

Webpack: Create Multiple Bundles Using Entry Points

7/5/2015 - webpack, javascript - Cody Reichert

Webpack is a module bundler that generates static assets for almost all of your front-end dependencies, and their dependencies. Its primary use case is creating optimized bundles for your JavaScript, but it's quickly been extended to handle everything from fonts and images to a compilation step for CoffeeScript, LESS, and pretty much anything else you can think of.

A common use case for Webpack is single page applications, but another large one is for multi page applications. Loading a large JavaScript bundle on every page is not ideal, so let's set up Webpack to create multiple bundles for us.

A basic setup

So let's say the front-end JavaScript/Stylesheets structure of our site looks like this:

    └── static
        ├── dist
        └── src
            ├── js
            │   ├── Account.js
            │   └── Front.js
            ├── node_modules
            ├── package.json
            ├── stylesheets
            │   └── main.scss
            └── webpack.config.js

Most importantly, we have a main JavaScript file for Front and Account.

The goal is to have Webpack generate a front-bundle.js and an account-bundle.js. The advantage here is that new visitors who aren't logged in don't need to load a huge JavaScript bundle just to visit the homepage.

Single Entry Point

With a goal in mind, we can dig into Webpack and see what it offers. By default, Webpack makes you define an entry point, basically the root JavaScript file for your app:

    module.exports = {
        entry: {
            app: "./static/src/app.js"
        },
        output: {
            path: "./static/dist",
            filename: "app-bundle.js"
        }
    };
Our site structure doesn't match up with that. With this, we would have to load all the account panel JavaScript on the homepage too - which is far from ideal.

Multiple Entry Points

Webpack supports multiple entry points for this reason. Here's a new configuration more suited to our site structure:

     module.exports = {
         entry: {
             front: "./static/src/js/Front.js",
             account: "./static/src/js/Account.js"
         },
         output: {
             path: "./static/dist",
             filename: "[name]-bundle.js"
         }
     };

Much better. What's happening here is that Webpack is now looking for both Front.js and Account.js, and will create a separate bundle, including the Webpack runtime and dependencies, for each of those. In the output object, we export the bundle to static/dist and use the [name] variable to name each bundle.

We end up with /static/dist/front-bundle.js and /static/dist/account-bundle.js. Great, so now we can add the script tag to each page and we're done!


Even though the bundles contain different code, there are a few libraries/modules that we use in both Front and Account. So, what about the use-case where a new user does end up logging in?

We wouldn't want to make them re-download the same JavaScript!

Common Chunks

While the solution above is good, it can be better.

Ideally, we have front-bundle.js and account-bundle.js - but we also have a common bundle that contains the modules we use everywhere. The browser will cache the common bundle, so when a user transitions from the Front to the Account, they've already got most of what they need.

Say hello to the CommonsChunkPlugin.

Configuring CommonsChunkPlugin

The commons chunk plugin will find all of the common modules and dependencies between your entry points, and automatically bundle them. All we need is a little configuration:

      // webpack is required at the top of webpack.config.js
      let webpack = require('webpack');

      let commonsPlugin = new webpack.optimize.CommonsChunkPlugin(
          'commons',  // Just name it
          'common.js' // Name of the output file
                      // There are more options, but we don't need them yet.
      );

      module.exports = {
          entry: {
              front: "./static/src/js/Front.js",
              account: "./static/src/js/Account.js"
          },
          output: {
              path: "./static/dist",
              filename: "[name]-bundle.js"
          },
          plugins: [ commonsPlugin ]
          // more config options
      };

We initialize a new instance of the CommonsChunkPlugin and pass a couple of parameters (annotated above). After that, Webpack will do the rest.

The commons bundle will also be output to static/dist/, with the name that we gave it (common.js).

Wrapping Up

Now we're done! Remember to add the <script> tags for both the entry bundle and the common bundle to the correct pages, and Webpack will do the rest.

It's a powerful tool, and I think it does a great job of cleaning up the mess that is front-end dependency management. There's an endless number of plugins and extensions already out there, so we'll see where Webpack ends up in six months to a year.

:: Cody Reichert

Adding Template Options to a Wordpress Custom Post Type

12/6/2014 - wordpress, php - Cody Reichert

About the Task

I've recently been working on a Wordpress plugin for a new service we'll be releasing at Reichert Brothers in the next couple of months. Although I haven't ever done too much in Wordpress, it allows for some really cool things and lends itself well to rapid development. I ran into a couple of problems when developing out the plugin that I think might be valuable enough to get documented and help anyone else out that may come across the same issues.


For the sake of making this easier to understand, I'll give a little background on what I needed my plugin to accomplish. In a nutshell, I ventured to write a 'simple' plugin that would retrieve JSON from an external API, format it, and display it on a Custom Post Type Page I created. I wanted to write as few of my own custom templates as possible and inherit the templates that the active theme makes available. The following things needed to be achieved:

  • Create a Custom Post Type: to display the data from an API on our custom pages. (not covered here)
  • Add meta boxes: to filter the requests to our third-party API (not covered here)
  • Allow the user to pick a template: for their Custom Post Type Pages. These templates should not be hardcoded and should be the same templates that the active theme offers (full-width, left-sidebar, etc).

We'll be covering the third bullet - allowing the admin to pick a different template for each of our Custom Post Type pages.

The Problem

When creating a new custom post type, Wordpress does allow authors to add the page-attributes capabilities to their posts. But they don't include the Template dropdown (see fig 1.1). As a matter of fact, they explicitly hardcode it to only be allowed on page post types. Well, I wanted my users to be able to pick a template offered by their active theme on a page-by-page basis.

*fig 1.1*

The most common solution I found was to register a setting where the admin could pick one template to take effect on all of those custom post type pages. But let's say my custom post type is for restaurant menus. Well, now the restaurant needs two pages: one for the lunch menu and one for the dinner menu. The lunch menu is much smaller and thus doesn't need a 'full-width' layout, but the dinner menu is large and the admin doesn't want anything else on that page. Now do you see why we might need the ability to choose a template on a page-by-page basis? At least this was my thought process.

Diving In

Well, in the end I figured out what I think is a decent solution. I don't think I 'invented' it, because I bet many other people out there are doing the same thing, but let's take a look.

TL;DR Solution

We'll leverage the use of Meta Boxes to provide our own dropdown menu. Then we'll save that setting in our post_meta, retrieve that setting when our CPT pages are loaded, and show the correct template. Pretty simple!

Registering the Meta Box

First, we'll start off by creating our meta box for our Custom Post Type. If you've never used meta boxes before, I recommend taking a look at the WP Codex Function Reference for add_meta_box. It gives some great examples and provides plenty of info on how to set one up. Keep in mind all of this is from a plugin. I won't go into setting all of that up in this blog, but that's where we'll be working from.

Let's set up our meta box! I'll touch on the important methods and settings, but I'll leave out explaining the irrelevant ones. Our code:

  function cptTemplateMetaBox() {
      add_meta_box(
          'cpt-template-meta-box'
          , __( 'Page Template', 'my-cpt-textdomain' )
          , 'postTemplateMetaBoxMarkup'
          , 'my-cpt-name'
          , 'side'
          , 'core'
      );
  }
  add_action( 'add_meta_boxes', 'cptTemplateMetaBox' );

Ok, so we've successfully created our meta box (even if it's not doing anything yet). Let's go through the arguments to the add_meta_box function and see what we're setting up.

  • cpt-template-meta-box: the HTML ID that WordPress will give our meta box when it's put on the page.
  • __( 'Page Template', 'my-cpt-textdomain' ): the title that WordPress will give our meta box when it's rendered.
  • postTemplateMetaBoxMarkup: the name of the function we're about to define that will render the markup inside our meta box.
  • my-cpt-name: the name of our custom post type for which to load the meta box.
  • side: where our meta box will go. On the side, since that's where the normal one would be.
  • core: the 'priority' of the meta box.
  • add_action: registers the meta box on the add_meta_boxes hook.

Rendering the Meta Box

Now that we have registered our Meta Box, it's time to give it some markup. Basically, we'll just generate one simple dropdown box that has a list of all the currently available templates. Since this part is a little more detailed than the previous code snippet, I commented inline what everything is doing. Here's the code:

  function postTemplateMetaBoxMarkup( $post ) {
      // create a nonce for verification (not covered in this post)
      wp_nonce_field( basename( __FILE__ ), 'cpt_template_meta_nonce' );

      // we get the cpt_page_template meta field from the database when we load
      // the admin panel. We haven't saved one yet, but when we do it'll be here.
      $current_template = get_post_meta( $post->ID, 'cpt_page_template', true );
      // the get_page_templates() function retrieves all of the currently
      // available templates.
      $template_options = get_page_templates();

      // start creating our markup
      // first we create a label; the 'for' attribute should match the 'name'
      // of the <select> we want to save.
      $box_label = '<label for="cpt_page_template">Page Template</label>';
      // <select> wrapper around our options. notice the 'name' == 'for' from above
      $box_select = '<select name="cpt_page_template">';

      // we give a Default option which will fall back to whatever the theme's
      // default template is.
      $box_default_option = '<option value="">Default Template</option>';
      $box_options = '';

      // here's the meat. For EACH of the available templates, create an
      // <option> for it, and put it in our <select> box.
      foreach ( $template_options as $name => $file ) {
          if ( $current_template == $file ) {
              $box_options .= '<option value="' . $file . '" selected="selected">' . $name . '</option>';
          } else {
              $box_options .= '<option value="' . $file . '">' . $name . '</option>';
          }
      }

      // echo our markup (you should return it, but we won't do that here).
      echo $box_label;
      echo $box_select;
      echo $box_default_option;
      echo $box_options;
      echo '</select>';
  }

Note, we don't have to register this function with any hooks or filters because it's called directly from the add_meta_box function. Now we should have a fully rendered meta box on our custom post type pages. It's not saving any settings yet, but next we can start persisting the selection. (See fig 1.2)

*fig 1.2*
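One thing worth noting: get_page_templates() only finds theme files that declare a Template Name header comment. So for a template to show up in our dropdown, the theme file needs something like this at the top (the file contents and template name here are hypothetical):

```php
<?php
/*
Template Name: Full Width
*/
// ... the rest of the theme's template markup goes here ...
```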

Persisting the Meta Box Data

WordPress makes saving the data from the meta box really simple. In our case, it's going to see our select box and look for the selected option. Since this isn't a meta box tutorial, I'll leave out the details of how the saving works; all we need to know is that the field gets saved, and what the name of the saved field is. Here's the code:

  function postTemplateMetaBoxSave( $post_id ) {
      $current_nonce = isset( $_POST['cpt_template_meta_nonce'] ) ? $_POST['cpt_template_meta_nonce'] : '';
      $is_autosaving = wp_is_post_autosave( $post_id );
      $is_revision   = wp_is_post_revision( $post_id );
      $valid_nonce   = wp_verify_nonce( $current_nonce, basename( __FILE__ ) );

      // if the post is autosaving, a revision, or the nonce is not valid,
      // do not save any changed settings.
      if ( $is_autosaving || $is_revision || ! $valid_nonce ) {
          return;
      }

      // Find our 'cpt_page_template' field in the POST request, and save it
      // when the post is updated. Note that the POST field matches the
      // name of the select box in the markup.
      if ( isset( $_POST['cpt_page_template'] ) ) {
          $cpt_page_template = $_POST['cpt_page_template'];
          update_post_meta( $post_id, 'cpt_page_template', $cpt_page_template );
      }
  }
  add_action( 'save_post', 'postTemplateMetaBoxSave' );

At the end we hook into save_post with add_action, so this function runs when the post is saved. It stores a field called cpt_page_template in our database for this specific post. We can access this field when the page is loaded.
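If you want to sanity-check that the value actually persisted, you can read it back with get_post_meta (the post ID here is hypothetical):

```php
// Read the saved template for a given post; the third argument (true)
// returns a single value instead of an array of values.
$saved = get_post_meta( 42, 'cpt_page_template', true );
```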

Retrieving the template on the front end

This is the fun part. Now we have a shiny new meta box on our admin post pages, and a field in the database for each post telling us what template to show. So let's show it!

Fortunately this part is also pretty straightforward, and only requires a single function. Again, I commented it inline since it flows pretty linearly. Let's take a look at the code:

  function loadMyCptPostTemplate( $template ) {
      // get the queried object, which contains the information we need to
      // access our post meta data
      $query_object  = get_queried_object();
      $page_template = get_post_meta( $query_object->ID, 'cpt_page_template', true );

      // the name of our custom post type for which we'll load a template
      $my_post_type = 'my-cpt-name';

      // only apply our template to our CPT pages; leave everything else alone.
      if ( $query_object->post_type != $my_post_type ) {
          return $template;
      }

      // create an array of default templates (note the double quotes, so the
      // variables interpolate)
      $default_templates   = array();
      $default_templates[] = "single-{$query_object->post_type}-{$query_object->post_name}.php";
      $default_templates[] = "single-{$query_object->post_type}.php";
      $default_templates[] = 'single.php';

      // if the page_template isn't empty, prefer it over the defaults
      if ( ! empty( $page_template ) ) {
          array_unshift( $default_templates, $page_template );
      }

      // locate the template and return it, falling back to the original
      $new_template = locate_template( $default_templates, false );
      return $new_template ? $new_template : $template;
  }
  add_filter( 'single_template', 'loadMyCptPostTemplate' );

The add_filter() call at the end is important. It allows us to hook into the query and change the template that gets displayed. In our case, we intercept the query, check which template was saved, and load that instead. Also notice that we keep a set of default templates, so if the post has no saved option, or something goes wrong when trying to find it, it won't fail.


So that's about it. The new template should load with all of the regular post content in the body. The only problem I've noticed is that some themes won't show the page's content on more specialized templates (like a Contact Page template), but that's pretty much expected, and it has always worked for the more common Full Width, Left Sidebar, etc., templates. This code is a little out of context, but the general idea is there and easy to adapt to any plugin.

:: Cody Reichert