Well, the backend was the tricky part. I started it as a simple PHP crawler (which still works) that runs as a cronjob. The crawler just checks the file hierarchy in a directory against some database entries to figure out what's new, then adds the new articles to the static.js file, which contains all the information about the site.
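
To give a feel for what the crawler does, here is a rough sketch of that logic. The real thing is PHP and checks against a database; this JavaScript version uses a JSON file as a stand-in for the database, and the directory and file names are just assumptions for illustration.
Code:
// Sketch of the crawler idea: diff the article directory against what we
// already know about, then append new entries to static.js.
const fs = require('fs');
const path = require('path');

const ARTICLE_DIR = 'jb-articles/portfolio'; // assumed layout
const KNOWN_FILE  = 'known-articles.json';   // stand-in for the database
const STATIC_JS   = 'static.js';

// Articles already indexed on a previous cron run.
const known = fs.existsSync(KNOWN_FILE)
    ? JSON.parse(fs.readFileSync(KNOWN_FILE, 'utf8'))
    : [];

// Diff the file hierarchy against the known entries.
const onDisk = fs.readdirSync(ARTICLE_DIR).filter(f => f.endsWith('.txt'));
const fresh  = onDisk.filter(f => !known.includes(f));

// Append a minimal entry for every new article to static.js.
let index = known.length;
for (const file of fresh) {
    fs.appendFileSync(STATIC_JS,
        `jb_portfolio[${index}] = new Array();\n` +
        `jb_portfolio[${index}]["file"]  = "${path.join(ARTICLE_DIR, file)}";\n` +
        `jb_portfolio[${index}]["title"] = "${path.basename(file, '.txt')}";\n` +
        `jb_portfolio[${index}]["id"]    = "portfolio:${index}";\n` +
        `jb_portfolio[${index}]["date"]  = new Date(${Date.now()});\n\n`);
    index++;
}

// Remember what we have seen so the next run only picks up new files.
fs.writeFileSync(KNOWN_FILE, JSON.stringify(known.concat(fresh)));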

However, I don't use the crawler anymore, even though it's really cool, because it has a few drawbacks. When I started writing the blog, all articles were just the markdown files parsed into a div, but now there is much more meta information around an article, for example the list of tags and the GitHub data used to generate the nice fork and watch buttons that you can see in Portfolio posts. Here is what that looks like:
Code:
jb_portfolio[3] = new Array();
jb_portfolio[3]["file"]  	= "jb-articles/portfolio/vinter.txt";
jb_portfolio[3]["title"] 	= "Vinter";
jb_portfolio[3]["id"]	 	= "portfolio:3";
jb_portfolio[3]["github"]	= new Array("justsid", "vinter");
jb_portfolio[3]["date"]  	= new Date("September 10, 2011 12:00:00");
jb_portfolio[3]["tags"]	 	= new Array("vinter", "open-source", "projects", "portfolio");



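Once the data is in static.js, the front end only has to walk these arrays to build the page. Here is a minimal sketch of how an entry like the one above could be turned into markup, with the (user, repository) pair feeding the fork and watch buttons; renderPortfolioEntry is a made-up helper and the ghbtns.com iframe widget is an assumption, not necessarily what the site actually uses.
Code:
// Sketch only: turn one portfolio entry into a DOM node with a title and
// GitHub watch/fork buttons.
function renderPortfolioEntry(entry)
{
    var container = document.createElement('div');

    var title = document.createElement('h2');
    title.textContent = entry["title"];
    container.appendChild(title);

    // entry["github"] holds (user, repository), which is enough to point
    // iframe-based GitHub buttons at the right project.
    if (entry["github"]) {
        var user  = entry["github"][0];
        var repo  = entry["github"][1];
        var types = ["watch", "fork"];
        for (var i = 0; i < types.length; i++) {
            var button = document.createElement('iframe');
            button.src = "https://ghbtns.com/github-btn.html?user=" + user +
                         "&repo=" + repo + "&type=" + types[i];
            button.frameBorder = "0";
            container.appendChild(button);
        }
    }

    return container;
}

// Example: render the "Vinter" entry defined above.
document.body.appendChild(renderPortfolioEntry(jb_portfolio[3]));
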
So, instead of writing a full-featured PHP admin backend, I decided to write a Mac OS X app which does all of this for me, and because I'm lazy, it's just a command-line app which generates the static.js file for me. Not user friendly, but it works. And as I am the only user of the backend, I don't need to care about other users \o/
To write the articles, I use Mou.app, which is a Markdown writing and preview app for Mac OS X.


Shitlord by trade and passion. Graphics programmer at Laminar Research.
I write blog posts at feresignum.com