It's been over a year since the launch of perfplanet.com. It looks like it's been good and useful for people so far. Sergey "ShowSlow" is also doing a great job of tweeting as @perfplanet about news from the perfplanet pipes, as well as other interesting happenings in our perf community. Good stuff.
From the beginning I knew I wasn't including all the blogs that deserve it, and new ones keep coming up. So I said: send me an email, I'll look around the blog and add it to the planetarium. The problem was that the process was cumbersome, and I don't always have the time (or am just being lazy, because the process is kinda involved). Namely: update a Yahoo Pipe and update an HTML page with the list of blogs.
So I decided to take a few hours tonight to remedy the situation. I thought long and hard for what must have been a whole minute, and the solution that came to me was GitHub.
All the code is now updated, there's a bit of a build process (minification and such), and the code is now on GitHub, yay. So if you send a pull request, I just accept it, run the build script, and update the site.
There is this "planetarium.json" file:
All you have to do is update this file, add your (or your favorite) blog for syndication and that's that. Or delete a spammy or irrelevant blog.
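For illustration, an entry might look something like this — the names and the exact schema here are my guesses, so check the real planetarium.json for the actual format:

```javascript
// Hypothetical shape of planetarium.json -- the real schema may differ.
// Each entry maps a blogger to their site and performance-tagged feed.
var planetarium = [
  {
    name: "Example Blogger",                            // shown in the blogroll
    url:  "http://example.com/",                        // link on the page
    feed: "http://example.com/tag/performance/feed/"    // syndicated RSS/Atom feed
  }
];

// Adding your blog is just appending another object:
planetarium.push({
  name: "Another Blogger",
  url:  "http://example.org/",
  feed: "http://example.org/category/performance/feed/"
});

console.log(planetarium.length); // 2
```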
When adding a feed URL, try to find the "performance"-related category, because, sadly, not everyone is as interested in other people's cats as they are in other people's performance thoughts.
E.g. Ben Cherry's feed is:
But we're only interested in the posts tagged "performance":
There are exceptions, of course, some folks only talk about performance.
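As an illustration (the URLs here are made up, and the exact pattern depends on the blogging platform), a typical WordPress feed can be narrowed down to a tag like so:

```javascript
// Common WordPress URL patterns (illustrative only):
// everything:        http://example.com/feed/
// performance only:  http://example.com/tag/performance/feed/
var allPosts = "http://example.com/feed/";
var perfOnly = allPosts.replace(/\/feed\/$/, "/tag/performance/feed/");
console.log(perfOnly); // http://example.com/tag/performance/feed/
```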
About internationalization: talk to me. There is currently an fr.perfplanet.com, not necessarily maintained. But if you want to aggregate blogs in your language, you can just clone the GitHub project and maintain your list of blogs, and I'll set up your-lang.perfplanet.com and start pulling updates from GitHub.
So first, I switched from Yahoo Pipes to YQL, because the aggregation request can be generated from a list of URLs; there's no need to use a UI, log into Pipes, etc.
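A sketch of the idea (not necessarily the exact query perfplanet uses): with YQL, the aggregation request is just a string built from the list of feed URLs and sent to Yahoo's public YQL endpoint.

```javascript
// Build a YQL query from a list of feed URLs (URLs here are made up):
var feeds = [
  "http://example.com/tag/performance/feed/",
  "http://example.org/category/performance/feed/"
];
var yql = 'select * from feed where url in ("' + feeds.join('","') + '")';

// The query goes to the public YQL endpoint as a regular GET request:
var request = "http://query.yahooapis.com/v1/public/yql" +
              "?q=" + encodeURIComponent(yql) + "&format=json";
console.log(request);
```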
Then there's the build script:
- it takes the JSON list of blogs, an HTML template, CSS, and JS
- generates index.html with inline minified CSS and JS using cssmin.js and jsmin.js. Gotta minify, gotta save requests
- also in the index.html there's a list of blogger names and URLs generated from the JSON
- generates an up.sh ("up" as in update) - this is a curl call with the generated YQL query. The file is executed by a cron job every hour to read new blog posts and write data.js
- data.js is then used in index.html to display the content
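The up.sh part can be sketched like this — the file names and the exact query are my assumptions, but the shape of the thing is: the build writes a tiny shell script that curls the YQL request and saves the result as data.js, and a cron entry like `0 * * * * /path/to/up.sh` runs it hourly.

```javascript
// Sketch of generating up.sh at build time (names are assumptions):
var yql = 'select * from feed where url in ' +
          '("http://example.com/tag/performance/feed/")';
var yqlRequest = "http://query.yahooapis.com/v1/public/yql" +
                 "?q=" + encodeURIComponent(yql) + "&format=json";

// The generated script is just a one-liner: fetch the feeds, save data.js.
var script = "#!/bin/sh\n" +
             "curl -s '" + yqlRequest + "' > data.js\n";
console.log(script);
```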
So that's that. Hopefully this way the site will see many more updates with fresh new content and blogs.
Bug reports, etc., welcome.
Contributions to the list of blogs (or anything else really) more than welcome.