I’ve found that it’s pretty easy to run your own custom blog with zero hosting cost, and even survive some significant traffic spikes, with liberal usage of HTTP Caching in Rails. So it’s somewhat ironic that when I came to update this blog this morning (after giving the last post a good year to really take root in the Collective Unconscious), I noticed that my caching fetish had shot me in the foot.
You see1, for logged-in users (i.e. me) the blog pages have an extra
panel in the sidebar for creating new posts. Since all of the blog pages are cached (with
last_modified set to the update time of the most recent post), that admin panel was getting cached along with the rest of the page, and served to all public visitors (both of them). D’oh!
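The conditional-GET mechanics behind that are simple enough to boil down to plain Ruby. The respond method below is an invented stand-in for what Rails’ fresh_when does under the hood, not a real Rails API: if the client’s If-Modified-Since is at least as new as the page’s last-modified time, it answers with an empty 304; otherwise it sends the full page with a Last-Modified header for next time.

```ruby
require "time"

# Invented stand-in for Rails' fresh_when conditional-GET handling.
# if_modified_since is the raw header value the client sent back, if any.
def respond(last_modified:, if_modified_since: nil)
  if if_modified_since && Time.httpdate(if_modified_since) >= last_modified
    # Client's cached copy is still fresh: empty 304, no body rendered.
    { status: 304, body: nil }
  else
    # Full render, stamped with Last-Modified so the next request can be cheap.
    { status: 200,
      headers: { "Last-Modified" => last_modified.httpdate },
      body: "<html>...the blog page...</html>" }
  end
end
```

A follow-up request that echoes the Last-Modified value back as If-Modified-Since gets the cheap 304, which is the whole point: the page only renders when a newer post exists.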
The quick solution was pretty simple:
fresh_when(...) unless current_user
That will at least keep the publicly-cached page free of user- (and, especially, admin-) specific HTML. If that seems like such an obvious thing to do that only an idiot wouldn’t think to do it, then you’re probably reading the wrong blog.
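Boiled down to plain Ruby (illustrative names, not this blog’s actual code), the effect of that one-liner is just this: conditional-GET headers get set only for anonymous visitors, so a logged-in user’s page never advertises itself as cacheable.

```ruby
require "time"

# Effect of `fresh_when(...) unless current_user`: skip the cache
# headers entirely when someone is logged in, so their personalized
# page renders fresh every time and never lands in a shared cache.
def cache_headers(current_user, last_modified)
  return {} if current_user                      # logged in: no cache headers
  { "Last-Modified" => last_modified.httpdate }  # anonymous: cacheable
end
```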
For the rest of us, I’m contemplating more complicated (but safer!) solutions, to keep us from spamming the internet with our private data. The first step might be a mechanism where any partial that renders user-specific HTML can cancel the HTTP caching for that response. A more scalable solution would involve Fragment Caching and Memcached. But part of me wants to go Full Paranoid with some sort of Frankensteinian melding of SafeBuffers and CanCan, so no string ever makes it to the response body unless it’s Certified Cache Safe.
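For that first idea, one hypothetical shape (every name below is invented; nothing like CacheSafeResponse exists in Rails) is a response object where any partial that emits user-specific HTML flips a flag, and the shared-cache headers get stripped before the response ships:

```ruby
require "time"

# Hypothetical sketch: a partial can veto HTTP caching for the whole
# response. Any user-specific render flips @private; finalize! then
# strips the shared-cache headers before the response is sent.
class CacheSafeResponse
  attr_reader :headers

  def initialize(last_modified)
    @headers = { "Last-Modified" => last_modified.httpdate,
                 "Cache-Control" => "public" }
    @private = false
  end

  # Called by any partial that renders user-specific HTML.
  def render_private!(html)
    @private = true
    html
  end

  # Strip shared-cache headers if anything private was rendered.
  def finalize!
    if @private
      @headers.delete("Last-Modified")
      @headers["Cache-Control"] = "private, no-store"
    end
    self
  end
end
```

The appeal is that the knowledge of what’s private lives in the partial that renders it, instead of every controller action having to remember the `unless current_user` dance.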
1 That’s a retroactively-foreshadowing meta-pun. You’re welcome.