Marko Anastasov wrote this on March 8, 2009
Using delayed_job to run tasks asynchronously
A common programming task when you’re developing a Rails app (or in fact with any other framework) is to asynchronously execute potentially long tasks, as well as those that rely on another party’s resources, in order to keep your app responsive. Such jobs typically include sending a lot of email, image processing, file downloads, generating PDFs, uploading something to S3 and so on.
In our previous projects we’ve been using Starling and the Workling plugin to execute background tasks. (Our first attempt was actually with BackgrounDRb, but we couldn’t get it to run robustly in production.) That solution has been working fine, but it always felt slightly overblown for the simple problem of running another Rails process for long tasks. It also meant monitoring multiple processes that depend on each other for that sole purpose, which is better avoided.
I tried delayed_job for the first time on Friday and had our background processes up and running quickly. It is a very lightweight Rails plugin with zero dependencies, and the source files in lib/ are less than 400 LoC.
With delayed_job, your job classes need to have a perform method, and in general look like this:
```ruby
class FooJob
  # The job runs in another Rails process, so you need to pass in the
  # object instance and store only its id on the job
  attr_accessor :foo_id

  def initialize(foo)
    self.foo_id = foo.id
  end

  def perform
    foo = Foo.find(foo_id)
    # do something with foo
  end
end
```
Then they’re ready to run with:

```ruby
Delayed::Job.enqueue FooJob.new(foo)
```
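To see why the job stores an id rather than the record itself: the job object gets serialized into the jobs table when you enqueue it, and deserialized in the worker process before perform is called, so everything on the job has to survive that round trip. Here’s a minimal pure-Ruby sketch of the idea (delayed_job actually serializes with YAML; Marshal stands in for the round trip here, and NewsletterJob is a made-up example class):

```ruby
# Hypothetical job following the pattern above: keep only the id,
# then reload the record inside perform.
class NewsletterJob
  attr_accessor :subscriber_id

  def initialize(subscriber_id)
    self.subscriber_id = subscriber_id
  end

  def perform
    # In a real app this would be Subscriber.find(subscriber_id)
    # followed by the actual work (rendering and sending the email).
    "delivered newsletter to subscriber #{subscriber_id}"
  end
end

job = NewsletterJob.new(42)

# Simulate the enqueue/work round trip: the worker only ever sees
# a deserialized copy of the job, never the original object.
reloaded = Marshal.load(Marshal.dump(job))
puts reloaded.perform
```

Holding a live ActiveRecord instance on the job instead would mean serializing the whole record, which can go stale (or fail to deserialize at all) by the time the worker picks it up.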
See the project’s README for other possibilities and setup instructions. A cool thing to do, which is mentioned on the wiki, is to daemonize the worker. Also, check out the GitHub folks’ configuration file for God.
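For reference, a worker is just a process that polls the jobs table and runs whatever is pending. A rough sketch of starting one, based on the rake task the plugin ships with (the exact task name may vary by version, so the README is authoritative):

```shell
# Run a single worker in the foreground; it loops, claiming and
# performing pending rows from the delayed_jobs table:
rake jobs:work

# For production you'd daemonize this instead of keeping a terminal
# open, e.g. with the daemons gem or under God as mentioned above.
```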