find_in_batches(options = {}) public

Yields each batch of records that was found by the find options as an array. The size of each batch is set by the :batch_size option; the default is 1000.

You can control the starting point for the batch processing by supplying the :start option. This is especially useful if you want multiple workers dealing with the same processing queue. You can make worker 1 handle all the records between id 0 and 10,000 and worker 2 handle from 10,000 and beyond (by setting the :start option on that worker).
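For instance, here is a sketch of the second worker described above, assuming the 10,000 boundary (the first worker would need to scope itself below that id with its own condition, since there is no end option here):

  # Worker 2: begins batching at primary key 10,000 and continues to the end
  # of the table, in batches of 1,000 (the default size).
  Person.find_in_batches(start: 10_000, batch_size: 1000) do |group|
    group.each { |person| person.party_all_night! }
  end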

It’s not possible to set the order. That is automatically set to ascending on the primary key ("id ASC") to make the batch ordering work. This also means that this method only works with integer-based primary keys. You can’t set the limit either; that’s used to control the batch sizes.

Example:

  Person.where("age > 21").find_in_batches do |group|
    sleep(50) # Make sure it doesn't get too crowded in there!
    group.each { |person| person.party_all_night! }
  end
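Conceptually, each successive batch is fetched with a query along these lines (a simplified sketch, not the exact SQL Rails generates; last_id and batch_size stand in for values the method tracks internally), which is why the order and limit are reserved:

  # Roughly what each batch after the first looks like (simplified sketch):
  batch = Person.where("age > 21")
                .where("id > ?", last_id)  # primary key of the last record in the previous batch
                .order("id ASC")
                .limit(batch_size)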
March 27, 2013 - (>= v3.0.0)

When dealing with has_many through

find_in_batches walks the table by its non-repeating primary key id, so duplicate rows produced by a join collapse into a single record per id. Batch over the side of the association whose records are actually distinct.

  • User has many things

  • User has many socks through things

  • Sock has many things

  • Sock has many users through things

For the sake of argument, assume the first user has two socks and all other users have one sock. There are 1000 users in total and 1001 socks in total.
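A minimal sketch of the models described above (class and column names are assumed from the note):

  class User < ActiveRecord::Base
    has_many :things
    has_many :socks, through: :things
  end

  class Thing < ActiveRecord::Base
    belongs_to :user
    belongs_to :sock
  end

  class Sock < ActiveRecord::Base
    has_many :things
    has_many :users, through: :things
  end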

  User.joins(:socks).count
  => 1001
  agg = []
  # Incorrect
  User.joins(:socks).find_in_batches { |g| agg += g }
  agg.count
  => 1000

  Sock.joins(:users).count
  => 1001
  agg = []
  # Correct
  Sock.joins(:users).find_in_batches { |g| agg += g }
  agg.count
  => 1001
September 12, 2014

Be careful with .select

With 999 people in the table:

  Person.select('people.firstname').find_in_batches do |group|
    group.each { |person| puts person.firstname }
  end

This will work properly.

But with 1001 people in the table, this will raise “Primary key not included in the custom select clause”. It’s a bit of a time bomb: if you’re writing tests for methods that use this, you won’t see a failure unless you test with more records than the default batch size.
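A sketch of the fix is to include the primary key in the custom select (assuming the conventional people table); lowering :batch_size in tests also surfaces the problem with fewer records:

  # Including the primary key lets find_in_batches page past the first batch.
  Person.select('people.id, people.firstname').find_in_batches do |group|
    group.each { |person| puts person.firstname }
  end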