We don't have enough information from the original
poster of this question to know for sure, but it seems to me that
a dataset that grew from something under 100 MB
to several hundred megabytes or more could simply have gone
from doing memory-resident updates to having to really
pound the disk, and thus run slower and slower each time
the size increased.
If the machine has several gigabytes of memory, this is
probably not the problem, but if it has more like 256 MB or
so, then I would expect batch updates to get slower and slower
as the database size grew past available memory.
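The reasoning above boils down to a simple threshold check: once the working set of a batch update no longer fits in RAM, the update becomes disk-bound. A minimal sketch, with the 256 MB figure and the dataset sizes being illustrative numbers taken from the scenario rather than anything measured:

```python
def fits_in_memory(dataset_bytes: int, available_ram_bytes: int) -> bool:
    """Rough heuristic: a batch update that touches the whole dataset
    stays memory-resident only while the data fits in available RAM."""
    return dataset_bytes <= available_ram_bytes

MB = 1024 * 1024
ram = 256 * MB  # hypothetical machine from the scenario

# Before the growth: well under 100 MB, comfortably memory-resident.
print(fits_in_memory(80 * MB, ram))   # True

# After the growth: several hundred MB, now forced onto the disk.
print(fits_in_memory(500 * MB, ram))  # False
```

In practice the crossover is not this sharp, since the OS, the database's own buffers, and other processes all compete for the same RAM, but the qualitative effect is the same: updates get dramatically slower once real disk I/O dominates.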