f0b0e7fd9a
use dedicated archives queue: as the used disk space will increase, we want fine-grained control
move zip logic into a dedicated method
wip
wip
fix(spec): make the spec pass
tech(improvements): avoid File.delete(folder); favor FileUtils.remove_entry_secure, which is safer. Also wrap most of the code that opens files in blocks so they are cleaned up when the block ends. Lastly, use attachment.download to avoid heavy memory pressure (download in chunks, write in chunks); otherwise big files [124 > 1 GB] are loaded into memory. What if we run multiple jobs/downloads in parallel?
fix(spec): retry gracefully
clean(procedure_archive_service_spec.rb): better retry (avoid rewriting an open file)
lint(things): everything
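The streaming and cleanup points above can be sketched in plain Ruby. This is an illustrative sketch only: `copy_in_chunks`, `CHUNK_SIZE`, and the temp-dir setup are invented names, and in the real service the chunks would come from ActiveStorage's `attachment.download`, which yields chunks when given a block.

```ruby
require "fileutils"
require "tmpdir"

# Sketch: stream a large file in fixed-size chunks so it is never fully
# loaded in memory, keep file handles inside blocks so they are closed
# automatically, and remove the work directory securely afterwards.
CHUNK_SIZE = 1024 * 1024 # 1 MiB per read (illustrative value)

def copy_in_chunks(source_path, dest_path)
  File.open(source_path, "rb") do |source|    # handle closed when block ends
    File.open(dest_path, "wb") do |dest|
      while (chunk = source.read(CHUNK_SIZE)) # read at most CHUNK_SIZE bytes
        dest.write(chunk)                     # write as we go: O(CHUNK_SIZE) memory
      end
    end
  end
end

workdir = Dir.mktmpdir("archive")
begin
  src = File.join(workdir, "in.bin")
  File.binwrite(src, "x" * (3 * CHUNK_SIZE + 7)) # stand-in for a big attachment
  copy_in_chunks(src, File.join(workdir, "out.bin"))
ensure
  # Safer than File.delete(folder): removes the whole tree and guards
  # against symlink attacks in world-writable directories.
  FileUtils.remove_entry_secure(workdir)
end
```

The `ensure` block mirrors the commit's point: cleanup runs even when zipping or downloading fails, and `remove_entry_secure` replaces the unsafe `File.delete(folder)`.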
20 lines
460 B
Ruby
module Utils
  module Retryable
    # usage:
    #   max_attempt: retry count
    #   errors: only retry on those errors
    #
    #   with_retry(max_attempt: 10, errors: [StandardError]) do
    #     do_something_which_can_fail
    #   end
    def with_retry(max_attempt: 1, errors: [StandardError], &block)
      limiter = 0
      begin
        yield
      rescue *errors
        limiter += 1
        retry if limiter <= max_attempt
        raise
      end
    end
  end
end
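A minimal usage sketch of the module above. The `FlakyClient` class and its `fetch!` helper are invented for illustration; the module body is reproduced so the example is self-contained.

```ruby
# Module as defined in the file above.
module Utils
  module Retryable
    def with_retry(max_attempt: 1, errors: [StandardError], &block)
      limiter = 0
      begin
        yield
      rescue *errors
        limiter += 1
        retry if limiter <= max_attempt
        raise
      end
    end
  end
end

# Hypothetical consumer: fails a fixed number of times, then succeeds.
class FlakyClient
  include Utils::Retryable

  def initialize(failures)
    @remaining_failures = failures
    @attempts = 0
  end

  def fetch!
    # Only IOError is retried; anything else propagates immediately.
    with_retry(max_attempt: 3, errors: [IOError]) do
      @attempts += 1
      if @remaining_failures > 0
        @remaining_failures -= 1
        raise IOError, "transient network error"
      end
      "payload after #{@attempts} attempts"
    end
  end
end

puts FlakyClient.new(2).fetch! # succeeds on the 3rd attempt
```

Note the semantics: `max_attempt` counts retries, not total attempts, so `max_attempt: 3` allows up to 4 executions of the block before the last error is re-raised.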