Switch to the zip unix binary to create the archive. Also use a dedicated queue for DelayedJob

use a dedicated archives queue

As disk space usage will increase, we want fine-grained control
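
A minimal sketch of what the dedicated queue could look like; the job class name, the `archives` queue name and the service call are assumptions for illustration, not taken from this diff:

    # Hypothetical ActiveJob wrapper running on its own DelayedJob queue,
    # so disk-heavy archive creation can be throttled independently.
    class ArchiveCreationJob < ApplicationJob
      queue_as :archives

      def perform(procedure, archive)
        # assumed service entry point; the real method may differ
        ProcedureArchiveService.new(procedure).make_and_upload_archive(archive)
      end
    end

    # A dedicated DelayedJob worker can then be started with:
    #   bin/delayed_job --queue=archives start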

move zip logic into a dedicated method
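
A minimal sketch of such a method, shelling out to the zip binary; the method name and arguments are illustrative, not the commit's actual code:

    # Zip the content of tmp_dir into zip_path using the system zip binary
    # instead of building the archive in Ruby. zip_path should be absolute,
    # since we change the working directory.
    def zip(tmp_dir, zip_path)
      Dir.chdir(tmp_dir) do
        # -r: recurse into directories; -0: store without compression,
        # since attachments (PDFs, images) are usually already compressed
        system('zip', '-0', '-r', zip_path, '.', exception: true)
      end
    end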

zip

wip

wip

fix(spec): make spec green

tech(improvements): avoid File.delete(folder); favor FileUtils.remove_entry_secure, which is safer. Also wrap most of the code that opens files within blocks, so resources are cleaned up when the block ends. Lastly, use attachment.download with a block to avoid heavy memory pressure [download in chunks, write in chunks]; otherwise big files [124 > 1 GB] are loaded entirely in memory. What if we run multiple jobs/downloads in parallel?
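
A short sketch of the cleanup pattern described above (file and variable names are illustrative):

    require 'tmpdir'
    require 'fileutils'

    # Block forms clean up for us: the file handle is closed and the
    # directory removed when each block ends, even on exceptions.
    Dir.mktmpdir do |tmp_dir|
      File.open(File.join(tmp_dir, 'export.pdf'), 'wb') do |f|
        f.write('...')
      end
    end

    # When a block form is not possible, favor the safer helper over File.delete:
    dir = Dir.mktmpdir
    begin
      # ... write files under dir ...
    ensure
      FileUtils.remove_entry_secure(dir)
    end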

fix(spec): retry gracefully

clean(procedure_archive_service_spec.rb): better retry [avoid rewriting to an open file]

lint(things): everything
simon lehericey 2021-11-29 15:43:51 +01:00 committed by Martin
parent 68a0b6f474
commit f0b0e7fd9a
6 changed files with 164 additions and 40 deletions

@@ -1,4 +1,20 @@
class ActiveStorage::DownloadableFile
  # https://edgeapi.rubyonrails.org/classes/ActiveStorage/Blob.html#method-i-download
  def self.download(attachment:, destination_path:, in_chunk: true)
    byte_written = 0
    # we expect a path as a string, so we can recreate the file
    # (e.g. on failure/retry over a previously existing fd)
    File.open(destination_path, mode: 'wb') do |fd|
      if in_chunk
        # stream the blob chunk by chunk to keep memory usage low
        attachment.download do |chunk|
          byte_written += fd.write(chunk)
        end
      else
        # single read: the whole blob is loaded in memory before writing
        byte_written = fd.write(attachment.download)
      end
    end
    byte_written
  end
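
  # Hypothetical usage (not part of this diff), streaming a blob to disk
  # without loading it fully in memory:
  #   ActiveStorage::DownloadableFile.download(attachment: pj, destination_path: path)
  # or, for small blobs, a single in-memory read:
  #   ActiveStorage::DownloadableFile.download(attachment: pj, destination_path: path, in_chunk: false)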
  def self.create_list_from_dossier(dossier, for_expert = false)
    dossier_export = PiecesJustificativesService.generate_dossier_export(dossier)
    pjs = [dossier_export] + PiecesJustificativesService.liste_documents(dossier, for_expert)