Errors generated by the `invisible_captcha` gem are reported as
`flash[:error]` (which differs from Rails' usual `flash[:alert]`).
We probably shouldn't use `flash[:error]` in our own codebase, but we can
handle its use in third-party libraries.
Avoid crashing when the flash level passed to this helper is not known.
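A minimal sketch of the kind of fallback this implies, assuming a hypothetical `flash_class` view helper that maps flash levels to CSS classes (names and classes are illustrative, not the actual codebase):
```
# The point is the `else` branch: an unknown level such as :error coming
# from a third-party gem falls back to a default class instead of raising.
def flash_class(level)
  case level.to_sym
  when :notice then 'alert-success'
  when :alert, :error then 'alert-danger'
  else 'alert-info'
  end
end
```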
For an unknown reason, `ActiveRecord::Base.connected?` no longer works in dev: it returns false negatives.
This simple workaround can time out. The caller has to monitor the response time.
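A minimal sketch of such a workaround, assuming it checks connectivity with a trivial query (hypothetical method name, not the actual implementation):
```
# Issue a trivial query instead of trusting ActiveRecord::Base.connected?.
# This query can itself hang on a dead connection, so the caller should
# measure and bound the time spent here.
def database_reachable?
  ActiveRecord::Base.connection.execute('SELECT 1')
  true
rescue ActiveRecord::ActiveRecordError
  false
end
```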
fix(profil_controller#update_email): changing the email from current_user.email to current_user.email destroys the current user. whoops ☠️
Update config/locales/en.yml
Co-authored-by: Pierre de La Morinerie <pierre.de_la_morinerie@beta.gouv.fr>
Update config/locales/fr.yml
Co-authored-by: Pierre de La Morinerie <pierre.de_la_morinerie@beta.gouv.fr>
Update spec/controllers/users/profil_controller_spec.rb
Update config/locales/fr.yml
Co-authored-by: Pierre de La Morinerie <pierre.de_la_morinerie@beta.gouv.fr>
Update spec/controllers/users/profil_controller_spec.rb
fix(spec): broken due to typo
use dedicated archives queue
As the used disk space will increase, we want fine-grained control
move zip logic into a dedicated method
zip
wip
wip
fix(spec): make the spec green
tech(improvements): avoid File.delete(folder) and favor FileUtils.remove_entry_secure, which is safer. Also wrap most of the code that opens files within blocks, so they are cleaned up when the block ends. Lastly, use attachment.download to avoid heavy memory pressure (download in chunks, write in chunks); otherwise big files (> 1 GB) are loaded into memory. What if we run multiple jobs/downloads in parallel?
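A rough sketch of the pattern described above (names are assumptions; `attachment` stands for any ActiveStorage attachment):
```
require 'tmpdir'
require 'fileutils'

tmp_dir = Dir.mktmpdir
begin
  target = File.join(tmp_dir, attachment.filename.to_s)
  # Stream the blob in chunks instead of loading the whole file in memory.
  File.open(target, 'wb') do |file|
    attachment.download { |chunk| file.write(chunk) }
  end
  # ... add `target` to the archive here ...
ensure
  # Safer than File.delete on a folder, and removes nested entries.
  FileUtils.remove_entry_secure(tmp_dir)
end
```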
fix(spec): try to retry with grace
clean(procedure_archive_service_spec.rb): better retry [avoid rewriting an open file]
lint(things): everything
feat(expiration_banner): enhance wording of expiration
feat(dossiers/expiration_banner): enhance wording regarding expiration to include duree_conservation_dossiers_dans_ds + extension_conservation; also add a spec on the expiration_banner for instructeur
fix(lint): lint haml
fix(spec): enable flipper and allow procedure to receive flipper check when checking banner presence
fix(doc): add missing documentation on readme regarding system testing with a visual feedback
fix(typo): add missing accent
clean(PR): feedback from Tchak; better to wrap the per-procedure expirability feature check within a dossier.expirable? helper
tech(question): discard_and_keep_track! ; are we really keeping track with default_scope { kept } ?
feat(stats): add DeletedDossier in Stat computations
Revert "tech(question): discard_and_keep_track! ; are we really keeping track with default_scope { kept } ?"
This reverts commit d1155b7eeaaf1a9f80189e59667e109541fcb089.
feat(stats): support deleted_dossiers for last_four_months_hash and cumulative_hash. extract sanitize query & merge hashes into methods
clean(rubocop): lint with rubocop
Update db/migrate/20211126080118_add_index_to_deleted_at_to_deleted_dossiers.rb
Co-authored-by: LeSim <mail@simon.lehericey.net>
fix(rubocop): avoid unneeded allocation
fix(migration): add concurrent index with expected syntax
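For reference, a sketch of the expected syntax for a concurrent index in a Rails migration, matching the migration file mentioned above (the Rails version is assumed):
```
class AddIndexToDeletedAtToDeletedDossiers < ActiveRecord::Migration[6.1]
  # Concurrent index creation cannot run inside a transaction.
  disable_ddl_transaction!

  def change
    add_index :deleted_dossiers, :deleted_at, algorithm: :concurrently
  end
end
```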
fix(brakeman): add ignore message since the group date_trunc evaluation is used only by ourselves
i18n(france_connect/*): replace wording with i18n
fix(lint): i18n key issue
secu(views/france_connect/particulier/merge.html.haml): sanitize france_connect_email just in case
fix(brakeman): sanitize FCI.email_france_connect when used with html_safe via an I18n.t, also add exception to brakeman
feat(fci.confirmation_code): add confirmation code to france_connect_informations
feat(user_mailer.france_connect_confirmation_code): add confirmation by email mail method/preview/spec, pointing to merge_mail_with_existing_account (reuse existing method)
feat(mail_merge): mail merge
feat(merge.cannot_use_france_connect): same behaviour as callback
clean(fci.confirmation_code): use same token for mail validation as merge
feat(resend_france_connect/particulier/merge_confirmation): resend email with link. also enhance some translations and clean up a half-finished refactoring
clean(tech): finalize story by plugging merge_with_new_account into email validation
fix(deadspec): was removed
fix(spec): broken after last refactoring
lint(rubocop): space before parenthesis
lint(haml-lint): yoohoooo space before =
fix(lint): scss now :D
Update app/assets/stylesheets/buttons.scss
cleanup
feat(france_connect): re-add confirm by email, with an option for confirmation by email instead of only confirmation by email
fixup! Add confirmation by email when merging DC/FC accounts
fix(lint): haml_spec failure
Deep-cloned objects have all their relationships stale. Thus, for a
newly deep-cloned revision, `revision.types_de_champs` returns `[]`,
even when it actually has associated types de champ.
This causes consecutive champs creations and re-ordering to fail in
subtle ways, like:
```
procedure.draft_revision.add_type_de_champ(…)
procedure.publish_revision!
procedure.draft_revision.add_type_de_champ(…)
procedure.draft_revision.move_type_de_champ(…) # this will fail
```
As `publish_revision!` created a new stale revision, moving the type
de champ fails because not all existing champs are found until the
object is refreshed.
We don't hit this path in production, because usually only a single
operation is made in a request.
To fix this, save the new revision before associating it as the
procedure's draft revision.
(Another option would be to `reload` the revision after creation, but
this seems better contained and matches the name of the method.)
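A rough sketch of the shape of the fix, under assumptions (deep_clone from the deep_cloneable gem, association and method names guessed from the commit message, not the actual codebase):
```
def publish_revision!
  new_draft = draft_revision.deep_clone(include: :revision_types_de_champ)
  # Persist the clone first, so its associations are saved and no longer
  # stale when it becomes the draft revision.
  new_draft.save!
  update!(published_revision: draft_revision, draft_revision: new_draft)
end
```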
It seems better to create associations in a declarative fashion, rather
than using imperative code. This also makes the attribute compatible
with build_stubbed.
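A minimal sketch of the difference, with illustrative factory and association names (not the actual factories):
```
FactoryBot.define do
  factory :procedure do
    # Imperative (hits the database, breaks build_stubbed):
    #   after(:create) { |procedure| procedure.update(service: create(:service)) }
    # Declarative (works with build, create and build_stubbed):
    association :service
  end
end
```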
Calling business logic in a factory is a code-smell, because it
usually requires the object to be saved into database, and may have
unintended consequences when the business logic is changed.
Also, this allows just building a published procedure, without saving it
to the database.
This fix prevents repetition children types de champ from being pulled from cloned procedures. stable_id is stable across revisions, but also across cloned procedures.
Creating dossiers is faster than creating a procedure, but still slow.
We can create a single dossier in the default case, and only create
several others when the example requires it.
Speeds up this spec from 0m 57s to 0m 49s.
Creating a procedure with all available types de champ is slow. We can
create a simpler procedure in the default case, and only create all
types de champs when the example requires it.
Speeds up this spec from 1m 55s to 0m 57s.
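A rough RSpec sketch of the pattern described above (factory arguments and names are illustrative, not the actual specs): define a cheap default, and override it only in the contexts that need the expensive setup.
```
RSpec.describe 'procedure export' do
  # Cheap default used by most examples.
  let(:types_de_champ) { [{ type: :text }] }
  let(:procedure) { create(:procedure, types_de_champ: types_de_champ) }

  context 'with many types de champ' do
    # Expensive setup, built only for the examples in this context.
    let(:types_de_champ) { [{ type: :text }, { type: :checkbox }, { type: :date }] }

    it 'covers every champ type' do
      expect(procedure.types_de_champ.size).to eq(3)
    end
  end
end
```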
Creating a dossier with available champs populated is slow. We can
create simpler dossiers in the default case, and only populate all
champs when the example requires it.
Speeds up this spec from 2m 20s to 1m 55s.
Before, every time a password was tested, the dictionaries were parsed
again by zxcvbn.
Parsing dictionaries is slow: it may take up to ~1s. This doesn't matter
that much in production, but it makes tests very slow (because we tend
to create a lot of User records).
With this change, the tester instance is shared between calls, class
instances and threads. It is lazily loaded on first use, in order not to
slow down the application boot sequence.
This uses ~20 MB of memory (only once for all threads), but makes tests
more than twice as fast.
For instance, model tests go from **8m 21s** to **3m 26s**.
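A minimal sketch of the memoization, with assumed names (the actual dictionary loading is elided):
```
class ZxcvbnService
  @tester_mutex = Mutex.new

  # Shared between calls, instances and threads; built lazily on first use
  # so the application boot sequence is not slowed down.
  def self.tester
    @tester_mutex.synchronize do
      @tester ||= Zxcvbn::Tester.new # plus loading the custom dictionaries
    end
  end
end
```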
NB:
An additional optimization could be to preload the tester on
boot, before workers are forked, to take advantage of Puma's copy-on-write
mechanism. In this way all forked workers would use the same cached
instance.
But:
- We're not actually sure this would work properly. What if Ruby updates
an internal ivar on the class, and this forces the OS to copy the
whole data structure in each fork?
- Puma phased restarts are not compatible with copy-on-write anyway.
So we're avoiding this optimization for now, and accept the extra 20 MB
per worker.
instead of looking up the linked user by email, because:
- it follows the FC recommendation to fetch the DS account by openid
- the email is not a valid key, as many users can share the same FCI email.
The following scenario now works:
A user A (email: 1@mail.com) uses FC to connect to DS
=> they are connected as 1@mail.com
Another user B (email: generic@mail.com) uses FC to connect
=> they are connected as generic@mail.com
User A then changes their FC email to generic@mail.com and connects to DS
=> they are still connected as 1@mail.com
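A minimal sketch of the lookup by the stable OpenID identifier (model and column names are assumptions):
```
# The `sub` claim returned by FranceConnect is stable per user, unlike the
# email, so it is used as the key to find the linked DS account.
fci = FranceConnectInformation.find_by(france_connect_particulier_id: openid_sub)
user = fci&.user
```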
There's a random failure in this spec, where the CI triggers this error:
> Failure/Error: check('checkbox')
>
> Selenium::WebDriver::Error::ElementClickInterceptedError:
> element click intercepted: Element <input required="required" type="checkbox" value="on" name="dossier[champs_attributes][7][value]" id="dossier_champs_attributes_7_value"> is not clickable at point (205, 892). Other element would receive the click: <div class="send-dossier-actions-bar">...</div>
That's because the checkbox is partially overlapped by the sticky
action bar at the bottom of the screen – but only _some of the time_.
This commit attempts to fix the issue by manually scrolling the checkbox
to the center of the screen before clicking it.
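A minimal sketch of the scroll-then-check approach (a hypothetical spec helper, not the actual code):
```
# Scroll the checkbox to the center of the viewport so the sticky action
# bar at the bottom cannot intercept the click.
def scroll_to_center_and_check(locator)
  element = find(locator)
  page.execute_script("arguments[0].scrollIntoView({ block: 'center' })", element)
  element.set(true)
end
```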
By default, Devise will look for views:
1. First in `views/resource/passwords/…`,
2. Then in `views/devise/passwords/…` if not found.
By moving the views to `views/devise`, we avoid having a partial in
`views/shared` that we need to include manually, and instead let Devise
do the job automatically.
We don't want to expose the full demarche type on dossiers, because it would open the door to recursive queries that we want to avoid. DemarcheDescriptorType is a lightweight representation of demarche metadata.
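A sketch of what such a lightweight type can look like with graphql-ruby (the field list is illustrative, not the actual type):
```
class Types::DemarcheDescriptorType < Types::BaseObject
  # Only plain metadata: no nested dossiers or demarche, so queries cannot
  # recurse back into the graph.
  field :number, Int, null: false
  field :title, String, null: false
  field :description, String, null: true
end
```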
In a9a4f6e2a8, a task to migrate
ProcedurePresentation's filters was added.
This task added a "migrated: true" key to all migrated filters.
Now that this task has run, we can safely remove the extra key.
In a previous version of this commit, the migration would fail for
invalid ProcedurePresentation records. This is now fixed.
Providing a query param ("locale") will enable localization. A language picker will be shown once
localization is activated. The locale is stored in a "locale" cookie.
There are two cases where the draft auto-save might fail because the
user is no longer authenticated:
- The user signed out in another tab,
- The browser quit and re-opened, so the session cookie expired.
In both cases, the auto-save will never succeed until the user
authenticates again, so displaying a "Retry" button is cruel.
Moreover, on top of all auto-save requests failing with a small error,
the actual hard failure only occurs after filling in the whole form and
trying to submit it. Then the user is redirected to the sign-in page –
but all their changes are lost.
Instead, we now redirect to the sign-in page on the first 401 error
during the auto-save, let the user sign-in, and then redirect back to
the form.
The check for whether the checkbox should be checked or not was made by
matching against the whole values string. Thus, given two options 'valid' and
'invalid', the check for the presence of 'valid' would succeed even when
only 'invalid' was present in the values (because
`'invalid'.include?('valid')` is true).
The code now checks against the list of items in the selected_options.
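A minimal illustration of the difference:
```
selected_options = ['invalid']

# Substring matching on the joined values: false positive.
selected_options.join(',').include?('valid') # => true

# Membership check against the list of selected options: correct.
selected_options.include?('valid')           # => false
```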
This occurs mostly when Safari attempts to perform a POST request
again (without sending any of the cookies).
In that case, our custom `422.html` page is more helpful to the user
(because it has a link to the previous page) than a blank "No cookies"
text.
Before this commit, the average dossier weight took into account only pieces
justificatives. With this commit, we add a minimum weight for the other
files included in an archive, like the pdf_export, operation logs and
attachments added to traitements. This minimum weight is set arbitrarily,
based on the observation of some random procedures in production.
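A rough sketch of the estimate (names and the constant are assumptions, not the actual values):
```
# Arbitrary minimum added per dossier to account for the export PDF,
# operation logs and traitement attachments included in the archive.
MINIMUM_EXTRA_WEIGHT_PER_DOSSIER = 500.kilobytes

def estimated_dossier_weight(dossier)
  pjs_weight = dossier.champs.to_a
    .filter { |champ| champ.respond_to?(:piece_justificative_file) && champ.piece_justificative_file.attached? }
    .sum { |champ| champ.piece_justificative_file.byte_size }
  pjs_weight + MINIMUM_EXTRA_WEIGHT_PER_DOSSIER
end
```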
There is sometimes an error that happens when building an 'everything'
archive. The cause of the error is not understood at the moment.
To deliver the archive feature quickly, we remove the 'everything' archive for
the moment.