When this was an UPDATE statement with a WHERE clause and the LoginAttempts
table was empty, nothing would happen. Thankfully, SQLite supports an UPSERT
clause, so I can INSERT a new record or UPDATE an existing one, whichever applies.
And the best part is: it works!
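For the record, here's a minimal sketch of the resulting query using
sqlite-simple; the LoginAttempts column names (username, count) are my guesses
from the description, not necessarily the real schema:
```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.Text (Text)
import Database.SQLite.Simple (Connection, Only (..), execute)

-- Insert a fresh row on the first failed attempt, or bump the counter when a
-- row for this username already exists, via SQLite's UPSERT clause.
recordFailedLogin :: Connection -> Text -> IO ()
recordFailedLogin conn username =
  execute conn
    "INSERT INTO LoginAttempts (username, count) VALUES (?, 1) \
    \ON CONFLICT (username) DO UPDATE SET count = count + 1"
    (Only username)
```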
"SELECT *" in SQL may not guarantee the order in which a record's columns are
returned. For example, in my FromRow instances for Account, I make successive call
The following scenario silently and erroneously assigns:
firstName, lastName = lastName, firstName
```sql
CREATE TABLE People (
  firstName TEXT NOT NULL,
  lastName  TEXT NOT NULL,
  age       INTEGER NOT NULL,
  PRIMARY KEY (firstName, lastName)
);
```
```haskell
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards   #-}

import Database.SQLite.Simple
import Database.SQLite.Simple.FromRow

data Person = Person { firstName :: String, lastName :: String, age :: Integer }

instance FromRow Person where
  fromRow = do
    firstName <- field
    lastName  <- field
    age       <- field
    pure Person{..}

getPeople :: Connection -> IO [Person]
getPeople conn = query_ conn "SELECT * FROM People"
```
This can fail silently: the instance reads columns purely by position, so
nothing ties the `firstName` binding to the firstName column. If the schema
instead declared lastName before firstName, the FromRow Person instance would
still type-check, because both fields are Strings, but you should expect to
receive a list of names like "Wallace William" instead of "William Wallace".
The following won't break the type-checker, but will result in a runtime parsing
error:
```haskell
-- all code from the previous example remains the same except for:
  fromRow = do
    age       <- field
    firstName <- field
    lastName  <- field
    pure Person{..}
```
The "SELECT *" will return records like (firstName,lastName,age), but the
FromRow instance for Person will attempt to parse firstName as
Integer.
So... what have we learned? Prefer "SELECT firstName, lastName, age" to
"SELECT *".
Lots of changes here:
- Add the GET /verify endpoint
- Email users a secret using MailGun
- Create a PendingAccounts table and record type
- Prefer do-notation for FromRow instances (and in general) over the `<*>`
  or liftA2 style. Instances written with `<*>` depend on the order in which
  the record's fields were defined; combined with a "SELECT *", which returns
  the columns in whatever order the schema (or the DB implementation) defines
  them, that produces runtime parse errors at best and silent errors at worst.
  (See the sketch after this list.)
- Delete bill from accounts.csv to free up the wpcarro@gmail.com address when
  testing the /verify route.
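To make the do-notation point concrete, here's a sketch reusing the Person type
(and the pragmas/imports) from the notes above; only one of these instances
would exist at a time:
```haskell
-- Applicative style: each `field` is matched to a record field purely by
-- position, so reordering the fields in the data declaration (or the columns
-- returned by "SELECT *") silently changes what gets parsed where.
instance FromRow Person where
  fromRow = Person <$> field <*> field <*> field

-- do-notation style: the parse order is spelled out explicitly and does not
-- depend on how the record's fields happen to be declared.
instance FromRow Person where
  fromRow = do
    firstName <- field
    lastName  <- field
    age       <- field
    pure Person{..}
```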
Whatever version of this package is on nixpkgs right now is broken, so I'm
using `fetchGit` and `callCabal2nix` instead.
Create an Email module exposing a simplified `send` function that partially
applies some of the configuration options.
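Roughly the shape I have in mind; this is only a sketch, and the Config fields
and the underlying sendWithConfig are hypothetical stand-ins for the real
MailGun plumbing:
```haskell
module Email (send) where

-- Hypothetical MailGun configuration; not the real field names.
data Config = Config
  { apiKey :: String
  , domain :: String
  }

-- Hypothetical general-purpose sender that would talk to MailGun.
sendWithConfig :: Config -> String -> String -> String -> IO ()
sendWithConfig _config _to _subject _body = pure ()

-- The simplified `send`: the configuration is partially applied, so callers
-- only supply recipient, subject, and body.
send :: String -> String -> String -> IO ()
send = sendWithConfig (Config "mailgun-api-key" "mg.example.com")
```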
Using my dear friend dmjio's excellent library, envy, to read and parse
variables from the system environment.
I added and git-ignored the .envrc file that contains API secrets. I'm using
Envy to read these values so that I don't hard-code them into the source code.
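As a rough sketch of the envy pattern (the variable names here are invented,
and the exact FromEnv method signature differs a bit between envy versions):
```haskell
import System.Envy

-- Hypothetical secrets read from the environment (e.g. populated by .envrc).
data MailgunConfig = MailgunConfig
  { mailgunApiKey :: String
  , mailgunDomain :: String
  } deriving (Show)

instance FromEnv MailgunConfig where
  fromEnv _ =
    MailgunConfig
      <$> env "MAILGUN_API_KEY"
      <*> env "MAILGUN_DOMAIN"

main :: IO ()
main = do
  result <- decodeEnv :: IO (Either String MailgunConfig)
  print result
```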
If I ever fully learn `servant-auth`, I'll probably recognize how naive this
hand-rolled solution is. But it works! And the code is pretty declarative, which
I like.
Many Bollywood movies have excellent acting, excellent directing, and excellent
storytelling, but in my opinion, they spoil this with unnecessary musical
numbers interspersed throughout the films.
Dangal is a notable exception here. Overall, I'd say that this movie is
appropriately rated!
Refactor my handlers to use the `Handler a` type instead of `IO a`; this allows
me to call throwError inside of handlers, and Servant turns those into proper
HTTP errors. Previously I was creating 500 errors unnecessarily.
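The pattern looks roughly like this; Account and lookupAccount are hypothetical
placeholders, not the real definitions:
```haskell
{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.Error.Class (throwError)
import Control.Monad.IO.Class (liftIO)
import Data.Text (Text)
import Servant

data Account = Account { accountUsername :: Text }

-- Hypothetical lookup; the real handler queries the database.
lookupAccount :: Text -> IO (Maybe Account)
lookupAccount _ = pure Nothing

-- Because the handler runs in Handler rather than IO, throwError produces a
-- real 404 response instead of an exception that surfaces as a 500.
getAccount :: Text -> Handler Account
getAccount username = do
  mAccount <- liftIO (lookupAccount username)
  case mAccount of
    Nothing  -> throwError err404 { errBody = "No account for that username." }
    Just acc -> pure acc
```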
Update my API type and handler types to reflect which handlers read and write
cookies.
TODO:
- Actually read from and write to Set-Cookie header
- Returning `pure NoContent` breaks my types, so I'm returning `undefined` now
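For reference, the shape I expect these types to take is roughly the following
sketch (the endpoint and cookie value are placeholders); `addHeader` is also
how a `Headers ... NoContent` response can be built without reaching for
`undefined`:
```haskell
{-# LANGUAGE DataKinds         #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TypeOperators     #-}

import Data.Text (Text)
import Servant

-- The API type records that this handler writes a Set-Cookie header.
type LoginAPI =
  "login" :> Post '[JSON] (Headers '[Header "Set-Cookie" Text] NoContent)

-- addHeader wraps NoContent with the header value, so the handler can still
-- respond with "no content" without resorting to undefined.
login :: Handler (Headers '[Header "Set-Cookie" Text] NoContent)
login = pure (addHeader "session=placeholder; Path=/; HttpOnly" NoContent)
```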
Now that we've migrated all the data over to PostgreSQL, we can get rid of
cl-prevalence as a dependency of Panettone, along with all code that
mentions it.
Change-Id: I945f50a88fea5770aac5b4a058342b8269c0bea2
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1495
Reviewed-by: kanepyork <rikingcoding@gmail.com>
Reviewed-by: tazjin <mail@tazj.in>
Tested-by: BuildkiteCI
TL;DR:
- Since POST /login is more rigorous, our accounts.csv needs to contain validly
hashed passwords; you can use tests/create-accounts.sh to create dummy
accounts
I still need to test the login flow and support:
- Tracking failed attempts (three maximum)
- Verifying accounts by sending emails to the users
For the past 3-4 Haskell projects on which I've worked, I've tried to make a
habit of using the (&) operator, but I find that -- as petty as it may sound --
I don't like the way it looks, and I end up avoiding it as a result.
This time around, I'm aliasing it to (|>) (i.e. Elixir style), and I'm hoping
to use it more.
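The alias itself is tiny; something like:
```haskell
import Data.Function ((&))

-- Elixir-style pipeline operator: just (&) under a name I find easier to read.
(|>) :: a -> (a -> b) -> b
(|>) = (&)
infixl 1 |>

-- e.g. [1 .. 10] |> filter even |> sum
```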
TL;DR:
- Introduce the Cryptonite library
- Remove the redundant language extensions, imports, deps from Persistent
- Prefer NoContent return type for POST /accounts
- Define custom {To,From}JSON instances for Role
Instead of sending and receiving JSON like "accountUsername", which leaks
implementation details and is a bit unwieldy, define custom instances that
prefer the shorter, more user-friendly "username" version.
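A sketch of the idea with aeson; the Account fields here are illustrative, not
the real record:
```haskell
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RecordWildCards   #-}

import Data.Aeson
import Data.Text (Text)

data Account = Account
  { accountUsername :: Text
  , accountEmail    :: Text
  }

-- Expose "username"/"email" on the wire instead of the prefixed Haskell
-- field names.
instance ToJSON Account where
  toJSON Account{..} =
    object
      [ "username" .= accountUsername
      , "email"    .= accountEmail
      ]

instance FromJSON Account where
  parseJSON = withObject "Account" $ \o ->
    Account <$> o .: "username"
            <*> o .: "email"
```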
Allow a user to delete a trip entry from the Trips table using the Primary
Key. While this type-checks and compiles, it doesn't appear to be working as
intended. Perhaps I should use an auto-incrementing integer as the Primary
Key. I'm not sure how I want to handle this, so I'm punting for now.
Additionally, add IsValidBase16() to restore the behavior of rejecting invalid base16, which absl's HexStringToBytes does not do.
Change-Id: I777a36f5dc787aa54a2aa316d6728f68da129768
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1484
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
It appears this didn't even *work* without a password, so we've been
forced into being more secure.
Change-Id: I4ff9d04961a703a85299dafb79e8447b0a933fc1
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1491
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
I have been. Very tired.
Change-Id: Iab9d21e53630be092080cc73196da90534b06553
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1490
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
This is how panettone is currently connecting, so this needs to be here
in order for it to work. Shortly I'll update all of this to use
passwords, but for now this gets things up and running again.
Change-Id: If87f4dbce0800dcbc4f7bf10e88f3e591410b416
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1488
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
Switch from cl-prevalence to postgres (via postmodern) as the storage
backend for panettone. The first time the application starts up after
this commit, it will (idempotently) initialize the db schema and migrate
over all data from the prevalence snapshot to the database - the plan is
then to get rid of the prevalence classes and dependency once that's
deployed.
Change-Id: I4f35707efead67d8854f1c224ef67f8471620453
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1467
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
Reviewed-by: eta <eta@theta.eu.org>
Say ~my~ its name!
Change-Id: I7890318aef984af0f6bc011de32282f16e01cbb3
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1483
Tested-by: BuildkiteCI
Reviewed-by: eta <eta@theta.eu.org>
Create a running Postgres database server along with a user and database
for Panettone, and pass configuration for it to the panettone module.
Change-Id: I333994288131be328e62069382d6d40f8034c400
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1466
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
In the spirit of crawling before I walk, I'm preferring the less
powerful SQLite.Simple library to the more powerful (but mystifying) Persistent
library.
Add support for explicitly specifying tests as part of a buildLisp
program or library.
Change-Id: I733213c1618f0fa60f645465560bce0522641efd
Reviewed-on: https://cl.tvl.fyi/c/depot/+/1481
Tested-by: BuildkiteCI
Reviewed-by: tazjin <mail@tazj.in>
I believe data should be validated at each level of the stack:
- database
- server
- client
The database, in my opinion, is the most important layer at which to validate
because you can eliminate entire classes of bugs. However, the CHECK constraint
is limited, and the more complex the predicates are, the more expensive database
operations become.
At the server and client layers, the data validations can be more sophisticated
and return more useful error messages to help users better understand the shape
of the data that our application expects.
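As an example of the kind of server-side check I have in mind (the specific
rules here are made up), something like this can return a much friendlier
message than a bare CHECK constraint violation:
```haskell
import Data.Char (isAlphaNum)
import Data.Text (Text)
import qualified Data.Text as T

-- Validate a username before it ever reaches the database, returning a
-- human-readable error instead of a constraint violation.
validateUsername :: Text -> Either Text Text
validateUsername name
  | T.null name                 = Left "Username cannot be empty."
  | T.length name > 20          = Left "Username must be 20 characters or fewer."
  | not (T.all isAlphaNum name) = Left "Username may only contain letters and digits."
  | otherwise                   = Right name
```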