The concise cost/benefit analysis is very helpful.
- Some contain plaintext, some contain ciphertext
- Some contain keys, some contain data
- The keys themselves exist in plaintext and ciphertext forms (data keys generated by AWS KMS)
- Some keys were used for signing/verifying, some were used for encrypting/decrypting
- Some arrays contain Base64 data (for embedding in JSON), some contain unencoded data
Using 'Array[Byte]' would work, but would be error-prone. Using distinct types or wrappers like 'Base64[Plaintext[Key[Signing]]]' would require lots of extra definitions, and wrapping/unwrapping scattered around the code. Phantom types let me use type signatures with the right amount of specificity and polymorphism as needed, whilst the code itself was the 'straightforward' version without wrapping/unwrapping.
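A minimal sketch of the idea in Haskell (the tag names and the 'Bytes' wrapper are illustrative, not from the original codebase):

```haskell
{-# LANGUAGE DataKinds, KindSignatures #-}

-- Phantom tags: 'enc' and 'form' appear only at the type level, never
-- in the runtime representation.
data Encoding = Raw | Base64
data Form     = Plaintext | Ciphertext

newtype Bytes (enc :: Encoding) (form :: Form) = Bytes String

-- Encoding changes only the 'enc' tag; 'form' stays polymorphic, so
-- one function covers plaintext and ciphertext alike.
toBase64 :: Bytes 'Raw form -> Bytes 'Base64 form
toBase64 (Bytes s) = Bytes s  -- the actual encoding step is elided

-- Only Base64 ciphertext may be embedded; passing plaintext or raw
-- bytes is a compile-time type error.
embedInJson :: Bytes 'Base64 'Ciphertext -> String
embedInJson (Bytes s) = s
```

The point is that the code body is identical to the 'Array[Byte]' version; only the signatures carry the extra information, with no wrapping/unwrapping at use sites.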
GADTs which don't store a value based on one of their type parameters are still making use of phantom types. The key concept with a GADT is that you can have type variable binding(s) (whether phantom or concrete) which are determined by the constructor. If you aren't taking advantage of that then you're probably just defining an ordinary data type using GADT syntax.
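For contrast, a textbook-style GADT where each constructor pins down the type parameter (a standard expression-evaluator sketch, not anyone's production code):

```haskell
{-# LANGUAGE GADTs #-}

-- The parameter 'a' is determined by the constructor: IntLit fixes it
-- to Int, BoolLit to Bool, and If keeps it polymorphic but shared.
data Expr a where
  IntLit  :: Int  -> Expr Int
  BoolLit :: Bool -> Expr Bool
  Add     :: Expr Int  -> Expr Int -> Expr Int
  If      :: Expr Bool -> Expr a   -> Expr a -> Expr a

-- Because matching a constructor refines 'a', 'eval' is total and
-- needs no runtime type tags.
eval :: Expr a -> a
eval (IntLit n)  = n
eval (BoolLit b) = b
eval (Add x y)   = eval x + eval y
eval (If c t e)  = if eval c then eval t else eval e
```

If every constructor left 'a' fully polymorphic (or it was fixed uniformly), this would collapse to an ordinary ADT written in GADT syntax, which is the distinction being drawn above.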
Imagine if aircraft engineers did that.
Up to (but not including) NonEmpty lists everything was fine.
"but not including" how in the world is NonEmpty redundant abstraction?
Look, a lot of the other patterns are interesting but I really think that if you are getting into criticising code because it has "Boolean blindness" you are probably getting a bit lost in navel gazing. Do you really have nothing better to do?
foo x y z = if (isNothing x) then bar y else baz z
where bar val = [val]
baz val = [fromJust x, val]
Here we're using a boolean like 'isNothing x' to choose what to do, but one of those choices (baz) is making implicit assumptions about the meaning of that choice ('fromJust x' is an unsafe function, which will crash if x isn't a 'Just'). In real code this sort of implicit coupling can spread across larger pieces of code, with more complicated and less obvious behaviour (e.g. indexing into a list, assuming it's safe due to some arithmetic written elsewhere). This is bad for a few reasons:
- If we don't get it exactly right (including edge cases, etc.) then it can go pretty badly wrong (a crash in this case).
- The dependency/coupling between the check and the assumption is implicit and may break in the future; e.g. if we try to re-use the assumption-riddled code without performing the check, or if the condition in the check needs to change and we don't realise it breaks the assumption-riddled code.
- The logic usually ends up being overly-complicated, since we're performing redundant work. In my above example we could do this instead:
foo x y z = maybe (bar y) (baz z) x
where bar val = [val]
baz val x' = [x', val]
Branching on the value we care about ('x', using the 'maybe' function) ensures that each branch (a) has the context it needs to do its job (e.g. the x' parameter) and (b) isn't given any more context than what is known at that point (e.g. the compiler can tell us if we're forgetting unhandled cases, which it can't if our assumptions are implicit).

I think the real "boolean blindness" is something that happens with beginners, who don't yet understand that ADTs can be a rich modeling language, and instead use (Bool, Bool, Bool, Bool) when some combination of product and sum types is better.
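To illustrate the (Bool, Bool, Bool, Bool) point with a made-up domain (the session/connection names here are purely hypothetical):

```haskell
-- Four Bools admit 16 combinations, most of them meaningless:
--   (connected, authenticated, encrypted, closed) :: (Bool, Bool, Bool, Bool)
-- e.g. "authenticated but not connected" type-checks but can't happen.

-- A mix of sum and product types names only the states that can occur:
data Session
  = Disconnected
  | Connected Auth Crypto
  | Closed

data Auth   = Anonymous | Authenticated String  -- logged-in user name
data Crypto = PlainChannel | TlsChannel
```

Pattern-matching on 'Session' then forces each branch to handle exactly the states that exist, rather than re-deriving them from flag combinations.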
https://martinfowler.com/bliki/FlagArgument.html
https://medium.com/@amlcurran/clean-code-the-curse-of-a-bool...
Or are you conflating booleans with black/white and arguing for colour blindness because All Bools Matter or something?
I can't really make sense of this.
If beginners see a list like this, and start re-factoring the code to remove "Boolean blindness", that's likely not the best way for them to spend their time and the code may become more verbose in the process.
I prefer not to dwell on negativity about code, as long as it does the job and is implemented in a reasonably simple manner. I think in the Haskell community sometimes there can be an unhealthy tendency to nitpick the implementation details.