Description
The function Data.Word64.Show.showHexa blows up when I try to normalize, for example, showHexa (fromℕ 0xdeadcafe) in Emacs.
The reason for this is presumably that it uses Data.Nat.Show.showInBase, which in turn uses Data.Digit.toDigits, which produces not only the digits but also a proof term. (The same holds for Data.Word8.Show.showHexa, but since those numbers are small, the effect is less pronounced.)
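To make the shape of the problem concrete, here is a small illustrative sketch (hexDigits is just a name I made up for this note, not something I am proposing): the digits returned by toDigits only come wrapped in a dependent pair together with a correctness proof.

open import Data.Nat using ( ℕ )
open import Data.List using ( List )
open import Data.Product using ( proj₁ )
open import Data.Digit using ( toDigits ; Digit )

-- toDigits packages the digits with a proof term; proj₁ throws the proof
-- away again, but normalization still has to go through toDigits itself.
hexDigits : ℕ → List (Digit 16)
hexDigits n = proj₁ (toDigits 16 n)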
On the other hand, Data.Nat.Show.show does not suffer from the same problem, because it never builds a proof term behind the scenes.
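For comparison, a standalone check along these lines normalizes without any trouble (0xdeadcafe is 3735931646 in decimal):

open import Data.Nat using ( ℕ )
open import Data.Nat.Show using ( show )
open import Relation.Binary.PropositionalEquality using ( _≡_ ; refl )

-- No proof terms are involved, so this reduces quickly by computation.
_ : show 0xdeadcafe ≡ "3735931646"
_ = refl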
The simplest fix would be to change the implementation of showHexa to be more similar to Data.Nat.Show.show. Here is one possible way to do that:
open import Data.Bool
open import Data.Nat
open import Data.List using ( List )
open import Data.String
open import Data.Word64
open import Data.Nat.Show as ℕ
open import Data.Digit using ( toNatDigits )
open import Data.Char as Char
open import Function.Base using ( _$_ ; _∘′_ )

-- Map a single hexadecimal digit (0..15) to its character.
toHexDigitChar : ℕ → Char
toHexDigitChar n with n <ᵇ 10
toHexDigitChar n | true  = Char.fromℕ (n + Char.toℕ '0')
toHexDigitChar n | false = Char.fromℕ (n ∸ 10 + Char.toℕ 'a')

-- Hexadecimal digits of a natural number as plain characters,
-- most significant digit first; no proof terms are produced.
toHexadecimalChars : ℕ → List Char
toHexadecimalChars = Data.List.map toHexDigitChar ∘′ toNatDigits 16

showHexa : Word64 → String
showHexa w
  = ("0x" ++_)
  $ padLeft '0' 16
  $ Data.String.fromList
  $ toHexadecimalChars (Data.Word64.toℕ w)
(Btw, the current Data.Word64.Show.showHexa pads to 8 digits, but 64-bit numbers require 16 hex digits... I changed that in the code above as well.)
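If it helps, a quick sanity check along these lines (appended after the definitions above; the expected string is just deadcafe padded on the left to 16 digits) could double as a regression test:

open import Relation.Binary.PropositionalEquality using ( _≡_ ; refl )

-- With the proof-free implementation this normalizes (C-c C-n) without
-- blowing up, and the equality holds by computation.
_ : showHexa (Data.Word64.fromℕ 0xdeadcafe) ≡ "0x00000000deadcafe"
_ = refl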