Why Prefer 16-bytes to 12-bytes for AES IVs

This is a super short post, but it covers something I ran into that others may hit as well.

TL;DR

Use 16-byte IVs for AES-GCM. It’s what everyone else seems to be using.

Story Version

We use 256-bit AES-GCM keys to secure our device linking flows, which run over IPFS PubSub (i.e. on WebRTC), as well as over a custom hosted WebSocket relay as a fallback. This makes AES-GCM a critical component for us here at Fission.

We ran into a problem where our CLI could link to other CLIs just fine, and browsers could link to each other, but a CLI and a browser could not link across. After much debugging, the issue turned out to be a mismatched initialization vector (IV) length.

The original 2007 NIST specification for AES-GCM (SP 800-38D) recommends using a 12-byte IV because…

For IVs, it is recommended that implementations restrict support to the length of 96 bits [12 bytes], to promote interoperability, efficiency, and simplicity of design.

As such, the MDN documentation on AES-GCM for WebCrypto has an example that uses a 12-byte IV and links to that specification.
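
As a concrete illustration, here is a minimal sketch in the style of the MDN example (not our actual code; the function name is made up): AES-GCM encryption via WebCrypto with a 12-byte IV.

```typescript
// Minimal sketch of AES-GCM encryption with WebCrypto, in the style of the
// MDN example: a 96-bit (12-byte) IV, per the NIST recommendation.
async function encryptGcm12(key: CryptoKey, plaintext: Uint8Array) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 12-byte IV
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    plaintext
  );
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}
```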

However…

Our CLI is written in Haskell and has a hardcoded 16-byte IV requirement (it checks the length on parse). By coincidence, a base64-encoded 12-byte IV is 16 characters long, so it passed the length check and we effectively had the wrong IV (my bad). Looking around at other libraries in Java, C, Go, and elsewhere, they all seem to default to 16 bytes.
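
To see how easy the mixup is, here is a quick sketch of the base64 length coincidence (purely illustrative):

```typescript
// A 12-byte IV base64-encodes to 16 characters, while a 16-byte IV
// base64-encodes to 24 characters (including "==" padding).
const toBase64 = (bytes: Uint8Array) => btoa(String.fromCharCode(...bytes));

const iv12 = crypto.getRandomValues(new Uint8Array(12));
const iv16 = crypto.getRandomValues(new Uint8Array(16));

console.log(toBase64(iv12).length); // 16 -- easy to mistake for a 16-byte IV
console.log(toBase64(iv16).length); // 24
```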

The W3C WebCrypto API spec makes no recommendation on IV size, other than requiring that it be smaller than 2^64 bytes.

  1. If the iv member of normalizedAlgorithm has a length greater than 2^64 - 1 bytes, then throw an OperationError.
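
So, as far as the spec is concerned, a 16-byte IV is just as valid as a 12-byte one. A sketch of the same WebCrypto call with a 16-byte IV (again, the name is hypothetical):

```typescript
// Same WebCrypto call as above, but with a 128-bit (16-byte) IV, which the
// spec also permits and which matches the defaults of many other libraries.
async function encryptGcm16(key: CryptoKey, plaintext: Uint8Array) {
  const iv = crypto.getRandomValues(new Uint8Array(16)); // 16-byte IV
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    plaintext
  );
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}
```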

Also, a fun fact: the openssl CLI tool includes AES-GCM in its list of ciphers, but fails with an “AEAD ciphers not supported” message when you try to use them.
