Right, I need to vent a bit again. Warning: the text below will contain ASN.1, X.509 and text encodings. It is not suitable for children, or for developers who for some reason aren't allowed to consume alcohol.
I got a mail from a customer yesterday which included a lovely stack trace:
> Bad subject: 'utf-8' codec can't decode byte 0xd6 in position 0: invalid continuation byte.
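For the curious, that error is easy to reproduce in isolation. A minimal sketch (the subject string "Österreich" is a made-up example; in practice T61Strings tend to get treated as Latin-1, where "Ö" is the single byte 0xD6):

```python
# Hypothetical subject starting with 'Ö', encoded as a single
# Latin-1-style byte (0xD6) rather than the two-byte UTF-8 sequence.
subject = b"\xd6sterreich"

try:
    subject.decode("utf-8")
except UnicodeDecodeError as e:
    # 0xD6 opens a two-byte sequence in UTF-8, but the following
    # 's' (0x73) is not a valid continuation byte, hence the
    # exact error from the stack trace above.
    print(e)

# Decoded as Latin-1 instead, the bytes make sense again.
print(subject.decode("latin-1"))  # Österreich
```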
So. Something that should be UTF-8 isn't UTF-8. It turns out that this customer was on a Mac.
OS X has defaulted to UTF-8 for the whole platform since the start. Everything there is Unicode, so it seems strange that something coming out of a Mac would NOT be Unicode.
It turns out that this really was the case. On OS X, the system-supplied config file for openssl (the command-line utility) explicitly prohibits the output of UTF-8.
This setting ("string_mask" in openssl.cnf) was changed in upstream OpenSSL years ago, and has defaulted to "utf8only" output ever since. But not in Apple's copy.
Apple, as said, denies output of UTF-8, and anything that should be UTF-8, like any string or domain name that includes Unicode, will be forcibly degraded into the venerable T.61 text encoding standard.
Note that T.61 is obsolete, and has been since 1994, when the updated version of the standard was withdrawn in favour of Unicode.
In fact, OpenSSL states that the reason not to create certificates with UTF-8 is that "really ancient" versions of Netscape crashed.
How ancient is that? Well, we're talking "before Netscape 2.x".
So. Thank you Apple.
And if you're trying to create SSL certificates on a Mac, you'll want to change that config file.
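For reference, the relevant knob lives in the [ req ] section of openssl.cnf (the location varies by OS X version; "openssl version -d" prints the directory your openssl actually reads its config from). The upstream default looks like this:

```
# In openssl.cnf, under the [ req ] section:
[ req ]
string_mask = utf8only   # emit UTF8Strings instead of T61Strings
```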