[Python-Dev] bytes.from_hex()
Josiah Carlson jcarlson at uci.edu
Tue Feb 21 19:52:48 CET 2006
Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Stephen J. Turnbull wrote:
> > What I advocate for Python is to require that the standard base64
> > codec be defined only on bytes, and always produce bytes.
>
> I don't understand that. It seems quite clear to me that base64
> encoding (in the general sense of encoding, not the unicode sense)
> takes binary data (bytes) and produces characters. That's the whole
> point of base64 -- so you can send arbitrary data over a channel
> that is only capable of dealing with characters.
>
> So in Py3k the correct usage would be
>
>                  base64              unicode
>                  encode              encode(x)
>     original bytes --------> unicode ---------> bytes for transmission
>                    <--------         <---------
>                  base64              unicode
>                  decode              decode(x)
>
> where x is whatever unicode encoding the transmission channel uses
> for characters (probably ascii or an ascii superset, but not
> necessarily).
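For concreteness, that two-step pipeline would look roughly like the
sketch below (my illustration, not code from the thread; it assumes an
ASCII transmission channel and uses the stdlib base64 module, whose
b64encode returns bytes, so the intermediate unicode step is spelled
out explicitly):

    import base64

    original = b"\x00\xff\x10some binary payload"

    # base64 encode: bytes -> characters (str)
    text = base64.b64encode(original).decode("ascii")

    # unicode encode(x): characters -> bytes for the channel
    wire = text.encode("ascii")

    # ... transmission ...

    # unicode decode(x): channel bytes -> characters
    text2 = wire.decode("ascii")

    # base64 decode: characters -> the original bytes
    assert base64.b64decode(text2) == original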
It doesn't seem strange to you to need to encode data twice to get a usable sequence of characters that can be embedded in an effectively 7-bit email, when base64 was, dare I say it, designed with 7-bit email as its destination in the first place? It does to me.
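To make the objection concrete: since base64 output is already
7-bit-clean, the pipeline can collapse to a single bytes-to-bytes step,
with no unicode detour (again just a sketch; this is how the stdlib's
base64.b64encode happens to behave in Python 3):

    import base64

    payload = b"\x00\xff\x10some binary payload"

    # One step: bytes in, 7-bit-safe bytes out -- ready for the wire.
    wire = base64.b64encode(payload)
    assert base64.b64decode(wire) == payload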
- Josiah