Dennis Sosnoski wrote:
> I don't know why you say "you can't really make the
> claim that a binary format will be faster to encode and decode".
My apologies. I should have said: "binary formats won't
*ALWAYS* be faster." Often, some would argue most of the time, they
will be -- but not always.
There are a number of things that could slow down the handling
of a binary encoding. The obvious first thing to consider is simply
bad design... Some binary formats are written in such a way that they
require a great deal of data buffering or complex state management in
order to be properly decoded. They may compress nicely, but they do so
using methods that are inherently very expensive.
        Another thing to consider is the potential cost of converting
data to address things like "endian" concerns. If you have to swap the
bytes of every number you receive, and you have many numbers, this can
get expensive. Similarly, converting from a "standard" floating point
representation to some different representation supported by your
machine or language might be expensive. Also, if it turns out that
your application is primarily concerned with large strings, then a
binary format can deliver very disappointing results.
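        To make the endian point concrete, a decoder that must
byte-swap every 32-bit integer it reads might look roughly like the
sketch below (Java, with illustrative class and method names). The
per-value swap is cheap, but it is pure overhead compared to data
that already arrives in native byte order.

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class EndianCost {
        // Interpret a raw buffer of 32-bit integers that arrived in
        // little-endian order; every value read incurs a byte swap on
        // a big-endian host (or when the format's order differs from
        // the platform's).
        static int[] decodeLittleEndianInts(byte[] raw) {
            ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
            int[] values = new int[raw.length / 4];
            for (int i = 0; i < values.length; i++) {
                values[i] = buf.getInt();   // byte swap happens on every read
            }
            return values;
        }

        public static void main(String[] args) {
            // One million integers: each swap is small, but the work adds up
            // and buys nothing if the data could have been in native order.
            byte[] raw = new byte[4_000_000];
            int[] values = decodeLittleEndianInts(raw);
            System.out.println("decoded " + values.length + " ints");
        }
    }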
Please note that in my comment, I wasn't arguing against
binary encodings. Rather, I was simply suggesting that we shouldn't
consider all binary encodings to be a panacea. Some are better than
others and in some cases, using a binary encoding just won't buy you
very much. Folk who use these things should use the same due diligence
in decision making that they would use when making any other design or
configuration decision. Neither binary nor text encodings are
automatically either good or bad. "It all depends."
bob wyman