[foaf-protocols] Why exponent/modulus
henry.story at bblfish.net
Fri Sep 17 18:13:02 CEST 2010
On 17 Sep 2010, at 15:48, Nathan wrote:
> Hi All,
> Similar to my previous question, why do we extract the modulus and
> exponent of a public key and represent those in RDF, if we were instead
> to store a PEM/DER encoded version of the public key, or even
> certificate, in our profiles instead, what would break?
Not sure anything would break, assuming we all enhance our servers
to deal with this extra encoding. The authenticating servers would
need to learn to parse both the PEM and the DER versions and extract the key
from them, as you will never get everyone to agree on using only PEM or only DER.
In those ancient formats the public key is hidden inside a
three-layered encoded string whose semantics go way back into the
past. Instead of putting something clearly in XML,
you are using ASN.1, a binary encoding format with some
extremely weird parsing rules, which is then base64 encoded and placed in XML!
The advantage of sticking with a description of the public key is that we
presuppose only the mathematics of cryptography, and that is all we need.
In the end the only thing that counts is the public key, i.e. the modulus and
exponent in RSA. PEM and DER just hide that fact.
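For concreteness, the description the WebID Profile carries is roughly the
following (a sketch in Turtle using the cert/rsa ontologies of that era; the
exact datatype terms varied between drafts, and the <#me> WebID here is
hypothetical):

```turtle
@prefix cert: <http://www.w3.org/ns/auth/cert#> .
@prefix rsa:  <http://www.w3.org/ns/auth/rsa#> .

[] a rsa:RSAPublicKey ;
   rsa:modulus "00bc614e"^^cert:hex ;          # toy modulus, hex encoded
   rsa:public_exponent "65537"^^cert:decimal ; # the usual exponent
   cert:identity <#me> .                       # hypothetical WebID
```

The point is that an authenticating agent reading this needs nothing beyond
its RDF parser and big-integer arithmetic.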
So using PEM and DER is not really going to make server authentication
agents simpler. Is it perhaps that you have a database storing
information in PEM or DER format, and that you just want to publish it with
a copy-paste operation into the WebID Profile? In that case authentication agents
will certainly need to parse both of those formats and more, as who knows
how many different ancient formats people use to store their keys in their
databases. Each of these agents could argue that, as far as copy-paste goes,
their system requires a different format.
But people who store keys in those formats in their databases are probably not
storing a lot of keys, as I don't know that cryptography has
really taken off seriously on the web. And if they do store keys in that format,
those keys are certainly not tied to a WebID or to a client-side certificate.
So you would require each implementation to also parse PEM and
DER, and extract the public key from them. In Java that is not a problem, btw;
in other programming languages it is more of one.
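To give a feel for what "extract the public key from it" entails, here is a
minimal sketch of the parsing an agent would need, assuming the simplest
possible case: a PKCS#1 "RSA PUBLIC KEY" PEM, which is just base64-armoured
DER of SEQUENCE { modulus INTEGER, publicExponent INTEGER }. A real
certificate wraps the key in further X.509 layers, so a real agent needs
considerably more than this:

```python
import base64
import re

def pem_to_der(pem: str) -> bytes:
    """Strip the -----BEGIN/END----- armour and base64-decode the body."""
    body = re.sub(r"-----[^-]+-----", "", pem)
    return base64.b64decode("".join(body.split()))

def read_tlv(der: bytes, i: int):
    """Read one DER tag-length-value triple starting at offset i."""
    tag = der[i]
    length = der[i + 1]
    i += 2
    if length & 0x80:  # long form: low bits count the length octets
        n = length & 0x7F
        length = int.from_bytes(der[i:i + n], "big")
        i += n
    return tag, der[i:i + length], i + length

def rsa_numbers(der: bytes):
    """Extract (modulus, exponent) from a PKCS#1 RSAPublicKey."""
    tag, seq, _ = read_tlv(der, 0)
    assert tag == 0x30, "expected a SEQUENCE"
    tag, n_bytes, j = read_tlv(seq, 0)
    assert tag == 0x02, "expected an INTEGER for the modulus"
    tag, e_bytes, _ = read_tlv(seq, j)
    assert tag == 0x02, "expected an INTEGER for the exponent"
    return int.from_bytes(n_bytes, "big"), int.from_bytes(e_bytes, "big")
```

Even this toy version has to know about tags, long-form lengths, and the
leading zero byte DER inserts before high-bit integers; none of that knowledge
is needed if the modulus and exponent are published directly.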
Adding all these parsers would have made the case for the simplicity of
WebID somewhat more difficult to make. Well, now that we are also opening up
to parsing RDFa, RDF/XML, JSON, and XML with GRDDL, there is no reason
we could not also accept a binary encoding format. But perhaps we can hold back
a bit on this? If you can explain your problem, there may be other ways of solving it.
The psychological danger is also that people may think they can do string
comparison between those formats, which would be wrong. After all, with the
same public key one can create any number of distinct PEM or DER strings, I think.
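A trivial illustration of why string comparison fails, using nothing but
line wrapping (the toy DER bytes here are hypothetical, and width 64 is only
the customary choice, not a requirement of the format):

```python
import base64

# A toy DER body standing in for an encoded RSA key (hypothetical bytes).
der = bytes.fromhex("300b020400bc614e0203010001")

def to_pem(der: bytes, width: int) -> str:
    # PEM is base64 inside an armour; the line width is a free choice.
    b64 = base64.b64encode(der).decode()
    body = "\n".join(b64[i:i + width] for i in range(0, len(b64), width))
    return ("-----BEGIN RSA PUBLIC KEY-----\n"
            + body + "\n-----END RSA PUBLIC KEY-----\n")

def strip_armour(pem: str) -> bytes:
    lines = [l for l in pem.splitlines() if "-----" not in l]
    return base64.b64decode("".join(lines))

pem_a = to_pem(der, 64)  # the customary 64-column wrapping
pem_b = to_pem(der, 8)   # a narrower but equally legal wrapping
assert pem_a != pem_b                               # strings differ...
assert strip_armour(pem_a) == strip_armour(pem_b)   # ...same key bytes
```

And that is before considering genuinely different serialisations of the same
key, such as PKCS#1 versus a SubjectPublicKeyInfo wrapping.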
Furthermore, matching the X.509 certificate against the PEM by human inspection
is going to be more difficult, as the public key that all the tools show you there
won't visibly match the PEM in any way. Well, unless you want to teach developers
the beauty of openssl command line complexity.
So I am not at all convinced you are making things simpler by doing
this. It needs some more clarification.
> With this specific question, the main background thinking is that
> implementations of WebID protocol would be much easier, with far less
> dependencies, if we did simply throw a PEM/DER certificate in to our
> profiles, all those Wordpress/Mediawiki/Drupal type plugins, and indeed
> support in any language which had basic support for HTTP+TLS would
> suddenly become a very easy hit.
How would putting a PEM there make those tools easier to integrate? Can you
explain in more detail where things become easier, and why?
Social Web Architect