[foaf-dev] for more information please log in

Peter Williams pwilliams at rapattoni.com
Mon Jan 14 10:46:56 GMT 2008

You could even put two of your ideas together.
The pubkeys counter-signed by Henry for his friends could contain the bloom filter, representing membership in a particular friend/group list. The user acting as FOAF subject would then assert that signed public key (by showing possession of the private key) and also communicate the bloom filter (in the same signing block as the public key) when seeking authentication, either against the webserver hosting Henry's FOAF file or against webservers hosting the files of Henry's friends which are willing to rely on Henry's management of both pubkeys and groups.
In security doctrine, the assertion becomes a capability. As with all capabilities, it is hard to revoke. In this friend-focused scenario, though, revocation may not really be much of an issue. During the adoption of SSL and PKI, no one actually needed revocation until military applications of SSL came on the scene, requiring individual accountability controls. In a social networking setting, the security doctrine is presumably aiming not at accountability but at asserting privacy expectations (that friends will, for the most part, abide by).
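The bloom-filter membership test behind this capability can be sketched in a few lines. This is a toy illustration, not anything specified by FOAF: the filter parameters, the SHA-256-derived hash functions, and the `pubkey:` identifiers are all assumptions of the sketch.

```python
import hashlib

class BloomFilter:
    """Toy bloom filter: k bit positions derived from salted SHA-256."""
    def __init__(self, m_bits=256, k=4):
        self.m = m_bits
        self.k = k
        self.bits = 0  # one integer used as a bit array

    def _positions(self, item):
        # Derive k bit positions from k salted hashes of the item.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def probably_contains(self, item):
        # False positives are possible; false negatives are not.
        return all(self.bits & (1 << p) for p in self._positions(item))

# Henry publishes a filter over his friends' identifiers (illustrative);
# a relying server tests a presented identifier locally, with no lookup.
friends = BloomFilter()
friends.add("pubkey:ab12")
friends.add("pubkey:cd34")

print(friends.probably_contains("pubkey:ab12"))  # member: True
print(friends.probably_contains("pubkey:zz99"))  # non-member: probably False
```

The relying server never sees the friend list itself, only the filter, which is what gives the construction its privacy-preserving flavour.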


From: Story Henry [mailto:henry.story at bblfish.net]
Sent: Mon 1/14/2008 2:03 AM
To: Peter Williams
Cc: foaf-dev
Subject: Re: [foaf-dev] for more information please log in

On 14 Jan 2008, at 10:44, Peter Williams wrote:

> Let's address the application of the bloom filter to access control 
> specifically in a FOAF setting:
> If we assume that one authenticates optionally and only for a 
> purpose that is FOAF-specific (i.e. FOAF is not a general purpose 
> cache of distributed directory data, but targets the assertion of 
> personal friend relationship and personal expectations of privacy) 
> then authentication exists to assert friendship in order to obtain 
> friendship privileges in accordance with the limits of the class of 
> friendship/acquaintance.
> If the bloom filter allows one to test whether one is probably a 
> member of the FOAF-object's friend/acquaintance list, with the 
> implied right of the FOAF-subject to query private friendship 
> relations, let's now assume that authentication exists to (1) 
> "assert" that status when seeking a session; and a resulting login 
> should then (2) "entitle" one to exploit various query rights about 
> the subject's and others' relations with the person whose PPD is 
> being addressed.
> Upon login with strength S, the authorization cookie assigned to the 
> FOAF-subject's web session will be assigned various access mode 
> rights as a function of S, after the session manager possibly 
> tests additional bloom filters that guard whether you, the subject, 
> ought to be assigned privilege P - where each Pi computes particular 
> classes of friend-based inferences. For example, strength S of 
> "publickey" may gate access to inference about the pubkeys of 
> Henry's friends. If strength S' is "openid from provider #myOP", then 
> this may gate query rights to the pubkeys of Henry's friends whose 
> openids are asserted by #myOP. In general, S => P.

Yes. I think we are agreeing on that.
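The S => P mapping described above amounts to a lookup from authentication strength to a set of granted query privileges. A minimal sketch, where the strength labels and privilege names are invented for illustration (only "publickey" and the #myOP OpenID case come from the message):

```python
# Map authentication strength S to the privilege set P it gates.
# Labels follow the examples in the quoted message; names are illustrative.
PRIVILEGES_BY_STRENGTH = {
    "anonymous":     set(),
    "openid:#myOP":  {"query_pubkeys_of_myOP_friends"},
    "publickey":     {"query_pubkeys_of_myOP_friends",
                      "query_pubkeys_of_all_friends"},
}

def privileges_for(strength):
    """Return the privileges P granted for login strength S."""
    return PRIVILEGES_BY_STRENGTH.get(strength, set())

# The session manager stamps the session with the rights S entitles it to.
session = {"user": "subject-ua", "strength": "publickey"}
session["privileges"] = privileges_for(session["strength"])
print(sorted(session["privileges"]))
```

An unrecognized strength falls through to the empty privilege set, so the default is to grant nothing.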

> So, in one embodiment, standard XML encryption would be applied to the 
> entire FOAF file as it is being streamed out by the webserver hosting 
> the FOAF file, where those query rights would identify which 
> (syntactic) elements of the XML/RDF should be masked. Those parts of 
> the stream that should be masked from the particular FOAF subject UA 
> would be encrypted using standard XML element selection principles built 
> into the XML encryption transform. The various groups of parts 
> identified under the transform would each be encrypted using a 
> particular key generated for this purpose - a key which could 
> subsequently be shared with those doing authorized FOAF file 
> aggregation of particular classes of relations that the PPD owner 
> selects, when removing his/her expectation of privacy.

That would be an option, but the one least likely to get traction, it 
seems to me. I don't think we are speaking about encrypting pieces of 
the XML or N3. That could be done, but it seems more than is needed. 
What I was suggesting is that one publish only what the viewer is 
allowed to see. There is no need to send him information that is 
encrypted. There are currently two solutions for this:

   1. the same URL could return different representations (different 
      content) depending on who is viewing the file, as determined by 
      authentication
   2. the URL always returns the same representation, but has 
      conditional pointers to other files where more information can 
      be found for those who are authenticated
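Option 1, serving each viewer only what they may see, could be sketched over a toy triple store. The triples, visibility levels, and group names below are made up for illustration; a real server would filter an RDF graph the same way before serializing it.

```python
# Toy FOAF-ish triples, each tagged with the visibility level required
# to see it. Data and level names are illustrative.
TRIPLES = [
    ("#henry", "foaf:name",  "Henry Story", "public"),
    ("#henry", "foaf:knows", "#peter",      "public"),
    ("#peter", "foaf:mbox",  "mailto:p@x",  "friends"),
    ("#peter", "foaf:phone", "tel:555",     "close-friends"),
]

# Which visibility levels each authenticated viewer group may read.
VISIBLE_TO = {
    "anonymous":     {"public"},
    "friends":       {"public", "friends"},
    "close-friends": {"public", "friends", "close-friends"},
}

def representation_for(viewer_group):
    """Return only the triples this viewer is allowed to see."""
    allowed = VISIBLE_TO.get(viewer_group, {"public"})
    return [(s, p, o) for s, p, o, level in TRIPLES if level in allowed]

print(len(representation_for("anonymous")))      # 2
print(len(representation_for("friends")))        # 3
print(len(representation_for("close-friends")))  # 4
```

Unknown viewers fall back to the public view, so nothing private leaks by default, and no encrypted material ever leaves the server.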

In both cases there needs to be a way to tell the consumer of the 
initial representation that he may find more information if he is a 
member of a certain group. This group could indeed be specified in a 
number of ways (including not at all). One of these ways may be to 
specify a group via a bloom filter. Another way may be via some family 
relation. This is a topic I have not thought about. It seems useful to 
specify the group, so that agents don't always need to log in whenever 
they come across a foaf file. But perhaps not. That was just a thought.
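The point about agents not needing to log in on every FOAF file they encounter could look like the following on the agent side. It assumes, purely for the sketch, that the file advertises the filter's bit positions and parameters alongside the group pointer, and that the agent knows which identifier of its own to test.

```python
import hashlib

M, K = 256, 4  # filter parameters, assumed published with the FOAF file

def bit_positions(identifier):
    # The same k salted hashes the publisher used (an assumption here).
    return [int.from_bytes(hashlib.sha256(f"{i}:{identifier}".encode())
                           .digest()[:8], "big") % M
            for i in range(K)]

def worth_logging_in(published_bits, my_id):
    """Agent-side check: attempt authentication only if the published
    filter says we are probably a member of the advertised group."""
    return all(p in published_bits for p in bit_positions(my_id))

# Publisher side (illustrative): the set of bits for one group member.
published = set(bit_positions("mailto:friend@example.org"))

print(worth_logging_in(published, "mailto:friend@example.org"))    # True
print(worth_logging_in(published, "mailto:stranger@example.org"))  # probably False
```

A false positive only costs the agent one pointless login attempt, which is exactly the failure mode a bloom filter's one-sided error makes acceptable here.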

> Ignoring my fanciful embodiment, have I got the general thrust of 
> the idea?

I think so.

