[foaf-protocols] webid vs distributed social networks

peter williams home_pw at msn.com
Tue Mar 15 21:23:16 CET 2011

I think we are on the same page.


Let's remember, Microsoft stays deliberately 3-5 years behind the leading
edge of the wave, so third parties get to add value and make a living.
It's just that once they pull X from 3 years ago (e.g. SAML) into the
core, they move on, quickly. It looks like they have allowed the likes of
rdfs/owl to drive RESTful services, provided one builds a dynamic-typing
interface wrapper that invokes the sparql queries behind that reasoning.


Now, I'll guess that they will, in their own systems, engineer one
particular implementation of that dynamic typing for runtimes - and it
will be the usual Entity Framework stuff. But they are making a statement:
plug your own in, if you want. Assuming it takes off reasonably in the
hands of third parties, by the time .NET 5 comes along it will ship a v1
Microsoft implementation too. By .NET 5.5, third parties had best be
looking for other lines of work.


This is just the way they work.


Ok. So after my Azure cloud project (to see if client certs really work
there, on the service bus, with consistent sslsessionid management for an
instance pool), I'll go get that .NET semweb library out again, since it
had basic reasoning. I won't set out to focus on sparql for querying foaf
cards (since that is uriburner's role, delegated), but to play now with
its rdfs/owl support for driving a type resolver, addressing the various
types of webids to be used in all sorts of rest-client/rest-server information



From: Kingsley Idehen [mailto:kidehen at openlinksw.com] 
Sent: Tuesday, March 15, 2011 12:24 PM
To: peter williams
Cc: foaf-protocols at lists.foaf-project.org
Subject: Re: [foaf-protocols] webid vs distributed social networks


On 3/15/11 2:51 PM, peter williams wrote: 

I just half-read an MSDN article by Juval Lowy - a guru in Windows
Communication Foundation. It discussed .NET 4 and how to define the types
of a reference, where references are tied to subclassing worlds and one
wants a ref of a subclass to imply its baseclass. The last time I saw the
underlying topic I was in computer science school (a while ago); but now
it's mainstream. And that's the biggest point.


He is making the point that now (due to Google pressure), Windows and the
Azure cloud are a very webby, REST environment; .NET 4 has gone beyond
merely delivering metadata-driven data services (OData etc.) - and now
allows dynamic metadata into the core.


To cite the example: should there be a chain of subclasses in an isa
relation, and the client refers to a subclass instance, how does the
client proxy know what to serialize? The article goes on to show how,
going beyond the static declarations used in the .NET 3 world, one can in
the .NET 4 world build one's own type resolver - one that consumes a URI
as the reference - and how, logically, since I get to write the
implementing class, one might use the RDFS class model from an rdf stream
to build the type resolver for


Now you might argue: well, don't bother serializing by value. *Just* pass
around ever more comprehensive sets of refs (and let document crawlers
collate triples in caches).


I have to go step by step with the semantic web. Linked data is too big a

But Linked Data is the small step: Each Object has an ID and its
Representation has an Address :-)

Semantic Web Project offers other things that build on the foundation above
e.g., basic and advanced reasoning via RDF Schema and OWL-* respectively.

A smaller step is to simply have the SOA framework exploit dynamic type
resolvers, using rdfs/owl as the backing for that resolver. This would of
course be quite a big thing to do in its own right - one which MAY then
set the stage for a pure refs-only world. But to get there, perhaps we
just find a really mainstream application - like powering intelligent SOA -
that wants just the intelligent part of the semantic web, first. From
that, intelligent SOA will compete with document-centric conceptions. But
the two traditions will reinforce each other, each using a common
reasoning engine and knowledge representation, driving dynamic,
ontological typing.

Linked Data lets us work with Data Objects without OS or Programming
Language confinement. The concepts you describe from .NET (take Entity
Framework and LINQ, for example) == what Linked Data is delivering at
InterWeb scale. Basically distilling the entire effort down to:

1. Object ID
2. Object Representation Address
3. Create, Read, Update, and Delete operations (HTTP methods).

It even adds the Time Variant dimension to data (resources), which adds
flexibility and fluidity that isn't naturally part of .NET and other
platform-specific offerings.

Anyway, the marriage is there to be made. Linked Data facilitates
technical polygamy, for lack of a better analogy :-)





From: Kingsley Idehen [mailto:kidehen at openlinksw.com] 
Sent: Tuesday, March 15, 2011 11:23 AM
To: peter williams
Cc: foaf-protocols at lists.foaf-project.org
Subject: Re: [foaf-protocols] webid vs distributed social networks


On 3/15/11 1:01 PM, peter williams wrote: 


Let me recast all this using .NET terminology (classical computer science,
in general).


Since the cert is self-asserted, it's the trusted validator that consults
a source of owl:sameAs statements for the n names in the cert. From a
trusted validator, infer trusted statements. For me as a resource server
trusting the validator, I don't care how it made the leap that the
owl:sameAs is not only true, but trustworthy per se. At the same time,
since any such offloading validator can be hijacked (and probably will
be, as soon as it's Google size), as a resource server I have to be able
to compute the same truths myself. I can then test & verify my offloading
validators (to see when they are acting against my interests).


Yes, of course.

In the reference semantics of .NET, folks now have the framework to do
what Henry often discusses (let metadata define the type of the URI name,
where the metadata is triples/RDF). Since the web is a pass-by-value world,

Huh? URI-based Names imply pass by Reference. That's one of the Linked
Data tenets that might not be so generally obvious.

the de-referencer in .NET can use my own type-resolver when handling
apparent isa relations between subclasses to be shared between clients
and servers. That resolver can now be implemented by me and be used when
processing rest services built by channel factories, where my
implementation can use the foaf/rdf card as its source of structural
relations between [sub]classes. This of course supports relations between
URIs/webids; out of which one builds core trust graphs.


Now that I've got past worrying largely about SSL, certs and sparql
queries, I think I can focus on the security semantics of identifiers.
It's not enough to merely test a foaf card for the existence of a pubkey
(delivering security to the assertion).

It's the existence of a Public Key from a successful handshake that
matches a WebID in a Data Space where the WebID Referent has the
requisite privileges for expressing the aforementioned association. It's
a composite key of the network variety when you decompose it down to:
WebID, Profile Graph URL, and Public Key. Each is individually unique,
but only together do they deliver a "super key" that works well for WOT
scenarios.

Now, the ontologies and the ontology plumbing in the core of the SOA
framework have to really get to grips with what the semantic web is
attempting to address. While there is nothing there I have not seen
before, who cares! What matters is that the right tweak or two by the W3C
community should give it momentum and mass appeal - once society is ready.

Well, society is going to be forced into readiness on the back of privacy
pains and silo growth. WebID should ultimately have a very easy ride with
regards to:

1. Value proposition articulation
2. Value proposition manifestation.

The key challenge for us all is how we deliver 1&2 in a viral manner via
solutions that materialize pain alleviation, Apple style. 

At the end of the day WebID even makes traditional silos like FB,
Twitter, and friends better; they just need to incur some "opportunity
costs" prior to getting it! :-)



From: Kingsley Idehen [mailto:kidehen at openlinksw.com] 
Sent: Sunday, March 13, 2011 1:37 PM
To: peter williams
Cc: foaf-protocols at lists.foaf-project.org
Subject: Re: [foaf-protocols] webid vs distributed social networks


On 3/12/11 2:43 PM, peter williams wrote: 

Ok. I've confused myself.


Simple: http://webid.myxwiki.org/xwiki/bin/view/XWiki/homepw4#me.

This is a ref to an RDFa marked up RDF source.



This is a call to a query engine. I can even stuff it in my cert, to help
the coding of webid VAs.


So what is this?

A Proxy Entity ID (URI Name Ref) generated by the URIBurner service. The
trouble is this: the URIBurner service takes a Resource URL and passes it
through 70+ extractors that use a variety of heuristics that may or may
not result in transformation. Then it does another 70+ lookups against
Web Services, the LOD cloud, etc. The net effect is a much larger and
richer Linked Data graph.

URIBurner has had issues dealing with RDFa out in the wild, since there
isn't uniformity re. use of DOCTYPE declarations etc. Thus, we've ended
up making two RDFa cartridges, i.e., one that assumes the producer knows
what it's doing and another that makes a "best effort" to make sense of
the resource. I've just disabled the "best effort" variant and re-sponged
(SPARQL with HTTP GET invoked), and the result is better.


bid.myxwiki.org/xwiki/bin/view/XWiki/homepw4%01me>  - -not backlinks tab in
this page
bin%2Fview%2FXWiki%2Fhomepw4%23me -- a different page showing the same data

It's an element of the response to the query, of course, which I took to
mean: the required cert exists in the foaf card. Further metadata on this
response is available at that source (I said to myself) - thinking of
that URI as an artifact-refnum from the SAML world.


So let's have a look at the resource. It's a complex XML document with
RDF markup tags, which makes statements about the very same information
as was in my own foaf card.


So, now that said resource exists and has a name, what would it mean if I
did the sparql query against it (versus the myxwiki graph)?


Let me treat the resource as a trusted cache copy of my foaf card, mashed
up with other content.


In webid semantics, what does it mean for that source to assert: pubkey
present in resource? How does that compare with the meaning of myxwiki
asserting: pubkey present in resource? 

Nothing changes re. the relation between public key and webid; especially
as you can always force invocation against the source rather than the
cache via pragma (as per my initial example). In addition, if you
published from a space that had its own SPARQL endpoint, you could use
SPARQL-FED from my instance, which cuts out all the additional sponging
that occurs (when the instance has these cartridges enabled).


Should I now have 2 webids in my cert?



Yes, and a little tweak that we need to make (long scheduled but awaiting
completion and release) is the automatic addition of:

XWiki/homepw4#me> owl:sameAs
<http://webid.myxwiki.org/xwiki/bin/view/XWiki/homepw4#me> .

Then you can get a key match using either URI Name Ref. You achieve this by
invoking the owl:sameAs inference pragma which then handles the union
expansion automatically when processing your SPARQL query.


letting the VA choose which one it wants to consume (based on the authority
in the http scheme)?


Well, we know that the validation agent in webidland wants to enforce:
user has control over the id (and write access to the id'd resource).

But should the VA choose the first above, would it matter in webidland if
the VA confirms that the "user trusted caching agent" has control over
the id (and has delegated write access to the cached security-enforcing
content)?

Shouldn't need to choose since the endpoint can be ACL protected, ditto
specific inference rules (which reside in their own Named Graphs).


I think not, so long as the VA is reasoning with the indirection.


Is there any real difference between the two cases?

Hopefully, I've cleared the coreference issue via comments above.


No, I think. After all, though we assume only the user has write access
to the document (a fact being tested), in reality so does the privileged
administration (who can spoof the user). In the general case, such an
admin is the owner of the portal hosting the blogsite, say (Google,
Yahoo, etc). As we know, given a secret/non-secret order from USG, they
would spoof me at the drop of a hat, no questions asked. They would not
even bother telling me, 99% of the time; such is the nature of that web
sub-society.


Does this mental model sound right?

Yes. But remember there is granular control that can be invoked. If you
were working with an http://id.mopenlinkse.com/ods instance, you could
make the co-reference assertions yourself. Then scope your own queries to
your graph, which can be ACL-constrained while sitting behind an
ACL-constrained SPARQL endpoint etc.


It feels like I should put the first form (trusted cache) in the IAN URI,
and the second form in the SAN URI - so they are "tagged" as
subject-centric and issuer-centric webids, thus signaling that there are
multiple indirections when enforcing, in the issuer case.

I think owl:sameAs inference takes care of this :-)





From: Kingsley Idehen [mailto:kidehen at openlinksw.com] 
Sent: Monday, February 28, 2011 4:08 AM
To: peter williams
Cc: foaf-protocols at lists.foaf-project.org
Subject: Re: [foaf-protocols] webid vs distributed social networks


On 2/27/11 2:51 PM, peter williams wrote: 

Now, this is what I expect the semweb to feel like. A remote agent (or an
agent down a chain of agents) does some work as specified by the
user-agent, probably teaching the user agent, by its result, how to do it
directly the next time.
If the agent provider makes a data silo or insists on being the only
gateway to a public data set, one avoids it politically. If it adds some
value (not just control, not just wrappers, not just aggregation), then
perhaps it's ok.
I'm trying to decide whether or not to boycott Microsoft's new Azure ACS
v2 service when building a realty SAAS site in Azure land, because the
program managers seem to have decided to refuse to allow me to talk to my
SAAS tenants bridged by their ACS service from my wordpress IDP (or the
~3000 sites realtors have in wordpress) - even though the Microsoft
fabric service (ACS) supports the very same protocol as wordpress uses
when talking upstream to the Yahoo IDP.
I tried to alter the query, to make it an existence test. Not sure I
quite got it right. For the m and e values I supply as constants (read
from the incoming client cert), I want it now to answer, essentially:
exists/not-exists.
But it worked (as you gave it to me), 99% of what I want. One last push,
I feel. (Peter starting to get that itch that usually means "go into
budget-finding mode".)
# Pragma for enabling Virtuoso's Sponger Middleware -- the component that:
#  - HTTP GETs resources that may or may not be RDF-format based
#    data containers
#  - Transforms the data into a 3-tuple (triple) based graph
# After the actions above, the SPARQL engine processes the query pattern.
DEFINE get:soft "replace"
PREFIX cert: <http://www.w3.org/ns/auth/cert#>
PREFIX rsa:  <http://www.w3.org/ns/auth/rsa#>
SELECT ?webid
FROM <http://webid.myxwiki.org/xwiki/bin/view/XWiki/homepw4>
WHERE {
    [] cert:identity ?webid ;
         rsa:modulus "...b7c59" ;  # modulus value truncated in the original mail
         rsa:public_exponent "65537" .
}


# Remove the comment from the pragma below if you want to override the
# cache; otherwise the system will refresh it automagically in its own
# time, based on server settings.

# DEFINE get:soft "replace"

PREFIX cert: <http://www.w3.org/ns/auth/cert#>
PREFIX rsa:  <http://www.w3.org/ns/auth/rsa#>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>

SELECT ?webid
FROM <http://webid.myxwiki.org/xwiki/bin/view/XWiki/homepw4>
WHERE {
    [] cert:identity ?webid ;
         rsa:modulus ?m ;
         rsa:public_exponent ?e .

    ?m cert:hex "...d216a705ad08b7c59\n"^^xsd:string .  # hex value truncated in the original mail

    ?e cert:decimal "65537"^^xsd:string .
}


Kingsley Idehen       
President & CEO 
OpenLink Software     
Web: http://www.openlinksw.com
Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca: kidehen 

-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.foaf-project.org/pipermail/foaf-protocols/attachments/20110315/46b23ba0/attachment-0001.htm 
