   RE: [xml-dev] indexing and querying XML (not XQuery)


From: 'Alan Gutierrez' [mailto:alan-xml-dev@engrm.com]

* Bullard, Claude L (Len) <len.bullard@intergraph.com> [2005-08-23 15:34]:

> One other thought:  suppose that rather than providing Google with
> original content, the index was locally generated and submitted in
> a standard format.  Wouldn't that be something like a locally
> generated topic map that is published as the sole interface to the
> content?  Is that your idea, Alan? As long as the code is
> inspectable (no compiled components), this is palatable.

>    Sorry, fading in and out today. Programming, for one. A lot of
>    new information to try to absorb, for another.

>    Topic maps? You mean these?

>        http://www.topicmaps.org/xtm/index.html

Yes.  They may not be exactly what you want, but the idea is that 
local search engines return an index in a spec'd format, enabling 
smooth, integrated control of results instead of spidering and 
scraping, and thus returning control to the local site.
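
For instance, a locally generated index submitted in that format 
might look roughly like this (XTM 1.0, from memory; the topic and 
the entry URL are made up for the example):

    <topicMap xmlns="http://www.topicmaps.org/xtm/1.0/"
              xmlns:xlink="http://www.w3.org/1999/xlink">
      <topic id="local-indexing">
        <baseName>
          <baseNameString>local indexing</baseNameString>
        </baseName>
        <occurrence>
          <resourceRef xlink:href="http://example.org/blog/2005/08/23/entry"/>
        </occurrence>
      </topic>
    </topicMap>

The engine gets the map; the content stays home.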


>    In essence, most blogs have a search box, so most blogs already
>    have a search engine of some kind on board.

>    I'd like to provide a REST interface that generates results in
>    XML, so that a blog can be searched via scripting.

Fine so far.
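
Something like this, say, where the query goes in on the URL and 
the results come back as XML (the URL and element names here are 
only placeholders, not a proposal):

    GET http://example.org/blog/search?q=topic+maps

    <results query="topic maps">
      <result href="http://example.org/blog/2005/08/23/entry"
              title="Indexing without spiders"
              score="0.83"/>
    </results>

Any script that can fetch a URL and parse XML can then search the 
blog.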

>    I'd like to create a REST interface that would allow a script,
>    or a user interface, to create a comment regarding relevance,
>    and have the blog store that comment.

Blogs have comments now.  Microsoft web pages have ranking boxes, but 
these are preset questions or a simple 'pick a number between 1 and n'. 
Relevance is a tricky concept.  Relevant to what?
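
If it helps to pin the idea down, the comment interface might be a 
POST of a small XML document (names invented for the sake of the 
example):

    POST http://example.org/blog/relevance
    Content-Type: application/xml

    <comment>
      <about href="http://example.org/blog/2005/08/23/entry"/>
      <relevantTo topic="topic maps" rating="4"/>
      <from blog="http://commenter.example.net/"/>
    </comment>

Even there, 'relevantTo' is the rub: relevant according to whose 
categories?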

>    Gaming is countered by only permitting scripts or persons to
>    comment, if they also have a blog, and that blog is listed as a
>    trusted blog, somewhere.

So a social network of members who can comment.  Blogger has that 
now.  Having scripts comment is tricky.  That is where the 'do 
we trust this algorithm' question comes up.  The article Koberg 
referenced mentions this issue and notes that in Cutting's opinion, 
most of the Google secret-sauce algorithms have been reverse 
engineered by the gamers.  Given a choice, I'd ask for full transparency 
for any algorithm used to evaluate my blog or any network of blogs 
in which I elect membership.  I would want right of first refusal 
if my blog were selected for membership, and the ability to block 
engines that attempt to pirate it into a social network (no press 
gangs).

>    A lot of voting. A lot of consensus. No single authority.

Except for the script writer and the creator of the ontological 
basis for relevance.  To make this work openly, you will need 
some form of folksonomy.

If this all seems paranoid, well, it is.  The World Wide Web 
has evolved into a universe full of one-click nasties, neglinks, 
secret police, and the full rot of propaganda.  It has all of the 
good stuff too, but 'trust until violated' rules aren't working, 
so defensive measures are necessary, and they have emerged in the 
form of Ad-Aware and the like.  With blogs, our publications are 
now being used as well as our surfing behaviors.  Folksonomies are 
one thing; having opaque scripts based on highly local, no-opt-in 
categories create the linkages is suspect.

Opt-with or opt-not.

len




 
