   RE: [xml-dev] When Searching With Google


Interpret, yes, but also to enable selection of visualization techniques in adaptive graphical user interfaces. As the next generation of adaptive interfaces emerges, we need to know more about the sources and their policies (e.g., their algorithms) so the system can dynamically select the interface model. That's a side jaunt, but it was what was on my mind while I was walking the dog.

To be clear, I am not singling out Google as having become proprietary; I think that if we look around, we'll find non-deterministic aspects in many search engines, and possibly in web services in general. I used Google as the example because I realized that my own mental model of the search interface is likely inconsistent with how Google actually processes the terms I enter. The rest is a follow-on from that realization. Part of that realization was that the metadata being derived is probably skewed as well, but how would one know?

We say we can build web services as encapsulated systems, and we can; yet that old bugaboo of trust-but-verify keeps popping back up. It is as if we need a model for adaptive paranoia, which is what contracting processes are.

len


From: Murali Mani [mailto:mani@CS.UCLA.EDU]

I think Len is arguing that if we know more about how Google searches, we can interpret the results better; for example, in Germany I get German pages, etc.

I think several of the techniques used by Google are more or less known? For example, we know that it uses PageRank, text in anchor tags, tf-idf, etc., but we do not know the exact weights, and we do not know what other metrics are used.
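
For concreteness, here is a minimal, textbook-style sketch of tf-idf scoring in Python. It is not Google's actual ranking function (which is what is undisclosed here), and the toy documents and query are made up purely for illustration.

import math
from collections import Counter

def tf_idf_scores(query_terms, documents):
    # documents: list of token lists; query_terms: list of tokens.
    # Textbook tf-idf only; real engines combine many more signals.
    n_docs = len(documents)
    df = Counter()                       # document frequency per term
    for doc in documents:
        for term in set(doc):
            df[term] += 1
    scores = []
    for doc in documents:
        tf = Counter(doc)
        score = 0.0
        for term in query_terms:
            if df[term] == 0:            # term absent from the corpus
                continue
            idf = math.log(n_docs / df[term])       # rarer terms weigh more
            score += (tf[term] / len(doc)) * idf    # normalized tf times idf
        scores.append(score)
    return scores

docs = [["xml", "schema", "validation"],
        ["google", "search", "ranking", "xml"],
        ["search", "engine", "ranking", "metrics"]]
print(tf_idf_scores(["xml", "search"], docs))

Even in this toy form, the point stands: knowing the general technique (tf-idf, PageRank) does not tell you the weights or the other metrics combined with it.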

I think the question is: has Google crossed the line with respect to what is proprietary and what is disclosed? I think Len argues that it has.

To me, actually, I do not like too much non-determinism in the results that I see. I think Google has crossed the line with the non-determinism seen by the user, and with user-based adaptivity; they are using the wrong metrics? Irrespective of where I am, I want the same results; I think language etc. should be handled differently.

After hearing that Google uses user-based adaptivity, and that they are detecting user characteristics automatically, I think that is not a correct approach. Google always maintained they wanted to keep things simple; I think this is not keeping things simple... but what is the alternative approach?

As someone pointed out, we can go to google.com/ncr to get the same results irrespective of our location?




 
