Doug Royer wrote:
>
> Yes! And I wonder how many people that try to shoot down XML
> have ever had to hand write a parser for complex data transfers
> that use arbitrary data models. One parser bug can ruin your
> whole project.
>
I'm not a parser expert by any stretch of the imagination, but two
parsing/web-page-scraping projects come to mind: one to scrape
advertiser bids from Overture for bid gap analysis (before Overture shut
that down), and one to pull weekly NFL player statistics for scoring
fantasy football leagues. The final score:
  Overture - 100% parsed from scratch: 2 or 3 weeks to finally
  get it right.
  Football - with the proper application of HTMLTidy and XSLT: 2 days
  to pull stats from not just one site but from two completely
  different sites.
And let me tell you, if you want arbitrary data models, each of the
football statistics pages was apparently written by a couple of
different designers and populated by twice as many content management
systems. It doesn't come much more arbitrary than that.
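For anyone curious what that pipeline looks like, here's a minimal
sketch of the tidy-then-transform idea. In the real setup, HTMLTidy
turns messy HTML into well-formed XHTML and an XSLT stylesheet pulls
out the stats; in this sketch, Python's stdlib ElementTree stands in
for the XSLT step, and the XHTML snippet and column layout are invented
for illustration only.

```python
# Sketch of the tidy-then-transform scraping pipeline described above.
# Step 1 (not shown): run HTMLTidy over the raw page to get well-formed
# XHTML. Step 2: extract the stats table -- done with XSLT in the real
# pipeline, with stdlib ElementTree here as a stand-in.
import xml.etree.ElementTree as ET

# Pretend this is HTMLTidy's output for one site's stats page.
# (Table id and columns are hypothetical, for illustration.)
tidied_xhtml = """
<html><body>
  <table id="stats">
    <tr><th>Player</th><th>Yards</th><th>TD</th></tr>
    <tr><td>Smith</td><td>112</td><td>2</td></tr>
    <tr><td>Jones</td><td>87</td><td>1</td></tr>
  </table>
</body></html>
"""

def pull_stats(xhtml):
    """Extract (player, yards, touchdowns) tuples from the stats table."""
    root = ET.fromstring(xhtml)
    # Skip the header row, then read the three data cells per row.
    rows = root.findall(".//table[@id='stats']/tr")[1:]
    return [(r[0].text, int(r[1].text), int(r[2].text)) for r in rows]

print(pull_stats(tidied_xhtml))
```

A second site with completely different markup only needs a different
extraction function -- or, in the real pipeline, a different
stylesheet -- which is why the second site took days instead of weeks.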
The ease with which the football stats were pulled using HTMLTidy and
XSLT gave rise to my newest million-dollar idea: developing ways to
protect website data from the evils of XSLT scraping. (Any
investors? :)
If that sort of efficiency is boring, I'll never need exciting software
again.
--
Steve Rosenberry
Sr. Partner
Electronic Solutions Company -- For the Home of Integration
http://ElectronicSolutionsCo.com
(610) 670-1710