XML Daily Newslink. Tuesday, 17 October 2006
A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS http://www.oasis-open.org
Edited by Robin Cover
====================================================
This issue of XML.org Daily Newslink is sponsored
by BEA Systems, Inc. http://www.bea.com
====================================================
HEADLINES:
* Getting to Know the Atom Publishing Protocol, Part 1
* W3C Launches Secure Browsing Initiative as Part of Security Activity
* WS-Notification Version 1.3 Approved as an OASIS Standard
* WebCGM 2.0 Becomes a W3C Proposed Recommendation
* French Prime Minister Recommends Adoption of Open Document Format
* Look Who's Updating Those Data Dinosaurs
* Is Open XML a One Way Specification for Most People?
* Sun Thinks Inside the Box for Datacenter System
----------------------------------------------------------------------
Getting to Know the Atom Publishing Protocol, Part 1
James Snell, IBM developerWorks
The IETF Atom Syndication Format, or Atom 1.0 as it is commonly known,
has been deployed to millions of Web sites and is supported by every
major syndication platform on the market. Today, just over a year after
Atom 1.0 was finalized, work nears completion on the second of Atom's
two core specifications: the Atom Publishing Protocol. The Atom Publishing
Protocol is an HTTP-based approach for creating and editing Web
resources. It is designed fundamentally around the idea of using the
basic operations provided by the HTTP protocol (such as GET, PUT, and
DELETE) to pass around instances of Atom 1.0 Feed and Entry documents
that represent things like blog entries, podcasts, wiki pages, calendar
entries and so on. Central to the Atom Publishing Protocol is the concept
of collections of editable resources that are represented by Atom 1.0
Feed and Entry documents. A collection has a unique URI. Issuing an HTTP
GET request to that URI returns an Atom Feed Document. To create new
entries in that feed, clients send HTTP POST requests to the
collection's URI. Those newly created entries will be assigned their
own unique edit URI. To modify those entries, the client simply
retrieves the resource from the collection, makes its modifications,
then sends it back with an HTTP PUT request. Removing an entry from
the feed is a simple matter
of issuing an HTTP DELETE request to the appropriate edit URI. All
operations are performed using simple HTTP requests and can usually be
performed with nothing more than a simple text editor and a command
prompt... In the next installment of this series, I will walk through
a number of application scenarios that are considered good uses of the
protocol. These include such obvious things as Weblogs, social
bookmarking and photo album type applications as well as somewhat non-
obvious uses in calendaring, contact management, document and media
content repositories, database management, situational applications and
even Service Oriented Architecture. Beyond that, you will explore how
to implement an Atom Publishing Protocol client and server in Java using
Apache Abdera, an open source Atom implementation currently in incubation
at the Apache Software Foundation, and will step through the creation
of an APP-enabled application service.
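To make the GET/POST/PUT/DELETE lifecycle concrete, here is a minimal
client sketch in Java, using only the JDK's built-in HTTP client rather
than Abdera; the collection URI and entry payload are hypothetical
placeholders, and a real client would discover the collection URI from
a service document:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class AppClientSketch {
        public static void main(String[] args) throws Exception {
            HttpClient http = HttpClient.newHttpClient();
            // Hypothetical collection URI, for illustration only.
            URI collection = URI.create("http://example.org/blog/entries");

            // GET the collection URI: the body is an Atom Feed Document.
            HttpRequest list = HttpRequest.newBuilder(collection).GET().build();
            System.out.println(
                http.send(list, HttpResponse.BodyHandlers.ofString()).body());

            // POST an Atom Entry Document to create a new member entry.
            String entry = "<entry xmlns='http://www.w3.org/2005/Atom'>"
                + "<title>Hello APP</title>"
                + "<content>First post</content></entry>";
            HttpRequest create = HttpRequest.newBuilder(collection)
                .header("Content-Type", "application/atom+xml")
                .POST(HttpRequest.BodyPublishers.ofString(entry))
                .build();
            HttpResponse<String> created =
                http.send(create, HttpResponse.BodyHandlers.ofString());
            // The server assigns the new entry its own edit URI,
            // returned here in the Location header.
            String editUri =
                created.headers().firstValue("Location").orElseThrow();

            // To modify, PUT a revised entry to the edit URI; to remove
            // the entry from the feed, simply DELETE the edit URI.
            HttpRequest remove =
                HttpRequest.newBuilder(URI.create(editUri)).DELETE().build();
            http.send(remove, HttpResponse.BodyHandlers.discarding());
        }
    }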
http://www-128.ibm.com/developerworks/web/library/x-atompp1/index.html
See also Atom references: http://xml.coverpages.org/atom.html
----------------------------------------------------------------------
W3C Launches Secure Browsing Initiative as Part of Security Activity
Staff, W3C Announcement
W3C has announced the creation of a new Web Security Context Working
Group (WSC) whose mission, as part of the W3C Security Activity, is to
enable a secure and usable interface so Web users can make safe trust
decisions on the Web. Mary Ellen Zurko (IBM) chairs the group, which is
chartered to establish requirements and deliver standards for presenting
essential security information to users and for ensuring the integrity
of that information. According to the published Charter, the mission of
the Web Security Context Working Group is to specify a baseline set of
security context information that should be accessible to Web users,
and practices for the secure and usable presentation of this information,
to enable users to come to a better understanding of the context that
they are operating in when making trust decisions on the Web. The charter
follows up on discussions from the W3C Workshop on Usability and
Transparency of Web Authentication on leveraging metadata and improving
the security of user interfaces and user agent behaviors. Current Web
user agents communicate only a small portion of available security
context information to users in a way that is easily perceived and
understood. Other context information that might be available to user
agents and possibly helpful to users is either not presented, or presented
in a way that is not understood by users, and hence useless or confusing.
This information ranges from logotypes and company names and addresses
that might be present in PKI certificates, to the user agent's memory of
past activities. Where the mechanisms used to communicate context
information can be effectively spoofed by Web content, they open the
door for attackers to serve fake security indicators, and become useless.
Tim Berners-Lee, W3C Director: "When I'm browsing the Web, I want my
browser to help me understand who really is the owner of a Web page;
there is much deployed and proven security technology, but we now need
to connect it all the way through to the Web user. A Web browser acts on
my behalf as I surf the Web, and I need more help from it to avoid being
spoofed."
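As one concrete example of the security context information the charter
refers to, the identity fields carried in a site's PKI certificate can
be read programmatically; the Java sketch below shows how a user agent
might retrieve them (the target URL is just an example):

    import java.net.URL;
    import java.security.cert.Certificate;
    import java.security.cert.X509Certificate;
    import javax.net.ssl.HttpsURLConnection;

    public class CertContextSketch {
        public static void main(String[] args) throws Exception {
            // Open an HTTPS connection and complete the TLS handshake.
            HttpsURLConnection conn = (HttpsURLConnection)
                new URL("https://www.w3.org/").openConnection();
            conn.connect();

            // The server certificate carries identity fields (organization,
            // locality, and so on) that a browser could surface to users.
            for (Certificate c : conn.getServerCertificates()) {
                if (c instanceof X509Certificate x509) {
                    System.out.println("Subject: " + x509.getSubjectX500Principal());
                    System.out.println("Issuer:  " + x509.getIssuerX500Principal());
                    System.out.println("Expires: " + x509.getNotAfter());
                }
            }
            conn.disconnect();
        }
    }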
http://www.w3.org/2005/Security/wsc-charter
See also the announcement: http://www.w3.org/2006/10/security-pressrelease
----------------------------------------------------------------------
WS-Notification Version 1.3 Approved as an OASIS Standard
Staff, OASIS Announcement
OASIS announced that its members voted to approve WS-Notification
version 1.3 as an OASIS Standard. WS-Notification defines a pattern-
based approach for disseminating information amongst Web services. The
event-driven pattern that's defined in WS-Notification is very similar
to the one used by publish/subscribe systems from message-oriented
middleware vendors and in many device management applications. There
are many use cases for WS-Notification in the areas of system and device
management and also in commercial fields, such as electronic trading.
The WS-Notification OASIS standard consists of three specifications:
WS-BaseNotification; WS-BrokeredNotification; and WS-Topics.
WS-BaseNotification defines standard message exchanges that allow one
service to register or de-register with another, and to receive
notification messages from that service. WS-BrokeredNotification builds
on WS-BaseNotification to define the message exchanges to be implemented
by a "Notification Broker." A Notification Broker is an intermediary
that decouples the publishers of notification messages from the
consumers of those messages; among other things, this allows
publication of messages from entities that are not themselves Web
service providers. WS-Topics provides an XML model to organize and
categorize classes of events into "Topics," enabling users of
WS-BaseNotification or WS-BrokeredNotification to specify the types
of events in which they are interested. WS-Notification was designed
to fit well with related standards. It makes use of the Web Services
Resource Framework (WSRF) OASIS Standard, and is, in turn, used by
the Web Services Distributed Management (WSDM) OASIS Standard.
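The decoupling that a Notification Broker provides is easy to illustrate
in miniature. WS-Notification itself defines these roles as SOAP message
exchanges over Web services, not Java types, so the in-process sketch
below is illustrative only:

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Stand-in for a WS-BaseNotification consumer endpoint.
    interface NotificationConsumer {
        void notify(String topic, String message);
    }

    // Toy broker: publishers hand it messages by topic, and it fans them
    // out to subscribers. Publishers never see consumers and need not be
    // Web service providers themselves -- the decoupling described above.
    class Broker {
        private final Map<String, List<NotificationConsumer>> subs =
            new ConcurrentHashMap<>();

        void subscribe(String topic, NotificationConsumer consumer) {
            subs.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>())
                .add(consumer);
        }

        void publish(String topic, String message) {
            subs.getOrDefault(topic, List.of())
                .forEach(c -> c.notify(topic, message));
        }
    }

    public class BrokerDemo {
        public static void main(String[] args) {
            Broker broker = new Broker();
            broker.subscribe("trades/executed",
                (topic, msg) -> System.out.println(topic + ": " + msg));
            broker.publish("trades/executed", "100 shares @ 91.50");
        }
    }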
http://www.oasis-open.org/news/oasis-news-2006-10-11.php
See also the OASIS WSN TC web site: http://www.oasis-open.org/committees/wsn/
----------------------------------------------------------------------
WebCGM 2.0 Becomes a W3C Proposed Recommendation
Benoit Bezaire, David Cruikshank, Lofton Henderson (eds), W3C PR
W3C has announced the advancement of the WebCGM 2.0 specification
to the level of Proposed Recommendation. Computer Graphics Metafile
(CGM) is an ISO standard, defined by ISO/IEC 8632:1999, for the
interchange of 2D vector and mixed vector/raster graphics. WebCGM is
a profile of CGM, which adds Web linking and is optimized for Web
applications in technical illustration, electronic documentation,
geophysical data visualization, and similar fields. First published
(1.0) in 1999 and followed by a second (errata) release in 2001, WebCGM
unifies potentially diverse approaches to CGM utilization in Web
document applications. It therefore represents a significant
interoperability agreement amongst major users and implementers of the
ISO CGM standard. WebCGM 2.0 adds a DOM (API) specification for
programmatic access to WebCGM objects, and a specification of an XML
Companion File (XCF) architecture, for externalization of non-graphical
metadata. WebCGM 2.0, in addition, builds upon and extends the graphical
and intelligent content of WebCGM 1.0, delivering functionality that
was forecast for WebCGM 1.0, but was postponed in order to get the
standard and its implementations to users expeditiously. The design
criteria for WebCGM aim at a balance between graphical expressive power
on the one hand, and simplicity and implementability on the other. A
small but powerful set of standardized metadata elements supports the
functionalities of hyperlinking and document navigation, picture
structuring and layering, and enabling search and query of WebCGM
picture content. Comments are welcome through 30 November. Several
implementations of WebCGM 2.0 are already available.
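To suggest how an XML Companion File separates metadata from the binary
graphic, here is a small Java sketch that reads link metadata from a
made-up companion document; the element and attribute names below are
invented for illustration and do not follow the actual XCF schema:

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import org.xml.sax.InputSource;

    public class XcfSketch {
        public static void main(String[] args) throws Exception {
            // Invented companion-file content: per-object metadata kept
            // outside the binary CGM picture itself.
            String xcf =
                  "<companion>"
                + "  <object id='pump-42' screentip='Fuel pump'>"
                + "    <link href='parts/pump-42.html'/>"
                + "  </object>"
                + "</companion>";

            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xcf)));

            NodeList objects = doc.getElementsByTagName("object");
            for (int i = 0; i < objects.getLength(); i++) {
                Element obj = (Element) objects.item(i);
                System.out.println(obj.getAttribute("id") + " -> "
                    + obj.getAttribute("screentip"));
            }
        }
    }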
http://www.w3.org/TR/2006/PR-webcgm20-20061017/
See also the OASIS CGM Open WebCGM TC web site: http://www.oasis-open.org/committees/cgmo-webcgm/
----------------------------------------------------------------------
French Prime Minister Recommends Adoption of Open Document Format
European Communities, eGovernment News
"Official report recommends adoption of Open Document Format: A recently
published report, commissioned by the French Prime Minister, Dominique
de Villepin, strongly recommends that France should follow the example
of Belgium and make Open Document Format (ODF) mandatory for all public
bodies. The report, 'On equal terms', was prepared for the Prime Minister
by the Member of Parliament for the Tarn region, Bernard Carayon. In it,
Carayon calls for new legislation to make it compulsory for French
government departments to use ODF for the creation and dissemination of
documents. He also suggests that France should ask its European partners
to do likewise when exchanging documents at a European level.
Interoperability and the use of open standards are a precondition of
European technological development, stresses the report. It argues that
the widespread adoption of ODF would help encourage the development of
software which supports ODF, and could create more opportunities for
French and European businesses. ODF was approved as an ISO official
standard file format in May 2006. Shortly after this, the Belgian federal
government adopted a proposal to make ODF the mandatory standard for
all internal government documents from September 2008 onwards. Belgium
thus became the first Member State to take this important step towards
Open Source standards aimed at ensuring the effective delivery of
eGovernment services to citizens and enterprises. France now looks likely
to follow suit, and other Member States are also examining this
possibility closely. Carayon's report also recommends the creation, by
the EU, of a body to ensure the technological independence of Europe,
and calls for the setting up of a research centre addressing issues
relating to the security of open source software."
http://ec.europa.eu/idabc/en/document/6206/194
See also IDABC eGovernment News: http://ec.europa.eu/idabc/en/chapter/194
----------------------------------------------------------------------
Look Who's Updating Those Data Dinosaurs
John Pulley, Federal Computer Week
Nearly a dozen states, using different approaches including
service-oriented architecture, are focused on reducing the costs of
presenting benefits eligibility information to caseworkers. Eager to improve
the delivery of health and human services, states are replacing their
mainframes with more powerful, more nimble systems that incorporate
sophisticated service-oriented architecture (SOA). The transition is
akin to replacing a two-toed sloth with an ocelot -- the DNA is
fundamentally different. Bridging the gap between technological
obsolescence and best-of-breed information technology is made more
challenging because demands on states' resources often require social
services programs to do more with less. In addition to the scarcity of
funds, few states have the political wherewithal or the risk tolerance
to replace their decrepit processing systems in one fell swoop... Even
as states move to acquire new systems for managing social services,
recent missteps in Texas have tempered the march to modernization, if
only temporarily. The Texas Integrated Eligibility Redesign System
(TIERS) was conceived to modernize out-of-date 1970s technology used
by the state's Health and Human Services Department. By employing a
browser-based system to integrate the application process for more
than 50 health and human services programs in Texas, officials had
hoped to save money and improve service delivery, but the implementation
of TIERS has not gone as planned. If Texas took a jackrabbit approach
to integrated eligibility, California's CalWorks Information Network
(CalWIN) system is the tortoise. The network recently concluded
implementation of a new integrated eligibility system throughout an
18-county consortium by bringing one jurisdiction online every month
for 18 months. The California consortium and its partners designed
and developed a state-of-the-art, robust eligibility determination
system built on SOA principles...
http://www.fcw.com/article96443-10-16-06-Print
----------------------------------------------------------------------
Is Open XML a One Way Specification for Most People?
Bob Sutor, Bob Sutor's Open Blog
Blog genre: "Who will implement Open XML correctly and fully? Maybe
Microsoft. Why? Since it is essentially a dump into XML of all the
data needed for all the functionality of their Office products and
since those products are proprietary, only they will understand any
nuances that go beyond the spec. The spec may illuminate some of the
mistakes that have been made and are now being written into a so-called
standard for all to have to implement, but I'm guessing there might be
a few other shades of meaning that will not be clear. Fully and
correctly implementing Open XML will require the cloning of a large
portion of Microsoft's product. Best of luck doing that, especially
since they have over a decade head start. Also, since they have avoided
using industry standards like SVG and MathML, you'll have to reimplement
Microsoft's flavor of many things. You had better start now. I
therefore conclude that while Microsoft may end up supporting most
of Open XML (and we'll have to see the final products to see how much
and how correctly), other products will likely only end up supporting
a subset."
http://www.sutor.com/newsite/blog-open/?p=1145
See also Tim Bray's blog: http://www.tbray.org/ongoing/When/200x/2006/10/16/OOXML-Hoo-Hah
----------------------------------------------------------------------
Sun Thinks Inside the Box for Datacenter System
Robert Mullins, InfoWorld
Sun claims its Project Blackbox offers a mobile datacenter at a
fraction of the cost of a traditional one, with 20 percent better
power efficiency. Sun's
Project Blackbox crams multiple servers and storage hardware into a
box the size of a semi-trailer truck that can be literally driven up
to a company, plugged in, and turned on. The Blackbox, available in
either a 20-foot or 40-foot long shipping container, can be configured
to hold up to 250 Sun Fire servers or up to 2 petabytes worth of
storage devices or 7 terabytes worth of memory. The equipment runs on
Sun's Solaris 10 operating system and uses water cooling to dissipate
heat from the processors. This kind of rapid deployment of extra
computing power will address many of the needs of the modern enterprise
data center concerned with performance but also energy and space
efficiency. IBM cited Gartner research saying that by 2009, 70
percent of datacenter facilities will fail to meet operational and
capacity requirements without some level of renovation, expansion or
relocation. "Basically, it rolls up to you, you hook up the power,
you hook up your network and you hook up the chiller water lines and
you're ready to go," [Anil] Gadre said. "It's like prefab housing."
Research firm IDC, meanwhile, reports that by 2007, spending on powering
and cooling datacenters will exceed spending on the computer hardware
itself. While not providing specific pricing information, Sun claims
its Project Blackbox system will be available at one one-hundredth of
the initial cost of traditional datacenters with the same computing
power, and offer 20 percent more power efficiency.
http://www.infoworld.com/article/06/10/17/HNsundatacentersystem_1.html
See also eWEEK: http://www.eweek.com/article2/0,1895,2031935,00.asp
----------------------------------------------------------------------
BEA Systems, Inc. http://www.bea.com
IBM Corporation http://www.ibm.com
Innodata Isogen http://www.innodata-isogen.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com
----------------------------------------------------------------------
Newsletter subscribe: xml-dailynews-subscribe@lists.xml.org
Newsletter unsubscribe: xml-dailynews-unsubscribe@lists.xml.org
Newsletter help: xml-dailynews-help@lists.xml.org
Cover Pages: http://xml.coverpages.org/
----------------------------------------------------------------------