Archive for March, 2009
I’ve just uploaded the presentation I gave on Sahana at the Sahana 2009 Conference in Colombo, Sri Lanka on the 25th of March, 2009. I’ll put a link up to the associated paper soon as well.
Here’s another CAP idea I wanted to get out before I read a document I’ve been sent that may cover the same topic (just to make sure I don’t inadvertently draw on someone else’s idea). This concept also came to me last year whilst I was working on the CAENZ Public Alerting research report (I’m still waiting for this to be publicly released so I can link to it). My recent post proposing a browser plugin for CAP alerts is part of the bigger picture I am outlining today.
The background for it came from the realisation that there are a significant number of organisations in New Zealand responsible for the publication of alerts – whether to a secure group, or to the general public. For example, there are 16 CDEM Groups, 70-odd local authorities, GeoNet, MetService, the Police, those responsible for infrastructure such as roads, and the Centre for Critical Infrastructure Protection.
Each of these agencies would need some means of hosting a CAP server, and of building resilience into their CAP server(s). Given the potentially large number of CAP servers required, there are some design choices that could produce a strong and robust CAP network without a proliferation of fragile, standalone CAP servers. This is all built on the concept of a secure peer-to-peer network of CAP servers.
It should be possible to federate a group of CAP servers into a cluster. If we take a CDEM Group as an example, the group members may elect to deploy, say, 4 or 5 CAP servers to create a peer-to-peer network providing CAP alert hosting for the group. Any authorised CAP message posted to one of the federated servers would automatically be distributed to the other CAP servers in the federation, so the message is instantly replicated and made available from every server in the federation.
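The replication behaviour described above can be sketched quite simply. This is a minimal in-memory model in Python – `CAPServer`, `federate` and the alert identifiers are illustrative names of my own, not part of Sahana or any real CAP implementation:

```python
# Minimal sketch of a federated CAP cluster: an alert posted to any
# member is replicated to every peer in the federation.

class CAPServer:
    def __init__(self, name):
        self.name = name
        self.peers = []          # other servers in the federation
        self.alerts = {}         # alert identifier -> CAP message

    def receive(self, alert_id, message):
        """Accept an authorised alert and replicate it to all peers."""
        if alert_id in self.alerts:
            return               # already seen: stops replication loops
        self.alerts[alert_id] = message
        for peer in self.peers:
            peer.receive(alert_id, message)

def federate(servers):
    """Make every server a peer of every other server."""
    for s in servers:
        s.peers = [p for p in servers if p is not s]

servers = [CAPServer(f"cap-{i}") for i in range(4)]
federate(servers)

# Posting to any one server makes the alert available from all of them.
servers[0].receive("NZ-2009-001", "<alert>...</alert>")
print(all("NZ-2009-001" in s.alerts for s in servers))  # True
```

The "already seen" check is the important detail: it is what lets every server naively forward to every peer without the message circulating forever.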
I believe that the more robust approach to developing a CAP network is to base it upon peer-to-peer network technology, tweaked to provide a secure means of publishing messages to the network. These servers could of course be deployed in whatever way provides maximum resilience: they might be located close to major New Zealand Internet backbones, and quite possibly well outside their geographic region. This has two benefits for resilience. Firstly, the message is available from multiple servers, so the load (particularly on publicly accessible CAP servers) is distributed across them automatically. Secondly, should any particular server fail, the messages will still be served by the other CAP servers in the federation.
Here is one example of how this could be deployed.
Provide a national network of federated CAP servers at key points – a nationally managed set of strategically located CAP servers. For example, internal Government CAP servers would most likely be located on the Government Shared Network (GSN), or whatever comes out of the recent restructure of that service. Public servers may be spread around both by geography and by ISP (e.g. key ISPs may host a CAP server for their customers). In all cases, clients would fall back to other CAP servers in the federation should their usual server fail.
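From the client's side, that fallback is just a matter of holding a list of federation mirrors and trying each in turn until one answers. A quick Python sketch – the mirror URLs and function names here are invented for illustration, not real services:

```python
# Client-side failover across federation mirrors (illustrative URLs).
MIRRORS = [
    "https://cap1.example.govt.nz/feed.xml",
    "https://cap2.example.net.nz/feed.xml",
]

def fetch_feed(fetch, mirrors=MIRRORS):
    """Try each mirror in turn; return the first successful response."""
    last_error = None
    for url in mirrors:
        try:
            return fetch(url)
        except OSError as err:
            last_error = err     # server down: fall back to the next one
    raise ConnectionError("all CAP mirrors failed") from last_error

# Simulate the first mirror being down.
def fake_fetch(url):
    if "cap1" in url:
        raise OSError("connection refused")
    return "<feed/>"

print(fetch_feed(fake_fetch))  # <feed/>
```

Because every server in the federation carries every message, the client doesn't care which mirror it ends up talking to.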
Naturally, the open approach applied to peer-to-peer file sharing is not appropriate for a trusted network CAP service. To create a more secure network, something like a two-tier approach may be necessary.
CAP Publishing Servers
Private CAP publishing servers may be utilised to act as the publishing gateway to the public read-only peer-to-peer network provided by the CAP read-only servers. Authentication, encryption and/or digital signing should be used as the basis for authorising the publication of a CAP message via the publishing server. The publishing server is responsible for verifying the digitally signed CAP alert, as well as the authentication details, to confirm the user is authorised to post the alert. Once authorised, the CAP publishing server publishes the alert to the read-only servers. This is the only channel for publishing CAP alerts to the network. Some form of CAP authoring software (or service) may be useful for creating CAP messages and then publishing them to the servers. One protocol that may be useful for publishing is Atom, as suggested by this IBM article.
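To make the gateway role concrete, here is a sketch of the two checks a publishing server would perform before replicating anything. Real CAP alerts would be signed with XML digital signatures backed by proper key management; I'm standing in an HMAC over the raw XML and a hard-coded key purely for illustration:

```python
import hashlib
import hmac

SHARED_KEY = b"demo-key"  # stand-in for real key management / PKI

def sign(cap_xml: bytes, key: bytes = SHARED_KEY) -> str:
    """Illustrative stand-in for an XML digital signature."""
    return hmac.new(key, cap_xml, hashlib.sha256).hexdigest()

def publish(cap_xml: bytes, signature: str, authorised_users, user):
    """Gateway checks: verify the user and the signature, then replicate."""
    if user not in authorised_users:
        raise PermissionError(f"{user} is not authorised to publish")
    if not hmac.compare_digest(sign(cap_xml), signature):
        raise ValueError("signature verification failed")
    return True  # would now push the alert to the read-only servers

alert = b"<alert><identifier>NZ-2009-001</identifier></alert>"
print(publish(alert, sign(alert), {"duty-officer"}, "duty-officer"))  # True
```

The point is simply that both checks happen at the one gateway, so the read-only servers never need to hold publishing credentials at all.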
CAP Read-only Servers
These are the user-facing servers that provide CAP messages to their end users. Only the CAP publishing servers are authorised to publish CAP messages to the peer-to-peer network for dissemination.
Naturally, this concept is part of a larger plan to build a CAP framework, and the circle could be partly closed by designing web browser plugins capable of connecting to the peer-to-peer CAP read-only servers.
Widespread deployment of CAP browser plugins may mean that traditional servers cannot support tens or hundreds of thousands of CAP clients regularly checking for new alerts. A peer-to-peer approach will probably provide the most scalable and robust means of disseminating CAP alerts via the Internet.
I visited the MetService website this afternoon to check on weather conditions and the rain radar for the country, and lo and behold, it loaded a brief static forecast page with a couple of paragraphs at the top stating that the website was overloaded.
I assume that the additional load was being generated by the warnings of bad weather in the North Island.
Anyway, a simple but brief point. This is not the first time the MetService website has gone down under the load of public interest when our country’s weather looks like getting a little interesting. I would have thought by now that MetService would have realised they need the capacity in place to serve information via their website at the very time Kiwis most need it, i.e. when severe weather is forecast.
Reducing website functionality down to a simple static forecast during severe weather is entirely inappropriate.
The NZ National Party just twittered a new press release on clamping down on red tape. Here is my reply asking that the Copyright Amendment Act be included in the review, as it is going to impose significant compliance costs on businesses while providing no benefit to anyone other than copyright holders.
To the Honourable Mr Hide:
I urgently request that you add the compliance costs for businesses associated with Section 92 of the Copyright Amendment Act to the list of regulations that require review. The current reprieve until later this month does not suggest that the compliance costs associated with this Act will change.
The draft TCF Code does nothing to address the fact that businesses will still effectively be ISPs as defined in the Act. Compliance costs will include implementing stricter firewall rules (e.g. to ensure that employees are unable to use peer-to-peer software) and deploying expensive tracking software to log all employee activity on the business Internet connection (accurate auditing will be required either to identify an offending employee, or to prove to the upstream ISP that no offence was committed).
If it were not for the Copyright Amendment Act, these measures would not need to be implemented. As it stands, every small business in New Zealand is going to be stuck with potential compliance costs in the thousands of dollars just to upgrade their organisation’s firewall to comply with the Act.
To paraphrase the press release – “Businesses want to get on with productive activity without being hindered by silly rules imposed by inappropriate regulation such as the Copyright Amendment Act”.
During a recession such as this, now is not the time to force small businesses to waste time and potentially thousands of dollars implementing measures to protect themselves against poorly drafted legislation. Times are such that small businesses have better uses for their money.
Section 92 exposes businesses to significant risk, as more and more businesses have come to rely on their Internet connection. Either spend thousands tightening up your organisation’s firewall and policies to mitigate some of the potential downtime if your organisation is identified as a copyright infringer; or don’t comply, and when your organisation is identified as an infringer because of the actions of one of your employees, you won’t have the systems in place to identify and defend that employee in case they have been falsely accused.
Should businesses be forced to spend what little spare money they have on a compliance cost that will only have a detrimental effect on cashflow during a recession? No.
Whilst I respect and support copyright holders (I’m one myself, as a photographer and maybe soon an open source programmer), businesses should not have to incur expenses because the entertainment industry wants to turn them into its copyright police.
I haven’t blogged about Sahana for a long time, and I’ve got plenty to write. So much that I can’t decide where to start, so I’m going to pick a nice small piece to start with.
Last year, I was involved in a project in New Zealand to produce an investigative report on Public Alerting Systems with the New Zealand Centre for Advanced Engineering. This report will hopefully soon go public, and I’ll provide a link when it does.
This report was looking at the different technological solutions for getting alerts out to people in as timely a manner as possible. At one point in the search for different systems, we started discussing means of injecting HTML into web pages via an ISP, so that a public alert could be sent to anyone on the Internet. I’ll talk about this and other options later. Let me get to the point of this post.
After starting with the HTML injection idea, and progressing through a few others, I reached a kind of natural conclusion that a more suitable means of alerting users via a web browser would be a browser plugin that can subscribe to Common Alerting Protocol (CAP) feeds; when a relevant alert comes in via CAP, it is displayed to the user in their browser using a XUL notificationbox at the top of the web page.
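The plugin itself would be Firefox XUL/JavaScript, but the core job – pulling the headline out of a CAP alert for display in a notification bar – is easy to sketch. Here it is in Python against a hand-written CAP 1.1 sample; the `notification_text` helper and the sample alert are my own inventions, though the element names and namespace are from the CAP 1.1 standard:

```python
import xml.etree.ElementTree as ET

# CAP 1.1 XML namespace, as defined by the OASIS standard.
CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.1}"

SAMPLE = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.1">
  <identifier>NZ-2009-001</identifier>
  <info>
    <event>Severe Thunderstorm</event>
    <urgency>Immediate</urgency>
    <headline>Severe thunderstorm approaching Wellington</headline>
    <area><areaDesc>Wellington Region</areaDesc></area>
  </info>
</alert>"""

def notification_text(cap_xml: str) -> str:
    """Reduce a CAP alert to a one-line notification bar message."""
    info = ET.fromstring(cap_xml).find(f"{CAP_NS}info")
    headline = info.findtext(f"{CAP_NS}headline")
    urgency = info.findtext(f"{CAP_NS}urgency")
    return f"[{urgency}] {headline}"

print(notification_text(SAMPLE))
# [Immediate] Severe thunderstorm approaching Wellington
```

A real plugin would of course poll the subscribed feeds and feed this string into the notificationbox rather than printing it.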
Anyway, a possible idea for a Google Summer of Code 2009 project is that of constructing a browser plugin for Firefox that implements this alerting capability, and expanding Sahana to support full publishing of CAP alerts. Here are some features it could/should support.
- Bundle publicly available CAP feeds (ideally listed in a nice Country/State taxonomy) – this will make it easy to discover and utilise existing CAP services.
- Allow users to optionally register their location in some manner, so that the plugin can identify relevant alerts (by location) and give them higher status than, say, remote alerts. Users should be able to register multiple locations – whether home and work, or multiple cities. Privacy is of course king, and this information must be protected.
- Provide a means of adding additional user provided CAP feeds to the plugin.
- Provide the ability to open the alert in a new tab and format in a human-readable manner, including niceties such as embedding Google Maps to show geospatial information and links back to the source website of the alert for verification.
- Implement a means of verifying messages that are digitally signed, and of decrypting encrypted messages.
- Implement a CAP feed in Sahana so that Sahana can act as both a producer (in terms of creating a CAP message) and a publisher (in terms of making it available via a CAP RSS/Atom feed).
- Implement a CAP proxy or similar, so that, say, all users of a Sahana server can obtain CAP alerts directly from the Sahana server rather than going to an external website. This may be useful for distributing alerts within an organisation or centre without having every client browser connect to an external server.
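On the location-relevance point in the list above: a real implementation would match the user's registered locations against the geospatial elements of a CAP alert (its polygons and circles), but even naive matching against the alert's area description conveys the idea. Everything in this Python sketch is illustrative:

```python
# Naive relevance check: match registered user locations against an
# alert's human-readable areaDesc. A real plugin would test the user's
# coordinates against the CAP <polygon>/<circle> geometry instead.

def relevance(alert_area: str, user_locations) -> str:
    area = alert_area.lower()
    if any(loc.lower() in area for loc in user_locations):
        return "local"       # highlight prominently in the browser
    return "remote"          # show with lower priority, or suppress

locations = ["Wellington", "Christchurch"]   # e.g. home and work
print(relevance("Wellington Region", locations))   # local
print(relevance("Auckland Region", locations))     # remote
```

The locations themselves should never leave the user's machine – the filtering happens client-side, which is also what keeps the privacy requirement tractable.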
What would be very nice, but may be beyond the capabilities of Sahana servers currently, is making the CAP service on a Sahana server easily discoverable on a LAN via zero-conf services such as Bonjour.
Draft Outcomes for Assessment
The outcome of such a project would be a working solution whereby a Firefox browser plugin is capable of working with public CAP alerts, and CAP within Sahana is capable of fully acting as a CAP server, delivering RSS/Atom feeds to the CAP alerting plugin.
- Implement the specified requirements
- The browser plugin works as expected with publicly available CAP feeds.
- The browser plugin works as expected against the Sahana demo server. (Yes, this means that your modifications to CAP on SahanaPHP need to be implemented).
- Implement the Sahana CAP server in SahanaPY
- Provide one or more standalone CAP clients for a mobile platform e.g. Google Android, Apple iPhone/iPod Touch etc
- Write an Internet Explorer plugin with similar functionality – it is important that this functionality is also provided for IE given its widespread usage and deployment.
Whilst the plugin can and should operate completely independently of Sahana, it should also be designed to work well with Sahana servers (e.g. SahanaPHP and SahanaPY).
Anyway, this is just an idea I wanted to float and get out in the community for discussion. I’d welcome any further comment or ideas to build upon this!