Archive for the ‘standards’ tag
I had the opportunity on Monday (6 December 2010) to attend a great workshop held in Wellington. It started off with a set of three presentations by Mark Reichardt, the President and CEO of the Open Geospatial Consortium.
Mark gave a number of presentations during the morning, starting with one to the Government CEOs, followed by one focused on OGC and Disaster Management, and finishing with An Overview of OGC Standards and Programs. Hopefully these will become publicly accessible on portal.opengeospatial.org – I’ll put links up when the files are openly available.
I got a great buzz seeing Sahana, OpenStreetMap, CrisisCommons and Ushahidi mentioned in Mark’s slides, and these even got coverage in the presentation to the invited Government CEOs at the breakfast session. In the open presentation on OGC and Disaster Management in the morning, Sahana got its own slide and was recognised for incorporating various OGC standards including Web Map Service (WMS), Catalogue, KML, Web Feature Service (WFS) and Web Coverage Service (WCS). It is great to see the significant geospatial efforts that the likes of Mifan, Fran and David have put in – amongst others, and sorry I can’t recall everyone’s names!
I also talked about my experience around geospatial data and building safety evaluations following the Canterbury earthquake, and there seemed to be some real interest in using that as a possible test bed geospatial project that we may be able to undertake here in NZ. I must get onto blogging more about that.
As always, it was a great opportunity to catch up with some colleagues that I hadn’t seen for a while, and it has been particularly reassuring to see some great minds that were thought lost with the State Services Commission restructure now turning up in Land Information New Zealand. LINZ taking on these people definitely sends a positive signal about open data and standards. In particular I want to note Richard Murcott who is now the Geospatial Standards Leader at LINZ.
Anyway – what were some of the points I took away from the workshop?
- It sounds like Land Information New Zealand is going to become a full OGC member in 2011.
- The OGC GeoSMS standard is coming soon! A discussion document from February 2010 on OGC GeoSMS is available here (link to agreement page, then PDF download). It will be great to have a GeoSMS standard to work to, as we had previously created our own for the Naval Postgraduate School Disaster Relief Experiments.
- Mark highlighted that demonstrating standards – through test bed projects and the like – is one of the most important aspects of standards promotion.
- The Taiwanese have been doing quite a bit of work with OGC, and have been doing some very interesting debris flow monitoring projects with OGC Sensor Web. There is supposed to be a good pdf available that outlines the Taiwanese work with Sensor Web.
- Geosynchronisation. OGC has within the last month announced the formation of a GeoSynchronisation Service Standards Working Group. This has a lot of potential – not only for taking OpenStreetMap data out into the real world, editing it, and syncing it back later, but also, of course, for emergency management. I intend to watch this one quite closely.
One of the closing quotes of the emergency management presentation came from the Chairman of the OGC, David Schell:
“What the OGC is doing is facilitating a common picture of reality for different organizations which have different views of the reality, the disaster, the catastrophe, that they all have to deal with collectively.”
The use of OGC standards is probably the only clear path forward towards a Common Operating Picture – well, one that has anything to do with location anyway.
In the afternoon, Richard Murcott of LINZ led a workshop discussing standards and interoperability.
One of the big takeaways for me was the Levels of Conceptual Interoperability Model from simulation theory. Basically, it builds up from nothing – no interoperability at all – to a state of full interoperability, where fully specified conceptual models are used to integrate data consistently from multiple sources.
Of course, we are a long way from this in emergency management; a lot of the current interoperability we have is at level 2, which is only a common data format. The OASIS work with EDXL is taking us a step higher (level 3), increasing semantic interoperability through the use of more clearly defined standards. On this model, though, I think there is a very long way to go to reach interoperability that considers methods and procedures (level 4) and assumptions and constraints (level 5), through to a “fully specified but implementation independent model” (level 6).
Some other quick takeaway points that I and others came up with:
- There is a spectrum of reasons for interoperability – from selfish to altruistic. A selfish organisation wants to bring everyone else’s data into its own systems and processes, whereas a truly altruistic organisation wants only to publish and share information.
- There is a spectrum of management approaches to interoperability – from ad hoc/chaotic to extremely structured. Some organisations want full control over how interoperability is managed, requiring a very structured and formal approach, possibly even agreements or MOUs. At the other end is complete anarchy and chaos.
- Risk aversion is a significant barrier to interoperability, so clearly taking a risk management approach to interoperability is likely to provide a better means to manage risks, and hence make true interoperability more acceptable to management.
- Restrictive licensing of standards creates barriers to entry. Nothing new there.
- Data sets provide an excellent focal point for collaboration, and communities may well form around a released data set, e.g. the NZ Open GPS Maps project around the released LINZ roads data.
One of the final points Richard made in closing was targeted more at New Zealand in general, and it is certainly a sentiment I think we should take to heart. New Zealand, as a country, needs to behave more like a city of 4 million. Dispersed from Northland to Southland, we pack nowhere near as much punch as we would if we brought our expertise together from across the country. If we want to be more successful on the world stage, then we need to lose our small-town mentalities and start thinking bigger and broader!
One question that I’ve had about building a solution for Building Safety Evaluation (BSE) is whether it should be built into an existing council system, implemented separately on each council’s systems, or delivered as a standalone solution. Clearly there are pros and cons both ways, but I’m definitely tending towards a standalone solution – at least initially. I certainly gained some insights in the seven days I spent working within CCC’s BSE team.
There are certainly benefits to be gained from integrating a BSE system into existing council systems. These include:
- Information as it is captured goes directly into the business-as-usual systems.
- Building information is tightly linked to existing council data structures, e.g. building records, IDs, etc.
But there are problems associated with systems implemented on a per-council basis:
- It requires each and every council to build and integrate a BSE system into their existing systems – something which most don’t have the time or budget to do, especially for relatively infrequent events.
- It is harder to bring in staff from other councils to provide surge capacity for the data entry tasks (data entry is another problem I’ll get to as well) – they are more likely to be trained in whatever different system their own council uses.
- An in-house solution would be limited by existing council IT systems – don’t underestimate the issues associated with getting a large organisation’s IT systems working following a disaster of this magnitude.
I will admit to being slightly biased, but I believe a more sustainable solution is to create a free and open source software tool that can be used in a standalone manner for the first few weeks, after which council IT staff can find a means to import the information back into the council system. This would become easier if the BSE data were represented in a standard XML format. I’d like to see an OASIS Emergency Data eXchange Language (EDXL) extension created for representing BSE information.
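To make that idea concrete, here is a purely hypothetical sketch of what a BSE record could look like in XML. No such schema exists yet – the namespace, element names and placard values below are all invented for illustration, and are not part of EDXL or any council system.
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch only: not an existing EDXL schema -->
<bse:buildingSafetyEvaluation xmlns:bse="urn:example:bse:0.1">
  <bse:building>
    <bse:address>123 Example Street, Christchurch</bse:address>
    <bse:location lat="-43.5321" lon="172.6362"/>
  </bse:building>
  <bse:inspection>
    <bse:dateTime>2010-09-05T14:30:00+12:00</bse:dateTime>
    <bse:inspectorId>INSP-042</bse:inspectorId>
    <!-- e.g. GREEN (usable), YELLOW (restricted use), RED (unsafe) -->
    <bse:placard>YELLOW</bse:placard>
    <bse:comments>Restricted use – damage to chimney and east wall.</bse:comments>
  </bse:inspection>
</bse:buildingSafetyEvaluation>
Something this simple, agreed nationally, would let any council import assessments regardless of which tool captured them.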
Why do I think an open source solution would be best?
- The system will be used relatively infrequently, so it is easier to justify a consortium approach to development. This will be far cheaper than multiple councils each building their own bespoke solutions, which probably won’t be compatible with neighbouring councils’. It means multiple councils, and indeed governments worldwide, may each be able to contribute relatively small amounts to build a better system than any single organisation could build alone.
- Being free, it is also likely to be widely deployed, which means that rather than just one council’s staff being trained in their system’s use, there are likely to be an order of magnitude more people trained in its use. This greatly increases the surge capacity available for data entry.
- An open source solution is also likely to implement open standards, whereas a bespoke council system is likely to forgo the additional cost associated with implementing a recognised data interoperability standard. This means that bespoke council BSE systems will be inherently closed, and potentially incompatible with their neighbouring councils. An open source application with open standards automatically means that neighbouring councils can either share the one system, or at least use the same software on separate instances, and use the interoperability standards to allow easy aggregation of the BSE data for reporting.
- Open source would also allow the creation of what is effectively a BSE kit in a box. With a wireless hub, a handful of netbooks, etc., it would be quite easy to have a portable, redeployable and standalone kit for implementing BSE without having to depend upon any existing organisational IT infrastructure.
So, for the time being, I’ve convinced myself that a standalone open source BSE application is probably preferable to councils implementing their own systems in-house.
Always being keen to live on the bleeding edge, I’ve been using the beta software for my Garmin Colorado. It has been great for driving around and recording tracklogs for the OpenStreetMap project.
One of the downsides of using beta software, though, is that you can be exposed to bugs. I discovered that recently when attempting to open some of my recent GPX tracklogs – the software just refused to open them. After a bit of hunting, I found a relatively easy means of detecting and fixing the errors.
The tool to use is an XML validator called Xerces, produced by the Apache Software Foundation. On a Mac, I downloaded the appropriate binary package, copied the binaries in xerces/bin to /usr/local/bin, and copied the libraries from xerces/lib to /usr/local/lib. You can then run the SAXCount program, which counts the number of elements in an XML file – the side benefit we’re after is that it is good at detecting and reporting errors that many GPX applications are not capable of finding.
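For reference, the install boils down to something like this – a sketch only, assuming the binary distribution unpacked into a directory called xerces (the actual directory name will vary with the release you download):
# copy the Xerces command-line tools (including SAXCount) into the path
sudo cp xerces/bin/* /usr/local/bin/
# copy the shared libraries the tools need at runtime
sudo cp -R xerces/lib/* /usr/local/lib/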
After working through a few minor problems on the NZ GPS forums, I had Xerces up and validating GPX – including Garmin’s extensions. Note that you may get an error about trying to connect to Garmin’s server to download the schema, e.g. an error like…
Fatal Error at file , line 0, char 0 Message: An exception occurred! Type:NetAccessorException, Message:Could not open file: http://www.garmin.com/xmlschemas/GpxExtensions/v3/GpxExtensionsv3.xsd
I believe this is a combination of Garmin redirecting the original link to a new location, and SAXCount not handling the redirect very well. If you strike this problem, this post in the forums has the fix. I’d basically recommend keeping a version of the fixed Garmin header ready to cut and paste into each GPX file so that SAXCount can actually download each XSD. I’ve been using this one…
<?xml version="1.0" encoding="UTF-8" standalone="no"?><gpx xmlns="http://www.topografix.com/GPX/1/1" xmlns:gpxx="http://www.garmin.com/xmlschemas/GpxExtensions/v3" xmlns:gpxtpx="http://www.garmin.com/xmlschemas/TrackPointExtension/v1" creator="Colorado 300" version="1.1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd http://www.garmin.com/xmlschemas/GpxExtensions/v3 http://www8.garmin.com/xmlschemas/GpxExtensions/v3/GpxExtensionsv3.xsd http://www.garmin.com/xmlschemas/TrackPointExtension/v1 http://www8.garmin.com/xmlschemas/TrackPointExtensionv1.xsd">
From there it is a simple step to validate:
SAXCount -v=always -n -s -f test.gpx
All going well, you’ll get something back similar to:
test.gpx: 24 ms (7478 elems, 2498 attrs, 0 spaces, 38613 chars)
This means that everything checked out OK. Otherwise, it will report the lines that have errors, making it quick and easy to open the file in a text editor and edit or delete the corrupt elements. One trick I’ve noted: by default, a Colorado GPX file has no line breaks in it. Search for </trkpt><trkpt and replace it with </trkpt>\r<trkpt – this inserts a line break, ensuring that each trkpt element starts on a new line so SAXCount can refer to it by line number for easier identification.
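If you’d rather not do that search-and-replace by hand in a text editor, a one-liner along these lines should do the same job – a sketch only (it uses the Unix newline \n rather than the editor’s \r), so try it on a copy of your file first:
# insert a line break between consecutive track points so SAXCount
# reports meaningful line numbers (keeps a .bak backup of the original)
perl -pi.bak -e 's|</trkpt><trkpt|</trkpt>\n<trkpt|g' test.gpx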
Late last week I stumbled across the Research e-Labs blog, which comprises part of the larger eGovernment efforts. They are currently consulting on the new eGovernment feed standard, and are proposing a move to Atom 1.0 as it appears to provide better mechanisms for semantically marking up the information contained in a feed. This is an exciting proposition, especially if they expand the defined tags to support reports and data sources – on top of the already defined news, jobs, consultation and events. They have a good presentation available too.
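As an illustration of the kind of semantic markup Atom 1.0 makes possible, here is a minimal sketch of an entry carrying one of those tags as an Atom category – note the scheme URI is invented for the example and is not taken from the actual proposal:
<?xml version="1.0" encoding="UTF-8"?>
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>Example consultation item</title>
  <id>urn:uuid:00000000-0000-0000-0000-000000000000</id>
  <updated>2008-06-01T00:00:00Z</updated>
  <!-- the scheme URI is hypothetical; "consultation" is one of the defined tags -->
  <category scheme="http://example.govt.nz/feed-types" term="consultation"/>
  <summary>An entry semantically marked up with its feed type.</summary>
</entry>
A consumer can then filter or aggregate entries purely on the category element, rather than scraping titles.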
Having just been through the process of obtaining and publishing some Government data, I was interested to see Dennis McDonald covering an Australian blog post on eGovernment, ‘Making data freely available’.
Craig makes some very good points in his conclusions, and I’ve got one that I would like to add based on recent experience with the State Highway trackpoint data I sourced in NZ and uploaded to OpenStreetMap.
In the follow-up to uploading the trackpoints to OSM, a group of NZ OSM mappers have been discussing how to classify roads in New Zealand – basically, how the road hierarchy is defined. It turns out that this is a hard problem, and there is no single standardised approach to classifying roads in New Zealand. After asking for guidance, I became aware of a number of issues and captured these on the gis.org.nz wiki.
The simple fact was that there was not a nationally consistent approach to classifying the road hierarchy in New Zealand. I expect it is similar in Australia, and possibly worse with the additional layer of State government.
This highlights an additional key role that I think Government has to play in national data collection.
Government should set the standards for data collection to ensure that datasets are nationally consistent to enable simple aggregation of disparate datasets.
It doesn’t have to actually perform the aggregation, although that would be nice. It just has to ensure that standards are used to enable aggregation. Using the road hierarchy classification example from above, this means that a road of a certain class in one part of the country means exactly the same thing in another. This would probably occur through an engagement-based approach that determines a controlled vocabulary defining the types of roads.
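Purely as an illustration of what such a controlled vocabulary might look like – the codes and definitions below are invented for the example, not an actual NZ classification:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: a nationally agreed codelist for road classes -->
<roadClassification authority="example.govt.nz">
  <class code="motorway">Grade-separated, dual-carriageway road</class>
  <class code="national">State Highway forming part of the national network</class>
  <class code="arterial">Major urban or regional through-route</class>
  <class code="local">Local access road</class>
</roadClassification>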
Without this responsibility, citizens are doomed to a million-and-one datasets that cannot be easily aggregated to produce coherent and consistent national datasets.
I just heard today that the Department of Internal Affairs is consulting on an opt-in single sign-on identity verification service (IVS) that may be used by government agencies to identify us online when interacting with said agencies.
I have included my submission below for reference.
We would like to know whether you are likely to use the Internet to verify your identity with a government agency.
Yes – but it must work on any operating system and web browser. I use a variety of operating systems and web browsers including:
- Operating Systems – Apple OS X, Fedora Core Linux, and Microsoft Windows
- Browsers – Firefox and Safari
I will not be able to use the service if it is tied to the Microsoft Internet Explorer/Windows platform. I expect that all the good work the State Services Commission has been doing on standards and interoperability will be applied to the IVS as well.
We would like to hear from you regarding the type of services you might want to access that require you to verify your identity.
- Inland Revenue for management of personal/business taxes, KiwiSaver?
- Government Electronic Tender Service (GETS)
- NZ Qualifications Authority for NZQA Learner’s Record
- Local Government
We would like to know what you think of being able to verify your identity with businesses and other organisations.
I would support the service being made available to local government.
I am initially dubious about the IVS being made available to businesses until more details are available. I trust the Government to run its IT systems to a higher level of security than most businesses. I am also concerned that, if the IVS were made available to non-governmental users, uptake may well push the IVS beyond being an opt-in service – businesses can use incentives that Government cannot to strongly promote registration and use of the service.
I would, however, support allowing a limited number of business sectors to utilise the IVS – in particular those that provide online financial services, such as banks, fund managers and sharebrokers. It is preferable to have them using a national framework rather than having a token for each organisation AND government on my keyring. Note that this would present some risks – in particular the risk of a distributed denial-of-service (DDoS) attack against the IVS infrastructure. If the IVS does grow to become widely used, and includes the financial sector, then a DDoS against poorly planned IVS infrastructure may have significant negative consequences – even if just in perception of the service. Naturally, as the IVS grows in usage, it would have the potential to become nationally critical infrastructure and would need to be managed as such.
I can sympathise with Jeff’s frustration at not being able to get frames for the ISO standard paper sizes, but I think he has it wrong when he suggests that Epson needs to step up and produce US paper sizes. There is a follow-up from Scott Kelby’s blog that explains Epson’s position further.
With most of the world actively using ISO paper sizes, it would be far easier for frame producers to recognise that they should start producing frames that support the internationally recognised standards, rather than trying to maintain antiquated North American paper sizes. The current situation is just inefficient for international producers, and I’m sure they would love the US and Canada to adopt international standard paper sizes so that they don’t have to keep producing separate paper for the North American market. If North Americans want to keep using different paper from the rest of the world, then surely their internal market would pick up on the opportunity?
I wrote this article for the July 2006 International Association of Emergency Managers Bulletin.
In February 2004, I wrote an article for the IAEM Bulletin outlining some of the benefits that open source software had the potential to provide for emergency managers. At that time, little open source software existed for emergency management, and I had just come out of a simple attempt in 2003 to create a Web-based disaster management system. That effort failed, for while there was a well-recognized need for open source disaster management software, there were no real drivers to encourage development of a solution.
2004 Tsunami Spurs Development of Sahana
The driver came with the tsunami that struck Sri Lanka on Dec. 26, 2004, which prompted the development of a free and open source solution called Sahana. Within a couple of days, the need for a system to manage vast quantities of information became obvious, along with the need to attempt to coordinate 1,300 NGOs responding to hundreds of thousands of displaced people. In the following days and weeks, a Web-based system for managing disaster information was built on-the-fly based on the most pressing needs. Accordingly, the following were the first modules developed:
- People Registry – track and match victims of a disaster.
- Organization Registry – register, connect and track NGOs involved in response.
- Camp Management System – register and track camps.
- Request/Assistance Management System – record, track and match requests and offers of assistance.
Sahana development was initially led by the Lanka Software Foundation and supported by volunteers from the Sri Lankan IT industry. As the immediate need for Sahana subsided in the months following the tsunami, more international contributors became involved in the project, myself included. These ranged from programmers wanting to help out, to those who wanted to offer assistance drawing upon their disaster experiences, including emergency managers.
The positive feedback to Sahana prompted further development to add more response and recovery capabilities applicable to any disaster management situation. Longer term, the goal is to use Sahana as a means of encouraging comprehensive emergency management in communities by supporting preparation and mitigation. This will start by providing tools to incorporate plans and reference material, such as communication directories, in advance, along with other techniques to encourage greater interagency co-ordination before an event.
Sahana has been designed to operate in a diverse range of environments due to the nature of disasters. It can run on Web servers and laptops, and has even been installed on a PDA. Over time, it will support both standalone and networked modes of operation and allow communication between multiple Sahana servers, including synchronization of data.
There are a number of future modules planned or under development:
- Disaster Impact Assessment.
- Inventory/Supply Chain/Logistics.
- Volunteer Coordination.
- Response/Rescue Team Management.
In addition, there are a number of key technologies identified for inclusion:
- Mapping/GIS, and GPS integration – it can already use Google Maps.
- Provision of information via open standards:
- Common Alerting Protocol (OASIS/CAP).
- Emergency Data eXchange Language (OASIS/EDXL).
- Various OpenGIS Protocols (OpenGIS Consortium).
- Support of existing paper-based forms.
- PDA forms for remote fieldwork.
Sahana has seen official deployments in multiple events, including the Sri Lankan response to the tsunami in 2004, the 2005 earthquake in Pakistan and the 2006 mudslide in the Philippines. It has also recently seen unofficial deployment in support of the Yogyakarta earthquake and in preparation for an eruption of Mt. Merapi. Sri Lanka’s largest NGO is also deploying Sahana within their disaster unit.
In mid-May 2006, a workshop was held in New York that brought together key members of the Sahana development community and IBM. The meeting served two purposes:
- To discuss IBM support of the project, and
- To consider further development of modules for Sahana that could be used during response to a pandemic.
A pandemic presents an interesting opportunity for the deployment of Web-based disaster management systems, as most infrastructure should be operating normally (relative to a hurricane or earthquake).
The Sahana project is interested in contributions, be they time or financial. Time contributions can be made by providing design advice based upon disaster experience, writing code, testing Sahana or helping to write the documentation. Financial contributions will be used to target module development, such as sponsoring development of a specific module or supporting the core development team that works full time. An international community maintains Sahana, and all contributions are provided back to that community at no cost – a share-and-share-alike ethos to ensure that everyone benefits. Sahana is free to use and has no licensing costs associated with it.