Archive for the ‘interoperability’ tag
I had the opportunity on Monday (6 December 2010) to attend a great workshop held in Wellington. This started off with a set of three presentations by Mark Reichardt who is the President and CEO of the Open Geospatial Consortium.
Mark gave several presentations during the morning, starting with the breakfast session for Government CEOs, followed by one focused on OGC and Disaster Management, and finishing with An Overview of OGC Standards and Programs. Hopefully these will be made publicly available on portal.opengeospatial.org – I’ll put links up when the files are openly available.
I got a great buzz seeing Sahana, OpenStreetMap, CrisisCommons and Ushahidi mentioned in Mark’s slides, and these projects also got coverage in the presentation to the invited Government CEOs at the breakfast session. In the open presentation on OGC and Disaster Management in the morning, Sahana got its own slide and was recognised for incorporating various OGC standards including Web Map Service (WMS), Catalogue, KML, Web Feature Service (WFS) and the Coverage Service. It is great to see the significant geospatial efforts that the likes of Mifan, Fran, and David have put in – amongst others, and sorry I can’t recall everyone’s names!
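For anyone unfamiliar with these standards, a WMS GetMap request is just an HTTP GET with well-known parameters. Here is a minimal sketch of building one – the server URL and layer name are placeholders for illustration, not Sahana’s actual endpoint:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer, covering part of Christchurch.
url = wms_getmap_url("http://example.org/wms", "shelters",
                     (172.5, -43.6, 172.8, -43.4))
```

Any WMS-compliant client – Sahana included – can render layers from any WMS-compliant server with exactly this kind of request, which is the interoperability payoff in a nutshell.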
I also talked about my experience around geospatial data and building safety evaluations following the Canterbury earthquake, and there seemed to be some real interest in using that as a possible test bed geospatial project that we may be able to undertake here in NZ. I must get onto blogging more about that.
As always, it was a great opportunity to catch up with some colleagues that I hadn’t seen for a while, and it has been particularly reassuring to see some great minds that were thought lost with the State Services Commission restructure now turning up in Land Information New Zealand. LINZ taking on these people definitely sends a positive signal about open data and standards. In particular I want to note Richard Murcott who is now the Geospatial Standards Leader at LINZ.
Anyway – what were some of the points I took away from the workshop?
- It sounds like Land Information New Zealand is going to become a full OGC member in 2011.
- The OGC GeoSMS standard is coming soon! A discussion document from February 2010 on OGC GeoSMS is available here (link to agreement page, then pdf download). It will be great to have a GeoSMS standard to work to, as we had previously created our own for the Naval Postgraduate School Disaster Relief Experiments.
- Mark highlighted that demonstrating standards – through test bed projects and the like – is one of the most important aspects of standards promotion.
- The Taiwanese have been doing quite a bit of work with OGC, and have been doing some very interesting debris flow monitoring projects with OGC Sensor Web. There is supposed to be a good pdf available that outlines the Taiwanese work with Sensor Web.
- Geosynchronisation. OGC has within the last month announced the formation of a GeoSynchronisation Service Standards Working Group. This of course has a lot of potential – not only for taking OpenStreetMap data out into the real world, editing it, and coming back and syncing it later, but also of course for emergency management. I intend to watch this one quite closely.
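On the GeoSMS point above: absent a published standard, the ad hoc approach boils down to embedding a coordinate pair in the SMS body and parsing it out on receipt. A hypothetical sketch of that kind of parsing – the "lat,lon;text" convention here is purely illustrative, not the format we used at NPS and not the OGC GeoSMS syntax:

```python
import re

# Illustrative only: a simple "lat,lon;free text" convention.
GEO_SMS = re.compile(r"^(-?\d+(?:\.\d+)?),(-?\d+(?:\.\d+)?);(.*)$")

def parse_geo_sms(body):
    """Extract (lat, lon, message) from a geo-tagged SMS body.

    Returns None if the body doesn't match the convention or the
    coordinates are out of range.
    """
    m = GEO_SMS.match(body.strip())
    if not m:
        return None
    lat, lon = float(m.group(1)), float(m.group(2))
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        return None
    return lat, lon, m.group(3).strip()
```

The whole value of a published GeoSMS standard is that every project stops inventing (and re-parsing) its own variant of this.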
One of the closing quotes of the emergency management presentation came from the Chairman of the OGC, David Schell.
“What the OGC is doing is facilitating a common picture of reality for different organizations which have different views of the reality, the disaster, the catastrophe, that they all have to deal with collectively.”
The use of OGC standards is probably the only clear path forward towards a Common Operating Picture – well, one that has anything to do with location anyway.
In the afternoon, Richard Murcott of LINZ led a workshop discussing standards and interoperability.
One of the big takeaways for me was the Levels of Conceptual Interoperability Model from modelling and simulation theory. Basically it builds up from nothing – no interoperability at all – to a state of full interoperability, where conceptual models are used to integrate data consistently from multiple sources.
Of course, we are a long way from this in emergency management; much of the current interoperability we have is at level 2, which is only a common data format. The OASIS work with EDXL is taking us a step higher (level 3), increasing semantic interoperability through the use of more clearly defined standards. Using this model, though, I think there is a very long way to go to ensure we have interoperability that considers methods and procedures (level 4) and assumptions and constraints (level 5), through to a “fully specified but implementation independent model” (level 6).
Some other quick takeaway points that I and others came up with:
- There is a spectrum of reasons for interoperability – from selfish to altruistic. A selfish organisation wants to bring any data into its own systems and processes, whereas a truly altruistic organisation wants only to publish and share information.
- There is a spectrum of management approaches to interoperability – from ad hoc/chaotic to extremely structured. Some organisations want full control over how interoperability is managed, requiring a very structured and formal approach, even agreements or MOUs. At the other end is complete anarchy and chaos.
- Risk aversion is a significant barrier to interoperability, so clearly taking a risk management approach to interoperability is likely to provide a better means to manage risks, and hence make true interoperability more acceptable to management.
- Restrictive licensing of standards creates barriers to entry. Nothing new there.
- Data sets provide an excellent focal point for collaboration and communities may well form around a released data set. E.g. NZ Open GPS Maps project around released LINZ roads.
One of the final points Richard made in closing was more targeted at New Zealand in general, and certainly a sentiment I think we should take to heart. New Zealand, as a country, needs to behave more like a city of 4 million. Dispersed from Northland to Southland, we pack nowhere near as much punch as we would if we brought our expertise together from across the country. If we want to be more successful on the world stage, then we need to lose our small town mentalities and start thinking bigger and broader!
One question that I’ve had about building a solution for Building Safety Evaluation (BSE) is whether it should be built into an existing council system, implemented on top of existing council systems, or delivered as a standalone solution. Clearly there are pros and cons both ways, but I’m definitely tending towards a standalone solution – at least initially. I certainly gained some insights in the 7 days that I had working within CCC’s BSE team.
There are certainly benefits to be gained from integrating a BSE into existing council systems. These include:
- Information as it is captured goes directly into the business-as-usual systems.
- Building information is tightly linked to existing council data structures e.g. building records, ids etc.
But there are problems associated with systems implemented on a per-council basis:
- It requires each and every council to build and integrate a BSE system into their existing systems – something which most don’t have the time or budget to do, especially for relatively infrequent events.
- It is harder to bring in staff from other councils to provide surge capacity for the data entry tasks (data entry is another problem I’ll get to as well) – they are likely to be trained only in the different system that their own council uses.
- An in-house solution would be limited by existing council IT systems – don’t underestimate some of the issues associated with getting a large organisation’s IT systems working following a disaster of this magnitude.
I will admit to being slightly biased, but I believe a more sustainable solution is to create a free and open source software tool that can be used in a standalone manner for the first few weeks, after which council IT staff can find a means to import the information back into the council system. This would become easier if the BSE data were represented in a standard XML format. I’d like to see an OASIS Emergency Data eXchange Language (EDXL) extension created for representing BSE information.
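To make that concrete, here is a hypothetical sketch of what a BSE record might look like as XML. The namespace, element names, and placard values are invented for illustration – no such EDXL extension exists yet:

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace - no EDXL BSE extension exists yet.
NS = "urn:example:bse:0.1"

def bse_record(building_id, posting, lat, lon, inspector):
    """Build a hypothetical XML building safety evaluation record."""
    rec = ET.Element("{%s}buildingSafetyEvaluation" % NS)
    ET.SubElement(rec, "{%s}buildingID" % NS).text = building_id
    # Typical rapid-assessment placard postings: INSPECTED (green),
    # RESTRICTED USE (yellow), UNSAFE (red).
    ET.SubElement(rec, "{%s}posting" % NS).text = posting
    loc = ET.SubElement(rec, "{%s}location" % NS)
    loc.set("lat", str(lat))
    loc.set("lon", str(lon))
    ET.SubElement(rec, "{%s}inspector" % NS).text = inspector
    return rec

xml_bytes = ET.tostring(bse_record("BLD-1234", "UNSAFE",
                                   -43.53, 172.63, "J. Smith"))
```

The point is not this particular shape, but that once a shared schema exists, any council’s import tooling can be written once against it.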
Why do I think an open source solution would be best?
- The system will be used relatively infrequently, so it is easier to justify a consortium approach to development. This will be far cheaper than multiple councils each building their own bespoke solutions, which probably won’t be compatible with those of neighbouring councils. Multiple councils, and indeed governments worldwide, may be able to contribute relatively small amounts each to build a better system than any single organisation could build alone.
- Being free, it is also likely to be widely deployed, meaning that rather than just one council’s staff being trained in their system’s use, there are likely to be an order of magnitude more people trained in its use. This greatly increases the ability to have surge capacity for data entry.
- An open source solution is also likely to implement open standards, whereas a bespoke council system is likely to forgo the additional cost associated with implementing a recognised data interoperability standard. This means that bespoke council BSE systems will be inherently closed, and potentially incompatible with their neighbouring councils. An open source application with open standards automatically means that neighbouring councils can either share the one system, or at least use the same software on separate instances, and use the interoperability standards to allow easy aggregation of the BSE data for reporting.
- Open source would also allow the creation of what is effectively a BSE kit in a box. A wireless hub, a handful of netbooks etc and it would be quite easy to have a portable, redeployable and standalone kit for implementing BSE without having to depend upon any existing organisational IT infrastructure.
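On the aggregation point above: if each council instance exports records in the same format, rolling them up for regional reporting is trivial. A minimal sketch, assuming each record is a dict with a building ID, a sortable assessment timestamp, and a posting – an assumed record shape, for illustration only:

```python
def aggregate_bse(*instances):
    """Merge BSE records from several council instances, keeping the
    most recent assessment for each building ID.

    Each instance is a list of dicts with 'building_id', 'assessed_at'
    (sortable timestamp string) and 'posting' keys.
    """
    latest = {}
    for records in instances:
        for rec in records:
            key = rec["building_id"]
            if key not in latest or rec["assessed_at"] > latest[key]["assessed_at"]:
                latest[key] = rec
    return list(latest.values())

# Two hypothetical council exports; B1 was reassessed by the second council.
council_a = [{"building_id": "B1", "assessed_at": "2010-09-05", "posting": "UNSAFE"}]
council_b = [{"building_id": "B1", "assessed_at": "2010-09-08", "posting": "RESTRICTED USE"},
             {"building_id": "B2", "assessed_at": "2010-09-06", "posting": "INSPECTED"}]
merged = aggregate_bse(council_a, council_b)
```

Without a shared format, this kind of roll-up becomes a bespoke data-wrangling exercise for every pair of councils involved.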
So, for the time being, I’ve convinced myself that a standalone open source BSE application is probably preferable to councils implementing their own system in house.
I just heard today that the Department of Internal Affairs is consulting on an opt-in single sign-on identity verification service (IVS) that may be used by government agencies to identify us online when interacting with said agencies.
I have included my submission below for reference.
We would like to know whether you are likely to use the Internet to verify your identity with a government agency.
Yes – but it must work on any operating system and web browser. I use a variety of operating systems and web browsers including:
- Operating Systems – Apple OS X, Fedora Core Linux, and Microsoft Windows
- Browsers – Firefox and Safari
I will not be able to use the service if it is tied to the Microsoft Internet Explorer/Windows platform. I expect that all the good work that the State Services Commission has been doing on standards and interoperability will be applied to the IVS as well.
We would like to hear from you regarding the type of services you might want to access that require you to verify your identity.
- Inland Revenue for management of personal/business taxes, KiwiSaver?
- Government Electronic Tender Service (GETS)
- NZ Qualifications Authority for NZQA Learner’s Record
- Local Government
We would like to know what you think of being able to verify your identity with businesses and other organisations.
I would support the service being made available to local government.
I am initially dubious about the IVS being made available to businesses until such time as more details are made available. I trust the Government to run its IT systems to a higher level of security than most businesses. I am also concerned that if the IVS was made available to non-governmental users, uptake may well make the IVS more than an opt-in service – businesses may use incentives that Government cannot to strongly promote registration and use of the service.
I would, however, support allowing a limited number of business sectors to utilise the IVS – in particular those that provide online financial services, such as banks, fund managers and sharebrokers. It is preferable to have them using a national framework rather than having a token for each organisation AND government on my keyring. Note that this would present some risks – in particular the risk of a distributed-denial-of-service (DDOS) attack against the IVS infrastructure. If the IVS does grow to become widely used, and includes the financial sector, then a DDOS against poorly planned IVS infrastructure may have significant negative consequences – even if just in perception of the service. Naturally, as the IVS grows in usage, it would have the potential to become national critical infrastructure and would need to be managed as such.