Archive for the ‘collaboration’ tag
A recent discussion on the CrisisCommons email list raised an issue about the security of crowdsourced data – specifically, the ease with which information can be deleted by an anonymous malicious individual when using tools such as Etherpad or Google Docs with open editing rights.
In this case an anonymous user was deleting data as quickly as it was entered in a shared public document. Perhaps a more concerning risk is the subtle editing of crowdsourced information, where the edits are not obvious enough to be detected – such as quiet, malicious modification of facts and figures.
For tech volunteers, there is a careful balance to be struck between protecting information (in this particular case its availability and integrity) and not creating significant barriers to entry.
The first obvious solution is to restrict access to the document to authorised users. This means that only trusted individuals can contribute to the collection and management of unstructured crowdsourced information.
This is less than ideal, as new volunteers arriving immediately after an emergency haven’t yet developed a trust relationship with, for example, the CrisisCommons community, and so cannot contribute straight away.
I believe that with the simple use of a two-tier approach, one can easily protect the quality of the final document(s), whilst still making it easy for new volunteers to contribute.
You effectively create two types of document:
- Public and open documents – which are open to all to edit, and are effectively a rough scratchpad for collecting unstructured information.
- Trusted documents – which are open for only a limited pool of trusted users to edit, but draw from the content provided in the public and open documents.
The trusted editors effectively become the curators of the information, and once content has been copied and edited from the open documents, malicious anonymous users can no longer waste other volunteers’ time through deletion or editing.
There are other process benefits to this approach. For example, you may create a public document for particular topics of the emergency – such as infrastructure, health/medical and background information (e.g. weather forecasts, population demographics etc.) – and these multiple individual documents may map to a single section within the trusted document to produce an edited and trusted version of the crowdsourced information.
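To make the two-tier idea concrete, here is a minimal sketch of the model: open "scratchpad" documents that anyone can write to, feeding a curated section of a trusted document that only named curators may update. All class and variable names here are illustrative, not drawn from Etherpad, Google Docs or any real tool.

```python
class ScratchpadDoc:
    """Public, open document: anyone may append lines (or delete them)."""
    def __init__(self, topic):
        self.topic = topic
        self.lines = []

    def add(self, text):
        self.lines.append(text)


class TrustedDoc:
    """Curated document: only a limited pool of trusted users may write."""
    def __init__(self, curators):
        self.curators = set(curators)
        self.sections = {}  # section name -> list of vetted lines

    def curate(self, curator, section, scratchpads):
        if curator not in self.curators:
            raise PermissionError(f"{curator} is not a trusted curator")
        vetted = self.sections.setdefault(section, [])
        for pad in scratchpads:
            # Content is copied across; anonymous edits or deletions in the
            # scratchpad after this point cannot affect the trusted copy.
            vetted.extend(pad.lines)


# Usage: two topic scratchpads map to one section of the trusted document.
infra = ScratchpadDoc("infrastructure")
infra.add("Bridge on SH1 reported closed")
health = ScratchpadDoc("health")
health.add("Field hospital set up at showgrounds")

trusted = TrustedDoc(curators=["alice"])
trusted.curate("alice", "situation", [infra, health])
```

The design choice worth noting is that curation is a copy, not a reference: once content is pulled into the trusted document, the open scratchpad can be vandalised without affecting the curated record.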
Still, from an operational perspective, this is a far from ideal approach, and there are certainly more robust approaches available to turn this into a process that can be used for intelligence gathering and situation reporting.
I’ve been mostly quiet on the geospatial front recently as I’ve been busy with work, but there are a couple of things worth commenting on.
- Environment Canterbury GIS Beta blog. Not the first to the council GIS blogging game, but certainly early, is my local regional council in Canterbury. They’ve started a blog to engage with the community, and built their first mashup. This is a great step, and I’m excited to see more public GIS practitioners provide a means of direct communication and engagement with the communities they represent. Very exciting.
- A reminder that whilst we’ve still got a long way to go, things could actually be worse.
Getting back to mashups, I believe they are a simple and cost-effective means of meeting simple geospatial needs – ECan’s swimming water quality mashup is an example of that: an easy way to visually check the quality of swimming places around Canterbury.
However, the real benefits will come when that information is freed from the encumbrance of being tied to an Internet connection. Many of the recreational activities we undertake are nowhere near Internet connectivity, and mobile data access is still too expensive to justify checking these sorts of data sources over summer, away from your computer and Internet connection.
As a very keen recreational user of handheld GPS for geocaching and other activities, I am ever hopeful that councils such as Environment Canterbury will start to consider making some of their underlying data available for download, so that it can be bundled into offline maps for in-car, handheld and mobile phone GPS devices.
This will complete the full loop – councils create our communities (geospatially at the very minimum), they record geospatial data about our communities, and with mashups – we are finally starting to get some of this geospatial data back.
I look forward to the day when councils are extremely open with their geospatial data and make it available so that we can put it in our choice of portable mapping device.
Back in July, I posted about dc.gov releasing some data. I was a bit slow replying to a comment made by Nat Torkington then, and felt that a reply actually required a new post to elaborate further on why I’m so supportive of governments – be they local or national – releasing data that has been paid for by the rate/tax-payer. Nat said:
“Isn’t it the case that the USA doesn’t have an authoritative roading database, either? That’s why Navteq, TeleAtlas, and Google have to drive the roads.”
Whilst the US doesn’t have an authoritative roading database either, the release of the TIGER line shapefiles has spurred the development of free and open maps – e.g. the inclusion of Tiger data in OpenStreetMap, and the production of free and open maps for GPS units. This mirrors what has occurred in New Zealand with the likes of the NZ Open GPS Maps project, utilising the free information made available from Land Information NZ.
However, this leaves us with two broad types of maps both with their problems – commercial datasets with restrictive usage conditions and free datasets maintained by volunteers that may not be sustainable in the long term. In New Zealand, the commercial dataset providers are primarily Terralink, Critchlow’s and Eagle Technology, with some more affordable sets made available by Kim Ollivier. The free maps are primarily catered for by the New Zealand OpenStreetMap project and the NZ Open GPS Maps project.
My problem is that there is a lot of inefficiency in the current way that mapping data is managed in New Zealand (and this probably applies internationally). Why do we have four+ commercial sources for roading data and two volunteer driven projects all duplicating each other, as well as Government agencies that have legislative responsibilities for roading infrastructure?
Well, it is because LINZ is not currently funded to provide a centralised repository for all this information – they are too busy focusing on the cadastral database where they make their money. Instead we are producing inefficient silos of information that are all subtly different. I have been prodding a few people to try to get the NZ OpenStreetMap and Open GPS Maps projects to consolidate their underlying database to OSM, and I believe this will occur over the long term, but there are a number of issues to work through before it happens.
As Nat indicated in the original post – in the US, Navteq, TeleAtlas and Google drive the roads there, and we’ve got at least Terralink, Google and probably others driving the roads here. In addition we have active volunteers also driving roads and correcting errors in OpenStreetMap and the Open GPS Maps project – I personally provide GPS tracklogs to OSM, and have also placed the 2007/8 High Speed Data Survey in there. The interesting part is that all of these groups are separately correcting errors that originate in the same LINZ roading dataset. So, because the New Zealand Government has not funded LINZ to maintain the roading dataset, make it widely available under permissive licensing terms, and allow feedback and corrections to be suggested for review and possible inclusion, we now have a massively inefficient approach to mapping roads in New Zealand.
All of these projects have sprung up because LINZ is not funded to provide the correct road dataset in the first place.
We can’t support that level of duplication in a small country like New Zealand, where only corporates, local authorities, and central government agencies can afford the commercial roading datasets. I know at least one of the commercial datasets costs over $100,000 to license. This means that small and medium-sized businesses are left out in the cold, unable to use geospatial information to improve the way they do business because it is too expensive, and rate/tax-payers do not have affordable access to the information for tourism, recreational and safety purposes.
As the Immediate Past President of the NZ Recreational GPS Society, I’ve seen people balking in our forums at having to pay extra for decent road or topographical maps. Some of these are expensive because the GPS map vendor has needed to license the underlying data from a commercial provider. In addition to the cost, vendors also have to implement measures to stop the reverse-engineering and redistribution of this licensed data. However, like most forms of Digital Rights Management (some may say Restrictions), the technical mechanisms cause their own problems. I’ve just been helping one person who has been suffering through Garmin’s Map Unlock process, which is poorly communicated to customers and provides nothing but roadblocks when setting up the maps on the user’s computer and GPS. And even when he does (hopefully) get the maps unlocked, he will only be able to install them on one GPS!
As a comparison, I am not able to download and install a copy of the Yellow Pages on my iPhone so that I can use it in a disconnected manner, but I can download the free and open Zenbu iPhone application that bundles all the data – so if for whatever reason I am out of mobile coverage, I can still use this data as it is stored locally on the device. I don’t believe that commercial directory services would be very comfortable about releasing their datasets to be installed on mobile devices, as they would risk the loss of the database in which the perceived value of their business resides. So having data released under permissive licenses is also essential for new applications such as storing massive geospatial resources in our pockets.
That said, I’m no longer in favour of the Government attempting to build a single massive dataset, as I think Government has proven that it cannot build these IT things effectively: there is too much management by committee, and the commercial vendors that provide the infrastructure are just looking for a jackpot if they win the tender (e.g. tender prices of $9-48 million for the failed National Address Register (NAR) project). I don’t see the need for the Government to build what is effectively their own OpenStreetMap infrastructure when we can just use something like OSM. Honestly, NZ Govt should just approach OpenStreetMap and look at an arrangement where Government can publish geospatial datasets into OSM with the ability to set some layers (such as electoral and property boundaries, which shouldn’t be editable) as read-only, and the rest as editable – e.g. roads and walking tracks that can be maintained by everyone. If the publisher of a layer doesn’t want the original layer edited, then in some circumstances editable child layers should be allowed – e.g. so I can add a new walking track to a layer that hasn’t yet been updated to reflect it, and the owner of the original dataset can then look at whether they want to accept the change back into their layer.
Commercial geospatial datasets put nothing but roadblocks in the way for new and creative uses of geospatial data. I have no problem with commercial datasets providing value-add to the data, but the fundamental data such as roads and the like should be made as open and accessible as possible to encourage adoption and standardisation upon that dataset – this will also consolidate feedback and error correction. If I find an error now, I can’t report it to LINZ – they won’t listen. What benefit do I have in reporting a roading error to a commercial provider? Indeed the only benefit I get is if I report the error to a free and open project.
Adoption and standardisation of fundamental datasets are important to ensure consistency between map sets. Right now on my GPS I have two map sets that both provide roads, and you don’t have to look far to find discrepancies between the two – but guess what, they are both derived from the LINZ road centrelines.
If left to commercial providers, geospatial data will be left as an expensive tool that only large organisations can afford.
The sooner governments in general recognise this, start funding the publishing and maintenance of fundamental datasets, the sooner we will see a real renaissance in how spatial information is used by the average organisation and individual. That is why I am so supportive of dc.gov releasing all their data.
Clearly following on from the antipodes, the UK Government is now holding a mashup competition as well. But it appears that they haven’t quite gone as far as we have in New Zealand – at least not in terms of trying to remove as many restrictions on the use of Government data.
Having a look at the data page, where the Government has published data, it is interesting to note that citizens are required to enter click-use licensing agreements, and commit to using APIs to access government data. You can see some of the frustration on the page listing the data downloads.
Why are they such a problem? Well, the data can only be used for non-commercial purposes – that is going to hinder development somewhat, with no potential return at the end of the work. The licensing agreements appear to allow the government agency to pull the plug at any point if they don’t like what you are doing – I wonder what would happen if a mashup showed an inconvenient truth for a government; would that be reason to get a government agency to pull the data plug? Another joy of accessing data through APIs is that they are limiting queries on their servers to a few thousand per day (in the case of the Ordnance Survey).
Which is why Government data really should be properly freed. It needs to be released to obtain a life of its own under a suitable Creative Commons license. Let citizens download the data, mash it up on their systems, and deal with all the issues. Governments must not become the gatekeepers of this information through APIs and license agreements.
I’ve been meaning to blog about this sooner, but have been pretty busy with work. A chance email on a NZ GIS list that I belong to two weeks ago inspired me to go out on a limb and see if I could get some Government data. I saw a post from someone within Transit (soon to be merged into the New Zealand Transport Agency) referring to working with 2.2 million trackpoints from a roading survey. I started a private email discussion, and after a couple of exchanges, I soon had 2.2 million trackpoints from the 2008 High Speed Data Collection survey of the New Zealand State Highway network.
My intention of obtaining this data was to be able to convert it to GPX files and upload it as a raw data survey layer to OpenStreetMap (OSM) so that it could be used as the basis for mapping New Zealand’s State Highway network in OSM.
I had some help from John McCombs from Integrated Mapping in Christchurch who very kindly reprojected all the points to WGS84. I then spent 4 evenings last week converting to GPX and uploading the files to OSM.
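The conversion step can be sketched roughly like this: given trackpoints already reprojected to WGS84, write them out as GPX track points. This is not the actual script I used – just an illustrative standard-library sketch, with made-up sample coordinates.

```python
import xml.etree.ElementTree as ET


def points_to_gpx(points):
    """Build a GPX 1.1 document from an iterable of (lat, lon) tuples.

    Assumes the coordinates are already in WGS84, as GPX requires.
    """
    gpx = ET.Element("gpx", version="1.1", creator="trackpoint-sketch",
                     xmlns="http://www.topografix.com/GPX/1/1")
    trk = ET.SubElement(gpx, "trk")
    seg = ET.SubElement(trk, "trkseg")
    for lat, lon in points:
        # Each survey point becomes one <trkpt> with lat/lon attributes.
        ET.SubElement(seg, "trkpt", lat=f"{lat:.6f}", lon=f"{lon:.6f}")
    return ET.tostring(gpx, encoding="unicode")


# Two illustrative points near Christchurch (not real survey data).
survey = [(-43.531000, 172.637000), (-43.532000, 172.639000)]
gpx_text = points_to_gpx(survey)
```

In practice the 2.2 million points would be chunked into many GPX files rather than one, both to keep file sizes manageable and to suit OSM’s upload limits.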
Was this data essential to mapping the highways in OSM? No. But it was a great experiment to see if a New Zealand Government agency was willing to release data under acceptable terms and conditions – this dataset is licensed under the Creative Commons v3 Attribution-ShareAlike license – and effectively turn the raw data over for public consumption. Naturally, this doesn’t contain all of the detailed geometry that is collected during the survey, so not all of the data was made available, but we got the most important parts – latitude and longitude, and a lot of them!
For more information, see the following links.
One of the key points I was trying to make was that citizens are actually interested in accessing government data such as this, and that agencies should take a more proactive approach to releasing data to the world. After all, data is global these days – put it on the Internet and anyone can access it.
Over at the In Development blog at eGovt, they are asking for comments on the state of progress for eGovernment in New Zealand. I’ve copied my comment here.
Location, Location, Location!
A quick skim and keyword search indicates that the report has almost entirely neglected one of the most important aspects of information, and that is location – its spatial component. It is not surprising then that the Government is struggling with how to handle spatial information. Some of the most important aspects of any community are distance and connectivity. However, with Government doing a poor job of making spatial information available to the citizen, it is no wonder that individuals are struggling to find out about consents from two properties over.
Even more of a shame is the fact that Government has failed to deliver on a project to produce an authoritative National Address Register of addresses, roads and placenames. This dataset is one of the most fundamental to being able to place information in space and truly understand its context. eGovernment will never take off until the core location issues are worked out. Whilst the recent release of significant amounts of Statistics New Zealand information is a great step, the true benefits and insights contained in the released census data are being held back by the lack of an authoritative National Address Register.
Organisational Inertia and Champion Individuals
Government organisations themselves are often hesitant to change or try something new. This is not to say it doesn’t happen, and there have been some excellent examples recently of organisations stepping up and trying something new – such as using wikis for consultation and engagement. I am fairly certain that most of these projects will have been championed internally by a small group of individuals, and I’ll bet they had quite a struggle to see their project through. Here’s hoping that these recent projects are the catalyst for more risk taking, and acceptance of exciting IT projects within Government.
Not enough Champions
As a consultant that has worked a lot within Government agencies, many of the information projects I’ve seen successfully delivered have been almost solely on the back of one or two key individuals. The agencies themselves have often been more of a hindrance to delivery of exciting new initiatives, and the Champions have had to fight an uphill battle against their own organisations, and other Government agencies, when ‘engaging stakeholders’. Even worse, a number of these Champions run into so many brick walls in their organisation, and within Government, that they end up choosing to leave in frustration, or worse, are forced out by unenlightened superiors. In some circumstances they make their way to an organisation that is supportive, but that fate does not appear to await all Champions. You need some way to support and encourage these key individuals within Government agencies – there may only be one or two, but they are your key instrument of change from within.
Start Small, Release Early, Upgrade Often
Despite the fact that the National Address Register is such a necessary component of eGovernment, I don’t believe that the project as tendered should have gone ahead – not for something where tender prices ranged between $9 and $48 million. The project would have been so large and complex that it would likely have not been delivered on time, would have been over budget, and would have lacked the desired capabilities.
I think Government has to learn to try the small and simple things first and work up.
Surely, for the price of one or two Geospatial Professionals and supporting hardware, the New Zealand Government should be capable of aggregating all roading information from the 74 Road Control Authorities and Transit New Zealand, and publishing it as a single national roading dataset under permissive licensing. This should be achievable now for a couple of hundred thousand dollars. Just produce a dataset and get it out there. Get feedback about how it is used, and keep improving the process and the data. If you don’t, then the New Zealand Government is going to look pretty ridiculous when volunteers have created their own national roading dataset using OpenStreetMap because Government wasn’t capable.
Here are some simple actions that Government can undertake to encourage more exciting use of Government information.
Find data. Release it under permissive licensing. Release it in formats that make it readily accessible to manipulation in software (e.g. don’t release maps only as PDFs; make the underlying spatial data available). Announce it through a simple clearing house – nothing flash, it only needs to be a WordPress blog pointing to the relevant source. It won’t be until this happens that the more exciting concepts, such as entirely unexpected but useful mashups, occur.
And that is when things will get really interesting. At that point, we will have citizens building mashups and services about ‘Our Place In Space’ – and they won’t be constrained by the organisational inertia inherent in most Government agencies that are tied back by accountability and liability that makes them hesitant to take bold steps.
Remember eGovernment is not just about Government developing systems and solutions. It does include citizens, communities and organisations building systems to meet their own needs.
I wanted to share some extremely disappointing news that I received today. The National Address Register project has been terminated.
This project had the intention of providing a single national authoritative dataset for roads, addresses and placename information. The potential of this project was to deliver a free dataset that all organisations and individuals in New Zealand were free to use. This would have made a fantastic resource, and had the potential to consolidate a number of mapping projects, and could have greatly simplified the work associated with project such as the NZ Open GPS Maps project, as the NAR would have provided a single national focal point for feedback and correction of road and address information.
The cynic in me says that the reason this project failed was because of the commercial interests in existing roading datasets. Currently there are multiple roading datasets from different providers, and they are making very good money from these. Some roading datasets sell for six figure sums on an annual basis. Naturally, very few organisations can afford these prices, so only large Government agencies tend to be able to purchase them. Suffice to say, these datasets are different, and there is not a single authoritative dataset amongst them.
The NAR had the potential to create a single, free and authoritative road, address and placename dataset. Tenders were invited for the project, and there was going to be only one organisation to win the tender. As a result, all but one of the current commercial providers stood to lose their revenue streams from their roading datasets. As you will see in the notification below, the tenders were too expensive. I believe that it was in all the commercial vendors’ interests to put in high tender prices to ensure that the NAR did not go ahead, and that they could protect their existing revenue streams rather than risk missing the tender and losing it all.
The upshot of this is that my faith in the Government to provide geospatial information to its citizens is now close to zero. If they are not capable of producing a single authoritative roading dataset (arguably one of the most important sets of spatial information as it defines most of our physical connectivity) then there is little hope of them being able to deliver any useful spatial information to citizens.
As the NZ Open GPS Maps, Zenbu and NZ Open Street Map projects have shown us, a volunteer community can develop products faster and cheaper than commercial or government organisations, and over time they will have better quality as well.
I believe the time has come for us to build more volunteer communities to provide spatial information that our Government is failing to provide to us. No longer can we wait upon them, rather we must build it ourselves. There are four key areas that we need to focus on.
1. Raw data collection – taking our GPS units out into the real world and collecting and sharing data. Collecting track logs and uploading these to the OpenStreetMap or NZ Open GPS Maps projects. Providing waypoints to OSM and Zenbu. Please – if you haven’t already, consider donating some time and information to these projects so that they have raw data to work with. This ‘field survey’ work is essential to creating our own spatial information resources. (I would particularly encourage geocachers to contribute their tracklogs if at all possible as we tend to travel a little more than others)
2. Mapping – converting the data collected in the field to information. Creating vectors for road lines, adding street name, directions, speeds. Using your local knowledge to map the community around you.
3. End products – converting the spatial information into a form suitable for others to use, for example the NZ Open GPS Maps project producing Garmin map files that can be loaded into GPS units.
4. Distribution – due to the large quantities of information involved, we may need to look at creating an ad-hoc network of individuals and websites to share the vast quantities of information about our country via torrents or similar P2P mechanism.
I believe the time has come for all those that want better access to spatial information to go out there and be a part of collecting, and building it. We can’t wait for Government to build it for us, so we will have to do it ourselves.
Let’s get started.
SUBJECT: NAR PROJECT TERMINATED

The National Address Register (NAR) project is a cross-government initiative set up to develop infrastructure to improve the provision of address, road and place name information for government agencies, businesses and the wider community.
The project is over-seen by a Steering Committee comprised of representatives of key stakeholders from central government, local government and emergency services agencies.
An integral part of developing a business model and business case for the National Address Register (NAR) was to assess whether there was a supplier able to provide the relevant services and to identify the likely costs. An RFP process was chosen as the most effective way of identifying both of these.
Following assessment of the tender proposals, the Steering Committee has decided to terminate the project. Despite the project showing considerable potential to reduce duplication across government and reduce costs, it is too expensive to proceed with in its current form.
Further investigation into the need for, and the most cost-effective way of providing address, road and place information, will be led by the New Zealand Geospatial Office, within their mandate under the NZ Geospatial Strategy. This work will include determining the optimal role for the Crown, local government and the private sector. Brendon Whiteman, Director New Zealand Geospatial Office - email@example.com; will be happy to answer any queries that you may have in relation to these activities in the context of the overall work programme of the Geospatial Office.
It is expected that agencies will continue with existing arrangements they have for the purchase of this location data, from the commercial sector.
Nancy McBeth of the State Services Commission, is preparing a Lessons Learnt report on the NAR project. If you have some views that you would like considered in that report, please contact her at firstname.lastname@example.org by 31 May 2008.
On his return from Annual Leave next week, Laurence Millar, Chair of the NAR Steering Committee will formally write to your Chief Executive to advise of the decision.
Operational Owner NAR project
Received from a public email list I subscribe to.
This award recognises an outstanding project to develop open source software. The project must be either based in New Zealand or have substantial contribution from New Zealanders living here or overseas.
For those that aren’t aware, the Open GPS Maps Project started a number of years ago out of a desire to produce freely available maps for Garmin GPS receivers, and it has grown to be a fairly significant project supported by a wide community of users and ‘regional mappers’. Graeme Williams has done a fantastic job in growing and maintaining this project, and it has been very well received by members of the NZ Recreational GPS Society of which I’m the current President. Many of our members are active contributors to the project. Back in the early days of the project, I produced the original ocean to give the map a more realistic look.
This article was originally written for the July 2006 International Association of Emergency Managers Bulletin.
The rapid growth of the Internet and World Wide Web has spawned the creation of new and potentially useful software applications that may provide benefits to emergency managers. One of these applications that is currently drawing attention is the wiki.
Wiki is the Hawaiian word meaning to hurry, hasten; quick, fast, swift. Wiki software therefore refers to packages that are designed to make it quick and easy to create and modify collaborative web pages on the Internet. Wikis have become more powerful than simple web-content creation tools – they now power some very content-rich websites, including Wikipedia, the open encyclopaedia.
What are some of the key characteristics of a wiki?
- server-based software
- free, with few licensing restrictions
- accessible from any web browser
- can be run on a standalone laptop
- uses hyperlinks to reference other pages in the database
- designed for collaboration and sharing
- records all revisions of documents and tracks changes made by users
- immediately highlights recent page changes and by whom
What opportunities exist for wikis in the emergency management domain?
Wiki software has much potential to be used as a collaborative planning tool – whether planning occurs within or between organisations. Rather than passing a word processing document around via email to all participants in the planning process, the plan could instead be created and maintained using a wiki. A secured web site would provide an excellent home where plan developers could log in to check the latest changes and make modifications. The one key benefit over using a document-based approach is that everyone is always guaranteed to be reading and editing the latest version of the plan.
As certain milestones are reached in plan development, it is possible to lock the wiki, and create a ‘snapshot’ of the current plan before continuing the review and development process. Conceptually, this model of development is quite similar to techniques used for managing the development of computer software – with developers sharing a central repository.
In addition to planning, a wiki can also be used as a knowledgebase to store information and references to other documents. For example, certain pages in a wiki could be ‘tagged’ with a pandemic tag. Then, by viewing the pandemic category, it will show all pages that are tagged with pandemic. This provides quick and easy access to relevant information.
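The tag-and-category mechanism described above is simple enough to sketch in a few lines: pages carry tags, and viewing a category is just filtering the page list by a tag. The page titles and tag names below are hypothetical examples, not from any particular wiki engine.

```python
# Each page maps to the set of tags applied to it.
pages = {
    "Pandemic Response Plan": {"pandemic", "planning"},
    "Flood Evacuation Routes": {"flood", "planning"},
    "Antiviral Stockpile Contacts": {"pandemic", "logistics"},
}


def category(tag, pages):
    """Return, sorted, the titles of all pages carrying the given tag."""
    return sorted(title for title, tags in pages.items() if tag in tags)


# Viewing the 'pandemic' category lists both pandemic-tagged pages,
# regardless of what other tags they also carry.
pandemic_pages = category("pandemic", pages)
```

The point of the mechanism is that a page can belong to many categories at once, so the same emergency-management content can be reached via hazard, function, or any other tagging scheme the team adopts.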
The benefits of wikis do not end when response starts. Conceptually, wikis can be installed on laptops or PDAs, giving responders an entire knowledgebase in the field, complete with all the links and available plans.
Wikis could be used on a set of wireless laptops to assist your incident management system of choice. For example, the response plan developed in the EOC could be created in a wiki, with planning/intel, operations, logistics, finance and information collaborating on a single document, each section able to view the others.
Wikis are also starting to be used in response and recovery by people who have access to power and communications. Probably the best recent example is the Katrina Help Info wiki, used to consolidate response and recovery information following a disaster – in effect creating a portal for the event, with links to other agencies’ websites. In this manner, a wiki could be used as a public information system where key infrastructure is available.
Another example is the Hurricane Katrina web page on Wikipedia which started as a collaborative effort to record open source situation information. In the case of the Flu Wiki, wikis are even being used to develop a community knowledgebase about a hazard before the event.
It is important to note at this point that public wikis with permissive access controls can suffer problems with the quality and authenticity of the information provided. Restricting editing rights to approved and trained personnel helps ensure that the quality of information in the wiki is not compromised.
The next likely development is going to be the consolidation of wikis and community mapping projects such as the Hurricane Information Maps that were developed following Hurricane Katrina and utilise Google Maps. The combination of information contained in a wiki linked to spatial references and presented on a map will provide a very powerful information resource for response and recovery.
I wrote this article for the July 2006 International Association of Emergency Managers Bulletin.
In February 2004, I wrote an article for the IAEM Bulletin outlining some of the benefits that open source software had the potential to provide for emergency managers. At that time, little open source software existed for emergency management, and I had just come out of a simple attempt in 2003 to create a Web-based disaster management system. That effort failed, for while there was a well-recognized need for open source disaster management software, there were no real drivers to encourage development of a solution.
2004 Tsunami Spurs Development of Sahana
The driver came with the tsunami that struck Sri Lanka on Dec. 26, 2004, which prompted the development of a free and open source solution called Sahana. Within a couple of days, the need for a system to manage vast quantities of information became obvious, along with the need to attempt to coordinate 1,300 NGOs responding to hundreds of thousands of displaced people. In the following days and weeks, a Web-based system for managing disaster information was built on-the-fly based on the most pressing needs. Accordingly, the following were the first modules developed:
- People Registry – track and match victims of a disaster.
- Organization Registry – register, connect and track NGOs involved in response.
- Camp Management System – register and track camps.
- Request/Assistance Management System – record, track and match requests and offers of assistance.
Sahana development was initially led by the Lanka Software Foundation and supported by volunteers from the Sri Lankan IT industry. As the immediate need for Sahana subsided in the months following the tsunami, more international contributors became involved in the project, myself included. These ranged from programmers wanting to help out, to those who wanted to offer assistance drawing upon their disaster experiences, including emergency managers.

The positive feedback to Sahana prompted further development to add more response and recovery capabilities applicable to any disaster management situation. Longer term, the goal is to use Sahana as a means of encouraging comprehensive emergency management in communities by supporting preparation and mitigation. This will start by providing tools to incorporate plans and reference material, such as communication directories, in advance – along with other techniques to encourage greater interagency co-ordination before an event.
Sahana has been designed to operate in a diverse range of environments due to the nature of disasters. It can run on Web servers and laptops, and has even been installed on a PDA. Over time, it will support both standalone and networked modes of operation and allow communication between multiple Sahana servers, including synchronization of data.

There are a number of future modules planned or under development:
- Disaster Impact Assessment.
- Inventory/Supply Chain/Logistics.
- Volunteer Coordination.
- Response/Rescue Team Management.
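Synchronization between multiple Sahana servers is essentially a record-merging problem. The sketch below is purely illustrative – it is not Sahana's actual synchronization code – and uses a simple last-write-wins rule keyed on a timestamp; a real implementation would also need to handle conflicts and deletions.

```python
# Hypothetical sketch of merging records from two disaster-management
# servers. Each store maps record_id -> (timestamp, data); the newer
# timestamp wins when both servers hold the same record.
def merge(local: dict, remote: dict) -> dict:
    """Merge two {record_id: (timestamp, data)} stores, newest wins."""
    merged = dict(local)
    for rec_id, (ts, data) in remote.items():
        if rec_id not in merged or ts > merged[rec_id][0]:
            merged[rec_id] = (ts, data)
    return merged


server_a = {"person-1": (100, "Jane Doe, Camp 3")}
server_b = {"person-1": (120, "Jane Doe, Camp 5"),  # newer update wins
            "person-2": (90, "John Smith, Camp 1")}
combined = merge(server_a, server_b)
```

Running `merge` in both directions leaves each server with the same combined view – the essence of the synchronized, multi-server mode of operation described above.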
In addition, there are a number of key technologies identified for inclusion:
- Mapping/GIS, and GPS integration – it can already use Google Maps.
- Provision of information via open standards:
- Common Alerting Protocol (OASIS/CAP).
- Emergency Data Exchange Language (OASIS/EDXL).
- Various OpenGIS protocols (Open Geospatial Consortium).
- Support of existing paper-based forms.
- PDA forms for remote fieldwork.
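To give a flavour of what CAP support involves, here is a minimal sketch of composing a CAP-style alert. The element names follow the OASIS CAP 1.1 schema, but the function, its parameters and the sample values are my own invention for illustration – this is not Sahana code and the output is not schema-validated.

```python
import xml.etree.ElementTree as ET

# CAP 1.1 XML namespace, as defined by OASIS.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.1"


def build_alert(identifier, sender, sent, event, severity):
    """Compose a minimal CAP-style <alert> element and return it as XML text."""
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    # Required header elements of a CAP alert message.
    for tag, value in [("identifier", identifier), ("sender", sender),
                       ("sent", sent), ("status", "Actual"),
                       ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = value
    # A single <info> block describing the event itself.
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, value in [("category", "Geo"), ("event", event),
                       ("urgency", "Immediate"), ("severity", severity),
                       ("certainty", "Observed")]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = value
    return ET.tostring(alert, encoding="unicode")


xml_text = build_alert("LK-2004-001", "sahana@example.org",
                       "2004-12-26T09:00:00+06:00", "Tsunami", "Extreme")
```

Because the message is plain XML against a published schema, any CAP-aware system can consume it – which is precisely the appeal of building on open standards.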
Sahana has seen official deployments in multiple events, including the Sri Lankan response to the tsunami in 2004, the 2005 earthquake in Pakistan and the 2006 mudslide in the Philippines. It has also recently seen unofficial deployment in support of the Yogyakarta earthquake and in preparation for an eruption of Mt. Merapi. Sri Lanka’s largest NGO is also deploying Sahana within its disaster unit.
In mid-May 2006, a workshop was held in New York that brought together key members of the Sahana development community and IBM. The meeting served two purposes:
- To discuss IBM support of the project, and
- To consider further development of modules for Sahana that could be used during response to a pandemic.
A pandemic presents an interesting opportunity for the deployment of Web-based disaster management systems, as most infrastructure should be operating normally (relative to a hurricane or earthquake).

The Sahana project is interested in contributions, be they time or financial. Time contributions can be made by providing design advice based upon disaster experience, writing code, testing Sahana or helping to write the documentation. Financial contributions will be used to target module development, such as sponsoring development of a specific module or supporting the core development team that works full time.

An international community maintains Sahana, and all contributions are provided back to that community at no cost – a share-and-share-alike ethos that ensures everyone benefits. Sahana is free to use and has no licensing costs associated with it.