Archive for the ‘eGovernment’ tag
Decisions on procurement can determine how government delivers its functions and services. The government’s procurement reform agenda will drive cost savings, releasing funds for use in other priority areas.
Substantial cost savings will be delivered with the establishment of Centres of Expertise within lead agencies to negotiate all-of-government contracts in common-spend areas.
One area that needs significant improvement is the documentation associated with tendering. Many Government tenders require a fair amount of supporting information, such as demonstrations of previous projects and profiles of individuals. This has to be replicated in some form for each and every tender. It would be very useful if part of this reform process involved the creation of a website where businesses could load both business and personnel profiles once, instead of having to include these with each and every tender. In fact, if Government were really serious about lowering the business red tape associated with tenders, it would create a website through which the whole tender process is managed.
It should allow:
- Businesses to register their profiles, capabilities, personnel and past project experience
- Agencies to post comments on past projects
- Agencies to create a template for a tender that all businesses use as the basis for building up their submissions
- Businesses to create their drafts and make their submissions electronically – no more posting/couriering of three paper copies!
- Agencies to review each tender, and privately comment on and rate it, through the same website
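To make the register-once idea concrete, the underlying data model could be as simple as the following sketch. All class and field names here are hypothetical, invented purely for illustration – not an actual design:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PersonProfile:
    name: str
    role: str
    cv_summary: str = ""

@dataclass
class BusinessProfile:
    """Registered once, reused across every tender submission."""
    business_name: str
    capabilities: List[str]
    personnel: List[PersonProfile] = field(default_factory=list)
    past_projects: List[str] = field(default_factory=list)

@dataclass
class TenderSubmission:
    """Links a stored business profile to one tender's template answers."""
    tender_id: str
    business: BusinessProfile
    responses: Dict[str, str] = field(default_factory=dict)  # template section -> answer
    agency_rating: int = 0  # private rating entered by the agency after review
```

The point is that the profile, personnel and past-project information lives in one place, and each submission merely references it.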
Three months ago I blogged about the Conficker worm and its relevance for emergency managers. Since then, I’ve heard rumours that a number of health agencies were still having problems with their email systems. The reason I raise this again is that now, with a large national response to a potential pandemic taking place, one hopes that Conficker has been well and truly removed from all Health systems (both Ministry and DHB).
If Conficker is still impacting on health agency IT systems during this period of increased activity, then honestly, heads need to roll at MOH.
I’ve only recently started following the NZ Health WebEOC blog, but it is exciting to see this sort of information sharing taking place. Congratulations to Charles and the team for the work involved. Today I found in their feed an article about the Ministry of Health suffering from the Conficker worm outbreak over the past few days. There is more info here from Computerworld.
First, what is Conficker? From Wikipedia:
Conficker disables a number of system services such as Windows Automatic Update, Windows Security Center, Windows Defender and Windows Error Reporting. It then connects to a server, where it receives further orders to propagate, gather personal information, and downloads and installs additional malware onto the victim’s computer. The worm also attaches itself to certain critical Windows processes such as svchost.exe, explorer.exe and services.exe.
What is interesting is that the security hole Conficker exploits to gain control of Windows operating systems was plugged in a security patch released on 23 October 2008. In theory, then, every system compromised in the past week was a system that had not had that late-October patch applied. The security patch protecting Windows 2000, Windows XP and Windows Server 2003 against Conficker-like attacks was marked as critical and should have been installed in a timely manner.
What are some lessons from an emergency management and business continuity perspective?
1. If you’re running Microsoft operating systems – you must keep them patched, and do it in a timely manner. Windows represents the largest near-homogenous family of operating systems in the world. This makes them the primary target for the developers of botnets and malicious software. Whilst I recognise that it takes time to deploy patches in a large organisation such as the Ministry of Health – an organisation will always be at risk if it doesn’t install security updates in a timely manner. All Microsoft ‘Critical’ patches should be applied within weeks of release.
2. Where possible, organisations should attempt to diversify their installed base of operating systems. If you solely run Microsoft operating systems, then a worm has the potential to take down the entire organisation. If you run a heterogeneous computing environment with a variety of operating systems (e.g. Windows, Unix and OS X), then any outbreak of malicious software will only directly impact some of the systems. In our small business I support all three of these platforms: we have Windows and OS X clients, and servers running Linux, OS X Server and OS X desktop. This is one of the main reasons I refused to deploy solely Windows software for client and server when setting up our business. Reliance on a homogeneous computing environment decreases overall IT resiliency.
3. Emergency Management Information Systems (EMIS) should ideally be able to be segregated from production systems. Malicious software doesn’t have to infect a system to have an impact on it. Even if the malicious software just consumes 100% of the network bandwidth, that will be enough to create a continuity issue by denying access to critical systems – such as servers. Therefore, EMIS should really be configured on a separate network, so that even if the internal network bandwidth has been fully consumed, and access to the Internet severely restricted to limit the spread, critical systems can still be provided to the wider world. Network segmentation can be used to limit the impact upon critical systems. Direct access to the emergency network segment could be provided from network jacks in the EOC. Once again, these should be on an entirely independent network segment to ensure that emergency operations can continue during an outbreak of malicious software on the main LAN.
Finally, emergency managers should also make themselves aware of the Centre for Critical Infrastructure Protection (CCIP), and consider signing up for its vulnerability alert emails. These are sent out for critical advisories associated with information security risks, and can be good prompts for getting in touch with IT and making sure that your systems are patched and up-to-date.
Update 2009-01-27: I see that the Manager of the CCIP went public yesterday saying the CCIP advised MOH of the security patch in October. The real question is whether the Ministry has custom applications installed on all its systems (e.g. including clients), or if they are just talking about server applications. If most of the desktops are only running Office and a groupware application such as Outlook or Notes, then they should have been able to be relatively easily patched before December. It is well recognised that patching servers running legacy applications takes longer to test for complications before deploying patches.
I’ve been mostly quiet on the geospatial front recently as I’ve been busy with work, but there are a couple of things worth commenting on.
- Environment Canterbury GIS Beta blog. Not the first to the council GIS blogging game, but certainly early, is my local regional council in Canterbury. They’ve started a blog to engage with the community, and built their first mashup. This is a great step, and I’m excited to see more public GIS practitioners provide a means of direct communication and engagement with the communities they represent. Very exciting.
- A reminder that whilst we’ve still got a long way to go, things could actually be worse.
Getting back to mashups, I believe that they are a simple and cost-effective means of communicating geospatial information to users – ECan’s swimming water quality mashup is an example of that: a simple way of visually checking the quality of swimming spots around Canterbury.
However, the real benefits are going to come when that information is freed from the encumbrance of being tied to an Internet connection. Many of the recreational activities we undertake are not near Internet connectivity, and mobile data access is still too expensive for checking these sorts of data sources over summer when away from your computer and Internet connection.
As a very keen recreational user of handheld GPS for geocaching and other activities, I am ever hopeful that councils such as Environment Canterbury will start to consider making some of their underlying data available for download, so that it can be bundled into offline maps for use on in-car, handheld and mobile phone GPS devices.
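To show how little stands between raw council data and a handheld GPS, here is a minimal sketch that writes point data out as GPX, the waypoint format most consumer GPS units accept. It assumes the coordinates and labels have already been extracted from whatever the council publishes (e.g. a shapefile); the function name is my own invention:

```python
import xml.etree.ElementTree as ET

def points_to_gpx(points, creator="council-data-export"):
    """Build a minimal GPX 1.1 document from (lat, lon, name) tuples."""
    gpx = ET.Element("gpx", {
        "version": "1.1",
        "creator": creator,
        "xmlns": "http://www.topografix.com/GPX/1/1",
    })
    for lat, lon, name in points:
        # Each point becomes a GPX waypoint with a human-readable name
        wpt = ET.SubElement(gpx, "wpt", lat=str(lat), lon=str(lon))
        ET.SubElement(wpt, "name").text = name
    return ET.tostring(gpx, encoding="unicode")

# e.g. a swimming-water-quality site near Christchurch (illustrative values)
doc = points_to_gpx([(-43.531, 172.636, "Swimming spot")])
```

Once the underlying data is downloadable, this sort of conversion is a few lines of code – the hard part today is getting the data at all.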
This will complete the full loop – councils create our communities (geospatially at the very minimum), they record geospatial data about our communities, and with mashups – we are finally starting to get some of this geospatial data back.
I look forward to the day when councils are extremely open with their geospatial data and make it available so that we can put it in our choice of portable mapping device.
Laurence Millar, the Chief Information Officer for the NZ Government recently blogged about some thoughts around government data and Web 2.0. It is exciting to see someone in such a position of influence raise these issues, and hopefully lead the charge to promote change in government agency practices.
One angle I would like to expand on is Tim O’Reilly’s question:
What if you don’t think of what you produce as the “final product” but rather as a step in an information pipeline, what do you do differently to add value for downstream consumers?
I think this is a fundamental point that Government needs to get to grips with. When you look at government websites these days, most are designed around the philosophy of being the terminal point of an information production chain – the result being information products expressed entirely in the form that the government agency ‘expects’ citizens to digest them in. Most Government websites therefore only produce web pages and pdf documents – they contain relatively little raw data in a format that is more accessible to citizens. The exceptions, of course, are those agencies with extensive mandates for publishing vast datasets, such as Statistics New Zealand and Land Information New Zealand.
Laurence concludes his post with exactly the right next action:
…open up our content, expose our data so that it is easier to consume, rather than applying resources to redesign information dissemination. By creating objects that others can assemble we are likely to be significantly more successful at ensuring New Zealanders have access to the government information when, where and how they prefer.
A classic example is the recently released Atlas of Socioeconomic Deprivation in New Zealand (NZDep2006) published by the Ministry of Health. Whilst it contains a wealth of data, all of the downloadable files on the website are constrained by their publication in a read-only pdf format. The multitude of maps are all pdfs. The tables and reference information are in pdfs. This makes it near impossible to extract and utilise the data “when, where and how they prefer”. All is not lost, however: I contacted MOH and they provided a CD with a couple of hundred MB of data, including shapefiles for use within Geographical Information Systems. However, the licensing of the data is still somewhat unclear – Crown Copyright, for example, does not provide ideal terms for republishing the data by uploading it to a geospatial data hub such as Koordinates. What I mean is that it doesn’t allow republishing without first clarifying conditions of use with the agency, in this case Health. If it were released under Creative Commons, this would greatly speed the republication and distribution of the data. So, whilst the NZDep2006 data has been released, it is not yet ready for prime-time Web 2.0 use.
This has also been replicated with Health’s Atlas of New Zealand’s District Health Boards. Providing spatial data at all is admirable, but using pdfs to do so in this day of Google Maps and Google Earth makes little sense – surely Government should be considering publishing data as KMZs, or even as live network services that can be loaded dynamically into a far richer and more intuitive client than Adobe Acrobat.
Coming back to my point, SSC is certainly making all the right noises about where we should be going. However, right now, it appears to be left up to individuals like myself to actually go to agencies and say – “Hey, we’d actually like access to this data in a more reasonable format”.
What I would like to see is a mechanism whereby individuals such as myself can instead approach the SSC with a request concerning an agency, and the SSC will engage that agency to ensure the data is made available in a manner consistent across government.
I’m sure agencies would stand up and listen to someone making a request on behalf of the Government CIO instead of a lone citizen or three. It would also mean that a consistent playbook could be promoted (a Government Geospatial Information Web Access Guideline) covering formats, hosting and, most importantly, licensing agreements, by encouraging widespread adoption of the Creative Commons v3 NZ license.
For Government agencies to really start opening up their data, this needs to be driven from within Government, and only the SSC has the voice to catalyse this process. Sure, individuals such as myself are engaging with success, and in some cases we’ll be able to obtain access – such as my recent win with Transit’s 2008 road survey trackpoint data. But for an individual to engage multiple agencies is very time consuming, and it is a slow process – especially when we are often cold calling and have to restate the case for publicly accessible data every time. And honestly, it is not something we as volunteers should have to be doing; this is really work that paid government personnel should be doing.
SSC needs to short-circuit this process by stepping up and creating an inter-agency mechanism to accept requests from citizens, and use the position of the SSC to engage, promote and ensure release of the data – whether spatial or not.
Late last week I stumbled across the Research e-Labs blog, which forms part of the larger eGovernment effort. They are currently consulting on the new eGovernment feed standard. They are proposing a move to Atom 1.0, as it appears to provide better mechanisms for semantically marking up the information contained in the feed. This is an exciting proposition, especially if they expand the defined tags to support reports and data sources – on top of the already defined news, jobs, consultation and events. They have a good presentation available too.
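To illustrate why Atom suits this, the format’s category element gives each entry a machine-readable type alongside its human-readable title. The sketch below generates a minimal Atom feed using the proposed tag vocabulary (news, jobs, consultation, events); the function name and feed contents are invented for illustration, and this is not the standard being consulted on, just a demonstration of the mechanism:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)

def build_feed(feed_id, title, entries):
    """entries: list of (entry_id, entry_title, category_term) tuples,
    where category_term is one of the proposed tags, e.g. 'news',
    'jobs', 'consultation' or 'events'."""
    feed = ET.Element(f"{{{ATOM}}}feed")
    ET.SubElement(feed, f"{{{ATOM}}}id").text = feed_id
    ET.SubElement(feed, f"{{{ATOM}}}title").text = title
    for entry_id, entry_title, term in entries:
        entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}id").text = entry_id
        ET.SubElement(entry, f"{{{ATOM}}}title").text = entry_title
        # Atom's <category> element carries the machine-readable type,
        # which is what lets aggregators filter feeds semantically
        ET.SubElement(entry, f"{{{ATOM}}}category", term=term)
    return ET.tostring(feed, encoding="unicode")

xml = build_feed("urn:uuid:example-feed", "Agency announcements",
                 [("urn:uuid:entry-1", "Consultation now open", "consultation")])
```

A consumer can then filter on the category term rather than scraping titles, which is the whole point of the semantic markup.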
Clearly following on from the antipodes, the UK Government is now holding a mashup competition as well. But it appears that they haven’t quite gone as far as we have in New Zealand – at least not in terms of trying to remove as many restrictions on the use of Government data.
Having a look at the data page, where the Government has published data, it is interesting to note that citizens are required to enter into click-use licensing agreements, and commit to using APIs to access government data. You can see some of the frustration on the page listing the data downloads.
Why are these such a problem? Well, the data can only be used for non-commercial purposes – that is going to hinder development somewhat, with no potential return at the end of the work. The licensing agreements also appear to allow the government agency to pull the plug at any point if they don’t like what you are doing – I wonder what would happen if a mashup showed an inconvenient truth for a government; would that be reason for a government agency to pull the data plug? Another joy of accessing data through APIs is that queries are limited to a few thousand per day (in the case of the Ordnance Survey).
Which is why Government data really should be properly freed. It needs to be released to take on a life of its own under a suitable Creative Commons license. Let citizens download the data, mash it up on their own systems, and deal with all the issues. Governments must not become gatekeepers of this information through APIs and license agreements.
Having just been through the process of obtaining and publishing some Government data, I was interested to see Dennis McDonald covering an Australian eGovernment blog post on ‘Making data freely available‘.
Craig makes some very good points in his conclusions, and I’ve got one that I would like to add based on recent experience with the State Highway trackpoint data I sourced in NZ and uploaded to OpenStreetMap.
In the follow-up to uploading the trackpoints to OSM, a group of NZ OSM mappers have been discussing how to classify roads in New Zealand – basically, how the road hierarchy is classified. It turns out that this is a hard problem, and there is no single standardised approach to classifying roads in New Zealand. After asking for guidance, I became aware of a number of issues and captured these on the gis.org.nz wiki.
The simple fact was that there was not a nationally consistent approach to classifying the road hierarchy in New Zealand. I expect it is similar in Australia, and possibly worse with the additional layer of State government.
This highlights an additional key role that I think Government has to play in national data collection.
Government should set the standards for data collection to ensure that datasets are nationally consistent to enable simple aggregation of disparate datasets.
It doesn’t have to actually perform the aggregation itself, although that would be nice. It just has to ensure that standards are used to enable aggregation. Using the road hierarchy classification example from above, this means that a road of a certain class in one part of the country means exactly the same thing in another. This would probably occur through an engagement-based approach that determines a controlled vocabulary to define types of roads.
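A controlled vocabulary in practice is little more than an agreed mapping from each council’s local terms onto one national set of classes. The terms below are invented for illustration – they are not an actual NZ road hierarchy – but they show the shape of the thing:

```python
# Hypothetical controlled vocabulary: each national class lists the
# local council terms that should be normalised onto it.
ROAD_CLASS_VOCAB = {
    "motorway":  {"motorway", "state highway - motorway"},
    "arterial":  {"arterial", "major arterial", "principal road"},
    "collector": {"collector", "distributor"},
    "local":     {"local", "local road", "residential street"},
}

def normalise_road_class(raw):
    """Map a council's local term onto the national controlled vocabulary."""
    term = raw.strip().lower()
    for national, local_terms in ROAD_CLASS_VOCAB.items():
        if term in local_terms:
            return national
    return "unclassified"  # unknown term: flag for manual review
```

With every supplier normalising through the same vocabulary, datasets from different authorities can be merged without a road class meaning one thing in Canterbury and another in Auckland.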
Without this responsibility, citizens are doomed to a million-and-one datasets that cannot be easily aggregated to produce coherent and consistent national datasets.
Over at the In Development blog at eGovt, they are asking for comments on the state of progress for eGovernment in New Zealand. I’ve copied my comment here.
Location, Location, Location!
A quick skim and keyword search indicates that the report has almost entirely neglected one of the most important aspects of information, and that is location – its spatial component. It is not surprising, then, that the Government is struggling with how to handle spatial information. Some of the most important aspects of any community are distance and connectivity. With Government doing a poor job of making spatial information available to the citizen, it is no wonder that individuals struggle to find out about consents from two properties over.
Even more of a shame is the fact that Government has failed to deliver on a project to produce an authoritative National Address Register of addresses, roads and placenames. This dataset is one of the most fundamental to being able to place information in space and truly understand its context. eGovernment will never take off until the core location issues are worked out. Whilst the recent release of significant amounts of Statistics New Zealand information is a great step, the true benefits and insights contained in the released census data are being held back by the lack of an authoritative National Address Register.
Organisational Inertia and Champion Individuals
Government organisations themselves are often hesitant to change or try something new. This is not to say it doesn’t happen, and there have been some excellent examples recently of organisations stepping up and trying something new – such as using wikis for consultation and engagement. I am fairly certain that most of these projects will have been championed internally by a small group of individuals, and I’ll bet they had quite a struggle to see their project through. Here’s hoping that these recent projects are the catalyst for more risk taking, and acceptance of exciting IT projects within Government.
Not enough Champions
As a consultant who has worked a lot within Government agencies, many of the information projects I’ve seen successfully delivered have succeeded almost solely on the back of one or two key individuals. The agencies themselves have often been more of a hindrance to the delivery of exciting new initiatives, and the Champions have had to fight an uphill battle against their own organisations, and other Government agencies, when ‘engaging stakeholders’. Even worse, a number of these Champions run up against so many brick walls in their organisation, and within Government, that they end up choosing to leave in frustration, or worse, are forced out by unenlightened superiors. In some circumstances they make their way to an organisation that is supportive, but that fate does not appear to await all Champions. You need some way to support and encourage these key individuals within Government agencies – there may only be one or two, but they are your key instrument of change from within.
Start Small, Release Early, Upgrade Often
Despite the fact that the National Address Register is such a necessary component of eGovernment, I don’t believe that the project as tendered should have gone ahead – not for something where tender prices ranged between $9 and $48 million. The project would have been so large and complex that it would likely not have been delivered on time, would have been over budget, and would have lacked the desired capabilities.
I think Government has to learn to try the small and simple things first and work up.
Surely, for the price of one or two Geospatial Professionals and supporting hardware, the New Zealand Government should be capable of aggregating all roading information from the 74 Road Control Authorities and Transit New Zealand, and publishing it as a single national roading dataset under permissive licensing. This should be doable now for a couple of hundred thousand dollars. Just produce a dataset and get it out there. Get feedback about how it is used, look at improving the process, and accept that feedback. If you don’t, then the New Zealand Government is going to look pretty ridiculous when volunteers have created their own national roading dataset using OpenStreetMap because Government wasn’t capable.
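The core of that aggregation job is mechanical, not complicated. As a rough sketch (authority names and field names are invented for illustration, and real road data would of course carry geometry as well), merging per-authority records into one national list is essentially:

```python
def aggregate_road_datasets(datasets):
    """Merge per-authority road records into one national list.

    datasets: dict mapping authority name -> list of record dicts,
    each with at least a 'road_id'. Records are keyed on
    (authority, road_id) so duplicates within a supply are dropped.
    """
    seen = set()
    national = []
    for authority, records in datasets.items():
        for rec in records:
            key = (authority, rec["road_id"])
            if key in seen:
                continue  # duplicate record from the same authority
            seen.add(key)
            # Tag each record with its source so provenance survives the merge
            national.append({"authority": authority, **rec})
    return national
```

The genuinely hard parts – agreeing on a classification standard and a licence – are exactly the parts only Government can do; the code is the easy bit.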
Here are some simple actions that Government can undertake to encourage more exciting use of Government information.
Find data. Release it under permissive licensing. Release it in formats that make it readily accessible to manipulation in software (e.g. don’t release maps only as pdfs; make the underlying spatial data available). Announce it through a simple clearing house – nothing flash, it only needs to be a WordPress blog pointing to the relevant source. It won’t be until this happens that the more exciting concepts, such as entirely unexpected but useful mashups, occur.
And that is when things will get really interesting. At that point, we will have citizens building mashups and services about ‘Our Place In Space’ – and they won’t be constrained by the organisational inertia inherent in most Government agencies that are tied back by accountability and liability that makes them hesitant to take bold steps.
Remember, eGovernment is not just about Government developing systems and solutions. It also includes citizens, communities and organisations building systems to meet their own needs.