Archive for May, 2009
This is a copy of an article I had published in The Box on Tuesday, the 26th of May 2009. It is my originally submitted text, archived here for my records. The topic has seen a bit of interest; I’ve also been interviewed on National Radio’s Panel about it. I will probably write a more detailed article on the problem in due course.
A recent report from the US Government Accountability Office has identified possible trouble ahead for the Global Positioning System (GPS). Due to governance failures by those responsible for the GPS, there is a risk over the next 10 years that satellites will fail faster than they can be replaced. Whilst replacement satellites have been ordered and developed, technical and project management issues have delayed the launch schedule, with the next launch planned for November 2009.

This doesn’t mean that the GPS will just stop working. There are currently 31 active satellites in the GPS constellation, and only 24 are required for the agreed level of service. However, 13 satellites are more than 12 years old and are increasingly likely to fail. This happens as the solar panels age and produce less electricity to power the satellite. There are options for extending satellite life, such as turning off less critical secondary payloads to reduce the power draw.

A few satellites can fail without a significant impact on end users. If the number of active satellites drops to 24 or below, GPS receivers will probably become less accurate, as fewer satellites would be visible at any given time. Given that the GPS is a strategic military asset for the US, it is highly unlikely it will be allowed to fail completely. This may drive innovation in GPS receivers that support multiple satellite navigation systems, reducing reliance on any single system.
Microsoft recently released an invitation-only beta of Vine, a social networking application that lets people share information with their networks and receive news and public safety alerts for areas they are interested in. It appears to integrate with Twitter and Facebook, and allows you to both post and receive information. Microsoft is targeting Vine as a tool for both routine and emergency use. It has only just entered testing, but it has some potential as a social networking tool for disasters. To ultimately be successful, it will need to run on Mac, Linux, and popular mobile platforms such as Symbian and the iPhone, so we can carry it in our pockets. It also needs to interoperate with similar applications from other vendors, but most of all it needs to be free, so that price does not dissuade adoption.
Update – I received an invite, but the current beta is really only keyed for US usage.
Not long ago I wrote briefly about my concerns of the United Nations finally deciding to get involved in the LTTE conflict in Sri Lanka – making the point that the ‘international community’ has ignored this conflict for years, and now, only as it nears its end, has it decided to try and do something.
Today we see the full scale of the hypocrisy of this action when we compare it to recent events in Pakistan.
Up to 15,000 troops have been deployed to take on 4-5,000 militants… The fighting has already displaced some 200,000 people, while a further 300,000 are estimated to be on the move or about to flee, the UN says… “We are feeling so helpless, we want to go but can’t,”… The army has also accused the Taleban of holding the civilian population hostage and blocking their exit… The US says the militants in northern Pakistan pose a direct threat to its security, and has demanded they be confronted. Pakistani military spokesman Gen Athar Abbas said the military’s objective was to eliminate the militants from the Swat valley and also the neighbouring districts of Dir and Buner.
Well, you tell me if that doesn’t sound like the same situation as Sri Lanka? The difference?
One country has been ignored by the international community, decided to do its own thing, and gets told off in the final minutes. The other, with a nearly identical situation, is given the go ahead to ‘wipe them out’.
This internal conflict could impact on US operations in Afghanistan, so the US is all for this action. The LTTE conflict has no direct impact on the US, so they are more than happy to support involvement by the UN to broker a peace deal. Somehow I don’t see the UN being involved in the Pakistan conflict. What’s the bet Pakistan won’t see a UN inquiry?
The issue of community-produced maps has reared its head on the IAEM email list today – closely linked to my post back on the 26th. The following issue was raised, and I wanted to share my reply to this.
Lack of citation was my major concern with the other available maps that have been in wide circulation. The second concern with the other maps is that they showed push-pins when they did not have or could not cite the data to support specific points.
My reply follows:
I think you’ll find that most of those maps do actually have references. In the case of the Google Maps mash-ups, they are contained in the hundreds of comments accessible from the same page as the maps. In fact, it is generally the posting of these references in the comments that drives the updates to the Google Maps. What they have failed to do is make the citations easier to find, by not including the reference in the popup bubble above each marker. But if you read through all the comments, you’ll likely find most of the citations there.
Another big failure is the lack of a timeline or history, so that one can see the growth and change in numbers over time for each marker. Most of the maps are purely a snapshot of the here-and-now, and give no historical context.
The real point that emergency managers should take away from this is the following.
Agencies that ‘own’ the source information (e.g. CDC, WHO, and health agencies in every other country in this case), really should be publishing authoritative georeferenced data at the source. If agencies did this, then there would be no need for these ‘amateur’ cartographic efforts to hack together information from news, rumours and other sources. It would sure save a lot of time and effort in people trying to recreate information that already exists and either hasn’t been released, or has not been converted to a georeferenced format.
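As a sketch of what publishing authoritative georeferenced data at the source could look like, here is a minimal GeoRSS-tagged RSS feed built with only the Python standard library. The location, case figure, and URL are made-up examples for illustration, not real data:

```python
# Minimal sketch: publishing case counts as a GeoRSS-Simple RSS feed.
# The place name, figure, and link below are hypothetical examples.
from xml.sax.saxutils import escape

def georss_item(title, lat, lon, link):
    """Build one RSS <item> carrying a GeoRSS-Simple point (lat lon)."""
    return (
        "<item>"
        f"<title>{escape(title)}</title>"
        f"<link>{escape(link)}</link>"
        f"<georss:point>{lat} {lon}</georss:point>"
        "</item>"
    )

def georss_feed(items):
    """Wrap items in an RSS 2.0 channel declaring the georss namespace."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<rss version="2.0" xmlns:georss="http://www.georss.org/georss">'
        "<channel><title>Case counts</title>"
        + "".join(items)
        + "</channel></rss>"
    )

feed = georss_feed([
    georss_item("San Diego: 12 confirmed cases", 32.7157, -117.1611,
                "http://example.org/reports/san-diego"),
])
print(feed)
```

A feed like this could be consumed directly by mapping tools, instead of volunteers scraping the same numbers out of press releases.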
Likewise, it isn’t really the role of companies to provide this information. Once again, they are just filling a gap that we, as emergency managers, have failed to meet.
The mashup culture is a direct result of a failure by emergency managers to make information available in a form that end users clearly want it (as evidenced by the time and effort they will put into recreating the data in the form that they want to use it).
Perhaps we really should start thinking seriously about how we can produce authoritative information in formats that our communities want.
If you have a look at the example map I created in under an hour on the 26th, you’ll note that each marker’s popup contains a little table with a link to the source article, and in the case of the San Diego marker, daily figures for three days, so it was possible to track the state of that marker over time. In addition, I scaled the marker images so that they were more proportional to the number of cases. A marker for each infection quickly produced an unreadable map, so a better approach seemed to be a summary marker for each location, with the size of the marker indicating the numbers.
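That marker-scaling idea can be sketched as a small function. The base size, the 10-case reference point, and the clamping bounds below are arbitrary choices of mine for illustration, not what the actual map used:

```python
# Sketch: scale a map marker's icon so its *area* is roughly
# proportional to the case count at that location. All constants
# (base, reference count of 10, clamp bounds) are illustrative.
import math

def marker_scale(cases, base=1.0, min_scale=0.5, max_scale=4.0):
    """Return an icon scale factor; sqrt keeps marker area ~ case count."""
    if cases <= 0:
        return min_scale
    # A location with 10 cases renders at the base size.
    scale = base * math.sqrt(cases) / math.sqrt(10)
    return max(min_scale, min(max_scale, scale))
```

Clamping matters: without an upper bound, one large outbreak would produce a marker that covers half the map.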
The real trick is going to be to produce a web application to track and manage this information, that can then export it in a suitable form to display the information as discussed above. This is clearly something we should look at for Sahana.
I was catching up on Twitter today and saw some references to optimising Adobe Lightroom catalogues under #Lightroom. (I’ve now found the original posts and had a read of them; there is more good stuff to try out, and I’ve incorporated the previews and caching tips in my cleanup today.) This struck me as something I was overdue to do, as I knew my catalogue database was well over 250MB, and when something gets that big it needs a little maintenance. It prompted me to poke around and have a good cleanout.
Here’s what I did – some for speed, others just freeing up space.
0. Complete a Time Machine backup first.
Naturally, I wanted to make sure I had a backup before I did anything destructive, such as deleting files, so I forced Apple’s Time Machine to complete a backup. With that out of the way, I could start tidying up. If you’re on Windows, do a backup first before you do anything else.
1. Check the Backups directory.
I have Lr set up to back up my catalogue weekly, which keeps a fairly good trail of backups in case something goes wrong. It hasn’t yet. However, after 2 years of Lightroom use, my directory of Lightroom catalogue backups had blown out to 13.52GB! There were a couple of things I did to trash some of the existing backups.
- I removed all Lr v1 backups. I knew the date I upgraded (from the last modified date of my old v1 catalogue file), and since the upgrade produced a new catalogue name (with a -2 appended), I could confirm I was only deleting v1 backups. This freed up 8.71GB.
- For all months bar the last one, I decided that I only really needed to keep the most recent backup from each month, rather than sometimes all 4 or 5 in a month. I deleted the rest and freed up another 2.64GB of space.
This brought the Backups directory back to a rather svelte 2.42GB.
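The keep-only-the-latest-backup-per-month rule above can be sketched in a few lines of Python. The date-prefixed folder names are an assumption based on Lightroom naming its backup folders by date, so check your own directory layout before deleting anything:

```python
# Sketch of the pruning rule: for each calendar month except the current
# one, keep only the most recent backup. Folder names are assumed to
# start with a YYYY-MM-DD date prefix (an assumption, not a guarantee).
import datetime

def backups_to_delete(names, today):
    """Return the backup folder names to delete under the
    keep-latest-per-month rule; the current month is left untouched."""
    by_month = {}
    for name in names:
        date = datetime.datetime.strptime(name[:10], "%Y-%m-%d").date()
        by_month.setdefault((date.year, date.month), []).append(name)
    doomed = []
    for (year, month), group in by_month.items():
        if (year, month) == (today.year, today.month):
            continue  # keep every backup from the current month
        doomed.extend(sorted(group)[:-1])  # all but the newest
    return doomed
```

Printing the list before deleting (rather than piping it straight to a delete call) is the sensible way to use something like this.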
2. Optimise Catalogue
I hadn’t seen this option before today, so I thought I’d try it and see how it went. It is accessible via Lightroom > Catalogue Settings, as a simple ‘Relaunch and Optimise’ button. My catalogue held around 18k photos and was 262.9MB before optimisation; my Previews file was 101.7MB (note this figure). After expecting it to take, say, a quarter of an hour, I was surprised at how little time it took.
The resulting catalogue saw ~20MB removed and the overall file size brought down to 241.5MB.
I did get a shock when I saw my Previews file though: it had blown out from a lowly 101.7MB to a massive 7.74GB! However, I think this may have been a case of the package (Previews are a package on OS X) having under-reported its true size. I don’t think the optimisation process was busy enough to create 7GB of previews in such a short time! It seems more likely that the optimisation made the package report its correct size, so I’m not treating this as a ‘loss’ of 7GB.
3. Deleting Cruft
In the same directory as my main catalogue, I noted I still had my unloved v1 catalogue, as well as a couple of ‘Temporary Folders’ Lightroom had created that contained a couple of images that I no longer needed. Since these were already backed-up, I just deleted the old catalogue and these temp directories.
4. Rendering Previews
Quite a few of my previews had already been rendered, but I decided to check my rendering settings and force Lr to render the rest: Library > Previews > Render Standard-sized Previews. I expect I may be facing a 20GB+ Previews file by the time it’s finished!
5. Adobe Camera Raw cache
As per the Lightroom Queen’s article, I went and configured the cache on a separate internal hard drive. I created a new directory called /Cache/Adobe Camera Raw in the root of a second ‘working’ drive and added it in the Lightroom > Preferences > File Handling dialog. I chose the new directory I had created and upped the somewhat meagre 1GB cache to a more workable 10GB.
All up, it has been a good little cleaning effort that netted me another ~11.5GB of storage, which I think I will probably lose to previews. Lightroom doesn’t seem noticeably faster to open, but it does seem quicker to quit. I haven’t spent enough time in it this evening to comment on general usage. I’ll report back once I have completed the preview rendering and had a chance to spend a little more time using the optimised Lightroom.