LizardTech.com

Archive for the ‘Google’ Category

Only in the Age of Google

Friday, December 12th, 2008

One of our engineers was investigating compiler error C2766 (for a friend offsite, of course. We never get compiler errors ourselves, you understand). He did what anyone would do: he googled it. His monitor was suddenly awash in search results linking to pages about the assassination of JFK. It turns out the error number is the same as the serial number of the Italian rifle used in that doleful outing.

After we all had a good macabre snort about this, one of our guys commented that there are alarming juxtapositions that would only come to light in the Google Age.

Thought I: Touché! That’s worth a post.

GIS and Geography Week

Friday, November 21st, 2008

As part of National Geography Week we at LizardTech have been asking ourselves what GIS and geography mean to us personally. What usually comes to mind for me (aside from work) is images from space on Google Maps. I like to scout places before I go, and perhaps show where I’ve been afterwards via GPS tracks like the ones displayed in Google Maps using this KML file.

It’s also cool to stumble on places I’ve been in the past. For instance I found the cheap hotel we stayed at in Jamaica in the mid-80s before this technology was available to the public. Here’s a link.

It was easy to spot because it is the only one up the road from the airport with a pool. It was one of the few Montego Bay hotels that would allow native Jamaicans to stay (most were exclusively for tourists).

Lots of fun!

Making public imagery truly public

Friday, August 8th, 2008

Lizard watchers will have noticed the recent LizardTech announcement regarding the Express Server sale to North Carolina. I spent an afternoon pointing Google Earth over there and reflected on the need for better public access to public GIS aerial imagery.

Like most people (certainly most people who read LizardTech’s blog) I’m really excited by the advent of feature-rich GIS clients like Google Earth and the very extensive base maps they make available for free, non-commercial use. That said, there are lots of different kinds of imagery you might want to view, and Google can’t provide them all. (For an interesting introduction to some of what’s behind the Google Earth base map imagery, see this Google Earth Blog posting.)

Despite the availability of free or nearly free commercial data from Google, there are myriad public programs that acquire new imagery each year.

  • USDA’s Aerial Photography Field Office runs several image collection programs, the best known of which is probably the National Agriculture Imagery Program (NAIP), which provides farm-related orthophotos
  • USGS’s National Aerial Photography Program (NAPP) supports various federal agencies
  • The National Geodetic Survey’s Aeronautical Survey Program supports aerial photography of airports

That’s a by-no-means exhaustive list of federal collection programs, and there are similar programs at the state and local level. In general, there’s a very good reason for this: no single base map, even one as well-funded and maintained as Google’s, can hope to meet everyone’s needs.

Happily, most of that publicly-acquired data is publicly available for little or no cost to US taxpayers. The rub is the means by which it’s made accessible.

In many cases you can view maps via a web application like this NAIP 2006 Viewer and this MDOQ Viewer hosted by USDA’s Geospatial Data Warehouse. That’s great for basic access and a pan-and-zoom style of data discovery, but it limits you to exactly what the web application has built for you. For most non-trivial work (comparing or aggregating diverse datasets, for example), it’s not what you need.

More frequently, you download the data or have it mailed to you on a DVD. Like this. That’s OK if you’re not in a rush and you have industrial-strength GIS tools and the necessary technical sophistication to use them. Even then, you’re stuck with this same problem for every public dataset you need to access. Consider briefly that each of those datasets is frequently larger than a gigabyte.

What we need are better tools and better integration. Increasingly, Google Earth is being picked as an easy, relatively inexpensive ($400/year for commercial use, free for personal use) tool and OGC standards like WMS are serving the integration need.

All of that brings me back to the North Carolina site. Randolph County is a growing community. If you want to see the latest growth (here from 2007), you’ll need to ask the locals; it is not in the Google Earth base map, which shows this:
Google base map

Here’s the same area, again inside of Google Earth, using the WMS feed from NC OneMap.

overlaid additional data from NC OneMap

Try it yourself: Click this KML file to open it in Google Earth (Note: In Firefox, you’ll be asked first whether you want to open or save the file).
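For the curious, here’s a rough sketch of what such a KML file can look like under the hood: a GroundOverlay whose image source is a WMS GetMap request, so Google Earth fetches the imagery from the server and drapes it over the matching extent. The endpoint URL, layer name, and bounding box below are all illustrative placeholders, not the actual NC OneMap values.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint -- the real NC OneMap service URL
# and layer identifiers will differ.
WMS_BASE = "https://services.example.gov/wms"

def wms_getmap_url(layer, bbox, width=1024, height=1024):
    """Build a WMS 1.1.1 GetMap request URL for one layer over a
    bounding box (minx, miny, maxx, maxy in EPSG:4326)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }
    return WMS_BASE + "?" + urlencode(params)

def kml_ground_overlay(name, layer, bbox):
    """Wrap the GetMap URL in a minimal KML GroundOverlay so
    Google Earth drapes the returned image over the same extent."""
    minx, miny, maxx, maxy = bbox
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{wms_getmap_url(layer, bbox)}</href></Icon>
    <LatLonBox>
      <north>{maxy}</north><south>{miny}</south>
      <east>{maxx}</east><west>{minx}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

# Rough bounding box around Randolph County, NC (illustrative only)
print(kml_ground_overlay("NC orthoimagery", "orthoimagery",
                         (-80.1, 35.6, -79.5, 36.0)))
```

Save the printed text as a .kml file and double-click it to load the overlay in Google Earth. (A NetworkLink element works similarly when you want the imagery refreshed as you pan and zoom.)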

Directions magazine explored the question of the commercial data Google publishes as a public trust. My view is that, by making their real public data accessible via a common WMS interface, North Carolina has made it, well, really public.

GeoWeb 2008 trip report (or, What I did on my summer vacation)

Friday, August 1st, 2008

Last week I had the pleasure of attending GeoWeb 2008 on behalf of both LizardTech and OSGeo. The conference was once again in Vancouver BC, at my favorite business hotel and conference venue. I’ve attended this conference for a number of years now, and it gets better every passing year.

Just a few highlights:

  • The underlying theme running through the week was the integration (confluence? convergence?) of the GIS world with the worlds of CAD and BIM (building information modeling). Architects typically operate at a different scale than we’re used to, but increasingly they want to be able to envision and model their buildings in the larger urban landscape that we can provide for them. Kimon Onuma and his BIMStorm work demonstrated this integration very well. Going the other direction, traditional GIS folks are looking to things like CityGML to improve the fidelity of their own models and add that third dimension.

panelists

  • I moderated a one hour discussion on Open Source Servers, ably assisted by panelists Paul Ramsey of Clever Elephant, Justin Deoliveira of OpenGeo, and Bob Bray of Autodesk. The attendance was good, and we had some good questions and discussions about the pros (and sometimes cons) of working in and with open source software.
  • On behalf of Cody Benkelman of Mission Mountain Technology, I also presented a cool paper on using Amazon’s Mechanical Turk web service and Google Earth to solve a real problem for a real customer. I tried to get across two main ideas. First, Turks and Turk-like things can be seen as “outsourcing for the Web 2.0 generation”. Secondly, and possibly disconcertingly to some, complete “automation” is not always the best answer – us geeks think of it first, and yes, it’s usually the right move – but not always. Contact us for reprints.

Mechanical Turk

  • Dr. Michael Goodchild gave a great workshop on Data Quality – a topic which quite honestly sounded pretty dry and uninspiring, but which turned out to be both educational and interesting. He convinced me that LizardTech’s viewers are displaying lat/long incorrectly, at least from a data quality perspective.
  • Michael Jones of Google keynoted again this year, and he once again made everyone stop and think deeply about the human impact the geo community can – and does – have on the world. Not the kind of talk you can summarize easily, you just had to be there.
  • This year the conference held its first Student Competition. The competition required use of open source software for the projects; OSGeo was one of the sponsors and as such I was one of the judges. First prize went to Tobias Fleischmann (Paris Lodron University Salzburg, Germany) for “Web Processing Service for Moving Objects Analysis”, which was built using deegree. Second prize went to Tran Tho Ha and Nguyen Thi Thanh Thuy (Politecnico di Milano, Italy) for “e-Collaboration for DGPS/GPS data distribution and receiver device evaluation”, which used PostGIS, MapScript, and OpenLayers. Congratulations to both winners!
  • GeoWeb is also famous for being scheduled during Vancouver’s annual “Celebration of Light”, an international fireworks competition held several evenings high above English Bay. As in previous years, the conference’s evening reception was turned into a sunset dinner cruise, after which we all went up on deck to oooh and aaah at the pyrotechnic ballet.

    Shipmates

Finally, just for kicks, I’ll offer the following bits of geotrivia I collected during the conference:

  • “A GPS with a bullet hole in it is a paperweight. A paper map with a bullet hole in it is a paper map with a bullet hole in it.” (attributed to the US Marine Corps)
  • Tobler’s First Law of Geography: “Everything is related to everything else, but near things are more related than distant things.”
  • city furniture (noun): features of the urban landscape such as park benches, bus shelters, street lamps, etc.
  • “The amount of metadata needed for a piece of data varies with the ‘social distance’ from me to my data consumer.” (Michael Goodchild)
  • For Amazon’s web services, 85% use the REST API and 15% use the SOAP API. (quoted by Satish Sankaran, ESRI)
  • On the Vancouver transit system today, 100 of 144 bus routes are under detours, due to 2010 Olympics work. (Peter Ladner, Vancouver deputy mayor)
  • Thirty percent of all 911 calls are not associated with a street address. (Talbot Brooks)

GeoWeb 2009 is already being planned, and will include special emphasis on both cityscapes and 3-D modeling.