ESRI UC: Exhibit Hall, Wisdom of the Cloud

Below are three booths I visited in the Exhibit Hall, and a use case describing how cloud-based GIS would allow them to collaborate.

Surface Area and Ratio
I visited with Jeff Jenness, who showed me tools he’s developed to compute Surface Area and Ratio. An acre in a hilly location has a lot more surface area (and wildlife habitat) than an acre in a flat location. Cool stuff.
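To make the idea concrete, here's a toy sketch of my own (a simplification, not Jeff's actual algorithm, which I understand triangulates against all eight neighbors of each cell): split each DEM cell's quad into two 3-D triangles, sum their areas, and divide by the planimetric footprint.

```python
import math

def surface_area_ratio(dem, cell):
    """Approximate (surface area / planimetric area) for a DEM grid.

    dem  -- 2-D list of elevations at cell corners, in the same units as `cell`
    cell -- horizontal spacing between corners

    Each quad of four corner elevations is split into two 3-D triangles;
    their areas are summed and compared to the flat footprint.
    """
    def tri_area(p, q, r):
        # Area of a 3-D triangle = half the magnitude of the cross product
        u = [q[i] - p[i] for i in range(3)]
        v = [r[i] - p[i] for i in range(3)]
        cx = u[1] * v[2] - u[2] * v[1]
        cy = u[2] * v[0] - u[0] * v[2]
        cz = u[0] * v[1] - u[1] * v[0]
        return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

    rows, cols = len(dem), len(dem[0])
    surface = 0.0
    for r in range(rows - 1):
        for c in range(cols - 1):
            a = (0.0,  0.0,  dem[r][c])
            b = (cell, 0.0,  dem[r][c + 1])
            d = (0.0,  cell, dem[r + 1][c])
            e = (cell, cell, dem[r + 1][c + 1])
            surface += tri_area(a, b, e) + tri_area(a, e, d)
    planimetric = (rows - 1) * (cols - 1) * cell * cell
    return surface / planimetric

flat  = [[0, 0], [0, 0]]
slope = [[0, 30], [0, 30]]   # 30 m rise over a 30 m cell: a 45-degree slope
print(surface_area_ratio(flat, 30.0))    # 1.0
print(surface_area_ratio(slope, 30.0))   # ~1.414 (sqrt(2))
```

A flat acre comes out at ratio 1.0, while the 45-degree slope packs about 41% more surface (and habitat) into the same footprint, which is exactly the point of Jeff's tools.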

During lunch, Michael F. from Santa Barbara described all the fires they’ve had near his home. I wonder if reports of the number of acres burned for that region reflect the surface area.

HAZUS-MH
I watched an in-depth presentation on HAZUS-MH, a free tool available from FEMA for performing risk assessment. While lots of free data is provided with the tool, users often like to plug in their own data. It runs on the desktop.


WeoGeo Marketplace

WeoGeo showed me how their marketplace allows data vendors to present their wares on a site where buyers can comparison-shop and purchase the best data for their needs. While the marketplace currently focuses on data, they see potential for a marketplace for tools as well.

A Cloud Use Case
Let’s suppose FEMA ported HAZUS-MH to the cloud. Right now HAZUS supports hurricane, flood, and earthquake modeling; it does not support fire risk modeling. Imagine a cloud-based tool, perhaps resembling ModelBuilder, that would allow a GeoDesigner to author a template wildfire risk assessment model and publish it for others to use.

A user would log in and create a model of a specific area (Santa Barbara) using the template. They would augment the free data with data purchased from a site like WeoGeo. Maybe they could purchase a DEM and some color-infrared imagery to indicate how much fuel is available for a fire. Likewise, they might decide to pay extra to have WeoGeo pre-process the data using a tool like Jeff’s before shipping it.

The guy at lunch told me there was one fire that moved 20 mph down a valley. He said shifting winds can quickly change which areas are deemed at risk. With conditions changing that rapidly, it seems like it would be easier to re-run models with updated parameters and serve the results out to the appropriate agencies if the model lived in the cloud.


4 comments so far

  1. HAZUS.org

    A HAZUS wildfire model would be a real challenge. There are probably too many variables to make an effective model. Wildfires create their own micro-climates, which would have to be modeled, and you would require very detailed inventory data concerning building materials … not to say that it is not worth doing, just that it would be difficult.

    You could operate like the flood model by calculating losses in the burned area … but again, this is inventory-driven.

    PS there are folks at UWF running HAZUS-MH on a virtual machine: http://www.hazus.org/2009/06/student-hazus-projects-at-uwf.html

  2. Kirk Kuykendall

    Yes, it would certainly be difficult. Risk models, and perhaps especially wildfire models, seem like they could benefit from being on a platform that supports easier collaboration among experts and data providers. The cloud looks like a good place for such a platform.

  3. Milver

    HAZUS-MH requires ArcGIS 9.x, and there are a lot of geoprocessing operations involved in those tools. In order to take advantage of the cloud, ESRI would need to allow variable licensing in the cloud. Is there any progress on that front?

    I wonder how UWF virtualized and allowed access to ArcGIS & HAZUS…

  4. Kirk Kuykendall

    Milver –
    In the “road ahead” session on the cloud, ESRI mentioned they have a research project looking into Amazon Web Services. Later I asked who was in charge of the project, and was told Mr. Vu (can’t remember first name).

    Each month I get a pleasant reminder of the elastic nature of Amazon Web Services: a credit card charge for $0.01. I haven’t been using it much lately, and that’s the beauty of it: you pay only for what you use. I think at some point people will start putting geoprocessing into EC2. When that happens, I imagine you would be able to pay a bit extra for a geo-enabled AMI. Last I checked, the SQL Server AMIs were still on 2005: http://aws.amazon.com/windows/ When AWS upgrades to 2008, I suspect we’ll see a lot more geo activity.

