Archive for October, 2007

Spatial is Special, what about Time?

[Image: pocket knife]
The Swiss are known for clocks and pocket knives, so why didn’t they include a watch on this pocket knife?

If you’ve worked much with GIS, there’s a good chance you’ve had to go through the why-spatial-is-special routine with a DBA who wants to store geometry as numeric columns in normalized tables.

But what about time?

Say you’re using a GPS clock to compute location via time difference of arrival (TDOA). Nanosecond precision is needed (light travels roughly one foot per nanosecond); SQL Server’s datetime type, however, resolves nothing finer than 3.33 milliseconds. This could be overcome by introducing a time column with an ITemporalReference. Internally it would store time as a 64-bit integer along with a domain and scale, just like the spatial types do. ITime would be to IGeometry what ITemporalReference is to ISpatialReference. A simpler (though perhaps more confusing) alternative might be to overload the M (measure) value of geometry so that time can be stored as a measure.
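To make that concrete, here is a minimal sketch (in Python, purely for illustration) of what such a temporal reference might look like. The class name, fields, and defaults are all hypothetical, not any actual ESRI interface:

```python
from dataclasses import dataclass

@dataclass
class TemporalReference:
    """Hypothetical time analogue of ISpatialReference.

    origin: domain start, as a POSIX timestamp in seconds.
    scale:  ticks per second; 1e9 gives nanosecond resolution.
    """
    origin: float = 0.0
    scale: float = 1e9

    def to_ticks(self, t_seconds: float) -> int:
        # Quantize an absolute time into a 64-bit tick count.
        # (A real implementation would avoid float arithmetic here,
        # since doubles cannot hold nanoseconds over long spans.)
        return round((t_seconds - self.origin) * self.scale)

    def to_seconds(self, ticks: int) -> float:
        return self.origin + ticks / self.scale

tr = TemporalReference()
print(tr.to_ticks(1.000000001))  # 1000000001 ticks past the origin
```

At 1e9 ticks per second, a signed 64-bit tick count spans roughly 292 years, which is plenty for a GPS/TDOA application.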

On the other end of the scale is geologic time, which falls outside the limits of the .NET DateTime structure (years 1 through 9999). In this case the domain would need to be much larger.
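Some quick arithmetic (again a sketch, building on the hypothetical TemporalReference above) shows how the domain and scale trade off against each other:

```python
MAX_TICKS = 2**63 - 1          # signed 64-bit tick range
SECONDS_PER_YEAR = 31_557_600  # Julian year

for label, ticks_per_second in [("nanosecond", 1e9), ("second", 1.0)]:
    span_years = MAX_TICKS / ticks_per_second / SECONDS_PER_YEAR
    print(f"{label} scale: ~{span_years:.3g} years of range")

# nanosecond scale: ~292 years      (GPS/TDOA territory)
# second scale:     ~2.92e11 years  (dwarfs the 4.5e9-year age of the Earth)
```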

From the helpdoc:

What is the best way for storing temporal data – a netCDF file or a relational database? Which one is faster?

Storing temporal data in a relational database is just as viable as using a netCDF file. ESRI’s support of netCDF is primarily to support the existing community of netCDF data and users, not to force people to learn about a new file format. The decision should be made based on how you want to create and manage data in your organization.

It looks like netCDF addresses this issue. But what if I don’t want to represent time using netCDF or date columns as ESRI suggests?

Space, Time & Hydrography

[Image: hydrograph]

ESRI has developed some useful hydrography tools for Spatial Analyst. Maybe I’m missing something, but I don’t see any tools for cleaning up gaps and spikes in flow (time series) data. Water flow telemetry can contain a lot of subtle noise that should be cleaned before being used in models.
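Here is the sort of thing I mean: a minimal despiking sketch in Python with NumPy. The window size and threshold are illustrative defaults, not calibrated values:

```python
import numpy as np

def despike(values, window=5, n_mads=5.0):
    """Flag spikes as NaN using a rolling median.

    A sample is a spike if it deviates from the local median by more
    than n_mads times the median absolute deviation (MAD).
    """
    values = np.asarray(values, dtype=float)
    out = values.copy()
    half = window // 2
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        neighborhood = values[lo:hi]
        med = np.nanmedian(neighborhood)
        mad = np.nanmedian(np.abs(neighborhood - med)) or 1e-9
        if abs(values[i] - med) > n_mads * mad:
            out[i] = np.nan  # mark as a gap to fill in a later pass
    return out

flow = [1.2, 1.3, 9.7, 1.4, 1.3, 1.5, 1.4]  # 9.7 is a telemetry spike
print(despike(flow))  # [1.2 1.3 nan 1.4 1.3 1.5 1.4]
```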

[Image: convolution]
I’m writing some tools to clean up time series data. It seems like some of the concepts behind the two-dimensional kernels used by Spatial Analyst could be generalized (and simplified) to work with one-dimensional time series data. This would allow me to resample data collected at odd intervals into an evenly spaced sampling scheme. Many of the operators available for rasters seem applicable to time series: resampling, overlays, convolution.
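As a sketch of the one-dimensional analogue, here is a Gaussian kernel resampler that pulls irregularly timed samples onto an even grid. The bandwidth is an illustrative smoothing parameter I made up, not anything tuned:

```python
import numpy as np

def resample_uniform(t, v, t_grid, bandwidth):
    """Resample irregular samples (t, v) onto the times in t_grid.

    Each output value is a Gaussian-weighted average of nearby input
    samples, the 1-D cousin of a raster resampling kernel.
    """
    t, v = np.asarray(t, float), np.asarray(v, float)
    out = np.empty(len(t_grid))
    for i, tg in enumerate(t_grid):
        w = np.exp(-0.5 * ((t - tg) / bandwidth) ** 2)  # kernel weights
        out[i] = np.sum(w * v) / np.sum(w)
    return out

t = [0, 11, 19, 37, 52, 61]         # sample times, minutes
v = [2.0, 2.1, 2.4, 2.2, 2.6, 2.5]  # flow readings
grid = np.arange(0, 61, 15)         # even 15-minute grid
print(resample_uniform(t, v, grid, bandwidth=10.0))
```

The same framing would carry over to overlays (align two series on a common grid, then combine point-wise) and convolution (slide a 1-D kernel instead of a 2-D one).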

[Image: sun]
There appear to be a lot of tools out there written for stock market analysis, but they don’t seem concerned with cleaning up errors.

More GIS in the Cloud

[Image: clouds, from EnchantedLearning]

Peter Batty is looking into EC2 for his new venture:

… thinking seriously about using Amazon EC2 and S3 when we roll out, especially now that Amazon has added new “extra large” servers with 15 GB of memory, 8 “EC2 Compute Units” (4 virtual cores with 2 EC2 Compute Units each), and 1690 GB of instance storage, based on a 64-bit platform – these servers should work well for serious database processing.

Amazon has details on the new instance types Peter refers to here.

With such large amounts of memory available, it seems possible to build some really killer route-finding services.
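To gesture at why the memory matters: with the whole road network resident in RAM as adjacency lists, each query is just a graph walk. A toy Dijkstra sketch (the graph and costs here are invented):

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path over an in-memory adjacency list:
    graph = {node: [(neighbor, cost), ...]}."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost in graph.get(u, []):
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], target
    while node != source:  # walk predecessors back to the source
        path.append(node)
        node = prev[node]
    return [source] + path[::-1], dist[target]

toy = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)],
       "C": [("D", 1)], "D": []}
print(dijkstra(toy, "A", "D"))  # (['A', 'B', 'C', 'D'], 4.0)
```

A 15 GB instance leaves a lot of headroom for keeping the graph, plus any precomputed helper data, entirely in memory.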

Microsoft is working on something similar to EC2. When it arrives, I just hope ESRI provides 64-bit support and a licensing policy that allows cloud deployment.

In response to EC2 questions, Microsoft CTO Ray Ozzie said:

Amazon Web Services [are] … showing Web 2.0 startups that there might actually be something there with regard to this utility computing model. Whether it’s the right set of services exactly, or whether the way that they’ve designed them is exactly what matches the needs of those potential developers, there are some questions. But I think they’ve done the industry a service by beginning to open people’s [in other words, Microsoft’s] eyes to the potential.

I don’t have any announcements at this point in time. But directionally, I think you could see in my presentation that we believe very heavily in this utility computing fabric concept; it’s the only way, even internally focused, it’s the only way we can get scale amongst all the properties we run internally. And I think it just makes sense to offer those services to developers and to enterprise customers over time.

Sounds like the same business case Bezos made for AWS.

It’s not so much that (Amazon Web Services) has something to do with selling books. It’s the inverse: Selling books has a lot to do with this.