More GIS in the Cloud
Peter Batty is looking into EC2 for his new venture:
… thinking seriously about using Amazon EC2 and S3 when we roll out, especially now that Amazon has added new “extra large” servers with 15 GB of memory, 8 “EC2 Compute Units” (4 virtual cores with 2 EC2 Compute Units each), and 1690 GB of instance storage, based on a 64-bit platform – these servers should work well for serious database processing.
Amazon has details on the new instance types Peter refers to here.
With such large amounts of memory available, it seems possible to build some really killer route finding services.
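To see why memory matters here: if the whole road network fits in RAM, a shortest-path query never touches disk and runs at memory speed. A minimal sketch using plain Dijkstra over an in-memory adjacency dict (the toy graph and distances below are made up for illustration):

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path over an in-memory graph: node -> [(neighbor, cost), ...]."""
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Walk the predecessor chain back from the goal to recover the route
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Hypothetical road graph, distances in km
roads = {
    "A": [("B", 5), ("C", 10)],
    "B": [("C", 3), ("D", 11)],
    "C": [("D", 4)],
}
path, km = dijkstra(roads, "A", "D")
print(path, km)  # ['A', 'B', 'C', 'D'] 12
```

A real service would precompute shortcuts (contraction hierarchies and the like), but the point stands: 15 GB is enough to hold a continent-scale graph entirely in memory.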
Microsoft is working on something similar to EC2. I just hope that by the time Microsoft comes up with something, ESRI provides 64-bit support and a licensing policy that allows cloud deployment.
In response to EC2 questions, Microsoft CTO Ray Ozzie said:
Amazon Web Services [are] … showing Web 2.0 startups that there might actually be something there with regard to this utility computing model. Whether it’s the right set of services exactly, or whether the way that they’ve designed them is exactly what matches the needs of those potential developers, there are some questions. But I think they’ve done the industry a service by beginning to open people’s [in other words, Microsoft’s] eyes to the potential.
I don’t have any announcements at this point in time. But directionally, I think you could see in my presentation that we believe very heavily in this utility computing fabric concept; it’s the only way, even internally focused, it’s the only way we can get scale amongst all the properties we run internally. And I think it just makes sense to offer those services to developers and to enterprise customers over time.
Sounds like the same business case Bezos made for AWS.
It’s not so much that (Amazon Web Services) has something to do with selling books. It’s the inverse: Selling books has a lot to do with this.