Storing Code in a Geodatabase

It appears that the .NET 2.0 framework will now be part of the standard install for ArcGIS. This means it should be possible to dynamically compile code at run time using System.CodeDom.Compiler.

What if we could store code in the geodatabase that would be dynamically compiled at run time?
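
For reference, compiling a source string on the fly with System.CodeDom.Compiler looks roughly like the sketch below. This is a minimal, generic example (the class name and source string are made up, and nothing here is ArcGIS-specific); the point is just that the .NET 2.0 framework already ships everything needed to turn stored text into executable code.

```csharp
using System;
using System.CodeDom.Compiler;
using System.Reflection;
using Microsoft.CSharp;

class DynamicCompileDemo
{
    static void Main()
    {
        // Imagine this string was read from a table in the geodatabase.
        string source = @"
            public class StoredRule
            {
                public string Describe() { return ""compiled at run time""; }
            }";

        CompilerParameters options = new CompilerParameters();
        options.GenerateInMemory = true;                 // keep the assembly in memory, no DLL on disk
        options.ReferencedAssemblies.Add("System.dll");

        CSharpCodeProvider provider = new CSharpCodeProvider();
        CompilerResults results = provider.CompileAssemblyFromSource(options, source);
        if (results.Errors.HasErrors)
            throw new InvalidOperationException("compile failed");

        // Instantiate the freshly compiled type and call into it via reflection.
        object rule = results.CompiledAssembly.CreateInstance("StoredRule");
        MethodInfo describe = rule.GetType().GetMethod("Describe");
        Console.WriteLine(describe.Invoke(rule, null));  // prints "compiled at run time"
    }
}
```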

A major pain of working with class extensions is that people who do not have the DLL installed on their machine cannot even open featureclasses that have extensions. Perhaps it would be possible to store the code for a class extension in the geodatabase (optionally obfuscated).

Perhaps this approach could also be applied to support triggers. In ArcCatalog, I’d like to be able to right-click a field in a geodatabase table (or featureclass) and provide code that would be called for OnCreateFeature, OnDelete, etc. Behind the scenes the geodatabase would store this in a GDB_ table, and when a feature is added to a featureclass (or table) it would JIT-compile and call the code. The GDB_ table could also provide a read/write property field (or fields) that would let me implement sequences.
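
To make the trigger idea concrete, the snippet stored in the GDB_ table might be little more than the body of an event handler. The sketch below is purely hypothetical (the field name and sequence helper are invented), but it leans on the real ArcObjects types such a mechanism would presumably mimic: IObjectClassEvents and IObject.

```csharp
using ESRI.ArcGIS.Geodatabase;

// Hypothetical: what a snippet stored in a GDB_ table might be compiled into.
// The geodatabase would JIT-compile this class and fire its events for the
// featureclass (or table) the stored row points at.
public class StoredTrigger : IObjectClassEvents
{
    public void OnCreate(IObject obj)
    {
        // Example behavior: stamp a sequence number into a field on the new row.
        int fieldIndex = obj.Fields.FindField("SEQ_ID");   // invented field name
        if (fieldIndex >= 0)
            obj.set_Value(fieldIndex, NextSequenceValue());
    }

    public void OnChange(IObject obj) { }
    public void OnDelete(IObject obj) { }

    private int NextSequenceValue()
    {
        // Placeholder: the real value would come from the read/write property
        // field in the GDB_ table mentioned above.
        return 0;
    }
}
```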

I suppose it should be possible to write Visual Studio add-ins to support editing code stored in a geodatabase.

I guess there’s really no reason the code would need to be C#, VB.NET, or whatever. Maybe a simplified (domain-specific) language could be provided that even DBAs could understand. For example, say I want to assign a county ID to a point whenever the point is created or updated, based on the county the point falls within. The C# code to accomplish this might be a bit overwhelming, so maybe a simpler language could be provided that expresses it. Of course ESRI would need to provide a compiler to create the CIL from the simplified language.
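
To illustrate the gap: the DSL version might be a one-liner such as "SET COUNTY_ID FROM COUNTIES.COUNTY_ID WHERE WITHIN" (syntax invented here), while the C# equivalent using real ArcObjects interfaces looks more like the sketch below (featureclass and field names are made up):

```csharp
using ESRI.ArcGIS.Geodatabase;
using ESRI.ArcGIS.Geometry;

public static class CountyAssigner
{
    // Hypothetical helper: copy COUNTY_ID from the county polygon that
    // contains the point. Featureclass and field names are invented.
    public static void AssignCountyId(IFeature point, IFeatureClass counties)
    {
        ISpatialFilter filter = new SpatialFilterClass();
        filter.Geometry = point.Shape;
        filter.GeometryField = counties.ShapeFieldName;
        filter.SpatialRel = esriSpatialRelEnum.esriSpatialRelIntersects;

        IFeatureCursor cursor = counties.Search(filter, true);
        IFeature county = cursor.NextFeature();
        if (county != null)
        {
            int src = county.Fields.FindField("COUNTY_ID");
            int dst = point.Fields.FindField("COUNTY_ID");
            point.set_Value(dst, county.get_Value(src));
            point.Store();
        }
        System.Runtime.InteropServices.Marshal.ReleaseComObject(cursor);
    }
}
```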

Maybe this approach would also allow custom features. Custom features were promoted at 8.0, but they never really worked as intended; AFAIK ArcFM is the only custom-feature-based solution in widespread use. What if the CLSID stored in the GDB_Objectclasses table were a key into another table that stored code? Instead of instantiating a COM class when the objectclass is opened, ArcGIS could JIT-compile the code stored in the geodatabase. If this is possible, perhaps ESRI could provide us with a base Feature class that we could extend and whose methods we could override.

Update: I’ve written a prototype and uploaded it here.

8 comments so far

  1. Brian Flood on

    that’s a great idea Kirk, never thought of that. It would solve the GDB+implementation dll issue quite nicely, just store it along with the data (kinda like VBA scripts with MXDs). Of course, this only works if ESRI puts the necessary JITer code in the GDB classes, otherwise you’d need a dll to get your dll out of the GDB 🙂

    a common OS framework would be nice but nothing beats ESRI baking it right in.

    cheers
    brian

  2. Kirk on

    Good to hear from you Brian. I just read a bit about how UDTs are supported in SqlServer 2005. At first I was thinking SqlServer must be storing the assembly in the database, but this article says apparently not:

    “To do all of these, the only requirement for using UDTs in the client application tier is that the client application must have access to the assembly that implements the UDT.”

    Not sure if this changes in SqlServer 2008.

    Oracle also provides UDTs. The DLL search order described in this PowerPoint does not mention the DBMS as a place where assemblies may be stored, implying DLLs need to be installed on each client machine.

    DLL search order:

    1. Current application directory
    2. Application config or web.config
    3. Machine.config
    4. Windows Registry (HKLM\Software\Oracle\ODP.NET\\DllPath)
    5. Windows System Path setting

    Since neither SqlServer nor Oracle stores assemblies in tables, I wonder if the performance hit might be prohibitive.

    Regards, Kirk

  3. Paolo on

    Hey Kirk

    I agree with you.
    A DSL in the RDBMS would be fine and would solve the DLL deployment problems with featureclass extensions and custom features.
    But IMHO the best thing would be if Esri implemented SQL Spatial on every platform; then you could code triggers and stored procedures with that.

    For more specific GDB code that couldn’t be implemented with spatial SQL (though the simple cases you were talking about would be covered), the DSL approach would be fine: but can a .NET-based DSL run on every SDE platform, or only on SQL Server? Or would the DSL just be sent to the .NET client (i.e. ArcMap) and run there?

    cheers, Paolo

  4. Kirk on

    Hey Paolo –
    I suppose Oracle will be lowering its price for Spatial in response to SqlServer 2008. As spatial database competition accelerates, maybe ArcSDE’s role as an abstraction layer allowing development of DBMS-agnostic spatial apps will become more relevant.

    I see that Oracle has a lot of support for .NET, but I suspect Java still gets more attention. Googling around a bit, I can’t find any Java equivalent to System.CodeDom. But I did find Drools, a Java-based DSL: http://labs.jboss.com/drools/

    Perhaps a good reason to offer a DSL would be to provide language independence (have it work with either Java or .NET). It seems like it should be possible to allow a spatial DSL to run either client-side or server-side. Not sure how debugging server-side would work, though.

    Regards, Kirk

  5. Brian Flood on

    hey kirk

    a client app can use .NET UDTs but, as you noted, it needs access to the assembly that defines them. IMO, this is the same issue as delivering your own assembly logic to all the client machines. You could use a URL for the assembly loader but to be honest, I’ve never gotten that to work correctly (especially with COM interop and the assembly cache).

    I think a better bet is a generic loader that would work with all GDB custom extensions stored in the database; at least you could change the logic in the DB and have multiple extensions that the loader simply compiles and instantiates. The obvious problem is getting the loader to all the machines (an ESRI implementation would be nice).

    Sql08 uses the same .NET UDT setup and the spatial extensions are built using this functionality. BTW, you can deploy the spatial assemblies in your own app, a good free method for adding a geometry model/spatial operation library to your app. They do not rely on SqlServer at all; you can spin up geometries and work on them in memory…

    SDE as an abstraction layer – hah, pretty funny if this came about. If they gave it away for free I could see that happening…

    cheers
    brian

  6. Kirk on

    Hey Brian –

    Thanks again for your comments. A bit of background: I’ve got a large water utility where I’ve rolled out a .NET Editor Extension that does some very basic things. Other users in a different department have seen the extension and have become envious, and have begun asking for their own editor extensions tweaked to handle the particular featureclasses they are responsible for editing. Their requirements are rather vague, so an iterative approach is needed where I write something, they test it, give me feedback, then repeat.

    Each time I install an update, the IT guy with admin privs has to be scheduled way in advance. I’d like to be able to GoToMeeting with the end user and tweak the code until it works the way they expect. While I have done this with VBA and then cut and pasted into VB6 once it stabilized, I’d really like to stick with .NET.

    So what I’m thinking is maybe I should write a simple editor extension loader, as you describe, that leverages CodeDOM.Compiler to load other extensions from the geodb at run time. I don’t think I’d need to mess with interop(?) … COM exposure is only needed for discovery via the COM category anyway, correct?

    Unfortunately, IExtensionManagerAdmin.AddExtension takes a CLSID as an argument, so I guess I’ll need to use IParentExtension. The example here indicates the child extension need not be COM-visible. Instead of using ICategoryFactory.CreateNext, I’ll use CodeDOM.Compiler.
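
    Roughly what I have in mind for the compile-and-instantiate piece (just a sketch; how the parent extension reads the source out of the geodb and what it passes for initializationData are glossed over):

    ```csharp
    using System.CodeDom.Compiler;
    using Microsoft.CSharp;
    using ESRI.ArcGIS.esriSystem;

    public class GeodbExtensionLoader
    {
        // Compile a child extension's source (read from a geodatabase table)
        // and start it up with whatever the parent extension was handed.
        public IExtension LoadChild(string source, string className, object initializationData)
        {
            CompilerParameters options = new CompilerParameters();
            options.GenerateInMemory = true;
            // In practice the ESRI references probably need full paths.
            options.ReferencedAssemblies.Add("ESRI.ArcGIS.esriSystem.dll");

            CSharpCodeProvider provider = new CSharpCodeProvider();
            CompilerResults results = provider.CompileAssemblyFromSource(options, source);
            if (results.Errors.HasErrors)
                return null;

            IExtension child = (IExtension)results.CompiledAssembly.CreateInstance(className);
            child.Startup(ref initializationData);
            return child;
        }
    }
    ```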

    This should allow more agility in deployment. Like you say, I’ll still need to do the initial install, but no admin privs should be needed in subsequent iterations.

    Most of the things I’m doing are very simple, e.g. when a manhole is added, populate the ServiceArea column based on the service area polygon it falls within. It sure seems like a DSL could be provided that generates code behind the scenes to accomplish this. That way a non-programmer could make changes to the extensions.

    Regards, Kirk

  7. Brian Flood on

    hi kirk

    I’m not sure about the non-programmer angle; I’ve never looked at how you could do that, so I doubt I can be of any help there.

    however, the dynamic load of an editor extension (or a traditional application extension, for that matter) is definitely possible. First, I would make the loader itself an editor extension. It needs to be installed locally with admin privileges, but once it’s there it will never really change. In turn, it acts as a proxy for all the editor extensions you are loading dynamically from the DB. Some manifest (file, db table, etc.) would tell it where the bits are and then it would either

    a) load the source into memory, compile it, then instantiate the object
    b) load the compiled dll directly from the DB (as a byte array stored in a BLOB) and use Assembly.Load()

    Once you have the object in your shim extension (which should be keeping a collection of these dynamic objects), just call its Startup using what was passed to your loader extension. Likewise, call Shutdown on your dynamic extensions when appropriate. The ArcMap Editor will not know about these extensions, but you should have everything you need via the loader’s call to their Startup methods.

    Using the proxy technique, you could eliminate the whole COM interop/CLSID issue (something like the sketch below). As you change your extensions, copy the source or compiled code into the DB and tell them to restart 🙂
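
    Just a sketch of option (b); the class and method names are made up, and the manifest/BLOB read is left out:

    ```csharp
    using System.Collections.Generic;
    using System.Reflection;
    using ESRI.ArcGIS.esriSystem;

    public class ProxyExtensionHelper
    {
        private readonly List<IExtension> children = new List<IExtension>();

        // dllBytes came out of a BLOB column; how you read it (ADO.NET,
        // ArcObjects, whatever) is up to the loader and its manifest.
        public void StartChild(byte[] dllBytes, string className, object initializationData)
        {
            Assembly assembly = Assembly.Load(dllBytes);               // load straight from memory
            IExtension child = (IExtension)assembly.CreateInstance(className);
            child.Startup(ref initializationData);                     // pass along what the proxy got
            children.Add(child);
        }

        public void ShutdownChildren()
        {
            foreach (IExtension child in children)
                child.Shutdown();
            children.Clear();
        }
    }
    ```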

    cheers
    brian

  8. […] 8, 2008 I’ve written a proof of concept editor extension based on the ideas outlined in my previous post, plus some helpful feedback from Brian Flood […]

