Here is an early build of an ActiveRecord adapter using RubyCLR and the .NET 2.0 SqlClient libraries. My early tests show it is 4x faster than the current DBI version.
I’ve only tested it on ActiveRecord 1.13.2 (the version included in Rails v1.0.0).
Here is a list of the things currently missing:
- Support for current and edge ActiveRecord
- Binary columns always return null
- Support for special columns (text, ntext, image). Ordering, filtering, or joining on these columns currently raises a SqlException.
- Migrations
- Rails loads the database adapters before the plugins, so I have to see whether it’s possible to deploy the adapter as a plugin.
- The current version of RubyCLR appears to have a problem with memory being GC’d. I haven’t been able to confirm the root cause yet.
How to install
You need RubyCLR and a .NET 2.0 build environment.
Check out Rails
svn co http://dev.rubyonrails.org/svn/rails/tags/rel_1-0-0 rails
Check out SqlClientAdapter
svn co http://ca.utio.us/svn/sqlclient/trunk sqlclient
Compile the SqlClient solution and copy the assembly to the activerecord vendor directory.
(I created a post-build event for this, but I don’t know if it’s included in the solution.)
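If the post-build event didn’t make it into the solution, a command along these lines should do the same job. This is only a sketch: the destination path is hypothetical and depends on where you checked out Rails.

rem Hypothetical post-build event; adjust the destination to your Rails checkout
xcopy /Y "$(TargetPath)" "C:\dev\rails\activerecord\vendor\"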
Copy sqlclientadapter.rb to the activerecord/lib/active_record/connection_adapters folder
Apply the patch inside the Rails root folder using TortoiseSVN, or manually add the new adapter.
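Once everything is in place, your application’s config/database.yml would point at the new adapter with something like the following. This is an assumption based on the file name and on how the other SQL Server adapters are configured, not a confirmed configuration:

development:
  adapter: sqlclient      # hypothetical adapter name
  host: localhost
  database: myapp_development
  username: sa
  password: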
Comments Archive
Sam Smoot said about 18 hours later:
I wonder if you ran into the same issues I’ve had trying to write or modify an ActiveRecord adapter. Mostly I can’t figure out why the serialization-related tests are failing when the majority of other tests pass… any experience with that?
Also, I was wondering if you’d thought about Unicode support. It’s easy enough to “just do” UTF-8 in a Rails application, but RubyClr doesn’t detect those strings as such to my knowledge, and the extra info is lost crossing the Ruby/.NET boundary.
So either you deal with UTF-8 strings being dumped to byte arrays in varchar fields and looking like garbage in the database (though working fine in the application), or you try to get the bytes out of the Ruby string and convert them to a Unicode string with .NET’s Encoding.UTF8.GetString(). That works fine, but going back the other way is a problem: any Unicode data in a .NET string is tossed out the window, since I suppose RubyClr assumes you’re using the Default/ASCII encoding (wild guess). I’ve been meaning to ask John Lam about that… but I’m lazy :-)
Ceaser said 1 day later:
Which serialization tests are you referring to? ActiveRecord’s object serialization? The only tests that don’t work are related to binary data, threading, and non-integer primary keys.
I just finished testing RubyCLR for non-ASCII support and you’re right: it explicitly converts all strings to ASCII.
I tried to change the Ruby::ToRubyString method to use a conversion that doesn’t force the value to ASCII, but I’m seeing errors related to loading assemblies.
As it stands right now, there are several major issues with RubyCLR that I’m trying to track down:
- Memory corruption
- Ruby and the CLR trying to access objects that have been GC’d
- Support for non-ASCII strings
Are you seeing the following error from time to time?
RuntimeError: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Brent Rowland said 5 days later:
I haven’t had any luck installing a new database adapter as a plugin without updating other files. For now, I’m resigned to updating my environment.rb file like so:
require File.join(File.dirname(__FILE__), 'boot')
require 'plugins/fbadapter/lib/fbadapter'

Rails::Initializer.run do |config|
  # ...
end