Are there any services out there that will detect Googlebot and serve an HTML snapshot of your website, rendered by an actual browser? This is pretty much required if you want to implement the AJAX crawling specification.
This shouldn't be too complicated. First, you'll need a script that detects whether a spider or bot is visiting your site, e.g. http://www.dynamicguru.com/php/sniffing-googlebot-using-php/
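The linked script does this in PHP by matching the User-Agent header; a minimal sketch of the same idea in Python (the bot name list here is an assumption, extend it as needed):

```python
import re

# User-Agent fragments for common crawlers (assumed list; add more as needed).
BOT_PATTERNS = re.compile(
    r"googlebot|bingbot|yandexbot|baiduspider|duckduckbot",
    re.IGNORECASE,
)

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a known crawler."""
    return bool(BOT_PATTERNS.search(user_agent or ""))

# Example:
# is_bot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
# -> True
```

Note that User-Agent strings can be spoofed; for stricter verification Google recommends a reverse-DNS lookup on the requesting IP, but a header check like the above is enough to decide which version of a page to serve.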
Second, you'll want something like Offline Explorer or a custom tool that creates a static version of your site. Run it as often as required, e.g. hourly or daily, to keep the static copy fresh. When Googlebot visits, detect it with a script like the one above and serve the static HTML instead.
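Under the AJAX crawling specification, the crawler rewrites a `#!state` URL into a `?_escaped_fragment_=state` query parameter, so the server-side routing step can be sketched like this (the snapshot directory and the fragment-to-filename mapping are assumptions for illustration):

```python
from pathlib import Path
from typing import Optional
from urllib.parse import parse_qs, urlsplit

# Hypothetical location of the pre-rendered static pages.
SNAPSHOT_DIR = Path("/var/www/snapshots")

def snapshot_path(url: str) -> Optional[Path]:
    """If the request carries _escaped_fragment_ (per the AJAX crawling
    specification), map it to a pre-rendered HTML file; otherwise return None
    and let the normal JavaScript app handle the request."""
    parts = urlsplit(url)
    qs = parse_qs(parts.query, keep_blank_values=True)
    if "_escaped_fragment_" not in qs:
        return None
    fragment = qs["_escaped_fragment_"][0]
    # e.g. /app?_escaped_fragment_=products/42 -> snapshots/products_42.html
    name = (fragment or "index").replace("/", "_")
    return SNAPSHOT_DIR / f"{name}.html"
```

Combined with a bot check, the handler serves `snapshot_path(url)` to crawlers and the normal dynamic page to everyone else.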
The best app for this is: http://browsershots.org/
Choose your browser set and/or OS and wait about 10 minutes to see the magic. Try a couple of times :)