Back to reality...

Lately I have been experimenting with JNode.

This time, equipped with the plug-in reloading capabilities, I tried again to run kjc, the only freely available Java-to-bytecode compiler (for more details see: ), under JNode.
First I stumbled into a problem with resource loading: the GNU getopt library was unable to load the resource files for its localized messages. After finding a quick fix (I noticed that in the GNU Classpath version of java.util.ResourceBundle.getBundle(String, Locale) the fourth entry in the class stack determines the caller's class loader, which is then used to load the bundle), I could go one step further, only to get a NullPointerException in a constructor that, under normal conditions, apparently has no chance of throwing one (kjc works fine under the JDK).
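One way around such caller-sensitive lookups is to pass a class loader explicitly instead of relying on the two-argument overload that walks the call stack. A minimal sketch (the bundle name here is a hypothetical stand-in for getopt's message bundle):

```java
import java.util.Locale;
import java.util.ResourceBundle;

public class Messages {
    // Hypothetical base name standing in for getopt's message bundle.
    private static final String BUNDLE = "gnu.getopt.MessagesBundle";

    // Passing the loader explicitly avoids depending on which class
    // loader the runtime discovers by inspecting the call stack.
    public static ResourceBundle load(ClassLoader loader) {
        return ResourceBundle.getBundle(BUNDLE, Locale.getDefault(), loader);
    }
}
```

If the bundle is not visible to the given loader, getBundle throws a MissingResourceException, which is also a quick way to check which loader can actually see the resource.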

At this point I had the idea of trying the Sun Java compiler under JNode. I took tools.jar from JDK 1.3.1_06, quickly turned it into a JNode plug-in and tried to load it. Surprise: with almost 200 MB of free JNode memory I got an OutOfMemoryError. It turned out that loading the plug-in implies a file copy, which implies a memory allocation proportional to the size of the file. However, the file in question is larger than 4 MB, which is larger than the JNode heap block size (2 MB) used by the memory manager, and it looks like it is not possible to allocate an object larger than such a heap region. All right. I increased that limit to 16 MB and tried again. At that point I started to get GC errors with a message along the lines of: object allocation during GC. Well, it is not the first time I have seen this, but it still sounds very serious and makes me think of thread synchronization (another topic that is not new). By some miracle, after several restarts of JNode these GC problems gradually went away and I got further and further toward compiling a file under JNode with javac. However, I didn't get there. The compiler is able to print the command-line options when you start it with no arguments, but if you start it with a valid Java file name to compile, JNode gets into endless GC cycles, the usual output appearing again and again with a few seconds' break in between.
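The large-allocation part of that problem is avoidable on the copying side: instead of reading the whole file into a single array whose size tracks the file size, the copy can use a small fixed-size buffer, so no single allocation ever approaches the heap-region limit. A minimal sketch of that approach (not JNode's actual plug-in loader code):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copy a stream using a small fixed-size buffer. The largest single
    // allocation is 64 KB, far below a 2 MB (or 16 MB) heap-region limit,
    // regardless of how big the source file is.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[64 * 1024];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }
}
```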

Then I tried the jar tool from the same library to decompress a plug-in JAR on an ext2 file system, and I kept getting an error saying it was unable to create the META-INF directory. Finally the decompression worked under the /jnode directory (on the in-memory partition).

All these very peculiar error situations show one thing: JNode needs a lot of trying, testing and fixing. There are many ways to do this, but a very simple one is to take real-world software and try to run it on JNode.
JUnit might help to find the errors that you can think of, and a few more. But we want to run real software on JNode, and that is where you get the errors that you cannot think of or even imagine.
There are plenty of tools and programs that do not need graphics (which is practically missing); most of those without native methods should work on JNode. Do they? :-)

By the way, for this kind of trial you might run into problems with security. Security is an important thing to have, but it is equally important to have no restrictions in certain situations.
As a simple solution you can grant java.util.AllPermission to your plug-in locally, if it is not the security itself that you want to test.
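For reference, the permission class actually lives in the java.security package, and in a conventional Java policy file a blanket grant looks like the snippet below (assuming the standard policy syntax applies to how you configure your plug-in locally, which I haven't verified for JNode):

```
grant {
    permission java.security.AllPermission;
};
```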

What real world software did you successfully run on JNode?


How to edit a comment?

I would like to write java.security.AllPermission instead of
java.util.AllPermission above but cannot find a way to do it.
I remember that not long ago it was still possible to modify my own posted entries.

What happened?