I have not written any blog entries for quite a long time. My research project, AiBO+, is not dead; I have been working hard to upgrade from URBI 1.5 to a 2.x version. The work is not done yet, but I won 3rd prize in the Gostai Open Source Contest 2010 with this subproject.
I have performance problems with Urbi 2 on the Aibo: the more extensive use of Urbi script in the core of Urbi 2 seems to slow everything down to the point of being unusable. Gostai's plan is to move more and more functionality into scripts instead of native implementations, but this kills performance on embedded robot systems like the Aibo. If they re-implement some performance-critical parts in native C++, the upgrade will become feasible for the Aibo.
So was working on the Urbi 2 port a waste of time? No: I achieved a lot along the way, such as upgrading the Aibo toolchain to gcc 4.x, applying aggressive compiler optimisations to the final Aibo binaries, and reimplementing the linking tools in Open-R. These advances squeeze almost everything out of the RM-7000 CPU in the ERS-7. I have spent a lot of time on these tasks. It would be great to upgrade the toolchain to gcc 4.4.x or later to get the MIPS-specific PLT optimisations, which would boost performance a bit more and reduce the final binary size, but I think this task is almost impossible without help from Sony. I have not received any response to my enquiries so far. I did everything on my own, and the work was hard given that the Aibo does not run a Unix-like system, but a proprietary black box.
The next step is to re-implement some low-level functions to replace Urbi 1.5 completely and to start implementing locomotion functions from scratch. I have not found much reusable open-source software for this, although I will use whatever turns out to be useful. The Aibo's joints have not moved for half a year now, so I am very keen to implement these functions. Aibo movements are hard-coded in the official Sony Mind software and in Urbi, so it will be very interesting to implement movements that respond to the forces in the joints and to objects in the environment.
Fetching of sensor data and images is now done, and I am able to ping my native AiBO+ server on the robot. Another nice touch is that the WiFi LED shows whether an AiBO+ client is connected to the robot, and the LEDs on the robot's back show the battery status. My overall plan for the client-server architecture has changed a bit: I will implement many functions directly on the robot, so if there is no connection to a computer-side client, only some heavyweight functions will be unavailable, while locomotion and other lower-level functions will keep working.
Let's see what the future brings!