How to write stereo applications for LG Optimus 3D. Trying the Real3D SDK

Hi There!

I've finally played a little bit with my Optimus 3D phone. Actually, I played with it a month ago, but I haven't had time to write about it until now. So far I have just installed the SDK, run the samples and taken a quick look at the code.

The Real3D SDK is available as a third-party add-on from the Android SDK and AVD Manager, so installation is pretty simple: just select it and press install. The only remark here is that the remote server does not seem very responsive at times. Once the Real3D SDK is installed, you can create a new virtual device (AVD) selecting as target "Real3D Add-On (LGE) -- API Level 8" or something similar.

After that, the usual thing. From Eclipse we can just create a project using the sample code from the SDK. It compiles without problems, and the emulator simulates the stereo effect by rendering the application output as an anaglyph. Click on the image on the left to see it with blue-red glasses.

The performance is very poor, at least on an Intel card. I'm still fighting the other Optimus, the one from nVidia, and the optirun script from the Bumblebee project doesn't help. Maybe the emulator is not using 3D acceleration at all; I do not know.

The other comment about the emulation is that it seems to come up in portrait mode. That's OK for most applications, but stereo on the Optimus 3D only works in landscape, so you have to turn your head to see the image properly. Maybe there is a way to configure that, but I do not know it.
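This does not change how the emulator window comes up, but for a stereo application it probably makes sense to lock the activity to landscape anyway. That part is plain Android, nothing Real3D-specific; the class name and placement below are my own, not from the sample:

import android.app.Activity;
import android.content.pm.ActivityInfo;
import android.os.Bundle;

// Hypothetical activity, only to show where the call would go.
public class StereoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Lock the activity to landscape, the orientation the 3D screen expects.
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        // ... the rest of the stereo setup (shown later) would go here ...
    }
}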

After installing the samples on the phone, you can see the sample applications on the nice autostereoscopic screen of the device. In my case, for some reason, the samples involving the camera just crash. I haven't dug into this yet. This is a picture of the OpenGL sample running on the Optimus 3D. No stereo in the photo, of course.

To finish, a couple of comments on the code of the OpenGL sample application.

The code is composed of two files: the main activity and the OpenGL code. The first one is the most important, as it is in charge of setting up the stereo rendering. The relevant code is this:


private GLSurfaceView glSurfaceView;
private GLTestRenderer renderer;
private Real3D mReal3D;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    glSurfaceView = new GLSurfaceView(this);

    // Attach the stereo configuration to the surface:
    // side-by-side (SS) layout, left-right (LR) order.
    mReal3D = new Real3D(glSurfaceView.getHolder());
    mReal3D.setReal3DInfo(new Real3DInfo(true, Real3D.REAL3D_TYPE_SS, Real3D.REAL3D_ORDER_LR));

    renderer = new GLTestRenderer(this);
    glSurfaceView.setRenderer(renderer);
    glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
    setContentView(glSurfaceView);
}

It basically creates a Real3D object and uses it to initialise the hardware for side-by-side rendering (REAL3D_TYPE_SS) with left-right order (yes, the other constant). In other words, the left half of the screen goes to the left eye and the right half goes to the right eye. Then it configures the renderer defined in the other file, the one that actually draws the Moon spinning around the Earth. I'm not an expert on Android OpenGL development, but the content of this method looks pretty standard.
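As a side note, those two Real3D lines are the only Optimus 3D-specific part of the activity, so one could imagine guarding them to keep the same code running on a regular phone. The following is only a sketch, under the assumption that on a device without the Real3D add-on the construction simply fails at runtime; I have not tested it:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    glSurfaceView = new GLSurfaceView(this);
    try {
        // Assumption: on devices without the Real3D libraries this throws
        // (for instance a NoClassDefFoundError), and we fall back to mono.
        mReal3D = new Real3D(glSurfaceView.getHolder());
        mReal3D.setReal3DInfo(new Real3DInfo(true, Real3D.REAL3D_TYPE_SS, Real3D.REAL3D_ORDER_LR));
    } catch (Throwable t) {
        mReal3D = null;
    }
    renderer = new GLTestRenderer(this);
    glSurfaceView.setRenderer(renderer);
    glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
    setContentView(glSurfaceView);
}

Of course the renderer would also need to know whether stereo is active, and skip the second draw pass otherwise, so this is only half of the story.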

The second file (GLTestRenderer.java) is standard OpenGL stuff. There are only two things to highlight. The first one is that, for stereo rendering, you need to draw two images. This is done in the onDrawFrame method: it first selects the left area of the screen and draws the scene with the camera in the left-eye position, then selects the right side of the screen and renders the same scene with the camera in the right-eye position.


public void onDrawFrame(GL10 gl) {
    mAngle += mAngleOffset;
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

    // Left eye: draw the scene into the left half of the screen.
    setLeftEnv(gl);
    draw(gl);

    // Right eye: draw the same scene into the right half.
    setRightEnv(gl);
    draw(gl);
}

In order to get the full picture we just need to look into those setLeftEnv/setRightEnv methods. Here is the left eye one:


private void setLeftEnv(GL10 gl) {
    gl.glViewport(0, 0, (int) mWidth / 2, (int) mHeight);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    GLU.gluLookAt(gl, -mEyeDistance, 0.0f, 4.5f, mFocusPoint[0], mFocusPoint[1], mFocusPoint[2], 0.0f, 1.0f, 0.0f);
}

The first line sets the viewport to the left half of the screen. Then the rest of the function updates the camera position for the left eye. This becomes clearer if we also take a quick look at the right eye configuration method:


private void setRightEnv(GL10 gl) {
    gl.glViewport((int) mWidth / 2, 0, (int) mWidth / 2, (int) mHeight);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    GLU.gluLookAt(gl, mEyeDistance, 0.0f, 4.5f, mFocusPoint[0], mFocusPoint[1], mFocusPoint[2], 0.0f, 1.0f, 0.0f);
}

The viewport is now the right half of the screen, and the camera is configured identically except that the X position is now mEyeDistance instead of the -mEyeDistance used for the left eye.
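Since the two methods differ only in the viewport offset and the sign of the eye position, they could be folded into a single helper. Here is a small sketch based only on the code above (setEnv is my own name, not part of the sample):

// Hypothetical helper: leftEye == true sets up the left half of the screen,
// leftEye == false the right half.
private void setEnv(GL10 gl, boolean leftEye) {
    int halfWidth = (int) mWidth / 2;
    gl.glViewport(leftEye ? 0 : halfWidth, 0, halfWidth, (int) mHeight);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    float eyeX = leftEye ? -mEyeDistance : mEyeDistance;
    GLU.gluLookAt(gl, eyeX, 0.0f, 4.5f,
                  mFocusPoint[0], mFocusPoint[1], mFocusPoint[2],
                  0.0f, 1.0f, 0.0f);
}

onDrawFrame would then just call setEnv(gl, true) before the first draw and setEnv(gl, false) before the second one.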

Summing up, using the Real3D SDK to produce stereo applications for the LG Optimus 3D seems pretty straightforward. At some point I hope to have the time to try my very own example, and maybe different problems will arise.

CU
The picoFlamingo Team