How to write stereo applications for LG Optimus 3D. Trying the Real3D SDK

Hi There!

I've finally played a little bit with my Optimus 3D phone. Actually, I played with it a month ago, but I haven't had time to write about it until now. So far I have just installed the SDK, run the samples and taken a quick look at the code.

The Real3D SDK is available as a third-party add-on from the Android SDK and AVD Manager, so installation is pretty simple: just select it and press install. The only remark here is that the remote server sometimes seems not very responsive. Once the Real3D SDK is installed, you can create a new virtual device selecting as target "Real3D Add-On (LGE) -- API Level 8" or something similar.

After that, the usual thing. From Eclipse we can just create a project using the sample code from the SDK. It compiles without problems, and the emulator simulates the stereo effect by rendering the application output as an anaglyph. Click on the image on the left to see it with blue-red glasses.

The performance is very poor, at least on an Intel card. I'm still fighting the other Optimus, the one from nVidia, and the optirun script from the Bumblebee project doesn't help. Maybe the emulator is not using 3D acceleration at all; I do not know.

The other comment about the emulation is that it seems to come up in portrait mode. That's OK for most applications, but stereo on the Optimus 3D only works in landscape, so you have to turn your head to see the image. Maybe there is a way to configure that, but I do not know it.

After installing the samples on the phone, you can see the sample applications on the device's nice autostereoscopic screen. In my case, for some reason, the samples involving the camera just crash; I haven't dug into this yet. This is a picture of the OpenGL sample running on the Optimus 3D. No stereo in the photo, of course.

A couple of comments on the code of the OpenGL sample application to finish.

The code is composed of two files: the main activity and the OpenGL stuff. The first one is the most important, as it is in charge of setting up the stereo rendering. The relevant code is this:

private GLSurfaceView glSurfaceView;
private GLTestRenderer renderer;
private Real3D mReal3D;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    glSurfaceView = new GLSurfaceView(this);
    // Attach the Real3D control object to the surface and enable
    // side-by-side stereo with left-right ordering.
    mReal3D = new Real3D(glSurfaceView.getHolder());
    mReal3D.setReal3DInfo(new Real3DInfo(true, Real3D.REAL3D_TYPE_SS, Real3D.REAL3D_ORDER_LR));
    renderer = new GLTestRenderer(this);
    glSurfaceView.setRenderer(renderer);
    setContentView(glSurfaceView);
}

It basically creates a Real3D object and uses it to initialise the hardware to use side-by-side mode (REAL3D_TYPE_SS) with left-right order (REAL3D_ORDER_LR, yes, the other constant). In other words, the left half of our screen will go to our left eye, and the right half will go to our right eye. Then it configures the renderer defined in the other file, the one that actually draws the Moon spinning around the Earth. I'm not an expert on Android OpenGL development, but the content of this method looks pretty standard.

The second file is standard OpenGL stuff. There are only two things to highlight. The first is that, for stereo rendering, you need to draw two images. This is done in the onDrawFrame method, which first selects the left area of the screen and draws the scene with the correct camera position, then selects the right side of the screen and renders the same scene with the updated camera position.

public void onDrawFrame(GL10 gl) {
    mAngle += mAngleOffset;

    setLeftEnv(gl);
    // ... draw the Earth/Moon scene for the left eye ...

    setRightEnv(gl);
    // ... draw the same scene for the right eye ...
}

In order to get the full picture, we just need to look into those setLeftEnv/setRightEnv methods. Here is the left-eye one.

private void setLeftEnv(GL10 gl) {
    // Use the left half of the screen.
    gl.glViewport(0, 0, (int) mWidth / 2, (int) mHeight);

    // Shift the camera to the left on the X axis.
    GLU.gluLookAt(gl, -mEyeDistance, 0.0f, 4.5f,
            mFocusPoint[0], mFocusPoint[1], mFocusPoint[2],
            0.0f, 1.0f, 0.0f);
}

The first line sets the viewport to the left half of the screen. Then the rest of the function updates the camera position for the left eye. This becomes clearer if we also take a quick look at the right-eye configuration method.

private void setRightEnv(GL10 gl) {
    // Use the right half of the screen.
    gl.glViewport((int) mWidth / 2, 0, (int) mWidth / 2, (int) mHeight);

    // Shift the camera to the right on the X axis.
    GLU.gluLookAt(gl, mEyeDistance, 0.0f, 4.5f,
            mFocusPoint[0], mFocusPoint[1], mFocusPoint[2],
            0.0f, 1.0f, 0.0f);
}

The viewport is now the right half of the screen, and the camera is configured identically except that the X position is now mEyeDistance instead of the -mEyeDistance used for the left eye.
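Since the two methods differ only in the viewport origin and the sign of the X offset, the side-by-side layout logic can be captured in a few lines of plain Java. The sketch below is a minimal, GL-free illustration of that symmetry; StereoLayout, Eye, viewport and eyeX are hypothetical names made up for this example, not part of the Real3D SDK.

```java
// Minimal sketch of the side-by-side (SS, left-right) stereo layout used above.
public class StereoLayout {
    public enum Eye { LEFT, RIGHT }

    /** Viewport rectangle {x, y, width, height} for one eye. */
    public static int[] viewport(Eye eye, int screenWidth, int screenHeight) {
        int half = screenWidth / 2;
        int x = (eye == Eye.LEFT) ? 0 : half;   // left eye starts at 0, right at mid-screen
        return new int[] { x, 0, half, screenHeight };
    }

    /** Camera X offset: left eye at -eyeDistance, right eye at +eyeDistance. */
    public static float eyeX(Eye eye, float eyeDistance) {
        return (eye == Eye.LEFT) ? -eyeDistance : eyeDistance;
    }

    public static void main(String[] args) {
        int[] left = viewport(Eye.LEFT, 800, 480);
        int[] right = viewport(Eye.RIGHT, 800, 480);
        System.out.println(left[0] + "," + left[2]);   // prints "0,400"
        System.out.println(right[0] + "," + right[2]); // prints "400,400"
        System.out.println(eyeX(Eye.LEFT, 0.5f));      // prints "-0.5"
    }
}
```

With this shape, setLeftEnv and setRightEnv collapse into a single method taking an Eye argument, which makes it harder for the two passes to drift out of sync.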

Summing up, using the Real3D SDK to produce stereo applications for the LG Optimus 3D seems pretty straightforward. At some point I hope to have time to try my very own example, and maybe different problems will arise.

The picoFlamingo Team


Accessing optimus 3D camera

I am new to Android application development. I am trying to develop an application on the Optimus 3D that accesses the stereo camera and processes the images using computer vision algorithms. I will only be needing the synchronized left and right images from the Optimus 3D, not the 3D functionality. Will I need the Real3D SDK to be installed, or can I go ahead with a normal Android application?

Thanks and Regards

Probably not required

Unfortunately my Optimus 3D died in a water-related accident some time ago. Anyhow, the Real3D API comes with a camera example that you can use as a starting point. Quickly looking at the code, it looks like the camera access is normal; Real3D is used for rendering the camera images in 3D.

Also, in that example it looks like you can choose between using either of the cameras, or the two combined, using camera parameters... I do not know if, in order to use the keys for the related parameters, you would have to install the API... probably those are just numbers that you can redefine...

Good luck