Liquid Galaxy Project
Web Page: http://code.google.com/p/liquid-galaxy/wiki/GSoC2012
Mailing List: http://groups.google.com/group/liquid-galaxy
Liquid Galaxy is a remarkably compelling panoramic display system. As an immersive visualization tool it breaks new ground by using commodity off-the-shelf hardware. Applications that have been enabled for Liquid Galaxy include Google Earth, Second Life, WorldWide Telescope, the MPlayer video player, and the game engines OpenArena, Cube 2: Sauerbraten, and Irrlicht. The goal of this Summer of Code project is to further extend the capabilities of this environment using open source software and tools.
Students: Having an understanding and appreciation of Liquid Galaxy is an absolute requirement for a successful application. The best way to get that knowledge is to build one! Please make an attempt to build a two-machine Liquid Galaxy using the Quick Start guide. If you have difficulties, use the community mailing list at liquid-galaxy@googlegroups.com.
Please read our GSoC Ideas Page, and if you have specific GSoC questions, contact lg-gsoc@endpoint.com.
Ben Goldstein & Andrew Leahy (Liquid Galaxy Project Admins)
Projects
- ClusterGL Liquid Galaxy Integration/Packaging: continuing the ClusterGL project from Summer of Code 2011.
- Liquid Galaxy Project - JavaScript-based Liquid Galaxy: I focus on computer vision and computer graphics at the Chinese Academy of Sciences, and I also work with a team on 3D urban scene modelling and street view in Shenzhen. My GSoC project is to synchronise the Liquid Galaxy views with JavaScript: a JS library that can control the slave cameras just as the Google Earth client does (where we normally set each screen's yawOffset in drive.ini). WebSockets will be needed for the synchronisation. Later we could bring this to other web applications, perhaps a street view application for Shenzhen (a city in China). If you are interested in this project and have ideas about it, you are welcome to chat with me; I hope to be your friend and share with you! (A minimal sketch of the synchronisation idea appears after the project list.)
- New Control Devices: Using the Microsoft Kinect, gesture and voice recognition will be used to control Liquid Galaxy navigation around the Google Earth map, creating a more interactive user experience. This will involve using voice to search for locations and get directions to them, and using gestures to pan the map and zoom. (A rough sketch of one possible gesture-to-navigation mapping appears below.)
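The view-synchronisation idea in the JavaScript-based Liquid Galaxy project can be illustrated with a minimal sketch. This is not the project's code: it assumes the Node.js "ws" package for the master-side server, and the readView()/applyView() callbacks are hypothetical hooks that a real implementation would wire to the Google Earth client (or another web viewer) on each screen.

    // Sketch only: synchronising slave views to a master camera over WebSockets.
    import WebSocket, { WebSocketServer } from "ws";

    interface View {
      lat: number;     // degrees
      lon: number;     // degrees
      alt: number;     // metres
      heading: number; // degrees, 0 = north
      tilt: number;    // degrees
    }

    // Master: periodically broadcast the current camera view to all slaves.
    export function startMaster(port: number, readView: () => View): void {
      const server = new WebSocketServer({ port });
      setInterval(() => {
        const msg = JSON.stringify(readView());
        for (const client of server.clients) {
          if (client.readyState === WebSocket.OPEN) client.send(msg);
        }
      }, 50); // roughly 20 updates per second
    }

    // Slave: apply the master's view with a fixed per-screen yaw offset,
    // analogous to the yawOffset each display is given in drive.ini.
    export function startSlave(url: string, yawOffset: number,
                               applyView: (v: View) => void): void {
      const socket = new WebSocket(url);
      socket.on("message", (data) => {
        const view: View = JSON.parse(data.toString());
        view.heading = (view.heading + yawOffset + 360) % 360;
        applyView(view);
      });
    }

Each slave differs only in the yawOffset it is given, so the master does not need to know how many screens are attached.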
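For the gesture half of the New Control Devices project, one possible mapping (independent of any particular Kinect SDK, and purely an illustration) is to track a hand position frame by frame and turn its displacement into pan and zoom commands; the HandFrame and NavCommand types and the scale constants below are hypothetical.

    // Sketch only: turning tracked hand movement into map navigation commands.
    interface HandFrame { x: number; y: number; z: number; } // metres, sensor space

    interface NavCommand {
      panLon: number; // degrees of longitude to move
      panLat: number; // degrees of latitude to move
      zoom: number;   // multiplicative altitude change, < 1 zooms in
    }

    const PAN_SCALE = 5.0;  // degrees per metre of hand movement (tuning value)
    const ZOOM_SCALE = 2.0; // altitude factor per metre of push/pull (tuning value)

    export function gestureToNav(prev: HandFrame, curr: HandFrame): NavCommand {
      return {
        panLon: (curr.x - prev.x) * PAN_SCALE, // hand moves right -> pan east
        panLat: (curr.y - prev.y) * PAN_SCALE, // hand moves up -> pan north
        // pushing the hand toward the sensor lowers z, giving a factor < 1 (zoom in)
        zoom: Math.exp((curr.z - prev.z) * ZOOM_SCALE),
      };
    }

Voice input would sit alongside this, mapping recognised phrases to location searches rather than to camera motion.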