Gesture-based 3D mesh modeling

Date of Publication

2012

Document Type

Bachelor's Thesis

Degree Name

Bachelor of Science in Computer Science

College

College of Computer Studies

Department/Unit

Computer Science

Thesis Adviser

Florante Salvador

Defense Panel Chair

Nathalie Lim-Cheng

Defense Panel Member

Miguel Cabral
Juan Lorenzo Hagad

Abstract/Summary

The user interface is an integral component of the way people use computers. User interfaces have evolved from the text-based interfaces of the past to the now-popular window-based interface. However, none of these reflect how humans intuitively interact with objects. Humans tend to use their hands when interacting with objects or expressing their ideas. One way to simulate this intuitive process is to use sketches or finger strokes as input for machines to process, instead of the usual window-and-pointer interface.

This research focused on the implementation of a 3D modeling interface that utilizes gesture-based input. We created a system for Android tablet PCs that accepts touch input to create and manipulate 3D geometries. The system supports the creation and deletion of 3D primitives such as vertices, edges, faces, and meshes; their manipulation through modeling and viewing transformations; and the loading and saving of 3D geometries. The 3D modeling process is adapted to the modern user interface of tablet PCs and emulates the human process of drawing, offering an alternative to the window-and-pointer-based 3D modeling tools that are more commonly available.
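Since the thesis itself is not reproduced in this record, the following minimal Android (Java) sketch only illustrates the kind of gesture-based viewing manipulation the abstract describes: a one-finger drag mapped to a rotation of the view using android.opengl.Matrix. The class name, sensitivity constant, and matrix handling are assumptions for illustration, not the authors' implementation.

```java
import android.opengl.Matrix;
import android.view.MotionEvent;
import android.view.View;

// Minimal sketch (not the thesis code): maps a one-finger drag on an
// Android view to a viewing rotation, in the spirit of the gesture-based
// viewing transformations described in the abstract.
public class DragRotateListener implements View.OnTouchListener {

    // Hypothetical column-major 4x4 view-rotation matrix that a renderer
    // would multiply into its model-view matrix each frame.
    private final float[] viewRotation = new float[16];
    private float lastX, lastY;

    public DragRotateListener() {
        Matrix.setIdentityM(viewRotation, 0);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:
                float dx = event.getX() - lastX; // horizontal drag -> yaw
                float dy = event.getY() - lastY; // vertical drag -> pitch
                // 0.5f is an arbitrary pixels-to-degrees sensitivity.
                Matrix.rotateM(viewRotation, 0, dx * 0.5f, 0f, 1f, 0f);
                Matrix.rotateM(viewRotation, 0, dy * 0.5f, 1f, 0f, 0f);
                lastX = event.getX();
                lastY = event.getY();
                return true;
            default:
                return false;
        }
    }

    // The renderer reads this matrix when drawing the next frame.
    public float[] getViewRotation() {
        return viewRotation;
    }
}
```

In such a design, the touch listener only accumulates the gesture into a transformation matrix; the renderer (for example, a GLSurfaceView.Renderer) applies that matrix when drawing, keeping gesture recognition separate from mesh data and rendering.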

Abstract Format

html

Language

English

Format

Print

Accession Number

TU18497

Shelf Location

Archives, The Learning Commons, 12F, Henry Sy Sr. Hall

Physical Description

1v. various foliation : illustrations (some colored) ; 28 cm.

