Abstract
This paper describes a virtual environment in which a designer can define the contour of a sketch by controlling a pointer in 3D space using a pair of data gloves. Most standard input devices, such as joysticks, mice, keyboards, trackballs, and light pens, do not imitate natural hand motions such as drawing and sketching. The methodology used is to construct an interactive 3D model from a combination of sketches drawn in 3D space, with the user's hands serving as dynamic input devices. This approach mainly consists of two parts: hand gesture recognition and the implementation of a virtual hand. Our focus is to examine sketching behaviours using hand gestures, where the virtual hand generates 3D models through the exploration of a number of hand-drawn sketches.