To explore the face as a modality for desktop control, this project uses gaze input and facial gestures (head tilt, eyebrow raise, backward lean) for two common desktop control operations: tab switching (within an application) and window switching (between applications). A working prototype was built using an eye tracker to capture gaze input and a face-detection library to recognize facial gestures, both connected to a custom-built dummy desktop, and initial feedback was gathered by demonstrating the prototype to a small group of users.