Multi-Touch Solution Brings Modern UI Elements to Embedded Designs

Microchip Technology recently announced a new addition to its Human Interface Solutions portfolio. The MTCH6303 is a turnkey projected-capacitive touch controller for touch pads and screens, supporting touch sensors with up to 1,000 nodes and diagonals of up to 10″. The MTCH6303 provides multi-touch coordinates as well as a ready-made multi-finger surface gesture suite that brings modern user interface (UI) elements (e.g., pinch and zoom, multi-finger scrolling, and swipes) to any embedded design, with minimal host requirements.

The MTCH6303’s advanced signal processing provides noise-avoidance techniques and predictive tracking for 10 fingers, at scan rates of up to 250 Hz with a minimum of 100 Hz each for five touches. It also combines with Microchip’s MTCH652 high-voltage line driver to achieve a superior signal-to-noise ratio (SNR) for outstanding touch performance in noisy environments.

When combined with the MGC3130, the MTCH6303 solution can support 3-D air gestures up to 20 cm from the touch panel. Microchip’s MGC3130 E-field-based 3-D tracking and gesture controller includes Microchip’s GestIC technology, allowing user input via natural hand and finger movements in free space. Thus, you can create interface-control possibilities in two and three dimensions.

The advanced capabilities of the MTCH6303 create robust, ready-to-go touch and gesture solutions for the rapidly growing human-interface requirements in the industrial (e.g., machine control panels), home automation (e.g., lighting controls), and office equipment (e.g., printers) markets, among others.

The MTCH6303 is supported by Microchip’s new $149 Multi-Touch Projected Capacitive Touch Screen Development Kit (part # DV102013), which is now available with free, downloadable software. The DV102013 incorporates the MTCH6303 projected-capacitive touch controller and the MTCH652 high-voltage driver on a controller board, and includes a transparent, 8″ ITO touch panel for easy demonstration of the MTCH6303’s touch-controller capabilities and supporting graphical user interface (GUI) functionality.

Microchip’s free MTCH6303 GUI provides complete access to the configuration and tuning parameters. Advanced visualization windows give users at all experience levels easy-to-comprehend feedback, accelerating design integration for fast time to market.

Additionally, Microchip empowers designers by providing access to the firmware library, to enable further customizations for maximum design flexibility and control.

The new MTCH6303 is available today in 64-pin QFN and TQFP packages, for sampling and volume production. Pricing starts at $2.46 each, in 10,000-unit quantities.

The MTCH652 is available today in 28-pin QFN, SOIC and SSOP packages, for sampling and volume production. Pricing starts at $1.04 each, in 10,000-unit quantities. The MGC3130 is available in a 28-pin QFN package for sampling and volume production. Pricing starts at $2.26 each in 10,000-unit quantities.

Source: Microchip Technology

User Interface Innovation: Collaborative Navigation in Virtual Search and Rescue

An engineering team from Virginia Tech’s Center for Human-Computer Interaction and Department of Computer Science recently won first place in the IEEE’s 2012 3DUI Contest for its Collaborative Augmented Rescue Navigation and Guidance Escort (CARNAGE) project. The project was designed to enable emergency responders to collaborate and safely navigate a dangerous environment, such as a disaster area.

The contest was open to researchers, students, and professionals working on 3-D user interface technologies. Entrants were challenged to design an application that enables two users—each situated in a different location with his or her own UI—to navigate a 3-D environment without speaking to each other.

Collaborative Augmented Rescue Navigation and Guidance Escort UI (Source: Virginia Tech News, YouTube)

The Virginia Tech team—comprising Felipe Bacim, Eric Ragan, Siroberto, and Cheryl Stinson—described their design in a concise system description, which is currently available on the 3DUI 2012 contest website:

Our task specifically looks at communication between a scene commander and a disaster relief responder during a search and rescue operation. The responders inside the environment have great difficulty navigating because of hazards, reduced visibility, disorientation, and lack of survey knowledge of the environment. Observing the operation from outside of the disaster area, scene commanders work to help coordinate the response effort [1, 2]. With the responder’s notifications about the environment, scene commanders can provide new instructions, alert the responders to risks, and issue evacuation orders. Since neither the commander nor the responder has complete information about the environment, effective communication is essential.

As technology advances, the incorporation of new tools into search and rescue protocols shows promise for improving operation efficiency and safety. In this research, we explore the use of 3D user interfaces to assist collaborative search-and-rescue. Ideally, users should be able to focus on their primary tasks in the VE, rather than struggle with travel and wayfinding. Using virtual reality (VR) as a prototyping testbed, we implemented a proof-of-concept collaborative guidance system. Preliminary evaluation has demonstrated promising results for efficient rescue operations.

The team also created a 6:16-minute explanatory project video:

More information about the IEEE Symposium on 3D User Interfaces in Costa Mesa, CA, is available on the symposium website.

Source: Virginia Tech News