
Multi-Sensory State Estimation


By outfitting physical systems with networks of diverse sensor types, a wealth of localized information can be obtained about those systems. Although each local signal can be monitored efficiently with conventional methods, fusing these data intelligently yields much richer interpretations of the systems' states. SACL investigates applications in which multi-sensory perception opens new possibilities by revealing latent characteristics of such systems, and develops artificial intelligence techniques tailored to them.

SACL has developed a flight state estimation framework, named fly-by-feel, that leverages the high-dimensional, multimodal nature of sensor network data. The i-FlyNet model architecture, which acts as the backbone classifier of this framework, combines conventional signal processing with modern deep learning techniques to make the richest possible inference from this unique sensory data. SACL's fly-by-feel powered morphing wing not only predicts stall for safe flight but also estimates the wing shape and angle of attack that maximize flight efficiency.
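The two-stage pattern described above can be sketched as follows. This is a minimal illustration, not the actual i-FlyNet implementation: the band definitions, network size, and (untrained, random) weights are all assumptions chosen for demonstration. A conventional signal-processing stage compresses multichannel sensor time series into spectral band energies, and a small learned head maps those features to a stall probability and an angle-of-attack estimate.

```python
import numpy as np

def spectral_features(signals, fs=1000.0, bands=((0, 50), (50, 150), (150, 500))):
    """Signal-processing stage: per-channel band energies.

    signals: (n_channels, n_samples) array of sensor time series.
    Returns a flat feature vector of length n_channels * n_bands.
    """
    n_channels, n_samples = signals.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectra = np.abs(np.fft.rfft(signals, axis=1)) ** 2
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(spectra[:, mask].sum(axis=1))
    return np.concatenate(feats)

def mlp_head(x, weights):
    """Learned stage (illustrative): a tiny MLP producing a stall
    probability and an angle-of-attack estimate from the features."""
    (w1, b1), (w2, b2) = weights
    h = np.tanh(x @ w1 + b1)
    out = h @ w2 + b2
    stall_prob = 1.0 / (1.0 + np.exp(-out[0]))  # sigmoid -> probability
    aoa_deg = out[1]                            # unconstrained regression
    return stall_prob, aoa_deg

# Demo on synthetic data. The weights are random and untrained, so the
# outputs are only structurally meaningful, not physically calibrated.
rng = np.random.default_rng(0)
signals = rng.standard_normal((6, 1024))  # 6 sensors, 1024 samples
x = spectral_features(signals)            # 6 channels * 3 bands = 18 features
weights = [(rng.standard_normal((x.size, 16)) * 0.01, np.zeros(16)),
           (rng.standard_normal((16, 2)) * 0.01, np.zeros(2))]
stall_prob, aoa = mlp_head(x, weights)
```

In a real system the MLP weights would be fit on labeled flight data; the design point is that the hand-crafted spectral stage keeps the learned head small and data-efficient.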

Another application SACL is developing is a "smart skin" with an embedded stretchable sensor network that can conform to surfaces of arbitrary shape (e.g., robotic arms or hands). Building on this smart skin technique, SACL is developing an active tactile sensing method to detect and diagnose local contact and slip conditions on robotic arms and hands for dexterous manipulation.
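A minimal sketch of per-taxel contact/slip classification is shown below. This is an assumed heuristic for illustration, not SACL's method: each sensing element ("taxel") of the hypothetical skin reports a pressure history, sustained pressure indicates contact, and strong high-frequency pressure fluctuation (measured here as the standard deviation of the first difference) is flagged as slip.

```python
import numpy as np

def contact_state(taxel_pressure, contact_thresh=0.2, slip_thresh=0.05):
    """Classify each taxel as 'no_contact', 'stable', or 'slip'.

    taxel_pressure: (n_taxels, n_samples) pressure histories.
    Heuristic: a taxel is in contact if its mean pressure exceeds
    contact_thresh; a contacted taxel is slipping if its
    high-frequency flutter (std of the first difference) exceeds
    slip_thresh. Thresholds are illustrative, not calibrated.
    """
    mean_p = taxel_pressure.mean(axis=1)
    flutter = np.diff(taxel_pressure, axis=1).std(axis=1)
    states = []
    for m, f in zip(mean_p, flutter):
        if m < contact_thresh:
            states.append("no_contact")
        elif f > slip_thresh:
            states.append("slip")
        else:
            states.append("stable")
    return states

# Synthetic demo: taxel 0 untouched, taxel 1 in steady contact,
# taxel 2 in contact with slip-like 80 Hz vibration.
t = np.linspace(0, 1, 500)
rng = np.random.default_rng(1)
p = np.stack([
    0.02 * rng.standard_normal(500),         # ambient noise only
    1.0 + 0.005 * rng.standard_normal(500),  # quiet, steady grasp
    1.0 + 0.3 * np.sin(2 * np.pi * 80 * t),  # vibration during slip
])
states = contact_state(p)
```

An active sensing scheme would go further, e.g. injecting a probe signal and analyzing the skin's response, but the passive threshold test above captures the basic detection idea.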