Mobile Sensing
  • The rise of autonomous machines sees a corresponding rise in the importance of sensing.
  • For autonomous vehicles, sensors include cameras, radar and lidar (radar bounces radio waves off objects, while lidar uses laser pulses).
  • A sensing market leader is Tesla, which has vehicles on the road at the front line. Unlike competitors Waymo and Mobileye, which combine cameras, radar and lidar, Tesla is moving towards using cameras only.
  • Tesla's camera-only move resolves a sensor-fusion dilemma: in field tests, when the camera says go and the radar says don't, which does the AI defer to? Tesla's answer: use cameras only, and make them really good. (Intuitively, the vehicle should go only when both camera and radar say go, but this conservative rule breaks down if the sensors disagree too often, leaving the vehicle stalled; see the sketch after this list.)
  • A camera-only approach also more closely approximates human drivers, who have no radar or lidar senses, making the AI's perception more human-like. It also translates well to drones, low-cost vehicles and the like, since lidar and radar units are bulky and expensive.
  • There are still many challenges. Machine learning thrives on training data, so a car trained in one city may not work well in another where the built environment, vehicle mix, rules and driving culture are radically different.
  • Sensing is a big field. Gadgets like smartphones sense too, but they do little for the end-user directly (except perhaps provide health signals) and are probably used mainly for data collection. For vehicles, sensing will make mobility safer and cleaner.
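
To make the disagreement problem concrete, here is a minimal Python sketch contrasting conservative AND-fusion with a camera-only policy. All names, thresholds and readings here are hypothetical, for illustration only, not any vendor's actual logic.

    # Illustrative sketch of the camera/radar disagreement problem.
    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        says_go: bool      # the sensor's decision: proceed or not
        confidence: float  # 0.0 .. 1.0

    def fuse_and(camera: SensorReading, radar: SensorReading) -> bool:
        # Conservative AND-fusion: proceed only if both sensors agree on "go".
        # Safe in principle, but if the sensors disagree often, the vehicle
        # stalls constantly -- the failure mode noted in the bullet above.
        return camera.says_go and radar.says_go

    def camera_only(camera: SensorReading, min_confidence: float = 0.9) -> bool:
        # Camera-only resolution (as described above): trust a single,
        # high-quality sensor and gate on its confidence, instead of
        # arbitrating between conflicting modalities.
        return camera.says_go and camera.confidence >= min_confidence

    # A disagreement case: camera says go, radar says stop.
    cam = SensorReading(says_go=True, confidence=0.95)
    rad = SensorReading(says_go=False, confidence=0.60)

    print(fuse_and(cam, rad))  # False -- the vehicle stalls on every conflict
    print(camera_only(cam))    # True  -- one sensor, one answer

With AND-fusion, every camera/radar conflict halts the vehicle; the camera-only policy trades that arbitration problem for a dependence on one very good sensor.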

Self-Driving Startups