Vehicle-based vision algorithms, such as collision-alert systems, can interpret a scene in real time and give drivers immediate feedback. However, because these technologies rely on cameras mounted on the car itself, their view is limited to the car's immediate vicinity, which severely constrains their potential: they cannot find empty parking slots, route around traffic jams, or warn about dangers beyond the car's surroundings. An intelligent driving system augmented with additional sensors and network inputs could significantly reduce the number of accidents, ease traffic congestion, and improve the safety and quality of people's lives. We propose an open-source system, called Fleye, consisting of an autonomous drone (nano quadrotor) that carries a wireless camera and flies a few meters in front of and above the car. The video is streamed in real time from the quadcopter to Amazon's EC2 cloud, together with information about the state of the driver, the drone, and the car. The output is then transmitted to the driver's smart glasses. Control of the drone, as well as sensor-data collection from the driver, is handled by a low-cost (under $30) minicomputer. Most of the computation is done in the cloud, which provides greater computational capability and allows straightforward integration of multi-vehicle behaviour and additional sensors.
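
To make the drone's positioning concrete, the following is a minimal sketch (not the system's actual controller) of how a target setpoint "a few meters in front of and above the car" could be computed from the car's planar position and heading. The function name, parameters, and default offsets are hypothetical, chosen only for illustration.

```python
import math

def drone_target(car_x, car_y, heading_rad, forward_m=5.0, altitude_m=3.0):
    """Hypothetical setpoint computation: place the drone a fixed
    distance ahead of the car along its current heading, at a fixed
    altitude. Positions are in meters, heading in radians."""
    return (car_x + forward_m * math.cos(heading_rad),
            car_y + forward_m * math.sin(heading_rad),
            altitude_m)
```

In a real system this setpoint would be recomputed every control cycle from the car's GPS/IMU data and sent to the quadrotor's position controller over the radio link.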