Laserscape — understanding how autonomous vehicles perceive the city
As autonomous vehicles get closer to reality, it is important to think about how we will interact with these machines. First, we need to know more about them: how they work, and how they perceive us and their environment. While working on an autonomous boat project at the MIT Senseable City Lab, we realized that the sensing data the boat uses to navigate could also help people understand how the boat perceives its surroundings. So we made a video about just that.
Process
Seeing the city through 16 laser beams
To make sense of its environment, the boat uses a sensor (Lidar) and some computing power (computer vision) to create a map of its surroundings. The Lidar sensor uses 16 laser beams bouncing off surfaces to generate a 3D map made of point clouds. When testing the prototypes with the engineering team, we realized how visually appealing the point-cloud map is.
Through the Lidar data we saw the possibility of starting a conversation about our relationship with autonomous vehicles.
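To give a feel for how a 16-beam Lidar produces a point cloud: each return is a distance measured along a known beam direction (a vertical beam angle plus the sensor's horizontal rotation), and converting those spherical readings to Cartesian coordinates yields the 3D points. The sketch below is purely illustrative; the beam angles, step size, and ranges are made-up values modeled on a typical 16-beam sensor, not the boat's actual configuration.

```python
import numpy as np

def lidar_returns_to_points(ranges, azimuths, elevations):
    """Convert Lidar range readings (spherical coordinates) to 3D points.

    ranges     : measured distance to the nearest surface, in meters
    azimuths   : horizontal rotation angle of each return, in radians
    elevations : vertical angle of the beam that produced the return, in radians
    """
    x = ranges * np.cos(elevations) * np.cos(azimuths)
    y = ranges * np.cos(elevations) * np.sin(azimuths)
    z = ranges * np.sin(elevations)
    return np.stack([x, y, z], axis=-1)

# Illustrative 16-beam sensor sweeping one full rotation in 0.2° steps.
elev_angles = np.deg2rad(np.linspace(-15, 15, 16))   # 16 beams spread vertically
azimuth_steps = np.deg2rad(np.arange(0, 360, 0.2))   # one full horizontal sweep
elev_grid, az_grid = np.meshgrid(elev_angles, azimuth_steps)
ranges = np.full(elev_grid.shape, 10.0)              # pretend every surface is 10 m away
points = lidar_returns_to_points(ranges, az_grid, elev_grid)
```

One sweep like this already yields 1,800 × 16 points; stacking sweeps as the sensor spins continuously is what builds up the dense maps shown in the video.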
Lidar data in its original habitat
We started gathering data from the engineering team and worked on a script for the video, a perfect medium for telling the story while showing the beautiful Lidar point clouds. We also added an important section on computer vision: less spectacular, but essential to understanding how the boat makes decisions. After many weeks of testing and rendering in both 3D and video-editing software, we ended up with a short 1′40″ video explaining how our autonomous boat navigates the canals of Amsterdam.
First Lidar data experiments
Amsterdam point clouds map
Boat’s point of view of a canal; color is based on distance to the sensor, with water reflections added
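The distance-based coloring mentioned in the caption above can be sketched as a simple mapping from each point's distance to a color. This is a hypothetical linear near-to-far ramp with made-up distance bounds, not the palette actually used in the video:

```python
import numpy as np

def color_by_distance(points, near=2.0, far=50.0):
    """Map each point's distance from the sensor (at the origin) to an RGB color.

    Nearby points come out warm (red), distant ones cool (blue), via a
    simple linear ramp clipped to the [near, far] range.
    """
    dist = np.linalg.norm(points, axis=-1)
    t = np.clip((dist - near) / (far - near), 0.0, 1.0)  # 0 = near, 1 = far
    # Linear red-to-blue ramp in RGB.
    return np.stack([1.0 - t, np.zeros_like(t), t], axis=-1)

# A point 2 m away and a point 50 m away, along the sensor's x axis.
points = np.array([[2.0, 0.0, 0.0], [50.0, 0.0, 0.0]])
colors = color_by_distance(points)  # first point fully red, second fully blue
```

In practice the ramp would be swapped for a richer colormap, but the principle is the same: distance becomes color, which is what gives the frames their depth.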
What I learned
Beauty can help understand complexity
What I find fascinating in this project is that the same sensor the complex machine uses to understand its environment can be used by us to understand how that complex machine works. In a way, the Lidar, through its visual aspect, works both ways, helping the boat and the human understand each other. And we were able to generate that understanding thanks to the beauty of the point clouds, which show us a familiar but magical world.
Beauty can help us understand the complex world we live in.
Credits
MIT Senseable City Lab Director: Carlo Ratti
Project Lead: Fábio Duarte
Research & Video editing: Louis Charron
Research & Visualization: Iñigo Lorente-Riverola, Bill Cai
Design: Irene de la Torre, Lenna Johnsen
© All rights reserved, MIT Senseable City Lab
Please get in touch before using the images or video.