We are excited to announce our initial round of beta partners building with the Mapbox Vision SDK. We’re giving Microsoft Azure, Elektrobit, and Nexar the first set of keys to the future of mapping and navigation.
The Vision SDK processes imagery directly on the device, turning your connected camera into a second set of eyes for your car. We are bringing visual context to our live location platform and redefining how machines and humans alike interact with the driving environment.
The public beta is available now at https://vision.mapbox.com/
For users, the Vision SDK unlocks augmented reality navigation, detection and segmentation of various road features, customizable safety alerts, and more. On the backend, the SDK feeds valuable road metadata back into the living map. Highly efficient neural networks run solely on the device, so network bandwidth needs are low.
Pairing AR and real-time data opens a new world of possibilities: fleet managers can empower drivers with enhanced situational awareness, ride-hailing apps can vividly flag pickup locations, and municipal services can immediately spot changes to infrastructure as they’re happening.
Power a “living” map of the world.
The nature of mapping has changed: you can start to build an HD map, but you can never finish. Even with specialized equipment and training — and fleets of survey vehicles with state-of-the-art technology that can build centimeter-accurate maps — a survey map is still static. It's out of date before the team finishes uploading the data.
Effectively mapping a fluid, real-time driving environment is a daunting problem of scale. The early days of autonomous vehicles featured self-driving cars with such massive sensing, computing, and networking requirements that they were effectively server-rooms-on-wheels. Today, drivers have everything they need to update and interact with a map right in their pockets.
The smartphone is now the most ubiquitous platform in the world for communication, computation, and sensing. The Vision SDK harnesses the power of these distributed sensors to unlock a revolution in live mapping. It identifies salient road features (such as signs, traffic lights, and lane information) and processes the data directly on-device. Changes to the driving environment are detected on the spot and uploaded to make low-latency, low-bandwidth updates to the living map.
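The update flow described above can be sketched in simplified form: the device compares what its model sees against its local map snapshot and uploads only the compact differences. The feature schema and function names below are illustrative assumptions, not the Vision SDK's actual API; the point is the bandwidth pattern.

```python
# Illustrative sketch of on-device change detection with delta uploads.
# All names and the feature schema are hypothetical, not the real
# Vision SDK interface: compare locally, upload only what changed.
import json

def detect_features(frame):
    """Stand-in for an on-device neural network: returns the road
    features recognized in one camera frame."""
    return frame["visible_features"]

def delta_update(local_map, frame):
    """Compare detections against the local map snapshot and build a
    compact update containing only additions and removals."""
    seen = {f["id"]: f for f in detect_features(frame)}
    known = {f["id"]: f for f in local_map}
    added = [f for fid, f in seen.items() if fid not in known]
    removed = [fid for fid in known if fid not in seen]
    return {"add": added, "remove": removed}

# One frame's worth of context: the map expects a stop sign and a
# speed-limit sign here, but the camera now sees a traffic light
# where the stop sign used to be.
local_map = [
    {"id": "sign-001", "kind": "stop_sign"},
    {"id": "sign-002", "kind": "speed_limit_50"},
]
frame = {
    "visible_features": [
        {"id": "sign-002", "kind": "speed_limit_50"},
        {"id": "light-777", "kind": "traffic_light"},
    ]
}

update = delta_update(local_map, frame)
payload = json.dumps(update).encode("utf-8")
# Only the delta travels over the network: tens of bytes,
# versus megabytes for raw imagery.
```

Because inference happens on the device, raw frames never leave it; only small, structured deltas like this one do, which is why bandwidth needs stay low.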
Build better navigation experiences for drivers.
Travelers, commuters, and even professional drivers face the challenge of navigating complex environments when they’re behind the wheel. The Vision SDK offers you the ability to marry augmented reality with a semantic understanding of the road scene, adding crucial context to a heads-up navigation experience.
Developers can also use the Vision SDK to create novel navigation features that improve the driving experience, including lane-level navigation, illumination of passenger pickup and drop-off locations, and alerts of regulatory signs and traffic incidents.
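The alert features above follow a common pattern: the app registers callbacks for the kinds of road events it cares about, and the detection pipeline fans recognized events out to them. The sketch below illustrates that pattern with hypothetical names; it is not the Vision SDK's actual interface.

```python
# Minimal sketch of the customizable-alerts pattern: register handlers
# per event kind, dispatch detected road events to them. All names are
# illustrative, not the real Vision SDK API.
from collections import defaultdict

class AlertDispatcher:
    def __init__(self):
        self._handlers = defaultdict(list)

    def on(self, kind, handler):
        """Register a callback for one kind of road event."""
        self._handlers[kind].append(handler)

    def dispatch(self, event):
        """Fan a detected event out to every registered handler."""
        for handler in self._handlers[event["kind"]]:
            handler(event)

alerts = []
dispatcher = AlertDispatcher()
dispatcher.on("speed_limit", lambda e: alerts.append(f"Limit: {e['value']} km/h"))
dispatcher.on("incident", lambda e: alerts.append(f"Incident ahead: {e['value']}"))

# Events as they might arrive from an on-device detector.
dispatcher.dispatch({"kind": "speed_limit", "value": 50})
dispatcher.dispatch({"kind": "incident", "value": "lane closure"})
```

Keeping the alert logic in app-registered callbacks is what makes the alerts customizable: the detector stays generic while each app decides which events matter and how to surface them.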
The Vision SDK is not just for mobile apps – it unleashes new functionality for embedded automotive systems as well. Elektrobit is incorporating augmented reality and live mapping applications into its infotainment and autonomous driving platform.
It takes a community to build the live map of the future, and it will take a community to unlock the radical potential of the Vision SDK. We’re excited to start partnering with developers across industries to build new applications that integrate our Vision SDK, and we’ll be rolling it out to more partners in the days and weeks to come.
Want early access? Apply for the beta today.