We have secured proprietary technology for the entire process of perception, decision, and control, the core technologies of autonomous driving.
We have built a demand-driven business model across the entire mobility industry, from vehicle mass production to the delivery of high-quality services and expertise.
Core Tech I
a2z has secured proprietary technology for the entire process of perception, decision, and control in autonomous driving.
Perception
Detects the surrounding environment and objects using LiDAR, Radar, and Camera sensors and precisely determines the vehicle's location through accurate positioning.
Decision
Analyzes, predicts, and evaluates the current scenario using perception data to decide on the best driving strategy.
Control
Controls the vehicle accurately based on perception and decision results to safely reach the destination.
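As a minimal illustration of how such a perception-decision-control cycle fits together, the sketch below wires a simplified decision and control step behind a perception output; the class names, thresholds, and logic are illustrative assumptions, not a2z's actual software.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    """One object reported by the fused LiDAR/radar/camera perception stack."""
    label: str                        # e.g. "vehicle", "pedestrian"
    position_m: Tuple[float, float]   # (x, y) in the ego frame, metres

@dataclass
class DrivingCommand:
    """Normalized actuator command produced by the control step."""
    steering_rad: float
    throttle: float   # 0.0 .. 1.0
    brake: float      # 0.0 .. 1.0

def decide(detections: List[Detection], safe_gap_m: float = 20.0) -> str:
    """Decision step: pick a driving strategy from the perceived scene (assumed rule)."""
    ahead = [d for d in detections
             if 0.0 < d.position_m[0] < safe_gap_m and abs(d.position_m[1]) < 1.5]
    return "brake" if ahead else "keep_lane"

def control(strategy: str) -> DrivingCommand:
    """Control step: map the chosen strategy to actuator commands (assumed values)."""
    if strategy == "brake":
        return DrivingCommand(steering_rad=0.0, throttle=0.0, brake=0.8)
    return DrivingCommand(steering_rad=0.0, throttle=0.3, brake=0.0)

def autonomy_step(detections: List[Detection]) -> DrivingCommand:
    """One perception -> decision -> control cycle, given the perception output."""
    return control(decide(detections))

# Example: a pedestrian 12 m ahead in the ego lane triggers braking.
cmd = autonomy_step([Detection("pedestrian", (12.0, 0.2))])
assert cmd.brake > 0.0
```

In a real stack each stage runs continuously on dedicated hardware and consumes fused sensor data rather than a ready-made detection list; the sketch only shows how the three stages hand results to each other.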
Light HD Map
Location Perception
LiDAR Signal Processing & Sensor Fusion
Decision & Precise Control
Core Tech II
Achieved high safety through overlapping sensor coverage from the fusion of LiDAR, radar, and camera sensors.
| Sensor | Front | Side | Rear |
| --- | --- | --- | --- |
| Camera | 60°, 110°, 60° | 110° | 110° |
| LiDAR | 135°, 300m | 360°, 200m | 135°, 300m |
| Radar | ±10°, 220m / ±45°, 60m | ±75°, 100m | |
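As a rough illustration of how overlapping fields of view yield redundant coverage, the sketch below checks which sensors can see a given bearing around the vehicle; the angular sectors are simplified assumptions loosely based on the table above, not the exact mounting geometry.

```python
# Assumed angular coverage per sensor (degrees, ego-centred bearings;
# 0° = straight ahead, positive = counter-clockwise). Illustrative only.
SENSOR_SECTORS = {
    "camera": [(-115.0, 115.0), (65.0, 180.0), (-180.0, -65.0)],  # front + rear-side views
    "lidar":  [(-180.0, 180.0)],                                  # 360° coverage
    "radar":  [(-45.0, 45.0), (65.0, 115.0), (-115.0, -65.0)],    # front + side sectors
}

def sensors_covering(bearing_deg: float):
    """Return the sensors whose assumed field of view contains the given bearing."""
    return [sensor for sensor, sectors in SENSOR_SECTORS.items()
            if any(lo <= bearing_deg <= hi for lo, hi in sectors)]

# A target 30° to the left of straight ahead is seen by camera, LiDAR, and radar,
# so a single sensor failure still leaves overlapping coverage in that direction.
print(sensors_covering(30.0))
```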
Camera
AI deep learning-based proprietary camera for autonomous vehicles.
Sensor
Achieves redundancy through 360-degree perception.
Controller
Dual redundancy of key autonomous driving controllers for safety.
a2z Vision
Achieved system stability optimized for our software by
integrating self-developed cameras for autonomous driving.
a2z Vision
Self-developed cameras with two fields of view capture both front and rear-side images.
Graphics Processing Unit
Performs image processing and runs image-processing software on an NVIDIA Xavier GPU.
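As a purely illustrative sketch of pairing two camera streams with different fields of view, assuming generic OpenCV capture devices; the device indices are assumptions, not the a2z Vision hardware interface.

```python
import cv2  # OpenCV, commonly used for camera capture and image handling

# Assumed device indices: 0 = front wide-FOV camera, 1 = rear-side camera.
front_cam = cv2.VideoCapture(0)
rear_side_cam = cv2.VideoCapture(1)

def grab_frame_pair():
    """Grab one frame from each camera; returns None if either capture fails."""
    ok_front, front = front_cam.read()
    ok_rear, rear = rear_side_cam.read()
    if not (ok_front and ok_rear):
        return None
    return front, rear

pair = grab_frame_pair()
if pair is not None:
    front_frame, rear_frame = pair
    print(front_frame.shape, rear_frame.shape)  # downstream image processing runs on the GPU
```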
a2z Controller
A self-developed autonomous driving controller with a redundancy system for true Level 4 autonomous driving. Even if one controller fails unexpectedly during autonomous driving, the auxiliary controller takes over, ensuring high safety.
Main Controller
Perception-Decision-Control Full-Stack Algorithm
Sub Controller
Image Recognition Deep Learning
Autonomous Driving Redundancy Algorithm
CMU: Camera Multiplexing Unit
TCU: Telematics Control Unit
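A minimal sketch of the failover behavior such a dual-controller setup implies, assuming a heartbeat-style health check; the class, timing values, and switching rule are illustrative assumptions, not a2z's actual implementation.

```python
import time

class ControllerMonitor:
    """Switches to the sub controller when the main controller stops sending heartbeats."""

    def __init__(self, timeout_s: float = 0.2):
        self.timeout_s = timeout_s               # assumed heartbeat timeout
        self.last_heartbeat = time.monotonic()
        self.active = "main"

    def on_heartbeat(self) -> None:
        """Called whenever the main controller reports that it is alive."""
        self.last_heartbeat = time.monotonic()

    def select_active_controller(self) -> str:
        """Return which controller should drive the vehicle right now."""
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.active = "sub"                  # main controller presumed failed
        return self.active

monitor = ControllerMonitor()
monitor.on_heartbeat()
print(monitor.select_active_controller())  # "main" while heartbeats arrive on time
time.sleep(0.3)                            # simulate missed heartbeats
print(monitor.select_active_controller())  # "sub" after the timeout elapses
```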
Operations
We have completed demonstrations in more than 13 cities in Korea, and a2z aims to expand into even more regions.
Seoul
Pangyo
Anyang
Gwangju
Sejong
Ulsan
Daegu
Kyungil University
Hwaseong
Incheon
Platforms
Achieved the versatility to operate the highest number of autonomous vehicle models globally (13) with a single software stack.
Micro Car
Sedan
SUV (Sport Utility Vehicle)
Van
PBV (Purpose-Built Vehicle)
Small Bus
Full-sized Bus
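One common way a single software stack can drive many different vehicle models is to hide platform differences behind a shared actuation interface; the sketch below illustrates that pattern under assumed names and limits, and is not a2z's actual architecture.

```python
from abc import ABC, abstractmethod

class VehiclePlatform(ABC):
    """Common actuation interface that the shared autonomy software talks to."""

    @abstractmethod
    def apply(self, steering_rad: float, throttle: float, brake: float) -> None:
        """Send normalized commands to this platform's actuators."""

class SmallBusPlatform(VehiclePlatform):
    MAX_STEERING_RAD = 0.45   # assumed platform-specific steering limit

    def apply(self, steering_rad, throttle, brake):
        steering = max(-self.MAX_STEERING_RAD, min(self.MAX_STEERING_RAD, steering_rad))
        print(f"bus command: steer={steering:.2f} throttle={throttle:.2f} brake={brake:.2f}")

class SedanPlatform(VehiclePlatform):
    MAX_STEERING_RAD = 0.60   # assumed platform-specific steering limit

    def apply(self, steering_rad, throttle, brake):
        steering = max(-self.MAX_STEERING_RAD, min(self.MAX_STEERING_RAD, steering_rad))
        print(f"sedan command: steer={steering:.2f} throttle={throttle:.2f} brake={brake:.2f}")

# The same autonomy software issues identical commands to different platforms.
for platform in (SmallBusPlatform(), SedanPlatform()):
    platform.apply(steering_rad=0.5, throttle=0.2, brake=0.0)
```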