Recognition of objects (pedestrians, vehicles) is especially important for autonomous vehicles. We use three types of sensors: LIDARs, cameras and radars. In our research we apply both traditional and artificial-intelligence-based solutions.
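The detections from different sensor types have to be combined into a single object list. The following minimal sketch shows one simple way to do this: pairing camera and LIDAR detections of the same class that lie close together. The `Detection` class, the averaging step and the 1 m gating threshold are assumptions made for this illustration, not the center's actual pipeline.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Detection:
    label: str   # e.g. "pedestrian" or "vehicle"
    x: float     # position in the vehicle frame [m]
    y: float

def fuse(camera_dets, lidar_dets, gate=1.0):
    """Pair camera and LIDAR detections of the same class that are
    closer than `gate` metres, averaging their position estimates."""
    fused = []
    for c in camera_dets:
        for l in lidar_dets:
            if c.label == l.label and hypot(c.x - l.x, c.y - l.y) < gate:
                fused.append(Detection(c.label, (c.x + l.x) / 2, (c.y + l.y) / 2))
                break
    return fused
```

A real fusion stack would also weight each sensor by its measurement uncertainty; the plain average above is only meant to show the matching idea.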
JKK-ZalaZONE Vehicle Industry Test Center
Function of the Center
Our main research areas are the development and education of autonomous transport systems and the understanding of the collaboration between people and vehicles. Fully self-driving technology will lead to safe and sustainable transportation, so we are already preparing for this future technology through our research on autonomous vehicles. Through these developments we gain unique knowledge in informatics, mechatronics, robotics and artificial intelligence. We are working to make the transport of the future uncompromisingly safe and sustainable.
Head of the Center: Dr. András Háry
Contact – e-mail address: firstname.lastname@example.org
The current position of detected objects is essential information for the vehicle control algorithms. Using this information, the vehicle estimates the position and movement of a pedestrian, and the algorithm modifies the vehicle's original trajectory to avoid an accident.
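The idea can be sketched with a deliberately simple model: assume the pedestrian keeps a constant velocity, predict where they will be at each future time step, and flag the planned path for replanning if it passes too close. The constant-velocity assumption, the 0.1 s step and the 2 m safety margin are all illustrative choices, not the center's actual method.

```python
from math import hypot

def predict(pos, vel, dt):
    """Constant-velocity position prediction after `dt` seconds."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def must_replan(path, ped_pos, ped_vel, dt_per_step=0.1, safety=2.0):
    """True if any planned path point comes within `safety` metres of
    the pedestrian's predicted position at the corresponding time."""
    for i, (x, y) in enumerate(path):
        px, py = predict(ped_pos, ped_vel, i * dt_per_step)
        if hypot(x - px, y - py) < safety:
            return True
    return False
```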
First, the vehicle's computer filters the free-zone area out of the sensor data; the vehicle control algorithm then forms an image of the potentially traversable area and plans within it. One of our main research directions is the recognition of the free zone, which is a basic function of autonomous systems (vehicles and robots).
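On a coarse occupancy grid, free-zone extraction reduces to marking every cell that contains an obstacle point and treating the rest as traversable. The sketch below shows this in its simplest form; the grid size, 1 m resolution and 2D points are illustrative assumptions, whereas a real system works on dense 3D LIDAR point clouds.

```python
def free_zone(points, size=10, resolution=1.0):
    """Return the grid cells not occupied by any obstacle point.
    `points` are (x, y) obstacle coordinates in metres."""
    occupied = set()
    for x, y in points:
        cx, cy = int(x // resolution), int(y // resolution)
        if 0 <= cx < size and 0 <= cy < size:
            occupied.add((cx, cy))
    return [(cx, cy) for cx in range(size) for cy in range(size)
            if (cx, cy) not in occupied]
```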
In the field of artificial intelligence (AI) we develop neural networks (NNs), which are well suited to perception and decision-making. Our active research areas include the detection problems of sensor systems and real-time data processing.
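To make the computation concrete, here is one fully connected layer with a ReLU activation written out by hand, the basic building block that perception networks stack many times over. The weights and layout are arbitrary illustrative values, not anything from the center's models.

```python
def relu(v):
    """Elementwise rectified linear activation."""
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    """One fully connected layer: out_j = sum_i inputs[i] * weights[i][j] + biases[j]."""
    return [sum(i * w for i, w in zip(inputs, col)) + b
            for col, b in zip(zip(*weights), biases)]
```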
The area where the vehicle moves is described by an environmental representation. One of our main research areas is the automatic creation of this representation from vehicle sensor data. The HD map consists of a point-cloud map, traffic lanes and other signs; the point-cloud map contains far more information than a simple map.
One input of the HD map is the point cloud, which we generate from LIDAR data. Accurate position and orientation are required, which calls for simultaneous localization and mapping (SLAM) solutions.
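The map-building step can be sketched as follows: each LIDAR scan is rotated and translated from the sensor frame into the map frame using the vehicle pose (x, y, heading) estimated by SLAM, and the transformed points are accumulated into one cloud. The 2D pose and toy scans are simplifications for illustration.

```python
from math import cos, sin

def scan_to_map(scan, pose):
    """Transform scan points from the sensor frame to the map frame
    using the pose (x, y, heading in radians)."""
    x0, y0, th = pose
    return [(x0 + px * cos(th) - py * sin(th),
             y0 + px * sin(th) + py * cos(th)) for px, py in scan]

def build_map(scans_with_poses):
    """Accumulate all transformed scans into one point-cloud map."""
    cloud = []
    for scan, pose in scans_with_poses:
        cloud.extend(scan_to_map(scan, pose))
    return cloud
```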
Remote control is a typical autonomous vehicle function. It is traditionally solved with cameras, but we are also examining LIDAR-based solutions.
Localization is another of our active research areas. Modern GPS systems are not accurate enough for autonomous vehicle control, and they are unavailable in parking garages. These and similar problems are solved with other sensors (LIDAR, IMU); one promising direction is point-cloud-based localization of the vehicle.
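The core of point-cloud-based localization can be illustrated as a search for the pose under which the current scan best overlaps the stored map. The sketch below scores candidate translation offsets by nearest-point distance and keeps the best one; the brute-force grid search, 0.5 m step and translation-only pose are simplifications, since real systems use scan-matching methods such as ICP or NDT over full poses.

```python
def score(scan, map_pts, dx, dy):
    """Sum of squared distances from each shifted scan point to its
    nearest map point (lower means better overlap)."""
    total = 0.0
    for sx, sy in scan:
        total += min((sx + dx - mx) ** 2 + (sy + dy - my) ** 2
                     for mx, my in map_pts)
    return total

def localize(scan, map_pts, search=3, step=0.5):
    """Return the (dx, dy) offset that best aligns the scan to the map."""
    candidates = [(i * step, j * step)
                  for i in range(-search, search + 1)
                  for j in range(-search, search + 1)]
    return min(candidates, key=lambda c: score(scan, map_pts, *c))
```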
The free zone, the detected objects and the accurate localization are passed to the vehicle control algorithms, which design a smooth, traceable, collision-free and continuous trajectory. By trajectory we mean the complete route together with time and speed data.
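Such a trajectory can be represented as a list of timed waypoints that the controller samples at any moment. The `(t, x, y, v)` field layout and linear interpolation below are assumptions chosen for a minimal illustration; real planners typically use smoother spline or polynomial segments.

```python
def sample(trajectory, t):
    """Linearly interpolate position and speed at time `t`.
    `trajectory` is a list of (t, x, y, v) tuples sorted by time."""
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    for (t0, x0, y0, v0), (t1, x1, y1, v1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0), v0 + a * (v1 - v0))
    return trajectory[-1][1:]   # past the end: hold the last waypoint
```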
Our research also covers trajectory tracking under the kinodynamic constraints of the vehicle. We develop trajectory tracking algorithms with the aim that steering and braking movements are neither aggressive nor unpleasant for passengers.
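As one standard tracking technique, used here as an illustrative stand-in for the center's own algorithms, pure pursuit steers along the arc through a look-ahead point on the trajectory; rate-limiting the commanded curvature is one simple way to keep the steering gentle for passengers. The rate cap value would be tuned on a real vehicle.

```python
def pure_pursuit_curvature(target_x, target_y):
    """Curvature of the arc reaching a look-ahead point given in the
    vehicle frame (x forward, y left): kappa = 2 * y / d**2."""
    d2 = target_x ** 2 + target_y ** 2
    return 2.0 * target_y / d2

def limit_rate(new, prev, max_delta):
    """Cap the change per control step so steering stays smooth."""
    return max(prev - max_delta, min(prev + max_delta, new))
```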
Our research areas include traditional V2X-based systems and modern 5G mobile-network-based systems. The technology is well illustrated by a scenario in which a vehicle with distinctive markings (police car, ambulance) sends a message to the infrastructure that it wants to reach its destination as soon as possible. As a result of this vehicle-to-infrastructure communication, the vehicle travels on a green wave.
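The green-wave scenario can be reduced to a tiny message exchange. The message fields and the priority rule below are invented purely for illustration and do not follow any real V2X message standard (such as the ETSI ITS message sets).

```python
def make_priority_request(vehicle_id, vehicle_type, route):
    """Toy priority-request message from an emergency vehicle,
    listing the intersections along its intended route."""
    return {"id": vehicle_id, "type": vehicle_type, "route": route}

def grant_green_wave(msg, intersection_id):
    """The infrastructure grants priority to emergency vehicles whose
    route passes through this intersection."""
    return (msg["type"] in ("police", "ambulance")
            and intersection_id in msg["route"])
```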
Our autonomous test robots and test vehicles are equipped with special sensors. Accurate simulation is a difficult task, so we try out in the real systems those algorithms that run well in simulation. The researchers then carry out various tests with these programs and algorithms on the test track.
Our research center shares the acquired knowledge, algorithms and data wherever the nature of the work allows it. Our systems are based on the open-source ROS/ROS2 ecosystem, and our algorithms are developed in MATLAB/Simulink, LabVIEW, C++ and Python.