In our designs we draw on long experience in the mobile work machine industry: we account for harsh conditions and use technology that won't freeze at the first snowfall. We also constantly look for ways to place and pack our sensors so that the weather has minimal effect on their outputs.


Our system continuously uses multiple sensor modalities, visible light, invisible light, and radio signals at different frequencies, to identify objects around the vehicle. We call this Sensible Situational Awareness.


We use our extensive knowledge of mobile robot navigation to perfect the vehicle's position on the road. We do not rely on lane markings; we estimate the road itself. We fuse a lot: inertial measurements, wheel encoders, angle measurements, and GNSS in a tightly coupled way, and we use range measurements to correct our position against a known map.
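The fusion idea above can be sketched with a minimal one-dimensional Kalman filter: odometry drives the prediction and a GNSS fix corrects it. All numbers here (noise levels, time step, velocity) are illustrative assumptions, not our actual parameters.

```python
# Minimal 1-D Kalman filter sketch of the fusion idea: wheel-odometry
# velocity propagates the position estimate, a GNSS fix corrects it.
# Noise levels, dt, and velocity are illustrative assumptions.

def predict(x, p, v, dt, q):
    """Propagate position x with odometry velocity v; grow variance p by q."""
    return x + v * dt, p + q

def update(x, p, z, r):
    """Correct with a GNSS measurement z of variance r (Kalman gain k)."""
    k = p / (p + r)
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0  # initial position estimate and its variance
for step in range(10):
    x, p = predict(x, p, v=1.0, dt=0.1, q=0.05)      # dead reckoning drifts
    x, p = update(x, p, z=(step + 1) * 0.1, r=0.25)  # GNSS pulls it back

print(round(x, 3), round(p, 3))
```

In a real tightly coupled system the state would include heading and biases, and raw GNSS pseudoranges rather than position fixes would enter the update, but the predict/correct cycle is the same.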


We are specialists in control as well! We utilize a vehicle model to create predictions of the future while controlling the vehicle in the present. Our controller not only guides the vehicle along the desired trajectory but, together with the SA module, predicts future risks ahead of time.
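"Predict the future, control the present" can be illustrated with a toy receding-horizon controller: a simple kinematic model is rolled forward over a short horizon for a few candidate steering rates, and the one with the lowest predicted lateral error wins. The model, horizon, and candidate set are assumptions for illustration only.

```python
# Toy receding-horizon sketch: roll a kinematic point model forward and
# pick the candidate steering rate with the cheapest predicted future.
# Model, horizon length, and candidate values are illustrative assumptions.

def rollout(y, heading, steer_rate, dt=0.1, horizon=10, speed=5.0):
    """Predict lateral offset y over the horizon; return summed |offset|."""
    cost = 0.0
    for _ in range(horizon):
        heading += steer_rate * dt
        y += speed * heading * dt  # small-angle lateral drift
        cost += abs(y)
    return cost

def choose_steering(y, heading, candidates=(-0.2, -0.1, 0.0, 0.1, 0.2)):
    """Pick the candidate steering rate with the lowest predicted cost."""
    return min(candidates, key=lambda u: rollout(y, heading, u))

print(choose_steering(y=0.5, heading=0.0))  # offset to one side -> counter-steer
```

A production controller would optimize over a continuous input sequence with constraints, but the structure, simulate ahead, score, apply the first input, is the same.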


Did we mention that we are not perfect? None of us are, and no complex automation system ever will be. Therefore we are building remote operation capabilities into our automation. When our SA module decides that the risk of automated driving is too high, the vehicle simply slows down and hands control to a human operator, who can use their cognition for problem solving.


We believe that the future of urban mobility lies in autonomous shared mobility, such as light electric vehicles. Therefore, a Renault Twizy was chosen as the test platform. A standard Twizy was first instrumented, i.e. equipped with the actuators and sensors needed for full computer control (drive-by-wire). Perception of the environment is the basis of our proprietary autonomous driving algorithms, so the Twizy was fitted with a comprehensive sensor setup. 3D lidars at both front and rear cover 360 degrees around the vehicle up to 150 m. Long- and short-range radars around the vehicle detect static and moving obstacles from 0 to 200 m.

The camera system consists of six cameras: four look around the vehicle, while one camera and a thermal camera face forward to help our system detect and recognise objects in all conditions. On top of the vehicle is an RTK-GPS, which provides ~5 cm position accuracy globally, provided there is satellite visibility. The huge amount of data from these sensors is efficiently fused and processed by our algorithms, resulting in optimal control of the vehicle's steering, throttle and brake in all weather conditions.

"Juto" – means a sleigh pulling reindeer. It finds its way automatically to home even in the thickest snowstorm when the visibility of eye is limited.