Robotics & Autonomy
Powered By PaleBlue
Simulation tools are widely used to model robotic and human actors within a virtual environment. PaleBlue Robotics Tools lets engineers develop and validate robots in a simulated space before they are tested in the field. PaleBlue provides both the validation platform and services on top of it.
Working with 3D and Simulation since the 1990s
World-class solutions in 3D, VR and Simulators
Agile process for cost efficiency & solid delivery
Project portfolio includes world-leading industrial companies
Simulated Sensors & Motors
The environment can be scanned and turned into a navigable 3D model. PaleBlue provides virtual sensors (LiDAR, depth cameras, RGB) and virtual motors (including tracks and drivetrains). Robots can thus sense the virtual environment and the actors in it, and act within it.
It is therefore possible to run the robot in a virtual environment and validate both the robot design and the autonomy AI.
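To make the virtual-sensor idea concrete, here is a minimal, hypothetical sketch of a 2D LiDAR-style range sensor: rays are marched outward from the robot pose until they hit a simulated obstacle. All names (`scan_lidar`, the circle-obstacle model) are illustrative assumptions, not PaleBlue APIs.

```python
import math

def scan_lidar(pose, obstacles, num_rays=8, max_range=10.0, step=0.05):
    """Cast rays from pose (x, y, heading) and return one range per ray.

    Obstacles are (cx, cy, radius) circles; ranges clip to max_range.
    """
    x, y, heading = pose
    ranges = []
    for i in range(num_rays):
        angle = heading + 2 * math.pi * i / num_rays
        dist = max_range
        r = 0.0
        while r < max_range:  # march along the ray until something is hit
            px = x + r * math.cos(angle)
            py = y + r * math.sin(angle)
            if any((px - cx) ** 2 + (py - cy) ** 2 <= rad ** 2
                   for cx, cy, rad in obstacles):
                dist = r
                break
            r += step
        ranges.append(dist)
    return ranges

# A single obstacle 3 m ahead of the sensor: the forward ray hits it,
# the backward ray sees free space out to max_range.
print(scan_lidar((0.0, 0.0, 0.0), [(3.0, 0.0, 0.5)], num_rays=4))
```

A real implementation would raycast against the scanned 3D mesh rather than analytic circles, but the interface (pose in, ranges out) is the same idea.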
Robot AI Training
An important task is training the robot AI to operate in the chosen environment. PaleBlue tools generate realistic visuals and provide means for class segmentation, human skeleton extraction, and more, so that sensor performance can be validated. The same is done for validating motions: forces are reproduced and force feedback is captured.
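As a rough sketch of how class segmentation labels can come out of a simulator: a renderer can emit a per-pixel object-ID buffer, which a label map turns into a semantic mask. The mapping and function below are hypothetical, for illustration only.

```python
# Assumed mapping from rendered object IDs to semantic class labels.
OBJECT_CLASSES = {
    0: "background",
    7: "human",
    12: "robot_arm",
}

def segmentation_mask(id_buffer):
    """Map each rendered object ID in a 2D buffer to its class name."""
    return [[OBJECT_CLASSES.get(obj_id, "unknown") for obj_id in row]
            for row in id_buffer]

# A tiny 2x3 "frame" from the simulated camera's ID buffer.
frame = [[0, 0, 7],
         [0, 12, 7]]
print(segmentation_mask(frame))
```

Because the simulator knows exactly which object produced each pixel, these labels are perfect ground truth, which is the main advantage over hand-annotated real footage.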
Several robots can run in the same simulated space, with PaleBlue technology handling synchronization of robot states across the network. This allows developers to build complex robot coordination and collaboration routines.
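The state-synchronization idea can be sketched in a few lines: each robot publishes its state, and every peer reads back the same shared snapshot. This toy in-memory version (names like `StateSync` are assumptions, not the actual networking layer) uses JSON serialization as a stand-in for wire transport.

```python
import json

class StateSync:
    """Toy stand-in for networked robot-state synchronization:
    each robot publishes its state; all peers see the latest copy."""

    def __init__(self):
        self._states = {}

    def publish(self, robot_id, state):
        # Serialize as the state would travel over the wire.
        self._states[robot_id] = json.dumps(state)

    def snapshot(self):
        # Every peer deserializes the same shared world state.
        return {rid: json.loads(s) for rid, s in self._states.items()}

sync = StateSync()
sync.publish("ugv_1", {"x": 1.0, "y": 2.0})
sync.publish("uav_1", {"x": 0.0, "y": 5.0, "z": 12.0})
print(sync.snapshot())
```

A production system would additionally handle latency, dropped packets, and conflict resolution, but the publish/snapshot contract is the core of the coordination model.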
Simulated environments allow robots to interact with human participants. We connect robotic actors with human users wearing VR headsets – all within the same simulation environment. This enables multi-agent modeling of complex operations in the virtual world.
Unity As Foundation
Unity 3D Engine, while known by many as a game development toolkit, is also a very solid foundation for simulators, computer vision, and robotics. PaleBlue Robotics runs on top of Unity, getting all of the platform’s benefits.
Fully Connected to ROS
Connect to a ROS environment to publish topics from sensors and subscribe to topics for motors. Create a fully controllable digital twin of your robot and test it in the virtual PaleBlue environment.
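The publish/subscribe pattern behind ROS topics can be illustrated with a minimal in-process sketch. This is deliberately not the real ROS API (`rospy`/`rclpy` would replace the `TopicBus` class below); it only shows the data flow between a sensor-driven publisher and a motor-side subscriber.

```python
class TopicBus:
    """In-process sketch of the ROS publish/subscribe pattern.
    Not the real ROS API; rospy or rclpy would replace this."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for cb in self._subs.get(topic, []):
            cb(msg)

bus = TopicBus()
received = []
# The motor controller subscribes to velocity commands...
bus.subscribe("/cmd_vel", received.append)
# ...and a planner fed by virtual sensors publishes to the same topic.
bus.publish("/cmd_vel", {"linear": 0.5, "angular": 0.0})
print(received)
```

In an actual ROS setup, the simulated sensors would publish standard message types (for example `sensor_msgs/LaserScan`), and the same motor-control node could drive either the digital twin or the physical robot.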
Create a completely virtual 3D environment and generate large volumes of training data. Use virtual sensors such as LIDAR to sense the virtual environment and train your models.
Create separate ground-truth topics to support reinforcement learning.
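One way to picture paired sensor and ground-truth streams: the simulator knows the true value behind every noisy reading, so each training sample can carry both. The function below is a hypothetical sketch (seeded Gaussian noise stands in for a sensor model).

```python
import random

def make_training_pairs(true_distances, noise=0.05, seed=0):
    """Pair each simulated noisy 'sensor' reading with its ground-truth
    value, as two aligned streams (sensor topic + ground-truth topic)."""
    rng = random.Random(seed)  # seeded for reproducible datasets
    sensor, truth = [], []
    for d in true_distances:
        sensor.append(d + rng.gauss(0.0, noise))  # what the model sees
        truth.append(d)                            # what it should learn
    return sensor, truth

sensor, truth = make_training_pairs([1.0, 2.0, 3.0])
```

Publishing the two streams as separate topics keeps the learner's inputs cleanly apart from the supervision signal, which is what makes reward or loss computation straightforward.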