Robotic Simulator: How to Predefine a Scenario for the Robot (18/27)

In the previous lesson, we learned how to save and load different data in the simulator. In this part, we are going to design a complete, multi-part scenario for our simulator and define a separate function for each part. The corresponding lessons on building a robotic simulator in the Unity engine can be found at the link below (note that more lessons will be added gradually): https://www.mecharithm.com/category/a-robotic-simulator-on-unity/ Important: Robotic Simulator's Source Code and Sample Output: The source code for the entire tutorials (from 1 to 27) can be downloaded HERE! And a test output for Windows can…
Read More
Robotic Simulator: How to Save/Load the Values and Show Them as a Table (17/27)

In the previous part, we got acquainted with implementing different types of weather and day and night conditions in the simulator. In this lesson, we will learn how to save and load essential data in the simulator and how to apply it in the main menu and other parts of the simulator. The corresponding lessons on building a robotic simulator in the Unity engine can be found at the link below (note that more lessons will be added gradually): https://www.mecharithm.com/category/a-robotic-simulator-on-unity/ Important: Robotic Simulator's Source Code and Sample Output: The source code for the entire tutorials (from 1 to 27) can be downloaded…
Read More
A Bioinspired Advanced Neural Control for Autonomous Walking Robots

Researchers from BRAIN-lab at VISTEC in Thailand developed a bioinspired advanced neural control with proactive behavior learning and short-term memory for autonomous walking robots that enables them to traverse complex terrain. Image credit: BRAIN-lab at VISTEC The proposed control consists of three main modular neural mechanisms to create insect-like gaits, adapt robot joint movement individually with respect to the terrain during the stance phase using only the torque feedback, and generate a short-term memory for proactive obstacle avoidance during the swing phase. Their proposed control:

- does not require robot and environmental models, exteroceptive feedback, or multiple learning trials
- depends only on…
Read More
Robotic Simulator: Creating Different Climates with the Impact of Them (16/27)

In the previous lesson, we successfully simulated a complex function, removing/placing an object, for our robot. In this lesson, we want to simulate a variety of weather conditions and day and night cycles for our environment using lighting and various skyboxes, and add them to our scene management system. The corresponding lessons on building a robotic simulator in the Unity engine can be found at the link below (note that more lessons will be added gradually): https://www.mecharithm.com/category/a-robotic-simulator-on-unity/ Important: Robotic Simulator's Source Code and Sample Output: The source code for the entire tutorials (from 1 to 27) can be downloaded…
Read More
Flying Microrobots for Pollination, Rescue, and Navigating Machinery

Flying microrobots are diminutive drones developed by a group of researchers from MIT, Harvard, and the City University of Hong Kong. Image credit: MIT These microrobots are powered by soft actuators made of thin rubber cylinders coated in carbon nanotubes, producing an electrostatic force that squeezes and elongates the rubber cylinder when voltage is applied. They recently proposed a new fabrication technique that removes air bubbles to produce low-voltage, power-dense artificial muscles that can improve the actuation of these microrobots. Image credit: MIT A swarm of these robots could be used to pollinate a field of crops,…
Read More
Drone Badminton to Improve Physical and Mental Health of the People with Low Vision

Digital Nature Group at the University of Tsukuba in Japan developed "Drone Badminton," which enables people with low vision to play badminton. The drone acts as the "ball": when it passes through a ring frame, a sensor at the frame detects this, and the drone's direction changes back toward the opponent. Image credit: Digital Nature Group at the University of Tsukuba This project helps people with low vision, who otherwise cannot see the size of the ball or react to its speed in a regular badminton game, regain their mental and physical well-being.…
Read More
John Deere’s Fully Autonomous Tractor for Large-scale Production

John Deere's new autonomous tractor is all the fast-growing population needs for large-scale production. The tractor can do the job autonomously while the farmer is doing other tasks, and they can monitor its status through an app on their mobile phone. Image credit: John Deere Key features:

- GPS guidance system
- six pairs of stereo cameras for 360-deg visual perception
- images are processed through a deep neural network for obstacle detection

This tractor will be available to all farmers in late 2022! Watch a short video of this below: More information: https://ces2022.deere.com/ https://www.linkedin.com/company/john-deere/ Thanks for reading this post. You can access more categorized news…
Read More
Human-robot Interaction at the Next Level with the Humanoid Robot, Ameca

Ameca is a humanoid robot from Engineered Arts designed specifically as a platform for future advanced robotics technologies, and it can serve as a testbed for different Artificial Intelligence (AI) and Machine Learning (ML) systems. Image credit: Engineered Arts In the video below, they test Ameca's interaction with a human, and the robot is shown reacting when its "personal space" is violated. Key features:

- the human-like artificial body (AB) accompanies the human-like AI
- eye cameras take images, and the images are processed through TensorFlow
- modular design for future upgrades
- software is the Tritium robot operating system
- hardware is based on Mesmer technology…
Read More
Robotic Simulator: Creating an Advanced UGV Robot in Unity with C# (15/27)

In the previous lesson, we learned how to use triggers and their associated programming. In this part, we will simulate and test much more complex actions for our robot, such as picking up or dropping an object off the ground. The corresponding lessons on building a robotic simulator in the Unity engine can be found at the link below (note that more lessons will be added gradually): https://www.mecharithm.com/category/a-robotic-simulator-on-unity/ Important: Robotic Simulator's Source Code and Sample Output: The source code for the entire tutorials (from 1 to 27) can be downloaded HERE! And a test output for Windows can be downloaded…
Read More
A Navigation System for Multi-legged Robots for Different Terrain

Researchers at the Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) Robotics Innovation Center, in partnership with the Italian Institute of Technology (IIT) and Airbus Defence and Space Ltd (ADS), developed ANT, a novel navigation and motion control system for multi-legged robots. Image credit: Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI) This navigation system is designed to enable multi-legged robots to perceive the terrain, generate a path to the desired location, and finally control the path execution so that they can walk on different kinds of terrain. The system has been tested on hexapods (robots with six legs) as well as…
Read More