SAIN ROBOTICS



SAIN ROBOT is a humanoid robot created by Mohamed Hussain of Sri Lanka in 2020 and released in 2021. It was developed through continuous effort despite many failures along the way. Although the robot was designed to do the work of ordinary human beings, it could grow into a robot with many more abilities if its development continues in the future.

A HUMANOID ROBOT is a robot with a body shape built to resemble the human body. The design may be for functional purposes, such as interacting with human tools and environments, for experimental purposes, such as scientific and medical research, or for other purposes. In general, humanoid robots have a torso, a head, two arms, and two legs, though some humanoid robots may model only part of the body, for example, from the waist up. Some humanoid robots also have heads designed to replicate human facial features such as eyes and mouths. Androids are humanoid robots built to aesthetically resemble humans.

SAIN ROBOT can talk, move, hear, pick up objects, take photos and videos, act as a Google Assistant, and recognize human faces.

The robot's face recognition system captures photos of human faces and saves them on an SD card. When a person comes in front of the robot, the front camera takes photos of the face. If that person comes in front of the robot again later, the robot says, "Hi, I know you."
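The following is a minimal sketch of that flow, assuming a Raspberry Pi class controller running Python with OpenCV and the open source face_recognition library. The camera index, the SD card mount path, and the greeting text are assumptions for illustration, not details confirmed by the SAIN ROBOT design.

import os
import pickle

import cv2
import face_recognition

SD_CARD_DB = "/media/sdcard/known_faces.pkl"      # assumed SD card mount point

def load_known_faces():
    """Load previously saved face encodings from the SD card, if any."""
    if os.path.exists(SD_CARD_DB):
        with open(SD_CARD_DB, "rb") as f:
            return pickle.load(f)
    return []

def save_known_faces(encodings):
    """Write the list of face encodings back to the SD card."""
    with open(SD_CARD_DB, "wb") as f:
        pickle.dump(encodings, f)

def check_visitor(say):
    """Take one photo with the front camera, greet the person if the face
    is already known, otherwise remember the new face on the SD card."""
    known = load_known_faces()
    camera = cv2.VideoCapture(0)                  # front camera (assumed index 0)
    ok, frame = camera.read()
    camera.release()
    if not ok:
        return
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # face_recognition expects RGB
    encodings = face_recognition.face_encodings(rgb)
    if not encodings:
        return                                    # nobody is in front of the robot
    face = encodings[0]
    if known and any(face_recognition.compare_faces(known, face)):
        say("Hi, I know you")                     # person was seen before
    else:
        known.append(face)                        # first visit: save the face
        save_known_faces(known)

if __name__ == "__main__":
    check_visitor(say=print)                      # print stands in for the robot's voice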

The robot is controlled by radio control, voice commands, and detection (sensor-based) control. A few functions run directly from the onboard code, while most functions in its data list are activated automatically on a time schedule.
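A minimal sketch of such a time-controlled function list is shown below, assuming the controller runs Python; the function names and the clock times are hypothetical examples, not the robot's real schedule.

import time
from datetime import datetime

def greet_students():
    print("Robot: good morning, class")

def start_lesson():
    print("Robot: starting today's lesson")

# "data list" of automatic functions keyed by clock time (HH:MM)
TIMED_FUNCTIONS = {
    "08:00": greet_students,
    "08:05": start_lesson,
}

def run_time_control():
    """Check the clock once a minute and fire any function scheduled for
    the current time, the way most of the robot's functions are activated."""
    fired = set()
    while True:
        now = datetime.now().strftime("%H:%M")
        if now in TIMED_FUNCTIONS and now not in fired:
            TIMED_FUNCTIONS[now]()
            fired.add(now)
        time.sleep(60)

if __name__ == "__main__":
    run_time_control()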


This robot can teach basic things to students without a lecturer: how to draw art, small questions with their answers, how to build a small LED blinking circuit step by step, how to assemble 3D computer parts step by step, and other small pre-programmed lessons.
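As an illustration of the kind of small LED blinking exercise the robot could walk students through, here is a minimal sketch assuming a Raspberry Pi style board with the RPi.GPIO library; the pin number is a hypothetical choice, not part of the robot's lesson material.

import time

import RPi.GPIO as GPIO

LED_PIN = 17                       # hypothetical GPIO pin driving the LED

GPIO.setmode(GPIO.BCM)             # step 1: choose the pin numbering scheme
GPIO.setup(LED_PIN, GPIO.OUT)      # step 2: set the LED pin as an output

try:
    for _ in range(10):            # step 3: blink the LED ten times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                 # step 4: release the pin when finished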

Just as a lecturer can teach students with body language and speech in the classroom, this robot can teach students over an internet connection, without a smartphone.

When the lecturer teaches a lesson with speech and body language in front of the robot, the robot observes these movements with its front camera and sends the corresponding signals to the student's robot via the internet.


At the same time, the student's robot delivers the lesson to the students with the same body movements the lecturer made.
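A minimal sketch of how the lecturer-side robot might forward observed body-language cues to the student-side robot over the internet is given below, assuming a plain TCP link and simple JSON messages; the host address, port, and gesture names are assumptions, not the robot's actual protocol.

import json
import socket

STUDENT_ROBOT_ADDRESS = ("student-robot.example.org", 5050)   # assumed endpoint

def send_gesture(gesture, speech_text):
    """Lecturer side: send one observed gesture plus the spoken sentence."""
    message = json.dumps({"gesture": gesture, "speech": speech_text}).encode()
    with socket.create_connection(STUDENT_ROBOT_ADDRESS) as conn:
        conn.sendall(message + b"\n")

def receive_and_replay(port=5050):
    """Student side: wait for messages and replay them on the local robot."""
    with socket.create_server(("", port)) as server:
        conn, _ = server.accept()
        with conn, conn.makefile() as stream:
            for line in stream:
                lesson = json.loads(line)
                # stand-ins for the robot's motor and speech commands
                print("move arms:", lesson["gesture"])
                print("speak:", lesson["speech"])

if __name__ == "__main__":
    # the student's robot runs the receiving loop; the lecturer's robot
    # calls send_gesture() each time the camera recognizes a movement
    receive_and_replay()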