Researchers have been working on humanoid robots since the 1970s and have achieved notable breakthroughs. However, because humanoid robotics depends on many interrelated disciplines, humanoid robots still cannot achieve fully autonomous control and remain confined to laboratories. Meanwhile, as societies age, robots that assist people indoors are increasingly expected, and a humanoid robot that can act autonomously is clearly the best candidate for such tasks. Since a fully autonomous humanoid robot is still beyond reach, the practical route is a semi-autonomous humanoid robot supervised by an operator such as an elderly user. In this paper, a monitoring and control system was studied and designed for the humanoid robot developed in our laboratory.

First, the mechanism, control system, and motion capability of the humanoid were discussed, and the working environment, operation mode, and functional requirements of the monitoring and control system were analyzed. The hardware and software architecture of the system was then designed and divided into three subsystems: data communication, locomotion control, and human-robot interaction. The three subsystems were designed and implemented separately.

For the data communication subsystem, a wireless LAN module and a CAN bus communication module were developed. A reliability optimization was applied to the UDP protocol, and 8-channel CAN bus communication based on the PCI interface chip PCI9052 was designed.

For the locomotion control subsystem, the paper mainly studied self-localization, path planning, and locomotion planning of the humanoid robot. Pose tracking based on dead reckoning and global localization based on landmark recognition were designed for self-localization, and a localization mechanism integrating the two methods was built. The working environment was modeled as a grid map, and path planning based on an improved ant colony algorithm was implemented; simulation results verified the feasibility of the algorithm. Walking locomotion planning was realized by synthesizing, online, elementary actions planned offline, so that continuous walking was achieved.

For the human-robot interaction subsystem, a human-robot interface and a voice control mode were designed. The interface contains no video display, which lightens the communication load; instead, the operator can observe the complete state of the robot through a 3D virtual robot running synchronously with the real robot, together with the robot's pose and planned path. The voice control mode was built on Microsoft's SAPI 5.1. Performance tests showed that the recognition rate of voice instructions was above 95%, with no false acceptances in which noise or non-instruction speech was recognized as a voice instruction.
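The reliability optimization of UDP mentioned for the data communication subsystem is not detailed above. As an illustrative sketch only, the following Python fragment shows one common way to add reliability on top of UDP, using sequence numbers, acknowledgements, and timeout-based retransmission; the packet layout, timeout, and retry budget are assumptions, not the robot's actual protocol.

# Illustrative stop-and-wait reliability layer over UDP (not the robot's actual protocol).
# Packet layout assumed here: 4-byte big-endian sequence number followed by the payload.
import socket
import struct

def send_reliable(sock, addr, payload, seq, timeout=0.2, retries=5):
    """Send one datagram and wait for a matching ACK, retransmitting on timeout."""
    packet = struct.pack("!I", seq) + payload
    sock.settimeout(timeout)
    for _ in range(retries):
        sock.sendto(packet, addr)
        try:
            ack, _ = sock.recvfrom(16)
            if len(ack) >= 4 and struct.unpack("!I", ack[:4])[0] == seq:
                return True          # receiver confirmed this sequence number
        except socket.timeout:
            continue                 # no ACK in time, retransmit
    return False                     # give up after the retry budget is exhausted

def recv_reliable(sock, expected_seq):
    """Receive one datagram, acknowledge it, and drop duplicates of older packets."""
    data, addr = sock.recvfrom(2048)
    seq = struct.unpack("!I", data[:4])[0]
    sock.sendto(struct.pack("!I", seq), addr)   # always ACK so lost ACKs get repaired
    if seq == expected_seq:
        return data[4:], seq + 1     # deliver payload, advance expected sequence
    return None, expected_seq        # duplicate or out-of-order packet, ignore payload

A scheme like this trades throughput for predictability, which is usually acceptable for low-rate command and telemetry traffic over a wireless LAN.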
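For self-localization, the abstract names dead-reckoning pose tracking combined with landmark-based global localization but gives no equations. The sketch below, assuming a planar pose (x, y, theta) updated from per-step length and turn increments, shows the kind of update such a combination typically uses; the blending weight and the landmark-fix interface are hypothetical, not the thesis's actual integration mechanism.

# Minimal planar dead reckoning with an occasional landmark correction (illustrative only).
import math

class PoseTracker:
    def __init__(self, x=0.0, y=0.0, theta=0.0):
        self.x, self.y, self.theta = x, y, theta

    def dead_reckon(self, step_length, turn):
        """Propagate the pose over one walking step: apply the turn, then advance."""
        self.theta = (self.theta + turn) % (2 * math.pi)
        self.x += step_length * math.cos(self.theta)
        self.y += step_length * math.sin(self.theta)

    def landmark_fix(self, x_obs, y_obs, theta_obs, weight=0.7):
        """Blend an absolute pose obtained from landmark recognition into the
        dead-reckoned estimate; 'weight' (hypothetical) trusts the landmark more."""
        self.x = weight * x_obs + (1 - weight) * self.x
        self.y = weight * y_obs + (1 - weight) * self.y
        # blend the heading through its sine/cosine to avoid angle wrap-around
        s = weight * math.sin(theta_obs) + (1 - weight) * math.sin(self.theta)
        c = weight * math.cos(theta_obs) + (1 - weight) * math.cos(self.theta)
        self.theta = math.atan2(s, c)

Dead reckoning alone accumulates drift over successive steps, which is why the global landmark fix is folded in whenever a landmark is recognized.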
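The path planner is described as an improved ant colony algorithm on a grid map, but the specific improvements are not reproduced here. As a baseline sketch under those assumptions, the following shows plain ant colony optimization on a 4-connected occupancy grid; the parameter values are illustrative.

# Basic ant colony optimization on a 4-connected grid (a plain ACO baseline, not the
# thesis's improved variant).
import math
import random

def aco_path(grid, start, goal, n_ants=20, n_iters=50,
             alpha=1.0, beta=3.0, rho=0.3, q=1.0):
    """grid[r][c] == 0 means free, 1 means obstacle; returns a cell list or None."""
    rows, cols = len(grid), len(grid[0])
    pher = {}                                    # pheromone per directed grid edge
    best = None

    def neighbors(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                yield (nr, nc)

    def heuristic(cell):                         # inverse distance to the goal
        return 1.0 / (1.0 + math.hypot(cell[0] - goal[0], cell[1] - goal[1]))

    for _ in range(n_iters):
        paths = []
        for _ in range(n_ants):
            path, visited = [start], {start}
            while path[-1] != goal:
                options = [n for n in neighbors(path[-1]) if n not in visited]
                if not options:
                    path = None                  # ant got stuck in a dead end
                    break
                weights = [(pher.get((path[-1], n), 1.0) ** alpha) *
                           (heuristic(n) ** beta) for n in options]
                path.append(random.choices(options, weights)[0])
                visited.add(path[-1])
            if path is not None:
                paths.append(path)
                if best is None or len(path) < len(best):
                    best = path
        for edge in pher:                        # evaporation
            pher[edge] *= (1 - rho)
        for path in paths:                       # deposit proportional to path quality
            for a, b in zip(path, path[1:]):
                pher[(a, b)] = pher.get((a, b), 1.0) + q / len(path)
    return best

Typical improvements reported for grid-based ACO planners, such as biased initial pheromone or stagnation avoidance, would modify the deposit and selection rules above.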
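Walking is planned by synthesizing, online, elementary actions that were planned offline. A minimal sketch of that idea is given below, assuming each primitive is a short list of joint-angle frames tagged with the stance it starts from and ends in; the primitive names, frame contents, and posture labels are placeholders, not the robot's real motion library.

# Illustrative sequencer that chains offline-planned elementary actions into a
# continuous walking trajectory.
WALK_PRIMITIVES = {
    # name: (start_posture, frames, end_posture); frames are joint-angle vectors
    "start":      ("stand",         [[0.0, 0.1], [0.1, 0.2]], "left_support"),
    "step_right": ("left_support",  [[0.1, 0.2], [0.2, 0.1]], "right_support"),
    "step_left":  ("right_support", [[0.2, 0.1], [0.1, 0.2]], "left_support"),
    "stop":       ("left_support",  [[0.1, 0.0], [0.0, 0.0]], "stand"),
}

def synthesize_walk(n_step_pairs):
    """Chain primitives online: start, alternate right/left steps, then stop.
    Each primitive must begin in the posture the previous one ended in."""
    names = ["start"] + ["step_right", "step_left"] * n_step_pairs + ["stop"]
    trajectory, posture = [], "stand"
    for name in names:
        start_posture, frames, end_posture = WALK_PRIMITIVES[name]
        assert start_posture == posture, "primitives do not connect smoothly"
        trajectory.extend(frames)      # splice the offline-planned frames online
        posture = end_posture
    return trajectory

Requiring matching start and end postures is what allows the offline primitives to be concatenated at run time into continuous walking without replanning each step from scratch.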
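The voice control mode itself is built on Microsoft's SAPI 5.1 COM interfaces, which are not reproduced here. As an illustration only of the rejection behavior reported above (no noise or out-of-vocabulary speech accepted as an instruction), the following generic sketch filters recognizer hypotheses with a fixed command vocabulary and a confidence threshold; the threshold value, command list, and (text, confidence) output format are assumptions.

# Generic command-filtering sketch standing in for a SAPI command-and-control grammar.
INSTRUCTIONS = {"walk forward", "turn left", "turn right", "stop"}

def filter_instruction(recognized_text, confidence, threshold=0.8):
    """Return a validated instruction, or None if the hypothesis should be rejected."""
    text = recognized_text.strip().lower()
    if confidence < threshold:
        return None                  # low-confidence hypothesis: treat as noise
    if text not in INSTRUCTIONS:
        return None                  # recognized speech that is not a command
    return text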