Empowering robots with autonomous perception and decision-making capabilities not only reduces the robot's dependence on the environment, but also allows humans to work safely and effectively alongside robots, achieving the co-integration of robots with humans and the environment. To this end, this paper designs a perception-decision control framework for human-robot-environment physical interaction based on multimodal information fusion.

First, the operating environment and the physiological characteristics of human operation are analyzed, and an equivalent uncertain environment model, a simplified human motion model, and a human-robot-environment interaction process model are established. In addition, considering the influence of robot joint clearance on the dynamics of the interaction process, a joint transmission model with lubricated contact characteristics is established to provide a more realistic simulation object for analysis.

Second, taking a robot-assisted upper-limb rehabilitation training task as an example, the control problems of human-robot-environment interaction are analyzed, and a multimodal information fusion strategy is designed that enables the robot to autonomously perceive both uncertain environmental features and human motion intentions, and to fuse the two types of information effectively through a confidence mechanism. In this strategy, the fusion of proprioceptive data based on a particle-filter probabilistic model is combined with historical data to correct the estimates of target location and shape features, a radial basis function (RBF) neural network is used to estimate human motion intention, and a fuzzy distributed Kalman filter is further used to calculate the confidence level of the perceived environmental information and the estimated human motion intention. In addition, based on Dempster-Shafer (D-S) evidence theory, a task decision algorithm using force and position basic probability assignments (BPAs) is designed to enable the robot to determine whether
the desired task goal has been achieved, or whether the estimation of environmental features in the robot's autonomous perception strategy needs to be updated, thereby guiding the robot's motion planning.

Then, to ensure the safety of the human-robot-environment interaction process, a compliance model is introduced to reduce the contact forces between the end-effector and the environment, optimizing the overall control framework. The proposed algorithm forms a two-loop control framework for the human-robot-environment system: the outer loop combines the compliance model with the multimodal information fusion strategy to handle human-robot-environment interaction behavior, while the inner loop compensates for the unknown terms in the robot's nonlinear model through a sliding-mode term to reduce the effect of system model error; furthermore, to reduce the influence of the robot joint drive, a neural network is used to compensate the joint motor current, realizing joint servo control of the robot.

Finally, a simulation system and a test platform for the human-robot collaboration task were built. The simulation and experimental results jointly verify the effectiveness of the framework in terms of intelligence, assistance capability, and safety, showing that humans can perform collaborative tasks with smarter assistance from robots.
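To make the D-S evidence-theory decision step concrete, the sketch below combines two basic probability assignments with Dempster's rule of combination. The hypothesis names ("goal_reached", "update_estimate") and the mass values for the force and position channels are illustrative assumptions, not values from the paper; only the combination rule itself is standard D-S theory.

```python
# Minimal sketch of Dempster's rule: fuse a force-derived BPA and a
# position-derived BPA over a two-hypothesis frame of discernment.
# All masses below are hypothetical placeholders for illustration.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs given as dicts mapping frozenset hypotheses -> mass."""
    combined = {}
    conflict = 0.0  # K: total mass assigned to contradictory evidence
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # intersection is empty: conflicting evidence
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    # Normalize surviving masses by 1 - K
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

GOAL = frozenset({"goal_reached"})
UPDATE = frozenset({"update_estimate"})
EITHER = GOAL | UPDATE  # the full frame, representing ignorance

# Illustrative BPAs from the force and position channels
m_force = {GOAL: 0.6, UPDATE: 0.1, EITHER: 0.3}
m_pos = {GOAL: 0.7, UPDATE: 0.2, EITHER: 0.1}

fused = dempster_combine(m_force, m_pos)
decision = max(fused, key=fused.get)  # hypothesis with the largest fused mass
```

With these placeholder masses, both channels lean toward "goal_reached", so the fused BPA concentrates mass on that hypothesis; in the framework described above, the winning hypothesis would either terminate the task or trigger an update of the environmental feature estimate.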