In the development of control schemes for upper-limb prostheses, the choice of classification method is the decisive factor in predicting the correct hand movements. This contribution presents an approach to validate and visualize the output of a chosen classifier by simulative means. Using features extracted from a collection of recorded myoelectric signals (MES), a training set for different classes of hand movements is produced and validated against additional MES recordings. The classifier output then drives a simulation of an actual prosthesis by controlling the 3D model of a prosthetic hand. For the systematic comparison of feature sets and classification methods, a toolbox for MATLAB™ has been developed. Our classification results show that existing classification schemes based on EMG data can be improved significantly by adding NIR sensor data. Employing only two combined EMG/NIR sensors, five motion classes comprising full movements, including pronation and supination, can be distinguished with 100% accuracy.