Existing online forum software supports limited assessment features. This paper presents an analysis of an assessment model implemented in online discussion forum software. The model aims to automate the assessment of students' participation in online discussion forums and is formulated from four participation indicators together with educators' feedback. The model was tested with a group of students who used the online forum to complete a project. Pearson product-moment correlations were calculated between the scores (performance indicator scores) generated by the model and the actual scores assigned by five educators. The performance indicator scores generated by the assessment formula were highly correlated with the grades assigned by the educators. These results suggest that the assessment model is reliable and can be used to evaluate students' participation in online discussion forums.
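The validation step relies on the Pearson product-moment correlation between model-generated scores and educator-assigned grades. As a minimal sketch of that calculation, the following uses entirely hypothetical score data (the `model_scores` and `educator_grades` values are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: model-generated indicator scores vs. one educator's grades
model_scores = [62, 74, 81, 55, 90, 68]
educator_grades = [60, 78, 85, 52, 92, 70]
r = pearson_r(model_scores, educator_grades)
```

A value of r close to 1 would indicate that the model's scores track the educators' grades closely, which is the kind of agreement the study reports.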