We consider differences arising from Popoviciu's inequality and establish upper and lower bounds by employing the weighted Hermite–Hadamard inequality together with the approximations of the two-point Montgomery formula. We also obtain bounds for Popoviciu's inequality by employing the weighted Hermite–Hadamard inequality together with the approximations of the one-point Montgomery formula. We verify these results using the theory of n-times differentiable convex functions. Our results hold for all n ≥ 2, and we provide explicit examples demonstrating the correctness of the bounds obtained in special cases. Finally, we give applications in information theory by providing new uniform estimates of the generalized Csiszár divergence, the Rényi divergence, the Shannon entropy, the Kullback–Leibler divergence, and the Zipf and hybrid Zipf–Mandelbrot entropies.