Yes, that is strange.
The only thing I can think of is that the Parallella and MATLAB handle floating point numbers slightly differently. If a tiny bit of rounding error on the end of a number causes one output function not to fire in the first layer, then this will introduce a perturbation that will flow through the whole network.
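Here's a minimal sketch of what I mean (Python, with a made-up 0.5 threshold and a tiny discrepancy standing in for whatever the two platforms actually produce):

    # Two values that differ only far out in the mantissa, as might happen
    # when MATLAB and the Parallella round a computation differently.
    x_matlab     = 0.5
    x_parallella = 0.5 + 1e-15   # hypothetical tiny rounding discrepancy

    def fires(v, threshold=0.5):
        # Hard-threshold output function: the neuron fires only above the threshold.
        return v > threshold

    print(fires(x_matlab))       # False -- does not fire
    print(fires(x_parallella))   # True  -- fires, and the two runs diverge here

One flipped neuron in the first layer feeds a different value into every neuron downstream, so the divergence only grows from there.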
Can I suggest that you load, run, and save the network using a selection of data sets, some identical and some not, and compare the input with the output? There might be a slight difference in the lower-order digits. That could then be the reason.
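Something like this would show it up (a Python sketch; the file names are placeholders for whatever you actually save from each run):

    import numpy as np

    # Placeholder file names -- substitute the outputs you saved on each platform.
    a = np.loadtxt("output_matlab.txt")
    b = np.loadtxt("output_parallella.txt")

    diff = np.abs(a - b)
    print("max abs difference:", diff.max())

    # Flag elements that are close but not bit-identical, i.e. values that
    # agree to roughly 7 significant digits but differ in the lower-order ones.
    scale = np.maximum(np.abs(a), np.abs(b))
    low_order = (diff > 0) & (diff <= 1e-7 * scale)
    print("elements differing only in low-order digits:", low_order.sum())

If the maximum difference is tiny but nonzero, you'll know the two platforms are rounding differently.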
Please let me know what you find. Your experiments are very interesting.
nick