10-bit integers are not sufficient for serious image processing, in any program. One must retain many more bits of precision during intermediate computations than are needed in the end product. The human eye's "bit depth", which Apple is catering to, has little or nothing to do with it.
In any case, floating-point representations are usually the best choice for processing workflows with many steps, since they avoid rounding the data back to a fixed grid of levels after every operation.
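A minimal sketch of the point about intermediate precision, assuming a simple gamma encode/decode round trip as the "workflow" (the gamma value and the pipeline itself are illustrative, not any particular program's): quantizing the intermediate result back to 10 bits collapses many input levels together, while keeping the intermediate in floating point and quantizing only at the end preserves all of them.

```python
# Hypothetical illustration: count how many distinct 10-bit levels survive a
# gamma round trip, with vs. without 10-bit quantization of the intermediate.

def quantize(x, bits=10):
    levels = (1 << bits) - 1          # 1023 steps for a 10-bit code
    return round(x * levels) / levels

def round_trip(x, intermediate_bits=None):
    y = x ** 2.2                      # gamma encode (illustrative curve)
    if intermediate_bits:
        y = quantize(y, intermediate_bits)  # store the intermediate at 10 bits
    z = y ** (1 / 2.2)                # gamma decode; mathematically a no-op
    return quantize(z)                # final 10-bit output

codes = [i / 1023 for i in range(1024)]   # every 10-bit input level
lossy = {round_trip(x, intermediate_bits=10) for x in codes}
clean = {round_trip(x) for x in codes}
print(len(lossy), len(clean))   # the lossy path loses many dark levels
```

The float path recovers all 1024 levels; the quantized path merges dozens of near-black codes into one, which is exactly the banding that shows up after a few such steps in a real pipeline.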