There could be some merit to this statement. The form of multitouch the touchscreen on the G1 and Magic used (perhaps on other devices too, but I'm not sure) was capable of "2x1d" positioning. That is, if you touched the screen in two places, it knew the two x coordinates being touched and the two y coordinates being touched, but had no way of knowing which x was paired with which y... that correlation was handled in software.
This was fine for pinch gestures, but recall Luke Hutchison's videos, where he showed that the software could become confused and pair the x and y values incorrectly, causing the system to register the opposite corners of the touch "box" between your fingers.
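The ambiguity can be sketched in a few lines. This is hypothetical illustration code, not anything from the actual Android touch stack: a 2x1d sensor reports two x values and two y values, which admit exactly two pairings, and a tracker has to guess which one is real (here, by staying close to the previous frame's points).

```python
# Hypothetical sketch of the "2x1d" ambiguity: the panel reports two x
# values and two y values but not which x belongs to which y, so the
# software must choose between two possible pairings.

def candidate_pairings(xs, ys):
    """Two touches sensed as axis projections -> two possible point pairs."""
    (x1, x2), (y1, y2) = sorted(xs), sorted(ys)
    return [((x1, y1), (x2, y2)),   # "diagonal" pairing
            ((x1, y2), (x2, y1))]   # "anti-diagonal" (ghost) pairing

def resolve(xs, ys, prev):
    """Pick whichever pairing stays closest to the previous frame's points.
    When fingers cross on one axis, both pairings look almost equally
    good, and the tracker can flip to the ghost corners of the box."""
    def cost(pair):
        return sum((px - qx) ** 2 + (py - qy) ** 2
                   for (px, py), (qx, qy) in zip(pair, prev))
    return min(candidate_pairings(xs, ys), key=cost)

# Real fingers at (10, 10) and (50, 50); the sensor only reports the axes.
prev = [(10, 10), (50, 50)]
print(resolve([10, 50], [10, 50], prev))  # -> ((10, 10), (50, 50))
```

If the previous frame had instead been the opposite corners, the same sensor reading would resolve to (10, 50) and (50, 10), which is exactly the "opposite corners of the box" confusion Hutchison demonstrated.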
You can still see the effect today in some multitouch applications in the Market on the Nexus One, such as Multitouch Pong or the multitouch plugin for Ethereal Dialpad. In each case, the software can usually determine the initial touch points correctly, but when the points cross in the x or y axis they "snap" together in that axis, and when moving away from that position the tracking may become confused. This makes games like Multitouch Pong unusable on the Nexus One because the tracking is just not reliable. However, I have no way of knowing whether this is a hardware limitation or a limitation in the touch software, and nobody has really talked about it.
Contrast this with the iPhone, whose touchscreen allows it to track touch points as two independent (x, y) coordinates without any sort of axis snapping.