In Cocoa on macOS, an NSTouch object contains the coordinates of a touch in a normalized coordinate system (with x and y in the range 0.0 to 1.0), and it also has the size of the trackpad device.
So for an NSTouch object you have the coordinates of the touch on the trackpad device. This, of course, makes sense.
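For reference, here is a minimal sketch of how I receive these touches (assuming a custom NSView subclass that opts in to indirect/trackpad touches; the class name is just for illustration):

```swift
import Cocoa

// Hypothetical view subclass, only to show where the NSTouch values come from.
class TouchView: NSView {

    override func awakeFromNib() {
        super.awakeFromNib()
        // Opt in to indirect (trackpad) touches.
        allowedTouchTypes = [.indirect]
    }

    override func touchesMoved(with event: NSEvent) {
        for touch in event.touches(matching: .moved, in: self) {
            // normalizedPosition is in [0, 1] x [0, 1] on the trackpad surface;
            // deviceSize is the size of the trackpad.
            print("touch \(touch.identity) at \(touch.normalizedPosition), device size \(touch.deviceSize)")
        }
    }
}
```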
What I am now trying to do is map these coordinates to screen coordinates. (I know that this is not well-defined, so please read on for an explanation of what I mean by that.)
For a single touch, this conversion is already handled by Cocoa/macOS: I get an NSEvent object (type=Gesture), and this NSEvent has the location of the mouse pointer (while the single NSTouch of the NSEvent has the trackpad-device coordinates).
But this does not work for a multi-touch gesture, since the NSEvent has only one location, and during a multi-touch gesture the mouse pointer stays at the position where the gesture started.
This all makes sense. But now I want to somehow apply the same transformation that is done for the single-touch case to each of the touches in the multi-touch gesture. I believe that the trackpad speed setting in System Preferences will influence such a mapping.
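To make the problem concrete, here is a naive mapping I experimented with: accumulate per-touch deltas in normalizedPosition, scaled by deviceSize and a guessed speed factor. MultiTouchMapper and speedFactor are my own placeholders, not AppKit API, and this does not reproduce the system's pointer acceleration, which is exactly the part I am missing:

```swift
import Cocoa

// Naive per-touch mapping I experimented with. `speedFactor` stands in for the
// trackpad speed setting and MultiTouchMapper is my own helper class, not an
// AppKit API. This does not reproduce the system's acceleration curve.
class MultiTouchMapper {
    private var lastNormalized: [NSObject: NSPoint] = [:]     // keyed by touch.identity
    private(set) var screenPositions: [NSObject: NSPoint] = [:]
    var speedFactor: CGFloat = 1.0                             // guessed scale factor

    func update(with touches: Set<NSTouch>, gestureOrigin: NSPoint) {
        for touch in touches {
            // identity conforms to NSObjectProtocol and is an NSObject in practice.
            let key = touch.identity as! NSObject
            let p = touch.normalizedPosition

            if let last = lastNormalized[key], var screen = screenPositions[key] {
                // Normalized delta -> points on the trackpad surface -> scaled screen delta.
                screen.x += (p.x - last.x) * touch.deviceSize.width * speedFactor
                screen.y += (p.y - last.y) * touch.deviceSize.height * speedFactor
                screenPositions[key] = screen
            } else {
                // New touch: start it at the mouse-pointer location where the gesture began.
                screenPositions[key] = gestureOrigin
            }
            lastNormalized[key] = p
        }
    }
}
```

This drifts away from what the real pointer would do as soon as the speed setting or acceleration comes into play, which is why I am looking for the transformation the system itself applies in the single-touch case.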
If you are wondering about a use case, think of a drawing app that allows drawing multiple lines at once (multi-touch), where those lines should move at the same speed as the mouse pointer would.
I found that NSTouch has a locationInView function, but this asserts for me every time. That is to be expected according to https://stackoverflow.com/a/48752995/85539 (this function only works for touch events coming from the Touch Bar, not from the trackpad).
I couldn't find any other methods that look promising.