I’m pretty sure this can be done but the maths eludes me…
I have a workobject with its origin a good way away from the working area. Each cycle the part is presented in a slightly different position, so there are three laser measurement sensors to detect the displacement: two measuring X about 300 mm apart, and another measuring the Y displacement. If the part is only moved by a true X and/or Y shift, then it's an easy transformation by inputting the values directly into the object frame:
wobj_new := wobj_original;
wobj_new.oframe.trans.y := AI_Yshift;
! ...and likewise oframe.trans.x for an X shift
If the two X displacement sensors show different values, that indicates a rotation has occurred, but whilst I can calculate the angle, I can't figure out how to reflect these readings back into a move of the workobject's origin point (about 2 m away from the measuring points). Making the robtargets, and hence the workobject, referenced to the zero point is a customer requirement.
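To sketch the maths I'm after: assuming the three sensors sit at known, fixed coordinates *expressed in the original workobject frame* (the coordinates below are placeholders, not my real layout), the three readings determine a rigid in-plane fit p' = R(θ)·p + t. The useful property is that t then comes out expressed at the frame origin, i.e. the 2 m-away zero point, so (tx, ty, θ) would go straight into the oframe. A Python sketch of that fit:

```python
import math

# Assumed sensor layout in the ORIGINAL workobject frame (origin ~2 m away
# from the working area). Illustrative placeholder values only.
X_SENSOR_1 = (150.0, 2000.0)   # (x, y) of first X-displacement sensor, mm
X_SENSOR_2 = (150.0, 2300.0)   # second X sensor, 300 mm apart in Y
Y_SENSOR   = (300.0, 2150.0)   # Y-displacement sensor

def fit_plane_shift(dx1, dx2, dy):
    """Fit a rigid in-plane shift (tx, ty, theta_deg) from three readings.

    Model: every point p on the part moves to R(theta) @ p + t.
    Because the sensor coordinates are expressed in the workobject frame,
    t is the displacement OF THE FRAME ORIGIN itself, even though the
    origin is far from the measuring points.
    """
    (xs, y1), (_, y2) = X_SENSOR_1, X_SENSOR_2
    x3, y3 = Y_SENSOR
    # The differential X reading over the 300 mm baseline gives the rotation:
    # (x1' - x2') = sin(theta) * (y2 - y1)
    sin_t = (dx1 - dx2) / (y2 - y1)
    theta = math.asin(sin_t)
    cos_t = math.cos(theta)
    # Back-substitute one point per axis to recover the origin translation.
    tx = (xs + dx1) - (cos_t * xs - sin_t * y1)
    ty = (y3 + dy) - (sin_t * x3 + cos_t * y3)
    return tx, ty, math.degrees(theta)

# Both X sensors agree -> pure shift, no rotation.
print(fit_plane_shift(2.0, 2.0, 1.5))   # -> (2.0, 1.5, 0.0)
```

This does gloss over one thing I mention below: the beam hits a slightly different point on the part each cycle, so it is only exact for small displacements where the measured surface is flat along the beam.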
Hopefully the screenshot helps explain: the two plates are the ideal and probable positions of the area; the blocks are the measuring devices.
As the points at which the measurements occur won't be the same each cycle, I can't use something like DefAccFrame, as I can't locate the exact points. The origin point is about 2 m off the top of the screenshot.. and hence where (I think) I need to place the offset X, Y, Z, q1 etc. figures?
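On the q1 etc. figures: the oframe orientation is a quaternion [q1, q2, q3, q4] with q1 the scalar part, and a pure rotation θ about Z gives q1 = cos(θ/2), q2 = q3 = 0, q4 = sin(θ/2) (RAPID's OrientZYX function would build the same thing on the controller). A quick sketch of that conversion, with the function name my own:

```python
import math

def z_rotation_quaternion(theta_deg):
    """Quaternion (q1, q2, q3, q4) for a pure rotation about Z.

    q1 is the scalar part, q4 the Z component -- the same ordering as a
    RAPID orient, so these four values fill oframe.rot for a yaw-only fit.
    """
    half = math.radians(theta_deg) / 2.0
    return (math.cos(half), 0.0, 0.0, math.sin(half))

print(z_rotation_quaternion(0.0))   # -> (1.0, 0.0, 0.0, 0.0), identity orient
```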