I need to calculate the correct screen position of a 3D object that I display on a RawImage in my UI. If I put the UI image in the center of the screen, the calculation is perfect. But if I move the image around inside its UI parent, the results are shifted, depending on which side I move it to. I've taken some screenshots of what I mean: the orange area is my 3D quad, and the white square is just a debug image showing where my calculated point lands.
The configuration I have is as follows:
– a world camera pointing at a 3D quad
– a UI canvas with a dedicated perspective camera (Screen Space – Camera)
– a panel in my UI canvas displaying the 3D quad
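For reference, the setup above roughly corresponds to something like this (a minimal sketch; the field names and the use of a RenderTexture are my assumptions, since only the RawImage is stated explicitly):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical setup sketch: the world camera renders the 3D quad into a
// RenderTexture, which a RawImage on the UI canvas then displays.
public class QuadViewSetup : MonoBehaviour
{
    public Camera worldCamera;   // points at the 3D quad
    public Camera uiCamera;      // dedicated camera of the Screen Space – Camera canvas
    public RawImage rawImage;    // the panel/image that displays the quad

    void Start()
    {
        var rt = new RenderTexture(512, 512, 16); // size is an arbitrary assumption
        worldCamera.targetTexture = rt;           // world camera renders into the texture
        rawImage.texture = rt;                    // RawImage displays that texture
    }
}
```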
var worldPoint = t.Value.MeshRenderer.bounds.min; // t.Value is the 3D quad
var screenPoint = worldCamera.WorldToScreenPoint(worldPoint);
screenPoint.z = (baseCanvas.transform.position - uiCamera.transform.position).magnitude; // baseCanvas is the UI canvas
var pos = uiCamera.ScreenToWorldPoint(screenPoint);
debugger.transform.position = pos; // debugger is just the square image used to see where my calculated point lands
I've tried several ways, like this:
var screenPoint = worldCamera.WorldToScreenPoint(t.Value.MeshRenderer.bounds.min);
Vector2 localPoint;
RectTransformUtility.ScreenPointToLocalPointInRectangle(rectTransform, screenPoint, uiCamera, out localPoint); // rectTransform is my UI panel
debugger.transform.localPosition = localPoint;
But both approaches give the same shifted result. How can I do the calculation correctly, taking the panel's offset into account?
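For completeness, here is the direction I suspect the fix needs to go: since worldCamera renders into the RawImage rather than the full screen, its screen point probably has to be remapped through the RawImage's rect instead of being fed to the UI camera directly. A sketch of that idea (untested; `rawImageRectTransform` is a hypothetical reference to the RawImage's RectTransform, and it assumes the RawImage shows the full camera output without cropping):

```csharp
// Idea: convert the world point to the worldCamera's normalized viewport
// coordinates (0..1), then map that into the RawImage's rect, so the
// panel's offset inside the canvas is accounted for automatically.
var viewportPoint = worldCamera.WorldToViewportPoint(worldPoint);

var rect = rawImageRectTransform.rect; // RectTransform of the RawImage (assumed name)
var localPoint = new Vector2(
    (viewportPoint.x - rawImageRectTransform.pivot.x) * rect.width,
    (viewportPoint.y - rawImageRectTransform.pivot.y) * rect.height);

debugger.transform.SetParent(rawImageRectTransform, false); // position relative to the panel
debugger.transform.localPosition = localPoint;
```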