unity – Copy component settings to an instantiated object?

I’m trying to add a component to an instantiated object. The new component is an AudioSource. I’ve added the AudioSource to the instantiated object, but now I want to copy all the settings from “_sound” to the new component.

 public AudioSource _sound;
 public GameObject ebullet;   // prefab referenced below, was not declared
 public float targetCompass;

 void Start() {
     _sound = GetComponent<AudioSource>();
 }

 void Update() {
     if (targetCompass <= 5) {
         GameObject bullets = Instantiate(ebullet) as GameObject;
         // One component cannot be assigned over another
         // (GetComponent<...>() = _sound does not compile),
         // so the settings have to be copied field by field:
         AudioSource newSound = bullets.AddComponent<AudioSource>();
         newSound.clip = _sound.clip;
         newSound.volume = _sound.volume;
         newSound.pitch = _sound.pitch;
         newSound.loop = _sound.loop;
         newSound.spatialBlend = _sound.spatialBlend;
     }
 }

unity – How to rotate an object based on the movement and angles of Lifters

I have Lifters that “carry” an object. I want the object to move and rotate based on how those Lifters move.

[image: Lifters]

I have the object move by setting its velocity to the average of the Lifters’ velocities.

Now, if not all of them move (or if there’s a disparity in their velocities), I want the object to rotate, like so:

[image: Rotation]

This is done by keeping track of how much an individual Lifter has moved in a frame relative to the Object’s position, and calculating the angle via m_Angle += Vector3.SignedAngle(lastPos - transform.position, currentPos - transform.position). This “movement angle” is added up for every Lifter, and at the end the Object is rotated via transform.Rotate(Vector3.up, m_Angle).
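A minimal sketch of that per-Lifter accumulation, for reference. The Lifter component and its lastPos/currentPos fields are assumptions about the surrounding code, not confirmed from the post:

```csharp
using UnityEngine;

// Sketch of the yaw accumulation described above.
// Lifter, lastPos, and currentPos are hypothetical names.
public class LifterRotation : MonoBehaviour
{
    public Lifter[] m_Lifters;
    private float m_Angle;

    void FixedUpdate()
    {
        m_Angle = 0f;
        foreach (Lifter lifter in m_Lifters)
        {
            // Angle swept by this Lifter around the Object's up axis
            // this frame.
            m_Angle += Vector3.SignedAngle(
                lifter.lastPos - transform.position,
                lifter.currentPos - transform.position,
                Vector3.up);
        }
        // Dividing by the Lifter count turns the sum of per-Lifter
        // angles into their average, each of which already
        // approximates the whole rotation on its own.
        m_Angle /= m_Lifters.Length;
        transform.Rotate(Vector3.up, m_Angle);
    }
}
```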

However, this approach has some problems:

  1. Calculations are done only around the Y axis, so they do not tilt the Object up or down when the Lifters are at different vertical positions.
  2. m_Angle actually overshoots, but I found that simply dividing m_Angle by the number of Lifters made it perfectly accurate. I do not know why this works.

Because I want this to rotate the object in “3 dimensions”, I have attempted to calculate the “euler angles” of each Lifter’s movement via:

Vector3 oldDisplacement = lastPos - transform.position;
Vector3 newDisplacement = currentPos - transform.position;
Quaternion rotation = Quaternion.FromToRotation(oldDisplacement, newDisplacement);
m_EulerAngles += rotation.eulerAngles;

And then at the end, I rotate the Object based on these “movement angles”:

transform.Rotate(m_EulerAngles, Space.World);

This works in three dimensions; however, it is off in the same way m_Angle was before I divided it: it rotates the object too far. On top of that, it seems to produce rotations along axes that have no displacement.

[image: m_EulerAngles]

When I check how much rotation Blue is adding per frame, I get an output like so:

Blue is adding (0.000, 359.507, 0.066); on movement (-0.015, 0.000, 0.000)

The only issue I can perceive is that since the Object’s position is also moved at the end, the angle calculation assumes the Object is stationary and generates a larger angle (e.g., perhaps newDisplacement should be something like currentPos - (transform.position - m_Body.velocity * Time.deltaTime)). However, removing the Object’s movement from the final position of the Lifter seemingly has no effect.

I’d love a sanity check on the code or the approach, or advice on whether I should tackle this problem from a different perspective entirely.

raycasting – How can I access the normal map value in Unity?

I’m simulating laser scanners in Unity with Raycasts, and I’ve got some “bumpy” items I’d like to scan.

I had tried generating the actual geometry for these objects, but there are tons of these objects and the polygon count exploded.

What I would like to do is something like described here to get the color from a raycast intersection:

Renderer renderer = raycastHit.collider.GetComponent<MeshRenderer>();
Texture2D texture2D = renderer.material.mainTexture as Texture2D;
Vector2 pCoord = raycastHit.textureCoord;
pCoord.x *= texture2D.width;
pCoord.y *= texture2D.height;

Vector2 tiling = renderer.material.mainTextureScale;
Color color = texture2D.GetPixel(Mathf.FloorToInt(pCoord.x * tiling.x) , Mathf.FloorToInt(pCoord.y * tiling.y));

But instead of getting the color, I’d like to access the normal. I used grayscale bump maps to simulate height differences, but the raycasting doesn’t take the normal map into account, and I can’t figure out how to access the normal map’s values. Ideally there’d be some texture2D.GetNormal() I could use in place of the texture2D.GetPixel() in the snippet above, but I can’t seem to find anything anywhere on getting at the material’s normals.
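One possible direction, as a sketch only: URP’s Lit shader exposes its normal map under the “_BumpMap” property, so it can be fetched with Material.GetTexture and sampled the same way as mainTexture, then unpacked from color range to a tangent-space vector. Two caveats: the texture must be imported with Read/Write enabled, and textures imported as “Normal map” are often stored compressed and swizzled, so GetPixel on them may not return the raw normal without an uncompressed import.

```csharp
using UnityEngine;

// Sketch: sample a material's normal map at a raycast hit.
// "_BumpMap" is the URP Lit shader's normal map property name;
// assumes the texture is readable and stored uncompressed.
public static class NormalMapSampler
{
    public static Vector3 SampleTangentNormal(RaycastHit hit)
    {
        Renderer renderer = hit.collider.GetComponent<MeshRenderer>();
        Texture2D normalMap =
            renderer.material.GetTexture("_BumpMap") as Texture2D;

        // textureCoord is already in [0,1] UV space, so bilinear
        // sampling avoids the manual width/height scaling.
        Vector2 uv = hit.textureCoord;
        Color c = normalMap.GetPixelBilinear(uv.x, uv.y);

        // Unpack from [0,1] color range to a [-1,1] tangent-space normal.
        Vector3 n = new Vector3(c.r * 2f - 1f, c.g * 2f - 1f, c.b * 2f - 1f);
        return n.normalized;
    }
}
```

Note this yields a tangent-space normal; to use it in world space it would still have to be rotated by the tangent basis of the triangle the ray hit.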

I’m using Unity 2019.4, with the Universal Render Pipeline at the moment.

Unity: What’s the difference between SRP batching & GPU Instancing?

I’ve been researching a lot about draw calls and how to reduce them. I’m using GPU instancing and it’s amazing, but I did not succeed in making the SRP Batcher work.

  1. What’s the difference between the two, and when should each be used?

  2. Can they be used together?

unity – Understanding Physics Materials

I have been reading the documentation about physics materials, and it really seems very incomplete to me. When I try to work with it, I get so many questions… Can someone maybe help me wrap my head around how it works?

  • Why is there a “physics material” field on the Rigidbody2D, but also another on the BoxCollider2D?
  • Do they work differently depending on where you put the material?
  • What happens if you put different materials with different settings, one on the Rigidbody and one on the collider?
  • If a player is standing on a platform and you want it to bounce or slide… do you set that on the player, or on the platform?

c# – unity cinemachine 2d

I’m making a 2D game in Unity and using Cinemachine to have the camera follow the character, but I’m not sure how to make the camera lock in place if the player tries to go backward through the level, so that it only follows them forward.

collision detection – How to make pathfinding agents avoid each other using A* in Unity?


unity – Having multiple different friction values on a single piece of track?

I am trying to figure out a good/performant way to give a piece of track multiple values of dynamic friction. A track piece is 10×10 units in Unity, and the idea is to puzzle the pieces together into a big track, like the tracks in Trackmania. The game itself features no acceleration except from gravity. To make things more interesting, I want to give each track piece multiple random friction values, within a given range, at the start of the scene.

Currently I achieve this by making each piece of track consist of 100 little tiles of 1×1 units, which get random friction values assigned. This approach works somewhat fine as long as there are fewer than 30 to 40 track pieces (3,000–4,000 tiles), but beyond that the FPS drops really low. As it is a racing type of game, I had to set the “Fixed Timestep” in the project settings to 0.001 to get accurate time measurements, and this is hurting performance as well. With a lower timestep, the collision detection with all the little tiles is really bad, too.
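For reference, the per-tile randomization described above can be sketched like this; the field names, the friction range, and the tiles-as-child-colliders setup are assumptions, not the poster’s actual code:

```csharp
using UnityEngine;

// Sketch: give each child tile's collider its own random dynamic
// friction at scene start. Names and ranges are illustrative.
public class TrackPiece : MonoBehaviour
{
    public float minFriction = 0.1f;
    public float maxFriction = 0.9f;

    void Start()
    {
        foreach (Collider tile in GetComponentsInChildren<Collider>())
        {
            // Assigning to .material (not .sharedMaterial) gives each
            // tile its own PhysicMaterial instance, so tiles don't
            // overwrite each other's friction.
            PhysicMaterial mat = new PhysicMaterial();
            mat.dynamicFriction = Random.Range(minFriction, maxFriction);
            tile.material = mat;
        }
    }
}
```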

Is there a more elegant and/or performant way to achieve this within the Unity physics system?

Unity – Cube edges are jagged

I am new to Unity. I have a basic setup following Brackeys’ tutorial. In game mode, I see that the red cube has jagged edges. I tried increasing the antialiasing in Unity to 8x, but it did not work. I also tried turning on antialiasing in my Nvidia control panel, but the issue still persists.


unity – How can I create a mesh of triangles made from the vertices of a polygon and contained within that polygon?

I’m trying to programmatically generate a mesh, given the vertices of a polygon, that is contained within that polygon. For example, starting with these points:

[image: points solo]

I would want to generate a mesh similar to this:

[image: points preferred]

My first port of call was a Delaunay triangulation, but having implemented it, I came up against the problem that the result was not contained within the polygon, i.e. it produced a mesh like this:

[image: delaunay]

My question, then, is: is there either an algorithm that produces the desired result rather than the Delaunay result (because I can’t find one online), or a modification I can make to the Delaunay triangulation algorithm that will fix this?

I’m using the Bowyer-Watson algorithm, by the way. Thank you!