Is there a cross-platform API / library for rendering on Xbox / PS4?

I'm looking to create a game engine and I'm having trouble finding a graphics / rendering library that also supports PS4 and Xbox.

Do I have to create a different rendering system for each platform or is there a library I can use?

Thank you so much!

client-side rendering – Making a column mandatory in the SharePoint 2013 Datasheet view

I am currently trying to make a specific column mandatory in SharePoint datasheet mode.
SharePoint default validation at the column or list level is NOT an option.

I already have the code to look up the correct GUID of the view and so on, but I can't find any guidance on how to make a specific column mandatory in a specific view.

Ideally I could inject the JS at the master page level, but a solution via JSLink would also be fine.

I have been trying for days now, but I can't find the right place to hook in.

I hope it is clear what I am trying to do.

opengl – Rendering to framebuffer object does not work

I have a problem when trying to render to a texture through a framebuffer object. The framebuffer reports as complete, but all I get is a black texture.

This is the code that creates the FBO:

void CubeMap::createEmptyCubeMap(int size) {
    this->size = size;

    // create texture
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_CUBE_MAP, textureID);


    // Allocate space for each side of the cube map
    // RGBA color texturing
    for (GLuint i = 0; i < 6; i++)
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_RGBA8, size,
            size, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // create the framebuffer object
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);


    glGenRenderbuffers(1, &depthBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, this->size, this->size);

    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);


    // attach color
    glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, textureID, 0);
    glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, depthBuffer, 0);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}


and here is where I render to it:

void CubeMap::renderEnviromentMap(glm::vec3 center, CubeMap obj, Shader* shader) {

    CubeMapCamera camera = CubeMapCamera(center);

    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, this->size, this->size);

    for (int i = 0; i < 6; i++) {

        obj.renderCubeMap(shader);  //render random object for testing

    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glViewport(0, 0, screenWidth, screenHeight);
}

Does anyone know what I am doing wrong?

How to control the rendering order of `Graphics3D` to prevent or encourage overlap?

I make my graphics in Mathematica using the Wolfram Language. It works pretty well, but I have run into a problem; resolving it would make the process much more enjoyable.

In the following code, I make a Cuboid and then place some Graphics3D primitives around it. In the crowded region where the graphics intersect, problems with the rendering order become obvious.

Graphics3D[{{Yellow, Opacity[0.25], 
   Cuboid[{-4, -1.7, -1}, {4, 1.7, 0.}]},

  {Dashed, Thickness[0.01], 
   Arrowheads[0.05, Appearance -> "Projected"], 
   Arrow[Line[{{0, 0, 0}, {5.4, 0, 0}}]], 
   Arrow[Line[{{0, 0, 0}, {0, 3.4, 0}}]], 
   Arrow[Line[{{0, 0, 0}, {0, 0, 2.5}}]]},

  {Black, Cuboid[{-3.4, -2.6, 0}, {-3.6, 2.6, 0}], 
   Cuboid[{3.4, -2.6, 0}, {3.6, 2.6, 0}]},

  {Thickness[0.0325], Green, Arrowheads[0.125], 
   Arrow[Tube[{{0, 0, 0}, {4.75, 0, 0}}, .1]]},

  {Thickness[0.02], Red, Arrowheads[0.09],
   Arrow[Tube[{{0, 0, 0}, 2.75 {1, 0, 0}}, 0.1]]}},


 Boxed -> False, ImageSize -> 1000, ViewAngle -> Automatic, 
 ViewCenter -> {0.5`, 0.5`, 0.5`}, ViewMatrix -> Automatic, 
 ViewPoint -> {1.3135668740464197`, -2.8819650173048537`, 
   1.1911421856516324`}, ViewProjection -> Automatic, 
 ViewRange -> All, ViewVector -> Automatic, 
 ViewVertical -> {0.07086652061143876`, -0.17857804014486425`, …}]


The goal is to deliberately prevent or encourage the overlap of certain Graphics3D primitives.

Server-side rendering of HTML components updated by third parties

Suppose I want to include a third-party HTML component in my site …

I know I can just include a …

opengl – Geometric rendering problem

I am currently working on a 3D game engine with OpenGL 4 and C++. The problem is that, for some reason I can't identify, my geometry is not rendered correctly, except for primitives.


On the right you can see a cube; it is rendered as expected.
In the center you can see a mesh that I created in Blender; it is not rendered as expected. On the left you can see my mesh rendered as expected, BUT I had to adjust the mesh's Z scale (and I shouldn't have to).

So, in short: my meshes are not correctly proportioned.

I checked the coordinates of each vertex in Blender and in my project: they are the same.

I don't know if there is a problem with my matrices; the only thing I know is that the problem only appears on the Z axis (Z is the up axis).

Everything suggests that a number is being rounded somewhere, but I cannot see where.

Here is some code that may be useful:

Mesh rendering code:

void RD_Mesh::render(RenderMode rndrMode) {
    if (rndrMode == RenderMode::Filled) {
        glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    } else {
        glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    }

    glm::mat4 mdl = glm::mat4(1.0f); //Declaring Model Matrix

    glm::mat4 translate = glm::mat4(1.0f);
    glm::mat4 scale = glm::mat4(1.0f);
    glm::mat4 rotation = glm::mat4(1.0f);

    translate = glm::translate(translate, glm::vec3(m_position.getX(), m_position.getY(), m_position.getZ()));

    scale = glm::scale(scale, glm::vec3(m_scale.getX(), m_scale.getY(), m_scale.getZ()));

    rotation = glm::rotate(rotation, glm::radians(m_rotation.getX()), glm::vec3(1.0f, 0.0f, 0.0f));
    rotation = glm::rotate(rotation, glm::radians(m_rotation.getY()), glm::vec3(0.0f, 1.0f, 0.0f));
    rotation = glm::rotate(rotation, glm::radians(m_rotation.getZ()), glm::vec3(0.0f, 0.0f, 1.0f));

    mdl = translate * rotation * scale;

    m_shader->SetMatrix("model", mdl);

    glDrawElements(GL_TRIANGLES, RAWindices.size(), GL_UNSIGNED_INT, 0);
}

Camera code:

void RD_Camera::SetupCamera() {
    projection = glm::perspective(glm::radians(FOV), (float)m_rndr->getWindowWidth() / m_rndr->getWindowHeigh(), m_near, m_far); //Projection matrix

    view = glm::lookAt(glm::vec3(m_pos.getX(), m_pos.getY(), m_pos.getZ()), glm::vec3(m_subject.getX(), m_subject.getY(), m_subject.getZ()), glm::vec3(0.0f, 0.0f, 1.0f)); //View matrix

    m_rndr->GetCurrentShader()->SetMatrix("projection", projection);
    m_rndr->GetCurrentShader()->SetMatrix("view", view);
    m_rndr->GetCurrentShader()->SetVec3("CamPos", m_pos);
}

My Vertex Shader:

#version 450 core

layout (location = 0) in vec3 aPos;
layout (location = 1) in vec3 aNormal;

out vec3 Normal;
out vec3 FragPos;

uniform mat4 projection;
uniform mat4 view;
uniform mat4 model;

void main() {
    gl_Position = projection * view * model * vec4(aPos, 1.0);

    Normal = normalize(mat3(transpose(inverse(model))) * aNormal);
    FragPos = vec3(model * vec4(aPos, 1.0));
}

PS: I am sorry for my English; it is not my mother tongue.
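Since the symptom is an axis-specific scale change, one thing worth double-checking is the composition order in `mdl = translate * rotation * scale`. As a minimal illustration (plain C++ with a hypothetical `Vec3`, not glm, and independent of the engine code above), swapping the order visibly changes where a scaled vertex ends up:

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

// Apply a non-uniform scale to a point.
Vec3 scale(Vec3 p, Vec3 s) { return { p.x * s.x, p.y * s.y, p.z * s.z }; }

// Apply a translation to a point.
Vec3 translate(Vec3 p, Vec3 t) { return { p.x + t.x, p.y + t.y, p.z + t.z }; }

int main() {
    Vec3 p = { 0.0, 0.0, 1.0 };   // a vertex on the unit cube
    Vec3 s = { 1.0, 1.0, 2.0 };   // non-uniform Z scale
    Vec3 t = { 0.0, 0.0, 3.0 };   // move 3 units along Z

    // T * S * p: scale first, then translate (the order in the post).
    Vec3 a = translate(scale(p, s), t);   // z = 1*2 + 3 = 5
    assert(a.z == 5.0);

    // S * T * p: translate first, then scale; the translation itself
    // gets scaled, which shows up as an axis-specific distortion.
    Vec3 b = scale(translate(p, t), s);   // z = (1+3)*2 = 8
    assert(b.z == 8.0);
    return 0;
}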

pbr – How to apply filmic / ACES tone mapping during rendering (before post-processing)

Tone mapping, as I understand it, maps the light levels rendered along a curve, so that very bright parts of the scene can always have details and not be blown out (all white) by bright light.

I see in BabylonJS that the "post-processing" pipeline has a tone-mapping stage with an optional ACES mode (similar to Unreal). However, from what I understand, there is a huge difference between taking a totally overexposed picture and then "post-treating" it, versus mapping the actual lit color values onto a curve BEFORE outputting the color.

So in the babylonjs examples, I see this code:

var pipeline = new BABYLON.DefaultRenderingPipeline("default", true, scene);
scene.imageProcessingConfiguration.toneMappingEnabled = true;
scene.imageProcessingConfiguration.toneMappingType = BABYLON.ImageProcessingConfiguration.TONEMAPPING_ACES;
scene.imageProcessingConfiguration.exposure = 1;

but I don't want to apply a curve to the color values AFTER the information has already been lost to clipping. I want to apply bright lighting and still display the full range of detail.

So, do we have "real" tone mapping in BJS, where the values are computed as floats and mapped onto a curve, or do we only get "here are the RGB 0–255 pixels" and then somehow modify them with a post-process "curves" function?

So the question is: how do you make tone mapping happen on the linear float color values, not as a post-process?

unity – The render texture shows the opposite view of the camera when rendering?

I use a render texture (cube) to render what the camera sees. However, the render texture shows me the opposite side of the camera's view (inverted) instead of what the camera sees. Is there a solution for this?

    void Update()
    {
        renderTexture.isPowerOfTwo = true;
        //Graphics.SetRenderTarget(renderTexture, 0, CubemapFace.PositiveZ);
        renderTexture.dimension = TextureDimension.Cube;
        camera.RenderToCubemap(renderTexture, 63);
    }

macos – HTML rendering in Quick Look to display MathJax equations

I write homework for a statistics class. Students knit Rmd files to HTML. For convenience, I want to use Quick Look to read the files so that I don't have to open all of them. Unfortunately, MathJax equations don't render and just show up as plain text.

Is there a Quick Look plugin that I can use to render MathJax?

If not, could you point me to another quick-preview app that doesn't require me to open each HTML file in a "real" browser?

c# – Unity performance issues when rendering items

I am trying to get the positions of objects in the game world and display them on screen. I have managed to get this working, but I seem to be experiencing major performance issues when I enable this feature: I lose about 40 to 50 FPS. I have a feeling this is related to iterating through the array and doing the transformation in the OnGUI() lifecycle method, which is called multiple times per frame, but I'm not entirely sure. Any pointers would be appreciated; I'm still very new to game development.

Code in question:

private void OnGUI()
{
    if (showItems)
    {
        // … (body elided in the original post)
    }

    if (showItems && Time.time >= itemNextUpdateTime)
    {
        lootItems = UnityEngine.Object.FindObjectsOfType<LootItem>();
        itemNextUpdateTime = Time.time + itemUpdateInterval;
    }
}

Draw method:

public void DrawLoot()
{
    foreach (LootItem item in lootItems)
    {
        if (item == null || item.Item == null || item.Item.ShortName == string.Empty)
            continue;

        var pos = Camera.main.WorldToScreenPoint(item.transform.position);
        float itemDistance = Vector3.Distance(Camera.main.transform.position, item.transform.position);
        Vector3 vector = new Vector3(pos.x, pos.y, pos.z);
        if (itemDistance <= maxLootDrawingDistance && vector.z > 0.01)
        {
            string itemLabel = $"{item.Item.ShortName.Localized()} - {itemDistance}";
            GUI.Label(new Rect(vector.x - 50f, Screen.height - vector.y, 100f, 50f), itemLabel);
        }
    }
}