Draw on Render Target Texture

Hello everyone!

I need to store data in a texture in order to send it to a material, but I have no idea how to draw custom pixels into one; the only thing I know how to render into a texture is a camera view…

There is this Canvas functionality inside Blueprints, but I do not understand how to work with it, or whether it can draw primitives. Maybe some of you do?

So… is there any way to draw into textures and send them into a material? If not, is there any way to dynamically send an unknown number of vectors to a material instance? I ask because I work with HLSL inside the material, and I need to get an array of vectors from my Blueprint to loop over in the shader.


Hello, AlFlakky,

In case you still need it or somebody else stumbles on this question, it is possible by using a CanvasRenderTarget2D, which is a render target that you can create dynamically inside blueprints. You first create an asset for it (it is a blueprint class). Inside that class, you can add the functionality you want for when it updates.
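In case a code-side picture helps, the Blueprint class corresponds roughly to a UCanvasRenderTarget2D subclass like the sketch below. Everything named "My…" is my own invention for illustration; the base class and the update delegate are the real engine API.

```cpp
// A hypothetical C++ counterpart of the CanvasRenderTarget2D Blueprint class.
#pragma once

#include "CoreMinimal.h"
#include "Engine/CanvasRenderTarget2D.h"
#include "MyPaintCanvas.generated.h"

UCLASS()
class UMyPaintCanvas : public UCanvasRenderTarget2D
{
    GENERATED_BODY()

public:
    // Handler for OnCanvasRenderTargetUpdate, the C++ side of the
    // Blueprint "Receive Update" event (bound after creation, see below).
    UFUNCTION()
    void OnUpdate(UCanvas* Canvas, int32 Width, int32 Height);

    // Example of a variable that gets set "from the outside".
    UPROPERTY()
    FVector2D BrushPosition = FVector2D::ZeroVector;
};
```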

Then, in another blueprint, you use a Create Canvas Render Target node, selecting the class you created and setting its size, and save a reference to it in a variable. Whenever you want to change its contents, you call the Update Resource node on the reference you saved. And if you want to pass some variable to it (a float or 2D vector to change the position you are writing to, or even a dynamic material instance), you can cast to your CanvasRenderTarget2D class and set a variable or call an event within it (although the changes you make will only be reflected the next time you update the resource).
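In C++ terms (again just a sketch, reusing the hypothetical UMyPaintCanvas from above, with PaintCanvas being an assumed UPROPERTY on an actor), the create-and-update flow would look something like this:

```cpp
// Equivalent of the Create Canvas Render Target node, e.g. in BeginPlay.
PaintCanvas = Cast<UMyPaintCanvas>(
    UCanvasRenderTarget2D::CreateCanvasRenderTarget2D(
        this, UMyPaintCanvas::StaticClass(), 1024, 1024));

// Bind the update handler (the Blueprint class gets this wiring for free).
PaintCanvas->OnCanvasRenderTargetUpdate.AddDynamic(
    PaintCanvas, &UMyPaintCanvas::OnUpdate);

// Set any variables first, like the cast-and-set described above...
PaintCanvas->BrushPosition = FVector2D(512.f, 512.f);

// ...then trigger the update event, the equivalent of the Update Resource
// node; changes only show up after this call.
PaintCanvas->UpdateResource();
```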

As for how to use it as a texture inside a material, your canvas reference can plug into the Value input of a Set Texture Parameter Value node. You create a material with a Param2D node, create a Dynamic Material Instance out of that material, and then set that texture parameter with your canvas. You only need to do this once; it will reflect any changes that happen when you update your resource.
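As a sketch of the same wiring in C++ (the material, component, and parameter names are placeholders; the API calls are the real ones):

```cpp
// One-time setup: create a dynamic instance of a material that has a
// texture parameter (the Param2D, named "PaintTex" here), plug the canvas
// into that parameter, and apply the material to the mesh.
UMaterialInstanceDynamic* Mid =
    UMaterialInstanceDynamic::Create(PaintableBaseMaterial, this);
Mid->SetTextureParameterValue(TEXT("PaintTex"), PaintCanvas);
MeshComponent->SetMaterial(0, Mid);
// After this, every UpdateResource() on PaintCanvas shows up automatically.
```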

Also, you can do more than just draw primitives. Inside the CanvasRenderTarget2D blueprint you can use the Draw Texture or Draw Material nodes. Those will give you what you want (for example, drawing a texture at a certain coordinate in your render target and with a certain size; there are even blend modes).
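For reference, Draw Texture and Draw Material are exposed in C++ as K2_DrawTexture and K2_DrawMaterial on UCanvas. A possible body for the update handler from earlier (BrushTexture and BrushMID being assumed members of the class):

```cpp
void UMyPaintCanvas::OnUpdate(UCanvas* Canvas, int32 Width, int32 Height)
{
    // Draw a texture at a position, with a size, UV window, tint, and blend mode.
    Canvas->K2_DrawTexture(BrushTexture,
                           FVector2D(100.f, 100.f),  // position in the target
                           FVector2D(64.f, 64.f),    // size in pixels
                           FVector2D(0.f, 0.f),      // UV position
                           FVector2D(1.f, 1.f),      // UV size
                           FLinearColor::White,
                           BLEND_Translucent);

    // Draw a material instead; only its emissive output gets written.
    Canvas->K2_DrawMaterial(BrushMID,
                            FVector2D(200.f, 200.f), // position in the target
                            FVector2D(64.f, 64.f),   // size in pixels
                            FVector2D(0.f, 0.f),     // UV position
                            FVector2D(1.f, 1.f));    // UV size
}
```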

As for the Draw Material node, I find it more useful than Draw Texture. Its function is not to draw all channels of a material (that wouldn’t make sense, because in the end you are just updating a Param2D in your final material, and that is plugged into just one material input). In fact, it only writes the emissive channel out to your texture. But since it can take any material as an input, including dynamic material instances whose properties you can change by casting to your canvas blueprint and setting some variables, you can do some interesting things, such as changing the tint of the texture you are drawing. That wouldn’t be possible with the Draw Texture node.
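For example, tinting the brush before an update could be as simple as this (the parameter name is an assumption):

```cpp
// Change the brush tint through the dynamic material instance, then redraw.
BrushMID->SetVectorParameterValue(TEXT("Color"), FLinearColor::Red);
PaintCanvas->UpdateResource(); // the new tint is used on this update
```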

I used this technique to make a paint brush game and it worked very well. I was even able to use a second canvas render target to paint the normals of my final material.

Here is another thread that shows a step-by-step tutorial. The first step (creating a Render Target asset) is not needed; the remaining steps are correct.

All the best.

Can you tell me more about the paint brush? I am trying to make the same thing, but I have the same problem: the Canvas Render Target 2D clears the whole texture to black on every Update Resource. How can I keep the Canvas Render Target texture and keep painting on it?

I used two Canvas Render Targets, using one as a buffer. They still get cleared on every update, but one saves the contents of the other. The logic is like this:

The first canvas saves the past frame’s result every time it’s updated, so it serves as a buffer. The second one applies the buffer first, and then paints a material at my selected location. I chose to go with the material route because that allows me to pass parameters to it and change it on the fly.

The texture that is used as an input for the applied material is the current frame result. On every tick (but it could be on every event that implies a change to the textures) I am updating both canvases, first the buffer, then the current frame.

For painting normals, you would do the same but with another set of past frame and current frame RTs, so you end up with a set of four textures.

As for how to use one canvas render target’s result inside the other, I just created a texture reference variable, and then in my original blueprint I cast to my canvas blueprint and set that variable to the other canvas whenever I need to.

Now, it seems like there are some new features in 4.13 that will make this technique much more straightforward. I recommend watching the Twitch stream; a lot of cool things are coming. 4.13 Release Preview | Feature Highlight | Unreal Engine - YouTube

If you have questions about the original technique please let me know.

Yes, the new DrawMaterialToRenderTarget should make things easier. However, I still can’t create a ‘feedback loop’ between my render targets, where one buffer keeps the last frame. My render targets keep getting cleared. =/

I’d be interested in learning more.

Hello, aoakenfo,

Here are some screenshots of my blueprints. They will be a bit dirty, because there are some things in them that are not needed for this particular functionality, but I hope they give some insight into a working solution. I will only show the part that produces the final base color, but it would be the same for the normal, just with double the render targets.

I have the following assets involved: Two CanvasRenderTarget2D blueprints (named lastFrame and newFrame), two Materials (one for my dynamic brush, and one for my paintable object), and one paintable actor.

This is the content of the Last Frame CRT2D.

Notice my newFrame variable. It is a Texture Render Target reference that will be set from the outside by casting to this class. The boolean firstRun defaults to true.

This is the content of New Frame CRT2D.

Here, the variable lastFrame_Base is also a Texture Render Target reference. What’s new is the brushBaseMaterial variable, a Dynamic Material Instance reference that will be set from the outside. This gives me the freedom to change the material I am using as a brush, should I want to.

Also, notice that I am storing the last target screen position and running a ForLoop that interpolates towards the new target position. This gives some sort of “sub-frame” painting, so that my brush stroke is smooth no matter how fast I am moving or how fast the computer is ticking. Right now it interpolates in fixed steps, but a smarter interpolation would make this responsive, adapting to how far you moved or to the delta time since the last tick.
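In rough C++ terms, the fixed-step interpolation inside the update handler would be something like this (LastBrushPos, NewBrushPos, BrushSize, and BrushMID are all assumed members):

```cpp
// Paint the brush at fixed interpolation steps between the previous and
// current target positions so fast strokes stay continuous.
const int32 NumSteps = 8; // fixed; an adaptive count would be smarter
for (int32 i = 1; i <= NumSteps; ++i)
{
    const float Alpha = static_cast<float>(i) / NumSteps;
    const FVector2D Pos = FMath::Lerp(LastBrushPos, NewBrushPos, Alpha);
    Canvas->K2_DrawMaterial(BrushMID, Pos - BrushSize * 0.5f, BrushSize,
                            FVector2D::ZeroVector, FVector2D(1.f, 1.f));
}
LastBrushPos = NewBrushPos; // remember for the next update
```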

This is the material of my brush:

Notice that I can change the color with a parameter, and that I am outputting to the emissive channel, as that is what the canvas Draw Material node uses.

And this is the blueprint of my paintable object.

This happens on Begin Play, in this order:

And this is the Update Canvas function. It holds some of the key functionality of this technique, including passing the relevant information to each render target and determining the order of the updates.

I also noticed that I gave incorrect information before: I am updating the newFrame first (so it uses the buffer), and then the buffer (so its old content is lost). Otherwise I would get a black screen. That might be what happened to you if you followed my earlier posts.
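As a sketch of the corrected Update Canvas order (everything here is hypothetical naming on top of the real UpdateResource API):

```cpp
void AMyPainter::UpdateCanvas()
{
    // Tell each render target about the other, like the casts in Blueprint.
    NewFrame->LastFrameTexture = LastFrame; // newFrame reads from the buffer
    LastFrame->NewFrameTexture = NewFrame;  // the buffer copies the new frame

    // 1) newFrame: draws the buffer as its base, then paints the brush on top.
    NewFrame->UpdateResource();

    // 2) lastFrame (the buffer): copies newFrame so the result survives
    //    the clear on the next update.
    LastFrame->UpdateResource();
}
```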

This is the Event Tick:

Initialize stroke just sets some variables for my brush material. Everything important happens again on Update Canvas.

Finally this is the material that is applied to my paintable object:

And that is all. This is a working implementation of a CanvasRenderTarget2D that can paint on itself without ever appearing to be cleared, and it also gives some insight into how to adjust the variables that affect its behavior for good user interaction in this kind of application.

Also notice that my paintable object is a Plane. Basically, it can be any object where you can make a direct mapping between the local trace hit coordinate and UV space, because in this engine version we can’t get the UV coordinates from a trace hit.
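For a plane, that direct mapping can be as simple as the sketch below. It assumes the default engine Plane mesh (100×100 units, centered pivot, UVs covering the whole face), so treat the numbers and any axis flips as things to adjust for your own mesh:

```cpp
// Convert a line-trace hit on a flat plane into 0-1 UV space.
FVector2D AMyPainter::HitToUV(const FHitResult& Hit) const
{
    // World-space hit location -> the plane's local space.
    const FVector Local =
        GetActorTransform().InverseTransformPosition(Hit.Location);

    // Local X/Y run from -50 to +50 on the default plane; remap to 0-1.
    // Depending on the mesh's UV layout you may need to flip an axis.
    return FVector2D(Local.X / 100.f + 0.5f, Local.Y / 100.f + 0.5f);
}
```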

That will change in 4.13 too, making this technique (or the one using the new functionality) more versatile for differently shaped objects.

Hope this helps!

Thanks for taking the time to put this together - much appreciated. I can’t wait to get home tonight and try this out!

So, no luck on my end. Here’s an explanation of my blueprint; perhaps you’ll see an obvious blunder.

I’ve set up a SceneCaptureComponent on my camera and passed it into the blueprint, with Capture Every Frame set to false. Every tick, UpdateCapture is called, and it alternates between the two paths.

I capture the scene into the first render target and set it as an input to the material, with the second input being the previously rendered frame.

In my case, I’m using a basic HUD to draw the texture. The last step is to use the new DrawMaterialToRenderTarget to merge the current scene with the previous one and save the result as a frame (which will also be used in the next pass).

The second path does exactly the same, but with the targets swapped. My hope was to create a “hall of mirrors” effect, smearing the accumulated buffers together. But it just seems to blend the last two frames.