So I had a quick idea I wanted to play around with: using render textures with cameras and projecting the captured texture onto a mesh in the scene. This isn't a new technique, but it did spark a unique idea for a gameplay element.
This was actually really straightforward to set up; here are the steps I took:
- Created a simple cube in Blender, went into the UV editor, and did a Smart UV Project with an island spacing of 0.06
- Moved all the faces except one outside the UV image area and scaled the remaining face up to fill the UV space
- Created a new scene in Unity
- Created a new render texture
- Added an additional camera to the scene and positioned it where I wanted to capture
- In that camera's Target Texture field, I assigned the render texture I created in step 4
- I created a new material called “spycam” and set the Albedo texture to the render texture
- Added the cube from step 1 to the scene and assigned the spycam material to it
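The camera and material wiring from the steps above can also be done at runtime. Here's a minimal sketch; `SpyCamSetup` is a hypothetical helper script of mine, and the texture size is arbitrary, but `Camera.targetTexture` and `Material.mainTexture` (which maps to the Standard shader's Albedo) are the real Unity APIs involved:

```csharp
using UnityEngine;

// Hypothetical helper: wires a capture camera to a render texture and
// projects the result onto a target mesh's material.
public class SpyCamSetup : MonoBehaviour
{
    public Camera spyCamera;    // the extra camera added to the scene
    public Renderer targetMesh; // the cube imported from Blender

    void Start()
    {
        // 1024x1024 with a 24-bit depth buffer; pick whatever resolution fits
        var rt = new RenderTexture(1024, 1024, 24);

        // The camera now renders into the texture instead of the screen
        spyCamera.targetTexture = rt;

        // mainTexture is the Standard shader's Albedo (_MainTex) slot
        targetMesh.material.mainTexture = rt;
    }
}
```

Doing it in script is handy if you want to spawn multiple spy cams or swap feeds during gameplay rather than assigning everything in the Inspector.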
Overall really cool, and it even captures "recursively": if the spy camera can see the cube, you can see the nested layers of rendering on the projected faces.
Oh, and a nice bonus: you can apply different post-processing effects to the camera that renders to the texture than to the main camera.
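One way to get per-camera effects, assuming the Post Processing Stack v2 package: point each camera's `PostProcessLayer` at a different volume layer so each one only picks up its own volumes. The layer names here ("MainFX", "SpyFX") are made up for the example:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Sketch: each camera already has a PostProcessLayer component attached;
// giving them different volume layer masks means the spy cam can run, say,
// a grainy CCTV look while the main camera stays clean.
public class PerCameraEffects : MonoBehaviour
{
    public Camera mainCamera;
    public Camera spyCamera;

    void Start()
    {
        // "MainFX" and "SpyFX" are assumed layer names set up in the project
        mainCamera.GetComponent<PostProcessLayer>().volumeLayer = LayerMask.GetMask("MainFX");
        spyCamera.GetComponent<PostProcessLayer>().volumeLayer = LayerMask.GetMask("SpyFX");
    }
}
```

Then put each camera's `PostProcessVolume` objects on the matching layer and the two feeds render with completely independent effect stacks.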