Shader Forge. Quite often when performing blending I need to apply the same operations to all four channels, and while I can do that by running the operations on the RGB and on the Alpha separately, that seems needlessly repetitive and inefficient.
This could be sorted by adding an 'RGBA' output to the Texture2D node, although some changes to nodes such as Component Mask would also likely be needed so they can act on Vector4s.

You can already do that, and the Component Mask supports 4-component vectors too.

Ah, excellent! I had used the Append node before but hadn't thought of using it for this.

I use a lot of information in my alpha channels, so I use this trick a lot too - I must say that having the option to get the full Vector4 directly would be nice.
Mostly when mixing two or more textures together to get a mixed RGBA to use again further down the chain. For instance, today I did a splat-map shader where I wanted to multiply each channel of the splat map with a uniquely tiled texture to get the resulting RGBA. Since the alpha channel of the tiling textures contained spec values, I wanted it to get the same multiplication, so the spec values could be obtained with the same splatting. In the same shader I wanted to lerp two color inputs together, and I also wanted the resulting alpha value to affect the shader.
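The splat blend just described - each splat channel weighting a full RGBA sample, alpha included - boils down to a weighted sum over four-component vectors. A minimal sketch in JavaScript (illustrative names, not Shader Forge nodes):

```javascript
// Sketch of the splat blend described above: each channel of the splat map
// weights one RGBA tile sample, and the alpha channel (spec value) gets
// exactly the same weighting as the colors.
function splatBlend(splat, tiles) {
  // splat: [r, g, b, a] weights; tiles: array of four [r, g, b, a] samples.
  const out = [0, 0, 0, 0];
  for (let t = 0; t < 4; t++) {
    for (let c = 0; c < 4; c++) {
      out[c] += splat[t] * tiles[t][c]; // same multiply applied to RGB and A
    }
  }
  return out;
}
```

Because the whole operation stays in four components, the spec value in alpha rides along with the colors, which is exactly why keeping the data as a Vector4 avoids a duplicate node chain.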
I could of course use an extra chain with the same nodes for just the alpha, but since I often want to do more things to the information, such as adding it together or multiplying it by a value, keeping it as a Vector4 is simpler and uses as few nodes as possible on the board. It's not a biggie using an Append node, but since the data is an RGBA to begin with, it feels a bit weird that I need to append the data together first.

For testing, I visualized the texture via DrawTexture.
It works in the editor, but when I build the game for Windows 64-bit, the texture has the correct dimensions yet is blank.
I read about a bug suggesting that I shouldn't use anti-aliasing, but that doesn't help me.
Can a Texture2D be created at runtime from a snapshot of a RenderTexture? The code in question is roughly:

textureB.ReadPixels(new Rect(0, 0, textureB.width, textureB.height), 0, 0);
textureB.Apply();

Thanks for help, folks!

I don't know, but perhaps your render texture format differs between standalone and the editor? Perhaps you could try a different texture format?
The shader code:

The strange thing is that if I change all the places in the code back from Texture2D to Texture1D, and the sampler in the shader back to 1D, everything works great. Does anyone have any idea what the issue could be?
The problem might be there. It is loaded from a text file, and I checked with a debugger - the values in it are OK.
Resource data formats include fully-typed and typeless formats. A list of modifiers at the bottom of the page more fully describes each format type.
All 0's maps to 0.0f, and all 1's maps to 1.0f. The sequence of unsigned integer encodings between all 0's and all 1's represents a nonlinear progression in the floating-point interpretation of the numbers between 0.0f and 1.0f. Typeless formats are designed for creating typeless resources; that is, a resource whose size is known but whose data type is not yet fully defined.
When a typeless resource is bound to a shader, the application or shader must resolve the format type which must match the number of bits per component in the typeless format. A typeless format contains one or more subformats; each subformat resolves the data type.
For example, in the R32G32B32 group, which defines types for three-component 32-bit data, there is one typeless format and three fully typed subformats.

One format supports 32-bit depth and 8-bit stencil, with 24 bits unused. Another supports a 32-bit red channel, with 8 bits unused and a further 24 bits unused. A third has 32 bits unused, 8 bits for the green channel, and 24 bits unused.
There are no sign bits; there is a 5-bit biased (15) exponent for each channel, a 6-bit mantissa for R and G, and a 5-bit mantissa for B. This format has a 24-bit red channel and 8 bits unused. This format has 24 bits unused and an 8-bit green channel. There is no sign bit; there is a shared 5-bit biased (15) exponent and a 9-bit mantissa for each channel. One view provides a straightforward mapping of the entire surface.
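The shared-exponent layout described above (a shared 5-bit biased-15 exponent and a 9-bit mantissa per channel, commonly known as the RGB9E5 layout) decodes with a small piece of arithmetic. A sketch of that math in JavaScript - an illustration of the encoding, not a DXGI API call:

```javascript
// Decode three 9-bit mantissas that share one 5-bit exponent (bias 15).
// Each channel value = mantissa / 2^9 * 2^(exponent - 15).
function decodeSharedExponent(rm, gm, bm, exp) {
  const scale = Math.pow(2, exp - 15) / 512; // 512 = 2^9
  return [rm * scale, gm * scale, bm * scale];
}
```

Because the exponent is shared, the three channels trade individual dynamic range for a compact 32-bit footprint.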
Width and height must be even. Passing in a 1-channel format compatible with the Y plane maps only the Y plane. Passing in a 2-channel format compatible with the UV planes maps the U and V planes together as a single resource view. The runtime does not enforce that the lowest 6 bits are 0, given that this video resource format is a 10-bit format that uses 16 bits.
If required, application shader code would have to enforce this manually. This format is subsampled where each pixel has its own Y value, but each 2x2 pixel block shares a single U and V value.
The runtime requires that the width and height of all resources created with this format are multiples of 2. It also requires that the left, right, top, and bottom members of any RECT used with this format are multiples of 2. Applications cannot use the CPU to map the resource and then access the data within the resource. You cannot use shaders with this format. Because of this behavior, legacy hardware that supports a non-NV12 layout (for example, YV12 and so on) can be used.
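The 4:2:0 subsampling described above - a full-resolution Y plane plus one U and one V value per 2x2 pixel block - fixes the plane sizes by simple arithmetic. A sketch of that math (illustrative, assuming the NV12-style layout with even width and height as the runtime requires):

```javascript
// Compute plane sizes for a 4:2:0 layout: a full-resolution Y plane and an
// interleaved UV plane at half resolution in each dimension.
function nv12PlaneSizes(width, height) {
  const yPlane = width * height;                  // one byte per pixel
  const uvPlane = (width / 2) * (height / 2) * 2; // one U + one V per 2x2 block
  return { yPlane, uvPlane, total: yPlane + uvPlane };
}
```

This is why the total size works out to 1.5 bytes per pixel, half the cost of a full-resolution chroma layout.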
I'm using SlimDX for a game that my team is making and I've run into an issue.
I've searched for answers to my issue and this is as far as I have gotten. I believe that the texture's data is being set, but I can't be sure because nothing is being displayed. I know that my renderer works, as I can load a texture from a file perfectly fine, but there seems to be a problem that I cannot find. Thanks for your help in advance!

I've had the code right all along. Thanks to all that viewed, however.
The relevant code:

desc2.SampleDescription = new SampleDescription(1, 0);
desc2.Usage = ResourceUsage.Dynamic;
desc2.BindFlags = BindFlags.ShaderResource;

DataRectangle rect = texture.Map(0, MapMode.WriteDiscard, MapFlags.None);
long rowStart = y * rect.Pitch;
rect.Data.Seek(rowStart, System.IO.SeekOrigin.Begin);
rect.Data.WriteByte((byte)color.Red);
rect.Data.WriteByte((byte)color.Green);
rect.Data.WriteByte((byte)color.Blue);

It can be useful to see the pixel history in cases where you are getting unexpected behaviour, such as nothing appearing on the screen.
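The Seek/WriteByte pattern in the snippet above depends on the row pitch: a mapped texture's rows may be padded beyond width * bytesPerPixel, so each pixel's byte offset must be computed from the pitch, not the width. A sketch of that offset math (illustrative names, not the SlimDX API):

```javascript
// Byte offset of pixel (x, y) in a mapped texture whose rows are `pitch`
// bytes apart. Using y * width * bytesPerPixel instead is the classic bug
// that skews or blanks the image when pitch != width * bytesPerPixel.
function pixelOffset(x, y, pitch, bytesPerPixel) {
  return y * pitch + x * bytesPerPixel;
}
```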
I can see you asked this a while ago, but I'm new to DirectX and SlimDX and was wondering if you could let me know how you initialized your device for this.

Now that our sample program has a rotating 3D cube, let's map a texture onto it instead of having its faces be solid colors. The first thing to do is add code to load the textures.
In our case, we'll be using a single texture mapped onto all six sides of our rotating cube, but the same technique can be used for any number of textures. Because the image has to be downloaded, we initially fill the texture with a single opaque blue pixel; this makes the texture immediately usable as a solid blue color even though it may take a few moments for our image to download.
At that point we again call texImage2D, this time using the image as the source for the texture. After that, we set up filtering and wrapping for the texture based on whether or not the downloaded image was a power of 2 in both dimensions. This allows non-power-of-two (NPOT) textures at the expense of mipmapping, UV wrapping, UV tiling, and your control over how the device will handle your texture. With these parameters, compatible WebGL devices will automatically accept any resolution for that texture up to their maximum dimensions.
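The power-of-two check behind that decision, and the parameter choice it implies, can be sketched like this. Parameter names are returned as strings so the decision logic can be shown without a WebGL context (in real code they would be passed to gl.texParameteri):

```javascript
// A value is a power of two exactly when it has a single bit set.
function isPowerOf2(value) {
  return (value & (value - 1)) === 0;
}

// Choose sampling parameters: mipmapped, repeat-capable sampling for
// power-of-two textures; clamped linear filtering for NPOT ones.
function chooseTexParams(width, height) {
  if (isPowerOf2(width) && isPowerOf2(height)) {
    return { generateMipmaps: true, wrap: 'REPEAT', minFilter: 'LINEAR_MIPMAP_LINEAR' };
  }
  // NPOT: no mipmaps, no wrapping/tiling, per the limitations described above.
  return { generateMipmaps: false, wrap: 'CLAMP_TO_EDGE', minFilter: 'LINEAR' };
}
```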
At this point, the texture is loaded and ready to use. But before we can use it, we need to establish the mapping of the texture coordinates to the vertices of the faces of our cube. This replaces all the previously existing code for configuring colors for each of the cube's faces in initBuffers. The textureCoordinates array defines the texture coordinates corresponding to each vertex of each face.
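For a single face, the textureCoordinates array described above can be sketched as follows - a minimal excerpt, not the full six-face array from the tutorial:

```javascript
// Texture coordinates for one face of the cube: each pair maps a vertex
// to a corner of the texture's unit square.
const faceTextureCoordinates = [
  0.0, 0.0, // bottom-left
  1.0, 0.0, // bottom-right
  1.0, 1.0, // top-right
  0.0, 1.0, // top-left
];
```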
Note that the texture coordinates range from 0.0 to 1.0. We need to replace the vertex shader so that instead of fetching color data, it fetches the texture coordinate data.
Instead of assigning a color value to the fragment's color, the fragment's color is computed by fetching the texel (that is, the pixel within the texture) based on the value of vTextureCoord, which, like the colors, is interpolated between vertices.
WebGL provides a minimum of 8 texture units; the first of these is gl.TEXTURE0. We tell WebGL we want to affect unit 0.

Tainted (write-only) 2D canvases can't be used as WebGL textures. Note: CORS support for cross-domain videos uses the crossorigin attribute on the <video> element, which embeds a media player supporting video playback into the document.
Using textures in WebGL.

Discussion in 'Scripting' started by rom, Dec 1.
Any ideas here? I am using C#.

Eric5h5: Convert the byte array to a Color array and use SetPixels, perhaps?

Sorry to revive this old thread, but I couldn't find a more recent one about this very same topic.
Is there now a method to DecodePNG included in Unity, I mean? Thanks in advance.

EDIT: hey, wait a minute, there is Texture2D.LoadImage(byte[] bytes) to achieve this. Sorry for the inconvenience!
From which Unity version was ImageConversion introduced?