Resizing Textures In Unity
It sounds silly, but it's true: Unity does not ship with a method for resampling a texture to a new size. Experienced readers might object, "What about Texture2D.Resize?" Well, it turns out Texture2D.Resize only resizes the texture container, much like trimming an array. Unlike trimming an array, though, it also leaves the pixel contents undefined. So really, all it's doing is changing the amount of memory allocated. That's all a bit disappointing, especially when you want to build a multi-resolution pipeline for mobile. The odd thing is that Unity already does this work when generating mip maps. So, let's roll up the old sleeves and figure this out.
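To make that concrete, here is a quick throwaway sketch of the problem (variable names are mine):

    // A minimal sketch: Resize changes the dimensions, but the old pixel
    // data is NOT resampled into the new size.
    Texture2D tTex = new Texture2D(1024, 1024, TextureFormat.RGBA32, false);
    tTex.Resize(512, 512);   // the container is now 512x512...
    tTex.Apply();            // ...but the pixel contents are undefined, not a scaled-down image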
We can get pixel data from a texture, check! We can alter the pixel data, check! We can put pixel data back into a texture, check! So what's left? The algorithm to resample the pixel data. Resample really is the operative word, because we are going to sample the source texture from the destination texture, pixel by pixel. This is exactly how pixel shaders work. So, we have our source texture at 1024x1024 and a destination texture at 512x512. We are going to iterate through that 512x512 grid, find the corresponding color in the source texture, and fill out the destination pixel. Now we get to words you might have heard but never known the meaning of. Bilinear sampling is the most common in games and gives a good feel for how sampling works, but we are going to start with a technique called Nearest Neighbor. Resampling is mapping one set of coordinates onto another, so first we must reduce both sets of coordinates to a common format. Enter our best friend, the ratio.
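In code, that per-pixel mapping boils down to a couple of divisions and multiplications. A rough sketch of the idea, with placeholder names rather than the final code further down:

    // For each destination pixel (xDestX, xDestY):
    float xRatioX = (float)xDestX / xDestWidth;    // reduce to 0..1
    float xRatioY = (float)xDestY / xDestHeight;
    float xSourceX = xRatioX * xSourceWidth;       // scale back up into source space
    float xSourceY = xRatioY * xSourceHeight;
    // xSourceX and xSourceY are where we sample the source texture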
We will reduce everything to 0-to-1 notation by dividing the pixel position in question by the length. E.g., x=12, y=37 ends up as x=0.0234375, y=0.072265625 for our 512 destination. Take those numbers, multiply by your source length, 1024, and you get x=24, y=74. So we take the pixel from 24,74 and put it in our destination slot 12,37. This is all well and good, but what happens when we want to go the other way? The same rules apply, but we will run into fractional pixel positions. The same is true when downscaling to a non-power-of-2 size, say 1024 to 339. If we take destination pixel x=13, y=27 when upsampling from 512 to 1024, it reduces to (0.0126953125, 0.0263671875), and sampling our source gives x=6.5, y=13.5. WTF do we do with half a pixel? Well really, that is what this article is all about. The Nearest Neighbor technique rounds the number (x=7, y=14). Bilinear interpolates between the 4 colors it is stuck between (6,13 -> 7,13 -> 7,14 -> 6,14). The Average technique takes the average of the pixels in dispute. There are tons more techniques, but we will be focusing on these. We will also be focusing on downscaling rather than upscaling, because that is more useful for creating multi-resolution pipelines for different screen sizes. Upscaled artwork, no matter how good the algorithm, will always look bad.
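If you want to sanity check those numbers, the ratio math drops straight out in a couple of lines (a throwaway snippet, not part of the final method):

    // Downscale example: destination 512, source 1024
    Debug.Log((12f / 512f) * 1024f);   // 24
    Debug.Log((37f / 512f) * 1024f);   // 74

    // Upscale example: destination 1024, source 512
    Debug.Log((13f / 1024f) * 512f);   // 6.5  -- stuck halfway between source pixels 6 and 7
    Debug.Log((27f / 1024f) * 512f);   // 13.5 -- Bilinear blends (6,13), (7,13), (7,14) and (6,14)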
Downscaling's problem is not a lack of source data, it's how we combine the source pixels to best represent what was going on in that part of the source. We have already covered Nearest Neighbor and saw that when downscaling 2X we simply throw away three out of every four source pixels. See the image below where we sample down 2X and 4X; it's not pretty.
Bilinear is good for downscaling up to 2X, but then falls down for the same reason as above: it throws away pixels without factoring them into the new image. You can see in the image below that it's just about the same as Nearest Neighbor for downscaling 2X and 4X. In fact, it is exactly the same at 2X and 4X, because every sample lands on a whole source pixel and no interpolation is done.
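You can convince yourself of that last claim with the same ratio math: at an exact 2X downscale every sample lands on a whole source pixel, so the interpolation weight is zero and the Lerp just hands back the first pixel (again, a throwaway check):

    // Destination pixel 10, downscaling 1024 -> 512:
    // source x = (x / 512) * 1024 = x * 2, always a whole number
    float xSourceX = (10f / 512f) * 1024f;              // 20
    float xRatio   = xSourceX - Mathf.Floor(xSourceX);  // 0 -- Lerp with t = 0 returns the first color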
So now we need to get clever, and the Average technique is the key. In the Average technique we … surprise, surprise … average the pixels we are reducing into a single color. To do this, we must know how large our destination pixel is relative to the source image. For 2X it's 2x2, for 4X it's 4x4; use the ratio method described above. The outcome is pretty reliable and bad ass. You can go even further and weight the pixels you are averaging, so that a pixel further from the center of the sample has less effect on the color than the pixel closest to the center.
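The code below only does the flat average, but the weighted version is a small tweak: scale each source pixel by how close it is to the center of the sample, and divide by the total weight instead of the pixel count. A rough sketch of that idea inside the Average branch of the code below, using a simple linear falloff of my own choosing (xMaxDist here would be half the sample footprint; it is not in the code below):

    // Inside the Average loop, instead of adding each pixel with equal weight:
    float xDistX = (ix + 0.5f) - vCenter.x;
    float xDistY = (iy + 0.5f) - vCenter.y;
    float xDist = Mathf.Sqrt((xDistX * xDistX) + (xDistY * xDistY));
    float xWeight = Mathf.Max(1f - (xDist / xMaxDist), 0.01f);

    oColorTemp += aSourceColor[(int)((iy * vSourceSize.x) + ix)] * xWeight;
    xGridCount += xWeight;   // then divide by the total weight instead of the pixel count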
So that's all the theory; now here is the code. Take a look through in detail and see how each mode is done. Be sure to hit me up on Twitter @dj_roeeze if you have any questions or comments.
public enum ImageFilterMode : int {
    Nearest = 0,
    Bilinear = 1,
    Average = 2
}

public static Texture2D ResizeTexture(Texture2D pSource, ImageFilterMode pFilterMode, float pScale) {

    // Get all the source pixels
    Color[] aSourceColor = pSource.GetPixels(0);
    Vector2 vSourceSize = new Vector2(pSource.width, pSource.height);

    // Calculate the new size
    int xWidth = Mathf.RoundToInt((float)pSource.width * pScale);
    int xHeight = Mathf.RoundToInt((float)pSource.height * pScale);

    // Make the new texture
    Texture2D oNewTex = new Texture2D(xWidth, xHeight, TextureFormat.RGBA32, false);

    // Make the destination array
    int xLength = xWidth * xHeight;
    Color[] aColor = new Color[xLength];

    // Size of one destination pixel measured in source pixels
    Vector2 vPixelSize = new Vector2(vSourceSize.x / xWidth, vSourceSize.y / xHeight);

    // Loop through the destination pixels and process each one
    Vector2 vCenter = new Vector2();
    for (int i = 0; i < xLength; i++) {

        // Figure out x & y of the destination pixel
        float xX = (float)(i % xWidth);
        float xY = Mathf.Floor((float)i / xWidth);

        // Map the destination pixel into source space (reduce to 0..1, then scale up)
        vCenter.x = (xX / xWidth) * vSourceSize.x;
        vCenter.y = (xY / xHeight) * vSourceSize.y;

        // Nearest Neighbour
        if (pFilterMode == ImageFilterMode.Nearest) {

            // Round to the closest source pixel (clamped so we never read past the edge)
            vCenter.x = Mathf.Min(Mathf.Round(vCenter.x), vSourceSize.x - 1);
            vCenter.y = Mathf.Min(Mathf.Round(vCenter.y), vSourceSize.y - 1);

            // Calculate source index
            int xSourceIndex = (int)((vCenter.y * vSourceSize.x) + vCenter.x);

            // Copy pixel
            aColor[i] = aSourceColor[xSourceIndex];
        }

        // Bilinear
        else if (pFilterMode == ImageFilterMode.Bilinear) {

            // Get the fractional part of the sample position
            float xRatioX = vCenter.x - Mathf.Floor(vCenter.x);
            float xRatioY = vCenter.y - Mathf.Floor(vCenter.y);

            // Get the indices of the four surrounding pixels (clamped to the texture edge)
            float xCeilX = Mathf.Min(Mathf.Ceil(vCenter.x), vSourceSize.x - 1);
            float xCeilY = Mathf.Min(Mathf.Ceil(vCenter.y), vSourceSize.y - 1);
            int xIndexTL = (int)((Mathf.Floor(vCenter.y) * vSourceSize.x) + Mathf.Floor(vCenter.x));
            int xIndexTR = (int)((Mathf.Floor(vCenter.y) * vSourceSize.x) + xCeilX);
            int xIndexBL = (int)((xCeilY * vSourceSize.x) + Mathf.Floor(vCenter.x));
            int xIndexBR = (int)((xCeilY * vSourceSize.x) + xCeilX);

            // Interpolate along the top and bottom rows, then between the two results
            aColor[i] = Color.Lerp(
                Color.Lerp(aSourceColor[xIndexTL], aSourceColor[xIndexTR], xRatioX),
                Color.Lerp(aSourceColor[xIndexBL], aSourceColor[xIndexBR], xRatioX),
                xRatioY
            );
        }

        // Average
        else if (pFilterMode == ImageFilterMode.Average) {

            // Calculate the grid of source pixels covered by this destination pixel
            int xXFrom = (int)Mathf.Max(Mathf.Floor(vCenter.x - (vPixelSize.x * 0.5f)), 0);
            int xXTo = (int)Mathf.Min(Mathf.Ceil(vCenter.x + (vPixelSize.x * 0.5f)), vSourceSize.x);
            int xYFrom = (int)Mathf.Max(Mathf.Floor(vCenter.y - (vPixelSize.y * 0.5f)), 0);
            int xYTo = (int)Mathf.Min(Mathf.Ceil(vCenter.y + (vPixelSize.y * 0.5f)), vSourceSize.y);

            // Loop over the grid and accumulate
            Color oColorTemp = new Color();
            float xGridCount = 0;
            for (int iy = xYFrom; iy < xYTo; iy++) {
                for (int ix = xXFrom; ix < xXTo; ix++) {

                    // Accumulate the color
                    oColorTemp += aSourceColor[(int)((iy * vSourceSize.x) + ix)];

                    // Count the samples
                    xGridCount++;
                }
            }

            // Average the accumulated color
            aColor[i] = oColorTemp / xGridCount;
        }
    }

    // Set the pixels and apply
    oNewTex.SetPixels(aColor);
    oNewTex.Apply();

    // Return the resized texture
    return oNewTex;
}
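Using it is straightforward. For example (my own snippet, assuming the method lives on a static class called TextureScaler and the source texture has Read/Write enabled in its import settings, since GetPixels needs it):

    using UnityEngine;

    public class ResizeExample : MonoBehaviour {

        public Texture2D sourceTexture;

        void Start() {
            // Downscale to half size using the Average filter
            Texture2D halfSize = TextureScaler.ResizeTexture(sourceTexture, ImageFilterMode.Average, 0.5f);
            Debug.Log(halfSize.width + " x " + halfSize.height);
        }
    }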