I have a problem similar to the following three questions:
- Save AcquireCameraImageBytes() from Unity ARCore to storage as an image https://stackoverflow.com/questions/49579334/save-acquirecameraimagebytes-from-unity-arcore-to-storage-as-an-image
- Save camera image from Unity ARCore https://stackoverflow.com/questions/49674399/save-camera-image-from-unity-arcore
- ARCore for Unity save camera image https://stackoverflow.com/questions/49798067/arcore-for-unity-save-camera-image
My goal is to save the camera image (without the augmented objects) together with the camera pose on a button click. I have spent all day today trying to save the camera image with ARCore. I tried the different approaches from the three questions linked above, but without success.
My C# script, attached to the button:
using UnityEngine;
using UnityEngine.UI;
using GoogleARCore;
using System.IO;

public class takeimg : MonoBehaviour
{
    private Texture2D m_TextureRender;
    public Button yourButton;
    private byte[] m_EdgeDetectionResultImage = null;

    void Start()
    {
        Button btn = yourButton.GetComponent<Button>();
        btn.onClick.AddListener(TaskOnClick);
    }

    void TaskOnClick()
    {
        var image = Frame.CameraImage.AcquireCameraImageBytes();
        m_TextureRender = new Texture2D(image.Width, image.Height, TextureFormat.RGBA32, false, false);
        m_EdgeDetectionResultImage = new byte[image.Width * image.Height * 4];
        System.Runtime.InteropServices.Marshal.Copy(image.Y, m_EdgeDetectionResultImage, 0, image.Width * image.Height * 4);
        m_TextureRender.LoadRawTextureData(m_EdgeDetectionResultImage);
        m_TextureRender.Apply();
        var encodedJpg = m_TextureRender.EncodeToJPG();
        var path = Application.persistentDataPath;
        File.WriteAllBytes(path + "/test.jpg", encodedJpg);
    }
}
The image I currently get looks like this: saved image https://i.stack.imgur.com/n9i8O.jpg It looks similar to the result in the third SO question I linked above.
So something is still wrong or missing. Can someone help me figure out what is going wrong? Is it something to do with the buffer?
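In case it helps pin down the buffer issue: my understanding (an assumption based on the YUV_420_888 format description, not on the ARCore source) is that the Y plane holds only Width * Height bytes, one luma byte per pixel, with the chroma subsampled 2x2 in separate planes, so copying Width * Height * 4 bytes out of image.Y would read far past the luma data. A quick sketch of the plane sizes I would expect:

```python
# Hypothetical plane-size calculation for a YUV 420 image (assumption:
# ARCore's AcquireCameraImageBytes() delivers YUV_420_888-style data).
def yuv420_plane_sizes(width, height):
    y_size = width * height          # one luma byte per pixel
    # U and V are subsampled 2x2: one sample per 2x2 block of pixels.
    # With a semi-planar (interleaved UV) layout the combined chroma
    # plane is therefore width * height / 2 bytes.
    uv_size = width * height // 2
    return y_size, uv_size

print(yuv420_plane_sizes(640, 480))  # (307200, 153600)
```

That would mean my RGBA32 buffer of Width * Height * 4 bytes is eight times larger than the luma plane actually backing it.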
Update: In the meantime I have managed to get back a black-and-white picture: black-and-white picture https://i.stack.imgur.com/tVGTt.jpg This is my new TaskOnClick function:
void TaskOnClick()
{
    var image = Frame.CameraImage.AcquireCameraImageBytes();
    byte[] bufferY = new byte[image.Width * image.Height];
    byte[] bufferU = new byte[image.Width * image.Height / 2];
    byte[] bufferV = new byte[image.Width * image.Height / 2];
    System.Runtime.InteropServices.Marshal.Copy(image.Y, bufferY, 0, image.Width * image.Height);
    System.Runtime.InteropServices.Marshal.Copy(image.U, bufferU, 0, image.Width * image.Height / 2);
    System.Runtime.InteropServices.Marshal.Copy(image.V, bufferV, 0, image.Width * image.Height / 2);
    m_TextureRender = new Texture2D(image.Width, image.Height, TextureFormat.RGBA32, false, false);
    Color c = new Color();
    for (int y = 0; y < image.Height; y++)
    {
        for (int x = 0; x < image.Width; x++)
        {
            float Y = bufferY[y * image.Width + x];
            float U = bufferU[(y / 2) * image.Width + x];
            float V = bufferV[(y / 2) * image.Width + x];
            c.r = Y;
            c.g = Y;
            c.b = Y;
            c.r /= 255.0f;
            c.g /= 255.0f;
            c.b /= 255.0f;
            if (c.r < 0.0f) c.r = 0.0f;
            if (c.g < 0.0f) c.g = 0.0f;
            if (c.b < 0.0f) c.b = 0.0f;
            if (c.r > 1.0f) c.r = 1.0f;
            if (c.g > 1.0f) c.g = 1.0f;
            if (c.b > 1.0f) c.b = 1.0f;
            c.a = 1.0f;
            m_TextureRender.SetPixel(image.Width - 1 - x, y, c);
        }
    }
    var encodedJpg = m_TextureRender.EncodeToJPG();
    var path = Application.persistentDataPath;
    File.WriteAllBytes(path + "/test.jpg", encodedJpg);
}
Can someone tell me what the actual YUV-to-RGB conversion used by Google ARCore is? I have tried a few, but the colors in the picture always look wrong...
Is there a simpler way than my approach to save the camera image of the current frame?
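For reference, one of the conversions I have been experimenting with is the standard BT.601 full-range YUV-to-RGB formula; whether these are actually the coefficients matching ARCore's camera stream is exactly what I am unsure about:

```python
# Standard BT.601 full-range YUV -> RGB conversion (a sketch; it is an
# assumption on my part that ARCore's camera data uses these coefficients).
def yuv_to_rgb(y, u, v):
    # u and v are byte values centered at 128 (neutral chroma).
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    # Clamp each channel into the valid 0..255 byte range.
    clamp = lambda ch: max(0, min(255, int(round(ch))))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # neutral chroma -> mid gray: (128, 128, 128)
```

With neutral chroma (U = V = 128) this degenerates to R = G = B = Y, which matches the black-and-white picture I get when I ignore U and V entirely.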