I am following https://docs.unity3d.com/2019.4/Documentation/ScriptReference/Windows.WebCam.PhotoCapture.html to take a photo with the HoloLens 2 camera and display it on a billboard (a Quad object). When I run the code in Play mode, even with Holographic Emulation enabled and a HoloLens 2 connected, I get an error because the script cannot access the camera: Failed to initialize IMediaCapture (hr = 0xC00DABE0). This does not happen if I build the app and deploy it to the HoloLens 2.
My question is: is there a way to grant Unity access to this camera, so that when I press Play and enter Game mode (with Holographic Emulation enabled and the HoloLens 2 connected) the script can access the camera?
Again, the script works fine when actually deployed to the HoloLens 2, but building the project in Unity and then building it again in VS for every little test takes far too long. I am using Unity 2019.4.26f and VS 2019.
The code, in case the link stops working:
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.Windows.WebCam;

public class PhotoCaptureExample : MonoBehaviour
{
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;

    // Use this for initialization
    void Start()
    {
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        // Create a PhotoCapture object
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject)
        {
            photoCaptureObject = captureObject;

            CameraParameters cameraParameters = new CameraParameters();
            cameraParameters.hologramOpacity = 0.0f;
            cameraParameters.cameraResolutionWidth = cameraResolution.width;
            cameraParameters.cameraResolutionHeight = cameraResolution.height;
            cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

            // Activate the camera
            photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result)
            {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
    {
        // Copy the raw image data into our target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);

        // Create a gameobject that we can apply our texture to
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
        quadRenderer.material = new Material(Shader.Find("Unlit/Texture"));

        quad.transform.parent = this.transform;
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

        quadRenderer.material.SetTexture("_MainTex", targetTexture);

        // Deactivate our camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result)
    {
        // Shutdown our photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
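One thing worth noting: the sample above never inspects `PhotoCapture.PhotoCaptureResult`, so failures in the editor stay silent until a later call throws. A sketch of the same `UnityEngine.Windows.WebCam` calls with basic result checks (resolution setup trimmed for brevity; `hrResult` and `success` are documented members of `PhotoCaptureResult`):

```csharp
PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject)
{
    if (captureObject == null)
    {
        // CreateAsync passes null when no capture pipeline could be created.
        Debug.LogError("PhotoCapture.CreateAsync failed to create a capture object.");
        return;
    }
    photoCaptureObject = captureObject;

    CameraParameters cameraParameters = new CameraParameters();
    cameraParameters.hologramOpacity = 0.0f;
    cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

    photoCaptureObject.StartPhotoModeAsync(cameraParameters,
        delegate (PhotoCapture.PhotoCaptureResult result)
    {
        if (!result.success)
        {
            // hrResult carries the raw HRESULT (e.g. 0xC00DABE0) for diagnosis.
            Debug.LogErrorFormat("StartPhotoModeAsync failed, hr = 0x{0:X8}", result.hrResult);
            return;
        }
        photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
    });
});
```

With these checks in place, the error surfaces in the Unity Console at the exact step that failed, instead of as an opaque IMediaCapture message.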
Answer 1
I would strongly recommend referring to https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/mixed-reality-capture-directx#render-from-the-pv-camera-opt-in for rendering the video stream to a texture, or to https://docs.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/research-mode for in-depth camera/sensor data.
Also, double-check that the shader referenced in this line actually resolves to an asset in your project: quadRenderer.material = new Material(Shader.Find("Unlit/Texture"));
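For example, a small guard (a sketch; `Shader.Find` is the standard Unity API and returns null when the shader is not available at runtime) makes that failure explicit instead of producing a broken material:

```csharp
// "Unlit/Texture" can be stripped from builds if no material references it and
// it is not listed under Project Settings > Graphics > Always Included Shaders.
Shader unlit = Shader.Find("Unlit/Texture");
if (unlit == null)
{
    Debug.LogError("Shader 'Unlit/Texture' not found; add it to Always Included Shaders.");
}
else
{
    quadRenderer.material = new Material(unlit);
}
```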