Rendering an AR (augmented reality) application in the Unity engine requires precise coordination between multiple graphics and compute components so that virtual content blends seamlessly with the real world. This article walks through Unity's AR rendering flow end to end, from camera capture to final display.

1. AR Rendering Infrastructure

1.1 Overview of Unity's AR Rendering System

In Unity, the AR rendering system consists of these core parts:

  • Camera capture subsystem: pulls live images from the device camera
  • Tracking subsystem: handles spatial localization and motion tracking
  • Render pipeline: composites virtual content with the real-world image
  • Post-processing system: applies image effects and adjustments

1.2 AR Frameworks Supported by Unity

Unity supports several AR frameworks, each with its own rendering flow:

  1. AR Foundation: Unity's native cross-platform AR framework
  2. ARCore (Google): Android only
  3. ARKit (Apple): iOS only
  4. Vuforia: a third-party AR platform focused on image recognition and tracking

This article builds on AR Foundation, since it wraps the platform-specific APIs behind a single unified interface. A minimal scene bootstrap is sketched below.
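
Before diving into camera capture, here is a minimal runtime bootstrap sketch of such a scene. It uses AR Foundation 4.x component names (AR Foundation 5.x replaces ARSessionOrigin with XROrigin); in practice this wiring is usually done in the editor rather than in code.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSceneBootstrap : MonoBehaviour
{
    void Awake()
    {
        // ARSession drives the AR lifecycle on the device
        var sessionObj = new GameObject("AR Session");
        sessionObj.AddComponent<ARSession>();
        sessionObj.AddComponent<ARInputManager>();

        // ARSessionOrigin maps tracked poses into Unity world space
        var originObj = new GameObject("AR Session Origin");
        var origin = originObj.AddComponent<ARSessionOrigin>();

        // The AR camera renders the device camera feed behind virtual content
        var cameraObj = new GameObject("AR Camera");
        cameraObj.transform.SetParent(originObj.transform, false);
        var cam = cameraObj.AddComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
        cameraObj.AddComponent<ARCameraManager>();
        cameraObj.AddComponent<ARCameraBackground>();
        cameraObj.AddComponent<ARPoseDriver>(); // keeps the camera at the tracked device pose
        origin.camera = cam;
    }
}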

2. Camera Image Capture and Processing

2.1 Camera Configuration and Initialization

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using Unity.Collections;

// Named ARCameraController to avoid colliding with AR Foundation's own ARCameraManager type
public class ARCameraController : MonoBehaviour
{
    private ARCameraManager arCameraManager;
    
    void Awake()
    {
        arCameraManager = GetComponent<ARCameraManager>();
        
        // Subscribe to the camera frame event
        arCameraManager.frameReceived += OnCameraFrameReceived;
    }
    
    // Configure camera parameters
    void ConfigureCamera()
    {
        // Request camera permission where the platform requires it (Android shown
        // here; iOS prompts automatically based on the Info.plist usage description)
#if UNITY_ANDROID
        if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(
                UnityEngine.Android.Permission.Camera))
        {
            UnityEngine.Android.Permission.RequestUserPermission(
                UnityEngine.Android.Permission.Camera);
        }
#endif
        
        // Select a camera configuration (resolution, frame rate, etc.)
        var configs = arCameraManager.GetConfigurations(Allocator.Temp);
        if (configs.Length > 0)
        {
            // Pick a configuration that balances performance and quality;
            // configs[0] is used here for brevity
            arCameraManager.currentConfiguration = configs[0];
        }
        configs.Dispose(); // dispose even when no configuration was found
    }
    
    // Camera frame callback
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        // Read the camera intrinsics
        if (arCameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
        {
            // The intrinsics feed texture-mapping calculations:
            // focal length: intrinsics.focalLength
            // principal point: intrinsics.principalPoint
            // image size: intrinsics.resolution
        }
        
        // Read the light estimation data
        if (args.lightEstimation.averageBrightness.HasValue)
        {
            float brightness = args.lightEstimation.averageBrightness.Value;
            // Use the brightness to light AR objects
        }
    }
    
    void OnDestroy()
    {
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
    }
}
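
Picking configs[0] is arbitrary. In practice you usually want the configuration closest to a target resolution; the following sketch is a hypothetical helper for the same class (the 1280x720 target is an assumption, pass whatever your app needs).

    // Hypothetical helper for ARCameraController: pick the camera configuration
    // whose resolution is closest to a target such as 1280x720.
    void ChooseConfiguration(Vector2Int targetResolution)
    {
        using (var configs = arCameraManager.GetConfigurations(Allocator.Temp))
        {
            if (configs.Length == 0)
                return;
            
            XRCameraConfiguration best = configs[0];
            int bestScore = int.MaxValue;
            foreach (var config in configs)
            {
                // Manhattan distance between this configuration and the target
                int score = Mathf.Abs(config.width - targetResolution.x) +
                            Mathf.Abs(config.height - targetResolution.y);
                if (score < bestScore)
                {
                    bestScore = score;
                    best = config;
                }
            }
            arCameraManager.currentConfiguration = best;
        }
    }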

2.2 Camera Image Processing

The rendering system turns the camera image into a background texture:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.Rendering;

public class ARBackgroundTextureProcessor : MonoBehaviour
{
    private ARCameraBackground arCameraBackground;
    private ARCameraManager arCameraManager;
    
    [SerializeField]
    private Material customProcessingMaterial;
    
    private RenderTexture processedTexture;
    
    void Awake()
    {
        arCameraBackground = GetComponent<ARCameraBackground>();
        arCameraManager = GetComponent<ARCameraManager>();
    }
    
    void Start()
    {
        // Create the render texture used for processing
        processedTexture = new RenderTexture(Screen.width, Screen.height, 0, 
                                           RenderTextureFormat.ARGB32);
        
        // The camera background renders before opaque geometry by default;
        // AR Foundation 5.x additionally exposes a requestedBackgroundRenderingMode
        // property to change this
        
        // Subscribe to the camera frame event
        arCameraManager.frameReceived += OnCameraFrameReceived;
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        if (customProcessingMaterial != null)
        {
            // Grab the raw camera texture
            Texture cameraTexture = arCameraBackground.material.mainTexture;
            
            // Guard against re-processing our own output: once _MainTex has been
            // replaced, mainTexture would otherwise point at processedTexture
            if (cameraTexture != null && cameraTexture != processedTexture)
            {
                // Run the custom processing pass
                Graphics.Blit(cameraTexture, processedTexture, customProcessingMaterial);
                
                // Substitute the processed texture
                if (arCameraBackground.material.HasProperty("_MainTex"))
                {
                    arCameraBackground.material.SetTexture("_MainTex", processedTexture);
                }
            }
        }
    }
    
    // Example: tune the image-processing parameters at runtime
    public void UpdateImageProcessing(float contrast, float brightness)
    {
        if (customProcessingMaterial != null)
        {
            customProcessingMaterial.SetFloat("_Contrast", contrast);
            customProcessingMaterial.SetFloat("_Brightness", brightness);
        }
    }
    
    void OnDestroy()
    {
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
            
        if (processedTexture != null)
            processedTexture.Release();
    }
}
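
Material-based Blit processing stays on the GPU. When pixel data is needed on the CPU (computer vision, custom analysis), AR Foundation provides TryAcquireLatestCpuImage. A minimal sketch that averages the luminance plane follows; it assumes a YUV camera format, which both ARCore and ARKit deliver.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal CPU-side image access: average the Y (luminance) plane.
public class ARCpuImageReader : MonoBehaviour
{
    private ARCameraManager arCameraManager;
    
    void Awake() => arCameraManager = GetComponent<ARCameraManager>();
    
    public float? ReadAverageLuminance()
    {
        // Acquire the latest camera image on the CPU; it must be disposed promptly
        if (!arCameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return null;
        
        using (image)
        {
            // Plane 0 is the luminance plane for the YUV formats used by ARCore/ARKit
            XRCpuImage.Plane yPlane = image.GetPlane(0);
            var data = yPlane.data;
            
            long sum = 0;
            int count = 0;
            for (int i = 0; i < data.Length; i += yPlane.pixelStride)
            {
                sum += data[i];
                count++;
            }
            return count > 0 ? (sum / (float)count) / 255f : 0f;
        }
    }
}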

3. AR Tracking System and Rendering Preparation

3.1 Spatial Tracking and Camera Pose Estimation

The tracking system is the foundation of AR rendering: it keeps virtual content correctly registered to the real world:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARTrackingSystem : MonoBehaviour
{
    private ARSession arSession;
    
    [SerializeField]
    private Text trackingStateText;
    
    private bool isTracking = false;
    
    // Raised once tracking has been acquired; rendering systems can subscribe
    public static event System.Action TrackingAcquired;
    
    void Awake()
    {
        arSession = FindObjectOfType<ARSession>();
    }
    
    void Update()
    {
        // Monitor the session state
        ARSessionState sessionState = ARSession.state;
        
        switch (sessionState)
        {
            case ARSessionState.CheckingAvailability:
                UpdateTrackingStatus("Checking AR availability");
                break;
            case ARSessionState.Ready:
                UpdateTrackingStatus("AR Ready");
                break;
            case ARSessionState.SessionInitializing:
                UpdateTrackingStatus("Initializing AR session");
                break;
            case ARSessionState.SessionTracking:
                HandleSessionTracking();
                break;
            case ARSessionState.Unsupported:
                UpdateTrackingStatus("AR not supported on this device");
                break;
            default:
                UpdateTrackingStatus("Unknown AR state");
                break;
        }
    }
    
    private void HandleSessionTracking()
    {
        // The session subsystem reports the device tracking state
        TrackingState trackingState = arSession.subsystem != null
            ? arSession.subsystem.trackingState
            : TrackingState.None;
        
        switch (trackingState)
        {
            case TrackingState.Tracking:
                if (!isTracking)
                {
                    isTracking = true;
                    OnTrackingAcquired();
                }
                UpdateTrackingStatus("Tracking");
                break;
            case TrackingState.Limited:
                isTracking = false;
                UpdateTrackingStatus("Limited Tracking");
                break;
            case TrackingState.None:
                isTracking = false;
                UpdateTrackingStatus("No Tracking");
                break;
        }
    }
    
    private void OnTrackingAcquired()
    {
        Debug.Log("AR tracking acquired - rendering can proceed normally");
        
        // Notify listeners (e.g. rendering systems) that tracking is ready
        TrackingAcquired?.Invoke();
    }
    
    private void UpdateTrackingStatus(string status)
    {
        if (trackingStateText != null)
            trackingStateText.text = "Tracking: " + status;
        
        Debug.Log("AR Tracking status: " + status);
    }
}

3.2 Tracking Data and Rendering Transform Matrices

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARRenderingTransform : MonoBehaviour
{
    private ARCameraManager arCameraManager;
    private Camera arCamera;
    
    // Cached camera matrices for the rendering system
    private Matrix4x4 viewMatrix;
    private Matrix4x4 projectionMatrix;
    
    // Matrix properties exposed to shaders
    private static readonly int ViewMatrixID = Shader.PropertyToID("_ViewMatrix");
    private static readonly int ProjMatrixID = Shader.PropertyToID("_ProjMatrix");
    private static readonly int ViewProjMatrixID = Shader.PropertyToID("_ViewProjMatrix");
    
    void Awake()
    {
        arCameraManager = GetComponent<ARCameraManager>();
        arCamera = GetComponent<Camera>();
    }
    
    void Start()
    {
        arCameraManager.frameReceived += OnCameraFrameReceived;
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        // Refresh the view and projection matrices
        UpdateRenderingMatrices();
    }
    
    void UpdateRenderingMatrices()
    {
        if (arCamera != null)
        {
            // Current AR camera view matrix
            viewMatrix = arCamera.worldToCameraMatrix;
            
            // Current AR camera projection matrix
            projectionMatrix = arCamera.projectionMatrix;
            
            // Push the matrices to every material that renders AR content
            Shader.SetGlobalMatrix(ViewMatrixID, viewMatrix);
            Shader.SetGlobalMatrix(ProjMatrixID, projectionMatrix);
            
            // View-projection matrix (commonly used for shadow math, etc.)
            Matrix4x4 viewProjMatrix = projectionMatrix * viewMatrix;
            Shader.SetGlobalMatrix(ViewProjMatrixID, viewProjMatrix);
        }
    }
    
    // Expose the matrices to other rendering components
    public Matrix4x4 GetViewMatrix() => viewMatrix;
    public Matrix4x4 GetProjectionMatrix() => projectionMatrix;
    
    void OnDestroy()
    {
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
    }
}
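
Note that ARCameraFrameEventArgs itself carries the matrices the platform computed for the frame; when present, they can be used instead of reading them back from the Camera component. A sketch of an alternative callback for the same class:

    // Alternative callback: prefer the platform-supplied matrices when available.
    void OnCameraFrameReceivedDirect(ARCameraFrameEventArgs args)
    {
        if (args.projectionMatrix.HasValue)
            Shader.SetGlobalMatrix(ProjMatrixID, args.projectionMatrix.Value);
        
        // displayMatrix maps camera-image UVs to the screen orientation;
        // useful for custom background shaders
        if (args.displayMatrix.HasValue)
            Shader.SetGlobalMatrix(Shader.PropertyToID("_DisplayMatrix"), args.displayMatrix.Value);
    }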

4. Virtual Object Rendering Principles

4.1 AR Render Pipeline Configuration

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ARRenderPipelineConfigurator : MonoBehaviour
{
    [SerializeField]
    private UniversalRenderPipelineAsset renderPipelineAsset;
    
    [SerializeField]
    private bool optimizeForMobile = true;
    
    void Awake()
    {
        if (renderPipelineAsset != null)
        {
            // Make this the active render pipeline asset
            GraphicsSettings.renderPipelineAsset = renderPipelineAsset;
            
            // Apply mobile AR rendering optimizations
            if (optimizeForMobile)
            {
                OptimizeForMobileAR();
            }
        }
    }
    
    private void OptimizeForMobileAR()
    {
        // Pipeline tweaks aimed at AR scenes
        renderPipelineAsset.shadowDistance = 5.0f; // shorten the shadow distance
        renderPipelineAsset.shadowCascadeCount = 1; // fewer shadow cascades
        
        renderPipelineAsset.supportsCameraDepthTexture = true; // depth texture (important for AR)
        renderPipelineAsset.supportsCameraOpaqueTexture = true; // opaque/background texture
        
        // Lower MSAA to improve performance
        renderPipelineAsset.msaaSampleCount = 1;
        
        // Adjust per-camera settings at runtime
        if (Camera.main != null)
        {
            var additionalCameraData = Camera.main.GetUniversalAdditionalCameraData();
            if (additionalCameraData != null)
            {
                additionalCameraData.renderPostProcessing = true;
                additionalCameraData.antialiasing = AntialiasingMode.FastApproximateAntialiasing;
                additionalCameraData.dithering = true;
            }
        }
    }
    
    // Adjust rendering quality for different ambient lighting conditions
    public void AdjustQualityForLighting(float lightingIntensity)
    {
        // In dark environments, cut post effects to keep the frame rate up
        if (lightingIntensity < 0.3f)
        {
            renderPipelineAsset.msaaSampleCount = 1; // disable MSAA
            // trim other expensive effects...
        }
        else
        {
            renderPipelineAsset.msaaSampleCount = 2; // enable modest MSAA
            // re-enable other effects...
        }
    }
}
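
URP also exposes a global render scale, often the single most effective AR performance knob on mobile because it lowers the 3D rendering resolution without touching the camera background. A hypothetical addition to the class above:

    // Hypothetical helper for ARRenderPipelineConfigurator: scale the URP
    // rendering resolution (the clamp range here is an assumption).
    public void SetRenderScale(float scale)
    {
        if (renderPipelineAsset != null)
            renderPipelineAsset.renderScale = Mathf.Clamp(scale, 0.5f, 1.0f);
    }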

4.2 Depth and Occlusion Handling

Correct depth handling is critical for blending virtual objects with the real scene:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.Rendering;

public class ARDepthManager : MonoBehaviour
{
    private ARCameraManager arCameraManager;
    private AROcclusionManager arOcclusionManager;
    private Camera arCamera;
    
    [SerializeField]
    private Material occlusionMaterial;
    
    [SerializeField]
    private bool useEnvironmentDepth = true;
    
    [SerializeField]
    private bool useHumanSegmentation = true;
    
    private Texture2D humanStencilTexture;
    private Texture2D humanDepthTexture;
    private Texture2D environmentDepthTexture;
    
    private static readonly int HumanStencilTextureID = Shader.PropertyToID("_HumanStencilTexture");
    private static readonly int HumanDepthTextureID = Shader.PropertyToID("_HumanDepthTexture");
    private static readonly int EnvDepthTextureID = Shader.PropertyToID("_EnvironmentDepthTexture");
    
    void Awake()
    {
        arCameraManager = GetComponent<ARCameraManager>();
        arOcclusionManager = GetComponent<AROcclusionManager>();
        arCamera = GetComponent<Camera>();
    }
    
    void Start()
    {
        // Configure the occlusion manager and subscribe to its frame event
        if (arOcclusionManager != null)
        {
            arOcclusionManager.requestedEnvironmentDepthMode = 
                useEnvironmentDepth ? 
                EnvironmentDepthMode.Fastest : 
                EnvironmentDepthMode.Disabled;
            
            arOcclusionManager.requestedHumanDepthMode = 
                useHumanSegmentation ? 
                HumanSegmentationDepthMode.Fastest : 
                HumanSegmentationDepthMode.Disabled;
            
            arOcclusionManager.requestedHumanStencilMode = 
                useHumanSegmentation ? 
                HumanSegmentationStencilMode.Fastest : 
                HumanSegmentationStencilMode.Disabled;
            
            arOcclusionManager.frameReceived += OnOcclusionFrameReceived;
        }
    }
    
    void OnOcclusionFrameReceived(AROcclusionFrameEventArgs args)
    {
        // Read the latest textures from the occlusion manager
        // (the manager exposes them as properties once a frame has arrived)
        if (useHumanSegmentation)
        {
            if (arOcclusionManager.humanStencilTexture != null)
            {
                humanStencilTexture = arOcclusionManager.humanStencilTexture;
                Shader.SetGlobalTexture(HumanStencilTextureID, humanStencilTexture);
            }
            
            if (arOcclusionManager.humanDepthTexture != null)
            {
                humanDepthTexture = arOcclusionManager.humanDepthTexture;
                Shader.SetGlobalTexture(HumanDepthTextureID, humanDepthTexture);
            }
        }
        
        // Refresh the environment depth texture
        if (useEnvironmentDepth && arOcclusionManager.environmentDepthTexture != null)
        {
            environmentDepthTexture = arOcclusionManager.environmentDepthTexture;
            Shader.SetGlobalTexture(EnvDepthTextureID, environmentDepthTexture);
        }
        
        // Refresh the occlusion material
        UpdateOcclusionMaterial();
    }
    
    void UpdateOcclusionMaterial()
    {
        if (occlusionMaterial != null)
        {
            if (humanStencilTexture != null)
                occlusionMaterial.SetTexture(HumanStencilTextureID, humanStencilTexture);
                
            if (humanDepthTexture != null)
                occlusionMaterial.SetTexture(HumanDepthTextureID, humanDepthTexture);
                
            if (environmentDepthTexture != null)
                occlusionMaterial.SetTexture(EnvDepthTextureID, environmentDepthTexture);
                
            // Pass the camera's near and far planes
            if (arCamera != null)
            {
                occlusionMaterial.SetFloat("_ZNear", arCamera.nearClipPlane);
                occlusionMaterial.SetFloat("_ZFar", arCamera.farClipPlane);
            }
        }
    }
    
    void OnDestroy()
    {
        if (arOcclusionManager != null)
            arOcclusionManager.frameReceived -= OnOcclusionFrameReceived;
    }
}
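
Not every device grants every requested mode: requestedEnvironmentDepthMode expresses a wish, while currentEnvironmentDepthMode reports what the platform actually enabled. A small defensive check, as a hypothetical helper for the class above:

    // Fall back gracefully when the device cannot provide environment depth.
    void VerifyOcclusionSupport()
    {
        if (useEnvironmentDepth &&
            arOcclusionManager.currentEnvironmentDepthMode == EnvironmentDepthMode.Disabled)
        {
            Debug.LogWarning("Environment depth unavailable on this device; " +
                             "falling back to human segmentation only.");
            useEnvironmentDepth = false;
        }
    }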

4.3 AR Light Estimation and Object Rendering

Good light estimation helps AR content blend into the real environment:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

public class ARLightingManager : MonoBehaviour
{
    private ARCameraManager arCameraManager;
    
    [SerializeField]
    private Light directionalLight;
    
    [SerializeField]
    private ReflectionProbe reflectionProbe;
    
    // Cubemaps used to drive environment reflections
    [SerializeField]
    private Cubemap[] environmentCubemaps;
    
    // Light estimation data
    private float? brightness;
    private float? colorTemperature;
    private Color? colorCorrection;
    private SphericalHarmonicsL2? sphericalHarmonics;
    
    void Awake()
    {
        arCameraManager = GetComponent<ARCameraManager>();
    }
    
    void Start()
    {
        arCameraManager.frameReceived += OnCameraFrameReceived;
        
        // Default lighting until estimates arrive
        if (directionalLight != null)
        {
            directionalLight.intensity = 1.0f;
            directionalLight.color = Color.white;
        }
        
        // Initialize the reflection probe in Custom mode so that the
        // customBakedTexture assigned below actually takes effect
        if (reflectionProbe != null)
        {
            reflectionProbe.mode = ReflectionProbeMode.Custom;
        }
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        // Pull the light estimation data for this frame
        brightness = args.lightEstimation.averageBrightness;
        colorTemperature = args.lightEstimation.averageColorTemperature;
        colorCorrection = args.lightEstimation.colorCorrection;
        sphericalHarmonics = args.lightEstimation.ambientSphericalHarmonics;
        
        // Apply the new estimates
        UpdateLighting();
    }
    
    void UpdateLighting()
    {
        if (directionalLight != null)
        {
            // Directional light intensity
            if (brightness.HasValue)
            {
                directionalLight.intensity = brightness.Value;
            }
            
            // Light color
            if (colorCorrection.HasValue)
            {
                directionalLight.color = colorCorrection.Value;
            }
            else if (colorTemperature.HasValue)
            {
                // Derive the light color from the color temperature
                directionalLight.color = Mathf.CorrelatedColorTemperatureToRGB(colorTemperature.Value);
            }
        }
        
        // Spherical harmonics (global ambient lighting)
        if (sphericalHarmonics.HasValue)
        {
            RenderSettings.ambientMode = AmbientMode.Skybox;
            RenderSettings.ambientProbe = sphericalHarmonics.Value;
            
            // Pick a cubemap that matches the estimated ambient brightness
            if (environmentCubemaps != null && environmentCubemaps.Length > 0 && brightness.HasValue)
            {
                int index = Mathf.Clamp(Mathf.FloorToInt(brightness.Value * environmentCubemaps.Length), 
                                       0, environmentCubemaps.Length - 1);
                
                // customReflection only applies when the reflection mode is Custom
                RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
                RenderSettings.customReflection = environmentCubemaps[index];
                
                if (reflectionProbe != null)
                {
                    reflectionProbe.customBakedTexture = environmentCubemaps[index];
                }
            }
        }
        
        // Broadcast the lighting data to all AR materials
        Shader.SetGlobalFloat("_ARBrightness", brightness.HasValue ? brightness.Value : 1.0f);
        if (colorCorrection.HasValue)
        {
            Shader.SetGlobalColor("_ARColorCorrection", colorCorrection.Value);
        }
    }
    
    void OnDestroy()
    {
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
    }
}
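
Some platforms (ARCore's HDR light estimation, for example) also estimate the main light's direction and color, exposed as nullable fields on ARLightEstimationData. A hypothetical extension to the class above:

    // Align the directional light with the platform's main-light estimate, when present.
    void ApplyMainLightEstimate(ARCameraFrameEventArgs args)
    {
        if (directionalLight == null)
            return;
        
        if (args.lightEstimation.mainLightDirection.HasValue)
            directionalLight.transform.rotation =
                Quaternion.LookRotation(args.lightEstimation.mainLightDirection.Value);
        
        if (args.lightEstimation.mainLightColor.HasValue)
            directionalLight.color = args.lightEstimation.mainLightColor.Value;
    }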

5. Specialized AR Rendering Techniques

5.1 Shadow Rendering

Rendering convincing shadows in an AR environment:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.Rendering;

public class ARShadowRenderer : MonoBehaviour
{
    [SerializeField]
    private ARPlaneManager planeManager;
    
    [SerializeField]
    private Light mainLight;
    
    // Depth-only material used to draw casters into the shadow map
    [SerializeField]
    private Material shadowCasterMaterial;
    
    private Camera arCamera;
    private CommandBuffer commandBuffer;
    
    // Shadow camera
    private Camera shadowCamera;
    private RenderTexture shadowTexture;
    
    void Awake()
    {
        arCamera = GetComponent<Camera>();
        
        // Create the command buffer
        commandBuffer = new CommandBuffer();
        commandBuffer.name = "AR Shadow Pass";
        
        // Set up the shadow camera
        InitializeShadowCamera();
    }
    
    void Start()
    {
        // Subscribe to plane detection events
        if (planeManager != null)
        {
            planeManager.planesChanged += OnPlanesChanged;
        }
        
        // Attach the command buffer to the camera
        arCamera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, commandBuffer);
    }
    
    void InitializeShadowCamera()
    {
        // Create the shadow camera object
        GameObject shadowCameraObj = new GameObject("AR Shadow Camera");
        shadowCamera = shadowCameraObj.AddComponent<Camera>();
        shadowCamera.enabled = false;
        shadowCamera.clearFlags = CameraClearFlags.SolidColor;
        shadowCamera.backgroundColor = Color.white;
        shadowCamera.orthographic = true;
        shadowCamera.nearClipPlane = 0.1f;
        shadowCamera.farClipPlane = 10.0f;
        
        // Create the shadow texture
        shadowTexture = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
        shadowTexture.wrapMode = TextureWrapMode.Clamp;
        shadowTexture.filterMode = FilterMode.Bilinear;
        shadowCamera.targetTexture = shadowTexture;
    }
    
    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // New planes detected: rebuild the shadow projection
        UpdateShadowProjection();
    }
    
    void UpdateShadowProjection()
    {
        // Clear the previous commands
        commandBuffer.Clear();
        
        if (planeManager != null && planeManager.trackables.count > 0 && mainLight != null)
        {
            // Move the shadow camera to the light's position and orientation
            shadowCamera.transform.position = mainLight.transform.position;
            shadowCamera.transform.rotation = mainLight.transform.rotation;
            
            // Size the orthographic projection to cover the shadowed region
            Bounds shadowBounds = CalculateShadowBounds();
            float shadowOrthoSize = Mathf.Max(shadowBounds.extents.x, shadowBounds.extents.z);
            shadowCamera.orthographicSize = shadowOrthoSize;
            
            // Render into the shadow texture
            commandBuffer.SetRenderTarget(shadowTexture);
            commandBuffer.ClearRenderTarget(true, true, Color.white);
            
            // Draw every shadow-casting renderer
            // (FindObjectsOfType is expensive; cache the list in production code)
            var renderers = FindObjectsOfType<Renderer>();
            foreach (var renderer in renderers)
            {
                if (renderer.shadowCastingMode != ShadowCastingMode.Off)
                {
                    commandBuffer.DrawRenderer(renderer, shadowCasterMaterial, 0, 0);
                }
            }
            
            // Hand the shadow texture to each plane's material
            foreach (var plane in planeManager.trackables)
            {
                var planeRenderer = plane.GetComponent<Renderer>();
                if (planeRenderer != null && planeRenderer.material != null)
                {
                    var planeMaterial = planeRenderer.material;
                    planeMaterial.SetTexture("_ShadowTex", shadowTexture);
                    
                    // Shadow transform matrix
                    Matrix4x4 shadowMatrix = CalculateShadowMatrix(plane.transform);
                    planeMaterial.SetMatrix("_ShadowMatrix", shadowMatrix);
                }
            }
        }
    }
    
    Bounds CalculateShadowBounds()
    {
        // Bounding box around all AR planes and virtual objects
        Bounds bounds = new Bounds(Vector3.zero, Vector3.zero);
        bool initialized = false;
        
        // Include every plane
        foreach (var plane in planeManager.trackables)
        {
            if (!initialized)
            {
                bounds = new Bounds(plane.transform.position, Vector3.zero);
                initialized = true;
            }
            else
            {
                bounds.Encapsulate(plane.transform.position);
            }
            
            // Include the plane's boundary points
            foreach (var point in plane.boundary)
            {
                Vector3 worldPoint = plane.transform.TransformPoint(new Vector3(point.x, 0, point.y));
                bounds.Encapsulate(worldPoint);
            }
        }
        
        // Include every shadow-casting virtual object
        var renderers = FindObjectsOfType<Renderer>();
        foreach (var renderer in renderers)
        {
            if (renderer.shadowCastingMode != ShadowCastingMode.Off)
            {
                bounds.Encapsulate(renderer.bounds);
            }
        }
        
        // Add some margin
        bounds.Expand(1.0f);
        
        return bounds;
    }
    
    Matrix4x4 CalculateShadowMatrix(Transform planeTransform)
    {
        // World space to shadow-camera clip space
        Matrix4x4 worldToShadow = shadowCamera.projectionMatrix * 
                                 shadowCamera.worldToCameraMatrix;
        
        // Shadow space back to the plane's local space
        Matrix4x4 shadowToLocal = planeTransform.worldToLocalMatrix;
        
        // Combined shadow transform
        return shadowToLocal * worldToShadow;
    }
    
    void OnDestroy()
    {
        if (planeManager != null)
            planeManager.planesChanged -= OnPlanesChanged;
            
        if (arCamera != null)
            arCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, commandBuffer);
            
        if (commandBuffer != null)
            commandBuffer.Dispose();
            
        if (shadowTexture != null)
            shadowTexture.Release();
            
        if (shadowCamera != null)
            Destroy(shadowCamera.gameObject);
    }
}

5.2 Reflections and Screen-Space Effects

Adding realistic reflections to AR content:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.Rendering;

public class ARReflectionRenderer : MonoBehaviour
{
    [SerializeField]
    private Camera arCamera;
    
    [SerializeField]
    private ARPlaneManager planeManager;
    
    [SerializeField]
    private Material reflectionMaterial;
    
    [SerializeField]
    private LayerMask reflectiveLayers;
    
    private Camera reflectionCamera;
    private RenderTexture reflectionTexture;
    
    void Start()
    {
        if (arCamera == null)
            arCamera = Camera.main;
        
        // Set up the reflection camera and texture
        InitializeReflectionCamera();
        
        // Subscribe to plane detection events
        if (planeManager != null)
        {
            planeManager.planesChanged += OnPlanesChanged;
        }
    }
    
    void InitializeReflectionCamera()
    {
        GameObject reflectionCameraObj = new GameObject("AR Reflection Camera");
        reflectionCamera = reflectionCameraObj.AddComponent<Camera>();
        reflectionCamera.enabled = false;
        
        // Match the reflection camera to the main camera
        reflectionCamera.CopyFrom(arCamera);
        reflectionCamera.cullingMask = reflectiveLayers;
        
        // Create the reflection texture
        reflectionTexture = new RenderTexture(Screen.width, Screen.height, 24);
        reflectionCamera.targetTexture = reflectionTexture;
    }
    
    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // New planes detected: hook up their reflection material
        foreach (var plane in args.added)
        {
            SetupPlaneReflection(plane);
        }
    }
    
    void SetupPlaneReflection(ARPlane plane)
    {
        // Fetch the plane's renderer
        Renderer planeRenderer = plane.GetComponent<Renderer>();
        if (planeRenderer != null)
        {
            // Apply an instance of the reflection material
            Material planeMaterial = new Material(reflectionMaterial);
            planeRenderer.material = planeMaterial;
            
            // Point it at the shared reflection texture
            planeMaterial.SetTexture("_ReflectionTex", reflectionTexture);
        }
    }
    
    void Update()
    {
        // If a horizontal plane exists, render the reflection
        if (planeManager != null && planeManager.trackables.count > 0)
        {
            // Pick the first roughly horizontal plane as the mirror
            ARPlane reflectionPlane = null;
            foreach (var plane in planeManager.trackables)
            {
                if (Vector3.Dot(plane.normal, Vector3.up) > 0.9f)
                {
                    reflectionPlane = plane;
                    break;
                }
            }
            
            if (reflectionPlane != null)
            {
                RenderReflection(reflectionPlane);
            }
        }
    }
    
    void RenderReflection(ARPlane plane)
    {
        // Plane normal and position
        Vector3 planeNormal = plane.transform.up;
        Vector3 planePos = plane.transform.position;
        
        // Build the reflection matrix
        Vector4 reflectionPlane = new Vector4(planeNormal.x, planeNormal.y, planeNormal.z, 
                                             -Vector3.Dot(planeNormal, planePos));
        Matrix4x4 reflectionMatrix = CalculateReflectionMatrix(reflectionPlane);
        
        // Mirror the camera position and orientation across the plane
        Vector3 camPos = arCamera.transform.position;
        Vector3 reflectedPos = ReflectPosition(camPos, reflectionPlane);
        reflectionCamera.transform.position = reflectedPos;
        
        Vector3 camForward = arCamera.transform.forward;
        Vector3 reflectedForward = ReflectDirection(camForward, planeNormal);
        
        Vector3 camUp = arCamera.transform.up;
        Vector3 reflectedUp = ReflectDirection(camUp, planeNormal);
        
        reflectionCamera.transform.LookAt(reflectedPos + reflectedForward, reflectedUp);
        
        // Oblique near plane keeps geometry below the mirror from rendering
        reflectionCamera.projectionMatrix = arCamera.CalculateObliqueMatrix(
            CameraSpacePlane(reflectionCamera, planePos, planeNormal, 1.0f));
            
        // Render into the reflection texture
        reflectionCamera.Render();
    }
    
    Vector4 CameraSpacePlane(Camera cam, Vector3 planePos, Vector3 planeNormal, float clipSide)
    {
        // Transform the plane into camera space
        Matrix4x4 worldToCameraMatrix = cam.worldToCameraMatrix;
        Vector3 cameraPosition = worldToCameraMatrix.MultiplyPoint(planePos);
        Vector3 cameraNormal = worldToCameraMatrix.MultiplyVector(planeNormal).normalized;
        
        // Plane equation in camera space
        return new Vector4(cameraNormal.x, cameraNormal.y, cameraNormal.z, 
                          -Vector3.Dot(cameraPosition, cameraNormal) * clipSide);
    }
    
    Matrix4x4 CalculateReflectionMatrix(Vector4 reflectionPlane)
    {
        Matrix4x4 reflectionMatrix = Matrix4x4.identity;
        
        reflectionMatrix.m00 = 1 - 2 * reflectionPlane.x * reflectionPlane.x;
        reflectionMatrix.m01 = -2 * reflectionPlane.x * reflectionPlane.y;
        reflectionMatrix.m02 = -2 * reflectionPlane.x * reflectionPlane.z;
        reflectionMatrix.m03 = -2 * reflectionPlane.x * reflectionPlane.w;
        
        reflectionMatrix.m10 = -2 * reflectionPlane.y * reflectionPlane.x;
        reflectionMatrix.m11 = 1 - 2 * reflectionPlane.y * reflectionPlane.y;
        reflectionMatrix.m12 = -2 * reflectionPlane.y * reflectionPlane.z;
        reflectionMatrix.m13 = -2 * reflectionPlane.y * reflectionPlane.w;
        
        reflectionMatrix.m20 = -2 * reflectionPlane.z * reflectionPlane.x;
        reflectionMatrix.m21 = -2 * reflectionPlane.z * reflectionPlane.y;
        reflectionMatrix.m22 = 1 - 2 * reflectionPlane.z * reflectionPlane.z;
        reflectionMatrix.m23 = -2 * reflectionPlane.z * reflectionPlane.w;
        
        reflectionMatrix.m30 = 0;
        reflectionMatrix.m31 = 0;
        reflectionMatrix.m32 = 0;
        reflectionMatrix.m33 = 1;
        
        return reflectionMatrix;
    }
    
    Vector3 ReflectPosition(Vector3 position, Vector4 plane)
    {
        float distance = Vector3.Dot(new Vector3(plane.x, plane.y, plane.z), position) + plane.w;
        Vector3 reflection = position - 2 * distance * new Vector3(plane.x, plane.y, plane.z);
        return reflection;
    }
    
    Vector3 ReflectDirection(Vector3 direction, Vector3 normal)
    {
        return direction - 2 * Vector3.Dot(direction, normal) * normal;
    }
    
    void OnDestroy()
    {
        if (planeManager != null)
            planeManager.planesChanged -= OnPlanesChanged;
            
        if (reflectionTexture != null)
            reflectionTexture.Release();
            
        if (reflectionCamera != null)
            Destroy(reflectionCamera.gameObject);
    }
}
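
The LookAt placement above is an approximation. The standard planar-reflection formulation instead drives the reflection camera's view matrix directly with the mirror matrix and inverts culling while rendering; a sketch reusing the class's own helpers:

    // Matrix-driven variant of RenderReflection for ARReflectionRenderer.
    void RenderReflectionByMatrix(ARPlane plane)
    {
        Vector3 n = plane.transform.up;
        Vector3 p = plane.transform.position;
        Vector4 planeVec = new Vector4(n.x, n.y, n.z, -Vector3.Dot(n, p));
        
        // Mirror the main camera's view matrix across the plane
        reflectionCamera.worldToCameraMatrix =
            arCamera.worldToCameraMatrix * CalculateReflectionMatrix(planeVec);
        
        // Oblique near plane clips everything below the mirror
        reflectionCamera.projectionMatrix = arCamera.CalculateObliqueMatrix(
            CameraSpacePlane(reflectionCamera, p, n, 1.0f));
        
        // Mirroring flips triangle winding, so invert culling while rendering
        GL.invertCulling = true;
        reflectionCamera.Render();
        GL.invertCulling = false;
    }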

5.3 Transparent and Semi-Transparent Object Rendering

Rendering transparent objects in AR requires correct blending and depth ordering (a sorting sketch follows the class below):

using System.Linq;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

public class ARTransparencyRenderer : MonoBehaviour
{
    [SerializeField]
    private Camera arCamera;
    
    [SerializeField]
    private AROcclusionManager occlusionManager;
    
    [SerializeField]
    private Material transparencyBlendMaterial;
    
    [SerializeField]
    private LayerMask transparentObjectLayers;
    
    private CommandBuffer commandBuffer;
    private RenderTexture depthTexture;
    
    void Start()
    {
        if (arCamera == null)
            arCamera = Camera.main;
            
        // Create the depth texture
        depthTexture = new RenderTexture(Screen.width, Screen.height, 24, 
                                        RenderTextureFormat.Depth);
        
        // Create the command buffer
        commandBuffer = new CommandBuffer();
        commandBuffer.name = "AR Transparency Pass";
        
        // Configure the camera
        ConfigureARCamera();
        
        // Attach the command buffer to the camera
        arCamera.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, commandBuffer);
        
        // Subscribe to occlusion manager events
        if (occlusionManager != null)
        {
            occlusionManager.frameReceived += OnOcclusionFrameReceived;
        }
    }
    
    void ConfigureARCamera()
    {
        // Make sure the camera renders a depth texture
        arCamera.depthTextureMode |= DepthTextureMode.Depth;
    }
    
    void OnOcclusionFrameReceived(AROcclusionFrameEventArgs args)
    {
        UpdateTransparencyRendering();
    }
    
    void UpdateTransparencyRendering()
    {
        // Clear the previous commands
        commandBuffer.Clear();
        
        // Fetch the environment depth texture
        Texture environmentDepthTexture = null;
        if (occlusionManager != null)
        {
            environmentDepthTexture = occlusionManager.environmentDepthTexture;
        }
        
        if (environmentDepthTexture != null)
        {
            // First copy the environment depth into the depth texture
            commandBuffer.SetRenderTarget(depthTexture);
            commandBuffer.ClearRenderTarget(true, false, Color.black);
            
            // A dedicated material pass converts the environment depth
            commandBuffer.Blit(environmentDepthTexture, depthTexture, transparencyBlendMaterial, 0);
            
            // Publish the depth texture
            commandBuffer.SetGlobalTexture("_ARDepthTexture", depthTexture);
            
            // Find and configure every transparent object
            Renderer[] transparentRenderers = FindTransparentRenderers();
            foreach (var renderer in transparentRenderers)
            {
                // Give each material access to the depth texture
                foreach (var material in renderer.materials)
                {
                    material.SetTexture("_ARDepthTexture", depthTexture);
                    material.SetMatrix("_ARWorldToCameraMatrix", arCamera.worldToCameraMatrix);
                    material.SetMatrix("_ARProjectionMatrix", arCamera.projectionMatrix);
                }
            }
        }
    }
    
    Renderer[] FindTransparentRenderers()
    {
        // All renderers on the transparent layers
        return FindObjectsOfType<Renderer>().Where(r => 
            (transparentObjectLayers.value & (1 << r.gameObject.layer)) != 0).ToArray();
    }
    
    // Note: OnRenderImage is only invoked by the built-in render pipeline
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Custom post-processing can refine the transparency rendering here
        if (transparencyBlendMaterial != null)
        {
            // Blend the final image with the AR camera image
            Graphics.Blit(source, destination, transparencyBlendMaterial, 1);
        }
        else
        {
            Graphics.Blit(source, destination);
        }
    }
    
    void OnDestroy()
    {
        if (arCamera != null)
            arCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, commandBuffer);
            
        if (commandBuffer != null)
            commandBuffer.Dispose();
            
        if (depthTexture != null)
            depthTexture.Release();
            
        if (occlusionManager != null)
            occlusionManager.frameReceived -= OnOcclusionFrameReceived;
    }
}
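
The class above hands the depth texture to transparent materials but never orders the draws. Unity sorts transparent objects per renderer automatically; when finer control is needed, an explicit back-to-front sort can be layered on top. A minimal sketch as a hypothetical helper for the same class:

    // Order transparent renderers back to front by squared view distance,
    // the classic ordering for correct alpha blending.
    Renderer[] SortBackToFront(Renderer[] renderers)
    {
        Vector3 camPos = arCamera.transform.position;
        return renderers
            .OrderByDescending(r => (r.bounds.center - camPos).sqrMagnitude)
            .ToArray();
    }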

6. Post-Processing and Effects

6.1 AR-Specific Post-Processing Effects

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.XR.ARFoundation;

public class ARPostProcessingManager : MonoBehaviour
{
    [SerializeField]
    private Volume postProcessVolume;
    
    [SerializeField]
    private ARCameraManager arCameraManager;
    
    // Post-processing parameters
    [SerializeField, Range(0, 1)]
    private float bloomIntensity = 0.5f;
    
    [SerializeField, Range(0, 1)]
    private float vignetteIntensity = 0.3f;
    
    [SerializeField, Range(-100, 100)]
    private float colorAdjustment = 0f;
    
    // Post-processing override references
    private Bloom bloom;
    private Vignette vignette;
    private ColorAdjustments colorAdjustments;
    private WhiteBalance whiteBalance;
    private DepthOfField depthOfField;
    
    // Cached ambient brightness
    private float currentBrightness = 1.0f;
    
    void Start()
    {
        // Fetch the post-processing overrides from the volume profile
        if (postProcessVolume != null && postProcessVolume.profile != null)
        {
            postProcessVolume.profile.TryGet(out bloom);
            postProcessVolume.profile.TryGet(out vignette);
            postProcessVolume.profile.TryGet(out colorAdjustments);
            postProcessVolume.profile.TryGet(out whiteBalance);
            postProcessVolume.profile.TryGet(out depthOfField);
            
            // Apply the starting values
            ApplyInitialSettings();
        }
        
        // Subscribe to camera frame events
        if (arCameraManager != null)
        {
            arCameraManager.frameReceived += OnCameraFrameReceived;
        }
    }
    
    void ApplyInitialSettings()
    {
        // Initial bloom
        if (bloom != null)
        {
            bloom.active = true;
            bloom.intensity.value = bloomIntensity;
        }
        
        // Initial vignette
        if (vignette != null)
        {
            vignette.active = true;
            vignette.intensity.value = vignetteIntensity;
        }
        
        // Initial color adjustments
        if (colorAdjustments != null)
        {
            colorAdjustments.active = true;
            colorAdjustments.saturation.value = colorAdjustment;
        }
        
        // Depth of field starts disabled
        if (depthOfField != null)
        {
            depthOfField.active = false;
        }
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        // Adapt post-processing to the ambient light
        if (args.lightEstimation.averageBrightness.HasValue)
        {
            currentBrightness = args.lightEstimation.averageBrightness.Value;
            UpdatePostProcessingForEnvironment();
        }
        
        // Adapt post-processing to image statistics
        if (args.cameraGrainTexture != null)
        {
            // The camera grain texture can drive custom noise-matching effects
        }
    }
    
    void UpdatePostProcessingForEnvironment()
    {
        // Scale the post-processing parameters with ambient brightness
        
        // Darker environments get stronger bloom
        if (bloom != null)
        {
            float dynamicBloomIntensity = Mathf.Lerp(bloomIntensity * 1.5f, bloomIntensity * 0.5f, currentBrightness);
            bloom.intensity.value = Mathf.Clamp(dynamicBloomIntensity, 0, 2);
        }
        
        // Darker environments get a stronger vignette
        if (vignette != null)
        {
            float dynamicVignetteIntensity = Mathf.Lerp(vignetteIntensity * 1.5f, vignetteIntensity, currentBrightness);
            vignette.intensity.value = Mathf.Clamp(dynamicVignetteIntensity, 0, 1);
        }
        
        // Darker environments get more contrast
        if (colorAdjustments != null)
        {
            float dynamicContrast = Mathf.Lerp(20, 0, currentBrightness);
            colorAdjustments.contrast.value = dynamicContrast;
        }
    }
    
    // Enable depth of field for specific scenes
    public void EnableDepthOfField(float focusDistance, float aperture)
    {
        if (depthOfField != null)
        {
            depthOfField.active = true;
            depthOfField.mode.value = DepthOfFieldMode.Bokeh;
            depthOfField.focusDistance.value = focusDistance;
            depthOfField.aperture.value = aperture;
        }
    }
    
    // Disable depth of field
    public void DisableDepthOfField()
    {
        if (depthOfField != null)
        {
            depthOfField.active = false;
        }
    }
    
    // Apply a post-processing look for a given kind of AR content
    public void ApplyARContentSpecificPostProcess(ARContentType contentType)
    {
        switch (contentType)
        {
            case ARContentType.Fantasy:
                // Fantasy look: more bloom and saturation
                if (bloom != null) bloom.intensity.value = 1.0f;
                if (colorAdjustments != null) colorAdjustments.saturation.value = 20;
                break;
                
            case ARContentType.SciFi:
                // Sci-fi look: cooler color temperature (via the WhiteBalance
                // override, which owns temperature in URP) and slight desaturation
                if (whiteBalance != null) whiteBalance.temperature.value = -20;
                if (colorAdjustments != null) colorAdjustments.saturation.value = -10;
                break;
                
            case ARContentType.Horror:
                // Horror look: heavy vignette and desaturation
                if (vignette != null) vignette.intensity.value = 0.7f;
                if (colorAdjustments != null) colorAdjustments.saturation.value = -50;
                break;
                
            default:
                // Restore the defaults
                ApplyInitialSettings();
                break;
        }
    }
    
    void OnDestroy()
    {
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
    }
}

// AR content type enum
public enum ARContentType
{
    Default,
    Fantasy,
    SciFi,
    Horror,
    Cartoon
}

6.2 Lens Distortion and Edge Effects

Handling AR camera lens distortion and screen-edge effects:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ARLensEffectsManager : MonoBehaviour
{
    [SerializeField]
    private Material lensDistortionMaterial;
    
    [SerializeField]
    private ARCameraManager arCameraManager;
    
    [SerializeField, Range(-1, 1)]
    private float distortionAmount = 0.0f;
    
    [SerializeField, Range(0, 1)]
    private float chromaticAberrationAmount = 0.0f;
    
    [SerializeField, Range(0, 1)]
    private float vignetteAmount = 0.2f;
    
    private Camera arCamera;
    private CommandBuffer commandBuffer;
    
    // Previous camera position, used to estimate camera speed
    // (ARCameraFrameEventArgs carries no velocity, so it is derived from the pose)
    private Vector3 lastCameraPosition;
    private bool hasLastCameraPosition;
    
    void Start()
    {
        arCamera = GetComponent<Camera>();
        
        // Create the command buffer (built-in render pipeline)
        commandBuffer = new CommandBuffer();
        commandBuffer.name = "AR Lens Effects";
        
        // Initialize the lens material
        InitializeLensMaterial();
        
        // Attach the command buffer to the camera
        arCamera.AddCommandBuffer(CameraEvent.AfterEverything, commandBuffer);
        
        // Subscribe to camera frame events
        if (arCameraManager != null)
        {
            arCameraManager.frameReceived += OnCameraFrameReceived;
        }
    }
    
    void InitializeLensMaterial()
    {
        if (lensDistortionMaterial != null)
        {
            // Starting parameters
            lensDistortionMaterial.SetFloat("_DistortionAmount", distortionAmount);
            lensDistortionMaterial.SetFloat("_ChromaticAberration", chromaticAberrationAmount);
            lensDistortionMaterial.SetFloat("_VignetteAmount", vignetteAmount);
        }
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        // Update the lens effects from the AR camera state
        UpdateLensEffects();
    }
    
    void UpdateLensEffects()
    {
        if (lensDistortionMaterial == null) return;
        
        // Estimate camera speed from the pose delta between frames
        float cameraSpeed = 0f;
        Vector3 currentPosition = arCamera.transform.position;
        if (hasLastCameraPosition && Time.deltaTime > 0f)
        {
            cameraSpeed = (currentPosition - lastCameraPosition).magnitude / Time.deltaTime;
        }
        lastCameraPosition = currentPosition;
        hasLastCameraPosition = true;
        
        // Scale distortion and chromatic aberration with camera speed
        float dynamicDistortion = distortionAmount + Mathf.Min(0.1f, cameraSpeed * 0.01f);
        float dynamicChromaticAberration = chromaticAberrationAmount + 
                                         Mathf.Min(0.2f, cameraSpeed * 0.02f);
        
        // Apply the effect parameters
        lensDistortionMaterial.SetFloat("_DistortionAmount", dynamicDistortion);
        lensDistortionMaterial.SetFloat("_ChromaticAberration", dynamicChromaticAberration);
        
        // Refresh the edge vignette
        lensDistortionMaterial.SetFloat("_VignetteAmount", vignetteAmount);
        
        // Rebuild the command buffer
        UpdateCommandBuffer();
    }
    
    void UpdateCommandBuffer()
    {
        // Reset the command buffer
        commandBuffer.Clear();
        
        // Copy the screen into a temporary render target
        int screenCopyID = Shader.PropertyToID("_ScreenCopyTexture");
        commandBuffer.GetTemporaryRT(screenCopyID, -1, -1, 0, FilterMode.Bilinear);
        commandBuffer.Blit(BuiltinRenderTextureType.CurrentActive, screenCopyID);
        
        // Apply the lens effects back onto the screen
        commandBuffer.Blit(screenCopyID, BuiltinRenderTextureType.CurrentActive, lensDistortionMaterial);
        
        // Release the temporary RT
        commandBuffer.ReleaseTemporaryRT(screenCopyID);
    }
    
    // Public API: apply a strong distortion pulse
    public void ApplyStrongDistortion(float duration = 0.5f)
    {
        StartCoroutine(DistortionEffect(0.5f, duration));
    }
    
    // Public API: apply a chromatic aberration pulse
    public void ApplyChromaticAberration(float amount, float duration = 0.5f)
    {
        StartCoroutine(ChromaticAberrationEffect(amount, duration));
    }
    
    private System.Collections.IEnumerator DistortionEffect(float maxAmount, float duration)
    {
        float originalAmount = distortionAmount;
        float startTime = Time.time;
        
        // Ramp the distortion up
        while (Time.time < startTime + duration * 0.5f)
        {
            float t = (Time.time - startTime) / (duration * 0.5f);
            float currentAmount = Mathf.Lerp(originalAmount, maxAmount, t);
            lensDistortionMaterial.SetFloat("_DistortionAmount", currentAmount);
            yield return null;
        }
        
        // Hold the peak briefly
        lensDistortionMaterial.SetFloat("_DistortionAmount", maxAmount);
        yield return new WaitForSeconds(0.1f);
        
        // Ramp back down
        startTime = Time.time;
        while (Time.time < startTime + duration * 0.5f)
        {
            float t = (Time.time - startTime) / (duration * 0.5f);
            float currentAmount = Mathf.Lerp(maxAmount, originalAmount, t);
            lensDistortionMaterial.SetFloat("_DistortionAmount", currentAmount);
            yield return null;
        }
        
        // Restore the original value
        lensDistortionMaterial.SetFloat("_DistortionAmount", originalAmount);
    }
    
    private System.Collections.IEnumerator ChromaticAberrationEffect(float maxAmount, float duration)
    {
        float originalAmount = chromaticAberrationAmount;
        float startTime = Time.time;
        
        // Ease the aberration back toward its original value
        while (Time.time < startTime + duration)
        {
            float t = (Time.time - startTime) / duration;
            float currentAmount = Mathf.Lerp(maxAmount, originalAmount, t * t);
            lensDistortionMaterial.SetFloat("_ChromaticAberration", currentAmount);
            yield return null;
        }
        
        // Restore the original value
        lensDistortionMaterial.SetFloat("_ChromaticAberration", originalAmount);
    }
    
    void OnDestroy()
    {
        if (arCamera != null)
            arCamera.RemoveCommandBuffer(CameraEvent.AfterEverything, commandBuffer);
            
        if (commandBuffer != null)
            commandBuffer.Dispose();
            
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
    }
}

7. Performance Optimization

7.1 AR Rendering Performance Monitoring

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using System.Collections.Generic;

public class ARRenderingPerformanceMonitor : MonoBehaviour
{
    [SerializeField]
    private bool showOverlay = true;
    
    [SerializeField]
    private int performanceHistoryLength = 100;
    
    // Performance samples
    private float[] frameTimeHistory;
    private float[] gpuTimeHistory;
    private int historyIndex = 0;
    
    // Reused buffer for FrameTimingManager queries
    private FrameTiming[] latestTimings = new FrameTiming[1];
    
    // Statistics
    private float minFrameTime = float.MaxValue;
    private float maxFrameTime = 0f;
    private float avgFrameTime = 0f;
    
    // Current optimization state
    private ARRenderingQualityLevel currentQualityLevel = ARRenderingQualityLevel.Medium;
    
    // Overlay UI
    private GUIStyle guiStyle = new GUIStyle();
    private Rect windowRect = new Rect(20, 20, 250, 320);
    private bool showDetails = false;
    
    void Start()
    {
        // Allocate the history buffers
        frameTimeHistory = new float[performanceHistoryLength];
        gpuTimeHistory = new float[performanceHistoryLength];
        
        // Configure the GUI style
        guiStyle.normal.textColor = Color.white;
        guiStyle.fontSize = 14;
        guiStyle.padding = new RectOffset(10, 10, 5, 5);
        
        // First performance check after a warm-up period
        Invoke("CheckPerformance", 3.0f);
    }
    
    void Update()
    {
        // Record the frame time in milliseconds
        frameTimeHistory[historyIndex] = Time.deltaTime * 1000f;
        
        // Query GPU time via the FrameTimingManager where available
        // (requires the "Frame Timing Stats" player setting;
        // IsFeatureEnabled was added in Unity 2022.1)
#if UNITY_2022_1_OR_NEWER
        if (FrameTimingManager.IsFeatureEnabled())
        {
            FrameTimingManager.CaptureFrameTimings();
            if (FrameTimingManager.GetLatestTimings(1, latestTimings) > 0)
            {
                // gpuFrameTime is already reported in milliseconds
                gpuTimeHistory[historyIndex] = (float)latestTimings[0].gpuFrameTime;
            }
        }
#endif
        
        // Advance the ring-buffer index
        historyIndex = (historyIndex + 1) % performanceHistoryLength;
        
        // Refresh statistics every 10 frames
        if (historyIndex % 10 == 0)
        {
            UpdateStatistics();
        }
    }
    
    void UpdateStatistics()
    {
        // Frame-time statistics
        float sum = 0f;
        minFrameTime = float.MaxValue;
        maxFrameTime = 0f;
        
        for (int i = 0; i < performanceHistoryLength; i++)
        {
            float frameTime = frameTimeHistory[i];
            if (frameTime > 0)
            {
                sum += frameTime;
                minFrameTime = Mathf.Min(minFrameTime, frameTime);
                maxFrameTime = Mathf.Max(maxFrameTime, frameTime);
            }
        }
        
        avgFrameTime = sum / performanceHistoryLength;
    }
    
    void CheckPerformance()
    {
        // Compare the average frame time against the target and recommend changes
        float targetFrameTime = 16.67f; // 60 FPS
        
        ARRenderingQualityLevel recommendedLevel = currentQualityLevel;
        
        if (avgFrameTime > targetFrameTime * 1.5f)
        {
            // Struggling: step the quality down
            recommendedLevel = currentQualityLevel > ARRenderingQualityLevel.Low ? 
                             currentQualityLevel - 1 : ARRenderingQualityLevel.Low;
        }
        else if (avgFrameTime < targetFrameTime * 0.7f && maxFrameTime < targetFrameTime)
        {
            // Headroom available: step the quality up
            recommendedLevel = currentQualityLevel < ARRenderingQualityLevel.Ultra ? 
                             currentQualityLevel + 1 : ARRenderingQualityLevel.Ultra;
        }
        
        // Apply the recommendation if it changed
        if (recommendedLevel != currentQualityLevel)
        {
            Debug.Log($"AR performance monitor: recommending a quality change from {currentQualityLevel} to {recommendedLevel}");
            
            // Delegate the actual change to the quality controller
            ARRenderingQualityController.SetQualityLevel(recommendedLevel);
            
            currentQualityLevel = recommendedLevel;
        }
        
        // Schedule the next check
        Invoke("CheckPerformance", 5.0f);
    }
    
    void OnGUI()
    {
        if (!showOverlay) return;
        
        windowRect = GUI.Window(0, windowRect, DoWindow, "AR Rendering Performance");
    }
    
    void DoWindow(int id)
    {
        // Headline statistics
        float fps = avgFrameTime > 0f ? 1000f / avgFrameTime : 0f;
        GUI.Label(new Rect(10, 25, 230, 20), $"FPS: {fps:F1}", guiStyle);
        GUI.Label(new Rect(10, 45, 230, 20), $"Frame time: {avgFrameTime:F2}ms", guiStyle);
        
        // Quality level
        GUI.Label(new Rect(10, 65, 230, 20), $"Current quality: {currentQualityLevel}", guiStyle);
        
        // Performance rating
        string performanceRating = GetPerformanceRating(fps);
        GUI.Label(new Rect(10, 85, 230, 20), $"Rating: {performanceRating}", guiStyle);
        
        // Details toggle
        if (GUI.Button(new Rect(10, 110, 230, 30), showDetails ? "Hide details" : "Show details"))
        {
            showDetails = !showDetails;
        }
        
        // Apply the recommended optimizations
        if (GUI.Button(new Rect(10, 145, 230, 30), "Apply recommended optimizations"))
        {
            ApplyRecommendedOptimizations();
        }
        
        // Detail panel
        if (showDetails)
        {
            GUI.Label(new Rect(10, 180, 230, 20), $"Min frame time: {minFrameTime:F2}ms", guiStyle);
            GUI.Label(new Rect(10, 200, 230, 20), $"Max frame time: {maxFrameTime:F2}ms", guiStyle);
            GUI.Label(new Rect(10, 220, 230, 20), $"GPU time: {GetAverageGPUTime():F2}ms", guiStyle);
            
            // Bottleneck analysis
            string bottleneck = AnalyzeBottleneck();
            GUI.Label(new Rect(10, 240, 230, 20), $"Bottleneck: {bottleneck}", guiStyle);
            
            // Detected issues
            List<string> issues = AnalyzePerformanceIssues();
            for (int i = 0; i < issues.Count; i++)
            {
                GUI.Label(new Rect(10, 260 + i * 20, 230, 20), $"• {issues[i]}", guiStyle);
            }
        }
        
        // Make the window draggable
        GUI.DragWindow();
    }
    
    string GetPerformanceRating(float fps)
    {
        if (fps >= 55) return "Excellent";
        if (fps >= 45) return "Good";
        if (fps >= 30) return "Fair";
        if (fps >= 20) return "Poor";
        return "Bad";
    }
    
    float GetAverageGPUTime()
    {
        float sum = 0f;
        int count = 0;
        
        for (int i = 0; i < performanceHistoryLength; i++)
        {
            if (gpuTimeHistory[i] > 0)
            {
                sum += gpuTimeHistory[i];
                count++;
            }
        }
        
        return count > 0 ? sum / count : 0;
    }
    
    string AnalyzeBottleneck()
    {
        float avgGpuTime = GetAverageGPUTime();
        float cpuOverhead = avgFrameTime - avgGpuTime;
        
        if (avgGpuTime > 0)
        {
            if (cpuOverhead > avgGpuTime * 1.5f)
                return "CPU";
            else if (avgGpuTime > cpuOverhead * 1.5f)
                return "GPU";
            else
                return "CPU+GPU";
        }
        
        // No GPU timings: the bottleneck cannot be determined
        return "Unknown";
    }
    
    List<string> AnalyzePerformanceIssues()
    {
        List<string> issues = new List<string>();
        
        // Unstable frame rate
        if (maxFrameTime > minFrameTime * 2)
        {
            issues.Add("Unstable frame rate");
        }
        
        // Excessive GPU load
        float avgGpuTime = GetAverageGPUTime();
        if (avgGpuTime > 12)
        {
            issues.Add("GPU load too high");
        }
        
        // AR-specific issues
        ARSession arSession = FindObjectOfType<ARSession>();
        if (arSession != null)
        {
            // Check whether tracking is running
            var arSubsystem = arSession.subsystem;
            if (arSubsystem != null && !arSubsystem.running)
            {
                issues.Add("AR tracking is not running");
            }
        }
        
        // Overly high rendering resolution
        float resolutionScale = QualitySettings.resolutionScalingFixedDPIFactor;
        if (resolutionScale > 1.5f)
        {
            issues.Add("Rendering resolution may be too high");
        }
        
        return issues;
    }
    
    void ApplyRecommendedOptimizations()
    {
        // Identify the current bottleneck
        string bottleneck = AnalyzeBottleneck();
        
        // Pick an optimization strategy per bottleneck
        if (bottleneck == "GPU")
        {
            // GPU-bound optimizations
            ARRenderingQualityController.OptimizeForGPUPerformance();
        }
        else if (bottleneck == "CPU")
        {
            // CPU-bound optimizations
            ARRenderingQualityController.OptimizeForCPUPerformance();
        }
        else
        {
            // Balanced optimizations
            ARRenderingQualityController.ApplyBalancedOptimization();
        }
        
        // Record the resulting quality level
        currentQualityLevel = ARRenderingQualityController.GetCurrentQualityLevel();
        
        Debug.Log("Applied the recommended AR rendering optimizations");
    }
}

// Quality level enum
public enum ARRenderingQualityLevel
{
    Low,
    Medium,
    High,
    Ultra
}

// Rendering quality controller (would normally live in its own file)
public static class ARRenderingQualityController
{
    private static ARRenderingQualityLevel currentLevel = ARRenderingQualityLevel.Medium;
    
    public static void SetQualityLevel(ARRenderingQualityLevel level)
    {
        currentLevel = level;
        
        switch (level)
        {
            case ARRenderingQualityLevel.Low:
                ApplyLowQualitySettings();
                break;
            case ARRenderingQualityLevel.Medium:
                ApplyMediumQualitySettings();
                break;
            case ARRenderingQualityLevel.High:
                ApplyHighQualitySettings();
                break;
            case ARRenderingQualityLevel.Ultra:
                ApplyUltraQualitySettings();
                break;
        }
    }
    
    public static ARRenderingQualityLevel GetCurrentQualityLevel()
    {
        return currentLevel;
    }
    
    public static void OptimizeForGPUPerformance()
    {
        // Reduce GPU-heavy settings
        QualitySettings.shadows = ShadowQuality.Disable;
        QualitySettings.shadowResolution = ShadowResolution.Low;
        QualitySettings.antiAliasing = 0;
        QualitySettings.softParticles = false;
        
        // Lower the rendering resolution
        QualitySettings.resolutionScalingFixedDPIFactor = 0.75f;
    }
    
    public static void OptimizeForCPUPerformance()
    {
        // Reduce CPU-heavy settings
        QualitySettings.shadowCascades = 1;
        QualitySettings.realtimeReflectionProbes = false;
        QualitySettings.maximumLODLevel = 2;
        
        // Physics cost: fewer solver iterations at the standard 50 Hz fixed timestep
        Physics.defaultSolverIterations = 2;
        Time.fixedDeltaTime = 0.02f;
    }
    
    public static void ApplyBalancedOptimization()
    {
        // Balanced optimization settings
        QualitySettings.shadows = ShadowQuality.HardOnly;
        QualitySettings.shadowResolution = ShadowResolution.Medium;
        QualitySettings.shadowCascades = 2;
        QualitySettings.antiAliasing = 0;
        QualitySettings.realtimeReflectionProbes = false;
        QualitySettings.softParticles = false;
        QualitySettings.resolutionScalingFixedDPIFactor = 1.0f;
    }
    
    private static void ApplyLowQualitySettings()
    {
        QualitySettings.SetQualityLevel(0, true);
        QualitySettings.shadows = ShadowQuality.Disable;
        QualitySettings.antiAliasing = 0;
        QualitySettings.realtimeReflectionProbes = false;
        QualitySettings.softParticles = false;
        QualitySettings.resolutionScalingFixedDPIFactor = 0.6f;
    }
    
    private static void ApplyMediumQualitySettings()
    {
        QualitySettings.SetQualityLevel(1, true);
        QualitySettings.shadows = ShadowQuality.HardOnly;
        QualitySettings.shadowResolution = ShadowResolution.Low;
        QualitySettings.shadowCascades = 1;
        QualitySettings.antiAliasing = 0;
        QualitySettings.realtimeReflectionProbes = false;
        QualitySettings.softParticles = false;
        QualitySettings.resolutionScalingFixedDPIFactor = 0.8f;
    }
    
    private static void ApplyHighQualitySettings()
    {
        QualitySettings.SetQualityLevel(2, true);
        QualitySettings.shadows = ShadowQuality.All;
        QualitySettings.shadowResolution = ShadowResolution.Medium;
        QualitySettings.shadowCascades = 2;
        QualitySettings.antiAliasing = 2;
        QualitySettings.realtimeReflectionProbes = true;
        QualitySettings.softParticles = true;
        QualitySettings.resolutionScalingFixedDPIFactor = 1.0f;
    }
    
    private static void ApplyUltraQualitySettings()
    {
        QualitySettings.SetQualityLevel(3, true);
        QualitySettings.shadows = ShadowQuality.All;
        QualitySettings.shadowResolution = ShadowResolution.High;
        QualitySettings.shadowCascades = 4;
        QualitySettings.antiAliasing = 4;
        QualitySettings.realtimeReflectionProbes = true;
        QualitySettings.softParticles = true;
        QualitySettings.resolutionScalingFixedDPIFactor = 1.2f;
    }
}
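
Since ARRenderingQualityController is a static class, it can be driven from any bootstrap script. The following is a minimal sketch of picking an initial level from device memory; the ARQualityBootstrap name and the 3 GB / 6 GB cutoffs are illustrative assumptions, not recommendations from this article.

using UnityEngine;

public class ARQualityBootstrap : MonoBehaviour
{
    void Start()
    {
        // SystemInfo.systemMemorySize reports total system memory in MB.
        // The cutoffs below are illustrative assumptions only.
        int memoryMB = SystemInfo.systemMemorySize;

        if (memoryMB < 3000)
            ARRenderingQualityController.SetQualityLevel(ARRenderingQualityLevel.Low);
        else if (memoryMB < 6000)
            ARRenderingQualityController.SetQualityLevel(ARRenderingQualityLevel.Medium);
        else
            ARRenderingQualityController.SetQualityLevel(ARRenderingQualityLevel.High);

        Debug.Log($"Initial AR quality: {ARRenderingQualityController.GetCurrentQualityLevel()}");
    }
}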

7.2 Dynamic Performance Tuning

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.Rendering;

public class ARRenderingDynamicOptimizer : MonoBehaviour
{
    [SerializeField]
    private Camera arCamera;
    
    [SerializeField]
    private ARSession arSession;
    
    [SerializeField]
    private Volume postProcessVolume;
    
    // Parameters controlling dynamic optimization
    [SerializeField, Range(15, 60)]
    private float targetFrameRate = 30f;
    
    [SerializeField, Range(0.1f, 2.0f)]
    private float resolutionScaleFactor = 1.0f;
    
    [SerializeField]
    private bool enableDynamicResolution = true;
    
    [SerializeField]
    private bool enableDynamicEffects = true;
    
    // Performance monitoring
    private float[] frameTimeHistory = new float[20];
    private int frameIndex = 0;
    private float adaptationSpeed = 0.2f;
    
    // Current optimization state
    private float currentResolutionScale = 1.0f;
    private int currentShadowQuality = 2;
    private int currentEffectsLevel = 2;
    
    // Time of the most recent optimization pass
    private float lastOptimizationTime = 0f;
    
    void Start()
    {
        if (arCamera == null)
            arCamera = Camera.main;
            
        // Initialize current settings
        currentResolutionScale = resolutionScaleFactor;
        
        // Apply the initial settings
        ApplySettings();
    }
    
    void Update()
    {
        // Record the frame time
        frameTimeHistory[frameIndex] = Time.unscaledDeltaTime;
        frameIndex = (frameIndex + 1) % frameTimeHistory.Length;
        
        // Evaluate performance once per second and adjust if needed
        if (Time.time - lastOptimizationTime > 1.0f)
        {
            float avgFrameTime = CalculateAverageFrameTime();
            float currentFPS = 1.0f / avgFrameTime;
            
            // Below the target frame rate: reduce quality
            if (currentFPS < targetFrameRate * 0.9f)
            {
                OptimizeForPerformance(currentFPS);
            }
            // Above the target frame rate: headroom to raise quality
            else if (currentFPS > targetFrameRate * 1.2f)
            {
                OptimizeForQuality(currentFPS);
            }
            
            lastOptimizationTime = Time.time;
        }
    }
    
    float CalculateAverageFrameTime()
    {
        float sum = 0f;
        for (int i = 0; i < frameTimeHistory.Length; i++)
        {
            sum += frameTimeHistory[i];
        }
        return sum / frameTimeHistory.Length;
    }
    
    void OptimizeForPerformance(float currentFPS)
    {
        // The current frame rate is below target: lower quality to regain performance
        
        bool madeSomeChange = false;
        
        // 1. Lower the render resolution
        if (enableDynamicResolution && currentResolutionScale > 0.5f)
        {
            // Scale the resolution dynamically; the larger the performance gap,
            // the larger the reduction
            float performanceRatio = currentFPS / targetFrameRate;
            float targetScale = Mathf.Lerp(currentResolutionScale, 
                                          currentResolutionScale * (0.8f + 0.2f * performanceRatio), 
                                          adaptationSpeed);
            
            // Keep the resolution from dropping too low
            targetScale = Mathf.Max(0.5f, targetScale);
            
            if (Mathf.Abs(targetScale - currentResolutionScale) > 0.02f)
            {
                currentResolutionScale = targetScale;
                madeSomeChange = true;
            }
        }
        
        // 2. Lower shadow quality
        if (currentShadowQuality > 0 && currentFPS < targetFrameRate * 0.7f)
        {
            currentShadowQuality--;
            madeSomeChange = true;
        }
        
        // 3. Reduce post-processing effects
        if (enableDynamicEffects && currentEffectsLevel > 0 && currentFPS < targetFrameRate * 0.6f)
        {
            currentEffectsLevel--;
            madeSomeChange = true;
        }
        
        // Apply the changes
        if (madeSomeChange)
        {
            ApplySettings();
            Debug.Log($"性能优化: FPS={currentFPS:F1} → 分辨率={currentResolutionScale:F2}, 阴影={currentShadowQuality}, 特效={currentEffectsLevel}");
        }
    }
    
    void OptimizeForQuality(float currentFPS)
    {
        // The current frame rate is above target: quality can be raised
        
        bool madeSomeChange = false;
        
        // 1. Raise post-processing effects (restored first)
        if (enableDynamicEffects && currentEffectsLevel < 3 && currentFPS > targetFrameRate * 1.3f)
        {
            currentEffectsLevel++;
            madeSomeChange = true;
        }
        // 2. Raise shadow quality
        else if (currentShadowQuality < 2 && currentFPS > targetFrameRate * 1.4f)
        {
            currentShadowQuality++;
            madeSomeChange = true;
        }
        // 3. Raise the render resolution (restored last)
        else if (enableDynamicResolution && currentResolutionScale < resolutionScaleFactor)
        {
            // Gradually recover toward the target resolution
            float targetScale = Mathf.Lerp(currentResolutionScale, 
                                          resolutionScaleFactor, 
                                          adaptationSpeed * 0.5f); // recovery is deliberately slower
            
            if (Mathf.Abs(targetScale - currentResolutionScale) > 0.02f)
            {
                currentResolutionScale = targetScale;
                madeSomeChange = true;
            }
        }
        
        // Apply the changes
        if (madeSomeChange)
        {
            ApplySettings();
            Debug.Log($"质量提升: FPS={currentFPS:F1} → 分辨率={currentResolutionScale:F2}, 阴影={currentShadowQuality}, 特效={currentEffectsLevel}");
        }
    }
    
    void ApplySettings()
    {
        // 1. Apply the resolution scale
        if (enableDynamicResolution)
        {
#if UNITY_2019_3_OR_NEWER
            ScalableBufferManager.ResizeBuffers(currentResolutionScale, currentResolutionScale);
#else
            // Older Unity versions need a platform-specific or plugin implementation
#endif
        }
        
        // 2. Apply shadow settings
        switch (currentShadowQuality)
        {
            case 0:
                QualitySettings.shadows = ShadowQuality.Disable;
                break;
            case 1:
                QualitySettings.shadows = ShadowQuality.HardOnly;
                QualitySettings.shadowResolution = ShadowResolution.Low;
                break;
            case 2:
                QualitySettings.shadows = ShadowQuality.All;
                QualitySettings.shadowResolution = ShadowResolution.Medium;
                break;
        }
        
        // 3. Apply effects settings
        if (enableDynamicEffects && postProcessVolume != null)
        {
            // Fetch the post-processing component
            UnityEngine.Rendering.Universal.Bloom bloom;
            if (postProcessVolume.profile.TryGet(out bloom))
            {
                switch (currentEffectsLevel)
                {
                    case 0:
                        bloom.active = false;
                        break;
                    case 1:
                        bloom.active = true;
                        bloom.intensity.value = 0.1f;
                        break;
                    case 2:
                        bloom.active = true;
                        bloom.intensity.value = 0.5f;
                        break;
                    case 3:
                        bloom.active = true;
                        bloom.intensity.value = 1.0f;
                        break;
                }
            }
            
            // Control of other post-processing effects...
        }
        
        // 4. AR session configuration
        if (arSession != null)
        {
            // ApplySettings() receives no FPS argument, so derive it from the frame history
            float currentFPS = 1.0f / CalculateAverageFrameTime();
            if (currentFPS < targetFrameRate * 0.6f)
            {
                // When performance is very poor, temporarily fall back to
                // rotation-only (3DoF) tracking to lighten the AR workload
                arSession.requestedTrackingMode = TrackingMode.RotationOnly;
            }
            else
            {
                // Restore full positional tracking
                arSession.requestedTrackingMode = TrackingMode.PositionAndRotation;
            }
        }
        }
    }
    
    // Public API for callers that respond to system events by adjusting rendering
    public void OptimizeForBatteryLevel(float batteryLevel)
    {
        // Save performance more aggressively when the battery is low
        if (batteryLevel < 0.2f)
        {
            targetFrameRate = 24;
            enableDynamicEffects = false;
            currentShadowQuality = 0;
            currentResolutionScale = 0.6f;
            ApplySettings();
        }
        else if (batteryLevel < 0.5f)
        {
            targetFrameRate = 30;
            currentShadowQuality = 1;
            currentResolutionScale = 0.8f;
            ApplySettings();
        }
    }
    
    public void OptimizeForThermalState(int thermalState)
    {
        // Thermal state: 0 = nominal, 1 = warm, 2 = hot, 3 = critically overheated
        switch (thermalState)
        {
            case 3: // critically overheated
                targetFrameRate = 20;
                enableDynamicEffects = false;
                currentShadowQuality = 0;
                currentResolutionScale = 0.5f;
                break;
                
            case 2: // hot
                targetFrameRate = 24;
                enableDynamicEffects = false;
                currentShadowQuality = 0;
                currentResolutionScale = 0.7f;
                break;
                
            case 1: // warm
                targetFrameRate = 30;
                enableDynamicEffects = true;
                currentEffectsLevel = 1;
                currentShadowQuality = 1;
                currentResolutionScale = 0.9f;
                break;
            
            case 0: // nominal
                // Restore the default settings
                targetFrameRate = 30;
                enableDynamicEffects = true;
                currentEffectsLevel = 2;
                currentShadowQuality = 2;
                currentResolutionScale = resolutionScaleFactor;
                break;
        }
        
        ApplySettings();
    }
}
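
OptimizeForBatteryLevel and OptimizeForThermalState are meant to be driven from outside. Below is a minimal sketch of such a driver; ARDeviceStateDriver is a hypothetical name, SystemInfo.batteryLevel returns -1 where the platform does not report battery, and Unity has no cross-platform thermal API, so the thermal hook is assumed to be called by a platform plugin.

using System.Collections;
using UnityEngine;

public class ARDeviceStateDriver : MonoBehaviour
{
    [SerializeField]
    private ARRenderingDynamicOptimizer optimizer;

    IEnumerator Start()
    {
        var wait = new WaitForSeconds(30f); // battery drains slowly; polling every 30 s is plenty

        while (true)
        {
            // SystemInfo.batteryLevel is 0..1, or -1 on platforms that do not report it
            float battery = SystemInfo.batteryLevel;
            if (battery >= 0f)
                optimizer.OptimizeForBatteryLevel(battery);

            yield return wait;
        }
    }

    // Unity exposes no cross-platform thermal API; a native plugin (for example
    // an iOS bridge reading ProcessInfo.thermalState) would call this with 0..3
    // as defined by ARRenderingDynamicOptimizer.OptimizeForThermalState.
    public void OnThermalStateChanged(int thermalState)
    {
        optimizer.OptimizeForThermalState(thermalState);
    }
}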

8. The Complete AR Rendering Pipeline

8.1 The AR Rendering Manager

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using System.Collections.Generic;

[RequireComponent(typeof(ARCameraManager))]
public class ARRenderingManager : MonoBehaviour
{
    // Core AR component references
    private ARCameraManager arCameraManager;
    private ARSession arSession;
    private ARPlaneManager arPlaneManager;
    private AROcclusionManager arOcclusionManager;
    
    // Rendering component references
    private Camera arCamera;
    
    [SerializeField]
    private Volume postProcessVolume;
    
    [SerializeField]
    private UniversalRenderPipelineAsset renderPipelineAsset;
    
    // Render-stage command buffers
    private CommandBuffer backgroundCommandBuffer;
    private CommandBuffer arEffectsCommandBuffer;
    private CommandBuffer postProcessCommandBuffer;
    
    // Render-texture resources
    private RenderTexture arBackgroundTexture;
    private RenderTexture depthTexture;
    private RenderTexture occlusionTexture;
    
    // Materials
    [SerializeField]
    private Material arBackgroundMaterial;
    
    [SerializeField]
    private Material occlusionMaterial;
    
    [SerializeField]
    private Material postEffectsMaterial;
    
    // Rendering options
    [SerializeField]
    private bool useOcclusionEffects = true;
    
    [SerializeField]
    private bool useEnvironmentLighting = true;
    
    [SerializeField]
    private bool useAdaptivePerformance = true;
    
    // Performance monitoring
    private ARRenderingPerformanceMonitor performanceMonitor;
    private ARRenderingDynamicOptimizer dynamicOptimizer;
    
    // Per-frame bookkeeping
    private float lastFrameProcessingTime = 0f;
    private int frameCount = 0;
    
    void Awake()
    {
        // Cache component references
        arCameraManager = GetComponent<ARCameraManager>();
        arCamera = GetComponent<Camera>();
        arSession = FindObjectOfType<ARSession>();
        arPlaneManager = FindObjectOfType<ARPlaneManager>();
        arOcclusionManager = GetComponent<AROcclusionManager>();
        
        // Create the performance-monitoring components
        if (useAdaptivePerformance)
        {
            performanceMonitor = gameObject.AddComponent<ARRenderingPerformanceMonitor>();
            dynamicOptimizer = gameObject.AddComponent<ARRenderingDynamicOptimizer>();
        }
        
        // Create the command buffers
        backgroundCommandBuffer = new CommandBuffer();
        backgroundCommandBuffer.name = "AR Background Rendering";
        
        arEffectsCommandBuffer = new CommandBuffer();
        arEffectsCommandBuffer.name = "AR Effects Rendering";
        
        postProcessCommandBuffer = new CommandBuffer();
        postProcessCommandBuffer.name = "AR Post Processing";
    }
    
    void Start()
    {
        // Initialize rendering resources
        InitializeRenderingResources();
        
        // Configure the render pipeline
        ConfigureRenderPipeline();
        
        // Subscribe to AR camera frame events
        arCameraManager.frameReceived += OnCameraFrameReceived;
        
        // Attach the command buffers to the camera.
        // Note: CameraEvent command buffers only execute on the built-in render
        // pipeline; under URP the equivalent hooks are ScriptableRenderPasses.
        arCamera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, backgroundCommandBuffer);
        arCamera.AddCommandBuffer(CameraEvent.BeforeForwardAlpha, arEffectsCommandBuffer);
        arCamera.AddCommandBuffer(CameraEvent.AfterEverything, postProcessCommandBuffer);
        
        // Initialize the occlusion system
        if (useOcclusionEffects && arOcclusionManager != null)
        {
            arOcclusionManager.frameReceived += OnOcclusionFrameReceived;
        }
    }
    
    void InitializeRenderingResources()
    {
        // Create the AR background texture
        arBackgroundTexture = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
        arBackgroundTexture.Create();
        
        // Create the depth texture
        depthTexture = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.Depth);
        depthTexture.Create();
        
        // Create the occlusion texture
        occlusionTexture = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.R8);
        occlusionTexture.Create();
        
        // Initialize materials
        if (arBackgroundMaterial == null)
        {
            arBackgroundMaterial = new Material(Shader.Find("Unlit/ARBackground"));
        }
        
        if (occlusionMaterial == null && useOcclusionEffects)
        {
            occlusionMaterial = new Material(Shader.Find("AR/OcclusionBlending"));
        }
        
        if (postEffectsMaterial == null)
        {
            postEffectsMaterial = new Material(Shader.Find("AR/PostEffects"));
        }
    }
    
    void ConfigureRenderPipeline()
    {
        if (renderPipelineAsset != null)
        {
            // Assign the URP asset
            GraphicsSettings.renderPipelineAsset = renderPipelineAsset;
            
            // Configure AR-specific rendering settings
            renderPipelineAsset.supportsCameraDepthTexture = true;
            renderPipelineAsset.supportsCameraOpaqueTexture = true;
            
            // Fetch the camera's URP additional data
            var additionalCameraData = arCamera.GetUniversalAdditionalCameraData();
            if (additionalCameraData != null)
            {
                additionalCameraData.renderPostProcessing = true;
                additionalCameraData.requiresDepthTexture = true;
                additionalCameraData.requiresColorTexture = true;
            }
        }
    }
    
    void OnCameraFrameReceived(ARCameraFrameEventArgs args)
    {
        frameCount++;
        
        // Record the processing start time (for performance analysis)
        float startTime = Time.realtimeSinceStartup;
        
        // Process the camera grain texture
        if (args.cameraGrainTexture != null)
        {
            // The camera grain texture can be applied to virtual content for added realism
            Shader.SetGlobalTexture("_ARCameraGrainTexture", args.cameraGrainTexture);
        }
        
        // Process the light estimation
        ProcessLightingEstimation(args);
        
        // Update background rendering
        UpdateBackgroundRendering(args);
        
        // Record the frame processing time
        lastFrameProcessingTime = Time.realtimeSinceStartup - startTime;
    }
    
    void ProcessLightingEstimation(ARCameraFrameEventArgs args)
    {
        if (!useEnvironmentLighting) return;
        
        // Fetch the light estimation data
        float? brightness = args.lightEstimation.averageBrightness;
        Color? colorCorrection = args.lightEstimation.colorCorrection;
        SphericalHarmonicsL2? sphericalHarmonics = args.lightEstimation.averageSphericalHarmonics;
        
        // Update the main directional light
        Light mainLight = RenderSettings.sun;
        if (mainLight != null)
        {
            // Adjust intensity
            if (brightness.HasValue)
            {
                mainLight.intensity = brightness.Value;
            }
            
            // Adjust color
            if (colorCorrection.HasValue)
            {
                mainLight.color = colorCorrection.Value;
            }
        }
        
        // Update the ambient lighting
        if (sphericalHarmonics.HasValue)
        {
            RenderSettings.ambientMode = AmbientMode.Skybox;
            RenderSettings.ambientProbe = sphericalHarmonics.Value;
        }
        
        // Pass the lighting parameters to shaders
        if (brightness.HasValue)
        {
            Shader.SetGlobalFloat("_ARBrightness", brightness.Value);
        }
        
        if (colorCorrection.HasValue)
        {
            Shader.SetGlobalColor("_ARColorCorrection", colorCorrection.Value);
        }
    }
    
    void UpdateBackgroundRendering(ARCameraFrameEventArgs args)
    {
        // Clear the previous commands
        backgroundCommandBuffer.Clear();
        
        // Fetch the background texture
        Texture backgroundTexture = arCameraManager.GetComponent<ARCameraBackground>().material.mainTexture;
        if (backgroundTexture != null)
        {
            // Fetch the camera intrinsics for correct texture mapping
            XRCameraIntrinsics intrinsics;
            if (arCameraManager.TryGetIntrinsics(out intrinsics))
            {
                arBackgroundMaterial.SetVector("_CameraIntrinsics", 
                    new Vector4(intrinsics.focalLength.x, intrinsics.focalLength.y, 
                               intrinsics.principalPoint.x, intrinsics.principalPoint.y));
            }
            
            // Pass the display transform to the shader. ARCameraManager has no
            // GetDisplayTransform(); the matrix arrives on the frame event args.
            if (args.displayMatrix.HasValue)
            {
                arBackgroundMaterial.SetMatrix("_DisplayTransform", args.displayMatrix.Value);
            }
            
            // Draw the background
            backgroundCommandBuffer.Blit(backgroundTexture, BuiltinRenderTextureType.CurrentActive, arBackgroundMaterial);
        }
    }
    
    void OnOcclusionFrameReceived(AROcclusionFrameEventArgs args)
    {
        // Rebuild the AR effects command buffer
        arEffectsCommandBuffer.Clear();
        
        // Environment depth map. The occlusion event args don't expose named
        // textures directly, so read them from the AROcclusionManager instead.
        Texture environmentDepthTexture = arOcclusionManager.environmentDepthTexture;
        if (environmentDepthTexture != null)
        {
            // Set the occlusion material parameters
            if (occlusionMaterial != null)
            {
                occlusionMaterial.SetTexture("_EnvironmentDepthTexture", environmentDepthTexture);
                occlusionMaterial.SetFloat("_ZNear", arCamera.nearClipPlane);
                occlusionMaterial.SetFloat("_ZFar", arCamera.farClipPlane);
                
                // Expose the depth texture globally for other materials
                Shader.SetGlobalTexture("_AREnvironmentDepthTexture", environmentDepthTexture);
            }
        }
        
        // Human occlusion
        Texture humanStencilTexture = arOcclusionManager.humanStencilTexture;
        Texture humanDepthTexture = arOcclusionManager.humanDepthTexture;
        if (humanStencilTexture != null && humanDepthTexture != null)
        {
            // Set the human-occlusion material parameters
            if (occlusionMaterial != null)
            {
                occlusionMaterial.SetTexture("_HumanStencilTexture", humanStencilTexture);
                occlusionMaterial.SetTexture("_HumanDepthTexture", humanDepthTexture);
                
                // Expose the human-occlusion textures globally
                Shader.SetGlobalTexture("_ARHumanStencilTexture", humanStencilTexture);
                Shader.SetGlobalTexture("_ARHumanDepthTexture", humanDepthTexture);
            }
        }
    }
    
    void Update()
    {
        // Refresh the post-processing command buffer
        UpdatePostProcessing();
    }
    
    void UpdatePostProcessing()
    {
        // Clear the previous commands
        postProcessCommandBuffer.Clear();
        
        // Allocate a temporary RT for post-processing
        int tempRT = Shader.PropertyToID("_ARPostProcessTemp");
        postProcessCommandBuffer.GetTemporaryRT(tempRT, -1, -1, 0, FilterMode.Bilinear);
        
        // Apply post-processing effects
        if (postEffectsMaterial != null)
        {
            // Feed the environment depth into post-processing
            if (arOcclusionManager != null && arOcclusionManager.environmentDepthTexture != null)
            {
                postEffectsMaterial.SetTexture("_DepthTex", arOcclusionManager.environmentDepthTexture);
            }
            
            // Set other post-processing parameters
            postEffectsMaterial.SetFloat("_FrameCount", frameCount);
            
            // Run the post-processing pass
            postProcessCommandBuffer.Blit(BuiltinRenderTextureType.CurrentActive, tempRT);
            postProcessCommandBuffer.Blit(tempRT, BuiltinRenderTextureType.CurrentActive, postEffectsMaterial);
        }
        
        // Release the temporary RT
        postProcessCommandBuffer.ReleaseTemporaryRT(tempRT);
    }
    
    // Public interface for controlling AR rendering settings
    public void SetOcclusionEnabled(bool enabled)
    {
        useOcclusionEffects = enabled;
        
        if (arOcclusionManager != null)
        {
            arOcclusionManager.requestedEnvironmentDepthMode = 
                enabled ? EnvironmentDepthMode.Fastest : EnvironmentDepthMode.Disabled;
        }
    }
    
    public void SetEnvironmentLightingEnabled(bool enabled)
    {
        useEnvironmentLighting = enabled;
        
        if (arCameraManager != null)
        {
            // ARFoundation 4.x exposes requestedLightEstimation (a flags enum)
            arCameraManager.requestedLightEstimation = enabled
                ? LightEstimation.AmbientIntensity | LightEstimation.AmbientColor | LightEstimation.AmbientSphericalHarmonics
                : LightEstimation.None;
        }
    }
    
    public void SetRenderingQuality(int qualityLevel)
    {
        if (dynamicOptimizer != null)
        {
            dynamicOptimizer.OptimizeForThermalState(3 - qualityLevel); // inverted: quality level 0 maps to the most severe thermal state
        }
        else
        {
            // Without an optimizer, set the quality level directly
            QualitySettings.SetQualityLevel(qualityLevel, true);
        }
    }
    
    // Performance statistics
    public string GetPerformanceStats()
    {
        string stats = $"帧处理时间: {lastFrameProcessingTime * 1000:F2}ms\n";
        stats += $"FPS: {1.0f / Time.smoothDeltaTime:F1}\n";
        return stats;
    }
    
    void OnDestroy()
    {
        // Unsubscribe from events
        if (arCameraManager != null)
            arCameraManager.frameReceived -= OnCameraFrameReceived;
            
        if (arOcclusionManager != null)
            arOcclusionManager.frameReceived -= OnOcclusionFrameReceived;
            
        // Remove the command buffers
        if (arCamera != null)
        {
            arCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, backgroundCommandBuffer);
            arCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardAlpha, arEffectsCommandBuffer);
            arCamera.RemoveCommandBuffer(CameraEvent.AfterEverything, postProcessCommandBuffer);
        }
        
        // Dispose the command buffers
        backgroundCommandBuffer?.Dispose();
        arEffectsCommandBuffer?.Dispose();
        postProcessCommandBuffer?.Dispose();
        
        // Release the render textures
        if (arBackgroundTexture != null)
            arBackgroundTexture.Release();
            
        if (depthTexture != null)
            depthTexture.Release();
            
        if (occlusionTexture != null)
            occlusionTexture.Release();
    }
}
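
The public setters above make the manager easy to drive from UI. The snippet below is a hypothetical hookup using Unity UI; ARSettingsPanel and its serialized widgets are illustrative assumptions, while the three setters are the ones defined on ARRenderingManager above.

using UnityEngine;
using UnityEngine.UI;

public class ARSettingsPanel : MonoBehaviour
{
    [SerializeField] private ARRenderingManager renderingManager;
    [SerializeField] private Toggle occlusionToggle;
    [SerializeField] private Toggle lightingToggle;
    [SerializeField] private Slider qualitySlider; // configure as whole numbers, 0..3

    void Start()
    {
        // Forward UI changes straight into the rendering manager's public API
        occlusionToggle.onValueChanged.AddListener(renderingManager.SetOcclusionEnabled);
        lightingToggle.onValueChanged.AddListener(renderingManager.SetEnvironmentLightingEnabled);
        qualitySlider.onValueChanged.AddListener(
            value => renderingManager.SetRenderingQuality(Mathf.RoundToInt(value)));
    }
}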

9. Summary

AR rendering in the Unity engine is a complex, precisely orchestrated process built on the cooperation of several subsystems:

  1. Camera capture: acquire the live real-world image and convert it into textures
  2. Pose tracking: determine the device's position and orientation in space
  3. Environment understanding: handle plane detection, depth estimation, and environment light estimation
  4. Foreground rendering: render virtual objects with correct occlusion and lighting
  5. Post-processing integration: blend virtual content seamlessly with the real image
  6. Performance optimization: adjust rendering settings dynamically for different devices and conditions

With a deep understanding of these stages, developers can build high-quality AR experiences in which virtual content blends naturally with the real world. As AR technology advances, Unity's AR rendering pipeline continues to improve, supporting ever more realistic lighting, shadows, reflections, and occlusion for more immersive AR experiences.
