  • A Hands-On Guide to Building an AR Coloring Book App

    A while ago my company took on an AR coloring book (AR涂涂乐) project. I had dabbled in AR and written small demos before, but had never built a complete AR project. After a bit more than a week of study I had learned all the techniques the project needed, and for that I salute all the programmers on the internet who share their knowledge. While learning I noticed that many blog posts offer only code with no explanation, so here is a more detailed coloring book tutorial.

    I. How AR Coloring Works

    Among the AR products on the market today, the coloring book is one of the more successful ones; its vivid, novel presentation has made it very popular in early childhood education. The principle is actually quite simple: the recognition image doubles as the coloring page, and the colors painted onto it are captured and rendered onto an otherwise blank 3D model.

    II. Production Pipeline

    Here is a rough summary of the full pipeline from model to finished AR app:

    1. The artists create the models and animations the AR app needs
    2. Once a model is done, its UVs are laid out to match the outline of the model on the recognition image
    3. With the UVs matched, the model and recognition image are handed to the programmers, who log in to the Vuforia site to add a license key and a target database
    4. The programmers download the Vuforia plugin and the target data and import both into the project
    5. Delete the default camera in the scene, add an ARCamera and an ImageTarget, and configure their parameters
    6. Place the coloring models into the scene and slice their animations
    7. Compute the world positions of the recognition image's four corners, grab a camera frame, and pass these to the shader; once the shader has run, the colors on the recognition image appear on the model
    8. Export to a phone; in my case the target platform is Android

    III. Illustrated Tutorial

    I will use my recent project as the example:

    1. I did not create the art myself, so here are the requirements I gave the artists at the time:

    • UVs must be properly unwrapped and matched to the recognition image
    • Animated models must stay independent; do not merge them with other models
    • Animations must loop cleanly, with two passes forming one cycle
    • The recognition image needs strong contrast
    • Use meaningful names, and group the different components of the same model
    • Units: meters
    • Format: FBX

    Deliverables:

    • The model (see 2.png)

    • The recognition image (see 1.png)

    Only the flower in this model is meant to be painted, so the recognition image is a blank flower outline.

    2. Vuforia preparation (license key and target data)

    • Register an account first if you have not already; note that Vuforia passwords must contain uppercase and lowercase letters and a special character
    • After registering, click Develop ---> Add License Key


    • For normal testing, Development is fine; give it a name and click Next


    • Then confirm


    • This License Key will be needed in the project later, so save it in a text file for now


    • Now it is time to add the recognition image: first create a Target Database

    • Then click the Target Database you just created


    • Add the recognition image


    • Wait patiently while it processes; do not close the page


    • Download the target database package for later


    • Download the Vuforia plugin for later


    3. Unity configuration

    • Set up the Android build environment (covered in detail in an earlier post of mine; link)

    • Switch the build platform to Android (if you do not switch early, the export may throw errors)


    • Import, in order: the Vuforia plugin package, the target database package, and the art assets


    • Delete the scene's default Camera, then add an AR Camera and an Image Target to the scene


    • Configure the AR Camera (this is where the License Key you saved earlier goes)


    • Configure the Image Target (select the imported database and the recognition image)


    • Drag the model under the Image Target and adjust its position


    4. Programming

    The overall idea: the main job at this stage is to take the texture information from the recognition image, transform it, and assign it to the model. Because of the animation, a model may consist of many small sub-modules, and each sub-module must go through this computation; here the flower's 7 petals are independent objects, so the computation runs 7 times.

    • Core code
    using UnityEngine;
    using Vuforia;
    using System.Collections;

    public class ARRender : MonoBehaviour
    {
        // Decoration scene, hidden until the user takes the snapshot
        public GameObject Scene;

        private Animator flowerAnimator;

        // The seven petals of the rainbow flower; each is painted separately
        public GameObject flower1;
        public GameObject flower2;
        public GameObject flower3;
        public GameObject flower4;
        public GameObject flower5;
        public GameObject flower6;
        public GameObject flower7;

        // Texture2D that stores the screenshot
        private Texture2D texture;

        // Screen width and height, cached at startup
        private int screenWidth;
        private int screenHeight;

        // World positions of the four corners of the real recognition image
        private Vector3 targetAnglePoint1;  // top left
        private Vector3 targetAnglePoint2;  // bottom left
        private Vector3 targetAnglePoint3;  // top right
        private Vector3 targetAnglePoint4;  // bottom right

        // Plane object that defines the size of the recognition image
        public GameObject plane;

        // Half of the plane's width and height
        private Vector2 halfSize;

        void Start()
        {
            screenWidth = Screen.width;
            screenHeight = Screen.height;

            // Allocate an empty texture for the screenshot
            texture = new Texture2D(screenWidth, screenHeight, TextureFormat.RGB24, false);

            flowerAnimator = this.GetComponent<Animator>();
        }

        // Hooked up to the UI button: grab the camera image and paint the petals
        public void ScreenShot()
        {
            Scene.SetActive(true);
            flowerAnimator.SetTrigger("FlowerRainbow");

            // Read the current frame buffer into the texture.
            // (Unity recommends calling ReadPixels at the end of the frame,
            // e.g. after WaitForEndOfFrame in a coroutine; wrap it that way
            // if you see read errors on your platform.)
            texture.ReadPixels(new Rect(0, 0, screenWidth, screenHeight), 0, 0);
            texture.Apply();

            // Half of the plane's world-space extents
            // (the 50x factor matches the plane scale used in this project)
            halfSize = new Vector2(
                plane.GetComponent<MeshFilter>().mesh.bounds.size.x,
                plane.GetComponent<MeshFilter>().mesh.bounds.size.z) * 50.0f * 0.5f;

            // World positions of the four corners of the recognition image
            targetAnglePoint1 = transform.parent.position + new Vector3(-halfSize.x, 0, halfSize.y);
            targetAnglePoint2 = transform.parent.position + new Vector3(-halfSize.x, 0, -halfSize.y);
            targetAnglePoint3 = transform.parent.position + new Vector3(halfSize.x, 0, halfSize.y);
            targetAnglePoint4 = transform.parent.position + new Vector3(halfSize.x, 0, -halfSize.y);

            // Build the view-projection (VP) matrix
            Matrix4x4 P = GL.GetGPUProjectionMatrix(Camera.main.projectionMatrix, false);
            Matrix4x4 V = Camera.main.worldToCameraMatrix;
            Matrix4x4 VP = P * V;

            // Pass the four corner positions, the VP matrix and the screenshot
            // to the shader of every petal
            GameObject[] petals = { flower1, flower2, flower3, flower4, flower5, flower6, flower7 };
            foreach (GameObject petal in petals)
            {
                Material mat = petal.GetComponent<Renderer>().material;
                mat.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
                mat.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
                mat.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
                mat.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
                mat.SetMatrix("_VP", VP);
                mat.mainTexture = texture;
            }
        }
    }
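
    Since all seven petals receive identical parameters, a further variation (my own, not what the project shipped) is to give every petal the same material asset and set the properties once through sharedMaterial instead of per-renderer material copies:

    // Hypothetical variation on the loop in ScreenShot: if the seven petals
    // all reference the same material asset, setting the parameters once on
    // sharedMaterial updates every petal at once.
    void ApplySharedParams(Renderer anyPetal, Matrix4x4 vp, Texture2D tex,
                           Vector3 p1, Vector3 p2, Vector3 p3, Vector3 p4)
    {
        Material shared = anyPetal.sharedMaterial;
        shared.SetVector("_Uvpoint1", new Vector4(p1.x, p1.y, p1.z, 1f));
        shared.SetVector("_Uvpoint2", new Vector4(p2.x, p2.y, p2.z, 1f));
        shared.SetVector("_Uvpoint3", new Vector4(p3.x, p3.y, p3.z, 1f));
        shared.SetVector("_Uvpoint4", new Vector4(p4.x, p4.y, p4.z, 1f));
        shared.SetMatrix("_VP", vp);
        shared.mainTexture = tex;
    }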
    
    • Shader
    Shader "AR paint/ToMaterial" {
        Properties {
            _MainTex ("Base (RGB)", 2D) = "white" {}
            _Uvpoint1("point1", Vector) = (0 , 0 , 0 , 0)
            _Uvpoint2("point2", Vector) = (0 , 0 , 0 , 0)
            _Uvpoint3("point3", Vector) = (0 , 0 , 0 , 0)
            _Uvpoint4("point4", Vector) = (0 , 0 , 0 , 0)
    
        }
        SubShader {
            Tags { "Queue"="Transparent" "RenderType"="Transparent" }
            LOD 200
    
            Pass{
                Blend SrcAlpha OneMinusSrcAlpha
    
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"
    
                sampler2D _MainTex;
                float4 _MainTex_ST;
                float4 _Uvpoint1;
                float4 _Uvpoint2;
                float4 _Uvpoint3;
                float4 _Uvpoint4;
    			float4x4 _VP;
    
                struct v2f {
                    float4  pos : SV_POSITION;
                    float2  uv : TEXCOORD0;
                    float4  fixedPos : TEXCOORD2;
                } ;
    
                v2f vert (appdata_base v)
                {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP,v.vertex);
                    o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
    				
                    float4 top = lerp(_Uvpoint1, _Uvpoint3, o.uv.x);
                    float4 bottom = lerp(_Uvpoint2, _Uvpoint4, o.uv.x);
                    float4 fixedPos = lerp(bottom, top, o.uv.y);
                    o.fixedPos = ComputeScreenPos(mul(UNITY_MATRIX_VP, fixedPos));
                    return o;
                }
    
                float4 frag (v2f i) : COLOR
                {
    			    
    			    float4 top = lerp(_Uvpoint1, _Uvpoint3, i.uv.x);
                    float4 bottom = lerp(_Uvpoint2, _Uvpoint4, i.uv.x);
                    float4 fixedPos = lerp(bottom, top, i.uv.y);
    				fixedPos = ComputeScreenPos(mul(_VP, fixedPos));
                    return tex2D(_MainTex, fixedPos.xy / fixedPos.w);
    				
                }
                ENDCG
            }
        }
        //FallBack "Diffuse"
    }
    
    • First create a Plane under the ImageTarget with exactly the same size as the recognition image. The core code converts the world coordinates of that Plane (i.e. of the recognition image) into screen coordinates, takes a screenshot as a texture, and hands all of this to the shader; a debug sketch for verifying the corner math follows below.

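    To check that the Plane really lines up with the recognition image, here is a small throwaway debug helper (my own addition, not part of the original project); it repeats ARRender's corner math and draws the outline in the Scene view:

    using UnityEngine;

    // Hypothetical debug helper: attach it to the same object as ARRender
    // (a child of the ImageTarget) and assign the same plane.
    public class CornerDebug : MonoBehaviour
    {
        public GameObject plane;

        void Update()
        {
            // Same corner math as ARRender.ScreenShot
            Vector2 halfSize = new Vector2(
                plane.GetComponent<MeshFilter>().mesh.bounds.size.x,
                plane.GetComponent<MeshFilter>().mesh.bounds.size.z) * 50.0f * 0.5f;
            Vector3 center = transform.parent.position;

            Vector3 p1 = center + new Vector3(-halfSize.x, 0, halfSize.y);   // top left
            Vector3 p2 = center + new Vector3(-halfSize.x, 0, -halfSize.y);  // bottom left
            Vector3 p3 = center + new Vector3(halfSize.x, 0, halfSize.y);    // top right
            Vector3 p4 = center + new Vector3(halfSize.x, 0, -halfSize.y);   // bottom right

            // Draw the recognition image's outline in the Scene view
            Debug.DrawLine(p1, p3, Color.red);
            Debug.DrawLine(p3, p4, Color.red);
            Debug.DrawLine(p4, p2, Color.red);
            Debug.DrawLine(p2, p1, Color.red);
        }
    }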

    • Drag the model into the ImageTarget; drag the decoration scene in as well and hide it for now. Assign the modules that need painting to the ARRender script's fields, then create a new material, set its shader to AR paint/ToMaterial, and assign this material to every module that gets painted, i.e. the 7 petals.
    • Create a Button, add an OnClick event, drag the Flower object in, and pick the ScreenShot method (this can also be wired up from code; see the sketch after the test screenshots).
    • Build the APK, install it on a phone, and test; these are my test results:

    (test screenshots: 85.jpg, 8.png)
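
    If you prefer wiring the Button from code instead of the Inspector, a minimal sketch (the class and field names are my own; ARRender is the script above):

    using UnityEngine;
    using UnityEngine.UI;

    // Hypothetical helper: equivalent to dragging Flower into the Button's
    // OnClick event and picking ScreenShot.
    public class ButtonBinder : MonoBehaviour
    {
        public Button screenshotButton;  // the UI Button
        public ARRender arRender;        // the ARRender component on Flower

        void Start()
        {
            screenshotButton.onClick.AddListener(arRender.ScreenShot);
        }
    }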

    • Animation

    • After receiving the animation, slice the clips as needed in the import settings and click Apply


    • Create an Animator Controller, drag the sliced clips into the state machine, right-click to create Transitions, and add a Trigger in the Parameters tab; the animation can then be driven from code
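
    The core code fires this trigger with SetTrigger("FlowerRainbow"). If you also need to know when a non-looping state has finished playing, a minimal sketch (my own helper; the trigger name is this project's):

    using System.Collections;
    using UnityEngine;

    // Hypothetical helper: fire a trigger, then report when the current
    // state has played through once (works for non-looping states).
    public class AnimationWatcher : MonoBehaviour
    {
        public Animator animator;

        public IEnumerator PlayAndWait(string trigger)
        {
            animator.SetTrigger(trigger);
            yield return null;  // give the transition a frame to start

            // normalizedTime reaches 1 when a non-looping state finishes one pass
            while (animator.GetCurrentAnimatorStateInfo(0).normalizedTime < 1f)
                yield return null;

            Debug.Log(trigger + " finished");
        }
    }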

    • Sound effects

    • Create an empty GameObject and name it Audio

    • Add an AudioSource component to it

    • Drag it under the ImageTarget

    • Open the DefaultTrackableEventHandler script on the ImageTarget, declare an AudioSource field, and add play and pause calls to the OnTrackingFound() and OnTrackingLost() methods respectively

    using UnityEngine;
    using Vuforia;

    public class DefaultTrackableEventHandler : MonoBehaviour,
                                                    ITrackableEventHandler
    {
        // AudioSource for this project's sound effect; assign it in the Inspector
        public AudioSource clothesAudioSource;
            #region PRIVATE_MEMBER_VARIABLES
     
            private TrackableBehaviour mTrackableBehaviour;
        
            #endregion // PRIVATE_MEMBER_VARIABLES
    
    
    
        #region UNITY_MONOBEHAVIOUR_METHODS
        
            void Start()
            {
                mTrackableBehaviour = GetComponent<TrackableBehaviour>();
                if (mTrackableBehaviour)
                {
                    mTrackableBehaviour.RegisterTrackableEventHandler(this);
                }
            }
    
        #endregion // UNITY_MONOBEHAVIOUR_METHODS
    
    
    
            #region PUBLIC_METHODS
    
            /// <summary>
            /// Implementation of the ITrackableEventHandler function called when the
            /// tracking state changes.
            /// </summary>
            public void OnTrackableStateChanged(
                                            TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
            {
                if (newStatus == TrackableBehaviour.Status.DETECTED ||
                    newStatus == TrackableBehaviour.Status.TRACKED ||
                    newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
                {
                    OnTrackingFound();
                }
                else
                {
                    OnTrackingLost();
                }
            }
    
            #endregion // PUBLIC_METHODS
    
    
    
            #region PRIVATE_METHODS
    
    
            private void OnTrackingFound()
            {
                Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
                Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);
    
                // Enable rendering:
                foreach (Renderer component in rendererComponents)
                {
                    component.enabled = true;
                }
    
                // Enable colliders:
                foreach (Collider component in colliderComponents)
                {
                    component.enabled = true;
                }
                if (!clothesAudioSource.isPlaying)
                {
                   clothesAudioSource.Play(); 
                }
                
    
                Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " found");
            }
    
    
            private void OnTrackingLost()
            {
                Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
                Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);
    
                // Disable rendering:
                foreach (Renderer component in rendererComponents)
                {
                    component.enabled = false;
                }
    
                // Disable colliders:
                foreach (Collider component in colliderComponents)
                {
                    component.enabled = false;
                }
    
                clothesAudioSource.Pause();
    
                Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " lost");
            }
    
            #endregion // PRIVATE_METHODS
        }
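    • Finally, drag the Audio object onto the script's clothesAudioSource field in the Inspector so the play and pause calls have something to act on.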
    
    • Camera focus

    Vuforia does not autofocus by default. Create a new script, paste in the code below, and drop the script onto the ARCamera.

    using UnityEngine;
    using System.Collections;

    public class Duijiao : MonoBehaviour
    {
        void Start()
        {
            // Request continuous autofocus as soon as the scene starts
            Vuforia.CameraDevice.Instance.SetFocusMode(Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
        }

        void Update()
        {
            // Re-apply every frame so the focus mode sticks even if the
            // camera is restarted (for example after the app is resumed)
            Vuforia.CameraDevice.Instance.SetFocusMode(Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
        }
    }
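
    Re-applying the focus mode every frame works, but it is heavier than necessary. A lighter variant (my own alternative, not from the original project) re-applies it only when the app resumes, which is when the device camera typically restarts:

    using UnityEngine;

    // Hypothetical alternative to the per-frame version above
    public class FocusOnResume : MonoBehaviour
    {
        void Start()
        {
            SetFocus();
        }

        // Unity calls this on pause and resume; re-apply autofocus on
        // resume because the device camera restarts then
        void OnApplicationPause(bool paused)
        {
            if (!paused)
                SetFocus();
        }

        private void SetFocus()
        {
            Vuforia.CameraDevice.Instance.SetFocusMode(
                Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
        }
    }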
    
    
    • Multi-target recognition

    • Raise the maximum number of simultaneous targets (there is also a code route; see the sketch below)


    • Drag in more ImageTargets; configure a different recognition image and model under each one, and multi-target recognition works

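    The target limit can also be set from code: the Vuforia Unity extension of this era exposed a hint API for it. The exact names vary by version, so treat this as an unverified sketch and check it against your Vuforia release:

    using UnityEngine;
    using Vuforia;

    // Hypothetical helper; the hint API and enum names depend on the Vuforia version
    public class MultiTargetHint : MonoBehaviour
    {
        void Start()
        {
            // Allow up to two image targets to be tracked at the same time
            VuforiaUnity.SetHint(VuforiaUnity.VuforiaHint.HINT_MAX_SIMULTANEOUS_IMAGE_TARGETS, 2);
        }
    }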

    • Test result: (screenshot)

    The code has not been refactored yet, so it is not the prettiest, but the upside is that it is beginner friendly and easy to follow.

  • Original post: https://www.cnblogs.com/qiaogaojian/p/6592171.html