ARCore for Android developers - pARt 1: The basics



Introduction

Nowadays, augmented reality sounds like a buzzword, but as an Android developer you actually have a fairly easy-to-use toolset for doing basic things - like showing a model - with only a few lines of code. The goal of this article is to introduce you to the tools and methods to use with the ARCore framework, focusing mostly on the Sceneform helper library.

First of all, you should have a look at the following guides:

If you are done with the guides, let's get started. You'll create an application in which you can add a chosen model to your augmented environment!

Preparation

This guide and sample application use Kotlin and coroutines, with a twist: all long-running tasks in Sceneform should be started from the main thread, and the library handles concurrency for us, but we'll use the suspending capabilities of coroutines anyway to keep the code sequential.

You'll need Android Studio 3.1 or newer with the Google Sceneform Tools (Beta) plugin installed. Hint: always make sure the plugin version matches the ARCore dependency version; a mismatch can lead to errors that are very hard to debug.

Create a new project with an Empty Activity and a minimum API level of 24. This seems pretty high right now, but Sceneform requires it, and most of the supported devices run at least this API level.

Dependencies

Make sure that your project-level build.gradle file contains the google() repository, and add the following to the app-level build.gradle:

android {
    compileOptions {
        sourceCompatibility 1.8
        targetCompatibility 1.8
    }
}

dependencies {
    // ARCore
    def ar_core_version = '1.14.0'
    implementation "com.google.ar:core:$ar_core_version"
    implementation "com.google.ar.sceneform.ux:sceneform-ux:$ar_core_version"
    implementation "com.google.ar.sceneform:core:$ar_core_version"

    // Coroutines
    def coroutines_version = '1.2.0'
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutines_version"
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-jdk8:$coroutines_version"
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:$coroutines_version"
}

The compileOptions block is necessary because the ARCore library relies on Java 8 features. Next to the usual coroutine dependencies, you may notice the jdk8 extension library, which bridges coroutines with the CompletableFuture API introduced in JDK 8.
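To see what that bridge buys you, here is a minimal, self-contained sketch (outside of Android) of how the jdk8 artifact's `await()` extension turns a `CompletableFuture` into a suspending call. The `loadAnswer` helper is hypothetical, purely for illustration:

```kotlin
import java.util.concurrent.CompletableFuture
import kotlinx.coroutines.future.await
import kotlinx.coroutines.runBlocking

// Hypothetical helper: wraps a CompletableFuture-producing call in a suspend function.
suspend fun loadAnswer(): Int =
    CompletableFuture.supplyAsync { 21 * 2 }.await()

fun main() = runBlocking {
    // await() suspends the coroutine instead of blocking the thread
    println(loadAnswer()) // prints 42
}
```

Sceneform's model-loading methods return `CompletableFuture`s, so later in this article the same `await()` call lets us load a model with straight-line code instead of callbacks.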

Manifest modifications

Next, you'll need to update the AndroidManifest.xml file:

<manifest ...>

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature android:name="android.hardware.camera.ar" />
    <uses-feature android:glEsVersion="0x00030000" android:required="true" />

    <application
        ...
        android:largeHeap="true"
        ... >
        ...
        <meta-data android:name="com.google.ar.core" android:value="required" />
        ...
    </application>

</manifest>

Here you're declaring the minimum OpenGL ES version and the CAMERA permission, marking AR as required, and restricting the application in the Play Store to AR-capable devices.

Add the sampledata folder

The next step is to change the project tab's view mode from Android to Project and create a new sampledata folder inside the app folder.


You can put all original model files into this folder. These won't be packaged into the final application, but will be part of the project. You'll use this folder later!

Would you be surprised if I said you are already halfway to your goal?

Plane finding

So let's assume you have a MainFragment or MainActivity that starts when the application is launched. Its layout XML should look like this:

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/arView"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>

The root ViewGroup contains only a single fragment element which is referencing ArFragment . This Fragment is a complete, all-in-one solution for handling the basic AR related configuration, checking the ARCore companion application availability, checking the API level, handling permissions, and so on.

Now you can install the application on an emulator - or preferably, a physical device. You should see something like this (with permission and companion app handling at first start, if needed):


As you can see, the built-in Fragment gives us a hand-waving icon that shows the user how to move the phone around, and when the system finds a plane, it highlights it with small white dots. Note that ARCore only detects planes on textured, non-homogeneous surfaces! So, for example, it's nearly impossible for it to detect a plain white wall or floor.
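If you want to react in code when the first plane is being tracked (for example, to hide a custom hint overlay), you can hook into the scene's per-frame update. This is a sketch that assumes the `arFragment` reference set up later in this article; `watchForPlanes` and the `planeFound` flag are hypothetical names:

```kotlin
import android.util.Log
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

private var planeFound = false

private fun watchForPlanes() {
    // Called once per rendered frame by Sceneform
    arFragment.arSceneView.scene.addOnUpdateListener {
        if (planeFound) return@addOnUpdateListener
        val session = arFragment.arSceneView.session ?: return@addOnUpdateListener
        // Look for any plane that ARCore is actively tracking
        val tracked = session
            .getAllTrackables(Plane::class.java)
            .firstOrNull { it.trackingState == TrackingState.TRACKING }
        if (tracked != null) {
            planeFound = true
            Log.d("AR", "First plane detected")
        }
    }
}
```

You don't need any of this for the sample app - ArFragment handles the guidance UI for you - but it's the usual entry point for custom plane-driven behavior.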

Add your model

Next, you'll need to find a model to use. You could use your own models made in Blender, 3ds Max, Maya, etc., or download one from the Internet. In my opinion, a good source is Sketchfab, where you can find free models under CC licensing, plus a "bonus feature". In many cases, you will face an issue where the textures don't appear on your model when you place it in the AR environment. There are many ways to handle this, but to keep it simple, you can download the model from Sketchfab automatically converted to glTF, which is one of the supported file formats. If that doesn't work either, I suggest looking for another model; as an Android developer, debugging or fixing 3D models is generally not worth the time.

Because a certain series is so popular right now (and I personally like it too), you will use a Baby Yoda model in the application, this one:

BABY YODA TEXTURES by olivier.gide on Sketchfab

A note about the model: it's made up of around 10,000 triangles and multiple image texture files, which means it's pretty complex. This greater model complexity comes with a greater memory footprint, which is why you added the largeHeap="true" option to the AndroidManifest.xml. At least it looks great!

You should save this as an auto-converted glTF, unpack it, and copy the model file with all related files (textures, .bin, etc.) into the previously created sampledata folder. Then, in Android Studio, right-click the .gltf file and select the Import Sceneform Asset option. This will open a dialog:


Here you can leave everything at its default and just click Finish.

If everything goes well, a Gradle task will start and convert your model into a Sceneform Asset (.sfa) and a Sceneform Binary (.sfb) file. You will find the latter in your src/main/assets folder, and this is what gets compiled into your application. The relation between the two is that the .sfb is generated from the .sfa, so you should always modify the .sfa file to apply any changes to your binary model. At the end of this tutorial, if you find that your model appears too small or too large, open the generated .sfa file, look for the scale parameter, and set it to your liking. For the Baby Yoda model, you can try setting it to 0.15.
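For orientation, the relevant part of a generated .sfa file looks roughly like the fragment below. The exact fields and values are illustrative - your generated file will differ depending on the model - but the scale entry is what you'd edit:

```
{
  model: {
    attributes: [ ... ],
    file: "sampledata/scene.gltf",
    name: "scene",
    scale: 0.15,
  },
  ...
}
```

After saving the .sfa, re-running the build regenerates the .sfb with the new scale.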

So right now you have a converted model and a working plane detecting application, but how do you add the model to the scene?

Placing the model

First, you should load the binary model into the ARCore framework. I assume you are familiar with coroutines and use a CoroutineScope somewhere in your application to handle background tasks. For the sake of simplicity, you can also use the lifecycleScope of a Fragment.

private lateinit var yodaModel: ModelRenderable

private fun loadModel() {
    lifecycleScope.launch {
        // build() returns a CompletableFuture; await() suspends until it completes
        yodaModel = ModelRenderable
            .builder()
            .setSource(
                context,
                Uri.parse("scene.sfb")
            )
            .build()
            .await()
        Toast.makeText(
            requireContext(),
            "Model available",
            Toast.LENGTH_SHORT
        ).show()
        initTapListener()
    }
}

Here, you build a ModelRenderable with a given source and await() its completion. The build method returns a CompletableFuture, and the aforementioned JDK8 coroutines library provides the await() extension for it. This component stores the model and is responsible for rendering it. The name in the Uri.parse() call must match the generated .sfb file name.
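If you'd rather not pull in coroutines at all, the same loading step can be written directly against the CompletableFuture API. This is a callback-style sketch of the equivalent; the function name and the minimal error handling are mine, not part of the original sample:

```kotlin
import android.net.Uri
import android.util.Log
import com.google.ar.sceneform.rendering.ModelRenderable

private fun loadModelWithCallbacks() {
    ModelRenderable.builder()
        .setSource(context, Uri.parse("scene.sfb"))
        .build()
        .thenAccept { renderable ->
            // Runs on the main thread once the model is ready
            yodaModel = renderable
            initTapListener()
        }
        .exceptionally { throwable ->
            Log.e("AR", "Unable to load model", throwable)
            null
        }
}
```

The coroutine version above reads more sequentially, which is exactly why we brought in the jdk8 bridge.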

Then you initiate the tap listener. For this, you need a reference to the contained Fragment instance:

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    arFragment = childFragmentManager.findFragmentById(R.id.arView) as ArFragment
    loadModel()
}

With that, the tap listener initialization is as follows:

private fun initTapListener() {
    arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
        val anchorNode = AnchorNode(
            hitResult.createAnchor()
        )
        anchorNode.setParent(arFragment.arSceneView.scene)
        val yodaNode = Node()
        yodaNode.renderable = yodaModel
        yodaNode.setParent(anchorNode)
    }
}

As you can see, it's pretty easy to add a model to your AR scene. In just a few steps:

  • Assign a tap listener to the Fragment, just like a click listener.
  • Create an anchor node from the given hitResult .
  • Set the Fragment's scene as its parent.
  • Create a Node() which will show the ModelRenderable and set the anchorNode as its parent.

And that's it, you are done! Build and run the application, find a plane, and place the model by tapping on it! Magic.
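As a possible next step, you might want the placed model to respond to gestures (drag, pinch-to-scale, twist). Sceneform's ux package ships a TransformableNode for exactly this; a sketch of the tap listener using it, with a hypothetical function name, could look like:

```kotlin
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.ux.TransformableNode

private fun initTapListenerWithGestures() {
    arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
        val anchorNode = AnchorNode(hitResult.createAnchor())
        anchorNode.setParent(arFragment.arSceneView.scene)
        // TransformableNode wires up drag/pinch/twist gesture handling for us
        val yodaNode = TransformableNode(arFragment.transformationSystem)
        yodaNode.renderable = yodaModel
        yodaNode.setParent(anchorNode)
        yodaNode.select()
    }
}
```

Swapping this in for the plain Node() version is all it takes; ArFragment already provides the transformationSystem.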


Summary

This guide should have given you a small introduction to AR development as an Android developer. I hope you liked this article, and the small but effective sample application.

You can find the source code here.

We are planning to release more AR related articles, so be sure to follow us!

