Author

Ryan D. Lewis

Publication Date

5-3-2019

Document Type

Website

First Advisor

Papka, Michael E.

Degree Name

B.S. (Bachelor of Science)

Legacy Department

Department of Computer Science

Abstract

When working in an augmented reality (AR) environment, a key component for successful interactions is close integration of the real and virtual worlds. To that end, this project sought to further current research into the 3D printing of easily trackable objects, which could then be used to influence actions in an AR environment. By using these objects to dynamically generate virtual canvases or objects, one can control the virtual world in a manner that is more natural and tactile than purely virtual objects and controls can provide. The project was broken down into 3 basic components (3D printing, object tracking, and virtual canvas generation) and was a collaboration between Ryan Lewis and Nolan Cooper. This abstract primarily addresses the 3D printing component, research on which was completed by Ryan Lewis.

To ascertain the optimal properties of 3D-printed trackable objects, several prototypes were designed, printed, and tested, covering size, shape, object complexity, color, and surface contrast. To test the impact of size on trackability, 3 sizes of 3 simple object types (cylinders, cubes, and spheres) were printed, with cross-axis dimensions of 20 mm, 50 mm, and 100 mm. We found that, while larger sizes appreciably increased the distance at which the objects could be detected by our AR application, they had no obvious impact on the quality of detection itself, as the increased size did not add any additional features for detection.

To test the effect of shape, 5 different 3D models were used: 3 primitives (a cylinder, a cube, and a sphere) and 2 complex shapes, a Klein bottle (a non-orientable 3D projection of a 4-dimensional object) and a custom-designed amalgamation of uniquely positioned polyhedral shapes, dubbed a “Reticle”. Unsurprisingly, the shape of the object had a large impact on its detectability, with the simple shapes performing much worse than the more complicated Klein bottle and Reticle. Since the tracking software relied on edge and feature detection, the sphere was virtually untrackable, while the Reticle performed superbly with its numerous distinct faces and edges.

In the context of 3D printing, the color of an object is almost always determined by the material in which it is printed. A few materials and colors, including white, blue, gray, yellow, and green, were tested using the Reticle model. While the color itself made only a slight difference, mostly insofar as it provided contrast against a given background, a related material property proved far more important: a slightly translucent, glossy yellow material rendered the target entirely invisible to our tracking software. Because the material caused the object's appearance to change with lighting and position, it became impossible to track accurately. This suggests that, for optimal results, extremely matte, opaque materials are preferred.

To test complexity, a Klein bottle printed with a regular smooth surface was compared to a Klein bottle printed with a Voronoi tessellation applied to its surface, creating numerous holes and edges. Applying the Voronoi tessellation led to a marked increase in detectable features, suggesting that such algorithmic complexity-increasing approaches could help improve the detectability of otherwise difficult-to-detect objects.
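The abstract does not describe how the Voronoi tessellation was generated (mesh-editing tools commonly offer such pattern operations). Purely as an illustrative sketch, and assuming SciPy and NumPy are available, the following Python snippet computes a 2D Voronoi diagram whose cell edges could serve as the struts of a perforated surface pattern; every name and parameter here is a hypothetical example, not part of the project's code.

```python
# Illustrative only: a 2D Voronoi pattern of the kind that, applied to a
# mesh surface, yields the additional holes and edges described above.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(seed=0)
# Hypothetical parameters: 60 random seed sites on a 100 mm x 100 mm patch.
points = rng.uniform(0.0, 100.0, size=(60, 2))

vor = Voronoi(points)

# Each finite ridge is a shared edge between two Voronoi cells; these edges
# would become the struts of the pattern, and the cell interiors the holes.
finite_edges = [ridge for ridge in vor.ridge_vertices if -1 not in ridge]
print(f"{len(finite_edges)} finite cell edges generated from {len(points)} seed points")
```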
Lastly, surface contrast was tested by printing a multi-material Reticle using two different colors. This also proved effective at increasing the object's detectable feature count, effectively adding edges wherever the colors meet on its surface. This suggests that another effective way to make difficult-to-detect models more trackable is to print them with a zebra-like, multi-color surface. Ultimately, the 3D printing component of the project determined that, of the properties tested, the optimal design involved a complex shape with a non-trivial surface, printed in multiple matte materials, at a size based on the distance at which tracking is required by a given application. While trackable objects printed without some of these characteristics could often work to some degree, incorporating at least one of them is generally required. For more information on the object tracking and virtual canvas generation components of the project, see Nolan Cooper's capstone abstract of the same name in the Honors Capstone Library. A copy of the source code, complete with a working AR demo project, the 3D meshes used in the project, and recorded video demonstrations, will be uploaded to the Huskie Commons library, in addition to the live GitHub repository publicly available at https://github.com/barrelmaker97/VirtualCanvas.
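As a rough, hypothetical illustration of the edge- and feature-detection behavior described in the abstract (the tracking software used in the project is not named here, and the snippet below is not part of the project's code), one could compare candidate prints by counting ORB keypoints in photographs of each object with OpenCV: a smooth sphere yields very few keypoints, while a faceted shape such as the Reticle yields many.

```python
# Illustrative sketch, assuming OpenCV (cv2) is installed; the file names are
# hypothetical photographs of the printed prototypes.
import cv2

def count_features(image_path: str, max_features: int = 2000) -> int:
    """Count ORB keypoints detected in a photo of a printed object."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise FileNotFoundError(image_path)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints = orb.detect(image, None)
    return len(keypoints)

if __name__ == "__main__":
    for path in ["sphere.jpg", "reticle.jpg"]:  # hypothetical images
        print(path, count_features(path))
```

More keypoints generally give a feature-based tracker more to lock onto, which matches the abstract's finding that the faceted Reticle and the Voronoi-patterned Klein bottle tracked far better than the smooth primitives.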

Comments

Special thanks to Northern Illinois University's Department of Computer Science and University Honors Program, Dr. Michael Papka, the Data, Devices, and Interactions Laboratory, and the Honors Council of the Illinois Region for their support of this project.

VirtualCanvas-master.zip (124986 kB)
Virtual Canvas Repository Master Branch (122.0 MB)

VirtualCanvas-android.zip (68124 kB)
Virtual Canvas Repository Android Branch (66.52 MB)

Capstone Submission & Abstract.pdf (137.7 kB)

Language

eng

Publisher

Northern Illinois University

Rights Statement

In Copyright

Rights Statement 2

NIU theses are protected by copyright. They may be viewed from Huskie Commons for any purpose, but reproduction or distribution in any format is prohibited without the written permission of the authors.

Media Type

Other
