2022
AR Choose-Your-Own-Adventure Book Application
This project showcases how a choose-your-own-adventure AR application can be tied to a physical book, bringing the story to life on screen as the reader works through the book and makes decisions.
The project creates a storytelling experience that combines reading a physical book with bringing it to life in AR. We chose a choose-your-own-adventure style book because, while many other books are being translated into AR for study and recreation, the choose-your-own-adventure genre has yet to be adapted. The goal is to hold younger readers' attention (ages 6 to 14, TBD) by animating the choices they make.
For the prototype, we wrote an original story loosely based on the classic poem "The Owl and the Pussycat." The story branches the poem at each stanza to give the reader choices, with the goal of carrying the poem to its conclusion. The book has four pages: on the first page, the reader chooses a path by dragging a physical item piece to the collection point; on the second and third pages, the reader moves the story along by taking the physical flag piece to the navigation point. All items and characters appear on the phone as AR elements on the plane of the book, so the reader can see the story come to life.
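To make the branching concrete, here is a minimal sketch of how such a page-by-page choice structure could be represented in a Unity C# script. The StoryGraph, StoryPage, and StoryChoice names are illustrative assumptions, not the project's actual types:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical data structure for a branching story. Each page lists the
// choices a reader can make; dragging a physical piece to a collection or
// navigation point would resolve one of these choices and advance the story.
[System.Serializable]
public class StoryChoice
{
    public string pieceId;      // which physical piece triggers this branch
    public int nextPageIndex;   // page the story advances to
}

[System.Serializable]
public class StoryPage
{
    public string stanzaText;          // the stanza shown/animated for this page
    public List<StoryChoice> choices;  // empty on the final page
}

public class StoryGraph : MonoBehaviour
{
    public List<StoryPage> pages;
    private int current = 0;

    // Called when a tracked piece reaches a collection/navigation point.
    public void ResolveChoice(string pieceId)
    {
        foreach (var choice in pages[current].choices)
        {
            if (choice.pieceId == pieceId)
            {
                current = choice.nextPageIndex;
                Debug.Log($"Advancing to page {current}: {pages[current].stanzaText}");
                return;
            }
        }
    }
}
```

Keeping the pages and choices in serializable classes like this would let the story be authored directly in the Unity Inspector rather than hard-coded.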
Contributions:
Project Manager:
For this project, I organized the team's work in a Trello board and kept everyone on task so we could carry the project through to completion. I worked with the team to set a timeline, checked in each week on progress, and adjusted as needed, following Agile practices.
Lead Developer:
I wrote the code that lets objects interact with the physical world and the AR systems, using image tracking from Unity's AR Foundation framework. I created the Game Manager script that controls the application, as well as all of the item interaction scripts (with initial setup help from Daisy). Once Daisy finished the concept and artwork, I set up all the UI elements in the project. The UI is connected to the project through the physical images being tracked: each tracked image spawns a virtual object that collides with the scene's virtual objects as the physical image is moved into position, which calls up the appropriate UI associated with that object. The code linking the objects can be found here in the Page Collision System script and here in the UI Collision script.
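To illustrate the flow described above, here is a hedged sketch of how tracked images can spawn interactable markers, and how a trigger collision can then call up UI, using AR Foundation's ARTrackedImageManager. The class names, prefab list, UI panel, and the "StoryPiece" tag are assumptions for the example, not the project's actual Page Collision System or UI Collision code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: spawn a marker object on each tracked reference image.
// "markerPrefabs" maps reference-image names to prefabs (an assumption; the
// real project keeps this kind of logic in its Game Manager script).
public class ImageMarkerSpawner : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager trackedImageManager;
    [SerializeField] private List<GameObject> markerPrefabs; // prefab name == reference image name

    private readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnChanged;

    private void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            var prefab = markerPrefabs.Find(p => p.name == image.referenceImage.name);
            if (prefab != null)
            {
                // Parenting to the tracked image makes the marker follow the
                // physical card as the reader slides it across the book.
                spawned[image.referenceImage.name] = Instantiate(prefab, image.transform);
            }
        }
    }
}

// Illustrative trigger script on a scene target (e.g., a collection point):
// when the marker spawned on the physical piece overlaps it, show the UI.
public class CollectionPoint : MonoBehaviour
{
    [SerializeField] private GameObject choicePanel; // UI to call up (assumed name)

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("StoryPiece")) // tag is an assumption for this sketch
            choicePanel.SetActive(true);
    }
}
```

In an actual Unity project each MonoBehaviour would live in its own file, and the trigger logic assumes the colliders are marked as triggers with a kinematic Rigidbody on the moving marker.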