My name is Cameron Eidenier. I am a software developer who lives in Grand Rapids, Michigan. During my high school years, I discovered my passion for computers and programming. This led me to attend, and graduate from, Ferris State University with a Bachelor’s degree in Digital Animation and Game Design.
I was fortunate to have an internship through YETi CGI, where I would later work. At YETi, a series of projects taught me how to take a product from conception to release, and gave me the opportunity to work with new systems and technologies. Several projects I was involved with used augmented reality as it coalesced from its early stages of development into a more standardized set of tools. My experience with AR has taught me how to adapt to the new and changing landscape of software development.
In my professional life, I have worked both independently and on teams of up to seven developers. In addition, I have coordinated with other departments, including art, developer operations, and quality assurance. Throughout my career, I have worked directly with clients to communicate and manage expectations. I am now looking to leverage the skills I have acquired in the pursuit of new opportunities. If you have any questions, or would like to reach out, feel free to contact me.
After testing and experimenting with the initial Unity app, we decided to recreate the experience online using AUX/casual simulation, offering a similar experience without the need to download an application. Museum guests could run the experience by simply loading a web page on their phone. While the experience is similar to the previous Unity app, the AR room tracking was removed due to unsatisfactory performance; as a result, the AR image scans were replaced with QR codes, while the core experience loop tested in the Unity app was retained.
• Non-screen-based AR: Experimenting with AR without a pass-through camera, primarily through the use of location-based scanned images.
• Online collaboration: To reinforce the collaborative nature of sturgeon conservation, the online experience supports multiple simultaneous users.
• Database manipulation: To support the collaborative nature of the experience, I created and maintained a database.
Worked with GRPM to create an application that increases guests’ engagement with the museum’s sturgeon conservation efforts through an AR experience in which the guest interacts with a virtual sturgeon. The guest picks up a sturgeon from an AR sturgeon tank and, while traveling through the museum, can scan other AR markers to collect food and feed the sturgeon, allowing it to grow. At the end, the guest has the option to travel down to the actual Grand River and release their sturgeon, which is then visualized alongside sturgeon released by other museum guests.
• Screen-based AR: Using the ARCore and ARKit Unity plug-ins, I created a series of AR stations triggered by image detection, which allowed guests to navigate the experience.
• User data and analytics: To aid GRPM with their grant requirements, I used Google’s Firebase Analytics to track guests’ interactions and usage numbers.
• Online database management: To maintain and manage the guests’ virtual sturgeon and their accompanying data, I created and maintained a database through Google’s Firebase database.
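The growth loop behind those records can be sketched in a few lines. This is an illustrative model only, not the project’s actual code: the real data lived in Firebase, and the class name, fields, and growth curve below are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SturgeonRecord:
    """Illustrative stand-in for a guest's virtual-sturgeon record."""
    guest_id: str
    food_eaten: int = 0
    released: bool = False

    def feed(self) -> None:
        # Each scanned food marker adds one unit of food.
        self.food_eaten += 1

    @property
    def size(self) -> float:
        # Arbitrary growth curve: the sturgeon grows with each feeding.
        return 1.0 + 0.25 * self.food_eaten

# A guest scans three food markers, then releases their sturgeon.
record = SturgeonRecord(guest_id="guest-42")
for _ in range(3):
    record.feed()
record.released = True
print(record.size)  # 1.75
```

In the deployed experience, an equivalent record would be written back to the shared database so every guest’s released sturgeon could be visualized together in the river scene.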
I assisted another developer on a building-scale AR project for a pop-up store located in Japan. The store was populated with numerous AR markers that guests could scan and interact with; once a guest found all of the different experiences, they were eligible to receive a discount on the product.
• Multi-target AR tracking: Due to the size of the store, we had to create and manage a large collection of tracking images while troubleshooting tracking limits and misidentified markers of similar design. We gained a better understanding of what computer vision looks for when differentiating tracking markers.
Assisted a developer on a polling/data-visualization system used at the Google Cloud Next event. The system used a series of Bluetooth Low Energy beacons to determine a user’s position in the conference hall. Once the guest was seated, the application presented them with a series of questions correlated with the presentation being given, and then displayed a visualization onstage and on the user’s device.
• Bluetooth beacon positioning: I was in charge of developing the system for detecting and placing guests at their tables through the use of Bluetooth Low Energy beacons.
• Interactive data visualization: Leveraging Unity’s physics system, I created an interactive data visualization in which a series of circles orbited the center of each guest’s table, with the most popular answer growing larger and bouncing the other answers out. This was displayed on the users’ phones and on the presenter’s projector as part of their presentation.
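The beacon-placement step above boils down to one decision: of all the beacons a phone can hear, which table is the guest sitting at? A common heuristic, and a plausible reading of this system, is to pick the beacon with the strongest received signal (RSSI is reported in negative dBm, so the value closest to zero is nearest). The function and sample readings below are hypothetical, not the project’s code.

```python
def nearest_table(readings: dict) -> str:
    """Return the beacon/table ID with the strongest (highest) RSSI.

    readings maps a beacon ID to its RSSI in dBm, e.g. -59.
    Assumes at least one reading is present.
    """
    return max(readings, key=readings.get)

# Example scan result: table-2's beacon is loudest, so seat the guest there.
readings = {"table-1": -82, "table-2": -59, "table-3": -74}
print(nearest_table(readings))  # table-2
```

In practice such readings are noisy, so a production system would typically average RSSI over a short window before deciding.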
We were contracted by Google to help develop a system for getting Google’s own ARCore and Apple’s ARKit to work together through a single Unity package, helping developers streamline AR application development in Unity. My responsibilities included working with the senior development team to translate their work into example projects that were distributed to other development teams.
Box Box was a proof-of-concept system developed during the early popularity of Pokémon Go. YETi CGI was hired by an investor to see if we could develop a similar location-based AR system using a cellphone’s GPS. My responsibilities included prototyping gameplay loops, and tracking and visualizing the user’s GPS data to determine their location and any nearby objects of interest.
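The core of that location check is computing the distance from the player’s GPS fix to each point of interest and keeping the ones within some trigger radius. A standard way to do this, sketched below with made-up coordinates and a made-up 50 m radius (not Box Box’s actual values), is the haversine great-circle distance:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) pairs in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def nearby(player, points, radius_m=50.0):
    """Return names of objects of interest within radius_m of the player."""
    return [name for name, (lat, lon) in points.items()
            if haversine_m(player[0], player[1], lat, lon) <= radius_m]

# One point ~11 m away, one ~730 m away: only the close one triggers.
points = {"crate": (42.9634, -85.6681), "far-crate": (42.9700, -85.6681)}
print(nearby((42.9635, -85.6681), points))  # ['crate']
```

At these distances a flat-earth approximation would also work; haversine is simply the conventional, always-safe choice for GPS coordinates.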
YETi CGI had the opportunity to work with an up-and-coming company producing head-mounted AR displays. My responsibility was to create visualizations and demos, both to verify that the headsets were functioning properly and to present at IMMY investor meetings.
• Desktop & Mobile VR/AR experimentation.
• Experience in a team development environment.
“Dinosaurs” was a View-Master product from Mattel in which the user viewed VR content featuring prehistoric animals. The project leveraged Google Cardboard for the VR content and Vuforia for the limited AR content. My responsibilities included playtesting, running quality-assurance tests, and cataloguing and fixing any bugs found.
• Experience in photo processing.
• Customer service and problem solving regarding technical issues.
• Graduated May 7, 2016
• Cum Laude (3.56 GPA)
• Dean’s List: Spring 2014, Fall 2013, Spring 2013
• Award for best Capstone project 2014
• GPA: 3.59
• Dean’s List, 2015; President’s Club, Fall 2013, Summer 2015
• Honor Roll, 4 years