Multifaceted developer with a strong history of execution and innovation in a wide variety of software projects and interaction media. Can help you make your next great iOS app, or dive deep into experimental technology to make something nobody has ever seen before. Skilled with iOS app development, virtual / augmented reality, and computer vision. Looking for new ways to bring innovation to human-centric computing.

Skills

iOS Development

  • Swift
  • Objective-C
  • Metal
  • ARKit
  • SceneKit
  • SwiftUI
  • watchOS

AR / VR

  • OpenCV
  • ARKit
  • Unity3D
  • Homebrew
  • Rapid Prototyping
  • C++
  • C#

Languages

  • Swift
  • Objective-C
  • Python
  • C#
  • C++
  • Java
  • Metal
  • GLSL

Work Experience

Senior iOS Developer, iOS Developer
Rightpoint (formerly Raizlabs)
Jan 2018 - Current

Responsible for architecture and implementation on large, public-facing projects that:

  • 🚏 Help visually impaired people find bus stops with Bluetooth ranging
  • 🏥 Raise money for cancer research
  • 💡 Power the smart lighting in your home
  • 🏬 Help store owners manage their shelves with computer vision
  • 🤖 Pilot large robots
  • 🗺️ Visualize the way your autonomous vacuum sees your home

Additional highlights:

  • Go-to developer for difficult experimental and trailblazing projects requiring an investigative approach
  • Pinch hitter and frequent consultant for augmented reality projects
  • Worked extensively with Bluetooth, BLE, and Internet of Things devices

Experience Designer (Contract)
Maine Discovery Museum
2016 - 2017

Worked in collaboration with the museum to create interactive exhibits for children.

  • Sea What Grows: Created an iOS-based kiosk as part of a wider aquaculture exhibit to teach children about the ocean

  • The X-Ray Hand: Developed interactive exhibit that uses Leap Motion to show an x-ray visualization of a visitor’s hand

VR / AR Engineer
Virtual Environment and Multimodal Interaction Laboratory
Sep 2011 - Jan 2018

Worked in close collaboration with researchers to develop and prototype several VR simulations using Unity3D; responsible for code and design in solo and team projects.

  • Provided leadership in the lab by driving self-directed, wide-ranging projects into new territory, from developing a custom-built wearable AR platform to using VR to prototype AR techniques for enhancing human spatial perception

  • Mentored more than 30 students in programming and design

  • Integrated VR technologies with Unity3D before official support, often involving hardware hacking or creating native plugins

  • Developed VR experiences for multiple human interface platforms, including the HTC Vive, Oculus Rift, Leap Motion, Microsoft Kinect, and optical marker tracking systems by PhaseSpace and WorldViz

  • Created a full-stack implementation of the W3C Web Annotation Model for Dartmouth’s Semantic Annotation Tool as part of the Media Ecology Project

  • Created native and Unity3D-based iOS apps for AR and data visualization contexts

Projects

Liquid Math
https://apps.apple.com/us/app/liquid-math/id1331320224
Conceived, designed, and implemented an interactive, Metal-powered reaction-diffusion simulator for iOS and macOS.
  • iOS
  • macOS
  • Swift
  • Objective-C
  • Metal
  • GPGPU
  • Creative Coding

Kino
https://github.com/colejd/Kino
Master's thesis. Created a software and hardware platform to empower researchers to rapidly design computer vision experiments for wearable AR headsets without needing to worry about threading, camera synchronization, or OS details. An example plugin was developed for real-time object recognition via machine learning.
  • OpenCV
  • Augmented Reality
  • C++
  • Homebrew
  • Research

Semantic Annotation Tool
https://mediaecology.dartmouth.edu/sat/
Created the frontend and co-created the backend implementations of the W3C Web Annotation Model for Dartmouth’s Media Ecology Project. Waldorf (the frontend) comprises an embeddable video player which allows users to create and edit rich annotations for web videos, while Statler (the backend) is a RESTful Rails backend that manages these annotations. The purpose of this system was to provide researchers with an interface for automatic tagging of videos via computer vision techniques.
  • Rails
  • Ruby
  • JavaScript
  • REST
  • Frontend
  • Backend

Education

Master of Science - Spatial Information Science and Engineering
University of Maine
Dec 2017
  • R
  • Prolog
  • Spatial Analysis
  • Spatial Database Systems
  • Human Computer Interaction
  • Information Systems Law
  • Real-Time Sensor Data Streams
Bachelor of Science - Computer Science
University of Maine
May 2015
  • Object-Oriented Programming
  • Algorithms
  • Data Structures
  • Operating Systems
  • Discrete Math
  • Agile/Scrum