Neutron Star


These gifs came from a conversation with my sibling Ansel, who works with the LIGO collaboration and also teaches undergraduate physics. They were dissatisfied with the diagrams that were generally available when teaching students about spinning neutron stars that generate gravitational waves. The existing diagram they had been using illustrated several pieces of important information:

  • A neutron star is spherical and spins around its axis

  • It can be “squished” along its rotational axis, but that distortion is symmetrical around the spin axis, so it doesn’t create gravitational waves.

  • If the star has a bump or some other distortion (sometimes called a mountain) which is NOT symmetrical around the spin axis, that causes gravitational waves to be generated as the star spins.

But the diagram also had some shortcomings:

  • It didn’t clearly portray the way that the waves move out from their source

  • It didn’t give a good sense of the “mountain” being an asymmetrical piece of the star

  • It was a still image; adding movement would have conveyed much more information to the viewer.

It also didn’t portray some potentially helpful information shown in other diagrams, like the star’s magnetic field and the beams of light emitted from its poles, which also don’t necessarily align with the spin axis.


So I took this as a challenge for my animation skills.

Modular Animations

My goals here were:

  • To use color to differentiate and highlight important elements of the diagram. There is a lot of information at play here, and I didn’t want it to become too visually cluttered to be clear.
  • To use animation to visually describe a moving system in a way that a still image couldn’t.
  • To create variations of the gif that showed different combinations of the same elements, so that Ansel (or any other instructors who wanted to use the images) could choose which one was most useful for what they were teaching.

Alternates

Once I had the initial layout, I also created some alternate versions of the gifs with different mountain placement, just to show another possible configuration.

Ice Age AR (In-Progress)

This is a passion project which is still in development. It’s a mobile AR app that visualizes the huge sheets of ice that covered the landscape during the ice ages. It’s intended for use at scenic overlooks amid the dramatic landscapes of the Pacific NW.

The app is built in Unity and uses Mapbox to load terrain data at the user’s location, which the user then orients to match the real-world landscape. A layer of ice is then placed at a specified height, and the digital terrain masks it out, creating an overlay on the real world.

The technical setup, where the pre-loaded scan is overlaid on the real world, was based on the system I put together for the Escape Room Ghost — it’s an AR experience tailored for a particular place. In this case, that place is a local scenic overlook.

This app is a work in progress, but here’s a demonstration of its current state.

More info on the Mapbox toolset can be found at mapbox.com.

Oscillarium Graphics

My role: Modeling, rigging, and texturing

Some background graphics for the game Oscillarium.

The goal was for the environment to be calm, meditative, and welcoming — a bit like watching an aquarium. The models needed to be low-poly, stylized, and painterly.

Made in Blender, for Unity.

Saturn Rocket VR Visualization

A prototype of an educational VR experience which I developed for 9iFX. Built for the Vive, the experience allows the user to observe and interact with a model of a Saturn rocket engine in various ways:

  • Mode one displays the rocket at full size, and the user can spin the model on the Y axis to see all sides of it.
  • Mode two provides the “exploded” view, where the user can see the individual pieces of the engine, and hover over each to see them highlighted and labeled.
  • Mode three allows the user to rotate the model on two axes, and provides an interactive “cutaway” plane, which can be moved through the model to show its inner workings.
  • Mode four guides the user through assembling part of the engine. Pieces can be highlighted and labeled, rotated, grabbed and moved using the controllers.

 

Since the models here were originally given to me as CAD files, they required some conversion and optimization to work in Unity. I reduced the polygon count significantly in Blender while maintaining visual fidelity.
Escape Room Ghost

My Role: Development, animation, and effects

The Escape Room Ghost project is a mobile AR app tied to a specific physical space. It’s an interactive experience that allows the user to see a ghostly figure appear and move around the room. The ghost interacts with objects in the room and provides hints to the escape room puzzles.

The app is set up before each use by matching a pre-existing 3D scan of the room to the room itself. The user looks around the space while the app detects planes, so that the AR elements stay steady and locked in place, then rotates and adjusts the scan mesh to match the room. The scan mesh then becomes invisible, but still occludes the ghost and other AR elements, allowing them to move behind or through objects in the room.
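
A common way to achieve this kind of occlusion in Unity is a depth-mask material on the invisible scan mesh: a shader that writes depth but no color. Here is a minimal sketch of that standard technique (the shader name is hypothetical, and this is not necessarily the exact shader used in the app):

```shaderlab
// Hypothetical depth-mask shader for the invisible scan mesh.
// It draws no color, but writes to the depth buffer so that the
// ghost and other AR elements are hidden when they pass behind it.
Shader "Custom/ScanMeshOccluder"
{
    SubShader
    {
        // Render slightly before ordinary geometry so the depth is
        // already in place when the AR elements are drawn.
        Tags { "Queue" = "Geometry-10" }
        Pass
        {
            ZWrite On    // write to the depth buffer
            ColorMask 0  // but output no color at all
        }
    }
}
```

Assigned to the scan mesh, this leaves the mesh itself invisible while anything drawn behind it is correctly occluded by the real-world objects it represents.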

The scan of the space and the ghost model mid-animation, as seen in Blender

The app then detects the presence of cards that the user finds and scans. Each of the cards triggers a unique animation in which the ghost appears in a puff of smoke, moves around the room to give the player a hint about the puzzle they are trying to solve, and then vanishes into smoke again.

The AR app did not end up being used, but some of the effects and animations can still be viewed in the escape room experience. The project’s technical setup did, however, inspire one of my other ongoing projects.

Created by Sprocketship for Mental Trap Escape Room Games.

The app in action.

Hololens Transmission App Animations

 

My role: Animation

As a student in Clackamas Community College’s first Hololens Development course, I was part of the team tasked with creating an app to help students in CCC’s automotive courses visualize the workings of the gearbox in an automatic transmission. The interactions of the gears and the power flow of the device are notoriously hard to explain using 2D diagrams, so the transmission was chosen in part to showcase the power of AR as an educational tool. The app was started by the class as a whole and completed by a small group of students, myself included, after the course was over.

 

The Transmission app in action.
The Thinking Machine

My Role: Developer

The Thinking Machine project is a VR experience by Shovels and Whiskey. It’s an educational tool that allows users to interact with information in the form of objects: selecting, moving, and categorizing customizable data “blocks”.

The experience is built in Unity using Windows Mixed Reality.

Information blocks are automatically generated and placed on a shelf, based on a text list of inputs.
Text and images can appear dynamically on the blocks as the user completes different sorting exercises.
The user moves and sorts the blocks into categories, and their responses are evaluated for correctness before moving on.
In this exercise, the user selects a hypothesis to defend, which they will then need to support with information from a variety of sources.

This project also has a testing version which was built to explore various interaction modes. This included different methods of transporting around the scene, and of interacting with objects via the controllers.

Different combinations were then created and tested to see which ones were most intuitive to new or inexperienced VR users.

Object interaction options included clicking the controller trigger once to pick up a block and once to drop it, and a click-and-hold variation.
In this alternative interaction mode, the user moves around the scene by selecting from a panel of buttons, rather than point-and-click teleportation.

Wisps (Unity FX)

My role: Effects creation, texturing

The Wisps are enemies in the fantastical VR fencing game Willowisp VR. They are real-time generative effects built in Unity, using a combination of particle systems and semi-transparent textures to create fire and magic effects.

Each Wisp is made up of simple particle systems that use partially transparent, additive textures to appear more complex. The limited number of particles keeps the effect efficient and the variety of textures makes each Wisp recognizable and visually engaging.

An example of particle system layers (left) and the texture they use:

Core layer:

  

Effect layer:

  

Edge layer:

Complete Wisp:

The edge layer’s opacity is limited to the outside rim, which gives the Wisp a clearly defined spherical edge from any angle. This matches the object’s collider in-game and clearly shows the player where their target is.
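
A rim confined to the silhouette like this can also be produced procedurally with a fresnel-style term, as an alternative to a baked edge texture. This is a generic sketch of that approach in a Unity shader, not the method the Wisps actually use (they use textured particles, as described above):

```shaderlab
// Generic fresnel-style rim shader: alpha rises toward the
// silhouette, so only the edge of the sphere is visible.
Shader "Custom/RimEdge"
{
    Properties
    {
        _RimColor ("Rim Color", Color) = (0.4, 0.8, 1, 1)
        _RimPower ("Rim Power", Float) = 3.0
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Blend SrcAlpha One  // additive, matching the other layers
        ZWrite Off
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _RimColor;
            float _RimPower;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD0;
                float3 viewDir : TEXCOORD1;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.normal = UnityObjectToWorldNormal(v.normal);
                o.viewDir = WorldSpaceViewDir(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // 1 - |N·V| is 0 face-on and 1 at the silhouette;
                // _RimPower sharpens the falloff toward the edge.
                float rim = 1.0 - saturate(dot(normalize(i.normal),
                                               normalize(i.viewDir)));
                fixed4 col = _RimColor;
                col.a *= pow(rim, _RimPower);
                return col;
            }
            ENDCG
        }
    }
}
```

The trade-off is flexibility versus cost: a baked texture lets an artist paint the exact edge falloff, while a fresnel term adapts automatically to any mesh and viewing angle.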

Wisp styles:


If you want to know more about my process for particle effects, also check out this talk I did for the Portland Indie Game Squad on introductory Unity particles.

Green Loop VR

My role: Development, optimization

Green Loop VR, created for Shovels + Whiskey in partnership with the City of Portland Bureau of Planning and Sustainability, is a virtual reality visualization of potential development sites along the city’s Green Loop walking path.

The experience invites the user to explore familiar spaces of Portland, both as they are now and as they could be as part of the Green Loop path. It’s an architectural diagram brought to life, where the user moves through points of interest at their own pace and can view the potential changes to the scene in each location.

Made to engage the public in the city’s decision-making process, Green Loop focuses on ease-of-use, clarity, and education through visual immersion.

You can find more info on this project here.