The iPhone app Luma AI makes it easy to capture NeRFs and edit them with camera paths, creating a video that would otherwise require a drone.

Luma AI is available in the iPhone App Store and makes it surprisingly easy to capture a NeRF. I simply circle an object three times at different heights, angling my iPhone toward the center.

The app takes care of the rest, uploading to Luma Labs' servers, where their proprietary AI builds a neural radiance field for me. An ultra-smooth orbit around the NeRF can be downloaded as a video, and a 3D object with texture maps can be generated within a few minutes. You can also share the NeRF as an interactive web view so your friends can experience the location or the item you've captured.
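
For the curious, the core idea behind a NeRF can be sketched in a few lines of code. The snippet below is a toy illustration of the general technique, not Luma Labs' proprietary pipeline: a learned function maps a 3D position and viewing direction to a color and density, and each pixel is produced by compositing samples along a camera ray.

```python
import numpy as np

# Toy sketch of the NeRF idea (not Luma Labs' actual pipeline).
# A trained network maps (position, view direction) -> (color, density);
# a renderer integrates samples along each camera ray to get a pixel.

def nerf(position, direction):
    """Stand-in for the trained network; returns (rgb, density)."""
    # A real NeRF is a neural network fitted to the captured photos.
    # Dummy values here just show the interface.
    return np.array([0.5, 0.5, 0.5]), 1.0

def render_ray(origin, direction, near=0.1, far=5.0, num_samples=64):
    """Volume-render one camera ray by compositing front to back."""
    delta = (far - near) / num_samples
    color = np.zeros(3)
    transmittance = 1.0  # fraction of light not yet absorbed
    for t in np.linspace(near, far, num_samples):
        rgb, density = nerf(origin + t * direction, direction)
        alpha = 1.0 - np.exp(-density * delta)  # opacity of this sample
        color += transmittance * alpha * rgb
        transmittance *= 1.0 - alpha
    return color
```

Because the scene lives in a learned function rather than in fixed video frames, it can be re-rendered from any viewpoint afterward, which is what makes everything described below possible.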

As impressive as that is, it's only the beginning.


Moving through the scene with AR

Luma AI captures the object you circle but simultaneously records the background. When I first open the app, I see a gallery of NeRFs I've previously captured. Tapping any thumbnail opens the Cinematic Render view, which is usually a slow spin around the object.

Tapping the AR button at the top right places the NeRF within my physical space. I see either the extracted central object resting on my floor or the entire NeRF scene filling my room.

Tapping the background button at the bottom right switches between these views. It's as if my iPhone were a window into the location where I recorded the NeRF. Check out Luma Labs' Twitter video introducing the AR feature.

In my examples, I see the trees, moss, leaves, and rocks as I look through my phone. I can also walk around inside the AR scene, raising my iPhone or reaching down to view the scene from a lower angle. It's as if I were back on the forest trail.

The quickest way to make a fly-through video is to press the record button at the bottom of the screen and start moving through the AR scene. I don't have to worry about bumping into trees and crashing my drone. However, I must stay aware of my surroundings because my physical space doesn't match the NeRF environment.

A fun effect is to make the scene or object larger or smaller than it really was. If I pinch to make the NeRF much bigger than it really is, I can travel through the NeRF as if I'm the size of a mouse, jumping off mossy patches and ducking under raised roots.

Next, I shrink the forest trail NeRF to half size, which makes my relative size about 12 feet tall. My movements appear twice as fast, making it easier to simulate a speedy drone.
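
The arithmetic behind these scale tricks is simple. A quick sketch (assuming my height is six feet; the numbers are illustrative) shows how both effects in the last two paragraphs fall out of a single scale factor:

```python
# Why scaling changes apparent size and speed (back-of-envelope sketch).
# If the NeRF is scaled by a factor s, a real-world step covers 1/s as
# much of the scene, so I appear 1/s times larger and move 1/s times
# faster relative to it.

my_height_ft = 6.0   # assumed physical height
walking_speed = 1.0  # arbitrary units

for scale in (2.0, 1.0, 0.5):
    apparent_height = my_height_ft / scale
    apparent_speed = walking_speed / scale
    print(f"scene scale {scale}: I appear {apparent_height:.0f} ft tall, "
          f"moving {apparent_speed:.1f}x my real speed")

# scale 2.0 -> 3 ft tall (mouse-like) at half speed
# scale 0.5 -> 12 ft tall at double speed, as described above
```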

I drag the NeRF to the middle of my room, using a bottle of peanuts to mark the physical location of a central tree for reference. I hit record and start "flying" my iPhone around the tree, trying to simulate the movements of a drone. I run through the process several times before I have a performance I like.

When I finish recording, I tap the shutter button again. The video can be saved immediately, but there won't be any stabilization. Instead, I hit the Reshoot button, then the magic wand at the bottom left. Now I can choose how much stabilization to apply. The video will process for several minutes, but I can make more videos while I wait.
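
Luma Labs doesn't document how its stabilization works, but conceptually, stabilizing a handheld fly-through can be as simple as smoothing the recorded camera trajectory and re-rendering the NeRF along the smoothed path. Unlike crop-based stabilization of ordinary video, no pixels have to be thrown away, because the scene can be re-rendered from the corrected viewpoints. Here's a minimal sketch of that idea, with the window size standing in for the adjustable stabilization amount:

```python
import numpy as np

# Conceptual sketch of path stabilization (not Luma's implementation):
# smooth the recorded camera positions with a moving average, then
# re-render the NeRF along the smoothed path. A wider window means
# stronger stabilization.

def stabilize(positions, window=9):
    """Moving-average smoothing of an (N, 3) array of camera positions."""
    kernel = np.ones(window) / window
    padded = np.pad(positions, ((window // 2, window // 2), (0, 0)), mode="edge")
    return np.stack(
        [np.convolve(padded[:, axis], kernel, mode="valid") for axis in range(3)],
        axis=1,
    )

# A jittery random walk stands in for a handheld recording.
shaky_path = np.cumsum(np.random.normal(0, 0.05, size=(120, 3)), axis=0)
smooth_path = stabilize(shaky_path, window=15)
```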

Creating videos that look like they were made with a drone is so fast that I made several. The perfectionist in me is still unsatisfied, so I move on to a more controlled way to record a camera path.

Keyframes allow fine-tuning

To gain total control of the "flight" path, I can open a NeRF from the gallery and tap Reshoot, then Custom. This puts me in a camera path editor similar to the kind found in a 3D animation app.

The most important concept to understand is keyframes. I can move my viewpoint to a new position and tap the Add Keyframe button to record that position. I pinch and drag to move to another location and camera angle, then tap the plus button at the right to add another keyframe.

Luma AI places keyframes two seconds apart, and pressing the play button shows how the virtual camera glides smoothly between those two points.
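
Luma doesn't disclose how the editor glides through keyframes, but a common approach, and a reasonable mental model here, is spline interpolation: a Catmull-Rom spline, for instance, passes through every keyframe while keeping the motion smooth. The sketch below interpolates hypothetical keyframe positions at 30 frames per second; camera orientation would be interpolated similarly.

```python
import numpy as np

# Illustrative sketch of gliding through keyframes (not Luma's actual
# interpolation). A Catmull-Rom spline passes through every keyframe
# while keeping the motion smooth, so playback never jerks at them.

def catmull_rom(p0, p1, p2, p3, t):
    """Interpolated point between p1 and p2 at parameter t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return 0.5 * (
        2 * p1
        + (-p0 + p2) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
        + (-p0 + 3 * p1 - 3 * p2 + p3) * t3
    )

# Hypothetical keyframe positions, two seconds apart as in the app.
keyframes = np.array([[0, 0, 0], [1, 0.5, 0], [2, 0.5, 1], [3, 0, 1]], dtype=float)
padded = np.vstack([keyframes[0], keyframes, keyframes[-1]])  # clamp the ends

path = [
    catmull_rom(padded[i], padded[i + 1], padded[i + 2], padded[i + 3], t)
    for i in range(len(keyframes) - 1)
    for t in np.linspace(0, 1, 60, endpoint=False)  # 60 frames = 2 s at 30 fps
]
```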

I continue to add keyframes to plot out a drone-like path. It's laborious compared to walking and moving my iPhone in AR mode, but keyframing allows greater precision.

Dragging the purple, diamond-shaped keyframes below the camera view allows me to fine-tune the timing. The stopwatch icon lets me adjust the total length to speed up or slow down the entire video.

I can tap the Overview tab at the top of the screen to see the camera path as a purple line with blue camera icons at each keyframe location. On the iPhone, this is only useful as a visualization. If I open the web app, I can drag the camera icons in overview mode to change their positions and angles, smoothing out the action.

There are many more options, but this is the basic process. When I'm happy with my flight path, I use the options at the bottom of the screen to set the resolution and aspect ratio, toggle the NeRF background on or off, and adjust other settings.

The final step is to hit Render and wait. Luma AI will upload these details to Luma Labs' servers to create the video. Your iPhone isn't needed for this step. You can close the app, record another NeRF or work on other animations.

When the render is complete, I can download the video to use as a part of a longer video or to share as it is. Professional videographers might combine NeRF video with actual video to get the versatility of AI-generated images and the superior quality of reality.

The video below was made by Jake Oleson using Luma AI and was highlighted by Luma Labs in a recent blog post. It edits together several NeRFs, giving an example of what's possible with sufficient time and creativity.

The limitations of NeRFs

Since Luma AI uses NeRFs, I often see "floaters," the equivalent of noise or artifacts in the AI model. If I move slowly and steadily enough while capturing the NeRF, this problem is minimized, and the details are sharper. Doing this over uneven terrain is challenging, but it's worth the effort.

When viewing a Luma AI NeRF or when recording a camera path for a video, there's a limit to how far and where you can move without these distortions hurting the realism. That will improve as this technology evolves, and Luma Labs has already significantly enhanced the quality.

As I move further from the center of the NeRF, the scene distorts and blurs due to a lack of detail. The AI lacks the spatial information to faithfully reconstruct objects at the far edges of the capture area. The scene gets weird if you venture too far.

That can be fun to explore. In some videos, I take advantage of this to create a warp effect that transports the viewer to what looks like another dimension. When editing, it's possible to get lost in these distorted "pocket worlds."

There's an easy solution when that happens. If I tap a keyframe, I can return to a coherent section and continue editing.

Luma AI makes NeRFs accessible

Neural radiance fields have been around for several years but were difficult to work with. The technology has finally become accessible enough for anyone to use. Nvidia's InstantNGP was a step in the right direction, allowing quick viewing on a computer.

Luma Labs makes NeRFs almost as easy as recording a video. I can capture a moment in three dimensions, then reshoot it as a video at any time later using a virtual camera.

The possibilities are endless. As this technology advances, NeRFs will be a valuable tool for the metaverse, simplifying the creation of 3D objects and places. You can already view NeRFs in VR if you have a compatible computer and headset.

Summary
  • Luma AI is an iPhone app that lets you capture and edit NeRFs.
  • These 3D models of objects and scenes can be viewed from any angle.
  • With Luma AI, you can move through the NeRF in AR to record a fly-through video with your iPhone.
  • A camera path editor lets you create smooth, cinematic videos of your NeRFs.