How To EASILY Film in RED RAW With The Nikon ZR

I’m about to show you how to easily film in RED RAW with the Nikon ZR. To save you some time, check out my RAW filmmaking cheat sheet that shows all of the best settings in one easy place, so you can reference it whenever you need.

And if you want help getting amazing-looking colors from your RED RAW footage, I also have a set of color presets that work fantastically with RED RAW. I’ll link those as well so you can check them out.

Why Settings Still Matter When Shooting RAW

When it comes to filming in RED RAW, you might think the settings you choose on the Nikon ZR aren’t that important. After all, you’re shooting RAW, right? You can just fix everything in post.

Unfortunately, that’s not true.

RAW video is very forgiving, and RED RAW in particular is one of the easiest RAW formats to work with. It gives you incredible control over exposure and color. But that doesn’t mean you can just switch the camera to RAW, hit record, and magically get gorgeous footage.

There are still several key settings you need to dial in to make your footage look its best, and I’m going to walk you through how to do that quickly and easily.

Enabling RED RAW on the Nikon ZR

Let’s start with the basics.

Grab your Nikon ZR, open the menu, go to the video camera icon, and set the video file type to R3D NE 12-bit in Log3G10. This enables RED RAW recording in log, which gives you a very flat and desaturated image.

That flat image is exactly what we want for maximum flexibility in post.

Making RED RAW Easier to Work With Using View Assist

Next, we want to make filming in RED RAW as easy as possible. The Nikon ZR gives you some excellent tools to help with this.

The first thing you should enable is View Assist. This removes the super flat log preview and applies a 3D LUT so you see a more saturated and contrasty image on the camera screen.

Important reminder: this LUT is only for monitoring. Your footage is still recorded in flat log and will need to be color graded later.

Turning on View Assist also affects how some of the exposure tools behave, which is why I strongly recommend enabling it. That said, if you don’t want to use it, that’s okay. I’ll explain how exposure works without it as well.

To quickly enable View Assist, press the Assist Display button in the bottom right of the screen, then tap the Assist button in the bottom middle. Instantly, your footage becomes much easier to judge.

Choosing the Correct Base ISO

Now that the image looks better, you might think you’re ready to hit record. But before you do, we need to talk about exposure tools.

First, some ground rules.

When filming in R3D, I recommend always starting by setting your ISO to one of the camera’s two base ISOs: ISO 800 or ISO 6400.

The Nikon ZR makes this easy. Press the ISO button in the bottom left of the screen, and you can toggle between the low and high base ISOs.

If you’re filming in bright conditions, like outdoors in daylight, use ISO 800. If you’re filming in darker environments, like an evening wedding reception, use ISO 6400.

Using these base ISOs gives you the cleanest image with the least amount of noise.
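The rule of thumb above can be summed up in a tiny sketch. The function name and the `bright_scene` flag are my own illustrative shorthand, not anything from Nikon; the two ISO values come straight from this guide.

```python
# Illustrative sketch of the base-ISO rule from this guide.
# `bright_scene` is a hypothetical flag meaning "daylight-level light".

def base_iso(bright_scene: bool) -> int:
    """Pick the Nikon ZR base ISO: 800 for bright scenes, 6400 for dark ones."""
    return 800 if bright_scene else 6400
```

So an outdoor ceremony at noon calls for `base_iso(True)`, which returns 800, while an evening reception calls for `base_iso(False)`, which returns 6400.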

You might be thinking, “Matt, why does ISO matter if I’m filming RAW and can change it later?”

Great question. The answer has everything to do with zebras.

Setting Up Zebras Correctly

Zebras are striped lines that appear over areas of your image once they reach a set brightness level. When highlight zebras appear, it means those parts of your image are too bright and unrecoverable.

You want zebras enabled almost all the time when filming.

To set them up properly, open the menu, go to the pencil icon, Video, G16, and select Zebra Pattern.

Set the highlight threshold to 245, which is where highlights start to clip when using View Assist.

For midtone range, set the value to 130 with a range of plus or minus 15. Shoutout to Calen Rhome from Gamut for testing the camera and figuring this out.

Once set, press OK and exit the menu.

To toggle zebras on and off, tap the Assist Display button again and then tap the Zebra Pattern button. The first tap enables midtone zebras, and the second tap enables highlight zebras.

Why Base ISO Matters for Zebras

Here’s why sticking to base ISO is so important.

If you change the ISO away from the base values, the zebra levels also change. That means the camera can no longer accurately tell you when highlights are clipping unless you manually adjust zebra values.

Nikon even provides a PDF chart showing the correct zebra values for every ISO. You can download it from Nikon or from a mirror I’ve uploaded.

With View Assist enabled and ISO set to 800 or 6400, your zebras at 245 will be accurate. If you don’t use View Assist, you’ll want to lower zebras to around 180.

This is why I recommend base ISO whenever possible. It makes exposure faster and far less confusing.

Using Zebras to Expose Correctly

Now let’s tie everything together.

Set your Nikon ZR to RED RAW, choose one of the base ISOs, and enable highlight zebras. Adjust exposure using your aperture or an ND filter until highlight zebras just begin to appear, then back off slightly until they disappear.

This ensures you retain highlight detail while keeping shadows clean and noise-free. This is how you capture the maximum dynamic range the camera can offer.

Next, switch to midtone zebras and check skin tones. You want to see zebras appear on the side of the face where light is hitting. If you don’t, your image is either underexposed or overexposed.

This is a balancing act. You want properly exposed skin tones without blowing out bright backgrounds, so you’ll often toggle between highlight and midtone zebras to find the sweet spot.

Using the Waveform Monitor for Extra Accuracy

If you want even more confidence in your exposure, Nikon gives you another excellent tool: the waveform monitor.

To enable it, go to the menu, pencil icon, Video, G19, and set Brightness Information Display to WFM. Press right on the joystick to adjust size and position.

I recommend setting the waveform size to large and transparency to 3. The small version is just too tiny to be useful.

Back out of the menu, tap the Assist Display button, and enable the brightness info icon to turn on the waveform.

The waveform shows brightness values across the entire image. Dark areas sit lower, bright areas higher, and the image is mapped horizontally across the display.

The Firmware 1.10 Waveform Upgrade

Make sure your Nikon ZR is updated to firmware 1.10 or newer. This update added a crucial feature to the waveform monitor: a red clipping line.

If any part of the waveform goes above this red line, that area is clipped and unrecoverable. Even better, this clipping line automatically adjusts based on ISO, so it’s always accurate.

When filming RED RAW, keep in mind that log footage sits lower on the waveform than normal video. Properly exposed skin tones should sit about ⅓ to ½ of the way up the waveform, above the shadow line.

If you hit that range, your footage will look fantastic.

The Complete RED RAW Workflow

Here’s the full workflow from start to finish.

Set ISO to 800 or 6400. Enable highlight zebras. Adjust exposure until zebras just appear, then back off slightly. Switch to midtone zebras and confirm they appear on skin tones. Use the waveform to double-check that skin tones sit about ⅓ to ½ above the shadow line. Finally, return to highlight zebras to make sure clipping is minimal.
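The decision-making in that workflow can be sketched as a simple helper. Everything here is my own illustrative framing, not a real camera API: the inputs stand in for what you see on screen, and the ⅓ to ½ skin-tone target is expressed as a fraction of the distance from the shadow line to the clipping line.

```python
# Minimal sketch of the zebra + waveform exposure workflow described above.
# `skin_tone_fraction`: where skin tones sit on the waveform, where 0.0 is
# the shadow line and 1.0 is the clipping line. All names are hypothetical.

def expose_shot(highlight_zebras_visible: bool, skin_tone_fraction: float) -> str:
    """Suggest the next exposure adjustment based on the on-screen tools."""
    if highlight_zebras_visible:
        # Highlights come first: clipped highlights are unrecoverable.
        return "close aperture or add ND until zebras just disappear"
    if skin_tone_fraction < 1 / 3:
        return "open aperture slightly: skin tones are underexposed"
    if skin_tone_fraction > 1 / 2:
        return "stop down slightly: skin tones are too bright"
    return "exposure is in the sweet spot; roll camera"
```

This mirrors the balancing act from earlier: highlights are protected first, then skin tones are nudged into the target band.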

This is the sweet spot of the Nikon ZR and where you’ll get the highest image quality and flexibility in color grading.

Final Thoughts

And that’s how you easily film RED RAW with the Nikon ZR!

Huge shoutout to Brandon Talbot for his helpful ZR videos that helped me confirm these settings.

If you want more videos about filmmaking, make sure to subscribe. Thanks so much for reading, and have a great day.

iPhone 17 Pro Camera Test: Will It Overheat?!

When Apple releases a new iPhone, the big question is always the same. Is it actually better, or is it just a small refresh? With the iPhone 17 Pro and the iPhone 16 Pro, the differences are not dramatic on paper, but real-world use tells a more interesting story.

This comparison focuses on performance, heat management, battery behavior, and how these phones hold up when you actually push them.

Raw Performance Differences

Both phones are fast. There is no getting around that. Everyday tasks like messaging, browsing, and social media feel identical on the iPhone 17 Pro and the iPhone 16 Pro.

The difference shows up when you start stressing the phone. Things like recording long video clips, exporting footage, or running demanding apps back to back reveal a small but noticeable edge for the iPhone 17 Pro.

Apps load slightly quicker, and the phone feels more responsive under sustained use.

Heat and Thermal Management

Heat has been a concern with recent iPhones, especially for video shooters and creators. This is one area where the iPhone 17 Pro shows improvement.

During extended recording sessions, the 17 Pro stays cooler for longer. It still gets warm, but it takes more time to reach uncomfortable temperatures. The iPhone 16 Pro heats up faster when pushed hard, especially during 4K video recording or long camera sessions.

This matters if you use your phone professionally or rely on it for long shoots.

Battery Behavior Under Load

Battery life between the two phones is similar during light use. Texting, calls, and casual browsing do not show much difference.

Under heavy use, the iPhone 17 Pro pulls ahead. When recording video, navigating, and multitasking throughout the day, the 17 Pro drains more slowly and maintains performance better as the battery drops.

It is not a massive improvement, but it is consistent.

Camera Performance in Real Use

Image quality between the two phones is very close. Photos look sharp, colors are accurate, and Apple’s processing is still strong on both devices.

The difference comes in consistency. The iPhone 17 Pro handles challenging lighting a bit better, especially during longer video clips. Stabilization feels more reliable, and exposure changes are smoother.

For quick clips, social content, or professional backup footage, the 17 Pro is more dependable.

Everyday Experience

If you are using your phone casually, you may not notice much difference between these two models. The iPhone 16 Pro is still an excellent device and feels fast in almost every scenario.

If you regularly push your phone with video work, long recordings, or demanding apps, the iPhone 17 Pro feels more stable and better optimized.

This is a refinement upgrade, not a revolution.

Which One Should You Choose?

If you already own an iPhone 16 Pro, upgrading to the iPhone 17 Pro is not essential. The improvements are real, but they are incremental.

If you are upgrading from an older phone, or if you frequently deal with heat, battery drain, or performance slowdowns, the iPhone 17 Pro is the better long-term choice.

Final Verdict

Apple focused on polish with the iPhone 17 Pro. Better thermal management, slightly improved performance under load, and more consistent camera behavior make it the best version of this design so far.

The iPhone 16 Pro is still a great phone. The iPhone 17 Pro is simply more reliable when it matters most.

iPhone 16 Pro Camera Review After 12 Months

What I Loved, What I Hated, and What I Want From the iPhone 17

After using the iPhone 16 Pro for a full year and filming everything from family moments to sponsored videos, I want to share my honest thoughts on its camera. What did I love? What drove me a little nuts? And what changes am I hoping Apple makes with the iPhone 17?

Alright, let’s get into it.

What I Loved About the iPhone 16 Pro Camera

The biggest win for me is Apple Log.

Yes, Apple Log technically debuted with the iPhone 15 Pro, but it absolutely deserves to be talked about again. Using Apple Log has been a genuinely huge upgrade for me as a filmmaker, and not for the reason you might expect.

It is not just about dynamic range or having more flexibility in color grading. The real magic of Apple Log is that it backs off the aggressive sharpening Apple usually applies to iPhone footage.

That sharpening looks fine on a phone screen, but the second you bring the footage onto a computer, it screams “shot on a phone.” Apple Log fixes that.

The image quality from the iPhone 16 Pro suddenly looks much closer to what you would expect from a mirrorless camera with a larger sensor. It mixes surprisingly well with footage from other cameras without instantly giving itself away. That alone has made me far more comfortable using the iPhone as a serious filmmaking tool.

The Biggest Ongoing Issue: Lens Reflections

That said, there is still room for improvement.

One of the biggest long-standing issues with iPhone cameras is lens reflections, especially when filming at night. If you are shooting lights in a dark environment, you will often see tiny reflections and ghosting artifacts in the footage.

Once you notice them, you cannot unsee them, and they are a dead giveaway that the footage came from a phone.

I have watched reviews of Android phones that seem to handle this much better, and I would really love to see Apple reduce these reflections on the iPhone 17 Pro.

Image Quality Across the Lenses

Overall image quality from the main camera is very good. The 10-bit 4K footage looks fantastic and usually holds up well, even in lower light situations.

Unfortunately, the same cannot be said for all of the lenses.

The ultra-wide camera is decent, and I do use it fairly often when I want a wider shot. However, once the light drops, it starts to struggle.

The telephoto lens is worse. In low light, it gets noisy very quickly, and zooming in only makes it more obvious. Because of this, I avoid using the telephoto lens unless I am filming in bright daylight. It is a bit of a bummer and really limits when that lens feels usable.

The Front Camera Is Still Behind

The front-facing camera has been fine for a while now, especially since Apple upgraded it to 4K. But it still does not come close to matching the quality of the rear cameras.

This is why we are now seeing MagSafe monitors that let you frame yourself while using the back camera. It is a clever workaround, but it also highlights the problem.

I film myself a lot. While the front camera works for Instagram and TikTok, I do not love using it for YouTube. If Apple improved the selfie camera, I would use it far more often.

In a perfect world, Apple would use the same sensor across all cameras and simply pair it with different lenses. They have moved everything toward 48 megapixels, but I am not convinced all of the sensors are truly equal yet. If they were, overall image quality would be much more consistent.

The Camera Control Button: Big Miss

Now we need to talk about the biggest negative by far.

The camera control button.

This was hyped as a massive upgrade to how we use the iPhone camera. After a few weeks, I completely disabled it.

The reason is simple. I kept accidentally pressing it when picking up my phone. The placement is awkward, right around the middle of the device. Grab the phone too low and the camera opens. Because the button is recessed, you often do not even realize you are pressing it until it has already happened.

I have heard plenty of people complain about this, and I agree with them. It feels more like a nuisance than a helpful tool.

Apple talked a lot about future software updates expanding what this button could do, but that never really materialized. Much like Apple Intelligence, it sounded better on paper than it worked in reality. Disabling it made my experience with the phone noticeably better.

The idea is fine. The execution needs a serious rethink, especially the placement.

Software Features I Still Want

There are also several software improvements I would love to see.

First, more control over Apple Log. I would love the option to use Apple Log with the H.265 codec instead of being locked to ProRes. Third party apps like the Blackmagic Camera app already do this.

Second, let us use LUTs in the native camera app. At the very least, give us LUT previews when filming in Apple Log. Even better would be the option to bake them in. And yes, I'll link my Apple Log LUTs as well.

Third, expand Apple Log to more modes. Let us shoot time-lapses in Apple Log. And why can we not use Cinematic mode with Apple Log? If Apple is already adding artificial bokeh, give us log too. I would use Cinematic mode far more often.

Looking Ahead to the iPhone 17 Pro

At the time of writing, the iPhone 17 Pro has not been announced. I would not be surprised if Apple pushes resolution to 6K or even 8K.

As someone who has been filming more in 6K, I can appreciate that. But if it is locked to ProRes, the file sizes are going to be massive. They are already huge in 4K.

Personally, I would rather see Apple focus on improving lens quality and reducing reflections before chasing higher resolutions.

Final Thoughts

Overall, I have been happy with the iPhone 16 Pro, mostly because of the image quality made possible by Apple Log.

That said, many of those gains already arrived with the iPhone 15 Pro. USB-C, external SSD recording, and Apple Log were massive upgrades. The iPhone 16 Pro’s main addition, camera control, feels pretty underwhelming by comparison.

If I could go back, I would have happily stuck with the iPhone 15 Pro without losing much in terms of video quality. But then I would not be making this post, so here we are.

Here is hoping the iPhone 17 Pro brings more meaningful improvements for filmmakers.

Remember, you can download my color presets that work great with Apple Log to get vibrant, true to life colors with just one click.

Thanks so much for reading, and have a great day.

Destination Wedding Videography Gear Guide

Destination wedding videography can be incredibly stressful: you have to pack up all your gear and fly with it across the country or around the world. Here's everything I brought for a destination wedding I recently filmed in California!