HMD Global on What Makes Every Single Nokia 9 Unique, and the Tech Behind the Five-Camera Setup

Highlights
  • The 5-sensor setup is capable of capturing up to 240 megapixels of data
  • HMD teamed up with Light for part of the solution
  • The relative positioning of the five lenses is unique in each unit

HMD Global unveiled five phones ahead of the Mobile World Congress (MWC) last week, but it was one in particular that grabbed all the headlines. That of course was the Nokia 9 PureView, the first mainstream smartphone to feature a five-camera setup on the rear. The smartphone packs two colour (RGB) and three monochrome 12-megapixel sensors, all with the same f/1.82 aperture and field-of-view.

It also marks a revival of the PureView brand, first seen on the Nokia 808 and later the Nokia Lumia 1020 (along with a couple of other Lumia phones), both of which packed 41-megapixel sensors. In that sense, the Nokia 9 PureView is a phone that many have been anticipating ever since HMD Global revived the Nokia brand a little over two years ago.

“The Nokia brand has that heritage”, says Pranav Shroff, Director, Global Portfolio Strategy & Planning, HMD Global. “Will the PureView come back, what will you do with the PureView? It was almost like an expectation when we came into the world of Android. We did this on Symbian. We did this on Windows. The question was what will you do on Android?”

Manufacturers have usually gone down two different routes with multi-lens camera setups — the first involves offering different types of lenses for use in different scenarios. The other approach involves combining data from different sensors to create an image that (ideally) has more information than what would've been possible with a single sensor.

“There are different ways of doing this. You can configure your imaging solution as a Swiss knife — you have a specific purpose, you pull out the lens that you need for it, and you configure it. So there's a main, or a telephoto, or there's an ultra-wide, etc.,” Shroff adds. “Or then [what] we wanted to pioneer was something like this.”

While the approach itself isn't new or pioneering in any way, throwing as many as five sensors at the problem certainly is an industry first, and one that presented its own unique challenges.

“Five similar sensors, with similar kinds of specifications, two RGB, three mono, but they capture an amount of light you've typically never seen in a smartphone,” Shroff continues. “And then all of them [sensors] will simultaneously click at least one picture.”

“So every time you capture, you have at least sixty megapixels of data, but in certain settings — if your shot is such that you have very bright sunlight and very dark points, you can go all the way up to 240 megapixels of data.”

That's because the Nokia 9 PureView is configured to shoot up to four different frames from each sensor, at different exposures, every time you click a picture, if the algorithm determines that's what's necessary to capture the moment. This involved building a lot of custom tech, and Gadgets 360 spoke to Juho Sarvikas, Chief Product Officer — HMD Global, to find out how everything comes together to deliver this enhanced imaging experience.
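The arithmetic behind those figures is simple enough to spell out; a quick illustration using the numbers above:

```python
# Quick illustration of the figures quoted above: five 12-megapixel sensors,
# each firing between one and four exposure-bracketed frames per shot.
SENSORS = 5
MP_PER_FRAME = 12
MAX_FRAMES_PER_SENSOR = 4

minimum_mp = SENSORS * MP_PER_FRAME                          # 5 x 12 = 60 MP
maximum_mp = SENSORS * MP_PER_FRAME * MAX_FRAMES_PER_SENSOR  # 5 x 12 x 4 = 240 MP
print(f"Per shot: {minimum_mp} MP minimum, up to {maximum_mp} MP in high-contrast scenes")
```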

“Five Sony sensors, 1.25 micron, like I said, three are monochrome, two are RGB,” says Sarvikas. “We did careful analysis in terms of, like, from an image quality point of view, how much of a benefit we get from different numbers of sensors, and during the development process, we switched the positions.”

Ultimately, HMD settled on the configuration wherein the colour sensors are in the middle (below the ZEISS branding), right below a monochrome sensor, with the two other monochrome sensors on either side. As for the number of sensors, Sarvikas says their testing showed “diminishing returns” beyond five sensors.


“In different builds, we kept moving it around, because it does impact image quality, and also from the algorithm point of view it has an impact,” Sarvikas adds.

Combining information from multiple sensors involves complex algorithms that depend, among other things, on knowing the relative positioning of the sensors in a 3D plane. A minuscule difference in positioning of one lens can ruin the results, so HMD had to make special arrangements to account for the inevitable vagaries of the manufacturing process.

What makes every single Nokia 9 PureView unique
“The interesting thing is that this is a single camera assembly — so it's one module that we had to develop or specify ourselves, completely,” Sarvikas explains. “One of the things that took a lot of time to calibrate is that there can be no module warp — everything has to be in the exact same plane, and then all of them have to be calibrated at the same time.”

“And [with] every single Nokia 9 PureView you always get small variations,” he continues. “It's impossible to reproduce in production the exact same [thing], so we also had to develop a system for testing the exact alignment of each module in the manufacturing process and then have adaptive software where we can give the data from the calibration tool — they all sit here, here, here.”

“Even like a small, very unnoticeable tilt for example, there will always be some degree of that. We test that, we document the values of each sensor, and we give it back to the software which will then calibrate itself based on that data.”

“This is done once, in manufacturing, but every one is unique, so you need to be able to dynamically take that into account,” Sarvikas adds. He says the decision to go with five identical cameras was taken because asymmetric cameras would've further increased the complexity of the solution.
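As a rough sketch of what adaptive software consuming per-unit calibration data could look like (the data structure and the pure-translation correction below are our own simplification, not HMD's actual implementation, which also handles tilt and warp):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorCalibration:
    # Hypothetical per-unit values a factory calibration tool might record
    # for one of the five modules; real data would also cover tilt and scale.
    sensor_id: int
    dx_px: int  # horizontal offset from the nominal position, in pixels
    dy_px: int  # vertical offset from the nominal position, in pixels

def align_frame(frame: np.ndarray, cal: SensorCalibration) -> np.ndarray:
    """Shift one sensor's frame onto the common pixel grid the fusion step expects."""
    return np.roll(frame, shift=(cal.dy_px, cal.dx_px), axis=(0, 1))

# Loaded once from the values measured in manufacturing, then reused per shot.
calibrations = [SensorCalibration(i, dx_px=i - 2, dy_px=i % 2) for i in range(5)]
frames = [np.random.rand(300, 400) for _ in range(5)]  # downscaled stand-in frames
aligned = [align_frame(f, c) for f, c in zip(frames, calibrations)]
```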

The partnership with Light
To develop the Nokia 9 PureView, HMD partnered with Light, the company behind the L16 camera with 16 individual 13-megapixel sensors that works on a similar principle. We asked Sarvikas what Light brings to the table.

“Their IP is in how to instruct the [multiple] cameras [at the same time] depending on the scene on what to do, and then secondly how to perform the image fusion,” he explains. “To do that they've also developed their own custom ASIC chip.”

“The hardware is from them. The chip is from Light, and we've licensed their software solution, which of course we had to completely readapt and modify because the L16 is quite different than our camera right here, but it's the same algorithm, same IP adapted for our solution,” he continues.

“Then Light also orchestrates the data and feeds it to Qualcomm in a way that Qualcomm can understand. And that goes to the ISP. But then the processing is done by Qualcomm, again with the Light algorithms for the fusion, [and the] ISP kind of controls or orchestrates the capture.”
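Pieced together from that description, the division of labour might be sketched like this (every function below is an invented stand-in for illustration; none of this is Light's or Qualcomm's actual API):

```python
# Invented stand-ins for the stages Sarvikas describes; purely conceptual.
def light_plan_capture(scene: dict) -> dict:
    """Light's IP: decide, per scene, what each of the five cameras should do."""
    return {"frames_per_sensor": 2 if scene["high_contrast"] else 1}

def fire_all_sensors(plan: dict) -> list:
    """All five sensors capture simultaneously, per the plan."""
    return [f"raw_frame_{i}" for i in range(5 * plan["frames_per_sensor"])]

def light_package_for_isp(raw_frames: list) -> dict:
    """Light's ASIC arranges the data so Qualcomm's ISP can ingest it."""
    return {"payload": raw_frames}

def qualcomm_process(packaged: dict) -> str:
    """Processing on the Snapdragon, running Light's fusion algorithms."""
    return f"fused image from {len(packaged['payload'])} frames"

plan = light_plan_capture({"high_contrast": True})
print(qualcomm_process(light_package_for_isp(fire_all_sensors(plan))))
```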

Bringing it all together
Sarvikas then elaborates on some of the work done to minimise power consumption.

“[A] typical smartphone will do the image processing primarily or entirely on the CPU, which is fine if you are doing a single image, or you do like one or two,” he explains. “But then if you go into multiple image fusion, it's no longer possible because the processing will take too long, and it's also very ineffective from [a] power consumption point of view.”

“So we use the DSP, the digital signal processor, instead. It has multiple pipelines. They are not as broad, if you will, as the CPU, but there are multiple. It's actually a very good design for noise reduction, because that's what the DSP is designed for. And then it also gets us ten-times lower power consumption.”

“And then the graphics processing unit does the depth mapping. So in parallel, it will compute the full-view, full-resolution, 12-megapixel depth map.”
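In a single process, that split — fusion on one compute unit, depth mapping on another, running concurrently — can be mimicked with two worker threads. Both stand-in functions below are our own simplifications of the stages described:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def fuse_frames(frames: np.ndarray) -> np.ndarray:
    # Stand-in for the fusion stage the phone runs on the DSP:
    # average co-registered frames to cut noise.
    return frames.mean(axis=0)

def compute_depth_map(frames: np.ndarray) -> np.ndarray:
    # Stand-in for the GPU stage: derive a per-pixel depth cue from the
    # disparity between two of the captures.
    return np.abs(frames[0] - frames[-1])

frames = np.random.rand(5, 300, 400)  # downscaled stand-ins for five captures

# The phone runs the two stages on separate silicon at the same time;
# two worker threads are the closest single-process analogy.
with ThreadPoolExecutor(max_workers=2) as pool:
    fused = pool.submit(fuse_frames, frames)
    depth = pool.submit(compute_depth_map, frames)
    fused_image, depth_map = fused.result(), depth.result()
```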

Capturing up to 240 megapixels of data
We asked Sarvikas to confirm whether the setup is indeed capturing 240 megapixels of data in certain scenarios, something Shroff had mentioned in passing earlier, as well as when and how that might happen.

“Yes, correct. It's always at least one [frame] of each [sensor], but then depending on the scene, light conditions and all of that, one thing we'll also do is that even in good lighting conditions, we typically take multiple images, but then we might disregard [some]: if there's objects in motion, for example, to avoid motion blur, we might fuse only five [one from each sensor],” Sarvikas explains. “But if that's not the case, it might become like ten pictures that are on top of one another, and the exposure values are different, which is how you get the dynamic range.”
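In pseudocode terms, the trade-off Sarvikas describes might reduce to something like this; the frame counts come from his answer, but the decision logic itself is our illustration:

```python
def select_frames_for_fusion(frames: list, motion_detected: bool) -> list:
    """Pick which captured frames get fused, per the trade-off described above."""
    if motion_detected:
        # Objects in motion: fuse only one frame per sensor to avoid motion blur.
        return frames[:5]
    # Static scene: stack extra exposure-bracketed frames ("ten pictures on top
    # of one another") so the differing exposures extend dynamic range.
    return frames[:10]

burst = [f"frame_{i}" for i in range(20)]  # up to 4 frames from each of 5 sensors
print(select_frames_for_fusion(burst, motion_detected=True))   # 5 frames
print(select_frames_for_fusion(burst, motion_detected=False))  # 10 frames
```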


So do RGB and monochrome sensors work together at all times, or are there certain scenarios where one or the other might fire?

“Everybody always fires together. We never do individual, so it's always all,” says Sarvikas. “You'll take one primary colour picture with the one in the middle, and then you populate the detail from all of the others. That helps us to reduce noise, because noise happens when there's no data, right?”

“The way that smartphones typically deal with noise is that they smoothen it out. They take pixels around it and guess what it was. But what that leads to is that you lose detail. We don't have to make as many guesses, because we actually have information for that one pixel from so many other pictures.”
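That difference is easy to demonstrate numerically. In the toy comparison below (entirely our own illustration), a 3x3 box blur stands in for "guessing from surrounding pixels", while a simple frame average stands in for multi-sensor stacking:

```python
import numpy as np

def spatial_denoise(frame: np.ndarray) -> np.ndarray:
    # The conventional approach: replace each pixel with the average of its
    # 3x3 neighbourhood, which suppresses noise but also blurs real detail.
    padded = np.pad(frame, 1, mode="edge")
    h, w = frame.shape
    return sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

def stack_denoise(frames: np.ndarray) -> np.ndarray:
    # The multi-frame approach: average real measurements of the same pixel
    # across co-registered captures, leaving spatial detail untouched.
    return frames.mean(axis=0)

rng = np.random.default_rng(0)
truth = rng.random((300, 400))                                  # a detail-rich "scene"
noisy = truth[None] + 0.1 * rng.standard_normal((5, 300, 400))  # five noisy captures

print("blur error:    ", np.abs(spatial_denoise(noisy[0]) - truth).mean())
print("stacking error:", np.abs(stack_denoise(noisy) - truth).mean())
```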

Limitations
One of the limitations of the technology in its current form is that there's a delay of a few seconds when capturing an image using all five sensors simultaneously. Sarvikas says this won't be going away anytime soon.

“This generation, [Snapdragon] 845, we've optimised it, but it's a trade-off between how much data you crunch i.e. the image quality, and how much you want to wait,” he explains. “We could of course enable a snapshot mode or something like that where we use one or two cameras, which I'm not sure if it's [useful], because we already have the uninterrupted experience across all the five.”

“[A] couple of months ago, we knew we would land somewhere between four and six seconds with the processing, just do the maths, we knew that's what it will take. We came down from 10+ [seconds], maybe 12-15 seconds, to where we are now, and of course we have to optimise power consumption as well.”

The way of the future?
Ajey Mehta, Vice President and Country Head, India – HMD Global, says that the Nokia 9 PureView signals the start of a new chapter in innovation as far as HMD is concerned, and he is hopeful these improvements will eventually trickle down to budget devices as well.

“[The Nokia 9 PureView] really drives the whole imaging story that Nokia has stood for, and Nokia will stand for going forward,” Mehta tells Gadgets 360. “More than the fact that it is a fantastic phone, it has a trickle-down [effect] on the rest of the portfolio: if there's [a] phone that can do such great things with imaging, then the rest of the portfolio can also follow suit in terms of the quality of the imaging experience.”
