Camera companies, like traditional phone manufacturers, dismissed the iPhone as a toy when it launched, in 2007. Nokia thought that the iPhone used inferior technology; the camera makers thought that it took lousy pictures. Neither thought that they had anything to worry about. Of course, neither anticipated the value of having a computer in your pocket, and what the camera folks, especially, didn’t anticipate was that, as the photographer Chase Jarvis puts it, the best camera is the one that’s with you.
The iPhone didn’t really start to cannibalize the camera business until the iPhone 4 came out, in 2010. That year, Instagram was born and a hundred and twenty-two million digital cameras were sold—a record, according to the Camera and Imaging Products Association, a Japanese camera makers’ trade organization. By 2015, however, that number had shrunk to about thirty-five million. Since then, the iPhone has bulked up its photographic capabilities and formed a symbiotic relationship with social networks such as Facebook, Twitter, and, especially, Instagram. The better the phone camera became, the more photos we snapped and shared. There are now nearly a billion smartphones worldwide capturing selfies, birthday smiles, breakfast sandwiches, Tuscan villages, and cats. In the past, such photos were taken with a point-and-shoot camera. Even today, interchangeable-lens and other high-end cameras have their fans, so demand for these monsters still exists. But for how long?
We don’t know the digital-camera industry’s own answer to that question, but as of Wednesday the time frame certainly shortened. That was the day Apple announced its new iPhone. While in most ways the device launch was predictable, the iPhone 7 Plus, with its souped-up camera, made a big impression on serious photographers. The iPhone 7 Plus, which retails for seven hundred and sixty-nine dollars (or higher), has two lenses—a 28-mm.-equivalent, 12-megapixel lens and a 56-mm.-equivalent, 12-megapixel telephoto lens. Apple has managed to pack in a lot of premium features—longer exposures, a wider aperture, and the ability to shoot digital negatives, which professionals call DNGs. A DNG is, essentially, a photo file that captures all the visual information possible for further manipulation, such as brightening shadows or toning down highlights. The new iPhone uses circuitry, software, and algorithms to create images that look and feel as if they came out of high-end cameras. Tellingly, Apple’s presentation of the camera’s abilities was the one aspect of the annual iPhone rollout that wasn’t mercilessly mocked on social media.
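To see why that extra information matters, here is a toy sketch (mine, not Apple’s pipeline) of the kind of shadow adjustment a raw file makes possible. It assumes the DNG has already been decoded into a linear floating-point image; the function name and parameters are hypothetical:

```python
import numpy as np

def lift_shadows(linear_image, amount=0.5):
    """Brighten dark regions of a linear, raw-like image.

    linear_image: float array with values in [0, 1], as decoded from a DNG
    amount:       0 = no change, 1 = strong shadow lift
    """
    # A gamma exponent below 1.0 raises dark values much more than
    # bright ones, which is why raw shadow detail is recoverable.
    gamma = 1.0 - 0.5 * amount
    return np.clip(linear_image, 0.0, 1.0) ** gamma
```

A JPEG has already thrown away much of that shadow detail during compression; a DNG keeps it, which is what makes edits like this viable.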
Thus far, expensive stand-alone cameras with great lenses have been the ones able to offer what is called “bokeh,” a way to blur the background and focus on the subject in the foreground. This is especially useful when shooting portraits. It has been difficult to achieve on smartphones because of hardware limitations. Apple designed a beefy new image-processing chip for the iPhone 7 Plus, which, according to Apple’s senior vice-president of marketing, Phil Schiller, can perform “one hundred billion operations in twenty-five milliseconds” (four trillion per second); he described it as “a supercomputer for photos.” It is sixty per cent faster than the image processor on the iPhone 6.
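Once the phone can estimate which pixels are near and which are far, the blur itself is straightforward software. Here is a minimal sketch of the general technique, not Apple’s proprietary implementation: blur the whole frame, then composite the in-focus subject back in using a depth map. It assumes NumPy and SciPy, and the function name is my own:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth_map, focus_depth, tolerance=0.1, blur_sigma=8.0):
    """Blur everything outside a band around focus_depth.

    image:      H x W x 3 float array (RGB values in [0, 1])
    depth_map:  H x W float array, normalized depth per pixel
    """
    # Blur each color channel of the entire frame.
    blurred = np.stack(
        [gaussian_filter(image[..., c], sigma=blur_sigma) for c in range(3)],
        axis=-1,
    )
    # Pixels near the chosen focal plane stay sharp.
    in_focus = np.abs(depth_map - focus_depth) < tolerance
    mask = in_focus[..., np.newaxis].astype(image.dtype)
    return mask * image + (1.0 - mask) * blurred
```

A real portrait mode would vary the blur with distance and clean up the mask around edges and hair, but the compositing idea is the same.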
Apple isn’t the first phone company to reach the market with dual-lens systems. LG and Huawei have already introduced them in their high-end phones. The San Francisco-based startup Light has proposed a device (still under development) that uses data captured by multiple lenses. But Apple’s iPhone 7 Plus is the first major phone to marry the dual-lens system to immense computing capabilities.
This is terrible news for companies making compact cameras—Olympus and Nikon’s compact-camera sales in the most recent quarter had already nosedived by twenty-five per cent and forty-five per cent, respectively. The new iPhone 7 Plus drives a stake through the heart of these mass-market devices. As everyday shutterbugs, we can expect higher-quality photos. We will be able to create much more interesting images. Jon Oringer, the founder of Shutterstock, wrote in a recent blog post, “Just like our two eyes can be used to detect depth, two lenses can do the same thing. By using the disparity of pixels between two lenses, the camera processor can figure out how far away parts of the image are.” This new dual-lens system, he said, will change photography forever.
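Oringer’s analogy maps onto a classic stereo-vision formula: depth is the focal length times the baseline between the lenses, divided by the disparity, i.e., how far a feature shifts between the two views. A sketch of that relation, with illustrative numbers rather than the iPhone’s actual geometry:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic stereo relation: depth = focal length * baseline / disparity.

    disparity_px:    how many pixels a feature shifts between the two views
    focal_length_px: the focal length, expressed in pixels
    baseline_m:      the physical distance between the two lenses, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A feature that shifts 20 pixels, seen through lenses 1 cm apart with a
# 2,800-pixel focal length, sits about 1.4 meters away. (Numbers are made up.)
print(depth_from_disparity(20, 2800, 0.01))  # about 1.4 meters
```

Repeat that for every matched pixel and you get a depth map, exactly the ingredient the bokeh sketch above takes for granted.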
Photography has always been about capturing light. In the early days, we captured it on film and used chemicals to turn the information recorded there into images. Digital cameras capture light on sensors. In a way, digital cameras were like very early personal computers such as the Commodore 64—clunky, and able to do only a few things. Over time, thanks to better sensors and faster processors (which allowed for better software), digital cameras got better, captured more information from light, and were able to make better images.
Smartphone cameras, too, started out making pictures that were notably inferior to what could be captured on even the most basic of stand-alone cameras, film or digital. Small apertures, small sensors, and low processing power inside the phones limited their capabilities. Since 2010, as the silicon chips inside the phones (especially the iPhone) have become more powerful, camera phones have been able to push past those limits. We are now entering an era in which more light data is captured, merged, and deciphered using algorithms and computation.
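A simple instance of that merging, offered as an illustration rather than any particular phone’s method, is burst averaging: shoot several frames in quick succession and average them, which suppresses random sensor noise by roughly the square root of the number of frames. A toy sketch, assuming the frames are already aligned:

```python
import numpy as np

def merge_burst(frames):
    """Average a stack of aligned burst frames to reduce sensor noise.

    frames: list of H x W x 3 float arrays captured in rapid succession.
    Averaging N frames cuts random noise by roughly a factor of sqrt(N).
    """
    return np.stack(frames, axis=0).mean(axis=0)
```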
Apple’s new iOS 10 (which powers the iPhone 7) is capable of automatic photo organization, image recognition, video creation, and other such functions—the same capabilities as Google Photos, with the exception that it does all the processing on the device itself rather than via the cloud-first approach preferred by Google.
We are splintering what was the “camera” and its functionality—lens, sensors, and processing—into distinct parts, but, instead of lenses and shutters, software and algorithms are becoming the driving force. And this is not just happening on smartphone cameras. You can expect software to define and enhance what lenses, sensors, and processing units in other settings can do. Dash cams, security cams, adventure cams—these are just early examples of cameras built for specific applications, devices that could become much more powerful in the future. In the coming era of augmented and virtual reality, these new cameras will also create content to be consumed within V.R. headsets like Oculus and Magic Leap.
The distinct business advantage that Apple has achieved thanks to its hardware is the sheer volume of iPhone sales, which justifies big spending on the specialized chips that make that hardware so powerful. The new image processor is a perfect example: Apple can spread the cost of that investment over hundreds of millions of iPhones. In comparison, the falling sales of stand-alone cameras have hampered the ability of camera companies to innovate and spend on core technologies. Given that hardware and software are equally important today, Apple’s advances in both areas make it difficult for anyone to beat the company in photography for the masses. You can see why the camera companies are doomed.