Monday, 30 September 2019

The Morning After: A 'Microsoft Flight Simulator' preview - Engadget

Hey, good morning! You look fabulous.

Welcome back! Apple's latest iOS update is here, and the difference from .0 to .1 is bigger than you might expect. Also, Flight Simulator is making a comeback, and on Saturday, Elon Musk set the timetable for Starship test flights.


Let's try that again.
iOS 13.1 review

According to Chris Velazco, "A dark mode, helpfully redesigned apps and Voice Control all offer more flexibility out of the box, and now that iOS 13.1 has filed down a lot of the initial release's rough edges, it's finally worth installing."


This is the series' first new entry since 2014.
Microsoft 'Flight Simulator' hands-on

Microsoft is now accepting applications for a pre-alpha Flight Simulator Insider program due to kick off later this year. The full game will launch on PC in 2020 and on Xbox after that. To find out what the ultra-realistic sim has to offer this time around, read Jessica Conditt's impressions.

"Flight Simulator offers a new perspective on the world, period. Developers are committed to holding a mirror to reality, researching and recreating accurate atmospheres, cockpits, wind patterns, flight maneuvers, weather and locations. Even the stars in the night skies are accurate."


Mark your calendar (in pencil).
Elon Musk hopes SpaceX's Starship will reach orbit in six months

As part of a Q&A session at SpaceX's Starship presentation on Saturday night, Musk outlined plans for rapid prototyping that could get the vessel into space in a short time frame. Starship Mk1 at Boca Chica, Texas, should have a suborbital test flight in one to two months. If all goes well, either Mk3 or an eventual Mk5 would fly an orbital test within six months. Besides his typically optimistic production timelines, Musk's presentation also included a look at the plan for in-orbit refueling, which would help the Starship take longer trips to Mars and beyond.


It's an extension of the game that encourages players to create in the real world.
'Minecraft Earth' launches in early access this October

Mojang and Microsoft have revealed that the augmented reality game will be available in "early access" for some countries, starting in October. They didn't say which countries or platforms would be included, but the beta has included both Android and iOS users.


There's an extra gig of RAM, but not much else new.
Apple's seventh-gen iPad gets bigger but keeps the same size battery

Now in its seventh generation, the "most popular" iPad that Apple sells has grown from 9.7 to 10.2 inches, ships with iPadOS and has a connector to support the company's still-pricey keyboard add-on. The folks at iFixit attacked this new model with their assortment of tools and found that, despite the new size, the internals are still very similar to the previous model's.

But wait, there's more...


The Morning After is a new daily newsletter from Engadget designed to help you fight off FOMO. Who knows what you'll miss if you don't subscribe.

Craving even more? Like us on Facebook or follow us on Twitter.

Have a suggestion on how we can improve The Morning After? Send us a note.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.


https://www.engadget.com/2019/09/30/the-morning-after/


Xiaomi’s Mi 9T Pro is one of this year’s best phone bargains - Circuit Breaker

With the barely believable Mi Mix Alpha and its wraparound display dominating attention last week, you could have been forgiven for not noticing the other phone Xiaomi announced at its event, the Mi 9 Pro 5G. Xiaomi has released a lot of phones that look and feel very similar this year, and frankly it’s getting hard to keep up.

Another Xiaomi phone that I’ve been using in recent weeks, the Mi 9T Pro, very much falls into that category. Nothing about its spec sheet particularly stands out even within Xiaomi’s lineup, let alone the Chinese smartphone landscape at large. That I’m only getting around to writing about it this week is down to quirks of Xiaomi’s convoluted naming schemes and global release schedules. But I still think it’s one of the most notable phones of the year.

To put the Mi 9T Pro into context, allow me to briefly explain the Mi 9 line. First off, the flagship Mi 9 and mid-range Mi 9 SE were announced in February. As is typical for Xiaomi, the Mi 9 paired high-end specs (Snapdragon 855, etc) with an affordable price (£499 / €449). I reviewed the Mi 9 when it came to Europe in April; it was good. Then in June, a cheaper €329 variant called the Mi 9T was also released in mainland Europe.

The Mi 9T, however, was a completely different phone. It had a slower Snapdragon 730 processor, a notchless display, a pop-up selfie camera, and an all-new design. All-new, that is, if you hadn’t been paying attention to Indian phone launches the previous month, when the Redmi K20 and K20 Pro were announced. The Mi 9T is just a rebranded Redmi K20.

Now here we are in the fall with the £399 Mi 9T Pro, which is a rebranded Redmi K20 Pro. Why Xiaomi didn’t put this phone out earlier in the year is beyond me, but Europeans shouldn’t sleep on it. It’s even more flagship-like than the Mi 9, at an even lower price.

The Mi 9T Pro’s spec sheet looks pretty standard if you’re used to following Chinese phones, but drop this thing into an AT&T store and it’d be as high-end as anything else. You’ve got a Snapdragon 855 processor, notchless 6.4-inch OLED screen, in-display fingerprint sensor, pop-up selfie camera, 4,000mAh battery, and triple-rear cameras including a 48-megapixel primary unit alongside telephoto and ultrawide lenses. It’s basically a OnePlus 7 Pro without the high refresh rate screen, except it starts at £399 instead of £699. (Conversions being what they are, US residents should think of this as a $399 phone.)

What makes the Mi 9T Pro different to Xiaomi’s other high-spec-low-cost endeavors — take last year’s sub-$300 Pocophone F1, which crammed the fastest parts available into the cheapest body possible — is that it doesn’t compromise on design. There are lots of neat flourishes here, from the tiny circle in the pop-up camera module that lights up when the phone is charging to the way the holographic rear panel shimmers in response to its surroundings.

Now, the Mi 9 remains a sleeker device in a few ways. It has a few flagship-esque touches, like wireless charging and a better haptic engine. It’s also noticeably thinner, although it does have a smaller battery and omits the headphone jack. The design is certainly more understated, if that’s your thing. Overall, though, the 9T Pro feels like a more balanced, capable phone for most people — not to mention a cheaper one.

The phone’s price, however, actually sparked some controversy in India when it was released as the K20 Pro. Many Indian tech followers, who are often both very knowledgeable about phone specs and keen to get the best value possible, were disappointed in the K20 Pro’s 27,999-rupee ($395) price point after expecting something more in line with the Pocophone F1. (The $7,000 gold version didn’t help.)

Xiaomi found itself having to justify the price in an open letter to Indian fans, pointing out that its features don’t come cheap. But given the Pocophone precedent, those who prioritize price-performance over everything else won’t have been satisfied.

In Europe, though, this is arguably the best-value flagship-class phone around — and I do call this a flagship-class phone. It’d probably be a big deal in the less competitive US market, too, if Xiaomi had ever figured out how to sell phones there. If you’re looking to buy a phone at around this price point, I can’t think of a better option.

Xiaomi releases phones at such an intense pace and in such seemingly random locales that it’s easy to ignore individual models and just think of the lineup as a collective hive mind devoted to being pretty good for whatever the price is. Sometimes, though, the company transcends the sweet spot and delivers something of really incredible value. The Mi 9T Pro, or K20 Pro, is one of those times.



https://www.theverge.com/circuitbreaker/2019/9/30/20890945/xiaomi-mi-9-pro-hands-on-review-price


The new MS Flight Simulator taught me how to fly an actual plane - Ars Technica

RENTON, Wash.—This month's Microsoft Flight Simulator world-premiere reveal event, held at a hangar just outside Seattle, was designed for two types of people. The first is the plane enthusiast, the kind of person who purchases pricey equipment in order to recreate the experience of piloting aircraft. Members of the new game's lead development team, Asobo Studio, were on hand to speak about reviving the decades-old MSFS brand and the inherent scrutiny those fans will direct at any rebirth.

The second type is me, a person who has logged very little time in one of those pricey, realistic flight-sim cockpits, let alone flying a real plane. I didn't even grow up playing MSFS, Jane's, or other classic flight-sim series. Nobody in my family held aviation in esteem. For all the notes I took at the event about rotational weather systems, drag coefficients, and friction models, I got the feeling Microsoft and Asobo wanted to bowl me over with something a bit more specific and literal with their new Microsoft Flight Simulator, slated to launch on Windows PCs in "2020."

My MSFS kiosk was set up with a pre-loaded virtual flight opportunity: to take off in a Cessna 172 from the Renton Municipal Airport, then simulate flight around the cities, forests, and valleys of the Seattle area. Hours later, I would do the exact same thing... in real life, in a real Cessna 172, as the pilot.

Jeez, I thought to myself. This new version of the game better be realistic.

Update: I didn't die

To clarify: I did not manage my real flight's take-off or landing, in spite of practicing both in Microsoft Flight Simulator. And an instructor sat at my side the entire time I was in the real plane, ready to take over should I lose control or become uncomfortable. But, yes, I piloted a Cessna 172 for about a half-hour, managing its altitude and bearing in a trip that took me from Renton to Snoqualmie Falls and from Microsoft's Bellevue headquarters to the north end of Seattle itself.

This was Microsoft's gutsy effort to impress upon visiting journalists how good the company thought MSFS was in its current, pre-alpha state. I say "gutsy" because it's crazy to put a real-life flight up against a computer version in terms of visuals. Asobo has delivered a phenomenal rendering engine that juggles a mix of satellite data, machine-learning calculations, and procedurally generated buildings and terrain. It looks amazing on a computer, but I'm not crazy: the real thing looks better.

But the feeling of flight? Well, gosh. I'm barely an hour into the 100+ hours of flight time needed before I might qualify as licensed, so this is as anecdotal as it gets. But my time testing MSFS did a remarkable job of preparing me for the exact touch and execution needed to fly comfortably and reliably in the skies above Renton.

All of the nerves I had about real-life flight evaporated the moment I heard the command from my real-life Cessna's co-pilot: "Your plane." This was my cue to reply out loud, "My plane," and take firm grasp of the yoke. At which point, the sense of force feedback and required movement was seemingly identical to what I'd tested 1,600 feet below when I had been testing MSFS augmented by a Thrustmaster Pendular Rudder and a Saitek Pro Flight Yoke. (I got a hint of this 1:1 connection to the simulator before I was officially piloting the Cessna. I lightly had my hands and feet on my flight equipment in order to feel my instructor take off in nearly the exact same way I'd successfully done on a computer.)

I was also astonished by how much my real flight felt like the MSFS version I'd played an hour earlier when I encountered the mild turbulence of flying through wind and clouds.

In the game, this required flipping through a menu of weather presets, which had been set to a sunny-and-clear option before our arrival—and, c'mon, this is the Seattle area in September, so good weather is never a safe bet. A "live weather" option, on the other hand, delivered a more realistic volley of thin, transparent clouds and associated wind patterns. The game version didn't look exactly as gray-yet-clear as my real flight, but my need to adjust my bearing 5-10 degrees to account for regular wind did. I certainly exclaimed in mild panic when this happened in real life, but I was glad to have been prepared for it.
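For the curious, the bearing change I was making follows from the standard wind-triangle relation taught in ground school: the crosswind component of the wind, divided by true airspeed, sets the crab angle. The Python sketch below is just that textbook formula with made-up example numbers for a Cessna 172; it is not the sim's internal flight model.

```python
import math

def wind_correction_angle(true_airspeed_kt, wind_speed_kt, course_deg, wind_from_deg):
    """Crab angle (degrees) needed to hold course_deg, given a wind blowing
    FROM wind_from_deg at wind_speed_kt. Standard wind-triangle relation."""
    # Angle between the desired course and the direction the wind comes from
    wind_angle = math.radians(wind_from_deg - course_deg)
    # sin(WCA) = (wind speed / true airspeed) * sin(wind angle)
    return math.degrees(math.asin((wind_speed_kt / true_airspeed_kt) * math.sin(wind_angle)))

# Hypothetical example: 110-knot cruise, holding a northerly (360-degree) course
# in a 15-knot wind from 300 degrees
print(round(wind_correction_angle(110, 15, 360, 300), 1))  # -6.8, i.e. crab roughly 7 degrees into the wind
```

A handful of degrees of crab for a light crosswind is exactly the kind of 5-10 degree nudge described above.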

Fuel to the Flight Simulator fire

Earlier this summer, Microsoft teased its new version of Flight Simulator during its E3 keynote. While tantalizing, the 60-second video flew by without answering some key questions. How would this new version of MS Flight Simulator work? Who was producing it? When was it coming? Would it run on Xbox consoles (a first for the series)?

A lengthy presentation from the game's lead developers at Asobo Studio answered all of these questions and then some. That first question, of course, was: wait, Asobo? Who's that?

Peeking at the French developer's roster of recent games might make a flight-sim diehard scratch their head, since it includes zero flying games. The company has been a supporting developer for various Xbox Game Studios properties for years, whether by leading on Kinect and HoloLens games or supporting the development of "core" Xbox games like Quantum Break. But it's Fuel, an open-world driving adventure from 2009, that kept on coming up in conversations at the MSFS event.

"I can tell you that the flight-simulation problem—the streaming of an entire Earth at high levels of detail plus altitude change—requires a proprietary engine," Microsoft Flight Simulator head Jorg Neumann told Ars. "Asobo had made Fuel in 2009, then known to be the biggest game world [on a console]. That'd never been done before. Why? Because they procedurally generated it. They took the best places from satellite photos, condensed them, and made a game world out of it. The trees, grass, terrain were procedural. They'd already solved some of the problems hitting you when you're rendering at this scale. When you're in control of your own engine, you can go to double precision floating points without a problem. If you're on a third-party engine, good luck with that! That's a fundamental architectural change."

The whole world

This custom engine work, which emerged well before the likes of No Man's Sky tackled procedurally generated worlds, convinced Neumann to sign Asobo onto Microsoft's select slate of second-party developers... but not to make an open-world game. At the time, Neumann was leading efforts on Xbox's Kinect team, a tricky platform that required its own efficient engine work, and Asobo's engine stayed in the back of his mind while the crew worked on the first wave of major HoloLens games and apps and while he additionally worked on that platform's room-scanning technologies.

Neumann's appreciation for the team's HoloLens work inspired him to drop a franchise reboot possibility in Asobo's lap. In 2016, he asked the studio to prototype a realistic, playable flight sequence over the Seattle area, using a mix of Bing satellite map data and procedurally generated 3D details. His question to Asobo: could the studio combine existing map data with a Fuel-like game engine to make a virtual flight over Seattle seem realistic? Could that workflow then be applied to... the entire planet?

The prototype was shown to Xbox head Phil Spencer later that year, to which he murmured, "Why are we looking at this video?" Then Neumann and the team picked up a controller to show they could manipulate the action in real time. "He looked at me, I looked at him, and he said, 'Are we really going back?'" Neumann recalled. "'If we're doing this, we're in it for the long run. You're in it for the long run.'" And Neumann told Ars Technica that the plan is indeed a long-term vision: Asobo and Xbox Game Studios are pledging 10 years of support for this version of Microsoft Flight Simulator. That begins with an insane scope.

With previous MSFS releases, "People didn't like that it was just flying between two states, 1-2 planes," Neumann said. "They didn't like that it wasn't the Earth. Can you compromise on that? Nope. Now we know there's 44,000 airports across the globe. For this game, that's the baseline."

Listing image by Sam Machkovech

Fly through the cloud, connect to the cloud

Notice that mention of Bing? That's not the only Microsoft service being leveraged to run the new MSFS. The game will also rely on a whopping two petabytes of satellite and geographical data in Microsoft's Azure Cloud system, which each MSFS instance can connect to in order to render any portion of the planet as realistically as possible.

Asobo was quick to confirm that the game will work in a variety of offline modes. Players will be able to designate a maximum offline cache size on their computer, which the game can then either fill with geographical data from a preferred region or with wherever you previously flew when you were last connected to the Internet. Should you wish to unplug from the Internet after installing the game, you can expect a more rudimentary visual experience, as Asobo's rendering system relies heavily on a stream of satellite scans and metadata on top of the base Earth scans installed with the game.
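As a rough sketch of the caching pattern being described (a user-set size cap, filled with a preferred region or with wherever you last flew), something like the following least-recently-used tile cache would do the job. The class and its details are illustrative assumptions, not Asobo's implementation.

```python
from collections import OrderedDict

class TerrainTileCache:
    """Illustrative only: a size-capped, least-recently-used cache of terrain
    tiles, mirroring the described option to reserve a maximum offline cache
    that keeps a preferred region or the most recently flown areas on disk."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used_bytes = 0
        self._tiles = OrderedDict()  # tile_id -> raw tile payload

    def put(self, tile_id, payload):
        if tile_id in self._tiles:
            self.used_bytes -= len(self._tiles.pop(tile_id))
        self._tiles[tile_id] = payload
        self.used_bytes += len(payload)
        # Evict the least recently used tiles once the cap is exceeded
        while self.used_bytes > self.max_bytes and self._tiles:
            _, evicted = self._tiles.popitem(last=False)
            self.used_bytes -= len(evicted)

    def get(self, tile_id):
        payload = self._tiles.get(tile_id)
        if payload is not None:
            self._tiles.move_to_end(tile_id)  # mark as recently used
        return payload
```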

I went back to my demo kiosk and picked from thousands of available airports. (Seriously, they're going for all 44,000 on the planet here. It's no overstatement.) An Asobo staffer invited us to fly from our own hometown airports and find our neighborhoods, so I did that by buzzing over downtown Dallas, where I lived once upon a time. But the team hadn't fleshed out an optimized model of that metropolis, and that meant the major landmarks I'd seen handsomely modeled in other cities were nowhere to be found. Reunion Tower was a tall, square building. The Calatrava Bridge was a slightly raised bump in the geometry over a river. And the terrifying "mixmaster" of combined highways near downtown wasn't rendered in full 3D. Somehow, the American Airlines Center was immaculately rendered in the middle of this stew of unfinished terrain.

The locally rendered engine did its best to combine visible map data from a satellite feed and Asobo's procedural terrain generation system, so there was indeed a skyline of generic-looking buildings where they were supposed to be, along with reflective water, car-lined highways, and realistic-looking trees below. This essentially was the worst-case scenario of how the game renders major cities in an "offline" or unoptimized mode.

International

Other cities in the alpha fare much better. With the additional pump of Azure's pool of data, and with terrain and building models down to "30cm of accuracy," the results looked all the more incredible, with stunningly accurate geometry everywhere from bustling metropolises (Manhattan, Seattle, Paris) to oases of nature.

For the latter case, I spent about 20 minutes of my gameplay demo piloting a virtual Cessna 172 all the way from downtown Seattle to Mount Rainier. I began my route simply by looking ahead to the horizon and seeing a bonkers-accurate range of Cascade mountains all around, just like I might see on an average clear day in the Pacific Northwest. The result was an incredible flying experience dotted with the valleys, towns, industrial centers, and rivers that carve the world where I currently live. And when I tapped a shortcut key to fly once more over Seattle, I really could pick out significant natural and manmade landmarks by tapping a "look below" shortcut to get my bearings.

This proved out Asobo's mission of delivering true "visual flight rules" (or VFR) while in a plane. "After our own personal flight lessons, we can use a town or a river to direct our flights and know where we're going," Asobo engine lead Lionel Fuentes said. And he wants MSFS players to enjoy the same sensation.

A connection to Azure Cloud's treasure trove of data lets the engine grab more of this useful data, ranging from detailed renders of specific landmarks to more generic metadata like the color of roofs or more precise foliage data for a given forest or valley. That metadata is arguably the more useful stuff for the sake of VFR, since Asobo's MSFS relies so heavily on procedurally generated terrain and buildings. It's the difference between looking down and seeing a generic slew of buildings and trees up against the ocean's edge and looking down and seeing enough detail to pick out exactly which Puget Sound park that is.

In my case, Seattle's Gasworks Park didn't look as realistically rendered as the Space Needle, but it was good enough to help me find my bearing from its neighborhood to the next. Depending on where you call home on this planet, that kind of fine detail may very well be your VFR difference maker.

Xbox? Virtual reality?

The entire Microsoft event revolved around MSFS running on Windows 10 PCs—massive ones, at that, with high-end graphics cards. These systems occasionally struggled to keep up with a 60fps refresh, sometimes hitching for three- to six-second pauses for inexplicable, "pre-alpha" reasons. That was arguably due to a mix of unoptimized code, network issues, and sheer power demands. Which brings us to the Xbox question.

Yes, Microsoft Flight Simulator will come to Xbox consoles at some point, Neumann said, and it will be an Xbox Game Pass title on that platform, just like its Windows 10 version will be. And, yes, the game will be available either as part of XGP or as a standalone purchase.

But the game's announced "2020" launch window is for Windows 10, not Xbox consoles, and there's no indication on whether the console version will have specific limitations. Might it require online streaming? Might it require an Xbox One X minimum? Microsoft isn't saying.

"We honestly don't have a date for Xbox, otherwise we might've said something," Neumann told Ars Technica. "Is there a parallel effort related to Xbox [development]? Yeah, but it's very modest at this point. Our first and foremost topic is getting the simulation right, then do as much as we can from the [closed beta] feedback."

Asobo reps also made clear that VR is not currently in the cards for this version of MSFS, in spite of pretty much every member of the attending press asking that exact question. "Maybe one day, Flight Simulator will," Asobo CCO and co-founder David Dedeine told Ars. "It's not a commitment at all," he added with a laugh. "But we're already thinking about VR... We've worked on HoloLens, so we're familiar with [VR]'s types of problematics."

Closed alpha approaches; third parties are already involved

The rest of the presentation revolved around every factor that affects the feeling of flight, including "live" weather fed from online services, cloud density, and various friction systems. But no matter how long I had to play the demo at the MSFS event, I couldn't truly test and scrutinize every promised upgrade. As an example, Asobo insisted that a new collision model was developed to enable take-offs and landings on sloped surfaces ("they're never a flat plane," a staffer remarked), which will likely require a lot more scrutiny from pro flyers. In my novice attempts to land planes, I learned that MSFS's collision system is currently missing a key aspect: detection of my plane barreling through nearby buildings when my landing failed. (Its collision system also didn't like when I perfectly piloted a Cessna beneath a bridge, sigh.)

Asobo confirmed that the game's full system of handling friction, weather, and other relevant flight systems began with the legacy codebase from 2005's MS Flight Simulator X. Again, Asobo was brought aboard to implement its own custom engine—so it could pull off tricks like rendering the entire planet—but the French studio combed through the existing codebase to understand "not just the universe of aviation but the universe of simulation." Should users be married to the previous game's tunings, certain toggles within the new MSFS will let those players revert to older simulation models, but it's currently unclear exactly which in-game systems will receive such toggles.

By the way, your chance to try the game is coming very, very soon, and you can sign up for the new game's "Insider" program right now to potentially test the game's first "closed alpha" starting in October. That demo will likely include similar content to what I tested. Thousands of airports appear on a manipulable globe, and then players will pick one, choose an aircraft, and either go through a pre-flight checklist (with the ability to mouse over every dial, button, and screen in a cockpit) or automatically ready up for takeoff. Only three aircraft were available in my demo: the Cessna 172, Robin DR400, and Daher TBM 930. It's unclear whether more aircraft will appear in the closed alpha. But Asobo doesn't sound like it will restrict flights anywhere around the globe for its first wave of players.

Asobo says it wants the community's feedback in a big way, and in a good-faith gesture, the company is already working with third-party developers who've made and sold add-ons to existing Flight Simulator games. Shortly after the new game's E3 reveal, existing third-party devs began talking to Asobo. NDAs were signed and reps flew to meet with Asobo and Microsoft, receive the upcoming game's SDK, and affirm their interest in having their add-on wares hosted both in an in-game digital-download shop and at their own privately hosted sites. (I can't recall a recent Microsoft or Xbox game that came with such a pledge.) Neither Microsoft nor Asobo confirmed which existing third-party add-ons were involved in this process or when their efforts may be revealed as attachments to the upcoming game.

“I want to have a ‘why’”

For now, Asobo is focused on gathering players' feedback, seeing where they fly, and seeing how they use MSFS's current incredibly open system of merely picking an airport and flying around. Which, to some people, might just sound like a glorified Google Earth, Bing Edition. I pressed Neumann on this: doesn't having a beautifully rendered planet open the door for all kinds of gameplay possibilities?

He stuck firmly to dreaming of what the engine might support for future Flight Simulator modes and how such content could be guided by closed alpha feedback.

"We are multiplayer," Neumann said. "This is the Xbox group, after all. Where that exactly goes, people will tell us. Like, 'I really want to be a co-pilot, to have a real instructor call into my sim from my house on an iPad.' Cool, awesome, that's one way. If people want that, it could happen. The only thing I can say with certainty is we can't have everything at launch. We'll get prioritization from the community we already have. That'll guide us. Some ideas and maybe passions to do certain things.

"I am a biologist," Neumann continued. "I love animals, the planet is alive, and I'd love to see that in our world simulation here. But also, some things planes do in reality. Whether it's someone doing rescue helicopters, or whatever. I wouldn't be surprised if people say, 'Search and rescue, this, that, and the other, we need that in the sim.' Then, OK, what's the most important one? We will listen, and we will keep evolving."

But first and foremost is the sensation of flight—which Dedeine insisted was a driver for the game's realistic visual approach, as opposed to a directive to slap "Bing" onto another Microsoft product.

"I want to have a 'why' to everything we make," Dedeine said to Ars. "Now we can integrate awesome, real-world data into a simulation. Why are we doing this? As we studied how to fly aircraft at the same time, it went immediately further than making it beautiful. Having a beautiful world to explore is great to have as a goal. I love traveling. But we found when you fly, they teach you: look at the ground. Not at your cockpit all the time. You don't want to get lost. Knowing the terrain, having a map, looking at the ground, understanding where you are, you learn in pilot training that you always follow a highway, a rail, a lake. There's always something you connect to in the world to fly around. To not get lost. We said, OK, we can do this now. We couldn't before. There wasn't enough data to cover the world. This became a goal: pilots need these waypoints to navigate via VFR. That's when our story evolved."



https://arstechnica.com/gaming/2019/09/the-new-ms-flight-simulator-taught-me-how-to-fly-an-actual-plane/


What is semantic rendering, and how it improves your iPhone 11’s camera - Digital Trends

iPhone 11 Pro Max rear triple camera. Julian Chokkattu/Digital Trends

The biggest improvements to Apple’s new iPhones are in the cameras, and not just because of the new ultra-wide-angle lenses on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max. The software powering the cameras is responsible for a significant leap forward in image quality thanks to improvements in computational photography techniques. One of the most interesting is semantic rendering, an intelligent approach to automatically adjusting highlights, shadows, and sharpness in specific areas of a photo.

What is semantic rendering?

In artificial intelligence, “semantics” refers to a machine’s ability to smartly segment information similar to how a human would. Different branches of machine learning may have different uses for semantic segmentation, but for photography, it starts with subject recognition.

In Apple’s case, the camera is specifically looking for any people within the frame, but it goes a level deeper than that. When the iPhone detects a human subject, Apple told Digital Trends it further differentiates between skin, hair, and even eyebrows. It can then render these segments differently to achieve the best results, creating a portrait that is properly exposed over the background.


To understand why this is so important, it helps to also understand how a standard camera works. Whether an older iPhone or a professional DSLR, a camera usually doesn’t know what it’s shooting. It knows the color and brightness of any given pixel, but it can’t glean any meaning about what’s actually in the frame. When you select the “portrait” color profile on a Nikon or Canon, for example, the camera is merely applying settings to specific color ranges of pixels commonly found in human subjects; it doesn’t really know if a person is present or not.

Such an effect is called a global adjustment, meaning it is applied to the entire photo equally. This is also how standard high dynamic range, or HDR, photos work: Highlights are lowered, shadows are raised, and midrange contrast might be enhanced — but without regard to what’s in the picture. This approach works well for subjects like landscapes, but it doesn’t always work for portraits.
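To make "global" concrete, here is a toy sketch (an assumed example, not Apple's pipeline or any camera's real tone curve) that applies one highlight and shadow adjustment to every pixel, with no idea of what any pixel depicts.

```python
import numpy as np

def global_tone_adjust(image, highlight_gain=0.85, shadow_gain=1.15):
    """Toy global adjustment: one curve for the whole frame, applied equally
    to every pixel regardless of content (illustration only)."""
    img = image.astype(np.float32) / 255.0
    highlights = np.clip((img - 0.5) * 2.0, 0.0, 1.0)  # how bright each pixel is
    shadows = np.clip((0.5 - img) * 2.0, 0.0, 1.0)     # how dark each pixel is
    adjusted = (img
                * (1.0 - highlights * (1.0 - highlight_gain))   # pull highlights down
                * (1.0 + shadows * (shadow_gain - 1.0)))        # lift shadows up
    return np.clip(adjusted * 255.0, 0.0, 255.0).astype(np.uint8)
```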

iPhone 11 portrait mode. Julian Chokkattu/Digital Trends

With semantic rendering, an iPhone 11 can apply local, rather than global, adjustments. This means a bright sky can have its brightness reduced to maintain color and detail, while the highlights on a person’s face won’t be reduced as much, preserving depth in the subject. Sharpness can also be applied to the skin and hair in different strengths.

Photographers have been doing this kind of retouching by hand in programs like Adobe Photoshop for years, but the enhancements are applied instantly on an iPhone 11.
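A local, mask-driven version of the same idea might look like the toy sketch below, where boolean masks for sky and person regions stand in for the segmentation the camera produces. This is an illustration of the concept only, not Apple's rendering code; the difference from the global version above is simply where each adjustment is allowed to land.

```python
import numpy as np

def semantic_tone_adjust(image, sky_mask, person_mask):
    """Toy local adjustment: each segmented region gets its own treatment
    (illustration only). Masks are boolean arrays matching the image's height and width."""
    img = image.astype(np.float32) / 255.0
    out = img.copy()
    out[sky_mask] *= 0.80      # pull a bright sky down to keep color and detail
    out[person_mask] *= 0.97   # barely touch the subject, preserving depth in the face
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)
```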

How do you use it? Just take a picture, and the phone will do the work in fractions of a second. Know that semantic rendering only affects human portraits; other types of photos receive the standard HDR treatment. It is not limited to portrait mode — any photo with a human subject is automatically a candidate for semantic rendering.

Computational photography — which incorporates everything from HDR to depth-sensing portrait modes — enables phone cameras to surpass the physical limitations of their small lenses and sensors. Apple's semantic rendering is part of the next evolution of these technologies — Google has been using similar machine learning to power the camera in its Pixel smartphones.

While the tech powering it is complex, its goal is simple. By giving the iPhone the ability to know when it’s looking at a person, it sees the world a little more like we do, leading to pictures that look natural and more true to life.




https://www.digitaltrends.com/photography/apple-semantic-rendering-iphone-11/
