Apple’s iOS 13 update will make FaceTime eye contact way easier


Apple has added a feature called “FaceTime Attention Correction” to the latest iOS 13 Developer beta, and it looks like it could make a big difference when it comes to actually making FaceTime calls feel even more like talking to someone in person. The feature, spotted in the third beta of the new software update that went out this week, apparently does a terrific job of making it look like you’re looking directly into the camera even when you’re looking at the screen during a FaceTime call.

That’s actually a huge improvement, because when people FaceTime, most of the time they’re looking at the screen rather than the camera, since the whole point is to see the person or people you’re talking to, rather than the small black lens at the top of your device.

Guys – “FaceTime Attention Correction” in iOS 13 beta 3 is wild.

Here are some comparison photos featuring @flyosity: https://t.co/HxHhVONsi1 pic.twitter.com/jKK41L5ucI

— Will Sigmon (@WSig) July 2, 2019

The catch so far seems to be that this FaceTime feature is only available on the iPhone XS and iPhone XS Max, which could mean it only works with the latest camera tech available on Apple hardware. That could be because of the new image signal processor that Apple included in the A12 processor that powers the iPhone XS and XS Max, which also provides improvements over previous-generation phones in terms of HDR and portrait lighting effects.

It’s also possible with any updates or features that arrive in iOS beta releases that they could expand to other devices and/or vanish prior to the actual public launch of iOS 13, which is set for this fall. But here’s hoping this one remains in place, because it really seems to make a huge difference in terms of providing a sense of “presence” for FaceTime calls, which is one of the core values of the Apple chat feature overall.

Powered by WPeMatico

Review: Apple’s iPhone XR is a fine young cannibal


This iPhone is great. It is most like the last iPhone — but not the last “best” iPhone — more like the last not as good iPhone. It’s better than that one though, just not as good as the newest best iPhone or the older best iPhone.

If you’re upgrading from an iPhone 7 or iPhone 8, you’re gonna love it and likely won’t miss any current features while also getting a nice update to a gesture-driven phone with Face ID. But don’t buy it if you’re coming from an iPhone X, you’ll be disappointed as there are some compromises from the incredibly high level of performance and quality in Apple’s last flagship, which really was pushing the envelope at the time.

From a consumer perspective, this is offering a bit of choice that targets the same kind of customer who bought the iPhone 8 instead of the iPhone X last year. They want a great phone with a solid feature set and good performance but are not obsessed with ‘the best’ and likely won’t notice any of the things that would bug an iPhone X user about the iPhone XR.

On the business side, Apple is offering the iPhone XR to make sure there is no pricing umbrella underneath the iPhone XS and iPhone XS Max, and to make sure that the pricing curve is smooth across the iPhone line. It's not so much a bulwark against low-end Android; that job falls to the iPhone 8 and iPhone 7, which are sticking around at those lower prices.

Instead it’s offering an ‘affordable’ option that’s similar in philosophy to the iPhone 8’s role last year but with some additional benefits in terms of uniformity. Apple gets to move more of its user base to a fully gesture-oriented interface, as well as giving them Face ID. It benefits from more of its pipeline being dedicated to devices that share a lot of components like the A12 and True Depth camera system. It’s also recognizing the overall move towards larger screens in the market.

If Apple was trying to cannibalize sales of the iPhone XS, it couldn’t have created a better roasting spit than the iPhone XR.

Screen

Apple says that the iPhone XR has ‘the most advanced LCD ever in a smartphone’ — their words.

The iPhone XR's screen is an LCD, not an OLED. This is one of the biggest differences between the iPhone XR and the iPhone XS models, and while the screen is one of the best LCDs I've ever seen, it's not as good as the other models'. Specifically, I believe that the OLED's ability to display true blacks and deeper color (especially in images taken on the new XR cameras in HDR) sets it apart easily.

That said, I have a massive advantage in that I am able to hold the screens side by side to compare images. Simply put, if you don’t run them next to one another, this is a great screen. Given that the iPhone XS models have perhaps the best displays ever made for a smartphone, coming in a very close second isn’t a bad place to be.

A lot of nice advancements have been made here over earlier iPhone LCDs. You get True Tone, faster 120Hz touch response and wide color support. All on a 326 ppi display that's larger than the iPhone 8 Plus's, in a smaller body. You also now get tap-to-wake, another way Apple is working hard to unify the design and interaction language of its phones across the lineup.

All of these advancements don't come for free on an LCD. A lot of time, energy and money went into getting the older technology to behave as closely as possible to the flagship models. It's rare to the point of non-existence for a company to put in the work to make its lower-end devices feel as polished as the higher-end ones. For as much crap as Apple gets about withholding features to push people toward an upsell, there is very little of that happening with the iPhone XR; quite the opposite, really.

There are a few caveats here. First, 3D Touch is gone, replaced by "Haptic Touch," which Apple says works similarly to the MacBook's trackpad. It provides feedback from the iPhone's Taptic vibration engine to simulate a "button press" or trigger. In practice, it's a fairly prosaic "long press to activate" more than anything else. It's used to trigger the camera and flashlight shortcuts on the lock screen, and Apple says it's coming to other places throughout the system as it sees appropriate and figures out how to make it feel right.

I'm not a fan. I know 3D Touch has its detractors, even among people I've talked to who helped build it, but I think it's a clever utility with a nice snap to it when activating quick actions like the camera. In contrast, on the iPhone XR you must tap and hold the camera button for about a second and a half — no pressure sensitivity here, obviously — as the system determines that this is an intentional press from duration, touch shape, spread and so on, and then triggers the action. You still get the feedback, which is nice, but it feels disconnected and slow. It's the best-case scenario without the additional 3D Touch layer, but it's not ideal.
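The heuristics described above, duration, touch shape and spread, can be sketched as a toy long-press recognizer. The 1.5-second threshold comes from the review's estimate; the drift tolerance and the function itself are illustrative assumptions, not Apple's actual implementation.

```python
LONG_PRESS_SECONDS = 1.5   # rough hold duration per the review; an assumption
MAX_DRIFT_PX = 10.0        # how far the finger may wander; also an assumption

def is_intentional_press(touch_down, touch_up):
    """touch_down/touch_up: (timestamp_s, x, y) tuples.

    A press 'counts' only when it is held long enough without the finger
    drifting away -- a stand-in for the duration/shape/spread heuristics
    the review describes.
    """
    (t0, x0, y0), (t1, x1, y1) = touch_down, touch_up
    held_long_enough = (t1 - t0) >= LONG_PRESS_SECONDS
    stayed_put = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 <= MAX_DRIFT_PX
    return held_long_enough and stayed_put

# A 1.6 s hold with ~2 px of drift triggers; a quick 0.2 s tap does not.
```

The need to wait out the full duration before firing is exactly why the interaction feels slower than 3D Touch, which could use pressure to confirm intent almost immediately.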

I’d also be remiss if I didn’t mention that the edges of the iPhone XR screen have a slight dimming effect that is best described as a ‘drop shadow’. It’s wildly hard to photograph but imagine a very thin line of shadow around the edge of the phone that gets more pronounced as you tilt it and look at the edges. It’s likely an effect of the way Apple was able to get a nice sharp black drop-off at the edges that gets that to-the-edges look of the iPhone XR’s screen.

Apple is already doing a ton of work rounding the corners of the LCD screen to make them look smoothly curved (this works great and is nearly seamless unless you bust out the magnifying loupe) and it's doing some additional stuff around the edge to keep it looking tidy. They've doubled the number of LEDs in the screen to make that dithering and edging possible.

Frankly, I don’t think most people will ever notice this slight shading of dark around the edge — it is very slight — but when the screen is displaying mostly white and it’s next to the iPhone XS it’s visible.

Oh, the bezels are bigger. It makes the front look slightly less elegant and screenful than the iPhone XS, but it’s not a big deal.

Camera

Yes, the portrait mode works. No, it’s not as good as the iPhone XS. Yes, I miss having a zoom lens.

All of those things are true and easily the biggest reason I won’t be buying an iPhone XR. However, in the theme of Apple working its hardest to make even its ‘lower end’ devices work and feel as much like its best, it’s really impressive what has been done here.

The iPhone XR’s front-facing camera array is identical to what you’ll find in the iPhone XS. Which is to say it’s very good.

The rear facing camera is where it gets interesting, and different.

The rear camera is a single lens and sensor that is both functionally and actually identical to the wide angle lens in the iPhone XS. It’s the same sensor, the same optics, the same 27mm wide-angle frame. You’re going to get great ‘standard’ pictures out of this. No compromises.

However, I found myself missing the zoom lens a lot. This is absolutely a your mileage may vary scenario, but I take the vast majority of my pictures with the telephoto lens. Looking back at my year with the iPhone X I’d say north of 80% of my pictures were shot with the telephoto, even if they were close ups. I simply prefer the “52mm” equivalent with its nice compression and tight crop. It’s just a better way to shoot than a wide angle — as any photographer or camera company will tell you because that’s the standard (equivalent) lens that all cameras have shipped with for decades.

Wide angle lenses were always a kludge in smartphones and it’s only in recent years that we’ve started getting decent telephotos. If I had my choice, I’d default to the tele and have a button to zoom out to the wide angle, that would be much nicer.

But with the iPhone XR you’re stuck with the wide — and it’s a single lens at that, without the two different perspectives Apple normally uses to gather its depth data to apply the portrait effect.

So they got clever. iPhone XR portrait images still contain a depth map that determines foreground, subject and background, as well as the new segmentation map that handles fine detail like hair. While the segmentation maps are roughly identical, the depth maps from the iPhone XR are nowhere near as detailed or information-rich as the ones generated by the iPhone XS.

See the two maps compared here: the iPhone XR's depth map is far less aware of the scene depth and of the separation between the "slices" of distance. It means that the overall portrait effect, while effective, is not as nuanced or aggressive.

In addition, the iPhone XR's portrait mode only works on people. You're also limited to just a couple of the portrait lighting modes: studio and contour.

In order to accomplish portrait mode without the twin lens perspective, Apple is doing facial landmark mapping and image recognition work to determine that the subject you’re shooting is a person. It’s doing depth acquisition by acquiring the map using a continuous real-time buffer of information coming from the focus pixels embedded in the iPhone XR’s sensor that it is passing to the A12 Bionic’s Neural Engine. Multiple neural nets analyze the data and reproduce the depth effect right in the viewfinder.

When you snap the shutter it combines the depth data, the segmentation map and the image data into a portrait shot instantaneously. You’re able to see the effect immediately. It’s wild to see this happen in real time and it boggles thinking about the horsepower needed to do this. By comparison, the Pixel 3 does not do real time preview and takes a couple of seconds to even show you the completed portrait shot once it’s snapped.
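The final combination step, a per-pixel depth map driving how strongly the background is blurred while the subject stays sharp, can be sketched in a few lines. Everything here is a toy stand-in: the 1-D scanline, `box_blur`, `fake_portrait` and the `falloff` parameter are illustrative inventions, not Apple's pipeline.

```python
def box_blur(signal, k=2):
    """Naive 1-D box blur with edge clamping (the 'out of focus' copy)."""
    n = len(signal)
    out = []
    for i in range(n):
        window = [signal[min(max(j, 0), n - 1)] for j in range(i - k, i + k + 1)]
        out.append(sum(window) / len(window))
    return out

def fake_portrait(scanline, depth, subject_depth, falloff=0.5):
    """Blend a blurred copy back in, weighted by each pixel's distance
    from the subject's depth plane (weight clipped to [0, 1])."""
    blurred = box_blur(scanline)
    out = []
    for px, blur_px, d in zip(scanline, blurred, depth):
        weight = min(abs(d - subject_depth) / falloff, 1.0)
        out.append((1.0 - weight) * px + weight * blur_px)
    return out

# Toy scene: subject (depth 0.2) on the left, bright background (0.9) on the right.
scanline = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
depth    = [0.2, 0.2, 0.2, 0.2, 0.9, 0.9, 0.9, 0.9]
result = fake_portrait(scanline, depth, subject_depth=0.2)
```

Pixels at the subject's depth come back untouched while background pixels are fully replaced by the blurred copy, which is why a coarse depth map (like the XR's) produces harder, less nuanced transitions than a fine-grained one.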

It’s a bravura performance in terms of silicon. But how do the pictures look?

I have to say, I really like the portraits that come out of the iPhone XR. I was ready to hate on the software-driven solution they’d come up with for the single lens portrait but it’s pretty damn good. The depth map is not as ‘deep’ and the transitions between out of focus and in focus areas are not as wide or smooth as they are on iPhone XS, but it’s passable. You’re going to get more funny blurring of the hair, more obvious hard transitions between foreground and background and that sort of thing.

And the wide-angle portraits are completely incorrect from an optical compression perspective (nose too large, ears too small). Still, they are kind of fun in an exaggerated way. Think of the way your face looks when you get too close to your front camera.

If you take a ton of portraits with your iPhone, the iPhone XS is going to give you a better chance of getting a great shot with a ton of depth that you can play with to get the exact look that you want. But as a solution that leans hard on the software and the Neural Engine, the iPhone XR’s portrait mode isn’t bad.

Performance

Unsurprisingly, given that it has the exact same A12 Bionic processor, the iPhone XR performs almost identically to the iPhone XS in tests. Even though it features 3GB of RAM to the iPhone XS's 4GB, you're getting a phone that is damn near identical as far as speed and capability. If you care most about core features and not the camera or screen quirks, the iPhone XR offers few, if any, compromises here.

Size

The iPhone XR is the perfect size. If Apple were to make only one phone next year, they could just make it XR-sized and call it good. Though I am now used to the size of the iPhone X, a bit of extra screen real-estate is much appreciated when you do a lot of reading and email. Unfortunately, the iPhone XS Max is a two-handed phone, period. The increase in vertical size is lovely for reading and viewing movies, but it’s hell on reachability. Stretching to the corners with your thumb is darn near impossible and to complete even simple actions like closing a modal view inside an app it’s often easiest (and most habitual) to just default to two hands to perform those actions.

For those users that are ‘Plus’ addicts, the XS Max is an exercise in excess. It’s great as a command center for someone who does most of their work on their iPhones or in scenarios where it’s their only computer. My wife, for instance, has never owned her own computer and hasn’t really needed a permanent one in 15 years. For the last 10 years, she’s been all iPhone, with a bit of iPad thrown in. I myself am now on a XS Max because I also do a huge amount of my work on my iPhone and the extra screen size is great for big email threads and more general context.

But I don’t think Apple has done enough to capitalize on the larger screen iPhones in terms of software — certainly not enough to justify two-handed operation. It’s about time iOS was customized thoroughly for larger phones beyond a couple of concessions to split-view apps like Mail.

That's why the iPhone XR's size comes across as such a nice compromise. It's absolutely a one-handed phone, but you still get some extra real estate over the iPhone XS, and the exact same amount of information appears on the iPhone XR's screen as on the iPhone XS Max, in a phone that is short enough to be thumb-friendly.

Color

Apple’s industrial design chops continue to shine with the iPhone XR’s color finishes. My tester iPhone was the new Coral color and it is absolutely gorgeous.

The way Apple is doing colors is like nobody else's. There's no comparison to holding a Pixel 3, for instance. The Pixel 3 is fun and photographs well, but it's super "cheap and cheerful" in look and feel. Even though the XR is Apple's mid-range iPhone, the feel is very much that of a piece of nicely crafted jewelry. It's weighty, with a gorgeous 7-layer color process laminating the back of the rear glass, giving it a depth and sparkle that's just unmatched in consumer electronics.

The various textures of the blasted aluminum and glass are complementary, and it's a nice melding of the iPhone 8 and iPhone X design ethos. It's massively unfortunate that most people will be covering the color with cases, and I expect clear cases to explode in popularity when these phones start getting delivered.

It remains very curious that Apple is not shipping any first-party cases for the iPhone XR — not even the rumored clear case. I’m guessing that they just weren’t ready or that Apple was having issues with some odd quirk of clear cases like yellowing or cracking or something. But whatever it is, they’re leaving a bunch of cash on the table.

Apple's ID does a lot of heavy lifting here, as usual. It often goes un-analyzed just how well the construction of the device works in conjunction with marketing and market placement to help customers both justify and enjoy their purchase. It transmits to the buyer that this is a piece of quality kit that has had a lot of thought put into it, and makes them feel good about paying a hefty price for a chunk of silicon and glass. No one takes materials science anywhere near as seriously as Apple, and it continues to be on display here.

Should you buy it?

As I said above, it’s not that complicated of a question. I honestly wouldn’t overthink this one too much. The iPhone XR is made to serve a certain segment of customers that want the new iPhone but don’t necessarily need every new feature. It works great, has a few small compromises that probably won’t faze the kind of folks that would consider not buying the best and is really well built and executed.

"Apple's pricing lineup is easily its strongest yet competitively," Creative Strategies' Ben Bajarin puts it in a subscriber piece. "The [iPhone] XR in particular is well lined up against the competition. I spoke to a few of my carrier contacts after Apple's iPhone launch event and they seemed to believe the XR was going to stack up well against the competition and when you look at it priced against the Google Pixel ($799) and Samsung Galaxy 9 ($719). Some of my contacts even going so far to suggest the XR could end up being more disruptive to competitions portfolios than any iPhone since the 6/6 Plus launch."

Apple wants to fill the umbrella, leaving less room than ever for competitors. By launching a phone that's competitive on price, backed by an enormous amount of research and execution intended to bring it as close as possible to its own flagship line, Apple has set itself up for a really diverse and interesting fiscal Q4.

Whether you help Apple boost its average selling price by buying one of the maxed out XS models or you help it block another Android purchase with an iPhone XR, I think it will probably be happy having you, raw or cooked.

See the new iPhone’s ‘focus pixels’ up close


The new iPhones have excellent cameras, to be sure. But it’s always good to verify Apple’s breathless onstage claims with first-hand reports. We have our own review of the phones and their photography systems, but teardowns provide the invaluable service of letting you see the biggest changes with your own eyes — augmented, of course, by a high-powered microscope.

We’ve already seen iFixit’s solid-as-always disassembly of the phone, but TechInsights gets a lot closer to the device’s components — including the improved camera of the iPhone XS and XS Max.

Although the optics of the new camera are as far as we can tell unchanged since the X, the sensor is a new one and is worth looking closely at.

Microphotography of the sensor die shows that Apple's claims are borne out and then some. The sensor size has increased from 32.8 mm² to 40.6 mm² — a huge difference despite the small units. Every tiny bit counts at this scale. (For comparison, the Galaxy S9's is 45 mm², and the soon-to-be-replaced Pixel 2's is 25 mm².)

The pixels themselves also, as advertised, grew from 1.22 microns (micrometers) across to 1.4 microns — which should help with image quality across the board. But there’s an interesting, subtler development that has continually but quietly changed ever since its introduction: the “focus pixels.”
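Putting the teardown numbers side by side makes the scale of the change clearer: the die area and the per-pixel light-gathering area both grow. This is just arithmetic on the figures quoted above, nothing more.

```python
# Figures from the TechInsights teardown discussed above.
x_area, xs_area = 32.8, 40.6     # sensor die area, mm^2
x_pitch, xs_pitch = 1.22, 1.4    # pixel pitch, microns

die_growth = xs_area / x_area - 1                  # ~24% larger die
pixel_area_growth = (xs_pitch / x_pitch) ** 2 - 1  # ~32% more area per pixel

print(f"die area: +{die_growth:.1%}, per-pixel area: +{pixel_area_growth:.1%}")
```

Per-pixel area scales with the square of the pitch, which is why a modest-sounding bump from 1.22 to 1.4 microns translates into roughly a third more light collected per pixel.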

That’s Apple’s brand name for phase detection autofocus (PDAF) points, found in plenty of other devices. The basic idea is that you mask off half a sub-pixel every once in a while (which I guess makes it a sub-sub-pixel), and by observing how light enters these half-covered detectors you can tell whether something is in focus or not.

Of course, you need a bunch of them to sense the image patterns with high fidelity, but you have to strike a balance: losing half a pixel may not sound like much, but if you do it a million times, that's half a megapixel effectively down the drain. Wondering why all the PDAF points are green? Many camera sensors use an "RGBG" sub-pixel pattern, meaning there are two green sub-pixels for each red and blue one (it's complicated why). With twice as many green sub-pixels, the green channel is more robust to losing a bit of information.

Apple introduced PDAF in the iPhone 6, but as you can see in TechInsights' great diagram, the points are pretty scarce: roughly one for every 64 sub-pixels. Not only that, they're all masked off in the same orientation, either the left or right half gone.
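That balance is easy to quantify. Assuming each focus pixel masks exactly half of one sub-pixel, the fraction of light-gathering area sacrificed at each generation's PDAF density works out as follows (densities are the approximate per-sub-pixel figures from the teardown; the function is just the arithmetic, not anything Apple publishes):

```python
def pdaf_area_loss(subpixels_per_point):
    """Fraction of total sub-pixel area masked off, assuming each PDAF
    point costs half of one sub-pixel."""
    return 0.5 / subpixels_per_point

# Approximate densities per generation, per the TechInsights analysis.
for model, density in [("iPhone 6", 64), ("6S / 7 Plus", 32),
                       ("8 Plus", 20), ("XS", 16)]:
    loss = pdaf_area_loss(density)
    print(f"{model}: 1 PDAF point per {density} sub-pixels "
          f"-> {loss:.2%} of area masked")
```

So even at the XS's density the cost is only about 3% of sub-pixel area, which helps explain why Apple could keep increasing the point count without an obvious sensitivity penalty.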

The 6S and 7 Pluses saw the number double to one PDAF point per 32 sub-pixels. And in the 8 Plus, the number improved to one per 20 — but there's another addition: now the phase-detection masks are on the tops and bottoms of the sub-pixels as well. As you can imagine, doing phase detection in multiple directions is a more sophisticated proposition, but it could also significantly improve the accuracy of the process. Autofocus systems all have their weaknesses, and this may have addressed one Apple regretted in earlier iterations.

Which brings us to the XS (and Max, of course), in which the PDAF points are now one per 16 sub-pixels, the frequency of the vertical phase-detection points having increased so that they're equal in number to the horizontal ones. Clearly the experiment paid off, and any consequent light loss has been mitigated or accounted for.

I’m curious how the sub-pixel patterns of Samsung, Huawei and Google phones compare, and I’m looking into it. But I wanted to highlight this interesting little evolution. It’s an interesting example of the kind of changes that are hard to understand when explained in simple number form — we’ve doubled this, or there are a million more of that — but which make sense when you see them in physical form.

The 7 most egregious fibs Apple told about the iPhone XS camera today


Apple always drops a few whoppers at its events, and the iPhone XS announcement today was no exception. And nowhere were they more blatant than in the introduction of the devices’ “new” camera features. No one doubts that iPhones take great pictures, so why bother lying about it? My guess is they can’t help themselves.

To be clear, I have no doubt they made some great updates to make a good camera better. But whatever those improvements are, they were overshadowed today by the breathless hype that was frequently questionable and occasionally just plain wrong. Now, to fill this article out I had to get a bit pedantic, but honestly, some of these are pretty egregious.

“The world’s most popular camera”

There are a lot of iPhones out there, to be sure. But defining the iPhone as some sort of decade-long continuous camera, which Apple seems to be doing, is sort of a disingenuous way to do it. By that standard, Samsung would almost certainly be ahead, since it would be allowed to count all its Galaxy phones going back a decade as well, and they’ve definitely outsold Apple in that time. Going further, if you were to say that a basic off-the-shelf camera stack and common Sony or Samsung sensor was a “camera,” iPhone would probably be outnumbered 10:1 by Android phones.

Is the iPhone one of the world’s most popular cameras? To be sure. Is it the world’s most popular camera? You’d have to slice it pretty thin and say that this or that year and this or that model was more numerous than any other single model. The point is this is a very squishy metric and one many could lay claim to depending on how they pick or interpret the numbers. As usual, Apple didn’t show their work here, so we may as well coin a term and call this an educated bluff.

“Remarkable new dual camera system”

As Phil would explain later, a lot of the newness comes from improvements to the sensor and image processor. But since he said the system was new while standing in front of an exploded view of the camera hardware, we may take him to be referring to that as well.

It’s not actually clear what in the hardware is different from the iPhone X. Certainly if you look at the specs, they’re nearly identical:

If I said these were different cameras, would you believe me? Same F numbers, no reason to think the image stabilization is different or better, and so on. It would not be unreasonable to guess that these are, as far as optics, the same cameras as before. Again, not that there was anything wrong with them — they’re fabulous optics. But showing components that are in fact the same and saying it’s different is misleading.

Given Apple’s style, if there were any actual changes to the lenses or OIS, they’d have said something. It’s not trivial to improve those things and they’d take credit if they had done so.

The sensor of course is extremely important, and it is improved: the 1.4-micrometer pixel pitch on the wide-angle main camera is larger than the 1.22-micrometer pitch on the X. Since the megapixels are similar we can probably surmise that the “larger” sensor is a consequence of this different pixel pitch, not any kind of real form factor change. It’s certainly larger, but the wider pixel pitch, which helps with sensitivity, is what’s actually improved, and the increased dimensions are just a consequence of that.

We’ll look at the image processor claims below.

“2x faster sensor… for better image quality”

It's not really clear what is meant when he says this ("to take advantage of all this technology"). Is it the readout rate? Is it the processor that's faster, since that's what would probably produce better image quality (more horsepower to calculate colors, encode better and so on)? "Fast" also refers to light gathering — is that faster?

I don’t think it’s accidental that this was just sort of thrown out there and not specified. Apple likes big simple numbers and doesn’t want to play the spec game the same way as the others. But this in my opinion crosses the line from simplifying to misleading. This at least Apple or some detailed third party testing can clear up.

“What it does that is entirely new is connect together the ISP with that neural engine, to use them together.”

Now, this was a bit of sleight of hand on Phil’s part. Presumably what’s new is that Apple has better integrated the image processing pathway between the traditional image processor, which is doing the workhorse stuff like autofocus and color, and the “neural engine,” which is doing face detection.

It may be new for Apple, but this kind of thing has been standard in many cameras for years. Both phones and interchangeable-lens systems like DSLRs use face and eye detection, some using neural-type models, to guide autofocus or exposure. This (and the problems that come with it) go back years and years. I remember point-and-shoots that had it, but unfortunately failed to detect people who had dark skin or were frowning.

It’s gotten a lot better (Apple’s depth-detecting units probably help a lot), but the idea of tying a face-tracking system, whatever fancy name you call it, in to the image-capture process is old hat. What’s in the XS may be the best, but it’s probably not “entirely new” even for Apple, let alone the rest of photography.

“We have a brand new feature we call smart HDR.”

Apple’s brand new feature has been on Google’s Pixel phones for a while now. A lot of cameras now keep a frame buffer going, essentially snapping pictures in the background while the app is open, then using the latest one when you hit the button. And Google, among others, had the idea that you could use these unseen pictures as raw material for an HDR shot.

Probably Apple's method is different, and it may even be better, but fundamentally it's the same thing. Again, "brand new" to iPhone users, but well known among Android flagship devices.
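The frame-buffer technique described above is easy to sketch: keep a rolling window of the most recent frames while the camera is open, and merge them when the shutter fires. The class below is a toy illustration, with a plain average standing in for a real exposure-fusion or HDR merge.

```python
from collections import deque

class FrameBuffer:
    """Keep the last n frames captured while the camera app is open;
    on shutter press, hand them to a merge step."""

    def __init__(self, n=4):
        self.frames = deque(maxlen=n)  # old frames fall off automatically

    def on_frame(self, frame):
        """Called continuously in the background, before any shutter press."""
        self.frames.append(frame)

    def snap(self):
        """Trivial stand-in for HDR merging: per-pixel average of the
        buffered frames. Real pipelines align and weight the frames."""
        frames = list(self.frames)
        return [sum(px) / len(frames) for px in zip(*frames)]

buf = FrameBuffer(n=3)
for frame in ([10, 200], [20, 220], [30, 240]):  # toy 2-pixel "frames"
    buf.on_frame(frame)
merged = buf.snap()
```

Because the buffer is already full when you press the button, the merged shot can appear instantly; that is the property the real-time preview on the XS exploits.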

“This is what you’re not supposed to do, right, shooting a photo into the sun, because you’re gonna blow out the exposure.”

I’m not saying you should shoot directly into the sun, but it’s really not uncommon to include the sun in your shot. In the corner like that it can make for some cool lens flares, for instance. It won’t blow out these days because almost every camera’s auto-exposure algorithms are either center-weighted or intelligently shift around — to find faces, for instance.

When the sun is in your shot, your problem isn’t blown out highlights but a lack of dynamic range caused by a large difference between the exposure needed to capture the sun-lit background and the shadowed foreground. This is, of course, as Phil says, one of the best applications of HDR — a well-bracketed exposure can make sure you have shadow details while also keeping the bright ones.

Funnily enough, in the picture he chose here, the shadow details are mostly lost — you just see a bunch of noise there. You don’t need HDR to get those water droplets — that’s a shutter speed thing, really. It’s still a great shot, by the way, I just don’t think it’s illustrative of what Phil is talking about.

“You can adjust the depth of field… this has not been possible in photography of any type of camera.”

This just isn't true. You can do this on the Galaxy S9, and it's being rolled out in Google Photos as well. Lytro was doing something like it years and years ago, if we're including "any type of camera." Will this be better? Probably; it looks great to me. Has it never been possible ever? Not even close. I feel kind of bad that no one told Phil. He's out here without the facts.

Well, that’s all the big ones. There were plenty more, shall we say, embellishments at the event, but that’s par for the course at any big company’s launch. I just felt like these ones couldn’t go unanswered. I have nothing against the iPhone camera — I use one myself. But boy are they going wild with these claims. Somebody’s got to say it, since clearly no one inside Apple is.


Huge leak shows off the new iPhone XS


Get ready for a leaked look at the new iPhone XS. 9to5Mac has gotten its hands on an image of Apple’s next generation of iPhone hardware, and the future looks pretty swanky.

The leaked image showcases the new sizing of Apple’s soon-to-be-unveiled flagship bezel-less devices, which likely will have 5.8-inch and 6.5-inch screens, respectively. The phones will be called the iPhone XS, according to the report. The pictured devices represent the higher-end OLED screen models, not the cheaper rumored notch LCD iPhone.

The device will feature a new gold color shell. The iPhone X is currently available in space gray and silver.

Image credit: 9to5mac

A picture is worth a thousand words, but there are still a lot of details we’re waiting on here obviously. Apple is expected to show off the new phone hardware as well as a new version of the Apple Watch at a hardware event on September 12.
