Elex Xbox One review — A gorgeous RPG world hindered by performance issues

Elex is a vast open-world role-playing game filled with a surprising number of choices. For those of you who aren't familiar with the developers—Piranha Bytes—they're a German studio known for the Gothic and Risen franchises.

Gothic and Risen are great role-playing experiences, so the studio has a high standard to meet with Elex. Does the game live up to what we expect from the studio? Let’s find out.


Elex takes place in a devastated world where humanity has broken into various factions. A comet carrying a rare substance called Elex ravaged the planet, and the remnants adopted their own beliefs, which they view as necessary for survival. For example, the Berserkers believe in magic over technology, whereas the Clerics focus on advanced weaponry. Each faction has its own benefits and weaknesses, so who you align with greatly influences the game.

The mysterious Albs are everyone's enemy, so will all of these factions come together to fight for survival? You'll have to play Elex to find out. Despite initial impressions, the game features a complex—at times convoluted—plot which explores topics like addiction and the limits of power.

The world of Magalan is gorgeous and diverse. The open environments are absolutely massive and range from lush forests to scorched wastelands. Some of the most breathtaking sights are in the city of Goliet, which looks out on snow-covered mountains and is filled with the ruins of a distant past. The hills are covered with colorful vegetation and dotted with flowers. The developers strove for a photorealistic aesthetic, and Elex is definitely better for it, making it one of the best-looking games on Xbox One.

Elex features a jetpack which allows you to surprise enemies and reach secret areas. However, it's most useful for getting away from ravenous creatures and landing safely, because even the slightest fall can damage you. This isn't Middle-earth: Shadow of War, where you can jump off towers and survive. Watching your step is necessary, especially during the first few hours of gameplay, because you don't have any armor or other protection.

From the get-go, you can go to any area you want. While this makes the world truly open, it also makes it unpredictable. Areas further from your starting location contain incredibly dangerous enemies that can annihilate you with one hit. Since the combat is Dark Souls-like, focused on stamina and evasion, taking on powerful foes at the very beginning is next to impossible. Be sure to level up your character by carrying out mundane tasks at first, because tackling even a harmless-looking salamander-like creature can be the end of you.

Just like any role-playing game, Elex features a lot of leveling up and customization. However, you aren't limited to a particular class like in many other titles. Do you want to use a bow? Go ahead. What about a sword? You can do that too. What if you want to become proficient with both of those weapons? The choice is available. There are also firearms and various forms of magic to customize your character with. The choices are almost endless because the game features a deep level of personalization. Elex can be adjusted to however you want to play.

Have you played Skyrim? Remember how you could kill certain characters, and that would change the story a bit? Well, Elex is the same way, except the integration is more pronounced. Since the game revolves around factions, you can help or kill whoever you want. However, others won't just let it slide – you can be thrown into a dungeon for your crimes. People will remember your words and actions, and being a jerk can come back to bite you at a later time. Just remember that. You need allies, and killing guards and storekeepers will get you nowhere.

Nowadays many studios use motion capture technology to animate faces, but Elex doesn't seem to do that. Just like Deus Ex: Mankind Divided, Elex suffers from wooden animations. The voice acting is also a little off, which exacerbates the situation. The main character seems a little too generic as well, so that doesn't help matters. Luckily, the world is filled with intriguing and sometimes hilarious individuals who make it a joy to explore.

Unlike its Windows PC counterpart, Elex on Xbox One suffers from some performance issues. The title runs at 30 FPS most of the time but regularly experiences stuttering which hinders the gameplay. The frame rate fluctuations make it difficult to control your character, and this can be problematic when you're fighting a few foes at once. Despite the day one patch, these problems keep Elex from being truly enjoyable.

Despite its colorful and interesting world, the game feels like a chore to get through because of these performance issues. While the frame rate isn't as bad as in some other titles out there, getting engrossed in a game requires an absence of technical distractions. Unfortunately, Elex on Xbox One doesn't deliver that, and no matter how great the content is, the lack of polish will push consumers away.

Pros:

  • Nice art direction.
  • Interesting world.
  • RPG choices with consequences.

Cons:

  • Suffers from a lack of polish on Xbox One.
  • Technical issues make it hard to recommend.

Overall, Elex is a great game marred by technical issues on Xbox One. It needs more work on the console, but if you can look past the occasional problems, it’s a rewarding role-playing game. Hopefully, the developers will issue an update soon which fixes the remaining grievances because the day one patch didn’t do enough. It’s a shame that despite having great content, it’s hard to recommend because the problems are that noticeable. Piranha Bytes have a potentially great franchise here, but they need to make it work properly if they want a following.


BlackBerry KEYone review – BlackBerry is back with Android 7.1



The entire review of the new BlackBerry KEYone smartphone.
Get the BlackBerry KEYone on Amazon: –
Far more details on the BlackBerry KEYone right here: –
Video gear I use: –

In this review I will show you everything there is to know about the upcoming BlackBerry KEYone smartphone. This phone runs Android 7.1, has a Snapdragon 625 CPU, 3GB of RAM and 32GB of internal storage.
You will be able to check out a lot of sample photos from the 12MP rear camera and some sample photos from the 8MP front-facing camera. I will also show you the score on the Antutu benchmark and how the phone performs for day-to-day use.

Also check out some more great reviews of budget-friendly phones:
Umidigi G review: –
Vernee Apollo X review: –
Honor 6X review: –
UMIDIGI C Note review: –
Oukitel K6000 Plus: –
Moto G5 Plus: –
BLU Life One X2:

There's A Legend Of Zelda Concert At The Sydney Opera House Next Weekend

In a few days' time, the concert hall of the Sydney Opera House will ring again with the chords of The Legend of Zelda's unforgettable soundtracks for the first time in a decade. Symphony of the Goddesses is the result of a close collaboration between the symphony's creator Jason Michael Paul and Nintendo, and it's an honest recreation of the original music of the games — "as first-party as it gets", says the producer behind it all.

Jason Michael Paul, probably best known in Australia for the Play! A Video Game Symphony series that toured in 2007, is responsible for bringing this Zelda concert to the Opera House, where it will run for two shows on Sunday 29 October.

Unsurprisingly, he's a long-time fan of Zelda. "Like many, my journey through The Legend of Zelda began at 10 years old with the original gold cartridge and NES. I am especially fond of Majora's Mask — a game that has taken on a whole new meaning since I play it with my 10 year old daughter. Skyward Sword is a favorite [too]; Nintendo asked me to produce the 25th Anniversary orchestral CD that was released with the bundle. It was an honor… Breath of the Wild is just simply wonderful."

Play! had some Zelda in it already, but this new concert is all about Nintendo's most loved series — it will feature music from Skyward Sword and Breath of the Wild as well as the series' classics like Ocarina of Time. A full orchestra and choir will be accompanied by a "stirring" video created for the performance. Understandably, Nintendo was "quite hands on" with the production. "When the work is submitted and revisions (if any) are made, then we only collaborate further on new submissions. Mr. Kondo and Mr. Aonuma oversee everything and anything that is performed as part of The Legend of Zelda: Symphony of the Goddesses — it is as first-party as it gets."

There are three major factors, the producer-promoter says, in bringing video game soundtracks to life through an orchestra like Symphony of the Goddesses: "…reimagining the scores to sound wonderful performed by an orchestra and choir, choosing genuinely talented humans who are equally as passionate about Zelda as they are about arranging and composing, [and] using a lot of the themes and melodies and making them sound bombastic and big-sounding."

While it's likely that a lot of the Zelda orchestral pieces will trigger nostalgia and strong memories in listeners, the concert may also add a bit of deviation from the original soundtracks into the mix: "there's a little bit of both", says Paul. The series has already run in Perth and Melbourne. [Sydney Opera House]


PC gamers hope for improvements after hackers and performance issues mar the Call of Duty: WW2 beta • Eurogamer.net

It rings out from the rooftops, across vast swaths of the internet – Call of Duty is in trouble. Indeed, after rocketing past the drone-choked skies of its "near-future" into the same bleak vacuum of military-fetishist sci-fi occupied by its rival, Titanfall 2, COD finds itself desperately jamming on the eject button as it falls back down to Earth.

Set upon by the encroaching tide of so-called "hero shooters" brimming with color and vibrancy, steward Sledgehammer Games has retreated to the closest thing the megaseries ever had to a coherent identity – the beaches of Normandy, set in sepia like an insect in amber, where soldiers were soldiers, Nazis were Nazis and those heavy boots stayed planted firmly in the sand.

But even as the ludicrously-named Call of Duty: WW2 creeps ever nearer to its 3rd November launch date, if the bumpy reception of last week's open beta is anything to go by, Sledgehammer still has a ways to go – even as the studio head recently announced the end of creative work on the project.

To be clear, most would agree that the point of such an open beta is to smooth out technical wrinkles before final launch, especially in terms of the pesky calculus of performance and server load. However, the number of players complaining of regular hitching, framerate drops and outright crashes far exceeded those generous allowances, with weary fans pointing to faulty Nvidia drivers or screen-capture software such as Shadowplay as culprits for the hiccups.

While I experienced no game-ending bugs, the experience teetered from merely unpleasant to borderline unplayable on my average-spec gaming rig, with framerates ping-ponging from 120 down to the single digits every few moments for no obvious reason. Updating the drivers on my GTX 970 seemed to mitigate issues somewhat, but my CPU utilization would spike without fail at least once or twice per round, almost always resulting in my speedy demise.

I wasn't alone – in an (admittedly non-scientific) survey conducted on the Call of Duty: WW2 subreddit, 45.8 per cent of players reported frame-rate drops as one of the significant demerits of the beta, and threads noting subpar performance dotted gaming forums and subreddits alike.

For those who could run the game at an acceptable clip, the experience was mostly what they expected – that is, until the cheaters showed up. Over the years, COD has garnered somewhat of a dismal reputation for its legions of hackers, with one frustrated enthusiast going so far as to call it "the most hacked game in the history of the gaming industry". While that may be slightly hyperbolic, various exploits were particularly rampant on the prior generation of consoles, where cheating was as simple as modifying your controller. Since this beta demanded no buy-in other than a Steam account, cheaters descended on it in droves, wallhacking and aimbotting from dawn 'til dusk.

When forums were set alight by reports of these shenanigans, Sledgehammer was quick to note in a separate update that the usual anti-cheating concoctions that gird a game of this scale were not active during the beta, and that hackers will be suitably punished once the game actually comes out. While this helped assuage some of the concerns – with some noting DICE's open beta for Battlefield 1 suffered from similar malfeasance – cheaters still registered on the survey as the subreddit's top issue, with 73 per cent of users marking it as a significant problem.

Beyond these two top-line items, the reported concerns of the populace at large descend into a constellation of odd omissions many PC gamers take as given these days, such as numbers that correspond to the mouse sensitivity setting bar, or field-of-view that scales with the actual weapon model. While some of these are more substantive than others – why a PC game would neglect to include a numbered latency readout in the year 2017 is beyond me – if Sledgehammer is to be believed, a barrage of post-launch updates will undoubtedly iron most of these out.

Chief among these complaints is the use of peer-to-peer connections with a player host rather than dedicated servers. At the moment, though details are sketchy, it appears WW2 uses a so-called "hybrid system" of dedicated servers and P2P, depending on your location. An apparent supermajority (80 per cent) of the userbase decries this as an unacceptable half-measure: if the player hosting a P2P match leaves during a game, the dreaded "host migration" process stops the match and tries to reconnect to a new host, with decidedly mixed results. Considering the prior games in the franchise have run off this hybrid system for years now, the chances of such a stark reversal seem dubious, but not altogether impossible.

But beyond all the forum-flailing and belly-aching, if the redditors are to be believed, the beta as a whole was solid, if unspectacular. When reached for comment, the response was remarkably consistent, with a few outliers: though few were won over by the five days of carnage, most felt Call of Duty: WW2 represents a firm step forward for the franchise. Even with the vast unpopularity of the sci-fi nonsense employed by the past few entries well in mind, the universal usage of the phrase "boots on the ground" – sometimes abbreviated to BotG – surprised even me.

"I played with a bunch of my friends and the classic boots on the ground feel, the overall game style itself and playing with all my friends together really brought me back to the days when we were younger and we used to just play COD all the time," redditor xZombieMike said.

JP_76 echoed this sentiment, but with less enthusiasm: "I'm still unsure about the future of COD. So this year we get a good, standard BotG style game, but what about next year? Are we going to get another space-themed COD game with jet packs and see the advanced movement system make a return? I hope not."

Scanning through these posts, I cannot help but recall the snide remark made by a friend whom I cajoled into trying the beta with me: "it's COD, but with worse guns." While you can certainly call it a reductive view of things, there's no denying there's some kernel of truth there. As the cash monster known as Call of Duty enters the second half of its second decade of existence, we cannot help but silently observe the creature munch on its own delectable tail.

It's never too early to mine for nostalgia these days, but by returning to WW2, Call of Duty abandons the pretence of novelty and goes where the real money is: a proper revival act, complete with the righteous commandos and nasty Nazis you've come to expect. They say you never go broke playing to the cheap seats, and I suspect Activision will take that adage all the way to the bank for as long as it can. Let's hope they don't go for the can-can, though – after all, they need to keep those boots planted firmly on the stage.

With the Note 8, Samsung No Longer Delivers Embarrassing Real-World Performance [Comparing Note 8, Pixel XL, OnePlus 5]

For the longest time, TouchWiz was criticized for its inability to get with the times and address its longest-running shortcomings. Bloatware, a myriad of redundant features, outdated aesthetics and sub-par performance were the main things that stood out to many enthusiasts — particularly those enamored with stock Android. Through many Galaxy S and Note devices, TouchWiz has further iterated and improved, even attempting to re-brand itself under the name "Grace UX". Despite such efforts, some of us still felt it provided an experience that was less than gracious, and the community at large opted to call the new Samsung Experience by its old name. But nothing stays the same for too long in this industry, and the Galaxy Note 8 has addressed one of Samsung's oldest shortcomings: performance.

Whether this came from a natural evolution in specs or a targeted effort (likely both) is still up for debate — there's no doubt that Samsung has extensively tweaked its software for this revision of the Note 8, and it's also no secret that today's hardware is measurably faster than last year's. Indeed, on the hardware front the Galaxy Note 8 sports several key upgrades over last year's Note7: another bump in RAM to 6GB, an upgraded Snapdragon 835 processor (with the Adreno GPU clocked at 710MHz, unlike the S8's 670MHz) and Samsung's faster UFS 2.1 storage. All of this is important because, as we've said, Samsung's improvements on the performance front have been tangible, but not relative to the leaps in fluidity and responsiveness that other devices have provided. This is why, when we looked at last year's Galaxy Note 7, we couldn't help but notice Samsung had yet again delivered what we called "embarrassing performance".

With the Note 7, Samsung Still Delivers Embarrassing Real-World Performance

In last year's article, we detailed some of the performance problems and annoyances we found in the Galaxy Note7 Snapdragon variants used by our writing staff. XDA Contributor Eric Hulse provided various examples of reproducible lag that had plagued early adopters across a variety of usage scenarios — specific menus, scrolling in applications, and Samsung's own typing experience through the stock keyboard. We provided several samples of these jarring stutters through gifs and videos, many of them showing the well-known "GPU profiling" bars — a histogram of the device's last few frame times, which allow developers and enthusiasts alike to quantify their device's smoothness, even if roughly, by allowing them to estimate the number of stutters and dropped frames. That article sparked a lot of debate within and outside of XDA, with multiple online blogs responding in retaliation to our claims or backing up our assessment and sharing their similar experiences.
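
As an aside, the profiling bars referenced above are built into Android, and readers can toggle them on any device. Below is a minimal sketch of flipping them on over adb; the `debug.hwui.profile` property is Android's documented switch for this feature, while the wrapper around it is purely illustrative and not the tooling used for this article.

```python
import subprocess

def set_gpu_profiling_bars(enabled: bool) -> None:
    """Toggle Android's on-screen "Profile GPU rendering" bars over adb.

    Equivalent to Settings > Developer options > Profile GPU rendering;
    apps must be relaunched to pick up the new value.
    """
    value = "visual_bars" if enabled else "false"
    subprocess.run(
        ["adb", "shell", "setprop", "debug.hwui.profile", value],
        check=True,
    )

if __name__ == "__main__":
    # The green line on the bars marks the ~16ms budget for 60 FPS.
    set_gpu_profiling_bars(True)
```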

The Galaxy Note 8 is a new phone, though, and a lot of things have changed since. TouchWiz has improved with its jump to Nougat, hardware has gotten better, and our standards have changed too. To be more specific, new devices have raised the bar for what we understand to be excellent performance — namely, the Pixel XL which keeps surprising all of us with its insane levels of fluidity (which we’ll demonstrate below). We thought it’d be appropriate to revisit the topic, and see how Samsung’s newest addition to the Galaxy lineup stacks up both against competitors and, most importantly, its notorious reputation. The Galaxy Note 8 offers perceptibly better performance with none of last year’s embarrassing stutters and jank, and we’ve been thoroughly surprised throughout our time with our units. Below we’ll document some of the improvements we’ve noted in the device’s performance and user experience, including comparisons with other devices.


Real World Smoothness

Methodology: In order to test real world fluidity, we won't be presenting gifs or screenshots showing GPU profiling bars, but instead we will show you the extracted frame times plotted against other devices under the exact same usage scenario. We put together a tool to extract and parse the frame data, and a UI automation system that allowed us to build macros that mimic real-world use cases by simulating touch input — scrolling, loading new activities or windows, and compound tests with complex UI navigation. These tests were run across a Pixel XL, a OnePlus 5 (6GB) and the Galaxy Note 8, with all devices set to 1080p resolution and all devices fresh off a factory reset; this does mean that the Pixel XL is rendering fewer pixels than most readers will be used to, potentially increasing performance by a noticeable margin. We made sure the tests were perfectly synchronized across devices, measuring the same actions at the same time, with multiple tests across each device to validate our results. Repeated tests continuously show minimal variance in the number of frames captured, though the number of total frames captured on each test varies significantly across devices. This is because these devices behave differently in their scrolling acceleration/final velocity, and set different baseline speeds for many actions and transitions (even at the same 1x setting).
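
Our extraction tool itself isn't published, but Android exposes the raw data it builds on. The sketch below shows one way to pull per-frame render times with the stock `dumpsys gfxinfo ... framestats` dump and compute a jank percentage; the column positions follow Android's documented framestats layout for Nougat-era builds, and the package name is just an example.

```python
import subprocess

VSYNC_BUDGET_MS = 16.6  # one frame at 60Hz

def frame_times_ms(package: str) -> list:
    """Return recent per-frame render times (ms) for `package`.

    Parses the CSV block between ---PROFILEDATA--- markers emitted by
    `adb shell dumpsys gfxinfo <package> framestats`.
    """
    dump = subprocess.run(
        ["adb", "shell", "dumpsys", "gfxinfo", package, "framestats"],
        capture_output=True, text=True, check=True,
    ).stdout
    times, in_data = [], False
    for line in dump.splitlines():
        if line.startswith("---PROFILEDATA---"):
            in_data = not in_data
            continue
        if in_data and line and not line.startswith("Flags"):
            cols = line.split(",")
            # Column 1 is IntendedVsync, column 13 is FrameCompleted (ns).
            times.append((int(cols[13]) - int(cols[1])) / 1e6)
    return times

frames = frame_times_ms("com.android.vending")  # the Play Store, as an example
janky = sum(t > VSYNC_BUDGET_MS for t in frames)
print(f"{len(frames)} frames, {janky} janky "
      f"({100 * janky / max(len(frames), 1):.1f}%)")
```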


The first thing we compared across devices was simple scrolling performance. It turns out that "dissecting" a scroll across devices does give us an idea as to what causes some of the performance differences we observe, especially across devices that largely feature the same hardware like the OnePlus 5 and Galaxy Note 8. Below you can see the graphs of a simple three-scroll motion in the Google Play Store's "Top Charts" list, pre-loaded ahead of the motion to ensure there'd be no disparities from the phones fetching and loading new elements from the internet. The motion is the same across the three devices, yet we see the Pixel XL showing absolutely no frames over the 16.6ms green line that indicates a missed frame. The OnePlus 5, by contrast, features more of those missed frames and they correspond to user input, similarly to what's displayed on the Galaxy Note 8. There's a reason why these devices have higher frame times at the instants there's user interaction, while the Pixel XL does not.

In the data above you can find these devices’ CPU frequency in their performance cores as the scrolling takes place. You can see how each of them reacts to user input (where frame times spike), with the Pixel XL quickly scaling up closer to its maximum frequency and remaining at a higher frequency throughout the motion. The OnePlus 5, by contrast, is a lot more erratic — its frequency steps are larger, and it very quickly jumps up and down. As a result, it’s likely that the OnePlus 5 does not encounter each new user interaction with the big cluster at an appropriate frequency, having to quickly ramp up to address the increase in CPU workload — by then, though, the phone might have already missed a few frames. The Note 8 sits somewhere in the middle of these two, with a median frequency sitting between the OnePlus 5’s lower recorded frequency and their top frequency of 2.36GHz (under this workload). The same disparities in frame times and CPU frequencies can be observed while scrolling through the main inbox within Gmail, again with a simple three-swipe scrolling scenario.
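
The frequency traces above come from the kernel's cpufreq interface, which anyone can poll. A rough sketch of capturing such a trace over adb is below; the sysfs path assumes cpu4 is the first core of the performance cluster, which holds for the Snapdragon 835 phones here but varies by kernel.

```python
import subprocess
import time

# First "big" core on the Snapdragon 835 devices tested; path varies by kernel.
FREQ_NODE = "/sys/devices/system/cpu/cpu4/cpufreq/scaling_cur_freq"

def sample_big_cluster(duration_s: float = 3.0, interval_s: float = 0.05):
    """Poll the performance cluster's current frequency (kHz) while a gesture runs."""
    samples, deadline = [], time.time() + duration_s
    while time.time() < deadline:
        khz = int(subprocess.run(
            ["adb", "shell", "cat", FREQ_NODE],
            capture_output=True, text=True, check=True,
        ).stdout)
        samples.append((time.time(), khz))
        time.sleep(interval_s)
    return samples

for stamp, khz in sample_big_cluster():
    print(f"{stamp:.2f}  {khz / 1000:.0f} MHz")
```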

The OnePlus 5, once again, has the highest number of frame time spikes near the points at which our program simulates user input and scrolling. In this case, their “touch boost” succeeds at ramping the CPU frequency, but seemingly not at meeting the entirety of the workload demand in time. The Pixel XL does see jank (which we’ll be calling frames or sets of frames above the 16.6ms green line) near the user input, but frame times progressively go down immediately after, and even then the spikes are much lower than what we see on both the OnePlus 5 and the Galaxy Note 8. Keep in mind that our perception of jank or stutters is not only about how many frames miss the 16.6ms target, but also how long a single or small group of frames takes to render. For example, if we saw a frame stuck for 700ms that would go from jank to freeze. While the Note 8 notably has fewer janky frames than the OnePlus 5 in nearly all of our tests, it is also capable of far higher frame times for individual frames, with repeated instances of the device blowing past our 50ms y-axis limit. Even then, none of these devices are “perceptibly” stuttery, with the results shown here matching the kind of scrolling performance most would deem entirely smooth.

So far we've seen that all of these devices are quite capable of smoothly handling simple scrolling motions, with the number of janky frames ranging from 0 to 13 percent. The Pixel XL is particularly impressive in this regard, and while it'll continue its dominance across all of our tests, there's an area where nearly all of these devices drop frames: loading new windows within an app. We've all seen, for example, some dropped frames when opening a Play Store listing and seeing all those beautiful animations putting everything in place. The following test includes a lot more element loading and UI navigation, being a composite UI test on the Google Play Store. The list of actions undertaken throughout the tests is shown below, followed by a sketch of how such a macro can be scripted. The test is repeated three consecutive times, and results are displayed in the graphs below.

Play Store Test Actions

– Open application
– Open “Top Charts” list
– Scroll
– Open app listing
– Scroll
– Open “Read all reviews”
– Scroll
– Go back
– Go back
– Open side panel
– Open “Music”
– Scroll
– Go back
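
As promised above, here is a rough sketch of how a macro like this can be scripted with adb's stock `input` and `monkey` commands. Our in-house system replayed recorded touch traces for tighter synchronization; `input swipe` and `input keyevent` are cruder but reproduce the same class of workload. The coordinates are illustrative and need tuning per device.

```python
import subprocess
import time

def adb(*args: str) -> None:
    subprocess.run(["adb", "shell", *args], check=True)

def swipe_up() -> None:
    # Swipe from (540, 1500) to (540, 400) over 150ms on a 1080x1920 target.
    adb("input", "swipe", "540", "1500", "540", "400", "150")

def play_store_macro() -> None:
    """Crude stand-in for the composite Play Store test described above."""
    adb("monkey", "-p", "com.android.vending",
        "-c", "android.intent.category.LAUNCHER", "1")  # launch the app
    time.sleep(2)
    for _ in range(3):  # the scrolling portion of the test
        swipe_up()
        time.sleep(1)
    adb("input", "keyevent", "KEYCODE_BACK")  # back out again

play_store_macro()
```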

Nearly all of the devices we tested had tremendous spikes whenever they had to load a new window, be it a Play Store listing or a list of applications or songs. The scrolling portions saw the smoothest intervals by comparison. Loading the more complex windows (like a Play Store listing) can look perceptibly choppy in places, and I personally didn't need any of these graphs to confirm that, but again the Note 8 performs much better than I expected. The Play Store isn't a thoroughly lightweight application and the Note 8 does a good job at managing its transitions — at 12% jank, it had twice as many janky frames as the Pixel XL (relatively speaking, given total captured frames differ in number) and 33% fewer than the OnePlus 5, even though both had the same average frame time throughout the test. Mind you, this doesn't mean that the OnePlus 5 effectively runs at 49 frames per second — frames are only recorded when the screen updates, and the only points that add 60 frames every second to our plot are stretches of continuous motion, namely continuous scrolling or long loading instances.

Another popular application we decided to look at is YouTube, with the steps detailed below. Once more, the test is repeated three times, and keep in mind that video playback within the app is not represented in the frame plot in any way, as none of the video frames or frame times are recorded.

YouTube Test Actions

– Open YouTube
– Search for “Nyan Cat”
– Open video
– Minimize video
– Scroll
– Search for “xda-developers”
– Open xda-developers channel
– Go through videos
– Scroll
– Go through playlists
– Go through channels
– Go back to playlists
– Go back to videos
– Go back to channel home
– Go back to search screen
– Go back to YouTube homescreen
– Swipe video away

Once again, we see the Galaxy Note 8 having a higher jank percentage than the Pixel XL, but less than the OnePlus 5, with the OnePlus 5 having the highest average frame time. This isn't the most intensive test, but YouTube is a very popular application and one that I personally use for hours every day on any given phone. I was very pleased with the Galaxy Note 8's in-app performance in all of these compound tests (and IRL), and while performance isn't quite on the level of the Pixel XL, it's quite literally better than what I've been used to as far as smoothness specifically is concerned, and much, much better than the stuttery mess I encountered in many instances with my Galaxy Note 7 units (both the Snapdragon and the Exynos variants). This test has plenty of navigation, horizontal swiping and thumbnail loading, so I was pleased to see that these devices performed rather well, and that the Galaxy Note 8 did such a good job.

Finally, the last composite test I ran was through the Gmail application, three times on the same Google account under the same network.

Gmail Test Actions

– Open Gmail
– Scroll twice
– Open Side Panel
– Open “Spam”
– Scroll down
– Scroll up
– Open spam email
– Go back
– Go back
– Compose email
– Go back
– Open side panel
– Scroll side panel
– Open Settings
– Open General Settings
– Go back
– Go back

There's not much to do in Gmail other than scrolling through lists, opening emails and composing new messages — that's pretty much what this test replicates, as it navigates through various Gmail interfaces, loading various activities with long and diverse animations. This is probably why it's one of the worst performing tests, as it only includes a small section of scrolling near the beginning. As we saw in previous tests, scrolling through a Gmail list also proved bumpier than the Play Store app list (which surprised me) on these devices and with these swipes. In the end and like clockwork, we see the same order: the Pixel XL is first, the Galaxy Note 8 janks a tad more, and the OnePlus 5 a bit more than the Note 8. All in all, this last test reinforces the general conclusion I arrived at after spending a week carefully obsessing over the Galaxy Note 8: in-app performance is now really nothing to complain about, and a far cry from the old TouchWiz we trashed so vigorously over the years.

Sadly, in-app performance isn't the entire story (though it's a huge part of what we look for in a phone). The unfortunate reality is that the Galaxy Note 8 still features a number of micro-stutters outside of third-party applications, many of which were more difficult to capture with our tool. For example, take the graphs above, which show opening and closing (swiping) the app drawer in the homescreen multiple times (TouchWiz Launcher, OxygenOS launcher, Pixel Launcher). The Galaxy Note 8 will almost always drop frames whenever the drawer is summoned by swiping up, and the first jank upon any new start of the launcher is always dramatically bad, shooting way past the 50ms limit in our graph and being a lot more jarring and noticeable than subsequent swipes. Other areas of the user interface have shown similar stutters, including the occasional stutter when unlocking the phone and the dropped frames when accessing the homescreen's left-most panel, which now houses Bixby. Samsung's left-most homescreen has had spectacularly choppy transitions for several years now, and while it's better than the Flipboard mess of previous years, it now manages to embarrass Samsung's own first-party assistant service. But of course, this is just a launcher — swap it for something better, and the problem's done with in this instance. The few stutters that remain in system animations when transitioning across applications aren't much of a problem at this stage, and in my experience they are so minimal compared to previous years that I don't think many people will be bothered by them.


App launching times & Multi-tasking

Methodology: We measured cold-start launch-time performance of Gmail, the Play Store and YouTube on the Pixel XL, OnePlus 5 and Galaxy Note 8. Keep in mind that we are not measuring the time it takes for an app to be fully rendered with all its elements drawn on screen. Rather, we are using a proxy by recording the time it takes for the app to create the main activity of the application. The time measure we include encompasses launching the application process, initializing its objects, creating and initializing the activity, inflating the activity's layout and drawing the application for the first time. It ignores inline processes that do not prevent the initial display of the application, which in turn means the recorded time is not affected by extraneous variables such as network speed fetching burdensome assets. Also keep in mind that the phones tested immediately top up their CPU frequencies whenever an application is launched, minimizing CPU bottlenecks. We cycled through the three applications, and opened each of them 150 times, to look at how these phones' app-launching capabilities perform over time. This sustained workload is unconventional – we pushed the phones way past the limits you would encounter in real-world use – but even then, only one device was afflicted by severe performance degradation.
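
While our own measurements hooked the activity-creation proxy described above, Android's activity manager exposes a close equivalent out of the box: `am start -W` blocks until the launched activity draws its first frame and reports a TotalTime. The sketch below uses that as a stand-in; the component names are examples that vary by app version, so verify them on your own device.

```python
import re
import subprocess
import time

# Example launcher activities; these vary by app version, so verify locally.
# Note the escaped $ so the device shell doesn't expand it as a variable.
APPS = {
    "Gmail": "com.google.android.gm/.ConversationListActivityGmail",
    "Play Store": "com.android.vending/.AssetBrowserActivity",
    "YouTube": "com.google.android.youtube/.app.honeycomb.Shell\\$HomeActivity",
}

def cold_start_ms(component: str) -> int:
    """Force-stop the app, then time a cold start with `am start -W`."""
    package = component.split("/")[0]
    subprocess.run(["adb", "shell", "am", "force-stop", package], check=True)
    time.sleep(1)  # let the process die so the next launch is genuinely cold
    out = subprocess.run(
        ["adb", "shell", "am", "start", "-W", "-n", component],
        capture_output=True, text=True, check=True,
    ).stdout
    # TotalTime: milliseconds from launch request to the first frame drawn.
    return int(re.search(r"TotalTime:\s*(\d+)", out).group(1))

for round_number in range(150):
    for name, component in APPS.items():
        print(round_number, name, cold_start_ms(component), "ms")
```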

In this case, the OnePlus 5 takes the crown, featuring the fastest app opening speeds — even if these numbers don’t include finalizing all assets, it’s still a very solid predictor that closely matches our perceived real-world experience. The OnePlus 5 is a champion at opening apps with incredible stability even as temperatures rise throughout this 15 minute test of intense app-opening, which keeps the CPU at its higher frequencies for an unreasonable amount of time. While the OnePlus 5’s performance cluster frequency doesn’t climb past 2.36GHz in most usage scenarios (including the tests shown above), it does shoot to 2.5GHz (and thus higher than the Note 8’s 2.36GHz) when opening applications. The Galaxy Note 8 is the subject of this analysis, though, and it performs almost as well as the OnePlus 5, with slightly higher opening times in every application and the same overall pattern, with near-identical performance gaps for Gmail and the Play Store and almost the same gap between those two and the YouTube app. We included the Pixel XL given we had included it in every other test, though it’s no match for the Note 8 and OnePlus 5. Its slower storage (a factor in this instance) and now-outdated Snapdragon 821 were unable to achieve the same opening times, with significantly worse throttling throughout the test.

But app opening speeds do not tell the full story; another key aspect of an excellent Android experience is multi-tasking. While we attempted to measure app opening speeds from cold-starts, under most conditions we see what we call warm (and lukewarm) starts. And having an additional 2GB of RAM goes a long way in making the Galaxy Note 8 a much better performer when it comes to juggling applications, with this increase being one of the largest functional bumps in RAM Samsung has brought forth. Many videos and hands-on speed tests or comparisons have already shown the Galaxy Note 8 to have much better app-holding capabilities than the 4GB Galaxy S8 and S8+, and while we don't always agree with such tests we echo the same opinion — the Note 8 is consistently better at holding up to twice as many applications as the Galaxy Note 7, and even the S8 when we factor in bigger games or heavier applications. I've personally been very critical of Samsung's RAM management in the past, and while it's obvious that a 6GB device would not (and really should not) have such issues in today's app environment, I am still very happy to see the problem addressed (in one way or another). The Note 8 hasn't disappointed me with its ability to keep applications in memory, and that really helps enhance the overall real-world performance of this device.


Note 8 Real-World Performance & Final Thoughts

As my XDA bio has stated for the past two years, I am very obsessive when it comes to performance (and button tactile feedback, but that’s a story for another day). At the same time, long-time readers might be aware that I’ve also been a fan of the Galaxy Note line-up ever since the Galaxy Note 3 won me over — I have owned every Note device since, in some cases multiple variants of each. But despite the fact that I kept coming back to the Note line, I was always aware that I was sacrificing an extremely important aspect of the user experience which I sought to fix through modifications, custom ROMs and kernels, and a lean and laid back approach to applications. Having to restrain my user experience in what’s otherwise the quintessential power usage device is a contradiction that never escaped me, yet it was something I had to live with device after device in order to make use of the valuable S-Pen and excellent multi-tasking capabilities (which other devices severely lacked back then).

When we wrote our assessment of the Galaxy Note 7’s real-world performance, we faced a salvo of criticism (much of it being the result of everlasting fanboy wars). This time around, we sought to show you our findings through a data-driven approach that attempts to capture and quantify how these devices perform on equal footing, under the exact same workload, with the same starting parameters. We’ve validated these results and have run these tests multiple times, with minimal variance (the Pixel XL, in particular, seems to operate in a perfect Newtonian clockwork universe here). But as confident as we are about the Galaxy Note 8’s real-world performance today, we’ve only provided a snapshot in time against a specific set of devices, and Samsung phones in particular are notorious for “slowing down” over weeks and months. Most reviewers, for example, initially sing praise towards Samsung for finally fixing TouchWiz’s performance woes, only to begin walking back their evaluations weeks later, or in re-reviews of Samsung devices. This has happened time and time again, and while we never were particularly fond of Samsung’s real-world performance out of the box, we certainly can’t guarantee the Galaxy Note 8 won’t regress with age either.

Yet at the same time, we can’t help but be surprised that the Note 8, so many years later, manages to minimize one of the series’ most criticized aspects. We were really critical of Samsung and the Note 7 last year, but so far there is no indication that the Galaxy Note 8 suffers from the same lockups, jarring animations, reproducible stutters and general lack of polish that left us disappointed in 2016. Past the examples provided in this article, I have personally had no complaints with my day to day usage of the device, and I have only noticed one small keyboard lockup whilst updating applications, which arguably excuses it. Other than that, though, it has been extremely serviceable with fast and fluid in-app performance, much better responsiveness than last year, and night-and-day improvements in app-holding capabilities. All of this allows the Note 8 to shine brighter as a power user phone than its predecessors, perhaps not in relative terms (given phones in general have gotten much better) but it is certainly everything I wished previous Note devices were in this regard. But do not be mistaken — it is still not perfect. It is still below the bar put forth by devices like the Pixel XL, and its biggest weak points remain precisely in those areas of the phone where Samsung is most involved, namely the system UI and some stock applications. It’s also nowhere near as snappy as the OnePlus 5, which combines an aggressive approach to performance with zippy animations to deliver an extremely responsive UX. It’s much harder for us to accurately measure the time it takes for actions within applications to be carried out, but I am confident that the OnePlus 5 would be a clear winner in this area.

That being said, I am very happy with the Note 8 so far. It is a compelling powerhouse of a phone, and while it might have priced itself a bit north of what I would have hoped, it does offer tremendous value in a market that still hesitates to challenge Samsung in the niche it has held for so many years — productivity devices with stylus support. My only complaint is battery life, which has been the worst I've had on any Note device in recent memory. This is puzzling to me, given I don't feel a substantial improvement (if any) over the Note 5, which had a smaller 3,000mAh battery, an older processor and a default resolution of 1440p for its display settings. If it wasn't for fast charging, as well as fast wireless charging, this would be unacceptable… and I often wonder whether we traded away some of that endurance for either more features most don't need, or the performance this very article praises. However, that is a topic for another day. As it stands, the Galaxy Note 8 has proved to us that Samsung no longer delivers embarrassing real-world performance. I'm quite satisfied with the phone's performance out of the box, and I really hope it can last this way — we'll be sure to let you know if that's not the case, but for now, I can say the Note 8 hasn't disappointed us.


What do you think of newer Galaxy devices and the modern TouchWiz/Samsung Experience? Sound off in the comments!

The 4 Biggest iPhone X Unknowns, Including Face ID’s Performance

There are plenty of good reasons to think the iPhone X will at least be Apple Inc.’s (AAPL) biggest hit since the iPhone 6. In terms of looks, it’s hands-down Apple’s most stunning iPhone to date. And for anyone looking to upgrade from an iPhone 6S or older phone — that’s most of the addressable market, given the pace of smartphone upgrade rates — it delivers big improvements in display and camera quality, traditionally the two biggest drivers of smartphone upgrades. There’s also a major processing power boost, improved battery life and a few neat augmented reality features.

But there are also a few big remaining question marks for Apple’s latest flagship phone. Some have gotten a decent amount of attention, others much less so. They go as follows:

1. How quickly supply constraints will lift.

Widespread reports of iPhone X production challenges have dinged Apple shares over the past two weeks. The Wall Street Journal reports major challenges related to assembling the infrared dot projector modules (codenamed Romeo) that make up part of the 3D face-mapping system used to enable the X's Face ID face-unlocking feature. Japan's Nikkei also reports limited production for 3D-sensing parts, and cites one source as saying the X is "being produced in small quantities, around tens of thousands daily." That fits with what KGI Securities' Ming-Chi Kuo previously suggested.

Raymond James, Rosenblatt Securities, Digitimes and others have also reported production issues. Raymond James indicates iPhone X mass-production is due to start in mid-October, about two months later than was expected at the end of June; Digitimes reports orders to component suppliers have been slashed.

Major supply constraints on Oct. 27 — the day iPhone X pre-orders start — won’t necessarily do massive damage, particularly given how much pent-up interest there is in the X. But if major constraints persist into December and consumers start wondering whether iPhone X orders will ship in time for their purchases to be unwrapped on Christmas morning, some of those consumers just might start exploring their options.

2. How reliably Face ID will perform.

Face ID, which relies on the front camera, a dot projector, an infrared camera and (at night) an infrared flood illuminator, is easily much more advanced than the face-unlocking features built to date into Android phones. Apple insists one can’t trick Face ID with a user’s photo, nor can it work when a user’s eyes are closed. The company also promises Face ID can work if a user is wearing a hat or glasses, or if he or she attempts to unlock a phone at an angle.

But it isn’t enough for Face ID to be better than face-unlocking alternatives. It also needs to be as quick and reliable as the Touch ID fingerprint sensors built into iPhones and iPads in recent years, given that the iPhone X lacks a Touch ID sensor (the company reportedly encountered issues in its attempts to put one underneath the X’s OLED display). That’s plausible given all of the technology Face ID relies on, but it’s also a high bar.

Apple has had plenty of controversies over the years about unexpected problems (sometimes exaggerated in scope) with new iPhones. The 2012 Apple Maps debacle is probably the biggest of the bunch, but we've also seen things like bendgate and antennagate. Hopefully "facegate" or something to that effect won't be added to the lexicon of Apple customers and investors.

3. How much steep overseas prices will act as a deterrent.

With its 64GB model going for $999 and its 256GB model for $1,149, the iPhone X's pricing already pushes the envelope in the U.S. But things are often much worse overseas, due to steep sales taxes and/or tariffs.

The 64GB model sells for the U.S. equivalent of $1,280 in China, $1,325 in the U.K., $1,390 in India and over $1,300 in many eurozone markets. More modest price deltas can be found for Canada and Japan.

To be fair, U.S. sales taxes will tack on about $60 or $80 to iPhone X prices for many stateside consumers. But on the whole, the phone is still meaningfully cheaper in the U.S. than in most overseas markets.

Thanks partly to installment plan adoption, high-end phone buyers have been getting more comfortable paying over $800 to get their hands on flagship devices — a fact reflected not only by the iPhone X’s pricing, but that of phones such as Samsung’s Galaxy Note 8, LG’s V30 and (reportedly) Alphabet Inc./Google’s (GOOGL) Pixel 2 and Pixel 2 XL. But the laws of price elasticity still apply to this market, and Tim Cook’s company seems to be testing them in certain foreign locales.

4. Whether the iPhone X’s design and user interface changes will cause headaches.

This is unlikely to be a major problem, but it could annoy some early iPhone X buyers and produce some negative press. Since it didn’t provide the iPhone X with a home button (or the fingerprint sensor normally built into one) in order to pack an edge-to-edge display, and since the phone also doesn’t support the touch-based buttons (softkeys) found at the bottom of Android phone displays, Apple requires X owners to swipe up from the bottom of the display to access the home screen, and to swipe up halfway to switch between open apps. And users have to swipe down from the top right-hand corner to access iOS 11’s Control Center.

That’s going to take some getting used to for many iPhone owners. And in the short-term, the swipe-up gestures will likely lead to accidental presses of controls placed at the bottom of iOS apps. Apple has issued new app design guidelines meant to address this issue, but some developers are bound to adopt them sooner than others.

Likewise, the notch protruding into the top of the X’s display — it houses the front camera and 3D-sensing modules — could obscure part of a video or game’s imagery, as well as portions of other content viewed in landscape mode. Once more, developers should eventually fix this issue, but some complaints are likely to pop up in the interim.

AMD, Nvidia shares drop after Intel unveils new chip’s stellar gaming performance

Intel shareholders’ lackluster year compared with its chip peers may be turning around with the launch of its new gaming-focused processors.

The chip maker unveiled its latest Core desktop processors on Monday, proclaiming up to 25 percent frame-rate improvements for PC gaming versus the previous models.

“Our 8th Gen Intel Core desktop processors deliver tremendous improvements across the board and — for gamers, in particular — offer an unbeatable experience,” Anand Srivatsa, general manager of Desktop Platform Group at Intel, said in the announcement.

“We are laser-focused on giving the enthusiast community the ultimate desktop experience, from chart-topping performance to a platform that can flex with their needs.”

The processors will be available for sale on Oct. 5.

Intel shares outperformed midday Monday, falling just 0.3 percent compared with the iShares PHLX Semiconductor ETF's 2 percent drop, a 3.8 percent fall for Nvidia shares and a 3 percent decline for AMD stock.

One technology industry analyst said the market reaction may mean Intel’s latest product offering could be bad news for AMD, which also makes desktop processors.

“I believe the [market] sentiment is driven primarily by belief that AMD poses less of a threat [to Intel],” Patrick Moorhead, principal analyst at Moor Insights & Strategy, wrote in an email.

At first blush, it may seem Intel's processors are not in direct competition with Nvidia's graphics cards. However, if enthusiasts can improve gaming performance with a new desktop processor, it could take budget dollars away from graphics card upgrades.

Nvidia’s stock is up nearly 165 percent in the past 12 months through midday Monday compared with the S&P 500’s 15 percent gain. That performance ranks No. 1 in the entire S&P 500, according to FactSet. AMD and Intel shares are up 97 percent and roughly flat respectively in the same time period.

AMD and Nvidia did not immediately respond to requests for comment. Intel referred to the statement by its executive in the product announcement for its comment.

Apple Tunes Up Apps, Performance of macOS High Sierra

1 of 12

macOS High Sierra Low on Big New Advances, But Has Valuable Features

Now that iOS 11 has been pushed out to iPhones and iPads, Apple is turning the spotlight to the desktop with macOS High Sierra. The operating system, which was introduced at Apple's Worldwide Developers Conference in June, is a relatively minor upgrade over last year's macOS Sierra, but it includes some important improvements, such as a new Apple File System for better performance. Users also will find tweaks to built-in apps, as well as improved graphics and video support. There are enough improvements to make working with Mac desktops and laptops a more productive, even pleasant, experience after it's released Sept. 25. Take a look at this eWEEK slide show to learn more about High Sierra's new features.

2 of 12

The Apple File System Arrives

Apple is revamping the file system in macOS High Sierra, for the first time in decades. The new Apple File System uses a 64-bit architecture and is far more responsive. It also has built-in encryption and other tools to boost the platform’s security. Apple File System should improve the macOS user experience dramatically.

3 of 12

High Sierra Supports a New Video Standard

Apple is moving to a new video standard in macOS High Sierra called High Efficiency Video Coding (HEVC), or H.265. According to the company, HEVC can compress videos up to 40 percent more effectively than H.264, with image quality far better than the older technology.
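
Apple's own pipeline uses hardware-accelerated VideoToolbox, but the compression claim is easy to explore with the open-source x265 encoder, as a rough illustration rather than Apple's implementation. The ffmpeg flags below are standard options, and the file names are placeholders.

```python
import os
import subprocess

SRC, DST = "clip_h264.mp4", "clip_hevc.mp4"  # placeholder file names

# Re-encode with the open-source x265 encoder; -tag:v hvc1 keeps the file
# recognizable to Apple software, and -c:a copy leaves the audio untouched.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx265", "-crf", "28",
     "-tag:v", "hvc1", "-c:a", "copy", DST],
    check=True,
)

ratio = os.path.getsize(DST) / os.path.getsize(SRC)
print(f"HEVC file is {100 * (1 - ratio):.0f}% smaller at comparable quality")
```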

4 of 12

There Are Significant Improvements in Graphics Performance

To expand its graphics appeal, High Sierra works with the Metal 2 graphics processing system, which significantly advances how the operating system leverages the GPU for optimal performance. Metal 2 also utilizes machine learning and virtual reality to improve app performance across video games, creative apps and more.

5 of 12

There Are Improved Tools for Image Editing

Apple’s Photos app has received the biggest update of any built-in app in High Sierra. Photos now includes a variety of photo-editing tools and filters, and organizing photos has become much easier. It’s also easier to transfer images from inside Photos to third-party apps such as Photoshop.

6 of 12

Better Privacy in Safari

Apple has bolstered several privacy features in Safari. This includes a new Intelligent Tracking Prevention feature in the browser to stop advertisers from tracking user web browsing activity. The new Safari also stops videos from auto-playing. Advertisers have cried foul over the features, but Apple says they're important to improving privacy and security.

7 of 12

Siri Keeps Improving

Each year, Apple improves Siri, and the same is true in macOS High Sierra. Siri now can learn user music preferences and provide audio recommendations. It also has been updated with a more natural voice that sounds far more like a human. Finally, Siri should be able to understand and respond to more queries.

8 of 12

Spotlight Gets Smarter

The Spotlight search feature in macOS High Sierra now provides access to more than just files and folders on the computer. Spotlight also can find information on the web, and understand contextual queries and respond with relevant results. If users input a flight number, for instance, Spotlight will display arrival and departure times and terminal and gate information.

9 of 12

There's a Small But Important Mail Boost

Apple’s Mail will look the same in High Sierra, but the app’s search features have received a big improvement. It now does a better job of understanding a search query and delivering so-called “Top Hits” that it believes are most relevant. The Mail search also learns over time, so the more a user searches, the better it understands.

10 of 12

Apple Notes Supports Tables

It might seem like a small change, but Apple’s decision to add table support to Notes is an important addition. Apple Notes also can be used to pin important notes, such as grocery lists and meeting agendas. Look for Notes to keep users far more organized than in the past.

11 of 12

Apple Has Paid Attention to Backward Compatibility

To its credit, Apple has done a good job of ensuring macOS High Sierra works with most Macs in the wild now. High Sierra will work with all Macs unveiled in mid-2010 or later, as well as MacBooks and iMacs introduced in late 2009. Apple is planning to make High Sierra available Sept. 25 as a free download from the Mac App Store.


Xbox One X setup: Here’s what you’ll need to get the best performance

The new Xbox One X can deal in 4K and HDR video and produce Dolby Atmos and DTS:X sound. It’s a potential AV powerhouse, but only if you set it up properly. Here’s how to maximise its audio and video to help you create the best-ever gaming experience at home – and it could even give you a competitive advantage. 

How to maximize the sound quality of an Xbox One X 

Forget surround sound – all hail the new era of object-based audio and verticality. 

Games will soon ship with soundtracks specially encoded in Dolby Atmos and/or DTS:X, which stretch the surround sound concept to new levels – literally – with audio also coming from above. And, of course, the Xbox One X can handle both of those new formats. 

Is there a helicopter up there? Or a sniper? Only those with the correct audio gear will know for sure. Whether you go for a soundbar, AVR and a home cinema, or a pair of headphones, audio immersion awaits.  

The Yamaha YSP-5600SW is a soundbar with built-in Atmos functionality

Soundbars & all-in-one systems

Since Dolby Atmos is all about channels at different heights as well as directions, the idea of boiling that concept down into a soundbar seems like a classic case of space-saving convenience over core quality. However, there are some clever products out there. The best so far is the Yamaha YSP-5600SW, a Dolby Atmos-enabled (and, soon, DTS:X) sound projector that uses 46 speakers to create a 7.1.2 system. In what's fast becoming a standard feature on flagship soundbars, other spatial-sound-compatible examples include the LG SJ9, Pioneer Elite FS-EB70, Onkyo SBT-A500, Samsung HW-K950 and Sony HT-ST5000.

If you want to go beyond a soundbar to ensure you actually get height channels by physically installing them above your display, go for an all-in-one home cinema package like the Onkyo HT-S5805 (though in this category there are surprisingly few to choose from).

The Denon AVR-X2400H is a receiver capable of outputting Dolby Atmos to a set of speakers.

AVRs

If you go for separates, you need to find a Dolby Atmos- and DTS:X-ready AV receiver to put at the centre of your system, and then add speakers. All the usual AVR brands are on board with Dolby Atmos, from mid-range to flagship models. So you can go for something like the Denon AVR-X6300H or Onkyo TX-RZ3100 for a mind-bending (and very expensive) 7.2.4-channel home cinema system. Or you can head down the ranges and spec the Sony STR-DN1080, Onkyo TX-SR444 or Denon AVR-X2400H to create a 5.1.2 system.

The Definitive Technology BP9080x has an integrated ‘height module’ that adds verticality to your surround sound.

Elevated speakers

You could then use your existing speakers and add two front height channels, using any two satellite speakers (or specialist Atmos products like the KEF R50, Klipsch RP-140SA or Onkyo SKH-410) installed in an elevated position on the wall. Or you could consider installing two in-ceiling speakers such as the Monitor Audio CT165 or Polk Audio V60.

However, there are also some nice ‘bipolar’ tower speakers around that combine upward-firing drivers with normal front-firing ones, such as the Pioneer S-FS73A, Klipsch RP-280FA and Definitive Technology BP9080x.

Headphones can provide a cheap alternative to expensive Atmos speakers thanks to Dolby’s virtualisation technology.

Object-based headphones

Not many people have the space or the budget to build an enormous Dolby Atmos-compatible home cinema. Cue Dolby Atmos for Headphones: a more personal, more affordable and possibly the most effective implementation of Dolby’s object-based sound format. It’s all about the placement of audio around you, and the good news is that you can use any pair of headphones.

The catch is that you’ll have to pay extra for an Atmos license to unlock the functionality. 

If that sounds like too much effort and you’re in the market for a new headset anyway, Plantronics has an exclusive Atmos partnership, which means you get an Atmos license in the box alongside its RIG 400LX, RIG 600LX and wireless RIG 800LX headphones.

But if you’ve already got a nice headset that you like to use, then we’d recommend just paying for the Atmos license. 

The Xbox One X will produce 4K HDR video.

How to optimise the video output of an Xbox One X

When it comes to picture quality, the Xbox One X is all about 4K and HDR. The two new cutting-edge video features are worth preparing for, even though not all games will include both. 

4K, also called Ultra HD, is a 3840×2160-pixel resolution, which amounts to four times as many pixels as Full HD’s 1920×1080 (around 8.3 million versus 2.1 million).

HDR, meanwhile, is all about a massively increased range of brightness and colour, and it increasingly goes hand-in-hand with 4K on modern TVs and home cinema projectors. It’s now almost impossible to buy a 4K TV that isn’t compatible with HDR.

That said, a lot of cheaper sets claim to be HDR without meeting the full HDR spec. Make sure your TV can hit a peak brightness of 1,000 nits if it’s LCD (the requirement for OLED is a more moderate 540 nits), and also check that it supports 10-bit colour. These two features mean it’s properly specced for HDR10, which is currently the dominant HDR technology.

A second, more advanced, HDR specification called Dolby Vision is also available on more premium TVs, but since the Xbox One X doesn’t support the standard you won’t see any benefit with the console. 

Equally important when choosing a display to get the best out of an Xbox One X is finding one with as little input lag as possible; the sweet spot is around 10ms.

LG’s W7 OLED TV handles Dolby Atmos.

4K HDR TVs

Since the Xbox One X outputs at 4K resolution, you should buy a 4K display. You still have to choose between OLED and LED (which includes QLED) when it comes to display technology, and you should also try to find a set with the lowest input lag.

None of these choices are easy to make. OLED TVs have unbeatable contrast ratio and black levels, but they tend to have slightly more input lag than LED TVs. 

Samsung’s TVs tend to be rated well for minimal input lag – it’s an area the company has clearly concentrated on – making screens like the Samsung QE65Q9FAM strong candidates.

However, if money is no barrier, LG’s W7 OLED TV is unique in that it deals in Dolby Atmos via its included 5.0.2 soundbar. It’s also vastly improved in the input lag stakes. 

Sony’s VW285ES projects in 4K.

If you want to take advantage of the 4K resolution the Xbox One X is capable of spitting out, then you’re going to need to maximise the size of the display area. And unless you’re able to afford a 75-inch TV, a projector is your best bet.

4K projectors are still expensive, but Sony’s upcoming VW285ES – due in November – effectively halves the cost of native 4K projection (it will sell for around $4,999). It supports HDR, as do the more affordable JVC DLA-X5000 and Epson EH-TW7300, though both of those upscale to 4K rather than producing it natively. So, for now, 4K projection is still a rich man’s hobby.

Full HD televisions still benefit

The advice above should help you get the most out of the Xbox One X, but it’s worth noting that even if you use a standard Full HD set with the new console, you’ll still see a benefit thanks to the way the console ‘super-samples’ the extra 4K detail down to 1080p.

But if you want to squeeze every bit of performance from the new machine then it might be time to take a look at your entertainment center and work out if anything needs to be upgraded. 

Speed, Thermal & Performance Comparison of Fast Charging Standards

OnePlus Dash Charge Takes the Crown

One of the most common complaints from smartphone users is that their phones never last through the whole day. Despite all the advances in smartphones in recent years, including quick charging solutions like Quick Charge, Dash Charge and SuperCharge, batteries feel like they have not evolved quickly enough to keep up with our needs.

Part of the blame falls on OEMs, who do work toward making our smartphones more efficient year after year. But on the flip side, the increasing efficiency of our smartphones is seen as the perfect excuse to thin down our phones by yet another millimeter. And to retain the practicality of the phone, advances in the field of charging are advertised as a key feature of the device. So what if your phone dies after six hours of standby? Now you can get a day’s power in half an hour, or some other slogan.

Choice, one of Android’s strongest selling points, also ends up confusing users when it comes to charging standards. There are multiple charging solutions available across Android flagships, each with its own positive and negative attributes, intricacies and particularities. Some charging solutions are quick, some are efficient, and some aren’t quite as great as one would expect.

In this article, we will take a look at the performance and efficiency of some popular charging standards, namely Huawei’s SuperCharge, USB Power Delivery, OnePlus’s Dash Charge, Samsung’s Adaptive Fast Charging, and Qualcomm’s Quick Charge 3.0.

Index

Conclusion
OnePlus Dash Charge
Huawei SuperCharge
Quick Charge 3.0
Adaptive Fast Charging
USB Power Delivery


Current Winner 9/16/2017

Offering an excellent balance between speed and stability, Dash Charge surprised us with its ability to charge a phone quickly and painlessly. Its custom charging adapter and signature red cable allow newer OnePlus devices to remain cool while charging, without sacrificing on-device performance or charging rates. This means you can keep using your device while it’s getting topped up, and keep on messaging, browsing the web or even playing a game. Dash Charge cannot offer wide compatibility or a diverse set of charger options, but in the end it provides an excellent charging solution that does not get in the way of the user experience.


Methodology

The data we collected involved the use of a script that automatically measured key charging parameters (as reported by Android) and dumped them into a data file for us to analyze. All charging standards were tested with their stock charging adapter and cable to ensure the data is representative of what we can expect from each standard. All data collection began with the battery at 5% and ended with the battery at 95%. To test thermal performance and charging speeds during screen-on use cases, the script looped PCMark tests while the phone was charging to simulate a real-world usage environment; temperature readings were gathered from the OS, not measured externally. For the sake of clarity in this presentation, averaged data was rounded off while preparing the graphs.
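To give a sense of what such a logger involves, here is a minimal sketch in the same spirit — our own illustration, not the actual tool used for this article. It polls the battery values Android exposes through the kernel’s power_supply interface over adb; the node names and units are assumptions and vary between devices.

```python
#!/usr/bin/env python3
# Minimal charging-logger sketch: polls Android-reported battery stats over
# adb and appends them to a CSV. Node names and units vary by device
# (assumption), so check /sys/class/power_supply/ on your own hardware.
import csv
import subprocess
import time

def read_battery(node):
    # Read one value from the kernel's power_supply class via adb
    out = subprocess.run(
        ["adb", "shell", "cat", f"/sys/class/power_supply/battery/{node}"],
        capture_output=True, text=True)
    return out.stdout.strip()

with open("charge_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "capacity_pct", "voltage_uV",
                     "current_uA", "temp_decideg_C"])
    start = time.time()
    while True:
        row = [round(time.time() - start, 1),
               read_battery("capacity"),     # battery percentage
               read_battery("voltage_now"),  # typically microvolts
               read_battery("current_now"),  # typically microamps
               read_battery("temp")]         # tenths of a degree Celsius
        writer.writerow(row)
        f.flush()
        if int(row[1]) >= 95:                # stop at 95%, as in these tests
            break
        time.sleep(10)                       # poll every 10 seconds
```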


Quickest Charging Standard

When we measured the charging times of these popular charging solutions, we reached a surprising conclusion: USB Power Delivery was the slowest of all the fast charging solutions we tested, at least as implemented on the Pixel XL. This is surprising because USB Power Delivery is the “standard” pushed forth by the USB-IF standards body, and the one that Google strongly encourages as well — it will make more sense once we look at each standard’s workings further down this article.

USB Power Delivery has been implemented in the Google Pixel and Google Pixel XL. The smaller Google Pixel is marketed as capable of 15W-18W charging, while the bigger Google Pixel XL is capable of 18W charging. As we noted in our Google Pixel XL review, actual charge times on the device were not competitive, ending up in last place when compared with other solutions, and our extensive testing of charging times for this comparison reveals the same. Below you can see the charging time of each standard from 5% to 80% when scaling the battery capacity across test devices to 3,000mAh — this scaling does not represent how each standard would charge such a battery with perfect accuracy, so the graph should be used to get an approximate idea of how they compare.
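As an illustration of what that scaling means (our own sketch; the exact normalization behind the graph isn’t published), converting a percentage of one battery into the equivalent percentage of a common 3,000mAh pack is a simple proportion:

```python
# Hypothetical capacity normalization: re-express charge measured on one
# battery as a percentage of a common 3,000mAh pack.
def scale_percent(percent, capacity_mah, common_mah=3000):
    mah_added = percent / 100 * capacity_mah   # percent -> actual charge (mAh)
    return mah_added / common_mah * 100        # charge -> common-pack percent

# 50% of the Mate 9's 4,000mAh battery holds as much charge as ~66.7% of a
# 3,000mAh pack:
print(scale_percent(50, 4000))  # -> 66.66...
```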

When we look at which device charged the fastest, the quickest charging solution we tested is OnePlus’s Dash Charge, which on the OnePlus 3 ends up quicker than its competitors by about 10 minutes in the end (before adjusting for battery capacity), and by a good half hour against USB Power Delivery. On the flip side, Dash Charge is proprietary technology, which adds its own set of complications that we will discuss later in this article. Dash Charge does end up behind Huawei SuperCharge once we take into account, and adjust for, the battery capacity of each device, as the Huawei Mate 9 has a substantially larger battery than the OnePlus 3. While SuperCharge achieves a faster peak charging rate, the Huawei Mate 9 does not reach 95% charge the earliest because of its larger battery capacity. So while the OnePlus 3 tops up faster in terms of reaching the higher percentages of its battery capacity, the Mate 9 is actually adding more charge per unit of time (a function of Huawei’s higher power delivery output).

Huawei SuperCharge and Qualcomm Quick Charge 3.0 performed similarly, while Samsung’s Adaptive Fast Charging had less of an initial speed advantage but still reached the 95% goal in close competition with the other two.

We also have temperature data to go alongside the charging times. This graph covers the same charging sessions as the one above, but has been separated to keep things simple, uncluttered and easy to understand.

We were unable to finely control the starting temperatures of our test devices because of the varying conditions in the different locations where they were tested, so the focus should be on consistency and stability rather than the absolute highs and lows of each data set. Battery temperature was obtained from Android’s low-level system records.

The most thermally consistent of the lot is Samsung’s Adaptive Fast Charging, which maintains a good hold on device temperature throughout the entire session. Qualcomm’s Quick Charge 3.0 was the “coolest”, though again, we would need better-controlled initial conditions with identical starting points and minimal extraneous variables to crown it king. Similarly, we cannot call USB Power Delivery the “hottest”, but it definitely displays the widest range of temperatures. It’s also worth noting that most of these devices cool down once their charging rate begins to slow, and USB-PD does a good job of managing temperature past its peak.
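One rough way to quantify that consistency is to compare each session’s temperature spread and standard deviation rather than its absolute peaks; here is a sketch, assuming logs in the CSV format of the script above, with hypothetical filenames:

```python
# Compare temperature spread per standard instead of absolute highs/lows,
# since starting temperatures were not controlled. Filenames are hypothetical.
import pandas as pd

for name in ["dash_charge", "supercharge", "quick_charge_30",
             "adaptive_fast_charging", "usb_pd"]:
    df = pd.read_csv(f"{name}_idle.csv")
    temp_c = df["temp_decideg_C"] / 10   # Android reports tenths of a degree
    print(f"{name}: spread {temp_c.max() - temp_c.min():.1f}°C, "
          f"std {temp_c.std():.2f}°C")
```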

The situation changes once the device is subjected to a real-world workload. As stated before, we looped PCMark’s Work 2.0 test to simulate real-world usage while charging these devices, in order to measure how charging times and temperatures differed.

OnePlus’s Dash Charge remains the top performer, primarily because of its implementation, which we’ll detail further down. The voltage and current regulating circuitry is situated in the Dash Charger itself, which leads to lower device temperatures while charging. As a result, Dash Charge’s idle-charging and under-load charging results show very little variation.

On the other hand, Samsung’s Adaptive Fast Charging shows the worst performance when charging under a real-world workload. The device takes about twice as long to charge if it is being used, and the charge level also increases in a peculiarly linear fashion (suggesting voltage and current are held constant) that is not seen in any of our other tests. In fact, according to Samsung’s support page for the Galaxy S6, its Adaptive Fast Charging solution is entirely disabled when the screen is on. Explicit mentions like this could not be found on newer support pages, but Samsung continues to recommend that devices be switched off while using fast charging.

Other standards occupy positions between these extremes, most of them lying on the better side of the scale. Even USB Power Delivery, the worst performer in idle charging, takes only about 10 minutes more to reach the same charge levels under load.

Temperature-wise, Samsung’s Adaptive Fast Charging (if we can call it that under this test) maintains a consistent range of temperatures, staying within a 5°C band. Huawei’s SuperCharge follows next, followed by OnePlus’s Dash Charge. Qualcomm’s Quick Charge 3.0 and USB Power Delivery are the worst performers temperature-wise, with large inconsistencies and variations throughout their cycles.


With the inter-standard comparison out of the way, let’s take a closer look at how each standard performed individually under idle-charging and load-charging scenarios, with a short explanation of why it behaves this way and how it works.


Huawei SuperCharge

Huawei’s SuperCharge is one of the more interesting standards we’ve tested, showing impressive results under most conditions. Unlike traditional high-voltage charging solutions, SuperCharge employs a relatively low-voltage, high-current formula that aims to maximize the current going into the device while minimizing efficiency losses, heat and throttling. Coupled with the Smart Charge protocol, the Mate 9 also adapts its charging parameters to the requirements of the battery as well as the charger supplied (for example, it can make full use of a USB-PD charger). The actual SuperCharge charger is rated for 5V/2A, 4.5V/5A or 5V/4.5A (up to 22.5W in its fast-charging modes) and uses an in-charger chipset to regulate voltage as well — this means there is no additional in-phone voltage transformation, in turn reducing temperature and efficiency losses. Coupled with what Huawei calls “8-layer thermal mechanics” in its design, the Mate 9 promises fast charging speeds at low temperatures. Focusing on current over voltage and going for a less lopsided distribution is similar to the Dash Charge approach, and in many ways OnePlus’s (or rather OPPO’s) solution is similar to Huawei’s SuperCharge.
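The headline wattage follows directly from P = V × I; here is a quick check of the rated modes above (our arithmetic, not Huawei’s documentation):

```python
# Power for each rated SuperCharge mode is simply volts times amps.
modes = {"5V/2A": (5.0, 2.0), "4.5V/5A": (4.5, 5.0), "5V/4.5A": (5.0, 4.5)}
for label, (volts, amps) in modes.items():
    print(f"{label}: {volts * amps:.1f}W")
# 5V/2A: 10.0W; 4.5V/5A: 22.5W; 5V/4.5A: 22.5W -- hence the 22.5W ceiling
```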

Looking at the data we’ve gathered, we see the typical pattern of temperature beginning to fall past the 55% mark, the point at which current begins dropping off as well. Peak current comes close to the charger’s 5A rating, and the nominal 4.5A is sustained throughout the first 20 minutes, or until around 45%. The fastest charging occurs between 5% and 10%, with a linear slope that begins curving at that current drop-off, where voltage levels out after a fast climb from 2V to over 3.5V. Throughout this test, peak temperature hits 38°C, which is significantly hotter than most other standards in this list. However, temperature becomes really important in our “under load” test, where we simulate activity on the device to compare charging speeds. We can clearly see temperature decreasing alongside the current, which doesn’t drop in the clearly-defined steps of other standards in this article, but follows a steady downward trajectory.

In terms of charging speed, Huawei SuperCharge arrives at 90% in about 60 minutes, putting it second in terms of speed behind OnePlus’s Dash Charge. However, the Huawei Mate 9 we tested also has a 4,000mAh battery, which means each percentage point represents more mAh than on any OnePlus device, actually putting the standard in a better light and ahead of OnePlus. There are differences in charging behavior, however, as SuperCharge begins leveling off harder than Dash Charge at the 30-minute mark. Most of these companies advertise how much battery life one can obtain in half an hour, and Huawei’s claims were surpassed in our testing, as the device managed to climb past 60% in that time period.

Under workloads the rate of charging is naturally lower than during idle charging. Instead of a steep drop-off, we see a more relaxed curve that trails off at around 75%. The current and temperature drop-off occurs as the device approaches 60%.


OnePlus Dash Charge

One of the newer champions of fast charging is Dash Charge, which surfaced in 2016 with the OnePlus 3. While the OnePlus 2 charged disappointingly slowly via a regular 2A charger, the OnePlus 3 brought what OnePlus called “exclusive technology [that] sets a new benchmark for fast charging solutions”. As with most marketing statements from OEMs, this is only half true. Dash Charge is actually licensed from OPPO, of which OnePlus is a subsidiary, and mimics its VOOC charging system (Voltage Open Loop Multi-step Constant-Current Charging). While Dash Charge is a much better name, VOOC charging can be found on OPPO devices like the R9 and R11, though in this article we are focusing on Dash Charge as implemented on the OnePlus 3 / 3T and OnePlus 5.

So what’s special about Dash Charge? Not unlike Huawei SuperCharge, it produces a larger electrical current of 4A at 5V, for 20W of power delivery. Rather than increasing voltage, OnePlus opted for a more even distribution with a larger electrical current, meaning more electrical charge delivered per unit of time. This is accomplished via both software and, primarily, hardware: specifically the charger used, which is non-standard (unlike the plethora of QC chargers, for example), so you need a VOOC or Dash charger to make use of these charging speeds.
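To put “more charge per unit of time” in concrete terms, a back-of-envelope count (our arithmetic, assuming a steady 4A and ignoring taper and conversion losses) looks like this:

```python
# Rough coulomb count: a steady 4A for 30 minutes adds about 2,000mAh,
# roughly two-thirds of a 3,000mAh pack in half an hour. Real charging
# tapers near the top, so actual figures land somewhat lower.
current_a = 4.0
minutes = 30
charge_mah = current_a * 1000 * (minutes / 60)  # -> 2000.0 mAh
print(charge_mah / 3000 * 100)                  # -> ~66.7% of a 3,000mAh pack
```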

Much like Huawei’s solution, OnePlus employs dedicated circuitry in the charger itself, and both VOOC and Dash Charge deliver higher amperage thanks to several components of the charger: a microcontroller that monitors charge level; voltage and current regulating circuitry; heat management and dissipation components (which contribute to a 5-point safety check); and a thicker cable that carries the greater current while minimizing power fluctuations. Because the charger converts the high voltage from your wall into the lower voltage the battery requires, most of the heat from this conversion never leaves the charger — in turn, your phone remains cooler. The consistent current going into the phone, coupled with the lower temperature of the handset itself, reduces thermal throttling, which benefits charging speed and consistency as well as the direct user experience.

OnePlus proudly proclaims that Dash Charge can give you “a day of power in half an hour”, which in reality means around 60% of battery capacity in 30 minutes. This is not only extremely fast, but also comes with a few perks. The charging rate is at its highest at those lower percentages, ensuring you can gain a large amount of charge in just a few minutes should you be running low on battery. Moreover, the thermal consistency and lack of throttling are no joke. As the data shows, the difference between under-load charging and regular charging is minimal, meaning you will not notice slowdowns, additional stutter or other throttling side effects while using your device. This is a great plus and, as we’ve noted in a past analysis, it truly means you can play demanding 3D games such as Asphalt 8 while still getting nearly the same charging speed, with the difference explained by the drain incurred by gaming itself.

Dash Charge does have a major disadvantage, and that’s compatibility. The OnePlus 3 and 3T, for example, are not able to fully utilize a USB-PD charger should you not have a Dash Charge cable and charger handy. And you need both the charger and the cable to make Dash Charge work its magic. Unlike with Qualcomm Quick Charge, you won’t find multiple charger offerings and accessories from various suppliers; you are stuck with OnePlus and its stock, which includes regular chargers as well as car chargers (both known to go out of stock at somewhat frequent intervals). You could try getting your hands on a VOOC charger, but that’s arguably more difficult in many markets. There’s also a noticeable and disappointing lack of battery packs supporting Dash Charge speeds, as OnePlus offers none — you could try OPPO’s power bank with an adapter, but that is far from ideal.

If you can look past those inconveniences and incompatibilities, Dash Charge is a clear winner in both speed and consistency. It’s a charging standard that does its job quickly and efficiently, without tying users to a wall outlet for long periods of time, and without hindering their real-world usage while plugged in. The heat reduction could even lead to increased battery longevity. Your phone will remain cool, but your charger will not — so just make sure not to touch it while it’s doing its thing!


Qualcomm Quick Charge 3.0

Qualcomm Quick Charge is by all accounts the most popular charging standard on this list, and for good reason. Its paradigm is different from what we see with OnePlus and Huawei, because most of the magic happens in Qualcomm’s power management IC, the SoC and the algorithms they employ — all of which makes Quick Charge a relatively low-cost solution for OEMs who are already packing a Snapdragon chipset into their smartphones anyway. And while it might not be as impressive as some of the dedicated solutions on this list, the reach of Qualcomm Quick Charge comes with its own set of benefits. We are focusing on Quick Charge 3.0 here, but keep in mind that Quick Charge 4.0 is already available with considerable improvements; the latest revision is also compatible with USB-PD, as strongly recommended by the Android Compatibility Definition Document.

Quick Charge 3.0 has been offered in chipsets including the Snapdragon 820, 620, 618, 617 and 430, and offers backwards compatibility with chargers built for previous Quick Charge standards (meaning you can benefit from a plethora of lower-cost, slower chargers). This is mainly because the power draw is handled entirely on-device, with you only needing to provide a charger capable of supplying the requisite current — and there’s no shortage of Quick Charge-certified chargers, so it shouldn’t be hard to stumble upon one. But again, we should re-emphasize that Quick Charge 3.0 even allows a phone to charge faster or more efficiently than non-Quick Charge devices when using a non-certified charger, precisely because so much of what makes it tick is independent of specific charger hardware, unlike SuperCharge and Dash Charge.

Quick Charge 3.0 makes use of ‘Intelligent Negotiation for Optimum Voltage’ (INOV), which, as the name suggests, allows for intelligent voltage control to determine the most efficient voltage for the most efficient power delivery at any given point while charging. This, coupled with a higher voltage than its competitors, allows the standard to expedite charging time while preventing overheating and ensuring battery safety. INOV is also a step up from Quick Charge 2.0, which had rather discrete power modes (5V/2A, 9V/2A, 12V/1.67A and 20V); this revision instead allows for fine-grained voltage scaling, anything from 3.6V to 20V in 200mV increments. By determining which power level to request at any point in time, Quick Charge also prevents damage to the chemical composition of the battery while still providing an optimum charging speed, taking into account factors like temperature and available power output. A potential downside is more inconsistency in charging speeds across charging scenarios and chargers, and the improvements manifest mostly in the earlier stages of charging, with a noticeable decline around the 80% mark.
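Enumerating that range makes the contrast with Quick Charge 2.0’s four fixed modes concrete; a quick sketch of the published 3.6V-20V, 200mV stepping:

```python
# Enumerate the selectable INOV voltage levels: 3.6V to 20V in 200mV steps.
steps = [round(3.6 + 0.2 * i, 1) for i in range(round((20.0 - 3.6) / 0.2) + 1)]
print(len(steps))              # 83 levels, versus QC 2.0's four fixed modes
print(steps[:4], steps[-2:])   # [3.6, 3.8, 4.0, 4.2] ... [19.8, 20.0]
```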

Still, looking at the graphs provided, one can see that the finer granularity and wider range of voltage steps are clearly being taken advantage of. It’s worth noting that the Quick Charge 3.0 samples shown here do not behave as efficiently under load as alternatives that offload much of the voltage conversion and heat dissipation to outside hardware; it’s more than serviceable if you want to use your phone while charging, but we don’t see the freedom from throttling and heat buildup found in solutions like Dash Charge. And, unlike with other standards, you really won’t have a hard time finding power banks that provide the rated charging speeds — this isn’t the case for SuperCharge or Dash Charge, unless you are willing to spend more money, spend more time, or make extra concessions.

It’s precisely this level of versatility and support that makes Quick Charge a great standard, and some OEMs do ultimately rebrand it as a superior “customized” alternative. In the end, Quick Charge is an excellent solution for most OEMs looking to implement fast charging that’s efficient, highly compatible, and needs no special accessories. This is hugely significant, given that Qualcomm is essentially granting dozens of smaller OEMs the option to provide faster charging, and bringing faster charging to mid-range devices through mid-range chipsets. This, in turn, raises the minimum baseline of fast charging offerings, promoting competition and prompting the brands that do offer fast charging as a specific selling point to aggressively improve or market their solutions.


USB Power Delivery

USB as a standard has been evolving for years, from a simple data interface that eventually became widely used as a constrained power supply, to a fully-fledged primary power provider alongside a data interface. Many small devices have featured USB charging for years, and you probably have a handful of peripherals being powered by USB cables at this very moment. Power management in the initial generations of USB, however, was not designed for battery charging — rather, it was cleverly exploited for that purpose by manufacturers who saw that the slow power delivery was enough for the small batteries of their products. Since then, we’ve seen a tremendous jump — from the USB 2.0 power source of 5V/500mA (2.5W), to USB 3.0 and 3.1’s 5V/900mA (4.5W, which went very, very underutilized on Android) and finally to USB-PD’s 100W charging maximum.

Of course, smartphones have no need for (and cannot take in!) that kind of power draw — while 20V/5A is the peak for USB-PD, actual chargers carry a much lower specification, with our tested Pixel clocking in at up to 15W (5V/3A) and the Pixel XL at up to 18W. In most charging circumstances, however, voltage sits at 5V with current just under 2A, and the highest power draw we found during charging was just under 12.25W. As shown in the data provided here, USB-PD really isn’t the fastest charging standard, nor does it offer the best thermal consistency or freedom from throttling. It does charge quite quickly under load, however, and overall it offers a very satisfactory – if unspectacular – charging profile.

It is, however, an extremely versatile standard that’s relatively easy to implement and is increasingly being pushed by Google in products like the Pixel C, Pixel Chromebooks and Pixel smartphones, as well as by various other manufacturers in laptops and other devices of varying sizes. Moreover, USB-PD is now part of the Android Compatibility Definition Document. Last year, the following entry made the rounds because it showed Google’s commitment to the standard, and what many interpreted as a discouragement of proprietary solutions:

Type-C devices are STRONGLY RECOMMENDED to not support proprietary charging methods that modify Vbus voltage beyond default levels, or alter sink/source roles as such may result in interoperability issues with the chargers or devices that support the standard USB Power Delivery methods. While this is called out as “STRONGLY RECOMMENDED”, in future Android versions we might REQUIRE all type-C devices to support full interoperability with standard type-C chargers.

Since then, we’ve seen Qualcomm adopt USB-PD spec compliance with its Quick Charge 4.0 release for newer Snapdragon chipsets, which is a huge victory for both Google and Qualcomm. The increasing proliferation of USB-PD and Type-C ports could lead us to a future of greater device interconnectivity, with a near-universal port for audio, video, data transfer and charging needs. USB Type-C devices like the Pixel XL already offer the option to charge other devices using their own battery as a power source, for example, and widespread USB Type-C and USB-PD adoption in other devices such as laptops could enable more convenient charging and cable-management use cases.

There’s also no shortage of charger options for USB-PD devices, and if the standard can co-exist with proprietary standards, that opens up even more possibilities for device manufacturers. As it stands, though, USB-PD isn’t present in many Android devices yet, with the Pixel and Pixel XL leading the charge. For these two phones and their adequate battery capacities, the charging rate and resulting times are sufficient, and Pixel and Pixel XL owners have multiple options at their fingertips — one just needs to make sure the charger can meet the phone’s 9V/2A or 5V/3A requirements, and that it meets specifications. With the emergence of USB Type-C and USB-PD, we did see a few reports of potentially dangerous cables being sold online, as some didn’t meet the specification’s requirements for the in-cable resistor, for example. Luckily such issues are disappearing, and if you make sure to research your purchase properly, you should be OK. Keep in mind that the standard is scalable, and there will be more voltage and current configurations for OEMs to experiment with.


Adaptive Fast Charging

Adaptive Fast Charging has been Samsung’s preferred charging solution for many years and, unfortunately, it has largely stayed the same since its introduction. While our results show that it’s one of the slowest (yet more stable) standards, Samsung opts for it year after year over either a charging solution more in line with what OnePlus and Huawei are doing, or proper Qualcomm Quick Charge (though Samsung devices can make use of Quick Charge chargers for fast charging!). The latter is a consequence of Samsung’s split chipset strategy, given that its Exynos chipsets wouldn’t readily be able to take advantage of Qualcomm’s charging technology. Samsung’s Adaptive Fast Charging is thus present in its devices across the globe, and limited to Samsung devices.

While Adaptive Fast Charging is faster than USB-PD when adjusting for battery capacity, it’s still significantly slower than SuperCharge and Dash Charge, and slightly slower than Quick Charge. It features a peak power delivery of 15W (5V/3A), which is in line with other standards, but Samsung seems to be quite conservative with its charging profile — this is particularly evident when charging under load, where the charging rate becomes nearly linear and is the slowest of all the devices we’ve tested for this article. That being said, the temperature variation is also the smallest of the bunch, and throttling the charging speed while minimizing temperature led to consistent performance under usage.


Under both circumstances (regular charging and charging under load*), Samsung’s solution is the slowest (without adjusting for battery capacities) and the coolest (or, rather, features the smallest range of temperatures). This emphasis on stability and consideration for thermals matters to Samsung now more than ever, after what happened with the Galaxy Note 7 and its faulty batteries. While there might be no correlation between this approach to fast charging and that incident – after all, as we’ve mentioned, the standard has remained largely unchanged over time – it’s still worth considering that a safer approach to fast charging is not a bad thing in and of itself.


This is especially true for Samsung devices, which also provide an entirely different rapid charging solution — fast wireless charging. While conventional wireless charging was gaining popularity a few years back, Samsung is one of the few that stuck with it and then improved upon its implementation by adopting faster wireless charging, which cut charging times from around three hours down to around two. Having this alternative can make up for some of the disadvantages of Adaptive Fast Charging, given that wireless charging is a more passive, less cumbersome approach that allows for more regular charging intervals, effectively taking the hassle out of topping up a phone around an office or bedroom.

* You might notice that the intervals between points in these data sets are smaller than in the other graphs. While gathering data from the Galaxy S8+, we stumbled upon a device-specific issue that prevented the UI-automated PCMark test from being carried out properly. We therefore revised our data collection and automation tool for the GS8+ and improved the polling mechanism while we were at it. Data added in the future will benefit from these improvements, resulting in more accurate and smoother graphs.


This article will be continuously updated as we get our hands on more devices, and get to test newer or updated standards. Stay tuned for more comparisons!