The Huawei Ascend P6 is the Chinese company's latest flagship smartphone, and is the world's slimmest, at just 6.18mm thick.
However, while a flagship for Huawei, the Ascend P6 is more of a premium mid-range handset in terms of features and design. Its screen is 720p rather than 1080p and, quite astonishingly considering the time of its launch, it lacks 4G LTE, but on the other hand its chassis is finely crafted from aluminium. To find out just what the deal is, we went hands-on.
One thing that is clear about the Huawei Ascend P6 is the influence from Apple. This phone sports machined aluminium sides and back and really could be mistaken for the iPhone 5 at a glance.
However, similarities apart, this is a lovely handset. That aluminium construction lends the phone a near iPhone 5-equalling level of build quality that just feels great in the hand. The proportions of the phone also help immensely. While we could take or leave the record-breaking 6.18mm slimness - though notably we didn't find it so thin as to be unwieldy - the narrowness makes the phone sit nice and snug in the hand. In comparison, the Galaxy S4 is nearly 5mm wider and 6mm taller - that may not sound like a lot, but in the hand it makes quite a difference.
This size difference may well be somewhat down to the smaller screen that the P6 uses but in fact that just highlights another way in which this phone trumps the S4 for ergonomics. The smaller 4.7in screen combined with the smaller body makes it significantly easier to get to grips with.
Even better are the main buttons that are ranged down the right edge of the phone. Up top is the power button while below is the volume. Both are perfectly placed so as to fall easily under finger or thumb, have just the right level of click and are nicely machined from aluminium - we're talking class-leading stuff here.
We're also fans of the use of on-screen buttons for Home, Back and Multi-tasking. Yes, it means you lose some screen space in some scenarios, but equally it means there's a decent amount of space below the screen to rest your thumb and grip the phone, unlike on the Galaxy S4 in particular.
The good news continues with the addition of a microSD slot that allows for cheap and simple upgrading of the phone's storage.
However, not all is rosy. For a start, the battery is inaccessible without dismantling the phone. Then there's the placement of the charging socket, which is on the top edge - ever heard of a phone dock, Huawei? But the worst is where Huawei has placed the headphone socket: right at the bottom of the left edge. This means the headphone cable gets in the way in almost every conceivable situation the phone finds itself in - in the hand, in the pocket, when gaming...
Although the P6 lacks a 1080p screen, in use it doesn't feel remotely lacking (indeed, there's arguably an advantage to a slightly lower resolution screen, as it requires less power to run and less processing power to drive). Its 720p LCD screen is still very sharp - it basically matches the iPhone's Retina display for pixel density - and uses the latest laminated screen manufacturing techniques, such that the image appears right on the surface rather than below the front glass. The result is superb viewing angles and bright colours. How it fares in darker lighting conditions when watching a video, we'll have to wait and see.
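As a rough sanity check on that Retina claim, here's a quick pixel-density calculation (a back-of-envelope sketch in Python, using the published panel sizes and resolutions of the two phones):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Ascend P6: 4.7in, 1,280 x 720; iPhone 5: 4in, 1,136 x 640
print(f"Ascend P6: {ppi(1280, 720, 4.7):.0f}ppi")  # ~312ppi
print(f"iPhone 5:  {ppi(1136, 640, 4.0):.0f}ppi")  # ~326ppi
```

A 14ppi gap is essentially invisible at normal viewing distances, which tallies with how sharp the screen looks in the hand.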
As for the phone's interface, it felt suitably speedy, thanks to its quad-core 1.5GHz chip (Huawei K3V2) and the use of Android 4.2.2. The look and feel is very different to stock Android as Huawei has gone to town customising it - and we had precious little time to really get to grips with it - but from what we could see it was largely cosmetic and all the key functions still work as any users of vanilla Android would expect.
Huawei was certainly keen to talk up the many funky features that its Emotion UI includes but five minutes at a press event is hardly time enough to do any of them justice.
Likewise the rear camera, which is an 8-megapixel BSI model. With an f/2.0 lens, it has a faster optic than both the iPhone 5 and Samsung Galaxy S4. While this may sound impressive, the HTC One had an equally fast lens and that phone's camera has far from lived up to expectations.
Taking a few snaps with the P6, the camera seems nice and speedy in operation with a clean interface - and that extra thumb space really helps when holding the phone in landscape orientation. When it comes to image quality, we'll have to wait and see.
All told, from what we've seen so far, the Huawei Ascend P6 is going to be a phone well worth considering if it really does come to market at around £300-£350. It's a shame there are a few slip-ups like the headphone socket placement, as without those it could have enough going for it to take on the likes of the HTC One, iPhone 5 or Galaxy S4 right at the top of the smartphone league.
Nvidia GeForce GTX 970 Review Roundup: feat. ASUS, EVGA and MSI
If you've already worked your way through our GTX 980 review, then you'll know that, unlike that card, Nvidia is not producing a reference model of the GTX 970, the second enthusiast-grade Maxwell part, which replaces the GTX 770 in the product stack. The images on this page do not reflect a ready-to-buy product. Instead, Nvidia's AIC partners will be able to purchase the GPU and memory chips, but after that it's up to them, and this will undoubtedly lead to a high degree of variability when it comes to PCB and cooler design as well as factory overclocks. Pricing is set to start at £259 inc VAT here, which is significantly lower than the £429 of the GTX 980 considering that the specs aren't that far apart.
As such, we're approaching this review slightly differently to usual, and we'll be testing three different GTX 970 cards from three of Nvidia's key partners: ASUS, EVGA and MSI. All three ship with custom coolers and factory overclocks, so it will be interesting to see how well they cool themselves and how much further they can be pushed.
If you haven't read the GTX 980 review, then that's where you'll need to head for all the technical specifics on the new GM204 GPU and its new display and graphics processing capabilities.
In summary, however, GM204 is Nvidia's enthusiast level Maxwell GPU, fully enabled in the new flagship GTX 980, and slightly cut down in this here GTX 970, as the specs table illustrates.
 | Nvidia GeForce GTX 980 4GB | Nvidia GeForce GTX 970 4GB | Nvidia GeForce GTX 780 Ti 3GB | Nvidia GeForce GTX 780 3GB | Nvidia GeForce GTX 680 2GB
GPU
Architecture | Maxwell | Maxwell | Kepler | Kepler | Kepler
Codename | GM204 | GM204 | GK110 | GK110 | GK104
Base Clock | 1,126MHz | 1,050MHz | 876MHz | 836MHz | 1,006MHz
Boost Clock | 1,216MHz | 1,178MHz | 928MHz | 900MHz | 1,058MHz
Stream Processors | 2,048 | 1,664 | 2,880 | 2,304 | 1,536
Layout | 4 GPCs, 16 SMMs | 4 GPCs, 13 SMMs | 5 GPCs, 15 SMXs | 4 GPCs, 12 SMXs | 4 GPCs, 8 SMXs
Rasterisers | 4 | 4 | 5 | 4 | 4
Tessellation Units | 16 | 13 | 15 | 12 | 8
Texture Units | 128 | 104 | 240 | 192 | 128
ROPs | 64 | 64 | 48 | 48 | 32
Transistors | 5.2 billion | 5.2 billion | 7.1 billion | 7.1 billion | 3.54 billion
Die Size | 398mm² | 398mm² | 561mm² | 561mm² | 294mm²
Process | 28nm | 28nm | 28nm | 28nm | 28nm
Memory
Amount | 4GB GDDR5 | 4GB GDDR5 | 3GB GDDR5 | 3GB GDDR5 | 2GB GDDR5
Frequency | 1.75GHz (7GHz effective) | 1.75GHz (7GHz effective) | 1.75GHz (7GHz effective) | 1.5GHz (6GHz effective) | 1.5GHz (6GHz effective)
Interface | 256-bit | 256-bit | 384-bit | 384-bit | 256-bit
Bandwidth | 224GB/sec | 224GB/sec | 336GB/sec | 288GB/sec | 192GB/sec
Card Specifications
Power Connectors | 2 x 6-pin PCI-E | 2 x 6-pin PCI-E | 1 x 6-pin, 1 x 8-pin PCI-E | 1 x 6-pin, 1 x 8-pin PCI-E | 2 x 6-pin PCI-E
Stock Card Length | 267mm | 267mm | 267mm | 267mm | 252mm
TDP | 165W | 145W | 250W | 250W | 195W
In the GTX 970, three of the 16 SMMs are disabled, leaving the card with a total of 1,664 CUDA cores, 13 geometry/tessellation units and 104 texture units. As is common practice, Nvidia does not disclose which SMMs are disabled. It actually varies from GPU to GPU, though not in a way that affects performance, and this allows Nvidia to get the best yield from chips that don't make it through the manufacturing process with all the SMMs intact.
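Those cut-down figures follow directly from the per-SMM resources. A minimal sketch, assuming Maxwell's published layout of 128 CUDA cores, eight texture units and one geometry/tessellation unit per SMM:

```python
# Maxwell (GM204) per-SMM resources: 128 CUDA cores, 8 texture units
# and 1 geometry/tessellation unit.
def gm204_config(active_smms):
    return (active_smms * 128,  # CUDA cores
            active_smms * 8,    # texture units
            active_smms)        # geometry/tessellation units

print(gm204_config(16))  # GTX 980, all SMMs enabled: (2048, 128, 16)
print(gm204_config(13))  # GTX 970, 3 SMMs disabled: (1664, 104, 13)
```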
Clock speeds have been lowered slightly too, although they're still relatively high. The GTX 970 has a base clock of 1,050MHz and a rated boost clock of 1,178MHz, which represents the average frequency the cards will boost to in normal gaming workloads – exact frequencies will vary. However, it should also be pointed out that these clock speeds are just that – for reference. Most partners will opt for a factory overclock of some sort on each of the SKUs they release.
Other than this, the specifications are largely the same. The 4GB GDDR5 frame buffer is left at full capacity. The GPU communicates with this via a 256-bit bus (four 64-bit memory controllers) at an effective data rate of 7GHz for a total memory bandwidth of 224GB/sec. This is on the low side for a high-end GPU these days, but as discussed in the GTX 980 review, Nvidia reduces the need for memory bandwidth by introducing a large, shared L2 cache of 2MB (higher than even GTX Titan) and by a technique referred to as third generation delta colour compression, which compresses frame data in a lossless fashion on-the-fly, saving on average 25 percent of memory bandwidth. Rounding out the GTX 970 GPU at the back-end is the full complement of 64 ROPs.
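The bandwidth figure is simple to verify: multiply the bus width in bytes by the effective data rate. A minimal sketch, also estimating the equivalent raw bandwidth if the quoted 25 per cent average compression saving is taken at face value:

```python
def bandwidth_gb_s(bus_width_bits, effective_ghz):
    # Bytes moved per transfer cycle * effective transfers per second
    return bus_width_bits / 8 * effective_ghz

raw = bandwidth_gb_s(256, 7.0)
print(raw)               # 224.0 GB/sec raw (GTX 970/980)
print(raw / (1 - 0.25))  # ~298.7 GB/sec equivalent, given 25% average savings
```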
The GTX 970 supports all of the major display technologies introduced with the GTX 980, including Voxel Global Illumination, Dynamic Super Resolution and Multi-Frame Sampled AA. Again, for a full rundown of these you'll want to head here.
Efficiency is the name of the game with Maxwell, and the already low TDP of the GTX 980 has been reduced even further for the GTX 970, from 165W to 145W, which is 85W lower than the card it replaces, the GTX 770. Power is supplied (at least in the reference specifications) through two 6-pin PCI-E power connectors, via a 4+1 phase power delivery system.
Finally, the reference display connections are the same as those of the GTX 980: a single dual-link DVI-I, three DisplayPorts (version 1.2) and an HDMI port, which is version 2.0 and thus capable of powering 4K displays at 60Hz all by itself. Of course, actual display configurations are likely to vary from partner to partner, but what doesn't vary is that the GTX 970 can support up to four displays at once. G-Sync is of course also supported via DisplayPort.
SK Hynix SH910A SSD 256GB Review
Given that SK Hynix is the sixth-largest semiconductor company in the world and the world's second-largest producer of DRAM and flash memory (beaten only by Samsung here), it may surprise you to know that the company has virtually no presence in the SSD market beyond the OEM space. At least, that is, until now, as the semiconductor giant has just begun its first real foray into the consumer SSD market. It's certainly a little late to the party, but SK Hynix is also backed by experience, vast sums of money and the ability to produce all the main components of an SSD in-house, something which has given players like Samsung and Crucial a clear advantage in the field.
The SH910A range will be available in capacities from 64GB to 512GB (though the largest model is still absent from retailers) for 2.5-inch models, and in 32GB to 128GB for mSATA users. It appears to be based on the OEM SH920 SSD line found in various notebooks.
The 256GB model, which is the capacity we now consider to be the mainstay for enthusiasts, comes in at a very attractive £85, which works out to just 36p per formatted GB. This means SK Hynix is very much aiming for entry-level users rather than professionals, which in turn means that, like so many other drives, the SH910A will have to do battle with Crucial's MX100 and Samsung's SSD 840 EVO, which come in at just £75 for 256GB and £84 for 250GB. As such, the new SK Hynix model certainly has its work cut out for it, as these two drives combine solid performance (for client usage models) with impressive features too.
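That 36p figure checks out against the formatted capacity. A quick sketch (assuming, for comparison's sake, that the rival 256GB drive formats to roughly the same usable capacity, which we haven't verified):

```python
def pence_per_gb(price_pounds, formatted_gb):
    return price_pounds * 100 / formatted_gb

print(round(pence_per_gb(85, 238.47)))  # SH910A 256GB: ~36p/GB
print(round(pence_per_gb(75, 238.47)))  # Crucial MX100 256GB: ~31p/GB
```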
SK Hynix SH910A SSD | 64GB | 128GB | 256GB | 512GB |
Max Sequential Read (MB/sec) | 525 | 530 | 530 | 530 |
Max Sequential Write (MB/sec) | 180 | 330 | 410 | 410 |
Max Random Read - 4K QD32 (IOPS) | 70,000 | 94,000 | 94,000 | 94,000 |
Max Random Write - 4K QD32 (IOPS) | 40,000 | 71,000 | 75,000 | 76,000 |
The retail-packaged SH910A comes with a 3.5-inch adaptor bracket as well as SATA power and data cables. The drive itself is very sleek, with a glossy, all-white exterior.
The basic performance specifications naturally aren't as high as those of professional-level drives, hence the sub-500MB/sec sequential write speeds, but they are still solid for an entry-level drive. Sequential write speeds, at 410MB/sec, are actually rated 80MB/sec higher than those of the Crucial MX100 256GB.
We rarely see SK Hynix NAND flash used in consumer SSDs – most companies opt for Toshiba or Micron solutions, while SanDisk and Samsung both produce their own. In the SH910A, Hynix uses its own F20 20nm MLC NAND, and each individual die can hold 64Gb of data. Hynix uses no additional overprovisioning with the SH910A, hence the advertised 256GB capacity. Annoyingly, an endurance rating in terabytes written has not been issued, but the drive does come with a three-year warranty – standard for the entry-level market.
The controller comes courtesy of Link_A_Media, or LAMD, a company that Hynix acquired in 2012. Specifically, the model used is the LM87800AA, the same as that in the Corsair Neutron GTX, and LAMD's first consumer-oriented solution. It's odd that a newer model hasn't been produced since, but if Hynix is able to use its deep knowledge of the controller and its own NAND to produce solid firmware, then it may not matter. The controller is an eight-channel design like the vast majority currently used in client SSDs, and it supports up to four dies per channel, so Hynix's use of 64Gb dies ensures maximum controller saturation, as you need 32 such dies to get to 256GB (there are four dies per NAND package on the PCB, and eight packages in total). It also means the 512GB version will use 128Gb dies.
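The die arithmetic is easy to check; a minimal sketch using the figures above:

```python
CHANNELS = 8           # LM87800AA is an eight-channel controller
DIES_PER_CHANNEL = 4   # it addresses up to four dies per channel
DIE_GBIT = 64          # SK Hynix F20 dies hold 64Gb each

dies = CHANNELS * DIES_PER_CHANNEL       # 32 dies saturate the controller
print(dies * DIE_GBIT // 8, "GB total")  # 32 * 64Gb / 8 = 256 GB
print(4 * DIE_GBIT // 8, "GB/package")   # 4 dies per package = 32 GB each
```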
As you'd expect, the DRAM cache is also of Hynix origin. There are two cache chips on the PCB, one on either side, and the model number, H5PS1G83JFR-Y5I, tells us that each chip is a 128MB DDR2 module, for 256MB of cache in total. We're now used to seeing either low-power or DDR3 cache (or both), so the SH910A is a bit behind here – it's likely this is a limitation of the controller, which may not support low-power modules or DDR3.
Despite the lack of a low-power cache, the SH910A does support the DEVSLP function, and as such is suitable for notebook users. Other features are absent though; even the level of encryption is unclear. We're assuming the drive features AES 256-bit encryption (very much a standard now) as the controller supports it, but we haven't been able to confirm this.
Specifications
Interface: SATA 6Gbps
Nominal capacity: 256GB
Formatted capacity: 238.47GiB
Controller: LAMD LM87800AA
Cache: 2 x SK Hynix 128MB DDR2
Memory type/amount: 64Gb SK Hynix F20 20nm MLC NAND dies (8 x 32GB packages)
Endurance rating: Not stated
Warranty: Three years
AMD and Nvidia need to step up to the 4K challenge
Resolution, particularly pixel density, is the new frontier when it comes to gaming graphics. There’s little doubt, certainly from my first hand experience, that 4K offers huge advantages in sharpness, even on 24in and 27in monitors – not just on super-large screens.
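To put some numbers on that, here's a quick pixel-density comparison (a back-of-envelope sketch, taking 4K to mean 3,840 x 2,160):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f"24in 4K: {ppi(3840, 2160, 24):.0f}ppi")             # ~184ppi
print(f"27in 4K: {ppi(3840, 2160, 27):.0f}ppi")             # ~163ppi
print(f"24in 1,920 x 1,200: {ppi(1920, 1200, 24):.0f}ppi")  # ~94ppi
```

Roughly double the pixel density of a typical 24in desktop monitor is exactly the kind of jump that's obvious at normal viewing distances.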
Some may disagree here, but to me, I’d welcome more pixels than my current 24in 1,920 x 1,200 main monitor offers. As I have two screens, I’ve also considered investing in a super-wide screen too.
LG's 34UC97 is a curved 34in super-wide monitor that sports a resolution of 3,440 x 1,440
There are some fantastic-sounding options here in the ultra-high resolution department as well. LG and AOC have 34in 3,440 x 1,440 monitors, plus Dell and LG have recently announced their own curved versions (WANT). The prospect here for immersive, high resolution gaming is pretty compelling, but the extra screen real estate is useful for all manner of other tasks too. I've played with super-wide monitors before as well, as you can read about here, and despite older 30in models only sporting 1,080 vertical pixels, I didn't find this too restrictive when editing photos and the like.
AOC's u3477Pqu super-wide 3,440 x 1,440 monitor will retail for around £500 in October
However, there’s one major issue stopping me splashing some cash on a new ultra HD monitor. This is the fact that I’d need to invest twice as much again in the graphics department to be able to get playable frame rates in games. I’ve never been one to tone down graphics settings in order to get playable frame rates; this is partly the reason I find myself writing about PC hardware for a living, apart from the fact I caught the upgrade bug two decades ago.
However, even if I was prepared to drop a little in terms of detail settings, this still wouldn't be enough to allow even a £400 single-GPU graphics card to handle all the latest games, never mind my ageing GTX 660 Ti. Even Nvidia's latest effort, the GTX 980, was a long way from achieving playable frame rates in Crysis 3 in our review; you'd need to opt for a monster such as AMD's R9 295X2 in order to get some headroom at 4K.
To be able to play all current games at 4K, you need to invest in multiple GPUs or AMD's R9 295X2
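Raw pixel count is only a first-order proxy for GPU load, but it shows why a single card struggles at these resolutions; a quick comparison:

```python
resolutions = {
    "1,920 x 1,080": (1920, 1080),
    "1,920 x 1,200": (1920, 1200),
    "3,440 x 1,440": (3440, 1440),
    "3,840 x 2,160 (4K)": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f}M pixels, {pixels / base:.2f}x 1080p")
```

4K means shading four times the pixels of 1080p every frame, and even a super-wide 3,440 x 1,440 panel is nearly two and a half times the work.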
Something else that concerns me, though, is that there’s not much effort going on to address the main issue here, which is that higher resolutions are going mainstream. Windows 8.1 achieved a lot in terms of 4K scaling, though there are a few more issues to iron out, not least of all by software companies with their own program scaling.
However, we’re nearly at the point where it makes absolute sense to aim for 4K in a mid to high-end system, rather than a super-high end one as is the case at the moment. This doesn’t mean I think those of us with limited wallet power won’t consider splashing out £300-400 on a 4K-capable graphics card, but the fact is that once 4K monitors fall in price further, mid and high-end enthusiasts will have a bit of a problem on their hands.
Nvidia's GTX 980 can play some games at 4K, but doesn't offer much headroom
They can afford a 4K monitor, but not the graphics card(s) to power it in games. We haven't had such a big reason to upgrade our graphics cards since Crysis landed, but AMD and Nvidia need to do more to make these ultra-high resolutions more attainable outside of super-expensive systems. In the past, you've needed to invest heavily to game on triple screens, for example, and I think this needs to change.
4K is waiting to take off, be it in super-wide or standard aspect ratio monitors. In addition, true 4K gaming is also something the latest consoles lack. So this is also a huge opportunity for PC gaming to take a giant leap forwards and offer something tangible when it comes to a better gaming experience.
In short, what we need is a GTX 970-type graphics card that can handle the latest games at 4K – something in the region of £250-£350 – not the £700-odd that you'd currently need for something like the R9 295X2. So come on, AMD and Nvidia: rise to the challenge and give us more reasonably-priced 4K-capable graphics cards.
Corsair Gaming M65 RGB Review
Although we saw Corsair's Raptor M45 mouse early last year, we've yet to look at the mouse that the cheaper Raptor model was based on, namely the M65. However, with Corsair's new Corsair Gaming brand came a refresh of a number of its higher-end peripherals, so today we're looking at the M65 RGB. It comes with the same right-handed design as the Raptor M45, but adds an extra button to the mix for a total of eight, and also includes customisable RGB lighting, something the original Vengeance M65 lacked.
The M65 RGB has an internal aluminium unibody, which feels very durable and sturdy. Externally it is plastic coated, with three distinct sections – the top has a smooth, matt finish while both sides have textured coats for improved grip. The mouse is also fitted with five PTFE feet which are nice and big and ensure very smooth, low friction movements.
The split sections and cutaway rear of the M65 RGB meant we didn't find it massively suitable for a palm grip, at least compared to other mice like the Asus Strix Claw and Mionix Naos 8200, as the gaps and edges here are noticeable and could prove distracting. We wouldn't say it's uncomfortable, but the effect was minimised by switching to more of a claw grip, where the textured sides come in handy. There's a very subtle groove on the right side too where you can rest your ring and pinky if you have small fingers.
On the bottom there are three adjustable weight positions – front left, front right and rear – allowing you to fine-tune the weight of the mouse and centre of gravity really well. It's easy to do too, as you just need a coin to unscrew them. In each position there are two weights; one 2.5g screw and one 4.5g doughnut-shaped one. The system allows you to remove both, or just the heavier one.
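Just to illustrate the tuning range, here's a quick enumeration of the ballast combinations the three positions allow (a sketch based on the per-position options described above):

```python
from itertools import product

# Per position: both weights fitted (7g), just the 2.5g screw, or empty (0g)
PER_POSITION = (0.0, 2.5, 7.0)

totals = sorted({sum(combo) for combo in product(PER_POSITION, repeat=3)})
print(totals)  # 10 distinct ballast totals, from 0g up to 21g
```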
There is also a three-zone RGB lighting system – the logo, scroll wheel and DPI indicator. The former two are aesthetic only while the latter is functional, changing colour based on the current DPI level. However, its position makes it hard to see, so its usefulness is limited. If we wanted to check it at a glance, we had to move our index finger out of the way.
The M65 RGB's Omron switches ensure a good action on the main buttons, and we've no complaints with regards to the thumb buttons either, which are springy and easy to hit. The dedicated sniper button (this is the switch the M45 lacks) is oversized and positioned well for instant actuation when needed. The DPI switches have a higher actuation force so you won't accidentally press them, but the back one is quite far down the body and a stretch to reach – this is often the case with such designs, however. Finally, the scroll wheel is weighty thanks to its metal body, and it's rubberised on the outside too. It has good tactile feedback and accurate scrolling.
Tracking from the 8,200 DPI laser sensor is good across the DPI range. We have grown rather fond of a number of optical mice we've been using recently, but we can't deny this sensor performs well and it's less fussy about your surface than an optical sensor would be, which is good for those often on the go. We detected no built-in acceleration or angle snapping, and only very minor levels of jitter. It gets a bit uncontrollable at the high DPI levels, but that's the same with any mouse that's this sensitive.
The M65 RGB relies on Corsair Gaming's CUE software, just like the Sabre Optical, and it has the same benefits and drawbacks. On the plus side, it's very powerful, with the macro and custom functions being particularly impressive in their depth. However, it does have a steep learning curve; basic functions and lighting are easy to achieve, but more complex ones will take a fair bit of getting used to. We're also not fans of the fact that, when using a compatible Corsair Gaming keyboard like the K70 RGB, profiles appear to apply to both peripherals – switching profiles on one switches profiles on the other.
CUE allows you to save one profile to the M65 RGB for plug and play functionality on the go. As we said, button assignment is easy, and while you cannot alter the scroll up or down commands, all eight buttons are fully programmable. Performance tuning is also easy, with lots of DPI options as well as lift height, pointer speed (including an acceleration option) and an angle snapping toggle. Note that the polling rate selection (125, 250, 500 and 1,000Hz) is not accessed in the Performance sub-tab, but in the main Settings tab, since it is universal and not changeable using profiles.
When tweaking the scroll wheel and logo lights, you can have patterns based on solid colours or gradients between two or more colours of your choosing. The more complex wave and ripple patterns are reserved for Corsair Gaming's RGB keyboards as they can only be used with groups of lights rather than individual ones.
Conclusion
The M65 RGB is another solid product from Corsair Gaming. In the high-end peripherals market, it's getting to the point where there's a high level of quality across the major brands, with many relying on the same sensors, switches and materials and differing mostly in size, shape and software control. We think palm grippers looking to spend this much will be better suited to the optical Asus ROG Gladius or laser Roccat Kone XTD, but Corsair Gaming's M65 RGB is a great, albeit expensive, choice for claw-gripping FPS types.