
AMD Radeon R9 Fury X Review

3.5
Good

The Bottom Line

With the help of a new type of memory, AMD's liquid-cooled Fury X delivers performance that nearly matches Nvidia's GeForce GTX 980 Ti in a compact, quiet form factor. Just know you'll need to mount a small radiator, and think twice if you intend to use the card with an HDTV for 4K, 60Hz gaming.

MSRP $649.99

Pros

  • Compact for a high-end card geared toward 4K gaming.
  • Runs quiet and cool, thanks to bundled radiator and fan.
  • Roughly matches Nvidia's GeForce GTX 980 Ti at 4K and high settings.

Cons

  • Overall performance slides in just behind Nvidia's competing card, especially at lower resolutions.
  • Radiator complicates installation, takes up space gained by smaller card.
  • Lack of HDMI 2.0 port makes card an iffy choice for gaming on a 4K HDTV.

Although AMD hasn't released a new single-GPU graphics chip since the Radeon R9 290X in October 2013, the company has managed to remain surprisingly competitive in the video card business. Since then, GPU rival Nvidia has cranked out the GeForce GTX 780 Ti, the GeForce GTX 980, and the GeForce GTX 980 Ti, along with a few $1,000-plus GeForce GTX Titan cards. (See, for example, our most recent Titan review, of the GeForce GTX Titan X.) Despite that formidable competition, AMD, through a series of price cuts, has remained close in terms of both raw graphics performance and performance per dollar—at least until the GTX 980 Ti landed just weeks ago. That card is a killer, delivering nearly identical performance to Nvidia's much pricier GeForce GTX Titan X at about two-thirds of the price.

Now, alongside a new 300-series line (essentially existing 200-series chips with tweaked memory, power management, and clock speeds, plus more memory on some models, up to 8GB), AMD is taking off the gloves with a corker of its own. It has taken the final wraps off "Fiji," the company's first truly new chip design in almost two years, in the hopes of again competing with Nvidia in the high-end market. And high-end these days means exactly two characters: "4K."


As we wrote this in late June 2015, a few Fiji-based cards had just been announced, on the heels of a rash of rumors. All bear the "Fury" name, a moniker resurrected from the late-1990s heyday of the Rage Fury Maxx, a card that was current several years before AMD bought ATI Technologies. The promised cards are an air-cooled Radeon R9 Fury, a surprisingly compact Radeon R9 Fury Nano, and a dual-GPU Fiji-based card that AMD hasn't yet given a proper name.

Those other cards are expected to land in the coming weeks and months, with the Radeon R9 Nano card likely at the end of the summer of 2015, and the dual-chip card promised only in "the fall." (Air-cooled R9 Fury cards should start showing up in late July.) But the first of the Furies, the water-cooled AMD Radeon R9 Fury X, landed on our test bench a few days ago, and we've been impressed with its compact 7.5-inch frame and quiet performance. To keep things running cool and quiet, though, the card comes with a closed-loop liquid cooler. So, much like the Radeon R9 295X2, you'll have to make room in your case for a 120mm radiator.

Image: AMD Radeon R9 Fury X (Intro) 450R.jpg

The Fury line's most notable internal design change is a shift to vertically stacked high-bandwidth memory (HBM), which helps AMD deliver a high-end card that's considerably shorter than the previous flagship, the Radeon R9 290X. There's only 4GB of memory here, but AMD claims the much wider bus (1,024 bits per HBM stack, versus 32 bits per GDDR5 chip) means less video memory is necessary in the first place. From our benchmark testing, that generally seems to be true, as the card tends to perform better against the competition at 4K (3,840x2,160) resolution with high settings.

That being said, while the Radeon R9 Fury X gets extremely close in our testing to Nvidia's GeForce GTX Titan X and GeForce GTX 980 Ti, it doesn't move past those cards in any significant fashion. So, considering that the Radeon R9 Fury X's $649 price exactly matches that of the GeForce GTX 980 Ti, AMD's card is a clear-cut choice only for those looking to build a high-end system in a compact case that isn't large enough to house a 10.5-inch-long Nvidia card.

Image: AMD Radeon R9 Fury X (Attached to Radiator).jpg

That said, if you're looking to play 4K games on an HDTV, the Radeon R9 Fury X may pose another problem. The card's single HDMI port supports HDMI 1.4a, not the 2.0 specification necessary for (among other things) 4K gaming at 60Hz (above 30fps). Active adapters that convert DisplayPort to HDMI 2.0 have been promised, but they don't seem to be available yet. In addition, adapters will likely be fairly expensive when they do arrive, while Nvidia's recent cards ship with HDMI 2.0 support baked into the port.

Considering that almost all HDTVs lack DisplayPort input, this limits the appeal of a compact Radeon R9 Fury card for living-room gaming, and will likely put off many who plan to purchase a 4K TV for use as a lower-cost gaming monitor (or who have already done so). At a time when 4K gaming is still a pretty niche endeavor, that may prove a big problem for the Radeon R9 Fury X. The card pushes pixels about as well as the GeForce GTX 980 Ti, but Nvidia's cards can deliver faster refresh rates on a lot more 4K screens.

Design and Features

There's a lot to unpack in terms of physical design changes with the new Fiji chip in the AMD Radeon R9 Fury X, most of it revolving around the shift to HBM memory instead of the familiar GDDR5. Before we get to those details, though, here's a look at the Radeon R9 Fury X's basic specs, direct from AMD...

Image: AMD Radeon R9 Fury X (Specs).jpg

The Radeon R9 Fury X is still stuck on the 28nm manufacturing process, as is Nvidia with its latest cards—the shift to 14nm- and 16nm-process cards is expected to happen in 2016. But aside from that, the Radeon R9 Fury X is a fairly drastic departure from the Radeon R9 290X, AMD's previous flagship single-chip card. The stream-processor and texture-unit counts each get a roughly 45 percent bump over the Radeon R9 290X (which has 2,816 and 176, respectively), while the top stock boost clock rises slightly to 1,050MHz, versus an even 1,000MHz on the Radeon R9 290X.

Memory is a key differentiator, though. The Radeon R9 Fury X's 4GB is the same capacity as the 290X's, but the company has moved from the GDDR5 of previous models to HBM. AMD says the new memory technology increases the bus width from 32 bits per chip with GDDR5 to 1,024 bits per stack with HBM. With four stacks on board, that gives the Fury X's memory an immense 512GB per second of potential bandwidth (128GB per second per stack), versus 28GB per second per GDDR5 chip. Here's a diagram of some of HBM's benefits, provided by AMD, including that of the memory's "interposer" layer...

Image: AMD Radeon R9 Fury X (HBM Benefits 3).jpg
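
If you want to sanity-check those headline bandwidth numbers, the arithmetic is simple. Here's a minimal Python sketch; the per-pin data rates (roughly 1Gbps for this first generation of HBM, roughly 7Gbps for typical GDDR5) are commonly cited figures we're assuming here, not values pulled from AMD's slide:

```python
# Back-of-the-envelope peak-bandwidth math for HBM vs. GDDR5.
# Assumed per-pin rates: ~1 Gbps for first-gen HBM, ~7 Gbps for typical GDDR5.

def peak_bandwidth_gb_per_s(bus_width_bits, per_pin_gbps):
    """Peak bandwidth in GB/s: (bus width x per-pin rate) / 8 bits per byte."""
    return bus_width_bits * per_pin_gbps / 8

hbm_per_stack = peak_bandwidth_gb_per_s(1024, 1.0)   # ~128 GB/s per stack
hbm_total = hbm_per_stack * 4                        # four stacks on the Fury X
gddr5_per_chip = peak_bandwidth_gb_per_s(32, 7.0)    # ~28 GB/s per chip

print(f"HBM, per stack:   {hbm_per_stack:.0f} GB/s")
print(f"HBM, four stacks: {hbm_total:.0f} GB/s")
print(f"GDDR5, per chip:  {gddr5_per_chip:.0f} GB/s")
```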

As you can also see, HBM can deliver a massive amount of bandwidth while running at much slower clock speeds and a lower voltage. That helps the Radeon R9 Fury X deliver more performance per watt than previous AMD cards, though the claimed 275 watts of typical board power (TBP) doesn't seem to be a drastic departure from the draw of the Radeon R9 290X. AMD never officially announced a board power rating for the Radeon R9 290X, but it was widely reported that the card drew around 300 watts under heavy load. And the Radeon R9 Fury X has two eight-pin power-supply connectors, just like the Radeon R9 290X reference board. That means the board can draw up to 375 watts total, so there should be plenty of power overhead for overclocking.
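
For context on that 375-watt figure, here's a quick sketch of the arithmetic using the PCI Express power-delivery allowances (75 watts from the x16 slot, 150 watts per eight-pin connector); these are the spec's limits, not numbers AMD quotes:

```python
# Power-delivery budget for a card with two 8-pin PCIe power connectors.
PCIE_SLOT_WATTS = 75     # allowance from the x16 slot itself
EIGHT_PIN_WATTS = 150    # allowance per 8-pin PEG connector

max_deliverable = PCIE_SLOT_WATTS + 2 * EIGHT_PIN_WATTS   # 375 W
typical_board_power = 275                                 # AMD's stated TBP

print(f"Maximum deliverable power: {max_deliverable} W")
print(f"Headroom over typical board power: {max_deliverable - typical_board_power} W")
```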

It's good to see AMD make some progress on the power-efficiency front. But on paper, at least, Nvidia's like-performing GeForce GTX 980 Ti (and GeForce GTX Titan X) are still ahead here, with a rated thermal design power (TDP) of 250 watts. And keep in mind that Nvidia's cards don't require liquid cooling, while the Radeon R9 Fury X does. Granted, AMD's "TBP" and Nvidia's "TDP" aren't the same measurement. But it's probably safe to say that if AMD could have claimed a TDP the same as (or better than) Nvidia's card, the company would have.

The other main benefit of HBM is that it allows for a more compact card design. This is how AMD has managed to shrink from a roughly 11-inch-long Radeon R9 290X to a 7.5-inch Radeon R9 Fury X. Instead of the memory being placed outside the GPU die and spread out horizontally, HBM allows its DRAM chips to be stacked vertically atop one another and connected via what the company terms "through-silicon vias" (TSVs). TSVs are essentially tiny wire-filled holes drilled vertically down the memory stack, running to the adjacent graphics processor via the separate interposer layer we mentioned above. Here's another illustration of how stacked HBM memory works, from AMD's press materials...

Image: AMD Radeon R9 Fury X (HBM Benefits 2).jpg

According to AMD, this allows for a drastic increase in efficiency, delivering 35GB per second of bandwidth per watt, compared to 10.5GB per second with GDDR5. But it also means the memory itself takes up a lot less space, allowing for a roughly three-fold reduction in horizontal PCB space compared to the R9 290X, as diagrammed here...

Image: AMD Radeon R9 Fury X (Fiji Size Comparison).jpg
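
Taken at face value, those bandwidth-per-watt figures also imply roughly how much power each memory subsystem burns to hit its peak bandwidth. Here's a rough sketch using the Fury X's 512GB-per-second figure and the Radeon R9 290X's 320GB-per-second GDDR5 bandwidth; the 290X figure is that card's published memory spec, not a number from AMD's HBM slides:

```python
# Implied memory-subsystem power, derived from AMD's bandwidth-per-watt claims.
hbm_bandwidth_gbs, hbm_gbs_per_watt = 512, 35        # Fury X HBM
gddr5_bandwidth_gbs, gddr5_gbs_per_watt = 320, 10.5  # R9 290X GDDR5 (512-bit bus)

print(f"HBM memory power:   ~{hbm_bandwidth_gbs / hbm_gbs_per_watt:.1f} W")      # ~14.6 W
print(f"GDDR5 memory power: ~{gddr5_bandwidth_gbs / gddr5_gbs_per_watt:.1f} W")  # ~30.5 W
```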

Again, these are important improvements, as the Radeon R9 290X was a power-hungry card that ran loud—especially with the company's stock cooler. But the move to this first implementation of HBM has limited AMD to just 4GB of video memory. That will likely cause some confusion among gamers, as, all else being equal, more memory has traditionally meant better performance at higher resolutions with larger texture files. Nvidia's GeForce GTX 980 Ti has 6GB of (admittedly lower-bandwidth) GDDR5. Given the performance we've seen at 4K resolutions (which we'll get to in a bit), 4GB of HBM is probably enough for most 4K games. But some less-informed buyers may still opt on impulse for the Nvidia card simply because it shows a larger memory number in the specs box, given that the two cards are priced the same (at least as of this writing).

Nvidia's roadmap also indicates that the company will switch to HBM memory in 2016, with its Pascal architecture. That implementation of HBM will reportedly allow for up to 32GB of memory on a single GPU, while delivering similar space-saving and power-saving benefits. So the gains AMD has managed with the Fury line and its shift to HBM likely will last just a single generation.

As for the card itself, AMD has done a very good job with the Radeon R9 Fury X's exterior shell and cooler setup...

Image: AMD Radeon R9 Fury X (Overall View).jpg

The card is wrapped in black metal and silver accents, with a glowing Radeon logo on the top edge for those with case windows. The braided tubes for the self-contained cooler snake out the end of the card and flex and route easily, feeling more like thick power cables than stiff hoses. The tubing is also nearly 16 inches long, so unless you have a truly massive case, you shouldn't have problems mounting the radiator wherever you want.

The top edge of the card also features a series of LEDs that show GPU load...

Image: AMD Radeon R9 Fury X (LEDs).jpg

They're unlit in the photo above, where they look like little more than solder points, but there they are. They reminded us of old-school stereo VU level indicators. (Again, that's a detail that will appeal to gamers or power users building a PC in a windowed chassis.) Also on the top edge, next to the LEDs, are the power-supply inputs, a pair of eight-pin connectors...

Image: AMD Radeon R9 Fury X (Power Connectors).jpg

There is no cooling fan on the card itself. All the internal components are cooled by the 120mm radiator and fan, which were designed with the help of liquid-cooling veteran Cooler Master...

Image: AMD Radeon R9 Fury X (Radiator).jpg

AMD says the fan and pump are designed to operate at less than 32dBA under typical use, and the cooler can handle up to 500 watts of heat. So even if you're overclocking the R9 Fury X, the cooling setup shouldn't get overly loud. It certainly didn't stand out in our (admittedly, fairly noisy) testing lab with the case side panel off. It seldom rose above a light whisper or faint pump whine, nothing remotely like the roaring reference cooler on the Radeon R9 290X.

Because the entirety of the card is cooled by the liquid-cooling setup, no vents are to be found on the Radeon R9 Fury X's port plate. You will find, however, four full-size DisplayPort connectors and an HDMI 1.4a port...

Image: AMD Radeon R9 Fury X (Ports).jpg

Know that, for starters, if you've got an old 30-inch monitor with DVI-in only, you'll need to upgrade or find an adapter. Worse than the lack of DVI, however, is the absence of an HDMI 2.0-compliant port. HDMI 2.0 can be found on Nvidia's GeForce GTX 970, GTX 980, GTX 980 Ti, and GTX Titan X cards—but not the Radeon R9 Fury X.

If you're planning on connecting this card to a 4K computer monitor, the absence of HDMI 2.0 might not be worrying, because you can game at 4K resolution and 60Hz (60fps) via the DisplayPort interface. But for those who like to game on larger screens, or who just want to use a 4K HDTV with their new graphics card because 4K TVs tend to be larger and more affordable than monitors, AMD's inclusion of the older HDMI 1.4a connector is a big problem. HDMI 1.4a can't deliver the bandwidth for 4K at 60Hz, so games (and video) won't display at higher than 30fps at that resolution, and even desktop tasks can look choppy, such as when moving the mouse. HDMI is the standard input on HDTVs, and very, very few televisions also include a DisplayPort.
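
A rough bandwidth calculation shows why. The sketch below uses commonly cited usable video payloads (roughly 8.2Gbps for HDMI 1.4, roughly 14.4Gbps for HDMI 2.0) and ignores blanking overhead, so treat it as an approximation rather than a spec-exact accounting:

```python
# Approximate video-data rate needed for 3,840x2,160 at 24 bits per pixel,
# compared with the usable payload of HDMI 1.4 (~8.2 Gbps) and HDMI 2.0 (~14.4 Gbps).
# Blanking intervals are ignored, so real-world requirements run a bit higher.

def required_gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

for hz in (30, 60):
    need = required_gbps(3840, 2160, 24, hz)
    print(f"4K at {hz}Hz needs ~{need:.1f} Gbps of video data "
          f"(HDMI 1.4: ~8.2 Gbps usable; HDMI 2.0: ~14.4 Gbps usable)")
```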

Adapters that convert DisplayPort to HDMI 2.0 have been promised, but they don't seem to be available yet. And because DisplayPort and HDMI operate on different voltages, those adapters will have to be active, rather than passive, so they're likely to be expensive—at least at first. Considering that the Radeon R9 Fury X is priced the same as the GeForce GTX 980 Ti, that puts AMD's card at a disadvantage for those buyers.

Granted, not all gamers are going to opt for an HDTV over a monitor, and sub-30-inch 4K monitors are more affordable at the moment than most 4K HDTVs. But there's a good chance that some buyers considering a Radeon R9 Fury X have already purchased a 4K TV that they plan to use for gaming, or are considering buying one in the near future in the hopes of using it for both gaming and other entertainment purposes.

Considering that the 4K gaming market is nascent and made up of early adopters, it's problematic, to say the least, that AMD's latest card locks out many of those early adopters and limits the variety of 4K screens that can make full use of this flagship card. Nvidia's cards have included HDMI 2.0 ports going back 10 months to the launch of the GeForce GTX 980 and GeForce GTX 970, so it's hard to argue that AMD didn't have the time or capability to include HDMI 2.0 on the Radeon R9 Fury X. That's this card's biggest disappointment, to our eyes.

Performance Testing

Before we get into the nitty-gritty of our benchmark-test results, it's important to note that we tested this card at its out-of-the-box settings (that is, with a top boost-clock speed of 1,050MHz). You can try, of course, to overclock the card further. There is an intuitive overclocking utility built into AMD's Catalyst software. Within the tight time constraints we had with this card, we were able to push the Radeon R9 Fury X to a stable overclock of 5 percent, with a 6 percent power boost. (We did have the card running mostly stable at a 7 percent GPU clock boost, but it would lock up consistently in one game trial, Sleeping Dogs, at maxed-out settings.) Given more time, you may be able to get more from the Radeon R9 Fury X, but, as always, some card samples can be pushed further than others.

At 5 percent above stock, the Fury X delivered a 3DMark Graphics Score of 16,166, or about 3.3 percent better than what we saw at stock speeds. (More on that below.) And in our gaming benchmarks, that overclock translated into an extra frame or two per second at 4K in the titles Tomb Raider, Sleeping Dogs, Bioshock Infinite, and Hitman: Absolution. Not a huge gain, but a material one when you're trying to hit the magic 30 frames per second (fps) at 4K.

Also, note that we were forced to omit our midrange performance numbers (which we run, typically, at 2,560x1,600) because the AMD card's lack of a DVI-out port meant we couldn't connect our trusty, long-standing Dell 30-inch monitor to get results for that resolution. (The many adapters we tried with this card wouldn't power that screen at its full native resolution.) Our 4K test screens all supported 2,560x1,440 pixels (1440p) under AMD's provided beta Catalyst driver, but with a 10 percent pixel difference between those two resolutions, the numbers weren't comparable. And with only a few days to test the Radeon R9 Fury X card before its debut, we didn't have time to go back and re-test all the other cards at 1440p. So, if 2,560x1,600 performance is what interests you most about this card, know that it will place somewhere between our 1080p and 4K results. (We were able to get 2,560x1,600 numbers in our Unigine tests, which let us force a custom resolution; we just couldn't do that in the commercial test games below.)
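
For the curious, the pixel-count gap between those two resolutions is easy to verify (a quick sketch, not part of our benchmark methodology):

```python
# Pixel counts for the two "midrange" test resolutions discussed above.
wqxga = 2560 * 1600   # our usual midrange resolution: 4,096,000 pixels
qhd   = 2560 * 1440   # what the 4K screens supported:  3,686,400 pixels

print(f"2,560x1,600: {wqxga:,} pixels")
print(f"2,560x1,440: {qhd:,} pixels")
print(f"1440p renders {100 * (1 - qhd / wqxga):.0f}% fewer pixels")  # ~10%
```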

That said, 4K is really what this card is all about, and why you'd pay the premium for it in the first place.

3DMark (Fire Strike)

We started off our testing with Futuremark's 2013 version of 3DMark, specifically the suite's Fire Strike subtest. Fire Strike is a synthetic test designed to measure overall gaming performance potential, and here we shall let the bars tell the story…

Image: AMD Radeon R9 Fury X (3DMark Fire Strike).jpg

Particularly in the Graphics Subscore, which isolates our testbed's graphics hardware, the AMD Radeon R9 Fury X delivered a healthy 40-percent-plus boost over the last-generation Radeon R9 290X. But AMD's newest card still landed about 6 percent behind the Nvidia GeForce GTX 980 Ti and about 10 percent behind the GeForce GTX Titan X. While that's respectable for the Radeon R9 Fury X, it looks like Nvidia may be holding on to its high-end performance crown.

Heaven 4.0

Our Heaven benchmark test, developed by Unigine, is not strictly a game, but a heady DirectX 11 workout that displays a complex, game-like graphics scenario.

Image: AMD Radeon R9 Fury X (Heaven).jpg

Here, the Radeon R9 Fury X lagged far behind Nvidia's competing cards at 1080p, but it was much closer when stepping up to 4K, the resolution this card was made for.

Aliens vs. Predator

Moving on to the less-demanding older DirectX 11 title Aliens vs. Predator, the relative results were similar, although the frame rates were higher overall…

Image: AMD Radeon R9 Fury X (Aliens vs Predator).jpg

At 3,840x2,160 resolution, the GeForce GTX 980 Ti landed just slightly behind the Radeon R9 Fury X here, and the AMD card was more competitive at 1080p, as well. Then again, this is a fairly old benchmark. As we move to newer titles, we'll get a better sense of how these cards push pixels in newer game engines.

Tomb Raider

Here, we fired up the 2013 reboot of the classic title Tomb Raider, testing at two levels of detail and three resolutions. ("Ultimate" is a tougher workout than "Ultra.")

Image: AMD Radeon R9 Fury X (Tomb Raider Ultimate).jpg Image: AMD Radeon R9 Fury X (Tomb Raider Ultra).jpg

On this test, the Radeon R9 Fury X managed to roughly match or best Nvidia's top single-chip cards at 4K, while still lagging behind a bit at 1080p. It seems that despite the Fury X card having only 4GB of memory, the HBM tech helps AMD's card more than it hurts it at high resolutions.

Unigine Valley

Next up was Unigine's Valley benchmark test. Valley, like Unigine's Heaven, is not a game, but a graphical workout that's a taxing measure of DirectX 11 prowess.

Image: AMD Radeon R9 Fury X (Valley).jpg

Here, the GeForce GTX 980 Ti held on to its edge at 1080p. But at 4K, the Radeon R9 Fury X got within a frame of its Team Green rival. And the dual-GPU Radeon R9 295X2 delivered the best performance here by far. Given that it's currently available for just $10 more than the Radeon R9 Fury X, it's still worth considering if you're willing to deal with the multi-GPU driver, game-support, and frame-time issues inherent to dual-GPU graphics setups.

Sleeping Dogs

Next, we rolled out the very demanding real-world gaming benchmark test built into the title Sleeping Dogs…

Image: AMD Radeon R9 Fury X (Sleeping Dogs).jpg

The Radeon R9 Fury X looked its best thus far on this test, pulling nearly even with the GeForce GTX 980 Ti at 1080p, and a couple frames ahead at 4K. But Sleeping Dogs traditionally favors AMD cards, so this result isn't that surprising.

Bioshock Infinite

The popular title Bioshock Infinite isn't overly demanding, as recent games go, but it has stellar good looks. In its built-in benchmark program, we set the graphics level to the highest preset (Ultra+DDOF)…

Image: AMD Radeon R9 Fury X (Bioshock Infinite).jpg

Bioshock also often favors AMD cards. But here, the Nvidia cards showed a slight edge at 4K. Still, performance is close enough at both resolutions that the Nvidia and AMD flagships are essentially tied.

Metro: Last Light

Next, we ran the benchmark test built into the very demanding game Metro: Last Light. We used the Very High preset at each resolution…

Image: AMD Radeon R9 Fury X (Metro - Last Light).jpg

This game is a tough one, and it generally favors Nvidia cards over AMD's. But the Radeon R9 Fury X looked good here, matching the GeForce GTX 980 Ti at 4K and slightly outpacing it at 1080p.

Hitman: Absolution

Last up was Hitman: Absolution, another recent game that's hard on a video card. Nvidia's GeForce GTX 980 Ti continued to impress here at 1080p…

Image: AMD Radeon R9 Fury X (Hitman - Absolution).jpg

...but once again on this test, the Radeon R9 Fury X did slightly better at 4K.

Even so, just as we noted about the GeForce GTX 980 Ti and GeForce GTX Titan X when we reviewed them, as powerful as this card is, for all-out 4K fun, you'll still likely have to dial back a couple of in-game settings, possibly lowering the anti-aliasing level, if you want to keep frame rates smooth in the most demanding games. And if you want to hit 60fps at 4K, you're definitely still going to need a dual-GPU card or a multi-card CrossFire or SLI setup. If you're going in that direction, the Radeon R9 295X2 is hard to argue against, unless your case is too small to handle a large power supply and an extra radiator for cooling. For multicard use, the Radeon R9 Fury X and its radiator are a tougher choice, simply because of the need to mount two radiators and route their separate cooling loops.

Image: AMD Radeon R9 Fury X (Intro) 450L.jpg

Conclusion

On the one hand, AMD deserves plenty of credit for crafting the Radeon R9 Fury X. It's the first card to feature high-bandwidth memory, a technology that will only grow in importance as virtual reality becomes a major part of gaming. With VR, we'll shift to larger textures and resolutions beyond 4K for increased photorealism in virtual worlds where pixels float mere inches from our eyeballs.

The move to HBM has also allowed for a substantially smaller card design than we're used to seeing in flagship parts. But the necessity of liquid cooling for this model goes a long way toward negating the convenience of a smaller card—even if it does help the Radeon R9 Fury X run much, much quieter than the Radeon R9 290X.

Had this card launched before the GeForce GTX 980 Ti (and therefore been pitted against the GeForce GTX Titan X, which we bet was the plan, given the "X" tagged onto the end of the Fury name), it would have been an impressive bump in performance for the price, versus Nvidia's $1,000 card. And that value proposition has in recent years been one of AMD's key selling points.

But with the similarly performing GeForce GTX 980 Ti selling at the same $649 as the Radeon R9 Fury X, AMD's new flagship merely matches, rather than exceeds, the pixel-pushing power that Nvidia now delivers in the same price range. And while AMD and Cooler Master did a great job crafting the external radiator and cooler, that's simply not something you have to deal with when opting for Nvidia's top cards.

Image: AMD Radeon R9 Fury X (Intro) 450R.jpg

It also remains to be seen how either of these cards will handle DirectX 12 games, which should start to arrive fairly soon, given the imminent launch of Windows 10. We'll know whether either card gains an edge there once DX12 games and benchmarks start arriving, likely later in 2015.

Those building in compact PC cases that aren't long enough to accommodate a 10.5-inch Nvidia card may want to wait to see what the air-cooled Radeon R9 Fury and the even more compact Radeon R9 Nano bring to the gaming table in the next few months. But the Radeon R9 Fury X lacks HDMI 2.0 support for 60Hz 4K gaming on HDTVs, and indications point to the other Fury cards lacking it as well, which will limit their appeal as the basis for media PCs and living-room rigs.

As good as the Radeon R9 Fury X is, without a price drop, it's hard to recommend this card over the GeForce GTX 980 Ti unless you have no plans to plug your PC into a 4K TV and you're considering buying (or have already purchased) a FreeSync monitor rather than one based on Nvidia's competing G-Sync tech for smooth gaming. Either adaptive-sync technology will mean less screen tearing and judder, but 4K TVs still offer a compelling alternative for low-cost big-screen gaming.

About Matt Safford

Matt is a self-described Net nerd, gadget geek, and general connoisseur of off-kilter culture. A graduate of the first class of the CUNY Graduate School of Journalism, his work has appeared in Popular Science, Consumer Reports, Smithsonian, and elsewhere in the ether. You'll often find him writing while walking on his treadmill desk, surrounded by heaps of consumer tech. (But really, he prefers the low-tech scenery of the Scottish Highlands and the hills of Japan.)
