FiringSquad Rumor Patrol: Apple, NVIDIA
August 23, 2008 R21
Summary: FiringSquad's top secret division looks at technology rumors floating around the 'net. In this round: NVIDIA and Apple!
NVIDIA's CPU ambitions
NVIDIA set to launch x86 CPU?
Just a month ago, an article at DigiTimes suggested that NVIDIA was getting out of the chipset business. Now, in the days before NVIDIA's NVISION expo, The Inquirer is reporting that NVIDIA is set to launch a new x86-compatible CPU.
Impression: The rumor, as reported, is wishful thinking, but R21 isn’t counting it out.
NVIDIA faces challenges from both Intel and AMD. Intel already dominates the CPU industry and has made clear its intention to bring its graphics technology up to the same level as NVIDIA's and AMD's. Likewise, with ATI absorbed into AMD, that company is less and less dependent on NVIDIA for high-performance motherboard chipsets. How can NVIDIA survive? Make its own CPU. If Intel can make GPUs, then certainly NVIDIA's own engineering wizards can make a decent CPU too, right?
The best rumors are the ones that almost make sense. Developing a CPU to counter AMD's and Intel's growing independence would be a great way for NVIDIA to stay relevant. AMD has already shown with its current Radeon line-up that it remains a formidable competitor, and Intel's Larrabee shows the outside-the-box thinking Intel is bringing to the graphics world.
NVIDIA certainly has the capabilities. With the acquisition of Stexar, NVIDIA gained access to Randy Steck, who led the development of Intel's Pentium Pro, Pentium II, Pentium III, and Pentium 4; Gary Brown, program manager of the Northwood and Willamette Pentium 4s; and Darrell Boggs, a lead architect on the Pentium 4. Naysayers claim that Intel's manufacturing capabilities cannot be challenged by a fabless company such as NVIDIA. That's hardly true: should NVIDIA develop an x86 CPU, 45nm fab capacity can be found at IBM or TSMC.
The R21 Analysis
We would be surprised if NVIDIA didn't have an x86 project; in fact, we're sure it has one. The question is whether NVIDIA is planning to create a flagship gaming CPU and how far along it actually is. Well-run technology organizations stay competitive by combining forward-looking research with tight product cycles that ensure continuous growth. If NVIDIA announces a desktop CPU next week at NVISION capable of replacing our Core 2 Quads, it will have to be so far ahead of any plausible development schedule that it approaches the realm of impossibility.
Behind every rumor is a grain of truth.
An announcement of an x86 processor at NVISION still strikes us as unlikely, but there are two key areas where we suspect NVIDIA does have the technology and the desire to launch x86 processors: CUDA and SoC.
Compute Unified Device Architecture
CUDA is NVIDIA's brand name for general-purpose computing on GPUs, and it is absolutely part of NVIDIA's core vision for the next decade. CUDA has its roots in NVIDIA Gelato, NVIDIA's offline rendering technology that allows Hollywood production-quality renders to be calculated on the GPU. Gelato renders have made their way into movies such as Resident Evil: Extinction, and the same visual effects studio is working on John Woo's Red Cliff.
CUDA takes the idea of using the GPU to perform complicated math at non-real-time speeds further by making the GPU fully programmable. Based upon the PathScale C compiler, CUDA lets developers take advantage of the number-crunching capabilities of today's gaming GPUs. Going beyond Folding@Home, real-world uses of CUDA include molecular dynamics, MRI processing, weather simulation, and seismic databases. Compared to traditional CPU models, GPU-based solutions can provide stratospheric increases in performance given the right application.
While CUDA will only work for the right applications, there are still limitations in the current platform, including the lack of recursion and the bandwidth/latency limits between the CPU and GPU. This is where the addition of on-chip x86 capabilities would be helpful. The C compiler is already based upon the x86 architecture, so the software development process is "simplified" (as opposed to going with a different instruction set). Just as important, these x86 processors would simply perform all of the "housekeeping" operations, leaving the heavy number crunching to the GPU, "simplifying" the hardware side of things (as opposed to developing a full-fledged FPU to go with the CPU). "Simplifying" is in quotes because this is no easy task, and while we believe that NVIDIA will include some x86 capabilities in the future to extend CUDA, we don't think it's ready for NVISION '08.
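To make that division of labor concrete, here's a minimal sketch of the textbook SAXPY example in CUDA C (our own illustration, not NVIDIA sample code): the CPU does the housekeeping, the GPU does the math, and every byte of data has to cross the PCIe bus twice.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// The "number crunching": each GPU thread updates one element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                // one million floats
    const size_t bytes = n * sizeof(float);

    // The "housekeeping": the CPU allocates and stages the data.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);

    // Every input crosses the PCIe bus -- the CPU<->GPU
    // bandwidth/latency limit discussed above.
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // ...and the results cross back over the same bus.
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);         // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Put some x86 cores on the same silicon as the GPU and those two cudaMemcpy round trips, the biggest tax on small workloads, start to disappear.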
“System on a Chip”
Let's step back from the scientific computing side of things and look at the second area where NVIDIA is going to be using x86 technology: a system on a chip. While Intel's Larrabee GPU and next-generation Core i7 CPU have caught the attention of most technophiles, the Intel Atom is a CPU that also deserves attention. Even with Intel's fabrication capabilities, reports suggest that demand for the Atom is exceeding supply. And though this first-generation CPU has primarily been found in netbooks such as the Asus Eee PC, Intel's vision for the Atom includes use in smartphones and CE devices. That's because, at 1.6GHz, the Atom offers performance roughly equal to a 1GHz Pentium III at an amazing 2W TDP.
A few caveats, though. Intel has been pairing the Atom with the older 945GC chipset, and as a result an 8W AMD Athlon 64 2000+ (1GHz) paired with AMD's 780G actually offers better performance and economy. This is one area where NVIDIA has the potential to compete; all it needs is a CPU that's competitive with the Atom.
Enter Transmeta, a company that has broadly licensed its technology to NVIDIA.
Remember the Crusoe and the Efficeon? Exactly. Either you don’t remember these because they were 2004-era products, or you do, and remember that these CPUs were too slow to compete against the standard AMD and Intel CPUs of the day. However, a look back at Transmeta’s performance brings up a lot of interesting points.
Intel’s own presentation slides had the 1.6GHz Atom reaching 126-130 on the Embedded Microprocessor Benchmark, with hyperthreaded versions reaching 172 points. The Transmeta 1GHz Efficeon hit 137 points. The Efficeon had a TDP of 3W at 1GHz. In 2004. At 90 nm. So not only did the Transmeta CPU offer Intel Atom performance, it offered nearly the same power envelope.
Oh, by the way: the Efficeon used a HyperTransport interconnect, was manufactured by TSMC, and used the NVIDIA nForce3 as its reference platform.
It turns out that thermal issues meant the Efficeon would throttle down, and the integrated memory controller wasn't that fast. But considering the improvements in manufacturing processes, the improvements in NVIDIA's chipset platforms, and today's market for low-power CPUs, the opportunity is ripe for NVIDIA to launch an x86 platform built around Transmeta's original work. They could launch a multi-chip platform today and be competitive with the Intel Atom, but we suspect the ultimate goal will be SoC capabilities.
x86 from NVIDIA?
Absolutely. It makes sense in at least two areas of NVIDIA's active focus, and the company has both the technical resources and the economic incentive.
x86 from NVIDIA next week?
Publicly? R21 doubts it. NVIDIA would have to be way ahead of schedule.
Apple's September Surprise
Apple CFO Peter Oppenheimer dropped this hint on the company's most recent earnings call:
"We have some investments in front of us that I can't discuss with you today where we are going to be delivering state-of-the-art new products that our competitors just aren't going to be able to match and as a result, I would see gross margins being about 30%, and that's all I can tell you at this point."
He’s just told R21 as much as we need to know. First of all, Apple’s profit margins:
Original Core 2 Duo iMac: 44%
iPhone 3G: 56%
iPod touch: 90%
The MacBook/MacBook Pro are already the best-selling notebooks on college campuses, and most of the rumors floating around the 'net have centered on multi-touch screens or some fancy glass trackpad with a display along the lines of the Optimus keyboard. While the MacBook Pro is certainly due for an upgrade, we find both rumors unlikely.
As cool as multi-touch is, a multi-touch screen along the lines of Microsoft Surface has no role on a laptop at this time. Gamers are always going for higher-DPI mice due to ergonomics: you can move your cursor further for a given physical movement without giving up the micro-precision needed for aiming. A multi-touch laptop wouldn't make sense because it requires you to move your hand too much for normal activities. Would pinching a window on the monitor be easier than just clicking the minimize button? No. It works on the iPhone because you're dealing with a smaller physical space.
A tablet PC "gets away" with asking the user to move their hand a lot because the 1:1 stylus makes digital art easier (where you do want the ability to move your hand across a large surface area) and because the applications in which tablet PCs are helpful (point of sale, anything strictly menu-driven) don't require a significant amount of input. That is to say, a multi-touch screen offers too little user benefit for a substantial increase in cost. The artists who want a tablet Mac can go third party with a Wacom Cintiq, and the rest of us will enjoy our cheaper laptops.
Multi-touch is awesome for a trackpad, though. If you're pointing at an image and use the pinch gesture to zoom in or out, it's faster because you don't have to lift a finger, and you get finer control over the amount of zoom than a scroll wheel or keyboard shortcut offers. You actually get an improved user experience with a multi-touch trackpad, and that's why Apple includes it on the MacBook Air and MacBook Pro.
What about the display-in-a-trackpad, then? Also bogus. The OLED-based Optimus keyboard looks cool, but it offers very little advantage. This isn't like Microsoft's SideShow technology, which lets you see and do things with the notebook closed. Can you name any instance when it's advantageous to have a low-resolution display at the trackpad instead of displaying that exact same information on the main monitor itself? Some sort of status monitor? Some sort of contextual menu? Would that justify the added cost and increased warranty repairs? Hardly. Some sort of digital loupe? The viewing angles would be off.
What about a docking trackpad that can be removed and act as a standalone iPod touch? No way. You're talking about the company that has convinced consumers to buy a new monitor every time they buy a new PC (the iMac). And what would happen if you misplaced the trackpad and wanted to use your computer?
What about a quick launcher that gave you instant access to a few key applications? Maybe, but it would hardly be a killer app.
I will be shocked if Apple includes a screen-in-a-trackpad in the next MacBook. Either they've come up with some application so cool that R21 hasn't thought of it, or the utility is so small that Apple's interface designers have completely ignored the concept of making useful applications.
Behind every rumor is a grain of truth.
The glass trackpad is the silliest idea R21 has ever heard. It's cool for the Apple fanboys, but it just doesn't make sense as a laptop trackpad because there's never a time when you have an open notebook and want a second low-resolution screen that's obscured by your fingers and viewed askew. So how did this rumor get started? Our guess is manufacturing. Suppose Apple is ordering a bunch of these glass touch panels from contract manufacturers in Taiwan and China. What could it be using them for? What product could use that technology in an Apple-friendly way?
A remote control for a TV? No way; margins there are even less than 30%. Try a digital camera.
The "Apple digital camera" is a rumor that has been brought up before. People will point to the Apple QuickTake, and 2002 brought rumors of a prototype Apple digital camera with an iPod-like 5GB hard drive. This time, R21 really believes it can happen.
The R21 Analysis
Apple isn't always the first to market with a technology, but they're often the ones to do it right, the iPod and iPhone being two quintessential examples. (They've also screwed up: MobileMe being the best example.)
Apple’s success is largely due to Steve Jobs’s “benevolent dictatorship” where every product ultimately must meet his standard. There are countless Apple concept products that are developed but never released because they don’t meet the standard. Importantly, Steve Jobs isn’t a programmer or a hardware engineer – he’s a business leader with an eye for design, an understanding of the market, and the ability to hire and inspire the best out of his engineers and designers.
R21 doesn't know if there really was an Apple digital camera in the works in 2002, but if such a product existed, it is clear why it wouldn't have made sense. While storage capacity was a major complaint of photographers at the time, the technology did not exist in 2002 for Apple to offer a digital camera with a 5GB drive while maintaining any reasonable battery life or form factor. Could they have done something with a Microdrive? Perhaps, but what would an Apple digital camera in 2002 have offered over a competing Sony, Canon, or Nikon camera?
Fast forward to 2008. The technology has advanced, and today Apple is the second-largest consumer of flash memory in the world. iPhones ship in 8 and 16GB capacities, more than what a typical consumer carries in CompactFlash or SD cards for their digital camera. Apple's buying power also means they get better pricing than their competitors, making it hard for Canon, Nikon, and Sony to compete at the same level.
In addition, Apple's photography technology has improved considerably since 2002. Today, Apple Aperture, iPhoto, and Shake show that the software side of Apple understands color management and color processing. Apple has the technical know-how to take off-the-shelf imaging sensors and lenses and develop a point-and-shoot with image quality competitive with the big three.
Apple's secret sauce will come in the form of the multi-touch interface and MobileMe. A digital camera with an iPhone-esque multi-touch display will offer a great user experience, and MobileMe/iTunes will provide a mechanism for easy archiving, sharing, and printing of photos taken by such a camera.
Furthermore, the market for digital cameras is considerably larger in 2008, and a recent survey by Reevoo.com says that the digital camera is the #1 most baffling gadget in Britain, ahead of the GPS and the mobile phone, with more than 25% of owners having no idea how to use their camera properly! Everyone in the US knows you sound smarter when you speak in a British accent, and if the British are confused, the trend likely exists worldwide. This is an opportunity for Apple's human-computer interface design specialists.
Finally, with that much memory on board, what about including iPod capabilities in the camera? Going from the 90% profit margin of the iPod touch to an "iPod Camera" makes sense.
What about the 30% clue? Well, digital SLRs offer a profit margin of 20-30%, while compact cameras have a profit margin of 5-10%. With the "Apple tax" and Apple's flash memory buying power, 30% sounds reasonable to us.
September surprise from Apple?
It's harder to predict what Apple will do, but we're betting on a digital camera. A 30% profit margin makes sense given the market, and Apple has the technology and the incentive to create it. And the more MobileMe subscribers there are, the more likely they'll end up with a Mac, an iPhone, and an iPod Camera.