Sun announces expansive Intel x86 partnership

At a news conference this morning in San Francisco, Sun announced that they are renewing and expanding their long-dormant partnership with Intel. Today’s announcement centered on Sun’s plans to launch server and workstation products based on Intel’s new Woodcrest-based Xeon chips. Sun will continue offering x86 systems based on AMD’s processors, but the focus of the company’s x86 business now appears to be squarely on Intel. For its part, Intel announced that it will begin devoting engineering, design, and marketing resources to Solaris, in a move that adds Sun’s OS to the roster of operating systems with some level of Intel support. Intel will also begin selling Solaris for x86 directly, as part of its whitebox server business. This deal is yet more evidence that Intel has taken the performance lead back from AMD in the server market. When Sun first announced a line of x86 systems based on AMD’s Opteron processor, Intel’s Xeon processors were all still based on the power-hungry and underperforming Netburst architecture. The AMD chips were the only sane x86 choice for Sun from a price/performance perspective, which is why they replaced the Xeons that were originally at the heart of Sun’s first x86 server offerings. Now that the competitive landscape has shifted in favor of Intel, Sun is shifting back to the larger chipmaker in a big way.

The Big Picture: what the Sun partnership means for Intel

The scope of Intel’s partnership with Sun is a bit surprising, but it makes a certain amount of sense in light of the realities of a post-GHz-race world. In a post last year on Microsoft’s new in-house chip design team, I placed the team’s formation in the context of the renewed battle between companies that make only layers of the software-hardware stack (i.e., Microsoft and Intel) and companies that make the “whole widget” (i.e., IBM, Sun, and to a lesser extent Apple).
I then suggested that Microsoft is acquiring more in-house hardware expertise for the same reason that Intel is reaching out to software vendors for help in designing processor performance metrics: both companies recognize that the new multicore paradigm demands unprecedented levels of design-level (not just implementation-level) integration between software and hardware layers, with the result that Microsoft has to get even more savvy about chip design just as Intel has to get even more savvy about application and OS design. As I mentioned above, one way that Intel would like to indirectly bring OS and application writers into the CPU design loop is by asking for their help in designing benchmarks, benchmarks that in turn will provide input into the microprocessor design process. But another way that Intel can beef up its knowledge of and influence on the OS and application layers of the x86 system stack is by partnering very closely with whole widget makers. The Apple deal was the first of these partnerships, and the Sun deal is the second. By getting its engineers involved in Solaris, Java, and other Sun software initiatives, Intel is able to penetrate still deeper into the software layer of a very important, growing market segment—high-end x86 workstations and servers—thereby keeping its finger closely on the pulse of the x86 systems market as it races in new directions. Intel can now boast that it has engineering resources invested to some extent or another in the four major silos of the x86 OS/application stack: Windows, Linux, Mac OS X, and Solaris.

The DRM imbroglio is front and center at Cannes

When I think of Cannes, I think of warm beaches, frou-frou Hollywood soirees, and, of course, the Cannes Film Festival. But Cannes is also home to Midem, a conference aimed at the music industry’s digital aspirations. The conference kicked off last Saturday, and some of the battles taking place there deserve attention. Victoria Shannon filed a story for the International Herald Tribune that’s making the rounds today, and in it she covers the state of the music industry’s love affair with DRM. Her take is that the industry is showing signs of going DRM-free, which is an argument we’ve heard quite a lot of recently. As much as we might hope this is true, there are plenty of reasons to believe that DRM isn’t going anywhere, and we need to look no further than Midem to see why.

The future of music is… licensing?

Larry Kenswil is the President of UMG/eLabs, Universal Music Group’s new media, business development, and advanced technology division. Kenswil is in charge of soothsaying at Universal: he’s the man whose job it is to look into the future and tell us where music is headed, and how Universal can remain a giant amongst the music giants. Thus when Kenswil says that the music industry is about to change, people listen. What’s the change? Kenswil told Midem attendees that it’s time for the industry to stop counting units sold and start focusing on licensing. The future of the industry lies not in the "shrinking box" of traditional music sales but in licensing the usage of music. Kenswil spoke of this mostly in the context of new "user-generated content" sites such as YouTube and MySpace, but let’s not also forget that Universal is banking $1 on each Zune sold as well. It’s no secret that they also want in on the iPod, but it doesn’t stop there.
Universal’s Zune deal is just the first in what we believe will be many attempts to shoehorn revenue streams into consumer electronics devices and ultimately into any pattern of usage they can monetize. To really maximize licensing, the music industry needs refined control of its product, and that’s something that DRM has been explicitly tapped to do. And let’s face it, getting rid of DRM doesn’t help licensing efforts. Licensing approaches, such as we see these days with software, can also scratch a chronic itch. Fritz Attaway of the MPAA said something that hearkens back to my missive on DRM from last week. According to a snapshot of quotes at the conference from PaidContent, Attaway said, "When one consumes a movie by viewing it, there is some obligation to compensate those involved in making it." This sounds fair, but immediately we must ask: just who does he expect to pay and how often? When I "consume" a Coke, I pay for it. When I consume another, I pay for it, too. Is this what Attaway means? What if I share my Coke with my wife? Do we both pay? What if what we’re sharing isn’t a traditional consumable good? 10 people can watch a movie at the same time, but 10 people can’t really share a Coke at once. Attaway doesn’t clarify what he means, and that’s a shame because there has been a long-running debate in the industry over this very question. We already know that some studios are opposed to iTunes DRM because it allows very limited sharing—sharing that is considerably more limited than what is possible with either DVD or VHS today. Yet because none of these technologies can literally count the number of eyeballs watching a movie and send you an invoice, some Hollywood players are only interested in DRM that permits as little sharing as possible. RIAA Boss Mitch Bainwol perhaps said it best: "Technology is the basis of our future." 
It’s what technology—in particular DRM—can do for the industry that has it excited, and what technology can do is create business models. Bainwol’s comments came during an attack on Gary Shapiro, the head of the Consumer Electronics Association. Shapiro argues that DRM is oppressive, that the RIAA and MPAA engage in fearmongering, and that the road to profits is paved with letting the little things (like casual piracy) slide so as not to end up treating your legitimate customers like criminals. Yet according to both Bainwol and Attaway, Shapiro is out to destroy their attempts to monetize their products, with Attaway saying that Shapiro "is trying to enact laws that limit the use of technology to create new business models." Bainwol said that Shapiro’s rhetoric "makes us look like we’re evil." Shapiro’s rejoinder pointed out the obvious: "I don’t make you look evil—your lawsuits against old people around the country make you look evil." So what is the future of music? Perhaps DRM will be abandoned, but when these guys are shrugging off arguments that DRM punishes honest users while doing nothing to stop piracy, I have my doubts.

Rogers to carry iPhone in Canada? Maybe, but…

This week must be the week of post-Macworld rumors. The most recent iPhone-related theory is that Rogers Wireless will be the carrier for the iPhone in our northern neighbor, Canada. The rumors have been loosely floating around for a while now, especially considering that Rogers is currently the only GSM carrier in Canada. One potential iPhone lover started pestering Rogers Wireless himself about whether or not they'd be carrying the iPhone, and got a response from the company: "You were wondering about iPhone… Lots of speculation out there. Beyond the fact that Rogers is the only GSM carrier in Canada, we have not issued any statements as to whether or when the iPhone would be available at Rogers." Of course, this is a typical corporate response to a question that they cannot legally answer yet, but the whole "we're the only GSM carrier in Canada" thing does indicate that they're aware of their position in the market, in that respect. This would kind of make them a shoo-in for the iPhone, unless Apple decides to release another version of the iPhone before they launch in Canada. Considering that Apple will have to do so for some other international markets (such as, say, Asia), it's not outside the realm of possibility that the iPhone will no longer be GSM-only by that time. Okay, for you "no double negatives!" types, that translates to: by that time, the iPhone could be available as CDMA, or even something else. So basically, no one has any idea of whether this rumor has any weight or not. But for now, Canadians can sleep easily thinking that maybe, perhaps, possibly Rogers will be the carrier for the iPhone. For now.

Game Review: Far Cry: Vengeance (Wii)

Most of the time I love what I do. The rest of the time I'm playing Far Cry games. The first PC title was actually pretty good, but since then the consoles have been riddled with bad Far Cry titles, and the abysmal 360 attempt was up there with 25 to Life in games that were painful to play for Opposable Thumbs. There is just no spark to them; I've never been so bored playing a game in my life. Luckily Far Cry: Vengeance on the Wii wasn't boring. It's horrible. To start, when I sat down on my couch, the game said the controller was too far away. That's odd. I adjusted the lights, changed the sensitivity of the Wiimote, and tried again. Still too far away. I put in another game and it was fine. Far Cry? I was too far away. I finally had to stand in front of the television and get uncomfortably close to play the game, not the best way to start things off. I'm not sure what the problem is here, but after reconnecting the sensor bar and trying other games, the issue appears confined to the Far Cry software. I should have stopped there, because things only got worse. Remember what video looked like on the Sega CD? Welcome to the intro video of Far Cry: Vengeance. It's that bad: a pixelly mess that hurts your eyes. Why? Don't tell me the Wii disc couldn't hold actual video; I've seen video in other games. It's almost like they tried to make it look as bad as possible. Once I actually started the game, I began to think my first instinct may have been right: the mangled video was the only thing in the world that could possibly have looked worse than the in-game graphics. We're talking some serious PSone-level graphics here. I was shocked when I first saw them—this is the worst-looking game I've played in a very long time. I've often played games I thought weren't that attractive, but this game was downright ugly. It's like ugly's uglier older sister, Uglita.
Except she was horribly burned in an accident when volatile Nasty-sauce from the Beat-Down corporation spilled all over her face. It's like someone ate a bad port of an already bad game and this is what they shat out. It just doesn't look good, and it wouldn't have looked good last-gen. In fact, there is not enough bourbon in the world to make this game attractive. And I drink a lot of bourbon. This would be forgivable if the gameplay was fun, but it's not. It's a straight-up run-and-gun, and the enemies just kind of wander around until you shoot them. Some of them may get the bright idea to shoot back at you. The control scheme works though, right? Not really. You have to twitch the nunchuk up to jump, and it just doesn't work well in most situations. Aiming works fine, although it's a shame you can't turn the gun on yourself and end the suffering.

Verdict: Skip
Price: $49.99 (shop for this game—but don't say we didn't warn you)
System: Wii
Developer and Publisher: UbiSoft
ESRB Rating: Mature

Other recent minireviews: Wario Ware: Smooth Moves, Turtle Beach Earforce X2 Wireless Headphones, Metal Slug Anthology, Super Columbine Massacre RPG, The Mad Catz Xbox Live Retro Stick

Accountants call out Apple on $1.99 fee

In the next episode of The Young and the 802.11n-less… It appears as if Apple is still getting hassled over its 802.11n enabling fee of $1.99, but not just by the users anymore. Lynn Turner, managing director of research at Glass Lewis & Co. and former chief accountant of the Securities and Exchange Commission, told the Wall Street Journal (subscription) yesterday that Apple's excuse of "generally accepted accounting principles" as the reason for the fee was a bunch of bull. "GAAP doesn't require you to charge squat," Turner said. "You charge whatever you want. GAAP doesn't even remotely address whether or not you charge for a significant functionality change." So why in the world is Apple charging the fee? Surely, they're not trying to squeeze $1.99 out of each of us just for giggles. The WSJ's anonymous sources who are "familiar with the matter" say that Apple felt it needed to charge the fee "based on the accounting outcome that would have resulted had it given the product away." That translates to: because 802.11n did not officially exist at the time when the technology was sold in Apple's computers, the company could not recognize revenue based on that technology, because there was no market price for the enhancement yet. Therefore, if the company wants to recognize revenue for the sale of 802.11n hardware, it needs to charge the fee so that it can later recognize that revenue; otherwise, the company may as well have just given away 802.11n cards. Whew. "If Apple had given the enhancement away free, Apple's auditors could have required it to restate revenue for that period and could possibly have required Apple to start in the future to defer all the revenue from computer sales until all such enhancements are shipped," according to WSJ's source. So basically, the argument is that Apple is doing it because it would be a huge pain in the ass otherwise. I get that.
Despite this, accountants are still getting all riled up by Apple's claim that GAAP required the fee. In addition to Turner, Edward Trott, a member of the Financial Accounting Standards Board (the body that writes the aforementioned rules), said, "No, GAAP doesn't tell you to do anything. You need to work out your transaction with your customer, and GAAP will tell you how to reflect your transaction with that customer." Lesson to Apple: don't piss off the accounting industry to make yourself look less bad. They will call you out on it faster than Steve can say "BOOM!"
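
The deferral tradeoff described above can be sketched in a few lines. This is a toy illustration only; the dollar figures, the straight-line deferral, and the eight-quarter period are all invented for the example, not drawn from Apple's actual books:

```python
# Toy sketch of the revenue-recognition tradeoff described above.
# All numbers are invented for illustration.

def defer_revenue(sale_price: float, periods: int) -> list:
    """Spread a sale's revenue evenly across future reporting periods,
    as auditors might require when a promised feature hasn't shipped."""
    per_period = sale_price / periods
    return [round(per_period, 2) for _ in range(periods)]

def recognized_upfront(sale_price: float, feature_fee: float) -> float:
    """Pricing the unshipped feature separately (e.g., a $1.99 fee)
    lets the rest of the sale be recognized immediately."""
    return round(sale_price - feature_fee, 2)

# A hypothetical $1,000 laptop: deferring everything spreads the revenue
# across, say, eight quarters...
print(defer_revenue(1000.00, 8))
# ...while a separate $1.99 feature fee lets $998.01 be booked right away.
print(recognized_upfront(1000.00, 1.99))
```

The point of the sketch is simply that establishing a separate price for the enhancement shrinks the deferred slice from the whole sale down to $1.99 per unit.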

Microsoft offering free Vista “test drive”

Microsoft recently unveiled a new web site called “Windows Vista Test Drive,” designed to allow business users, consumers, and the merely curious to find out what running Microsoft’s latest operating system might be like without actually having to install it. The web site requires Windows 2000 or XP, Internet Explorer 6 or 7, and Microsoft’s Virtual Machine Remote Control (VMRC) Advanced ActiveX control, which is installed when the user first visits the page—IE 7 dutifully reminds the user of all the inherent dangers of installing ActiveX controls first, but eventually allows the installation. For fun, I tried out the site under Windows Vista itself, and it worked fine. After getting through this process, the user is presented with a web page that mimics Windows Vista’s desktop, although all the options on the Start Menu have been replaced with links to various “exercises” that one can enter to find out more about Vista’s new features. At first, it seems as if this “faux desktop” page is the actual Vista test drive, but in fact it merely serves as a launching pad for the real OS running in a virtual machine. Getting to that stage, however, requires one more security verification process. The page pops up a dialog box that says: “Using NTLM authentication, the client cannot verify that the server ‘vh07.virtuallab.microsoft.com’ has not been impersonated. Although your password is never sent over the Internet, it may still be discovered through an attack on the authentication token sent to this server. Do you wish to continue?” The instructions on the web site say that this is nothing to worry about, and the password that is given out to test drive users is a temporary one.
Finally, the OS lights up in a small (about 640 by 480) screen, after which the user can click on the regular Vista Start Menu and explore various features, guided by a separate information bar on the right that lists common tasks, such as using Vista’s new search features. As expected from a virtual machine running over the Internet, display performance is extremely slow. One can see the screen slowly “painting” from top to bottom as new screens and menus appear. However, the virtual disk performance is quite speedy, so programs load without annoying waits. I tried clicking on the Microsoft Excel 2007 icon, and Excel dutifully loaded with the Office Activation Wizard in the foreground. Attempting to activate this VM copy resulted in a communications error, which was quite hilarious. However, I was able to muck about in Excel and enter a few formulas. Microsoft is hoping that making the trial available will help people evaluate the software without resorting to piracy. Last Friday, Microsoft’s Cori Hartje was in New York to discuss how its Genuine Software Initiative (GSI) was proceeding. GSI launched in July 2005 and has spawned such programs as Windows Genuine Advantage, the controversial method by which Windows’ license key is checked against a list of known pirated keys; operating systems that fail the Genuine test are prevented from downloading non-critical Windows updates and other Microsoft programs such as IE 7 and Windows Defender. Providing free test drives of Windows Vista is not likely to make any significant dent in piracy, although it does remove one excuse that some pirates use to justify “sampling” commercial software. More important than any antipiracy efforts, the test drive is a useful marketing tool that many computer companies have used over the years to promote new products. The original Apple Macintosh, for example, was once offered to prospective buyers on a “Test Drive” basis, and ads even showed users sporting leather driving gloves.
That promotion fell flat, as most people who participated in the test drive turned out to be tire-kickers, not buyers. Will the Vista test drive end up the same way? With new computers being sold with Vista prebundled starting next month, it may not matter too much. The Test Drive also serves as a promotion for Microsoft Virtual Server, which is used to deliver the trial over the web.

Retrospect not pining for the fjords, according to Dantz co-founder

Retrospect. The name alone is enough to strike fear into the hearts of many a seasoned Mac OS X admin—or at least set them off on a 20-minute rant. Still, it's been the most popular Mac backup application for over two decades, bundled with hard drives, tape drives, and even Macs themselves at one time or another. It's safe to say that it's a fairly critical piece of software for the platform. And that's what makes this report from The Register a little scary; according to their sources, only minimal staff remain to develop version 8.0 of Retrospect for Windows and Mac following a series of layoffs in EMC's Insignia group, which is largely responsible for Retrospect development. "From what I understand, the new version will not even be released," said one source. "Everyone had such high hopes for the product. It had a polished interface, and I think it would have been a success." However, Dantz's co-founder, Larry Zulch, wants you to know that you don't have to start generating purchase orders for BRU or NetVault just yet. In a message posted to the Retro-Talk mailing list and later reposted by Mac administrator John C. Welch, Zulch claims that the Reg article is blowing things out of proportion: "Don't take any comments about offices changing for more than they are. The team is moving to existing facilities in Pleasanton and have kept a great attitude through disruption and change. I'm not Steve, for better or worse, but I still believe like him in not talking about future releases, but I will say that new versions are coming." So there you… have it, I guess? Granted, it's not exactly unequivocal either way, but longtime Retrospect users have been down this road before, with updates seemingly coming at the last possible minute of compatibility. It's hard to imagine this software stalwart just fading away for good.

802.11n in all but name: draft hardware in the clear

Late last week, we reported that the IEEE’s 802.11 working group signed off on Draft 2.0 and sent it out to the rest of the IEEE membership for approval. More details have emerged from the working group’s meetings in London, and it’s good news for those looking forward to faster wireless networking. Perhaps most important is the revised timeline. Draft 2.0 will go out to the IEEE membership at the end of this month for approval. Voting should be wrapped up by the end of March. If it is approved as expected, work on Draft 3.0 will move into high gear. By the beginning of June, Draft 3.0 will also be sent out to the membership for approval. Should 75 percent of the membership sign off on it, it will become the basis for the final 802.11n spec. Although 802.11n will not be finalized until October 2008, the month to circle on your calendar is January 2008. That’s when Draft 3.0 is expected to be approved, meaning that equipment made after that point is "true" 802.11n gear in everything but name. Assuming everything goes as planned, we’re still a year out from final ratification.

What about today’s "Draft N" 802.11n hardware?

After Draft 1.0 went out for comment last year, there was a lot of uncertainty over the ratification process. Task Group N received over 12,000 comments from IEEE members and OEMs—over six times the number of comments expected—and over half of the membership was unhappy with it. This time around, just about everybody seems to be on board with Draft 2.0. That’s because Draft 2.0 has something for everyone—low power consumption, increased support for multiple access points, and better handling of VoIP traffic. Most importantly, there’s backwards compatibility for early 802.11n gear that is capable of using only one of Draft 2.0’s two 20MHz channels, although it may operate at lower speeds (but still faster than 802.11g).
The end result is that Draft 2.0 products will be compatible with so-called draft-compliant gear, often with nothing more than a firmware update, as some of the delay in moving the 802.11n spec forward has come from vendors ensuring that "Pre N" gear would be supported. And Draft 3.0 equipment will maintain this compatibility as well. This is also one reason why Apple has waited to turn on the latent 802.11n functionality in its hardware, to find out if the chips they’ve used would be fully compatible with the evolving spec. Now that the question has been answered in the affirmative, Apple can officially enable 802.11n on its Macs, accounting rules be damned. So if you were an early adopter on 802.11n, you should be all set, although you may not get the full 120-200Mbps of performance that 802.11n offers. If you buy gear with Draft 2.0 chipsets, you’re all but certain to reap all the benefits of 802.11n—faster speeds, better coverage, and 50 percent longer range.
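
The speed penalty for single-channel gear comes down to simple arithmetic. As a rough sketch (the per-stream figures below are the commonly cited 802.11n peak signaling rates with the standard 800 ns guard interval, assumed here rather than taken from the article, and real-world throughput is considerably lower):

```python
# Back-of-the-envelope 802.11n PHY rates (peak signaling rates, not
# real-world throughput). Per-stream figures are the commonly cited
# maxima with the standard 800 ns guard interval.
PER_STREAM_MBPS = {20: 65, 40: 135}  # channel width (MHz) -> Mbps per stream

def phy_rate(channel_mhz: int, spatial_streams: int) -> int:
    """Peak PHY rate for a given channel width and MIMO stream count."""
    return PER_STREAM_MBPS[channel_mhz] * spatial_streams

# Early draft gear stuck on a single 20MHz channel, two spatial streams:
print(phy_rate(20, 2))  # 130 -- degraded, but still ahead of 802.11g's 54
# Draft 2.0 gear bonding two 20MHz channels into one 40MHz channel:
print(phy_rate(40, 2))  # 270
```

Even the degraded single-channel case more than doubles 802.11g's 54Mbps peak, which is why the backwards-compatibility story is less painful than it sounds.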

CPI suing FCC to get at real state of broadband competition in the US

The Center for Public Integrity (CPI) wants to find out exactly how competitive the US broadband market is. To do that, it needs access to the raw data collected by the FCC, but the agency has refused to turn it over on the grounds that it could give a competitive advantage to other companies. CPI now finds itself in a District Court battle against the agency, which is being supported by AT&T, Verizon, and the three major industry trade groups: NCTA (cable), CTIA (wireless), and USTA (telephone). CPI wants the FCC database of Form 477 filings. These documents are filed with the FCC by every telecom company in the US, and they give the agency data on each company’s line deployments, broken down by ZIP code (and generally unaudited by the FCC). The FCC then uses this data to generate reports about the state of broadband competition, usually arguing that nothing radical needs to be done. But the agency’s methods for generating these reports have come under scrutiny, and CPI wants to take a look for itself. When talking about broadband deployment, for instance, the FCC says that any particular ZIP code has broadband access if even a single cable or DSL connection exists there. It also classes “broadband” as anything above 200kbps—a woefully low standard for any true broadband connection. The Government Accountability Office (GAO), the federal government’s internal watchdog agency, took the FCC to task (PDF) last May for the way it prepared these reports. The GAO’s own examination of Form 477 data found that the median number of broadband options in a particular ZIP code was two, not eight as the FCC claimed. CPI filed a Freedom of Information Act (FOIA) request with the FCC on August 24. After the statutory 20 business days had passed without any word from the agency, CPI filed suit on September 25, 2006. That apparently got the FCC’s attention; the FOIA request was officially denied the next day.
The matter is now in the hands of a federal judge, and the FCC is trying to have the case dismissed. The agency argues that the material in the reports is confidential business information and that the release of it could damage the companies involved. In a court filing, Alan Feldman of the FCC tells the court how this might work. “For example,” he says, “information about how a company’s number of lines has increased or decreased in a particular area over time provides competitors with insights into how that company is focusing its investment and marketing efforts.” He also notes that most filers requested confidentiality for their data. CPI hopes to add the Form 477 data to its Media Tracker, a web site that shows consumers the available broadband providers, cable operators, television and radio stations, and newspapers in the area.
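
The gap between the FCC's headline numbers and the GAO's median is easy to illustrate. The Form 477-style sample below is invented for the example (the real filings are exactly what CPI is suing to see), but it shows how a "served" count and a median can tell very different stories:

```python
# Sketch of why "served" counts and medians diverge. The sample data
# below is invented; real Form 477 filings are confidential.
from statistics import median

# ZIP code -> number of distinct broadband providers reporting lines there
providers_per_zip = {
    "94103": 8,  # dense urban core
    "10001": 6,
    "61602": 2,
    "59801": 2,
    "82633": 1,  # rural: a single DSL or cable line makes the ZIP "served"
}

# The FCC's method: a ZIP "has broadband" if even one line exists there.
served_zips = sum(1 for n in providers_per_zip.values() if n >= 1)

# The GAO-style question: how many options does the typical ZIP have?
typical_options = median(providers_per_zip.values())

print(served_zips, typical_options)  # 5 2
```

By the first measure, 100 percent of the sample ZIPs "have broadband"; by the second, the typical household has all of two choices.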

Intel jobs page tips GPU plans

In a recent update to their careers page, Intel has pulled back the curtain a little bit more on what is probably one of the worst-kept secrets in the tech industry right now: their plans to build a GPU to compete with NVIDIA and AMD/ATI. The blurb for Intel’s Visual Computing Group goes as follows: “Intel’s Visual Computing Group (VCG) has the mission to establish the future of computing for high-throughput workloads. We are focused on developing discrete graphics products based on a many-core architecture targeting high-end client platforms. Our vision is that the resulting ingredients and technology will extend to mobile clients, servers, and embedded platforms over time. VCG will initially focus on discrete graphics products but will also expand the previous charter to include developing plans for accelerated CPU integration.” There’s not a lot to add, because that pretty much sums it all up. Intel intends to make discrete (i.e., non-integrated, unlike their GMA line) GPUs for everything from workstations down to cell phones. Of course, nobody at NVIDIA or AMD/ATI is really going to fling themselves out of a 20th-floor window in despair upon reading this posting, because this particular cat has been at least halfway out of the bag for quite some time. It has always just been a matter of waiting for the formal announcement. Judging by this careers page, I’d expect some kind of announcement this year. I think the most interesting question that this posting raises is “What took Intel so long?”

Our ads may feature bunny suits, but we make serious chips for serious people

My own sense of why the world’s top supplier of complex, high-performance integrated circuits left companies like NVIDIA, ATI, the erstwhile Voodoo, and others to carry us through the 3D graphics revolution on their own is that it has at least something to do with corporate culture.
One of my favorite Intel anecdotes from undergrad, which I think I’ve told on the site before, is about the time when some Intel engineers came to visit a class in order to talk about what the company does. They told us that they have a testing lab where they test processors for bugs, and this testing lab runs a very sophisticated benchmarking program that really pushes the CPU to the max. The program is also a piece of self-modifying code, which makes it especially useful for turning up bugs. The benchmark program, as it turns out, was id Software’s Doom, and of course we all had a good laugh at the idea that Doom was being played in the hallowed halls of Intel. Intel didn’t do games; they did business. Fast forward to the Pentium III ad campaign, in which we were told that the Pentium III with SSE would “make your Internet faster.” At this point, we all knew quite well that most of the people who were buying top-of-the-line PIII chips were doing so to run games. But still, one got the sense that Intel would rather go out of business than be seen as a supplier of silicon primarily intended for gaming and entertainment. It seemed that the company would go to any lengths to maintain their image as a maker of serious chips for serious people, regardless of what the general public was actually using all that CPU horsepower for. Intel did give discrete graphics a very lackluster try with the i740, but it was clear that their heart wasn’t in it. The i740, which was based on acquired technology, was utterly mediocre, and analysts were mystified that an engineering powerhouse like Intel could let itself be so thoroughly upstaged by much smaller players. Intel obviously had their priorities set, and gaming chips were not among them.
Today’s Intel, with its homepage that’s strangely reminiscent of Apple’s (down to the dancing people with earbuds), may seem so focused on consumer technologies that it’s hard to imagine a time when the company wouldn’t touch gaming with a ten-foot pole. But that stuffy, business-only image is Intel’s past, and the company now realizes that its future definitely encompasses the realm of computer-based entertainment. Intel coveted the Apple contract precisely because of products like the Apple TV, and soon the company will make its first serious foray into full-blown 3D gaming. Real-time 3D graphics on a PC, unlike the general-purpose CPU, is a technology that Intel neither invented nor popularized, but the company is not a total outsider to this realm. In fact, if you were to gauge success strictly in terms of sales volume, Intel is the world’s top supplier of 3D graphics accelerators, in the form of its integrated graphics products. I expect that the pattern Intel follows with discrete graphics could be similar to the one it followed with integrated graphics. The first post-i740 discrete graphics product may not be the king of the benchmark hill, but like the i740 it will be cheap and very widely available. Intel has always been a process-driven company, and they can use their fab muscle to make up for any initial performance problems by keeping prices low. Eventually, they’ll get the design right, too, and at that point they’re going to be a very tough competitor in the price/performance game.
