Saturday, 14 May 2011
AMD Athlon II X4 640 with Thuban Core Spotted, Is Fully Unlockable
Shortly after the launch of its first six-core processors based on the Thuban core, AMD revealed that it planned to bring the new chip design into the less powerful Athlon II X4 processor range, and it seems that day has finally come, as a Thuban-based Athlon II X4 640 has been spotted online. These new Athlon II X4 processors were first mentioned in a CPU support list for Biostar motherboards that appeared in January, according to CPU-World.
The processor was identified by the part number ADX640WFK42GR and, despite the different OPN, featured the same specs as the current Athlon II X4 640: four processing cores running at 3GHz, paired with 2MB of L2 cache.
This seemed to indicate that the CPU was actually based on the Zosma core, which is essentially a Thuban with two of its cores and its 6MB of L3 cache disabled.
This supposition was later confirmed by a user on the Overclock.net forum who managed to get his hands on such a processor.
Furthermore, the user found that the two disabled cores, as well as the 6MB of Level 3 cache memory, can be enabled, with CPU-Z then identifying the chip as a Phenom II X6 1405T.
AMD Athlon II X4 640 Thuban-based CPU unlocked to Phenom II X6
While nobody can guarantee that the unlock procedure will be successful for every Zosma-based Athlon II X4 640, AMD processors have a pretty positive track record when it comes to core unlocking.
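For readers who want to double-check that such an unlock actually took effect, the number of cores the operating system sees can also be verified from software. The snippet below is only a minimal sketch that assumes a Linux system and parses /proc/cpuinfo; on Windows, tools such as CPU-Z report the same information.

```python
# Minimal sketch (assumes Linux): count the cores the OS reports by parsing
# /proc/cpuinfo, e.g. to confirm that previously disabled cores are now active.
from collections import defaultdict

def count_cores(path="/proc/cpuinfo"):
    cores_per_package = defaultdict(set)  # physical id -> set of core ids
    logical = 0
    phys_id = None
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "processor":
                logical += 1
            elif key == "physical id":
                phys_id = value
            elif key == "core id" and phys_id is not None:
                cores_per_package[phys_id].add(value)
    physical = sum(len(ids) for ids in cores_per_package.values()) or logical
    return physical, logical

if __name__ == "__main__":
    physical, logical = count_cores()
    print(f"{physical} physical cores, {logical} logical processors")
```

If the unlock succeeded, a chip like the one described above should report six cores instead of four.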
Sadly, the biggest problem right now is getting hold of such a chip, as the Athlon II X4 640 "GR" is an OEM-only part, which means it isn't available for purchase at retail.
In addition, the CPU hasn't been spotted in any desktop machine built by top-tier OEMs, which makes tracking it down even harder.
MSI Readies GTX 580 Lightning Xtreme Edition with 3GB VRAM
MSI is getting ready to add yet another high-performance graphics card based on Nvidia's GTX 580 core, one that will pack no less than 3GB of video buffer. The card is expected to make its appearance at the end of May. The information was posted on the Chip Hell forum, in a thread started by the site's owner, who promised that a review of the MSI GTX 580 Lightning Xtreme Edition will be published by the end of the month.
While the post was scarce on details, it did mention that the card will pack 3GB of video buffer memory, compared to the 1.5GB used by the stock version of the GTX 580.
The increased amount of memory should help the card when running multi-monitor games in SLI, or in titles such as Metro 2033 or Shogun 2 that use high-resolution textures.
As this is a Lightning series model, the card is expected to come factory overclocked and to sport a series of features targeting enthusiast users.
An image was also posted in the same thread, depicting the back of a graphics card that is allegedly the GTX 580 Lightning Xtreme Edition.
MSI GTX 580 Lightning Xtreme Edition graphics card
The card in question is marked as the N580GTX Lightning and appears to use MSI's Twin Frozr III cooling system, but it comes with an aluminum backplate, which isn't a standard fitting on the regular GTX 580 Lightning.
Furthermore, the four NEC Tokin proadlizers appear to have been moved from their regular spot, and the same goes for some of the DIP switches on the back. Other details are not visible in the cropped image.
As mentioned earlier, the MSI-built GTX 580 Lightning Xtreme Edition is expected to be announced at the end of May, at a price that has yet to be disclosed.
One Can't Make a Mistake with Today's Graphics, Game Maker Says
With all the new video cards on today's market, one would think it would be hard for end users to decide which to buy, but a certain game developer says that such fears are unnecessary, considering that pretty much every modern board with any claim to prowess will be more than adequate. If there is any word that can describe today's graphics card market in relation to the video game industry, “ahead” would probably qualify.
Basically, the rivalry between NVIDIA and Advanced Micro Devices led to fairly rapid development of next-generation models.
In fact, both companies are already at their second DirectX 11-capable iterations of add-in-boards.
Granted, high-detail game titles do have playing modes that can draw upon the hidden reserves of high-end models, like multi-monitor options at very high resolutions.
Still, the fact is that any mainstream, not to mention high-end, adapter can handle pretty much everything just fine, even at Full HD (1,920 x 1,080 pixels) resolution.
As such, John Carmack, the well-known lead programmer at id Software (the developer behind such titles as Doom, Quake and Wolfenstein), reached the conclusion that choosing a video board nowadays is a matter of taste and brand preference, not capability.
Although AMD boards do perform better than their NVIDIA counterparts in benchmarks, NVIDIA, the Santa Clara, California-based GPU maker, makes up for it, to some extent at least, by having a “stronger dev-relations team,” leading to closer ties with game makers and, thus, optimizations for its hardware.
“You almost cannot make a bad decision with graphics cards nowadays. Any of the add-in cards from AMD or Nvidia are all insanely powerful,” said John Carmack.
On the matter of integrated graphics, Carmack actually had good words, saying that, while there probably won't be much in the way of real, high-quality game support for a year or so, it probably won't take more than five years for CPUs with built-in GPUs to reach a level where they can cope with truly visually demanding applications.
New Ring from Genius Is Really a Wireless Mouse
Some might have thought that there isn't much new that can happen on the peripheral market, but Genius most likely proved such assumptions wrong when it created the Ring Mouse. The thing about computers and other electronics is that, even when they are equipped with touch input, a solid peripheral still enhances ease of use.
Nevertheless, it isn't exactly practical to carry around a mouse just so one can plug it into their laptop, tablet or whatever else once they settle down somewhere to play or browse the net, especially when traveling light.
With that in mind, Genius came up with the idea of a mouse that is as easy to carry around as any other accessory.
More specifically, the outfit created the Ring Mouse which, as its name implies, is fashioned in the shape of a ring.
Indeed, the small item is meant to be worn on one's finger and, thanks to the bundled ioMedia software, it allows one to enjoy the use of multimedia playback controls.
This makes web browsing, playing videos in Windows Media Player, and using Windows Picture and Fax Viewer or Adobe Reader something of a breeze.
In order to let the Ring Mouse communicate with pretty much any system or sophisticated consumer electronics device of today, it was given a 2.4 GHz wireless radio link.
The range is 33 feet (about 10 meters) and relies on a Pico USB receiver plugged into the device being controlled. Using touch-based technology, it can scroll with a movement sensitivity of up to 1,000 DPI.
The newcomer will make it easy for anyone to control web browsing, photo viewing, multimedia playback and presentations, among other things, from afar.
Shipments should already be underway, for a price of $70. The package includes a portable USB charger (for the built-in lithium ion battery) and a hard carrying case, for when owners don't want the ring adorning their finger.
Emt and Akasa Release USB 3.0 Cards with Four Ports
Native USB 3.0 support may no longer be just a dream, but some might still decide they could do with a few extra connectors, or want to give some to systems without the interface, so Emt and Akasa put VLI's four-port controller to work. The SuperSpeed USB 3.0 interface standard promises a bandwidth of up to 5 Gbps, ten times that of USB 2.0.
As such, it is not surprising to learn of the existence of USB 3.0 add-on cards, especially with how the standard relied on third-party controller chips for quite a while.
It is this very sort of add-in-board that Emt and Akasa have developed, based on a single VLI host controller.
In the past, there have been such things as implementations that used four Fresco Logic FL1000 single-port USB 3.0 controllers, plus a PCI Express bridge chip, among other things.
The two new four-port VLI USB 3.0 cards, however, need only a single VLI chip for all four connectors to be functional.
Granted, this does mean that all ports share a single PCI Express 2.0 lane, so they cannot all run at full speed at once.
Fortunately, this won't really be much of a problem, since it should only interfere with transfers when four individual SATA 6 Gbps SSDs are connected and tasked with copying data all at the same time.
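As a rough back-of-the-envelope check (assuming a single PCIe 2.0 x1 link and the 8b/10b encoding that both PCIe 2.0 and SuperSpeed USB use at the physical layer), the numbers work out roughly as follows:

```python
# Rough bandwidth budget for a four-port USB 3.0 card sitting on one PCIe 2.0 x1
# lane. Uses nominal line rates and 8b/10b encoding only; real-world throughput
# is lower still because of protocol overhead.

PCIE2_LANE_GTPS = 5.0         # PCIe 2.0: 5 GT/s per lane
USB3_LINE_GBPS = 5.0          # USB 3.0 (SuperSpeed): 5 Gbps signaling rate
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b encoding on both links

pcie_payload_gbps = PCIE2_LANE_GTPS * ENCODING_EFFICIENCY     # ~4 Gbps (~500 MB/s)
usb_port_payload_gbps = USB3_LINE_GBPS * ENCODING_EFFICIENCY  # ~4 Gbps per port

ports_busy = 4  # e.g. four fast SSDs copying data simultaneously
per_port_share_gbps = pcie_payload_gbps / ports_busy

print(f"PCIe 2.0 x1 payload budget: ~{pcie_payload_gbps:.1f} Gbps "
      f"(~{pcie_payload_gbps * 1000 / 8:.0f} MB/s)")
print(f"One USB 3.0 port can carry: ~{usb_port_payload_gbps:.1f} Gbps")
print(f"With {ports_busy} ports busy, each gets ~{per_port_share_gbps:.1f} Gbps "
      f"(~{per_port_share_gbps * 1000 / 8:.0f} MB/s)")
```

In other words, the shared lane only becomes a limiting factor when several ports are moving data flat out at the same time.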
Needless to say, the above sort of situation doesn't usually arise, so Akasa's AK-PCCU3-03 and Emt's own model should do just fine.
Emt USB 3.0 card
Emt's model hasn't really been detailed well, but it differs from the other one through its port arrangement and PCB.
More specifically, it is colored red and has all four ports on the back, while Akasa's product, colored blue, has just three, with the fourth one available internally.
The only disadvantage of the VLI USB 3.0 host controller is that it has yet to pass USB-IF certification tests. Then again, many mainboards use similarly unrecognized ASMedia and Etron chips, so this shouldn't prove too much of an issue.
Intel Sandy Bridge Pentium CPUs Get Reviewed and Benchmarked
In just a few days' time, on May 22 to be more exact, Intel will introduce the first Pentium processors based on the Sandy Bridge architecture, and two of these upcoming CPUs were just reviewed by a Chinese publication, which also included a great number of benchmarks. The two chips reviewed are the Intel Pentium G840 and the Pentium G620; both feature two processing cores, 3MB of Level 3 cache memory and a TDP of 65W.
The processors also include an on-die graphics core, a cut-back version of the HD 2000, whose speed ranges from 850MHz to 1100MHz depending on GPU load.
Intel Sandy Bridge based Pentium processor benchmark - StarCraft II
Compared to the HD 2000 GPU that is used in the more powerful Intel Core processors, the Pentium HD Graphics unit loses support for InTru 3D and high-speed video synchronization, but the good news is that Intel left ClearVideo HD intact.
The performance of the two Sandy Bridge Pentium CPUs was compared against that of the Intel Core i3 540 and the AMD Phenom II X2 555, the latter paired with an 880GX motherboard.
Surprisingly, the graphics core of the Pentium CPUs can more than hold its own against the integrated GPU found inside the Core i3-2100, as the performance differences recorded in the benchmarks were only slight.
Intel Sandy Bridge based Pentium processor benchmark - CineBench
However, the same cannot be said about the CPU portion of the review, where the Core i3-2100 managed to gain a clear lead, ranging from 5% all the way up to 31% in applications that can take advantage of Intel's Hyper-Threading technology, which the Pentium chips lack.
Gaming with a discrete graphics card also favors the Core i3 CPU, whose lead can extend to 59%.