Well, well, well. It looks like a judge in the US is finally asking the right questions and perhaps coming to the same conclusion that many in the press and in consumer advocacy circles reached some time ago: the MPAA and the RIAA have been using the US judicial system as little more than a collection agency. The judge in question is Judge Bernard Zimmerman of the Northern District of California. While reviewing a case filed there (On The Cheap, LLC vs Does 1-5011), Judge Zimmerman began to suspect that this blanket BitTorrent suit might be little more than a fishing expedition for easy money.
With this in mind, the judge asked the lead attorney, Ira M. Siegel, to reveal how much he has made from settlement threats sent through the court system. Mr. Siegel failed to respond on time and then refused to provide the information the judge requested (a move that would get most people thrown in jail for contempt). Instead, Mr. Siegel chose to bash the Electronic Frontier Foundation and a couple of others for good measure.
Beyond the monetary issue, Judge Zimmerman also felt there was a jurisdictional problem. You see, Mr. Siegel and the plaintiff are both based in Southern California, yet chose to file the suit in Northern California. That seems very odd; however, Mr. Siegel argues that because of the way BitTorrent works, anyone in a swarm falls under nationwide jurisdiction. Judge Zimmerman appears to feel differently.
Now the question is what Judge Zimmerman will do. If he dismisses the case based on the failure to respond, the cycle will simply continue. This is very likely what Mr. Siegel would like to happen: it would take the scrutiny off him for a while and let him pick up where he left off. If Judge Zimmerman finds him in contempt, fines him, and tosses him in jail along with a nice complaint to the Bar, things could be very different. It could set a precedent in these cases and perhaps even open the door to further appeals in some of them. We hope that since Judge Zimmerman was smart enough to recognize the scam in the first place, he will see the second one and take the appropriate action. Let’s face it: most of these suits are nothing more than extortion with the US court system’s approval, and while it is perfectly reasonable to protect intellectual property, it is not right by any means to abuse the system the way the MPAA and RIAA have done.
Source TorrentFreak
Discuss in our Forum
About two years ago, roughly when the ZuneHD hit the market with the first Tegra inside, nVidia CEO Jen-Hsun Huang made a prediction of sorts. He stated that he envisioned a time when the GPU was no longer nVidia's bread and butter; instead he saw the mobile CPU and the SoC (system on chip) as the wave of the future. Of course he could not get away from his graphical legacy, so his vision also included an nVidia GPU (or two) alongside the mobile CPU. At the time the press largely overlooked the story; it just was not that newsworthy. After all, the Tegra had only one well-known design win, the ZuneHD (there were others, but many never reached the market). Arguably it was (and still is) a great product; it was just marketed VERY poorly and was going head to head with the greatest show on Earth: the Apple marketing team.
Jump forward to today and we find the Tegra and the Tegra 2 in many devices. In fact, one of the best-selling Android tablets on the market today, the Galaxy Tab 10.1, has a Tegra 2 dual-core SoC inside, followed by devices like the Asus Eee Pad Transformer and others. nVidia really has come a very long way in the smartphone and tablet market. Of course it still has Apple to contend with (along with Apple's legal and marketing teams), so the battle is not won just yet. However, what we are seeing is Huang’s vision coming to pass: nVidia just might find itself earning more than 50% of its income from the tiny SoC rather than its high-end GPUs.
Still, the road ahead is not completely clear. nVidia will face competition from Qualcomm (which just bought BigFoot Networks), as well as Samsung and, to a lesser degree, Apple in this new market. As for Intel, Huang says nVidia is not worried because the Atom is not an ARM CPU and is not even “speaking the same language”. He feels that lower-cost ARM-based tablets will be more attractive to consumers looking for a small and light system. With the advent of Windows 8 for ARM, people will also gain the ability to move back and forth between ARM and x86 while keeping things on almost the same platform. This will help bring the more “desktop centric” consumers into the fold, especially with the prospect of a quad-core ARM CPU running Windows 8 on the horizon.
It is when companies have to innovate to survive that some of the coolest things arise. I wonder what we will see from Tegra in the near future, and what lessons from Tegra nVidia will take to other departments to help improve them.
Source CNET
Discuss in our Forum
It seems like companies are determined to revisit old ideas these days. We see VMWare trying to recreate a wheel that was pioneered by Citrix, Apple is always redoing an old idea and presenting it like a new concept, and now we see nVidia going down a road that has been traveled more than once before. The road in question is external video devices; not monitors or splitters or anything like that, but external video cards. This is something that has been done before and did not go over all that well. I can remember when people were buying PCMCIA cards for use with video editing software. These would work for a while, but the cards would often die (and be replaced), or the inconvenience of using them would become so great that we would end up building a desktop system to replace the laptop they had just bought.
Still, if the word from Fudzilla is to be believed, this is something that nVidia will be producing, and they are even excited about the prospect. We think this will go the same way that AMD's external graphics has gone. It is a VERY niche product, one that not only has a limited number of partners (Sony) but also a very limited market vertical. The problem is that most of the markets where this would be desired already have mobile systems with impressive discrete graphics all on their own. If this really is a direction that nVidia is going down, they will have a rough road ahead.
Source Fudzilla
Discuss in our Forum
Not all that long ago we talked about HP’s decision to pretty much kill off its WebOS platform and, with it, the HP TouchPad. Right after those announcements we saw HP TouchPad prices drop to $100 in some cases, and there was a rush as people ran out to get them. Well now, thanks to precentral.net, we are finding out a little more about what HP has in mind. It seems that the WebOS development team is heading to the Office of Strategic Technology while the hardware side gets left with the Personal Systems Group.
Of course these moves lead to interesting questions. With WebOS going to the Office of Strategic Technology, will it end up being licensed out as a product in its own right? Will we see the new owners of the TouchPad hardware (and the rest of the Personal Systems Group) buy this software from HP’s core business?
We do have some insight into the whys of this move, but the final outcome is not clear. HP needed to do this to help limit some of the liabilities that come from selling both hardware and software (which it owns and licenses). But what on earth does HP plan to do once this is all over? We know it does not have the moxie to go head to head with IBM (which appears to be who it is modeling itself after), so why the shift and the dropping of the consumer side of the company?
We will continue to see if we can find more solid answers than what is available at the moment. If you want to read the documents that appear to say this is going to happen, click the source link below.
Source precentral.net
Discuss in our Forum
Back in the 1940s there was a technological breakthrough that allowed the computers we know today to be born: the mating of a computing processor with a memory component to store data. This simple discovery helped to bring about the PC era. Now, 71 years later, the world of quantum computing has had the same breakthrough, as a group of University of California scientists managed to combine a quantum processor and memory. For those of you not familiar with quantum computing, here is a very simplified explanation. A normal computer has transistors that function in two states, on or off, which represent 1 and 0 in binary code. The speed of the computer is determined (in very basic terms) by how fast these transistors can switch on and off (and pass that information along). In quantum computing, each qubit can be both on and off, holding the values 1 and 0 at the same time. This means you can actually process significantly more information in the same time (and with the same number of computational components).
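To make the "1 and 0 at the same time" idea a little more concrete, here is a tiny, purely illustrative Python sketch (ours, not from the researchers) that models a single qubit as a pair of amplitudes and then "measures" it:

```python
import numpy as np

# A classical bit holds exactly one of two values.
classical_bit = 1

# A qubit is described by two amplitudes, one for |0> and one for |1>,
# with |a|^2 + |b|^2 = 1. Equal amplitudes mean an equal superposition.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

# Measuring collapses the qubit to 0 or 1 with probabilities |a|^2 and |b|^2.
probabilities = np.abs(qubit) ** 2          # -> [0.5, 0.5]
outcome = np.random.choice([0, 1], p=probabilities)

print("probabilities:", probabilities, "measured:", outcome)
```

Two such qubits track four amplitudes at once, three track eight, and so on, which is where the "more information in the same time" claim comes from.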
Now, before you get all excited about dropping this into your next home computer, we should tell you that these types of computers are a little pricey. Right now the least expensive one goes for something like $10 million. The reason for this is that a quantum computer is not easy to make. To get true quantum computing you can do a few things (none of which are easy). You can suspend ions or other atoms in a magnetic field, or you can use conventional computer circuits made using familiar lithographic methods. The latter have to be cooled to almost absolute zero, though (-273.15°C; sorry, phase-change guys), which requires quite a bit of extra power (not to mention space). Once you reduce the temperature to that level, components exhibit new behaviors (called quantum effects) such as superconductivity. This is what allows the switch to be both on and off at the same time. To put it in more technical terms: as a system approaches absolute zero its kinetic and thermal energy all but vanish, and it settles into its zero-point energy state (the energy of its ground state). It is thermal energy that causes much of the leakage in a traditional CPU circuit and prevents it from operating in this superconducting state. The measure of that thermal disorder is called entropy; super cooling lowers the entropy and allows for faster processing (more current and higher speeds).
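As a rough aside (our own gloss, not something from the research itself), "zero-point energy" has a textbook form for the simplest quantum system, the harmonic oscillator: even in the ground state the energy never drops all the way to zero:

```latex
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad
E_0 = \tfrac{1}{2}\hbar\omega > 0
```

In other words, cooling removes the thermal energy piled on top of the ground state; it does not remove the ground-state energy itself.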
Now, quantum computing itself is nothing new; researchers have been at it for at least the last 10 years. The cool thing about today's news is that this is the first time anyone has built a quantum computer that works on the same architectural principle as a modern-day computer, with the memory linked to the processor. Who knows: with the introduction of graphene, nanotubes, and more advanced methods of cooling and of moving power through a circuit, we may see advances that start pushing the limits of quantum computing in the next 5-10 years. I wonder how well it will run Crysis (9).
Discuss in our Forum