i7 Processor

Which i3 has it got exactly? What did you do to confirm compatibility?

Those CPUs are not cheap BTW
 
How do you know the laptop is compatible with these processors? Can you please post a link?

Because with laptops it's usually not advised to replace the processor yourself!
And as SURoot already mentioned, these processors are costly ;)
 
Did it literally say "you can put processor X in this", or was the i7 just available as a purchase option? When you buy the model with an i7, it may come with slightly different hardware underneath... hence the request to know which i3 you have now :)
 
Apart from bragging rights, why would you want to upgrade? Intel's laptop processors are (I think) BGA (Ball Grid Array), which means they're soldered directly to the motherboard and all but impossible to change.
 
Some laptops have non-replaceable, soldered-in CPUs; they're not that common anymore, but some manufacturers still solder them to the mobo. If yours is socketed, though, it wouldn't be hard to change: lift the metal retention arm next to the socket, take the old CPU out, and drop in the new one. Simple.
 
I have an i7 question: do most or all of these use UEFI these days? We've been promised a "quantum leap" BIOS replacement "RSN" for as long as I've had a PC, and EFI/UEFI seems to be running late as well.

I have a new i7 Xeon machine on the way, and a box of 3TB drives that the OEM swears will work, but they can't tell me if their systems use UEFI to boot or not. Any thoughts?
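
FWIW, my worry with the 3TB drives is the old MBR ~2 TB limit: they're fine as GPT data disks either way, it's booting from one that generally wants UEFI. Once the box arrives I can at least check how it actually booted from a Linux live USB, since the kernel only exposes /sys/firmware/efi when it was started by UEFI firmware. A minimal check in Python (Linux-only, and just a sketch):

Code:
# Quick check (Linux only): /sys/firmware/efi exists only when the kernel
# was booted by UEFI firmware; under legacy BIOS the directory is absent.
import os

def booted_via_uefi() -> bool:
    return os.path.isdir("/sys/firmware/efi")

print("Booted via UEFI" if booted_via_uefi() else "Legacy BIOS boot (or not Linux)")

That only tells you how the running system booted, not everything the firmware can do, but it settles the "does this box actually use UEFI" question.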
 
Some laptops have non-replaceable, soldered-in CPUs; they're not that common anymore, but some manufacturers still solder them to the mobo. If yours is socketed, though, it wouldn't be hard to change: lift the metal retention arm next to the socket, take the old CPU out, and drop in the new one. Simple.
Simple if the laptop's BIOS supports the replacement CPU. That's not a given!
 
Simple if the laptop's BIOS supports the replacement CPU. That's not a given!

Haha, forgot to mention that :p I didn't take the software side into consideration. Check the make and model of the motherboard, then check the manufacturer's website to see whether there's a newer BIOS file that supports the CPU you bought, and flash that BIOS. After that, the BIOS should/might support other CPUs.
 
The whole thing sounds way too complicated and risky to me!

Does the OP really need the extra grunt of an i7? What's he doing? Climate modelling? :)
 
Does the OP really need the extra grunt of an i7? What's he doing? Climate modelling? :)
If he (or she) is, a single i7 isn't going to be the answer. NOAA just recently got a new AMD-based Cray supercomputer to take over its climate modeling.

[Image: Cray XK6 supercomputer]

At the 30th annual Fermilab Severe Weather Seminar a few years ago, I had a nice chat with the guy in charge of these beasts. One of the best take-aways I got from talking with him was that even supercomputer operators are constantly scrambling to keep up with Moore's Law. :laugh:
 
A couple of years ago some university in the UK tried a different approach and had apps you could download which kicked in when your computer was idle in order to process small subsets of climate data.

The idea was to create a virtual supercomputer on t'interweb.

I tried getting involved, but the software was pants, and after 6 months of it thrashing my CPU and then erroring out before producing any results, I gave up.

Nice theory, though.
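
The "kick in when the computer's idle" bit is conceptually simple enough, even if the real volunteer-computing clients are far more involved. Something like this toy sketch in Python (the work-unit functions are made-up stubs, not anything from the actual project, and os.getloadavg() is Unix-only):

Code:
# Toy idle-time worker: only crunch a work unit when the machine looks idle.
# fetch_work_unit/process/submit_result are made-up stubs so the loop runs at all.
import os
import time

IDLE_LOAD_THRESHOLD = 0.5    # 1-minute load average below this counts as "idle"
CHECK_INTERVAL_SECS = 60

def fetch_work_unit():
    return list(range(10_000))              # stub: pretend we downloaded a data chunk

def process(work):
    return sum(x * x for x in work)         # stub: pretend this is the climate maths

def submit_result(result):
    print("would upload:", result)          # stub: pretend we sent it back

def machine_is_idle() -> bool:
    one_minute_load, _, _ = os.getloadavg() # Unix-only
    return one_minute_load < IDLE_LOAD_THRESHOLD

while True:
    if machine_is_idle():
        submit_result(process(fetch_work_unit()))
    time.sleep(CHECK_INTERVAL_SECS)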

As for Moore's law, I think that's pretty much over, no? Don't we need something like quantum computing (and by extension, quantum theory to actually mean multiple universes - I read that somewhere, don't quote me :)) to work for that to continue?
 
Yeah, distributed computing comes in very handy for scientific research that has lots of willing participants, like Folding@Home or chasing space aliens. (:rolleyes: to the latter.)

The thing about atmospheric modeling is that the various agencies must publish their models and forecasts by a certain time (at least once a day) no matter what. So they have no choice but to buy all the CPUs that they'll need, so they'll have 100% of them available all the time.

Moore's Law is still alive and kicking for now. Clever chipmakers keep on finding ways to push back the quantum limits just enough to keep on improving. When (if?) they finally do hit a wall, they'll just go back to multi-chip modules like in olden times.

These days we're locked into a vicious cycle of endless upgrades. The chipmakers must keep on improving in order to keep on making money! They're literally "too important to fail", so we have no choice but to stick with the path of endless upgrades, whether we really need them or not.
 
..
These days we're locked into a vicious cycle of endless upgrades. The chipmakers must keep on improving in order to keep on making money! They're literally "too important to fail", so we have no choice but to stick with the path of endless upgrades, whether we really need them or not.

Very true

Of course, the regular punter can - maybe should? - opt out of the upgrade cycle, at least for things like home PCs.

Don't know about you, but other'n for playing games, my last laptop - which was over 7 years old when it finally gave up the ghost a year ago - was plenty fast enough for regular home computing - MS Office stuff, bit of browsing / downloading podcasts, ripping CDs (maybe not recently), editing the odd photo etc.

It's even a while since I last came across a business process that was actually CPU bound - network, I/O, disk space, sure but rarely CPU. Even when things take hours to run, you look on the server and the CPUs are barely breaking a sweat. And I've worked on some pretty big things, in telcos for instance.
 
Of course, the regular punter can - maybe should? - opt out of the upgrade cycle, at least for things like home PCs.
Absolutely! I'm not advocating pointless consumerism at all. I'm just saying that "more features" isn't likely to stop driving new computer sales. As a Linux guy I'm all about using older machines and not missing out on anything.

It's even a while since I last came across a business process that was actually CPU bound - network, I/O, disk space, sure but rarely CPU. Even when things take hours to run, you look on the server and the CPUs are barely breaking a sweat. And I've worked on some pretty big things, in telcos for instance.
My current business processes include video transcoding, which is very much CPU/GPU-bound. My 6-10 year old PCs are still plenty good for everyday use, but when it comes to making H.264 files out of MPEG-2 files, I need to leave a whole machine doing nothing but that one task for the better part of a working day. And when I do, it's booked up solid!
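
For the curious, each file is basically one ffmpeg invocation and the batch job is just a loop around it. Roughly like this sketch in Python (folder names and quality settings are illustrative, not my actual pipeline; assumes an ffmpeg build with libx264 on the PATH):

Code:
# Rough sketch: batch-transcode MPEG-2 captures to H.264/AAC with ffmpeg.
# Folder names and quality settings are illustrative, not a real production setup.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("captures")      # hypothetical folder of .mpg recordings
OUTPUT_DIR = Path("transcoded")
OUTPUT_DIR.mkdir(exist_ok=True)

for src in sorted(SOURCE_DIR.glob("*.mpg")):
    dst = OUTPUT_DIR / (src.stem + ".mp4")
    subprocess.run(
        ["ffmpeg", "-i", str(src),
         "-c:v", "libx264", "-crf", "20", "-preset", "medium",  # H.264 video
         "-c:a", "aac", "-b:a", "192k",                         # AAC audio
         str(dst)],
        check=True,    # abort the batch if a transcode fails
    )

The preset is most of why it eats a machine for the day: slower x264 presets trade CPU time for better compression.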

During my IT career I found myself in situations where the bulk of our desktop boxes were woefully inadequate, and really had to struggle to run the software they were required to run. OTOH I had a generous budget inside the data center. So I found myself offloading CPU cycles from the desktops to the servers by designing web applications that ran mostly on the server side.

Yes, I've been in plenty of data centers where the costly server machines were underutilized in a way that would be a crime if I had my 'druthers. IME that's usually a political-funding and lack-of-engineering thing. I can only hope that someone noticed when I told them they were wasting money by the truckload, and either hired an actual professional engineer to make IT spending decisions (I sure wasn't going to step into that trap without a hefty raise!), or just stopped IT spending altogether until they figured out what they really needed.
 