EagerDragon
Nov 16, 12:53 PM
Previous question: How hard could it be to take advantage of multiple cores?
The first thing is that it depends on what you are starting with. If you have zero code out there, you can come up with a nice design for your program that takes advantage of as many cores as you throw at it. If on the other hand you have large chunks of legacy code that was written in the time of single cores, it may be close to a re-write to fully take advantage of the hardware. In some cases it will be easier to throw the old code away.
But some of it is imagination. If you can look at a problem and the solution you originally came up with, and use your imagination to look at the problem at hand in innovative ways, parts of the program can be re-written to take advantage of the hardware and other parts can be left alone (for the short term). This is an incremental step: you gain X% in one area and little to nothing in another. The key is to determine what your program spends most of its time doing and re-write/re-design that section of the code for the biggest short-term gains.
I remember working in assembler and selecting the correct combination of instructions based on their function and the number of CPU cycles it took to execute each instruction. Sometimes a set of 12 instructions was faster than a different set of 8 instructions at accomplishing the same result. Use your imagination and look at the problem from a different angle. If your brain only sees a number of serialized steps, you won't be able to come up with anything that takes advantage of the hardware.
What you start with (old code) and your imagination can get you there quicker or slower.
Short answer: It depends.
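The advice to find where the program spends most of its time and parallelize that first is essentially Amdahl's law: the serial remainder caps the overall speedup no matter how many cores you add. A minimal sketch (the 80% parallel fraction below is an illustrative assumption, not a measured figure):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only a fraction of the runtime can use extra cores."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# If 80% of the runtime parallelizes cleanly, 8 cores give nowhere near 8x:
print(round(amdahl_speedup(0.80, 8), 2))   # 3.33
# Even with unlimited cores, the speedup is capped at 1 / serial_fraction = 5x.
```

This is why the incremental "X% in one area, little to nothing in another" gains described above flatten out: past a point, the untouched serial code dominates the runtime.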
Mal
Mar 24, 01:43 PM
Apple writes all the drivers for the cards it supports, so that will probably never happen.
Huh? That doesn't make any sense. If Apple's writing drivers for these cards, then doesn't that make the chances of them being supported 100%? Obviously it doesn't indicate that retail (PC) versions would be supported, but I can't make any sense out of your comment.
jW
MacSA
Nov 27, 02:52 PM
meh - does this matter? Isn't 17" getting to be a bit skimpy by any consumer standard?
I'm still using a 15" monitor. :p
PBF
Apr 2, 11:20 PM
In Safari, you can now change the width of a page by moving the cursor to the scrollbar and you see the little "adjust width" icon. Drag that and the width of the page decreases/increases toward the center.
If I understood your explanation/description correctly (which was kinda confusing), then it's been there since DP1, and it's not just the right side, it's all four sides and all four corners, and lastly, it's a system-wide feature, not just Safari's.
Nero Wolfe
Apr 3, 12:29 PM
One thing that's been frustrating me since DP1 is that when you minimize a window into its app icon it's sort of in limbo. Mission Control won't show it, nor does app expose (at least in some apps). Swipe-up on the dock icon (is this app-expose?) does sometimes, but it's mingled with recent files. Add that to the indicator lights being gone and I could have an open app with 20 minimized windows and totally forget about it.
It's not a huge deal, I suppose, but it makes the window management seem broken. I'd hate to go back to using the old-style minimize to the right side of the dock. I never liked that because it mixed those windows with stacks and got messy, plus it stretched and shrank my already-full dock.
Anyway, anyone else bugged by this? Am I missing something? Exposé is the single most important thing to me in OS X; I rely on it to greatly speed my workflow. I like Mission Control but this needs to be addressed. And yes, I filed a bug report for each DP release on this.
macfan881
Jul 18, 03:37 PM
There are plenty of movies I watch more than once, so I would rather buy a movie from iTunes than rent; I have Netflix already, so buying would be a better fit. I think Steve went to rental (if it's true) because of the price difference: he wants $9.99 for a movie and the studios want at least $19.99. Even if they both compromise at $12-14 and maybe throw in some exclusive bonus features, I would still buy movies from iTunes, especially once I get my iPod video.
vincenz
Feb 20, 03:31 PM
http://img130.imageshack.us/img130/4189/p1000762s.jpg
Who's the gal? ;)
aiqw9182
Mar 24, 03:24 PM
No, I don't like Apple to force me to buy Intel.
Cool story bro, would read again. If you want the fusion so badly then buy a PC. No one's forcing you to buy from Apple.
So, this doesn't mean it would be possible to upgrade a 2010 15" MBP's GPU from the INTEL HD graphics to anything different does it??
This INTEL HD sucks really bad...
Why should you care about the IGP in your 2010 15" MBP? You have a discrete GPU(NVIDIA 330M) alongside it that it should automatically switch to while under heavy load.
dmaxdmax
Nov 28, 01:23 PM
Erm... So you're calling a slightly reheated Toshiba Gigabeat-POS with pseudo-WiFi (sure, it may be fully enabled in the future, but with a screen at that resolution it'd be pretty much useless) a "moderately high ante"?
<snip>
And by the way, there's already a "Gates successor", and I'm talking about the CEO title, not the Chairman... Come to think of it, Ballmer is already a "chair-man" of sorts... :D And we all know how smart that guy is. :rolleyes:
What I meant by moderately high ante was the dollars spent, not the product.
I don't think we'll know what MS executives will do when Gates leaves until he's gone. Even Mr. B.
Don't get me wrong - I think the Zune is crap. However it's always foolish to ignore the 800 pound gorilla, even when it's lazy and clueless. They can wake up and buy clues.
slb
Aug 24, 09:26 PM
Core 2s will be nice, but if you've already got a Core-based Mac now, I wouldn't rush to sell it. The Meroms coming out are an "initial" version according to Intel, designed to be pin-compatible as an easy replacement for the Yonahs. But next year, Intel will be releasing a new platform called Santa Rosa that the Meroms are really designed for, which will increase the frontside bus to really take advantage of the speed of the Meroms, as well as include new WiFi and the "Robson" flash technology for fast-booting.
I suspect we'll see slight case redesigns for Santa Rosa-based Macs. Santa Rosa will be the real Core 2 platform. This year's Meroms are a stopgap.
Yamcha
Mar 25, 08:27 PM
I think this is pretty cool, but I agree that quite a lot of the games are fairly low quality, and I don't know how it'll be enjoyable at 1080p. I downloaded Asphalt from the Mac App Store, and honestly it was crap :P. I've tried lots of FPS games on my iPod Touch, and again the experience just isn't the same as you'd get from console or PC gaming.
Apple needs to find a new way to improve the gaming experience, for example by making iPad-compatible joysticks. I know there are some out there (third party), but what I want to see is Apple making one specifically for the iPad.
The problem I've found on my iPod Touch is that fingers take far too much space, at least for games that have a joystick on the screen. Not a good experience, and the same applies to the iPad; sure, we have a much bigger screen, but I still think gamers like to be able to use a real joystick instead of a touch-based one. I know I do.
NebulaClash
Sep 14, 12:03 PM
I think you are a minority of one on this interpretation.
Then you should read the entire thread and see that you are wrong in this thought.
milo
Aug 29, 12:31 PM
ALL desktop machines......
Apple posted their 3rd Quarter 2006 financial results today.
http://www.macrumors.com/pages/2006/07/20060719164004.shtml
That was before the Pro shipped; it's a safe bet that since its release, desktop numbers have gone up. And that's just one quarter. I doubt desktop numbers have been on the decline for the last twelve months.
dguisinger
Nov 28, 02:24 PM
It may not be true that they broke even; it's just something I thought I heard in a TV interview...
Sony is selling the PS3 at a loss as well, Nintendo I'm sure is making money on the Wii...
There was also a lot of buzz for the 360 at launch and after; MS has sold over 15 million Xbox 360s in the last year, so I think they have done pretty well....
I don't think Sony has the best plan, if they did they would have launched earlier, had more units at launch & not be so overpriced...
Actually, I'll make some corrections for you:
Sony is losing $241 (source: iSuppli) on each PS3 at RETAIL pricing. We all know that Sony sells to distributors who sell to retailers, all of whom profit, so if you accept a 30% combined margin you are talking well over $300 loss per console. Their games are also in the $70 range to make up for it.
iSuppli also states that the Xbox 360 premium unit costs $323 to build, $76 less than the retail price. After the channel margins are taken out, Microsoft is breaking even. Microsoft is already a year into things and is about to release a cheaper Xbox 360 using 65nm parts, which will save them even more. All in all, Microsoft is looking fairly good this time around for turning a profit. In fact, in an interview this past week I read that the Entertainment division would have turned a profit this year if it wasn't for the Zune.
As far as # of units sold:
XBox sold 27 million units
Xbox 360 has sold 7 million so far, and Microsoft expects to sell a total of 10 million by year end.
Sony has sold 200,000 units in the US, and won't hit 400,000 at year end.
Wii has sold 400,000 units, and will hit an estimated 4 million by year end.
The Xbox 360 and Wii also both have very high software attach rates (I've bought 5 titles already for my Wii), and Microsoft, I'm sure, is making a killing on Live.
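The "well over $300" figure above follows directly from the channel-margin assumption. A rough sketch of that arithmetic (the $599 premium-PS3 retail price is assumed here; the $241 loss and 30% combined margin are the figures the post itself uses):

```python
retail_price = 599.00    # assumed premium PS3 launch price
loss_at_retail = 241.00  # iSuppli estimate cited in the post
channel_margin = 0.30    # combined distributor/retailer margin (the post's assumption)

build_cost = retail_price + loss_at_retail           # what the iSuppli loss implies the unit costs
sony_revenue = retail_price * (1 - channel_margin)   # what Sony actually receives per unit
loss_per_unit = build_cost - sony_revenue
print(round(loss_per_unit, 2))   # 420.7 -- comfortably "well over $300"
```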
gugy
Sep 1, 12:57 PM
Great,
Thanks for waking me up multimedia:D
I never really cared for the iMac. As a second computer it's great, but I'd rather have a Mac Pro. I'm glad it's possible to do that now.
extraextra
Oct 23, 09:14 AM
Starting to feel about as likely as flying saucers...
http://www.wal9000.aonservers.com/hostedpics/mbp_wanttobelieve.jpg
Lol
I hope it comes out this week, for all those who are waiting.
jwp1964
Sep 7, 09:09 AM
A good idea, just poorly executed.
Actually makes more sense than the system we have now.
Just where would you prefer to live? Make a choice from the list below:
USA, England, Australia, New Zealand, Germany, Japan, Korea (South), most other EU members, OR
China, Cuba, Russia, North Korea.
You've got to be kidding me, unless you actually believe we should all be equally miserable.:p
heffemonkeyman
Sep 7, 12:59 PM
On my lunch break at work, I just downloaded a couple of HD trailers, both 2min30sec in length: one at 480p and the other at 720p. My setup is a 3.0GHz Pentium D, 1GB RAM, 256MB NVIDIA GeForce 6800, 20" Dell digital LCD.
I could tell no difference in file quality. The problem lies in download time. Both files averaged a 150KBps download speed. That's 1.2Mbps if my math is right. The 480p file took 4:28 to dl, translating to 3:34:24 for a 2hr movie. For 720p, it took 12:39, meaning a full movie would take about 10:07:12.
I know my cable provider offers up to 4Mbps downloads for about $120/month, and that's before the cable service itself. Even then it's not dedicated. Most people with cable will opt for their provider's basic service, like $40-50/month for 500-600kbps, or half as fast as my test. The movies would take twice as long to dl. 19hrs to download will not fly. 7hrs may not either.
If the compression works to get a DVD-quality movie down to 1GB, then it could be downloaded in about 1h50min, nearly realtime at work, or 3h40min at home. At work I would only need maybe a 15min buffer before I start watching, and not catch up to the dl. But at home I would need about a 1h40min buffer. Maybe this is acceptable to some, but if I can walk to Wal-Mart or Blockbuster and back in that time, then what's the consumer advantage beyond the novelty?
I'm sure Apple engineers can do these same napkin calculations. There would have to be some alternative to the straight dl. Maybe a torrent of some kind built into iTunes 7. I don't know. Just thinking.
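The napkin math above is easy to reproduce: a 2-hour movie is 48 times the length of a 2:30 trailer, so at a fixed connection speed the download time scales by the same factor. A quick sketch using the post's measured trailer times (the 720p figure comes out to a bit over ten hours at this rate):

```python
TRAILER_SECONDS = 150   # 2 min 30 s trailer
MOVIE_SECONDS = 7200    # 2 h movie

def scale_to_movie(trailer_dl_seconds: float) -> float:
    """Scale a trailer's download time up to a full-length movie at the same speed."""
    return trailer_dl_seconds * (MOVIE_SECONDS / TRAILER_SECONDS)

def hms(seconds: float) -> str:
    """Format a duration in seconds as h:mm:ss."""
    s = int(seconds)
    return f"{s // 3600}:{s % 3600 // 60:02d}:{s % 60:02d}"

print(hms(scale_to_movie(268)))   # 480p trailer took 4:28 (268 s) -> 3:34:24
print(hms(scale_to_movie(759)))   # 720p trailer took 12:39 (759 s) -> 10:07:12
```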
This is a good test, but your connection is not fast enough for this to be viable. If you're only getting 1.2Mbps, that's not going to cut it.
Bandwidth is a huge issue. In my area, Seattle, I can get Comcast cable for about $50/mo and I get a solid 6-8Mbps download. So I can stream anything that is encoded at 6-8Mbps just fine. The 720p trailers are about 4-8Mbps, so it works for me.
I know not everyone can get that kind of bandwidth/price, but they will soon. I think this is where Apple is going, but it's not going to work for everyone. At least not right away. But maybe enough to be profitable?
decimortis
Apr 26, 01:23 PM
Amazon is not a generic term. It is, however, the name of a single river on planet Earth...among a few other names/uses ("the Amazon", "Amazon basin", "Amazon Women").
Where else have you seen/heard the term Amazon in a generic sense? Some examples of a generic term are (at least have been generic over the past 75+ years):
light bulb
door
wood
lock
you forgot windows.....
nioh
Oct 24, 12:57 AM
Here it is! 8 hours early!
you forgot the new and improved hd-screen :D
surroundfan
Sep 6, 06:56 AM
Most of Europe's down. Oz and the US are still up, so a product update's unlikely (I'd guess)...
Macula
Jan 11, 10:28 PM
In colloquial modern Greek, "air" is metaphorically a price premium one pays for hype.
Sinister.
AidenShaw
Aug 26, 11:12 AM
Err... I was arguing that Conroe could fit in the iMac, especially having had the G5 in there.
Could the deciding factor be the noise?
Not arguing about whether a Conroe would fit in the iMacIntel case - but wondering whether the extra heat would result in extra noise from the cooling fans.
The iMacIntel doesn't have to be as fast as it possibly can, especially since the New Form-Factor Conroe Mini-Tower/Home-Theatre Mac™ will be there for people who want a bit more power without the size and cost of the maxi-tower ProMacIntel.
m-dogg
Aug 29, 09:03 AM
This is the lowest end machine Apple makes. Let's be realistic. This is a reasonable update for the base model. And it's probably being done in advance of a Core 2 Duo update to the iMac.