KnightWRX
Apr 7, 09:36 AM
You make it seem like intel told apple they can't use the sb chips unless they use the IGP, which is obviously false.
It's not false per se, at least not 100%. Graphics in such systems are usually IGPs, but before the Core iX line of processors, anyone could license and build chipsets for Intel's CPUs and include a different IGP than Intel's own. Intel refused to license this for the new processors, including the SB line, and thus nVidia, who had been making chipsets, could not produce an IGP for the new platform.
So yes, essentially Intel told Apple they had to use the HD 3000 as an IGP, where before Apple was using nVidia's tech. There was even a massive lawsuit about all of this between Intel and nVidia, which ended with nVidia stepping out of the chipset business altogether.
So the poster you were replying to wasn't entirely wrong. It is in fact a testament to how badly Intel handled all of this that an old MBA with a 320M outpaces new SB machines, with much more powerful CPUs, in graphics performance.
Snowy_River
Jul 28, 03:26 PM
Dan=='s mockup is something that I had considered before; I remember talking about it with Yvan 256 at some point as something like "the return of the Cube." I think it's a pretty good design. The guts of the Mini are so packed as it is that an expanded case would allow for a substantial upgrade in components, including the oft-clamored-for dedicated GPU.
Another way Apple could do it is just to elongate the Mini's case to make it just as svelte vertically, only slightly wider. Could you take a run at that one, Dan==? ;)
Okay, I did some tinkering myself, just for kicks, and here's what I came up with. I thought that we were talking about a computer that was somewhere between a Mac Mini and a Mac Pro (Power Mac), so I thought, maybe the style should be a combination of the two. Let me know what you think.
It's not a Mac Plus... It's a Mac++!
http://www.ghwphoto.com/Mac++1.PNGhttp://www.ghwphoto.com/Mac++2.PNG
gnasher729
Mar 22, 01:38 PM
You are the funniest poster on here. Thanks for the entertainment. (Not sure if it's your intent, but thanks anyway.)
Here's what he doesn't realise: Every product has both a price and a value. In the case of the iPhone, Apple has left a lot of space for others to undercut it on price. And many people will go for something that is cheaper, even when it doesn't have quite the value. But as we can see now, Apple hasn't left any margin with the iPad for competitors to undercut it on price. If the iPad had started at around $1000, as had been suggested originally, then Samsung would be able to sell lots and lots of tablets for $499. But the iPad starts at $499. Samsung could sell lots and lots of tablets for $249 or $299, but they can't build them for that price. The reason why none of these tablets are cheaper than the iPad is that they just can't build them cheaper.
For the same price, people are going to buy the original and not a cheap copy. So they will buy and continue buying the iPad. And the iPad is the one that you know will be around next year, unlike others.
janstett
Oct 23, 11:44 AM
Unfortunately not many multithreaded apps - yet. For a long time most of the multi-threaded apps were just a select few pro level things. 3D/Visualization software, CAD, database systems, etc.. Those of us who had multiprocessor systems bought them because we had a specific software in mind or group of software applications that could take advantage of multiple processors. As current CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind as many applications need to have serious re-writes done in order to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper Threading) implementation that essentially simulated dual-cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on this and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would, don't know why the software industry doubted. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
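To make that point concrete, here's a minimal sketch (hypothetical task names, just an illustration): four independent CPU-bound jobs spread across a process pool, the same way four single-threaded apps spread across a quad-core machine under a proper multitasking OS.

```python
from multiprocessing import Pool

def cpu_bound_job(n):
    # Stand-in for one independent CPU-intensive task (e.g. ripping one CD).
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    # Each job runs in its own process, so the OS can schedule them
    # on separate cores; no shared state, no threading required.
    with Pool(processes=4) as pool:
        results = pool.map(cpu_bound_job, [100_000] * 4)
    print(len(results))
```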
The reality is that to take advantage of multiple cores, you had to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyperthreading). I had a dual-processor 486 running NT 3.5 circa 1995. It's just been more of an optional "cool trick" to write threaded applications that the timid programmer avoids. Also it's worth noting that it's possible to go overboard with excessive threading and that leads to problems (context switching, thrashing, synchronization, etc).
Now, on the Mac side, OS 9 and below couldn't properly support SMP and it required a hacked version of the OS and a special version of the application. So the history of the Mac world has been, until recently with OSX, to avoid threading and multiprocessing unless specially called for and then at great pain to do so.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
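A minimal sketch of that pattern (no real GUI here; a polling loop stands in for the window-message pump):

```python
import threading
import queue

def long_calculation(done: queue.Queue) -> None:
    # Runs on a worker thread so the GUI thread never blocks.
    result = sum(i * i for i in range(100_000))
    done.put(result)  # signal the main thread that we're finished

done = queue.Queue()
threading.Thread(target=long_calculation, args=(done,), daemon=True).start()

# The main thread stays free to "process messages" while it waits.
while True:
    try:
        result = done.get(timeout=0.01)  # poll for the completion signal
        break
    except queue.Empty:
        pass  # a real GUI would pump its window messages here

print(result)
```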
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
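Where the work does decompose, the usual trick is splitting the input into independent chunks. A hypothetical sketch (a toy checksum, not actual video encoding) of fanning chunks out to worker processes:

```python
from concurrent.futures import ProcessPoolExecutor

def checksum(chunk: bytes) -> int:
    # Stand-in for per-chunk work, like encoding one run of macroblocks.
    return sum(chunk) % 65536

def parallel_checksum(data: bytes, n_workers: int = 4) -> int:
    # Split the input into independent chunks, roughly one per worker.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(checksum, chunks)
    # The combine step only works because this checksum is associative;
    # a video encoder needs a real stitching pass instead.
    return sum(partials) % 65536

if __name__ == "__main__":
    data = bytes(range(256)) * 100
    assert parallel_checksum(data) == checksum(data)
```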
Slumpey
Apr 8, 06:53 AM
To avoid any hassle, buy the iPad 2 from Target. I was able to buy two on different days using their inventory tracker found on the web. They don't hassle you with service plans, unnecessary accessories, etc., which Best Buy does.
Sent from my HTC Incredible using Tapatalk
BlizzardBomb
Jul 27, 02:00 PM
Well, it's back to the future for all of us. Remember when the Mac was going 64-bit with the introduction of the G5 PowerMac on June 23, 2003? :rolleyes: Only more than three years later and we're doing it all over again, thanks to Yonah's 7-month retrograde.
What difference does it make if virtually no consumer software is affected by 64-bit processors, even now?
LordJohnWhorfin
Nov 28, 06:57 PM
If Apple pays Universal to compensate it for their losses due to iPod users being pirates, I will make sure I only procure pirate copies of Universal music and movies, since Universal has already been compensated. No need for them to get paid twice.
kenypowa
Apr 27, 08:19 AM
Wow. That's surprising. This whole time people downplayed it because there was no evidence that Apple was actually transmitting this data. It wasn't a big deal because the db file was local only. Now that Apple has addressed it, they had to admit not only that the file exists but that they actually were transmitting data.
Ah well, still not a big deal. :p
It was never a big deal. Either you are holding it wrong or there is a misunderstanding. Apple never makes mistakes, didn't you get the memo? ;)
kingtj
Sep 13, 12:33 PM
He's totally mistaken! The Cloverton CPUs will *all* be 64-bit, as Woodcrest (found in current Mac Pros) is. Intel is never going back to a 32-bit Xeon-class CPU.
The difference between Woodcrest and "Tigerton" is that Woodcrest CPUs achieve their "dual core" status by basically placing two complete Xeon CPUs under one outer casing, and making them communicate with each other through the front-side bus on the motherboard.
Cloverton will be the same way, but with 4 cores packed into one casing, instead of just two.
"Tigerton" will finally allow both cores to interconnect with each other through an internal interface built into the CPU, instead of slowing communications down by routing it off one CPU core, through the motherboard's front-side bus, and back onto the other core.
This was his response:
"Cloverton is not 64-bit, Cloverton MP (Tigerton) is 64-bit and is still on the drawing board, last I heard."
Becordial
Apr 27, 08:25 AM
I think the patch to iOS is a good response.
Clearing the log file when you switch off location services is a good response, as is shortening how long the data is stored overall.
I hope it still does fast triangulation when necessary (there is a benefit to that), but the record-keeping part is basically a non-issue anymore, because the cache is regularly flushed.
janstett
Sep 15, 08:26 AM
And of course, NT started as a reimplementation of VMS for a failed Intel RISC CPU...
More pedantic details for those who are interested... :)
NT actually started as OS/2 3.0. Its lead architect was OS guru Dave Cutler, who is famous for architecting VMS for DEC, and naturally that design influenced NT. And the N-10 ("N-Ten", where the "NT" name comes from) Intel RISC processor was never intended to be a mainstream product; Dave Cutler insisted that the development team NOT use an x86 processor, to make sure they would have no excuse to fall back on legacy code or thinking. In fact, the N-10 build that was the team's default work environment was never intended to leave the Microsoft campus. Over its life, NT has run on x86, DEC Alpha, MIPS, PowerPC, Itanium, and x64.
IBM and Microsoft worked together on OS/2 1.0 from 1985-1989. Much maligned, it did suck because it was targeted at the 286, not the 386, but it did break new ground: preemptive multitasking and an advanced GUI (Presentation Manager). By 1989 they wanted to move on to something that would take advantage of the 386's 32-bit architecture, flat memory model, and virtual machine support. Simultaneously they started OS/2 2.0 (extending the current 16-bit code to a 16/32-bit hybrid) and OS/2 3.0 (a ground-up, platform-independent version). When Windows 3.0 took off in 1990, Microsoft had second thoughts and eventually broke with IBM. OS/2 3.0 became Windows NT; in the first days of the split, NT still had OS/2 Presentation Manager APIs for its GUI. They ripped that out and created the Win32 APIs. That's also why, to this day, NT/2K/XP support OS/2 command-line applications, and there was also a little-known GUI pack that would support OS/2 1.x GUI applications.
soundbwoy
Apr 27, 10:54 AM
Is it me, or are there more idiots about? Damn it, people, leave the damn tracking control alone. If I lose my phone, I want to be able to find it. I'm so not in the mood to spend $600 again.
-SD-
Aug 19, 02:25 PM
Kart racing and Course Maker videos and pictures over on Joystiq (http://www.joystiq.com/2010/08/19/gran-turismo-5-course-maker-and-kart-racing-unveiled/).
:apple:
pkson
Apr 10, 08:36 PM
Wow. You'd think a FCP Users group would be able to track down a halfway decent graphic artist to make their banner graphic...
Aww, give them a break, they're probably just trying to keep with the '90s design of the UI. :D
Honestly, the website totally sucks. It looks like a get-rich-quick advertisement site. They might be FCP pros, but their HTML is amateur.
portishead
Apr 11, 08:16 PM
They are abandoning it. I know quite a few FCP editors who have switched to Avid MC5 or Premiere Pro.
We are large facility with about 10-12 full time FCP editors and we will probably switch to Avid MC5 unless Apple provides *needed* features for the future.
I'd say there's a general mood of 'Apple is abandoning FCP' in the post community, and facilities/users are setting up their exit strategies.
And it's a strategy. Buying into new software is expensive and time-consuming.
Overreact much? The new FCP hasn't even been announced and your company is already talking about jumping ship? I call b.s. I'm in LA and I haven't heard anyone talking about switching anything. What needed features do you need that don't already exist?
Glideslope
Apr 25, 03:53 PM
Except it doesn't use GPS data. It uses cell towers and wifi.
Ouch!!!!! :apple:
milo
Jul 28, 09:37 AM
Apple had better step its game up compared to the prices/specs rumored last week.
That list was probably something some random guy threw together, it didn't come from a real source and AI only posted it because it's been floating around (saying they didn't believe it).
Actually I like the one with 2 slots. Perfect for all those people wanting 2 drives. :-)
But it would make way more sense to lose the "slot" and go with a standard tray loading drive. It's very impractical to give users the ability to add an optical drive...but require it to be a laptop model.
you can't make a statement like that. that's like saying "i hate general electric air conditioners." what the heck? all CPU's (and air conditioners) do the same thing.
You don't think there's a significant difference between different models of CPU? :eek:
How about Mac Midi?
I've thought about Mac Mid, but just doesn't seem quite right. Mac Midi is funny, but would confuse music guys (unless it actually had midi ports).
So if the new iMacs are using 64-bit Merom or Conroe chips, what is the likelihood of them offering 4GB of RAM?
Current Macs can handle 4 gigs of RAM, if you get the expensive 2 gig modules. 32-bit limits you to 4 gigs; I doubt iMacs will handle more than that for a while.
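The 4 gig ceiling is just pointer-width arithmetic, which a one-liner can confirm:

```python
# A 32-bit pointer can distinguish 2**32 distinct byte addresses.
addressable = 2 ** 32
print(addressable // 2 ** 30)  # 4 (GiB): the direct-addressing ceiling
```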
Willis
Jul 27, 01:45 PM
I haven't checked yet to see if someone mentioned it, but in regard to what people expect to see at WWDC, don't worry if it isn't announced then... the Paris Expo is in September.
Mac Pros might come out then, I think.
leekohler
Feb 28, 12:57 PM
A same-sex attracted person is living a "gay lifestyle" when he or she dates people of the same sex, "marries" people of the same sex, has same-sex sex, or does any combination of these things. I think that if same-sex attracted people are going to live together, they need to do that as though they were siblings, not as sex partners. In my opinion, they should have purely platonic, nonsexual relationships with one another.
What I do is none of your damn business. And your opinion has no bearing on my life. Why you feel the need to tell others what to do is beyond me. Take care of your own house, let me take care of mine.
Bilbo63
Apr 19, 02:31 PM
What annoys me even more is that Apple always seems to make these claims that they made such and such first, and that Windows is copying Mac OS. If you know a bit of the history, you'll find that Apple copied the Xerox interface (with permission, of course), but it's not like they came up with it first.
Now they are making another claim that Samsung is copying..
No, you are wrong here. Apple did not copy the Xerox interface. Xerox developed a GUI that became the very early building blocks of the Mac OS. Xerox brass didn't get it and didn't know what to do with it. Apple made a deal with Xerox, hired the key talent, brought it in-house and further developed the whole GUI approach.
The seeds were clearly planted at Xerox, however, the finished Mac OS was a very different, more complete animal.
Kevin Monahan
Apr 5, 06:20 PM
At present we have to re-encode a lot of our footage (7D / Minicam etc), and you don't need to do that in Premiere, it just plays on the timeline - however editing in that is quite frankly an exercise in sheer frustration and strange bugs.
I don't find it frustrating; in fact, it runs circles around FCP, and I worked at Apple on two versions of the software, wrote a book, and founded the first FCPUG.
As for strange bugs, please let me know what they are. Our users aren't complaining about anything strange.
If you do find something, please report it: Submit bugs to http://www.adobe.com/go/wish . More on how to give feedback: http://bit.ly/93d6NF
Best,
Kevin
cloudnine
Jul 14, 04:08 PM
To charge $1800 for a system that only has 512MB is a real disappointment. 1GB of RAM oughta be standard, especially with Leopard on the horizon.
Unless the Xeon is that expensive (which I can't see how it would be), I don't see that as anything except creating some separation between the configurations.
I agree... my buddy got a MacBook Pro and it came standard with 512MB of RAM. For the first 3 or 4 days, he thought he had purchased a defective notebook, it ran so badly. Opening MS Office applications literally took minutes, and that was with nothing else open. He took it back into the Apple store and the rep told him his problem was his RAM, so he purchased another 1GB (1.5GB total), and now it runs perfectly. You'd think that with all of these Intel machines being released and a huge selection of software not being Universal yet, 1 gig of RAM would be standard...
kinda a$$h0lish if you ask me. :mad:
iGary
Aug 15, 11:39 AM
I would have thought that the Final Cut Pro benchmark would have really blown away the G5 - not so much, right?
Awesome on FileMaker and I can't wait to see how this stuff runs Adobe PS Natively.
dethmaShine
Mar 31, 02:41 PM
Google is going to define 'open' in a way it benefits them and their advertising crusade.
I remember those days of the G1 on Vodafone (in the UK, I guess?): such horrible, sluggish phones. Google, shipping an incomplete product, was at the mercy of the carriers and manufacturers, and now they don't give a ****.
Another lesson in all of this: never partner with Google. They have always been like this.