firewood
Apr 28, 06:20 PM
I want it to be like a PC, a Mac or a Laptop.
Why should Apple care what you want it to be like when they know what more people actually buy? More people purchased iPads last quarter than MacBooks or iMacs. And reports are that most of those iPads were used for exactly the same kinds of things that most PCs are actually used for.
Ya know, mainframe and minicomputer companies used to call personal computers toys, not real computers. How can it be a real computer without a punched card reader and a line printer?
The vast majority of those mainframe and minicomputer companies no longer exist.
AppliedVisual
Oct 26, 10:34 AM
Considering that Windows supports up to 64 CPU cores, and that 64 core Windows machines are available - it would be nice if you could show some proof that OSX on a 64 CPU machine scales better than Windows or Linux....
Are you being overly pedantic or do you just want to argue? I said WinXP. -- "probably as good or better than WinXP". WinXP only supports two CPUs with a max of 4 cores each right now as per the EULA. The Windows kernel itself actually handles CPU division and scales dynamically based on addressable CPUs within a system all the way up to 256 CPUs or cores, with support for up to 4 logical or virtual CPUs each. And just think where those 64-CPU Windows systems are going to be in the near future as they're upgraded with quad-core CPUs from AMD/Intel...
BTW: You have to buy Windows Server Datacenter Edition to get to all those CPUs.
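The distinction above between what the kernel can address and what the license allows can be sketched in a few lines. This is an illustrative toy model, not an authoritative statement of Windows licensing terms; the socket limit and core counts are assumptions drawn from the post.

```python
import os

# Hypothetical license cap, per the post above: WinXP's EULA limits a box to
# 2 physical CPU packages, while the kernel itself can address far more
# logical processors. The constant here is illustrative, not authoritative.
XP_SOCKET_LIMIT = 2

def usable_logical_cpus(logical_cpus, cores_per_socket, socket_limit=XP_SOCKET_LIMIT):
    """Return how many logical CPUs a socket-limited license could actually use."""
    licensed = socket_limit * cores_per_socket
    return min(logical_cpus, licensed)

detected = os.cpu_count() or 1  # logical CPUs visible to the OS on this machine
print(f"Detected logical CPUs: {detected}")
# A 64-core box under a 2-socket, quad-core-per-socket license:
print(f"Usable under the hypothetical cap: {usable_logical_cpus(64, 4)}")
```

Run on a 64-core machine, the sketch shows the license, not the kernel, becoming the bottleneck.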
dudemac
Mar 18, 03:58 PM
To all but a few of the replies so far that seem totally outraged by this,
First, there is no support for the iTMS on Linux as it currently stands, and this just allows users of Linux to purchase songs from the iTMS and play them on that platform. It also allows someone like me who has a high-speed connection at work to purchase music and take it home with me. Yes, I have a couple of Macs and an iPod, so my loyalty hasn't changed.
Secondly, this doesn't hack the DRM that Apple supplies. It does violate the EULA, but I don't know anyone who doesn't violate a EULA at least once a day. That is really a different argument, though.
Finally, why is there no outrage that DRM is not optional, or that there hasn't been a standardized format for music? There are reasons why the MiniDisc failed, and they had nothing to do with quality. It was a proprietary format that needed to be licensed. So when looking at the dilemma of DRM, it should be more of a "how do we get everything to play everywhere" kind of question than just limiting how the user can play/share the music at home. I really hate being limited for "my own good", or more appropriately for the good of a corporation. If WMA beats Apple, it will only be because Apple failed to standardize and work within the industry.
nehunte
Oct 7, 10:52 AM
Every phone that comes out after the iPhone is supposed to surpass the iPhone by 20**. This is getting old. It took how many years for someone to beat up on Nokia? That's right, it'll be a long time before you see a dent in the iPhone's armor.
I'm going to make a new smartphone next week. It's an iPhone-killer. Guaranteed.
notjustjay
Apr 6, 11:58 AM
forgot to add that the "+" (maximize) button is wildly inconsistent in its function.
maximizing to full screen in general isn't the way OS X "works", which is why most programs don't do that...but it seems Apple never really decided what the maximize button is supposed to do.
That's because Apple didn't decide what the maximize button was supposed to do. That was supposed to be up to each application developer.
Don't think of it as a "maximize" button, think of it as "optimize". As in "Hey, application, the user just clicked your green button. Go ahead and resize yourself to whatever you think is most appropriate given what document is currently open." Most apps should resize their window to display the full width without needing scrollbars. In theory.
I agree with the person a few posts up who said "Don't think about how you did it in Windows. Think about what you think would make sense" and it usually works.
As for the other little quibbles discussed in this thread: yes, OS X is a little different (most of these issues are with Finder versus Explorer, I notice). You just get used to it. I use XP at work and OSX at home every day, and I learn to work with each. I do some of the tricks mentioned in this thread (like adding a shortcut to my Applications folder on the dock to mimic a Start menu) but not so much because "I prefer the Windows way" as "this is efficient and makes sense".
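The "optimize, not maximize" behavior described above can be modeled in a few lines: the app sizes the window to its content, clamped to the screen, instead of filling the screen. This is a toy sketch of the idea, not Apple's actual implementation; in real Cocoa apps the equivalent hook is the window delegate's standard-frame callback, and all the numbers below are made up for illustration.

```python
def zoom_frame(content_width, content_height, screen_width, screen_height):
    """Toy model of OS X's green 'zoom' button: size the window to fit the
    currently open document, clamped to the screen, rather than always
    filling the screen the way a Windows 'maximize' does."""
    return (min(content_width, screen_width),
            min(content_height, screen_height))

# A tall 800px-wide web page on a 1440x900 display: the window goes full
# height but only as wide as the content needs.
print(zoom_frame(800, 3000, 1440, 900))
```

The point of the sketch is that the result depends on the document, which is why the button feels "inconsistent" across apps: each one answers the question differently.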
Apple OC
Apr 23, 02:29 AM
This is just a form of soldier conditioning. Don't fool yourself into thinking we don't do this to our own soldiers. That's why we get them when they are young 18-year-olds who are impressionable and tell them they are doing this for "god and country". The good wolves will "go to heaven" protecting the sheep. "God speed" in their mission. Being sent out to get blown up by an IED is as cannon-fodderish as strapping one to your chest. The only difference is that the latter tactic is used in times of desperation against an overwhelmingly powerful enemy. Just like Kamikazes, Viet Cong, etc. And now these ppl make our TVs and clothing. ;)
sorry but you are wrong ... we do not tell soldiers they are fighting for God or that there is anything such as being a martyr
nice try though :rolleyes:
nozebleed
May 5, 10:51 AM
I still believe it's just where you are in the country. This graph is the exact opposite of what I experience. Verizon work phone - SHITE. Dropped calls so bad I forwarded the number to my iPhone. AT&T personal phone - no dropped calls.
leekohler
Apr 15, 09:07 AM
This is great to see. Good job, Apple!
I'mAMac
Aug 29, 04:29 PM
My point is that Greenpeace would be far better served educating the public on how to help. If they got even 10% of the world's population to make some radical changes in their lives, the changes to the planet would be amazing.
I agree corporations need to set examples and do the best they can. I just don't think it's where environmentalists should be pointing fingers.
You , me and everyone else are the biggest polluters.
Right. Why don't they invent something that doesn't pollute so we can all use it? (yeah right)
supmango
Mar 18, 10:48 AM
+11
The whole "it's MY data, I can do what I want with it!" argument is countered by your perfect analogy with a buffet. I tip my hat to you on that one. If you're at an all-you-can-eat buffet, it doesn't mean you can share your food with your entire family.
I've always believed that unlimited data, on a smartphone, enables you to connect to the internet as much as you want on the device you're contracted to. It's not like home internet where you can share the connection, nor have I ever imagined it would be.
I think that people just like to get "angry at the man" when they don't get things the way they want. AT&T is trying to improve their network; good for them.
If AT&T let you keep your "unlimited" data plan AND add tethering, his analogy would work. As it stands right now, AT&T forces you to downgrade to a capped data plan and add tethering to it, which essentially doubles your data cap to 2 GB.
The analogy is more accurately like a traditional restaurant where you order an entrée that is not "all you can eat". But in this case, they don't allow you to share it with another person, even though you could never possibly eat all of it by yourself (use your existing data allotment). However, they are more than happy to let you buy another entrée. Oh, and you can't take home your leftovers either (rollover). That does a little better job of highlighting exactly how AT&T is being greedy in this scenario.
Bottom line, what people are doing is sticking with unlimited data and tethering (using some other means), and then downloading gigabytes of data, which does affect network performance for other users. That is how AT&T sees it. If you are careful about what you do while "illegally" tethering, and how often you do it, I seriously doubt they will figure it out. They really aren't that put together on this, as anyone who has spoken to "customer service" can attest.
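The capped-plan tradeoff being argued about can be made concrete with a small billing sketch. All the dollar figures and the cap below are hypothetical placeholders for illustration; real AT&T pricing differed and changed over time.

```python
def monthly_bill(gb_used, cap_gb=2.0, base=25.0, overage_per_gb=10.0):
    """Toy capped-plan bill: flat base fee, plus a per-GB charge for usage
    over the cap. All rates are hypothetical, not AT&T's actual pricing."""
    overage = max(0.0, gb_used - cap_gb)
    return base + overage * overage_per_gb

# Under the cap, the bill is flat; a heavy tethering month costs extra.
print(monthly_bill(1.5))  # light month
print(monthly_bill(3.0))  # 1 GB over the cap
```

The sketch shows why the unlimited-plan holders in the thread resist switching: on a capped plan every extra gigabyte has a price, while "unlimited" has none.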
roland.g
Sep 12, 06:33 PM
That's what I thought when I saw that they weren't specific about WiFi ... simply calling it "802.11 wireless networking" instead of specifically stating it was "802.11 A/B/G".
but that brings up the point of what's sending to it. It doesn't matter that it has new tech to receive at higher bandwidth if the computer streaming to it only sends out at 802.11g.
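The bottleneck point above is simple arithmetic: end-to-end throughput is capped by the slowest link in the chain, so a faster receiver buys nothing if the sender still speaks 802.11g. A small sketch, using nominal signaling rates (54 Mbps for g, 300 Mbps for draft-n); real-world throughput is substantially lower than nominal, so treat the numbers as illustrative.

```python
def transfer_time_seconds(file_megabytes, link_rates_mbps):
    """Time to move a file across a chain of links; the slowest link wins."""
    bottleneck = min(link_rates_mbps)      # megabits per second
    megabits = file_megabytes * 8          # convert MB to megabits
    return megabits / bottleneck

# A 700 MB video, g sender -> n receiver, versus n end to end:
g_to_n = transfer_time_seconds(700, [54, 300])
n_to_n = transfer_time_seconds(700, [300, 300])
print(f"g sender: {g_to_n:.0f}s, n end-to-end: {n_to_n:.0f}s")
```

Upgrading only the receiving box leaves the transfer exactly as slow as before; both ends have to speak the faster standard.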
WestonHarvey1
Apr 15, 09:27 AM
I have a couple problems with this approach. There's so much attention brought to this issue of specifically gay bullying that it's hard to see this outside of the framework of identity politics.
Where are the videos and support for fat kids being bullied? Aren't they suicidal, too, or are we saying here that gays have a particular emotional defect and weakness? That they're not strong enough to tough this out? Is that the image the gay community wants to promote?
Man, being a fat kid in high school. That was rough. There were a number of cool, popular gay guys in my school. I'm sure they took some crap from some people, but oh how I would have rather been one of them! But hey, I'm still here, I'm still alive.
Bullying is a universal problem that affects just about anyone with some kind of difference others choose to pick on. It seems like everyone is just ignoring all that for this hip, trendy cause.
Small White Car
May 5, 10:23 AM
AT&T's plan worked brilliantly.
They put me through a year where about 40% of my calls got dropped and then fixed it so only about 5% get dropped now.
So even though that's worse than the other carriers I am personally thrilled with that number.
So...good plan, AT&T!
Eraserhead
Mar 26, 03:05 AM
Love conquers all until it hits a rough patch
If you really love someone, surely you don't want to be with anyone else? If so, then it would be pretty moronic not to ultimately work out your issues with the other person.
NebulaClash
Apr 28, 08:20 AM
A PC is something you work with, not a fancy-looking gadget. I don't see this happening in the next 5-10 years.
Excellent! I love it when people put these predictions down in black and white for posterity. OK, see you in 2020 when the Tablet Era will be ten years old, the dominant computer format people buy, and containing capabilities that we cannot even imagine now.
But you've put down in writing that it will not be something you work with even then. Noted.
Lamarak
Jun 19, 05:52 PM
Guess it is really area dependent. I tried the Droid Incredible with Verizon and had more dropped/lost calls in my 3 weeks with them than I had with my iPhone and AT&T in 3 years (or so it seemed). We went back to AT&T and no problems thus far. This is here in San Antonio, TX.
appleguy123
Apr 22, 08:31 PM
proof?
I wouldn't want to succumb to the accusation made in the first post. :) http://forums.macrumors.com/showthread.php?t=1055916&highlight=
SPUY767
Sep 26, 10:40 AM
Pardon me, but would you please track down the link to that card, IM me, and post it here? I need it NOW! Thanks.
I will be on this thread until the Mac Pro Clovertown option ships. :D
This is the Mac Pro I have been waiting for.
This is not the one I use, but the same in concept: Gigabyte i-RAM (http://www.anandtech.com/storage/showdoc.aspx?i=2480). That item uses PCI and not PCIe.
The one that I use doesn't work with the Macintosh, but apparently, the PCIe/SATAII version of the one that Eld is talking about will as mine uses no SATA interface for data transfer.
jayducharme
Oct 7, 05:04 PM
I have no doubt Android will surpass the iPhone in terms of user numbers. Will it surpass in quality? That remains to be seen...
Even if it does surpass in the number of users, since when has Apple been solely concerned about numbers? Quality of design really does seem to be an obsession with Apple. When the iPhone was first released, didn't Jobs state that he would be happy with 1% of the cell phone market? He's already surpassed that. Just as with their computers, Apple has never positioned itself as a mass market brand for everybody. They have opened the floodgates on the smart phone market, but I don't think they ever intended to dominate it. They simply want to provide the best experience, and that in turn brings them discriminating customers.
wtfk
Aug 29, 02:32 PM
Eh, I believe little of what Greenpeace ever says. :rolleyes:
There's little reason to. Penn & Teller blasted them real good on their TV show "Bullshit! (http://tinyurl.com/s4gfc)" on Showtime.
dante@sisna.com
Oct 26, 03:35 AM
Open and doing something. Safari, Mail, iTunes, and working in Photoshop probably won't benefit much from quad cores. Batching in PS, Aperture, and doing a render in FCP would.
I am on the brink of buying something. What, time will tell. If the quad core does make a marked difference when running PS and at most one background process, I'll consider it. Otherwise it's a dual-core 2.66 for me.
I could not disagree with you more. Our G5 and Mac Pro Quads give us an extra production hour, at least, per day, using many of the apps you mentioned above. It is up to the user to know how to push these boxes.
Just today, we processed 8.7 GB of Photoshop documents (high-res art scans from a lambda flatbed of 4x8-foot originals at 300 dpi -- I know the artist was crazy, but it is what we GOT). We opened all this data, over 20 docs, changed RGB to CMYK, adjusted color, resized to a normal size, sharpened, added masks, and saved. We did all this in 40 minutes -- that is 2 minutes per average-size doc of 600 MB.
Are you really going to tell me that my G5 Dual 2.7 could hang like this?
No way -- we had Activity Monitor open -- Photoshop used an average of 72% of ALL FOUR PROCESSORS.
We used Safari at the same time to download a template for the art book (250 MB), and we had a DVD ripping via Mac the Ripper as well.
Quad Core Rules. Soon to be OCTO.
theBB
Jul 12, 12:38 PM
Unless Apple bucks their own trend of charging more for the Intel Mac replacements over the G4/G5 units, we may be in for a rather large increase at the higher end on up. Intel processors cost more than G4/G5 processors. The high end of any processor costs a lot more than the slower ones of the same type. Does all of this add up to price decreases or price increases? As much as I would like to see a price decrease, to me that just does not add up.
The only G5 machine replaced by an Intel version has been iMac and its price stayed the same.
BornAgainMac
Apr 13, 04:40 AM
Finally Grand Central has been used in a major app.
d.perel
Mar 18, 03:47 PM
Wish he'd do something useful like cracking WMA.